Contextual navigational control for digital television

Information

  • Patent Grant
  • Patent Number
    11,785,308
  • Date Filed
    Wednesday, October 21, 2020
  • Date Issued
    Tuesday, October 10, 2023
Abstract
A contextual navigational control for digital television is described. An apparatus comprises a contextual navigation control interface (CNCI). The CNCI includes a first area that represents a plurality of cable television programs having a first level of relevance. A second area represents a first group of the plurality of programs having a second level of relevance. A third area represents a first subgroup of the first group having a third level of relevance. A fourth area represents a second subgroup of the first group having a fourth level of relevance. There may be additional areas representing additional sub-groups of relevance to the current viewing context.
Description
FIELD OF THE INVENTION

The present invention relates to systems and methods for presenting navigation and control of television viewing services.


BACKGROUND

Interactive television (iTV) is currently available in varying forms. At the core of iTV applications are the navigation applications provided to subscribers to assist in the discovery and selection of television programming. Currently available methods and systems for browsing and selecting broadcast (linear) television are known as interactive program guides (IPGs) or electronic program guides (EPGs). Current IPGs allow the subscriber to browse and select linear broadcast programming. These IPGs also include the ability to subset the broadcast linear program listing data by subject or type of programming.


In addition to linear broadcast television, subscribers may now also be given opportunities to select from a list of programs that are not linear, but instead are provided on demand. Such technology is generally referred to as Video on Demand (VOD). The current schemes for browsing and selecting VOD programs include the ability to select such programming from categories of programming.


Due to advances in technologies such as data compression, system operators such as cable multiple system operators (MSOs) and satellite operators are able to send more and more broadcast channels and on-demand content over their systems. This in turn has prompted broadcast content providers and programmers to develop more and more channels and on-demand content offerings. Also, the addition of digital video recorder (DVR) technology to set-top boxes (STBs) now provides additional options for time-shifted viewing of broadcast TV and increasing options for the storage of VOD titles that have been purchased for viewing or are likely to be purchased.


The current television navigational structure is predicated on the numeric channel lineup, where a channel's position is determined arbitrarily for each MSO system, without regard for clustering by content type or brand. To the TV viewer, this is also manifested in the grid-based navigational tools, as they are generally structured in a time-by-channel grid format. As a navigational model, this has become outdated with the increasing number of channels (500+). The problem is further exacerbated by the addition of non-linear (non-time-based) On-Demand and time-shifted (DVR) content and other interactive applications such as games.


With this increasing number of TV viewing options comes the complexity of navigating the options to find something to watch. There are generally two types of viewers. One type of viewer knows the type of content they want to watch and searches for an instance of that type of content. This is exemplified by a viewer who, wanting to watch an action film, wishes to browse available action films. The second type of viewer has no specific notion of what they want to watch—they just want to find something interesting to them in a more impulse-oriented manner.


The current state of technology for browsing for TV content includes searching lists of content underneath category headings, browsing large lists or grids of data to find content, or typing in search criteria. Each of these browse methods is referred to in this document as a content search point. Content search points include IPGs and EPGs, Movies-On-Demand applications, text search, DVR recorded show listings, and Category Applications as specified in the above-cited patent application. Current technology also consists of menus and toolbars that allow one to jump to the various content search points. The problem with current technology is that, due to the large amount of content on the digital TV service, the menus and toolbars themselves are becoming either long lists of specific content that are difficult to search, or short lists of general categories that do not provide quick access to specific needs. Thus, the new features of digital television, new content types and the sheer volume of viewing options warrant a new navigational model for viewing television.


SUMMARY

A contextual navigational control for digital television is described. In one embodiment, a contextual navigation control interface (CNCI) includes a first area that represents a plurality of cable television programs having a first level of relevance. A second area represents a first group of the plurality of programs having a second level of relevance. A third area represents a first subgroup of the first group having a third level of relevance. A fourth area represents a second subgroup of the first group having a fourth level of relevance. There may be additional areas representing additional sub-groups of relevance to the current viewing context.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention will be understood and appreciated more fully from the following detailed description taken in conjunction with the drawings in which:



FIG. 1 illustrates an exemplary two-way cable television system that provides contextual navigational control for digital television, according to one embodiment of the present invention;



FIG. 2 illustrates four dimensions of programming contextual relevance, according to one embodiment of the present invention;



FIG. 3 illustrates an exemplary user interface for selecting the contextual navigation control mode, according to one embodiment of the present invention;



FIG. 4 illustrates an exemplary user interface 400 for selecting category related links, according to one embodiment of the present invention;



FIG. 5 illustrates an exemplary user interface 505 for selecting channel/provider related links, according to one embodiment of the present invention;



FIG. 6 illustrates an exemplary network architecture, according to one embodiment of the present invention; and



FIG. 7 illustrates an exemplary computer architecture, according to one embodiment of the present invention.





DETAILED DESCRIPTION

The present invention provides, in various embodiments, systems and methods by which subscribers are presented with a dynamic navigational interface for linking to content. This dynamic navigational interface speeds access to content by providing a minimal graphical interface and by first presenting contextual options to the viewer that are relevant to the currently viewed program, channel, provider or genre of same. The contextual options are further divided in the user presentation along multiple levels or dimensions from general to more specific relative to the currently viewed content.


Described herein are systems and methods by which subscribers are presented with dynamic iTV navigational hierarchies that first present navigational options based on relevance to the available content on television, and to the currently viewed program, channel, provider, or genre of program, channel or provider. In current navigational systems, users are provided with menus from which they can select a target content search point. These menus are presented either as long lists of options or as very short lists. The long lists provide greater subdivision of content but create a navigational obstacle in that the lists must be scanned to find a link to specific content of interest. The short lists provide content search points that are much too general, such that if the link is selected, the viewer is taken to yet another page for further categorization or is presented with a large set of content that must be browsed. The present invention enables quick access to multiple levels of granularity of content categorization on a single menu by using the context of the currently viewed program to set the state of the navigational control when displayed.
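As a rough sketch (not part of the patent text), the following Python fragment shows one way the control's initial state could be seeded from the currently viewed program; the Program class and field names are hypothetical assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Program:
    """Hypothetical descriptor for the currently viewed program."""
    title: str
    channel: str
    category: str

def initial_nav_state(current: Program) -> dict:
    """Seed each menu level with the option most relevant to the
    current viewing context, rather than a fixed default."""
    return {
        "all_listings": "Full Guide",   # level 1: general system functions
        "category": current.category,   # level 2: e.g., 'Sports'
        "channel": current.channel,     # level 3: e.g., 'ESPN'
        "show": current.title,          # level 4: show-specific links
    }

# Example: the control launched while watching football on ESPN.
state = initial_nav_state(Program("NFL Football", "ESPN", "Sports"))
```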


In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be evident, however, to one skilled in the art that the present invention may be practiced without these specific details. As one example, the terms subscriber, user, and viewer are used interchangeably throughout this description. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present invention. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical, electrical, and other changes may be made without departing from the scope of the present invention.


Some portions of the detailed descriptions that follow are presented in terms of algorithms and symbolic representations of operations on data bits within a computer memory. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is here, and generally, conceived to be a self-consistent sequence of acts leading to a desired result. The acts are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.


It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.


The present invention can be implemented by an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer, selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.


The algorithms and processes presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method. For example, any of the methods according to the present invention can be implemented in hard-wired circuitry, by programming a general-purpose processor or by any combination of hardware and software. One of skill in the art will immediately appreciate that the invention can be practiced with computer system configurations other than those described below, including hand-held devices, multiprocessor systems, microprocessor-based or programmable consumer electronics, DSP devices, network PCs, minicomputers, mainframe computers, and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. The required structure for a variety of these systems will appear from the description below.


The methods of the invention may be implemented using computer software. If written in a programming language conforming to a recognized standard, sequences of instructions designed to implement the methods can be compiled for execution on a variety of hardware platforms and for interface to a variety of operating systems. In addition, the present invention is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein. Furthermore, it is common in the art to speak of software, in one form or another (e.g., program, procedure, application, etc.), as taking an action or causing a result. Such expressions are merely a shorthand way of saying that execution of the software by a computer causes the processor of the computer to perform an action or produce a result.


An Exemplary Cable Television System



FIG. 1 illustrates an exemplary two-way cable television system that provides contextual navigational control for digital television, according to one embodiment of the present invention. Generally, cable television system (CATV) 100 provides video and data services through a network of high bandwidth coaxial cables and fibers. The cable system includes a head-end amplifier 110 that combines the broadcast and data signals for transmission to the subscribers. The head-end 110 is connected to fiber or coax trunks 111 that carry the signals into the neighborhoods 120 where they are tapped off to provide service to the residence 130.


The head-end 110 is the initial distribution center for a CATV system 100. The head-end 110 is where incoming video and television signal sources (e.g., video tape, satellites, local studios) are received, amplified and modulated onto TV carrier channels for transmission on the CATV cabling system. The cable distribution system is a cable (fiber or coax) that is used to transfer signals from the head-end 110 to the end-users. The cable is attached to the television 131 through a set-top box 132. The set-top box 132 adapts the signals from the head-end 110 to a format suitable for the television 131. Additionally, the set-top box 132 renders a user interface through which the end-user navigates through menus to select programming to view.


CATV system 100 allows two-way data transmission. Data is provided by a cable modem 133 in the residence 130 and a data gateway (cable modem termination system (CMTS)) 111 at the head-end 110. The CMTS 111 also provides an interface to other networks such as the Internet 140. Furthermore, CATV system 100 allows cable telephony to initiate, process and receive voice communications. Telephony is provided by a voice gateway 112 at the head-end 110 that converts communication signals between data networks and telephone networks (e.g., PSTN) 150. The data and telephony attributes of CATV system 100 described above are provided for the reader to appreciate the entire CATV system 100. However, the present system for contextual navigational control may be mainly associated with the digital television content delivery aspects of CATV system 100.


To further understand the present invention, consider that all content has descriptive attributes. For example, the show “Sopranos” could be described by the following four attributes: HBO, Drama, Primetime, Crime. These attribute dimensions, namely Provider, Genre, Time of Day and Content, are chosen purely for illustration, and additional categories or dimensions could easily be identified. The type and number of attributes is variable and may be unique to an article of content. In the navigational model described by this invention, these attributes are the doorways to other content of interest, based on the notion that what the viewer is watching has some basic attributes that are of interest to the viewer and can be predictive of where the viewer may wish to navigate.
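For illustration only, the “Sopranos” example above can be written as a simple attribute record; the dictionary layout and the related_links helper are assumptions of this sketch, not part of the patent.

```python
# The "Sopranos" example from the text, described along the four
# illustrative dimensions: Provider, Genre, Time of Day, Content.
content_attributes = {
    "Sopranos": {
        "Provider": "HBO",
        "Genre": "Drama",
        "Time of Day": "Primetime",
        "Content": "Crime",
    },
}

def related_links(title):
    """Each attribute value is a potential doorway to related content."""
    return list(content_attributes.get(title, {}).values())

# related_links("Sopranos") -> ['HBO', 'Drama', 'Primetime', 'Crime']
```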


In one embodiment, the present contextual navigational control has four selection dimensions, or levels, with increasing contextual relevance to the currently broadcast program. FIG. 2 illustrates four dimensions of programming contextual relevance, according to one embodiment of the present invention. The four dimensions 200, or levels of navigation, begin with the most general and proceed to the most content-specific navigation linked to the context of the current content being viewed. These dimensions are:

    • a All Listings (non-filtered) Related Links 210
    • b Category Related Links 220
    • c Channel/Provider Related Links 230
    • d Show/Content Related Links 240



FIG. 2 illustrates how programming choices available to the viewer range from the most general (all programming choices), to selections filtered by content type (e.g., Sports) or provider (e.g., ESPN), down to programming choices related to specific programming (e.g., a football game). Each contextual level of navigation, from the most general to the most specific with respect to the current content, will now be described in detail.


Level one (All Listings 210) represents links to general system functions; in particular, it allows the user to select how he/she wishes to interact with the present contextual navigation control. FIG. 3 illustrates an exemplary user interface for selecting the contextual navigation control mode, according to one embodiment of the present invention. The control mode interface 300 includes the following viewing modes (a brief code sketch follows the list):

    • 1 Full Guide 310: Full Screen Guide of linear programming choices (either blocking video or with video inset)
    • 2 Mini-Guide 320: Overscreen guide to linear programming to allow for content selection with continued viewing of current programming
    • 3 On-Demand Guide 330: Guide to On-Demand programming. In some embodiments, this may be combined with the other guides.
    • 4 Favorites (not shown): list of content selections for user's favorite categories
    • 5 My VOD (not shown): list of On-Demand content according to selection rules provided by user
    • 6 My DVR Shows (not shown): list of previously recorded programs
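As a loose illustration (not part of the patent text), these level-one modes could be enumerated in software as follows; the Python enum and its member names are assumptions of this sketch.

```python
from enum import Enum

class ControlMode(Enum):
    """Hypothetical enumeration of the level-one (All Listings) viewing modes."""
    FULL_GUIDE = "Full Guide"            # full-screen guide to linear programming
    MINI_GUIDE = "Mini-Guide"            # overscreen guide; current program keeps playing
    ON_DEMAND_GUIDE = "On-Demand Guide"  # guide to On-Demand programming
    FAVORITES = "Favorites"              # user's favorite categories
    MY_VOD = "My VOD"                    # On-Demand content per user selection rules
    MY_DVR_SHOWS = "My DVR Shows"        # previously recorded programs
```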


Returning to FIG. 2, level two (category related links 220) represents a contextual level of navigation where programming content is ordered according to relevance by content category. The category related links level 220 consists of content categories, where the category contextually presented is the one most relevant to the category of content currently being viewed. In one embodiment, the category related links level 220 navigates the viewer to the ‘category application’. The present contextual navigation control provides a method for quickly jumping to the most relevant category based on the currently viewed program. For example, FIG. 4 illustrates an exemplary user interface 400 for selecting category related links, according to one embodiment of the present invention. In this example, if the viewer is currently watching a football game, then the ‘Sports’ category link 420 would be the item shown in the initial state of the contextual navigation control. As a further example, in situations where a viewer is watching a movie, the initial state of the second level 220 of the present contextual navigation control would be ‘Movies’ 430. Further examples are illustrated in Table 1 below.


TABLE 1

Currently viewed program type    Initial Category Link
Football                         Sports 420
Movie                            Movies 430
Sesame Street                    Kids 410
HD program                       HD (not shown)
Headline News                    News (not shown)

In this manner, the contextual display and dynamic positioning of the relevant links related to a current program, provider, category or genre of same will assist the viewer in navigating more quickly to the programming they desire. The possible links are also navigable by the viewer once the selection is moved to highlight an element of the present contextual navigation control, as illustrated by FIG. 4. In other words, a viewer may navigate to and select the Kids link 410 or the Movies link 430 upon highlighting the Sports link 420 and, in so doing, expose further category selections.
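A minimal sketch of the Table 1 mapping as a lookup, assuming a plain dictionary suffices; the fallback to a generic 'All' link is an assumption, since the text does not specify behavior for unmapped program types.

```python
# Table 1 as a lookup: currently viewed program type -> initial category link.
INITIAL_CATEGORY = {
    "Football": "Sports",
    "Movie": "Movies",
    "Sesame Street": "Kids",
    "HD program": "HD",
    "Headline News": "News",
}

def initial_category_link(program_type):
    # 'All' as a default is an assumption, not specified by the text.
    return INITIAL_CATEGORY.get(program_type, "All")
```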


Returning to FIG. 2, level three (Channel/Provider Related link 230) represents an additional order of relevance. This level consists of channel-specific links, where the channel category link contextually presented is the one most relevant to the category of content currently being viewed. This ‘programmer category application’ can be implemented as a category application, or it can be any content provided for the current channel or the channel family to which this channel belongs. For example, in FIGS. 3 and 4, the viewer has selected a football game, which initially sets the second-level link 220 state to ‘Sports’ 420.


For the purpose of illustrating the third level's relevance (Channel/Provider Related link 230), FIG. 5 illustrates an exemplary user interface 505 for selecting channel/provider related links, according to one embodiment of the present invention. Continuing with the examples shown in FIGS. 3 and 4, FIG. 5 illustrates that the football game is on ESPN. Given that the viewer is watching the football game on ESPN, when the navigation control 505 is launched, the initial state of the third level (Channel/Provider Related link 230) will be ‘ESPN’ 501. This permits the viewer to jump directly to a sports category to see what other sports programming is available on other networks, such as NBC 502 or AMC 503. Additionally, the viewer can jump to a provider category by selecting ESPN 501 to see what other programming and information ESPN is providing. Further examples of relationships between the currently viewed channel and the initial option displayed for the channel/provider link level 230 are shown in Table 2 below.


TABLE 2

Currently viewed channel    Initial Channel Link
ESPN                        ESPN
Discovery                   Discovery
Discovery Wings             Discovery
TLC                         Discovery
Headline News               News

The ‘Discovery’ example is provided to illustrate that multiple channels may link to a single channel-family content application. All of the possible links are also navigable by the viewer once the selection is moved to highlight this element of the contextual navigation control.
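Likewise, a hedged sketch of the Table 2 mapping, showing how several channels can share one channel-family link; defaulting to the channel's own name for unmapped channels is an assumption.

```python
# Table 2 as a lookup: several channels may map to one channel-family link.
INITIAL_CHANNEL_LINK = {
    "ESPN": "ESPN",
    "Discovery": "Discovery",
    "Discovery Wings": "Discovery",  # family member maps to 'Discovery'
    "TLC": "Discovery",              # family member maps to 'Discovery'
    "Headline News": "News",
}

def initial_channel_link(channel):
    # Defaulting to the channel's own name is an assumption.
    return INITIAL_CHANNEL_LINK.get(channel, channel)
```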


Returning to FIG. 2, level four (Show/Content Related links 240) represents an additional order of relevance. The show/content level 240 consists of program- or show-specific links, where the program link contextually presented is the one most relevant to the content currently being viewed. For example, if the current program being viewed is the ESPN show “Pardon the Interruption”, this link could in one embodiment be an interactive application for “Pardon the Interruption”. In one embodiment, the other links available in this level could be interactive applications for programs or shows that share some attribute in common with the current program being viewed.


It will also be apparent that the relationships to categories may be specified in any manner and may contain additional levels of relevance. For instance, in the example of a viewer watching football on ESPN, the first order of relevance was the content category (sports) and the second order of relevance was the current channel (ESPN); a third order of relevance can be added that is specific to the program. For example, if the game were a college game between LSU and Florida, then an additional order of relevance might be a link to an application for ‘College Football’ or for the ‘South-Eastern Conference’.
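Tying the levels together, the following speculative sketch assembles the ordered relevance links for the college-football example above; the link strings and their ordering are illustrative assumptions.

```python
def contextual_links(category, channel, extra=None):
    """Ordered relevance links: category first, then channel, then any
    additional program-specific levels (e.g., conference or league)."""
    links = [category, channel]
    links.extend(extra or [])
    return links

# Viewer watching an LSU-Florida college game on ESPN:
links = contextual_links(
    "Sports", "ESPN", ["College Football", "South-Eastern Conference"]
)
# -> ['Sports', 'ESPN', 'College Football', 'South-Eastern Conference']
```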


An Exemplary Network Architecture


Elements of the present invention may be included within a client-server based system 500 such as that illustrated in FIG. 6. According to the embodiment depicted in FIG. 6, one or more servers 510 communicate with a plurality of clients 530-535 and set-top boxes 570-575. The clients 530-535 and set-top boxes 570-575 may transmit and receive data from servers 510 over a variety of communication media including (but not limited to) a local area network (“LAN”) 540 and/or a wide area network (“WAN”) 525 (e.g., the Internet). Alternative communication channels, such as cable RF and wireless communication via GSM, TDMA, CDMA or satellite broadcast (not shown), are also contemplated within the scope of the present invention.


Servers 510 may include a database for storing various types of data. This may include, for example, specific client data (e.g., user account information and user preferences) and/or more general data. The database on servers 510 in one embodiment runs an instance of a Relational Database Management System (RDBMS), such as Microsoft™ SQL-Server, Oracle™ or the like. A user/client may interact with and receive feedback from servers 510 using various different communication devices and/or protocols. According to one embodiment, a user connects to servers 510 via client software. The client software may include a browser application such as Netscape Navigator™ or Microsoft Internet Explorer™ on the user's personal computer, which communicates to servers 510 via the Hypertext Transfer Protocol (hereinafter “HTTP”). In other embodiments included within the scope of the invention, clients may communicate with servers 510 via cellular phones and pagers (e.g., in which the necessary transaction software is embedded in a microchip), handheld computing devices, and/or touch-tone telephones (or video phones). According to another embodiment, set-top boxes 570-575 connect to servers 510 via a TV application.


Servers 510 may also communicate over a larger network (e.g., network 525) with other servers 550-552. This may include, for example, servers maintained by businesses to host their Web sites—e.g., content servers such as “yahoo.com.” Network 525 may include router 520. Router 520 forwards data packets from one local area network (LAN) or wide area network (WAN) to another. Based on routing tables and routing protocols, router 520 reads the network address in each IP packet and makes a decision on how to send it based on the most expedient route. Router 520 works at layer 3 in the protocol stack.


An Exemplary Computer Architecture


Having briefly described an exemplary network architecture which employs various elements of the present invention, a computer system 600 representing exemplary clients 530-535, set-top boxes 570-575 (e.g., set-top box 132) and/or servers (e.g., servers 510), in which elements of the present invention may be implemented, will now be described with reference to FIG. 7.


One embodiment of computer system 600 comprises a system bus 620 for communicating information, and a processor 610 coupled to bus 620 for processing information. Computer system 600 further comprises a random access memory (RAM) or other dynamic storage device 625 (referred to herein as main memory), coupled to bus 620 for storing information and instructions to be executed by processor 610. Main memory 625 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 610. Computer system 600 also may include a read only memory (ROM) and/or other static storage device 626 coupled to bus 620 for storing static information and instructions used by processor 610.


A data storage device 627 such as a magnetic disk or optical disc and its corresponding drive may also be coupled to computer system 600 for storing information and instructions. Computer system 600 can also be coupled to a second I/O bus 650 via an I/O interface 630. Multiple I/O devices may be coupled to I/O bus 650, including a display device 643 and input devices (e.g., an alphanumeric input device 642 and/or a cursor control device 641). For example, video news clips and related information may be presented to the user on the display device 643.


The communication device 640 is for accessing other computers (servers or clients) via a network 525, 540. The communication device 640 may comprise a modem, a network interface card, or other well-known interface device, such as those used for coupling to Ethernet, token ring, or other types of networks.


A contextual navigational control for digital television has been described. It will be appreciated that the embodiments described above are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention includes both combinations and subcombinations of the various features described hereinabove, as well as variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art.

Claims
  • 1. A method comprising: causing, by a computing device, output of a content item; receiving, by the computing device and during the output of the content item, a user request to output a content guide; determining, by the computing device, during the output of the content item, and based on the user request, a content category associated with the content item; generating, by the computing device and based on the user request, the content guide to comprise: an indication for the determined content category, wherein the indication is configured to be selectable, and a control to expand the content guide to output selectable options for additional content categories; and causing, by the computing device, output of the generated content guide.
  • 2. The method of claim 1, wherein determining the content category associated with the content item comprises determining a content provider associated with the content item or a genre associated with the content item.
  • 3. The method of claim 1, wherein generating the content guide further comprises: determining a second content category associated with the content item, wherein the second content category is a subcategory of the content category; and generating the content guide to further comprise a second indication for the second content category.
  • 4. The method of claim 3, wherein the second content category corresponds to one or more content providers associated with the content category or the content item.
  • 5. The method of claim 1, wherein causing output of the generated content guide comprises overlaying the content guide on the content item.
  • 6. The method of claim 1, wherein causing output of the generated content guide comprises causing output of the content guide while causing output of the content item.
  • 7. A method comprising: receiving, by a computing device and while causing output of a content item associated with a content category, a user request to output a content guide; determining, by the computing device, while causing output of the content item, and based on the user request, a content provider associated with the content item; generating, by the computing device and based on the user request, the content guide to comprise: an indication for the determined content provider, wherein the indication is configured to be selectable, and a control to expand the content guide to output selectable options for other content providers providing content items associated with the content category; and causing, by the computing device, output of the generated content guide.
  • 8. The method of claim 7, wherein the content category associated with the content item comprises a genre associated with the content item.
  • 9. The method of claim 7, wherein generating the content guide comprises: generating the content guide to further comprise a second indication for the content category.
  • 10. The method of claim 7, wherein causing output of the generated content guide comprises overlaying the content guide on the content item.
  • 11. The method of claim 7, wherein causing output of the generated content guide comprises causing output of the content guide while causing output of the content item.
  • 12. A method comprising: determining, during output of a content item, a content category associated with the content item; generating, by a computing device and during the output of the content item, a content guide to comprise: a first indication for the content category associated with the content item, a second indication for a content subcategory of the content category, and a control to expand the content guide to output selectable options for additional content categories; and causing, by the computing device, output of the generated content guide.
  • 13. The method of claim 12, wherein the content category associated with the content item comprises a genre associated with the content item.
  • 14. The method of claim 12, wherein generating the content guide further comprises: generating the content guide to further comprise a second control to expand the content guide to output second selectable options for additional content subcategories.
  • 15. The method of claim 12, wherein causing output of the generated content guide comprises overlaying the generated content guide on the content item.
  • 16. The method of claim 12, wherein causing output of the generated content guide comprises causing output of the generated content guide while causing output of the content item.
  • 17. The method of claim 1, further comprising: receiving a second user request to expand the content guide; and updating the content guide to comprise the selectable options for the additional content categories.
  • 18. The method of claim 7, further comprising: receiving a second user request to expand the content guide; and updating the content guide to comprise the selectable options for the other content providers.
  • 19. The method of claim 12, further comprising: receiving a second user request to expand, based on the control, the content guide; and updating the content guide to comprise the selectable options for the additional content categories.
  • 20. The method of claim 12, wherein the content subcategory comprises a content provider associated with the content item, and wherein generating the content guide further comprises generating the content guide to further comprise: a second control to expand the content guide to output second selectable options for additional content providers providing content items associated with the content category.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation of and claims priority to U.S. patent application Ser. No. 15/968,160, filed May 1, 2018, which claims priority to U.S. patent application Ser. No. 14/279,645, filed May 16, 2014, now U.S. Pat. No. 9,992,546, which claims priority to U.S. patent application Ser. No. 10/925,737, filed Aug. 24, 2004, now U.S. Pat. No. 8,819,734, which is a continuation-in-part of U.S. patent application Ser. No. 10/664,275, filed Sep. 16, 2003, now U.S. Pat. No. 7,703,116. U.S. patent application Ser. No. 10/925,737 also claims priority to U.S. Provisional Patent Application No. 60/552,998, filed Mar. 11, 2004. The disclosures of each of the aforementioned are incorporated by reference herein in their entirety.

US Referenced Citations (471)
Number Name Date Kind
5287489 Nimmo et al. Feb 1994 A
5321750 Nadan Jun 1994 A
5353121 Young et al. Oct 1994 A
5485221 Banker et al. Jan 1996 A
5521841 Arman et al. May 1996 A
5530939 Mansfield, Jr. et al. Jun 1996 A
5583563 Wanderscheid et al. Dec 1996 A
5589892 Knee et al. Dec 1996 A
5592551 Lett et al. Jan 1997 A
5594509 Florin Jan 1997 A
5613057 Caravel Mar 1997 A
5621456 Florin et al. Apr 1997 A
5657072 Aristides et al. Aug 1997 A
5659793 Escobar et al. Aug 1997 A
5666645 Thomas et al. Sep 1997 A
5675752 Scott et al. Oct 1997 A
5694176 Bruette et al. Dec 1997 A
5737552 Lavallee et al. Apr 1998 A
5802284 Karlton et al. Sep 1998 A
5818438 Howe et al. Oct 1998 A
5826102 Escobar et al. Oct 1998 A
5844620 Coleman et al. Dec 1998 A
5850218 LaJoie et al. Dec 1998 A
5852435 Vigneaux et al. Dec 1998 A
5860073 Ferrel et al. Jan 1999 A
5883677 Hofmann Mar 1999 A
5892902 Clark Apr 1999 A
5892905 Brandt et al. Apr 1999 A
5905492 Straub et al. May 1999 A
5929849 Kikinis Jul 1999 A
5945987 Dunn Aug 1999 A
5960194 Choy et al. Sep 1999 A
5990890 Etheredge Nov 1999 A
5996025 Day et al. Nov 1999 A
6002394 Schein et al. Dec 1999 A
6005561 Hawkins et al. Dec 1999 A
6008083 Brabazon et al. Dec 1999 A
6008803 Rowe Dec 1999 A
6008836 Bruck et al. Dec 1999 A
6016144 Blonstein et al. Jan 2000 A
6025837 Matthews, III et al. Feb 2000 A
6038560 Wical Mar 2000 A
6049823 Hwang Apr 2000 A
6061695 Slivka et al. May 2000 A
6067108 Yokote et al. May 2000 A
6088722 Herz et al. Jul 2000 A
6091411 Straub et al. Jul 2000 A
6094237 Hashimoto Jul 2000 A
6141003 Chor et al. Oct 2000 A
6148081 Szymanski et al. Nov 2000 A
6162697 Singh et al. Dec 2000 A
6169543 Wehmeyer Jan 2001 B1
6172677 Stautner et al. Jan 2001 B1
6177931 Alexander et al. Jan 2001 B1
6191781 Chaney et al. Feb 2001 B1
6195692 Hsu Feb 2001 B1
6205582 Hoarty Mar 2001 B1
6219839 Sampsell Apr 2001 B1
6239795 Ulrich et al. May 2001 B1
6240555 Shoff et al. May 2001 B1
6281940 Sciammarella Aug 2001 B1
6292187 Gibbs et al. Sep 2001 B1
6292827 Raz Sep 2001 B1
6295057 Rosin et al. Sep 2001 B1
6314569 Chernock et al. Nov 2001 B1
6317885 Fries Nov 2001 B1
6345305 Beck et al. Feb 2002 B1
6405239 Addington et al. Jun 2002 B1
6415438 Blackketter et al. Jul 2002 B1
6421067 Kamen et al. Jul 2002 B1
6426779 Noguchi et al. Jul 2002 B1
6442755 Lemmons et al. Aug 2002 B1
6477705 Yuen et al. Nov 2002 B1
6486920 Arai et al. Nov 2002 B2
6522342 Gagnon et al. Feb 2003 B1
6529950 Lumelsky et al. Mar 2003 B1
6530082 Del Sesto et al. Mar 2003 B1
6532589 Proehl et al. Mar 2003 B1
6564263 Bergman et al. May 2003 B1
6567104 Andrew et al. May 2003 B1
6571392 Zigmond et al. May 2003 B1
6591292 Morrison et al. Jul 2003 B1
6621509 Eiref et al. Sep 2003 B1
6636887 Augeri Oct 2003 B1
6658661 Arsenault et al. Dec 2003 B1
6678891 Wilcox et al. Jan 2004 B1
6684400 Goode et al. Jan 2004 B1
6694312 Kobayashi et al. Feb 2004 B2
6698020 Zigmond et al. Feb 2004 B1
6704359 Bayrakeri et al. Mar 2004 B1
6731310 Craycroft et al. May 2004 B2
6745367 Bates et al. Jun 2004 B1
6760043 Markel Jul 2004 B2
6763522 Kondo et al. Jul 2004 B1
6766526 Ellis Jul 2004 B1
6806887 Chernock et al. Oct 2004 B2
6857128 Borden, IV et al. Feb 2005 B1
6886029 Pecus et al. Apr 2005 B1
6904610 Bayrakeri et al. Jun 2005 B1
6910191 Segerberg et al. Jun 2005 B2
6918131 Rautila et al. Jul 2005 B1
6963880 Pingte et al. Nov 2005 B1
7028327 Dougherty et al. Apr 2006 B1
7065785 Shaffer et al. Jun 2006 B1
7080400 Navar Jul 2006 B1
7103904 Blackketter et al. Sep 2006 B1
7114170 Harris et al. Sep 2006 B2
7134072 Lovett et al. Nov 2006 B1
7152236 Wugofski et al. Dec 2006 B1
7162694 Venolia Jan 2007 B2
7162697 Markel Jan 2007 B2
7174512 Martin et al. Feb 2007 B2
7177861 Tovinkere et al. Feb 2007 B2
7197715 Valeria Mar 2007 B1
7207057 Rowe Apr 2007 B1
7213005 Mourad et al. May 2007 B2
7221801 Jang et al. May 2007 B2
7237252 Billmaier Jun 2007 B2
7293275 Krieger et al. Nov 2007 B1
7305696 Thomas et al. Dec 2007 B2
7313806 Williams et al. Dec 2007 B1
7337457 Pack et al. Feb 2008 B2
7360232 Mitchell Apr 2008 B2
7363612 Satuloori et al. Apr 2008 B2
7406705 Crinon et al. Jul 2008 B2
7440967 Chidlovskii Oct 2008 B2
7464344 Carmichael et al. Dec 2008 B1
7472137 Edelstein et al. Dec 2008 B2
7490092 Sibley et al. Feb 2009 B2
7516468 Deller et al. Apr 2009 B1
7523180 DeLuca et al. Apr 2009 B1
7587415 Gaurav et al. Sep 2009 B2
7624416 Vandermolen et al. Nov 2009 B1
7640487 Amielh-Caprioglio et al. Dec 2009 B2
7702315 Engstrom et al. Apr 2010 B2
7703116 Moreau et al. Apr 2010 B1
7721307 Hendricks et al. May 2010 B2
7743330 Hendricks et al. Jun 2010 B1
7752258 Lewin et al. Jul 2010 B2
7861259 Barone, Jr. Dec 2010 B2
7913286 Sarachik et al. Mar 2011 B2
7958528 Moreau et al. Jun 2011 B2
7975277 Jerding et al. Jul 2011 B1
8006262 Rodriguez et al. Aug 2011 B2
8032914 Rodriguez Oct 2011 B2
8156533 Crichton Apr 2012 B2
8220018 de Andrade et al. Jul 2012 B2
8266652 Roberts et al. Sep 2012 B2
8296805 Tabatabai et al. Oct 2012 B2
8365230 Chane et al. Jan 2013 B2
8381259 Khosla Feb 2013 B1
8434109 Kamimaeda et al. Apr 2013 B2
8448208 Moreau et al. May 2013 B2
8660545 Redford et al. Feb 2014 B1
8699862 Sharifi et al. Apr 2014 B1
8793256 McIntire et al. Jul 2014 B2
8850495 Pan Sep 2014 B2
8863196 Patil et al. Oct 2014 B2
8938675 Holladay et al. Jan 2015 B2
8943533 de Andrade et al. Jan 2015 B2
8973063 Spilo et al. Mar 2015 B2
9021528 Moreau et al. Apr 2015 B2
9363560 Moreau et al. Jun 2016 B2
9473548 Chakrovorthy et al. Oct 2016 B1
9516253 De Andrade et al. Dec 2016 B2
20010014206 Artigalas et al. Aug 2001 A1
20010027563 White et al. Oct 2001 A1
20010049823 Matey Dec 2001 A1
20010056573 Kovac et al. Dec 2001 A1
20010056577 Gordon et al. Dec 2001 A1
20020010928 Sahota Jan 2002 A1
20020016969 Kimble Feb 2002 A1
20020023270 Thomas et al. Feb 2002 A1
20020026642 Augenbraun et al. Feb 2002 A1
20020032905 Sherr et al. Mar 2002 A1
20020035573 Black et al. Mar 2002 A1
20020041104 Graf et al. Apr 2002 A1
20020042915 Kubischta et al. Apr 2002 A1
20020042920 Thomas et al. Apr 2002 A1
20020046099 Frengut et al. Apr 2002 A1
20020059094 Hosea et al. May 2002 A1
20020059586 Carney et al. May 2002 A1
20020059629 Markel May 2002 A1
20020067376 Martin et al. Jun 2002 A1
20020069407 Fagnani et al. Jun 2002 A1
20020070978 Wishoff et al. Jun 2002 A1
20020078444 Krewin et al. Jun 2002 A1
20020078449 Gordon et al. Jun 2002 A1
20020083450 Kamen et al. Jun 2002 A1
20020100041 Rosenberg et al. Jul 2002 A1
20020104083 Hendricks et al. Aug 2002 A1
20020107973 Lennon et al. Aug 2002 A1
20020108121 Alao et al. Aug 2002 A1
20020108122 Alao et al. Aug 2002 A1
20020120609 Lang et al. Aug 2002 A1
20020124254 Kikinis Sep 2002 A1
20020124256 Suzuka Sep 2002 A1
20020144268 Khoo et al. Oct 2002 A1
20020144269 Connelly Oct 2002 A1
20020144273 Reto Oct 2002 A1
20020147645 Alao et al. Oct 2002 A1
20020152477 Goodman et al. Oct 2002 A1
20020156839 Peterson et al. Oct 2002 A1
20020156890 Carlyle et al. Oct 2002 A1
20020162120 Mitchell Oct 2002 A1
20020169885 Alao et al. Nov 2002 A1
20020170059 Hoang Nov 2002 A1
20020171691 Currans et al. Nov 2002 A1
20020171940 He et al. Nov 2002 A1
20020184629 Sie et al. Dec 2002 A1
20020188944 Noble Dec 2002 A1
20020194181 Wachtel Dec 2002 A1
20020196268 Wolff et al. Dec 2002 A1
20020199187 Gissin et al. Dec 2002 A1
20020199190 Su Dec 2002 A1
20030001880 Holtz et al. Jan 2003 A1
20030005444 Crinon et al. Jan 2003 A1
20030005453 Rodriguez et al. Jan 2003 A1
20030014752 Zaslavsky et al. Jan 2003 A1
20030014753 Beach et al. Jan 2003 A1
20030018755 Masterson et al. Jan 2003 A1
20030023970 Panabaker Jan 2003 A1
20030023975 Schrader et al. Jan 2003 A1
20030025832 Swart et al. Feb 2003 A1
20030028871 Wang et al. Feb 2003 A1
20030028873 Lemmons Feb 2003 A1
20030041104 Wingard et al. Feb 2003 A1
20030051246 Wilder et al. Mar 2003 A1
20030056216 Wugofski et al. Mar 2003 A1
20030056218 Wingard et al. Mar 2003 A1
20030058948 Kelly et al. Mar 2003 A1
20030061028 Dey et al. Mar 2003 A1
20030066081 Barone et al. Apr 2003 A1
20030067554 Klarfeld et al. Apr 2003 A1
20030068046 Lindqvist et al. Apr 2003 A1
20030070170 Lennon Apr 2003 A1
20030079226 Barrett Apr 2003 A1
20030084443 Laughlin et al. May 2003 A1
20030084444 Ullman et al. May 2003 A1
20030084449 Chane et al. May 2003 A1
20030086694 Davidsson May 2003 A1
20030093790 Logan et al. May 2003 A1
20030093792 Labeeb et al. May 2003 A1
20030097657 Zhou et al. May 2003 A1
20030110500 Rodriguez Jun 2003 A1
20030110503 Perkes Jun 2003 A1
20030115219 Chadwick Jun 2003 A1
20030115612 Mao et al. Jun 2003 A1
20030126601 Roberts et al. Jul 2003 A1
20030132971 Billmaier et al. Jul 2003 A1
20030135464 Mourad et al. Jul 2003 A1
20030135582 Allen et al. Jul 2003 A1
20030140097 Schloer Jul 2003 A1
20030151621 McEvilly et al. Aug 2003 A1
20030158777 Schiff et al. Aug 2003 A1
20030172370 Satuloori et al. Sep 2003 A1
20030177501 Takahashi et al. Sep 2003 A1
20030182663 Gudorf et al. Sep 2003 A1
20030189668 Newnam et al. Oct 2003 A1
20030204814 Elo et al. Oct 2003 A1
20030204846 Breen et al. Oct 2003 A1
20030204854 Blackketter et al. Oct 2003 A1
20030207696 Willenegger et al. Nov 2003 A1
20030226141 Krasnow et al. Dec 2003 A1
20030229899 Thompson et al. Dec 2003 A1
20040003402 McKenna Jan 2004 A1
20040003404 Boston et al. Jan 2004 A1
20040019900 Knightbridge et al. Jan 2004 A1
20040019908 Williams et al. Jan 2004 A1
20040022271 Fichet et al. Feb 2004 A1
20040024753 Chane et al. Feb 2004 A1
20040025180 Begeja et al. Feb 2004 A1
20040031015 Ben-Romdhane et al. Feb 2004 A1
20040031058 Reisman Feb 2004 A1
20040031062 Lemmons Feb 2004 A1
20040039754 Harple Feb 2004 A1
20040073915 Dureau Apr 2004 A1
20040078814 Allen Apr 2004 A1
20040107437 Reichardt et al. Jun 2004 A1
20040107439 Hassell et al. Jun 2004 A1
20040111465 Chuang et al. Jun 2004 A1
20040128699 Delpuch et al. Jul 2004 A1
20040133923 Watson et al. Jul 2004 A1
20040136698 Mock Jul 2004 A1
20040168186 Rector et al. Aug 2004 A1
20040172648 Xu et al. Sep 2004 A1
20040189658 Dowdy Sep 2004 A1
20040194136 Finseth et al. Sep 2004 A1
20040199578 Kapczynski et al. Oct 2004 A1
20040221306 Noh Nov 2004 A1
20040224723 Farcasiu Nov 2004 A1
20040225751 Urali Nov 2004 A1
20040226051 Carney et al. Nov 2004 A1
20050005288 Novak Jan 2005 A1
20050015796 Bruckner et al. Jan 2005 A1
20050015804 LaJoie et al. Jan 2005 A1
20050028208 Ellis et al. Feb 2005 A1
20050086172 Stefik Apr 2005 A1
20050125835 Wei Jun 2005 A1
20050149972 Knudson Jul 2005 A1
20050155063 Bayrakeri et al. Jul 2005 A1
20050160458 Baumgartner Jul 2005 A1
20050166230 Gaydou Jul 2005 A1
20050204385 Sull et al. Sep 2005 A1
20050259147 Nam et al. Nov 2005 A1
20050262542 DeWeese et al. Nov 2005 A1
20050283800 Ellis et al. Dec 2005 A1
20050287948 Hellwagner et al. Dec 2005 A1
20060004743 Murao et al. Jan 2006 A1
20060059525 Jerding et al. Mar 2006 A1
20060068818 Leitersdorf et al. Mar 2006 A1
20060080707 Laksono Apr 2006 A1
20060080716 Nishikawa et al. Apr 2006 A1
20060104511 Guo et al. May 2006 A1
20060105793 Gutowski et al. May 2006 A1
20060125962 Shelton et al. Jun 2006 A1
20060143191 Cho et al. Jun 2006 A1
20060156336 Knudson et al. Jul 2006 A1
20060195865 Fablet Aug 2006 A1
20060200842 Chapman et al. Sep 2006 A1
20060206470 McIntyre Sep 2006 A1
20060206912 Klarfeld et al. Sep 2006 A1
20060233514 Weng et al. Oct 2006 A1
20060248572 Kitsukama et al. Nov 2006 A1
20070019001 Ha Jan 2007 A1
20070050343 Siddaramappa et al. Mar 2007 A1
20070064715 Lloyd et al. Mar 2007 A1
20070083538 Roy et al. Apr 2007 A1
20070112761 Xu et al. May 2007 A1
20070157247 Cordray et al. Jul 2007 A1
20070211762 Song et al. Sep 2007 A1
20070214123 Messer et al. Sep 2007 A1
20070214488 Nguyen et al. Sep 2007 A1
20070220016 Estrada et al. Sep 2007 A1
20070239707 Collins et al. Oct 2007 A1
20070250901 McIntire et al. Oct 2007 A1
20070260700 Messer Nov 2007 A1
20070261072 Boulet et al. Nov 2007 A1
20070271587 Rowe Nov 2007 A1
20080037722 Klassen Feb 2008 A1
20080060011 Kelts Mar 2008 A1
20080060020 Kelts Mar 2008 A1
20080071770 Schloter et al. Mar 2008 A1
20080092201 Agarwal et al. Apr 2008 A1
20080113504 Lee et al. May 2008 A1
20080126109 Cragun et al. May 2008 A1
20080133504 Messer et al. Jun 2008 A1
20080148317 Opaluch Jun 2008 A1
20080163304 Ellis Jul 2008 A1
20080183681 Messer et al. Jul 2008 A1
20080183698 Messer et al. Jul 2008 A1
20080189740 Carpenter et al. Aug 2008 A1
20080196070 White et al. Aug 2008 A1
20080204595 Rathod et al. Aug 2008 A1
20080208796 Messer et al. Aug 2008 A1
20080208839 Sheshagiri et al. Aug 2008 A1
20080221989 Messer et al. Sep 2008 A1
20080235209 Rathod et al. Sep 2008 A1
20080235393 Kunjithapatham et al. Sep 2008 A1
20080235725 Hendricks Sep 2008 A1
20080250010 Rathod et al. Oct 2008 A1
20080256097 Messer et al. Oct 2008 A1
20080266449 Rathod et al. Oct 2008 A1
20080276278 Krieger et al. Nov 2008 A1
20080282294 Carpenter et al. Nov 2008 A1
20080288641 Messer et al. Nov 2008 A1
20080288644 Gilfix et al. Nov 2008 A1
20080301320 Morris Dec 2008 A1
20080301732 Archer et al. Dec 2008 A1
20080317233 Rey et al. Dec 2008 A1
20090006315 Mukherjea et al. Jan 2009 A1
20090019485 Ellis et al. Jan 2009 A1
20090024629 Miyauchi Jan 2009 A1
20090025054 Gibbs et al. Jan 2009 A1
20090083257 Bargeron et al. Mar 2009 A1
20090094113 Berry et al. Apr 2009 A1
20090094632 Newnam et al. Apr 2009 A1
20090094651 Damm et al. Apr 2009 A1
20090123021 Jung et al. May 2009 A1
20090133025 Malhotra et al. May 2009 A1
20090164904 Horowitz et al. Jun 2009 A1
20090183210 Andrade Jul 2009 A1
20090222872 Schlack Sep 2009 A1
20090228441 Sandvik Sep 2009 A1
20090240650 Wang et al. Sep 2009 A1
20090249427 Dunnigan et al. Oct 2009 A1
20090271829 Larsson et al. Oct 2009 A1
20090288132 Hegde Nov 2009 A1
20090292548 Van Court Nov 2009 A1
20100023966 Shahraray et al. Jan 2010 A1
20100077057 Godin et al. Mar 2010 A1
20100079670 Frazier et al. Apr 2010 A1
20100175084 Ellis et al. Jul 2010 A1
20100180300 Carpenter et al. Jul 2010 A1
20100223640 Reichardt et al. Sep 2010 A1
20100250190 Zhang et al. Sep 2010 A1
20100251284 Ellis et al. Sep 2010 A1
20100257548 Lee et al. Oct 2010 A1
20110055282 Hoving Mar 2011 A1
20110058101 Earley et al. Mar 2011 A1
20110087348 Wong Apr 2011 A1
20110093909 Roberts et al. Apr 2011 A1
20110131204 Bodin et al. Jun 2011 A1
20110176787 DeCamp Jul 2011 A1
20110209180 Ellis et al. Aug 2011 A1
20110211813 Marks Sep 2011 A1
20110214143 Rits et al. Sep 2011 A1
20110219386 Hwang et al. Sep 2011 A1
20110219419 Reisman Sep 2011 A1
20110225417 Maharajh et al. Sep 2011 A1
20110246495 Mallinson Oct 2011 A1
20110247042 Mallinson Oct 2011 A1
20120002111 Sandoval et al. Jan 2012 A1
20120011550 Holland Jan 2012 A1
20120054811 Spears Mar 2012 A1
20120066602 Chai et al. Mar 2012 A1
20120117151 Bill May 2012 A1
20120185905 Kelley Jul 2012 A1
20120192226 Zimmerman et al. Jul 2012 A1
20120227073 Hosein et al. Sep 2012 A1
20120233646 Coniglio et al. Sep 2012 A1
20120295686 Lockton Nov 2012 A1
20120324002 Chen Dec 2012 A1
20120324494 Burger et al. Dec 2012 A1
20120324495 Matthews, III et al. Dec 2012 A1
20120324518 Thomas et al. Dec 2012 A1
20130014155 Clarke et al. Jan 2013 A1
20130040623 Chun et al. Feb 2013 A1
20130051770 Sargent Feb 2013 A1
20130103446 Bragdon et al. Apr 2013 A1
20130110769 Ito May 2013 A1
20130111514 Slavin et al. May 2013 A1
20130170813 Woods et al. Jul 2013 A1
20130176493 Khosla Jul 2013 A1
20130198642 Carney et al. Aug 2013 A1
20130262997 Markworth et al. Oct 2013 A1
20130298038 Spivack et al. Nov 2013 A1
20130316716 Tapia et al. Nov 2013 A1
20130326570 Cowper et al. Dec 2013 A1
20130332839 Frazier et al. Dec 2013 A1
20130332852 Castanho et al. Dec 2013 A1
20130332855 Roman et al. Dec 2013 A1
20130347018 Limp et al. Dec 2013 A1
20130347030 Oh et al. Dec 2013 A1
20140006951 Hunter Jan 2014 A1
20140009680 Moon et al. Jan 2014 A1
20140026068 Park et al. Jan 2014 A1
20140032473 Enoki et al. Jan 2014 A1
20140053078 Kannan Feb 2014 A1
20140068648 Green et al. Mar 2014 A1
20140075465 Petrovic et al. Mar 2014 A1
20140082519 Wang et al. Mar 2014 A1
20140089423 Jackels Mar 2014 A1
20140089967 Mandalia et al. Mar 2014 A1
20140129570 Johnson May 2014 A1
20140149918 Asokan et al. May 2014 A1
20140150022 Oh et al. May 2014 A1
20140237498 Ivins Aug 2014 A1
20140267931 Gilson et al. Sep 2014 A1
20140279852 Chen Sep 2014 A1
20140280695 Sharma et al. Sep 2014 A1
20140282122 Mathur Sep 2014 A1
20140325359 Vehovsky et al. Oct 2014 A1
20140327677 Walker Nov 2014 A1
20140334381 Subramaniam et al. Nov 2014 A1
20140359662 Packard et al. Dec 2014 A1
20140365302 Walker Dec 2014 A1
20140373032 Merry et al. Dec 2014 A1
20150020096 Walker Jan 2015 A1
20150026743 Kim et al. Jan 2015 A1
20150263923 Kruglick Sep 2015 A1
Foreign Referenced Citations (24)
Number Date Country
0624039 Nov 1994 EP
0963115 Dec 1999 EP
1058999 Dec 2000 EP
1080582 Mar 2001 EP
2323489 Sep 1998 GB
2448874 Nov 2008 GB
2448875 Nov 2008 GB
9963757 Dec 1999 WO
2000011869 Mar 2000 WO
0033576 Jun 2000 WO
0110115 Feb 2001 WO
0182613 Nov 2001 WO
2001084830 Nov 2001 WO
02063426 Aug 2002 WO
02063471 Aug 2002 WO
02063851 Aug 2002 WO
02063878 Aug 2002 WO
03009126 Jan 2003 WO
2003026275 Mar 2003 WO
2007115224 Oct 2007 WO
2008053132 May 2008 WO
2011053271 May 2011 WO
2012094105 Jul 2012 WO
2012154541 Nov 2012 WO
Non-Patent Literature Citations (99)
Entry
U.S. Pat. No. 7,703,116, System and Method for Construction, Delivery and Display of iTV Applications That Blend Programming Information of On-Demand and Broadcast Service Offerings, Sep. 16, 2003.
U.S. Pat. No. 7,805,746, Optimized Application On-The-Wire Format for Construction, Delivery and Display of Enhanced Television, Oct. 18, 2005.
U.S. Pat. No. 7,818,667, Verification of Semantic Constraints in Multimedia Data and in Its Announcement, Signaling and Interchange, May 3, 2006.
U.S. Pat. No. 7,958,528, System and Method for Construction, Delivery and Display of iTV Applications That Blend Programming Information of On-Demand and Broadcast Service Offerings, Mar. 24, 2010.
U.S. Pat. No. 8,042,132, System and Method for Construction, Delivery and Display of iTV Content, Mar. 14, 2003.
U.S. Pat. No. 8,220,018, System and Method for Preferred Placement Programming of iTV Content, Sep. 2, 2004.
U.S. Pat. No. 8,352,983, Programming Contextual Interactive User Interface for Television, Jul. 11, 2003.
U.S. Pat. No. 8,365,230, Interactive User Interface for Television Applications, Sep. 19, 2002.
U.S. Pat. No. 8,413,205, System and Method for Construction, Delivery and Display of iTV Content, Jul. 29, 2003.
U.S. Pat. No. 8,416,952, Channel Family Surf Control, Jul. 19, 2004.
U.S. Pat. No. 8,448,208, System and Method for Construction, Delivery, and Display of iTV Applications That Blend, Mar. 17, 2011.
U.S. Pat. No. 8,578,411, System and Method for Controlling iTV Application Behaviors Through the Use of Application Profile Filters, Sep. 25, 2003.
U.S. Pat. No. 8,707,354, Graphically Rich, Modular, Promotional Tile Interface for Interactive Television, Jun. 12, 2003.
U.S. Pat. No. 8,745,658, System and Method for Construction, Delivery and Display of iTV Content, Jun. 16, 2011.
U.S. Pat. No. 8,756,634, Programming Contextual Interactive User Interface for Television, Sep. 12, 2012.
U.S. Pat. No. 8,819,734, Contextual Navigational Control for Digital Television, Aug. 24, 2004.
U.S. Pat. No. 8,850,480, Interactive User Interface for Television Applications, Jan. 28, 2013.
U.S. Pat. No. 8,943,533, System and Method for Preferred Placement Programming of iTV Content, May 31, 2012.
U.S. Pat. No. 9,021,528, System and Method for Construction, Delivery and Display of iTV Applications That Blend, Apr. 10, 2013.
U.S. Pat. No. 9,197,938, Programming Contextual Interactive User Interface for Television, Jan. 29, 2014.
U.S. Pat. No. 9,363,560, System and Method for Construction, Delivery and Display of iTV Applications That Blend Programming Information of On-Demand and Broadcast Service Offerings, Mar. 18, 2015.
U.S. Pat. No. 9,414,022, Verification of Semantic Constraints in Multimedia Data and In Its Announcement, Signaling and Interchange, Sep. 8, 2010.
U.S. Pat. No. 9,451,196, System and Method for Construction, Delivery and Display of iTV Content, Jun. 7, 2012.
U.S. Pat. No. 9,516,253, System and Method for Preferred Placement Programming of iTV Content, Dec. 16, 2014.
U.S. Pat. No. 9,553,927, Synchronizing Multiple Transmissions of Content, Mar. 13, 2013.
U.S. Pat. No. 9,729,924, System and Method for Construction, Delivery and Display of iTV Applications That Blend Programming Information of On-Demand and Broadcast Service Offerings, Apr. 25, 2016.
U.S. Pat. No. 9,967,611, Prioritized Placement of Content Elements for iTV Applications, Oct. 6, 2016.
U.S. Pat. No. 9,992,546, Contextual Navigational Control for Digital Television, May 16, 2014.
U.S. Pat. No. 10,110,973, Verification of Semantic Constraints in Multimedia Data and in its Announcement, Signaling and Interchange, May 5, 2016.
U.S. Pat. No. 10,149,014, Interactive User Interface for Television Applications, Jan. 8, 2013.
U.S. Pat. No. 10,171,878, System and Method for Controlling iTV Application Behaviors through the Use of Application Profile Filters, Sep. 27, 2013.
U.S. Pat. No. 10,237,617, System and Method for Construction, Delivery and Display of iTV Applications that Blend Programming Information of On-Demand and Broadcast Service Offerings, Jun. 30, 2017.
U.S. Pat. No. 10,491,942, Prioritized Placement of Content Elements for iTV Applications, Mar. 1, 2018.
U.S. Pat. No. 10,575,070, Validation of Content, Aug. 31, 2018.
U.S. Pat. No. 10,587,930, Interactive User Interface for Television Applications, Oct. 15, 2018.
U.S. Pat. No. 10,602,225, System and Method for Construction, Delivery and Display of iTV Content, Feb. 20, 2013.
U.S. Pat. No. 10,616,644, System and Method for Blending Linear Content, Non-Linear Content, or Managed Content, Jan. 18, 2019.
U.S. Pat. No. 10,664,138, Providing Supplemental Content, Mar. 14, 2013.
U.S. Pat. No. 10,687,114, System and Method for Controlling iTV Application Behaviors Through the Use of Application Profile Filters, Nov. 13, 2018.
U.S. Appl. No. 15/968,160, Contextual Navigational Control for Digital Television, filed May 1, 2018.
U.S. Pat. No. 10,848,830, Content Event Messaging, Mar. 14, 2013.
U.S. Pat. No. 10,880,609, Method and Apparatus for Delivering Video and Video Related Content at Sub-Asset Level, Nov. 20, 2008.
U.S. Appl. No. 10/306,752, Broadcast Database, filed Nov. 27, 2002.
U.S. Appl. No. 10/635,799, User Customization of User Interfaces for Interactive Television, filed Aug. 5, 2003.
U.S. Appl. No. 12/274,452, Method and Apparatus for Delivering Video and Video Related Content at Sub-Asset Level, filed Nov. 20, 2008.
U.S. Appl. No. 13/671,626, Crowdsourcing Supplemental Content, filed Nov. 8, 2012.
U.S. Appl. No. 14/520,819, Systems and Methods for Curating Content Metadata, filed Oct. 22, 2014.
U.S. Appl. No. 14/842,196, System and Method for Construction, Delivery and Display of iTV Content, filed Sep. 1, 2015.
U.S. Appl. No. 16/740,921, Validation of Content, filed Jan. 13, 2020.
U.S. Appl. No. 16/746,111, System and Method for Blending Linear Content, Non-Linear Content, or Managed Content, filed Jan. 17, 2020.
U.S. Appl. No. 16/851,814, Providing Supplemental Content for a Second Screen Experience, filed Apr. 17, 2020.
U.S. Appl. No. 17/100,341, Content Event Messaging, filed Nov. 20, 2020.
Andreas Kraft and Klaus Hofrichter, “An Approach for Script-Based Broadcast Application Production”, Springer-Verlag Berlin Heidelberg, pp. 74-82, 1999.
Fernando Pereira, “The MPEG-4 Book”, Prentice Hall, Jul. 10, 2002.
Michael Adams, “Open Cable Architecture”, Cisco Press, Dec. 3, 1999.
Mark Riehl, “XML and Perl”, Sams, Oct. 16, 2002.
MetaTV, Inc., PCT/US02/29917 filed Sep. 19, 2002, International Search Report dated Apr. 14, 2003; ISA/US; 6 pages.
Sylvain Devillers, “Bitstream Syntax Definition Language: an Input to MPEG-21 Content Representation”, Mar. 2001, ISO, ISO/IEC JTC1/SC29/WG11 MPEG01/M7053.
Shim, et al., “A SMIL Based Graphical Interface for Interactive TV”, Internet Tech. Laboratory Dept. of Comp. Engineering, San Jose State University, pp. 257-266, 2003.
Yoon, et al., "Video Gadget: MPEG-7 Based Audio-Visual Content Indexing and Browsing Engine", LG Electronics Institute of Technology, 2001, pp. 59-68.
Watchwith webpage; http://www.watchwith.com/content_owners/watchwith_plalform_components.jsp (last visited Mar. 12, 2013).
Matt Duffy; TVplus App reveals content click-through rates north of 10% across sync enabled programming; http://www.tvplus.com/blog/TVplus-App-reveals-content-click-through-rates-north-of-10-Percent-across-sync-enabled-programming (retrieved from the Wayback Machine on Mar. 12, 2013).
"In Time for Academy Awards Telecast, Companion TV App Umami Debuts First Real-Time Sharing of a TV Program's Images"; Umami News; http://www.umami.tv/2012-02-23.html (retrieved from the Wayback Machine on Mar. 12, 2013).
European Patent Application No. 09175979.5—Office Action dated Dec. 13, 2011.
Canadian Patent Application No. 2,685,833—Office Action dated Jan. 20, 2012.
Li, Y. et al., "Reliable Video Clock Time Recognition", Pattern Recognition, 2006, ICPR 2006, 18th International Conference on Pattern Recognition, 4 pages.
European Search Report dated Mar. 1, 2010.
Salton et al., Computer Evaluation of Indexing and Text Processing, Journal of the Association for Computing Machinery, vol. 15, No. 1, Jan. 1968, pp. 8-36.
Smith, J.R. et al., An Image and Video Search Engine for the World-Wide Web, Storage and Retrieval for Image and Video Databases 5, San Jose, Feb. 13-14, 1997, Proceedings of SPIE, Bellingham, SPIE, US, vol. 3022, Feb. 13, 1997, pp. 84-95.
Kontothanassis, Leonidas et al., "Design, Implementation, and Analysis of a Multimedia Indexing and Delivery Server", Technical Report Series, Aug. 1999, Cambridge Research Laboratory.
Messer, Alan et al., "SeeNSearch: A Context-Directed Search Facilitator for Home Entertainment Devices", Paper, Samsung Information Systems America Inc., San Jose, CA, 2008.
Boulgouris N. V. et al., “Real-Time Compressed-Domain Spatiotemporal Segmentation and Ontologies for Video Indexing and Retrieval”, IEEE Transactions on Circuits and Systems for Video Technology, vol. 14, No. 5, pp. 606-621, May 2004.
Changsheng Xu et al., “Using Webcast Text for Semantic Event Detection in Broadcast Sports Video”, IEEE Transactions on Multimedia, vol. 10, No. 7, pp. 1342-1355, Nov. 2008.
Liang Bai et al., “Video Semantic Content Analysis based on Ontology”, International Machine Vision and Image Processing Conference, pp. 117-124, Sep. 2007.
Koskela M. et al., “Measuring Concept Similarities in Multimedia Ontologies: Analysis and Evaluations”, IEEE Transactions on Multimedia, vol. 9, No. 5, pp. 912-922, Aug. 2007.
Steffen Staab et al., "Semantic Multimedia", Reasoning Web; Lecture Notes in Computer Science, pp. 125-170, Sep. 2008.
European Search Report for Application No. 09180776.8, dated Jun. 7, 2010, 9 pages.
European Search Report, EP 09 18 0762, completion date Mar. 22, 2010.
European Search Report dated Jun. 4, 2010.
EP Application No. 09 179 987.4-1241—Office Action dated Feb. 15, 2011.
European Application No. 09 175 979.5—Office Action dated Apr. 11, 2011.
Boronat, F. et al., "Multimedia group and inter-stream synchronization techniques: A comparative study", Information Systems, Pergamon Press, Oxford, GB, vol. 34, No. 1, Mar. 1, 2009, pp. 108-131, XP025644936.
Extended European Search Report—EP14159227.9—dated Sep. 3, 2014.
Canadian Office Action—CA 2,685,833—dated Jan. 22, 2015.
European Extended Search Report—EP 13192112.4—dated May 11, 2015.
CA Response to Office Action—CA Appl. 2,685,833—Submitted Jul. 17, 2015.
Response to European Office Action—European Appl. 13192112.4—submitted Dec. 9, 2015.
CA Office Action—CA App 2,685,833—dated Jan. 27, 2016.
European Office Action—EP App 14159227.9—dated Jul. 12, 2016.
Agnieszka Zagozdzinska et al., "TRIDAQ Systems in HEP Experiments at LHC Accelerator", Kwartalnik Elektroniki i Telekomunikacji, vol. 59, No. 4, Oct. 2013.
CA Office Action—CA Application 2685833—dated Feb. 8, 2017.
Nov. 29, 2017—Canadian Office Action—CA 2,685,833.
Feb. 19, 2018—European Summons to Oral Proceedings—EP 14159227.9.
Mar. 9, 2018—European Office Action—EP 13192112.4.
Jul. 31, 2018—European Decision to Refuse—14159227.9.
Sep. 5, 2019—Canadian Office Action—CA 2,685,833.
Nov. 6, 2019—Canadian Office Action—CA 2,832,800.
Apr. 21, 2020—European Summons to Oral Proceedings—EP 09175979.5.
Aug. 24, 2020, Canadian Office Action, CA 2,832,800.
Related Publications (1)
Number Date Country
20210051376 A1 Feb 2021 US
Provisional Applications (1)
Number Date Country
60552998 Mar 2004 US
Continuations (3)
Number Date Country
Parent 15968160 May 2018 US
Child 17076446 US
Parent 14279645 May 2014 US
Child 15968160 US
Parent 10925737 Aug 2004 US
Child 14279645 US
Continuation in Parts (1)
Number Date Country
Parent 10664275 Sep 2003 US
Child 10925737 US