Method and system for changing programming on a television

Information

  • Patent Grant
  • Patent Number
    10,051,314
  • Date Filed
    Friday, August 16, 2013
  • Date Issued
    Tuesday, August 14, 2018
Abstract
A method, system, and executable instructions are provided that receive a request to change a television tuner from a first channel to a second channel, the first channel providing first programming and the second channel providing second programming, wherein the first and second programming are different and changing from the first channel to the second channel causes a delay in providing, by the television tuner, the second programming, determine, based on metadata associated with the second channel and/or second programming, an image associated with the second programming, and, during the delay, display, on a screen of the television, the image.
Description
BACKGROUND

Consolidation of device features, or technological convergence, is an increasing trend. Technological convergence describes the tendency for different technological systems to evolve toward performing similar tasks. As people use more devices, the need to carry those devices, charge those devices, update software on those devices, etc. becomes more cumbersome. To compensate for these problems, technology companies have been integrating features from different devices into one or two multi-functional devices. For example, cellular phones are now capable of accessing the Internet, taking photographs, providing calendar functions, etc.


The consolidation trend is now affecting the design and functionality of devices generally used in the home. For example, audio receivers can access the Internet, digital video recorders can store or provide access to digital photographs, etc. The television in home audio/video systems remains a cornerstone device because the display function cannot be integrated into other devices. As such, consolidating home devices leads to integrating features and functionality into the television. The emergence of the Smart Television (Smart TV) is evidence of the trend to consolidate functionality into the television.


A Smart TV is generally conceived as a device that integrates access to the Internet and Web 2.0 features into television sets. The Smart TV represents the trend of technological convergence between computers and television sets. Smart TVs generally focus on online interactive media, Internet TV, and on-demand streaming media rather than on traditional broadcast media. Unfortunately, most Smart TVs have yet to provide seamless and intuitive user interfaces for navigating and/or executing their various features. As such, there are still issues with the consolidation of features and the presentation of these features in Smart TVs.


SUMMARY

There is a need for an Intelligent TV with intuitive user interfaces and with seamless user interaction capability. These and other needs are addressed by the various aspects, embodiments, and/or configurations of the present disclosure. Also, while the disclosure is presented in terms of exemplary embodiments, it should be appreciated that individual aspects of the disclosure can be separately claimed.


A method is provided that can include the steps:


receiving, by a television, a request to change a television tuner from a first channel to a second channel, the first channel providing first programming and the second channel providing second programming, the first and second programming being different and changing from the first channel to the second channel causing a delay in providing, by the television tuner, the second programming;


determining, by a microprocessor executable application framework and based on metadata associated with the second channel and/or second programming, a graphical image associated with the second programming; and


during the delay, displaying, by the microprocessor executable application framework and on a screen of the television, the image.


A television is provided that can include:


a television tuner;


a screen for displaying programming;


an infrared and/or radio frequency module configured to receive a (wireless) request, such as from a remote control, to change the television tuner from a first channel to a second channel; and


a microprocessor executable application framework configured to determine, based on metadata associated with the second channel and/or second programming, a graphical image associated with the second programming and, during the delay caused by changing from the first channel to the second channel, display, on the screen of the television, the image.


A tangible and non-transient computer readable medium is provided that includes microprocessor executable instructions that, when executed, are configured to perform operations that can include:


receive a request to change a television tuner from a first channel to a second channel;


determine, based on metadata associated with the second channel and/or second programming, a graphical image associated with the second programming; and


during the delay caused by the channel change, display, on a screen of the television, the image.
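
The flow shared by the method, television, and computer readable medium above can be illustrated in a short sketch. The following is a minimal, non-limiting illustration; the Tuner, MetadataService, and Screen interfaces are hypothetical stand-ins and are not named in the disclosure:

```python
# Minimal sketch of the claimed flow: while the tuner re-locks on the new
# channel, a preview image derived from the target channel's metadata is
# shown in place of a blank screen. Tuner, MetadataService, and Screen are
# hypothetical stand-ins, not interfaces named in the disclosure.

class ChannelChanger:
    def __init__(self, tuner, metadata_service, screen):
        self.tuner = tuner
        self.metadata = metadata_service
        self.screen = screen

    def change_channel(self, new_channel):
        # Determine, from EPG metadata, an image associated with the
        # second programming (a captured frame, picture, or photograph).
        info = self.metadata.lookup(new_channel)
        image = info.get("preview_image")

        # Begin tuning; this causes the delay recited in the claims.
        self.tuner.tune_async(new_channel)

        # During the delay, display the image (full screen or header bar).
        if image is not None:
            self.screen.show_image(image)

        # Once the tuner locks, switch to the live second programming.
        self.tuner.wait_until_locked()
        self.screen.show_live(self.tuner)
```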


Each of the first and second programming can be one or more of a movie, an episode, a sporting event, and a television special.


The image can be one of a captured frame of the second programming, a picture, and a photograph.


The captured frame can be a still image and not a moving image.


The image can be displayed full screen.


The image can be displayed as part of a header bar.


The header bar can include a program name, series name, episode number, episode name, and/or sport event name and a plurality of a program description, a program type, a program category, a start-to-finish time, current runtime, and a remaining runtime.
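
By way of a non-limiting illustration, the header bar's contents could be modeled as a simple record; the field names below are assumptions made for the sketch, not terms drawn from the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative record of the header bar fields listed above; the field
# names are assumptions for this sketch, not terms from the disclosure.
@dataclass
class HeaderBar:
    program_name: Optional[str] = None
    series_name: Optional[str] = None
    episode_number: Optional[int] = None
    episode_name: Optional[str] = None
    sport_event_name: Optional[str] = None
    description: Optional[str] = None
    program_type: Optional[str] = None
    category: Optional[str] = None
    start_to_finish: Optional[str] = None    # e.g., "8:00-9:00 PM"
    current_runtime: Optional[str] = None    # e.g., "0:23"
    remaining_runtime: Optional[str] = None  # e.g., "0:37"
```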


The metadata and image can be provided by an electronic program guide service.


The metadata can be populated into an electronic program guide using the Program and System Information Protocol.


The metadata can be converted into a broadcast-ready format by a Programming Metadata Communication Protocol generator.
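
Such a generator serializes program metadata into a broadcast-ready document. The sketch below is a simplified, hypothetical illustration; the element names are placeholders and do not follow the actual ATSC PMCP schema:

```python
import xml.etree.ElementTree as ET

# Simplified sketch of a PMCP-style generator: program metadata is
# serialized to XML for broadcast insertion. The element names below are
# placeholders and do not follow the actual ATSC PMCP schema.
def to_broadcast_xml(channel, program):
    root = ET.Element("ProgramMetadata", channel=str(channel))
    for key in ("title", "description", "start_time", "duration"):
        if key in program:
            ET.SubElement(root, key).text = str(program[key])
    return ET.tostring(root, encoding="unicode")

print(to_broadcast_xml(7, {"title": "Evening News", "duration": "PT30M"}))
```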


The present disclosure can provide a number of advantages depending on the particular aspect, embodiment, and/or configuration. The display of an image provided by or derived from metadata associated with the programming to be provided after the channel change can provide meaningful information without needing to wait for the channel change to be activated fully. This can improve the channel changing experience as all channels can show a full screen channel preview while tuning takes place. This can enable the user to flip quickly through channels to see what is on without needing to wait for the tuner to catch up. The header bar can provide further useful information during channel changing.


These and other advantages will be apparent from the disclosure.


The phrases “at least one”, “one or more”, and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C”, “at least one of A, B, or C”, “one or more of A, B, and C”, “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.


The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising”, “including”, and “having” can be used interchangeably.


The term “automatic” and variations thereof, as used herein, refers to any process or operation done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material.”


A “blog” (a blend of the term “web log”) is a type of website, or part of a website, that is updated with new content from time to time. Blogs are usually maintained by an individual with regular entries of commentary, descriptions of events, or other material such as graphics or video. Entries are commonly displayed in reverse-chronological order.


A “blogging service” is a blog-publishing service that allows private or multi-user blogs with time-stamped entries.


The term “cable TV” refers to a system of distributing television programs to subscribers via radio frequency (RF) signals transmitted through coaxial cables or light pulses through fiber-optic cables. This contrasts with traditional broadcast television (terrestrial television) in which the television signal is transmitted over the air by radio waves and received by a television antenna attached to the television.


The term “channel” or “television channel,” as used herein, can be a physical or virtual channel over which a television station or television network is distributed. A physical channel in analog television can be an amount of bandwidth, typically 6, 7, or 8 MHz, that occupies a predetermined channel frequency. A virtual channel is a representation, in cable or satellite television, of a data stream for a particular television media provider (e.g., CBS, TNT, HBO, etc.).


The term “computer-readable medium,” as used herein, refers to any tangible storage and/or transmission medium that participates in providing instructions to a processor for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, NVRAM, or magnetic or optical disks. Volatile media includes dynamic memory, such as main memory. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic medium, magneto-optical medium, a CD-ROM, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, a solid state medium like a memory card, any other memory chip or cartridge, a carrier wave as described hereinafter, or any other medium from which a computer can read. A digital file attachment to e-mail or other self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. When the computer-readable media is configured as a database, it is to be understood that the database may be any type of database, such as relational, hierarchical, object-oriented, and/or the like. Accordingly, the disclosure is considered to include a tangible storage medium or distribution medium and prior art-recognized equivalents and successor media, in which the software implementations of the present disclosure are stored.


The term “enhanced television” (ETV) refers to a collection of specifications developed under the OpenCable project of CableLabs (Cable Television Laboratories, Inc.) that define an ETV Application consisting of resources (files) adhering to the Enhanced TV Binary Interchange Format (EBIF) content format as well as PNG images, JPEG images, and PFR downloadable fonts. An ETV application is normally delivered through an MPEG transport stream and accompanies an MPEG program containing video and audio elementary streams. An “ETV Application” is a collection of resources (files) that include one or more EBIF resources that represent viewable information in the form of pages. Two forms of a given ETV Application may be distinguished: (1) an interchange form and (2) an execution form. The interchange form of an ETV Application consists of the resources (files) that represent the compiled application prior to its actual execution by an ETV User Agent. The execution form of an ETV Application consists of the stored, and possibly mutated forms of these resources while being decoded, presented, and executed by an ETV User Agent. An “ETV User Agent” is a software component that operates on a set-top box, a television, or any other computing environment capable of receiving, decoding, presenting, and processing an ETV Application. This component usually provides, along with its host hardware environment, one or more mechanisms for an end-user to navigate and interact with the multimedia content represented by ETV Applications.


The term “high-definition television” (HDTV) refers to a television system providing a resolution that is substantially higher than that of standard-definition television. HDTV may be transmitted in various formats, namely: 1080p (1920×1080p: 2,073,600 pixels, approximately 2.1 megapixels, per frame); 1080i (typically either 1920×1080i: 1,036,800 pixels, approximately 1 megapixel, per field or 2,073,600 pixels, approximately 2.1 megapixels, per frame; or 1440×1080i: 777,600 pixels, approximately 0.8 megapixels, per field or 1,555,200 pixels, approximately 1.6 megapixels, per frame); or 720p (1280×720p: 921,600 pixels, approximately 0.9 megapixels, per frame). As will be appreciated, “frame size” in pixels is defined as number of horizontal pixels×number of vertical pixels, for example 1280×720 or 1920×1080. Often the number of horizontal pixels is implied from context and is omitted, as in the case of 720p and 1080p. The “scanning system” is identified with the letter “p” for progressive scanning or “i” for interlaced scanning, and “frame rate” is identified as the number of video frames per second. For interlaced systems, an alternative form specifying the number of fields per second is often used. For purposes of this disclosure, “high-definition television” is deemed to include other high-definition analog or digital video formats, including ultra high definition television.
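
The frame sizes recited in this definition reduce to a one-line calculation: pixels per frame is the number of horizontal pixels times the number of vertical pixels, and an interlaced field carries half of a frame. For example:

```python
# Pixels per frame is horizontal pixels times vertical pixels; an
# interlaced field carries half the lines, and so half the pixels.
def pixels_per_frame(width, height):
    return width * height

assert pixels_per_frame(1920, 1080) == 2_073_600        # 1080p/1080i frame
assert pixels_per_frame(1280, 720) == 921_600           # 720p frame
assert pixels_per_frame(1920, 1080) // 2 == 1_036_800   # one 1080i field
```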


The term “internet television” (otherwise known as Internet TV, Online Television, or Online TV) refers to the digital distribution of television content via the Internet. It should not be confused with Web television (short programs or videos created by a wide variety of companies and individuals) or Internet protocol television (IPTV), an emerging internet technology standard for use by television broadcasters. Internet Television is a general term that covers the delivery of television shows and other video content over the internet by video streaming technology, typically by major traditional television broadcasters. It does not describe a technology used to deliver content (see Internet protocol television). Internet television has become very popular through services such as RTÉ Player in Ireland; BBC iPlayer, 4oD, ITV Player (also STV Player and UTV Player) and Demand Five in the United Kingdom; Hulu in the United States; Nederland 24 in the Netherlands; ABC iview and Australia Live TV in Australia; Tivibu in Turkey; and iWanTV! in the Philippines.


The term “internet protocol television” (IPTV) refers to a system through which television services are delivered using the Internet protocol suite over a packet-switched network such as the Internet, instead of being delivered through traditional terrestrial, satellite signal, and cable television formats. IPTV services may be classified into three main groups, namely live television, with or without interactivity related to the current TV show; time-shifted television: catch-up TV (replays a TV show that was broadcast hours or days ago), start-over TV (replays the current TV show from its beginning); and video on demand (VOD): browse a catalog of videos, not related to TV programming. IPTV is distinguished from Internet television by its on-going standardization process (e.g., European Telecommunications Standards Institute) and preferential deployment scenarios in subscriber-based telecommunications networks with high-speed access channels into end-user premises via set-top boxes or other customer-premises equipment.


The term “silo,” as used herein, can be a logical representation of an input, source, or application. An input can be a device or devices (e.g., DVD, VCR, etc.) electrically connected to the television through a port (e.g., HDMI, video/audio inputs, etc.) or through a network (e.g., LAN, WAN, etc.). Rather than a device or devices, the input could be configured as an electrical or physical connection to one or more devices. A source, particularly a content source, can be a data service that provides content (e.g., a media center, a file system, etc.). An application can be a software service that provides a particular type of function (e.g., Live TV, Video on Demand, User Applications, photograph display, etc.). The silo, as a logical representation, can have an associated definition or property, such as a setting, feature, or other characteristic.


The term “panel,” as used herein, can mean a user interface displayed in at least a portion of the display. The panel may be interactive (e.g., accepts user input) or informational (e.g., does not accept user input). A panel may be translucent whereby the panel obscures but does not mask the underlying content being displayed in the display. Panels may be provided in response to a user input from a button or remote control interface.


The term “screen,” as used herein, refers to a physical structure that includes one or more hardware components that provide the device with the ability to render a user interface and/or receive user input. A screen can encompass any combination of a gesture capture region, a touch sensitive display, and/or a configurable area. The device can have one or more physical screens embedded in the hardware. However, a screen may also include an external peripheral device that may be attached to and detached from the device. In embodiments, multiple external devices may be attached to the device. For example, another screen may be included with a remote control unit that interfaces with the Intelligent TV.


The term “media” or “multimedia,” as used herein, refers to content that may assume one or a combination of different content forms. Multimedia can include one or more of, but is not limited to, text, audio, still images, animation, video, or interactivity content forms.


The term “Intelligent TV,” as used herein, refers to a television configured to provide one or more intuitive user interfaces and interactions based on a unique application platform and architecture. The Intelligent TV utilizes processing resources associated with the television to integrate Internet connectivity with parallel application functionality. This integration allows a user the ability to intuitively access various sources of media and content (e.g., Internet, over-the-top content, on-demand streaming media, over-the-air broadcast media, and/or other forms of information) via the Intelligent TV in a quick and efficient manner. Although the Intelligent TV disclosed herein may comprise one or more components of a “smart TV,” it is an aspect of the Intelligent TV to provide expanded intuitive user interaction capability for navigating and executing the various features of the television. A “smart TV,” sometimes referred to as a connected TV, or hybrid TV (not to be confused with IPTV, Internet TV, or with Web TV), describes a trend of integration of the Internet and Web 2.0 features into television sets and set-top boxes, as well as the technological convergence between computers and these television sets/set-top boxes. The smart TV devices have a higher focus on online interactive media, Internet TV, over-the-top content, as well as on-demand streaming media, and less focus on traditional broadcast media than traditional television sets and set-top boxes. As can be appreciated, the Intelligent TV encompasses a broader range of technology than that of the smart TV defined above.


The term “television” refers to a telecommunication medium, device (or set) or set of associated devices, programming, and/or transmission for transmitting and receiving moving images that can be monochrome (black-and-white) or colored, with or without accompanying sound. Different countries use one of the three main video standards for TVs, namely PAL, NTSC, or SECAM. Television is most commonly used for displaying broadcast television signals. The broadcast television system is typically disseminated via radio transmissions on designated channels in the 54-890 MHz frequency band. A common television set comprises multiple internal electronic circuits, including those for receiving and decoding broadcast signals. A visual display device which lacks a tuner is properly called a video monitor, rather than a television. A television may be distinguished from other monitors or displays based on the distance maintained between the user and the television when the user watches the media and based on the inclusion of a tuner or other electronic circuit to receive the broadcast television signal.


The term “Live TV,” as used herein, refers to a television production broadcast in real-time, as events happen, in the present.


The term “standard-definition television” (SDTV) refers to a television system that uses a resolution that is not considered to be either high-definition television (HDTV 720p and 1080p) or enhanced-definition television (EDTV 480p). The two common SDTV signal types are 576i, with 576 interlaced lines of resolution, derived from the European-developed PAL and SECAM systems, and 480i, based on the American National Television System Committee (NTSC) system. In the US, digital SDTV is broadcast in the same 4:3 aspect ratio as NTSC signals. However, in other parts of the world that used the PAL or SECAM analog standards, standard-definition television is now usually shown with a 16:9 aspect ratio. Standards that support digital SDTV broadcast include DVB, ATSC, and ISDB. Television signals are transmitted in digital form, and their pixels have a rectangular shape, as opposed to the square pixels used in modern computer monitors and modern implementations of HDTV. The actual image (be it 4:3 or 16:9) is always contained in the center 704 horizontal pixels of the digital frame, regardless of how many horizontal pixels (704 or 720) are used. In the case of a digital video signal having 720 horizontal pixels, only the center 704 pixels contain the actual 4:3 or 16:9 image, and the 8-pixel-wide stripes on either side are called nominal analogue blanking and should be discarded before displaying the image. Nominal analogue blanking should not be confused with overscan, as overscan areas are part of the actual 4:3 or 16:9 image.
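
The nominal analogue blanking rule above (keep the center 704 of 720 horizontal pixels) can be expressed as a short sketch; the row-based frame representation is an assumption made for illustration:

```python
# The center 704 of 720 horizontal pixels carry the image; the 8-pixel
# stripes on either side are nominal analogue blanking and are discarded.
def strip_nominal_blanking(frame_rows, width=720, active=704):
    margin = (width - active) // 2  # 8 pixels per side for 720 -> 704
    return [row[margin:width - margin] for row in frame_rows]

rows = [list(range(720)) for _ in range(480)]  # stand-in 720x480 frame
assert len(strip_nominal_blanking(rows)[0]) == 704
```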


The term “video on demand” (VOD), as used herein, refers to systems and processes which allow users to select and watch/listen to video or audio content on demand. VOD systems may stream content for viewing in real time or download the content to a storage medium for viewing at a later time.


The term “satellite positioning system receiver” refers to a wireless receiver or transceiver to receive and/or send location signals from and/or to a satellite positioning system, such as the Global Positioning System (“GPS”) (US), GLONASS (Russia), Galileo positioning system (EU), Compass navigation system (China), and Regional Navigational Satellite System (India).


The term “display,” as used herein, refers to at least a portion of a screen used to display the output of the television to a user. A display may be a single-screen display or a multi-screen display, referred to as a composite display. A composite display can encompass the touch sensitive display of one or more screens. A single physical screen can include multiple displays that are managed as separate logical displays. Thus, different content can be displayed on the separate displays although part of the same physical screen.


The term “displayed image,” as used herein, refers to an image produced on the display. A typical displayed image is a television broadcast or menu. The displayed image may occupy all or a portion of the display.


The term “display orientation,” as used herein, refers to the way in which a rectangular display is oriented by a user for viewing. The two most common types of display orientation are portrait and landscape. In landscape mode, the display is oriented such that the width of the display is greater than the height of the display (such as a 4:3 ratio, which is 4 units wide and 3 units tall, or a 16:9 ratio, which is 16 units wide and 9 units tall). Stated differently, the longer dimension of the display is oriented substantially horizontal in landscape mode while the shorter dimension of the display is oriented substantially vertical. In the portrait mode, by contrast, the display is oriented such that the width of the display is less than the height of the display. Stated differently, the shorter dimension of the display is oriented substantially horizontal in the portrait mode while the longer dimension of the display is oriented substantially vertical.
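
The definition implies a direct test, sketched below: a display is in landscape orientation when its width exceeds its height, and in portrait orientation otherwise:

```python
# Landscape when width exceeds height, portrait otherwise.
def orientation(width, height):
    return "landscape" if width > height else "portrait"

assert orientation(1920, 1080) == "landscape"  # 16:9
assert orientation(1080, 1920) == "portrait"
```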


The term “module,” as used herein, refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and software that is capable of performing the functionality associated with that element.


The terms “determine,” “calculate” and “compute,” and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique.


The terms “touch screen” or “touchscreen” refer to a screen that can receive user contact or other tactile input, such as from a stylus. The touch screen may sense user contact in a number of different ways, such as by a change in an electrical parameter (e.g., resistance or capacitance), acoustic wave variations, infrared radiation proximity detection, light variation detection, and the like. In a resistive touch screen, for example, normally separated conductive and resistive metallic layers in the screen pass an electrical current. When a user touches the screen, the two layers make contact in the contacted location, whereby a change in electrical field is noted and the coordinates of the contacted location are calculated. In a capacitive touch screen, a capacitive layer stores electrical charge, which is discharged to the user upon contact with the touch screen, causing a decrease in the charge of the capacitive layer. The decrease is measured, and the contacted location coordinates are determined. In a surface acoustic wave touch screen, an acoustic wave is transmitted through the screen, and the acoustic wave is disturbed by user contact. A receiving transducer detects the user contact instance and determines the contacted location coordinates.
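
In the schemes described above, the contacted coordinate is derived from a measured electrical quantity. The sketch below illustrates the linear mapping for one axis of a resistive readout; the ADC resolution and screen width are assumptions made for illustration:

```python
# Linear mapping from a measured divider voltage (as an ADC count) to a
# contact coordinate on one axis; the ADC resolution and screen width
# below are assumptions for illustration.
def adc_to_coordinate(adc_value, adc_max=4095, screen_pixels=1920):
    return int(adc_value / adc_max * (screen_pixels - 1))

x = adc_to_coordinate(2048)   # mid-scale reading lands near screen center
assert 940 <= x <= 980
```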


The term “web television” refers to original television content produced for broadcast via the World Wide Web. Some major distributors of web television are YouTube, Myspace, Newgrounds, Blip.tv, and Crackle.


The terms “instant message” and “instant messaging” refer to a form of real-time text communication between two or more people, typically based on typed text.


The term “internet search engine” refers to a web search engine designed to search for information on the World Wide Web and FTP servers. The search results are generally presented in a list of results often referred to as SERPs, or “search engine results pages.” The information may consist of web pages, images, information, and other types of files. Some search engines also mine data available in databases or open directories. Web search engines work by storing information about many web pages, which they retrieve from the HTML itself. These pages are retrieved by a Web crawler (sometimes also known as a spider), an automated Web browser which follows every link on the site. The contents of each page are then analyzed to determine how it should be indexed (for example, words are extracted from the titles, headings, or special fields called meta tags). Data about web pages are stored in an index database for use in later queries. Some search engines, such as Google™, store all or part of the source page (referred to as a cache) as well as information about the web pages, whereas others, such as AltaVista™, store every word of every page they find.


The terms “online community”, “e-community”, or “virtual community” mean a group of people that primarily interact via a computer network, rather than face to face, for social, professional, educational, or other purposes. The interaction can use a variety of media formats, including wikis, blogs, chat rooms, Internet forums, instant messaging, email, and other forms of electronic media. Many media formats are used in social software, separately or in combination, including text-based chatrooms and forums that use voice, video, text, or avatars.


The term “remote control” refers to a component of an electronic device, most commonly a television set, DVD player, and/or home theater system, for operating the device wirelessly, typically from a short line-of-sight distance. Remote controls normally use infrared and/or radio frequency (RF) signaling and can include WiFi, wireless USB, Bluetooth™ connectivity, motion sensor enabled capabilities, and/or voice control. A touchscreen remote control is a handheld remote control device which uses a touchscreen user interface to replace most of the hard, built-in physical buttons used in normal remote control devices.


The term “satellite TV” refers to television programming delivered by the means of communications satellites and received by an outdoor antenna, usually a parabolic reflector generally referred to as a satellite dish, and as far as household usage is concerned, a satellite receiver either in the form of an external set-top box or a satellite tuner module built into a TV set.


The term “social network service” is a service provider that builds online communities of people, who share interests and/or activities, or who are interested in exploring the interests and activities of others. Most social network services are web-based and provide a variety of ways for users to interact, such as e-mail and instant messaging services.


The term “social network” refers to a web-based social network.


The term “gesture” refers to a user action that expresses an intended idea, action, meaning, result, and/or outcome. The user action can include manipulating a device (e.g., opening or closing a device, changing a device orientation, moving a trackball or wheel, etc.), movement of a body part in relation to the device, movement of an implement or tool in relation to the device, audio inputs, etc. A gesture may be made on a device (such as on the screen) or with the device to interact with the device.


The term “gesture capture” refers to the sensing or detection of an instance and/or type of user gesture. The gesture capture can occur in one or more areas of the screen. A gesture region can be on the display, where it may be referred to as a touch sensitive display, or off the display, where it may be referred to as a gesture capture area.


The term “electronic address” refers to any contactable address, including a telephone number, instant message handle, e-mail address, Universal Resource Locator (URL), Universal Resource Identifier (URI), Address of Record (AOR), electronic alias in a database, like addresses, and combinations thereof.


It shall be understood that the term “means,” as used herein, shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112(f). Accordingly, a claim incorporating the term “means” shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials or acts and the equivalents thereof shall include all those described in the summary of the invention, brief description of the drawings, detailed description, abstract, and claims themselves.


The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, embodiments, and/or configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, embodiments, and/or configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A includes a first view of an embodiment of an environment of an intelligent television;



FIG. 1B includes a second view of an embodiment of an environment of an intelligent television;



FIG. 2A includes a first view of an embodiment of an intelligent television;



FIG. 2B includes a second view of an embodiment of an intelligent television;



FIG. 2C includes a third view of an embodiment of an intelligent television;



FIG. 2D includes a fourth view of an embodiment of an intelligent television;



FIG. 3 is a block diagram of an embodiment of the hardware of the intelligent television;



FIG. 4 is a block diagram of an embodiment of the intelligent television software and/or firmware;



FIG. 5 is a second block diagram of an embodiment of the intelligent television software and/or firmware;



FIG. 6 is a third block diagram of an embodiment of the intelligent television software and/or firmware;



FIG. 7 is a plan view of an embodiment of a handheld remote control;



FIG. 8 is a side view of an embodiment of a remote control;



FIG. 9A is a bottom view of an embodiment of a remote control with a joystick in a neutral position;



FIG. 9B is a bottom view of an embodiment of a remote control with the joystick in a lower position;



FIG. 9C is a bottom view of an embodiment of a remote control with the joystick in an upper position;



FIG. 10 is a plan view of another embodiment of a handheld remote control;



FIG. 11A is a front view of an embodiment of an Intelligent TV screen;



FIG. 11B is a front view of an embodiment of an Intelligent TV screen;



FIG. 11C is a front view of an embodiment of an Intelligent TV screen;



FIG. 12 is a block diagram of an embodiment of a handheld remote control of either FIG. 7 or 10;



FIG. 13 is a block diagram of an embodiment of a content data service;



FIG. 14 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 15 is a partial screen shot of the Intelligent TV screen according to an embodiment;



FIG. 16 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 17 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 18 is a partial screen shot of the Intelligent TV screen according to an embodiment;



FIG. 19 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 20 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 21 is a partial screen shot of the Intelligent TV screen according to an embodiment;



FIG. 22 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 23 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 24 is a screen shot of the Intelligent TV screen according to an embodiment;



FIG. 25 is a flow chart according to an embodiment;



FIG. 26 is a flow chart according to an embodiment; and



FIG. 27 is a series of partial screen shots of the Intelligent TV screen according to an embodiment.





In the appended figures, similar components and/or features may have the same reference label. Further, various components of the same type may be distinguished by following the reference label by a letter that distinguishes among the similar components. If only the first reference label is used in the specification, the description is applicable to any one of the similar components having the same first reference label irrespective of the second reference label.


DETAILED DESCRIPTION

Presented herein are embodiments of a device. The device can be a network-enabled telecommunications device, such as a television, an electronic visual display device, or other smart device. The device can include one or more screens, or sections of a screen, that are configured to receive and present information from a number of sources. Further, the device can receive user input in unique ways. The overall design and functionality of the device provides for an enhanced user experience making the device more useful and more efficient.


Intelligent Television (TV) Environment:


Referring to FIGS. 1A and 1B, an Intelligent TV, or device, 100 is shown. It is anticipated that the Intelligent TV 100 may be used for entertainment, business applications, social interaction, content creation and/or consumption, and to organize and control one or more other devices that are in communication with the Intelligent TV 100. As can be appreciated, the Intelligent TV can be used to enhance the user interactive experience whether at home or at work.


In some embodiments, the Intelligent TV 100 may be configured to receive and understand a variety of user and/or device inputs. For example, a user may interface with the Intelligent TV 100 via one or more physical or electrical controls, such as buttons, switches, touch sensitive screens/regions (e.g., capacitive touch, resistive touch, etc.), and/or other controls associated with the Intelligent TV 100. In some cases, the Intelligent TV 100 may include the one or more interactive controls. Additionally or alternatively, the one or more controls may be associated with a remote control. The remote control may communicate with the Intelligent TV 100 via wired and/or wireless signals. As can be appreciated, the remote control may operate via radio frequency (RF), infrared (IR), and/or a specific wireless communications protocol (e.g., Bluetooth™, Wi-Fi, etc.). In some cases, the controls, whether physical or electrical, may be configured (e.g., programmed) to suit a user's preferences.


Additionally or alternatively, smart phones, tablets, computers, laptops, netbooks, and other smart devices may be used to control the Intelligent TV 100. For example, control of the Intelligent TV 100 may be achieved via an application running on a smart device. The application may be configured to present a user with various Intelligent TV 100 controls in an intuitive user interface (UI) on a screen associated with the device 100. The screen may be a touch sensitive, or touch screen, display. Selections input by a user via the UI may be configured to control the Intelligent TV 100 by the application accessing one or more communication features associated with the smart device.


It is anticipated that the Intelligent TV 100 can receive input via various input devices including, but in no way limited to, video, audio, radio, light, tactile, and combinations thereof. Among other things, these input devices may be configured to allow the Intelligent TV 100 to see, recognize, and react to user gestures. For instance, a user may talk to the Intelligent TV 100 in a conversational manner. The Intelligent TV 100 may hear and understand voice commands in a manner similar to a smart device's intelligent personal assistant and voice-controlled navigator application (e.g., Apple's Siri, Android's Skyvi, Robin, Iris, and other applications).


The Intelligent TV 100 may also be a communications device which can establish network connections 104 through many alternate means, including wired 108 or wireless 112 means, over cellular networks 116 to connect via cellular base antenna 142 to telephone networks operated by telephone company 146, and by using a telephone line 120 to connect to telephone networks operated by telephone company 146. These connections 104 enable the Intelligent TV 100 to access one or more communication networks 132. The communication networks may comprise any type of known communication medium or collection of communication media and may use any type of protocols to transport messages or signals between endpoints. The communication networks may include wired and/or wireless communication technologies. The Internet is an example of a communication network 132 that constitutes an Internet Protocol (IP) network consisting of many computers, computing networks, and other communication devices located all over the world, which are connected through many telephone systems and other means.


Other examples of the communication network 132 include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a cellular network, and any other type of packet-switched or circuit-switched network known in the art. In addition, it can be appreciated that the communication network 132 need not be limited to any one network type, and instead may be comprised of a number of different networks and/or network types.


In some embodiments, the Intelligent TV 100 may be equipped with multiple communication means. The multiple communication means may allow the Intelligent TV 100 to communicate across Local Area Networks (LANs) 124, wireless local area networks (WLANs) 128, and other networks 132. The networks 132 may be connected in a redundant manner to ensure network access. In other words, if one connection is interrupted, the Intelligent TV 100 can use an alternate communications path to reestablish and/or maintain the network connection 104. Among other things, the Intelligent TV 100 may use these network connections 104 to send and receive information, interact with an electronic program guide (EPG) 136, receive software updates 140, contact customer service 144 (e.g., to receive help or service, etc.), and/or access remotely stored digital media libraries 148. In addition, these connections can allow the Intelligent TV 100 to make phone calls, send and/or receive email messages, send and/or receive text messages (such as email and instant messages), surf the Internet using an internet search engine, post blogs via a blogging service, and connect/interact with social media sites and/or an online community (e.g., Facebook™, Twitter, LinkedIn™, Pinterest™, GooglePlus™, MySpace™, and the like) maintained by a social network service. In combination with other components of the Intelligent TV 100 described in more detail below, these network connections 104 also enable the Intelligent TV 100 to conduct video teleconferences, electronic meetings, and other communications. The Intelligent TV 100 may capture and store images and sound, using associated cameras, microphones, and other sensors. Additionally or alternatively, the Intelligent TV 100 may create and save screen shots of media, images, and data displayed on a screen associated with the Intelligent TV 100.


Further, as shown in FIG. 1B, the Intelligent TV 100 can interact with other electronic devices 168 by either the wired 108 and/or wireless 112 connections. As described herein, components of the Intelligent TV 100 allow the device 100 to be connected to devices 168 including, but not limited to, DVD players 168a, BluRay players 168b, portable digital media devices 168c, smart phones 168d, tablet devices 168e, personal computers 168f, external cable boxes 168g, keyboards 168h, pointing devices 168i, printers 168j, game controllers and/or game pads 168k, satellite dishes 168l, external display devices 168m, and other universal serial bus (USB), local area network (LAN), Bluetooth™, or high-definition multimedia interface (HDMI) compliant devices, and/or wireless devices. When connected to an external cable box 168g or satellite dish 168l, the Intelligent TV 100 can access additional media content. Also, as further described below, the Intelligent TV 100 is capable of receiving digital and/or analog signals broadcast by TV stations. The Intelligent TV 100 can be configured as one or more of a standard-definition television, enhanced television, and high-definition television. It may operate as one or more of cable, Internet, Internet Protocol, satellite, web, and/or smart television. The Intelligent TV 100 may also be used to control the operation of, and may interface with, other smart components such as security systems 172, door/gate controllers 176, remote video cameras 180, lighting systems 184, thermostats 188, refrigerators 192, and other appliances.


Intelligent TV:



FIGS. 2A-2D illustrate components of the Intelligent TV 100. In general, as shown by FIG. 2A, the Intelligent TV 100 can be supported by a removable base or stand 204 that is attached to a frame 208. The frame 208 surrounds edges of a display screen 212, leaving a front surface of the display screen 212 uncovered. The display screen 212 may comprise a Liquid Crystal Display (LCD) screen, a plasma screen, Light Emitting Diode (LED) screen, or other screen types. In embodiments, the entire front surface of the screen 212 may be touch sensitive and capable of receiving input by the user touching the front surface of the screen 212.


The Intelligent TV 100 may include integrated speakers 216 and at least one microphone 220. A first area of the frame 208 may comprise a horizontal gesture capture region 224, and second areas may comprise vertical gesture capture regions 228. The gesture capture regions 224, 228 may comprise areas or regions that are capable of receiving input by recognizing gestures made by the user, and in some examples, without the need for the user to actually touch the screen 212 surface of the Intelligent TV 100. However, the gesture capture regions 224, 228 may not include pixels that can perform a display function or capability.


One or more image capture devices 232, such as a camera, can be included for capturing still and/or video images. The image capture device 232 can include or be associated with additional elements, such as a flash or other light source 236 and a range finding device 240 to assist focusing of the image capture device. In addition, the microphone 220, gesture capture regions 224, 228, image capture devices 232, and the range finding device 240 may be used by the Intelligent TV 100 to recognize individual users. Additionally or alternatively, the Intelligent TV 100 may learn and remember preferences associated with the individual users. In some embodiments, the learning and remembering (i.e., identifying and recalling stored information) may be associated with the recognition of a user.


An IR transmitter and receiver 244 may also be provided to connect the Intelligent TV 100 with a remote control device (not shown) or other IR devices. Additionally or alternatively, the remote control device may transmit wireless signals via RF, light, and/or a means other than IR. Also shown in FIG. 2A is an audio jack 248, which may be hidden behind a panel that is hinged or removable. The audio jack 248 accommodates a tip, ring, sleeve (TRS) connector, for example, to allow the user to utilize headphones, a headset, or other external audio equipment.


The Intelligent TV 100 can also include a number of buttons 252. For example, FIG. 2A illustrates the buttons 252 on the top of the Intelligent TV 100, although the buttons could be placed at other locations. As shown, the Intelligent TV 100 includes six buttons 252a-f, which can be configured for specific inputs. For example, the first button 252a may be configured as an on/off button used to control overall system power to the Intelligent TV 100. The buttons 252 may be configured to, in combination or alone, control a number of aspects of the Intelligent TV 100. Examples include, but are not limited to, overall system volume, brightness, the image capture device, the microphone, and initiation/termination of a video conference. Instead of separate buttons, two of the buttons may be combined into a rocker button. This rocker button arrangement may be useful in situations where the buttons are configured to control features such as volume or brightness. In some embodiments, one or more of the buttons 252 are capable of supporting different user commands. By way of example, a normal press has a duration commonly of less than about 1 second and resembles a quick input. A medium press has a duration commonly of 1 second or more but less than about 12 seconds. A long press has a duration commonly of about 12 seconds or more. The function of the buttons is normally specific to the application that is active on the Intelligent TV 100. In the video conference application, for instance, and depending on the particular button, a normal, medium, or long press can mean end the video conference, increase or decrease the volume, increase a rate of speed associated with a response to an input, or toggle microphone mute. Depending on the particular button, a normal, medium, or long press can also control the image capture device 232 to increase zoom, decrease zoom, take a photograph, or record video.
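
The press durations given above amount to a simple classification, sketched below; the function name is illustrative, and the exact numeric thresholds stand in for the approximate durations in the text:

```python
# Classify a button press by its duration, using the thresholds given
# above: under about 1 second is normal, under about 12 seconds is
# medium, and about 12 seconds or more is long.
def classify_press(duration_seconds):
    if duration_seconds < 1.0:
        return "normal"
    if duration_seconds < 12.0:
        return "medium"
    return "long"

assert classify_press(0.3) == "normal"
assert classify_press(5) == "medium"
assert classify_press(15) == "long"
```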


In support of communications functions or capabilities, the Intelligent TV 100 can include one or more shared or dedicated antennae 256 and wired broadband connections 260 as shown in FIG. 2B. The antennae 256 also enable the Intelligent TV 100 to receive digital and/or analog broadcast TV channels. The wired broadband connections 260 are, for example, a Digital Subscriber Line (DSL), an optical line, an Ethernet port, an IEEE 1394 interface, or other interfaces. The Intelligent TV 100 also has a telephone line jack 262 to further provide communications capability.


In addition to the removable base 204, the Intelligent TV 100 may include hardware and mounting points 264 on a rear surface to facilitate mounting the Intelligent TV 100 to a surface, such as a wall. In one example, the Intelligent TV 100 may incorporate at least one Video Equipment Standards Association (VESA) mounting interface for attaching the device 100 to the surface.


As shown in FIG. 2C, the Intelligent TV 100 may include docking interfaces or ports 268. The docking ports 268 may include proprietary or universal ports to support the interconnection of the Intelligent TV 100 to other devices or components, which may or may not include additional or different capabilities from those integral to the Intelligent TV 100. In addition to supporting an exchange of communication signals between the Intelligent TV 100 and a connected device or component, the docking ports 268 can support the supply of power to the connected device or component. The docking ports 268 can also comprise an intelligent element that comprises a docking module for controlling communications or other interactions between the Intelligent TV 100 and the connected device or component.


The Intelligent TV 100 also includes a number of card slots 272 and network or peripheral interface ports 276. The card slots 272 may accommodate different types of cards including subscriber identity modules (SIM), secure digital (SD) cards, MiniSD cards, flash memory cards, and other cards. Ports 276 in embodiments may include input/output (I/O) ports, such as universal serial bus (USB) ports, parallel ports, game ports, and high-definition multimedia interface (HDMI) connectors.


An audio/video (A/V) I/O module 280 can be included to provide audio to an interconnected speaker or other device, and to receive audio input from a connected microphone or other device. As an example, the audio input/output interface 280 may comprise an associated amplifier and analog to digital converter.


Hardware Features:



FIG. 3 illustrates components of an Intelligent TV 100 in accordance with embodiments of the present disclosure. In general, the Intelligent TV 100 includes a primary screen 304. Screen 304 can be a touch sensitive screen and can include different operative areas.


For example, a first operative area, within the screen 304, may comprise a display 310. In some embodiments, the display 310 may be touch sensitive. In general, the display 310 may comprise a full color display.


A second area within the screen 304 may comprise a gesture capture region 320. The gesture capture region 320 may comprise an area or region that is outside of the display 310 area, and that is capable of receiving input, for example in the form of gestures provided by a user. However, the gesture capture region 320 does not include pixels that can perform a display function or capability.


A third region of the screen 304 may comprise a configurable area 312. The configurable area 312 is capable of receiving input and has display or limited display capabilities. In embodiments, the configurable area 312 may present different input options to the user. For example, the configurable area 312 may display buttons or other relatable items. Moreover, the identity of displayed buttons, or whether any buttons are displayed at all within the configurable area 312 of a screen 304, may be determined from the context in which the Intelligent TV 100 is used and/or operated.


In an exemplary touch sensitive screen 304 embodiment, the touch sensitive screen 304 comprises a liquid crystal display extending across at least those regions of the touch sensitive screen 304 that are capable of providing visual output to a user, and a capacitive input matrix over those regions of the touch sensitive screen 304 that are capable of receiving input from the user.


One or more display controllers 316 may be provided for controlling the operation of the screen 304. The display controller 316 may control the operation of the touch sensitive screen 304, including input (touch sensing) and output (display) functions. The display controller 316 may also control the operation of the screen 304 and may interface with other inputs, such as infrared and/or radio input signals (e.g., door/gate controllers, alarm system components, etc.). In accordance with still other embodiments, the functions of a display controller 316 may be incorporated into other components, such as a processor 364.


The processor 364 may comprise a general purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, the processor 364 may include multiple processor cores, and/or implement multiple virtual processors. In accordance with still other embodiments, the processor 364 may include multiple physical processors. As a particular example, the processor 364 may comprise a specially configured application specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like. The processor 364 generally functions to run programming code or instructions implementing various functions of the Intelligent TV 100.


In support of connectivity functions or capabilities, the Intelligent TV 100 can include a module for encoding/decoding and/or compression/decompression 366 for receiving and managing digital television information. Encoding/decoding compression/decompression module 366 enables decompression and/or decoding of analog and/or digital information dispatched by a public television chain or in a private television network and received across antenna 324, I/O module 348, wireless connectivity module 328, and/or other wireless communications module 332. The television information may be sent to screen 304 and/or attached speakers receiving analog or digital reception signals. Any encoding/decoding and compression/decompression is performable on the basis of various formats (e.g., audio, video, and data). Encrypting module 368 is in communication with encoding/decoding compression/decompression module 366 and enables the confidentiality of all the data received or transmitted by the user or supplier.


In support of communications functions or capabilities, the Intelligent TV 100 can include a wireless connectivity module 328. As examples, the wireless connectivity module 328 can comprise a GSM, CDMA, FDMA and/or analog cellular telephony transceiver capable of supporting voice, multimedia and/or data transfers over a cellular network. Alternatively or in addition, the Intelligent TV 100 can include an additional or other wireless communications module 332. As examples, the other wireless communications module 332 can comprise a Wi-Fi, Bluetooth™, WiMax, infrared, or other wireless communications link. The wireless connectivity module 328 and the other wireless communications module 332 can each be associated with a shared or a dedicated antenna 324 and a shared or dedicated I/O module 348.


An input/output module 348 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of an input/output module 348 include an Ethernet port, a Universal Serial Bus (USB) port, Thunderbolt™ or Light Peak interface, Institute of Electrical and Electronics Engineers (IEEE) 1394 port, or other interface.


An audio input/output interface/device(s) 344 can be included to provide analog audio to an interconnected speaker or other device, and to receive analog audio input from a connected microphone or other device. As an example, the audio input/output interface/device(s) 344 may comprise an associated amplifier and analog-to-digital converter. Alternatively or in addition, the Intelligent TV 100 can include an integrated audio input/output device 356 and/or an audio jack for interconnecting an external speaker or microphone. For example, an integrated speaker and an integrated microphone can be provided, to support near talk or speaker phone operations.


A port interface 352 may be included. The port interface 352 may include proprietary or universal ports to support the interconnection of the device 100 to other devices or components, such as a dock, which may or may not include additional or different capabilities from those integral to the device 100. In addition to supporting an exchange of communication signals between the device 100 and another device or component, the docking port 136 and/or port interface 352 can support the supply of power to or from the device 100. The port interface 352 also comprises an intelligent element that comprises a docking module for controlling communications or other interactions between the Intelligent TV 100 and a connected device or component. The docking module may interface with software applications that allow for the remote control of other devices or components (e.g., media centers, media players, and computer systems).


An Intelligent TV 100 may also include memory 308 for use in connection with the execution of application programming or instructions by the processor 364, and for the temporary or long term storage of program instructions and/or data. As examples, the memory 308 may comprise RAM, DRAM, SDRAM, or other solid state memory. Alternatively or in addition, data storage 314 may be provided. Like the memory 308, the data storage 314 may comprise a solid state memory device or devices. Alternatively or in addition, the data storage 314 may comprise a hard disk drive or other random access memory.


Hardware buttons 358 can be included for example for use in connection with certain control operations. One or more image capture interfaces/devices 340, such as a camera, can be included for capturing still and/or video images. Alternatively or in addition, an image capture interface/device 340 can include a scanner, code reader, or motion sensor. An image capture interface/device 340 can include or be associated with additional elements, such as a flash or other light source. The image capture interfaces/devices 340 may interface with a user ID module 350 that assists in identifying users of the Intelligent TV 100.


The Intelligent TV 100 can also include a global positioning system (GPS) receiver 336. In accordance with embodiments of the present invention, the GPS receiver 336 may further comprise a GPS module that is capable of providing absolute location information to other components of the Intelligent TV 100. As will be appreciated, other satellite-positioning system receivers can be used in lieu of or in addition to GPS.


Power can be supplied to the components of the Intelligent TV 100 from a power source and/or power control module 360. The power control module 360 can, for example, include a battery, an AC to DC converter, power control logic, and/or ports for interconnecting the Intelligent TV 100 to an external source of power.


Communication between components of the Intelligent TV 100 is provided by bus 322. Bus 322 may comprise one or more physical buses for control, addressing, and/or data transmission. Bus 322 may be parallel, serial, a hybrid thereof, or other technology.


Firmware and Software:


An embodiment of the software system components and modules 400 is shown in FIG. 4. The software system 400 may comprise one or more layers including, but not limited to, an operating system kernel 404, one or more libraries 408, an application framework 412, and one or more applications 416. The one or more layers 404-416 can communicate with each other to perform functions for the Intelligent TV 100.


An operating system (OS) kernel 404 contains the primary functions that allow the software to interact with hardware associated with the Intelligent TV 100. Kernel 404 can include a collection of software that manages the computer hardware resources and provides services for other computer programs or software code. The operating system kernel 404 is the main component of the operating system and acts as an intermediary between the applications and data processing done with the hardware components. Part of the operating system kernel 404 can include one or more device drivers 420. A device driver 420 can be any code within the operating system that helps operate or control a device or hardware attached to or associated with the Intelligent TV. The driver 420 can include code for operating video, audio, and/or other multimedia components of the Intelligent TV 100. Examples of drivers include display, camera, flash, binder (IPC), keypad, WiFi, and audio drivers.


Library 408 can contain code or other components that may be accessed and implemented during the operation of the software system 400. The library 408 may contain one or more of, but is not limited to, an operating system runtime library 424, a TV services hypertext application language (HAL) library 428, and/or a data service library 432. The OS runtime library 424 may contain the code required by the operating system kernel 404 or other operating system functions to be executed during the runtime of the software system 400. The library can include the code that is initiated during the running of the software system 400.


The TV services hypertext application language library 428 can include code required by TV services either executed in the application framework 412 or an application 416. The TV services HAL library 428 is specific to the Intelligent TV 100 operations that control different functions of the Intelligent TV. The TV service HAL library 428 can also be formed from other types of application languages or embodiments of different types of code or formats for code beyond the hypertext application language.


The data services library 432 can include one or more components or code to implement the data services function. The data services function can be implemented in the application framework 412 and/or applications layer 416. An embodiment of a function of the data services and the type of components that may be included is shown in FIG. 6.


The application framework 412 can include a general abstraction for providing functionality that can be selected by one or more applications 416 to provide specific application functions or software for those applications. Thus, the framework 412 can include one or more different services, or other applications, that can be accessed by the applications 416 to provide general functions across two or more applications. Such functions include, for example, management of one or more of windows or panels, surfaces, activities, content, and resources. The application framework 412 can include one or more of, but is not limited to, TV services 436, a TV services framework 440, TV resources 444, and a user interface component 448.


The TV services framework 440 can provide an additional abstraction for different TV services. The TV services framework 440 allows for the general access and function of services that are associated with the TV functionality. The TV services 436 are general services provided within the TV services framework 440 that can be accessed by applications in the applications layer 416. The TV resources 444 provide code for accessing TV resources, including any types of storage, video, audio, or other functionality provided with the Intelligent TV 100. The TV resources 444, TV services 436, and TV services framework 440 provide for the different implementations of TV functionality that may occur with the Intelligent TV 100.


One or more user interface components 448 can provide general components for display of the Intelligent TV 100. The user interface components 448 might be general components that may be accessed by different applications provided in the application framework 412. The user interface components 448 may be accessed to provide for panels and silos as described in conjunction with FIG. 5.


The applications layer 416 can both contain and execute applications associated with the Intelligent TV 100. Applications layer 416 may include one or more of, but is not limited to, a live TV application 452, a video on demand application 456, a media center application 460, an application center application 464, and a user interface application 468. The live TV application 452 can provide live TV over different signal sources. For example, the live TV application 452 can provide TV from cable television input, over-the-air broadcasts, satellite services, or other types of live TV services. The live TV application 452 may then present the multimedia presentation or video and audio presentation of the live television signal over the display of the Intelligent TV 100.


The video on demand application 456 can provide for video from different storage sources. Unlike the live TV application 452, the video on demand application 456 provides for display of videos that are accessed from some memory source. The sources of the video on demand can be associated with users, with the Intelligent TV, or with some other type of service. For example, the video on demand 456 may be provided from an iTunes library stored in the cloud, from local disc storage that contains stored video programs, or from some other source.


The media center application 460 can provide functionality for different types of media presentation. For example, the media center 460 can provide for displaying pictures or playing audio that is different from live TV or video on demand but still accessible by the user. The media center 460 allows access to different sources to obtain media and allows the display of such media on the Intelligent TV 100.


The application center 464 allows for the provision, storage and use of applications. An application can be a game, a productivity application, or some other application generally associated with computer systems or other devices, but may be operated within the Intelligent TV. An application center 464 may obtain these applications from different sources, store them locally and then execute those types of applications for the user on the Intelligent TV 100.


User interface application 468 provides for the specific user interfaces associated with the Intelligent TV 100. These user interfaces can include the silos and panels that are described in FIG. 5. An embodiment of the user interface software 500 is shown in FIG. 5. Here the application framework 412 contains one or more code components which help control the user interface events, while one or more applications in the applications layer 416 affect the user interface used for the Intelligent TV 100. The application framework 412 can include a silo transition controller 504 and/or an input event dispatcher 508. There may be more or fewer code components in the application framework 412 than those shown in FIG. 5. The silo transition controller 504 contains the code and language that manages the transitions between one or more silos. A silo can be a vertical user interface feature on the Intelligent TV that contains information for the user. The transition controller 504 can manage the changes between two silos when an event occurs in the user interface. The input event dispatcher 508 can receive user interface events from the operating system. These events can include selections of buttons on a remote control or on the TV or other types of user interface inputs. The input event dispatcher 508 may then send these events to a silo manager 532 or panel manager 536 depending on the type of the event, as sketched below. The silo transition controller 504 can interface with the silo manager 532 to affect changes in the silos.
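

By way of illustration only, the event routing just described can be sketched in Java as follows. All class, interface, and method names here are hypothetical and are not part of the disclosed implementation; the sketch merely shows one way an input event dispatcher 508 might route operating system events to a silo manager 532 or a panel manager 536 by event type.

public class InputEventDispatcher {
    public enum EventType { SILO, PANEL }
    public interface SiloManager { void onSiloEvent(String event); }
    public interface PanelManager { void onPanelEvent(String event); }

    private final SiloManager siloManager;
    private final PanelManager panelManager;

    public InputEventDispatcher(SiloManager siloManager, PanelManager panelManager) {
        this.siloManager = siloManager;
        this.panelManager = panelManager;
    }

    // Route an operating system event (e.g., a remote control button press)
    // to the manager responsible for that type of event.
    public void dispatch(EventType type, String event) {
        if (type == EventType.SILO) {
            siloManager.onSiloEvent(event);
        } else {
            panelManager.onPanelEvent(event);
        }
    }
}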


The application layer 416 can include a user interface application 468 and/or a silo application 512. The application layer 416 can include more or fewer user interface applications than those shown in FIG. 5, as necessary to control the user interface of the Intelligent TV 100. The user interface application 468 can include a silo manager 532, a panel manager 536, and one or more types of panels 516-528. The silo manager 532 manages the display and/or features of silos. The silo manager 532 can receive or send information from the silo transition controller 504 or the input event dispatcher 508 to change the silos displayed and/or to determine types of input received in the silos.


A panel manager 536 is operable to display panels in the user interface, to manage transitions between those panels, and to act on user interface inputs received in a panel. The panel manager 536 may thus be in communication with different user interface panels such as a global panel 516, a volume panel 520, a settings panel 524, and/or a notification panel 528. The panel manager 536 can display these types of panels depending on the inputs received from the input event dispatcher 508. The global panel 516 may include information that is associated with the home screen or top level hierarchal information for the user. A volume panel 520 may display information about an audio volume control or other settings for volume. A settings panel 524 can include information displayed about the settings of the audio or video, or other settable characteristics of the Intelligent TV 100. A notification panel 528 can provide information about notifications to a user. These notifications can be associated with information such as video on demand displays, favorites, currently provided programs, or other information. Notifications can be associated with the media, with some type of setting, or with an operation of the Intelligent TV 100. The panel manager 536 may be in communication with a panel controller 552 of the silo application 512.


The panel controller 552 may operate to control portions of the panels of the types described previously. Thus, the panel controller 552 may be in communication with a top panel application 540, an application panel 544, and/or a bottom panel application 548. These types of panels may be differently displayed in the user interface of the Intelligent TV 100. The panel controller 552 may thus, based on the configuration of the system or the type of display currently being used, put the types of panels 516-528 into a certain display orientation governed by the top panel application 540, application panel 544, or bottom panel application 548.


An embodiment of the data service 432 and the operation of the data management is shown in FIG. 6. The data management 600 can include one or more code components that are associated with different types of data. For example, there may be code components within the data service 432 that execute and are associated with video on demand, the electronic program guide, or media data. There may be more or fewer types of data service 432 components than those shown in FIG. 6. Each of the different types of data may include a data model 604-612. The data models govern what information is to be stored and how that information will be stored by the data service. Thus, the data model can govern, regardless of where the data comes from, how the data will be received and managed within the Intelligent TV system. Thus, the data model 604, 608, and/or 612 can provide a translation ability, or affect the ability to translate data from one form to another, to be used by the Intelligent TV 100.
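

By way of illustration only, the translation role of a data model 604-612 can be sketched in Java as follows. The record, interface, and field names are hypothetical; the sketch shows how provider-specific raw fields might be normalized into one canonical shape regardless of where the data comes from.

import java.util.Map;

// Canonical electronic program guide record; field names are illustrative.
record EpgEntry(String showName, String channel, String startTime) {}

// A data model translates provider-specific raw fields into the canonical form.
interface DataModel<T> {
    T fromProviderFields(Map<String, String> raw);
}

class EpgDataModel implements DataModel<EpgEntry> {
    @Override
    public EpgEntry fromProviderFields(Map<String, String> raw) {
        // Regardless of which provider supplied the fields, the stored shape is
        // the same, so downstream code never sees provider differences.
        return new EpgEntry(
                raw.getOrDefault("title", raw.getOrDefault("show_name", "")),
                raw.getOrDefault("channel", ""),
                raw.getOrDefault("start", ""));
    }
}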


The different types of data services (video on demand, electronic programming guide, media) each have a data subservice 620, 624, and/or 628 that is in communication with one or more internal and/or external content providers 616. The data subservices 620, 624, and 628 communicate with the content providers 616 to obtain data that may then be stored in databases 632, 636, and 640. The subservices 620, 624, and 628 may communicate with and initiate or enable one or more source plug-ins 644, 648, and 652 to communicate with the content provider. For each content provider 616, there may be a different source plug-in 644, 648, and 652. Thus, if there is more than one source of content for the data, each of the data subservices 620, 624, and 628 may determine and then enable or initiate a different source plug-in 644, 648, and/or 652, as sketched below. The content providers 616 may also provide information to a resource arbitrator 656 and/or thumbnail cache manager 660. The resource arbitrator 656 may operate to communicate with resources 664 that are external to the data service 432. Thus, the resource arbitrator 656 may communicate with cloud based storage, network based storage, or other types of external storage in the resources 664. This information may then be provided through the content provider module 616 to the data subservices 620, 624, 628. Likewise, a thumbnail cache manager 660 may obtain thumbnail information from one of the data subservices 620, 624, 628 and store that information in the thumbnail database 668. Further, the thumbnail cache manager 660 may extract or retrieve that information from the thumbnail database 668 to provide to one of the data subservices 620, 624, 628.
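

By way of illustration only, the per-provider plug-in selection described above can be sketched in Java as follows; the names are hypothetical. A subservice keeps one plug-in per content provider and enables the matching plug-in when data is requested.

import java.util.HashMap;
import java.util.Map;

interface SourcePlugin {
    String fetch(String query);
}

class DataSubservice {
    private final Map<String, SourcePlugin> pluginsByProvider = new HashMap<>();

    void registerPlugin(String providerId, SourcePlugin plugin) {
        pluginsByProvider.put(providerId, plugin);
    }

    // Determine and enable the plug-in that matches the content provider,
    // mirroring the behavior described for subservices 620, 624, and 628.
    String fetchFrom(String providerId, String query) {
        SourcePlugin plugin = pluginsByProvider.get(providerId);
        if (plugin == null) {
            throw new IllegalArgumentException("no source plug-in for " + providerId);
        }
        return plugin.fetch(query);
    }
}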


An exemplary content aggregation architecture 1300 is shown in FIG. 13. The architecture can include a user interface layer 1304 and a content aggregation layer 1308. The user interface layer 1304 may include a TV application 1312, a media player 1316, and application(s) 1320. The TV application 1312 enables the viewer to view channels received via an appropriate transmission medium, such as cable, satellite, and/or the Internet. The media player 1316 presents other types of media received via an appropriate transmission medium, such as the Internet. The application(s) 1320 include other TV-related (pre-installed) applications, such as content viewing, content searching, device viewing, and setup algorithms, and coordinate with the media player 1316 to provide information to the viewer.


The content aggregation layer 1308 includes, as data services, a content source service 1328, a content aggregation service 1332 and a content presentation service 1336. The content source service 1328 can manage content source investigators, including local and/or network file system(s), digital network device manager (which discovers handheld and non-handheld devices (e.g., digital media servers, players, renderers, controllers, printers, uploaders, downloaders, network connectivity functions, and interoperability units) by known techniques, such as multicast universal plug and play or UPnP discovery techniques, and, for each discovered device, retrieves, parses, and encodes device descriptors, notifies the content source service of the newly discovered device, and provides information, such as an index, on previously discovered devices), Internet Protocol Television or IPTV, digital television or DTV (including high definition and enhanced TV), third party services (such as those referenced above), and applications (such as Android applications).


Content source investigators can track content sources and are typically configured as binaries. The content source service 1328 starts content source investigators and maintains open and persistent channels for communications. The communications include query or command and response pairs. The content aggregation service 1332 can manage content metadata fetchers, such as for video, audio, and/or picture metadata. The content presentation service 1336 may provide interfaces to the content index 1340, such as an Android application interface and digital device interfaces.


The content source service 1328 can send and receive communications 1344 to and from the content aggregation service 1332. The communications can include notifications regarding new and removed digital devices and/or content and search queries and results. The content aggregation service 1332 can send and receive communications 1348 to and from the content presentation service 1336 including device and/or content lookup notifications, content-of-interest advisories and notifications, and search queries and results.


When a search is performed, particularly when the user is searching or browsing content, a user request may be received from the user interface layer 1304 by the content presentation service 1336, which responsively opens a socket and sends the request to the content aggregation service 1332. The content aggregation service 1332 first returns results from the local database 1340. The local database 1340 includes an index or data model and indexed metadata. The content source service 1328 further issues search and browse requests for all content source investigators and other data management systems. The results are forwarded to the content aggregation service 1332, which updates the database 1340 to reflect the further search results and provides the original content aggregation database search results and the data updates, reflecting the additional content source service search results, over the previously opened socket to the content presentation service 1336. The content presentation service 1336 then provides the results to one or more components in the user interface layer 1304 for presentation to the viewer. When the search session is over (e.g., the search session is terminated by the user or by an action associated with the user), the user interface layer 1304 disconnects the socket. As shown, media can be provided directly by the content aggregation service 1332 to the media player 1316 for presentation to the user.
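

By way of illustration only, the two-phase search flow described above can be sketched in Java as follows; all names are hypothetical. Local index results are returned immediately, and the merged results are pushed a second time once the content source investigators respond, mirroring the two deliveries over the open socket.

import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

class TwoPhaseSearch {
    private final List<String> localIndex = new ArrayList<>();

    void search(String query, Consumer<List<String>> onResults) {
        // Phase 1: answer immediately from the local database/index.
        List<String> local = new ArrayList<>();
        for (String title : localIndex) {
            if (title.contains(query)) {
                local.add(title);
            }
        }
        onResults.accept(local);

        // Phase 2: query the content source investigators, fold their results
        // into the index, and push the updated result set to the caller.
        List<String> fromInvestigators = queryInvestigators(query);
        localIndex.addAll(fromInvestigators);
        List<String> merged = new ArrayList<>(local);
        merged.addAll(fromInvestigators);
        onResults.accept(merged);
    }

    private List<String> queryInvestigators(String query) {
        return List.of(); // placeholder for the real investigator round trip
    }
}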


Remote Control:


A handheld remote control can be provided to enable user interaction with the Intelligent TV 100. An exemplary handheld remote control is shown in FIGS. 7-9. The remote control 700 can include one or more of, but is not limited to, top, side and bottom housings 704, 708, and 712, an (on/off) power button 716, an input source button 720 (to select an input source such as Live TV, video on demand, media center, application center, high definition multimedia interface or HDMI, component or COMP, audio/video or A/V, digital or analog television or DTV/ATV, and video graphics array (VGA)), a (volume) mute button 724, a Live TV button 728 (to activate or select the Live TV silo), a video on demand (VOD) button 732 (to activate or select the video on demand silo), a media center button 736 (to activate or select the media center application or silo, which accesses various types of media such as music, TV programming, videos, and the like), an application center button 740 (to activate or select the application center application or silo), a global panel button 744, an application panel button 748, a back button 752 (to select a prior user operation or Intelligent TV state and/or navigate up a hierarchy of any displayed image or object(s), in which case the back button 752 does not navigate within application panels or across application silos), a play button 756 (to play or pause media), a D-pad 760 (which includes north, east, west, and south directional arrows to navigate among displayed images and/or move between levels of an application's or object's hierarchy such as application view navigation, panel navigation, and collection navigation), an OK (or select) button 764 (to select a highlighted displayed image (such as displayed speed control, rewind, forward, play, and pause objects and/or objects on a menu bar or in a menu box) and/or navigate down a hierarchy of any displayed image or object(s)), a rocker-type volume-up and volume-down button 768 (to adjust the volume), a menu/guide button 772 (to select for display a menu or guide of programming), a 0-9 (number) button 776 (to display a number pad on the TV screen), a settings button 780 (which launches an application to access and change current TV settings, such as channel settings and settings used to adjust picture and sound effects (e.g., image mode (e.g., standard, playground, game, cinema, concert, and studio), brightness, contrast, saturation, color temperature, energy savings, 3D noise reduction, hue, sharpness, zoom mode (e.g., full screen, standard, smart zoom, and dot-to-dot), picture position, and 3D mode for picture, and sound retrieval system or SRS TruSurround, sound mode (e.g., standard, live 1, live 2, theatre, music, speech, and user equalizer mode), Left/Right speaker balance, auto volume control, and Sony/Philips Interconnect Format or S/PDIF (off, auto, or pulse code modulation or PCM) for sound), and system settings, such as system settings proper (e.g., selected language for the graphical user interface, user geographical and/or geopolitical location information, input method, area settings, and sleep time), network settings (e.g., WiFi, WiFi hotspot, WiFi direct, Point-to-Point Protocol over Ethernet or PPPoE (asymmetric digital subscriber line or ADSL), and Ethernet, each enabled or disabled and selected or non-selected), network information (e.g., electronic address information such as Internet Protocol or IP address, subnet mask, gateway, domain name server information, domain name, Media Access Control or MAC address, service set identification or SSID, security information, and password information, and online status), settings to manage applications (e.g., currently installed applications, currently executing applications, and internal and external computer readable medium usage), and settings to view user information regarding the Intelligent TV 100), a rocker-type channel-up and channel-down button 784 (to increment or decrement the selected channel), first, second, third and fourth hotkeys 788, 792, 794, and 796, and/or a moveable joystick 900 on a bottom of the remote control 700. The first, second, third, and fourth hotkeys are generally assigned different colors, which color indexing is depicted as visual indicia on a selected panel to show the currently assigned function, if any, for each hotkey. As can be seen, the actuator layout can provide a highly efficient, satisfactory, and easily usable experience to the end user.


While the functional associations and functions of many of the actuators are readily apparent, those of some of the actuators are not. A number of examples will now be discussed by way of illustration.


The media center button 736, when selected, can provide information regarding music, videos, photographs, collections or groupings of music, videos, and/or photographs, and internal and external computational devices (such as personal computers, laptops, tablet computers, wireless phones, removable computer readable media, and the like), which can be grouped in a selected manner (such as favorites, most recently viewed, most watched or viewed, and most recently added). The information can include previews (which can include selected portions of the media content, duration, file size, date created, date last watched, times watched or viewed, and audio and/or video format information).


The application center button 740, when selected, may provide information regarding pre-installed and downloaded applications. Unlike downloaded applications, pre-installed applications cannot be removed by the user or manually updated. Exemplary pre-installed applications include web browser, settings control, and content search algorithms. By way of illustration, the application center button 740 can provide a scrollable graphical grid of icons (each icon being associated with an application) currently available in the application center.


The global panel button 744, when selected, can provide the user, via one or more panels or windows, with access to one or more of, but not limited to, silos, notifications, a web browser, system settings, and/or information associated therewith. For example, the global panel button 744 can enable the user to determine what external devices are currently connected to and/or disconnected from the Intelligent TV 100, determine what inputs (e.g., HDMI ports) are currently available for connecting to external devices, determine a connection and/or operational status of a selected external device and/or network (e.g., WiFi connected, Ethernet connected, and offline), assign a custom (or user selected) name to each input source, determine what content is currently being offered on Live TV, on demand, the media center, and/or the application center, access vendor messages and notifications to the user (e.g., system and/or application updates are available), activate the Internet browser, and/or access shortcuts on a displayed shortcut bar to more frequently used and desired applications. Common shortcuts are the Internet browser (e.g., an Internet search engine), system settings, and notifications. The common types of panels are for information (typically information related to a currently displayed image and/or content, e.g., title, date/time, audio/visual indicator, rating, and genre), browse requests, and/or search requests (such as a search term field). Each of the panel types may include a panel navigation bar, detailed information or relevant content related to the panel function, operation and/or purpose, and a hotkey bar (defining currently enabled functional associations of hotkeys).


The application panel button 748, when selected, can display an application window or panel. One application panel may be an information panel regarding a selected (pre-installed or previously downloaded) application icon. The information panel can do one or more of the following: identify the selected application; provide a description of the functionality (including application developer and/or vendor, version, release, and/or last update date and a category or type of application based on the application's functionality) and user ratings and/or degree of other user downloading of the application (e.g., a star rating assigned based on one or more of the foregoing inputs); provide the option to launch, remove, update, or add to favorites the identified application; and provide a listing of selectable links to other (not yet downloaded) recommended applications that provide similar functionality to the identified application. The latter listing can, in turn, provide a description of the functionality (including application developer and/or vendor, version, release, and/or last update date and a category or type of application based on the application's functionality) and user ratings and/or degree of other user downloading of the application (e.g., a star rating assigned based on one or more of the foregoing inputs).


The functions of the first, second, third, and fourth hotkeys 788, 792, 794, and 796 can change depending on system state, context, and/or, within a selected screen and/or panel, based on the content or currently selected portion of (or relative cursor position on) the screen. Commonly, a currently assigned function of any of the first, second, third, and fourth hotkeys 788, 792, 794, and 796 depends on a currently accessed silo and/or panel (with which the user is currently interacting within the silo). In other words, a first function of one of the first, second, third, and fourth hotkeys 788, 792, 794, and 796 is activated by the respective hotkey in a first system state while a different second function is activated by the respective hotkey in a different second system state. In another example, a third function of one of the first, second, third, and fourth hotkeys 788, 792, 794, and 796 is activated by the respective hotkey when a user focus (or currently selected cursor position or screen portion) is at a first screen position while a different fourth function is activated by the respective hotkey when a user focus (or currently selected cursor position or screen portion) is at a different second screen position. The first screen position can, for instance, be within an icon while the second screen position is outside of the icon. In the first screen position, the enabled hotkey functionality may be "configure" and "remove" while "add" is disabled; in the second screen position, the enabled hotkey functionality may be "add" while "configure" and "remove" are disabled. Generally, the states of hotkeys can include normal (for enabled actions or functions), disabled (when an action or function is temporarily disabled), pressed (when selected by a user to command an action or function to be performed), and unavailable (when no association between the hotkey and an action or function is currently available). While examples of hotkey functions are discussed below, it is to be understood that these are not intended to be exhaustive or limiting examples.
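

By way of illustration only, the state- and focus-dependent hotkey behavior described above can be sketched in Java as a lookup keyed on system state and user focus; all names are hypothetical.

import java.util.HashMap;
import java.util.Map;

class HotkeyMap {
    record Context(String systemState, String focus) {}

    private final Map<Context, String> functions = new HashMap<>();

    void assign(String systemState, String focus, String function) {
        functions.put(new Context(systemState, focus), function);
    }

    // Returns the currently assigned function, or "unavailable" when no
    // association exists for this state and focus.
    String functionFor(String systemState, String focus) {
        return functions.getOrDefault(new Context(systemState, focus), "unavailable");
    }
}

For example, assign("live-tv", "inside-icon", "configure") and assign("live-tv", "outside-icon", "add") would yield different functions for the same hotkey as the user focus moves into and out of an icon.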


The first hotkey 788, when selected in a first system state, can enable the user to assign, change, or edit a name of an input source. It is typically enabled only when an input source of HDMI, Comp/YPbPr (e.g., component video cables), video output, or VGA is in focus. When selected in a second system state, the first hotkey 788 can return the user to the top of a scrollable collection of objects, such as application icons.


The second hotkey 792 may toggle between showing all inputs and showing fewer. In other words, the hotkey 792 can allow the user to show all inputs, including the unconnected/undetected ones, and to hide the unconnected/undetected inputs, e.g., to expand and collapse the silo/input list. Each input source can have one of two states, namely connected/detected and unconnected/undetected. Some input sources, including Live TV, video on demand, media center, and application center, are always connected/detected.


The moveable joystick 900 on the bottom of the remote control 700, when manipulated, can cause a displayed image on the Intelligent TV 100 screen to be displaced a proportional amount. In other words, the displayed image is displaced substantially simultaneously with displacement of the joystick 900 within the joystick aperture 904 in the bottom housing 712 of the remote control. As shown in FIGS. 9B-C, the joystick 900 moves or slides between forward and reverse positions. Releasing the joystick 900 causes the joystick 900 to return to the center position of FIG. 9A, and the window to move or slide upwardly (when the joystick is released from the joystick position of FIG. 9B) or downwardly (when the joystick is released from the joystick position of FIG. 9C) until it disappears from view as shown in FIG. 11A. The effect on the screen of the Intelligent TV 100 is shown in FIGS. 11A-C. In FIG. 11A, video content, such as TV programming, a video, a movie, and the like, is being displayed by the front surface of the screen 212. In FIG. 11B, the joystick 900 is moved or slid to the upper position of FIG. 9B, and a drop down window or panel 1100 moves or slides down (at substantially the same rate as joystick 900 movement) at the top of the screen 212. In FIG. 11C, the joystick 900 is moved or slid to the lower position of FIG. 9C, and a drop up window or panel 1100 moves or slides up (at substantially the same rate as joystick 900 movement) at the bottom of the screen 212. The window 1100 partially covers the video content appearing on the remainder of the screen 212 and/or causes a portion of the screen 212 displaying video content to move and/or compress up or down by the height of the window 1100.
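

By way of illustration only, the proportional relationship between joystick displacement and window travel can be sketched in Java as follows; the class and parameter names are hypothetical.

class JoystickPanelMapper {
    private final int panelHeightPx;

    JoystickPanelMapper(int panelHeightPx) {
        this.panelHeightPx = panelHeightPx;
    }

    // displacement ranges from -1.0 (full reverse) to 1.0 (full forward);
    // the visible portion of the window 1100 grows in proportion, tracking
    // the joystick substantially in real time.
    int visiblePanelPixels(double displacement) {
        double clamped = Math.max(-1.0, Math.min(1.0, displacement));
        return (int) Math.round(Math.abs(clamped) * panelHeightPx);
    }
}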


The window 1100 can include one or more of information (typically information related to a currently displayed image and/or content, e.g., a panel navigation bar, detailed information (e.g., title, date/time, audio/visual indicator, rating, and genre), and a hotkey bar (defining current functional associations of hotkeys)), browse requests, and/or search requests. Commonly, the window 1100 includes suitable information about the content (such as name, duration, and/or remaining viewing duration of content), settings information, TV or system control information, application (activation) icons (such as for pre-installed and/or downloaded applications such as application center, media center and Web browser), and/or information about input source(s). When the joystick 900 is in either the forward or reverse position, the user can select an actuator on the front of the remote control, such as the OK button 764, and be taken, by displayed images on the screen 212, to another location in the user interface, such as a desktop. This process can be done in a nonintrusive manner and without affecting the flow of content that is pushed up or down. The joystick 900 could be moved, additionally or differently, from side-to-side to cause the window to appear at the left or right edge of the screen 212.


An alternative actuator configuration is shown in FIG. 10. The actuators are substantially the same as those of FIGS. 7-9 except that the social network button 1000, when selected, can automatically select content and publish, via a social network service or other social media, the content to a social network or online community. User or viewer comments and/or other messages can be included in the outbound message. For example, all of, or one or more frames or portions of, media content (such as a video, music, a photograph, a picture, or text) can be provided automatically to a predetermined or selected group of people via Linked-In™, Myspace™, Twitter™, YouTube™, DailyMotion™, Facebook™, Google+™, or Second Life™. The user, upon activating the button 1000, could, in response, select a social forum or media upon which the selected content (which is the content displayed to the user when the social network button 1000 is activated) is to be posted and/or a predetermined group within that social media to which the content is to be posted. Alternatively, these selections could be preconfigured or preselected by the user.


The social network button can also be used to "turn up" or "turn down" a social volume visualization. The Intelligent TV 100 can create dynamically a visualization of aggregated connections (and inbound and/or outbound messages) from a variety of social networks. The aggregation (and inbound and outbound messages) can be depicted graphically on the screen as a volume of connections for the viewing user. With a social volume visualization, selected contents of each linked social network profile of a social contact (and inbound and/or outbound messages from or to the linked social network contact and/or current activity of the social contact, such as watching the same programming or content the viewer is currently watching) can be presented in a separate tile (or visually displayed object). The size of the tile can be related to any number of criteria, including: a relationship of the linked social contact to the viewer (e.g., a relative degree of importance or type of relationship can determine the relative size of the tile); a degree of influence of the linked social contact on the current viewer; a geographic proximity of the linked social contact to the current viewer; a degree to which the currently provided media content is of interest to both the viewer and the linked social contact (e.g., both parties enjoy war movies, murder mysteries, musicals, comedies, and the like); an assigned ranking of the linked contact by the viewer; a type of social network linking the viewer with the linked social contact; a current activity of the social network contact (e.g., currently watching the same content that the viewer is currently watching); a current online or offline status of the linked social contact; and a social network grouping type or category to which both the viewer and linked social contact belong (e.g., work contact, best friend, family member, etc.).


The viewer can designate a portion of the screen to depict the social network aggregation. By turning the social volume up (+) or down (−), the viewer can increase the size and/or numbers of linked contact tiles provided to the viewer. In other words, by increasing the social volume the viewer can view, access, and/or push more social content from those of his or her social networks associated with him or her in a memory of the Intelligent TV. By decreasing the social volume, the viewer can view, access, and/or push less social content from his or her associated social networks. By selecting the mute button 724, the viewer can stop or pause any interactivity with his or her associated social networks (e.g., inbound or outbound messages). Social volume and/or mute can be separated into two (or more) volume settings for outbound and inbound social network activity. By way of illustration, a first volume setting, control, and/or button can control the volume for outbound social network activity (e.g., outbound social messages) while a second (different) volume setting, control, and/or button can control the volume for inbound social network activity (e.g., inbound social messages). By way of further illustration, a first mute setting, control, and/or button can stop or pause outbound social network activity (e.g., outbound social messages) while a second (different) mute setting, control, and/or button can stop or pause inbound social network activity (e.g., inbound social messages).


A functional block diagram of the remote control is shown in FIG. 12. The remote control 700 includes a controller 1208 to control and supervise remote control operations, an optional wireless (RF) transceiver 1224 and antenna 1244 to send and receive wireless signals to and from the Intelligent TV 100 and other external components, an optional infrared emitter 1228 to emit infrared signals to the Intelligent TV 100, an optional light emitting diode or LED driver 1232 to control LED operation to provide video-enabled feedback to the user, actuators 1220 (including the various buttons and other actuators discussed above in connection with FIGS. 7 and 10), and joystick 900, all interconnected via a bus 1248. An on board power source 1200 and power management module 1204 provide power to each of these components via power circuitry 1240. The infrared emitter 1228 and a receiver (not shown) on the Intelligent TV system 100 can be used to determine a displayed object illuminated by the infrared signal and therefore adjust the displayed image, for example to indicate a focus of the user (e.g., illuminate a displayed object or show cursor position relative to displayed objects on the screen) and to determine and activate a desired command of the user. This can be done by tracking a position of the remote control in relation to infrared tracking reference points (e.g., a sensor bar or infrared LEDs) positioned on or adjacent to the screen of the Intelligent TV 100. Motion tracking can further be augmented using position information received from a multi-axis gyroscope and/or accelerometer on board the remote control (not shown).


Using metadata associated with the programming, the application framework 412 can distinguish between, and classify Live TV programming into, the following program types: movies, TV show episodes (which can be any type of recurring show such as drama series, sitcoms, animated series, reality shows, miniseries, talk shows, game shows, and newscasts), sport events (such as baseball, hockey, boxing, football, and other sporting event programming), specials (such as holiday specials, election coverage, and special news reports), radio (which delivers audio-only broadcasts), and undefined (which refers to programming having no associated metadata). Based on metadata associated with the programming, the application framework 412 can further distinguish between and classify each program type into a program category. Common program categories include political, law, military, economic, culture, sports, the Arts, history, geography, science, medical, environmental, social, recreation, and other (for any non-specified electronic programming guide ("EPG") entries).
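

By way of illustration only, the metadata-driven type classification described above can be sketched in Java as follows; the metadata key and values are hypothetical, since the actual fields depend on the EPG provider. Category classification could follow the same pattern.

import java.util.Map;

class ProgramClassifier {
    enum ProgramType { MOVIE, TV_SHOW_EPISODE, SPORT_EVENT, SPECIAL, RADIO, UNDEFINED }

    ProgramType classify(Map<String, String> metadata) {
        // Programming having no associated metadata is classified as undefined.
        if (metadata == null || metadata.isEmpty()) {
            return ProgramType.UNDEFINED;
        }
        return switch (metadata.getOrDefault("type", "")) {
            case "movie"   -> ProgramType.MOVIE;
            case "episode" -> ProgramType.TV_SHOW_EPISODE;
            case "sport"   -> ProgramType.SPORT_EVENT;
            case "special" -> ProgramType.SPECIAL;
            case "radio"   -> ProgramType.RADIO;
            default        -> ProgramType.UNDEFINED;
        };
    }
}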


The metadata can be obtained from the content aggregation service 1332, a content metadata fetcher (such as for video, audio, and/or picture metadata), and the like. The metadata, for example, can be obtained from an Internet accessible EPG provider. Examples of metadata formats include DVB-SI, TV-Anytime, and MPEG-7, and communication protocols include the Programming Metadata Communication Protocol ("PMCP") and Program and System Information Protocol ("PSIP"). Typically, the data which populates the EPG comes from the PSIP. PMCP provides the source data for PSIP. A PSIP generator converts PMCP to a broadcast-ready format. The downloaded EPG metadata is typically stored locally in a computer readable medium.


The application framework 412 further enables the user to mark a selected channel or program as a "favorite". This marking means that the user is more interested in these programming items than other items; in other words, the user is more likely to watch favorite programs and channels than other unmarked programs and channels. Once a channel or program is marked as a "favorite", it is highlighted in all views and made easier to access. While favorite channels and programs can be located by an EPG provider in the cloud, the favorite channels and programs are typically maintained only in the local computer readable medium and not remotely.


The application framework 412 can further enable the user to set reminders for specific programs. Reminders cause notifications by the notification panel 528 that a specific program is about to start, regardless of where or how the user is currently interacting with the Intelligent TV 100 (e.g., regardless of what channel or silo the user is currently interacting with). When a reminder is set for a recurring program, it is created for the complete series rather than for a single instance. For example, if a program airs every Friday night at 8 pm on channel 3, setting a reminder for the program creates a reminder that fires every Friday at 7:58 pm. If a program is a single airing event (such as a movie or special), a reminder is provided only for that instance.
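

By way of illustration only, the recurring reminder arithmetic in the Friday-at-8-pm example can be sketched in Java as follows; the two-minute lead time matches the 7:58 pm trigger in the example, and all names are hypothetical.

import java.time.DayOfWeek;
import java.time.Duration;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.temporal.TemporalAdjusters;

class WeeklyReminder {
    private final DayOfWeek airDay;
    private final LocalTime airTime;
    private final Duration lead = Duration.ofMinutes(2); // fire at 7:58 for an 8:00 show

    WeeklyReminder(DayOfWeek airDay, LocalTime airTime) {
        this.airDay = airDay;
        this.airTime = airTime;
    }

    // Next time the notification panel 528 should fire, at or after 'now'.
    LocalDateTime nextTrigger(LocalDateTime now) {
        LocalDateTime airing = now.with(TemporalAdjusters.nextOrSame(airDay)).with(airTime);
        if (!airing.minus(lead).isAfter(now)) {
            airing = airing.with(TemporalAdjusters.next(airDay)); // this week's trigger already passed
        }
        return airing.minus(lead);
    }
}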


The EPG, which can be selected by the menu/guide button 772, provides users with a continuously updating table of scheduling information on current and upcoming programming. The EPG is typically obtained, via a network 128 (e.g., the Internet), from a third party source. The EPG can be used to discover content to watch and to set reminders for future programs. EPG program listings typically provide at least the following information: show name (which is the full name of the program (e.g., TV episode, movie, or special)), length of show (which is shown by the relative size of the listing or associated icon in the table), progress of show (programs in process are visually differentiated from shows not in progress), optionally a favorite indicator (programs that have been marked as a favorite are visually differentiated, such as by a favorite icon), a reminder indicator (programs that have been set to trigger a reminder are visually differentiated, such as by a reminder icon), a first air indicator (programs that are airing for the first time are visually differentiated, such as by a first-to-air icon), a program type indicator (the program listing contains a visual indicator, such as a program-type indicator, to show the program type), and an HD indicator (the program listing contains a visual indicator, such as an HD indicator, to show that the program is in high definition).
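

By way of illustration only, the listing fields enumerated above map naturally onto a simple Java record; the field names are hypothetical, and the patent does not prescribe a data layout.

// One EPG program listing with the indicator fields described above.
record EpgListing(
        String showName,
        int lengthMinutes,
        double progress,      // 0.0 to 1.0 for programs in process
        boolean favorite,
        boolean reminderSet,
        boolean firstAiring,
        String programType,
        boolean highDefinition) {}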


One EPG format will be discussed with reference to FIG. 14. The intelligent TV 100 displays the EPG modal 1400. The display includes a live preview area 1404 (which displays programming on the currently active channel, and not the channel that is in focus within the EPG), a program preview area 1408, the EPG 1412, and the hot key bar 1416 (which is color-coded to the hot keys to show currently enabled hot key functionality; e.g., the red hot key corresponds to "Now", which, if selected, jumps to the currently playing program in the EPG, the yellow hot key to "Categories", which, if selected, switches to the category view, the green hot key to "Manage", which, if selected, switches to the manage view, and the blue hot key to "Mark", which, if selected, provides a modal with three choices, namely set a reminder for [Show Name], add [Show Name] to favorites, and add [channel number] to favorite channels, where the item in brackets refers to the currently selected channel and program).


The program preview area 1408 corresponds to the program or channel listing currently selected or in-focus. For the channel listing, the program preview area 1408 includes program information about the program currently being aired on the channel. The program preview area 1408 can include the following fields, namely show name 1420, season 1422 (if applicable) (which can be the number of the season), episode number 1424, episode name 1428, start and end time 1432, description 1436 (which describes the current program), thumbnail view 1440 (which is a thumbnail preview of the current program, e.g., a still (not moving) image, such as a captured frame of the second programming (e.g., a screen grab provided by the Internet-provided EPG service), a picture (such as a movie poster), or a photograph), channel 1442 providing the program, and other information 1444 (which can include current time (e.g., the current time of day), duration (or start-to-finish time), runtime (or temporal progress of the program), runtime remaining, rating (e.g., TV-Y, TV-Y7, TV-G, TV-PG, TV-14, and TV-MA), video format (or resolution of the current program), audio format (or audio format of the current program), digital air date or first airing indicator, encryption status (when the channel is detected to be encrypted, regardless of decryption status), program category, program type, track information (for radio programming, the name of the current artist and track), and other indicators noted above).


The EPG includes plural program listings, with each listing having a channel indicator 1448 and show name 1452. The program listings may further include a favorite (program) indicator 1456 (which is shown in the shape of a star but can be any other indicator), a first air indicator 1460 (which is shown as “NEW” but can be any other indicator), and a reminder indicator 1464 (shown as a blue dot but can be any other indicator). The channel indicator 1448 typically includes the channel number, channel logo, an active channel indicator (to show that the channel is currently playing), a selected channel indicator (to show that the channel is currently in-focus), and a favorite (channel) indicator (not shown) (showing that the channel is marked as a favorite channel by the user). Time indicators are shown at the top of the EPG and placed relative to the EPG program listings to show, as the case may be, a total or remaining duration of the program.


The EPG of FIG. 14 is by channel. Other EPG formats can be used. An example EPG 1500 format is shown in FIG. 15, which is by time. The listed programs each have an associated start time 1504 (which may be different or the same for plural listings), a show name 1452, and a channel indicator 1448. The program listings can be grouped by program type, program category, or channel. A band 1508 shows the current program runtime for programs in process. The format supports the other visual indicators of the EPG of FIG. 14. While the EPG of FIG. 14 can be scrolled not only vertically (e.g., up and down in channels) but also horizontally (e.g., backwards and forwards in time) using the D-Pad 760 of the remote control, the EPG of FIG. 15 can be scrolled only vertically (e.g., up and down in channels for the selected time). The topmost program listing in each of the EPG formats, when the format is presented initially, is typically the current program in view. The OK button 764 or play button 756 on the remote control selects an in-focus program listing for view. When a channel or program is selected, the application layer 416 automatically exits the EPG display and commences viewing the selected programming.


With reference to FIG. 16, the interaction of EPG with selected program categories is illustrated. The vertically scrollable category area 1600 presents a scrollable listing of selectable program category bars 1604. Moving focus to a category bar causes the EPG 1412 program listings to be refreshed with a set of program listings filtered by the in-focus category bar. In the hot key bar 1416, the red hot key corresponds to “Now”, the yellow and green hot keys correspond to no assigned functionality and the blue hot key to “Mark”. Unlike the other EPG formats, the category-enabled EPG is fixed to a two-hour window and therefore cannot be scrolled horizontally. Stated another way, only channels having a program on (in the next two hours) that matches the selected or in-focus category are shown. Those category-matching programs are visually highlighted relative to category non-matching programs to show the match. The content of the program preview area 1408 reflects the currently in-focus program listing. The OK button 764 or play button 756 on the remote control selects an in-focus program listing for view. When a channel or program is selected, the application layer 416 automatically exits the EPG display and commences viewing the selected programming.
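

By way of illustration only, the fixed two-hour category filter described above can be sketched in Java as follows; the Listing record and its fields are hypothetical.

import java.time.LocalDateTime;
import java.util.ArrayList;
import java.util.List;

class CategoryFilter {
    record Listing(String channel, String category,
                   LocalDateTime start, LocalDateTime end) {}

    // Keep only listings that match the in-focus category and are on the air
    // at some point within the fixed two-hour window.
    static List<Listing> filter(List<Listing> all, String category, LocalDateTime now) {
        LocalDateTime windowEnd = now.plusHours(2);
        List<Listing> matches = new ArrayList<>();
        for (Listing l : all) {
            boolean inWindow = l.start().isBefore(windowEnd) && l.end().isAfter(now);
            if (inWindow && l.category().equals(category)) {
                matches.add(l);
            }
        }
        return matches;
    }
}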


With reference to FIG. 17, the interaction of the EPG with favorite indicators is illustrated. The manage favorite area 1700 includes selectable bars 1708 for favorite channels, shows, reminders, and channel list and a vertically scrollable channel listing 1704. Moving the focus from one selectable bar 1708 to another causes the vertically scrollable channel listing 1704 to be refreshed to mirror the currently in-focus bar 1708. Each listing in the channel listing 1704 includes, when the favorite channels bar 1708 is in-focus, a favorite indicator 1456, a channel identifier 1712, channel logo 1716, channel name 1720, and a remove icon 1724 (which, if selected, removes the corresponding channel from the set of favorite channels). When the favorite shows bar 1708 is in-focus, a program listing is provided in lieu of the channel listing 1704. Each program listing includes a favorite indicator 1456, a show name, and a remove icon 1724 (which, if selected, removes the corresponding program from the set of favorite programs). Referring to FIG. 18, when multiple airings of the same program are currently airing, those programs are highlighted accordingly. Selecting the program will provide the user with a modal 1800 of all the listings that are currently live. The modal lists each currently live program listing or episode by start time, show name, and channel indicator. The content of the program preview area 1408 reflects the currently in-focus channel programming (in-focus not requiring selection by the OK or play buttons but focus only by the cursor) or program listing. In the hot key bar 1416, the red hot key corresponds to "Guide" (which, if selected, jumps back to the currently playing program in the full EPG), and the yellow, green, and blue hot keys correspond to no assigned functionality.


The channel list bar 1708, when selected, enables the user to hide and restore channels. Hiding a channel will remove the channel from all views as well as from the normal channel changing experience. If the hidden channel has been marked as a favorite, it will automatically be un-favorited (or removed from the set of favorite channels) before removal, as sketched below. Restoring or un-hiding a channel will reactivate the channel and make it re-accessible from all views. Channels that are hidden are visually differentiated from active channels. In the hot key bar 1416, the red hot key corresponds to "Guide" (which, if selected, jumps back to the currently playing program in the full EPG), the yellow hot key to "Restore all" (which, if selected, restores all hidden channels), the green hot key to no assigned functionality, and the blue hot key to "Favorites" (which, if selected, toggles the favorite status of the listed channel(s)).
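

By way of illustration only, the hide/restore behavior, including the automatic un-favoriting of a hidden favorite channel, can be sketched in Java as follows; all names are hypothetical.

import java.util.HashSet;
import java.util.Set;

class ChannelList {
    private final Set<Integer> favorites = new HashSet<>();
    private final Set<Integer> hidden = new HashSet<>();

    void markFavorite(int channel) {
        favorites.add(channel);
    }

    void hide(int channel) {
        favorites.remove(channel); // automatically un-favorite before removal
        hidden.add(channel);
    }

    void restore(int channel) {
        hidden.remove(channel); // channel becomes re-accessible from all views
    }

    boolean isVisible(int channel) {
        return !hidden.contains(channel);
    }
}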


The reminder bar 1708, when selected, enables the user to manage all configured reminders. The vertically scrollable list of programs in the program listing contains all programs that have reminders set. In the hot key bar 1416, the red hot key corresponds to "Guide" (which, if selected, jumps back to the currently playing program in the full EPG), the yellow hot key to "Date/Alphabetic" (which, if selected, changes the sort order from alphabetic to date (items ordered by soonest to latest) or vice versa, with items having no upcoming air date being ordered alphabetically at the end), and the green and blue hot keys to no assigned functionality.
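The "Date/Alphabetic" toggle can be read as two comparators. Below is a minimal Java sketch, assuming a hypothetical Reminder record; date order lists dated reminders soonest first and pushes reminders with no upcoming air date to the end in alphabetical order, as described above:

```java
import java.time.Instant;
import java.util.Comparator;
import java.util.Optional;

// Hypothetical reminder entry: show name plus optional next air date.
record Reminder(String showName, Optional<Instant> nextAirDate) {}

public final class ReminderSort {
    public static final Comparator<Reminder> ALPHABETIC =
            Comparator.comparing(r -> r.showName().toLowerCase());

    public static final Comparator<Reminder> BY_DATE =
            Comparator
                // reminders with a date come first
                .comparing((Reminder r) -> r.nextAirDate().isEmpty())
                // then soonest to latest
                .thenComparing(r -> r.nextAirDate().orElse(Instant.MAX))
                // dateless items fall to the end, alphabetically
                .thenComparing(r -> r.showName().toLowerCase());
}
```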


In any of the variations of FIG. 17, the OK button 764 or play button 756 on the remote control selects an in-focus program or channel listing for live viewing. When a channel or program is selected, the application layer 416 automatically exits the EPG display and commences viewing the selected programming.


With reference to FIG. 19, the header bar 1900 will be discussed. The header bar 1900 provides contextual information for the currently watched program. The header bar 1900 includes the following fields: channel indicator 1448, program preview area 1408, current time, current viewing time remaining (or completed) 1904 and/or start-to-finish viewing time 1908 for the currently selected program, video format 1912, and audio format 1916. The header bar 1900 is presented for all program types, including movies, TV show episodes, sport, specials, radio, and undefined. During program viewing, the header bar 1900 is always displayed whenever the D-Pad 760 is touched or whenever the channel is changed. It auto-hides (or is no longer viewable) after a predetermined time, such as 5 seconds. In the viewing area 1920, a full screen thumbnail view of the current program (which can be the same as or different from the thumbnail view 1440 in the header bar 1900) is displayed. As noted, the thumbnail view can be the program poster, if one is available; otherwise, the thumbnail view should be a screen grab provided by the Internet-provided EPG service. When a new channel or program is selected, both the header bar 1900 and viewing area 1920 are updated immediately to reflect the new programming without needing to wait for the newly selected channel or program to activate. The activation delay can be caused by tuning delays. During the activation delay, the viewing area 1920 of the display is unable to display the new programming as a live feed and may continue to display video frames from the prior programming even after a new channel or program is selected. The thumbnail view on the display or viewing area can improve the channel changing experience, as all channels will show a full-screen thumbnail view immediately after channel or program selection and while tuning takes place. This can allow the user to flip quickly through channels and view what is on without having to wait for the tuner to catch up.
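For illustration, a minimal sketch of the image-selection rule described above: prefer the program poster, else fall back to the EPG service's screen grab. The EpgService interface and program-id parameter are assumptions of the sketch, not an actual API:

```java
import java.awt.Image;

// Hypothetical facade over the Internet-provided EPG service.
interface EpgService {
    Image posterFor(String programId);     // may return null
    Image screenGrabFor(String programId); // EPG-provided frame capture
}

public final class TuningPlaceholder {
    private final EpgService epg;

    public TuningPlaceholder(EpgService epg) { this.epg = epg; }

    /** Image shown full screen in the viewing area while the tuner
     *  locks onto the newly selected channel or program. */
    public Image imageFor(String programId) {
        Image poster = epg.posterFor(programId);
        return poster != null ? poster : epg.screenGrabFor(programId);
    }
}
```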


There are many modes to change channels or programming. With reference to FIGS. 7-10, the remote control includes channel-up and channel-down button 784 to select a next higher or lower channel, respectively, for viewing. The D-Pad 760 can scroll through EPG program or channel listings as described above, and the OK button 764 can select a program or channel for viewing. When not in the EPG display mode, the D-Pad 760 can be used to cycle through favorite channels that have been set on the system along with favorite programs that are currently airing. Opposing sets of arrows, either the up and down arrows or the left and right arrows, when pressed, cycle through the favorite channels and programs. The back button 752 can change the currently viewed channel to the last-viewed channel. Repeatedly pressing the back button 752 will toggle between the currently viewed and previously viewed channels. When scrolling in any of these modes, once the largest or smallest numbered channel is reached, the next press takes the user to the smallest or largest numbered channel, respectively.
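The wrap-around behavior in the last sentence can be expressed with modular arithmetic over the sorted channel list. A minimal Java sketch, with the ascending list of channel numbers as an illustrative assumption:

```java
import java.util.List;

public final class ChannelCycler {
    private final List<Integer> sortedChannels; // ascending channel numbers

    public ChannelCycler(List<Integer> sortedChannels) {
        this.sortedChannels = sortedChannels;
    }

    /** Stepping past the largest channel wraps to the smallest. */
    public int next(int current) {
        int i = sortedChannels.indexOf(current);
        return sortedChannels.get((i + 1) % sortedChannels.size());
    }

    /** Stepping below the smallest channel wraps to the largest. */
    public int previous(int current) {
        int i = sortedChannels.indexOf(current);
        int n = sortedChannels.size();
        return sortedChannels.get((i - 1 + n) % n);
    }
}
```

For a channel plan of {2, 5, 294}, next(294) returns 2 and previous(2) returns 294.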


With reference to FIGS. 7-10, the remote control does not have number keys but rather a 0-9 (number) button 776, which, if selected, reveals an on screen channel changer 2000 shown in FIG. 20. The on screen channel changer 2000 includes a program preview area 1408 (reflecting the currently in-focus channel), a vertically scrollable (EPG) channel listing 2004 (each listing including the channel indicator 1448, show name, and progress bar), and a number input bar 2008. In the number input bar 2008, only numbers that will currently produce a selectable channel are enabled. Disabled numbers cannot receive focus and are bypassed during navigation of the number input bar. As each number is selected, the numbers enabled and disabled will vary to reflect the selectable channels available from the root numbers selected. The back button 752 acts as a backspace and, when selected, deletes the last number selected. The removal of the number keys or buttons from the remote control reduces the complexity of the remote control and thereby improves the remote control's overall usability and simplicity, while the on screen channel changer 2000 provides the user with the ability to change channels using only the D-Pad 760. One opposing set of arrows (e.g., the up and down arrows) scrolls up and down through the program listing 2004, while the other opposing set of arrows (e.g., the left and right arrows) moves to the left and right, respectively, through the number input bar 2008.


The on screen channel changer 2000 can have different EPG channel listing configurations. In one configuration, a mini EPG is presented having a single channel listing of the last channel that was manually entered by the user (that may not be the currently viewed channel). In another configuration shown in FIG. 20, as numbers are selected a dynamic EPG channel listing is presented. The channels in the dynamic EPG listing correspond to the channels that are currently selectable from the root entered numbers. For example, if the user has entered “11” and only numbers “2”, “4”, and “6” in the number input bar are enabled, the channels in the dynamic EPG channel listing would include channel listings for channels 112, 114, and 116.
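A minimal sketch of how the enabled digits could be derived from the entered prefix, assuming channels are matched as decimal-string prefixes (the method and types are illustrative only):

```java
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

public final class NumberInputBar {
    /** A digit is enabled only if appending it to the digits already
     *  entered still prefixes at least one real channel number. */
    public static Set<Character> enabledDigits(List<Integer> channels,
                                               String entered) {
        return channels.stream()
                .map(String::valueOf)
                .filter(c -> c.startsWith(entered) && c.length() > entered.length())
                .map(c -> c.charAt(entered.length())) // next selectable digit
                .collect(Collectors.toSet());
    }
}
```

With channels 112, 114, and 116 and an entered prefix of "11", enabledDigits returns {'2', '4', '6'}, matching the example above.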


With reference to FIG. 21, a D-Pad channel navigation display is shown. Pressing the left or right arrows in the D-Pad moves focus from one file 2100 to another. Each file 2100 includes a channel number and logo, thumbnail view 1440, progress bar 1504, and program information 2104 (e.g., program name, episode name, season number, episode number, and/or program type and/or category). An in-focus file can be selected by pressing the OK button. The channels to the right of center 2108 can be made up of all channels that have been set as favorites along with any other channels that are currently airing a favorite program. The favorite channels can be ordered by channel, increasing or decreasing from the center 2108 out. The channels to the left of the center 2108 can be made up of all recommendations based on the user's favorites and watching patterns (e.g., this can initially be limited to programs matching the active program's category until cloud-based recommendations are available). The user's current favorites, for example, can be used to identify similar channels and/or programs that may be of interest to the user. A similar channel may be, for example, a channel broadcasting a favorite program of the user. A similar program may be, for example, a program of a similar category or type to a current favorite. Watching patterns can be determined, for example, by monitoring the frequency (over a selected period of time) with which the user selects a channel and/or watches a program (such as an episode, sport event, news program, and the like). When the frequency is at least a determined amount, the channel or program is recommended. The recommended channels and/or programs can be ordered based on frequency magnitude. As will be appreciated, other techniques can be used to recommend a channel or program to the user. The recommended channels can be ordered by channel, increasing or decreasing from the center 2108 out.
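For illustration, a minimal Java sketch of the frequency-threshold recommendation rule described above; the watch-count map and threshold parameter are assumptions standing in for whatever viewing history the system actually keeps:

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public final class WatchPatternRecommender {
    /** @param watchCounts channel or program id -> views in the period
     *  @param minCount    threshold frequency for recommending
     *  @return recommended ids, ordered by frequency magnitude */
    public static List<String> recommend(Map<String, Integer> watchCounts,
                                         int minCount) {
        return watchCounts.entrySet().stream()
                .filter(e -> e.getValue() >= minCount)       // at least the threshold
                .sorted(Map.Entry.<String, Integer>comparingByValue(
                        Comparator.reverseOrder()))          // most-watched first
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }
}
```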


The information panel for an episode is shown in FIG. 22. It is available when the user is watching any episode and provides extended information on the currently watched program. The information panel 2200 is adjacent to the viewing area 1920, which displays the live episode programming and the current time 2216. The information panel 2200 includes a panel navigation bar 2204; a thumbnail view 1440 and channel number, name and logo 1442 with accompanying information fields for the episode being displayed in the viewing area 1920; a favorite action button 2208 (which, if selected, provides a modal with the options: add or remove a program to or from favorites and add or remove a channel to or from favorites); an audio track button 2220 (only visible if there are multiple audio tracks and which, if selected, provides a modal with a list of all audio tracks available with the video); a subtitle button 2224 (which, if selected, provides a modal with a list of all subtitle tracks available within the video); an on next mini EPG 2212; and hot keys 1416. The accompanying information fields include show name 1420, episode name 1428, and other information 1428 (which includes one or more of the program category of the episode, program video format of the episode, program audio format of the episode, program description of the episode, program start and end times of the episode, program season number of the episode (or the season to which the episode belongs), episode number, original air date of the episode, first air indicator, encryption status, and rating of the episode). The on next mini EPG 2212 shows a small time-based EPG with the next few airings of the episode. In the hot key bar 1416, the red hot key corresponds to "Guide", which, if selected, dismisses the information panel and launches the full-screen EPG; the yellow hot key to "View in HD", which, if selected, switches to the channel currently airing the same program in HD (which is useful when the current channel is airing the program in SD); the green hot key to "Remind", which, if selected, toggles the reminder on and off for the selected or in-focus program; and the blue hot key to "Favorite", which, if selected, provides a modal with the options: add or remove [Program] to or from favorites and add or remove [Channel] to or from favorites.


An information panel similar to that of FIG. 22 is provided for TV specials, movies, sporting events, radio, and undefined programming. Any of the information panels can be selected by selecting the “Info” option on the panel navigation bar 2204.


From the display of FIG. 22, an EPG panel can be selected by selecting the On Now option on the panel navigation bar 2204. This selection replaces the information panel 2200 with an EPG 1412 similar to that of FIG. 14. It provides a vertically scrollable listing of programs available or airing within the next hour. In this panel, the hot key assignments remain the same as the information panel except for the yellow hot key, which has no corresponding assigned function.


From the display of FIG. 22, a categories panel can be selected by selecting the Category option on the panel navigation bar 2204. This selection replaces the information panel 2200 with a vertically scrollable list of all available program categories. Categories not having available content are excluded from the listing. Selecting a listed category provides a focused mini EPG listing the corresponding available programs within the selected category. In the hot key bar 1416, the red hot key corresponds to “Go to Top”, which, if selected, returns the user to the top of the panel, the yellow hot key has no corresponding function, the green hot key to “Remind”, which, if selected, toggles the reminder on and off for the selected or in-focus program, and the blue hot key to “Favorite”, which, if selected, marks the series in focus as a favorite and, when the series is already a favorite, removes the series from the set of favorites.


From the display of FIG. 22, a favorites panel can be selected by selecting the Favorite option on the panel navigation bar 2204. This selection replaces the information panel 2200 with a vertically scrollable list of all channels that have been marked as favorites along with currently airing programs that are marked as favorites. The listing of channels includes channel number, name, and logo, and show name. The listing is updated dynamically such that, when a favorite program ends on a channel that is not a favorite, the program is removed from the listing. In the hot key bar 1416, the red hot key corresponds to "Guide", which, if selected, dismisses the favorites panel and launches the full-screen EPG; the yellow and green hot keys have no corresponding functions; and the blue hot key to "Favorite", which, if selected, provides a modal with the options: add or remove [Program] to or from favorites and add or remove [Channel] to or from favorites.
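A minimal sketch of that dynamic update rule, assuming an illustrative Airing record: a row remains visible if its channel is a favorite, or if it is a favorite program that is still airing:

```java
import java.time.Instant;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Hypothetical row in the favorites panel listing.
record Airing(int channel, String show, Instant start, Instant end) {}

public final class FavoritesPanel {
    public static List<Airing> visibleRows(List<Airing> rows,
                                           Set<Integer> favoriteChannels,
                                           Set<String> favoriteShows,
                                           Instant now) {
        return rows.stream()
                .filter(a -> favoriteChannels.contains(a.channel())
                        || (favoriteShows.contains(a.show())
                            && a.end().isAfter(now))) // drop ended programs
                .collect(Collectors.toList());
    }
}
```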


From the display of FIG. 22, a search panel 2300 can be selected by selecting the Search option on the panel navigation bar 2204. The search panel 2300 allows the user to search the EPG for content that will be available within a predetermined time period. With reference to FIG. 23, the search panel 2300 includes a search term field 2304, a list of previous search terms 2308, and the hot key bar 1416. The previous search terms are listed with the most recently searched terms at the top of the list. Any of the previous search terms can be selected by the OK button to initiate a new search using that term. The search term can be matched against any field within an EPG, including channel number, name, or logo, show name 1420, episode name 1428, program category, program video format, program audio format, program description, program start and end times, program season number, episode number, original air date, first air indicator, encryption status, and rating. In the hot key bar 1416, the red hot key corresponds to "Guide", which, if selected, dismisses the search panel and launches the full-screen EPG; the yellow hot key to "Clear all", which, if selected, clears the search history; the green hot key to "Remove", which, if selected, removes the selected term from the search history; and the blue hot key has no corresponding function.
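For illustration, a minimal sketch of matching a search term against every EPG field, with each listing modeled as a map from field name to string value (an assumption of the sketch, not the EPG's actual schema):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public final class EpgSearch {
    /** A listing matches if any of its fields (show name, episode
     *  name, category, description, and so on) contains the term. */
    public static List<Map<String, String>> search(
            List<Map<String, String>> listings, String term) {
        String needle = term.toLowerCase();
        return listings.stream()
                .filter(l -> l.values().stream()
                        .anyMatch(v -> v.toLowerCase().contains(needle)))
                .collect(Collectors.toList());
    }
}
```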


The search results panel 2400 of FIG. 24 is provided after a search is initiated. The search results panel includes not only the panel navigation bar 2204 and hot key bar 1416 but also a search term field 2404 and search results listing 2408. The search results update in real time while the user fills in the search term field 2404. The search results 2408 provide a vertically scrollable collection of programs that are related to the search term and are airing, or will be airing, within a predetermined time. Each search result listing includes time on or program start time, show name, progress bar, indicators such as the favorite and reminder indicators, and channel number, name, and logo. The search terms are reset (to the prior search terms on the display) when the user exits the live TV silo, the user turns off the TV, or a predetermined time passes from the time of the search. In the hot key bar 1416, the red hot key corresponds to "Go to Top", which, if selected, returns the user to the top of the panel; the yellow hot key to "Clear", which, if selected, clears the current search and returns the panel to the pre-search view with the focus in the empty entry field; the green hot key to "Remind", which, if selected, toggles the reminder on and off for the selected or in-focus program; and the blue hot key to "Favorite", which, if selected, marks the series in focus as a favorite and, when the series is already a favorite, removes the series from the set of favorites. As shown, the search results panel 2400 can optionally include a program preview area 1408 for the in-focus listing in the search results listing 2408.


A user interface operation of the application framework 412 (typically the panel manager 536 and/or panel controller 552) is shown in FIG. 25. The operation will be discussed with reference to FIG. 19.


In step 2500, the application framework 412 detects a stimulus, such as selection of a new channel or program that causes the intelligent TV 100 tuner to locate a new programming signal.


In step 2504, the intelligent TV 100 tuner initiates tuning to locate the new programming signal.


In step 2508, the application framework 412, during the period that the tuner is attempting to locate the new programming signal, displays, in the viewing area 1920, the display of FIG. 19. One or more thumbnail views of the current program on the new channel or the newly selected program are shown in the thumbnail view and viewing area.


In decision diamond 2512, the application framework 412 determines whether tuning is completed and the new programming signal has been located and is currently being received. If not, the application framework 412 returns to step 2508. If so, the application framework 412, in step 2516, displays, in the viewing area 1920, live programming in lieu of the view of FIG. 19.
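The flow of steps 2508-2516 can be summarized in a short polling loop. A minimal sketch, with Tuner and Screen as hypothetical interfaces standing in for the tuner and the application framework's display components:

```java
// Hypothetical interfaces for the sketch; not the framework's real API.
interface Tuner { boolean isLocked(); }
interface Screen {
    void showPlaceholder(String programId); // thumbnail view of FIG. 19
    void showLiveFeed();                    // live programming
}

public final class TuneFlow {
    public static void run(Tuner tuner, Screen screen, String programId)
            throws InterruptedException {
        screen.showPlaceholder(programId); // step 2508: immediate image
        while (!tuner.isLocked()) {        // decision diamond 2512
            Thread.sleep(50);              // poll until signal is located
        }
        screen.showLiveFeed();             // step 2516: live programming
    }
}
```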


A further user interface operation of the application framework (typically the panel manager 536 and/or panel controller 552) is shown in FIG. 26. The operation will be discussed with reference to FIGS. 20 and 27.


In step 2600, the application framework 412 detects a stimulus, such as selection of the 0-9 (number) button 776 to select a new channel or program, which causes the intelligent TV 100 tuner to locate a new programming signal.


In step 2604, the application framework 412, in response, displays, in the viewing area 1920, an on screen channel changer 2000. With reference to FIG. 27, the on screen channel changer 2000 will have the appearance of the on screen channel changer 2700. When the on screen channel changer is brought into focus, it will contain a mini EPG with a single entry of the last channel that was manually entered (which may not be the currently viewed channel). The in-focus indicator will be in a default center position on the number input bar 2008 (which is the "5" position as shown). Only numbers in the number input bar 2008 that will produce a selectable channel are enabled, while the other numbers are disabled.


In step 2608, the application framework 412 receives a number selection via the on screen channel changer.


In decision diamond 2612, the application framework 412 determines whether the user has entered a complete channel entry. If not, the application framework 412, in step 2616, updates the on screen channel changer 2700 to reflect the selected number. Referring to FIG. 27, the user has used the D-Pad to select the number "2". The on screen channel changer 2704 is updated to show, in the mini EPG, the next three channels that begin with the number "2". Selecting any of these EPG listings will automatically change the channel and dismiss the on screen channel changer 2704.


In the example of FIG. 27, the user has selected only the first number of the channel. The user next uses the D-Pad to select the number "9". Because a complete channel entry has not been entered, the application framework 412 proceeds to step 2616 and updates the on screen channel changer to reflect the selected number. Referring to FIG. 27, the on screen channel changer 2708 is updated to show, in the mini EPG, the next three channels that begin with the numbers "29". Selecting any of these EPG listings will automatically change the channel and dismiss the on screen channel changer 2708.


The user, in step 2608, next selects the number "4" using the D-Pad. In decision diamond 2612, the application framework 412 determines that a complete channel has been entered (294 in the example) and, in step 2620, updates the on screen channel changer 2712 by showing in the mini EPG, and putting in focus, only a single listing corresponding to the entered channel. If the user selects the listing, the application framework 412 will automatically change the channel and dismiss the on screen channel changer 2712.
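For illustration, a minimal sketch of the digit-entry flow of FIGS. 26-27; the three-listing mini EPG and the prefix-based completeness test are assumptions of the sketch, not the framework's actual implementation:

```java
import java.util.List;
import java.util.stream.Collectors;

public final class ChannelEntry {
    private final List<Integer> channels;
    private String entered = "";

    public ChannelEntry(List<Integer> channels) { this.channels = channels; }

    /** Step 2608: accept a digit and return the matching mini EPG rows. */
    public List<Integer> select(char digit) {
        entered += digit;
        return channels.stream()
                .filter(c -> String.valueOf(c).startsWith(entered))
                .limit(3) // mini EPG shows the next three matches
                .collect(Collectors.toList());
    }

    /** Decision diamond 2612: entry is complete when it names a channel. */
    public boolean isComplete() {
        return channels.contains(Integer.parseInt(entered));
    }
}
```

Entering "2", "9", then "4" against a channel plan containing 294 narrows the mini EPG at each step and reports completion once "294" names an actual channel.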


The exemplary systems and methods of this disclosure have been described in relation to intelligent televisions. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.


Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a television, or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.


Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.


Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configuration, and aspects.


A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.


In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, a special purpose computer, any comparable means, or the like. In general, any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations, and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.


In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.


In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.


Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein and other similar standards and protocols not mentioned herein are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.


The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease, and/or reducing cost of implementation.


The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.


Moreover, though the description has included description of one or more aspects, embodiments, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.

Claims
  • 1. A method, comprising: displaying, by a television, before receiving a channel-tuning request to tune from a first channel to a second channel by a tuner of the television, a first programming of the first channel from a video buffer of the television on a display screen of the television; upon receiving the channel-tuning request and during a tuning delay caused by the tuner in locating the second channel: displaying a still image corresponding to a second programming of the second channel from a data source in a viewing area of the display screen; and simultaneously displaying a thumbnail view associated with the second programming of the second channel from the data source in a preview area of the display screen, wherein the viewing area and the preview area occupies an entirety of the display screen; and at an end of the tuning delay, displaying the second programming of the second channel on the display screen from the video buffer.
  • 2. The method of claim 1, further comprising displaying an information panel capable of accepting an interactive user input for accessing information related to the second channel in the preview area during the tuning delay.
  • 3. The method of claim 1, wherein the thumbnail view is a captured frame of the second programming; wherein the thumbnail view is displayed as part of a header bar; and wherein the header bar comprises at least one of a program name, series name, episode number, episode name, and sport event name and a plurality of a program description, a program type, a program category, a start-to-finish time, current runtime, and a remaining runtime.
  • 4. The method of claim 1, wherein the thumbnail view is determined based on metadata associated with the at least one of the second channel and second programming.
  • 5. The method of claim 4, wherein the metadata is populated into an electronic program guide using a Program and System Information Protocol.
  • 6. The method of claim 4, wherein the metadata is converted into a broadcast-ready format by a Programming Metadata Communication Protocol generator.
  • 7. The method of claim 1, wherein each of the first programming and the second programming is one or more of a movie, an episode, sport event, and a television special.
  • 8. A television, comprising: a television tuner; a display screen; a video buffer; and a microprocessor configured to: cause the display screen to display, before receiving a channel-tuning request to tune from a first channel to a second channel by the television tuner of the television, a first programming of the first channel from the video buffer of the television on the display screen of the television; upon receiving the channel-tuning request and during a tuning delay caused by the television tuner in locating the second channel: cause the display screen to display a still image corresponding to a second programming of the second channel from a data source in a viewing area of the display screen; and cause the display screen to simultaneously display a thumbnail view associated with the second programming of the second channel from the data source in a preview area of the display screen, wherein the viewing area and the preview area occupies an entirety of the display screen; and at an end of the tuning delay, cause the display screen to display the second programming of the second channel on the display screen from the video buffer.
  • 9. The television of claim 8, wherein the microprocessor is further configured to cause the display screen to display an information panel capable of accepting an interactive user input for accessing information related to the second channel in the preview area during the tuning delay.
  • 10. The television of claim 8, wherein the thumbnail view is a captured frame of the second programming; wherein the thumbnail view is displayed as part of a header bar; and wherein the header bar comprises at least one of a program name, series name, episode number, episode name, and sport event name and a plurality of a program description, a program type, a program category, a start-to-finish time, current runtime, and a remaining runtime.
  • 11. The television of claim 8, wherein the thumbnail view is determined based on metadata associated with the at least one of the second channel and second programming.
  • 12. The television of claim 11, wherein the metadata is populated into an electronic program guide using a Program and System Information Protocol.
  • 13. The television of claim 11, wherein the metadata is converted into a broadcast-ready format by a Programming Metadata Communication Protocol generator.
  • 14. The television of claim 8, wherein each of the first programming and the second programming is one or more of a movie, an episode, sport event, and a television special.
CROSS REFERENCE TO RELATED APPLICATION

The present application claims the benefits of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Application Ser. Nos. 61/684,672 filed Aug. 17, 2012, "Smart TV"; 61/702,650 filed Sep. 18, 2012, "Smart TV"; 61/697,710 filed Sep. 6, 2012, "Social TV"; 61/700,182 filed Sep. 12, 2012, "Social TV Roadmap"; 61/736,692 filed Dec. 13, 2012, "SmartTV"; 61/798,821 filed Mar. 15, 2013, "SmartTV"; 61/804,942 filed Mar. 25, 2013, "SmartTV"; 61/804,998 filed Mar. 25, 2013, "SmartTV"; 61/804,971 filed Mar. 25, 2013, "SmartTV"; 61/804,990 filed Mar. 25, 2013, "SmartTV"; 61/805,003 filed Mar. 25, 2013, "SmartTV"; 61/805,053 filed Mar. 25, 2013, "SmartTV"; 61/805,030 filed Mar. 25, 2013, "SmartTV"; 61/805,027 filed Mar. 25, 2013, "SmartTV"; 61/805,042 filed Mar. 25, 2013, "SmartTV"; and 61/805,038 filed Mar. 25, 2013, "SmartTV." Each of the aforementioned documents is incorporated herein by reference in its entirety for all that it teaches and for all purposes.

US Referenced Citations (452)
Number Name Date Kind
4284849 Anderson et al. Aug 1981 A
5539479 Bertram Jul 1996 A
5701183 Bellemare et al. Dec 1997 A
5734444 Yoshinobo Mar 1998 A
5801747 Bedard Sep 1998 A
5867227 Yamaguchi Feb 1999 A
5933141 Smith Aug 1999 A
5964839 Johnson et al. Oct 1999 A
6008803 Rowe et al. Dec 1999 A
6047319 Olson Apr 2000 A
6111614 Mugura et al. Aug 2000 A
6177931 Alexander et al. Jan 2001 B1
6239794 Yuen et al. May 2001 B1
6285804 Crinon et al. Sep 2001 B1
6493038 Singh et al. Dec 2002 B1
6493876 DeFreese et al. Dec 2002 B1
6515968 Combar et al. Feb 2003 B1
6563515 Reynolds et al. May 2003 B1
6564378 Satterfield et al. May 2003 B1
6629077 Arling et al. Sep 2003 B1
6661468 Alten et al. Dec 2003 B2
6704062 Ahida Mar 2004 B1
6766526 Ellis Jul 2004 B1
6971118 Akhavan et al. Nov 2005 B1
7058600 Combar et al. Jun 2006 B1
7174126 McElhatten et al. Feb 2007 B2
7228556 Beach et al. Jun 2007 B2
7232073 de Jong Jun 2007 B1
7233316 Smith et al. Jun 2007 B2
7266726 Ladd et al. Sep 2007 B1
7271848 Saito Sep 2007 B2
7363591 Goldthwaite et al. Apr 2008 B2
7487529 Orlick Feb 2009 B1
7493641 Klosterman et al. Feb 2009 B2
7506350 Johnson Mar 2009 B2
7543320 Schein et al. Jun 2009 B2
7577923 Beam et al. Aug 2009 B2
7623933 Sarosi et al. Nov 2009 B2
7685520 Rashkovskiy et al. Mar 2010 B2
7698606 Ladd et al. Apr 2010 B2
7805634 Balazich et al. Sep 2010 B2
7822716 Lee et al. Oct 2010 B2
7880077 Pauws et al. Feb 2011 B2
7900228 Stark et al. Mar 2011 B2
7904924 De Heer et al. Mar 2011 B1
7908635 Barton et al. Mar 2011 B2
8006201 Bhattacharya Aug 2011 B2
8065390 Cheng Nov 2011 B2
8089455 Wieder Jan 2012 B1
8127329 Kunkel et al. Feb 2012 B1
8130327 Han et al. Mar 2012 B2
8151215 Baurmann et al. Apr 2012 B2
8166511 Griggs Apr 2012 B2
8201104 Yamamoto et al. Jun 2012 B2
8220021 Look et al. Jul 2012 B1
8281339 Walker et al. Oct 2012 B1
8352983 Chane et al. Jan 2013 B1
8473976 Udani Jun 2013 B2
8510780 Stallings et al. Aug 2013 B2
8516519 Lee Aug 2013 B2
8549561 Yeh et al. Oct 2013 B2
8566874 Roberts et al. Oct 2013 B2
8589981 Lee et al. Nov 2013 B2
8640166 Craner Jan 2014 B1
8683519 McCarthy et al. Mar 2014 B2
8707378 Nambakkam Apr 2014 B2
8756620 Papish et al. Jun 2014 B2
8832742 Rasanen et al. Sep 2014 B2
8875056 Onogi et al. Oct 2014 B2
9635416 Hoffert Apr 2017 B2
20010001160 Shoff et al. May 2001 A1
20010010097 Lee Jul 2001 A1
20010056577 Gordon et al. Dec 2001 A1
20020007485 Rodriguez et al. Jan 2002 A1
20020026637 Markel et al. Feb 2002 A1
20020042914 Walker et al. Apr 2002 A1
20020049970 Park Apr 2002 A1
20020052746 Handelman May 2002 A1
20020056098 White May 2002 A1
20020067376 Martin et al. Jun 2002 A1
20020152267 Lennon Oct 2002 A1
20020188944 Noble Dec 2002 A1
20020188958 Miller Dec 2002 A1
20030003876 Rumsey Jan 2003 A1
20030014752 Zaslavsky et al. Jan 2003 A1
20030070171 Jeon et al. Apr 2003 A1
20030084449 Chane et al. May 2003 A1
20030110234 Egli et al. Jun 2003 A1
20030118323 Ismail et al. Jun 2003 A1
20030149988 Ellis et al. Aug 2003 A1
20030151621 McEvilly et al. Aug 2003 A1
20030177498 Ellis et al. Sep 2003 A1
20030182658 Alexander Sep 2003 A1
20030182659 Ellis et al. Sep 2003 A1
20030208763 McElhatten et al. Nov 2003 A1
20030221192 Rappaport et al. Nov 2003 A1
20030225777 Marsh Dec 2003 A1
20030226146 Thurston et al. Dec 2003 A1
20040078807 Fries et al. Apr 2004 A1
20040093616 Johnson May 2004 A1
20040103120 Fickle et al. May 2004 A1
20040117831 Ellis et al. Jun 2004 A1
20040119815 Soloff Jun 2004 A1
20040148362 Friedman Jul 2004 A1
20040154040 Ellis Aug 2004 A1
20040158193 Bui Aug 2004 A1
20040211282 Kim Oct 2004 A1
20040216156 Wagner Oct 2004 A1
20040237108 Drazin et al. Nov 2004 A1
20050015804 LaJoie et al. Jan 2005 A1
20050022237 Nomura Jan 2005 A1
20050066278 Sloo Mar 2005 A1
20050091596 Anthony et al. Apr 2005 A1
20050097606 Scott, III et al. May 2005 A1
20050097622 Zigmond et al. May 2005 A1
20050120306 Klassen et al. Jun 2005 A1
20050149964 Thomas et al. Jul 2005 A1
20050160452 Lawler et al. Jul 2005 A1
20050165918 Wantanabe et al. Jul 2005 A1
20050188318 Tamir et al. Aug 2005 A1
20050188402 de Andrade et al. Aug 2005 A1
20050204382 Ellis et al. Sep 2005 A1
20050204387 Knudson et al. Sep 2005 A1
20050229210 Akhavan Oct 2005 A1
20050232210 Karaoguz et al. Oct 2005 A1
20050273819 Knudson et al. Dec 2005 A1
20050289603 Cezeaux et al. Dec 2005 A1
20060031875 Yu Feb 2006 A1
20060064716 Sull et al. Mar 2006 A1
20060090183 Zito et al. Apr 2006 A1
20060101504 Aravamudan et al. May 2006 A1
20060117337 Shivaji Rao et al. Jun 2006 A1
20060117343 Novak et al. Jun 2006 A1
20060123445 Sullivan et al. Jun 2006 A1
20060143674 Jones et al. Jun 2006 A1
20060158838 Kinoshita et al. Jul 2006 A1
20060174301 Hashimoto et al. Aug 2006 A1
20060184972 Rafey et al. Aug 2006 A1
20060248557 Stark et al. Nov 2006 A1
20060262116 Moshiri et al. Nov 2006 A1
20060280449 Ogawa et al. Dec 2006 A1
20070011702 Vaysman Jan 2007 A1
20070028282 Kooijmans et al. Feb 2007 A1
20070050733 Lee et al. Mar 2007 A1
20070061724 Slothouber et al. Mar 2007 A1
20070066283 Haar et al. Mar 2007 A1
20070096939 Walrath May 2007 A1
20070115391 Anderson May 2007 A1
20070143809 Chen et al. Jun 2007 A1
20070154163 Cordray Jul 2007 A1
20070162936 Stallings et al. Jul 2007 A1
20070192715 Kataria et al. Aug 2007 A1
20070195197 Seong Aug 2007 A1
20070199022 Moshiri et al. Aug 2007 A1
20070250762 Mansfield Oct 2007 A1
20070250864 Diaz Perez Oct 2007 A1
20070261090 Miller et al. Nov 2007 A1
20080022309 Begeja et al. Jan 2008 A1
20080086745 Knudson et al. Apr 2008 A1
20080092198 Hutten Apr 2008 A1
20080098433 Hardacker et al. Apr 2008 A1
20080114794 Craner May 2008 A1
20080114861 Gildred May 2008 A1
20080120635 Trimper et al. May 2008 A1
20080133525 Ott Jun 2008 A1
20080141307 Whitehead Jun 2008 A1
20080155613 Benya et al. Jun 2008 A1
20080163307 Coburn et al. Jul 2008 A1
20080177828 Accarie et al. Jul 2008 A1
20080235595 Krantz et al. Sep 2008 A1
20080244637 Candelore Oct 2008 A1
20080250455 Fukuda et al. Oct 2008 A1
20080271080 Gossweiler et al. Oct 2008 A1
20080276279 Gossweiler et al. Nov 2008 A1
20080282294 Carpenter et al. Nov 2008 A1
20080288461 Glennon et al. Nov 2008 A1
20080307457 Yang et al. Dec 2008 A1
20080313677 Lee Dec 2008 A1
20090013350 Ohlfs et al. Jan 2009 A1
20090015725 Park Jan 2009 A1
20090019485 Ellis et al. Jan 2009 A1
20090064222 Dawson et al. Mar 2009 A1
20090102966 Jiang et al. Apr 2009 A1
20090106793 Tecot et al. Apr 2009 A1
20090125940 Kim et al. May 2009 A1
20090129340 Handa May 2009 A1
20090150379 Park et al. Jun 2009 A1
20090158326 Hunt et al. Jun 2009 A1
20090158375 Rodriguez et al. Jun 2009 A1
20090160764 Myllymäki Jun 2009 A1
20090165045 Stallings et al. Jun 2009 A1
20090177981 Christie et al. Jul 2009 A1
20090179989 Bessone et al. Jul 2009 A1
20090199237 White et al. Aug 2009 A1
20090199241 Unger et al. Aug 2009 A1
20090199242 Johnson et al. Aug 2009 A1
20090204929 Baurmann et al. Aug 2009 A1
20090210910 Smith et al. Aug 2009 A1
20090213079 Segal et al. Aug 2009 A1
20090228919 Zott et al. Sep 2009 A1
20090235311 Michel et al. Sep 2009 A1
20090241048 Augustine et al. Sep 2009 A1
20090249394 Schwesinger et al. Oct 2009 A1
20090251611 Sumiyoshi Oct 2009 A1
20090271823 Jung et al. Oct 2009 A1
20090293078 Pirani et al. Nov 2009 A1
20090293079 McKee et al. Nov 2009 A1
20090300674 Ratsch et al. Dec 2009 A1
20090313232 Tinsley et al. Dec 2009 A1
20100013997 Whang Jan 2010 A1
20100031193 Stark et al. Feb 2010 A1
20100050219 Angiolillo et al. Feb 2010 A1
20100058231 Duarte et al. Mar 2010 A1
20100060799 Ishii Mar 2010 A1
20100066909 Miyazaki Mar 2010 A1
20100071019 Blanchard et al. Mar 2010 A1
20100086277 Craner Apr 2010 A1
20100095317 Toebes et al. Apr 2010 A1
20100100512 Brodin et al. Apr 2010 A1
20100107112 Jennings et al. Apr 2010 A1
20100122284 Yoon et al. May 2010 A1
20100128183 Ishii May 2010 A1
20100161506 Bosenick et al. Jun 2010 A1
20100172407 Ramaswamy Jul 2010 A1
20100177252 Larsen et al. Jul 2010 A1
20100180292 Epstein et al. Jul 2010 A1
20100180300 Carpenter et al. Jul 2010 A1
20100191727 Malik Jul 2010 A1
20100201890 DeGonde Aug 2010 A1
20100225830 Blanchard et al. Sep 2010 A1
20100235740 Friedlander et al. Sep 2010 A1
20100241962 Peterson et al. Sep 2010 A1
20100293497 Peterson Nov 2010 A1
20100302444 Ahn et al. Dec 2010 A1
20100306402 Russell et al. Dec 2010 A1
20100306800 Jung et al. Dec 2010 A1
20100319036 Lim et al. Dec 2010 A1
20100321405 MacInnes et al. Dec 2010 A1
20110016492 Morita Jan 2011 A1
20110022450 Meredith Jan 2011 A1
20110023608 Rueger Feb 2011 A1
20110035774 Parker Feb 2011 A1
20110041150 Schein et al. Feb 2011 A1
20110043696 Onogi et al. Feb 2011 A1
20110047572 Hill et al. Feb 2011 A1
20110055873 Heo et al. Mar 2011 A1
20110060661 Chai et al. Mar 2011 A1
20110067066 Barton et al. Mar 2011 A1
20110072480 Stone Mar 2011 A1
20110074591 Arling et al. Mar 2011 A1
20110078745 Macrae et al. Mar 2011 A1
20110083148 Sakaguchi Apr 2011 A1
20110093888 Araki et al. Apr 2011 A1
20110107376 Lee et al. May 2011 A1
20110119702 Jang et al. May 2011 A1
20110119707 Bae et al. May 2011 A1
20110125755 Kaila et al. May 2011 A1
20110126251 LaFreniere et al. May 2011 A1
20110119626 Faenger et al. Jun 2011 A1
20110131506 Callisendorff Jun 2011 A1
20110145860 Wei Jun 2011 A1
20110145881 Hartman et al. Jun 2011 A1
20110154198 Bachman et al. Jun 2011 A1
20110154405 Isaias Jun 2011 A1
20110161996 Hamano et al. Jun 2011 A1
20110167452 Baumgartner et al. Jul 2011 A1
20110173657 Thomas et al. Jul 2011 A1
20110179453 Poniatowski Jul 2011 A1
20110179454 Yates Jul 2011 A1
20110187929 Suzuki et al. Aug 2011 A1
20110213670 Strutton et al. Sep 2011 A1
20110213700 Sant'Anselmo Sep 2011 A1
20110219395 Moshiri et al. Sep 2011 A1
20110246495 Mallinson Oct 2011 A1
20110252446 Jeong et al. Oct 2011 A1
20110273552 Wang et al. Nov 2011 A1
20110273625 McMahon et al. Nov 2011 A1
20110276995 Alten et al. Nov 2011 A1
20110283304 Roberts et al. Nov 2011 A1
20110283318 Seidel et al. Nov 2011 A1
20110289067 Jordan et al. Nov 2011 A1
20110289419 Yu et al. Nov 2011 A1
20110289452 Jordan et al. Nov 2011 A1
20110296478 Jin Dec 2011 A1
20110321095 Yao et al. Dec 2011 A1
20110321098 Bangalore et al. Dec 2011 A1
20120002951 Reisman Jan 2012 A1
20120011545 Doets Jan 2012 A1
20120026400 Kang et al. Feb 2012 A1
20120030317 Smyth et al. Feb 2012 A1
20120033950 Cordray et al. Feb 2012 A1
20120042334 Choi et al. Feb 2012 A1
20120054803 Lee et al. Mar 2012 A1
20120060094 Irwin et al. Mar 2012 A1
20120060187 Moon et al. Mar 2012 A1
20120069131 Abelow Mar 2012 A1
20120072952 Vaysman et al. Mar 2012 A1
20120076473 Kunkel et al. Mar 2012 A1
20120079096 Cowan et al. Mar 2012 A1
20120079427 Carmichael et al. Mar 2012 A1
20120086857 Kim et al. Apr 2012 A1
20120090004 Jeong Apr 2012 A1
20120096264 Traversat et al. Apr 2012 A1
20120099024 Ryu et al. Apr 2012 A1
20120110623 Hill et al. May 2012 A1
20120120316 Lee May 2012 A1
20120127181 Fujimoto May 2012 A1
20120133840 Shirasuka et al. May 2012 A1
20120143972 Malik et al. Jun 2012 A1
20120144416 Wetzer et al. Jun 2012 A1
20120144422 Han et al. Jun 2012 A1
20120144423 Kim et al. Jun 2012 A1
20120147049 Lee Jun 2012 A1
20120147270 Kim et al. Jun 2012 A1
20120147825 Hassan et al. Jun 2012 A1
20120151006 McInerney et al. Jun 2012 A1
20120154449 Ramagem et al. Jun 2012 A1
20120167154 Kim et al. Jun 2012 A1
20120173979 Lee Jul 2012 A1
20120174039 Rhoads et al. Jul 2012 A1
20120174157 Stinson, III et al. Jul 2012 A1
20120176546 Yoon Jul 2012 A1
20120194742 Barnes et al. Aug 2012 A1
20120198490 Mertens Aug 2012 A1
20120200574 Hill et al. Aug 2012 A1
20120206652 Yi Aug 2012 A1
20120210275 Park et al. Aug 2012 A1
20120210355 Kim et al. Aug 2012 A1
20120210367 Lee et al. Aug 2012 A1
20120210370 Kim et al. Aug 2012 A1
20120210375 Wong et al. Aug 2012 A1
20120210386 Kim et al. Aug 2012 A1
20120218203 Kanki Aug 2012 A1
20120221972 Dougall et al. Aug 2012 A1
20120229320 Yu Sep 2012 A1
20120229473 Tam Sep 2012 A1
20120257108 Friedlander et al. Oct 2012 A1
20120260284 Friedlander et al. Oct 2012 A1
20120272271 Nishizawa et al. Oct 2012 A1
20120278725 Gordon et al. Nov 2012 A1
20120284751 Kim et al. Nov 2012 A1
20120284752 Christie Nov 2012 A1
20120291068 Khushoo et al. Nov 2012 A1
20120297422 Mountain Nov 2012 A1
20120331505 Chun et al. Dec 2012 A1
20130014006 Abellera et al. Jan 2013 A1
20130027613 Kim et al. Jan 2013 A1
20130031584 Chane et al. Jan 2013 A1
20130034023 Jung Feb 2013 A1
20130047174 Lee et al. Feb 2013 A1
20130051615 Lim et al. Feb 2013 A1
20130091053 Park et al. Apr 2013 A1
20130097640 Lemmons et al. Apr 2013 A1
20130148023 Sullivan et al. Jun 2013 A1
20130152135 Hong et al. Jun 2013 A1
20130160065 Aso et al. Jun 2013 A1
20130191869 Sugiyama et al. Jul 2013 A1
20130238777 Raleigh Sep 2013 A1
20130238796 Lentzitzky Sep 2013 A1
20130263021 Dunn et al. Oct 2013 A1
20130263053 Tritschler et al. Oct 2013 A1
20130275519 Nichols Oct 2013 A1
20130276031 Oh et al. Oct 2013 A1
20130282839 Alcala Oct 2013 A1
20130291015 Pan Oct 2013 A1
20130297697 Haimovitch et al. Nov 2013 A1
20130298164 Moreau et al. Nov 2013 A1
20130318553 Yegorov et al. Nov 2013 A1
20130326557 Kang et al. Dec 2013 A1
20130332856 Sanders et al. Dec 2013 A1
20140040344 Gehring et al. Feb 2014 A1
20140040819 Duffy Feb 2014 A1
20140049651 Voth Feb 2014 A1
20140049691 Burdzinski et al. Feb 2014 A1
20140049692 Sirpal et al. Feb 2014 A1
20140049693 Selim et al. Feb 2014 A1
20140052785 Sirpal Feb 2014 A1
20140052786 de Paz Feb 2014 A1
20140053176 Milano et al. Feb 2014 A1
20140053177 Voth Feb 2014 A1
20140053178 Voth et al. Feb 2014 A1
20140053179 Voth Feb 2014 A1
20140053180 Shoykher Feb 2014 A1
20140053190 Sirpal Feb 2014 A1
20140053191 Selim Feb 2014 A1
20140053192 Sirpal Feb 2014 A1
20140053193 Selim et al. Feb 2014 A1
20140053194 Shoykher et al. Feb 2014 A1
20140053195 Sirpal et al. Feb 2014 A1
20140053196 Selim Feb 2014 A1
20140053197 Shoykher et al. Feb 2014 A1
20140053198 Sirpal et al. Feb 2014 A1
20140053200 de Paz et al. Feb 2014 A1
20140053202 Selim Feb 2014 A1
20140053203 Csiki Feb 2014 A1
20140053204 Milano Feb 2014 A1
20140053205 Sirpal et al. Feb 2014 A1
20140053206 Shoykher et al. Feb 2014 A1
20140053207 Shoykher et al. Feb 2014 A1
20140053208 Sirpal et al. Feb 2014 A1
20140053211 Milano Feb 2014 A1
20140053212 Shoykher et al. Feb 2014 A1
20140053221 Sirpal et al. Feb 2014 A1
20140053222 Shoykher et al. Feb 2014 A1
20140053225 Shoykher et al. Feb 2014 A1
20140059480 de Paz et al. Feb 2014 A1
20140059578 Voth et al. Feb 2014 A1
20140059589 Sirpal Feb 2014 A1
20140059596 Dourado Feb 2014 A1
20140059598 Milano Feb 2014 A1
20140059599 Sirpal et al. Feb 2014 A1
20140059600 Duarado Feb 2014 A1
20140059601 Sirpal Feb 2014 A1
20140059602 Sirpal Feb 2014 A1
20140059603 Lee et al. Feb 2014 A1
20140059605 Sirpal et al. Feb 2014 A1
20140059606 Selim et al. Feb 2014 A1
20140059609 Duarado Feb 2014 A1
20140059610 Sirpal et al. Feb 2014 A1
20140059612 Selim Feb 2014 A1
20140059613 Burdzinski et al. Feb 2014 A1
20140059614 Shoykher et al. Feb 2014 A1
20140059615 Sirpal et al. Feb 2014 A1
20140059625 Dourado et al. Feb 2014 A1
20140059626 Selim Feb 2014 A1
20140059635 Sirpal et al. Feb 2014 A1
20140059637 Chen et al. Feb 2014 A1
20140067954 Sirpal Mar 2014 A1
20140068673 Sirpal et al. Mar 2014 A1
20140068674 Sirpal et al. Mar 2014 A1
20140068679 Kaneko et al. Mar 2014 A1
20140068682 Selim et al. Mar 2014 A1
20140068683 Selim et al. Mar 2014 A1
20140068685 Selim et al. Mar 2014 A1
20140068689 Sirpal et al. Mar 2014 A1
20140075475 Sirpal et al. Mar 2014 A1
20140075476 de Paz et al. Mar 2014 A1
20140075477 de Paz et al. Mar 2014 A1
20140075479 Soto et al. Mar 2014 A1
20140075483 de Paz et al. Mar 2014 A1
20140075484 Selim et al. Mar 2014 A1
20140075487 Selim et al. Mar 2014 A1
20140082659 Fife et al. Mar 2014 A1
20140109143 Craner Apr 2014 A1
20140173660 Correa et al. Jun 2014 A1
20140181851 Givon et al. Jun 2014 A1
20140181853 Dureau et al. Jun 2014 A1
20140188970 Madhok et al. Jul 2014 A1
20140223464 Moran et al. Aug 2014 A1
20140259074 Ansari et al. Sep 2014 A1
20140380414 Saidi et al. Dec 2014 A1
20150089538 Kim Mar 2015 A1
Foreign Referenced Citations (35)
Number Date Country
1832534 Sep 2006 CN
101472102 Jul 2009 CN
101540850 Sep 2009 CN
101567992 Oct 2009 CN
201937743 Aug 2011 CN
1031929 Aug 2000 EP
1067458 Jan 2001 EP
1770956 Apr 2007 EP
1865723 Dec 2007 EP
2328346 Jun 2011 EP
2348724 Jul 2011 EP
2439935 Apr 2012 EP
2487922 Aug 2012 EP
WO 9921308 Apr 1999 WO
WO 9935849 Jul 1999 WO
WO 2000005884 Feb 2000 WO
WO 0106784 Jan 2001 WO
WO 0106788 Jan 2001 WO
WO 02080552 Oct 2002 WO
WO 03044755 May 2003 WO
WO 2007137611 Dec 2007 WO
WO 2011148054 Dec 2011 WO
WO 2011163481 Dec 2011 WO
WO 2012030024 Mar 2012 WO
WO 2012068438 May 2012 WO
WO 2012073027 Jun 2012 WO
WO 2012094247 Jul 2012 WO
WO 2012103121 Aug 2012 WO
WO 2014026636 Feb 2014 WO
WO 2014026640 Feb 2014 WO
WO 2014028067 Feb 2014 WO
WO 2014028068 Feb 2014 WO
WO 2014028069 Feb 2014 WO
WO 2014028071 Feb 2014 WO
WO 2014028074 Feb 2014 WO
Non-Patent Literature Citations (211)
Entry
Official Action for U.S. Appl. No. 13/969,846, dated Aug. 15, 2014 19 pages.
Official Action for U.S. Appl. No. 13/969,869, dated Sep. 16, 2014 20 pages.
Official Action for U.S. Appl. No. 13/969,880, dated Sep. 11, 2014 10 pages.
Official Action for U.S. Appl. No. 13/969,884, dated Oct. 23, 2014 20 pages.
Official Action for U.S. Appl. No. 13/969,887, dated Oct. 23, 2014 16 pages.
Official Action for U.S. Appl. No. 13/969,482, dated Oct. 21, 2014 22 pages.
Official Action for U.S. Appl. No. 13/969,485, dated Oct. 2, 2014 21 pages.
Official Action for U.S. Appl. No. 13/970,388, dated Sep. 4, 2014 14 pages.
Official Action for U.S. Appl. No. 13/970,398, dated Sep. 2, 2014 14 pages.
Official Action for U.S. Appl. No. 13/970,420, dated Oct. 9, 2014 15 pages.
Official Action for U.S. Appl. No. 13/970,427, dated Sep. 30, 2014 24 pages.
Official Action for U.S. Appl. No. 13/970,442, dated Oct. 7, 2014 10 pages.
Official Action for U.S. Appl. No. 13/969,588, dated Nov. 10, 2014 11 pages.
Official Action for U.S. Appl. No. 13/969,487, dated Sep. 11, 2014 10 pages.
Official Action for U.S. Appl. No. 13/969,491, dated Oct. 8, 2014 21 pages.
Official Action for U.S. Appl. No. 13/864,214, dated Aug. 19, 2014 17 pages.
Official Action for U.S. Appl. No. 13/969,506, dated Oct. 2, 2014 9 pages.
Official Action for U.S. Appl. No. 13/968,665, dated Oct. 9, 2014 12 pages.
Official Action for U.S. Appl. No. 13/968,897, dated Aug. 15, 2014 15 pages.
Official Action for U.S. Appl. No. 13/970,234, dated Aug. 12, 2014 19 pages.
Official Action for U.S. Appl. No. 13/968,969, dated Aug. 19, 2014 14 pages.
Official Action for U.S. Appl. No. 13/968,610, dated Aug. 19, 2014 13 pages.
Official Action for U.S. Appl. No. 13/968,630, dated Jul. 31, 2014 9 pages.
Official Action for U.S. Appl. No. 13/969,507, dated Aug. 29, 2014 30 pages.
Official Action for U.S. Appl. No. 13/969,508, dated Nov. 7, 2014 10 pages.
Official Action for U.S. Appl. No. 13/969,510, dated Nov. 3, 2014 16 pages.
Official Action for U.S. Appl. No. 13/970,243, dated Aug. 22, 2014 20 pages.
U.S. Appl. No. 13/970,420, filed Aug. 19, 2013, Selim et al.
Tvonicsuk “Tvonics: How to use the EPG search function.” 1 pages retrieved from the internet on Nov. 11, 2013 from [www.youtube.com/watch?v=H8euZ0lydo].
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/36844, dated Jun. 28, 2013 8 pages
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/36843, dated Jun. 28, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/36826, dated Jun. 17, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55589, dated Dec. 19, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055289, dated Dec. 2, 2013 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055288, dated Dec. 2, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055286, dated Dec. 2, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55285, dated Dec. 11, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055284, dated Jan. 14, 2014 12 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055293, dated Dec. 2, 2013 12 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055294, dated Dec. 2, 2013 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055296, dated Nov. 29, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55493, dated Nov. 7, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/036828, dated Jun. 28, 2013 12 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055340, dated Feb. 10, 2014 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55551, dated Dec. 6, 2013 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/036819, dated Jun. 17, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/036782, dated Jun. 28, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/36678, dated Aug. 30, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/036804, dated Jul. 1, 2013 13 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55405, dated Jan. 17, 2014 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055278, dated Mar. 11, 2014 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55509, dated Nov. 8, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/CN2013/081639, dated Nov. 28, 2013 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055303, dated Dec. 5, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055280, dated Jan. 10, 2014 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55383, dated Nov. 13, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055344, dated Mar. 11, 2014 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055345, dated Dec. 6, 2013 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55371, dated Nov. 8, 2013 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55389, dated Mar. 10, 2014 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055283, dated Mar. 4, 2014 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55564, dated Nov. 22, 2013 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55379, dated Dec. 6, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055349, dated Nov. 29, 2013 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055312, dated Dec. 5, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55374, dated Nov. 12, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55542, dated Mar. 10, 2014 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55315, dated Jan. 24, 2014 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055317, dated Dec. 2, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055318, dated Jan. 9, 2014 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055319, dated Mar. 11, 2014 7 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055321, dated Dec. 6, 2013 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/CN2013/081630, dated Nov. 21, 2013 11 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055298, dated Nov. 29, 2013 8 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55567, dated Dec. 20, 2013 10 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55570, dated Nov. 22, 2013 8 pages.
Official Action for U.S. Appl. No. 13/864,206, dated Nov. 27, 2013 17 pages.
Official Action for U.S. Appl. No. 13/864,214, dated Mar. 12, 2014 12 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55494, dated Apr. 16, 2014 9 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US2013/055342, dated Apr. 17, 2014 12 pages.
International Search Report and Written Opinion for International (PCT) Patent Application No. PCT/US13/55592, dated Mar. 10, 2014 11 pages.
Notice of Allowance for U.S. Appl. No. 13/864,206, dated Apr. 16, 2014 7 pages.
Official Action for U.S. Appl. No. 13/970,409, dated Jul. 21, 2014 21 pages.
Official Action for U.S. Appl. No. 13/970,442, dated Jun. 4, 2014 10 pages.
Official Action for U.S. Appl. No. 13/970,450, dated Jul. 9, 2014 17 pages.
Official Action for U.S. Appl. No. 13/970,453, dated Jul. 21, 2014 12 pages.
Official Action for U.S. Appl. No. 13/968,884, dated Jun. 25, 2014 23 pages.
Official Action for U.S. Appl. No. 13/968,876, dated Jun. 16, 2014 10 pages.
Official Action for U.S. Appl. No. 13/968,983, dated Jul. 7, 2014 15 pages.
Official Action for U.S. Appl. No. 13/968,948, dated May 6, 2014 11 pages.
Official Action for U.S. Appl. No. 13/864,120, dated Jun. 13, 2014 10 pages.
Official Action for U.S. Appl. No. 13/969,505, dated Jul. 15, 2014 7 pages.
Official Action for U.S. Appl. No. 13/969,490, dated May 23, 2014 18 pages.
Official Action for U.S. Appl. No. 13/969,492, dated May 23, 2014 17 pages.
Official Action for U.S. Appl. No. 13/970,000, dated Jul. 15, 2014 12 pages.
Official Action for U.S. Appl. No. 13/970,355, dated Jul. 7, 2014 7 pages.
Official Action for U.S. Appl. No. 13/970,374, dated Jun. 6, 2014 7 pages.
Official Action for U.S. Appl. No. 13/968,903, dated Jul. 18, 2014 14 pages.
Official Action for U.S. Appl. No. 13/968,969, dated May 6, 2014 12 pages.
Official Action for U.S. Appl. No. 13/968,610, dated Apr. 30, 2014 11 pages.
Official Action for U.S. Appl. No. 13/968,732, dated May 27, 2014 12 pages.
Official Action for U.S. Appl. No. 13/969,777, dated Jun. 25, 2014 9 pages.
Official Action for U.S. Appl. No. 13/969,510, dated Jun. 25, 2014 13 pages.
Official Action for U.S. Appl. No. 13/970,247, dated Jul. 9, 2014 19 pages.
Official Action for U.S. Appl. No. 13/969,504, dated Jul. 15, 2014 9 pages.
Official Action for U.S. Appl. No. 13/968,709, dated Feb. 23, 2015 13 pages.
Official Action for U.S. Appl. No. 13/968,903, dated Jan. 5, 2015 13 pages.
Official Action for U.S. Appl. No. 13/970,234, dated Dec. 26, 2014 13 pages.
Notice of Allowance for U.S. Appl. No. 13/968,969, dated Jan. 6, 2015.
Official Action for U.S. Appl. No. 13/968,610, dated Dec. 31, 2014 7 pages.
Notice of Allowance for U.S. Appl. No. 13/968,610, dated Feb. 24, 2015 5 pages.
Official Action for U.S. Appl. No. 13/968,732, dated Dec. 26, 2014 13 pages.
Official Action for U.S. Appl. No. 13/969,777, dated Jan. 8, 2015 11 pages.
Official Action for U.S. Appl. No. 13/968,630, dated Nov. 19, 2014 11 pages.
Official Action for U.S. Appl. No. 13/969,507, dated Jan. 27, 2015 30 pages.
Official Action for U.S. Appl. No. 13/968,937, dated Dec. 18, 2014 9 pages.
Official Action for U.S. Appl. No. 13/969,510, dated Feb. 24, 2015 17 pages.
Official Action for U.S. Appl. No. 13/970,243, dated Jan. 22, 2015 14 pages.
Official Action for U.S. Appl. No. 13/970,247, dated Dec. 22, 2014 20 pages.
Official Action for U.S. Appl. No. 13/969,504, dated Feb. 23, 2015 10 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55494, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/36826, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055342, dated Feb. 26, 2015 10 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55589, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055289, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055288, dated Feb. 26, 2015 10 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055286, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055284, dated Feb. 26, 2015 9 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055293, dated Feb. 26, 2015 12 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055294, dated Feb. 26, 2015 10 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55493, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/036828, dated Feb. 26, 2015 12 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055340, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055551, dated Feb. 26, 2015 6 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55592, dated Feb. 26, 2015 10 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/036819, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/036782, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/36678, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/036804, dated Feb. 26, 2015 12 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55405, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055278, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55509, dated Feb. 26, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055303, dated Feb. 26, 2015 7 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055383, dated Apr. 2, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055344, dated Apr. 2, 2015 7 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055389, dated Apr. 2, 2015 9 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055283, dated Apr. 2, 2015 6 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055564, dated Apr. 2, 2015 6 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055379, dated Apr. 2, 2015 8 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055349, dated Apr. 2, 2015 7 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US2013/055298, dated Feb. 26, 2015 7 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55567, dated Feb. 26, 2015 9 pages.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/US13/55570, dated Feb. 26, 2015 7 pages.
Official Action for U.S. Appl. No. 13/969,880, dated Mar. 27, 2015 14 pages.
Notice of Allowance for U.S. Appl. No. 13/969,884, dated Mar. 24, 2015 16 pages.
Official Action for U.S. Appl. No. 13/969,887, dated Mar. 11, 2015 24 pages.
Official Action for U.S. Appl. No. 13/969,482, dated Apr. 3, 2015 19 pages.
Official Action for U.S. Appl. No. 13/969,485, dated Mar. 11, 2015 19 pages.
Corrected Notice of Allowance for U.S. Appl. No. 13/968,884, dated Apr. 1, 2015 15 pages.
Official Action for U.S. Appl. No. 13/968,913, dated Apr. 1, 2015 16 pages.
Official Action for U.S. Appl. No. 13/968,983, dated Mar. 12, 2015 21 pages.
Official Action for U.S. Appl. No. 13/968,948, dated Mar. 11, 2015 13 pages.
Official Action for U.S. Appl. No. 13/968,929, dated Apr. 29, 2015 12 pages.
Official Action for U.S. Appl. No. 13/969,588, dated Mar. 11, 2015 12 pages.
Notice of Allowance for U.S. Appl. No. 13/864,120, dated Apr. 14, 2015 9 pages.
Official Action for U.S. Appl. No. 13/969,490, dated Mar. 13, 2015 26 pages.
Official Action for U.S. Appl. No. 13/969,491, dated Mar. 20, 2015 25 pages.
Official Action for U.S. Appl. No. 13/970,000, dated Mar. 5, 2015 16 pages.
Official Action for U.S. Appl. No. 13/969,179, dated Mar. 12, 2015 17 pages.
Official Action for U.S. Appl. No. 13/968,652, dated Apr. 17, 2015 17 pages.
Official Action for U.S. Appl. No. 13/969,777, dated Apr. 15, 2015 11 pages.
Official Action for U.S. Appl. No. 13/968,630, dated Mar. 27, 2015 15 pages.
Official Action for U.S. Appl. No. 13/968,937, dated Apr. 7, 2015 10 pages.
Official Action for U.S. Appl. No. 13/969,508, dated Mar. 5, 2015 9 pages.
U.S. Appl. No. 14/407,593, filed Dec. 12, 2014, Sirpal et al.
U.S. Appl. No. 14/407,598, filed Dec. 12, 2014, Sirpal et al.
U.S. Appl. No. 14/407,601, filed Dec. 12, 2014, Sirpal et al.
U.S. Appl. No. 14/407,609, filed Dec. 12, 2014, Sirpal et al.
U.S. Appl. No. 14/418,969, filed Feb. 2, 2015, Sirpal et al.
U.S. Appl. No. 14/418,972, filed Feb. 2, 2015, Sirpal et al.
U.S. Appl. No. 14/418,971, filed Feb. 2, 2015, Shoykher et al.
International Preliminary Report on Patentability for International (PCT) Patent Application No. PCT/CN2013/081639, dated Feb. 26, 2015 6 pages.
Official Action for U.S. Appl. No. 13/969,846, dated Mar. 2, 2015 21 pages.
Official Action for U.S. Appl. No. 13/969,869, dated Feb. 24, 2015 22 pages.
Official Action for U.S. Appl. No. 13/970,388, dated Jan. 22, 2015 12 pages.
Notice of Allowance for U.S. Appl. No. 13/970,398, dated Feb. 2, 2015 14 pages.
Official Action for U.S. Appl. No. 13/970,409, dated Feb. 11, 2015 25 pages.
Official Action for U.S. Appl. No. 13/970,420, dated Feb. 10, 2015 13 pages.
Official Action for U.S. Appl. No. 13/970,442, dated Feb. 26, 2015 10 pages.
Official Action for U.S. Appl. No. 13/970,450, dated Nov. 19, 2014 12 pages.
Notice of Allowance for U.S. Appl. No. 13/970,453, dated Dec. 9, 2014 11 pages.
Notice of Allowance for U.S. Appl. No. 13/968,884, dated Jan. 5, 2015 20 pages.
Official Action for U.S. Appl. No. 13/968,876, dated Dec. 26, 2014 13 pages.
Official Action for U.S. Appl. No. 13/968,867, dated Jan. 27, 2015 17 pages.
Official Action for U.S. Appl. No. 13/968,983, dated Nov. 26, 2014 17 pages.
Official Action for U.S. Appl. No. 13/968,948, dated Nov. 24, 2014 13 pages.
Official Action for U.S. Appl. No. 13/864,120, dated Dec. 24, 2014 13 pages.
Official Action for U.S. Appl. No. 13/969,505, dated Feb. 24, 2015 9 pages.
Official Action for U.S. Appl. No. 13/969,490, dated Nov. 19, 2014 22 pages.
Official Action for U.S. Appl. No. 13/969,492, dated Dec. 5, 2014 14 pages.
Official Action for U.S. Appl. No. 13/969,493, dated Mar. 2, 2015 15 pages.
Official Action for U.S. Appl. No. 13/970,000, dated Nov. 14, 2014 14 pages.
Official Action for U.S. Appl. No. 13/970,355, dated Dec. 19, 2014 9 pages.
Official Action for U.S. Appl. No. 13/864,214, dated Jan. 29, 2015 17 pages.
Official Action for U.S. Appl. No. 13/969,179, dated Nov. 28, 2014 15 pages.
Official Action for U.S. Appl. No. 13/968,767, dated Jan. 22, 2015 15 pages.
Official Action for U.S. Appl. No. 13/969,506, dated Jan. 23, 2015 9 pages.
Official Action for U.S. Appl. No. 13/970,374, dated Dec. 9, 2014 9 pages.
Official Action for U.S. Appl. No. 13/968,897, dated Jan. 28, 2015 18 pages.
Official Action for U.S. Appl. No. 13/968,652, dated Dec. 5, 2014 25 pages.
Official Action for U.S. Appl. No. 13/968,709, dated Nov. 14, 2014 17 pages.
Related Publications (1)

Number          Date       Country
20140055673 A1  Feb. 2014  US
Provisional Applications (16)

Number    Date       Country
61684672  Aug. 2012  US
61702650  Sep. 2012  US
61697710  Sep. 2012  US
61700182  Sep. 2012  US
61736692  Dec. 2012  US
61798821  Mar. 2013  US
61804942  Mar. 2013  US
61804998  Mar. 2013  US
61804971  Mar. 2013  US
61804990  Mar. 2013  US
61805003  Mar. 2013  US
61805053  Mar. 2013  US
61805030  Mar. 2013  US
61805027  Mar. 2013  US
61805042  Mar. 2013  US
61805038  Mar. 2013  US