Users may access content through a computing device having a single, relatively large display, such as a digital television. Interacting with such a device may be cumbersome. For example, providing textual input to the digital television may involve an on-screen keyboard that uses arrow buttons on a remote control to select the keys. Further, it may be difficult to provide additional information about the content being rendered due to readability, aesthetics, or other concerns.
Many aspects of the present disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, with emphasis instead being placed upon clearly illustrating the principles of the disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
The present disclosure relates to extending a shopping experience from one computing device to another computing device. An online shopping experience using a digital television exclusively may be frustrating to users. For example, textual input via a digital television typically involves an on-screen keyboard manipulated via a remote control. The typing speed achieved via such an on-screen keyboard may be extremely slow, especially where arrow buttons on the remote control are employed to navigate to different keys on the keyboard. Thus, it can be difficult for users to enter item search queries, create item reviews, or provide user names and passwords, billing information, payment information, and other information that may be involved in a shopping experience. Also, it may be challenging to read paragraphs of text on a digital television with a large screen. Accordingly, it may be preferable to browse item descriptions and customer reviews via a tablet or other computing device.
Various embodiments of the present disclosure facilitate extending a shopping experience from a digital television to another computing device and vice versa. A digital television with a relatively large display may be employed to provide an enhanced shopping experience. For example, a digital television may be used to provide simulated “try it on” functionality for apparel items. However, according to various embodiments, a user may employ another computing device such as a tablet, smartphone, laptop, desktop, etc., to browse an online catalog and select an item to be virtually tried on. Further, the other computing device may be employed for the user to provide information to consummate an order of an item. In the following discussion, a general description of the system and its components is provided, followed by a discussion of the operation of the same.
With reference to
The digital television 112 renders a user interface 107b upon a display 115 that provides an immersive experience for the item 106, e.g., a high-resolution image of the particular item 106 with zoom functionality. The user interface 107b may also facilitate a “try it on” feature. Images of other items 106 that are recommended or similar to the particular item 106 may be rendered in a recommendations panel 118. The shopping assistance application may be controlled by physical gestures, voice commands, a remote control, and/or other user input approaches. The shopping assistance application may also be controlled via user input provided through the mobile computing device 103. Through the shopping assistance application, the user may choose to add the particular item 106 to a shopping list, initiate an order of the particular item 106, select another item 106, and/or perform other actions. In particular, as a result of such actions, the shopping assistance application may send a directive to the mobile computing device 103 to render information in a user interface 107c on the display 109. Such information may correspond to an item detail screen, a shopping list screen, a checkout screen to consummate an order, and other information.
Referring next to
The computing environment 203 may comprise, for example, a server computer or any other system providing computing capability. Alternatively, the computing environment 203 may employ a plurality of computing devices that are arranged, for example, in one or more server banks or computer banks or other arrangements. Such computing devices may be located in a single installation or may be distributed among many different geographical locations. For example, the computing environment 203 may include a plurality of computing devices that together may comprise a cloud computing resource, a grid computing resource, and/or any other distributed computing arrangement. In some cases, the computing environment 203 may correspond to an elastic computing resource where the allotted capacity of processing, network, storage, or other computing-related resources may vary over time.
Various applications and/or other functionality may be executed in the computing environment 203 according to various embodiments. Also, various data is stored in a data store 215 that is accessible to the computing environment 203. The data store 215 may be representative of a plurality of data stores 215 as can be appreciated. The data stored in the data store 215, for example, is associated with the operation of the various applications and/or functional entities described below.
The components executed on the computing environment 203, for example, include an electronic commerce application 218 and other applications, services, processes, systems, engines, or functionality not discussed in detail herein. The electronic commerce application 218 is executed to perform various functions relating to the online presence of one or more merchants. Specifically, the electronic commerce application 218 may facilitate searching, browsing, and ordering from an item catalog, among other functions. To this end, the electronic commerce application 218 may generate and serve up network content, such as network pages and/or other data for client applications. In some embodiments, the electronic commerce application 218 may perform video encoding and/or other functions to support operations by the first computing device 206 and the second computing device 209.
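By way of illustration only, the sketch below shows one minimal way such an application might serve item descriptions as network content over HTTP. The endpoint path, item fields, and in-memory catalog are assumptions made for the example and are not taken from the disclosure.

```python
# Minimal sketch (not the actual electronic commerce application 218): serve
# hypothetical item records as JSON so that client applications can render them.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory stand-in for item data in a data store.
ITEMS = {
    "item-1": {"title": "Blue Scarf", "price": "19.99", "options": ["red", "blue"]},
}

class CatalogHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical route: /items/<id> returns one item description.
        _, _, item_id = self.path.rpartition("/")
        item = ITEMS.get(item_id)
        body = json.dumps(item if item else {"error": "not found"}).encode("utf-8")
        self.send_response(200 if item else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CatalogHandler).serve_forever()
```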
The data stored in the data store 215 includes, for example, item data 221, customer data 224, order data 227, network content 230, and potentially other data. The item data 221 includes various data relating to items 106 (
The customer data 224 includes data relating to customers of the merchant(s) associated with the electronic commerce application 218. Such data may include shopping lists (e.g., shopping carts, gift registries, watch lists, and other lists of items 106), preferences, contact information, payment instruments, shipping addresses, and other information. The order data 227 includes data relating to customer orders of items 106 that are in progress and/or have been completed. The network content 230 includes text, images, video, audio, templates, and/or other content that may be served up by the electronic commerce application 218.
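As a hedged illustration of these data groupings, the records below sketch how item data 221, customer data 224, and order data 227 might be represented; every field name is an assumption rather than a layout described in the disclosure.

```python
# Illustrative sketch of the data groupings described above; field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Item:                      # item data 221
    item_id: str
    title: str
    price: str
    options: dict = field(default_factory=dict)    # e.g., {"color": ["red", "blue"]}

@dataclass
class Customer:                  # customer data 224
    customer_id: str
    shopping_list: list = field(default_factory=list)
    shipping_addresses: list = field(default_factory=list)

@dataclass
class Order:                     # order data 227
    order_id: str
    customer_id: str
    item_ids: list = field(default_factory=list)
    status: str = "in progress"
```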
The first computing device 206 is representative of a plurality of client devices that may be coupled to the network 212. The first computing device 206 may comprise, for example, a processor-based system such as a computer system. Such a computer system may be embodied in the form of a digital television 112 (
The first computing device 206 may be configured to execute various applications such as a shopping assistance application 236 and/or other applications. The shopping assistance application 236 may be executed in a first computing device 206, for example, to access network content 230 served up by the computing environment 203 and/or other servers, thereby rendering a user interface 239 on the display 115. The shopping assistance application 236 may, for example, correspond to a browser, a mobile application, or another client application, and the user interface 239 may correspond to a network page, a mobile application screen, etc.
The shopping assistance application 236 extends a shopping experience from the second computing device 209 to the first computing device 206 and vice versa. In this regard, the shopping assistance application 236 may facilitate viewing of high-resolution images of items 106, a “try it on” experience for items 106, and other functions. Further, the shopping assistance application 236 may be configured to receive various user commands to add items 106 to shopping lists, initiate orders of items 106, view additional information about items 106, view information regarding other items 106, and so on. The first computing device 206 may include a camera device 242, a microphone device 245, and/or other input devices. The first computing device 206 may be configured to execute applications beyond the shopping assistance application 236, such as, for example, video applications, browsers, mobile applications, email applications, social networking applications, and/or other applications.
The second computing device 209 is representative of a plurality of client devices that may be coupled to the network 212. The second computing device 209 may comprise, for example, a processor-based system such as a computer system. Such a computer system may be embodied in the form of a mobile computing device 103 (
The second computing device 209 may be configured to execute various applications such as a client application 248 and/or other applications. The client application 248 may be executed in a second computing device 209, for example, to access network content served up by the computing environment 203 and/or other servers, thereby rendering a user interface 251 on the display 109. The client application 248 may, for example, correspond to a browser, a mobile application, etc., and the user interface 251 may correspond to a network page, a mobile application screen, etc.
Specifically, the client application 248 may be employed to interact with the electronic commerce application 218 to search and browse items 106 from an item catalog and to place orders for items 106. Further, the client application 248 may be employed to select items 106 to be presented by the shopping assistance application 236 of the first computing device 206. The second computing device 209 may include a camera device 254, a microphone device 257, and/or other input devices. The second computing device 209 may be configured to execute applications beyond the client application 248 such as, for example, browsers, mobile applications, email applications, social networking applications, and/or other applications.
Next, a general description of the operation of the various components of the networked environment 200 is provided. To begin, a user at the second computing device 209 interacts with the client application 248 to request network content 230 from the electronic commerce application 218. For example, the client application 248 may request network pages or other forms of network content 230 relating to item search, item browsing, item ordering, and/or other shopping-related functions. The client application 248 may render a user interface 251 on the display 109.
Concurrently, the user may also execute the shopping assistance application 236 of the first computing device 206. The shopping assistance application 236 is configured to render content that supports the shopping experience of the user through the second computing device 209. Such content may comprise videos and other images. For example, the shopping assistance application 236 may render a high resolution image of an item 106 or items 106 that are described by content being rendered by the client application 248. The user may be able to issue various user commands to the shopping assistance application 236 to zoom, pan, scroll, etc. relative to the high resolution image. Additional images and/or item information, e.g., in a recommendations panel 118, may be rendered for other items 106 that are recommended or are otherwise similar to the currently selected item(s) 106. Various techniques relating to using a second screen for displaying content are described in U.S. patent application Ser. No. 13/709,768 filed on Dec. 10, 2012 entitled “PROVIDING CONTENT VIA MULTIPLE DISPLAY DEVICES,” which is incorporated by reference herein in its entirety.
User interaction with the shopping assistance application 236 may be facilitated through several approaches. As a first example, a microphone device 245 of the first computing device 206 may capture audio. The first computing device 206 may process the audio to determine various voice commands issued by the user. The user may say, for example, “buy it now,” “try it on,” “exit,” and/or other voice commands.
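As an illustrative sketch only, recognized phrases could be mapped to application commands with a simple lookup table; the phrase set and command names below are assumptions, and the speech-to-text step is presumed to have already produced a transcript string.

```python
# Sketch: map recognized voice phrases to shopping assistance commands.
# The phrase-to-command table is hypothetical.
from typing import Optional

VOICE_COMMANDS = {
    "buy it now": "INITIATE_ORDER",
    "try it on": "TRY_IT_ON",
    "add to cart": "ADD_TO_SHOPPING_LIST",
    "exit": "EXIT",
}

def interpret_transcript(transcript: str) -> Optional[str]:
    """Return the command associated with a recognized phrase, if any."""
    normalized = transcript.strip().lower()
    return VOICE_COMMANDS.get(normalized)

print(interpret_transcript("Buy it now"))   # -> INITIATE_ORDER
```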
As a second example, the camera device 242 of the first computing device 206 may capture a video stream of the user. In the captured video, the user may perform physical gestures to communicate commands. Non-limiting examples of physical gestures that may be identified include an open palm, a grabbing action, a moving hand, a closed fist, and so on for one or more hands. In one embodiment, the shopping assistance application 236 may process the captured video to identify such gestures. In another embodiment, the captured video stream may be uploaded to the computing environment 203 for server-side processing, with the gesture identification being communicated to the shopping assistance application 236.
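The following is a hedged sketch of the client-side branch, reducing gesture identification to a single hypothetical hand "openness" feature and a position delta between frames; the thresholds and features are placeholders, not a recognition method described in the disclosure.

```python
# Sketch: classify a hand pose per frame from a hypothetical "openness" feature
# (0.0 = closed fist, 1.0 = open palm) and detect a "moving hand" from position deltas.
def classify_hand_pose(openness: float) -> str:
    # Thresholds are illustrative assumptions.
    if openness > 0.8:
        return "OPEN_PALM"
    if openness < 0.2:
        return "CLOSED_FIST"
    return "GRABBING"

def detect_motion(prev_xy, curr_xy, min_pixels: float = 25.0) -> bool:
    """Report whether the tracked hand moved more than min_pixels between frames."""
    dx, dy = curr_xy[0] - prev_xy[0], curr_xy[1] - prev_xy[1]
    return (dx * dx + dy * dy) ** 0.5 > min_pixels

print(classify_hand_pose(0.9))                 # OPEN_PALM
print(detect_motion((100, 100), (160, 100)))   # True
```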
Further non-limiting examples of user input to the shopping assistance application 236 may include keyboard input, touch screen input, remote control input, mouse input, and input from other input devices. Additionally, in some embodiments, the input for the shopping assistance application 236 may be captured, and potentially processed, by the client application 248 of the second computing device 209. For example, a video stream of the user, including physical gestures, may be captured by a camera device 254 of the second computing device 209. Also, audio containing voice commands of the user may be captured by a microphone device 257 of the second computing device 209. Such gestures and/or voice commands captured by way of the second computing device 209 may be employed to control the shopping assistance application 236 of the first computing device 206.
User commands provided to the shopping assistance application 236 may be employed to view additional information about an item 106, initiate an order for an item 106, add an item 106 to a shopping list, view information about another item 106, and/or other functions. Such functions may include sending data encoding a command to the electronic commerce application 218 and/or the client application 248. For example, in response to a voice command of “add to cart,” the shopping assistance application 236 may send a shopping cart addition directive to the electronic commerce application 218, where the directive specifies a particular item 106.
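For example, such a shopping cart addition directive might be encoded as a small JSON payload and posted to the electronic commerce application 218; the endpoint URL and field names below are assumptions made for illustration.

```python
# Sketch: encode a hypothetical "shopping cart addition" directive as JSON and
# send it to the electronic commerce application. URL and fields are assumptions.
import json
import urllib.request

def send_cart_addition(item_id: str, customer_id: str,
                       endpoint: str = "https://commerce.example/api/directives"):
    directive = {
        "type": "ADD_TO_SHOPPING_LIST",
        "item_id": item_id,
        "customer_id": customer_id,
        "source_device": "first_computing_device_206",
    }
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(directive).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:   # network call; example URL only
        return response.status
```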
Alternatively, directives may be sent to the client application 248, possibly by way of the electronic commerce application 218. For example, a user command to view information about another item 106 may result in a directive being sent to the client application 248, whereupon the client application 248 renders a user interface 251 upon the display 109 to present a detail page regarding the other item 106. In some cases, directives or instructions may be sent from the client application 248 to the shopping assistance application 236, also possibly by way of the electronic commerce application 218. For example, a user may select a particular item 106 through the user interface 251 to be shown with a high resolution image in the user interface 239 rendered upon the display 115. To this end, a directive may be sent to the shopping assistance application 236, whereupon the shopping assistance application 236 renders a user interface 239 showing high resolution images of the particular item 106.
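A minimal sketch of this relaying behavior follows, assuming the server keeps a registry of connected devices and a delivery callback for each; the registration mechanism is an assumption, as the disclosure states only that directives may pass by way of the electronic commerce application 218.

```python
# Sketch: server-side relay of directives between the two client applications.
# Device registration and the callback mechanism are illustrative assumptions.
class DirectiveRelay:
    def __init__(self):
        self._devices = {}          # device_id -> delivery callback

    def register(self, device_id, deliver):
        self._devices[device_id] = deliver

    def route(self, directive):
        deliver = self._devices.get(directive["target_device"])
        if deliver:
            deliver(directive)

relay = DirectiveRelay()
relay.register("second_device", lambda d: print("render detail page for", d["item_id"]))
relay.route({"target_device": "second_device", "item_id": "item-1"})
```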
Through this communication between the first computing device 206 and the second computing device 209, the shopping experience between the displays 115 and 109 is maintained with a level of synchronization. As an example, display 115 may show a high resolution image of an item 106 for which a detail page is shown on the display 109. As another example, a user may select a particular search result shown on the display 109, whereupon high resolution images for the item 106 corresponding to the particular search result are rendered on the display 115. As yet another example, a user may select a “buy it now” function through the shopping assistance application 236, whereupon a first screen in an order pipeline is rendered upon the display 109. The user may then provide textual input to consummate the order via the second computing device 209.
Another function provided by the first computing device 206 may correspond to “try it on” functionality for apparel, shoes, handbags, hats, and/or other items 106. A camera device 242 (or 254) may capture a video stream of a user, and an image of an item 106 may be composited or superimposed upon the video stream of the user. In one embodiment, the image of the item 106 may be rendered by the shopping assistance application 236 at a fixed location, and the user may position himself or herself in front of the camera device 242 (or 254) so that he or she appears to “try on” the item. In another embodiment, the user may move, resize, stretch, skew, and/or otherwise transform the item 106 so that the item 106 appears to be “tried on.”
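The first embodiment could be sketched as a per-frame alpha blend of an item image onto the captured frame at a fixed location. The NumPy arrays below stand in for decoded frames; capture, decoding, and display are outside the scope of this illustrative example.

```python
# Sketch: superimpose an item image with an alpha channel onto one captured
# video frame at a fixed location.
import numpy as np

def composite(frame: np.ndarray, item_rgba: np.ndarray, top: int, left: int) -> np.ndarray:
    """frame: HxWx3 uint8; item_rgba: hxwx4 uint8 with alpha in channel 3."""
    h, w = item_rgba.shape[:2]
    region = frame[top:top + h, left:left + w].astype(np.float32)
    rgb = item_rgba[..., :3].astype(np.float32)
    alpha = item_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = alpha * rgb + (1.0 - alpha) * region
    out = frame.copy()
    out[top:top + h, left:left + w] = blended.astype(np.uint8)
    return out

frame = np.zeros((720, 1280, 3), dtype=np.uint8)          # stand-in captured frame
scarf = np.full((100, 80, 4), 200, dtype=np.uint8)        # stand-in item image
print(composite(frame, scarf, top=300, left=600).shape)   # (720, 1280, 3)
```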
The user may use voice commands, gestures, etc. or input via the second computing device 209 to select different options (e.g., colors, sizes, fabrics, etc.) for the item 106. Selection of such options may result in different images being presented upon the display 115 or an existing image being manipulated to appear to show the selected option (e.g., a red scarf being color corrected to be a blue version of the scarf, one fabric being replaced with a different fabric, an image of a dress at one size being scaled to represent the dress at a different size, and so on).
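As one deliberately simple sketch of such a manipulation, a different colorway could be approximated by remapping the channels of an existing product image; real color correction would be more involved, and this example is illustrative only.

```python
# Sketch: approximate a different colorway by swapping the red and blue channels
# of an HxWx3 uint8 product image. Illustrative only.
import numpy as np

def recolor_red_to_blue(image_rgb: np.ndarray) -> np.ndarray:
    """Return a copy of the image with the red and blue channels exchanged."""
    return image_rgb[..., ::-1].copy()

red_scarf = np.zeros((4, 4, 3), dtype=np.uint8)
red_scarf[..., 0] = 220                         # predominantly red pixels
blue_scarf = recolor_red_to_blue(red_scarf)
print(blue_scarf[0, 0])                         # [  0   0 220] -> now predominantly blue
```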
Different images of the item 106 may be employed so that the user can turn to different angles and “model” the item 106 from different views. For example, a front-view image of a dress may be replaced with a side-view image of the same dress when the user turns to her side. Such an image replacement may be performed automatically through analysis of the user movement shown in the captured video, or manually in response to user commands. In one embodiment, a video of the item 106 may be employed, and the user may move to mimic the predefined movement of the item 106 as shown in the video. The video may be transformed (scaled, skewed, moved, etc.) as desired by the user so that the user appears to model the item 106.
In various embodiments, a three-dimensional camera device 242 (or 254) may be employed, and a three-dimensional model of the user may be generated from video captured from the three-dimensional camera device 242 (or 254). A three-dimensional model of an item 106 stored in the image data 233 may then be rendered upon the three-dimensional model of the user. The resulting image or video may then be shown upon the display 115 by the shopping assistance application 236. The image may be updated and re-rendered upon movement of the user.
In one embodiment, the shopping assistance application 236 may be configured to recommend a particular item 106 or a size of a particular item 106 to a user based at least in part on the video of the user captured from the camera device 242 (or 254). For example, the shopping assistance application 236 may perform an analysis to ascertain the human body dimensions of the user. The shopping assistance application 236, possibly through interactions with the electronic commerce application 218, may determine items 106 that would fit the measured human body dimensions of the user.
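A hedged sketch of such a recommendation follows, assuming body dimensions have already been estimated from the captured video and are compared against a hypothetical size chart for the item 106.

```python
# Sketch: recommend a size by comparing measured body dimensions against a
# hypothetical size chart. Measurement of the dimensions is assumed elsewhere.
SIZE_CHART_CM = {
    "S": {"chest": 90, "waist": 75},
    "M": {"chest": 98, "waist": 83},
    "L": {"chest": 106, "waist": 91},
}

def recommend_size(measured: dict) -> str:
    """Return the size whose chart entries are closest to the measurements."""
    def distance(chart_entry):
        return sum((measured[k] - v) ** 2 for k, v in chart_entry.items())
    return min(SIZE_CHART_CM, key=lambda size: distance(SIZE_CHART_CM[size]))

print(recommend_size({"chest": 100, "waist": 84}))   # -> M
```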
Turning now to
Continuing to
Moving to
Referring next to
Beginning with box 403, the shopping assistance application 236 obtains a selection of an item 106 (
In box 412, the shopping assistance application 236 captures audio and/or video of the user by way of the camera device 242 (
In box 421, the shopping assistance application 236 determines whether a physical gesture has been captured from the user on video. If a physical gesture has been made, the shopping assistance application 236 moves to box 424 and processes the command associated with the physical gesture and updates the user interface 239. The shopping assistance application 236 then continues to box 427. If a physical gesture has not been made, the shopping assistance application 236 moves from box 421 to box 427.
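The loop below is a rough sketch that strings together the steps of boxes 403, 412, 421, and 424, with capture and recognition supplied as placeholder callables; the function names and the termination test are assumptions rather than the flowchart's exact logic.

```python
# Hedged sketch loosely following boxes 403, 412, 421, and 424 described above.
# Capture and recognition are placeholder callables provided by the caller.
def run_assistance_loop(get_selected_item, render_item, capture_frame_and_audio,
                        recognize_voice, recognize_gesture, dispatch, should_continue):
    render_item(get_selected_item())                 # box 403: obtain the item selection
    while should_continue():                         # loop condition is an assumption
        frame, audio = capture_frame_and_audio()     # box 412: capture audio/video
        voice_command = recognize_voice(audio)
        if voice_command:
            dispatch(voice_command)                  # process the voice command, update UI
        gesture = recognize_gesture(frame)           # box 421: check for a physical gesture
        if gesture:
            dispatch(gesture)                        # box 424: process the gesture command
```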
In box 427, the shopping assistance application 236 determines whether the display 109 (
Turning now to
Beginning with box 433, the shopping assistance application 236 obtains an image of an item 106 (
In box 445, the shopping assistance application 236 determines whether to transform the image of the item 106. For example, the shopping assistance application 236 may transform the image of the item 106 in response to receiving a transformation command from the user. Alternatively, the shopping assistance application 236 may transform the image of the item 106 automatically in response to detecting movement of the user within the video stream depicting the user. Thus, if the user physically moves to the left within the scene, the image of the item 106 may be transformed within the composited video stream so that the image of the item 106 is still superimposed upon the image of the user. If a transformation is to be applied, the shopping assistance application 236 moves to box 448 and applies the transformation to the image of the item 106. The shopping assistance application 236 then returns to box 439. If no transformation is to be performed, the portion of the shopping assistance application 236 ends.
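The automatic transformation described above might, in its simplest form, translate the item image's placement by the same offset as the user's detected movement between frames; how the user's position is detected is assumed and outside this sketch.

```python
# Sketch: when the user's detected position shifts between frames, translate the
# item image's placement by the same offset so it stays superimposed on the user.
def track_item_placement(placement, prev_user_xy, curr_user_xy):
    """placement: (top, left) of the item image; returns the translated placement."""
    dx = curr_user_xy[0] - prev_user_xy[0]
    dy = curr_user_xy[1] - prev_user_xy[1]
    return (placement[0] + dy, placement[1] + dx)

# If the user moves 40 pixels to the left, the item image follows.
print(track_item_placement((300, 600), prev_user_xy=(640, 360), curr_user_xy=(600, 360)))
# -> (300, 560)
```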
Continuing to
Beginning with box 451, the client application 248 obtains a user selection of an item 106 (
In box 463, the client application 248 determines whether a directive is received to update the user interface 251. For example, such a directive may originate in a user action relative to a user interface 239 (
With reference to
Stored in the memory 506 are both data and several components that are executable by the processor 503. In particular, stored in the memory 506 and executable by the processor 503 are the electronic commerce application 218 and potentially other applications. Also stored in the memory 506 may be a data store 215 and other data. In addition, an operating system may be stored in the memory 506 and executable by the processor 503.
It is understood that there may be other applications that are stored in the memory 506 and are executable by the processor 503 as can be appreciated. Where any component discussed herein is implemented in the form of software, any one of a number of programming languages may be employed such as, for example, C, C++, C#, Objective C, Java®, JavaScript®, Perl, PHP, Visual Basic®, Python®, Ruby, Flash®, or other programming languages.
A number of software components are stored in the memory 506 and are executable by the processor 503. In this respect, the term “executable” means a program file that is in a form that can ultimately be run by the processor 503. Examples of executable programs include, for example, a compiled program that can be translated into machine code in a format that can be loaded into a random access portion of the memory 506 and run by the processor 503, source code that may be expressed in proper format such as object code that is capable of being loaded into a random access portion of the memory 506 and executed by the processor 503, or source code that may be interpreted by another executable program to generate instructions in a random access portion of the memory 506 to be executed by the processor 503, etc. An executable program may be stored in any portion or component of the memory 506 including, for example, random access memory (RAM), read-only memory (ROM), hard drive, solid-state drive, USB flash drive, memory card, optical disc such as compact disc (CD) or digital versatile disc (DVD), floppy disk, magnetic tape, or other memory components.
The memory 506 is defined herein as including both volatile and nonvolatile memory and data storage components. Volatile components are those that do not retain data values upon loss of power. Nonvolatile components are those that retain data upon a loss of power. Thus, the memory 506 may comprise, for example, random access memory (RAM), read-only memory (ROM), hard disk drives, solid-state drives, USB flash drives, memory cards accessed via a memory card reader, floppy disks accessed via an associated floppy disk drive, optical discs accessed via an optical disc drive, magnetic tapes accessed via an appropriate tape drive, and/or other memory components, or a combination of any two or more of these memory components. In addition, the RAM may comprise, for example, static random access memory (SRAM), dynamic random access memory (DRAM), or magnetic random access memory (MRAM) and other such devices. The ROM may comprise, for example, a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other like memory device.
Also, the processor 503 may represent multiple processors 503 and/or multiple processor cores and the memory 506 may represent multiple memories 506 that operate in parallel processing circuits, respectively. In such a case, the local interface 509 may be an appropriate network that facilitates communication between any two of the multiple processors 503, between any processor 503 and any of the memories 506, or between any two of the memories 506, etc. The local interface 509 may comprise additional systems designed to coordinate this communication, including, for example, performing load balancing. The processor 503 may be of electrical or of some other available construction.
Although the electronic commerce application 218, the client application 248 (
The flowcharts of
Although the flowcharts of
Also, any logic or application described herein, including the electronic commerce application 218, the client application 248, and the shopping assistance application 236, that comprises software or code can be embodied in any non-transitory computer-readable medium for use by or in connection with an instruction execution system such as, for example, a processor 503 in a computer system or other system. In this sense, the logic may comprise, for example, statements including instructions and declarations that can be fetched from the computer-readable medium and executed by the instruction execution system. In the context of the present disclosure, a “computer-readable medium” can be any medium that can contain, store, or maintain the logic or application described herein for use by or in connection with the instruction execution system.
The computer-readable medium can comprise any one of many physical media such as, for example, magnetic, optical, or semiconductor media. More specific examples of a suitable computer-readable medium would include, but are not limited to, magnetic tapes, magnetic floppy diskettes, magnetic hard drives, memory cards, solid-state drives, USB flash drives, or optical discs. Also, the computer-readable medium may be a random access memory (RAM) including, for example, static random access memory (SRAM) and dynamic random access memory (DRAM), or magnetic random access memory (MRAM). In addition, the computer-readable medium may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or other type of memory device.
It should be emphasized that the above-described embodiments of the present disclosure are merely possible examples of implementations set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.
Number | Name | Date | Kind |
---|---|---|---|
5260556 | Lake et al. | Nov 1993 | A |
5596705 | Reimer et al. | Jan 1997 | A |
5691527 | Hara et al. | Nov 1997 | A |
5692212 | Roach | Nov 1997 | A |
5781730 | Reimer et al. | Jul 1998 | A |
6065042 | Reimer et al. | May 2000 | A |
6556722 | Russell et al. | Apr 2003 | B1 |
7103541 | Attias et al. | Sep 2006 | B2 |
7293275 | Krieger et al. | Nov 2007 | B1 |
7444593 | Reid | Oct 2008 | B1 |
7558865 | Lin et al. | Jul 2009 | B2 |
7774075 | Lin | Aug 2010 | B2 |
7814521 | Ou et al. | Oct 2010 | B2 |
8161082 | Israel et al. | Apr 2012 | B2 |
8209396 | Raman et al. | Jun 2012 | B1 |
8250605 | Opaluch | Aug 2012 | B2 |
8365235 | Hunt et al. | Jan 2013 | B2 |
8510775 | Lafreniere et al. | Aug 2013 | B2 |
8510779 | Slothouber et al. | Aug 2013 | B2 |
8552983 | Chiu | Oct 2013 | B2 |
8644702 | Kalajan | Feb 2014 | B1 |
8689255 | Gregov | Apr 2014 | B1 |
8763041 | Timmermann et al. | Jun 2014 | B2 |
8849943 | Huang et al. | Sep 2014 | B2 |
8955021 | Treder et al. | Feb 2015 | B1 |
9078030 | Kuo | Jul 2015 | B2 |
9113128 | Aliverti et al. | Aug 2015 | B1 |
9241187 | Ricci | Jan 2016 | B2 |
20020042920 | Thomas et al. | Apr 2002 | A1 |
20020059610 | Ellis | May 2002 | A1 |
20030050863 | Radwin | Mar 2003 | A1 |
20040028258 | Naimark et al. | Feb 2004 | A1 |
20040056097 | Walmsley et al. | Mar 2004 | A1 |
20040133919 | Incentis | Jul 2004 | A1 |
20040197088 | Ferman et al. | Oct 2004 | A1 |
20050160465 | Walker | Jul 2005 | A1 |
20050177538 | Shimizu et al. | Aug 2005 | A1 |
20050264527 | Lin | Dec 2005 | A1 |
20060007452 | Gaspard et al. | Jan 2006 | A1 |
20060184538 | Randall et al. | Aug 2006 | A1 |
20060271836 | Morford et al. | Nov 2006 | A1 |
20060278722 | Tominaga | Dec 2006 | A1 |
20070061724 | Slothouber et al. | Mar 2007 | A1 |
20070143737 | Huang et al. | Jun 2007 | A1 |
20080002021 | Guo et al. | Jan 2008 | A1 |
20080005222 | Lambert et al. | Jan 2008 | A1 |
20080066135 | Brodersen et al. | Mar 2008 | A1 |
20080148317 | Opaluch | Jun 2008 | A1 |
20080172293 | Raskin et al. | Jul 2008 | A1 |
20080196072 | Chun | Aug 2008 | A1 |
20080209465 | Thomas et al. | Aug 2008 | A1 |
20080235749 | Jain | Sep 2008 | A1 |
20080271068 | Ou et al. | Oct 2008 | A1 |
20090018898 | Genen | Jan 2009 | A1 |
20090019009 | Byers | Jan 2009 | A1 |
20090081950 | Matsubara | Mar 2009 | A1 |
20090089186 | Paolini | Apr 2009 | A1 |
20090090786 | Hovis | Apr 2009 | A1 |
20090094113 | Berry et al. | Apr 2009 | A1 |
20090138906 | Eide et al. | May 2009 | A1 |
20090199098 | Kweon et al. | Aug 2009 | A1 |
20090228919 | Zott et al. | Sep 2009 | A1 |
20100057782 | McGowan et al. | Mar 2010 | A1 |
20100092079 | Aller | Apr 2010 | A1 |
20100103106 | Chui | Apr 2010 | A1 |
20100153831 | Beaton | Jun 2010 | A1 |
20100154007 | Touboul et al. | Jun 2010 | A1 |
20100199219 | Poniatowski et al. | Aug 2010 | A1 |
20100222102 | Rodriguez | Sep 2010 | A1 |
20100251292 | Srinivasan | Sep 2010 | A1 |
20100287592 | Patten et al. | Nov 2010 | A1 |
20100312596 | Saffari | Dec 2010 | A1 |
20110023073 | McCarthy et al. | Jan 2011 | A1 |
20110047299 | Yu | Feb 2011 | A1 |
20110049250 | Hovis et al. | Mar 2011 | A1 |
20110067061 | Karaoguz et al. | Mar 2011 | A1 |
20110131520 | Al-Shaykh et al. | Jun 2011 | A1 |
20110154405 | Isaias | Jun 2011 | A1 |
20110162007 | Karaoguz et al. | Jun 2011 | A1 |
20110167456 | Kokenos | Jul 2011 | A1 |
20110173659 | Lafreniere et al. | Jul 2011 | A1 |
20110181780 | Barton | Jul 2011 | A1 |
20110246495 | Mallinson | Oct 2011 | A1 |
20110282906 | Wong | Nov 2011 | A1 |
20110289534 | Jordan et al. | Nov 2011 | A1 |
20110296465 | Krishnan et al. | Dec 2011 | A1 |
20120014663 | Knight | Jan 2012 | A1 |
20120033140 | Xu | Feb 2012 | A1 |
20120072953 | James et al. | Mar 2012 | A1 |
20120096499 | Dasher et al. | Apr 2012 | A1 |
20120151530 | Krieger et al. | Jun 2012 | A1 |
20120210205 | Sherwood et al. | Aug 2012 | A1 |
20120220223 | Rose et al. | Aug 2012 | A1 |
20120238363 | Watanabe | Sep 2012 | A1 |
20120240161 | Kuo | Sep 2012 | A1 |
20120256000 | Cok | Oct 2012 | A1 |
20120256007 | Cok | Oct 2012 | A1 |
20120257766 | Seymour et al. | Oct 2012 | A1 |
20120308202 | Murata et al. | Dec 2012 | A1 |
20130014155 | Clarke et al. | Jan 2013 | A1 |
20130021535 | Kim et al. | Jan 2013 | A1 |
20130024783 | Brakensiek et al. | Jan 2013 | A1 |
20130057543 | Mann et al. | Mar 2013 | A1 |
20130060660 | Maskatia et al. | Mar 2013 | A1 |
20130074125 | Hao et al. | Mar 2013 | A1 |
20130094013 | Hovis et al. | Apr 2013 | A1 |
20130110672 | Yang | May 2013 | A1 |
20130113830 | Suzuki | May 2013 | A1 |
20130113993 | Dagit, III | May 2013 | A1 |
20130115974 | Lee et al. | May 2013 | A1 |
20130144727 | Morot-Gaudry et al. | Jun 2013 | A1 |
20130219434 | Farrell | Aug 2013 | A1 |
20130291018 | Billings et al. | Oct 2013 | A1 |
20130339991 | Ricci | Dec 2013 | A1 |
20140035726 | Schoner | Feb 2014 | A1 |
20140035913 | Higgins | Feb 2014 | A1 |
20140043332 | Rollett | Feb 2014 | A1 |
20140068670 | Timmermann et al. | Mar 2014 | A1 |
20140122564 | Arora et al. | May 2014 | A1 |
20140130102 | Iijima et al. | May 2014 | A1 |
20140134947 | Stouder-Studenmund | May 2014 | A1 |
20140208355 | Gregov et al. | Jul 2014 | A1 |
20140281985 | Garrison et al. | Sep 2014 | A1 |
20150095774 | Bates et al. | Apr 2015 | A1 |
20150156562 | Treder et al. | Jun 2015 | A1 |
20150195474 | Lu | Jul 2015 | A1 |
20150235672 | Cudak et al. | Aug 2015 | A1 |
20150339508 | Hosokane | Nov 2015 | A1 |
20150357001 | Aliverti et al. | Dec 2015 | A1 |
Number | Date | Country |
---|---|---|
1993282 | Nov 2008 | EP |
2071578 | Jun 2009 | EP |
2003084229 | Oct 2003 | WO |
2014036413 | Mar 2014 | WO |
Entry |
---|
Anonymous, Swivel by FaceCake, the World's First 3D Virtual Dressing Room Showcased at Computex Taipei 2012, Jul. 12, 2012, Business Wire, 0EIN, p. 1. (Year: 2012). |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012 entitled “Providing Content Via Multiple Display Devices.” |
U.S. Appl. No. 13/227,097 entitled “Synchronizing Video Content With Extrinsic Data” and filed Sep. 7, 2011. |
U.S. Appl. No. 13/601,267 entitled “Enhancing Video Content With Extrinsic Data” and filed Aug. 31, 2012. |
U.S. Appl. No. 13/601,235 entitled “Timeline Interface for Video Content” and filed Aug. 31, 2012. |
U.S. Appl. No. 13/601,210 entitled “Providing Extrinsic Data for Video Content” and filed Aug. 31, 2012. |
U.S. Appl. No. 13/927,970 entitled “Providing Soundtrack Information During Playback of Video Content” and filed Jun. 26, 2013. |
U.S. Appl. No. 14/034,055 entitled “Playback of Content Using Multiple Devices” and filed Sep. 23, 2013. |
“Entertainment is more amazing with Xbox SmartGlass,” Xbox SmartGlass 1 Companion Application—Xbox.com, retrieved from “http://www.xbox.com/en-US/smartglass,” retrieved Dec. 4, 2012. |
International Searching Authority and Written Opinion dated Mar. 21, 2014 for PCT/US2013/057543 filed Aug. 30, 2013. |
“Sony Pictures to smarten up Blu-ray with MovieiQ, the ‘killer app for BD-Live,’” Engadget, retrieved from http://www.engadget.com/2009/06/18/sony-pictures-to-smarten-up-blu-ray-with-movieiq-the-killer-ap/, Jun. 18, 2009. |
“Hulu ‘Face Match’ feature attaches an actor's entire history to their mug,” Engadget, retrieved from http://www.engadget.com/2011/12/08/hulu-face-match-feature-attaches-an-actors-entire-history-to/, Dec. 8, 2011. |
“TVPlus for the iPad,” iTunes Store, retrieved from “http://itunes.apple.com/us/app/tvplus/id444774882?mt=8,” updated Apr. 13, 2012. |
“Wii U GamePad,” Wii U Official Site—Features, retrieved from “http://www.nintendo.com/wiiu/features/,” retrieved Dec. 4, 2012. |
U.S. Appl. No. 14/225,864, filed Mar. 26, 2014, Response to Final Office Action dated Jul. 13, 2015. |
U.S. Appl. No. 14/225,864, filed Mar. 26, 2014, Final Office Action dated Jul. 13, 2015. |
U.S. Appl. No. 14/225,864, filed Mar. 26, 2014, Response to Non-Final Office Action dated Mar. 3, 2015. |
U.S. Appl. No. 14/225,864, filed Mar. 26, 2014, Non-Final Office Action dated Mar. 3, 2015. |
U.S. Appl. No. 15/154,233, filed May 13, 2016, Non-Final Office Action dated Jun. 2, 2017. |
U.S. Appl. No. 15/154,233, filed May 13, 2016, Response to Restriction/Election dated Feb. 3, 2017. |
U.S. Appl. No. 15/154,233, filed May 13, 2016, Restriction/Election dated Feb. 3, 2017. |
U.S. Appl. No. 13/227,097, filed Sep. 7, 2011, Notice of Allowance dated Oct. 22, 2013. |
U.S. Appl. No. 13/227,097, filed Sep. 7, 2011, Response to Non-Final Office Action dated Apr. 9, 2013. |
U.S. Appl. No. 13/227,097, filed Sep. 7, 2011, Non-Final Office Action dated Apr. 9, 2013. |
U.S. Appl. No. 14/826,508, filed Aug. 14, 2015, Response to Non-Final Office Action dated Oct. 26, 2016. |
U.S. Appl. No. 14/826,508, filed Aug. 14, 2015, Non-Final Office Action dated Oct. 26, 2016. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Response to Non-final Office Action dated Mar. 30, 2017. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Notice of Allowance dated Aug. 15, 2017. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Non-Final Office Action dated Apr. 3, 2017. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Response to Non-Final Office Action dated Apr. 3, 2017. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Non-Final Office Action dated Nov. 2, 2017. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Response to Non-Final Office Action dated Nov. 2, 2017. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Response to Non-Final Office Action dated Jun. 6, 2017. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Response to Final Office Action dated Feb. 10, 2017. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Final Office Action dated Feb. 10, 2017. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Final Office Action dated Dec. 7, 2017. |
U.S. Appl. No. 14/225,864, filed Mar. 26, 2014, Notice of Allowance dated Feb. 1, 2016. |
U.S. Appl. No. 15/154,233, filed May 13, 2016, Response to Non-Final Office Action dated Jun. 2, 2017. |
U.S. Appl. No. 15/154,233, filed May 13, 2016, Notice of Allowance dated Nov. 15, 2017. |
U.S. Appl. No. 14/826,508, filed Aug. 14, 2015, Notice of Allowance dated Apr. 27, 2017. |
U.S. Appl. No. 15/164,070, filed May 25, 2016, Non-Final Office Action dated Feb. 7, 2018. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Response to Final Office Action dated Dec. 1, 2016. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Non-Final Office Action dated May 5, 2017. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Response to Non-Final Office Action dated May 5, 2017. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Final Office Action dated Oct. 24, 2017. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Response to Final Office Action dated Oct. 24, 2017. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Notice of Allowance dated Mar. 15, 2018. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Non-Final Office Action dated Jun. 6, 2017. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Non-final Office Action dated Mar. 30, 2017. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Patent Board of Appeals Decision dated Apr. 18, 2018. |
U.S. Appl. No. 15/792,217, filed Oct. 24, 2017, Non-Final Office Action dated Apr. 18, 2018. |
ISO/IEC 18004:2006. Information technology—Automatic identification and data capture techniques—QR Code 2005 bar code symbology specification. International Organization for Standardization, Geneva, Switzerland. |
Canadian Patent Application CA2,882,899 filed on Aug. 30, 2013, Determination of Allowance dated Nov. 9, 2017. |
Canadian Patent Application CA2,882,899, Office Action dated Mar. 30, 2017. |
Canadian Patent Application CA2,882,899, Office Action dated Apr. 6, 2016. |
European Patent Application EP13832505.5, Extended European Search Report dated Mar. 15, 2016. |
European Patent Application EP13832505.5 filed on Aug. 30, 2013, Office Action dated Jul. 10, 2017. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Response to Final Office Action dated Nov. 25, 2016. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Final Office Action dated Nov. 25, 2016. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Response to Non-Final Office Action dated Apr. 11, 2016. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Non-Final Office Action dated Apr. 11, 2016. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Response to Final Office Action dated Jul. 27, 2015. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Final Office Action dated Jul. 27, 2015. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Response to Non-Final Office Action dated Feb. 12, 2015. |
U.S. Appl. No. 14/218,408, filed Mar. 18, 2014, Non-Final Office Action dated Feb. 12, 2015. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Response to Restriction/Election dated Oct. 5, 2016. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Restriction/Election dated Oct. 5, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Examiner's Answer dated May 24, 2017. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Response to Final Office Action dated Jul. 29, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Final Office Action dated Jul. 29, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Response to Non-Final Office Action dated Apr. 7, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Non-Final Office Action dated Apr. 7, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Response to Final Office Action dated Jan. 6, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Final Office Action dated Jan. 6, 2016. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Response to Non-Final Office Action dated Aug. 3, 2015. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Non-Final Office Action dated Aug. 3, 2015. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Response to Non-Final Office Action dated Jul. 29, 2016. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Non-Final Office Action dated Jul. 29, 2016. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Final Office Action dated Dec. 1, 2016. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Response to Non-Final Office Action dated May 26, 2016. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Non-Final Office Action dated May 26, 2016. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Response to Election/Restriction dated Feb. 10, 2016. |
U.S. Appl. No. 14/615,950, filed Feb. 6, 2015, Restriction/Election dated Feb. 10, 2016. |
U.S. Appl. No. 13/601,210, filed Aug. 31, 2012, Notice of Allowance dated Sep. 23, 2014. |
U.S. Appl. No. 13/601,210, filed Aug. 31, 2012, Response to Final Office Action dated Jan. 2, 2014. |
U.S. Appl. No. 13/601,210, filed Aug. 31, 2012, Final Office Action dated Jan. 2, 2014. |
U.S. Appl. No. 13/601,210, filed Aug. 31, 2012, Response to Non-Final Office Action dated Aug. 1, 2013. |
U.S. Appl. No. 13/601,210, filed Aug. 31, 2012, Non-Final Office Action dated Aug. 1, 2013. |
U.S. Appl. No. 13/601,235, filed Aug. 31, 2012, Notice of Allowance dated Mar. 27, 2015. |
U.S. Appl. No. 13/601,235, filed Aug. 31, 2012, Response to Non-Final Office Action dated Sep. 11, 2014. |
U.S. Appl. No. 13/601,235, filed Aug. 31, 2012, Non-Final Office Action dated Sep. 11, 2014. |
U.S. Appl. No. 13/601,267, filed Aug. 31, 2012, Notice of Allowance dated Jan. 21, 2014. |
U.S. Appl. No. 13/601,267, filed Aug. 31, 2012, Response to Non-Final Office Action dated Aug. 14, 2013. |
U.S. Appl. No. 13/601,267, filed Aug. 31, 2012, Non-Final Office Action dated Aug. 14, 2013. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Notice of Allowance dated Mar. 17, 2016. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Response to Final Office Action dated Oct. 23, 2015. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Final Office Action dated Oct. 23, 2015. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Response to Non-Final Office Action dated Apr. 21, 2015. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Non-Final Office Action dated Apr. 21, 2015. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Response to Non-Final Office Action dated Oct. 3, 2014. |
U.S. Appl. No. 13/709,768, filed Dec. 10, 2012, Non-Final Office Action dated Oct. 3, 2014. |
U.S. Appl. No. 15/792,217, filed Oct. 24, 2017, Response to Non-Final Office Action dated Apr. 18, 2018 filed Jul. 16, 2018. |
U.S. Appl. No. 14/034,055, filed Sep. 23, 2013, Notice of Allowance dated Sep. 14, 2018. |
U.S. Appl. No. 14/493,970, filed Sep. 23, 2014, Non-Final Office Action dated Sep. 21, 2018. |
U.S. Appl. No. 15/164,070, filed May 25, 2016, Final Office Action dated Aug. 16, 2018. |
U.S. Appl. No. 15/792,217, filed Oct. 24, 2017, Final Office Action dated Sep. 28, 2018. |
U.S. Appl. No. 13/927,970, filed Jun. 26, 2013, Non-Final Office Action dated Oct. 5, 2018. |