Overlay images and texts in user interface

Abstract
In one embodiment, a method includes overlaying a first element on top of a second element in a user interface, and adjusting the visual appearance of the first element based on a portion of the second element underneath the first element.
Description
TECHNICAL FIELD

This disclosure generally relates to overlaying text on top of images for presentation to users.


BACKGROUND

A user interface (UI), in the industrial design field of human-machine interaction, is the space where interactions between humans and machines occur. The goal of interactions between a human, often referred to as a “user,” and a machine at the user interface is the user's control of the machine and its operations (e.g., through user input) and machine feedback (e.g., through program output). A graphical user interface (GUI) is a type of user interface that allows users to interact with software applications executing on electronic or computing devices through multimedia objects (e.g., images, videos, audio, etc.) rather than purely text commands.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates an example hierarchical structure of a user interface.



FIG. 2 illustrates an example method for overlaying text on top of an image.



FIGS. 3A, 3B, and 3C illustrate an example for automatically aligning text overlaid on top of an image.



FIGS. 4A and 4B illustrate an example for automatically adjusting the color or luminosity of text overlaid on top of an image.



FIG. 5 illustrates an example for automatically adding shadow around text overlaid on top of an image.



FIG. 6 illustrates an example computer system.





DESCRIPTION OF EXAMPLE EMBODIMENTS

In particular embodiments, a user interface (UI) may include any number of UI components or elements organized into a hierarchical structure. The positions of the individual UI components or elements within the hierarchy may indicate the relationships among the components or elements. The user interface may be presented to a user of an electronic device according to the hierarchical structure.


In particular embodiments, some of the interface components or elements may be images, videos, audio, or text. Sometimes, text may be overlaid on top of an image. In such cases, the visual characteristics of the text, the image, or both may be adjusted automatically based on, for example and without limitation, the position of the text with respect to the image, the color or luminosity or size of the text, the color or luminosity of the portion of the image underneath or near the text, or the level of detail in the portion of the image underneath or near the text.


In particular embodiments, a UI editor may be provided to enable a person (e.g., a UI designer or developer) to design and create a user interface (e.g., defining the hierarchical structure of the user interface, selecting and laying out the individual UI components or elements, stipulating the operations associated with or behaviors of the individual UI components or elements, etc.). As part of the UI design process using such a UI editor, a UI designer or developer may choose to overlay some text over an image. In particular embodiments, the UI editor may automatically adjust the visual appearance or characteristics of the text or the image or both so that when the image, with its overlaid text, is presented to a user (e.g., as part of the user interface), it is easy for the user to read the text as well as to see the image.


In particular embodiments, a user interface may include any number of UI elements, some of which may be, for example and without limitation, text, image, video, or audio. These UI elements may be arranged according to a hierarchical structure. FIG. 1 illustrates an example hierarchy 100 representing a user interface having a number of UI elements. Each node 110 in FIG. 1 may represent or correspond to a specific UI element. In particular embodiments, a node (e.g., corresponding to a UI element) in a hierarchy may or may not have a parent. If a node does not have a parent, it may be referred to as a “root” node (e.g., node 110A). In particular embodiments, a node in a hierarchy may or may not have any children. If a node does not have any children, it may be referred to as a “leaf” node (e.g., node 110B). If a node does have children (e.g., node 110C), it may have any number of children. In addition, nodes sharing the same parent may be referred to as each other's “siblings”. For example, in FIG. 1, node 110C is the parent of both nodes 110D and 110B. Nodes 110D and 110B are the children of node 110C and are siblings to each other. Thus, a hierarchy of nodes (e.g., node hierarchy 100) not only includes the individual nodes themselves but also indicates the relationships among the specific nodes. Moreover, the position of a specific node within the hierarchy may be used to indicate its relationships with other nodes in the hierarchy. Similarly, a hierarchy of UI elements not only includes the individual UI elements themselves but also indicates the relationships among the specific UI elements (e.g., within the user interface). Moreover, the position of a specific UI element within the hierarchy may be used to indicate its relationships with other UI elements in the user interface.
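
As a purely illustrative sketch of such a hierarchy, the following Python class models a node with an optional parent, any number of children, and sibling and depth queries; the class name, method names, and the attachment of node 110C to the root are hypothetical and not part of this disclosure.

    class UINode:
        # A node in a UI hierarchy; each node corresponds to one UI element.

        def __init__(self, name, parent=None):
            self.name = name
            self.parent = parent
            self.children = []
            if parent is not None:
                parent.children.append(self)

        def is_root(self):
            # A node with no parent is a "root" node.
            return self.parent is None

        def is_leaf(self):
            # A node with no children is a "leaf" node.
            return not self.children

        def siblings(self):
            # Nodes sharing the same parent are each other's siblings.
            if self.parent is None:
                return []
            return [c for c in self.parent.children if c is not self]

        def depth(self):
            # Distance from the root node; used to assign display layers below.
            return 0 if self.parent is None else self.parent.depth() + 1

    # A hypothetical hierarchy in the spirit of FIG. 1: node 110C is the
    # parent of nodes 110D and 110B, which are siblings of each other.
    root = UINode("110A")
    node_c = UINode("110C", parent=root)
    node_d = UINode("110D", parent=node_c)
    node_b = UINode("110B", parent=node_c)
    assert node_b.is_leaf() and node_d in node_b.siblings()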


In particular embodiments, when a hierarchical user interface is displayed on an electronic device, the UI elements are displayed in layers. With some implementations, given two UI elements, if the first UI element is the parent of the second UI element, then the first UI element is displayed in a layer that is beneath the layer in which the second UI element is displayed. If the first UI element and the second UI element are siblings, then they are displayed in the same layer. In other words, the UI elements are displayed in layers corresponding to the levels of the hierarchy.
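
A minimal sketch of this layering rule, assuming the hypothetical UINode class above, assigns each element a layer equal to its depth in the hierarchy and paints elements in order of increasing depth, so a parent is always drawn beneath its children and siblings share a layer.

    def paint_order(root):
        # Collect all nodes in the hierarchy rooted at `root`.
        nodes, stack = [], [root]
        while stack:
            node = stack.pop()
            nodes.append(node)
            stack.extend(node.children)
        # Draw shallower (parent) layers first; siblings share a depth/layer.
        return sorted(nodes, key=lambda n: n.depth())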


In particular embodiments, there may be various types of UI elements, such as, for example and without limitation, windows, panels, web pages, buttons, menus, checkboxes, text input fields, icons, images, videos, audio, and links (e.g., uniform resource locators (URLs)), included in a user interface. In particular embodiments, in a user interface, text (e.g., a text UI element) may be overlaid on top of an image (e.g., an image UI element), with the text serving, for example, as the image's caption, title, or description. FIG. 2 illustrates an example method for overlaying text on top of an image.


The method may start at step 210, where a text UI element is overlaid on top of an image UI element. For example, the text may be a caption or title of the image, or may provide a brief description or explanation of the image. At step 220, the visual appearance of the text, or the image, or both may be automatically adjusted, at which point the method may end.
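
A minimal sketch of this two-step flow, using the Pillow imaging library and hypothetical function and parameter names, might draw the text onto a copy of the image (step 210) and leave the appearance adjustments (step 220) to routines like those sketched in the following sections.

    from PIL import Image, ImageDraw

    def overlay_text_on_image(image, text, xy, fill=(255, 255, 255)):
        out = image.copy()
        draw = ImageDraw.Draw(out)
        draw.text(xy, text, fill=fill)  # step 210: overlay the text element
        return out                      # step 220: adjust appearance (see below)

    # Example usage (assumes a local file "photo.jpg"):
    # captioned = overlay_text_on_image(Image.open("photo.jpg"), "Caption", (20, 20))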


In particular embodiments, an image and its overlaying text are two individual UI elements in the hierarchy of a user interface. For example, the image and its overlaying text may have a parent-child relationship in the hierarchy, with the image being the parent and the text being the child. When they are displayed on an electronic device, the image is displayed in one layer that is beneath another layer in which the text is displayed.


In particular embodiments, when text is overlaid on top of an image, depending on the relative positions of the text with respect to the image, the alignment of the text may be automatically adjusted. For example, as illustrated in FIG. 3A, when text 320 is positioned near the left of image 310, text 320 may be left aligned. As illustrated in FIG. 3B, when text 320 is positioned near the center of image 310, text 320 may be center aligned. As illustrated in FIG. 3C, when text 320 is positioned near the right of image 310, text 320 may be right aligned.
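
One way to realize this behavior, sketched here with an assumed thirds-based threshold (the disclosure only requires left, center, or right alignment for text near the corresponding portion of the image), is to compare the text's horizontal position against the image width.

    def choose_alignment(text_x, image_width):
        # Left third of the image -> left aligned.
        if text_x < image_width / 3:
            return "left"
        # Middle third -> center aligned.
        if text_x < 2 * image_width / 3:
            return "center"
        # Right third -> right aligned.
        return "right"

    # choose_alignment(40, 600) -> "left"; choose_alignment(300, 600) -> "center"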


In particular embodiments, when text is overlaid on top of an image, depending on the color or luminosity of the portion of the image underneath the text, the color or luminosity of the text may be automatically adjusted. For example, as illustrated in FIG. 4A, when text 320 is positioned on top of a lighter portion of image 310, text 320 may be made darker. Conversely, as illustrated in FIG. 4B, when text 320 is positioned on top of a darker portion of image 310, text 320 may be made lighter. Similarly, the color of text 320 may be selected to be one that is somewhat in contrast to the color of the portion of image 310 underneath text 320 (e.g., red versus blue, yellow versus purple, etc.). Contrasting the overlaying text with the underlaying image, in terms of color or luminosity, makes it easier for a user to read the text overlaying the image.
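
A minimal sketch of such an adjustment, using Pillow and an assumed mid-gray threshold of 128 (not a value taken from this disclosure), measures the average luminosity of the image region underneath the text and picks a contrasting dark or light text color.

    from PIL import Image, ImageStat

    def pick_text_color(image, box):
        # `box` is the (left, upper, right, lower) pixel region under the text.
        region = image.crop(box).convert("L")       # grayscale copy of the region
        luminance = ImageStat.Stat(region).mean[0]  # average luminosity, 0-255
        # Dark text over a light region, light text over a dark region.
        return (0, 0, 0) if luminance > 128 else (255, 255, 255)

    # Example: pick_text_color(Image.open("photo.jpg"), (20, 20, 220, 60))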


In particular embodiments, when text is overlaid on top of an image, depending on the color or luminosity or level of detail of the portion of the image underneath the text, a shadow may be automatically added or removed around the text. For example, as illustrated in FIG. 5, when text 320 is positioned on top of image 310, shadow 330 may be automatically added around text 320. Often, adding a shadow around a UI element serves several purposes. As an example, shadow 330 may add a three-dimensional (3D) visual effect that makes text 320 appear to be in front of or on top of image 310. As another example, shadow 330 may provide a visual separation between text 320 and image 310, making it easier to read text 320, especially when text 320 and image 310 have similar colors or luminosity levels or when the portion of image 310 underneath text 320 has a lot of detail (e.g., is visually busy). In particular embodiments, shadow 330 may be transparent or semi-transparent so that the portion of image 310 underneath shadow 330 is still visible. Conversely, in other cases, a shadow around text overlaid on top of an image may be automatically removed (e.g., when the shadow causes visual distraction to users).
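
One hedged way to decide when a shadow helps, sketched below with Pillow, is to treat the standard deviation of grayscale pixel values in the region under the text as a rough measure of how visually busy that region is; the threshold is an illustrative assumption. The shadow itself could then be rendered by drawing the text in a dark, semi-transparent color at a small offset before drawing the text itself.

    from PIL import Image, ImageStat

    def should_add_shadow(image, box, detail_threshold=40.0):
        region = image.crop(box).convert("L")
        busyness = ImageStat.Stat(region).stddev[0]  # high stddev = lots of detail
        return busyness > detail_threshold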


In particular embodiments, in addition or as an alternative to adjusting the overlaying text, the underlaying image may be automatically adjusted. For example, when overlaying light-colored text on top of a light-colored image, the portion of the image beneath the text may be slightly darkened to make the text more visible. Conversely, when overlaying dark-colored text on top of a dark-colored image, the portion of the image beneath the text may be slightly lightened. Other characteristics of the underlaying image, such as, for example and without limitation, its luminosity or contrast, may be automatically adjusted when applicable to make the overlaying text stand out and thus easier to read.
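
A minimal sketch of this image-side adjustment, again using Pillow with an assumed brightness factor, crops the region beneath the text, darkens it (a factor below 1) or lightens it (a factor above 1), and pastes it back into place.

    from PIL import Image, ImageEnhance

    def dim_region_under_text(image, box, factor=0.7):
        out = image.copy()
        region = out.crop(box)
        # factor < 1 darkens the region; factor > 1 lightens it.
        region = ImageEnhance.Brightness(region).enhance(factor)
        out.paste(region, box)
        return out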


Particular embodiments may repeat the steps of the method of FIG. 2, where appropriate. Moreover, although this disclosure describes and illustrates particular steps of the method of FIG. 2 as occurring in a particular order, this disclosure contemplates any suitable steps of the method of FIG. 2 occurring in any suitable order. Furthermore, although this disclosure describes and illustrates particular components, devices, or systems carrying out particular steps of the method of FIG. 2, this disclosure contemplates any suitable combination of any suitable components, devices, or systems carrying out any suitable steps of the method of FIG. 2.


Although the method illustrated in FIG. 2 is described in connection with a layered user interface where the UI components are arranged in a hierarchy, the method may be similarly applied to place text on an image even when the text and the image are not separated into two layers. For example, the method may be applied to place text (e.g., caption) on a background image where the image is not necessarily user interactive. Moreover, the method may be similarly applied to overlay one UI component on top of another UI component in general. In this case, various characteristics of either the overlaying component or the underlaying component or both may be adjusted, wherever appropriate, so that it is easier to view both components.


In particular embodiments, the method illustrated in FIG. 2 may be implemented as computer software executed on one or more computing or electronic devices. FIG. 6 illustrates an example computer system 600. In particular embodiments, one or more computer systems 600 perform one or more steps of one or more methods described or illustrated herein. In particular embodiments, one or more computer systems 600 provide functionality described or illustrated herein. In particular embodiments, software running on one or more computer systems 600 performs one or more steps of one or more methods described or illustrated herein or provides functionality described or illustrated herein. Particular embodiments include one or more portions of one or more computer systems 600.


This disclosure contemplates any suitable number of computer systems 600. This disclosure contemplates computer system 600 taking any suitable physical form. As an example and not by way of limitation, computer system 600 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, or a combination of two or more of these. Where appropriate, computer system 600 may include one or more computer systems 600; be unitary or distributed; span multiple locations; span multiple machines; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 600 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 600 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 600 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.


In particular embodiments, computer system 600 includes a processor 602, memory 604, storage 606, an input/output (I/O) interface 608, a communication interface 610, and a bus 612. Although this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.


In particular embodiments, processor 602 includes hardware for executing instructions, such as those making up a computer program. As an example and not by way of limitation, to execute instructions, processor 602 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 604, or storage 606; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 604, or storage 606. In particular embodiments, processor 602 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 602 including any suitable number of any suitable internal caches, where appropriate. As an example and not by way of limitation, processor 602 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 604 or storage 606, and the instruction caches may speed up retrieval of those instructions by processor 602. Data in the data caches may be copies of data in memory 604 or storage 606 for instructions executing at processor 602 to operate on; the results of previous instructions executed at processor 602 for access by subsequent instructions executing at processor 602 or for writing to memory 604 or storage 606; or other suitable data. The data caches may speed up read or write operations by processor 602. The TLBs may speed up virtual-address translation for processor 602. In particular embodiments, processor 602 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 602 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 602 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 602. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.


In particular embodiments, memory 604 includes main memory for storing instructions for processor 602 to execute or data for processor 602 to operate on. As an example and not by way of limitation, computer system 600 may load instructions from storage 606 or another source (such as, for example, another computer system 600) to memory 604. Processor 602 may then load the instructions from memory 604 to an internal register or internal cache. To execute the instructions, processor 602 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 602 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 602 may then write one or more of those results to memory 604. In particular embodiments, processor 602 executes only instructions in one or more internal registers or internal caches or in memory 604 (as opposed to storage 606 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 604 (as opposed to storage 606 or elsewhere). One or more memory buses (which may each include an address bus and a data bus) may couple processor 602 to memory 604. Bus 612 may include one or more memory buses, as described below. In particular embodiments, one or more memory management units (MMUs) reside between processor 602 and memory 604 and facilitate accesses to memory 604 requested by processor 602. In particular embodiments, memory 604 includes random access memory (RAM). This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM. Memory 604 may include one or more memories 604, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.


In particular embodiments, storage 606 includes mass storage for data or instructions. As an example and not by way of limitation, storage 606 may include an HDD, a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 606 may include removable or non-removable (or fixed) media, where appropriate. Storage 606 may be internal or external to computer system 600, where appropriate. In particular embodiments, storage 606 is non-volatile, solid-state memory. In particular embodiments, storage 606 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these. This disclosure contemplates mass storage 606 taking any suitable physical form. Storage 606 may include one or more storage control units facilitating communication between processor 602 and storage 606, where appropriate. Where appropriate, storage 606 may include one or more storages 606. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.


In particular embodiments, I/O interface 608 includes hardware, software, or both providing one or more interfaces for communication between computer system 600 and one or more I/O devices. Computer system 600 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 600. As an example and not by way of limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 608 for them. Where appropriate, I/O interface 608 may include one or more device or software drivers enabling processor 602 to drive one or more of these I/O devices. I/O interface 608 may include one or more I/O interfaces 608, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.


In particular embodiments, communication interface 610 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 600 and one or more other computer systems 600 or one or more networks. As an example and not by way of limitation, communication interface 610 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 610 for it. As an example and not by way of limitation, computer system 600 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 600 may communicate with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these. Computer system 600 may include any suitable communication interface 610 for any of these networks, where appropriate. Communication interface 610 may include one or more communication interfaces 610, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.


In particular embodiments, bus 612 includes hardware, software, or both coupling components of computer system 600 to each other. As an example and not by way of limitation, bus 612 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these. Bus 612 may include one or more buses 612, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.


Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such as, for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage medium or media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium or media may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.


Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context.


This disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments herein that a person having ordinary skill in the art would comprehend. Moreover, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative.

Claims
  • 1. A method, performed by one or more computing devices, comprising:
    receiving a first input from a user in a user interface, the first input comprising a text element;
    receiving a second input from the user in the user interface, the second input comprising an image element;
    overlapping the text element directly on top of the image element in the user interface; and
    automatically adjusting a visual appearance of the text element based on a portion of the image element underneath the text element, in response to the overlapping of the text element on top of the image element, based on a portion of the image element,
    wherein adjusting the visual appearance of the text element comprises:
    left aligning each line of text of the text element relative to the left edge of the image element when the text element is positioned at a left portion of the image element,
    right aligning each line of text of the text element relative to the right edge of the image element when the text element is positioned at a right portion of the image element, and
    center aligning each line of text of the text element relative to a center of the image element when the text element is positioned at a center portion of the image element.
  • 2. The method of claim 1, further comprising adjusting a visual appearance of the portion of the image element underneath the text element based on the text element.
  • 3. The method of claim 1, wherein adjusting the visual appearance of the text element comprises selecting a first luminosity level for the text element that is in contrast to a second luminosity level of the portion of the image element underneath the text element.
  • 4. The method of claim 1, wherein adjusting the visual appearance of the text element comprises selecting a first color for the text element that is in contrast to a second color of the portion of the image element underneath the text element.
  • 5. The method of claim 1, wherein adjusting the visual appearance of the text element comprises adding or removing a shadow around the text element based on visual characteristics of the portion of the image element underneath the text element.
  • 6. A system comprising:
    a memory comprising instructions executable by one or more processors; and
    the one or more processors coupled to the memory and operable to execute the instructions, the one or more processors being operable when executing the instructions to:
    receive a first input from a user in a user interface, the first input comprising a text element;
    receive a second input from the user in the user interface, the second input comprising an image element;
    overlap the text element directly on top of the image element in the user interface; and
    automatically adjust a visual appearance of the text element based on a portion of the image element underneath the text element, in response to the overlap of the text element on top of the image element, based on a portion of the image element,
    wherein the adjusting of the visual appearance of the text element comprises:
    left aligning each line of text of the text element relative to the left edge of the image element when the text element is positioned at a left portion of the image element,
    right aligning each line of text of the text element relative to the right edge of the image element when the text element is positioned at a right portion of the image element, and
    center aligning each line of text of the text element relative to a center of the image element when the text element is positioned at a center portion of the image element.
  • 7. The system of claim 6, wherein the one or more processors are further operable when executing the instructions to adjust a visual appearance of the portion of the image element underneath the text element based on the text element.
  • 8. One or more computer-readable non-transitory storage media embodying software operable when executed by one or more computer systems to:
    receive a first input from a user in a user interface, the first input comprising a text element;
    receive a second input from the user in the user interface, the second input comprising an image element;
    overlap the text element directly on top of the image element in the user interface; and
    automatically adjust a visual appearance of the text element based on a portion of the image element underneath the text element, in response to the overlap of the text element on top of the image element, based on a portion of the image element,
    wherein the adjusting of the visual appearance of the text element comprises:
    left aligning each line of text of the text element relative to the left edge of the image element when the text element is positioned at a left portion of the image element,
    right aligning each line of text of the text element relative to the right edge of the image element when the text element is positioned at a right portion of the image element, and
    center aligning each line of text of the text element relative to a center of the image element when the text element is positioned at a center portion of the image element.
  • 9. The media of claim 8, wherein the software is further operable when executed by the one or more computer systems to adjust a visual appearance of the portion of the image element underneath the text element based on the text element.
  • 10. The media of claim 8, wherein adjusting the visual appearance of the text element comprises selecting a first luminosity level for the text element that is in contrast to a second luminosity level of the portion of the image element underneath the text element.
  • 11. The media of claim 8, wherein adjusting the visual appearance of the text element comprises selecting a first color for the text element that is in contrast to a second color of the portion of the image element underneath the text element.
  • 12. The media of claim 8, wherein adjusting the visual appearance of the text element comprises adding or removing a shadow around the text element based on visual characteristics of the portion of the image element underneath the text element.
RELATED APPLICATION(S)

This application claims the benefit, under 35 U.S.C. §119(e), of U.S. Provisional Patent Application No. 61/593,841, filed 1 Feb. 2012, which is incorporated herein by reference.

Related Publications (1)
Number Date Country
20130198666 A1 Aug 2013 US
Provisional Applications (1)
Number Date Country
61593841 Feb 2012 US