The present disclosure relates generally to interacting with content on a mobile device, and more particularly to sharing images with overlays as well as converting overlays into actions on a mobile device.
Touchscreens are touch-sensitive electronic visual displays that receive tactile input information entered using a human digit, a special glove, or a stylus. A typical touchscreen can sense touch events including contact or movement on the surface of the screen, such as taps, swipes, pinches, flicks, other gestures, marks, lines, or geometric shapes. In general, touchscreens enable users to interact directly with images displayed on the screen, rather than through an intermediate device, such as a mouse or a touchpad.
Some existing touchscreens implement resistive touch-sensing technology, while other existing touchscreens implement capacitive, surface acoustic wave, infrared or optical technologies to sense touch events. Touchscreens have been used as input devices in tablet computers, mobile phones, and gaming consoles.
A currently emerging area of application is in compact wearable processing devices, such as wrist-wearable devices, in which the touchscreens typically are of relatively small size. The reduced size of touchscreens on wearable devices has drawbacks regarding existing user interface implementations. For example, a user of a compact processing device does not have available a full range of user interface elements such as menu bars, drop down menu items, navigation buttons or the like that are supported and provided on a computing device with a larger footprint of processing power. Consequently, users' experience on compact wearable processing devices lacks ease and efficiency.
Furthermore, images displayed on such compact wearable processing devices are usually non-interactive, thereby only allowing users to view the images, but not to interact or connect with the image content in manners available on non-compact processing devices. For example, a user cannot leave a comment or feedback regarding the image content for the purposes of sharing opinions or impressions among friends, family, groups or communities on a social network.
According to one exemplary embodiment of the present disclosure, a method for sharing images with overlays on a mobile platform includes the steps of receiving an image, adding an overlay to the received image such that the overlay conveys an impression of the image upon viewing the image, recognizing the overlay by matching it to a set of pre-defined overlay templates to identify a template intended by the overlay, assigning the recognized matching template to the received image, and superimposing the template onto the received image to create a third image such that the third image indicates the impression conveyed by the overlay.
According to another exemplary embodiment of the present disclosure, a non-transitory computer readable storage medium has embedded therein program instructions that, when executed by one or more processors of a computer, cause the computer to execute a process for sharing images with overlays on a mobile platform. The process includes receiving an image, adding an overlay to the received image such that the overlay conveys an impression of the image upon viewing the image, recognizing the overlay by matching it to a set of pre-defined overlay templates to identify a template intended by the overlay, assigning the recognized matching template to the received image, and superimposing the template onto the received image to create a third image such that the third image indicates the impression conveyed by the overlay.
According to yet another exemplary embodiment of the present disclosure, a system for sharing reactions towards images on a mobile platform includes a displaying module for receiving an image, an acquisition module for adding overlay content such that the overlay content conveys an impression of the image upon viewing the image, a recognition module for recognizing the overlay content by matching it to a set of pre-defined overlay templates to identify a template intended by the overlay content and for assigning a template in response to the recognized match, and a modification module for superimposing the assigned template onto the received image to create a third image such that the third image indicates the impression conveyed by the overlay content.
According to another exemplary embodiment of the present disclosure, a method for converting a user generated overlay into an action on a mobile device includes the steps of receiving content on a display of the mobile device, the content being associated with a set of actions for selection, generating an overlay image on the received content, the overlay image being generated content that enables an action by the mobile device in response to the received content, matching the overlay image and the received content to a set of templates, each template of the set of templates being associated with an action, identifying a template matched to the overlay image and the received content, and assigning the action associated with the identified template to the received content.
According to another exemplary embodiment of the present disclosure, a non-transitory computer readable storage medium has embedded therein program instructions that, when executed by one or more processors of a computer, cause the computer to execute a process for converting a generated overlay into an action on a mobile device. The process includes the steps of receiving content on a display of the mobile device, the content being associated with a set of actions for selection, generating an overlay image on the received content, the overlay image being generated content that enables an action by the mobile device in response to the received content, matching the overlay image and the received content to a set of templates, each template of the set of templates being associated with an action, identifying a template matched to the overlay image and the received content, and assigning the action associated with the identified template to the received content.
According to yet another exemplary embodiment of the present disclosure, a system for converting a generated overlay into an action on a mobile device includes a displaying module for receiving content, the content being associated with a set of actions for selection, an acquisition module configured to capture an overlay image on the received content, the overlay image being generated content that enables an action by the mobile device in response to the received content, a recognition module configured to match the overlay image and the received content to a set of templates, each template of the set of templates being associated with an action, and an overlay action module configured to assign the action associated with the matched template to the received content.
The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
In some exemplary embodiments, a system and method for content interaction through user generated overlays on a mobile device are illustrated. One exemplary system and method is illustrated for sharing images with overlays with a third party on a communication network. A user shopping in an ecommerce marketplace on a mobile platform may wish to share spontaneous impressions of product or service items encountered on the ecommerce marketplace with friends, family, groups or communities. For example, with a touch screen of a mobile device executing an ecommerce application, the user can finger draw a graphic heart shape on an image displayed on the touch screen to convey an impression of the content portrayed by the image as “liking” and “wishing to have”. Sending the image with the heart shape to friends, family, groups or communities on communication networks shares the aforementioned “liking” and “wishing to have” impression towards the content of the image amongst the intended recipients.
Another exemplary system and method is illustrated for converting an overlay into an action on a mobile device. A user of an application on a mobile device with limited user interface elements may wish to interact with the application using contacts or touches on a touch display screen of the mobile device to initiate actions or commands for the application to execute on the mobile device. For example, with a touch screen of a mobile device executing an ecommerce application, the user can again finger draw a graphic heart shape on an image displayed on the touch screen to task the ecommerce application both to add the item portrayed in the displayed image to a “wish-to-have” list associated with the user's ecommerce account or user profile data, and to update the image by superimposing a pre-defined heart shape at the upper left corner of the image to reflect the action performed in response to the user generated heart-shaped overlay.
For yet another example, with a touch screen of a compact mobile device executing an ecommerce application in the context of a time display of the compact mobile device, the user can finger draw an arc shape, or any shape that generally follows the outer perimeter of the watch dial of the time display, from a first position to a second position, passing through a first icon representing a first category of items for purchase and a second icon representing a second category of items for purchase. Upon the user gesturing the arc overlay through a vicinity of the first icon, an image of an item of the first category is displayed in the center portion of the time display. As the user further gestures past the remote end of the vicinity of the first icon and into a vicinity of the second icon, an image of an item from the second category is displayed at the center portion of the time display, replacing the image of the item from the first category. Consequently, the user initiates category-switching actions by generating an arc-shaped overlay over the time display of the compact mobile device.
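By way of illustration only, the mapping from an arc gesture position to the nearest category icon described above can be sketched as a nearest-angle lookup. The icon names, angles, and coordinate conventions below are assumptions of the sketch, not part of the disclosed embodiments.

```python
import math

# Hypothetical icon layout: each category icon sits at a fixed angle
# (degrees) on the circular watch dial. Names and angles are illustrative.
CATEGORY_ICONS = {30: "jackets", 150: "handbags", 270: "shoes"}

def nearest_category(touch_x, touch_y, center=(0.0, 0.0)):
    """Map a touch point near the dial's perimeter to the closest icon."""
    angle = math.degrees(math.atan2(touch_y - center[1],
                                    touch_x - center[0])) % 360.0

    def angular_distance(icon_angle):
        # Distance around the circle, taking the shorter way around.
        d = abs(angle - icon_angle) % 360.0
        return min(d, 360.0 - d)

    return CATEGORY_ICONS[min(CATEGORY_ICONS, key=angular_distance)]
```

As the gesture sweeps past successive icons, repeated calls to such a lookup would yield the succession of categories whose items replace one another in the center of the time display.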
Now referring to
Programming code, such as source code, object code or executable code, stored on a computer-readable medium, such as the storage 108 or a peripheral storage component coupled to the computing device 100, can be loaded into the memory 104 and executed by the processor 102 in order to perform the functions of a system and method in accordance with the present invention. In various embodiments, the computing device 100 can include, for example, a mobile device, such as a personal digital assistant (PDA), a cellular telephone, a smart phone, a wearable device, or the like, with a relatively compact touch display.
Referring to
As shown in
In various embodiments, the communication network 306 can include any viable combination of devices and systems capable of linking computer-based systems, such as the Internet; an intranet or extranet; a local area network (LAN); a wide area network (WAN); a direct cable connection; a private network; a public network; an Ethernet-based system; a token ring; a value-added network; a telephony-based system, including, for example, T1 or E1 devices; an Asynchronous Transfer Mode (ATM) network; a wired system; a wireless system; an optical system; a combination of any number of distributed processing networks or systems or the like.
Database 400 is shown to include a third party permission profile 402 that stores user profile information 404 for each third party with whom the user of the ecommerce marketplace intends to share content encountered, and impressions thereof, while shopping on the ecommerce marketplace. The user profile information 404 may include third party identity information 406 that prescribes the identities of third parties. The user profile information 404 may also store communication platform information 408 identifying the platforms with which the prescribed third parties are associated. The user profile information 404 can be provisioned by the user of the ecommerce marketplace through an application (part of the menu options of which is shown in
Now turning to
Exemplary types of user overlays that a user can generate and subsequently use to interact with presented content can include, but are not limited to, graphic shapes, lines, dots, textual information, numeric information and combinations thereof. User overlays can also be communicated in any language the user of a mobile device wishes. In alternative embodiments, both the ecommerce application and the overlay manager 502 can be implemented as full-blown applications running on a general computing device, in addition to applications or components customized for a compact processing device with a smaller footprint of processing power.
As illustrated in
Content received by the display module 504 can be associated with a set of actions from which the user can select to task the application or the mobile device to perform in the context of the received content. In some embodiments, when the content received is a product image presented by user interface elements of the ecommerce mobile App, the context of the content is the ecommerce application. Given the context being the ecommerce application, the content received is accordingly responsive to actions available under the ecommerce application. For example, such actions can be “browse to the next product,” “browse to the prior product,” “switch to the next category of products,” “switch to the prior category of products,” “buy now,” “add to wish list,” “gift wrap,” “share impression,” and the like. In addition to initiating actions by touch events such as tapping on certain sensitized areas on the touch screen display, right-to-left linear swiping and left-to-right linear swiping, the user can also initiate actions by generating overlays over the content received. For example, a user drawing a box shape over a product image of the received content initiates an action of including a gift wrapping option upon the user proceeding to purchase the product.
In other embodiments, when the content received is a composite, e.g., a product image displayed at the center portion of a watch face of the time display of a compact processing device, the context is the ecommerce application integrated into the time displaying application rendering the watch face. With the context being applications with hybrid or merged interfaces, the composite content can accordingly be responsive to actions specifically customized to the shared interfaces of the applications. For example, such actions can be “select a category,” “show details,” “hide details,” “switch categories,” and the like.
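The pairing of a recognized overlay shape with the current context to select an action can be sketched as a simple table lookup. Every shape name, context name, and action string in the following Python sketch is an assumption for illustration; the disclosure does not prescribe this data structure.

```python
# Illustrative mapping from (recognized overlay shape, context) to an
# application action; all names here are assumptions of the sketch.
OVERLAY_ACTIONS = {
    ("heart", "ecommerce"): "add_to_wish_list",
    ("box", "ecommerce"): "gift_wrap",
    ("arc", "watch_face"): "switch_categories",
}

def action_for(shape, context):
    """Look up the action a recognized shape triggers in a given context.

    Returns None when the shape has no action in that context, e.g. an
    arc drawn inside the plain ecommerce application rather than over
    the watch face.
    """
    return OVERLAY_ACTIONS.get((shape, context))
```

Keying the table on the context as well as the shape reflects the point above: the same overlay can be actionable in one interface and inert in another.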
The acquisition module 506 is configured to capture an overlay (not shown) generated by the user upon the content received by the display module 504. The acquisition module 506 acquires the overlay by detecting touches or contacts generated by the user upon the content displayed on a touch screen of a compact processing device. Touches and contacts include, but are not limited to, sensed contact or movement on the surface of the touch screen display that corresponds to taps, swipes, pinches, flicks, marks, lines, geometric shapes, gestures, or the like. In various embodiments, touches and contacts also include contact, movement and indications generated from input mechanisms for non-touch screen displays. For example, a user may use a stylus, a special glove, a laser beam, or the like, in accordance with display technologies known in the art.
The recognition module 508 matches the user generated overlay with a set of pre-defined overlay templates for the purpose of identifying a template intended by the user with the overlay. For example, if strokes resembling the shape of a heart are received as an overlay, and the shape of a heart is defined in the set of overlay templates, the recognition module 508 will recognize the heart-shaped template as the one indicated by the user generated overlay by matching the two shapes.

Alternatively, the recognition module 508 can also match the user generated overlay together with the received content to a set of templates to identify a template intended by the user. In this instance, each template of the set of templates has an action associated therewith. In some embodiments, the recognition module 508 can utilize the context information provided by the received content, together with the overlay, to match to the set of overlay templates. The context information can include, but is not limited to, the user interface context, or any other signals the compact processing device can detect or communicate. In other embodiments, the recognition module 508 can utilize any information the received content may provide, together with the overlay, to match to the set of overlay templates.
Upon a successful match of the overlay and received content to an overlay template of the set of the overlay templates, the recognition module 508 signals the overlay template identified to an overlay action module 514. The overlay action module 514 is configured to task the application or the mobile device to perform the action associated with the identified overlay template. In some embodiments, an overlay template with an action associated therewith is further specified to have an option of updating the content to reflect the action performed. With such an option, the overlay action module 514 will signal the modification module 510 to update the content according to the option defined for the action associated with the template. For example, the update option can be assigning the overlay template for modifying the content received. In other embodiments, an action can be performed without providing any visual updates to inform the user of the action performed.
Upon a successful match of the overlay to an overlay template without an associated action, the recognition module 508 assigns the identified overlay template to the received content. Upon unsuccessful recognition of any match between the user generated overlay and the set of overlay templates, the recognition module 508 discards the overlay generated by the user and is ready to recognize a new overlay input by the user. In alternative embodiments, upon finding no match to the set of pre-defined overlay templates with the overlay generated by the user, the recognition module 508 can nevertheless assign to the content the overlay as generated by the user. However, in either case, absent successful recognition of a matching template for the generated overlay, the recognition module 508 does not signal the overlay action module 514.
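One minimal way such template matching could work is to normalize both strokes into a unit bounding box and compare corresponding points. The Python sketch below assumes equal-length strokes (e.g., after resampling) and an arbitrary threshold; it illustrates the idea only and is not the disclosed matching algorithm.

```python
import math

def normalize(stroke):
    """Scale a stroke (list of (x, y) points) into a unit bounding box,
    so matching is invariant to size and position on the screen."""
    xs = [p[0] for p in stroke]
    ys = [p[1] for p in stroke]
    w = (max(xs) - min(xs)) or 1.0   # guard against degenerate strokes
    h = (max(ys) - min(ys)) or 1.0
    x0, y0 = min(xs), min(ys)
    return [((x - x0) / w, (y - y0) / h) for x, y in stroke]

def match_score(stroke, template):
    """Average point-to-point distance; assumes equal-length strokes."""
    a, b = normalize(stroke), normalize(template)
    return sum(math.dist(p, q) for p, q in zip(a, b)) / len(a)

def recognize(stroke, templates, threshold=0.25):
    """Return the name of the best-matching template, or None when no
    template is close enough (the overlay is then discarded or kept raw,
    depending on the provisioned settings)."""
    best = min(templates, key=lambda name: match_score(stroke, templates[name]))
    return best if match_score(stroke, templates[best]) <= threshold else None
```

Returning None models the unsuccessful-recognition branch above, in which the recognition module does not signal the overlay action module.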
The modification module 510 is configured to superimpose the assigned overlay template onto the underlying image on display to create a third image. The overlay template can be assigned to the received content either by the recognition module 508 or the overlay action module 514. Depending on the graphic format utilized and supported by the mobile platform, the modification module 510 can juxtapose the assigned overlay template in a variety of ways. In some instances where the graphic format supports image layers, the modification module 510 can superimpose the assigned template as the top layer onto the underlying image. In other instances where the graphic format does not support image layers, the modification module 510 can merge the template image with the underlying image. In either case, for example, an overlay template can be applied to the underlying image in the manner of being wholly solid, wholly opaque, partly opaque and partly translucent, or wholly translucent. Herein, the term “superimpose” and the term “merge” can be used interchangeably.
When the user generated overlay is assigned as a template to the underlying content, the modification module 510, in alternative embodiments, can superimpose the overlay in a translucent manner while preserving the other aspects of the user generated overlay, such as the exact shape and the exact position in relation to the underlying content. In this way, little of the underlying content is obscured when the overlay is superimposed or merged with the underlying content. For example, a user-drawn heart shape can be displayed on top of the underlying content, leaving some portions of the underlying image fully visible while overlapping other portions. The heart shape may also be translucent so that the content portions it overlaps remain partially visible, providing a largely unobstructed view of the underlying content.
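Translucent superimposition reduces to per-pixel alpha blending. The sketch below uses plain dictionaries of coordinates to pixel tuples in place of a real image format; it is an illustration under that assumption, not the modification module's implementation.

```python
def blend_pixel(overlay_rgba, base_rgb):
    """Alpha-blend one RGBA overlay pixel (alpha in 0..255) onto one RGB
    base pixel. Alpha 0 leaves the base untouched; alpha 255 replaces it."""
    r, g, b, a = overlay_rgba
    alpha = a / 255.0
    return tuple(round(alpha * o + (1.0 - alpha) * u)
                 for o, u in zip((r, g, b), base_rgb))

def superimpose(template, image):
    """Merge a template ({(x, y): RGBA}) onto an image ({(x, y): RGB}),
    producing the 'third image' without mutating the original image."""
    third = dict(image)
    for xy, pixel in template.items():
        if xy in third:
            third[xy] = blend_pixel(pixel, third[xy])
    return third
```

Because the original image dictionary is copied rather than modified, the sketch also hints at how an overlay could later be removed, as discussed below for removably added templates.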
In alternative embodiments, the third image may include information that allows an overlay template to be removed from the third image. With an overlay template removably added to the underlying image, a user can further modify the underlying image by canceling the overlay template, or by providing a different overlay template, when the user forms a different impression of the underlying image upon viewing it again at a later point in time.
The distribution module 512 is configured to transmit the third image over a communication network 520 to a third party 516. The distribution module 512 retrieves from a database 400 of
One of the menu options is an overlay settings option 602, which further includes two sub-options: a definition settings option 604 and a sharing settings option 606. The definition settings option 604 includes an option 608 to “detect the end of overlay content input upon”, allowing the user to choose from pre-determined ways of detecting the end of the user inputting overlay content. For example, with a lapse-of-time based mechanism, the user may select a pre-defined amount of time after which inactivity from the user on the touch screen indicates the end of the user generating overlay content upon the touch screen. With this option, the user may select “5 seconds” or “10 seconds” or any user-preferred amount of time as the time-out setting for the acquisition module 506. Alternatively, the user may also select a touch event based mechanism to detect the end of user generated overlay content. With this option, upon detecting any touch event to which the ecommerce application is responsive, the acquisition module 506 stops capturing overlay content from the user. For example, if a user swiping from left to right on the touch screen cues the ecommerce application to present the next commercial item in the same category, then such swiping indicates the end of the user conveying an impression on the image of the current commercial item on display. Nevertheless, detecting the end of user generated overlay content can be implemented in accordance with user interface technologies known in the art.
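The lapse-of-time mechanism can be sketched as an acquirer that reports the overlay finished once no touch arrives within the configured timeout. The class name and the injectable clock are conveniences of this illustrative sketch, not part of the disclosure.

```python
import time

class OverlayAcquirer:
    """Collects touch points and reports the overlay finished when no
    touch arrives for `timeout` seconds (the lapse-of-time setting)."""

    def __init__(self, timeout=5.0, clock=time.monotonic):
        self.timeout = timeout
        self.clock = clock          # injectable time source, eases testing
        self.points = []
        self.last_touch = None

    def on_touch(self, x, y):
        """Record one touch sample and reset the inactivity timer."""
        self.points.append((x, y))
        self.last_touch = self.clock()

    def finished(self):
        """True once the user has been inactive for the full timeout."""
        return (self.last_touch is not None
                and self.clock() - self.last_touch >= self.timeout)
```

A touch-event-based variant would instead end acquisition as soon as a touch event arrives that the application itself consumes, such as a category-switching swipe.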
The definition settings option 604 also includes a choice of whether to “Always Use Overlay Template” 610, allowing the user to select a “YES” or “NO” setting. For example, when the user selects the “YES” option and an overlay content generated by the user is not recognized as matching any pre-defined overlay template, no overlay template will be superimposed onto the image upon which the user generates the overlay content. However, when the user selects the “NO” option, a user generated overlay content that is not recognized as matching any of the pre-defined overlay templates will nevertheless be superimposed upon the image to indicate the impression the user conveys with the overlay content.
The definition settings option 604 further includes a template editing option 612, allowing the user to add, edit or delete individual overlay templates of a set of pre-determined overlay templates through an edit template option 614. The template editing option 612 also includes an edit template description option 616. The edit template description option 616 allows the user to review and edit a string of textual comments associated with each template of the set of overlay templates. For example, a description for a heart-shaped template can be “like it and wish to have it” while the description for a smiley face template can be “it is nice.” For another example, the user may add or define a new overlay template such as an acronym “SFGD” and provide the associated description as “Shop For Group Discount?”. Without the user's definition of a phrase like “SFGD”, the recognition module 508 will recognize the overlay content “SFGD” as not matching any of the pre-defined overlay templates.
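The add/edit/delete template editing described above amounts to maintaining a registry of named templates with descriptions. The following sketch is a minimal illustration; the class and entry names are hypothetical.

```python
class TemplateRegistry:
    """Sketch of the template editing options 612/616: each template name
    maps to its stroke data and a textual description."""

    def __init__(self):
        self.templates = {}

    def add(self, name, stroke, description=""):
        """Add or redefine a template (edit template option 614)."""
        self.templates[name] = {"stroke": stroke, "description": description}

    def edit_description(self, name, description):
        """Revise the comment string (edit template description option 616)."""
        self.templates[name]["description"] = description

    def delete(self, name):
        """Remove a template; unknown names are ignored."""
        self.templates.pop(name, None)
```

A user-defined “SFGD” entry would simply be one more `add` call; without it, a recognizer consulting this registry would find no match for that overlay.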
The sharing settings option 606 includes a set of recipient contact information 618. The recipient contact information 618 identifies third parties with whom the user intends to share content and experience while shopping on the ecommerce marketplace. With this option, the user can add, edit, or delete third party contact information. The recipient contact information 618 is stored in the third party permission profile 402 of
Further, in the context of an AliExpress App integrated with a circular watch face of a time display application, the shape of an arc with a center angle of any degree and a start point 670 and an end point 672 corresponds to an action for switching displays from a category represented by the icon in the closest vicinity to the start point 670 to a category represented by the icon in the closest vicinity to the end point 672. When the end point 672 of the arc-shaped overlay is generated by a clockwise movement starting from the start point 670, the action corresponds to switching categories clockwise. When the end point 672 of the arc-shaped overlay is generated by a counter-clockwise movement starting from the start point 670, the action corresponds to switching categories counter-clockwise. As the corresponding actions will update the time display accordingly upon the switch of categories, no update option needs to be defined in the mappings 680.
Furthermore, returning to the context of the AliExpress App alone, a similar arc shape with a center angle of any degree and a start point 670 and an end point 672 will not correspond to any action that can be performed in response to the content received in the AliExpress App. For another example, a smiley face template and content received in the AliExpress App are not associated with any action. Recognized as a template without an associated action, the smiley face template will nevertheless be assigned to the received content for the purposes of modifying the received content.
Now turning to
Referring to
In response, as depicted in
Upon viewing the newly displayed product image 702, the user forms an impression of the product, i.e., the blazer, and starts to share that impression with friends, family or a group by initiating a touch contact 730. Referring to
As the touch contact 730 continues in a heart-shaped drawing motion on the touch display screen 703 over the product image 704, the acquisition module 506 continues to capture the touch contact 730 into user generated overlay content. When the touch contact 730 concludes upon the completion of drawing the heart shape, the user stops interacting with the touch screen 703, and the acquisition module 506 stops acquiring overlay content from the user and captures a user generated overlay 740 in the form of a hand-drawn heart shape, as depicted in
Upon capturing the user generated overlay 740, the recognition module 508 analyzes the overlay content 740 by matching its shape against the set of overlay templates provisioned in the provisioning application 600. In response to a successful match of the hand-drawn heart shape to a heart-shaped template 656, the heart-shaped overlay template 656 is superimposed at the upper left corner of the product image 704 to generate a third image 750, as depicted in
Furthermore, the recognition module 508 can further analyze the overlay 740 by matching the heart-shaped template and the received content to the set of templates associated with actions. Upon determining that an action of “add-to-wish-list” is associated with the heart-shaped template and the content received in the context of the AliExpress App, the recognition module 508 signals the overlay action module 514 to task the AliExpress App to perform the associated action, i.e., to add the blazer to the wish list of the user of the AliExpress App.
Upon the generation of the third image 750, the distribution module 512 refers to the third party permission profile provisioned through the overlay sharing settings option 606 and sends the third image 750 to each party provisioned in the overlay sharing settings. For example, the third image 750 can be saved into a file in a format compliant with Facebook® timeline postings and the file can be posted to the Facebook® timeline of the third party. For another example, the third image 750 can be saved into a file in HTML format and the file can be posted as a blog update to a blog on the Internet. In alternative embodiments, the overlay template description can also be included in the file generated from the third image 750. In other alternative embodiments, the file generated from the third image 750 can further include information such as star ratings, which correlate the number of stars with the impression conveyed by the template.
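Assembling the shared artifact, i.e., the third image plus an optional template description and star rating, can be sketched as building a payload before handing it to a platform-specific publisher. The field names below are hypothetical and not drawn from any real posting API.

```python
def build_share_payload(image_bytes, template_description=None, stars=None):
    """Assemble a hypothetical sharing payload for a distribution module.

    `image_bytes` stands in for the encoded third image; the optional
    description and star rating mirror the alternative embodiments above.
    """
    payload = {"image": image_bytes}
    if template_description:
        payload["comment"] = template_description
    if stars is not None:
        payload["rating"] = stars
    return payload
```

A distribution module would then serialize such a payload into whatever file format each recipient platform expects.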
Now referring to
In response, as depicted in
Upon viewing the product display image 702 displaying a product from a new product category, the user again forms an impression of the product, i.e., the handbag, and starts to share that impression with friends, family or a group by initiating a touch contact 770. Referring to
As depicted in
Now turning to
Referring to
Next, as depicted in
As depicted in
As further depicted in
User generated arc-shaped overlays drawn through circular contact on the touch display screen are only exemplary overlays that can have actions associated therewith for content received in the exemplary time display context. As the dial of the watch face may take on any shape (e.g., a shamrock) and bear full, limited, or no hour or minute markings, overlays that generally track the particular shape of the dial of a watch face can initiate actions similar to those of an arc-shaped overlay over a circular dial of a watch face.
In block 902, content with an image, shown on a touch display screen of the mobile platform upon the user's encountering of an item offered for purchase, is received by the overlay manager 502 of
When the conclusion of the user adding overlay content is detected, in block 906, the user generated overlay is recognized by matching it to a set of pre-defined overlay templates to identify an overlay template intended by the overlay generated in block 904. Upon a successful match, an overlay template is assigned to the image in block 908. Alternatively, when the user generated overlay content does not match any of the overlay templates pre-defined in the set, and when the provisioned overlay definition settings permit adding an overlay as generated by the user, an overlay template is assigned to the image in block 908 in the form of the overlay as the user generated it.
In block 910, the assigned template, either in the form of a pre-defined overlay template or in the form of the overlay as generated by the user in block 904, is superimposed onto the image displaying the item offered for purchase. A third image including both the image displaying the item offered for purchase and the assigned template is thereby created in block 910. The third image is saved in a file in block 912 and transmitted to a third party on a communication network such that, upon receiving and viewing the third image, the third party comprehends the impression conveyed by the overlay generated in block 904 towards the item displayed in the image.
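The sharing method of blocks 902 through 912 can be sketched end to end as pure data flow. In this illustration the recognizer is reduced to a dictionary lookup and the third image to a pair of values; every name is an assumption of the sketch rather than the disclosed implementation.

```python
def share_with_overlay(image, stroke_name, templates, always_use_template=True):
    """Sketch of blocks 902-912: recognize the overlay, assign a template,
    and superimpose it into a 'third image'.

    `templates` maps names to overlay data; `stroke_name` stands in for
    the recognizer's output on the captured stroke (blocks 904-906).
    """
    matched = templates.get(stroke_name)       # block 906: recognition
    if matched is not None:
        assigned = matched                     # block 908: use the template
    elif not always_use_template:
        assigned = stroke_name                 # block 908: keep raw overlay
    else:
        return None                            # unmatched, nothing shared
    # Block 910: superimpose (represented here as a simple composite);
    # block 912 would save and transmit this result to each recipient.
    return {"base": image, "overlay": assigned}
```

The `always_use_template` flag mirrors the “Always Use Overlay Template” setting: with “YES”, an unrecognized overlay yields nothing to share; with “NO”, the raw overlay is kept.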
Now turning to
In block 1002, content shown on a touch display screen of the mobile platform is received by the overlay manager 502 of
Next, at decision block 1006, the overlay is matched against a set of overlay templates to decide whether the overlay is recognized by the system. Decision block 1006 takes the Yes path to decision block 1008 if the overlay image successfully matches a pre-defined template in the set. Otherwise, decision block 1006 takes the No path back to the start of the method 1000.
At decision block 1008, with an identified template, a decision is made regarding whether an action is associated with the identified template in the context relating to the content received in block 1002. Decision block 1008 takes the Yes path to block 1010 if an action is identified; otherwise, it takes the No path back to the start of the method 1000. In block 1010, the action associated with the identified overlay template is performed on the mobile device. In block 1012, if specified by the options related to the associated action, the content received in block 1002 is updated to reflect the action performed on it.
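The branching of blocks 1006 through 1012 can be sketched as a lookup keyed on the recognized template and the current content context. The table entries below (a check-mark acknowledging a message, an arc setting a timer in the time display context) are hypothetical examples, not the disclosed action set:

```python
# Hypothetical dispatch table: (template, context) -> action on the content.
ACTIONS = {
    ("check_mark", "message"): lambda content: {**content, "status": "acknowledged"},
    ("arc", "time_display"): lambda content: {**content, "timer_set": True},
}

def handle_overlay(template, context, content):
    """Blocks 1008-1012: look up the action associated with the identified
    template in the current context, perform it, and return the (possibly
    updated) content."""
    action = ACTIONS.get((template, context))
    if action is None:
        return content          # No path: no associated action, content unchanged
    return action(content)      # Yes path: perform the action and update content
```

Keying on both template and context captures the point that the same overlay may trigger different actions, or none, depending on the content being displayed.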
Aspects of this disclosure are described herein with reference to flowchart illustrations or block diagrams, in which each block or any combination of blocks can be implemented by computer program instructions. The instructions may be provided to a processor of a general purpose computer, special purpose computer, mobile programming device, or other programmable data processing apparatus to produce a machine or article of manufacture, such that the instructions, when executed by the processor, create means for implementing the functions, acts or events specified in each block or combination of blocks in the diagrams.
In this regard, each block in the flowchart or block diagrams may correspond to a module, segment, or portion of code that includes one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functionality associated with any block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or blocks may sometimes be executed in reverse order.
A person of ordinary skill in the art will appreciate that aspects of this disclosure may be embodied as a device, system, method or computer program product. Accordingly, aspects of this disclosure, generally referred to herein as circuits, modules, components or systems, may be embodied in hardware, in software (including firmware, resident software, micro-code, etc.), or in any combination of software and hardware, including computer program products embodied in a computer-readable medium having computer-readable program code embodied thereon. In the context of this disclosure, a computer-readable storage medium may include any tangible medium that is capable of containing or storing program instructions for use by or in connection with a data processing system, apparatus, or device.
It will be understood that various modifications may be made. For example, useful results still could be achieved if steps of the disclosed techniques were performed in a different order, and/or if components in the disclosed systems were combined in a different manner and/or replaced or supplemented by other components. Accordingly, other implementations are within the scope of the following claims.
This application claims the benefit of U.S. Provisional Application No. 62/129,639, filed on Mar. 6, 2015, which is incorporated herein by reference in its entirety.
Published as U.S. Patent Application Publication No. 2016/0259464 A1, Sep. 2016 (US).