The present disclosure relates generally to portable electronic devices and more particularly to user customization of portable electronic devices.
Users of cellular phones, tablets, and other portable electronic devices frequently are interested in customizing the appearance of their portable electronic devices so that the devices reflect the users' personalities and interests. Typically, such customizations have been limited to the selection of a device case with a particular print, pattern, or logo, the selection of a particular photograph as the background image on the display screen of the portable electronic device, or the selection of pre-defined themes for the display screen of the portable electronic device. These conventional customizations often fail to provide the sense of individuality that users seek from their devices.
The present disclosure may be better understood by, and its numerous features and advantages made apparent to, those skilled in the art by referencing the accompanying drawings. The use of the same reference symbols in different drawings indicates similar or identical items.
The following description is intended to convey a thorough understanding of the present disclosure by providing a number of specific embodiments and details involving customization of a color scheme of a portable electronic device by configuring one or more diffused light sources of the portable electronic device or an associated holster, cover, or other case accessory. It is understood, however, that the present disclosure is not limited to these specific embodiments and details, which are examples only, and the scope of the disclosure is accordingly intended to be limited only by the following claims and equivalents thereof. It is further understood that one possessing ordinary skill in the art, in light of known systems and methods, would appreciate the use of the disclosure for its intended purposes and benefits in any number of alternative embodiments, depending upon specific design and other needs.
With an image of the object captured, the predominant color of the object, or another color associated with the object, is determined, and one or more diffused light sources associated with the portable electronic device are configured to emit light that approximates the determined color. In some embodiments, the portable electronic device performs one or both of the identification of the object within the image and the determination of the color of the object, while in other embodiments, the portable electronic device communicates with a remote processing system, which performs one or both of the processes of identifying the object and determining the color associated with the object. The one or more diffused light sources controlled in this manner can include multi-color light emitting elements (e.g., light emitting diodes) disposed at a housing of the portable electronic device, multi-color light emitting elements disposed at a case accessory of the portable electronic device (e.g., a case cover, holster, etc.), or a combination thereof. In some embodiments, the color of the object also can be used to configure a color scheme of a graphical user interface (GUI) presented via a display screen of the portable electronic device.
By directly or indirectly expressing an affinity or other interest in an object in the user's proximity or local environment, the user can configure the portable electronic device or associated case accessory to emit diffused light that approximates the color of the object of interest, and thus the portable electronic device or associated case accessory can provide “mood lighting” or another color scheme that is individualized to the user's tastes or preferences. Additionally, in some embodiments, to facilitate color scheme coordination in a group of devices, the portable electronic device can transmit an indicator to other portable electronic devices of a specified group so that these other portable electronic devices can likewise configure their diffused light sources to emit light having the same or similar hue.
In the depicted example, the portable electronic device 100 includes a display screen 102 and one or more diffused light sources disposed at a housing 104, such as a diffused light source 106 disposed at a side panel of the housing 104, and a diffused light source 108 disposed at the front panel of the housing 104 and which serves as backlighting for a “hard” or “physical” button 110 on the front panel. The portable electronic device 100 further is associated with an imaging camera 112, which in the illustrated embodiment is disposed at a back panel of the portable electronic device 100 and thus integrated into the portable electronic device 100. In other embodiments, the imaging camera 112 can include an imaging camera separate from, but wirelessly connected to, the portable electronic device 100. To illustrate, the portable electronic device 100 can include a smartphone or a tablet computer, and the imaging camera 112 may be integrated into a smartwatch, a head-mounted display device (e.g., Google™ Glass™), or other user-wearable device that communicates with the smartphone or tablet computer.
In operation, the portable electronic device 100 monitors the user's interactions or other user activities to detect a trigger event that signals a user's interest in an object in proximity to the portable electronic device 100. In some embodiments, the trigger event occurs in the form of an explicit instruction, such as when the user uses an imaging application of the portable electronic device 100 to initiate the capture of an image of the object via the imaging camera 112 with the intent to use a color of the object to configure the diffused light sources 106 and 108 accordingly. This explicit instruction may come in the form of the user's manipulation of a physical button on the housing 104 or of a “soft” button presented on the display screen 102, or in the user's vocalization of a verbal command to the imaging software, such as the user speaking the command “capture image for custom housing color” into a microphone of the portable electronic device 100. In other embodiments, the trigger event occurs in the form of an inference that the user is interested in the object. To illustrate, the portable electronic device 100 may monitor the user's vocalizations 130 to detect terms or phrasing indicating an interest in an object in the local environment, such as the user exclaiming “I like that painting!”. The portable electronic device 100 also may infer interest in the object from a usage pattern or user interactions with the portable electronic device 100. As an example, if the user handles the portable electronic device 100 such that the object remains in an imaging viewfinder of the portable electronic device 100 for longer than a threshold duration, the portable electronic device 100 may infer the user's interest in the object.
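As a rough illustration of the dwell-time inference, the following Python sketch polls a hypothetical viewfinder interface and fires a trigger once the same object has stayed in frame past a threshold. The `viewfinder.current_object()` accessor and the three-second threshold are assumptions; the disclosure specifies neither.

```python
import time

DWELL_THRESHOLD_S = 3.0  # assumed threshold; the disclosure leaves the duration unspecified

def detect_dwell_trigger(viewfinder, poll_interval_s=0.1):
    """Return the ID of an object once it has stayed in the viewfinder
    longer than DWELL_THRESHOLD_S, inferring user interest.

    `viewfinder.current_object()` is a hypothetical accessor that returns
    an identifier for the object currently in frame, or None."""
    current_obj = None
    seen_since = 0.0
    while True:
        obj = viewfinder.current_object()
        now = time.monotonic()
        if obj != current_obj:               # object changed or left the frame: restart timer
            current_obj, seen_since = obj, now
        elif obj is not None and now - seen_since >= DWELL_THRESHOLD_S:
            return obj                       # dwell threshold crossed: treat as trigger event
        time.sleep(poll_interval_s)
```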
In response to the trigger event, the portable electronic device 100 obtains an image of the object. In some embodiments, the trigger event and the image capture are tightly coupled; that is, the image capture serves as part of the trigger event. To illustrate, the trigger event may be the user's manipulation of a button to capture the image of the object, and thus it is presumed that the object is present in the captured image. In other embodiments, the portable electronic device 100 may have to capture one or more images of the local environment in response to the trigger event, and the one or more images are then analyzed to identify the presence of the object in at least one of the captured images. This analysis may be performed at the portable electronic device 100, or the portable electronic device 100 may transmit one or more of the captured images, along with one or more descriptors of the object, to a remote processing system (not shown) that performs the analysis.
With the presence of the object presumed or confirmed in the captured image, the portable electronic device 100 or the remote processing system then determines from the image(s) a predominant color of the object or other color associated with the object. This color may include, for example, a mean color or median color of the entire image containing the object. Alternatively, the color may include a mean color or median color of a certain region of the image. For example, under the assumption that users typically attempt to center a picture or captured image around an object of interest, the predominant color of a target region centered around a center point of the image may be used to determine the color associated with the object. In the event that the border of the object in the image is known or can be determined, the color can be determined as the predominant color of the pixels within this border (that is, the predominant color of the region of the image that represents the object). This color extraction process may be performed at the portable electronic device 100, or the image may be provided to a remote processing system for the color extraction process, such as in conjunction with the object recognition process described above.
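A minimal sketch of this color extraction, assuming the captured image is available as a NumPy RGB array; the function name and the (top, left, height, width) region convention are illustrative choices, not details from the disclosure:

```python
import numpy as np

def predominant_color(pixels, region=None, use_median=False):
    """Estimate a predominant color from an H x W x 3 RGB array.

    `region` is an optional (top, left, height, width) window, e.g. a
    target region centered in the image; when omitted, the whole image
    is used. Mean and median are the two estimators named in the text."""
    if region is not None:
        top, left, h, w = region
        pixels = pixels[top:top + h, left:left + w]
    flat = pixels.reshape(-1, 3).astype(np.float64)
    color = np.median(flat, axis=0) if use_median else flat.mean(axis=0)
    return tuple(int(round(c)) for c in color)

# Example: predominant color of a centered target region covering 1/9 of the image
# img = np.asarray(PIL.Image.open("capture.jpg").convert("RGB"))
# h, w = img.shape[:2]
# color = predominant_color(img, region=(h // 3, w // 3, h // 3, w // 3))
```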
With the color identified, the portable electronic device 100 configures one or more of the diffused light sources 106 and 108 to emit light approximating this color. In some embodiments, the diffused light sources 106 and 108 include LEDs of different colors, and the portable electronic device 100 controls the intensity of each LED so that the different colored lights emitted by the LEDs through a diffuser results in the emission of mixed light having the intended hue. In implementations whereby a diffused light source is located on a case accessory, the portable electronic device 100 may transmit an indicator of the color to the case accessory, which then may control its diffused light source to emit light having the indicated color.
This process may be illustrated using the example depicted in the accompanying drawings, in which an image 116 containing a painting 114 of interest is captured and provided by the portable electronic device 100 to a remote processing system.
The remote processing system performs an object recognition process to identify the presence of the painting 114 in the image 116, as well as the set of pixels of the image 116 that represents the painting 114. The remote processing system then analyzes the colors of the pixels of this set to identify a predominant color of the painting 114, which in this example is the color of a circle present in the painting 114 (this predominant color being indicated by hatched fill in the corresponding drawing). An indicator of this predominant color is then returned to the portable electronic device 100, which configures the diffused light sources 106 and 108 to emit light approximating this color.
As a variation of this example, the user instead may open a software application that provides this customization feature and then manipulate the portable electronic device 100 (or the imaging camera 112, if separate) such that the painting 114 is maintained within a target region of a viewfinder presented on the display screen 102 for sufficient time for the software application to register the painting 114 as an object of interest. The software application then may determine the average color of the pixels within the target region as the color associated with the painting 114 and configure the diffused light sources 106 and 108 to emit light approximating this color.
In addition to configuring the particular quality of light emitted by the diffused light sources 106 and 108, the portable electronic device 100 further can customize one or more aspects of a graphical user interface (GUI) presented via the display screen 102 based on the color associated with the object of interest. To illustrate, many operating systems and applications utilize color themes for their GUIs, and such color themes may be configured so as to incorporate the color associated with the object of interest as a primary color or accent color of the color theme for the GUI. Using the example described above, when the color of the painting 114 is determined, the portable electronic device 100 can configure the color scheme of the GUI of the operating system such that the buttons, icons, windows, backgrounds, and other graphical features of the GUI, such as icons 121, 122, and 123, incorporate the color of the painting 114 into their on-screen graphical representations.
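One plausible way to fold the extracted color into a GUI theme is to use it as the primary color and derive lighter and darker variants for backgrounds and accents; the sketch below does this in HLS space. The variant offsets and the returned structure are illustrative assumptions, not details from the disclosure.

```python
import colorsys

def derive_gui_theme(rgb):
    """Derive a simple GUI color theme from an extracted color.

    Returns the extracted color as the primary color plus a lighter tint
    and a darker shade for backgrounds and accents. The specific offsets
    are illustrative; the disclosure only requires that the theme
    incorporate the object's color as a primary or accent color."""
    h, l, s = colorsys.rgb_to_hls(*(c / 255.0 for c in rgb))
    def variant(lightness):
        return tuple(int(round(c * 255)) for c in colorsys.hls_to_rgb(h, lightness, s))
    return {
        "primary": tuple(rgb),
        "background": variant(min(1.0, l + 0.30)),  # lighter tint for widget backgrounds
        "accent": variant(max(0.0, l - 0.25)),      # darker shade for borders and accents
    }

# Example: theme derived from a blue extracted color
# derive_gui_theme((30, 144, 255))
```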
As depicted in the front view 201, the front surface of the portable electronic device 200 includes a display screen 202 disposed in a housing 204. The front surface further includes a “home” physical button 206 and a forward-facing diffused light source 208. In the illustrated example, the forward-facing diffused light source 208 includes a diffuser 210 extending across the front face above the display screen 202, whereby the diffuser 210 includes a diffusing lens, diffusing film, diffusing panel, or other diffusing structure to diffuse and mix light emitted by LEDs disposed underneath the diffuser 210 such that the light emitted through the diffuser 210 is substantially uniform in color. Likewise, the “home” physical button 206 may incorporate a light diffuser as part of the physical button component, as a ring surrounding the physical button component, or the like. Similarly, as depicted in the back view 203, the back surface of the portable electronic device 200 includes an imaging camera 212 (one embodiment of the imaging camera 112 described above) and a rear-facing diffused light source 216.
As illustrated in the display screen 202 in the front view 201, the portable electronic device 200 may execute an operating system or software application that provides a GUI 220 incorporating various graphical features that are colored or otherwise represented in accordance with a specified color scheme, which may specify one or more primary colors and one or more detail or accent colors. To illustrate, the GUI 220 may include a clock “widget” 221 and icons 222, 223, 224, and 225 that employ a primary color of the color scheme as their background colors, and an icon 226 that employs an accent color of the color scheme as a border or other accent.
As similarly described above, the portable electronic device 200 can employ customized diffused light color scheme control so that one or more diffused light sources, such as the physical button 206, the forward-facing diffused light source 208, and the rear-facing diffused light source 216, emit light approximating a color of an object of interest. Further, the portable electronic device 200 can customize the color theme of the GUI 220 so that one or more primary colors or one or more accent colors of the color theme approximate the color of the object of interest, thereby customizing the appearance of, for example, the clock widget 221 and the icons 222, 223, 224, 225, and 226 to match the object of interest.
The camera interface 310 is coupleable to the imaging camera 112. In implementations whereby the imaging camera 112 is implemented as part of the portable electronic device 100 (as illustrated by integrated imaging camera 332), the camera interface 310 may be implemented as, for example, a wired bus or other wired interconnect coupling the integrated imaging camera 332 to the processor 302 or other component. In implementations whereby the imaging camera 112 is implemented as a separate imaging camera (as illustrated by wireless camera accessory 324), the camera interface 310 can include, for example, a personal area network (PAN) wireless interface 326 of the portable electronic device 100, whereby the PAN wireless interface 326 wirelessly couples to the wireless camera accessory 324 using one or more PAN protocols. To illustrate, the wireless camera accessory 324 can include, for example, a wireless camera device that may be clipped onto a user's clothing or a camera-enabled smartwatch. The PAN wireless interface 326 thus may include, for example, a Bluetooth™ wireless interface, a ZigBee™ wireless interface, a wireless USB interface, and the like.
The diffused light source interface 312 is coupleable to one or more diffused light sources associated with the portable electronic device 100. In implementations whereby a diffused light source is disposed at the housing 104, such as the diffused light source 325, the diffused light source interface 312 may be implemented as, for example, a wired bus or other wired interconnect. In implementations whereby a diffused light source is disposed at a case accessory, such as the diffused light source 330 of the case accessory 333, the diffused light source interface 312 can include, for example, the PAN wireless interface 326.
The diffused light source 325 represents an example configuration of the one or more diffused light sources integrated at the portable electronic device 100. The diffused light source 330 at the case accessory 333 may be similarly configured. The diffused light source 325 includes a diffuser 340 overlying a plurality of LEDs or other light emitting devices, such as LEDs 341, 342, 343, and 344, and an LED controller 346. Each of the LEDs 341, 342, 343, and 344 emits light having a different dominant wavelength. For example, the LED 341 is implemented as a red (R) LED primarily emitting red light, the LED 342 is implemented as a green (G) LED primarily emitting green light, the LED 343 is implemented as a blue (B) LED primarily emitting blue light, and the LED 344 is implemented as a white (W) LED emitting broad-spectrum light. Other color combinations may be implemented, and the diffused light source 325 may implement more than one LED of any particular color or wavelength of light. To illustrate, for diffused light sources having relatively large diffusers, such as the rear-facing diffused light source 216 of the portable electronic device 200 described above, multiple LEDs of each implemented color may be distributed under the diffuser to provide more uniform light output.
The LED controller 346 receives data indicating the intensities intended for the LEDs and sends a corresponding signal to each LED in accordance with the intended intensities. The diffuser 340 mixes the emitted light to generate an output diffused light of substantially uniform color, whereby the particular color of the diffused light is based on the intensity of each corresponding input colored light. To illustrate, to emit light having a particular color, the LED controller 346 may receive an RGB value that indicates a separate intensity for each of the red, green, and blue color components. From this RGB value, the LED controller 346 may provide signals with different current levels or pulse width modulation (PWM) signals of different duty cycles so that the red LED 341, green LED 342, and blue LED 343 each emits its respective colored light at the intended intensity, such that the combined light, when mixed by the diffuser 340, has the intended color.
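A sketch of one way such a controller might map an RGB value to PWM duty cycles follows. The gamma correction and the strategy of offloading the common brightness component to the white LED are illustrative assumptions; the disclosure does not specify the controller's transfer function.

```python
def rgb_to_pwm_duty_cycles(rgb, gamma=2.2):
    """Map an 8-bit RGB value to per-LED PWM duty cycles (0.0-1.0) for R, G, B, W.

    A gamma curve is applied because LED light output is roughly linear in
    duty cycle while perceived brightness is not; the exact transfer function
    of the LED controller 346 is not given in the text, so this is an
    illustrative choice. The white LED is driven with the common (minimum)
    component so the colored LEDs supply only the remaining hue -- one
    plausible RGBW mixing strategy."""
    r, g, b = (channel / 255.0 for channel in rgb)
    white = min(r, g, b)                               # offload shared brightness to the W LED
    duties = [max(0.0, c - white) for c in (r, g, b)] + [white]
    return [round(d ** gamma, 4) for d in duties]      # gamma-correct each duty cycle

# Example: a saturated orange -> mostly-red duty, some green, no blue or white
print(rgb_to_pwm_duty_cycles((255, 140, 0)))
```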
In operation, the processor 302 executes a set of executable instructions stored at a computer readable storage medium, such as the system memory 304 or flash memory, whereby the set of executable instructions represents one or more software applications 350. The software application 350, when executed, manipulates the processor 302 to perform various software-based functionality to implement at least a portion of the techniques described herein, provide visual information via the display screen 102, respond to user input via the touchscreen 318 and other user input devices, and the like.
In the illustrated example, the software application 350 implements a color scheme controller 352 that provides the diffused light customization features described herein. To this end, the color scheme controller 352 includes an interest detection module 354, an image capture/color extraction module 356 (hereinafter, “color extraction module 356”), and a color scheme configuration module 358. Although the color scheme controller 352 is described herein as one or more processors 302 executing the software application 350, in other embodiments the color scheme controller 352 may be implemented in hard-coded logic or as a combination of hard-coded logic and one or more processors 302 executing the software application 350. For example, one or more modules of the color scheme controller 352 may be implemented as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or the like.
In response to detecting the trigger event, at block 406 the color extraction module 356 operates to obtain one or more images of the object of interest using the imaging camera 332, unless capture of an image is part of the trigger event. At block 408, the color extraction module 356 analyzes the one or more captured images to identify the object of interest, and at block 410 the color extraction module 356 determines the predominant color of the object of interest in the one or more captured images. In instances whereby the trigger event includes an image capture, the color extraction module 356 can assume that the object of interest is present in the image and is likely centered in the image or located within a target region of the image. In such instances, a search for the object within the image may not be necessary. In other embodiments, the trigger event may be based on an event other than an image capture of the object, and in such instances the color extraction module 356 may obtain multiple images of the local environment from different perspectives using the imaging camera 332, in case the imaging camera 332 was not directed at the object of interest when the trigger event was detected. In this situation, the color extraction module 356 may utilize one or more object recognition processes to search for and identify the presence of the object within one or more of the captured images. Techniques for determining a predominant color of the object from an image are described in greater detail below.
In one embodiment, the color extraction module 356 performs both the image analysis process and the color extraction process without remote support, while in other embodiments the color extraction module 356 outsources one or both of these processes to a remote processing system, such as an imaging analysis server 364.
With the predominant color so identified, at block 412 the color scheme configuration module 358 uses color information specified by the color extraction module 356 to customize one or more color schemes of the portable electronic device 100 or the case accessory 333 based on the predominant color. In some embodiments, this customization includes implementing a housing LED configuration 371 that controls one or more diffused light sources integrated at the portable electronic device 100 to emit light approximating the predominant color, transmitting a corresponding configuration to the case accessory 333 so that its diffused light source 330 does likewise, configuring a color theme of the GUI presented via the display screen 102 to incorporate the predominant color, or a combination thereof.
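Pulling the earlier sketches together, block 412 might look something like the following, where `led_controller`, `case_accessory`, and `gui` are hypothetical stand-ins for the housing LED controller, the PAN link to the case accessory, and the GUI theming facility:

```python
def apply_color_scheme(color, led_controller, case_accessory=None, gui=None):
    """Fan the extracted color out to each configurable sink (block 412).

    Reuses rgb_to_pwm_duty_cycles() and derive_gui_theme() from the sketches
    above; all three sink interfaces are assumed, not taken from the text."""
    led_controller.set_duty_cycles(rgb_to_pwm_duty_cycles(color))  # housing LEDs
    if case_accessory is not None:
        case_accessory.send_color(color)             # accessory mixes the color itself
    if gui is not None:
        gui.apply_theme(derive_gui_theme(color))     # primary/accent GUI colors
```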
Further, as noted above, the portable electronic device 100 may be a member of a group of portable electronic devices intending to have a coordinated color scheme. In such instances, at block 414 the color scheme configuration module 358 can communicate color coordination information (including the predominant color information) to other portable electronic devices via the wireless interfaces 306 and 326 so that the other portable electronic devices of the group can configure one or more of their light sources to emit light approximating the predominant color in the manner described above.
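The disclosure does not name a transport for this coordination indicator, only that it is sent via the device's wireless interfaces; as one hedged possibility, a small JSON payload over UDP broadcast would suffice, as in the sketch below (the port number and message format are assumptions):

```python
import json
import socket

GROUP_PORT = 52432  # arbitrary port chosen for this sketch

def broadcast_group_color(color, group_id):
    """Send a color-coordination indicator to other devices in the group.

    A UDP broadcast of a small JSON payload is used purely for illustration;
    the text does not specify the transport or message format."""
    payload = json.dumps({"group": group_id, "rgb": list(color)}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(payload, ("255.255.255.255", GROUP_PORT))

def handle_group_color(payload, my_group, led_controller):
    """Receiving side: adopt the color if the message targets our group."""
    msg = json.loads(payload.decode("utf-8"))
    if msg.get("group") == my_group:
        led_controller.set_duty_cycles(rgb_to_pwm_duty_cycles(tuple(msg["rgb"])))
```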
At block 502, the color extraction module 356 determines the manner in which an object is represented in the captured image: as a detected object with a known boundary in the captured image, as likely present in a target region of the captured image, or as being at an undetermined location within the captured image. In the event that the boundary of the object within the captured image is known through object recognition analysis by the color extraction module 356 or the imaging analysis server 364, at block 504 the color extraction module 356 (or the imaging analysis server 364) can determine the predominant color of the object based on the color of the pixels of the image within this boundary. To illustrate, the color extraction module 356 can determine a mean or median pixel color of the pixels within the boundary as the predominant color.
In the event that the object is determined to be present in a target region of the captured image (such as a central region of the captured image when it is assumed that the user centered the object in a viewfinder when taking the image), at block 506 the color extraction module 356 or imaging analysis server 364 determines the predominant color of the object based on the color of the pixels in this target region. As with the example above, this predominant color can include a median or mean of the colors of the pixels within this target region. In the event that it is not assumed that the object is present in a specified target region, at block 508 the color extraction module 356 or imaging analysis server 364 may assume that the object of interest constitutes a large portion of the captured image and thus determine the predominant color of the object based on the pixels of the entirety of the captured image, such as by determining the predominant color as the mean or median of colors of every pixel in the captured image.
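The three branches of blocks 504-508 can be summarized in a short dispatch function, reusing the predominant_color() helper sketched earlier; the boolean-mask and region conventions are assumptions:

```python
import numpy as np

def object_color(image, boundary_mask=None, target_region=None):
    """Select the color-determination branch (blocks 504-508).

    image: H x W x 3 RGB array; boundary_mask: H x W boolean array marking
    pixels inside a recognized object boundary; target_region: a
    (top, left, height, width) window. The two optional inputs mirror the
    three cases described above."""
    if boundary_mask is not None:                  # block 504: known object boundary
        pixels = image[boundary_mask].astype(np.float64)
        return tuple(int(round(c)) for c in pixels.mean(axis=0))
    if target_region is not None:                  # block 506: assumed target region
        return predominant_color(image, region=target_region)
    return predominant_color(image)                # block 508: entire captured image
```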
With the predominant color so determined, the process returns to block 412 of method 400 with the customization of one or more color schemes of the portable electronic device 100 or related case accessory based on the predominant color as described above.
Much of the inventive functionality and many of the inventive principles described above are well suited for implementation with or in software programs. It is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein, will be readily capable of generating such software instructions and programs with minimal experimentation. Therefore, in the interest of brevity and minimization of any risk of obscuring the principles and concepts according to the present disclosure, further discussion of such software, if any, will be limited to the essentials with respect to the principles and concepts within the preferred embodiments.
In this document, relational terms such as first and second, and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms “comprises,” “comprising,” or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. An element preceded by “comprises . . . a” does not, without more constraints, preclude the existence of additional identical elements in the process, method, article, or apparatus that comprises the element. The term “another”, as used herein, is defined as at least a second or more. The terms “including” and/or “having”, as used herein, are defined as comprising. The term “coupled”, as used herein with reference to electro-optical technology, is defined as connected, although not necessarily directly, and not necessarily mechanically. The term “program”, as used herein, is defined as a sequence of instructions designed for execution on a computer system. A “program”, or “computer program”, may include a subroutine, a function, a procedure, an object method, an object implementation, an executable application, an applet, a servlet, a source code, an object code, a shared library/dynamic load library and/or other sequence of instructions designed for execution on a computer system.
The specification and drawings should be considered as examples only, and the scope of the disclosure is accordingly intended to be limited only by the following claims and equivalents thereof. Note that not all of the activities or elements described above in the general description are required, that a portion of a specific activity or device may not be required, and that one or more further activities may be performed, or elements included, in addition to those described. Still further, the order in which activities are listed is not necessarily the order in which they are performed. The steps of the flowcharts depicted above can be performed in any order unless specified otherwise, and steps may be eliminated, repeated, and/or added, depending on the implementation. Also, the concepts have been described with reference to specific embodiments. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present disclosure as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present disclosure.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any feature(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature of any or all the claims.