This disclosure relates to the display of images via a mobile device.
Multi-functional mobile devices, for example smart phones and tablet computers, have become increasingly popular with many consumers. Many such multi-functional devices include a display and any combination of hardware and/or software configured to control the presentation of images via the display. In some examples, a multi-functional device may include a graphics processing pipeline that includes hardware, software, or any combination of hardware and software to process images for presentation to a user.
Multi-functional mobile devices may further incorporate a variety of detection elements, e.g., sensors, to detect user input. For example, multi-functional mobile devices may include one or more accelerometers, gyroscopes, camera elements, ambient light sensors, and the like to detect various forms of user input. An accelerometer may detect device movement in space. A gyroscope may detect device orientation in space with respect to the ground. A camera element may capture images of a device's surroundings as directed by a user. An ambient light sensor may detect a level of ambient light in an optical environment of the device.
The instant disclosure is generally directed to techniques for improving a user experience when operating a mobile device. A mobile device may be configured to present images via a display of the device. One or more device sensors may be configured to detect characteristics of the device with respect to an optical environment of the device and correspondingly cause one or more images presented via the device display to be modified to reflect the optical environment. A user experience may be improved according to the techniques of this disclosure, because images presented via a mobile device display may appear more lifelike and animated. The techniques of this disclosure may further be used as an input mechanism for the detection of user input.
In one example, a method is described herein. The method includes rendering, by a graphics processing pipeline of a mobile device, an image presented by a display of the mobile device, wherein the image includes one or more properties. The method further includes identifying, using at least one sensor of the mobile device, a characteristic of a relationship between the mobile device and an optical environment of the mobile device. The method further includes providing, to the graphics processing pipeline, at least one indication of the characteristic of the relationship between the mobile device and the optical environment. The method further includes modifying, by the graphics processing pipeline, the one or more properties of the image presented on the display to reflect the characteristic of the relationship between the mobile device and the optical environment of the mobile device.
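By way of a concrete, non-limiting illustration only, the following Python sketch walks through that flow: render an image, identify an optical-environment characteristic from a sensor reading, provide an indication of that characteristic to the pipeline, and modify the image's properties accordingly. All class and function names, and the simple azimuth/brightness model, are assumptions made for illustration rather than the claimed implementation.

    from dataclasses import dataclass

    @dataclass
    class OpticalCharacteristic:
        """A detected characteristic of the device/optical-environment relationship."""
        light_azimuth_deg: float   # direction of the dominant detected light source
        illumination_lux: float    # detected ambient light level
        light_rgb: tuple           # approximate color of the detected light

    class GraphicsPipeline:
        """Stand-in for a graphics processing pipeline that renders the image."""
        def __init__(self):
            self.light_azimuth_deg = 45.0   # default virtual light position
            self.brightness = 1.0

        def render(self, image_id):
            # Rasterize/render the image using the current virtual-light parameters.
            return (f"{image_id}: lit from {self.light_azimuth_deg:.0f} deg, "
                    f"brightness {self.brightness:.2f}")

        def modify(self, c):
            # Move the virtual light and scale brightness to reflect the environment.
            self.light_azimuth_deg = c.light_azimuth_deg
            self.brightness = min(1.0, c.illumination_lux / 1000.0)

    def identify_characteristic():
        # Placeholder for camera / ambient-light / gyroscope readings.
        return OpticalCharacteristic(light_azimuth_deg=120.0,
                                     illumination_lux=400.0,
                                     light_rgb=(255, 244, 229))

    pipeline = GraphicsPipeline()
    print(pipeline.render("image"))             # rendered with the default virtual light
    pipeline.modify(identify_characteristic())  # indication of the detected characteristic
    print(pipeline.render("image"))             # re-rendered to reflect the environment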
According to another example, a mobile device is described herein. The mobile device includes a graphics processing pipeline configured to render an image at a display of the mobile device, wherein the image includes one or more properties. The mobile device further includes a sensor processing module configured to receive, from at least one sensor communicatively coupled to the mobile device, a characteristic of a relationship between the mobile device and an optical environment of the mobile device and provide, to the graphics processing pipeline, at least one indication of the at least one characteristic of the relationship between the mobile device and the optical environment of the mobile device. The mobile device further includes means for modifying the one or more properties of the image to reflect the at least one characteristic of the relationship between the mobile device and the optical environment of the mobile device.
According to another example, an article of manufacture comprising a computer-readable medium that stores instructions is described herein. The instructions are configured to cause a mobile device to render, by a graphics processing pipeline of the mobile device, an image presented by a display of the mobile device, wherein the image includes one or more properties. The instructions further cause the mobile device to identify, using at least one sensor of the mobile device, a characteristic of a relationship between the mobile device and an optical environment of the mobile device. The instructions further cause the mobile device to provide, to the graphics processing pipeline, at least one indication of the at least one characteristic of the relationship between the mobile device and the optical environment. The instructions further cause the mobile device to modify, by the graphics processing pipeline, the one or more properties of the image to reflect the at least one characteristic of the relationship between the mobile device and the optical environment of the mobile device.
The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
Mobile device 101 may be configured to present some images via display 102 that include features dependent in part on a relationship between the subject of the image and a virtual environment in which the subject of the image is disposed. One example of an image 112A that includes a feature dependent on a virtual optical environment is illustrated in
For typical mobile devices, an image such as image 112A would maintain its relationship with respect to a virtual light source (e.g., the light source “illuminating” the ball of image 112A from the upper right), regardless of an optical environment of the mobile device 101. For example, if one were to view image 112A on a mobile device, and move from outdoors on a sunny day to an indoor area with little or no light, the image presented via the display would remain the same with respect to the virtual light source, e.g., the shadow of
Mobile device 101 may include one or more sensors. For example, mobile device 101 may include one or more camera elements (image capture device(s)), ambient light sensors, accelerometers, gyroscopes, or the like. In the example of
Mobile device 101 may utilize one or more sensors to determine or identify characteristics of a relationship between device 101 and an optical environment of device 101 (e.g., light source 104). In some non-limiting examples, mobile device 101 may detect one or more characteristics of an optical environment such as a position of device 101 with respect to one or more sources of light (e.g., light source 104), an orientation of device 101 with respect to one or more sources of light, an intensity of light detected from one or more light sources, and/or a color (e.g., wavelength) of detected light. In response to detection of optical environment characteristics, device 101 may present an image (e.g., image 112A) via display 102 consistent with the detected optical environment characteristics.
For example, according to
Device 101 may further be configured to detect changes in a relationship between device 101 and an optical environment of device 101. For example, as shown in the
As also shown in
The example depicted in
Furthermore,
For example, device 101 positioning depicted in
The techniques of this disclosure may provide for a generally improved user experience when operating a mobile device 101. For example, images presented or modified according to the techniques of this disclosure may appear more lifelike and/or fun for a user. In addition, a device operated according to the techniques of this disclosure may provide for additional input mechanisms for detection of user input, as described in further detail below with respect to
As shown in
As also shown in
As also shown in
As shown in the example of
For example, sensor processing module 226 may receive, from one or more camera elements 221, one or more signals indicative of images captured by camera elements 221. Sensor processing module 226 may analyze, process, and/or compare signals indicative of captured images to determine characteristics and/or changes in characteristics of an optical environment of device 201. For example, sensor processing module 226 may analyze an image to estimate and/or determine a position of a light source in a captured image.
Sensor processing module 226 may also or instead compare captured images to determine changes in an optical environment. For example, sensor processing module 226 may compare illumination in two or more captured images to determine that device 201 has changed position or orientation with respect to one or more light sources that affect an optical environment of device 201. Various other characteristics of an optical environment of device 201 may also or instead be determined, via one or more output signals from one or more of sensor elements 220, alone or in combination.
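As a rough, hypothetical sketch of this kind of image analysis (a captured frame is modeled here as a 2-D grid of brightness values, and the brightness-weighted-centroid heuristic is an assumption, not the disclosed algorithm), the brightest region of a frame can suggest a light direction, and two frames can be compared to flag a change in the optical environment:

    import math

    def estimate_light_direction(pixels):
        """Estimate the direction of the dominant light source as the angle from the
        image center to the brightness-weighted centroid of the frame. `pixels` is a
        list of rows of brightness values in [0, 255]; 0 deg = right, 90 deg = up."""
        h, w = len(pixels), len(pixels[0])
        total = cx = cy = 0.0
        for y, row in enumerate(pixels):
            for x, value in enumerate(row):
                total += value
                cx += x * value
                cy += y * value
        cx, cy = cx / total, cy / total
        return math.degrees(math.atan2((h - 1) / 2 - cy, cx - (w - 1) / 2))

    def environment_changed(frame_a, frame_b, threshold_deg=20.0):
        """Flag a change in the optical environment when the estimated light direction
        moves by more than threshold_deg between two captured frames."""
        delta = abs(estimate_light_direction(frame_a) - estimate_light_direction(frame_b))
        return min(delta, 360.0 - delta) > threshold_deg

    bright_right = [[0, 64, 255], [0, 64, 255], [0, 64, 255]]
    bright_left = [[255, 64, 0], [255, 64, 0], [255, 64, 0]]
    print(round(estimate_light_direction(bright_right)))   # ~0 deg: light from the right
    print(environment_changed(bright_right, bright_left))  # True: the light source moved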
In another example, device 301 may be configured to detect an illumination level 342 of an optical environment of device 301. For example, sensor processing module 226 may receive from one or more camera elements 231 one or more indications of captured images, and process/analyze the one or more indications to determine an illumination level 342 of the captured image(s). In other examples, sensor processing module 226 may receive one or more direct indications of illumination levels from one or more ambient light sensors 232 to determine an illumination level of an optical environment of device 301.
In another example, device 301 may be configured to detect a coloring of light of an optical environment of device 301. For example, sensor processing module 226 may receive from one or more camera elements 231 one or more indications of captured images, and process/analyze the one or more indications to determine a color of objects in the captured images. Object coloring may indicate a color of light from one or more light sources of an optical environment of device 301.
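Continuing the same simplified model, with RGB triples standing in for camera output (the averaging heuristics below are assumptions for illustration only), an illumination level and an approximate light color could be extracted from a captured frame as follows:

    def illumination_level(pixels):
        """Mean luminance of a frame of (r, g, b) pixels, on a 0-255 scale,
        using the common Rec. 601 luma weights."""
        lumas = [0.299 * r + 0.587 * g + 0.114 * b for row in pixels for (r, g, b) in row]
        return sum(lumas) / len(lumas)

    def light_color(pixels):
        """Approximate the color of the illuminating light as the average color of the
        brightest quarter of the pixels (highlights tend to track the source color)."""
        flat = sorted((p for row in pixels for p in row), key=lambda p: sum(p), reverse=True)
        top = flat[: max(1, len(flat) // 4)]
        return tuple(round(sum(c[i] for c in top) / len(top)) for i in range(3))

    frame = [[(250, 240, 210), (120, 110, 90)],
             [(60, 55, 40),    (245, 235, 205)]]
    print(round(illumination_level(frame)))   # ~160: overall scene brightness
    print(light_color(frame))                 # warm (yellowish) light color estimate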
In other examples, the one or more detected characteristics may include indirect indications of a relationship between device 201 and an optical environment of device 201. For example, sensor processing module 226 may provide display control module 236 with one or more indications of a device orientation 346 (e.g., detected via one or more gyroscopes 223) or movement (e.g., detected via one or more accelerometers 234) in space, which may indirectly indicate an orientation of device 201 with respect to an optical environment of device 201 (e.g., one or more light sources).
In another example, device 301 may be configured to detect one or more indications of device positioning 344. For example, device 301 may be configured to determine a positioning of device 301 with respect to one or more light sources. For example, sensor processing module 226 may receive from one or more camera elements 231 one or more indications of captured images, and process/analyze the one or more indications to determine a positioning of device 301 with respect to an optical environment of device 301 (e.g., positioning of one or more light sources, such as the sun, with respect to device 301).
In another example, sensor processing module 226 may receive from one or more GPS units 225 one or more indications of a geographic position of device 301. According to these examples, one or more other indications of light source positioning (e.g., via one or more camera elements 231 or ambient light sensors 232 or, where the light source is the sun, a time of day) may be used in conjunction with the one or more indications of geographic position to determine a relative positioning of device 301 with respect to at least one light source of an optical environment of device 301.
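As one hedged illustration of combining geographic position with time of day, a first-order, textbook solar-position estimate could be compared against the device's detected orientation. The declination and hour-angle approximations below are standard, but the disclosure does not specify how such a computation would actually be performed:

    import math

    def solar_elevation_deg(latitude_deg, day_of_year, local_solar_hour):
        """First-order estimate of the sun's elevation above the horizon, using the
        standard approximations: declination ~ 23.44 * sin(360/365 * (N - 81)) degrees
        and an hour angle of 15 degrees per hour away from solar noon."""
        decl = math.radians(23.44) * math.sin(2 * math.pi * (day_of_year - 81) / 365.0)
        hour_angle = math.radians(15.0 * (local_solar_hour - 12.0))
        lat = math.radians(latitude_deg)
        sin_elev = (math.sin(lat) * math.sin(decl)
                    + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
        return math.degrees(math.asin(sin_elev))

    # Around solar noon at 40 degrees north near the June solstice the sun is high;
    # the same place and hour in late December gives a much lower sun.
    print(round(solar_elevation_deg(40.0, 172, 12.0)))   # ~73 degrees
    print(round(solar_elevation_deg(40.0, 355, 12.0)))   # ~27 degrees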
In another example, device 301 may be configured to determine movement 346 of device 301 with respect to an optical environment of device 301 (e.g., with respect to one or more light sources of an optical environment of device 301). For example, sensor processing module 226 may receive one or more indications of device 301 movement from one or more accelerometers 234, gyroscopes 233, or GPS 225 to determine device 301 movement. According to these examples, movement of device 301 may indicate a position and/or orientation of device 301 with respect to an optical environment of device 301, including one or more light sources.
In another example, device 301 may be configured to determine an orientation 348 of device 301 with respect to an optical environment of device 301. For example, sensor processing module 226 may determine one or more indications of device 301 orientation through processing/analysis of images captured by one or more camera elements 221. In another example, sensor processing module 226 may receive one or more indications from an accelerometer (e.g., orientation movement) or gyroscope (e.g., direct measurement of orientation) to determine an orientation of device 301 with respect to an optical environment of device 301.
Sensor processing module 226 may be configured to determine characteristics of an optical environment of device 301 based on one or more indications from the above-described sensors 220 alone or in combination. In one example, sensor processing module 226 may capture multiple images (e.g., from multiple camera elements 221, such as front and back camera elements of device 301) of an environment of device 301, and independently extract characteristics from the multiple images. According to this example, sensor processing module 226 may independently determine similar characteristics (e.g., illumination levels, shadowing, coloring) of the same or different objects of the device 301 optical environment, and determine one or more characteristics of the optical environment of device 301 based on both captured images. Determining an optical environment characteristic according to this example may improve accuracy.
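A toy sketch of such fusion, using a confidence-weighted average of two independent estimates, follows; the weighting scheme is an assumption made for illustration, not the disclosed method:

    def fuse_estimates(front, back):
        """Combine independent (value, confidence) estimates of the same optical
        characteristic from two camera elements into one weighted estimate."""
        (v1, w1), (v2, w2) = front, back
        return (v1 * w1 + v2 * w2) / (w1 + w2)

    # The front camera sees the light source directly (high confidence);
    # the back camera faces away and contributes a weaker estimate.
    front_illumination = (820.0, 0.9)   # (estimated lux, confidence)
    back_illumination = (640.0, 0.3)
    print(round(fuse_estimates(front_illumination, back_illumination)))  # ~775 lux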
In another example, sensor processing module 226 may be configured to determine optical environment characteristics based on indications from one or more other sensors in combination with photographic images captured by one or more camera elements (e.g., camera elements 221 depicted in
In still other examples, detection of a change in a device 301 environment may be used to trigger detection of other environment characteristics. For example, gyroscope, accelerometer, and/or GPS sensors may provide an indication that device 301 has changed position or orientation. Detection of a position/orientation change of device 301 may trigger sensor processing module 226 to operate one or more sensors (e.g., sensors 220 depicted in
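One possible form of such a trigger is sketched below; the threshold value and the callback structure are illustrative assumptions:

    class MotionTrigger:
        """Re-run the (comparatively expensive) optical-environment sensing only when
        a cheap motion sensor reports a large enough orientation change."""
        def __init__(self, resample, threshold_deg=10.0):
            self.resample = resample          # callable that re-reads camera / light sensors
            self.threshold_deg = threshold_deg
            self.last_orientation = None

        def on_orientation(self, azimuth_deg):
            if (self.last_orientation is None
                    or abs(azimuth_deg - self.last_orientation) >= self.threshold_deg):
                self.last_orientation = azimuth_deg
                self.resample()

    trigger = MotionTrigger(lambda: print("re-sampling optical environment"))
    trigger.on_orientation(0.0)    # first reading: sample
    trigger.on_orientation(3.0)    # small change: skip
    trigger.on_orientation(20.0)   # large change: sample again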
Referring back to
According to various techniques described herein, display control module 236 may be configured to receive, from sensor processing module 226, one or more indications of detected characteristics relevant to an optical environment of device 201, and correspondingly modify presentation of an image, e.g., properties of a still image or video, in response to the detected optical environment characteristic of device 201. In some examples, presentation (e.g., properties) of an image may be modified to reflect the same or similar optical environment characteristic detected for device 201. For example, where a detected characteristic indicates a shadow may be formed as a result of a relationship between device 201 and one or more light sources, an image may be presented with a shadow property that reflects the detected characteristic for device 201. In other examples, presentation of different properties of an image may be modified in light of a detected optical environment of device 201. For example, where a detected characteristic indicates a shadow would be formed as described above, a color, texture, light intensity, reflection, or other property may be modified in light of a detected device 201 optical environment characteristic.
As set forth above, sensor processing module 226 may be configured to determine optical environment characteristics and/or changes in optical environment characteristics for device 201. Display control module 236 may receive from sensor processing module 226 the one or more detected characteristics, and correspondingly cause one or more images presented via display 202 to be displayed with properties that reflect the determined one or more characteristics, or change displayed image properties to reflect the determined one or more characteristics. In some examples, display control module 236 may be configured to modify shadowing, shading, texture, reflection, or other image properties based on detected optical environment characteristics.
For example, where a reflective object in one or more captured images reflects another object in the device optical environment, display control module 236 may modify a displayed image to cause the displayed image to include the captured image of the reflective object. In one example, if the device camera element captures an image of a user standing in front of a mirror that reflects an image of the user, display control module 236 may cause a displayed image to include an image of the user.
In other examples, where a detected optical environment characteristic indicates a change in position or orientation of device 201 with respect to one or more light sources, display control module 236 may correspondingly modify the presentation of shadowing, shading, texture, or reflection in a displayed image, or modify a virtual light source (e.g., a location of a virtual light source) of a displayed image. In another example, where a detected optical environment condition indicates a particular light source color (or color of image reflection) of a device 201 optical environment, a color of a displayed image (or color of reflection of the displayed image) may be modified to reflect the detected color.
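For illustration, a small sketch of how a detected light direction might drive the placement of a drop shadow in a displayed image follows; the offset arithmetic is a deliberately simple assumption, and a real pipeline would compute shadowing per vertex or per fragment:

    import math

    def shadow_offset(light_azimuth_deg, shadow_length=30.0):
        """Place a drop shadow on the side of the object opposite the detected light.
        Returns a (dx, dy) screen offset in pixels, with +x right and +y down."""
        opposite = math.radians(light_azimuth_deg + 180.0)
        return (round(shadow_length * math.cos(opposite)),
                round(-shadow_length * math.sin(opposite)))  # flip y for screen coordinates

    # Light detected to the upper right (45 degrees): the shadow falls to the lower left.
    print(shadow_offset(45.0))    # (-21, 21)
    # The device turns so the light is now directly to the left: the shadow moves right.
    print(shadow_offset(180.0))   # (30, 0)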
As discussed above, display control module 236 may include a graphics processing pipeline 238 as is well known in the relevant arts. A graphics processing pipeline 238 as described herein may include any combination of hardware, software, or firmware configured to cause images to be presented via display 202. The graphics processing pipeline 238 may accept some representation of an image, and rasterize, or render, the image based on the input. A graphics pipeline 238 may operate based on one or more graphics modeling libraries. One non-limiting example of a graphics modeling library is OpenGL® made available by Silicon Graphics, Inc. Another non-limiting example of a graphics modeling library is Direct3D® made available by Microsoft®.
A graphics processing pipeline 238 may include a plurality of stages for translating a representation of an image (e.g., code defining characteristics of a particular image) into a rendered image based on image primitives such as those of a graphics library. In one non-limiting example, a graphics processing pipeline 238 includes transformation, per-vertex lighting, viewing transformation, primitive generation, projection transformation, clipping, scan conversion or rasterization, texturing, fragment shading, and display stages. According to the techniques of this disclosure, display control module 236 may be operative to affect one or more stages of a graphics processing pipeline 238 to reflect detected device optical conditions (e.g., from sensor processing module 226) as described above.
In some examples, display control module 236 may provide parameters or other information to a per-vertex lighting stage in which geometry of an image is lit according to defined locations of light sources, reflectance, and other surface properties, such that detected changes in device optical conditions may be reflected in properties of a displayed image. In other examples, display control module 236 may provide parameters or other information to a viewing transformation stage in which objects are transformed from 3D world space coordinates into a 3D coordinate system based on the position and orientation of a virtual camera. Other stages of a graphics processing pipeline 238 may also be configured to receive information as described above to modify rendering/rasterization of an image to reflect device 201 optical environment characteristics.
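For concreteness, the sketch below shows what a per-vertex lighting stage does with a light-direction parameter, using plain Lambertian (diffuse) shading in Python; an actual graphics processing pipeline 238 would perform this on dedicated hardware via a library such as OpenGL® or Direct3D®, and the vertex data here are illustrative assumptions:

    import math

    def normalize(v):
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def per_vertex_lighting(vertices, normals, light_dir, light_intensity=1.0):
        """Lambertian (diffuse) per-vertex lighting: each vertex brightness is
        intensity * max(0, N . L), where L points from the surface toward the light."""
        L = normalize(light_dir)
        shaded = []
        for v, n in zip(vertices, normals):
            N = normalize(n)
            diffuse = max(0.0, sum(a * b for a, b in zip(N, L)))
            shaded.append((v, round(light_intensity * diffuse, 3)))
        return shaded

    vertices = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
    normals = [(0, 0, 1), (0, 0, 1), (1, 0, 0)]

    # Virtual light straight ahead of the screen versus a light detected off to the side:
    print(per_vertex_lighting(vertices, normals, light_dir=(0, 0, 1)))
    print(per_vertex_lighting(vertices, normals, light_dir=(1, 0, 1)))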
Display control module 236 and sensor processing module 226 as described herein may include any combination of hardware, software, or firmware configured to operate as described above. For example, one or more of display control module 236 and sensor processing module 226 may include one or more program instructions (e.g., software) stored on a memory/storage module (e.g., memory/storage module 280 as depicted in
Unlike the example of
In the example of
In various examples, device 401 may be configured to utilize a change in an image optical characteristic caused by a user, such as a location of a shadow caused by a detected device optical environment characteristic (e.g., user modification of an orientation or position of device 401) as described herein, as a user input mechanism to cause one or more operations to be performed by device 401. For example, where device 401 is configured to operate a media player (e.g., a music and/or video player), a user may modify an optical environment of device 401 (e.g., a position or orientation of device 401 with respect to light source 404), to cause the music or video to be paused, skip to a subsequent track, or modify a playback volume or display intensity. Other examples are also contemplated. For example, a detected change in device optical environment characteristic may cause a device to execute a particular program, turn off or go to sleep, initiate a phone call, or operate a game.
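Purely as an illustrative sketch of that input mechanism (the gesture thresholds and the specific media-player commands are assumptions), a user-caused swing in the detected light direction could be mapped to playback actions:

    def light_gesture_command(previous_azimuth_deg, current_azimuth_deg):
        """Map a user-caused swing of the detected light direction (e.g., tilting the
        device relative to the light source) to a media-player command."""
        delta = (current_azimuth_deg - previous_azimuth_deg + 180.0) % 360.0 - 180.0
        if delta > 60.0:
            return "next track"
        if delta < -60.0:
            return "previous track"
        if abs(delta) > 25.0:
            return "pause/resume"
        return None   # small drift: not treated as user input

    print(light_gesture_command(45.0, 130.0))   # large swing in one direction -> "next track"
    print(light_gesture_command(45.0, 80.0))    # moderate swing -> "pause/resume"
    print(light_gesture_command(45.0, 50.0))    # tiny change -> None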
Other examples of device actuation in response to optical environment characteristics are also contemplated. For example, detected changes as described herein may cause various modifications of an image including color, texture, image positioning, orientation, or movement. Any or all changes to an image may be used as actuation mechanisms, alone or in combination. For example, a user may match up colors of an image with colors of a second, different image to cause a device 401 operation to be performed.
Various embodiments of the disclosure have been described. These and other embodiments are within the scope of the following claims.
This application is a continuation of U.S. application Ser. No. 12/955,577, filed Nov. 29, 2010, the entire content of which is incorporated herein by reference.
Relation | Number | Date | Country
--- | --- | --- | ---
Parent | 12/955,577 | Nov. 2010 | US
Child | 13/249,572 | | US