This invention relates to a method, an apparatus and a computer program product for providing a graphical user interface view on a display of an apparatus.
Portable computers, such as smart phones and tablet devices, are becoming more popular among users and are replacing conventional mobile phones due to their performance. These portable computers have a graphical user interface that includes graphical elements shown on the display of the apparatus to a user. The graphical user interface typically consists of a background, selection menus, text, icons and selectable buttons. When discussing graphical user interfaces, the term “theme” is used to define the general appearance (e.g. colouring) of the graphical user interface. The term “theme” is also used to describe the look and feel of the user interface.
Portable computers often have a built-in theme library comprising several brand-related themes with different colouring modes. These themes are selectable by a user in order to change the overall appearance of the graphical user interface. In addition to these pre-defined themes, pre-created themes can also be downloaded from a network service. There are also services that make it possible for a user to create his/her own theme and to download it to his/her portable computer. Such services list the elements used in the graphical user interface and let the user select the colouring for them. When the user is content with the selections, the theme is stored and downloaded to the portable computer.
It is realized that users increasingly wish to customize the appearance of the user interface of the portable computer according to their own desires. However, today's customization possibilities are based either on vendor-pre-defined themes or on theme creator applications that require a network connection and considerable computing power, as well as a great number of graphical elements that need to be varied between user interface views.
Thus, there is a need for a solution that generates a user interface view (look and feel and/or theme) dynamically and enables users to customize the graphical user interface easily with fewer pre-defined resources.
This invention is targeted to such a need: it provides a user interface view (look and feel and/or theme) dynamically and enables users to customize the graphical user interface easily with fewer pre-defined resources. In addition, the invention proposes creating the user interface view from layers among which the theme elements are divided. This makes the theme more dynamic compared to the solutions of the prior art.
The present invention relates to a method, an apparatus and a computer program product for creating a user interface view on a display.
According to a first aspect, there is provided a method for an apparatus having a display and a user interface view, wherein the user interface view comprises user interface components that are differentiated according to their characteristics into at least content components and effect components. The method comprises automatically creating a theme by means of a source image; adjusting the created theme automatically based on sensor data; and rendering the user interface view on the display according to the theme, wherein the theme defines a common appearance of the content components and the effect of at least one effect component.
According to a second aspect, there is provided an apparatus comprising a processing unit and a memory coupled to said processing unit, said memory configured to store computer program code and user interface data, wherein the processing unit is configured to execute the program code stored in the memory and to provide a user interface view on a display of the apparatus. The user interface view comprises user interface components that are differentiated according to their characteristics into at least content components and effect components. The apparatus is configured to automatically create a theme by means of a source image; adjust the created theme automatically based on sensor data; and render the user interface view on the display according to the theme, wherein the theme defines a common appearance of the content components and the effect of at least one effect component.
According to a third aspect, there is provided a computer program product comprising program code to be executed in an apparatus, wherein the computer program product comprises user interface data for providing a user interface view on a display of an apparatus. The user interface view comprises user interface components, which user interface components are differentiated according to their characteristics into at least content components and effect components. The computer program product comprises instructions for automatically creating a theme by means of a source image; adjusting the created theme automatically based on sensor data; and rendering the user interface view on the display according to the theme, wherein the theme defines a common appearance of the content components and the effect of at least one effect component.
According to an embodiment, the user interface components also comprise a background component configured to provide a background for the user interface view. According to an embodiment, the background component is defined according to the created theme. According to an embodiment, the content component comprises graphical elements, wherein the method comprises defining at least the colouring of the graphical elements according to the theme. According to an embodiment, the theme is created by one or more of the following steps: collecting a hue histogram of the hues of the source image, collecting a pixel histogram of pixel values, selecting the most common hue from the hue histogram as the dominant colour, finding the median value from the pixel histogram, blurring the source image, darkening the source image depending on the median value, using the source image as a background image for the theme, using the dominant colour as the theme colour, selecting one or more font colours from the hue histogram, and modifying the alpha value and/or colour of the effect component. According to an embodiment, the background component is adjusted by means of the sensor data. According to an embodiment, the effect component is a diffusing component, wherein the method comprises adjusting the user interface view with respect to the other user interface components by means of said diffusing component. According to an embodiment, the effect component is configured to create an effect for at least part of the content component. According to an embodiment, the effect component is configured to change illumination levels on the user interface view by means of bump mapping. According to an embodiment, the alpha channels of the effect component are adjusted. According to an embodiment, the alpha channels of the effect component are adjusted according to the state of a certain functionality. According to an embodiment, sensor data concerning the ambient environment is received, and the effect component is adjusted by means of said sensor data. According to an embodiment, the sensor data is received from one or more of the following: a magnetometer, an accelerometer, positioning means, a camera or a thermometer.
According to a fourth aspect, there is provided a method for an apparatus having a display and a user interface view, wherein the user interface view comprises user interface components. The method comprises automatically creating a theme by means of a source image, adjusting the created theme automatically based on sensor data, and rendering the user interface view on the display according to the theme, wherein the theme defines a common appearance of the user interface components.
According to a fifth aspect, there is provided a method for an apparatus having a display and a user interface view, wherein the user interface view comprises user interface components for defining the appearance of the user interface view, wherein the method comprises receiving sensor data concerning the ambient environment and adjusting at least part of the user interface components by means of said sensor data.
According to an embodiment, an effect of the user interface components is adjusted by means of the sensor data. According to an embodiment, the effect defines illumination for the user interface view. According to an embodiment, the sensor data relates to an ambient illumination level.
The invention is now described in more detail with reference to the drawings, in which
a illustrates an example of a layer providing an illumination area;
b illustrates an example of one illumination generating layer;
c illustrates an example of two illumination generating layers;
a illustrates an example of an effect component;
b illustrates an example of an effect component in a user interface view; and
The present invention is described next by using a smart phone as an example of the apparatus. However, the teachings of the present solution may also be utilized in other computing devices having a display and a graphical user interface. Examples of such devices are tablet and laptop computers.
The apparatus according to
The apparatus (1000) further comprises a memory (
According to an embodiment, the user interface view can be understood to be created from multiple user interface components. There can be different kinds of user interface components which can be classified according to their nature into various groups. In the example of
One of the component groups may relate to background components; such a component comprises the background image and can provide an illumination effect for the graphical elements and an illumination level for the user interface view. In
There can be yet other types of graphical user interface components used to create a user interface view, for example a diffusing component that is configured to blur (diffuse) the view with respect to the other user interface components. In
As said, the background component comprises the background image and may define the illumination level for the view. The background component may define bump map parameters to be used for the user interface view. For the purposes of the present invention, the bump mapping is implemented by software. However, in some embodiments, the bump mapping can be created by using the back light of the display. For example, if the back light is given by LEDs (Light Emitting Diodes) whose illumination can be controlled within a picture (sub-areas of the display or per pixel), the back light can be used as a bump map. The background image in the background component can be modified to form illumination-type effects in the user interface. For example, an area 3002 in the background image can be adjusted to have larger illumination values than normally. The modification of the illumination values would result in said area looking brighter than the other areas in the user interface. As an alternative way of changing illumination levels to create the digital light effect, bump mapping can be used.
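As an illustration only, such an area-wise illumination adjustment could look like the following sketch, assuming the background image is held as a NumPy array; the rectangle coordinates and the gain value are arbitrary example values, not taken from the embodiment.

```python
# Sketch: raise the illumination of one rectangular area of the background image.
# The coordinates and gain are illustrative only.
import numpy as np

def brighten_area(background, top, left, height, width, gain=1.3):
    """background: HxWx3 uint8 array; returns a copy with one area brightened."""
    out = background.astype(np.float32).copy()
    region = out[top:top + height, left:left + width]
    out[top:top + height, left:left + width] = np.clip(region * gain, 0, 255)
    return out.astype(np.uint8)
```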
Bump mapping is a technique by means of which the normal angle (as seen from the user's point of view) used for computing the brightness or shading of a pixel is changed slightly in order to provide an illumination effect for that pixel. The pixel itself is not obstructed, but its shading is modified. As a result of bump mapping, the pixel can appear to be moved closer to the user by a certain distance, e.g. creating a bump.
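A minimal software bump-mapping sketch is given below; the height map, the light direction and the Lambertian shading model are illustrative assumptions rather than the specific implementation of the embodiment.

```python
# Sketch of software bump mapping: derive perturbed normals from a height map
# and shade each pixel with a simple Lambertian term.
import numpy as np

def bump_map_shading(height_map, light_dir=(0.3, 0.3, 1.0), strength=2.0):
    """height_map: HxW float array in [0, 1]; returns HxW shading factors."""
    # Per-pixel surface gradients give the perturbed normal.
    dy, dx = np.gradient(height_map)
    normals = np.dstack((-strength * dx, -strength * dy, np.ones_like(height_map)))
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)

    light = np.asarray(light_dir, dtype=np.float32)
    light /= np.linalg.norm(light)

    # Dot product of normal and light direction = how brightly each pixel is lit.
    shading = np.clip(normals @ light, 0.0, 1.0)
    return shading  # multiply the pixel colours by this to obtain the bump effect
```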
As mentioned earlier, the diffusing component, also called a glossy glass component, is used to diffuse (blur) at least one other user interface component (in
The content component can comprise graphical elements, such as text, icons and menus.
In the following, a method for customizing the user interface components automatically is discussed. It is known to have a theme for graphical elements in the user interface. The theme defines the common appearance of the graphical elements, e.g. a common colouring for fonts, for icons, for the background, etc. The theme can be changed according to the moods of the user, whereby the appearance of the user interface is changed accordingly. It is realized that the theme creates the look and feel of the user interface.
Each of the above user interface components, independently of the implementation means, can have certain characteristics which can be customized. This customization basically refers to setting up/defining/configuring/creating a theme for the graphical user interface, which modifies the appearance of the graphical user interface so that the various user interface components share one or more certain visual factors defined by the theme. According to embodiments, in order to perform customization, an inputted or stored image is processed to obtain parameters by means of which the colouring scheme, illumination levels, dark levels, bright levels, glossy effects, etc. represented by the various user interface components can be modified when the user interface view is rendered.
The image (4010) is analysed to determine the colour space to be used for the graphical elements, for example in the content component. The analysis finds certain characteristics of the picture, for example contrast level changes, contours, areas with certain colours (or a certain colour space) in the colour map, e.g. a certain range of RGB (Red Green Blue) values or other data such as luminance or chrominance, areas with a certain brightness (from minimum brightness to maximum brightness), objects, symbols, etc.
For example, area 4020 is an area with certain grey levels that corresponds to a relative portion of the entire figure. An object 4030 is recognized from the image by using e.g. pattern recognition methods. Area 4040 is an area with a certain brightness level. According to an embodiment, the image analysis is carried out by the steps enumerated above in connection with the theme creation (e.g. collecting the hue and pixel histograms, selecting the dominant colour and finding the median pixel value).
The method can contain all the previous steps or any combination of the selected steps.
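As an illustration of these analysis and theme-creation steps, a minimal sketch is given below. It assumes a Pillow/NumPy environment; the function name, the blur radius and the darkening rule are illustrative choices rather than the specific implementation of the described method.

```python
# Minimal sketch of histogram-based theme creation, assuming Pillow and NumPy.
import numpy as np
from PIL import Image, ImageFilter
import colorsys

def create_theme(path):
    img = Image.open(path).convert("RGB")
    hsv = np.asarray(img.convert("HSV"))

    # Hue histogram: the most common hue becomes the dominant theme colour.
    hue_hist, _ = np.histogram(hsv[..., 0], bins=256, range=(0, 256))
    dominant_hue = int(np.argmax(hue_hist))
    theme_rgb = tuple(int(c * 255) for c in
                      colorsys.hsv_to_rgb(dominant_hue / 255.0, 0.8, 0.9))

    # Pixel-value histogram: the median value steers how much the image is darkened.
    gray = np.asarray(img.convert("L"))
    median_value = float(np.median(gray))
    darken_factor = 0.4 + 0.4 * (median_value / 255.0)   # exact rule is an assumption

    # Blur and darken the source image so it can serve as the theme background.
    background = img.filter(ImageFilter.GaussianBlur(radius=8))
    background = Image.eval(background, lambda v: int(v * darken_factor))

    return {"background": background,
            "theme_colour": theme_rgb,
            "median_value": median_value}
```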
It is realized that the theme applies customization to various user interface components, i.e. not only to the content component but also to the other components (the background component, the effect component, the diffusing component) that define effects for the user interface view.
Turning to
A theme creation method according to an embodiment is shown in
The theme can be further modified (5035) dynamically according to data (5030) received from sensors sensing the environment. The sensor data can be received from one or more of the following sensors: a magnetometer, an accelerometer, a positioning sensor, a camera, a temperature sensor. For example, a front camera can be used to determine the ambient illumination level and the direction of illumination, which information can be utilized for determining illumination effect parameters for the user interface components. When the theme content is ready, the theme (i.e. colours, illumination effects) is applied (5040) to all or some of the visual software components of the user interface view. It is to be noticed that each time sensor data is received, the created theme can be adjusted only with respect to some of the user interface components. For example, if it is noticed that the surrounding illumination level has decreased, only the illumination level defined by the background component is modified or the diffusing component is adjusted accordingly. However, there is no need to touch the content component or the overall colouring of the user interface view.
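As an illustration only, such selective adjustment could be sketched as follows. The theme is assumed to be held as a nested dictionary with background, diffusing and effect entries, and the 0..1 ambient light scale is an assumption; none of these names come from the embodiment.

```python
# Sketch: when new sensor data arrives, adjust only the affected parts of an
# already-created theme, leaving the content component and colouring untouched.
def adjust_theme(theme, ambient_light=None, orientation=None):
    if ambient_light is not None:
        # Darker surroundings -> lower the illumination defined by the background
        # component and soften the diffusing component; content stays as it is.
        theme["background"]["illumination_level"] = 0.3 + 0.7 * ambient_light
        theme["diffusing"]["blur_radius"] = 4 + 8 * (1.0 - ambient_light)
    if orientation is not None:
        # Orientation only affects the bump-map light direction of the effect layer.
        theme["effect"]["light_direction"] = orientation
    return theme
```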
Turning back to
According to an embodiment, the illumination levels of layer L1 can be altered according to sensor data of the apparatus (1000), such as data from the camera (1020), as mentioned above. For example, if the camera (1020) (or another illumination measurement means) indicates that the apparatus is being used in dark illumination conditions, the illumination level of layer L1 can be decreased. Similarly, if the apparatus is used in a bright illumination environment, the illumination level of layer L1 can be increased, so that the theme is modified dynamically. Additionally, sensor data such as accelerometer data can be used to detect the orientation of the apparatus, and the theme can be varied based on the orientation. For example, the direction of the bump mapping vectors can be changed to take the orientation of the apparatus into account (gravity effect).
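A minimal sketch of deriving a bump-mapping light direction from an accelerometer reading is given below; the axis conventions and the fallback value are assumptions, since the actual sensor coordinate system is platform-dependent.

```python
# Sketch: turn a raw accelerometer reading (gravity vector, in m/s^2) into the
# light direction used by the bump mapping of layer L1.
import numpy as np

def bump_direction_from_accelerometer(ax, ay, az):
    gravity = np.array([ax, ay, az], dtype=np.float32)
    norm = np.linalg.norm(gravity)
    if norm < 1e-6:            # free fall / unreliable reading: keep a default light
        return np.array([0.0, 0.0, 1.0])
    # The light is taken to come from "above" the device, i.e. opposite to gravity.
    return -gravity / norm
```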
Alpha channel values define the transparency of an object or a user interface component, e.g. a layer, a graphical element or a bitmap, where an alpha channel value of 0 relates to total transparency and an alpha value of 1 relates to non-transparency. When the diffuse layer L22 has an alpha channel value of 0.1, the content of the layers below it (i.e. L21) can be seen through the diffuse layer. If the diffuse layer L22 has an alpha channel value close to 1, the content of the layers below (i.e. L21) cannot be seen.
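The blending rule implied by these alpha values can be sketched as follows; the NumPy representation and the array shapes are assumptions made only for illustration.

```python
# Sketch of alpha compositing: the diffuse layer L22 is blended over the content
# below it with an alpha value (0 = fully transparent, 1 = fully opaque).
import numpy as np

def composite(lower, upper, alpha):
    """lower, upper: HxWx3 float arrays in [0, 1]; alpha: scalar or HxW map."""
    alpha = np.asarray(alpha, dtype=np.float32)
    if alpha.ndim == 2:                      # per-pixel alpha map
        alpha = alpha[..., None]
    return alpha * upper + (1.0 - alpha) * lower

# With alpha = 0.1 the content below the diffuse layer remains clearly visible;
# with alpha close to 1 the diffuse layer hides it almost completely.
```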
In order to create an illumination effect on area A of the content component L23, an effect component L24 can be located on top of the content component L23 (as shown with reference 6000). The effect component L24 can be dimensioned to be smaller than the entire picture to save computational power. It is appreciated that the effect component L24 can also cover the entire picture. Reference 6010 shows a user interface view with effect component L24.
a shows details of effect component L24 according to an embodiment. The effect component L24 can be a bitmap. The bitmap may contain visual data, e.g. triangle shapes as in
The black colour of the triangles indicates the colour of the effect component L24 (which can be any colour, e.g. white, red, green, blue, . . . ). The white triangles (and the other white areas) in the
In order to implement a smooth light box, an alpha channel (α) is applied to pixels of the effect component L24 as shown in
Basically, the alpha values create contour plots, with the alpha value (α) close to 1 in the middle and a decreasing alpha value as the radius from the middle of the effect component L24 increases. The alpha value (α) at the edge is zero or very close to zero.
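A minimal sketch of such a radial alpha falloff is given below; the Gaussian form factor and the parameter names are illustrative choices, not the only possible form factor.

```python
# Sketch of the radial alpha falloff: alpha close to 1 in the middle of the
# effect component, decreasing towards the edges, zero (or near zero) at the edge.
import numpy as np

def radial_alpha(height, width, peak=1.0, falloff=0.4):
    y, x = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    # Normalised distance from the centre of the effect component.
    r = np.sqrt(((y - cy) / (height / 2.0)) ** 2 + ((x - cx) / (width / 2.0)) ** 2)
    alpha = peak * np.exp(-(r / falloff) ** 2)
    alpha[r >= 1.0] = 0.0
    return alpha
```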
c illustrates an example of a user interface view 7000 having two effect components L24, L25. In this example the effect components L24, L25 are also light generating layers on top of the content component. Similarly, as in
According to the embodiments, some or all visual characteristics of said components (L21, L22, L_22_21, L23 and L24) are defined by the theme. The theme-related parameters can be created automatically from a source figure, as discussed earlier. For example, the form factor of the alpha channel function (see for example
As discussed, sensor data of the apparatus (1000), such as data from the camera (1020), can be used as input to alter the theme parameters related to alpha values. Sensor data can be used to take the illumination conditions of the environment into consideration. For example, if the camera (1020) (or another illumination measurement means) indicates that the apparatus is being used in dark illumination conditions, the alpha values of the applied alpha channel can be decreased. Similarly, if the apparatus is used in a bright illumination environment, the alpha channel values can be increased (i.e. made closer to 1).
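As a sketch only, such an adjustment could map an ambient light estimate onto a scaling of the alpha channel; the 0..1 light scale and the scaling range below are assumptions.

```python
# Sketch: scale the alpha values of an applied alpha channel with an ambient
# light estimate derived from the camera (normalised here to 0..1).
import numpy as np

def adapt_alpha_to_ambient_light(alpha, ambient_light):
    """Dark surroundings -> smaller alpha values, bright -> values closer to 1."""
    scale = 0.3 + 0.7 * float(np.clip(ambient_light, 0.0, 1.0))
    return np.clip(alpha * scale, 0.0, 1.0)
```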
Further, according to embodiments, the effect component L24 graphics (bitmap) can be modified using bump mapping. The orientation and characteristics of said bump mapping are parameters of the theme. The bump map parameters related to the bitmap in the effect component L24 can be altered, for example, to take the orientation of the device into account, i.e. the theme can be dynamically altered.
a illustrates yet another example of an effect component L26, the purpose of which is to highlight a graphical element on the content component by glowing. This can be implemented by applying alpha channels (α) to pixels of the effect component L26 as shown in
This kind of implementation is useful in a situation where the graphical element 8001 is an icon of a certain functionality. When the functionality is running, the alpha channels are applied to the effect component as shown in
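A minimal sketch of such a state-dependent glow is shown below; the component dimensions, the linear falloff and the function name are illustrative assumptions.

```python
# Sketch: toggle the glow of effect component L26 according to the state of the
# functionality behind the icon; when the functionality is not running, the
# alpha channel stays flat (zero) and the underlying element is shown unmodified.
import numpy as np

def glow_alpha(height, width, running, peak=0.8):
    if not running:
        return np.zeros((height, width), dtype=np.float32)
    y, x = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    r = np.sqrt(((y - cy) / (height / 2.0)) ** 2 + ((x - cx) / (width / 2.0)) ** 2)
    return np.clip(peak * (1.0 - r), 0.0, peak).astype(np.float32)
```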
The user interface components can be implemented in various ways. For example, when using OpenGL software libraries, the “layer-like” representation of the user interface is created by painting the elements in the graphics memory in a certain order. The order can be generated by painting the elements from “bottom to top”, or it can be such that only the visible elements are constructed in the display memory and further rendered.
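The bottom-to-top ordering can be sketched as follows with a simple software compositor; this only illustrates the painting order and is not an OpenGL implementation.

```python
# Sketch of "bottom to top" painting: each layer of the user interface view is
# composited over the result of the layers below it.
import numpy as np

def render_view(layers):
    """layers: list of (rgb HxWx3, alpha HxW) pairs, ordered bottom to top."""
    base_rgb, _ = layers[0]
    frame = base_rgb.astype(np.float32)
    for rgb, alpha in layers[1:]:
        a = alpha.astype(np.float32)[..., None]
        frame = a * rgb.astype(np.float32) + (1.0 - a) * frame
    return frame
```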
The present invention concerns a graphical user interface that is dynamic in the sense that it can be customized easily according to the user's desires and with fewer pre-defined resources. Different components of the graphical user interface provide different effects for the user interface view and also provide a state indication for a functionality without modifying the graphical element representing that functionality. The invention represents a substantial advancement compared to the user interface view customization methods of the related art, because the present solution does not need any input from the user other than a source image in order to automatically create an image-related appearance throughout the user interface views.
The present invention is not limited solely to the above-presented embodiments, but it can be modified within the teachings of the appended claims.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/FI2012/051141 | 11/20/2012 | WO | 00