The above, as well as additional features and advantages of the disclosed embodiments, will be better understood through the following illustrative and non-limiting detailed description of preferred embodiments, with reference to the appended drawings, where the same reference numerals will be used for similar elements, wherein:
FIGS. 8a and 8b illustrate an example of an image sending function.
In a first step, 100, a camera mode is entered by an apparatus, such as a mobile communication terminal. The mode of the apparatus may, for instance, be switched from communication mode to camera mode by a key input actuation, or by removing a lens cap of a camera comprised in the apparatus.
Thereafter, in a second step, 102, a keypad configuration of the apparatus is switched to a camera mode keypad configuration. For instance, the keypad configuration may be switched from a communication mode keypad configuration, i.e. a keypad configuration utilised when e.g. dialing a number, to the camera mode keypad configuration, i.e. a keypad configuration where camera-specific functions are associated with the keys of the keypad.
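Purely by way of illustration, such a camera mode keypad configuration could be thought of as a mapping from keys to camera-specific functions. The sketch below assumes this representation; the names KeypadConfiguration, KeyAction and cameraModeConfiguration are illustrative only and are not part of the described apparatus.

    // Illustrative sketch only: a keypad configuration as a map from key labels
    // to the functions executed when the corresponding key is actuated.
    typealias KeyAction = () -> Unit

    data class KeypadConfiguration(val actions: Map<Char, KeyAction>) {
        fun onKey(key: Char) = actions[key]?.invoke()
    }

    // Hypothetical camera mode configuration: camera-specific functions are
    // associated with the numerical keys of the keypad.
    fun cameraModeConfiguration(
        imageFilter: KeyAction,     // e.g. BW filter, cf. step 106
        imageSending: KeyAction,    // e.g. e-mail/MMS sending, cf. step 210
        metaHandling: KeyAction,    // e.g. file name handling, cf. step 310
        imageDisplaying: KeyAction  // e.g. full screen, cf. step 410
    ) = KeypadConfiguration(
        mapOf('1' to imageFilter, '2' to imageSending, '3' to metaHandling, '4' to imageDisplaying)
    )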
Next, in a third step, 104, image data is received. The received image data may be present image data, temporarily stored image data, or stored image data.
The present image data is image data which is continuously updated by the camera comprised in the apparatus. Most often, the present image data is temporarily stored in a memory of the apparatus. Further, when the apparatus is in a camera mode, the present image data can be shown on a display of the apparatus.
If the user of the apparatus decides to store the present image data, he may, for instance, press a button on the apparatus. Then, the present image data is converted to temporarily stored image data. Before the temporarily stored image data is stored, the user may, for instance, decide in which folder the image data is to be stored. When the image data is stored by the user in a user-available memory, the image data is considered as stored image data.
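As a non-limiting sketch, the three kinds of image data described above may be regarded as states of the image data. The enum and transition function below are illustrative only.

    // Illustrative sketch of the three image data states described above.
    enum class ImageDataState { PRESENT, TEMPORARILY_STORED, STORED }

    // Present image data becomes temporarily stored when the user presses the
    // capture button, and stored once the user saves it to a user-available memory.
    fun nextState(state: ImageDataState): ImageDataState = when (state) {
        ImageDataState.PRESENT -> ImageDataState.TEMPORARILY_STORED
        ImageDataState.TEMPORARILY_STORED -> ImageDataState.STORED
        ImageDataState.STORED -> ImageDataState.STORED
    }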
Then, in a fourth step, 106, after having received image data, an image manipulating function is executed upon activation of a first key. Such an image manipulating function may be an image filter function, such as a BW (Black & White) image filter function, or in other words a greyscale image filter function, transforming color image data, such as RGB image data, to greyscale image data.
Next, in a fifth step, 108, the image data is manipulated using the image manipulating function.
For example, the first key may be a numerical “1” button of the apparatus, and the image manipulating function may be a BW image filter function. By pressing the “1” button, the image data, which is assumed to be color image data, is transformed into BW image data. Optionally, the image data may be re-transformed into color image data by pressing the “1” button once again.
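Purely by way of illustration, such a BW (greyscale) image filter function could compute a luminance value for each pixel. The packed 0xRRGGBB pixel format and the luminance weights used below are assumptions, not part of the description.

    // Illustrative BW (greyscale) image filter: each pixel is assumed to be a
    // packed 0xRRGGBB integer; the original color data is kept elsewhere so that
    // pressing the "1" button again can restore it (re-transformation).
    fun toGreyscale(colorPixels: IntArray): IntArray = IntArray(colorPixels.size) { i ->
        val p = colorPixels[i]
        val r = (p shr 16) and 0xFF
        val g = (p shr 8) and 0xFF
        val b = p and 0xFF
        // Common luma approximation; the exact weights are an assumption here.
        val y = (0.299 * r + 0.587 * g + 0.114 * b).toInt()
        (y shl 16) or (y shl 8) or y
    }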
Optionally, in a sixth step, 110, cursor control data may be received. Then, in a seventh step, 112, the cursor control data may be transformed into coordinate data, and, in an eighth step, 114, the coordinate data may be transmitted to the image manipulating function.
Such coordinate data may be useful in certain image manipulating functions. For example, the image manipulating function may be a text adding function configured to be executed when a first key is actuated. In a first step of such a text adding function, a text box may be added. Then, in a second step, a text may be written in the text box. Thereafter, in a third step, the text box may be placed in accordance with the received coordinate data.
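A minimal sketch of such a text adding function is given below, assuming the coordinate data arrives as an (x, y) pair; the names Coordinate, TextBox and addText are hypothetical.

    // Illustrative text adding function driven by coordinate data.
    data class Coordinate(val x: Int, val y: Int)
    data class TextBox(var text: String = "", var position: Coordinate = Coordinate(0, 0))

    fun addText(coordinate: Coordinate, text: String): TextBox {
        val box = TextBox()        // first step: a text box is added
        box.text = text            // second step: a text is written in the text box
        box.position = coordinate  // third step: the box is placed at the received coordinates
        return box
    }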
The first five steps of the flowchart, 200 to 208, correspond to the first five steps of the flowchart illustrated in
In a sixth step, 210, an image sending function is executed upon an actuation of a second key of the keypad.
In a seventh step, 212, image data may be sent using said image sending function.
For example, the second key may be a numerical “2” button of the apparatus, and the image sending function may be an e-mail sending function. By pressing the “2” button, a new e-mail may automatically be generated and the image data may automatically be transformed into an image attached to the generated e-mail. After a recipient e-mail address and an optional text have been added, the e-mail may be sent.
Optionally, when pressing the second key, the keypad configuration may be switched from camera mode keypad configuration to a text mode keypad configuration.
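Purely as an illustration, the effect of actuating the second key could be sketched as generating a draft message with the image data attached and, optionally, switching the keypad configuration to text mode. The names DraftEmail, KeypadState and KeypadMode are hypothetical, and no particular messaging API is implied.

    // Illustrative sketch of the image sending step: pressing the second key
    // generates a new draft e-mail with the current image data attached.
    enum class KeypadMode { CAMERA, TEXT }
    class KeypadState(var mode: KeypadMode = KeypadMode.CAMERA)

    data class DraftEmail(
        var recipient: String = "",
        var body: String = "",
        val attachments: MutableList<ByteArray> = mutableListOf()
    )

    fun onSecondKey(imageData: ByteArray, keypad: KeypadState): DraftEmail {
        keypad.mode = KeypadMode.TEXT  // optional: switch to a text mode keypad configuration
        return DraftEmail(attachments = mutableListOf(imageData))
    }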
The first five steps of the flow chart, 300 to 308, correspond to the first five steps of the flow chart illustrated in
In a sixth step, 310, a meta information handling function is executed upon an actuation of a third key of the keypad.
In a seventh step, 312, meta information associated with the image data may be handled using the meta information handling function.
For example, the third key may be a numerical “3” button of the apparatus, and the meta information handling function may be a file name handling function. By pressing the “3” button, a text box may appear on the display allowing the user to enter a name of the image file. If no image file exists, i.e. the image data is present image data or temporarily stored image data, the file name may be temporarily stored and added to the next stored image file.
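A non-limiting sketch of such a file name handling function is given below, assuming that an entered name is kept temporarily until the next image file is stored; all names are illustrative.

    // Illustrative file name handling: if no image file exists yet, the entered
    // name is kept temporarily and applied to the next image file that is stored.
    class FileNameHandler {
        private var pendingName: String? = null

        fun onThirdKey(enteredName: String, existingFileName: String?): String? =
            if (existingFileName != null) {
                enteredName                // rename the existing image file
            } else {
                pendingName = enteredName  // keep the name for the next stored image file
                null
            }

        fun onImageStored(defaultName: String): String {
            val name = pendingName ?: defaultName
            pendingName = null
            return name
        }
    }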
The first five steps of the flow chart, 400 to 408, correspond to the first five steps of the flow chart illustrated in
In a sixth step, 410, an image displaying function is executed upon an actuation of a fourth key of the keypad.
In a seventh step, 412, the image data may be displayed using the image displaying function.
For example, the fourth key may be a numerical “4” button of the apparatus, and the image displaying function may be a full screen function. Hence, by pressing the “4” button, the image data may be shown on the entire display. Optionally, by pressing the “4” button once again, the full screen mode is left and the prior mode is entered.
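As a simple illustration, the full screen function could toggle between a full screen mode and the prior display mode, as sketched below with hypothetical names.

    // Illustrative toggle for the image displaying function bound to the "4" button.
    enum class DisplayMode { NORMAL, FULL_SCREEN }

    class DisplayState(var mode: DisplayMode = DisplayMode.NORMAL) {
        private var priorMode: DisplayMode = DisplayMode.NORMAL

        fun onFourthKey() {
            if (mode == DisplayMode.FULL_SCREEN) {
                mode = priorMode                // leave full screen, return to the prior mode
            } else {
                priorMode = mode
                mode = DisplayMode.FULL_SCREEN  // show the image data on the entire display
            }
        }
    }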
Optionally, in an eighth step, 414, cursor control data may be received. Then, in a ninth step, 416, the cursor control data may be transformed into coordinate data, and, in a tenth step, 418, the coordinate data may be transmitted to the image displaying function.
Such coordinate data may be useful in image displaying functions. For example, the image displaying function may be a zoom in function configured to be executed when a fourth key is actuated. In a first step of such a zoom in function, a cursor is located in accordance with the received coordinate data. Then, in a second step, the area around the cursor is zoomed in.
The amount of zoom may be determined by the number of times the button has been pressed, or, alternatively, when the zoom in function is entered, the keypad configuration may be set to a zoom in keypad configuration in which, for example, the button “1” corresponds to 100% zoom (i.e. no zoom), the button “2” corresponds to 200% zoom, the button “3” corresponds to 300% zoom, and so on.
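By way of illustration, such a zoom in keypad configuration could map the numerical buttons to zoom factors as sketched below; the exact mapping and the press-count progression are assumptions made for the example only.

    // Illustrative zoom in keypad configuration: button "1" -> 100% (no zoom),
    // button "2" -> 200%, button "3" -> 300%, and so on.
    fun zoomPercentForKey(key: Char): Int? =
        if (key in '1'..'9') (key - '0') * 100 else null

    // One possible interpretation of the press-count alternative: each press of
    // the button increases the zoom by a further 100%.
    fun zoomPercentForPressCount(presses: Int): Int = 100 + 100 * presses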
Alternatively, the cursor may be used to select a sub-area of the image. In a next step, the sub-area of the image may be shown on the display. The result of such a zoom in function may be the same as the result of an image cropping function illustrated in
First, original color image data 500 is present on the display of the apparatus. Then, after the first key of the keypad has been actuated, a BW image filter function is executed and the image data is transformed to BW image data 502.
Optionally, if the user changes his mind, the first key may be actuated once again, and the BW image data 502 may be re-transformed into the original color image data 500. Since the color image data contains more information than the BW image data, the color image data has to be stored in order to enable this re-transformation.
First, a sub-area of the original image data 600 may be marked with the help of a cursor. Then, using the image cropping function, image data within the marked sub-area may be set to be new image data 602.
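A minimal sketch of such an image cropping function is given below, assuming the image data is a row-major pixel array and the marked sub-area is an axis-aligned rectangle; all names are illustrative.

    // Illustrative image cropping: the pixels inside the marked sub-area become
    // the new image data. The image is assumed to be a row-major IntArray.
    data class SubArea(val left: Int, val top: Int, val width: Int, val height: Int)

    fun crop(pixels: IntArray, imageWidth: Int, area: SubArea): IntArray {
        val out = IntArray(area.width * area.height)
        for (row in 0 until area.height) {
            for (col in 0 until area.width) {
                out[row * area.width + col] =
                    pixels[(area.top + row) * imageWidth + (area.left + col)]
            }
        }
        return out
    }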
First, a text box is added to the original image data, and, next, a text is added to the text box. By using a cursor, the location of the text box may be determined. Once the text and the location of the text box have been determined, the text, and optionally the text box, may be included in the image data.
FIGS. 8a and 8b illustrate an example of an image sending function, more particularly an MMS sending function.
First, as illustrated
Second, as illustrated in
Alternatively, the contacts may be presented in a contact list, which may be controlled using a cursor control device.
The apparatus 900 comprises a display 902, a camera 904, a controller 906, a mode controller 907, optionally a cursor controller 908, optionally a cursor control mechanism 910, and a keypad 912 comprising a first key 914, a second key 916, a third key 918 and a fourth key 920.
The display 902 may be an LCD configured to present a graphical user interface, as well as image data generated by the camera 904, to the user of the apparatus 900.
Generally, the controller 906 is configured to control the operation of the apparatus 900. More particularly, the controller 906 is configured to receive image data from the camera 904, coordinate data from the cursor controller 908 and key input actuation data from the keypad 912, and to transmit graphics data to the display 902. The controller 906 may comprise a processor and a memory.
The mode controller 907 may be configured to switch mode of the apparatus, e.g. switching to camera mode when the lens cap of the camera is removed. The mode controller 907 may be a software module comprised within the controller 906.
The cursor controller 908 is configured to receive cursor control output data from the cursor control mechanism 910, to transform this received cursor control output data to coordinate data, and to output the coordinate data to the controller 906.
The cursor controller 908 may be software implemented or hardware implemented, or a combination thereof, such as an FPGA circuit.
Further, the cursor controller 908 may be comprised within the controller 906.
The keypad 912 may comprise a first number of soft keys and a second number of character-related keys. The keypad may be configured according to the ITU-T standard.
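Purely by way of illustration, the cooperation between the components described above could be sketched as follows. The interfaces and names below are hypothetical and only mirror the data flows described in relation to the controller 906, the cursor controller 908, the camera 904 and the display 902.

    // Illustrative sketch of the apparatus components; the interfaces below are
    // hypothetical and do not limit the described apparatus.
    interface Camera { fun currentImage(): ByteArray }
    interface Display { fun show(graphics: ByteArray) }
    interface CursorControlMechanism { fun rawOutput(): Pair<Int, Int> }

    class CursorController(private val mechanism: CursorControlMechanism) {
        // Transforms cursor control output data into coordinate data.
        fun coordinateData(): Pair<Int, Int> = mechanism.rawOutput()
    }

    class ModeController { /* e.g. switches to camera mode when the lens cap is removed */ }

    class Controller(
        private val camera: Camera,
        private val display: Display,
        private val cursorController: CursorController?
    ) {
        fun onKeyActuation(key: Char) {
            // Dispatch to the function currently associated with the key
            // (e.g. image filter, image sending, meta information, displaying).
        }

        // Coordinate data may then be forwarded to e.g. an image manipulating function.
        fun onCursorMove(): Pair<Int, Int>? = cursorController?.coordinateData()

        fun refresh() = display.show(camera.currentImage())
    }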
In a first step, 1000, a camera mode is entered by an apparatus, such as a mobile communication terminal. The mode of the apparatus may, for instance, be switched from communication mode to camera mode by a key input actuation, or by removing a lens cap of a camera comprised in the apparatus.
Thereafter, in a second step, 1002, a keypad configuration of the apparatus is switched to a camera mode keypad configuration. For instance, the keypad configuration may be switched from a communication mode keypad configuration, i.e. a keypad configuration utilised when e.g. dialing a number, to the camera mode keypad configuration, i.e. a keypad configuration where camera-specific functions are associated with the keys of the keypad.
Next, in a third step, 1004, image data is received. The received image data may be present image data, temporarily stored image data, or stored image data.
The present image data is image data which is continuously updated by the camera comprised in the apparatus. Most often, the present image data is temporarily stored in a memory of the apparatus. Further, when the apparatus is in a camera mode, the present image data can be shown on a display of the apparatus.
Then, in a fourth step, 1006, after having received image data, a data amending function is executed upon activation of a first key.
Next, in a fifth step, 1008, the data is amended using the data amending function.
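As a non-limiting sketch, the data amending function of steps 1006 and 1008 generalises the image manipulating function described earlier; it may be regarded as an arbitrary function bound to the first key, for example:

    // Illustrative generalisation: the first key is bound to an arbitrary data
    // amending function rather than to a specific image manipulating function.
    typealias DataAmendingFunction = (ByteArray) -> ByteArray

    fun onFirstKey(data: ByteArray, amend: DataAmendingFunction): ByteArray = amend(data)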
The invention has mainly been described above with reference to a few embodiments. However, as is readily appreciated by a person skilled in the art, other embodiments than the ones disclosed above are equally possible within the scope of the invention, as defined by the appended patent claims.