This application claims priority under 35 U.S.C. §119 to an application filed in the Korean Intellectual Property Office on Mar. 14, 2013 and assigned Serial No. 10-2013-0027588, the entire contents of which are incorporated herein by reference.
The present disclosure relates generally to an apparatus and method for displaying an image in an electronic device, and more particularly to an apparatus and method for dynamically displaying one or more images in an electronic device based on the current state information of the electronic device.
Electronic devices, which have become necessities for modern people due to their easy portability, now include multimedia devices that provide a variety of services, such as voice and video communication functions, information input and output functions, and data transmission and reception.
Such electronic devices are provided with a display to display state information thereof, characters inputted by a user, moving pictures, and static pictures. As many electronic devices are provided with sensors to provide data on the velocity, location, altitude, and moving direction of the electronic device, a variety of services using these types of state information have been developed.
For example, an electronic device can monitor and provide the state of exercise of a user through a jogging program. Such an electronic device may simply display at least one of the current step count, the target step count, the distance traversed, the current velocity, and the calories consumed so far by the user, as text or as one or more gauges on the display of the electronic device.
Such electronic devices need a method for displaying state information dynamically.
The present disclosure addresses at least the above problems and/or disadvantages and provides at least the advantages described below. Accordingly, one object of the present disclosure is to provide an apparatus and method for displaying an image dynamically in an electronic device based on the state information of the electronic device.
Another object of the present disclosure is to provide an apparatus and method for dynamically displaying an image in an electronic device based on the current velocity of the electronic device.
Another object of the present disclosure is to provide an apparatus and method for magnifying and displaying an image in an electronic device based on the current velocity of the electronic device.
Another object of the present disclosure is to provide an apparatus and method for dynamically displaying an image in an electronic device based on the altitude of the electronic device.
Another object of the present disclosure is to provide an apparatus and method for magnifying and displaying an image in an electronic device based on the altitude of the electronic device.
Another object of the present disclosure is to provide an apparatus and method for changing and displaying slope of an image in an electronic device based on the altitude of the electronic device.
Another object of the present disclosure is to provide an apparatus and method for dynamically displaying an image in an electronic device based on the current velocity and altitude of the electronic device.
According to an aspect of the present disclosure, a method in an electronic device includes determining an image conversion weight for each of a plurality of images shown on a screen of the electronic device; determining a current velocity of the electronic device; and displaying each of the plurality of images based on the image conversion weight and the current velocity.
According to another aspect of the present disclosure, a method in an electronic device includes determining an image conversion weight for each of a plurality of images; determining a current velocity and altitude of the electronic device; and displaying each of the plurality of images based on the image conversion weight, the current velocity, and the altitude.
According to another aspect of the present disclosure, a method for displaying a dynamic image in an electronic device includes determining an image conversion weight for each of a plurality of images; determining at least one of a current velocity and current altitude of the electronic device at a first time and a second time; calculating at least one of a nonlinear velocity and a nonlinear altitude using the determined at least one of current velocity and altitude; and displaying each of the plurality of images based on the image conversion weight and the calculated at least one of a nonlinear velocity and nonlinear altitude.
According to another aspect of the present disclosure, an electronic device includes at least one processor; at least one sensor unit; and at least one non-transitory computer-readable medium having program instructions recorded thereon, the program instructions configured to have the at least one processor perform one or more steps of: determining an image conversion weight for each of a plurality of images, determining a current velocity of the electronic device, and displaying each of the plurality of images based on the image conversion weight and the current velocity.
According to another aspect of the present disclosure, an electronic device includes at least one processor; at least one sensor unit; and at least one non-transitory computer-readable medium having program instructions recorded thereon, the program instructions configured to have the at least one processor perform one or more steps of: determining an image conversion weight for each of a plurality of images, determining a current velocity and altitude of the electronic device, and displaying each of the plurality of images based on the image conversion weight and the current velocity and altitude.
According to another aspect of the present disclosure, an electronic device includes at least one processor; at least one sensor unit; and at least one non-transitory computer-readable medium having program instructions recorded thereon, the program instructions configured to have the at least one processor perform one or more steps of: determining an image conversion weight for each of a plurality of images; determining at least one of a current velocity and a current altitude of the electronic device at a first time and a second time; calculating at least one of a nonlinear velocity and nonlinear altitude using the determined at least one of current velocity and altitude; and displaying each of the plurality of images based on the image conversion weight and the calculated at least one of nonlinear velocity and nonlinear altitude.
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings in which:
Embodiments of the present disclosure are described herein with reference to the accompanying drawings. In the following description, detailed descriptions of well-known functions or constructions are omitted since they would obscure the disclosure in unnecessary detail. Also, the terms used herein are to be construed according to the technical field of the present disclosure. Thus, how terms are construed may vary depending on the user's or operator's intentions or practices. Therefore, the terms used herein must be understood as not limited to the descriptions made herein.
The present disclosure relates to a technology for displaying a dynamic image based on current state information of an electronic device.
In the following description, the electronic device may be a mobile communication terminal, a PDA, a Personal Computer (PC), a laptop, a smartphone, a netbook, a television, a Mobile Internet Device (MID), an Ultra Mobile PC (UMPC), a tablet PC, a navigation device, a smart TV, a digital camera, a refrigerator, a digital watch, or an MP3 player.
As shown in
The respective elements of the electronic device will be described.
The memory 110 includes a program storing unit 111 configured to store a program for controlling an operation of the electronic device 100, and a data storing unit 112 configured to store data generated during execution of a program. For example, the data storing unit 112 may separately store a first image 511, a second image 513, a third image 515, and a fourth image 517 for a “Walk mate” program 501, as shown in
The program storing unit 111 includes a Graphic User Interface (GUI) program 113, an image control program 114, and at least one application program 115. The programs included in the program storing unit 111 may be expressed as a set of instructions.
The GUI program 113 may include at least one software element for providing a graphic user interface on the display unit 160. For example, the GUI program 113 may include instructions for displaying application program information executed by the processor 122 on the display unit 160. As another example, the GUI program 113 may include instructions which display each of the images from the left to the right or from the right to the left through the image control program 114 and display the images on the display unit 160. As another example, the GUI program 113 may include instructions which display each of the images up and down through the image control program 114 and display the images on the display unit 160. In another example, the GUI program 113 may include instructions which magnify or reduce each of the images through the image control program 114 and display the magnified or reduced images on the display unit 160. In another example, the GUI program 113 may include instructions which display a slope of each of the images through the image control program 114 and display the images on the display unit 160.
The image control program 114 may include at least one software element for displaying each of the images based on the current velocity of the electronic device and the image conversion weight. For example, the image control program 114 determines the image conversion weight for each of the images, as shown in
Also, the image control program 114 may control each of the images to be displayed based on the current velocity and altitude of the electronic device and the image conversion weight. For example, the image control program 114 determines the image conversion weight for each of the images, as shown in
The application program 115 may include a software element for at least one application program installed in the electronic device 100.
The processor unit 120 includes a memory interface 121, at least one processor 122, and a peripheral device interface 124. The memory interface 121, the at least one processor 122, and the peripheral device interface 124 which are included in the processor unit 120 may be integrated in at least one integrated circuit or be implemented as separate elements.
The memory interface 121 controls access by elements such as the processor 122 and the peripheral device interface 124 to the memory 110.
The processor 122 controls the electronic device 100 to provide various services using at least one software program. In this regard, the processor 122 executes at least one program stored in the memory 110 to provide a service corresponding to the program.
The peripheral device interface 124 controls the connection between the input/output controller 150 of the electronic device 100 and the processor 122 and the memory interface 121.
The audio processing unit 130 provides an audio interface between the user and the electronic device 100 through a speaker 131 and a microphone 132.
The communication system 140 performs a communication function for audio communication and data communication. The communication system 140 may be divided into a plurality of communication service modules supporting different communication networks. For example, the communication networks may include, but are not limited to, a Global System for Mobile communication (GSM) network, an Enhanced Data rates for GSM Evolution (EDGE) network, a Code Division Multiple Access (CDMA) network, a Wideband CDMA (W-CDMA) network, a Long Term Evolution (LTE) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a wireless Local Area Network (LAN), a Bluetooth network, and a Near Field Communication (NFC) network.
The input/output controller 150 provides an interface between the display unit 160 and an input/output unit of the input unit 170, and the peripheral device interface 124.
The display unit 160 displays state information of the electronic device 100, characters inputted by a user, a moving picture, and a static picture. For example, the display unit 160 may display application program information executed by the processor 122 under the control of the GUI program 113. As another example, the display unit 160 may display each of the images provided from the image control program 114 from the left to the right or from the right to the left under the control of the GUI program 113. As another example, the display unit 160 may display each of the images provided from the image control program 114 up and down under the control of the GUI program 113. As another example, the display unit 160 may magnify or reduce and display each of the images provided from the image control program 114 under the control of the GUI program 113. As another example, the display unit 160 may change and display the slope of each of the images provided from the image control program 114 under the control of the GUI program 113.
The input unit 170 provides data input by a user to the processor unit 120 through the input/output controller 150. The input unit 170 may include a keypad including at least one hardware button, and a touch screen configured to sense contact information. For example, the input unit 170 may provide contact information including a finger touch sensed through the touch screen, a finger motion on the touch screen, and a finger release (i.e., removal from the touch screen surface) to the processor 122.
The sensor unit 180 provides sensing information generated by the electronic device to the processor 122 through the peripheral device interface 124. Herein, the sensor unit 180 may include at least one of a GPS receiver recognizing a motion or position of the electronic device, a terrestrial magnetism sensor, an acceleration sensor, and a pressure sensor.
As shown in
The image controller 200 may execute the image control program 114 stored in the program storing unit 111 to control each of the images to be displayed based on the current velocity of the electronic device and the image conversion weight. For example, the image controller 200 determines an image conversion weight for each of the images, as shown in
The image controller 200 may execute the image control program 114 stored in the program storing unit 111 to control each of the images to be displayed based on the current velocity and altitude of the electronic device and the image conversion weight. For example, the image controller 200 determines an image conversion weight for each of the images, as shown in
The application program operating unit 210 executes at least one program stored in the program storing unit 111 and provides a service according to the corresponding application program. The application program operating unit 210 may be provided with image information based on the current velocity and altitude of the electronic device and the image conversion weight.
The display controller 220 executes the GUI program 113 stored in the program storing unit 111 and controls a graphic user interface to be displayed on the display unit 160. For example, the display controller 220 controls information from an application program being executed by the processor 122 to be displayed on the display unit 160. As another example, the display controller 220 may control each of the images to be displayed from the left to the right or from the right to the left and displayed on the display unit 160 under the control of the image controller 200. As another example, the display controller 220 may control each of the images to be displayed up and down and displayed on the display unit 160 under the control of the image controller 200. As another example, the display controller 220 may control each of the images to be magnified or reduced and displayed on the display unit 160 under the control of the image controller 200. As another example, the display controller 220 may control the slope of each of the images to be changed and displayed on the display unit 160 under the control of the image controller 200.
In the above-described embodiment, the electronic device 100 uses the processor 122 including the image controller 200 to control a dynamic image to be displayed based on current state information of the electronic device.
In another embodiment, the electronic device 100 may include a separate image control module controlling a dynamic image to be displayed based on current state information thereof.
The description will be made using an example in which an exercise state of a user of the electronic device is being monitored and provided through a "Walk mate" program 501, as shown in
Referring to
Thereafter, the electronic device determines its current velocity in step 303. For example, the electronic device may detect its current velocity using a GPS receiver. As another example, the electronic device may detect the current velocity based on the step count of a user per unit time. In this embodiment, the electronic device recognizes the user's motion or lack thereof using an acceleration sensor.
Thereafter, in step 305, the electronic device displays each of the images based on the image conversion weight and the current velocity of the electronic device. For example, when the current velocity of the electronic device is “4”, the electronic device calculates the conversion rate of the first image 511 as “0.4” based on the weight “×0.1” 521 of the first image 511. Then, the electronic device displays the first image 511 from the right to the left of the display unit based on the conversion rate “0.4” of the first image 511. Also, the electronic device calculates the conversion rate of the second image 513 as “1.2” based on the weight “×0.3” 523 of the second image 513. Then, the electronic device displays the second image 513 from the right to the left of the display unit based on the conversion rate “1.2” of the second image 513. Also, the electronic device calculates the conversion rate of the third image 515 as “2” based on the weight “×0.5” 525 of the third image 515. Then, the electronic device displays the third image 515 from the right to the left of the display unit based on the conversion rate “2” of the third image 515. Also, the electronic device calculates the conversion rate of the fourth image 517 as “12” based on the weight “×3.0” 527 of the fourth image 517. Next, the electronic device displays the fourth image 517 from the right to the left of the display unit based on the conversion rate “12” of the fourth image 517.
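The weighted conversion-rate arithmetic in step 305 can be sketched as follows. This is only an illustration of the described calculation, not the claimed implementation; the layer names and the `conversion_rates` helper are hypothetical, while the weights (0.1, 0.3, 0.5, 3.0) and the velocity (4) are the example values from the text:

```python
# Illustrative sketch: each image layer scrolls at a rate equal to the
# device's current velocity times that layer's weight, so near layers
# (large weight) move faster than far layers (small weight), producing
# a parallax effect.
WEIGHTS = {
    "image_511": 0.1,  # farthest layer, slowest conversion
    "image_513": 0.3,
    "image_515": 0.5,
    "image_517": 3.0,  # nearest layer, fastest conversion
}

def conversion_rates(velocity, weights=WEIGHTS):
    """Return the per-layer conversion rate for the given velocity."""
    return {name: velocity * w for name, w in weights.items()}

rates = conversion_rates(4)
# image_511 converts at roughly 0.4 and image_517 at 12.0,
# matching the example values in the text.
```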
Additionally, when it is detected that the moving direction of the electronic device has changed, the electronic device may change the conversion direction of the images. For example, while each of the images may start out displayed from the right to the left based on the image conversion weight and the current velocity of the electronic device, if the moving direction of the electronic device changes, the electronic device may change the image conversion direction to the opposite direction, i.e., from the left to the right, and display the images.
Additionally, when the current velocity of the electronic device increases, the electronic device may display an image 531 in such a manner that a first region 541 is distorted and magnified to a second region 543, the second region 543 is distorted and magnified to a third region 545, and the third region 545 is distorted and magnified to a fourth region 547, as shown in
Referring to
After the application program is operated, the electronic device determines an image conversion weight for each of the images in step 403. The image conversion weight is a parameter for determining an image conversion rate according to a user's exercise state. It exploits the characteristic that, when the user of the electronic device walks on the road, the sky image 511 appears farthest from the user's eyes by having the slowest conversion rate, and the road image 517 appears closest to the user's eyes by having the fastest conversion rate. For example, as shown in
After the image conversion weight is determined, the electronic device determines the current velocity and altitude in step 405. For example, the electronic device may detect the current velocity using a GPS receiver. As another example, the electronic device may detect the current velocity based on the step count of a user per unit time. As another example, the electronic device may detect the current altitude using a pressure sensor. In this embodiment, the electronic device recognizes a user's motion or lack thereof using an acceleration sensor.
After the current velocity and altitude of the electronic device are determined, the electronic device calculates an image conversion rate and a slope of each of the images based on the current velocity and altitude in step 407. For example, when the current velocity of the electronic device is “4”, the electronic device calculates the conversion rate of the first image 511 as “0.4” based on the weight “×0.1” 521 of the first image 511, the conversion velocity of the second image 513 as “1.2” based on the weight “×0.3” 523 of the second image 513, the conversion velocity of the third image 515 as “2” based on the weight “×0.5” 525 of the third image 515, and the conversion velocity of the fourth image 517 as “12” based on the weight “×3.0” 527 of the fourth image 517.
As an example of a calculation of slope, when the altitude of the electronic device at the current time is higher than that at a previous time, the electronic device determines a first reference line 563 which has an ascending slope, as indicated by a first angle 561, compared to an imaginary reference line 559 forming a right angle with the left side 553 of the display unit 551, as shown in
Thereafter, the electronic device displays the image based on the image conversion rate and the slope in step 409. For example, the electronic device converts the first image 511 from the right to the left of the display unit based on the conversion rate "0.4" of the first image 511 and displays the converted image. Also, the electronic device converts the second image 513 from the right to the left of the display unit based on the conversion rate "1.2" of the second image 513 and displays the converted image. Also, the electronic device converts the third image 515 from the right to the left of the display unit based on the conversion rate "2" of the third image 515 and displays the converted image. Also, the electronic device converts the fourth image 517 from the right to the left of the display unit based on the conversion rate "12" of the fourth image 517 and displays the converted image. In addition, the electronic device tilts the angle of the fourth image 517 based on the first reference line 563, as shown in
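The slope direction used in step 409 follows from the altitude comparison described above; a minimal sketch, using a hypothetical `slope_direction` helper rather than the claimed implementation:

```python
def slope_direction(previous_altitude, current_altitude):
    """Return the tilt direction for the near image:
    +1 for an ascending reference line (altitude increased),
    -1 for a descending one (altitude decreased), and
     0 when the altitude is unchanged."""
    if current_altitude > previous_altitude:
        return 1
    if current_altitude < previous_altitude:
        return -1
    return 0
```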
Additionally, when it is detected that the moving direction of the electronic device has changed, the electronic device may change the conversion direction of the images. For example, while each of the images starts out converted and displayed from the right to the left based on the image conversion weight and the current velocity of the electronic device, when the moving direction of the electronic device changes, the electronic device changes the image conversion direction to the opposite direction, i.e., from the left to the right, and displays the images.
Additionally, when the current velocity of the electronic device has increased compared to that at a previous time, the electronic device may display an image 531 in such a manner that a first region 541 is distorted and magnified to a second region 543, the second region 543 is distorted and magnified to a third region 545, and the third region 545 is distorted and magnified to a fourth region 547, as shown in
Additionally, when the altitude of the electronic device has increased, the electronic device may recognize that a user of the electronic device is travelling on an uphill road. Therefore, the electronic device reduces the display of the third image 515 on the display unit 571 from the first region 573 as shown in FIG. 5E(a) to the second region 575 as shown in FIG. 5E(b). On the other hand, when the altitude of the electronic device has decreased, the electronic device may recognize that the user of the electronic device is traveling on a downhill road. Therefore, the electronic device magnifies the display of the third image 515 on the display unit 571 from the second region 575 as shown in FIG. 5F(a) to the third region 577 as shown in FIG. 5F(c).
The above-described electronic device calculates the image conversion rate and the slope of each of the images based on the current velocity and altitude of the electronic device. When the current velocity or altitude of the electronic device changes sharply, the electronic device may perform an abrupt image conversion based on the change in current velocity and altitude.
More specifically, when the current velocity or altitude of the electronic device is sharply changed, the electronic device may calculate the image conversion rate and the slope of each of the images by applying Equation (1) such that the image conversion rate or the slope is non-linearly accelerated, as shown in
TC = T0;
TC = TC + ratio × (T1 − TC), Equation (1)
where T0 613 is an initial velocity or an initial altitude, T1 615 is a current velocity or a current altitude, TC 621 is a calculated velocity or a calculated altitude, and ratio is a weight. In Equation (1), TC has an initial value equal to T0.
For example, when the weight is “0.2”, the initial velocity or the initial altitude is “5”, and the current velocity or the current altitude has abruptly increased to “10”, the electronic device uses a velocity or altitude value TC which is calculated by repeatedly applying Equation (1), rather than using the current velocity/altitude T1, i.e., “10”, which, when input in the methods of
TC = 5;
TC = 5 + 0.2 × (10 − 5) = 6;
TC = 6 + 0.2 × (10 − 6) = 6.8;
That is, when the velocity is sharply changed from an initial velocity or altitude “5” to a current velocity or altitude “10”, the velocity value or the altitude value used in the methods of
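The iteration of Equation (1) above can be sketched as follows, using the example values from the text (ratio 0.2, initial value 5, target 10); the `smoothed_values` helper is hypothetical:

```python
def smoothed_values(t0, t1, ratio, steps):
    """Repeatedly apply Equation (1): TC = TC + ratio * (T1 - TC).

    The returned sequence approaches T1 non-linearly instead of
    jumping to it in a single step, which avoids an abrupt change
    in the image conversion rate or slope.
    """
    tc = t0  # TC has an initial value equal to T0
    values = []
    for _ in range(steps):
        tc = tc + ratio * (t1 - tc)
        values.append(tc)
    return values

# With T0 = 5, T1 = 10, and ratio = 0.2, the first steps are
# 6.0, 6.8, 7.44, ..., as in the worked example above.
```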
Thus, the electronic device may calculate the image conversion rate or the slope of each of the images based on a nonlinear velocity or altitude value which is calculated using Equation (1).
In the above-described embodiment, the electronic device converts each of the images, such as the first image 511, the second image 513, the third image 515, and the fourth image 517, which are separated from one another, according to the exercise state of the user, as shown in
In another embodiment, the electronic device may divide the display unit thereof into a plurality of regions and control the dynamic appearance of data displayed in each of the plurality of regions according to the exercise state of the user thereof.
As described above, by displaying each of images based on the current velocity and the altitude of the electronic device, the user of the electronic device may be intuitively provided information according to the user's current moving state.
In some embodiments of the present disclosure, some or all of the components may be implemented or provided at least partially in firmware and/or hardware, including, but not limited to, one or more application-specific integrated circuits ("ASICs"), standard integrated circuits, controllers (including microcontrollers and/or embedded controllers) executing appropriate instructions, field-programmable gate arrays ("FPGAs"), complex programmable logic devices ("CPLDs"), and the like.
Some or all of the system components and/or data structures may also be stored as contents (e.g., as executable or other machine-readable software instructions or structured data) on a non-transitory computer-readable medium so as to enable or configure the computer-readable medium and/or one or more associated computing systems or devices to execute or otherwise use or provide the contents to perform at least some of the described techniques. Moreover, in alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the disclosure. Thus, embodiments of the disclosure are not limited to any specific combination of hardware circuitry and software.
The term “non-transitory computer-readable medium” as used herein refers to any medium that participates in providing instructions to a processor for execution, and may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media. Non-volatile media includes, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, or any other magnetic storage medium, a CD-ROM, DVD, and/or any other optical storage medium. Volatile media includes dynamic random access memory (“DRAM”), RAM, PROM, EPROM, FLASH-EPROM, and the like.
While the disclosure has been shown and described with reference to certain embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Therefore, the disclosure is not limited by the detailed description of the disclosure and is defined only by the appended claims and their equivalents, and all differences within the scope of the appended claims and their equivalents will be construed as being included in the present disclosure.
| Number | Date | Country | Kind |
|---|---|---|---|
| 10-2013-0027588 | Mar 2013 | KR | national |