1. Field
Apparatuses and methods consistent with exemplary embodiments relate to a mobile device and a control method thereof, and more particularly, to a mobile device capable of providing a haptic function and a control method thereof.
2. Description of the Related Art
A mobile device such as a cellular phone, a smart phone, a tablet personal computer (PC), etc. interfaces with a user through various methods, such as a visual method, an auditory method, etc. These interface methods also include a tactile method. For example, the mobile device reacts to a user's manipulation or operation and vibrates so that the user can feel tactile feedback, which is referred to as a haptic effect or haptic function.
A related art mobile device provides the haptic function only in a simple form, in which the mobile device vibrates when a user touches a screen or when there is an incoming phone call. Accordingly, it is desirable to provide the haptic function in a greater variety of forms to satisfy the demand of a user who wants more sensitive functions.
Meanwhile, various functions provided by a mobile device are achieved by software applications. For example, a user may download an application related to a desired function from a network or the like and install it on the mobile device to use the function. However, the operating system (OS), the platform, etc. of the related art mobile device do not support the various haptic functions needed by such an application.
Exemplary embodiments address at least the above problems and/or disadvantages and other disadvantages not described above. Also, an exemplary embodiment is not required to overcome the disadvantages described above, and an exemplary embodiment may not overcome any of the problems described above.
One or more exemplary embodiments provide a mobile device capable of providing a haptic function to satisfy a demand of a user who wants a more sensitive function, and a control method thereof.
Exemplary embodiments also provide a mobile device capable of supporting a function needed for an application to achieve a haptic function satisfying a user's demand, and a control method thereof.
According to an aspect of an exemplary embodiment, there is provided a mobile device providing a haptic function, the mobile device including a display unit which displays an image; a user input unit which receives an input of a user; a vibration unit which generates vibration for a tactile effect as the haptic function; and a control unit which includes a platform providing an application programming interface (API) corresponding to the haptic function and having a plurality of parameters, executes an application prepared using the API, determines a characteristic of the vibration on the basis of the plurality of parameters set up in the application, and controls the vibration unit to generate vibration having the determined characteristic.
The characteristic of the vibration may include at least one of a state, style and type of the vibration.
Each of the plurality of parameters may include a value for determining at least one of strength, time and frequency characteristics of the vibration.
The application may be downloaded from an external source and installed.
According to another aspect of an exemplary embodiment, there is provided a control method of a mobile device providing a haptic function, the control method including executing an application prepared using an API that is provided by a platform of the mobile device, has a plurality of parameters, and corresponds to the haptic function; determining a characteristic of vibration for a tactile effect as the haptic function on the basis of the plurality of parameters set up in the application; and generating vibration having the determined characteristic.
The characteristic of the vibration may include at least one of a state, style and type of the vibration.
Each of the plurality of parameters may include a value for determining at least one of strength, time and frequency characteristics of the vibration.
The application may be downloaded from an external source and installed.
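To make the above arrangement more concrete, the following is a minimal sketch, in Java, of what a haptic API exposing such parameters might look like. It is an illustration only: the class and member names (HapticEffect, intensity, durationMs, frequencyHz) are assumptions introduced for this example and do not represent the actual API provided by the platform of the exemplary embodiments.

```java
// Minimal, hypothetical sketch of a haptic API whose parameters map to the
// strength, time, and frequency characteristics of the vibration.
// All names and value ranges are assumptions for illustration only.
public final class HapticEffect {

    private final int intensity;    // strength of the vibration, e.g. 0..100
    private final int durationMs;   // time characteristic, in milliseconds
    private final int frequencyHz;  // frequency characteristic, in hertz

    private HapticEffect(int intensity, int durationMs, int frequencyHz) {
        this.intensity = intensity;
        this.durationMs = durationMs;
        this.frequencyHz = frequencyHz;
    }

    // An application could set up the plurality of parameters through a call such as this.
    public static HapticEffect of(int intensity, int durationMs, int frequencyHz) {
        return new HapticEffect(intensity, durationMs, frequencyHz);
    }

    public int getIntensity()   { return intensity; }
    public int getDurationMs()  { return durationMs; }
    public int getFrequencyHz() { return frequencyHz; }
}
```

Under these assumptions, an application prepared using such an API might request, for example, HapticEffect.of(90, 20, 250) for a short, strong, click-like vibration.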
The above and/or other aspects will become more apparent from the following description of exemplary embodiments, taken in conjunction with the accompanying drawings.
Certain exemplary embodiments are described in greater detail below with reference to accompanying drawings.
In the following description, like drawing reference numerals are used for like elements, even in different drawings. The matters defined in the description, such as detailed construction and elements, are provided to assist in a comprehensive understanding of exemplary embodiments. However, exemplary embodiments can be practiced without those specifically defined matters.
The communication unit 11 performs communication through a network. The contents and type of the communication performed by the communication unit 11 may vary depending on the use and function of the mobile device 1. For example, in the case of a telephone function, the communication unit 11 places a call to another device (not shown) for a telephone conversation. In the case of an Internet function, the communication unit 11 establishes an Internet connection with a predetermined server (not shown) to transmit/receive data. Further, the communication unit 11 may communicate with a peripheral device (not shown) through local communication such as Bluetooth, Wi-Fi, etc.
The display unit 12 displays an image representing the operation or state of the mobile device 1. The display unit 12 may display an image by using various display devices including, for example, a liquid crystal display (LCD), an organic light emitting diode (OLED) display, etc. The audio output unit 13 outputs audio representing the operation or state of the mobile device 1. The audio output unit 13 may include an audio processor (not shown) that processes an audio signal, and a loudspeaker (not shown) that outputs sound based on the processed audio signal.
The user input unit 14 receives a user's command. The user input unit 14 may receive a user's command in various forms, and may include a key input unit (not shown) that receives a user's command by a key input, and a touch input unit (not shown) that receives a user's command by a touch input. The touch input unit may include a touch screen provided in the display unit 12.
The vibration unit 15 generates vibration for a tactile effect to achieve a haptic function. The vibration generated by the vibration unit 15 may have various characteristics. The characteristics of the vibration may be controlled by the control unit 17.
The storage unit 16 is a non-volatile memory including, for example, a flash memory, a hard disk drive, etc., which stores data or programs needed for operating the mobile device 1. The power unit 18 supplies power for operating the mobile device 1. The camera unit 191 captures an image, and the audio input unit 192 may include a microphone or the like and receives audio. Some of the above-described elements, such as, for example, the camera unit 191, may be omitted from the mobile device 1 in consideration of its function or use.
The control unit 17 controls the operation of the elements in the mobile device 1. The control unit 17 may include a read only memory (ROM) 171 where a control program for performing various operations may be stored, a random access memory (RAM) 172 where the control program is at least partially loaded, and a central processing unit (CPU) 173 which executes the loaded control program. The control program of the control unit 17 may be stored in the storage unit 16 as well as in the ROM 171. The control program of the control unit 17 may include a plurality of programs.
As shown in the accompanying drawings, the API 231 of the platform 23 includes an API for the haptic function. In this exemplary embodiment, the API for the haptic function includes a plurality of parameters, which correspond to the vibration characteristics for the haptic function. In the application 24, the plurality of API parameters is set up for the haptic function. When the application 24 is executed, the control unit 17 analyzes the plurality of parameters set up in the application 24 and determines the characteristics of the vibration, thereby controlling the vibration unit 15 to generate vibration corresponding to the determined characteristics.
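As a rough illustration of this flow, in which the application sets up the parameters and the control unit analyzes them and controls the vibration unit, consider the following sketch, which reuses the hypothetical HapticEffect class from the earlier example. The VibrationUnit interface and the simple validation rules are assumptions made for illustration and do not describe the actual implementation of the platform 23 or the vibration unit 15.

```java
// Illustrative sketch only: how a platform-side controller might translate the
// parameters set up in an application into a command for the vibration hardware.
interface VibrationUnit {
    // Assumed hardware-facing call; not an actual driver interface.
    void vibrate(int intensity, int durationMs, int frequencyHz);
}

final class HapticController {

    private final VibrationUnit vibrationUnit;

    HapticController(VibrationUnit vibrationUnit) {
        this.vibrationUnit = vibrationUnit;
    }

    // Called when an executing application requests a haptic effect through the API:
    // the parameters are analyzed (here, merely clamped to valid ranges) to determine
    // the vibration characteristic, and the vibration unit is driven accordingly.
    void play(HapticEffect effect) {
        int intensity = clamp(effect.getIntensity(), 0, 100);
        int duration  = Math.max(0, effect.getDurationMs());
        int frequency = Math.max(0, effect.getFrequencyHz());
        vibrationUnit.vibrate(intensity, duration, frequency);
    }

    private static int clamp(int value, int min, int max) {
        return Math.min(max, Math.max(min, value));
    }
}
```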
According to an exemplary embodiment, the characteristics of the vibration may include at least one of the state, style, and type of the vibration for the haptic function. Below, Table 1 shows an example of the vibration characteristics.
As shown in Table 1, the plurality of parameters may be used in determining the state, style, and type of the vibration. In this exemplary embodiment, the plurality of parameters may include information about the strength, time, and frequency characteristics of the vibration. Table 2 shows an example of various parameters for determining the strength, time, and frequency characteristics of the vibration.
The state, style, and type of the vibration can be achieved in the various forms shown in Table 1 by variously setting the values of the parameters shown in Table 2 and combining all or some of them.
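Purely as an illustration of such combinations, the sketch below defines a few named presets built from the hypothetical parameters introduced earlier; the preset names and values are invented for this example and are not taken from Table 1 or Table 2.

```java
// Hypothetical presets showing how different combinations of the same parameters
// (strength, time, frequency) could yield distinct vibration styles or types.
// Names and values are illustrative assumptions only.
final class HapticPresets {

    static HapticEffect shortClick() { return HapticEffect.of(90, 20, 250); }   // brief, strong tap
    static HapticEffect softPulse()  { return HapticEffect.of(40, 120, 80); }   // gentle, longer pulse
    static HapticEffect longBuzz()   { return HapticEffect.of(70, 500, 150); }  // sustained alert

    private HapticPresets() { }
}
```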
In accordance with this exemplary embodiment, the values of at least some of the parameters shown in Table 2 are set up in the application 24, and thus vibration having a characteristic determined on the basis of the set parameters is generated when the application 24 is executed. Accordingly, it is possible to achieve various types of vibration as the haptic function, so that a user who wants more sensitive effects can be further satisfied. Also, it is possible to provide the haptic function in various forms through the API 231 of the platform 23, so that applications for achieving an enhanced haptic function can be more actively developed.
As described above, in accordance with an exemplary embodiment, it is possible to provide a haptic function which can satisfy a demand of a user who wants a more sensitive function.
It is also possible to support a function needed for an application to achieve a haptic function satisfying a user's demand.
The foregoing exemplary embodiments and advantages are merely exemplary and are not to be construed as limiting. The present teaching can be readily applied to other types of apparatuses. Also, the description of the exemplary embodiments is intended to be illustrative, and not to limit the scope of the claims, and many alternatives, modifications, and variations will be apparent to those skilled in the art.
This application claims priority from U.S. Provisional Application Nos. 61/265,923 and 61/265,939, filed Dec. 2, 2009, and Korean Patent Application No. 10-2010-0114478 filed Nov. 17, 2010, in the Korean Intellectual Property Office, the disclosures of which are incorporated herein by reference in their entireties.