Pursuant to 35 U.S.C. §119(a), this application claims the benefit of the earlier filing dates and right of priority to Korean Patent Application Nos. 10-2009-0105261 filed on Nov. 3, 2009, and 10-2009-0105262 filed on Nov. 3, 2009, the contents of which are hereby incorporated by reference in their entirety.
1. Field of the Invention
The present invention relates to a mobile terminal and corresponding method for displaying a 3D image using binocular disparity.
2. Discussion of the Related Art
Generally, terminals can be classified into mobile/portable terminals and stationary terminals. Mobile terminals can be further classified into handheld terminals and vehicle mount terminals. Mobile terminals also allow the user to perform a variety of functions such as capturing photos or moving pictures, playing music or moving picture files, playing games, watching broadcasts and the like. Thus, the mobile terminal functions as a multimedia player.
However, because the mobile terminal is small in size, it is sometimes difficult to operate or see the variety of different functions provided on the terminal.
Accordingly, one object of the present invention is to address the above-noted and other problems of the related art.
Another object of the present invention is to provide a mobile terminal and corresponding method for displaying a 3D image using binocular disparity.
Yet another object of the present invention is to provide a user interface including a GUI (graphic user interface) that is more convenient and more visible to a user by generating a 3D image using binocular disparity, allowing the user to set a 3D attribute for an image, displaying a 3D imaginary space, and changing the display of the imaginary space based on an inclination sensor.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, the present invention provides in one aspect a method of controlling a mobile terminal, and which includes receiving a selection signal from an input unit setting a 3D attribute to at least one item among a plurality of items to be displayed on a display of the mobile terminal; and turning on a switching panel unit positioned in front of the display, via a controller controlling the switching panel unit, when the at least one item is displayed on the display of the mobile terminal. Further, the switching panel unit displays left and right eye images of the at least one item such that the at least one item is viewed as a 3D image based on binocular disparity. The present invention also provides a corresponding mobile terminal.
In another aspect, the present invention provides a method of controlling a mobile terminal, and which includes displaying, on a display of the mobile terminal, only a first portion of a 3D imaginary spatial image pre-stored in a memory of the mobile terminal by turning on a switching panel unit of the mobile terminal; receiving an inclination detection signal indicating an inclination of the mobile terminal; and displaying, in response to the inclination detection signal, a second portion of the 3D imaginary spatial image on the display that is different from the first portion. The present invention also provides a corresponding mobile terminal.
Further scope of applicability of the present invention will become apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
The accompanying drawings, which are included to provide a further understanding of the disclosure and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the disclosure and together with the description serve to explain the principle of the disclosure. In the drawings:
A mobile terminal according to an embodiment of the present disclosure will be described in detail with reference to the accompanying drawings. The suffixes ‘module’, ‘unit’ and ‘part’ may be used for elements and may be used together or interchangeably. Embodiments of the present invention may also be applicable to various types of terminals such as mobile phones, user equipment, smart phones, DTV, computers, digital broadcast terminals, personal digital assistants, portable multimedia players (PMP) and/or navigators. The following description refers to a mobile terminal, although such teachings may apply equally to other types of terminals such as stationary terminals that include digital TVs and desktop computers.
In addition, the wireless communication unit 110 may be configured with several components and/or modules and in
For non-mobile terminals, the wireless communication unit 110 may be replaced with a wire communication unit. In addition, the wireless communication unit 110 and the wire communication unit may be commonly referred to as a communication unit. Further, the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast managing server via a broadcast channel. The broadcast channel may include a satellite channel and a terrestrial channel, and the broadcast managing server refers to a system that transmits a broadcast signal and/or broadcast associated information to a mobile terminal. The broadcast signal can include not only a TV broadcast signal, a radio broadcast signal and a data broadcast signal, but also a broadcast signal in which a TV or radio broadcast signal is combined with a data broadcast signal.
Examples of broadcast associated information include information associated with a broadcast channel, a broadcast program, a broadcast service provider, etc. Further, the broadcast associated information may be provided through a mobile communication network, and in this instance, the broadcast associated information may be received by the mobile communication module 112. For example, broadcast associated information may include an electronic program guide (EPG) of the digital multimedia broadcasting (DMB) system and an electronic service guide (ESG) of the digital video broadcast-handheld (DVB-H) system.
In addition, the broadcast receiving module 111 can receive broadcast signals transmitted from various types of broadcast systems such as the digital multimedia broadcasting-terrestrial (DMB-T) system, the digital multimedia broadcasting-satellite (DMB-S) system, the digital video broadcast-handheld (DVB-H) system, a data broadcasting system known as the media forward link only (MediaFLO®) and integrated services digital broadcast-terrestrial (ISDB-T). The receiving of multicast signals may also be provided, and data received by the broadcast receiving module 111 can be stored in the memory 160, for example.
In addition, the mobile communication module 112 can communicate wireless signals with one or more network entities (e.g. a base station, an external terminal, a server). The signals may represent audio, video, multimedia, control signaling, and data, etc. Further, the wireless Internet module 113 can support Internet access for the mobile terminal 100, and may be internally or externally coupled to the mobile terminal 100. Suitable technologies for wireless Internet include, but are not limited to, WLAN (Wireless LAN) (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), and/or HSDPA (High Speed Downlink Packet Access). The wireless Internet module 113 may also be replaced with a wire Internet module in non-mobile terminals. The wireless Internet module 113 and the wire Internet module can thus be referred to as an Internet module.
Further, the short-range communication module 114 is a module that facilitates short-range communications. Suitable technologies for short-range communication include, but are not limited to, radio frequency identification (RFID), infrared data association (IrDA), ultra-wideband (UWB), as well as networking technologies such as Bluetooth and ZigBee. In addition, the position-location module 115 can identify or otherwise obtain a location of the mobile terminal 100, and may be provided using global positioning system (GPS) components that cooperate with associated satellites, network components, and/or combinations thereof.
Also, the position-location module 115 can precisely calculate current 3D position information based on longitude, latitude and altitude by calculating distance information and precise time information from at least three satellites and then applying triangulation to the calculated information. The location and time information calculated using three satellites can then be corrected using another satellite. The position-location module 115 can also calculate speed information by continuously calculating a real-time current location.
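The distance-based position calculation described above can be illustrated with a minimal sketch. Assuming idealized, noise-free distances to three known reference points in two dimensions (a real GPS solution additionally solves for altitude and receiver clock bias, as the paragraph notes), one hypothetical implementation is:

```python
# Illustrative 2-D trilateration: recover a position from distances to
# three known anchor points. A simplified stand-in for the satellite-based
# calculation; names and the 2-D simplification are assumptions.
def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the first circle equation from the other two removes the
    # quadratic terms, leaving a 2x2 linear system A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21       # assumes non-collinear anchors
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

With anchors at (0, 0), (10, 0) and (0, 10) and distances measured from the point (3, 4), the sketch returns (3.0, 4.0).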
In addition, the audio/video (A/V) input unit 120 provides audio or video signal input to the mobile terminal 100, and in
Further, the microphone 122 can receive an external audio signal while the mobile terminal 100 is in a particular mode such as a phone call mode, a recording mode and/or a voice recognition mode. The received audio signal is then processed and converted into digital data. Also, the mobile terminal 100, and in particular the A/V input unit 120, may include a noise removing or noise canceling algorithm to remove noise generated while receiving the external audio signal. The data generated by the A/V input unit 120 can also be stored in the memory 160, utilized by the output unit 150, and/or transmitted via one or more modules of the wireless communication unit 110. Two or more microphones and/or cameras may also be provided.
In addition, the user input unit 130 generates input data responsive to user manipulation of an associated input device or devices. Examples of such devices include a keypad, a dome switch, a touchpad (e.g., static pressure/capacitance), a jog wheel and/or a jog switch. The sensing unit 140 can also provide status measurements of various aspects of the mobile terminal 100. For example, the sensing unit 140 can detect an opened/closed status or state of the mobile terminal 100, a relative positioning of components (e.g., a display and a keypad) of the mobile terminal 100, a change of position of the mobile terminal 100 or a component of the mobile terminal 100, a presence or absence of user contact with the mobile terminal 100, and/or an orientation or acceleration/deceleration of the mobile terminal 100.
The mobile terminal 100 may also be configured as a slide-type mobile terminal. In such a configuration, the sensing unit 140 can sense whether a sliding portion of the mobile terminal 100 is opened or closed. The sensing unit 140 can also sense a presence or absence of power provided by the power supply unit 190, a presence or absence of a coupling or other connection between the interface unit 170 and an external device, etc. In
In addition, the output unit 150 can generate an output relevant to a sight sense, an auditory sense, a tactile sense and/or the like. In
The display 151 may also include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3-dimensional display. Further, the display 151 may have a transparent or light-transmissive type configuration to enable an external environment to be seen through. This type of display is called a transparent display such as a transparent OLED (TOLED). A backside structure of the display 151 may also have the light-transmissive type configuration. In this configuration, a user can see an object located behind the terminal body through the area occupied by the display 151 of the terminal body.
Two or more displays 151 may also be provided. For example, a plurality of displays may be provided on a single face of the terminal 100 by being built in one body or spaced apart from the single face. Alternatively, each of a plurality of displays may be provided on different faces of the terminal 100. In addition, if the display 151 and a sensor for detecting a touch action (hereinafter referred to as a touch sensor) are constructed in a mutual-layered structure (hereinafter referred to as a touch screen), the display 151 may be used as an input device as well as an output device. For example, the touch sensor 142 may include a touch film, a touch sheet, a touchpad and/or the like.
The touch sensor 142 can also convert a pressure applied to a specific portion of the display 151 or a variation of electrostatic capacity generated from a specific portion of the display 151 to an electric input signal. The touch sensor can also detect a pressure of a touch as well as a position and size of the touch. If a touch input is provided to the touch sensor 142, signal(s) corresponding to the touch input can be transferred to a touch controller. The touch controller can then process the signal(s) and transfer corresponding data to the controller 180. The controller 180 can therefore determine which portion of the display 151 is touched.
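The signal path just described — sensor reading, touch controller, then controller 180 — can be sketched as follows; the class, method names and assumed screen height are illustrative only, not the terminal's actual firmware:

```python
# Hypothetical sketch of the touch signal path: the touch sensor's reading
# is packaged into an event by the touch controller, and the main
# controller determines which portion of the display was touched.
class TouchScreen:
    HEIGHT = 480  # assumed display height in pixels

    def __init__(self):
        self.last_event = None

    def on_sensor_signal(self, x, y, pressure, size):
        # touch controller: package position, size and pressure of the touch
        event = {"x": x, "y": y, "pressure": pressure, "size": size}
        # controller 180: determine which portion of the display is touched
        event["region"] = "upper" if y < self.HEIGHT / 2 else "lower"
        self.last_event = event
        return event
```

A touch at y = 100 on the assumed 480-pixel display would be reported as landing in the upper region.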
Referring again to
In addition, examples of the proximity sensor include a transmissive photo sensor, a direct reflective photo sensor, a mirror reflective photo sensor, a high frequency oscillating proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor and the like. When the touch screen is a capacitive type, the proximity of a pointer can be detected by changes of electric fields caused by the proximity of the pointer. The touch screen (touch sensor) may therefore be classified as a proximity sensor.
Further, an action in which a pointer approaches the touch screen and is recognized without actually contacting the touch screen is called a “proximity touch”, while an action in which the pointer actually touches the touch screen is called a “contact touch”. The position on the touch screen proximity-touched by the pointer is the position that vertically opposes the pointer when the pointer performs the proximity touch. The proximity sensor 141 can also detect the proximity touch and a proximity touch pattern (e.g., proximity touch distance, direction, speed, time, position, moving state, etc.). Information corresponding to the detected proximity touch action and proximity touch pattern may also be displayed on the touch screen.
Further, the audio output module 152 outputs audio data that is received from the wireless communication unit 110 in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode and/or the like. The audio output module 152 can also output audio data stored in the memory 160, and output an audio signal relevant to a function (e.g., a call signal receiving sound, a message receiving sound, etc.) performed by the mobile terminal 100. The audio output module 152 may include a receiver, a speaker, a buzzer and/or the like.
In addition, the alarm unit 153 outputs a signal for announcing an event occurrence of the mobile terminal 100 such as a call signal reception, a message reception, a key signal input, a touch input and/or the like. The alarm unit 153 can output a signal for announcing an event using vibration or the like as well as a video and/or an audio signal. For example, the video signal may be output via the display 151, and the audio signal may be output via the audio output module 152. Thus, the display 151 and/or the audio output module 152 may be classified as part of the alarm unit 153.
In addition, the haptic module 154 uses/outputs various haptic effects that can be sensed by a user. For example, vibration is a representative example of a haptic effect. The strength and pattern of the vibration generated from the haptic module 154 may also be controlled. For example, vibrations differing from each other may be output in a manner of being synthesized together or may be sequentially output.
The haptic module 154 can also generate various haptic effects including a vibration, an effect caused by such a stimulus as a pin array vertically moving against a contact skin surface, a jet power of air via outlet, a suction power of air via inlet, a skim on a skin surface, a contact of an electrode, an electrostatic power and the like, and/or an effect by hot/cold sense reproduction using an endothermic or exothermic device as well as the vibration. Further, the haptic module 154 can provide the haptic effect via direct contact, and enable a user to experience the haptic effect via muscular sense of a finger, an arm and/or the like. Two or more haptic modules 154 may also be provided according to a configuration of the mobile terminal 100.
Next, the switching panel unit 155 is a constituent element for expressing a 3D image using binocular disparity, the function of which will be described in more detail later with reference to
Referring again to
Further, the interface unit 170 functions as a passage to external devices connected to the mobile terminal 100. In more detail, the interface unit 170 may receive data from an external device, and/or be supplied with a power such that the power can be delivered to elements within the mobile terminal 100. The interface unit 170 can also enable data to be transferred to an external device connected to the mobile terminal 100. In addition, the interface unit 170 may include a wire/wireless headset port, an external charger port, a wire/wireless data port, a memory card port, a port for coupling to a device having an identity module, an audio input/output (I/O) port, a video input/output (I/O) port, an earphone port and/or the like.
In addition, the identity module may be a chip or card that stores various kinds of information for authenticating use of the mobile terminal 100. The identity module may also include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and/or the like. A device provided with the above identity module (hereinafter referred to as an identity device) may also be manufactured in the form of a smart card. The identity device may also be connected to the mobile terminal 100 via the port.
In addition, the interface unit 170 functions as a passage for supplying power to the mobile terminal 100 from a cradle that is connected to the mobile terminal 100, and can also function as a passage for delivering various command signals, which are input to the cradle by a user, to the mobile terminal 100. Various command signals or the power input from the cradle can also work as a signal for recognizing that the mobile terminal 100 is correctly loaded in the cradle.
Further, the controller 180 controls the overall operations of the mobile terminal 100. For example, the controller 180 performs control and processing relevant to a voice call, a data communication, a video conference and/or the like. In
Further, the power supply unit 190 receives an external or internal power and then supplies the power required for operations of the respective elements under control of the controller 180. In addition, embodiments of the present invention may be implemented within a recording medium that can be read by a computer or a computer-like device using software, hardware or combination thereof.
According to a hardware implementation, arrangements and embodiments may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors and electrical units for performing other functions. In some cases, embodiments may be implemented by the controller 180.
For a software implementation, arrangements and embodiments described herein may be implemented with separate software modules, such as procedures and functions, each of which may perform one or more of the functions and operations described herein. Software codes may be implemented with a software application written in any suitable programming language and may be stored in memory such as the memory 160, and may be executed by the controller 180 or processor.
Next,
The body of the terminal 100 may also include a case (casing, housing, cover, etc.) that forms an exterior of the terminal. In
The display 151, the audio output unit 152, the camera 121, user input units 130/131/132, the microphone 122, the interface unit 170 and the like may be provided on the terminal body, and more particularly on the front case 101. The display 151 can also occupy most of a main face of the front case 101, and the audio output module 152 and the camera 121 can be provided at an area adjacent to one end portion of the display 151, while the user input unit 131 and the microphone 122 are provided at another area adjacent to the other end portion of the display 151. The user input unit 132 and the interface unit 170 may also be provided on lateral sides of the front case 101 and a rear case 102.
In addition, the user input unit 130 may receive a command for controlling an operation of the mobile terminal 100, and include a plurality of manipulating units 131 and 132. The manipulating units 131 and 132 can be generally called a manipulating portion and may adopt any tactile mechanism that enables a user to perform a manipulation action while experiencing a tactile feeling.
The contents input by the first manipulating unit 131 and/or the second manipulating unit 132 may be diversely set. For example, a command such as start, end, scroll and/or the like may be input to the first manipulating unit 131, and a command for a volume adjustment of sound output from the audio output unit 152, a command for switching to a touch recognizing mode of the display 151 or the like may be input to the second manipulating unit 132.
Next,
For example, the camera 121 may have a lower number of pixels to capture and transmit a picture of user face for a video call, while the camera 121′ may have a greater number of pixels for capturing a general subject for photography without transmitting the captured subject. Each of the cameras 121 and 121′ may also be installed on the terminal body to be rotated and/or popped up.
A flash 123 and a mirror 124 are also additionally provided adjacent to the camera 121′. In more detail, the flash 123 projects light toward a subject when photographing the subject using the camera 121′. If a user wants to take a picture of the user (self-photography) using the camera 121′, the mirror 124 allows the user to view a user face reflected by the mirror 124.
An additional audio output unit 152′ is also provided on the backside of the terminal body, and thus can implement a stereo function together with the audio output unit 152 shown in
In addition, the power supply unit 190 for supplying a power to the mobile terminal 100 is provided to the terminal body. The power supply unit 190 may also be embedded within the terminal body. Alternatively, the power supply unit 190 may be detachably and attachably connected to the terminal body.
In addition,
Also, the touchpad 135 may be activated by interconnecting with the display 151 of the front case 101. The touchpad 135 may also be provided behind the display 151, parallel to it, and have a size equal to or smaller than that of the display 151.
Next,
Next, one of the principles of auto-stereoscopic 3D display will be explained with reference to
That is, if an image (R) seen through a right eye and an image (L) seen through a left eye are combined, the combined image is seen as a 3D image. To this end, an image is divided into two images, one seen by a right eye and the other seen by a left eye, and the left image (L) and the right image (R) are combined per pixel unit and displayed on one screen.
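The per-pixel-unit combination described above can be sketched as a simple column interleave; the function name and the left-on-even, right-on-odd column convention are assumptions for illustration, not the actual panel driving scheme:

```python
# Illustrative sketch: combine a left-eye and a right-eye image per pixel
# column (L, R, L, R, ...) into one screen image, as a switching-panel
# 3D display would present them. Images are lists of pixel rows.
def interleave_lr(left, right):
    combined = []
    for row_l, row_r in zip(left, right):
        out = []
        for col, (pl, pr) in enumerate(zip(row_l, row_r)):
            # even columns carry the left image, odd columns the right
            out.append(pl if col % 2 == 0 else pr)
        combined.append(out)
    return combined
```

For a one-row pair of images `["L0", "L1", "L2", "L3"]` and `["R0", "R1", "R2", "R3"]`, the combined row is `["L0", "R1", "L2", "R3"]`.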
Thereafter, the user's two eyes are made to separately view the pixel-unit image from the left image and the pixel-unit image from the right image, and thus the image is seen as a 3D image. A method of combining two images can use an interpolation method but may differ based on image-forming methods. In addition, the reference characters “b” in
Also, when two images are combined per pixel unit (L, R) as illustrated in
In addition,
Next, a method for displaying a 3D image using binocular disparity in a mobile terminal according to an embodiment of the present invention will be described with reference to the flowcharts of
Referring again to
Next,
At this time, only a part of the imaginary spatial image is displayed on the touch screen. Under this circumstance, the controller 180 checks if an inclination detection signal of the mobile terminal has been generated by the inclination detection sensor 142 (S13). When the inclination detection signal is generated (Yes in S13), the controller 180 changes the display of the 3D imaginary spatial image (S14).
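Steps S13 and S14 above might map the detected inclination to the portion of the imaginary spatial image that is shown; the linear tilt-to-offset mapping below is a hypothetical sketch under assumed names and ranges, not the actual embodiment:

```python
# Hypothetical sketch of the second embodiment's behavior: only a window
# of a wider pre-stored imaginary spatial image is displayed, and tilting
# the terminal pans that window across the image.
def visible_portion(image_width, window_width, tilt_deg, max_tilt_deg=45):
    """Return (start, end) columns of the displayed window for a tilt angle."""
    # clamp the tilt to the assumed sensor range
    tilt = max(-max_tilt_deg, min(max_tilt_deg, tilt_deg))
    travel = image_width - window_width        # how far the window can pan
    # map [-max_tilt, +max_tilt] linearly onto [0, travel]
    start = round((tilt + max_tilt_deg) / (2 * max_tilt_deg) * travel)
    return start, start + window_width
```

Held level (tilt 0), a 400-pixel window onto a 1000-pixel image shows the center columns (300, 700); tilting fully left or right pans to (0, 400) or (600, 1000), displaying the not-yet-displayed portions.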
Thus, according to the second embodiment of the present invention, a tilting of a mobile terminal by the user enables display of a not-yet-displayed 3D imaginary spatial image on the display 151. A more detailed explanation of the second embodiment will be described later with reference to
Next,
At this time, only a part of the 3D imaginary spatial image is displayed on the touch screen. Meanwhile, the touch screen is a combined constant current/constant voltage touch screen capable of receiving both constant current and constant voltage inputs. When a constant current input signal is generated for one of the displayed icons (Yes in S22), the controller 180 successively monitors whether a constant voltage input signal has also been generated (S23). If the constant voltage input signal has been generated (Yes in S23), the selected icon is executed (S24). If the constant voltage input signal has not been generated and only the constant current input signal has been generated (No in S23), the selected icon is displayed highlighted (S25). The constant current signal can be a touch signal, and the constant voltage signal can be a pressure touch signal.
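The branching in steps S22 through S25 can be summarized as a small decision function; the argument names and return values are illustrative assumptions:

```python
# Illustrative decision logic for the combined touch screen: a constant
# current (plain touch) signal alone highlights an icon, while an
# additional constant voltage (pressure touch) signal executes it.
def handle_touch(constant_current, constant_voltage):
    if not constant_current:
        return "idle"         # no icon touched (No in S22)
    if constant_voltage:
        return "execute"      # touch plus pressure: run the icon (S24)
    return "highlight"        # touch only: highlight the icon (S25)
```

Thus a plain touch yields `"highlight"` and a pressure touch yields `"execute"`, matching the flow described above.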
The highlighting method may also be determined by a user setting. Thus, according to the third embodiment of the present invention, the user can use a constant current constant voltage combined touch screen to selectively execute and highlight the 3D icon by the switching panel unit. A detailed explanation with regard to the third embodiment will be provided later with reference to
Next, an image applicable to a method for displaying a 3D image using binocular disparity in a mobile terminal according to an embodiment of the present invention will be described with reference to
The controller 180 can also divisively generate an image pre-stored in the memory into a left eye image and a right eye image and display the images on the display 151. Further, the controller 180 can controllably turn on the switching panel unit 155 to allow the left eye image to be emitted to a left eye, and the right eye image to be emitted to a right eye, whereby the user can view a 3D image caused by binocular disparity. In addition, the 3D image according to embodiments of the present invention can also include a text object, an emoticon, an avatar and a moving picture image, the details of which will be described with reference to
Next, a method for displaying a 3D image using binocular disparity in a mobile terminal according to the first embodiment of the present invention will be described with reference to
As shown,
When the user selects the text input block 311 and inputs a predetermined character, a 3D character image is generated. Then, when a call signal is received from or transmitted to the telephone number registered on the phone directory registration screen, the controller 180 turns on the switching panel unit 155 and displays the character on the display 151 as a 3D image. Similarly, when the user selects the emoticon block 312 and selects a predetermined emoticon, a 3D emoticon image is generated, and when a call signal is received from or transmitted to that telephone number, the controller 180 turns on the switching panel unit 155 and displays the emoticon on the display 151 as a 3D image.
In addition, when the user selects the photo block 313 to select a predetermined image, the controller 180 generates a left eye image and a right eye image from the image. Then, when a call signal is received from or transmitted to the telephone number registered on the phone directory registration screen, the controller 180 turns on the switching panel unit 155 and displays the left eye image and the right eye image so that the selected image is viewed on the display 151 as a 3D image.
Next,
Next,
Meanwhile,
In addition, in
Thus, according to the first example, when a new item (e.g., a person's name) is stored in a phone directory with a 3D effect set, and a call signal is received from the phone number set with the 3D effect, the controller 180 turns on the switching panel unit 155 and outputs a 3D image to the display 151, improving visibility.
Next,
As shown in
When the user selects the transmission icon 404, the controller 180 transmits the completed character message and the 3D attribute information to a receiver terminal (e.g., at least one other terminal). When the receiver terminal receives and displays the character message together with the 3D attribute information, part of the character message may be displayed as a 3D image, as designated by the sender.
Meanwhile,
Next,
Under this circumstance, the user can select one of the widgets and set a 3D attribute. The method of setting the 3D attribute may utilize that of the first example. Once the 3D attribute is provided, the controller 180 generates a left eye image and a right eye image of the selected icon, and turns on the switching panel unit 155. Then, the selected icon is displayed as a 3D image.
Although the third example has described setting a 3D attribute to the widgets displayed on the wallpaper 500, the description is not limited thereto, and a 3D attribute can also be set for a menu icon on a menu screen. Furthermore, although the third example has described setting a 3D attribute to one of the widgets displayed on the wallpaper, the description is not limited thereto, and 3D attribute information may be provided to all widgets displayed on the wallpaper 500.
Next,
Although the fourth example has described the application to the still image (photo), the description is not limited thereto. That is, the example may be applied to a moving picture image. Thus, according to the fourth example, the user can provide a 3D attribute to the still (stationary) or moving images.
Next,
Although the fifth example has described the preferred channel icon being displayed as a 3D image, the description is not limited thereto. That is, the example may be applied to a situation where one of the broadcasting menus is selected and displayed as a 3D image.
Next,
Further, the sixth example illustrates applying a 3D effect to a search word input block 801 and a particular image block 802. As a result, the controller 180 can generate a left eye image and a right eye image for the search word input block 801 and the image block 802, and turn on the switching panel unit 155 to display a 3D image. Thus, according to the sixth example, it is possible to configure a more visible webpage screen based on the user option.
Next,
As shown in
Next, a method of displaying a 3D image using binocular disparity in a mobile terminal according to the second and third embodiments of the present invention will be described in detail with reference to
In more detail,
Meanwhile, the 3D imaginary spatial image 200 may be displayed with at least one icon. In more detail, assuming the 3D imaginary spatial image 200 is wallpaper (e.g., a screen graphic), the icon may be a widget icon or an instant file icon. As shown in
Next,
In particular,
Next,
Under this circumstance, that is, in a state where the plurality of menu icons are displayed, when the user only touches an ‘LGT’ icon 911, the controller 180 highlights only the LGT icon 911 so the user can see that the ‘LGT’ icon has been selected. At this time, and as shown in
Under the state where the icon is selected and highlighted, and when the user generates a constant pressure signal by applying a pressure to the pointing device (within a predetermined period of time), the controller 180 executes the ‘LGT’ icon 911 and enters the menu option corresponding to the icon 911 as shown in
The above-described methods can be implemented in a program recorded medium as computer-readable codes. The computer-readable media may include all kinds of recording devices in which data readable by a computer system are stored. The computer-readable media include ROM, RAM, CD-ROM, magnetic tapes, floppy discs, optical data storage devices, and the like for example and also include carrier-wave type implementations (e.g., transmission via Internet).
The present invention encompasses various modifications to each of the examples and embodiments discussed herein. According to the invention, one or more features described above in one embodiment or example can be equally applied to another embodiment or example described above. The features of one or more embodiments or examples described above can be combined into each of the embodiments or examples described above. Any full or partial combination of one or more embodiment or examples of the invention is also part of the invention.
As the present invention may be embodied in several forms without departing from the spirit or essential characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within its spirit and scope as defined in the appended claims, and therefore all changes and modifications that fall within the metes and bounds of the claims, or equivalence of such metes and bounds are therefore intended to be embraced by the appended claims.
Number | Date | Country | Kind |
---|---|---|---|
10-2009-0105261 | Nov 2009 | KR | national |
10-2009-0105262 | Nov 2009 | KR | national |