This application claims the benefit under 35 U.S.C. §119(a) of a Korean patent application filed in the Korean Intellectual Property Office on Aug. 6, 2014 and assigned Serial No. 10-2014-0101007, the entire disclosure of which is incorporated herein by reference.
The present disclosure relates to electronic devices, and more particularly to a method and apparatus for simulating a musical instrument.
Electronic devices such as a smartphone, a personal computer, and a tablet computer provide many useful functions to users through various applications. These electronic devices are being developed to provide various types of information through various functions in addition to a voice call function.
Aside from a simple voice call function and an Internet browsing function, users of electronic devices have recently demanded various entertainment functions.
As one of the entertainment functions, a function of displaying an element(s) (for example, a piano keyboard) of a user-intended musical instrument (for example, a piano) and allowing a user to play the musical instrument using the displayed element(s) is provided.
However, since the elements of the musical instrument are displayed in a very limited space, such as the display of an electronic device, the user may have difficulty playing the musical instrument using the displayed element(s). Accordingly, a need exists for new techniques for simulating musical instruments.
The present disclosure addresses this need. According to one aspect of the disclosure, an apparatus is provided for simulating a musical instrument, comprising: a display configured to present a musical interface associated with an external image; a musical instrument setter configured to associate the musical interface with the musical instrument; and a sound area controller configured to arrange a portion of the musical interface as a sound area.
According to another aspect of the disclosure, a method is provided for simulating a musical instrument comprising: displaying, by an electronic device, a musical interface that is associated with an external image; associating the musical interface with the musical instrument; and arranging a portion of the musical interface as a sound area.
The above and other aspects, features and advantages of certain exemplary embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings, like reference numerals will be understood to refer to like parts, components, and structures.
As the present disclosure allows for various changes and numerous embodiments, particular embodiments will be illustrated in the drawings and described in detail. However, the present disclosure is not limited to the specific embodiments and should be construed as including all the changes, equivalents, and substitutions included in the spirit and scope of the present disclosure.
Although ordinal numbers such as ‘first’, ‘second’, and so forth will be used to describe various components, those components are not limited by the terms. The terms are used only for distinguishing one component from another component. For example, a first component may be referred to as a second component and likewise, a second component may also be referred to as a first component, without departing from the teaching of the concept of the present disclosure. The term ‘and/or’ used herein includes any and all combinations of one or more of the associated listed items.
The terminology used herein is for the purpose of describing an embodiment only and is not intended to limit the present disclosure. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms ‘comprises’ and/or ‘has’ when used in this specification specify the presence of a stated feature, number, step, operation, component, element, or combination thereof but do not preclude the presence or addition of one or more other features, numbers, steps, operations, components, elements, or combinations thereof.
An electronic device 100 according to an embodiment of the present disclosure may be a device with communication capabilities. For example, the electronic device 100 may be at least one of a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an e-Book reader, a desktop PC, a laptop PC, a netbook computer, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MP3 player, mobile medical equipment, a camera, and a wearable device (for example, a Head-Mounted Device (HMD) such as electronic glasses, electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, or a smart watch). While a smart phone is described herein as an embodiment of the electronic device 100 by way of example and for convenience of description, it will be clear to those skilled in the art that this example does not limit the embodiments of the present disclosure.
Referring to
Referring to
The controller 110 may include a Central Processing Unit (CPU) 111, a Read Only Memory (ROM) 112 for storing a control program to control the electronic device 100, and a Random Access Memory (RAM) 113 for storing signals or data received from the outside of the electronic device 100 or for use as a memory space for an operation performed by the electronic device 100. The CPU 111 may include any suitable type of processing circuitry, such as a general-purpose processor (e.g., an ARM-based processor), a Field-Programmable Gate Array (FPGA), an Application-Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Programmable Logic Device (PLD), etc. The CPU 111 may include one or more cores. The CPU 111, the ROM 112, and the RAM 113 may be interconnected through an internal bus.
The controller 110 may control the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, the memory 175, the power supply 180, the touch screen 190, and the touch screen controller 195.
The mobile communication module 120 may connect the electronic device 100 to an external device through one or more antennas (not shown) by mobile communication under the control of the controller 110. The mobile communication module 120 transmits wireless signals to or receives wireless signals from a portable phone (not shown), a smart phone (not shown), a tablet PC (not shown), or another electronic device (not shown) that has a phone number input to the electronic device 100, for a voice call, a video call, a Short Message Service (SMS), or a Multimedia Messaging Service (MMS).
The sub-communication module 130 may include at least one of the WLAN module 131 and the short-range communication module 132. For example, the sub-communication module 130 may include either or both of the WLAN module 131 and the short-range communication module 132.
The WLAN module 131 may be connected to the Internet under the control of the controller 110 in a place where a wireless AP (not shown) is installed. The WLAN module 131 supports the IEEE 802.11x WLAN standard of the Institute of Electrical and Electronics Engineers (IEEE). The short-range communication module 132 may conduct short-range wireless communication between the electronic device 100 and an image forming device (not shown) under the control of the controller 110. The short-range communication may conform to Bluetooth, Infrared Data Association (IrDA), WiFi Direct, Near Field Communication (NFC), and the like.
The electronic device 100 may include at least one of the mobile communication module 120, the WLAN module 131, and the short-range communication module 132 according to its capabilities. For example, the electronic device 100 may include a combination of the mobile communication module 120, the WLAN module 131, and the short-range communication module 132 according to its capabilities.
The multimedia module 140 may include the broadcasting communication module 141, the audio play module 142, or the video play module 143. The broadcasting communication module 141 may receive a broadcast signal (for example, a TV broadcast signal, a radio broadcast signal, or a data broadcast signal) and additional broadcasting information (for example, an Electronic Program Guide (EPG) or an Electronic Service Guide (ESG)) from a broadcasting station through a broadcasting communication antenna (not shown) under the control of the controller 110. The audio play module 142 may open a stored or received digital audio file (for example, a file having such an extension as mp3, wma, ogg, or wav) under the control of the controller 110. The video play module 143 may open a stored or received digital video file (for example, a file having such an extension as mpeg, mpg, mp4, avi, mov, or mkv) under the control of the controller 110. The video play module 143 may also open a digital audio file.
The multimedia module 140 may include the audio play module 142 and the video play module 143 without the broadcasting communication module 141. Or the audio play module 142 or the video play module 143 of the multimedia module 140 may be incorporated into the controller 110.
The camera module 150 may include at least one of the first camera 151 and the second camera 152, for capturing a still image or a video under the control of the controller 110. The first camera 151 or the second camera 152 may include an auxiliary light source for providing a light intensity required to capture an image. The first camera 151 may be disposed on the front surface of the electronic device 100, while the second camera 152 may be disposed on the rear surface of the electronic device 100. Or, the first camera 151 and the second camera 152 may be arranged near each other in order to capture a three-dimensional still image or video.
The GPS module 155 may receive radio waves from a plurality of GPS satellites (not shown) in Earth orbit and determine a position of the electronic device 100 based on the Times of Arrival (ToAs) of satellite signals from the GPS satellites to the electronic device 100.
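The ToA-based position determination described above can be illustrated with a simplified sketch. The example below is a hypothetical two-dimensional trilateration from three ToA measurements; an actual GPS receiver solves in three dimensions with a receiver clock-bias term and at least four satellites, so the function name and simplifications here are assumptions for illustration only.

```python
def trilaterate(anchors, toas, c=299_792_458.0):
    """Estimate a 2-D position from three Time-of-Arrival measurements.

    anchors: three known (x, y) transmitter positions.
    toas:    signal travel times to the receiver.
    c:       propagation speed (speed of light by default).

    Illustrative sketch only: real GPS also estimates the receiver
    clock bias and works in 3-D with four or more satellites.
    """
    d = [c * t for t in toas]  # ranges from each anchor
    (x1, y1), (x2, y2), (x3, y3) = anchors
    # Subtracting the first range equation from the other two yields a
    # linear 2x2 system in (x, y), solved here by Cramer's rule.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d[0] ** 2 - d[1] ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d[0] ** 2 - d[2] ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det
```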
The I/O module 160 may include at least one of the plurality of buttons 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, and the keypad 166.
The buttons 161 may be formed on the front surface, a side surface, or the rear surface of a housing of the electronic device 100, and may include at least one of a power/lock button, a volume button, a menu button, a home button, a back button, a search button, and the like.
The microphone 162 receives a voice or a sound and converts the received voice or sound to an electrical signal under the control of the controller 110.
The speaker 163 may output sounds corresponding to various signals (for example, a wireless signal, a broadcast signal, a digital audio file, a digital video file, or a photo shot) received from the mobile communication module 120, the sub-communication module 130, the multimedia module 140, and the camera module 150 under the control of the controller 110. The speaker 163 may further output a sound corresponding to a function executed by the electronic device 100. One or more speakers 163 may be disposed at an appropriate position or positions of the housing of the electronic device 100.
The vibration motor 164 may convert an electrical signal to a mechanical vibration under the control of the controller 110. For example, when the electronic device 100 receives an incoming voice call from another device (not shown) in vibration mode, the vibration motor 164 operates. One or more vibration motors 164 may be mounted inside the housing of the electronic device 100. The vibration motor 164 may operate in response to a user's touch on the touch screen 190 and a continuous movement of the touch on the touch screen 190.
The connector 165 may be used as an interface for connecting the electronic device 100 to an external device (not shown) or a power source (not shown). The electronic device 100 may transmit data stored in the memory 175 to an external device (not shown) via a cable connected to the connector 165 or may receive data from the external device via the cable, under the control of the controller 110. The external device may be a docking station and the data may be an input signal from an external input device such as a mouse, a keyboard, and the like. The electronic device 100 may receive power from a power source (not shown) via a cable connected to the connector 165 or may charge a battery (not shown) using the power source.
The keypad 166 may receive a key input from a user to control the electronic device 100. The keypad 166 includes a physical keypad (not shown) formed in the electronic device 100 or a virtual keypad (not shown) displayed on the touch screen 190. The physical keypad may not be provided according to the capabilities or configuration of the electronic device 100.
An earphone (not shown) may be connected to the electronic device 100 by being inserted into the earphone connector jack 167.
The sensor module 170 includes at least one sensor for detecting a state of the electronic device 100. For example, the sensor module 170 may include a proximity sensor for detecting whether a user is close to the electronic device 100 and an illumination sensor (not shown) for detecting the amount of ambient light around the electronic device 100. In addition, the sensor module 170 may include a gyro sensor. The gyro sensor may detect a motion of the electronic device 100 (for example, a rotation of the electronic device 100 or an acceleration or vibration applied to the electronic device 100), detect a compass direction using the Earth's magnetic field, and detect the direction of gravity. The sensor module 170 may also include an altimeter for detecting an altitude by measuring air pressure. At least one sensor may detect a state of the electronic device 100, generate a signal corresponding to the detected state, and transmit the generated signal to the controller 110. A sensor may be added to or removed from the sensor module 170 according to the capabilities of the electronic device 100.
The memory 175 may store input/output signals or data in accordance with operations of the mobile communication module 120, the sub-communication module 130, the multimedia module 140, the camera module 150, the GPS module 155, the I/O module 160, the sensor module 170, and the touch screen 190 under the control of the controller 110. The memory 175 may store a control program for controlling the electronic device 100 or the controller 110, and applications.
The term “memory” may include the memory 175, the ROM 112 and the RAM 113 within the controller 110, or a memory card (not shown) (for example, a Secure Digital (SD) card, a memory stick, and the like) mounted to the electronic device 100. The memory may include a non-volatile memory, a volatile memory, a Hard Disk Drive (HDD), a Solid State Drive (SSD), and the like.
The power supply 180 may supply power to one or more batteries (not shown) mounted in the housing of the electronic device 100 under the control of the controller 110. The one or more batteries supply power to the electronic device 100. The power supply 180 may supply power received from an external power source (not shown) via a cable connected to the connector 165 to the electronic device 100. Further, the power supply 180 may supply power received from an external power source wirelessly to the electronic device 100 by a wireless charging technology.
The touch screen 190 may provide User Interfaces (UIs) corresponding to various services (for example, call, data transmission, broadcasting, photo taking, and the like) to the user. The touch screen 190 may transmit an analog signal corresponding to at least one touch on a UI to the touch screen controller 195. The touch screen 190 may receive at least one touch input through a user's body part (for example, a finger such as a thumb) or a touch input means (for example, a stylus pen). The touch screen 190 may receive a continuous movement of a single touch, among one or more touches. The touch screen 190 may transmit an analog signal corresponding to a continuous movement of a touch to the touch screen controller 195.
In the present disclosure, the touch may include a non-contact touch, not limited to contacts between the touch screen 190 and the user's body part or the touch input means. A gap detectable to the touch screen 190 may vary according to the capabilities or configuration of the electronic device 100.
The touch screen 190 may be implemented as, for example, a resistive type, a capacitive type, an infrared type, or an acoustic wave type.
The touch screen controller 195 converts an analog signal received from the touch screen 190 to a digital signal (X and Y coordinates) and transmits the digital signal to the controller 110. The controller 110 may control the touch screen 190 using the received digital signal. For example, the controller 110 may select or execute a shortcut icon (not shown) displayed on the touch screen 190 in response to a touch. The touch screen controller 195 may be incorporated into the controller 110.
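The analog-to-digital conversion performed by the touch screen controller 195 might be sketched as follows. The function name, ADC resolution, and simple linear mapping are illustrative assumptions; real controllers additionally apply calibration, filtering, and debouncing.

```python
def to_digital_coords(raw_x, raw_y, adc_max, width_px, height_px):
    """Map raw analog touch readings (0..adc_max) to pixel coordinates.

    Hypothetical helper for illustration: scales each axis linearly
    onto the panel's pixel grid.
    """
    x = round(raw_x / adc_max * (width_px - 1))
    y = round(raw_y / adc_max * (height_px - 1))
    return x, y

# Example: a 12-bit reading mapped onto a 1080x1920 panel.
coords = to_digital_coords(2048, 1024, 4095, 1080, 1920)
```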
Referring to
A home button 161a, a menu button 161b, and a back button 161c may be formed at the bottom of the touch screen 190.
The home button 161a is used to display the main home screen on the touch screen 190. For example, upon touching of the home button 161a while any home screen other than the main home screen or a menu screen is displayed on the touch screen 190, the main home screen may be displayed on the touch screen 190. Upon pressing (touching) of the home button 161a during execution of applications on the touch screen 190, the main home screen illustrated in
The menu button 161b provides link menus available on the touch screen 190. The link menus may include a widget adding menu, a background changing menu, a search menu, an edit menu, an environment setting menu, and the like. When an application is executed, a link menu linked to the application may be provided.
The back button 161c may display a screen previous to a current screen or end the most recently used application.
The first camera 151, an illumination sensor 170a, and a proximity sensor 170b may be arranged at a corner of the front surface 100a of the electronic device 100, whereas the second camera 152, a flash 153, and the speaker 163 may be arranged on the rear surface 100c of the electronic device 100.
For example, a power/reset button 161d, a volume button 161e, a terrestrial Digital Multimedia Broadcasting (DMB) antenna 141a for receiving a broadcast signal, and one or more microphones 162 may be disposed on side surfaces 100b of the electronic device 100. The DMB antenna 141a may be mounted to the electronic device 100 fixedly or detachably.
The connector 165 is formed on the bottom side surface of the electronic device 100. The connector 165 includes a plurality of electrodes and may be connected to an external device by wire. The earphone connector jack 167 may be formed on the top side surface of the electronic device 100, for allowing an earphone to be inserted.
Referring to
The user authenticator 411 may authenticate a user by receiving user authentication information from the user. The user authentication information may include, for example, an Identifier (ID) and a password which are preset by the user.
Referring to
The user may select a “Create Band” icon on the initial screen 500 to play a musical instrument using the apparatus 400 according to the embodiment of the present disclosure. Upon user selection of the “Create Band” icon, the display 442 may display icons for selecting various modes related to “Create Band”, as illustrated in
Upon user selection of the “Instrument” mode, the display 442 may optionally display a user authentication screen 600 as illustrated in
The image acquirer 420 may acquire one or more photographs of an external image 700. A function(s) or operation(s) of the image acquirer 420 may be executed preferably by the camera module 150 according to an embodiment of the present disclosure. The image acquirer 420 may acquire the external image 700 and the controller 410 may control display of the musical interface depicted in the external image 700 on the display 442, as illustrated in
The controller 410 may control display of various UIs along with the musical interface depicted in the external image 700. For example, the controller 410 may control display of an instrument selection menu 720. The user may select an available musical instrument by the instrument selection menu 720. Further, the controller 410 may control display of an octave selection menu 730 and a scale selection menu 740. The controller 410 may control display of a lock icon 750, an instrument display icon 760, a sound area setting icon 770, and a camera reversal icon 780.
If the user selects the lock icon 750, the controller 410 may disable the function(s) or operation(s) that would otherwise be executed when the user selects the home button 161a, the menu button 161b, or the back button 161c. By selecting the lock icon 750 and thus activating the lock setting, the user may prevent execution of an unintended function(s) or operation(s) while manipulating the electronic device 100 for a music performance.
When the user requests display of a musical instrument by selecting the instrument display icon 760, the controller 410 may control display of a musical instrument matching an instrument type selected through the instrument selection menu 720 by the user. According to an embodiment of the present disclosure, the musical instrument setter 413 may determine an instrument type to be played according to the user's instrument selection request through the instrument selection menu 720 and display a musical instrument matching the user-selected instrument type.
If the user sets a musical instrument to be played to “Acoustic Grand Piano” by the instrument selection menu 720 and selects the instrument display icon 760, the musical instrument setter 413 may control display of an image of an acoustic grand piano, as illustrated in
Musical instruments available for performance may be preset or the user may purchase such musical instruments by accessing a selling server (not shown) by wireless or wired communication. In the latter case, the user may pay for a musical instrument by electronic payment. The wireless communication may conform to, for example, at least one of WiFi, BT, NFC, GPS, and cellular communication (for example, Long Term Evolution (LTE), LTE-Advanced (LTE-A), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), Universal Mobile Telecommunication System (UMTS), Wireless Broadband (WiBro), or Global System for Mobile communication (GSM)). The wired communication may conform to, for example, at least one of USB, High Definition Multimedia Interface (HDMI), Recommended Standard 232 (RS 232), and Plain Old Telephone Service (POTS).
Upon photographing the external image 700, the image processor 412 may generate a differential image for the external image 700. To generate the differential image, the image processor 412 may determine a reference image (hereinafter “first image”). For example, an image obtained a predetermined time (for example, 0.5 second) after a time when the image acquirer 420 acquires the external image 700 for the first time may be set as the reference image for differential image generation. Further, the user may reset the reference image by selecting an image reset icon 1320 illustrated in
After the reference image is set, the image processor 412 may generate the differential image by comparing the reference image with photographs (hereinafter “second image”) of the external image 700 continuously acquired from the image acquirer 420.
The image processor 412 may generate the differential image by comparing the reference image with the image of the external image 700 only in terms of the chrominance components Cb and Cr, excluding the luminance component Y, among the Y, Cb, and Cr data. Therefore, it is preferred that the external image 700 is monochrome (for example, white, gray, and black) according to an embodiment of the present disclosure. For example, it is preferred that the paper serving as the background of the external image 700 is white, the figure(s) drawn on the paper are black, and the input object (for example, a drum stick) is monochrome.
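The chrominance-only comparison described above might be sketched as follows, assuming each frame is a two-dimensional list of (Y, Cb, Cr) tuples. The threshold value and the absolute-difference metric are illustrative assumptions, not details from the disclosure.

```python
def chroma_diff(reference, current, threshold=20):
    """Build a binary differential image from two YCbCr frames.

    Each frame is a 2-D list of (Y, Cb, Cr) tuples. Only the
    chrominance components Cb and Cr are compared, so changes in
    luminance alone (shadows, lighting) produce no difference.
    """
    diff = []
    for ref_row, cur_row in zip(reference, current):
        row = []
        for (_, rcb, rcr), (_, ccb, ccr) in zip(ref_row, cur_row):
            # Mark the pixel as changed only when the chroma distance
            # exceeds the threshold; luminance Y is ignored entirely.
            changed = abs(ccb - rcb) + abs(ccr - rcr) > threshold
            row.append(1 if changed else 0)
        diff.append(row)
    return diff
```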
As described before, the display 442 may display an external image 700 and a UI(s) related to a music performance. According to an embodiment of the present disclosure, a function(s) or operation(s) of the display 442 may be executed by the touch screen 190. If the display 442 is implemented by the touch screen 190, a function(s) or operation(s) executed by the input unit 430 may be implemented by the touch screen 190 according to an embodiment of the present disclosure. The description of the touch screen 190 is applied to the display 442 and thus the display 442 will not be described in detail herein.
The musical instrument setter 413 may determine an instrument type to be played according to a user's instrument selection request through the afore-described instrument selection menu 720 and may display a musical instrument matching the user-selected instrument type according to an instrument display request.
The sound area controller 414 may set at least one sound area 1000 that outputs a sound corresponding to the musical instrument selected by the user. The sound area 1000 may refer to an area that outputs a sound corresponding to each element of the musical instrument selected by the user. That is, if an input object 1100 is placed at a position on the external image 700 corresponding to the sound area 1000, a sound corresponding to an element set for the sound area 1000 may be output.
Referring to
Referring to
Once the sound area 1000 is set, the type, octave, and scale of the musical instrument may be displayed in the sound area 1000 as illustrated in
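A minimal sketch of how a sound area might map a position on the external image to an instrument element is shown below. The rectangular shape, the field names, and the note labels are assumptions for illustration, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class SoundArea:
    """A rectangular region of the external image bound to one element
    of the selected instrument (e.g. one piano key)."""
    left: int
    top: int
    right: int
    bottom: int
    note: str  # element label, e.g. "C4"

    def contains(self, x, y):
        return self.left <= x < self.right and self.top <= y < self.bottom

def note_for_position(areas, x, y):
    """Return the note of the first sound area containing (x, y), or
    None when the position falls outside every sound area."""
    for area in areas:
        if area.contains(x, y):
            return area.note
    return None
```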
The input object recognizer 415 may recognize an input object based on a differential image generated by the image processor 412. According to an embodiment of the present disclosure, since the image processor 412 generates the differential image based on a chrominance value as described before, the input object may be colored. In some embodiments, the input object may include a Light Emitting Diode (LED) as illustrated in
The user may play music by using an input object to make contact with various figures or shapes depicted in the external image 700. According to aspects of the disclosure, the user may make contact with the figures or shapes depicted in the external image by physically touching the figures or shapes with the input object. For example, the user may play music by tapping on the figures or shapes in the external image 700. Additionally or alternatively, the user may make contact with the various figures or shapes depicted in the external image 700 by shining a light on the figures or shapes with the input object. For example, as illustrated in
The input object 1100 illustrated in
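One simple way the input object recognizer 415 might locate the input object in a binary differential image is to take the centroid of the changed pixels, as in the following sketch. This is an assumed minimal approach; a practical recognizer would also filter noise and track connected components.

```python
def locate_input_object(diff, min_pixels=5):
    """Locate the input object in a binary differential image.

    diff is a 2-D list of 0/1 values. Returns the (x, y) centroid of
    the changed pixels, or None when too few pixels changed to be a
    plausible input object.
    """
    xs, ys = [], []
    for y, row in enumerate(diff):
        for x, value in enumerate(row):
            if value:
                xs.append(x)
                ys.append(y)
    if len(xs) < min_pixels:
        return None
    # Integer centroid of all changed pixels.
    return sum(xs) // len(xs), sum(ys) // len(ys)
```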
The sound controller 416 may execute a function(s) or operation(s) including a change in sound property such as an octave and/or scale of a musical instrument and sound output control according to a user's request.
The sound output unit 444 may execute a function(s) or operation(s) for outputting sounds of various musical instruments, as described before. The function(s) or operation(s) of the sound output unit 444 may be performed by, for example, the speaker 163 according to an embodiment of the present disclosure.
Because the function(s) or operation(s) of the sound controller 416 and the sound output unit 444 have been described before, their detailed description will not be provided herein.
The input unit 430 may receive various types of information input by the user, for music performance according to an embodiment of the present disclosure. A function(s) or operation(s) of the input unit 430 may be performed by the touch screen 190, as described before. Further, a function(s) or operation(s) of the input unit 430 may be performed by, for example, the afore-described buttons 161 or the keypad 166.
The communication unit 450 may execute a function(s) or operation(s) for transmitting various types of information between the apparatus 400 according to the embodiment of the present disclosure and another electronic device (for example, a server or another apparatus) connected to the apparatus 400 wirelessly or via a wired connection. The function(s) or operation(s) of the communication unit 450 may be performed by, for example, the sub-communication module 130.
Referring to
Referring to
Referring to
In some embodiments, the input object 1100 is recognized based on a specific color emitted from the input object 1100, even though the external image 700 is not monochrome. The specific color may be set, for example, by the user selecting it from a color list recognizable to the image processor 412. Or the specific color may be preset in the process of manufacturing the apparatus 400. According to the above embodiment of the present disclosure, the user may draw a figure(s) on paper as a background of the external image 700 in a color other than the specific color and may play using the input object 1100 emitting the specific color. According to another embodiment of the present disclosure, the method for controlling play of a musical instrument in the apparatus may not include the reference image storing step S1510 illustrated in
According to another embodiment of the present disclosure, an image of a changed external image may be reset by means of the image reset icon 1320 as in the foregoing embodiment. However, the input object 1100 is not recognized by generating a differential image in this embodiment. Thus, resetting an image of a changed external image may not mean “resetting a reference image”.
Referring to
While it has been described with reference to
Referring to
If one (for example, 400a) of the guest devices 400a, 400b, and 400c performs music, the other devices 400, 400b, and 400c may share sound data of the music performance. Sharing sound data means that sounds of all musical instruments included in the music performance group are output from each device. For example, when the guest device 400a corresponding to a piano and the guest device 400b corresponding to a drum play the musical instruments at the same time, each of the devices 400a and 400b may output the sounds of the piano and the drum. Therefore, the musical instruments are played in an ensemble through each of the devices 400, 400a, 400b, and 400c. Information about the music performance may be transmitted and received between the devices, for example, through their communication units (not shown).
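The sound data sharing described above might be modeled as follows: every note event played on one device in the group is forwarded to every device, so each device outputs the sounds of all instruments in the ensemble. The class and method names are illustrative assumptions, not part of the disclosure, and real devices would exchange these events through their communication units.

```python
class Device:
    """Minimal model of ensemble sound sharing between a host device
    and guest devices in a music performance group."""

    def __init__(self, name, instrument):
        self.name = name
        self.instrument = instrument
        self.group = [self]  # devices sharing the performance
        self.output = []     # (instrument, note) sounds this device emitted

    def join(self, other):
        """Merge this device's group with another device's group."""
        group = self.group + other.group
        for device in group:
            device.group = group

    def play(self, note):
        """Play a note locally and share it with every device in the
        group, so all devices output the same ensemble."""
        for device in self.group:
            device.output.append((self.instrument, note))
```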
The host device 400 may reproduce (or output) a music file stored in the host device 400 or another electronic device (for example, a music content providing server). Upon receipt of a music play request from a user, the host device 400 may display a list of available music files and reproduce a music file selected from the list. The host device 400 and the guest devices 400a, 400b, and 400c may play the musical instruments in an ensemble while the selected music file is being reproduced. That is, music selected by the user may serve as BackGround Music (BGM) for the ensemble. However, the "music file" is merely one embodiment of acoustic data reproducible by each of the devices 400, 400a, 400b, and 400c. According to various embodiments of the present disclosure, acoustic data other than a music file may be reproduced. The volume of the reproduced acoustic data may be controlled by, for example, a volume control menu 1810 illustrated in
Referring to
Upon receipt of a volume control request for the master volume, the host device 400 may control the volumes of the respective guest devices 400a, 400b, and 400c as well as its own volume, as illustrated in
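A host-side master volume can be modeled as a factor applied on top of each device's individual volume, so that one master request adjusts the host and all guests together. A minimal sketch (hypothetical names; an assumed 0.0-1.0 volume range):

```python
class VolumeController:
    """Host-side control of per-device volumes and a master volume."""

    def __init__(self, device_volumes):
        self.device_volumes = dict(device_volumes)  # base volume per device
        self.master = 1.0

    def set_master(self, level):
        # A master volume request adjusts the effective volume of the host
        # and of every guest device together.
        self.master = max(0.0, min(1.0, level))

    def set_device(self, name, level):
        # Individual volume control for one device.
        self.device_volumes[name] = max(0.0, min(1.0, level))

    def effective(self, name):
        # Effective output volume = device volume scaled by the master volume.
        return self.device_volumes[name] * self.master
```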
Referring to
In another embodiment of volume control, if the concert mode is turned off in any device (for example, the guest device 400a) by means of the concert mode on/off icon 1350, that device (that is, the guest device 400a) may not output the sounds of the musical instruments played in the other devices. For example, if the concert mode is off in the guest device 400a, the guest device 400a may output only the sound of its own musical instrument (that is, the piano played in the guest device 400a), without outputting the sounds of the musical instruments (for example, a drum and a xylophone) played in the other devices (for example, the guest devices 400b and 400c). A function(s) or operation(s) related to the volume control may preferably be performed by the sound controller 416.
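The concert-mode behavior amounts to a filter on received sound data: with the mode off, a device drops every sound that did not originate from its own instrument. A sketch under that assumption (names are illustrative, not taken from the disclosure):

```python
class ConcertModeDevice:
    """A device that filters shared sounds when its concert mode is off."""

    def __init__(self, own_instrument, concert_mode=True):
        self.own_instrument = own_instrument
        self.concert_mode = concert_mode
        self.output_log = []

    def receive(self, instrument, note):
        # With concert mode on, all shared sounds are output; with it off,
        # only the sound of this device's own instrument is output.
        if self.concert_mode or instrument == self.own_instrument:
            self.output_log.append((instrument, note))
```

A piano device with the mode off would thus output its own piano sounds while silently discarding drum or xylophone sounds shared by the other devices.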
Referring to
As is apparent from the foregoing description, since a user performs music using an external image freely created by the user, a larger play area can be secured than with conventional technology.
The above-described aspects of the present disclosure can be implemented in hardware, in firmware, or via the execution of software or computer code that can be stored in a recording medium such as a CD-ROM, a Digital Versatile Disc (DVD), a magnetic tape, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or computer code downloaded over a network, originally stored on a remote recording medium or a non-transitory machine-readable medium and to be stored on a local recording medium, so that the methods described herein can be rendered via such software stored on the recording medium using a general-purpose computer, a special processor, or programmable or dedicated hardware, such as an ASIC or an FPGA. As would be understood in the art, the computer, the processor, the microprocessor, the controller, or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein. In addition, it would be recognized that when a general-purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general-purpose computer into a special-purpose computer for executing the processing shown herein. Any of the functions and steps provided in the Figures may be implemented in hardware, in software, or in a combination of both, and may be performed in whole or in part within the programmed instructions of a computer. No claim element herein is to be construed under the provisions of 35 U.S.C. 112, sixth paragraph, unless the element is expressly recited using the phrase “means for”.
While the present disclosure has been particularly shown and described with reference to the examples provided therein, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims.
Foreign Application Priority Data

Number | Date | Country | Kind |
---|---|---|---|
10-2014-0101007 | Aug 2014 | KR | national |
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
6995310 | Knapp | Feb 2006 | B1 |
8866846 | Kim | Oct 2014 | B2 |
20040031377 | Oshiyama | Feb 2004 | A1 |
20050196131 | Narusawa | Sep 2005 | A1 |
20060045276 | Gamo | Mar 2006 | A1 |
20060084218 | Lee | Apr 2006 | A1 |
20060179160 | Uehara et al. | Aug 2006 | A1 |
20080208740 | Uehara | Aug 2008 | A1 |
20090114079 | Egan | May 2009 | A1 |
20100053105 | Choi | Mar 2010 | A1 |
20100178028 | Wahrhaftig | Jul 2010 | A1 |
20110045907 | Villa | Feb 2011 | A1 |
20110134061 | Lim | Jun 2011 | A1 |
20110316793 | Fushiki | Dec 2011 | A1 |
20120007884 | Kim | Jan 2012 | A1 |
20120057012 | Sitrick | Mar 2012 | A1 |
20120174736 | Wang | Jul 2012 | A1 |
20120304847 | Hacker | Dec 2012 | A1 |
20130133506 | Daisy | May 2013 | A1 |
20140000438 | Feis | Jan 2014 | A1 |
20140059471 | Adam | Feb 2014 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
10-2004-0064957 | Jul 2004 | KR |
10-1153333 | Jun 2012 | KR |
Prior Publication Data

Number | Date | Country |
---|---|---|
20160042727 A1 | Feb 2016 | US |