The present specification relates to a portable device and a control method therefor.
Recently, the use of portable devices has increased. A portable device may detect various inputs and execute an operation on the basis of a detected input. In particular, the portable device may detect a voice input as an input. In this case, a method for enabling a portable device to detect a voice input and execute a corresponding operation is required.
An object of the present specification is to provide a portable device and a control method therefor.
Another object of the present specification is to provide a method for enabling a portable device to execute an operation on the basis of a voice input.
Still another object of the present specification is to provide a method for enabling a portable device to execute an operation on the basis of activation or deactivation of a display unit.
Yet another object of the present specification is to provide a method for enabling a portable device to execute an operation at a default level if the portable device detects a voice input while a display unit is deactivated.
A further object of the present specification is to provide a method for enabling a portable device to display an interface indicating information on an operation and an execution level of the operation.
A further object of the present specification is to provide a method for enabling a portable device to control an activated state of a display unit on the basis of a user's gaze.
A further object of the present specification is to provide a method for enabling a portable device to control an activated state of a display unit on the basis of whether a user wears the portable device.
A further object of the present specification is to provide a method for enabling a portable device to execute an operation by using an external device.
A further object of the present specification is to provide a method for enabling a portable device to provide feedback for an operation executed on the basis of a voice input.
A further object of the present specification is to provide a method for enabling a portable device to detect status recognition information and execute an operation on the basis of the detected status recognition information.
A portable device may be provided in accordance with one embodiment of the present specification. The portable device comprises an audio sensing unit configured to detect a voice input and deliver the detected voice input to a processor; a display unit configured to display visual information; a control input sensing unit configured to detect a control input and deliver the detected control input to the processor; and the processor, which controls the audio sensing unit, the display unit and the control input sensing unit. If the processor detects a first voice input including a first part for executing a first operation and a second part indicating a first execution level for the first operation, the processor executes the first operation at the first execution level on the basis of the first voice input. If the processor detects a second voice input that includes only the first part for executing the first operation and the display unit is in an activated state, the processor displays a first interface indicating execution level information on the first operation, detects a control input for selecting a second execution level from the first interface, and executes the first operation at the second execution level on the basis of the detected control input. If the display unit is in a deactivated state, the processor executes the first operation at a default level on the basis of the second voice input.
A control method for a portable device according to one embodiment of the present specification comprises the steps of detecting a first voice input including a first part for executing a first operation and a second part indicating a first execution level for the first operation; executing the first operation at the first execution level on the basis of the first voice input; detecting a second voice input including only the first part for executing the first operation; if a display unit is in an activated state, displaying a first interface indicating execution level information on the first operation, detecting a control input for selecting a second execution level from the first interface, and executing the first operation at the second execution level on the basis of the detected control input; and, if the display unit is in a deactivated state, executing the first operation at a default level on the basis of the second voice input.
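For illustration only, the sketch below renders this control method in Python under simple assumptions: a keyword-based parser and a console prompt stand in for the audio sensing unit and the first interface. Every identifier here (parse_voice_input, handle_voice_input, and so on) is hypothetical; the specification does not prescribe any particular implementation.

```python
# A minimal sketch of the claimed control method, assuming a simple
# keyword-based parser. All identifiers are hypothetical.

KNOWN_OPERATIONS = {"make toast": "toast"}            # first-part commands
KNOWN_LEVELS = {"first stage": 1, "second stage": 2}  # second-part levels

def parse_voice_input(text):
    """Split an utterance into (first part, second part); either may be None."""
    operation = next((op for phrase, op in KNOWN_OPERATIONS.items()
                      if phrase in text), None)
    level = next((lv for phrase, lv in KNOWN_LEVELS.items()
                  if phrase in text), None)
    return operation, level

def ask_user_for_level(operation):
    # Stand-in for the first interface; a real device would render this
    # on the display unit and read a control input instead.
    return int(input(f"Select an execution level for '{operation}': "))

def handle_voice_input(text, display_activated, default_level=2):
    operation, level = parse_voice_input(text)
    if operation is None:
        return None                      # not a recognized command
    if level is not None:
        return operation, level          # first voice input: both parts present
    if display_activated:
        return operation, ask_user_for_level(operation)  # first interface
    return operation, default_level      # display deactivated: default level

# "make toast at the second stage" -> first part "toast", second part 2
print(handle_voice_input("make toast at the second stage", display_activated=False))
```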
The present specification may provide a portable device and a control method therefor.
Also, according to the present specification, the portable device may execute an operation on the basis of a voice input.
Also, according to the present specification, the portable device may execute an operation on the basis of activation or deactivation of a display unit.
Also, according to the present specification, the portable device may execute an operation at a default level if the portable device detects a voice input while a display unit is deactivated.
Also, according to the present specification, the portable device may display an interface indicating information on an operation and an execution level of the operation.
Also, according to the present specification, the portable device may control an activated state of a display unit on the basis of a user's gaze.
Also, according to the present specification, the portable device may control an activated state of a display unit based on whether a user wears the portable device.
Also, according to the present specification, the portable device may execute an operation using an external device.
Also, according to the present specification, the portable device may provide feedback for an operation executed on the basis of a voice input.
Also, according to the present specification, the portable device may detect status recognition information and execute an operation on the basis of the detected status recognition information.
Hereinafter, embodiments of the present specification will be described in detail with reference to the accompanying drawings and the disclosure illustrated by the drawings; however, it is to be understood that the claims of the present specification are not limited by such embodiments.
Although the terms used in this specification are selected from generally known and widely used terms in consideration of their functions in the present specification, the terms may vary depending on the intention of a person skilled in the art, practice, or the advent of new technology. Also, in specific cases, some terms may have been selected by the applicant at his or her discretion, and their detailed meanings are described in the relevant parts of this description. Accordingly, the terms used herein should be understood not simply by their literal names but by their intended meanings as disclosed herein.
Although terms such as “first” and/or “second” may be used in this specification to describe various elements, it is to be understood that the elements are not limited by such terms. The terms are used only to distinguish one element from another. For example, a first element may be referred to as a second element, and vice versa, within a range that does not depart from the scope of the present specification.
In this specification, when a part “comprises” or “includes” an element, it does not exclude other elements and may further comprise or include other elements, unless specifically stated otherwise. Also, the term “unit” or “module” disclosed in this specification refers to a unit for processing at least one function or operation, and may be implemented by hardware, by software, or by a combination of hardware and software.
The portable device 100 may comprise an audio sensing unit 110, a display unit 120, a control input sensing unit 130, and a processor 180. The portable device 100 may further comprise, as optional elements, an eyes detecting unit 140, a wearing sensor unit 150, a communication unit 160, and a status recognition information sensing unit 170.
The portable device 100 may comprise the audio sensing unit 110, which may be controlled by the processor 180. The audio sensing unit 110 may detect a voice input in the periphery of the portable device 100, for example by using a microphone. That is, the portable device 100 may detect a sound in its periphery and use the detected sound as an input, without being limited to the aforementioned example.
The portable device 100 may comprise the display unit 120, which may be controlled by the processor 180. The portable device 100 may display visual information by using the display unit 120. Depending on the embodiment, the display unit 120 may include at least one of an organic light emitting diode (OLED) display, a liquid crystal display (LCD), electronic ink, a head mounted display (HMD), and a flexible display. That is, the display unit 120 may display visual information by using a unit provided in the portable device 100. Also, for example, if the portable device 100 is a wearable device, the display unit 120 may display an augmented reality image. That is, the portable device may provide visual information to a user by using the display unit 120, without being limited to the aforementioned embodiment.
Also, the portable device 100 may comprise the control input sensing unit 130, which may be controlled by the processor 180. The control input sensing unit 130 may deliver a user input, or an environment recognized by the device, to the processor 180 by using at least one sensor mounted in the portable device 100. In more detail, the control input sensing unit 130 may sense a control input of the user by using at least one sensor mounted in the portable device 100. The at least one sensing means may include various sensing means for sensing the control input, such as a touch sensor, a fingerprint sensor, a motion sensor, a proximity sensor, an illumination sensor, a voice recognition sensor, and a pressure sensor. The control input sensing unit 130 collectively refers to the aforementioned sensing means, and the aforementioned sensors may be included in the portable device 100 as separate elements or may be incorporated into at least one combined element. Also, for example, the control input sensing unit 130 may be incorporated with the display unit 120; for example, it may be a touch sensitive display unit 120. That is, the processor 180 may detect an input for visual information displayed by the display unit 120 through the control input sensing unit 130.
The portable device 100 may further comprise the eyes detecting unit 140 as an optional element, which may be controlled by the processor 180. The eyes detecting unit 140 may detect the user's gaze; that is, it may detect whether the user looks at the portable device 100. If the user's gaze rests on the portable device 100 for a threshold time or more, the eyes detecting unit 140 may detect the user's gaze. The threshold time is a reference time for detecting the user's gaze and may have a certain error range. The eyes detecting unit 140 may deliver the detected gaze information to the processor 180.
The portable device 100 may further comprise the wearing sensor unit 150 as an optional element, which may be controlled by the processor 180. For example, the portable device 100 may be a wearable device. In this case, the portable device 100 may detect whether the user wears the portable device 100 by using the wearing sensor unit 150. As an embodiment, the portable device 100 may detect whether the user wears it by using a proximity sensor, or alternatively by using a sensor provided in a joint unit. That is, the portable device 100 may determine whether it is worn by the user by using the aforementioned sensors, and at least one sensing unit that provides such a sensed result will be referred to as the wearing sensor unit 150 in the present specification.
Also, the portable device 100 may further comprise the communication unit 160 as an optional element, which may be controlled by the processor 180. The communication unit 160 may perform communication with an external device using various protocols and may thus transmit and receive data. For example, the portable device 100 may transmit a triggering signal for operation execution to the external device through the communication unit 160. That is, the portable device 100 may exchange information with the external device by using the communication unit 160.
Also, the portable device 100 may further comprise the status recognition information sensing unit 170 as an optional element, which may be controlled by the processor 180. Status recognition information may be information on the status of the user or on the state of the device. For example, the status recognition information may be position information of the user, time information, motion information or user data information. Also, the status recognition information may be information indicating whether a sensor unit within the portable device 100 is active, whether a communication network is active, or charging information of the device. That is, the status recognition information may be information on the portable device 100 and on the user who uses the portable device 100, without being limited to the aforementioned examples. The status recognition information sensing unit 170 may be a sensor unit for sensing the status recognition information. For example, it may be a GPS unit that receives position information, a sensor unit that detects motion of the user, or an audio sensing unit that senses peripheral sound. That is, the status recognition information sensing unit 170 refers to a sensor unit that may sense information on the portable device 100 and the user, without being limited to the aforementioned examples.
The processor 180 may be a unit for controlling the audio sensing unit 110, the display unit 120 and the control input sensing unit 130. Also, the processor 180 may control at least one of the eyes detecting unit 140, the wearing sensor unit 150, the communication unit 160 and the status recognition information sensing unit 170. In more detail, the processor 180 may detect a voice input by using the audio sensing unit 110. For example, the voice input may include a first part for executing a first operation and a second part indicating a first execution level for the first operation. The first part may be a previously stored command for executing the first operation, and the second part may be a previously stored command indicating an execution level for the first operation. For example, the execution level for the first operation may be configured based on at least one of the attribute, type, operation time and operation method of the first operation. That is, the execution level for the first operation may be a detailed command for the first operation. For example, the first operation may be an operation for making toast. In this case, the first part may be a command corresponding to “toast” and a making operation, and the second part may be the baking intensity of the toast. For example, if the user says “make toast at the second stage”, the first part may be “toast” and “make”, and the second part may be “second stage”. The processor 180 may detect this voice input by using the audio sensing unit 110 and execute the first operation at the first execution level on the basis of the voice input; that is, the processor 180 may make toast at the second stage.

At this time, the processor 180 may execute the first operation by using an external device. In more detail, the processor 180 may transmit a first triggering signal to the external device by using the communication unit 160 on the basis of the detected voice input. The first triggering signal may be a signal commanding the external device to execute the first operation, and the external device may execute the first operation on the basis of the first triggering signal. That is, the portable device 100 may control, by using the communication unit 160, the external device in which the operation is executed. For example, the processor 180 may transmit the first triggering signal to a toaster, which is the external device, on the basis of the voice input of the user, and the toaster may make toast on the basis of the first triggering signal.
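Purely as an illustration of the first triggering signal, the sketch below assumes a JSON-over-UDP message to the external device; the specification does not define any transport or message format, and the address and field names are hypothetical.

```python
import json
import socket

def send_first_triggering_signal(operation, level,
                                 address=("192.168.0.20", 9999)):
    # Encode the command for the external device; JSON over UDP is an
    # assumption made only for this sketch.
    signal = {"operation": operation, "execution_level": level}
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(json.dumps(signal).encode("utf-8"), address)

# "make toast at the second stage": first part "toast", second part 2
send_first_triggering_signal("toast", 2)
```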
For another example, the processor 180 may detect status recognition information by using the status recognition information sensing unit 170, and may execute the first operation on the basis of the status recognition information. In more detail, the processor 180 may determine whether to execute the first operation in consideration of the status of the user or the device. Also, for example, the processor 180 may determine an execution level of the first operation in consideration of the status of the user or the device. As described above, the status recognition information may be user or device information. For example, if the time information corresponds to the morning and the position information of the user corresponds to home, the processor 180 may execute toast making, which is the first operation, on the basis of the voice input. Also, for example, if the time information corresponds to the afternoon and the position information of the user corresponds to the office, the processor 180 may not execute toast making even though the voice input is detected. That is, the processor 180 may control the first operation and the execution level of the first operation by using the information detected through the status recognition information sensing unit 170.
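The status-recognition decision above can be pictured as a small rule set; the sketch below hard-codes only the toast example from the text, and the rules, position labels, and function names are hypothetical.

```python
from datetime import datetime

def should_execute(operation, position, now=None):
    """Decide whether to execute an operation from status recognition
    information (here: time and position only)."""
    now = now or datetime.now()
    if operation == "toast":
        if now.hour < 12 and position == "home":
            return True    # morning at home: execute toast making
        if now.hour >= 12 and position == "office":
            return False   # afternoon at the office: suppress it
    return True            # other cases: do not suppress

print(should_execute("toast", "home", datetime(2014, 12, 23, 8, 0)))     # True
print(should_execute("toast", "office", datetime(2014, 12, 23, 15, 0)))  # False
```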
Also, the processor 180 may detect a voice input that includes only the first part for executing the first operation, that is, a voice input having no information on the execution level. For example, if the user says “make toast”, the processor 180 cannot determine the baking stage desired by the user and thus cannot execute the first operation in consideration of the execution level. In this case, if the display unit 120 is activated, the processor 180 may display a first interface indicating execution level information by using the display unit 120, detect a control input for selecting a second execution level from the first interface through the control input sensing unit 130, and execute the first operation at the second execution level on the basis of the detected control input. Here, the state in which the display unit 120 is activated may be a state in which the portable device 100 displays visual information through the display unit, that is, an on-state of the display unit 120. Whether the display unit 120 is activated need not be determined by whether power is supplied to the display unit 120 inside the portable device 100; rather, the display unit 120 is activated if the user can view visual information through the display unit 120, and deactivated if the user cannot. For example, if the processor 180 detects the user's gaze through the eyes detecting unit 140, the display unit 120 may be activated. For another example, if the processor 180 detects through the wearing sensor unit 150 that the portable device 100 is worn by the user, the processor 180 may activate the display unit. This will be described in more detail below.
Also, for example, if the display unit 120 is deactivated, the processor 180 may execute the first operation at a default level. The default level may be a default value set by the user or by the processor 180. For example, the default level may be set on the basis of the status recognition information; in more detail, the processor 180 may set an optimal condition as the default level in consideration of the status of the user or the device. For example, if the user says “make toast”, the processor 180 may make toast at the second stage, which is the default level set by the user. For another example, the processor 180 may set the stage used most frequently by the user as the default level by using history information of the user. That is, the default level may be a level previously set on the basis of the status recognition information.
That is, if the processor 180 detects a voice input that includes only the first part for executing the first operation, the processor 180 may execute the first operation differently depending on whether the display unit is activated. If the display unit is activated, the processor 180 may control execution of the first operation through the interface; if the display unit is deactivated, the processor 180 may first execute the first operation at the default level and then control the first operation. This will be described in more detail below.
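As a sketch of how the activated state might be derived from the optional sensing units described above, the function below combines a gaze duration and a wearing flag; the threshold value and all names are assumptions for illustration.

```python
GAZE_THRESHOLD_S = 1.0  # hypothetical threshold time for gaze detection

def display_should_be_activated(gaze_duration_s, device_is_worn):
    # The display unit is treated as activated when the user's gaze has
    # rested on the device for the threshold time or more, or when the
    # wearing sensor unit reports that the device is worn.
    return gaze_duration_s >= GAZE_THRESHOLD_S or device_is_worn

print(display_should_be_activated(1.2, False))  # True: gaze held long enough
print(display_should_be_activated(0.3, True))   # True: device is worn
```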
Also, the aforementioned elements may be included in the portable device 100 as separate elements, or may be incorporated into at least one combined element.
Also, the first device 100 may detect a second voice input that includes only a first part for executing the first operation; that is, the second voice input may not include information on the execution level of the first operation. In this case, if the display unit of the first device 100 is in an activated state, the first device 100 may display a first interface for the execution level of the first operation, detect a control input for selecting a second execution level from the first interface, and transmit a second triggering signal to the second devices 210, 220, 230 and 240 on the basis of the control input. The second devices 210, 220, 230 and 240 may execute the first operation at the second execution level on the basis of the second triggering signal. That is, if the display unit is in an activated state, the first device 100 may display information on the execution level and control the operation accordingly. For another example, if the display unit of the first device 100 is in a deactivated state, the first device 100 may transmit a third triggering signal to the second devices 210, 220, 230 and 240, the third triggering signal being a signal for executing the first operation at a default level. The second devices 210, 220, 230 and 240 may execute the first operation at the default level on the basis of the third triggering signal. That is, if the display unit is in a deactivated state, the first device 100 may cause the first operation to be executed at the default level. Hereinafter, in this specification, a device that executes an operation by detecting a voice input will be described based on the portable device, and the description may equally be applied to the voice recognition system.
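For the voice recognition system, the triggering-signal fan-out to the second devices might look like the sketch below; the class and method names are hypothetical, and printing stands in for actual device control.

```python
class SecondDevice:
    """Hypothetical external device that executes an operation when it
    receives a triggering signal from the first device."""
    def __init__(self, name):
        self.name = name

    def on_trigger(self, operation, level):
        print(f"{self.name}: executing '{operation}' at level {level}")

def broadcast_trigger(second_devices, operation, level):
    # Second or third triggering signal: the chosen or default level.
    for device in second_devices:
        device.on_trigger(operation, level)

devices = [SecondDevice("toaster"), SecondDevice("oven")]
broadcast_trigger(devices, "toast", level=2)  # e.g. level picked on the first interface
```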
If the portable device 100 detects the second voice input in the deactivated state, the portable device 100 may execute the first operation at the default level. In more detail, if the user does not view the portable device 100 or does not use it, the portable device 100 cannot be controlled by the user. Therefore, the portable device 100 may first execute the first operation at a default level on the basis of the second voice input. The default level may be a basic value previously set by the user or the processor. For another example, the default level may be a value set based on status recognition information of the user or the device. For example, the default level may be the value having the highest frequency in history information on the usage record of the user. That is, the default level may be a value that can be set in a state in which there is no input from the user. If the portable device 100 detects the second voice input, the portable device may transmit a triggering signal to the external device 320, and the external device 320 may execute the first operation at the default level on the basis of the triggering signal. As a result, the user may execute the operation simply, even without a detailed control command or operation.
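The history-based default mentioned above reduces to picking the most frequent level from a usage record; a minimal sketch, assuming the record is a plain list of past levels:

```python
from collections import Counter

def default_level_from_history(history, fallback=1):
    # Return the level used most frequently; fall back to a preset value
    # when there is no usage record yet.
    if not history:
        return fallback
    return Counter(history).most_common(1)[0][0]

print(default_level_from_history([1, 2, 2, 3, 2]))  # -> 2
```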
Also, the portable device 100 may switch the display unit 120 to the activated state in various manners, for example on the basis of the user's gaze or of whether the portable device 100 is worn, without being limited to these examples.
The portable device 100 may detect that the display unit is switched from the deactivated state to the activated state. At this time, the portable device 100 may further display the first interface 410. That is, the portable device 100 may first execute the first operation while the display unit is deactivated and then display information on the first operation once the display unit is activated. The portable device 100 may further display a first indicator 710 within the displayed first interface 410, the first indicator 710 being an indicator for controlling the execution level of the first operation. The portable device 100 may detect a control input for the first indicator 710 and control the execution level of the first operation on the basis of that control input. Also, for example, the portable device may further display a second interface 720 indicating execution information on the first operation for the state in which the display unit 120 was deactivated. In more detail, the portable device 100 may execute the first operation at a default level while the display unit 120 is deactivated, and the user may wish to identify the execution information of the first operation for the time when the display unit 120 was deactivated. Therefore, the portable device may display the second interface 720 indicating the execution information of the first operation when the display unit 120 is switched to the activated state. For example, the second interface 720 may include information on the operation time for which the first operation has been executed, progress information of the first operation, and information as to whether execution of the first operation is completed. That is, the second interface 720 may indicate the information of the first operation for the state in which the display unit 120 was deactivated, without being limited to the aforementioned examples. For another example, the portable device 100 may execute the first operation at the default level and end execution of the first operation if the display unit 120 is not activated within a first threshold time, which may have a certain error range. That is, the portable device 100 may end the first operation after the passage of a certain time, even without a control operation of the user, if the display unit is not activated. As a result, execution and termination of the first operation may be managed even when the user does not directly control the portable device 100.
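The default-level execution with the first threshold time could be pictured as the loop below; `display` and `operation` are hypothetical stand-ins for the real units, and the threshold value is an assumption.

```python
import time

FIRST_THRESHOLD_S = 300.0  # hypothetical first threshold time

def run_at_default_level(display, operation, default_level):
    """Execute at the default level while the display is deactivated;
    end the operation if the display is not activated in time."""
    operation.start(level=default_level)
    started = time.monotonic()
    while not display.is_activated():
        if time.monotonic() - started >= FIRST_THRESHOLD_S:
            operation.end()        # no user control arrived in time
            return None
        time.sleep(0.1)
    # Display reactivated: surface execution info and a level control.
    return {
        "first_indicator": operation.current_level(),   # level control (710)
        "second_interface": operation.progress_info(),  # execution info (720)
    }
```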
Describing the control method by steps, the portable device 100 may detect a voice input by using the audio sensing unit 110 and may detect whether a first part for executing a first operation is included in the detected voice input (S1120). As described above, the first part may be a previously stored command for executing the first operation.
Next, the portable device 100 may detect whether a second part indicating a first execution level for the first operation is included in the detected voice input (S1130). As described above, the second part may be a previously stored command indicating an execution level for the first operation.
Next, if the second part indicating the first execution level is included in the voice input, the portable device 100 may execute the first operation at the first execution level (S1140), as described above.
Next, if the second part indicating the first execution level is not included in the voice input, the portable device 100 may detect whether the display unit is activated (S1150). As described above, the display unit is activated if the user can view visual information through it.
Next, if the display unit is in an activated state, the portable device 100 may display a first interface indicating an execution level of the first operation (S1220), as described above.
Next, the portable device 100 may detect a control input for selecting a second execution level from the first interface (S1230), as described above.
Next, the portable device 100 may execute the first operation at the second execution level on the basis of the detected control input (S1240), as described above.
Next, if the display unit is in a deactivated state, the portable device 100 may execute the first operation at a default level (S1250). As described above, the default level may be a value previously set by the user or the processor, or a value set on the basis of the status recognition information.
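Read end to end, these steps map onto a single branch structure; the compact sketch below labels each branch with its step number, with trivial stand-ins for parsing and execution (all names hypothetical).

```python
def parse(text):
    # Trivial stand-in for S1120/S1130: detect the first and second parts.
    operation = "toast" if "toast" in text else None
    level = 2 if "second stage" in text else None
    return operation, level

def execute(operation, level):
    print(f"executing '{operation}' at level {level}")

def control_method(text, display_activated, default_level=1):
    operation, level = parse(text)
    if operation is None:                  # S1120: no first part detected
        return
    if level is not None:                  # S1130: second part present
        execute(operation, level)          # S1140: first execution level
    elif display_activated:                # S1150: display unit activated
        level = 2                          # S1220/S1230: level picked on the interface
        execute(operation, level)          # S1240: second execution level
    else:
        execute(operation, default_level)  # S1250: default level

control_method("make toast", display_activated=False)  # -> default level
```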
For another example, if the portable device 100 executes the first operation at the default level, the portable device 100 may provide feedback for the first operation. The feedback may include at least one of visual feedback, audio feedback and tactile feedback.
Moreover, although the description has been made for each of the drawings for convenience of description, the embodiments of the respective drawings may be combined to achieve new embodiments. Also, a computer readable recording medium in which a program for implementing the aforementioned embodiments is recorded may be designed according to the needs of a person skilled in the art, within the scope of the present specification.
The portable device 100 and the control method therefor according to the present specification are not limited to the aforementioned embodiments; all or some of the aforementioned embodiments may be selectively combined so that various modifications can be made.
Meanwhile, the portable device 100 and the control method therefor according to the present specification may be implemented as processor-readable code in a recording medium readable by a processor provided in a network device. The processor-readable recording medium includes all kinds of recording media in which data readable by a processor are stored; examples include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device. The processor-readable recording medium may also be implemented in the form of a carrier wave, such as transmission over the Internet. Also, the processor-readable recording medium may be distributed over computer systems connected through a network, so that the processor-readable code may be stored and executed in a distributed manner.
It will be apparent to those skilled in the art that the present specification can be embodied in other specific forms without departing from the spirit and essential characteristics of the specification. Thus, the above embodiments are to be considered in all respects as illustrative and not restrictive. The scope of the specification should be determined by reasonable interpretation of the appended claims, and all changes which come within the equivalent scope of the specification are included in the scope of the specification.
In this specification, both a product invention and a method invention have been described, and the descriptions of both inventions may be applied complementarily, if necessary.
The present invention has industrial applicability in that it can be used in terminal devices, and it is reproducible.
Filing Document: PCT/KR2014/012739
Filing Date: 12/23/2014
Country: WO
Kind: 00