The disclosure relates to a method for selection and control of an object on the basis of classification and an electronic device configured to perform the method.
In order to provide an enhanced user experience, an electronic device that provides an augmented reality (AR) service displaying computer-generated information in connection with an object in the real world is being developed. The electronic device may be a wearable device that may be worn by a user. For example, the electronic device may be AR glasses.
Provided are a method for selection and control of an object on the basis of classification and an electronic device configured to perform the method.
According to an aspect of the disclosure, a wearable device includes: a display; and at least one processor, wherein the at least one processor is operably connected with the display and is configured to: receive an input indicating selection of a first external electronic device and a second external electronic device among a plurality of external electronic devices viewable through the display; based on the input, identify third external electronic devices in a category including both the first external electronic device and the second external electronic device, and identify the first external electronic device and the second external electronic device along with the third external electronic devices; display a first visual object and a second visual object respectively indicating that the first external electronic device and the second external electronic device are selected; and display at least one third visual object guiding selection of at least one fourth external electronic device among the third external electronic devices.
According to an aspect of the disclosure, a method of a wearable device includes: receiving a first input indicating selection of a first external electronic device and a second external electronic device among a plurality of external electronic devices viewable through a display of the wearable device; based on the first input, identifying third external electronic devices in a category including both the first external electronic device and the second external electronic device, and identifying the first external electronic device and the second external electronic device; displaying a first visual object and a second visual object respectively indicating that the first external electronic device and the second external electronic device are selected; and displaying at least one third visual object guiding selection of at least one fourth external electronic device among the third external electronic devices.
According to an aspect of the disclosure, a wearable device includes: a display; and at least one processor, wherein the at least one processor is configured to: receive a first input indicating selection of one or more first external electronic devices among a plurality of external electronic devices viewable through the display; based on the first input, identify one or more first functions applicable to the one or more first external electronic devices; display, within the display, one or more visual objects for executing the one or more first functions; in a state that the one or more visual objects are displayed, identify, based on a second input indicating selection of at least one second external electronic device, at least one second function applicable to the at least one second external electronic device; and cease displaying, among the one or more visual objects, at least one visual object with respect to a function different from the at least one second function.
The above and other aspects, features, and advantages of certain embodiments of the disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Hereinafter, one or more embodiments of the disclosure will be described with reference to the accompanying drawings.
The one or more embodiments of the disclosure and terms used herein are not intended to limit the technology described in the disclosure to specific embodiments, and should be understood to include various modifications, equivalents, or substitutes of the corresponding embodiment. In relation to the description of the drawings, a reference numeral may be used for a similar component. A singular expression may include a plural expression unless it is clearly meant differently in the context.
The term “couple” and the derivatives thereof refer to any direct or indirect communication between two or more elements, whether or not those elements are in physical contact with each other. The terms “transmit”, “receive”, and “communicate”, as well as the derivatives thereof, encompass both direct and indirect communication. The terms “include” and “comprise”, and the derivatives thereof, refer to inclusion without limitation. The term “or” is an inclusive term meaning “and/or”. The phrase “associated with”, as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, have a relationship to or with, or the like. The term “controller” refers to any device, system, or part thereof that controls at least one operation. The functionality associated with any particular controller may be centralized or distributed, whether locally or remotely.
In the disclosure, an expression such as “A or B”, “at least one of A and/or B”, “A, B or C”, or “at least one of A, B and/or C”, and the like may include all possible combinations of items listed together. Expressions such as “1st”, “2nd”, “first”, or “second”, and the like may modify the corresponding components regardless of order or importance, are only used to distinguish one component from another component, and do not limit the corresponding components. When a (e.g., first) component is referred to as being “connected (functionally or communicatively)” to or “accessed” by another (e.g., second) component, the component may be directly connected to the other component or may be connected through another component (e.g., a third component).
The term “module” used in the disclosure may include or correspond to hardware, software, or firmware, and may be used interchangeably with terms such as logic, logic block, component, or circuit, and the like. The module may be an integrally configured component or a minimum unit or part thereof that performs one or more functions. For example, a module may be configured with an application-specific integrated circuit (ASIC). For example, the “module” may be implemented by a program that is stored in an addressable storage medium and is executed by a processor. For example, the “module” may be implemented by components such as software components, object-oriented software components, class components, and task components, processes, functions, attributes, procedures, sub-routines, segments of program code, drivers, firmware, microcode, a circuit, data, a database, data structures, tables, arrays, and parameters.
Moreover, multiple functions described below can be implemented or supported by one or more computer programs, each of which is formed from computer readable program code and embodied in a computer readable medium. The terms “application” and “program” refer to one or more computer programs, software components, sets of instructions, procedures, functions, objects, classes, instances, related data, or a portion thereof adapted for implementation in a suitable computer readable program code. The phrase “computer readable program code” includes any type of computer code, including source code, object code, and executable code. The phrase “computer readable medium” includes any type of medium capable of being accessed by a computer, such as Read Only Memory (ROM), Random Access Memory (RAM), a hard disk drive, a Compact Disc (CD), a Digital Video Disc (DVD), or any other type of memory. A “non-transitory” computer readable medium excludes wired, wireless, optical, or other communication links that transport transitory electrical or other signals. A non-transitory computer readable medium includes media where data can be permanently stored and media where data can be stored and later overwritten, such as a rewritable optical disc or an erasable memory device.
The wearable device 101 according to an embodiment may communicate with one or more external electronic devices (e.g., external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171) different from the wearable device 101 through a wired network and/or a wireless network. The wired network may include a network such as the Internet, a local area network (LAN), a wide area network (WAN), Ethernet, or a combination thereof. The wireless network may include a network such as long term evolution (LTE), 5G new radio (NR), wireless fidelity (WiFi), Zigbee, near field communication (NFC), Bluetooth, Bluetooth low-energy (BLE), or a combination thereof. For example, the wearable device 101 may communicate with one or more external electronic devices through a server in a network. A signal and/or a packet exchanged between the wearable device 101 and the one or more external electronic devices may be relayed by the server. For example, the wearable device 101 may be directly connected to the one or more external electronic devices, independently of the server, by using Bluetooth, BLE, NFC, and/or WiFi direct (or Wi-Fi peer-to-peer (P2P)). For example, the wearable device 101 may be connected indirectly through one or more routers and/or an access point (AP) in the network.
The external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171 of
Identifying the external electronic device by the wearable device 101 may include an operation of identifying at least one function supportable by the external electronic device, based on the metadata. Identifying the external electronic device by the wearable device 101 may also include an operation of classifying the external electronic device based on hierarchical categories. Hereinafter, a category may be referred to as a classification and/or a class. The hierarchical categories may be used to identify at least one function commonly applicable to a plurality of external electronic devices. An operation in which the wearable device 101 according to an embodiment classifies at least one external electronic device accessible by the wearable device 101 based on the hierarchical categories will be described with reference to
Referring to
Referring to
The wearable device 101 according to an embodiment may execute a function related to augmented reality (AR). The wearable device 101 may include a display for displaying a screen related to the augmented reality. The display may include a transparent material. For example, based on the transparent material included in the display, light emitted toward one surface of the display may at least partially penetrate to the opposite surface. The wearable device 101 according to an embodiment may display, within the display, a screen related to an external object viewed by the user through the display, based on the augmented reality. The wearable device 101 according to an embodiment may display information related to the external object to the user, based on the screen. The wearable device 101 according to an embodiment may display, to the user by using the screen, information for controlling an external object such as the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, or information electrically received from the external object.
Referring to
In a state of identifying the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, the wearable device 101 according to an embodiment may receive an input indicating selection of at least one of the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and the visual objects 151, 152, and 153. The input may include an input selecting an external electronic device which is tangible and viewed through the FoV 110, such as the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and/or another input indicating selection of an external electronic device by using a visual object representing the external electronic device, such as the visual objects 151, 152, and 153.
By using homogeneity based on a category of each of the external electronic devices corresponding to the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and the visual objects 151, 152, and 153, the wearable device 101 according to an embodiment may guide the user in selecting a group of external electronic devices distinguished by the category. Example operations in which the wearable device 101 receives the input are described with reference to
In an embodiment, in response to (or based on) an input indicating selection of at least one of the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171 and the visual objects 151, 152, and 153, the wearable device 101 may display one or more visual objects for guiding selection of another external electronic device, which is distinguished from the at least one external electronic device selected by the input and is similar to the at least one external electronic device selected by the input. Identifying the other external electronic device may be performed based on the wearable device 101 classifying external electronic devices corresponding to the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and the visual objects 151, 152, and 153 into the hierarchical categories. An example of an operation in which the wearable device 101 according to an embodiment displays the one or more visual objects based on hierarchical categories will be described with reference to
In an embodiment, in response to (or based on) the input indicating selection of at least one of the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and the visual objects 151, 152, and 153, the wearable device 101 may display a screen for controlling at least one external electronic device selected by the input. In a state in which a plurality of external electronic devices is selected by the input, the wearable device 101 may selectively and/or preferentially display functions commonly applicable to the plurality of external electronic devices selected by the input. An example of an operation in which the wearable device 101 according to an embodiment selectively displays the functions in the state will be described with reference to
As described above, the wearable device 101 according to an embodiment may receive an input indicating selection of at least two external electronic devices, based on the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and the visual objects 151, 152, and 153 through the display. For example, the wearable device 101 may receive an input indicating selection of a first external electronic device and a second external electronic device. In response to (or based on) the input, the wearable device 101 may identify third external electronic devices in a category including both the first external electronic device and the second external electronic device. The third external electronic devices may include the first external electronic device and the second external electronic device. The third external electronic devices may be identified from among other external electronic devices related to the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and the visual objects 151, 152, and 153. The wearable device 101 may display, as associated with the first external electronic device and the second external electronic device viewable through the display, a first visual object and a second visual object respectively indicating that the first external electronic device and the second external electronic device are selected. The wearable device 101 may display, as associated with at least one fourth external electronic device among the third external electronic devices that is different from the first external electronic device and the second external electronic device, at least one third visual object guiding selection of the at least one fourth external electronic device.
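As a rough illustration of the flow just described, the following Python sketch marks two selected devices and proposes further candidates from a category containing both of them. The function and variable names (handle_selection, category_of, the "selected"/"candidate" marker kinds) are assumptions introduced for this sketch, with categories reduced to plain strings; it is not the disclosed implementation.

```python
def handle_selection(first: str, second: str, category_of: dict) -> dict:
    """Mark two selected devices and propose other devices in a category containing both."""
    markers = {first: "selected", second: "selected"}   # first and second visual objects
    if category_of.get(first) == category_of.get(second):
        common = category_of[first]
        for device, category in category_of.items():
            if device not in markers and category == common:
                markers[device] = "candidate"           # third visual objects guiding selection
    return markers


# Example: devices "121", "122", and "123" share a "light" category, "161" does not.
category_of = {"121": "light", "122": "light", "123": "light", "161": "speaker"}
print(handle_selection("121", "122", category_of))
# -> {'121': 'selected', '122': 'selected', '123': 'candidate'}
```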
As described above, the wearable device 101 according to an embodiment may support selection and/or control of one or more external electronic devices based on the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and/or the visual objects 151, 152, and 153, which are viewable through the FoV 110. In response to (or based on) an input indicating selection of a plurality of external electronic devices, the wearable device 101 may identify functions commonly applicable to the plurality of external electronic devices, or may identify information commonly receivable from the plurality of external electronic devices, by using the metadata. Based on the commonly applicable functions, the wearable device 101 may control all the plurality of external electronic devices. Based on the commonly receivable information, the wearable device 101 may at least partially combine information received from each of the plurality of external electronic devices. The wearable device 101 may display the at least partially combined information to the user. For example, the wearable device 101 may improve a user experience related to the selection and/or the control of the plurality of external electronic devices.
Hereinafter, one or more hardware included in the wearable device 101 according to an embodiment will be described with reference to
according to an embodiment. The wearable device 101 of
The processor 210 of the wearable device 101 according to an embodiment may include a hardware component for processing data, based on one or more instructions. The hardware component for processing data may include, for example, an arithmetic and logic unit (ALU), a floating point unit (FPU), a field programmable gate array (FPGA), and/or a central processing unit (CPU). The number of the processors 210 may be one or more. For example, the processor 210 may have a structure of a multi-core processor such as a dual core, a quad core, or a hexa core.
The memory 220 of the wearable device 101 according to an embodiment may include a hardware component for storing data and/or instructions inputted to the processor 210 or outputted from the processor 210. The memory 220 may include, for example, volatile memory such as random-access memory (RAM), and/or non-volatile memory such as read-only memory (ROM). The volatile memory may include, for example, at least one of dynamic RAM (DRAM), static RAM (SRAM), Cache RAM, and pseudo SRAM (PSRAM). The non-volatile memory may include, for example, at least one of programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), flash memory, a hard disk, a compact disk, and an embedded multimedia card (eMMC).
One or more instructions indicating a calculation and/or an operation to be performed on data by the processor 210 may be stored in the memory 220. A set of one or more instructions may be referred to as firmware, an operating system, a process, a routine, a sub-routine, and/or an application. For example, the wearable device 101 and/or the processor 210 may perform at least one of the operations of
The display 230 of the wearable device 101 according to an embodiment may output visualized information (e.g., at least one of screens of
The camera 240 of the wearable device 101 according to an embodiment may include one or more optical sensors (e.g., a charge coupled device (CCD) sensor and a complementary metal oxide semiconductor (CMOS) sensor) generating an electronic signal indicating a color and/or brightness of light. A plurality of optical sensors included in the camera 240 may be disposed in a form of a two-dimensional array. The camera 240 may generate an image corresponding to light reaching the optical sensors of the two-dimensional array and including a plurality of pixels arranged in two dimensions, by substantially simultaneously obtaining an electronic signal from each of the plurality of optical sensors. For example, photographic data captured by using the camera 240 may mean one image obtained from the camera 240. For example, video data captured by using the camera 240 may mean a sequence of a plurality of images obtained from the camera 240 according to a preset frame rate. The wearable device 101 according to an embodiment may further include a flashlight, which is disposed toward a direction in which the camera 240 receives light, for outputting light in the direction. The number of cameras 240 included in the wearable device 101 may be one or more.
In an embodiment, a FoV of the camera 240 may correspond to an area corresponding to an image generated by the camera 240. The FoV of the camera 240 is an area formed based on a view angle in which a lens of the camera 240 is capable of receiving light.
Hereinafter, a subject may mean an object included in the FoV of the camera 240, and distinguished from the wearable device 101. In an embodiment, the FoV of the camera 240 may at least partially match an environment viewable to the user through the display 230, such as the FoV 110 of
The wearable device 101 according to an embodiment may include an output means for outputting information in a form other than a visualized form. For example, the wearable device 101 may include a speaker for outputting an acoustic signal. For example, the wearable device 101 may include a motor for providing haptic feedback based on a vibration.
The communication circuitry 250 of the wearable device 101 according to an embodiment may include hardware for supporting transmission and/or reception of an electronic signal between the wearable device 101 and an external electronic device 290. For example, the external electronic device 290 may include or correspond to external electronic devices corresponding to each of the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171, and/or the visual objects 151, 152, and 153 of
Referring to
Although illustrated based on different blocks, an embodiment is not limited thereto; a portion (e.g., at least one of the processor 210, the memory 220, and the communication circuitry 250) of the hardware components illustrated in
The sensor 260 of the wearable device 101 according to an embodiment may generate electronic information to be processed by the processor 210 and/or the memory 220 from non-electronic information related to the wearable device 101. The electronic information generated by the sensor 260 may be stored in the memory 220, processed by the processor 210, and/or transmitted to an external electronic device distinguished from the wearable device 101.
According to an embodiment, the sensor 260 of the wearable device 101 may include at least one of a gyro sensor, a gravity sensor, and/or an acceleration sensor for detecting a posture of the wearable device 101 and/or a posture of a body part (e.g., a head) of a user wearing the wearable device 101. Each of the gravity sensor and the acceleration sensor may measure acceleration of gravity and/or acceleration based on preset three-dimensional axes (e.g., x-axis, y-axis, and z-axis) perpendicular to each other. The gyro sensor may measure angular velocity of each of the preset three-dimensional axes (e.g., x-axis, y-axis, and z-axis). The acceleration of the gravity, the acceleration, and the angular velocity measured in each of the gravity sensor, the acceleration sensor, and the gyro sensor may be outputted in a form of electronic information that is capable of being processed by the processor 210 and/or being stored in the memory 220. The gravity sensor, the acceleration sensor, and the gyro sensor may repeatedly output the acceleration of the gravity, the acceleration, and the angular velocity, based on a preset period. At least one of the gravity sensor, the acceleration sensor, and the gyro sensor may be referred to as an inertial measurement unit (IMU).
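For illustration only, the sketch below shows one way the periodic gyro output described above could be folded into a posture estimate by integrating angular velocity over the output period. The ImuSample structure, the integrate_posture helper, and the 10 ms period are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass


@dataclass
class ImuSample:
    """Angular velocity (rad/s) about the preset x-, y-, and z-axes, as output by the gyro sensor."""
    gx: float
    gy: float
    gz: float


def integrate_posture(samples, period_s: float):
    """Accumulate angular velocity over the preset output period into roll/pitch/yaw angles (rad)."""
    roll = pitch = yaw = 0.0
    for s in samples:
        roll += s.gx * period_s
        pitch += s.gy * period_s
        yaw += s.gz * period_s
    return roll, pitch, yaw


# Example: three samples output at a 10 ms period while the head turns about the z-axis (yaw).
print(integrate_posture([ImuSample(0.0, 0.0, 0.5)] * 3, period_s=0.01))  # (0.0, 0.0, 0.015)
```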
The microphone 270 of the wearable device 101 according to an embodiment may output an electronic signal representing a vibration of the atmosphere. For example, the wearable device 101 may obtain an audio signal including the user's speech by using the microphone 270. The user's speech included in the audio signal may be converted into information in a format recognizable by the processor 210 of the wearable device 101, based on a voice recognition model and/or a natural language understanding model. For example, the wearable device 101 may execute one or more functions from among a plurality of functions that may be provided by the wearable device 101, by recognizing the user's speech. An example of an operation performed by the wearable device 101 according to an embodiment based on the speech received through the microphone 270 will be described with reference to
As described above, the wearable device 101 according to an embodiment may include one or more hardware for providing a user experience based on augmented reality. For example, the wearable device 101 may identify the external electronic device 290 viewable through the display 230, by using the communication circuitry 250 and/or the camera 240. The wearable device 101 may display a screen for selecting and/or controlling the identified external electronic device 290 within the display 230. The wearable device 101 may receive an input indicating selection of the external electronic device 290, based on the sensor 260, the microphone 270, and/or the communication circuitry 250. Based on identifying selection of a plurality of external electronic devices by the input, the wearable device 101 may identify at least one function commonly applicable to the plurality of external electronic devices by using a hierarchical connection of categories assigned to each of the plurality of external electronic devices. The wearable device 101 may improve a user experience related to the control of the plurality of external electronic devices by displaying a screen for executing the at least one function to the user through the display 230.
Hereinafter, an example of a structure of the wearable device 101 according to an embodiment will be described with reference to
According to an embodiment, the wearable device 101 may be wearable on a portion of the user's body. The wearable device 101 may provide augmented reality (AR), virtual reality (VR), or mixed reality (MR) combining the augmented reality and the virtual reality to a user wearing the wearable device 101. For example, the wearable device 101 may display a virtual reality image provided from at least one optical device 382 and 384 on at least one display 230, in response to (or based on) a user's preset gesture obtained through a motion recognition camera 240-2.
According to an embodiment, the at least one display 230 may provide visual information to a user. For example, the at least one display 230 may include a transparent or translucent lens. The at least one display 230 may include a first display 230-1 and/or a second display 230-2 spaced apart from the first display 230-1. For example, the first display 230-1 and the second display 230-2 may be disposed at positions corresponding to the user's left and right eyes, respectively.
Referring to
According to an embodiment, a frame 300 may be configured with a physical structure in which the wearable device 101 may be worn on the user's body. According to an embodiment, the frame 300 may be configured so that when the user wears the wearable device 101, the first display 230-1 and the second display 230-2 may be positioned corresponding to the user's left and right eyes. The frame 300 may support the at least one display 230. For example, the frame 300 may support the first display 230-1 and the second display 230-2 to be positioned at positions corresponding to the user's left and right eyes.
Referring to
For example, the frame 300 may include a first rim 301 surrounding at least a portion of the first display 230-1, a second rim 302 surrounding at least a portion of the second display 230-2, a bridge 303 disposed between the first rim 301 and the second rim 302, a first pad 311 disposed along a portion of the edge of the first rim 301 from one end of the bridge 303, a second pad 312 disposed along a portion of the edge of the second rim 302 from the other end of the bridge 303, the first temple 304 extending from the first rim 301 and fixed to a portion of the wearer's ear, and the second temple 305 extending from the second rim 302 and fixed to a portion of the wearer's opposite ear. The first pad 311 and the second pad 312 may be in contact with a portion of the user's nose, and the first temple 304 and the second temple 305 may be in contact with a portion of the user's face and a portion of the user's ear. The temples 304 and 305 may be rotatably connected to the rims through hinge units 306 and 307. The first temple 304 may be rotatably connected with respect to the first rim 301 through the first hinge unit 306 disposed between the first rim 301 and the first temple 304. The second temple 305 may be rotatably connected with respect to the second rim 302 through the second hinge unit 307 disposed between the second rim 302 and the second temple 305.
According to an embodiment, the wearable device 101 may include hardware (e.g., hardware described above based on the block diagram of
According to an embodiment, the at least one optical device 382 and 384 may project a virtual object on the at least one display 230 in order to provide various image information to the user. For example, the at least one optical device 382 and 384 may be a projector. The at least one optical device 382 and 384 may be disposed adjacent to the at least one display 230 or may be included in the at least one display 230 as a portion of the at least one display 230. According to an embodiment, the wearable device 101 may include a first optical device 382 corresponding to the first display 230-1, and a second optical device 384 corresponding to the second display 230-2. For example, the at least one optical device 382 and 384 may include the first optical device 382 disposed at a periphery of the first display 230-1 and the second optical device 384 disposed at a periphery of the second display 230-2. The first optical device 382 may transmit light to the first waveguide 333 disposed on the first display 230-1, and the second optical device 384 may transmit light to the second waveguide 334 disposed on the second display 230-2.
In one or more embodiments, a camera 240 (e.g., the camera 240 of
The photographing camera may photograph a real image or background to be matched with a virtual image in order to implement the augmented reality or mixed reality content. The photographing camera may photograph an image of a specific object existing at a position (e.g., the FoV 110 of
The eye tracking camera 240-1 may implement a more realistic augmented reality by matching the user's gaze with the visual information provided on the at least one display 230, by tracking the gaze of the user wearing the wearable device 101. For example, when the user looks at the front, the wearable device 101 may naturally display environment information associated with the user's front on the at least one display 230 at a position where the user is positioned. The eye tracking camera 240-1 may be configured to capture an image of the user's pupil in order to determine the user's gaze. For example, the eye tracking camera 240-1 may receive gaze detection light reflected from the user's pupil and may track the user's gaze based on the position and movement of the received gaze detection light. In an embodiment, the eye tracking camera 240-1 may be disposed at a position corresponding to the user's left and right eyes. For example, the eye tracking camera 240-1 may be disposed in the first rim 301 and/or the second rim 302 to face the direction in which the user wearing the wearable device 101 is positioned.
The motion recognition camera 240-2 may provide a specific event to the screen provided on the at least one display 230 by recognizing the movement of the whole or a portion of the user's body, such as the user's torso, hand, or face. The motion recognition camera 240-2 may obtain a signal corresponding to motion by recognizing the user's motion (e.g., gesture recognition), and may provide a display corresponding to the signal to the at least one display 230. The processor may identify a signal corresponding to the motion and may perform a preset function based on the identification. In an embodiment, the motion recognition camera 240-2 may be disposed on the first rim 301 and/or the second rim 302.
According to an embodiment, the battery module 370 may supply power to electronic components of the wearable device 101. In an embodiment, the battery module 370 may be disposed in the first temple 304 and/or the second temple 305. For example, the battery module 370 may be a plurality of battery modules 370. The plurality of battery modules 370, respectively, may be disposed on each of the first temple 304 and the second temple 305. In an embodiment, the battery module 370 may be disposed at an end of the first temple 304 and/or the second temple 305.
The antenna module 375 may transmit a signal or power to the outside of the wearable device 101 or may receive a signal or power from the outside. The antenna module 375 may be electrically and/or operably connected to the communication circuitry 250 of
The sound output module may output a sound signal to the outside of the wearable device 101. The sound output module may be referred to as a speaker. In an embodiment, the sound output module may be disposed in the first temple 304 and/or the second temple 305 in order to be disposed adjacent to the ear of the user wearing the wearable device 101. For example, the sound output module may include a first sound output module disposed adjacent to the user's left ear by being disposed in the first temple 304, and a second sound output module disposed adjacent to the user's right ear by being disposed in the second temple 305.
The light emitting module may include at least one light emitting element. The light emitting module may emit light of a color corresponding to a specific state or may emit light through an operation corresponding to the specific state in order to visually provide information on a specific state of the wearable device 101 to the user. For example, when the wearable device 101 requires charging, it may emit red light at a constant cycle. In an embodiment, the light emitting module may be disposed on the first rim 301 and/or the second rim 302.
As described above, the wearable device 101 according to an embodiment may identify a motion of the user's head and/or gaze, or may receive the user's speech. The wearable device 101 may execute a function requested by the user, in response to (or based on) the motion and/or the speech. The function performed by the wearable device 101 based on the motion and/or the speech may include a function of controlling an external electronic device (e.g., the external electronic device 290 of
Hereinafter, an example of an operation in which the wearable device 101 according to an embodiment identifies one or more external electronic devices based on metadata will be described with reference to
The wearable device according to an embodiment may obtain metadata on at least one external electronic device through communication circuitry (e.g., the communication circuitry 250 of
In an embodiment, the wearable device may obtain the metadata by communicating with a server in a network (e.g., the network 280 of
In an embodiment, based on identifying an external electronic device directly connected to an AP connected with the wearable device, the wearable device may obtain metadata from the identified external electronic device. The wearable device may establish a communication link based on Bluetooth, WiFi, and/or NFC with the external electronic device, based on a marker attached to the external electronic device, such as a quick response (QR) code. The wearable device may obtain metadata from the external electronic device by using the communication link.
The wearable device according to an embodiment may identify at least one external electronic device based on metadata. For example, the metadata may include data related to the at least one external electronic device. For example, the wearable device may identify at least one of a type, a vendor, a name, an address in a network (e.g., an IP address and/or a MAC address), and a location (e.g., a GPS coordinate) of an external electronic device corresponding to metadata by using the metadata. For example, the wearable device may identify one or more functions applicable to the external electronic device by using the metadata. For example, the wearable device may identify data obtainable from the external electronic device by using the metadata. For example, the wearable device may identify at least one category including the external electronic device by using the metadata.
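As a sketch of how such metadata might be held and parsed, the following Python fragment collects the fields named above into a single container. The field names, the DeviceMetadata/parse_metadata identifiers, and the example values are assumptions made for illustration, not a disclosed data format.

```python
from dataclasses import dataclass, field


@dataclass
class DeviceMetadata:
    """Hypothetical container for the metadata fields named above."""
    type: str
    vendor: str
    name: str
    ip_address: str
    mac_address: str
    location: tuple                                       # e.g., a GPS coordinate (latitude, longitude)
    functions: list = field(default_factory=list)         # functions applicable to the device
    readable_data: list = field(default_factory=list)     # data obtainable from the device
    categories: list = field(default_factory=list)        # categories that include the device


def parse_metadata(raw: dict) -> DeviceMetadata:
    """Build the container from a raw metadata dictionary, e.g., one received from a server."""
    return DeviceMetadata(
        type=raw.get("type", ""),
        vendor=raw.get("vendor", ""),
        name=raw.get("name", ""),
        ip_address=raw.get("ip", ""),
        mac_address=raw.get("mac", ""),
        location=tuple(raw.get("location", ())),
        functions=list(raw.get("functions", [])),
        readable_data=list(raw.get("readable_data", [])),
        categories=list(raw.get("categories", [])),
    )


# Example: a light that supports power and brightness control (all values invented).
meta = parse_metadata({"type": "light", "vendor": "ExampleCo", "name": "Lamp",
                       "ip": "192.0.2.10", "mac": "00:00:5e:00:53:01",
                       "location": (37.0, 127.0),
                       "functions": ["power", "brightness"],
                       "categories": ["light", "home appliance"]})
print(meta.functions)  # ['power', 'brightness']
```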
The wearable device according to an embodiment may identify one or more categories in which an external electronic device is included, in hierarchical categories, based on the metadata. The categories 410, 420, 421, 422, 423, 430, 431, 432, 433, 434, 440, 441, 442, 450, and 451 may be an example of the hierarchical categories. Referring to
The category 410 of
Referring to
Referring to
Referring to
Referring to
Hereinafter, an example operation in which the wearable device according to an embodiment classifies the external electronic devices 121, 122, 123, 131, 132, 141, 142, 161, and 171 of
For example, the wearable device may classify the external electronic device 121 of
For example, the wearable device may classify the external electronic device 131 of
For example, the wearable device may classify the external electronic device 141 of
The wearable device according to an embodiment may identify at least one function provided by a specific external electronic device, based on metadata and/or the hierarchical categories (e.g., the categories 410, 420, 421, 422, 423, 430, 431, 432, 433, 434, 440, 441, 442, 450, and 451). For example, the wearable device may identify a first set of functions applicable to the external electronic device 122 of
In an embodiment, the wearable device may receive an input indicating selection of a plurality of external electronic devices. The wearable device may use the hierarchical categories to interact with a plurality of external electronic devices selected by the input. For example, the hierarchical categories (e.g., the categories 410, 420, 421, 422, 423, 430, 431, 432, 433, 434, 440, 441, 442, 450, and 451) which the wearable device uses to classify external electronic devices may be used to identify homogeneity of the external electronic devices. For example, the wearable device may identify a function common to the plurality of external electronic devices based on the hierarchical categories, in response to (or based on) the input indicating selection of the plurality of external electronic devices. For example, each of the hierarchical categories may indicate a function commonly applicable to external electronic devices included in a category.
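A minimal sketch of these two uses of the hierarchy, assuming a simple parent-linked category node, is shown below: the intersection of the function sets of the selected devices, and the lowest category that includes two devices. The Category structure and the helper names are hypothetical and introduced only for this illustration.

```python
from dataclasses import dataclass, field


@dataclass
class Category:
    """One node of the hierarchical categories; names and structure here are illustrative only."""
    name: str
    parent: "Category | None" = None
    functions: set = field(default_factory=set)   # functions commonly applicable to devices in this category


def common_functions(selected_functions):
    """Intersection of the sets of functions applicable to each selected device."""
    sets = [set(f) for f in selected_functions]
    return set.intersection(*sets) if sets else set()


def lowest_common_category(a: Category, b: Category):
    """Lowest category in the hierarchy that includes both devices."""
    chain = []
    node = a
    while node is not None:
        chain.append(node)
        node = node.parent
    node = b
    while node is not None:
        if any(node is c for c in chain):
            return node
        node = node.parent
    return None


# Example: two lights share brightness control; a light and a speaker only share a higher category.
appliance = Category("home appliance", functions={"power"})
light = Category("light", parent=appliance, functions={"power", "brightness"})
speaker = Category("speaker", parent=appliance, functions={"power", "volume"})
print(common_functions([light.functions, light.functions]))   # {'power', 'brightness'} (order may vary)
print(lowest_common_category(light, speaker).name)            # 'home appliance'
```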
For example, in response to (or based on) an input indicating selection of all of the external electronic devices 121 and 122 of
An example operation of the wearable device based on the intersection of sets of functions applicable to each of the plurality of external electronic devices has been described, but an embodiment is not limited thereto. For example, in response to (or based on) the input indicating selection of the plurality of external electronic devices, the wearable device may identify the lowest category including all of the plurality of external electronic devices within hierarchical categories. Like the example, in a state of receiving an input indicating selection of all of the external electronic devices 121 and 122 of
The wearable device according to an embodiment may guide a user's selection based on homogeneity of external electronic devices, based on the hierarchical categories. For example, in response to (or based on) an input of selecting at least two external electronic devices, the wearable device may guide selection of another external electronic device different from the external electronic devices, based on a union of the categories corresponding to each of the external electronic devices selected by the input, or on a higher category of the categories. Like the example, in a state of receiving the input indicating selection of all of the external electronic devices 121 and 122 of
As described above, the wearable device according to an embodiment may obtain hierarchical categories (e.g., the categories 410, 420, 421, 422, 423, 430, 431, 433, 434, 441, 442, 450, and 451) for classifying a plurality of external electronic devices, based on metadata on the plurality of external electronic devices connected to the wearable device. Based on the hierarchical categories, the wearable device may identify at least one function commonly applicable to the external electronic devices. For example, in addition to a function of turning on or off external electronic devices, the wearable device may identify another function commonly applicable to the external electronic devices. The wearable device may support the user in collectively controlling the plurality of external electronic devices, beyond the function of turning on or off the plurality of external electronic devices, by displaying a screen for execution of the other function.
Hereinafter, an operation of receiving an input indicating that the wearable device according to an embodiment adds an external electronic device will be described with reference to
The wearable device according to an embodiment may identify one or more external electronic devices included in the FoV 110 by using a camera (e.g., the camera 240 of
The wearable device according to an embodiment may receive an input for notifying a location of at least one external electronic device (e.g., an external electronic device 512) within the FoV 110. For example, the wearable device may obtain the location of the external electronic device from a user, independently of the object detection and/or the object recognition. Hereinafter, an operation in which the wearable device according to an embodiment identifies a location of the external electronic device 512 based on a user input will be described with reference to
In a state 510 of
The wearable device according to an embodiment may receive an input for moving the visual object 514. For example, the wearable device may receive the input based on a user's motion to be described later with reference to
A state 520 of
Referring to
In an embodiment, in response to (or based on) an input indicating selection of any one of the options 522, 524, and 526, the wearable device may classify the external electronic device 512 into at least one of the hierarchical categories. Hereinafter, it is assumed that the wearable device has received an input indicating selection of the option 522 from the user in the state 520 of
A state 530 of
The wearable device according to an embodiment may display the screen 534 on at least a portion of the FoV 110, in response to (or based on) an input indicating selection of the external electronic device 512. The wearable device may display one or more functions applicable to the external electronic device 512 selected by the input within the screen 534. The one or more functions applicable to the external electronic device 512 may be identified based on metadata on the external electronic device 512 and/or one or more categories in which the external electronic device 512 is included.
Referring to
Although an example operation of the wearable device related to the external electronic device 512 has been described, an embodiment is not limited thereto. For example, based on an input indicating selection of another external electronic device that is viewable through the FoV 110 and different from the external electronic device 512, the wearable device may display a screen for executing functions applicable to the other external electronic device, similar to the screen 534. For example, in response to (or based on) an input indicating selection of at least one of the visual objects 151, 152, and 153 superimposed on at least a portion of the FoV 110, the wearable device may display a screen for executing functions applicable to at least one external electronic device corresponding to at least one visual object selected by the input.
As described above, the wearable device according to an embodiment may display a UI for controlling an external electronic device, such as the screen 534, based on an input indicating selection of the external electronic device. In response to (or based on) the input based on the UI, the wearable device may transmit a signal for controlling the external electronic device. When the number of external electronic devices selected by the user increases to two or more, the wearable device may display functions commonly applicable to the external electronic devices selected by the user, by changing the UI. In an embodiment, the wearable device may guide the user in additionally selecting an external electronic device, based on homogeneity of the external electronic devices selected by the user.
Hereinafter, example operations of receiving an input indicating that a wearable device according to an embodiment selects two or more external electronic devices will be described with reference to
States 610 and 620 of
The wearable device 101 according to an embodiment may receive an input indicating selection of one or more external electronic devices, based on a path of a body part such as the hand 612. In a state 610 of
The wearable device 101 according to an embodiment may receive an input detected by the pointing device 622 and indicating selection of at least one external electronic device. Referring to
The wearable device 101 may receive an input indicating selection of a plurality of external electronic devices by using the pointing device 622 and/or the hand 612. The wearable device 101 may also receive an input indicating exclusion of at least one of the plurality of pre-selected external electronic devices. For example, the wearable device 101 may identify a motion of the pointing device 622 moving along a path of a visual object 626 after receiving a first input indicating selection of the external electronic devices 121, 122, and 131, such as the visual object 624. The wearable device 101 may receive a second input indicating exclusion of the external electronic device 131 specified by the path of the visual object 626, based on the motion of the pointing device 622 moving along the path of the visual object 626. After sequentially receiving the first input and the second input, the wearable device 101 may determine that the external electronic devices 121 and 122 are selected by the user.
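A sketch of how such select-then-exclude inputs could update a selection set is given below; the normalized display coordinates, the proximity test, and the SelectionSet/devices_on_path names are assumptions made for this illustration rather than the disclosed gesture handling.

```python
def devices_on_path(path_points, device_positions, tolerance=0.05):
    """Devices whose on-display position lies within `tolerance` of any point of a gesture path."""
    hit = set()
    for name, (dx, dy) in device_positions.items():
        for (px, py) in path_points:
            if abs(px - dx) <= tolerance and abs(py - dy) <= tolerance:
                hit.add(name)
                break
    return hit


class SelectionSet:
    """Selected devices, updated by successive select/exclude inputs."""
    def __init__(self):
        self.devices = set()

    def select(self, path, positions):
        self.devices |= devices_on_path(path, positions)   # first input: add devices on the path

    def exclude(self, path, positions):
        self.devices -= devices_on_path(path, positions)   # second input: remove devices on the path


# Example matching the description above: 121, 122, 131 selected, then 131 excluded.
positions = {"121": (0.2, 0.5), "122": (0.4, 0.5), "131": (0.7, 0.5)}
sel = SelectionSet()
sel.select([(0.2, 0.5), (0.4, 0.5), (0.7, 0.5)], positions)
sel.exclude([(0.7, 0.5)], positions)
print(sorted(sel.devices))  # ['121', '122']
```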
As described above, the wearable device 101 according to an embodiment may receive an input indicating selection of one or more external electronic devices, based on the motion of the user's hand 612. The wearable device 101 may identify functions commonly applicable to a plurality of external electronic devices, based on identifying that the plurality of external electronic devices are selected by the input. Identifying, by the wearable device 101, the functions commonly applicable to the plurality of external electronic devices may be performed based on hierarchical categories (e.g., the categories of
The wearable device 101 receiving an input indicating selection of the external electronic devices is not limited to example operations of
The wearable device 101 according to an embodiment may include a sensor (e.g., the sensor 260 of
The wearable device 101 according to an embodiment may identify a preset keyword from the audio signal. For example, the wearable device 101 may identify a preset keyword in the speech 714. The preset keyword may include a demonstrative pronoun, such as a keyword 716 of
Referring to
The wearable device 101 according to an embodiment may receive an input indicating selection of one or more external electronic devices, based on the visual object 712 indicating the received motion within the time section in which the keyword 716 is received. For example, the wearable device 101 may determine an external electronic device 121 and an external electronic device corresponding to a visual object 153, which are included in a closed curve indicated by the visual object 712 within the FoV 110, as external electronic devices selected by the user.
The wearable device 101 according to an embodiment may collectively control external electronic devices selected by the keyword 716 within the speech 714, based on another keyword that is included in the speech 714 and different from the keyword 716. Referring to
Referring to
Referring to
Referring to
Referring to
Although an embodiment for identifying a speech in English has been described, the embodiment is not limited thereto. For example, the wearable device 101 may control one or more external electronic devices connected to the wearable device 101 based on a speech such as “Turn all of those off!”.
As described above, the wearable device 101 according to an embodiment may receive an input that indicates selection of one or more external electronic devices and collective control of the selected one or more external electronic devices by using an audio signal and a user's motion. The wearable device 101 may collectively control the one or more external electronic devices based on the input. Although an operation in which the wearable device 101 collectively deactivates the external electronic devices based on the user's speech has been described, an embodiment is not limited thereto. The wearable device 101 may transmit a signal to execute a function corresponding to text to a plurality of external electronic devices, in response to (or based on) identifying the text from the user's speech, wherein the text corresponds to any one of functions commonly applicable to a plurality of external electronic devices selected by the user.
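As an illustration of the speech handling described above, the sketch below checks a recognized utterance for a demonstrative keyword and a command phrase and maps it onto a function for the devices selected by the accompanying motion. The keyword lists, the phrase-to-function mapping, and the function names are assumptions, not the disclosed recognition pipeline.

```python
DEMONSTRATIVES = {"these", "those", "this", "that"}          # preset keywords indicating a selection
COMMANDS = {"turn off": "power_off", "turn on": "power_on"}  # spoken phrase -> function name (assumed)


def interpret_speech(text: str, selected_devices):
    """Map recognized speech such as "Turn off these!" onto a function for the devices that were
    selected by the motion made while the demonstrative keyword was being spoken."""
    lowered = text.lower()
    words = [w.strip("!?.,") for w in lowered.split()]
    if not any(w in DEMONSTRATIVES for w in words):
        return None                                          # no demonstrative keyword: not a selection
    for phrase, function in COMMANDS.items():
        if phrase in lowered:
            return [(device, function) for device in selected_devices]
    return None


# Example: the devices inside the gesture path drawn while saying "these" are turned off together.
print(interpret_speech("Turn off these!", ["121", "153"]))
# -> [('121', 'power_off'), ('153', 'power_off')]
```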
Hereinafter, referring to
In the state 810 of displaying the visual object 815 of
A state 820 of
In the state 820 of
The wearable device according to an embodiment may identify external electronic devices (e.g., an external electronic device that is homogeneous with the external electronic devices selected by the user) within a category including all the external electronic devices selected by the user. For example, the external electronic devices identified by the wearable device may be a union of the categories in which each of the external electronic devices selected by the user is classified. For example, the external electronic devices identified by the wearable device may be identified based on a higher category that the categories have in common.
In an embodiment of
The wearable device according to an embodiment may display a visual object indicating another external electronic device (e.g., an external electronic device that is homogeneous with the external electronic devices selected by the user) similar to the external electronic devices selected by the user. The wearable device may display the visual object as associated with the other external electronic device. The visual object may be displayed with a shape and/or a color different from those of the visual objects 815 and 821 displayed as associated with the external electronic devices selected by the user, to guide selection of the other external electronic device.
Referring to
Referring to
A state 830 of
As described above, the wearable device according to an embodiment may display the first type of visual objects 815 and 821 indicating that the plurality of external electronic devices is selected, based on the input indicating the selection of the plurality of external electronic devices. While displaying the first type of visual objects 815 and 821, the wearable device may display a visual object (e.g., the visual objects 822, 823, 824, and 825), which is different from the first type, for guiding selection of another external electronic device that is homogeneous with the plurality of external electronic devices. The wearable device may notify the user of another external electronic device that is homogeneous with the plurality of external electronic devices, or may guide selection of the other external electronic device, by using a visual object different from the first type.
Hereinafter, referring to
In the state 910 of
Referring to
In the state 910 of
Referring to
In an example of
As described above, the wearable device according to an embodiment may display a visual object (e.g., the visual objects 913, 914, and 915) for execution of other functions commonly applicable to the external electronic devices, in addition to a function of collectively turning on or off the external electronic devices selected by the user, such as the visual object 912. Since the wearable device lists the functions commonly applicable to the external electronic devices based on metadata and/or categories of the external electronic devices, collective control of the external electronic devices by the user of the wearable device may be facilitated.
Hereinafter, an example of an operation in which a wearable device according to an embodiment processes information received from external electronic devices selected by a user will be described with reference to
A state 1010 of
The wearable device according to an embodiment may obtain data measured from each of the external electronic devices 141 and 142, in response to (or based on) the input indicating the selection of the external electronic devices 141 and 142. The wearable device may display data obtained from the external electronic devices 141 and 142 within the screen 1011. For example, the wearable device may display visual objects 1014 and 1015 indicating data (e.g., air quality) measured from the external electronic device 141, which is an air cleaner, within the screen 1011. An example in which the wearable device displays the visual objects 1014 and 1015 indicating air quality measured at two points from the external electronic device 141 is illustrated, but an embodiment is not limited thereto.
The wearable device according to an embodiment may display a combination of data measured in each of the external electronic devices 141 and 142, within the screen 1011. For example, the wearable device may display a visual object 1013 indicating a combination of power consumption within the screen 1011, based on receiving the data indicating the power consumption of each of the external electronic devices 141 and 142. Referring to
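A trivial sketch of combining such per-device readings is shown below; the helper name and the numeric values are invented for illustration, and the units are assumed to match across devices.

```python
def combined_power_consumption(readings: dict) -> float:
    """Sum the power-consumption values reported by each selected device."""
    return sum(readings.values())


# Example with invented values for devices 141 and 142.
print(combined_power_consumption({"141": 35.0, "142": 820.0}))  # 855.0
```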
An operation performed by the wearable device based on an input indicating selection of a plurality of external electronic devices is not limited to the collective control of the external electronic devices described above or to visualization of the data received from the external electronic devices. The wearable device may display a visual object for executing a collaborative function of the plurality of external electronic devices selected by the user. For example, the collaborative function may be executed based on individual control of the plurality of external electronic devices.
A state 1020 of
For example, the wearable device may display the visual object 1025, which is related to all the external electronic devices 123 and 161 selected by an input and is for individually controlling the external electronic devices 123 and 161, within the screen 1023. Referring to
Referring to
An operation in which the wearable device according to an embodiment controls external electronic devices selected by the user is not limited to the above-described embodiment. For example, the wearable device may display a state of charge (SOC) of a battery included in each of the external electronic devices individually or in combination, based on an input selecting external electronic devices including the battery. In an embodiment of
Hereinafter, an operation of the wearable device described above will be described based on exemplary flowcharts with reference to
Referring to
In response to (or based on) the input of operation 1110, in operation 1115, the wearable device according to an embodiment may determine whether two or more external electronic devices have been selected by the input. Based on the operation 1110, when receiving an input indicating selection of one external electronic device (1115—No), in operation 1120, the wearable device according to an embodiment may display a screen for interacting with the external electronic device selected by the input of the operation 1110. For example, like the screen 534 of
When receiving an input indicating selection of a plurality of external electronic devices based on the operation 1110 (1115—Yes), in operation 1125, the wearable device according to an embodiment may identify a category including all of the two or more external electronic devices selected by the input. The wearable device may identify categories including each of the external electronic devices based on metadata on the external electronic devices selected by the input. Based on the categories and/or a higher category (upper category) of the categories, the wearable device may identify a category including all the external electronic devices selected by the input.
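One possible realization of identifying a category including all selected devices is to walk up a parent-category table until a common ancestor is found, as in the hedged Kotlin sketch below; the example hierarchy and the names parentCategory, ancestors, and commonCategory are hypothetical.

```kotlin
// Hypothetical sketch: find the lowest category (or upper category) shared by
// the leaf categories of all selected devices.
val parentCategory = mapOf(
    "air_cleaner" to "appliance",
    "air_conditioner" to "appliance",
    "lamp" to "lighting"
)

fun ancestors(category: String): List<String> {
    val chain = mutableListOf(category)
    var current = category
    while (true) {
        current = parentCategory[current] ?: break
        chain += current
    }
    return chain
}

fun commonCategory(categories: List<String>): String? =
    categories.firstOrNull()?.let { first ->
        ancestors(first).firstOrNull { candidate ->
            categories.all { candidate in ancestors(it) }
        }
    }
```

Under these assumptions, an air cleaner and an air conditioner selected together would resolve to the "appliance" category.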
Referring to
Referring to
Referring to
Referring to
Referring to
When the external object selected by the input corresponds to at least one of the external electronic devices (1230—Yes), in operation 1240 of
When a plurality of external electronic devices is selected by the input (1240—Yes), in operation 1250 of
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
While displaying the one or more visual objects of the operation 1530, in operation 1540, the wearable device according to an embodiment may determine whether a second input indicating selection of at least one second external electronic device is received. The at least one second external electronic device may be different from the one or more first external electronic devices of the operation 1510. When not receiving the second input (1540—No), in operation 1570, the wearable device may control the one or more first external electronic devices, in response to (or based on) a third input with respect to one or more visual objects. Before receiving the second input, the wearable device may maintain displaying the one or more visual objects for execution of the one or more first functions, based on the operation 1530.
When receiving the second input indicating the selection of the at least one second external electronic device (1540—Yes), in operation 1550, the wearable device according to an embodiment may identify at least one second function applicable to the at least one second external electronic device. Identifying the at least one second function by the wearable device may be performed based on metadata on the at least one second external electronic device, and/or a category including the at least one second external electronic device, similar to the operation 1520.
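A hedged Kotlin sketch of the resulting display update follows: visual objects whose functions are not applicable to the newly selected device cease to be displayed, while the rest are maintained; VisualObject and updateDisplayed are hypothetical names.

```kotlin
// Hypothetical sketch: keep only the visual objects whose functions remain in
// the at least one second function identified after the second input.
data class VisualObject(val functionName: String)

fun updateDisplayed(
    displayed: List<VisualObject>,
    secondFunctions: Set<String>
): List<VisualObject> =
    displayed.filter { it.functionName in secondFunctions }
```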
Referring to
As described above, the wearable device according to an embodiment may guide selection of homogeneous external electronic devices among a plurality of external electronic devices connected to the wearable device. In response to (or based on) an input indicating selection of a plurality of external electronic devices, the wearable device may recommend another external electronic device similar to the plurality of external electronic devices to the user. The wearable device may display a screen for execution of one or more functions commonly applicable to the plurality of external electronic devices in response to (or based on) the input.
According to an embodiment, a method of a wearable device for providing a user interface to select at least one function applicable to a plurality of external electronic devices may be required.
As described above, the wearable device 101 may comprise a display 230 and a processor 210. The processor may be configured to receive an input indicating selection of a first external electronic device, and a second external electronic device among a plurality of external electronic devices viewable through the display. The processor may be configured to, in response to (or based on) the input, identify third external electronic devices in a category including both the first external electronic device, and the second external electronic device. The third external electronic devices may include the first external electronic device and the second external electronic device. The processor may be configured to display, as associated with the first external electronic device and the second external electronic device viewable through the display, a first visual object and a second visual object (e.g., the visual objects 815 and 821) respectively indicating that the first external electronic device and the second external electronic device are selected. The processor may be configured to display, as associated with at least one fourth external electronic device among the third external electronic devices that is different from the first external electronic device and the second external electronic device, at least one third visual object (e.g., the visual objects 822, 823, and 825) guiding selection of the at least one fourth external electronic device. The processor may display the first visual object and the second visual object by superimposing them on the first external electronic device and the second external electronic device viewable through the display. The wearable device according to an embodiment may provide a user with a UI (e.g., the screens 534, 911, 1011, 1023, and 1034) for executing functions commonly applicable to external electronic devices selected by the user. The wearable device according to an embodiment may provide a user interface for selecting at least one function applicable to the plurality of external electronic devices.
For example, the processor may be configured to, in response to (or based on) the input, identify a first category of the first external electronic device and a second category of the second external electronic device, from first metadata with respect to the first external electronic device and second metadata with respect to the second external electronic device. For example, the processor may be configured to, based on the first category and the second category, display executable objects to control the first external electronic device and the second external electronic device.
For example, the processor may be configured to identify one or more functions commonly included in a first set of functions applicable to an external electronic device in the first category, and a second set of functions applicable to an external electronic device in the second category. The processor may be configured to display, within the display, the executable objects to execute the one or more functions.
For example, the processor may be configured to, based on identifying the second category different from the first category, identify the category where the third external electronic devices are included, based on a higher category (upper category) of each of the first category of the first external electronic device and the second category of the second external electronic device.
For example, the wearable device may further comprise communication circuitry 250. The processor may be configured to obtain, through the communication circuitry, metadata indicating functions applicable to each of a plurality of external electronic devices and including the first metadata and the second metadata.
For example, the input may be a first input. The processor may be configured to, in response to (or based on) a second input indicating selection of one object among the executable objects, request, to the first external electronic device and the second external electronic device, execution of a function mapped to the object selected by the second input.
For example, the processor may be configured to, in response to (or based on) the input, receive first data obtained by the first external electronic device and second data obtained by the second external electronic device. The processor may be configured to display, on the display, a fourth visual object based on a combination of the first data and the second data.
For example, the input may be a first input. The processor may be configured to, in response to (or based on) a second input indicating selection of the at least one third visual object, identify one or more functions applicable to all of the third external electronic devices. The processor may be configured to display, within the display, one or more fourth visual objects to execute the one or more functions.
For example, the wearable device may comprise a sensor 260 to track an eye gaze of a user wearing the wearable device. The wearable device may comprise a microphone 270 to receive an audio signal. The processor may be configured to identify a preset keyword 716 from the audio signal received by the microphone. The processor may be configured to, in response to (or based on) identifying the preset keyword, receive the input based on the eye gaze detected by the sensor and a time section 724 in which the preset keyword is identified.
For example, the processor may be configured to, based on identifying another keyword received after the preset keyword from the audio signal, request, to the first external electronic device and the second external electronic device, execution of a function indicated by the other keyword.
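As a non-limiting sketch of combining the microphone and the gaze-tracking sensor, the Kotlin fragment below treats devices gazed at during the time section in which the preset keyword (e.g., 716) is identified as selected; GazeSample and devicesSelectedByGaze are hypothetical names.

```kotlin
// Hypothetical sketch: a gaze sample records which device (if any) the user was
// looking at; samples within the keyword's time section determine the selection.
data class GazeSample(val timestampMs: Long, val deviceId: String?)

fun devicesSelectedByGaze(
    gaze: List<GazeSample>,
    keywordStartMs: Long,
    keywordEndMs: Long
): Set<String> =
    gaze.filter { it.timestampMs in keywordStartMs..keywordEndMs }
        .mapNotNull { it.deviceId }
        .toSet()
```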
As described above, a method of a wearable device 101 may comprise an operation 1510 of receiving a first input indicating selection of one or more first external electronic devices among a plurality of external electronic devices viewable through the display. The method may comprise an operation 1520 of identifying, in response to (or based on) the first input, one or more first functions applicable to the one or more first external electronic devices. The method may comprise an operation 1530 of displaying, within the display, one or more visual objects for executing the one or more first functions. The method may comprise an operation 1550 of identifying, in a state that the one or more visual objects are displayed, in response to (or based on) a second input indicating selection of at least one second external electronic device, at least one second function applicable to the at least one second external electronic device. The method may comprise an operation 1560 of ceasing to display, among the one or more visual objects, at least one visual object with respect to a function different from the at least one second function. The wearable device according to an embodiment may provide a UI responding to a change in a user's selection with respect to the external electronic devices.
For example, identifying the one or more first functions may include receiving, through communication circuitry of the wearable device, metadata with respect to the one or more first external electronic devices. The identifying the one or more first functions may include identifying, based on the received metadata, the one or more first functions.
For example, identifying the one or more first functions may include, based on the received metadata, identifying a category including all the one or more first external electronic devices. The identifying the one or more first functions may include, based on the identified category, identifying the one or more first functions.
For example, displaying the one or more visual objects may include receiving a third input indicating selection of one of the one or more visual objects. The displaying the one or more visual objects may include, in response to (or based on) the third input, requesting to the one or more first external electronic devices to execute a function corresponding to a visual object selected by the third input from among the one or more first functions.
For example, the method may include, based on the at least one second function identified in response to (or based on) the second input, maintaining displaying of at least one visual object corresponding to the at least one second function.
For example, displaying the one or more visual objects may include displaying a first visual object to adjust power of the one or more first external electronic devices. The displaying the one or more visual objects may include displaying one or more second visual objects, which correspond to the one or more first functions applicable to all of the first external electronic devices and are different from the first visual object.
For example, receiving the first input may include identifying a preset keyword from an audio signal received through a microphone of the wearable device. The receiving the first input may include identifying a motion of a user wearing the wearable device by using a sensor of the wearable device in a time section in which the preset keyword is identified. The receiving the first input may include selecting the one or more first external electronic devices from among the plurality of external electronic devices based on the identified motion.
As described above, a method of a wearable device 101 may comprise an operation 1410 of receiving an input indicating selection of a first external electronic device, and a second external electronic device among a plurality of external electronic devices viewable through a display of the wearable device. The method may comprise an operation 1420 of identifying, in response to (or based on) the input, third external electronic devices in a category including both the first external electronic device, and the second external electronic device. The third external electronic devices may include the first external electronic device, and the second external electronic device. The method may comprise an operation 1430 of displaying, as associated with the first external electronic device and the second external electronic device viewable through the display, a first visual object and a second visual object respectively indicating that the first external electronic device and the second external electronic device are selected. The method may comprise an operation 1440 of displaying, as associated with at least one fourth external electronic device among the third external electronic devices that is different from the first external electronic device and the second external electronic device, at least one third visual object guiding selection of the at least one fourth external electronic device.
For example, the method may comprise, in response to (or based on) the input, identifying a first category of the first external electronic device and a second category of the second external electronic device, from first metadata with respect to the first external electronic device and second metadata with respect to the second external electronic device. The method may comprise, based on the first category and the second category, displaying executable objects to control the first external electronic device and the second external electronic device.
For example, displaying the executable objects may comprise identifying one or more functions commonly included in a first set of functions applicable to an external electronic device in the first category, and a second set of functions applicable to an external electronic device in the second category. The displaying the executable objects may comprise displaying, within the display, the executable objects to execute the one or more functions.
For example, the displaying the executable objects may comprise, based on identifying the second category different from the first category, identifying the category where the third external electronic devices are included, based on a higher category (upper category) of each of the first category of the first external electronic device and the second category of the second external electronic device. The displaying the executable objects may comprise, based on identifying the category in which the third external electronic devices are included, displaying the executable objects based on the identified category.
For example, the input may be a first input. The displaying the executable objects may comprise, in response to (or based on) a second input indicating selection of one object among the executable objects, requesting, to the first external electronic device and the second external electronic device, execution of a function mapped to the object selected by the second input, wherein the second input is different from the first input.
For example, the method may include, in response to (or based on) the input, receiving first data obtained by the first external electronic device and second data obtained by the second external electronic device. The method may include displaying, on the display, a fourth visual object based on a combination of the first data and the second data.
For example, displaying the at least one third visual object may include, in response to (or based on) a second input indicating selection of the at least one third visual object, identifying one or more functions applicable to all of the third external electronic devices. The displaying the at least one third visual object may include displaying, within the display, one or more fourth visual objects to execute the one or more functions.
For example, receiving the input may include identifying a preset keyword from an audio signal received by a microphone of the wearable device. The receiving the input may include, in response to (or based on) identifying the preset keyword, receiving the input based on an eye gaze detected by a sensor of the wearable device and a time section in which the preset keyword is identified.
For example, the receiving the input may include, based on identifying another keyword received after the preset keyword from the audio signal, requesting, to the first external electronic device and the second external electronic device, execution of a function indicated by the other keyword.
As described above, a wearable device 101 according to an embodiment may comprise a display 230 and a processor 210. The processor may be configured to receive a first input indicating selection of one or more first external electronic devices among a plurality of external electronic devices viewable through the display. The processor may be configured to, in response to (or based on) the first input, identify one or more first functions applicable to the one or more first external electronic devices. The processor may be configured to display, within the display, one or more visual objects for executing the one or more first functions. The processor may be configured to, in a state that the one or more visual objects are displayed, identify, in response to (or based on) a second input indicating selection of at least one second external electronic device, at least one second function applicable to the at least one second external electronic device. The processor may be configured to cease to display, among the one or more visual objects, at least one visual object with respect to a function different from the at least one second function.
For example, the processor may be configured to receive, through communication circuitry of the wearable device, metadata with respect to the one or more first external electronic devices. The processor may be configured to identify, based on the received metadata, the one or more first functions.
For example, the processor may be configured to, based on the received metadata, identify a category including all the one or more first external electronic devices. The processor may be configured to, based on the identified category, identify the one or more first functions.
For example, the processor may be configured to receive a third input indicating selection of one of the one or more visual objects. The processor may be configured to, in response to (or based on) the third input, request to the one or more first external electronic devices to execute a function corresponding to a visual object selected by the third input from among the one or more first functions.
For example, the processor may be configured to, based on the at least one second function identified in response to (or based on) the second input, maintain displaying of at least one visual object corresponding to the at least one second function.
For example, the processor may be configured to display a first visual object for adjusting power of the one or more first external electronic devices. The processor may be configured to display one or more second visual objects, which correspond to the one or more first functions applicable to all of the first external electronic devices and are different from the first visual object.
For example, the wearable device may further include a sensor 260 and a microphone 270. The processor may be configured to identify a preset keyword from an audio signal received through the microphone. The processor may be configured to identify a motion of a user wearing the wearable device by using the sensor in a time section in which the preset keyword is identified. The processor may be configured to select the one or more first external electronic devices from among the plurality of external electronic devices based on the identified motion.
The device described above may be implemented as a hardware component, a software component, and/or a combination of a hardware component and a software component. For example, the devices and components described in the embodiments may be implemented by using one or more general purpose computers or special purpose computers, such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions. The processing device may run an operating system (OS) and one or more software applications executed on the operating system. In addition, the processing device may access, store, manipulate, process, and generate data in response to (or based on) the execution of the software. For convenience of understanding, a single processing device is sometimes described as being used; however, a person of ordinary skill in the relevant technical field will appreciate that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors, or one processor and one controller. In addition, another processing configuration, such as a parallel processor, is also possible.
The software may include a computer program, code, an instruction, or a combination of one or more thereof, and may configure the processing device to operate as desired or may command the processing device independently or collectively. The software and/or data may be embodied in any type of machine, component, physical device, computer storage medium, or device, to be interpreted by the processing device or to provide commands or data to the processing device. The software may be distributed on network-connected computer systems and stored or executed in a distributed manner. The software and data may be stored in one or more computer-readable recording media.
The method according to the embodiment may be implemented in the form of program commands that may be executed through various computer means and recorded on a computer-readable medium. In this case, the medium may continuously store a program executable by the computer or may temporarily store the program for execution or download. In addition, the medium may be any of various recording means or storage means in the form of a single piece of hardware or a combination of several pieces of hardware; it is not limited to a medium directly connected to a certain computer system and may be distributed over a network. Examples of the media include magnetic media such as a hard disk, a floppy disk, and a magnetic tape; optical recording media such as a CD-ROM and a DVD; magneto-optical media such as a floptical disk; and media configured to store program instructions, including a ROM, a RAM, a flash memory, and the like. In addition, examples of other media include recording media or storage media managed by app stores that distribute applications, by sites that supply or distribute various other software, and by servers.
As described above, although the embodiments have been described with reference to limited examples and drawings, a person of ordinary skill in the relevant technical field may make various modifications and variations based on the above description. For example, even if the described technologies are performed in an order different from the described method, and/or the components of the described system, structure, device, circuit, and the like are coupled or combined in a form different from the described method, or are replaced or substituted by other components or equivalents, an appropriate result may be achieved.
Foreign Application Priority Data: Korean Patent Application No. 10-2022-0084682, filed in July 2022 (KR, national).
This application is a bypass continuation application of International Application No. PCT/KR2023/005930, filed on May 1, 2023, which is based on and claims priority to Korean Patent Application No. 10-2022-0084682, filed on Jul. 8, 2022, in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entireties.
Related U.S. Application Data: parent application PCT/KR2023/005930, filed in May 2023 (WO); child application 18962983 (US).