The disclosure relates to a wearable device for displaying multimedia content provided by an external electronic device and a method thereof.
In order to provide enhanced user experience, electronic devices are being developed that provide an augmented reality (AR) service that displays information generated by a computer in conjunction with an external object in the real-world. The electronic device may be a wearable device that may be worn by a user. For example, the electronic device may be AR glasses and/or a head-mounted device (HMD).
According to various embodiments, a head-wearable electronic device may include at least one display, a first camera usable for identifying eye gaze information, a second camera usable for obtaining images regarding a physical environment (space) in front of the head-wearable electronic device, communication circuitry, memory storing instructions, and at least one processor comprising processing circuitry. The instructions, when executed by the at least one processor individually or collectively, cause the head-wearable electronic device to display, using the at least one display, images of the physical environment obtained using the second camera; while displaying the images of the physical environment, identify that first eye gaze information, obtained via the first camera, corresponds to a visual object in the images, the visual object corresponding to an external electronic device in the physical environment; based on identifying that the first eye gaze information corresponds to the visual object, display, using the at least one display, a user interface (UI) object associated with the visual object; while displaying the UI object associated with the visual object, identify that second eye gaze information, obtained via the first camera, corresponds to the UI object; based at least on identifying that the second eye gaze information corresponds to the UI object, execute a first function associated with the UI object including transmitting, through the communication circuitry, to the external electronic device, a signal to request establishment of a communication link with the external electronic device; and based on information received through the communication circuitry, display, using the at least one display, screen images, associated with the external electronic device, superimposed on the images of the physical environment.
According to various embodiments, a method for a head-wearable electronic device including at least one display, a first camera usable for identifying eye gaze information, a second camera usable for obtaining images regarding a physical environment in front of the head-wearable electronic device, and communication circuitry may include displaying, using the at least one display, images of the physical environment obtained using the second camera; while displaying the images of the physical environment, identifying that first eye gaze information, obtained via the first camera, corresponds to a visual object in the images, the visual object corresponding to an external electronic device in the physical environment; based on identifying that the first eye gaze information corresponds to the visual object, displaying, using the at least one display, a user interface (UI) object associated with the visual object; while displaying the UI object associated with the visual object, identifying that second eye gaze information, obtained via the first camera, corresponds to the UI object; based at least on identifying that the second eye gaze information corresponds to the UI object, executing a first function associated with the UI object including transmitting, through the communication circuitry, to the external electronic device, a signal to request establishment of a communication link with the external electronic device; and based on information received through the communication circuitry, displaying, using the at least one display, screen images, associated with the external electronic device, superimposed on images of the physical environment.
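The gaze-driven selection flow summarized above (a first gaze on the visual object reveals a UI object, and a second gaze on that UI object triggers the link request) can be sketched as follows. This is an illustrative sketch only, not an implementation from the disclosure; all names (`GazeSelectionFlow`, `on_gaze`, and the string labels) are hypothetical.

```python
# Hypothetical sketch of the two-stage gaze selection flow described above.
class GazeSelectionFlow:
    """Tracks which on-screen object the user's gaze corresponds to."""

    def __init__(self, send_signal):
        self.send_signal = send_signal   # callable used to request a communication link
        self.ui_object_visible = False
        self.link_requested = False

    def on_gaze(self, target):
        # First eye gaze information corresponds to the visual object:
        # display the associated UI object.
        if target == "visual_object" and not self.ui_object_visible:
            self.ui_object_visible = True
            return "display_ui_object"
        # Second eye gaze information corresponds to the UI object:
        # execute the first function (request a communication link).
        if target == "ui_object" and self.ui_object_visible:
            self.link_requested = True
            self.send_signal("request_communication_link")
            return "superimpose_screen_images"
        return "no_action"
```

A gaze on the UI object before it is displayed has no effect, matching the ordering of the operations above.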
According to various embodiments, a wearable device may include communication circuitry, a display, a camera, and a processor. The processor may be configured to identify, based on a first object in an image obtained using the camera, a first external electronic device; transmit, using the communication circuitry, a first signal to a second external electronic device to identify coincidence of a first user account used by the wearable device and a second user account used by the first external electronic device; receive, using the communication circuitry, a second signal from the second external electronic device to identify the coincidence of the first user account and the second user account; display, at least based on the second signal, a second visual object in association with a selection of the first external electronic device in conjunction with the first object; and display, through the display, based on a user input with respect to the second visual object, content provided through the first external electronic device.
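The account-coincidence check routed through the second external electronic device (e.g., a server) might be sketched as below. All names and the request/response schema are hypothetical illustrations, not details of the disclosure.

```python
# Hypothetical sketch: the wearable asks a server whether its user account
# coincides with the account used by the first external electronic device.
REGISTRY = {"tv-1": "user@example.com"}  # toy mapping: device id -> logged-in account

def toy_server(request):
    """Stands in for the second external electronic device (a server)."""
    return {"coincide": REGISTRY.get(request["device"]) == request["account"]}

def accounts_coincide(server, wearable_account, target_device_id):
    """Send the first signal and interpret the second signal's answer."""
    request = {"account": wearable_account, "device": target_device_id}
    response = server(request)  # first signal out, second signal back
    return bool(response.get("coincide"))
```

Only when the answer is affirmative would the wearable display the second visual object indicating the device is selectable.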
According to various embodiments, a wearable device may include communication circuitry, a display, a camera, memory storing instructions, and a processor. The instructions, when executed by the processor, may cause the wearable device to identify an external electronic device based on a first object within an image obtained using the camera; request, based on the identification, information to display multimedia content provided through the external electronic device, to the external electronic device through the communication circuitry; display, in conjunction with the first object, a second object to notify that the external electronic device is selectable, in response to receiving the information transmitted from the external electronic device through the communication circuitry; and display, through the display, the multimedia content obtained from the external electronic device based on the information, based on whether a user input with respect to the first object is detected.
According to various embodiments, a method for a wearable device may include identifying an external electronic device based on a first object within an image obtained using a camera; requesting information to display multimedia content provided through the external electronic device, to the external electronic device through communication circuitry, based on the identification; displaying, in conjunction with the first object, a second object to notify that the external electronic device is selectable, in response to receiving the information transmitted from the external electronic device through the communication circuitry; and displaying, through a display of the wearable device, the multimedia content obtained from the external electronic device based on the information, based on whether a user input with respect to the first object is detected.
One or more non-transitory computer-readable storage media may store one or more programs which, when executed by at least one processor of a wearable device, may cause the wearable device to identify an external electronic device based on a first object within an image obtained using a camera; request information to display multimedia content provided through the external electronic device, to the external electronic device through communication circuitry, based on the identification; display, in conjunction with the first object, a second object to notify that the external electronic device is selectable, in response to receiving the information transmitted from the external electronic device through the communication circuitry; and display, through a display of the wearable device, the multimedia content obtained from the external electronic device based on the information, based on whether a user input with respect to the first object is detected.
The above and other aspects, features and advantages of certain embodiments of the present disclosure will be more apparent from the following detailed description, taken in conjunction with the accompanying drawings, in which:
Hereinafter, various embodiments of the present disclosure will be described with reference to the accompanying drawings.
The wearable device 101 according to various embodiments may execute a function associated with augmented reality (AR) and/or mixed reality (MR). Referring to
The wearable device 101 according to various embodiments may execute a function related to video see-through (VST) and/or virtual reality (VR). Referring to
Referring to
The wearable device 101 according to various embodiments may request information to display multimedia content provided through the external electronic device, to the external electronic device through the communication circuitry, based on the identification of the first object 110. For example, the wearable device 101 may request the information to display the multimedia content from the first external electronic device (i.e., the external electronic device) through a second external electronic device (e.g., a server) different from the first external electronic device. For example, the first external electronic device may be a device including a display, such as a TV or a computer monitor. For example, the first external electronic device may be a mobile device including a display, such as a smartphone, a smartpad, or a tablet PC. For example, the first external electronic device may be a device capable of providing multimedia content through the display. However, the disclosure is not limited in this respect. The wearable device 101 according to an embodiment may identify first user information used for logging in to the first external electronic device corresponding to the first object 110 and second user information used for logging in to the wearable device 101. For example, the wearable device 101 may identify the sameness of the first user information and the second user information (e.g., that the first user information and the second user information match). For example, the wearable device 101 may request the information to display the multimedia content provided by the first external electronic device based on the first user information and the second user information matching. For example, the wearable device 101 may request the information based on the first user information and the second user information being at least partially identical (or at least partially matching).
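The "at least partially identical" comparison of user information could, purely for illustration, be checked field by field, as in the hedged sketch below; the field names and the overlap threshold are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch of an "at least partially identical" user-information check.
def user_info_matches(first, second, required_overlap=1):
    """Return True when at least `required_overlap` fields agree between the
    login information of the external device (`first`) and of the wearable
    device (`second`). Both arguments are dicts of login-related fields."""
    shared = set(first) & set(second)
    matching = sum(1 for key in shared if first[key] == second[key])
    return matching >= required_overlap
```

With `required_overlap=1` an account match alone suffices; a stricter policy would raise the threshold.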
The wearable device 101 according to various embodiments may receive the information transmitted from the first external electronic device based at least in part on a request for information to display multimedia content. For example, the wearable device 101 may receive the information through the communication circuitry. For example, the wearable device 101 may receive information transmitted from the first external electronic device through the second external electronic device. In response to receiving the information, the wearable device 101 may display a second object 115 to notify that the external electronic device is selectable. For example, the wearable device 101 may display the second object 115 in conjunction with the first object 110. For example, the wearable device 101 may notify that the first object 110 is selectable through the first object 110 or a third object 120 different from the first object 110. For example, the third object 120 may be displayed in an area different from the first object 110 and the second object 115. For example, the second object 115 may be displayed along an edge of the first object 110. For example, although the second object 115 is displayed as a border of the first object 110 in
The wearable device 101 according to various embodiments may receive an input with respect to the first object 110. The wearable device 101 may identify an input with respect to the third object 120. The wearable device 101 may display the multimedia content through a display based on the information to display the multimedia content, received in response to the input with respect to the first object 110 and/or the third object 120. For example, the wearable device 101 may provide, through the display, multimedia content continuing from the multimedia content provided through the external electronic device. The operation of displaying multimedia content through the display is described later in
As described above, the wearable device 101 according to various embodiments may identify the first object 110 within the image obtained using the camera. The wearable device 101 may identify the external electronic device corresponding to the first object 110 based on the first object 110. Based on the identification, the wearable device 101 may request information to display multimedia content provided through the first external electronic device to the first external electronic device through the communication circuitry. The wearable device 101 may display the second object 115 and/or the third object 120 to notify that the first external electronic device is selectable in response to receiving the information transmitted from the first external electronic device through the communication circuitry based at least in part on the request. The wearable device 101 may display the multimedia content through the display based on the information in response to an input received with respect to the first object 110 and/or the third object 120. For example, the wearable device 101 may display the multimedia content based on an entire area of the display and/or an area exceeding a specified ratio of the entire area. The wearable device 101 may enhance user experience of the wearable device 101 by displaying the multimedia content based on receiving information associated with the multimedia content provided from the first external electronic device.
Referring to
The wearable device 101 according to various embodiments may include hardware to process data based on one or more instructions. For example, the hardware to process data may include the processor 210 (including, e.g., processing circuitry). For example, the hardware to process data may include an arithmetic and logic unit (ALU), a floating point unit (FPU), a field programmable gate array (FPGA), a central processing unit (CPU), and/or an application processor (AP). The processor 210 may have a structure of a single-core processor, or may have a structure of a multi-core processor such as a dual core, a quad core, a hexa core, or an octa core. The operations of
The memory 220 of the wearable device 101 according to various embodiments may include a component to store data and/or instructions input to the processor 210 and/or output from the processor 210 of the wearable device 101. For example, the memory 220 may include volatile memory such as random-access memory (RAM) and/or non-volatile memory such as read-only memory (ROM). For example, the volatile memory may include at least one of dynamic RAM (DRAM), static RAM (SRAM), cache RAM, and pseudo SRAM (PSRAM). For example, the non-volatile memory may include at least one of programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), flash memory, hard disk, compact disc, solid state drive (SSD), and embedded multi-media card (eMMC).
The display 230 of the wearable device 101 according to various embodiments may output visualized information to a user. For example, the display 230 may output visualized information to the user by being controlled by the processor 210 including a circuit such as a graphics processing unit (GPU). The display 230 may include a flat panel display (FPD) and/or electronic paper. The FPD may include a liquid crystal display (LCD), a plasma display panel (PDP), and/or one or more light emitting diodes (LEDs). The LED may include an organic LED (OLED). The wearable device 101 according to an embodiment may provide virtual reality and/or actual reality through the display 230. Virtual reality may include, for example, displaying a virtual object synthesized with an image obtained through the camera 250. For example, the virtual reality may include mixed reality. For example, the wearable device 101 may display the image obtained through the camera 250 through the display 230. For example, the operation of displaying the image obtained through the camera 250 may be referred to as a video see-through (VST) mode.
The communication circuitry 240 of the wearable device 101 according to various embodiments may include a hardware component to support transmission and/or reception of an electrical signal between the wearable device 101 and an external electronic device 103. For example, the communication circuitry 240 may include at least one of a modem, an antenna, and an optic/electronic (O/E) converter. The communication circuitry 240 may support the transmission and/or reception of an electrical signal based on various types of protocols such as Ethernet, local area network (LAN), wide area network (WAN), wireless fidelity (WiFi), BLUETOOTH®, Bluetooth low energy (BLE), ZigBee, long term evolution (LTE), and fifth generation new radio (5G NR). The wearable device 101 may establish a communication link with an external electronic device through the communication circuitry 240. For example, the wearable device 101 may transmit and/or receive an electrical signal to request information through the communication circuitry 240. For example, the wearable device 101 may request information to display multimedia content provided through the external electronic device, to the external electronic device through the communication circuitry 240. The wearable device 101 may transmit second user information used for logging in to the wearable device 101 to the external electronic device in order to request first user information used for logging in to the external electronic device. The wearable device 101 may encrypt the second user information and transmit the encrypted second user information to the external electronic device. The external electronic device may identify the sameness (or agreement or matching) of the first user information and the second user information based on receiving the second user information. For example, the external electronic device may identify the sameness of the first user information and the second user information by decrypting the encrypted second user information.
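One hedged way to illustrate comparing login information without sending it in the clear is a salted digest. Note the substitution: the disclosure describes encrypting and decrypting the second user information, whereas the sketch below uses a salted SHA-256 hash purely to keep the example self-contained (all names are hypothetical).

```python
# Illustrative sketch only: a salted hash stands in for the encryption the
# disclosure describes, so the example needs nothing beyond the stdlib.
import hashlib

def digest(user_info: str, salt: str) -> str:
    """Wearable-device side: hash the second user information before sending."""
    return hashlib.sha256((salt + user_info).encode()).hexdigest()

def sameness(first_user_info: str, received_digest: str, salt: str) -> bool:
    """External-device side: compare its own login information against the
    digest received from the wearable device."""
    return digest(first_user_info, salt) == received_digest
```

The external device never sees the raw second user information, yet can still decide whether the two accounts match.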
The external electronic device and the wearable device 101 may establish a communication link based on the identification of the sameness of the first user information and the second user information. Based on the identification of the sameness, the external electronic device may transmit, to the wearable device 101, the information associated with the multimedia content provided from the external electronic device, which was requested by the wearable device 101. The wearable device 101 may display the multimedia content through the display 230 based on receiving the information associated with the multimedia content. For example, the information associated with the multimedia content may include information associated with the provision of the multimedia content, such as a name of the multimedia content and/or a playback time of the multimedia content.
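Given that the received information may include a playback time, continuing playback on the wearable device could be illustrated as below. The schema keys and the delay parameter are hypothetical, not part of the disclosure.

```python
# Hypothetical sketch: where playback should resume on the wearable device,
# given the playback time reported by the external electronic device.
def resume_position(content_info, transfer_delay_s=0.0):
    """Compute the resume position from the reported playback time, plus an
    optional delay incurred while establishing the communication link,
    clamped so it never seeks past the end of the content."""
    position = content_info["playback_time_s"] + transfer_delay_s
    return min(position, content_info["duration_s"])
```

Compensating for the link-establishment delay keeps the handoff feeling continuous to the user.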
The camera 250 of the wearable device 101 according to various embodiments may include a lens assembly or an image sensor. The lens assembly may collect light emitted from a subject which is a target of image capture. The lens assembly may include one or more lenses. The camera 250 according to an embodiment may include a plurality of lens assemblies. For example, in the camera 250, a portion of the plurality of lens assemblies may have the same lens property (e.g., angle of view, focal length, autofocus, f number, or optical zoom), or at least one lens assembly may have one or more lens properties that are different from the lens properties of the other lens assemblies. The lens assembly may include a wide-angle lens or a telephoto lens. An image sensor according to an embodiment may include, for example, one image sensor selected from among image sensors with different properties, such as an RGB sensor, a black and white (BW) sensor, an IR sensor, or a UV sensor, a plurality of image sensors with the same property, or a plurality of image sensors with different properties. Each image sensor included in the camera 250 may be implemented using, for example, a charge-coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor. The wearable device 101 according to various embodiments may obtain an image through the camera 250. The wearable device 101 may perform the operation of displaying the image obtained through the camera 250. For example, the operation of the wearable device 101 displaying the image obtained through the camera 250 within a VR environment may be referred to as a VST mode. The wearable device 101 may display a visual object on at least a portion of the image while operating in the VST mode. For example, the visual object may be a virtual object generated by the processor 210 of the wearable device 101. For example, the wearable device 101 may display the visual object and/or the virtual object superimposed on the image.
The wearable device 101 may identify a real object within the image. The wearable device 101 may display a visual object, to notify that the real object is selectable, by, for example, superimposing the visual object on the real object. The wearable device 101 according to various embodiments may provide actual reality through the display 230. For example, within an AR environment, the wearable device 101 may provide actual reality, which is seen through the display 230. The wearable device 101 may display the virtual object in the display 230 within the AR environment.
The wearable device 101 according to various embodiments may display a second object (e.g., the second object 115 of
As described above, the wearable device 101 according to various embodiments may obtain an image through the camera 250. The wearable device 101 may identify the external electronic device corresponding to the first object based on the first object included within the image. Based on the identification, the wearable device 101 may request information to display multimedia content provided through the external electronic device, to the external electronic device through the communication circuitry 240. The wearable device 101 may receive the information transmitted from the external electronic device based at least in part on the request. In response to receiving the information, the wearable device 101 may display the second object, to notify that the external electronic device is selectable, in conjunction with the first object. The wearable device 101 may display the multimedia content based on the information in response to the input received with respect to the first object in conjunction with the second object. For example, the wearable device 101 may display multimedia content provided by the external electronic device corresponding to the first object in association with the external electronic device. The wearable device 101 may provide the multimedia content from the playback time at which the multimedia content was being reproduced on the external electronic device. The wearable device 101 may enhance user experience of the wearable device 101 by seamlessly continuing the multimedia content provided by the external electronic device.
According to an embodiment, the wearable device 300 may be wearable on a portion of the user's body. The wearable device 300 may provide augmented reality (AR), virtual reality (VR), or mixed reality (MR), which combines augmented reality and virtual reality, to a user wearing the wearable device 300. For example, the wearable device 300 may output a virtual reality image through at least one display 350, in response to a preset user gesture obtained through a motion recognition camera, e.g., camera 340-2 of
According to an embodiment, the at least one display 350 in the wearable device 300 may provide visual information to a user. The at least one display 350 may include the display 230 of
Referring to
According to an embodiment, the wearable device 300 may include waveguides 333 and 334 that transmit light transmitted from the at least one display 350 and relayed by the at least one optical device 382 and 384 by diffracting to the user. The waveguides 333 and 334 may be formed using at least one of glass, plastic, or polymer. A nano pattern may be formed on at least a portion of the outside or inside of the waveguides 333 and 334. The nano pattern may be formed using a grating structure having a polygonal or curved shape. Light incident at an end of the waveguides 333 and 334 may be propagated to another end of the waveguides 333 and 334 by the nano pattern. The waveguides 333 and 334 may include at least one of at least one diffraction element (e.g., a diffractive optical element (DOE), a holographic optical element (HOE)) and a reflection element (e.g., a reflection mirror). For example, the waveguides 333 and 334 may be disposed in the wearable device 300 to guide a screen displayed by the at least one display 350 to the user's eyes. For example, the screen may be transmitted to the user's eyes through total internal reflection (TIR) generated in the waveguides 333 and 334.
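As a numerical aside (not part of the disclosure), the total internal reflection (TIR) that guides the screen through the waveguides occurs above the critical angle θc = arcsin(n_clad / n_core) at a boundary from a denser medium to a rarer one; the function name and example indices below are illustrative.

```python
# Illustrative only: critical angle for total internal reflection inside a
# waveguide core of refractive index n_core surrounded by cladding n_clad.
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Smallest internal angle (measured from the boundary normal, in
    degrees) at which light stays trapped in the core by TIR."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))
```

For a glass-like core (n ≈ 1.5) against air, light striking the boundary more obliquely than about 42° from the normal is totally internally reflected toward the eye.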
According to an embodiment, the wearable device 300 may analyze an object included in a real image collected through a photographing camera 340-3, combine the real image with a virtual object corresponding to an object that is a subject of augmented reality provision among the analyzed objects, and display the result on the at least one display 350. The virtual object may include at least one of text and images for various information associated with the object included in the real image. The wearable device 300 may analyze the object based on a multi-camera such as a stereo camera. For the object analysis, the wearable device 300 may execute time-of-flight (ToF) and/or simultaneous localization and mapping (SLAM) supported by the multi-camera. The user wearing the wearable device 300 may watch an image displayed on the at least one display 350.
According to an embodiment, a frame may be configured with a physical structure in which the wearable device 300 may be worn on the user's body. According to an embodiment, the frame may be configured so that when the user wears the wearable device 300, the first display 350-1 and the second display 350-2 may be positioned corresponding to the user's left and right eyes. The frame may support the at least one display 350. For example, the frame may support the first display 350-1 and the second display 350-2 to be positioned at positions corresponding to the user's left and right eyes.
Referring to
According to an embodiment, the frame may include a first rim 301 surrounding at least a portion of the first display 350-1, a second rim 302 surrounding at least a portion of the second display 350-2, a bridge 303 disposed between the first rim 301 and the second rim 302, a first pad 311 disposed along a portion of the edge of the first rim 301 from one end of the bridge 303, a second pad 312 disposed along a portion of the edge of the second rim 302 from the other end of the bridge 303, the first temple 304 extending from the first rim 301 and fixed to a portion of the wearer's ear, and the second temple 305 extending from the second rim 302 and fixed to a portion of the ear opposite to the ear. The first pad 311 and the second pad 312 may be in contact with the portion of the user's nose, and the first temple 304 and the second temple 305 may be in contact with a portion of the user's face and the portion of the user's ear. The temples 304 and 305 may be rotatably connected to the rim through hinge units 306 and 307, shown in
According to an embodiment, the wearable device 300 may include hardware (e.g., hardware described above based on the block diagram of
According to an embodiment, the microphones 394-1, 394-2, and 394-3 of the wearable device 300 may obtain a sound signal, by being disposed on at least a portion of the frame. The first microphone 394-1 disposed on the nose pad 310, the second microphone 394-2 disposed on the second rim 302, and the third microphone 394-3 disposed on the first rim 301 are illustrated in
According to an embodiment, the optical devices 382 and 384 may transmit a virtual object transmitted from the at least one display 350 to the waveguides 333 and 334. For example, the optical devices 382 and 384 may be projectors. The optical devices 382 and 384 may be disposed adjacent to the at least one display 350 or may be included in the at least one display 350 as a portion of the at least one display 350. The first optical device 382 may correspond to the first display 350-1, and the second optical device 384 may correspond to the second display 350-2. The first optical device 382 may transmit light outputted from the first display 350-1 to the first waveguide 333, and the second optical device 384 may transmit light outputted from the second display 350-2 to the second waveguide 334.
In an embodiment, a camera 340 may include an eye tracking camera (ET CAM) 340-1, a motion recognition camera 340-2, and/or the photographing camera 340-3. The photographing camera 340-3, the eye tracking camera 340-1, and the motion recognition camera 340-2 may be disposed at different positions on the frame and may perform different functions. The photographing camera 340-3, the eye tracking camera 340-1, and the motion recognition camera 340-2 may be an example of the camera 250 of
In an embodiment, the photographing camera 340-3 may photograph a real image or background to be matched with a virtual image in order to implement augmented reality or mixed reality content. The photographing camera may photograph an image of a specific object existing at a position viewed by the user and may provide the image to the at least one display 350. The at least one display 350 may display one image in which a virtual image provided through the optical devices 382 and 384 is overlapped with information on the real image or background including the image of the specific object obtained using the photographing camera. In an embodiment, the photographing camera may be disposed on the bridge 303 disposed between the first rim 301 and the second rim 302.
In an embodiment, the eye tracking camera 340-1 may implement a more realistic augmented reality by matching the user's gaze with the visual information provided on the at least one display 350, by tracking the gaze of the user wearing the wearable device 300. For example, when the user looks forward, the wearable device 300 may naturally display, on the at least one display 350, environment information associated with what is in front of the user. The eye tracking camera 340-1 may be configured to capture an image of the user's pupil in order to determine the user's gaze. For example, the eye tracking camera 340-1 may receive gaze detection light reflected from the user's pupil and may track the user's gaze based on the position and movement of the received gaze detection light. In an embodiment, the eye tracking camera 340-1 may be disposed at positions corresponding to the user's left and right eyes. For example, the eye tracking camera 340-1 may be disposed in the first rim 301 and/or the second rim 302 to face the direction in which the user wearing the wearable device 300 is positioned.
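Mapping a detected pupil position to a point on the display, as the eye tracking described above requires, might be illustrated with a per-axis linear calibration. A real tracker would calibrate this mapping per user (and typically use a richer model); the function and parameter names below are hypothetical.

```python
# Illustrative sketch only: map a pupil position (in eye-camera pixels) to a
# display position using per-axis linear calibration coefficients.
def pupil_to_display(pupil_xy, calib):
    """`calib` is ((gain_x, offset_x), (gain_y, offset_y)), obtained in a
    hypothetical per-user calibration step. Returns display coordinates."""
    (gx, ox), (gy, oy) = calib
    x, y = pupil_xy
    return (gx * x + ox, gy * y + oy)
```

The returned display coordinates can then be compared against object bounds to decide whether the gaze corresponds to, e.g., the first object.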
The motion recognition camera 340-2 may provide a specific event to the screen provided by the at least one display 350 by recognizing the movement of the whole or a portion of the user's body, such as the user's torso, hand, or face. The motion recognition camera 340-2 may obtain a signal corresponding to the motion by recognizing a user's gesture, and may provide a display corresponding to the signal to the at least one display 350. A processor may identify a signal corresponding to the gesture and may perform a preset function based on the identification. In an embodiment, the motion recognition camera 340-2 may be disposed on the first rim 301 and/or the second rim 302.
In an embodiment, the camera(s) 340 included in the wearable device 300 are not limited to the above-described eye tracking camera 340-1 and motion recognition camera 340-2. For example, the wearable device 300 may identify an external object included in the FoV using the photographing camera 340-3 disposed toward the user's FoV. The identification of an external object by the wearable device 300 may be performed based on a sensor for identifying a distance between the wearable device 300 and the external object, such as a depth sensor and/or a time of flight (ToF) sensor. The camera 340 disposed toward the FoV may support an autofocus function and/or an optical image stabilization (OIS) function. For example, the wearable device 300 may also include a camera 340 (e.g., a face tracking (FT) camera) disposed toward the face in order to obtain an image including the face of the user wearing the wearable device 300.
Although not illustrated, the wearable device 300 according to various embodiments may further include a light source (e.g., an LED) that emits light toward a subject (e.g., the user's eyes, face, and/or an external object in the FoV) photographed using the camera 340. The light source may include an LED having an infrared wavelength. The light source may be disposed on at least one of the frame and the hinge units 306 and 307.
According to an embodiment, the battery module 370 may supply power to electronic hardware components of the wearable device 300. In an embodiment, the battery module 370 may be disposed in the first temple 304 and/or the second temple 305. For example, the battery module 370 may include a plurality of battery modules 370, which may be disposed on the first temple 304 and the second temple 305, respectively. In an embodiment, the battery module 370 may be disposed at an end of the first temple 304 and/or the second temple 305.
In an embodiment, the antenna module 375 may transmit a signal(s) or power to the outside of the wearable device 300 or may receive signal(s) or power from the outside. The antenna module 375 may be electrically and/or operably connected with a communication circuit (e.g., the communication circuit 235 of
In an embodiment, speakers 392-1 and 392-2 may output a sound signal to the outside of the wearable device 300. A sound output module may be referred to, for example, as a speaker. In an embodiment, the speakers 392-1 and 392-2 may be disposed in the first temple 304 and/or the second temple 305 in order to be disposed adjacent to the ear of the user wearing the wearable device 300. For example, the wearable device 300 may include a second speaker 392-2 disposed adjacent to the user's left ear by being disposed in the first temple 304, and a first speaker 392-1 disposed adjacent to the user's right ear by being disposed in the second temple 305.
In an embodiment, a light emitting module (not illustrated) may include at least one light emitting element. The light emitting module may emit light of a color corresponding to a specific state or may emit light through an operation corresponding to the specific state in order to visually provide information on a specific state of the wearable device 300 to the user. For example, when the wearable device 300 requires charging, it may emit red light at a constant cycle. In an embodiment, the light emitting module may be disposed on the first rim 301 and/or the second rim 302.
Referring to
According to an embodiment, the wearable device 300 may include at least one of a gyro sensor, a gravity sensor, and/or an acceleration sensor for detecting the posture of the wearable device 300 and/or the posture of a body part (e.g., a head) of a user wearing the wearable device 300. Each of the gravity sensor and the acceleration sensor may measure gravity acceleration, and/or acceleration based on preset 3-dimensional axes (e.g., x-axis, y-axis, and z-axis) perpendicular to each other. The gyro sensor may measure angular velocity of each of preset 3-dimensional axes (e.g., x-axis, y-axis, and z-axis). At least one of the gravity sensor, the acceleration sensor, and the gyro sensor may be referred to, for example, as an inertial measurement unit (IMU). According to an embodiment, the wearable device 300 may identify the user's motion and/or gesture performed to execute or stop a specific function of the wearable device 300 based on the IMU.
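As a purely illustrative sketch of how an IMU as described above could estimate the posture of the wearer's head, the pitch and roll angles can be derived from the gravity vector measured by the accelerometer. The axis conventions and function names below are assumptions for illustration, not part of the disclosure.

```python
import math

# Illustrative sketch: estimating head pitch and roll from the gravity
# acceleration measured along the preset x-, y-, and z-axes.
# Axis conventions (z pointing down through the head when level) are
# assumed for illustration.

def posture_from_gravity(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerometer readings in g units."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device level: gravity entirely along the z-axis, so pitch and roll are zero.
pitch, roll = posture_from_gravity(0.0, 0.0, 1.0)
assert abs(pitch) < 1e-9 and abs(roll) < 1e-9
```

Yaw (rotation about the gravity axis) cannot be recovered from the accelerometer alone; the gyro sensor's angular velocity would be integrated for that, which is why the sensors are typically fused in an IMU.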
As described above, the wearable device 300 according to various embodiments may identify a first object (e.g., the first object 110 of
Referring to
According to an embodiment, the wearable device 400 may include cameras 440-1 and 440-2 for photographing and/or tracking the two eyes of the user, disposed adjacent to each of the first display 350-1 and the second display 350-2. The cameras 440-1 and 440-2 may be referred to as ET (eye tracking) cameras. According to an embodiment, the wearable device 400 may include cameras 440-3 and 440-4 for photographing and/or recognizing the user's face. The cameras 440-3 and 440-4 may be referred to as FT (face tracking) cameras.
Referring to
According to an embodiment, the wearable device 400 may include the depth sensor 430 disposed on the second surface 420 in order to identify a distance between the wearable device 400 and the external object. Using the depth sensor 430, the wearable device 400 may obtain spatial information (e.g., a depth map) about at least a portion of the FoV of the user wearing the wearable device 400.
Although not illustrated, a microphone for obtaining sound outputted from the external object may be disposed on the second surface 420 of the wearable device 400. The number of microphones may be one or more according to various embodiments.
As described above, according to an embodiment, the wearable device 400 may display an image obtained through the cameras 440-5, 440-6, 440-7, 440-8, 440-9, and 440-10 through the display 350. The wearable device 400 may display a first object (e.g., the first object 110 of
Referring to
The wearable device 101 according to an embodiment may identify an input with respect to the first object 110 and/or the second object 115. In response to the input with respect to the first object 110 and/or the second object 115, the wearable device 101 may switch to a screen 520 for displaying multimedia content provided through an external electronic device corresponding to the first object 110. The screen 520 may present the multimedia content at substantially the same playback timing as the external electronic device.
The wearable device 101 according to an embodiment may identify the input with respect to the first object 110 and/or the second object 115. For example, the input may be identified based on a gesture of the user of the wearable device 101. For example, the input may be received based on a controller of the wearable device 101. For example, the wearable device 101 may identify an input dragging the first object 110 and/or the second object 115. For example, the dragging input may include maintaining, using the controller, an input pressing the first object 110 and/or the second object 115 while moving the first object 110 and/or the second object 115. The input with respect to the first object 110 and/or the second object 115 is not limited to the above-described examples. The wearable device 101 may display the screen 520 through the display in response to the input. The wearable device 101 may display the screen 520 associated with multimedia content provided through the external electronic device corresponding to the first object 110.
The wearable device 101 according to an embodiment may display a screen associated with multimedia content provided by the external electronic device corresponding to the first object 110 on at least a portion of the display based on identifying the input with respect to the first object 110. For example, the wearable device 101 may display a third object 510 associated with the multimedia content on at least a portion of the display. For example, the third object 510 may be a visual object and/or a virtual object to display the multimedia content.
The wearable device 101 according to an embodiment may identify an input with respect to the third object 510 associated with multimedia content. The wearable device 101 may identify the input with respect to the third object 510 based on a signal received from a controller. For example, the controller may transmit a signal to indicate the virtual object and/or the visual object, such as a pointer, within the display. The wearable device 101 according to an embodiment may identify the input with respect to the third object 510 based on tracking the user's gaze. The wearable device 101 may display the multimedia content based on the entire area of the display based on the identification of the input.
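The gaze-based identification of an input with respect to an object, described above, can be illustrated as a hit test of the tracked gaze point against the object's display bounds. This is a minimal sketch under assumptions: the rectangle representation, function names, and coordinates are hypothetical, and real implementations typically also require a dwell time before treating a gaze as a selection.

```python
# Illustrative sketch: testing whether a tracked gaze point falls within
# the bounds of a displayed object (such as the third object 510) before
# treating it as an input. The axis-aligned rectangle model is assumed.

def gaze_hits_object(gaze_xy, obj_rect):
    """obj_rect: (x, y, width, height) in display coordinates."""
    x, y, w, h = obj_rect
    gx, gy = gaze_xy
    return x <= gx <= x + w and y <= gy <= y + h

third_object = (100, 100, 300, 200)  # hypothetical placement on the display
assert gaze_hits_object((250, 180), third_object)
assert not gaze_hits_object((50, 50), third_object)
```

A controller-driven pointer, as also described above, could reuse the same hit test with the pointer position in place of the gaze point.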
Referring to a second example screen 505, the wearable device 101 according to an embodiment may display multimedia content through the screen 520 of the display. For example, the wearable device 101 may change to the second example screen 505 so that multimedia content provided in the first example screen 500 may be watched continuously. The wearable device 101 according to an embodiment may display a fourth object 530 to switch to the first example screen 500, within the screen 520. For example, the wearable device 101 may display the fourth object 530 by superimposing it onto the screen 520 associated with the multimedia content. The wearable device 101 may cease displaying the fourth object 530 based on a designated duration elapsing while displaying the screen 520. For example, the wearable device 101 may at least temporarily cease displaying the fourth object 530 to provide the multimedia content through the entire area of the display.
As described above, the wearable device 101 according to an embodiment may identify
an external electronic device while operating in a VST mode. The wearable device 101 may display the second object to notify that the first object 110 for the external electronic device is selectable. The wearable device 101 may display multimedia content provided by the external electronic device corresponding to the first object 110 in at least a partial area of the display based on identifying the input with respect to the first object 110. For example, the wearable device 101 may display the multimedia content through a visual object such as the third object 510. The wearable device 101 may display the screen 520 associated with the multimedia content based on an input with respect to the third object 510. For example, the screen 520 associated with the multimedia content may occupy the entire area of the display. For example, the wearable device 101 may display the fourth object 530 to reduce a size of the screen 520 while displaying the screen 520. The wearable device 101 may change the size of the screen 520 to a size of the third object 510 based on an input with respect to the fourth object 530. The size of the screen 520, the size of the third object 510, and/or a position of the third object 510 are not limited to the above-described examples. The wearable device 101 may perform switching between the third object 510 and the screen 520. The wearable device 101 may enhance user experience of the wearable device 101 by performing the switching based on the user's input.
Referring to
The wearable device 101 according to an embodiment may identify a plurality of objects 620 and 630. For example, the wearable device 101 may highlight and display the third object 630 including the area 610 superimposed with the first object 110 among the plurality of objects 620 and 630. For example, the operation of highlighting and displaying may include an operation of blinking the third object 630. For example, the operation of highlighting and displaying may include an operation of displaying the third object 630 in a different color. However, the disclosure is not limited in this respect. The wearable device 101 according to an embodiment may receive an input with respect to the third object 630 while highlighting and displaying the third object 630. For example, the input may include an input of dragging the third object 630. The wearable device 101 may move the third object 630 based on the input.
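The superimposition handling described above can be sketched as detecting the overlapping area between two display rectangles and shifting the overlapping object until it is clear. This is an illustrative assumption: the rectangle model, the rightward-shift strategy, and all coordinates are hypothetical and not specified by the disclosure.

```python
# Illustrative sketch: detecting an area (such as area 610) where one
# object superimposes another, and moving the overlapping object (such
# as the third object 630) to a non-overlapping position.

def overlap_area(a, b):
    """Rectangles as (x, y, w, h); return the overlapping area in pixels."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = min(ax + aw, bx + bw) - max(ax, bx)
    h = min(ay + ah, by + bh) - max(ay, by)
    return max(0, w) * max(0, h)

def move_clear(obj, anchor, step=10):
    """Shift obj rightward in `step`-pixel increments until it no longer
    overlaps anchor. The shift direction is an illustrative choice."""
    x, y, w, h = obj
    while overlap_area((x, y, w, h), anchor) > 0:
        x += step
    return (x, y, w, h)

first_object = (0, 0, 100, 100)
third_object = (80, 0, 60, 60)   # partially superimposed on first_object
moved = move_clear(third_object, first_object)
assert overlap_area(moved, first_object) == 0
```

An input-driven variant, as described above, would instead apply the user's drag displacement and accept the drop only when the overlap is zero.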
As described above, the wearable device 101 according to an embodiment may identify the first object 110 corresponding to the external electronic device providing multimedia content. The wearable device 101 may identify the third object 630 at least partially superimposed with the first object 110. The wearable device 101 may move the third object 630 to the area 635 that does not superimpose with the first object 110. The wearable device 101 may move the third object 630 based on input with respect to the third object 630. The wearable device 101 may provide a user with multimedia content provided from the external electronic device corresponding to the first object 110 by moving the third object 630 superimposed with the first object 110. The wearable device 101 may help the user watch multimedia content provided through the external electronic device corresponding to the first object 110 by moving the third object 630 superimposed with the first object 110.
Referring to
The wearable device 101 may identify a first object 110 within the image. The wearable device 101 may identify an external electronic device corresponding to the first object 110 based on the first object 110. Based on the identification, the wearable device 101 may request information to display multimedia content provided through the external electronic device to the external electronic device. The wearable device 101 may receive the information transmitted from the external electronic device based at least in part on the request. In response to receiving the information, the wearable device 101 may display a second object 115 to notify that the external electronic device is selectable. For example, the wearable device 101 may display the second object 115 in conjunction with the first object 110. For example, the wearable device 101 may display the second object 115 along an edge of the first object 110.
The wearable device 101 according to an embodiment may identify an input with respect to the first object 110 and/or the second object 115. The wearable device 101 may display a third object 710 and a fourth object 720 within at least a partial area 730 of a display based on the input with respect to the first object 110 and/or the second object 115. For example, the third object 710 may be a button to at least temporarily stop multimedia content provided by the external electronic device corresponding to the first object 110 and initiate performing an operation to display the multimedia content through the display of the wearable device 101. For example, the third object 710 may include text such as ‘Continue watching on App’. For example, the fourth object 720 may be a button to simultaneously provide multimedia content through the external electronic device corresponding to the first object 110 and the display of the wearable device 101. For example, the fourth object 720 may include text such as ‘simultaneous play of TV/App’. In the example 700 of
As described above, the wearable device 101 according to an embodiment may display the first object 110 and/or the second object 115 corresponding to the external electronic device. The wearable device 101 may identify the input with respect to the first object 110 and/or the second object 115. The wearable device 101 may display an object (or the objects 710 and 720) to perform a function (or operation) based on the input with respect to the first object 110 and/or the second object 115. The wearable device 101 may perform an operation corresponding to a function represented by the object based on an input with respect to the object. By performing a function (or operation) corresponding to the object, the wearable device 101 may provide multimedia content through the wearable device 101, or may provide multimedia content through the wearable device 101 and the external electronic device. The wearable device 101 may enhance user experience of the wearable device 101 by providing the multimedia content.
Referring to
The wearable device 101 according to an embodiment may identify the position of the wearable device 101 based on simultaneous localization and mapping (SLAM). For example, the wearable device 101 may identify the position of the wearable device 101 when providing the multimedia content 820 based on the SLAM. The wearable device 101 may identify a position where the multimedia content 820 is repeatedly provided based on the SLAM and/or the scene recognition. The wearable device 101 may store the position where the multimedia content 820 is repeatedly provided in a memory. Although not illustrated in
The wearable device 101 according to an embodiment may identify a first position of the wearable device 101 based on the image obtained through the camera. The wearable device 101 may display multimedia content based on identifying the first position corresponding to a second position stored in the memory. For example, the second position may be a position where multimedia content is repeatedly provided.
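The comparison of a first (current) position against a second (stored) position, described above, can be illustrated as a distance check against the positions saved in memory. The 3D point representation, the distance threshold, and the example coordinates below are assumptions for illustration only.

```python
import math

# Illustrative sketch: checking whether the wearable device's current
# SLAM-estimated position matches a stored position where multimedia
# content was repeatedly provided. The 0.5 m threshold is assumed.

def matches_stored_position(current, stored_positions, threshold=0.5):
    """current: (x, y, z) in meters; return the matching stored position,
    or None when no stored position is within the threshold."""
    for pos in stored_positions:
        if math.dist(current, pos) <= threshold:
            return pos
    return None

stored = [(2.0, 0.0, 3.0)]  # hypothetical spot, e.g. in front of a sofa
assert matches_stored_position((2.1, 0.0, 3.1), stored) == (2.0, 0.0, 3.0)
assert matches_stored_position((5.0, 0.0, 0.0), stored) is None
```

On a match, the device would then display the multimedia content, or an object guiding its provision, as described above.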
As described above, the wearable device 101 according to an embodiment may identify the position of the wearable device 101. The wearable device 101 may identify the position of the wearable device 101 based on the repeatedly identified scene 810 and/or the SLAM. The wearable device 101 may identify the position of the wearable device 101 when providing the multimedia content 820. The wearable device 101 may store the position of the wearable device 101 when providing the multimedia content 820 in the memory. The wearable device 101 may provide the multimedia content 820 based on identifying the position of the wearable device 101 corresponding to the position stored in the memory. The wearable device 101 may display the object to guide the provision of the multimedia content 820 based on identifying the position of the wearable device 101 corresponding to the position stored in the memory. The wearable device 101 may help a user easily receive the multimedia content 820 by providing the multimedia content 820 based on the position of the wearable device 101 corresponding to the position stored in the memory. The wearable device 101 may enhance user experience of the wearable device 101 by providing the multimedia content 820 based on identifying the position of the wearable device 101 identified at a designated position.
Referring to
The wearable device 101 according to an embodiment may reduce the size of the multimedia content 910 based on the identification of the movement of the wearable device 101. For example, the wearable device 101 may display a screen such as a second example screen 905. For example, the wearable device 101 may display an object 930 corresponding to the multimedia content 910 by reducing the size of the multimedia content 910 based on the movement of the wearable device 101. The wearable device 101 may display an image 920 obtained through the camera while displaying the object 930, which is reduced from the multimedia content 910. The wearable device 101 may display the object 930 superimposed on the image 920. For example, the wearable device 101 may provide the multimedia content 910 through the object 930.
The wearable device 101 according to an embodiment may adjust a size of the object 930. For example, the wearable device 101 may adjust the size of the object 930 based on an input with respect to the object 930. For example, the input with respect to the object 930 may include an operation of dragging the object 930. For example, the input with respect to the object 930 may include an operation to pinch-to-zoom the object 930. The input with respect to the object 930 may be received by a controller. The input with respect to the object 930 may be received by the user's gesture through the camera of the wearable device 101. The wearable device 101 may enlarge or reduce the size of the object 930 based on the input.
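The pinch-to-zoom resizing described above can be sketched as scaling the object by the ratio of the current to the initial distance between two tracked finger positions. The function names, the clamp limits, and the coordinates below are illustrative assumptions.

```python
import math

# Illustrative sketch: resizing an object (such as the object 930) by
# the ratio of finger separations in a pinch-to-zoom gesture. The
# min/max clamp values are assumed for illustration.

def pinch_scale(size, start_pts, end_pts, min_scale=0.25, max_scale=4.0):
    """size: (w, h); start_pts/end_pts: two (x, y) finger positions each."""
    d0 = math.dist(start_pts[0], start_pts[1])
    d1 = math.dist(end_pts[0], end_pts[1])
    scale = max(min_scale, min(max_scale, d1 / d0))
    return (size[0] * scale, size[1] * scale)

# Fingers move apart to twice their initial separation: the object doubles.
new_size = pinch_scale((200, 100), [(0, 0), (10, 0)], [(0, 0), (20, 0)])
assert new_size == (400.0, 200.0)
```

A drag-based resize, also mentioned above, would instead scale by the displacement of a single tracked point relative to the object's corner.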
As described above, the wearable device 101 according to an embodiment may display the multimedia content 910. The wearable device 101 may identify the movement of the wearable device 101 while displaying the multimedia content 910. The wearable device 101 may display the image 920 obtained through the camera based on the identification of the movement. While displaying the image 920, the wearable device 101 may superimpose and display the object 930 corresponding to the multimedia content 910 on the image 920. The wearable device 101 may provide the multimedia content 910 through the object 930. The wearable device 101 may help the user of the wearable device 101 receive the multimedia content 910 in a safe manner by converting the multimedia content 910 to the object 930 and displaying the image 920 obtained through the camera based on identifying the user's movement.
Referring to
The wearable device 101 according to an embodiment may display an object 1030 to provide the multimedia content 1020 within the screen 1010 during a change in a position of the wearable device 101. For example, the wearable device 101 may display the multimedia content 1020 using the object 1030 within a second place 1005 different from the first place 1000. The wearable device 101 may receive an input to adjust a size of the object 1030 while displaying the object 1030. For example, the input to adjust the size of the object 1030 may include an operation of dragging the object 1030. For example, the input to adjust the size of the object 1030 may include an operation to pinch-to-zoom the object 1030. The wearable device 101 may adjust the size of the object 1030 based on the input.
As described above, the wearable device 101 according to an embodiment may identify the multimedia content 1020 provided from the first external electronic device. The wearable device 101 may receive the input with respect to the area including the multimedia content 1020 based on the identification of the multimedia content 1020. The wearable device 101 may search for the multimedia content 1020 using the second external electronic device different from the first external electronic device in response to the input. The wearable device 101 may display the multimedia content 1020 in at least a partial area of the screen 1010 based on the search for the multimedia content 1020. The wearable device 101 may enhance user experience of the wearable device 101 by providing the multimedia content 1020 identified within the screen 1010 based on the user's input.
Referring to
In operation 1103, the wearable device according to an embodiment may request information to display multimedia content provided through the external electronic device based on the identification of the external electronic device. For example, through communication circuitry (e.g., the communication circuitry 240 of
In operation 1105, the wearable device according to an embodiment may receive information transmitted from the external electronic device based at least in part on the request for information to display multimedia content. For example, the wearable device may receive the information through the communication circuitry. In response to receiving the information, the wearable device may display a second object to notify that the external electronic device is selectable, in conjunction with the first object. For example, the wearable device may display the second object along an edge of the first object. For example, the wearable device may display the second object superimposed on the first object while blinking the second object.
In operation 1107, the wearable device according to an embodiment may identify an input received with respect to the first object associated with the second object. For example, the input received with respect to the first object may be received based on a controller that has established a communication link with the wearable device. For example, the input received with respect to the first object may be received based on the identification of the user's gaze of the wearable device. The wearable device may display the multimedia content through a display (e.g., the display 230 of
As described above, the wearable device according to an embodiment may identify the external electronic device based on the first object within the image obtained using the camera. The wearable device may request information to display multimedia content provided through the external electronic device to the external electronic device through the communication circuitry based on the identification of the external electronic device. In response to receiving information transmitted from the external electronic device through the communication circuitry based on at least a portion of the request, the wearable device may display the second object to notify that the external electronic device is selectable in conjunction with the first object. The wearable device may display multimedia content through the display based on the information in response to the input received with respect to the first object associated with the second object. The wearable device may enhance user experience of the wearable device by providing the multimedia content by receiving information associated with multimedia content provided through the external electronic device.
Referring to
In operation 1220, the first external electronic device 1201 according to an embodiment may transmit the first user account information to the second external electronic device 1203. For example, the first external electronic device 1201 may encrypt the first user account information. The first external electronic device 1201 may transmit the encrypted first user account information to the second external electronic device 1203. For example, the first external electronic device 1201 may transmit information associated with the shape of the first external electronic device to the second external electronic device 1203 together with the first user account information.
In operation 1230, the wearable device 101 according to an embodiment may transmit a signal requesting to identify agreement of the first user account information and the second user account information to the second external electronic device 1203. For example, the signal may include information associated with the shape of the first object identified from the image obtained from the camera and user information regarding the user logged in to the wearable device 101. For example, the signal may include a signal to request user account information of the first external electronic device 1201 transmitted to the second external electronic device 1203.
In operation 1240, the wearable device 101 according to an embodiment may receive a signal identifying agreement of the user account information. For example, the signal to identify matching of the accounts may be transmitted by the second external electronic device 1203 based on agreement of the first user information regarding the user logged in to the first external electronic device 1201 and the second user information regarding the user logged in to the wearable device 101. For example, the second external electronic device 1203 may decrypt information (or signals) transmitted from the wearable device 101 and the first external electronic device 1201. The second external electronic device 1203 may identify agreement of the first user information and the second user information based on the decrypted information. For example, the second external electronic device 1203 may initiate an operation to identify agreement of user information based on agreement of information associated with the shape of the first object transmitted from the wearable device 101 and information associated with the shape of the first external electronic device 1201 transmitted from the first external electronic device 1201. The second external electronic device 1203 may identify agreement of the second user information of the wearable device 101 and the first user information of the first external electronic device 1201 based on the initiation. The second external electronic device 1203 may transmit a signal to confirm agreement of the user account to the wearable device 101 based on the agreement.
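One way the account-agreement check in operations 1230-1240 could be realized is by having the server compare digests of the user account information received from each device. This is a hedged sketch under assumptions: the disclosure states only that the information is encrypted and decrypted, so the salted-hash comparison, the function names, and the example account strings below are illustrative, not the disclosed mechanism.

```python
import hashlib

# Illustrative sketch: the second external electronic device (server)
# compares user account information received from the first external
# electronic device and from the wearable device. Salted SHA-256
# digests stand in for the encrypted information; this choice is an
# assumption for illustration.

def account_digest(user_account: str, salt: str) -> str:
    return hashlib.sha256((salt + user_account).encode()).hexdigest()

def accounts_agree(first_device_digest: str, wearable_digest: str) -> bool:
    return first_device_digest == wearable_digest

salt = "session-salt"  # assumed to be shared per session
d1 = account_digest("user@example.com", salt)  # from the first external device
d2 = account_digest("user@example.com", salt)  # from the wearable device
assert accounts_agree(d1, d2)
assert not accounts_agree(d1, account_digest("other@example.com", salt))
```

On agreement, the server would transmit the confirmation signal of operation 1240, after which the wearable device proceeds to display the second object in operation 1250.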
In operation 1250, the wearable device 101 according to an embodiment may display a second object (e.g., the second object 115 of
In operation 1260, the wearable device 101 according to an embodiment may receive an input with respect to the first object. The wearable device 101 may display multimedia content provided through the first external electronic device 1201 in response to the input received with respect to the first object.
As described above, the wearable device 101 according to an embodiment may perform an operation to identify agreement of the first user information logged in to the first external electronic device 1201 and the second user information logged in to the wearable device 101. The wearable device 101 may perform a request to the second external electronic device 1203 to identify agreement of the first user information and the second user information. The wearable device 101 may display multimedia content provided through the first external electronic device 1201 based on agreement of the first user information and the second user information. The wearable device 101 may enhance user experience of the wearable device 101 by displaying multimedia content provided through the first external electronic device 1201.
'Metaverse' is a compound of 'meta', meaning 'virtual' or 'transcendent', and 'universe', and refers to a three-dimensional virtual world where social, economic, and cultural activities like those in the real world take place. The metaverse is a concept that has evolved one step further than virtual reality (VR, a state-of-the-art technology that enables people to have lifelike experiences in a virtual world created by a computer) and is characterized by using avatars not only to enjoy games or virtual reality, but also to engage in social and cultural activities like those in the real world. A metaverse service may provide media content to enhance immersion in the virtual world based on augmented reality (AR), virtual reality (VR), mixed reality (MR), and/or extended reality (XR).
For example, media content provided by a metaverse service may include social interaction content including an avatar-based game, a concert, a party, and/or a meeting. For example, the media content may include information for economic activity such as advertising, user created content, and/or sales and/or shopping of a product. Ownership of the user created content may be proved by a blockchain-based non-fungible token (NFT). The metaverse service may support economic activity based on real money and/or cryptocurrency. Virtual content associated with the real world, such as digital twin or life logging, may be provided by the metaverse service.
Referring to
At this time, the server 1310 enables the user terminal 1320 to be active in a virtual space by providing the virtual space. In addition, by installing a software (S/W) agent for accessing the virtual space provided by the server 1310, the user terminal 1320 may express information that the server 1310 provides to the user, or may transmit information that the user wants to express in the virtual space to the server. The S/W agent may be provided directly by the server 1310, downloaded from a public server, or embedded when the terminal is purchased.
In an embodiment, the metaverse service may be provided to the user terminal 1320 and/or the user using the server 1310. The embodiment is not limited thereto, and the metaverse service may be provided through individual contact between users. For example, within the network environment 1301, the metaverse service may be provided by a direct connection between the first terminal 1320-1 and the second terminal 1320-2 independently of the server 1310. Referring to
In an embodiment, the user terminal 1320 (or the user terminal 1320 including the first terminal 1320-1 and the second terminal 1320-2) may be provided in various form factors, and includes an output device for providing images and/or sound to the user and an input device for inputting information into the metaverse service. Examples of various form factors of the user terminal 1320 include a smartphone (e.g., the second terminal 1320-2), an AR device (e.g., the first terminal 1320-1), a VR device, an MR device, a video see-through (VST) device, an optical see-through (OST) device, a smart lens, a smart mirror, and a TV or projector capable of input/output.
The network (e.g., the network formed by the at least one intermediate node 1330) includes various broadband networks, including 3G, 4G, and 5G, and short-range networks (e.g., a wired or wireless network directly connecting the first terminal 1320-1 and the second terminal 1320-2), including Wi-Fi and BT (Bluetooth).
A method to display multimedia content provided through an external electronic device may be required. As described above, a wearable device according to various embodiments may include communication circuitry, a display, a camera, memory storing instructions, and at least one processor. The instructions, when executed by the at least one processor, may cause the wearable device to identify an external electronic device based on a first object within an image obtained using the camera; request, from the external electronic device through the communication circuitry based on the identification, information to display multimedia content provided through the external electronic device; display, in conjunction with the first object, a second object indicating that the external electronic device is selectable, in response to receiving the information transmitted from the external electronic device through the communication circuitry; and display, through the display, the multimedia content obtained from the external electronic device based on the information, based on whether a user input with respect to the first object is detected.
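Purely for illustration, the operational flow described above may be sketched as follows. All class and method names here are hypothetical and are not part of the disclosed implementation; communication and display operations are simulated:

```python
# Hypothetical sketch of the flow: identify an external device from a first
# object in a camera image, request display information, mark the device as
# selectable, then display its content upon user input.
from dataclasses import dataclass, field


@dataclass
class ExternalDevice:
    device_id: str
    info_received: bool = False  # set once display information arrives


@dataclass
class WearableDevice:
    # device_id -> whether the "selectable" second object is shown
    selectable: dict = field(default_factory=dict)

    def identify_device(self, image_objects):
        # Identify an external electronic device based on a first object
        # detected within the camera image.
        for obj in image_objects:
            if obj.get("type") == "external_device":
                return ExternalDevice(obj["id"])
        return None

    def request_display_info(self, device: ExternalDevice):
        # Request, over the communication circuitry, information needed to
        # display the device's multimedia content (simulated as immediate).
        device.info_received = True

    def mark_selectable(self, device: ExternalDevice):
        # Display a second object indicating the device is selectable,
        # only once the requested information has been received.
        if device.info_received:
            self.selectable[device.device_id] = True

    def handle_user_input(self, device: ExternalDevice) -> str:
        # On user input directed at the first object, display the content.
        if self.selectable.get(device.device_id):
            return f"displaying content from {device.device_id}"
        return "no action"
```

In this sketch, the "second object" is reduced to a boolean flag; in the embodiments it is a visual indicator displayed in conjunction with the first object.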
According to various embodiments, the first object, in conjunction with the second object, may be visually highlighted with respect to at least one third object within the image that is displayed in an area different from the first object.
The second object according to various embodiments may be displayed along an edge of the first object, or may be displayed superimposed on the first object while blinking.
The processor according to various embodiments may display a fourth object for receiving an input to provide the multimedia content in conjunction with the external electronic device.
The instructions according to various embodiments, when executed by the at least one processor, may cause the wearable device to identify a first position of the wearable device based on an image obtained through the camera; and display the multimedia content through the display based on identifying that the first position corresponds to a second position stored in the wearable device.
The second position may, for example, be a position set to repeatedly provide the multimedia content.
The instructions according to various embodiments, when executed by the at least one processor, may cause the wearable device to display the second object in conjunction with the first object based on identifying agreement between first user account information regarding a user logged into the wearable device and second user account information regarding a user logged into the external electronic device.
The instructions according to various embodiments, when executed by the at least one processor, may cause the wearable device to display the multimedia content in a portion of the display based on identifying a movement of a user while displaying the multimedia content.
As described above, according to various embodiments, a method of a wearable device may include identifying an external electronic device based on a first object within an image obtained using a camera; requesting, from the external electronic device through communication circuitry based on the identification, information to display multimedia content provided through the external electronic device; displaying, in conjunction with the first object, a second object indicating that the external electronic device is selectable, in response to receiving the information transmitted from the external electronic device through the communication circuitry; and displaying, through a display, the multimedia content obtained from the external electronic device based on the information, based on whether a user input with respect to the first object is detected.
According to various embodiments, the first object, in conjunction with the second object, may be visually highlighted with respect to at least one third object within the image that is displayed in an area different from the first object.
The second object according to various embodiments may be displayed along an edge of the first object, or may be displayed superimposed on the first object while blinking.
The method according to various embodiments may include displaying a fourth object for receiving an input to provide the multimedia content in conjunction with the external electronic device.
The method according to various embodiments may include identifying a first position of the wearable device based on an image obtained through the camera; and displaying the multimedia content through the display based on identifying that the first position corresponds to a second position stored in the wearable device.
The second position according to various embodiments may be a position set to repeatedly provide the multimedia content.
The method according to various embodiments may include displaying the second object in conjunction with the first object based on identifying agreement between first user account information of a user of the wearable device and second user account information of a user of the external electronic device.
The method according to various embodiments may include displaying the multimedia content on a portion of the display and displaying an image obtained through the camera, based on identifying a movement of a user while displaying the multimedia content.
As described above, a non-transitory computer-readable storage medium may store one or more programs according to various embodiments. The one or more programs, when executed by at least one processor of a wearable device, may cause the at least one processor of the wearable device to identify an external electronic device based on a first object within an image obtained using a camera; request, from the external electronic device through communication circuitry based on the identification, information to display multimedia content provided through the external electronic device; display, in conjunction with the first object, a second object indicating that the external electronic device is selectable, in response to receiving the information transmitted from the external electronic device through the communication circuitry; and display, through a display, the multimedia content obtained from the external electronic device based on the information, based on whether a user input with respect to the first object is detected.
According to various embodiments, the first object, in conjunction with the second object, may be visually highlighted with respect to at least one third object within the image that is displayed in an area different from the first object.
The second object according to various embodiments may be displayed along an edge of the first object, or may be displayed superimposed on the first object while blinking.
The one or more programs according to various embodiments, when executed by the at least one processor of the wearable device, may cause the at least one processor of the wearable device to display a fourth object for receiving an input to provide the multimedia content in conjunction with the external electronic device.
As described above, the head-wearable electronic device according to various embodiments may include at least one display, a first camera usable for identifying eye gaze information, a second camera usable for obtaining images regarding a physical environment in front of the head-wearable electronic device, communication circuitry, memory storing instructions, and at least one processor comprising processing circuitry. The instructions, when executed by the at least one processor individually or collectively, cause the head-wearable electronic device to display, using the at least one display, images of the physical environment obtained using the second camera; while displaying the images of the physical environment, identify that first eye gaze information, obtained via the first camera, corresponds to a visual object in the images, the visual object corresponding to an external electronic device in the physical environment; based on identifying that the first eye gaze information corresponds to the visual object, display, using the at least one display, a user interface (UI) object associated with the visual object; while displaying the UI object associated with the visual object, identify that second eye gaze information, obtained via the first camera, corresponds to the UI object; based at least on identifying that the second eye gaze information corresponds to the UI object, execute a first function associated with the UI object including transmitting, through the communication circuitry, to the external electronic device, a signal to request establishment of a communication link with the external electronic device; and based on information received through the communication circuitry, display, using the at least one display, screen images, associated with the external electronic device, superimposed on the images of the physical environment.
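The gaze-driven interaction sequence above may be modeled, purely as an illustrative sketch, as a small state machine. The states, target names, and method names are assumptions for illustration, not the actual device behavior:

```python
# Hypothetical state machine for the gaze-driven sequence: gaze at the
# visual object reveals a UI object; gaze at the UI object requests a
# communication link; received link information triggers superimposed
# display of the external device's screen images.
class GazeInteraction:
    def __init__(self):
        self.state = "viewing_passthrough"  # showing camera passthrough images
        self.link_requested = False

    def on_gaze(self, target: str):
        # First eye gaze information corresponding to the visual object:
        # display the associated UI object.
        if self.state == "viewing_passthrough" and target == "visual_object":
            self.state = "ui_object_shown"
        # Second eye gaze information corresponding to the UI object:
        # execute the first function, i.e., transmit a signal requesting
        # establishment of a communication link.
        elif self.state == "ui_object_shown" and target == "ui_object":
            self.link_requested = True
            self.state = "awaiting_link"

    def on_link_info_received(self):
        # Information received through the communication circuitry:
        # superimpose the external device's screen images on passthrough.
        if self.state == "awaiting_link":
            self.state = "screen_superimposed"
```

A gesture condition (e.g., a hand movement detected while the second gaze is identified, as in some embodiments) could be added as an extra guard on the second transition.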
For example, the instructions, when executed by the at least one processor individually or collectively, cause the head-wearable electronic device to, in response to a gesture detected while the second eye gaze information is identified, transmit the signal. The first function associated with the UI object may be executed based on detecting the gesture including a movement of a hand in addition to identifying the second eye gaze information.
For example, the UI object may be displayed based on determining that the external electronic device is registered to the same user account to which the head-wearable electronic device is registered.
For example, the execution of the first function may further include causing the external electronic device to deactivate the display of the external electronic device.
For example, the instructions, when executed by the at least one processor individually or collectively, cause the head-wearable electronic device to, while displaying the screen images associated with the external electronic device, using the displays of the head-wearable electronic device, display another UI object alongside the displayed screen images.
For example, the instructions, when executed by the at least one processor individually or collectively, cause the head-wearable electronic device to identify, using the first camera, third eye gaze information received with respect to the another UI object; and based on the third eye gaze information, change sizes of the screen images being displayed on the displays.
For example, the head-wearable electronic device may further include a sensor configured to output sensor data indicating a movement of the head-wearable electronic device. The instructions, when executed by the at least one processor individually or collectively, cause the head-wearable electronic device to, based on a gesture associated with the screen images, change sizes of the screen images being displayed on the displays.
As described above, according to an embodiment, a method of a head-wearable electronic device is provided. The head-wearable electronic device may include displays, a first camera usable for identifying eye gaze information, a second camera usable for obtaining images regarding a physical environment in front of the head-wearable electronic device, and communication circuitry. The method may comprise displaying, using the displays, images of the physical environment obtained using the second camera; while displaying the images of the physical environment, identifying that first eye gaze information, obtained via the first camera, corresponds to a visual object in the images, the visual object corresponding to an external electronic device in the physical environment; based on identifying that the first eye gaze information corresponds to the visual object, displaying, using the displays, a user interface (UI) object associated with the visual object; while displaying the UI object associated with the visual object, identifying that second eye gaze information, obtained via the first camera, corresponds to the UI object; based at least on identifying that the second eye gaze information corresponds to the UI object, executing a first function associated with the UI object including transmitting, through the communication circuitry, to the external electronic device, a signal to request a communication link with the external electronic device; and based on information received through the communication circuitry, displaying, using the displays, screen images, associated with the external electronic device, superimposed on the images of the physical environment.
For example, the transmitting may further include, in response to a gesture detected while the second eye gaze information is identified, transmitting the signal. The first function associated with the UI object may be executed based on detecting the gesture including a movement of a hand in addition to identifying the second eye gaze information.
For example, the UI object may be displayed based on determining that the external electronic device is registered to the same user account to which the head-wearable electronic device is registered.
For example, the executing may further include causing the external electronic device to deactivate the display of the external electronic device.
For example, the displaying the screen images may further include, while displaying the screen images associated with the external electronic device, displaying, using the displays of the head-wearable electronic device, another UI object alongside the displayed screen images.
For example, the method may further include identifying, using the first camera, third eye gaze information received with respect to the another UI object; and based on the third eye gaze information, changing sizes of the screen images being displayed on the displays.
For example, the method may further include changing, based on a gesture associated with the screen images, sizes of the screen images being displayed on the displays.
As described above, a wearable device according to an embodiment may include communication circuitry, a display, a camera, and a processor. The processor may be configured to identify, based on a first object in an image obtained using the camera, a first external electronic device. The processor may be configured to transmit, using the communication circuitry, a first signal to a second external electronic device to identify coincidence of a first user account used by the wearable device and a second user account used by the first external electronic device. The processor may be configured to receive, using the communication circuitry, a second signal from the second external electronic device to identify the coincidence of the first user account and the second user account. The processor may be configured to display, at least based on the second signal, a second visual object in association with a selection of the first external electronic device in conjunction with the first object. The processor may be configured to display, based on a user input with respect to the second visual object, content provided through the first external electronic device through the display.
For example, the second signal may be generated by the second external electronic device based on the second external electronic device receiving user account information from the first external electronic device that is used in the first external electronic device.
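The account-coincidence exchange described above may be sketched as follows, purely for illustration: the wearable device sends a first signal carrying its user account and the identified device, and the second external electronic device (e.g., a server holding account registrations) returns a second signal indicating whether the two accounts coincide. The function name, data shapes, and server model are hypothetical:

```python
# Hypothetical account-coincidence check. The "server" is modeled as a
# mapping from device identifiers to the user accounts registered on them,
# as reported by each first external electronic device.
def check_account_coincidence(server_accounts: dict,
                              wearable_account: str,
                              device_id: str) -> bool:
    # First signal (wearable -> server): wearable_account and device_id.
    # Second signal (server -> wearable): whether the account used by the
    # wearable coincides with the account used by the identified device.
    device_account = server_accounts.get(device_id)
    return device_account is not None and device_account == wearable_account
```

Only when this check succeeds would the wearable display the second visual object enabling selection of the first external electronic device.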
For example, the processor may be configured to display the second visual object in response to a gaze input of a user wearing the wearable device, the gaze input being directed at the first object.
For example, the wearable device may further include another camera usable for identifying the gaze input. The processor may be configured to identify the gaze input using the another camera.
For example, the processor may be configured to, based on the user input, transmit, through the communication circuitry, to the first external electronic device, a signal to instruct the first external electronic device to cease to display the content on a display of the first external electronic device.
For example, the processor may be configured to display, through the display, the content together with a visual object to receive another user input indicating to cease to display the content through the display.
The electronic device according to various embodiments may be one of various types of electronic devices. The electronic devices may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, a home appliance, or the like. According to an embodiment of the disclosure, the electronic devices are not limited to those described above.
It should be appreciated that various embodiments of the present disclosure and the terms used therein are not intended to limit the technological features set forth herein to particular embodiments and include various changes, equivalents, or replacements for a corresponding embodiment. With regard to the description of the drawings, similar reference numerals may be used to refer to similar or related elements. It is to be understood that a singular form of a noun corresponding to an item may include one or more of the things unless the relevant context clearly indicates otherwise. As used herein, each of such phrases as “A or B,” “at least one of A and B,” “at least one of A or B,” “A, B, or C,” “at least one of A, B, and C,” and “at least one of A, B, or C,” may include any one of or all possible combinations of the items enumerated together in a corresponding one of the phrases. As used herein, such terms as “1st” and “2nd,” or “first” and “second” may be used to simply distinguish a corresponding component from another, and do not limit the components in other aspect (e.g., importance or order). It is to be understood that if an element (e.g., a first element) is referred to, with or without the term “operatively” or “communicatively”, as “coupled with,” or “connected with” another element (e.g., a second element), the element may be coupled with the other element directly (e.g., wiredly), wirelessly, or via a third element.
As used in connection with various embodiments of the disclosure, the term “module” may include a unit implemented in hardware, software, and/or firmware, and may interchangeably be used with other terms, for example, “logic,” “logic block,” “part,” or “circuitry”. A module may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions. For example, according to an embodiment, the module may be implemented in a form of an application-specific integrated circuit (ASIC).
Various embodiments as set forth herein may be implemented as software including one or more instructions that are stored in a storage medium (e.g., the memory 220) that is readable by a machine (e.g., the wearable device 101). For example, a processor (e.g., the processor 210) of the machine (e.g., the wearable device 101) may invoke at least one of the one or more instructions stored in the storage medium, and execute it, with or without using one or more other components under the control of the processor. This allows the machine to be operated to perform at least one function according to the at least one instruction invoked. The one or more instructions may include a code generated by a compiler or a code executable by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium, where the term “non-transitory” refers to the storage medium being a tangible device, not including a signal (e.g., an electromagnetic wave), but this term does not differentiate between a case in which data is semi-permanently stored in the storage medium and a case in which the data is temporarily stored in the storage medium.
According to an embodiment, a method according to various embodiments of the disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded) online via an application store (e.g., PlayStore™), or between two user devices (e.g., smart phones) directly. If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
According to various embodiments, each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities, and some of the multiple entities may be separately disposed in different components. According to various embodiments, one or more of the above-described components may be omitted, or one or more other components may be added. Alternatively or additionally, a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, according to various embodiments, the integrated component may still perform one or more functions of each of the plurality of components in the same or similar manner as they are performed by a corresponding one of the plurality of components before the integration. According to various embodiments, operations performed by the module, the program, or another component may be carried out sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
While the disclosure has been illustrated and described with reference to various example embodiments, it will be understood that the various example embodiments are intended to be illustrative, not limiting. It will be further understood by those skilled in the art that various changes in form and detail may be made without departing from the true spirit and full scope of the disclosure, including the appended claims and their equivalents. It will also be understood that any of the embodiment(s) described herein may be used in conjunction with any other embodiment(s) described herein.
No claim element is to be construed under the provisions of 35 U.S.C. § 112, sixth paragraph, unless the element is expressly recited using the phrase “means for” or “means.”
Number | Date | Country | Kind
---|---|---|---
10-2022-0179815 | Dec 2022 | KR | national
10-2023-0000812 | Jan 2023 | KR | national
This application is a continuation of International Application No. PCT/KR2023/020920, designating the United States, filed on Dec. 18, 2023, in the Korean Intellectual Property Receiving Office and claiming priority to Korean Patent Application No. 10-2022-0179815 filed on Dec. 20, 2022 in the Korean Intellectual Property Receiving Office and to Korean Patent Application No. 10-2023-0000812 filed on Jan. 3, 2023, in the Korean Intellectual Property Receiving Office, the disclosures of each of which are incorporated by reference herein in their entireties.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/KR2023/020920 | Dec 2023 | WO
Child | 18928957 | | US