The disclosure relates to a wearable device, a method, and a non-transitory computer readable storage medium providing a graphic region.
In order to provide an enhanced user experience, an electronic device that provides a service displaying information generated by a computer in association with an external object in the real world is being developed. The electronic device may be a wearable device that may be worn by a user. For example, the electronic device may be augmented reality (AR) glasses. For example, the electronic device may be a virtual reality (VR) device. For example, the electronic device may be a video see-through (VST) device.
The above information is presented as background information only to assist with an understanding of the disclosure. No determination has been made, and no assertion is made, as to whether any of the above might be applicable as prior art with regard to the disclosure.
A wearable device is provided. The wearable device may comprise a display arranged with respect to eyes of a user wearing the wearable device. The wearable device may comprise a camera comprising at least one lens that faces a direction corresponding to a direction in which the eyes face. The wearable device may comprise a processor. The wearable device may comprise memory storing instructions. The instructions may cause, when executed by the processor, the wearable device to identify, based on a schedule, a place associated with the schedule. The instructions may cause, when executed by the processor, the wearable device to identify, based on the wearable device being positioned in the place associated with the schedule, a direction of the camera of the wearable device with respect to a region in the place to which a graphic region for the schedule is set. The instructions may cause, when executed by the processor, the wearable device to display, via the display, at least a portion of the graphic region on at least a portion of the region, based on identifying that the direction of the camera corresponds to a first direction in which the camera faces the region. The instructions may cause, when executed by the processor, the wearable device to display, via the display, information for informing the user to change the direction of the camera, based on identifying that the direction corresponds to a second direction different from the first direction.
A method is provided. The method may be executed by a wearable device comprising a display arranged with respect to eyes of a user wearing the wearable device and a camera including at least one lens that faces a direction corresponding to a direction in which the eyes face. The method may comprise identifying, based on a schedule, a place associated with the schedule. The method may comprise, based on the wearable device being positioned in the place associated with the schedule, identifying a direction of the camera of the wearable device with respect to a region in the place to which a graphic region for the schedule is set. The method may comprise, based on identifying that the direction of the camera corresponds to a first direction in which the camera faces the region, displaying, via the display, at least a portion of the graphic region on at least a portion of the region. The method may comprise, based on identifying that the direction corresponds to a second direction different from the first direction, displaying, via the display, information for informing the user to change the direction of the camera.
A non-transitory computer readable storage medium is provided. The non-transitory computer readable storage medium may store one or more programs. The one or more programs may comprise instructions which, when executed by a processor of a wearable device including a display arranged with respect to eyes of a user wearing the wearable device and a camera including at least one lens that faces a direction corresponding to a direction in which the eyes face, cause the wearable device to identify, based on a schedule, a place associated with the schedule. The one or more programs may comprise instructions which, when executed by the processor, cause the wearable device to, based on the wearable device being positioned in the place associated with the schedule, identify a direction of the camera of the wearable device with respect to a region in the place to which a graphic region for the schedule is set. The one or more programs may comprise instructions which, when executed by the processor, cause the wearable device to, based on identifying that the direction of the camera corresponds to a first direction in which the camera faces the region, display, via the display, at least a portion of the graphic region on at least a portion of the region. The one or more programs may comprise instructions which, when executed by the processor, cause the wearable device to, based on identifying that the direction corresponds to a second direction different from the first direction, display, via the display, information for informing the user to change the direction of the camera.
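Taken together, the device, the method, and the storage medium above describe a single control flow. The following is a minimal sketch of that flow in Python; every method on the `device` object (e.g., `place_for_schedule`, `camera_direction`) is a hypothetical placeholder for what a wearable runtime might expose, not an actual API of the disclosure.

```python
# A minimal sketch of the claimed flow; every method on `device` is a
# hypothetical placeholder, not an actual API of the disclosure.
def handle_schedule(schedule, device):
    # Identify, based on the schedule, the place associated with it.
    place = device.place_for_schedule(schedule)
    if device.current_place() != place:
        return  # the wearable device is not positioned in the place yet

    # Identify the camera direction with respect to the region in the
    # place to which the graphic region for the schedule is set.
    region = device.region_for_schedule(schedule)
    direction = device.camera_direction()

    if device.faces(direction, region):
        # First direction: the camera faces the region.
        device.display_graphic_region(region)
    else:
        # Second direction: inform the user to change the camera direction.
        device.display_direction_guidance(region)
```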
The above and other aspects, features, and advantages of certain embodiments of the disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of various embodiments of the disclosure as defined by the claims and their equivalents. It includes various specific details to assist in that understanding, but these are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the various embodiments described herein can be made without departing from the scope of the disclosure. In addition, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
The terms and words used in the following description and claims are not limited to the bibliographical meanings, but are merely used by the inventor to enable a clear and consistent understanding of the disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the disclosure is provided for illustration purposes only and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a component surface” includes reference to one or more of such surfaces.
Referring to
For example, the electronic device 101 may store a software application to control or manage another device through the electronic device 101. For example, the electronic device 101 may control or manage a device such as the wearable device 102 (or a physical device) by using the software application. For example, the electronic device 101 may control the other device, by executing the software application based on a user input received with respect to a user interface of the software application. For example, the software application may be used to provide a graphic region (or a virtual region) identified based on the user input received while the user interface is displayed, for the other device. For example, the graphic region may be provided in response to an event. For example, the event may be identified through the software application. The event will be illustrated below.
For example, the wearable device 102 may be a device for providing a virtual reality (VR) service, an augmented reality (AR) service, a mixed reality (MR) service, or an extended reality (XR) service. For example, the wearable device 102 may include a display for providing the AR service, the MR service, or the XR service. For example, when the wearable device 102 is AR glasses, the display of the wearable device 102 may include a transparent layer. For example, when the wearable device 102 is a video see-through (or visual see-through) (VST) device, the display of the wearable device 102 may be opaque.
For example, the wearable device 102 may provide the AR service, the MR service, or the XR service by displaying a real environment around the wearable device 102, or an image indicating the real environment, on the display of the wearable device 102 together with a virtual object. For example, when the wearable device 102 is the AR glasses, the wearable device 102 may display the virtual object on the real environment shown through the display of the wearable device 102. For example, when the wearable device 102 is the VST device, the wearable device 102 may display the virtual object on the image obtained through a camera of the wearable device 102. For example, the virtual object may include the graphic region. For example, the graphic region may be displayed on the display of the wearable device 102 based on data received from the electronic device 101 through a connection 112. For example, the graphic region may be displayed on the display of the wearable device 102 based on data received from the external electronic device 104 through a connection 124.
For example, the wearable device 102 may store one or more software applications for providing the graphic region. For example, the one or more software applications may be used to identify the event. For example, the one or more software applications may include a software application for managing a schedule. For example, the one or more software applications may include a software application for managing another device (e.g., the electronic device 101) through the wearable device 102. For example, the one or more software applications may include a software application for providing an alarm or a notification. For example, the one or more software applications may include a software application used to set a condition, to set one or more functions corresponding to the condition, and to execute the one or more functions in response to satisfaction of the condition. For example, the one or more software applications may include a software application used to provide a service using contact information or to manage the contact information. For example, the one or more software applications may include a software application for recognizing an image obtained through a camera. However, it is not limited thereto.
For example, the one or more software applications may be executed based on a user account. For example, the user account used for the one or more software applications may correspond to a user account used for the software application in the electronic device 101. However, it is not limited thereto.
For example, the external electronic device 104 may be one or more servers for processing related to the software application stored in the electronic device 101 and/or the one or more software applications stored in the wearable device 102. For example, the external electronic device 104 may execute the processing related to the software application in the electronic device 101 and/or the one or more software applications in the wearable device 102, based on user accounts respectively corresponding to the user account used in the electronic device 101 and the user account used in the wearable device 102. For example, the external electronic device 104 may transmit a notification or a push message to the electronic device 101 based on the processing. The notification or the push message from the external electronic device 104 may be transmitted to the wearable device 102 through the electronic device 101, using the connection 112. For example, the external electronic device 104 may transmit a notification or a push message to the wearable device 102 through the connection 124, based on the processing. For example, the notification or the push message may be transmitted from the external electronic device 104 to the electronic device 101 and/or the wearable device 102 in response to the external electronic device 104 identifying the event to be illustrated below. However, it is not limited thereto.
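As a hedged illustration of the two delivery paths described above (directly over the connection 124, or relayed over the connection 112 through the electronic device 101), a server-side router might look like the following; the class and link objects are assumptions for illustration, not the disclosed implementation.

```python
# Illustrative server-side routing; the link objects and methods are
# assumptions standing in for the connections 112 and 124.
class NotificationRouter:
    def __init__(self, wearable_link, phone_link):
        self.wearable_link = wearable_link  # stands in for the connection 124
        self.phone_link = phone_link        # stands in for the connection 112

    def push(self, message):
        if self.wearable_link.is_connected():
            # Direct path: external electronic device 104 -> wearable device 102.
            self.wearable_link.send(message)
        else:
            # Relay path: 104 -> electronic device 101 -> wearable device 102.
            self.phone_link.send({"relay_to_wearable": message})
```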
According to embodiments, the electronic device 101 and/or the external electronic device 104 may not be included in the environment 100. For example, operations to be illustrated below may be executed by the wearable device 102 in a standalone state independent of the electronic device 101 and the external electronic device 104, or may be executed based on a communication between the electronic device 101 and the wearable device 102 and/or a communication between the external electronic device 104 and the wearable device 102.
The wearable device 102 may include components for providing a graphic region in response to an event through a display of the wearable device 102. The components may be illustrated in
Referring to
For example, the processor 210 may be used to execute operations (and/or methods) to be illustrated below. For example, the processor 210 may be operably coupled with the display 220, the first camera 230, the second camera 240, the sensor 250, and/or the communication circuit 260. For example, operative coupling of the processor 210 to the display 220, the first camera 230, the second camera 240, the sensor 250, and/or the communication circuit 260 may indicate that the processor 210 is directly connected to each of the display 220, the first camera 230, the second camera 240, the sensor 250, and the communication circuit 260. For example, operative coupling of the processor 210 to the display 220, the first camera 230, the second camera 240, the sensor 250, and/or the communication circuit 260 may indicate that the processor 210 is connected to each of the display 220, the first camera 230, the second camera 240, the sensor 250, and the communication circuit 260, through another component of the wearable device 102. For example, operative coupling of the processor 210 to the display 220, the first camera 230, the second camera 240, the sensor 250, and/or the communication circuit 260 may indicate that each of the display 220, the first camera 230, the second camera 240, the sensor 250, and the communication circuit 260 operates based on instructions executed by the processor 210. For example, operative coupling of the processor 210 to the display 220, the first camera 230, the second camera 240, the sensor 250, and/or the communication circuit 260 may indicate that each of the display 220, the first camera 230, the second camera 240, the sensor 250, and the communication circuit 260 is controlled by the processor 210. However, it is not limited thereto.
For example, the display 220 may be used to provide visual information. For example, the display 220 may be transparent when the wearable device 102 is AR glasses, and may be opaque or translucent when the wearable device 102 is a VST device. For example, the display 220 may be arranged with respect to eyes of a user. For example, the display 220 may be arranged to be positioned in front of the eyes of a user wearing the wearable device 102.
For example, each of the first camera 230 and the second camera 240 may be used to obtain an image. For example, the first camera 230 may include at least one lens that has a field of view (FOV) corresponding to an FOV of eyes of a user wearing the wearable device 102 and faces in a direction corresponding to a direction in which the eyes face. For example, the first camera 230 may be used to obtain an image indicating an environment around the wearable device 102. For example, unlike the first camera 230, the second camera 240 may face the eyes of the user wearing the wearable device 102. For example, the second camera 240 may be used to identify a user input through the eyes. For example, the second camera 240 may be used for tracking the eyes or a gaze of the eyes. According to embodiments, the first camera 230 and/or the second camera 240 may not be included in the wearable device 102.
For example, the sensor 250 may be used to identify a state of the wearable device 102, a state of the user wearing the wearable device 102, and/or a state of the environment around the wearable device 102. For example, the sensor 250 may be used to obtain data indicating a posture of the wearable device 102, data indicating acceleration of the wearable device 102, and/or data indicating orientation of the wearable device 102. For example, the sensor 250 may be used to obtain biometric data of the user wearing the wearable device 102. For example, the sensor 250 may be used to obtain data indicating a pose of the user wearing the wearable device 102. For example, the sensor 250 may be used to obtain data indicating illuminance around the wearable device 102. For example, the sensor 250 may be used to obtain data indicating a temperature around the wearable device 102. However, it is not limited thereto.
For example, the communication circuit 260 may be used for a communication between the wearable device 102 and another device (e.g., the electronic device 101 and/or the external electronic device 104). For example, the communication circuit 260 may be used to establish a connection between the wearable device 102 and the other device. For example, the communication circuit 260 may be used to transmit a signal, information, and/or data to the other device through the connection. For example, the communication circuit 260 may be used to receive a signal, information, and/or data from the other device through the connection. However, it is not limited thereto.
For example, the processor 210 may execute operations for the graphic region illustrated through description of
Referring to
Referring to
For example, at least a portion of the plurality of candidate graphic regions may have a history of being downloaded from an external electronic device or of being used (or displayed) within the wearable device 102. For example, at least another portion of the plurality of candidate graphic regions may have a history of being set through an executable object 403.
For example, the processor 210 may receive a user input for the executable object 403, displayed within the user interface 401 with the plurality of visual objects 402 in the state 400. For example, the executable object 403 may be used to set a new graphic region. For example, the executable object 403 may be used to register another graphic region that is at least partially different from the plurality of candidate graphic regions each indicated by the plurality of visual objects 402. For example, the processor 210 may change the state 400 to a state 410 in response to the user input.
In the state 410, the processor 210 may display a user interface 411, together with an environment 412 around the wearable device 102 that includes a real region, through the display 220. For example, the environment 412 may be a real environment shown through the display 220 when the wearable device 102 is AR glasses. For example, the environment 412 may be an image representing the real environment obtained through the first camera 230 when the wearable device 102 is a VST device.
For example, the user interface 411 may include an object 413 indicating to set the graphic region. For example, the user interface 411 may be displayed with a thumbnail image 414 provided for a user account used for setting or registering the graphic region, or for a user related to the user account. For example, the thumbnail image 414 may be displayed with the user interface 411 in order to indicate that the graphic region is set through the user interface 411 based on the user account. However, it is not limited thereto.
For example, the processor 210 may change the state 410 to a state 420 in response to a user input for the object 413. For example, in the state 420, the processor 210 may display a layer 421 superimposed on a part (e.g., the real region) of the environment 412, through the display 220. For example, the layer 421 may be displayed based on executing spatial recognition (or spatial awareness) on an image obtained through the first camera 230. For example, the layer 421 may be superimposed on the part of the environment 412 in order to indicate a candidate region, within the environment 412, in which the graphic region can be set. For example, the layer 421 may be superimposed on the part of the environment 412 in order to indicate a position at which the graphic region can be set.
For example, unlike illustration of
For example, unlike the illustration of
For example, the processor 210 may change the state 420 to a state 440, based on a user input 424 indicating to select the layer 421. For example, the user input 424 may include an input with respect to an input device (e.g., a controller) related to the wearable device 102, a gaze input identified through the second camera 240, a user gesture identified through the first camera 230, and/or a voice input identified through a microphone of the wearable device 102. However, it is not limited thereto. The state 440 will be illustrated below.
For example, in the state 420, the processor 210 may display a user interface 422 through the display 220, together with the layer 421 superimposed on the part of the environment 412. For example, the user interface 422 may include an object 423 for identifying a position in which the graphic region will be set based on a user input (or manually identifying). For example, the object 423 may be displayed in the user interface 422 to set a user-designated region as the position of the graphic region. For example, the processor 210 may change the state 420 to a state 430, in response to a user input 425 for the object 423. For example, the user input 425 may include an input for an input device (e.g., a controller) related to the wearable device 102, a gaze input identified through the second camera 240, a user gesture identified through the first camera 230, and/or a voice input identified through the microphone of the wearable device 102. However, it is not limited thereto.
In the state 430, the processor 210 may display a user interface 432 through the display 220. For example, the user interface 432 may include text 433 indicating that a position in which the graphic region is to be set may be defined or specified through a user input. For example, in the state 430, the processor 210 may receive a user input for drawing a region 431. For example, the processor 210 may identify the region 431, which is a closed region formed along a movement path of the user input, based on identifying a completion, a termination, or a release of the user input, and may change the state 430 to the state 440 based on the identification of the region 431.
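The closed-region identification in the state 430 might be sketched as follows, assuming the movement path arrives as a list of sampled points and the drawing input ends with a release; the function and variable names are illustrative.

```python
# Illustrative identification of the drawn region 431: points are sampled
# along the movement path while the input is held, and the polygon is
# closed when the input is completed, terminated, or released.
def close_region(path_points, min_points=3):
    """Return a closed polygon (list of (x, y) tuples), or None if the
    path is too short to enclose an area."""
    if len(path_points) < min_points:
        return None
    polygon = list(path_points)
    if polygon[0] != polygon[-1]:
        polygon.append(polygon[0])  # close the loop along the movement path
    return polygon

# Usage: a rectangular drawing input, closed on release.
region_431 = close_region([(0, 0), (4, 0), (4, 3), (0, 3)])
```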
In the state 440, the processor 210 may display a user interface 441 for selecting a color (or texture) of the graphic region through the display 220, based on the user input 424 received within the state 420 or the user input received within the state 430 (or release of the user input received within the state 430). For example, the user interface 441 may include objects 442 each indicating candidate colors of the graphic region. For example, the processor 210 may change the state 440 to a state 450, based at least in part on a user input 444 indicating to select an object of the objects 442. For example, the user input 444 may include an input for an input device related to the wearable device 102, a gaze input identified through the second camera 240, a user gesture identified through the first camera 230, and/or a voice input identified through the microphone of the wearable device 102. However, it is not limited thereto.
On the other hand, the user interface 441 may further include an object 443 for selecting another color (or different texture) distinct from the candidate colors. For example, the object 443 may be used to provide other candidate colors (or other candidate textures) distinct from the candidate colors indicated by the objects 442. For example, the object 443 may be used to display other objects each indicating the other candidate colors (or the other candidate textures) provided from another software application. Although not illustrated in
For example, the user interface 441 may further include an object 445 and an object 446 for setting a software application that provides an execution screen to be displayed together with a graphic region 451 (to be exemplified below) set through at least a part of the objects 442 and/or the object 443 of the user interface 441. For example, the object 445 may be displayed to indicate at least one software application selected through the object 446. For example, an execution screen of the at least one software application indicated by the object 445 may be displayed together with the graphic region 451. However, it is not limited thereto.
For example, the user interface 441 may further include an object 447 indicating a completion of setting through the user interface 441. However, it is not limited thereto.
In the state 450, the processor 210 may display the graphic region 451 having a color (or texture) identified in the state 440 through the display 220, based at least in part on the user input 444 received in the state 440. For example, the graphic region 451 may be displayed by applying the color (or the texture) identified based on the user input 444 received in the state 440 to the region (i.e., layer 421) or the region 431. For example, the graphic region 451 may be set with respect to a real region 452. For example, the graphic region 451 may be displayed on the real region 452. For example, the graphic region 451 may replace the real region 452. However, it is not limited thereto.
For example, the graphic region 451 may be set with respect to the event, as well as the real region 452. For example, the event may include identifying a schedule. For example, the event may include identifying a change in a state of another device related to the wearable device 102. For example, the event may include identifying the wearable device 102 or a context of the wearable device 102 through an external object around the wearable device 102. For example, the event may include identifying a change in an environment around the wearable device 102. For example, the event may include identifying a condition set for the graphic region 451 based on a user input. However, it is not limited thereto.
For example, at least one intermediate state may be defined between the state 440 and the state 450, in order to set the event associated with the graphic region 451. For example, the processor 210 may change the state 440 to a state 460 based on the user input 444. For example, in the state 460, the processor 210 may display a user interface 465 through the display 220. For example, the user interface 465 may be used to set the event as identifying a change in a state of another device associated with the wearable device 102. For example, the user interface 465 may be provided from a software application used to manage at least one device (e.g., the electronic device 101) associated with the wearable device 102. For example, the user interface 465 may include a plurality of objects 466 each indicating one of a plurality of devices that may be controlled through the wearable device 102. For example, based on at least one user input received in association with an object 467 among the plurality of objects 466 in the user interface 465, the processor 210 may set the event as identifying that a state of the electronic device 101 (e.g., a smartphone) indicated by the object 467 is changed to a charging state. For example, after the at least one user input is received, the object 467 may include text 468 indicating the event. For example, the processor 210 may change the state 460 to the state 450 based on a user input indicating a completion of setting the event.
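A minimal sketch of the event binding set in the state 460 (displaying the graphic region 451 when the electronic device 101 enters a charging state) could look like the following; the event table and method names are assumptions, not the disclosed implementation.

```python
# Illustrative event binding: show the graphic region 451 when the
# electronic device 101 is changed to a charging state. The event table
# and method names are assumptions.
EVENTS = {}

def set_event(graphic_region_id, device_id, target_state):
    """Register an event for a graphic region (e.g., 'charging')."""
    EVENTS.setdefault((device_id, target_state), []).append(graphic_region_id)

def on_device_state_changed(device_id, new_state, display):
    """Called when a managed device reports a state change."""
    for region_id in EVENTS.get((device_id, new_state), []):
        display.show_graphic_region(region_id)  # hypothetical display call

set_event("graphic_region_451", "electronic_device_101", "charging")
```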
For example, in the state 450, the processor 210 may display a user interface 453 through the display 220, together with the graphic region 451. For example, the user interface 453 may be displayed to indicate the event associated with the graphic region 451. For example, the user interface 453 may be displayed to indicate the event, which is a condition for displaying the graphic region 451. For example, the user interface 453 may be used to set the event. For example, the user interface 453 may be used to call or display another user interface (e.g., the user interface 465 in the state 460) for setting the event.
For example, the user interface 453 may indicate the event set with respect to the graphic region 451. For example, the user interface 453 may include text 454 indicating a schedule (e.g., a tea-time) associated with the graphic region 451. For example, the text 454 may indicate displaying the graphic region 451 through the display 220 when identifying the schedule. For example, the user interface 453 may further include an object 455 for editing the schedule. For example, the user interface 453 may further include an object 456 for adding an event associated with the graphic region 451. For example, the graphic region 451 may be displayed through the display 220, in response to the event indicated by the text 454 and/or an event set based on a user input for the object 456. However, it is not limited thereto.
Although
Referring to
For another example, the user interface 510 may be displayed through the display of the electronic device 101, based on a user account corresponding to a user account used to set the graphic region. For example, the user interface 510 may be displayed based on an execution of a software application within the electronic device 101 for setting a condition, setting one or more functions corresponding to the condition, and executing the one or more functions in response to a satisfaction of the condition. For example, the electronic device 101 may identify at least one condition that should be satisfied to display the graphic region, based on a user input received through the user interface 510. For example, based on the user input, the electronic device 101 may obtain information indicating a condition 511 related to a time (e.g., after 8 a.m. every day), a condition 512 related to another device (e.g., when a door of another device (e.g., a refrigerator) distinct from the electronic device 101 and the wearable device 102 is opened), a condition 513 related to a position of a user (e.g., when the user is at home), a condition 514 related to a weather (e.g., when it rains), and a condition 515 related to a security (e.g., when a monitoring device for security identifies a user in the house). For example, the electronic device 101 may transmit the information to the external electronic device 104 based on a user account. For example, the information may be linked or associated with information on the graphic region within the external electronic device 104. For example, in response to a satisfaction of the condition 511 to the condition 515, the external electronic device 104 may transmit, to the wearable device 102, a signal for informing the wearable device 102 to display the graphic region. For example, the processor 210 of the wearable device 102 may display the graphic region through the display 220 in response to the signal. Displaying the graphic region will be exemplified below.
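As an illustration only, the external electronic device 104 might evaluate the condition 511 to the condition 515 along the following lines before signaling the wearable device 102; the context keys and the link object are hypothetical.

```python
# Illustrative server-side evaluation of the condition 511 to the
# condition 515; the context keys and the link object are hypothetical.
from datetime import time

def conditions_satisfied(ctx):
    return (ctx["now"] >= time(8, 0)              # 511: after 8 a.m. every day
            and ctx["fridge_door_open"]           # 512: refrigerator door is opened
            and ctx["user_location"] == "home"    # 513: the user is at home
            and ctx["weather"] == "rain"          # 514: it rains
            and ctx["user_in_house_verified"])    # 515: security monitor sees the user

def maybe_signal_wearable(ctx, wearable_link):
    # ctx["now"] is assumed to be a datetime.time value.
    if conditions_satisfied(ctx):
        wearable_link.send({"action": "display_graphic_region"})
```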
For still another example, the user interface 520 may be displayed through the display of the electronic device 101 based on a user account corresponding to a user account used to set the graphic region. For example, the user interface 520 may be displayed based on an execution of a software application within the electronic device 101 for providing or managing an alarm. For example, the electronic device 101 may obtain information indicating a name 521 (e.g., a tea-time) of an alarm related to the graphic region, a timing 522 (e.g., 10 minutes later) when the alarm is provided, and a position 523 (e.g., a position set for the tea-time) in which the graphic region is displayed with the alarm, based on the user input received through the user interface 520. For example, the electronic device 101 may transmit the information to the external electronic device 104 based on a user account. For example, the external electronic device 104 may transmit, to the wearable device 102, a signal for informing the wearable device 102 to display the graphic region in association with the alarm, in response to identifying the timing 522 of the alarm. For example, the processor 210 of the wearable device 102 may display the graphic region through the display 220, with respect to the position 523, in response to the signal. For example, the graphic region may be displayed through the display 220 together with the name 521 of the alarm. Displaying the graphic region will be exemplified below.
For still another example, a user interface 530 may be displayed through the display of the electronic device 101, based on a user account corresponding to a user account used to set the graphic region. For example, the user interface 530 may be displayed based on an execution of a software application in the electronic device 101 for managing a contact and/or managing a call. For example, based on a user input received through the user interface 530, the electronic device 101 may obtain information indicating a name 531 of a user (e.g., Cheol Soo), a phone number 532 related to the graphic region (e.g., 010-XXXX-YYYYY), and a place 533 where the graphic region will be provided. For example, in response to identifying an incoming call from the phone number 532 indicated by the information or an outgoing call to the phone number 532 indicated by the information, the electronic device 101 may transmit, to the wearable device 102, a signal indicating to display the graphic region through the display 220 within the place 533. For example, the processor 210 of the wearable device 102 may display the graphic region through the display 220, with respect to the place 533, in response to the signal. Displaying the graphic region will be exemplified below.
As described above, the event related to the graphic region may be set through another device (e.g., the electronic device 101) related to the wearable device 102 as well as the wearable device 102.
For example, the wearable device 102 may be used to set at least one other function to be provided with displaying the graphic region. For example, the at least one other function may be set through a user interface displayed via the display 220 of the wearable device 102, together with the graphic region. The user interface may be exemplified through
Referring to
For example, the user interface 600 may include a region 602 for setting a content to be displayed with the graphic region or an execution screen to be displayed together with the graphic region. For example, the region 602 may include an object 603 indicating to display a content (e.g., content A) in the graphic region or together with the graphic region and/or an object 604 indicating to display an execution screen of a software application (e.g., software application B) in the graphic region or together with the graphic region. For example, the region 602 may further include an object 605 to add the execution screen of the software application or the content to be displayed in the graphic region or together with the graphic region. For example, the processor 210 may identify the content or the execution screen to be displayed with the graphic region, based at least in part on a user input with respect to the object 605.
For example, the user interface 600 may include a region 606 for setting or adding the graphic region. For example, the region 606 may include an object 607 indicating a first graphic region, which is the graphic region, and/or an object 608 indicating a second graphic region. For example, the object 607 may include a visual element 609 indicating a color (or texture) of the first graphic region and/or a visual element 610 indicating a real region in which the first graphic region will be displayed. For example, the object 608 may include a visual element 611 indicating a color (or texture) of the second graphic region and/or a visual element 612 indicating a real region in which the second graphic region will be displayed. For example, the region 606 may include an object 613 for adding a new graphic region (e.g., a third graphic region). For example, the processor 210 may change the state 601 to the state 410 in response to a user input for the region 606. However, it is not limited thereto.
For example, the user interface 600 may include a region 614 for setting a state of another device to be provided together with displaying the graphic region. For example, the region 614 may include an object 615 indicating setting of the wearable device 102 provided while at least one graphic region set through the region 606 is displayed. For example, the object 615 may include text 615-1 indicating the setting. For example, the region 614 may include an object 616 indicating setting of a first external electronic device (e.g., air conditioner) provided while the at least one graphic region is displayed. For example, the object 616 may include text 616-1 indicating the setting of the first external electronic device. For example, the region 614 may include an object 617 indicating a second external electronic device (e.g., kitchen light) available in association with the at least one graphic region while the at least one graphic region is displayed. For example, since setting of the second external electronic device provided while the at least one graphic region is displayed is not defined, the object 617 may not include text, unlike the object 615 and the object 616. For example, the region 614 may include an object 618 for adding a third external electronic device (e.g., new external electronic device) available in association with the at least one graphic region while the at least one graphic region is displayed. For example, the processor 210 may change the state 601 to the state 602, in response to a user input 619 for the object 617.
For example, in the state 602, the processor 210 may display the user interface 620 for setting the second external electronic device (e.g., kitchen light) indicated by the object 617 through the display 220. For example, the user interface 620 may include objects for setting the second external electronic device to be provided while the at least one graphic region is displayed.
For example, the user interface 620 may include an object 621 for turning on the second external electronic device while the at least one graphic region is displayed. For example, the user interface 620 may include an object 622 for turning off the second external electronic device while the at least one graphic region is displayed. For example, when the object 621 is selected according to a user input among the object 621 and the object 622, the object 621 may be visually emphasized with respect to the object 622, as illustrated in
For example, in the state 603, the processor 210 may display the user interface 600 through the display 220. For example, since the user interface 600 in the state 603 is displayed through the state 602, the object 617 within the user interface 600 in the state 603 may include text 617-1, unlike the object 617 within the user interface 600 in the state 601. For example, the text 617-1 may indicate setting of the second external electronic device identified according to a user input received in the state 602. For example, the processor 210 may change the state 603 to the state 604, in response to a user input 629 for the object 618.
For example, in the state 604, the processor 210 may display a user interface 630 through the display 220. For example, the user interface 630 may include an object 631, an object 632, and/or an object 633 each representing one or more external electronic devices available while the at least one graphic region is displayed.
For example, the one or more external electronic devices may be identified based on a position of the at least one graphic region. For example, the one or more external electronic devices may be positioned within a region in which the at least one graphic region is displayed. However, it is not limited thereto.
For example, the one or more external electronic devices may be identified based on a type (or attribute) of a service provided through the at least one graphic region. However, it is not limited thereto.
For example, the one or more external electronic devices may be identified based on a user account corresponding to a user account used to set the at least one graphic region. However, it is not limited thereto.
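A sketch combining the three identification criteria above (position, service type, and user account) is given below; the device attributes are assumed names for illustration.

```python
# Illustrative combination of the three criteria above; the device
# attributes are assumed names.
def available_devices(devices, graphic_region, service_type, user_account):
    return [d for d in devices
            if d.position_in(graphic_region.displayed_area)  # within the region
            and service_type in d.supported_services         # matching service type
            and d.account_matches(user_account)]             # corresponding account
```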
Although not illustrated in
As described above, since the wearable device 102 may provide setting of a function to be provided while the graphic region is displayed, as well as setting of the graphic region and setting of an event for displaying the graphic region, the wearable device 102 may enhance the quality of a service provided in a real environment.
Referring back to
For example, the at least a portion of the graphic region may be displayed in response to the event identifying a schedule. Displaying the at least a portion of the graphic region in response to identifying the schedule may be exemplified through
Referring to
Referring to
In operation 803, the processor 210 may receive a signal from the external electronic device 104 through the communication circuit 260. For example, the signal may be transmitted from the external electronic device 104, in response to identifying the schedule in the external electronic device 104. For example, even when the software application within the wearable device 102 is in an inactive state, the external electronic device 104 may identify the schedule and transmit the signal in response to identifying the schedule so that the wearable device 102 may identify the schedule.
In operation 805, the processor 210 may change the state of the software application from the inactive state to the active state in response to the received signal. For example, the processor 210 may identify the schedule based on the software application changed to the active state.
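Operations 801 to 805 might be sketched as follows, assuming hypothetical hooks for registering the schedule with the external electronic device 104 and for waking the inactive software application when the signal arrives.

```python
# Illustrative sketch of operations 801 to 805; the hooks are assumed.
def on_schedule_registered(schedule, server_link):
    # Operation 801: transmit information on the registered schedule.
    server_link.send({"register_schedule": schedule})

def on_push_signal(signal, app_manager):
    # Operations 803 and 805: a signal from the external electronic
    # device 104 wakes the inactive software application.
    app = app_manager.get(signal["app_id"])
    if app.state == "inactive":
        app_manager.activate(app)
    app.identify_schedule(signal["schedule"])
```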
Referring back to
Referring to
In operation 903, the processor 210 may receive a signal from the external electronic device 104 through the communication circuit 260. For example, the signal may be transmitted from the external electronic device 104, in response to identifying the schedule in the external electronic device 104. For example, even when the software application in the wearable device 102 is in an inactive state, the external electronic device 104 may identify the schedule and transmit the signal in response to identifying the schedule so that the wearable device 102 may identify the schedule. For example, the signal may be transmitted to change a state of one or more other software applications distinct from the software application, unlike a signal transmitted in operation 803 of
In operation 905, the processor 210 may change a state of the one or more other software applications from the inactive state to the active state, in response to the received signal. For example, the one or more other software applications may include a software application for an execution screen (or content) to be provided with displaying the graphic region. For example, the one or more other software applications may include a software application sharing the schedule with the software application. For example, the one or more other software applications may include a software application used to display the graphic region. However, it is not limited thereto. For example, the processor 210 may identify the schedule, based on the one or more other software applications changed to the active state.
Although
Referring to
In operation 1003, the processor 210 may provide the data to the one or more other software applications distinct from the software application. For example, the data may be provided to maintain a state of at least a portion of the one or more other software applications in the active state. For example, a software application from among the one or more other software applications and the software application (e.g., a software application used to register the schedule) may be maintained in the active state based on the data. For example, the software application maintained in the active state may periodically access the information in the external electronic device 104 through the communication circuit 260 to identify the schedule. However, it is not limited thereto.
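The polling variant of operations 1001 to 1003 could be sketched as below, where a software application kept in the active state periodically accesses the external electronic device 104; the interval and the endpoint name are assumptions.

```python
# Illustrative polling loop; the interval and the endpoint name are
# assumptions, and `server_link.fetch` stands in for accessing the
# information in the external electronic device 104.
import threading

def poll_schedule(server_link, on_schedule, interval_s=60.0):
    def tick():
        schedule = server_link.fetch("schedule_info")  # periodic access
        if schedule is not None:
            on_schedule(schedule)
        threading.Timer(interval_s, tick).start()      # schedule the next poll
    tick()
```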
As described above, the wearable device 102 may reduce a load of the external electronic device 104 by identifying the schedule based on accessing the external electronic device 104 through the data received from the external electronic device 104.
Referring back to
In operation 705, the processor 210 may identify whether a camera (e.g., the first camera 230) of the wearable device 102 positioned in the place faces a region (e.g., a real region) in the place where a graphic region for the schedule is set, based at least in part on identifying the place. For example, since the graphic region should be displayed with respect to the schedule, the processor 210 may identify whether the camera faces the region to provide a service related to the schedule. For example, the processor 210 may execute operation 707 based on identifying that a direction of the camera corresponds to a first direction in which the camera faces the region, and execute operation 709 based on identifying that the direction corresponds to a second direction different from the first direction. For example, operation 705 may be executed based on an image obtained through the camera. For example, operation 705 may be executed through the sensor 250 of the wearable device 102. However, it is not limited thereto.
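One plausible, illustrative realization of operation 705 is a geometric test between the camera's forward vector (e.g., derived from orientation data of the sensor 250) and the vector toward the region on a spatial map; the angular threshold below is an assumed value, not one given by the disclosure.

```python
# Illustrative geometric test for operation 705; camera_forward is assumed
# to be a unit vector, and the angular threshold is an assumed value.
import math

def facing_region(camera_pos, camera_forward, region_center, threshold_deg=25.0):
    # Unit vector from the camera toward the center of the region.
    to_region = tuple(r - c for r, c in zip(region_center, camera_pos))
    norm = math.sqrt(sum(x * x for x in to_region)) or 1.0
    to_region = tuple(x / norm for x in to_region)
    # Angle between the camera's forward vector and the region direction.
    cos_angle = sum(f * t for f, t in zip(camera_forward, to_region))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    # True -> first direction (operation 707); False -> second (operation 709).
    return angle <= threshold_deg
```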
In operation 707, the processor 210 may display at least a portion of the graphic region on at least a portion of the region through the display 220, in response to the camera facing the region. Displaying the at least a portion of the graphic region may be exemplified through
Referring to
For example, the environment 1101 may include a region in which a graphic region for the schedule is set. For example, the environment 1101 may include a region 1102 in which a first graphic region is set, a region 1103 in which a second graphic region is set, a region 1104 in which a third graphic region is set, and a region 1105 in which a fourth graphic region is set. For example, the region 1102, the region 1103, the region 1104, and the region 1105 may be identified through operations exemplified through the state 420 and/or the state 430 of
For example, as in a state 1130, on a condition that the direction corresponds to the first direction facing the region 1102, the region 1103, the region 1104, and the region 1105, the processor 210 may display at least a portion of the first graphic region 1131 on the region 1102, display at least a portion of the second graphic region 1132 on the region 1103, display at least a portion of the third graphic region 1133 on the region 1104, and display at least a portion of the fourth graphic region 1134 on the region 1105. For example, the first graphic region 1131 may be displayed to cover the region 1102, in order to prevent the user's focus on the schedule from being dispersed due to external objects positioned within the region 1102. For example, the first graphic region 1131 may include a content related to the schedule. For example, the first graphic region 1131 may be used as a virtual display. However, it is not limited thereto. For example, each of the second graphic region 1132, the third graphic region 1133, and the fourth graphic region 1134 may be displayed to cover the region 1103, the region 1104, and the region 1105, respectively, in order to prevent the user's focus on the schedule from being dispersed due to external objects positioned within each of the region 1103, the region 1104, and the region 1105. For example, since the environment 1101 in the state 1130 includes at least a portion of each of the first graphic region 1131 to the fourth graphic region 1134, the environment 1101 in the state 1130 may provide a more enhanced environment than the environment 1101 in the state 1100. For example, the environment 1101 in the state 1130 may be more suitable for the schedule than the environment 1101 in the state 1100.
Referring back to
Referring to
For example, the information 1161 may indicate the first direction through an arrow. For example, a direction in which the arrow faces may be identified through a spatial map obtained in the wearable device 102. However, it is not limited thereto. For example, the information 1161 may include text 1163 indicating that the first direction indicated by the arrow is associated with the schedule (e.g., task). For example, the information 1161 may be visually emphasized with respect to the environment 1162. For example, the information 1161 may have a color distinct from a color of the environment 1162. For example, the information 1161 may blink, unlike the environment 1162. However, it is not limited thereto.
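Building the information 1161 might be sketched as follows, with the spatial-map API and the rendering attributes (the arrow, the text 1163, and the blinking emphasis) expressed through assumed names.

```python
# Illustrative construction of the information 1161; the spatial-map API
# and the rendering attributes are assumed names.
def build_guidance(spatial_map, wearable_pose, region_id, schedule_name):
    region_pos = spatial_map.position_of(region_id)
    arrow = spatial_map.direction(from_pose=wearable_pose, to=region_pos)
    return {
        "arrow": arrow,                           # points along the first direction
        "text": f"{schedule_name} is this way",   # e.g., the text 1163 for a task
        "blink": True,                            # visually emphasized vs. environment
        "contrast_color": True,                   # color distinct from the environment
    }
```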
Although not illustrated in
Although
Referring to
For example, the information 1210 may indicate a direction to the place through an arrow. For example, a direction in which the arrow faces may be identified through a spatial map obtained in the wearable device 102. However, it is not limited thereto. For example, the information 1210 may include text 1225 indicating that the direction indicated by the arrow is associated with the schedule. For example, the information 1210 may have a color distinct from a color of the environment 1220. For example, the information 1210 may blink, unlike the environment 1220. However, it is not limited thereto.
On the other hand, the processor 210 may provide another state before providing the state 1130 in response to identifying that the direction corresponds to the first direction. The other state may be exemplified through
Referring to
In the state 1300, the processor 210 may display a visual effect 1305 that causes the first graphic region 1131 to the fourth graphic region 1134, set with respect to the region 1102 to the region 1105, to appear gradually. For example, since suddenly displaying the first graphic region 1131 to the fourth graphic region 1134 may cause a sense of incongruity for the user, the processor 210 may display the visual effect 1305. For example, the processor 210 may display the visual effect 1305 by extending at least a portion of the graphic region (e.g., the first graphic region 1131 to the fourth graphic region 1134) from a portion (e.g., at least a portion of the region 1102 to the region 1105) of the region in the place spaced apart from the wearable device 102 to another portion of the region. The processor 210 may change the state 1300 to the state 1130 after displaying the visual effect 1305.
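The visual effect 1305 could be sketched as a short animation that extends the graphic region from the far portion of the region toward the nearer portion; the display calls and frame pacing below are illustrative assumptions.

```python
# Illustrative animation for the visual effect 1305; the display calls and
# the per-frame pacing are assumptions.
def gradual_appearance(region, display, steps=30):
    far, near = region.farthest_edge, region.nearest_edge  # hypothetical attributes
    for i in range(1, steps + 1):
        coverage = i / steps
        # Extend the graphic region from the far portion toward the near one.
        display.draw_partial(region, start=far, end=near, fraction=coverage)
        display.wait_frame()  # e.g., one display refresh interval
```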
The processor 210 may identify whether a gaze positioned on the execution screen 1301 is maintained while changing the state 1300 to the state 1130. For example, the processor 210 may maintain a display of the execution screen 1301 independently of the schedule, based on identifying that the gaze is maintained on the execution screen 1301. For example, the processor 210 may cease the display of the execution screen 1301 for the schedule, based on identifying that the gaze is not maintained on the execution screen 1301.
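The gaze check on the execution screen 1301 might be sketched as a dwell-ratio test over eye-tracking samples from the second camera 240; the ratio and the display methods are assumed for illustration.

```python
# Illustrative dwell-ratio test for the gaze on the execution screen 1301;
# the ratio and the display methods are assumed.
def update_execution_screen(gaze_samples, screen_bounds, display, dwell_ratio=0.8):
    on_screen = sum(1 for g in gaze_samples if screen_bounds.contains(g))
    if on_screen / max(1, len(gaze_samples)) >= dwell_ratio:
        display.keep("execution_screen_1301")   # gaze maintained on the screen
    else:
        display.cease("execution_screen_1301")  # gaze moved away; cease display
```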
Although the state 1300 of
Referring to
For example, the message 1400 may be simplified. For example, the message 1400 may be replaced with a message 1450. For example, the message 1450 may include text 1451 indicating the schedule. For example, the message 1450 may include an executable object 1452 for ceasing to display the at least a portion of the graphic region. Although not illustrated in
Although
Referring to
Unlike the above examples, a plurality of graphic regions may be set with respect to a real region. For example, a first graphic region among the plurality of graphic regions may be set for a first schedule, and a second graphic region among the plurality of graphic regions may be set for a second schedule. Displaying different graphic regions with respect to a real region according to different schedules may be exemplified through
Referring to
For example, the processor 210 may provide a state 1600 based on identifying the first schedule among the first schedule and the second schedule. For example, in the state 1600, the processor 210 may display an object 1601 indicating the first schedule and an object 1602 indicating the second schedule. For example, in the state 1600, the object 1601 may be visually emphasized with respect to the object 1602, in order to indicate that the first schedule among the first schedule and the second schedule is identified. For example, in the state 1600, the processor 210 may display the graphic region 1620 on the region 1615. For example, the graphic region 1620 may be suitably set for the first schedule (e.g., tea-time). For example, the graphic region 1620 may have a first color.
For example, the processor 210 may provide a state 1650 based on identifying the second schedule among the first schedule and the second schedule. For example, in the state 1650, the processor 210 may display the object 1601 and the object 1602. For example, in the state 1650, the object 1602 may be visually emphasized with respect to the object 1601, in order to indicate that the second schedule among the first schedule and the second schedule is identified. For example, in the state 1650, the processor 210 may display a graphic region 1670 different from the graphic region 1620 on the region 1615. For example, unlike the graphic region 1620, the graphic region 1670 may be suitably set for the second schedule. For example, the graphic region 1670 may have a second color distinct from the first color. For example, the graphic region 1670 may further include a visual object 1672 and/or a visual object 1674, unlike the graphic region 1620. For example, each of the visual object 1672 and the visual object 1674 may be included in the graphic region 1670 for the second schedule.
Unlike the above examples, a graphic region may be floated on a real region. For example, the graphic region may be positioned over the real region and spaced apart from the real region. Displaying the graphic region spaced apart from the real region may be exemplified through
Referring to
For example, the processor 210 may provide a state 1750 different from the state 1700, based at least in part on identifying the schedule. For example, in the state 1750, the processor 210 may display a graphic region 1760 floated on the region 1702 and the region 1703, via the display 220. For example, the graphic region 1760 may be spaced apart from the object 1704 by a third distance shorter than the first distance between the region 1702 and the object 1704 and the second distance between the region 1703 and the object 1704. For example, the graphic region 1760 may be displayed over the region 1702 and the region 1703. For example, the graphic region 1760 may indicate a content set for the schedule. For example, the graphic region 1760 may be used as a virtual display for displaying the content for the schedule.
For example, displaying the graphic region 1760 may be provided with at least one other function set for the schedule.
For example, the graphic region 1760 may be displayed together with a virtual device 1765 for the schedule and/or a virtual device 1770 for the schedule. For example, the virtual device 1765 may be a virtual audio device for adding a visual effect to background music outputted through a speaker of the wearable device 102. For example, the virtual device 1770 may be a virtual electronic book for adding a visual effect to a voice outputted through the speaker of the wearable device 102. However, it is not limited thereto.
For example, the graphic region 1760 may be displayed while a real device operates according to a setting changed according to the schedule. For example, an external electronic device 1710, an external electronic device 1715, and/or an external electronic device 1720, which are real devices in the environment 1701, may operate according to a first setting within the state 1700 and may operate according to a second setting for the schedule within the state 1750. Changing setting of real devices in the environment 1701 will be exemplified through
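Switching the real devices from the first setting to the second setting for the schedule could be sketched as a table lookup, as below; the device identifiers and setting payloads are purely illustrative.

```python
# Illustrative table-driven switch from the first setting to the second
# setting for the schedule; identifiers and payloads are purely examples.
SCHEDULE_SETTINGS = {
    "reading": {
        "external_electronic_device_1710": {"power": "on", "brightness": 40},
        "external_electronic_device_1715": {"power": "off"},
        "external_electronic_device_1720": {"mode": "quiet"},
    },
}

def apply_schedule_settings(schedule_name, device_registry):
    for device_id, setting in SCHEDULE_SETTINGS.get(schedule_name, {}).items():
        device_registry[device_id].apply(setting)  # second setting for the schedule
```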
Unlike the above examples of displaying a graphic region based on identifying a schedule, the graphic region may be displayed based on a state of an environment around the wearable device 102. For example, the processor 210 may display the graphic region through the display 220, in response to identifying that the state of the environment corresponds to a reference state. For example, the graphic region may be displayed to enhance a quality of the environment. For example, since the graphic region is displayed according to a state of the environment, the wearable device 102 may provide a service suitable for a situation by displaying the graphic region. Displaying the graphic region based on a state of the environment may be exemplified below.
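Purely as an illustrative sketch of the reference-state check above, the fragment below compares a measured environment state against a reference state. The EnvironmentState fields and the tolerance are assumptions, since the disclosure does not define how the correspondence is evaluated.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentState:
    """Hypothetical snapshot of the environment around the wearable device."""
    illuminance: float
    noise_level: float

def matches_reference(state: EnvironmentState,
                      reference: EnvironmentState,
                      tolerance: float = 0.1) -> bool:
    # The disclosure only says the state must "correspond to" a reference
    # state; a per-field tolerance comparison is one plausible reading.
    return (abs(state.illuminance - reference.illuminance) <= tolerance and
            abs(state.noise_level - reference.noise_level) <= tolerance)

if matches_reference(EnvironmentState(0.42, 0.30), EnvironmentState(0.40, 0.32)):
    print("display graphic region to enhance the environment")
```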
As described above, the graphic regions 1851 may be displayed based on identifying a state of the environment 1801. For example, the graphic regions 1851 may be displayed for safety of a user in the environment 1801.
Unlike the above examples, the graphic region may also be displayed based on identifying a change in a state of a real object in an environment. For example, the processor 210 may identify a change in the state of the real object through the first camera 230 and display the graphic region through the display 220 in response to the identification. For example, since the graphic region is displayed according to a change in the state of the real object, the wearable device 102 may provide a service suitable for a situation by displaying the graphic region. Displaying the graphic region based on a change in the state of the real object may be exemplified below.
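One plausible, non-limiting way to detect such a change from camera images is to compare the recognized state across frames, as sketched below. The classify routine is a stand-in for whatever image recognition the first camera's frames are fed to; the disclosure does not specify one.

```python
def detect_state_change(classify, previous_frame, current_frame) -> bool:
    """Return True when the recognized state of a real object changes.

    `classify` stands in for an unspecified image-recognition routine
    applied to frames from the first camera; this is an assumption.
    """
    return classify(previous_frame) != classify(current_frame)

# Toy stand-in: each "frame" is already its recognized label.
if detect_state_change(lambda frame: frame, "book_closed", "book_open"):
    print("display graphic region for the new situation (e.g., reading)")
```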
For example, in the state 1950, the processor 210 may display the graphic region 1951 and the graphic region 1952 through the display 220. For example, each of the graphic region 1951 and the graphic region 1952 may be displayed for a situation (e.g., reading) corresponding to the second state 1920. For example, the wearable device 102 may enhance a quality of the environment 1101 by displaying the graphic region 1951 and the graphic region 1952.
Unlike the above examples, the graphic region may be displayed based on identifying a change in a state of an electronic device in an environment. For example, the processor 210 may display the graphic region through the display 220, based on receiving a signal indicating the change in the state of the electronic device from the electronic device through the communication circuit 260. For example, the processor 210 may display the graphic region through the display 220, by identifying the change in the state of the electronic device based on recognition of an image obtained through the first camera 230. For example, since the graphic region is displayed according to a change in a state of the electronic device in the environment, the wearable device 102 may provide a service suitable for a situation by displaying the graphic region. Displaying the graphic region based on a change in a state of the electronic device in the environment may be exemplified below.
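As a sketch only, the two trigger paths described above (a signal received over the communication circuit, or recognition of camera images) might be handled as follows. The class and method names are illustrative assumptions.

```python
class GraphicRegionTrigger:
    """Hypothetical handler for an electronic device's state change."""

    def __init__(self, show_graphic_region):
        self.show_graphic_region = show_graphic_region  # display callback

    def on_signal(self, signal: dict) -> None:
        # Path 1: the device reports its state change via the
        # communication circuit 260.
        if signal.get("state_changed"):
            self.show_graphic_region(signal["device_id"])

    def on_recognition(self, device_id: str, old_state: str, new_state: str) -> None:
        # Path 2: the change is inferred from images of the first camera 230.
        if old_state != new_state:
            self.show_graphic_region(device_id)

trigger = GraphicRegionTrigger(lambda device: print(f"graphic region for {device}"))
trigger.on_signal({"device_id": "lamp", "state_changed": True})
trigger.on_recognition("monitor", "off", "on")
```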
For example, the processor 210 may change the state 1100 to the state 1130 in response to the identification. For example, the processor 210 may provide the state 1130 changed from the state 1100 for a situation corresponding to the second state 2020 (e.g., task). For example, the wearable device 102 may enhance a quality of the environment 1101 by providing the state 1130.
As described above, displaying the graphic region may be provided together with at least one other function. For example, the at least one other function provided with the display of the graphic region may include changing a setting of an electronic device set in association with the graphic region. Changing the setting of the electronic device may be exemplified below.
In operation 2103, the processor 210 may transmit a signal for changing a setting of the electronic device to the electronic device through the communication circuit 260, as a setting for the schedule. For example, the processor 210 may transmit the signal to the electronic device to enhance a quality of the graphic region displayed based at least in part on the schedule. For example, the electronic device may receive the signal. For example, the electronic device may operate according to the setting for the schedule while the graphic region is displayed.
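A minimal sketch of such a signal is shown below. The JSON encoding, field names, and device identifier are illustrative assumptions, as the disclosure does not define a wire format for operation 2103.

```python
import json

def make_settings_signal(device_id: str, schedule: str, settings: dict) -> bytes:
    """Encode an illustrative settings-change signal for operation 2103."""
    payload = {"device": device_id, "schedule": schedule, "settings": settings}
    return json.dumps(payload).encode()

# E.g., dim a lamp while the graphic region for a schedule is displayed.
print(make_settings_signal("lamp_2201", "task", {"brightness": "second"}))
```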
Changing the setting of the electronic device to the setting for the schedule may be exemplified below.
For example, in response to a change from the state 1100 to the state 1130, the processor 210 may change a state of the electronic device 2201 from the state 2210 to a state 2220, together with displaying the first graphic region 1131 to the fourth graphic region 1134. For example, the electronic device 2201 in the state 2220 may emit light at a second brightness, different from the first brightness, for the schedule. For example, in response to the change from the state 1100 to the state 1130, the processor 210 may change a state of the electronic device 2251 from the state 2260 to a state 2270, together with displaying the first graphic region 1131 to the fourth graphic region 1134. For example, the electronic device 2251 in the state 2270 may be in a mode that blocks outputting a sound in response to an incoming call.
As described above, the wearable device 102 may enhance a quality of the environment 1101 by changing a setting of an electronic device adjacent to, or identified in association with, the graphic region to a setting corresponding to the schedule, as well as by displaying the graphic region.
For example, the processor 210 may identify a second schedule subsequent to the first schedule while displaying the first graphic region for the first schedule. In response to identifying the second schedule, the processor 210 may provide an environment that is adaptively changed according to a change in a schedule, by changing the first graphic region to a second graphic region. Changing the first graphic region to the second graphic region according to a change in the schedule may be exemplified below.
In operation 2303, while the at least a portion of the graphic region is displayed through the display 220, the processor 210 may identify another schedule, distinct from the schedule, registered with respect to the region in which the graphic region is set, through a user account used for the graphic region.
In operation 2305, the processor 210 may display at least a portion of another graphic region for the other schedule, through the display 220, on at least a portion of the region, in response to identifying the other schedule. Displaying at least a portion of the other graphic region may be exemplified below.
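The resulting swap can be sketched, in a non-limiting way, as a set difference over identifiers of the displayed regions; the identifiers below are illustrative.

```python
def update_displayed_regions(displayed: set[str],
                             old_set: set[str],
                             new_set: set[str]) -> set[str]:
    """Illustrative sketch of operation 2305.

    Regions set only for the old schedule cease to be displayed, and
    regions set for the other schedule are displayed in their place.
    """
    ceased = old_set - new_set      # e.g., the first graphic region 1131
    added = new_set - old_set       # e.g., graphic regions 2401-2404
    return (displayed - ceased) | added

regions = update_displayed_regions(
    displayed={"1131", "1132", "1133", "1134"},
    old_set={"1131", "1132", "1133", "1134"},
    new_set={"2401", "2402", "2403", "2404"},
)
print(sorted(regions))
```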
For example, since the other schedule (e.g., meal) is different from the schedule (e.g., task), the processor 210 may cease to display the first graphic region 1131 in the state 2400, and may change the second graphic region 1132 to the fourth graphic region 1134, set for the schedule, to a second graphic region 2402, a third graphic region 2403, and a fourth graphic region 2404, set for the other schedule, respectively. For example, in the state 2400, the processor 210 may display a first graphic region 2401, which is a new graphic region set for the other schedule, through the display 220. For example, in the state 2400, the processor 210 may display an execution screen 2405 set for the other schedule. For example, the execution screen 2405 may be provided from a software application executed in response to a change from the state 1130 to the state 2400.
As described above, on a condition of identifying another schedule changed from the schedule, the wearable device 102 may enhance a quality of an environment provided through the wearable device 102, by displaying a graphic region for the other schedule, which is at least partially distinct from the graphic region for the schedule.
According to the above examples, displaying the graphic region may enhance a quality of the environment, but a user may not accurately recognize a change in the real environment due to the display of the graphic region. The wearable device 102 may adjust transparency of a graphic region in response to identifying that an external object enters a region where the graphic region is set, in order to enhance recognition of a change in the real environment. Adjusting the transparency of the graphic region according to entrance of the external object may be exemplified below.
In operation 2503, the processor 210 may identify whether an external object enters at least a portion of the region while the at least a portion of the graphic region is displayed. For example, the processor 210 may maintain identification of the external object through operation 2503 while the at least a portion of the graphic region is displayed. For example, the processor 210 may execute operation 2505 in response to identifying that the external object enters the at least a portion of the region.
In operation 2505, the processor 210 may adjust transparency of the at least a portion of the graphic region in response to the external object entering the at least a portion of the region. For example, according to the adjustment of the transparency, the external object may be visually recognized by a user. Adjusting the transparency in response to the external object entering the at least a portion of the region may be exemplified below.
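A rough sketch of operations 2503 and 2505 follows. The 2D bounds check and the alpha values are illustrative assumptions standing in for camera-based detection of the external object.

```python
def transparency_for(region_bounds, object_position,
                     base_alpha=1.0, entered_alpha=0.3):
    """Sketch of operation 2505: lower opacity where an external object enters.

    Bounds and positions are 2D for simplicity; the real device would work
    from detection through the first camera. All constants are illustrative.
    """
    (x0, y0), (x1, y1) = region_bounds
    x, y = object_position
    inside = x0 <= x <= x1 and y0 <= y <= y1
    return entered_alpha if inside else base_alpha

print(transparency_for(((0, 0), (4, 3)), (1, 1)))  # 0.3: object entered
print(transparency_for(((0, 0), (4, 3)), (9, 9)))  # 1.0: no entry
```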
The processor 210 may change the state 1130 to a state 2630 in response to identifying an external object 2600 in the environment 1101. For example, in the state 2630, the processor 210 may adjust transparency of the first graphic region 1131 and the second graphic region 1132 corresponding to a position of the external object 2600. For example, the external object 2600 may be visually recognized through the adjustment of the transparency. For example, when the wearable device 102 is a VST device, the processor 210 may adjust transparency of the environment 1101 in the state 2630. However, it is not limited thereto.
As described above, the wearable device 102 may enhance safety of a user wearing the wearable device 102 through image recognition while displaying the graphic region.
Displaying the graphic region according to the above examples may enhance a quality of the environment, but a movement of the user while the graphic region is displayed may cause a risk to the user. The wearable device 102 may adjust transparency of the graphic region based on a change in a posture of the user, in order to reduce accidents caused by the risk. Adjusting the transparency of the graphic region according to a change in a posture of a user may be exemplified below.
In operation 2703, the processor 210 may identify whether the user's posture is changed to a reference posture from which a position can be changed (i.e., a posture from which the user can move), while the at least a portion of the graphic region is displayed. For example, the processor 210 may identify a change in the user's posture through a change in a posture of the wearable device 102 worn by the user. For example, the processor 210 may execute operation 2703 through the sensor 250. For example, the sensor 250 may include an acceleration sensor and/or a gyro sensor. For example, the acceleration sensor may be used to identify a direction (or orientation) of the wearable device 102. For example, the gyro sensor may be used to identify a direction in which the wearable device 102 is moving. For example, the processor 210 may execute operation 2703 based on recognition of an image obtained through the first camera 230. For example, the processor 210 may execute operation 2703 based on a reflection signal with respect to a signal transmitted from the communication circuit 260 (e.g., a communication circuit for UWB). However, it is not limited thereto.
For example, the processor 210 may maintain identification of whether the posture is changed to the reference posture through operation 2703 while the at least a portion of the graphic region is displayed. For example, the processor 210 may execute operation 2705 in response to identifying that the posture is changed to the reference posture.
In operation 2705, the processor 210 may adjust transparency of the at least a portion of the graphic region in response to the posture being changed to the reference posture. For example, a real region covered by the at least a portion of the graphic region may be visually recognized according to the adjustment of the transparency. Adjusting the transparency of the at least a portion of the graphic region in response to the posture being changed to the reference posture may be exemplified below.
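The sensor-based check of operation 2703 might look roughly like the following. The tilt and angular-rate thresholds, and the decision rule itself, are assumptions, since the disclosure names the sensors but not the criteria.

```python
import math

def is_reference_posture(accel, gyro, tilt_limit_deg=30.0, rate_limit=0.5):
    """Sketch of operation 2703: infer a posture from which the user can move.

    `accel` is a gravity-direction vector from the acceleration sensor and
    `gyro` an angular-rate vector from the gyro sensor; both thresholds are
    illustrative assumptions.
    """
    ax, ay, az = accel
    # Tilt of the headset relative to gravity (device direction/orientation).
    tilt = math.degrees(math.acos(az / math.sqrt(ax*ax + ay*ay + az*az)))
    # Significant rotation suggests the wearable device 102 is moving.
    moving = math.sqrt(sum(w * w for w in gyro)) > rate_limit
    return tilt > tilt_limit_deg or moving

# Operation 2705: lower the graphic region's opacity in the reference posture.
alpha = 0.3 if is_reference_posture((0.0, 0.5, 0.85), (0.7, 0.0, 0.1)) else 1.0
print(f"graphic-region alpha: {alpha}")
```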
The above descriptions illustrate a wearable device 102 that displays a graphic region together with a part of a real environment, but the wearable device 102 may also display the graphic region in a virtual environment. For example, the processor 210 may identify biometric data of a user wearing the wearable device 102 and, according to the biometric data, provide a virtual reality displaying a graphic region in a virtual environment, or a mixed reality or augmented reality displaying a graphic region in a real environment. Providing the virtual reality or the mixed reality according to the biometric data may be exemplified below.
For example, the biometric data may indicate the user's blood pressure, the user's heart rate, the user's breathing state, the user's body temperature, the user's stress index, the user's muscle state, and/or the user's sleep time. For example, the biometric data may indicate a user's concentration level for a schedule provided based at least in part on a display of the graphic region. For example, since the biometric data indicates a state of the user's body, the biometric data may be a usable parameter to identify how much the user can concentrate on the schedule.
In operation 2903, the processor 210 may identify whether the biometric data is within a reference range. For example, the biometric data within the reference range may indicate that the user cannot easily concentrate on the schedule. For example, the biometric data outside the reference range may indicate that the user can concentrate on the schedule. For example, the biometric data within the reference range may indicate a state in which processing of the wearable device 102 for the schedule is required or provided, and the biometric data outside the reference range may indicate a state in which the processing of the wearable device 102 for the schedule is not required, provided, or limited.
For example, the processor 210 may execute operation 2905 based on the biometric data within the reference range, and execute operation 2907 based on the biometric data outside the reference range.
In operation 2905, the processor 210 may display the at least a portion of the graphic region for the schedule in a virtual reality environment, on a condition that the biometric data is within the reference range. Displaying the at least a portion of the graphic region in the virtual reality environment will be exemplified below.
In operation 2907, the processor 210 may display the at least a portion of the graphic region in a mixed reality environment, on a condition that the biometric data is not within the reference range. Displaying the at least a portion of the graphic region in the mixed reality environment may be exemplified below.
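Operations 2903 to 2907 reduce to a branch on whether the biometric data falls within the reference range, sketched below. Representing the biometric data as a single scalar and the particular range are simplifying assumptions.

```python
def choose_environment(biometric_value: float,
                       reference_range=(0.0, 0.5)) -> str:
    """Sketch of operations 2903-2907; the scalar stands in for the blood
    pressure, heart rate, stress index, etc. named above."""
    low, high = reference_range
    if low <= biometric_value <= high:
        # Within the range: the user cannot easily concentrate, so the
        # graphic region is shown in a virtual reality environment (2905).
        return "virtual_reality"
    # Outside the range: a mixed reality environment suffices (2907).
    return "mixed_reality"

print(choose_environment(0.3))  # virtual_reality
print(choose_environment(0.8))  # mixed_reality
```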
For example, the processor 210 may provide a state 3050 based on the biometric data within the reference range. For example, in the state 3050, the processor 210 may display an environment 3051. For example, the processor 210 may display the graphic region 3002. For example, the graphic region 3002 displayed within the state 3050 may correspond to the graphic region 3002 displayed within the state 3000. For example, the processor 210 may display a portion 3052 of the environment 3051 within the state 3050. For example, real objects in the real environment may not be included in the portion 3052 of the environment 3051. For example, virtual objects included in the portion 3052 of the environment 3051 may not exist in the real environment. For example, the virtual objects may be displayed based on the biometric data within the reference range, in order to induce the user to concentrate more on the schedule. For example, in the state 3050, the processor 210 may display the graphic region 3002 within the environment 3051, which is a virtual reality environment.
As described above, the wearable device 102 may enhance a user experience of a user wearing the wearable device 102 by adaptively providing a virtual reality environment or a mixed reality environment.
In operation 3103, the processor 210 may identify whether the identified level is higher than a reference level. For example, the level higher than the reference level may indicate a state in which processing of the wearable device 102 for a concentration of the user is required or provided. For example, the level lower than or equal to the reference level may indicate a state in which processing of the wearable device 102 for the concentration is not required, provided, or limited.
For example, the processor 210 may execute operation 3105 in response to the level higher than the reference level, and may execute operation 3107 in response to the level lower than or equal to the reference level.
In operation 3105, the processor 210 may display the at least a portion of the graphic region for the schedule within a virtual reality environment, on a condition that the level is higher than the reference level. Displaying the at least a portion of the graphic region in the virtual reality environment will be exemplified below.
In operation 3107, the processor 210 may display the at least a portion of the graphic region within a mixed reality environment, on a condition that the level is lower than or equal to the reference level. Displaying the at least a portion of the graphic region in the mixed reality environment may be exemplified below.
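This level-based branch parallels the biometric branch above and can be sketched in a few lines; the numeric scale and the reference value are assumptions.

```python
def environment_for_level(level: int, reference_level: int = 5) -> str:
    # Operation 3105 (virtual reality) when the schedule's level exceeds the
    # reference level; operation 3107 (mixed reality) otherwise.
    return "virtual_reality" if level > reference_level else "mixed_reality"

assert environment_for_level(7) == "virtual_reality"
assert environment_for_level(3) == "mixed_reality"
```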
For example, the processor 210 may provide the state 3250 based on the level lower than or equal to the reference level. For example, in the state 3250, the processor 210 may provide or display an environment 3251, which is a mixed reality environment. For example, the processor 210 may display the graphic region 3202. For example, the graphic region 3202 displayed within the state 3250 may correspond to the graphic region 3202 displayed within the state 3200. For example, the environment 3251 in the state 3250 may include real objects in the real environment, unlike the environment 3201 in the state 3200. For example, the environment 3251 may include real objects 3252. For example, when the wearable device 102 is AR glasses, the real objects 3252 may be shown through the display 220. For example, when the wearable device 102 is a VST device, the real objects 3252 may be visual objects in an image displayed through the display 220.
On the other hand, providing the environment 3201, which is a virtual reality environment, as in the state 3200, or the environment 3251, which is a mixed reality environment, as in the state 3250, may differ from an intention of the user. For example, the processor 210 may display a user interface 3290 in order to reduce providing the environment 3201 or the environment 3251 contrary to the user's intention. For example, the user interface 3290 may include an executable object 3291 for providing a mixed reality environment and an executable object 3292 for providing a virtual reality environment. For example, when the mixed reality environment is provided among the mixed reality environment and the virtual reality environment, the executable object 3291 may be visually emphasized with respect to the executable object 3292. For example, when the virtual reality environment is provided among the mixed reality environment and the virtual reality environment, the executable object 3292 may be visually emphasized with respect to the executable object 3291.
For example, the processor 210 may change the state 3200 to the state 3250 in response to receiving a user input for an object 3291 executable within the state 3200. For example, the processor 210 may change the state 3250 to the state 3200 in response to receiving a user input for an object 3292 executable within the state 3250.
As described above, the wearable device 102 may enhance a user experience of a user wearing the wearable device 102 by adaptively providing a virtual reality environment or a mixed reality environment.
The wearable device 102 may change a graphic region to another graphic region according to a progress status of the schedule. For example, the progress status of the schedule may be identified based on various methods. For example, the progress status of the schedule may be changed based on the biometric data. Changing the graphic region to the other graphic region according to the progress status of the schedule identified based on the biometric data may be exemplified below.
For example, the biometric data may be obtained through at least a portion of the methods exemplified in operation 2901 described above.
In operation 3303, the processor 210 may identify whether a progress status of the schedule is changed, based on the biometric data. For example, a change in the progress status of the schedule may indicate refraining from, bypassing, or limiting a display of the graphic region. For example, maintaining the progress status of the schedule may indicate maintaining a display of the graphic region. For example, when the biometric data indicates that the user is tired, or the biometric data indicates a state of a user who has completed a mission provided within the schedule, the processor 210 may identify that the progress status is changed. For another example, when the biometric data indicates that the user is active with ease, or the biometric data indicates a state of a user performing a mission provided within the schedule, the processor 210 may identify that the progress status is maintained.
For example, the processor 210 may execute operation 3305 on a condition that the progress status is changed, and execute operation 3307 on a condition that the progress status is maintained.
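As a non-limiting sketch, operations 3303 to 3307 might be expressed as follows; the biometric field names and the region identifiers are illustrative assumptions.

```python
def next_graphic_region(current_region: str, biometric_data: dict) -> str:
    """Keep or change the graphic region according to the progress status."""
    # A tired user, or one who has completed the schedule's mission, yields
    # a changed progress status (operation 3305); otherwise the current
    # graphic region is maintained (operation 3307).
    changed = biometric_data.get("tired") or biometric_data.get("mission_completed")
    return "graphic_region_3451" if changed else current_region

print(next_graphic_region("graphic_region_3401", {"tired": True}))
print(next_graphic_region("graphic_region_3401", {"tired": False}))
```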
In operation 3305, the processor 210 may change the graphic region to the other graphic region based on identifying that the progress status is changed. Changing the graphic region to the other graphic region may be exemplified below.
For example, the processor 210 may identify that the biometric data obtained while providing the state 3400 displaying the graphic region 3401 and the graphic region 3402 indicates a change in the progress status of the schedule. For example, the processor 210 may change the state 3400 to a state 3450 in response to the biometric data indicating the change in the progress status.
In the state 3450, the processor 210 may display, through the display 220, a graphic region 3451 changed from the graphic region 3401, together with the graphic region 3402 maintained independently of the change from the state 3400 to the state 3450. For example, the graphic region 3451 may indicate that the schedule is completed or at least temporarily stopped, unlike the graphic region 3401 indicating that the schedule is in progress. For example, the processor 210 may display the graphic region 3451 changed from the graphic region 3401, in order to reduce user fatigue accumulated while the state 3400 is provided.
As described above, the wearable device 102 may provide an enhanced user experience by adaptively executing a change in a graphic region or a change from an environment including a graphic region to an environment including a real region.
The wearable device 102 capable of executing the above-described operations may be configured as exemplified below.
In an embodiment, the display 3550 including the first display 3550-1 and the second display 3550-2 may include a liquid crystal display (LCD), a digital mirror device (DMD), a liquid crystal on silicon (LCoS) display, an organic light emitting diode (OLED), or a micro light emitting diode (LED). In an embodiment, when the display 3550 is configured with the LCD, the DMD, or the LCoS, the wearable device 102 may include a light source (not illustrated).
In an embodiment, the wearable device 102 may further include a first transparent member 3570-1 and a second transparent member 3570-2. For example, each of the first transparent member 3570-1 and the second transparent member 3570-2 may be formed of a glass plate, a plastic plate, or a polymer. For example, each of the first transparent member 3570-1 and the second transparent member 3570-2 may be transparent or translucent.
In an embodiment, the wearable device 102 may include a waveguide 3572. For example, the waveguide 3572 may be used to transmit light generated by the display 3550 to the eyes of a user wearing the wearable device 102. For example, the waveguide 3572 may be formed of glass, plastic, or a polymer. For example, the waveguide 3572 may include a nano pattern configured with a polygonal or curved grating structure within the waveguide 3572 or on a surface of the waveguide 3572. For example, light incident on one end of the waveguide 3572 may be provided to a user through the nano pattern. In an embodiment, the waveguide 3572 may include at least one of at least one diffraction element (e.g., a diffractive optical element (DOE) or a holographic optical element (HOE)) or a reflection element (e.g., a reflection mirror). For example, the at least one diffraction element or the reflection element may be used to guide light to the user's eyes. In an embodiment, the at least one diffraction element may include an input optical member and/or an output optical member. In an embodiment, the input optical member may refer to an input grating area used as an input end of light, and the output optical member may refer to an output grating area used as an output end of light. In an embodiment, the reflection element may include a total internal reflection (TIR) optical element or a total internal reflection waveguide for total internal reflection.
In an embodiment, a camera 3530 in the wearable device 102 may include at least one first camera 3530-1, at least one second camera 3530-2, and/or at least one third camera 3530-3.
In an embodiment, the at least one first camera 3530-1 may be used for motion recognition or spatial recognition of three degrees of freedom (3DoF) or six degrees of freedom (6DoF). For example, the at least one first camera 3530-1 may be used for head tracking or hand detection. For example, the at least one first camera 3530-1 may be configured as a global shutter (GS) camera. For example, the at least one first camera 3530-1 may be configured as a stereo camera. For example, the at least one first camera 3530-1 may be used for gesture recognition.
In an embodiment, the at least one second camera 3530-2 may be used to detect and track a pupil. For example, the at least one second camera 3530-2 may be configured as the GS camera. For example, the at least one second camera 3530-2 may be used to identify a user input defined by a user's gaze.
In an embodiment, the at least one third camera 3530-3 may be referred to as a high resolution (HR) or photo video (PV) camera, and may provide an auto focusing (AF) function or an optical image stabilization (OIS) function. In an embodiment, the at least one third camera 3530-3 may be configured as the GS camera or a rolling shutter (RS) camera.
In an embodiment, the wearable device 102 may further include an LED unit 3574. For example, the LED unit 3574 may be used to assist in tracking the pupil through the at least one second camera 3530-2. For example, the LED unit 3574 may be configured as an IR LED. For example, the LED unit 3574 may be used to compensate for brightness when illuminance around the wearable device 102 is low.
In an embodiment, the wearable device 102 may further include a first PCB 3576-1 and a second PCB 3576-2. For example, each of the first PCB 3576-1 and the second PCB 3576-2 may be used to transmit an electrical signal to a component of the wearable device 102, such as the camera 3530 or the display 3550. In an embodiment, the wearable device 102 may further include an interposer disposed between the first PCB 3576-1 and the second PCB 3576-2. However, it is not limited thereto.
According to an embodiment, the wearable device 102 may be worn on a portion of a user's body. The wearable device 102 may provide augmented reality (AR), virtual reality (VR), or mixed reality (MR) combining the AR and the VR to a user wearing the wearable device 102. For example, the wearable device 102 may output a virtual reality image to a user through the at least one display 3650, in response to a user's designated gesture obtained through the third camera 3530-3 described above.
According to an embodiment, the at least one display 3650 in the wearable device 102 may provide visual information to a user. The at least one display 3650 may include the display 220 described above.
According to an embodiment, the frame may have a physical structure in which the wearable device 102 can be worn on the user's body. According to an embodiment, the frame may be configured so that the first display 3650-1 and the second display 3650-2 may be positioned at positions corresponding to the user's left and right eyes when the user wears the wearable device 102. The frame may support the at least one display 3650. For example, the frame may support the first display 3650-1 and the second display 3650-2 to be positioned at positions corresponding to the user's left and right eyes.
According to an embodiment, the frame may include a first rim 3601 surrounding at least a portion of the first display 3650-1, a second rim 3602 surrounding at least a portion of the second display 3650-2, a bridge 3603 disposed between the first rim 3601 and the second rim 3602, a first pad 3611 disposed along a portion of an edge of the first rim 3601 from one end of the bridge 3603, a second pad 3612 disposed along a portion of an edge of the second rim 3602 from another end of the bridge 3603, a first temple 3604 extending from the first rim 3601 and fixed to a part of the wearer's ear, and a second temple 3605 extending from the second rim 3602 and fixed to a part of the ear opposite to that ear. The first pad 3611 and the second pad 3612 may contact a part of the user's nose, and the first temple 3604 and the second temple 3605 may contact a part of the user's face and a part of the ear. The temples 3604 and 3605 may be rotatably connected to the rims via a hinge unit. According to an embodiment, the wearable device 102 may identify an external object (e.g., the user's fingertip) that touches the frame and/or a gesture performed by the external object, by using a touch sensor, a grip sensor, and/or a proximity sensor formed on at least a portion of a surface of the frame.
According to an embodiment, the wearable device 102 may include cameras 3740-3 and 3740-4, adjacent to each of the first display 3750-1 and the second display 3750-2, for photographing and/or tracking the user's two eyes. The cameras 3740-3 and 3740-4 may be referred to as eye tracking (ET) cameras. According to an embodiment, the wearable device 102 may include cameras 3740-1 and 3740-2 for photographing and/or recognizing the user's face. The cameras 3740-1 and 3740-2 may be referred to as face tracking (FT) cameras.
According to an embodiment, the wearable device 102 may include a depth sensor 3730 disposed on the second surface 3720 to identify a distance between the wearable device 102 and an external object. Using the depth sensor 3730, the wearable device 102 may obtain spatial information (e.g., depth map) on at least a portion of an FoV of the user wearing the wearable device 102.
Although not illustrated, a microphone for obtaining sound outputted from an external object may be disposed on the second surface 3720 of the wearable device 102. The number of microphones may be one or more according to embodiments.
As described above, a wearable device 102 may comprise a display 220 arranged with respect to eyes of a user wearing the wearable device 102, a camera including at least one lens that faces a direction corresponding to a direction in which the eyes face, and a processor 210. The processor 210 may be configured to identify, in response to a schedule, a place labeled with respect to the schedule. According to an embodiment, the processor 210 may be configured to identify, based at least in part on the identification, whether the camera of the wearable device 102 positioned in the place faces a region in the place to which a graphic region for the schedule is set. According to an embodiment, the processor 210 may be configured to display, via the display 220, at least portion of the graphic region on at least portion of the region, based on identifying that a direction of the camera corresponds to a first direction in which the camera faces the region. According to an embodiment, the processor 210 may be configured to display, via the display 220, information for informing the first direction, based on identifying that the direction corresponds to a second direction different from the first direction.
According to an embodiment, the processor 210 may be configured to display, via the display 220, information for informing a movement to the place, based on a position of the wearable device 102 outside of the place.
According to an embodiment, the processor 210 may be configured to display, in response to the direction corresponding to the first direction, the at least portion of the graphic region by expanding the at least portion of the graphic region from a portion of the region spaced apart from the wearable device 102 to another portion of the region adjacent to the wearable device 102.
According to an embodiment, the processor 210 may be configured to display, via the display 220, an execution screen of each of one or more software applications set for the schedule, with the at least portion of the graphic region, based on the direction corresponding to the first direction. According to an embodiment, the processor 210 may be configured to cease displaying one or more execution screens of one or more other software applications, distinct from the one or more software applications, based on the direction corresponding to the first direction. According to an embodiment, the wearable device 102 may comprise another camera facing the eyes. According to an embodiment, the processor 210 may be configured to identify a gaze of the user through images obtained by using the other camera. According to an embodiment, the processor 210 may be configured to cease displaying a first execution screen positioned outside of the gaze from among the one or more execution screens, based on the direction corresponding to the first direction. According to an embodiment, a second execution screen in which the gaze is positioned from among the one or more execution screens may be maintained via the display 220, independently from the direction corresponding to the first direction. According to an embodiment, the processor 210 may be configured to cease representing content provided through the second execution screen maintained via the display 220.
According to an embodiment, the processor 210 may be configured to display, via the display 220, a message including an executable object for stopping to display the graphic region, while at least portion of the graphic region appears. According to an embodiment, the processor 210 may be configured to display, via the display 220, a portion of the region and a portion of the graphic region by stopping to display the graphic region, in response to a user input on the executable object.
According to an embodiment, the processor 210 may be configured to display, via the display 220, a message including an executable object for ceasing to display the graphic region on the region, while at least portion of the graphic region appears. According to an embodiment, the processor 210 may be configured to maintain to provide the region, by ceasing to display a portion of the graphic region displayed based on the direction corresponding to the first direction, in response to a user input on the executable object.
According to an embodiment, the processor 210 may be configured to obtain biometric data of the user. According to an embodiment, the processor 210 may be configured to display, in a virtual reality environment, the at least portion of the graphic region, based on the direction corresponding to the first direction and the biometric data within a reference range. According to an embodiment, the processor 210 may be configured to display, in a mixed reality environment, the at least portion of the graphic region, based on the direction corresponding to the first direction and the biometric data outside of the reference range.
According to an embodiment, the processor 210 may be configured to identify a level of the schedule. According to an embodiment, the processor 210 may be configured to display, in a virtual reality environment, the at least portion of the graphic region, based on the direction corresponding to the first direction and the level higher than a reference level. According to an embodiment, the processor 210 may be configured to display, in a mixed reality environment, the at least portion of the graphic region, based on the direction corresponding to the first direction and the level lower than or equal to the reference level.
According to an embodiment, the processor 210 may be configured to obtain data indicating illuminance around the wearable device 102. According to an embodiment, the processor 210 may be configured to display the at least portion of the graphic region at a brightness identified based on the illuminance, in response to the direction corresponding to the first direction.
According to an embodiment, the processor 210 may be configured to identify, while the at least portion of the graphic region is displayed via the display 220, a progress status of the schedule, based on biometric data of the user. According to an embodiment, the processor 210 may be configured to, based on the progress status, maintain to display the at least portion of the graphic region or change the at least portion of the graphic region to at least portion of another graphic region set with respect to the region for the schedule.
According to an embodiment, the processor 210 may be configured to identify another schedule, distinct from the schedule, registered with respect to the region through an account of the user, while the at least portion of the graphic region is displayed via the display 220. According to an embodiment, the processor 210 may be configured to display, via the display 220, at least portion of another graphic region for the other schedule, on at least portion of the region, in response to the other schedule.
According to an embodiment, the wearable device may comprise a communication circuit. According to an embodiment, the processor 210 may be configured to identify an electronic device positioned in the region, via the camera or the communication circuit. According to an embodiment, the processor 210 may be configured to transmit, to the electronic device via the communication circuit, a signal for changing settings of the electronic device to settings for the schedule, based at least in part on the direction corresponding to the first direction.
According to an embodiment, the processor 210 may be configured to identify an electronic device, including a display 220, positioned in the region, via the camera or the communication circuit. According to an embodiment, the processor 210 may be configured to, based on the direction corresponding to the first direction, display, via the display 220, an execution screen of a first software application set for the schedule, with the at least portion of the graphic region, and transmit, to the electronic device via the communication circuit, a signal for displaying an execution screen of a second software application set for the schedule via the display 220 of the electronic device.
According to an embodiment, the processor 210 may be configured to register, through a software application, the schedule in which the place including the region to which the graphic region is set is labeled. According to an embodiment, the processor 210 may be configured to transmit, via the communication circuit to a server, information on the schedule, based on the registration. According to an embodiment, the processor 210 may be configured to, while the software application is in an inactive state, receive, via the communication circuit, a signal transmitted from the server in response to identifying the schedule based on the information. According to an embodiment, the processor 210 may be configured to change, in response to the signal, a state of the software application from the inactive state to an active state. According to an embodiment, the processor 210 may be configured to execute operations for displaying via the display 220 the at least portion of the graphic region, by using the software application changed to the active state.
According to an embodiment, the processor 210 may be configured to register, through a software application, the schedule in which the place including the region to which the graphic region is set is labeled. According to an embodiment, the processor 210 may be configured to transmit, via the communication circuit to a server, information on the schedule, based on the registration. According to an embodiment, the processor 210 may be configured to receive, via the communication circuit, a signal transmitted from the server in response to identifying the schedule based on the information. According to an embodiment, the processor 210 may be configured to change, in response to the signal, states of one or more other software applications indicated by the signal to active states. According to an embodiment, the processor 210 may be configured to execute operations for displaying via the display 220 the at least portion of the graphic region, based at least in part on the one or more other software applications changed to the active states.
According to an embodiment, the processor 210 may be configured to register, through a software application, the schedule in which the place including the region to which the graphic region is set is labeled. According to an embodiment, the processor 210 may be configured to transmit, via the communication circuit to a server, information on the schedule, based on the registration. According to an embodiment, the processor 210 may be configured to provide, through an operating system, data for accessing the information in the server, to one or more other software applications in the wearable device 102, the one or more other software applications being capable of processing the schedule.
According to an embodiment, the processor 210 may be configured to register, through a software application, the schedule in which the place including the region to which the graphic region is set is labeled. According to an embodiment, the processor 210 may be configured to provide, through an operating system to one or more other software applications in the wearable device 102, data indicating a location in which information on the schedule is stored according to the registration, the one or more other software applications capable of processing the schedule.
According to an embodiment, the processor 210 may be configured to identify whether an external object enters at least a portion of the region covered according to a display of the at least a portion of the graphic region, through the camera. According to an embodiment, the processor 210 may be configured to adjust transparency of the at least a portion of the graphic region in response to the external object entering the at least a portion of the region.
According to an embodiment, the wearable device 102 may include at least one sensor. According to an embodiment, the processor 210 may be configured to identify whether the user's posture is changed to a reference posture capable of changing a position, through the at least one sensor, while the at least a portion of the graphic region is displayed. According to an embodiment, the processor 210 may be configured to adjust transparency of the at least a portion of the graphic region in response to the posture changed to the reference posture.
The electronic device according to various embodiments may be one of various types of electronic devices. The electronic devices may include, for example, a portable communication device (e.g., a smartphone), a computer device, a portable multimedia device, a portable medical device, a camera, a wearable device, or a home appliance. According to an embodiment of the disclosure, the electronic devices are not limited to those described above.
It should be appreciated that various embodiments of the disclosure and the terms used therein are not intended to limit the technological features set forth herein to particular embodiments and include various changes, equivalents, or replacements for a corresponding embodiment. As used herein, each of such phrases as “A or B,” “at least one of A and B,” “at least one of A or B,” “A, B, or C,” “at least one of A, B, and C,” and “at least one of A, B, or C,” may include any one of, or all possible combinations of the items enumerated together in a corresponding one of the phrases. As used herein, such terms as “1st” and “2nd,” or “first” and “second” may be used to simply distinguish a corresponding component from another, and do not limit the components in other aspects (e.g., importance or order). It is to be understood that if an element (e.g., a first element) is referred to, with or without the term “operatively” or “communicatively”, as “coupled with,” “coupled to,” “connected with,” or “connected to” another element (e.g., a second element), it means that the element may be coupled with the other element directly (e.g., wiredly), wirelessly, or via a third element.
As used in connection with various embodiments of the disclosure, the term “module” may include a unit implemented in hardware, software, or firmware, and may interchangeably be used with other terms, for example, “logic,” “logic block,” “part,” or “circuitry”. A module may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions. For example, according to an embodiment, the module may be implemented in a form of an application-specific integrated circuit (ASIC).
Various embodiments as set forth herein may be implemented as software (e.g., a program) including one or more instructions that are stored in a storage medium (e.g., internal memory or external memory) that is readable by a machine (e.g., the electronic device 102). For example, a processor (e.g., the processor 210) of the machine (e.g., the electronic device 102) may invoke at least one of the one or more instructions stored in the storage medium, and execute it, with or without using one or more other components under the control of the processor. This allows the machine to be operated to perform at least one function according to the at least one instruction invoked. The one or more instructions may include a code generated by a compiler or a code executable by an interpreter. The machine-readable storage medium may be provided in the form of a non-transitory storage medium. Here, the term “non-transitory” simply means that the storage medium is a tangible device, and does not include a signal (e.g., an electromagnetic wave), but this term does not differentiate between where data is semi-permanently stored in the storage medium and where the data is temporarily stored in the storage medium.
According to an embodiment, a method according to various embodiments of the disclosure may be included and provided in a computer program product. The computer program product may be traded as a product between a seller and a buyer. The computer program product may be distributed in the form of a machine-readable storage medium (e.g., compact disc read only memory (CD-ROM)), or be distributed (e.g., downloaded or uploaded) online via an application store (e.g., PlayStore™), or between two user devices (e.g., smart phones) directly. If distributed online, at least part of the computer program product may be temporarily generated or at least temporarily stored in the machine-readable storage medium, such as memory of the manufacturer's server, a server of the application store, or a relay server.
According to various embodiments, each component (e.g., a module or a program) of the above-described components may include a single entity or multiple entities, and some of the multiple entities may be separately disposed in different components. According to various embodiments, one or more of the above-described components may be omitted, or one or more other components may be added. Alternatively or additionally, a plurality of components (e.g., modules or programs) may be integrated into a single component. In such a case, according to various embodiments, the integrated component may still perform one or more functions of each of the plurality of components in the same or similar manner as they are performed by a corresponding one of the plurality of components before the integration. According to various embodiments, operations performed by the module, the program, or another component may be carried out sequentially, in parallel, repeatedly, or heuristically, or one or more of the operations may be executed in a different order or omitted, or one or more other operations may be added.
| Number | Date | Country | Kind |
|---|---|---|---|
| 10-2022-0176356 | Dec 2022 | KR | national |
| 10-2023-0003143 | Jan 2023 | KR | national |
This application is a continuation application, claiming priority under § 365(c), of an International application No. PCT/KR2023/012651, filed on Aug. 25, 2023, which is based on and claims the benefit of a Korean patent application number 10-2022-0176356, filed on Dec. 15, 2022, in the Korean Intellectual Property Office, and of a Korean patent application number 10-2023-0003143, filed on Jan. 9, 2023, in the Korean Intellectual Property Office, the disclosure of each of which is incorporated by reference herein in its entirety.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/KR2023/012651 | Aug 2023 | WO |
| Child | 18486729 | | US |