The disclosure relates to an electronic device for displaying a media content and a user interface (UI) in a plurality of areas and method thereof.
Recently, an electronic device for visualizing information in various types of environments such as an augmented reality (AR) environment, a virtual reality (VR) environment, a mixed reality (MR) environment, and/or extended reality (XR) environment is being developed. The electronic device may include a television, a monitor, an electronic display board, a beam projector, a mobile phone, and/or a tablet personal computer (PC). The electronic device may form a display area representing the information on a surface of the electronic device or on a surface outside the electronic device.
According to an aspect of the disclosure, there is provided an electronic device. The electronic device may comprise a projector and a processor. The processor may be configured to obtain information related to a projection area on which light emitted from the projector is to be projected. The processor may be configured to identify, based on the information, a first area and a second area in the projection area, where the first area may have a first ratio, and the second area may be smaller than the first area. The processor may be configured to obtain a user interface (UI) associated with a media content displayed in the first area by the projector. The processor may be configured to control the projector to display, in the second area, the UI having a layout configured based on a feature of the second area.
According to another aspect of the disclosure, there is provided a method of an electronic device. The method may include obtaining information related to one or more projection areas of the electronic device. The method may include displaying a media content in a first area identified in the one or more projection areas based on the information. The method may include identifying, based on the media content, a second area different from the first area, in the one or more projection areas. The method may include displaying, in the second area, a user interface (UI) associated with the media content and having a layout based on a feature of the second area.
According to another aspect of the disclosure, there is provided a method of an electronic device. The method may include obtaining information related to a projection area on which light emitted from a projector of the electronic device is to be projected. The method may include identifying, based on the information, a first area and a second area in the projection area. The first area may have a first ratio, and the second area may be smaller than the first area. The method may include obtaining a user interface (UI) associated with a media content displayed in the first area by the projector. The method may include displaying, in the second area, the UI having a layout configured based on a feature of the second area.
According to an embodiment, an electronic device may include communication circuitry, a projection assembly, and a processor. The processor may be configured to obtain information for a plane where light emitted from the projection assembly is to be projected. The processor may be configured to identify, from the plane based on the information, a first area having a preset ratio, and a second area smaller than the first area. The processor may be configured to obtain, in a state in which a media content received through the communication circuitry is displayed in the first area, a user interface (UI) associated with the media content. The processor may be configured to display, in the second area, the UI having a layout based on a width and a height of the second area.
According to an embodiment, a method of an electronic device may comprise obtaining information for a plane where light emitted from a projection assembly of the electronic device is to be projected. The method may comprise identifying, from the plane based on the information, a first area having a preset ratio, and a second area smaller than the first area. The method may comprise obtaining, in a state in which a media content received through communication circuitry of the electronic device is displayed in the first area, a user interface (UI) associated with the media content. The method may comprise displaying, in the second area, the UI having a layout based on a width and a height of the second area.
According to an embodiment, an electronic device may include a projection assembly and a processor. The processor may be configured to obtain information with respect to one or more planes where light emitted from the projection assembly is to be projected. The processor may be configured to display, in a first area identified in the one or more planes based on the information, a media content. The processor may be configured to, based on the media content, identify, in the one or more planes, a second area distinguished from the first area. The processor may be configured to display, in the second area, a user interface (UI) that is associated with the media content and has a layout based on a width and a height of the second area.
According to an embodiment, a method of an electronic device may comprise obtaining information with respect to one or more planes where light emitted from a projection assembly of the electronic device is to be projected. The method may comprise displaying, in a first area identified in the one or more planes based on the information, a media content. The method may comprise, based on the media content, identifying, in the one or more planes, a second area distinguished from the first area. The method may comprise displaying, in the second area, a user interface (UI) that is associated with the media content and has a layout based on a width and a height of the second area.
Hereinafter, various embodiments of the disclosure will be described with reference to the accompanying drawings.
It should be appreciated that various embodiments of the disclosure and the terms used therein are not intended to limit the technological features set forth herein to particular embodiments and include various changes, equivalents, or replacements for a corresponding embodiment. With regard to the description of the drawings, similar reference numerals may be used to refer to similar or related elements. It is to be understood that a singular form of a noun corresponding to an item may include one or more of the things, unless the relevant context clearly indicates otherwise. As used herein, each of such phrases as “A or B,” “at least one of A and B,” “at least one of A or B,” “A, B, or C,” “at least one of A, B, and C,” and “at least one of A, B, or C,” may include any one of, or all possible combinations of the items enumerated together in a corresponding one of the phrases. As used herein, such terms as “1st” and “2nd,” or “first” and “second” may be used to simply distinguish a corresponding component from another, and do not limit the components in other aspects (e.g., importance or order). It is to be understood that if an element (e.g., a first element) is referred to, with or without the term “operatively” or “communicatively”, as “coupled with,” “coupled to,” “connected with,” or “connected to” another element (e.g., a second element), it means that the element may be coupled with the other element directly (e.g., wiredly), wirelessly, or via a third element.
As used in connection with various embodiments of the disclosure, the term “module” or “unit” may include an element implemented by hardware, software, or firmware, and may interchangeably be used with other terms, for example, “logic,” “logic block,” “part,” or “circuitry”. A module or a unit may be a single integral component, or a minimum unit or part thereof, adapted to perform one or more functions. For example, according to an embodiment, the module may be implemented in a form of an application-specific integrated circuit (ASIC).
Referring to
According to an embodiment, the electronic device 101 may scan the external space by itself. For instance, the electronic device 101 may scan the external space using one or more of the components of the electronic device 101. However, the disclosure is not limited thereto, and as such, according to an embodiment, the electronic device 101 may scan the external space by using an external electronic device 110 different from the electronic device 101. According to an embodiment, the electronic device 101 may request and/or receive information from the external electronic device 110 related to the operation of scanning the external space. Referring to
According to an embodiment, the electronic device 101 may identify a plane 120 on which light emitted from the electronic device 101 is projected, based on the scan of the external space. The electronic device 101 may identify one or more areas (e.g., one or more occluded areas) in the plane 120, which is at least partially occluded by one or more external objects 130. In the example case of
According to an embodiment, the electronic device 101 may identify one or more areas in the plane 120, based on identifying that the plane 120 is partially occluded by the one or more external objects 130. According to an embodiment, the one or more areas may be referred to as candidate display (or projection) areas. According to an embodiment, the electronic device 101 may select and/or segment the first area 141 and the second area 142 from another portion of the plane 120 that is different from a portion occluded by the one or more external objects 130. For example, referring to
According to an embodiment, the candidate display areas identified by the electronic device 101 may adjoin or be spaced apart from each other in the plane 120. In the one or more candidate display areas, the electronic device 101 may identify the first area 141 having a first ratio and the quadrangular form. Here, the first ratio may be a preset ratio. For example, the first ratio may be a first aspect ratio of the first area 141, such as a ratio of a length to a width of the first area 141. In the one or more candidate display areas, the electronic device 101 may determine a second area 142 having a second ratio (e.g., a second aspect ratio). According to an embodiment, the second area 142 may have an area smaller than the first area 141. However, the disclosure is not limited thereto, and as such, according to an embodiment, the second area 142 may be equal to or larger than the first area 141. The second aspect ratio may be the same as or different from the first aspect ratio. According to an embodiment, the first area 141 may be referred to as a main area and the second area 142 may be referred to as a sub area. In an embodiment, the first area 141 may be referred to as a primary area and the second area 142 may be referred to as a secondary area. However, the disclosure is not limited thereto, and as such, the number of candidate display areas is not limited to the first area 141 and the second area 142. Moreover, the designation of the first area 141 and the second area 142 is not limited to main area and sub area, respectively.
As such, the candidate display areas (e.g., the first area 141 and the second area 142) may be configured in various manners in accordance with various applications and services rendered by the electronic device 101.
According to an embodiment, the electronic device 101 may select the first area 141 and the second area 142 so that light emitted from the electronic device 101 avoids the one or more external objects 130 provided between the plane 120 and the electronic device 101 or provided on the plane 120. The electronic device 101 may select the first area 141 having a maximized extent (e.g., size) in the plane 120 that is not occluded by the one or more external objects 130. The electronic device 101 may select the second area 142 that is not occluded by the one or more external objects 130 and has the quadrangular form in the plane 120 from which the first area 141 is excluded. Since the first area 141 has the maximized extent, the second area 142 may have a smaller extent than the first area 141. An example of an operation in which the electronic device 101 selects the first area 141 and the second area 142 in the plane 120 according to an embodiment is described with reference to
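The selection of a maximized non-occluded quadrangular area described above corresponds to the classic maximal-rectangle problem on an occupancy grid. By way of a hypothetical illustration only (the grid representation, function name, and return format below are illustrative assumptions and not part of the disclosure), such a selection may be sketched as follows:

```python
def largest_rectangle(grid):
    """Largest all-ones rectangle in a binary grid (1 = projectable cell).

    Returns (area, top, left, height, width). Runs in O(rows * cols)
    using a per-row histogram and a monotonic stack.
    """
    if not grid:
        return (0, 0, 0, 0, 0)
    cols = len(grid[0])
    heights = [0] * cols          # consecutive projectable cells ending at this row
    best = (0, 0, 0, 0, 0)
    for r, row in enumerate(grid):
        for c in range(cols):
            heights[c] = heights[c] + 1 if row[c] else 0
        stack = []                # (start_col, height), heights kept increasing
        for c, h in enumerate(heights + [0]):  # trailing 0 flushes the stack
            start = c
            while stack and stack[-1][1] >= h:
                s, sh = stack.pop()
                if sh * (c - s) > best[0]:
                    best = (sh * (c - s), r - sh + 1, s, sh, c - s)
                start = s
            stack.append((start, h))
    return best

# Plane occupancy after masking occluded cells (1 = not occluded).
plane = [
    [1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 1, 1, 1],
]
first = largest_rectangle(plane)                 # maximized first area
for r in range(first[1], first[1] + first[3]):   # exclude the first area,
    for c in range(first[2], first[2] + first[4]):
        plane[r][c] = 0
second = largest_rectangle(plane)                # a smaller second area remains
```

In this sketch, the first call yields a 4 x 4 main area and, after that region is excluded, the second call yields a smaller 2 x 2 sub area, mirroring the relationship between the first area 141 and the second area 142.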
In an embodiment, the media content displayed in the first area 141 by the electronic device 101 may be stored in the memory of the electronic device 101 or may be transmitted to the electronic device 101 from another electronic device. The other electronic device may be the external electronic device 110, which may include, but is not limited to, a mobile phone, a set-top box (STB), a PC, and/or a TV. The media content may include an image and/or a video. The media content may be streamed from a network connected to the electronic device 101. The media content may include a video and a sound synchronized with the video. The media content may include a video standardized by the moving picture experts group (MPEG). According to an embodiment, the electronic device 101 may obtain the UI associated with the media content in a state of displaying the media content in the first area 141.
According to an embodiment, the electronic device 101 may display the UI associated with the media content in the second area 142 together with the media content displayed in the first area 141. The UI may include information on the media content and/or a channel for transmitting the media content. The UI may be selected from a plurality of preset UIs provided by a software application executed by the electronic device 101. The UI may include one or more executable objects for controlling the playback of videos in the media content based on the first area 141. The UI may be set by a content provider providing the media content.
In an embodiment, the UI displayed by the electronic device 101 in the second area 142 may have a layout based on the form (e.g., width, height, and/or size) of the second area 142. The layout may be associated with the size and/or position of at least one visual object included in the UI. The layout may be an arrangement of a plurality of visual objects included in the UI. A visual object may mean a deployable object that may be provided on the screen for transmission and/or interaction of information, such as a text, an image, an icon, a video, a button, a checkbox, a radio button, a text box, and/or a table. An operation of selecting the second area 142 and an operation of displaying a UI having a layout based on the form of the second area 142, performed by the electronic device 101, may be related or interconnected. For example, the electronic device 101 may select the second area 142 based on a form suitable for displaying the UI associated with the media content, in the plane 120 excluding the first area 141. According to an embodiment, an example operation of displaying the UI in the second area 142 by the electronic device 101 will be described with reference to
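As a minimal sketch of such a responsive layout (the function name, visual-object names, and slot-based arrangement are illustrative assumptions, not part of the disclosure), a layout may be derived from the width and height of the second area as follows:

```python
def layout_ui(objects, width, height):
    """Arrange visual objects in an area of the given width and height.

    A landscape area places the objects side by side in one row; a
    portrait area stacks them in one column. Returns a mapping of
    object name -> (x, y, w, h) bounding box. Hypothetical sketch.
    """
    n = len(objects)
    boxes = {}
    if width >= height:           # landscape second area
        slot = width // n
        for i, name in enumerate(objects):
            boxes[name] = (i * slot, 0, slot, height)
    else:                         # portrait second area
        slot = height // n
        for i, name in enumerate(objects):
            boxes[name] = (0, i * slot, width, slot)
    return boxes

# A wide sub area lays the same visual objects out horizontally...
wide = layout_ui(["title", "play", "progress"], 300, 100)
# ...while a tall sub area stacks them vertically.
tall = layout_ui(["title", "play", "progress"], 100, 300)
```

The same set of visual objects thus produces a different arrangement depending only on the form of the second area, which is the sense in which the UI is responsive.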
According to an embodiment, the electronic device 101 may extract information from the media content. The electronic device 101 may display a UI including the information extracted from the media content in the second area 142. The information may include a scene at any one timing of the video in the media content, or a text extracted from the scene. According to an embodiment, an example operation of displaying the UI in the second area 142 based on scene recognition for the video in the media content by the electronic device 101 will be described with reference to
Although the operation of the electronic device 101 for selecting a plurality of areas (e.g., the first area 141 and the second area 142) in the plane 120 has been exemplarily described, the embodiment is not limited thereto. For example, the electronic device 101 may identify a plurality of planes to which the light output from the electronic device 101 may reach. Based on identifying the plurality of planes, the electronic device 101 may identify a plurality of areas that are not occluded by at least one external object and have the quadrangular form, such as the first area 141 and the second area 142, in the plurality of planes. An example of an operation in which the electronic device 101 identifies the plurality of areas in the plurality of planes will be described with reference to
As described above, according to an embodiment, the electronic device 101 may identify the plane 120 capable of forming the screen based on the light output from the electronic device 101, by scanning an environment (e.g., the external space) adjacent to the electronic device 101. In case that the plane 120 is occluded by the one or more external objects 130, the electronic device 101 may select the plurality of areas (e.g., the first area 141 and the second area 142) having the quadrangular form, in a portion (e.g., a portion having a polygonal form) in the plane 120 that is not occluded by the one or more external objects 130. The electronic device 101 may additionally output information on the media content by displaying the media content and the UI associated with the media content on each of the plurality of areas. The information output together with the media content from the electronic device 101 may be used to enhance a user experience associated with the media content. The electronic device 101 may increase the amount of information displayed through the plane 120 by forming the plurality of areas in the plane 120. Since the amount of information is increased, the electronic device 101 may use the plane 120 more efficiently.
According to an embodiment illustrated in
According to an embodiment, the processor 210-1 of the electronic device 101 may include a hardware component for processing data based on one or more instructions. The hardware component for processing data may include, for example, an arithmetic and logic unit (ALU), a floating point unit (FPU), a field programmable gate array (FPGA), a central processing unit (CPU) and/or application processor (AP). The number of processors 210-1 may be one or more. For example, the processor 210-1 may have a structure of a multi-core processor such as a dual core, a quad core, or a hexa core.
According to an embodiment, the memory 220-1 of the electronic device 101 may include a hardware component for storing data and/or instructions inputted and/or output to the processor 210-1. The memory 220-1 may include, for example, volatile memory such as random-access memory (RAM) and/or non-volatile memory such as read-only memory (ROM). The volatile memory may include, for example, at least one of dynamic RAM (DRAM), static RAM (SRAM), Cache RAM, and pseudo SRAM (PSRAM). The non-volatile memory may include, for example, at least one of a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), a flash memory, a hard disk, a compact disk, solid state drive (SSD) and an embedded multi media card (eMMC).
According to an embodiment, in the memory 220-1 of the electronic device 101, one or more instructions (or commands) indicating a calculation and/or operation to be performed on data by the processor 210-1 may be stored. A set of one or more instructions may be referred to as firmware, operating system, process, routine, sub-routine, and/or application. For example, the electronic device 101 and/or the processor 210-1 may perform at least one of the operations of
According to an embodiment, the communication circuitry 230-1 of the electronic device 101 may include hardware for supporting transmission and/or reception of an electrical signal between the electronic device 101 and the external electronic device 110. Although only the external electronic device 110 is illustrated as another electronic device connected through the communication circuitry 230-1 of the electronic device 101, the embodiment is not limited thereto. The communication circuitry 230-1 may include, for example, at least one of a modem (MODEM), an antenna, and an optic/electronic (O/E) converter. The communication circuitry 230-1 may support transmission and/or reception of the electrical signal based on various types of protocols, such as Ethernet, local area network (LAN), wide area network (WAN), wireless fidelity (WiFi), Bluetooth, Bluetooth low energy (BLE), ZigBee, long term evolution (LTE), 5G new radio (NR), and/or sixth generation (6G).
According to an embodiment, the electronic device 101 may receive a media content by using the communication circuitry 230-1. For example, the electronic device 101 may wirelessly receive a signal for displaying the media content, based on a wireless communication protocol such as wireless display (WiDi) and/or Miracast, through the communication circuitry 230-1. For example, the electronic device 101 may receive the signal for displaying the media content by wire based on a wired communication protocol (or a wired interface) such as high-definition multimedia interface (HDMI), display port (DP), mobile high-definition link (MHL), digital visual interface (DVI), and/or D-subminiature (D-sub), by using the communication circuitry 230-1.
According to an embodiment, the projection assembly 240 of the electronic device 101 may include a plurality of hardware assembled to emit light representing pixels arranged in two dimensions. For example, the projection assembly 240 may include cathode-ray tubes (CRTs) for emitting light of each of the three primary colors in the color space, and a combination of lenses for enlarging the light emitted from each of the CRTs. For example, the projection assembly 240 may include a light source (e.g., a lamp) for emitting light, optical filters for segmenting the light into optical paths corresponding to each of the three primary colors, liquid crystal display (LCD) panels provided on each of the optical paths, and a combination of prisms and/or lenses for synthesizing light output from the LCD panels. For example, the projection assembly 240 may include the light source for emitting light, an optical filter that selects any one of the three primary colors from the light, a digital mirror device (DMD) for adjusting the reflection of the primary color filtered by the optical filter, and a combination of lenses for enlarging the light reflected by the DMD. In terms of requiring projection of light for display of the screen, at least one of the exemplified combinations may be referred to as the projection assembly 240. In an embodiment, the electronic device 101 including the projection assembly 240 may be referred to as a projector or a beam projector. However, the disclosure is not limited thereto, and as such, according to an embodiment, the projection assembly 240 may include components configured to output a 3D image or a hologram. According to an embodiment, the projection assembly 240 may include two or more projectors to project separate images on different planes.
According to an embodiment, the camera 250-1 of the electronic device 101 may include one or more optical sensors (e.g., a charged coupled device (CCD) sensor or a complementary metal oxide semiconductor (CMOS) sensor) that generate an electrical signal representing color and/or brightness of light. A plurality of light sensors in the camera 250-1 may be provided in the form of a 2 dimensional array. The camera 250-1 may generate an image corresponding to light reaching the optical sensors of the 2 dimensional array and including a plurality of pixels arranged in 2 dimensions, by obtaining the electrical signal of each of a plurality of optical sensors substantially simultaneously. For example, photo data captured by using the camera 250-1 may mean an image obtained from the camera 250-1. For example, video data captured by using the camera 250-1 may mean a sequence of a plurality of images obtained from the camera 250-1 according to a preset frame rate.
According to an embodiment, the sensor 260 of the electronic device 101 may generate electronic information that may be processed by the processor 210-1 and/or the memory 220-1 from non-electronic information associated with the electronic device 101. For example, the sensor 260 may include a depth sensor for measuring a distance between the electronic device 101 and an external object. The depth sensor may include a UWB sensor (or UWB radar) that uses a wireless signal in a frequency band of an ultra wide band (UWB). The depth sensor may include a ToF sensor that measures a time-of-flight (ToF) of laser light and/or infrared light. The electronic device 101 may obtain a depth image including depth values arranged in 2 dimensions, by using the ToF sensor. The ToF sensor may include an infrared diode and a plurality of infrared light sensors that detect the intensity of infrared light and are arranged in the form of the 2 dimensional array. The electronic device 101 may obtain the depth image based on a time at which light emitted from the infrared diode is reflected from a subject and reaches at least one of the plurality of infrared light sensors, by using the ToF sensor. In addition to the depth sensor, the electronic device 101 may include a global positioning system (GPS) sensor (or a sensor based on a global navigation satellite system (GNSS), such as Galileo, BeiDou, and COMPASS) for detecting a geographic location of the electronic device 101, an image sensor for detecting electromagnetic waves including light, a touch sensor, and/or an illuminance sensor.
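The depth value produced by the ToF sensor described above follows directly from the round-trip time of the emitted light, since the light travels to the subject and back. As a brief illustrative sketch (the function name is an assumption, not part of the disclosure):

```python
SPEED_OF_LIGHT = 299_792_458.0    # meters per second

def tof_distance(round_trip_seconds):
    """Distance to a subject from a time-of-flight measurement.

    The emitted infrared light travels to the subject and back,
    so the one-way distance is half of speed times time.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A 20 ns round trip corresponds to a subject roughly 3 m away.
d = tof_distance(20e-9)
```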
According to an embodiment, the electronic device 101 may include an output device for outputting information in another form other than a visualized form. For example, the electronic device 101 may include a speaker for outputting an acoustic signal. For example, the electronic device 101 may include a motor for providing vibration-based haptic feedback.
Referring to
According to an embodiment, an application 270 for communicating with the electronic device 101 may be installed in the external electronic device 110. The application 270 may be installed in the external electronic device 110 to exchange a signal and/or information between the electronic device 101 and the external electronic device 110. A processor 210-2 of the external electronic device 110 may control the communication circuitry 230-2 by executing the application 270. Through the communication circuitry 230-2, the external electronic device 110 may be connected to the communication circuitry 230-1 of the electronic device 101. In a state in which the application 270 is executed, a communication link may be established between the electronic device 101 and the external electronic device 110. The external electronic device 110 may obtain information to be transmitted to the electronic device 101 based on the execution of the application 270. For example, the electronic device 101 may transmit a first signal indicating to obtain information on one or more planes spaced apart from the electronic device 101, to the external electronic device 110, by executing the application 270. The first signal may be transmitted to the external electronic device 110 based on booting of the electronic device 101. The external electronic device 110 may execute the application 270 based on receiving the first signal. The external electronic device 110 may obtain at least one image for the one or more planes from the camera 250-2 based on the execution of the application 270. The external electronic device 110 may transmit a second signal including the information including the at least one image to the electronic device 101 through the communication circuitry 230-2, as a response to the first signal. An example of an operation of the electronic device 101 and the external electronic device 110 according to an embodiment will be described with reference to
According to an embodiment, the processor 210-1 of the electronic device 101 may obtain information on one or more planes to which light emitted from the projection assembly 240 is to be projected based on the second signal received through the communication circuitry 230-1. The embodiment is not limited thereto, and the electronic device 101 including the camera 250-1 and/or the sensor 260 may obtain the information on the one or more planes by controlling the camera 250-1 and/or the sensor 260. The processor 210-1 of the electronic device 101 may display the media content in the first area (e.g., a first area 141 of
As described above, according to one embodiment, the electronic device 101 may obtain information on an external space where the light of the projection assembly 240 is to be propagated, by using the camera 250-2 in the external electronic device 110 and/or the camera 250-1 of the electronic device 101. From the information, the electronic device 101 may identify one or more planes on which the light is to be projected. The electronic device 101 may identify a plurality of areas on which the light is to be projected, in a portion not occluded by at least one external object (e.g., one or more external objects 130 of
Hereinafter, with reference to
According to an embodiment, the electronic device 101 may obtain information on an external space in which light of a projection assembly (e.g., a projection assembly 240 of
However, the disclosure is not limited thereto, and as such, according to an embodiment, the projector may display a guidance layout (e.g., a blank screen or a white screen) corresponding to the projection area of the projector to assist the user in capturing the projection area using the external electronic device. According to an embodiment, information for guiding the capture of the image may be output in a different manner. For example, the external electronic device 110 may output audio to guide the capture of the image. The external electronic device 110 may obtain an image (e.g., an image where the plane 120 is captured) to be transmitted to the electronic device 101 based on an image capture input. The external electronic device 110 may transmit the image to the electronic device 101.
According to an embodiment, the information obtained by the electronic device 101 by using the camera of the external electronic device 110 and/or the electronic device 101 may include an image for the plane 120 to which light emitted from the projection assembly in the electronic device 101 is to be projected. The electronic device 101 may identify one or more external objects (e.g., a first external object 131 and/or a second external object 132) different from the plane 120 by performing object recognition on the image. The object recognition may include an operation of classifying a subject captured in the image into any one of preset categories (e.g., categories distinguished by a name of the subject). The object recognition may be performed based on an artificial neural network (ANN) executed by the electronic device 101. For example, the electronic device 101 may perform the object recognition based on an artificial neural network such as a convolutional neural network (CNN) and/or a long short-term memory (LSTM). Identifying the one or more external objects by the electronic device 101 may include an operation of identifying positions of the one or more external objects in the image and/or at least a portion of the image occupied by the one or more external objects.
According to an embodiment, the electronic device 101 may separate the light path of the projection assembly from the one or more external objects identified from the image, in order to prevent light output from the projection assembly from being distorted by the three-dimensional (3D) form and/or color of the one or more external objects. For example, the electronic device 101 may identify a portion 320 different from a portion occluded by the first external object 131 and the second external object 132, in the plane 120. In an embodiment, the portion 320 may be referred to as a projectable portion. The electronic device 101 may use a camera of the electronic device 101 and/or the external electronic device 110 to identify the projectable portion. According to an embodiment, the electronic device 101 may identify the projectable portion based on coordinates of the one or more external objects 130 with respect to the plane 120. According to an embodiment, independent of the plane 120 having the form of a quadrangle, the form of the portion 320 identified by the electronic device 101 from the plane 120 may have a form of a polygon and/or a closed curve included in the plane 120. In an embodiment, in case that there is no external object such as the one or more external objects 130 between the plane 120 and the electronic device 101, the electronic device 101 may determine the entire plane 120 as the projectable portion.
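As a hypothetical sketch of determining the projectable portion (the coarse grid granularity and bounding-box representation are illustrative assumptions, not part of the disclosure), cells of the plane covered by a recognized object's bounding box may simply be masked out:

```python
def projectable_mask(plane_w, plane_h, object_boxes):
    """Mark each cell of a plane_w x plane_h plane as projectable (1)
    unless it falls inside an occluding object's bounding box.

    object_boxes: iterable of (x, y, w, h) boxes in plane coordinates.
    Hypothetical sketch; a real device would work at pixel resolution.
    """
    mask = [[1] * plane_w for _ in range(plane_h)]
    for (x, y, w, h) in object_boxes:
        for r in range(max(0, y), min(plane_h, y + h)):
            for c in range(max(0, x), min(plane_w, x + w)):
                mask[r][c] = 0          # occluded, do not project here
    return mask

# One detected external object occludes two cells of a 4 x 3 plane.
mask = projectable_mask(4, 3, [(1, 1, 2, 1)])
```

The remaining ones in the mask form the polygonal projectable portion (e.g., the portion 320) from which quadrangular candidate display areas may then be selected.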
Referring to
According to an embodiment, the electronic device 101 may select the second area (e.g., the second area 142 of
In an embodiment, the electronic device 101 may select the second area different from the first area, based on the priority assigned to each of the other areas 322 and 323 except for the area 321 determined as the first area. For example, the electronic device 101 may select the second area in which the UI associated with the media content displayed through the first area is to be displayed, from among the areas 322 and 323. In case that the area 323 among the areas 322 and 323 is selected as the second area, the electronic device 101 may identify the UI to be displayed through the second area, based on the width, height, and/or extent of the area 323. For example, the electronic device 101 may select the UI to be displayed through the area 323, based on the media content and the width, height, and/or extent of the area 323, from among the preset UIs stored in a memory (e.g., the memory 220-1 of
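The selection of the first area and the second area among candidate areas may be sketched as follows; this is an illustrative Python sketch only, in which the candidate rectangles are hypothetical and the two selection conditions (minimum width/height deviation, maximum extent) follow the description of the embodiments:

```python
# Illustrative sketch: choose the first area as the candidate with the
# maximum extent, then choose the second area among the remaining
# candidates either by minimum width/height deviation (most square-like)
# or by maximum extent, depending on a condition of the media content.

def select_areas(candidates, second_condition="min_deviation"):
    """candidates: list of (width, height) rectangles."""
    first = max(candidates, key=lambda c: c[0] * c[1])
    rest = [c for c in candidates if c != first]
    if second_condition == "min_deviation":
        second = min(rest, key=lambda c: abs(c[0] - c[1]))
    else:  # "max_extent"
        second = max(rest, key=lambda c: c[0] * c[1])
    return first, second

candidates = [(160, 90), (40, 30), (80, 20)]
first, second = select_areas(candidates)
assert first == (160, 90)   # largest extent becomes the first area
assert second == (40, 30)   # smallest |width - height| among the rest
_, second2 = select_areas(candidates, "max_extent")
assert second2 == (80, 20)  # largest remaining extent
```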
According to an embodiment, the electronic device 101 may display the UI through the area 323 determined as the second area, in a state of displaying the media content through the area 321 determined as the first area. Based on the width, height, and/or aspect ratio of the area 323, the electronic device 101 may adjust the layout of the UI. Since the electronic device 101 adjusts the layout of the UI, the UI displayed through the area 323 may have a form suitable for the area 323. In terms of responding to the size of the second area, the UI displayed through the second area may be referred to as a responsive UI.
As described above, according to an embodiment, the electronic device 101 may obtain information associated with the plane 120 and/or the one or more external objects 130, by using the camera of the external electronic device 110 and/or the electronic device 101. Based on the information, the electronic device 101 may distinguish a plurality of areas (e.g., the areas 321, 322, and 323), in another portion (e.g., the projectable portion including the portion 320) that is different from a portion occluded by the one or more external objects 130, in the plane 120. In the plurality of areas, the electronic device 101 may select the first area (e.g., the area 321 of
Hereinafter, with reference to
In the first scenario 401 illustrated in
In
In the first scenario 401 illustrated in
According to an embodiment, that the electronic device 101 displays the UI based on the extent and/or the aspect ratio of the second area 142 may include an operation of changing the layout of one or more visual objects (e.g., the visual objects 410, 420, 430, and 440) included in the UI. The layout may include a position, form, size and/or arrangement of the one or more visual objects.
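The layout change described above may be sketched as follows; a minimal illustrative Python sketch, assuming equal-sized slots for the visual objects, in which a portrait-shaped second area stacks the objects vertically and a landscape-shaped area arranges them side by side:

```python
# Illustrative sketch of a responsive UI layout: arrange visual objects in
# the second area, stacking them vertically when the area is taller than
# wide and side-by-side otherwise; each object gets an equal share of space.

def layout_ui(area_w, area_h, n_objects):
    """Return (x, y, w, h) slots for n_objects within the area."""
    if area_h > area_w:  # portrait: vertical stack
        slot_h = area_h // n_objects
        return [(0, i * slot_h, area_w, slot_h) for i in range(n_objects)]
    slot_w = area_w // n_objects  # landscape: horizontal row
    return [(i * slot_w, 0, slot_w, area_h) for i in range(n_objects)]

# A portrait-shaped second area stacks four objects vertically.
slots = layout_ui(30, 120, 4)
assert slots == [(0, 0, 30, 30), (0, 30, 30, 30),
                 (0, 60, 30, 30), (0, 90, 30, 30)]
# A landscape-shaped area arranges them in a row instead.
assert layout_ui(120, 30, 2) == [(0, 0, 60, 30), (60, 0, 60, 30)]
```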
In the second scenario 402 in
Referring to the first scenario 401 and the second scenario 402 in
In an embodiment, that the electronic device 101 displays the media content and the UI in different areas on the plane 120 is not limited to the embodiment of
As described above, according to an embodiment, the electronic device 101 may display the UI including a plurality of visual objects 410, 420, 430, and 440 arranged based on the form of the second area 142, in the second area 142 of the plane 120 that is not occluded by the external object (e.g., the first external object 131 to the third external object 133) such as furniture. For example, the UI and/or the plurality of visual objects 410, 420, 430, and 440 in the UI may be determined by the media content displayed through the first area 141. For example, the position and/or size of the plurality of visual objects 410, 420, 430, and 440 in the second area 142 may be determined by the form of the second area 142.
Although the first scenario 401 and the second scenario 402, in which the electronic device 101 displays a UI including information associated with home shopping in a state of displaying the media content for the home shopping, are illustrated as an example, the disclosure is not limited thereto. While displaying another type of the media content different from the home shopping, the electronic device 101 may display a UI based on the other type of the media content. For example, while displaying a media content classified as a movie, the electronic device 101 may display information (e.g., the title, actors, running time, ticketing information and/or script of the movie) associated with the movie. For example, while displaying a media content classified as news, the electronic device 101 may display information (e.g., news title) associated with the news. For example, while displaying a media content classified as a sports game, the electronic device 101 may display information described later with reference to
According to an embodiment, the electronic device 101 may obtain the UI to be displayed through the second area 142, by extracting text included in the media content. Hereinafter, an example of an operation of displaying the UI based on the text by the electronic device 101 according to an embodiment will be described with reference to
According to an embodiment, the electronic device 101 may perform scene recognition on the media content (e.g., live video and/or over-the-top (OTT) video) displayed through the first area 141. The electronic device 101 may obtain a UI to be displayed through the second area 142, based on the scene recognition. For example, the electronic device 101 may obtain the UI, based on any one of frames included in a video of the media content. For example, in the third scenario 501 in which the electronic device 101 projects the media content associated with the sports game onto the first area 141, the electronic device 101 may identify a frame including information on one or more athletes associated with the sports event among the frames in the media content. The electronic device 101 may display the identified frame through the second area 142, in the third scenario 501. The second area 142 may have the aspect ratio (e.g., the preset aspect ratio of the first area 141) of the frame. The embodiment is not limited thereto, and for example, the electronic device 101 may display a UI including the subtitle identified based on the scene recognition through the second area 142, while projecting media content including the subtitle onto the first area 141.
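The frame identification step above may be sketched as follows; an illustrative Python sketch only, in which the per-frame scene-recognition labels and the keyword set are hypothetical stand-ins for the output of an actual recognition model:

```python
# Illustrative sketch: from per-frame scene-recognition labels, pick the
# first frame whose labels indicate information on one or more athletes,
# so that the frame can be displayed through the second area.

def find_info_frame(frame_labels, keywords=("athlete", "score", "lineup")):
    """frame_labels: list of (frame_index, set_of_labels)."""
    for index, labels in frame_labels:
        if any(k in labels for k in keywords):
            return index
    return None

frames = [(0, {"crowd", "stadium"}),
          (1, {"athlete", "lineup"}),
          (2, {"ball"})]
assert find_info_frame(frames) == 1          # frame with athlete info
assert find_info_frame([(0, {"crowd"})]) is None
```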
According to an embodiment, the electronic device 101 may determine layout of a UI in which information extracted from the media content is to be displayed, based on the aspect ratio and/or size of the second area 142. Referring to
Referring to
As described above, according to an embodiment, the electronic device 101 may obtain the UI to be displayed through the second area 142 and/or information to be included in the UI, based on frames in the media content displayed through the first area 141. For example, in case that a parameter indicating the selection of at least one of a plurality of preset UIs stored in the electronic device 101 is not identified in the media content, the electronic device 101 may obtain the UI to be displayed through the second area 142 by performing the scene recognition on the media content. For example, the electronic device 101 may obtain text to be displayed through the second area 142, based on the OCR for at least one of frames in the media content. Based on the aspect ratio of the second area 142, the electronic device 101 may selectively project, onto the second area 142, either the text or the frame among the frames in the media content that was used to obtain the text.
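The aspect-ratio-based choice between the frame and the extracted text may be sketched as follows; a purely illustrative Python sketch, in which the tolerance threshold is an assumption not stated in the description:

```python
# Illustrative sketch: based on the aspect ratio of the second area, project
# either the source frame (when the area matches the frame's aspect ratio
# within a tolerance) or only the text extracted from that frame by OCR.

def choose_projection(area_w, area_h, frame_w, frame_h, text, tolerance=0.1):
    area_ratio = area_w / area_h
    frame_ratio = frame_w / frame_h
    if abs(area_ratio - frame_ratio) <= tolerance * frame_ratio:
        return ("frame", (frame_w, frame_h))
    return ("text", text)

# A second area matching the 16:9 frame ratio shows the frame itself.
assert choose_projection(160, 90, 1920, 1080, "LINEUP")[0] == "frame"
# A narrow area falls back to the extracted text.
assert choose_projection(40, 90, 1920, 1080, "LINEUP") == ("text", "LINEUP")
```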
According to an embodiment, the electronic device 101 may identify the first area 141 in which the media content is to be displayed and the second area 142 in which the UI associated with the media content is to be displayed, by recognizing a plurality of planes including the plane 120. For example, the first area 141 and the second area 142 may be selected from each of the plurality of planes. Hereinafter, with reference to
Referring to
According to an embodiment, the electronic device 101 may obtain at least one image including information on all directions.
Referring to
According to an embodiment, the electronic device 101 may select a first area in which the media content is to be projected and a second area in which a UI associated with the media content is displayed, from among the areas 642, 652, and 662 that are not occluded by the first external object 621, the second external object 622, and the third external object 623, in a plurality of planes 611, 612, and 613. The electronic device 101 may identify the first area and the second area, based on at least one plane adjacent to a user 670 among the plurality of planes 611, 612, and 613, based on identifying the user 670 adjacent to the plurality of planes 611, 612, and 613. For example, referring to
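The proximity-based selection among the plurality of planes may be sketched as follows; an illustrative Python sketch, assuming each plane is reduced to a center point and the user's position is known, neither of which is specified in the description:

```python
# Illustrative sketch: rank planes by Euclidean distance from the user, so
# that the plane nearest the user can carry the first area (media content)
# and the next nearest plane can carry the second area (UI).

import math

def rank_planes_by_user(planes, user):
    """planes: dict name -> (x, y) center; user: (x, y) position.
    Returns plane names ordered by distance to the user."""
    return sorted(planes, key=lambda p: math.dist(planes[p], user))

planes = {"611": (0, 0), "612": (5, 0), "613": (20, 0)}
order = rank_planes_by_user(planes, user=(4, 0))
assert order[0] == "612"   # first area: plane nearest the user
assert order[1] == "611"   # second area: next nearest plane
```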
As described above, according to an embodiment, in case of projecting light toward the plurality of planes, the electronic device 101 may select the first area to which the media content is to be projected and the second area to which the UI associated with the media content is to be projected, based on the projectable portion of each of the plurality of planes and the position of the user 670 with respect to the plurality of planes. Since the light is projected to the plurality of areas, the electronic device 101 may increase the amount of usage of the plurality of planes. Based on the increased amount of usage, the electronic device 101 may improve a user experience.
Hereinafter, an operation of the electronic device 101 according to an embodiment will be described with reference to
In operation 710, the method may include obtaining information on a plane on which light emitted from the electronic device is to be projected. According to an embodiment, the electronic device may obtain information on a plane on which light emitted from the electronic device is to be projected. The information may include an image obtained through a camera (e.g., a camera 250-1 of
In operation 720, the method may include identifying a first portion in the plane occluded by at least one external object based on the information of the operation 710. According to an embodiment, the electronic device may identify a first portion in the plane occluded by at least one external object based on the information of the operation 710. The electronic device may identify at least one external object provided between the electronic device and the plane, based on the information of the operation 710. The electronic device may identify the first portion in the plane occluded by the at least one external object. The first portion may be a portion to which light (e.g., light emitted from a projection assembly 240 of
In operation 730, the method may include selecting a first area and a second area smaller than the first area, in a second portion in a plane different from the first portion. According to an embodiment, the electronic device may select a first area and a second area smaller than the first area, in a second portion in a plane different from the first portion. The electronic device may determine a candidate area having a maximum extent from among candidate areas having a quadrangular form having a preset aspect ratio (e.g., 16:9) as the first area. The electronic device may select the second area based on a condition indicated by a media content to be displayed through the first area, in the second portion from which the first area is excluded. For example, based on the conditions, the electronic device may determine a candidate area having a minimum difference in width and height, among the candidate areas having the quadrangular form formed in the second portion from which the first area is excluded, as the second area. For example, based on the condition, the electronic device may determine a candidate area having the maximum extent among the candidate areas having the quadrangular form formed in the second portion from which the first area is excluded, as the second area.
In operation 740, the method may include obtaining a UI associated with the media content and based on the size of the second area, in a state of displaying the media content through the first area. According to an embodiment, the electronic device may obtain a UI associated with the media content and based on the size of the second area, in a state of displaying the media content through the first area. The electronic device may select a preset UI associated with the media content from among a plurality of preset UIs. The electronic device may obtain the UI of the operation 740 by adjusting the layout of the preset UI based on the width, height, and/or aspect ratio of the second area of the operation 730. According to an embodiment, the electronic device may extract information from the media content, based on scene recognition and/or OCR for the media content. The electronic device may obtain a UI for displaying the extracted information. The electronic device may obtain a UI including the information and having a layout based on the width, height, and/or aspect ratio of the second area.
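Operation 740 may be sketched as follows; a purely illustrative Python sketch, in which the preset UI table, the category names, and the proportional-scaling rule are hypothetical:

```python
# Illustrative sketch of operation 740: select a preset UI matching the
# category of the media content from UIs stored in memory, then size its
# visual objects in proportion to the second area. All names are assumptions.

PRESET_UIS = {
    "home_shopping": ["price", "order_button", "rating"],
    "sports": ["scoreboard", "lineup"],
}

def obtain_ui(category, area_w, area_h, base=100):
    objects = PRESET_UIS.get(category)
    if objects is None:
        return None  # no preset UI: fall back to scene recognition/OCR
    scale = min(area_w, area_h) / base  # size objects proportionally
    return [(name, scale) for name in objects]

ui = obtain_ui("sports", 200, 50)
assert ui == [("scoreboard", 0.5), ("lineup", 0.5)]
assert obtain_ui("unknown", 200, 50) is None
```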
In operation 750, the method may include displaying the UI through the second area. According to an embodiment, the electronic device may display the UI through the second area. The electronic device may display the UI through the second area, in a state of displaying the media content through the first area of the operation 730. Based on the operations 740 and 750, the electronic device may simultaneously display the media content and the UI.
In operation 810, the method may include obtaining information on a plurality of planes in which light emitted from the electronic device is to be projected. According to an embodiment, the electronic device may obtain information on a plurality of planes in which light emitted from the electronic device is to be projected. For example, the electronic device may obtain information on the plurality of planes, by using a camera (e.g., a camera 250-1 of
In operation 820, the method may include identifying a candidate area included in another portion that is different from a portion occluded by an external object, in each of the plurality of planes. According to an embodiment, the electronic device may identify a candidate area included in another portion that is different from a portion occluded by an external object, in each of the plurality of planes. For example, the electronic device may identify at least one external object provided between the plurality of planes and the electronic device, based on the information of the operation 810. The electronic device may identify the portion occluded by at least one external object, based on one or more images in which the plurality of planes are captured. For example, in each of the plurality of planes, the other portion different from the portion may correspond to a portion to which light emitted from the projection assembly of the electronic device may reach.
In operation 830, the method may include selecting a first area and a second area smaller than the first area, from candidate areas selected from each of the plurality of planes. According to an embodiment, the electronic device may select a first area and a second area smaller than the first area, from candidate areas selected from each of the plurality of planes. Based on the information of the operation 810, the electronic device may identify a user (e.g., a user 670 of
In operation 840, the method may include displaying the media content through the first area, and the UI having a layout based on the size of the second area and associated with the media content through the second area. According to an embodiment, the electronic device may display the media content through the first area, and the UI having a layout based on the size of the second area and associated with the media content through the second area. The electronic device may project the UI through the second area substantially simultaneously with projecting the media content onto the first area. The electronic device may display a UI having a layout adjusted by a ratio of the width and the height of the second area, in the second area selected based on the operation 830. For example, in the UI, a plurality of visual objects may be arranged based on the aspect ratio of the second area. For example, in the UI, the plurality of visual objects may have sizes proportional to the size of the second area.
In operation 910, according to an embodiment, the electronic device 101 may transmit a first signal 912 for requesting information on an external space to an external electronic device including a camera. The first signal 912 may include information indicating execution of an application (e.g., an application 270 of
In operation 920, according to an embodiment, the external electronic device 110 may obtain the image for at least a portion of the external space, by executing a preset application (e.g., the application 270 of
In operation 930, according to an embodiment, the external electronic device 110 may transmit a second signal 932 including information on an external space associated with the obtained image. The second signal 932 may include the image of the operation 920. In case that the external electronic device 110 obtains a plurality of images, the second signal 932 may include the plurality of images. The external electronic device 110 may transmit the second signal 932 to the electronic device 101, in response to the first signal 912. The embodiment is not limited thereto, and for example, the external electronic device 110 may select different areas (e.g., a first area 141 to a second area 142 of
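The request/response exchange of operations 910 to 930 may be sketched as follows; an illustrative Python sketch only, in which the message fields and handler names are assumptions rather than part of the disclosure:

```python
# Illustrative sketch of operations 910-930: the electronic device sends a
# first signal requesting spatial information; the external electronic device
# captures one or more images and replies with a second signal carrying them.

from dataclasses import dataclass, field

@dataclass
class FirstSignal:            # operation 910: request for spatial information
    request_images: bool = True

@dataclass
class SecondSignal:           # operation 930: response carrying the images
    images: list = field(default_factory=list)

def handle_first_signal(signal, capture):
    """External-device side: obtain images and build the response."""
    images = capture() if signal.request_images else []
    return SecondSignal(images=images)

reply = handle_first_signal(FirstSignal(), capture=lambda: ["img_plane_120"])
assert reply.images == ["img_plane_120"]
```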
In operation 940, according to an embodiment, the electronic device 101 may identify at least one plane on which the light emitted from the electronic device is projected. The electronic device 101 may identify at least one plane capable of reflecting the light output from the electronic device 101, such as the plane 120 of
In operation 950, according to an embodiment, the electronic device 101 may select a plurality of areas on which the light is to be projected, based on a portion occluded by at least one external object, in the plane. In an embodiment, the electronic device 101 may perform the operation 950 of
In operation 960, according to an embodiment, the electronic device 101 may output light representing the media content and the UI associated with the media content, based on the selected plurality of areas. Based on operation 740 of
According to an embodiment, the electronic device may identify at least one external object that occludes a plane for reflecting light emitted from the electronic device. The electronic device may emit light to a second portion different from a first portion in a plane occluded by at least one external object. The form of the second portion may be different from that of media content included in the light and having a quadrangular form. In order to increase the amount of information displayed through the second portion, the electronic device may select at least one second area different from the first area in which the media content is displayed, in the second portion. In the at least one second area, the electronic device may display the UI associated with the media content.
The electronic device according to an embodiment described with reference to
Metaverse is a combination of the English words Meta, which means “virtual” or “transcendence,” and “Universe,” which means the universe, and refers to a three-dimensional virtual world where social, economic, and cultural activities like the real world take place. Metaverse is a concept that has evolved one step further than virtual reality, and it is characterized by using avatars to not only enjoy games or virtual reality (VR, cutting-edge technology that enables people to experience real-life experiences in a computerized virtual world), but also social and cultural activities like real reality. Metaverse service may provide media content to enhance immersion in the virtual world, based on augmented reality (AR), virtual reality (VR), mixed reality (MR), and/or extended reality (XR).
For example, the media content provided by metaverse service may include social interaction content including a game, a concert, a party, and/or a conference based on an avatar. For example, the media content may include information for economic activities such as advertising, user-created content, and/or sales of products and/or shopping. Ownership of the user-created content may be proved by a blockchain-based non-fungible token (NFT). The metaverse service may support economic activities based on real money and/or cryptocurrency. Virtual content linked to the real world, such as digital twin or life logging, may be provided by the metaverse service. The metaverse service may be provided through a network based on fifth generation (5G) and/or sixth generation (6G). However, the disclosure is not limited to a network based on fifth generation (5G) and/or sixth generation (6G).
Referring to
In this case, the server 1010 provides a virtual space so that the user terminal 1020 may perform activities in the virtual space. In addition, the user terminal 1020 may represent information provided by the server 1010 to the user or transmit information that the user wants to represent in the virtual space to the server, by installing an S/W agent to access the virtual space provided by the server 1010. The S/W agent may be provided directly through the server 1010, downloaded from a public server, or embedded and provided when purchasing a terminal.
In an embodiment, the metaverse service may be provided to the user terminal 1020 and/or the user by using the server 1010. The embodiment is not limited thereto, and the metaverse service may be provided through individual contact between users. For example, within the network environment 1001, the metaverse service may be provided by a direct connection between the first terminal 1020-1 and the second terminal 1020-2, independently of the server 1010. Referring to
In an embodiment, the user terminal 1020 (or the user terminal 1020 including the first terminal 1020-1 and the second terminal 1020-2) may be made into various form factors, and may be characterized by including an input device for inputting information to the metaverse service and an output device that provides video and/or sound to the user. Examples of various form factors of the user terminal 1020 include a smartphone (e.g., the second terminal 1020-2), an AR device (e.g., the first terminal 1020-1), a VR device, an MR device, a video see through (VST) device, an optical see through (OST) device, a smart lens, a smart mirror, a TV or a projector capable of input/output.
The network (e.g., a network formed by at least one intermediate node 1030) may include various broadband networks including 3G, 4G, and 5G, and short-range networks including Wi-Fi and BT (e.g., a wired network or a wireless network that directly connects the first terminal 1020-1 and the second terminal 1020-2).
In an embodiment, the user terminal 1020 of
In an embodiment, a method of increasing the amount of information displayed through one or more planes that reflect light emitted from the electronic device may be required. As described above, according to an embodiment, an electronic device (e.g., an electronic device 101 of
For example, the processor may be configured to obtain an image including the plane from the information. The processor may be configured to, in the image, based on identifying at least a portion of the plane occluded by at least one external object, identify, based on another portion (e.g., a portion 320 of
For example, the processor may be configured to identify, in the other portions in the plane, a plurality of candidate areas having the preset ratio. The processor may be configured to identify, among the plurality of candidate areas, a candidate area having a maximum extent as the first area.
For example, the processor may be configured to identify, in the other portions where the first area is excluded, the second area based on a condition corresponding to the UI.
For example, the processor may be configured to identify, among a first preset condition set by deviation between a width and a height, and a second preset condition set based on an extent, the condition corresponding to the UI.
For example, the processor may be configured to obtain, based on one of frames included in a video of the media content, the UI.
For example, the processor may be configured to display, text extracted from a frame used for obtaining the UI, in the UI.
For example, the processor may be configured to transmit, to an external electronic device (e.g., an external electronic device 110 of
As described above, according to an embodiment, a method of an electronic device may comprise obtaining information with respect to one or more planes where light emitted from a projection assembly of the electronic device is to be projected. The method may comprise displaying, in a first area identified in the one or more planes based on the information, a media content. The method may comprise, based on the media content, identifying, in the one or more planes, a second area different from the first area. The method may comprise displaying, in the second area, a user interface (UI) that is associated with the media content and that has a layout based on a width and a height of the second area.
For example, the obtaining may comprise transmitting, to an external electronic device through a communication circuitry of the electronic device, a first signal indicating to obtain information with respect to the one or more planes, by executing a preset application of the external electronic device. The method may comprise obtaining, based on a second signal transmitted as a response to the first signal from the external electronic device, the information including at least one image with respect to the one or more planes.
For example, the displaying the media content may comprise projecting onto the first area having a preset aspect ratio, light representing the media content.
For example, the displaying the media content may comprise identifying, a plurality of candidate areas having a preset aspect ratio in other portion different from a portion occluded by at least one external object in the one or more planes. The method may comprise determining, a candidate area having a maximum extent among the plurality of candidate areas, as the first area.
For example, the identifying the second area may comprise identifying, in other portion in the one or more planes where the first area is excluded, the second area having an extent or an aspect ratio indicated by the media content.
For example, the displaying the UI may comprise displaying, in a state where the media content is displayed in the first area, a preset UI selected by the media content among preset UIs stored in a memory of the electronic device, in the second area.
For example, the displaying the UI may comprise displaying, by adjusting layout of the preset UI selected by the media content based on an aspect ratio of the second area, the UI associated with the media content.
As described above, according to an embodiment, a method of an electronic device may comprise obtaining information for a plane where light emitted from a projection assembly of the electronic device is to be projected. The method may comprise identifying, from the plane based on the information, a first area having a preset ratio, and a second area smaller than the first area. The method may comprise obtaining, in a state that a media content is displayed in the first area identified by a communication circuitry of the electronic device, user interface (UI) associated with the media content. The method may comprise displaying, in the second area, the UI having a layout based on a width and a height of the second area.
For example, the identifying may comprise obtaining an image including the plane from the information. The method may comprise, in the image, based on identifying at least a portion of the plane occluded by at least one external object, identifying, based on another portion in the plane different from the at least a portion, the first area and the second area.
For example, the identifying the first area may comprise identifying, in the other portions in the plane, a plurality of candidate areas having the preset ratio. The method may comprise identifying, among the plurality of candidate areas, a candidate area having a maximum extent as the first area.
For example, the identifying the second area may comprise identifying, in the other portions where the first area is excluded, the second area based on a condition corresponding to the UI.
For example, the identifying the second area may comprise identifying, among a first preset condition set by deviation between a width and a height, and a second preset condition set based on an extent, the condition corresponding to the UI.
For example, the obtaining the UI may comprise obtaining, based on one of frames included in a video of the media content, the UI.
For example, the displaying the UI may comprise displaying, text extracted from a frame used for obtaining the UI, in the UI.
For example, the obtaining may comprise transmitting, to an external electronic device connected through the communication circuitry, a signal indicating to obtain the information including an image including the plane by executing an application executed by the external electronic device.
As described above, according to an embodiment, an electronic device (e.g., an electronic device 101 of
For example, the electronic device may comprise a communication circuitry (e.g., the electronic device 101 of
For example, the processor may be configured to project onto the first area having a preset aspect ratio, light representing the media content.
For example, the processor may be configured to identify, a plurality of candidate areas having a preset aspect ratio in other portion different from a portion occluded by at least one external object in the one or more planes. The processor may be configured to determine, a candidate area having a maximum extent among the plurality of candidate areas, as the first area.
For example, the processor may be configured to identify, in other portion in the one or more planes where the first area is excluded, the second area having an extent or an aspect ratio indicated by the media content.
For example, the electronic device may further comprise a memory (e.g., a memory 220-1 of
For example, the processor may be configured to display, by adjusting layout of the preset UI selected by the media content based on an aspect ratio of the second area, the UI associated with the media content.
The devices described above may be implemented as a hardware component, a software component, and/or a combination of the hardware component and the software component. For example, the device and component described in the embodiments may be implemented by using one or more general purpose computers or special purpose computers such as a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to an instruction. The processing device may run an operating system (OS) and one or more software applications performed on the operating system. In addition, the processing device may access, store, manipulate, process, and generate data in response to execution of the software. For convenience of understanding, although one processing device may be described as being used, a person having ordinary knowledge in the relevant technical field may see that the processing device may include a plurality of processing elements and/or a plurality of types of processing elements. For example, the processing device may include a plurality of processors or one processor and one controller. In addition, another processing configuration, such as a parallel processor, is also possible.
The software may include a computer program, code, an instruction, or a combination of one or more of them, and may configure the processing device to operate as desired or may instruct the processing device independently or collectively. The software and/or data may be embodied in any type of machine, component, physical device, computer storage medium, or device, in order to be interpreted by the processing device or to provide an instruction or data to the processing device. The software may be distributed over a network-connected computer system, and stored or executed in a distributed manner. The software and data may be stored in one or more computer-readable recording media.
The method according to the embodiments may be implemented in the form of program instructions that may be recorded in a computer-readable medium and executed through various processors and/or computers. In this case, the medium may continuously store a program executable by a computer, or may temporarily store the program for execution or download. In addition, the medium may be any of various recording or storage components in the form of a single unit or a combination of several pieces of hardware; it is not limited to a medium directly connected to a certain computer system, and may exist distributed over a network. Examples of the medium, configured to store program instructions, include magnetic media such as a hard disk, a floppy disk, and a magnetic tape; optical recording media such as a CD-ROM and a DVD; magneto-optical media such as a floptical disk; and a ROM, a RAM, a flash memory, and the like. Examples of other media include an app store that distributes an application, a site that supplies or distributes various other software, and recording or storage media managed by servers.
As described above, although the embodiments have been described with reference to limited embodiments and drawings, a person having ordinary knowledge in the relevant technical field may make various modifications and variations from the above description. For example, an appropriate result may be achieved even if the described techniques are performed in an order different from the described method, and/or components such as the described system, structure, device, and circuitry are coupled or combined in a form different from the described method, or are replaced or substituted by other components or equivalents.
Therefore, other implementations, other embodiments, and equivalents to the claims are also within the scope of the claims described below.
Number | Date | Country | Kind |
---|---|---|---|
10-2022-0163471 | Nov 2022 | KR | national |
This application is a continuation of PCT International Application No. PCT/KR2023/012303, which was filed on Aug. 18, 2023, and claims priority to Korean Patent Application No. 10-2022-0163471, filed on Nov. 29, 2022, in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entireties.
Relation | Number | Date | Country |
---|---|---|---|
Parent | PCT/KR2023/012303 | Aug 2023 | WO |
Child | 18240226 | | US |