This disclosure relates generally to database and file management within network environments, and in particular relates to data manipulation in XR space.
An extended reality (XR) system may generally include a computer-generated environment and/or a real-world environment that includes at least some XR objects. Such an XR system or world and associated XR objects typically include various applications (e.g., video games), which may allow users to utilize these XR artifacts by manipulating their presence in the form of a computer-generated representation (e.g., avatar). In typical XR systems, image data may be rendered on, for example, a lightweight, head-mounted display (HMD) that may be coupled through a physical wired connection to a base graphics generation device responsible for generating the image data. In some instances, it may be desirable to couple the HMD to the base graphics generation device via a wireless network connection.
With these tools, users generate new forms of reality by bringing digital objects into the physical world and bringing physical world objects into the digital world. XR technologies have applications in almost every industry, such as architecture, automotive industry, sports training, real estate, mental health, medicine, health care, retail, space travel, design, engineering, interior design, television and film, media, advertising, marketing, libraries, education, news, music, and travel.
A video call is a call using an Internet connection, sometimes called VoIP, that utilizes video to transmit a live picture of the person making the call. Video calls are made using a computer's webcam or other electronic devices with a video-capable camera, like a smartphone, tablet, or video-capable phone system. A video call can also be, for example, a “holocall” or depth video call, where the user's two-dimensional video is projected into a three-dimensional space.
In particular embodiments, the electronic device 100 may be a smart home appliance. Examples of the smart home appliance may comprise at least one of a television, a digital video disk (DVD) player, an audio player, a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washer, a drier, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box, a gaming console, an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
In particular embodiments, the electronic device 100 may comprise at least one of various medical devices (e.g., diverse portable medical measuring devices (a blood sugar measuring device, a heartbeat measuring device, or a body temperature measuring device), a magnetic resonance angiography (MRA) device, a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, a sailing electronic device (e.g., a sailing navigation device or a gyro compass), avionics, security devices, vehicular head units, industrial or home robots, automatic teller machines (ATMs), point of sale (POS) devices, or Internet of Things devices (e.g., a bulb, various sensors, an electric or gas meter, a sprinkler, a fire alarm, a thermostat, a street light, a toaster, fitness equipment, a hot water tank, a heater, or a boiler).
In particular embodiments, the electronic device 100 may comprise at least one of part of a piece of furniture or building/structure, an electronic board, an electronic signature receiving device, a projector, or various measurement devices (e.g., devices for measuring water, electricity, gas, or electromagnetic waves).
In particular embodiments, the electronic device 100 may be one or a combination of the above-listed devices. The electronic device may be a flexible electronic device. The electronic device disclosed herein is not limited to the above-listed devices and may comprise new electronic devices as technology develops.
Hereinafter, electronic devices 100 are described with reference to the accompanying drawings, according to various embodiments of the present disclosure. As used herein, the term “user” may denote a human or another device (e.g., an artificial intelligence electronic device) using the electronic device 100.
In particular embodiments, the electronic device 100 may include, for example, any of various personal electronic devices 102, such as a mobile phone electronic device, a tablet computer electronic device, a laptop computer electronic device, and so forth. In particular embodiments, as further depicted by
In particular embodiments, the one or more processor(s) 104 may be operably coupled with the memory 106 to perform various algorithms, processes, or functions. Such programs or instructions executed by the processor(s) 104 may be stored in any suitable article of manufacture that includes one or more tangible, computer-readable media at least collectively storing the instructions or routines, such as the memory 106. The memory 106 may include any suitable articles of manufacture for storing data and executable instructions, such as random-access memory (RAM), read-only memory (ROM), rewritable flash memory, hard drives, and so forth. Programs (e.g., an operating system) encoded on such a computer program product may also include instructions that may be executed by the processor(s) 104 to enable the electronic device 100 to provide various functionalities.
In particular embodiments, the sensors 108 may include, for example, one or more cameras (e.g., depth cameras), touch sensors, microphones, motion detection sensors, thermal detection sensors, light detection sensors, time of flight (ToF) sensors, ultrasonic sensors, infrared sensors, or other similar sensors that may be utilized to detect various user inputs (e.g., user voice inputs, user gesture inputs, user touch inputs, user instrument inputs, user motion inputs, and so forth). The cameras 110 may include any number of cameras (e.g., wide cameras, narrow cameras, telephoto cameras, ultra-wide cameras, depth cameras, and so forth) that may be utilized to capture various 2D and 3D images. The display 112 may include any display architecture (e.g., AMLCD, AMOLED, micro-LED, and so forth), which may provide further means by which users may interact and engage with the electronic device 100. In particular embodiments, as further illustrated by
In particular embodiments, the input structures 114 may include any physical structures utilized to control one or more global functions of the electronic device 100 (e.g., pressing a button to power “ON” or power “OFF” the electronic device 100). The network interface 116 may include, for example, any number of network interfaces suitable for allowing the electronic device 100 to access and receive data over one or more cloud-based networks (e.g., a cloud-based service that may serve hundreds or thousands of electronic devices 100 and their associated users) and/or distributed networks. The power source 118 may include any suitable source of power, such as a rechargeable lithium polymer (Li-poly) battery and/or an alternating current (AC) power converter that may be utilized to power and/or charge the electronic device 100 for operation. Similarly, the I/O interface 120 may be provided to allow the electronic device 100 to interface with various other electronic or computing devices, such as one or more auxiliary electronic devices.
In particular embodiments, the computing platform 206 may include, for example, a standalone host computing system, an on-board computer system integrated with the XR display device 202, a mobile device, or any other hardware platform that may be capable of providing extended reality content to the XR display device 202. In particular embodiments, the computing platform 206 may include, for example, a cloud-based computing architecture (including one or more servers 208 and data stores 210) suitable for hosting and servicing XR applications or experiences executing on the XR display device 202. For example, in particular embodiments, the computing platform 206 may include a Platform as a Service (PaaS) architecture, a Software as a Service (SaaS) architecture, an Infrastructure as a Service (IaaS) architecture, or another similar cloud-based computing architecture.
The bus 310 may include a circuit for connecting the components 320 to 380 with one another and transferring communications (e.g., control messages and/or data) between the components.
The processor 320 may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP). The processor 320 may perform control on at least one of the other components of the electronic device 100a and/or perform an operation or data processing relating to communication.
The memory 330 may include a volatile and/or non-volatile memory. For example, the memory 330 may store commands or data related to at least one other component of the electronic device 100a. According to an embodiment of the present disclosure, the memory 330 may store software and/or a program 340. The program 340 may include, e.g., a kernel 341, middleware 343, an application programming interface (API) 345, and/or an application program (or “application”) 347. At least a portion of the kernel 341, middleware 343, or API 345 may be denoted an operating system (OS).
For example, the kernel 341 may control or manage system resources (e.g., the bus 310, processor 320, or a memory 330) used to perform operations or functions implemented in other programs (e.g., the middleware 343, API 345, or application program 347). The kernel 341 may provide an interface that allows the middleware 343, the API 345, or the application 347 to access the individual components of the electronic device 100a to control or manage the system resources.
The middleware 343 may function as a relay to allow the API 345 or the application 347 to communicate data with the kernel 341, for example. A plurality of applications 347 may be provided. The middleware 343 may control work requests received from the applications 347, e.g., by allocating the priority of using the system resources of the electronic device 100a (e.g., the bus 310, the processor 320, or the memory 330) to at least one of the plurality of applications 347.
The API 345 may be an interface allowing the application 347 to control functions provided from the kernel 341 or the middleware 343. For example, the API 345 may include at least one interface or function (e.g., a command) for file control, window control, image processing, or text control.
The input/output interface 350 may serve as an interface that may, e.g., transfer commands or data input from a user or other external devices to other component(s) of the electronic device 100a. Further, the input/output interface 350 may output commands or data received from other component(s) of the electronic device 100a to the user or the other external device.
The display 360 may include, e.g., a liquid crystal display (LCD), a light emitting diode (LED) display, an organic light emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display. The display 360 may display, e.g., various contents (e.g., text, images, videos, icons, or symbols) to the user. The display 360 may include a touchscreen and may receive, e.g., a touch, gesture, proximity or hovering input using an electronic pen or a body portion of the user.
For example, the communication interface 370 may set up communication between the electronic device 100a and an external electronic device (e.g., a first electronic device 100b, a second electronic device 100c, or a server 306). For example, the communication interface 370 may be connected with the network 362 or 364 through wireless or wired communication to communicate with the external electronic device.
The first external electronic device 100b or the second external electronic device 100c may be a wearable device or an electronic device 100a-mountable wearable device (e.g., a head mounted display (HMD)). When the electronic device 100a is mounted in an HMD (e.g., the electronic device 100b), the electronic device 100a may detect the mounting in the HMD and operate in a virtual reality mode. When the electronic device 100a is mounted in the electronic device 100b (e.g., the HMD), the electronic device 100a may communicate with the electronic device 100b through the communication interface 370. The electronic device 100a may be directly connected with the electronic device 100b to communicate with the electronic device 100b without involving a separate network.
The wireless communication may use at least one of, e.g., long term evolution (LTE), long term evolution-advanced (LTE-A), code division multiple access (CDMA), wideband code division multiple access (WCDMA), universal mobile telecommunication system (UMTS), wireless broadband (WiBro), or global system for mobile communication (GSM), as a cellular communication protocol. The wired connection may include at least one of universal serial bus (USB), high definition multimedia interface (HDMI), recommended standard 232 (RS-232), or plain old telephone service (POTS).
The network 362 may include at least one of communication networks, e.g., a computer network (e.g., a local area network (LAN) or a wide area network (WAN)), the Internet, or a telephone network.
The first and second external electronic devices 100b and 100c each may be a device of the same or a different type from the electronic device 100a. According to an embodiment of the present disclosure, the server 306 may include a group of one or more servers. According to an embodiment of the present disclosure, all or some of operations executed on the electronic device 100a may be executed on another or multiple other electronic devices (e.g., the electronic devices 100b and 100c or server 306). According to an embodiment of the present disclosure, when the electronic device 100a should perform some function or service automatically or upon request, the electronic device 100a, instead of executing the function or service on its own or additionally, may request another device (e.g., electronic devices 100b and 100c or server 306) to perform at least some functions associated therewith. The other electronic device (e.g., electronic devices 100b and 100c or server 306) may execute the requested functions or additional functions and transfer a result of the execution to the electronic device 100a. The electronic device 100a may provide a requested function or service by processing the received result as it is or after additional processing. To that end, a cloud computing, distributed computing, or client-server computing technique may be used, for example.
Although
The server 306 may support driving the electronic device 100a by performing at least one of the operations (or functions) implemented on the electronic device 100a. For example, the server 306 may include an event processing server module (not shown) that may support the event processing module 380 implemented in the electronic device 100a.
For example, the event processing server module may include at least one of the components of the event processing module 380 and perform (or instead perform) at least one of the operations (or functions) conducted by the event processing module 380.
The event processing module 380 may process at least part of information obtained from other elements (e.g., the processor 320, the memory 330, the input/output interface 350, or the communication interface 370) and may provide the same to the user in various manners.
Although in
Exemplary embodiments described herein are not meant to be limiting and are merely illustrative of various aspects of the invention. While exemplary embodiments may be indicated as applicable to a particular device category (e.g., head-mounted displays, etc.), the processes and examples provided are not intended to be solely limited to that device category and can be broadly applicable to various device categories (e.g., appliances, computers, automobiles, mobile phones, tablets, etc.).
According to an embodiment of the present invention, the electronic device 100 may include at least one of a touchscreen 420, a controller 430, a storage unit 440, or a communication unit 450. The touchscreen 420 may include a display panel 421 and/or a touch panel 422. The controller 430 may include at least one of an augmented reality mode processing unit 431, an event determining unit 432, an event information processing unit 433, or an application controller 434.
For example, when the electronic device 100 is mounted in a wearable device 410, the electronic device 100 may operate, e.g., as an HMD, and run an augmented reality mode. Further, according to an embodiment of the present invention, even when the electronic device 100 is not mounted in the wearable device 410, the electronic device 100 may run the augmented reality mode according to the user's settings or running an augmented reality mode related application. In the following embodiment, although the electronic device 100 is set to be mounted in the wearable device 410 to run the augmented reality mode, embodiments of the present invention are not limited thereto.
According to an embodiment of the present invention, when the electronic device 100 operates in the augmented reality mode (e.g., the electronic device 100 is mounted in the wearable device 410 to operate in a head mounted theater (HMT) mode), two screens corresponding to the user's eyes (left and right eye) may be displayed through the display panel 421.
According to an embodiment of the present invention, when the electronic device 100 is operated in the augmented reality mode, the controller 430 may perform control to process information related to an event generated while operating in the augmented reality mode to fit the augmented reality mode and display the processed information. According to an embodiment of the present invention, when the event generated while operating in the augmented reality mode is an event related to running an application, the controller 430 may block the running of the application or process the application to operate as a background process or application.
More specifically, according to an embodiment of the present invention, the controller 430 may include at least one of an augmented reality mode processing unit 431, an event determining unit 432, an event information processing unit 433, or an application controller 434 to perform functions according to various embodiments of the present invention. An embodiment of the present invention may be implemented to perform various operations or functions as described below using at least one component of the electronic device 100 (e.g., the touchscreen 420, controller 430, or storage unit 440).
According to an embodiment of the present invention, when the electronic device 100 is mounted in the wearable device 410 or the augmented reality mode is run according to the user's setting or as an augmented reality mode-related application runs, the augmented reality mode processing unit 431 may process various functions related to the operation of the augmented reality mode. The augmented reality mode processing unit 431 may load at least one augmented reality program 441 stored in the storage unit 440 to perform various functions.
The event determining unit 432 may determine an event generated while operating in the augmented reality mode by the augmented reality mode processing unit 431. Further, the event determining unit 432 may determine whether there is information to be displayed on the screen in relation with an event generated while operating in the augmented reality mode. Further, the event determining unit 432 may determine an application to be run in relation with an event generated while operating in the augmented reality mode. Various embodiments of an application related to the type of event are described below.
The event information processing unit 433 may process the event-related information to be displayed on the screen to fit the augmented reality mode when there is information to be displayed in relation with an event occurring while operating in the augmented reality mode, depending on the result of determination by the event determining unit 432. Various methods for processing the event-related information may apply. For example, when a three-dimensional (3D) image is implemented in the augmented reality mode, the electronic device 100 may convert the event-related information to fit the 3D image. For example, event-related information being displayed in two dimensions (2D) may be converted into left-eye and right-eye information corresponding to the 3D image, and the converted information may be synthesized and displayed on the screen of the augmented reality mode being currently run.
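As a concrete illustration of converting 2D event-related information into left-eye and right-eye information, the following minimal sketch places a 2D notification at a virtual depth and derives per-eye screen positions from a horizontal disparity. It is a sketch only, not the disclosure's actual implementation; the eye separation, focal length, and class names are illustrative assumptions.

```python
# Minimal sketch: anchor a 2D notification at a virtual depth and compute the
# left-eye and right-eye screen positions used when compositing it into the
# stereoscopic augmented reality view. Constants are assumed, not from the
# disclosure.
from dataclasses import dataclass

EYE_SEPARATION_M = 0.063   # assumed interpupillary distance in meters
FOCAL_LENGTH_PX = 800.0    # assumed virtual-camera focal length in pixels

@dataclass
class Notification2D:
    text: str
    x_px: float     # intended on-screen position (mono)
    y_px: float
    depth_m: float  # virtual depth at which to anchor the notification

def to_stereo(notification: Notification2D):
    """Return (left_eye_pos, right_eye_pos) screen positions in pixels."""
    # Horizontal disparity shrinks as the virtual depth grows.
    disparity_px = FOCAL_LENGTH_PX * EYE_SEPARATION_M / notification.depth_m
    left = (notification.x_px + disparity_px / 2.0, notification.y_px)
    right = (notification.x_px - disparity_px / 2.0, notification.y_px)
    return left, right

if __name__ == "__main__":
    note = Notification2D("New message", x_px=640.0, y_px=120.0, depth_m=1.5)
    print(to_stereo(note))  # two slightly offset copies, one per eye
```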
When it is determined by the event determining unit 432 that there is an application to be run in relation with the event occurring while operating in the augmented reality mode, the application controller 434 may perform control to block the running of the application related to the event. According to an embodiment of the present invention, when it is determined by the event determining unit 432 that there is an application to be run in relation with the event occurring while operating in the augmented reality mode, the application controller 434 may perform control so that the application is run in the background so as not to influence the running or screen display of the application corresponding to the augmented reality mode when the event-related application runs.
The storage unit 440 may store an augmented reality program 441. The augmented reality program 441 may be an application related to the augmented reality mode operation of the electronic device 100. The storage unit 440 may store the event-related information 442. The event determining unit 432 may reference the event-related information 442 stored in the storage unit 440 to determine whether the occurring event is displayed on the screen or identify information on the application to be run in relation with the occurring event.
The wearable device 410 may itself be an electronic device 100, or the wearable device 410 may be a wearable stand to which the electronic device 100 may be mounted. In case the wearable device 410 is an electronic device 100, when the electronic device 100 is mounted on the wearable device 410, various functions may be provided through the communication unit 450 of the electronic device 100. For example, when the electronic device 100 is mounted on the wearable device 410, the electronic device 100 may detect whether it is mounted on the wearable device 410 for communication with the wearable device 410 and may determine whether to operate in the augmented reality mode (or an HMT mode).
According to an embodiment of the present invention, upon failure to automatically determine whether the electronic device 100 is mounted on the wearable device 410, the user may apply various embodiments of the present invention by running the augmented reality program 441 or selecting the augmented reality mode (or the HMT mode). According to an embodiment of the present invention, when the wearable device 410 includes the functions of the electronic device 100, it may be implemented to automatically determine whether the electronic device 100 is mounted on the wearable device 410 and to enable the running mode of the electronic device 100 to automatically switch to the augmented reality mode (or the HMT mode).
At least some functions of the controller 430 shown in
Although in
According to an embodiment of the present invention, the electronic device 100 may be denoted as a first device (or a first electronic device), and the wearable device 410 may be denoted as a second device (or a second electronic device) for ease of description.
According to an embodiment of the present invention, an electronic device may comprise a display unit displaying a screen corresponding to an augmented reality mode and a controller performing control to detect an interrupt according to occurrence of at least one event, vary event-related information related to the event in a form corresponding to the augmented reality mode, and display the varied event-related information on the screen running in the augmented reality mode.
According to an embodiment of the present invention, the event may include any one or more selected from among a call reception event, a message reception event, an alarm notification, a scheduler notification, a wireless fidelity (Wi-Fi) connection, a Wi-Fi disconnection, a low battery notification, a data permission or use restriction notification, a no application response notification, or an abnormal application termination notification.
According to an embodiment of the present invention, the electronic device further comprises a storage unit storing the event-related information when the event is not an event to be displayed in the augmented reality mode, wherein the controller may perform control to display the event-related information stored in the storage unit when the electronic device switches from the augmented reality mode into a see-through mode.
According to an embodiment of the present invention, the electronic device may further comprise a storage unit storing information regarding at least one event to be displayed in the augmented reality mode.
According to an embodiment of the present invention, the event may include an instant message reception notification event.
According to an embodiment of the present invention, when the event is an event related to running at least one application, the controller may perform control to block running of the application according to occurrence of the event.
According to an embodiment of the present invention, the controller may perform control to run the blocked application when a screen mode of the electronic device switches from the augmented reality mode into a see-through mode.
According to an embodiment of the present invention, when the event is an event related to running at least one application, the controller may perform control to enable the application according to the occurrence of the event to be run on a background of a screen of the augmented reality mode.
According to an embodiment of the present invention, when the electronic device is connected with a wearable device, the controller may perform control to run the augmented reality mode.
According to an embodiment of the present invention, the controller may enable the event-related information to be arranged and processed to be displayed in a three-dimensional (3D) space of the augmented reality mode screen being displayed on a current screen.
According to an embodiment of the present invention, the electronic device may include additional sensors such as one or more RGB cameras, DVS cameras, 360-degree cameras, or a combination thereof.
The system operating system 510 may include at least one system resource manager or at least one device driver. The system resource manager may perform, e.g., control, allocation, or recovery of system resources, and the system resource manager may include at least one manager, such as a process manager, a memory manager, or a file system manager. The device driver may include at least one driver, such as, e.g., a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an inter-process communication (IPC) driver.
According to an embodiment of the present invention, the framework 520 (e.g., middleware) may provide, e.g., functions commonly required for the application or provide the application with various functions through the API to allow the application to efficiently use limited system resources inside the electronic device.
According to an embodiment of the present invention, the AR framework included in the framework 520 may control functions related to augmented reality mode operations on the electronic device. For example, according to running of an augmented reality mode operation, the AR framework 520 may control at least one AR application 551 related to augmented reality among applications 530 to provide the augmented reality mode on the electronic device.
The application 530 may include a plurality of applications and may include at least one AR application 551 running in the augmented reality mode and at least one normal application 552 running in a normal mode, but not the augmented reality mode.
According to an embodiment of the present invention, the application 530 may further include an AR control application 540. An operation of the at least one AR application 551 and/or at least one normal application 552 may be controlled under the control of the AR control application 540.
According to an embodiment of the present invention, when at least one event occurs while the electronic device operates in the augmented reality mode, the system operating system 510 may notify the framework 520 (e.g., the AR framework) of occurrence of the event.
The framework 520 may control the running of the normal application 552 so that event-related information may be displayed on the screen for the event occurring in the normal mode, but not in the augmented reality mode. When there is an application to be run in relation with the event occurring in the normal mode, the framework 520 may perform control to run at least one normal application 552.
According to an embodiment of the present invention, when an event occurs while operating in the augmented reality mode, the framework 520 (e.g., the AR framework) may block the operation of at least one normal application 552 to display the information related to the occurring event. The framework 520 may provide the event occurring while operating in the augmented reality mode to the AR control application 540.
The AR control application 540 may process the information related to the event occurring while operating in the augmented reality mode to fit the augmented reality mode. For example, 2D, planar event-related information may be processed into 3D information.
The AR control application 540 may control at least one AR application 551 currently running and may perform control to synthesize the processed event-related information with the running screen by the AR application 551 and display the result.
According to an embodiment of the present invention, when an event occurs while operating in the augmented reality mode, the framework 520 may perform control to block the running of at least one normal application 552 related to the occurring event.
According to an embodiment of the present invention, when an event occurs while operating in the augmented reality mode, the framework 520 may perform control to temporarily block the running of at least one normal application 552 related to the occurring event, and when the augmented reality mode terminates, to run the blocked normal application 552.
According to an embodiment of the present invention, when an event occurs while operating in the augmented reality mode, the framework 520 may control the running of at least one normal application 552 related to the occurring event so that the at least one normal application 552 related to the event operates on the background so as not to influence the screen by the AR application 551 currently running.
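The event routing described in the preceding paragraphs can be summarized with a small sketch. It assumes simplified stand-in classes (ArFramework, ArControlApp, NormalApp), which are illustrative names rather than the disclosure's actual components: while the augmented reality mode is active, normal applications related to an event are temporarily blocked and the event is forwarded to the AR control application; when the augmented reality mode ends, the blocked applications are run.

```python
# Sketch of AR-mode event routing under assumed class names.
class NormalApp:
    def __init__(self, name):
        self.name = name

    def run(self):
        print(f"running normal app: {self.name}")

class ArControlApp:
    def handle_event(self, event):
        # The AR control application reformats the event to fit the AR scene.
        print(f"compositing AR notification for event: {event}")

class ArFramework:
    def __init__(self):
        self.ar_mode = False
        self.ar_control = ArControlApp()
        self.blocked = []          # normal apps deferred during AR mode

    def on_event(self, event, related_app: NormalApp):
        if self.ar_mode:
            self.blocked.append(related_app)       # temporarily block the app
            self.ar_control.handle_event(event)    # show info inside AR scene
        else:
            related_app.run()                      # normal-mode behavior

    def exit_ar_mode(self):
        self.ar_mode = False
        for app in self.blocked:                   # run previously blocked apps
            app.run()
        self.blocked.clear()

framework = ArFramework()
framework.ar_mode = True
framework.on_event("incoming message", NormalApp("Messenger"))
framework.exit_ar_mode()
```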
The embodiment described in connection with
An XR video call may require more settings and controls for a phone user than a traditional video call. In an XR video call, both phone users and users of head-mounted XR devices (e.g., AR glasses) may be able to share XR objects. Transforming a shared 2D/3D XR object on a 2D touchscreen display may require on-screen UI such as bounding boxes, handles, buttons, and gizmos. Furthermore, in the XR video call, the phone user may move the camera around for view manipulation. These multiple controls may clutter the phone screen and require complex touch gestures. To address the aforementioned problems, the embodiments disclosed herein use multitouch gesture interpretation to enable users to easily share and transform XR objects in an XR video call.
In particular embodiments, an electronic device 100 may enable a user of the electronic device 100 to easily view a representation of a shared extended reality (XR) object or manipulate the object via a variety of gesture inputs in an XR space during an XR video call. As an example and not by way of limitation, the electronic device 100 may comprise a two-dimensional (2D) mobile device (e.g., mobile phone, tablet, etc.). The XR video call may be between the user of the electronic device 100 and other user(s) using head mounted XR devices. In the XR video call, all users may share XR objects comprising images, videos, 3D objects, and their physical environments. After sharing the XR objects, transforming (e.g., positioning, rotating, and scaling) a 3D object through a 2D touchscreen interface, or a 2D object in a 3D environment, may be a task with inherent complexity that typically requires on-screen user interface (UI), such as bounding boxes, handles, buttons, and gizmos. This complexity may compound when these same touch gestures also allow the user to reposition the view of the camera on the electronic device 100. To address this complex task, the embodiments disclosed herein effectively interpret multitouch gestures to enable different object transformations (e.g., translation, rotation, scaling) and different view transformations (e.g., translation, rotation, zooming) without the need for visible UI or mode switching, which may be a technical advantage of the embodiments disclosed herein. The embodiments disclosed herein may use “zooming” and “scaling” interchangeably. As an example and not by way of limitation, zooming in/out the view may be translating the user's view forward or backward, which enables the user to zoom their view/virtual-camera (changing the field of view). As another example and not by way of limitation, zooming an object may be scaling it up or down, e.g., scaling a user's representation in the XR video call up or down. The embodiments disclosed herein also provide a media sharing flow with simple touch gestures. The media sharing flow may change based on the number of participants in the XR video call. The gestures may reflect the user's intent as to whether they want to share a single media object or multiple media objects to collaborate with other users. The embodiments disclosed herein also provide additional touch controls to a user in a call space to help improve the manipulation experience for a shared workspace.
In particular embodiments, the electronic device 100 may render, on the one or more touchscreen displays, a first sequence of image frames of an extended reality (XR) video call. The first sequence of images may portray a shared XR space. In particular embodiments, the XR video call may be between a first user of the electronic device 100 and one or more second users. The electronic device 100 may then receive, via the one or more touchscreen displays, one or more gesture inputs associated with manipulating one or more parameters of the XR space during the XR video call. In particular embodiments, the electronic device 100 may determine, responsive to the one or more gesture inputs, one or more transformations within the XR space. The determination may be based on a gesture type associated with each of the one or more gesture inputs. In particular embodiments, the electronic device 100 may further render, on the one or more touchscreen displays, a second sequence of image frames of the XR video call. The second sequence of images may portray the one or more transformations to the XR space.
Certain technical challenges exist for manipulating parameters of an XR space during an XR video call. One technical challenge may include responding to a user's manipulations by accurately transforming the XR space. The solution presented by the embodiments disclosed herein to address this challenge may be determining whether the user's gesture inputs are intended to manipulate an XR object or a view of the XR space based on the gesture type of each gesture input, as each distinct gesture type corresponds to a particular user intent and a particular transformation of the XR space.
Certain embodiments disclosed herein may provide one or more technical advantages. A technical advantage of the embodiments may include enabling object manipulations and view manipulations without the need for visible UI or mode-switching. Another technical advantage of the embodiments may include a user interface allowing both 2D (e.g., mobile phone, tablet, etc.) and 3D (e.g., head mounted XR devices, etc.) users to browse for, and import, media (e.g., photos, videos, 3D objects, etc.) into the shared virtual space of the XR video call. Another technical advantage of the embodiments may include an easy media sharing flow with simple touch gestures, which reflect the user's intent as to whether they want to share a single media object or multiple media objects to collaborate with other users. Another technical advantage of the embodiments may include additional touch controls to help improve the manipulation experience for a shared workspace when the user is on a group XR video call. Certain embodiments disclosed herein may provide none, some, or all of the above technical advantages. One or more other technical advantages may be readily apparent to one skilled in the art in view of the figures, descriptions, and claims of the present disclosure.
In particular embodiments, manipulating the parameters of the XR space by the user may cause one or more transformations (changes) to the XR space. Determining the one or more transformations may comprise determining, based on the gesture type associated with each of the one or more gesture inputs, whether the one or more gesture inputs are intended to manipulate an XR object or a view of the XR space. Determining whether the user's gesture inputs are intended to manipulate an XR object or a view of the XR space based on the gesture type of each gesture input may be an effective solution for addressing the technical challenge of responding to a user's manipulations by accurately transforming the XR space, as each distinct gesture type corresponds to a particular user intent and a particular transformation of the XR space. In particular embodiments, manipulating the one or more parameters of the XR space may comprise manipulating one or more XR objects in the XR space. Correspondingly, manipulating the one or more XR objects may comprise one or more of adding the one or more XR objects to the XR space, removing the one or more XR objects from the XR space, translating the one or more XR objects, rotating the one or more XR objects, zooming in the one or more XR objects, or zooming out the one or more XR objects.
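One possible way to dispatch gesture types to object transformations versus view transformations is sketched below. The mapping mirrors the gesture examples given in this disclosure, but the enum values, the hit-test flag, and the returned labels are illustrative assumptions rather than a definitive implementation.

```python
# Sketch of gesture-type dispatch: the same gesture manipulates an XR object
# when it starts on one, and the view otherwise. Names are assumed.
from enum import Enum, auto

class GestureType(Enum):
    TAP = auto()
    DOUBLE_TAP = auto()
    ONE_FINGER_DRAG = auto()
    TWO_FINGER_DRAG = auto()
    TWO_FINGER_PINCH = auto()

def classify_transformation(gesture: GestureType, hit_object: bool) -> str:
    """Return a label describing the transformation implied by a gesture."""
    if hit_object:
        return {
            GestureType.ONE_FINGER_DRAG: "translate object along screen plane",
            GestureType.TWO_FINGER_DRAG: "translate/rotate/scale object",
            GestureType.TWO_FINGER_PINCH: "scale object (zoom in/out)",
            GestureType.DOUBLE_TAP: "activate isolation view",
        }.get(gesture, "no-op")
    return {
        GestureType.ONE_FINGER_DRAG: "rotate view around orbit point",
        GestureType.TWO_FINGER_DRAG: "translate view along camera plane",
        GestureType.TWO_FINGER_PINCH: "translate view forward/backward (zoom)",
        GestureType.DOUBLE_TAP: "translate view toward tapped point",
    }.get(gesture, "no-op")

print(classify_transformation(GestureType.TWO_FINGER_PINCH, hit_object=False))
```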
In particular embodiments, for the shared XR space, two touch gestures may convey the user's intent when sharing the media content. Tapping may allow the user to share a single media content item from the user's electronic device 100 (e.g., phone), which may replace the previously shared media content. Dragging with one single finger may allow the user to add the media content to the XR space while other media content remains in the XR space at the same time.
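A minimal sketch of this sharing intent follows, assuming a simple SharedSpace container (an illustrative name): a tap replaces the previously shared media content, while a one-finger drag adds the new content alongside what is already shared.

```python
# Sketch of tap-versus-drag sharing intent under assumed names.
class SharedSpace:
    def __init__(self):
        self.media = []

    def on_tap_share(self, item):
        self.media = [item]            # single share replaces previous content
        print(f"sharing only: {item}")

    def on_drag_share(self, item):
        self.media.append(item)        # drag adds to what is already shared
        print(f"added {item}; now sharing: {self.media}")

space = SharedSpace()
space.on_tap_share("photo_01")
space.on_drag_share("model_chair")
```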
In particular embodiments, the user may use different types of gesture inputs to manipulate objects. As an example and not by way of limitation, the user may drag an XR object with a single finger, which may translate the XR object along a plane parallel to the touchscreen of the electronic device 100. As another example and not by way of limitation, the user may rotate the XR object using two fingers. As yet another example and not by way of limitation, the user may pinch the XR object with two fingers to translate the XR object away from or toward the user along the touchscreen normal, or may achieve the same translation with a two-finger drag up or down without a pre-pinch. As yet another example and not by way of limitation, the user may drag the XR object with two fingers to transform the XR object along the plane parallel to the touchscreen. In particular embodiments, there may be a threshold the user has to cross for the electronic device 100 to determine whether it should treat the gesture as a two-finger forward/backward translation or as a two-finger translation along a plane parallel to the screen, followed by rotation and scaling. If, within this threshold, the two fingers remain about the same distance from each other, are moving in about the same direction, and that direction is almost entirely up or almost entirely down, the forward/backward translation from the user's view happens with the gesture (e.g., as illustrated in
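The two-finger disambiguation described above can be sketched as follows. The specific tolerances (relative change in finger spacing, direction similarity, and the fraction of motion that must be vertical) are assumptions for illustration; the disclosure only states that such a threshold exists.

```python
# Sketch of two-finger gesture disambiguation with assumed threshold values.
import math

SPACING_TOLERANCE = 0.15    # max relative change in finger spacing (assumed)
DIRECTION_TOLERANCE = 0.9   # min cosine similarity between finger motions
VERTICAL_RATIO = 0.9        # fraction of motion that must be vertical

def _cosine(u, v):
    dot = u[0] * v[0] + u[1] * v[1]
    norm = math.hypot(*u) * math.hypot(*v)
    return dot / norm if norm else 0.0

def classify_two_finger(start_a, start_b, end_a, end_b) -> str:
    """Decide between forward/backward translation and planar manipulation."""
    spacing_start = math.dist(start_a, start_b)
    spacing_end = math.dist(end_a, end_b)
    spacing_change = abs(spacing_end - spacing_start) / max(spacing_start, 1e-6)

    move_a = (end_a[0] - start_a[0], end_a[1] - start_a[1])
    move_b = (end_b[0] - start_b[0], end_b[1] - start_b[1])
    same_direction = _cosine(move_a, move_b) >= DIRECTION_TOLERANCE

    avg = ((move_a[0] + move_b[0]) / 2.0, (move_a[1] + move_b[1]) / 2.0)
    mostly_vertical = abs(avg[1]) >= VERTICAL_RATIO * math.hypot(*avg)

    if spacing_change <= SPACING_TOLERANCE and same_direction and mostly_vertical:
        return "translate object forward/backward"
    return "translate along screen plane, then rotate/scale"

# Two fingers dragged straight up with unchanged spacing -> forward/backward.
print(classify_two_finger((100, 400), (200, 400), (100, 300), (200, 300)))
```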
In particular embodiments, manipulating the one or more parameters of the XR space may comprise manipulating a view of the XR space. Correspondingly, manipulating the view of the XR space may comprise one or more of translating the view of the XR space, rotating the view of the XR space, zooming in the view of the XR space, or zooming out the view of the XR space. As an example and not by way of limitation, the user may double-tap the view with a single finger to translate forward toward the tapped point, i.e., zooming in the view. As another example and not by way of limitation, the user may drag the view with a single finger to rotate the view around the center of the orbit position. In particular embodiments, the orbit position may be set from a ray cast from the center of the camera to its intersection with an object in the scene. The orbit position may update on translate. As yet another example and not by way of limitation, the user may pinch out the view with two fingers to translate the view forward (i.e., zooming in). As yet another example and not by way of limitation, the user may pinch in the view with two fingers to translate the view backward (i.e., zooming out). As yet another example and not by way of limitation, the user may drag the view with two fingers to translate the view along the camera plane.
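The orbit-style view rotation can be sketched as below: a ray from the camera center selects the orbit point (here intersected against a single sphere standing in for scene geometry), and a one-finger horizontal drag rotates the camera position around that point about the vertical axis. The sphere-based scene, the degrees-per-pixel factor, and the fallback orbit distance are illustrative assumptions.

```python
# Sketch of orbit-point selection and one-finger orbit rotation (assumed scene).
import numpy as np

def pick_orbit_point(cam_pos, cam_forward, sphere_center, sphere_radius):
    """Intersect the center ray with a sphere; fall back to a point ahead."""
    oc = cam_pos - sphere_center
    b = 2.0 * np.dot(cam_forward, oc)
    c = np.dot(oc, oc) - sphere_radius ** 2
    disc = b * b - 4.0 * c
    if disc >= 0:
        t = (-b - np.sqrt(disc)) / 2.0
        if t > 0:
            return cam_pos + t * cam_forward
    return cam_pos + 2.0 * cam_forward   # default orbit distance (assumed)

def orbit_camera(cam_pos, orbit_point, drag_dx_px, degrees_per_px=0.2):
    """Rotate the camera position around the orbit point about the +Y axis."""
    angle = np.radians(drag_dx_px * degrees_per_px)
    rot = np.array([[np.cos(angle), 0.0, np.sin(angle)],
                    [0.0, 1.0, 0.0],
                    [-np.sin(angle), 0.0, np.cos(angle)]])
    return orbit_point + rot @ (cam_pos - orbit_point)

cam = np.array([0.0, 1.6, 0.0])
forward = np.array([0.0, 0.0, -1.0])
orbit = pick_orbit_point(cam, forward, np.array([0.0, 1.6, -3.0]), 0.5)
print(orbit_camera(cam, orbit, drag_dx_px=120))
```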
In particular embodiments, manipulating the one or more parameters of the XR space may comprise manipulating one or more XR objects in the XR space. In this case, the one or more gesture inputs may comprise a gesture input for locking the one or more XR objects to prohibit the one or more second users from manipulating the one or more XR objects. In the XR video call, any participant may lock or unlock an object at any time.
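A sketch of this lock behavior, using assumed class and user names, is shown below: any participant may lock or unlock a shared XR object, and transformation attempts from other participants are rejected while the lock is held.

```python
# Sketch of shared-object locking under assumed names.
class SharedXrObject:
    def __init__(self, name):
        self.name = name
        self.locked_by = None           # user id holding the lock, if any

    def lock(self, user):
        self.locked_by = user

    def unlock(self, user):
        # Per the description above, any participant may unlock at any time.
        self.locked_by = None

    def try_transform(self, user, transform):
        if self.locked_by is not None and self.locked_by != user:
            print(f"{user} blocked: {self.name} locked by {self.locked_by}")
            return False
        print(f"{user} applies {transform} to {self.name}")
        return True

chair = SharedXrObject("model_chair")
chair.lock("phone_user")
chair.try_transform("hmd_user", "rotate 30 deg")   # rejected while locked
chair.unlock("hmd_user")
chair.try_transform("hmd_user", "rotate 30 deg")   # allowed after unlock
```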
In particular embodiments, the one or more gesture inputs may comprise a gesture input for activating an isolation view. Accordingly, the second sequence of image frames of the XR video call may be rendered in the isolation view. In addition, the portrayed transformations to the XR space in the isolation view may not be visible to the one or more second users. An isolation view may allow for comfortable viewing of the media content without disrupting its placement for other users. A user may double tap on the shared media content with one single finger to activate this view.
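The isolation view can be sketched as a local copy of the shared state, as below. The CallSpace class and its fields are illustrative assumptions: while isolated, transformations are applied only to the local copy and are not broadcast to other participants.

```python
# Sketch of an isolation view as a local-only copy of the shared state.
import copy

class CallSpace:
    def __init__(self):
        self.shared_state = {"model_chair": {"scale": 1.0}}
        self.isolated_state = None      # local-only copy while isolated

    def on_double_tap(self, media_id):
        # Enter isolation view: work on a local copy of the shared state.
        self.isolated_state = copy.deepcopy(self.shared_state)

    def apply_transform(self, media_id, scale):
        if self.isolated_state is not None:
            self.isolated_state[media_id]["scale"] = scale
            print("local-only edit; others still see", self.shared_state)
        else:
            self.shared_state[media_id]["scale"] = scale
            print("broadcast to other participants")   # normal shared edit

space = CallSpace()
space.on_double_tap("model_chair")
space.apply_transform("model_chair", scale=2.0)
```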
As illustrated in
The method 2600 may begin at step 2610 with the one or more processing devices (e.g., the electronic device 100). For example, in particular embodiments, the electronic device 100 may render, on one or more touchscreen displays of the electronic device, a first sequence of image frames of an extended reality (XR) video call, wherein the first sequence of images portrays a shared XR space, wherein the XR video call is between a first user of the electronic device 100 and one or more second users, and wherein the XR space comprises one or more of a wall anchor for two-dimensional XR objects or a space anchor for three-dimensional XR objects. The method 2600 may then continue at step 2620 with the one or more processing devices (e.g., the electronic device 100). For example, in particular embodiments, the electronic device 100 may receive, via the one or more touchscreen displays, one or more gesture inputs associated with manipulating one or more parameters of the XR space during the XR video call, wherein manipulating the one or more parameters of the XR space comprises manipulating one or more XR objects in the XR space comprising one or more of adding the one or more XR objects to the XR space, removing the one or more XR objects from the XR space, translating the one or more XR objects, rotating the one or more XR objects, zooming in the one or more XR objects, or zooming out the one or more XR objects, wherein manipulating the one or more parameters of the XR space comprises manipulating a view of the XR space comprising one or more of translating the view of the XR space, rotating the view of the XR space, zooming in the view of the XR space, or zooming out the view of the XR space, wherein the one or more gesture inputs comprise a gesture input for locking the one or more XR objects to prohibit the one or more second users from manipulating the one or more XR objects, wherein the one or more gesture inputs comprise a gesture input for activating an isolation view, wherein the second sequence of image frames of the XR video call is rendered in the isolation view, and wherein the portrayed transformations to the XR space in the isolation view are not visible to the one or more second users. The method 2600 may then continue at step 2630 with the one or more processing devices (e.g., the electronic device 100). For example, in particular embodiments, the electronic device 100 may determine, responsive to the one or more gesture inputs, one or more transformations within the XR space, wherein the determination is based on a gesture type associated with each of the one or more gesture inputs, and wherein determining the one or more transformations comprises determining, based on the gesture type associated with each of the one or more gesture inputs, whether the one or more gesture inputs are intended to manipulate an XR object or a view of the XR space. The method 2600 may then continue at step 2640 with the one or more processing devices (e.g., the electronic device 100). For example, in particular embodiments, the electronic device 100 may render, on the one or more touchscreen displays, a second sequence of image frames of the XR video call, wherein the second sequence of images portrays the one or more transformations to the XR space. Particular embodiments may repeat one or more steps of the method of
This disclosure contemplates any suitable number of computer systems 2700. This disclosure contemplates computer system 2700 taking any suitable physical form. As example and not by way of limitation, computer system 2700 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (e.g., a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, a tablet computer system, an augmented/virtual reality device, or a combination of two or more of these. Where appropriate, computer system 2700 may include one or more computer systems 2700; be unitary or distributed; span multiple locations; span multiple machines; span multiple data centers; or reside in a cloud, which may include one or more cloud components in one or more networks.
Where appropriate, one or more computer systems 2700 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example, and not by way of limitation, one or more computer systems 2700 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 2700 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
In particular embodiments, computer system 2700 includes a processor 2702, memory 2704, storage 2706, an input/output (I/O) interface 2708, a communication interface 2710, and a bus 2712. Although this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement. In particular embodiments, processor 2702 includes hardware for executing instructions, such as those making up a computer program. As an example, and not by way of limitation, to execute instructions, processor 2702 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 2704, or storage 2706; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 2704, or storage 2706. In particular embodiments, processor 2702 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 2702 including any suitable number of any suitable internal caches, where appropriate. As an example, and not by way of limitation, processor 2702 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 2704 or storage 2706, and the instruction caches may speed up retrieval of those instructions by processor 2702.
Data in the data caches may be copies of data in memory 2704 or storage 2706 for instructions executing at processor 2702 to operate on; the results of previous instructions executed at processor 2702 for access by subsequent instructions executing at processor 2702 or for writing to memory 2704 or storage 2706; or other suitable data. The data caches may speed up read or write operations by processor 2702. The TLBs may speed up virtual-address translation for processor 2702. In particular embodiments, processor 2702 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 2702 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 2702 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 2702. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
In particular embodiments, memory 2704 includes main memory for storing instructions for processor 2702 to execute or data for processor 2702 to operate on. As an example, and not by way of limitation, computer system 2700 may load instructions from storage 2706 or another source (such as, for example, another computer system 2700) to memory 2704. Processor 2702 may then load the instructions from memory 2704 to an internal register or internal cache. To execute the instructions, processor 2702 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 2702 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 2702 may then write one or more of those results to memory 2704. In particular embodiments, processor 2702 executes only instructions in one or more internal registers or internal caches or in memory 2704 (as opposed to storage 2706 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 2704 (as opposed to storage 2706 or elsewhere).
One or more memory buses (which may each include an address bus and a data bus) may couple processor 2702 to memory 2704. Bus 2712 may include one or more memory buses, as described below. In particular embodiments, one or more memory management units (MMUs) reside between processor 2702 and memory 2704 and facilitate accesses to memory 2704 requested by processor 2702. In particular embodiments, memory 2704 includes random access memory (RAM). This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM. Memory 2704 may include one or more memory devices 2704, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
In particular embodiments, storage 2706 includes mass storage for data or instructions. As an example, and not by way of limitation, storage 2706 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 2706 may include removable or non-removable (or fixed) media, where appropriate. Storage 2706 may be internal or external to computer system 2700, where appropriate. In particular embodiments, storage 2706 is non-volatile, solid-state memory. In particular embodiments, storage 2706 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these. This disclosure contemplates mass storage 2706 taking any suitable physical form. Storage 2706 may include one or more storage control units facilitating communication between processor 2702 and storage 2706, where appropriate. Where appropriate, storage 2706 may include one or more storages 2706. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
In particular embodiments, I/O interface 2708 includes hardware, software, or both, providing one or more interfaces for communication between computer system 2700 and one or more I/O devices. Computer system 2700 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 2700. As an example, and not by way of limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 2708 for them. Where appropriate, I/O interface 2708 may include one or more device or software drivers enabling processor 2702 to drive one or more of these I/O devices. I/O interface 2708 may include one or more I/O interfaces 2708, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
In particular embodiments, communication interface 2710 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 2700 and one or more other computer systems 2700 or one or more networks. As an example, and not by way of limitation, communication interface 2710 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 2710 for it.
As an example, and not by way of limitation, computer system 2700 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 2700 may communicate with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these. Computer system 2700 may include any suitable communication interface 2710 for any of these networks, where appropriate. Communication interface 2710 may include one or more communication interfaces 2710, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.
In particular embodiments, bus 2712 includes hardware, software, or both coupling components of computer system 2700 to each other. As an example, and not by way of limitation, bus 2712 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these. Bus 2712 may include one or more buses 2712, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.
Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context.
Herein, “automatically” and its derivatives means “without human intervention,” unless expressly indicated otherwise or indicated otherwise by context.
The embodiments disclosed herein are only examples, and the scope of this disclosure is not limited to them. Embodiments according to the invention are in particular disclosed in the attached claims directed to a method, a storage medium, a system and a computer program product, wherein any feature mentioned in one claim category, e.g. method, can be claimed in another claim category, e.g. system, as well. The dependencies or references back in the attached claims are chosen for formal reasons only. However, any subject matter resulting from a deliberate reference back to any previous claims (in particular multiple dependencies) can be claimed as well, so that any combination of claims and the features thereof are disclosed and can be claimed regardless of the dependencies chosen in the attached claims. The subject-matter which can be claimed comprises not only the combinations of features as set out in the attached claims but also any other combination of features in the claims, wherein each feature mentioned in the claims can be combined with any other feature or combination of other features in the claims. Furthermore, any of the embodiments and features described or depicted herein can be claimed in a separate claim and/or in any combination with any embodiment or feature described or depicted herein or with any of the features of the attached claims.
The scope of this disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of this disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, features, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, features, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative. Additionally, although this disclosure describes or illustrates particular embodiments as providing particular advantages, particular embodiments may provide none, some, or all of these advantages.
This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 63/139,218, filed 19 Jan. 2021, which is incorporated herein by reference.