This application relates to the field of projection technologies, and in particular, to an application window projection method and an electronic device.
Projection is an emerging technology that enables an electronic device to project a display interface of its display onto another electronic device for display. For example, a movie on a mobile phone can be played on a television, or a picture in a tablet computer can be displayed on a television.
For example, a mobile phone projects a screen onto a PC. Generally, a process in which the mobile phone projects the screen onto the PC includes: A user triggers a projection operation on the mobile phone. The mobile phone establishes a projection connection to the PC. The mobile phone sends a current interface to the PC. In this way, both the PC and the mobile phone display the current interface. For example, as shown in
An objective of this application is to provide an application window projection method and an electronic device, to improve projection experience.
According to a first aspect, an application window projection method is provided. The method is applicable to a system including a first electronic device and a second electronic device. The method includes: A first electronic device establishes a connection to a second electronic device. The first electronic device projects a first window of an application onto the second electronic device. The second electronic device displays the first window. The first electronic device displays a second window of the application.
In other words, a projection granularity may be each window of the application. For example, a MeeTime application includes a plurality of windows. The first electronic device (for example, a mobile phone) may project some or all windows of the MeeTime application onto the second electronic device (for example, a PC). For example, the mobile phone projects a screen onto the PC. For example, the first window is a “Contacts” window of the MeeTime application, and the second window is a chat window for chatting with a contact A. A user can view the “Contacts” window on the PC and view the chat window for chatting with the contact A on the mobile phone, without a need to switch back and forth between different windows of the application on the mobile phone.
It should be noted that, in this embodiment of this application, when the first electronic device projects the first window onto the second electronic device, the first electronic device can display the second window. In other words, a step of projecting the first window of the application by the first electronic device onto the second electronic device and a step of displaying the second window of the application by the first electronic device may be performed simultaneously, or certainly may not be performed simultaneously.
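The sequence in the first aspect — connect, project the first window, and display the second window locally — can be sketched as a minimal model of the two devices. The class and method names below are purely illustrative assumptions, not part of any real projection framework:

```python
class Device:
    """A minimal model of a projection endpoint."""

    def __init__(self, name):
        self.name = name
        self.shown = []   # windows currently displayed on this device
        self.peer = None  # the device at the other end of the connection

    def connect(self, other):
        # The first electronic device establishes a connection to the second.
        self.peer, other.peer = other, self

    def project(self, window):
        # Projecting a window sends it to the connected device for display.
        self.peer.shown.append(window)

    def display(self, window):
        # Displaying a window shows it on this device's own display.
        self.shown.append(window)

phone, pc = Device("first device"), Device("second device")
phone.connect(pc)
phone.project("first window (Contacts)")  # shown on the second device
phone.display("second window (chat)")     # shown on the first device
assert pc.shown == ["first window (Contacts)"]
assert phone.shown == ["second window (chat)"]
```

As the model shows, the projection step and the local display step are independent of each other, which is why they may, but need not, be performed simultaneously.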
In a possible design, the second window is a window previous to the first window, a default window of the application, or a specified window of the application.
In other words, when the first electronic device projects the first window onto the second electronic device, the first electronic device displays the window previous to the first window, namely, the second window, or displays the default window or the specified window. The default window is a window that is set by default in an electronic device system. The specified window is a window set by the user. The window previous to the first window is the window that was open immediately before the first window was opened.
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device for display includes: In response to an operation for opening the first window, the first electronic device projects the first window onto the second electronic device for display.
In other words, when detecting the operation for opening the first window, the first electronic device automatically projects the first window onto the second electronic device for display, and the first electronic device may not display the first window. In this manner, fast and efficient projection is implemented.
In a possible design, the operation for opening the first window includes an operation for opening an image and/or an operation for playing a video file.
In other words, when detecting the operation for opening the image, the first electronic device projects the image onto the second electronic device for display. Alternatively, when detecting an operation for playing a video, the first electronic device projects the video onto the second electronic device for display. For example, the mobile phone projects the screen onto the PC. Generally, the small display of the mobile phone provides a poor viewing experience for images and videos. In the manner provided in this application, when an image or a video is opened on the mobile phone, the image or the video can be directly projected onto the PC for display, to implement more convenient and efficient projection.
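The routing rule described here — project automatically when the opened content is an image or a video and a projection connection exists — can be sketched as a small decision function. The function name, content-type labels, and return strings are assumptions for illustration only:

```python
# Content types that trigger automatic projection when opened.
AUTO_PROJECT_TYPES = {"image", "video"}

def route_open_operation(content_type, connected):
    """Decide where a newly opened window is displayed.

    If the user opens an image or plays a video while a projection
    connection to the second device exists, the window is projected;
    otherwise it is shown on the first device's own display.
    """
    if connected and content_type in AUTO_PROJECT_TYPES:
        return "project to second device"
    return "display on first device"

assert route_open_operation("image", connected=True) == "project to second device"
assert route_open_operation("video", connected=True) == "project to second device"
assert route_open_operation("text", connected=True) == "display on first device"
assert route_open_operation("image", connected=False) == "display on first device"
```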
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device for display includes: In response to an operation for opening the first window, the first electronic device displays the first window on a display of the first electronic device. In response to an operation for projecting the first window, the first electronic device projects the first window onto the second electronic device for display.
In other words, in response to the operation for opening the first window, the first electronic device first displays the first window. In response to the operation for projecting the first window, the first electronic device projects the first window onto the second electronic device, and the first electronic device closes the first window and displays the second window. For example, the mobile phone projects the screen onto the PC. The first window is first displayed on the mobile phone, and the user projects the first window as required. This helps prevent a window including user privacy from being disclosed to others due to automatic projection.
In a possible design, before the first electronic device projects a first window of an application onto the second electronic device, the method further includes: The first electronic device displays the first window and the second window. That the first electronic device displays a second window of the application includes: The first electronic device displays the second window in full screen.
In other words, when the first electronic device displays the first window and the second window in split screens, if the first electronic device projects the first window onto the second electronic device, the first electronic device can display the second window in full screen. In this way, a plurality of windows of the application can be distributed and displayed on different devices, to avoid a case in which the plurality of windows of the application cannot be properly viewed because the display of the first electronic device is small.
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device for display includes: In response to a first operation, the first electronic device displays a plurality of identifiers of a plurality of devices onto which a screen is to be projected. In response to an operation for selecting an identifier of the second electronic device from the plurality of identifiers, the first electronic device projects the first window onto the second electronic device for display. In other words, the user can select, from the plurality of devices onto which a screen is to be projected, a second electronic device onto which the user wants to project a screen, and then project the first window onto the second electronic device. This provides good user experience.
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device for display includes: The first electronic device displays a plurality of candidate windows of the application in response to a second operation. In response to an operation for selecting the first window from the plurality of candidate windows, the first electronic device projects the first window onto the second electronic device for display. In other words, the user can select, from the plurality of candidate windows of the application, a first window that the user wants to project. This provides good experience.
In a possible design, the candidate window includes:
It should be noted that the foregoing several candidate windows are merely examples, and do not constitute a limitation. Another type of window can also be used as a candidate window. This is not limited in this embodiment of this application.
In a possible design, the method further includes: The first electronic device opens a third window of the application. The first electronic device projects the third window onto the second electronic device. The second electronic device displays the third window. The third window on the second electronic device completely or incompletely covers the first window.
In other words, when opening a new window (namely, the third window) of the application, the first electronic device can further project the new window onto the second electronic device, that is, the projection granularity is each window of the application. Therefore, the second electronic device displays two windows, namely, the third window and the first window. The third window may completely cover the first window (that is, the first window is hidden). Alternatively, the third window may not completely cover the first window, for example, cover a part of the first window. Alternatively, the third window does not cover the first window at all (that is, the plurality of windows are displayed in split screens or simultaneously displayed).
In a possible design, the method further includes: In response to a first instruction used to return the first window to the first electronic device for display, the first electronic device displays the first window and the second electronic device closes the first window. In other words, the first window that is projected onto the second electronic device can be returned to the first electronic device for display. This provides a flexible operation and good user experience.
In a possible design, the first instruction used to return the first window to the first electronic device for display includes: an operation instruction that is detected on the second electronic device and that is used to close the first window on the second electronic device, or an operation instruction that is detected on the first electronic device and that is used to open the first window on the first electronic device. In other words, when closing the first window on the second electronic device, the user can return the first window to the first electronic device for display. Alternatively, when opening the first window on the first electronic device, the user can return the first window to the first electronic device for display.
In a possible design, the application further includes a fourth window. The method further includes: An operation for opening the second window is detected on the second electronic device. The second electronic device sends a second instruction to the first electronic device in response to the operation for opening the second window. The first electronic device projects the second window onto the second electronic device in response to the received second instruction. The second electronic device displays the second window. The second window on the second electronic device completely or incompletely covers the first window. The first electronic device displays the fourth window. In other words, the first electronic device can successively project two windows of the application onto the second electronic device, and the two windows can be displayed on the second electronic device in an overlapping manner or in a non-overlapping manner.
In a possible design, the application further includes a fourth window. The system further includes a third electronic device. The method further includes: The first electronic device is connected to the third electronic device. The first electronic device projects the second window onto the third electronic device. The third electronic device displays the second window. The first electronic device displays the fourth window of the application. In other words, the first electronic device can project the first window of the application onto the second electronic device, and project the second window of the application onto the third electronic device. In this way, the plurality of windows of the application can be distributed and displayed on a plurality of electronic devices, to facilitate viewing by the user.
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device includes: The first electronic device creates the first window in a first task stack. A window in the first task stack is configured to be projected onto the second electronic device for display. The first task stack is different from a second task stack in which the second window is located. A window in the second task stack is configured to be displayed on a display of the first electronic device.
It should be noted that, in this embodiment of this application, the first window and the second window of the application are located in different task stacks. The window in the second task stack is displayed on the first electronic device. The window in the first task stack is projected onto the second electronic device for display. In other words, stack splitting processing is performed on different windows of the application, to achieve display effect of different windows on different devices.
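The stack-splitting idea — two task stacks, each bound to a different display target — can be sketched as follows. The data layout, function name, and display labels are hypothetical, chosen only to mirror the description above:

```python
# Each task stack is tagged with the display it renders to: the first
# task stack renders to the (projected) second-device display, and the
# second task stack renders to the first device's own display.
first_task_stack = {"target": "second-device display", "windows": []}
second_task_stack = {"target": "first-device display", "windows": []}

def open_window(window, projected):
    """Create a window in the task stack that matches its display target."""
    stack = first_task_stack if projected else second_task_stack
    stack["windows"].append(window)
    return stack["target"]

# The first window is created in the first task stack and projected;
# the second window stays in the second task stack on the local display.
assert open_window("first window", projected=True) == "second-device display"
assert open_window("second window", projected=False) == "first-device display"
```

Because each window is routed by the stack it lives in, moving a window between devices reduces to moving it between task stacks.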
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device includes: The first electronic device creates the first window in a second task stack in response to the operation for opening the first window. A window in the second task stack is configured to be displayed on a display of the first electronic device. In response to an operation for projecting the first window, the first electronic device creates the first window in a first task stack, and closes the first window in the second task stack. A window in the first task stack is configured to be projected onto the second electronic device for display.
It should be noted that, in this embodiment of this application, a same task stack (for example, the second task stack) of an application may include two windows, for example, the first window and the second window. If the first electronic device wants to project the first window onto the second electronic device for display, stack splitting processing may be performed on the first window, so that the first window is located in the first task stack, to project the first window onto the second electronic device. In other words, different windows originally in a same task stack are located in different task stacks, to implement display effect on different devices. For example, the first window located in the first task stack is projected onto the second electronic device for display. The second window still in the second task stack is displayed on the first electronic device.
In a possible design, the method further includes: In response to an operation for returning the first window to the first electronic device for display, the first electronic device determines, based on an original stack marker of the first task stack, that an original task stack of the first window is the second task stack. The first electronic device creates the first window in the second task stack. The first electronic device closes the first window in the first task stack. In other words, if the first electronic device needs to return the projected first window to the first electronic device for display, the first electronic device can return the first window in the first task stack to the second task stack in which the first window is originally located, so that the first window is displayed on the first electronic device.
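The return flow based on the original stack marker can be sketched with a minimal model: the projection stack records the identifier of the stack the window originally came from, and returning the window recreates it there. All names below are illustrative assumptions:

```python
# The first (projection) task stack carries an original-stack marker
# pointing back to the second (local) task stack.
stacks = {
    "first": {"origin": "second", "windows": ["first window"]},  # projected
    "second": {"origin": None, "windows": ["second window"]},    # local
}

def return_window(window):
    """Recreate the window in its original stack and close the projected copy."""
    origin = stacks["first"]["origin"]         # read the original-stack marker
    stacks["first"]["windows"].remove(window)  # close in the projection stack
    stacks[origin]["windows"].append(window)   # recreate in the original stack
    return origin

assert return_window("first window") == "second"
assert stacks["second"]["windows"] == ["second window", "first window"]
assert stacks["first"]["windows"] == []
```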
In a possible design, that the first electronic device projects a first window of an application onto the second electronic device includes: The first electronic device displays a plurality of task stacks of the application in response to a preset operation. In response to an operation for selecting a third task stack from the plurality of task stacks, the first electronic device projects all windows in the third task stack onto the second electronic device. In other words, the first electronic device can project all windows in a task stack onto the second electronic device, that is, a task stack of an application is used as a projection granularity.
According to a second aspect, an application window projection method is provided. The method is applicable to a first electronic device. The first electronic device includes an application. The application includes a first window and a second window. The method includes:
The first electronic device is connected to the second electronic device.
The first electronic device projects the first window onto the second electronic device for display.
The first electronic device displays the second window.
In a possible design, the second window is a window previous to the first window, a default window of the application, or a specified window of the application.
In a possible design, that the first electronic device projects the first window of the application onto the second electronic device for display includes: In response to an operation for opening the first window, the first electronic device projects the first window onto the second electronic device for display.
In a possible design, the operation for opening the first window includes an operation for opening an image and/or an operation for playing a video file.
In a possible design, that the first electronic device projects the first window of the application onto the second electronic device for display includes: The first electronic device displays the first window in response to an operation for opening the first window. In response to an operation for projecting the first window, the first electronic device projects the first window onto the second electronic device for display.
In a possible design, before the first electronic device projects the first window onto the second electronic device, the method further includes: The first electronic device displays the first window and the second window. That the first electronic device displays the second window includes: The first electronic device displays the second window in full screen.
In a possible design, that the first electronic device projects the first window of the application onto the second electronic device for display includes: In response to a first operation, the first electronic device displays a plurality of identifiers of a plurality of devices onto which a screen is to be projected. In response to an operation for selecting an identifier of the second electronic device from the plurality of identifiers, the first electronic device projects the first window onto the second electronic device for display.
In a possible design, that the first electronic device projects the first window of the application onto the second electronic device for display includes: The first electronic device displays a plurality of candidate windows of the application in response to a second operation. In response to an operation for selecting the first window from the plurality of candidate windows, the first electronic device projects the first window onto the second electronic device for display.
In a possible design, the candidate window includes:
In a possible design, in response to a first instruction used to return the first window to the first electronic device for display, the first electronic device displays the first window, and controls the second electronic device to close the first window.
In a possible design, the first instruction includes a first instruction received from the second electronic device. The first instruction is an operation instruction that is detected on the second electronic device and that is used to close the first window on the second electronic device, or an operation instruction that is detected on the first electronic device and that is used to open the first window on the first electronic device.
In a possible design, the application further includes a fourth window. A system further includes a third electronic device. The method further includes: The first electronic device is connected to the third electronic device. The first electronic device projects the second window onto the third electronic device. The first electronic device displays the fourth window.
In a possible design, that the first electronic device projects the first window onto the second electronic device for display includes: The first electronic device creates the first window of the application in a first task stack. A window in the first task stack is configured to be projected onto a display of the second electronic device for display. The first task stack is different from a second task stack in which the second window is located. A window in the second task stack is configured to be displayed on a display of the first electronic device.
In a possible design, that the first electronic device projects the first window onto the second electronic device for display includes: The first electronic device creates the first window in a second task stack in response to the operation for opening the first window. A window in the second task stack is configured to be displayed on a display of the first electronic device. In response to an operation for projecting the first window, the first electronic device creates the first window in a first task stack and closes the first window in the second task stack. A window in the first task stack is configured to be projected onto a display of the second electronic device for display.
In a possible design, the method further includes: In response to an operation for returning the first window to the first electronic device for display, the first electronic device determines, based on an original stack marker of the first task stack, that an original task stack of the first window is the second task stack. The first electronic device creates the first window in the second task stack. The first electronic device closes the first window in the first task stack.
In a possible design, that the first electronic device projects the first window of the application onto the second electronic device includes: The first electronic device displays a plurality of task stacks of the application in response to a preset operation. In response to an operation for selecting a third task stack from the plurality of task stacks, the first electronic device projects all windows in the third task stack onto the second electronic device.
According to a third aspect, an application window projection method is further provided. The method is applicable to a second electronic device. The method includes:
The second electronic device is connected to a first electronic device.
The second electronic device receives first display information sent by the first electronic device, and displays a first window based on the first display information.
The second electronic device receives second display information sent by the first electronic device, and displays a second window based on the second display information. The second window is a window displayed on the first electronic device when the second electronic device displays the first window. The second window and the first window come from a same application on the first electronic device.
The second window completely or incompletely covers the first window.
In a possible design, the method further includes: The second electronic device receives a first instruction used to return the first window to the first electronic device, and closes the first window. The second electronic device sends a second instruction to the first electronic device. The second instruction is used to instruct the first electronic device to display the first window.
In a possible design, the first instruction used to return the first window to the first electronic device includes: an operation instruction that is detected on the second electronic device and that is used to close the first window on the second electronic device, or the first instruction received from the first electronic device. The first instruction is used to instruct to open the first window on the first electronic device.
According to a fourth aspect, an electronic device is provided, including a display, one or more processors, a memory, and one or more programs. The one or more programs are stored in the memory. The one or more programs include instructions. When the instructions are executed by the electronic device, the electronic device is enabled to perform the method according to any one of the first aspect to the third aspect.
According to a fifth aspect, an embodiment of this application further provides an electronic device. The electronic device includes a module/unit for performing the method according to any one of the first aspect to the third aspect. The module/unit may be implemented by using hardware, or may be implemented by hardware executing corresponding software.
According to a sixth aspect, an embodiment of this application further provides a chip. The chip is coupled to a memory in an electronic device, and is configured to invoke a computer program stored in the memory and perform the technical solutions according to any one of the first aspect to the third aspect of embodiments of this application. “Coupling” in this embodiment of this application means that two components are directly or indirectly combined with each other.
According to a seventh aspect, a computer-readable storage medium is further provided. The computer-readable storage medium includes a computer program. When the computer program is run on an electronic device, the electronic device is enabled to perform the technical solutions according to any one of the first aspect to the third aspect.
According to an eighth aspect, a program product is further provided, including instructions. When the instructions are run on a computer, the computer is enabled to perform the technical solutions according to any one of the first aspect to the third aspect.
According to a ninth aspect, a system is further provided, including a first electronic device and a second electronic device. The first electronic device is configured to perform the method steps according to the second aspect, and the second electronic device is configured to perform the method steps according to the third aspect.
According to a tenth aspect, a graphical user interface on an electronic device is further provided. The electronic device has a display, one or more memories, and one or more processors. The one or more processors are configured to execute one or more computer programs stored in the one or more memories. The graphical user interface includes a graphical user interface displayed when the electronic device performs the technical solutions according to any one of the first aspect to the third aspect.
For advantageous effect of the second aspect to the tenth aspect, refer to advantageous effect of the first aspect. Details are not described again.
The following first describes some terms in embodiments of this application.
(1) Application (application, referred to as an app): An application is a software program that can implement one or more specific functions. Generally, a plurality of applications such as an instant messaging application, a video application, an audio application, and an image shooting application may be installed in an electronic device. The instant messaging application may include, for example, Messaging, MeeTime, WeChat (WeChat), WhatsApp Messenger, Line (Line), Instagram (Instagram), Kakao Talk, and DingTalk. The image shooting application may include, for example, a camera application (a system camera or a third-party camera application). The video application may include, for example, YouTube, Twitter, TikTok, iQIYI, and Tencent Video. The audio application may include, for example, Google Music, KuGou, Xiami Music, and QQ Music. An application mentioned in the following embodiments may be an application installed when the electronic device is delivered from the factory, or may be an application downloaded by a user from a network or obtained by the user from another electronic device during use of the electronic device.
(2) Application window (Window): The window is a display area of an application. Each application window corresponds to an activity (Activity) object, and all windows are presented through View (View). One application may correspond to a plurality of windows. Different windows may display different content of the application.
(3) Task (task) and stack (stack): The task is a set of a plurality of activities, and may also be understood as a container used to place the activity. The stack is a set of a plurality of tasks. Generally, the stack and the task are in a one-to-one relationship or a one-to-many relationship. Certainly, this is not limited in this application. For ease of description, this application is described by using an example in which the stack and the task are in the one-to-one relationship. To be specific, one stack includes one task, and tasks in different stacks are different. In this case, the task and/or the stack may be briefly referred to as a task stack.
In the task stack, activities are placed in a “first in, last out” form. “First in, last out” may be understood as follows: A task stack is created, and a first activity is placed in the task stack. The first activity is located at the top layer and is displayed in the foreground. Subsequently, if a second activity is created in the task stack, because the first activity is pushed into the stack before the second activity, the second activity is placed on top of the first activity, that is, the second activity is located at an upper layer of the first activity. If the first activity and the second activity are displayed at a same location, the second activity covers the first activity in the foreground. After the second activity is moved out of the task stack (briefly referred to as popping the stack), the first activity is located at the top layer of the task stack, so the first activity is displayed in the foreground and the second activity is no longer displayed.
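The “first in, last out” behavior described above can be modeled with a short sketch. The class and method names are illustrative, not part of any real windowing framework:

```python
class TaskStack:
    """A minimal model of a task stack: activities are pushed on top,
    and the top-most activity is the one shown in the foreground."""

    def __init__(self):
        self._activities = []  # bottom ... top

    def push(self, activity):
        # A newly created activity is placed at the top layer of the stack.
        self._activities.append(activity)

    def pop(self):
        # Moving an activity out of the stack removes the top-most one.
        return self._activities.pop()

    def foreground(self):
        # The activity at the top layer is the one displayed in the foreground.
        return self._activities[-1] if self._activities else None

stack = TaskStack()
stack.push("first activity")
stack.push("second activity")  # covers the first activity in the foreground
assert stack.foreground() == "second activity"
stack.pop()                    # the second activity leaves the stack
assert stack.foreground() == "first activity"
```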
(4) At least one: In embodiments of this application, the at least one means one or more, and a plurality of means two or more. In addition, it should be understood that, in the descriptions of this application, terms such as “first” and “second” are only for distinction and description, but cannot be understood as indicating or implying relative importance, or as indicating or implying an order.
An application window projection method provided in this application is applicable to a projection scenario. Generally, the projection scenario includes a transmitter and a receiver. The transmitter sends display information to the receiver for display. For ease of description, the transmitter is referred to as a primary device (or a source device), and the receiver is referred to as a target device in this application. The primary device may be a mobile phone, a tablet computer, a PC, a watch, or the like. The target device may be a large-screen device such as a mobile phone, a tablet computer, a PC, or a television. It may be understood that, in addition to the primary device and the target device, the transmitter and the receiver may have other names. For example, the transmitter is an active projection device, and the receiver is a passive projection device. Alternatively, the transmitter is a first electronic device, and the receiver is a second electronic device. This is not limited in this application. Roles of the transmitter and the receiver are interchangeable, that is, the transmitter may project a screen onto the receiver, and correspondingly, the receiver may also project a screen onto the transmitter.
Currently, common projection technologies include same-source projection and different-source projection.
Different-source projection differs from same-source projection. Simply put, in a different-source projection scenario, when the primary device projects a screen onto the target device, the display information projected from the primary device onto the target device may not be synchronized with the display information in the foreground of the primary device.
Specifically,
In the application window projection method provided in this application, a finer granularity may be used for projection, for example, one window of an application is used as a projection granularity. For example, the primary device is the mobile phone. An application on the mobile phone may include a plurality of windows. The mobile phone may project all or some windows in the plurality of windows onto the PC.
The following describes a projection scenario provided in embodiments of this application.
For example, the mobile phone projects a screen onto the PC. For example, refer to
In the foregoing example, that the mobile phone projects the screen onto the PC is used as an example for description. Simply put, one primary device projects a screen onto one target device. This application is further applicable to a scenario in which one primary device projects a screen onto a plurality of target devices. For example, the mobile phone projects a screen onto both the PC and the television. For example, still refer to
It should be noted that
The following describes an electronic device.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a neural-network processing unit (neural-network processing unit, NPU), and/or the like. Different processing units may be independent components, or may be integrated into one or more processors. The controller may be a nerve center and a command center of the electronic device 100. The controller may generate an operation control signal based on instruction operation code and a time sequence signal, to complete control of instruction reading and instruction execution. A memory may be further disposed in the processor 110, and is configured to store instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may store instructions or data just used or cyclically used by the processor 110. If the processor 110 needs to use the instructions or the data again, the processor 110 may directly invoke the instructions or the data from the memory. This avoids repeated access, reduces waiting time of the processor 110, and improves system efficiency.
The USB interface 130 is an interface that conforms to a USB standard specification, and may be specifically a mini USB interface, a micro USB interface, a USB type-C interface, or the like. The USB interface 130 may be configured to connect to a charger to charge the electronic device 100, or may be configured to transmit data between the electronic device 100 and a peripheral device. The charging management module 140 is configured to receive a charging input from the charger. The power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110. The power management module 141 receives an input from the battery 142 and/or the charging management module 140, to supply power to the processor 110, the internal memory 121, an external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
A wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like. The antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be configured to cover one or more communication frequency bands. Different antennas may be further multiplexed, to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide a wireless communication solution that includes 2G/3G/4G/5G or the like and that is applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (low noise amplifier, LNA), and the like. The mobile communication module 150 may receive an electromagnetic wave through the antenna 1, perform processing such as filtering and amplification on the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may further amplify a signal modulated by the modem processor, and convert an amplified signal into an electromagnetic wave through the antenna 1 for radiation. In some embodiments, at least some functional modules in the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules in the mobile communication module 150 and at least some modules of the processor 110 may be disposed in a same component.
The wireless communication module 160 may provide a wireless communication solution that includes a wireless local area network (wireless local area network, WLAN) (such as, a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT), a global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), a near field communication (near field communication, NFC) technology, an infrared (infrared, IR) technology, or the like and that is applied to the electronic device 100. The wireless communication module 160 may be one or more components integrating at least one communication processor module. The wireless communication module 160 receives an electromagnetic wave through the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signal, and sends a processed signal to the processor 110. The wireless communication module 160 may further receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on the signal, and convert a processed signal into an electromagnetic wave through the antenna 2 for radiation.
In some embodiments, in the electronic device 100, the antenna 1 and the mobile communication module 150 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the electronic device 100 can communicate with a network and another device by using a wireless communication technology. The wireless communication technology may include a global system for mobile communications (global system for mobile communications, GSM), a general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-CDMA), long term evolution (long term evolution, LTE), BT, a GNSS, a WLAN, NFC, FM, an IR technology, and/or the like. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite based augmentation system (satellite based augmentation system, SBAS).
The display 194 is configured to display a display interface of an application, for example, a viewfinder interface of a camera application. The display 194 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (quantum dot light-emitting diode, QLED), or the like. In some embodiments, the electronic device 100 may include one or N displays 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is configured to process data fed back by the camera 193. For example, during photographing, a shutter is pressed, and light is transmitted to a photosensitive element of the camera through a lens. An optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to convert the electrical signal into a visible image. The ISP may further perform algorithm optimization on noise, brightness, and complexion of an image. The ISP may further optimize parameters such as exposure and a color temperature of a photographing scenario. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is configured to capture a static image or a video. An optical image of an object is generated through a lens, and is projected onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts an optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert the electrical signal into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format, for example, RGB or YUV. In some embodiments, the electronic device 100 may include one or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is configured to process a digital signal, and may process another digital signal in addition to the digital image signal. For example, when the electronic device 100 performs frequency selection, the digital signal processor is configured to perform Fourier transform on frequency energy.
The video codec is configured to compress or decompress a digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in a plurality of encoding formats, for example, moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (neural-network, NN) computing processor, quickly processes input information by referring to a structure of a biological neural network, for example, by referring to a mode of transfer between human brain neurons, and may further continuously perform self-learning. The NPU can implement applications such as intelligent cognition of the electronic device 100, for example, image recognition, facial recognition, speech recognition, and text understanding.
The internal memory 121 may be configured to store computer-executable program code. The executable program code includes instructions. The processor 110 runs the instructions stored in the internal memory 121 to perform various function applications of the electronic device 100 and data processing. The internal memory 121 may include a program storage area and a data storage area. An operating system, software code of at least one application (for example, an iQIYI application or a WeChat application), and the like may be stored in the program storage area. Data (for example, a photographed image or a recorded video) generated during use of the electronic device 100 may be stored in the data storage area. In addition, the internal memory 121 may include a high-speed random access memory, or may include a nonvolatile memory, for example, at least one magnetic disk storage device, a flash memory, or a universal flash storage (universal flash storage, UFS).
The external memory interface 120 may be configured to connect to an external storage card, for example, a micro-SD card, to extend a storage capability of the electronic device. The external storage card communicates with the processor 110 through the external memory interface 120, to implement a data storage function. For example, files such as a picture or a video are stored in the external storage card.
The electronic device 100 may implement audio functions such as music playing and recording by using the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, an optical proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is configured to sense a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed in the display 194. The gyroscope sensor 180B may be configured to determine a motion posture of the electronic device 100. In some embodiments, angular velocities of the electronic device 100 around three axes (namely, axes x, y, and z) may be determined by using the gyroscope sensor 180B.
The gyroscope sensor 180B may be configured to implement image stabilization during photographing. The barometric pressure sensor 180C is configured to measure barometric pressure. In some embodiments, the electronic device 100 calculates an altitude based on the barometric pressure measured by the barometric pressure sensor 180C, to assist in positioning and navigation. The magnetic sensor 180D includes a Hall effect sensor. In some embodiments, when the electronic device 100 is a clamshell phone, the electronic device 100 may detect opening and closing of a flip cover by using the magnetic sensor 180D. Further, a feature such as automatic unlocking upon opening of the flip cover is set based on a detected opening or closing state of the flip cover. The acceleration sensor 180E may detect accelerations in various directions (usually on three axes) of the electronic device 100. When the electronic device 100 is stationary, the acceleration sensor 180E may detect a magnitude and a direction of gravity. The acceleration sensor 180E may be further configured to identify a posture of the electronic device 100, and is used in an application such as switching between a landscape mode and a portrait mode or a pedometer.
The distance sensor 180F is configured to measure a distance. The electronic device 100 may measure a distance by infrared or laser light. In some embodiments, in a photographing scenario, the electronic device 100 may measure a distance by using the distance sensor 180F to implement quick focusing. The optical proximity sensor 180G may include, for example, a light-emitting diode (LED) and an optical detector such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light by using the light-emitting diode. The electronic device 100 detects infrared reflected light from a nearby object by using the photodiode. When sufficient reflected light is detected, the electronic device 100 may determine that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 may detect, by using the optical proximity sensor 180G, that the user holds the electronic device 100 close to an ear to make a call, to automatically perform screen-off for power saving. The optical proximity sensor 180G may also be used in a smart cover mode or a pocket mode to automatically perform screen unlocking or locking.
The ambient light sensor 180L is configured to sense ambient light brightness. The electronic device 100 may adaptively adjust brightness of the display 194 based on the sensed ambient light brightness. The ambient light sensor 180L may also be configured to automatically adjust white balance during photographing. The ambient light sensor 180L may also cooperate with the optical proximity sensor 180G to detect whether the electronic device 100 is in a pocket, to avoid an accidental touch. The fingerprint sensor 180H is configured to collect a fingerprint. The electronic device 100 may implement fingerprint unlocking, access to an application lock, fingerprint-based photographing, fingerprint-based call answering, and the like based on features of the collected fingerprint.
The temperature sensor 180J is configured to detect a temperature. In some embodiments, the electronic device 100 executes a temperature processing policy based on the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 lowers performance of a processor near the temperature sensor 180J, to reduce power consumption for thermal protection. In some other embodiments, when the temperature is less than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from being powered off abnormally due to a low temperature. In some other embodiments, when the temperature is less than still another threshold, the electronic device 100 boosts an output voltage of the battery 142 to avoid abnormal shutdown caused by a low temperature.
The touch sensor 180K is also referred to as a “touch panel”. The touch sensor 180K may be disposed on the display 194, and the touch sensor 180K and the display 194 form a touchscreen, which is also referred to as a “touch screen”. The touch sensor 180K is configured to detect a touch operation performed on or near the touch sensor. The touch sensor may transfer the detected touch operation to the application processor to determine a type of the touch event. A visual output related to the touch operation may be provided on the display 194. In some other embodiments, the touch sensor 180K may alternatively be disposed on a surface of the electronic device 100, and a location of the touch sensor 180K is different from a location of the display 194.
The bone conduction sensor 180M may obtain a vibration signal. In some embodiments, the bone conduction sensor 180M may obtain a vibration signal of a vibration bone of a human vocal-cord part. The bone conduction sensor 180M may also be in contact with a body pulse to receive a blood pressure beating signal.
The button 190 includes a power button, a volume button, and the like. The button 190 may be a mechanical button, or may be a touch button. The electronic device 100 may receive a button input, and generate a button signal input related to user setting and function control of the electronic device 100. The motor 191 may generate a vibration prompt. The motor 191 may be configured to provide an incoming call vibration prompt and a touch vibration feedback. For example, touch operations performed on different applications (for example, a photographing application and an audio playing application) may correspond to different vibration feedback effects. Touch vibration feedback effect may be further customized. The indicator 192 may be an indicator light, and may be configured to indicate a charging state and a power change, or may be configured to indicate a message, a missed call, a notification, and the like. The SIM card interface 195 is configured to connect to a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195, to implement contact with or separation from the electronic device 100.
It may be understood that the components shown in
A left half part of
The application layer includes an application module and a projection management module. The application module includes various types of applications such as a video playing application and an image shooting application. The projection management module is configured to manage projection. For example, the projection management module includes a device connection module. The device connection module may send, to a cross-device management module at the framework layer, an instruction used to control a connection between the primary device and a target device.
The framework layer includes a basic framework module, the cross-device management module, and a data management module.
The basic framework module includes an event management module, a window display module, a life cycle module, and a task stack management module. The event management module is responsible for sensing an event, for example, receiving an input event reported by the driver layer, such as an input event used to project a window of an application.
The window display module is configured to manage a display manner of a window of an application, including coordinates and a size of a displayed window, a window display level, and the like.
The life cycle module is configured to manage a life cycle of a window. For example, the life cycle module may activate a window through an onResume interface, and the activated window is displayed in the foreground. Alternatively, the life cycle module pauses a window through an onPause interface, so that the window is switched from the foreground to the background. A top-level window (top activity) in a task stack is used as an example. If the top activity is activated by invoking onResume, the top activity is displayed in the foreground. When a new window is pushed into the task stack and located at an upper layer of the original top activity, the original top activity is paused by invoking onPause, so that the original top activity is switched from the foreground to the background. A life cycle of the new window can be managed according to a similar principle.
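The life cycle transitions described above can be sketched as follows. This is an illustrative model with hypothetical names, not the real framework interfaces: when a new window is pushed onto the task stack, the original top window is paused (switched to the background) and the new window is resumed (displayed in the foreground).

```python
class Window:
    """Conceptual window whose state is driven by life cycle callbacks."""

    def __init__(self, name):
        self.name = name
        self.state = "background"

    def on_resume(self):
        # Stands in for the onResume interface: window enters the foreground.
        self.state = "foreground"

    def on_pause(self):
        # Stands in for the onPause interface: window leaves the foreground.
        self.state = "background"

def push_window(task_stack, new_window):
    """Push a window onto the stack, pausing the original top window."""
    if task_stack:
        task_stack[-1].on_pause()  # original top activity: foreground -> background
    task_stack.append(new_window)
    new_window.on_resume()         # new top activity: displayed in the foreground

stack = []
push_window(stack, Window("first_window"))
push_window(stack, Window("second_window"))
assert stack[0].state == "background"
assert stack[1].state == "foreground"
```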
The task stack management module is configured to create and close a task stack.
The cross-device management module is responsible for controlling the driver layer to implement functions such as proximity discovery, authentication, and connection between the primary device and the target device.
The data management module is responsible for transmitting data streams such as audio, a video, and a layer between the primary device and the target device, and can also be responsible for implementing functions such as controlling a reverse event (namely, an event triggered by the target device to control the primary device).
The driver layer includes a bottom-layer driver, and is responsible for work such as discovery, authentication, and a connection. For example, the driver layer receives a command delivered by the cross-device management module in the framework layer, and performs actions such as the connection and a disconnection. Specifically, the driver layer includes a device discovery module, a device authentication module, and a device connection module. The device discovery module is responsible for device discovery. The device authentication module is responsible for device authentication. The device connection module is responsible for a device connection. Certainly, the driver layer may further include a hardware driver, for example, a display driver.
A right half part of
With reference to
A display of the primary device displays a first window of an application. When receiving a hardware interrupt (for example, an operation of the user for opening a second window of the application on a touchscreen), the driver layer of the primary device reports an input event corresponding to the hardware interrupt to the application at the application layer. When it is determined, based on the input event, that the second window of the application is opened, the second window of the application is displayed on the display.
When receiving a hardware interrupt (for example, an operation of the user for tapping a projection icon on a touchscreen), the driver layer of the primary device reports an input event corresponding to the hardware interrupt to the application. The application triggers the cross-device management module by using the projection management module at the application layer, so that the cross-device management module sends an instruction to the driver layer, to control the driver layer to implement authentication, a connection, and the like between the primary device and the target device.
After determining that the primary device is successfully connected to the target device, the driver layer of the primary device sends an instruction to the event management module in the basic framework module at the framework layer. The instruction is used to instruct projection of the second window of the application. The event management module sends an instruction to the task stack management module, to implement stack splitting processing on the second window and the first window by using the task stack management module (specific content is described later). The second window on which stack splitting processing is performed is sent to the target device by using the data management module for display. After the task stack management module completes stack splitting processing on the second window and the first window, the first window is located at the top layer, may be activated by using the life cycle module, and is displayed by using the window display module, that is, the first window is switched from the background to the foreground for display.
The following uses an example in which the primary device is a mobile phone and the target device is a PC or a television, to describe in detail an application window projection method provided in embodiments of this application with reference to the accompanying drawings.
In the application window projection method provided in embodiments of this application, a projection granularity may be each window of an application. For example, an application on the mobile phone includes a plurality of windows. The mobile phone may project at least one window in the plurality of windows onto the PC. Another window in the plurality of windows is displayed on the mobile phone. Specifically, the application window projection method provided in this application includes Embodiment 1 to Embodiment 4.
Embodiment 1 provides an application window projection method. Specifically, a window A of an application is displayed in the foreground of a mobile phone. In response to a user operation, the window A of the application is switched to a window B of the application in the foreground of the mobile phone. In this case, if a user wants to project the window B onto a PC, the user may trigger a projection operation, so that the mobile phone establishes a projection connection to the PC. The mobile phone projects the window B onto the PC. Because the window B is projected, the window B may not be displayed in the foreground of the mobile phone, but another window of the application, for example, the window A, is displayed. In other words, when the window B is projected by the mobile phone, the window B is switched to the window A in the foreground of the mobile phone.
A browser application on the mobile phone is used as an example.
As shown in
The following uses a flowchart shown in
As shown in
S1: The mobile phone receives a first operation for opening the first window of the application.
For example, refer to
S2: The mobile phone creates a first task stack, and creates the first window in the first task stack.
Refer to
S3: The mobile phone displays the first window of the application on the display.
For example, refer to
S4: The mobile phone receives a second operation for opening the second window of the application.
S5: The mobile phone creates the second window in the first task stack.
Still refer to
S6: The first window is switched to the second window in the foreground of the mobile phone.
Refer to
In this case, if the second window needs to be projected onto a target device, the mobile phone may continue to perform S7 to S15.
S7: The mobile phone receives a third operation for projecting the second window.
The third operation may be an operation for sliding upward from a lower edge of the display, an operation for sliding rightward from a left edge of the display, an operation for sliding leftward from a right edge of the display, an operation for sliding downward from an upper edge of the display, or the like.
Alternatively, the third operation includes an operation for sliding upward from a lower edge of the display and an operation for tapping a projection icon on a slide-up menu (a slide-up menu displayed in response to the operation for sliding upward). Alternatively, the third operation includes an operation for sliding downward from a status bar on the top of the screen of the mobile phone, and an operation for tapping a “wireless projection” icon on a notification panel (a notification panel displayed in response to the operation for sliding downward).
Alternatively, the third operation may be an operation for opening a multi-device control center, for example, an operation for sliding vertically upward from a lower left edge or a lower right edge of the mobile phone, to open the multi-device control center. The multi-device control center displays an icon of at least one device onto which a screen can be projected, so that the user can select the target device.
The foregoing is an example of the third operation, but does not constitute a limitation on the third operation. Alternatively, the third operation may be another type of operation, for example, a voice instruction used to project the second window. Specific examples are not provided one by one in this embodiment of this application.
If the mobile phone currently does not establish a projection connection to the target device, the mobile phone may first connect to the target device, and then project the second window onto the target device. A process of connecting the mobile phone to the target device includes S8 to S10.
S8: The mobile phone displays a device list.
S9: The mobile phone selects the target device.
For example, the mobile phone receives a tapping operation on an icon of a device in the device list. In response to the tapping operation, the mobile phone determines that a device corresponding to the tapped icon is the target device.
S10: The mobile phone establishes the projection connection to the target device.
In the foregoing example, when detecting the third operation, the mobile phone opens the device list, selects the target device from the device list, and connects to the target device. Optionally, in addition to the foregoing manner, there may be another manner for establishing the connection to the target device. For example, a connection through a QR code may be used: A multi-screen collaboration interface on the mobile phone is opened, a QR code connection option is tapped on the multi-screen collaboration interface, and a QR code on the PC is scanned to complete the connection. For another example, a "OneHop" connection may alternatively be used: An NFC function on the mobile phone is enabled, and an NFC area on the mobile phone touches an NFC area on a keyboard of the PC, to establish a near field communication connection between the mobile phone and the PC. For another example, a proximity connection may alternatively be used: A proximity connection function on the PC is enabled. After a Bluetooth function on the mobile phone is enabled, when the mobile phone approaches the PC, the PC discovers the mobile phone, and displays prompt information for confirming whether to connect to the mobile phone. When receiving a confirmation instruction, the PC completes the connection to the mobile phone.
The mobile phone may perform stack splitting processing on the first window and the second window of the application. The second window on which stack splitting processing is performed is projected onto the target device. Specifically, this may be implemented by using the following steps S11 to S15.
S11: The mobile phone creates a virtual display.
The virtual display is defined relative to the real display (visible to the user) of the mobile phone: the virtual display is invisible to the user. The virtual display may have various display parameters such as a resolution and a display size. The display parameters of the virtual display may be consistent with those of the real display of the mobile phone. Alternatively, because the objective of creating the virtual display after the primary device establishes the projection connection to the target device is to enable a window on the virtual display to be displayed on a display of the target device, the display parameters of the virtual display may be consistent with those of the display of the target device. For example, the size of the virtual display is consistent with that of the display of the target device. In this case, content displayed on the virtual display is suitable for the display of the target device.
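The parameter matching described above can be sketched as follows. This is a minimal Python model, not the actual implementation; the `Display` class and its fields are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Display:
    width: int
    height: int
    dpi: int

def create_virtual_display(local: Display, target: Optional[Display]) -> Display:
    """Create a user-invisible virtual display.

    If a target device is already connected, copy its display
    parameters so that projected content fits the target screen;
    otherwise fall back to the parameters of the real display.
    """
    source = target if target is not None else local
    return Display(source.width, source.height, source.dpi)

# The virtual display matches the PC's screen rather than the
# phone's screen, so the projected window suits the PC display.
phone = Display(1080, 2340, 440)
pc = Display(1920, 1080, 96)
vd = create_virtual_display(phone, pc)
```

On a real Android device the analogous step would use a platform API such as `DisplayManager.createVirtualDisplay`; the sketch only models the parameter-selection decision.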
S12: The mobile phone creates a second task stack. The second task stack is displayed on the virtual display. The mobile phone moves the second window from the first task stack to the second task stack.
A sequence of performing S11 and S12 is not limited. For example, S11 may be performed before S12, that is, the virtual display is created after a connection to the target device is established. Alternatively, S12 may be performed before S11.
That the second window is moved from the original first task stack to the second task stack specifically includes: creating the second task stack, creating the second window in the second task stack, and then closing the second window in the first task stack. As shown in
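The stack splitting in S12 (create the second window in the second task stack, then close it in the first task stack) can be modeled as follows. This is a simplified sketch; the `TaskStack` abstraction and window names are assumptions for illustration:

```python
class TaskStack:
    def __init__(self, stack_id, display):
        self.stack_id = stack_id
        self.display = display          # "real" or "virtual"
        self.windows = []               # window names, top of stack last

def move_window(window, src: TaskStack, dst: TaskStack):
    """Move a window between stacks: create it in the destination
    stack first, then close it in the source stack."""
    dst.windows.append(window)          # create in second task stack
    src.windows.remove(window)          # close in first task stack

# The first task stack on the real display holds both windows;
# the second window is split off into a stack on the virtual display.
first = TaskStack(1, "real")
first.windows = ["first_window", "second_window"]
second = TaskStack(2, "virtual")
move_window("second_window", first, second)
# first.windows -> ["first_window"]; second.windows -> ["second_window"]
```

After the move, the real display shows only the first window, while the content of the virtual display (the second window) is what gets sent to the target device in S14.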
S13: The foreground of the mobile phone is switched from the second window to the first window.
S14: The mobile phone sends display information of the second window in the second task stack to the target device.
S15: The target device displays the second window of the application.
It should be noted that a sequence of performing S13 to S15 is not limited. For example, S13 and S14 may be performed simultaneously.
For example, refer to
It should be noted that, in the foregoing embodiment, an example in which the second window completely covers the first window when the mobile phone opens the second window is used for description. It may be understood that a display location of the second window may be different from a display location of the first window. For example, the second window and the first window are displayed in split screens, as shown in
It should be noted that, in the case of split-screen display, to help the mobile phone determine which window the user wants to project, when detecting that the first window is selected, the mobile phone determines that the user wants to project the first window, and when detecting that the second window is selected, the mobile phone determines that the user wants to project the second window. An operation for selecting a window may be double-tapping a window, tapping a window three consecutive times, touching and holding a window, or the like. For example, still refer to
In the foregoing embodiment, when the second window is projected, the second window is switched to the first window in the foreground of the mobile phone. The first window is a previous window opened before the second window is opened. It may be understood that, when the second window is projected, another window of the application, for example, a default window, may further be displayed in the foreground of the mobile phone. The default window may be, for example, a home page window of the application, or a window specified by the user.
To sum up, in Embodiment 1, when the mobile phone displays a window of an application, in response to a projection operation of the user, after completing a projection connection to the PC, the mobile phone projects a window displayed in the foreground of the mobile phone onto the PC, and another window of the application is displayed in the foreground of the mobile phone. That is, a projection granularity is the window of the application.
Embodiment 2 provides an application window projection method. Specifically, when the mobile phone establishes the projection connection to the target device, and the mobile phone detects an operation for opening a window C of the application, the window C is displayed on the mobile phone. When an operation for projecting the window C is detected, the window C is projected onto the target device. For ease of description, in Embodiment 2, an example in which the mobile phone currently establishes the projection connection to only one target device (for example, the PC) is used for description.
The following describes an example scenario of Embodiment 2.
It is assumed that the mobile phone establishes the projection connection to the PC, projects the second window of the application (for example, a MeeTime application) displayed in the foreground of the mobile phone onto the PC, and displays the first window of the application (for example, the MeeTime application). For example, the first window is a chat window for chatting with a contact A in the MeeTime application. Refer to
The following uses
S16: The mobile phone detects a fourth operation for opening the third window of the application.
For an implementation of the fourth operation, refer to the description of the implementation of the second operation in Embodiment 1. Details are not described again.
S17: The mobile phone creates the third window in the first task stack.
Refer to
S18: The foreground of the mobile phone is switched from the first window to the third window.
For example, refer to
S19: The mobile phone detects a fifth operation for projecting the third window.
For example, the fifth operation may be a hold and drag operation on the third window. Alternatively, the fifth operation may be a hold and drag operation on the third window, and the drag operation ends at an upper/lower/left/right edge of the display. Alternatively, the fifth operation is an operation for tapping an icon of a target device in a multi-device center.
S20: The mobile phone creates a third task stack. The third task stack is displayed on the virtual display. The mobile phone moves the third window from the first task stack to the third task stack.
Refer to
S21: The foreground of the mobile phone is switched from the third window to the first window.
S22: The mobile phone sends display information of the third window to the target device.
S23: The target device displays the third window.
Refer to
Because the mobile phone projects the second window before the third window, the PC includes two windows projected by the mobile phone. For example, as shown in
In the foregoing example, an example in which the mobile phone establishes the projection connection to only one target device (namely, the PC) is used for description. Therefore, in the example of
Because there may be more than one target device currently connected to the mobile phone, when detecting an operation for projecting a window, the mobile phone needs to determine a target device onto which the window is projected. For example, refer to
When the mobile phone establishes a projection connection to a first target device, the mobile phone may also project the window onto a second target device that is not connected to the mobile phone. For example, when detecting an operation for invoking the device list, the mobile phone displays the device list. The device list includes a target device that is currently not connected, for selection by the user. The operation for invoking the device list may be, for example, the third operation in Embodiment 1. Details are not repeated herein.
In Embodiment 2, when the mobile phone establishes the projection connection to the target device, and the mobile phone detects the operation for opening the window C of the application, the window C is displayed on the mobile phone. When the operation for projecting the window C is detected, the window C is projected onto the target device. Different from Embodiment 2, in Embodiment 3, when the mobile phone establishes the projection connection to the target device, and the mobile phone detects the operation for opening the window C of the application, the mobile phone may not open the window C, but directly project the window C onto the target device for display.
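The difference between the two manners can be sketched as a dispatch decision. This is illustrative pseudologic in Python; the manner names and function are assumptions, not the actual implementation:

```python
FIRST_MANNER = "open-then-project"   # Embodiment 2
SECOND_MANNER = "direct-project"     # Embodiment 3

def open_window(window, connected, manner):
    """Decide where a newly opened window is displayed.

    In the first manner, the window is shown on the phone first and
    projected only after an explicit projection operation. In the
    second manner, if a projection connection exists, the window is
    sent straight to the target device without appearing on the phone.
    """
    if connected and manner == SECOND_MANNER:
        return f"{window} displayed on target device"
    return f"{window} displayed on mobile phone"

# With an active projection connection, the second manner routes the
# newly opened window C directly to the target device.
result = open_window("window_C", True, SECOND_MANNER)
```

Without a projection connection, both manners behave identically: the window opens on the mobile phone.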
The following uses
The following describes an example scenario of Embodiment 3.
It is assumed that the mobile phone establishes the projection connection to the PC, projects the second window of the application (for example, the MeeTime application) displayed in the foreground of the mobile phone onto the PC, and displays the first window of the application (for example, the MeeTime application). For example, the first window is the chat window for chatting with the contact A. Refer to (a) in
It should be noted that an example in which the mobile phone projects the image onto the target device for display when detecting that the user wants to open an image is used in
Optionally, if the projection manner in Embodiment 2 is used as a first projection manner, and the projection manner in Embodiment 3 is used as a second projection manner, the mobile phone may use the first projection manner or the second projection manner by default. Certainly, the user may also select a projection manner. For example, a projection manner switching button is set on the mobile phone, and switching between the first projection manner and the second projection manner is implemented by using the button.
In addition to the first projection manner and the second projection manner, Embodiment 4 provides a third projection manner. Different from the first projection manner and the second projection manner, in the third projection manner, the mobile phone may display a candidate window of the application. There may be a plurality of candidate windows, so that the user selects a to-be-projected window.
The candidate window may be all windows opened after the application is started. The browser application is used as an example. After the browser application is started, the home page window, the window of the news A, and the window of the news B are opened. In this case, the candidate window includes the three windows.
Alternatively, the candidate window may be a preset fixed window. A phone application is used as an example. The candidate window may include a “Contacts” window, a “Calls” window, a “Dialer” window, and a “MeeTime” window.
Alternatively, the candidate window may be a window specified by the user. The MeeTime application is used as an example. The user may set a chat window for chatting with a contact as the candidate window, and may further specify a group chat window as the candidate window.
It should be noted that the foregoing is an example of the candidate window, and the candidate window may alternatively be set in another manner. For example, the candidate window may be a window of the application whose opening frequency is greater than a preset frequency and whose quantity of opening times is greater than a preset quantity, a window of the application whose foreground running duration is longer than a preset duration, or the like. Examples are not provided one by one in this embodiment of this application.
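The usage-based criteria above can be expressed as a simple filter. This is a sketch; the thresholds, record fields, and sample data are illustrative assumptions:

```python
def select_candidates(windows, min_open_count=5, min_foreground_s=60):
    """Pick candidate windows: those opened more than a preset
    quantity of times, or kept in the foreground longer than a
    preset duration."""
    return [
        w["name"] for w in windows
        if w["open_count"] > min_open_count
        or w["foreground_s"] > min_foreground_s
    ]

# Hypothetical usage records for three windows of one application.
usage = [
    {"name": "Contacts", "open_count": 12, "foreground_s": 30},
    {"name": "Settings", "open_count": 2, "foreground_s": 10},
    {"name": "Chat-A", "open_count": 3, "foreground_s": 300},
]
candidates = select_candidates(usage)
# candidates -> ["Contacts", "Chat-A"]
```

In practice the thresholds would be preset values on the device, and the records would come from the system's usage statistics rather than a hard-coded list.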
Optionally, candidate windows of each application may be different. The candidate windows of each application may be the windows enumerated above.
A manner of displaying the candidate window of the application by the mobile phone may be: displaying the candidate window of the application when a preset operation for invoking the candidate window of the application is detected.
An operation type of the preset operation is not limited in this application. For example, the preset operation may be double tapping or holding down a home button, may be an operation for sliding upward from a lower part of a screen and pausing to enter a multitask management interface, where the multitask management interface includes the candidate window, or may be double tapping or holding down the projection icon (for example, in a pull-up menu). Alternatively, after the mobile phone establishes a connection to the target device, a candidate window button is displayed in the foreground of the mobile phone, and the candidate window is invoked when a tap operation on the button is detected.
It may be understood that, when a window of an application A is displayed in the foreground of the mobile phone, a candidate window of the application A or a candidate window of each of all currently running applications (for example, the application A and an application B) may be invoked by using the foregoing preset operation.
For example, refer to
Optionally, in
In an example shown in
It should be noted that when the window C is moved from the first task stack to the second task stack, an original task stack marker may be set for the second task stack. The original task stack marker represents the original task stack to which the window in the second task stack belongs. That is, the original task stack marker set for the second task stack is a marker of the first task stack, for example, an ID of the first task stack. The task stack marker is set so that the window in the second task stack can be easily returned to the original first task stack when the projection process ends. Similarly, when the window D is moved from the first task stack to the third task stack, the original task stack marker set for the third task stack is also the ID of the first task stack.
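The marker-based restoration can be sketched as follows. This is a minimal model under assumed names; it only illustrates how recording the original stack ID lets each window return home when projection ends:

```python
class TaskStack:
    def __init__(self, stack_id, original_stack_id=None):
        self.stack_id = stack_id
        # Marker recording the stack the window originally belonged to.
        self.original_stack_id = original_stack_id
        self.windows = []

def split_for_projection(window, src, new_stack_id):
    """Move a window into a new stack created for the virtual display,
    tagging the new stack with the ID of the original stack."""
    dst = TaskStack(new_stack_id, original_stack_id=src.stack_id)
    dst.windows.append(window)
    src.windows.remove(window)
    return dst

def restore(dst, stacks_by_id):
    """When projection ends, return the windows to the original stack
    by looking up the original task stack marker."""
    src = stacks_by_id[dst.original_stack_id]
    src.windows.extend(dst.windows)
    dst.windows.clear()

first = TaskStack(1)
first.windows = ["window_C", "window_D"]
second = split_for_projection("window_C", first, 2)
third = split_for_projection("window_D", first, 3)
# Both new stacks carry the marker of the first task stack (ID 1),
# so both windows can be returned to it when projection ends.
restore(second, {1: first})
restore(third, {1: first})
```

After restoration, the first task stack again contains both windows and the projection stacks are empty, matching the stack restoration principle described below.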
For example, in
Optionally, in
Optionally, a window projected onto the PC may be closed. For example, refer to
Optionally, when the window C is closed on the PC, the window C may further return to the mobile phone for display. For example, still refer to
That the window C returns to the mobile phone for display can be implemented according to a stack restoration principle. Specifically, refer to
In a first manner, when the window C is closed on the PC, the window C returns to the mobile phone for display. In a second manner, when the window C is closed on the PC, the window C does not return to the mobile phone for display. The mobile phone may use the first manner or the second manner by default. Alternatively, the user may select the first manner or the second manner. For example, the mobile phone or the PC includes a switching button, and the first manner or the second manner is selected by using the switching button.
Optionally, after the window C is projected onto the PC, when detecting an operation for opening the window C, the mobile phone may also pull back the window C on the PC. For example, when detecting the operation for opening the window C, the mobile phone sends an instruction to the PC. The instruction is used to instruct the PC to close the window C. Therefore, the window C is closed on the PC, and the window C is displayed on the mobile phone, as shown in
Optionally, when detecting an operation for opening the window C, the mobile phone may not pull back the window C on the PC.
Optionally, in
In a first manner, when the window C is opened on the mobile phone, the window C on the PC is pulled back to the mobile phone. In a second manner, when the window C is opened on the mobile phone, the window C on the PC is not pulled back to the mobile phone. The mobile phone may use the first manner or the second manner by default. Alternatively, the user may select the first manner or the second manner. For example, the mobile phone or the PC includes a switching button, and the first manner or the second manner is selected by using the switching button.
Optionally, the user may open a new window of the application on the PC. For example, refer to
Specifically, for an implementation principle of opening the window E of the application on the PC, refer to
Based on the foregoing embodiments, an embodiment of this application further provides an electronic device. The electronic device may be the primary device or the target device. As shown in
The display 1201 is configured to display a related user interface such as an interface, an image, and a video of an application. The memory 1203 stores one or more computer programs. The one or more computer programs include instructions. The processor 1202 invokes the instructions stored in the memory 1203, so that the electronic device 1200 performs the application window projection method provided in embodiments of this application.
In the foregoing embodiments provided in this application, the method provided in embodiments of this application is described from a perspective in which an electronic device serves as an execution body. To implement the functions in the foregoing method provided in embodiments of this application, the electronic device may include a hardware structure and/or a software module, and implement the foregoing functions by using the hardware structure, the software module, or a combination of the hardware structure and the software module. Whether a function in the foregoing functions is performed by using the hardware structure, the software module, or the combination of the hardware structure and the software module depends on particular applications and design constraints of the technical solutions.
According to the context, the term “when” used in the foregoing embodiments may be interpreted as a meaning of “if”, “after”, “in response to determining”, or “in response to detecting”. Similarly, according to the context, the phrase “when it is determined that” or “if (a stated condition or event) is detected” may be interpreted as a meaning of “if it is determined that”, “in response to determining”, “when (a stated condition or event) is detected”, or “in response to detecting (a stated condition or event)”. In addition, in the foregoing embodiments, relationship terms such as first and second are used to distinguish one entity from another, but do not limit any actual relationship and sequence between these entities.
All or some of the foregoing embodiments may be implemented by using software, hardware, firmware, or any combination thereof. When software is used to implement the embodiments, all or some of the embodiments may be implemented in a form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, all or some of procedures or functions according to embodiments of the present invention are generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible by the computer, or a data storage device, for example, a server or a data center, integrating one or more usable media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk drive, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (Solid State Disk, SSD)), or the like.
A person skilled in the art should be aware that, in the foregoing one or more examples, the functions described in embodiments of this application may be implemented by using hardware, software, firmware, or any combination thereof. When the functions are implemented by software, the foregoing functions may be stored in a computer-readable medium or transmitted as one or more instructions or code in the computer-readable medium. The computer-readable medium includes a computer storage medium and a communication medium. The communication medium includes any medium that enables a computer program to be transmitted from one place to another. The storage medium may be any usable medium accessible by a general-purpose computer or a dedicated computer.
In the foregoing specific implementations, the objectives, technical solutions, and advantageous effects of embodiments of this application are further described in detail. It should be understood that the foregoing descriptions are merely specific implementations of embodiments of this application, but are not intended to limit the protection scope of embodiments of this application. Any modification, equivalent replacement, improvement, or the like made based on the technical solutions in embodiments of this application shall fall within the protection scope of embodiments of this application. According to the foregoing descriptions in this specification of this application, a person skilled in the art may use or implement the content in embodiments of this application. Any modification based on the disclosed content shall be considered obvious in the art. The basic principles described in embodiments of this application may be applied to other variations without departing from the essence and scope of this application. Therefore, the content disclosed in embodiments of this application is not limited to the described embodiments and designs, but may also be extended to a maximum scope that is consistent with the principles and new features disclosed in this application.
Although this application is described with reference to specific features and embodiments thereof, it is clear that various modifications and combinations may be made to this application without departing from the spirit and scope of embodiments of this application. Correspondingly, the specification and accompanying drawings are merely example descriptions of this application defined by the appended claims, and are intended to cover any or all modifications, variations, combinations, or equivalents within the scope of this application. It is clear that a person skilled in the art can make various modifications and variations to this application without departing from the scope of this application. In this way, embodiments of this application are intended to cover these modifications and variations provided that they fall within the protection scope defined by the following claims and their equivalent technologies of this application.
Number | Date | Country | Kind |
---|---|---|---|
202011195469.7 | Oct 2020 | CN | national |
This application is a National Stage of International Application No. PCT/CN2021/121041, filed on Sep. 27, 2021, which claims priority to Chinese Patent Application No. 202011195469.7, filed on Oct. 31, 2020. Both of the aforementioned applications are hereby incorporated by reference in their entireties.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/CN2021/121041 | 9/27/2021 | WO |