This application claims priority to Chinese Patent Application No. 202110547552.4, filed with the China National Intellectual Property Administration on May 19, 2021 and entitled “SCREEN PROJECTION METHOD”, which is incorporated herein by reference in its entirety. This application claims priority to Chinese Patent Application No. 202110745467.9, filed with the China National Intellectual Property Administration on Jun. 30, 2021 and entitled “SCREEN PROJECTION METHOD AND RELATED APPARATUS”, which is incorporated herein by reference in its entirety.
This application relates to the field of electronic technologies, and in particular, to a screen projection method and a related apparatus.
A smart terminal and a plurality of display devices can work together to provide multi-screen linkage and collaboration, which is an important link in building an all-scenario ecosystem. Currently, screen projection technologies cover large-screen scenarios, for example, a mobile office scenario, an in-vehicle terminal HiCar scenario, and a smart screen scenario. Screen projection from a mobile phone to a personal computer (Personal Computer, PC) is used as an example. After the mobile phone is projected to the computer, a user may operate and control the mobile phone on the computer.
However, in the conventional large-screen projection technology, mirror projection is usually used. To be specific, after the projection, display content on the mobile phone (that is, a projection sending device) is completely the same as display content on the computer (that is, a projection receiving device). The user cannot view different display content on the projection sending device and the projection receiving device.
This application provides a screen projection method and a related apparatus, so that a projection sending device and a projection receiving device can display different content after screen projection.
According to a first aspect, this application provides a screen projection method, including: A first electronic device invokes a first module of a first application to run a first desktop, where the first desktop is associated with a first display area; the first electronic device displays first display content based on the first display area, where the first display content includes the first desktop; in response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop, where the second desktop is associated with a second display area; the first electronic device sends second display content corresponding to the second display area to a second electronic device, where the second display content includes the second desktop; in response to a second user operation performed on the first display content, the first electronic device displays third display content based on a task stack that runs in the first display area; in response to a third user operation performed on the second display content displayed by the second electronic device, the first electronic device determines that display content corresponding to the second display area is fourth display content based on a task stack that runs in the second display area; and the first electronic device sends the fourth display content to the second electronic device.
In this embodiment of this application, the first electronic device (that is, a projection sending device) supports simultaneous running of a plurality of desktop instances in different display areas by using a same application, for example, running of the first desktop in the first display area by using the first module of the first application, and running of the second desktop in the second display area by using the second module of the first application. The first electronic device determines display content on a home screen of the first electronic device based on the task stack that runs in the first display area; and determines, based on the task stack that runs in the second display area, display content to be projected to the second electronic device (that is, a projection receiving device). In this way, the first electronic device and the second electronic device may display different desktops and other different content based on the two different display areas.
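For illustration only, the following minimal sketch shows how a launcher might start two desktop instances in different display areas on Android, using the public ActivityOptions.setLaunchDisplayId API. The package and Activity names (com.example.launcher, HomeActivity, PcHomeActivity) are hypothetical assumptions, not names used by this application.

    import android.app.ActivityOptions;
    import android.content.Context;
    import android.content.Intent;

    public final class DesktopLauncherHelper {

        /** Starts the standard desktop in the default display area (Display0). */
        public static void startStandardDesktop(Context context) {
            Intent intent = new Intent(Intent.ACTION_MAIN);
            intent.setClassName("com.example.launcher", "com.example.launcher.HomeActivity");
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(intent);
        }

        /** Starts a second desktop instance in the extended display area (Display1). */
        public static void startExtendedDesktop(Context context, int extendedDisplayId) {
            Intent intent = new Intent(Intent.ACTION_MAIN);
            intent.setClassName("com.example.launcher", "com.example.launcher.PcHomeActivity");
            // A separate task, so that each display area keeps its own task stack.
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK | Intent.FLAG_ACTIVITY_MULTIPLE_TASK);
            ActivityOptions options = ActivityOptions.makeBasic();
            options.setLaunchDisplayId(extendedDisplayId); // associate this desktop with Display1
            context.startActivity(intent, options.toBundle());
        }
    }

In such a sketch, a user operation on either desktop is then dispatched against the task stack of the display area in which that desktop instance runs.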
In an implementation, that in response to a second user operation performed on the first display content, the first electronic device displays third display content based on a task stack that runs in the first display area includes: In response to the second user operation performed on the first desktop in the first display content, the first electronic device displays the third display content based on a task stack that is of the first application and that runs in the first display area; and that in response to a third user operation performed on the second display content displayed by the second electronic device, the first electronic device determines that display content corresponding to the second display area is fourth display content based on a task stack that runs in the second display area includes: In response to the third user operation performed on the second desktop displayed by the second electronic device, the first electronic device determines that the display content corresponding to the second display area is the fourth display content based on a task stack that is of the first application and that runs in the second display area.
In this embodiment of this application, for the second user operation performed on the first desktop, the first electronic device may execute, based on the task stack that is of the first application and that runs in the display area associated with the first desktop, a response event corresponding to the second user operation; and for the third user operation performed on the second desktop, the first electronic device may execute, based on the task stack that is of the first application and that runs in the display area associated with the second desktop, a response event corresponding to the third user operation. In this way, data isolation between events (input events and/or response events) of different desktops can be ensured. In addition, because both desktop instances are run by the modules of the first application, the two desktops can implement sharing of specified data, and the second desktop can inherit some or all functional features of the first desktop.
In an implementation, before the first electronic device displays the first display content based on the first display area, the method further includes: The first electronic device invokes a third module of a second application to run a first status bar, where the first status bar is associated with the first display area, and the first display content includes the first status bar; and the method further includes: In response to the first user operation, the first electronic device invokes a fourth module of the second application to run a second status bar, where the second status bar is associated with the second display area, and the second display content includes the second status bar.
In this embodiment of this application, the first electronic device supports simultaneous running of a plurality of status bar instances in different display areas by using a same application, for example, running of the first status bar in the first display area by using the third module of the second application, and running of the second status bar in the second display area by using the fourth module of the second application. In this way, the two status bars are associated with different display areas, so that the first electronic device and the second electronic device may display different status bars, thereby ensuring data isolation between events (input events and/or response events) of the two status bars. In addition, because both status bars are run by the modules of the second application, the two status bars can implement sharing of specified data (for example, a notification message), and the second status bar can inherit some or all functional features of the first status bar.
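The per-display instances and the data sharing between them can be pictured with the following sketch, which assumes a SystemUI-like process; StatusBarRegistry and StatusBarInstance are hypothetical illustration names, not actual SystemUI classes.

    import java.util.HashMap;
    import java.util.Map;

    final class StatusBarRegistry {

        static final class StatusBarInstance {
            final int displayId;
            StatusBarInstance(int displayId) { this.displayId = displayId; }
            void showNotification(String message) {
                // Each instance renders only into the window of its own display area,
                // so input and response events of the two bars stay isolated.
            }
        }

        private final Map<Integer, StatusBarInstance> bars = new HashMap<>();

        /** Creates (or returns) the status bar instance bound to the given display area. */
        StatusBarInstance barFor(int displayId) {
            return bars.computeIfAbsent(displayId, StatusBarInstance::new);
        }

        /** Shared data, e.g. a notification message, is fanned out to every instance. */
        void postNotificationToAll(String message) {
            for (StatusBarInstance bar : bars.values()) {
                bar.showNotification(message);
            }
        }
    }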
In an implementation, before the first electronic device displays the first display content based on the first display area, the method further includes: The first electronic device invokes a fifth module of a third application to run a first display object of a first variable, where the first variable is associated with the first display area, and the first display content includes the first display object; and the first variable is associated with the second display area, and the second display content includes the first display object.
In this embodiment of this application, the first electronic device supports simultaneous display of display objects corresponding to a same variable in a plurality of different display areas. The third application and the second application may be the same application, or may be different applications. This is not specifically limited herein.
In an implementation, the method further includes: In response to a fourth user operation performed on the first display content, the fifth module of the third application is invoked to modify the display object of the first variable to a second display object; the first electronic device updates display content corresponding to the first display area to fifth display content, where the fifth display content includes the second display object; and the first electronic device updates the display content corresponding to the second display area to sixth display content, and sends the sixth display content to the second electronic device, where the sixth display content includes the second display object.
In this embodiment of this application, the first electronic device supports simultaneous display of display objects corresponding to a same variable in a plurality of different display areas. After a user changes the display object of the first variable in the first display area, the display object of the first variable in the second display area also changes accordingly.
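As an illustration of this behavior, the following sketch uses a plain observer pattern: both display areas observe one shared variable, so changing its display object once refreshes both the local screen and the projected screen. SharedWallpaper and its methods are assumptions for illustration, not part of this application.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    final class SharedWallpaper {
        private Object displayObject;                        // the first variable's display object
        private final List<Consumer<Object>> observers = new ArrayList<>();

        /** Each display area registers to be redrawn when the wallpaper changes. */
        void observe(Consumer<Object> onChanged) { observers.add(onChanged); }

        /** One change updates Display0 locally and Display1 via projection. */
        void setDisplayObject(Object newObject) {
            this.displayObject = newObject;
            for (Consumer<Object> observer : observers) {
                observer.accept(newObject);
            }
        }
    }

For example, a settings window running in the first display area would call setDisplayObject(...) once, and the callbacks registered for the first and second display areas would both redraw their content.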
In an implementation, the first variable indicates a display object of a wallpaper, the display object of the wallpaper is a static picture and/or a dynamic picture, and the wallpaper includes a lock-screen wallpaper used when a screen is locked and/or a desktop wallpaper used when the screen is not locked.
In this embodiment of this application, after the user changes a wallpaper displayed on the first electronic device, a wallpaper projected to the second electronic device also changes accordingly.
In an implementation, a plurality of themes are preset in the first electronic device, where a theme indicates a desktop layout style, an icon display style, an interface color, and the like; the first variable indicates a display object of the theme, and the display object of the theme is display content corresponding to one of the plurality of themes.
In this embodiment of this application, after the user changes a theme displayed on the first electronic device, a theme projected to the second electronic device also changes accordingly.
In an implementation, the first module of the first application includes a first common class, a first user interface UI control class, and a desktop task stack of the first desktop that are used for creating and running the first desktop; and the second module of the first application includes a second common class, a second UI control class, and a desktop task stack of the second desktop that are used for creating and running the second desktop, where some or all classes in the second common class are inherited from the first common class, and some or all classes in the second UI control class are inherited from the first UI control class.
In this embodiment of this application, the second common class, the second UI control class, and the desktop task stack of the second desktop that are used for creating and running the second desktop are added to the first electronic device; and some or all of the added common class and UI control class are inherited from the first common class and the first UI control class that correspond to the original first desktop. Therefore, the second desktop can inherit some functional features of the first desktop, and the two desktops can implement sharing of specified data.
In an implementation, the second common class includes one or more of the following: a desktop launcher provider, a database assistant, a desktop launcher setting class, a desktop launcher constant class, a Pc layout configuration, a Pc device file, a Pc cell counter, a Pc desktop launcher policy, a Pc desktop launcher model, a Pc loading task, and the like; and the second UI control class includes one or more of the following: a Pc drag layer, a Pc desktop workspace, a Pc cell layout, a Pc program dock view, a Pc folder, a Pc folder icon, and the like.
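The inheritance relationship described above can be pictured with the following minimal sketch; the class names echo the terms in this implementation (desktop workspace, cell layout), but the bodies are illustrative assumptions rather than actual launcher code.

    class Workspace {                       // first UI control class: standard desktop workspace
        void bindAppIcons() { /* icon-binding logic shared by both desktops */ }
    }

    class CellLayout {                      // first UI control class: standard grid layout
        int columns() { return 4; }         // phone-style grid
    }

    class PcWorkspace extends Workspace {   // second UI control class inherits the first,
        @Override
        void bindAppIcons() {
            super.bindAppIcons();           // so the extended-screen desktop reuses shared data
            // PC-specific additions, e.g. binding a program dock view
        }
    }

    class PcCellLayout extends CellLayout {
        @Override
        int columns() { return 8; }         // wider grid suited to a large screen
    }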
In an implementation, the third module of the second application includes a first component, a first dependency control class, and a third UI control class that are used for creating and running the first status bar; and the fourth module of the second application includes a second component, a second dependency control class, and a fourth UI control class that are used for creating and running the second status bar, where some or all components in the second component are inherited from the first component, some or all classes in the second dependency control class are inherited from the first dependency control class, and some or all classes in the fourth UI control class are inherited from the third UI control class.
In this embodiment of this application, the second component, the second dependency control class, and the fourth UI control class that are used for creating and running the second status bar are added to the first electronic device; and some or all of the added component, dependency control class, and UI control class are inherited from the first component, the first dependency control class, and the third UI control class that correspond to the original first status bar. Therefore, the second status bar can inherit some functional features of the first status bar, and the two status bars can implement sharing of specified data.
In an implementation, the second component includes one or more of the following: a PC dependency class, a PC system provider, a PC system bar, and the second status bar; the second dependency control class includes one or more of the following: a Pc status bar window control class, a screen control class, a lock-screen control class, and a remote control class; and the fourth UI control class includes one or more of the following: a Pc status bar window view, a Pc notification panel view, a Pc quick setting fragment, a Pc status bar fragment, and a Pc status bar view.
In an implementation, an identifier (ID) of a display area associated with the second module is an ID of the second display area; and that in response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop, where the second desktop is associated with a second display area includes: In response to the first user operation, a Pc management service receives a mode switching instruction, where the instruction instructs to switch a current non-projection mode to a projection mode; in response to the instruction, the Pc management service invokes a Pc desktop service, where the Pc desktop service invokes an activity management service, and the activity management service invokes an activity task manager to start the second module of the first application; a root activity container is invoked to determine the ID of the display area associated with the second module; when the ID of the display area associated with the second module is the ID of the second display area, an Activity of the second desktop is queried and used as an Activity of a to-be-launched desktop, or when the ID of the display area associated with the second module is an ID of the first display area, an Activity of the first desktop is queried and used as an Activity of a to-be-launched desktop; and an activity start controller invokes an activity starter to start the second desktop.
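The following condensed sketch retraces this start flow; the class and method names follow the terms used above (Pc management service, root activity container, activity starter), but every signature here is an assumption for illustration only, not the actual framework code.

    final class PcManagementService {
        private final PcDesktopService desktopService = new PcDesktopService();

        /** Invoked with the mode switching instruction mentioned above. */
        void onModeSwitch(boolean projectionMode, int display0Id, int display1Id) {
            if (projectionMode) {
                desktopService.startDesktop(display0Id, display1Id);
            }
        }
    }

    final class PcDesktopService {
        /** Stands in for the activity management service / activity task manager path. */
        void startDesktop(int display0Id, int display1Id) {
            int targetDisplayId = RootActivityContainer.displayIdOfSecondModule();
            String desktop = (targetDisplayId == display1Id)
                    ? "SecondDesktopActivity"   // extended-screen desktop
                    : "FirstDesktopActivity";   // standard desktop
            ActivityStarter.start(desktop, targetDisplayId);
        }
    }

    final class RootActivityContainer {
        /** Stub: resolves the ID of the display area associated with the second module. */
        static int displayIdOfSecondModule() { return 1; }
    }

    final class ActivityStarter {
        static void start(String activity, int displayId) {
            System.out.println("starting " + activity + " on display " + displayId);
        }
    }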
In an implementation, that in response to the first user operation, the first electronic device invokes a fourth module of the second application to run a second status bar, where the second status bar is associated with the second display area includes: In response to the first user operation, a Pc management service receives a mode switching instruction, where the instruction instructs to switch a current non-projection mode to a projection mode; in response to the instruction, the Pc management service starts a productivity service, where the productivity service invokes a system bar to start the second status bar, and the system bar creates the second status bar based on a configuration file; the second status bar invokes a callback interface of a command queue to add a callback to the second status bar; the second status bar initializes a layout, and registers an IStatusBar object corresponding to the second status bar with a status bar management service; the second status bar creates the Pc status bar window view, and adds the Pc status bar window view to a status bar window control class; and the status bar window control class invokes a window management interface to add the second status bar to a window management service, so as to add the second status bar to the second display area.
In an implementation, in the non-projection mode, the command queue supports the first status bar associated with the first display area; and in the projection mode, the command queue supports both the first status bar associated with the first display area and the second status bar associated with the second display area.
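The following sketch combines the two points above: the second status bar adds a callback to the command queue, and in the projection mode the queue serves both status bars, each of which filters events by its own display area ID. The names echo the terms in this implementation, but the code is an illustrative assumption, not the actual SystemUI implementation.

    import java.util.ArrayList;
    import java.util.List;

    final class CommandQueue {
        interface Callbacks { void onSystemBarEvent(int displayId, String event); }

        private final List<Callbacks> callbacks = new ArrayList<>();

        /** The second status bar invokes this to add its callback. */
        void addCallback(Callbacks callback) { callbacks.add(callback); }

        /** In the projection mode the queue serves both bars; each filters by display ID. */
        void dispatch(int displayId, String event) {
            for (Callbacks callback : callbacks) {
                callback.onSystemBarEvent(displayId, event);
            }
        }
    }

    final class PcStatusBar implements CommandQueue.Callbacks {
        private final int displayId;

        PcStatusBar(int displayId, CommandQueue queue) {
            this.displayId = displayId;
            queue.addCallback(this);
            // Initializing the layout, registering the IStatusBar object, creating the
            // Pc status bar window view, and adding it to the window management service
            // for this display area are omitted here.
        }

        @Override
        public void onSystemBarEvent(int eventDisplayId, String event) {
            if (eventDisplayId != this.displayId) {
                return; // keep events of the two status bars isolated
            }
            // Handle the event for this display area only.
        }
    }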
According to a second aspect, this application provides an electronic device, including one or more processors, and one or more memories. The one or more memories are coupled to the one or more processors. The one or more memories are configured to store computer program code, and the computer program code includes computer instructions. When the one or more processors execute the computer instructions, the electronic device is enabled to perform the screen projection method according to any one of the possible implementations of any one of the foregoing aspects.
According to a third aspect, an embodiment of this application provides a computer storage medium, including computer instructions. When the computer instructions are run on an electronic device, the electronic device is enabled to perform the screen projection method according to any one of the possible implementations of any one of the foregoing aspects.
According to a fourth aspect, an embodiment of this application provides a computer program product. When the computer program product runs on a computer, the computer is enabled to perform the screen projection method according to any one of the possible implementations of any one of the foregoing aspects.
The following clearly describes technical solutions in embodiments of this application with reference to the accompanying drawings. In the descriptions of embodiments of this application, unless otherwise specified, “/” indicates “or”. For example, A/B may indicate A or B. The term “and/or” in this specification merely describes an association relationship for describing associated objects, and indicates that three relationships may exist. For example, A and/or B may indicate the following three cases: Only A exists, both A and B exist, and only B exists. In addition, in the descriptions of embodiments of this application, “a plurality of” means two or more.
The following terms “first” and “second” are merely intended for a purpose of description, and shall not be understood as an indication or implication of relative importance or implicit indication of a quantity of indicated technical features. Therefore, a feature limited by “first” or “second” may explicitly or implicitly include one or more features. In the descriptions of embodiments of this application, unless otherwise specified, “a plurality of” means two or more than two.
The following describes a communication system 10 according to an embodiment of this application.
For example,
In this embodiment of this application, the electronic device 100 may be directly connected to the electronic device 200 through a near-field wireless communication connection or a local wired connection. For example, the electronic device 100 and the electronic device 200 each may have one or more near-field communication modules among the following communication modules: a near field communication (near field communication, NFC) communication module, a wireless fidelity (wireless fidelity, Wi-Fi) communication module, an ultra-wideband (ultra wideband, UWB) communication module, a Bluetooth (Bluetooth) communication module, a ZigBee communication module, and the like. The electronic device 100 is used as an example. The electronic device 100 may detect and scan electronic devices near the electronic device 100 by transmitting a signal through a near-field communication module (for example, the NFC communication module), so that the electronic device 100 can discover a nearby electronic device (for example, the electronic device 200) through a near-field wireless communication protocol, establish a wireless communication connection to the nearby electronic device, and transmit data to the nearby electronic device.
In some embodiments, the electronic device 100 and the electronic device 200 may be connected, based on a wired connection or a Wi-Fi connection, to a local area network (local area network, LAN) through an electronic device 300. For example, the electronic device 300 may be a third-party device, such as a router, a gateway, or an intelligent device controller. In some embodiments, the electronic device 100 and the electronic device 200 may be indirectly connected through at least one electronic device 400 in a wide area network (for example, a Huawei Cloud network). For example, the electronic device 400 may be a hardware server, or may be a cloud server embedded in a virtualized environment. It can be understood that, through the electronic device 300 and/or the electronic device 400, the electronic device 100 may indirectly perform a wireless communication connection and data transmission with the electronic device 200.
It can be understood that, the structure shown in this embodiment does not constitute a specific limitation on the communication system 10. In some other embodiments of this application, the communication system 10 may include more or fewer devices than those shown in the figure.
In this embodiment of this application, after the electronic device 100 establishes a connection to the electronic device 200, the electronic device 100 may send projected image data, audio data and/or the like to the electronic device 200; and the electronic device 200 may perform, based on the data sent by the electronic device 100, interface display and/or audio output.
In this embodiment of this application, screen resolutions of display screens of the electronic device 200 and the electronic device 100 may be different. The electronic device 100 may be a portable electronic device, such as a mobile phone, a tablet computer, a personal digital assistant (personal digital assistant, PDA), a wearable device, or a laptop computer (laptop). Optionally, the electronic device 100 may alternatively be an electronic device other than a portable electronic device. This is not limited in this embodiment of this application. The electronic device 200 may be any display apparatus, such as a smart screen, a television, a tablet computer, a notebook computer, a vehicle-mounted device, or a projector. Example embodiments of the electronic device 100 and the electronic device 200 include but are not limited to devices running iOS®, Android®, Microsoft®, or another operating system.
The following describes a hardware structure of an electronic device according to an embodiment of this application.
Refer to
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) port 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identity module (subscriber identity module, SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, an optical proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It can be understood that, the structure shown in this embodiment of the present invention does not constitute a specific limitation on the electronic device 100. In some other embodiments of this application, the electronic device 100 may include more or fewer components than those shown in the figure, combine some components, split some components, or have different component arrangements. The illustrated components may be implemented by hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processing unit (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a neural-network processing unit (neural-network processing unit, NPU), and/or the like. Different processing units may be independent components, or may be integrated into one or more processors.
The controller may be a nerve center and a command center of the electronic device 100. The controller may generate, based on an instruction operation code and a time sequence signal, an operation control signal to control instruction fetching and instruction execution.
The processor 110 may further be provided with a memory for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may store instructions or data that has just been used or used repeatedly by the processor 110. If the processor 110 needs to use the instructions or data again, the processor 110 may directly invoke the instructions or data from the memory. This avoids repeated access and reduces a waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (inter-integrated circuit, I2C) interface, an inter-integrated circuit sound (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver/transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (general-purpose input/output, GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, a universal serial bus (universal serial bus, USB) port, and/or the like.
The charging management module 140 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some embodiments of wired charging, the charging management module 140 may receive a charging input from the wired charger through the USB port 130. In some embodiments of wireless charging, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 supplies power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is configured to connect to the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives an input from the battery 142 and/or the charging management module 140; and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may be further configured to monitor parameters such as a battery capacity, a battery cycle count, and a battery health status (electricity leakage and impedance). In some other embodiments, the power management module 141 may be alternatively disposed in the processor 110. In some other embodiments, the power management module 141 and the charging management module 140 may be alternatively disposed in a same component.
A wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are configured to transmit and receive an electromagnetic wave signal. Each antenna in the electronic device 100 may be configured to cover one or more communication frequency bands. Different antennas may be multiplexed to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna in a wireless local area network. In some other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide a wireless communication solution that includes 2G/3G/4G/5G or the like and that is applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), and the like. The mobile communication module 150 may receive an electromagnetic wave via the antenna 1, perform filtering, amplification, and other processing on the received electromagnetic wave, and then transmit the processed electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may further amplify a signal modulated by the modem processor, and convert the amplified signal into an electromagnetic wave for radiation via the antenna 1. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 and at least some of the modules of the processor 110 may be disposed in a same component.
The modem processor may include a modulator and a demodulator. The modulator is configured to modulate a to-be-transmitted low-frequency baseband signal into a medium-high-frequency signal. The demodulator is configured to demodulate a received electromagnetic wave signal to obtain a low-frequency baseband signal. Then the demodulator transfers the low-frequency baseband signal obtained through demodulation to the baseband processor for processing. The low-frequency baseband signal is transferred to the application processor after being processed by the baseband processor. The application processor outputs a sound signal through an audio device (which is not limited to the speaker 170A, the receiver 170B, and the like), or displays an image or a video through the display screen 194. In some embodiments, the modem processor may be an independent component. In some other embodiments, the modem processor may be independent of the processor 110, and may be disposed, together with the mobile communication module 150 or other functional modules, in a same component.
The wireless communication module 160 may provide a wireless communication solution that is applied to the electronic device 100 and that includes: a wireless local area network (wireless local area network, WLAN) (for example, a wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (Bluetooth, BT), a global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), a near field communication (near field communication, NFC) technology, an infrared (infrared, IR) technology, or the like. The wireless communication module 160 may be one or more components integrating at least one communication processing module. The wireless communication module 160 receives an electromagnetic wave via the antenna 2, performs frequency modulation and filtering on an electromagnetic wave signal, and sends the processed signal to the processor 110. The wireless communication module 160 may further receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on the signal, and convert the processed signal into an electromagnetic wave for radiation via the antenna 2.
In some embodiments, in the electronic device 100, the antenna 1 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with a network and another device by using a wireless communication technology. The wireless communication technology may include a Global System for Mobile Communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, IR, and/or the like. The GNSS may include a global positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a BeiDou navigation satellite system (BeiDou navigation satellite system, BDS), a quasi-zenith satellite system (quasi-zenith satellite system, QZSS), and/or a satellite-based augmentation system (satellite based augmentation system, SBAS).
The electronic device 100 implements a display function through the GPU, the display screen 194, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. The GPU is configured to perform mathematical and geometric computation for graphic rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is configured to display images, videos, or the like. The display screen 194 includes a display panel. The display panel may use a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot light-emitting diode (quantum dot light-emitting diode, QLED), or the like. In some embodiments, the electronic device 100 may include one display screen 194 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 can implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is configured to process data fed back by the camera 193. For example, during shooting, a shutter is pressed, light is transmitted to a photosensitive element of the camera through a lens, an optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to convert the electrical signal into a visible image. The ISP may further perform algorithm optimization on noise, brightness, and complexion of the image. The ISP may further optimize parameters such as exposure and color temperature of a photographing scenario. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is configured to capture a static image or a video. An optical image of an object is generated through the lens, and is projected to the photosensitive element. The photosensitive element may be a charge-coupled device (charge-coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor. The photosensitive element converts an optical signal into an electrical signal, and then transmits the electrical signal to the ISP, which converts the electrical signal into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include one camera 193 or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is configured to process a digital signal, and may process another digital signal in addition to the digital image signal. For example, when the electronic device 100 selects a frequency, the digital signal processor is configured to perform Fourier transform on frequency energy.
The video codec is configured to compress or decompress a digital video. The electronic device 100 may support one or more types of video codecs. Therefore, the electronic device 100 may play or record videos in a plurality of encoding formats, for example, Moving Picture Experts Group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural network (neural network, NN) computing processor. With reference to a structure of a biological neural network, for example, with reference to a pattern of transmission between human brain neurons, the NPU quickly processes input information, and can continuously perform self-learning. Applications such as intelligent cognition of the electronic device 100, for example, image recognition, facial recognition, speech recognition, and text understanding, may be implemented through the NPU.
The internal memory 121 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (non-volatile memory, NVM). The random access memory may include a static random access memory (static random access memory, SRAM), a dynamic random access memory (dynamic random access memory, DRAM), a synchronous dynamic random access memory (synchronous dynamic random access memory, SDRAM), a double data rate synchronous dynamic random access memory (double data rate synchronous dynamic random access memory, DDR SDRAM; for example, a fifth-generation DDR SDRAM is generally referred to as a DDR5 SDRAM), and the like. The non-volatile memory may include a magnetic disk storage device and a flash memory (flash memory). The flash memory may be divided, based on operating principles, into a NOR flash, a NAND flash, a 3D NAND flash, and the like; or may be divided, based on storage levels, into a single-level cell (single-level cell, SLC), a multi-level cell (multi-level cell, MLC), a triple-level cell (triple-level cell, TLC), a quad-level cell (quad-level cell, QLC), and the like; or may be divided, based on storage specifications, into a universal flash storage (universal flash storage, UFS), an embedded multimedia card (embedded multimedia card, eMMC), and the like. In some embodiments, the random access memory may be directly read and written by the processor 110, and may be configured to store an operating system or an executable program (for example, machine instructions) of another running program, or may be configured to store data of a user, data of an application, and the like. The non-volatile memory may also store an executable program, data of a user, data of an application, and the like; and may be loaded to the random access memory in advance for the processor 110 to directly perform reading and writing.
The external memory interface 120 may be configured to connect to an external non-volatile memory, to extend a storage capability of the electronic device 100. The external non-volatile memory communicates with the processor 110 through the external memory interface 120, to implement a data storage function. For example, files such as music and videos are stored in the external non-volatile memory.
The electronic device 100 may implement an audio function, such as music playing or recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and also configured to convert an analog audio input into a digital audio signal. The audio module 170 may be further configured to encode and decode the audio signals.
The speaker 170A, also referred to as a “loudspeaker”, is configured to convert an electrical audio signal into a sound signal.
The receiver 170B, also referred to as an “earpiece”, is configured to convert an electrical audio signal into a sound signal.
The microphone 170C, also referred to as a “mike” or a “mic”, is configured to convert a sound signal into an electrical signal.
The pressure sensor 180A is configured to sense a pressure signal, and convert the pressure signal into an electrical signal. When a touch operation is performed on the display screen 194, the electronic device 100 detects an intensity of the touch operation based on the pressure sensor 180A. The electronic device 100 may perform computing for a touch location based on a detection signal of the pressure sensor 180A.
The gyroscope sensor 180B may be configured to determine a motion posture of the electronic device 100.
The barometric pressure sensor 180C is configured to measure barometric pressure. The magnetic sensor 180D includes a Hall sensor.
The acceleration sensor 180E may detect accelerations of the electronic device 100 in various directions (for example, directions pointed by three axes, namely, an x axis, a y axis, and a z axis, in a three-axis coordinate system of the electronic device 100).
The distance sensor 180F is configured to measure a distance. The electronic device 100 may measure a distance in an infrared manner or a laser manner.
The optical proximity sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode.
The ambient light sensor 180L is configured to sense luminance of ambient light. The electronic device 100 may adaptively adjust brightness of the display screen 194 based on the sensed ambient light brightness.
The fingerprint sensor 180H is configured to collect a fingerprint.
The temperature sensor 180J is configured to measure a temperature.
The touch sensor 180K is also referred to as a “touch component”. The touch sensor 180K may be disposed in the display screen 194, and the touch sensor 180K and the display screen 194 form a touchscreen, which is also referred to as a “touch panel”. The touch sensor 180K is configured to detect a touch operation that is performed on or near the touch sensor 180K. The touch sensor may transfer the detected touch operation to the application processor, to determine a type of a touch event. A visual output related to the touch operation may be provided through the display screen 194. In some other embodiments, the touch sensor 180K may be alternatively disposed on a surface of the electronic device 100 and at a position different from that of the display screen 194.
The bone conduction sensor 180M may obtain a vibration signal. In some embodiments, the bone conduction sensor 180M may obtain a vibration signal and a blood pressure beating signal of a vibrating bone block of a human vocal part.
The button 190 may be a mechanical button or a touch button. The electronic device 100 may receive a key input, and generate a key signal input related to a user setting and function control of the electronic device 100.
The following briefly describes implementation of screen projection between the electronic device 100 and the electronic device 200 according to an embodiment of this application.
In a screen projection implementation, after a desktop of the electronic device 100 is projected to the electronic device 200, the electronic device 200 displays an extended-screen desktop of the electronic device 100 in full screen. To enable the extended-screen desktop projected by the electronic device 100 to the electronic device 200 to adapt to both a screen resolution of the electronic device 200 and an operation habit of a user on the electronic device 200, the extended-screen desktop usually uses a customized APK that is different from an Android application package (Android application package, APK) of a standard desktop of the electronic device 100. In other words, the electronic device 100 displays, by using the customized APK, the desktop of the electronic device 100 and a status bar of the electronic device 100 on the electronic device 200 in a simulated manner. In this implementation, the extended-screen desktop displayed by the electronic device 200 and the standard desktop of the electronic device 100 come from different sources, and the user cannot maintain the extended-screen desktop and the standard desktop in a unified manner. Consequently, the desktop (that is, the extended-screen desktop) and the status bar (that is, an extended-screen status bar) projected to the electronic device 200 cannot follow a user experience (User Experience, UX) style (for example, a theme, a wallpaper, and a lock screen) of the standard desktop of the electronic device 100; and cannot inherit functional features (for example, functional features of a notification center, quick settings, a folder, and an FA) of the standard desktop and a standard status bar of the electronic device 100. In addition, necessary data synchronization (for example, notification message management) cannot be maintained.
In another screen projection implementation provided in this embodiment of this application, the electronic device 100 may support a desktop launcher (Launcher) in simultaneously running a plurality of desktop instances in different display areas (Display), support a system interface (SystemUI) in simultaneously running a plurality of status bar instances, and maintain correspondences between status bar instances and desktop instances. In this embodiment of this application, the desktop launcher may also be referred to as a desktop application.
For example,
After determining, based on a projection instruction, that a projection destination device is the electronic device 200, the electronic device 100 uses, as an extended screen of the electronic device 100, a display screen configured in the electronic device 200, and creates an extended-screen display area (Display1) for the extended screen. The electronic device 100 runs an extended-screen desktop and an extended-screen status bar in the Display1, and a DisplayId corresponding to an application window running in the Display1 is an ID of the Display1. The extended-screen desktop and the standard desktop are two desktop instances created and run by a same Launcher. The extended-screen status bar and the standard status bar are status bar instances created and run by a same SystemUI. The electronic device 100 determines, based on the application window running in the Display1, display content of the extended screen, and may project the display content of the extended screen to the electronic device 200; and the electronic device 200 may display, based on projection data sent by the electronic device 100, the extended-screen desktop and the extended-screen status bar.
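On Android, an extended-screen display area of this kind could be created with the public DisplayManager.createVirtualDisplay API, as in the minimal sketch below; it assumes the projection channel supplies a Surface whose frames are encoded and streamed to the electronic device 200, and it omits permission and codec details.

    import android.content.Context;
    import android.hardware.display.DisplayManager;
    import android.hardware.display.VirtualDisplay;
    import android.view.Surface;

    final class ExtendedDisplayHelper {

        /** Creates Display1, sized to the receiving device's screen resolution. */
        static VirtualDisplay createExtendedDisplay(Context context, Surface projectionSurface,
                                                    int width, int height, int densityDpi) {
            DisplayManager dm = context.getSystemService(DisplayManager.class);
            return dm.createVirtualDisplay(
                    "extended-screen",         // display name
                    width, height, densityDpi, // taken from the device information of the peer
                    projectionSurface,         // frames rendered here are sent to the peer
                    DisplayManager.VIRTUAL_DISPLAY_FLAG_PRESENTATION);
        }
    }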
In some embodiments, after a connection to the electronic device 200 is established, the electronic device 100 may obtain device information of the electronic device 200, such as a model and a screen resolution of the electronic device 200. The application window that the electronic device 100 runs in the Display1 is obtained by performing interface optimization and function optimization on an original application window of the electronic device 100 based on the device information of the electronic device 200, such as its model and screen resolution, so that the application window is applicable to the electronic device 200. The interface optimization includes: changing a display size of the application window, and changing a display size and an interface layout of each interface element in the application window. The function optimization includes: adding a function that is unavailable in the original application window of the electronic device 100, and shielding some original functions of the electronic device 100 that are not applicable to the electronic device 200, so as to adapt to the operation habit of the user on the electronic device 200.
Refer to
It needs to be noted that, in this embodiment of this application, the application window may be a Window object corresponding to an Activity in an Android system, or may be an application window in an iOS system, or may be an application window in another operating system. This is not specifically limited herein. One application may include a plurality of application windows, and one application window usually corresponds to one user interface (User Interface, UI). Optionally, one application window may alternatively correspond to a plurality of user interfaces. For ease of description, an application window may also be referred to as a window for short in this embodiment of this application.
The Activity in the Android system is an interface for interaction between the user and an application. Each Activity component is associated with a Window object used to describe a specific application window. It can be learned that the Activity is a highly abstract user interface component, and represents, in the Android system, a user interface and the service logic centered on that user interface; an event triggered by the user may be monitored and processed by using a control in the user interface. It can be understood that, in Android applications, one Activity may be represented as one user interface, and one Android application may have a plurality of activities.
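For concreteness, the following minimal Activity illustrates the relationship described above: the Activity is one user interface, its Window hosts the views, and a control in the interface monitors and handles a user-triggered event.

    import android.app.Activity;
    import android.os.Bundle;
    import android.widget.Button;

    public class ExampleActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            Button button = new Button(this);
            button.setText("Tap me");
            // A control in the user interface monitors and handles a user-triggered event.
            button.setOnClickListener(v -> setTitle("Tapped"));
            setContentView(button); // attaches the view to this Activity's Window object
        }
    }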
The following describes, by using an example in which the electronic device 100 is a mobile phone and the electronic device 200 is a computer, example user interfaces in a screen projection scenario according to an embodiment of this application with reference to the accompanying drawings.
The following describes a home screen 11 of a desktop application of an example mobile phone 100 according to an embodiment of this application.
For example,
The standard status bar 101 may include: one or more signal strength indicators 101A of a mobile communication signal (also referred to as a cellular signal), an operator name (for example, “China Mobile”) 101B, one or more signal strength indicators 101C of a wireless fidelity (wireless fidelity, Wi-Fi) signal, a battery status indicator 101D, and a time indicator 101E.
The tray 104 containing common application icons may present: an icon of Phone, an icon of Contacts, an icon of Messages, and an icon of Camera. The other application icons 105 may present: an icon of File management, an icon of Gallery, an icon of Music, an icon of Settings, and the like. The home screen 11 may further include a page indicator 106. The other application icons may be distributed on a plurality of pages, and the page indicator 106 may be used to indicate a specific page on which an application is currently viewed by the user. The user may slide leftward or rightward in an area including the other application icons, to view an application icon on another page.
The home screen 11 may further include a desktop wallpaper 107. The desktop wallpaper 107 may be specified by the user, or may be specified by default in the mobile phone 100. In addition, the mobile phone 100 may use a plurality of themes, and the mobile phone 100 may change a desktop layout style, an icon display style, and a desktop color by switching a theme. Generally, a wallpaper is configured by default for each theme. The user can modify a wallpaper of the mobile phone 100 under a current theme.
It can be understood that,
In this embodiment of this application, the mobile phone 100 may establish a connection to a computer 200 by using a near-field wireless communication technology such as NFC, Wi-Fi, or Bluetooth, so as to project a desktop of the mobile phone 100 to the computer 200. The following describes a screen projection process by using an example in which screen projection is implemented by using NFC.
For example, as shown in
The NFC icon 201A has two states: a selected state and a non-selected state. For example, the NFC icon 201A shown in
For example, as shown in
Specifically, after the user places the NFC area of the mobile phone 100 close to the NFC area of the computer 200, the mobile phone 100 may detect an NFC signal of the computer 200, and the mobile phone 100 displays, on a current display interface, a prompt box 13 shown in
The prompt information is used to prompt the user to tap the connection control 303 to implement an NFC connection; and after the NFC connection is established, the user may operate and control the mobile phone 100 on the computer 200. The connection control 303 may receive an input operation (for example, a touch operation) of the user; and in response to the input operation, the mobile phone 100 sends an NFC connection request to the computer 200. The cancellation control 304 may receive an input operation (for example, a touch operation) of the user; and in response to the input operation, the mobile phone 100 may close the prompt box 13.
As shown in
The prompt information 305 is used to prompt the user whether to allow the mobile phone 100 to connect to the local device. An NFC connection may be implemented by tapping the connection control 306. After the NFC connection is established, the user may operate the mobile phone 100 on the computer 200. In other words, after the NFC connection is established, the mobile phone 100 may project the desktop of the mobile phone 100 to the computer 200, and the user may control the mobile phone 100 by using a desktop displayed on the computer 200.
The connection control 306 may receive an input operation (for example, a touch operation) of the user; and in response to the input operation, the computer 200 sends an NFC connection response to the mobile phone 100. The cancellation control 307 may receive an input operation (for example, a touch operation) of the user; and in response to the input operation, the computer 200 may close the prompt box 15.
In response to the received NFC connection response of the computer 200, the mobile phone 100 creates an extended-screen desktop by using the Launcher, and creates an extended-screen status bar by using the SystemUI. The DisplayId corresponding to the extended-screen desktop and the DisplayId corresponding to the extended-screen status bar are both the ID of an extended-screen display area (Display1). The mobile phone 100 sends projection data of the Display1 to the computer 200. As shown in
For example, as shown in
The following separately describes in detail the extended-screen status bar 401, the search bar 402, the Dock bar 403, and a lock screen of the home screen 16 that is projected by the mobile phone 100 to the extended screen.
As shown in
It can be understood that, in this embodiment of this application, the mobile phone 100 runs two status bar instances: the standard status bar 101 corresponding to the standard desktop displayed by the mobile phone 100, and the extended-screen status bar 401 corresponding to the extended-screen desktop displayed by the computer 200. A DisplayId associated with the standard status bar 101 is an ID of the Display0, and a DisplayId associated with the extended-screen status bar 401 is the ID of the Display1.
Different from the standard status bar 101 corresponding to the standard desktop of the mobile phone 100, each interface element in the extended-screen status bar 401 may be tapped to open a corresponding second-level interface. The following describes the second-level interface of each interface element in the extended-screen status bar 401.
The notification center icon 401A may receive an input operation (for example, a click operation performed by using a left button of a mouse or a touch operation of the user) of the user; and in response to the input operation, the computer 200 displays a notification center window 17 shown in
It needs to be noted that, when the computer 200 is equipped with a mouse, input operations received by the extended-screen desktop and the extended-screen status bar in this embodiment of this application may be operations performed by the user by using the mouse; or when the computer 200 is equipped with a touchscreen (or a touch panel), input operations received by the extended-screen desktop and the extended-screen status bar in this embodiment of this application may be touch operations performed by the user by using the touchscreen (or the touch panel). This is not specifically limited herein.
Specifically, in some embodiments, a control (for example, the notification center icon 401A displayed on the home screen 16) in the extended-screen status bar displayed by the computer 200 may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 may send related information (for example, coordinates clicked by using the left button and a DisplayId corresponding to a display screen on which the input operation is performed) of the input operation to the mobile phone 100. Based on the related information of the input operation, the mobile phone 100 identifies that the input operation is a click operation performed by using the left button of the mouse on the notification center icon 401A in the extended-screen status bar, and then determines that a response event triggered by the input operation is to display the notification center window 17. The mobile phone 100 runs the notification center window 17 in the Display1, and sends updated display content of the Display1 to the computer 200. The computer 200 displays, based on the projection data sent by the mobile phone 100, the notification center window 17 shown in
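For ease of understanding, the following sketch illustrates one possible shape of the forwarded input information and of the DisplayId-based routing described above. All class, field, and method names here (RemoteInputEvent, RemoteInputRouter, and so on) are illustrative assumptions rather than the actual projection protocol:

    // Hypothetical wire format for input events forwarded from the computer 200
    // to the mobile phone 100; not the real protocol.
    public final class RemoteInputEvent {
        public final int displayId; // ID of the Display on which the operation was performed
        public final float x, y;    // coordinates of the operation on that display
        public final int action;    // e.g. LEFT_CLICK, RIGHT_CLICK, TOUCH

        public RemoteInputEvent(int displayId, float x, float y, int action) {
            this.displayId = displayId;
            this.x = x;
            this.y = y;
            this.action = action;
        }
    }

    // Hypothetical phone-side router: each event is matched against the task stack
    // running in the display area identified by displayId.
    final class RemoteInputRouter {
        static final int DISPLAY0 = 0;
        static final int DISPLAY1 = 1;

        void dispatch(RemoteInputEvent e) {
            if (e.displayId == DISPLAY1) {
                // Hit-test against the interface layout currently in the foreground
                // of the Display1, execute the response event there, and then project
                // the updated content of the Display1 to the computer 200.
                handleOnExtendedScreen(e.x, e.y, e.action);
            } else {
                handleOnDefaultScreen(e.x, e.y, e.action);
            }
        }

        private void handleOnExtendedScreen(float x, float y, int action) { /* ... */ }
        private void handleOnDefaultScreen(float x, float y, int action) { /* ... */ }
    }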
For example, as shown in
A notification message (for example, the notification message 502) in the notification center window 17 may receive an input operation of the user; and in response to different input operations of the user, operations such as removing the notification message, sharing the notification message, and viewing details of the notification message may be implemented. Optionally, the notification message 502 may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 may display specific content of the notification message 502 on a user interface of an SMS message application. Optionally, the notification message 502 may receive an input operation (for example, a click operation performed by using a right button of the mouse) of the user; and in response to the input operation, the computer 200 displays a menu bar 505 shown in
In some embodiments, display content of the notification center icon 401A includes icons of applications corresponding to the latest N notification messages. For example, as shown in
In some embodiments, when a specified notification message (for example, the notification message 502) in the notification center window 17 displayed on the extended screen is removed (or deleted), the notification message is also removed (or deleted) from a notification center window displayed on a default screen of the mobile phone 100.
The input method indicator 401B may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a floating window 18 shown in
For example, as shown in
The signal strength indicator 401C may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a floating window 19 shown in
The battery status indicator 401D may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a floating window 20 shown in
The time indicator 401E may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a floating window 21 shown in
The control center icon 401F may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a control center window 22 shown in
In this embodiment of this application, the mobile phone 100 may further adjust, based on a theme color of the extended-screen desktop and/or a wallpaper color of the extended-screen desktop, colors of interface elements displayed in the extended-screen status bar 401. Optionally, when the theme color of the extended-screen desktop and the color of the desktop wallpaper 404 are relatively dark, the colors of the interface elements in the extended-screen status bar 401 are adjusted to white or another preset light color; or when the theme color of the extended-screen desktop and the color of the desktop wallpaper are relatively light, the colors of the interface elements in the extended-screen status bar 401 are adjusted to black or another preset dark color. For example, as shown in
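The specific color-adjustment policy is not limited in the foregoing description. The following minimal sketch assumes one possible heuristic, an average-luminance threshold computed over the wallpaper bitmap; the class name and the threshold value 128 are assumptions:

    import android.graphics.Bitmap;
    import android.graphics.Color;

    public final class StatusBarTinter {
        /** Returns an icon color for the extended-screen status bar based on the wallpaper. */
        public static int pickIconColor(Bitmap wallpaper) {
            int w = wallpaper.getWidth(), h = wallpaper.getHeight();
            int stepX = Math.max(1, w / 32), stepY = Math.max(1, h / 32);
            long lumaSum = 0;
            int samples = 0;
            // Sample a coarse grid of pixels and average their luma (Rec. 601 weights).
            for (int y = 0; y < h; y += stepY) {
                for (int x = 0; x < w; x += stepX) {
                    int c = wallpaper.getPixel(x, y);
                    lumaSum += (299 * Color.red(c) + 587 * Color.green(c) + 114 * Color.blue(c)) / 1000;
                    samples++;
                }
            }
            long avgLuma = lumaSum / Math.max(1, samples);
            // Dark wallpaper -> light (white) icons; light wallpaper -> dark (black) icons.
            return avgLuma < 128 ? Color.WHITE : Color.BLACK;
        }
    }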
The search bar 402 may receive an input operation (for example, a double-click operation performed by using the right button of the mouse) of the user; and in response to the input operation, the computer 200 displays a global-search floating window 23 shown in
In this embodiment of this application, a global search may be implemented by using the search bar 402, so that an application locally installed on the mobile phone 100, a stored file, and the like can be searched offline, and resources such as news, a video, and music can be searched online.
In some embodiments, the search bar 402 on the extended-screen desktop may be disposed in the extended-screen status bar 401. This is not specifically limited herein.
It needs to be noted that, in some embodiments, the user may move the cursor on the home screen 16 of the mobile phone 100 displayed by the computer 200 by using the mouse. Optionally, in this embodiment of this application, for a specified input operation performed by the user by using the mouse, a corresponding cursor motion effect is added, thereby providing visual feedback for the input operation of the user. For example, an initial display form of the cursor of the mouse may be a first shape (for example, an arrow). When the cursor of the mouse hovers over a specified interface element (for example, the search bar 402 shown in
The Dock bar 403 may also be referred to as a program dock. For example, as shown in
In some embodiments, the user may specify a display status of the Dock bar 403. Optionally, the display status of the Dock bar 403 is specified as being automatically hidden: when the computer 200 displays the extended-screen desktop of the mobile phone 100 or displays a user interface of another application of the mobile phone 100, the Dock bar 403 is automatically hidden, and the computer 200 displays the Dock bar 403 only when the user moves the cursor, by using the mouse, close to an area in which the Dock bar 403 is located. Optionally, the display status of the Dock bar 403 is specified as being always displayed: when the computer 200 displays the extended-screen desktop of the mobile phone 100 or displays, in a form of a floating window, a user interface of another application of the mobile phone 100, the Dock bar 403 is always displayed; when the computer 200 displays the user interface of the another application in full screen, the Dock bar 403 is automatically hidden, and the computer 200 displays the Dock bar 403 only when the user moves the cursor close to the area in which the Dock bar 403 is located.
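The two display statuses described above can be summarized by the following illustrative sketch; the class, mode, and parameter names are assumptions:

    import android.view.View;

    final class DockBarVisibilityPolicy {
        enum Mode { AUTO_HIDE, ALWAYS_SHOW }

        private final View dockBar;

        DockBarVisibilityPolicy(View dockBar) {
            this.dockBar = dockBar;
        }

        /**
         * Recomputes Dock bar visibility. cursorNearDockArea is true when the user
         * has moved the cursor close to the area in which the Dock bar is located.
         */
        void update(Mode mode, boolean cursorNearDockArea, boolean fullScreenAppShown) {
            boolean visible;
            if (mode == Mode.AUTO_HIDE) {
                // Automatically hidden: shown only when the cursor approaches it.
                visible = cursorNearDockArea;
            } else {
                // Always displayed: hidden only while an application runs in full screen.
                visible = !fullScreenAppShown || cursorNearDockArea;
            }
            dockBar.setVisibility(visible ? View.VISIBLE : View.GONE);
        }
    }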
The application list icon 701A may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a user interface 24 shown in
Optionally, the user interface 24 may further include a page indicator 704, the other application icons may be distributed on a plurality of pages, and the page indicator 704 may be used to indicate which page of applications the user is currently viewing. Optionally, the user may slide leftward or rightward on the user interface 24 by using a finger to view an application icon on another page. Optionally, as shown in
In this embodiment of this application, the user may add an application icon in the application icon list 703 to the fixed-application area 701 of the Dock bar. For example, as shown in
In this embodiment of this application, an application icon in the application icon list 703 or in the Dock bar 403 may receive an input operation (for example, a double-click operation performed by using the left button of the mouse or a touch operation performed by using a finger of the user) of the user; and in response to the input operation, the computer 200 may display a user interface of an application corresponding to the application icon.
It needs to be noted that some third-party applications installed on the mobile phone 100 do not support extended-screen display. In this embodiment of this application, the mobile phone 100 specifies a trustlist of applications that can be displayed on the extended screen, and a function that cannot be implemented on the extended-screen desktop can be shielded by specifying the trustlist. In some embodiments, the application icon list 703 displays only an icon of an application in the trustlist; and after a third-party application supports extended-screen display, the application is dynamically added to the trustlist, and the icon of the application is added to the application icon list 703. In addition, among the applications that support extended-screen display, some applications are applicable to the screen resolution of the computer 200, and a PC-like user interface is displayed in full screen on the extended screen; some other applications are not applicable to the screen resolution of the computer 200, and for these applications, only the standard user interface of the mobile phone 100 can be displayed on the extended screen.
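The trustlist mechanism might be sketched as follows, assuming, purely for illustration, that the trustlist is an in-memory set of package names:

    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;

    final class ExtendedScreenTrustlist {
        private final Set<String> allowedPackages = ConcurrentHashMap.newKeySet();

        /** Dynamically adds an application once it supports extended-screen display. */
        void allow(String packageName) {
            allowedPackages.add(packageName);
        }

        /** Only icons of trustlisted applications appear in the application icon list 703. */
        boolean shouldShowOnExtendedScreen(String packageName) {
            return allowedPackages.contains(packageName);
        }
    }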
For example, Gallery supports full-screen display of the PC-like user interface on the extended screen, and the icon 703B of Gallery may receive an input operation (for example, a double-click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 may display a PC-like user interface 25 of Gallery in full screen, as shown in
Specifically, in some embodiments, when the computer 200 receives an input operation (for example, a double-click operation performed by using the left button of the mouse on the icon 703B of Gallery) performed on an application icon 1 to start an application 1, the computer 200 sends related information (for example, coordinates double-clicked by using the left button, and the DisplayId corresponding to the extended screen on which the input operation is performed) of the input operation to the mobile phone 100; and the mobile phone 100 identifies, based on the related information of the input operation and an interface layout of a user interface (for example, the user interface 24) currently running in the foreground of the Display1, that the input operation is an input operation performed on the application icon 1 on the extended-screen desktop, and determines that the input operation is used to start the application 1.
It needs to be noted that, when the user intends to start the application 1 of the mobile phone 100 on the extended-screen desktop, the application 1 may have already been started on the standard desktop of the mobile phone 100.
Optionally, when the mobile phone 100 determines that the mobile phone 100 currently does not run the application 1 in the Display0, the mobile phone 100 starts a user interface 1 (for example, the user interface 25 of Gallery) of the application 1 in the Display1, and projects updated display content of the Display1 to the computer 200; and the computer 200 displays the user interface 1 based on the projection data sent by the mobile phone 100. Optionally, when the mobile phone 100 determines that the mobile phone 100 is running the application 1 in the Display0, the mobile phone 100 moves an activity stack (ActivityStack) of the application 1 in the Display0 to the Display1, runs the application 1 in the Display1, clears running data of the application 1 in the Display0, and then projects updated display content of the Display1 to the computer 200; and the computer 200 displays a user interface (for example, the user interface 25 of Gallery) of the application 1 based on the projection data sent by the mobile phone 100. Optionally, when the mobile phone 100 determines that the mobile phone 100 is running the application 1 in the Display0, the mobile phone 100 sends prompt information 1 to the computer 200; the computer 200 may display prompt information 2 based on the prompt information 1, to prompt the user that the mobile phone 100 is running the application 1; and the user may control the application 1 by using the mobile phone 100.
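The first branch (directly starting the user interface 1 of the application 1 in the Display1) can be illustrated with the public Android ActivityOptions API; the stack-moving and prompt branches rely on internal framework changes and are not reproduced here:

    import android.app.ActivityOptions;
    import android.content.Context;
    import android.content.Intent;

    final class ExtendedScreenStarter {
        /** Starts launchIntent on the display identified by displayId (e.g. the Display1). */
        static void startOnDisplay(Context context, Intent launchIntent, int displayId) {
            ActivityOptions options = ActivityOptions.makeBasic();
            options.setLaunchDisplayId(displayId); // route the activity to the extended screen
            launchIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            context.startActivity(launchIntent, options.toBundle());
        }
    }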
The multi-task icon 701B may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 displays a recent tasks screen 27 shown in
As shown in
When an application window of another application is displayed on the extended-screen desktop, the icon 701C for displaying the desktop may be used to minimize one or more currently displayed application windows, and display the extended-screen desktop. When no other application window is displayed on the extended-screen desktop, the icon 701C for displaying the desktop may be used to restore display of one or more application windows recently minimized. For example, as shown in
In some embodiments, a user interface (for example, the user interface 25 of Gallery shown in
As shown in
In some embodiments, the computer 200 may display a plurality of application windows on the extended-screen desktop of the mobile phone 100 in a tiled manner. For example, as shown in
In this embodiment of this application, an application icon in the Dock bar 403 or in the application icon list 703 may alternatively receive an input operation (for example, a click operation performed by using the right button of the mouse) of the user; and in response to the input operation, the computer 200 may display a plurality of operation options for an application corresponding to the icon, to implement removal, sharing, uninstallation, another shortcut function, or the like of the application. It needs to be noted that, each application may correspond to different operation options. For example, some system applications (such as Gallery and File management) cannot be uninstalled.
For example, as shown in
The removal control 801 may receive an input operation (for example, a click operation performed by using the left button of the mouse) of the user; and in response to the input operation, the computer 200 may remove the icon 701E of Browser from the Dock bar 403. The sharing control 802 is used to share a browser application with a target object. The uninstallation control 803 is used to uninstall the browser application of the mobile phone 100.
In some embodiments, an application icon (for example, an icon 701F of Gallery) in the Dock bar 403 may further receive a hovering operation (for example, hovering the cursor of the mouse on the application icon) of the user; and in response to the hovering operation, if an application corresponding to the application icon is in a background running status, the computer 200 may display a thumbnail of the application, where the thumbnail shows a user interface in which the application has recently run. For example, as shown in
In some embodiments, a style of a lock screen of the extended-screen desktop of the mobile phone 100 is consistent with a style of a lock screen of the standard desktop of the mobile phone 100.
Optionally, after the user controls the computer 200 to perform screen locking on the extended-screen desktop of the mobile phone 100, the mobile phone 100 also performs screen locking on the standard desktop; and vice versa. For example, after the user controls, on the computer 200, the extended-screen desktop of the mobile phone 100 to perform screen locking, the computer 200 displays a lock screen 31 of the extended-screen desktop shown in
Optionally, after the user controls the computer 200 to perform screen locking on the extended-screen desktop of the mobile phone 100, the mobile phone 100 does not perform, along with the extended-screen desktop, screen locking on the standard desktop; and vice versa.
The following describes a software system of the mobile phone 100 according to an embodiment of this application with reference to the accompanying drawings.
First, it needs to be noted that, the screen projection method provided in embodiments of this application involves modifying a plurality of system applications (including a Launcher, a SystemUI, a file manager (FileManager), global search (Hisearch), wireless projection (AirSharing), and the like) and a system framework.
The desktop launcher is used to manage a desktop layout, a Dock bar, an application list, a plurality of tasks, and the like. The SystemUI is a UI component that provides system-level information display and interaction for the user; and is used to manage status bars, navigation bars, notification centers, lock screens, wallpapers, and the like. The file manager is used to provide a file box externally, manage desktop files, provide a file operation capability externally, and implement file dragging between applications. The global search is used to implement local and online global search. The wireless projection is used to implement wireless projection from the mobile phone 100 to a projection destination device (for example, the computer 200).
In this embodiment of this application, the software system of the mobile phone 100 may use a hierarchical architecture, an event-driven architecture, a microkernel architecture, a micro-service architecture, or a cloud architecture. In this embodiment of this application, an Android system with a hierarchical architecture is used as an example to describe a software structure of the electronic device 100.
For example,
As shown in
The Android runtime includes core libraries and virtual machines. The Android runtime is responsible for scheduling and management of the Android system.
The core libraries include two parts, namely, functions that the Java language needs to invoke, and the core libraries of Android.
The application layer and the application framework layer run in the virtual machines. The virtual machines execute Java files of the application layer and the application framework layer as binary files. The virtual machines are used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The application layer includes a series of application packages (Android application package, APK), for example, a desktop launcher APK (HWLauncher6.apk), a system interface APK (SystemUI.apk), a file manager APK (FileManager.apk), a global search APK (Hisearch.apk), and a wireless sharing APK (AirSharing.apk).
In some embodiments, the desktop launcher APK includes a standard desktop launcher (UniHomelauncher) and an extended-screen desktop launcher (PcHomelauncher). The system interface APK (SystemUI.apk) includes a standard status bar (StatusBar) and an extended-screen status bar (PcStatusBar). The file manager APK (FileManager.apk) includes a standard interface, a columns interface, and the file box. The global search APK (Hisearch.apk) includes a standard search interface and an extended-screen search interface. The wireless sharing APK (AirSharing.apk) includes common wireless projection and Huawei-developed high-definition display projection. In this embodiment of this application, an extended-screen desktop may also be referred to as a Pc desktop, and the extended-screen status bar may also be referred to as a Pc status bar. This is not specifically limited herein.
As shown in
The application layer further includes a Kit (Kit). The Kit includes a software development kit (HwSDK), a user interface kit (Uikit), and a projection protocol kit (Cast+kit).
In this embodiment of this application, to implement the screen projection method provided in embodiments of this application, the HwSDK is added to the Kit. The HwSDK is a set of development tools for establishing application software for a software package, a software framework, a hardware platform, and an operating system that are related to this application. To ensure compatibility of the Kit, the Uikit and the Cast+kit are modified in this embodiment of this application, so that the Uikit can implement enhancement of capabilities of native controls on the extended-screen desktop, optimization of text right-click menus, dynamic effect optimization of operations such as mouse clicking and mouse hovering (Hover), and the like; and the Cast+kit can implement projection of the extended-screen desktop to a high-definition display.
The application framework layer provides application programming interfaces (application programming interfaces, APIs) and a programming framework for applications at the application layer. The application framework layer includes some predefined functions.
In this embodiment of this application, to implement the screen projection method provided in embodiments of this application, a Huawei-developed display projection service (HwProductiveMultiWindowManager) and a Dock bar management service (DockBarManagerService) are added to the application framework layer. The Huawei-developed display projection service is used to simultaneously run a plurality of application windows of a same application, and project a specified window in the plurality of application windows to the projection destination device (for example, the computer 200). The Dock bar management service is used to manage the Dock bar.
To ensure compatibility of the application framework layer, a display management service (DisplayManagerService, DMS), an input/output service (InputManagerService, IMS), and a parallel view service (HwPartsMagicWindow) are further modified in this embodiment of this application.
The DMS is used to: manage a life cycle of interface display; control logical display of the default-screen display area (Display0) and the extended-screen display area (Display1) of a current connection; and send a notification or the like to the system and an application when display statuses of the Display0 and the Display1 change. The IMS is used to manage inputs and outputs of the mobile phone 100. An input/output device of the mobile phone 100 may include a printer, a keyboard, a mouse, a hard disk, a magnetic disk, and a rewritable optical disc. The parallel view service is used to implement an application screen splitting function. To be specific, two user interfaces corresponding to two different activities of an application may be simultaneously displayed, so that the two user interfaces can be displayed on split screens of a same display screen, or the two user interfaces can be displayed on different display screens. In addition, the parallel view service provided in this embodiment of this application supports user interface display by using a floating window.
In this embodiment of this application, for the mobile phone 100, a package management service (PackageManagerService, PMS) and a wallpaper service (WallpaperService) at the application framework layer have also been modified, so that a policy for selecting the default Launcher (that is, UniHomelauncher) is added to the package management service, and the WallpaperService supports displaying a wallpaper in a plurality of Displays.
The system library may include a plurality of functional modules, for example, an event dispatcher (InputDispatcher), an audio system (AudioSystem), and a projection protocol (Huawei Cast+). To implement the projection of the extended-screen desktop to the high-definition display, the projection protocol is correspondingly modified in this embodiment of this application.
The following describes in detail, separately for the Launcher and the SystemUI, a software system framework to which the projection method provided in the embodiments of this application is applied.
For example, for the SystemUI,
The SystemUIApplication is a subclass of the Application, and is responsible for initializing all components of the SystemUI.
The Base Class includes a system interface factory class (SystemUIFactory), a system interface root component (SystemUIRootComponent), and the like. The SystemUIFactory is used to create the components of the SystemUI; and the SystemUIRootComponent is used to implement initialization of a dependency injection framework (dagger).
The Service includes a system interface service (SystemUIService). The SystemUIService is used to initialize a series of components of the SystemUI. When the SystemUI is started, the SystemUIService instantiates sub-services defined in a service list of the SystemUIService one by one. A start( ) method of each sub-service is invoked to run the sub-service. All the sub-services are inherited from a SystemUI abstract class. The status bars and the navigation bars are sub-services in the service list.
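A condensed sketch of this startup pattern follows; the reflection-based instantiation mirrors the AOSP convention, but the class names and the service-list handling are simplified here:

    // A condensed sketch of the sub-service startup pattern; simplified relative to AOSP.
    public abstract class SystemUI {
        public abstract void start();
    }

    final class SystemUIServiceSketch {
        /** Instantiates each sub-service defined in the service list and runs it. */
        void startServices(String[] serviceClassNames) throws Exception {
            for (String name : serviceClassNames) {
                SystemUI service = (SystemUI) Class.forName(name)
                        .getDeclaredConstructor()
                        .newInstance();
                service.start(); // status bars and navigation bars are among these sub-services
            }
        }
    }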
The Component includes a command queue (CommandQueue), a dependency class (Dependency), a lock-screen view mediator (KeyguardViewMediator), a notification center (Notification), a standard system bar (Systembars), a standard status bar, and the like.
The CommandQueue is a Binder class used to process requests related to the status bar and the notification center. The CommandQueue is registered by the status bar into a status bar management service (StatusBarManagerService), to receive messages of the StatusBarManagerService. The CommandQueue internally maintains an event queue. A status bar service (StatusBarService) is used to implement Callbacks callbacks in the CommandQueue. In this embodiment of this application, the CommandQueue component is modified, so that the CommandQueue component supports message distribution of a plurality of StatusBar (that is, the standard status bar and the extended-screen status bar).
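The following sketch shows one way such a component could distribute messages to a plurality of status bars keyed by DisplayId; the Callbacks interface is heavily simplified and is not the actual SystemUI class:

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    final class MultiDisplayCommandQueue {
        /** Simplified stand-in for the Callbacks interface implemented by each status bar. */
        interface Callbacks {
            void setIcon(String slot, int iconResId);
        }

        // displayId -> callbacks registered by the status bar running in that display area
        private final Map<Integer, List<Callbacks>> callbacksByDisplay = new HashMap<>();

        synchronized void addCallback(int displayId, Callbacks cb) {
            callbacksByDisplay.computeIfAbsent(displayId, k -> new ArrayList<>()).add(cb);
        }

        /** Distributes a StatusBarManagerService message to the status bar of one display. */
        synchronized void setIcon(int displayId, String slot, int iconResId) {
            for (Callbacks cb : callbacksByDisplay.getOrDefault(displayId, Collections.emptyList())) {
                cb.setIcon(slot, iconResId);
            }
        }
    }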
The Dependency is used to create a globally available dependency.
The SystemBars is inherited from the base class SystemUI, and is an entry class for creating an entire SystemUI view. The standard status bar is mainly used to: display, on the standard desktop, notification icons (Icon) of applications and status icons (an alarm clock icon, a Wi-Fi icon, a SIM card icon, a system time icon, and the like) of the system, and control and manage these icons. The Notification is used to display notification information in the status bar, and control and manage the notification information.
The KeyguardViewMediator is a core class for screen locking. Other lock-screen objects interact with each other through the KeyguardViewMediator. The KeyguardViewMediator is a status callback management class. All calls from a lock-screen service (KeyguardService) are dispatched by the KeyguardViewMediator to the UI thread.
The dependency class provider includes a status bar window control class (StatusBarWindowController), a status bar icon controller implementation class (StatusBarlconControllerImpl), a status bar policy (StatusBarPolicy), and the like.
The StatusBarWindowController is used to manage a status bar window view (StatusBarWindowView), and can invoke an interface of a WindowManager to display the standard status bar. In this embodiment of this application, the StatusBarWindowController component is modified, so that the StatusBarWindowController component supports addition, deletion, and management of a plurality of status bar objects.
The StatusBarIconControllerImpl is used to manage application icons in the status bar, including sizes, positions, and color changes of the icons.
The StatusBarPolicy is used to manage a display policy (for example, update of icons in the status bar, display time, and display position) of the status bar. The StatusBarPolicy class is a policy management class, and actual functions are implemented by the StatusBarService.
The UI control class includes a status bar window view (StatusBarWindowView), a notification panel view (NotificationPanelView), a quick setting fragment (QSFragment), a status bar fragment (StatusBarFragment), a status bar view (StatusBarView), and the like.
The StatusBarWindowView is used to determine a root layout when the status bar is not extended and create a status bar window view.
The StatusBarView is responsible for creating and instantiating the entire SystemUI view (including the status bar, the notification center, the lock screen, and the like). The StatusBarView defines names, display sequences, portable network graphics (Png), and the like of icons displayed in the status bar. During initialization of the StatusBarService, a StatusBarView for displaying the status bar is initialized. The StatusBarService invokes a makeStatusBarView method to implement initialization of the status bar.
The NotificationPanelView is a control class of the notification center after the status bar is pulled down.
The QSFragment is a control class of a control center after the status bar is pulled down.
The StatusBarFragment manages the status bar in a collapsed status, and is responsible for life cycle management of the icons in the status bar.
To implement the extended-screen status bar, in this embodiment of this application, a Service, a Component, a dependency class provider, and a UI control class that are used for implementing the extended-screen status bar are added to the system interface APK.
The Service used to implement the extended-screen status bar includes a productivity service (ProductiveService). The ProductiveService is inherited from the Service.
The Component used to implement the extended-screen status bar includes: a Pc dependency class (PcDependency), a Pc system provider (PcSystemProviders), a Pc system bar (PcSystembars), and an extended-screen status bar. The PcDependency, the PcSystembars, and the extended-screen status bar are all inherited from corresponding components of the foregoing standard status bar, and implement similar functions for the extended-screen status bar. Details are not described herein again.
The dependency class provider used to implement the extended-screen status bar includes: a Pc status bar window control class (PcStatusBarWindowController), a screen control class (ScreenController), a lock-screen control class (KeyguardController), and a remote control class (RemoteController). The PcStatusBarWindowController is inherited from the StatusBarWindowController of the foregoing standard status bar, and implements a similar function for the extended-screen status bar. Details are not described herein again.
The screen control class (ScreenController) is used to control turning the screen on the projection side on and off.
In this embodiment of this application, a Keyguard component, for example, a lock-screen control class (KeyguardController), is modified, so that the Keyguard component supports screen locking in a plurality of Displays.
The remote control class (RemoteController) is used to communicate with the application framework layer (Framework).
The UI control class used to implement the extended-screen status bar includes: a Pc status bar window view (PcStatusBarWindowView), a Pc notification panel view (PcNotificationPanelView), a Pc quick setting fragment (PcQSFragment), a Pc status bar fragment (PcStatusBarFragment), a Pc status bar view (PcStatusBarView), and the like. The added UI control class is inherited from the corresponding control class of the foregoing standard status bar, and implements similar functions for the extended-screen status bar. Details are not described herein again.
In addition, to ensure compatibility, in this embodiment of this application, a window management service (WMS), a status bar management service (StatusBarManagerService), a wallpaper management service (WallpaperManagerService), a notification management service (NotificationManagerService, NMS), and a Pc module (HwPartsPowerOffice) at the application framework layer are also modified correspondingly.
The window management service (WindowManagerService, WMS) includes a window management policy (WindowManagerPolicy) and a lock-screen service agent (KeyguardServiceDelegate). The WMS allows a same application to simultaneously run a plurality of application windows.
The StatusBarManagerService supports registration and management of a plurality of status bars (such as the standard status bar and the extended-screen status bar). The StatusBarManagerService is a manager of the StatusBarService. The StatusBarService is used to implement: loading, updating, and deletion of the icons in the status bar; interaction between an application and the status bar; notification information processing; and the like.
The WallpaperManagerService allows a wallpaper to be displayed in a plurality of Displays (for example, the Display0 corresponding to the standard desktop and the Display1 corresponding to the extended-screen desktop).
The NMS supports a UX style of a Pc, and supports management of notification centers after the plurality of status bars are pulled down.
The HwPartsPowerOffice is used to modify a projection entry. In this embodiment of this application, changes to a Pc management service (HwPCManagerService) are added to the HwPartsPowerOffice, so as to implement loading of the extended-screen desktop and the extended-screen status bar in a projection scenario.
It needs to be noted that, inheritance in this application means that a subclass inherits a feature and a behavior of a parent class, so that a subclass object (instance) has an instance domain and a method of the parent class; or a subclass inherits a method from a parent class, so that the subclass has a same behavior as the parent class.
In some embodiments, the system architecture diagram shown in
It needs to be noted that, the SystemUI includes a plurality of types of system interfaces.
For example, for the Launcher,
In this application, for the common class (Common Class), the following are modified to ensure compatibility with the extended-screen desktop: a desktop launcher provider (LauncherProvider), a database assistant (DatabaseHelper), a desktop launcher setting class (LauncherSettings), and a desktop launcher constant class (LauncherConstants).
The LauncherProvider is the database content provider of the desktop launcher, and provides application icon data for the plurality of desktop instances (for example, the standard desktop and the extended-screen desktop) run by the desktop launcher; it enables other applications to access or perform operations on data in the desktop launcher.
The DatabaseHelper is used to create and maintain the database.
The LauncherSettings is used to implement definitions of character strings of database items. An internal class Favorites provides URIs for performing operations on the LauncherProvider and on the corresponding fields in the database.
The LauncherConstants maintains and manages constants in the application.
In this application, the following are added to the Common Class to implement the extended-screen desktop: a Pc layout configuration (PclayoutConfig), a Pc device file (PcDeviceProfile), a Pc cell (cell) counter (PcCellNumCalculator), a Pc desktop launcher policy (PcLauncherPolicy), a Pc desktop launcher model (PcLauncherModel), a Pc loading task (PcLoaderTask), and the like. The added common classes are inherited from corresponding common classes of the standard desktop, and are used to implement similar functions on the extended-screen desktop.
The PclayoutConfig is used to set layout attributes (for example, a width (width) and a height (height)) and parameters of the extended-screen desktop. A display effect of components on the extended-screen desktop may be restricted by specifying the layout attributes.
The PcDeviceProfile is used to define basic attributes of each module on the extended-screen desktop; and is responsible for initialization of values of the attributes, setting of a padding of a layout of each element, and the like.
The PcCellNumCalculator is a desktop icon layout policy class.
The PcLauncherPolicy manages a display policy of the extended-screen desktop.
The PcLauncherModel is a data processing class, stores a desktop status of the extended-screen desktop, provides an API for reading and writing the database, and updates the database when applications are deleted, replaced, or added.
The PcLoaderTask is used to load the extended-screen desktop.
In this application, the following are further added to the control class to implement the extended-screen desktop: a Pc drag layer (PcDraglayer), a Pc desktop workspace (PcWorkSpace), a Pc cell layout (PcCelllayout), a Pc program dock view (PcDockview), a Pc folder (PcFolder), a Pc folder icon (PcFolderIcon), and the like. The added control classes are all inherited from corresponding control classes of the standard desktop, and are used to implement similar functions on the extended-screen desktop.
The PcDraglayer is a view group (ViewGroup) responsible for distributing events, and is used to preliminarily process events of the extended-screen desktop and distribute the events based on a situation. The DragLayer includes: a desktop layout (Workspace), a navigation point (QuickNavigationView), a dock area (Hotseat), and a recent tasks list (OverviewContainer). The Hotseat is a container responsible for managing the Dock bar.
The PcWorkSpace is a subclass of PagedView, and consists of a plurality of CellLayout controls. Each CellLayout control represents a split screen. The PcWorkSpace is used to implement a split-screen sliding function on the extended-screen desktop.
The PcCellLayout is used to manage the display and layout of split-screen icons on the extended-screen desktop.
The PcDockview is used to implement a Dock layout on the projection side.
The PcFolder is used to implement folders (including a folder created by the user and folders provided by the system) on the extended-screen desktop.
To implement the extended-screen desktop, the activity stack in this embodiment of this application supports management of a task stack corresponding to the extended-screen desktop. A Pc desktop service (PcHomeService) is added to the Service. The PcHomeService is used to launch the extended-screen desktop.
In addition, to ensure compatibility of the application framework layer with the projection of the extended-screen desktop, in this embodiment of this application, an activity management service (ActivityManagerService, AMS) at the application framework layer is also modified correspondingly, so that the AMS supports simultaneous running of a plurality of application windows by using a same application, for example, supports simultaneous running of the foregoing two desktop instances.
In some embodiments, a Dock bar is added to the extended-screen desktop; and correspondingly, an IdockBar.aidl is added to the desktop launcher, to provide a Binder interface for the framework layer to perform operations on the Dock bar.
The following describes task stacks according to an embodiment of this application.
For example, as shown in
The PcHomelauncher runs the HomeStack of the extended screen of the mobile phone 100 by using the Display1, and the UniHomeLauncher runs the HomeStack of the default screen of the mobile phone 100 by using the Display0. In this way, data isolation between the HomeStack of the extended screen and the HomeStack of the default screen can be implemented. After obtaining input events by using a pointer event listener (TapPointerListener), the WMS can distribute an input event acting on the extended screen to the Display1 and distribute an input event acting on the default screen to the Display0.
Based on the foregoing application scenario and software system, the following describes in detail specific implementation of a screen projection method according to an embodiment of this application.
Refer to
S101: After being connected to a display of a projection destination device (for example, the computer 200), the mobile phone 100 receives a projection instruction for confirming projection.
S102: In response to the projection instruction, the mobile phone 100 adds an extended-screen status bar and a related logic control class by using the SystemUI, where a DisplayId corresponding to the extended-screen status bar is an ID of an extended-screen display area Display1.
S103: In response to the projection instruction, the mobile phone 100 further adds an extended-screen desktop and a related logic control class by using the Launcher, where a DisplayId corresponding to the extended-screen desktop is the ID of the extended-screen display area Display1.
Specifically, in some embodiments, in response to the projection instruction, the mobile phone 100 starts the service ProductiveService by using the HwPCManagerService, to load the extended-screen status bar; and starts the service PcHomeService by using the HwPCManagerService, to load the extended-screen desktop.
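A minimal sketch of this startup step follows; the intent actions and package names are placeholders, because the actual wiring between the HwPCManagerService and the two services is internal:

    import android.content.ComponentName;
    import android.content.Context;
    import android.content.Intent;
    import android.content.ServiceConnection;
    import android.os.IBinder;

    final class ProjectionBootstrap {
        void onProjectionConfirmed(Context context) {
            // Load the extended-screen status bar (SystemUI side); action name is assumed.
            Intent statusBar = new Intent("com.example.action.PRODUCTIVE_SERVICE");
            statusBar.setPackage("com.android.systemui");
            context.bindService(statusBar, new ServiceConnection() {
                @Override public void onServiceConnected(ComponentName n, IBinder b) { }
                @Override public void onServiceDisconnected(ComponentName n) { }
            }, Context.BIND_AUTO_CREATE);

            // Load the extended-screen desktop (Launcher side); names are assumed.
            Intent home = new Intent("com.example.action.PC_HOME_SERVICE");
            home.setPackage("com.example.launcher");
            context.startService(home);
        }
    }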
To further describe specific implementation of the projection method provided in this embodiment of this application, the following separately describes in detail software implementation procedures of the SystemUI and the Launcher according to this embodiment of this application.
For example,
(1) When the mobile phone 100 is powered on in the non-projection mode, the mobile phone 100 starts a Zygote process, creates a virtual machine instance by using the Zygote process, and executes a system service (SystemServer).
(2) The SystemServer starts a series of services required for the running of the system, including the SystemUIService.
(3) The SystemUIService starts the SystemUI, and invokes the SystemUIApplication.
Specifically, the SystemServer separately starts a boot service, a core service, and other services; and separately starts, by invoking an mActivityManagerService.systemReady( ) method in a startOtherServices method, the SystemUI and the Launcher.
(4) The SystemUIApplication starts the components of the SystemUI by using a startServicesIfNeeded function. The components of the SystemUI include the Systembars.
(5) After being started, the SystemBars starts the standard status bar (StatusBar) based on a configuration item config_statusBarComponent.
In this embodiment of this application, after being started, the SystemUIService reads a configuration item config_systemUIServiceComponents, and loads each component (including the SystemBars). After being loaded, the SystemBars component reads the configuration item (config_statusBarComponent), determines an operation control type of the status bar based on the configuration item, and then determines whether to start the extended-screen status bar (PcStatusBar) or the standard status bar (StatusBar).
In some embodiments, when the mobile phone 100 is in the non-projection mode, a value of the config_statusBarComponent is com.android.systemui.statusbar.phone.PhoneStatusBar, and the SystemBars determines that the operation control type of the status bars is StatusBar. When the mobile phone 100 is in a projection mode, a value of the config_statusBarComponent is com.android.systemui.statusbar.tablet.TabletStatusBar, and the SystemBars determines that the operation control type of the status bars is PcStatusBar.
(6) A callback interface of the CommandQueue is invoked to add a callback to the standard status bar.
(7) The standard status bar initializes a layout, and registers an IstatusBar object corresponding to the standard status bar to the StatusBarManagerService.
(8) The standard status bar creates and adds the standard status bar window view (StatusBarWindowView) to the StatusBarWindowController.
(9) The StatusBarWindowController invokes an interface of the WindowManager to add the standard status bar to the WMS, so as to add the standard status bar to the Display0.
In some embodiments, in a startup process of the StatusBar, the CommandQueue object is transferred to the StatusBarManagerService, and is stored as mBar. When a client obtains an interface of the system service StatusBarManagerService through the ServiceManager, the client can invoke a method of the CommandQueue object through the mBar. The CommandQueue invokes the callback interface, to send back a message to the StatusBar to update the StatusBar.
(10) The standard status bar invokes a status bar prompt manager (StatusBarPromptManager) to register a receiver for the projection broadcast (Broadcast).
(11) After the standard status bar is created, the notification management service (NotificationManagerService) is invoked to register a message listener.
(12) After being connected to a display (DisplayDevice) (that is, the extended screen) of the projection destination device, the mobile phone 100 invokes the DMS.
In this embodiment of this application, the mobile phone 100 may connect to the projection destination device by using a wired connection, or may connect to the projection destination device by using a wireless communication technology such as NFC, Wi-Fi, or Bluetooth.
(13) The DMS triggers a display event (OnDisplayevent), creates a logical display area Display1 corresponding to the extended screen, and invokes the display management global class (DisplayManagerGlobal) to obtain a DisplayId of the Display1.
In some embodiments, the DMS invokes a handleDisplayDeviceAddedLocked function to generate a corresponding logical device (LogicalDevice) for the display (DisplayDevice), adds the LogicalDevice to a logical device list (mLogicalDevices) managed by the DMS, and adds the DisplayDevice to a display list (mDisplayDevices) of the DMS, so as to generate a logical display area (LogicalDisplay) (that is, the extended-screen display area Display1) for the DisplayDevice. The DMS invokes the DisplayManagerGlobal to determine a DisplayId of the logical display area.
(14) The DisplayManagerGlobal sends the display event to a display listening agent (DisplaylistenerDelegate), to add a display listening agent for the Display1.
The DisplayManagerGlobal is mainly responsible for managing communications between the display manager (DisplayManager) and the DMS.
(15) The DisplaylistenerDelegate sends a notification (onDisplayAdded) indicating that a display area has been added.
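From an application process, the same display-added notification can be observed with the public DisplayManager listener API, as the following sketch shows; the reaction inside onDisplayAdded is left schematic:

    import android.content.Context;
    import android.hardware.display.DisplayManager;
    import android.os.Handler;
    import android.os.Looper;

    final class ExtendedDisplayWatcher {
        void watch(Context context) {
            DisplayManager dm = context.getSystemService(DisplayManager.class);
            dm.registerDisplayListener(new DisplayManager.DisplayListener() {
                @Override public void onDisplayAdded(int displayId) {
                    // A new logical display (e.g. the Display1) exists; this is where
                    // loading of the extended-screen desktop and status bar would begin.
                }
                @Override public void onDisplayRemoved(int displayId) { }
                @Override public void onDisplayChanged(int displayId) { }
            }, new Handler(Looper.getMainLooper()));
        }
    }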
(16) A RootActivityContainer starts the HwPCManagerService to load the extended-screen status bar.
In this embodiment of this application, changes to the HwPCManagerService are added to the HwPartsPowerOffice, so as to start the extended-screen desktop and the extended-screen status bar in a projection scenario.
(17) The HwPCManagerService sends a projection capsule prompt broadcast to the status bar prompt manager (StatusBarPromptManager).
(18) The HwPCManagerService sends a projection notification message to the NotificationManagerService.
(19) The HwPCManagerService receives a mode switching instruction, where the instruction instructs to switch the current non-projection mode to the projection mode.
In this embodiment of this application, after receiving the capsule prompt broadcast and the projection notification message, the SystemUI displays a capsule prompt in the status bar, and displays the projection notification message in the notification center. For example, the capsule prompt is the prompt box 13 shown in
(20) In response to the instruction, the HwPCManagerService starts the ProductiveService.
In some embodiments, the HwPCManagerService invokes a bindService to start the ProductiveService.
(21) The ProductiveService invokes the Systembars to start the extended-screen status bar.
In some embodiments, the ProductiveService invokes PcSystembars to start the extended-screen status bar.
(22) The Systembars creates the extended-screen status bar based on a configuration file.
In some embodiments, the Systembars reads the configuration config_PcstatusBarComponent, determines that the operation control type of the status bars is PcStatusBar, and then creates the extended-screen status bar.
(23) The extended-screen status bar invokes the callback interface of the CommandQueue, to add a callback to the extended-screen status bar.
(24) The extended-screen status bar initializes a layout, and registers an IstatusBar object corresponding to the extended-screen status bar to the StatusBarManagerService.
(25) The extended-screen status bar creates and adds the Pc status bar window view (StatusBarWindowView) to the StatusBarWindowController.
(26) The StatusBarWindowController invokes the interface of the WindowManager to add the extended-screen status bar to the WMS, so as to add the extended-screen status bar to the Display1.
For example, with reference to the software implementation procedure of the SystemUI shown in
As shown in
In the projection mode, Contexts of the status bars need to be adjusted to ensure that each status bar obtains corresponding Display information (such as a display ID). The StatusBarWindowController also needs to support adding a plurality of status bars. When the WindowManagerService adds a window by using an addWindow function, a DisplayId of a window status (windowState) of the window needs to be modified, to ensure that the window can be correctly displayed on a corresponding display screen. For example, the standard status bar can be displayed on the default screen of the mobile phone 100, and the extended-screen status bar can be displayed on the extended screen (that is, the display screen of the computer 200) of the mobile phone 100.
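The following minimal sketch shows how a status bar view can be bound to a specific logical display by creating a display-specific Context, so that the WindowManager adds the window to the intended screen. It is illustrative only: a real status bar uses the system-only window type TYPE_STATUS_BAR, and the class and method names here are assumptions:

    import android.content.Context;
    import android.graphics.PixelFormat;
    import android.hardware.display.DisplayManager;
    import android.view.Display;
    import android.view.View;
    import android.view.WindowManager;

    public final class StatusBarAttacher {
        /** Attaches the given status bar view to the display identified by displayId. */
        public static void attach(Context appContext, View statusBarView, int displayId) {
            DisplayManager dm = appContext.getSystemService(DisplayManager.class);
            Display display = dm.getDisplay(displayId);           // Display0 or Display1
            // A Context bound to the target display yields a WindowManager for that display.
            Context displayContext = appContext.createDisplayContext(display);
            WindowManager wm = displayContext.getSystemService(WindowManager.class);
            WindowManager.LayoutParams lp = new WindowManager.LayoutParams(
                    WindowManager.LayoutParams.MATCH_PARENT,
                    WindowManager.LayoutParams.WRAP_CONTENT,
                    // An overlay type keeps this sketch runnable by ordinary apps.
                    WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
                    WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE,
                    PixelFormat.TRANSLUCENT);
            wm.addView(statusBarView, lp); // the window lands on the intended display
        }
    }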
Refer to
In the non-projection mode, after the SystemBars is started, the SystemBars obtains a default status bar (that is, the standard status bar of the mobile phone 100) based on the configuration item config_statusBarComponent. The SystemBars invokes a start( ) function of the standard status bar to perform initialization. A Callback is set by using the CommandQueue component. Then the Binder interface is invoked to register, by using a registerStatusBar function, the IstatusBar object corresponding to the standard status bar to the StatusBarManagerService for management.
When the mobile phone 100 is connected to the display (that is, the computer 200), the DMS receives an OnDisplayDeviceEvent through an input channel, and the DMS distributes a display event (DisplayEvent) based on a callback record (CallbackRecord). After the mobile phone 100 switches from the non-projection mode to the projection mode, the DMS creates a logical display area Display1 corresponding to the extended screen, and invokes the DisplayManagerGlobal to obtain a DisplayId of the Display1. The DisplayManagerGlobal sends the display event to the DisplaylistenerDelegate, to add a display listening agent for the Display1. The components of the SystemUI read the configuration item config_PcsystemUIServiceComponents in the configuration file Config.xml. After the SystemBars is started, the SystemBars starts the extended-screen status bar based on the configuration item config_PcstatusBarComponent. As shown in
As shown in
For example,
First, it needs to be noted that, in this embodiment of this application, because extended-screen display needs to be supported in the projection mode, a category option “android.intent.category.SECONDARY_HOME” is added to an intent-filter configuration of a PcHomeLauncher node in the AndroidManifest.xml of the application. This option is used to represent the extended-screen desktop.
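The described intent-filter configuration might look like the following fragment, in which the activity name and all attributes other than the SECONDARY_HOME category are placeholders:

    <!-- Placeholder names; only the SECONDARY_HOME category is taken from the text. -->
    <activity android:name=".PcHomeLauncher"
              android:launchMode="singleTask">
        <intent-filter>
            <action android:name="android.intent.action.MAIN" />
            <category android:name="android.intent.category.SECONDARY_HOME" />
            <category android:name="android.intent.category.DEFAULT" />
        </intent-filter>
    </activity>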
In the non-projection mode, in the foregoing step (2), the SystemServer starts a series of services required for system running, including the Launcher.
Specifically, the SystemServer process starts the PMS and the AMS during startup. After being started, the PackageManagerService parses and installs the application APKs in the system. The AMS is mainly used to start and manage four major components. An entry for starting the Launcher is the systemReady method of the AMS.
(31) The AMS starts the Launcher by using a StartHomeOnAllDisplays method.
(32) An activity task manager (ActivityTaskManagerInternal) invokes StartHomeOnDisplays to start the Launcher.
(33) The RootActivityContainer determines whether a DisplayId is the ID of the default-screen display area (namely, the Display0). If the DisplayId is the ID of the default-screen display area, a standard-desktop Activity is resolved (resolveHomeActivity), and step (34) is performed; or if the DisplayId is not the ID of the default-screen display area, an extended-screen desktop Activity is resolved (resolveSecondaryHomeActivity), and step (40) is performed.
In some embodiments, if the DisplayId is the ID of the Display0, the PMS queries and uses an Activity of CATEGORY_HOME as a desktop; or if the DisplayId is the ID of the Display1, the PMS queries and uses an Activity of SECONDARY_HOME as a desktop.
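This branch can be illustrated with the public intent categories (CATEGORY_SECONDARY_HOME exists in AOSP since API level 29); the sketch below resolves a desktop Activity based on the DisplayId:

    import android.content.Intent;
    import android.content.pm.PackageManager;
    import android.content.pm.ResolveInfo;
    import android.view.Display;

    final class HomeResolver {
        /** Resolves the desktop Activity for a display: HOME for Display0, SECONDARY_HOME otherwise. */
        static ResolveInfo resolveHome(PackageManager pm, int displayId) {
            Intent intent = new Intent(Intent.ACTION_MAIN);
            intent.addCategory(displayId == Display.DEFAULT_DISPLAY
                    ? Intent.CATEGORY_HOME              // standard desktop on the Display0
                    : Intent.CATEGORY_SECONDARY_HOME);  // extended-screen desktop on the Display1
            return pm.resolveActivity(intent, PackageManager.MATCH_DEFAULT_ONLY);
        }
    }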
(34) An activity start controller (ActivityStartController) invokes an activity starter (ActivityStarter).
(35) The ActivityStarter invokes startHomeActivityLocked to start the standard desktop.
In the projection mode, for steps (12) to (16), refer to related descriptions about
During the execution of the steps (12) to (16), the DisplayManagerService sends an onDisplayAdded notification when adding a display area Display. After receiving the notification, the HwPCManagerService in the Pc module (HwPartsPowerOffice) starts the Launcher (namely, the PcHomeLauncher) of the extended screen by using startHomeOnDisplay.
(36) The HwPCManagerService invokes the Pc desktop service (PcHomeService) by using the BindService.
(37) The PcHomeService invokes the AMS by using StartHomeOnProductiveDisplay.
(38) The AMS invokes the StartHomeOnAllDisplays method to start the PcHomeLauncher.
(39) The activity task manager invokes StartHomeOnDisplays to start the PcHomeLauncher.
Then the step (33) is performed, where the Activity of the extended-screen desktop is obtained (resolveSecondaryHomeActivity) in the step (33); and a step (40) is performed.
(40) The activity start controller (ActivityStartController) invokes the activity starter (ActivityStarter).
(41) The ActivityStarter invokes startHomeActivityLocked to start the extended-screen desktop.
Based on the foregoing embodiments, an embodiment of this application provides a screen projection method. The projection method includes but is not limited to steps S201 to S205.
S201: A first electronic device invokes a first module of a first application to run a first desktop, where the first desktop is associated with a first display area; and the first electronic device displays first display content based on the first display area, where the first display content includes the first desktop.
S202: In response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop, where the second desktop is associated with a second display area; and the first electronic device sends second display content corresponding to the second display area to a second electronic device, where the second display content includes the second desktop.
In this embodiment of this application, the first electronic device may be the foregoing electronic device 100, for example, the mobile phone 100; and the second electronic device may be the foregoing electronic device 200, for example, the computer 200. The first application may be the foregoing desktop launcher, for example, the HWLauncher6; the first module may be the standard desktop launcher, for example, the UniHomeLauncher; and the second module may be the extended-screen desktop launcher, for example, the PcHomeLauncher. The first desktop may be the foregoing standard desktop, and the second desktop may be the foregoing extended-screen desktop. The first display area may be the foregoing default-screen display area, namely, the Display0; and the second display area may be the foregoing extended-screen display area, namely, the Display1.
For example, the first display content may be the user interface displayed by the mobile phone 100 shown in
S203: In response to a second user operation performed on the first display content, the first electronic device displays third display content based on a task stack that runs in the first display area.
For example, the first display content may be the user interface 11 displayed by the mobile phone 100 shown in
S204: In response to a third user operation performed on the second display content displayed by the second electronic device, the first electronic device determines that display content corresponding to the second display area is fourth display content based on a task stack that runs in the second display area.
S205: The first electronic device sends the fourth display content to the second electronic device.
For example, the second display content may be the user interface 16 displayed by the computer 200 shown in
It should be noted that the first electronic device receives the third user operation by using the second electronic device. After receiving the third user operation, the second electronic device determines a first input event, where the first input event indicates the third user operation. For example, the first input event includes coordinates of the third user operation performed on a display screen of the second electronic device, an operation type (for example, a touch operation or a mouse-click operation) of the third user operation, and the like. The second electronic device sends the first input event to the first electronic device; and the first electronic device determines, based on the first input event and the task stack that runs in the second display area, the third user operation indicated by the first input event, and executes a response event corresponding to the third user operation. After the response event is executed, the display content corresponding to the second display area is updated to the fourth display content. The first electronic device sends the updated fourth display content to the second electronic device, and the second electronic device displays the fourth display content.
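The following self-contained Java sketch models this reverse-control flow under stated assumptions: the first input event carries the coordinates and operation type named above, and the sender resolves it against the display area of the projection. The InputEvent and dispatch types are hypothetical stand-ins for illustration, not the actual event classes.

```java
// Hedged sketch of reverse control: the receiver packages a user operation as
// an input event; the sender replays it against the projected display area.
public class ReverseControlModel {
    enum OpType { TOUCH, MOUSE_CLICK }

    /** First input event: what the second electronic device sends back. */
    static class InputEvent {
        final int x, y; final OpType type;
        InputEvent(int x, int y, OpType type) { this.x = x; this.y = y; this.type = type; }
    }

    /** Stand-in for the sender side: dispatch the event into a display area. */
    static String dispatchToDisplayArea(int displayId, InputEvent e) {
        // The real device resolves the event against the task stack running in
        // this display area and executes the corresponding response event.
        return "Display" + displayId + ": handled " + e.type + " at (" + e.x + "," + e.y + ")";
    }

    public static void main(String[] args) {
        InputEvent fromReceiver = new InputEvent(120, 300, OpType.MOUSE_CLICK);
        // The updated (fourth) display content is then re-sent to the receiver.
        System.out.println(dispatchToDisplayArea(1, fromReceiver));
    }
}
```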
In this embodiment of this application, the first electronic device supports simultaneous running of a plurality of desktop instances in different display areas by using a same application, for example, running of the first desktop in the first display area by using the first module of the first application, and running of the second desktop in the second display area by using the second module of the first application. The first electronic device determines display content on a home screen of the first electronic device based on the task stack that runs in the first display area; and determines, based on the task stack that runs in the second display area, display content to be projected to the second electronic device. In this way, the first electronic device and the second electronic device may display different desktops and other different content based on the two different display areas.
In some embodiments, that in response to a second user operation performed on the first display content, the first electronic device displays third display content based on a task stack that runs in the first display area includes: In response to the second user operation performed on the first desktop in the first display content, the first electronic device displays the third display content based on a task stack that is of the first application and that runs in the first display area; and that in response to a third user operation performed on the second display content displayed by the second electronic device, the first electronic device determines that display content corresponding to the second display area is fourth display content based on a task stack that runs in the second display area includes: In response to the third user operation performed on the second desktop displayed by the second electronic device, the first electronic device determines that the display content corresponding to the second display area is the fourth display content based on a task stack that is of the first application and that runs in the second display area.
For example, the first display content may be an application icon (for example, the icon of Gallery) on the desktop (that is, the first desktop) shown in
In this embodiment of this application, for the second user operation performed on the first desktop, the first electronic device may execute, based on the task stack that is of the first application and that runs in the display area associated with the first desktop, a response event corresponding to the second user operation; and for the third user operation performed on the second desktop, the first electronic device may execute, based on the task stack that is of the first application and that runs in the display area associated with the second desktop, a response event corresponding to the third user operation. In this way, data isolation between events (input events and/or response events) of different desktops can be ensured. In addition, because both desktop instances are run by the modules of the first application, the two desktops can implement sharing of specified data, and the second desktop can inherit some or all functional features of the first desktop.
In some embodiments, before the first electronic device displays the first display content based on the first display area, the method further includes: The first electronic device invokes a third module of a second application to run a first status bar, where the first status bar is associated with the first display area, and the first display content includes the first status bar; and the method further includes: In response to the first user operation, the first electronic device invokes a fourth module of the second application to run a second status bar, where the second status bar is associated with the second display area, and the second display content includes the second status bar.
In this embodiment of this application, the first user operation is an input operation by which a user confirms projection from the first electronic device to the second electronic device. In some embodiments, in step S202, in response to a projection instruction, the first electronic device invokes the second module of the first application to run the second desktop, and invokes the fourth module of the second application to run the second status bar, where both the second desktop and the second status bar are associated with the second display area. For example, for the projection instruction, refer to the projection instruction or the mode switching instruction in the foregoing embodiments.
In this embodiment of this application, the second application may be the foregoing system interface, for example, the SystemUI; the third module may be the standard status bar, for example, the StatusBar; and the fourth module may be the extended-screen status bar, for example, the PcStatusBar. The first status bar may be the foregoing standard status bar, and the second status bar may be the foregoing extended-screen status bar.
For example, the first status bar may be the status bar 101 shown in
In this embodiment of this application, the first electronic device supports simultaneous running of a plurality of status bar instances in different display areas by using a same application, for example, running of the first status bar in the first display area by using the third module of the second application, and running of the second status bar in the second display area by using the fourth module of the second application. In this way, the two status bars are associated with different display areas, so that the first electronic device and the second electronic device may display different status bars, thereby ensuring data isolation between events (input events and/or response events) of the two status bars. In addition, because both status bars are run by the modules of the second application, the two status bars can implement sharing of specified data (for example, a notification message), and the second status bar can inherit some or all functional features of the first status bar.
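As an illustrative aid, the following self-contained Java sketch models the behavior described above: two status bar instances run from one application, each bound to its own display area, while a specified piece of data (a notification message) is shared by both. The StatusBarInstance type is a hypothetical stand-in for the StatusBar/PcStatusBar pair.

```java
// Minimal model of dual status bars: one instance per display area, with a
// shared notification delivered to both. Illustrative names only.
import java.util.ArrayList;
import java.util.List;

public class DualStatusBarModel {
    static class StatusBarInstance {
        final int displayId;
        StatusBarInstance(int displayId) { this.displayId = displayId; }
        void show(String notification) {
            System.out.println("Display" + displayId + " status bar: " + notification);
        }
    }

    public static void main(String[] args) {
        List<StatusBarInstance> bars = new ArrayList<>();
        bars.add(new StatusBarInstance(0)); // first status bar (third module)
        bars.add(new StatusBarInstance(1)); // second status bar (fourth module)
        // A shared notification message reaches both bars, while input events
        // on each bar are handled against its own display area.
        for (StatusBarInstance bar : bars) bar.show("New message");
    }
}
```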
In some embodiments, before the first electronic device displays the first display content based on the first display area, the method further includes: The first electronic device invokes a fifth module of a third application to run a first display object of a first variable, where the first variable is associated with the first display area, and the first display content includes the first display object; and the first variable is further associated with the second display area, and the second display content includes the first display object.
In this embodiment of this application, the first electronic device supports simultaneous display of display objects corresponding to a same variable in a plurality of different display areas. In this embodiment of this application, the third application and the second application may be a same application, or may be different applications. This is not specifically limited herein.
In some embodiments, the method further includes: In response to a fourth user operation performed on the first display content, the fifth module of the third application is invoked to modify the display object of the first variable to a second display object; the first electronic device updates display content corresponding to the first display area to fifth display content, where the fifth display content includes the second display object; and the first electronic device updates the display content corresponding to the second display area to sixth display content, and sends the sixth display content to the second electronic device, where the sixth display content includes the second display object.
In this embodiment of this application, the first electronic device supports simultaneous display of display objects corresponding to a same variable in a plurality of different display areas. After the user changes the display object of the first variable in the first display area, the display object of the first variable in the second display area also changes accordingly.
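The sketch below is a minimal observer-style model of this behavior, assuming the first variable is the wallpaper: both display areas observe one shared variable, so changing its display object updates both sets of display content. SharedVariable is a hypothetical helper, not the actual WallpaperManagerService API.

```java
// Observer-style model: one shared variable drives both display areas.
import java.util.ArrayList;
import java.util.List;

public class SharedWallpaperModel {
    /** One variable whose display object is shown in several display areas. */
    static class SharedVariable<T> {
        private T value;
        private final List<Runnable> observers = new ArrayList<>();
        SharedVariable(T initial) { value = initial; }
        void observe(Runnable r) { observers.add(r); }
        void set(T v) { value = v; for (Runnable r : observers) r.run(); }
        T get() { return value; }
    }

    public static void main(String[] args) {
        SharedVariable<String> wallpaper = new SharedVariable<>("wallpaper_a.png");
        // Both display areas observe the same variable.
        wallpaper.observe(() ->
            System.out.println("Display0 redraws with " + wallpaper.get()));
        wallpaper.observe(() ->
            System.out.println("Display1 re-projects with " + wallpaper.get()));
        // Fourth user operation: the user changes the wallpaper on the phone;
        // the projected wallpaper changes accordingly (fifth/sixth content).
        wallpaper.set("wallpaper_b.png");
    }
}
```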
In some embodiments, the first variable indicates a display object of a wallpaper, the display object of the wallpaper is a static picture and/or a dynamic picture, and the wallpaper includes a lock-screen wallpaper used when a screen is locked and/or a desktop wallpaper used when the screen is not locked.
In this embodiment of this application, the third application may be the system interface (SystemUI) or a wallpaper application. For example, the fifth module may be the wallpaper management service, for example, the WallpaperManagerService.
For example, the first display object of the wallpaper is displayed, in the first display area, as the desktop wallpaper 107 shown in
For example, the first display object of the wallpaper is displayed, in the first display area, as the lock-screen wallpaper 902 on the lock screen shown in
In this embodiment of this application, after the user changes a wallpaper displayed on the first electronic device, a wallpaper projected to the second electronic device also changes accordingly.
In some embodiments, a plurality of themes are preset in the first electronic device, where a theme indicates a desktop layout style, an icon display style, an interface color, and the like; the first variable indicates a display object of the theme, and the display object of the theme is display content corresponding to one of the plurality of themes.
In this embodiment of this application, after the user changes a theme displayed on the first electronic device, a theme projected to the second electronic device also changes accordingly.
In some embodiments, the first module of the first application includes a first common class, a first user interface UI control class, and a desktop task stack of the first desktop that are used for creating and running the first desktop; and the second module of the first application includes a second common class, a second UI control class, and a desktop task stack of the second desktop that are used for creating and running the second desktop, where some or all classes in the second common class are inherited from the first common class, and some or all classes in the second UI control class are inherited from the first UI control class.
In this embodiment of this application, the second common class, the second UI control class, and the desktop task stack of the second desktop that are used for creating and running the second desktop are added to the first electronic device; and some or all of the added common class and UI control class are inherited from the first common class and the first UI control class that correspond to the original first desktop. Therefore, the second desktop can inherit some functional features of the first desktop, and the two desktops can implement sharing of specified data.
In some embodiments, the second common class includes one or more of the following: a desktop launcher provider, a database assistant, a desktop launcher setting class, a desktop launcher constant class, a Pc layout configuration, a Pc device file, a Pc cell counter, a Pc desktop launcher policy, a Pc desktop launcher model, a Pc loading task, and the like; and the second UI control class includes one or more of the following: a Pc drag layer, a Pc desktop workspace, a Pc cell layout, a Pc program dock view, a Pc folder, a Pc folder icon, and the like.
For example, the second common class may be the common class shown in
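For illustration, the following self-contained Java sketch models the inheritance relationship described above: an extended-screen UI control class derives from the standard desktop class, inheriting shared behavior while overriding only what differs per display. Workspace and PcWorkspace are illustrative names patterned on the Pc-prefixed classes listed above, not actual source identifiers.

```java
// Model of the class-inheritance scheme: the second desktop's classes extend
// the first desktop's, so shared behavior is inherited and data can be shared.
public class DesktopClassModel {
    /** First UI control class: the standard desktop workspace. */
    static class Workspace {
        String layoutStyle() { return "phone grid"; }
        void bindIcons() { System.out.println("binding icons, style: " + layoutStyle()); }
    }

    /** Second UI control class: inherits the first, overrides only what differs. */
    static class PcWorkspace extends Workspace {
        @Override String layoutStyle() { return "desktop grid with dock"; }
        // bindIcons() is inherited unchanged: shared behavior, per-display layout.
    }

    public static void main(String[] args) {
        new Workspace().bindIcons();   // standard desktop (Display0)
        new PcWorkspace().bindIcons(); // extended-screen desktop (Display1)
    }
}
```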
In some embodiments, the third module of the second application includes a first component, a first dependency control class, and a third UI control class that are used for creating and running the first status bar; and the fourth module of the second application includes a second component, a second dependency control class, and a fourth UI control class that are used for creating and running the second status bar, where some or all components in the second component are inherited from the first component, some or all classes in the second dependency control class are inherited from the first dependency control class, and some or all classes in the fourth UI control class are inherited from the third UI control class.
In this embodiment of this application, the second component, the second dependency control class, and the fourth UI control class that are used for creating and running the second status bar are added to the first electronic device; and some or all of the added component, dependency control class, and UI control class are inherited from the first component, the first dependency control class, and the third UI control class that correspond to the original first status bar. Therefore, the second status bar can inherit some functional features of the first status bar, and the two status bars can implement sharing of specified data.
In some embodiments, the second component includes one or more of the following: a Pc dependency class, a Pc system provider, a Pc system bar, and the second status bar; the second dependency control class includes one or more of the following: a Pc status bar window control class, a screen control class, a lock-screen control class, and a remote control class; and the fourth UI control class includes one or more of the following: a Pc status bar window view, a Pc notification panel view, a Pc quick setting fragment, a Pc status bar fragment, and a Pc status bar view.
For example, the first component, the first dependency control class, and the third UI control class may be respectively the component, the dependency control class, and the UI control class that are shown in
In some embodiments, an identity ID of a display area associated with the second module is an ID of the second display area; and that in response to a first user operation, the first electronic device invokes a second module of the first application to run a second desktop, where the second desktop is associated with a second display area includes: In response to the first user operation, a Pc management service receives a mode switching instruction, where the instruction instructs to switch from the current non-projection mode to the projection mode; in response to the instruction, the Pc management service invokes a Pc desktop service, the Pc desktop service invokes an activity management service, and the activity management service invokes an activity task manager to start the second module of the first application; a root activity container is invoked to determine the ID of the display area associated with the second module; when the ID of the display area associated with the second module is the ID of the second display area, an Activity of the second desktop is queried and used as an Activity of a to-be-launched desktop, or when the ID of the display area associated with the second module is an ID of the first display area, an Activity of the first desktop is queried and used as an Activity of a to-be-launched desktop; and an activity start controller invokes an activity starter to start the second desktop.
In some embodiments, that in response to the first user operation, the first electronic device invokes a fourth module of the second application to run a second status bar, where the second status bar is associated with the second display area includes: In response to the first user operation, a Pc management service receives a mode switching instruction, where the instruction instructs to switch from the current non-projection mode to the projection mode; in response to the instruction, the Pc management service starts a productivity service, where the productivity service invokes a system bar to start the second status bar, and the system bar creates the second status bar based on a configuration file; the second status bar invokes a callback interface of a command queue to add a callback for the second status bar; the second status bar initializes a layout, and registers an IStatusBar object corresponding to the second status bar with a status bar management service; the second status bar creates the Pc status bar window view, and adds the Pc status bar window view to a status bar window control class; and the status bar window control class invokes a window management interface to add the second status bar to a window management service, so as to add the second status bar to the second display area.
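Collapsed into a runnable model, the flow described above can be sketched as follows; every type here is a simplified stand-in patterned on the names in the text (command queue callback, IStatusBar registration, Pc status bar window view, window management service), not the actual services.

```java
// Collapsed model of the second status bar's creation-and-registration flow:
// add a command queue callback, register IStatusBar, add the window view to
// the second display area. Stand-in types for illustration only.
public class PcStatusBarStartupModel {
    static class CommandQueue {
        void addCallback(String who) { System.out.println("CommandQueue callback: " + who); }
    }
    static class StatusBarManagerService {
        void registerStatusBar(String iStatusBar) { System.out.println("Registered " + iStatusBar); }
    }
    static class WindowManagerService {
        void addWindow(String view, int displayId) {
            System.out.println("Added " + view + " to Display" + displayId);
        }
    }

    public static void main(String[] args) {
        CommandQueue queue = new CommandQueue();
        StatusBarManagerService sbms = new StatusBarManagerService();
        WindowManagerService wms = new WindowManagerService();
        // The embodiment's steps, collapsed: callback -> register -> add view.
        queue.addCallback("PcStatusBar");
        sbms.registerStatusBar("IStatusBar(PcStatusBar)");
        wms.addWindow("PcStatusBarWindowView", 1); // second display area (Display1)
    }
}
```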
In some embodiments, in the non-projection mode, the command queue supports the first status bar associated with the first display area; and in the projection mode, the command queue supports both the first status bar associated with the first display area and the second status bar associated with the second display area.
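A hedged model of this command queue behavior is sketched below: callbacks are keyed by display area ID, with only the Display0 callback registered in the non-projection mode and both callbacks registered in the projection mode, and a command fans out to every registered status bar. The CommandQueue type here is illustrative, not the actual implementation.

```java
// Per-display command queue model: one callback in non-projection mode,
// two in projection mode; commands fan out to every registered status bar.
import java.util.HashMap;
import java.util.Map;

public class CommandQueueModel {
    interface Callbacks { void disable(String what); }

    static class CommandQueue {
        private final Map<Integer, Callbacks> perDisplay = new HashMap<>();
        void addCallback(int displayId, Callbacks cb) { perDisplay.put(displayId, cb); }
        void disable(String what) {
            perDisplay.forEach((id, cb) -> cb.disable(what)); // fan out per display
        }
    }

    public static void main(String[] args) {
        CommandQueue queue = new CommandQueue();
        queue.addCallback(0, w -> System.out.println("Display0 bar disables " + w));
        queue.addCallback(1, w -> System.out.println("Display1 bar disables " + w)); // projection mode only
        queue.disable("expand");
    }
}
```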
The implementations of this application may be combined as desired to achieve different technical effects.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware, or any combination thereof. When software is used for implementation, all or some of the foregoing embodiments may be implemented in a form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to this application are completely or partially generated. The computer may be a general-purpose computer, a dedicated computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or may be transmitted from a computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center in a wired (for example, a coaxial cable, an optical fiber, or a digital subscriber line) or wireless (for example, infrared, radio, or microwave) manner. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device integrating one or more usable media, for example, a server or a data center. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (solid state disk, SSD)), or the like.
A person of ordinary skill in the art may understand that all or some of the procedures of the methods in the embodiments may be implemented by a computer program instructing related hardware. The program may be stored in a computer-readable storage medium. When the program is run, the procedures of the methods in the embodiments are performed. The foregoing storage medium includes any medium that can store program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.
In conclusion, the foregoing descriptions are merely embodiments of the technical solutions of the present invention, but are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, or improvement made according to the disclosure of the present invention shall fall within the protection scope of the present invention.
Number | Date | Country | Kind
--- | --- | --- | ---
202110547552.4 | May 2021 | CN | national
202110745467.9 | Jun 2021 | CN | national

Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/CN2022/091899 | 5/10/2022 | WO |