The present disclosure relates to the field of video transmission technology, in particular to a video transmission method, a conferencing system, and a storage medium.
Currently, it is common for users to install remote conferencing software (such as Zoom, TEAMS, Tencent Meeting, Mailing, etc.) on client devices such as computers, phones, and tablets, and to use the remote conferencing software to hold meetings with other online users. During a meeting, a wireless screen sharing device may be used to call a camera head connected to a screen sharing receiving device, so that the camera head connected to the screen sharing receiving device can be used for the meeting.
However, some wireless screen sharing devices do not have video decoding capabilities. In this case, after receiving the compressed and encoded video stream transmitted by the interactive board, the wireless screen sharing device cannot decode it, which may cause the conferencing software on the computer to be unable to play the compressed and encoded video stream normally.
The present disclosure provides a video transmission method, a conferencing system, and a storage medium, which can transmit video data through a wireless screen sharing device without video decoding capability, and enable the conferencing software of a client device to play the video data transmitted by the wireless screen sharing device normally.
According to a first aspect of the present disclosure, a video transmission method is provided, being applied to a wireless screen sharing device including a universal interface, where the wireless screen sharing device does not have video decoding capability, the wireless screen sharing device is used to be connected to a screen sharing receiving device, and the screen sharing receiving device is connected to a camera head to receive original video data transmitted by the camera head; the method includes:
According to a second aspect of the present disclosure, a conferencing system is provided, including an interactive board and a wireless screen sharing device, where the interactive board includes a camera head and a screen sharing receiving device connected to each other, the camera head is used to collect original video data, and the screen sharing receiving device is used to transmit the original video data collected by the camera head;
According to a third aspect of the present disclosure, a storage medium including computer executable instructions is provided, where the computer executable instructions are used to execute a video data transmission method as described in the aforementioned embodiment when executed by a computer processor.
According to the embodiment of the present disclosure, the wireless screen sharing device does not have video decoding capability. The wireless screen sharing device forwards the received compressed and encoded video data in the first data format to the client device, to trigger the decoding application of the client device to decode the video data in the first data format to obtain video data in the second data format and transmit the video data in the second data format back to the wireless screen sharing device. The wireless screen sharing device presents the video data in the second data format to the camera peripheral device. When it recognizes that the decoding application of the client device is not running, the wireless screen sharing device triggers the screen sharing receiving device to process the received original video data into a data format supported by the client device for playback to obtain video data in the third data format, and to transmit the video data in the third data format to the wireless screen sharing device. The wireless screen sharing device presents the received video data in the third data format to the camera peripheral device, so that the conferencing software of the client device can play the video data in the second or third data format normally.
In order to make the purpose, technical solution, and advantages of the present disclosure clearer, further detailed descriptions of specific embodiments of the present disclosure will be provided below in conjunction with the accompanying drawings. It may be understood that the specific embodiments described here are only intended to explain the present disclosure and not to limit it. Furthermore, it should be noted that, for ease of description, only the parts relevant to the present disclosure are shown in the accompanying drawings, rather than the entire content. Before discussing the exemplary embodiments in more detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although the flowcharts describe various operations (or steps) as sequential processing, many of them may be implemented in parallel, concurrently, or simultaneously. In addition, the order of the operations may be rearranged. The processing may be terminated when its operations are completed, but there may also be additional steps not included in the drawings. The processing may correspond to methods, functions, procedures, subroutines, subprograms, etc.
Firstly, in order to better understand the video data transmission method according to an embodiment of the present disclosure, the application scenarios of the embodiments of the present disclosure will be introduced in conjunction with
In multi-person interaction scenarios such as meetings, teaching, discussions, and brainstorming, with the rapid development of information technology, the limitations of traditional communication methods based mainly on offline listening and watching are becoming increasingly apparent. Online multi-person communication, such as remote meetings, remote education, and online chatting, has become a trend of the times. Taking remote meetings as an example, in the prior art, users usually install remote conferencing software (such as Zoom, TEAMS, Tencent Meeting, Mailing, etc.) on client devices (such as computers, phones, tablets, etc.) and hold meetings with other online users through the remote conferencing software. During a meeting, the camera head on the computer is usually used to share the user's own image with online users. However, the quality of the built-in camera head on a user's computer is usually relatively low, and it usually cannot meet the needs of high-quality meetings.
In related technologies, users may physically plug a wireless screen sharing device into a universal interface of the computer (such as a USB interface), and establish a data transmission link between the wireless screen sharing device and the screen sharing receiving device. Based on this data transmission link, the computer may call the camera head connected to the screen sharing receiving device through the wireless screen sharing device, thereby using the camera head connected to the screen sharing receiving device for meetings. Taking the interactive board as an example, as shown in
In related technologies, some wireless screen sharing devices have video decoding capabilities. After receiving the compressed and encoded video stream transmitted by the screen sharing receiving device of the interactive board, these wireless screen sharing devices may decode the received compressed and encoded video stream into a video stream in a data format supported by the conferencing software for playback, and transmit the decoded video stream to the computer. In this way, the computer may present the decoded video stream to the conferencing software, and the conferencing software may use the camera head to hold meetings with remote attendees, enabling users to call the camera head of the interactive board for meetings. However, some wireless screen sharing devices do not have video decoding capabilities. In this case, after receiving the compressed and encoded video stream transmitted by the interactive board, the wireless screen sharing device cannot decode it, which may cause the conferencing software on the computer to be unable to play the compressed and encoded video stream normally.
Moreover, the amount of data in the original video stream collected by the camera head is generally large. For example, for a YUV-format video stream with a resolution of 1080P, transmitting 30 frames per second requires a transmission rate of about 700 Mbps to ensure the transmission effect. However, current wireless communication networks may not reach such a transmission rate. Therefore, given the actual wireless communication environment, the original video stream cannot be directly transmitted to the wireless screen sharing device.
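The figure of about 700 Mbps can be checked with a rough calculation. The sketch below is purely illustrative; it assumes YUV 4:2:0 chroma subsampling (1.5 bytes per pixel), which the disclosure does not specify, and other samplings would give higher figures:

```python
# Rough estimate of the raw bit rate of a 1080P, 30 fps YUV video stream.
# Assumes YUV 4:2:0 sampling: a full-size Y plane plus quarter-size U and V
# planes, i.e. 1.5 bytes per pixel on average (an illustrative assumption).
width, height = 1920, 1080
bytes_per_pixel = 1.5
frames_per_second = 30

bits_per_frame = width * height * bytes_per_pixel * 8
bit_rate = bits_per_frame * frames_per_second  # bits per second

print(f"{bit_rate / 1e6:.0f} Mbps")  # prints "746 Mbps"
```

The result, roughly 746 Mbps, is consistent with the "about 700 Mbps" figure stated above, and far exceeds what typical wireless local area networks sustain in practice.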
Based on the above issues, the inventors proposed a conferencing system that can enable video streams transmitted by wireless screen sharing devices without video decoding capabilities to be played normally in the conferencing software of a computer.
As shown in
The screen sharing receiving device 210 is used to establish a wireless communication link with a screen transmission transmitting end, such as the wireless screen sharing device 22, and to transmit video data through the wireless communication link. The camera head 211 is used to collect original video data of the location of the interactive board 21 (such as a conference room), and the microphone 212 is used to collect original audio data of the location of the interactive board 21 (such as a conference room). The speaker 213 may play the received audio data.
In other embodiments, the screen sharing receiving device 210 may be external to and independent of the interactive board 21.
When the screen sharing receiving device 210 is external to the interactive board 21, the screen sharing receiving device 210 may be integrated with the camera head 211, for example as an integrated video conference terminal, which may integrate a camera head, a microphone, a speaker, and a cloud video conference terminal into one device.
In some other embodiments, when the screen sharing receiving device 210 is external to the interactive board 21, the screen sharing receiving device 210 may be independent of the camera head 211. For example, the screen sharing receiving device 210 may be a screen transmission box, and the camera head 211 may be connected to the screen transmission box through a connecting cable. The screen transmission box may be connected to the interactive board 21 in a wired or wireless manner. In other embodiments, the camera head 211 may be integrated inside the interactive board, and the screen transmission box may be connected to the interactive board 21 in a wired or wireless manner.
As shown in
In the embodiment of the present disclosure, the wireless screen sharing device 22 includes a universal interface 221, a processing module 222, and a wireless communication module 223. The universal interface 221 and the wireless communication module 223 are respectively connected to the processing module 222.
The universal interface 221 provides one or more specific ports to achieve a wired connection with the computer 23 through the specific ports. The type of the universal interface 221 is not limited here; for example, the universal interface 221 may be a Universal Serial Bus (USB) port or a High Definition Multimedia Interface (HDMI) port, etc. In one embodiment, taking the universal interface 221 being a USB port as an example, the wireless screen sharing device 22 and the computer 23 are connected via USB. Optionally, when connected via USB, the wireless screen sharing device 22 is equipped with a USB plug, and the computer 23 is equipped with a USB socket. After the universal interface 221 is inserted into the computer 23, the wireless screen sharing device 22 may work. The specific implementation approaches by which the computer 23 detects the wireless screen sharing device 22 and transmits signals with the wireless screen sharing device 22 are not limited here. It may be understood that in practical applications, the universal interface 221 of the wireless screen sharing device 22 may also be connected to the computer 23 through a connecting cable, the two ends of which are connected to the universal interface 221 and the computer 23 respectively.
The processing module 222 may include a Microcontroller Unit (MCU) or a System on Chip (SoC), etc. In one embodiment, the processing module 222 includes a storage portion and a processing portion; the storage portion and the processing portion may be integrated into one physical entity or may be different physical entities, and may be understood as a memory and a processor, respectively. The components and circuit structures contained in the storage portion and the processing portion are not limited here. The storage portion may cache the currently processed data (such as screen data), and may also store the computer programs required during the operation of the wireless screen sharing device. The processing portion includes one or more processors, and the processing portion may run the computer programs stored in the storage portion to achieve the various functions of the wireless screen sharing device 22. On this basis, the data transmission among the computer 23, the interactive board 21, and the wireless screen sharing device 22 passes through the processing module 222.
The processing module 222 according to the embodiment of the present disclosure does not have video decoding capability and does not support decoding processing of compressed encoded video data. It may be understood that the processing module 222 according to the embodiment of the present disclosure does not have a video decoder, the processing module 222 may not support software decoding or hardware decoding of videos. The video decoder may be implemented through software or hardware, and the video decoder may perform restoration and decoding operations on encoded video data.
The wireless communication module 223 has both transmission and reception functions, and the wireless communication module 223 is used for transmitting data and instructions (i.e., transmitting signals) to and from the interactive board 21. It may be understood that the wireless screen sharing device 22 establishing a wireless connection with the interactive board 21 specifically refers to the wireless communication module 223 establishing a wireless connection with the interactive board 21. The connection protocol used for the wireless connection is not limited here. In one embodiment, the wireless communication module 223 may include a Wi-Fi module and/or a Bluetooth module; the wireless communication module 223 establishes a wireless connection within the local area network with the interactive board 21 through the Wi-Fi module, and establishes a Bluetooth connection with the interactive board 21 through the Bluetooth module. Optionally, the wireless communication module 223 further includes a Near Field Communication (NFC) module to achieve near-field communication with the interactive board 21. For example, the wireless screen sharing device 22 acquires a communication address of the interactive board 21 in the local area network through the NFC module, and then establishes a wireless connection with the interactive board 21 based on the communication address.
According to the embodiment of the present disclosure, when the user inserts the wireless screen sharing device 22 into the USB interface of the computer 23, the pre-installed driver will be triggered to run on the computer 23, so that the computer 23 may communicate with the wireless screen sharing device 22. On this basis, in order for the conferencing software running on the computer 23 to recognize the video stream transmitted by the wireless screen sharing device 22, the wireless screen sharing device 22 will initiate a peripheral creation request to the computer 23 through the corresponding driver (such as a camera head driver) to create a camera peripheral device 224 corresponding to the camera head 211 on the wireless screen sharing device 22.
For example, the computer 23 may acquire the USB description message of the wireless screen sharing device 22 based on an enumeration process of the USB interface, and use the USB description message as identification information; based on the identification information, a driver preset in the system is matched, and the matched driver is run to create a camera peripheral device 224 corresponding to the camera head 211 on the wireless screen sharing device 22.
The device created by the wireless screen sharing device 22 on the computer 23 is a standard UVC (USB Video Class) device. When the wireless screen sharing device 22 is inserted into the computer, the wireless screen sharing device 22 will transmit a USB description message to the computer 23 to indicate its identity. During the enumeration process, the computer 23 transmits a Get Device Descriptor request to the wireless screen sharing device 22 to acquire the maximum packet length of the default control pipeline, and then transmits a Set Address request to assign a unique device address to the wireless screen sharing device 22. By further transmitting Get Descriptor requests to the wireless screen sharing device 22, complete configuration information (configuration descriptor, interface descriptor, endpoint descriptor, setting definition descriptor, application custom descriptor, etc.) is acquired to obtain the USB description message of the wireless screen sharing device, and its identification information is determined. Furthermore, based on this identification information, the pre-installed driver on the computer system that matches the identification information is located. By running the driver, a UVC device corresponding to the camera head 211 may be created on the wireless screen sharing device 22.
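As an illustration of the descriptor data exchanged during enumeration, the sketch below parses the standard 18-byte USB device descriptor layout defined by the USB specification. It is only an illustration of the data format: the vendor and product IDs used are made-up placeholder values, not those of any real device described in this disclosure.

```python
import struct

# Field layout of the standard 18-byte USB device descriptor
# (bLength through bNumConfigurations), little-endian as on the wire.
DEVICE_DESCRIPTOR = struct.Struct("<BBHBBBBHHHBBBB")

def parse_device_descriptor(raw: bytes) -> dict:
    """Unpack the 18 bytes returned by a Get Device Descriptor request."""
    (b_length, b_descriptor_type, bcd_usb, b_class, b_subclass, b_protocol,
     b_max_packet_size0, id_vendor, id_product, bcd_device,
     i_manufacturer, i_product, i_serial, b_num_configs) = DEVICE_DESCRIPTOR.unpack(raw)
    return {
        "bLength": b_length,
        "bDescriptorType": b_descriptor_type,   # 0x01 marks a device descriptor
        "bMaxPacketSize0": b_max_packet_size0,  # max packet size of the default control pipe
        "idVendor": id_vendor,
        "idProduct": id_product,
        "bNumConfigurations": b_num_configs,
    }

# Hypothetical descriptor bytes for a screen sharing dongle (placeholder IDs).
raw = DEVICE_DESCRIPTOR.pack(18, 1, 0x0200, 0xEF, 0x02, 0x01, 64,
                             0x1234, 0x5678, 0x0100, 1, 2, 3, 1)
info = parse_device_descriptor(raw)
print(hex(info["idVendor"]), info["bMaxPacketSize0"])  # prints "0x1234 64"
```

The host uses fields such as `idVendor` and `idProduct` as identification information to locate the matching pre-installed driver, consistent with the matching step described above.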
It should be noted that the number of camera heads 211 according to the embodiment of the present disclosure may be one or more. When there is one camera head 211, the wireless screen sharing device 22 may create one camera peripheral device corresponding to the camera head 211. When there are multiple camera heads 211, the wireless screen sharing device 22 may create one camera peripheral device corresponding to the multiple camera heads 211, or may create multiple camera peripheral devices, each corresponding to one camera head. The number of camera peripheral devices may be greater than or less than the number of camera heads, and the correspondence between camera peripheral devices and camera heads may be set based on actual conditions. For example, one camera head may correspond to one camera peripheral device, while multiple camera heads may correspond to one camera peripheral device.
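The one-to-one and many-to-one correspondences described above may be sketched as a simple mapping policy. This is an illustrative sketch only; the function name, the policy strings, and the peripheral naming are hypothetical and not part of the disclosure.

```python
def create_peripherals(camera_heads: list[str], policy: str = "one_to_one") -> dict:
    """Map camera heads to camera peripheral devices.

    policy "one_to_one":  one peripheral per camera head.
    policy "many_to_one": all camera heads share a single peripheral.
    (Both names are hypothetical labels for the correspondences in the text.)
    """
    if policy == "one_to_one":
        return {head: f"peripheral_{i}" for i, head in enumerate(camera_heads)}
    if policy == "many_to_one":
        return {head: "peripheral_0" for head in camera_heads}
    raise ValueError(f"unknown policy: {policy}")

heads = ["camera_A", "camera_B"]
print(create_peripherals(heads))                 # one peripheral per head
print(create_peripherals(heads, "many_to_one"))  # both heads share peripheral_0
```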
As shown in
In the embodiment of the present disclosure, when the wireless screen sharing device 22 is connected to the computer 23 through the universal interface 221, the processing module 222 may monitor the running of the decoding application of the computer 23. The decoding application may decode the video data received by computer 23 to obtain the corresponding format of video data.
As shown in
For example, the video data in the first data format may be video data in H264/H265 format, and the video data in the second data format may be video data in MJPEG format or video data in YUV format. It may be understood that the above is only an exemplary explanation, and the video data in the first data format may be specifically determined based on the actual situation of the interactive board 21 and the transmission capacity of the wireless network between the interactive board 21 and the wireless screen sharing device 22. The video data in the second data format may be determined based on the actual supported data format of the conferencing software on computer 23. For example, when the conferencing software on the computer 23 only supports video data in YUV format, the decoding application may decode it into video data in YUV format.
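The determination of the second data format based on what the conferencing software actually supports may be sketched as follows. The function name and the preference order are illustrative assumptions, not mandated by the disclosure; only the format names (MJPEG, YUV) come from the examples above.

```python
def choose_second_format(supported_formats: set[str]) -> str:
    """Pick the decode target (second data format) from the formats the
    conferencing software on the client device can play.

    Preferring MJPEG over raw YUV here is an assumption, made purely to
    reduce the amount of decoded data sent back over the universal interface.
    """
    for candidate in ("MJPEG", "YUV"):
        if candidate in supported_formats:
            return candidate
    raise ValueError("conferencing software supports no known playback format")

print(choose_second_format({"MJPEG", "YUV"}))  # prints "MJPEG"
print(choose_second_format({"YUV"}))           # prints "YUV"
```

As in the text, when the conferencing software only supports YUV, the decoding application would decode into YUV.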
According to the embodiment of the present disclosure, the wireless screen sharing device does not have video decoding capability; the wireless screen sharing device forwards the received compressed and encoded video data in the first data format to the computer to trigger the decoding application of the computer to decode the video data in the first data format to obtain the video data in the second data format, and to transmit the video data in the second data format back to the wireless screen sharing device. The wireless screen sharing device presents the received video data in the second data format to the camera peripheral device, allowing the conferencing software on the computer 23 to play the video data in the second data format normally.
In some embodiments, the wireless screen sharing device 22 may be pre-installed with a monitoring application, and the monitoring application may monitor the running of the decoding application on the computer 23. The monitoring application may be pre-stored in the memory of the wireless screen sharing device 22, and the processing module 222 may monitor the running of the decoding application of the computer 23 by calling and executing the monitoring application.
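The monitoring application's check, and the resulting choice between the two pipelines, may be sketched as below. The process name `decoder_app` and both function names are hypothetical; a real implementation would query the computer's process state over the USB link rather than receive a list as an argument.

```python
def decoding_app_running(process_names: list[str],
                         decoder_name: str = "decoder_app") -> bool:
    """Return True when the decoding application appears in the client's
    process list. `decoder_name` is a hypothetical process name."""
    return decoder_name in process_names

def pick_pipeline(decoder_running: bool) -> str:
    """With the decoding application running, the board sends compressed
    video (first data format) for the computer to decode; otherwise the
    board itself encodes into a directly playable format."""
    return "first_notification" if decoder_running else "second_notification"

# Simulated process snapshots reported by the computer over the USB link.
print(pick_pipeline(decoding_app_running(["explorer", "decoder_app"])))  # first_notification
print(pick_pipeline(decoding_app_running(["explorer"])))                 # second_notification
```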
In other embodiments, the camera head driver and/or network card driver may also be customized by the supplier. In some scenarios with restrictions on video transmission privacy, in order to prevent unrelated devices from acquiring the audio and video streams transmitted by the system, the driver customized by the supplier may be used to create the camera peripheral device and/or network card peripheral device.
Wireless screen sharing devices in related technologies have small volumes, and wireless screen sharing devices with video decoding capabilities have limited computing power. Where their device capabilities are limited, they usually can only present video data with a lower fixed resolution to computers. For example, a device may only support providing 720p video data to the computer; when the computer has high performance and supports playing video data with a higher resolution (such as 1080p or 4K), the playback effect of the lower fixed-resolution video data on the computer is not good.
Based on this, in some embodiments, the processing module 222 may also acquire the maximum resolution of video data supported by the computer 23 through the universal interface 221, and form first notification information based on the maximum resolution. The first notification information is used to instruct the interactive board 21 to process the original video data of the conference room collected by the camera head 211 based on the maximum resolution to obtain the video data in the first data format with the maximum resolution. The processing module 222 is also configured to transmit the first notification information to the interactive board 21. After receiving the first notification information, the interactive board 21 processes the original video data of the conference room collected by the camera head 211 based on the maximum resolution to obtain the video data in the first data format with the maximum resolution, and transmits the video data in the first data format with the maximum resolution to the wireless screen sharing device 22. The wireless screen sharing device 22 transmits the video data in the first data format with the maximum resolution to the computer 23. The computer 23 decodes and processes it through the decoding application to obtain the video data in the second data format with the maximum resolution, and transmits it back to the wireless screen sharing device 22. The wireless screen sharing device 22 presents the video data in the second data format with the maximum resolution to the conferencing software of the computer 23 through the camera peripheral device 224, so that the conferencing software of the computer 23 may play the video data in the second data format with the maximum resolution.
For example, the maximum resolution supported by computer 23 is 1080p. The processing module 222 may also acquire the maximum resolution of video data supported by the computer 23 for playback through the universal interface 221, and form first notification information based on the maximum resolution of 1080p. The first notification information is used to instruct interactive board 21 to process the original video data collected by the camera head 211 in the conference room based on the maximum resolution of 1080p to obtain H264-format video data in 1080p. The processing module 222 is also configured to transmit the first notification information to the interactive board 21. After receiving the first notification information, the interactive board 21 processes the original video data of the conference room collected by the camera head 211 based on 1080p to obtain H264-format video data in 1080p, and transmits the H264-format video data in 1080p to the wireless screen sharing device 22. The wireless screen sharing device 22 transmits H264-format video data in 1080p to the computer 23, and the computer 23 decodes it through a decoding application to obtain MJPEG-format video data in 1080p and transmits it back to the wireless screen sharing device 22. The wireless screen sharing device 22 presents MJPEG-format video data in 1080p to conferencing software of the computer 23 through the camera peripheral device 224, so that the conferencing software of the computer 23 may play MJPEG-format video data in 1080p.
For example, the maximum resolution supported by the computer 23 is 720p. The processing module 222 may also acquire the maximum resolution of video data supported by the computer 23 for playback through the universal interface 221, and form first notification information based on the maximum resolution of 720p. The first notification information is used to instruct the interactive board 21 to process the original video data of the conference room collected by the camera head 211 based on the maximum resolution of 720p to obtain H264-format video data in 720p. The processing module 222 is further configured to transmit the first notification information to the interactive board 21. After receiving the first notification information, the interactive board 21 processes the original video data of the conference room collected by the camera head 211 based on 720p to obtain H264-format video data in 720p, and transmits the H264-format video data in 720p to the wireless screen sharing device 22. The wireless screen sharing device 22 transmits the H264-format video data in 720p to the computer 23, and the computer 23 decodes it through the decoding application to obtain MJPEG-format video data in 720p and transmits it back to the wireless screen sharing device 22. The wireless screen sharing device 22 presents the MJPEG-format video data in 720p to the conferencing software of the computer 23 through the camera peripheral device 224, so that the conferencing software of the computer 23 may play the MJPEG-format video data in 720p.
According to the embodiment of the present disclosure, it may be understood that the encoding manner of the original video data may be adjusted by the wireless screen sharing device triggering the interactive board 21 based on the maximum resolution of video data supported by the computer, so that the wireless screen sharing device can present the video data in the second data format to the conferencing software at the maximum resolution supported by the computer.
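Putting the resolution negotiation above together, the formation of the first notification information may be sketched as below. The dictionary fields and the function name are illustrative assumptions; the disclosure does not prescribe any message layout.

```python
def form_first_notification(max_resolution: str) -> dict:
    """Build first notification information instructing the interactive
    board to encode the camera head's original video at the client's
    maximum supported resolution. All field names are hypothetical."""
    return {
        "type": "first_notification",
        "target_resolution": max_resolution,  # e.g. "1080p" or "720p"
        "encode_format": "H264",              # the first data format in the examples above
    }

note = form_first_notification("1080p")
print(note["target_resolution"], note["encode_format"])  # prints "1080p H264"
```

With `"1080p"` this corresponds to the first worked example above, and with `"720p"` to the second.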
In one embodiment, when the wireless screen sharing device 22 is first connected to the computer 23, the computer 23 may extract the decoding application stored in the wireless screen sharing device 22 and install it locally on the computer 23, or download the installation package of the decoding application based on a download address of the decoding application pre-stored in the wireless screen sharing device 22, and install the decoding application locally on the computer 23 based on the installation package.
It may be understood that in order to facilitate the configuration of the corresponding decoding application by the client device, according to the embodiment of the present disclosure, the decoding application is installed when the wireless screen sharing device is first connected to the computer 23, for subsequent video data decoding. Therefore, by storing the installation package of the decoding application or the download address of the decoding application in the wireless screen sharing device 22, when connecting with the computer 23 for the first time, the installation package of the decoding application or the download address of the decoding application is provided to the computer 23. The computer 23 installs the decoding application based on the installation package, or connects to the network to download and install the decoding application based on the download address, thereby providing the computer 23 with the corresponding decoding capability and allowing the computer 23 to decode the video data transmitted by the wireless screen sharing device 22. In practical applications, the decoding application may also be actively sent from the system background of the conferencing software to the computer 23 for installation. The specific decoding application installation scheme is not limited here, and will not be described further herein.
As shown in
However, in practical applications, due to security limitations, some computers may not be able to install the decoding application in the above embodiments, or may not be able to run the decoding application properly, resulting in the conferencing software on the computer 23 being unable to play video data properly.
Based on this, the processing module 222 according to the embodiment of the present disclosure may also form second notification information when monitoring that the computer 23 is not running the decoding application, and transmit the second notification information to the interactive board 21. The second notification information is used to instruct the interactive board 21 to directly compress and encode the original video data collected by the camera head 211 in the conference room into a third data format supported by the computer 23 for playback. The interactive board 21 transmits the video data in the third data format to the wireless communication module 223 of the wireless screen sharing device 22. The processing module 222 controls the wireless communication module 223 to forward the received video data in the third data format to the camera peripheral device 224, so that the camera peripheral device 224 presents the video data in the third data format to the conferencing software of the computer 23. For example, the computer 23 may support playing MJPEG-format video data, and the interactive board 21 may directly compress and encode the original video data collected by the camera head 211 in the conference room into MJPEG-format video data. When the computer 23 supports multiple video formats for playback, the format with the smallest data amount may be selected for compression encoding. For example, the computer 23 may support playing MJPEG-format video data or YUV-format video data, and the data amount of YUV-format video data is greater than that of MJPEG-format video data. In order to improve the transmission rate of video data between the interactive board and the wireless screen sharing device, the original video data may be encoded into MJPEG-format video data.
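The selection of the smallest supported playback format may be sketched by ranking formats on estimated per-frame data amount. The 10:1 MJPEG compression ratio below is a typical illustrative figure, not a value from the disclosure, and the function names are hypothetical.

```python
def estimated_frame_bytes(fmt: str, width: int = 1920, height: int = 1080) -> float:
    """Rough per-frame size estimate, used only to rank formats.
    Raw YUV 4:2:0 is 1.5 bytes per pixel; MJPEG is assumed ~10:1 smaller
    (an illustrative compression ratio)."""
    raw = width * height * 1.5
    return raw / 10 if fmt == "MJPEG" else raw

def choose_third_format(supported: set[str]) -> str:
    """Pick the playable format with the smallest estimated data amount,
    to reduce the wireless transmission load between board and dongle."""
    return min(supported, key=estimated_frame_bytes)

print(choose_third_format({"MJPEG", "YUV"}))  # prints "MJPEG"
```

This matches the reasoning in the text: since raw YUV carries more data than MJPEG, MJPEG is chosen when both are playable.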
In some embodiments, the processing module 222 may further acquire the maximum resolution of video data supported by the computer 23 for playback through the universal interface 221, and form second notification information based on the maximum resolution. The second notification information is used to instruct the interactive board 21 to process the original video data of the conference room collected by the camera head 211 based on the maximum resolution to obtain the video data in the third data format with the maximum resolution. The processing module 222 is further configured to transmit the second notification information to the interactive board 21. After receiving the second notification information, the interactive board 21 processes the original video data of the conference room collected by the camera head 211 based on the maximum resolution to obtain the video data in the third data format with the maximum resolution, and transmits the video data in the third data format with the maximum resolution to the wireless screen sharing device 22. The wireless screen sharing device 22 presents the video data in the third data format with the maximum resolution to the conferencing software of the computer 23 through the camera peripheral device 224, so that the conferencing software of the computer 23 may play the video data in the third data format with the maximum resolution.
For example, the maximum resolution supported by the computer 23 is 1080p. The processing module 222 may also acquire the maximum resolution of video data supported by the computer 23 for playback through the universal interface 221, and form second notification information based on the maximum resolution of 1080p. The second notification information is used to instruct the interactive board 21 to process the original video data collected by the camera head 211 in the conference room based on the maximum resolution of 1080p to obtain MJPEG-format video data in 1080p. The processing module 222 is further configured to transmit the second notification information to the interactive board 21. After receiving the second notification information, the interactive board 21 processes the original video data of the conference room collected by the camera head 211 based on 1080p to obtain MJPEG-format video data in 1080p, and transmits the MJPEG-format video data in 1080p to the wireless screen sharing device 22. The wireless screen sharing device 22 presents the MJPEG-format video data in 1080p to the conferencing software of the computer 23 through the camera peripheral device 224, so that the conferencing software of the computer 23 may play the MJPEG-format video data in 1080p.
For example, the maximum resolution supported by the computer 23 is 720p. The processing module 222 may also acquire the maximum resolution of video data supported by the computer 23 for playback through the universal interface 221, and form second notification information based on the maximum resolution of 720p. The second notification information is used to instruct the interactive board 21 to process the original video data of the conference room collected by the camera head 211 based on the maximum resolution of 720p to obtain MJPEG-format video data in 720p, and transmit the MJPEG-format video data in 720p to the wireless screen sharing device 22. The wireless screen sharing device 22 presents the MJPEG-format video data in 720p to the conferencing software of the computer 23 through the camera peripheral device 224, so that the conferencing software of the computer 23 may play the MJPEG-format video data in 720p.
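The resolution-based notification described in the examples above can be sketched as a small message builder. The message fields and the set of recognized resolution labels are hypothetical; the disclosure does not specify a wire format for the notification information.

```python
# Illustrative sketch: build the notification that instructs the interactive
# board to encode the original video data at the client's maximum supported
# resolution. All field names and the resolution set are assumptions.
KNOWN_RESOLUTIONS = ("480p", "720p", "1080p", "4K")

def form_second_notification(max_resolution, target_format="MJPEG"):
    """Form notification information carrying the target format and the
    maximum resolution acquired over the universal interface."""
    if max_resolution not in KNOWN_RESOLUTIONS:
        raise ValueError(f"unrecognized resolution: {max_resolution}")
    return {
        "type": "second_notification",
        "format": target_format,
        "resolution": max_resolution,
    }
```

For the 720p example in the text, `form_second_notification("720p")` yields a message requesting MJPEG-format video data in 720p.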
As shown in
S410, connecting with the client device through the universal interface and identifying whether the client device is running a decoding application, where the client device is installed with conferencing software.
S420, configuring the universal interface to create a camera peripheral device.
S430, when it is recognized that the client device is running a decoding application, transmitting first notification information to the screen sharing receiving device, where the first notification information is used to instruct the screen sharing receiving device to encode the received original video data based on the first notification information to obtain video data in a first data format.
S440, receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in a second data format.
S450, receiving video data in the second data format through the camera peripheral device and presenting the video data in the second data format to the conferencing software.
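The flow of steps S410 to S450 can be traced with a small sketch. The stand-in class and callables below are hypothetical stubs for the client device, the encoding performed by the screen sharing receiving device, and the decoding application; none of these names come from the disclosure.

```python
# Minimal sketch of steps S410-S450, using hypothetical stand-in objects.
class StubClient:
    """Stand-in for the client device connected over the universal interface."""
    def __init__(self, running_decoder):
        self.running_decoder = running_decoder

    def is_running_decoding_app(self):
        return self.running_decoder

def video_transmission(client, encode, decode):
    """Trace the method: `encode` stands for the screen sharing receiving
    device acting on the first notification information; `decode` stands
    for the decoding application running on the client device."""
    trace = ["S410:connect", "S420:create_camera_peripheral"]
    if client.is_running_decoding_app():      # S430: decoding app detected
        first = encode("original")            # video data in the first data format
        second = decode(first)                # S440: decode to the second data format
        trace.append(f"S450:present:{second}")  # S450: present via camera peripheral
    return trace
```

For instance, with an assumed H.264 first data format and an MJPEG second data format, the trace ends with the camera peripheral device presenting MJPEG data to the conferencing software.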
As shown in
S510, connecting with the client device through the universal interface and identifying whether the client device is running a decoding application, where the client device is installed with conferencing software.
S520, configuring the universal interface to create a camera peripheral device.
S530, acquiring the maximum resolution of video data supported by the client device for playback based on the universal interface.
S540, based on the acquired maximum resolution of video data supported by the client device for playback, forming the first notification information.
S550, when it is recognized that the client device is running the decoding application, transmitting first notification information to the screen sharing receiving device, where the first notification information is used to instruct the screen sharing receiving device to encode the received original video data based on the first notification information to obtain video data in a first data format.
For example, when it is recognized that the client device is running the decoding application, first notification information is sent to the screen sharing receiving device. The first notification information is used to instruct the screen sharing receiving device to compress and encode the received original video data based on the maximum resolution to obtain video data in the first data format, and the resolution of the video data in the first data format is equal to the maximum resolution.
S560, receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in the second data format; the resolution of the video data in the second data format is equal to the maximum resolution.
S570, receiving video data in the second data format through the camera peripheral device, and transmitting the video data in the second data format to the conferencing software through the camera peripheral device.
As shown in
S610, connecting with the client device through the universal interface and identifying whether the client device is running a decoding application, where the client device is installed with conferencing software.
S620, configuring the universal interface to create a camera peripheral device.
S630, when it is recognized that the client device is running a decoding application, transmitting first notification information to the screen sharing receiving device, where the first notification information is used to instruct the screen sharing receiving device to encode the received original video data based on the first notification information to obtain video data in a first data format.
S640, receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in a second data format.
S650, receiving video data in the second data format through the camera peripheral device, and transmitting the video data in the second data format to the conferencing software through the camera peripheral device.
S660, when it is recognized that the client device is not running the decoding application, transmitting second notification information to the screen sharing receiving device, where the second notification information is used to notify the screen sharing receiving device to process the received original video data based on the data format supported by the client device for playback to acquire video data in a third data format.
S670, receiving video data in the third data format through the camera peripheral device, and transmitting the video data in the third data format to the conferencing software through the camera peripheral device.
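The branching between the decoding path (S630–S650) and the fallback path (S660–S670) can be summarized as a simple dispatcher. The step labels are taken from the text; the function itself is an illustrative assumption.

```python
# Illustrative sketch: which steps of the method run, depending on whether
# the client device is running the decoding application.
def transmission_steps(decoding_app_running):
    """Return the ordered step labels of the flow S610-S670."""
    steps = ["S610:connect_client", "S620:create_camera_peripheral"]
    if decoding_app_running:
        steps += ["S630:send_first_notification",
                  "S640:decode_first_to_second_format",
                  "S650:present_second_format"]
    else:
        # No decoding capability on the client: the screen sharing
        # receiving device encodes directly into the third data format.
        steps += ["S660:send_second_notification",
                  "S670:present_third_format"]
    return steps
```

The design point here is that the branch is decided once, at detection time, so the screen sharing receiving device always produces a stream the client can consume without the wireless screen sharing device ever decoding video itself.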
As shown in
S710, connecting with the client device through the universal interface and identifying whether the client device is running a decoding application, where the client device is installed with conferencing software.
S720, configuring the universal interface to create a camera peripheral device.
S730, acquiring the maximum resolution of video data supported by the client device for playback based on the universal interface.
S740, based on the maximum resolution of video data supported by the client device for playback, forming the first notification information.
S750, when it is recognized that the client device is running the decoding application, transmitting first notification information to the screen sharing receiving device, where the first notification information is used to instruct the screen sharing receiving device to compress and encode the received original video data based on the maximum resolution to obtain video data in the first data format, and the resolution of the video data in the first data format is equal to the maximum resolution.
S760, receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in a second data format.
S770, receiving video data in the second data format through the camera peripheral device, and transmitting the video data in the second data format to the conferencing software through the camera peripheral device.
S780, determining a preset resolution based on the maximum resolution of video data supported by the client device for playback, and forming the second notification information based on the preset resolution, where the preset resolution is less than the maximum resolution.
S790, when it is recognized that the client device is not running the decoding application, transmitting second notification information to the screen sharing receiving device, where the second notification information is used to notify the screen sharing receiving device to process the received original video data based on the data format supported by the client device for playback and the preset resolution to acquire video data in a third data format, and the resolution of the video data in the third data format is equal to the preset resolution.
S7100, receiving video data in the third data format through the camera peripheral device, and transmitting the video data in the third data format to the conferencing software through the camera peripheral device.
The preset resolution may be determined based on the data transmission capacity of the existing wireless communication network; for example, the preset resolution may be set to 720p.
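The choice of preset resolution described above can be sketched as taking the smaller of the client's maximum resolution and a cap derived from the wireless link capacity. The resolution table and the 720p cap follow the example in the text; treating the cap as a parameter is an assumption.

```python
# Illustrative sketch: select a preset resolution no greater than the
# client's maximum, capped by the assumed wireless-link capacity (720p in
# the example from the text).
RESOLUTION_LINES = {"480p": 480, "720p": 720, "1080p": 1080, "4K": 2160}

def preset_resolution(max_supported, link_cap="720p"):
    """Return the smaller of the client's maximum resolution and the cap."""
    if RESOLUTION_LINES[max_supported] <= RESOLUTION_LINES[link_cap]:
        return max_supported
    return link_cap
```

So a client whose maximum is 1080p falls back to 720p in the third-data-format path, reducing the data amount and the risk of transmission lag, while a 480p-only client is never upscaled.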
It may be understood that, in the embodiment of the present disclosure, when the decoding application can run on the computer side, the wireless screen sharing device may trigger the interactive board 21 to adjust the encoding manner of the original video data based on the maximum resolution of video data supported by the computer, so that the wireless screen sharing device may present the video data in the second data format to the conferencing software at the maximum resolution supported by the computer. When the decoding application fails to run properly on the computer side, the original video data may be encoded based on the preset resolution. If the preset resolution is less than the maximum resolution supported by the computer for playback, the data amount of the video data in the third data format may be reduced by reducing the resolution, thereby mitigating the slow transmission speed and playback lag on the computer that would otherwise be caused by a large data amount of the video data in the third data format.
In some embodiments, the configuring the universal interface to create a camera peripheral device may include: configuring the universal interface to create a network card device and a camera peripheral device.
The receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in a second data format may include: receiving the video data in the first data format and transmitting the video data in the first data format to the network card device; transmitting the video data in the first data format to the decoding application through the network card device, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire the video data in the second data format.
In some embodiments, the universal interface includes a USB interface, and the configuring the universal interface to create a network card device and a camera peripheral device includes: transmitting a USB description message to the client device through the USB interface to enable the client device to match a camera head driver and a network card driver based on the USB description message; and in response to the installation of the camera head driver and the network card driver, creating a network card device and a USB camera peripheral device based on the USB interface.
The camera head driver is a universal driver or a driver customized by the supplier; the network card driver is a universal driver or a driver customized by the supplier.
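On a Linux-based wireless screen sharing device, one plausible way to expose both a camera peripheral and a network card over a single USB interface is a composite gadget combining a UVC (USB Video Class) function with an ECM (CDC Ethernet) function via the kernel's configfs gadget layout. The sketch below only enumerates the configfs entries such a device might create; the gadget name and vendor/product IDs are assumptions, and this is not stated to be the disclosure's implementation.

```python
# Illustrative sketch: configfs entries for a composite USB gadget with a
# UVC camera function and an ECM network function. The gadget name "g1"
# and the IDs are placeholders, not values from the disclosure.
def composite_gadget_entries(gadget="g1"):
    base = f"/sys/kernel/config/usb_gadget/{gadget}"
    return [
        (f"{base}/idVendor", "0x1d6b"),        # assumed vendor ID
        (f"{base}/idProduct", "0x0104"),       # assumed product ID
        (f"{base}/functions/uvc.0", None),     # camera peripheral (UVC)
        (f"{base}/functions/ecm.usb0", None),  # network card (USB CDC ECM)
        (f"{base}/configs/c.1", None),
        (f"{base}/configs/c.1/uvc.0", "link"),     # bind camera to config
        (f"{base}/configs/c.1/ecm.usb0", "link"),  # bind network to config
    ]
```

Presenting both functions in one configuration lets the host match a standard camera driver and a standard network driver from the same USB description, consistent with the universal-driver option mentioned above.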
In some embodiments, the receiving video data in the second data format and presenting the received video data in the second data format to the conferencing software may include:
The video data transmission method according to the above embodiments may be executed in the wireless screen sharing device and conferencing system according to any embodiment of the present disclosure. Technical details not described in detail in the above embodiments may be found in the relevant description of the wireless screen sharing device and conferencing system according to any embodiment of the present disclosure, and will not be repeated here.
As shown in
The memory 820, as a computer-readable storage medium, may be used to store software programs, computer executable programs, and modules, such as program instructions/modules corresponding to the video data transmission method according to any embodiment of the present disclosure. The memory 820 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system and a program required by at least one function, and the storage data area may store data created based on device usage, etc. In addition, the memory 820 may include high-speed random access memory and may further include non-volatile memory, such as at least one disk storage device, flash memory device, or other non-volatile solid-state storage device. In some instances, the memory 820 may further include memories remotely located relative to the processor, and these remote memories may be connected to the device through a network. Examples of the above networks include but are not limited to the Internet, an enterprise intranet, a local area network, a mobile communication network, and combinations thereof.
The processor 810 executes various functional applications and data processing of the device by running software programs, instructions, and modules stored in the memory 820, that is, implementing the above-mentioned video data transmission method. The video data transmission method includes: connecting with the client device through the universal interface and identifying whether the client device is running a decoding application, where the client device is installed with conferencing software; configuring the universal interface to create a camera peripheral device; when it is recognized that the client device is running a decoding application, transmitting first notification information to the screen sharing receiving device, where the first notification information is used to instruct the screen sharing receiving device to encode the received original video data based on the first notification information to obtain video data in a first data format; receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in a second data format; and receiving video data in the second data format through the camera peripheral device, and transmitting the video data in the second data format to the conferencing software through the camera peripheral device.
The wireless screen sharing device mentioned above may be used to execute the video data transmission method according to Embodiment 1, and has corresponding functions and beneficial effects.
According to the embodiment of the present disclosure, a storage medium is further provided, including computer executable instructions, the computer executable instructions are used to execute a video data transmission method when executed by a computer processor. The video data transmission method includes: connecting with the client device through the universal interface and identifying whether the client device is running a decoding application, where the client device is installed with conferencing software; configuring the universal interface to create a camera peripheral device; when it is recognized that the client device is running a decoding application, transmitting first notification information to the screen sharing receiving device, where the first notification information is used to instruct the screen sharing receiving device to encode the received original video data based on the first notification information to obtain video data in a first data format; receiving video data in the first data format and transmitting the video data in the first data format to the decoding application, where the decoding application is used to process the received video data in the first data format based on the data format supported by the client device for playback to acquire video data in a second data format; and receiving video data in the second data format through the camera peripheral device, and transmitting the video data in the second data format to the conferencing software through the camera peripheral device.
A storage medium may be any type of memory device or storage device. The term “storage medium” is intended to include: an installation medium, such as a CD-ROM, a floppy disk, or a magnetic tape device; computer system memory or random access memory, such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory, such as flash memory or magnetic media (such as hard drives or optical storage); registers or other similar types of memory components, etc. The storage medium may further include other types of memory or combinations thereof. In addition, the storage medium may be located in a first computer system in which the program is executed, or may be located in a different second computer system connected to the first computer system through a network (such as the Internet). The second computer system may provide program instructions to the first computer system for execution. The term “storage medium” may include two or more storage media residing in different locations (such as in different computer systems connected through a network). The storage medium may store program instructions (for example, specifically implemented as computer programs) that may be executed by one or more processors.
Of course, the storage medium according to the embodiment of the present disclosure includes computer executable instructions, which are not limited to the video data transmission method as described above, and may also perform relevant operations in the video data transmission method according to any embodiment of the present disclosure.
The storage medium and the electronic device according to the above embodiments may execute the video data transmission method according to any embodiment of the present disclosure. Technical details not described in detail in the above embodiments may be provided in the video data transmission method according to any embodiment of the present disclosure.
The above are only the preferred embodiments of the present disclosure and the technical principles applied therein. The present disclosure is not limited to the specific embodiments described herein, and those skilled in the art may make various apparent changes, readjustments, and substitutions without departing from the scope of protection of the present disclosure. Therefore, although the present disclosure has been described in detail through the above embodiments, the present disclosure is not limited to the above embodiments, and may further include other equivalent embodiments without departing from the concept of the present disclosure. The scope of the present disclosure is determined by the scope of the appended claims.
This application is a continuation of International Application No. PCT/CN2023/118930, filed on Sep. 14, 2023. The aforementioned patent application is hereby incorporated by reference in its entirety.
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/CN2023/118930 | Sep 2023 | WO |
| Child | 18885128 | | US |