This application claims priority to Chinese Patent Application No. 202210819002.8, filed on Jul. 13, 2022, the entire contents of which are incorporated herein by reference.
The present disclosure relates to the field of unmanned aerial vehicles (UAVs), and in particular, to a system and method for managing a UAV in a smart city based on the Internet of Things (IoT).
With the development of UAV technology, UAVs are applied in more and more fields for data collection and in various monitoring occasions. Owing to their high flexibility and strong maneuverability, UAVs can collect and transmit data that is difficult to obtain with manned aircraft. However, when transmitting the data, the efficiency and effect of data transmission are often unsatisfactory due to the influence of signal interference, the transmission path, and the transmission distance.
Therefore, it is desirable to provide a system and method for managing a UAV in a smart city based on the IoT that can transmit data more effectively.
One of the embodiments of the present disclosure provides a method for managing a UAV in a smart city based on the IoT, which is implemented by a system for managing a UAV in a smart city based on the IoT. The system includes a user platform, a service platform, a management platform, a sensing network platform, and an object platform. The method includes the following operations: a general platform of the management platform may obtain requirement information of a user from the user platform through the service platform, and assign the requirement information to a corresponding management sub-platform; the management sub-platform may determine different time domains and different airspaces of at least two UAVs performing missions based on the requirement information, the different time domains may have an overlapping interval, and the different airspaces may have an overlapping interval; and the management platform may control the at least two UAVs to perform different missions in the different time domains and the different airspaces, and collect mission data corresponding to the different missions.
One of the embodiments of the present disclosure provides a system for managing a UAV in a smart city based on the IoT, which includes a user platform, a service platform, a management platform, a sensing network platform, and an object platform. A general platform of the management platform may be configured to obtain requirement information of a user from the user platform through the service platform, and assign the requirement information to a corresponding management sub-platform; the management sub-platform may be configured to determine different time domains and different airspaces of at least two UAVs performing missions based on the requirement information, the different time domains may have an overlapping interval, and the different airspaces may have an overlapping interval; and the management platform may be configured to control the at least two UAVs to perform the different missions in the different time domains and the different airspaces, and collect mission data corresponding to the different missions.
One of the embodiments of the present disclosure provides a non-transitory computer-readable medium including at least one instruction set. When the at least one instruction set is executed by one or more processors of a computer device, the at least one instruction set causes the computer device to implement the above method for managing a UAV in a smart city based on the IoT.
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
In order to illustrate the technical solutions of the embodiments of the present disclosure, a brief introduction regarding the drawings used to describe the embodiments is provided below. Obviously, the drawings described below are merely some examples or embodiments of the present disclosure. Those having ordinary skills in the art, without further creative efforts, may apply the present disclosure to other similar scenarios according to these drawings. It should be understood that the exemplary embodiments are provided merely for better comprehension and application of the present disclosure by those skilled in the art, and are not intended to limit the scope of the present disclosure. Unless apparent from the context or specifically illustrated, the same numeral in the drawings refers to the same structure or operation.
It should be understood that the terms “system”, “device”, “unit” and/or “module” used in the specification are means used to distinguish different components, elements, parts, segments, or assemblies. However, these words may be replaced by other expressions if they serve the same purpose.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise; the plural forms may be intended to include the singular forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise,” “comprises,” and/or “comprising,” “include,” “includes,” and/or “including,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts may be implemented out of the illustrated order; the operations may instead be implemented in inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
In some embodiments, the server 110 may be configured to process information and/or data related to the scene 100. For example, the server 110 may access information and/or data stored in the storage device 120 through the network 130. For another example, the server 110 may be directly connected to the storage device 120 to access the information and/or data stored therein.
The storage device 120 may be configured to store data and/or instructions related to UAV management. In some embodiments, the storage device 120 may store data obtained/acquired from the UAV group 140 and/or the terminal 150. In some embodiments, the storage device 120 may store the data and/or instructions used or executed by the server 110 to complete the exemplary methods described in the present disclosure. In some embodiments, the storage device 120 may be implemented on a cloud platform.
In some embodiments, the storage device 120 may be connected to the network 130 to communicate with one or more components of the scene 100 (for example, the server 110, the UAV group 140, the terminal 150). The one or more components of the scene 100 may access the data or instructions stored in the storage device 120 through the network 130. In some embodiments, the storage device 120 may be part of the server 110. In some embodiments, the storage device 120 may be an independent memory.
The network 130 may facilitate the exchange of information and/or data. In some embodiments, one or more components of the scene 100 (for example, the server 110, the UAV group 140, the terminal 150) may send information and/or data to other components of the scene 100 through the network 130. As an example, the network 130 may include a cable network. In some embodiments, the scene 100 may include one or more network access points, for example, a base station and/or a wireless access point, through which one or more components of the scene 100 may be connected to the network 130 to exchange data and/or information.
The UAV group 140 may be a group composed of one or more UAVs. For example, the UAV group 140 may include a UAV 140-1, a UAV 140-2, . . . , a UAV 140-N. In some embodiments, a data collection requirement may be obtained based on the terminal 150 to determine the count of UAVs in the UAV group 140. In some embodiments, the UAVs of the UAV group 140 may be on the same mission or on different missions. In some embodiments, the UAV group 140 may obtain a data collection instruction of a management platform through the network 130. In some embodiments, the UAV group 140 may transmit the data to the corresponding management platform according to a data type; the data may further be transmitted to a UAV management platform, and then transmitted from the UAV management platform to the corresponding platform.
The terminal 150 may include one or more terminals or software used by a user. In some embodiments, the user (such as relevant staff of the UAV management platform, a traffic management platform, a fire management platform, or a pollution management platform) may be the owner of the terminal 150. In some embodiments, the terminal 150 may include a mobile device, a tablet, a laptop, a wearable smart terminal, etc., or any combination thereof. In some embodiments, the user may obtain relevant information of UAV management through the terminal 150.
It should be noted that the scene 100 is provided only for the purpose of explanation, and is not intended to limit the scope of the present disclosure. For those skilled in the art, a variety of modifications or changes may be made according to the description of the present disclosure. For example, the system for managing a UAV in a smart city based on the IoT may further include a cloud server. For another example, the system for managing a UAV in a smart city based on the IoT may implement similar or different functions on other devices. However, these changes and modifications will not deviate from the scope of the present disclosure.
An IoT system is an information processing system that includes all or a part of a user platform, a service platform, a management platform, a sensing network platform, and an object platform. The user platform may be the leader of the entire IoT operation system and may be configured to obtain a user's requirement. The user's requirement is the foundation and premise of the formation of the IoT operation system, and the platforms of the IoT system are connected to satisfy it. The service platform may be a bridge between the user platform and the management platform to realize communication between the two platforms, and may provide the user with input and output services. The management platform may realize the connection and collaboration between various functional platforms (such as the user platform, the service platform, the sensing network platform, and the object platform). The management platform brings together information about the IoT operation system and may provide the IoT operation system with functions of perception management and control management. The sensing network platform may connect the management platform and the object platform, playing the role of sensing communication for both perceptual information and control information. The object platform may be a functional platform for the generation of perceptual information and the execution of control information.
The information processing in the IoT system may be divided into a processing process of the perceptual information and a processing process of the control information. The control information may be information generated based on the perceptual information. The processing of the perceptual information may include obtaining the perceptual information through the object platform and transmitting it to the management platform through the sensing network platform; the management platform may transmit the processed perceptual information to the service platform, and finally, the perceptual information may be transmitted to the user platform. The user may generate the control information through judgment and analysis of the perceptual information. The control information may be generated by the user platform and issued to the service platform. The service platform may transmit the control information to the management platform. The management platform processes the control information and issues it to the object platform through the sensing network platform to achieve control of a corresponding object.
In some embodiments, when applying the IoT system to urban management, the system may be called an IoT system in a smart city.
In some embodiments, the system for managing a UAV in a smart city based on the IoT 200 may be applied to various scenarios such as UAV management. In some embodiments, the system for managing a UAV in a smart city based on the IoT 200 may obtain data collection requirement information. In some embodiments, the system for managing a UAV in a smart city based on the IoT 200 may control at least two UAVs to perform data collection missions based on the data collection requirement information.
Various scenarios of the UAV management may include the collection and transmission of data (such as image data, etc.) of a traffic type (such as traffic accidents), a fire type, a pollution type, etc. It should be noted that the above scenarios are only examples and do not limit the specific application scenarios of the system for managing a UAV in a smart city based on the IoT 200. Those skilled in the art may apply the system for managing a UAV in a smart city based on the IoT 200 to other suitable scenarios based on the content of the embodiments of the present disclosure.
The following specifically explains the system for managing a UAV in a smart city based on the IoT 200.
The user platform 210 may be a user-oriented service interface and may be configured as a terminal device. In some embodiments, the user platform 210 may receive information from the user. For example, the user platform 210 may receive requirement information input by the user. For another example, the user platform 210 may receive an inquiry instruction of a UAV management strategy input by the user. In some embodiments, the user platform 210 may interact with the service platform 220. For example, the user platform 210 may send the requirement information input by the user to the service platform. For another example, the user platform 210 may send the inquiry instruction of a UAV management strategy to the service platform, and receive the UAV management strategy fed back by the service platform 220. The UAV management strategy may include a distribution of the UAVs, a count of the UAVs, a data transmission mode of the UAVs (e.g., a split strategy for a large amount of data, a data relay transmission based on a second UAV, etc.), or the like.
The service platform 220 may be a platform for preliminary processing of the requirement information, and may be configured as a first server. In some embodiments, the service platform usually adopts a centralized layout. In some embodiments, the service platform 220 may send the requirement information from the user platform 210 to a general platform of the management platform. In some embodiments, the service platform 220 may interact with the management platform 230. For example, the service platform 220 may send the inquiry instruction of a UAV management strategy obtained from the user platform 210 to the management platform 230, and receive the UAV management strategy fed back by the management platform 230.
The management platform 230 may refer to an IoT platform that provides perceptual management and control management, and coordinates the connection and collaboration between the functional platforms. In some embodiments, the management platform 230 may be configured as a second server. In some embodiments, in response to the requirement of the user sent by the service platform 220, the management platform 230 may perform the corresponding management work. For example, the general platform of the management platform 230 summarizes the requirement information and distributes different requirement information to the corresponding management sub-platforms for further processing to determine the corresponding management strategies.
In some embodiments, the management platform usually adopts a front split layout. The front split layout means that the management platform is configured with a general platform and a plurality of sub-platforms (each including its own database), and the plurality of sub-platforms store and process different types of data sent by the sensing network platform. The data of the plurality of sub-platforms may be summarized, stored, and processed by the general platform, and then transmitted to the service platform. The management sub-platforms may be configured based on different mission types or requirement types, and each management sub-platform has a corresponding sensing network sub-platform. The data obtained by the object platform may be uploaded to the corresponding sensing network platform; after being summarized by the sensing network platform, the data may be uploaded to the corresponding management sub-platform. For example, the management sub-platforms may include a traffic accident management sub-platform, a fire accident management sub-platform, a pollution accident management sub-platform, etc.
In some embodiments, each management sub-platform receives the mission data collected by the relevant UAVs from a general database of the sensing network platform, and each management sub-platform may perform processing and operation management on the mission data collected by the UAVs. In some embodiments, each management sub-platform (including its own database) may further upload the processed data to a general management platform, and the general management platform may upload the summarized processed data to the service platform. The data uploaded to the service platform may include UAV management strategy information.
In some embodiments, the general management platform may also be referred to as the general platform of the management platform, and a management sub-platform may also be referred to as a sub-platform of the management platform.
In some embodiments, the management platform 230 may interact with the sensing network platform 240. The management platform 230 may issue a mission data obtaining instruction to the sensing network platform, and receive the summarized mission data collected by the UAVs and uploaded by a general sensing network platform. For example, for a UAV device performing a traffic accident data collection mission, when the mission data collected by the UAV device is processed by the traffic accident sensing network sub-platform and uploaded to the general sensing network platform, the data carries a corresponding identification indicating the mission type. When the traffic accident management sub-platform issues a mission of obtaining the collected traffic accident data, the general sensing network platform may screen the corresponding data through the mission-type identification and upload the data to the traffic accident management sub-platform.
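Merely by way of illustration, the tag-based screening described above may be sketched in Python as follows; the record structure and the field names (uav_id, mission_type, payload) are hypothetical assumptions for illustration and are not specified by the disclosure.

```python
from dataclasses import dataclass

@dataclass
class MissionRecord:
    uav_id: str        # unique serial number of the collecting UAV device
    mission_type: str  # mission-type identification, e.g., "traffic_accident"
    payload: bytes     # mission data collected by the UAV

def screen_by_mission_type(records, mission_type):
    """Screen the records whose mission-type identification matches the
    requesting management sub-platform."""
    return [r for r in records if r.mission_type == mission_type]

# Example: the traffic accident management sub-platform requests its data.
records = [
    MissionRecord("UAV-140-1", "traffic_accident", b"..."),
    MissionRecord("UAV-140-2", "fire_accident", b"..."),
]
traffic_data = screen_by_mission_type(records, "traffic_accident")
print(len(traffic_data))  # 1
```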
In some embodiments of the present disclosure, the management sub-platforms may process UAV monitoring data of different mission types and then summarize the data to the general database, which may reduce the data processing pressure on the entire management platform while gathering the data of each sub-platform for unified processing, so as to achieve collaborative operation and deployment of UAVs for different types of missions by the management platform.
In some embodiments, the management platform 230 may be used to perform the method for managing a UAV in a smart city based on the IoT according to some embodiments of the present disclosure. In response to the user's inquiry requirement, the platform may process the mission data uploaded by the sensing network platform to determine the UAV management strategy.
In some embodiments, the management platform 230 may be further configured to determine different time domains and different airspaces where at least two UAVs perform missions, and control at least two UAVs to perform missions and collect mission data.
In some embodiments, the management platform 230 may be further configured to determine a data transmission mode based on data requirements of different mission data, and features of at least two UAVs.
More explanations of the management platform 230 may be referred to in
The sensing network platform 240 may be a platform for the interaction and connection between the management platform and the object platform. In some embodiments, the sensing network platform adopts the front split layout, including the general sensing network platform and the sensing network sub-platforms, and the sensing network sub-platforms may be in a one-to-one correspondence with the management sub-platforms. The sensing network sub-platforms may include a traffic accident sensing network sub-platform, a fire accident sensing network sub-platform, a pollution accident sensing network sub-platform, etc. In some embodiments, the sensing network platform 240 may be configured as a communication network and gateway, and each sensing network sub-platform may be configured with an independent gateway. In some embodiments, the general platform of the sensing network platform 240 may summarize the management strategies from various management sub-platforms, and distribute different management strategies to the corresponding sensing network sub-platform for processing.
In some embodiments, the general platform of the sensing network platform may summarize the different time domains and different airspaces, determined by the management sub-platforms, in which at least two UAVs perform missions, and further processing may be performed by the sensing network sub-platforms. The sub-platforms of the sensing network platform may send the processed different time domains and different airspaces in which the at least two UAVs perform missions to the object platform.
In some embodiments, the general sensing network platform may also be referred to as the general platform of the sensing network platform, and a sensing network sub-platform may also be referred to as a sub-platform of the sensing network platform.
In some embodiments, the mission data collected by a UAV device may first be uploaded directly to the corresponding sensing network sub-platform for processing and operation management. For example, for a UAV device dispatched to a destination to perform traffic accident monitoring, its data may be uploaded to the traffic accident sensing network sub-platform. For another example, for a UAV device dispatched to a destination to perform fire accident monitoring, its data may be uploaded to the fire accident sensing network sub-platform. In some embodiments, the sensing network sub-platform (including its own database) further uploads the data to the general sensing network platform, and the general sensing network platform uploads the summarized and processed mission data to the management platform 230.
In some embodiments of the present disclosure, the large amount of mission data collected by the UAVs may be processed by the sensing network sub-platforms and then summarized to the general database, which may reduce the data processing pressure on the entire sensing network platform and avoid data clutter and excessive load.
The object platform 250 may be a functional platform for the generation of perceptual information and the final execution of control information. In some embodiments, the object platform 250 may be configured as a functional platform composed of at least one UAV device. Each of the at least one UAV device may be configured with a unique serial number, which may be used to manage the UAV (e.g., allocation, packet co-transmission, etc.). The UAV device may include a positioning device for obtaining real-time position information (e.g., an area, a coordinate, etc.) of the UAV. The UAV device may further include a camera device for collecting real-time image information. In some embodiments, the object platform may interact with the sensing network platform 240, receive the mission data collection instruction issued by the sensing network platform, and upload the collected mission data to the corresponding sensing network sub-platform.
For those skilled in the art, after understanding the principle of the system, the system for managing a UAV in a smart city based on the IoT 200 may be applied to any other suitable scenario without deviating from the principle.
It should be noted that the above description of the system and the components thereof is only for the convenience of description and cannot limit the scope of the embodiments of the present disclosure. It should be understood that, for those skilled in the art, after understanding the principle of the system, each component of the system may be arbitrarily combined, or connections between a sub-system and other components may be constructed, without deviating from the principle. For example, the components may share one storage device, or each component may have its own storage device. Such variations are within the protection scope of the present disclosure.
In 310, a general platform of the management platform obtains requirement information of a user from a user platform through a service platform, and assigns the requirement information to a corresponding management sub-platform.
The requirement information refers to information that may reflect a data collection requirement. In some embodiments, the requirement information may include a data collection requirement from the user. In some embodiments, the user platform may obtain the requirement information input by the user and send the requirement information to the service platform. In some embodiments, the general platform of the management platform may obtain the user's requirement information through the service platform. In some embodiments, the requirement information may include a data collection requirement of collecting a water quality status in a water area for pollution management, or a data collection requirement of collecting a fire situation of a fire scene and a traffic condition on surrounding roads for fire management.
In some embodiments, the management platform may further automatically generate the requirement information by processing the information returned by a UAV group. For example, the management platform may analyze fire information collected by a UAV and generate a data collection requirement of collecting the traffic condition around the fire scene.
In some embodiments, the general platform of the management platform may assign different requirement information to different management sub-platforms for further processing to determine a management strategy of the UAV. The management strategy may include different time domains and different airspaces of at least two UAVs.
In 320, the management sub-platform determines different time domains and different airspaces of the at least two UAVs performing missions based on the requirement information, the different time domains may have an overlapping interval, and the different airspaces may have an overlapping interval.
The time domain refers to a time interval in which a UAV performs missions. In some embodiments, the time domain may include at least one time interval. In some embodiments, the at least two UAVs may have different time domains when performing missions, and the different time domains may have an overlapping interval. For example, the time domain of a UAV 140-1 performing a mission may be from 9:00 to 12:00 on May 1, 2022, and the time domain of a UAV 140-2 performing a mission may be from 11:00 to 13:00 on May 1, 2022. The two UAVs have different time domains when performing missions, but the different time domains have an overlapping interval, that is, between 11:00 and 12:00 on May 1, 2022.
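Merely by way of illustration, the overlapping interval in the example above may be computed as in the following Python sketch, under the simplifying assumption that each time domain is a single interval.

```python
from datetime import datetime

def overlap(interval_a, interval_b):
    """Return the overlapping interval of two time domains, or None."""
    start = max(interval_a[0], interval_b[0])
    end = min(interval_a[1], interval_b[1])
    return (start, end) if start < end else None

uav_1 = (datetime(2022, 5, 1, 9, 0), datetime(2022, 5, 1, 12, 0))   # UAV 140-1
uav_2 = (datetime(2022, 5, 1, 11, 0), datetime(2022, 5, 1, 13, 0))  # UAV 140-2
print(overlap(uav_1, uav_2))  # 11:00 to 12:00 on May 1, 2022
```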
The airspace refers to the flying space range in which a UAV performs missions, for example, a flight height range and a range of mission areas. In some embodiments, the airspace may include at least one space range. In some embodiments, the at least two UAVs may have different airspaces when performing missions, and the different airspaces may have an overlapping interval. For example, the UAV 140-1 performs a mission to collect fire information at a fire scene, while the mission of the UAV 140-2 may be to collect traffic condition information around the fire scene. The two UAVs then have different airspaces when performing missions, but the different airspaces may have an overlapping interval.
In some embodiments, the sub-platform of the management platform may determine the different time domains and different airspaces of the at least two UAVs when performing missions based on the requirement information.
In some embodiments, the sub-platform of the management platform may determine the different time domains and airspaces of the at least two UAVs performing missions based on time information and mission area information in the requirement information. For example, if the requirement is to collect the traffic condition information of section A every morning from 8:00 to 10:00 and every afternoon from 5:00 to 7:00, it may be determined that the time domain of the UAV performing the mission is every morning from 8:00 to 10:00 and every afternoon from 5:00 to 7:00, and the airspace is section A.
In some embodiments, the sub-platform of the management platform may determine the time domain information of a UAV performing missions based on the time information in the requirement information and the UAV's battery endurance. For example, if the mission period in the requirement information is 4 hours, from 8:00 a.m. to 12:00 p.m., and the UAV battery may support up to 3 hours of flight, then the time domain of the UAV performing missions may be determined as 3 hours between 8:00 a.m. and 12:00 p.m.
In some embodiments, the sub-platform of the management platform may determine the airspace of a UAV performing missions based on the requirement information and laws and regulations information. For example, mission area information may be determined based on the requirement information, whether the mission area involves a no-fly area may be determined based on the laws and regulations information, and the airspace of the UAV performing missions may be further determined. In some embodiments, determining the airspace of the UAV performing missions further includes determining a flight route of the UAV. In some embodiments, the flight route of the UAV may be determined based on the mission area information in the requirement information and the laws and regulations information. For example, whether the mission area involves a no-fly area may be determined based on the laws and regulations information; if the mission area involves a no-fly area, the no-fly area is avoided when determining the flight route.
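Merely by way of illustration, the no-fly check may be sketched as follows; modeling the mission area and no-fly areas as axis-aligned bounding boxes is an assumption for illustration, as real airspace data would typically use polygons from the regulations database.

```python
def boxes_intersect(a, b):
    """Each box is (min_x, min_y, max_x, max_y) in map coordinates."""
    return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

def area_avoids_no_fly(mission_area, no_fly_areas):
    """True if the mission area touches no no-fly area, so the flight
    route need not detour around one."""
    return all(not boxes_intersect(mission_area, zone) for zone in no_fly_areas)

mission_area = (0.0, 0.0, 4.0, 4.0)
no_fly_areas = [(5.0, 5.0, 7.0, 7.0)]  # hypothetical restricted zone
print(area_avoids_no_fly(mission_area, no_fly_areas))  # True
```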
In 330, the management platform controls the at least two UAVs to perform the different missions in the different time domains and the different airspaces, and collects mission data corresponding to different missions.
In some embodiments, the management platform 230 may control the at least two UAVs to perform different missions in different time domains and different airspaces, and collect mission data corresponding to the different missions. For example, based on a data collection requirement of collecting water quality information in a water area issued by the pollution management platform, a UAV may be controlled to collect water samples at different positions of the water area for detection, and to capture the water color, etc. For another example, based on a data collection requirement of collecting fire information of a factory fire issued by the fire management platform, a UAV may be controlled to capture images of the fire scene at different positions of the factory, and to collect information including wind force and wind direction, etc.
In some embodiments, the management platform 230 may process the data returned by the UAVs, and based on a processing result, control the at least two UAVs to perform different missions in different time domains and different airspaces and collect mission data corresponding to the different missions. For example, the factory fire data returned by a UAV may be processed; based on the processing result, it may be found that the fire is serious, more firefighting forces need to be called in for fire rescue, and at the same time the traffic condition around the fire scene needs to be monitored to clear a rescue channel. At this time, a nearby UAV may be controlled to fly to the factory to perform a mission of collecting traffic data.
In 340, the management platform determines a data transmission mode based on data requirements of the mission data and features of the at least two UAVs, the features of the at least two UAVs including at least one of a distance between the UAVs, distances between the UAVs and the management platform, a transmission bandwidth situation of the UAVs, and positions of the UAVs.
The data requirement refers to a requirement on the data to be collected in the mission of a UAV. In some embodiments, the data requirement includes an amount of data to be transmitted.
The amount of data to be transmitted refers to the amount of data that needs to be transmitted by the UAV to fulfill the mission. For example, the amount of data to be transmitted in the last 10 minutes of a traffic condition in area A may be 1.5 GB; and the amount of data to be transmitted in the last month of a pollution condition in river C may be 45 GB.
In some embodiments, a total amount of data to be transmitted may be obtained by summarizing the amounts of data to be transmitted of all types of missions being performed. For example, if the mission types 410 being performed include traffic management and disaster relief, and the amounts of data to be transmitted of the traffic management and the disaster relief are 400 GB and 500 GB respectively, then the total amount of data to be transmitted may be 900 GB.
In some embodiments, the amount of data to be transmitted for each type of mission may be obtained by summarizing the amounts of data to be transmitted of the UAVs performing that type of mission. For example, the number of UAVs performing a traffic management mission may be 35, and the number of UAVs performing a disaster relief mission may be 30; the amount of data to be transmitted for each UAV 440 performing the traffic management and the disaster relief may be 10 GB and 5 GB respectively. Then the amount of data to be transmitted for the traffic management may be 350 GB, and the amount of data to be transmitted for the disaster relief may be 150 GB.
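Merely by way of illustration, the two aggregations described above may be computed as follows; the figures are those of the example, and the dictionary layout is an assumption for illustration.

```python
# Amount of data to be transmitted per UAV (GB) and number of UAVs per type.
per_uav_gb = {"traffic_management": 10, "disaster_relief": 5}
uav_count = {"traffic_management": 35, "disaster_relief": 30}

# Per-type amounts, then the total across all mission types being performed.
per_type_gb = {t: per_uav_gb[t] * uav_count[t] for t in per_uav_gb}
total_gb = sum(per_type_gb.values())
print(per_type_gb)  # {'traffic_management': 350, 'disaster_relief': 150}
print(total_gb)     # 500
```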
In some embodiments, the amount of data to be transmitted may be determined through a prediction model. More about the prediction model may be referred to in
A feature of a UAV refers to a characteristic of the UAV when transmitting data. For example, the feature of a UAV may include at least one of a distance between the UAV and another UAV, a distance between the UAV and the management platform, the transmission bandwidth situation of the UAV, and the position of the UAV.
The data transmission mode refers to the mode adopted by UAV when transmitting the data. For example, the data transmission mode may include a relay transmission, a split transmission, and a merging transmission, etc.
The relay transmission refers to the use of relays for data transmission between at least two data transmission points. For example, when the UAV performing missions and the corresponding management platform cannot transmit data due to a long distance between them, other UAVs or platforms between the UAV and the management platform may be used as relays to perform the data transmission.
The split transmission refers to the transmission of data after splitting. For example, when transmitting a large video with a low transmission bandwidth, the video may be split into a plurality of segments and transmitted separately. In some embodiments, the data may be split and transmitted in sequence by the same UAV. In some embodiments, the data may be split and transmitted separately by a plurality of UAVs. In some embodiments, the transmitted split data may be merged based on data marks. In some embodiments, when a UAV collects the data, it may mark the data based on the mission type, the mission position, and the data collection time.
The merging transmission refers to merging data of the same type before transmission. The data of the same type refers to data with the same type and the same collection location. The data types may include a text, a picture, a voice, a video, etc. For example, a plurality of image data captured at the same place and from the same angle may be transmitted after being merged.
In some embodiments, the data transmission mode may be determined based on the data requirements of different mission data and the features of at least two UAVs.
In some embodiments, the data transmission mode may be determined based on the features of the at least two UAVs. In some embodiments, based on comparing the distance between a UAV and the corresponding management platform, and the distance between the at least two UAVs, with a threshold, the data transmission mode may be determined as the relay transmission. For example, when a distance between a UAV A and the corresponding management platform is greater than a threshold, a distance between the UAV A and another UAV B is less than the threshold, and a distance between the UAV B and the management platform is less than the threshold, then the data transmission may be performed with the UAV B as a relay. The threshold may be a preset value, such as 10 km.
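Merely by way of illustration, the relay decision in the example above may be sketched as follows; straight-line distances in kilometres and a single relay hop are assumptions for illustration.

```python
THRESHOLD_KM = 10.0  # preset threshold from the example

def needs_relay(dist_uav_to_platform_km):
    """A relay is needed when the UAV is too far from the platform."""
    return dist_uav_to_platform_km > THRESHOLD_KM

def pick_relay(dist_to_uav_a_km, dist_to_platform_km):
    """Return the first candidate UAV within the threshold of both UAV A
    and the management platform, or None if no candidate qualifies."""
    for uav_id in dist_to_uav_a_km:
        if (dist_to_uav_a_km[uav_id] < THRESHOLD_KM
                and dist_to_platform_km[uav_id] < THRESHOLD_KM):
            return uav_id
    return None

# UAV A is 14 km from the platform; UAV B is 6 km from A and 8 km from
# the platform, so the data may be transmitted with UAV B as the relay.
print(needs_relay(14.0))                    # True
print(pick_relay({"B": 6.0}, {"B": 8.0}))   # B
```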
In some embodiments, the data transmission mode may be determined based on the data requirements of different mission data and the features of the UAVs performing the missions.
In some embodiments, the amount of data may be determined based on the data requirements of different mission data. In some embodiments, the data transmission mode may be determined as the split transmission based on the amount of data and a UAV transmission bandwidth. For example, the amount of data collected by a UAV may be 10 GB, and the current UAV transmission bandwidth may be 1 Mbps. To improve the efficiency of data transmission, the data may be split into five data packs of 2 GB each and transmitted by five nearby UAVs.
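Merely by way of illustration, the split transmission and the subsequent merging based on data marks may be sketched as follows; the pack structure (index marks) is an assumption for illustration.

```python
def split_payload(data: bytes, n_packs: int):
    """Split the data into n roughly equal, index-marked packs, e.g., a
    10 GB payload into five 2 GB packs for five nearby UAVs."""
    pack_size = -(-len(data) // n_packs)  # ceiling division
    return [
        {"index": i, "total": n_packs,
         "chunk": data[i * pack_size:(i + 1) * pack_size]}
        for i in range(n_packs)
    ]

def merge_packs(packs):
    """Reassemble the transmitted packs by their index marks."""
    return b"".join(p["chunk"] for p in sorted(packs, key=lambda p: p["index"]))

packs = split_payload(b"0123456789", 5)
assert merge_packs(packs) == b"0123456789"
```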
In some embodiments, when the data is of the same type, the data transmission mode may be determined as the merging transmission based on the bandwidth situation of the UAV transmission. For example, when a plurality of pieces of data are of the same type, the amount of data is large, and the UAV transmission bandwidth is low, the data of the same type may be merged and then transmitted.
In some embodiments, whether the data is of the same type may be determined based on the data requirements of different mission data. In some embodiments, whether the data is of the same type may be determined based on the data type in the data requirement and a data collection position. For example, when a plurality of pieces of data are of the picture type and the pictures are captured at the same position, the plurality of pieces of data may be data of the same type. In some embodiments, coordinate information of the data collection position may be determined based on the positioning information of the UAV when collecting the data; if a distance between two coordinates is less than a threshold, the two groups of data corresponding to the coordinates may be data of the same type.
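Merely by way of illustration, the same-type test may be sketched as follows; the 50-metre threshold and the record layout are hypothetical assumptions.

```python
import math

SAME_PLACE_M = 50.0  # hypothetical threshold on the collection positions

def same_type(record_a, record_b):
    """record = (data_type, (x_m, y_m)); two records are of the same type
    when the data types match and the collection coordinates are within
    the threshold."""
    if record_a[0] != record_b[0]:
        return False
    (xa, ya), (xb, yb) = record_a[1], record_b[1]
    return math.hypot(xa - xb, ya - yb) < SAME_PLACE_M

a = ("picture", (100.0, 200.0))
b = ("picture", (120.0, 230.0))  # about 36 m away: same type, mergeable
print(same_type(a, b))  # True
```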
In some embodiments of the present disclosure, the time domain and airspace of the UAVs performing missions may be determined based on the data requirement, and the UAVs may be controlled to collect the data corresponding to the missions they perform. As a result, the assignment of the UAVs may be more targeted and reasonable, resources may be reasonably allocated, and the timeliness of UAV coordination and scheduling in emergency situations can be improved. Calling the nearest UAV to help perform a mission helps to obtain information in a timely and comprehensive manner, and may gain time for emergency rescue.
In some embodiments of the present disclosure, transmitting data using a plurality of transmission modes based on the data requirements and the features of the UAVs can improve the efficiency and flexibility of data transmission, and ensure the completeness and accuracy of the transmitted data. Adopting the relay transmission can avoid the situation where a UAV is too far from the management platform to transmit data; the split transmission and the merging transmission may reduce the amount of data transmitted by a single UAV and improve the transmission efficiency.
It should be noted that the description of the above flow 300 is only for example and illustration, and is not intended to limit the scope of the present disclosure. For those skilled in the art, under the guidance of the present disclosure, various amendments and changes may be made to the flow 300; however, these amendments and changes are still within the scope of the present disclosure. For example, the flow 300 may further include a process of assigning the UAVs.
In some embodiments, the determining the data transmission mode further includes determining a priority of the data transmission. For more embodiments of determining the priority of the data transmission, please refer to
In some embodiments, a prediction model 430 may be configured to determine an amount of data to be transmitted by each UAV 440. In some embodiments, an input of the prediction model may include at least one mission type 410 performed by the UAV and monitoring data 420 corresponding to the mission performed. For example, the input of the prediction model 430 may include the mission type 410 and the monitoring data 420. The output may include the amount of data to be transmitted by each UAV 440.
In some embodiments, the prediction model 430 may be a trained machine learning model. The prediction model 430 may include, but is not limited to, any one or a combination of a convolutional neural network and a deep neural network.
The mission type 410 refers to a type of the mission performed by the UAV, for example, traffic management, disaster relief, pollution monitoring, etc.
The monitoring data 420 refers to a working condition of the UAV when performing the mission. In some embodiments, the monitoring data 420 may include one or more of target area information (e.g., a position of an area A), a monitoring item (e.g., all road images of the area A), a monitoring length or duration (e.g., 10 minutes, 12:00˜18:00, etc.), a monitoring frequency (e.g., once an hour, once a day, etc.).
In some embodiments, a parameter of the prediction model 430 may be obtained through training. The training may be performed on an initial prediction model 431 based on a plurality of groups of labeled training samples. A training sample may be a historical mission type 431-2 of a mission performed by the UAV in history and historical monitoring data 431-1 corresponding to the historical mission type 431-2, and the label of the training sample may be the actual amount of mission data transmitted corresponding to that mission type and its monitoring data. It should be understood that the UAV in the sample and the UAV whose amount of data to be transmitted needs to be determined should be of the same type. For example, a processing device may collect the historical mission types 431-2 and the corresponding historical monitoring data 431-1 of a plurality of missions performed by the UAV within a period of time (such as one day, one week, one month, etc.) in history as the training samples, and the actual amounts of data transmitted 431-3 corresponding to the mission types 410 and the monitoring data 420 during the period of time may be taken as the labels of the training samples. The actual amount of data transmitted 431-3 may be obtained by inquiring or detecting the amount of data transmitted during the period of time. The plurality of training samples may be input to the initial prediction model 431; based on the output of the initial prediction model 431 and the labels, a loss function may be constructed, and the parameter of the initial prediction model 431 may be iteratively updated based on the loss function. When the trained model satisfies a preset condition, the training ends, and a trained prediction model 430 may be obtained. The preset condition may include, but is not limited to, the loss function converging, the loss function value being less than a preset value, or the number of training iterations reaching a threshold.
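Merely by way of illustration, the training described above may be sketched in PyTorch as follows; the network shape, the encoding of the mission type and monitoring data into an 8-dimensional feature vector, the mean-squared-error loss, and the placeholder data are all assumptions, since the disclosure does not fix a concrete architecture.

```python
import torch
from torch import nn

# Stand-in for the initial prediction model 431.
model = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Features: encoded historical mission types and historical monitoring data;
# labels: actual amounts of data transmitted (GB). Placeholders here.
features = torch.randn(64, 8)
labels = torch.rand(64, 1) * 10.0

for iteration in range(200):     # iterate until a preset condition is met
    pred = model(features)
    loss = loss_fn(pred, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if loss.item() < 0.01:       # e.g., loss value below a preset value
        break
```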
Through the methods described in some embodiments of the present disclosure, the amount of data to be transmitted of the UAV may be quickly and accurately determined to facilitate subsequent accurate adjustment of the number of the UAVs, so as to ensure the successful completion of the missions of the UAVs.
In some embodiments, the amount of data to be transmitted for each UAV 440 may be determined in other modes. In some embodiments, the processing device may determine the amount of data to be transmitted based on manual experience. For example, staff related to the UAVs (e.g., experts in the UAV industry, UAV operators, etc.) may determine the amount of data to be transmitted relying on past experience. In some embodiments, the processing device may determine the amount of data to be transmitted according to historical data. The historical data may include the mission type 410, the monitoring data 420, and the corresponding actual amount of data. It may be understood that the processing device may take the actual amount of data corresponding to the historical data that is similar to the current mission type 410 and monitoring data 420 as the current amount of data to be transmitted.
In some embodiments, the processing device may determine an optimal number of the UAVs to perform the current missions based on the total amount of data to be transmitted. In some embodiments, the processing device may set preset ranges of the total amount of data to be transmitted in advance; when the total amount of data to be transmitted is within a certain preset range, the optimal number of the UAVs may be the value corresponding to that preset range. As an example, when the total amount of data to be transmitted is within 0˜100 GB, the optimal number of the UAVs may be 5; when the total amount of data to be transmitted is within 100 GB˜500 GB, the optimal number of the UAVs may be 15; when the total amount of data to be transmitted exceeds 500 GB, the optimal number of the UAVs may be 20. It should be understood that the larger the total amount of data to be transmitted, the larger the optimal number of the UAVs. In this way, the collection efficiency and the transmission efficiency of the data may be improved.
In some embodiments, the processing device may adjust the number of the UAVs performing each mission based on the optimal number and a ratio of the amount of data to be transmitted for each type of mission to the total amount of data to be transmitted. For example, there may currently be 11 UAVs performing missions, 7 of them performing a traffic management mission and 4 of them performing a pollution monitoring mission. The current total amount of data to be transmitted may be 604 GB; according to the above correspondence, the optimal number of UAVs may be 20. The current amounts of data to be transmitted for the traffic management and the pollution monitoring may be 397 GB and 207 GB respectively, which account for 65.7% and 34.3% of the total amount of data to be transmitted. Then the optimal number of UAVs to perform the traffic management mission should be 13, and the optimal number of UAVs to perform the pollution monitoring mission should be 7. Therefore, on the current basis, 6 UAVs and 3 UAVs may be added to perform the traffic management mission and the pollution monitoring mission, respectively.
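Merely by way of illustration, the adjustment in the example above may be computed as follows; the mapping from the total amount to the optimal fleet size follows the preset ranges given earlier.

```python
def optimal_fleet_size(total_gb):
    """Preset ranges: up to 100 GB -> 5 UAVs, up to 500 GB -> 15, above -> 20."""
    if total_gb <= 100:
        return 5
    if total_gb <= 500:
        return 15
    return 20

per_type_gb = {"traffic_management": 397, "pollution_monitoring": 207}
current = {"traffic_management": 7, "pollution_monitoring": 4}

total_gb = sum(per_type_gb.values())        # 604 GB
n_optimal = optimal_fleet_size(total_gb)    # 20
for mission, gb in per_type_gb.items():
    target = round(n_optimal * gb / total_gb)   # 13 and 7
    print(mission, "add", target - current[mission])
# traffic_management add 6
# pollution_monitoring add 3
```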
Through the methods described in some embodiments of the present disclosure, the number of the UAVs may be accurately adjusted based on the amount of data to be transmitted to achieve a reasonable distribution of the resources.
In some embodiments, the processing device may determine a data transmission mode based on the total amount of data to be transmitted. In some embodiments, when the total amount of data to be transmitted is greater than a preset threshold, the processing device may split or merge the data before transmitting it to a receiving platform.
For more explanations of data splitting and merging, see
Through the methods described in some embodiments of the present disclosure, the data transmission mode may be determined based on the amount of data to be transmitted, thereby improving the transmission efficiency.
According to the methods described in some embodiments of the present disclosure, the amount of data to be transmitted of different mission data may be determined by the prediction model, and determining the number of the UAVs and the different data transmission modes according to the amount of data to be transmitted may improve the efficiency of data transmission and the flexibility of UAV resource allocation.
In some embodiments, the processing device may obtain a data feature of mission data, and determine the priority of data transmission based on the data feature.
The data feature refers to a feature of the mission data. In some embodiments, the data feature may include at least one of a degree of importance, a degree of urgency, and a data size.
The degree of importance may refer to the degree of influence of the mission data on a mission performance result. For example, in traffic management, the degree of importance of image data indicating a road with heavy traffic may be higher than that of a road with light traffic.
The degree of urgency may reflect whether the current mission data needs to be quickly transmitted and analyzed for mission performance. For example, in disaster relief, when a rescue team sets off, the degree of urgency of the image data of the road from the rescue team's position to a dangerous area may be higher than that of the road from the dangerous area to a hospital.
The data size may refer to an amount of data of the mission data. For example, 500 MB, 30 GB, etc.
In some embodiments, the degree of importance and the degree of urgency of the mission data may be determined by the information contained in the mission data. In some embodiments, the degrees of importance and urgency of different mission data may be preset in correspondence with the information contained in the mission data. As an example, the corresponding degree of importance and degree of urgency of the mission data may be preset according to the mission type 410 contained in the mission data. For example, the degree of importance and the degree of urgency may be the highest when the mission type 410 is the disaster relief, the second highest when the mission type 410 is the traffic management, and the lowest when the mission type 410 is the pollution monitoring. For another example, the degree of importance and the degree of urgency may both be expressed as a value in 0-100; the larger the value, the higher the degree of importance or the degree of urgency corresponding to the mission type 410. When the mission type 410 is the disaster relief, the degree of importance and the degree of urgency may be 100; when the mission type 410 is the traffic management, they may be 70; and when the mission type 410 is the pollution monitoring, they may be 50.
In some embodiments, the size of the data (or data size) may be determined directly by the amount of data of the mission data. For example, the amount of data of the mission data of the pollution monitoring may be 200 MB, and the amount of data of the mission data of the disaster relief may be 1 GB, then the data size of the mission data of the disaster relief may be greater than the data size of the mission data of pollution monitoring.
The priority of the data transmission (also referred to as the data transmission priority) may refer to the sequence of mission data transmission. For example, if data A has the highest priority of data transmission and data B has the lowest priority of data transmission, then data A is transmitted before data B.
In some embodiments, the priority of data transmission may be determined based on the degree of importance, the degree of urgency, and the data size.
In some embodiments, the priority of data transmission may be determined based on the degree of importance and the degree of urgency. As an example, the degree of importance and the degree of urgency may both be expressed as a value in 0-100; the greater the value, the higher the corresponding degree of importance or degree of urgency. When the sum of the values of the degree of importance and the degree of urgency of one mission data is greater than that of another mission data, the priority of the former may be higher than that of the latter. For example, the value of the degree of urgency of mission data A is 60 and the value of the degree of importance of mission data A is 80, while the value of the degree of urgency of mission data B is 90 and the value of the degree of importance of mission data B is 75. The sum of the values of the degree of importance and the degree of urgency of the mission data A is 140, which is smaller than that of the mission data B, which is 165. Then the priority of the mission data B may be higher than that of the mission data A.
In some embodiments, when the priorities of at least two pieces of mission data determined based on the degree of importance and the degree of urgency are the same, their priorities may be determined according to the data sizes of the mission data. In some embodiments, the priority of mission data with a small data size may be higher than the priority of mission data with a large data size. For example, if the data size of mission data C is 800 MB and the data size of mission data D is 1.4 GB, then the priority of the mission data C is higher than the priority of the mission data D.
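Merely by way of illustration, this ordering rule (higher sum of importance and urgency first, smaller data size first as the tie-breaker) may be sketched as follows.

```python
missions = [
    {"name": "A", "importance": 80, "urgency": 60, "size_mb": 900},
    {"name": "B", "importance": 75, "urgency": 90, "size_mb": 1200},
    {"name": "C", "importance": 75, "urgency": 90, "size_mb": 800},
]

# Sort by descending importance+urgency, then by ascending data size.
queue = sorted(
    missions,
    key=lambda m: (-(m["importance"] + m["urgency"]), m["size_mb"]),
)
print([m["name"] for m in queue])  # ['C', 'B', 'A']
```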
In some embodiments, a data transmission priority 540 may be determined by a determination model 520.
In some embodiments, the data transmission priority 540 of each mission data may be determined based on image data 510 in the mission data collected by a UAV through the determination model 520. The determination model 520 may be a trained machine learning model, which may include, but is not limited to, any one or a combination of a convolutional neural network and a deep neural network.
As shown in
The feature recognition layer 520-1 may extract features in the image data 510 and obtain an image feature 521. The image feature may include a color feature, a texture feature, a shape feature, and a spatial relationship feature of the image. The feature recognition layer 520-1 may be a convolutional neural network. As shown in
The priority determination layer 520-2 may process the requirement information 530 and the image feature 521 to determine the corresponding data transmission priority 540. The priority determination layer 520-2 may be a deep neural network. As shown in
In some embodiments, the feature recognition layer 520-1 and the priority determination layer 520-2 may be obtained through joint training. A training sample may include the image data and the historical requirement information in the historical mission data of the UAV. A label may be the priority corresponding to the actual transmission of the image data under the influence of the historical requirement information. As an example, the priority may be expressed by the numbers 0˜10, where 0 indicates the highest priority and 10 indicates the lowest priority; the smaller the number, the higher the priority, and the earlier the mission data is transmitted. The image data in the training sample may be input to an initial feature recognition layer. Then the output of the initial feature recognition layer and the requirement information in the training sample may be input to an initial priority determination layer, and a loss function may be constructed based on the output of the initial priority determination layer and the label. A parameter of each layer of the determination model may be iteratively updated based on the loss function until a preset condition is satisfied, and a trained determination model may be obtained. The preset condition may include, but is not limited to, the loss function converging, the loss function value being less than a preset value, or the number of training iterations reaching a threshold.
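Merely by way of illustration, the two-layer structure described above may be sketched in PyTorch as follows; every dimension, the pooling, and the encoding of the requirement information into a 4-dimensional vector are assumptions for illustration.

```python
import torch
from torch import nn

class DeterminationModel(nn.Module):
    def __init__(self, req_dim=4):
        super().__init__()
        # Feature recognition layer 520-1: a small convolutional network.
        self.feature_layer = nn.Sequential(
            nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Priority determination layer 520-2: a small deep neural network.
        self.priority_layer = nn.Sequential(
            nn.Linear(8 + req_dim, 16), nn.ReLU(), nn.Linear(16, 1),
        )

    def forward(self, image, requirement):
        image_feature = self.feature_layer(image)           # image feature 521
        combined = torch.cat([image_feature, requirement], dim=1)
        return self.priority_layer(combined)                # priority 540

model = DeterminationModel()
image = torch.randn(2, 3, 64, 64)       # image data 510 (batch of 2)
requirement = torch.randn(2, 4)         # encoded requirement information 530
print(model(image, requirement).shape)  # torch.Size([2, 1])
```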
According to the method described in some embodiments of the present disclosure, by determining the priority of data transmission, it can be ensured that important and urgent data is transmitted with priority, and a loss of important data can be avoided. At the same time, the priority transmission of emergency data helps the management platform to deal with emergency situations in a timely manner. When the degrees of importance and urgency of two pieces of data are almost the same, the data with the smaller data size may be transmitted first to ensure its complete transmission, avoiding situations such as a transmission speed drop or a transmission interruption due to the transmission of a large amount of data, thereby ensuring that as much data as possible is transmitted completely.
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur and are intended to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure, and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment,” “an embodiment,” and/or “some embodiments” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Each of the patents, patent applications, publications of patent applications, and other material, such as articles, books, specifications, publications, documents, things, and/or the like, referenced herein is hereby incorporated herein by this reference in its entirety for all purposes, excepting any prosecution file history associated with same, any of same that is inconsistent with or in conflict with the present document, or any of same that may have a limiting effect as to the broadest scope of the claims now or later associated with the present document. By way of example, should there be any inconsistency or conflict between the description, definition, and/or the use of a term associated with any of the incorporated material and that associated with the present document, the description, definition, and/or the use of the term in the present document shall prevail.
In closing, it is to be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the application. Other modifications that may be employed may be within the scope of the application. Thus, by way of example, but not of limitation, alternative configurations of the embodiments of the application may be utilized in accordance with the teachings herein. Accordingly, embodiments of the present application are not limited to that precisely as shown and described.