Embodiments of this disclosure relate to the field of terminal devices, and in particular, to a route planning method and apparatus.
With the development of terminal device technologies, a terminal device has an increasing number of functions and application scenarios. A user may plan a driving route by using a map application provided by the terminal device. Currently, in a typical route planning method on a terminal, the user specifies a start point and an end point, and the terminal plans a corresponding route based on the start point and the end point that are set by the user. In the route planning process, a route is usually planned according to a rule like a shortest route length or fewest traffic lights. This route planning manner is simple and cannot meet a personalized requirement of the user.
Embodiments of this disclosure provide a route planning method and apparatus. The method includes: An electronic device plans, based on an indication of a user, a route that meets a target graphic specified by the user, so that a personalized setting of the route can be implemented. This improves user experience.
According to a first aspect, an embodiment of this disclosure provides a route planning method. The method includes: An electronic device receives a first operation, where the first operation indicates a target graphic. The electronic device determines a target route based on the target graphic indicated by the first operation and a target region on a map, where a similarity between a shape of the target route and a shape of the target graphic is greater than a preset threshold, the target route is in the target region, and the target region is set based on a location of the electronic device or based on a user operation. After determining the target route, the electronic device displays the target route on the map. In this way, in this embodiment of this disclosure, the user can freely set any graphic without limitation. The electronic device can perform route planning based on the graphic set by the user and obtain a route similar to that graphic, to meet the user's requirement and implement a personalized route setting. In addition, in this embodiment of this disclosure, the electronic device performs route planning within the target region on the map, so that the planning is more targeted, and a success rate and efficiency of route planning can be effectively improved.
For example, the planned target route in this embodiment of this disclosure includes a start point and an end point. The start point and the end point are automatically generated based on the planned route.
For example, the target route may be located anywhere within the target region.
For example, the electronic device may obtain a current location of the electronic device, and determine the target region based on the current location.
For example, a range of the target region may be a preset value, or may be determined based on a latest setting of the user.
For example, the target region may be a circle, a rectangle, or another geometric shape centered on the current location of the electronic device.
For example, a center of the target region may alternatively be determined based on a user setting.
For example, the target region may cover a city, a town, or the like on the map. This is not limited in this disclosure.
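For example, the foregoing planning flow may be illustrated by the following minimal sketch. The polyline representation of routes and graphics, and the shape similarity metric (symmetric mean point distance after normalizing both shapes to a unit box), are assumptions made only for illustration; this disclosure does not limit the specific similarity measure.

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal sketch of the route planning flow in the first aspect. Routes and
 *  graphics are polylines (lists of x/y points); the similarity metric below
 *  is an illustrative choice, not one fixed by this disclosure. */
public final class RoutePlanningSketch {

    /** Returns the first candidate route whose shape similarity to the
     *  target graphic exceeds the preset threshold, or null if none does. */
    public static List<double[]> planRoute(List<double[]> targetGraphic,
                                           List<List<double[]>> candidateRoutes,
                                           double similarityThreshold) {
        for (List<double[]> route : candidateRoutes) {
            if (shapeSimilarity(route, targetGraphic) > similarityThreshold) {
                return route; // a route in the target region resembles the graphic
            }
        }
        return null; // no route in the target region meets the threshold
    }

    /** Similarity in (0, 1]: higher means the two shapes are closer. */
    static double shapeSimilarity(List<double[]> a, List<double[]> b) {
        List<double[]> na = normalize(a);
        List<double[]> nb = normalize(b);
        double d = (meanNearest(na, nb) + meanNearest(nb, na)) / 2.0;
        return 1.0 / (1.0 + d);
    }

    // Mean distance from each point of `from` to its nearest point in `to`.
    private static double meanNearest(List<double[]> from, List<double[]> to) {
        double sum = 0;
        for (double[] p : from) {
            double best = Double.MAX_VALUE;
            for (double[] q : to) {
                best = Math.min(best, Math.hypot(p[0] - q[0], p[1] - q[1]));
            }
            sum += best;
        }
        return sum / from.size();
    }

    // Scales a polyline into the unit square so that only shape matters,
    // not absolute size or position.
    private static List<double[]> normalize(List<double[]> pts) {
        double minX = Double.MAX_VALUE, minY = Double.MAX_VALUE;
        double maxX = -Double.MAX_VALUE, maxY = -Double.MAX_VALUE;
        for (double[] p : pts) {
            minX = Math.min(minX, p[0]); maxX = Math.max(maxX, p[0]);
            minY = Math.min(minY, p[1]); maxY = Math.max(maxY, p[1]);
        }
        double sx = Math.max(maxX - minX, 1e-9);
        double sy = Math.max(maxY - minY, 1e-9);
        List<double[]> out = new ArrayList<>();
        for (double[] p : pts) {
            out.add(new double[] {(p[0] - minX) / sx, (p[1] - minY) / sy});
        }
        return out;
    }
}
```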
In a possible implementation, that the electronic device determines the target route based on the target graphic and the target region on the map includes: The electronic device obtains feature information of the target graphic in response to the received first operation, where the feature information describes a graphic feature of the target graphic. The electronic device determines the target route based on the feature information of the target graphic and the target region on the map. In this way, in this embodiment of this disclosure, corresponding processing is performed on the target graphic to obtain corresponding feature information, so that a corresponding target route can be obtained through matching based on the graphic features of the target graphic. In other words, the graphic enclosed by the target route has the same graphic features as the target graphic.
In a possible implementation, that the electronic device obtains the feature information of the target graphic in response to the received first operation includes: The electronic device performs abstraction processing on the target graphic, to obtain a target abstract graphic of the target graphic, where the target abstract graphic is used to describe a key point in the target graphic or a connection relationship between key points in the target graphic. The electronic device obtains the feature information based on the target abstract graphic. In this way, in this embodiment, for the target graphic indicated by the user, the target graphic is expressed in a form of a data description, namely, the feature information, while the key features of the original target graphic remain unchanged, so that a target route whose graphic meets the feature information description can be constructed.
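For example, one possible form of abstraction processing is polyline simplification, which keeps only the key points of the target graphic. The following sketch uses the Douglas-Peucker algorithm as an assumed example; this disclosure does not limit the specific abstraction algorithm.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of abstraction processing: reduce a drawn graphic to its key
 *  points with Douglas-Peucker simplification. The algorithm choice is an
 *  illustrative assumption; the disclosure only requires that key points
 *  and their connection relationships be preserved. */
public final class GraphicAbstraction {

    public static List<double[]> keyPoints(List<double[]> stroke, double epsilon) {
        List<double[]> out = new ArrayList<>();
        simplify(stroke, 0, stroke.size() - 1, epsilon, out);
        out.add(stroke.get(stroke.size() - 1)); // keep the final point
        return out;
    }

    // Keeps the point farthest from the chord if it deviates by more than
    // epsilon; otherwise the whole span collapses to a straight segment.
    // Each call emits the points of its range [lo, hi), so the recursion
    // produces key points in order without duplicates.
    private static void simplify(List<double[]> pts, int lo, int hi,
                                 double eps, List<double[]> out) {
        if (hi <= lo + 1) { out.add(pts.get(lo)); return; }
        int farthest = lo;
        double maxDist = -1;
        for (int i = lo + 1; i < hi; i++) {
            double d = pointToSegment(pts.get(i), pts.get(lo), pts.get(hi));
            if (d > maxDist) { maxDist = d; farthest = i; }
        }
        if (maxDist > eps) {
            simplify(pts, lo, farthest, eps, out);
            simplify(pts, farthest, hi, eps, out);
        } else {
            out.add(pts.get(lo)); // span is nearly straight: keep endpoint only
        }
    }

    // Distance from point p to the segment a-b.
    private static double pointToSegment(double[] p, double[] a, double[] b) {
        double dx = b[0] - a[0], dy = b[1] - a[1];
        double len2 = dx * dx + dy * dy;
        double t = len2 == 0 ? 0
                : Math.max(0, Math.min(1,
                    ((p[0] - a[0]) * dx + (p[1] - a[1]) * dy) / len2));
        return Math.hypot(p[0] - (a[0] + t * dx), p[1] - (a[1] + t * dy));
    }
}
```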
In a possible implementation, the feature information includes at least one of the following: connection feature information, geometric feature information, angle feature information, size feature information, and location feature information.
For example, the connection feature information may indicate a connection status between different geometries, and the connection feature information may include a connection point and an angle corresponding to the connection point.
For example, the geometric feature information indicates geometric shapes included in the graphic, which may include but are not limited to a diamond, a rectangle, or a circle.
For example, the size feature information indicates a size relationship between different geometries, and may also be understood as a size ratio.
For example, the location feature information may indicate a location relationship between different graphics.
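For example, the feature information listed above may be organized as a plain data description. The following sketch shows one possible layout; all field names and types are illustrative assumptions rather than a data schema fixed by this disclosure.

```java
import java.util.List;

/** Illustrative container for the feature information listed above. */
public final class FeatureInfo {

    /** Connection feature: a connection point shared by two geometries and
     *  the angle formed at that point. */
    public record Connection(double x, double y, double angleDeg) {}

    /** Geometric feature: primitive shapes detected in the graphic. */
    public enum Primitive { DIAMOND, RECTANGLE, CIRCLE, LINE, ARC }

    public List<Connection> connections;    // connection feature information
    public List<Primitive> primitives;      // geometric feature information
    public List<Double> cornerAnglesDeg;    // angle feature information
    public List<Double> sizeRatios;         // size feature information:
                                            // size ratios between geometries
    public List<double[]> relativeCenters;  // location feature information:
                                            // relative positions of geometries
}
```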
In a possible implementation, that the electronic device determines the target route based on the feature information of the target graphic and the target region on the map includes: The electronic device performs matching on the target region on the map based on the feature information. The electronic device determines a successfully matched route as the target route.
In a possible implementation, that the electronic device determines the target route based on the feature information of the target graphic and the target region on the map includes: The electronic device performs matching on the target region on the map based on first feature information, to obtain at least one successfully matched first target subroute. The electronic device performs matching on a surrounding region of the at least one first target subroute based on second feature information, to obtain a successfully matched second target subroute, where the second target subroute is obtained by performing matching on a surrounding region of a third target subroute of the at least one first target subroute, the first feature information and the second feature information are partial information of the feature information, and the second target subroute and the third target subroute form the target route. In this way, in this embodiment, the plurality of pieces of information included in the feature information are separately matched with routes in the target region, to obtain a target route that meets the feature constraints.
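For example, the foregoing two-stage matching may be sketched as follows, where RegionIndex is a hypothetical stand-in for a real road-network index and all type and method names are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.List;

/** Sketch of the two-stage matching described above. */
public final class ProgressiveMatcher {

    /** A subroute is a polyline of map points. */
    record Subroute(List<double[]> points) {}

    /** The third target subroute (a first-stage hit) and the second target
     *  subroute matched around it together form a candidate target route. */
    record Route(Subroute third, Subroute second) {}

    interface RegionIndex {
        /** Subroutes in the whole target region matching one piece of
         *  feature information. */
        List<Subroute> match(Object featurePart);

        /** Subroutes in the surrounding region of an existing subroute
         *  matching one piece of feature information. */
        List<Subroute> matchNear(Subroute anchor, Object featurePart);
    }

    public static List<Route> match(RegionIndex index,
                                    Object firstFeatureInfo,
                                    Object secondFeatureInfo) {
        List<Route> results = new ArrayList<>();
        // Stage 1: first target subroutes, matched over the whole region.
        for (Subroute first : index.match(firstFeatureInfo)) {
            // Stage 2: search only the surrounding region of each stage-1
            // hit, so each additional feature shrinks the search space.
            for (Subroute second : index.matchNear(first, secondFeatureInfo)) {
                results.add(new Route(first, second));
            }
        }
        return results;
    }
}
```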
In a possible implementation, if a plurality of second target subroutes are obtained through matching, that the electronic device determines the target route based on the feature information of the target graphic and the target region on the map further includes: The electronic device receives a second operation, where the second operation is used to specify the target route from routes including the plurality of second target subroutes. The electronic device determines the target route in response to the received second operation. In this way, if a plurality of routes that meet the feature information constraint are obtained through matching, the user may specify the target route.
In a possible implementation, that the electronic device determines the target route based on the feature information of the target graphic and the target region on the map includes: determining a plurality of candidate routes based on the target graphic and the target region on the map; and determining the target route from the plurality of candidate routes based on a preset condition, where the preset condition includes at least one of the following: a similarity between the shape of the target route and the target abstract graphic is the highest, a length of the target route is the shortest, and the target route passes through a specified region, where the specified region is preset. In this way, if a plurality of routes are obtained through matching based on the constraints, the routes may be further filtered based on the preset condition, to obtain a route that better meets the constraint requirement.
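For example, filtering the candidate routes based on the preset condition may be sketched as follows. Treating the specified region as a hard filter and breaking similarity ties by length is an ordering assumed only for illustration, because the preset condition may include any one or more of the listed items.

```java
import java.util.Comparator;
import java.util.List;

/** Sketch of selecting the target route from candidates by the preset
 *  conditions above. Field names and the condition ordering are
 *  illustrative assumptions. */
public final class CandidateFilter {

    record Candidate(List<double[]> points, double similarity,
                     double lengthKm, boolean passesSpecifiedRegion) {}

    /** Keeps routes through the specified region, then prefers the highest
     *  shape similarity, breaking ties by the shortest length; returns null
     *  if no candidate passes the region filter. */
    public static Candidate selectTarget(List<Candidate> candidates) {
        return candidates.stream()
                .filter(Candidate::passesSpecifiedRegion)
                .max(Comparator.comparingDouble(Candidate::similarity)
                        .thenComparing(Comparator
                                .comparingDouble(Candidate::lengthKm)
                                .reversed())) // for equal similarity, shorter wins
                .orElse(null);
    }
}
```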
In a possible implementation, that the electronic device receives the first operation includes: receiving the first operation performed on a first interface; and that the electronic device displays the target route on the map includes: displaying the target route on a map on a second interface, where the second interface is different from the first interface. In this way, in this embodiment, a user setting interface, namely, the first interface, may be provided. The user may freely set the target graphic on the first interface. After performing route planning based on the target graphic set by the user on the first interface to obtain the target route, the electronic device may display the target route on the map on the second interface. In other words, when indicating the target graphic on the first interface, the user does not need to perform any setting based on the map: the user does not need to set a start point or an end point, does not need to select a route, and only needs to set any graphic based on an actual requirement. This effectively reduces operation difficulty for the user, and an automatic and personalized route planning manner can be implemented without requiring the user to manually select a route.
In a possible implementation, the first operation indicates to draw the target graphic on the first interface.
In a possible implementation, the first interface is a gallery interface, and the gallery interface includes at least one locally stored graphic; and that the electronic device receives the first operation performed on the first interface includes: The electronic device determines the target graphic in response to the received first operation performed on the target graphic, where the target graphic is one of the at least one graphic.
In a possible implementation, the first interface is a candidate graphic interface, and the candidate graphic interface includes at least one candidate graphic; and that the electronic device receives the first operation performed on the first interface includes: The electronic device determines the target graphic in response to the received first operation performed on a target candidate graphic, where the target candidate graphic is one of the at least one candidate graphic. In this way, in this embodiment, some candidate graphics may be preset for the user to select.
In a possible implementation, that the electronic device determines the target route based on the feature information and the target region includes: segmenting the target region, to obtain N target sub-regions, where N is a positive integer greater than 1; prioritizing the N target sub-regions based on target feature information, where a sub-region that includes more routes corresponding to the target feature information has a higher priority, and the target feature information is partial information of the feature information; and performing matching on the N target sub-regions one by one based on the feature information and a priority sequence of the N target sub-regions. In this way, in this embodiment, the map is segmented, so that route matching can be performed in each sub-region obtained through segmentation, to increase the route search speed.
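For example, the segmentation and prioritization may be sketched as follows, assuming a simple grid split of the target region; the grid granularity and the match-counting callback are illustrative assumptions.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/** Sketch of the segmentation strategy above: split the target region into
 *  a grid of sub-regions, rank them by how many routes match the target
 *  feature information, and search high-priority sub-regions first. */
public final class RegionSegmentation {

    record SubRegion(double minLat, double minLng,
                     double maxLat, double maxLng) {}

    interface MatchCounter {
        /** Number of routes in r matching the target feature information. */
        int countMatches(SubRegion r);
    }

    public static List<SubRegion> prioritizedSubRegions(
            SubRegion target, int rows, int cols, MatchCounter counter) {
        double dLat = (target.maxLat() - target.minLat()) / rows;
        double dLng = (target.maxLng() - target.minLng()) / cols;
        List<SubRegion> subs = new ArrayList<>();
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                subs.add(new SubRegion(
                        target.minLat() + i * dLat,
                        target.minLng() + j * dLng,
                        target.minLat() + (i + 1) * dLat,
                        target.minLng() + (j + 1) * dLng));
            }
        }
        // More matching routes => higher priority => searched earlier.
        subs.sort(Comparator.comparingInt(counter::countMatches).reversed());
        return subs;
    }
}
```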
In a possible implementation, that the electronic device displays the target route on the map on the second interface includes: The electronic device displays a key point on the target route on the map on the second interface.
In a possible implementation, that the electronic device displays the target route on the map on the second interface includes: displaying an entire route of the target route on the map on the second interface.
In a possible implementation, that the electronic device determines the target route based on the target graphic and the target region on the map includes: obtaining a current location of the electronic device; and determining the target region based on the current location of the electronic device and a target region range, where the target region range is a preset value, or the target region range is set by using a received region range setting operation.
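For example, deriving the target region from the current location and the target region range may be sketched as follows; approximating the region as a latitude/longitude bounding box is an assumption made for illustration.

```java
/** Sketch of deriving the target region from the device location and a
 *  range value (a preset value or one set by the user). */
public final class TargetRegionBuilder {

    record Region(double minLat, double minLng,
                  double maxLat, double maxLng) {}

    private static final double KM_PER_DEG_LAT = 111.32; // approximate

    /** Returns a bounding box extending rangeKm around (lat, lng). */
    public static Region around(double lat, double lng, double rangeKm) {
        double dLat = rangeKm / KM_PER_DEG_LAT;
        // Longitude degrees shrink with latitude (factor cos(lat)).
        double dLng = rangeKm / (KM_PER_DEG_LAT * Math.cos(Math.toRadians(lat)));
        return new Region(lat - dLat, lng - dLng, lat + dLat, lng + dLng);
    }
}
```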
In a possible implementation, the target region is set by using a latest received user operation.
According to a second aspect, an embodiment of this disclosure provides a route planning apparatus. The apparatus includes a receiving module, a processing module, and a display module. The receiving module is configured to receive a first operation, where the first operation indicates a target graphic. The processing module is configured to determine a target route based on the target graphic and a target region on a map, where a similarity between a shape of the target route and a shape of the target graphic is greater than a preset threshold, the target route is in the target region, and the target region is set based on a location of an electronic device, or the target region is set based on a user operation. The display module is configured to display the target route on the map.
In a possible implementation, the processing module is specifically configured to: obtain feature information of the target graphic in response to the received first operation, where the feature information describes a graphic feature of the target graphic; and determine the target route based on the feature information of the target graphic and the target region on the map.
In a possible implementation, the processing module is specifically configured to: perform abstraction processing on the target graphic, to obtain a target abstract graphic of the target graphic, where the target abstract graphic is used to describe a key point in the target graphic or a connection relationship between key points in the target graphic; and obtain the feature information based on the target abstract graphic.
In a possible implementation, the feature information includes at least one of the following: connection feature information, geometric feature information, angle feature information, size feature information, and location feature information.
In a possible implementation, the processing module is specifically configured to: perform matching on the target region on the map based on the feature information; and determine a successfully matched route as the target route.
In a possible implementation, the processing module is specifically configured to: perform matching on the target region on the map based on first feature information, to obtain at least one successfully matched first target subroute; and perform matching on a surrounding region of the at least one first target subroute based on second feature information, to obtain a successfully matched second target subroute, where the second target subroute is obtained by performing matching on a surrounding region of a third target subroute of the at least one first target subroute, the first feature information and the second feature information are partial information of the feature information, and the second target subroute and the third target subroute form the target route.
In a possible implementation, the processing module is specifically configured to: if a plurality of second target subroutes are obtained through matching, receive a second operation, where the second operation is used to specify the target route from a route including the plurality of second target subroutes; and determine the target route in response to the received second operation.
In a possible implementation, the processing module is specifically configured to: determine a plurality of candidate routes based on the target graphic and the target region on the map; and determine the target route from the plurality of candidate routes based on a preset condition, where the preset condition includes at least one of the following: a similarity between the shape of the target route and the target abstract graphic is the highest, a length of the target route is the shortest, and the target route passes through a specified region, where the specified region is preset.
In a possible implementation, the receiving module is configured to receive the first operation performed on a first interface. The display module is configured to display the target route on a map on a second interface, where the second interface is different from the first interface.
In a possible implementation, the first operation indicates to draw the target graphic on the first interface.
In a possible implementation, the first interface is a gallery interface, and the gallery interface includes at least one locally stored graphic; and the receiving module is configured to determine the target graphic in response to the received first operation performed on the target graphic, where the target graphic is one of the at least one graphic.
In a possible implementation, the first interface is a candidate graphic interface, and the candidate graphic interface includes at least one candidate graphic; and the receiving module is configured to determine the target graphic in response to the received first operation performed on a target candidate graphic, where the target candidate graphic is one of the at least one candidate graphic.
In a possible implementation, the processing module is configured to: segment the target region, to obtain N target sub-regions, where N is a positive integer greater than 1; prioritize the N target sub-regions based on target feature information, where a sub-region that includes more routes corresponding to the target feature information has a higher priority, and the target feature information is partial information of the feature information; and perform matching on the N target sub-regions one by one based on the feature information and a priority sequence of the N target sub-regions.
In a possible implementation, the display module is configured to display, on the map on the second interface, a key point on the target route.
In a possible implementation, the display module is configured to display an entire route of the target route on the map on the second interface.
In a possible implementation, the processing module is configured to: obtain a current location of the electronic device; and determine the target region based on the current location of the electronic device and a target region range, where the target region range is a preset value, or the target region range is set by using a received region range setting operation.
In a possible implementation, the target region is set by using a latest received user operation.
Any one of the second aspect and the implementations of the second aspect respectively correspond to any one of the first aspect and the implementations of the first aspect. For technical effects corresponding to any one of the second aspect and the implementations of the second aspect, refer to technical effects corresponding to any one of the first aspect and the implementations of the first aspect. Details are not described herein again.
According to a third aspect, an embodiment of this disclosure provides a computer-readable medium, configured to store a computer program, where the computer program includes instructions for performing the method according to the first aspect or any possible implementation of the first aspect.
According to a fourth aspect, an embodiment of this disclosure provides a computer program, where the computer program includes instructions for performing the method according to the first aspect or any possible implementation of the first aspect.
According to a fifth aspect, an embodiment of this disclosure provides a chip, where the chip includes a processing circuit and a transceiver pin. The transceiver pin and the processing circuit communicate with each other through an internal connection path. The processing circuit performs the method according to any one of the first aspect or the possible implementations of the first aspect, to control a receive pin to receive a signal and a transmit pin to send a signal.
According to a sixth aspect, an embodiment of this disclosure provides an electronic device. The electronic device includes one or more processors, a memory, and one or more computer programs. The one or more computer programs are stored in the memory. When the one or more computer programs are executed by the one or more processors, the electronic device is enabled to perform the method according to any one of the first aspect or the possible implementations of the first aspect.
The following clearly and completely describes the technical solutions in embodiments of this disclosure with reference to the accompanying drawings in embodiments of this disclosure. It is clear that the described embodiments are some rather than all of the embodiments of this disclosure.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 194, a subscriber identity module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, an optical proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, a neural-network processing unit (NPU), and/or the like. Different processing units may be independent components, or may be integrated into one or more processors.
The controller may be a nerve center and a command center of the electronic device 100. The controller may generate an operation control signal based on an instruction operation code and a time sequence signal, to complete control of instruction reading and instruction execution.
A memory may be further disposed in the processor 110, and is configured to store instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or the data again, the processor 110 may directly invoke them from the memory. This avoids repeated access, reduces waiting time of the processor 110, and improves system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM interface, a universal serial bus (USB) interface, and/or the like.
The I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may include a plurality of groups of I2C buses. The processor 110 may be separately coupled to the touch sensor 180K, a charger, a flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, to implement a touch function of the electronic device 100.
The I2S interface may be configured to perform audio communication. In some embodiments, the processor 110 may include a plurality of groups of I2S buses. The processor 110 may be coupled to the audio module 170 through the I2S bus, to implement communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through a Bluetooth headset.
The PCM interface may also be used to perform audio communication, and sample, quantize, and encode an analog signal. In some embodiments, the audio module 170 may be coupled to the wireless communication module 160 through a PCM bus interface. In some embodiments, the audio module 170 may alternatively transmit an audio signal to the wireless communication module 160 through the PCM interface, to implement a function of answering a call through a Bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus, and is configured to perform asynchronous communication. The bus may be a bidirectional communication bus. The bus converts to-be-transmitted data between serial communication and parallel communication. In some embodiments, the UART interface is usually configured to connect the processor 110 and the wireless communication module 160. For example, the processor 110 communicates with a Bluetooth module in the wireless communication module 160 through the UART interface, to implement a Bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, to implement a function of playing music through a Bluetooth headset.
The MIPI interface may be configured to connect the processor 110 and a peripheral component like the display 194 or the camera 193. The MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), and the like. In some embodiments, the processor 110 communicates with the camera 193 through the CSI interface, to implement a photographing function of the electronic device 100. The processor 110 communicates with the display 194 through the DSI interface, to implement a display function of the electronic device 100.
The GPIO interface may be configured by using software. The GPIO interface may be configured to transmit a control signal or a data signal. In some embodiments, the GPIO interface may be configured to connect the processor 110 and the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, or the like. The GPIO interface may alternatively be configured as the I2C interface, the I2S interface, the UART interface, the MIPI interface, or the like.
The USB interface 130 is an interface that conforms to a USB standard specification, and may be specifically a mini USB interface, a micro USB interface, a USB type-C interface, or the like. The USB interface 130 may be configured to connect to the charger to charge the electronic device 100, or may be configured to perform data transmission between the electronic device 100 and a peripheral device, or may be configured to connect to a headset for playing audio through the headset. The interface may be further configured to connect to another electronic device like an AR device.
It may be understood that an interface connection relationship between the modules shown in this embodiment is merely an example for description, and constitutes no limitation on the structure of the electronic device 100. In some other embodiments, the electronic device 100 may alternatively use an interface connection manner different from that in the foregoing embodiment, or use a combination of a plurality of interface connection manners.
The charging management module 140 is configured to receive a charging input from the charger. The charger may be a wireless charger or a wired charger. In some embodiments of wired charging, the charging management module 140 may receive a charging input of a wired charger through the USB interface 130. In some embodiments of wireless charging, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 supplies power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is configured to connect the battery 142, the charging management module 140, and the processor 110. The power management module 141 receives an input of the battery 142 and/or the charging management module 140, to supply power to the processor 110, the internal memory 121, an external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may be further configured to monitor parameters such as a battery capacity, a battery cycle count, and a battery health status (electric leakage or impedance). In some other embodiments, the power management module 141 may alternatively be disposed in the processor 110. In some other embodiments, the power management module 141 and the charging management module 140 may alternatively be disposed in a same device.
A wireless communication function of the electronic device 100 may be implemented by using the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antenna 1 and the antenna 2 are configured to transmit and receive an electromagnetic wave signal. Each antenna in the electronic device 100 may be configured to cover one or more communication frequency bands. Different antennas may be further multiplexed, to improve antenna utilization. For example, the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In some other embodiments, the antenna may be used in combination with a tuning switch.
The mobile communication module 150 may provide a wireless communication solution that includes 2G/3G/4G/5G or the like and that is applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low-noise amplifier (LNA), and the like. The mobile communication module 150 may receive an electromagnetic wave through the antenna 1, perform processing such as filtering or amplification on the received electromagnetic wave, and transmit a processed electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may further amplify a signal modulated by the modem processor, and convert an amplified signal into an electromagnetic wave for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be disposed in a same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is configured to modulate a to-be-sent low-frequency baseband signal into a medium-high frequency signal. The demodulator is configured to demodulate a received electromagnetic wave signal into a low-frequency baseband signal. Then, the demodulator transmits the low-frequency baseband signal obtained through demodulation to the baseband processor for processing. The low-frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal by using an audio device (which is not limited to the speaker 170A, or the receiver 170B), or displays an image or a video on the display 194. In some embodiments, the modem processor may be an independent component. In some other embodiments, the modem processor may be independent of the processor 110, and is disposed in a same device as the mobile communication module 150 or another functional module.
The wireless communication module 160 may provide a wireless communication solution that is applied to the electronic device 100, and that includes a wireless local area network (WLAN) (for example, a WI-FI network), BLUETOOTH (BT), a global navigation satellite system (GNSS), frequency modulation (FM), a near field communication (NFC) technology, an infrared (IR) technology, or the like. The wireless communication module 160 may be one or more components integrating at least one communication processor module. The wireless communication module 160 receives an electromagnetic wave through the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signal, and sends a processed signal to the processor 110. The wireless communication module 160 may further receive a to-be-sent signal from the processor 110, perform frequency modulation and amplification on the signal, and convert a processed signal into an electromagnetic wave for radiation through the antenna 2.
In some embodiments, in the electronic device 100, the antenna 1 and the mobile communication module 150 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the electronic device 100 can communicate with a network and another device by using a wireless communication technology. The wireless communication technology may include a global system for mobile communications (GSM), a general packet radio service (GPRS), code-division multiple access (CDMA), wideband CDMA (WCDMA), time-division CDMA (TD-SCDMA), Long-term Evolution (LTE), BT, a GNSS, a WLAN, NFC, FM, an IR technology, and/or the like. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 may implement a display function by using the GPU, the display 194, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is configured to: perform mathematical and geometric computation, and render an image. The processor 110 may include one or more GPUs, which execute program instructions to generate or change display information.
The display 194 is configured to display an image, a video, or the like. The display 194 includes a display panel. The display panel may be a liquid-crystal display (LCD) or a light-emitting diode (LED) such as an organic LED (OLED), an active-matrix OLED (AMOLED), a flexible LED (FLED), a mini-LED, a micro-LED, a micro-OLED, a quantum dot LED (QLED), or the like. In some embodiments, the electronic device 100 may include one or N displays 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a photographing function by using the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is configured to process data fed back by the camera 193. For example, during photographing, a shutter is pressed, and light is emitted to a photosensitive element of the camera through a lens. An optical signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, to convert the electrical signal into a visible image. The ISP may further perform algorithm optimization on noise, brightness, and complexion of the image. The ISP may further optimize parameters such as exposure and a color temperature of a photographing scene. In some embodiments, the ISP may be disposed in the camera 193.
The camera 193 is configured to capture a static image or a video. An optical image of an object is generated through the lens, and is projected onto a photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts an optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert the electrical signal into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format like RGB or YUV. In some embodiments, the electronic device 100 may include one or N cameras 193, where N is a positive integer greater than 1.
The digital signal processor is configured to process a digital signal, and may process another digital signal in addition to the digital image signal. For example, when the electronic device 100 selects a frequency, the digital signal processor is configured to perform Fourier transformation on frequency energy.
The video codec is configured to compress or decompress a digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a plurality of encoding formats, for example, Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor. By drawing on a biological neural network structure, for example, a transmission mode between neurons in a human brain, the NPU rapidly processes input information, and can continuously perform self-learning. Applications such as intelligent cognition of the electronic device 100, for example, image recognition, facial recognition, speech recognition, and text understanding, may be implemented by using the NPU.
The external memory interface 120 may be configured to connect to an external memory card, for example, a micro SD card, to extend a storage capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120, to implement a data storage function. For example, files such as music and videos are stored in the external memory card.
The internal memory 121 may be configured to store computer-executable program code. The executable program code includes instructions. The processor 110 runs the instructions stored in the internal memory 121, to perform various function applications of the electronic device 100 and data processing. The internal memory 121 may include a program storage region and a data storage region. The program storage region may store an operating system, an application required by at least one function (for example, a voice playing function or an image playing function), and the like. The data storage region may store data (such as audio data and an address book) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, or may include a non-volatile memory, for example, at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS).
The electronic device 100 may implement an audio function, for example, music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and the like.
The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and is also configured to convert an analog audio input into a digital audio signal. The audio module 170 may be further configured to encode and decode an audio signal. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules in the audio module 170 are disposed in the processor 110.
The speaker 170A, also referred to as a “loudspeaker”, is configured to convert an audio electrical signal into a sound signal. The electronic device 100 may be used to listen to music or answer a call in a hands-free mode by using the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is configured to convert an audio electrical signal into a sound signal. When a call is answered or a speech message is received by using the electronic device 100, the receiver 170B may be put close to a human ear to listen to a voice.
The microphone 170C, also referred to as a "mike" or a "mic", is configured to convert a sound signal into an electrical signal. When making a call or sending a speech message, the user may place the mouth near the microphone 170C to make a sound, to input a sound signal to the microphone 170C. At least one microphone 170C may be disposed in the electronic device 100. In some other embodiments, two microphones 170C may be disposed in the electronic device 100, to collect a sound signal and implement a noise reduction function. In some other embodiments, three, four, or more microphones 170C may alternatively be disposed in the electronic device 100, to collect a sound signal, implement noise reduction, and identify a sound source, so as to implement a directional recording function and the like.
The headset jack 170D is configured to connect to a wired headset. The headset jack 170D may be a USB interface 130, or may be a 3.5 millimeter (mm) Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is configured to sense a pressure signal, and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display 194. There are a plurality of types of pressure sensors 180A, such as a resistive pressure sensor, an inductive pressure sensor, and a capacitive pressure sensor. The capacitive pressure sensor may include at least two parallel plates made of conductive materials. When a force acts on the pressure sensor 180A, capacitance between the electrodes changes, and the electronic device 100 determines pressure intensity based on the change in the capacitance. When a touch operation is performed on the display 194, the electronic device 100 detects intensity of the touch operation by using the pressure sensor 180A. The electronic device 100 may also calculate a touch location based on a detection signal of the pressure sensor 180A. In some embodiments, touch operations that are performed at a same touch location but with different touch operation intensity may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold is performed on a Messages application icon, an instruction for viewing an SMS message is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold is performed on the Messages application icon, an instruction for creating a new SMS message is executed.
The gyroscope sensor 180B may be configured to determine a motion posture of the electronic device 100. In some embodiments, an angular velocity of the electronic device 100 around three axes (axes x, y, and z) may be determined by using the gyroscope sensor 180B. The gyroscope sensor 180B may be configured to implement image stabilization during photographing. For example, when the shutter is pressed, the gyroscope sensor 180B detects an angle at which the electronic device 100 jitters, calculates, based on the angle, a distance for which a lens module needs to compensate, and allows the lens to cancel the jitter of the electronic device 100 through reverse motion, to implement stabilization. The gyroscope sensor 180B may also be used in a navigation scenario and a somatic game scenario.
The barometric pressure sensor 180C is configured to measure barometric pressure. In some embodiments, the electronic device 100 calculates an altitude based on a barometric pressure value measured by the barometric pressure sensor 180C, to assist in positioning and navigation.
The magnetic sensor 180D includes a Hall effect sensor. The electronic device 100 may detect opening and closing of a flip leather case by using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a clamshell phone, the electronic device 100 may detect opening and closing of a flip cover by using the magnetic sensor 180D. Further, a feature like automatic unlocking upon flipping open is set based on a detected opening or closing state of the leather case or the flip cover.
The acceleration sensor 180E may detect accelerations in various directions (usually on three axes) of the electronic device 100. When the electronic device 100 is still, a magnitude and a direction of gravity may be detected. The acceleration sensor 180E may be further configured to identify a posture of the electronic device, and is used in switching between a landscape mode and a portrait mode, a pedometer, or another application.
The distance sensor 180F is configured to measure a distance. The electronic device 100 may measure the distance in an infrared manner or a laser manner. In some embodiments, in a photographing scene, the electronic device 100 may measure a distance by using the distance sensor 180F, to implement quick focusing.
The optical proximity sensor 180G may include a light-emitting diode (LED) and an optical detector, for example, a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light by using the light-emitting diode. The electronic device 100 detects infrared reflected light from a nearby object by using the photodiode. When sufficient reflected light is detected, the electronic device 100 may determine that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 may detect, by using the optical proximity sensor 180G, that the user holds the electronic device 100 close to an ear for a call, to automatically turn off the screen for power saving. The optical proximity sensor 180G may also be used in a leather case mode or a pocket mode to automatically unlock or lock the screen.
The ambient light sensor 180L is configured to sense ambient light brightness. The electronic device 100 may adaptively adjust brightness of the display 194 based on the sensed ambient light brightness. The ambient light sensor 180L may also be configured to automatically adjust white balance during photographing. The ambient light sensor 180L may also cooperate with the optical proximity sensor 180G, to detect whether the electronic device 100 is in a pocket, to avoid an accidental touch.
The fingerprint sensor 180H is configured to collect a fingerprint. The electronic device 100 may use a feature of the collected fingerprint to implement fingerprint-based unlocking, application lock access, fingerprint-based photographing, fingerprint-based call answering, and the like.
The temperature sensor 180J is configured to detect a temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy based on the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 lowers performance of a processor near the temperature sensor 180J, to reduce power consumption for thermal protection. In some other embodiments, when the temperature is less than another threshold, the electronic device 100 heats the battery 142 to prevent the electronic device 100 from being shut down abnormally due to a low temperature. In some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts an output voltage of the battery 142, to avoid abnormal shutdown caused by a low temperature.
The touch sensor 180K is also referred to as a “touch panel”. The touch sensor 180K may be disposed on the display 194, and the touch sensor 180K and the display 194 form a touchscreen, which is also referred to as a “touch screen”. The touch sensor 180K is configured to detect a touch operation performed on or near the touch sensor. The touch sensor may transfer the detected touch operation to the application processor to determine a type of a touch event. A visual output related to the touch operation may be provided on the display 194. In some other embodiments, the touch sensor 180K may also be disposed on a surface of the electronic device 100 at a location different from that of the display 194.
The bone conduction sensor 180M may obtain a vibration signal. In some embodiments, the bone conduction sensor 180M may obtain a vibration signal of a vibration bone of a human vocal-cord part. The bone conduction sensor 180M may also be in contact with a body pulse, to receive a blood pressure beating signal. In some embodiments, the bone conduction sensor 180M may alternatively be disposed in a headset, to form a bone conduction headset. The audio module 170 may obtain a speech signal through parsing based on the vibration signal that is of the vibration bone of the vocal-cord part and that is obtained by the bone conduction sensor 180M, to implement a speech function. The application processor may parse heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, to implement a heart rate detection function.
The button 190 includes a power button, a volume button, and the like. The button 190 may be a mechanical button, or may be a touch button. The electronic device 100 may receive a button input, and generate a button signal input related to a user setting and function control of the electronic device 100.
The motor 191 may generate a vibration prompt. The motor 191 may be configured to provide an incoming call vibration prompt and a touch vibration feedback. For example, touch operations performed on different applications (for example, photographing and audio playing) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects for touch operations performed on different regions of the display 194. Different application scenarios (for example, a time reminder, information receiving, an alarm clock, and a game) may also correspond to different vibration feedback effects. A touch vibration feedback effect may be further customized.
The indicator 192 may be an indicator light, and may be configured to indicate a charging status and a power change, or may be configured to indicate a message, a missed call, a notification, and the like.
The SIM card interface 195 is configured to connect to a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195, to implement contact with or separation from the electronic device 100. The electronic device 100 may support one or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a nano-SIM card, a micro-SIM card, a SIM card, and the like. A plurality of cards may be simultaneously inserted into a same SIM card interface 195. The plurality of cards may be of a same type or different types. The SIM card interface 195 may further be compatible with different types of SIM cards. The SIM card interface 195 may further be compatible with an external memory card. The electronic device 100 interacts with a network through the SIM card, to implement functions such as calling and data communication. In some embodiments, the electronic device 100 uses an eSIM, namely, an embedded SIM card. The eSIM card may be embedded into the electronic device 100, and cannot be separated from the electronic device 100.
A software system of the electronic device 100 may use a layered architecture, an event-driven architecture, a microkernel architecture, a micro service architecture, or a cloud architecture. In this embodiment, the Android system with a layered architecture is used as an example to illustrate a software structure of the electronic device 100.
In the layered architecture of the electronic device 100, software is divided into several layers, and each layer has a clear role and task. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers from top to bottom: an application layer, an application framework layer, an Android runtime and system library, and a kernel layer.
The application layer may include a series of application packages.
As shown in
The application framework layer provides an application programming interface (API) and a programming framework for an application at the application layer. The application framework layer includes some predefined functions.
As shown in
The window manager is configured to manage a window program. The window manager may obtain a size of the display, determine whether there is a status bar, perform screen locking, take a screenshot, or the like.
The content provider is configured to: store and obtain data, and enable the data to be accessed by an application. The data may include a video, an image, an audio, calls that are made and answered, a browsing history and bookmarks, an address book, and the like.
The view system includes visual controls such as a control for displaying a text and a control for displaying an image. The view system may be configured to construct an application. A display interface may include one or more views. For example, a display interface including a notification icon of Messages may include a view for displaying a text and a view for displaying an image.
The phone manager is configured to provide a communication function for the electronic device 100, for example, management of a call status (including answering, declining, or the like).
The resource manager provides various resources such as a localized character string, an icon, an image, a layout file, and a video file for an application.
The notification manager enables an application to display notification information in a status bar, and may be configured to convey a notification message, which may automatically disappear after a short pause without requiring user interaction. For example, the notification manager is configured to notify download completion, give a message notification, and the like. The notification manager may alternatively display a notification in a top status bar of the system in a form of a graph or scroll bar text, for example, a notification of an application running in the background, or display a notification on the screen in a form of a dialog window. For example, text information is prompted in the status bar, an alert sound is played, the electronic device vibrates, or the indicator light blinks.
The route planning module may be configured to: perform abstraction processing on input information input by an application (for example, a map application), to obtain an abstraction processing result; and perform matching on a map based on the abstraction processing result, to obtain an optimal route.
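For example, the route planning module may expose an interface of the following form to an application; all type and method names here are assumptions made for illustration rather than an actual framework API.

```java
/** Illustrative shape of the route planning module at the application
 *  framework layer. */
public interface RoutePlanningModule {

    /** Feature information produced by abstraction processing. */
    record FeatureDescription(double[][] keyPoints, int[][] connections) {}

    /** A planned route expressed as map waypoints. */
    record PlannedRoute(double[][] waypoints) {}

    /** Abstraction processing on the input graphic from an application
     *  (for example, a map application). */
    FeatureDescription abstractGraphic(double[][] inputGraphic);

    /** Matching on the map based on the abstraction result, returning an
     *  optimal route within the given region bounds. */
    PlannedRoute matchOnMap(FeatureDescription features, double[] regionBounds);
}
```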
The Android runtime includes a kernel library and a virtual machine. The Android runtime is responsible for scheduling and management of the Android system.
The kernel library includes two parts: a function that needs to be invoked in Java language and a kernel library of Android.
The application layer and the application framework layer run on the virtual machine. The virtual machine executes Java files of the application layer and the application framework layer as binary files. The virtual machine is configured to implement functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules, for example, a surface manager, a media library, a three-dimensional graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
The surface manager is configured to manage a display subsystem and provide fusion of 2D and 3D layers for a plurality of applications.
The media library supports playback and recording of a plurality of commonly used audio and video formats, static image files, and the like. The media library may support a plurality of audio and video encoding formats, for example, MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphics processing library is configured to implement three-dimensional graphics drawing, image rendering, composition, layer processing, and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
It can be understood that components included in the system framework layer, and the system library and runtime layer that are shown in
S301: Obtain a user input.
For example, a user may indicate, on an interface provided by a map application of a mobile phone, that the mobile phone is to plan a route based on a selection made by the user.
For example, in this embodiment, some available graphics may be preset and presented in a list. For example, the user taps the list selection option 402.
Optionally, the graphic in embodiments of this disclosure may be a character graphic, a geometry, or another simple or complex graphic. This is not limited in this disclosure.
For example, the user may tap any graphic. An example in which the user taps “520” is used for description. As shown in
For example, the user taps the search range setting option 504.
Still as shown in
Still as shown in
For example, the search range preview box 605 may be used to display a map. The map may be obtained by the map application from a cloud, or may be obtained by the map application from a server of another map provider. This is not limited in this disclosure. Optionally, the cloud in embodiments of this disclosure may be the Huawei Cloud, and the cloud may include one or more servers.
For example, the user may slide on the map displayed in the search range preview box 605, and determine a search range center point in a manner like touching and holding (or another operation, which may be set based on an actual requirement, and is not limited in this disclosure). For example, a center point 605a shown in
For example, the user may set the search range in the range setting option 606. For example, the user may tap the search range setting option 606. The map application may display a numeric keyboard (or some optional range lists, which may be set based on an actual requirement, and is not limited in this disclosure) in response to a received user operation. The user may input a required search range, for example, 20 km. The value may be set based on an actual requirement, and this is not limited in this disclosure.
For example, in response to the search range center point (for example, the center point 605a) and the search range (for example, 20 km) that are specified by the user, the map application may display a corresponding search range in the search range preview box 605, namely, a circular region (which may alternatively be in another shape and is not limited in this disclosure) that centers on the search range center point 605a and whose diameter is 20 km.
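As a concrete illustration of this circular search range, the following sketch tests whether a map point falls inside the region; it assumes, per the example above, that the 20 km value is the circle's diameter, and uses the haversine great-circle distance. The function names are assumptions of this sketch.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_search_range(point, center, range_km=20.0):
    """True if `point` lies in the circular region whose *diameter* is
    `range_km`, matching the 20 km example above."""
    return haversine_km(point[0], point[1], center[0], center[1]) <= range_km / 2
```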
For example, the user may tap the confirmation option to determine the search range. The map application determines the search range in response to a received user operation. Optionally, the map application may display the currently set range in the customization option 603 in
For example, the user may alternatively tap the reset option to reselect a search range center point and/or a search range.
As shown in
For example, a default value of a route length may be displayed in the default option 702. In this embodiment, the default value of the route length is optionally within 10 km, that is, a length of the planned route is less than or equal to 10 km.
For example, the user may alternatively tap the customization option 703, to set a route length as required. As shown in
It should be noted that the user interface shown in this embodiment is merely an example. An actual interface may include more or less content. In addition, a location, a size, and display content of each option may be set based on an actual requirement. This is not limited in this disclosure, and details are not described below again.
For example, the map application may determine input information based on the received user operation. The input information may also be understood as a user requirement or a presented effect of a planned route expected by the user. The input information includes but is not limited to a graphic selected by the user (for example, “520” selected in
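For example, the input information could be bundled in a simple structure like the following sketch; all field names are assumptions of this sketch, not terms defined by this disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class InputInfo:
    """Bundle of user inputs passed from the map application to the route
    planning service; field names are illustrative only."""
    graphic: object                          # selected, drawn, or gallery graphic
    search_center: Tuple[float, float]       # search range center point (lat, lon)
    search_range_km: float                   # e.g. 20 km
    route_length_km: Optional[float] = 10.0  # default: route length within 10 km
```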
Still as shown in
For example, the cancel option 802a may be configured to cancel content that is most recently drawn by the user in the graphic drawing box 803.
For example, the gallery option 802b may be configured to invoke a gallery in the mobile phone. For example, if the user taps the gallery option 802b, the map application displays a gallery interface in response to a received user operation. The gallery interface may include at least one picture locally stored in the mobile phone. The user may select one or more pictures in the gallery. In response to a received user selection operation, the map application may display, in the graphic drawing box 803, the picture selected by the user. In other words, the picture in the gallery selected by the user is a user-customized graphic. Optionally, the user may further draw on the picture (the picture selected from the gallery) displayed in the graphic drawing box 803.
For example, the user may draw a graphic in the graphic drawing box 803, and the map application displays a corresponding graphic in response to a received user drawing operation. For example, as described above, the graphic in embodiments of this disclosure may include a geometry, a character graphic, and another graphic.
Optionally, the graphic drawing box 803 may further display some drawing tools, such as a pencil tool, a filling tool, and a straight-line drawing tool. This is not limited in this disclosure.
For example, the user may tap the confirmation option, to confirm the drawn graphic. The map application obtains, in response to a received operation of tapping the confirmation option by the user, the graphic drawn by the user and/or the graphic selected by the user from the gallery.
For example, if the user taps the reset option, the map application clears content in the graphic drawing box 803 in response to a received user operation. The user may redraw a graphic in the graphic drawing box 803.
For example, after the user taps the confirmation option, the map may display, in response to a received user operation, the interface shown in
In a possible implementation, the mobile phone may alternatively receive an instruction sent by another device. The instruction may include input information, and the input information includes but is not limited to a graphic, search range information, and route range information. For example, a user B may perform settings in
S302: The map application sends a route planning request to a route planning service.
For example, after obtaining the input information, the map application may send the route planning request to the route planning service. The route planning request includes but is not limited to the input information, and the route planning request is used to request the route planning service to perform route planning based on the input information.
It should be noted that, in a module interaction process in this embodiment of this disclosure, interaction may alternatively be performed via a memory. For example, the map application may write the input information into the memory, and trigger the route planning service to read the input information from the memory. A module interaction manner described in this embodiment of this disclosure is merely an example, and is not limited in this disclosure.
S303: The route planning service performs abstraction processing on the input information, to obtain an abstraction processing result.
For example, the route planning service obtains the input information in the route planning request, and may perform abstraction processing on the graphic based on the input information, to obtain an abstraction processing result corresponding to the graphic.
Specifically, after obtaining the input information, the route planning service obtains the graphic (referred to as a target graphic below) in the input information. For example, the graphic may be a small fish drawn in
The route planning service may perform image recognition on the target graphic, to classify the target graphic. Optionally, the route planning service may preset some classifications. For example, in this embodiment, graphic classifications include but are not limited to a character graphic, a geometry, and another graphic. For example,
In this embodiment, an abstraction processing process may be understood as a process in which a graphic is abstracted as a simple graphic or a simple line. For example, as shown in (2) in
For another example, as shown in (2) in
For another example, as shown in (2) in
The following briefly describes an abstraction process of each of the “small fish” graphic and the character graphic by using a specific example. It should be noted that in embodiments of this disclosure, only some graphics are used as examples for description. Another similar graphic may be processed in an abstraction method in embodiments of this disclosure. Examples are not described one by one in this disclosure.
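This disclosure does not fix a particular abstraction algorithm. As one plausible building block, the following sketch applies Ramer-Douglas-Peucker polyline simplification to reduce a drawn stroke to a few key points joined by straight segments, which matches the intent of abstracting a graphic into simple lines; it is an assumption of this sketch, not the prescribed method.

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker simplification: abstract a drawn stroke into a
    few key points joined by straight segments. One possible abstraction
    primitive; the disclosure does not mandate this algorithm."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]

    def dist(p):
        # Perpendicular distance from p to the chord between the endpoints.
        x0, y0 = p
        num = abs((y2 - y1) * x0 - (x2 - x1) * y0 + x2 * y1 - y2 * x1)
        den = math.hypot(y2 - y1, x2 - x1) or 1e-12
        return num / den

    idx = max(range(1, len(points) - 1), key=lambda i: dist(points[i]))
    if dist(points[idx]) > epsilon:
        # Keep the farthest point and recurse on both halves.
        return rdp(points[: idx + 1], epsilon)[:-1] + rdp(points[idx:], epsilon)
    return [points[0], points[-1]]
```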
In this embodiment, after obtaining the abstract graphic, the route planning service may obtain feature information based on the abstract graphic.
In this embodiment, the feature information includes but is not limited to connection information (which may also be referred to as cross information), angle information, size information, geometric information, and location information.
For example, the connection information indicates a connection status of different geometries. For example, as shown in (3) in
For example, the angle information indicates an angle in the geometry. For example, as shown in (3) in
For example, the size information indicates a ratio of sizes of a plurality of geometries. For example, as shown in (3) in
For example, the geometric information may also be referred to as graphic information, and indicates a geometry included in the abstract graphic. For example, as shown in (3) in
For example, the location information indicates a location relationship between scatters, lines, and/or the like in the abstract graphic. Optionally, the location relationship includes but is not limited to information such as a direction and location and/or a distance. For example, as shown in (2) in
It should be noted that the feature information described in embodiments of this disclosure is merely an example. In another embodiment, the feature information may further include other information or any combination of information that is intended to describe an abstract graphic.
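For illustration, the feature information categories above could be carried in a structure such as the following sketch; the field names, types, and example values (for instance, the 120° angle and the 2:1 ratio used in later examples) are assumptions of this sketch.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class FeatureInfo:
    """Feature information extracted from an abstract graphic. The field set
    mirrors the categories above; names and types are illustrative."""
    connections: List[Tuple[str, str]] = field(default_factory=list)   # e.g. ("O", "cross")
    angles: Dict[str, float] = field(default_factory=dict)             # e.g. {"COD": 120.0}
    size_ratios: Dict[Tuple[str, str], float] = field(default_factory=dict)
    # e.g. {("diamond", "triangle"): 2.0} for a 2:1 size ratio
    geometries: List[str] = field(default_factory=list)                # e.g. ["diamond", "triangle"]
    locations: Dict[str, Tuple[float, float]] = field(default_factory=dict)  # point -> (x, y)
```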
In this embodiment of this disclosure, for the abstract graphic (for example, shown in (3) in
It may be understood that the feature information may describe information indicating that the abstract graphic of the “small fish” graphic includes the diamond ABOE and the triangle COD, where the connection point between the diamond ABOE and the triangle COD is the point O, and the point O is a “cross” crosspoint. In addition, the feature information further describes values of angles of the diamond ABOE and the triangle COD and angles at the connection point.
It should be noted that specific content and values of the information in embodiments of this disclosure are merely examples, and are not limited in this disclosure.
For example, the route planning service may obtain an abstraction processing result corresponding to the “small fish” graphic (the target graphic) shown in (1) in
For example, feature information corresponding to the abstract graphic may be obtained based on route information. The feature information may include location information and connection information between the scatters.
In a possible implementation, for a graphic (for example, as shown in (1) in
For example, location information corresponding to the abstract graphic shown in (2) in
The connection information may include but is not limited to that the scatter 1101 is connected to the scatter 1105, and the scatter 1106 is connected to the scatter 1105.
Optionally, the connection information may further include that the scatter 1101 is connected to the scatter 1102, and the scatter 1103 is connected to the scatter 1104. In other words, the connection information may further indicate a connection relationship between abstract graphics of different characters.
In this embodiment, the feature information may also be understood as a route filtering condition. In a subsequent route planning process, the route planning service may perform matching based on the feature information, to obtain a route that meets the condition.
In a possible implementation, the route planning service may divide the feature information into coarse matching information and exact matching information. The coarse matching information may also be referred to as a coarse matching condition or coarse matching condition information, and may be used for coarse matching in a route matching process, to obtain a route that meets the coarse matching information. The exact matching information may also be referred to as an exact matching condition or exact matching condition information, and may be used to further perform exact matching on routes that are successfully matched in coarse matching, to select a route with a higher similarity. A specific implementation is described in detail below.
Optionally, the coarse matching information may include but is not limited to at least one of the following: the geometric information (for a concept, refer to the foregoing descriptions), the connection information, and the location information. Optionally, the exact matching information may include but is not limited to at least one of the following: the angle information, and the size information. It should be noted that a division manner (included content) of the coarse matching information and the exact matching information in embodiments of this disclosure is merely an example, and is not limited in this disclosure.
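Continuing the FeatureInfo sketch above, the division into coarse and exact matching conditions could look like the following; the grouping shown is the example division described above, not a fixed rule, and the function name is an assumption.

```python
def split_matching_conditions(features: "FeatureInfo"):
    """Divide feature information into coarse and exact matching conditions,
    following the example division described above."""
    coarse = {
        "geometries": features.geometries,    # geometric information
        "connections": features.connections,  # connection (cross) information
        "locations": features.locations,      # location information
    }
    exact = {
        "angles": features.angles,            # angle information
        "size_ratios": features.size_ratios,  # size information
    }
    return coarse, exact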
S304: The route planning service performs route matching based on the abstraction processing result.
For example, as described above, after the route planning service performs abstraction processing on the target graphic, the abstraction processing result may be obtained. The abstraction processing result may include the feature information, or may include the geometric information, or the feature information includes the geometric information. For specific descriptions, refer to the foregoing descriptions. Details are not described herein again.
For example, as described above, the route planning service may divide the feature information, to obtain the coarse matching information and the exact matching information, which may also be referred to as the coarse matching condition and the exact matching condition. This is not limited in this disclosure.
In this embodiment, the route planning service may obtain map information from the cloud (for example, the Huawei Cloud) or a server of a map provider. The Huawei Cloud is used as an example. The route planning service may send request information to the cloud, to request the cloud to feed back the map information. Optionally, the request information may include the search range information described above, and is used to request a part of a region on the map, namely, a map of a range indicated by the search range information. In response to the received request information, the cloud feeds back the map to the electronic device (for example, the mobile phone) on which the route planning service is performed. Optionally, the information fed back by the cloud may be an entire map, or may be a part of the map (a map of the target region, where the target region is a region indicated by the search range). This is not limited in this disclosure. Optionally, if the cloud feeds back the entire map (for example, a national map or a provincial map, which may be set based on an actual requirement, and is not limited in this disclosure), the route planning service may perform route matching on the target region (the region indicated by the search range) on the map based on the feature information.
For example, after obtaining the map, the route planning service may perform matching on the target region on the map based on the coarse matching information, to obtain a route that meets the condition indicated by the coarse matching information.
In a possible implementation, the route planning service may segment the map of the target region, to obtain a plurality of regions obtained through segmentation.
For example, the route planning service prioritizes the regions obtained through segmentation based on at least one of the coarse matching conditions. A region obtained through segmentation that includes a larger quantity of routes meeting the condition has a higher priority. For example, the abstract graphic corresponding to the “small fish” graphic shown in (3) in
Optionally, for a region obtained through segmentation (for example, the region a_m obtained through segmentation) that does not include a “cross” cross route, a priority of the region may be set to 0, that is, in a subsequent matching process, matching may not be performed on the region that is obtained through segmentation and whose priority is 0, to reduce a total quantity of retrieval times (which may also be understood as matching times) of route matching.
For example, the route planning service may perform matching, one by one based on other information (or referred to as conditions) in the coarse matching information and in a priority sequence, on the regions obtained through segmentation, to obtain a successfully matched route.
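The prioritization and the skipping of priority-0 sub-regions described above could be sketched as follows; `count_cross_routes` is a hypothetical callback that counts "cross" crosspoints in a sub-region, and the function name is an assumption of this sketch.

```python
def prioritize_subregions(subregions, count_cross_routes):
    """Order sub-regions for matching: more 'cross' crosspoints means a higher
    priority; sub-regions with none get priority 0 and are skipped entirely,
    reducing the total number of retrieval (matching) operations."""
    scored = [(count_cross_routes(region), region) for region in subregions]
    # Highest score first; drop priority-0 sub-regions before matching.
    return [region for score, region in sorted(scored, key=lambda s: -s[0]) if score > 0]
```

Matching then proceeds on the returned sub-regions one by one, in this priority sequence.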
As shown in
As shown in
The route planning service continues to perform matching, based on the geometry indicated by the geometric information, on the region obtained through segmentation, to obtain a route that meets the geometric information. As shown in
For example, the route planning service continues to perform detection, to determine whether routes connected to O2 and O6 include a route whose shape is a diamond. For example, as shown in
It should be noted that, in the foregoing embodiment, that the route planning service first performs matching, to obtain the cross route and then sequentially performs matching, to obtain a triangle route and a diamond route is merely used as an example for description. In another embodiment, the route planning service may perform matching in another sequence, or may perform parallel matching based on a plurality of conditions. This is not limited in this disclosure.
For example, the route planning service performs matching based on the coarse matching information, and obtained routes include but are not limited to: O2-A1-B1-E2-O2-C2-D2, O2-A1-B1-E2-O2-C1-D1, and O6-D2-O2-E1-O6-C2-D3.
In a possible implementation, the route planning service may not segment the map. In other words, the route planning service may perform route matching on the target region on the map based on the feature information. A matching manner is the same as that (described in
S305: The route planning service performs filtering, according to a preset strategy, on the routes obtained through matching, to obtain an optimal route.
For example, the route planning service may perform further matching (or filtering), based on the feature information, on the routes that are successfully matched in S304, to obtain the optimal route. Optionally, the optimal route may include one or more routes.
For example,
Optionally, the preset strategy may include but is not limited to the exact matching information and the route length information (the route length range set by the user in the foregoing embodiments). The route length information requires that a length of the route obtained through matching be less than or equal to the route length indicated by the route length information.
Optionally, the exact matching information includes but is not limited to the angle information and the size information. For example, the angle information may include the values of angles in the abstract graphic, and the size information indicates the size ratio of the diamond to the triangle.
For example, the route planning service may set a corresponding matching range for information in the exact matching information. For example, the angle information indicates that the angle COD is 120°. For example, the route planning service may set an angle range to ±30° (which may be set based on an actual requirement, and is not limited in this disclosure). In other words, a range of the angle COD indicated by the exact matching information is [90°, 150°]. A range of another angle is similar to this. Details are not described herein again.
For example, the route planning service may further set a range for the size information in the exact matching information. For example, original size information indicates that the ratio is 2:1, and the route planning service may expand this condition to a ratio of 0.5 (1:2) to 3 (3:1). The condition may be set based on an actual requirement. This is not limited in this disclosure.
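The widened exact-matching conditions above reduce to simple range checks, sketched here with the ±30° angle tolerance and the [1:2, 3:1] ratio window from the examples; both tolerances are configurable examples, not fixed values, and the function names are assumptions.

```python
def within_angle_range(measured, target, tol=30.0):
    """Exact-matching angle test: a 120° target with a ±30° tolerance gives
    the [90°, 150°] range used in the example above."""
    return target - tol <= measured <= target + tol

def within_ratio_range(measured, lo=0.5, hi=3.0):
    """Exact-matching size test: a nominal 2:1 ratio widened to [1:2, 3:1]."""
    return lo <= measured <= hi

# e.g. within_angle_range(135.0, 120.0) -> True; within_ratio_range(0.4) -> False
```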
Still as shown in
In a possible implementation, when the route planning service obtains a plurality of optimal routes by performing matching based on the exact matching information, the route planning service may further obtain, through matching, a route whose conditions are closest to conditions defined by the exact matching information, namely, the optimal route. For example, in the route O2-A1-B1-E2-O2-C2-D2, a size ratio of a diamond O2-A1-B1-E2 to a triangle O2-C2-D2 is 2:1. In the route O6-D2-O2-E1-O6-C2-D3, a size ratio of a diamond O6-D2-O2-E1 to a triangle O6-C2-D3 is 1:2. In this case, the size ratio in the route O2-A1-B1-E2-O2-C2-D2 is closer to the size ratio (2:1) indicated by the size information in the exact matching information. The route O2-A1-B1-E2-O2-C2-D2 is the optimal route.
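A minimal sketch of this closest-ratio tie-break, using the two candidate routes and their measured ratios from the example above (the route names come from the example; the function name is an assumption):

```python
def pick_closest_ratio(routes, target_ratio=2.0):
    """Among candidate routes, pick the one whose diamond:triangle size ratio
    is closest to the ratio in the exact matching information (2:1 above).
    `routes` maps a route name to its measured ratio."""
    return min(routes, key=lambda name: abs(routes[name] - target_ratio))

candidates = {"O2-A1-B1-E2-O2-C2-D2": 2.0, "O6-D2-O2-E1-O6-C2-D3": 0.5}
print(pick_closest_ratio(candidates))  # -> "O2-A1-B1-E2-O2-C2-D2"
```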
In another possible implementation, when the route planning service obtains a plurality of optimal routes by performing matching based on the exact matching information, the route planning service may send the plurality of optimal routes to the map application. The map application may display the plurality of optimal routes, and the user may select one of the plurality of optimal routes as the optimal route. The map application determines the optimal route in response to a received user selection operation.
In still another possible implementation, when the route planning service obtains a plurality of optimal routes by performing matching based on the exact matching information, the route planning service may alternatively select a route whose semantics is closest to semantics of the target graphic as the optimal route. For example, the route planning service may perform semantic recognition on the target graphic, to obtain a semantic recognition result of the graphic, and the route planning service may perform semantic recognition on a graphic enclosed by each optimal route, to obtain a semantic recognition result of the route. A specific manner of semantic recognition may be any feasible manner in the conventional technology. This is not limited in this disclosure. The route planning service may separately match the plurality of obtained semantic recognition results of the routes with the semantic recognition result of the graphic, to obtain a route with closest semantics, for example, a route whose semantic similarity is greater than or equal to a preset threshold.
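If semantic recognition yields embedding vectors for the target graphic and for each candidate route's enclosed graphic, the closest-semantics selection could be sketched with cosine similarity as follows; how the embeddings are produced is left open by this disclosure, so `graphic_vec` and `route_vecs` are assumed inputs of this sketch.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-12
    nv = math.sqrt(sum(b * b for b in v)) or 1e-12
    return dot / (nu * nv)

def closest_semantics(graphic_vec, route_vecs):
    """Pick the route whose semantic embedding is closest to that of the
    target graphic. `route_vecs` maps a route name to its embedding."""
    return max(route_vecs, key=lambda name: cosine_similarity(graphic_vec, route_vecs[name]))
```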
It may be understood that, in this embodiment, a similarity between the abstract graphic and the graphic enclosed by the optimal route determined by the route planning service is greater than or equal to the preset similarity threshold. In other words, the exact matching process may be understood as a similarity evaluation process. The route planning service may further obtain a route with a highest shape similarity to the abstract graphic by performing matching based on the condition indicated by the exact matching information.
For example, evaluation conditions of the similarity may be various. In an example, the similarity may be evaluated based on a geometric similarity. It may also be understood that a higher geometric similarity between a geometry enclosed by the route and the abstract graphic (which may also be understood as a shape of the target graphic) indicates a higher matching degree. For example, a difference between a size ratio of the geometries in the route and the size ratio indicated by the size information in the feature information is less than or equal to a threshold, or it may be understood that a size similarity is greater than or equal to the threshold. A smaller difference indicates a higher size similarity (geometric similarity) corresponding to the route. Optionally, the geometric similarity may be evaluated based on an angle, a location, or the like. For example, a difference between an angle in a shape enclosed by the route and an angle indicated in the angle information is less than or equal to a threshold, that is, an angle similarity is greater than or equal to the threshold. A smaller difference indicates a higher angle similarity (geometric similarity) corresponding to the route. In another example, the similarity may alternatively be evaluated based on a semantics similarity. In other words, as described above, a closer semantic recognition result indicates a higher similarity. It should be noted that the thresholds described in this disclosure may be set based on an actual requirement. This is not limited in this disclosure.
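One way to realize such a combined similarity evaluation is a weighted score over angle and size agreement, sketched below; the weights and the normalization are assumptions of this sketch, not values given in this disclosure.

```python
def geometric_similarity(route_angles, target_angles, route_ratio, target_ratio,
                         w_angle=0.5, w_size=0.5):
    """Toy similarity score in [0, 1] combining angle and size agreement, as
    one possible realization of the evaluation described above."""
    # Angle agreement: smaller total angular error -> higher similarity.
    angle_err = sum(abs(route_angles[k] - target_angles[k]) for k in target_angles)
    angle_sim = max(0.0, 1.0 - angle_err / (180.0 * max(len(target_angles), 1)))
    # Size agreement: ratio of the smaller measured ratio to the larger one.
    size_sim = min(route_ratio, target_ratio) / max(route_ratio, target_ratio)
    return w_angle * angle_sim + w_size * size_sim
```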
For example, the route planning service may perform further matching on the optimal route based on the route length information, to detect whether a route length of the optimal route is less than or equal to the route length indicated by the route length information. Optionally, the route planning service may first perform route length matching on a route obtained in the coarse matching process, and then perform exact matching. This is not limited in this disclosure.
In a possible implementation, when the route planning service obtains a plurality of optimal routes by performing matching based on the exact matching information, the route planning service may select a route with a shortest route length as the optimal route.
In another possible implementation, the user may alternatively select a specified route or location in a phase of setting the target graphic. Correspondingly, when the route planning service obtains a plurality of optimal routes by performing matching based on the exact matching information, a route passing through the specified route or location may be used as the optimal route. Certainly, when the user selects the specified route or location, the route planning service may alternatively use this condition as a coarse matching condition in a coarse matching phase. This is not limited in this disclosure.
It should be noted that the coarse matching condition and the exact matching condition in embodiments of this disclosure are merely examples. During actual application, the user may further customize more route matching conditions. For example, the user may set to match a route passing through a largest quantity of shade places or a largest quantity of schools. This is not limited in this disclosure.
In still another possible implementation, the route planning service may alternatively synchronously perform coarse matching and exact matching. For example, after obtaining the cross route through matching, the route planning service may detect angles of the cross route based on the angle information. If the angle is within the angle range indicated by the angle information, the cross route is the successfully matched cross route.
S306: The route planning service sends the optimal route to the map application.
For example, after determining the optimal route, the route planning service may send the optimal route to the map application. Optionally, the optimal route sent by the route planning service includes but is not limited to a start point and an end point of the route, and the route itself.
S307: The map application displays the optimal route.
For example, the map application may determine the start point, the end point, and the route in response to the obtained optimal route. The map application may display the optimal route in the target region on the map.
The map application may display the optimal route in any manner in
For example, the user may tap the start navigation option, to indicate the map application to perform navigation based on the optimal route and a location of the user. The map application starts navigation in response to a received user operation.
In a possible implementation, if the route planning service fails to obtain, through matching, a route that meets the feature information and the route length information, the route planning service may send matching failure information to the map application, to indicate that no optimal route is obtained. The map application may display matching failure prompt information, to prompt the user to reset the input information, which includes resetting the target graphic, the search range, the route length, and/or the like.
In another possible implementation,
In still another possible implementation, for a list selection manner shown in
In another possible implementation, for some simple graphics, results obtained after the route planning service performs abstraction processing on the simple graphics may be the same as the original graphics. For this type of graphic, the route planning service may skip the abstraction processing process, and directly obtain corresponding feature information based on the original graphics (the target graphics). For example,
It may be understood that, to implement the foregoing functions, the electronic device includes corresponding hardware and/or software modules for performing the function. In combination with example algorithm steps described in embodiments disclosed in this disclosure, this application may be implemented by hardware or a combination of hardware and computer software. Whether a function is performed by hardware or hardware driven by computer software depends on particular applications and design constraints of the technical solutions. A person skilled in the art may use different methods to implement the described functions for each particular application with reference to embodiments, but it should not be considered that the implementation goes beyond the scope of this disclosure.
In a possible implementation, the processing module 2202 is specifically configured to: obtain feature information of the target graphic in response to the received first operation, where the feature information describes a graphic feature of the target graphic; and determine the target route based on the feature information of the target graphic and the target region on the map.
In a possible implementation, the processing module 2202 is specifically configured to: perform abstraction processing on the target graphic, to obtain a target abstract graphic of the target graphic, where the target abstract graphic is used to describe a key point in the target graphic or a connection relationship between key points in the target abstract graphic; and obtain the feature information based on the target abstract graphic.
In a possible implementation, the feature information includes at least one of the following: connection feature information, geometric feature information, angle feature information, size feature information, and location feature information.
In a possible implementation, the processing module 2202 is specifically configured to: perform matching on the target region on the map based on the feature information; and determine a successfully matched route as the target route.
In a possible implementation, the processing module 2202 is specifically configured to: perform matching on the target region on the map based on first feature information, to obtain at least one successfully matched first target subroute; and perform matching on a surrounding region of the at least one first target subroute based on second feature information, to obtain a successfully matched second target subroute, where the second target subroute is obtained by performing matching on a surrounding region of a third target subroute of the at least one first target subroute, the first feature information and the second feature information are partial information of the feature information, and the second target subroute and the third target subroute form the target route.
In a possible implementation, the processing module 2202 is specifically configured to: if a plurality of second target subroutes are obtained through matching, receive a second operation, where the second operation is used to specify the target route from a route including the plurality of second target subroutes; and determine the target route in response to the received second operation.
In a possible implementation, the processing module 2202 is specifically configured to: determine a plurality of candidate routes based on the target graphic and the target region on the map; and determine the target route from the plurality of candidate routes based on a preset condition, where the preset condition includes at least one of the following: the shape of the target route has the highest similarity to the target abstract graphic, a length of the target route is the shortest, and the target route passes through a specified region; and the specified region is preset.
In a possible implementation, the receiving module 2201 is configured to receive the first operation performed on a first interface. The display module is configured to display the target route on a map on a second interface, where the second interface is different from the first interface.
In a possible implementation, the first operation indicates to draw the target graphic on the first interface.
In a possible implementation, the first interface is a gallery interface, and the gallery interface includes at least one locally stored graphic; and the receiving module is configured to determine the target graphic in response to the received first operation performed on the target graphic, where the target graphic is a graphic in the at least one graphic.
In a possible implementation, the first interface is a candidate graphic interface, and the candidate graphic interface includes at least one candidate graphic; and the receiving module is configured to determine the target graphic in response to the received first operation performed on a target candidate graphic, where the target candidate graphic is a graphic in the at least one candidate graphic.
In a possible implementation, the processing module 2202 is configured to: segment the target region, to obtain N target sub-regions, where N is a positive integer greater than 1; prioritize the N target sub-regions based on target feature information, where a sub-region that includes more routes corresponding to the target feature information has a higher priority, and the target feature information is partial information of the feature information; and perform matching on the N target sub-regions one by one based on the feature information and a priority sequence of the N target sub-regions.
In a possible implementation, the display module 2203 is configured to display, on the map on the second interface, a key point on the target route.
In a possible implementation, the display module 2203 is configured to display an entire route of the target route on the map on the second interface.
In a possible implementation, the processing module 2202 is configured to: obtain a current location of the electronic device; and determine the target region based on the current location of the electronic device and a target region range, where the target region range is a preset value, or the target region range is set by using a received region range setting operation.
In a possible implementation, the target region is set by using a latest received user operation.
In an example,
All components of the apparatus 2300 are coupled together through a bus 2304. In addition to a data bus, the bus 2304 further includes a power bus, a control bus, and a status signal bus. However, for clear description, various types of buses in the figure are marked as the bus 2304.
Optionally, the memory 2303 may be configured to store instructions in the foregoing method embodiments. The processor 2301 may be configured to: execute the instructions in the memory 2303, control a receive pin to receive a signal, and control a transmit pin to send a signal.
The apparatus 2300 may be the electronic device or a chip of the electronic device in the foregoing method embodiments.
All related content of the steps in the foregoing method embodiments may be cited in function descriptions of the corresponding functional modules. Details are not described herein again.
This embodiment further provides a computer storage medium. The computer storage medium stores computer instructions, and when the computer instructions are run on an electronic device, the electronic device is enabled to perform the foregoing related method steps to implement the method in the foregoing embodiments.
This embodiment further provides a computer program product. When the computer program product runs on a computer, the computer is enabled to perform the related steps, to implement the method in the foregoing embodiments.
In addition, an embodiment of this disclosure further provides an apparatus. The apparatus may be specifically a chip, a component, or a module. The apparatus may include a processor and a memory that are connected. The memory is configured to store computer-executable instructions. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, to enable the chip to perform the methods in the foregoing method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in embodiments is configured to perform the corresponding method provided above. Therefore, for beneficial effects that can be achieved, refer to the beneficial effects of the corresponding method provided above. Details are not described herein again.
Based on the foregoing descriptions of the implementations, a person skilled in the art may understand that for the purpose of convenient and brief description, division into the foregoing functional modules is merely used as an example for illustration. During actual application, the foregoing functions can be allocated to different functional modules for implementation as required, that is, an inner structure of an apparatus is divided into different functional modules to implement all or a part of the functions described above.
In the several embodiments provided in this disclosure, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the described apparatus embodiment is merely an example. For example, the module or division into the units is merely logical function division and may be other division in an actual implementation. For example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be ignored or not performed. In addition, the displayed or discussed mutual couplings or direct couplings or communication connections may be implemented through some interfaces. The indirect couplings or communication connections between the apparatuses or units may be implemented in electronic, mechanical, or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one or more physical units, may be located in one place, or may be distributed in different places. Some or all of the units may be selected based on actual requirements to achieve the objectives of the solutions of embodiments.
In addition, functional units in embodiments of this disclosure may be integrated into one processing unit, each of the units may exist independently physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in a form of hardware, or may be implemented in a form of software functional unit.
Any content in embodiments of this disclosure and any content in a same embodiment can be freely combined. Any combination of the foregoing content falls within the scope of this disclosure.
When the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, the integrated unit may be stored in a readable storage medium. Based on such an understanding, the technical solutions in embodiments may be implemented in the form of a software product. The software product is stored in a storage medium and includes several instructions for instructing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or some of the steps of the methods described in embodiments of this disclosure. The foregoing storage medium includes any medium that can store program code, for example, a USB flash drive, a removable hard disk, a read-only memory (ROM), a random-access memory (RAM), a magnetic disk, or an optical disc.
Methods or algorithm steps described in combination with the content disclosed in embodiments of this disclosure may be implemented by hardware, or may be implemented by a processor by executing a software instruction. The software instruction may include a corresponding software module. The software module may be stored in a RAM, a flash memory, a ROM, an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a register, a hard disk, a removable hard disk, a compact disc ROM (CD-ROM), or any other form of storage medium well-known in the art. For example, a storage medium is coupled to a processor, so that the processor can read information from the storage medium and write information into the storage medium. Certainly, the storage medium may be a component of the processor. The processor and the storage medium may be disposed in an ASIC.
A person skilled in the art should be aware that in the foregoing one or more examples, functions described in embodiments of this disclosure may be implemented by hardware, software, firmware, or any combination thereof. When the functions are implemented by software, the foregoing functions may be stored in a computer-readable medium or transmitted as one or more instructions or code in a computer-readable medium. The computer-readable medium includes a computer storage medium and a communication medium, where the communication medium includes any medium that facilitates transmission of a computer program from one place to another. The storage medium may be any available medium accessible to a general-purpose or a dedicated computer.
The term “and/or” in this specification describes only an association relationship for describing associated objects and represents that three relationships may exist. For example, A and/or B may represent the following three cases: Only A exists, both A and B exist, and only B exists.
In the specification and claims in embodiments of this disclosure, the terms “first”, “second”, and so on are intended to distinguish between different objects but do not indicate a particular sequence of the objects. For example, a first target object, a second target object, and the like are used for distinguishing between different target objects, but are not used for describing a specific sequence of the target objects.
In addition, in embodiments of this disclosure, the word “example” or “for example” is used to represent giving an example, an illustration, or a description. Any embodiment or design scheme described as an “example” or “for example” in embodiments of this disclosure should not be explained as being more preferred or having more advantages than another embodiment or design scheme. Exactly, use of the word “example”, “for example”, or the like is intended to present a related concept in a specific manner.
In the descriptions of embodiments of this disclosure, unless otherwise stated, “a plurality of” means two or more than two. For example, a plurality of processing units are two or more processing units, and a plurality of systems are two or more systems.
The foregoing describes embodiments of this disclosure with reference to the accompanying drawings. However, this disclosure is not limited to the foregoing specific implementations. The foregoing specific implementations are merely examples, but are not limitative. Inspired by this disclosure, a person of ordinary skill in the art may further make modifications without departing from the purposes of this disclosure and the protection scope of the claims, and all the modifications shall fall within the protection of this disclosure.
Foreign application priority data: Chinese Patent App. No. 202211020211.2, filed Aug. 24, 2022 (CN, national).
This is a continuation of Int'l Patent App. No. PCT/CN2023/103263, filed on Jun. 28, 2023, which claims priority to Chinese Patent App. No. 202211020211.2, filed on Aug. 24, 2022, both of which are incorporated by reference.
Related application data: parent application PCT/CN2023/103263, filed Jun. 28, 2023 (WO); child application 19024589 (US).