This disclosure relates to vehicular electronic devices, and more particularly, to control of vehicular functions for autonomous driving by an electronic device and a server.
In recent times, much attention has been centered on autonomous vehicles and their operation. Autonomous vehicles, such as self-driving cars, are capable of navigating roadways using sensor inputs, electronic control of vehicle functions, and sophisticated processing systems. Accordingly, autonomous operation typically requires a significant amount of data processing, which may be performed by computing devices installed in the vehicle and/or by external servers.
When a server is utilized to process data for autonomous driving, and the communication link between the vehicle and the server is unstable, the server may not be able to receive and process the data properly. Further, there may be delays in transmitting crucial instructions to the vehicle.
Thus, a need exists to ensure a stable and robust communication connection between a vehicle and a server for vehicular control, communication, data transmission/reception, and/or data processing.
The present disclosure provides increased safety and security in vehicular control for an autonomous driving system involving a communication connection between the vehicle and an external server.
Further, the present disclosure improves the vehicle's ability to respond rapidly and automatically to changing contextual and environmental conditions, even when the communication connection with the server is disrupted or diminished.
The present disclosure improves provision of stable autonomous driving according to a state of communication between the vehicle and the server, which may help the driver adapt to changing and unexpected situations that may occur during autonomous operation of the vehicle.
According to an embodiment of the disclosure, an electronic apparatus is disclosed. The electronic apparatus is installed in a vehicle to control a function of the vehicle and includes: a memory storing at least one instruction, communication circuitry that is communicably connected with an external server via a communication connection, and a processor operably coupled to the memory, wherein the at least one instruction is executable by the processor to cause the electronic apparatus to: identify a state of the communication connection with the external server, based on the identified state, determine a target size for data to be transmitted to the external server, control at least one sensor of the vehicle to collect the data such that a total size of the collected data is less than or equal to the determined target size, transmit the data collected by the at least one sensor to the external server, receive a control command from the external server based on the transmitted data, and control a function of the vehicle according to the received control command.
According to an embodiment of the disclosure, a method is disclosed. The method is for an electronic apparatus installed in a vehicle, and includes: identifying a state of a communication connection with an external server, based on the identified state, determining a target size for data to be transmitted to the external server, controlling at least one sensor of the vehicle to collect the data such that a total size of the collected data is less than or equal to the determined target size, transmitting the data collected by the at least one sensor to the external server, receiving a control command from the external server based on the transmitted data, and controlling a function of the vehicle according to the received control command.
According to an embodiment of the disclosure, an electronic apparatus is disclosed. The electronic apparatus includes: a memory storing at least one instruction, and a processor operably coupled to the memory, wherein the at least one instruction is executable by the processor to cause the electronic apparatus to: identify a state of a communication connection with an external server, based on the identified state, determine a target size for data to be transmitted to the external server, select, from previously obtained data, a sub-portion of the previously obtained data having a total size less than or equal to the determined target size for transmission to the external server, transmit the selected sub-portion to the external server, receive a control command from the external server based on the transmitted data, and control a function of the vehicle according to the received control command.
In certain embodiments of the disclosure, an autonomous driving system is disclosed which enables safe control of a vehicle, in a context in which the vehicle and an external server are communicably linked to each other.
Also, in certain embodiments of the disclosure, an autonomous vehicle may react with rapidity to unexpected and changing external factors, even when communication with the server is diminished or disrupted.
Also, in certain embodiments of the disclosure, autonomous driving may be more stably implemented, according to the state of communication between a vehicle and a server, improving the ability of the driver to adapt to situations that may occur during autonomous operations.
However, effects that may be achieved by an electronic apparatus for controlling a function of a vehicle and a method performed by the electronic apparatus according to an embodiment of the disclosure are not limited to those mentioned above, and other effects not mentioned above may be clearly understood by those of ordinary skill in the art from the following description.
Brief descriptions of the respective drawings are provided to facilitate a sufficient understanding of the drawings referenced in the present specification.
According to an embodiment of the disclosure, an electronic apparatus is disclosed. The apparatus may be installed in a vehicle to control a function of the vehicle. The apparatus may include: a memory storing at least one instruction; and a processor configured to execute the at least one instruction, causing the apparatus to: identify a state of communication with a server, determine a target size corresponding to data to be transmitted to the server, based on the identified communication state, control at least one sensor installed in the vehicle such that the at least one sensor collects data with a total size less than or equal to the determined target size, transmit the data collected by the at least one sensor to the server, and control a function of the vehicle according to a control command received from the server, in response to transmission of the data.
The processor may be further configured to turn off some of the at least one sensor such that data with a total size less than or equal to the determined target size may be collected.
The processor may be further configured to select a sensor to be turned off from among the at least one sensor, based on at least one of a movement direction of the vehicle or a position of the vehicle.
The processor may be further configured to reduce a sensing capability of the at least one sensor such that data with a total size less than or equal to the determined target size may be collected.
The processor may be further configured to determine the target size such that an expected time from a time point of starting collection of the data to a time point of execution of the control command may be less than or equal to a threshold time.
The processor may be further configured to determine an analysis accuracy for allowing the expected time to be less than or equal to the threshold time and transmit the determined analysis accuracy to the server, in which the control command may be generated by analyzing the data with the determined analysis accuracy.
The processor may be further configured to select the threshold time from among a plurality of candidate threshold times according to at least one of a speed of the vehicle or a position of the vehicle.
The processor may be further configured to output a stability level of autonomous driving corresponding to the determined target size and the threshold time through an in-vehicle output device.
The processor may be further configured to output, through an output device, predicted stability levels of autonomous driving in sections within a route between a departure position and an arrival position of the vehicle.
The processor may be further configured to select a recommended route from among routes between a departure position and an arrival position of the vehicle by considering averages of predicted stability levels of the routes.
The processor may be further configured to determine a target size corresponding to data to be transmitted to the server, according to a stability level of autonomous driving corresponding to a method of communication with the server.
The processor may be further configured to select some data with a total size less than or equal to the determined target size, when there is data pre-obtained from the at least one sensor, and transmit the selected some data to the server.
According to another embodiment of the disclosure, a method performed by an electronic apparatus to be installed in a vehicle includes: identifying a state of communication with a server; determining a target size corresponding to data to be transmitted to the server according to the identified communication state; controlling at least one sensor installed in the vehicle such that the at least one sensor collects data with a total size less than or equal to the determined target size; transmitting the data collected by the at least one sensor to the server; and controlling a function of the vehicle according to a control command received from the server, in response to the transmission of the data.
The method may further include turning off some of the at least one sensor such that data with a total size less than or equal to the determined target size may be collected.
The method may further include reducing a sensing capability of the at least one sensor such that data with a total size less than or equal to the determined target size may be collected.
The determining of the target size may include determining the target size such that an expected time from a time point of starting collection of the data to a time point of execution of the control command may be less than or equal to a threshold time.
The determining of the target size may include selecting the threshold time from among a plurality of candidate threshold times according to at least one of a speed of the vehicle or a position of the vehicle.
The method may further include outputting a stability level of autonomous driving corresponding to the determined target size and the threshold time, through an in-vehicle output device.
The determining of the target size may include determining a target size corresponding to data to be transmitted to the server, according to a stability level of autonomous driving corresponding to a method of communication with the server.
According to another embodiment of the disclosure, an electronic apparatus to be installed in a vehicle to control a function of the vehicle includes: a memory storing at least one instruction; and a processor configured to operate according to the at least one instruction, wherein the processor is further configured to identify a state of communication with a server, determine a target size corresponding to data to be transmitted to the server, based on the identified communication state, select some data with a total size less than or equal to the determined target size, from among data obtained from at least one sensor installed in the vehicle, transmit the selected some data to the server, and control a function of the vehicle according to a control command received from the server, in response to transmission of the some data.
Throughout the disclosure, the expression “at least one of a, b or c” indicates only a, only b, only c, both a and b, both a and c, both b and c, all of a, b, and c, or variations thereof.
The disclosure may include certain embodiments and modifications, and particular embodiments thereof are illustrated in the drawings and will be described herein in detail. However, it should be understood that the disclosure is not limited to particular embodiments and includes all modifications, equivalents, and substitutions included in the scope of the disclosure.
In the following description of the embodiments, detailed descriptions of the related art will be omitted when it is deemed that they may unnecessarily obscure the subject matters of the disclosure. Also, numbers (e.g., first and second) used in the process of describing embodiments of the disclosure are merely identification symbols for distinguishing an element from another element.
Also, herein, when an element is referred to as being “connected” or “coupled” to another element, the element may be directly connected or coupled to the other element and may also be connected or coupled to the other element through one or more other intervening elements therebetween unless otherwise specified.
Also, herein, as for elements represented as ‘ . . . portions (units)’, ‘ . . . modules’, or the like, two or more elements may be combined into one element, or one element may be divided into two or more elements for each subdivided function. Also, each element described below may additionally perform some or all of the functions of other elements in addition to its main functions, and some of the main functions of each element may be exclusively performed by other elements.
Hereinafter, embodiments according to the disclosure will be sequentially described in detail.
According to the architectural evolution of automobile technology, an automobile is expected to move from its current distributed structure toward an integrated structure for each function, and then evolve into a structure in which combined domains are controlled by a central controller. That is, with the development of technology, it is expected that data will be centrally processed and that cloud connection may also become possible in the future.
Because a large amount of data must be processed to achieve autonomous driving, which is currently attracting attention, a high-performance processor for processing that data may be utilized. Such a processor may be installed in a vehicle; however, as an autonomous driving system becomes more sophisticated, doing so may complicate the manufacturing of the vehicle and increase its manufacturing cost.
To overcome these problems, a system may be considered in which the vehicle collects environment data about its surroundings and transmits the environment data to a server, and the server analyzes the environment data and transmits a control command back to the vehicle.
The most important consideration in an autonomous driving system in which a vehicle and a server are linked to each other may be how rapidly the system can respond to an external factor. A rapid response to the external factor may be greatly affected by the communication state between the vehicle and the server. Thus, there may be a need for a method of autonomously driving the vehicle stably even in a situation where the communication state between the vehicle and the server changes.
Moreover, unlike an existing rule-based system, an artificial intelligence (AI) system may be a computer system that may allow a machine to learn, determine, and become more intelligent by itself. Because the AI system may have a higher recognition rate and more accurately understand user tastes as it is used more, existing rule-based systems have been gradually replaced with deep learning-based AI systems.
AI technology may include machine learning (deep learning) and element technologies utilizing machine learning.
Machine learning may be an algorithm technology for classifying/learning the features of input data by itself, and the element technologies may be technologies using a machine learning algorithm such as deep learning and may include technical fields such as linguistic understanding, visual understanding, reasoning/prediction, knowledge representation, and motion control.
Linguistic understanding may be a technology for recognizing and applying/processing human languages/characters and may include natural language processing, machine translation, conversation systems, question and answer, and voice recognition/synthesis. Visual understanding may be a technology for recognizing and processing objects, as done by human vision, and may include object recognition, object tracking, image search, human recognition, scene understanding, space understanding, and image improvement. Reasoning/prediction may be a technology for reasoning and predicting logically by determining information and may include knowledge/probability-based reasoning, optimization prediction, preference-based planning, and recommendation. Knowledge representation may include a technology for automatically processing human experience information into knowledge data, and may include knowledge generation (data creation and classification) and knowledge management (data utilization). Motion control may be a technology for controlling autonomous driving of a vehicle and motion of a robot and may include motion control (navigation, collision, and driving) and operation control (behavior control).
Embodiments of the disclosure relate to a method for safely controlling a vehicle in an autonomous driving system in which the vehicle and a server are linked to each other and an apparatus for performing the method, which will be described below in detail with reference to the accompanying drawings.
The vehicle 10 may perform autonomous driving in linkage with the server 200. As described above, the vehicle 10 may transmit data collected by sensors installed in the vehicle 10 to the server 200, and the server 200 may analyze the data and generate a control command for autonomous driving. The server 200 may transmit the control command to the vehicle 10, and the vehicle 10 may operate according to the control command.
Here, the data collected by the sensors installed in the vehicle 10 may be data about the surrounding environment of the vehicle 10. For example, the data may include data indicating whether there is a risk-of-collision object proximate to the vehicle, factors to be considered in driving or moving the vehicle (e.g., whether there is a shaded area and/or whether an accident has occurred nearby), and/or factors affecting the driving of the vehicle. Hereinafter, for convenience of description, data about the surrounding environment of the vehicle will be referred to as ‘environment data’ or ‘sensing data’.
The electronic apparatus 100 may be installed in the vehicle 10 to support autonomous driving of the vehicle 10.
The electronic apparatus 100 may be installed in the vehicle 10 during the process of manufacturing the vehicle 10, or may be installed in the vehicle 10 after completion of the manufacturing of the vehicle 10.
In an autonomous driving system in which the vehicle 10 and the server 200 are interlinked, the electronic apparatus 100 may enable safe autonomous operation of the vehicle 10 even in a situation where communication between the vehicle 10 and the server 200 is disrupted. Particularly, the electronic apparatus 100 may minimize the influence of a disrupted communication state on the safety of the autonomous operation of the vehicle 10, by maintaining a specified time budget for executing control commands received from the server 200.
Hereinafter, the vehicle 10 equipped with the electronic apparatus 100 will be described in more detail with reference to
Referring to
First, the sensors 400 may collect environment data via sensing inputs generated from the external environment of the vehicle 10. The environment data collected by the sensors 400 may be transmitted to the electronic apparatus 100.
In an embodiment of the disclosure, the sensors 400 may include a sensor for detecting an object proximate to the vehicle 10, a sensor for detecting the current position/location of the vehicle 10, a sensor for obtaining images of the environs around the vehicle 10, and/or a sensor for detecting an attitude of the vehicle 10. For example, the sensors 400 may be installed in the vehicle 10, and include a camera, a light detection and ranging (LIDAR) sensor, a radio detection and ranging (RADAR) sensor, a global positioning system (GPS), an acceleration sensor, and/or a gyroscopic sensor. The sensors 400 installed in the vehicle 10 are not limited to the above examples, and various types of sensors not illustrated may be installed in the vehicle 10. One or more sensors 400 may be installed in the vehicle 10.
The communication module 300 may include communication circuitry capable of exchanging data between the electronic apparatus 100 and the server 200 by communicating with the server 200 through a communication network 30. That is, the communication module 300 may transmit environment data provided from the electronic apparatus 100 to the server 200 through the communication network 30 and may transmit a control command received from the server 200 to the electronic apparatus 100 through the communication network 30. The communication module 300 may include a network interface card (NIC).
In an embodiment of the disclosure, the communication module 300 may communicate with the server 200 using any one of various supportable communication methods. The various supportable communication methods may include, but are not limited to, a 5G mmWave communication method, a 5G SUB6 communication method, and a long-term evolution (LTE) communication method. Both the 5G mmWave communication method and the 5G SUB6 communication method may correspond to the 5G communication method but may be different in available frequency bands.
The communication module 300 may communicate with the server 200 using any one communication method and then switch to another communication method. For example, the communication module 300 may communicate with the server 200 by the 5G mmWave communication method and then communicate with the server 200 by switching to the LTE communication method when it is difficult to use the 5G mmWave communication method in a particular area.
When there are multiple communication methods available for communication with the server 200, the communication module 300 may communicate with the server 200 by the communication method with the highest priority. For example, when the communication module 300 may communicate with the server 200 by either the 5G mmWave communication method or the 5G SUB6 communication method and the priority of the 5G mmWave communication method is higher than the priority of the 5G SUB6 communication method, the communication module 300 may use the 5G mmWave communication method to communicate with the server 200.
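As a minimal sketch of this priority-based selection (the method labels, priority ordering, and availability check below are assumptions for illustration, not details fixed by the disclosure), the communication module's behavior might be expressed as follows:

```python
# Hypothetical sketch: choose the highest-priority communication method that is
# currently available. Method names and the priority ordering are assumptions.
SUPPORTED_METHODS = ["5G_mmWave", "5G_SUB6", "LTE"]  # ordered by descending priority

def select_communication_method(available_methods):
    """Return the highest-priority method among those currently available."""
    for method in SUPPORTED_METHODS:
        if method in available_methods:
            return method
    raise RuntimeError("No supported communication method is available")

# Example: if 5G mmWave is unusable in the current area, fall back to 5G SUB6.
print(select_communication_method({"5G_SUB6", "LTE"}))  # -> "5G_SUB6"
```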
The communication module 300 may provide information about the communication state with the server 200 to the electronic apparatus 100. The information about the communication state may include information representing the communication rate with the server 200. When the communication method between the communication module 300 and the server 200 is identified, the communication rate may be identified and thus the information about the communication state may include information representing the communication method.
The electronic apparatus 100 may include a memory 110 and a processor 130.
The memory 110 may store at least one instruction, and the processor 130 may perform an operation for controlling the functions of the vehicle 10 via execution of the at least one instruction. According to an embodiment of the disclosure, the communication module 300 described above may be included in the electronic apparatus 100.
The electronic apparatus 100 may adaptively determine a target size of environment data to be transmitted to the server 200, by considering the communication state between the communication module 300 and the server 200. When the size of the environment data to be transmitted to the server 200 increases, the time utilized to transmit the environment data from the communication module 300 to the server 200 increases, and as a result, a rapid autonomous response to the changing external context of the vehicle 10 may become impossible. Thus, when the communication state between the communication module 300 and the server 200 is stable and offers sufficient bandwidth, the electronic apparatus 100 may increase the target size of the environment data, as the larger bandwidth and stability would not hinder rapidity of response, and increased data granularity may help generate a more accurate response to an external factor of the vehicle 10. When the communication state between the communication module 300 and the server 200 suffers disruption, the electronic apparatus 100 may decrease the target size of the environment data to maintain the transmission time, and thus maintain a certain degree of rapidity in autonomous responses to changes in the external context of the vehicle 10.
The server 200 may analyze the environment data received from the electronic apparatus 100 through the communication module 300 and generate a control command indicating how to control the function of the vehicle 10. For example, based on analysis of the environment data, when the server 200 determines that a right turn of the vehicle 10 is impending, the server 200 may generate a control command representing a right turn.
The control command may be provided from the server 200 to the electronic apparatus 100 through the communication module 300. The electronic apparatus 100 may control a function of the vehicle 10 according to the control command. For example, the electronic apparatus 100 may control at least one of an engine 11, a powertrain 12, a brake 13, a steering 14, or a suspension 15 of the vehicle 10 to allow the vehicle 10 to perform a particular function (e.g., a right turn).
An operation of the electronic apparatus 100 will be described in more detail with reference to
In operation S310, the electronic apparatus 100 may identify a communication state between the communication module 300 and the server 200. The electronic apparatus 100 may receive information regarding the communication state from the communication module 300 and identify the communication state (e.g., quality, stability, bandwidth, etc.) between the communication module 300 and the server 200 according to the received information.
In operation S320, the electronic apparatus 100 may determine a target size of environment data to be transmitted to the server 200, according to the identified communication state. For example, when the communication state indicates a quality over a certain quality threshold (e.g., signal strength, bandwidth, etc.), the target size of the environment data to be transmitted to the server 200 may be increased. When the communication state indicates a quality less than a certain quality threshold, the target size of the environment data to be transmitted to the server 200 may be reduced to maintain a stable data transmission time. A method for the electronic apparatus 100 to determine the target size of the environment data will be described below with reference to
In operation S330, the electronic apparatus 100 may control and/or configure the sensors 400 to detect environment data such that the total size of the collected data is less than or equal to the target size determined in operation S320. That is, the electronic apparatus 100 may control operation of the sensors 400 such that the total size of the environment data collected by the sensors 400 is less than or equal to the target size.
For example, the electronic apparatus 100 may control the sensors 400 such that the total size of environment data collected by the sensors 400 may be less than or equal to the target size, but close to the target size.
As another example, the electronic apparatus 100 may control the sensors 400 such that the total size of environment data collected by the sensors 400 may be less than or equal to the target size and the difference thereof from the target size may be less than or equal to a threshold value. For example, when the target size is 100 Mb and the threshold value is 10 Mb, the total size of the environment data may be about 90 Mb to about 100 Mb.
Turning off a sensing operation of some of the sensors 400 installed in the vehicle 10 or reducing a sensing capability of the sensors 400 installed in the vehicle 10 may be considered as a method of controlling the sensors 400. This will be described below with reference to
In operation S340, the electronic apparatus 100 may transmit the environment data obtained from the sensors 400 to the server 200. The electronic apparatus 100 may convert (e.g., compress) the environment data to render it suitable for the utilized communication method between the communication module 300 and the server 200.
The server 200 may generate a control command for the vehicle by analyzing the environment data received from the electronic apparatus 100. For example, the server 200 may obtain a control command indicating execution of a function of the vehicle 10 in autonomous operation. The appropriate control command may, in some examples, be selected or generated by inputting the environment data into a trained neural network.
In operation S350, the electronic apparatus 100 may control a function of the vehicle 10 in accordance with the control command received from the server 200. Thus, the vehicle 10 executes an autonomous response to events occurring in the external environment, as detected by the sensors and transmitted in the requisite time to the external server.
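The sequence of operations S310 through S350 can be summarized in the following Python sketch; the helper callables and the `CommState` fields are hypothetical placeholders rather than interfaces defined by the disclosure:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CommState:
    method: str        # e.g. "5G_mmWave" or "LTE" (assumed labels)
    rate_mbps: float   # estimated uplink rate

def autonomous_control_cycle(
    get_comm_state: Callable[[], CommState],
    determine_target_size: Callable[[CommState], float],
    collect_environment_data: Callable[[float], bytes],
    send_and_receive_command: Callable[[bytes], dict],
    execute_command: Callable[[dict], None],
) -> None:
    """One cycle of the vehicle-server loop corresponding to operations S310-S350."""
    comm_state = get_comm_state()                     # S310: identify communication state
    target_size = determine_target_size(comm_state)   # S320: determine target data size
    data = collect_environment_data(target_size)      # S330: collect data within the target size
    command = send_and_receive_command(data)          # S340: transmit data, receive control command
    execute_command(command)                          # S350: apply the command to the vehicle
```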
As described with reference to
When the target size of the environment data is reduced to be smaller than a certain threshold, a rapid response to the external factor may be maintained. However, the accuracy of autonomous driving response may be reduced due to reductions in the detail of the sensor data. Thus, it may be important to set a target size of the environment data that enables both a rapid and accurate response in the autonomous operation of the vehicle 10.
In an embodiment of the disclosure, when the electronic apparatus 100 considers collecting environment data of a particular size and the expected time from the start time point of collection of the environment data by the sensors 400 to the time point of execution of the control command is less than or equal to a threshold time, that particular size may be determined as the target size of the environment data. The threshold time may be predetermined in order to ensure a rapid response of the vehicle 10.
The expected time may be divided into detailed times as follows.
(1) the time for the sensors 400 to collect the environment data; (2) the time for the environment data to be transmitted from the sensors 400 to the electronic apparatus 100; (3) the time for the electronic apparatus 100 to convert the environment data and transmit the same to the communication module 300; (4) the time for the communication module 300 to transmit the environment data to the server 200 through the communication network 30; (5) the time for the server 200 to analyze the environment data and generate the control command; (6) the time for the server 200 to transmit the control command to the communication module 300 through the communication network 30; (7) the time for the control command to be transmitted from the communication module 300 to the electronic apparatus 100; and (8) the time for the electronic apparatus 100 to control a function of the vehicle 10 according to the control command.
The detailed times listed above may vary depending on the size of the environment data to be transmitted to the server 200. For example, as the size of the environment data increases, it is likely that the detailed times (1), (2), (3), (4), and (5) will increase.
Particularly, because the detailed times (4) and (6) depend on the communication state (particularly the communication rate) between the communication module 300 and the server 200, the electronic apparatus 100 should determine the target size of the environment data to be transmitted to the server 200 by considering the communication state.
The electronic apparatus 100 may know in advance how much time it will take to collect and process environment data of a certain size in a particular communication state. For example, the electronic apparatus 100 may know in advance that 1 second will be taken from the start time point of collection of the environment data to the time point of execution of the control command in the case of collecting 100-Mb environment data with the 5G mmWave communication method, and 2 seconds will be taken from the start time point of collection of the environment data to the time point of execution of the control command in the case of collecting 100-Mb environment data with the LTE communication method.
According to an embodiment of the disclosure, the electronic apparatus 100 may calculate the target size of the environment data to be transmitted to the server 200, by inputting information about the communication state and the threshold time into a prestored algorithm.
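As one hedged illustration of such a prestored algorithm (the fixed overhead and per-megabyte coefficients below are assumed values, and the detailed-time numbering follows the list above), the largest size whose expected time fits within the threshold could be solved for directly:

```python
def max_target_size_mb(uplink_rate_mbps: float,
                       threshold_time_s: float,
                       fixed_overhead_s: float = 0.2,
                       collect_s_per_mb: float = 0.002,
                       analyze_s_per_mb: float = 0.003) -> float:
    """Largest environment-data size (MB) whose expected time, from the start of
    collection to execution of the control command, stays within the threshold.

    The expected time is modeled as a fixed overhead (detailed times (2), (3), (6),
    (7), (8)) plus size-dependent terms for collection (1), uplink transmission (4),
    and server-side analysis (5). All coefficients are assumed, not from the disclosure.
    """
    uplink_s_per_mb = 8.0 / uplink_rate_mbps   # 1 MB = 8 megabits
    budget_s = threshold_time_s - fixed_overhead_s
    if budget_s <= 0:
        return 0.0
    return budget_s / (collect_s_per_mb + uplink_s_per_mb + analyze_s_per_mb)

# A faster link (e.g. 5G mmWave vs. LTE) permits a larger target size for the same threshold.
print(max_target_size_mb(uplink_rate_mbps=1000.0, threshold_time_s=1.0))
print(max_target_size_mb(uplink_rate_mbps=50.0, threshold_time_s=1.0))
```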
According to an embodiment of the disclosure, the electronic apparatus 100 may further determine the analysis accuracy used by the server 200 when determining the target size by considering the expected time and the threshold time. The analysis accuracy of the server 200 may be related to the detailed time (5). Because the detailed time (5) increases as the analysis accuracy increases, the electronic apparatus 100 may determine the target size and the analysis accuracy such that the expected time may be less than or equal to the threshold time.
The electronic apparatus 100 may transmit the analysis accuracy to the server 200, and the server 200 may analyze the environment data according to the received analysis accuracy. For example, when the analysis accuracy is 3, the server 200 may analyze the environment data through a predetermined first algorithm; when the analysis accuracy is 2, the server 200 may analyze the environment data through a second algorithm having less complexity than the first algorithm; and when the analysis accuracy is 1, the server 200 may analyze the environment data through a third algorithm having less complexity than the second algorithm. The first algorithm may include a first neural network, the second algorithm may include a second neural network having fewer layers than the first neural network, and the third algorithm may include a third neural network having fewer layers than the second neural network.
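On the server side, the accuracy-dependent choice among the first, second, and third algorithms might be realized with a simple dispatch table; the registered functions below are hypothetical stand-ins, not the disclosure's actual algorithms:

```python
from typing import Callable, Dict

def analyze_environment_data(environment_data: bytes,
                             analysis_accuracy: int,
                             algorithms: Dict[int, Callable[[bytes], dict]]) -> dict:
    """Run the analysis algorithm registered for the requested accuracy level.

    `algorithms` maps accuracy levels (e.g. 1, 2, 3) to analysis functions; higher
    levels are assumed to use deeper, slower neural networks. If the exact level is
    not registered, the closest lower registered level is used.
    """
    candidates = [level for level in algorithms if level <= analysis_accuracy]
    level = max(candidates) if candidates else min(algorithms)
    return algorithms[level](environment_data)

# Example registration (the lambdas stand in for the first/second/third algorithms).
algorithms = {
    3: lambda data: {"command": "analyzed with first (deepest) algorithm"},
    2: lambda data: {"command": "analyzed with second algorithm"},
    1: lambda data: {"command": "analyzed with third (simplest) algorithm"},
}
print(analyze_environment_data(b"...", analysis_accuracy=2, algorithms=algorithms))
```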
Referring to
In operation S420, when the expected time is less than the threshold time, it can be determined that the candidate size is appropriate. Thus, the candidate size may be set as the target size for the environment data. Even when the candidate size is increased, if the expected time remains less than the threshold time, the target size may be determined as a size greater than that candidate size. That is, the electronic apparatus 100 may determine, as the target size, the greatest size among the various sizes that result in an expected time less than the threshold time. As described above, the threshold time is intended to ensure a rapid response of the vehicle 10; however, because safety is not ensured by a rapid response alone, environment data of the maximum size should be collected within the range in which the rapid response remains possible.
In operation S430, when the expected time is greater than or equal to the threshold time, it may be determined that the candidate size is too large for transmission within the threshold time. Thus, the electronic apparatus 100 may reduce the candidate size and reassess the reduced candidate size by re-performing operation S410.
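Operations S410 through S430 amount to an iterative search over candidate sizes; a minimal sketch of that loop, assuming an `estimate_expected_time` helper along the lines of the model sketched earlier, is shown below:

```python
def find_target_size(initial_candidate_mb: float,
                     threshold_time_s: float,
                     estimate_expected_time,
                     step_mb: float = 10.0,
                     min_size_mb: float = 1.0) -> float:
    """Reduce the candidate size until the expected time falls below the threshold (S410-S430).

    `estimate_expected_time(size_mb)` should return the expected time in seconds from the
    start of data collection to execution of the control command; it is an assumed helper.
    """
    candidate = initial_candidate_mb
    while candidate > min_size_mb:
        if estimate_expected_time(candidate) < threshold_time_s:   # S410 / S420
            return candidate                                       # candidate is acceptable
        candidate -= step_mb                                       # S430: shrink and retry
    return min_size_mb

# Example with a toy latency model: 0.3 s overhead plus 0.01 s per megabyte.
print(find_target_size(100.0, threshold_time_s=1.0,
                       estimate_expected_time=lambda mb: 0.3 + 0.01 * mb))
```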
Moreover, the threshold time may vary depending on the current situation of the vehicle 10. For example, when the vehicle 10 is driving at high speed (e.g., above a certain speed threshold), the threshold time should be short, and when the vehicle 10 is driving at low speed (e.g., below a certain speed threshold), it is likely that a problem will not occur even when the threshold time is long.
As illustrated in
As illustrated in
Referring to
Likewise, the threshold time may decrease as the risk of the vehicle 10 increases. As an example, when the speed of the vehicle 10 is 30 km/h or less and the vehicle 10 is located in a section of risk 1, the threshold time may be 1 second, and when the speed of the vehicle 10 is 30 km/h or less and the vehicle 10 is located in a section of risk 2, the threshold time may be 0.8 seconds.
According to an embodiment of the disclosure, the electronic apparatus 100 may determine the threshold time used for determining the target size by increasing or decreasing a preset threshold time according to at least one of the speed of the vehicle 10 or the risk of an area in which the vehicle 10 is located. For example, when the preset threshold time corresponds to a case where the speed of the vehicle 10 is about 31 km/h to about 60 km/h and the risk is 2, the preset threshold time may be reduced when the speed of the vehicle 10 exceeds that range or the risk is greater than 2.
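A hedged sketch of this speed- and risk-dependent selection is shown below; the two entries for speeds of 30 km/h or less follow the example given above, while the other speed bands and their values are assumptions:

```python
# Candidate threshold times (seconds) indexed by (speed band, risk level).
# The 30 km/h entries follow the example in the text; the remaining values are assumed.
CANDIDATE_THRESHOLD_TIMES = {
    ("low", 1): 1.0,    # speed <= 30 km/h, risk 1
    ("low", 2): 0.8,    # speed <= 30 km/h, risk 2
    ("mid", 1): 0.8,    # 31-60 km/h, risk 1 (assumed)
    ("mid", 2): 0.6,    # 31-60 km/h, risk 2 (assumed)
    ("high", 1): 0.6,   # above 60 km/h, risk 1 (assumed)
    ("high", 2): 0.4,   # above 60 km/h, risk 2 (assumed)
}

def select_threshold_time(speed_kmh: float, risk_level: int) -> float:
    """Select a candidate threshold time: faster driving or higher risk => shorter threshold."""
    if speed_kmh <= 30:
        band = "low"
    elif speed_kmh <= 60:
        band = "mid"
    else:
        band = "high"
    return CANDIDATE_THRESHOLD_TIMES[(band, min(max(risk_level, 1), 2))]

print(select_threshold_time(speed_kmh=25, risk_level=2))  # -> 0.8
```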
Further, a stability level may be calculated as a representation of a stability (e.g., safety) of the autonomous driving operation, which may be based on the threshold time and target size. Faster autonomous responses of the vehicle 10 are facilitated by shorter threshold time values. Furthermore, the accuracy of autonomous driving responses increases as the target size of the environment data increases (e.g., due to richer and more detailed data). Thus, the stability of autonomous operation of the vehicle 10 may be identified by considering both the threshold time and the target size of the environment data.
Referring to
When the stability level of autonomous driving is identified, the electronic apparatus 100 may output the stability level of autonomous driving through an output device of the vehicle 10, for example, a speaker or a display. The stability level of autonomous driving may represent an autonomous driving mode. The degree of driver's intervention may vary depending on the autonomous driving mode. For example, no driver's intervention may be required in an autonomous driving mode where the stability level of autonomous driving is 3, and some driver's intervention (e.g., looking ahead while holding the steering wheel) may be desirable in an autonomous driving mode where the stability level of autonomous driving is 2. Much user intervention (e.g., being able to operate the brake at any time by looking ahead while holding the steering wheel) may be mandated in an autonomous driving mode where the stability level of autonomous driving is 1.
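One possible way to derive such a stability level, sketched under heavy assumptions (the score, normalization, and cut-offs below are illustrative only and not defined by the disclosure), is to reward configurations that allow a large target size within a short threshold time:

```python
def stability_level(target_size_mb: float, threshold_time_s: float) -> int:
    """Map the determined target size and threshold time to a coarse stability level 1-3.

    Intuition from the text: a shorter threshold time forces faster responses, and a
    larger target size permits more accurate analysis. The score below rewards being
    able to collect a lot of data per second of allowed response time; the cut-offs
    and normalization are assumed values for illustration only.
    """
    data_rate_score = target_size_mb / max(threshold_time_s, 1e-3)  # MB per second of budget
    if data_rate_score >= 80:
        return 3   # e.g. an autonomous driving mode needing no driver intervention
    if data_rate_score >= 40:
        return 2   # some driver intervention desirable
    return 1       # substantial driver intervention required

print(stability_level(target_size_mb=100.0, threshold_time_s=1.0))  # -> 3
```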
By notifying the driver of an estimated stability level of present autonomous driving operations (e.g., the autonomous driving mode), the driver may be informed as to periods where more or less human intervention may be necessary to respond to events occurring in the external environment during autonomous operation of the vehicle.
When a departure point 710 and an arrival point 730 are set through a navigation system 700 of the vehicle 10, the electronic apparatus 100 may predict stability levels of autonomous driving for the sections 722, 724, and 726 in the overall route 720 between the departure point 710 and the arrival point 730, and output the respective predicted stability levels for each section.
The electronic apparatus 100 may obtain in advance information about the communication state expected with the server 200 in each of the sections 722, 724, and 726 of the route and may predict the stability level of autonomous driving in each section based on that communication state. When the communication state is identified, the electronic apparatus 100 may calculate the stability level of autonomous driving in each section according to the above embodiment of the disclosure.
As illustrated in
When the driver drives along the route 720 between the departure point 710 and the arrival point 730, the driver may be informed as to road sections where more or less human intervention may be necessary to respond to events occurring in the external environment during autonomous navigation of the vehicle.
In an embodiment of the disclosure, when there are multiple routes connecting the departure point and the arrival point, the electronic apparatus 100 may calculate stability levels for the subsections of each prospective route, calculate an average predicted stability level for each route, and then select a recommended route according to the calculated averages.
Referring to
The electronic apparatus 100 may calculate the average predicted stability level of the entirety of route A and the entirety of route B. These calculations may, for example, consider the driving time in each section. For example, the electronic apparatus 100 may calculate the average predicted stability level by weight-averaging the predicted stability level of each section of each route according to each driving time. Particularly, the electronic apparatus 100 may calculate the average predicted stability level of route A as (3*0.4)+(1*0.3)+(2*0.3) and calculate the average predicted stability level of route B as (2*0.5)+(3*0.3)+(2*0.2). The weight multiplied by the predicted stability level of each section of a route may be calculated by normalizing the sum of driving times to 1. That is, because the total driving time of route A is 100 minutes, the weight multiplied by the predicted stability level 3 of section a-1 may be determined as 40/100.
When the average predicted stability level of route A is 2.1 and the average predicted stability level of route B is 2.3, the electronic apparatus 100 may automatically select route B as the recommended route, because route B has the higher average predicted stability level.
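The weighted-average comparison in this example can be reproduced directly; the sketch below uses the section durations from the example (route A: 40, 30, and 30 minutes; route B: 50, 30, and 20 minutes) and ignores other factors such as tolls:

```python
def weighted_stability(sections):
    """Driving-time-weighted average of predicted stability levels for one route.

    `sections` is a list of (stability_level, driving_time_minutes) pairs.
    """
    total_time = sum(time for _, time in sections)
    return sum(level * (time / total_time) for level, time in sections)

def recommend_route(routes):
    """Return the route name with the highest average predicted stability level."""
    return max(routes, key=lambda name: weighted_stability(routes[name]))

# Section durations reproduce the worked example above.
routes = {
    "A": [(3, 40), (1, 30), (2, 30)],   # average 2.1
    "B": [(2, 50), (3, 30), (2, 20)],   # average 2.3
}
print(recommend_route(routes))  # -> "B"
```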
The above method of selecting the recommended route is merely an example, and the electronic apparatus 100 may select the recommended route by further considering the toll, the total driving time, and/or the like when driving along each route.
The electronic apparatus 100 may select the stability level of autonomous driving (i.e., the autonomous driving mode) corresponding to the communication method between the vehicle 10 and the server 200 and determine the target size of the environment data corresponding to the selected stability level of autonomous driving.
In the above, the target size is determined and the stability level of autonomous driving is determined by considering the communication state between the vehicle 10 and the server 200, the expected time, and the threshold time; however, according to an embodiment of the disclosure, the stability level of autonomous driving may be determined and the target size may be determined according to the communication method between the vehicle 10 and the server 200.
As illustrated in
In the embodiment illustrated in
Hereinafter, a method for the electronic apparatus 100 to obtain the environment data from the sensors 400 after the target size is determined will be described.
When the target size of the environment data to be transmitted to the server 200 is determined, the electronic apparatus 100 may control the function of the sensors 400 such that the environment data collected by the sensors 400 falls within the determined target size.
In an embodiment of the disclosure, the electronic apparatus 100 may reduce the total size of the environment data collected by the sensors 400, by, for example, deactivating some of the sensors 400. This may be useful when the total size of the environment data collected in a state where all of the sensors 400 are turned on is greater than the target size of the environment data to be transmitted to the server 200.
The electronic apparatus 100 may select the sensor 400 to be deactivated from among the sensors 400, by considering at least one of the movement direction of the vehicle 10 or the position of the vehicle 10. For example, the position of the vehicle 10 may allow consideration of the risk associated with particular regions around the vehicle and the directions in which those regions lie.
For example, as illustrated in
Also, as illustrated in
The electronic apparatus 100 may predetermine the sensors 400 that should be sequentially deactivated according to at least one of the movement direction of the vehicle 10 or the position of the vehicle 10 and may sequentially deactivate the sensors 400 according to the priority such that the total size of the environment data collected by the sensors 400 may be less than or equal to the target size of the environment data to be transmitted to the server 200.
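A minimal sketch of this sequential deactivation, with assumed sensor names, per-sensor data sizes, and deactivation ordering, might look as follows:

```python
def select_sensors_to_keep(sensors, deactivation_priority, target_size_mb):
    """Turn off sensors in a predetermined order until the collected data fits the target.

    `sensors` maps sensor names to their expected per-cycle data size (MB), and
    `deactivation_priority` lists the sensors to turn off first (e.g. rear-facing
    sensors when driving forward). Names, sizes, and ordering are assumed examples.
    """
    active = dict(sensors)
    for name in deactivation_priority:
        if sum(active.values()) <= target_size_mb:
            break
        active.pop(name, None)   # deactivate the next sensor in the predetermined order
    return active

sensors = {"front_camera": 40, "rear_camera": 40, "lidar": 30, "radar": 10}
# When moving forward, rear-facing data may be sacrificed first (assumed ordering).
print(select_sensors_to_keep(sensors, ["rear_camera", "lidar"], target_size_mb=80))
```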
In another embodiment of the disclosure, the electronic apparatus 100 may reduce the sensing capability of the sensors 400 such that the size of the environment data collected by the sensors 400 may be less than or equal to the target size of the environment data to be transmitted to the server 200. For example, the electronic apparatus 100 may reduce the total size of the environment data collected by the sensors 400, by reducing the resolution of an image obtained by the camera or by reducing the range (distance or area) detected by the LIDAR or the radar.
The electronic apparatus 100 may adaptively determine the degree of reduction in the sensing capability according to the target size of the environment data to be transmitted to the server 200. For example, when the target size of the environment data to be transmitted to the server 200 is a first size, the resolution of an image obtained by the camera may be determined as UHD (3840*2160); when the target size of the environment data to be transmitted to the server 200 is a second size less than the first size, the resolution of an image obtained by the camera may be determined as FHD (1920*1080); and when the target size of the environment data to be transmitted to the server 200 is a third size less than the second size, the resolution of an image obtained by the camera may be determined as HD (1280*720).
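The resolution tiers described above could be encoded as a simple mapping; the resolutions are those named in the text, while the megabyte boundaries between the first, second, and third target sizes are assumed for illustration:

```python
def camera_resolution_for_target(target_size_mb: float) -> tuple:
    """Pick a camera resolution so that smaller target sizes yield smaller images.

    The three resolutions mirror the UHD/FHD/HD tiers mentioned above; the megabyte
    boundaries separating the tiers are assumed values.
    """
    if target_size_mb >= 100:       # first (largest) target size
        return (3840, 2160)         # UHD
    if target_size_mb >= 50:        # second target size
        return (1920, 1080)         # FHD
    return (1280, 720)              # HD, for the smallest target sizes

print(camera_resolution_for_target(60))  # -> (1920, 1080)
```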
In operation S1210, the electronic apparatus 100 may identify the communication state between the communication module 300 and the server 200. The electronic apparatus 100 may receive information about the communication state from the communication module 300 and identify the communication state between the communication module 300 and the server 200 according to the received information.
In operation S1220, the electronic apparatus 100 may determine the target size of the environment data to be transmitted to the server 200, according to the identified communication state. Because a method of determining the target size of the environment data to be transmitted to the server 200 has been described above, redundant descriptions thereof will be omitted for conciseness.
In operation S1230, the electronic apparatus 100 may select some of the environment data already obtained from the sensors 400. When the total size of the environment data obtained from the sensors 400 is greater than the target size determined in operation S1220, the electronic apparatus 100 may select a sub-portion of the obtained environment data from among the environment data obtained from the sensors 400, such that the total size of the sub-portion is less than or equal to the target size. The electronic apparatus 100 may therefore select some environment data having a total size less than or equal to the target size and, when different configurations of the sub-portion of data are possible, select the particular configuration having a size closest to the target size for transmission to the external server.
The embodiment illustrated in
When selecting some of the environment data, the electronic apparatus 100 may consider the priority of the environment data, which will be described below with reference to
In operation S1240, the electronic apparatus 100 may transmit some of the environment data obtained from the sensors 400 to the server 200. The electronic apparatus 100 may convert (e.g., compress) the environment data to be suitable for the communication method between the communication module 300 and the server 200.
The server 200 may generate a control command by analyzing the environment data received from the electronic apparatus 100. For example, the server 200 may obtain a control command indicating how the vehicle 10 should function, by inputting the environment data into a trained neural network.
In operation S1250, the electronic apparatus 100 may control a function of the vehicle 10 according to the control command received from the server 200 as a response to the environment data.
The electronic apparatus 100 may select some of the environment data obtained from the sensors 400 according to at least one of the movement direction of the vehicle 10 or the position of the vehicle 10. The position of the vehicle 10 may represent the risk level of a region in which the vehicle 10 is located.
As illustrated in
Also, as illustrated in
The electronic apparatus 100 may preset the priority between different sets of environment data collected by different sensors, by considering at least one of the movement direction of the vehicle 10 or the position of the vehicle 10. Also, the electronic apparatus 100 may sequentially select the environment data to be transmitted to the server 200 according to the preset priority such that the total size of the selected environment data may be less than or equal to the target size and may be closest to the target size.
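A hedged sketch of this priority-based selection over already-collected data (the priorities, item names, and sizes are assumed examples) is shown below; it greedily takes the highest-priority items that still fit under the target size:

```python
def select_data_to_transmit(data_items, target_size_mb):
    """Pick already-collected data items, highest priority first, so that the total size
    stays at or below the target while getting as close to it as the ordering allows.

    `data_items` is a list of (priority, size_mb, payload) tuples; the priorities and
    sizes are assumed illustrative values (e.g. front-camera frames ranked above
    rear-camera frames when the vehicle is moving forward).
    """
    selected, total = [], 0.0
    for priority, size, payload in sorted(data_items, key=lambda item: item[0], reverse=True):
        if total + size <= target_size_mb:
            selected.append(payload)
            total += size
    return selected, total

items = [(3, 40, "front_camera_frame"), (2, 30, "lidar_scan"), (1, 40, "rear_camera_frame")]
print(select_data_to_transmit(items, target_size_mb=80))
# -> (['front_camera_frame', 'lidar_scan'], 70.0)
```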
Moreover, it has been described above that the electronic apparatus 100 actively determines the target size of the environment data to be transmitted to the server 200; however, according to an embodiment of the disclosure, the server 200 may request the electronic apparatus 100 to reduce the size of the environment data. The server 200 may transmit, to the electronic apparatus 100, target size information of the environment data that the server 200 is to receive. Alternatively, the server 200 may request the electronic apparatus 100 to reduce the size of the environment data, and the electronic apparatus 100 may determine the target size according to the reduction request.
A request by the server 200 for reduction of the size of the environment data may be useful when the load applied to the server 200 is great. For example, when a great number of vehicles 10 drive autonomously in linkage with the server 200, the load applied to the server 200 increases, and the server 200 may reduce the load to be processed by transmitting a request for reduction of the size of the environment data to the vehicle 10.
In an embodiment of the disclosure, at least one of the operations performed by the electronic apparatus 100 (particularly by the processor 130 of the electronic apparatus 100) may be performed by using AI technology. At least one operation performed by using AI technology will be described below in detail with reference to
Particularly, at least one of i) an operation of determining the target size, ii) an operation of selecting the threshold time, iii) an operation of selecting the sensor 400 to be turned off from among the sensors 400, or iv) an operation of selecting some environment data to be transmitted to the server 200 from among the environment data, which are performed by the electronic apparatus 100, may be performed by using AI technology that performs an operation through a neural network. For example, such an operation may be performed by using AI based on the communication state between the server 200 and the vehicle 10 described above.
AI technology may be a technology for obtaining a desired result by processing (e.g., analyzing and/or classifying) input data based on operation through a neural network.
The AI technology may be implemented by using an algorithm. Here, an algorithm or a set of algorithms for implementing the AI technology may be referred to as a neural network. The neural network may receive input data, perform an operation for analysis and/or classification, and output result data. In order for the neural network to accurately output the result data corresponding to the input data, it is desirable to train the neural network. Here, ‘training’ may represent training the neural network such that the neural network may find and learn by itself a method of analyzing a plurality of pieces of input data to the neural network, a method of classifying the plurality of pieces of input data, and/or a method of extracting features for generating result data from the plurality of pieces of input data. Particularly, through a training process, the neural network may optimize the weight values in the neural network based on the training data. Then, a desired result may be output by processing the input data through the neural network having the optimized weight values.
The neural network may be classified as a deep neural network when there are a plurality of hidden layers as internal layers for performing an operation, that is, when the depth of the neural network for performing an operation increases. Examples of the neural network may include, but are not limited to, Convolutional Neural Network (CNN), Deep Neural Network (DNN), Recurrent Neural Network (RNN), Restricted Boltzmann Machine (RBM), Deep Belief Network (DBN), Bidirectional Recurrent Deep Neural Network (BRDNN), and Deep Q-Networks. Also, the neural network may be subdivided. For example, the CNN neural network may be subdivided into Deep Convolution Neural Networks (DCNNs) or CapsNet neural networks (not illustrated).
‘AI model’ may refer to a neural network including at least one layer that receives input data and operates to output a desired result. Also, ‘AI model’ may refer to an algorithm for outputting a desired result by performing an operation through a neural network, a set of algorithms, a processor for executing an algorithm (or a set of algorithms), software for executing an algorithm (or a set of algorithms), or hardware for executing an algorithm (or a set of algorithms).
At least one of i) the operation of determining the target size, ii) the operation of selecting the threshold time, iii) the operation of selecting the sensor 400 to be turned off from among the sensors 400, or iv) the operation of selecting some of the environment data may be performed based on the AI model.
Referring to
For example, the neural network 1510 may learn a method of determining the target size of the environment data to be transmitted to the server 200 based on the communication state information. The trained neural network 1510 may receive the communication state information and determine the target size of the environment data.
As another example, based on the target size information of the environment data, the neural network 1510 may learn a method of selecting the sensor 400 to be turned off from among the sensors 400 and/or a method of selecting some of the environment data. The trained neural network 1510 may receive the target size information of the environment data and select the sensor 400 to be turned off from among the sensors 400 and/or some of the environment data.
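As a purely illustrative sketch of a network in the spirit of the neural network 1510 (the feature encoding, architecture, and training signal below are assumptions, and the untrained weights produce meaningless outputs until the network is trained), a small regression model could map communication-state features to a target size:

```python
import torch
from torch import nn

# Assumed input features: [communication-method id, measured rate (Mbps), threshold time (s)].
# The architecture and feature choice are illustrative; the disclosure does not fix them.
target_size_net = nn.Sequential(
    nn.Linear(3, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)

def predict_target_size(method_id: float, rate_mbps: float, threshold_s: float) -> float:
    """Run the (hypothetical) network once and return a scalar target-size prediction."""
    features = torch.tensor([[method_id, rate_mbps, threshold_s]], dtype=torch.float32)
    with torch.no_grad():
        return float(target_size_net(features))

# During training, the weights would be optimized (e.g. with torch.optim.Adam) against
# target sizes known to keep the expected time within the threshold.
print(predict_target_size(method_id=0.0, rate_mbps=1000.0, threshold_s=1.0))
```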
In an embodiment of the disclosure, a neural network for performing at least one of i) an operation of determining the target size of the environment data, ii) an operation of selecting the threshold time, iii) an operation of selecting the sensor 400 to be turned off from among the sensors 400, or iv) an operation of selecting some of the environment data may be implemented in a processor (e.g., 130 of
Alternatively, the neural network performing at least one of i) the operation of determining the target size of the environment data, ii) the operation of selecting the threshold time, iii) the operation of selecting the sensor 400 to be turned off from among the sensors 400, or iv) the operation of selecting some of the environment data may be implemented in an electronic apparatus (not illustrated) or a processor (not illustrated) that is distinct from the electronic apparatus 100.
The operation through the neural network described above may be performed by a server (not illustrated) that may communicate with the electronic apparatus 100 according to an embodiment of the disclosure through a wireless communication network. Communication between the electronic apparatus 100 and a server (not illustrated) will be described with reference to
Referring to
For example, the processor 1630 of the server 1600 may obtain information about the communication state, and determine the target size of the environment data to be transmitted to the server 200 or 1600 according to the communication state information. The communication module 1610 may transmit the target size information of the environment data to the electronic apparatus 100.
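One simple, non-learned way to relate a communication state to a target size, shown only as a hedged baseline (the throughput estimate, transmission window, and safety margin are assumed values, not parameters from the disclosure), is to bound the data by what the link can carry within the transmission window:

```python
# Hypothetical baseline: bound the target size by estimated link capacity.
# Throughput, window, and safety margin are illustrative assumptions.
def target_size_from_link(throughput_mbps: float,
                          window_s: float,
                          safety_margin: float = 0.8) -> int:
    """Bytes that can plausibly be delivered within the transmission window."""
    bits = throughput_mbps * 1_000_000 * window_s * safety_margin
    return int(bits / 8)

# e.g., a 20 Mbps link and a 0.5 s transmission window
print(target_size_from_link(20.0, 0.5))  # -> 1000000 bytes
```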
In an embodiment of the disclosure, the server 1600 may determine the target size of the environment data by performing an operation through the neural network 1510 described with reference to
In general, in terms of memory storage capacity, operation processing speed, the capability to collect a training data set, and/or the like, the electronic apparatus 100 may be more limited than the server 1600. Thus, the server 1600 may perform an operation requiring storage of a large amount of data and a large amount of computation, and the resulting data and/or the AI model may then be transmitted to the electronic apparatus 100 through a communication network. The electronic apparatus 100 may then rapidly and easily perform an operation by receiving and using the data and/or the AI model from the server 1600, without requiring a large-capacity memory or a processor with fast operation capability.
In an embodiment of the disclosure, the server 1600 may include the neural network 1510 described with reference to
The communication module 300 may communicate with an external device (e.g., the server 200 or 1600) through a wireless communication network 1615. Here, the external device may include a server (e.g., the server 200 or 1600) that may perform at least one of the operations executed by the electronic apparatus 100 or may transmit data utilized by the electronic apparatus 100.
The communication module 300 may include at least one communication module such as a short-range communication module, a wired communication module, a mobile communication module, and/or a broadcast receiving module. Here, the communication module 300 may include a tuner performing broadcast reception and/or a communication module capable of performing data transmission/reception through a network conforming to a communication standard such as Bluetooth, Wireless LAN (WLAN, e.g., Wi-Fi), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), CDMA, WCDMA, the Internet, 3G, 4G, 5G, and/or millimeter wave (mmWave).
For example, the communication module 300 may rapidly transmit/receive a large amount of data by performing communication using mmWave. In particular, by rapidly receiving a large amount of data using millimeter waves, the vehicle 10 may increase the safety of the vehicle 10 and/or the user's convenience by quickly providing user content (e.g., movies or music) and/or data beneficial for the safety of the vehicle 10 (e.g., data for autonomous driving and/or data for a navigation service).
The communication module 300 may communicate with another device (e.g., the server 200 or 1600) located at a long distance through a communication network conforming to a communication standard such as 3G, 4G, and/or 5G. Here, the communication module 300 performing communication with another device located at a long distance may be referred to as a ‘long-range communication module’.
Referring to
The communication module 1610 may include one or more elements for allowing communication with the electronic apparatus 100. Because the detailed configuration of the communication module 1610 corresponds to the configuration of the communication module 300 described above, redundant descriptions thereof will be omitted for conciseness.
For example, the communication module 1610 may communicate with another device (e.g., the electronic apparatus 100) located at a long distance through a communication network conforming to a communication standard such as Internet, 3G, 4G, and/or 5G.
The processor 1630 may control an overall operation of the server 1600. For example, the processor 1630 may perform desired operations by executing at least one program or at least one instruction of the server 1600.
The DB 1650 may include a memory (not illustrated) and may store, in the memory, at least one of an instruction, a program, or data for the server 1600 to perform a certain operation. Also, the DB 1650 may store data for the server 1600 to perform an operation according to the neural network.
In an embodiment of the disclosure, the server 1600 may store the neural network 1510 described with reference to
In an embodiment of the disclosure, the server 1600 may determine the target size of the environment data described above by using the neural network included therein and transmit the determined target size information to the communication module 300 of the electronic apparatus 100 through the communication module 1610.
Also, the server 1600 may transmit the trained neural network to the communication module 300 of the electronic apparatus 100 through the communication module 1610. Then, the electronic apparatus 100 may obtain and store the trained neural network and may obtain desired output data through the neural network.
Referring to
The data learner 1710 may learn a criterion for at least one of i) an operation of determining the target size of the environment data, ii) an operation of selecting the threshold time, iii) an operation of selecting the sensor 400 to be turned off from among the sensors 400, or iv) an operation of selecting some of the environment data.
For example, the data learner 1710 may learn a criterion about which data to use to determine the target size of the environment data and/or how to determine the target size of the environment data by using the data.
By obtaining data to be used for training and applying the obtained data to a data recognition model described below, the data learner 1710 may learn a criterion for at least one of i) an operation of determining the target size of the environment data, ii) an operation of selecting the threshold time, iii) an operation of selecting the sensor 400 to be turned off from among the sensors 400, or iv) an operation of selecting some of the environment data.
The data recognizer 1720 may determine the target size of the environment data to be transmitted to the server 200 or 1600 based on the data (e.g., the communication state information). The data recognizer 1720 may determine the target size of the environment data from certain data by using the trained data recognition model. The data recognizer 1720 may determine the target size of the environment data based on certain data by obtaining certain data according to a preset criterion based on training and then using the data recognition model by using the obtained data as an input value. Also, the result value output by the data recognition model by using the obtained data as an input value may be used to update the data recognition model.
At least one of the data learner 1710 or the data recognizer 1720 may be manufactured in the form of at least one hardware chip and mounted on the electronic apparatus 100. For example, at least one of the data learner 1710 or the data recognizer 1720 may be manufactured in the form of a dedicated hardware chip for AI or may be manufactured as a portion of a general-purpose processor (e.g., a central processing unit (CPU) or an application processor) or a graphics processor (e.g., a graphics processing unit (GPU)) and mounted on the electronic apparatus 100 described above.
The data learner 1710 and the data recognizer 1720 may be mounted on one electronic apparatus 100 or may be respectively mounted on separate devices. For example, one of the data learner 1710 and the data recognizer 1720 may be included in the electronic apparatus 100, and the other one may be included in the server 1600. Also, model data generated by the data learner 1710 may be provided to the data recognizer 1720 by wire or wirelessly, or data input into the data recognizer 1720 may be provided to the data learner 1710 as additional training data.
Moreover, at least one of the data learner 1710 or the data recognizer 1720 may be implemented as a software module. When at least one of the data learner 1710 or the data recognizer 1720 is implemented as a software module (or a program module including instructions), the software module may be stored in a non-transitory computer-readable recording medium. Also, in this case, at least one software module may be provided by an operating system (OS) or may be provided by a certain application. Alternatively, some of at least one software module may be provided by an operating system (OS), and the others may be provided by a certain application.
Referring to
The data obtainer 1710-1 may obtain data for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. The data obtainer 1710-1 may obtain data for learning at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time.
The preprocessor 1710-2 may preprocess the data obtained for training. The preprocessor 1710-2 may preprocess the data into a preset format such that the model trainer 1710-4 described below may use the data for training for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time.
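As an illustration of preprocessing into a preset format, the sketch below normalizes raw communication-state measurements into a fixed-length feature vector; the field names and normalization constants are hypothetical, not drawn from the disclosure.

```python
# Hypothetical preprocessor: map raw measurements to a fixed-length, normalized
# feature vector. Field names and normalization constants are assumptions.
import numpy as np

FEATURE_ORDER = ("bandwidth_mbps", "latency_ms", "signal_strength_dbm")
SCALES = {"bandwidth_mbps": 100.0, "latency_ms": 200.0, "signal_strength_dbm": -120.0}

def preprocess(raw: dict) -> np.ndarray:
    """Return features in a fixed order, scaled to roughly [0, 1]; missing -> 0."""
    return np.array([raw.get(k, 0.0) / SCALES[k] for k in FEATURE_ORDER],
                    dtype=np.float32)

print(preprocess({"bandwidth_mbps": 50.0, "latency_ms": 30.0,
                  "signal_strength_dbm": -70.0}))
```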
The training data selector 1710-3 may select data for training from among the preprocessed data. The selected data may be provided to the model trainer 1710-4. The training data selector 1710-3 may select data for training from among the preprocessed data according to a preset criterion for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. Also, the training data selector 1710-3 may select data according to a preset criterion based on training by the model trainer 1710-4 described below.
Based on the training data, the model trainer 1710-4 may learn a criterion about how to perform at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. Also, the model trainer 1710-4 may learn a criterion about which training data to use for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time.
Also, the model trainer 1710-4 may use the training data to train the data recognition model used for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. In this case, the data recognition model may be a pre-generated model. For example, the data recognition model may be a model pre-generated by receiving basic training data (e.g., sample data).
The data recognition model may be generated by considering the application field of the recognition model, the purpose of training, or the computing performance of the device. The data recognition model may be, for example, a model based on a neural network. For example, a model such as a convolutional neural network (CNN), a deep neural network (DNN), a recurrent neural network (RNN), or a bidirectional recurrent deep neural network (BRDNN) may be used as the data recognition model; however, the disclosure is not limited thereto.
According to certain embodiments of the disclosure, when there are a plurality of pre-generated data recognition models, the model trainer 1710-4 may determine a data recognition model having a high correlation between the input training data and the basic training data as the data recognition model to be trained. In this case, the basic training data may be pre-classified for each type of data, and the data recognition model may be pre-generated for each type of data. For example, the basic training data may be pre-classified by various criteria such as the region where the training data is generated, the time when the training data is generated, the size of the training data, the genre of the training data, the generator of the training data, and the type of an object in the training data.
Also, the model trainer 1710-4 may train the data recognition model by using, for example, a training algorithm including error back-propagation or gradient descent.
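A minimal sketch of such training, assuming synthetic data and a one-hidden-layer regressor (nothing here is the disclosure's actual model or data), shows gradient descent with manually back-propagated gradients:

```python
# Minimal gradient-descent / back-propagation sketch on synthetic data.
# Architecture, loss, and data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(size=(256, 3))                        # synthetic "communication state"
y = (X @ np.array([2.0, -1.0, 0.5]) + 0.3)[:, None]   # synthetic target sizes

W1, b1 = rng.normal(scale=0.1, size=(3, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.1, size=(16, 1)), np.zeros(1)
lr = 0.05

for step in range(2000):
    # forward pass
    h_pre = X @ W1 + b1
    h = np.maximum(0.0, h_pre)                 # ReLU
    pred = h @ W2 + b2
    loss = np.mean((pred - y) ** 2)            # mean squared error

    # backward pass (back-propagation of the MSE loss)
    d_pred = 2.0 * (pred - y) / len(X)
    dW2, db2 = h.T @ d_pred, d_pred.sum(axis=0)
    d_h = (d_pred @ W2.T) * (h_pre > 0)
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final training loss: {loss:.4f}")
```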
Also, the model trainer 1710-4 may train the data recognition model, for example, through supervised learning using training data as an input value. Also, for example, the model trainer 1710-4 may train the data recognition model through unsupervised learning for finding a criterion for performing at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time, by self-learning the type of data for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time, without supervision. Also, for example, the model trainer 1710-4 may train the data recognition model through reinforcement learning using feedback on whether the result of determination of the target size of the environment data according to training is correct.
Also, when the data recognition model is trained, the model trainer 1710-4 may store the trained data recognition model. In this case, the model trainer 1710-4 may store the trained data recognition model in the memory 110 of the electronic apparatus 100 including the data recognizer 1720. Alternatively, the model trainer 1710-4 may store the trained data recognition model in the memory of the server 1600 connected to the electronic apparatus 100 through a wired or wireless network.
In this case, the memory storing the trained data recognition model may also store, for example, commands or data related to at least one other element of the electronic apparatus 100. Also, the memory may store software and/or programs. The program may include, for example, a kernel, middleware, an application programming interface (API), and/or an application program (or “application”).
The model evaluator 1710-5 may input evaluation data into the data recognition model and, when the recognition result output from the evaluation data fails to satisfy a certain criterion, may cause the model trainer 1710-4 to perform training again. In this case, the evaluation data may be preset data for evaluating the data recognition model.
For example, the model evaluator 1710-5 may evaluate that the certain criterion is not satisfied when the number or ratio of pieces of evaluation data for which the data recognition model outputs inaccurate recognition results exceeds a preset threshold value. For example, when the certain criterion is defined as an error ratio of 2%, the model evaluator 1710-5 may evaluate that the data recognition model is not suitable when the model outputs incorrect recognition results for more than 20 of a total of 1,000 pieces of evaluation data.
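The 2% criterion described above could be checked as in the following sketch; the evaluation format (a list of correct/incorrect flags) is a hypothetical assumption.

```python
# Sketch of the evaluation criterion: flag the model as unsuitable when the
# ratio of incorrect recognition results exceeds a threshold (2% here).
def model_is_suitable(results_correct: list[bool], max_error_ratio: float = 0.02) -> bool:
    errors = results_correct.count(False)
    return errors / len(results_correct) <= max_error_ratio

# 1,000 evaluation samples with 21 errors -> 2.1% > 2% -> not suitable
flags = [True] * 979 + [False] * 21
print(model_is_suitable(flags))  # False
```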
Moreover, when there are a plurality of trained data recognition models, the model evaluator 1710-5 may evaluate whether each of the trained data recognition models satisfies a certain criterion and determine the model satisfying the certain criterion as a final data recognition model. In this case, when there are a plurality of models satisfying the certain criterion, the model evaluator 1710-5 may determine, as a final data recognition model, any one model or a certain number of models preset in descending order of evaluation scores.
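Selecting a final model (or a preset number of models) from several candidates by evaluation score could look like the following sketch; the score values and model names are hypothetical.

```python
# Sketch: pick the best-scoring candidate model(s) that also pass the criterion.
def select_final_models(scores: dict[str, float], passing: set[str], top_n: int = 1):
    ranked = sorted((m for m in scores if m in passing),
                    key=lambda m: scores[m], reverse=True)
    return ranked[:top_n]

scores = {"model_a": 0.91, "model_b": 0.96, "model_c": 0.88}
print(select_final_models(scores, passing={"model_a", "model_b"}))  # ['model_b']
```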
Moreover, at least one of the data obtainer 1710-1, the preprocessor 1710-2, the training data selector 1710-3, the model trainer 1710-4, or the model evaluator 1710-5 in the data learner 1710 may be manufactured in the form of at least one hardware chip and mounted on the electronic apparatus 100. For example, at least one of the data obtainer 1710-1, the preprocessor 1710-2, the training data selector 1710-3, the model trainer 1710-4, or the model evaluator 1710-5 may be manufactured in the form of a dedicated hardware chip for AI or may be manufactured as a portion of a general-purpose processor (e.g., a CPU or an application processor) or a graphic processor (e.g., a GPU) and mounted on the electronic apparatus 100 described above.
Also, the data obtainer 1710-1, the preprocessor 1710-2, the training data selector 1710-3, the model trainer 1710-4, and the model evaluator 1710-5 may be mounted on one electronic apparatus 100 or may be respectively mounted on separate devices. For example, some of the data obtainer 1710-1, the preprocessor 1710-2, the training data selector 1710-3, the model trainer 1710-4, and the model evaluator 1710-5 may be included in the electronic apparatus 100, and the others may be included in the server 1600.
Also, at least one of the data obtainer 1710-1, the preprocessor 1710-2, the training data selector 1710-3, the model trainer 1710-4, or the model evaluator 1710-5 may be implemented as a software module. When at least one of the data obtainer 1710-1, the preprocessor 1710-2, the training data selector 1710-3, the model trainer 1710-4, or the model evaluator 1710-5 is implemented as a software module (or a program module including instructions), the software module may be stored in a non-transitory computer-readable recording medium. Also, in this case, at least one software module may be provided by an operating system (OS) or may be provided by a certain application. Alternatively, some of at least one software module may be provided by an operating system (OS), and the others may be provided by a certain application.
Referring to
The data obtainer 1720-1 may obtain data for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time, and the preprocessor 1720-2 may preprocess the obtained data such that the obtained data may be used for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. The preprocessor 1720-2 may preprocess the obtained data into a preset format such that the recognition result provider 1720-4 described below may use the obtained data for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time.
The recognition data selector 1720-3 may select, from among the preprocessed data, data for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. The selected data may be provided to the recognition result provider 1720-4. The recognition data selector 1720-3 may select some or all of the preprocessed data according to a preset criterion for at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. Also, the recognition data selector 1720-3 may select data according to a preset criterion based on training by the model trainer 1710-4.
The recognition result provider 1720-4 may apply the selected data to the data recognition model to perform at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. The recognition result provider 1720-4 may provide the recognition result according to the purpose of recognizing data. The recognition result provider 1720-4 may apply the selected data to the data recognition model by using the data selected by the recognition data selector 1720-3 as an input value. Also, the recognition result may be determined by the data recognition model.
The model updater 1720-5 may allow the data recognition model to be updated based on the evaluation of the recognition result provided by the recognition result provider 1720-4. For example, the model updater 1720-5 may allow the model trainer 1710-4 to update the data recognition model, by providing the model trainer 1710-4 with the recognition result provided by the recognition result provider 1720-4.
Moreover, at least one of the data obtainer 1720-1, the preprocessor 1720-2, the recognition data selector 1720-3, the recognition result provider 1720-4, or the model updater 1720-5 in the data recognizer 1720 may be manufactured in the form of at least one hardware chip and mounted on the electronic apparatus 100. For example, at least one of the data obtainer 1720-1, the preprocessor 1720-2, the recognition data selector 1720-3, the recognition result provider 1720-4, or the model updater 1720-5 may be manufactured in the form of a dedicated hardware chip for AI or may be manufactured as a portion of a general-purpose processor (e.g., a CPU or an application processor) or a graphic processor (e.g., a GPU) and mounted on the electronic apparatus 100.
Also, the data obtainer 1720-1, the preprocessor 1720-2, the recognition data selector 1720-3, the recognition result provider 1720-4, and the model updater 1720-5 may be mounted on one electronic apparatus 100 or may be respectively mounted on separate devices. For example, some of the data obtainer 1720-1, the preprocessor 1720-2, the recognition data selector 1720-3, the recognition result provider 1720-4, and the model updater 1720-5 may be included in the electronic apparatus 100, and the others may be included in the server 1600.
Also, at least one of the data obtainer 1720-1, the preprocessor 1720-2, the recognition data selector 1720-3, the recognition result provider 1720-4, or the model updater 1720-5 may be implemented as a software module. When at least one of the data obtainer 1720-1, the preprocessor 1720-2, the recognition data selector 1720-3, the recognition result provider 1720-4, or the model updater 1720-5 is implemented as a software module (or a program module including instructions), the software module may be stored in a non-transitory computer-readable recording medium. Also, in this case, at least one software module may be provided by an operating system (OS) or may be provided by a certain application. Alternatively, some of at least one software module may be provided by an operating system (OS), and the others may be provided by a certain application.
Referring to
In this case, a model trainer 2710 of the server 1600 may perform the function of the data learner 1710 illustrated in
Also, the recognition result provider 1720-4 of the electronic apparatus 100 may apply the data selected by the recognition data selector 1720-3 to the data recognition model generated by the server 1600 to perform at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. For example, the recognition result provider 1720-4 may transmit the data selected by the recognition data selector 1720-3 to the server 1600 and request the server 1600 to apply the data selected by the recognition data selector 1720-3 to the recognition model to perform at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. Also, the recognition result provider 1720-4 may receive, from the server 1600, result information of at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time, which have been performed by the server 1600.
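The request/response split described here (the apparatus transmits selected data, and the server applies the recognition model and returns the result) might look like the following sketch. The transport is mocked with plain function calls, the "model" is a stand-in rule, and all names are hypothetical.

```python
# Hypothetical sketch of the apparatus/server split: the apparatus sends selected
# data, the "server" applies a recognition model and returns the result.
def server_apply_recognition_model(selected_data: dict) -> dict:
    """Server side: stand-in for applying the trained data recognition model."""
    bandwidth = selected_data.get("bandwidth_mbps", 0.0)
    return {"target_size_bytes": int(bandwidth * 50_000)}  # toy mapping

def apparatus_request_recognition(selected_data: dict) -> dict:
    """Apparatus side: 'transmit' the selected data and receive the result."""
    return server_apply_recognition_model(selected_data)

print(apparatus_request_recognition({"bandwidth_mbps": 40.0}))
# -> {'target_size_bytes': 2000000}
```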
Alternatively, the recognition result provider 1720-4 of the electronic apparatus 100 may receive the recognition model generated by the server 1600 from the server 1600 and use the received recognition model to perform at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time. In this case, the recognition result provider 1720-4 of the electronic apparatus 100 may apply the data selected by the recognition data selector 1720-3 to the data recognition model received from the server 1600 to perform at least one operation among determination of the target size of the environment data, selection of the sensor 400 to be turned off, selection of the environment data to be transmitted to the server 200, and determination of the threshold time.
Moreover, the embodiments of the disclosure described above may be written as a program that may be executed in a computer, and the written program may be stored in a medium.
The medium may be any medium that continuously stores a computer-executable program or temporarily stores it for execution or download. Also, the medium may be any recording or storage unit in the form of a single hardware unit or a combination of hardware units; it is not limited to a medium directly connected to a computer system and may be distributed over a network. Examples of the medium include magnetic recording media such as hard disks, floppy disks, and magnetic tapes, optical recording media such as CD-ROMs and DVDs, magneto-optical recording media such as floptical disks, and storage media such as ROMs, RAMs, and flash memories that are configured to store program instructions. Also, as another example, the recording or storage media may be managed by app stores distributing applications or by sites or servers supplying or distributing various other software.
Although the disclosure has been described above in detail with reference to embodiments thereof, the disclosure is not limited to the above embodiments and various modifications and changes may be made therein by those of ordinary skill in the art without departing from the scope of the disclosure.
Number | Date | Country | Kind
---|---|---|---
10-2021-0027429 | Mar. 2, 2021 | KR | national
This application is a Continuation of and claims priority under 35 U.S.C. § 120 to PCT International Application No. PCT/KR2022/002796, which was filed on Feb. 25, 2022, and claims priority to Korean Patent Application No. 10-2021-0027429, filed on Mar. 2, 2021, in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entirety.
Relation | Number | Date | Country
---|---|---|---
Parent | PCT/KR2022/002796 | Feb. 25, 2022 | US
Child | 17684738 | | US