The present disclosure relates to an embedded controller in a vehicle, and more particularly, to use of an embedded controller to secure functionality of the vehicle.
Modern vehicles typically include a multitude of processors which control disparate functionality of the vehicles. For example, a vehicle may include a processor which controls a display positioned within the vehicle. In this example, the display may present an interface for use by a driver to view information relevant to operation of the vehicle. As an example, the display may allow for adjustment of a radio or sound system. As another example, the display may present a map associated with a present location of the vehicle.
Certain processors, such as the above-described display processor, may be accessible from the outside world. For example, the display processor may obtain information over a wireless network (e.g., a cellular network) for inclusion in the presented interface. Example information may include traffic information, map information, and so on. Since the processor is responsive to information transmitted from outside of the vehicle, there is a security risk that a malicious actor may tamper with the operation of the processor. For example, the malicious actor may be able to control the in-vehicle display.
As may be appreciated, as vehicles become more complex and connected with the outside world, the risk of tampering with the operation of the vehicles increases. For example, a controller area network (CAN bus) in a vehicle may be remotely accessible over a wireless network. In this example, acceleration, braking, and so on, may be subject to tampering by malicious attacks.
In some embodiments, a method implemented by an embedded controller is described. The method includes receiving, from a display positioned in a vehicle, user input directed to a user interface presented via the display, wherein the user interface is rendered by an infotainment system included in the vehicle; identifying, based on the user input, a gear shift request associated with adjusting a propulsion direction of the vehicle; and updating the user interface to include a static image associated with the gear shift request, wherein the embedded controller provides information associated with the static image to a timing controller of the display, wherein the embedded controller is in communication with a propulsion system which controls the propulsion direction of the vehicle, and wherein the embedded controller routes the gear shift request to the propulsion system.
In some embodiments, a method implemented by a vehicle processor system is described, with the vehicle processor system including an embedded controller and an infotainment system, and with the vehicle processor system being configured to present a user interface for presentation via a display of a vehicle. The user interface presents a first portion, the first portion including a static image indicative of a currently selected gear, the currently selected gear being associated with a particular propulsion direction, wherein the static image is provided via the embedded controller to a timing controller of the display; presents a second portion, the second portion including a dynamic user interface associated with disparate vehicle functionality, the dynamic user interface being rendered by the infotainment system; and responds to user input provided to the first portion associated with changing the currently selected gear, wherein the user input is routed by the display to the embedded controller, and wherein the embedded controller generates a gear change request for transmission to a propulsion system.
In some embodiments, a vehicle processor system is described. The vehicle processor system includes an embedded controller, the embedded controller configured to receive touch input information from a display of a vehicle, the touch input information reflecting interactions with the display by a person in the vehicle; and an infotainment system, the infotainment system being in communication with the embedded controller and configured to receive the touch input information from the embedded controller, wherein the infotainment system is configured to render a dynamic user interface for presentation via the display, wherein the embedded controller is configured to cause presentation of a static image reflecting a currently selected gear of the vehicle, and wherein the embedded controller provides information associated with the static image to a timing controller of the display.
In some embodiments, a vehicle is described. The vehicle includes an electric motor; a battery pack; a display in communication with a vehicle processor system; and the vehicle processor system comprising: an embedded controller, the embedded controller configured to receive touch input information from the display, the touch input information reflecting interactions with the display by a person in the vehicle; and an infotainment system, the infotainment system being in communication with the embedded controller and configured to receive the touch input information from the embedded controller, wherein the infotainment system is configured to render a dynamic user interface for presentation via the display, wherein the embedded controller is configured to cause presentation of a static image reflecting a currently selected gear of the vehicle, and wherein the embedded controller provides information associated with the static image to a timing controller of the display.
In some embodiments, a method is described. The method includes presenting, via a display positioned in a vehicle, a user interface which includes a visual representation of a vehicle operational parameter; determining a checksum value associated with the visual representation, wherein the checksum value is based on pixel information which forms the visual representation; determining that the determined checksum value is different from a known checksum value associated with a display of the vehicle operational parameter; and taking remedial action, wherein the remedial action comprises updating the user interface.
Embodiments of the present disclosure and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures, wherein showings therein are for purposes of illustrating embodiments of the present disclosure and not for purposes of limiting the same.
This specification describes techniques for secure operation of a vehicle, such as an electric vehicle. As will be described, a display (e.g., a touch-sensitive display) included in a vehicle may be used by a driver to adjust driving functionality of the vehicle. For example, the driver may cause a change to a vehicle operating parameter, such as a current gear, current heating, ventilation, and air conditioning (HVAC) settings, and so on. With respect to the example of a gear change or gear shift, the vehicle may adjust the current gear between drive, reverse, park, neutral, and so on. The current gear may represent a gear associated with a transmission or a propulsion direction (e.g., with respect to an electric vehicle). To ensure that a malicious attacker is unable to improperly adjust the vehicle's current gear, for example via malicious compromise of the display, this specification describes separation between (1) a processor or computer usable to present a user interface via the display and (2) an embedded controller usable to effectuate adjustment of the current gear. The processor or computer (herein referred to as the infotainment system) may, in some embodiments, be responsive to information received via a wireless connection (e.g., a cellular network) from the outside world. In contrast, the embedded controller may be blocked (e.g., firewalled) from the outside world. In this way, the vehicle may allow for the ease of use of adjusting a current gear while ensuring strict security.
To reduce the complexity associated with operating vehicles, it may be advantageous to remove at least a portion of the plethora of physical controls included in a vehicle. An example physical control may include a control to change propulsion direction. An autonomously or semi-autonomously operated vehicle may intelligently determine which propulsion direction is suitable during operation of the vehicle. In this example, the vehicle may determine that when in a driver's garage, the vehicle needs to be in reverse to back out of the garage. Subsequently, the vehicle may then determine that the vehicle should be placed into a drive mode to navigate towards a driver's destination. Thus, the physical control may be removed without detriment to a user experience of operating a vehicle.
As may be appreciated, discarding physical controls may additionally simplify manufacturing of a vehicle. For example, a physical control to change propulsion direction may typically be a stalk positioned proximate to a steering wheel or a gear shifter positioned to the right of the driver (e.g., in right-side driving regions). In this example, the physical control may be removed, and the functionality associated with the control instead be autonomously determined.
While autonomous vehicles provide benefits with respect to operation, at present they do not represent a substantial number of vehicles. However, the above-described physical control to adjust propulsion direction may still be removed and its functionality be instead moved to a software-defined control. For example, a display included in a vehicle may present a user interface which enables adjusting a current gear. In this example, the user interface may respond to user input associated with changing the current gear (e.g., from park to drive, from reverse to drive, and so on). Thus, a driver may provide simple user input to a centralized display in contrast to manipulating a physical input. Over time, for example as autonomous software becomes more commonplace, the vehicle may allow for autonomous operation of propulsion direction or current gear. The above-described vehicle thus includes the benefits of simplified manufacturing while also preserving a driver's ability to manually control the vehicle's gear setting.
As described above, the display of a vehicle may typically be controlled by a processor which is responsive to information provided from the outside world. For example, as the vehicle traverses a real-world environment, the processor may overlay information on a map included in a displayed user interface. In this example, the overlaid information may indicate a route the driver is to follow, traffic information, upcoming hazards, and so on. This information may be obtained using a wireless network, such as a cellular network, or any other network to which the vehicle connects.
Typically, the user interface presented on vehicle displays may not be considered to be at high risk of compromise (e.g., by a malicious attacker). For example, a vehicle display typically presents information relevant to operation of the vehicle but lacks functionality to directly control driving of the vehicle. In this example, the display may therefore represent a convenience for use by the driver while actual driving functionality (e.g., steering, acceleration, braking, gear changes, and so on) may be effectuated elsewhere in the vehicle (e.g., using physical controls, using touch controls located in a separate location, autonomously by the vehicle, and so on). Additionally, attacking a vehicle over a cellular network presents tremendous technological hurdles. For example, the malicious attacker may need to find exploits in software which controls the user interface. In this example, the exploits would need to be reachable by software which is responsive to information provided by the cellular network. As may be appreciated, an attacker may also reach the exploits through various other techniques via a wireless or wired data connection.
Thus, while at present there have been limited examples of compromise by malicious attackers outside of a research setting, as more sophisticated vehicle controls are moved from physical controls to software-defined controls the vehicle's security posture may need to be improved. While the description herein focuses on software-defined controls to adjust a current gear, it may be appreciated that other vehicle controls may fall within the scope herein. For example, functionality to honk, adjust lights, turn on an emergency brake, set a cruise mode, and so on, may be controllable by software-defined controls.
To enhance security, this application describes use of a secure embedded controller which initially receives user input provided by a driver via a display of a vehicle. For example, the embedded controller may receive touch-based input information representing the driver's presses, swipes, gestures, and so on provided to a user interface. With respect to adjusting a current gear, the embedded controller may analyze received user input and determine whether the driver intends to adjust the current gear. Adjusting the current gear may include a change in the current propulsion direction. Upon a positive determination, the embedded controller may then transmit a gear change request (e.g., via a CAN bus or other messaging protocol/bus) to a processor or system associated with adjusting the current gear (hereinafter referred to as a propulsion system). The user input may additionally be forwarded to one or more processors which render a dynamic user interface for presentation on the display (hereinafter referred to as an infotainment system).
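As a non-limiting illustration, the above-described flow may be sketched in C as follows. The primitive functions (touch_read, gear_intent, can_send_gear_request, infotainment_forward) are hypothetical placeholders for platform-specific drivers and are not part of this disclosure:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical touch event as reported by the display (e.g., over I2C). */
typedef struct {
    uint16_t x;    /* horizontal pixel coordinate of the touch */
    uint16_t y;    /* vertical pixel coordinate of the touch   */
    uint8_t  type; /* press, release, swipe, gesture, ...      */
} touch_event_t;

/* Assumed platform primitives; names and signatures are illustrative. */
extern bool touch_read(touch_event_t *ev);                       /* poll the display for input       */
extern bool gear_intent(const touch_event_t *ev, uint8_t *gear); /* detect a gear-change intent      */
extern void can_send_gear_request(uint8_t requested_gear);       /* request to the propulsion system */
extern void infotainment_forward(const touch_event_t *ev);       /* forward input for rendering      */

/* The embedded controller sees every touch first, acts on gear-change
 * intents itself, and only then forwards the raw event to the
 * infotainment system; the infotainment system never transmits on the
 * bus used for gear change requests. */
void input_dispatch_loop(void)
{
    touch_event_t ev;
    for (;;) {
        if (!touch_read(&ev))
            continue;
        uint8_t gear;
        if (gear_intent(&ev, &gear))
            can_send_gear_request(gear);
        infotainment_forward(&ev);
    }
}
```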
As will be described, the embedded controller may be isolated from the outside world. For example, the embedded controller may disallow arbitrary wireless communications. In contrast, wireless communications may be limited to the infotainment system. It may be appreciated that modern vehicle user interfaces increasingly include disparate functionality which rely upon a network connection. Such a network connection may be directly connected to the infotainment system, passed through a device not directly connected to the vehicle (e.g., a cellular telephone), or accessed in any other suitable manner. As an example, and as described above, navigation information may rely upon a network connection. As another example, streaming audio applications may rely upon a network connection to stream a driver's preferred audio. Thus, the infotainment system may need the ability to receive network information. Since the infotainment system is accessible from the outside world, there is an increased likelihood of the system being compromised.
The separation between the embedded controller and infotainment system enhances the security posture of the vehicle while also maintaining the above-described modern infotainment functionality. For example, the infotainment system may be disallowed from effectuating vehicle control changes (e.g., gear changes). As another example, the infotainment system may be disallowed from providing information to the embedded controller. Instead, the embedded controller may act as a gateway to the infotainment system thus ensuring that a malicious attacker has no path to compromising driving functionality through malicious control of the infotainment system.
In addition to ensuring secure gear changes, the embedded controller may output information for presentation via the display. For example, and as will be described, the embedded controller may update the user interface by directly providing visual information (hereinafter referred to as static images) to an element included in the display. An example element includes the timing controller which is used to drive the display. In this way, the embedded controller may bypass the infotainment system which may otherwise render the user interface.
An example static image may include a current gear setting. Thus, even if the infotainment system is compromised the display will still reflect an accurate gear setting. For example, if a driver provides user input to adjust the current gear, the embedded controller may provide one or more static images to the timing controller which reflect the adjusted gear. In this example, the driver may adjust the current gear from park to drive. Thus, the embedded controller may cause the user interface to present a static image indicating the current gear of drive.
The user interface presented to the driver may therefore include a first portion which includes static images from the embedded controller and a second portion which is rendered by the infotainment system. Thus, in addition to being blocked from changing vehicle controls, a malicious attacker may additionally be blocked from improperly manipulating the above-described first portion of the user interface. In this way, the driver may rely upon the first portion as providing information which is not able to be compromised.
While the figures below describe an example of changing a gear, as may be appreciated the techniques described herein can be used for other vehicle operational parameters. For example, static images may be used with respect to HVAC controls. As another example, static images may be used to display a current speed of the vehicle. These static images may be updated as the vehicle's speed is adjusted (e.g., in substantially real-time).
In the illustrated example, the user interface 116 includes a first portion 118A and a second portion 118B. The first portion 118A may be associated with control of the vehicle by a user. For example, the first portion 118A may be used to present a current gear setting of the vehicle and to allow for adjustment of the current gear. The second portion 118B may instead be used to control navigation, audio, heating, ventilation, and air conditioning (HVAC), and so on. Thus, the first portion 118A may enable control of sensitive aspects of the vehicle (e.g., gear changes) which affect the driving of the vehicle while the second portion 118B may be usable to control aspects of the vehicle which do not directly affect driving. In some embodiments, the first portion 118A may depict HVAC controls/information, speed of the vehicle, critical alerts, autonomous alerts/information, and so on.
To allow for the above-described separation, the vehicle processor system 100 includes an embedded controller 120 and an infotainment system 130. The embedded controller 120 may, as an example, be a microcontroller, a processor, an application specific integrated circuit (ASIC), and so on. As will be described, the embedded controller 120 may provide static image information 122 to the display 110 which is usable to update the first portion 118A. The infotainment system 130 may be a computer, one or more processors, and so on. Similar to the above, the infotainment system 130 may provide dynamic user interface information 132 which is usable to update at least the second portion 118B.
The infotainment system 130 may execute applications, software, and so on, which, as described above, are associated with entertainment, navigation, control of non-driving aspects of the vehicle (e.g., HVAC), and so on. For example, the infotainment system 130 may enable disparate functionality to be performed via interaction with the user interface 116. In some embodiments, the infotainment system 130 may be associated with an online application store which allows for a driver or passenger to execute preferred applications (e.g., ‘apps’). At least a portion of the disparate functionality may use a network connection of the vehicle (e.g., a cellular network). For example, audio may be streamed via the network connection. Thus, the applications, software, and so on, which execute via the infotainment system 130, may be allowed to provide and receive information over the network connection.
In contrast, in some embodiments the embedded controller 120 may be constrained from providing or receiving information over the network connection. In this way, the embedded controller 120 may be configured to not be accessible from the outside world. The embedded controller 120 may, as an example, be accessible through a physical network connection which may be isolated from other internal networks (e.g., the CAN bus), thereby limiting access. As may be appreciated, this inaccessibility may reduce or eliminate a likelihood of a malicious attacker being able to tamper with the embedded controller and thus driving functionality of the vehicle (e.g., gear changes).
During operation of the vehicle, a driver or passenger may interact with the user interface 116 by providing touch-based input. For example, the display 110 may represent a touch-sensitive display. An example interaction may include adjusting an HVAC setting to increase, or reduce, a temperature within a cabin of the vehicle. Another example interaction may include adjusting selection of audio via interaction with a streaming audio application being executed by the infotainment system 130. Another example interaction may include adjusting a current gear of the vehicle (e.g., from reverse to drive, from drive to park, and so on).
The above-described touch-based input may be provided to the vehicle processor system 100 as input information 112. Advantageously, the input information 112 may be routed to the embedded controller 120 by the display 110 (e.g., via a connection, such as an I2C connection). In this way, the input information 112 may be analyzed by the embedded controller 120 to determine whether the input information 112 reflects an intent to change a current gear of the vehicle. As an example, the input information 112 may reflect the driver interacting with the first portion of the user interface 118A to adjust the current gear. The embedded controller 120 may thus determine that the driver intends to adjust the current gear.
In response, the controller 120 may transmit a gear change request 124 to the propulsion system 140 to effectuate the adjustment. The propulsion system 140, as described herein, may represent a system or processor which controls a gear setting of the vehicle. An example of the embedded controller 120 analyzing input information 112 to determine a gear change is described in more detail below.
In some embodiments, the embedded controller 120 may analyze the input information 112 if it is associated with the first portion 118A. For example, the first portion 118A may be associated with certain pixels of the user interface 116. As another example, the first portion 118A may be defined by one or more boundaries. The embedded controller 120 may thus analyze the input information 112 if at least one touch event (e.g., a press, swipe, gesture, such as over a threshold time period) is included within the defined boundaries. Advantageously, and as will be expanded on below, this analysis may occur before the input information 112 is provided to the infotainment system 130.
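As a non-limiting illustration, the boundary test described above may be implemented as a simple rectangle check; the coordinates below are hypothetical placeholders for the actual extent of the first portion 118A:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical pixel boundaries of the first portion 118A. */
#define PORTION_A_X0 1600u
#define PORTION_A_Y0 40u
#define PORTION_A_X1 1900u
#define PORTION_A_Y1 400u

/* Returns true when a touch falls within the first portion, in which
 * case the embedded controller analyzes the event for a gear-change
 * intent before forwarding it to the infotainment system. */
static bool in_first_portion(uint16_t x, uint16_t y)
{
    return x >= PORTION_A_X0 && x < PORTION_A_X1 &&
           y >= PORTION_A_Y0 && y < PORTION_A_Y1;
}
```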
The input information 112 may then be routed to the infotainment system 130. As may be appreciated, a driver or passenger may more routinely interact with the display to adjust functionality not related to driving of the vehicle. For example, the driver or passenger may interact with a map presented in the user interface 116. In this example, the interaction may include selecting a destination, zooming in or out of the map, and so on. The infotainment system 130 may therefore analyze the input information 112 to update the user interface 116. For user input associated with the first portion 118A, the infotainment system 130 can discard the user input. For user input associated with the second portion 118B, the infotainment system 130 may use the input to update rendering of the user interface.
Advantageously, transmission of the gear change request 124 may be limited to the embedded controller 120. Thus, the infotainment system 130 may lack an ability to communicate with, or provide requests to, the propulsion system 140. In this way, a malicious attacker may be unable to interface with an element of the vehicle which controls gear changes.
In the illustrated example, the vehicle processor system 100 is providing static image information 122 and dynamic user interface information 132 to the display. The dynamic user interface information 132 may be rendered via the infotainment system 130, such as via a graphics processing unit, a processor, or from a computer memory, and may reflect an interactive user interface. For example, the dynamic user interface information 132 may be provided to an input of the display 110 via DisplayPort, high-definition multimedia interface (HDMI), and so on. The dynamic user interface information 132 may thus allow for complex animated graphics and user interface elements to be presented via the display 110.
The static image information 122 may include an image, or selection of an image, which is to be presented in the first portion 118A of the user interface 116. An example image may include a representation of different gear settings in which the vehicle may be placed (e.g., drive, reverse, park, neutral) along with an indication of a currently selected gear. To ensure that the user interface 116 accurately reflects the currently selected gear, the embedded controller may directly provide the static image information 122 to the display. For example, the static image information 122 may be provided to a timing controller 114 of the display 110. As may be appreciated, the timing controller may set drivers of the display 110 which are usable to cause output of light which forms the user interface 116. Thus, the static image information 122 may cause the timing controller 114 to directly set pixel values of the display 110. In this way, a static image may be overlaid on the user interface 116.
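As a non-limiting illustration, directly setting pixel values may amount to copying the static image into the region of the frame scanned out for the first portion 118A. The packed RGB (3 bytes per pixel) layout and function name below are assumptions for the sketch:

```c
#include <stdint.h>
#include <string.h>

/* Copy a stored static image into the frame region corresponding to the
 * first portion 118A, overwriting whatever the infotainment system
 * rendered there. Assumes a packed RGB (3 bytes/pixel) frame layout. */
void overlay_static_image(uint8_t *frame, uint16_t frame_w,
                          const uint8_t *image, uint16_t img_w,
                          uint16_t img_h, uint16_t dst_x, uint16_t dst_y)
{
    for (uint16_t row = 0; row < img_h; row++) {
        uint8_t *dst = frame +
            3u * ((uint32_t)(dst_y + row) * frame_w + dst_x);
        memcpy(dst, image + 3u * (uint32_t)row * img_w, 3u * img_w);
    }
}
```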
The embedded controller 120 may optionally output static image information 122 for inclusion in the first portion 118A during operation of the vehicle. Thus, the embedded controller 120 may output static image information 122 such that the user interface 116 includes one or more static images in every frame presented via the display 110.
In some embodiments, the embedded controller 120 may output static image information 122 until detection of the gear change request 124. For example, the embedded controller 120 may output static image information 122 reflecting that the vehicle is in a first gear. In this example, the first portion 118A may include an image reflecting the first gear. Upon determining that the driver intends to change gears, the embedded controller 120 may cease outputting static image information 122. The infotainment system 130 may then render an animation reflecting adjustment from the first gear to a second gear for inclusion in the first portion 118A. After the animation, the embedded controller 120 may output static image information 122 which causes the first portion 118A to indicate that the second gear is currently selected.
In some embodiments, the embedded controller 120 may output static image information 122 for a threshold amount of time after determining the gear change request 124. For example, the user interface 116 may be rendered based on dynamic user interface information 132 prior to the gear change request 124. In this example, the infotainment system 130 may render both the first portion 118A and the second portion 118B. Thus, the infotainment system 130 may render information reflecting a current gear. The embedded controller 120, as described above, may analyze input information 112 and determine that a driver intends to adjust the vehicle's gear to a subsequent gear. The embedded controller 120 may then output a static image reflecting the subsequent gear for at least the threshold amount of time.
As described above, in some embodiments the first portion 118A may be rendered, for at least a portion of the time, by the infotainment system 130. For example, the infotainment system 130 may cause the first portion 118A to present an animation reflecting a gear change. As another example, the infotainment system 130 may render visual information reflecting a current gear in the first portion 118A for a threshold amount of time. In these embodiments, the infotainment system 130 may avoid rendering visual information for inclusion in the first portion 118A when the embedded controller 120 is providing static image information 122. Optionally, the system 130 may render a particular color background (e.g., gray) and the static image information 122 may be overlaid over the particular color background. The embedded controller 120 may optionally provide information to the infotainment system 130 indicating times at which it is outputting static image information 122. Thus, the infotainment system 130 may avoid providing dynamic user interface information 132 which conflicts with the static image information 122.
Additionally, the user interface 116 may, in some embodiments, always present static images. For example, the timing controller 114 may output a static image reflecting a current gear. Upon the driver selecting a new gear, or the gear being autonomously selected, the user interface 116 may output an animation reflecting the change. The display 110 may continue outputting a static image (e.g., reflecting the old or new gear); however, the static image may be presented with an alpha value of 0. Thus, the static image may be transparent such that the animation is visible to the driver. Subsequent to the animation, the static image may be updated to have an alpha value of 1 such that the static image is visible. In this way, even if the animation were to be compromised by a malicious attacker, the static image will automatically be presented with an alpha value of 1 to override any incorrect or improper animation.
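As a non-limiting illustration, the alpha toggling described above may reduce to two small commands to the timing controller. The primitive below is hypothetical; an 8-bit alpha of 0 maps to a fully transparent overlay and 255 to a fully opaque one (corresponding to the alpha values of 0 and 1 above):

```c
#include <stdint.h>

/* Assumed primitive: sets the alpha applied by the timing controller to
 * the overlaid static image (0 = transparent, 255 = opaque). */
extern void timing_controller_set_overlay_alpha(uint8_t alpha);

/* During a gear-change animation the trusted overlay is transparent so
 * the infotainment-rendered animation shows through. */
void begin_gear_animation(void) { timing_controller_set_overlay_alpha(0u); }

/* Afterwards the overlay is made opaque again, overriding any incorrect
 * or improper animation. */
void end_gear_animation(void) { timing_controller_set_overlay_alpha(255u); }
```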
Thus, the embedded controller 120 may provide added security while the infotainment system 130 may maintain the ability to flexibly render at least a portion of the user interface 116. For example, the infotainment system 130 may be rapidly updated and improved upon via software updates (e.g., over the air updates). Since the infotainment system 130 is removed from actual control of driving aspects of the vehicle (e.g., gear changes), there is limited ability for any of the rapid improvements to negatively affect control of the vehicle.
The memory 202 may store the static images 208 as pixel values (e.g., red, green, and blue values). Optionally, the memory 202 may store the static images 208 as being associated with a portion of the user interface 204 in which they are to be included (e.g., specific pixels of the user interface 204).
To cause inclusion of static image 206 in user interface 204, the static image information 122 may reflect a selection of the static image 206 from the stored static images 208. As an example, the embedded controller may determine that a driver intends to select a particular gear (e.g., park, represented as ‘P’ in the example). For example, the driver may have provided user input to select the ‘P’ symbol in user interface 204. The embedded controller may provide the request to the propulsion system, and if the request is granted, the controller may cause updating of the static image 206. The embedded controller may then provide information 122 which identifies selection of the static image 206. As an example, each of the static images 208 may be associated with an identifier such that the static image information 122 may include a particular identifier.
The selected static image 206 may then be provided to the timing controller 114 for inclusion in user interface 204. As may be appreciated, the static images 208 may be stored in the memory 202 of the timing controller 114, transmitted along with the static image information 122 from the embedded controller 120, or may be stored in any other suitable location where the static image information 122 may be provided to, or otherwise accessed by, the timing controller 114. In some embodiments, the timing controller 114 may output the selected static image 206 until the embedded controller determines that the driver intends to change the gear. In some embodiments, the timing controller 114 may output the selected static image 206 for a threshold amount of time (e.g., 5 seconds, 10 seconds, one minute, one hour). After the threshold amount of time, the infotainment system 130 may resume rendering visual information for inclusion in the user interface 204.
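As a non-limiting illustration, selecting among stored static images may require only a small identifier-carrying command from the embedded controller to the timing controller. The identifiers and primitive below are hypothetical:

```c
#include <stdint.h>

/* Hypothetical identifiers for static images stored in the timing
 * controller's memory 202. */
enum static_image_id {
    IMG_GEAR_P = 0, /* park    */
    IMG_GEAR_R = 1, /* reverse */
    IMG_GEAR_N = 2, /* neutral */
    IMG_GEAR_D = 3, /* drive   */
};

/* Assumed primitive: sends a command (e.g., over I2C or SPI) telling the
 * timing controller which stored image to overlay. */
extern void timing_controller_select_image(uint8_t image_id);

/* Called once the propulsion system grants a gear change request. */
void show_current_gear(enum static_image_id id)
{
    timing_controller_select_image((uint8_t)id);
}
```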
The user interface 204 may additionally present an animation associated with a gear change. For example, upon selection of a new gear (e.g., park as described above), the user interface 204 may render an animation reflecting the change. After a threshold amount of time, the user interface 204 may present the selected static image 206 associated with park. Additionally, and as described above, in some embodiments a static image may always be presented and the alpha value associated with the static image (e.g., associated with each pixel or the entire image) may be toggled between 0 and 1. Thus, the static image may be substantially invisible to the driver during an animation and then toggled to be subsequently visible.
At block 302, the system receives user input from a display included in a vehicle. The display may be a touch-sensitive display which is positioned at a front of the vehicle (e.g., the display 110 described above).
The display may thus output information reflecting the driver's interaction with the user interface. For example, the display may provide information identifying locations in the user interface which the driver touched, specific gestures recognized by a processor associated with the display, and so on. As described above, this information may be routed to the embedded controller for analysis.
At block 304, the system identifies a gear shift request based on the user input. As described herein, a vehicle operational parameter, such as a gear shift, may be adjusted via user input. The description below relates to adjusting the current gear; however, the description may be applied to other vehicle operational parameters.
The user interface, such as the user interface 204 described above, may present selectable representations of the gears into which the vehicle may be placed (e.g., park, reverse, neutral, drive). The system may identify the gear shift request based on user input directed to one of the representations (e.g., a press of the ‘P’ symbol to select park).
A different user interface may additionally be used to identify the gear shift request. For example, and as described below, a user interface may include a visual element which the driver moves upwards to select drive or downwards to select reverse.
At block 306, the system updates the user interface to present a static image associated with the gear shift request. As described above, the embedded controller may provide static image information to the timing controller of the display such that the presented static image reflects the updated gear.
In some embodiments, the system may initially present an animation associated with changing of the gear. For example, and as described above, the infotainment system may render an animation reflecting adjustment from the prior gear to the requested gear, after which the static image associated with the requested gear may be presented.
At block 308, the system routes the gear shift request to a propulsion system. To effectuate the gear change to the updated gear, the embedded controller provides a request to a propulsion system to update the currently selected gear. The propulsion system may represent a processor, controller, and so on, which adjusts a propulsion direction of the vehicle. For example, a vehicle with a transmission may be updated to reflect the updated gear. As another example, an electric vehicle may be updated such that its motor rotates in a particular direction associated with the updated propulsion direction.
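As a non-limiting illustration, a gear shift request routed over a CAN bus may be a single-byte payload. The arbitration identifier and gear encoding below are hypothetical:

```c
#include <stdint.h>
#include <string.h>

#define GEAR_REQUEST_CAN_ID 0x101u /* hypothetical arbitration ID */

typedef struct {
    uint32_t id;      /* arbitration identifier */
    uint8_t  dlc;     /* data length code       */
    uint8_t  data[8]; /* payload                */
} can_frame_t;

extern void can_transmit(const can_frame_t *frame); /* assumed CAN driver */

/* Route a gear shift request from the embedded controller to the
 * propulsion system; only the embedded controller may transmit this. */
void send_gear_shift_request(uint8_t requested_gear)
{
    can_frame_t frame;
    memset(&frame, 0, sizeof frame);
    frame.id = GEAR_REQUEST_CAN_ID;
    frame.dlc = 1u;
    frame.data[0] = requested_gear; /* e.g., 0 = P, 1 = R, 2 = N, 3 = D */
    can_transmit(&frame);
}
```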
At block 402, the system causes presentation of a dynamic user interface via a display. As described above, the infotainment system may render the dynamic user interface for presentation via a display of the vehicle.
At block 404, the system monitors a heartbeat between the embedded controller and the infotainment system. To detect a crash or failure of the infotainment system, the embedded controller may receive a heartbeat from the infotainment system. The heartbeat may be a constant signal provided to the embedded controller by the infotainment system. The heartbeat may also represent a periodic signal which is provided to the embedded controller by the infotainment system. The heartbeat may optionally include information reflecting proper operation of the infotainment system.
At block 406, the system determines a fault associated with the infotainment system. The embedded controller determines that the infotainment system is malfunctioning based on lack of a received heartbeat, a received heartbeat that differs from what is expected by the embedded controller, or information included in the heartbeat indicating a failure. For example, the infotainment system may crash such that the heartbeat fails to be provided to the embedded controller. As another example, the infotainment system may suffer an error or fault and include information in the heartbeat signal (e.g., information related to such error or fault). Certain errors or faults may interrupt the proper rendering of the dynamic user interface.
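As a non-limiting illustration, heartbeat monitoring may be implemented as a timestamp refreshed on each received heartbeat and a periodic timeout check. The timeout value and clock primitive below are assumptions:

```c
#include <stdbool.h>
#include <stdint.h>

#define HEARTBEAT_TIMEOUT_MS 500u /* hypothetical timeout */

extern uint32_t millis(void); /* assumed monotonic millisecond clock */

static uint32_t last_heartbeat_ms;

/* Called whenever a heartbeat message arrives from the infotainment
 * system; the message body may also carry fault information. */
void on_heartbeat_received(void)
{
    last_heartbeat_ms = millis();
}

/* Polled periodically by the embedded controller; a missing or stale
 * heartbeat is treated as an infotainment fault. Unsigned subtraction
 * remains correct across counter wraparound. */
bool infotainment_faulted(void)
{
    return (millis() - last_heartbeat_ms) > HEARTBEAT_TIMEOUT_MS;
}
```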
At block 408, the system causes presentation of static images associated with the current gear. Since the infotainment system renders the dynamic user interface which may substantially fill the display, an error or failure of the system may result in the user interface not being rendered or being rendered improperly. To ensure that the display presents information related to driving of the vehicle, the embedded controller may cause the display to present a static image associated with a current gear of the vehicle. The static image may be presented with an alpha value which causes the static image to be presented (e.g., a value of 1). In some embodiments, the embedded controller may additionally render a static image indicating a measure of gas or stored energy left in the vehicle. In some embodiments, the embedded controller may additionally render, and update, a speed at which the vehicle is traveling. The static image may optionally be present on the display at all times, including when the infotainment system is functioning, with an alpha value of 0, such that when the infotainment system is functioning some or all images rendered by the embedded controller are hidden from the user.
In some embodiments, the static image may include pixels which form a representation of a current gear. Additionally, the static image may include pixels surrounding the representation. For example, the surrounding pixels may be a particular color to ensure that the representation is visible. As an example, the surrounding pixels may be white, gray, and so on. Thus, the representation may be presented darker (e.g., black, dark gray) and even if the display is presenting nothing else (e.g., such that the background would otherwise be black) the representation may be visible.
In the illustrated example, the vehicle processor system 100 is providing static image information 122 and dynamic user interface information 132 as described herein. Thus, the user interface 504 includes a static image 506 along with the dynamically rendered user interface 508. In some embodiments, the infotainment system 130 may render the user interface 504. For example, and as described above, the infotainment system 130 may render visual information reflecting a current gear for inclusion in the user interface 504 for at least a portion of the time.
In embodiments in which the infotainment system 130 was rendering the user interface 504, the failure of the infotainment system 130 may cause the entire user interface 504 to be removed. Upon detecting a lack of the heartbeat signal 502, the embedded controller 120 may thus output static image 506. For example, the embedded controller 120 may store information identifying a current gear. As another example, the embedded controller 120 may request the current gear from the propulsion system described above.
The above-described embedded controller may analyze the received user input and determine that the driver intends to change gears. In some embodiments, an animation may be presented reflecting adjustment of the gear. For example, the animation may depict the visual element 602 moving upwards when the selected gear is drive. Subsequently, the embedded controller may output a static image of the visual element 602 being further up in the user interface 600. As another example, the animation may depict the visual element 602 moving downwards when the selected gear is reverse. Subsequently, the embedded controller may output a static image of the visual element 602 being further down in the user interface 600.
At block 802, the system causes the infotainment system 130 to present a visual representation of a vehicle operational parameter. As described above, the visual representation may include, for example, an indication of a currently selected gear presented via the display 110.
At block 804, the system (e.g., the embedded controller) determines a checksum value based on pixel information which forms the visual representation of the vehicle operational parameter (e.g., pixel values of the visual representation, such as red, green, and blue values). The checksum value may be determined using, as a few examples, a cyclic redundancy check, parity byte algorithm, frame check sequence, and so on. The checksum value may be determined using an error-detecting code. The vehicle operational parameter may be visually represented on the display 110 (e.g., the static image 506 described above). In one example, the timing controller 114 may provide the pixel information forming the visual representation to the embedded controller 120.
In another example, the embedded controller 120 may access the pixel information related to the vehicle operational parameter being displayed directly from the display 110 (e.g., from an HDMI controller, USB controller, special purpose image data decoder, and so on). In another example, the infotainment system 130 may be in control of the display 110 and directly transmit the pixel values to both the display 110 and the embedded controller 120, such that the embedded controller may directly monitor the display information output by the infotainment system 130 as it is received by the display.
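As a non-limiting illustration, one of the named error-detecting codes, a cyclic redundancy check, may be computed over the pixel buffer as follows (a bitwise CRC-32 with the reflected polynomial 0xEDB88320; any of the other codes mentioned above could be substituted):

```c
#include <stddef.h>
#include <stdint.h>

/* Compute a CRC-32 checksum over the RGB pixel buffer which forms the
 * visual representation of the vehicle operational parameter. */
uint32_t pixel_crc32(const uint8_t *pixels, size_t len)
{
    uint32_t crc = 0xFFFFFFFFu;
    for (size_t i = 0; i < len; i++) {
        crc ^= pixels[i];
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 1u) ? (crc >> 1) ^ 0xEDB88320u : crc >> 1;
    }
    return crc ^ 0xFFFFFFFFu;
}
```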
At block 806, the system accesses a known checksum value associated with the displayed vehicle operational parameter. The known checksum value may be stored in a memory 202 of the timing controller 114, a memory of the embedded controller 120, and so on. The known checksum value may also be calculated in real-time by the embedded controller 120, timing controller 114, and so on. For example, to calculate a known checksum value in real-time, the embedded controller 120 may receive static information (e.g., the static image information 122 described above) and determine the known checksum value from the pixel information associated with that static information.
At block 808, the system compares the checksum value for the visual representation to the known checksum value for that operational parameter accessed in block 806.
At block 810, the system takes a remedial action (e.g., displaying an error message on the display, displaying a warning on the display, turning off the infotainment system, turning off the display, changing the current gear of the propulsion system, and so on) in response to a negative comparison. For example, the infotainment system 130 may be instructed by the embedded controller 120 to override the current visual representation on the display 110 to replace the displayed information with a warning or static image. In another example, the embedded controller 120 may instruct the timing controller 114 to directly take control of a portion, or all, of the display 110 from the infotainment system 130 and display a selected static image 206 representing the correct value of the vehicle operational parameter. For example, the embedded controller 120 may cause presentation of static images as described herein (e.g., the controller 120 may cause the timing controller to directly set pixel values).
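As a non-limiting illustration, the comparison of blocks 806-810 may be combined into a single verification step; the remedial primitive below is a hypothetical stand-in for the timing-controller takeover described above:

```c
#include <stddef.h>
#include <stdint.h>

extern uint32_t pixel_crc32(const uint8_t *pixels, size_t len);

/* Assumed primitive: instructs the timing controller to take over a
 * portion of the display and present the identified static image. */
extern void timing_controller_take_over(uint8_t image_id);

/* Compare the checksum of the displayed representation against the
 * known checksum; on a negative comparison, take remedial action. */
void verify_displayed_parameter(const uint8_t *pixels, size_t len,
                                uint32_t known_crc, uint8_t correct_image_id)
{
    if (pixel_crc32(pixels, len) != known_crc)
        timing_controller_take_over(correct_image_id);
}
```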
The vehicle 900 further includes a propulsion system usable to set a gear (e.g., a propulsion direction) for the vehicle. With respect to an electric vehicle, the propulsion system 140 may adjust operation of the electric motor 902 to change propulsion direction. Additionally, the vehicle includes the vehicle processor system 100 and display 110 described above.
All of the processes described herein may be embodied in, and fully automated via, software code modules executed by a computing system that includes one or more computers or processors. The code modules may be stored in any type of non-transitory computer-readable medium or other computer storage device. Some or all of the methods may be embodied in specialized computer hardware.
Many other variations than those described herein will be apparent from this disclosure. For example, depending on the embodiment, certain acts, events, or functions of any of the algorithms described herein can be performed in a different sequence or can be added, merged, or left out altogether (for example, not all described acts or events are necessary for the practice of the algorithms). Moreover, in certain embodiments, acts or events can be performed concurrently, for example, through multi-threaded processing, interrupt processing, or multiple processors or processor cores or on other parallel architectures, rather than sequentially. In addition, different tasks or processes can be performed by different machines and/or computing systems that can function together.
The various illustrative logical blocks, modules, and engines described in connection with the embodiments disclosed herein can be implemented or performed by a machine, such as a processing unit or processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A processor can be a microprocessor, but in the alternative, the processor can be a controller, microcontroller, or state machine, combinations of the same, or the like. A processor can include electrical circuitry configured to process computer-executable instructions. In another embodiment, a processor includes an FPGA or other programmable device that performs logic operations without processing computer-executable instructions. A processor can also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Although described herein primarily with respect to digital technology, a processor may also include primarily analog components. For example, some or all of the signal processing algorithms described herein may be implemented in analog circuitry or mixed analog and digital circuitry. A computing environment can include any type of computer system, including, but not limited to, a computer system based on a microprocessor, a mainframe computer, a digital signal processor, a portable computing device, a device controller, or a computational engine within an appliance, to name a few.
Conditional language such as, among others, “can,” “could,” “might” or “may,” unless specifically stated otherwise, is understood within the context as used in general to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.
Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (for example, X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
Any process descriptions, elements or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or elements in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown, or discussed, including substantially concurrently or in reverse order, depending on the functionality involved as would be understood by those skilled in the art.
Unless otherwise explicitly stated, articles such as “a” or “an” should generally be interpreted to include one or more described items. Accordingly, phrases such as “a device configured to” are intended to include one or more recited devices. Such one or more recited devices can also be collectively configured to carry out the stated recitations. For example, “a processor configured to carry out recitations A, B and C” can include a first processor configured to carry out recitation A working in conjunction with a second processor configured to carry out recitations B and C.
It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure.
This application claims priority to U.S. Prov. Patent App. No. 63/263,920, titled “INTRUSION PROTECTED USER INTERFACE AND FUNCTIONALITY FOR VEHICLE USING EMBEDDED CONTROLLER” and filed on Nov. 11, 2021, the disclosure of which is hereby incorporated herein by reference in its entirety.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2022/049583 | 11/10/2022 | WO |
Number | Date | Country
---|---|---
63/263,920 | Nov. 11, 2021 | US