This disclosure relates to integrating navigation systems.
In-vehicle entertainment systems and portable navigation systems sometimes include graphical displays, touch screens, physical user-interface controls, and interactive or one-way voice interfaces. They may also be equipped with telecommunication interfaces including terrestrial or satellite radio, Bluetooth, GPS, and cellular voice and data technologies. Entertainment systems integrated into vehicles may have access to vehicle data, including speed and acceleration, navigation, and collision event data. Navigation systems may include databases of maps and travel information and software for computing driving directions. Navigation systems and entertainment systems may be integrated or may be separate components.
In general, in one aspect, a personal navigation device includes an interface capable of receiving navigation input data from a media device; a processor structured to generate a visual element indicating a current location from the navigation input data; a frame buffer to store the visual element; and a storage device in which software is stored that, when executed by the processor, causes the processor to repeatedly check the visual element in the frame buffer to determine if the visual element has been updated since a previous instance of checking the visual element, and to compress the visual element and transmit the visual element to the media device if the visual element has not been updated between two instances of checking the visual element.
In general, in one aspect, a method includes receiving navigation input data from a media device, generating a visual element indicating a current location from the navigation input data, storing the visual element in a storage device of a personal navigation device, repeatedly checking the visual element in the storage device to determine if the visual element has been updated between two instances of checking the visual element, and compressing the visual element and transmitting the visual element to the media device if the visual element has not been updated between two instances of checking the visual element.
In general, in one aspect, a computer readable medium encoding instructions to cause a personal navigation device to receive navigation input data from a media device; repeatedly check a visual element that is generated by the personal navigation device from the navigation input data, is stored by the personal navigation device, and that indicates a current position, to determine if the visual element has been updated between two instances of checking the visual element; and compress the visual element and transmit the visual element to the media device if the visual element has not been updated between two instances of checking the visual element.
Implementations of the above may include one or more of the following features. Loss-less compression is employed to compress the visual element. It is determined if the visual element has been updated by comparing every Nth horizontal line of the visual element from a first instance of checking the visual element to corresponding horizontal lines of the visual element from a second instance of checking the visual element, wherein N has a value of at least 2. The visual element is compressed by serializing pixels of the visual element into a stream of serialized pixels and creating a description of the serialized pixels in which a given pixel color is specified when the pixel color is different from a preceding pixel color and in which the specification of the given pixel color is accompanied by a value indicating the quantity of adjacent pixels that have the given pixel color. The media device is installed within a vehicle, and the navigation input data includes data from at least one sensor of the vehicle. A piece of data pertaining to a control of the personal navigation device is transmitted to the media device to enable the media device to assign a control of the media device as a proxy for the control of the personal navigation device. The software further causes the processor to receive an indication of an actuation of the control of the media device and respond to the indication in a manner substantially identical to the manner in which an actuation of the control of the personal navigation device is responded to. The repeated checking of the visual element to determine if the visual element has been updated entails repeatedly checking the frame buffer to determine if the entirety of the frame buffer has been updated.
In general, in one aspect, a media device includes an interface capable of receiving a visual element indicating a current location from a personal navigation device; a screen; a processor structured to provide, from at least the visual element, an image indicating the current location and providing entertainment information for display on the screen; and a storage device in which software is stored that, when executed by the processor, causes the processor to define a first layer and a second layer, store the visual element in the second layer, store another visual element pertaining to the entertainment information in the first layer, and combine the first layer and the second layer to create the image with the first layer overlying the second layer such that the another visual element overlies the visual element.
In general, in one aspect, a method includes receiving a visual element indicating a current location from a personal navigation device, defining a first layer and a second layer, storing the visual element in the second layer, storing another visual element pertaining to entertainment information in the first layer, combining the first layer and the second layer to provide an image with the first layer overlying the second layer such that the another visual element overlies the visual element, and displaying the image on a screen of a media device.
In general, in one aspect, a computer readable medium encoding instructions to cause a media device to receive a visual element indicating a current location from a personal navigation device, define a first layer and a second layer, store the visual element in the second layer, store another visual element pertaining to entertainment information in the first layer, combine the first layer and the second layer to provide an image with the first layer overlying the second layer such that the another visual element overlies the visual element, and display the image on a screen of the media device.
Implementations of the above may include one or more of the following features. The media device further includes a receiver capable of receiving a GPS signal from a satellite, wherein the processor is further structured to provide navigation input data corresponding to that GPS signal to the personal navigation device. The software further causes the processor to alter a visual characteristic of the visual element. The visual characteristic of the visual element is one of a set consisting of a color, a font, and a shape. The visual characteristic that is altered is a color, and the color is altered to at least approximate a color of a vehicle into which the media device is installed. The visual characteristic that is altered is a color, and the color is altered to at least approximate a color specified by a user of the media device. The media device further includes a physical control, and the software further causes the processor to assign the physical control to serve as a proxy for a control of the personal navigation device. The control of the personal navigation device is a physical control of the personal navigation device. The control of the personal navigation device is a virtual control having a corresponding additional visual element that is received from the personal navigation device and that the software further causes the processor to refrain from displaying on the screen. The media device further includes a proximity sensor, and the software further causes the processor to alter at least a portion of the another visual element in response to detecting the approach of a portion of the body of a user of the media device through the proximity sensor. The another visual element is enlarged such that it overlies a relatively larger portion of the visual element.
In general, in one aspect, a media device includes at least one speaker; an interface enabling a connection between the media device and a personal navigation device to be formed, and enabling audio data stored on the personal navigation device to be played on the at least one speaker; and a user interface comprising a plurality of physical controls capable of being actuated by a user of the media device to control a function of the playing of the audio data stored on the personal navigation device during a time when there is a connection between the media device and the personal navigation device.
In general, in one aspect, a method includes detecting that a connection exists between a personal navigation device and a media device, receiving audio data from the personal navigation device, playing the audio data through at least one speaker of the media device, and transmitting a command to the personal navigation device pertaining to the playing of the audio data in response to an actuation of at least one physical control of the media device.
Implementations of the above may include one or more of the following features. The media device is structured to interact with the personal navigation device to employ a screen of the personal navigation device as a component of the user interface of the media device during a time when there is a connection between the media device and the personal navigation device. The media device is structured to assign the plurality of physical controls to serve as proxies for a corresponding plurality of controls of the personal navigation device during a time when the screen of the personal navigation device is employed as a component of the user interface of the media device. The media device is structured to transmit to the personal navigation device an indication of a characteristic of the user interface of the personal navigation device to be altered during a time when there is a connection between the media device and the personal navigation device. The characteristic of the user interface of the personal navigation device to be altered is one of a set consisting of a color, a font, and a shape of a visual element displayed on a screen of the personal navigation device. The media device is structured to accept commands from the personal navigation device during a time when there is a wireless connection between the media device and the personal navigation device to enable the personal navigation device to serve as a remote control of the media device. The media device further includes an additional interface enabling a connection between the media device and another media device through which the media device is able to relay a command received from the personal navigation device to the another media device.
Other features and advantages of the invention will be apparent from the description and the claims.
In-vehicle entertainment systems and portable navigation systems each have unique features that the other generally lacks. One or the other or both can be improved by using capabilities provided by the other. For example, a portable navigation system may have an integrated antenna, which may provide a weaker signal than an external antenna mounted on a roof of a vehicle for use by the vehicle's entertainment system. In-vehicle entertainment systems may lack navigation capabilities or have only limited capabilities. When we refer to a navigation system in this disclosure, we are referring to a portable navigation system separate from any vehicle navigation system that may be built into a vehicle. A communications system that can link a portable navigation system with an in-vehicle entertainment system can allow either system to provide services to, or receive services from, the other.
An in-vehicle entertainment system 102 and a portable navigation system 104 may be linked within a vehicle 100 as shown in the figures.
In some examples, the navigation system 104 includes a user interface 124, navigation data 126, a processor 128, navigation software 130, and communications interfaces 132. The communications interfaces may include a GPS receiver for finding the system's location based on GPS signals from satellites or terrestrial beacons, a cellular interface for transmitting voice or data signals, and a Bluetooth interface for communicating with other electronic devices, such as wireless phones.
In some examples, the various components of the head unit 106 are connected as shown in the figures.
The processor may receive inputs from individual devices, such as a gyroscope 148 and backup camera 149, and exchange information with a gateway 150 to an information bus 152 and direct signal inputs from a variety of sources 155, such as vehicle speed sensors or the ignition switch. Whether particular inputs are direct signals or are communicated over the bus 152 will depend on the architecture of the vehicle 100. In some examples, the vehicle is equipped with at least one bus for communicating vehicle operating data between various modules. There may be an additional bus for entertainment system data. The head unit 106 may have access to one or more of these busses. In some examples, a gateway module in the vehicle (not shown) converts data from a bus not available to the head unit 106 to a bus protocol that is available to the head unit 106. In some examples, the head unit 106 is connected to more than one bus and performs the conversion function for other modules in the vehicle. The processor may also exchange data with a wireless interface 159. This can provide connections to media players or wireless telephones, for example. The head unit 106 may also have a wireless telephone interface 110b built in. Any of the components shown as part of the head unit 106 may alternatively be implemented as separate components that communicate with it.
As noted above, in some examples the connection to the navigation system 104 is wireless; in that case, the arrows to and from the connector 160 in the figures represent wireless links.
In some examples, the various components of the navigation system 104 are connected as shown in the figures.
The connector 162 may be a set of standard cable connectors, a customized connector for the navigation system 104, or a combination of connectors, as discussed below.
A graphics processor (GPU) 172 may be used to generate images for display through the user interface 124 or through the entertainment system 102. The GPU 172 may receive video images from the entertainment system 102 directly through the connector 162 or through the processor 128 and process these for display on the navigation system's user interface 124. Alternatively, video processing could be handled by the main processor 128, and the images may be output through the connector 162 either by the processor 128 or directly by the GPU 172. The processor 128 may also include digital/analog converters (DACs and ADCs) 166, or these functions may be performed by dedicated devices. The user interface 124 may include an LCD or other video display screen 174, a touch screen sensor 176, and controls 178. In some examples, video signals, such as from the backup camera 149, are passed directly to the display 174. A power supply 180 regulates power received from an external source 182 or from an internal battery 720. The power supply 180 may also charge the battery 720 from the external source 182.
In some examples, the entertainment system 102 provides data 203, including GPS signals 204a, 204b, or 206 and vehicle sensor readings, to the navigation system 104, as shown in the figures.
The navigation system 104 can use the data 203 to improve its calculation of the vehicle's location. For example, by combining the vehicle's own speed readings 208 with those derived from GPS signals 204a, 204b, or 206, the navigation system 104 can make a more accurate determination of the vehicle's true speed. Signal 206 may also include gyroscope information that has been processed by processor 120 as mentioned above. If a GPS signal 204a, 204b, or 206 is not available, for example, if the vehicle 100 is surrounded by tall buildings or in a tunnel and does not have a line of sight to enough satellites, the speed 208, acceleration 210, steering 212, and other inputs 214 or 218 characterizing the vehicle's motion can be used to estimate the vehicle's course by dead reckoning. Gyroscope information that has been processed by processor 120 and is provided by signal 206 may also be used. In some examples, the computations of the vehicle's location based on information other than GPS signals may be performed by the processor 120 and relayed to the navigation system in the form of a longitude and latitude location. If the vehicle has its own built-in navigation system, such calculations of vehicle location may also be used by that system. Other data 218 from the entertainment system of use to the navigation system may include traffic data received through the radio or wireless phone interface, collision data, and vehicle status such as doors opening or closing, engine start, headlights or internal lights turned on, and audio volume. This data can be used for such things as changing the display of the navigation device to compensate for ambient light, locking down the user interface while driving, or calling for emergency services in the event of an accident if the car does not have its own wireless phone interface.
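By way of illustration only, a dead-reckoning update of the kind described above might resemble the following sketch in C. The state structure, the flat local coordinate frame, and the use of a yaw rate (whether taken from the gyroscope 148 or derived from the steering input 212) are simplifying assumptions made for discussion, not a specification of the actual computation performed by any particular navigation system.

```c
#include <math.h>

/* Hypothetical vehicle state maintained by the navigation software. */
typedef struct {
    double north_m;   /* northing in meters from a local origin        */
    double east_m;    /* easting in meters from a local origin         */
    double heading;   /* radians, 0 = north, increasing clockwise      */
} DeadReckonState;

/*
 * Advance the estimated position by one time step using vehicle data
 * (speed 208 and a yaw rate) when no GPS fix (204a, 204b, or 206) is
 * available.
 */
static void dead_reckon_step(DeadReckonState *s,
                             double speed_mps,  /* from speed sensor */
                             double yaw_rate,   /* rad/s             */
                             double dt_s)       /* time step         */
{
    s->heading += yaw_rate * dt_s;
    s->north_m += speed_mps * cos(s->heading) * dt_s;
    s->east_m  += speed_mps * sin(s->heading) * dt_s;
}
```

Each call advances the estimate by one sensor interval; once a GPS fix again becomes available, the accumulated estimate would be corrected against it.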
The navigation system 104 may also provide services through the entertainment system 102 by exchanging data including video signals 220, audio signals 222, and commands or information 224, collectively referred to as data 202. Power for the navigation system 104, for charging or regular use, may be provided from the entertainment system's power supply 156 to the navigation system's power supply 180 through connection 225. If the navigation system's communications interfaces 132 include a wireless phone interface 132a and the entertainment system 102 does not have one, the navigation system 104 may enable the entertainment system 102 to provide hands-free calling to the driver through the vehicle's speakers 226 and a microphone 230. The audio signals 222 carry the voice from the driver to the wireless phone interface 132a in the navigation system and carry any audio from a call back to the entertainment system 102. The audio signals 222 can also be used to transfer audible instructions such as driving directions or voice recognition acknowledgements from the navigation system 104 to the head unit 106 for playback on the vehicle's speakers 226 instead of using a built-in speaker 168 in the navigation system 104.
The audio signals 222 may also be used to provide hands-free operation from one device to another. If the entertainment system 102 has a hands-free system, it may receive voice inputs and relay them as audio signals 222 to the navigation system 104 for interpretation by voice recognition software and receive audio responses 222, command data and display information 224, and updated graphics 220 back from the navigation system 104. The entertainment system 102 may also interpret the voice inputs itself and send control commands 224 directly to the navigation system 104. If the navigation system 104 has a hands-free system 236 capable of controlling aspects of the entertainment system, the entertainment system may receive audio signals from its own microphone 230, relay them as audio signals 222 to the navigation system 104 for interpretation, and receive control commands 224 and audio responses 222 back from the navigation system 104. In some examples, the navigation system 104 also functions as a personal media player, and the audio signals 222 may carry a primary audio program to be played back through the vehicle's speakers 226.
If the head unit 106 has a better screen 114 than the navigation system 104 has (for example, it may be larger, brighter, or located where the driver can see it more easily), video signals 220 can allow the navigation system 104 to display its user interface 124 through the head unit 106's screen 114. The head unit 106 can receive inputs on its user interface 116 or 118 and relay these to the navigation system 104 as commands 224. In this way, the driver only needs to interact with one device, and connecting the navigation system 104 to the entertainment system 102 allows the entertainment system 102 to operate as if it included navigation features. In some examples, the navigation system 104 may be used to display images from the entertainment system 102, for example, from the backup camera 149 or in place of using the head unit's own screen 114. Such images can be passed to the navigation system 104 using the video signals 220. This has the advantage of providing a graphical display screen for a head unit 106 that may have a more-limited display 114. For example, images from the backup camera 149 may be relayed to the navigation system 104 using video signals 220, and when the vehicle is put into reverse, as indicated by a direct input 154 or over the vehicle bus 152, the navigation system 104 can automatically display the backup camera image.
In cases where the entertainment system 102 does include navigation features, the navigation system 104 may be able to supplement or improve on those features, for example, by providing more-detailed or more-current maps through the command and information link 224 or offering better navigation software or a more powerful processor. In some examples, the head unit 106 may be equipped to transmit navigation service requests over the command and information link 224 and receive responses from the navigation system's processor 128. In some examples, the navigation system 104 can supply software 130 and data 126 to the head unit 106 to use with its own processor 120. In some examples, the entertainment system 102 may download additional software to the personal navigation system, for example, to update its ability to calculate location based on the specific information that the vehicle makes available.
The ability to relay the navigation system's interfaces through the entertainment system has the benefit of allowing the navigation system 104 to be located somewhere not readily visible to the driver while still providing navigation and other services. The connections described may be made using a standardized communications interface or may be proprietary. A standardized interface may allow navigation systems from various manufacturers to work in a vehicle without requiring customization. If the navigation systems use proprietary formats for data, signals, or connections, the entertainment system 102 may include software or hardware that allows it to convert between formats as required.
In some examples, the navigation system's interface 124 is relayed through the head unit's interface 112 as shown in the figures.
Examples of this relaying are shown in the accompanying figures.
Audio from the navigation system 104 and entertainment system 102 may similarly be combined, as shown in the figures.
When the head unit's interface 112 is used in this manner as a proxy for the navigation system's interface 124, in addition to using the screen 114, it may also use the head unit's inputs 118 or touch screen 116 to control the navigation system 104. Some examples of such control assignments are shown in the figures.
Several methods can be used to generate the screen images shown on the screen 114 of the head unit 106. In some examples, the images are generated by the navigation system 104 and transmitted to the head unit 106, as shown in the figures.
The image may be provided by the navigation system in several forms, including a full image map, difference data, or vector data. For a full image map, the navigation system 104 transmits a complete bitmap of the screen image to the head unit 106.
The image may also be transmitted as icon data, as shown in the figures.
Other elements of the display may be transmitted to the head unit 106 in a similar fashion.
When an image is being transmitted from the navigation system 104 to the head unit 106, the amount of bandwidth required may dominate the connections between the devices. For example, if a single USB connection is used for the video signals 220, audio signals 222, and commands and information 224, a full video stream may not leave any room for control data. In some examples, the image data is therefore compressed before transmission, as discussed below.
In some examples, the navigation system 104 may be connected to the entertainment system 102 through a direct wire connection as shown in the figures.
The combined image displayed on the screen 114 may be composed of multiple layers, as shown in the figures.
Either a hardware-based or a software-based implementation of layering may be used. In a software-based implementation, the processor 120 combines the layers in memory before the resulting image is driven onto the screen 114.
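A minimal sketch of such software-based layering follows, assuming 16-bit RGB565 pixels and a reserved key color that marks transparent regions of the overlying layer; both assumptions are illustrative rather than details drawn from this disclosure.

```c
#include <stdint.h>
#include <stddef.h>

#define TRANSPARENT_KEY 0xF81Fu  /* assumed magenta key color in RGB565 */

/*
 * Software-based layering: combine the first (entertainment) layer over
 * the second (navigation) layer.  Pixels of the first layer that match
 * the key color are treated as transparent, letting the navigation
 * imagery show through; all other pixels overlie it.
 */
static void combine_layers(const uint16_t *first_layer,   /* overlay */
                           const uint16_t *second_layer,  /* base    */
                           uint16_t *out,
                           size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        uint16_t p = first_layer[i];
        out[i] = (p == TRANSPARENT_KEY) ? second_layer[i] : p;
    }
}
```

A hardware-based implementation could instead use overlay or blending planes in a display controller to the same effect.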
Differences in how a given piece of data is displayed on the screen 174 and how it is displayed on the screen 114 may dictate whether that piece of data is transmitted by the portable navigation system 104 to the head unit 106 as visual data or as some other form of data, and may dictate the form of visual data used where the given piece of data is transmitted as visual data. By way of example and solely for purposes of discussion, when the portable navigation system 104 is used by itself and separately from the head unit 106, the portable navigation system 104 may display the current time on the screen 174 of the portable navigation system 104 as part of performing its navigation function. However, when the portable navigation system 104 is then used in conjunction with the head unit 106 as has been described herein, the portable navigation system 104 may transmit the current time to the head unit 106 to be displayed on the screen 114. This transmission of the current time may be performed either by transmitting the current time as one or more values representing the current time, or by transmitting a visual element that provides a visual representation of the current time, such as a bitmap of human-readable digits or an analog clock face with hour and minute hands. In some embodiments, where the screen 114 is larger or in some other way superior to the screen 174, what is displayed on the screen 114 may differ from what would be displayed on the screen 174 in order to make use of the superior features of the screen 114. In some cases, even though the current time may be displayed on the screen 174 as part of a larger bitmap of other navigation input data, it may be desirable to remove that display of the current time from that bitmap, and instead transmit the time as one or more numerical or other values that represent the current time to allow the head unit 106 to display that bitmap without the inclusion of the current time. This would also allow the head unit 106 either to employ those values representing the current time in generating a display of the current time that is in some way different from that provided by the portable navigation unit 104, or to refrain from displaying the current time altogether. Alternatively, it may be advantageous to simply transfer a visual element providing a visual representation of the current time as it would otherwise be displayed on the screen 174 for display on the screen 114, but separate from other visual elements, to allow flexibility in positioning the display of the current time on the screen 114. Those skilled in the art will readily recognize that although this discussion has centered on displaying the current time, it is meant as an example, and this same choice of whether to convey a piece of data as a visual representation or as one or more values representing the data may be made regarding any of numerous other pieces of information provided by the portable navigation device 104 to the head unit 106.
The software through which the head unit 106 and the portable navigation system 104 interact is described next.
As earlier discussed, the head unit 106 incorporates software 122. A portion of the software 122 of the head unit 106 is a user interface application 928 that causes the processor 120 to provide the user interface 112 through which the user interacts with the head unit 106. Another portion of the software 122 is software 920 that causes the processor 120 to interact with the portable navigation device 104 to provide the portable navigation device 104 with navigation input data and to receive visual and other data pertaining to navigation for display on the screen 114 to the user. Software 920 includes a communications handling portion 922, a data transfer portion 923, an image decompression portion 924, and a navigation and user interface (UI) integration portion 925.
As also earlier discussed, the portable navigation system 104 incorporates software 130. A portion of the software 130 is software 930 that causes the processor 128 to interact with the head unit 106 to receive the navigation input data and to provide visual elements and other data pertaining to navigation to the head unit 106 for display on the screen 114. Another portion of the software 130 of the portable navigation system 104 is a navigation application 938 that causes the processor 128 to generate those visual elements and other data pertaining to navigation from the navigation input data received from the head unit 106. Software 930 includes a communications handling portion 932, a data transfer portion 933, a loss-less image compression portion 934, and an image capture portion 935.
As previously discussed, the portable navigation system 104 and the head unit 106 are each able to be operated entirely separately of the other. In some embodiments, the portable navigation system 104 may not have the software 930 installed and/or the head unit 106 may not have the software 920 installed. In such cases, it would be necessary to install one or both of the software 920 and the software 930 to enable the portable navigation system 104 and the head unit 106 to interact.
In the interactions between the head unit 106 and the portable navigation system 104 to provide a combined display of imagery for both navigation and entertainment, the processor 120 is caused by the communications handling portion 922 to assemble GPS data received from satellites (perhaps via the antenna 113 in some embodiments) and/or other location data from vehicle sensors (perhaps via the bus 152 in some embodiments) into navigation input data for transmission to the portable navigation system 104. As has been explained earlier, the head unit 106 may transmit what is received from satellites to the portable navigation system 104 with little or no processing, thereby allowing the portable navigation system 104 to perform most or all of this processing as part of determining a current location. However, as was also explained earlier, the head unit 106 may perform at least some level of processing on what is received from satellites, and perhaps provide the portable navigation unit 104 with coordinates derived from that processing denoting a current location, thereby freeing the portable navigation unit 104 to perform other navigation-related functions. Therefore, the GPS data assembled by the communications handling portion 922 into navigation input data may have already been processed to some degree by the processor 120, and may be GPS coordinates or may be even more thoroughly processed GPS data. The data transfer portion 923 then causes the processor 120 to transmit the results of this processing to the portable navigation system 104. Depending on the nature of the connection established between the portable navigation device and the head unit 106 (i.e., whether that connection is wireless (including the use of either infrared or radio frequencies) or wired, electrical or fiber optic, serial or parallel, a connection shared among still other devices or a point-to-point connection, etc.), the data transfer portion 923 may serialize and/or packetize data, may embed status and/or control protocols, and/or may perform various other functions required by the nature of the connection.
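The framing performed by the data transfer portion 923 is connection-dependent and is not specified by this disclosure; purely as an assumed example, a wired serial link might carry messages framed along the following lines, where the message types, field layout, and checksum are all illustrative.

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical message types exchanged over the link. */
enum { MSG_NAV_INPUT = 0x01, MSG_COMMAND = 0x02, MSG_IMAGE = 0x03 };

/*
 * Frame a payload for transmission: 1-byte type, 2-byte big-endian
 * length, payload bytes, then a 1-byte additive checksum.  Returns
 * the total frame size, or 0 if the destination buffer is too small.
 */
static size_t frame_packet(uint8_t type, const uint8_t *payload,
                           uint16_t len, uint8_t *dst, size_t dst_size)
{
    size_t total = (size_t)len + 4;
    if (dst_size < total)
        return 0;

    dst[0] = type;
    dst[1] = (uint8_t)(len >> 8);
    dst[2] = (uint8_t)(len & 0xFF);
    memcpy(&dst[3], payload, len);

    uint8_t sum = 0;
    for (size_t i = 0; i < total - 1; i++)
        sum += dst[i];              /* checksum over type, length, payload */
    dst[total - 1] = sum;
    return total;
}
```

The receiving side would verify the length and checksum before handing the payload to the communications handling portion on the other device.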
Also in the interactions between the head unit 106 and the portable navigation system 104, the processor 120 is caused by the navigation and user interface (UI) integration portion 925 to relay control inputs received from the user interface (UI) application 928 as a result of a user actuating controls or taking other actions that necessitate the sending of commands to the portable navigation system 104. The navigation and UI integration portion relays those control inputs and commands to the communications handling portion 922 to be assembled for passing to the data transfer portion 923 for transmission to the portable navigation system 104.
The data transfer portion 933 causes the processor 128 to receive the navigation input data and the assembled commands and control inputs transferred to the portable navigation device 104 as a result of the processor 120 executing a sequence of the instructions of the data transfer portion 923. The processor 128 is further caused by the communications handling portion 932 to perform some degree of processing on the received navigation input data and the assembled commands and control inputs. In some embodiments, this processing may be little more than reorganizing the navigation input data and/or the assembled commands and control inputs. Also, in some embodiments, this processing may entail performing a sampling algorithm to extract data occurring at specific time intervals from other data.
The processor 128 is then caused by the navigation application 938 to process the navigation input data and to act on the commands and control inputs. As part of this processing, the navigation application 938 causes the processor 128 to generate visual elements pertaining to navigation and to store those visual elements in a storage location 939 defined within storage 164 and/or within another storage device of the portable navigation device 104. In some embodiments, the storage of the visual elements may entail the use of a frame buffer defined through the navigation application 938 in which at least a majority of the visual elements are assembled together in a substantially complete image to be transmitted to the head unit 106. It may be that the navigation application 938 routinely causes the processor 128 to define and use a frame buffer as part of enabling visual elements pertaining to navigation to be combined in the frame buffer for display on the screen 174 of the portable navigation system 104 when the portable navigation system 104 is used separately from the head unit 106. It may be that the navigation application continues to cause the processor 128 to define and use a frame buffer when the image created in the frame buffer is to be transmitted to the head unit 106 for display on the screen 114. Those skilled in the art of graphics systems will recognize that such a frame buffer may be referred to as a “virtual” frame buffer as a result of such a frame buffer not being used to drive the screen 174, but instead being used to drive the more remote screen 114. In alternate embodiments, at least some of the visual elements may be stored and transmitted to the head unit 106 separately from each other. Those skilled in the art of graphics systems will readily appreciate that visual elements may be stored in any of a number of ways.
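As a sketch of how visual elements might be assembled in such a virtual frame buffer, the following C fragment copies one element bitmap into an assumed 800x480 RGB565 buffer; the dimensions, pixel format, and function names are assumptions made for illustration only.

```c
#include <stdint.h>

/* Assumed dimensions for the virtual frame buffer driving screen 114. */
#define FB_WIDTH  800
#define FB_HEIGHT 480

static uint16_t frame_buffer[FB_HEIGHT][FB_WIDTH];  /* RGB565 assumed */

/*
 * Copy one visual element (e.g., a map or a direction arrow), stored
 * as a small bitmap, into the frame buffer at (x, y), clipping at the
 * buffer edges.  The assembled buffer is later captured and
 * transmitted to the head unit rather than driving the local screen.
 */
static void blit_element(const uint16_t *bitmap, int w, int h, int x, int y)
{
    for (int row = 0; row < h; row++) {
        int fy = y + row;
        if (fy < 0 || fy >= FB_HEIGHT)
            continue;
        for (int col = 0; col < w; col++) {
            int fx = x + col;
            if (fx < 0 || fx >= FB_WIDTH)
                continue;
            frame_buffer[fy][fx] = bitmap[row * w + col];
        }
    }
}
```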
Where the screen 114 of the head unit 106 is larger or has a greater pixel resolution than the screen 174 of the portable navigation system 104, one or more of the visual elements pertaining to navigation may be displayed on the screen 114 in larger size or with greater detail than would be the case when displayed on the screen 174. For example, where the screen 114 has a higher resolution, the map 312 may be expanded to show more detail such as streets, when created for display on the screen 114 versus the screen 174. As a result, where a frame buffer is defined and used by the navigation application 938, that frame buffer may be defined to be of a greater resolution when its contents are displayed on the screen 114 than when displayed on the screen 174.
Regardless of how exactly the processor 128 is caused by the navigation application 938 to store visual elements pertaining to navigation, the image capture portion 935 causes the processor 128 to retrieve those visual elements for transmission to the head unit 106. As those skilled in the art of graphics systems will readily recognize, where a repeatedly updated frame buffer is defined and/or where a repeatedly updated visual element is stored as a bitmap (for example, perhaps the map 312), there may be a need to coordinate the retrieval of either of these with their being updated. Undesirable visual artifacts may occur where such updating and retrieval are not coordinated, including instances where either a frame buffer or a bitmap is displayed in a partially updated state. In some embodiments, the updating and retrieval functions caused to occur by the navigation application 938 and the image capture portion 935, respectively, may be coordinated through various known handshaking algorithms involving the setting and monitoring of various flags between the navigation application 938 and the image capture portion 935.
However, in other embodiments, where the navigation application 938 was never written to coordinate with the image capture portion 935, the image capture portion 935 may cause the processor 128 to retrieve a frame buffer or a visual element on a regular basis and to monitor the content of such a frame buffer or visual element for an indication that the content has remained sufficiently unchanged that what was retrieved may be transmitted to the head unit 106. More specifically, the image capture portion 935 may cause the processor 128 to repeatedly retrieve the content of a frame buffer or a visual element and compare every Nth horizontal line (e.g., every 4th horizontal line) with those same lines from the last retrieval to determine if the content of any of those lines has changed, and if not, then to transmit the most recently retrieved content of that frame buffer or visual element to the head unit 106 for display. Such situations may arise where the software 930 is added to the portable navigation system 104 to enable the portable navigation system 104 to interact with the head unit 106, but such an interaction between the portable navigation system 104 and the head unit 106 was never originally contemplated by the purveyors of the portable navigation system 104.
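A sketch of this every-Nth-line monitoring follows, assuming the same hypothetical 800x480 RGB565 frame buffer as above with N = 4; the dimensions and sampling interval are illustrative assumptions.

```c
#include <stdint.h>
#include <stdbool.h>
#include <string.h>

#define FB_WIDTH    800   /* assumed frame buffer dimensions */
#define FB_HEIGHT   480
#define LINE_STEP_N 4     /* N: compare every 4th horizontal line */

/*
 * Compare every Nth horizontal line of the current retrieval of the
 * frame buffer against the same lines from the previous retrieval.
 * Returns true if none of the sampled lines changed, i.e., the content
 * has remained stable and the current copy may be transmitted.
 */
static bool frame_is_stable(const uint16_t *current,
                            const uint16_t *previous)
{
    for (int line = 0; line < FB_HEIGHT; line += LINE_STEP_N) {
        const uint16_t *cur  = current  + (size_t)line * FB_WIDTH;
        const uint16_t *prev = previous + (size_t)line * FB_WIDTH;
        if (memcmp(cur, prev, FB_WIDTH * sizeof(uint16_t)) != 0)
            return false;
    }
    return true;
}
```

After each retrieval, the caller would copy the current contents into the previous-retrieval buffer, and transmit a frame only when this check reports stability.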
The loss-less image compression portion 934 causes the processor 128 to employ any of a number of possible compression algorithms to reduce the size of what the image capture portion 935 has caused the processor 128 to retrieve in order to reduce the bandwidth requirements for transmission to the head unit 106. This may be necessary where the nature of the connection between the portable navigation system 104 and the head unit 106 is such that bandwidth is too limited to transmit an uncompressed frame buffer and/or a visual element (e.g., a serial connection such as EIA RS-232 or RS-422), and/or where it is anticipated that the connection will be used to transfer a sufficient amount of other data that bandwidth for those transfers must remain available.
Such a limitation in the connection may be addressed through the use of data compression; however, as a result of efforts to minimize costs in the design of typical portable navigation systems, there may not be sufficient processor or storage capacity available to use complex compression algorithms such as JPEG. In such cases, a simpler compression algorithm may be used in which a frame buffer or a visual element stored as a bitmap is transmitted by serializing each horizontal line and creating a description of the pixels in the resulting pixel stream in which pixel color values are specified only where they change, and those pixel values are accompanied by a value describing how many adjacent pixels in the stream have the same color. Also, in such embodiments where the actual quantity of colors is limited, color lookup tables may be employed to reduce the number of bytes required to specify each color. The compressed data is then caused by the data transfer portion 933 to be transmitted by the processor 128 to the head unit 106.
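A minimal sketch of this run-length scheme is shown below, operating on one serialized horizontal line of 8-bit indices into an assumed color lookup table; the two-byte (color, run length) output format is an assumption made for illustration, not the format used by any particular implementation.

```c
#include <stdint.h>
#include <stddef.h>

/*
 * Run-length encode one serialized horizontal line: a pixel color
 * (here an 8-bit index into an assumed color lookup table) is emitted
 * only when it differs from the preceding pixel, accompanied by a
 * count of adjacent pixels sharing that color.  Output format:
 * repeated (color, run_length) byte pairs, with runs capped at 255.
 * Returns the number of bytes written, or 0 if dst is too small.
 */
static size_t rle_encode_line(const uint8_t *pixels, size_t n,
                              uint8_t *dst, size_t dst_size)
{
    size_t out = 0;
    size_t i = 0;
    while (i < n) {
        uint8_t color = pixels[i];
        size_t run = 1;
        while (i + run < n && pixels[i + run] == color && run < 255)
            run++;
        if (out + 2 > dst_size)
            return 0;
        dst[out++] = color;
        dst[out++] = (uint8_t)run;
        i += run;
    }
    return out;
}
```

Because map imagery tends to contain long horizontal runs of a single color, such a scheme can compress well while requiring almost no processor or storage capacity.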
The processing of the navigation input data and both the commands and control inputs caused by the navigation application 938 also causes the processor 128 to generate navigation output data. The navigation output data may include numerical values and/or various other indicators of current location, current compass heading, or other current navigational data that is meant to be transmitted back to the head unit 106 in a form other than that of one or more visual elements. It should be noted that such navigation output data may be transmitted to the head unit 106 either in response to the receipt of the commands and/or control inputs, or without such solicitation from the head unit 106 (e.g., as part of regular updating of information at predetermined intervals). Such navigation output data is relayed to the communications handling portion 932 to be assembled to then be relayed to the data transfer portion 933 for transmission back to the head unit 106.
The data transfer portion 923 and the image decompression portion 924 cause the processor 120 of the head unit 106 to receive and decompress, respectively, what was caused to be compressed and transmitted by the loss-less image compression portion 934 and the data transfer portion 933, respectively. Also, the data transfer portion 923 and the communications handling portion 922 receive and disassemble, respectively, the navigation output data caused to be assembled and transmitted by the communications handling portion 932 and the data transfer portion 933, respectively. The navigation and UI integration portion 925 then causes the processor 120 to combine the frame buffer images, the visual elements, and/or the navigation output data received from the portable navigation system 104 with visual elements and other data pertaining to entertainment to create a single image for display on the screen 114.
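For completeness, a matching decoder for the hypothetical run-length format sketched above might look like the following; as before, the format itself is an assumption rather than a detail of this disclosure.

```c
#include <stdint.h>
#include <stddef.h>

/*
 * Inverse of the encoder sketched earlier: expand (color, run_length)
 * byte pairs back into a line of pixel indices.  Returns the number
 * of pixels produced, or 0 on an oversized stream.
 */
static size_t rle_decode_line(const uint8_t *src, size_t src_len,
                              uint8_t *pixels, size_t max_pixels)
{
    size_t out = 0;
    for (size_t i = 0; i + 1 < src_len; i += 2) {
        uint8_t color = src[i];
        size_t run = src[i + 1];
        if (out + run > max_pixels)
            return 0;
        for (size_t j = 0; j < run; j++)
            pixels[out++] = color;
    }
    return out;
}
```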
As previously discussed, the manner in which visual elements are combined may be changed in response to sensing an approaching hand of a user via a proximity sensor or other mechanism. The proximity of a human hand may be detected through echo location with ultrasound, through sensing body heat emissions, or in other ways known to those skilled in the art. Where a proximity sensor is used, that proximity sensor may be incorporated into the head unit 106 (such as the depicted sensor 926), or it may be incorporated into the portable navigation system 104. The processor 120 is caused to place the combined image in a frame buffer 929 by the user interface application 928, and from the frame buffer 929, the combined image is driven onto the screen 114 in a manner that will be familiar to those skilled in the art of graphics systems.
The navigation and UI integration portion 925 may cause various ones of the buttons and knobs 118a-118s to be assigned as proxies for various physical or virtual controls of the portable navigation device 104, as previously discussed. The navigation and UI integration portion 925 may also cause various visual elements pertaining to navigation to be displayed in different locations or to take on a different appearance from how they would otherwise be displayed on the screen 174, as also previously discussed. The navigation and UI integration portion 925 may also alter various details of these visual elements to give them an appearance that better matches other visual elements employed by the user interface 112 of the head unit 106. For example, the navigation and UI integration portion 925 may alter one or more of the colors of one or more of the visual elements pertaining to navigation to match or at least approximate a color scheme employed by the user interface 112, such as a color scheme that matches or at least approximates colors employed in the interior of or on the exterior of the vehicle into which the head unit 106 has been installed, or that matches or at least approximates a color scheme selected for the user interface 112 by a user, purveyor, or installer of the head unit 106.
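Such color alteration could be as simple as a palette substitution applied to received visual elements. The following sketch assumes RGB565 pixels and a small substitution table supplied through the user interface; neither the table nor the pixel format is specified by this disclosure.

```c
#include <stdint.h>
#include <stddef.h>

/* One substitution: replace pixels of color `from` with color `to`. */
typedef struct { uint16_t from, to; } ColorMap;   /* RGB565 assumed */

/*
 * Remap selected colors of a received visual element so it matches
 * the head unit's color scheme (e.g., a color approximating the
 * vehicle's interior, or a user-selected scheme).
 */
static void remap_colors(uint16_t *pixels, size_t n,
                         const ColorMap *map, size_t map_len)
{
    for (size_t i = 0; i < n; i++) {
        for (size_t m = 0; m < map_len; m++) {
            if (pixels[i] == map[m].from) {
                pixels[i] = map[m].to;
                break;
            }
        }
    }
}
```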
In some examples, the navigation system 104 is connected to the head unit 106 through separate video, audio, and data cables 702, 704, and 706.
For the features discussed above, the cables 702, 704, and 706 may carry video signals 220, audio signals 222, and commands or information 224.
The data connections 706 and 712 may use a multi-purpose format such as USB, FireWire, UART, RS-232, RS-485, or I2C, or an in-vehicle communication network such as a controller area network (CAN), or they could be custom connections devised by the maker of the head unit 106, navigation system 104, or vehicle 100. The head unit 106 may serve as a gateway for the multiple data formats and connection types used in a vehicle, so that the navigation system 104 needs to support only one data format and connection type. Physical connections may also include power for the navigation system 104.
In some examples, the navigation system 104 connects to the entertainment system 102 through a docking unit 802 that provides data connections 804 and 806.
The docking unit 802 may also include features 808 for physically connecting to the navigation system 104 and holding it in place. These may function to maintain the data connections 804 or 806, and may also serve to position the navigation system 104 in a given position so that its interface 124 can be easily seen and used by the driver of the car.
In some examples, the portable navigation system 104 may instead be docked with a base unit 2106, for example a home entertainment device, that provides its own controls, including buttons 2118a-2118d.
Furthermore, upon being docked with either the head unit 106 or the base unit 2106, the portable navigation system 104 may automatically alter its user interface 124 to make use of one or more of the buttons and knobs 118a-118s or the buttons 2118a-2118d in place of one or more of whatever physical or virtual controls the user interface 124 may employ on the portable navigation system 104 when the portable navigation system 104 is used separately from either the head unit 106 or the base unit 2106.
Such features of the user interface 124 as adopting user interface characteristics or making use of additional buttons or knobs provided by either the head unit 106 or the base unit 2106 may occur when the portable navigation system 104 becomes connected to either the head unit 106 or the base unit 2106 in other ways than through docking, including through a cable-based or wireless connection (including wireless connections making use of ultrasonic, infrared or radio frequency signals). More specifically, the user interface 124 may automatically adopt characteristics of a user interface of either the head unit 106 or the base unit 2106 upon being brought into close enough proximity to engage in wireless communications with either. Furthermore, such wireless communications may enable the portable navigation system 104 to be used as a form of wireless remote control to allow a user to operate various aspects of either the head unit 106 or the base unit 2106 in a manner not unlike that in which many operate a television or stereo component through a remote control.
Still further, the adoption of user interface characteristics by the user interface 124 may be mode-dependent based on a change in the nature of the connection between the portable navigation system 104 and either of the head unit 106 or the base unit 2106. More specifically, when the portable navigation system 104 is brought into close enough proximity to either the head unit 106 or the base unit 2106, the user interface 124 of the portable navigation system 104 may adopt characteristics of the user interface of either the head unit 106 or the base unit 2106. The portable navigation system 104 may automatically provide either physical or virtual controls to allow a user to operate the portable navigation system 104 as a handheld remote control to control various functions of either the head unit 106 or the base unit 2106. This remote control function would be carried out through any of a variety of wireless connections already discussed, including wireless communications based on radio frequency, infrared or ultrasonic communication. However, as the portable navigation system 104 is brought still closer to either the head unit 106 or the base unit 2106, or when the portable navigation system 104 is connected with either the head unit 106 or the base unit 2106 through docking or a cable-based connection, the user interface 124 may automatically change the manner in which it adopts characteristics of the user interface of either the head unit 106 or the base unit 2106. The portable navigation system 104 may cease to provide either physical or virtual controls and start to function more as a display of either the head unit 106 or the base unit 2106, and may automatically cooperate with the head unit 106 or the base unit 2106 to enable use of the various buttons or knobs on either the head unit 106 or the base unit 2106 as previously discussed with regard to docking.
Upon being docked or provided a cable-based connection to either the head unit 106 or the base unit 2106, the portable navigation system 104 may take on the behavior of being part of either the head unit 106 or the base unit 2106 to the extent that the combination of the portable navigation system 104 and either the head unit 106 or the base unit 2106 responds to commands received from a remote control of either the head unit 106 or the base unit 2106. Furthermore, an additional media device (not shown), including any of a wide variety of possible audio and/or video recording or playback devices, may be in communication with either combination such that commands received by the combination from the remote control are relayed to the additional media device.
Further, upon being docked with the base unit 2106, the behaviors that the portable navigation system 104 may take on as being part of the base unit 2106 may be modal in nature depending on the proximity of a user's hand in a manner not unlike what has been previously discussed with regard to the head unit 106. By way of example, the screen 174 of the portable navigation system 104 may display visual artwork pertaining to an audio recording (e.g., cover art of a music album) until a proximity sensor (not shown) of the base unit 2106 detects the approach of a user's hand towards the base unit 2106. Upon detecting the approach of the hand, the screen 174 of the portable navigation system 104 may automatically switch from displaying the visual artwork to displaying other information pertaining to entertainment. This automatic switching of images may be caused to occur on the presumption that the user is extending a hand to operate one or more controls. The user may also be provided with the ability to turn off this automatic switching of images. Not unlike the earlier discussion of the use of a proximity sensor with the head unit 106, a proximity sensor employed in the combination of the personal navigation system 104 and the base unit 2106 may be located either within the personal navigation system 104 or the base unit 2106.
In the case of either a combination of the personal navigation system 104 with the head unit 106 or a combination of the personal navigation system 104 with the base unit 2106, a proximity sensor incorporated into the personal navigation system 104 may be caused through software stored within the personal navigation system 104 to be assignable to being controlled and/or monitored by either the head unit 106 or the base unit 2106 for any of a variety of purposes.
In some embodiments of interaction between the portable navigation system 104 and either the head unit 106 or the base unit 2106, the portable navigation system 104 may be provided the ability to receive and store new data from either the head unit 106 or the base unit 2106. This may allow the portable navigation system 104 to benefit from a connection that either the head unit 106 or the base unit 2106 may have to the Internet or to other sources of data that the portable navigation system 104 may not itself have. In other words, upon there being a connection formed between the portable navigation system 104 and either the head unit 106 or the base unit 2106 (whether that connection be wired, wireless, through docking, etc.), the portable navigation system 104 may be provided with access to updated maps or other data about a location, or may be provided with access to a collection of entertainment data (e.g., a library of MP3 files).
In some embodiments of interaction between the portable navigation system 104 and either the head unit 106 or the base unit 2106, software on one or more of these devices may perform a check of the other device to determine if the other device or the software of the other device meets one or more requirements before allowing some or all of the various described forms of interaction to take place. For example, copyright considerations, electrical compatibility, nuances of feature interactions or other considerations may make it desirable for software stored within the portable navigation system 104 to refuse to interact with one or more particular forms of either a head unit 106 or a base unit 2106, or to at least limit the degree of interaction in some way. Similarly, it may be desirable for software stored within either the head unit 106 or the base unit 2106 to refuse to interact with one or more particular forms of a portable navigation system 104, or to at least limit the degree of interaction in some way. Furthermore, it may be desirable for any one of the portable navigation system 104, the head unit 106, or the base unit 2106 to refuse to interact with, or to at least limit interaction with, some other form of device that might otherwise have been capable of at least some particular interaction were it not for such an imposed refusal or limitation. Where interaction is simply limited, the limit may be against the use of a given communications protocol, against the transfer of a given piece or type of data, a limit to a predefined lower bandwidth than is otherwise possible, or some other limit.
In some examples, a wireless connection 902 can be used to connect the navigation system 104 and the entertainment system 102, as shown in the figures.
The wireless connection 902 may be provided by a transponder within the head unit 106 or another component of the entertainment system 102, or it may be a stand-alone device connected to the other entertainment system components through a wired connection, such as through the data bus 710. In some examples, the head unit 106 includes a Bluetooth connection for connecting to a user's mobile telephone 906 and allowing hands-free calling over the audio system. Such a Bluetooth connection can be used to also connect the navigation system 104, if the software 122 in the head unit 106 is configured to make such connections. In some examples, to allow a wirelessly-connected navigation system 104 to use the vehicle's antenna 113 for improved GPS reception, the antenna 113 is connected to the head unit 106 with a wired connection 810, and GPS signals are interpreted in the head unit and computed longitude and latitude values are transmitted to the navigation system 104 using the wireless connection 902. In the example of Bluetooth, a number of Bluetooth profiles may be used to exchange information, including, for example, the advanced audio distribution profile (A2DP) to supply audio information, the video distribution profile (VDP) for screen images, the hands-free, human interface device (HID), and audio/video remote control (AVRCP) profiles for control information, and the serial port and object push profiles for exchanging navigation data, map graphics, and other signals.
Some of these features may be implemented using processes shown in the accompanying figures.
Other implementations are within the scope of the following claims and other claims to which the applicant may be entitled.
This application is a continuation-in-part of prior U.S. patent application Ser. No. 11/612,003, filed Dec. 18, 2006, the contents of which are incorporated by reference.
| Relation | Application No. | Date | Country |
| --- | --- | --- | --- |
| Parent | 11/612,003 | Dec. 18, 2006 | US |
| Child | 11/750,822 | | US |