Aspects of the disclosure generally relate to a user interface of a personal device for zone-based or collective control of in-vehicle components.
Sales of personal devices, such as smartphones and wearables, continue to increase. Thus, more personal devices are brought by users into the automotive context. Smartphones can already be used in some vehicle models to access a wide range of vehicle information, to start the vehicle, and to open windows and doors. Some wearables are capable of providing real-time navigation information to the driver. Device manufacturers are implementing frameworks to enable a more seamless integration of their brand of personal devices into the driving experience.
In a first illustrative embodiment, a system includes a personal device including a display; a processor, programmed to send, to the display, a vehicle interior map overlaid with indications of in-vehicle components, create a group of the in-vehicle components responsive to receipt of a gesture to the display selecting a subset of the indications, receive input from the display of a location on the map, and aim outputs of the in-vehicle components of the group based on the location.
In a second illustrative embodiment, a method includes displaying, on a touch screen, a vehicle interior map overlaid with indications of in-vehicle components providing outputs within the vehicle interior; receiving touch input to the screen indicating a location on the interior map to aim the in-vehicle components; and adjusting one or more of spread, intensity and direction of outputs of the in-vehicle components toward the location of the vehicle interior.
In a third illustrative embodiment, a non-transitory computer-readable medium embodying instructions that, when executed by a processor of a personal device, cause the personal device to: display a vehicle interior map overlaid with indications of a group of in-vehicle components providing outputs within the vehicle interior; receive input indicating a location on the interior map to aim the in-vehicle components; generate a first setting adjustment to a first in-vehicle component of the group to aim the output of the first in-vehicle component toward the location; and generate a second setting adjustment to a second in-vehicle component of the group to aim the output of the second in-vehicle component toward the location, the second setting adjustment being different from the first setting adjustment.
As required, detailed embodiments of the present invention are disclosed herein; however, it is to be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The figures are not necessarily to scale; some features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention.
Vehicle interior modules, such as reading lights or speakers, may be enhanced with a wireless communication interface such as Bluetooth Low Energy (BLE). These enhanced modules of the vehicle interior may be referred to as in-vehicle components. Vehicle occupants may utilize their personal devices to control features of the in-vehicle components over the communications interface. In an example, a vehicle occupant may utilize an application installed to the personal device to turn a reading light on or off or to adjust a volume of a speaker.
The location of the personal device within the vehicle cabin may be determined according to signal strength information between the in-vehicle components and the personal device. Based on the location, the personal device may identify what in-vehicle component features are available in the specific seating location of the user, as well as how to interact with the identified features. Accordingly, the personal device of the user may become an extension of the vehicle user interface.
A vehicle component interface application installed to the personal device may be used to provide the user interface on the personal device for the control of the in-vehicle components. When launched, the vehicle component interface application may detect, locate and display an illustration of the in-vehicle components on an intuitive map of the vehicle.
The vehicle component interface application may support a zone user interface mode and a functional entity user interface mode. In the zone mode, the user interface may allow the user to interact with the in-vehicle component features in the seating zone in which the user is located. The zone mode may be useful, in an example, for allowing the user to change lighting, climate, or other settings specific to the user's seating zone within the vehicle, without disturbing the settings of other users.
In the functional entity mode, the user interface may allow the user to interact with multiple in-vehicle components of a common type of function across multiple seating zones. The functional entity mode may be useful, in an example, for combining the functioning of multiple in-vehicle components into a single unit, e.g., to aim several lights or climate control vents within the vehicle cabin to a specified location. The functional entity mode may further allow for the complex selection of the in-vehicle components through swipe or other gestures to allow the user to craft a specific subset of in-vehicle components to be collectively controlled.
The vehicle 102 may be any of various types of automobile, crossover utility vehicle (CUV), sport utility vehicle (SUV), truck, recreational vehicle (RV), boat, plane, or other mobile machine for transporting people or goods. In many cases, the vehicle 102 may be powered by an internal combustion engine. As another possibility, the vehicle 102 may be a hybrid electric vehicle (HEV) powered by both an internal combustion engine and one or more electric motors, such as a series hybrid electric vehicle (SHEV), a parallel hybrid electric vehicle (PHEV), or a parallel/series hybrid electric vehicle (PSHEV). As the type and configuration of vehicle 102 may vary, the capabilities of the vehicle 102 may correspondingly vary. As some other possibilities, vehicles 102 may have different capabilities with respect to passenger capacity, towing ability and capacity, and storage volume.
The personal devices 104-A, 104-B and 104-C (collectively 104) may include mobile devices of the users, and/or wearable devices of the users. The mobile devices may be any of various types of portable computing device, such as cellular phones, tablet computers, smart watches, laptop computers, portable music players, or other devices capable of networked communication with other mobile devices. The wearable devices may include, as some non-limiting examples, smartwatches, smart glasses, fitness bands, control rings, or other personal mobility or accessory device designed to be worn and to communicate with the user's mobile device.
The in-vehicle components 106-A through 106-N (collectively 106) may include various elements of the vehicle 102 having user-configurable settings. These in-vehicle components 106 may include, as some examples, overhead light in-vehicle components 106-A through 106-D, climate control in-vehicle components 106-E and 106-F, seat control in-vehicle components 106-G through 106-J, and speaker in-vehicle components 106-K through 106-N. Other examples of in-vehicle components 106 are possible as well, such as rear seat entertainment screens or automated window shades. In many cases, the in-vehicle component 106 may expose controls such as buttons, sliders, and touchscreens that may be used by the user to configure the particular settings of the in-vehicle component 106. As some possibilities, the controls of the in-vehicle component 106 may allow the user to set a lighting level of a light control, set a temperature of a climate control, set a volume and source of audio for a speaker, and set a position of a seat.
The vehicle 102 interior may be divided into multiple zones 108, where each zone 108 may be associated with a seating position within the vehicle 102 interior. For instance, the front row of the illustrated vehicle 102 may include a first zone 108-A associated with the driver seating position, and a second zone 108-B associated with a front passenger seating position. The second row of the illustrated vehicle 102 may include a third zone 108-C associated with a driver-side rear seating position and a fourth zone 108-D associated with a passenger-side rear seating position. Variations on the number and arrangement of zones 108 are possible. For instance, an alternate second row may include an additional fifth zone 108 of a second-row middle seating position (not shown). Four occupants are illustrated as being inside the example vehicle 102, three of whom are using personal devices 104. A driver occupant in the zone 108-A is not using a personal device 104. A front passenger occupant in the zone 108-B is using the personal device 104-A. A rear driver-side passenger occupant in the zone 108-C is using the personal device 104-B. A rear passenger-side passenger occupant in the zone 108-D is using the personal device 104-C.
Each of the various in-vehicle components 106 present in the vehicle 102 interior may be associated with one or more of the zones 108. As some examples, the in-vehicle components 106 may be associated with the zone 108 in which the respective in-vehicle component 106 is located and/or the one (or more) of the zones 108 that is controlled by the respective in-vehicle component 106. For instance, the light in-vehicle component 106-C accessible by the front passenger may be associated with the second zone 108-B, while the light in-vehicle component 106-D accessible by the passenger-side rear occupant may be associated with the fourth zone 108-D. It should be noted that the illustrated portion of the vehicle 102 in
Referring to
In many examples the personal devices 104 may include a wireless transceiver 112 (e.g., a BLUETOOTH module, a ZIGBEE transceiver, a Wi-Fi transceiver, an IrDA transceiver, an RFID transceiver, etc.) configured to communicate with other compatible devices. In an example, the wireless transceiver 112 of the personal device 104 may communicate data with the wireless transceiver 110 of the in-vehicle component 106 over a wireless connection 114. In another example, a wireless transceiver 112 of a wearable personal device 104 may communicate data with a wireless transceiver 112 of a mobile personal device 104 over a wireless connection 114. The wireless connections 114 may be Bluetooth Low Energy (BLE) connections, but other types of local wireless connection 114, such as Wi-Fi or Zigbee, may be utilized as well.
The personal devices 104 may also include a device modem configured to facilitate communication of the personal devices 104 with other devices over a communications network. The communications network may provide communications services, such as packet-switched network services (e.g., Internet access, voice over internet protocol (VoIP) communication services), to devices connected to the communications network. An example of a communications network may include a cellular telephone network. To facilitate the communications over the communications network, personal devices 104 may be associated with unique device identifiers (e.g., mobile device numbers (MDNs), Internet protocol (IP) addresses, identifiers of the device modems, etc.) to identify the communications of the personal devices 104 over the communications network. These personal device 104 identifiers may also be utilized by the in-vehicle component 106 to identify the personal devices 104.
The vehicle component interface application 118 may be an application installed to the personal device 104. The vehicle component interface application 118 may be configured to facilitate vehicle occupant access to features of the in-vehicle components 106 exposed for networked configuration via the wireless transceiver 110. In some cases, the vehicle component interface application 118 may be configured to identify the available in-vehicle components 106, identify the available features and current settings of the identified in-vehicle components 106, and determine which of the available in-vehicle components 106 are within proximity to the vehicle occupant (e.g., in the same zone 108 as the location of the personal device 104). The vehicle component interface application 118 may be further configured to display a user interface descriptive of the available features, receive user input, and provide commands based on the user input to allow the user to control the features of the in-vehicle components 106. Thus, the system 100 may be configured to allow vehicle occupants to seamlessly interact with the in-vehicle components 106 in the vehicle 102, without requiring the personal devices 104 to have been paired with or be in communication with a head unit of the vehicle 102.
The system 100 may use one or more device location-tracking techniques to identify the zone 108 in which the personal device 104 is located. Location-tracking techniques may be classified depending on whether the estimate is based on proximity, angulation, or lateration. Proximity methods are “coarse-grained”: they may provide information regarding whether a target is within a predefined range, but they do not provide an exact location of the target. Angulation methods estimate a position of the target according to angles between the target and reference locations. Lateration provides an estimate of the target location from the available distances between the target and the references. The distance of the target from a reference can be obtained from a measurement of signal strength 116 over the wireless connection 114 between the wireless transceiver 110 of the in-vehicle component 106 and the wireless transceiver 112 of the personal device 104, or from a time measurement of either arrival (TOA) or difference of arrival (TDOA).
One of the advantages of lateration using signal strength 116 is that it can leverage the already-existing received signal strength indication (RSSI) signal strength 116 information available in many communication protocols. For example, iBeacon uses the RSSI signal strength 116 information available in the Bluetooth Low Energy (BLE) protocol to infer the distance of a beacon from a personal device 104 (i.e., a target), so that specific events can be triggered as the personal device 104 approaches the beacon. Other implementations expand on the concept, leveraging multiple references to estimate the location of the target. When the distances from three reference beacons are known, the location can be estimated in full (trilateration) from the following equations:
d_1^2 = (x - x_1)^2 + (y - y_1)^2 + (z - z_1)^2
d_2^2 = (x - x_2)^2 + (y - y_2)^2 + (z - z_2)^2
d_3^2 = (x - x_3)^2 + (y - y_3)^2 + (z - z_3)^2    (1)
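As one non-limiting illustration of how such an estimate might be computed, the following sketch converts hypothetical RSSI signal strength 116 readings into distances using an assumed log-distance path-loss model and then solves a planar form of equation (1) by linearizing it; the beacon positions, calibration constants, and readings are illustrative assumptions rather than values defined by this disclosure.

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Convert an RSSI reading to an approximate distance in meters.

    tx_power_dbm is the assumed RSSI at 1 m; both constants are illustrative
    and would require per-vehicle calibration.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def trilaterate_2d(beacons, distances):
    """Estimate (x, y) from three beacon positions and three distances.

    Subtracting the first range equation from the other two linearizes the
    system of equation (1) (restricted to the cabin plane), which is then
    solved with Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    b2 = d1**2 - d3**2 - x1**2 + x3**2 - y1**2 + y3**2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-9:
        raise ValueError("beacons are collinear; position is ambiguous")
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical in-vehicle component positions (meters) and RSSI readings.
beacons = [(0.0, 0.0), (1.5, 0.0), (0.0, 2.0)]
distances = [rssi_to_distance(r) for r in (-62.0, -70.0, -66.0)]
print(trilaterate_2d(beacons, distances))
```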
In an example, as shown in
Thus, the mesh of in-vehicle components 106 and the personal devices 104 may be utilized to allow the in-vehicle components 106 to identify in which zone 108 each personal device 104 is located.
To enable tracking of personal devices 104 within the vehicle 102, information descriptive of the location (e.g., zone 108) of each in-vehicle component 106 relative to the vehicle 102 interior may need to be broadcast by the in-vehicle components 106 to the other in-vehicle components 106 and personal devices 104. Moreover, to provide status information indicative of the current settings of the in-vehicle components 106, the in-vehicle components 106 may also broadcast status information and/or information indicative of when changes to the settings of the in-vehicle components 106 are made.
The vehicle component interface application 118 executed by the personal device 104 may be configured to scan for and update a data store of available in-vehicle components 106. As some examples, the scanning may be performed periodically, responsive to a user request to refresh, or upon activation of the vehicle component interface application 118. In examples where the scanning is performed automatically, the transition from vehicle 102 to vehicle 102 may be seamless, as the correct set of functionality is continuously refreshed and the user interface of the vehicle component interface application 118 is updated to reflect the changes.
In an example, advertising packets in broadcasting mode may be used to communicate location, event, or other information from the in-vehicle components 106 to the personal devices 104. This may be advantageous, as the personal devices 104 may be unable to preemptively connect to each of the in-vehicle components 106 to receive component information and status updates. In an example, the advertisements may be BLE advertisements, and location, component type, and event information may be embedded into the primary service universally unique identifier (UUID) that is included in the advertisement packet made by the in-vehicle component 106. By parsing the service UUIDs of the advertisement data of the in-vehicle component 106, personal devices 104 and other in-vehicle components 106 scanning for advertisements may be able to: (i) identify the existence in the vehicle 102 of the in-vehicle component 106, (ii) determine its location and zone 108 within the vehicle 102, and (iii) detect whether a physical interaction has taken place between a user and the in-vehicle component 106 (e.g., when changes are identified to the advertised data).
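As one non-limiting sketch of such parsing, the following example assumes a hypothetical byte layout in which the first bytes of the 128-bit service UUID carry a component type code, a zone 108 identifier, cabin coordinates, and an event counter; the actual field positions and widths would be defined by the in-vehicle component 106 and are not specified here.

```python
from dataclasses import dataclass

# Hypothetical type codes and byte layout for the advertised 128-bit service UUID.
TYPE_CODES = {1: "seat", 2: "climate", 3: "display", 4: "light", 5: "speaker", 6: "keyboard"}

@dataclass
class ComponentAdvertisement:
    component_type: str
    zone: int
    x: int              # 0..64 across the cabin width
    y: int              # 0..64 from front to rear
    event_counter: int  # assumed to increment when a physical interaction occurs

def parse_service_uuid(uuid_str: str) -> ComponentAdvertisement:
    """Decode component metadata assumed to be embedded in a service UUID."""
    raw = bytes.fromhex(uuid_str.replace("-", ""))
    return ComponentAdvertisement(
        component_type=TYPE_CODES.get(raw[0], "unknown"),
        zone=raw[1],
        x=raw[2],
        y=raw[3],
        event_counter=raw[4],
    )

# Example: a light component in zone 2 at cabin coordinates (48, 16), event counter 7.
adv = parse_service_uuid("04023010-0700-0000-0000-000000000000")
print(adv)
```

A scanning personal device 104 could compare the event counter across successive advertisements to detect that a physical interaction has taken place.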
The interior map 204 of the vehicle 102 may illustrate an overhead view of the interior of the vehicle 102 over which the component indications 206 of the in-vehicle components 106 may be drawn in overlay. In an example, the interior map 204 may be a generic vehicle 102 interior image stored to the personal device 104. In another example, the interior map 204 may be downloaded from a server included within the vehicle 102, or from one of the in-vehicle components 106 configured to serve component information to the personal device 104. In yet another example, the one of the in-vehicle components 106 configured to serve component information to the personal device 104 may provide the personal device 104 with an identifier of the vehicle 102 (e.g., VIN), or an identifier of the interior map 204 image corresponding to the vehicle 102. This identifier may then be used in a query to a server external to the vehicle 102, which may return the interior map 204 to the personal device 104.
The component indications 206 may include graphics overlaid on the interior map 204 at relative locations of the in-vehicle components 106 within the vehicle 102. In an example, the location information may be encoded in the service UUIDs of the advertisement data of the in-vehicle component 106. As one possible encoding scheme, two-dimensional coordinates of the location of the in-vehicle component 106 may be encoded within the UUID (e.g., as a location in X from 0 being one side of the vehicle 102 to 64 being the other side and a location in Y from 0 being the front of the vehicle 102 to 64 being the back of the vehicle 102). Size and orientation information may also be similarly included in the UUID. In another example, the one of the in-vehicle components 106 configured to serve vehicle 102 information to the personal device 104 may provide the personal device 104 with an identifier of the vehicle 102 (e.g., VIN) which may be used to query a server external to the vehicle 102 to retrieve the interior locations and dimensions of the in-vehicle components 106.
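As a non-limiting sketch of this coordinate scheme, the example below packs the assumed 0-64 cabin coordinates into two bytes and scales them to pixel positions for drawing the component indications 206 on the interior map 204; the map dimensions and helper names are illustrative.

```python
def encode_location(x_norm: int, y_norm: int) -> bytes:
    """Pack 0-64 cabin coordinates into two bytes of the advertised UUID."""
    if not (0 <= x_norm <= 64 and 0 <= y_norm <= 64):
        raise ValueError("coordinates must fall in the 0-64 range")
    return bytes((x_norm, y_norm))

def map_to_pixels(x_norm: int, y_norm: int, map_width_px: int, map_height_px: int):
    """Scale normalized cabin coordinates to pixel positions on the interior map."""
    return (x_norm / 64.0 * map_width_px, y_norm / 64.0 * map_height_px)

# A component advertised on the cabin centerline, roughly two thirds toward the
# rear, drawn on a 640x960 interior map image (illustrative dimensions).
print(encode_location(32, 43))
print(map_to_pixels(32, 43, 640, 960))
```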
As shown in the user interface 200, the component indications 206 may include, as some non-limiting examples, light component indications 206-A through 206-D, climate control component indications 206-E and 206-F, seat control component indications 206-G through 206-J, speaker component indications 206-K through 206-N, display component indications 206-O through 206-Q, and keyboard component indications 206-R through 206-T.
The legend control 208 may include one or more type indications 210, where each type indication 210 corresponds to one of the types of in-vehicle components 106 that is included within the vehicle 102. In an example, the type information may be encoded in the service UUIDs of the advertisement data of the in-vehicle component 106 as mentioned above. For instance, the UUID may include a first predefined type code for a seat control in-vehicle component 106, a second predefined type code for a climate control in-vehicle component 106, a third predefined type code for a display in-vehicle component 106, a fourth predefined type code for a light in-vehicle component 106, a fifth predefined type code for a speaker in-vehicle component 106, and a sixth predefined type code for a keyboard in-vehicle component 106.
The type indications 210 may be labeled according to the type of in-vehicle components 106 that they represent, and may be user selectable to allow the user to filter which of the in-vehicle components 106 are to be overlaid on the interior map 204. In an example, each of the type indications 210 may operate as a checkbox control, such that display or visibility of the component indications 206 of the in-vehicle components 106 of the corresponding type may be toggled responsive to user selection of the type indications 210.
As one non-limiting example, the type indications 210 may include a seat type indication 210-A presented in orange, a climate type indication 212-B presented in green, a display type indication 212-C presented in blue, a light type indication 210-C presented in yellow, a speaker type indication 210-D presented in purple, and a keyboard type indication 210-E presented in brown.
To strengthen the interface connection between the type indications 210 and the component indications 206, the component indications 206 for in-vehicle components 106 for a given type may be displayed in the same color or pattern as the corresponding type indication 210. To continue with the illustrated example, the light component indications 206-A through 206-D may be presented in yellow, the climate control component indications 206-E and 206-F may be presented in green, the seat control component indications 206-G through 206-J may be presented in orange, the speaker component indications 206-K through 206-N may be presented in purple, the display component indications 206-O through 206-Q may be presented in blue, and the keyboard component indications 206-R through 206-T may be presented in brown.
The zone indication 212 may indicate the seating zone 108 in which the personal device 104 of the occupant is located. As one example, the seating zone 108 in which the user is located may be displayed as a zone highlight 212 of that zone 108 on the interior map 204. As shown, the zone indication 212 indicates that the user is located within the rear driver side zone 108-C.
A user of the vehicle component interface application 118 may interact with the user interface 200 in a zone user interface mode or a functional entity user interface mode. In the zone mode, the user interface 200 may allow the user to interact with the in-vehicle component 106 features in the seating zone 108 in which the user is located. The zone mode may be useful, in an example, for allowing the user to change lighting, climate, or other settings specific to the user's seating zone 108 within the vehicle 102, without disturbing the settings of other users.
It should further be noted that in the example user interface 300, certain types of in-vehicle components 106 have been deselected from being displayed in overlay. This deselection may be performed, in an example, by a user touching the type indications 210 to toggle off display or visibility of the associated component indications 206. As shown, the seat type indication 210-A, the climate type indication 212-B, the display type indication 212-C, and the keyboard type indication 210-E are deselected, thereby hiding the corresponding seat, climate, display, and keyboard component indications 206 from being displayed.
In the functional entity mode, the user interface 200 may allow the user to interact with multiple in-vehicle components 106 of a common type of function across multiple seating zones 108. The functional entity mode may be useful, in an example, for combining the functioning of multiple in-vehicle components 106 into a single unit, e.g., to aim several lights or climate control vents within the vehicle cabin to a specified location. The functional entity mode may further allow for the complex selection of the in-vehicle components 106 through swipe or other gestures to allow the user to craft the specific subset of in-vehicle components 106 to be collectively controlled.
The outputs 502 may be overlaid on the interior map 204 in accordance with the current settings of the in-vehicle components 106 being controlled. In an example, the outputs 502 may be light: the outputs 502 may be displayed in a color corresponding to the current color of light set for the in-vehicle component 106, at a length corresponding to the intensity set for the in-vehicle component 106, and at an angle and in a shape corresponding to the aiming set for the in-vehicle component 106.
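The following sketch illustrates one possible way the overlay parameters could be derived from a light in-vehicle component 106 setting; the data structure, scale factor, and angle convention are assumptions for illustration only.

```python
from dataclasses import dataclass
import math

@dataclass
class LightSettings:
    color: str        # e.g., "#FFD27F"
    intensity: float  # 0.0 .. 1.0
    aim_deg: float    # assumed aiming angle on the interior map, 0 = straight ahead

def output_overlay(origin_px, settings: LightSettings, max_len_px: float = 120.0):
    """Compute the endpoint and color of an output 502 ray drawn on the map.

    The ray length is proportional to intensity and its direction follows the
    component's aiming angle; the pixel scale factor is illustrative.
    """
    length = settings.intensity * max_len_px
    angle = math.radians(settings.aim_deg)
    end = (origin_px[0] + length * math.sin(angle),
           origin_px[1] - length * math.cos(angle))
    return {"start": origin_px, "end": end, "color": settings.color}

print(output_overlay((320, 480), LightSettings("#FFD27F", 0.75, 30.0)))
```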
The component controls 504 may include one or more controls for the adjustment of the functionality of the collection of in-vehicle components 106 being controlled. Continuing with the light in-vehicle component 106 example, the component controls 504 may include a light direction control 504-A for adjusting the aiming of the light in-vehicle components 106, an intensity control 504-B for adjusting the brightness of the light in-vehicle components 106, and a color control 504-C for adjusting the color of lighting provided by the light in-vehicle components 106. For instance, the user may select one or more of the indications 206 to adjust the settings of the one or more of the in-vehicle components 106, or may adjust the settings of the collection of in-vehicle components 106 collectively.
In an example, the component controls 504 may be displayed in place of the legend control 208, although other layout possibilities may be used as well. For example, the component controls 504 may be displayed above, below, or next to the legend control 208, as some other possibilities (not shown).
The vehicle component interface application 118 may determine the specific component controls 504 to display based on information provided by the in-vehicle components 106 to be controlled. In an example, the vehicle component interface application 118 may enumerate the characteristic UUIDs of the characteristics of the service UUID of the in-vehicle components 106 being controlled, and may decode information from the characteristic UUIDs, such as a listing of names and/or identifiers of the available features of the in-vehicle component 106 and/or information indicative of the current state of the in-vehicle component 106. In another example, the information indicative of the options for the component controls 504 may be pre-stored to the personal device 104. In yet another example, the information indicative of the options for the component controls 504 may be downloaded from a server included within the vehicle 102, or from one of the in-vehicle components 106 configured to serve control information to the personal device 104. In yet another example, the one of the in-vehicle components 106 configured to serve control information to the personal device 104 may provide the personal device 104 with an identifier of the vehicle 102 (e.g., VIN), or an identifier of the in-vehicle component 106 or type corresponding to the in-vehicle component 106. This identifier may then be used in a query to a server external to the vehicle 102, which may return the information indicative of the options for the component controls 504 to the personal device 104.
The light intensity, direction, and color may accordingly be directly controlled from the user interface 300 using the component controls 504. Moreover, the aiming and intensity of the in-vehicle components 106 may be collectively controlled using the handle control 506, as discussed in more detail in combination with
Responsive to the location provided to the display 202, the vehicle component interface application 118 may adjust the direction, intensity, spread, and/or other properties of the in-vehicle components 106 to effect the change in aiming of the collection of in-vehicle components 106. For example, to move the aiming of the in-vehicle components 106-B and 106-D from the center of the vehicle 102 to a location in the passenger-side rear seating zone 108-D, the vehicle component interface application 118 may increase the intensity of the in-vehicle component 106-B that is farther from that location, decrease the intensity of the in-vehicle component 106-D that is closer to that location, and adjust the aiming of the in-vehicle components 106-B and 106-D to direct the light toward the indicated location. The outputs 502 overlaid on the interior map 204 may also be updated in accordance with the updated settings of the in-vehicle components 106 being controlled. Thus, when integrating a mesh of in-vehicle components 106 into a collection in the functional entity mode, the vehicle component interface application 118 may provide the ability for each in-vehicle component 106 of the collection to be controlled as part of a collective group.
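A non-limiting sketch of such a collective aiming computation follows; the distance-based intensity rule, tuning constants, and component positions are illustrative assumptions rather than a prescribed algorithm.

```python
import math

def aim_group_at(components, target, min_intensity=0.2, max_intensity=1.0, reach=3.0):
    """Compute per-component settings so a group's combined output favors `target`.

    `components` maps a component id to its (x, y) cabin position in meters;
    intensity grows with distance (up to `reach`) so that a farther component
    works harder to reach the target, while every component is aimed toward it.
    All constants are illustrative tuning values.
    """
    settings = {}
    for cid, (cx, cy) in components.items():
        dx, dy = target[0] - cx, target[1] - cy
        distance = math.hypot(dx, dy)
        scale = min(distance / reach, 1.0)
        settings[cid] = {
            "aim_deg": math.degrees(math.atan2(dx, dy)),
            "intensity": min_intensity + (max_intensity - min_intensity) * scale,
        }
    return settings

# Aim lights 106-B (front center) and 106-D (rear passenger side) toward a
# point in the passenger-side rear seating zone 108-D (positions hypothetical).
print(aim_group_at({"106-B": (0.75, 0.5), "106-D": (1.2, 2.2)}, target=(1.3, 2.4)))
```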
While not shown, in some examples multiple swipe 402 gestures may be used to compose a group of in-vehicle components 106 for control. For instance, a first gesture 402 may be used to select a set of in-vehicle components 106, and a second gesture 402 may be used to select additional in-vehicle components 106 to be added to the group.
As another example, the vehicle component interface application 118 may support both additive and subtractive gestures. For example, a swipe 402 gesture of an additive type may be used to add components to a group, and a swipe 402 gesture of a subtractive type may be used to remove one or more components from the group. As one non-limiting possibility, an additive swipe may use one finger, and a subtractive swipe may use two fingers.
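A minimal sketch of how the group might be composed from additive and subtractive swipes, assuming the one-finger/two-finger convention described above, is shown below.

```python
def apply_swipe_selection(group, swiped_component_ids, finger_count):
    """Update the controlled group from a swipe 402 gesture.

    Under the assumed convention, a one-finger swipe adds the components it
    crosses to the group and a two-finger swipe removes them; this mapping is
    one possibility, not a fixed requirement.
    """
    if finger_count == 1:          # additive swipe
        return group | set(swiped_component_ids)
    if finger_count == 2:          # subtractive swipe
        return group - set(swiped_component_ids)
    return group

group = set()
group = apply_swipe_selection(group, ["106-A", "106-B", "106-C"], finger_count=1)
group = apply_swipe_selection(group, ["106-C"], finger_count=2)
print(sorted(group))   # ['106-A', '106-B']
```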
As with the handle control 506 discussed above, responsive to the input provided using the exclusion handle control 1102, the vehicle component interface application 118 may adjust the direction, intensity, spread, and/or other properties of the in-vehicle components 106 to effect the change in aiming of the collection of in-vehicle components 106. For example, to move the aiming of the in-vehicle components 106-A, 106-B, 106-C, and 106-D away from the indicated location, the vehicle component interface application 118 may adjust the aiming, spread, and/or intensity of the in-vehicle components 106-A, 106-B, 106-C, and 106-D to direct the light away from the indicated location.
A user may use gesture input to adjust various attributes of the adjustable region 1202. These attributes may include, for example, location, width, height, and orientation. For instance, adjustments to the height or width of the adjustable region 1202 may be performed using pinch gestures in which two fingers placed on the adjustable region 1202 are spread apart to increase the size of the region 1202 or are moved toward one another to decrease the size of the region 1202. Or, adjustments to an edge of the adjustable region 1202 may be performed by a dragging gesture performed by placing a finger onto a side of the adjustable region 1202 to be moved, moving the side to the new location, and releasing the finger. Or, the orientation of the adjustable region 1202 may be adjusted by a user placing two fingers into the region and rotating the fingers to perform a corresponding rotation of the adjustable region 1202. As some other possibilities, a user may draw the adjustable region 1202 by using a finger to draw a boundary completely enclosing an area to be adjusted, or by painting a filled area to be adjusted.
Once drawn, the adjustable region 1202 may be activated by an area activation gesture. In an example, a user may tap within the adjustable region 1202 to activate the adjustable region 1202 for use in directing of output from the selected in-vehicle components 106 into the adjustable region 1202. As another example, a single tap within the adjustable region 1202 may activate the adjustable region 1202 for directing output from the selected in-vehicle components 106 into the adjustable region 1202, while a double tap within the adjustable region 1202 may activate the adjustable region for excluding output from the selected in-vehicle components 106 out of the adjustable region 1202. As yet a further example, selection of the adjustable region 1202 for directing output into the region 1202 may be indicated by a tap with a single finger, while selection of the adjustable region 1202 for excluding output from the region 1202 may be indicated by a tap from two fingers.
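The sketch below illustrates one way the region attributes and tap-based activation could be represented, assuming the single-tap/two-finger-tap convention described above; the data structure and field names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AdjustableRegion:
    x: float       # region center on the interior map
    y: float
    width: float
    height: float
    angle_deg: float = 0.0
    mode: str = "inactive"   # "direct", "exclude", or "inactive"

def handle_region_tap(region: AdjustableRegion, finger_count: int) -> AdjustableRegion:
    """Activate a drawn region 1202 using one of the tap conventions described above.

    A one-finger tap marks the region as a target for output, while a two-finger
    tap marks it as an area to keep output away from; the mapping is illustrative.
    """
    region.mode = "direct" if finger_count == 1 else "exclude"
    return region

region = AdjustableRegion(x=1.2, y=2.0, width=0.6, height=0.8)
print(handle_region_tap(region, finger_count=2).mode)   # exclude
```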
While many of the examples described herein relate to light in-vehicle components 106, it should be noted that the disclosure applies to other types of vehicle interior functions characterized by directionality, intensity, and spread, such as climate control air output 502 from air vents or sound output 502 from speakers. Depending on the specific type of in-vehicle components 106 that are being controlled, the relationship between intensity and distance for the output 502 may change. Such an output 502 relationship may be pre-coded into the vehicle component interface application 118 or may be provided by the in-vehicle components 106, similar to the providing of the information for display and configuration of the component controls 504 discussed above.
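As a non-limiting illustration, the sketch below pairs a few assumed falloff models with component types so that the intensity needed to deliver a desired level at a given distance can be computed; the specific falloff functions are illustrative assumptions, not measured relationships.

```python
import math

# Assumed falloff models relating delivered output to distance for a few
# component types; real relationships would be measured or supplied by the
# in-vehicle components themselves.
FALLOFF_MODELS = {
    "light":   lambda d: 1.0 / max(d, 0.1) ** 2,   # roughly inverse-square
    "speaker": lambda d: 1.0 / max(d, 0.1) ** 2,   # free-field sound, roughly inverse-square
    "climate": lambda d: math.exp(-d / 1.5),       # airflow assumed to decay roughly exponentially
}

def required_intensity(component_type: str, distance_m: float, desired_level: float = 0.5):
    """Scale a component's output so the level delivered at `distance_m` hits the target."""
    delivered_per_unit = FALLOFF_MODELS[component_type](distance_m)
    return min(desired_level / delivered_per_unit, 1.0)

print(required_intensity("climate", 1.2))
```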
Moreover, while in many examples the vehicle component interface application 118 is described as adjusting the specific intensity, spread, and/or location of the in-vehicle components 106, in other examples, the vehicle component interface application 118 may instead provide information to the in-vehicle components 106 specifying the desired coordinate location to provide or not provide output 502, where the in-vehicle components 106 determine themselves how to self-adjust intensity, aim, spread, etc. in accordance with the provided location coordinates. As one non-limiting possibility, these coordinates may be provided to the in-vehicle components 106 in an encoding similar to that used for the encoding of location information in the UUID information of the in-vehicle component 106.
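A minimal sketch of such a payload, assuming the same hypothetical 0-64 coordinate scheme used above plus an added flag byte distinguishing aiming from exclusion, is shown below.

```python
def build_target_location_payload(x_norm: int, y_norm: int, exclude: bool = False) -> bytes:
    """Pack a desired output location into a small payload for an in-vehicle component.

    The layout mirrors the hypothetical 0-64 coordinate encoding assumed for the
    advertised location data; a flag byte distinguishes "aim here" from
    "keep output away from here". This layout is an assumption for illustration.
    """
    if not (0 <= x_norm <= 64 and 0 <= y_norm <= 64):
        raise ValueError("coordinates must fall in the 0-64 range")
    return bytes((x_norm, y_norm, 1 if exclude else 0))

print(build_target_location_payload(48, 52).hex())   # aim toward the rear passenger side
```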
At operation 1302, the personal device 104 detects the in-vehicle components 106 of the zone 108 of the vehicle 102. In an example, the vehicle component interface application 118 may identify what in-vehicle components 106 are available within the vehicle 102 according to the advertisements of the in-vehicle components 106 identified using the wireless transceiver 112 of the personal device 104.
At 1304, the process 1300 determines the seating location of the user within the vehicle 102. In an example, signal strength 116 information between the personal device 104 and the in-vehicle components 106 may be utilized to detect the location of the personal device 104. In many examples, the determination of the zone 108 may be performed by the vehicle component interface application 118. In another example, the determination may be performed by one or more of the in-vehicle components 106 of the vehicle 102, and may be indicated to the vehicle component interface application 118 of the personal device 104.
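As one non-limiting sketch of a coarse proximity-based determination, the example below selects the zone 108 whose associated in-vehicle components 106 report the strongest average signal strength 116; the sample readings are hypothetical.

```python
from collections import defaultdict
from statistics import mean

def estimate_zone(rssi_samples):
    """Pick the seating zone 108 whose in-vehicle components are heard loudest.

    `rssi_samples` is a list of (zone_id, rssi_dbm) readings collected from the
    advertisements of components whose zone association is known; a coarse
    proximity estimate simply selects the zone with the strongest average RSSI.
    """
    by_zone = defaultdict(list)
    for zone_id, rssi in rssi_samples:
        by_zone[zone_id].append(rssi)
    return max(by_zone, key=lambda z: mean(by_zone[z]))

samples = [("108-A", -78), ("108-B", -74), ("108-C", -55), ("108-C", -58), ("108-D", -69)]
print(estimate_zone(samples))   # 108-C
```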
At operation 1306, the personal device 104 identifies the interior map 204 of the vehicle 102. In an example, the interior map 204 may be a generic vehicle 102 interior image or previously downloaded image retrieved from the storage of the personal device 104. In another example, the interior map 204 may be downloaded from a server included within the vehicle 102, or from one of the in-vehicle components 106 configured to serve component information to the personal device 104. In yet another example, the one of the in-vehicle components 106 configured to serve component information to the personal device 104 may provide the personal device 104 with an identifier of the vehicle 102 (e.g., VIN), or an identifier of the interior map 204 image corresponding to the vehicle 102. This identifier may then be used in a query to a server external to the vehicle 102, which may return the interior map 204 to the personal device 104.
In 1308, the personal device 104 generates the component indications 206 for overlay over the interior map 204. In an example, the component types, locations and/or dimensions of the component indications 206 may be identified according to location information decoded from the service UUIDs of the advertisement data of the corresponding in-vehicle components 106. In another example, one of the in-vehicle components 106 configured to serve vehicle 102 information to the personal device 104 may provide the personal device 104 with an identifier of the vehicle 102 (e.g., VIN) which may be used to query a server external to the vehicle 102 to retrieve the interior locations and dimensions of the in-vehicle components 106. The component indications 206 may further be generated in colors or patterns according to the types of the in-vehicle components 106. The vehicle component interface application 118 may also generate a legend control 208, such that each type indication 210 in the legend control 208 corresponds to one of the types of in-vehicle components 106, and is rendered in the same color or pattern as in which the component indications 206 of that type are rendered.
The personal device 104 sends the interior map 204 and component indications 206 for overlay to the display 202 at operation 1310. Examples of the displayed content include the user interfaces 200-1200 described in detail above.
At operation 1312, the personal device 104 determines whether the component indications 206 should be updated. In an example, the user may have selected to toggle the display or visibility of a type of component indications 206 by selecting one of the type indications 210 from the legend control 208. In another example, the personal device 104 may re-detect the in-vehicle components 106 of the vehicle 102, and may determine that one or more in-vehicle components 106 has been added or removed. If the component indications 206 should be updated, control passes to operation 1308. Otherwise, control remains at operation 1312.
In operation 1402, the personal device 104 determines whether gesture input selecting component indications 206 is received. In an example, the vehicle component interface application 118 may receive one or more gestures 402 from a user selecting in-vehicle components 106 from the display 202. The gestures 402 may include, for example, additive gestures 402 to add components to a group, and subtractive gestures 402 to remove one or more components from the group. Example gestures 402 are illustrated in the user interfaces 400 and 700. If one or more gestures 402 are received, the vehicle component interface application 118 determines the in-vehicle components 106 included within the selection at operation 1404. After operation 1404, control returns to operation 1402. If no new gestures 402 are received, control passes to operation 1406.
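A minimal sketch of how the selection might be resolved, assuming a simple hit-radius test between the sampled swipe path and the component indication positions, follows; the radius and coordinates are illustrative.

```python
import math

def components_hit_by_swipe(component_positions, swipe_path, hit_radius=0.3):
    """Return the ids of component indications 206 crossed by a swipe 402 path.

    `component_positions` maps ids to (x, y) map coordinates and `swipe_path`
    is the sampled sequence of touch points; a component is selected when any
    sample falls within `hit_radius` of it. Radius and coordinates are illustrative.
    """
    selected = []
    for cid, (cx, cy) in component_positions.items():
        if any(math.hypot(px - cx, py - cy) <= hit_radius for px, py in swipe_path):
            selected.append(cid)
    return selected

positions = {"106-A": (0.3, 0.4), "106-B": (0.9, 0.4), "106-D": (0.9, 2.1)}
path = [(0.25, 0.45), (0.6, 0.42), (0.95, 0.38)]
print(components_hit_by_swipe(positions, path))   # ['106-A', '106-B']
```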
At 1406, the personal device 104 displays component controls 504 for the selected component indications 206. In an example, the vehicle component interface application 118 may display the component controls 504 in place of the legend control 208, although other layout possibilities may be used as well. The vehicle component interface application 118 may further determine the specific component controls 504 to display based on information provided by the in-vehicle components 106 to be controlled, e.g., according to an enumeration of the characteristic UUIDs of the characteristics of the service UUID of the in-vehicle components 106 being controlled, by downloading the information from a server included within or external to the vehicle 102, or by downloading the information from one of the in-vehicle components 106 configured to serve control information to the personal device 104. State information for the component controls 504 may also be retrieved by the vehicle component interface application 118 from the in-vehicle components 106 for presentation in the user interface.
The personal device 104 determines whether input to the component controls 504 is received at operation 1408. If input is received to one or more of the component controls 504, the vehicle component interface application 118 passes control to operation 1410. Otherwise, control passes to operation 1412.
At operation 1410, the one or more of the component controls 504 being controlled is updated with the user input. In an example, if the color of light to be provided by the in-vehicle components 106 is updated, the vehicle component interface application 118 applies the color selection to each of the in-vehicle components 106 of the selected group. After operation 1410, control passes to operation 1402.
At 1412, the personal device 104 determines whether location input is received to the interior map 204. In an example, the vehicle component interface application 118 may receive input from the handle control 506 from a user performing collective aiming of the collection of in-vehicle components 106 to a particular location. In another example, the vehicle component interface application 118 may receive input from the user painting a desired maximum light intensity region 902, or a region 1202 in which light is not preferred. If a location is received, control passes to operation 1414. Otherwise, control passes to operation 1402.
At operation 1414, the personal device 104 aims outputs of the selected in-vehicle components 106 based on the received location. In an example, responsive to the location provided to the display 202, the vehicle component interface application 118 may adjust the direction, intensity, spread, and/or other properties of the in-vehicle components 106 to effect the change in aiming of the collection of in-vehicle components 106. The adjustments may be made, for example, based on intensity and distance relationship information pre-coded into the vehicle component interface application 118 or provided by the in-vehicle components 106 to the personal device 104. After operation 1414, control passes to operation 1402.
Computing devices described herein, such as the personal devices 104 and in-vehicle components 106, generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, C#, Visual Basic, JavaScript, Perl, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media.
With regard to the processes, systems, methods, heuristics, etc., described herein, it should be understood that, although the steps of such processes, etc., have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain embodiments, and should in no way be construed so as to limit the claims.
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms of the invention. Rather, the words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the invention. Additionally, the features of various implementing embodiments may be combined to form further embodiments of the invention.