The present disclosure relates generally to mobile communication device connectivity, and more particularly, to a system and method for vehicular and mobile communication device connectivity.
As mobile communication devices (e.g., smart devices, smart phones, cell phones, tablets, PDAs, laptops, etc.) have become increasingly ubiquitous in society, mobile communication device usage and dependence have, not surprisingly, increased dramatically, particularly among younger generations. Along these lines, owners of mobile communication devices have been found to place a higher ownership priority on their respective mobile communication device than other devices, even including vehicles. Therefore, it follows that the desire for connectivity to one's mobile communication device while riding in a vehicle is expanding, and especially so for drivers. By enhancing vehicular and mobile communication device connectivity, user convenience and accessibility can be increased, while minimizing security risks and driver distraction.
The present disclosure provides techniques for enabling seamless in-vehicle connectivity to mobile communication devices. Once a user (i.e., driver or passenger) initiates a connection between his or her mobile communication device and the vehicle, the user can conveniently interact with the device using a variety of techniques throughout the vehicle cabin. The user can view a representation of the mobile communication device's interface displayed in the vehicle (e.g., on the dashboard), such that it is integrated with vehicular information, such as heating, ventilating, and air conditioning (HVAC) controls, radio controls, navigational features, and the like. Further, the user can customize the layout of the in-vehicle interface by manually rearranging the placement of the displayed mobile communication device and vehicle information. Also, the existing hardware of the mobile communication device can be leveraged to provide additional features in the vehicle, such as safety-related features, navigational features, and other convenience-related functionality.
According to embodiments of the present disclosure, a method includes: establishing a connection between a mobile communication device and a controller area network (CAN) bus in a CAN of a vehicle; receiving, via the established connection, information from the mobile communication device, including image information acquired by a camera of the mobile communication device; displaying vehicle information and a representation of an interface of the mobile communication device based on the information received from the mobile communication device in a display area of the vehicle; receiving input for controlling a function of the mobile communication device; and controlling, via the established connection, the function of the mobile communication device according to the received input.
The method may further include controlling a function of the vehicle based on the information received from the mobile communication device.
The method may further include performing a safety-related function associated with the vehicle based on the image information acquired by the camera of the mobile communication device. The safety-related function may relate to at least one of: a lane keeping assist system (LKAS), a lane departure warning (LDW), pedestrian detection, forward collision warning (FCW), and adaptive cruise control (ACC).
The method may further include performing a navigation-related function associated with the vehicle based on navigation information acquired by the mobile communication device.
The method may further include receiving, via the established connection, additional information from the mobile communication device acquired by a hardware-based component of the mobile communication device other than the camera. The hardware-based component may include: an ambient light sensor, a global positioning system (GPS) unit, an accelerometer, a gyroscope, a microphone, a compass, or a barometer.
The input for controlling the function of the mobile communication device may include at least one of: a touch gesture of a user, a motion gesture of the user, a gaze of the user, and a sound of the user. The method may further include sensing the touch gesture at a touchscreen coupled to the display area. The method may further include identifying the motion gesture or the gaze using a camera installed in the vehicle. The method may further include identifying the motion gesture or the gaze using a camera of the mobile communication device. The user may be a driver of the vehicle.
The method may further include identifying a particular gesture which is linked with a particular function of the mobile communication device based on the received input, the gesture being associated with at least one of: a motion of the driver, a sound of the driver, a gaze of the driver, and an eye position of the driver.
The method may further include supplying power to the mobile communication device.
The method may further include establishing the connection between the mobile communication device and the CAN bus via a wired or wireless connection.
The method may further include establishing the connection between the mobile communication device and the CAN bus via a docking station in the vehicle. The docking station may be located behind a rear-view mirror of the vehicle.
The method may further include concurrently displaying the vehicle information and the representation of the interface of the mobile communication device in the display area.
The displayed vehicle information may be associated with at least one of: heating, ventilation, and air conditioning (HVAC) information, infotainment information, and telematics information.
The display area may include a light-emitting diode (LED)-based screen, a liquid crystal display (LCD)-based screen, or a dashboard area onto which information is projected by a projection device.
The method may further include adjusting an appearance of the displayed representation of the interface of the mobile communication device in the display area according to received input.
Furthermore, according to embodiments of the present disclosure, a system includes: a vehicle including a controller area network (CAN) bus in a CAN of the vehicle; and a mobile communication device connected to the CAN bus. The CAN bus: i) receives, via the connection, information from the mobile communication device, including image information acquired by a camera of the mobile communication device, ii) displays vehicle information and a representation of an interface of the mobile communication device based on the information received from the mobile communication device in a display area of the vehicle, iii) receives input for controlling a function of the mobile communication device, and iv) controls, via the connection, the function of the mobile communication device according to the received input.
Furthermore, according to embodiments of the present disclosure, a method includes: establishing a connection between a mobile communication device and a controller area network (CAN) bus in a CAN of a vehicle; receiving, via the established connection, information from the mobile communication device, including image information acquired by a camera of the mobile communication device; displaying vehicle information and a representation of an interface of the mobile communication device based on the information received from the mobile communication device in a display area of the vehicle; and controlling a function of the vehicle based on the information received from the mobile communication device.
Furthermore, according to embodiments of the present disclosure, a system includes: a vehicle including a controller area network (CAN) bus in a CAN of the vehicle; and a mobile communication device connected to the CAN bus. The CAN bus: i) receives, via the connection, information from the mobile communication device, including image information acquired by a camera of the mobile communication device, ii) displays vehicle information and a representation of an interface of the mobile communication device based on the information received from the mobile communication device in a display area of the vehicle, and iii) controls a function of the vehicle based on the information received from the mobile communication device.
The embodiments herein may be better understood by referring to the following description in conjunction with the accompanying drawings in which like reference numerals indicate identical or functionally similar elements, of which:
It should be understood that the above-referenced drawings are not necessarily to scale, presenting a somewhat simplified representation of various preferred features illustrative of the basic principles of the disclosure. The specific design features of the present disclosure, including, for example, specific dimensions, orientations, locations, and shapes, will be determined in part by the particular intended application and use environment.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. The term “coupled” denotes a physical relationship between two components whereby the components are either directly connected to one another or indirectly connected via one or more intermediary components.
It is understood that the term “vehicle” or “vehicular” or other similar term as used herein is inclusive of motor vehicles, in general, such as passenger automobiles including sports utility vehicles (SUV), buses, trucks, various commercial vehicles, watercraft including a variety of boats and ships, aircraft, and the like, and includes hybrid vehicles, electric vehicles, hybrid electric vehicles, hydrogen-powered vehicles and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum). As referred to herein, an electric vehicle (EV) is a vehicle that includes, as part of its locomotion capabilities, electrical power derived from a chargeable energy storage device (e.g., one or more rechargeable electrochemical cells or other type of battery). An EV is not limited to an automobile and may include motorcycles, carts, scooters, and the like. Furthermore, a hybrid vehicle is a vehicle that has two or more sources of power, for example both gasoline-based power and electric-based power (e.g., a hybrid electric vehicle (HEV)).
The term “user” may encompass any person substantially capable of interacting with a vehicle, as it is defined herein, including, but not limited to a driver, a passenger, and the like. Also, the term “mobile communication device” may encompass any portable, communication-enabled device, including, but not limited to, smart devices, smart phones, cell phones, tablets, PDAs, laptops, and so forth.
Additionally, it is understood that one or more of the below methods, or aspects thereof, may be executed or mediated by at least one controller area network (CAN) bus in a CAN of a vehicle. A CAN is a serial bus network of controllers or microcontrollers (e.g., electronic control units (ECUs)) that interconnects devices, actuators, sensors, and the like in a system (such as a vehicle, as in the present case) for real-time control applications. Vehicles typically employ a wide variety of controllers including an engine control unit and others used for transmission, airbags, anti-lock braking, cruise control, electric power steering, audio systems, power windows, doors, mirror adjustment, battery or recharging systems for electric vehicles, and so forth. The controllers may include a processor as well as a memory configured to store program instructions, where the processor is specifically programmed to execute the program instructions to perform one or more processes.
In CANs, messages are broadcast to all nodes (consisting of, e.g., a controller, a sensor, a transceiver, etc.) in the network using an identifier unique to the network. Based on the identifier, the individual nodes decide whether the message is relevant, and thus whether to process the message. Also, the nodes determine the priority of the message in terms of competition for access to the CAN bus, which allows the nodes to communicate with each other. Accordingly, the CAN bus may effectively control nodes (including the mobile communication device, as described herein) connected in the CAN by facilitating communication among the nodes and via transmission of control messages throughout the network.
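The identifier-based filtering and bus-arbitration behavior described above can be illustrated with a minimal sketch. This is not part of the disclosed embodiments; all class, function, and identifier names below are hypothetical, and real CAN arbitration occurs bit-by-bit in hardware (lower identifier wins), which the sketch only approximates.

```python
# Illustrative sketch (hypothetical names) of CAN behavior: messages are
# broadcast to all nodes, each node filters by identifier, and the message
# with the lowest identifier wins arbitration for bus access.

class CanNode:
    """A CAN node that processes only messages whose identifiers it accepts."""
    def __init__(self, name, accepted_ids):
        self.name = name
        self.accepted_ids = set(accepted_ids)
        self.received = []

    def on_message(self, can_id, payload):
        # Based on the identifier, the node decides whether the broadcast
        # message is relevant, and thus whether to process it.
        if can_id in self.accepted_ids:
            self.received.append((can_id, payload))

def arbitrate(pending):
    """Lower identifier means higher priority for access to the bus."""
    return min(pending, key=lambda msg: msg[0])

def broadcast(nodes, can_id, payload):
    # Messages are broadcast to every node in the network.
    for node in nodes:
        node.on_message(can_id, payload)

hvac = CanNode("hvac", accepted_ids=[0x2F0])
phone = CanNode("phone", accepted_ids=[0x100, 0x101])
nodes = [hvac, phone]

# Two messages compete for the bus; the lower identifier is sent first,
# and only the node that accepts that identifier processes it.
pending = [(0x2F0, b"\x16"), (0x100, b"call")]
winner = arbitrate(pending)
broadcast(nodes, *winner)
```

Here the hypothetical "phone" node processes the winning message while the HVAC node ignores it, mirroring how a mobile communication device connected as a node would receive only the control messages addressed to identifiers it accepts.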
Referring now to embodiments of the present disclosure, the disclosed techniques allow for seamless in-vehicle connectivity to mobile communication devices. Once a user initiates a connection between his or her mobile communication device and the vehicle, the user can conveniently interact with the device using a variety of techniques throughout the vehicle cabin, thereby eliminating the need for the driver to interact directly with the device. The user can view a representation of the mobile communication device's interface displayed in the vehicle (e.g., on the dashboard), such that it is integrated with vehicular information, such as HVAC controls, radio controls, navigational features, and the like. Further, the user may personalize the in-vehicle display of the mobile communication device information and vehicle information by manually rearranging the placement of the displayed mobile communication device and vehicle information. Also, the existing hardware of the mobile communication device can be leveraged to provide additional features in the vehicle, such as safety-related features, navigational features, and other convenience-related functionality.
The mobile communication device 110 may be connected to the CAN bus 210 via a wired connection. For instance, the mobile communication device 110 may be inserted into a docking station (not shown) in the vehicle 100. The docking station may be variously located throughout the vehicle 100; though the docking station may preferably be located behind a rear-view mirror of the vehicle 100, as demonstratively shown in
Information may be transmitted back and forth between the mobile communication device 110 and the CAN bus 210 over the established connection. For instance, the CAN bus 210 may receive information from the mobile communication device 110 via the established connection. The information transmitted from the mobile communication device 110 to the CAN bus 210 may include any information suitable for transmission, such as, for example, information relating to the user's personal data, contacts, calendar, emails, messages, phone calls, applications, and so forth. Further, the transmitted information may include information acquired by a hardware-based component of the mobile communication device 110, such as, for example, a camera, an ambient light sensor, a global positioning system (GPS) unit, an accelerometer, a gyroscope, a microphone, a compass, a barometer, and so forth. The information acquired by a hardware-based component of the mobile communication device 110 may have been previously acquired (before connection to the CAN bus 210) or acquired in real-time (after connection to the CAN bus 210). The information transmitted by the mobile communication device 110 can then be used by the CAN bus 210 for controlling a function of the vehicle 100, as described in greater detail below. Conversely, the CAN bus 210 may transmit information to the mobile communication device 110, such as control messages, via the established connection. For instance, the CAN bus 210 may control a function of the mobile communication device 110 according to received input, as similarly described in greater detail below.
Additionally, the CAN bus 210 can cause a representation of an interface of the mobile communication device 110 to be displayed in a display area 120 of the vehicle 100. The representation of the interface of the mobile communication device 110 as displayed in the display area 120 may be based on the information received at the CAN bus 210 from the mobile communication device 110. For instance, as demonstratively shown in
Notably, the representation of the mobile communication device 110 interface shown in
Vehicle information may also be displayed in the display area 120. Conveniently, vehicle information may be displayed in the display area 120 concurrently with the mobile communication device information, such that the driver may simultaneously view useful information relating to both the vehicle 100 and the mobile communication device 110. The vehicle information may include any information relating to a state of the vehicle, including, for example, HVAC information, infotainment information, and telematics information.
The mobile communication device information and/or the vehicle information may be displayed in the display area 120 of the vehicle 100 using any display means suitable for displaying such information. For instance, as demonstratively shown in
The CAN bus 210 may be configured to receive input for controlling a function of the mobile communication device 110 (e.g., CONTROL INPUT 220 in
In this regard, user input for controlling a function of the mobile communication device 110 may include at least one of: a touch gesture of a user, a motion gesture of the user, a gaze of the user, and a sound of the user. The touch gesture may be sensed at the display area 120. For instance, the touch gesture may be sensed at a touchscreen coupled to the display area 120. Any suitable type of touchscreen technology may be employed, including, for example, a capacitive screen, a resistive/pressure-based screen, and the like. The motion gesture or the gaze may be captured using a camera installed in the vehicle 100. Alternatively, or additionally, the motion gesture or the gaze may be captured using the camera 240 of the mobile communication device 110.
Then, based on the received input, the CAN bus 210 may identify a particular gesture of the user which is linked with a particular function of the mobile communication device 110. For instance, the gesture may be associated with at least one of: a motion of the user, a sound of the user, a gaze of the user, and an eye position of the user. Upon identifying the particular gesture, the CAN bus 210 may control the mobile communication device 110 via the connection such that the corresponding function is performed by the mobile communication device 110 (e.g., making a call, sending an SMS message, initiating a navigation to a destination, etc.). To this end, the mobile communication device 110 and CAN bus 210 may share applications, such that the CAN bus 210 can execute applications installed on the mobile communication device 110, thus allowing for seamless integration and functional continuity for the user upon entering the vehicle 100. Furthermore, an application may be installed in the mobile communication device 110 that facilitates the device's ability to control a function of the vehicle through the CAN bus 210.
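The linkage between an identified gesture and a particular device function can be sketched as a simple dispatch table. This sketch is purely illustrative and not part of the disclosed embodiments; the gesture names and device functions are hypothetical stand-ins for whatever the recognition system produces.

```python
# Hypothetical sketch: each recognized gesture is linked with a particular
# function of the mobile communication device, and identifying the gesture
# causes the corresponding function to be performed.

def make_call(contact):
    return f"calling {contact}"

def send_sms(contact):
    return f"sms to {contact}"

def start_navigation(destination):
    return f"navigating to {destination}"

# Mapping of recognized gestures to device functions (illustrative names).
GESTURE_MAP = {
    "double_tap": make_call,
    "swipe_left": send_sms,
    "point_ahead": start_navigation,
}

def dispatch(gesture, argument):
    """Identify the function linked with the gesture and perform it."""
    handler = GESTURE_MAP.get(gesture)
    if handler is None:
        return "unrecognized gesture"
    return handler(argument)
```

For example, `dispatch("double_tap", "Alice")` would trigger the hypothetical call function, analogous to the CAN bus relaying a control message so the device makes a call.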
The CAN bus 210 can cause/transmit information to be displayed in the display area 120 (e.g., on the vehicle dashboard), as described above, using a display device 230, such as an LED-based screen, an LCD-based screen, a projection device (e.g., a pico-projector or the like), or any other device suitable for displaying information in a vehicle (e.g., DISPLAY 230 in
Additionally, the CAN bus 210 can cause/transmit vehicle information to be displayed in the display area 120 using the display device 230. The vehicle information may relate to, for example, a HVAC system, telematics (e.g., GPS navigation, safety-related communications, driving assistance systems, etc.), infotainment (e.g., media content, social media content, personalized content, etc.), and so forth. The vehicle information may simply be presented as status information or may include controls enabling the user to adjust vehicle settings.
Moreover, the layout of the displayed mobile communication device information and/or displayed vehicle information in the display area 120 may be customized by the user according to his or her preferences. That is, the CAN bus 210 can adjust an appearance of the displayed representation of the interface of the mobile communication device 110 in the display area 120 according to received input. In particular, users have the ability to arrange applications, windows, and information along the dashboard (e.g., in the display area 120) as desired. For instance, a user may use a touch gesture at a touchscreen coupled to the display area 120 in order to select and drag a particular window, information grouping, image, or the like, to another location, or to remove it completely. Further, the user may select (or remove), and then position, additional information of the mobile communication device 110 or vehicle 100 to be displayed in the display area 120.
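The select-drag-reposition behavior described above amounts to reordering and removing items in an ordered layout. The following sketch is illustrative only and not part of the disclosed embodiments; the item names are hypothetical.

```python
# Hypothetical sketch of user-customizable dashboard layout: items are
# displayed in list order, and a drag gesture moves an item to a new
# position or removes it from the display area entirely.

class DashboardLayout:
    def __init__(self, items):
        # Items render left-to-right (or however the display arranges them)
        # in list order.
        self.items = list(items)

    def move(self, name, new_index):
        """Drag a displayed item to a new position in the layout."""
        self.items.remove(name)
        self.items.insert(new_index, name)

    def remove(self, name):
        """Remove an item from the display area completely."""
        self.items.remove(name)

layout = DashboardLayout(["hvac", "phone_ui", "radio"])
layout.move("radio", 0)   # user drags the radio panel to the far left
layout.remove("hvac")     # user dismisses the HVAC panel
```

A touch gesture at the touchscreen would translate to `move` or `remove` calls, after which the display device redraws the display area in the new order.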
The camera 240 of the mobile communication device 110 may also be utilized by the CAN bus 210 in order to enhance functionality of the vehicle 100 (e.g., CAMERA 240 in
Other hardware-based components and/or software of the mobile communication device 110 may be leveraged by the CAN bus 210 in the above manner, as well. That is, the CAN bus 210 may receive, via the established connection, additional information from the mobile communication device 110 acquired by a hardware-based component of the mobile communication device 110 other than the camera 240. The hardware-based components may include, for example, an ambient light sensor, a global positioning system (GPS) unit, an accelerometer, a gyroscope, a microphone, a compass, a barometer, and the like.
As an example, the CAN bus 210 may perform a navigation-related function associated with the vehicle 100 based on navigation information acquired by the mobile communication device 110. In this regard, a GPS unit of the mobile communication device 110 may acquire a current location of the device 110 (as well as the vehicle 100, as the mobile communication device 110 resides therein), and the mobile communication device 110 may transmit the same (i.e., navigational information) to the CAN bus 210. Furthermore, the mobile communication device 110 may determine an optimal route from the determined current location to a chosen destination using a navigation application installed on the device 110. The routing information may also be transmitted to the CAN bus 210. Based on the received information, the CAN bus 210 may, for example, cause the optimal route to be displayed in the display area 120, cause routing instructions to be audibly output to the driver, update the current location of the vehicle 100 as the mobile communication device 110 detects an updated current location (using the GPS unit), and so forth. A wide variety of other ways for leveraging the hardware and/or software of the mobile communication device 110 may also be envisioned.
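The navigation flow above (device-side GPS fixes and a device-computed route, relayed to the vehicle for display) can be sketched as follows. This is an illustrative sketch only, not part of the disclosed embodiments; the class names and the naive waypoint-matching logic are hypothetical.

```python
# Hypothetical sketch: the device reports GPS fixes and a computed route;
# the vehicle side keeps the latest position and route for display,
# dropping waypoints as they are reached.

from dataclasses import dataclass

@dataclass
class Fix:
    """A GPS fix reported by the device's GPS unit."""
    lat: float
    lon: float

class NavigationRelay:
    """Vehicle-side holder of device-reported location and routing info."""
    def __init__(self):
        self.current = None
        self.route = []

    def update_location(self, fix):
        # Called each time the device's GPS unit detects an updated location.
        self.current = fix

    def set_route(self, waypoints):
        # Route computed by a navigation application on the device.
        self.route = list(waypoints)

    def advance_if_reached(self):
        # Drop the leading waypoint once the vehicle reaches it (naive
        # exact-match check; a real system would use a distance threshold).
        if self.route and self.current == self.route[0]:
            self.route.pop(0)
```

Each `update_location` call corresponds to the device transmitting fresh navigational information over the established connection, after which the display would redraw the route.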
At step 300, the CAN system is initialized (i.e., powered-up), at which point the CAN bus 210 obtains an initial telemetry status of the vehicle 100 (step 310). Meanwhile, a connection between the mobile communication device 110 (illustratively referred to as a “phone” in
If a connection between the mobile communication device 110 and the CAN bus 210 is successful, the CAN bus 210 obtains a telemetry status update of the vehicle 100 (step 330). At step 335, the telemetry status information is then uploaded to the display area 120. In other words, the display device 230 displays vehicle information including the updated telemetry information obtained in step 330. At step 340, the user can perform an action as input (e.g., for controlling a function of the mobile communication device 110). The user input may be in the form of a touch gesture at the display area 120, a motion gesture, a gaze, a sound, or the like. The user input may then be processed by the CAN bus 210 in order to determine a function of the mobile communication device 110 that corresponds to the identified user input (step 345).
Meanwhile, environmental events may occur, either in the cabin of the vehicle (e.g., a motion of the driver, a gaze of the driver, etc.) or outside of the vehicle (e.g., a pedestrian walks into the road, the vehicle 100 veers into another lane, the vehicle 100 may collide with another object, etc.) (step 350). The mobile communication device 110 can activate its camera 240 so as to capture the occurring environmental event (step 355). Then, at step 360, the mobile communication device 110 processes the image information acquired by the camera 240 in step 355, as well as commands based on user input relayed from the CAN bus 210 to the mobile communication device 110 in step 345. The mobile communication device 110 also performs a function(s) in accordance with control messages relayed by the CAN bus 210 (e.g., make a call, send an SMS message, compose an email, initiate a navigation, etc.). Then, the mobile communication device 110 transmits the information, including image information, to the CAN bus 210, and steps 330-360 can be repeated.
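The repeating cycle of steps 330-360 described above can be summarized as a simple control loop. This sketch is illustrative only and not part of the disclosed embodiments; the stub classes and their return values are hypothetical placeholders for the vehicle, display, and device behavior.

```python
# Hypothetical sketch of one iteration of the repeated cycle (steps 330-360):
# telemetry update, display, user input resolution, image capture, and
# device-side processing of the relayed command.

class Vehicle:
    def get_telemetry(self):
        return {"speed": 50}  # placeholder telemetry status

class Display:
    def __init__(self):
        self.shown = []
    def show(self, info):
        self.shown.append(info)

class Phone:
    COMMANDS = {"tap_call_icon": "make_call"}  # illustrative mapping
    def resolve_function(self, user_input):
        return self.COMMANDS.get(user_input)
    def capture_frame(self):
        return "frame"  # stands in for camera image information
    def process(self, frame, command):
        # Process the image information and perform the relayed function.
        return {"frame": frame, "performed": command}

def control_cycle(vehicle, phone, display, user_input=None):
    telemetry = vehicle.get_telemetry()                 # step 330
    display.show(telemetry)                             # step 335
    command = None
    if user_input is not None:                          # step 340
        command = phone.resolve_function(user_input)    # step 345
    frame = phone.capture_frame()                       # step 355
    return phone.process(frame, command)                # step 360
```

Calling `control_cycle` repeatedly corresponds to steps 330-360 being repeated, with the device transmitting its processed information (including image information) back over the connection each iteration.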
The procedure depicted in
It should be noted that the steps shown in
Accordingly, techniques are described herein that enhance vehicular and mobile communication device connectivity, thereby increasing user convenience and accessibility, while minimizing security risks and driver distraction, as the driver no longer needs to interact directly with the mobile communication device. Because the vehicle and the mobile communication device can be seamlessly integrated, the driver does not need to learn or perform different techniques to control his or her mobile communication device while in a vehicle; instead, the driver can perform the same functions to control the mobile communication device that are used when the driver is not in the vehicle. Furthermore, the display area displaying the representation of the mobile communication device interface can be personalized according to the driver's preferences. Even further, pre-existing hardware and/or software of the mobile communication device can be leveraged to provide additional in-vehicle functionality, relating to safety, navigation, infotainment, and the like.
While there have been shown and described illustrative embodiments that provide for a system and method for vehicular and mobile communication device connectivity, it is to be understood that various other adaptations and modifications may be made within the spirit and scope of the embodiments herein. Accordingly, this description is to be taken only by way of example and not to otherwise limit the scope of the embodiments herein. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the embodiments herein.