The present disclosure relates to systems and methods for enabling vehicle movement via an external interface configured to be removably attached to a vehicle exterior surface.
Users may frequently move their vehicles over relatively short distances while performing outdoor activities or tasks, such as farming or laying fences. For example, a user may move the user's vehicle a short distance (e.g., 5-10 meters) many times as the user performs such an activity.
It may be inconvenient for the user to enter, move, and exit the vehicle multiple times while performing the activity, and hence the user may prefer not to enter the vehicle frequently during such activities. Therefore, it may be desirable to have a system that enables the user to conveniently move the vehicle over relatively short distances without repeatedly entering and exiting the vehicle.
The detailed description is set forth with reference to the accompanying drawings. The use of the same reference numerals may indicate similar or identical items. Various embodiments may utilize elements and/or components other than those illustrated in the drawings, and some elements and/or components may not be present in various embodiments. Elements and/or components in the figures are not necessarily drawn to scale. Throughout this disclosure, depending on the context, singular and plural terminology may be used interchangeably.
The present disclosure describes a vehicle that may be moved by using an external interface that may be removably attached to a vehicle exterior surface. A user may cause vehicle movement and/or vehicle steering wheel rotation by providing user inputs to the interface, which may transmit the user inputs via a wired connection or a wireless network to the vehicle to cause the vehicle movement and/or the vehicle steering wheel rotation. In some aspects, the vehicle may be configured to control the vehicle speed and/or the vehicle steering wheel rotation based on an interface location and/or orientation relative to the vehicle and a maximum permissible vehicle speed/steering wheel rotation angle associated with the interface location and/or orientation. For example, the vehicle may not enable the vehicle speed to increase beyond a first predefined maximum speed when the interface may be located at the vehicle (e.g., connected to a vehicle connection port disposed on the vehicle exterior surface) and may not enable the vehicle speed to increase beyond a second predefined maximum speed when the interface may be located away from the vehicle (e.g., when the user may be holding the interface in the user's hand). In an exemplary aspect, the first predefined maximum speed may be different from the second predefined maximum speed.
In some aspects, the vehicle may determine the interface location and/or orientation relative to the vehicle based on interface information that the vehicle may obtain from an interface sensor unit associated with the interface, vehicle information that the vehicle may obtain from a vehicle sensor unit, and user device information that the vehicle may obtain from a user device that the user may be carrying. In an exemplary aspect, the interface sensor unit may include an interface accelerometer, an interface gyroscope and an interface magnetometer, and the interface information may include information associated with interface movement speed and direction, inclination/tilt relative to ground, interface angular motion, and/or the like. The vehicle sensor unit may include a vehicle accelerometer, a vehicle gyroscope, a vehicle magnetometer, and vehicle interior and exterior cameras, and the vehicle information may include information associated with vehicle movement speed and direction, inclination/tilt relative to ground, vehicle angular motion, and/or the like. Similarly, the user device information may include user device movement speed and direction, inclination/tilt relative to ground, user device angular motion, and/or the like.
The vehicle may correlate the interface information, the vehicle information and/or the user device information described above to determine the interface location and/or orientation relative to the vehicle. For example, the vehicle may correlate the information described above to determine whether the interface may be located in the vehicle or may be held in the user's hand when the user may be located outside the vehicle.
In further aspects, the vehicle information may include a connection status associated with each connection port, of a plurality of connection ports disposed on the vehicle exterior surface, with the interface. The vehicle may determine whether the interface may be attached to a connection port and the corresponding interface orientation relative to the vehicle based on the connection status included in the vehicle information.
Responsive to determining the interface location and/or orientation relative to the vehicle, the vehicle may obtain a mapping associated with the determined interface location and/or orientation with maximum permissible vehicle speed, steering wheel rotation angle and/or travel distance from a vehicle memory or an external server, to control the vehicle movement. In some aspects, the vehicle may additionally control and/or activate vehicle advanced driver-assistance system (ADAS) features and/or vehicle proximity sensors based on the determined interface location relative to the vehicle.
The present disclosure thus provides a vehicle that may be moved by providing inputs to an interface that may be removably attached to a vehicle exterior surface. The interface may enable the user to cause the vehicle movement without having to enter the vehicle interior portion. Since the user is not required to enter the vehicle to cause the vehicle movement, the interface may assist the user in performing outdoor activities such as farming, laying fences, etc., which may require frequent vehicle movement over short distances. Further, the interface is easy to attach to the vehicle exterior surface via the plurality of connection ports, thus enhancing ease of use for the user.
These and other advantages of the present disclosure are provided in detail herein.
The disclosure will be described more fully hereinafter with reference to the accompanying drawings, in which example embodiments of the disclosure are shown; these embodiments are not intended to be limiting.
The vehicle 102 may take the form of any passenger or commercial vehicle such as, for example, a car, a work vehicle, a crossover vehicle, a van, a minivan, etc. Further, the vehicle 102 may be a manually driven vehicle and/or may be configured to operate in a fully autonomous (e.g., driverless) mode or a partially autonomous mode and may include any powertrain such as, for example, a gasoline engine, one or more electrically-actuated motor(s), a hybrid system, etc.
The environment 100 may further include an external interface 110 (or interface 110) that may be configured to be removably attached to a vehicle exterior surface (or a vehicle interior surface). In some aspects, the vehicle exterior surface may include one or more cavities or slots or connection ports (shown as connection ports 250 in
The interface 110 may be configured to cause and/or control vehicle movement based on user inputs. In some aspects, the user 104 may not be required to enter and exit the vehicle 102 multiple times to frequently move the vehicle 102 over short distances (e.g., around a farm periphery) by using the interface 110. Since the interface 110 may be configured to be removably attached to the vehicle exterior surface, the user 104 may conveniently cause and control the vehicle movement from outside the vehicle 102 by using the interface 110.
In some aspects, the interface 110 may be configured to cause and/or control the vehicle movement when the interface 110 may be attached to one of the connection ports described above. In other aspects, the interface 110 may be configured to cause and/or control the vehicle movement by transmitting command signals wirelessly to the vehicle 102 when the interface 110 may be disposed within a predefined distance from the vehicle 102. As an example, the user 104 may hold the interface 110 in the user's hand/palm and provide the user inputs to the interface 110. The interface 110 may then generate command signals associated with the user inputs and wirelessly transmit the command signals to the vehicle 102 to cause the vehicle movement. In some aspects, a maximum permissible vehicle speed and/or a maximum permissible vehicle steering wheel rotation angle may differ based on whether the interface 110 is attached to a connection port or held in the user's palm (and not attached to any connection port). Further, the maximum permissible vehicle speed and/or the maximum permissible vehicle steering wheel rotation angle may differ based on whether the user 104 is in the vehicle 102 (e.g., holding the interface 110 in the user's hand) or the user 104 is walking in proximity to the vehicle 102 holding the interface 110 in the user's hand (and outside the vehicle 102). The vehicle 102 may be configured to determine an interface location and/or orientation relative to the vehicle 102 and may accordingly enable the interface 110 to cause and/or control the vehicle movement based on the interface location and/or orientation and the user inputs.
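The wireless command exchange described above could be sketched as follows. This is a minimal illustration only: the disclosure does not prescribe a payload format, so the field names, value ranges, and JSON encoding here are assumptions.

```python
# Hypothetical sketch of a command signal sent from the interface to the
# vehicle; all field names and the encoding are illustrative assumptions.
import json
from dataclasses import dataclass, asdict

@dataclass
class CommandSignal:
    interface_id: str
    longitudinal: float  # -1.0 (full reverse) to 1.0 (full forward)
    steering: float      # -1.0 (full left) to 1.0 (full right)

def encode_command(cmd: CommandSignal) -> bytes:
    """Serialize a user input into bytes for wireless transmission."""
    return json.dumps(asdict(cmd)).encode("utf-8")

def decode_command(payload: bytes) -> CommandSignal:
    """Reconstruct the command signal on the vehicle side."""
    return CommandSignal(**json.loads(payload.decode("utf-8")))
```

A round trip preserves the user input, so the vehicle can apply its own location-dependent limits before acting on the decoded command.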
In some aspects, the interface 110 may be dome-shaped (as shown in
In other aspects, the interface 110 may have a shape of an elongated rod or stick and may act like a joystick having one or more tilt sensors, torsional motion sensors, and/or the like. In yet another aspect, the interface 110 may include a plurality of switches or buttons on a switchboard, which may be removably attached to the vehicle 102 or may be hand-held. Although
In some aspects, to cause and/or control the vehicle movement using the interface 110, the user 104 may first activate an external interface movement mode associated with the vehicle 102. For example, the user 104 may transmit a request to the vehicle 102 to activate the external interface movement mode when the user 104 desires to cause and/or control the vehicle movement using the interface 110. The user 104 may transmit the request via a user device (shown as user device 202 in
In some aspects, the vehicle 102 may authenticate the user 104 by requesting the user 104 to input a preset passcode/password on the infotainment system or the user device, by authenticating the user device (e.g., when the user device may be executing a phone-as-a-key (PaaK) application and communicatively paired with the vehicle 102), and/or by authenticating/pairing with a key fob (not shown) associated with the vehicle 102 that the user 104 may be carrying. The methods described here for authenticating the user 104 are exemplary in nature and should not be construed as limiting. The vehicle 102 may authenticate the user 104 by any other method (e.g., facial recognition, fingerprint recognition, etc.) as well, without departing from the present disclosure scope.
The vehicle 102 may determine that the user 104 may be in proximity to the vehicle 102 by determining user device location (when the user device may be executing the PaaK application and communicatively paired with the vehicle 102) or key fob location. When the user device may not be executing the PaaK application, the vehicle 102 may determine the user device location by determining a received signal strength indicator (RSSI) value associated with the user device. In other aspects, the vehicle 102 may determine that the user 104 may be in proximity to the vehicle 102 by obtaining user images from vehicle cameras and/or inputs from other vehicle sensors (e.g., radio detection and ranging (radar) sensors). The methods described here for determining that the user 104 may be in proximity to the vehicle 102 are exemplary in nature and should not be construed as limiting. The vehicle 102 may determine user location by any other method as well, without departing from the present disclosure scope.
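One common way to turn an RSSI value into a proximity estimate is the log-distance path-loss model; the sketch below illustrates the idea. The calibration constants (reference RSSI at 1 meter, path-loss exponent, proximity threshold) are assumptions, not values from the disclosure.

```python
# Illustrative RSSI-based proximity check using the log-distance
# path-loss model; all constants are hypothetical calibration values.
def estimate_distance_m(rssi_dbm: float,
                        rssi_at_1m_dbm: float = -60.0,
                        path_loss_exponent: float = 2.0) -> float:
    """Estimate the user device distance (meters) from a single RSSI
    sample: d = 10 ** ((RSSI_1m - RSSI) / (10 * n))."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def user_in_proximity(rssi_dbm: float, threshold_m: float = 10.0) -> bool:
    """Treat the user as in proximity when the estimated distance is
    within the predefined threshold."""
    return estimate_distance_m(rssi_dbm) <= threshold_m
```

In practice, RSSI is noisy, so an implementation would likely average several samples before applying the threshold.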
The vehicle 102 may authenticate the interface 110 by exchanging preset authentication codes with the interface 110, when the interface 110 may be communicatively coupled with the vehicle 102 via a wireless network and/or when the interface 110 may be attached to a connection port described above. The preset authentication codes may be pre-stored in the vehicle 102 and the interface 110 when, for example, the interface 110 may be first registered with the vehicle 102 (e.g., when the interface 110 may be first used with the vehicle 102). In other aspects, in addition to or alternative to exchanging the preset authentication codes, the vehicle 102 and the interface 110 may obtain an encryption key from an external server (shown as server 204 in
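An exchange of preset authentication codes or a server-provided key could take the form of a challenge-response handshake, such as the HMAC-based sketch below. The disclosure does not specify a protocol, so this construction is an assumption offered purely for illustration.

```python
# Hypothetical challenge-response authentication between vehicle and
# interface; the use of HMAC-SHA256 is an illustrative assumption.
import hashlib
import hmac
import secrets

def issue_challenge() -> bytes:
    """Vehicle side: generate a random challenge nonce."""
    return secrets.token_bytes(16)

def respond(shared_key: bytes, challenge: bytes) -> bytes:
    """Interface side: prove knowledge of the shared secret (e.g., a
    preset authentication code or a server-provided encryption key)."""
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()

def verify(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Vehicle side: check the response in constant time."""
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

A fresh random challenge per attempt prevents simple replay of a previously observed response.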
When the vehicle 102 authenticates the user 104 and the interface 110 and determines that the user 104 may be located within a predefined distance from the vehicle 102, the vehicle 102 may enable the interface 110 to cause and/or control the vehicle movement based on the user inputs received at the interface 110. Stated another way, in this case, the vehicle 102 may activate the external interface movement mode associated with the vehicle 102.
In some aspects, responsive to enabling the interface 110 to cause and/or control the vehicle movement, the vehicle 102 may determine whether the interface 110 may be attached to a connection port in the vehicle 102 or the user 104 may be holding the interface 110, e.g., on a user palm/hand. The vehicle 102 may further determine an interface location and/or orientation relative to the vehicle 102. In some aspects, the vehicle 102 may make such determinations to identify a maximum permissible vehicle speed and/or a maximum permissible vehicle steering wheel rotation angle that may be allowed based on the user inputs on the interface 110. For example, when the user 104 may be holding the interface 110 in the user palm, the vehicle 102 may allow a lower maximum vehicle speed as compared to when the interface 110 may be attached to a connection port (and the user 104 may be outside the vehicle 102). In additional aspects, the vehicle 102 may make such determinations to control and/or activate vehicle advanced driver-assistance system (ADAS) features and/or vehicle proximity sensors based on the determined interface location relative to the vehicle 102. Further, based on the determined interface location relative to the vehicle 102, the vehicle 102 may use one or more vehicle speakers, vehicle lights or vehicle display screens closest to the determined interface location to provide/output notifications associated with an interface operational status, a vehicle movement status, and/or the like, to the user 104. The vehicle 102 may further use remaining vehicle speakers, vehicle lights or vehicle display screens to provide similar or different notifications to bystanders who may be located in proximity to the vehicle 102.
In some aspects, the vehicle 102 may determine whether the interface 110 may be attached or detached from a connection port and the interface location and/or orientation based on interface information that the vehicle 102 may obtain from an interface sensor unit, vehicle information that the vehicle 102 may obtain from a vehicle sensor unit, and/or user device information that the vehicle 102 may obtain from the user device associated with the user 104. In an exemplary aspect, the interface sensor unit may include an interface accelerometer, an interface gyroscope, and/or an interface magnetometer, and the interface information may include information associated with interface movement speed and direction, inclination/tilt relative to ground or north/south pole, interface angular motion, and/or the like. In some aspects, by using the interface information obtained from the interface accelerometer, the interface gyroscope, and/or the interface magnetometer, the vehicle 102 may not only determine the interface location relative to the vehicle 102, but may also determine a mounting point or the connection port on the vehicle 102 at which the interface 110 may be attached. This is because an interface rate of change of speed pattern is a function of a mounting location/point on the vehicle 102, and the vehicle 102 may compare an axial rate of change of speed pattern associated with the interface 110 obtained from the interface sensor unit with a vehicle axial rate of change of speed pattern obtained from the vehicle sensor unit to determine the interface mounting location/point on the vehicle 102.
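The pattern comparison described above, matching the interface's axial rate-of-change-of-speed trace against what would be expected at each mounting point, can be illustrated with a simple correlation sketch. The similarity measure, template representation, and function names are assumptions; the disclosure only states that the patterns are compared.

```python
# Illustrative sketch: pick the mounting point whose expected
# acceleration pattern best correlates with the interface's measured
# pattern. Templates and the Pearson-style measure are assumptions.
from math import sqrt

def normalized_correlation(a, b):
    """Pearson-style similarity between two equal-length traces of
    rate-of-change-of-speed samples; 1.0 means a perfect match."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def best_mounting_point(interface_trace, port_templates):
    """Return the connection port whose template best matches the
    interface's measured trace."""
    return max(port_templates,
               key=lambda port: normalized_correlation(
                   interface_trace, port_templates[port]))
```

A real implementation would also gate on a minimum correlation score so that a handheld interface, matching no template well, is not misclassified as mounted.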
The vehicle sensor unit may include a plurality of vehicle sensors including, but not limited to, a vehicle accelerometer, a vehicle gyroscope, a vehicle magnetometer, vehicle interior and exterior cameras, and/or the like. In some aspects, the vehicle information may include vehicle movement information associated with vehicle movement speed and direction, vehicle inclination/tilt relative to ground, vehicle angular motion, and/or the like. In additional aspects, the vehicle sensor unit may be configured to obtain signals from the plurality of connection ports that may be located in the vehicle 102. In this case, the vehicle information may include information associated with connection status of each connection port. For example, the vehicle information may include information indicating that the interface 110 may be attached to or inserted into a first connection port, from the plurality of connection ports.
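The per-port connection status in the vehicle information might be represented along the following lines; the data structure and field names are assumptions introduced only for illustration.

```python
# Hypothetical representation of connection-port status reported in
# the vehicle information; names and fields are assumptions.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ConnectionPort:
    port_id: str
    location: str            # e.g., "front", "rear", "left", "right"
    interface_attached: bool

def find_attached_port(ports: List[ConnectionPort]) -> Optional[ConnectionPort]:
    """Return the first port whose status reports an attached
    interface, or None when the interface is detached from all ports."""
    for port in ports:
        if port.interface_attached:
            return port
    return None
```

The returned port's location would then also indicate the interface orientation relative to the vehicle (e.g., a rear port implies a rearward-facing interface).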
The user device information may include information associated with user device movement speed, inclination/tilt relative to ground or north/south pole, user device angular motion, and/or the like, which the user device may determine based on signals obtained from user device accelerometer, gyroscope and magnetometer.
Responsive to obtaining the interface information, the vehicle information and/or the user device information, the vehicle 102 may correlate the obtained information to determine whether the interface 110 may be attached to or detached from a connection port and the interface location and/or orientation relative to the vehicle 102 based on the correlation. The process of determining whether the interface 110 may be attached to or detached from a connection port and the interface location and/or orientation is described later in detail below in conjunction with
Responsive to determining the interface location and/or orientation, the vehicle 102 may fetch/obtain a mapping of different interface locations and/or orientations with maximum permissible/allowable vehicle speeds and/or vehicle steering wheel rotation angles that may be pre-stored in a vehicle memory or the external server. The vehicle 102 may then enable the vehicle movement based on the mapping, the determined interface location and/or orientation, and the user inputs obtained from the interface 110. For example, when the interface 110 may be attached to a connection port disposed at a vehicle rear portion and the maximum allowable forward vehicle speed for such an interface location may be 5 miles per hour, the vehicle 102 may limit the forward speed to not more than 5 miles per hour when the user 104 provides inputs to the interface 110 to cause the vehicle 102 to move forward. As another example, when the interface 110 may be disposed on the user palm (and not attached to a connection port) and the maximum allowable forward vehicle speed for such an interface location may be 3 miles per hour, the vehicle 102 may limit the forward speed to not more than 3 miles per hour when the user 104 provides inputs to the interface 110 to cause the vehicle 102 to move forward. As described above, the vehicle 102 may further control and/or activate vehicle ADAS features and/or vehicle proximity sensors based on the determined interface location relative to the vehicle 102. For example, the vehicle 102 may activate the vehicle proximity sensors that are closest to the determined interface location.
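Such a mapping and the resulting clamping of user inputs could be sketched as follows. The 5 mph and 3 mph speeds come from the examples above, but the steering-angle limits, dictionary keys, and function names are hypothetical and not part of the disclosure.

```python
# Hypothetical mapping of interface location/orientation to movement
# limits. Speeds follow the 5 mph (mounted) / 3 mph (handheld) examples;
# the steering-angle limits are invented for illustration.
LIMITS = {
    ("connection_port", "rear"): {"max_speed_mph": 5.0, "max_steering_deg": 30.0},
    ("handheld", None):          {"max_speed_mph": 3.0, "max_steering_deg": 15.0},
}

def permitted_motion(location, orientation,
                     requested_speed_mph, requested_steering_deg):
    """Clamp the requested speed and steering wheel rotation angle to
    the limits mapped to the determined interface location/orientation."""
    limits = LIMITS[(location, orientation)]
    speed = min(requested_speed_mph, limits["max_speed_mph"])
    steering = max(-limits["max_steering_deg"],
                   min(requested_steering_deg, limits["max_steering_deg"]))
    return speed, steering
```

With this structure, retuning the behavior for a new mounting point only requires adding a row to the mapping rather than changing the clamping logic.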
Further details associated with the interface 110 and the vehicle 102 are described below in conjunction with the subsequent figures.
The vehicle 102 and the interface 110 implement and/or perform operations, as described here in the present disclosure, in accordance with the owner's manual and safety guidelines. In addition, any action taken by the user 104 based on recommendations or notifications provided by the vehicle 102 should comply with all the rules specific to the location and operation of the vehicle 102 (e.g., Federal, state, country, city, etc.). The recommendations or notifications provided by the vehicle 102 should be treated as suggestions and only followed according to any rules specific to the location and operation of the vehicle 102.
The system 200 may include the vehicle 102, the interface 110, a user device 202, and one or more servers 204 (or server 204) communicatively coupled with each other via one or more networks 206 (or a network 206). In some aspects, the vehicle 102 and the interface 110 may be communicatively coupled with each other via the network 206 as shown in
The user device 202 may be associated with the user 104 and may be, for example, a mobile phone, a laptop, a computer, a tablet, a wearable device, or any other similar device with communication capabilities. The server 204 may be part of a cloud-based computing infrastructure and may be associated with and/or include a Telematics Service Delivery Network (SDN) that provides digital data services to the vehicle 102 and other vehicles (not shown) that may be part of a vehicle fleet. In further aspects, the server 204 may be configured to provide encryption keys to the vehicle 102 and the interface 110 to enable interface authentication, when the user 104 transmits, e.g., via the user device 202, the request to the vehicle 102 to activate the external interface movement mode, as described above in conjunction with
The network 206 illustrates an example communication infrastructure in which the connected devices discussed in various embodiments of this disclosure may communicate. The network 206 may be and/or include the Internet, a private network, a public network, or another configuration that operates using any one or more known communication protocols such as, for example, transmission control protocol/Internet protocol (TCP/IP), Bluetooth®, BLE, Wi-Fi based on the Institute of Electrical and Electronics Engineers (IEEE) standard 802.11, ultra-wideband (UWB), and cellular technologies such as Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), High-Speed Packet Access (HSPA), Long-Term Evolution (LTE), Global System for Mobile Communications (GSM), and Fifth Generation (5G), to name a few examples.
The interface 110 may include a plurality of units including, but not limited to, a transceiver 208, a processor 210, a memory 212 and an interface sensor unit 214. The transceiver 208 may be configured to transmit/receive, via a wired connection or the network 206, signals/information/data to/from one or more external systems or devices, e.g., the user device 202, the server 204, the vehicle 102, etc. The interface sensor unit 214 may include a plurality of sensors including, but not limited to, pressure sensors, capacitive sensors, a rotary position sensing element, an interface accelerometer, an interface gyroscope, an interface magnetometer, and/or the like. The interface sensor unit 214 may be configured to determine/detect user inputs associated with vehicle longitudinal movement (e.g., vehicle forward or reverse movement) and/or vehicle steering wheel rotation on the interface 110 and generate electric current/command signals based on the user inputs. The interface sensor unit 214 may transmit the generated electric current/command signals to the transceiver 208, which in turn may transmit the command signals to the vehicle 102 to enable the vehicle movement based on the user inputs (e.g., when the vehicle 102 enables the interface 110 to cause and/or control the vehicle movement).
In further aspects, the interface sensor unit 214 may be configured to determine/detect interface information associated with the interface 110 based on the inputs received from the interface accelerometer, the interface gyroscope and/or the interface magnetometer. In some aspects, the interface information may be associated with interface movement speed and direction, inclination/tilt relative to ground or north/south pole, interface angular motion, and/or the like. The interface sensor unit 214 may transmit the interface information to the transceiver 208, which in turn may transmit the interface information to the vehicle 102 when the interface 110 may be communicatively coupled with the vehicle 102 and/or when the vehicle 102 enables the interface 110 to cause and/or control the vehicle movement.
The processor 210 may be disposed in communication with one or more memory devices disposed in communication with the respective computing systems (e.g., the memory 212 and/or one or more external databases not shown in
The vehicle 102 may include a plurality of units including, but not limited to, an automotive computer 216, a Vehicle Control Unit (VCU) 218, and an interface management system 220 (or system 220). The VCU 218 may include a plurality of Electronic Control Units (ECUs) 222 disposed in communication with the automotive computer 216.
In some aspects, the user device 202 may be configured to connect with the automotive computer 216 and/or the system 220 via the network 206, which may communicate via one or more wireless connection(s), and/or may connect with the vehicle 102 directly by using near field communication (NFC) protocols, Bluetooth® protocols, Wi-Fi, Ultra-Wide Band (UWB), and other possible data connection and sharing techniques.
The automotive computer 216 and/or the system 220 may be installed anywhere in the vehicle 102, in accordance with the disclosure. Further, the automotive computer 216 may operate as a functional part of the system 220. The automotive computer 216 may be or include an electronic vehicle controller, having one or more processor(s) 224 and a memory 226. Moreover, the system 220 may be separate from the automotive computer 216 (as shown in
The processor(s) 224 may be disposed in communication with one or more memory devices disposed in communication with the respective computing systems (e.g., the memory 226 and/or one or more external databases not shown in
In accordance with some aspects, the VCU 218 may share a power bus with the automotive computer 216 and may be configured and/or programmed to coordinate the data between vehicle systems, connected servers (e.g., the server 204), and other vehicles (not shown in
In some aspects, the VCU 218 may control vehicle operational aspects and implement one or more instruction sets received from the server 204, from one or more instruction sets stored in the memory 226, including instructions operational as part of the system 220.
The TCU 234 may be configured and/or programmed to provide vehicle connectivity to wireless computing systems onboard and off board the vehicle 102 and may include a Navigation (NAV) receiver 242 for receiving and processing a GPS signal, a Bluetooth® Low Energy Module (BLEM) 244, a Wi-Fi transceiver, an ultra-wideband (UWB) transceiver, and/or other wireless transceivers (not shown in
The ECUs 222 may control aspects of vehicle operation and communication using inputs from human drivers, inputs from the automotive computer 216, the system 220, and/or via wireless signal inputs/command signals received via the wireless connection(s) from other connected devices, such as the server 204, the user device 202, the interface 110, among others.
The BCM 228 generally includes integration of sensors, vehicle performance indicators, and variable reactors associated with vehicle systems and may include processor-based power distribution circuitry that may control functions associated with the vehicle body such as lights, windows, security, camera(s), audio system(s), speakers, wipers, door locks and access control, various comfort controls, etc. The BCM 228 may also operate as a gateway for bus and network interfaces to interact with remote ECUs (not shown in
The DAT controller 236 may provide Level-1 through Level-3 automated driving and driver assistance functionality that may include, for example, active parking assistance, vehicle backup assistance, and/or adaptive cruise control, among other features. The DAT controller 236 may also provide aspects of user and environmental inputs usable for user authentication.
In some aspects, the automotive computer 216 may connect with an infotainment system 246 (or a vehicle Human-Machine Interface (HMI)). The infotainment system 246 may include a touchscreen interface portion and may include voice recognition features, biometric identification capabilities that may identify users based on facial recognition, voice recognition, fingerprint identification, or other biological identification means. In other aspects, the infotainment system 246 may be further configured to receive user instructions via the touchscreen interface portion and/or output or display notifications, navigation maps, etc. on the touchscreen interface portion.
The computing system architecture of the automotive computer 216, the VCU 218, and/or the system 220 may omit certain computing modules. It should be readily understood that the computing environment depicted in
The vehicle 102 may further include a vehicle sensor unit 248 and a plurality of connection ports 250. In some aspects, the vehicle sensor unit 248 may be part of the vehicle sensory system 240. In other aspects, the vehicle sensor unit 248 may be separate from the vehicle sensory system 240. The vehicle sensor unit 248 may include a plurality of sensors including, but not limited to, the vehicle accelerometer, the vehicle gyroscope, the vehicle magnetometer, the interior and exterior vehicle cameras, and/or the like. In some aspects, the vehicle sensor unit 248 may be configured to determine the vehicle information associated with the vehicle movement and/or the plurality of connection ports 250 (e.g., connection status of each connection port with the interface 110). Examples of the vehicle information are already described above in conjunction with
The interface 110 may be configured to removably attach to the vehicle exterior surface via the plurality of connection ports 250. In some aspects, the plurality of connection ports 250 may be disposed on the vehicle exterior surface, and the interface 110 may be configured to be inserted into a connection port, from the plurality of connection ports 250, to enable electro-mechanical attachment between the interface 110 and the vehicle 102.
In accordance with some aspects, the system 220 may be integrated with and/or executed as part of the ECUs 222. The system 220, regardless of whether it is integrated with the automotive computer 216 or the ECUs 222, or whether it operates as an independent computing system in the vehicle 102, may include a transceiver 252, a processor 254, and a computer-readable memory 256.
The transceiver 252 may be configured to receive information/inputs from one or more external devices or systems, e.g., the user device 202, the server 204, the interface 110, and/or the like, via the network 206. Further, the transceiver 252 may transmit notifications, requests, signals, etc. to the external devices or systems. In addition, the transceiver 252 may be configured to receive information/inputs from vehicle components such as the vehicle sensor unit 248, the plurality of connection ports 250, one or more ECUs 222, and/or the like. Further, the transceiver 252 may transmit signals (e.g., command signals) or notifications to the vehicle components such as the BCM 228, the infotainment system 246, and/or the like.
The processor 254 and the memory 256 may be the same as or similar to the processor 224 and the memory 226, respectively. In some aspects, the processor 254 may utilize the memory 256 to store programs in code and/or to store data for performing aspects in accordance with the disclosure. The memory 256 may be a non-transitory computer-readable storage medium or memory storing the interface management program code. In some aspects, the memory 256 may additionally store instructions/information/data/mappings obtained from the server 204, the user device 202, the interface 110, and/or the like.
In operation, when the user 104 desires to cause and/or control the vehicle movement using the interface 110, the user 104 may transmit, via the user device 202 or the infotainment system 246, the request to the transceiver 252 to activate the external interface movement mode associated with the vehicle 102, as described above in conjunction with
Responsive to obtaining the request, the processor 254 may authenticate the user 104, determine a user location and/or authenticate the interface 110, as described above in conjunction with
In addition, in parallel to receiving the request from the user 104 (via the user device 202 or the infotainment system 246) or responsive to the external interface movement mode being activated and/or the interface 110 being communicatively coupled with the vehicle 102, the transceiver 252 may receive the interface information from the interface sensor unit 214 (via the transceiver 208 and a wired connection or the network 206). In addition, the transceiver 252 may receive the user device information from the user device 202 via the network 206. As described above in conjunction with
In some aspects, responsive to determining that the trigger event may have occurred, the processor 254 may obtain the interface information and/or the user device information from the transceiver 252. In addition, responsive to determining that the trigger event may have occurred, the processor 254 may obtain the vehicle information from the vehicle sensor unit 248. As described above, the vehicle information may be associated with the vehicle movement and/or connection status of each connection port from the plurality of connection ports 250 with the interface 110. In some aspects, the processor 254 may determine the interface location and/or orientation relative to the vehicle 102 based on the interface information, the vehicle information and/or the user device information, as described below. Specifically, responsive to obtaining the information described above, the processor 254 may determine whether the interface 110 may be physically/electromechanically attached to a connection port from the plurality of connection ports 250, or the user 104 may be holding the interface 110 in the user's hand.
In a first exemplary aspect, the processor 254 may determine whether the interface 110 may be physically/electromechanically attached to a connection port or the user 104 may be holding the interface 110 in the user's hand (and be outside the vehicle 102) based on the interface information obtained from the interface sensor unit 214. In this case, the processor 254 may analyze the interface information and compare the interface information with historical interface information (that may be pre-stored in the memory 256 or obtained from the server 204) indicative of interface movements when the interface 110 may have been attached to the vehicle 102 and when the interface 110 may have been held in a user's hand. In some aspects, based on comparing the interface information with the historical interface information, the processor 254 may determine whether the interface information corresponds to a vehicle movement or a human movement. For example, when the interface 110 may be held in the user's hand, the interface information may indicate greater change of interface movement direction or orientation and/or sudden increase or decrease of interface speed, as compared to when the interface 110 may be attached to the vehicle 102 (via a connection port).
In some aspects, the processor 254 may analyze the interface information, as described above, in the frequency domain to determine whether the interface 110 may be physically attached to the vehicle 102 or the user 104 may be holding the interface 110 in the user's hand. In some aspects, the processor 254 may use a windowed Fast Fourier Transform, a Wavelet Transform (e.g., a Haar wavelet), or a band-pass digital filter to analyze the interface information in the frequency domain. In other aspects, the processor 254 may use Artificial Intelligence/Machine Learning (AI/ML), classifiers such as hidden Markov models, deep learning methods, neural networks, etc. to analyze the interface information and determine the interface location and/or orientation (i.e., whether the interface 110 may be physically attached to the vehicle 102 or the user 104 may be holding the interface 110 in the user's hand). An example view of the user 104 holding the interface 110 in the user's hand is shown in
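As one non-limiting illustration, such a frequency-domain classification may be sketched in Python as follows. The 2 Hz split frequency, the 0.5 decision threshold, and the premise that handheld motion (tremor, gait) concentrates spectral energy in higher bands than smooth vehicle-coupled motion are illustrative assumptions, not part of the disclosure:

```python
import cmath

def band_energy_ratio(samples, sample_rate, split_hz=2.0):
    """Naive DFT; returns the fraction of (non-DC) spectral energy
    at or above split_hz. Illustrative assumption: handheld motion
    carries relatively more high-band energy than vehicle motion."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]  # drop the DC component
    energies = []
    for k in range(1, n // 2):
        coeff = sum(centered[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
        freq_hz = k * sample_rate / n
        energies.append((freq_hz, abs(coeff) ** 2))
    total = sum(e for _, e in energies) or 1.0
    high = sum(e for f, e in energies if f >= split_hz)
    return high / total

def classify_interface(samples, sample_rate, threshold=0.5):
    """Label an accelerometer trace 'handheld' or 'attached' by its
    high-band energy fraction (threshold is an assumed value)."""
    ratio = band_energy_ratio(samples, sample_rate)
    return "handheld" if ratio > threshold else "attached"
```

In practice the windowed FFT or wavelet variants named above would replace the naive DFT, but the decision structure (band-energy feature, then threshold or classifier) would be the same.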
In a second exemplary aspect, the processor 254 may determine whether the interface 110 may be physically attached to a connection port or the user 104 may be holding the interface 110 in the user's hand based on the interface information and the vehicle information associated with the vehicle movement. In this case, the processor 254 may correlate the interface information with the vehicle information associated with the vehicle movement and determine the interface location and/or orientation relative to the vehicle 102 based on the correlation. For example, the processor 254 may compare changes in interface orientation (based on inputs obtained from the interface gyroscope and the interface magnetometer) with changes in vehicle orientation (based on inputs obtained from the vehicle gyroscope and the vehicle magnetometer) and determine that the interface 110 may be disposed in the vehicle 102 (and connected to a connection port) when the changes in the interface orientation match the changes in the vehicle orientation. The processor 254 may additionally compare and identify matches between frequency data obtained from the interface accelerometer and the vehicle accelerometer. Responsive to determining that the interface 110 may be disposed in the vehicle 102 based on the comparison/matching described above, the processor 254 may determine the interface orientation relative to the vehicle 102 based on matching between the interface orientation and the vehicle orientation. For example, the processor 254 may determine that the interface forward motion direction (e.g., a direction of “push” that the user 104 applies on the interface 110 to cause forward vehicle movement) may be aligned with vehicle forward movement when the interface orientation and the vehicle orientation may be matched. Furthermore, as described above in conjunction with
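The orientation-change correlation described above may be sketched as follows. This is a minimal illustration: headings are reduced to a single yaw angle and the 3-degree tolerance is an assumed value, not from the disclosure:

```python
def orientation_deltas(headings):
    """Successive heading changes (degrees), wrapped to [-180, 180)."""
    return [((b - a + 180) % 360) - 180
            for a, b in zip(headings, headings[1:])]

def interface_tracks_vehicle(iface_headings, vehicle_headings, tol_deg=3.0):
    """Deem the interface attached to the vehicle when its heading
    changes track the vehicle's within tol_deg on average. A constant
    offset between the two headings (the interface mounted at an angle)
    does not matter, because only the deltas are compared."""
    di = orientation_deltas(iface_headings)
    dv = orientation_deltas(vehicle_headings)
    if len(di) != len(dv) or not di:
        return False
    mad = sum(abs(a - b) for a, b in zip(di, dv)) / len(di)
    return mad <= tol_deg
```

A production implementation would fuse gyroscope and magnetometer data into full 3-D orientation, but the matching principle is the same.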
Responsive to determining that the interface 110 may be disposed in the vehicle 102 or the interface location may be in the vehicle 102 based on the comparison/matching/correlation described above, the processor 254 may fetch, from the memory 256 or the server 204, the mapping of different interface locations and/or orientations relative to the vehicle 102 to maximum permissible/allowable vehicle speeds, vehicle steering wheel rotation angles, and/or maximum permissible/allowable distances the vehicle 102 may travel. The processor 254 may then correlate the determined interface location and/or orientation with the mapping to determine a first maximum permissible/allowable vehicle speed and/or a first maximum permissible vehicle steering wheel rotation angle and/or a first maximum permissible/allowable distance the vehicle 102 may travel based on the determined interface location and/or orientation. For example, the processor 254 may determine that the vehicle 102 may travel at a maximum speed of 5 miles per hour and/or travel a maximum of 500 meters when the interface location may be in the vehicle 102 (e.g., connected to a connection port).
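A minimal sketch of such a mapping is shown below. The in-vehicle entry follows the 5 mph/500 m example above; the handheld entry, the steering-angle values, and the conservative fallback for unknown locations are assumptions introduced for illustration:

```python
# Hypothetical limit table keyed by the determined interface location.
LIMITS = {
    "in_vehicle": {"max_speed_mph": 5.0, "max_steer_deg": 30.0,
                   "max_dist_m": 500.0},
    "handheld":   {"max_speed_mph": 2.0, "max_steer_deg": 15.0,
                   "max_dist_m": 100.0},
}

def limits_for(location):
    """Return the permissible limits for a location; an unrecognized
    location falls back to the most restrictive entry as a
    conservative default (an assumed design choice)."""
    return LIMITS.get(location, LIMITS["handheld"])
```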
In further aspects, the transceiver 252 may receive command signals from the transceiver 208 when the user 104 may be providing user inputs to the interface 110 to cause and/or control the vehicle movement (e.g., when the external interface movement mode may be activated). The command signals may be associated with the user inputs received on the interface 110 from the user 104. Responsive to the transceiver 252 receiving the command signals from the transceiver 208, the transceiver 252 may transmit the command signals to the processor 254.
Responsive to obtaining the command signals from the transceiver 252, the processor 254 may cause and control, via the BCM 228, the vehicle forward/reverse movement/speed and/or the vehicle steering wheel rotation based on the obtained command signals. In some aspects, the processor 254 may control the vehicle speed and/or the vehicle steering wheel rotation based on the first maximum permissible/allowable vehicle speed and/or the first maximum permissible vehicle steering wheel rotation angle, such that the vehicle speed and/or vehicle steering wheel rotation may not exceed respective maximum permissible values. Further, the processor 254 may enable the vehicle movement such that the vehicle 102 may not travel/move beyond the first maximum permissible/allowable distance. Furthermore, as described above in conjunction with
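Enforcing the permissible envelope against a user command can be sketched as a simple clamp; the key names mirror the hypothetical limit table above and are not from the disclosure:

```python
def clamp_command(requested_speed, requested_steer, limits):
    """Clamp a commanded speed (mph, signed for forward/reverse) and
    steering angle (degrees, signed) to the permissible envelope for
    the current interface location."""
    max_v = limits["max_speed_mph"]
    max_s = limits["max_steer_deg"]
    speed = max(-max_v, min(requested_speed, max_v))
    steer = max(-max_s, min(requested_steer, max_s))
    return speed, steer
```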
In a third exemplary aspect, the processor 254 may determine the interface location and/or orientation, i.e., whether the interface 110 may be physically attached to a connection port or the user 104 may be holding the interface 110 in the user's hand, based on the interface information, the vehicle information associated with the vehicle movement, and the user device information. In this case, the processor 254 may correlate the interface information with the vehicle information and correlate the interface information with the user device information to determine whether the interface information matches the vehicle information or the user device information. In some aspects, when the interface information matches the vehicle information associated with the vehicle movement, the processor 254 may determine that the interface location may be in the vehicle 102. On the other hand, when the interface information matches the user device information and does not match the vehicle information, the processor 254 may determine that the interface location may be outside the vehicle 102. In an exemplary aspect, when the interface information, the vehicle information associated with the vehicle movement, and the user device information match each other, the processor 254 may determine that the interface location may be in the vehicle 102 and the user 104 may also be in the vehicle 102. In this case, the processor 254 may determine whether the interface 110 may be connected to a connection port or held in the user's hand based on the vehicle information associated with the plurality of connection ports 250, as described later below.
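The three-way correlation above reduces to a small decision table, sketched here. The boolean inputs are assumed to come from upstream signal matching (such as the orientation and frequency comparisons described earlier), and the label strings are illustrative:

```python
def locate_interface(matches_vehicle, matches_user_device):
    """Decision table for the third exemplary aspect: classify the
    interface location from whether its motion signature matches the
    vehicle's, the user device's, both, or neither."""
    if matches_vehicle and matches_user_device:
        # User and interface both in the vehicle; disambiguate
        # port-attached vs. handheld via the connection ports.
        return "in_vehicle_user_in_vehicle"
    if matches_vehicle:
        return "in_vehicle"
    if matches_user_device:
        return "outside_vehicle"
    return "indeterminate"
```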
In some aspects, responsive to determining that the interface location may be outside the vehicle 102 based on the correlation of the interface information, the vehicle information and the user device information, the processor 254 may use the mapping described above to determine a second maximum permissible/allowable vehicle speed and/or a second maximum permissible vehicle steering wheel rotation angle and/or a second maximum permissible/allowable distance the vehicle 102 may travel based on the interface location being outside the vehicle 102. The processor 254 may then control the vehicle speed, the vehicle steering wheel rotation and/or vehicle travel distance based on the command signals obtained from the transceiver 208/interface 110 and the determined second maximum permissible vehicle speed, the second maximum permissible vehicle steering wheel rotation angle and/or the second maximum permissible distance, as described above.
In further aspects, the processor 254 may also use the interface information and/or the vehicle information associated with the vehicle movement and/or the user device information to determine whether to disable the external interface movement mode, decrease vehicle speed, or stop vehicle movement. For example, the processor 254 may disable the external interface movement mode when the processor 254 determines that the vehicle 102 may be travelling on a steep terrain (e.g., when a terrain slope angle/gradient may be greater than a predefined threshold), determined based on the vehicle information associated with the vehicle movement. As another example, the processor 254 may decrease vehicle speed when the vehicle 102 may be travelling on a rough terrain, determined based on the vehicle information associated with the vehicle movement. As yet another example, the processor 254 may stop vehicle movement when the user device information indicates a sudden change in orientation (indicating that the user 104 may have fallen or slipped or touched the vehicle 102 or any other obstruction). As yet another example, the processor 254 may decrease the maximum allowable vehicle speed and/or vehicle steering wheel rotation when the processor 254 determines that the user 104 may be holding the interface 110 in the user's hand, determined based on the user device information, the vehicle information, the interface information and/or images obtained from exterior vehicle cameras.
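One way to express the overrides above is a priority-ordered rule check, sketched below. The ordering (stop before disable before slow) and the 15-degree slope threshold are assumptions made for illustration; the disclosure specifies only that the threshold is predefined:

```python
def safety_action(slope_deg, rough_terrain, user_orientation_jump,
                  slope_limit_deg=15.0):
    """Evaluate the safety overrides in an assumed priority order and
    return the single action to take this control cycle."""
    if user_orientation_jump:
        # Sudden user-device orientation change: possible fall/slip.
        return "stop_movement"
    if slope_deg > slope_limit_deg:
        # Terrain too steep for external-interface operation.
        return "disable_mode"
    if rough_terrain:
        return "reduce_speed"
    return "no_action"
```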
In a fourth exemplary aspect, the processor 254 may determine the interface location and/or orientation relative to the vehicle 102, i.e., whether the interface 110 may be physically attached to a connection port from the plurality of connection ports 250 disposed in the vehicle 102, based on the vehicle information associated with the plurality of connection ports 250. In some aspects, the plurality of connection ports 250 may be disposed at a plurality of locations in the vehicle exterior surface. For example, as shown in
In some aspects, each connection port, from the plurality of connection ports 250, may include one or more pins in a unique orientation/arrangement. The pins may be disposed at a bottom surface or a side surface of each connection port. For example, as shown in view 402 of
In an exemplary aspect, the interface 110 may be removably attached to a connection port, from the plurality of connection ports 250, via an elongated connector 302 shown in
In some aspects, a bottom surface or a side surface of the bottom portion 306 may include one or more connector pins that may be configured to couple with the first, second, third and/or fourth pins 404, 406, 408 and 410 described above. For example, as shown in
In some aspects, when the connector pins 412 may be disposed at the bottom surface of the bottom portion 306, the connector pins 412 may be disposed at a position (e.g., elevated position) that may prevent water or snow buildup at the bottom surface. In other aspects, when the connector pins 412 may be disposed at the side surface of the bottom portion 306, the connector pins 412 may be spring-loaded so that the connector pins 412 may retract when the elongated connector 302 may be inserted into a connection port and snap out when the connection between the elongated connector 302 and the connection port may be established.
In an exemplary aspect, when the interface 110 may be inserted into the first connection port 250a, the vehicle sensor unit 248 (and/or the processor 254 directly) may receive a first connection signal from the pins 404, 406 when the connector pins 412a, 412b engage with the pins 404, 406. The first connection signal may be indicative of unique orientation of the pins 404, 406 in the first connection port 250a. Responsive to receiving the first connection signal from the pins 404, 406, the vehicle sensor unit 248 may transmit the first connection signal to the processor 254 as part of the vehicle information associated with the plurality of connection ports 250.
The processor 254 may obtain the first connection signal from the vehicle sensor unit 248 or directly from the pins 404, 406 and may determine that the interface location may be in the vehicle 102 and the interface 110 may be attached to the first connection port 250a based on the first connection signal. Responsive to such determination, the processor 254 may use the mapping described above to determine a third maximum permissible/allowable vehicle speed and/or a third maximum permissible vehicle steering wheel rotation angle and/or a third maximum permissible/allowable distance the vehicle 102 may travel based on the determined interface location. The processor 254 may then control the vehicle speed, the vehicle steering wheel rotation and/or vehicle travel distance based on the command signals obtained from the transceiver 208/interface 110 and the determined third maximum permissible vehicle speed, the third maximum permissible vehicle steering wheel rotation angle and/or the third maximum permissible distance as described above.
In some aspects, the pins 404, 406, 408 and 410 associated with the plurality of connection ports 250 may be passive pins, which may mean that the pins 404, 406, 408 or 410 may only be used to determine connection status with the corresponding connector pins 412. In other aspects, the pins 404, 406, 408 and 410 may be active pins, which may mean that the pins 404, 406, 408 and 410 may additionally be used to transmit the command signals from the interface 110 to the vehicle 102 (and/or transmit data/signals from the vehicle 102 to the interface 110).
When the pins 404, 406, 408 and 410 may be passive pins, each connector pin 412 may be set to a digital “high” level (or a level of “1”) and the pins 404, 406, 408 and 410 may be connected to ground (or set to a level of “0”). When the elongated connector 302 may be inserted into a connection port, the corresponding connector pins that connect with any two of the pins 404, 406, 408 and 410 may turn to a digital “low” level (as the pins 404, 406, 408 and 410 are connected to ground). In this case, the vehicle sensor unit 248 or the interface sensor unit 214 may poll each connector pin 412 to determine the connector pins that may be turned/pulled to the digital “low” level, thereby determining the connection status between the connection port and the elongated connector 302 and thus the interface location in the vehicle 102.
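The passive-pin polling just described may be sketched as follows. The four-pin count follows the figures, but the pin indices and the signature-to-port table are hypothetical; `read_pin` stands in for whatever GPIO read the sensor unit performs:

```python
# Hypothetical map from the set of connector pins pulled low (by
# contacting grounded port pins) to the connection port they identify.
PORT_SIGNATURES = {
    frozenset({0, 1}): "250a",
    frozenset({1, 2}): "250b",
    frozenset({2, 3}): "250c",
}

def identify_port(read_pin, n_pins=4):
    """Poll every connector pin (each idles at digital high). Pins that
    contact a grounded port pin read back 0; the resulting low-pin
    signature names the port, or None when not docked."""
    low = frozenset(i for i in range(n_pins) if read_pin(i) == 0)
    return PORT_SIGNATURES.get(low)
```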
On the other hand, when the pins 404, 406, 408 and 410 may be active pins, each connector pin 412 and the pin 404 may be first set to the digital “high” level, and the pin 406 may be connected to ground (associated with the first connection port 250a, used as an example). In this case, when the first connection port 250a may be connected with the elongated connector 302, only the connector pin 412b may be turned/pulled to the digital “low” level (since the corresponding pin 406 is connected to ground). The vehicle sensor unit 248 or the interface sensor unit 214 may read the digital low level of the connector pin 412b to determine that the connector pin 412b may be connected with the pin 406. Thereafter, the remaining connector pins may be turned/pulled to the digital low level. In this case, only the connector pin 412a may turn to the digital high level since it may be connected to the pin 404 that is set at the digital “high” level. The vehicle sensor unit 248 or the interface sensor unit 214 may then read the digital high level of the connector pin 412a to determine that the connector pin 412a may be connected with the pin 404. Responsive to determining the connection status of the connector pins 412a, 412b and the pins 404, 406, as described above, the processor 254 may configure these pins to transfer signals (e.g., the command signals associated with the user inputs on the interface 110) between the interface 110 and the vehicle 102.
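The two-phase active-pin probe may be sketched as below. The `drive`/`read` callbacks are hypothetical abstractions over whatever pin-level I/O the sensor unit exposes, and the return labels are illustrative:

```python
def active_pin_handshake(drive, read, n_pins=4):
    """Two-phase probe for the active-pin scheme.
    Phase 1: drive all connector pins high; the pin contacting the
    grounded port pin reads low. Phase 2: drive the rest low; the pin
    contacting the high port pin reads high. Returns the discovered
    {ground, signal} connector-pin indices, or None when not docked."""
    # Phase 1: all high; find the pin pulled low by the grounded port pin.
    for i in range(n_pins):
        drive(i, 1)
    ground_pin = next((i for i in range(n_pins) if read(i) == 0), None)
    if ground_pin is None:
        return None
    # Phase 2: drive the remaining pins low; find the one held high.
    for i in range(n_pins):
        if i != ground_pin:
            drive(i, 0)
    signal_pin = next((i for i in range(n_pins)
                       if i != ground_pin and read(i) == 1), None)
    if signal_pin is None:
        return None
    return {"ground": ground_pin, "signal": signal_pin}
```

Once the handshake succeeds, the identified pin pair can be reconfigured to carry the command signals, as described above.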
In alternative aspects, instead of having the pins 404-410, each connection port, from the plurality of connection ports 250, may include a unique near field communication (NFC) tag and the elongated connector 302 may include an NFC reader. In this case, when the elongated connector 302 may be attached to the second connection port 250b (used as an example), the vehicle sensor unit 248 (and/or the processor 254) may receive a second connection signal from the NFC reader (corresponding to the NFC tag). The second connection signal may be indicative of the unique NFC tag associated with the second connection port 250b. Responsive to receiving the second connection signal, the vehicle sensor unit 248 may transmit the second connection signal as part of the vehicle information associated with the plurality of connection ports 250 to the processor 254.
The processor 254 may obtain the second connection signal from the vehicle sensor unit 248 or directly from the NFC reader. Responsive to receiving the second connection signal, the processor 254 may determine that the interface location may be in the vehicle 102 and the interface 110 may be attached to the second connection port 250b based on the second connection signal.
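The NFC variant reduces to a tag-to-port lookup, sketched here; the tag UIDs and registry are hypothetical placeholders for whatever unique identifiers the tags carry:

```python
# Hypothetical registry mapping each port's unique NFC tag UID to the
# connection port it marks.
NFC_PORT_REGISTRY = {
    "04:A1:5C:22": "250a",
    "04:B7:01:9E": "250b",
}

def port_from_tag(tag_uid):
    """Resolve a UID scanned by the connector's NFC reader to a
    connection port; None means no registered tag was read (the
    interface is not docked)."""
    return NFC_PORT_REGISTRY.get(tag_uid)
```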
In yet another aspect, each connection port, from the plurality of connection ports 250, may include one or more conductors with a unique pattern (or disposed in a unique arrangement) on a connection port side wall, as shown in
The method 600 starts at step 602. At step 604, the method 600 may include determining, by the processor 254, that the trigger event has occurred. At step 606, the method 600 may include obtaining, by the processor 254, the interface information from the interface sensor unit 214 and/or the vehicle information from the vehicle sensor unit 248 responsive to determining that the trigger event has occurred.
At step 608, the method 600 may include determining, by the processor 254, the interface location relative to the vehicle 102 based on the interface information and/or the vehicle information, as described above in conjunction with
The method 600 may end at step 612.
In the above disclosure, reference has been made to the accompanying drawings, which form a part hereof, which illustrate specific implementations in which the present disclosure may be practiced. It is understood that other implementations may be utilized, and structural changes may be made without departing from the scope of the present disclosure. References in the specification to “one embodiment,” “an embodiment,” “an example embodiment,” etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a feature, structure, or characteristic is described in connection with an embodiment, one skilled in the art will recognize such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
Further, where appropriate, the functions described herein can be performed in one or more of hardware, software, firmware, digital components, or analog components. For example, one or more application specific integrated circuits (ASICs) can be programmed to carry out one or more of the systems and procedures described herein. Certain terms are used throughout the description and claims to refer to particular system components. As one skilled in the art will appreciate, components may be referred to by different names. This document does not intend to distinguish between components that differ in name, but not function.
It should also be understood that the word “example” as used herein is intended to be non-exclusionary and non-limiting in nature. More particularly, the word “example” as used herein indicates one among several examples, and it should be understood that no undue emphasis or preference is being directed to the particular example being described.
A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Computing devices may include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above and stored on a computer-readable medium.
With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating various embodiments and should in no way be construed so as to limit the claims.
Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.
All terms used in the claims are intended to be given their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary. Conditional language, such as, among others, “can,” “could,” “might,” or “may,” unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments could include, while other embodiments may not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more embodiments.