When an autonomous vehicle requires a service need or is impaired, it is often challenging to detect what type of service is required by the vehicle and how to best respond to the service need/impairment as there is no human driver present in the autonomous vehicle. For instance, an autonomous vehicle may have one or more faulty sensor components (e.g., GPS, LiDAR, etc.), may require a major service need (e.g., due to engine overheating, flat tire, etc.), may require a minor or a common service need (e.g., car wash, windshield fluid, etc.), and/or may need to be scheduled for its regular maintenance (e.g., yearly service, 10K miles service, etc.). Autonomous vehicles are not designed to manage their own maintenance and address impairments. Anytime an issue occurs in an autonomous vehicle, the vehicle generally has to return to a central location from where a required service need is analyzed and taken care of, which can be very inefficient and impractical. Additionally, if an autonomous vehicle is unable to drive autonomously due to an impairment or service need, generally human assistance would be required to arrive at a location of the vehicle and tow the vehicle away to a service center, which is time consuming and costly.
In the following description, various embodiments will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the embodiments. However, it will also be apparent to one skilled in the art that the embodiments may be practiced without the specific details. Furthermore, well-known features may be omitted or simplified in order not to obscure the embodiment being described. In addition, the embodiments disclosed herein are only examples, and the scope of this disclosure is not limited to them. Particular embodiments may include all, some, or none of the components, elements, features, functions, operations, or steps of the embodiments disclosed above. Embodiments according to the invention are in particular disclosed in the attached claims directed to a method, a storage medium, a system, and a computer program product, wherein any feature mentioned in one claim category, e.g., method, can be claimed in another claim category, e.g., system, as well. The dependencies or references back in the attached claims are chosen for formal reasons only. However, any subject matter resulting from a deliberate reference back to any previous claims (in particular multiple dependencies) can be claimed as well, so that any combination of claims and the features thereof is disclosed and can be claimed regardless of the dependencies chosen in the attached claims. The subject matter which can be claimed comprises not only the combinations of features as set out in the attached claims but also any other combination of features in the claims, wherein each feature mentioned in the claims can be combined with any other feature or combination of other features in the claims.
Furthermore, any of the embodiments and features described or depicted herein can be claimed in a separate claim and/or in any combination with any embodiment or feature described or depicted herein or with any of the features of the attached claims.
As noted above, when an autonomous vehicle requires service or is impaired, it is often challenging to detect what type of service is required and how best to respond, as no human driver is present in the vehicle. Even if an autonomous vehicle detects a problem, it would not know what to do, where to go, or when to go. Furthermore, an autonomous vehicle may be transporting one or more ride requestors (interchangeably referred to herein as passengers) when something breaks down. Thus, in the event of a service need, an appropriate response needs to be provided relating to both the impaired vehicle and its passengers.
Particular embodiments described herein relate to systems, apparatuses, and methods for providing responses to service needs of an impaired autonomous vehicle. In particular embodiments, a central entity or system managing a fleet of autonomous vehicles, such as a transportation management system, may be able to manage different service needs of an impaired autonomous vehicle. By way of a first example and with reference to
As another example of a response to a service need, the transportation management system may detect the severity and/or urgency of the service needed by an autonomous vehicle. For example, the transportation management system may determine that an autonomous vehicle needs a major service (e.g., due to mechanical failure, engine overheating, etc.) or a minor or common service (e.g., oil change, car wash, gas refuel, washer fluid, etc.). Based on the type and urgency of service that the autonomous vehicle requires, the system may determine that the vehicle is still able to drive. In response, the system may identify one of a nearest service center (e.g., for a vehicle with an urgent and/or major service need), a specialty service center (e.g., for a particular type of service required by the vehicle in which a given service center specializes), or a best service center (e.g., a service center with high user ratings/feedback that is most cost-efficient for service repairs) for the autonomous vehicle, as shown and discussed in detail in reference to at least
As yet another example of a response to a service need, the transportation management system may schedule a maintenance service for a vehicle when a required maintenance (e.g., 10K miles service, 2-year maintenance, etc.) is upcoming or overdue, as discussed in detail below in reference to
Providing a response to a major service need (e.g., engine overheating, flat tire, etc.), a minor service need (e.g., car wash, gas refuel, etc.), a panic alert from a passenger of the vehicle, or a regular vehicle maintenance, as discussed herein, is advantageous because responding to these needs is beyond the typical capabilities of an autonomous vehicle, especially at the fleet level. By providing an appropriate response to each of these service needs, the transportation management system ensures that the autonomous vehicles are in their best operational condition (e.g., all parts/components properly working, fluids (e.g., brake fluid, engine oil, etc.) at their required levels, vehicle maintenance done at scheduled times, tire pressure within range, vehicle cleaned, etc.) and also ensures the overall safety and convenience of passengers of the autonomous vehicles. This is also advantageous at the overall system or fleet level: currently, when an impairment or issue occurs in an autonomous vehicle, the vehicle has to report to a central authority/location from where an appropriate service need is detected and addressed, which is time consuming and inefficient. By detecting the various service needs required by a vehicle and providing an appropriate response to each of those service needs on the go (e.g., a Shepherd vehicle provided for assisting an impaired vehicle, a field agent requested to arrive at the impaired vehicle's location, the impaired vehicle navigated to a nearest service center location, etc.), the overall response time to resolve the service needs of an impaired vehicle is significantly reduced and less load is put on the system, as the system does not have to fulfill the service needs of a number of vehicles all at once.
In the event of a service need, apart from fulfilling the service need of the impaired vehicle, the transportation management system may also manage the needs of one or more passengers in the impaired vehicle. For instance, if a passenger is present in an impaired vehicle that requires service, the transportation management system may request an alternate vehicle (e.g., an autonomous or human-driven vehicle) to pick up the one or more passengers of the impaired vehicle and transport them to their respective destinations.
In particular embodiments, the requestor 210 may use a transportation application running on a requestor computing device 220 (e.g., smartphone, tablet computer, smart wearable device, laptop computer, etc.) to request a ride from a specified pick-up location to a specified drop-off location. The request may be sent over a communication network 270 to the transportation management system 230. The transportation management system 230 may fulfill ride requests by dispatching autonomous vehicles 240. For example, in response to a ride request, the transportation management system 230 may dispatch and instruct an autonomous vehicle 240a managed by the system to transport the requestor 210. In particular embodiments, a fleet of autonomous vehicles 240 may be managed by the transportation management system 230. The fleet of autonomous vehicles 240, in whole or in part, may be owned by the entity associated with the transportation management system 230, or they may be owned by a third-party entity relative to the transportation management system 230. In either case, the transportation management system 230 may control the operations of the autonomous vehicles 240, including, e.g., dispatching select vehicles 240 to fulfill ride requests, instructing the vehicles 240 to perform select operations (e.g., head to a service center or charging/fueling station, pull over, stop immediately, self-diagnose, lock/unlock compartments, change music station, change temperature, and any other suitable operations), and instructing the vehicles 240 to enter select operation modes (e.g., operate normally, drive at a reduced speed, drive under the command of human operators, and any other suitable operational modes).
Although not shown in
In particular embodiments, the transportation management system 230 may include software modules or applications, including, e.g., identity management services 232, location services 234, ride services 236, impaired-vehicle services 238, and/or any other suitable services. Although a particular number of services are shown as being provided by system 230, more or fewer services may be provided in various embodiments. In particular embodiments, identity management services 232 may be configured to, e.g., perform authorization services for ride requestors 210 and manage their interactions and data with the transportation management system 230. This may include, e.g., authenticating the identity of requestors 210 and determining that they are authorized to receive services from the transportation management system 230. Identity management services 232 may also manage and control access to requestor data maintained by the transportation management system 230, such as ride histories, vehicle data, personal data, preferences, usage patterns, profile pictures, linked third-party accounts (e.g., credentials for music or entertainment services, social-networking systems, calendar systems, task-management systems, etc.), and any other associated information. In particular embodiments, the transportation management system 230 may provide location services 234, which may include navigation and/or traffic management services and user interfaces. For example, the location services 234 may be responsible for querying device(s) associated with requestor(s) 210 (e.g., computing device 220) for their locations. The location services 234 may also be configured to track those devices to determine their relative proximities, generate relevant alerts (e.g., proximity is within a threshold distance), generate navigation recommendations, and provide any other location-based services.
In particular embodiments, the transportation management system 230 may provide ride services 236, which may include ride matching and management services to connect a requestor 210 to an autonomous vehicle 240. For example, after the identity of a ride requestor 210 has been authenticated by the identity management services module 232, the ride services module 236 may attempt to match the requestor with one or more autonomous vehicles 240. In particular embodiments, the ride services module 236 may identify an appropriate vehicle 240 using location data obtained from the location services module 234. The ride services module 236 may use the location data to identify a vehicle 240 that is geographically close to the requestor 210 (e.g., within a certain threshold distance or travel time). In particular embodiments, the impaired-vehicle services 238 may be responsible for providing responses to detected impairments or service needs of an impaired autonomous vehicle 240. The impaired-vehicle services 238 may receive an indication of an impairment or service need from the vehicle 240 or passenger(s) of the vehicle 240. For instance, data indicating the impairment or service need may be obtained using the identity management services 232, location services 234, and ride services 236, as well as from the requestor's computing device 220 and the vehicle 240. In particular embodiments, the impaired-vehicle services 238 may provide an appropriate response to a detected impairment or service need according to the method 300 as discussed in
An autonomous vehicle 240 may be a vehicle that is capable of sensing its environment and navigating with little to no human input. The autonomous vehicle 240 may be equipped with a variety of systems or modules for enabling it to determine its surroundings and safely navigate to target destinations. In particular embodiments, the vehicle 240 may be equipped with an array of sensors 244, a navigation system 246, and a ride-service computing device 248. The sensors 244 may obtain and process sensor/telemetry data. For example, the sensors 244 may be optical cameras for, e.g., recognizing roads and lane markings; infrared cameras for, e.g., night vision; LiDARs for, e.g., detecting 360° surroundings; RADAR for, e.g., detecting distant hazards; stereo vision for, e.g., spotting hazards such as pedestrians or tree branches; wheel sensors for, e.g., measuring velocity; ultrasound for, e.g., parking and obstacle detection; global positioning system (GPS) for, e.g., determining the vehicle's current geolocation; and/or inertial measurement units, accelerometers, gyroscopes, and/or odometer systems for movement or motion detection. While the description of these sensors provides particular examples of utility, one of ordinary skill in the art would appreciate that the utilities of the sensors are not limited to these examples. The navigation system 246 may be responsible for safely navigating the autonomous vehicle 240. In particular embodiments, the navigation system 246 may take as input any type of sensor data from, e.g., a Global Positioning System (GPS) module, inertial measurement unit (IMU), LiDAR sensors, optical cameras, radio frequency (RF) transceivers, or any other suitable telemetry or sensory mechanisms. In particular embodiments, the navigation system 246 may use its determinations to control the vehicle 240 to operate in prescribed manners and to guide the autonomous vehicle 240 to its destinations without colliding into other objects.
The ride-service computing device 248 may be a tablet or other suitable device installed by transportation management system 230 to allow a user to interact with the autonomous vehicle 240, transportation management system 230, or other users. Although not shown in
In particular embodiments, autonomous vehicles 240 may be able to communicate with each other either directly via a wireless communication channel (e.g., Bluetooth, NFC, Infrared, etc.) or via the transportation management system 230 by sending or receiving data through the network 270. In particular embodiments, when one of the autonomous vehicles is down (an impaired autonomous vehicle), the transportation management system 230 may instruct a second autonomous vehicle (a Shepherd autonomous vehicle) to help the impaired vehicle. As an example and not by way of limitation, if vehicle 240a is impaired due to one or more sensors 244 not working properly or being faulty, the transportation management system 230 may instruct a second vehicle 240b to share its sensor data with the impaired vehicle 240a (see for example,
At step 304, the transportation management system may detect a service need that is required by the autonomous vehicle. In some embodiments, the transportation management system may receive, from the transportation management vehicle device installed in the vehicle, performance statistics for various sensors/components of the vehicle indicating a current state of each sensor. For instance, the transportation management vehicle device may be connected to a central or main controlling unit of the vehicle (e.g., the engine control unit (ECU)) from which the device gets performance statistics for each sensor associated with the central or main controlling unit of the vehicle. The device then shares the performance statistics in real-time or at periodic time intervals with the transportation management system. Having received the performance statistics for each sensor, the transportation management system may compare the current statistics with the default/factory statistics for the sensor or the last known good configuration saved for that sensor. If the two statistics do not match, or if the difference between the statistics is above a certain threshold, then the transportation management system may detect a service need that is required for an item/component that is associated with that particular sensor. By way of an example and without limitation, the transportation management system may receive performance statistics for an engine-temperature component indicating that the current engine temperature is about 230 Fahrenheit. An ideal engine temperature set in the default statistics for the same component is indicated to be within 180-220 Fahrenheit. Upon comparing the two, the transportation management system may detect that the engine of the autonomous vehicle is overheating, which calls for a major service need, and may take an action accordingly (as discussed for example in reference to
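The statistics comparison described above can be sketched as follows. The component names, baseline values, and the 5% relative tolerance are illustrative assumptions for this sketch, not part of any particular embodiment:

```python
def detect_service_need(current, baseline, rel_tolerance=0.05):
    """Compare current sensor statistics against baseline (default/factory
    or last-known-good) values and flag any component whose reading
    deviates beyond the allowed relative tolerance."""
    flagged = []
    for component, value in current.items():
        expected = baseline.get(component)
        if expected is None:
            continue  # no baseline recorded for this component
        # A deviation above the tolerance indicates a possible service need.
        if abs(value - expected) / abs(expected) > rel_tolerance:
            flagged.append(component)
    return flagged

# Example: a current engine temperature of 230°F against an ideal
# midpoint of 200°F deviates by 15%, so the component is flagged.
needs = detect_service_need(
    current={"engine_temp_f": 230, "tire_psi": 34},
    baseline={"engine_temp_f": 200, "tire_psi": 34},
)
```

In a real deployment each component would likely carry its own acceptable range (as in the 180-220 Fahrenheit example above) rather than a single global tolerance.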
As depicted in
If at step 320, the transportation management system determines that the vehicle can still safely drive with the impaired sensor component, then at step 323, the transportation management system may identify a sensor type of the impaired sensor component. For instance, a sensor component may comprise one or more sensor types, and an impaired sensor component may have a particular sensor type that is faulty or not working properly. By way of an example, the sensor component may be a GPS module comprising a traffic sensor for analyzing current traffic conditions, a speed-limit sensor for determining the speed limit in the current geographic area/region of the vehicle, an accidents-or-hazards sensor for identifying any accidents or potential hazards (e.g., road work, construction, etc.) in the current route of the vehicle, etc. In this example, the GPS module may have a faulty traffic sensor due to which it may be unable to properly analyze the current traffic conditions, which may lead to delays in transit or commute time. At step 324, the transportation management system may identify a second autonomous vehicle (a Shepherd autonomous vehicle) having all functional sensors, including their respective sensor types. The transportation management system may identify this second vehicle by first identifying one or more vacant autonomous vehicles (i.e., vehicles carrying no passengers) that are located in the vicinity or within a certain threshold distance of the current geographic location of the first autonomous vehicle. For example, the system may identify whether there is a vacant autonomous vehicle located within five miles of the current location of the impaired first vehicle. If the system identifies one, then it may send an instruction to the identified second autonomous vehicle to drive to the location of the first autonomous vehicle.
If the system does not identify an available autonomous vehicle in the vicinity or within the certain threshold distance of the first vehicle, then the system may request a second autonomous vehicle from a dispatch pool (e.g., a main central location where the fleet of autonomous vehicles is located). While the second autonomous vehicle travels to the location of the first vehicle, the first autonomous vehicle may be instructed by the transportation management system to pull over and wait at a nearest safe location. In particular embodiments, the transportation management system may take the identified second autonomous vehicle from the dispatch pool and set its status as temporarily non-operational for passenger pick-up and drop-off (i.e., the identified second vehicle may not take and fulfill any new ride requests). At step 326, the system may determine a suitable service center location where the first autonomous vehicle can be directed for repair. In particular embodiments, the system may determine a service center based on one or more criteria. The one or more criteria may include, as an example and without limitation, proximity of a service center location to the current geographic location of the first vehicle, specialty or expertise of a service center in fixing the particular impaired sensor component, user ratings/feedback associated with a service center, cost-effectiveness in repairing the impaired sensor component, availability of a service center (i.e., how soon the service center can begin working on the repair), estimated time for the repair, etc.
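The vacant-vehicle search with a dispatch-pool fallback described above might be sketched as below; the five-mile threshold, the fleet record fields, and the use of great-circle distance as a proxy for driving distance are all illustrative assumptions:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # mean Earth radius in miles

def pick_shepherd(impaired_loc, fleet, max_miles=5.0):
    """Return the id of the nearest vacant vehicle within max_miles, or
    None so the caller can fall back to requesting one from the
    dispatch pool."""
    candidates = [
        (haversine_miles(*impaired_loc, *v["loc"]), v["id"])
        for v in fleet
        if v["vacant"]  # only vehicles carrying no passengers qualify
    ]
    in_range = [c for c in candidates if c[0] <= max_miles]
    return min(in_range)[1] if in_range else None

# Example: one vacant vehicle roughly a mile away qualifies as the
# Shepherd; an occupied vehicle and a distant one do not.
fleet = [
    {"id": "A", "loc": (37.78, -122.41), "vacant": True},
    {"id": "B", "loc": (38.50, -122.42), "vacant": True},
    {"id": "C", "loc": (37.77, -122.42), "vacant": False},
]
shepherd_id = pick_shepherd((37.77, -122.42), fleet)
```

When `pick_shepherd` returns None, the system would request a vehicle from the dispatch pool and instruct the impaired vehicle to pull over and wait, as described above.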
At step 328, the transportation management system may instruct the identified second autonomous vehicle (Shepherd vehicle) to share its sensor data with the first autonomous vehicle (see for example,
At step 330, the transportation management system may instruct the first autonomous vehicle to drive to the determined service center location using the sensor data from the second autonomous vehicle (see for example,
In some embodiments, although not shown in
If the transportation management system determines in step 340 that the first vehicle can drive further, then in step 344, the system may request performance data from the first vehicle indicating the current state/condition of the vehicle. For instance, the system may request performance statistics for the various sensors (e.g., engine sensors, cameras, microphones, infrared, sonar, LiDARs, lighting, temperature, weather, and any other suitable sensors) in the first vehicle from the transportation management vehicle device, as discussed elsewhere herein. In response to the request, in step 346, the system may receive the performance data/statistics from the first autonomous vehicle, and then in step 348, the system may determine how far the first vehicle can drive based on the current state/condition of the vehicle. By way of an example, as discussed above, the major service need may relate to an engine overheating issue, and the performance data received from the first vehicle indicates that the engine-temperature sensor specifies a current engine temperature of 200 Fahrenheit. Based on this current temperature reading and the history of previous temperature readings (e.g., readings in the last fifteen minutes), the system may estimate that the vehicle can drive up to an additional 10 miles before the temperature rises to 220 Fahrenheit, which may be the threshold temperature limit beyond which the engine would probably cease operating. Having determined a total distance that the first autonomous vehicle can drive, the system may identify, in step 350, one or more service centers that are located within this total distance. Taking the example above where the system estimated that the first vehicle can drive up to an additional 10 miles, the system may identify one or more service centers that are located within 10 miles of the current location of the first vehicle.
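One way to derive the "additional 10 miles" estimate above is to extrapolate the recent temperature trend linearly over distance. The linear model and the sample readings below are simplifying assumptions for illustration only:

```python
def estimated_drivable_miles(readings, limit_f=220.0):
    """Linearly extrapolate engine temperature over distance to estimate
    how many more miles the vehicle can drive before reaching the
    threshold temperature. `readings` is a list of (odometer_miles,
    temp_f) samples, oldest first. A linear trend is an illustrative
    simplification of what a real estimator would do."""
    (m0, t0), (m1, t1) = readings[0], readings[-1]
    if t1 >= limit_f:
        return 0.0  # already at or past the threshold
    rate = (t1 - t0) / (m1 - m0)  # degrees Fahrenheit per mile
    if rate <= 0:
        return float("inf")  # temperature stable or falling
    return (limit_f - t1) / rate

# Temperature rose from 190°F to 200°F over the last 5 miles, i.e.,
# 2°F/mile, so roughly 10 more miles before the 220°F limit.
miles_left = estimated_drivable_miles([(100.0, 190.0), (105.0, 200.0)])
```

The resulting distance bounds the service-center search radius in step 350.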
The system may identify the service centers based on the one or more criteria as discussed with respect to step 326 in
At step 352, a determination may be made as to whether the system identified one or more service centers within the total distance (e.g., 10 miles). If the result of the determination is negative, then the transportation management system may instruct the first autonomous vehicle to pull over at a nearest safe location and send a request to human road-side assistance (a field agent) to arrive at the location of the vehicle and resolve the issue (e.g., by towing the impaired vehicle and taking it to a nearest service center location). Otherwise, if the system does identify the one or more service centers, then in step 354, the system determines whether the first vehicle requires a particular type of service. For example, in order to fix/repair the major service need of the first vehicle, a particular type of component may need to be replaced that is available at only select specialty service center locations. If that is the case, then in step 356, the system may send driving directions to a specialty service center located within the total distance and instruct the vehicle to go to the specialty service center location using the driving directions. The specialty service center may specialize in the particular type of service required by the vehicle. If the system instead determines that a particular or specialty service is not required, then in step 358, the system may send driving directions to a nearest service center (i.e., one located nearest to the current location of the first autonomous vehicle) and instruct the vehicle to go to the nearest service center location using the driving directions. It should be realized that a best service center may not be applicable for a major service need because of the urgency of the service required by the vehicle. A best service center is well suited for vehicles with minor service needs, as discussed in further detail below in reference to
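The decision flow of steps 352-358 might be sketched as follows. The record fields, the return convention, and the fallback to the nearest center when no reachable center offers the required specialty are illustrative assumptions:

```python
def route_major_service(vehicle, centers, drivable_miles):
    """Decide where a vehicle with a major service need should go.
    Returns ("roadside_assistance", None) when no center is reachable,
    otherwise ("drive_to", center_name)."""
    reachable = [c for c in centers if c["distance_miles"] <= drivable_miles]
    if not reachable:
        # Step 352, negative branch: pull over and request a field agent.
        return ("roadside_assistance", None)
    required = vehicle.get("required_specialty")
    if required:
        # Steps 354/356: prefer a specialty center within the total distance.
        specialty = [c for c in reachable if required in c["specialties"]]
        if specialty:
            nearest_specialty = min(specialty, key=lambda c: c["distance_miles"])
            return ("drive_to", nearest_specialty["name"])
    # Step 358: otherwise route to the nearest reachable center.
    return ("drive_to", min(reachable, key=lambda c: c["distance_miles"])["name"])
```

Note that a "best" service center deliberately plays no role here, consistent with the urgency argument above.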
At step 362, the transportation management system may receive an indication that the one or more passengers have been dropped off at their respective destinations. In some embodiments, the system may receive this indication from a ride requestor's/passenger's computing device (e.g., a transportation application running on a mobile device of the passenger) that the passenger has reached his or her destination. In other embodiments, current location information (e.g., geolocation) may be constantly transmitted by the vehicle to the system, or the system may directly query the vehicle for its geolocation at periodic time intervals or in real-time to get this indication. Once the one or more passengers have been dropped off, at step 364, the system may identify a best service center for navigating the vehicle to its respective service location for resolving the minor service need. The best service center may be identified based on one or more criteria, including for example, user ratings and/or comments associated with a service center (e.g., service center 1 is given a 4.5/5 star rating with 100 reviews while service center 2 is given only a 3/5 star rating with 57 reviews), proximity of a service center to the current geographic location of the first vehicle (i.e., how close the service center is located, which itself leads to fuel savings), cost effectiveness of a service center (e.g., repairs or service components at service center X may cost less than at service center Y), etc. As discussed earlier, a best service center may be best suited for situations where a vehicle can still drive long distances and has a less urgent, minor, or common service need. In response to identifying a best service center, at step 366, the system may send driving directions to the best service center location and instruct the vehicle to go to the best service center location using the driving directions.
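The multi-criteria "best service center" selection might be sketched as a weighted score; the weights, the normalization, and the field names below are illustrative assumptions, not a prescribed formula:

```python
def best_service_center(centers, weights=None):
    """Score candidate service centers on rating, proximity, and cost,
    and return the highest-scoring one. Each center is a dict with
    'name', 'rating' (out of 5), 'distance_miles', and 'estimated_cost'
    fields; weights and normalization are illustrative assumptions."""
    weights = weights or {"rating": 0.5, "proximity": 0.3, "cost": 0.2}
    max_dist = max(c["distance_miles"] for c in centers) or 1.0
    max_cost = max(c["estimated_cost"] for c in centers) or 1.0

    def score(c):
        return (
            weights["rating"] * (c["rating"] / 5.0)          # higher is better
            + weights["proximity"] * (1 - c["distance_miles"] / max_dist)
            + weights["cost"] * (1 - c["estimated_cost"] / max_cost)
        )

    return max(centers, key=score)
```

Review counts, availability, and estimated repair time from the criteria above could be folded in as additional weighted terms in the same way.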
Having identified the required maintenance, the transportation management system may make a determination of whether the maintenance is overdue (step 370) or upcoming (step 378). Continuing with the machine-learning model maintenance example above, where the vehicle currently has 19,500 miles and the last maintenance was performed at 9,754 miles, the system may determine in this case that the maintenance is upcoming in 254 miles. If the vehicle was instead identified as having 21,000 miles, then the system may determine that the maintenance for the vehicle is overdue. If the system determines in step 370 that the maintenance is overdue, then in step 371, the system determines if the first vehicle requiring maintenance is carrying one or more ride requestors/passengers. If the determination is affirmative, then at step 372, the system may instruct the first autonomous vehicle to first drop off the one or more passengers at their respective destination locations, and at step 373, the system may receive an indication that the one or more passengers have been dropped off at their respective destinations, as discussed with respect to steps 360 and 362 in
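The mileage arithmetic in this example (last service at 9,754 miles plus a 10,000-mile interval puts the next maintenance at 19,754 miles, i.e., 254 miles away) can be sketched as follows; the 1,000-mile "upcoming" window is an illustrative assumption:

```python
def maintenance_status(current_miles, last_service_miles, interval_miles=10_000):
    """Classify a vehicle's scheduled maintenance as overdue, upcoming,
    or not yet due, along with the miles past or remaining. The interval
    and the 1,000-mile 'upcoming' window are illustrative assumptions."""
    next_due = last_service_miles + interval_miles
    if current_miles >= next_due:
        return ("overdue", current_miles - next_due)
    if next_due - current_miles <= 1_000:
        return ("upcoming", next_due - current_miles)
    return ("not_due", next_due - current_miles)

# 19,500 current miles with last service at 9,754 -> due at 19,754,
# i.e., upcoming in 254 miles; 21,000 miles would instead be overdue.
status = maintenance_status(19_500, 9_754)
```

The "overdue" and "upcoming" outcomes map onto the step 370 and step 378 branches described above.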
If the transportation management system determines that the maintenance is not overdue but is upcoming (step 378), then at step 380, the system determines if the first vehicle requires a particular maintenance. If so, then at step 382, the system schedules a specialty service center for the maintenance, as discussed elsewhere herein. Otherwise, at step 384, the system identifies a best service center for the maintenance since the maintenance is not due immediately and the vehicle can thus be sent to a service center with a high rating, positive feedback, and one that is cost effective and time efficient. The best service center may be identified based on one or more criteria as discussed with respect to step 364 in
At step 392, if the system determines that the passenger panic alert relates to stopping the first vehicle, then at step 393, the system may send an instruction to the first autonomous vehicle to pull over at a nearest safe location and wait until provided with another instruction. For example, the first vehicle may be making an unusual noise and shaking, causing the passenger of the vehicle to panic and request that the vehicle be stopped. In response to stopping the first vehicle, or if the panic alert does not relate to stopping the vehicle, at step 394, the system determines if the passenger wants an alternate vehicle to get to their destination. If so, at step 395, the system may send a request to a second autonomous vehicle to pick up the passenger from a current geographic location of the first vehicle and transport the passenger to their respective destination. At step 396, the system determines if the panic alert relates to the passenger indicating that the first vehicle requires a major service need, as discussed above in detail in reference to
Particular embodiments may repeat one or more steps of the method 300 of
In particular embodiments, the transportation management vehicle device 460 may include a connector 416. In particular embodiments, the connector 416 may be configured to physically connect to the ride provider's computing device and/or the requestor's computing device. In particular embodiments, the connector 416 may be configured for physically connecting the transportation management vehicle device 460 to the vehicle for power and/or for communicating with the vehicle. For instance, the connector 416 may implement a suitable communication interface or protocol for communicating with the vehicle. For example, through the connector 416, the transportation management vehicle device 460 may be able to issue instructions to the vehicle's onboard computer and cause it to adjust certain vehicle configurations, such as air-conditioning level, entertainment/informational content (e.g., music, news station, content source, etc.), audio volume, window configuration, seat warmer temperature, and any other configurable features of the vehicle. As another example, the connector 416 may enable the transportation management vehicle device 460 to query the vehicle for certain data, such as current configurations of any of the aforementioned features, as well as the vehicle's speed, fuel level, tire pressure, external temperature gauge, navigation system, and any other information available through the vehicle's computing system. In particular embodiments, the transportation management vehicle device 460 may be further configured with wireless communication capabilities (e.g., Bluetooth, WI-FI, NFC, etc.), thereby enabling the device 460 to wirelessly communicate with the vehicle, the provider's computing device, and/or the requestor's computing device.
In particular embodiments, the transportation management vehicle device 460 may be integrated with one or more sensors 419, such as a camera, microphone, infrared sensor, gyroscope, accelerometer, and any other suitable sensor for detecting signals of interest within the passenger compartment of the vehicle. For example, the sensor 419 may be a rear-facing wide-angle camera that captures the passenger compartment and any passengers therein. As another example, the sensor 419 may be a microphone that captures conversation and/or sounds in the passenger compartment. The sensor 419 may also be an infrared sensor capable of detecting motion and/or temperature of the passengers.
Although
In particular embodiments, lighting controller 422 may manage the colors and/or other lighting displayed by light features 414, the front display 404, and/or the back display 410. The lighting controller may include rules and algorithms for controlling the lighting features 414 so that the intended information is conveyed. For example, to help a set of matching provider and requestor find each other at a pick-up location, the lighting controller 422 may obtain instructions that the color blue is to be used for identification. In response, the front display 404 may display blue and the lighting controller 422 may cause the light features 414 to display blue so that the ride provider would know what color to look for.
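The identification-color behavior described above can be sketched as a small rule in code; the class and attribute names below are hypothetical and used only for illustration:

```python
class LightingController:
    """Sketch of lighting controller 422 applying an identification color."""

    def __init__(self):
        self.front_display = None
        self.light_features = None

    def apply_identification_color(self, color):
        # Display the same color on the front display and the light features
        # so that the matched provider and requestor know what to look for.
        self.front_display = color
        self.light_features = color
        return color
```

For the example above, receiving an instruction that blue is to be used would set both the front display and the light features to blue.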
In particular embodiments, the transportation management vehicle device 460 may include a communication component 424 for managing communications with other systems, including, e.g., the provider device, the requestor device, the vehicle, the transportation management system, and third-party systems (e.g., music, entertainment, traffic, and/or maps providers). In particular embodiments, communication component 424 may be configured to communicate over WI-FI, Bluetooth, NFC, RF, or any other wired or wireless communication network or protocol.
In particular embodiments, the transportation management vehicle device 460 may include an input/output system 426 configured to receive inputs from users and/or the environment and provide output. For example, I/O system 426 may include a sensor such as an image-capturing device configured to recognize motion or gesture-based inputs from passengers, a microphone configured to detect and record uttered speech or dialog, a heat sensor to detect the temperature in the passenger compartment, and any other suitable sensor. The I/O system 426 may output the detected sensor data to any other system, including the transportation management system, the computing devices of the ride provider and requestor, etc. Additionally, I/O system 426 may include an audio device configured to provide audio outputs (such as alerts, instructions, or other information) to users and/or receive audio inputs, such as audio commands, which may be interpreted by a voice recognition system or other command interface. In particular embodiments, I/O system 426 may include one or more input or output ports, such as USB (universal serial bus) ports, lightning connector ports, or other ports enabling users to directly connect their devices to the transportation management vehicle device 460 (e.g., to exchange data, verify identity information, provide power, etc.).
The user device 530, transportation management system 560, autonomous vehicle 540, and third-party system 570 may be communicatively connected or co-located with each other in whole or in part. These computing entities may communicate via different transmission technologies and network types. For example, the user device 530 and the vehicle 540 may communicate with each other via a cable or short-range wireless communication (e.g., Bluetooth, NFC, WI-FI, etc.), and together they may be connected to the Internet via a cellular network accessible to either one of the devices (e.g., the user device 530 may be a smartphone with LTE connection). The transportation management system 560 and third-party system 570, on the other hand, may be connected to the Internet via their respective LAN/WLAN networks and Internet Service Providers (ISP).
In particular embodiments, the transportation management system 560 may fulfill ride requests for one or more users 501 by dispatching suitable vehicles. The transportation management system 560 may receive any number of ride requests from any number of ride requestors 501. In particular embodiments, a ride request from a ride requestor 501 may include an identifier that identifies them in the system 560. The transportation management system 560 may use the identifier to access and store the ride requestor's 501 information, in accordance with his/her privacy settings. The ride requestor's 501 information may be stored in one or more data stores (e.g., a relational database system) associated with and accessible to the transportation management system 560. In particular embodiments, ride requestor information may include profile information about a particular ride requestor 501. In particular embodiments, the ride requestor 501 may be associated with one or more categories or types, through which the ride requestor 501 may be associated with aggregate information about certain ride requestors of those categories or types. Ride information may include, for example, preferred pick-up and drop-off locations, driving preferences (e.g., safety comfort level, preferred speed, rates of acceleration/deceleration, safety distance from other vehicles when travelling at various speeds, route, etc.), entertainment preferences and settings (e.g., preferred music genre or playlist, audio volume, display brightness, etc.), temperature settings, whether conversation with the driver is welcomed, frequent destinations, historical riding patterns (e.g., time of day of travel, starting and ending locations, etc.), preferred language, age, gender, or any other suitable information. 
In particular embodiments, the transportation management system 560 may classify a user 501 based on known information about the user 501 (e.g., using machine-learning classifiers), and use the classification to retrieve relevant aggregate information associated with that class. For example, the system 560 may classify a user 501 as a teenager and retrieve relevant aggregate information associated with teenagers, such as the type of music generally preferred by teenagers.
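One way the classify-then-retrieve flow above could look in code is sketched below. The rule-based classifier stands in for the machine-learning classifiers mentioned in the text, and the class labels and aggregate preferences are hypothetical:

```python
# Hypothetical aggregate information keyed by user class
AGGREGATE_PREFS = {
    "teenager": {"music_genre": "pop"},
    "adult": {"music_genre": "news"},
}

def classify_user(age):
    """Toy stand-in for a trained machine-learning classifier."""
    return "teenager" if 13 <= age <= 19 else "adult"

def aggregate_info_for(age):
    """Classify the user, then retrieve aggregate info for that class."""
    return AGGREGATE_PREFS[classify_user(age)]
```

Under these assumptions, a 16-year-old user would be classified as a teenager and matched with the music genre generally preferred by that class.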
Transportation management system 560 may also store and access ride information. Ride information may include locations related to the ride, traffic data, route options, optimal pick-up or drop-off locations for the ride, or any other suitable information associated with a ride. As an example and not by way of limitation, when the transportation management system 560 receives a request to travel from San Francisco International Airport (SFO) to Palo Alto, Calif., the system 560 may access or generate any relevant ride information for this particular ride request. The ride information may include, for example, preferred pick-up locations at SFO; alternate pick-up locations in the event that a pick-up location is incompatible with the ride requestor (e.g., the ride requestor may be disabled and cannot access the pick-up location) or the pick-up location is otherwise unavailable due to construction, traffic congestion, changes in pick-up/drop-off rules, or any other reason; one or more routes to navigate from SFO to Palo Alto; preferred off-ramps for a type of user; or any other suitable information associated with the ride. In particular embodiments, portions of the ride information may be based on historical data associated with historical rides facilitated by the system 560. For example, historical data may include aggregate information generated based on past ride information, which may include any ride information described herein and telemetry data collected by sensors in autonomous vehicles and/or user devices. Historical data may be associated with a particular user (e.g., that particular user's preferences, common routes, etc.), a category/class of users (e.g., based on demographics), and/or all users of the system 560. 
For example, historical data specific to a single user may include information about past rides that particular user has taken, including the locations at which the user is picked up and dropped off, music the user likes to listen to, traffic information associated with the rides, time of the day the user most often rides, and any other suitable information specific to the user. As another example, historical data associated with a category/class of users may include, e.g., common or popular ride preferences of users in that category/class, such as teenagers preferring pop music or ride requestors who frequently commute to the financial district preferring to listen to news. As yet another example, historical data associated with all users may include general usage trends, such as traffic and ride patterns. Using historical data, the system 560 in particular embodiments may predict and provide ride suggestions in response to a ride request. In particular embodiments, the system 560 may use machine learning, such as neural networks, regression algorithms, instance-based algorithms (e.g., k-Nearest Neighbor), decision-tree algorithms, Bayesian algorithms, clustering algorithms, association-rule-learning algorithms, deep-learning algorithms, dimensionality-reduction algorithms, ensemble algorithms, and any other suitable machine-learning algorithms known to persons of ordinary skill in the art. The machine-learning models may be trained using any suitable training algorithm, including supervised learning based on labeled training data, unsupervised learning based on unlabeled training data, and/or semi-supervised learning based on a mixture of labeled and unlabeled training data.
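A minimal instance-based sketch of predicting a ride suggestion from a single user's historical data might look like the following. The nearest-hour majority vote is a toy stand-in for the machine-learning models discussed above, and the data layout is hypothetical:

```python
from collections import Counter

def suggest_destination(history, hour):
    """Suggest the user's most frequent destination near the requested hour.

    `history` is a list of (hour_of_day, destination) pairs from past rides.
    Returns None when no past ride falls within one hour of the request.
    """
    # Instance-based approach: vote among rides within a one-hour window.
    nearby = [dest for h, dest in history if abs(h - hour) <= 1]
    if not nearby:
        return None
    return Counter(nearby).most_common(1)[0][0]
```

For a user who usually rides to the office around 8 a.m., a request at 9 a.m. would surface "office" as the suggested destination.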
In particular embodiments, transportation management system 560 may include one or more servers. Each server may be a unitary server or a distributed server spanning multiple computers or multiple datacenters. The servers may be of various types, such as, for example and without limitation, web server, news server, mail server, message server, advertising server, file server, application server, exchange server, database server, proxy server, another server suitable for performing functions or processes described herein, or any combination thereof. In particular embodiments, each server may include hardware, software, or embedded logic components or a combination of two or more such components for carrying out the appropriate functionalities implemented or supported by the server. In particular embodiments, transportation management system 560 may include one or more data stores. The data stores may be used to store various types of information, such as ride information, ride requestor information, ride provider information, historical information, third-party information, or any other suitable type of information. In particular embodiments, the information stored in the data stores may be organized according to specific data structures. In particular embodiments, each data store may be a relational, columnar, correlation, or other suitable database system. Although this disclosure describes or illustrates particular types of databases, this disclosure contemplates any suitable types of databases. Particular embodiments may provide interfaces that enable a user device 530 (which may belong to a ride requestor or provider), a transportation management system 560, vehicle system 540, or a third-party system 570 to process, transform, manage, retrieve, modify, add, or delete the information stored in the data stores.
In particular embodiments, transportation management system 560 may include an authorization server (or other suitable component(s)) that allows users 501 to opt-in to or opt-out of having their information and actions logged, recorded, or sensed by transportation management system 560 or shared with other systems (e.g., third-party systems 570). In particular embodiments, a user 501 may opt-in or opt-out by setting appropriate privacy settings. A privacy setting of a user may determine what information associated with the user may be logged, how information associated with the user may be logged, when information associated with the user may be logged, who may log information associated with the user, whom information associated with the user may be shared with, and for what purposes information associated with the user may be logged or shared. Authorization servers may be used to enforce one or more privacy settings of the users 501 of transportation management system 560 through blocking, data hashing, anonymization, or other suitable techniques as appropriate.
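The enforcement techniques mentioned above (blocking, data hashing, anonymization) could be applied per field of a log record, as in this hedged sketch. The "allow"/"hash"/"block" policy schema is a hypothetical illustration, not a disclosed format:

```python
import hashlib

def apply_privacy(record, settings):
    """Filter or anonymize a log record per a user's privacy settings.

    `settings` maps a field name to one of "allow", "hash", or "block"
    (hypothetical schema). Unlisted fields default to "block".
    """
    out = {}
    for field, value in record.items():
        policy = settings.get(field, "block")
        if policy == "allow":
            out[field] = value                     # log the value as-is
        elif policy == "hash":
            # Anonymize via a one-way hash before logging
            out[field] = hashlib.sha256(str(value).encode()).hexdigest()
        # "block": omit the field from the log entirely
    return out
```

With this default-deny design, any field the user has not explicitly opted in to logging is never recorded.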
In particular embodiments, third-party system 570 may be a network-addressable computing system that may host GPS maps, customer reviews, music or content, weather information, or any other suitable type of information. Third-party system 570 may generate, store, receive, and send relevant data, such as, for example, map data, customer review data from a customer review website, weather data, or any other suitable type of data. Third-party system 570 may be accessed by the other computing entities of the network environment either directly or via network 510. For example, user device 530 may access the third-party system 570 via network 510, or via transportation management system 560. In the latter case, if credentials are required to access the third-party system 570, the user 501 may provide such information to the transportation management system 560, which may serve as a proxy for accessing content from the third-party system 570.
In particular embodiments, user device 530 may be a mobile computing device such as a smartphone, tablet computer, or laptop computer. User device 530 may include one or more processors (e.g., CPU and/or GPU), memory, and storage. An operating system and applications may be installed on the user device 530, such as, e.g., a transportation application associated with the transportation management system 560, applications associated with third-party systems 570, and applications associated with the operating system. User device 530 may include functionality for determining its location, direction, or orientation, based on integrated sensors such as GPS, compass, gyroscope, or accelerometer. User device 530 may also include wireless transceivers for wireless communication, and may support wireless communication protocols such as Bluetooth, near-field communication (NFC), infrared (IR) communication, WI-FI, and/or 2G/3G/4G/LTE mobile communication standards. User device 530 may also include one or more cameras, scanners, touchscreens, microphones, speakers, and any other suitable input-output devices.
In particular embodiments, the vehicle 540 may be an autonomous vehicle and equipped with an array of sensors 544, a navigation system 546, and a ride-service computing device 548. In particular embodiments, a fleet of autonomous vehicles 540 may be managed by the transportation management system 560. The fleet of autonomous vehicles 540, in whole or in part, may be owned by the entity associated with the transportation management system 560, or they may be owned by a third-party entity relative to the transportation management system 560. In either case, the transportation management system 560 may control the operations of the autonomous vehicles 540, including, e.g., dispatching select vehicles 540 to fulfill ride requests, instructing the vehicles 540 to perform select operations (e.g., head to a service center or charging/fueling station, pull over, stop immediately, self-diagnose, lock/unlock compartments, change music station, change temperature, and any other suitable operations), and instructing the vehicles 540 to enter select operation modes (e.g., operate normally, drive at a reduced speed, drive under the command of human operators, and any other suitable operational modes).
In particular embodiments, the autonomous vehicles 540 may receive data from and transmit data to the transportation management system 560 and the third-party system 570. Examples of received data may include, e.g., instructions, new software or software updates, maps, 3D models, trained or untrained machine-learning models, location information (e.g., location of the ride requestor, the autonomous vehicle 540 itself, other autonomous vehicles 540, and target destinations such as service centers), navigation information, traffic information, weather information, entertainment content (e.g., music, video, and news), ride requestor information, ride information, and any other suitable information. Examples of data transmitted from the autonomous vehicle 540 may include, e.g., telemetry and sensor data, determinations/decisions based on such data, vehicle condition or state (e.g., battery/fuel level, tire and brake conditions, sensor condition, speed, odometer, etc.), location, navigation data, passenger inputs (e.g., through a user interface in the vehicle 540, passengers may send/receive data to the transportation management system 560 and/or third-party system 570), and any other suitable data.
In particular embodiments, autonomous vehicles 540 may also communicate with each other as well as other traditional human-driven vehicles, including those managed and not managed by the transportation management system 560. For example, one vehicle 540 may communicate to another vehicle data regarding its location, condition, status, sensor readings, and any other suitable information. In particular embodiments, vehicle-to-vehicle communication may take place over direct short-range wireless connection (e.g., WI-FI, Bluetooth, NFC) and/or over a network (e.g., the Internet or via the transportation management system 560 or third-party system 570).
In particular embodiments, an autonomous vehicle 540 may obtain and process sensor/telemetry data. Such data may be captured by any suitable sensors. For example, the vehicle 540 may have a Light Detection and Ranging (LiDAR) sensor array of multiple LiDAR transceivers that are configured to rotate 360°, emitting pulsed laser light and measuring the reflected light from objects surrounding vehicle 540. In particular embodiments, LiDAR transmitting signals may be steered by use of a gated light valve, which may be a MEMS device that directs a light beam using the principle of light diffraction. Such a device may not use a gimbaled mirror to steer light beams in 360° around the autonomous vehicle. Rather, the gated light valve may direct the light beam into one of several optical fibers, which may be arranged such that the light beam may be directed to many discrete positions around the autonomous vehicle. Thus, data may be captured in 360° around the autonomous vehicle, but no rotating parts may be necessary. A LiDAR is an effective sensor for measuring distances to targets, and as such may be used to generate a three-dimensional (3D) model of the external environment of the autonomous vehicle 540. As an example and not by way of limitation, the 3D model may represent the external environment including objects such as other cars, curbs, debris, and pedestrians up to a maximum range of the sensor arrangement (e.g., 50, 100, or 200 meters). As another example, the autonomous vehicle 540 may have optical cameras pointing in different directions. The cameras may be used for, e.g., recognizing roads, lane markings, street signs, traffic lights, police, other vehicles, and any other visible objects of interest. To enable the vehicle 540 to “see” at night, infrared cameras may be installed. In particular embodiments, the vehicle may be equipped with stereo vision for, e.g., spotting hazards such as pedestrians or tree branches on the road. 
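As an illustrative sketch of how raw LiDAR returns might be turned into the 3D model described above, the following converts each return from spherical coordinates to Cartesian points and discards returns beyond the sensor's maximum range. The function names and data layout are assumptions for illustration:

```python
import math

def lidar_return_to_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return (spherical) to Cartesian x, y, z in meters."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return (x, y, z)

def build_point_cloud(returns, max_range_m=200.0):
    """Keep only returns within the sensor arrangement's maximum range.

    `returns` is an iterable of (range_m, azimuth_deg, elevation_deg) tuples.
    """
    return [lidar_return_to_point(r, az, el)
            for r, az, el in returns
            if r <= max_range_m]
```

A return straight ahead at 10 meters maps to the point (10, 0, 0), while a reflection reported beyond the 200-meter maximum range is dropped.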
As another example, the vehicle 540 may have radars for, e.g., detecting other vehicles and/or hazards afar. Furthermore, the vehicle 540 may have ultrasound equipment for, e.g., parking and obstacle detection. In addition to sensors enabling the vehicle 540 to detect, measure, and understand the external world around it, the vehicle 540 may further be equipped with sensors for detecting and self-diagnosing its own state and condition. For example, the vehicle 540 may have wheel sensors for, e.g., measuring velocity; global positioning system (GPS) for, e.g., determining the vehicle's current geolocation; and/or inertial measurement units, accelerometers, gyroscopes, and/or odometer systems for movement or motion detection. While the description of these sensors provides particular examples of utility, one of ordinary skill in the art would appreciate that the utilities of the sensors are not limited to those examples. Further, while an example of a utility may be described with respect to a particular type of sensor, it should be appreciated that the utility may be achieved using any combination of sensors. For example, an autonomous vehicle 540 may build a 3D model of its surroundings based on data from its LiDAR, radar, sonar, and cameras, along with a pre-generated map obtained from the transportation management system 560 or the third-party system 570. Although sensors 544 appear in a particular location on autonomous vehicle 540 in
In particular embodiments, the autonomous vehicle 540 may be equipped with a processing unit (e.g., one or more CPUs and GPUs), memory, and storage. The vehicle 540 may thus be equipped to perform a variety of computational and processing tasks, including processing the sensor data, extracting useful information, and operating accordingly. For example, based on images captured by its cameras and a machine-vision model, the vehicle 540 may identify particular types of objects captured by the images, such as pedestrians, other vehicles, lanes, curbs, and any other objects of interest.
In particular embodiments, the autonomous vehicle 540 may have a navigation system 546 responsible for safely navigating the autonomous vehicle 540. In particular embodiments, the navigation system 546 may take as input any type of sensor data from, e.g., a Global Positioning System (GPS) module, inertial measurement unit (IMU), LiDAR sensors, optical cameras, radio frequency (RF) transceivers, or any other suitable telemetry or sensory mechanisms. The navigation system 546 may also utilize, e.g., map data, traffic data, accident reports, weather reports, instructions, target destinations, and any other suitable information to determine navigation routes and particular driving operations (e.g., slowing down, speeding up, stopping, swerving, etc.). In particular embodiments, the navigation system 546 may use its determinations to control the vehicle 540 to operate in prescribed manners and to guide the autonomous vehicle 540 to its destinations without colliding with other objects. Although the physical embodiment of the navigation system 546 (e.g., the processing unit) appears in a particular location on autonomous vehicle 540 in
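As a simplified illustration of how sensor-derived inputs might be mapped to the particular driving operations mentioned above, consider this toy policy. The thresholds and the three-second headway rule are assumptions chosen for illustration, not part of any disclosed navigation algorithm:

```python
def choose_driving_operation(obstacle_distance_m, speed_mps, stop_margin_m=5.0):
    """Pick a driving operation from sensor-derived inputs.

    Toy policy: stop if an obstacle is inside the stop margin, slow down
    if it is within an (assumed) 3-second headway, otherwise continue.
    """
    if obstacle_distance_m <= stop_margin_m:
        return "stop"
    if obstacle_distance_m <= speed_mps * 3:
        return "slow_down"
    return "continue"
```

At 10 m/s, an obstacle 2 meters ahead triggers a stop, one 20 meters ahead triggers slowing down, and one 100 meters ahead leaves the vehicle operating normally.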
In particular embodiments, the autonomous vehicle 540 may be equipped with a ride-service computing device 548, which may be a tablet or other suitable device installed by transportation management system 560 to allow the user to interact with the autonomous vehicle 540, transportation management system 560, other users 501, or third-party systems 570. In particular embodiments, installation of ride-service computing device 548 may be accomplished by placing the ride-service computing device 548 inside autonomous vehicle 540, and configuring it to communicate with the vehicle 540 via a wire or wireless connection (e.g., via Bluetooth). Although
This disclosure contemplates any suitable number of computer systems 600. This disclosure contemplates computer system 600 taking any suitable physical form. As an example and not by way of limitation, computer system 600 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, a tablet computer system, an augmented/virtual reality device, or a combination of two or more of these. Where appropriate, computer system 600 may include one or more computer systems 600; be unitary or distributed; span multiple locations; span multiple machines; span multiple data centers; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 600 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 600 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 600 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
In particular embodiments, computer system 600 includes a processor 602, memory 604, storage 606, an input/output (I/O) interface 608, a communication interface 610, and a bus 612. Although this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
In particular embodiments, processor 602 includes hardware for executing instructions, such as those making up a computer program. As an example and not by way of limitation, to execute instructions, processor 602 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 604, or storage 606; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 604, or storage 606. In particular embodiments, processor 602 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 602 including any suitable number of any suitable internal caches, where appropriate. As an example and not by way of limitation, processor 602 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 604 or storage 606, and the instruction caches may speed up retrieval of those instructions by processor 602. Data in the data caches may be copies of data in memory 604 or storage 606 for instructions executing at processor 602 to operate on; the results of previous instructions executed at processor 602 for access by subsequent instructions executing at processor 602 or for writing to memory 604 or storage 606; or other suitable data. The data caches may speed up read or write operations by processor 602. The TLBs may speed up virtual-address translation for processor 602. In particular embodiments, processor 602 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 602 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 602 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 602. 
Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
In particular embodiments, memory 604 includes main memory for storing instructions for processor 602 to execute or data for processor 602 to operate on. As an example and not by way of limitation, computer system 600 may load instructions from storage 606 or another source (such as, for example, another computer system 600) to memory 604. Processor 602 may then load the instructions from memory 604 to an internal register or internal cache. To execute the instructions, processor 602 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 602 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 602 may then write one or more of those results to memory 604. In particular embodiments, processor 602 executes only instructions in one or more internal registers or internal caches or in memory 604 (as opposed to storage 606 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 604 (as opposed to storage 606 or elsewhere). One or more memory buses (which may each include an address bus and a data bus) may couple processor 602 to memory 604. Bus 612 may include one or more memory buses, as described in further detail below. In particular embodiments, one or more memory management units (MMUs) reside between processor 602 and memory 604 and facilitate accesses to memory 604 requested by processor 602. In particular embodiments, memory 604 includes random access memory (RAM). This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be dynamic RAM (DRAM) or static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM. Memory 604 may include one or more memories 604, where appropriate. 
Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
In particular embodiments, storage 606 includes mass storage for data or instructions. As an example and not by way of limitation, storage 606 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 606 may include removable or non-removable (or fixed) media, where appropriate. Storage 606 may be internal or external to computer system 600, where appropriate. In particular embodiments, storage 606 is non-volatile, solid-state memory. In particular embodiments, storage 606 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these. This disclosure contemplates mass storage 606 taking any suitable physical form. Storage 606 may include one or more storage control units facilitating communication between processor 602 and storage 606, where appropriate. Where appropriate, storage 606 may include one or more storages 606. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
In particular embodiments, I/O interface 608 includes hardware, software, or both, providing one or more interfaces for communication between computer system 600 and one or more I/O devices. Computer system 600 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 600. As an example and not by way of limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 608 for them. Where appropriate, I/O interface 608 may include one or more device or software drivers enabling processor 602 to drive one or more of these I/O devices. I/O interface 608 may include one or more I/O interfaces 608, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
In particular embodiments, communication interface 610 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 600 and one or more other computer systems 600 or one or more networks. As an example and not by way of limitation, communication interface 610 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 610 for it. As an example and not by way of limitation, computer system 600 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 600 may communicate with a wireless PAN (WPAN) (such as, for example, a Bluetooth WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these. Computer system 600 may include any suitable communication interface 610 for any of these networks, where appropriate. Communication interface 610 may include one or more communication interfaces 610, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.
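The packet-based communication contemplated above, between computer system 600 and another computer system over a network, can be sketched with standard sockets. This is an illustrative loopback exchange, not a depiction of any particular embodiment; the port is chosen by the operating system and the message contents are arbitrary.

```python
import socket
import threading

# A minimal sketch of packet-based communication between two endpoints,
# both running on the local host for illustration.
def serve(server_sock):
    conn, _ = server_sock.accept()
    with conn:
        request = conn.recv(1024)          # receive a request
        conn.sendall(b"ack:" + request)    # reply over the same connection

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))              # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=serve, args=(server,))
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))
client.sendall(b"hello")
reply = client.recv(1024)
client.close()
t.join()
server.close()

print(reply.decode())  # → ack:hello
```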
In particular embodiments, bus 612 includes hardware, software, or both coupling components of computer system 600 to each other. As an example and not by way of limitation, bus 612 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these. Bus 612 may include one or more buses 612, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.
Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such as, for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context.
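The inclusive reading of "or" defined above can be checked against a short truth table; this sketch simply uses Python's own inclusive `or` operator to enumerate the cases.

```python
# Inclusive "or": "A or B" holds when A is true, B is true, or both are
# true; it fails only when A and B are both false.
truth_table = {(a, b): (a or b) for a in (False, True) for b in (False, True)}

# Only the all-false case fails under the inclusive reading.
false_cases = [pair for pair, value in truth_table.items() if not value]
print(false_cases)  # → [(False, False)]
```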
The scope of this disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of this disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, features, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, features, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative. Additionally, although this disclosure describes or illustrates particular embodiments as providing particular advantages, particular embodiments may provide none, some, or all of these advantages.