Controlling driving modes of self-driving vehicles

Abstract
A method controls an operational mode of a self-driving vehicle (SDV). One or more physical detectors detect an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of an SDV that is initially being operated in an evasive autonomous mode. One or more processors retrieve traffic pattern data for other SDVs, and examine the traffic pattern data to 1) determine a first traffic flow of the other SDVs while operating in the evasive autonomous mode, and 2) determine a second traffic flow of the other SDVs while operating in a manual mode. In response to determining that the first traffic flow has a higher accident rate than the second traffic flow, an operational mode device changes the operational mode of the SDV from the evasive autonomous mode to the manual mode.
Description
BACKGROUND

The present disclosure relates to the field of vehicles, and specifically to the field of self-driving vehicles. Still more specifically, the present disclosure relates to the field of controlling whether self-driving vehicles operate in autonomous mode or manual mode.


Self-driving vehicles (SDVs) are vehicles that are able to autonomously drive themselves through private and/or public spaces. Using a system of sensors that detect the location and/or surroundings of the SDV, logic within or associated with the SDV controls the speed, propulsion, braking, and steering of the SDV based on the sensor-detected location and surroundings of the SDV.


SUMMARY

In an embodiment of the present invention, a method controls an operational mode of a self-driving vehicle (SDV). One or more physical detectors detect an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of an SDV that is initially being operated in an evasive autonomous mode. One or more processors retrieve traffic pattern data for other SDVs, and examine the traffic pattern data to 1) determine a first traffic flow of the other SDVs while operating in the evasive autonomous mode, and 2) determine a second traffic flow of the other SDVs while operating in a manual mode. In response to determining that the first traffic flow has a higher accident rate than the second traffic flow, an operational mode device changes the operational mode of the SDV from the evasive autonomous mode to the manual mode.


In an embodiment of the present invention, a computer program product controls an operational mode of a self-driving vehicle (SDV). The computer program product includes a non-transitory computer readable storage medium having program code embodied therewith. The program code is readable and executable by a processor to perform a method of: detecting, by one or more physical detectors, an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of an SDV, where the SDV and the EDV are traveling on a roadway, and where the SDV is initially being operated in an evasive autonomous mode; retrieving driver profile information about a human driver of the SDV; assigning the human driver of the SDV to a cohort of drivers who have traveled on the roadway in other SDVs, where the human driver of the SDV shares more than a predetermined quantity of traits with members of the cohort of drivers who have traveled in the other SDVs; and in response to assigning the human driver of the SDV to the cohort of drivers who have traveled on the roadway in other SDVs, switching the operational mode of the SDV from the evasive autonomous mode to a manual mode.


In an embodiment of the present invention, a self-driving vehicle includes a processor, a computer readable memory, a non-transitory computer readable storage medium, and a set of program instructions that include: first program instructions to detect, by one or more physical detectors on an SDV, an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of the SDV, where the SDV is initially being operated in an evasive autonomous mode; second program instructions to retrieve traffic pattern data for other SDVs; third program instructions to examine the traffic pattern data to determine a first traffic flow of the other SDVs while operating in the evasive autonomous mode; fourth program instructions to examine the traffic pattern data to determine a second traffic flow of the other SDVs while operating in a nominal autonomous mode; and fifth program instructions to, in response to determining that the first traffic flow has a higher accident rate than the second traffic flow, instruct an operational mode device to change the operational mode of the SDV from the evasive autonomous mode back to the nominal autonomous mode. The first, second, third, fourth, and fifth program instructions are stored on the non-transitory computer readable storage medium for execution by one or more processors via the computer readable memory.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 depicts an exemplary system and network in which the present disclosure may be implemented;



FIG. 2 illustrates an exemplary self-driving vehicle (SDV) traveling on a roadway in accordance with one or more embodiments of the present invention;



FIG. 3 depicts additional detail of control hardware within an SDV;



FIG. 4 depicts communication linkages among SDVs and a coordinating server;



FIG. 5 is a high-level flow chart of one or more steps performed by one or more processors to control a driving mode of an SDV in accordance with one or more embodiments of the present invention;



FIG. 6 depicts a cloud computing node according to an embodiment of the present disclosure;



FIG. 7 depicts a cloud computing environment according to an embodiment of the present disclosure; and



FIG. 8 depicts abstraction model layers according to an embodiment of the present disclosure.





DETAILED DESCRIPTION

The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.


The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.


Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.


Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.


Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.


These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.


The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.


The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.


With reference now to the figures, and in particular to FIG. 1, there is depicted a block diagram of an exemplary system and network that may be utilized by and/or in the implementation of the present invention. Some or all of the exemplary architecture, including both depicted hardware and software, shown for and within computer 101 may be utilized by software deploying server 149 shown in FIG. 1, and/or coordinating computer 201 shown in FIG. 2, and/or a self-driving vehicle (SDV) on-board computer 301 shown in FIG. 3, and/or a coordinating server 401 depicted in FIG. 4.


Exemplary computer 101 includes a processor 103 that is coupled to a system bus 105. Processor 103 may utilize one or more processors, each of which has one or more processor cores. A video adapter 107, which drives/supports a display 109, is also coupled to system bus 105. System bus 105 is coupled via a bus bridge 111 to an input/output (I/O) bus 113. An I/O interface 115 is coupled to I/O bus 113. I/O interface 115 affords communication with various I/O devices, including a keyboard 117, a mouse 119, a media tray 121 (which may include storage devices such as CD-ROM drives, multi-media interfaces, etc.), a transceiver 123 (capable of transmitting and/or receiving electronic communication signals), and external USB port(s) 125. While the format of the ports connected to I/O interface 115 may be any known to those skilled in the art of computer architecture, in one embodiment some or all of these ports are universal serial bus (USB) ports.


As depicted, computer 101 is able to communicate with a software deploying server 149 and/or other devices/systems (e.g., establishing communication among SDV 202, EDV 204, and/or coordinating server 401 depicted in the figures below) using a network interface 129. Network interface 129 is a hardware network interface, such as a network interface card (NIC), etc. Network 127 may be an external network such as the Internet, or an internal network such as an Ethernet or a virtual private network (VPN). In one or more embodiments, network 127 is a wireless network, such as a Wi-Fi network, a cellular network, etc.


A hard drive interface 131 is also coupled to system bus 105. Hard drive interface 131 interfaces with a hard drive 133. In one embodiment, hard drive 133 populates a system memory 135, which is also coupled to system bus 105. System memory is defined as a lowest level of volatile memory in computer 101. This volatile memory includes additional higher levels of volatile memory (not shown), including, but not limited to, cache memory, registers and buffers. Data that populates system memory 135 includes computer 101's operating system (OS) 137 and application programs 143.


OS 137 includes a shell 139, for providing transparent user access to resources such as application programs 143. Generally, shell 139 is a program that provides an interpreter and an interface between the user and the operating system. More specifically, shell 139 executes commands that are entered into a command line user interface or from a file. Thus, shell 139, also called a command processor, is generally the highest level of the operating system software hierarchy and serves as a command interpreter. The shell provides a system prompt, interprets commands entered by keyboard, mouse, or other user input media, and sends the interpreted command(s) to the appropriate lower levels of the operating system (e.g., a kernel 141) for processing. While shell 139 is a text-based, line-oriented user interface, the present invention will equally well support other user interface modes, such as graphical, voice, gestural, etc.


As depicted, OS 137 also includes kernel 141, which includes lower levels of functionality for OS 137, including providing essential services required by other parts of OS 137 and application programs 143, including memory management, process and task management, disk management, and mouse and keyboard management.


Application programs 143 include a renderer, shown in exemplary manner as a browser 145. Browser 145 includes program modules and instructions enabling a world wide web (WWW) client (i.e., computer 101) to send and receive network messages to the Internet using hypertext transfer protocol (HTTP) messaging, thus enabling communication with software deploying server 149 and other systems.


Application programs 143 in computer 101's system memory (as well as software deploying server 149's system memory) also include Logic for Managing Self-Driving Vehicles (LMSDV) 147. LMSDV 147 includes code for implementing the processes described below, including those described in FIGS. 2-5. In one embodiment, computer 101 is able to download LMSDV 147 from software deploying server 149, including on an on-demand basis, wherein the code in LMSDV 147 is not downloaded until needed for execution. In one embodiment of the present invention, software deploying server 149 performs all of the functions associated with the present invention (including execution of LMSDV 147), thus freeing computer 101 from having to use its own internal computing resources to execute LMSDV 147.


Also within computer 101 is a positioning system 151, which determines a real-time current location of computer 101 (particularly when part of an emergency vehicle and/or a self-driving vehicle as described herein). Positioning system 151 may be a combination of accelerometers, speedometers, etc., or it may be a global positioning system (GPS) that utilizes space-based satellites to provide triangulated signals used to determine two-dimensional or three-dimensional locations.


Also associated with computer 101 are sensors 153, which detect an environment of the computer 101. More specifically, sensors 153 are able to detect vehicles, road obstructions, pavement, etc. For example, if computer 101 is on board a self-driving vehicle (SDV), then sensors 153 may be cameras, radar transceivers, etc. that allow the SDV to detect the environment (e.g., other vehicles including erratically driven vehicles as described herein, road obstructions, pavement, etc.) of that SDV, thus enabling it to be autonomously self-driven. Similarly, sensors 153 may be cameras, thermometers, moisture detectors, etc. that detect ambient weather conditions and other environmental conditions of a roadway upon which the SDV is traveling.


The hardware elements depicted in computer 101 are not intended to be exhaustive, but rather are representative to highlight essential components required by the present invention. For instance, computer 101 may include alternate memory storage devices such as magnetic cassettes, digital versatile disks (DVDs), Bernoulli cartridges, and the like. These and other variations are intended to be within the spirit and scope of the present invention.


With reference now to FIG. 2, an exemplary self-driving vehicle (SDV) 202 and an erratically driven vehicle (EDV) 204 traveling along a roadway 206 in accordance with one or more embodiments of the present invention are presented. Roadway 206 may be a public roadway, a private roadway, a parking lot, a paved road, an unpaved road, and/or any other surface capable of supporting vehicles, which may be wheeled (e.g., cars), tracked (e.g., trains), or a combination thereof.


An EDV is defined as a vehicle that is being driven in a manner that has been predetermined to be unsafe, including but not limited to a manner in which movement is erratic. That is, EDV 204 is deemed to be an erratically driven vehicle if it is currently weaving in and out of traffic, not staying within lane dividers, going too fast or too slow for current road/traffic conditions, exceeding a posted speed limit, tailgating another vehicle, and/or being driven in any other manner that has been predetermined to be erratic or otherwise improper, and therefore is unsafe.


Additionally, in one or more embodiments EDV 204 is deemed to be an erratically driven vehicle if it is entering and leaving manual and autonomous modes at a high rate (e.g., the vehicle switches into and out of a mode more than a certain number of times per unit of time, as determined by examining an indicator of the mode on the vehicle). For example, if SDV 202 determines that EDV 204 has toggled modes (e.g., between manual mode and autonomous mode) 20 times in one minute, then EDV 204 is deemed to be erratically driven. This mode toggling can be conveyed by an indicator (visual or aural) on SDV 202. That is, SDV on-board computer 301 in SDV 202 can monitor the driving mode device 307 in EDV 204, assuming that the SDV on-board computer 301 in EDV 204 transmits the status/output of the driving mode device 307 in EDV 204 to other SDVs. If SDV 202 receives messages/signals from EDV 204 indicating that the output of the driving mode device 307 in EDV 204 is toggling (switching) more often than a predetermined frequency, then EDV 204 is deemed to be operating in an erratic/unsafe manner, and this information is presented on a warning light, warning tone speaker, etc. (not shown) in the cabin of SDV 202.
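
For illustration only, the following Python sketch shows one possible way to implement the mode-toggle check just described, counting mode changes reported by a nearby vehicle within a sliding time window. The one-minute window and 20-toggle threshold follow the example above, but the class structure and method names are assumptions rather than part of any described embodiment.

```python
# Illustrative sketch (not from the specification): flag a nearby vehicle as
# erratically driven when it toggles between manual and autonomous modes more
# than a predetermined number of times within a sliding time window.
from collections import deque

class ModeToggleMonitor:
    def __init__(self, window_seconds=60.0, max_toggles=20):
        self.window_seconds = window_seconds   # e.g., one minute
        self.max_toggles = max_toggles         # e.g., 20 mode changes
        self._toggle_times = deque()
        self._last_mode = None

    def report_mode(self, mode, timestamp):
        """Record the driving mode ('manual' or 'autonomous') reported by
        the other vehicle at the given timestamp (in seconds)."""
        if self._last_mode is not None and mode != self._last_mode:
            self._toggle_times.append(timestamp)
        self._last_mode = mode
        # Drop toggle events that fall outside the sliding window.
        while self._toggle_times and timestamp - self._toggle_times[0] > self.window_seconds:
            self._toggle_times.popleft()

    def is_erratic(self):
        """True if the vehicle toggled modes at least max_toggles times
        within the current window."""
        return len(self._toggle_times) >= self.max_toggles
```

In such a sketch, a receive loop in the SDV on-board computer could call report_mode() for each status message received from the other vehicle and consult is_erratic() before deciding whether to activate the in-cabin warning and/or the evasive autonomous mode.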


As indicated by its askew positioning in FIG. 2, EDV 204 is being driven in an erratic/unsafe manner. In one embodiment, EDV 204 is an SDV operating in autonomous mode (discussed below) whose systems are malfunctioning, thus resulting in the erratic/unsafe operations of EDV 204. In another embodiment, EDV 204 is either an SDV operating in manual mode (also discussed below), or is a non-SDV vehicle that is always driven in manual mode, such that a human driver is driving the EDV 204 in an erratic/unsafe manner.


In accordance with one or more embodiments of the present invention, SDV 202 utilizes on-board sensors and/or readings from roadway sensor(s) 208 to determine that EDV 204 is currently operating in an erratic/unsafe manner. In response to making this determination, on-board controllers (discussed below) on the SDV 202 adjust the controls of SDV 202 to provide additional distance between the SDV 202 and the EDV 204 (e.g., by causing SDV 202 to change lanes or to speed up or to slow down, thus providing more space between SDV 202 and EDV 204), and/or to perform other reactive actions (e.g., notifying the proper authorities of the erratic driving behavior of EDV 204, sending suggestions to EDV 204, etc.).


Current conditions of the roadway 206, including weather conditions, traffic conditions, construction events, accident events, etc., can be determined and transmitted by a coordinating computer 201. That is, coordinating computer 201 is able to determine current roadway conditions based on internal sensors 153 shown in FIG. 1, and/or roadway sensor(s) 208 (e.g., mechanical, visual, and/or electrical sensors that are able to detect the number and speed of vehicles traveling on the roadway 206, the amount and/or type of precipitation on the roadway 206, the temperature of the roadway 206 and/or ambient air around the roadway 206, the movement of vehicles traveling along roadway 206, etc.), as well as information received from sensors and/or on-board computers within SDV 202 and/or EDV 204, and/or from information received by an information service (e.g., a weather station). In one or more embodiments, these roadway conditions are utilized in the decision regarding into which operational/driving mode the SDV 202 should be placed.


In accordance with one or more embodiments of the present invention, SDV 202 may be driven in “manual mode”, “nominal autonomous mode”, “evasive autonomous mode”, or “stopping autonomous mode”, each of which is referred to herein as an operational mode or a driving mode, where the terms “operational mode” and “driving mode” are synonymous and interchangeable.


As used and described herein, “manual mode” is defined as an SDV being at least partially under the input control of a human driver. That is, if SDV 202 is being steered by a human driver but has cruise control activated, then it is in manual mode, since SDV 202 is partially under the input control (steering) of the human driver. Thus, while in manual mode, even SDV 202 can operate as a traditional motor vehicle, in which a human driver controls the engine throttle, engine on/off switch, steering mechanism, braking system, horn, signals, etc. found on a motor vehicle. These vehicle mechanisms may be operated in a “drive-by-wire” manner, in which inputs to an SDV control processor 303 (shown in FIG. 3) by the driver result in output signals that control the SDV vehicular physical control mechanisms 305 (e.g., the engine throttle, steering mechanisms, braking systems, turn signals, etc.).


As used and described herein, “nominal autonomous mode” is defined as an SDV being totally controlled by hardware/software logic (e.g., SDV on-board computer 301 and/or driving mode device 307 and/or SDV control processor 303 shown in FIG. 3) without inputs from the human driver under roadway and/or SDV conditions that have been predetermined to be normal (i.e., “nominal”). That is, if steering, braking, throttle control, obstacle/vehicle avoidance, etc. are all under the control of hardware/software logic such as the SDV on-board computer 301 shown in FIG. 3, then SDV 202 is in an autonomous mode, including the “nominal autonomous mode” in which roadway conditions are normal. Roadway and/or SDV conditions are deemed to be normal/nominal if the mechanical systems of the SDV are operating properly, roadway conditions are not unduly hazardous (e.g., are not icy), and other vehicles are operating properly (e.g., are not operating in an erratic or otherwise unsafe manner). The factors used to determine if the SDV and/or roadway are nominal or not nominal are predetermined. That is, various operational parameters (e.g., loss of pressure in brake lines, ice on the roadway, the presence of an EDV within a predetermined distance of the SDV, etc.) are stored in a database. If these parameters/conditions are occurring, then nominal SDV/roadway conditions are not being met.


As used and described herein, “evasive autonomous mode” is defined as an SDV being totally controlled by hardware/software logic (e.g., SDV on-board computer 301 and/or driving mode device 307 and/or SDV control processor 303 shown in FIG. 3) without inputs from the human driver (similar to “nominal autonomous mode”), but under tighter restrictions than those under the nominal autonomous mode and with an increased focus on the detected EDV. That is, if sensors within SDV 202 detect a vehicle within a certain proximity to SDV 202 traveling in an erratic or other unsafe manner, then going into “evasive autonomous mode” (as instructed by the driving mode device 307 in FIG. 3) causes the SDV on-board computer 301 shown in FIG. 3 to execute instructions designed for conditions in which the SDV 202 is near the EDV 204. Such instructions will result in additional distance between the SDV 202 and the EDV 204 (e.g., by SDV 202 speeding up, SDV 202 slowing down, SDV 202 changing lanes, etc. to put more space between SDV 202 and EDV 204), and/or will cause the SDV 202 to issue alert warnings, direct the SDV on-board computer 301 to prioritize receipt of and response to sensor readings that are focused on the EDV 204, etc.


As used and described herein, “stopping autonomous mode” is defined as an SDV being totally controlled by hardware/software logic (e.g., SDV on-board computer 301 and/or driving mode device 307 and/or SDV control processor 303 shown in FIG. 3) without inputs from the human driver (similar to “nominal autonomous mode” or “evasive autonomous mode”), but now specifically designed to steer the SDV to a safe location (e.g., a roadway shoulder, a next exit, etc.) and to stop the SDV at that safe location.
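
For reference, the four operational/driving modes defined above can be summarized as a simple enumeration. The following Python sketch is illustrative only; the identifier names are assumptions chosen for clarity rather than terms from the described embodiments.

```python
# Illustrative sketch: the four operational/driving modes described above.
from enum import Enum, auto

class DrivingMode(Enum):
    MANUAL = auto()              # at least partially under human driver input control
    NOMINAL_AUTONOMOUS = auto()  # fully autonomous under normal roadway/SDV conditions
    EVASIVE_AUTONOMOUS = auto()  # fully autonomous, with increased focus on a detected EDV
    STOPPING_AUTONOMOUS = auto() # steers the SDV to a safe location and stops it there
```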


In one or more embodiments of the present invention, a weighted voting system is used to weight the various variables used in making the decision to enter one or more of the above-described modes. Such inputs may include: votes by other nearby cars, a history of the driver of the EDV, a history of the efficiency of the control system of the SDV in avoiding collisions in adverse circumstances, etc. Such weighted voting approaches may be characterized primarily by three aspects: the inputs, the weights, and the quota. The inputs are (I1, I2, . . . , IN), where N denotes the total number of inputs. An input's weight (w) is the number of “votes” associated with that input. The quota (q) is the minimum number of votes required to “pass a motion,” which in this case is a decision regarding the control/driving mode into which the SDV is placed.
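
For illustration only, the following Python sketch shows one possible reading of the weighted voting approach described above: each input casts a vote, each vote carries a weight, and a mode is selected when the weighted total meets the quota q. The example inputs, weights, and quota values are assumptions.

```python
# Illustrative sketch of a weighted voting decision. Each input I_k casts a
# vote (1 = in favor of the candidate mode, 0 = against), its weight w_k is
# the number of votes it carries, and the quota q is the minimum weighted
# total needed to "pass the motion" (i.e., select the mode).

def weighted_vote(votes, weights, quota):
    """Return True if the weighted votes meet or exceed the quota."""
    total = sum(w for v, w in zip(votes, weights) if v)
    return total >= quota

# Example: three inputs vote on entering evasive autonomous mode:
# nearby cars' votes, the EDV driver's history, and the SDV control
# system's collision-avoidance record (all values are assumptions).
votes   = [1, 1, 0]      # I1, I2, I3
weights = [3, 2, 1]      # w1, w2, w3
quota   = 4              # q
enter_evasive_mode = weighted_vote(votes, weights, quota)  # True (3 + 2 >= 4)
```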


Additional details of one or more embodiments of the SDV 202 (which may have a same architecture as EDV 204 when enabled as an SDV) are presented in FIG. 3. As shown in FIG. 3, SDV 202 has an SDV on-board computer 301 that controls operations of the SDV 202. According to directives from a driving mode device 307, the SDV 202 can be selectively operated in manual mode or autonomous mode (including the nominal autonomous mode, evasive autonomous mode, and/or stopping autonomous mode described above). In a preferred embodiment, driving mode device 307 is a dedicated hardware device that selectively directs the SDV on-board computer 301 to operate the SDV 202 in one of the autonomous modes or in the manual mode.


While in autonomous mode, SDV 202 operates without the input of a human driver, such that the engine, steering mechanism, braking system, horn, signals, etc. are controlled by the SDV control processor 303, which is now under the control of the SDV on-board computer 301. That is, when the SDV on-board computer 301 processes inputs taken from navigation and control sensors 309 and the driving mode device 307 (indicating that the SDV 202 is to be controlled autonomously), driver inputs to the SDV control processor 303 and/or SDV vehicular physical control mechanisms 305 are no longer needed.


As just mentioned, the SDV on-board computer 301 uses outputs from navigation and control sensors 309 to control the SDV 202. Navigation and control sensors 309 include hardware sensors that 1) determine the location of the SDV 202; 2) sense other cars and/or obstacles and/or physical structures around SDV 202; 3) measure the speed and direction of the SDV 202; and 4) provide any other inputs needed to safely control the movement of the SDV 202.


With respect to the feature of 1) determining the location of the SDV 202, this can be achieved through the use of a positioning system such as positioning system 151 shown in FIG. 1. Positioning system 151 may use a global positioning system (GPS), which uses space-based satellites that provide positioning signals that are triangulated by a GPS receiver to determine a 3-D geophysical position of the SDV 202. Positioning system 151 may also use, either alone or in conjunction with a GPS system, physical movement sensors such as accelerometers (which measure the rate of change of a vehicle's velocity in any direction), speedometers (which measure the instantaneous speed of a vehicle), airflow meters (which measure the flow of air around a vehicle), etc. Such physical movement sensors may incorporate the use of semiconductor strain gauges, electromechanical gauges that take readings from drivetrain rotations, barometric sensors, etc.


With respect to the feature of 2) sensing other cars and/or obstacles and/or physical structures around SDV 202, the positioning system 151 may use radar or other electromagnetic energy that is emitted from an electromagnetic radiation transmitter (e.g., transceiver 323 shown in FIG. 3), bounced off a physical structure (e.g., another car), and then received by an electromagnetic radiation receiver (e.g., transceiver 323). By measuring the time it takes to receive back the emitted electromagnetic radiation, and/or evaluating a Doppler shift (i.e., a change in frequency to the electromagnetic radiation that is caused by the relative movement of the SDV 202 to objects being interrogated by the electromagnetic radiation) in the received electromagnetic radiation from when it was transmitted, the presence and location of other physical objects can be ascertained by the SDV on-board computer 301.
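
For illustration only, the following Python sketch shows the standard round-trip-time and Doppler-shift relationships that such a radar-based measurement relies on; the specific transmit frequency and timing values are assumptions used purely as an example.

```python
# Illustrative sketch of the ranging and Doppler calculations described above.
# Range follows from the round-trip time of the emitted electromagnetic
# radiation, and relative (closing) speed follows from the Doppler shift of
# the returned signal for a monostatic radar.

C = 3.0e8  # approximate speed of light, m/s

def range_from_round_trip(round_trip_seconds):
    """One-way distance to the reflecting object."""
    return C * round_trip_seconds / 2.0

def relative_speed_from_doppler(transmit_hz, doppler_shift_hz):
    """Relative speed of the object toward (or away from) the radar."""
    return C * doppler_shift_hz / (2.0 * transmit_hz)

# Example (assumed values): a 2 microsecond round trip puts the object about
# 300 m away; a 10 kHz shift on a 77 GHz automotive radar is roughly a
# 19.5 m/s closing speed.
distance_m = range_from_round_trip(2.0e-6)
closing_speed_mps = relative_speed_from_doppler(77.0e9, 10.0e3)
```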


With respect to the feature of 3) measuring the speed and direction of the SDV 202, this can be accomplished by taking readings from an on-board speedometer (not depicted) on the SDV 202 and/or detecting movements to the steering mechanism (also not depicted) on the SDV 202 and/or the positioning system 151 discussed above.


With respect to the feature of 4) providing any other inputs needed to safely control the movement of the SDV 202, such inputs include, but are not limited to, control signals to activate a horn, turning indicators, flashing emergency lights, etc. on the SDV 202.


In one or more embodiments of the present invention, SDV 202 includes roadway sensors 311 that are coupled to the SDV 202. Roadway sensors 311 may include sensors that are able to detect the amount of water, snow, or ice on the roadway 206 (e.g., using cameras, heat sensors, moisture sensors, thermometers, etc.). Roadway sensors 311 also include sensors that are able to detect “rough” roadways (e.g., roadways having potholes, poorly maintained pavement, no paving, etc.) using cameras, vibration sensors, etc. Roadway sensors 311 may also include light sensors that are able to detect how dark the roadway 206 is.


Similarly, a dedicated camera 321 can be trained on roadway 206 in order to provide photographic images that can be evaluated to recognize erratic vehicular operations. For example, sequences of photographic images can show the velocity and any change in direction of EDV 204, thus enabling recognition of the erratic/unsafe driving pattern of EDV 204.


Similarly, a dedicated object motion detector 319 (e.g., a radar transceiver capable of detecting Doppler shifts indicative of the speed and direction of movement of EDV 204) can be trained on roadway 206 in order to detect the movement of EDV 204, thus enabling recognition of the erratic/unsafe driving pattern of EDV 204.


In one or more embodiments of the present invention, also within the SDV 202 are SDV equipment sensors 315. SDV equipment sensors 315 may include cameras aimed at tires on the SDV 202 to detect how much tread is left on the tire. SDV equipment sensors 315 may include electronic sensors that detect how much pad material is left on the brake calipers of disk brakes. SDV equipment sensors 315 may include drivetrain sensors that detect operating conditions within an engine (e.g., power, speed, revolutions per minute—RPMs of the engine, timing, cylinder compression, coolant levels, engine temperature, oil pressure, etc.), the transmission (e.g., transmission fluid level, conditions of the clutch, gears, etc.), etc. SDV equipment sensors 315 may include sensors that detect the condition of other components of the SDV 202, including lights (e.g., using circuitry that detects if a bulb is broken), wipers (e.g., using circuitry that detects a faulty wiper blade, wiper motor, etc.), etc.


In one or more embodiments of the present invention, also within SDV 202 is a communications transceiver 317, which is able to receive and transmit electronic communication signals (e.g., RF messages) from and to other communications transceivers found in other vehicles, servers, monitoring systems, etc.


In one or more embodiments of the present invention, also within SDV 202 is a telecommunication device 325 (e.g., a smart phone, a cell phone, a laptop computer, etc.), which may be connected (e.g., via a near field communication—NFC connection) to the SDV on-board computer 301. Thus, alerts regarding EDV 204 may be transmitted to a smart phone within the EDV 204 or another vehicle or agency (e.g., a local law enforcement agency).


Returning to FIG. 2, one or more embodiments of the present invention are directed to determining whether to place the SDV 202 into evasive autonomous mode or stopping autonomous mode, based on the presence of the EDV 204 and/or based on the qualifications/abilities/condition of the driver of the EDV 204 and/or based on the condition/state of mechanical and/or control equipment of the EDV 204, particularly when the EDV 204 is operating in autonomous (SDV) mode. In one or more embodiments, this decision is also based on roadway conditions and/or vehicle conditions, as well as other factors. That is, if the movement of EDV 204 is so hazardous that the prudent step would be to simply pull over to the side of the road, then the system will place SDV 202 into stopping autonomous mode, thereby pulling the SDV 202 over to the side of the road (or onto the next available exit).


In order to determine whether or not to switch from evasive autonomous mode (or even nominal autonomous mode) to stopping autonomous mode, one or more embodiments of the present invention utilize a metric of observed conditions. For example, assume that SDV equipment sensors 315 in FIG. 3 detect that the thickness of the brake linings on SDV 202 is below a predefined limit (e.g., less than 2 mm of brake pad remains on the brake calipers), that object motion detector 319 and/or camera 321 have identified EDV 204 as changing lanes (i.e., swerving) more frequently than another predefined limit (e.g., every five seconds over the course of 30 seconds), and that roadway sensors 311 have determined that roadway 206 (shown in FIG. 2) is covered by a film of ice more than a predefined thickness (e.g., more than ½ inch). These parameters are summed. If their summation exceeds a predefined limit, then SDV 202 is automatically placed into stopping autonomous mode. If not, then SDV 202 stays in its current operational mode (e.g., nominal autonomous mode or evasive autonomous mode). These parameters may be weighted, such that the value associated with swerving is given a higher weight (e.g., 2×) than the condition of the brake pads (e.g., 0.5×).
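
For illustration only, the following Python sketch shows one possible form of the weighted summation just described, using the 2× weight for swerving and the 0.5× weight for brake-pad condition mentioned above; the remaining weight, the scoring scheme, and the predefined limit are assumptions.

```python
# Illustrative sketch of the weighted metric described above: three observed
# conditions (worn brake linings, frequent lane changes by the EDV, ice on the
# roadway) are converted to scores, weighted, and summed; if the sum exceeds a
# predefined limit, the SDV is placed into stopping autonomous mode.

def should_enter_stopping_mode(brake_pad_worn, edv_swerving, roadway_icy,
                               weights=(0.5, 2.0, 1.0), limit=2.5):
    """Each condition is a boolean reading derived from the corresponding
    sensor; weights follow the 0.5x / 2x example in the text (ice weight and
    limit are assumptions)."""
    scores = (float(brake_pad_worn), float(edv_swerving), float(roadway_icy))
    weighted_sum = sum(w * s for w, s in zip(weights, scores))
    return weighted_sum > limit

# Example: swerving EDV (weight 2x) plus icy roadway, but healthy brakes:
# 0.5*0 + 2.0*1 + 1.0*1 = 3.0 > 2.5, so stopping autonomous mode is selected.
enter_stopping = should_enter_stopping_mode(False, True, True)
```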


In an embodiment of the present invention, once the parameters that caused SDV 202 to go into evasive autonomous mode and/or stopping autonomous mode cease, then the SDV on-board computer 301 directs SDV 202 to return to nominal autonomous mode or manual mode. For example, if the object motion detector 319 and/or camera 321 detect that EDV 204 has exited roadway 206; or that EDV 204 has pulled far enough away from SDV 202 (e.g., one mile) to no longer pose a threat to SDV 202; or that EDV 204 has started driving safely (e.g., EDV 204 has entered an autonomous mode and/or sent a message to SDV 202 indicating as such, and movements of EDV 204 captured by object motion detector 319 and/or camera 321 confirm that EDV 204 is now driving safely); etc., then the SDV on-board computer 301 in SDV 202 will cause SDV 202 to revert back to nominal autonomous mode or manual mode.


Note that the parameters/summation/weighting just described for going from one autonomous mode to another autonomous mode may also be used in determining if the SDV 202 should go from manual mode to one of the autonomous modes.


Other factors considered in determining which operational mode to place the SDV into include the driving ability of the driver of the SDV 202, the driving ability of the driver of the EDV 204, the accuracy of environmental and other sensors (e.g., sensors 153 shown in FIG. 1) mounted on the SDV 202, and/or learned traffic patterns for vehicles on the roadway 206. For example, if the driver of the SDV 202 has a profile and/or driving history that indicates an exceptionally high level of driving skill, then the system will “trust” the driver to maneuver around the EDV 204. Alternatively, if the profile/history of the driver of the EDV 204 is exceptionally poor, then the system will take proactive steps to place the SDV 202 into a cautious mode (e.g., evasive autonomous mode) even if the movement of the EDV 204 has not exceeded predefined criteria (e.g., the EDV 204 is not currently swerving, speeding, etc.). Other factors (e.g., sensor accuracy, environmental/weather conditions, learned traffic patterns for vehicles on the roadway, etc.) will be used in various embodiments to determine how conservative the system will be when determining whether the manual mode or a particular type of autonomous mode will be utilized by the SDV. For example, if the roadway is icy, sensors on the SDV 202 are having trouble accurately detecting movement of the EDV 204, the roadway 206 has a high incidence of accidents, etc., then the system will take the SDV 202 from the nominal autonomous mode to the evasive autonomous mode sooner than if the roadway were dry, the sensors were operating accurately, the roadway had a history of few accidents, etc.


Messages/sensor readings describing factors used by the SDV 202 to decide which operational/driving mode to use may come from the SDV 202 itself, another vehicle (e.g., EDV 204), the coordinating computer 201, and/or the coordinating server 401 shown in FIG. 4. Coordinating server 401 may coordinate the control of the driving mode (i.e., autonomous or manual) of the SDV 202 and/or EDV 204 (if equipped to operate in SDV mode), and/or may receive current roadway conditions of roadway 206 and/or the state of the driver of EDV 204. As depicted in FIG. 4, coordinating server 401 and/or SDV 202 and/or EDV 204 are able to communicate with one another wirelessly, using a wireless transceiver (e.g., transceiver 123 shown in FIG. 1) that is found in each of the coordinating server 401 and/or SDV 202 and/or EDV 204.


With reference now to FIG. 5, a high-level flow chart of one or more steps performed by one or more processors to control an operational mode of an SDV in accordance with one or more embodiments of the present invention is presented. Note that various actions described for the present invention may be performed by the SDV on-board computer 301 shown in FIG. 3, the coordinating server 401 shown in FIG. 4, and/or the cloud computing environment 50 shown in FIG. 7.


After initiator block 502, one or more physical detectors (e.g., roadway sensors 311, camera 321, and/or object motion detector 319 shown in FIG. 3; roadway sensor(s) 208 shown in FIG. 2; and/or sensors 153 shown in FIG. 1 when incorporated into coordinating computer 201 shown next to roadway 206 in FIG. 2) detect an erratically driven vehicle (EDV), such as EDV 204 shown in FIG. 2. The EDV is being operated in an unsafe manner within a predetermined distance (e.g., within 300 feet) of an SDV (e.g., SDV 202 shown in FIG. 2), as depicted in block 504. As described herein, the SDV is initially being operated in a nominal autonomous mode (defined above).


As described in block 506, in response to the physical detectors detecting the EDV within the predetermined distance of the SDV, a driving mode device (e.g., driving mode device 307 in FIG. 3) in the SDV changes an operational mode of the SDV from the nominal autonomous mode to an evasive autonomous mode.


The flow chart ends at terminator block 508.


In an embodiment of the present invention, a transceiver (e.g., communication transceiver 317 shown in FIG. 3) on the SDV transmits an alert to other SDVs describing the EDV. For example, assume that SDV 202 has detected EDV 204 driving in an unsafe/erratic manner. SDV 202 will then send a warning to SDV 210, alerting SDV 210 of the presence and behavior of EDV 204, thus allowing SDV 210 to engage the evasive autonomous mode in its own on-board system and/or to otherwise respond to the presence of EDV 204.


In an embodiment of the present invention, a transceiver (e.g., communication transceiver 317 shown in FIG. 3) on the SDV transmits an alert to an authority agency describing the EDV. For example, assume that SDV 210 is a police vehicle, and/or that coordinating computer 201 is monitored by a local law enforcement agency. The SDV 202 will transmit a message to the police vehicle and/or local law enforcement agency (e.g., via the coordinating computer 201) alerting them to the presence and behavior of EDV 204, thus giving them the opportunity to take intervening steps (e.g., pull the EDV 204 over) in order to protect the public safety.


In one embodiment of the present invention, the decision regarding which mode to place the SDV into (manual mode, nominal autonomous mode, evasive autonomous mode, or stopping autonomous mode) is further based on the current condition of the roadway. Thus, assume again that the SDV 202 shown in FIG. 2 is traveling on a roadway (e.g., roadway 206 shown in FIG. 2). One or more processors (e.g., within SDV on-board computer 301) receive sensor readings from multiple sensors (e.g., roadway sensor(s) 208 shown in FIG. 2). In one embodiment, each of the multiple sensors detects a different type of current condition of the roadway. Based on the sensor readings (and thus the current roadway condition of the roadway), the driving mode device (e.g., driving mode device 307 shown in FIG. 3) further adjusts the operational mode. For example, if the roadway conditions are clear and dry, then the driving mode device may cause the SDV to revert back to nominal autonomous mode from the evasive autonomous mode. However, if the roadway conditions are dark and icy, then the driving mode device may cause the SDV to shift operational control from the evasive autonomous mode to the stopping autonomous mode.


In an embodiment of the present invention, the processor(s) weight each of the sensor readings for different current conditions of the roadway (e.g., snow on the roadway is weighted higher than rain on the roadway, but less than ice on the roadway). The processor(s) sum the weighted sensor readings for the different current conditions of the roadway, and determine whether the summed weighted sensor readings exceed a predefined level (e.g., some particular numerical value). In response to determining that the summed weighted sensor readings exceed a predefined level, the on-board SDV control processor adjusts the operational mode of the SDV accordingly.
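
For illustration only, the following Python sketch shows one possible way to weight and sum roadway-condition readings as described above, with rain weighted lower than snow and snow lower than ice; the specific weights, reading scale, and threshold are assumptions.

```python
# Illustrative sketch of the roadway-condition weighting described above:
# rain is weighted lower than snow, and snow lower than ice.

ROADWAY_WEIGHTS = {"rain": 1.0, "snow": 2.0, "ice": 3.0}

def roadway_hazard_exceeded(readings, threshold=2.5):
    """readings: mapping of condition name to a normalized sensor reading in
    [0, 1]; returns True if the weighted sum exceeds the predefined level."""
    weighted_sum = sum(ROADWAY_WEIGHTS.get(name, 0.0) * value
                       for name, value in readings.items())
    return weighted_sum > threshold

# Example: moderate snow plus light ice (1.2 + 1.5 = 2.7) exceeds the
# threshold, prompting the driving mode device to adjust the operational mode.
adjust_mode = roadway_hazard_exceeded({"snow": 0.6, "ice": 0.5})
```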


In one embodiment of the present invention, the decision to place the SDV into a new type of autonomous mode (e.g., evasive autonomous mode or stopping autonomous mode) is based on the driver of the SDV being part of a cohort of drivers that share certain traits. That is, in one embodiment of the present invention assume that the SDV 202 shown in FIG. 2 is traveling on roadway 206. One or more processors (e.g., within coordinating server 401 shown in FIG. 4) retrieve driver profile information about the human driver of the SDV, and then assign the human driver of the SDV to a cohort of drivers traveling on the roadway in other vehicles (where the human driver of the SDV shares more than a predetermined quantity of traits with members of the cohort of drivers). The processor(s) retrieve traffic pattern data for the other vehicles while traveling on the roadway, and examine that traffic pattern data to determine a first traffic flow of the multiple vehicles (e.g., while operating in the evasive autonomous mode described above) and a second traffic flow of the multiple vehicles (while operating in the nominal autonomous mode described above). In response to determining that the first traffic flow has a higher accident rate than the second traffic flow, the processor(s) change the operational mode of the SDV from the evasive autonomous mode back to the nominal autonomous mode, since the data shows that the evasive autonomous mode is actually more dangerous than simply leaving the SDV in the nominal autonomous mode, including when driving near the EDV.


Similarly and in one or more embodiments of the present invention, the decision to place the SDV into a manual mode (e.g., switching from evasive autonomous mode into manual mode) is based on the driver of the SDV being part of a cohort of drivers that share certain traits. That is, again assume that the SDV 202 shown in FIG. 2 is traveling on roadway 206. One or more processors (e.g., within coordinating server 401 shown in FIG. 4) retrieve driver profile information about the human driver of the SDV, and then assign the human driver of the SDV to a cohort of drivers traveling on the roadway in other vehicles (where the human driver of the SDV shares more than a predetermined quantity of traits with members of the cohort of drivers). The processor(s) retrieve traffic pattern data for the other vehicles while traveling on the roadway, and examine that traffic pattern data to determine a first traffic flow of the multiple vehicles (e.g., while operating in the evasive autonomous mode described above) and a second traffic flow of the multiple vehicles (while operating in the manual mode described above). In response to determining that the first traffic flow has a higher accident rate than the second traffic flow, the processor(s) change the operational mode of the SDV from the evasive autonomous mode to the manual mode, since the data shows that the evasive autonomous mode is actually more dangerous than allowing the driver to manually respond to roadway conditions, including the presence of an EDV.
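
For illustration only, the following Python sketch shows one possible reading of the cohort-based comparison described in the two preceding paragraphs: the driver is assigned to a cohort of similar drivers, and the accident rates observed for that cohort's traffic flows in two candidate modes are compared so the SDV can be placed into the mode with the lower rate. The data structures, field names, threshold, and example values are assumptions.

```python
# Illustrative sketch of the cohort comparison described above.

def shares_cohort(driver_traits, cohort_traits, min_shared=3):
    """Assign the driver to the cohort if more than a predetermined quantity
    of traits are shared (min_shared is an assumed value)."""
    return len(set(driver_traits) & set(cohort_traits)) > min_shared

def pick_safer_mode(traffic_data, current_mode, candidate_mode):
    """traffic_data: mapping of mode name to the accident rate observed for
    the cohort's traffic flow on this roadway (e.g., accidents per
    vehicle-mile)."""
    if traffic_data[current_mode] > traffic_data[candidate_mode]:
        return candidate_mode   # e.g., switch evasive autonomous -> manual
    return current_mode

# Example: the cohort's accident rate is higher in evasive autonomous mode,
# so the operational mode is changed to manual mode (rates are assumed).
mode = pick_safer_mode({"evasive_autonomous": 2.1e-6, "manual": 1.4e-6},
                       "evasive_autonomous", "manual")
```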


In another embodiment of the present invention, the decision regarding which mode to use (e.g., one of the autonomous modes or the manual mode) is based on the driver of the EDV 204 being a member of a cohort (i.e., by sharing one or more traits with members of the cohort) that has a history of safe/unsafe driving. Based on this driving history of the cohort and the fact that the EDV driver shares common traits with its members, the system will place the SDV into the best-fit operational/driving mode.


In one embodiment of the present invention, the decision to place the SDV in manual or one of the autonomous modes described herein is further dependent on the current mechanical condition of the SDV (e.g., the condition of the tires, the condition of the brakes, the condition of the headlights, the condition of the windshield wipers, the condition of the engine, the condition of the transmission, the condition of the cooling system, etc.). Thus, one or more processors (e.g., within the SDV on-board computer 301 shown in FIG. 3) receive operational readings from one or more operational sensors (e.g., SDV equipment sensors 315 shown in FIG. 3) on the SDV, which detect a current state of mechanical equipment on the SDV. Based on the received operational readings, the processor(s) detect a mechanical fault (e.g., faulty brakes, bald tires, etc.) with the mechanical equipment on the SDV. In response to detecting the mechanical fault with the mechanical equipment on the SDV, the operational mode device further adjusts the operational mode of the SDV (e.g., the operational mode device may change the operational mode of the SDV from the evasive autonomous mode to a stopping autonomous mode).


As described herein, the driving mode device 307 along with the SDV on-board computer 301 shown in FIG. 3 provides a process for selectively switching between various types of autonomous modes and/or a manual mode. However, if such switching back and forth occurs too frequently, safety issues may arise. For example, if the driving mode device in FIG. 3 switches control of the SDV 202 from the manual mode to the autonomous mode, and then switches control of the SDV 202 back to the manual mode a few seconds later, the driver and/or SDV will likely become confused and/or ineffective.


Therefore, in one embodiment of the present invention, a predefined time limit and/or physical distance is set between switching back and forth between operational modes. For example, based on historical data that describes how long the current driver (and/or drivers from a cohort of drivers that have similar traits/characteristics as the current driver) needs to recover from relinquishing control of the SDV to the autonomous controller, the predefined time limit may be one minute. Similarly, based on historical data that describes how far the current driver must travel in order to recover from relinquishing control of the SDV to the autonomous controller, the predefined physical distance may be one mile. Therefore, if the system has switched from the manual mode to the autonomous mode, then one minute must pass and/or one mile must be traversed by the SDV before control can be returned back to the driver (e.g., manual mode is re-activated).
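
For illustration only, the following Python sketch shows one possible way to enforce the switching limits just described, permitting a new mode change only after a minimum time has elapsed and a minimum distance has been traveled since the previous change. The one-minute and one-mile values follow the example above; whether both limits or either limit must be satisfied is a design choice, and this sketch requires both.

```python
# Illustrative sketch of a governor that limits how frequently the SDV may
# switch between operational modes.

class ModeSwitchGovernor:
    def __init__(self, min_seconds=60.0, min_miles=1.0):
        self.min_seconds = min_seconds          # e.g., one minute
        self.min_miles = min_miles              # e.g., one mile
        self._last_switch_time = None
        self._last_switch_odometer = None

    def record_switch(self, now_seconds, odometer_miles):
        """Call whenever the driving mode device changes the operational mode."""
        self._last_switch_time = now_seconds
        self._last_switch_odometer = odometer_miles

    def switch_allowed(self, now_seconds, odometer_miles):
        """True if enough time has passed and enough distance has been traveled
        since the previous mode change (or if no change has occurred yet)."""
        if self._last_switch_time is None:
            return True
        return (now_seconds - self._last_switch_time >= self.min_seconds
                and odometer_miles - self._last_switch_odometer >= self.min_miles)
```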


In one or more embodiments, the present invention is implemented in a cloud environment. It is understood in advance that although this disclosure includes a detailed description on cloud computing, implementation of the teachings recited herein are not limited to a cloud computing environment. Rather, embodiments of the present invention are capable of being implemented in conjunction with any other type of computing environment now known or later developed.


Cloud computing is a model of service delivery for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, network bandwidth, servers, processing, memory, storage, applications, virtual machines, and services) that can be rapidly provisioned and released with minimal management effort or interaction with a provider of the service. This cloud model may include at least five characteristics, at least three service models, and at least four deployment models.


Characteristics are as follows:


On-demand self-service: a cloud consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with the service's provider.


Broad network access: capabilities are available over a network and accessed through standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, laptops, and PDAs).


Resource pooling: the provider's computing resources are pooled to serve multiple consumers using a multi-tenant model, with different physical and virtual resources dynamically assigned and reassigned according to demand. There is a sense of location independence in that the consumer generally has no control or knowledge over the exact location of the provided resources but may be able to specify location at a higher level of abstraction (e.g., country, state, or datacenter).


Rapid elasticity: capabilities can be rapidly and elastically provisioned, in some cases automatically, to quickly scale out and rapidly released to quickly scale in. To the consumer, the capabilities available for provisioning often appear to be unlimited and can be purchased in any quantity at any time.


Measured service: cloud systems automatically control and optimize resource use by leveraging a metering capability at some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can be monitored, controlled, and reported providing transparency for both the provider and consumer of the utilized service.


Service Models are as follows:


Software as a Service (SaaS): the capability provided to the consumer is to use the provider's applications running on a cloud infrastructure. The applications are accessible from various client devices through a thin client interface such as a web browser (e.g., web-based e-mail). The consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.


Platform as a Service (PaaS): the capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages and tools supported by the provider. The consumer does not manage or control the underlying cloud infrastructure including networks, servers, operating systems, or storage, but has control over the deployed applications and possibly application hosting environment configurations.


Infrastructure as a Service (IaaS): the capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls).


Deployment Models are as follows:


Private cloud: the cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on-premises or off-premises.


Community cloud: the cloud infrastructure is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party and may exist on-premises or off-premises.


Public cloud: the cloud infrastructure is made available to the general public or a large industry group and is owned by an organization selling cloud services.


Hybrid cloud: the cloud infrastructure is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together by standardized or proprietary technology that enables data and application portability (e.g., cloud bursting for load-balancing between clouds).


A cloud computing environment is service oriented with a focus on statelessness, low coupling, modularity, and semantic interoperability. At the heart of cloud computing is an infrastructure comprising a network of interconnected nodes.


Referring now to FIG. 6, a schematic of an example of a cloud computing node is shown. Cloud computing node 10 is only one example of a suitable cloud computing node and is not intended to suggest any limitation as to the scope of use or functionality of embodiments of the invention described herein. Regardless, cloud computing node 10 is capable of being implemented and/or performing any of the functionality set forth hereinabove.


In cloud computing node 10 there is a computer system/server 12, which is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer system/server 12 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.


Computer system/server 12 may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer system/server 12 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.


As shown in FIG. 6, computer system/server 12 in cloud computing node 10 is shown in the form of a general-purpose computing device. The components of computer system/server 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including system memory 28 to processor 16.


Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnects (PCI) bus.


Computer system/server 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 12, and it includes both volatile and non-volatile media, removable and non-removable media.


System memory 28 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. Computer system/server 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 34 can be provided for reading from and writing to a non-removable, non-volatile magnetic medium (not shown and typically called a “hard drive”). Although not shown, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM, or other optical media can be provided. In such instances, each can be connected to bus 18 by one or more data media interfaces. As will be further depicted and described below, memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.


Program/utility 40, having a set (at least one) of program modules 42, may be stored in memory 28, by way of example and not limitation, along with an operating system, one or more application programs, other program modules, and program data. Each of the operating system, the one or more application programs, the other program modules, and the program data, or some combination thereof, may include an implementation of a networking environment. Program modules 42 generally carry out the functions and/or methodologies of embodiments of the invention as described herein.


Computer system/server 12 may also communicate with one or more external devices 14 such as a keyboard, a pointing device, a display 24, etc.; one or more devices that enable a user to interact with computer system/server 12; and/or any devices (e.g., network card, modem, etc.) that enable computer system/server 12 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 22. Still yet, computer system/server 12 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 20. As depicted, network adapter 20 communicates with the other components of computer system/server 12 via bus 18. It should be understood that although not shown, other hardware and/or software components could be used in conjunction with computer system/server 12. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.


Referring now to FIG. 7, illustrative cloud computing environment 50 is depicted. As shown, cloud computing environment 50 comprises one or more cloud computing nodes 10 with which local computing devices used by cloud consumers, such as, for example, personal digital assistant (PDA) or cellular telephone 54A, desktop computer 54B, laptop computer 54C, and/or automobile computer system 54N may communicate. Nodes 10 may communicate with one another. They may be grouped (not shown) physically or virtually, in one or more networks, such as Private, Community, Public, or Hybrid clouds as described hereinabove, or a combination thereof. This allows cloud computing environment 50 to offer infrastructure, platforms and/or software as services for which a cloud consumer does not need to maintain resources on a local computing device. It is understood that the types of computing devices 54A-N shown in FIG. 7 are intended to be illustrative only and that computing nodes 10 and cloud computing environment 50 can communicate with any type of computerized device over any type of network and/or network addressable connection (e.g., using a web browser).
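For illustration only, the following minimal Python sketch shows how an automobile computer system such as 54N might request traffic pattern data recorded by other SDVs from a service hosted on cloud computing nodes 10; the endpoint path, query parameter, placeholder URL, and JSON response shape are assumptions of this sketch and are not part of the disclosure.

import json
import urllib.request
from urllib.parse import urlencode

def fetch_traffic_pattern_data(base_url: str, roadway_id: str) -> dict:
    # Query a hypothetical cloud-hosted service for traffic pattern data
    # recorded by other SDVs on the named roadway.
    query = urlencode({"roadway": roadway_id})
    url = f"{base_url}/traffic-patterns?{query}"
    with urllib.request.urlopen(url, timeout=5) as response:
        return json.load(response)

if __name__ == "__main__":
    # Placeholder URL; a node 10 in cloud computing environment 50 would host
    # the actual service in a deployed system.
    print(fetch_traffic_pattern_data("https://cloud.example.com", "route-9-segment-4"))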


Referring now to FIG. 8, a set of functional abstraction layers provided by cloud computing environment 50 (FIG. 7) is shown. It should be understood in advance that the components, layers, and functions shown in FIG. 8 are intended to be illustrative only and embodiments of the invention are not limited thereto. As depicted, the following layers and corresponding functions are provided:


Hardware and software layer 60 includes hardware and software components. Examples of hardware components include: mainframes 61; RISC (Reduced Instruction Set Computer) architecture based servers 62; servers 63; blade servers 64; storage devices 65; and networks and networking components 66. In some embodiments, software components include network application server software 67 and database software 68.


Virtualization layer 70 provides an abstraction layer from which the following examples of virtual entities may be provided: virtual servers 71; virtual storage 72; virtual networks 73, including virtual private networks; virtual applications and operating systems 74; and virtual clients 75.


In one example, management layer 80 may provide the functions described below. Resource provisioning 81 provides dynamic procurement of computing resources and other resources that are utilized to perform tasks within the cloud computing environment. Metering and Pricing 82 provide cost tracking as resources are utilized within the cloud computing environment, and billing or invoicing for consumption of these resources. In one example, these resources may comprise application software licenses. Security provides identity verification for cloud consumers and tasks, as well as protection for data and other resources. User portal 83 provides access to the cloud computing environment for consumers and system administrators. Service level management 84 provides cloud computing resource allocation and management such that required service levels are met. Service Level Agreement (SLA) planning and fulfillment 85 provide pre-arrangement for, and procurement of, cloud computing resources for which a future requirement is anticipated in accordance with an SLA.


Workloads layer 90 provides examples of functionality for which the cloud computing environment may be utilized. Examples of workloads and functions which may be provided from this layer include: mapping and navigation 91; software development and lifecycle management 92; virtual classroom education delivery 93; data analytics processing 94; transaction processing 95; and self-driving vehicle control processing 96 (for selectively setting operational/driving control of an SDV to manual mode or one of the autonomous modes described herein).
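As one illustration of the kind of decision self-driving vehicle control processing 96 can make, the minimal Python sketch below compares the accident rates of a traffic flow of other SDVs operating in the evasive autonomous mode against a traffic flow of other SDVs operating in manual mode, and selects the operational mode accordingly; the class, function, and field names are hypothetical, and the figures in the example are invented for illustration.

from dataclasses import dataclass

@dataclass
class TrafficFlow:
    # Aggregated traffic pattern data for other SDVs in one operational mode.
    vehicle_miles: float
    accidents: int

    def accident_rate(self) -> float:
        # Accidents per vehicle-mile; an empty flow yields a rate of 0.0.
        return self.accidents / self.vehicle_miles if self.vehicle_miles else 0.0

def select_operational_mode(evasive_flow: TrafficFlow, manual_flow: TrafficFlow) -> str:
    # If other SDVs operating in the evasive autonomous mode show a higher
    # accident rate than those operating in manual mode, hand control to the
    # human driver; otherwise remain in the evasive autonomous mode.
    if evasive_flow.accident_rate() > manual_flow.accident_rate():
        return "manual"
    return "evasive_autonomous"

# Example with invented figures: the evasive-autonomous flow has the higher
# accident rate, so the operational mode device would switch the SDV to manual mode.
evasive = TrafficFlow(vehicle_miles=10000.0, accidents=4)
manual = TrafficFlow(vehicle_miles=10000.0, accidents=1)
assert select_operational_mode(evasive, manual) == "manual"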


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.


The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of various embodiments of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the present invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the present invention. The embodiment was chosen and described in order to best explain the principles of the present invention and the practical application, and to enable others of ordinary skill in the art to understand the present invention for various embodiments with various modifications as are suited to the particular use contemplated.


Any methods described in the present disclosure may be implemented through the use of a VHDL (VHSIC Hardware Description Language) program and a VHDL chip. VHDL is an exemplary design-entry language for Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), and other similar electronic devices. Thus, any software-implemented method described herein may be emulated by a hardware-based VHDL program, which is then applied to a VHDL chip, such as an FPGA.


Having thus described embodiments of the present invention in detail and by reference to illustrative embodiments thereof, it will be apparent that modifications and variations are possible without departing from the scope of the present invention defined in the appended claims.

Claims
  • 1. A method for controlling an operational mode of a self-driving vehicle (SDV), the method comprising: detecting, by one or more physical detectors, an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of an SDV, wherein the SDV is initially being operated in an evasive autonomous mode; retrieving, by one or more processors, traffic pattern data for other SDVs; examining, by one or more processors, the traffic pattern data to determine a first traffic flow of the other SDVs while operating in the evasive autonomous mode; examining, by one or more processors, the traffic pattern data to determine a second traffic flow of the other SDVs while operating in a manual mode; and in response to determining that the first traffic flow has a higher accident rate than the second traffic flow, changing, by an operational mode device, the operational mode of the SDV from the evasive autonomous mode to the manual mode.
  • 2. The method of claim 1, further comprising: transmitting, by a transceiver on the SDV, an alert to other SDVs describing the EDV.
  • 3. The method of claim 1, further comprising: transmitting, by a transceiver on the SDV, an alert to an authority agency describing the EDV.
  • 4. The method of claim 1, further comprising: detecting, based on sensor readings from a roadway sensor, a current roadway condition of a roadway upon which the SDV and the EDV are traveling; and further adjusting, by the operational mode device, the operational mode of the SDV based on the current roadway condition of the roadway upon which the SDV and the EDV are traveling.
  • 5. The method of claim 1, wherein the SDV and the EDV are traveling on a roadway, and wherein the processor-implemented method further comprises: switching the operational mode of the SDV from the manual mode to the evasive autonomous mode; retrieving, by one or more processors, driver profile information about a human driver of the SDV; assigning, by one or more processors, the human driver of the SDV to a cohort of drivers who have traveled on the roadway in other SDVs, wherein the human driver of the SDV shares more than a predetermined quantity of traits with members of the cohort of drivers who have traveled in the other SDVs; and in response to assigning the human driver of the SDV to the cohort of drivers who have traveled on the roadway in other SDVs, switching the operational mode of the SDV from the evasive autonomous mode to the manual mode.
  • 6. The method of claim 1, wherein the SDV and the EDV are traveling on a roadway, and wherein the method further comprises: switching the operational mode of the SDV from the manual mode to the evasive autonomous mode; receiving, by one or more processors, sensor readings from multiple sensors, wherein each of the multiple sensors detects a different type of current condition of the roadway; weighting, by one or more processors, each of the sensor readings for different current conditions of the roadway; summing, by one or more processors, weighted sensor readings for the different current conditions of the roadway; determining, by one or more processors, whether the summed weighted sensor readings exceed a predefined level; and in response to determining that the summed weighted sensor readings exceed the predefined level, changing, by the operational mode device, the operational mode of the SDV from the evasive autonomous mode to a stopping autonomous mode.
  • 7. The method of claim 1, further comprising: switching, by one or more processors, the operational mode of the SDV from the manual mode to the evasive autonomous mode; receiving, by one or more processors, operational readings from one or more SDV operational sensors on the SDV, wherein the SDV operational sensors detect a current state of mechanical equipment on the SDV; detecting, by the one or more processors and based on received operational readings, a mechanical fault with the mechanical equipment on the SDV; and in response to detecting the mechanical fault with the mechanical equipment on the SDV, changing, by the operational mode device, the operational mode of the SDV from the evasive autonomous mode to a stopping autonomous mode.
  • 8. A computer program product for controlling an operational mode of a self-driving vehicle (SDV), the computer program product comprising a non-transitory computer readable storage medium having program code embodied therewith, the program code readable and executable by a processor to perform a method comprising: detecting, by one or more physical detectors, an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of an SDV, wherein the SDV and the EDV are traveling on a roadway, and wherein the SDV is initially being operated in an evasive autonomous mode; retrieving driver profile information about a human driver of the SDV; assigning the human driver of the SDV to a cohort of drivers who have traveled on the roadway in other SDVs, wherein the human driver of the SDV shares more than a predetermined quantity of traits with members of the cohort of drivers who have traveled in the other SDVs; and in response to assigning the human driver of the SDV to the cohort of drivers who have traveled on the roadway in other SDVs, switching the operational mode of the SDV from the evasive autonomous mode to a manual mode.
  • 9. The computer program product of claim 8, wherein the method further comprises: transmitting, by a transceiver on the SDV, an alert to other SDVs describing the EDV.
  • 10. The computer program product of claim 8, wherein the method further comprises: transmitting, by a transceiver on the SDV, an alert to an authority agency describing the EDV.
  • 11. The computer program product of claim 8, wherein the method further comprises: detecting, based on sensor readings from a roadway sensor, a current roadway condition of a roadway upon which the SDV and the EDV are traveling; and further adjusting, by the operational mode device, the operational mode of the SDV based on the current roadway condition of the roadway upon which the SDV and the EDV are traveling.
  • 12. The computer program product of claim 8, wherein the method further comprises: switching the operational mode of the SDV from the manual mode to the evasive autonomous mode; retrieving traffic pattern data for the other SDVs as they traveled on the roadway; examining the traffic pattern data to determine a first traffic flow of the other SDVs while operating in the evasive autonomous mode on the roadway; examining the traffic pattern data to determine a second traffic flow of the other SDVs while operating in a manual mode on the roadway; and in response to determining that the first traffic flow has a higher accident rate than the second traffic flow, changing, by the operational mode device, the operational mode of the SDV from the evasive autonomous mode to the manual mode.
  • 13. The computer program product of claim 8, wherein the method further comprises: switching the operational mode of the SDV from the manual mode to the evasive autonomous mode; receiving sensor readings from multiple sensors, wherein each of the multiple sensors detects a different type of current condition of the roadway; weighting each of the sensor readings for different current conditions of the roadway; summing weighted sensor readings for the different current conditions of the roadway; determining whether the summed weighted sensor readings exceed a predefined level; and in response to determining that the summed weighted sensor readings exceed the predefined level, changing, by the operational mode device, the operational mode of the SDV from the evasive autonomous mode to a stopping autonomous mode.
  • 14. The computer program product of claim 8, wherein the method further comprises: switching the operational mode of the SDV from the manual mode to the evasive autonomous mode; receiving operational readings from one or more SDV operational sensors on the SDV, wherein the SDV operational sensors detect a current state of mechanical equipment on the SDV; detecting, based on received operational readings, a mechanical fault with the mechanical equipment on the SDV; and in response to detecting the mechanical fault with the mechanical equipment on the SDV, changing, by the operational mode device, the operational mode of the SDV from the evasive autonomous mode to a stopping autonomous mode.
  • 15. A self-driving vehicle comprising: a processor, a computer readable memory, and a non-transitory computer readable storage medium; first program instructions to detect, by one or more physical detectors on an SDV, an erratically driven vehicle (EDV) that is being operated in an unsafe manner within a predetermined distance of an SDV, wherein the SDV is initially being operated in an evasive autonomous mode; second program instructions to retrieve traffic pattern data for other SDVs; third program instructions to examine the traffic pattern data to determine a first traffic flow of the other SDVs while operating in the evasive autonomous mode; fourth program instructions to examine the traffic pattern data to determine a second traffic flow of the other SDVs while operating in a nominal autonomous mode; and fifth program instructions to, in response to determining that the first traffic flow has a higher accident rate than the second traffic flow, instruct the operational mode device to change the operational mode of the SDV from the evasive autonomous mode back to the nominal autonomous mode; and wherein
  • 16. The self-driving vehicle of claim 15, further comprising: sixth program instructions to instruct a transceiver on the SDV to transmit an alert to other SDVs describing the EDV; and wherein
  • 17. The self-driving vehicle of claim 15, further comprising: sixth program instructions to instruct a transceiver on the SDV to transmit an alert to an authority agency describing the EDV; and wherein
  • 18. The self-driving vehicle of claim 15, further comprising: sixth program instructions to receive sensor readings from a roadway sensor describing a current roadway condition of a roadway upon which the SDV and the EDV are traveling; and seventh program instructions to further adjust, via the operational mode device, the operational mode of the SDV based on the current roadway condition of the roadway upon which the SDV and the EDV are traveling; and wherein
  • 19. The self-driving vehicle of claim 15, wherein the SDV and the EDV are traveling on a roadway, wherein the SDV is a first SDV, and wherein the self-driving vehicle further comprises: sixth program instructions to switch the operational mode of the SDV from the nominal autonomous mode to the evasive autonomous mode; seventh program instructions to retrieve driver profile information about a human driver of the first SDV; eighth program instructions to assign the human driver of the first SDV to a cohort of drivers traveling on the roadway in other SDVs, wherein the human driver of the first SDV shares more than a predetermined quantity of traits with members of the cohort of drivers of the other SDVs; and ninth program instructions to, in response to assigning the human driver of the SDV to the cohort of drivers who have traveled on the roadway in other SDVs, switch the operational mode of the SDV from the evasive autonomous mode to a manual mode; and wherein
  • 20. The self-driving vehicle of claim 15, further comprising: sixth program instructions to switch the operational mode of the SDV from the nominal autonomous mode to the evasive autonomous mode; seventh program instructions to receive operational readings from one or more SDV operational sensors on the SDV, wherein the SDV operational sensors detect a current state of mechanical equipment on the SDV; eighth program instructions to detect, based on received operational readings, a mechanical fault with the mechanical equipment on the SDV; and ninth program instructions to instruct the operational mode device to change the operational mode of the SDV from the evasive autonomous mode to a stopping autonomous mode in response to detecting the mechanical fault with the mechanical equipment on the SDV; and wherein
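The weighted aggregation of roadway-condition sensor readings recited in claims 6 and 13 can be sketched in Python as follows; the sensor names, weights, and predefined level are illustrative assumptions of this sketch rather than values taken from the disclosure.

def should_enter_stopping_mode(readings, weights, predefined_level):
    # Weight each roadway-condition reading, sum the weighted readings, and
    # report whether the sum exceeds the predefined level that triggers a
    # change to the stopping autonomous mode.
    weighted_sum = sum(weights[name] * value for name, value in readings.items())
    return weighted_sum > predefined_level

# Example with invented sensor names, weights, and level: the weighted sum is
# 0.8*1.0 + 0.9*2.0 + 0.7*1.5 = 3.65, which exceeds 3.0, so the operational
# mode device would change the SDV from the evasive autonomous mode to the
# stopping autonomous mode.
readings = {"precipitation": 0.8, "ice": 0.9, "visibility_loss": 0.7}
weights = {"precipitation": 1.0, "ice": 2.0, "visibility_loss": 1.5}
assert should_enter_stopping_mode(readings, weights, predefined_level=3.0)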
US Referenced Citations (264)
Number Name Date Kind
4665395 Van Ness May 1987 A
4908988 Yamamura et al. Mar 1990 A
5541590 Nishio Jul 1996 A
5975791 McCulloch Nov 1999 A
6064970 McMillian et al. May 2000 A
6201318 Guillory Mar 2001 B1
6326903 Gross et al. Dec 2001 B1
6393362 Burns May 2002 B1
6502035 Levine Dec 2002 B2
6587043 Kramer Jul 2003 B1
6622082 Schmidt et al. Sep 2003 B1
6731202 Klaus May 2004 B1
6810312 Jammu et al. Oct 2004 B2
7124088 Bauer et al. Oct 2006 B2
7580782 Breed et al. Aug 2009 B2
7769544 Blesener et al. Aug 2010 B2
7877269 Bauer et al. Jan 2011 B2
7894951 Norris et al. Feb 2011 B2
7979173 Breed Jul 2011 B2
8031062 Smith Oct 2011 B2
8045455 Agronow et al. Oct 2011 B1
8078349 Prada Gomez et al. Dec 2011 B1
8090598 Bauer et al. Jan 2012 B2
8139109 Schmiedel et al. Mar 2012 B2
8140358 Ling et al. Mar 2012 B1
8146703 Baumann et al. Apr 2012 B2
8152325 McDermott Apr 2012 B2
8180322 Lin et al. May 2012 B2
8346480 Trepagnier et al. Jan 2013 B2
8352112 Mudalige Jan 2013 B2
8442854 Lawton et al. May 2013 B2
8466807 Mudalige Jun 2013 B2
8489434 Otis et al. Jul 2013 B1
8583365 Jang et al. Nov 2013 B2
8660734 Zhu et al. Feb 2014 B2
8676466 Mudalige Mar 2014 B2
8678701 Aldasem Mar 2014 B1
8781964 Martin et al. Jul 2014 B2
8786461 Daudelin Jul 2014 B1
8810392 Teller et al. Aug 2014 B1
8816857 Nordin et al. Aug 2014 B2
8874305 Dolgov et al. Oct 2014 B2
8880270 Ferguson et al. Nov 2014 B1
8892451 Everett Nov 2014 B2
8903591 Ferguson et al. Dec 2014 B1
8923890 White et al. Dec 2014 B1
8928479 Gonsalves et al. Jan 2015 B2
8935034 Zhu Jan 2015 B1
8948955 Zhu et al. Feb 2015 B2
8949016 Ferguson et al. Feb 2015 B1
8954217 Montemerlo et al. Feb 2015 B1
8954252 Urmson et al. Feb 2015 B1
8954261 Das et al. Feb 2015 B2
8958943 Bertosa et al. Feb 2015 B2
8965621 Urmson et al. Feb 2015 B1
8970362 Morley et al. Mar 2015 B2
8983705 Zhu et al. Mar 2015 B2
8996224 Herbach et al. Mar 2015 B1
9014905 Kretzschmar et al. Apr 2015 B1
9024787 Alshinnawi et al. May 2015 B2
9123049 Hyde et al. Sep 2015 B2
9170327 Choe et al. Oct 2015 B2
9189897 Stenneth Nov 2015 B1
9194168 Lu et al. Nov 2015 B1
9216745 Beardsley et al. Dec 2015 B2
9218698 Ricci Dec 2015 B2
9278689 Delp Mar 2016 B1
9286520 Lo et al. Mar 2016 B1
9317033 Ibanez-guzman et al. Apr 2016 B2
9381915 Crombez et al. Jul 2016 B1
9390451 Slusar Jul 2016 B1
9399472 Minoiu-Enache Jul 2016 B2
9463805 Kirsch et al. Oct 2016 B2
9483948 Gordon et al. Nov 2016 B1
9552735 Pilutti et al. Jan 2017 B2
9566958 Waldmann Feb 2017 B2
9566986 Gordon et al. Feb 2017 B1
9587952 Slusar Mar 2017 B1
9628975 Watkins et al. Apr 2017 B1
9646496 Miller May 2017 B1
9718468 Barfield et al. Aug 2017 B2
9754235 Konrardy Sep 2017 B1
9791291 Yamashita et al. Oct 2017 B1
9834224 Gordon et al. Dec 2017 B2
9944291 Gordon Apr 2018 B2
10042359 Konrardy Aug 2018 B1
10093322 Gordon Oct 2018 B2
20020022927 Lemelson et al. Feb 2002 A1
20020026841 Svendsen Mar 2002 A1
20020128774 Takezaki et al. Sep 2002 A1
20030050740 Fecher et al. Mar 2003 A1
20030065572 McNee et al. Apr 2003 A1
20030076981 Smith et al. Apr 2003 A1
20040078133 Miller Apr 2004 A1
20040117086 Rao et al. Jun 2004 A1
20040199306 Helmann et al. Oct 2004 A1
20050021227 Matsumoto et al. Jan 2005 A1
20050104745 Bachelder et al. May 2005 A1
20060106671 Biet May 2006 A1
20060163939 Kuramochi et al. Jul 2006 A1
20060200379 Biet Sep 2006 A1
20060241855 Joe et al. Oct 2006 A1
20070100687 Yoshikawa May 2007 A1
20070124027 Betzitza et al. May 2007 A1
20080048850 Yamada Feb 2008 A1
20080065293 Placke et al. Mar 2008 A1
20080114663 Watkins et al. May 2008 A1
20080129475 Breed et al. Jun 2008 A1
20080201217 Bader et al. Aug 2008 A1
20080288406 Seguin et al. Nov 2008 A1
20090094109 Aaronson et al. Apr 2009 A1
20090138168 Labuhn et al. May 2009 A1
20090248231 Kamiya Oct 2009 A1
20090313096 Kama Dec 2009 A1
20100057511 Mansouri et al. Mar 2010 A1
20100156672 Yoo et al. Jun 2010 A1
20100179720 Lin et al. Jul 2010 A1
20100228427 Anderson et al. Sep 2010 A1
20100256852 Mudalige Oct 2010 A1
20110029173 Hyde et al. Feb 2011 A1
20110035250 Finucan Feb 2011 A1
20110077807 Hyde et al. Mar 2011 A1
20110077808 Hyde et al. Mar 2011 A1
20110137699 Ben-Ari et al. Jun 2011 A1
20110264521 Straka Oct 2011 A1
20120072243 Collins et al. Mar 2012 A1
20120083960 Zhu Apr 2012 A1
20120123646 Mantini May 2012 A1
20120139756 Djurkovic Jun 2012 A1
20120277947 Boehringer et al. Nov 2012 A1
20120293341 Lin Nov 2012 A1
20130030657 Chatterjee et al. Jan 2013 A1
20130113634 Hutchinson et al. May 2013 A1
20130131949 Shida May 2013 A1
20130141578 Chundrlik et al. Jun 2013 A1
20130144502 Shida Jun 2013 A1
20130222127 Ray Avalani Aug 2013 A1
20130231824 Wilson et al. Sep 2013 A1
20130261871 Hobbs et al. Oct 2013 A1
20130304514 Hyde Nov 2013 A1
20140019259 Dung et al. Jan 2014 A1
20140032049 Moshchuk et al. Jan 2014 A1
20140088850 Schuberth Mar 2014 A1
20140092332 Price Apr 2014 A1
20140095214 Mathe et al. Apr 2014 A1
20140129073 Ferguson May 2014 A1
20140136045 Zhu et al. May 2014 A1
20140136414 Abhyanker May 2014 A1
20140142799 Ferguson May 2014 A1
20140164126 Nicholas et al. Jun 2014 A1
20140188999 Leonard et al. Jul 2014 A1
20140195213 Kozloski et al. Jul 2014 A1
20140201037 Mallawarachchi et al. Jul 2014 A1
20140201126 Zadeh Jul 2014 A1
20140214255 Dolgov et al. Jul 2014 A1
20140214260 Eckert et al. Jul 2014 A1
20140222277 Tsimhoni et al. Aug 2014 A1
20140222577 Abhyanker Aug 2014 A1
20140282967 Maguire Sep 2014 A1
20140297116 Anderson et al. Oct 2014 A1
20140306833 Ricci Oct 2014 A1
20140309789 Ricci Oct 2014 A1
20140309806 Ricci Oct 2014 A1
20140309864 Ricci Oct 2014 A1
20140309891 Ricci Oct 2014 A1
20140310186 Ricci Oct 2014 A1
20140316671 Okamoto Oct 2014 A1
20140324268 Montemerlo et al. Oct 2014 A1
20140330479 Dolgov Nov 2014 A1
20140358331 Prada Gomez et al. Dec 2014 A1
20140358353 Ibanez-Guzman et al. Dec 2014 A1
20150006005 Yu et al. Jan 2015 A1
20150006014 Wimmer et al. Jan 2015 A1
20150026092 Abboud et al. Jan 2015 A1
20150035685 Strickland et al. Feb 2015 A1
20150051778 Mueller Feb 2015 A1
20150057891 Mudalige et al. Feb 2015 A1
20150062340 Datta et al. Mar 2015 A1
20150062469 Fleury Mar 2015 A1
20150066282 Yopp Mar 2015 A1
20150066284 Yopp Mar 2015 A1
20150070178 Kline Mar 2015 A1
20150088358 Yopp Mar 2015 A1
20150095190 Hammad et al. Apr 2015 A1
20150097866 Mochizuki Apr 2015 A1
20150120331 Russo et al. Apr 2015 A1
20150134178 Minoiu-Enache May 2015 A1
20150137985 Zafiroglu et al. May 2015 A1
20150141043 Abramson May 2015 A1
20150149018 Attard et al. May 2015 A1
20150149021 Duncan et al. May 2015 A1
20150160019 Biswal et al. Jun 2015 A1
20150166059 Ko Jun 2015 A1
20150170287 Tirone Jun 2015 A1
20150187019 Fernandes Jul 2015 A1
20150196256 Venkatraman et al. Jul 2015 A1
20150210280 Agnew et al. Jul 2015 A1
20150232065 Ricci et al. Aug 2015 A1
20150235480 Cudak Aug 2015 A1
20150235557 Engelman Aug 2015 A1
20150242953 Suiter Aug 2015 A1
20150269536 Parris Sep 2015 A1
20150293994 Kelly Oct 2015 A1
20150338226 Mason et al. Nov 2015 A1
20150339639 Choe Nov 2015 A1
20150344038 Stenneth et al. Dec 2015 A1
20160001781 Fung et al. Jan 2016 A1
20160026182 Boroditsky et al. Jan 2016 A1
20160063761 Sisbot et al. Mar 2016 A1
20160075512 Lert, Jr. Mar 2016 A1
20160078695 McClintic et al. Mar 2016 A1
20160078758 Basalamah Mar 2016 A1
20160090100 Oyama et al. Mar 2016 A1
20160139594 Okumura et al. May 2016 A1
20160140507 Stevens et al. May 2016 A1
20160161950 Frangou Jun 2016 A1
20160176409 Kirsch et al. Jun 2016 A1
20160200317 Danzl et al. Jul 2016 A1
20160202700 Sprigg Jul 2016 A1
20160205146 Sugioka et al. Jul 2016 A1
20160221768 Kadaba Aug 2016 A1
20160264131 Chan et al. Sep 2016 A1
20160303969 Akula Oct 2016 A1
20160304122 Herzog et al. Oct 2016 A1
20160334797 Ross et al. Nov 2016 A1
20160344737 Anton Nov 2016 A1
20160355192 James et al. Dec 2016 A1
20160358477 Ansari Dec 2016 A1
20160363935 Shuster et al. Dec 2016 A1
20160364823 Cao Dec 2016 A1
20160368534 Harda Dec 2016 A1
20160371977 Wingate Dec 2016 A1
20170001650 Park Jan 2017 A1
20170010613 Fukumoto Jan 2017 A1
20170021830 Feldman et al. Jan 2017 A1
20170021837 Ebina Jan 2017 A1
20170032585 Stenneth Feb 2017 A1
20170057542 Kim et al. Mar 2017 A1
20170061798 Linder Mar 2017 A1
20170088143 Goldman-Shenhar et al. Mar 2017 A1
20170106876 Gordon et al. Apr 2017 A1
20170123428 Levinson et al. May 2017 A1
20170129335 Lu May 2017 A1
20170129487 Wulf May 2017 A1
20170132917 Ricci May 2017 A1
20170137023 Anderson et al. May 2017 A1
20170151958 Sakuma Jun 2017 A1
20170168689 Goldman-Shenhar et al. Jun 2017 A1
20170200449 Penilla et al. Jul 2017 A1
20170219364 Lathrop Aug 2017 A1
20170240098 Sweeney et al. Aug 2017 A1
20170248949 Moran et al. Aug 2017 A1
20170300855 Lund Oct 2017 A1
20180032071 Wieneke Feb 2018 A1
20180072323 Gordon Mar 2018 A1
20180075309 Sathyanarayana et al. Mar 2018 A1
20180086373 Tamura Mar 2018 A1
20180093631 Lee et al. Apr 2018 A1
20180108369 Gross Apr 2018 A1
20180141453 High May 2018 A1
20180154906 Dudar Jun 2018 A1
20180203455 Cronin Jul 2018 A1
20180265054 Hofmann Sep 2018 A1
20180371805 Ichinose Dec 2018 A1
Foreign Referenced Citations (24)
Number Date Country
2447554 Nov 2000 CA
1135063 Nov 1996 CN
2349068 Nov 1999 CN
1376599 Oct 2002 CN
201004265 Jan 2008 CN
201635568 Nov 2010 CN
202012052 Oct 2011 CN
202038228 Nov 2011 CN
102650882 Aug 2012 CN
202772924 Mar 2013 CN
104900018 Sep 2015 CN
0582236 Feb 1994 EP
3130516 Feb 2017 EP
2498793 Jul 2013 GB
2006003661 Jan 2006 WO
2010101749 Sep 2010 WO
2014058263 Apr 2014 WO
2014066721 May 2014 WO
2014147361 Sep 2014 WO
2014148975 Sep 2014 WO
2014148976 Sep 2014 WO
2015024616 Feb 2015 WO
2015056105 Apr 2015 WO
2015156146 Oct 2015 WO
Non-Patent Literature Citations (19)
Entry
Anonymous, ‘System and Method to Target Advertisements for the Right Focus Group’. ip.com, No. 000218285, May 31, 2012, pp. 1-2.
Anonymous, “Car Built-in Mechanism to Enforce Mandatory Self-Driving Mode”, ip.com, No. 000234916, Feb. 14, 2014, pp. 1-3.
T. Horberry et al., “Driver Distraction: The Effects of Concurrent In-Vehicle Tasks, Road Environment Complexity and Age on Driving Performance”, Elsevier Ltd., Accident Analysis and Prevention, 38, 2006, pp. 185-191.
J. Miller, “Self-Driving Car Technology's Benefits, Potential Risks, and Solutions”, The Energy Collective, theenergycollective.com, Aug. 19, 2014, pp. 1-7.
S. Chen et al., “A Crash Risk Assessment Model for Road Curves”, In Proceedings of the 20th International Technical Conference on the Enhanced Safety of Vehicles, 2007, Lyon, France.
J. Wei et al., “Towards a Viable Autonomous Driving Research Platform”, IEEE, Intelligent Vehicles Symposium (IV), 2013, pp. 1-8.
Anonymous, “Diagnostics Mechanism for Self-Driving Cars to Validate Self-Driving Capabilities”, ip.com, Jun. 6, 2014, pp. 1-5.
Brownell, “Shared Autonomous Taxi Networks: An Analysis of Transportation Demand in NJ and a 21st Century Solution for Congestion”, Dissertation, Princeton University, 2013, pp. 1-122.
Sessa et al., “Blueprint of Alternative City Cyber-Mobility Take-Up Scenarios”, Seventh Framework Programme Theme SST.2012.3.1-4, Automated Urban Vehicles Collaborative Project—Grant Agreement No: 314190, 2013, pp. 1-63.
Lutin et al., “The Revolutionary Development of Self-Driving Vehicles and Implications for the Transportation Engineering Profession”, ITE Journal 83.7, 2013, pp. 28-32.
A. Hars, “Self-Driving Cars: The Digital Transformation of Mobility”, Marktplätze im Umbruch, Springer Berlin Heidelberg, 2015, pp. 539-549.
Jimenez et al.; “Autonomous collision avoidance system based on accurate knowledge of the vehicle surroundings”; Inst Engineering Technology—IET; IET Intelligent Transport Systems vol. 9, No. 1, pp. 105-117; 2015; England.
Anonymous, “Avoiding Crashes With Self-Driving Cars: Today's Crash-Avoidance Systems Are the Mile Markers to Tomorrow's Autonomous Vehicles”. Consumer Reports Magazine, Feb. 2014. Web. Sep. 22, 2016. <http://www.consumerreports.org/cro/magazine/2014/04/the-road-to-self-driving-cars/index.htm>.
Anonymous, “Google Files Patent for Second-Gen Autonomous Vehicle Without a Steering Wheel, Brake Pedal & More”. patentlymobile.com, Nov. 27, 2015. Web. Sep. 22, 2016. <http://www.patentlymobile.com/2015/11/GOOGLE-FILES-PATENT-FOR-SECOND-GEN-AUTONOMOUS-VEHICLE-WITHOUT-A-STEERING-WHEEL-BRAKE-PEDAL-MORE.HTML>.
C. Berger et al., “COTS—Architecture With a Real-Time OS for a Self-Driving Miniature Vehicle”, Safecomp 2013—Workshop ASCOMS of the 32nd International Conference on Computer Safety, Reliability and Security, Sep. 2013, Toulouse, France, pp. 1-13.
P. Mell et al., “NIST Definition of Cloud Computing”, National Institute of Standards and Technology, Information Technology Laboratory, Sep. 2011, pp. 1-7.
U.S. Appl. No. 14/924,034 Non-Final Office Action dated Jul. 13, 2017.
List of IBM Patents or Patent Applications Treated as Related. Dec. 11, 2017.
R. Vaidyanathan et al., “A Reflexive Vehicle Control Architecture Based on a Neural Model of the Cockroach Escape Response”, Institution of Mechanical Engineers. Journal of Systems and Control Engineering, 2011, vol. 226, No. 5, pp. 699-718.
Related Publications (1)
Number Date Country
20180099669 A1 Apr 2018 US
Continuations (1)
Number Date Country
Parent 14924034 Oct 2015 US
Child 15838500 US