Autonomous vehicles are becoming more sophisticated. As the level of sophistication increases, the amount of passenger interaction required by the autonomous vehicle decreases. Eventually, autonomous vehicles may require no passenger interaction beyond, e.g., selecting a destination, leaving passengers free to focus on non-driving-related tasks. While a human operator may remain “in the driver's seat,” i.e., proximate to vehicle components such as a steering wheel, accelerator pedal, brake pedal, gearshift lever, etc., components such as the steering wheel may be reconfigured to provide the passenger more room during autonomous operation. Because the airbag for a passenger in the driver seat is typically housed in the steering wheel, vehicles are generally not equipped with a separate frontal impact airbag for that passenger. However, adding a completely independent airbag mechanism for the driver may be difficult or otherwise undesirable in view of cost and packaging targets.
An exemplary autonomous vehicle includes a steering wheel located in a passenger compartment. The steering wheel is configured to be moved from an operational position to a stowed position. In the event of a collision, a first airbag is configured to deploy when the steering wheel is in the operational position and a second airbag is configured to deploy at the driver seat when the steering wheel is in the stowed position. In some implementations, the second airbag may be in the form of a multi-stage airbag configured to selectively deploy at the passenger seat only when the steering wheel is in the operational position, or at both the driver and passenger seats when the steering wheel is in the stowed position. For example, a programmed controller in the vehicle may determine airbag deployment. The vehicle may further include autonomous driving sensors and an autonomous controller that receives signals generated by the autonomous driving sensors (e.g., sensors for driving the vehicle in an autonomous mode) and controls at least one vehicle subsystem to operate the vehicle in autonomous mode according to the signals received.
The
As illustrated in
The user interface device 135 may be configured to present information to a user, such as a driver, during operation of the vehicle 100. Moreover, the user interface device 135 may be configured to receive user inputs. Thus, the user interface device 135 may be located in the passenger compartment 105 of the vehicle 100. In some possible approaches, the user interface device 135 may include a touch-sensitive display screen.
The autonomous driving sensors 140 may include any number of devices configured to generate signals that help navigate the vehicle 100 while the vehicle 100 is operating in an autonomous (e.g., driverless) mode. Examples of autonomous driving sensors 140 may include a radar sensor, a lidar sensor, a camera, or the like. The autonomous driving sensors 140 help the vehicle 100 “see” the roadway and the vehicle surroundings and/or negotiate various obstacles while the vehicle 100 is operating in the autonomous mode.
The controller 145 may be configured to control one or more subsystems 155 while the vehicle 100 is operating in the autonomous mode. Examples of subsystems 155 that may be controlled by the controller 145 may include a brake subsystem, a suspension subsystem, a steering subsystem, and a powertrain subsystem. The controller 145 may control any one or more of these subsystems 155 by outputting signals to control units associated with these subsystems 155. The controller 145 may control the subsystems 155 based, at least in part, on signals generated by the autonomous driving sensors 140.
The seat motors 150 may be configured to control the position and/or orientation of one or more seats 110 inside the passenger compartment 105. Each motor 150 may be associated with only one seat 110. Alternatively, a single motor 150 may be used to move multiple seats 110, including an entire row of seats 110. The motor 150 may operate in accordance with control signals output by the user interface device 135. For example, the user interface device 135 may receive commands from the driver or another passenger indicating a desire for one or more seats 110 to adopt a particular configuration. The motor 150 may automatically adjust the seats 110 to the desired configuration, including one or more predetermined configurations defined by a seat memory system. The motor 150 may be configured to apply one of the predetermined configurations based on the person in the seat 110. The seat occupant may be identified from, e.g., sensors located in the passenger compartment 105. If the seat occupant is unknown, the motor 150 may move the seat 110 to a default configuration. Examples of different possible seat 110 configurations are described in greater detail below. For example, the motor 150 may cause the seat 110 to pivot, fold, unfold, slide, recline, etc.
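The seat-memory selection just described reduces to a lookup with a fallback. The sketch below is illustrative only; the function name, preset fields, and occupant identifiers are assumptions, not part of the disclosure:

```python
def seat_configuration(occupant_id, memory_presets, default=None):
    """Choose a seat preset: a recognized occupant gets their stored
    configuration; an unknown occupant gets the default configuration.

    `memory_presets` maps occupant ids to stored settings; the field
    names below are hypothetical examples of what a preset might hold.
    """
    if default is None:
        default = {"recline_deg": 0, "slide_mm": 0}
    if occupant_id is not None and occupant_id in memory_presets:
        return memory_presets[occupant_id]
    return default
```

The motor 150 would then drive the seat to whatever configuration the lookup returns.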
The crash sensors 320 include one or more known sensors for detecting a collision or imminent collision in a vehicle 100. For example, the crash sensors 320 may include accelerometers, radar, image sensors, etc. The sensors 320 provide data to the controller 145, which may operate in a known manner to identify a collision or imminent collision, e.g., a frontal collision, of the vehicle 100.
Referring now to
One way to move the seats 110 from the front-facing position to the rear-facing position may be to rotate the seat 110. The seat 110 may be rotated about an axis 175A that extends generally perpendicularly from a floor of the vehicle 100 and through a center of the seat 110. Rotating the seat 110 may occur automatically when the vehicle 100 is operating in the autonomous mode or while the vehicle is parked. Moreover, the seat 110 may be rotated manually by, e.g., removing and repositioning the seat 110 or spinning the seat 110 about the axis 175A.
Alternatively, one part of the seat 110, such as a back portion 165 (
As shown in
A locking mechanism (not shown) may prevent the seat 110 from being moved from the front-facing position to the rear-facing position, and vice versa. The locking mechanism may be unlocked manually by the user or automatically by, e.g., the motor 150 or another device. The original position of the seat 110 is shown in
The console 160 located between the two front seats 110 may be configured to slide toward the front of the vehicle 100 when one or both seats 110 in the front row are in the rear-facing position. Alternatively, the console 160 may slide toward the back of the vehicle 100 to a position between the first and second rows of seats 110. This way, when one or more seats 110 in the first row are oriented in a rear-facing position, the center console 160 may act as a table available to passengers in the first and/or second rows. The rearward movement of the console 160 may also facilitate the reorientation of the seats 110, which move between forward and rearward orientations by, e.g., rotational movements. In other words, the entire console 160 may move toward a center of the passenger compartment 105 to allow one or more of the seats 110 to change orientations. In some implementations, the console 160 may be configured to rotate. The console 160 may be moved by, e.g., lifting the console 160 about a hinge and either sliding or rotating it out of the path of the seats 110, or via a stepper motor. Therefore, the console 160 may be repositioned so as not to interfere with the reorientation of the seats 110 or with a passenger's legs as the seats 110 change orientation.
The console 160 may include an integrated computing device 215, which may include a desktop computer, a laptop computer, a tablet computer, or the like. In
Referring now to
Furthermore, the computing device 215, discussed above, may be implemented into the steering wheel 115 instead of the console 160 for when the vehicle 100 is operating in the autonomous mode (see
Alternatively, with reference to
To accommodate the seats 110 under the instrument panel 195, one or more components located in the passenger compartment 105, such as the accelerator pedal 120 and the brake pedal 125 (see
A first airbag 225, which may be installed in the steering wheel 115 for deployment therefrom, may be configured to deploy during a collision, e.g., a frontal collision, that occurs while the steering wheel 115 is in the operational position. While the steering wheel 115 is in the stowed position, however, the first airbag 225 may be ineffective during a collision, even if the driver seat 110 is facing forward. Therefore, a second airbag 230, which may be located in a header above the windshield near the driver seat 110, may be deployed during a collision if the steering wheel 115 is in the stowed position.
A processing device 235, which may be incorporated into the controller 145 or some other controller in the vehicle 100, may be configured to selectively enable the first airbag 225 or the second airbag 230 based on, e.g., whether the steering wheel 115 is in the operational position or in the stowed position. When the steering wheel 115 is in the operational position, the processing device 235 may enable the first airbag 225 and disable the second airbag 230. Thus, in the event of a collision, the first airbag 225 will be deployed but the second airbag 230 will not. When the steering wheel 115 is in the stowed position, the processing device 235 may enable the second airbag 230 and disable the first airbag 225, so that in the event of a collision only the second airbag 230 may be deployed.
The processing device 235 may further consider whether the driver seat 110 is facing forward or backward. If facing forward, the processing device 235 may enable either the first airbag 225 or the second airbag 230 according to the conditions previously discussed. If facing backward, the processing device 235 may disable both the first airbag 225 and the second airbag 230.
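Taken together, the rules in the two preceding paragraphs amount to a small decision function. The sketch below is a non-authoritative summary; the type names and signature are illustrative, not from the disclosure:

```python
from dataclasses import dataclass
from enum import Enum, auto


class WheelPosition(Enum):
    OPERATIONAL = auto()
    STOWED = auto()


@dataclass
class AirbagState:
    first_enabled: bool   # first airbag 225, housed in the steering wheel
    second_enabled: bool  # second airbag 230, in the header above the windshield


def select_airbags(wheel: WheelPosition, seat_facing_forward: bool) -> AirbagState:
    """Mirror the selection rules described for the processing device 235."""
    # A rear-facing driver seat disables both frontal airbags.
    if not seat_facing_forward:
        return AirbagState(first_enabled=False, second_enabled=False)
    # Otherwise exactly one airbag is enabled, chosen by wheel position.
    if wheel is WheelPosition.OPERATIONAL:
        return AirbagState(first_enabled=True, second_enabled=False)
    return AirbagState(first_enabled=False, second_enabled=True)
```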
The vehicle 100′ includes a multi-stage airbag 231′ across the vehicle 100′ so as to laterally overlap with both the driver and passenger seats 110a′-110b′. For example, as illustrated herein, the multi-stage airbag 231′ may be in a header above the windshield. In another example, the multi-stage airbag 231′ may be in the instrument panel 195′. With further reference to
As described herein, the steering wheel 115′ may be stowed, e.g. under the instrument panel 195′, such as, by way of non-limiting example, to give a driver additional leg room or clearance during autonomous driving operations. Referring to
The multi-stage airbag 231′ may have a variety of configurations. For example, the multi-stage airbag 231′ may be a dual-stage airbag with a dual-stage inflator. In another example, the multi-stage airbag 231′ may have two or more independent chambers and at least two inflators.
The processing device 235′, which may be incorporated into a controller in the vehicle 100′, may be configured to selectively enable inflation of the first airbag 225′ and/or the multi-stage airbag 231′ based on, e.g., whether the steering wheel 115′ is in the operational position or in the stowed position. When the steering wheel 115′ is in the operational position, the processing device 235′ may enable the first airbag 225′ and partially enable the multi-stage airbag 231′ to deploy only at the passenger seat 110b′. Thus, in the event of a collision, the first airbag 225′ will be deployed at the driver seat 110a′ and the multi-stage airbag 231′ will only be deployed at the passenger seat 110b′. When the steering wheel 115′ is in the stowed position, the processing device 235′ may enable the multi-stage airbag 231′ to deploy at both of the driver and passenger seats 110a′ and 110b′, and disable the first airbag 225′. The processing device 235′ may further consider whether the driver seat 110a′ and/or passenger seat 110b′ are facing forward or backward. For example, if both are facing backward, the processing device 235′ may disable both the first airbag 225′ and the multi-stage airbag 231′.
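The multi-stage variant can be sketched the same way. The function name and dictionary keys below are illustrative assumptions; note that for seat orientation the disclosure spells out only the case where both seats face backward:

```python
def select_deployment(wheel_stowed: bool,
                      driver_rear_facing: bool,
                      passenger_rear_facing: bool) -> dict:
    """Return which inflation signals would be enabled.

    Keys: 'first' for the steering-wheel airbag 225', and
    'multi_driver' / 'multi_passenger' for the stages of airbag 231'.
    """
    # The text gives an explicit rule only for both seats facing backward.
    if driver_rear_facing and passenger_rear_facing:
        return {"first": False, "multi_driver": False, "multi_passenger": False}
    if wheel_stowed:
        # Stowed wheel: the multi-stage airbag covers both seats.
        return {"first": False, "multi_driver": True, "multi_passenger": True}
    # Operational wheel: wheel airbag for the driver, one stage for the passenger.
    return {"first": True, "multi_driver": False, "multi_passenger": True}
```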
The process 1000 begins in a block 1005, in which the vehicle 100′ monitors data from its one or more crash sensors. Next, in a block 1010, the controller of the vehicle 100′ determines whether a frontal crash threshold has been reached or exceeded, i.e., whether data from the one or more crash sensors indicates that the vehicle 100′ has experienced, or is likely to imminently experience, a frontal collision. If not, the process loops back to the block 1005, and the monitoring continues.
If a frontal crash threshold is met, next, in a block 1015, the controller of the vehicle 100′ determines whether the steering wheel 115′ is in a stowed or operating position. For example, a user interface device could have provided an instruction to the controller to stow the steering wheel 115′, and, in another example, the steering wheel 115′ could be automatically stowed upon the vehicle 100′ entering an autonomous or semi-autonomous mode, etc. In any case, if the steering wheel 115′ is stowed, then the process 1000 proceeds to a block 1020. At the block 1020, the controller of the vehicle 100′, or possibly a separate airbag controller such as is known, provides a signal to deploy the multi-stage airbag 231′ at both the driver seat 110a′ and the passenger seat 110b′, as illustrated in
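Blocks 1005 through 1020 can be sketched as a polling loop. Everything below is a hypothetical harness: the callables, the threshold value, and the deployment pattern for the operational-wheel branch (which this excerpt does not describe explicitly) are assumptions drawn from the earlier description:

```python
import time


def run_process_1000(read_crash_sensors, wheel_is_stowed, deploy, poll_s=0.01):
    """Poll crash-sensor data, test it against a frontal-crash threshold,
    then pick a deployment pattern from the steering-wheel position.

    `read_crash_sensors` returns a severity value, `wheel_is_stowed`
    returns a bool, and `deploy` receives the set of airbags to fire;
    all three are stand-ins for vehicle interfaces not named in the text.
    """
    FRONTAL_CRASH_THRESHOLD = 1.0  # assumed normalized severity units

    while True:
        severity = read_crash_sensors()          # block 1005: monitor
        if severity < FRONTAL_CRASH_THRESHOLD:   # block 1010: threshold check
            time.sleep(poll_s)
            continue                             # loop back to block 1005
        if wheel_is_stowed():                    # block 1015: wheel position
            # Block 1020: deploy the multi-stage airbag at both seats.
            deploy({"multi_driver", "multi_passenger"})
        else:
            # Operational wheel (per the earlier description): steering-wheel
            # airbag plus the passenger stage of the multi-stage airbag.
            deploy({"first", "multi_passenger"})
        return
```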
In general, computing systems and/or devices, such as the user interface device 135, the controller 145, and the processing device 235, may employ any of a number of computer operating systems, including, but by no means limited to, versions and/or varieties of the Ford SYNC® operating system, the Microsoft Windows® operating system, the Unix operating system (e.g., the Solaris® operating system distributed by Oracle Corporation of Redwood Shores, Calif.), the AIX UNIX operating system distributed by International Business Machines of Armonk, N.Y., the Linux operating system, the Mac OS X and iOS operating systems distributed by Apple Inc. of Cupertino, Calif., and the Android operating system developed by the Open Handset Alliance. Examples of computing devices include, without limitation, a computer workstation, a server, a desktop, notebook, laptop, or handheld computer, or some other computing system and/or device.
Computing devices generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, JavaScript, Perl, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media.
A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
In some examples, system elements may be implemented as computer-readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, etc.), stored on computer readable media associated therewith (e.g., disks, memories, etc.). A computer program product may comprise such instructions stored on computer readable media for carrying out the functions described herein.
With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain embodiments, and should in no way be construed so as to limit the claims.
Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.
All terms used in the claims are intended to be given their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.
This application is a continuation-in-part of U.S. Ser. No. 14/220,452 filed Mar. 20, 2014, entitled “AUTONOMOUS VEHICLE WITH RECONFIGURABLE INTERIOR,” which in turn is a continuation-in-part of U.S. Ser. No. 14/085,135 filed on Nov. 20, 2013, U.S. Ser. No. 14/085,158 filed on Nov. 20, 2013, and U.S. Ser. No. 14/085,166 filed on Nov. 20, 2013, each entitled “AUTONOMOUS VEHICLE WITH RECONFIGURABLE SEATS”. The contents of each of the foregoing applications are hereby incorporated herein by reference in their entireties.
| Number | Date | Country |
|---|---|---|
| 20150137492 A1 | May 2015 | US |
| Relation | Number | Date | Country |
|---|---|---|---|
| Parent | 14220452 | Mar 2014 | US |
| Child | 14507760 | | US |
| Parent | 14085135 | Nov 2013 | US |
| Child | 14220452 | | US |
| Parent | 14085158 | Nov 2013 | US |
| Child | 14085135 | | US |
| Parent | 14085166 | Nov 2013 | US |
| Child | 14085158 | | US |