Apparatus and method for identifying a driving state of an unmanned vehicle and unmanned vehicle

Information

  • Patent Grant
  • Patent Number
    10,328,847
  • Date Filed
    Thursday, May 25, 2017
  • Date Issued
    Tuesday, June 25, 2019
Abstract
The present disclosure discloses an apparatus and a method for identifying a driving state of an unmanned vehicle, the identifying apparatus comprising: an information analyzing apparatus, configured to receive information of an operating system of the unmanned vehicle, and to determine and transmit a corresponding driving state signal based on the information; and a display apparatus, configured to receive the driving state signal transmitted from the information analyzing apparatus and to display the driving state signal. The identifying apparatus takes into account the driving conditions of the various vehicles on the road and displays the driving state of the unmanned vehicle to surrounding vehicles and pedestrians in an intuitive and eye-catching manner, improving the safety of road driving. It is also simple in structure, easy to install, low in cost, and suitable for widespread adoption.
Description
CROSS-REFERENCE TO RELATED APPLICATION

This application is related to and claims priority from Chinese Application No. 201611199483.8, filed on Dec. 22, 2016, entitled "Apparatus and Method for Identifying a Driving State of an Unmanned Vehicle and Unmanned Vehicle," the entire disclosure of which is hereby incorporated by reference.


TECHNICAL FIELD

The present disclosure relates to the field of unmanned vehicle technology, specifically to an apparatus and a method for identifying a driving state of an unmanned vehicle. The present disclosure also relates to an unmanned vehicle equipped with the apparatus for identifying a driving state of the unmanned vehicle.


BACKGROUND

With the increasing research on unmanned vehicle technology and the development of the industry, unmanned vehicles will travel on public roads in the near future. In the early days of the unmanned vehicle age, unmanned vehicles and ordinary traditional vehicles will inevitably coexist and run together on public roads. A driver of a traditional vehicle, or a pedestrian, therefore needs to be able to quickly identify an unmanned vehicle and whether it is in a manual or an automatic driving state. This gives the driver and the pedestrian more time to predict the vehicle's behavior, so as to avoid unnecessary traffic accidents and damage.


SUMMARY

In view of the above-mentioned drawbacks or deficiencies in the prior art, the present disclosure aims to provide a solution for quickly identifying the driving state of an unmanned vehicle.


In a first aspect, an embodiment of the present disclosure provides an apparatus for identifying a driving state of an unmanned vehicle, the apparatus comprising: an information analyzing apparatus, configured to receive information of an operating system of the unmanned vehicle, and to determine and transmit a corresponding driving state signal based on the information; and a display apparatus, configured to receive the driving state signal transmitted from the information analyzing apparatus and to display the driving state signal.


In a second aspect, an embodiment of the present disclosure also provides a method for identifying a driving state of an unmanned vehicle, the method comprising: receiving information of an operating system of the unmanned vehicle and decoding the information; processing the decoded information, converting the decoded information into a corresponding driving state signal, and transmitting the driving state signal; and receiving and displaying the driving state signal of the unmanned vehicle.


In a third aspect, an embodiment of the present disclosure also provides an unmanned vehicle equipped with the apparatus for identifying a driving state of the unmanned vehicle.


The apparatus for identifying a driving state of an unmanned vehicle provided by the embodiments of the present disclosure takes into account the driving conditions of the various vehicles on the road and, by means of the information analyzing apparatus and the display apparatus, displays the driving state of the unmanned vehicle to surrounding vehicles and pedestrians in an intuitive and eye-catching manner, improving the safety of road driving. The apparatus is also simple in structure, easy to install, low in cost, and suitable for widespread adoption.





BRIEF DESCRIPTION OF THE DRAWINGS

Other features, objectives and advantages of the present disclosure will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the accompanying drawings, wherein:



FIG. 1 shows an exemplary system architecture of an apparatus for identifying a driving state of an unmanned vehicle according to an embodiment of the present disclosure;



FIG. 2 shows a schematic structure diagram of the apparatus for identifying a driving state of an unmanned vehicle according to an embodiment of the present disclosure; and



FIG. 3 shows an exemplary flowchart of a method for identifying a driving state of an unmanned vehicle according to an embodiment of the present disclosure.





DETAILED DESCRIPTION OF EMBODIMENTS

The present disclosure will be further described below in detail in combination with the accompanying drawings and the embodiments. It should be appreciated that the specific embodiments described herein are merely used for explaining the relevant invention, rather than limiting the invention. In addition, it should be noted that, for the ease of description, only the parts related to the relevant invention are shown in the accompanying drawings.



FIG. 1 shows an exemplary system architecture of an apparatus for identifying a driving state of an unmanned vehicle according to an embodiment of the present disclosure, and FIG. 2 shows a schematic structure diagram of the apparatus for identifying a driving state of an unmanned vehicle according to an embodiment of the present disclosure.


As shown in FIG. 1, the apparatus 100 for identifying a driving state of an unmanned vehicle may include an information analyzing apparatus 110 and a display apparatus 120.


The information analyzing apparatus 110 is configured to receive information of an operating system 10 of the unmanned vehicle, and determine and transmit a corresponding driving state signal based on the information.


Specifically, the information analyzing apparatus 110 includes an information decoding unit 111 and a real-time calculation unit 112. The information decoding unit 111 is configured to receive information of the operating system 10 of the unmanned vehicle and decode the information. The real-time calculation unit 112 processes the information decoded by the information decoding unit 111, converts it into a corresponding driving state signal, and transmits the signal. The driving state signal may be represented numerically: for example, if the driving state is automatic driving, the transmitted signal is "1", and if the driving state is manual driving, the transmitted signal is "0". The driving state signal may also be represented as an encrypted communication message that is converted into characters.
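The following is a minimal sketch, in Python, of how such an information analyzing apparatus might be organized. It assumes the operating system publishes its driving mode as a small JSON message; the field name "driving_mode", the message format, and the transmit callback are illustrative assumptions rather than part of the disclosure.

```python
import json
from typing import Callable

AUTOMATIC_SIGNAL = "1"   # driving state signal for automatic driving
MANUAL_SIGNAL = "0"      # driving state signal for manual driving


class InformationAnalyzingApparatus:
    def __init__(self, transmit: Callable[[str], None]):
        # 'transmit' forwards the driving state signal to the display apparatus (120).
        self._transmit = transmit

    def decode(self, raw_message: bytes) -> dict:
        # Information decoding unit (111): decode the raw message from the
        # operating system (here assumed to be JSON) into a structured form.
        return json.loads(raw_message.decode("utf-8"))

    def process(self, raw_message: bytes) -> str:
        # Real-time calculation unit (112): map the decoded information to the
        # numeric driving state signal and transmit it.
        info = self.decode(raw_message)
        signal = AUTOMATIC_SIGNAL if info.get("driving_mode") == "automatic" else MANUAL_SIGNAL
        self._transmit(signal)
        return signal


# Usage: the operating system reports automatic driving, so "1" is transmitted.
apparatus = InformationAnalyzingApparatus(transmit=print)
apparatus.process(b'{"driving_mode": "automatic"}')
```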


The display apparatus 120 is configured to receive the driving state signal transmitted from the information analyzing apparatus 110 and display it, so that a driver of a traditional vehicle or a pedestrian can quickly identify the unmanned vehicle and its driving state.


As shown in FIG. 2, the display apparatus 120 may include a light apparatus 120A provided on the top of the unmanned vehicle. After the light apparatus 120A receives the driving state signal transmitted by the information analyzing apparatus 110, the current driving state of the unmanned vehicle may be displayed by a color change of the light apparatus 120A.


Further, the light apparatus 120A includes a light emitting device and a base supporting the light emitting device. The base is fixed to the top of the unmanned vehicle, and the light emitting device is configured to emit light of different colors. For example, when the light emitting device receives the signal "1" transmitted from the information analyzing apparatus 110, it emits red light, indicating that the unmanned vehicle is currently in an automatic driving state. When the light emitting device receives the signal "0" transmitted from the information analyzing apparatus 110, it emits green light, indicating that the unmanned vehicle is currently in a manual driving state.
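A minimal sketch of this signal-to-color mapping, assuming the same "1"/"0" signals as above; the set_color() method is a hypothetical stand-in for the actual driver of the light emitting device.

```python
SIGNAL_TO_COLOR = {
    "1": "red",    # automatic driving state
    "0": "green",  # manual driving state
}


class LightApparatus:
    def on_driving_state_signal(self, signal: str) -> None:
        # Map the received driving state signal to a color and show it.
        color = SIGNAL_TO_COLOR.get(signal)
        if color is None:
            return  # ignore unknown signals
        self.set_color(color)

    def set_color(self, color: str) -> None:
        # Placeholder for the hardware driver of the light emitting device.
        print(f"light emitting device now shows {color}")


LightApparatus().on_driving_state_signal("1")  # shows red: automatic driving
```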


The display apparatus 120 may further include a character apparatus 120B provided on the rear glass of the unmanned vehicle. After the character apparatus 120B receives the driving state signal transmitted from the information analyzing apparatus 110, the current driving state of the unmanned vehicle may be displayed through characters.


Further, the character apparatus 120B includes a base plate and light emitting diodes. The base plate is fixed inside the rear glass of the unmanned vehicle, and the light emitting diodes are mounted on the base plate for displaying the character content representing the current driving state of the unmanned vehicle. For example, when the light emitting diodes receive an encrypted communication message transmitted from the information analyzing apparatus 110 that is converted into and displayed as the characters "automatic", this indicates that the unmanned vehicle is currently in an automatic driving state. When the message is converted into and displayed as the characters "manual", this indicates that the unmanned vehicle is currently in a manual driving state.
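A minimal sketch of the character apparatus, using a trivial Base64 encoding purely as a stand-in for the unspecified encrypted communication message; the render() method is likewise a hypothetical stand-in for the driver of the light emitting diodes on the base plate.

```python
import base64


class CharacterApparatus:
    def on_encrypted_message(self, message: bytes) -> None:
        # Convert the received message into displayable characters and show them.
        characters = self.decrypt(message)  # e.g. "automatic" or "manual"
        self.render(characters)

    def decrypt(self, message: bytes) -> str:
        # Stand-in decryption: the disclosure only states that the encrypted
        # message is converted into characters, without fixing a scheme.
        return base64.b64decode(message).decode("utf-8")

    def render(self, characters: str) -> None:
        # Placeholder for driving the light emitting diodes on the base plate.
        print(f"LED panel displays: {characters}")


CharacterApparatus().on_encrypted_message(base64.b64encode(b"automatic"))
```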


The character content representing the driving state of the unmanned vehicle is given only for exemplary purposes and is not limited thereto. In addition, the character content displayed by the light emitting diodes may be in Chinese, English, or any one of, or a combination of, other languages, depending on the location and language environment of the unmanned vehicle.


It should be noted that the display apparatus 120 may use the light apparatus 120A and the character apparatus 120B in combination, or only one of them may be used alone to display the driving state of the unmanned vehicle.


The apparatus 100 for identifying a driving state of an unmanned vehicle provided by the embodiments of the present disclosure takes into account the driving conditions of the various vehicles on the road and, by means of the information analyzing apparatus 110 and the display apparatus 120, displays the driving state of the unmanned vehicle to surrounding vehicles and pedestrians in an intuitive and eye-catching manner, improving the safety of road driving. The apparatus is also simple in structure, easy to install, low in cost, and suitable for widespread adoption.



FIG. 3 shows an exemplary flowchart of a method for identifying a driving state of an unmanned vehicle according to an embodiment of the present disclosure.


As shown in FIG. 3, the method 200 includes the following steps:


Step 201: receiving information of an operating system of the unmanned vehicle and decoding the information;


Step 202: processing the decoded information, converting the decoded information into a corresponding driving state signal and transmitting the driving state signal; and


Step 203: receiving and displaying the driving state signal of the unmanned vehicle.


In the above step 203, the driving state signal of the unmanned vehicle is displayed by the color change of the light emitting device and/or the character content of the character apparatus as described above; therefore, a detailed description thereof is omitted.
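For illustration only, an end-to-end sketch of steps 201 to 203, reusing the hypothetical InformationAnalyzingApparatus and LightApparatus classes from the sketches above:

```python
# Wire the sketched components together: the operating system message is
# decoded and converted (steps 201-202), and the resulting signal is
# received and displayed (step 203).
light = LightApparatus()
analyzer = InformationAnalyzingApparatus(transmit=light.on_driving_state_signal)

analyzer.process(b'{"driving_mode": "manual"}')
# Step 203: the light apparatus receives the signal "0" and shows green light.
```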


In addition, an embodiment of the present disclosure also provides an unmanned vehicle comprising the apparatus 100 for identifying a driving state of an unmanned vehicle.


The flowcharts and block diagrams in the accompanying figures illustrate the architectures, functions and operations that may be implemented according to the system, the method and the computer program product of the various embodiments of the present disclosure. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a code portion. The module, the program segment, or the code portion comprises one or more executable instructions for implementing the specified logical function. It should be noted that, in some alternative implementations, the functions denoted by the blocks may occur in a sequence different from the sequences shown in the figures. For example, two blocks shown in succession may, in practice, be executed substantially in parallel, or sometimes in a reverse sequence, depending on the functionalities involved. It should also be noted that each block in the block diagrams and/or flowcharts, and any combination of such blocks, may be implemented by a dedicated hardware-based system executing specified functions or operations, or by a combination of dedicated hardware and computer instructions.


The units or modules involved in the embodiments of the present disclosure may be implemented by way of software or hardware. The described units or modules may also be provided in a processor. The names of these units or modules do not in any case constitute a limitation to the unit or module itself.


In another aspect, the present disclosure further provides a computer-readable storage medium. The computer-readable storage medium may be the computer-readable storage medium included in the apparatus in the above embodiments, or a stand-alone computer-readable storage medium which has not been assembled into the apparatus. The computer-readable storage medium stores one or more programs. The programs are used by one or more processors to perform the method for identifying a driving state of an unmanned vehicle described in the present disclosure.


The foregoing is only a description of the preferred embodiments of the present disclosure and the applied technical principles. It should be appreciated by those skilled in the art that the inventive scope of the present disclosure is not limited to the technical solutions formed by the particular combinations of the above technical features. The inventive scope should also cover other technical solutions formed by any combinations of the above technical features or their equivalent features without departing from the concept of the invention, for example, technical solutions formed by replacing the features disclosed in the present disclosure with (but not limited to) technical features having similar functions.

Claims
  • 1. An apparatus for identifying a driving state of an unmanned vehicle, the apparatus comprising: an information analyzing apparatus, configured to receive information of an operating system of the unmanned vehicle, and determine and transmit a corresponding driving state signal based on the information; and a display apparatus, configured to receive the driving state signal transmitted from the information analyzing apparatus and display the driving state signal; wherein the display apparatus comprises a character apparatus provided on a rear glass of the unmanned vehicle, the character apparatus configured to display a current driving state of the unmanned vehicle through characters after receiving the driving state signal transmitted by the information analyzing apparatus, the current driving state of the unmanned vehicle comprising an automatic driving state or a manual driving state; wherein the character apparatus comprises a base plate and light emitting diodes, the base plate fixed inside the rear glass of the unmanned vehicle, and the light emitting diodes mounted on the base plate and configured to display character content representing the current driving state of the unmanned vehicle.
  • 2. The apparatus according to claim 1, wherein the information analyzing apparatus comprises: at least one processor; and a memory storing instructions, the instructions, when executed by the at least one processor, cause the at least one processor to perform operations, the operations comprising: receiving the information of the operating system of the unmanned vehicle; decoding the information; converting the information into a corresponding driving state signal; and transmitting the driving state signal.
  • 3. The apparatus according to claim 1, wherein the display apparatus further includes a light apparatus provided on top of the unmanned vehicle, the light apparatus configured to display the current driving state of the unmanned vehicle with color changes of the light apparatus after receiving the driving state signal transmitted by the information analyzing apparatus.
  • 4. The apparatus according to claim 3, wherein the light apparatus includes a light emitting device and a base supporting the light emitting device, the base is fixed to a top of the unmanned vehicle, and the light emitting device is configured to emit light of different colors.
  • 5. The apparatus according to claim 4, wherein the light emitting device is configured to indicate that the unmanned vehicle is currently in the automatic driving state by emitting red light, and the light emitting device is configured to indicate that the unmanned vehicle is currently in the manual driving state by emitting green light.
  • 6. The apparatus according to claim 1, wherein the character content displayed by the light emitting diodes of the character apparatus is in a language comprising at least one of Chinese or English.
  • 7. An unmanned vehicle, comprising the apparatus for identifying a driving state of an unmanned vehicle according to claim 1.
  • 8. A method for identifying an unmanned vehicle driving state, the method comprising: receiving information of an operating system of the unmanned vehicle and decoding the information; processing the decoded information, converting the decoded information into a corresponding driving state signal and transmitting the driving state signal; and receiving and displaying the driving state signal of the unmanned vehicle, wherein the driving state signal of the unmanned vehicle is displayed with a character content of a character apparatus provided on a rear glass of the unmanned vehicle, the driving state signal of the unmanned vehicle comprising an automatic driving state or a manual driving state; wherein the character apparatus comprises a base plate and light emitting diodes, the base plate fixed inside the rear glass of the unmanned vehicle, and the light emitting diodes mounted on the base plate and configured to display character content representing the current driving state of the unmanned vehicle.
  • 9. The method according to claim 8, wherein the driving state signal of the unmanned vehicle is displayed with color changes of a light emitting device.
  • 10. A non-transitory computer storage medium storing a computer program, which when executed by one or more processors, causes the one or more processors to perform operations, the operations comprising: receiving information of an operating system of the unmanned vehicle and decoding the information; processing the decoded information, converting the decoded information into a corresponding driving state signal and transmitting the driving state signal; and receiving and displaying the driving state signal of the unmanned vehicle, wherein the driving state signal of the unmanned vehicle is displayed with a character content of a character apparatus provided on a rear glass of the unmanned vehicle, the driving state signal of the unmanned vehicle comprising an automatic driving state or a manual driving state; wherein the character apparatus comprises a base plate and light emitting diodes, the base plate fixed inside the rear glass of the unmanned vehicle, and the light emitting diodes mounted on the base plate and configured to display character content representing the current driving state of the unmanned vehicle.
  • 11. The non-transitory computer storage medium according to claim 10, wherein the driving state signal of the unmanned vehicle is displayed with color changes of a light emitting device.
Priority Claims (1)
Number Date Country Kind
201611199483.8 Dec 2016 CN national
Related Publications (1)
Number Date Country
20180182186 A1 Jun 2018 US