The present invention relates to a navigation device, a method for providing a navigation service, and a server for providing a navigation service, and more specifically, to a technology for sharing information on a vehicle currently being driven with an external terminal and for providing the driver with driving guidance on a navigation device installed in the vehicle by using information and images transmitted by the external terminal.
A navigation device displays the current location of a moving object, such as a vehicle, or the like, on a map by using GPS signals received from a global positioning system (GPS). Such a navigation device is currently mounted on various moving objects such as ships, aircraft, vehicles, and the like, and is widely used to check the current location and moving speed of the moving object or to determine a moving route.
In general, a navigation device used in a vehicle searches for driving routes to a destination using map information when a user inputs the destination, and guides the vehicle along a driving route selected by the user. In addition, the navigation device provides guidance for the user to arrive at the destination by visually or audibly providing various information such as driving routes to the destination, terrain features located around the driving routes, a road congestion level, and so on.
On the other hand, even with the guidance provided by a navigation device, it is often difficult to find the exact route to take, and drivers sometimes get lost. In particular, in complex alleys it may be difficult to determine, from the guidance alone, whether to turn into this alley or the next one. As a result, the driver often stops the vehicle for a while to compare the map provided by the navigation device with the actual surroundings, or listens to an explanation of the route to the destination through a call with an acquaintance at the destination; in such cases, many problems occur, such as failing to arrive at the destination in time, causing traffic jams, or the like.
A navigation device, a method for providing a navigation service, and a server for providing a navigation service in accordance with an embodiment are an invention devised to solve the problems described above, and an object thereof is to improve the reliability of a navigation device by sharing the driving information and front-view images of a vehicle with an external user and receiving more accurate driving-route guidance from an external user who knows the route to the destination well.
A navigation device in accordance with one embodiment may comprise a communication unit configured to send a driving image of a vehicle captured through at least one camera installed in the vehicle to an external terminal and to receive driving assistance information inputted by a user of the external terminal for the driving image and a control unit configured to output information generated based on the driving assistance information by controlling at least one of a display unit and a speaker.
The driving assistance information may comprise display information inputted by a user of the external terminal for the driving image displayed on the external terminal, and the control unit may display the display information on the display unit together with the driving image displayed on the display unit.
The control unit may display first display information, obtained by transforming the driving assistance information into arrows or text, by superimposing or augmenting it over the driving image.
The control unit may display the display information by superimposing or augmenting it over the driving image, without transforming it.
The control unit may display driving route information for guiding the vehicle on the display unit.
The communication unit may send the driving route information to the external terminal, and the control unit may display driving assistance information inputted from the external terminal for the driving route information displayed on the external terminal together with the driving route information displayed on the display unit.
The control unit may output the voice information through a speaker if the driving assistance information is voice information.
If the external terminal and the communication unit are connected, the control unit may simultaneously display, on the display unit, the driving image of the vehicle sent by the communication unit to the external terminal.
A method for providing a navigation service in accordance with one embodiment may comprise receiving driving information of a vehicle and a driving image of the vehicle, and sending the driving image to an external terminal, receiving driving assistance information inputted to the external terminal for the driving image sent to the external terminal and generating a combined image obtained by combining the driving assistance information with the driving image, and sending the combined image to the vehicle or a user terminal providing a driving information service for the vehicle.
A server for providing a navigation service in accordance with one embodiment may comprise a communication unit configured to receive a driving image of a vehicle, to send the driving image to an external terminal, and then to receive driving assistance information inputted by the external terminal for the driving image and a control unit configured to generate a combined image obtained by combining the driving assistance information with the driving image, and to send, by using the communication unit, the combined image and driving route information for guiding the vehicle to the vehicle or a user terminal providing a driving information service for the vehicle.
According to the navigation device, the method for providing a navigation service, and the server for providing a navigation service in accordance with an embodiment, since a driver can receive guidance for a driving route from an external user while the user of an external terminal is viewing the same driving image, there is an effect of reducing the error between the current location recognized by the driver and a destination and of improving the reliability of the navigation device.
In addition, according to the navigation device, the method for providing a navigation service, and the server for providing a navigation service in accordance with an embodiment, there is an advantage that when a driver drives to a destination, he or she can receive guidance of voice information or input information from another person who knows the route to the destination well together, and thus, can drive to the destination more safely.
d are diagrams for illustrating an operation in which a navigation device and an external terminal are communicatively connected according to one embodiment.
d are diagrams for illustrating an operation of sharing driving image information according to another embodiment.
A navigation device 1 to be described below not only refers to an independent device constructed separately from a vehicle providing a navigation service only, but also may refer to a device that is implemented as one component of a vehicle and provides a navigation service, and may be interpreted as a concept that includes both a server providing a navigation service and a user terminal providing a navigation service.
Referring to
Meanwhile, the navigation device 1 may communicate with an external terminal 10. The external terminal 10 may be implemented by a computer or a portable terminal capable of connecting to the navigation device 1 through a network. Here, the computer may include, for example, a laptop, a desktop, a tablet PC, a slate PC, and the like with a web browser installed thereon, and the portable terminal is, for example, a wireless communication device that ensures portability and mobility, and may include all types of handheld-based wireless communication devices such as PCS (Personal Communication System), GSM (Global System for Mobile communications), PDC (Personal Digital Cellular), PHS (Personal Handy-phone System), PDA (Personal Digital Assistant), IMT (International Mobile Telecommunication)-2000, CDMA (Code Division Multiple Access)-2000, W-CDMA (W-Code Division Multiple Access), WiBro (Wireless Broadband Internet) terminals, smartphones, etc., and wearable devices such as watches, rings, bracelets, anklets, necklaces, glasses, contact lenses, or head-mounted devices (HMDs), and the like.
Referring to
Specifically, the sensor unit 20 may include a sound collector, such as a microphone, for recognizing the voice of the user. The sensor unit 20 may include various devices that receive the voice of the user through vibration, amplify it, and then convert it into an electrical signal. The sensor unit 20 converts the voice of the user into an electrical signal and transmits it to the control unit 80. The control unit 80 may recognize the voice of the user through a voice recognition algorithm.
The communication unit 30 may receive and send various information from and to the external terminal 10 or the external server. Specifically, the communication unit 30 may receive various information related to the driving of the vehicle from the external server, or send the driving images of the vehicle captured through at least one camera 40 installed in the vehicle 5 to the external terminal 10, the display unit 50, and the control unit 80.
The driving information received from the external server may generally include driving information to a destination, information on the current traffic conditions, map information, and various information on the road on which the vehicle is currently traveling, and such information is processed so that the driver can conveniently recognize it and is displayed on the display unit 50 as driving information.
The cameras 40 are modules housed on the inside, the outside, or both sides of the vehicle, and may be mounted on at least one of the center 45 of the front bumper of the vehicle so that front-view images of the vehicle can be captured, the headlamps 46A and 46B installed at both ends of the front bumper, the side-view mirrors 47A and 47B, and the rear-view mirror 48 installed inside the vehicle, for example, as shown in
The front-view images captured by the cameras 40 may be stored in the storage unit 70 to be described later, and the storage unit 70 is a module that can input/output information, such as a hard disk drive, a solid-state drive (SSD), flash memory, CF card (Compact Flash card), SD card (Secure Digital card), SM card (Smart Media Card), MMC card (Multi-Media Card), Memory Stick, or the like, and may be provided inside the device or may be provided in a separate device.
The camera 40 may include a lens assembly, a filter, a photoelectric conversion module, and an analog/digital conversion module. The lens assembly includes a zoom lens, a focus lens and a compensating lens. The focal length of the lens may be moved according to the control of a focus motor MF. The filter may include an optical low-pass filter and an infrared cut-off filter. The optical noise of high frequency components is removed with the optical low-pass filter, and the infrared cut-off filter blocks the infrared component of incident light. The photoelectric conversion module may comprise an imaging device, such as a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS), or the like. The photoelectric conversion module converts light from an optical system (OPS) into an electrical analog signal. The analog/digital conversion module may comprise a CDS-ADC (Correlation Double Sampler and Analog-to-Digital Converter) device.
If the cameras 40 are mounted on the center 45 of the front bumper and on the rear-view mirror 48 of the vehicle as shown in
Further, the cameras 40 may be implemented such that a plurality of cameras facing in different directions are installed as shown in
Therefore, the front camera 40a can capture front-view images in the forward direction, the 30° camera 40b can capture 30° images, which are images in the 30° direction based on the forward direction, the 60° camera 40c can capture 60° images, which are images in the 60° direction based on the forward direction, the 90° camera 40d can capture 90° images, which are images in the 90° direction based on the forward direction, and the 120° camera 40e can capture 120° images, which are images in the 120° direction based on the forward direction, and the images thus captured may be displayed on the display unit 50 or transmitted to the external terminal 10. Accordingly, the user may share not only images of the front view of the vehicle but also side images of the vehicle with the external terminal 10, and may receive driving assistance information from the external terminal 10 based on the images sent. A detailed description thereof will be provided later.
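The camera selection described above can be illustrated with a minimal sketch. It assumes the mounting angles stated in the paragraph (0°, 30°, 60°, 90°, 120° relative to the forward direction) and a hypothetical helper `select_camera` that picks the camera whose mounting angle is closest to a direction requested by the driver or the external terminal; the function name and the reference labels are illustrative, not part of the original disclosure.

```python
# Mounting angles (degrees, relative to the forward direction) for the
# cameras 40a-40e described above; the dictionary keys reuse the
# drawing reference labels purely for illustration.
CAMERA_ANGLES = {"40a": 0, "40b": 30, "40c": 60, "40d": 90, "40e": 120}

def select_camera(requested_angle):
    """Return the label of the camera closest to the requested angle."""
    return min(CAMERA_ANGLES, key=lambda cam: abs(CAMERA_ANGLES[cam] - requested_angle))
```

For example, a request for a view at 35° would be served by the 30° camera 40b, and a request at 110° by the 120° camera 40e.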
The various images acquired by the cameras 40 may be sent to the display unit 50 and displayed together with the image in the vehicle driving information, and may be sent to the external terminal 10 via the communication unit 30.
The driving image acquired by the cameras 40 displayed on the display unit 50 may be generally an image of the front view of the vehicle centered on the exact center of the vehicle, but an image for the left or right side of the vehicle or an image of the rear-view may be displayed at the request of the driver of the vehicle or the external terminal 10.
In addition, the communication unit 30 may receive the voice information of an external user sent by the external terminal 10, or driving assistance information received as input from the user of the external terminal for the driving image captured by the cameras 40 displayed on the external terminal 10, and the like, and the received information may be transmitted to the control unit 80.
Therefore, the communication unit 30 may include one or more components that send and receive signals to and from various components of the vehicle 5 and enable communication with the external server and the external terminal 10. For example, it may include at least one of a short-range communication module, a wired communication module, and a wireless communication module.
The short-range communication module may include various short-range communication modules for sending and receiving signals by using a wireless communication network in a short distance, such as a Bluetooth module, an infrared communication module, an RFID (Radio Frequency Identification) communication module, a WLAN (Wireless Local Area Network) communication module, an NFC communication module, a Zigbee communication module, etc.
The wired communication module may include not only various wired communication modules, such as a Controller Area Network (CAN) communication module, a Local Area Network (LAN) module, a Wide Area Network (WAN) module, or a Value-Added Network (VAN) module, but also various cable communication modules, such as USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), DVI (Digital Visual Interface), RS-232 (recommended standard232), power line communication, or POTS (plain old telephone service).
The wireless communication module may include a wireless communication module supporting various wireless communication methods, such as GSM (Global System for Mobile Communication), CDMA (Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), UMTS (universal mobile telecommunications system), TDMA (Time Division Multiple Access), or LTE (Long Term Evolution), in addition to a Wi-Fi module and a Wireless broadband module.
The display unit 50 is configured with a display, and displays driving route information obtained by visually configuring driving routes, driving speeds of the vehicle, map information, and guidance. In addition, the display unit 50 may display thereon map information and driving information received from the external server, vehicle traveling images captured by the cameras 40 attached to the vehicle, and driving assistance information received from the external terminal 10, etc.
The display may include various display panels, such as a liquid crystal display (LCD) panel, a light-emitting diode (LED) panel, or an organic light-emitting diode (OLED) panel. Meanwhile, if the display includes a graphical user interface (GUI) such as a touch panel, that is, an input device implemented in software, it may also serve as an input unit for receiving user input.
In one embodiment, a touch input made by the driver of the vehicle on the display unit 50 provided as a touchpad may be received, and an arrow or text corresponding thereto may be displayed. The display unit 50 converts an input command of the user into an electrical signal and transmits it to the communication unit 30. In addition, the display unit 50 may receive image information including arrows and text from the communication unit 30 based on a touch input inputted by the user of the external terminal 10 and display the information received.
The speaker 60 may include a component for outputting guidance in a sound.
The speaker 60 in accordance with an embodiment may present various voice information, in addition to outputting guidance. For example, if the sensor unit 20 receives an input command through the voice of the driver, the control unit 80 may recognize the voice command of the user through a voice recognition algorithm, and output an answer corresponding thereto through the speaker 60. In other words, the speaker 60 may output a voice answer stored in advance.
As another embodiment, the speaker 60 may output voice information inputted by the user of the external terminal 10 received by the communication unit 30. In other words, the navigation device 1 may provide guidance just like a streaming service by outputting voice information on a driving route sent by the user of the external terminal 10 through the external terminal 10.
The storage unit 70 stores various information received by the communication unit 30 and programs necessary for the operation of the navigation device 1. Specifically, the information to be stored by the storage unit 70 may include information provided by the vehicle 5, and driving assistance information including display information and voice information sent by the external terminal 10. The storage unit 70 provides information necessary when the control unit 80 operates based on the input command of the user. In addition, the storage unit 70 may have map information stored thereon in advance, and provide the map information for the control unit 80 to search for driving routes to reach a destination. The map information may include information about terrain features.
The storage unit 70 may store information necessary for guidance. For example, the storage unit 70 stores the guidance "Turn right" in the form of data. When the vehicle approaches within a certain distance of the corresponding guidance point on the driving route, based on the GPS signal, the control unit 80 may provide guidance to the driver by outputting the stored guidance.
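The distance-triggered announcement described above can be sketched as follows. This is a non-authoritative illustration: it assumes positions have already been projected to a local plane in meters (the disclosure only mentions GPS signals), and the function name and threshold are placeholders.

```python
import math

def should_announce(vehicle_pos, waypoint, threshold_m=50.0):
    """Fire the stored guidance (e.g. "Turn right") when the vehicle
    comes within threshold_m of the next guidance point.

    vehicle_pos, waypoint: (x, y) in meters on a local plane (assumed)."""
    dx = vehicle_pos[0] - waypoint[0]
    dy = vehicle_pos[1] - waypoint[1]
    return math.hypot(dx, dy) <= threshold_m
```

In use, the control unit 80 would evaluate this check on each GPS update and, when it returns true, output the guidance stored in the storage unit 70 via the speaker 60.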
Therefore, the storage unit 70 may be implemented by at least one of non-volatile memory devices, such as cache, read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), and flash memory, or volatile memory devices, such as random-access memory (RAM), or storage media, such as a hard disk drive (HDD) or CD-ROM, but the present invention is not limited thereto. The storage unit 70 may be a memory implemented as a chip separate from the processor described above in relation to the control unit 80 to be described later, or may be implemented as a chip integral with the processor.
The control unit 80 controls the overall operation of the navigation device 1.
Specifically, the control unit 80 performs a basic operation of navigation for controlling the display unit 50 and the speaker 60 in order to guide the driver along the driving route based on the map information and the GPS signal of the vehicle 5.
The control unit 80 may receive via the communication unit 30 the driving assistance information sent by the external terminal 10 for the driving image shared with the external terminal 10, and may display the received driving assistance information on the display unit 50 together with the driving image captured by the cameras 40 or driving route information, or may control the speaker 60 so that the driving assistance information may be outputted through the speaker 60.
In addition, when the vehicle 5 and the external terminal 10 are communicatively connected at the request of the external terminal 10, the control unit 80 may control such that the driving image of the vehicle sent by the communication unit 30 to the external terminal can be displayed also on the display unit 50 at the same time.
If the driving image of the vehicle sent to the external terminal 10 is displayed on the display unit 50, the driver may be provided with guidance much like a streaming service that receives road guidance while watching the same image as the user of the external terminal 10.
The driving assistance information refers to information inputted by the user of the external terminal 10 in response to the driving image and driving route information of the vehicle shared with the external terminal 10 via the communication unit 30. The driving assistance information received as input from the user of the external terminal for the image may refer to display information or voice information. The display information and the voice information may refer to travel information to travel in a particular direction.
Meanwhile, the control unit 80 may be implemented by a memory (not shown) that stores data for an algorithm for controlling the operation of the components of the navigation device 1 or a program that reproduces the algorithm, and a processor (not shown) that performs the operation described above using the data stored in the memory. In this case, the memory and the processor may each be implemented by separate chips. Alternatively, the memory and the processor may also be implemented by a single chip.
Further, although the control unit 80 and the communication unit 30 are illustrated as separate components in
The navigation device 1 may further include other components in addition to the components described above in
d are diagrams for illustrating an operation in which a navigation device and an external terminal are communicatively connected according to one embodiment. In order to avoid repetitive descriptions, they will be described together below.
Referring to
In more detail, when a request for location information is received from the external terminal 10 while traveling, the navigation device 1 may send the location information of the current vehicle to the external terminal 10. The navigation device 1 may perform wireless communication using an antenna provided in the vehicle 5, or may also communicate with the external terminal 10 using the communication unit 30, and the location information may include GPS signal information of the current vehicle.
Referring to
The navigation device 1 may also receive a location-sharing request from the external terminal 10, as shown in
As shown in
Referring to
Meanwhile, the screens of the first to third display areas 41 to 43 in
d are diagrams for illustrating an operation of sharing driving image information according to another embodiment. In order to avoid repetitive descriptions, they will be described together below.
Referring to
In more detail, the navigation device 1 may send a signal requesting camera connection to the external terminal 10, and, if the external terminal 10 agrees to this, may send the driving image and the image for the driving route information described above to the external terminal 10.
Referring to
The navigation device 1 may display the result of recognizing the voice input by the driver, that is, “Connect to the camera” in the text on the third display area 43. Thereafter, the navigation device 1 may send various image information to the external terminal 10.
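The voice-command handling described above can be sketched as a simple lookup from a recognized utterance to an action. The recognition itself (voice to text) is assumed to be done by the voice recognition algorithm mentioned earlier; the command table and action names below are hypothetical.

```python
# Hypothetical table mapping recognized utterances to device actions;
# "Connect to the camera" is the command used in the example above.
COMMANDS = {
    "connect to the camera": "CAMERA_CONNECT",
    "share my location": "LOCATION_SHARE",
}

def to_command(utterance):
    """Normalize a recognized utterance and look up the matching action.

    Returns None for utterances that are not registered commands."""
    return COMMANDS.get(utterance.strip().lower())
```

On a match, the device would both display the recognized text (as on the third display area 43) and carry out the mapped action.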
If the external user agrees to the connection, the driving image or the image for the driving information of the vehicle sent by the navigation device 1 may be displayed on the display of the external terminal 10.
When the navigation device 1 sends the driving image captured by the cameras 40 to the external terminal 10, the driving route of the vehicle may be displayed on the first display area 41 of the navigation device 1, the map information may be displayed on the second display area 42, and the driving image sent to the external terminal 10 may be simultaneously displayed on the fourth display area 45, as shown in
Meanwhile, the screens of the first to fourth display areas 41 to 44 and the user interface of the external terminal 10 noted in
Referring to
The navigation device 1 may simultaneously display the driving image of the vehicle captured by the camera 40, which is being sent to the external terminal 10, on the fourth display area 45. Image information that changes as the vehicle 5 moves may also be displayed together while changing in real-time. In such a case, since the image currently being viewed by the driver of the vehicle and the image being viewed by the user of the external terminal 10 are the same, there is an advantage that information on the driving direction can be easily obtained from an external user who knows the geography well.
Further, the navigation device 1 and the external terminal 10 may output the received voice information while sending and receiving voice information to and from each other. For example, the driver A may transmit the voice information "Which way should I go?" to the external terminal 10. The user B of the external terminal 10 may transmit a voice saying "Do you see the building on the left? Turn left there," and the navigation device 1 may output the received voice information via the speaker 60. The navigation device 1 then receives the voice of the driver A saying "I see" and transmits it to the external terminal 10.
Through this, since the driver can receive guidance conversationally, as if the user of the external terminal were giving directions from the seat next to the driver while both view the same screen in real time, there is an effect that the driver of the vehicle can drive to the destination more accurately.
As with
Although an example in which the first display information is transformed into an arrow has been given in
Referring to
When displaying on the fourth display area in a method of superimposing the driving assistance information over the driving image of the vehicle, as shown in
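The superimposing operation described above can be illustrated with a minimal alpha-blending sketch. The representation is an assumption made for illustration: a frame is modeled as a small grid of grayscale pixel values, and the "display information" from the external terminal is a set of annotated pixels (e.g. the stroke of a turn arrow) blended over the frame without altering the original image.

```python
def superimpose(frame, annotation, alpha=0.6):
    """Blend annotation pixels over the frame; alpha controls the
    opacity of the superimposed display information."""
    out = [row[:] for row in frame]  # copy so the raw driving image is kept
    for (x, y), value in annotation.items():
        out[y][x] = round(alpha * value + (1 - alpha) * frame[y][x])
    return out

frame = [[100] * 4 for _ in range(3)]   # dummy 4x3 grayscale driving image
arrow = {(1, 1): 255, (2, 1): 255}      # e.g. a "turn right" arrow stroke
blended = superimpose(frame, arrow)
```

A production implementation would of course blend full-color video frames (e.g. with an image library), but the principle of overlaying the assistance information while preserving the underlying driving image is the same.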
Although the driving assistance information received from the external terminal 10 has been described as being limited to the information received based on the driving image of the vehicle in
For example, if the information displayed in the second display area 42 in
Further, in the embodiment of
Referring to
Augmented reality (AR) refers to a technology that overlays a 3D virtual image on a real image or background and shows it as a single image. Augmented reality is also called Mixed Reality (MR).
When described with reference to
Referring to
When the external terminal 10 that has received the request signal approves the sharing, the navigation device 1 may transmit a driving image of the vehicle, image information including driving information of the vehicle, and the like to the external terminal 10 (S20).
Thereafter, the navigation device 1 may receive driving assistance information from the external terminal 10 (S30). As the detailed description of the driving assistance information has been described above, it will be omitted.
Thereafter, the navigation device 1 may display the driving assistance information on the vehicle driving image displayed on the display unit 50 together, or output it via the speaker 60 (S40).
The navigation device 1 may display the driving assistance information by superimposing it over the map information, but is not necessarily limited thereto, and the driving assistance information may be provided to the driver in various ways.
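The S10 to S40 sequence described above can be sketched as a simple request/response flow. This is a non-authoritative sketch: `send` and `receive` stand in for the communication unit 30, and the message shapes are assumptions made for illustration.

```python
def share_and_guide(send, receive, driving_image):
    """Walk through the S10-S40 flow with a cooperating external terminal."""
    send({"type": "share_request"})                         # S10: request sharing
    if not receive().get("approved"):
        return None                                         # terminal declined
    send({"type": "driving_image", "data": driving_image})  # S20: send the image
    assistance = receive()                                  # S30: assistance info
    return assistance                                       # S40: caller displays/outputs it

# A scripted external terminal, purely for illustration:
outbox = []
inbox = [{"approved": True}, {"overlay": "left-turn arrow"}]
result = share_and_guide(outbox.append, lambda: inbox.pop(0), b"frame")
```

Here the returned assistance would then be superimposed on the display unit 50 or, if it is voice information, output via the speaker 60.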
Specifically,
The server for providing a navigation service 100 according to
The server for providing a navigation service 100 that has received the driving assistance information may send the information generated based on the assistance information and the driving image for the vehicle to the user terminal 200 or the vehicle 300 that provides the navigation service to the driver. In this case, the server may generate a combined image in a method of superimposing the assistance information over the driving image and then send the generated combined image, and in sending the combined image, may send driving information, which is driving guidance service information for the vehicle, together. Since the method of utilizing the driving assistance information has been described in detail in the previous drawings, a description thereof will be omitted.
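The server-side combining step described above can be sketched as pairing the assistance input with the matching driving-image frame and forwarding both together with the route guidance. The function and field names below are assumptions for illustration, not part of the original disclosure.

```python
def combine(frame_id, driving_image, assistance, route_info):
    """Build the combined payload the server forwards to the vehicle
    or the user terminal providing the driving information service."""
    return {
        "frame_id": frame_id,      # which frame the assistance refers to
        "image": driving_image,    # the original driving image
        "overlay": assistance,     # e.g. arrow strokes or text to superimpose
        "route": route_info,       # driving route guidance sent alongside
    }
```

Keeping the original image and the overlay separate in the payload is one possible design; alternatively, the server could rasterize the overlay into the image before sending, as the superimposing description suggests.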
Further, the server for providing a navigation service 100 according to
As the server described in
On the other hand, the constitutional elements, units, modules, components, and the like described as "~ part" or "~ portion" in the present invention may be implemented together or individually as interoperable logic devices. Descriptions of different features of the modules, units, or the like are intended to emphasize functional embodiments different from each other, and do not necessarily mean that the embodiments must be realized by individual hardware or software components. Rather, the functions related to one or more modules or units may be performed by individual hardware or software components, or may be integrated in common or individual hardware or software components.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, declarative or procedural languages, and it can be deployed in any form, including as a standalone program or as a module, component, subroutine, object, or other unit suitable for use in a computing environment.
Additionally, the logic flows and structure block diagrams described in this patent document, which describe particular methods and/or corresponding acts in support of steps and corresponding functions in support of disclosed structural means, may also be utilized to implement corresponding software structures and algorithms, and equivalents thereof.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output.
This written description sets forth the best mode of the present invention and provides examples to describe the present invention and to enable a person of ordinary skill in the art to make and use the present invention. This written description does not limit the present invention to the specific terms set forth.
While the present invention has been shown and described with reference to certain embodiments thereof, it will be understood by those skilled in the art that various changes in forms and details may be made therein without departing from the spirit and scope of the present invention as defined by the appended claims and their equivalents. Therefore, the technical scope of the present invention may be determined based on the technical scope of the accompanying claims.
| Number | Date | Country | Kind |
|---|---|---|---|
| 10-2020-0114797 | Sep 2020 | KR | national |
| 10-2021-0021642 | Feb 2021 | KR | national |
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/KR2021/012185 | 9/8/2021 | WO | |