CN117215060A - AR display device for vehicle and operation method thereof - Google Patents


Info

Publication number
CN117215060A
Authority
CN
China
Prior art keywords
vehicle
graphical interface
information
processor
condition
Prior art date
Legal status
Pending
Application number
CN202310687460.5A
Other languages
Chinese (zh)
Inventor
崔秉埈
蔡知锡
孙正勋
金一完
朴钟兑
Current Assignee
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date
Filing date
Publication date
Priority claimed from PCT/KR2022/095145 external-priority patent/WO2023239002A1/en
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of CN117215060A publication Critical patent/CN117215060A/en


Landscapes

  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided are an AR display device for a vehicle and an operation method thereof. The AR display device for a vehicle according to the present invention includes: at least one interface module that receives data including a front image of the vehicle, position data including a current position of the vehicle, and map data for the current position of the vehicle; a processor that renders an AR graphical interface so as to overlap the front image, the AR graphical interface including a first AR object displaying a driving state of the vehicle based on the map data and a second AR object displaying guidance for the driving state of the vehicle; and a display that displays a navigation screen including the front image on which the AR graphical interface is superimposed according to the rendering, wherein the processor renders the AR graphical interface by selectively separating the second AR object from the first AR object or combining the second AR object with the first AR object on the navigation screen according to a next driving condition of the vehicle estimated based on the map data and the current position of the vehicle.

Description

AR display device for vehicle and operation method thereof
Technical Field
The present invention relates to an AR display device linked with a vehicle and an operation method thereof, and more particularly, to an AR display device capable of displaying in advance, in AR form in front of the vehicle, guidance for the next traveling condition predicted while the vehicle is traveling, and an operation method thereof.
Background
In order to improve the safety and convenience of vehicle users, various sensors and devices are provided in vehicles, and the functions of vehicles are becoming increasingly diversified. The functions of such a vehicle may be classified into convenience functions for improving the convenience of the driver and safety functions for improving the safety of the driver and/or pedestrians.
The convenience functions of a vehicle are developed with driver convenience in mind, such as infotainment (information + entertainment) functions given to the vehicle, partially autonomous driving functions, or functions that help secure the driver's view at night or in blind spots. Examples include active cruise control (ACC), smart parking assist system (SPAS), night vision (NV), head up display (HUD), around view monitor (AVM), and adaptive headlight system (AHS) functions.
The safety functions of a vehicle are technologies for ensuring the safety of the driver and/or pedestrians, and include a lane departure warning system (LDWS), a lane keeping assist system (LKAS), an autonomous emergency braking (AEB) function, and the like.
In recent years, technologies that output graphic objects through the windshield of a vehicle or a head up display (HUD), or that superimpose graphic objects on an image captured by a camera, thereby adding graphic objects to the real world as augmented reality (AR), have been actively developed. In particular, the development of technologies that use such augmented reality (AR) technology to guide a route to the driver is expanding further.
In the related art, however, even when such augmented reality (AR) technology is applied to route guidance in an AR travel mode, the existing travel guidance is simply displayed in AR form. For example, a travel direction change guide is displayed as an AR image only at a fixed specific position.
Therefore, it is difficult to distinguish the display from other AR features of the AR travel mode, and there is a limit to how intuitive the route guidance can be. In addition, a driver with insufficient driving experience may have difficulty driving accurately according to the guidance. This is true even when the travel direction change indication and the remaining distance value are displayed together. Therefore, research is needed on implementing a more intuitive and complete AR travel mode.
Disclosure of Invention
The present invention aims to solve the aforementioned problems and other problems.
According to some embodiments of the present invention, it is an object to provide an AR display device capable of providing a more intuitive and complete AR driving mode, and an operation method thereof.
Further, according to some embodiments of the present invention, it is an object to provide an AR display device that allows the next driving situation predicted at the current position of the vehicle to be observed intuitively and enables convenient and safe driving simply by following the displayed AR interface, and an operating method thereof.
In addition, according to some embodiments of the present invention, it is an object to provide an AR display device capable of providing various information with a single AR interface by freely switching its UX display through combination, separation, and deformation, and an operation method thereof.
Further, according to some embodiments of the present invention, it is an object to provide an AR display device capable of displaying, in a more coordinated and expanded UX, driving-related navigation or POI information in the AR driving mode together with the predicted next driving situation, and an operation method thereof.
To this end, the AR display device of the present invention, linked with a vehicle, displays the current running state of the vehicle during travel using an AR graphical interface, and can display guidance related to the predicted next running state on the navigation screen.
In this case, the AR display device may separate a part of the graphical interface a predetermined distance or a predetermined time before the predicted point and guide the predicted driving situation in advance.
At this time, the separated AR object changes in real time according to the current position of the vehicle and the road shape, and can be displayed on the navigation screen. In addition, the separated AR object may reflect and display the predicted traveling condition as well as navigation state information, setting information, surrounding POI information, and the like.
When this situation ends, the separated AR object may be displayed on the navigation screen as an AR graphical interface in a recombined state. In addition, during travel of the vehicle, the separation of the AR graphical interface into AR objects for guidance and the subsequent recombination may be repeated each time a specific running condition is predicted.
Specifically, the AR display device of the present invention may include: at least one interface module (communication module) that receives data including a front image of the vehicle acquired by a camera, map data for the current position of the vehicle, and position data including the current position of the vehicle; a processor that renders an AR graphical interface so as to overlap the front image, the AR graphical interface including a first AR object displaying the running state of the vehicle based on the map data and a second AR object displaying guidance for the running state of the vehicle; and a display that displays a navigation screen including the front image overlapped with the AR graphical interface according to the rendering. Here, the processor may change and render the AR graphical interface by selectively separating the second AR object from the first AR object or combining the second AR object with the first AR object on the navigation screen, according to a next driving condition of the vehicle estimated based on the map data and the current position of the vehicle.
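As an illustration only, the following minimal Python sketch models one rendering pass of this separate-or-combine behavior; the type names and the helper estimate_next_driving_condition are hypothetical and are not part of the disclosed device.
```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ARGraphicalInterface:
    first_ar_object: dict          # displays the current driving state of the vehicle
    second_ar_object: dict         # displays guidance for the next driving state
    separated: bool = False        # whether the second object is split off from the first

def estimate_next_driving_condition(map_data: dict, position: dict) -> Optional[str]:
    # Placeholder: a real implementation would look ahead on the route
    # (turn, roundabout, U-turn, destination, ...) using the map and position data.
    return map_data.get("upcoming_event")

def update_ar_interface(ui: ARGraphicalInterface, map_data: dict, position: dict) -> ARGraphicalInterface:
    """One rendering pass: separate the guidance object when a next driving
    condition is predicted, recombine it when the condition has passed."""
    condition = estimate_next_driving_condition(map_data, position)
    if condition and not ui.separated:
        ui.separated = True
        ui.second_ar_object["guides"] = condition   # e.g. "left_turn_in_200m"
    elif condition is None and ui.separated:
        ui.separated = False                        # guidance finished: recombine
        ui.second_ar_object.pop("guides", None)
    return ui

# Usage
ui = ARGraphicalInterface(first_ar_object={"heading_deg": 0.0}, second_ar_object={})
ui = update_ar_interface(ui, {"upcoming_event": "left_turn_in_200m"}, {"lat": 0, "lon": 0})
print(ui.separated)  # True while the next driving condition is being guided
```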
In an embodiment, a vision sensor may also be included that provides data including a front image of the vehicle through the interface module.
In an embodiment, the vision sensor may include a camera or smart glasses (Smart Glass).
In an embodiment, the AR graphical interface may be separated at a second location that is a prescribed time or distance ahead of a first location at which the next driving condition of the vehicle begins.
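One way to picture this trigger is sketched below; the 6-second lead time and 150 m lead distance are arbitrary illustrative values, not values specified by the disclosure.
```python
def separation_trigger(distance_to_condition_m: float,
                       vehicle_speed_mps: float,
                       lead_time_s: float = 6.0,
                       lead_distance_m: float = 150.0) -> bool:
    """Return True when the vehicle has reached the 'second location', i.e. a
    prescribed time or distance ahead of the point where the next driving
    condition begins (the 'first location')."""
    time_to_condition = distance_to_condition_m / max(vehicle_speed_mps, 0.1)
    return distance_to_condition_m <= lead_distance_m or time_to_condition <= lead_time_s

# Example: 120 m before a turn at 50 km/h (about 13.9 m/s) -> trigger the separation
print(separation_trigger(120.0, 50 / 3.6))  # True
```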
In an embodiment, one of a first condition and a second condition is determined based on at least one of the map data, the position data, and navigation information of the vehicle, and the processor displays the AR graphical interface as follows: in the first condition, the AR graphical interface is displayed with the first AR object and the second AR object combined; in the second condition, the second AR object is separated from the first AR object, and the separated second AR object represents the next running condition.
In an embodiment, in the first condition, the processor may render a panel region including AR objects related to travel information of the vehicle so as to overlap a designated region of the front image, and in the second condition, the processor may update the rendering so that the panel region is hidden or displayed obliquely while the separated second AR object represents the next travel condition.
Thus, travel guidance based on the AR feature and the AR graphical interface can be displayed more cooperatively, the integrity of the AR travel mode is improved, and travel stability and convenience are provided.
In an embodiment, the next driving condition of the vehicle includes road shape information estimated based on the current position of the vehicle and at least one of the map data and the sensed data, and the processor may update the display so that a change corresponding to the road shape information is reflected in the AR graphical interface.
For example, the AR objects constituting the AR graphical interface may be moved in the three axis directions while remaining partly combined, so that the curvature or inclination of the road is reflected in the AR graphical interface.
In an embodiment, the AR graphical interface may be displayed as rotating in a first direction based on a vehicle steering value corresponding to the road shape information, and the processor may update the display so that the degree of twist between the first AR object and the second AR object forming the AR graphical interface increases as the vehicle steering value increases.
In an embodiment, the AR graphical interface may be displayed as rotating in a second direction based on a tilt value corresponding to the road shape information, and the processor may update the AR graphical interface so that the tilt direction and degree of the second AR object relative to the first AR object change according to the direction and degree of the tilt value.
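One possible mapping from the steering and tilt values to these two rotations is sketched below; the normalized steering range and the 45-degree twist cap are illustrative assumptions only.
```python
def twist_angle_deg(steering_value: float, max_twist_deg: float = 45.0) -> float:
    """Twist (first-direction rotation) between the first and second AR objects:
    grows with the magnitude of the steering value, capped at max_twist_deg."""
    steering_value = max(-1.0, min(1.0, steering_value))   # assumed normalized steering input
    return steering_value * max_twist_deg

def tilt_angle_deg(road_tilt_deg: float, gain: float = 1.0) -> float:
    """Second-direction rotation of the second AR object relative to the first,
    following the direction and degree of the road inclination."""
    return gain * road_tilt_deg

print(twist_angle_deg(0.5))    # 22.5 -> stronger steering, larger twist
print(tilt_angle_deg(-4.0))    # -4.0 -> a downhill grade tilts the guidance object forward
```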
In the present invention, the next running condition of the vehicle may include state information and setting information of navigation.
In an embodiment, when a route search is recognized as the next driving condition of the vehicle, the processor may separate the second AR object and display it as rotating about the first AR object while the first AR object displays the current driving state of the vehicle, and may update the display to an AR graphical interface in which the first AR object and the second AR object are combined when the route search ends.
In an embodiment, the processor may identify a driving warning condition of the vehicle based on sensing data of the vehicle and navigation information of the vehicle, and change the image of the AR graphical interface to include a notification of the driving warning condition.
Here, the driving warning condition may include a condition in which the vehicle exceeds or is very close to the speed limit, and a condition in which the vehicle is expected to collide with another vehicle or object.
In an embodiment, the processor may change at least one of the color, shape, blinking, and highlighting of the AR graphical interface upon identifying an overspeed state of the vehicle as the driving warning condition. Thus, the driver can intuitively recognize the warning condition.
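A minimal sketch of how the overspeed state could be mapped onto visual attributes is given below; the 95% "close to the limit" threshold and the specific colors are assumptions, not values from the disclosure.
```python
def warning_style(speed_kmh: float, limit_kmh: float) -> dict:
    """Map the overspeed state onto visual attributes of the AR graphical
    interface (color, blinking, highlight). Thresholds are illustrative."""
    if speed_kmh > limit_kmh:
        return {"color": "red", "blink": True, "highlight": True}      # overspeed
    if speed_kmh > limit_kmh * 0.95:
        return {"color": "orange", "blink": False, "highlight": True}  # close to the limit
    return {"color": "green", "blink": False, "highlight": False}      # normal driving

print(warning_style(108, 100))  # {'color': 'red', 'blink': True, 'highlight': True}
```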
In an embodiment, when turning travel is identified as the next travel condition of the vehicle based on the map data and the sensed data of the vehicle, the processor may update the rendering to separate the second AR object of the AR graphical interface at the second location and cause the separated second AR object to display at least one of the direction and the path of the turn, while the first AR object of the AR graphical interface displays the current travel direction of the vehicle.
Thus, the vehicle can safely and easily perform the turning travel only by moving along the guide of the second AR object.
In an embodiment, in response to the current position of the vehicle approaching within a prescribed distance of the second position, the processor may maintain the separated state of the second AR object and display it so as to guide the next path along which the vehicle is to travel, and in response to the current position of the vehicle passing the second position, the processor may update the display to an AR graphical interface in which the first AR object and the second AR object are combined.
At this time, the point in time at which the second AR object is separated and the period for which the separated state is maintained, the point in time at which the next path is guided, and the calibration of the guided path so that it coincides exactly with the navigation screen on which the second AR object is rendered are all important, so that the driver can easily follow the guided path.
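The small state machine below illustrates, under assumed thresholds (a 200 m prescribed distance), the separate-then-recombine behavior described above; it is a sketch, not the disclosed implementation.
```python
from enum import Enum

class GuidanceState(Enum):
    COMBINED = "combined"     # first and second AR objects shown as one
    SEPARATED = "separated"   # second AR object leads along the turn path

def turn_guidance_state(state: GuidanceState,
                        distance_to_second_location_m: float,
                        passed_second_location: bool) -> GuidanceState:
    """Keep the second AR object separated while the vehicle is within a
    prescribed distance of the second location; recombine once it has passed."""
    if state is GuidanceState.COMBINED and 0 < distance_to_second_location_m <= 200:
        return GuidanceState.SEPARATED           # start guiding the next path
    if state is GuidanceState.SEPARATED and passed_second_location:
        return GuidanceState.COMBINED            # turn finished: recombine
    return state

s = GuidanceState.COMBINED
s = turn_guidance_state(s, 180, False)   # -> SEPARATED
s = turn_guidance_state(s, 0, True)      # -> COMBINED
print(s)
```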
In an embodiment, when a roundabout section is identified as the next driving condition of the vehicle based on the map data and the navigation information of the vehicle, the processor may calculate a path corresponding to each exit included in the roundabout section, and may update the rendering so that the second AR object of the AR graphical interface is separated into a plurality of objects at the second location and the calculated paths are displayed at third locations corresponding to the respective exits.
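As a toy illustration of placing a guidance marker at each roundabout exit (the third locations), the sketch below assumes a circular roundabout with exit bearings measured clockwise from north; both assumptions are for illustration only.
```python
import math

def roundabout_exit_paths(center_xy, radius_m, exit_bearings_deg, target_exit_deg):
    """Compute a display point (a 'third location') for every exit of a
    roundabout and mark which one lies on the guided route."""
    paths = []
    for bearing in exit_bearings_deg:
        x = center_xy[0] + radius_m * math.sin(math.radians(bearing))
        y = center_xy[1] + radius_m * math.cos(math.radians(bearing))
        paths.append({
            "exit_bearing_deg": bearing,
            "display_point": (round(x, 1), round(y, 1)),
            "on_route": bearing == target_exit_deg,   # highlight the guided exit
        })
    return paths

# A four-exit roundabout, with the guided route leaving at the 270-degree exit
for p in roundabout_exit_paths((0.0, 0.0), 15.0, [0, 90, 180, 270], 270):
    print(p)
```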
In an embodiment, the processor may receive destination information from the navigation information of the vehicle through the at least one interface module (communication module), and, when the current location of the vehicle reaches a second location determined with reference to the first location corresponding to the received destination information, may update the display to separate the second AR object of the AR graphical interface and point the second AR object at the first location.
In an embodiment, the separated second AR object may move to the first location and change its image to include remaining distance information from the current location of the vehicle to the first location.
Thus, the driver can confirm the destination position more accurately. In addition, the display of the AR graphical interface continues even after the vehicle actually reaches the first location corresponding to the destination information. Therefore, unlike existing route guidance, in which the guidance screen ends when the destination is reached and a first-time visitor who does not know the exact destination position may be left flustered, such a situation does not occur.
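A minimal sketch of this destination behavior follows; the 300 m trigger distance and the straight-line distance calculation are simplifying assumptions.
```python
import math

def destination_guide(vehicle_xy, destination_xy, trigger_distance_m=300.0):
    """Once the vehicle is within the trigger distance of the destination (the
    'first location'), anchor the separated second AR object at the destination
    and label it with the remaining distance."""
    dx = destination_xy[0] - vehicle_xy[0]
    dy = destination_xy[1] - vehicle_xy[1]
    remaining_m = math.hypot(dx, dy)
    if remaining_m > trigger_distance_m:
        return None                                # keep the combined interface
    return {"anchor": destination_xy, "label": f"{remaining_m:.0f} m remaining"}

print(destination_guide((0.0, 0.0), (120.0, 50.0)))  # anchored marker with a distance label
```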
In an embodiment, in accordance with the current location of the vehicle passing the first location, the processor may render an updated AR graphical interface in which the first AR object and the second AR object are displayed combined until the next driving condition of the vehicle is detected.
In an embodiment, when a U-turn section is identified as the next driving condition of the vehicle based on the map data and navigation information of the vehicle, the processor may change the display so that the second AR object of the AR graphical interface is separated at the second position and points to the U-turn position, and may update the rendering so that the display of the separated second AR object changes in correspondence with the driving condition of the vehicle in the U-turn section.
The present invention also provides an operating method of an AR display device for a vehicle, including: a step of receiving data including a front image of the vehicle, position data including a current position of the vehicle, and map data for the current position of the vehicle; a step of rendering an AR graphical interface so as to overlap the front image, the AR graphical interface including a first AR object displaying a running state of the vehicle and a second AR object displaying guidance for the running state of the vehicle based on the map data; a step of displaying a navigation screen including the front image overlapped with the AR graphical interface according to the rendering; and a step of rendering the AR graphical interface by selectively separating the second AR object from the first AR object or combining the second AR object with the first AR object on the navigation screen according to a next driving condition of the vehicle estimated based on the map data and the current position of the vehicle.
In an embodiment, in the step of selectively separating the second AR object from the first AR object, the AR graphical interface is separated at a second location that is a prescribed time or distance ahead of a first location at which the next driving condition of the vehicle begins.
In an embodiment, one of a first condition and a second condition is determined based on at least one of the map data, the current position of the vehicle, and navigation information of the vehicle, and in the step of selectively separating the second AR object from the first AR object, the AR graphical interface is displayed as follows: in the first condition, the AR graphical interface is displayed with the first AR object and the second AR object combined; in the second condition, the second AR object is separated from the first AR object, and the separated second AR object represents the next running condition.
In an embodiment, in the first condition, a panel region including AR objects related to travel information of the vehicle is rendered so as to overlap a designated region of the front image, and in the second condition, the rendering is updated so that the panel region is hidden or displayed obliquely while the separated second AR object represents the next travel condition.
In an embodiment, the next driving condition of the vehicle includes road shape information estimated based on the current position of the vehicle and at least one of the map data and the sensed data of the vehicle, and the operating method further includes: a step of updating the display so that a change corresponding to the road shape information is reflected in the AR graphical interface.
In an embodiment, the method further includes: a step of displaying the AR graphical interface as rotating in a first direction based on a vehicle steering value corresponding to the road shape information, and updating the display so that the degree of twist between the first AR object and the second AR object forming the AR graphical interface increases as the vehicle steering value increases.
In an embodiment, the method further includes: a step of displaying the AR graphical interface as rotating in a second direction based on a tilt value corresponding to the road shape information, and updating the AR graphical interface so that the tilt direction and degree of the second AR object relative to the first AR object change according to the direction and degree of the tilt value.
In an embodiment, the method further includes: a step of, when a route search is recognized as the next driving condition of the vehicle, separating the second AR object and displaying it as rotating about the first AR object while the first AR object displays the current running state of the vehicle, and updating the display to an AR graphical interface in which the first AR object and the second AR object are combined when the route search ends.
In an embodiment, the method further includes: a step of changing the image of the AR graphical interface to include a notification of a driving warning condition of the vehicle identified based on the sensed data of the vehicle and the navigation information of the vehicle.
In an embodiment, at least one of the color, shape, blinking, and highlighting of the AR graphical interface is changed upon identifying an overspeed state of the vehicle as the driving warning condition.
In an embodiment, the method further includes: a step of, when turning travel is identified as the next travel condition of the vehicle based on the map data and the sensed data of the vehicle, updating the rendering to separate the second AR object of the AR graphical interface at the second position and cause the second AR object to display at least one of the direction and the path of the turn, while the first AR object of the AR graphical interface displays the current travel direction of the vehicle.
In an embodiment, the method further includes: a step of maintaining the separated state of the second AR object and displaying the next path to be traveled by the vehicle in response to the current position of the vehicle coming within a prescribed distance of the second position, and updating the display to an AR graphical interface in which the first AR object and the second AR object are combined in response to the current position of the vehicle passing the second position.
In an embodiment, the method further includes: a step of, when a roundabout section is identified as the next driving condition of the vehicle based on the map data and the navigation information of the vehicle, calculating a path corresponding to each exit included in the roundabout section, updating the rendering of the AR graphical interface to separate the second AR object of the AR graphical interface into a plurality of objects at the second position, and displaying the calculated paths at third positions corresponding to the respective exits.
In an embodiment, the method further includes: a step of receiving destination information from the navigation information of the vehicle, and, when the current position of the vehicle reaches the second position determined with reference to the first position corresponding to the received destination information, updating the display to separate the second AR object of the AR graphical interface and point the second AR object at the first position.
In an embodiment, the separated second AR object moves to the first location and changes its image to include remaining distance information from the current location of the vehicle to the first location.
In an embodiment, in accordance with the current position of the vehicle passing the first position, the rendering is updated to display an AR graphical interface in which the first AR object and the second AR object are combined until the next driving situation of the vehicle is detected.
In an embodiment, when a U-turn section is identified as the next driving condition of the vehicle based on the map data and the navigation information of the vehicle, the display is changed so that the second AR object of the AR graphical interface is separated at the second position and directed toward the U-turn position, and the rendering is updated so that the display of the separated second AR object changes in accordance with the driving condition of the vehicle in the U-turn section.
The AR display device for a vehicle and the operation method thereof of the present invention have the following effects.
According to the AR display device and the operating method thereof of some embodiments of the present invention, an augmented reality navigation screen based on a calibrated front image can be provided without additional setting, and the current position of the vehicle and guidance for the predicted driving condition are presented on the current navigation screen through AR objects, so that more intuitive and realistic AR guidance can be provided for the vehicle. In addition, information on the shape of the road is displayed on the AR graphical interface, thereby facilitating steering control of the vehicle.
In addition, according to the AR display device and the operating method thereof of some embodiments of the present invention, the current driving state and the next driving state are guided simultaneously by separating, deforming, and combining a single AR graphical interface, and the separated AR object guides travel while moving together with the vehicle instead of being displayed at a fixed position, so that following the guided travel is easier, safer, and more convenient.
In addition, according to the AR display device and the operating method thereof of some embodiments of the present invention, an AR graphical interface reflecting the predicted driving condition as well as navigation state information, setting information, surrounding POI information, and the like is provided, so that a more coordinated and expanded AR driving mode can be experienced.
Drawings
Fig. 1 is a diagram showing an example of a vehicle related to an embodiment of the present invention.
Fig. 2 is a view of a vehicle related to an embodiment of the present invention from various angles.
Fig. 3 and 4 are diagrams showing the interior of a vehicle relating to an embodiment of the present invention.
Fig. 5 and 6 are diagrams for explaining various objects related to the running of the vehicle according to the embodiment of the present invention.
Fig. 7 is a block diagram for explaining a vehicle and an AR display device related to an embodiment of the present invention.
Fig. 8 is a detailed block diagram related to a processor of an AR display device related to an embodiment of the present invention.
Fig. 9 is a diagram for explaining a navigation screen according to an embodiment of the present invention, and fig. 10 is a diagram for explaining an operation of generating the navigation screen of fig. 9.
Fig. 11 is a flowchart for explaining a method of displaying an AR graphical interface of an embodiment of the present invention on a navigation screen.
Fig. 12A and 12B are examples of an AR graphical interface according to an embodiment of the present invention, which are diagrams for explaining separation and combination of a first AR object and a second AR object.
Fig. 13A, 13B, 14A, 14B, 14C, 15A, 15B, 16A, 16B, 16C, 16D, 17A, 17B, 17C, 18, 19A, 19B, 20, 21, 22, 23A, 23B, 24, and 27 are diagrams showing various examples related to rendering on a navigation screen an AR graphical interface according to an embodiment of the present invention by changing the AR graphical interface differently according to a predicted driving situation.
Fig. 25A, 25B, and 26 are exemplary diagrams related to providing peripheral information together as an intuitive AR UX when an AR graphical interface of an embodiment of the present invention is displayed on a navigation screen.
Detailed Description
With reference to the accompanying drawings, a description will now be given in detail according to the exemplary embodiments disclosed herein. For brevity of description with reference to the drawings, the same or equivalent components are given the same or similar reference numerals, and their description is not repeated. In general, suffixes such as "module" and "unit" may be used to refer to elements or components; such suffixes are used only to aid the description of this specification, and the suffixes themselves carry no particular meaning or function. In this disclosure, descriptions of matters well known to those of ordinary skill in the relevant art have generally been omitted for brevity. The drawings are intended to aid easy understanding of the various features, and it should be understood that the embodiments are not limited to the drawings presented herein. As such, the disclosure should be interpreted to extend to any modifications, equivalents, and alternatives beyond those specifically set forth in the drawings.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another element.
It will be understood that when an element is referred to as being "connected" to another element, it can be connected to the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected" to another element, there are no intervening elements present.
Singular references may include plural references unless their meaning clearly differs depending on the context. Terms such as "comprising" or "having" used herein should be understood as indicating the presence of the several components, functions, or steps disclosed in this specification, and more or fewer components, functions, or steps may likewise be utilized.
The vehicle described in this specification may be a concept including an automobile and a motorcycle. Hereinafter, the description will mainly be given of an automobile as the vehicle.
The vehicle described in this specification may be a concept including an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, an electric vehicle having an electric motor as a power source, and the like.
In the following description, the left side of the vehicle refers to the left side in the traveling direction of the vehicle, and the right side of the vehicle refers to the right side in the traveling direction of the vehicle.
The "system" disclosed in the present specification may include at least one of a server device and a cloud device, but is not limited thereto. For example, the system may be constituted by more than one server apparatus. As another example, the system may be composed of more than one cloud device. As another example, the system may be configured and operated by a server device and a cloud device together.
In the present invention, "user terminal" or "user client" may refer to a computing device and/or system, including a vehicle (or an electrical component, device/system, etc. provided to the vehicle) and a user terminal or the like, which communicates with an AR display device/system, or the user itself.
The "map information" disclosed in the present specification may mean a meaning including an image photographed by a visual sensor such as a camera, two-dimensional map information, three-dimensional map information, a Digital Twin (Digital Twin) three-dimensional map, map information on a real/virtual space.
The "POI (Point of Interest, POI) information" disclosed in the present specification is a place of interest selected based on the map information, and may include pre-registered POI information (POI of a map stored in a cloud server), user-set POI information (e.g., i's home, school, company, etc.), travel-related POI information (e.g., destination, via-site, gas station, service area, parking lot, etc.), upper-level search POI information (e.g., POI with a large number of recent click accesses, hot area, etc.). Such POI information may be updated in real time with reference to the current location of the vehicle.
Fig. 1 and 2 are external views of a vehicle related to an embodiment of the present invention, and fig. 3 and 4 are views showing an interior of the vehicle related to the embodiment of the present invention.
Fig. 5 to 6 are diagrams showing various objects related to the running of the vehicle of the embodiment of the present invention.
Fig. 7 is a block diagram for explaining a vehicle related to an embodiment of the present invention.
Referring to fig. 1 to 7, a vehicle 100 may include: a wheel rotated by a power source; and a steering input device 510 for adjusting the traveling direction of the vehicle 100.
The vehicle 100 may be an autonomous vehicle. The vehicle 100 may switch to an automatic driving mode or a manual mode based on user input. For example, the vehicle 100 may switch from the manual mode to the automatic driving mode or from the automatic driving mode to the manual mode based on a user input received through a user interface device (hereinafter, referred to as "user terminal") 200.
The vehicle 100 may switch to the automatic driving mode or the manual mode based on the running condition information. The running condition information may be generated based on the object information supplied from the object detection device 300. For example, the vehicle 100 may switch from the manual mode to the automatic driving mode or from the automatic driving mode to the manual mode based on the running condition information generated in the object detection device 300. For example, the vehicle 100 may switch from the manual mode to the automatic driving mode or from the automatic driving mode to the manual mode based on the travel condition information received through the communication device 400.
The vehicle 100 may switch from the manual mode to the automatic driving mode or from the automatic driving mode to the manual mode based on information, data, signals provided from an external device.
In the case where the vehicle 100 is operated in the autonomous driving mode, the autonomous vehicle 100 may be operated based on the operation system 700. For example, the autonomous vehicle 100 may operate based on information, data, or signals generated in the driving system 710, the departure system 740, and the parking system 750.
In the case where the vehicle 100 is operated in the manual mode, the autonomous vehicle 100 may receive a user input for driving through the driving operation device 500. Based on the user input received through the driving operation device 500, the vehicle 100 may be operated.
The overall length refers to the length from the front portion to the rear portion of the vehicle 100, the overall width refers to the width of the vehicle 100, and the overall height refers to the length from the lower portion of the wheels to the roof. In the following description, the overall length direction L may refer to the direction in which the overall length of the vehicle 100 is measured, the overall width direction W may refer to the direction in which the overall width of the vehicle 100 is measured, and the overall height direction H may refer to the direction in which the overall height of the vehicle 100 is measured.
As shown in fig. 7, the vehicle 100 may include a user interface device (hereinafter, may be referred to as a "user terminal") 200, an object detection device 300, a communication device 400, a driving operation device 500, a vehicle driving device 600, an operation system 700, a navigation system 770, a sensing portion 120, a vehicle interface portion 130, a memory 140, a control portion 170, and a power supply portion 190.
According to the embodiment, the vehicle 100 may include other components in addition to the components described in the present specification, or may not include some of the components described.
The user interface device 200 is a device for communication between the vehicle 100 and a user. The user interface device 200 receives user input and may provide information generated in the vehicle 100 to a user. The vehicle 100 may implement a UI (User Interfaces) or UX (User experiences) through a User interface device (hereinafter, may be referred to as a "User terminal") 200.
The user interface device 200 may include an input 210, an internal camera 220, a biometric sensing 230, an output 250, and a processor 270. According to the embodiment, the user interface device 200 may include other constituent elements in addition to the constituent elements described, or may not include some of the constituent elements described.
The input 210 is used to receive information from a user, and the data collected in the input 210 is analyzed by the processor 270 and can be processed into control commands for the user.
The input portion 210 may be disposed inside the vehicle. For example, the input unit 210 may be disposed in an area of a steering wheel, an area of an instrument panel, an area of a seat, an area of each pillar, an area of a door, an area of a center console, an area of a roof panel (head lining), an area of a sun visor, an area of a windshield, an area of a window, or the like.
The input part 210 may include a voice input part 211, a gesture input part 212, a touch input part 213, and a mechanical input part 214.
The voice input section 211 may convert a voice input of a user into an electrical signal. The converted electrical signals may be provided to the processor 270 or the control portion 170. The voice input 211 may include more than one microphone.
The gesture input section 212 may convert gesture input of a user into an electrical signal. The converted electrical signals may be provided to the processor 270 or the control portion 170.
The gesture input part 212 may include at least one of an infrared sensor and an image sensor for sensing gesture input of a user. According to an embodiment, gesture input 212 may sense a three-dimensional gesture input of a user. To this end, the gesture input part 212 may include a light output part outputting a plurality of infrared lights or a plurality of image sensors.
The gesture input unit 212 may sense a three-dimensional gesture input of the user by a TOF (Time of Flight) method, a Structured light (Structured light) method, or a parallax (Disparity) method.
The touch input part 213 may convert a touch input of a user into an electrical signal. The converted electrical signals may be provided to the processor 270 or the control portion 170.
The touch input part 213 may include a touch sensor for sensing a touch input of a user. According to the embodiment, the touch input part 213 is integrally formed with the display part 251, so that a touch screen can be implemented. Such a touch screen may together provide an input interface and an output interface between the vehicle 100 and a user.
The mechanical input 214 may include at least one of a button, a dome switch (dome switch), a scroll wheel, and a scroll wheel switch. The electrical signal generated by the mechanical input 214 may be provided to the processor 270 or the control 170. The mechanical input 214 may be configured at a steering wheel, center fascia, center console, cockpit module, door, or the like.
The interior camera 220 may acquire an image of the interior of the vehicle. The processor 270 may sense a state of the user based on the vehicle interior image. The processor 270 may obtain line-of-sight information of the user from the vehicle interior image. Processor 270 may sense gestures of a user from an image of the interior of the vehicle.
The biometric sensing unit 230 may acquire biometric information of the user. The biometric sensing unit 230 includes a sensor capable of acquiring biometric information of the user, and the sensor can be used to acquire fingerprint information, heart rate information, and the like of the user. The biometric information may be used for user authentication.
The output section 250 is used to generate an output related to a sense of sight, sense of hearing, sense of touch, or the like. The output part 250 may include at least one of a display part 251, a sound output part 252, and a haptic output part 253.
The display part 251 may display a graphic object corresponding to various information. The display part 251 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a three-dimensional display (3D display), and an electronic ink display (e-ink display).
The display portion 251 and the touch input portion 213 are formed in a layer structure or integrally formed, so that a touch screen can be realized.
The display portion 251 may be implemented as a HUD (Head Up Display). In the case where the display part 251 is implemented as a HUD, the display part 251 has a projection module, and information can be output through an image projected onto the windshield or a window.
The display part 251 may include a transparent display. The transparent display may be attached to the windshield or a window. The transparent display has a predetermined transparency and can display a predetermined screen. To have transparency, the transparent display may include at least one of a transparent TFEL (Thin Film Electroluminescent) display, a transparent OLED (Organic Light-Emitting Diode) display, a transparent LCD (Liquid Crystal Display), a transmissive transparent display, and a transparent LED (Light Emitting Diode) display. The transparency of the transparent display can be adjusted.
On the other hand, the user interface device 200 may include a plurality of display parts 251a to 251g.
The display portion 251 may be disposed in an area of the steering wheel, areas 251a, 251b, 251e of the dashboard, an area 251d of the seat, an area 251f of each pillar, an area 251g of the door, an area of the center console, an area of the roof panel, or an area of the sun visor, or may be implemented in an area 251c of the windshield or an area 251h of the window.
The sound output section 252 converts the electric signal supplied from the processor 270 or the control section 170 into an audio signal and outputs the audio signal. For this purpose, the sound output section 252 may include one or more speakers.
The haptic output section 253 generates a haptic output. For example, the haptic output section 253 may vibrate the steering wheel, the seat belt, and the seats 110FL, 110FR, 110RL, and 110RR so that the user can recognize the output.
A processor (hereinafter, may be referred to as a "control unit") 270 may control the overall actions of the respective units of the user interface device 200. Depending on the embodiment, the user interface device 200 may or may not include a plurality of processors 270.
In the case where the processor 270 is not included in the user interface device 200, the user interface device 200 may operate under the control of the processor or the control portion 170 of other devices in the vehicle 100.
On the other hand, the user interface device 200 may be named as a display device for a vehicle. The user interface device 200 can operate under the control of the control unit 170.
The object detection device 300 is a device for detecting an object located outside the vehicle 100. The objects may be various objects related to the operation of the vehicle 100. Referring to fig. 5 to 6, the object O may include a lane OB10, other vehicles OB11, pedestrians OB12, two-wheelers OB13, traffic signals OB14, OB15, light, roads, structures, speed bumps, terrains, animals, and the like.
The Lane (Lane) OB10 may be a driving Lane, a side Lane of the driving Lane, or a Lane in which vehicles travel in opposite directions. The Lane (Lane) OB10 may be a concept including left and right side lines (lines) forming the Lane (Lane).
The other vehicle OB11 may be a vehicle that is traveling around the vehicle 100. The other vehicles may be vehicles that are within a prescribed distance from the vehicle 100. For example, the other vehicle OB11 may be a vehicle that is traveling ahead or behind the vehicle 100.
The pedestrian OB12 may be a person located at the periphery of the vehicle 100. Pedestrian OB12 may be a person located within a prescribed distance from vehicle 100. For example, pedestrian OB12 may be a person located on a sidewalk or a roadway.
The two-wheeled vehicle OB13 may be a vehicle that is located around the vehicle 100 and moves using two wheels. The two-wheeled vehicle OB13 may be a vehicle having two wheels located within a prescribed distance from the vehicle 100. For example, the two-wheeled vehicle OB13 may be a motorcycle or a bicycle located on a sidewalk or a roadway.
The traffic signals may include traffic lights OB15, traffic indication boards OB14, patterns or text drawn on the road surface.
The light may be light generated from a lamp provided in another vehicle. The light may be light generated from a street lamp. The light may be sunlight.
The road may include a road surface, a curve, a slope such as an uphill slope, a downhill slope, etc.
The structure may be an object located at the periphery of the road and fixed to the ground. For example, structures may include street lights, roadside trees, buildings, utility poles, signal lights, bridges.
The terrain may include hills, hillsides, etc.
On the other hand, objects can be classified into moving objects and fixed objects. For example, the moving object may be a concept including other vehicles, pedestrians. For example, the stationary object may be a concept including traffic signals, roads, structures.
The object detection device 300 may include a camera 310, a radar 320, a lidar 330, an ultrasonic sensor 340, an infrared sensor 350, and a processor 370.
According to the embodiment, the object detection device 300 may include other components in addition to the described components, or may not include some of the described components.
To acquire an image of the exterior of the vehicle, the camera 310 may be located at an appropriate position on the exterior of the vehicle. The camera 310 may be a single camera, a stereo camera 310a, an AVM (Around View Monitoring: look around monitoring) camera 310b, or a 360 degree camera.
For example, in order to acquire an image in front of the vehicle, the camera 310 may be disposed in a room of the vehicle at a position close to the front windshield. Alternatively, the camera 310 may be disposed around the front bumper or radiator grille.
For example, in order to acquire an image of the rear of the vehicle, the camera 310 may be disposed in the interior of the vehicle at a position near the rear glass. Alternatively, the camera 310 may be disposed around a rear bumper, trunk, or tailgate.
For example, to acquire an image of a side of the vehicle, the camera 310 may be disposed in a room of the vehicle near at least one side window. Alternatively, the camera 310 may be disposed around a rear view mirror, a fender, or a door.
Camera 310 may provide the acquired image to processor 370.
The radar 320 may include an electromagnetic wave transmitting portion and a receiving portion. The radar 320 may be implemented as a pulse radar (Pulse Radar) or a continuous wave radar (Continuous Wave Radar) according to the principle of radio wave emission. In the continuous wave radar system, the radar 320 may be implemented in an FMCW (Frequency Modulated Continuous Wave) system or an FSK (Frequency Shift Keying) system according to the signal waveform.
The radar 320 may detect an object based on TOF (Time of Flight) method or phase-shift method using electromagnetic waves as a medium, and may detect a position of the detected object, a distance from the detected object, and a relative speed.
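As a worked illustration of time-of-flight ranging (not taken from the patent), the distance is half the round-trip time multiplied by the propagation speed, and two successive distances give a relative speed:
```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Time-of-flight ranging: the wave travels to the object and back,
    so the distance is half the round-trip time times the propagation speed."""
    return C * round_trip_time_s / 2.0

def relative_speed_mps(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Relative speed from two successive distance measurements taken dt_s apart
    (negative means the object is approaching)."""
    return (d2_m - d1_m) / dt_s

d1 = tof_distance_m(400e-9)          # about 60 m
d2 = tof_distance_m(390e-9)          # about 58.5 m, measured 0.1 s later
print(round(d1, 1), round(relative_speed_mps(d1, d2, 0.1), 1))
```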
The radar 320 may be disposed at an appropriate location outside the vehicle in order to sense an object located in front of, behind, or beside the vehicle.
The laser radar 330 may include a laser transmitting section and a receiving section. Lidar 330 may be implemented in TOF (Time of Flight) or phase-shift (phase-shift) fashion.
Lidar 330 may be implemented as either driven or non-driven.
In the case of being driven, the lidar 330 is rotated by a motor, and an object around the vehicle 100 can be detected.
When the laser radar 330 is not driven, the laser radar 330 can detect an object located within a predetermined range with respect to the vehicle 100 by using light steering. The vehicle 100 may include a plurality of non-driven lidars 330.
The lidar 330 may detect an object based on TOF (Time of Flight) method or phase-shift method using laser light as an optical medium, and may detect a position of the detected object, a distance from the detected object, and a relative speed.
In order to sense objects located in front of, behind, or to the side of the vehicle, the lidar 330 may be disposed at an appropriate location outside of the vehicle.
The ultrasonic sensor 340 may include an ultrasonic transmitting unit and a receiving unit. The ultrasonic sensor 340 detects an object based on ultrasonic waves, and can detect a position of the detected object, a distance from the detected object, and a relative speed.
The ultrasonic sensor 340 may be disposed at an appropriate position outside the vehicle in order to sense an object located in front of, behind, or beside the vehicle.
The infrared sensor 350 may include an infrared transmitting portion and a receiving portion. The infrared sensor 350 detects an object based on infrared light, and can detect a position of the detected object, a distance from the detected object, and a relative speed.
The infrared sensor 350 may be disposed at an appropriate position outside the vehicle in order to sense an object located in front of, behind, or beside the vehicle.
The processor 370 may control the overall actions of the various units of the object detection device 300.
Processor 370 may detect and track objects based on the acquired images. Processor 370 may perform actions such as distance calculations with objects, relative velocity calculations with objects, etc. through image processing algorithms.
The processor 370 may detect and track the object based on the reflected electromagnetic wave returned by the transmitted electromagnetic wave being reflected by the object. The processor 370 may perform actions such as distance calculation with the object, relative speed calculation with the object, and the like based on the electromagnetic waves.
Processor 370 may detect and track the object based on reflected laser light returned by the transmitted laser light reflected by the object. Processor 370 may perform distance calculation with the object, relative velocity calculation with the object, and the like based on the laser light.
Processor 370 may detect and track the object based on the reflected ultrasonic wave returned by the transmitted ultrasonic wave being reflected by the object. Processor 370 may perform actions such as distance calculations with the object, relative velocity calculations with the object, and the like based on the ultrasound waves.
Processor 370 may detect and track the object based on reflected infrared light returned by the transmitted infrared light reflected by the object. Processor 370 may perform actions such as distance calculations with objects, relative velocity calculations with objects, and the like based on the infrared light.
The object detection device 300 may include a plurality of processors 370 or may not include a processor 370, depending on the embodiment. For example, the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 may each include a processor.
In the case where the object detection device 300 does not include the processor 370, the object detection device 300 may operate in accordance with the control of the processor or the control unit 170 of the device in the vehicle 100.
The object detection device 300 may operate according to the control of the control unit 170.
The communication apparatus 400 is an apparatus for performing communication with an external device. Here, the external device may be another vehicle, a mobile terminal, or a server.
To perform communication, the communication apparatus 400 may include at least one of a transmitting antenna, a receiving antenna, an RF (radio Frequency) circuit capable of implementing various communication protocols, and an RF element.
The communication apparatus 400 may include a short-range communication part 410, a location information part 420, a V2X communication part 430, an optical communication part 440, a broadcast transceiver part 450, and a processor 470.
According to the embodiment, the communication apparatus 400 may include other constituent elements in addition to the constituent elements described, or may not include some of the constituent elements described.
The short-range communication unit 410 is a unit for short-range communication (Short Range Communication). The short-range communication unit 410 may support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), Infrared Data Association (IrDA), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.
The short-range communication section 410 can perform short-range communication between the vehicle 100 and at least one external device by forming a short-range wireless communication network (Wireless Area Networks).
The position information unit 420 is a unit for acquiring position information of the vehicle 100. For example, the location information part 420 may include a GPS (Global Positioning System: global positioning System) module or a DGPS (Differential Global Positioning System: differential Global positioning System) module.
The V2X communication section 430 is a unit for performing wireless communication with a server (V2I: Vehicle to Infrastructure), other vehicles (V2V: Vehicle to Vehicle), or pedestrians (V2P: Vehicle to Pedestrian). The V2X communication section 430 may include an RF circuit capable of implementing protocols for communication with infrastructure (V2I), communication between vehicles (V2V), and communication with pedestrians (V2P).
The optical communication unit 440 is a unit for performing communication with an external device using light as a medium. The optical communication section 440 may include: an optical transmission unit that converts the electrical signal into an optical signal and transmits the optical signal to the outside; and a light receiving section that converts the received optical signal into an electrical signal.
According to an embodiment, the light transmitting portion may be integrally formed with a lamp included in the vehicle 100.
The broadcast transmitting/receiving unit 450 is a unit for receiving a broadcast signal from an external broadcast management server or transmitting a broadcast signal to the broadcast management server through a broadcast channel. The broadcast channels may include satellite channels, terrestrial channels. The broadcast signals may include TV broadcast signals, wireless broadcast signals, and data broadcast signals.
Processor 470 may control the overall actions of the various elements of communication device 400.
Depending on the embodiment, the communication device 400 may include a plurality of processors 470 or may not include a processor 470.
In the case where the processor 470 is not included in the communication device 400, the communication device 400 may operate under the control of a processor of another device in the vehicle 100 or under the control of the control unit 170.
On the other hand, the communication device 400 may implement a display device for a vehicle together with the user interface device 200. In this case, the display device for a vehicle may be referred to as a telematics device or an AVN (Audio Video Navigation) device.
The communication device 400 can operate under the control of the control unit 170.
The driving operation device 500 is a device that receives user input for driving.
In the case of the manual mode, the vehicle 100 may operate based on the signal provided by the driving operation device 500.
The driving operation device 500 may include a steering input device 510, an acceleration input device 530, and a brake input device 570.
The steering input device 510 may receive a travel direction input of the vehicle 100 from a user. The steering input device 510 is preferably formed in a wheel shape so as to be able to perform steering input by rotation. The steering input device may also be formed in the form of a touch screen, a touch pad, or buttons, according to an embodiment.
The acceleration input device 530 may receive input for acceleration of the vehicle 100 from a user. The brake input device 570 may receive input from a user for deceleration of the vehicle 100. The acceleration input device 530 and the brake input device 570 are preferably formed in a pedal configuration. According to an embodiment, the acceleration input device or the brake input device may also be formed in the form of a touch screen, a touch pad or buttons.
The driving operation device 500 can be operated under the control of the control unit 170.
The vehicle driving device 600 is a device that electrically controls driving of various devices in the vehicle 100.
The vehicle driving device 600 may include a power transmission driving part 610, a chassis driving part 620, a door/window driving part 630, a safety device driving part 640, a lamp driving part 650, and an air conditioner driving part 660.
According to the embodiment, the vehicle driving device 600 may include other constituent elements in addition to the constituent elements described, or may not include some of the constituent elements described.
On the other hand, the vehicle driving device 600 may include a processor. Each of the individual units of the vehicle drive device 600 may include a processor, respectively.
The power transmission driving part 610 may control the operation of the power transmission device.
The power transmission drive portion 610 may include a power source drive portion 611 and a transmission drive portion 612.
The power source drive portion 611 may perform control of the power source of the vehicle 100.
For example, in the case where a fossil fuel-based engine is the power source, the power source driving section 611 may perform electronic control of the engine. Thus, the output torque of the engine and the like can be controlled. The power source drive portion 611 may adjust the engine output torque according to the control of the control portion 170.
For example, in the case where an electric-energy-based motor is the power source, the power source driving section 611 may perform control of the motor. The power source driving section 611 may adjust the rotation speed, torque, etc. of the motor according to the control of the control unit 170.
The transmission driving section 612 may perform control of the transmission. The transmission driving section 612 can adjust the state of the transmission. The transmission drive 612 may adjust the state of the transmission to forward D, reverse R, neutral N, or park P.
On the other hand, in the case where the engine is a power source, the transmission driving portion 612 may adjust the engagement state of the gears in the forward D state.
The chassis driving unit 620 may control the operation of the chassis device. The chassis driving part 620 may include a steering driving part 621, a braking driving part 622, and a suspension driving part 623.
The steering drive unit 621 can perform electronic control of the steering device (steering apparatus) in the vehicle 100. The steering drive section 621 may change the traveling direction of the vehicle.
The brake driving portion 622 may perform electronic control on a brake device (brake apparatus) in the vehicle 100. For example, the speed of the vehicle 100 may be reduced by controlling the operation of brakes provided to the wheels.
On the other hand, the brake driving section 622 may control each of the plurality of brakes separately. The brake driving part 622 may control braking forces applied to the plurality of wheels differently from each other.
The suspension driving unit 623 may perform electronic control on the suspension device (suspension apparatus) in the vehicle 100. For example, in the case where the road surface is curved, the suspension driving portion 623 may reduce the vibration of the vehicle 100 by controlling the suspension device. On the other hand, the suspension driving section 623 may control each of the plurality of suspensions, respectively.
The door/window driving part 630 may perform electronic control on a door apparatus (door apparatus) or a window apparatus (window apparatus) within the vehicle 100.
The door/window driving part 630 may include a door driving part 631 and a window driving part 632.
The door driving part 631 may perform control of the door apparatus. The door driving unit 631 can control the opening and closing of a plurality of doors included in the vehicle 100. The door driving part 631 may control the opening or closing of a trunk (trunk) or a tail gate (tail gate). The door driving part 631 may control the opening or closing of a sunroof (sunroof).
The window driving part 632 may perform electronic control on a window apparatus (window apparatus). The window driving part 632 may control the opening or closing of a plurality of windows included in the vehicle 100.
The safety device driving section 640 may perform electronic control of various safety devices (safety devices) in the vehicle 100.
The safety device driving part 640 may include an airbag driving part 641, a seat belt driving part 642, and a pedestrian protection device driving part 643.
The air bag driving portion 641 may perform electronic control of an air bag device (air bag device) in the vehicle 100. For example, the airbag driving part 641 may control to deploy the airbag when a hazard is sensed.
The seat belt drive unit 642 can electronically control the seat belt apparatus (seatbelt apparatus) in the vehicle 100. For example, the seat belt drive portion 642 may control the seat belts to fix the occupants to the seats 110FL, 110FR, 110RL, 110RR when a hazard is sensed.
The pedestrian protection apparatus driving portion 643 may perform electronic control of the hood lifter and the pedestrian airbag. For example, the pedestrian protection apparatus driving section 643 may control the hood to lift up and the pedestrian airbag to deploy when a collision with a pedestrian is sensed.
The lamp driving part 650 may perform electronic control on various lamp devices (lamps) within the vehicle 100.
The air conditioner driving unit 660 may perform electronic control of an air conditioner (air conditioner) in the vehicle 100. For example, when the temperature of the vehicle interior is high, the air conditioner driving unit 660 may control to supply cool air to the vehicle interior by operating the air conditioner.
The vehicle drive device 600 may include a processor. Each of the individual units of the vehicle drive device 600 may include a processor, respectively.
The vehicle driving device 600 can operate under the control of the control unit 170.
The operation system 700 is a system that controls various operations of the vehicle 100. The operation system 700 may be operated in an autonomous driving mode.
The operation system 700 may include a driving system 710, an alighting system 740, and a parking system 750.
According to an embodiment, the operation system 700 may include other constituent elements in addition to the illustrated constituent elements, or may not include a part of the illustrated constituent elements.
On the other hand, the operation system 700 may include a processor. Each of the various units of the operation system 700 may each include a processor.
On the other hand, according to the embodiment, in the case where the operation system 700 is implemented by software, it may be a lower concept of the control section 170.
On the other hand, according to an embodiment, the operation system 700 may be a concept including at least one of the user interface device 200, the object detection device 300, the communication device 400, the vehicle driving device 600, and the control section 170.
The driving system 710 may perform traveling of the vehicle 100.
The driving system 710 may receive navigation information from the navigation system 770 and provide control signals to the vehicle driving apparatus 600 so that traveling of the vehicle 100 may be performed. The driving system 710 may receive the object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 so that the traveling of the vehicle 100 may be performed. The driving system 710 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600, so that the traveling of the vehicle 100 may be performed.
The alighting system 740 may perform alighting of the vehicle 100.
The alighting system 740 may receive navigation information from the navigation system 770 and provide control signals to the vehicle driving apparatus 600 so that alighting of the vehicle 100 may be performed. The alighting system 740 may receive the object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 so that alighting of the vehicle 100 may be performed. The alighting system 740 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600, so that alighting of the vehicle 100 may be performed.
Parking system 750 may perform parking of vehicle 100.
The parking system 750 may receive navigation information from the navigation system 770 and provide control signals to the vehicle driving apparatus 600 so that parking of the vehicle 100 may be performed. The parking system 750 may receive the object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 so that parking of the vehicle 100 may be performed. The parking system 750 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600, so that parking of the vehicle 100 may be performed.
The navigation system 770 may provide navigation information. The navigation information may include at least one of map (map) information, set destination information, path information set based on the destination, information of various objects on the path, lane information, and current position information of the vehicle.
The navigation system 770 can include a memory, a processor. The memory may store navigation information. The processor may control the actions of the navigation system 770.
According to an embodiment, the navigation system 770 may receive information from an external device through the communication apparatus 400 and may update pre-stored information.
According to an embodiment, the navigation system 770 may also be classified as a lower-level constituent element of the user interface apparatus 200.
The sensing part 120 may sense a state of the vehicle. The sensing part 120 may include a posture sensor (e.g., yaw sensor (yaw sensor), roll sensor (roll sensor), pitch sensor (pitch sensor)), a collision sensor, a wheel sensor (wheel sensor), a speed sensor, a tilt sensor, a weight sensor, a heading sensor (heading sensor), a yaw sensor (yaw sensor), a gyro sensor (gyro sensor), a position module (position module), a vehicle forward/backward sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, an illuminance sensor, an accelerator pedal position sensor, a brake pedal position sensor, and the like.
The sensing portion 120 may acquire sensing signals of vehicle posture information, vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle inclination information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, vehicle exterior illuminance, pressure applied to an accelerator pedal, pressure applied to a brake pedal, and the like.
In addition, the sensing unit 120 may include an accelerator pedal sensor, a pressure sensor, an engine rotation speed sensor (engine speed sensor), an Air Flow Sensor (AFS), an intake Air Temperature Sensor (ATS), a Water Temperature Sensor (WTS), a Throttle Position Sensor (TPS), a TDC sensor, a Crank Angle Sensor (CAS), and the like.
The vehicle interface portion 130 may perform a channel function with various kinds of external devices connected to the vehicle 100. For example, the vehicle interface part 130 may have a port connectable with a mobile terminal, and may be connected with the mobile terminal through the port. In this case, the vehicle interface part 130 may exchange data with the mobile terminal.
On the other hand, the vehicle interface part 130 may perform a channel function of supplying power to the connected mobile terminal. In the case where the mobile terminal is electrically connected to the vehicle interface section 130, the vehicle interface section 130 may supply the electric power supplied from the power supply section 190 to the mobile terminal according to the control of the control section 170.
The memory 140 is electrically connected to the control unit 170. The memory 140 may store basic data of the unit, control data for controlling the operation of the unit, and input/output data. In hardware, the memory 140 may be various storage devices such as ROM, RAM, EPROM, a flash drive, a hard disk drive. The memory 140 may store various data for the overall operation of the vehicle 100, such as a program for processing and controlling the control unit 170.
According to an embodiment, the memory 140 may be integrally formed with the control portion 170, or may be implemented by a lower-level constituent element of the control portion 170.
The control unit 170 may control the overall operation of each unit in the vehicle 100. The control section 170 may be named an ECU (Electronic Control Unit).
The power supply unit 190 may supply power necessary for the operation of each component according to the control of the control unit 170. In particular, the power supply portion 190 may receive power from a battery or the like inside the vehicle.
The one or more processors and the control unit 170 included in the vehicle 100 may be implemented using at least one of ASICs (application specific integrated circuits), DSPs (digital signal processors), DSPDs (digital signal processing devices), PLDs (programmable logic devices), FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
On the other hand, the AR display device 800 of the present invention may display an AR graphical interface representing the running state of the vehicle on a front image of the vehicle (or a windshield of the vehicle) through real-time AR integration, based on the navigation information of the vehicle 100 and data received from an AR camera.
To this end, the AR display device 800 may include: a communication module 810 for communicating with other devices/systems, servers, and vehicles; a processor 820 controlling the overall operation of the AR display device 800; and a display 830 for displaying a navigation screen including a front image on which the AR graphic interface is rendered.
The "front image" or "running image" disclosed in the present specification may include an image taken by a camera sensor (or including a smart glass including such a function), an image reflected on an LCD screen by the camera sensor, and an image of a real space seen on a windshield/dashboard, a digital twin three-dimensional image, and the like.
The "navigation screen including a front image (running image)" disclosed in the present specification may refer to a navigation screen of a front image realized in one form of a front image photographed by a camera of a vehicle, an image reflected on an LCD screen, an image of an actual space seen through a windshield or the like, and/or a digital twin three-dimensional image, which is layered on a navigation screen generated based on the current position and navigation information.
The navigation screen may be an AR navigation screen to which AR technology is applied.
In addition, the "AR graphical interface" disclosed in the present specification is a graphical user interface to which Augmented Reality (AR) technology is applied, and is a front image in which real-time AR is integrated into a vehicle.
The AR graphical interface in this specification may be an AR graphical image representing the current running state of the vehicle. In addition, the AR graphical interface disclosed in the present specification may be an AR graphical image further representing guidance of the running condition of the vehicle in the current running state of the vehicle. At this time, the guidance of the running condition of the vehicle is displayed on the vehicle front image a predetermined distance and/or a predetermined time earlier than the corresponding running condition.
Referring to fig. 7, an AR display device 800 of an embodiment of the present invention may be implemented by a portion of an electrical component or system of the vehicle 100, or may be implemented by a separate stand-alone device or system. Alternatively, the AR display device 800 may be implemented in a program form including instructions for operating a processor such as a user terminal of the vehicle 100.
The AR display device 800 may communicate with the vehicle 100, other devices, and/or a server, and may receive a front image of the vehicle acquired through an AR camera and sensing data acquired through a sensor (e.g., a gyro sensor, an acceleration sensor, a gravity sensor, a geomagnetic sensor, a temperature sensor, etc.) provided to the vehicle.
The AR display device 800 may drive a preset application, such as an AR navigation application.
The AR display device 800 renders an AR graphical interface representing a current driving state of the vehicle based on map data (e.g., information of route, POI, etc.), sensed data, a front image acquired by the camera, and may provide an AR GUI surface and an AR camera surface of the navigation application in real time.
The AR display device 800 renders an AR object separated from the AR graphic interface as a guide representing a driving condition of a vehicle based on map data (e.g., information of a route, POI, etc.), sensed data, a front image acquired by a camera, and may provide an AR GUI surface and an AR camera surface of a navigation application in real time.
At this time, the separated AR object may be named as a "second AR object", and the remaining portion of the AR graphical interface may be named as a "first AR object" after the second AR object is separated. That is, the AR graphical interface may include a first AR object displaying a current driving state of the vehicle and a second AR object displaying a guide of the driving condition of the vehicle.
Next, fig. 8 is a detailed block diagram related to the processor 820 of the AR display device 800 according to the embodiment of the present invention described above.
The conceptual diagram shown in fig. 8 may include components related to actions performed by processor 820 of the AR display device 800, as well as the information, data, and programs used therefor. In this regard, the block diagram shown in fig. 8 may also be understood as a system executed/implemented by processor 820 and/or a server linked with processor 820. Hereinafter, for ease of explanation, the description refers to processor 820.
Fig. 9 is a diagram for explaining a navigation screen according to an embodiment of the present invention, and fig. 10 is a diagram for explaining an operation of generating the navigation screen of fig. 9.
Referring to fig. 8, the processor 820 may include or be driven in conjunction with a navigation engine (Navigation Engine) 910, an AR engine (Augmented Reality Engine) 920, a navigation application (Navigation Application) 930, sensors, and a map 940.
The navigation engine 910 may receive map data and GPS data from a vehicle or the like. The navigation engine 910 may perform map matching based on the map data and the GPS data. The navigation engine 910 may perform route planning (route planning) according to map matching. The navigation engine 910 may display a map, perform route guidance (route guide). The navigation engine 910 may provide route guidance information to the navigation application 930.
The navigation engine 910 may include a navigation controller 911. The navigation controller 911 may receive map matching data, map display data, route guidance data.
The navigation controller 911 may provide route data, POI (point of interest) data, etc., to the AR engine 920 based on the received map matching data, map display data, route guidance data, etc.
The navigation controller 911 may provide route guidance data, map display frames, and the like to the navigation application 930.
The AR engine 920 may include an adapter 921 and a renderer 922. The adapter 921 may receive front image data acquired from a camera (e.g., an AR camera) and sensing data acquired from sensors of the vehicle such as a gyro sensor (Gyroscope), an acceleration sensor (Accelerometer), a gravity sensor (Gravity), a geomagnetic sensor (Magnetometer), and/or a temperature sensor (Thermometer).
The AR engine 920 may receive sensing data acquired from ADAS sensors (e.g., cameras (cameras), radars (radars), lidars (lidars), ultrasonic waves (Ultrasonic), sonars (sonars)). For example, as the sensed data, sensed data related to traveling, such as traveling direction and speed, distance from a lane, and the like, may be acquired by an ADAS sensor.
The AR engine 920 may receive high definition map data and programs related thereto. Here, the high definition Map (HD Map) is a map for providing detailed information of roads and surrounding terrain to an autonomous vehicle in advance, has an accuracy within an error range of about 10 cm, and stores, in three-dimensional digital form, road-unit information such as the road center line and boundary line, and information on traffic lights, indication boards, curbs, road signs, various structures, and the like.
The AR engine 920 may receive sensed data, received data, control data, and programs related thereto, obtained from a TCU (Transmission Control Unit: transmission control unit) (e.g., third party service, V2X, ITS communication, etc.).
The TCU (Transmission Control Unit) is a communication control device mounted on the vehicle, and can communicate with, for example, V2X (vehicle to everything), which is a communication technology for communicating with various elements located on the road for autonomous driving of the vehicle (for example, condition data that can be collected through V2V and V2I), and ITS (Intelligent Transport Systems) or C-ITS (Cooperative Intelligent Transport Systems), which are cooperative intelligent transportation system technologies.
The AR engine 920 may perform calibration of the front image based on data provided from a calibration factor database (calibration factor DB). The AR engine 920 may perform object detection based on the front image data and the route data. The AR engine 920 may perform Prediction and Interpolation (Prediction & Interpolation) based on the detected object.
The renderer 922 can perform rendering based on route data, POI data, prediction, and interpolation result data. The renderer 922 may provide an AR GUI (graphical user interface) frame and an AR camera frame to the navigation application 930.
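As an illustration only, the data flow through the AR engine 920 described above can be sketched in Python as below; all function and variable names (calibrate, detect_objects, run_ar_engine, etc.) are assumptions introduced here for readability and do not denote an actual implementation of the AR engine.

    # Illustrative sketch (assumed names) of the AR engine 920 pipeline:
    # calibration -> object detection -> prediction/interpolation -> frame rendering.
    def calibrate(front_image, calibration_factors):
        # Apply factors from the calibration factor DB to the front image.
        return {"image": front_image, **calibration_factors}

    def detect_objects(calibrated_image, route_data):
        # Placeholder object detection around the guided route.
        return [{"type": "lane", "route": route_data}]

    def predict_and_interpolate(objects, sensing_data):
        # Smooth object positions between frames using vehicle sensing data.
        return [dict(obj, speed=sensing_data.get("speed", 0.0)) for obj in objects]

    def run_ar_engine(front_image, route_data, poi_data, sensing_data, calibration_factors):
        calibrated = calibrate(front_image, calibration_factors)
        objects = predict_and_interpolate(detect_objects(calibrated, route_data), sensing_data)
        ar_gui_frame = {"route": route_data, "poi": poi_data, "objects": objects}
        ar_camera_frame = calibrated
        # The navigation application 930 builds the AR GUI surface and the
        # AR camera surface from these two frames.
        return ar_gui_frame, ar_camera_frame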
The navigation application 930 may generate an AR navigation screen.
Referring to fig. 9, an AR navigation screen 900 may include a navigation map plane (navigation map surface) 901, an AR camera plane 902, an AR GUI plane 903, and a navigation GUI plane 904.
The navigation application 930 may generate the navigation map 901 based on the map display frames received from the navigation controller 911. The navigation application 930 may generate the AR camera face 902 based on the AR camera frames received from the renderer 922. The navigation application 930 may generate the AR GUI surface 903 based on the AR GUI frames received from the renderer 922. The navigation application 930 may generate the navigation GUI surface 904 based on route guidance data received from the navigation controller 911.
Referring to fig. 8 and 10 together, if the navigation application 930 is driven, the navigation application 930 may generate a navigation map plane 901, an AR camera plane 902, an AR GUI plane 903, and a navigation GUI plane 904.
The navigation application 930 may provide parameters of the AR camera face 902 and parameters of the AR GUI face 903 to the AR engine 920.
To receive the front image data from the camera server 1001, the AR engine 920 may register a callback (callback) function. The camera server 1001 may be understood as a concept including, for example, a memory of the AR display device 800.
The AR engine 920 may receive the front image data and perform cropping. Cropping may include adjusting the size or position of the image, editing a partial area, adjusting transparency, and the like. The navigation application 930 may display the cropped front image on the AR camera face 902. The AR engine 920 may perform AR integration in real time. In addition, the navigation application 930 may display an AR GUI on the AR GUI surface 903 based on the cropped front image.
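A minimal sketch of this callback-and-crop flow is given below, assuming hypothetical names (CameraClient, register_callback, crop); it only illustrates the order of operations and is not the actual camera server interface.

    # Minimal sketch (assumed names): register a callback with the camera server,
    # then crop each received front image before it is shown on the AR camera face.
    class CameraClient:
        def __init__(self):
            self.on_frame = None

        def register_callback(self, callback):
            # The AR engine registers a callback to receive front image frames.
            self.on_frame = callback

    def crop(frame, region, transparency=1.0):
        # Cropping may adjust size/position, edit a partial area, or set transparency.
        x, y, w, h = region
        rows = frame["pixels"][y:y + h]
        return {"pixels": [row[x:x + w] for row in rows], "alpha": transparency}

    camera = CameraClient()
    camera.register_callback(lambda frame: crop(frame, region=(0, 120, 1280, 480)))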
Fig. 11 is a flowchart for explaining a method 1100 of displaying an AR graphical interface in a navigation screen according to an embodiment of the present invention.
The various processes of fig. 11 may be performed by a processor (or AR engine) unless otherwise indicated. In addition, the process of fig. 11 may include the actions of the processor 820-based navigation engine 910, AR engine 920, navigation application 930 described above with reference to fig. 8-10, or some of the actions thereof may be performed before or after the process of fig. 11.
Referring to fig. 11, the method starts by driving a preset application (S10).
The preset application may be installed in advance in the AR display device 800, or may be driven when the AR mode of the vehicle is executed, for example, by another device/server coupled thereto. For example, the preset application may be a navigation application executed in the AR mode while the vehicle is traveling.
For example, the navigation application receives route guidance data and map display frames, which are based on the map data and GPS data, from the navigation engine, and generates a navigation GUI surface and a map display surface, respectively.
In addition, for example, the navigation application receives the AR GUI frames from the AR engine and generates an AR GUI surface, receives the AR camera frames, and generates an AR camera surface. The navigation application renders the generated map display surface, AR camera surface, and AR GUI surface on the navigation GUI surface.
The processor generates an AR graphical interface based on map data acquired from a server, a memory, or the vehicle and on sensing data of the vehicle, the AR graphical interface including a first AR object displaying the driving state of the vehicle and a second AR object displaying a guide for the driving condition of the vehicle, and renders the AR graphical interface in a manner overlapping the front image of the vehicle (S20).
The processor may integrate the AR graphical interface generated in real time into the front image of the vehicle.
The processor displays (renders) the AR graphical interface in a state where the first AR object and the second AR object are combined. If the predetermined condition is satisfied, the processor displays (renders) the AR graphical interface in a state in which the second AR object is separated from the AR graphical interface.
Here, the preset condition may include a case where a change in the running condition of the vehicle is predicted from the current running state based on the sensed data of the vehicle. The preset condition may include a case where a change in the running condition of the vehicle predicted from the current running state or a condition requiring guidance is detected based on at least one of the ADAS sensing data, the high definition map data, the TCU communication data such as V2X, ITS, C-ITS, and the like.
Next, the processor displays the front image overlapped with the AR graphical interface on the navigation screen (S30).
The processor may render the AR graphical interface in a front image in a state in which the first AR object and the second AR object are combined. The processor provides the AR GUI frame and the AR camera frame corresponding to the AR graphical interface to the navigation application so that an AR GUI face and an AR camera face may be generated, respectively.
Then, the generated AR GUI surface and AR camera surface are rendered on the navigation GUI surface, so that the front image on which the AR graphical interface is rendered is included on the navigation screen (display).
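The steps S10 to S30 above can be condensed into the following hedged Python sketch; the dictionaries and function names are illustrative assumptions, not the actual rendering pipeline.

    # Hedged sketch of fig. 11 (S10-S30) with assumed names.
    def build_ar_graphical_interface(map_data, sensing_data, separate):
        # S20: the first AR object shows the current driving state; the second AR
        # object shows guidance for the (predicted) driving condition.
        first_ar_object = {"shows": "current driving state", "map": bool(map_data)}
        second_ar_object = {"shows": "guide for next driving condition"}
        # Combined by default; separated when the preset condition is satisfied.
        return {"first": first_ar_object, "second": second_ar_object, "separated": separate}

    def display_navigation_screen(front_image, ar_interface):
        # S30: the AR GUI frame and AR camera frame are handed to the navigation
        # application, which renders the AR GUI surface and AR camera surface.
        return {"front_image": front_image, "overlay": ar_interface}

    screen = display_navigation_screen(
        front_image="front_frame_0",
        ar_interface=build_ar_graphical_interface({"route": []}, {"speed": 12.0}, separate=False),
    )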
On the other hand, such an AR graphical interface may be changed according to a running condition predicted to be changed based on map data and sensed data of the vehicle.
In this case, the changed AR graphical interface is displayed in a form in which a plurality of AR objects are separated from each other, so that the driver of the vehicle can intuitively confirm the display of the current running state and the guide display for predicting the changed running state.
Fig. 12A and 12B are examples of an AR graphical interface according to an embodiment of the present invention, which is a diagram for explaining the selective separation and combination of a first AR object and a second AR object according to a predicted changed driving condition.
Referring to the drawings, the AR graphical interface 1200 may be implemented as an AR image of a specific shape in a 3D form, by which road information and the like may be represented in addition to the current traveling direction, traveling speed, steering information of the vehicle.
The AR graphical interface 1200 may be implemented as a modality in which a first object and a second object are combined.
Here, for example, the first object may be implemented in a 3D spade (e.g., an image of a spade form) form, and the second object may be implemented in a 3D chevron (e.g., an image of an a or V form) form extending from the first object. However, it is not meant that the first object and the second object are limited to such a shape.
The AR graphical interface 1200 may be coupled in such a manner that the inner frame of the second object and the outer frame of the first object are in contact. At this time, the first object and the second object may be presented in colors different from each other so as to be visually distinguished.
To represent the current driving state of the vehicle, the AR graphical interface 1200 may be presented in such a manner that the first object and the second object are moved at the same or different twist angles from each other in a combined state.
The generated AR graphical interface 1200 is displayed to overlap with the vehicle front image included in the navigation screen. Specifically, the processor 820 generates an AR graphical interface 1200 representing the current running state of the vehicle based on the map data and the sensed data of the vehicle, renders based on the route and POI information, etc., and provides it to the navigation application 930, so as to be displayed in a form in which the AR graphical interface 1200 overlaps with the vehicle front image included in the navigation screen.
Referring to fig. 12B, the processor 820 separates the first AR object 1220 and the second AR object 1210 of the AR graphic interface according to the changed driving condition predicted based on the map data and the sensing data of the vehicle, renders the separated second AR object 1210 in such a manner that the directions related to the changed driving condition are displayed, and can update the AR GUI surface and the AR camera surface of the navigation application 930.
The condition that the first AR object 1220 and the second AR object 1210 are separated may include a case where a change in the running condition of the vehicle is predicted from the current running state of the vehicle based on the sensed data of the vehicle.
Alternatively, the condition that the first AR object 1220 and the second AR object 1210 are separated may include a case where a change in the driving condition of the vehicle, or a condition requiring guidance, is predicted from the current driving state of the vehicle, as detected based on at least one of the ADAS sensing data, high definition map data, and TCU communication data of the vehicle such as V2X, ITS, and C-ITS.
On the other hand, the separated second AR object 1210 is displayed extending from the display position of the first AR object 1220. The first AR object 1220 represents a current driving state of the vehicle (e.g., a current position and a driving direction of the vehicle), so the driver can intuitively recognize a point of time and a driving direction at which the vehicle should be driven according to the guidance indicated at the second AR object 1210.
The separation distance between the first AR object 1220 and the second AR object 1210 may correspond to a change prediction time point or distance of the running condition of the vehicle.
In addition, although not illustrated in detail, the separated second AR object 1210 may be presented as a plurality of segments (fragments). A prescribed interval may be maintained between a plurality of the segments.
In addition, the directions represented by the respective segments of the plurality of segments may appear to gradually point to the predicted situation occurrence place (or situation end place). For example, in the case where the separated second AR object 1210 appears as a total of five segments, each of the five segments may be directed to the same location (e.g., predicted condition occurrence location) at different twist angles from each other.
The plurality of segments may be presented in a form that moves a prescribed distance in advance of the first AR object 1220. That is, the plurality of segments are in a form that is realized to present travel guidance that occurs based on the predicted condition while moving based on the current position and the travel state of the vehicle, rather than being fixedly displayed at a specific place or time point.
The moving speeds of the plurality of segments may correspond to the degree to which the vehicle approaches (e.g., the traveling speed).
In addition, the number of segments and/or the display length may be proportional to the hold time or hold distance of the predicted condition. For example, a case where the status holding time is longer may be presented to include a greater number of segments or the total display length may be longer than a case where the status holding time is not longer.
The segments of the plurality of segments that are proximate to the first AR object 1220 are displayed in association with the travel state presented at the first AR object 1220.
The segments of the plurality that are farther from the first AR object 1220 are displayed in association with the predicted condition.
That is, the plurality of segments of the separated second AR object 1210 are directed to providing, in a more gradual and seamless manner, the condition predicted from the current driving state corresponding to the first AR object 1220.
When the situation corresponding to the separated condition ends, the separated second AR object 1210 is displayed again in a state of being combined with the first AR object 1220. That is, it may be displayed again in the AR graphical interface 1200 as in fig. 12A.
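The segment behavior described above may be approximated, purely for illustration, by the sketch below; the segment count rule, the 50 m divisor, and the angle spread are assumptions and not values taken from this specification.

    # Illustrative sketch: split the separated second AR object into segments whose
    # number grows with the holding distance of the predicted condition and whose
    # angles gradually point toward the predicted situation occurrence place.
    def build_segments(hold_distance_m, vehicle_speed_mps, base_count=5):
        count = max(base_count, int(hold_distance_m // 50))        # assumed rule
        angles = [i * (90.0 / max(count - 1, 1)) for i in range(count)]
        return [
            {"index": i, "angle_deg": a, "speed_mps": vehicle_speed_mps}
            for i, a in enumerate(angles)
        ]

    segments = build_segments(hold_distance_m=400, vehicle_speed_mps=15.0)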
Hereinafter, the AR display device 800 of the present invention may receive a vehicle front image and map (map) data acquired through an AR camera through the communication module 810.
Processor 820 of the AR display device 800 may drive a preset application (e.g., the navigation application 930) to render, based on the received map data, an AR graphical interface including a first AR object displaying the driving state of the vehicle and a second AR object displaying a guide for the driving condition of the vehicle, in such a manner that the AR graphical interface overlaps the received front image of the vehicle.
The display 830 of the AR display device 800 may display a navigation screen overlapping the vehicle front image according to the AR graphical interface rendered on the AR GUI surface.
At this time, as described above, the navigation screen displayed on the display 830 may include an image output to the LCD display module or the like, an image reflected on the LCD screen by the camera sensor, an image of the real space seen on the windshield/dashboard, a digital twin three-dimensional image, and the like.
According to an embodiment, processor 820 may change and render the AR graphical interface according to a next driving condition of the vehicle that is presumed based on the received map data and the current position of the vehicle.
Here, the change of the AR graphical interface may include a case of a change in a state in which the first object and the second object of the AR graphical interface are combined and a case of a change in a separate state.
In the former case, the next driving condition of the vehicle may be displayed with the first object and the second object twisted at angles different from each other while remaining in the combined state.
In the latter case, the first object and the second object are separated, the first object may continue to display the current driving state of the vehicle, and the separated second object may display guidance driving information based on the predicted next driving condition. At this time, the guidance driving information based on the predicted next driving condition may include a shape change of the separated second object (e.g., expansion into a plurality of segments, a change in number, rotation of a portion, etc.) and/or a color change (e.g., a change to red for a warning notification).
According to an embodiment, the change of the AR graphical interface is performed at a second location advanced by a prescribed time or distance from a first location (e.g., a predicted condition occurrence location/section/time or a predicted condition ending location/section/time) at which the next driving condition of the vehicle starts.
For example, at the beginning of the predicted next driving situation or 150-200 m before entering, the second AR object is separated from the AR graphical interface, and the separated second AR object may perform the guidance display in real time. At this time, the first AR object continues to display the current running state of the vehicle.
In an embodiment, the second position may be changed according to the kind, characteristics, road condition of the predicted running condition.
In addition, the second AR object is displayed so as to connect with the current driving state of the vehicle as the vehicle approaches the second location, and, after being changed (e.g., separated) at the second location, to approach the next driving condition as the vehicle approaches the first location.
Processor 820 may determine whether any of the first condition and the second condition is satisfied based on at least one of map data, a current location, and navigation information of the vehicle.
The first condition is a case where the running condition of the vehicle does not change much from the current running condition, and may include, for example, straight running, curved running, slow running, entering a tunnel, or the like.
The second condition may include a case where a change in the running condition of the vehicle is predicted or a case where guidance is predicted to be required. For example, lane change prediction, left/right turns, collision likelihood detection, destination approach, and the like may be included.
When the first condition is satisfied, processor 820 may display the AR graphical interface in a state in which the first AR object and the second AR object are combined. When the second condition is satisfied, processor 820 may display the AR graphical interface in such a manner that the first AR object and the second AR object are separated and the separated second AR object displays the next driving situation.
Based on the current position of the vehicle passing the first location while the first and second AR objects are separated, processor 820 may update the display so that an AR graphical interface in which the first and second AR objects are re-combined is displayed, until the next driving condition of the vehicle is detected.
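A simple sketch of this combine/separate decision is shown below; the condition labels are assumptions chosen to mirror the examples given for the first and second conditions.

    # Sketch (assumed labels) of the first/second condition decision.
    FIRST_CONDITION = {"straight", "curve", "slow", "tunnel"}                         # keep combined
    SECOND_CONDITION = {"lane_change", "turn", "collision_risk", "near_destination"}  # separate

    def update_ar_interface(predicted_condition, passed_first_location):
        if passed_first_location:
            # Re-combine once the vehicle has passed the first location,
            # before the next driving condition is detected.
            return "combined"
        if predicted_condition in SECOND_CONDITION:
            return "separated"   # separated second AR object guides the next condition
        return "combined"        # first condition: no significant change expected

    state = update_ar_interface("lane_change", passed_first_location=False)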
Next, referring to fig. 13A, 13B, 14A, 14B, 14C, 15A, 15B, 16A, 16B, 16C, 16D, 17A, 17B, 17C, 18, 19A, 19B, 20, 21, 22, 23A, 23B, 24, and 27, various examples relating to rendering an AR graphical interface of an embodiment of the present invention on a navigation screen by changing the AR graphical interface differently according to a predicted driving situation are specifically shown.
[ Straight road/curved road ]
Referring to fig. 13A, 13B, 14A, 14B, and 14C, the AR display device 800 of the present invention may provide UX including information of the shape of a road through an AR graphical interface overlapped with a vehicle front image when the vehicle is running.
As the next running condition of the vehicle presented through the second AR object, road shape information presumed based on at least one of map data and sensed data of the vehicle and the current position of the vehicle may be included.
According to an embodiment, processor 820 may update to reflect changes corresponding to road shape information in travel to the AR graphical interface.
Processor 820 may render in such a manner that the curve or inclination of the road in travel is reflected to the AR graphical interface to move part (e.g., the second AR object) or all (e.g., the first AR object and the second AR object) in at least one of three axis directions (x-axis, y-axis, z-axis) in a state where AR objects constituting the AR graphical interface are combined.
Specifically, the processor 820 calculates a twist angle or a rotation angle of the AR graphic interface based on road shape information included in the received map data, reflects the calculated twist angle or rotation angle to the AR graphic interface, and AR renders, thereby providing assistance to a steering control value of the vehicle, and providing convenience to a driver.
Fig. 13A is an example of a front image 1301 of a vehicle when traveling on a straight road, and fig. 13B is an example of a front image 1302 of a vehicle when traveling on a curved road.
In the AR graphical interface 1300a displayed overlapping the front image 1301 of fig. 13A, the first object and the second object are displayed as UX in the same direction without being distorted.
On the other hand, the AR graphical interface 1300B displayed overlapping the front image 1302 of fig. 13B is displayed as UX in which the first object and the second object are distorted or rotated according to the direction and degree of the curve. At this time, the degree of twisting or rotation of the first object and the second object of the AR graphical interface 1300b is performed differently from each other.
The rotational motion of the vehicle includes roll (roll), pitch (pitch), and yaw (yaw). The roll rotation is a phenomenon of rotation about the longitudinal axis (x-axis) of the vehicle. Pitching rotation is a phenomenon in which the vehicle rotates about a lateral axis (y-axis). Yaw rotation is a phenomenon in which the vehicle rotates about a vertical axis (z-axis).
In a curved road, a roll rotation and a yaw rotation of the vehicle can be observed.
According to an embodiment, the AR graphical interface may be displayed to rotate in a first direction based on a vehicle steering value corresponding to the road shape information. Here, the first direction may include a rotation direction of yaw rotation and roll rotation of a vehicle steering value corresponding to a curve direction of the road.
Processor 820 may render updates such that as the vehicle steering value (absolute value) increases, the degree of distortion between the first and second AR objects forming the AR graphical interface increases. The degree of twist between the first AR object and the second AR object increases as the roll rotation of the vehicle steering value corresponding to the curve direction of the road increases.
For example, referring to fig. 14A and 14B, in a curve (e.g., counterclockwise curve) section, the first object 1420 and the second object 1410 of the AR graphical interface 1400 may be updated to be displayed yaw-rotated by a degree (X) corresponding to the direction (e.g., counterclockwise) and degree of the curve. Further, the second object 1410 of the AR graphical interface 1400 may be updated to be displayed roll-rotated by a degree (X) corresponding to the curve direction (e.g., counterclockwise direction).
The yaw rotation values of the first object 1420 and the second object 1410 may be calculated as values between-180 ° and +180°, for example, based on the current running state of the vehicle and map data.
The roll rotation value of the second object 1410 may be calculated as a value between-45 ° and +45°, for example, based on the current running state of the vehicle and the map data.
Here, the signs ("+", "-") of the yaw rotation value and the roll rotation value correspond to the steering value of the vehicle based on the direction of the curve. As shown in fig. 13B, in a counterclockwise curve, the yaw rotation value and the roll rotation value have "+" signs. On the other hand, in a clockwise curve, there may be a yaw rotation value and a roll rotation value of the "-" sign.
The yaw rotation values of the first object 1420 and the second object 1410 corresponding to the direction and degree of the curve may be calculated as follows.
[ math 1]
[ Yaw rotation value ] = Yaw coefficient × vehicle steering value (wheel angle)
In addition, the roll rotation value of the second object 1410 corresponding to the direction and degree of the curve may be calculated as follows.
[ math 2]
[ Roll rotation value ] = min(|Roll coefficient × vehicle steering value (wheel angle)|, 45) × sign of the vehicle steering value ("+" or "-"), where the maximum value 45 is variable.
Here, the Yaw coefficient and Roll coefficient represent ratios for adjusting the magnitude (degree) of rotation when the magnitude of the vehicle steering value is applied to the Yaw rotation values of the first object 1420 and the second object 1410 and the Roll rotation values of the second object 1410.
Here, the sign of the vehicle steering value (wheel angle) is positive ("+") or negative ("-"); for example, as shown in fig. 13A, in the case of a straight road, the vehicle steering value is "0". As the curvature of the curve becomes larger, the absolute value of the vehicle steering value (wheel angle) becomes larger.
Therefore, as the curvature of the curve increases, the yaw rotation values of the first object 1420 and the second object 1410 increase, and as the curvature of the curve increases, the roll rotation value of the second object 1410, that is, the degree of twist of the second object 1410 with respect to the first object 1420 further increases.
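For illustration, [ math 1] and [ math 2] can be written as the short Python functions below; the default coefficient values are assumptions for the sketch only.

    # Sketch of [ math 1] and [ math 2]; default coefficients are assumed values.
    def yaw_rotation(wheel_angle_deg, yaw_coeff=1.0):
        # Yaw rotation value = Yaw coefficient x vehicle steering value (wheel angle),
        # bounded to the -180..+180 degree range described above.
        return max(-180.0, min(180.0, yaw_coeff * wheel_angle_deg))

    def roll_rotation(wheel_angle_deg, roll_coeff=0.5, max_roll=45.0):
        # Roll rotation value =
        # min(|Roll coefficient x steering value|, 45) x sign of the steering value.
        sign = 1.0 if wheel_angle_deg >= 0 else -1.0
        return min(abs(roll_coeff * wheel_angle_deg), max_roll) * sign

With these conventions, a straight road (wheel angle 0) yields no rotation, while a sharper curve yields a larger yaw value and a larger roll value, i.e., a larger twist of the second object relative to the first object.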
On the other hand, when the vehicle passes through the curve section, the first object 1420 and the second object 1410 are again displayed in the coupled state as UX in the same direction without distortion.
Fig. 14C illustrates a method of acquiring road information of a vehicle based on sensing data acquired through an ADAS sensor of the vehicle and changing a display AR graphical interface according to a vehicle steering value corresponding to the acquired lane information.
First, lane information may be acquired as road information of a vehicle based on sensing data acquired through an ADAS sensor of the vehicle (S1410). Processor 820 determines whether the predicted running condition is straight based on the lane information (S1420).
In the case of a straight road, processor 820 UX displays an AR graphical interface without distortion in a form in which a first AR object and a second AR object are combined according to the straight road (S1430).
In the case of a curve road other than the straight road, the processor 820 calculates a rotation value corresponding to a vehicle steering value based on the curve road, that is, calculates a yaw rotation value and a roll rotation value corresponding to a degree and direction of the curve road (S1440).
Processor 820 UX displays an AR graphical interface changed in a form in which the first AR object and the second AR object are distorted based on the calculated yaw rotation value and roll rotation value (S1450). At this time, the processor 820 provides the AR GUI frames for the changed AR graphical interface to the navigation application in real time according to the prediction of the curve road, and updates the AR GUI surface in real time.
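A hedged sketch of this decision flow (S1410 to S1450) follows; it folds in the rotation formulas above and assumes simple dictionary inputs for the lane information.

    # Sketch of fig. 14C (S1410-S1450) with assumed input fields.
    def render_for_lane_info(lane_info, yaw_coeff=1.0, roll_coeff=0.5):
        angle = lane_info["wheel_angle_deg"]
        if lane_info["is_straight"]:                                   # S1420 / S1430
            return {"first": {"yaw": 0.0}, "second": {"yaw": 0.0, "roll": 0.0}}
        # S1440: yaw and roll rotation values from the curve direction and degree.
        yaw = max(-180.0, min(180.0, yaw_coeff * angle))
        roll = min(abs(roll_coeff * angle), 45.0) * (1.0 if angle >= 0 else -1.0)
        # S1450: the changed AR GUI frame is provided to the navigation application
        # in real time so that the AR GUI surface is updated in real time.
        return {"first": {"yaw": yaw}, "second": {"yaw": yaw, "roll": roll}}

    frame = render_for_lane_info({"is_straight": False, "wheel_angle_deg": 30.0})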
[ inclination of road ]
Referring to fig. 15A, 15B, 16A, 16B, 16C, and 16D, the AR display device 800 of the present invention may provide UX including inclination information of a road through an AR graphic interface overlapped with an image in front of a vehicle when the vehicle is traveling.
As the next running condition of the vehicle presented through the second AR object, inclination information estimated based on at least one of map data and sensing data of the vehicle (or ADAS sensor data of the vehicle) and the current position of the vehicle may be included. At this time, the inclination information may include inclination position and inclination angle information.
According to an embodiment, processor 820 may display an update to rotate the AR graphical interface in the second direction based on the tilt value (i.e., the tilt) included in the road shape information.
Processor 820 may display the update to cause the second AR object of the AR graphical interface to be rotated in pitch in a second direction corresponding to the tilt direction.
Processor 820 may display the updated AR graphical interface such that the direction and degree of inclination of the second AR object changes based on the first AR object according to the direction and degree of inclination of the road shape information included in the driving.
The inclination value of the road shape information included in the traveling may be acquired by inclination information (e.g., inclination position and inclination angle) of sensed data included in a high definition Map (HD Map) or ADAS, for example. Processor 820 displays a second AR object updated to reflect information about the acquired tilt position and tilt angle to the AR graphical interface.
Fig. 15A shows the changed AR graphical interface 1500a displayed on the front image 1501 when ascending an uphill road. It can be confirmed that the second AR object, which guides the predicted driving condition, is displayed tilted in the upward (+y axis) direction along the slope direction.
Fig. 15B shows the changed AR graphical interface 1500b displayed on the front image 1502 when descending a downhill road. Here, it can be confirmed that the second AR object guiding the predicted driving condition is displayed tilted in the downward (-y axis) direction along the slope direction.
In the case of a horizontal road without inclination, as shown in fig. 16A, the AR graphic interface 1600 of the same direction and the same angle is displayed in the form UX in which the first AR object and the second AR object are combined.
During the tilt section traveling, the second AR object changes to pitch rotation with reference to the first AR object in correspondence with the acquired tilt position and tilt angle. For example, in an uphill slope as shown in fig. 16B, second AR object 1610 is shown to be pitching rotated in an upward direction relative to first AR object 1620. Alternatively, for example, in a downhill slope as shown in fig. 16C, second AR object 1610 is displayed to be pitching rotated in a downward direction relative to first AR object 1620.
On the other hand, after the vehicle passes through the inclined section of the ascending or descending slope, as shown in fig. 16A, the AR graphic interface 1600 of the same direction and the same angle is displayed again in a form in which the first AR object and the second AR object are combined.
Pitching rotation is a phenomenon in which the vehicle rotates about a lateral axis (y-axis).
The pitch rotation value of the second AR object corresponding to the direction and degree of tilt may be calculated as follows.
[ math 3]
[ Pitch rotation value ] = Slope (gradient) coefficient × actual gradient of the road (slope angle)
Here, the inclination coefficient is a ratio for adjusting the rotation magnitude when the magnitude of the actual inclination value of the road is applied to the pitch rotation value of the second AR object.
Here, as the actual inclination of the road becomes larger, the pitch rotation value of the second AR object becomes larger, and thus the inclination angle of the second AR object with respect to the first AR object also further increases.
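[ math 3] can likewise be sketched as a one-line helper; the default slope coefficient is an assumed value.

    # Sketch of [ math 3]; the slope coefficient is a tuning ratio (value assumed).
    def pitch_rotation(slope_angle_deg, slope_coeff=1.0):
        # Pitch rotation value = Slope coefficient x actual gradient of the road.
        # Positive values tilt the second AR object upward (uphill), negative values
        # downward (downhill), relative to the first AR object.
        return slope_coeff * slope_angle_deg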
Fig. 16D illustrates a method of acquiring inclination information of a road based on sensing data acquired through an ADAS sensor or a high definition map of a vehicle, etc., and changing a display AR graphical interface according to the acquired inclination information.
First, inclination information of a road may be acquired based on sensing data or map data acquired through an ADAS sensor of a vehicle, a high definition map, or the like (S1610). Processor 820 determines whether the predicted running condition is a sloped road running based on the acquired inclination information (S1620).
In the case of the horizontal road without inclination, processor 820 displays an AR graphical interface without inclination in a form UX in which the first AR object and the second AR object are combined along the horizontal road (S1630).
If there is an inclined road, processor 820 calculates a pitch rotation angle of the vehicle based on the inclination direction and the angle, which is a value corresponding to the inclination of the road (S1640).
Processor 820 displays the changed AR graphical interface in a form UX in which the second AR object is tilted with respect to the first AR object based on the calculated pitch rotation angle value (S1650). Processor 820 provides AR GUI frames of the AR graphical interface, which change according to the inclined road prediction, to the navigation application in real-time in order to update the AR GUI surface in real-time.
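The flow S1610 to S1650 may be sketched as below; the 0.5-degree threshold used to treat a road as level is an assumption for illustration.

    # Sketch of fig. 16D (S1610-S1650) with an assumed level-road threshold.
    def render_for_slope(slope_angle_deg, slope_coeff=1.0, level_threshold_deg=0.5):
        if abs(slope_angle_deg) < level_threshold_deg:         # S1620 / S1630
            return {"second_pitch_deg": 0.0}
        # S1640: pitch rotation value from the slope direction and angle.
        pitch = slope_coeff * slope_angle_deg
        # S1650: the second AR object is tilted relative to the first AR object and
        # the changed AR GUI frame is handed to the navigation application in real time.
        return {"second_pitch_deg": pitch}

    uphill = render_for_slope(6.0)     # e.g., an uphill section
    downhill = render_for_slope(-4.0)  # e.g., a downhill section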
As described above, the AR display device 800 of the present invention visually provides, as a UX, the predicted curvature and inclination of the road together with the state of the current driving road, thereby providing driving assistance to the driver of the vehicle.
[ Path search display ]
In the present invention, the next driving condition of the vehicle presented through the second AR object of the AR graphical interface may include status information and setting information of the navigation.
Referring to fig. 17A, 17B, and 17C, the AR display device 800 of the present invention may provide UX including status information of navigation through an AR graphical interface overlapped with a vehicle front image when the vehicle is running.
The next driving condition of the vehicle presented through the second AR object may include status information of the navigation linked with the vehicle. Here, the navigation status information may include usage status data, action status data, and the like.
For example, processor 820 may receive navigation status information (e.g., usage status data, action status data) from the navigation application, identify a navigation path search as the next driving condition of the vehicle based on the received status information, and update the display so that the second AR object is separated (or remains partially combined) and rotated with respect to the first AR object while the first AR object displays the current driving state of the vehicle.
Referring to fig. 17A, while the vehicle is traveling, it is recognized whether the navigation is in a path (re)search state, and the path (re)search state may be indicated by the second AR object 1710 while the first AR object 1720 and the second AR object 1710 are separated.
For example, the second AR object 1710 may be rotated 360 degrees in place with respect to the first AR object 1720. At this time, as shown in fig. 17C, an animation effect in which the second AR object 1710 rolls through 360 degrees may be output. The in-place rotation of the second AR object 1710 may be repeated until the path (re)search state ends.
Thus, the driver of the vehicle can intuitively recognize or confirm that the navigation path information is being (re)updated through the display of the second AR object 1710 while continuing to confirm the current driving state through the first AR object 1720.
Referring to fig. 17B, a method of changing the display of the AR graphical interface according to the state information of the navigation is described.
First, status information of the navigation (e.g., usage status data, action status data) may be acquired from the navigation application (S1710). Processor 820 determines whether the navigation is in a path (re)search state for path update guidance (S1720).
If the path (re)search is not in progress, the AR graphical interface is displayed as a UX in which the first AR object and the second AR object are combined according to the current driving state of the vehicle (S1730).
If the path (re)search is in progress, the changed AR graphical interface is displayed as a UX in which the separated second AR object rotates 360 degrees about the first AR object to indicate this state (S1740). This step may be repeated until the path (re)search ends.
Processor 820 provides the AR GUI frames of the AR graphical interface changed for the path (re) search presentation to the navigation application in real-time, thereby updating the AR GUI surface in real-time.
Processor 820 may update the rendering so that the AR graphical interface UX in which the first AR object and the second AR object are recombined is displayed when the navigation path search ends (e.g., path found, path search cancelled).
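A minimal Python sketch of the path (re)search display logic of S1720 to S1740 follows (not part of the original disclosure); the data class, its field names, and the one-rotation-per-second animation rate are assumptions.

from dataclasses import dataclass

@dataclass
class PathSearchDisplay:
    separated: bool
    second_object_yaw_deg: float = 0.0

def path_search_display(is_path_searching: bool, elapsed_s: float) -> PathSearchDisplay:
    # S1720/S1730: no (re)search in progress -> combined-state UX
    if not is_path_searching:
        return PathSearchDisplay(separated=False)
    # S1740: separate the second AR object and spin it in place about the first AR object;
    # one full 360-degree rotation per second is an assumed animation rate
    return PathSearchDisplay(separated=True, second_object_yaw_deg=(elapsed_s * 360.0) % 360.0)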
[ warning about overspeed ]
On the other hand, the AR display device 800 of the present invention may represent a driving warning condition for a predicted problem occurrence event by changing the AR graphical interface.
Referring to fig. 18, an AR display device 800 of the present invention may provide UX notifying a running warning condition for a predicted problem occurrence event through an AR graphical interface overlapping with a vehicle front image when the vehicle is running.
The next driving condition of the vehicle presented through the second AR object may include a driving warning condition (e.g., possibility of collision) for a problem event predicted based on the sensed data of the vehicle and the navigation information.
Processor 820 may change the display of the AR graphical interface to include a notification of the driving warning condition upon identifying the driving warning condition based on the sensed data and the navigation information of the vehicle.
Here, the driving warning condition may include a condition in which the vehicle exceeds or approaches the speed limit, and a condition in which the vehicle may collide with another vehicle or object.
Referring to fig. 18, processor 820 may acquire the current traveling speed of the vehicle and speed limit information of the road based on the sensed data (e.g., CAN data) of the vehicle and the navigation information (e.g., speed limit information of the traveling road, crosswalk information, etc.) (S1810).
Processor 820 determines whether the traveling speed constitutes overspeed based on the acquired information (S1820), and may display the determination result as an AR graphical interface UX in a form in which the first AR object and the second AR object are combined.
If it is determined that the vehicle exceeds (or is very close to) the speed limit, the AR graphical interface displayed in the combined form of the first AR object and the second AR object is updated to a color capable of visually conveying the warning notification (for example, red or orange) (S1840). At this time, the degree of color change may correspond to the degree of overspeed, and the warning notification may be updated in stages according to the degree of overspeed.
To this end, processor 820 may change at least one of the color, shape, blinking, and highlighting of the displayed AR graphical interface according to the overspeed state of the vehicle identified as the driving warning condition. The AR GUI frame corresponding to the changed display information is then provided to the navigation application so that the navigation application 930 updates the AR GUI surface.
If the vehicle is not overspeeding, the AR graphical interface in the combined form of the first AR object and the second AR object is displayed in the basic color (S1830). Likewise, when the overspeed state ends, the AR graphical interface in the combined form is displayed again in the basic color.
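The staged color change of S1830/S1840 could be sketched as follows (illustrative Python only, not part of the original disclosure); the thresholds, color names, and function name are assumptions.

def warning_color(current_speed_kmh: float, speed_limit_kmh: float) -> str:
    if speed_limit_kmh <= 0:
        return "basic"
    ratio = current_speed_kmh / speed_limit_kmh
    if ratio < 0.95:
        return "basic"    # S1830: normal driving, basic color
    if ratio < 1.05:
        return "orange"   # very close to, or slightly over, the speed limit
    return "red"          # S1840: clear overspeed, strongest warning color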
[ left/Right turning and exit ]
Referring to fig. 19A, 19B, and 20, the AR display device 800 of the present invention may provide UX including TBT (Turn by turn) information of left/right turn and Exit (Exit) points of a vehicle through an AR graphical interface overlapped with a front image.
In other words, as the next running condition of the vehicle presented through the second AR object, TBT (Turn by Turn) information of left/right turning and Exit (Exit) places of the vehicle estimated based on the map data, the sensed data, and the lane information of the high definition map (HD map) may be included.
When turning travel is identified (predicted) as the next driving condition of the vehicle based on the map data and the sensed data of the vehicle, processor 820 may render the AR graphical interface so that the second AR object is separated at a position corresponding to the estimated TBT information (i.e., the aforementioned "second position") while the first AR object displays the current traveling direction of the vehicle, and so that at least one of the direction and the path of the turning travel is displayed by the separated second AR object.
At this time, the TBT information is intuitively recognized through the separated second AR object, so that driving through the left/right turn or exit can be performed easily.
To this end, processor 820 may determine from the navigation whether guidance of left/right turn or exit information is required, confirm the turn or exit direction corresponding to that information if required, and determine the position and direction in which the vehicle is to travel using lane information (HD map).
If the vehicle is recognized to be within a predetermined distance (for example, 150 to 200 m) of the left/right turn or exit point, processor 820 may separate and display the second AR object in advance and move it ahead while keeping it at a predetermined distance from the first AR object indicating the current position of the vehicle.
At this time, it is important to accurately indicate the point where the second AR object first separates (hereinafter, the "separated moving point") and the point where the next route is first guided in the separated state (hereinafter, the "route guiding point"). Because the driver can recognize and prepare for the route guidance through the AR graphical interface in advance, from the separated moving point rather than only at the route guiding point, this is more helpful for driving. In addition, the route guidance of the separated and moved second AR object is calibrated to coincide exactly with the navigation screen on which it is rendered, so that the driver can easily follow the guided route.
In fig. 19A, it can be confirmed that a UX is rendered in which the second AR object 1910a, separated from the first AR object 1920, guides the right-turn (or left-turn) location in the front image 1901.
In fig. 19B, it can be confirmed that a UX is rendered in which the second AR object 1910b, separated from the first AR object 1920, guides the Exit point in the front image 1901. At this time, in order to keep the vehicle from deviating from the corresponding lane, the second AR object 1910b may be displayed vertically along the lane warning line.
In fig. 19A and 19B, when the vehicle comes within a prescribed distance of a left/right turn or exit point, the second AR object is separated from the AR graphical interface and, while moving along with the travel of the vehicle after separation, can accurately indicate the position where the vehicle needs to make the left/right turn or the exit point through which it needs to exit.
In addition, as shown in the drawing, the separated second AR object appears as a plurality of segments (fragments) forming part of the path, and if the vehicle approaches the plurality of segments (more precisely, the segment near the first AR object representing the current driving state), the plurality of segments also move while maintaining a prescribed distance from the vehicle (or the first AR object).
A moving partial guide path is generated such that the segment of the plurality of segments closest to the first AR object is associated with the current driving state of the vehicle presented by the first AR object, and the segment farthest from the first AR object is associated with the left/right turn or Exit location.
Processor 820 provides AR GUI frames generated based on the generated partial guide path to navigation application 930 so that the updated AR GUI surface can be displayed.
On the other hand, after the completion of the left/right turn travel or the passage through the exit, the separated second AR object is changed to a form of being combined with the first AR object again.
Referring to fig. 20, a method of providing guidance by changing the AR graphical interface when a right turn of the vehicle is predicted is specifically described.
First, during straight traveling of the vehicle, the AR graphical interface 2000a of the state in which the first AR object and the second AR object are combined is displayed in the same direction as the straight traveling of the vehicle.
Based on the map data, the navigation information, and the sensed data of the vehicle, processor 820 predicts entry into the exit as the next driving situation, and displays the AR graphical interface in a state in which the first AR object 2020 and the second AR object 2010a are separated at the "separated moving point", that is, a point a predetermined distance (for example, 150 to 200 m) ahead of the exit-side turning position.
Then, the second AR object 2010a gradually moves from the current position to the exit point, and the first AR object 2020 moves according to the driving state of the vehicle. At this time, the second AR object 2010b may be displayed as a portion of the guidance path that guides the rotational direction in which the vehicle needs to travel at the exit point.
After the second AR object moves, if the first AR object that moves with the traveling of the vehicle approaches within a predetermined distance D1 (e.g., 10 m) from the exit point, the second AR object 2010b displays the partial guide path in real time while maintaining the predetermined distance from the position of the vehicle (i.e., the first AR object).
Then, when the position of the vehicle (i.e., the first AR object) reaches the exit point, the second AR object 2010c moves and displays the guide corresponding to the next driving situation while keeping a predetermined distance D2 (for example, between 5m and 10 m) from the first AR object.
When the vehicle turns right and completely passes through the exit point, the second AR object 2010d is displayed in the separated state for a predetermined time and then, when that time elapses, moves to the position of the first AR object and is recombined with it.
As described above, information on the display of the first AR object and the second AR object that are changed in real time is rendered by the AR engine, based on which AR GUI frame information is provided to the navigation application, and the AR GUI surface is updated in real time.
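A minimal Python sketch of the distance logic described above follows (not part of the original disclosure); the separation distance and the values chosen for D1 and D2 come from the example figures in the text, while the helper name and the returned structure are assumptions.

SEPARATION_DISTANCE_M = 200.0  # "separated moving point": 150-200 m before the turn/exit (text example)
D1_M = 10.0                    # distance at which the segments start following the vehicle (text example)
D2_M = 7.0                     # lead kept ahead of the first AR object at the exit (5-10 m in the text)

def second_object_lead(dist_to_exit_m: float) -> dict:
    # Far from the turn/exit: first and second AR objects stay combined
    if dist_to_exit_m > SEPARATION_DISTANCE_M:
        return {"separated": False, "lead_m": 0.0}
    # Separated: the second AR object waits at the exit point until the vehicle is within D1,
    # then keeps roughly D2 ahead of the first AR object through the turn
    if dist_to_exit_m > D1_M:
        return {"separated": True, "lead_m": dist_to_exit_m}
    return {"separated": True, "lead_m": max(D2_M, dist_to_exit_m)}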
[ roundabout section road direction ]
Referring to fig. 21, when the entering rotary section is predicted to be the next driving situation during the driving of the vehicle, AR display device 800 of the present invention may provide UX including the route guidance information of the rotary section through the AR graphical interface overlapping with the vehicle front image.
A roundabout (rotary) section is an intersection that is generally made circular for traffic control at intersections with complex traffic. Because a roundabout section includes a plurality of entry points and a plurality of exits, it requires a more intuitive route guidance UX in the driving image.
Processor 820 may identify a next driving condition of the vehicle presented by the second AR object based on the roundabout section information acquired through the navigation information and the map data (e.g., high definition map (HD map)).
According to an embodiment, upon identifying a roundabout section as the next driving condition of the vehicle based on the map data and the navigation information of the vehicle, processor 820 may calculate a path corresponding to each exit point included in the identified roundabout section.
In the case of using map data of a high definition map (HD map), a route corresponding to each exit point may be calculated based on lane information within the roundabout section.
In the case where map data of a high definition map (HD map) cannot be used, a route corresponding to each exit point may be calculated based on the entry/exit position of the roundabout section and the position information of each exit. For example, the travel route within the roundabout section may be calculated from the circumscribed circle of the triangle formed by three points (entrance/exit/entrance) of the roundabout section.
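The circumscribed circle computation could look like the following Python sketch (illustrative only, not part of the original disclosure); the example coordinates are arbitrary and the function name is an assumption.

from math import hypot

def circumcircle(p1, p2, p3):
    # Center and radius of the circle passing through three (x, y) points
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-9:
        raise ValueError("points are collinear; no circumscribed circle")
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy), hypot(ax - ux, ay - uy)

# Example: fit a circle through the entry and exit positions, then sample the arc between the
# entry and the chosen exit into the segments drawn by the separated second AR object.
center, radius = circumcircle((0.0, 0.0), (30.0, 10.0), (15.0, 25.0))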
Processor 820 may update the rendering of the AR graphical interface based on the calculated plurality of exit points, separate and display the second AR object of the AR graphical interface as a plurality at a second position, which is a point a predetermined distance away from each exit point, and display the calculated paths at a third position corresponding to each exit point.
Referring to fig. 21 (a), when the vehicle comes within a predetermined distance of the entry position of the roundabout section, the second AR object 2110a of the AR graphical interface is separated from the first AR object 2120 and displayed as a plurality of segments overlapping the front image 2101.
Next, as shown in fig. 21 (b) and (c), the second AR objects 2110b and 2110c in a plurality of segments are displayed overlapping the front images 2102 and 2103 along the guide path of the travel route within the roundabout section toward the entry/exit position.
Next, as shown in fig. 21 (d) to (e), the combinations (e.g., the first plurality of segments and the second plurality of segments) of the second AR objects 2110d, 2110e and 2110f forming the respective guidance routes are displayed and updated to sequentially guide the traveling direction toward the final exit direction side of the roundabout section.
At this time, as shown in the drawing, the second AR objects 2110d, 2110e, and 2110f display a connected guide path that disregards (or blocks) the exits other than the final exit direction, so as to prevent the vehicle from entering an exit other than the final exit.
Then, when the vehicle leaves the roundabout section along the guide path displayed by the second AR objects 2110d, 2110e, and 2110f, the AR graphical interface is displayed in a form in which the second AR object is again combined with the first AR object.
[ near the set destination ]
Referring to fig. 22, an AR display device 800 of the present invention may provide UX including setting information of navigation through an AR graphical interface overlapped with a front image of a vehicle when the vehicle is traveling.
Processor 820 may receive setting information of the navigation linked with the vehicle, for example, destination (goal) information, recognize based on this that the current position of the vehicle is within a prescribed distance (second position) of the position (first position) corresponding to the destination information, and update the display so that the separated second AR object guides the accurate destination position.
Processor 820 may receive destination information from the navigation information of the vehicle through communication module 810 and, when the current position of the vehicle reaches the second position a prescribed distance ahead of the first position corresponding to the received destination information, update the AR graphical interface so that the second AR object is separated (moved) and points to the first position.
According to an embodiment, when the separated second AR object moves to the first position, the displayed image may be changed or supplemented to include the remaining distance information from the current position of the vehicle (or the first AR object) to the first position corresponding to the destination information.
Thus, the driver can confirm the destination position more accurately. In addition, after the vehicle actually reaches the destination, the AR graphical interface continues to be displayed in the combined state without disappearing. This avoids the situation in which guidance ends abruptly upon reaching the destination and the driver is left wandering around the exact destination position.
Referring to fig. 22 (a) to (c), during the travel of the vehicle toward the destination, an AR graphical interface 2200 in the form of a combination of the first AR object and the second AR object is displayed in the vehicle front image 2201, and then, if the vehicle approaches within a predetermined distance of the destination position, the second AR object is separated from the first AR object, and the display of the movement toward the destination position is updated in sequence in the front images 2202, 2203.
Then, as shown in (d) of fig. 22, after the separated second AR object 2210 has moved to the destination position, the display is updated so that it rotates into a display pointing to the destination position (e.g., a goal-post icon) and, for example, its shape is changed to a milestone form including the destination position and the remaining distance information.
On the other hand, when the current position of the vehicle reaches or passes the first position corresponding to the destination information, processor 820 renders and updates the AR graphical interface so that the first AR object and the second AR object are displayed again in the combined form until the next driving situation of the vehicle is detected.
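The destination-approach behavior could be sketched as follows in Python (illustrative only, not part of the original disclosure); the approach distance, icon labels, and data class are assumptions.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DestinationDisplay:
    separated: bool
    icon: str                        # "default" or "milestone" (assumed labels)
    remaining_m: Optional[float] = None

APPROACH_DISTANCE_M = 300.0          # assumed "prescribed distance" defining the second position

def destination_display(remaining_m: float) -> DestinationDisplay:
    if remaining_m > APPROACH_DISTANCE_M:
        return DestinationDisplay(False, "default")      # combined state while far from the destination
    if remaining_m > 0.0:
        # second AR object moves to the destination position and shows the remaining distance
        return DestinationDisplay(True, "milestone", remaining_m)
    # destination reached or passed: recombine until the next driving condition is detected
    return DestinationDisplay(False, "default")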
[ U-turn guide display ]
Referring to fig. 23A, 23B, and 24, the AR display device 800 of the present invention can provide a more intuitive u-turn guidance direction UX through an AR graphical interface overlapped with a front image of a vehicle when the vehicle is running.
Upon identifying a u-turn section as the next driving condition of the vehicle based on the map data and the navigation information of the vehicle, processor 820 may change the display so that the second AR object of the AR graphical interface separates at a second position a prescribed distance ahead of the entry point of the u-turn section and guides the u-turn position.
At this time, the prescribed distance calculated from the u-turn section entry point, the second position determined by that distance, or the point in time at which the second AR object separates may be adaptively changed according to the condition of the lane in which the u-turn is possible (for example, the degree of congestion of vehicles waiting to turn, detection of an obstacle in the opposite lane, and the possibility of collision).
Processor 820 may render updates such that the display of the second AR object separate from the first AR object changes corresponding to the driving state of the vehicle in the u-turn section.
Referring to fig. 23A, when the position of the vehicle (or of the first AR object) is within a prescribed distance (e.g., 150 to 200 m) of the u-turn section, the second AR object is separated from the first AR object 2320, a portion 2310b of the second AR object flies to the u-turn position, and the remaining segment 2310a displays the guide track along which the vehicle is to travel.
At this time, the moved segment 2310b is displayed in overlapping with the front image 2301 in a form of a u-turn GUI (e.g., the segment 2310b rotates in a u-turn direction) that matches the u-turn position. And, the first AR object is changed in real time corresponding to the traveling direction and traveling speed of the vehicle. To this end, processor 820 provides AR GUI frames corresponding to changes to the first AR object and the second AR object to the navigation application for display of updated AR GUI surfaces.
Next, referring to fig. 23B, after the vehicle enters the u-turn section, the remaining segments of the second AR object 2310a' are sequentially rotated in the u-turn direction at the u-turn position before the vehicle, and the guide track is displayed at the front image 2302.
Fig. 24 is a flowchart of a method for changing the AR graphical interface to provide a u-turn guidance UX according to the present invention. The processes of the illustrated method are performed by processor 820, which may operate in linkage with the navigation engine 910, the AR engine 920, the navigation application 930, sensors, and the map 940.
In the flowchart of fig. 24, processor 820 may receive u-turn information from the navigation information (2410). Whether it is time to provide u-turn guidance is determined from the distance between the current position of the vehicle and the u-turn position (2420). For example, while the current position of the vehicle is more than a prescribed distance from the u-turn position, for example 200 m or more, the first AR object and the second AR object are displayed in the combined state.
If it is recognized that the vehicle is within the predetermined distance of the u-turn position, navigation information, map data, GPS data, CAN data, and ADAS sensing data of the vehicle are received (2430), and a u-turn mode is calculated based on the received information or data (2440).
The u-turn mode may include a first u-turn mode for starting the u-turn guidance, a second u-turn mode for continuing the u-turn guidance, and a third u-turn mode for ending the u-turn guidance. The first to third u-turn modes are executed sequentially as the vehicle performs the u-turn.
Specifically, when the vehicle comes within a prescribed distance of the u-turn position, the first u-turn mode is executed (2450), and the second AR object, separated from the first AR object, flies to the u-turn position while drawing a u-turn trajectory.
Next, while the vehicle travels within the u-turn section after entering it, the second u-turn mode is executed (2460); the separated second AR object maintains a prescribed distance from the vehicle while sequentially rotating at the u-turn position, and displays the guide track the vehicle is to follow in real time.
Information related to the first and second u-turn modes is stored in a mode information store (2480) and retrieved for use when necessary.
When the vehicle completes the u-turn, for example, when it has turned and traveled about 50 m past the u-turn position, the third u-turn mode is executed (2470). The separated second AR object then moves to the position of the first AR object, and the AR graphical interface is displayed in the combined form.
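The three u-turn modes could be selected as in the following Python sketch (illustrative only, not part of the original disclosure); the distances come from the example figures above, while the function name and mode labels are assumptions.

ENTER_DISTANCE_M = 200.0    # start of the first u-turn mode (150-200 m in the text)
COMPLETE_DISTANCE_M = 50.0  # distance past the u-turn position at which the turn counts as complete

def u_turn_mode(dist_to_u_turn_m: float, in_u_turn_section: bool, dist_past_u_turn_m: float) -> str:
    if dist_past_u_turn_m >= COMPLETE_DISTANCE_M:
        return "third"      # 2470: second AR object rejoins the first AR object
    if in_u_turn_section:
        return "second"     # 2460: segments rotate in turn, guide track follows the vehicle
    if dist_to_u_turn_m <= ENTER_DISTANCE_M:
        return "first"      # 2450: second AR object flies ahead, drawing the u-turn trajectory
    return "combined"       # 2420: too far away, first and second AR objects stay combined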
[ display of GUI Panel ]
On the other hand, processor 820 may determine whether one of a first condition and a second condition is satisfied based on at least one of map data, a current position, and navigation information of the vehicle, where the first condition is a case where a running condition of the vehicle does not change much from the current running condition, and the second condition is a case where a running condition change of the vehicle is predicted or a case where guidance is predicted to be required.
Processor 820 displays the AR graphical interface in a combined state of the first AR object and the second AR object when the first condition is satisfied, displays the AR graphical interface in a separated state of the first AR object and the second AR object when the second condition is satisfied, and displays the AR graphical interface in such a manner that the separated second AR object represents the predicted next driving situation.
Under the first condition, processor 820 may render a panel region including AR objects related to the driving information of the vehicle (e.g., the driving speed of the vehicle, the current time, the remaining time to the destination, the next traveling direction position) so that it overlaps a designated region of the driving image.
Under the second condition, processor 820 may update the rendering so that the panel region including the AR objects related to the driving information of the vehicle (e.g., the driving speed of the vehicle, the current time, the remaining time to the destination, the next traveling direction position) is hidden or displayed obliquely while the separated second AR object displays the next driving condition.
For example, in the front image 2701 of fig. 27 (a), the AR graphic interface 2700 is displayed in a combined state during the straight running of the vehicle, and the panel area of the navigation screen is displayed in a generally horizontal mode.
On the other hand, the navigation application 930 may receive information, such as, for example, combine, separate, re-combine, etc., from the processor 820 to determine a display mode of the panel region based on the received information. At this time, various display modes of the panel region may be stored in advance in a memory or the like. The navigation application 930 passes information corresponding to the determined display mode to the navigation GUI surface in real time. Thereby, the display mode of the panel area of the navigation screen is changed.
For example, when a lane change of the vehicle is predicted (e.g., the right turn signal is on), it can be confirmed in the front image 2702 of fig. 27 (b) that the panel region of the navigation screen switches to the oblique display mode while the AR graphical interface is displayed with the first AR object 2720 and the second AR object 2710 separated.
At this time, when the display mode of the panel region changes, an animation effect (for example, horizontal to tilted, tilted to horizontal) may be output so that the mode switching is performed seamlessly.
Thus, other AR features displayed on the navigation screen and the driving guidance based on the AR graphical interface of the present invention can be displayed more harmoniously. In addition, the completeness of the AR driving mode is further improved, providing driving stability and convenience.
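A minimal Python sketch of the panel mode selection the navigation application could perform follows (illustrative only, not part of the original disclosure); the state labels and mode names are assumptions.

def panel_display_mode(ar_state: str) -> str:
    # ar_state comes from the processor: "combined", "separated", or "recombined" (assumed labels)
    if ar_state == "separated":
        # second condition: the separated second AR object is guiding a change of driving
        # condition, so the panel is tilted (or hidden) to keep the guidance uncluttered
        return "tilted"
    return "horizontal"      # first condition / recombined: normal horizontal panel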
Fig. 25A, 25B, and 26 are exemplary diagrams related to providing peripheral information together as an intuitive AR UX when an AR graphical interface of an embodiment of the present invention is displayed on a navigation screen.
[ POI rotation ]
Referring to fig. 25A, while AR graphical interface 2500 is displayed overlapping the vehicle front image 2501, surrounding POIs may be displayed in AR form; as the vehicle approaches the displayed POIs 2531, 2532, they may rotate and display additional information.
Accordingly, surrounding POI information is displayed more intuitively while the vehicle is driving, which can assist the driver in recognizing POIs.
Referring to fig. 25B, in the flowchart, processor 820 may receive POI information from navigation information (S2510). Processor 820 determines whether to display the POI based on the current position of the vehicle, the navigation information, and the map data (S2520).
If it is determined that the POI is to be displayed, the navigation information of the vehicle, map data, GPS data, CAN data, and ADAS sensing data are received (S2530), and a POI mode is calculated based on the received information or data (S2540).
The POI mode may include a first POI mode in which the POI is displayed in the front image, a second POI mode in which a rotation animation effect is applied to the POI, and a third POI mode in which the POI is displayed in the side direction. The first to third POI modes are executed sequentially as the vehicle approaches and passes the POI position.
Specifically, if the vehicle enters a prescribed distance from the POI position, a first POI mode is executed (S2550), and the POI icon is displayed in the front image.
Next, when the vehicle approaches the displayed POI position, the second POI mode is executed (S2560), and an animation effect of the POI icon rotating is output. At this time, information related to the first and second POI modes is stored in the mode information storage (S2580) and retrieved for use when necessary. When the vehicle arrives at the POI position, the third POI mode is executed (S2570): the POI icon is displayed in the side direction along with additional information (e.g., the POI name or POI-related advertisement content). Then, when the vehicle passes the POI position, the POI icon disappears from the front image.
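The three POI modes of S2550 to S2570 could be selected as in the following Python sketch (illustrative only, not part of the original disclosure); the distance thresholds and function name are assumptions, since the text does not specify values.

SHOW_DISTANCE_M = 300.0    # first POI mode: icon appears in the front image
ROTATE_DISTANCE_M = 100.0  # second POI mode: rotation animation starts
SIDE_DISTANCE_M = 20.0     # third POI mode: icon turns sideways and shows additional info

def poi_mode(dist_to_poi_m: float) -> str:
    # Negative distance means the vehicle has already passed the POI position
    if dist_to_poi_m < 0.0 or dist_to_poi_m > SHOW_DISTANCE_M:
        return "hidden"
    if dist_to_poi_m <= SIDE_DISTANCE_M:
        return "third"     # S2570: side-facing icon with POI name / related content
    if dist_to_poi_m <= ROTATE_DISTANCE_M:
        return "second"    # S2560: rotating animation while approaching
    return "first"         # S2550: icon displayed at the POI position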
[ Crosswalk display ]
Referring to fig. 26, an AR display device 800 of the present invention may provide UX including crosswalk information in the form of AR overlapping with a vehicle front image when the vehicle is running.
Processor 820 obtains crosswalk information from the sensed data of the vehicle while the vehicle travels and, as shown in fig. 26, an AR crosswalk display 2631 may be provided in the front image 2601 along with the AR graphical interface 2600.
At this time, AR graphical interface 2600 is displayed in the state in which the first AR object and the second AR object are combined, and AR crosswalk display 2631 is displayed on the road surface of the traveling road in front image 2601. The AR crosswalk display 2631 may be displayed overlapping the front image when the vehicle reaches a point a predetermined distance before the actual crosswalk position. When the vehicle has traveled a predetermined distance past the actual crosswalk position, AR crosswalk display 2631 disappears and only AR graphical interface 2600 is displayed.
Accordingly, crosswalk information is displayed in an intuitive AR format UX while the vehicle is traveling, thereby providing assistance to the driver's safe driving.
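The show/hide logic for the crosswalk overlay could be sketched as follows in Python (illustrative only, not part of the original disclosure); the range value and function name are assumptions.

CROSSWALK_RANGE_M = 100.0  # assumed prescribed distance before and after the crosswalk

def show_ar_crosswalk(signed_dist_to_crosswalk_m: float) -> bool:
    # Positive distance: crosswalk ahead of the vehicle; negative: already passed
    return -CROSSWALK_RANGE_M <= signed_dist_to_crosswalk_m <= CROSSWALK_RANGE_M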
As described above, according to the AR display device and the operating method thereof according to the embodiments of the present invention, an augmented reality navigation screen based on a calibrated front image can be provided without additional setting, and guidance for the predicted driving condition can be provided through an AR object together with the current position of the vehicle on the current navigation screen, so that more intuitive and realistic AR guidance can be provided for the vehicle. In addition, the current driving state and the next driving state are guided simultaneously by separating, deforming, and combining a single AR graphical interface, and the separated AR object moves together with the vehicle and guides the driving instead of being displayed at a fixed position, so that following the guided driving control is easier, safer, and more convenient. In addition, an AR graphical interface is provided that reflects the predicted driving condition, navigation status information, setting information, surrounding POI information, and the like, so that a more coordinated and expanded AR driving mode can be experienced.
As previously described, the processor may be further configured to render the AR graphical interface on the navigation screen in a manner that selectively separates the second AR object from the first AR object or joins the second AR object to the first AR object. Such separation and combination may be performed selectively.
The "combined state" in the present specification refers to a state in which the first AR object and the second AR object are connected to each other on the screen or in which the first AR object and the second AR object are relatively closer to each other on the screen than the "separated state".
Similarly, the meaning of the expression "the first AR object and the second AR object are combined with each other" in this specification includes that the first AR object and the second AR object are connected to each other and that the first AR object and the second AR object are relatively close. The interval at which the first AR object and the second AR object are combined with each other is narrower than in the case of "separating the second AR object from the first AR object".
In the case of a head-up display device, the display of the present invention may refer to a display or screen within a PGU (Picture Generation Unit), a mirror or reflective plate that reflects the image provided from the PGU, or a combiner that combines different images.
The above-described functions performed by the processor of the AR display device may be performed by one or more processors located outside the AR display device.
According to another embodiment, a method of providing an augmented reality AR interface to an AR display device for a vehicle, comprises:
a step of receiving image data including a front image of the vehicle; a step of receiving position data including a current position of the vehicle; a step of receiving map data for the current position of the vehicle; and a step of activating a preset application, rendering an AR graphical interface so as to overlap the front image, and generating a signal for displaying a navigation screen, wherein the navigation screen includes at least the current position of the vehicle, the map data, and the front image overlapped by the AR graphical interface according to the rendering.
Rendering the AR graphical interface includes selectively separating the second AR object from the first AR object or combining the second AR object with the first AR object on the navigation screen. Such separation and combination may be performed selectively.
According to the embodiment of fig. 8, processor 820 of AR display device 800 includes navigation engine 910, AR engine 920, and navigation application 930. Alternatively, processor 820 may include only AR engine 920.
That is, only the AR engine 920 may be executed by the processor 820, and the navigation engine 910 and the navigation application 930 may be executed by one or more processors external to the AR display device 800.
In addition, the rendering itself may be executed by one or more processors external to the AR display device 800. In such an alternative scenario, the AR display device 800 may simply receive the rendered image and display it on the display 830.
If necessary, the AR display device 800 may convert the rendered image; for example, the rendered image may be encrypted to fit the form factor of the AR display device 800, or an encrypted rendered image may be decrypted.
The invention as referred to in the foregoing description may be implemented using a machine-readable medium having stored thereon instructions for execution by a processor performing the various methods set forth herein. Examples of possible machine readable media include HDD (hard disk drive), SSD (solid state disk), SDD (silicon disk drive), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, and so forth. The computer-readable medium may also be embodied in the form of a carrier wave (e.g., transmission via the internet), as desired. The processor may include a controller of the terminal.
The foregoing embodiments are merely exemplary and are not to be construed as limiting the present disclosure. This description is intended to be illustrative, and not to limit the scope of the claims. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments.
As the present features may be embodied in several forms without departing from the characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within its scope as defined in the appended claims, and therefore all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds are therefore intended to be embraced by the appended claims.

Claims (10)

1. An AR display device for a vehicle, comprising:
at least one interface module that receives data including a front image of a vehicle, position data including a current position of the vehicle, and map data for the current position of the vehicle;
a processor rendering an AR graphical interface in a manner overlapping the front image, the AR graphical interface including a first AR object displaying a running state of a vehicle and a second AR object displaying a guide to the running state of the vehicle based on the map data; and
a display for displaying a navigation screen including a front image overlapping the AR graphical interface according to the rendering,
the processor renders the AR graphical interface by selectively separating the second AR object from the first AR object or bonding the second AR object to the first AR object on a navigation screen according to a next driving condition of the vehicle that is presumed based on the map data and a current location of the vehicle.
2. The AR display device for a vehicle according to claim 1, wherein,
also included is a vision sensor that provides data including a front image of the vehicle through the interface module.
3. The AR display device for a vehicle according to claim 2, wherein,
the visual sensor comprises a camera or intelligent glass.
4. The AR display device for a vehicle according to claim 1, wherein,
the AR graphical interface is split at a second location that is a prescribed time or distance ahead of a first location at which a next driving condition of the vehicle begins.
5. The AR display device for a vehicle according to claim 1, wherein,
determining either one of a first condition and a second condition based on at least one of the map data, the position data, and navigation information of the vehicle,
the processor displays the AR graphical interface as follows:
and under the first condition, displaying the AR graphical interface in a combined state of the first AR object and the second AR object, and under the second condition, separating the first AR object from the second AR object, wherein the separated second AR object represents the next running condition.
6. The AR display device for a vehicle according to claim 5, wherein,
in the first condition, the processor renders that a panel region including an AR object related to traveling information of a vehicle is overlapped with a specified region of the front image,
in the second condition, the processor renders updates to display the panel region hidden or tilted during the period in which the separated second AR object represents the next driving condition.
7. The AR display device for a vehicle according to claim 1, wherein,
the next running condition of the vehicle includes road shape information presumed based on the current position of the vehicle and at least one of the map data and the sensed data of the vehicle,
the processor updates that a change corresponding to the road shape information is reflected to the AR graphical interface.
8. The AR display device for a vehicle according to claim 7, wherein,
the AR graphical interface is displayed to rotate in a first direction based on a vehicle steering value corresponding to the road shape information,
the processor is updated such that as the vehicle steering value increases, a degree of distortion between a first AR object and a second AR object forming the AR graphical interface increases.
9. The AR display device for a vehicle according to claim 7, wherein,
the AR graphical interface is displayed to rotate in a second direction based on a tilt value corresponding to the road shape information,
the processor updates the AR graphical interface to cause the direction and degree of tilt of the second AR object to change based on the first AR object according to the direction and degree of tilt value.
10. The AR display device for a vehicle according to claim 1, wherein,
the processor separates the second AR object and causes the second AR object to be displayed to rotate with reference to the first AR object during a period in which the first AR object displays a current driving state of the vehicle in accordance with the recognition of the path search as a next driving state of the vehicle,
and updating an AR graphical interface combined by the first AR object and the second AR object according to the end of the path search.
CN202310687460.5A 2022-06-10 2023-06-09 AR display device for vehicle and operation method thereof Pending CN117215060A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR10-2022-0070770 2022-06-10
KRPCT/KR2022/095145 2022-10-19
PCT/KR2022/095145 WO2023239002A1 (en) 2022-06-10 2022-10-19 Vehicle ar display device and operating method therefor

Publications (1)

Publication Number Publication Date
CN117215060A true CN117215060A (en) 2023-12-12

Family

ID=89034076

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310687460.5A Pending CN117215060A (en) 2022-06-10 2023-06-09 AR display device for vehicle and operation method thereof

Country Status (1)

Country Link
CN (1) CN117215060A (en)

Similar Documents

Publication Publication Date Title
KR101994698B1 (en) User interface appartus for vehicle and vehicle
US10488218B2 (en) Vehicle user interface apparatus and vehicle
CN111225845B (en) Driving assistance system and vehicle including the same
CN110053608B (en) Vehicle control device mounted on vehicle and method of controlling the vehicle
EP3495189B1 (en) Vehicle control device
US20210362597A1 (en) Vehicle control device and vehicle including the same
US20230304821A1 (en) Digital signage platform providing device, operating method thereof, and system including digital signage platform providing device
US11043034B2 (en) Image output device
US12008683B2 (en) Vehicle augmented reality navigational image output device and control methods
KR102611337B1 (en) Vehicle AR display device and method of operation thereof
KR20200095318A (en) Image output device
KR101979277B1 (en) User interface apparatus for vehicle and Vehicle
EP4290187A2 (en) Ar display device for vehicle and method for operating same
US20230398868A1 (en) Ar display device for vehicle and method for operating same
US20220388395A1 (en) Vehicle display device and control method thereof
US20230400321A1 (en) Ar display device for vehicle and method for operating same
KR102043954B1 (en) Robot for vehicle mounted on the vehcile and method for controlling the robot for vehicle
KR20200064199A (en) Path providing device and vehicle provide system comprising therefor
CN117215060A (en) AR display device for vehicle and operation method thereof
US20210323469A1 (en) Vehicular around view image providing apparatus and vehicle
CN117218882A (en) AR display device for vehicle and operation method thereof
CN117213520A (en) AR display device for vehicle and operation method thereof
CN117215061A (en) AR display device for vehicle and operation method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination