WO2023239003A1 - AR display device for vehicle and operation method thereof - Google Patents

AR display device for vehicle and operation method thereof

Info

Publication number
WO2023239003A1
Authority
WO
WIPO (PCT)
Prior art keywords: vehicle, parking, processor, display, area
Prior art date
Application number
PCT/KR2022/095146
Other languages
French (fr)
Korean (ko)
Inventor
박종태
채지석
손정훈
김형규
이지은
김일완
최병준
Original Assignee
LG Electronics Inc.
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020237018106A (KR102611338B1)
Priority to EP23175933.3A (EP4290475A1)
Priority to CN202310686375.7A (CN117218882A)
Priority to US18/208,540 (US20230398868A1)
Publication of WO2023239003A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26 - Navigation specially adapted for navigation in a road network
    • G01C 21/34 - Route searching; Route guidance
    • G01C 21/36 - Input/output arrangements for on-board computers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to an AR display device linked to a vehicle and a method of operating the same. More specifically, it relates to an AR display device capable of providing guidance related to parking or charging of a vehicle through AR technology, and a method of operating the same.
  • vehicle functions are becoming more diverse. These vehicle functions can be divided into convenience functions to promote driver convenience, and safety functions to promote driver and/or pedestrian safety.
  • Convenience functions of a vehicle are motivated by driver convenience, such as providing infotainment (information + entertainment) functions, supporting partial autonomous driving, or helping to secure the driver's field of vision in situations such as night driving or blind spots.
  • Representative examples include adaptive cruise control (ACC), smart parking assist system (SPAS), night vision (NV), head up display (HUD), and adaptive headlight system (AHS).
  • Vehicle safety functions are technologies that ensure driver safety and/or pedestrian safety, such as lane departure warning system (LDWS), lane keeping assist system (LKAS), and autonomous emergency braking (AEB).
  • Augmented reality (AR) adds graphic objects to the real world by outputting them through a vehicle's windshield or head up display (HUD), or over images captured by a camera.
  • Development of such technology is active, and in particular, technologies that provide route guidance to drivers through AR are expanding.
  • the present invention aims to solve the above-mentioned problems and other problems.
  • The purpose of the present invention is to provide an AR display device that can perform a more intuitive and polished AR driving mode, and a method of operating the same.
  • Another purpose is to provide an AR display device, and a method of operating the same, that can provide navigation, routes, and necessary information through a more intuitive AR graphic interface when a vehicle enters a parking lot/charging station.
  • Another purpose is to provide an AR display device, and a method of operating the same, that can search in advance for an optimal parking area and/or charging area in a parking lot/charging station and change the AR graphic interface in real time to provide a guide route in a direction convenient for parking.
  • Another purpose is to provide an AR display device, and a method of operating the same, that can provide a UX using an intuitive AR graphic interface to prevent the vehicle from entering a parking lot/charging station in a non-drivable direction.
  • To this end, the vehicle's AR display device can provide an intuitive AR path that displays the parking area or charging area when the vehicle enters a parking lot/charging station, based on the vehicle's ADAS sensing data and/or parking lot/charging station control data.
  • In addition, the AR display device for a vehicle recognizes the selected parking space or the area in front of a charger so that the vehicle can park there precisely, and changes the AR graphic interface in real time to provide a guide path for parking: forward driving, the change point to reverse driving, and reverse driving are guided sequentially in response to the current driving state of the vehicle.
  • The AR display device according to the present invention may include: a camera that acquires a front image of the vehicle; a communication module that receives sensing data from the vehicle; a processor that runs a preset application and renders an AR graphic interface, which combines a first AR object displaying the driving state of the vehicle and a second AR object displaying a guide to the driving situation of the vehicle, so that it overlaps the front image; and a display that, according to the rendering, displays the front image with the AR graphic interface overlaid.
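As an illustration of this composition, below is a minimal sketch in Python of how the camera, communication module, processor, and display could be wired together. Every class, method, and field name here (ARDisplayDevice, receive_sensing, ego_pose, guide_pose, etc.) is a hypothetical stand-in, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class ARObject:
    kind: str    # "first" = driving state of the vehicle, "second" = guide
    pose: tuple  # position in the front-image coordinate frame

class ARDisplayDevice:
    """Hypothetical wiring of the claimed components."""
    def __init__(self, camera, comm_module, display):
        self.camera = camera      # acquires the front image of the vehicle
        self.comm = comm_module   # receives sensing data from the vehicle
        self.display = display    # shows the front image with the AR overlay

    def render_frame(self):
        frame = self.camera.capture()           # front image
        sensing = self.comm.receive_sensing()   # assumed dict of sensing data
        first = ARObject("first", sensing["ego_pose"])      # driving state
        second = ARObject("second", sensing["guide_pose"])  # situation guide
        # Display the front image with the combined AR graphic interface overlaid.
        self.display.show(frame, overlay=[first, second])
```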
  • In response to the vehicle entering a parking area including a charging area, the processor searches for an available parking area based on at least one of the sensing data and the control data of the parking area, and can variably display the AR graphic interface to guide the vehicle there.
  • The processor displays the location of the searched available parking area, separates the second AR object based on the displayed available parking area being selected, and can update the AR graphic interface to display a guide to the selected parking area.
  • The processor displays the second AR object separated from and rotating relative to the first AR object while the available parking area is being searched, and, in response to the end of the search, can update the rendering to display the AR graphic interface with the first and second AR objects combined.
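One way to picture this display behavior is sketched below: while the search runs, the second AR object is shown separated and orbiting the first, and when the search ends the two are recombined. The orbit radius, rotation rate, state names, and poses are illustrative assumptions.

```python
import math

def update_ar_interface(state, t, ego_pose):
    """Return poses for the first/second AR objects at animation time t (s)."""
    first_pose = ego_pose  # the first AR object always tracks the vehicle state
    if state == "SEARCHING":
        # Separate the second object and rotate it around the first
        # (radius and rate are illustrative values, not from the patent).
        radius, rate = 2.0, 1.5  # meters, rad/s
        angle = rate * t
        second_pose = (ego_pose[0] + radius * math.cos(angle),
                       ego_pose[1] + radius * math.sin(angle))
        combined = False
    else:  # search ended: recombine into a single AR graphic interface
        second_pose = ego_pose
        combined = True
    return first_pose, second_pose, combined
```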
  • The processor recognizes that the vehicle has entered a non-drivable direction based on the current location and driving state of the vehicle, and can update the rendering so that the second AR object, separated according to the recognition, displays a warning notification and a guide indicating a drivable direction.
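The wrong-way recognition could reduce to comparing the vehicle's heading against the lane's permitted direction, as in the following sketch; the 90-degree threshold and the function names are assumptions, not the patent's method.

```python
def is_wrong_way(vehicle_heading_deg, lane_direction_deg, threshold_deg=90.0):
    diff = abs(vehicle_heading_deg - lane_direction_deg) % 360.0
    diff = min(diff, 360.0 - diff)   # smallest angle between the two headings
    return diff > threshold_deg      # mostly opposed -> wrong way

if is_wrong_way(350.0, 175.0):
    # Separate the second AR object to show a warning and a drivable direction.
    print("warning: not drivable in this direction; guiding to allowed route")
```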
  • The processor may update the display of the AR graphic interface to show the location of a parking area re-searched, based on the sensing data and the control data, in the surrounding area in the driving direction.
  • The processor determines a possible parking type based on the vehicle's proximity to the selected parking area, and, according to the determination, separates the second AR object and can update the rendering of the AR graphic interface to display a parking guide line to drive along.
  • The processor calculates an expected change point from forward to reverse driving based on the determined possible parking type, the current location of the vehicle, and the location of the selected parking area; displays, through the separated second AR object, a first guide line toward the change point; and then, based on the current location of the vehicle corresponding to the first AR object approaching the change point, may update the display of the AR graphic interface to show a second guide line heading in reverse toward the selected parking area.
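A hedged sketch of that two-stage guidance: pick a change point offset forward of the selected slot along the approach heading, draw the first guide line toward it, and switch to the second (reverse) guide line once the vehicle is near. The geometry is simplified to straight segments, and the offsets and names are illustrative.

```python
import math

def change_point(slot_xy, approach_heading_deg, forward_offset_m=4.0):
    """Expected point where forward driving changes to reverse driving."""
    rad = math.radians(approach_heading_deg)
    return (slot_xy[0] + forward_offset_m * math.cos(rad),
            slot_xy[1] + forward_offset_m * math.sin(rad))

def guide_line(vehicle_xy, slot_xy, approach_heading_deg, near_m=1.0):
    cp = change_point(slot_xy, approach_heading_deg)
    dist = math.hypot(vehicle_xy[0] - cp[0], vehicle_xy[1] - cp[1])
    if dist > near_m:
        return ("first_guide", vehicle_xy, cp)  # forward leg toward change point
    return ("second_guide", cp, slot_xy)        # reverse leg into the slot
```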
  • In response to the vehicle entering the parking area, the processor searches for a chargeable area among the charging areas based on at least one of the sensing data and the control data, and can variably display the AR graphic interface to show the location of the discovered chargeable area.
  • The processor updates the rendering of the AR graphic interface to display charging-related information for the discovered chargeable area, and the charging-related information may include at least one of a charging method and a charging cost.
  • Based on a failure to find a chargeable area, the processor may update the rendering of the AR graphic interface to display the remaining charging time of each charging area around the current location of the vehicle.
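The charging flow above might look like the following sketch: guide to a free charger with its charging method and cost when one is found, otherwise fall back to listing the remaining charging time of nearby occupied chargers. The field names are hypothetical stand-ins for the patent's "charging-related information".

```python
def charging_guide(chargers, max_wait_listing=3):
    free = [c for c in chargers if c["state"] == "free"]
    if free:
        target = free[0]
        return {"guide_to": target["id"],
                "method": target["method"],      # e.g., DC fast vs. AC slow
                "cost_per_kwh": target["cost"]}
    # Search failed: fall back to remaining-time display for occupied chargers.
    busy = sorted(chargers, key=lambda c: c["remaining_min"])
    return {"waitlist": [(c["id"], c["remaining_min"])
                         for c in busy[:max_wait_listing]]}

print(charging_guide([
    {"id": "A1", "state": "busy", "method": "DC", "cost": 0.4, "remaining_min": 12},
    {"id": "A2", "state": "busy", "method": "AC", "cost": 0.3, "remaining_min": 35},
]))
```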
  • The processor displays the first AR object rotating in response to the driving direction of the vehicle, separates the second AR object, and can update the AR graphic interface to display a guide trajectory heading from the first AR object toward the location of the discovered parking area.
  • the guide trajectory for the searched available parking area may be a guide path generated based on at least one of the vehicle's ADAS sensing data and the control data.
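A minimal sketch of such a guide trajectory follows, under the assumption that a route supplied via the parking-lot control data takes precedence and a locally interpolated path is the ADAS-side fallback; the straight-line interpolation and the precedence policy are illustrative, not from the patent.

```python
def interpolate(a, b, steps=10):
    """Straight-line waypoints from point a to point b."""
    return [(a[0] + (b[0] - a[0]) * i / steps,
             a[1] + (b[1] - a[1]) * i / steps) for i in range(steps + 1)]

def guide_trajectory(vehicle_xy, slot_xy, control_route=None):
    # A route from the parking-lot control data wins when present (assumed policy).
    return control_route if control_route else interpolate(vehicle_xy, slot_xy)

print(guide_trajectory((0.0, 0.0), (10.0, 4.0))[:3])
```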
  • According to the AR display device and its operating method, a guide for the current location and the predicted driving situation of the vehicle is simultaneously provided as AR objects on the front image, which is calibrated without separate settings, thereby providing the vehicle with more intuitive and realistic AR guidance.
  • According to the AR display device and its operating method, when a vehicle enters a parking lot/charging station, navigation, routes, and necessary information can be provided through a more intuitive AR graphic interface.
  • When a vehicle enters a parking lot or charging station, the device communicates with the control server of that location, or uses ADAS sensing, to display route guidance to parking/charging areas, parking/charging-related information, and route guidance when exiting through a more intuitive AR graphic interface, so that a more direct and smart parking/charging-related UX can be provided.
  • FIG. 1 is a diagram illustrating an example of a vehicle related to an embodiment of the present invention.
  • Figure 2 is a view of a vehicle related to an embodiment of the present invention viewed from various angles.
  • FIGS. 3 and 4 are diagrams showing the interior of a vehicle related to an embodiment of the present invention.
  • FIGS. 5 and 6 are diagrams referenced for explaining various objects related to driving of a vehicle related to an embodiment of the present invention.
  • Figure 7 is a block diagram referenced for explaining a vehicle and an AR display device related to an embodiment of the present invention.
  • FIG. 8 is a detailed block diagram related to the processor of the AR display device according to an embodiment of the present invention.
  • FIG. 9 is a diagram referenced for explaining a navigation screen according to an embodiment of the present invention.
  • FIG. 10 is a diagram referenced for explaining an operation of generating the navigation screen of FIG. 9.
  • Figure 11 is a flowchart referenced to explain a method of displaying an AR graphic interface on a navigation screen according to an embodiment of the present invention.
  • FIGS. 12A and 12B are illustrations of an AR graphic interface according to an embodiment of the present invention, and are reference diagrams for explaining separation and combination into first and second AR objects.
  • FIG. 13 is a flowchart referenced to explain a method of providing a UX display related to parking/charging of a vehicle using an AR graphic interface as a method of operating an AR display device according to an embodiment of the present invention.
  • FIGS. 14A, 14B, 14C, and 14D are conceptual diagrams illustrating guiding a parking area by changing the AR graphic interface based on ADAS sensing data, according to an embodiment of the present invention.
  • FIGS. 15A, 15B, 15C, and 15D are conceptual diagrams to explain guiding a parking area by changing the AR graphic interface based on control information, according to an embodiment of the present invention.
  • FIGS. 16A and 16B are conceptual diagrams related to updating the display of an AR graphic interface according to a parking type, according to an embodiment of the present invention.
  • FIG. 17 is a flowchart illustrating guiding of the charging area and charging through an AR graphic interface based on control information, according to an embodiment of the present invention, and FIGS. 18A, 18B, and 18C are conceptual diagrams referred to in explaining FIG. 17.
  • the vehicle described in this specification may include a car and a motorcycle. Below, description of vehicles will focus on automobiles.
  • the vehicle described in this specification may be a concept that includes all internal combustion engine vehicles having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
  • the left side of the vehicle refers to the left side of the vehicle's traveling direction
  • the right side of the vehicle refers to the right side of the vehicle's traveling direction
  • the “system” disclosed in this specification may include at least one of a server device and a cloud device, but is not limited thereto.
  • a system may consist of one or more server devices.
  • a system may consist of one or more cloud devices.
  • the system may be operated with a server device and a cloud device configured together.
  • “Map information” or “map data” disclosed in this specification may include images captured through vision sensors such as cameras, two-dimensional map information, three-dimensional map information, digital-twin three-dimensional maps, and high-definition maps (HD maps), as well as maps in real/virtual space, map data, and map-related applications.
  • “Point of interest (POI)” information refers to points of interest selected based on the map information or map data, and may include pre-registered POI information (POIs stored in the map of a cloud server), user-set POI information (e.g., my home, school, work, etc.), driving-related POI information (e.g., destination, waypoint, gas station, rest area, parking lot, etc.), and top-search POI information (e.g., recently clicked/highly visited POIs, hot places, etc.). This POI information can be updated in real time based on the current location of the vehicle.
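To make those POI categories concrete, here is a minimal sketch of a POI record plus a location-based refresh; the field names and the simple bounding-box filter are assumptions, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class POI:
    name: str
    lat: float
    lon: float
    category: str  # "pre-registered" | "user-set" | "driving" | "top-search"

def nearby_pois(pois, vehicle_lat, vehicle_lon, radius_deg=0.01):
    # Real-time update: keep only POIs near the vehicle's current location.
    return [p for p in pois
            if abs(p.lat - vehicle_lat) < radius_deg
            and abs(p.lon - vehicle_lon) < radius_deg]
```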
  • The “front image” disclosed in this specification is obtained through a vision sensor on or around the vehicle, or through the AR camera of the AR display device; for example, it may include images acquired or projected through a vision sensor (camera, image laser sensor, etc.) while the vehicle is driving, the real image itself projected onto the windshield of the vehicle, or an image of a virtual space. In other words, the front image includes all of the following: images output through a display, images projected through a laser sensor, and the actual scene itself seen through the windshield of the vehicle.
  • FIGS. 1 and 2 are diagrams showing the exterior of a vehicle related to an embodiment of the present invention
  • FIGS. 3 and 4 are diagrams showing the interior of a vehicle related to an embodiment of the present invention.
  • FIGS. 5 and 6 are diagrams showing various objects related to driving of a vehicle related to an embodiment of the present invention.
  • FIG. 7 is a block diagram referenced for explaining a vehicle related to an embodiment of the present invention.
  • the vehicle 100 may include wheels rotated by a power source and a steering input device 510 for controlling the moving direction of the vehicle 100.
  • Vehicle 100 may be an autonomous vehicle.
  • the vehicle 100 may be switched to autonomous driving mode or manual mode based on user input.
  • The vehicle 100 may be switched from manual mode to autonomous driving mode, or from autonomous driving mode to manual mode, based on user input received through the user interface device (hereinafter referred to as the 'user terminal') 200.
  • the vehicle 100 may be switched to autonomous driving mode or manual mode based on driving situation information.
  • Driving situation information may be generated based on object information provided by the object detection device 300.
  • the vehicle 100 may be switched from manual mode to autonomous driving mode, or from autonomous driving mode to manual mode, based on driving situation information generated by the object detection device 300.
  • the vehicle 100 may be switched from manual mode to autonomous driving mode, or from autonomous driving mode to manual mode, based on driving situation information received through the communication device 400.
  • the vehicle 100 may be switched from manual mode to autonomous driving mode or from autonomous driving mode to manual mode based on information, data, and signals provided from an external device.
  • The autonomous vehicle 100 may be driven based on the operation system 700.
  • The autonomous vehicle 100 may be driven based on information, data, or signals generated by the driving system 710, the parking-out system 740, and the parking system 750.
  • the autonomous vehicle 100 may receive user input for driving through the driving control device 500. Based on user input received through the driving control device 500, the vehicle 100 may be driven.
  • the overall length refers to the length from the front to the rear of the vehicle 100
  • the overall width refers to the width of the vehicle 100
  • the overall height refers to the length from the bottom of the wheels to the roof.
  • The overall length direction (L) is the direction that serves as the standard for measuring the overall length of the vehicle 100, the overall width direction (W) is the direction that serves as the standard for measuring the overall width of the vehicle 100, and the overall height direction (H) is the direction that serves as the standard for measuring the overall height of the vehicle 100.
  • The vehicle 100 may include a user interface device (hereinafter referred to as a 'user terminal') 200, an object detection device 300, a communication device 400, a driving control device 500, a vehicle driving device 600, an operation system 700, a navigation system 770, a sensing unit 120, a vehicle interface unit 130, a memory 140, a control unit 170, and a power supply unit 190.
  • the vehicle 100 may further include other components in addition to the components described in this specification, or may not include some of the components described.
  • the user interface device 200 is a device for communication between the vehicle 100 and the user.
  • the user interface device 200 may receive user input and provide information generated by the vehicle 100 to the user.
  • the vehicle 100 may implement User Interfaces (UI) or User Experience (UX) through a user interface device (hereinafter referred to as a 'user terminal') 200.
  • the user interface device 200 may include an input unit 210, an internal camera 220, a biometric detection unit 230, an output unit 250, and a processor 270. Depending on the embodiment, the user interface device 200 may further include other components in addition to the components described, or may not include some of the components described.
  • The input unit 210 is used to receive information from the user, and the data collected by the input unit 210 can be analyzed by the processor 270 and processed as a user's control command.
  • the input unit 210 may be placed inside the vehicle.
  • The input unit 210 may be disposed in an area of the steering wheel, an area of the instrument panel, an area of the seat, an area of each pillar, an area of the door, etc.
  • the input unit 210 may include a voice input unit 211, a gesture input unit 212, a touch input unit 213, and a mechanical input unit 214.
  • the voice input unit 211 can convert the user's voice input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the control unit 170.
  • the voice input unit 211 may include one or more microphones.
  • the gesture input unit 212 can convert the user's gesture input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the control unit 170.
  • the gesture input unit 212 may include at least one of an infrared sensor and an image sensor for detecting a user's gesture input. Depending on the embodiment, the gesture input unit 212 may detect a user's 3D gesture input. To this end, the gesture input unit 212 may include a light output unit that outputs a plurality of infrared lights or a plurality of image sensors.
  • the gesture input unit 212 may detect the user's 3D gesture input through a time of flight (TOF) method, a structured light method, or a disparity method.
  • the touch input unit 213 can convert the user's touch input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the control unit 170.
  • the touch input unit 213 may include a touch sensor for detecting a user's touch input.
  • the touch input unit 213 may be formed integrally with the display unit 251 to implement a touch screen. This touch screen can provide both an input interface and an output interface between the vehicle 100 and the user.
  • the mechanical input unit 214 may include at least one of a button, a dome switch, a jog wheel, and a jog switch.
  • the electrical signal generated by the mechanical input unit 214 may be provided to the processor 270 or the control unit 170.
  • the mechanical input unit 214 may be placed on a steering wheel, center fascia, center console, cockpit module, door, etc.
  • the internal camera 220 can acquire images inside the vehicle.
  • the processor 270 may detect the user's state based on the image inside the vehicle.
  • the processor 270 may obtain the user's gaze information from the image inside the vehicle.
  • the processor 270 may detect a user's gesture from an image inside the vehicle.
  • the biometric detection unit 230 can acquire the user's biometric information.
  • the biometric detection unit 230 includes a sensor that can acquire the user's biometric information, and can obtain the user's fingerprint information, heart rate information, etc. using the sensor. Biometric information can be used for user authentication.
  • the output unit 250 is for generating output related to vision, hearing, or tactile sensation.
  • the output unit 250 may include at least one of a display unit 251, an audio output unit 252, and a haptic output unit 253.
  • the display unit 251 can display graphic objects corresponding to various information.
  • The display unit 251 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • the display unit 251 and the touch input unit 213 may form a layered structure or be formed as one piece, thereby implementing a touch screen.
  • the display unit 251 may be implemented as a Head Up Display (HUD).
  • the display unit 251 is equipped with a projection module and can output information through an image projected on a windshield or window.
  • the display unit 251 may include a transparent display.
  • the transparent display can be attached to a windshield or window.
  • a transparent display can display a certain screen while having a certain transparency.
  • Transparent displays may include at least one of a transparent TFEL (thin film electroluminescent) display, a transparent OLED (organic light-emitting diode) display, a transparent LCD (liquid crystal display), a transmissive transparent display, and a transparent LED (light emitting diode) display. The transparency of a transparent display can be adjusted.
  • the user interface device 200 may include a plurality of display units 251a to 251g.
  • The display unit 251 may be disposed in an area of the steering wheel, an area of the instrument panel (251a, 251b, 251e), an area of the seat (251d), an area of each pillar (251f), an area of the door (251g), an area of the center console, an area of the headlining, or an area of the sun visor, or may be implemented in an area of the windshield (251c) or an area of the window (251h).
  • The audio output unit 252 converts an electrical signal provided from the processor 270 or the control unit 170 into an audio signal and outputs it. To this end, the audio output unit 252 may include one or more speakers.
  • the haptic output unit 253 generates a tactile output.
  • the haptic output unit 253 may operate to vibrate the steering wheel, seat belt, and seats 110FL, 110FR, 110RL, and 110RR so that the user can perceive the output.
  • The processor 270 may control the overall operation of each unit of the user interface device 200.
  • the user interface device 200 may include a plurality of processors 270 or may not include the processor 270.
  • the user interface device 200 may be operated under the control of the processor 170 or a processor of another device in the vehicle 100.
  • the user interface device 200 may be called a vehicle display device.
  • the user interface device 200 may be operated under the control of the control unit 170.
  • The object detection device 300 is a device for detecting objects located outside the vehicle 100. Objects may be various items related to the operation of the vehicle 100. Referring to FIGS. 5 and 6, objects O may include a lane OB10, another vehicle OB11, a pedestrian OB12, a two-wheeled vehicle OB13, traffic signals OB14 and OB15, light, a road, a structure, a speed bump, a landmark, an animal, and the like.
  • Lane OB10 may be a driving lane, a lane next to a driving lane, or a lane in which an oncoming vehicle travels. Lane OB10 may be a concept that includes left and right lines forming a lane.
  • the other vehicle OB11 may be a vehicle running around the vehicle 100 .
  • the other vehicle may be a vehicle located within a predetermined distance from the vehicle 100.
  • the other vehicle OB11 may be a vehicle that precedes or follows the vehicle 100.
  • the pedestrian OB12 may be a person located around the vehicle 100.
  • the pedestrian OB12 may be a person located within a predetermined distance from the vehicle 100.
  • a pedestrian OB12 may be a person located on a sidewalk or roadway.
  • The two-wheeled vehicle OB13 may refer to a vehicle located around the vehicle 100 that moves using two wheels.
  • The two-wheeled vehicle OB13 may be a vehicle with two wheels located within a predetermined distance from the vehicle 100.
  • the two-wheeled vehicle OB13 may be a motorcycle or bicycle located on a sidewalk or roadway.
  • Traffic signals may include traffic lights (OB15), traffic signs (OB14), and patterns or text drawn on the road surface.
  • the light may be light generated from a lamp provided in another vehicle.
  • the light can be the light generated from street lights.
  • the light may be sunlight.
  • a road may include a road surface, a curve, a slope such as uphill or downhill, etc.
  • the structure may be an object located near the road and fixed to the ground.
  • structures may include streetlights, trees, buildings, electric poles, traffic lights, and bridges.
  • Landforms may include mountains, hills, etc.
  • objects can be classified into moving objects and fixed objects.
  • a moving object may be a concept that includes other vehicles and pedestrians.
  • a fixed object may be a concept including a traffic signal, road, or structure.
  • the object detection device 300 may include a camera 310, radar 320, lidar 330, ultrasonic sensor 340, infrared sensor 350, and processor 370.
  • the object detection apparatus 300 may further include other components in addition to the components described, or may not include some of the components described.
  • the camera 310 may be located at an appropriate location outside the vehicle to obtain images of the exterior of the vehicle.
  • the camera 310 may be a mono camera, a stereo camera 310a, an Around View Monitoring (AVM) camera 310b, or a 360-degree camera.
  • camera 310 may be placed close to the front windshield, inside the vehicle, to obtain an image of the front of the vehicle.
  • the camera 310 may be placed around the front bumper or radiator grill.
  • the camera 310 may be placed close to the rear windshield in the interior of the vehicle to obtain an image of the rear of the vehicle.
  • the camera 310 may be placed around the rear bumper, trunk, or tailgate.
  • the camera 310 may be placed close to at least one of the side windows inside the vehicle to obtain an image of the side of the vehicle.
  • the camera 310 may be placed around a side mirror, fender, or door.
  • the camera 310 may provide the acquired image to the processor 370.
  • Radar 320 may include an electromagnetic wave transmitting unit and a receiving unit.
  • the radar 320 may be implemented as a pulse radar or continuous wave radar based on the principle of transmitting radio waves.
  • Among continuous wave radar methods, the radar 320 may be implemented in a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method depending on the signal waveform.
  • The radar 320 detects an object using electromagnetic waves based on a time of flight (TOF) method or a phase-shift method, and can detect the location of the detected object, the distance to it, and the relative speed.
  • the radar 320 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • LiDAR 330 may include a laser transmitter and a receiver. LiDAR 330 may be implemented in a time of flight (TOF) method or a phase-shift method.
  • LiDAR 330 may be implemented as a driven or non-driven type.
  • When implemented in a driven manner, the LiDAR 330 is rotated by a motor and can detect objects around the vehicle 100.
  • When implemented in a non-driven manner, the LiDAR 330 can detect objects located within a predetermined range from the vehicle 100 through light steering.
  • The vehicle 100 may include a plurality of non-driven LiDARs 330.
  • The LiDAR 330 detects an object via laser light based on a time of flight (TOF) method or a phase-shift method, and can detect the location of the detected object, the distance to it, and the relative speed.
  • Lidar 330 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • the ultrasonic sensor 340 may include an ultrasonic transmitter and a receiver.
  • the ultrasonic sensor 340 can detect an object based on ultrasonic waves and detect the location of the detected object, the distance to the detected object, and the relative speed.
  • the ultrasonic sensor 340 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • the infrared sensor 350 may include an infrared transmitter and a receiver.
  • The infrared sensor 350 can detect an object based on infrared light, and can detect the location of the detected object, the distance to it, and the relative speed.
  • the infrared sensor 350 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • the processor 370 may control the overall operation of each unit of the object detection device 300.
  • the processor 370 can detect and track an object based on the acquired image.
  • the processor 370 can perform operations such as calculating a distance to an object and calculating a relative speed to an object through an image processing algorithm.
  • The processor 370 can detect and track an object based on reflected electromagnetic waves, that is, transmitted electromagnetic waves that are reflected by the object and returned.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed to an object, based on electromagnetic waves.
  • the processor 370 may detect and track an object based on reflected laser light that is returned after the transmitted laser is reflected by the object.
  • the processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed to the object, based on the laser light.
  • the processor 370 may detect and track an object based on reflected ultrasonic waves in which the transmitted ultrasonic waves are reflected by the object and returned.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed to an object based on ultrasonic waves.
  • the processor 370 may detect and track an object based on the reflected infrared light that is returned after the transmitted infrared light is reflected by the object.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed to an object based on infrared light.
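The distance and relative-speed calculations mentioned above for the radar, LiDAR, ultrasonic, and infrared paths reduce to simple time-of-flight arithmetic: a pulse travels to the object and back, so the distance is half the round trip times the wave speed, and the relative speed follows from two successive ranges. A small worked sketch (the timings and sampling interval are illustrative):

```python
C = 299_792_458.0  # speed of light (m/s) for radar/LiDAR; use ~343 m/s for ultrasound

def tof_distance(round_trip_s, wave_speed=C):
    return wave_speed * round_trip_s / 2.0   # out-and-back, so halve it

def relative_speed(d1_m, d2_m, dt_s):
    return (d2_m - d1_m) / dt_s              # negative -> object closing in

d1 = tof_distance(400e-9)   # 400 ns round trip -> ~60 m
d2 = tof_distance(396e-9)   # measured 0.1 s later
print(d1, relative_speed(d1, d2, 0.1))       # object approaching at ~6 m/s
```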
  • the object detection apparatus 300 may include a plurality of processors 370 or may not include the processor 370.
  • the camera 310, radar 320, lidar 330, ultrasonic sensor 340, and infrared sensor 350 may each individually include a processor.
  • The object detection device 300 may be operated under the control of a processor of another device in the vehicle 100 or of the control unit 170.
  • The object detection device 300 may be operated under the control of the control unit 170.
  • the communication device 400 is a device for communicating with an external device.
  • the external device may be another vehicle, mobile terminal, or server.
  • the communication device 400 may include at least one of a transmitting antenna, a receiving antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element to perform communication.
  • the communication device 400 may include a short-range communication unit 410, a location information unit 420, a V2X communication unit 430, an optical communication unit 440, a broadcast transceiver 450, and a processor 470.
  • the communication device 400 may further include other components in addition to the components described, or may not include some of the components described.
  • the short-range communication unit 410 is a unit for short-range communication.
  • The short-range communication unit 410 may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • the short-range communication unit 410 may form a wireless area network and perform short-range communication between the vehicle 100 and at least one external device.
  • the location information unit 420 is a unit for acquiring location information of the vehicle 100.
  • the location information unit 420 may include a Global Positioning System (GPS) module or a Differential Global Positioning System (DGPS) module.
  • the V2X communication unit 430 is a unit for performing wireless communication with a server (V2I: Vehicle to Infra), another vehicle (V2V: Vehicle to Vehicle), or a pedestrian (V2P: Vehicle to Pedestrian).
  • the V2X communication unit 430 may include an RF circuit capable of implementing communication with infrastructure (V2I), communication between vehicles (V2V), and communication with pedestrians (V2P) protocols.
  • the optical communication unit 440 is a unit for communicating with an external device through light.
  • the optical communication unit 440 may include an optical transmitter that converts an electrical signal into an optical signal and transmits it to the outside, and an optical receiver that converts the received optical signal into an electrical signal.
  • the light emitting unit may be formed to be integrated with the lamp included in the vehicle 100.
  • the broadcast transceiver 450 is a unit for receiving a broadcast signal from an external broadcast management server through a broadcast channel or transmitting a broadcast signal to the broadcast management server.
  • Broadcast channels may include satellite channels and terrestrial channels.
  • Broadcast signals may include TV broadcast signals, radio broadcast signals, and data broadcast signals.
  • the processor 470 may control the overall operation of each unit of the communication device 400.
  • the communication device 400 may include a plurality of processors 470 or may not include the processor 470.
  • the communication device 400 may be operated under the control of the processor 170 or a processor of another device in the vehicle 100.
  • the communication device 400 may implement a vehicle display device together with the user interface device 200.
  • the vehicle display device may be called a telematics device or an AVN (Audio Video Navigation) device.
  • the communication device 400 may be operated under the control of the control unit 170.
  • the driving control device 500 is a device that receives user input for driving.
  • the vehicle 100 may be operated based on signals provided by the driving control device 500.
  • the driving control device 500 may include a steering input device 510, an acceleration input device 530, and a brake input device 570.
  • the steering input device 510 may receive an input of the direction of travel of the vehicle 100 from the user.
  • the steering input device 510 is preferably formed in a wheel shape to enable steering input by rotation.
  • the steering input device may be formed in the form of a touch screen, touch pad, or button.
  • the acceleration input device 530 may receive an input for acceleration of the vehicle 100 from the user.
  • the brake input device 570 may receive an input for decelerating the vehicle 100 from the user.
  • the acceleration input device 530 and the brake input device 570 are preferably formed in the form of pedals. Depending on the embodiment, the acceleration input device or the brake input device may be formed in the form of a touch screen, touch pad, or button.
  • the driving control device 500 may be operated under the control of the control unit 170.
  • the vehicle driving device 600 is a device that electrically controls the operation of various devices in the vehicle 100.
  • the vehicle driving device 600 may include a power train driving unit 610, a chassis driving unit 620, a door/window driving unit 630, a safety device driving unit 640, a lamp driving unit 650, and an air conditioning driving unit 660. You can.
  • the vehicle driving device 600 may further include other components in addition to the components described, or may not include some of the components described.
  • the vehicle driving device 600 may include a processor. Each unit of the vehicle driving device 600 may individually include a processor.
  • the power train driver 610 can control the operation of the power train device.
  • the power train driving unit 610 may include a power source driving unit 611 and a transmission driving unit 612.
  • the power source driver 611 may control the power source of the vehicle 100.
  • The power source driving unit 611 may perform electronic control of the engine. Thereby, the output torque of the engine, etc. can be controlled.
  • the power source driving unit 611 can adjust the engine output torque according to the control of the control unit 170.
  • The power source driving unit 611 may control the motor.
  • The power source driving unit 611 can adjust the rotational speed and torque of the motor according to the control of the control unit 170.
  • the transmission drive unit 612 can control the transmission.
  • the transmission drive unit 612 can adjust the state of the transmission.
  • the transmission drive unit 612 can adjust the state of the transmission to forward (D), reverse (R), neutral (N), or park (P).
  • the transmission drive unit 612 can adjust the gear engagement state in the forward (D) state.
  • the chassis driver 620 can control the operation of the chassis device.
  • the chassis drive unit 620 may include a steering drive unit 621, a brake drive unit 622, and a suspension drive unit 623.
  • the steering drive unit 621 may perform electronic control of the steering apparatus within the vehicle 100.
  • the steering drive unit 621 can change the moving direction of the vehicle.
  • the brake driver 622 may perform electronic control of the brake apparatus within the vehicle 100. For example, the speed of the vehicle 100 can be reduced by controlling the operation of the brakes disposed on the wheels.
  • the brake driver 622 can individually control each of the plurality of brakes.
  • the brake driver 622 can control braking force applied to a plurality of wheels differently.
  • the suspension drive unit 623 may perform electronic control of the suspension apparatus within the vehicle 100. For example, when the road surface is curved, the suspension drive unit 623 may control the suspension device to reduce vibration of the vehicle 100. Meanwhile, the suspension driving unit 623 can individually control each of the plurality of suspensions.
  • the door/window driving unit 630 may perform electronic control of the door apparatus or window apparatus within the vehicle 100.
  • the door/window driving unit 630 may include a door driving unit 631 and a window driving unit 632.
  • the door driver 631 can control the door device.
  • the door driver 631 can control the opening and closing of a plurality of doors included in the vehicle 100.
  • the door driver 631 can control the opening or closing of the trunk or tail gate.
  • the door driver 631 can control the opening or closing of the sunroof.
  • The window driving unit 632 may perform electronic control of a window apparatus, and can control the opening or closing of a plurality of windows included in the vehicle 100.
  • the safety device driver 640 may perform electronic control of various safety apparatuses in the vehicle 100.
  • the safety device driver 640 may include an airbag driver 641, a seat belt driver 642, and a pedestrian protection device driver 643.
  • the airbag driving unit 641 may perform electronic control of the airbag apparatus within the vehicle 100.
  • the airbag driving unit 641 may control the airbag to be deployed when danger is detected.
  • The seat belt driving unit 642 may perform electronic control of the seat belt apparatus in the vehicle 100. For example, when danger is detected, the seat belt driving unit 642 can fix passengers to their seats 110FL, 110FR, 110RL, and 110RR using the seat belts.
  • the pedestrian protection device driving unit 643 may perform electronic control of the hood lift and pedestrian airbag. For example, the pedestrian protection device driving unit 643 may control the hood to lift up and the pedestrian airbag to deploy when a collision with a pedestrian is detected.
  • the lamp driver 650 may perform electronic control of various lamp apparatuses in the vehicle 100.
  • The air conditioning driving unit 660 may perform electronic control of the air conditioner in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioning driving unit 660 can operate the air conditioner so that cold air is supplied into the vehicle interior.
  • the vehicle driving device 600 may include a processor. Each unit of the vehicle driving device 600 may individually include a processor.
  • the vehicle driving device 600 may be operated under the control of the control unit 170.
  • The operation system 700 is a system that controls various operations of the vehicle 100.
  • The operation system 700 may be operated in autonomous driving mode.
  • The operation system 700 may include a driving system 710, a parking-out system 740, and a parking system 750.
  • The operation system 700 may further include other components in addition to the components described, or may not include some of the components described.
  • The operation system 700 may include a processor. Each unit of the operation system 700 may individually include a processor.
  • When the operation system 700 is implemented in software, it may be a sub-concept of the control unit 170.
  • The operation system 700 may be a concept that includes at least one of the user interface device 200, the object detection device 300, the communication device 400, the vehicle driving device 600, and the control unit 170.
  • the driving system 710 can drive the vehicle 100.
  • the driving system 710 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
  • the driving system 710 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
  • the driving system 710 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
  • The parking-out system 740 can take the vehicle 100 out of a parking space.
  • The parking-out system 740 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to take the vehicle 100 out of a parking space.
  • The parking-out system 740 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to take the vehicle 100 out of a parking space.
  • The parking-out system 740 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600 to take the vehicle 100 out of a parking space.
  • the parking system 750 can park the vehicle 100.
  • the parking system 750 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to park the vehicle 100.
  • the parking system 750 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to park the vehicle 100.
  • the parking system 750 may park the vehicle 100 by receiving a signal from an external device through the communication device 400 and providing a control signal to the vehicle driving device 600.
  • the navigation system 770 may provide navigation information.
  • Navigation information may include at least one of map information, set destination information, route information according to the set destination, information on various objects on the route, lane information, and current location information of the vehicle.
  • the navigation system 770 may include memory and a processor.
  • the memory can store navigation information.
  • the processor may control the operation of the navigation system 770.
  • the navigation system 770 may receive information from an external device through the communication device 400 and update pre-stored information.
  • the navigation system 770 may be classified as a sub-component of the user interface device 200.
  • the sensing unit 120 can sense the status of the vehicle.
  • The sensing unit 120 may include a posture sensor (e.g., yaw sensor, roll sensor, pitch sensor), a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, an illuminance sensor, an accelerator pedal position sensor, a brake pedal position sensor, etc.
  • The sensing unit 120 can obtain sensing signals for vehicle posture information, vehicle collision information, vehicle direction information, vehicle location information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, illuminance outside the vehicle, pressure applied to the accelerator pedal, pressure applied to the brake pedal, etc.
  • The sensing unit 120 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), etc.
  • the vehicle interface unit 130 may serve as a passageway for various types of external devices connected to the vehicle 100.
  • the vehicle interface unit 130 may have a port that can be connected to a mobile terminal, and can be connected to a mobile terminal through the port. In this case, the vehicle interface unit 130 can exchange data with the mobile terminal.
  • the vehicle interface unit 130 may serve as a conduit for supplying electrical energy to a connected mobile terminal.
  • The vehicle interface unit 130 may provide electrical energy supplied from the power supply unit 190 to the mobile terminal under the control of the control unit 170.
  • the memory 140 is electrically connected to the control unit 170.
  • the memory 140 can store basic data for the unit, control data for controlling the operation of the unit, and input/output data.
  • the memory 140 may be a variety of storage devices such as ROM, RAM, EPROM, flash drive, hard drive, etc.
  • the memory 140 may store various data for the overall operation of the vehicle 100, such as programs for processing or controlling the control unit 170.
  • the memory 140 may be formed integrally with the control unit 170 or may be implemented as a sub-component of the control unit 170.
  • the control unit 170 may control the overall operation of each unit within the vehicle 100.
  • the control unit 170 may be named ECU (Electronic Control Unit).
  • the power supply unit 190 may supply power required for the operation of each component under the control of the control unit 170.
  • the power supply unit 190 may receive power from a battery inside the vehicle.
  • The processors and the control unit 170 included in the vehicle 100 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
  • The AR display device 800 is used for navigation of the vehicle 100. Based on the information and data received from the AR camera, an AR graphic interface indicating the driving state of the vehicle can be displayed in real time, through AR merging, on the front image of the vehicle (or on the vehicle's windshield).
  • The AR display device 800 may include a communication module 810 for communicating with other devices/systems, servers, and the vehicle; a processor 820 that controls the overall operation of the AR display device 800; and a display 830 for displaying a navigation screen including a front image on which the AR graphic interface is rendered.
  • The 'front image' refers not only to images captured through a camera sensor (or smart glasses including such a function), but may include both images of real space, such as images reflected on an LCD screen through a camera sensor and the real scene shown through the windshield/dashboard, and/or a digitally twinned 3D image.
  • The 'navigation screen including a front image (or driving image)' disclosed in this specification may mean a navigation screen created based on the current location and navigation information, on which is layered a front image implemented as one of: a front image captured through the vehicle's camera, an image reflected on an LCD screen, an image of real space shown through the windshield, and/or a digitally twinned 3D image.
  • 'Parking area' is used to include both a charging station, including chargers, and a parking lot, including parking spaces.
  • the navigation screen may be an AR navigation screen to which AR technology is applied.
  • The 'AR graphic interface' disclosed in this specification is a graphical user interface using augmented reality (AR) technology, and is AR-merged onto the front image of the vehicle in real time.
  • the AR graphic interface may be an AR graphic image representing the current driving state of the vehicle.
• the AR graphic interface disclosed in this specification may be an AR graphic image that represents a guide for the driving situation of the vehicle together with the current driving state of the vehicle. In this case, the guide for the driving situation is displayed on the front image of the vehicle a certain distance and/or a certain time ahead of the corresponding situation.
• the AR graphic interface disclosed in this specification may be implemented as an AR graphic image that moves or changes depending on the current driving state of the vehicle and/or the driving situation of the vehicle.
  • the AR display device 800 may be implemented as part of the electrical equipment or system of the vehicle 100, or may be implemented as a separate independent device or system.
• the AR display device 800 may be implemented in the form of a program, consisting of instructions executed by a processor, on a device such as a user terminal of the vehicle 100.
• the AR display device 800 may communicate with the vehicle 100, other devices, and/or servers to acquire the front image of the vehicle through an AR camera and sensing data through sensors provided in the vehicle (e.g., a gyroscope sensor, an acceleration sensor, a gravity sensor, a geomagnetic sensor, a temperature sensor, etc.).
  • the AR display device 800 may run a preset application, for example, an (AR) navigation application.
• the AR display device 800 renders an AR graphic interface indicating the current driving state of the vehicle based on map data (e.g., route and POI information), sensing data, and the front image acquired by the camera, and can provide it in real time to the AR GUI surface and AR camera surface of the navigation application.
• the AR display device 800 may render an AR object, separated from the AR graphic interface, to display a guide for the driving situation of the vehicle based on map data (e.g., route and POI information), sensing data, and the front image acquired by the camera, and provide it in real time to the AR GUI surface and AR camera surface of the navigation application.
  • the separated AR object may be named 'second AR object', and the remaining part of the AR graphic interface after the second AR object is separated may be named 'first AR object'.
  • the AR graphic interface can be said to include a first AR object that represents the current driving state of the vehicle and a second AR object that displays a guide to the driving situation of the vehicle.
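• as a rough illustration (not prescribed by this specification), the two-part interface might be modeled in software as follows; the class and field names are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class ARObject:
    """One visual element of the AR graphic interface."""
    kind: str                          # "first" (driving state) or "second" (guide)
    position: tuple = (0.0, 0.0, 0.0)  # position in the AR scene
    heading_deg: float = 0.0           # direction the object points

@dataclass
class ARGraphicInterface:
    """First AR object: current driving state; second AR object: situation guide."""
    first: ARObject = field(default_factory=lambda: ARObject("first"))
    second: ARObject = field(default_factory=lambda: ARObject("second"))
    separated: bool = False            # False: rendered as one combined shape

    def separate(self) -> None:
        """Detach the second object so it can display a guide ahead of the vehicle."""
        self.separated = True

    def recombine(self) -> None:
        """Re-attach the second object to the first (e.g., when guidance ends)."""
        self.separated = False
        self.second.position = self.first.position
        self.second.heading_deg = self.first.heading_deg
```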
• FIG. 8 is a detailed block diagram of the processor 820 of the AR display device 800 according to the embodiment of the present invention described above.
  • the conceptual diagram shown in FIG. 8 may include configurations related to operations performed by the processor 820 of the AR display device 800 and information, data, and programs used therefor.
  • the block diagram shown in FIG. 8 may be used to mean a service provided through the processor 820 and/or a system executed/implemented by the processor 820.
• hereinafter, for convenience of explanation, it will be referred to as the processor 820.
  • FIG. 9 is a diagram referenced for explaining a navigation screen according to an embodiment of the present invention
  • FIG. 10 is a diagram referenced for explaining an operation of generating the navigation screen of FIG. 9 .
• the processor 820 may operate by including and/or in conjunction with a navigation engine (Navigation Engine) 910, an augmented reality engine (AR Engine) 920, a navigation application (Navigation Application) 930, and sensors and maps 940.
  • the navigation engine 910 may receive map data and GPS data from a vehicle, etc.
  • the navigation engine 910 may perform map matching based on map data and GPS data.
  • the navigation engine 910 may perform route planning according to map matching.
  • the navigation engine 910 can display a map and perform route guidance.
  • the navigation engine 910 may provide route guidance information to the navigation application 930.
  • the navigation engine 910 may include a navigation controller 911.
  • the navigation controller 911 may receive map matching data, map display data, and route guidance data.
  • the navigation controller 911 may provide route data, point of interest (POI) data, etc. to the AR engine 920 based on the received map matching data, map display data, and route guidance data.
  • the navigation controller 911 may provide route guidance data and map display frames to the navigation application 930.
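• the data flow just described could be sketched roughly as follows (the function and dictionary keys are assumptions for illustration; the specification does not define an API): the navigation controller forwards route/POI data to the AR engine, and route guidance data plus map display frames to the navigation application.

```python
def navigation_controller_step(map_matching, map_display, route_guidance):
    """Illustrative split of the navigation controller's two output streams."""
    # Route and POI data go to the AR engine 920 for AR rendering.
    ar_engine_input = {
        "route": route_guidance["route"],
        "pois": map_matching.get("pois", []),
    }
    # Route guidance data and map display frames go to the navigation application 930.
    nav_app_input = {
        "guidance": route_guidance,
        "map_frame": map_display["frame"],
    }
    return ar_engine_input, nav_app_input
```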
  • the AR engine 920 may include an adapter 921 and a renderer 922.
• the adapter 921 may receive front image data acquired from a camera (e.g., an AR camera) and sensing data obtained from sensors of the vehicle, such as a gyroscope, an acceleration sensor, a gravity sensor, a magnetometer, and/or a temperature sensor (thermometer).
• the AR engine 920 may receive sensing data obtained from ADAS sensors (e.g., camera, radar, lidar, ultrasonic, sonar). For example, driving-related sensing data such as the driving direction, speed, and distance from lanes can be obtained through the ADAS sensors.
  • the AR engine 920 can receive high-precision map data and programs related thereto.
• the high-precision map is a map for providing detailed road and surrounding terrain information to autonomous vehicles in advance. It has an accuracy within an error range of about 10 cm and contains, in 3D digital format, lane-level information such as road center lines and boundary lines, as well as information such as traffic lights, signs, curbs, road marks, and various structures.
• the AR engine 920 may receive sensing data, reception data, control data, and related programs obtained from a transmission control unit (TCU) (e.g., third-party services, V2X, ITS communication, etc.).
• the TCU (Transmission Control Unit) of the sensors and maps 940 is a communication control device mounted on the vehicle. It enables, for example, V2X (vehicle-to-everything) communication, a technology through which autonomous vehicles communicate with various elements on the road (e.g., situational data collectable through V2V and V2I), and communication with ITS (Intelligent Transport Systems) or C-ITS (Cooperative Intelligent Transport Systems), which are cooperative intelligent transport system technologies.
  • the AR engine 920 may perform calibration on the front image based on data provided from a calibration factor database.
  • the AR engine 920 may perform object detection based on front image data and route data.
  • the AR engine 920 may perform prediction and interpolation based on the detected object.
  • the renderer 922 may perform rendering based on route data, POI data, and prediction and interpolation result data.
  • the renderer 922 may provide an AR graphical user interface (GUI) frame and an AR camera frame to the navigation application 930.
  • the navigation application 930 can create an AR navigation screen.
• the AR navigation screen 900 may include a navigation map surface 901, an AR camera surface 902, an AR GUI surface 903, and a navigation GUI surface 904.
• the navigation application 930 may generate the navigation map surface 901 based on the map display frame provided by the navigation controller 911.
  • the navigation application 930 may generate the AR camera surface 902 based on the AR camera frame provided from the renderer 922.
  • the navigation application 930 may create an AR GUI surface 903 based on the AR GUI frame provided from the renderer 922.
  • the navigation application 930 may generate a navigation GUI surface 904 based on route guidance data provided from the navigation controller 911.
• in this way, the navigation application 930 can generate the navigation map surface 901, the AR camera surface 902, the AR GUI surface 903, and the navigation GUI surface 904.
  • the navigation application 930 may provide the parameters of the AR camera surface 902 and the parameters of the AR GUI surface 903 to the AR engine 920.
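• a minimal sketch of how the navigation application might compose the four surfaces from the frames it receives (the method and surface names are assumptions; the actual interfaces are not specified here):

```python
class NavigationApplication:
    """Composes the AR navigation screen out of four layered surfaces."""

    def __init__(self):
        self.surfaces = {}

    def on_map_display_frame(self, frame):
        self.surfaces["navigation_map"] = frame      # from the navigation controller

    def on_route_guidance(self, guidance):
        self.surfaces["navigation_gui"] = guidance   # from the navigation controller

    def on_ar_camera_frame(self, frame):
        self.surfaces["ar_camera"] = frame           # from the renderer

    def on_ar_gui_frame(self, frame):
        self.surfaces["ar_gui"] = frame              # from the renderer

    def compose_screen(self):
        """Layer the surfaces bottom-to-top into one AR navigation screen."""
        order = ["navigation_map", "ar_camera", "ar_gui", "navigation_gui"]
        return [self.surfaces[name] for name in order if name in self.surfaces]
```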
  • the AR engine 920 may register a callback function to receive front image data from the camera server 1001.
  • the camera server 1001 may be understood as a concept included in the memory of the AR display device 800, for example.
  • the AR engine 920 may receive front image data and perform cropping. Cropping may include adjusting the size or position of the image, editing some areas, or adjusting transparency.
  • the navigation application 930 may display the cropped front image on the AR camera surface 902.
• the AR engine 920 can perform AR merging in real time. Additionally, the navigation application 930 may display an AR GUI on the AR GUI surface 903 based on the cropped front image.
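• the callback registration and cropping steps could look roughly like this (the camera-server API and image representation shown are hypothetical):

```python
class AREngine:
    def __init__(self, camera_server):
        # Register a callback so the camera server pushes each front-image frame.
        camera_server.register_callback(self.on_front_image)
        self.latest_cropped = None

    def on_front_image(self, image):
        self.latest_cropped = self.crop(image)

    @staticmethod
    def crop(image, x=0, y=0, width=None, height=None, alpha=1.0):
        """Cropping here covers adjusting size/position, editing some areas,
        and adjusting transparency, as described above. The image is modeled
        as a 2D list of pixel intensity values."""
        h = height if height is not None else len(image)
        w = width if width is not None else len(image[0])
        region = [row[x:x + w] for row in image[y:y + h]]
        # Apply a uniform transparency factor to each pixel value.
        return [[pixel * alpha for pixel in row] for row in region]
```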
  • FIG. 11 is a flowchart referenced to explain a method 1100 of displaying an AR graphic interface on a navigation screen according to an embodiment of the present invention.
  • Each process in FIG. 11 may be performed by a processor (or AR engine) unless otherwise specified.
• the process of FIG. 11 may be performed including some of the operations of the navigation engine 910, AR engine 920, and navigation application 930 by the processor 820 described above with reference to FIGS. 8 to 10, or at least some of those operations may be performed before or after the process of FIG. 11.
  • the method begins with running a preset application (S10).
  • the preset application may be preinstalled on the AR display device 800 or driven by another device/server linked thereto, for example, when the AR mode of the vehicle is executed.
  • the preset application may be, for example, a navigation application that runs in AR mode while driving the vehicle.
• the navigation application receives route guidance and map display frames based on map data and GPS data from the navigation engine, and generates the navigation GUI surface and map display surface, respectively.
  • the navigation application receives an AR GUI frame from the AR engine to create an AR GUI surface, and receives an AR camera frame to create an AR camera surface.
  • the navigation application renders the generated map display surface, AR camera surface, and AR GUI surface to the navigation GUI surface.
• based on map data acquired from a server, memory, or the vehicle and on sensing data of the vehicle, the processor generates an AR graphic interface including a first AR object that displays the driving state of the vehicle and a second AR object that displays a guide for the driving situation of the vehicle, and renders it to overlap the front image of the vehicle (S20).
  • the processor can merge the AR graphic interface generated in real time with the front image of the vehicle in real time.
• by default, the processor displays (renders) the AR graphic interface with the first and second AR objects combined. If a preset condition is satisfied, the processor displays (renders) the AR graphic interface with the second AR object separated from it.
  • the preset condition may include a case where a change in the driving situation of the vehicle is predicted from the current driving state based on the vehicle's sensing data.
• the preset condition may also include a case in which a change in the driving situation of the vehicle, or a situation requiring guidance, is detected as predicted from the current driving state based on at least one of ADAS sensing data, high-precision map data, and TCU communication data such as V2X, ITS, and C-ITS.
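• as a hedged sketch, the separation decision might reduce to a predicate over the available data sources (the field names below are invented for illustration):

```python
def should_separate(adas: dict, hd_map: dict, v2x: dict) -> bool:
    """Return True when a change in the driving situation, or a situation
    requiring guidance, is predicted from the current driving state."""
    obstacle_ahead = adas.get("obstacle_ahead", False)
    geometry_change = hd_map.get("curve_or_junction_ahead", False)
    traffic_event = v2x.get("event_reported", False)
    return obstacle_ahead or geometry_change or traffic_event

# Example: an upcoming junction in the high-precision map triggers separation.
print(should_separate({}, {"curve_or_junction_ahead": True}, {}))  # True
```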
  • the processor displays the front image with the AR graphic interface superimposed on the navigation screen (S30).
  • the processor may render the AR graphic interface on the front image with the first and second AR objects combined.
  • the processor may provide an AR GUI frame and an AR camera frame corresponding to the AR graphic interface to the navigation application, thereby creating an AR GUI surface and an AR camera surface, respectively.
  • the created AR GUI surface and AR camera surface are rendered on the navigation GUI surface, so that the front image on which the AR graphic interface is rendered is included (displayed) on the navigation screen.
  • this AR graphic interface can be changed based on driving situations that are predicted to change based on map data and vehicle sensing data.
• the variable AR graphic interface displays the plurality of AR objects in a separated form so that the driver of the vehicle can intuitively perceive both the display of the current driving state and the guide display for the driving situation predicted to change.
  • FIGS. 12A and 12B are examples of an AR graphic interface according to an embodiment of the present invention, and are reference diagrams for explaining separation and combination into first and second AR objects based on predicted changing driving situations.
• the AR graphic interface 1200 may be implemented as an AR image of a specific 3D shape, and through this AR image, the vehicle's current driving direction, driving speed, and steering information, as well as road information, etc., can be displayed.
  • the AR graphic interface 1200 may be implemented as a combination of a first object and a second object.
• the first object may be implemented, for example, in the form of a 3D spade (e.g., a shovel-shaped image), and the second object may be implemented as a 3D chevron (e.g., an A- or V-shaped image) extending from the first object.
  • the AR graphic interface 1200 may be combined in an extended form so that the inner frame of the second object and the outer frame of the first object come into contact with each other. At this time, the first and second objects may be expressed in different colors so that they can be visually distinguished.
  • the AR graphic interface 1200 may be displayed so that the first and second objects are combined and move at the same or different twisting angles to indicate the current driving state of the vehicle.
  • the created AR graphic interface 1200 is displayed to overlap the front image of the vehicle included in the navigation screen.
• the processor 820 generates an AR graphic interface 1200 that represents the current driving state of the vehicle based on map data and vehicle sensing data, renders it based on route and POI information, and provides it to the navigation application 930.
  • the AR graphic interface 1200 is displayed in an overlapped form on the front image of the vehicle included in the navigation screen.
• the processor 820 separates the first and second AR objects 1210 and 1220 of the AR graphic interface based on a changed driving situation predicted from map data and vehicle sensing data, and can update the AR GUI surface and AR camera surface of the navigation application 930 by rendering the separated second AR object 1210 to display a guide related to the changed driving situation.
  • the condition for separating the first and second AR objects 1210 and 1220 may include a case where a change in the driving situation of the vehicle is predicted from the current driving state of the vehicle based on the vehicle's sensing data.
  • the condition for separating the first and second AR objects 1210 and 1220 is based on at least one of vehicle ADAS sensing data, high-precision map data, and TCU communication data such as V2X, ITS, and C-ITS, This may include a case in which a change in the driving situation of the vehicle or a situation in which the need for guidance is predicted is detected from the current driving state of the vehicle.
• the separated second AR object 1210 is displayed extending from the display position of the first AR object 1220. Since the first AR object 1220 represents the current driving state of the vehicle (e.g., the current location and driving direction of the vehicle), the driver can intuit when and in which direction the vehicle should be driven according to the guide expressed by the second AR object 1210.
  • the separation distance between the first and second AR objects 1210 and 1220 may correspond to a predicted change point or distance in the driving situation of the vehicle.
  • the separated second AR object 1210 may be expressed as a plurality of fragments. A certain interval may be maintained between the plurality of fragments.
  • each fragment of the plurality of fragments may be expressed to gradually point toward a predicted situation occurrence point (or situation end point).
• each of the fragments (e.g., five fragments) may be oriented toward the same location (e.g., a predicted situation occurrence point) at a different twist angle.
  • the plurality of fragments may be expressed as moving a certain distance ahead of the first AR object 1220.
• that is, the fragments are implemented to express a driving guide for the occurrence of the predicted situation while moving based on the current location and driving state of the vehicle.
  • the moving speed of the plurality of fragments may correspond to the degree to which the vehicle approaches (eg, driving speed).
• the number and/or display length of the plurality of fragments may be proportional to the maintenance time or maintenance distance of the predicted situation. For example, when the situation maintenance time is long, a greater number of fragments may be included, or the total display length may be expressed as longer than when the situation maintenance time is short.
• a fragment close to the first AR object 1220 displays a guide related to the driving state expressed by the first AR object 1220, while a fragment far from the first AR object 1220 displays a guide related to the predicted situation.
  • the plurality of fragments of the separated second AR object 1210 provide guidance on situations predicted from the current driving state corresponding to the first AR object 1220 in a more gradual and seamless manner.
  • the separated second AR object 1210 is displayed again in a combined state with the first AR object 1220. That is, it can be displayed again as the AR graphic interface 1200 as shown in FIG. 12A.
• the AR display device 800 can recognize that the vehicle has entered a parking area (including a charging area), search for an available parking area based on at least one of sensing data (e.g., ADAS sensing data) and parking area control data, and display the AR graphic interface in real time to guide the vehicle to the found available parking area.
  • FIG. 13 is a representative flowchart referenced to explain a method 1300 of providing a UX display related to parking/charging of a vehicle using an AR graphic interface as a method of operating an AR display device according to an embodiment of the present invention.
  • each step in FIG. 13 may be performed by the processor 820 of the AR display device.
• each step may be performed including some of the operations of the navigation engine 910, AR engine 920, and navigation application 930 described above with reference to FIGS. 8 to 10, or at least some of those operations may be performed before or after the process of FIG. 13 below.
  • the AR display device 800 displays a front image in which a display indicating the driving state of the vehicle and an AR graphic interface indicating a guide related to the driving situation are rendered (S1310).
• the processor 820 runs a preset application (e.g., a navigation application) and can render an AR graphic interface, in the form of a combination of a first AR object that displays the driving state of the vehicle and a second AR object that displays a guide for the driving situation of the vehicle, to overlap the front image acquired through an AR camera.
• the processor 820 can display the front image on which the AR graphic interface is rendered (e.g., an image reflected on an LCD screen, a real-space image shown on the windshield/dashboard of the vehicle, or a digitally twinned three-dimensional image).
  • the AR display device 800 may recognize that the vehicle has entered the parking area including the charging area (S1320).
  • 'parking area' is used to include both a charging station including a charger and a parking lot including a parking space, as described above. Additionally, the 'parking area' according to the present disclosure may include both cases that include a control server and cases that do not.
• the AR display device 800 can recognize that the vehicle has entered the parking area based on the vehicle's sensing data, map data, and/or ADAS sensing data.
• when the parking area includes a control server, the control server may recognize and/or detect the entry of the vehicle based on control data, that is, sensing data from sensors (e.g., cameras, lidar, radar, etc.) installed in the parking area.
• the processor 820 may generate and output, through or in conjunction with the AR graphic interface, an indication that the vehicle has entered the parking area.
  • the processor 820 may search for a parking area or a charging area based on at least one of vehicle sensing data and parking area control data (S1330).
  • the processor 820 may search for a parking area or a charging area within the parking area based on the vehicle's sensing data and/or ADAS sensing data.
• when the parking area includes a control server, the control server may search for an available parking area or charging area within the parking area and provide the result to the AR display device 800.
• to guide the vehicle to the discovered available parking area or charging area, the processor 820 can variably render the AR graphic interface (S1340).
  • the AR graphic interface may provide UX in the form of separate first and second AR objects in order to guide the vehicle to a parking area or a charging area. Additionally, the AR graphic interface may further include a third AR object along with the first and second AR objects to display a specific event within the parking area.
  • FIGS. 14A, 14B, 14C, and 14D are conceptual diagrams to explain guiding a parking area using a variable AR graphic interface based on ADAS sensing data, according to an embodiment of the present invention.
• the AR display device 800 can provide (output) the AR graphic interface with real-time changes to guide the vehicle to an available parking area or charging area, based on vehicle sensing data (e.g., CAN data (steering wheel angle, driving speed, yaw rate) and GPS location/direction information), map data (e.g., navigation/map data (lane geometry)), and ADAS sensing data.
• ADAS refers to an advanced driver assistance system, and ADAS sensing data refers to sensing data acquired through the ADAS; the ADAS can detect both objects around the vehicle and the vehicle environment.
• the processor 820 receives ADAS sensing data, vehicle sensing data (e.g., CAN data), and map data such as navigation/map/GPS data, and can display auxiliary functions based on the received data through the separated second AR object. At this time, the separated second AR object may be displayed with additional information (e.g., charging information such as remaining charging time and charging fee).
  • the processor 820 may search for a parking area or a charging area based on ADAS sensing data based on the vehicle entering the parking lot/charging station.
• at this time, automatic selection of the optimal parking space/charger that meets preset criteria (e.g., proximity to the vehicle's current location, proximity to the exit, priority on fast charging, etc.) may be implemented, or a plurality of selectable locations (or paths) may be presented so that the selection is made through user input.
• based on the vehicle approaching within a predetermined distance of the parking space/charger selected through automatic selection or user input, the processor 820 moves the separated second AR object to the location of the selected parking space/charger and updates the AR graphic interface to display a guide path from the vehicle's current location to the location of the parking space/charger.
• when the vehicle arrives at the selected location, the second AR object moves to the first AR object to provide the combined AR graphic interface, and the vehicle enters parking mode.
  • the driver can see that the first and second AR objects are recombined and can intuit that the parking mode has been entered (and/or through additional information ('parking mode execution') provided along with it).
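• a minimal sketch of the automatic-selection step under the criteria named above (the weights and field names are assumptions, not values from this specification):

```python
def pick_optimal_spot(spots, vehicle_pos, prefer_fast_charging=False):
    """Score candidate parking spaces/chargers by the preset criteria:
    proximity to the vehicle, proximity to the exit, and fast charging."""
    def distance(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def score(spot):
        s = -distance(vehicle_pos, spot["pos"])   # closer to the vehicle is better
        s -= 0.5 * spot.get("dist_to_exit", 0.0)  # closer to the exit is better
        if prefer_fast_charging and spot.get("fast_charger", False):
            s += 10.0                             # bonus for fast chargers
        return s

    return max(spots, key=score)

spots = [{"pos": (5, 2), "dist_to_exit": 3.0, "fast_charger": True},
         {"pos": (2, 1), "dist_to_exit": 9.0, "fast_charger": False}]
print(pick_optimal_spot(spots, vehicle_pos=(0, 0), prefer_fast_charging=True))
```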
• referring to FIGS. 14A to 14D, an embodiment of searching for an available parking area and providing route guidance using an AR graphic interface that varies based on ADAS sensing data will be described in detail.
  • the processor 820 may search for a parking area or a charging area based on ADAS sensing data as the vehicle enters the parking lot/charging station.
  • the second AR object may be separated from the first AR object, and an animation effect rotating 360 degrees with respect to the first AR object may be output.
• when the search is completed (e.g., search success/failure), the first and second AR objects are displayed in a combined form again.
  • the processor 820 may display an indication of the surrounding parking available areas 1411 and 1412 found based on the current location of the vehicle on the front image 1401 of the vehicle.
  • the AR graphics interface 1400 displays the current driving state of the vehicle in a form in which the first and second AR objects are combined.
  • the processor 820 may display selection options 1421 and 1422 for a plurality of searched available parking areas 1411 and 1412 on the image in front of the vehicle.
  • the selection options 1421 and 1422 may be displayed along with additional information (e.g., driving distance, proximity to the current location of the vehicle, proximity to the exit, etc.).
• when the processor 820 selects one available parking area 1412 based on the input for the selection options 1421 and 1422, the second AR object is separated accordingly and moves to the location of the selected available parking area 1412. The second AR object 1410 then generates a guide trajectory connecting the location of the selected available parking area 1412 with the first AR object indicating the current position of the vehicle, thereby providing a guidance route.
• the guidance route may be created in a direction that is convenient for parking, taking into account the parking maneuver to follow.
• the processor 820 recognizes, based on the current location and driving state of the vehicle, that the vehicle has entered a direction in which it cannot be driven, and, upon this recognition, can update the rendering so that the separated second AR object displays a warning notification and a guide indicating the direction in which the vehicle can be driven.
  • the first AR object 1420 indicates the current driving direction of the vehicle through the rotation amount, so it is displayed to face the selected parking area 1412.
  • the separated second AR objects 1410S-1 and 1410S-2 do not face the selected parking area 1412, but rotate to point in the same direction as the entry direction 1430. That is, by pointing in a direction opposite to the direction pointed by the first AR object 1420, a possible driving direction is provided.
  • the separated second AR objects 1410S-1 and 1410S-2 may branch in both directions with the first AR object 1420 between them.
• the first part 1410S-1 of the separated second AR object connects the location of the selected parking area 1412 and the first AR object 1420, and the second part 1410S-2 of the separated second AR object guides the path from the first AR object 1420 toward the drivable direction. At this time, the trajectories of the first and second parts 1410S-1 and 1410S-2 all head in the direction opposite to the direction pointed by the first AR object 1420, that is, in the drivable direction.
  • the separated second AR objects (1410S-1, 1410S-2) may display an entry impossibility warning through color change, shape change, blinking, and highlight display.
• for example, the color of the separated second AR object during normal guidance (e.g., green) and its color while the entry impossibility warning is displayed (e.g., orange or red) may be different.
  • the processor 820 may change the display method and/or notification level of the entry impossibility warning according to the vehicle's driving situation (e.g., entry of another vehicle in the entry direction, degree of parking congestion, distance from the vehicle, etc.).
  • the driver can intuitively change the driving direction of the vehicle, decelerate and stop the vehicle, etc. by checking the second AR object that displays warning information about directions that cannot be entered.
• thereafter, the entry impossibility warning disappears, the color and shape of the separated second AR objects 1410S-1 and 1410S-2 are restored to their previous state, and then the first and second AR objects are displayed in a combined state again.
  • the processor 820 may display and update the AR graphic interface to re-display the location of the re-discovered parking area based on ADAS sensing data in the surrounding area of the driving direction.
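• the escalation of the warning display by driving situation, described above, could be sketched as a simple style lookup (the levels and style values are illustrative, not taken from this specification):

```python
def warning_style(other_vehicle_entering: bool, congestion: float, distance_m: float) -> dict:
    """Map the driving situation to a display style for the separated
    second AR object's entry-impossibility warning."""
    if other_vehicle_entering or distance_m < 5.0:
        return {"color": "red", "blink": True, "highlight": True}      # urgent
    if congestion > 0.7:
        return {"color": "orange", "blink": True, "highlight": False}  # elevated
    return {"color": "orange", "blink": False, "highlight": False}     # mild
```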
  • FIGS. 15A, 15B, 15C, and 15D are conceptual diagrams to explain guiding a parking area using an AR graphic interface that changes based on control information, according to an embodiment of the present invention.
• the AR display device 800 can guide the vehicle to an available parking area or charging area by providing (outputting) the AR graphic interface in real time based on the control data of the parking lot/charging station together with vehicle sensing data (e.g., CAN data (steering wheel angle, driving speed, yaw rate) and GPS location/direction information) and map data (e.g., navigation/map data (lane geometry)).
• the control data includes data and information generated by the control server based on sensing data from sensors installed in the parking lot/charging station (e.g., lidar, cameras, radar, a location sensor platform using UWB/BLE, etc.).
• the control server is connected to the AR display device 800 when the vehicle enters the parking lot/charging station and, for example, uses a digital twin to monitor events (situations, actions, functions, etc.) occurring within the parking lot/charging station and to control devices installed there (e.g., sensors, chargers, and other connected devices).
  • the control server may transmit the acquired control data or information or data generated based on it to the vehicle 100 or the AR display device 800.
• a digital twin refers to a real-world object (an object, space, environment, process, procedure, etc.) expressed as a digital data model on a computer so that it is copied identically and interacts with its real counterpart in real time.
  • These digital twins can create virtual models of assets such as physical objects, spaces, environments, people, and processes using software, allowing them to operate or perform the same actions as they do in the real world.
• the control server holds the internal 3D shape of the parking lot/charging station building through the digital twin and, based on sensing data from sensors installed in the parking lot/charging station (e.g., lidar, cameras, radar, a location sensor platform using UWB/BLE, etc.), can provide vehicle entry, charging/exiting, and entry/exit routes, etc. to the AR display device 800.
• the processor 820 receives control data, vehicle sensing data (e.g., CAN data), and map data such as navigation/map/GPS data, and can display auxiliary functions based on the received data through the separated second AR object. At this time, the separated second AR object may be displayed with additional information (e.g., charging information such as remaining charging time and charging fee).
  • the processor 820 may recognize a parking area or a charging area based on the vehicle entering the parking lot/charging station and control data.
• at this time, the optimal parking space/charger that meets preset criteria (e.g., proximity to the vehicle's current location, proximity to the exit, priority on fast charging, etc.) may be selected automatically, or the selection may be made through user input.
• based on the vehicle approaching within a predetermined distance of the parking space/charger selected through automatic selection or user input, the processor 820 moves the separated second AR object to the location of the selected parking space/charger, and the AR graphic interface is updated to display a guide route from the vehicle's current location to the location of the parking space/charger.
• when the vehicle arrives at the selected location, the processor 820 renders the second AR object to move to the first AR object and be displayed as a combined AR graphic interface, and the vehicle enters parking mode.
  • the driver can see that the first and second AR objects are recombined and can intuit that the parking mode has been entered (and/or through additional information ('parking mode execution') provided along with it).
• referring to FIGS. 15A to 15D, an embodiment of searching for an available parking area and providing route guidance using an AR graphic interface that varies based on control data of the parking lot/charging station will be described in detail.
  • the control server detects the vehicle's entry and searches for a parking area or a charging area (for example, using a digital twin). Accordingly, the processor 820 may recognize the result of searching for a parking area or a charging area based on control data as the vehicle enters the parking lot/charging station.
• during the search by the control server (or until control data is received from the control server), an animation effect in which the second AR object is separated from the first AR object and rotates 360 degrees around the first AR object may be output.
• when the search is completed (e.g., search success/failure), the first and second AR objects are displayed in a combined form again.
  • the processor 820 may display the surrounding parking available areas 1511 and 1512 discovered by the control server based on the current location of the vehicle on the front image 1501 of the vehicle.
  • the AR graphics interface 1500 is a combination of the first and second AR objects and displays the current driving state of the vehicle.
  • the processor 820 may display selection options 1521 and 1522 for a plurality of discovered parking areas 1511 and 1512 on the image in front of the vehicle.
  • the selection options 1521 and 1522 may be displayed along with additional information (e.g., driving distance, proximity to the current location of the vehicle, proximity to the exit, etc.).
• when the processor 820 selects one available parking area 1512 based on the input for the selection options 1521 and 1522, the second AR object is separated accordingly and moves to the location of the selected available parking area 1512. The second AR object 1510 then generates a guide trajectory connecting the location of the selected available parking area 1512 with the first AR object 1520 indicating the current position of the vehicle, thereby providing a guidance route.
• the guidance route may be created in a direction that is convenient for parking, taking into account the parking maneuver to follow.
• the processor 820 recognizes, based on the current location and driving state of the vehicle, that the vehicle has entered a direction in which it cannot be driven, and, upon this recognition, can update the rendering so that the separated second AR object displays a warning notification and a guide indicating the direction in which the vehicle can be driven.
  • the first AR object 1520 indicates the current driving direction of the vehicle through the rotation amount, so it is displayed to face the selected parking area 1512.
  • the separated second AR objects 1510S-1 and 1510S-2 do not face the selected parking area 1512, but rotate to point in the same direction as the entry direction 1530 (or driving direction). That is, by pointing in a direction opposite to the direction pointed by the first AR object 1520, a possible driving direction is provided.
  • the separated second AR objects 1510S-1 and 1510S-2 may branch again in both directions with the first AR object 1520 between them.
• the first part 1510S-1 of the separated second AR object connects the location of the selected parking area 1512 and the first AR object 1520, and the second part 1510S-2 of the separated second AR object guides the path from the first AR object 1520 toward the drivable direction. At this time, the trajectories of the first and second parts 1510S-1 and 1510S-2 all head in the direction opposite to the direction pointed by the first AR object 1520, that is, in the drivable direction.
  • the separated second AR objects (1510S-1, 1510S-2) may display an entry impossibility warning through color change, shape change, blinking, and highlight display.
• for example, the color of the separated second AR object during normal guidance (e.g., green) and its color while the entry impossibility warning is displayed (e.g., orange or red) may be different.
  • the processor 820 may change the display method and/or notification level of the entry impossibility warning according to the vehicle's driving situation (e.g., entry of another vehicle in the entry direction, degree of parking congestion, distance from the vehicle, etc.).
  • the driver can intuitively change the driving direction of the vehicle, decelerate and stop the vehicle, etc. by checking the second AR object that displays warning information about directions that cannot be entered.
• thereafter, the entry impossibility warning disappears, the color and shape of the separated second AR objects 1510S-1 and 1510S-2 are restored to their previous state, and then the first and second AR objects are displayed in a combined state again.
  • the processor 820 may display and update the AR graphic interface to re-display the location of the parking area rediscovered based on the control data in the surrounding area of the driving direction.
  • FIGS. 16A and 16B are conceptual diagrams related to updating the display of an AR graphic interface according to a parking type, according to an embodiment of the present invention.
• when the vehicle arrives at the available parking area or charging area guided by the separated second AR object, the separated second AR object is combined with the first AR object again, and then the parking mode is executed.
• in the parking mode, a guide route for parking is provided through the AR graphic interface, which varies depending on the parking type.
  • Parking types include, for example, forward parking (front parking), reverse parking (vertical parking), diagonal parking, parallel parking (horizontal parking), etc.
• the processor 820 of the AR display device 800 determines a possible parking type based on the vehicle's proximity to the selected parking area (or selected charging area), separates the second AR object again according to the determination, and can update the AR graphic interface to render a parking guide line along which the vehicle should travel.
  • the separated second AR object displays the driving direction and driving distance in which the vehicle should travel according to the determined parking type.
  • the first AR object represents the current driving direction and steering angle (rotation amount) of the vehicle traveling along the guided parking guidance path.
• based on the parking type for the selected parking area (or charging area) being determined, the processor 820 can calculate an expected reverse-driving change point based on the current location of the vehicle, the location of the selected parking area, and ADAS sensing data.
  • the change point refers to a point where the driving direction must be changed in order for the vehicle to park in the selected parking area (or charging area).
  • the change point refers to the position and steering angle at which the vehicle must change from the forward direction (reverse direction) to the reverse direction (forward direction).
• the processor 820 displays a first guide line toward the change point through the separated second AR object, and then, based on determining that the current location of the vehicle corresponding to the first AR object is close to the change point, updates the AR graphic interface to display a second guide line toward the selected parking area for the vehicle's reverse travel.
• when it is detected that the vehicle is deviating from the first guide line or the second guide line, the processor 820 changes the color, shape, etc. of the first guide line and/or the second guide line to guide the vehicle to drive along the guide line.
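• a hedged sketch of this change-point bookkeeping (the geometry and thresholds are simplified assumptions): show the first guide line until the vehicle nears the change point, then switch to the second guide line for the reverse leg, and flag deviation from the active line.

```python
import math

def point_to_segment_distance(p, a, b):
    """Perpendicular distance from point p to the segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def guide_state(vehicle_pos, start, change_point, parking_spot, near_m=2.0, deviate_m=1.5):
    """Show the first guide line until the vehicle nears the change point,
    then the second guide line for reverse travel; flag deviation so the
    guide line's color/shape can be changed."""
    if math.hypot(vehicle_pos[0] - change_point[0], vehicle_pos[1] - change_point[1]) > near_m:
        name, seg_a, seg_b = "first_guide_line", start, change_point
    else:
        name, seg_a, seg_b = "second_guide_line", change_point, parking_spot
    deviated = point_to_segment_distance(vehicle_pos, seg_a, seg_b) > deviate_m
    return {"line": name, "deviated": deviated}
```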
• FIGS. 16A and 16B are examples of UX that provide parking guidance through an AR graphic interface in the case of parallel parking (horizontal parking).
  • Parallel parking consists of a combination of forward and reverse driving.
• while the vehicle travels along the first guide line guiding forward driving, the front image 1601 displays the first AR object 1620 indicating the forward driving of the vehicle and the separated second AR object 1610 indicating the first guide line.
  • the second AR object 1610 includes a change point for backward travel, and the first guide line displayed by the second AR object 1610 connects the first AR object 1620 and the change point for reverse travel.
  • the change point may be, for example, the destination of the first guide line.
  • the change point may be displayed in a color or shape that is distinct from other guide trajectories constituting the first guide line.
• the processor 820 may change the color, shape, etc. of the change point to provide a notification of the change to reverse driving, and additional information related to the change point (e.g., 'Change to reverse (R) driving') may also be displayed.
• when the vehicle arrives at the change point, the processor 820 updates the rendering so that the second AR object 1610 generates and displays a second guide line toward the target parking area (PI) in the vehicle's reverse direction instead of the first guide line.
• that is, the first AR object 1620' indicates the current driving direction (heading) of the vehicle, and the second guide line guiding the vehicle to the target parking area (PI) in the reverse direction is displayed as the second AR object 1610R.
  • the color of the second guide line and/or the direction pointed by its guide trace may be displayed differently from the color of the above-described first guide line and/or the direction pointed by its guide trace. Accordingly, the driver can intuitively recognize the vehicle's forward and reverse control guidance.
• FIGS. 16C and 16D are examples of UX that provide parking guidance through an AR graphic interface in the case of reverse parking (vertical parking).
• reverse parking similarly consists of a combination of forward and reverse driving, but the amount of rotation (or rotation angle) during reverse parking is larger than in parallel parking (horizontal parking).
• while the vehicle 100 travels along the first guide line guiding forward driving, the front image 1603 displays the first AR object 1620 indicating the forward driving of the vehicle and the separated second AR object 1610' indicating the first guide line. Information on the parking type (e.g., reverse parking) may be displayed as additional information on the second AR object 1610'.
• the second AR object 1610' includes a change point for reverse travel, and the first guide line displayed by the second AR object 1610' connects the first AR object 1620 and the change point for reverse travel.
  • the change point may be, for example, the destination of the first guide line.
  • the change point may be displayed in a color or shape that is distinct from other guide trajectories constituting the first guide line.
• the processor 820 may change the color, shape, etc. of the change point to provide a notification of the change to reverse driving, and additional information related to the change point (e.g., 'Change to reverse (R) driving') may also be displayed.
• when the vehicle arrives at the change point, the processor 820 updates the rendering so that the second AR object 1610' generates and displays a second guide line toward the target parking area in the vehicle's reverse direction instead of the first guide line.
• the curve of the guide trajectory indicated by the second guide line is larger than in the case of parallel parking (horizontal parking) described above. This means that the rotation amount of the vehicle driving backwards along the second guide line must also be greatly increased. Accordingly, the second AR object 1610R' may further display vehicle rotation amount guide information (e.g., 'Turn the steering wheel all the way') as additional information related to the second guide line.
• a first AR object 1620' indicating the current driving direction (heading) of the vehicle is displayed, and the second guide line guiding the vehicle to the target parking area in the reverse direction is displayed as the second AR object 1610R'.
  • the color of the second guide line and/or the direction pointed by its guide trace may be displayed differently from the color of the above-described first guide line and/or the direction pointed by its guide trace. Accordingly, the driver can intuitively recognize the vehicle's forward and reverse control guidance.
• FIG. 17 is a flowchart illustrating guiding the charging area and charging through an AR graphic interface based on control information, according to an embodiment of the present invention, and FIGS. 18A, 18B, and 18C are conceptual diagrams referenced to explain FIG. 17.
  • each step shown in FIG. 17 may be performed by the processor 820 of the AR display device 800.
• each step may be performed including some of the operations of the navigation engine 910, AR engine 920, and navigation application 930 described above with reference to FIGS. 8 to 10, or at least some of those operations may be performed before or after the process of FIG. 17 below.
• based on the vehicle entering the parking lot/charging station, the AR display device 800 can receive parking lot/charging station map data (or a digital twin) and parking lot/charging station related information from the parking/charging control server (1710).
  • the processor 820 may display the parking area/charging area through an AR graphic interface (1720).
• the processor 820 searches for an available parking area or charging area based on at least one of sensing data and control data, and can variably display the AR graphic interface to indicate the location of the found parking area or charging area.
  • the processor 820 provides the navigation application 930 with an AR GUI frame generated based on variable information about the AR graphic interface, allowing the navigation application 930 to update the AR GUI surface.
  • the processor 820 may update the rendering of the AR graphic interface to display charging-related information for the discovered chargeable area.
  • the charging-related information may include at least one of a charging method and a charging cost.
  • the processor 820 may update the rendering of the AR graphic interface to display remaining charging time information for each surrounding charging area based on the current location of the vehicle, based on failure to search for a charging area.
  • the processor 820 may continue to search for nearby parking spaces until the end of the search is input based on failure to search for a parking available area.
• at this time, the second AR object of the AR graphic interface is separated and outputs an animation effect rotating 360 degrees around the first AR object, indicating that the search is in progress.
• based on the received control data, the processor 820 can display, on the front image through the AR graphic interface, the charging waiting time for the charging areas around the vehicle or for the selected area.
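• a rough sketch of this search-and-fallback behavior (the control-data field names are assumptions): if a free charger is found, surface its charging method and fee; otherwise show the remaining charging (waiting) time of the occupied chargers.

```python
def charging_guidance(control_data: dict) -> dict:
    """Prefer a free chargeable area; otherwise fall back to showing the
    remaining charging times for the occupied chargers nearby."""
    free = [c for c in control_data["chargers"] if not c["occupied"]]
    if free:
        return {"mode": "guide",
                "info": [{"id": c["id"], "method": c["method"], "fee": c["fee"]}
                         for c in free]}
    return {"mode": "wait_times",
            "info": [{"id": c["id"], "remaining_min": c["remaining_min"]}
                     for c in control_data["chargers"]]}

data = {"chargers": [
    {"id": "A1", "occupied": True, "remaining_min": 12, "method": "rapid", "fee": 0.4},
    {"id": "A2", "occupied": True, "remaining_min": 30, "method": "slow", "fee": 0.2},
]}
print(charging_guidance(data))  # all occupied -> falls back to remaining charging times
```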
  • the processor 820 determines whether the automatic selection option is activated (1730) and determines the next AR graphic interface display based on the determination result.
• when the automatic selection option is activated, the optimal parking space/charging area is automatically selected by the parking/charging control server or the processor 820 (1740). Accordingly, notification information indicating that the optimal parking space/charging area has been automatically selected is displayed/output through the AR graphic interface.
• when the automatic selection option is not activated, the processor 820 displays an AR graphic interface including selection options for the plurality of discovered parking/charging areas; the parking space/charging area is then selected based on the input for the displayed selection options (1780).
• when a parking space/charging area is selected, the processor 820 generates a guidance path leading to the location of the selected parking space/charging area based on ADAS sensing data and/or control data of the parking lot/charging station (1750).
  • the guide path may be implemented through display of a guide trajectory by a separate second AR object.
• the processor 820 displays the first AR object of the AR graphic interface so that it rotates in response to the driving direction of the vehicle, separates the second AR object from the first AR object, and can update the AR graphic interface so that the separated second AR object displays a guide trajectory from the first AR object to the location of the discovered or selected parking space/charging area.
• the processor 820 may determine whether the generated guidance path is in a travelable direction (1760); if it is not, that is, if it is an entry-impossible direction, the processor updates the display to provide an entry impossibility warning and an entry direction indication through the separated second AR object (1770). To this end, color, shape, blinking, and highlight effects of the separated second AR object can be changed/applied.
• otherwise, the guidance route is displayed through the separated second AR object.
  • the smart parking mode (1795) may be performed.
• the smart parking mode may be performed by linking the parking/charging control server and the AR display device 800, based on GPS information, authority information (vehicle control rights), vehicle information, etc. received from the vehicle or the AR display device 800 when the vehicle 100 enters the parking lot/charging station.
  • the processor 820 can display the driving state of the vehicle and the location and direction to be driven in real time through the AR graphic interface while performing parking driving according to the execution of the smart parking mode.
  • the vehicle 100 may be provided with a guidance display through the AR graphic interface to the entrance of the discovered parking lot/charging station.
  • the AR display device 800 can search for, detect, and create a guidance route for parking lot/charging station locations based on map data and ADAS sensing data.
• when the vehicle enters the parking lot/charging station, the parking/charging control server detects this through sensors (e.g., cameras, lidar, radar, a location sensor platform, etc.) and can transmit a connection request (e.g., a request to transmit GPS information, authority information (vehicle control rights), vehicle information, etc.) to the AR display device 800.
• the AR display device 800 may receive control data obtained through the parking/charging control server.
  • the control data includes map data and charging information of the parking lot/charging station.
  • the control data may include data, information, and programs regarding a 3D space map of the parking lot/charging station parking area, information on available (fast or slow) chargers based on incoming vehicle information, and real-time parking lot/charging station information (e.g., charging unit price per super-rapid/rapid/slow charging, charging vehicle occupancy, charging waiting time, charger failure information, etc.).
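To make the shape of such control data concrete, a minimal sketch is given below; the document only enumerates the kinds of information involved, so the classes and field names are assumptions.

```python
# Hypothetical structure for the control data described above; names are assumptions.
from dataclasses import dataclass, field

@dataclass
class ChargerStatus:
    charger_id: str
    speed: str             # "super-rapid" | "rapid" | "slow"
    unit_price: float      # charging unit price for this speed class
    occupied: bool         # charging vehicle occupancy
    waiting_time_min: int  # expected charging waiting time
    out_of_order: bool     # charger failure information

@dataclass
class ControlData:
    map_3d: bytes                                              # 3D space map of the parking area
    parking_map: dict = field(default_factory=dict)            # map data of the lot/station
    chargers: list[ChargerStatus] = field(default_factory=list)  # real-time charger info
```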
  • the AR display device 800 can display the AR graphic interface variably for parking/charging guidance based on the received control data. Additionally, the parking/charging control server may generate the above-described digital twin based on control data and vehicle information of the vehicle 100 and provide parking/charging guidance using the digital twin.
  • an AR graphic interface 1800 indicating a driving state including the current location of the vehicle may be displayed on the front image 1801 of the vehicle (or through a digital twin).
  • the processor 820 can display, in the AR graphic interface, the location of the discovered chargeable area (or parking area) and charging information for each chargeable area, based on the control data from the parking/charging control server.
  • the processor 820 may receive charger usage status and information on ultra-fast/rapid/slow chargers as control data from the parking/charging control server.
  • based on the charger usage status and the information on ultra-fast/rapid/slow chargers received from the parking/charging control server, the processor 820 can display 'slow' as the charging information 1821 for the discovered first chargeable area 1811, and 'rapid' as the charging information 1822 for the discovered second chargeable area 1812.
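A minimal sketch of how such per-area labels ('slow', 'rapid', etc.) could be derived from the charger status is shown below; the helper and label names are assumptions, reusing the hypothetical ChargerStatus fields sketched earlier.

```python
# Hypothetical mapping from charger status to the AR label shown per chargeable area.
SPEED_LABELS = {"super-rapid": "ultra-fast", "rapid": "rapid", "slow": "slow"}

def label_chargeable_areas(chargers):
    """Return (charger_id, label) pairs for unoccupied, working chargers only."""
    labels = []
    for c in chargers:
        if c.occupied or c.out_of_order:
            continue  # not a chargeable area right now
        labels.append((c.charger_id, SPEED_LABELS[c.speed]))
    return labels
```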
  • the second AR object of the AR graphic interface is separated so that a guidance path 1810a to the first chargeable area 1811 and a guidance path 1810b to the second chargeable area 1812 are displayed.
  • the first AR object 1820 continues to display the current location and driving direction of the vehicle.
  • one of the displayed guidance path 1810a for the first chargeable area 1811 and the guidance path 1810b for the second chargeable area 1812 is confirmed as the guidance route.
  • the processor 820 may provide a recommendation display (e.g., color change, highlight display, display of additional information, etc.) for the guidance path selected/suggested according to preset criteria among the plurality of guidance paths 1810a and 1810b.
  • the preset criteria may include proximity to the current location of the vehicle, proximity to the parking lot/charging station exit, priority on fast charging, etc.
  • for example, a recommendation display (e.g., a highlight) may be output on the guidance path 1810b for the second chargeable area 1812 indicated by the 'rapid' charging information 1822.
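The recommendation among candidate paths can be read as a scoring problem over the preset criteria. The sketch below uses a hypothetical weighted score; the weights, attribute names (target_pos, charger_speed), and tie-breaking are assumptions the document does not specify.

```python
# Hypothetical scoring of candidate guidance paths against the preset criteria.
import math

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def recommend_path(candidates, vehicle_pos, exit_pos, prefer_fast=True):
    """Pick the candidate path to highlight as the recommendation."""
    def score(path):
        s = -distance(vehicle_pos, path.target_pos)      # closer to the vehicle is better
        s -= 0.5 * distance(path.target_pos, exit_pos)   # closer to the exit is better
        if prefer_fast and path.charger_speed in ("rapid", "super-rapid"):
            s += 10.0                                    # priority on fast charging
        return s
    return max(candidates, key=score)
```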
  • when the second chargeable area 1812 is selected as the charging area by a selection input following the selection/suggestion or by an automatic selection option, only the guidance path 1810b is left in the image in front of the vehicle; accordingly, the path to the charging area 1812 is guided through the separated second AR object as shown in FIG. 18C.
  • the guidance path 1810b displayed through the second AR object is created and provided in a direction convenient for parking, considering the parking type to be used afterwards.
  • when the vehicle heads into an entry-impossible direction, a warning indicating that entry is not possible and an entry direction indicator (e.g., rotated to be opposite to the vehicle's current driving direction) can be displayed on the guidance path 1810b shown through the second AR object.
  • the AR graphic interface is displayed in the form of a combination of the first and second AR objects.
  • the AR display device 800 can display charging information and related information (e.g., remaining charging time, charging fees, events/promotions linked to the charging station, etc.) in real time on the image in front of the vehicle, based on the control data received from the parking/charging control server.
  • the processor 820 may create and display, through the separated second AR object, a route leading to the parking lot/charging station exit based on the control data received from the parking/charging control server.
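Departure guidance can reuse the same path-generation step with the exit as the target; a short sketch under the same assumptions as above:

```python
# Hypothetical reuse of path generation for departure guidance (names as assumed above).
def guide_to_exit(processor, exit_location, sensing_data, control_data):
    exit_path = processor.generate_path(exit_location, sensing_data, control_data)
    processor.second_ar_object.show_guide_trajectory(exit_path)  # separated second AR object
```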
  • the contents described with reference to FIGS. 17 and 18A to 18C apply similarly even when the AR display device 800 operates based on ADAS sensing data.
  • through separation, combination, and modification of the AR graphic interface, the AR display device 800 can display the search for a chargeable area, the route to the discovered chargeable area, charging information, parking guidance routes according to the parking type, no-entry warnings, the guidance route to the exit after charging is complete, and the like.
  • in this way, guidance on the vehicle's current location and predicted driving situation is simultaneously provided as AR objects on the front image, which is calibrated without separate settings, so that more intuitive and realistic AR guidance can be provided to users.
  • when a vehicle enters a parking lot/charging station, search results, routes, and necessary information can be provided through a more intuitive AR graphic interface.
  • so that the vehicle can park accurately in the desired parking space or in front of the desired charger, the selected parking space or the area in front of the charger is recognized, and the AR graphic interface is changed in real time so that the guide path for parking (forward driving, the reverse-driving change point, reverse driving, etc.) is provided sequentially in correspondence with the current driving state of the vehicle.
  • in addition, when a vehicle enters a parking lot or charging station, route guidance to available parking/charging areas, parking/charging related information, and route guidance on departure are displayed through a more intuitive AR graphic interface by communicating with the control server of that location or through ADAS sensing, so that a more direct and smart parking/charging-related UX can be provided.
  • the above-described present invention can be implemented as computer-readable code (or application or software) on a program-recorded medium.
  • the control method of the self-driving vehicle described above can be implemented using codes stored in memory, etc.
  • Computer-readable media includes all types of recording devices that store data that can be read by a computer system. Examples of computer-readable media include HDD (Hard Disk Drive), SSD (Solid State Disk), SDD (Silicon Disk Drive), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc. It also includes those implemented in the form of carrier waves (e.g., transmission via the Internet). Additionally, the computer may include a processor or control unit. Accordingly, the above detailed description should not be construed as restrictive in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

An AR display device linked to a vehicle and an operation method thereof are disclosed. The AR display device according to the present invention may provide an intuitive AR path to an available parking area or an available charging area in a parking lot or a charging station on the basis of ADAS sensing data of a vehicle and/or control data of the parking lot/charging station. Furthermore, the AR display device recognizes arrival of a vehicle in a selected parking space or in front of a selected charger, and changes an AR graphic interface in real time to sequentially provide, for parking, forward driving, backward driving change timing, and backward driving to correspond to the current driving state of the vehicle, so that the vehicle can be accurately parked in a desired parking space or in front of a desired charger.

Description

Vehicle AR display device and method of operation thereof
The present invention relates to an AR display device linked to a vehicle and an operating method thereof and, more specifically, to an AR display device capable of providing guidance related to parking or charging of a vehicle through AR technology and an operating method thereof.
For the safety and convenience of users, vehicles are equipped with various sensors and devices, and vehicle functions are becoming more diverse. These functions can be divided into convenience functions to promote driver convenience and safety functions to promote the safety of the driver and/or pedestrians.
Convenience functions of a vehicle have a development motive related to driver convenience, such as providing infotainment (information + entertainment) functions, supporting partial autonomous driving functions, or helping secure the driver's field of vision at night or in blind spots. Examples include adaptive cruise control (ACC), smart parking assist system (SPAS), night vision (NV), head up display (HUD), around view monitor (AVM), and adaptive headlight system (AHS) functions.
Safety functions of a vehicle are technologies that ensure the safety of the driver and/or pedestrians, and include lane departure warning system (LDWS), lane keeping assist system (LKAS), and autonomous emergency braking (AEB) functions.
Recently, technology development on augmented reality (AR), which outputs graphic objects through a vehicle's windshield or head up display (HUD) or adds graphic objects onto images captured by a camera so as to overlay them on the real world, has been actively taking place. In particular, the development of technologies that guide drivers along a route through such AR technology is expanding.
Meanwhile, even when such AR technology was applied to route guidance in an AR driving mode, conventionally it merely displayed the existing driving guidance indications in AR form. For example, it merely displayed a driving direction change indication as an AR image at a fixed location.
Accordingly, it was difficult to distinguish such indications from other AR features of the AR driving mode, so there was a limit to providing intuitive route guidance. In addition, inexperienced drivers had difficulty driving accurately according to the guidance. This is the same even when the remaining distance value is displayed together with the driving direction change indication. Accordingly, research is needed to implement a more intuitive and more complete AR driving mode.
In particular, when parking or charging a vehicle, inexperienced drivers have difficulty finding the desired location in a complex space. However, the existing AR driving mode had limitations in providing more intuitive guidance even when a parking or charging space was searched for in advance. This was the same even when available parking or charging spaces were recognized in advance by communicating with a control system.
The present invention aims to solve the above-mentioned problems and other problems.
According to some embodiments of the present invention, an object is to provide an AR display device capable of performing a more intuitive and more complete AR driving mode, and an operating method thereof.
In addition, according to some embodiments of the present invention, an object is to provide an AR display device capable of providing search results, routes, and necessary information through a more intuitive AR graphic interface when a vehicle enters a parking lot/charging station, and an operating method thereof.
In addition, according to some embodiments of the present invention, an object is to provide an AR display device capable of searching in advance for an optimal available parking area and/or available charging area in a parking lot/charging station and varying the AR graphic interface in real time to provide a guidance route in a direction convenient for parking, and an operating method thereof.
In addition, according to some embodiments of the present invention, an object is to provide an AR display device capable of providing a UX based on an intuitive AR graphic interface so that the vehicle does not enter an entry-impossible direction in a parking lot/charging station, and an operating method thereof.
To this end, the vehicle AR display device according to the present invention may provide an intuitive AR path to an available parking area or an available charging area when the vehicle enters a parking lot/charging station, based on the vehicle's ADAS sensing data and/or the control data of the parking lot/charging station.
In addition, the vehicle AR display device according to the present invention recognizes the selected parking space or the area in front of the selected charger so that the vehicle can park accurately in the desired parking space or in front of the charger, and varies the AR graphic interface in real time to sequentially provide forward driving, the reverse-driving change point, reverse driving, etc. of the guide path for parking in correspondence with the current driving state of the vehicle.
Specifically, the AR display device according to the present disclosure may include: a camera that acquires a front image of the vehicle; a communication module that receives sensing data of the vehicle; a processor that runs a preset application and renders an AR graphic interface, in which a first AR object displaying the driving state of the vehicle and a second AR object displaying a guide for the driving situation of the vehicle are combined, so as to be overlaid on the front image; and a display that displays the front image overlaid with the AR graphic interface according to the rendering. In addition, in response to the vehicle entering a parking area including a charging area, the processor may search for an available parking area based on at least one of the sensing data and the control data of the parking area, and may variably display the AR graphic interface to guide the vehicle to the searched available parking area.
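The component structure described above can be outlined as follows. This is a hypothetical Python sketch, assuming class and method names the disclosure does not define, meant only to show how the camera, communication module, processor, and display relate.

```python
# Hypothetical outline of the disclosed AR display device; all names are assumptions.
class FirstARObject:
    """Displays the driving state (e.g., position/heading) of the vehicle."""
    def draw(self, sensing):
        return {"kind": "first", "pose": sensing.get("pose")}

class SecondARObject:
    """Displays a guide for the driving situation (trajectory, warnings)."""
    def draw(self, sensing):
        return {"kind": "second", "guide": sensing.get("guide")}

class ARDisplayDevice:
    def __init__(self, camera, comm_module, display):
        self.camera = camera     # acquires the front image of the vehicle
        self.comm = comm_module  # receives the vehicle's sensing data
        self.display = display   # shows the front image with the AR overlay
        self.first_ar = FirstARObject()
        self.second_ar = SecondARObject()

    def render_frame(self):
        frame = self.camera.front_image()
        sensing = self.comm.sensing_data()
        overlay = [self.first_ar.draw(sensing), self.second_ar.draw(sensing)]
        self.display.show(frame, overlay)  # front image overlaid with the AR graphic interface
```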
Depending on the embodiment, the processor may display the location of the searched available parking area and, based on the displayed available parking area being selected, separate the second AR object and update the AR graphic interface to display a guide heading to the selected parking area.
Depending on the embodiment, the processor may display the second AR object so that it is separated and rotates around the first AR object while the available parking area is searched, and may update the rendering so that the AR graphic interface in which the first and second AR objects are combined is displayed in response to the end of the search.
Depending on the embodiment, the processor may recognize, based on the current location and driving state of the vehicle, that the vehicle has entered a non-drivable direction, and may update the rendering according to the recognition so that the second AR object is separated and displays a warning notification and a guide indicating the drivable direction.
Depending on the embodiment, the processor may update the display of the AR graphic interface so that the location of an available parking area re-searched based on the sensing data and the control data is displayed in the surrounding area of the drivable direction.
Depending on the embodiment, the processor may determine a possible parking type based on the vehicle approaching the selected parking area, and, according to the determination, may update the rendering of the AR graphic interface so that the second AR object is separated and displays the parking guide line along which the vehicle should drive.
Depending on the embodiment, based on the possible parking type being determined, the processor may calculate an expected change point for reverse driving based on the current location of the vehicle and the location of the selected parking area, display a first guide line heading to the change point through the separated second AR object, and thereafter, based on the current location of the vehicle corresponding to the first AR object approaching the change point, update the display of the AR graphic interface to display a second guide line along which the vehicle heads to the selected parking area in the reverse direction.
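As an illustration of the change-point step, the sketch below computes a simple candidate change point on a 2D plane: the vehicle drives forward past the slot along the aisle (first guide line), then reverses into the slot (second guide line). The geometry and thresholds are assumptions chosen for clarity; the disclosure does not fix a formula.

```python
# Hypothetical 2D computation of a reverse-driving change point; the geometry is an assumption.
import math

def change_point(slot_pos, aisle_dir, offset_m=5.0):
    """Project a point offset_m past the slot entrance along the (unit) aisle direction."""
    return (slot_pos[0] + offset_m * aisle_dir[0],
            slot_pos[1] + offset_m * aisle_dir[1])

def reached(vehicle_pos, point, tol_m=1.0):
    """True when the vehicle is close enough to switch to the reverse guide line."""
    return math.hypot(vehicle_pos[0] - point[0], vehicle_pos[1] - point[1]) <= tol_m
```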
Depending on the embodiment, in response to the vehicle entering the parking area, the processor may search for an available charging area among the charging areas based on at least one of the sensing data and the control data, and may variably display the AR graphic interface to display the location of the searched available charging area.
Depending on the embodiment, the processor may update the rendering of the AR graphic interface so that charging-related information for the searched available charging area is displayed, and the charging-related information may include at least one of a charging method and a charging cost.
Depending on the embodiment, based on the search for an available charging area having failed, the processor may update the rendering of the AR graphic interface so that remaining charging time information for each surrounding charging area is displayed with reference to the current location of the vehicle.
Depending on the embodiment, the processor may display the first AR object so that it rotates in correspondence with the driving direction of the vehicle, and may update the AR graphic interface so that the second AR object is separated and displays a guide trajectory heading from the first AR object to the location of the searched available parking area.
Depending on the embodiment, the guide trajectory for the searched available parking area may be a guidance path generated based on at least one of the vehicle's ADAS sensing data and the control data.
The effects of the vehicle AR display device and the operating method thereof according to the present invention are described as follows.
According to the AR display device and the operating method thereof according to some embodiments of the present invention, a guide for the vehicle's current location and predicted driving situation is simultaneously provided as AR objects on the front image, which is calibrated without separate settings, so that more intuitive and realistic AR guidance can be provided.
In addition, according to the AR display device and the operating method thereof according to some embodiments of the present invention, when a vehicle enters a parking lot/charging station, search results, routes, and necessary information can be provided through a more intuitive AR graphic interface.
In addition, the vehicle AR display device according to the present invention recognizes the selected parking space or the area in front of the charger so that the vehicle can park accurately in the desired parking space or in front of the charger, and varies the AR graphic interface in real time to sequentially provide forward driving, the reverse-driving change point, reverse driving, etc. of the guide path for parking in correspondence with the current driving state of the vehicle.
In addition, when the vehicle enters a parking lot or charging station, route guidance to available parking/charging areas, parking/charging related information, and route guidance on departure are displayed through a more intuitive AR graphic interface by communicating with the control server of that location or through ADAS sensing, so that a more direct and smart parking/charging-related UX can be provided.
FIG. 1 is a diagram illustrating an example of a vehicle related to an embodiment of the present invention.
FIG. 2 is a diagram of a vehicle related to an embodiment of the present invention viewed from various angles.
FIGS. 3 and 4 are diagrams illustrating the interior of a vehicle related to an embodiment of the present invention.
FIGS. 5 and 6 are diagrams referenced to explain various objects related to driving of a vehicle related to an embodiment of the present invention.
FIG. 7 is a block diagram referenced to explain a vehicle and an AR display device related to an embodiment of the present invention.
FIG. 8 is a detailed block diagram related to the processor of the AR display device according to an embodiment of the present invention.
FIG. 9 is a diagram referenced to explain a navigation screen according to an embodiment of the present invention, and FIG. 10 is a diagram referenced to explain an operation of generating the navigation screen of FIG. 9.
FIG. 11 is a flowchart referenced to explain a method of displaying an AR graphic interface on a navigation screen according to an embodiment of the present invention.
FIGS. 12A and 12B are diagrams referenced to explain, as examples of an AR graphic interface according to an embodiment of the present invention, separation into and combination of first and second AR objects.
FIG. 13 is a flowchart referenced to explain, as an operating method of an AR display device according to an embodiment of the present invention, a method of providing a UX display related to parking/charging of a vehicle using an AR graphic interface.
FIGS. 14A, 14B, 14C, and 14D are conceptual diagrams illustrating guiding an available parking area by varying the AR graphic interface based on ADAS sensing data, according to an embodiment of the present invention.
FIGS. 15A, 15B, 15C, and 15D are conceptual diagrams illustrating guiding an available parking area by varying the AR graphic interface based on control information, according to an embodiment of the present invention.
FIGS. 16A and 16B are conceptual diagrams related to updating the display of the AR graphic interface according to the parking type, according to an embodiment of the present invention.
FIG. 17 is a flowchart illustrating guiding an available charging area and charging through the AR graphic interface based on control information, according to an embodiment of the present invention, and FIGS. 18A, 18B, and 18C are conceptual diagrams referenced to explain FIG. 17.
Hereinafter, the embodiments disclosed in this specification will be described in detail with reference to the accompanying drawings, in which identical or similar components are given the same reference numerals regardless of drawing symbols, and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification, and do not in themselves have distinct meanings or roles. In addition, in describing the embodiments disclosed in this specification, detailed descriptions of related known technologies are omitted when it is determined that they may obscure the gist of the embodiments disclosed herein. Furthermore, the accompanying drawings are only intended to facilitate understanding of the embodiments disclosed in this specification; the technical idea disclosed herein is not limited by the accompanying drawings, and should be understood to include all changes, equivalents, and substitutes falling within the spirit and technical scope of the present invention.
Terms including ordinal numbers, such as first and second, may be used to describe various components, but the components are not limited by these terms. These terms are used only for the purpose of distinguishing one component from another.
When a component is said to be "connected" or "coupled" to another component, it may be directly connected or coupled to the other component, but it should be understood that other components may exist in between. On the other hand, when a component is said to be "directly connected" or "directly coupled" to another component, it should be understood that no other component exists in between.
Singular expressions include plural expressions unless the context clearly indicates otherwise.
In this application, terms such as "comprise" or "have" are intended to specify the presence of features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and should be understood not to preclude in advance the presence or addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
The vehicle described in this specification may be a concept including an automobile and a motorcycle. Hereinafter, the description of the vehicle will focus on automobiles.
The vehicle described in this specification may be a concept including all of an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.
In the following description, the left side of the vehicle refers to the left side with respect to the vehicle's driving direction, and the right side of the vehicle refers to the right side with respect to the vehicle's driving direction.
The "system" disclosed in this specification may include at least one of a server device and a cloud device, but is not limited thereto. For example, the system may consist of one or more server devices. As another example, the system may consist of one or more cloud devices. As yet another example, the system may be operated with a server device and a cloud device configured together.
"Map information" or "map data" disclosed in this specification may refer to a meaning including images captured through a vision sensor such as a camera, two-dimensional map information, three-dimensional map information, a digital twin 3D map, a high-definition map (HD map), maps of real/virtual spaces, and other map information, map data, and map-related applications.
"POI (Point of Interest) information" disclosed in this specification is a point of interest selected based on the map information or map data, and may include pre-registered POI information (POIs stored in the map of a cloud server), user-set POI information (e.g., home, school, workplace, etc.), driving-related POI information (e.g., destination, waypoint, gas station, rest area, parking lot, etc.), and top-search POI information (e.g., recently frequently clicked/visited POIs, hot places, etc.). Such POI information may be updated in real time based on the current location of the vehicle.
The "front image" disclosed in this specification is acquired through a vision sensor on or around the vehicle or through an AR camera of the AR display device, and may include, for example, an image acquired or projected through a vision sensor (camera, image laser sensor, etc.) while the vehicle is driving, the real image itself projected on the vehicle's windshield, or an image of a virtual space. That is, the front image may be referred to as including all of an image output through a display, an image projected through a laser sensor, and the real image itself seen through the vehicle's windshield.
FIGS. 1 and 2 show the exterior of a vehicle related to an embodiment of the present invention, and FIGS. 3 and 4 show the interior of a vehicle related to an embodiment of the present invention.
FIGS. 5 and 6 show various objects related to driving of a vehicle related to an embodiment of the present invention.
FIG. 7 is a block diagram referenced to describe a vehicle according to an embodiment of the present invention.
Referring to FIGS. 1 to 7, the vehicle 100 may include wheels rotated by a power source and a steering input device 510 for adjusting the traveling direction of the vehicle 100.
The vehicle 100 may be an autonomous vehicle. The vehicle 100 may be switched to an autonomous driving mode or a manual mode based on user input. For example, the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on user input received through the user interface device (hereinafter, may be referred to as a 'user terminal') 200.
The vehicle 100 may be switched to the autonomous driving mode or the manual mode based on driving situation information. The driving situation information may be generated based on object information provided by the object detection device 300. For example, the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on driving situation information generated by the object detection device 300. As another example, the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on driving situation information received through the communication device 400.
The vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on information, data, and signals provided from an external device.
When the vehicle 100 is driven in the autonomous driving mode, the autonomous vehicle 100 may be driven based on the driving system 700. For example, the autonomous vehicle 100 may be driven based on information, data, or signals generated by the traveling system 710, the exit (unparking) system 740, and the parking system 750.
When the vehicle 100 is driven in the manual mode, the autonomous vehicle 100 may receive user input for driving through the driving manipulation device 500. The vehicle 100 may be driven based on user input received through the driving manipulation device 500.
The overall length refers to the length from the front to the rear of the vehicle 100, the overall width refers to the width of the vehicle 100, and the overall height refers to the length from the bottom of the wheels to the roof. In the following description, the overall length direction (L) may mean a direction serving as a reference for measuring the overall length of the vehicle 100, the overall width direction (W) may mean a direction serving as a reference for measuring the overall width of the vehicle 100, and the overall height direction (H) may mean a direction serving as a reference for measuring the overall height of the vehicle 100.
As illustrated in FIG. 7, the vehicle 100 may include a user interface device (hereinafter, may be referred to as a 'user terminal') 200, an object detection device 300, a communication device 400, a driving manipulation device 500, a vehicle driving device 600, a driving system 700, a navigation system 770, a sensing unit 120, a vehicle interface unit 130, a memory 140, a control unit 170, and a power supply unit 190.
Depending on the embodiment, the vehicle 100 may further include components other than those described in this specification, or may not include some of the described components.
The user interface device 200 is a device for communication between the vehicle 100 and a user. The user interface device 200 may receive user input and provide the user with information generated by the vehicle 100. The vehicle 100 may implement UI (User Interfaces) or UX (User Experience) through the user interface device (hereinafter, may be referred to as a 'user terminal') 200.
The user interface device 200 may include an input unit 210, an internal camera 220, a biometric detection unit 230, an output unit 250, and a processor 270. Depending on the embodiment, the user interface device 200 may further include components other than those described, or may not include some of the described components.
The input unit 210 is for receiving information from the user, and the data collected by the input unit 210 may be analyzed by the processor 270 and processed as a control command of the user.
The input unit 210 may be disposed inside the vehicle. For example, the input unit 210 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of the seat, one area of each pillar, one area of the door, one area of the center console, one area of the headlining, one area of the sun visor, one area of the windshield, or one area of the window.
The input unit 210 may include a voice input unit 211, a gesture input unit 212, a touch input unit 213, and a mechanical input unit 214.
The voice input unit 211 may convert the user's voice input into an electrical signal. The converted electrical signal may be provided to the processor 270 or the control unit 170. The voice input unit 211 may include one or more microphones.
The gesture input unit 212 may convert the user's gesture input into an electrical signal. The converted electrical signal may be provided to the processor 270 or the control unit 170.
The gesture input unit 212 may include at least one of an infrared sensor and an image sensor for detecting the user's gesture input. Depending on the embodiment, the gesture input unit 212 may detect the user's three-dimensional gesture input. To this end, the gesture input unit 212 may include a light output unit that outputs a plurality of infrared lights, or a plurality of image sensors.
The gesture input unit 212 may detect the user's three-dimensional gesture input through a TOF (Time of Flight) method, a structured light method, or a disparity method.
The touch input unit 213 may convert the user's touch input into an electrical signal. The converted electrical signal may be provided to the processor 270 or the control unit 170.
The touch input unit 213 may include a touch sensor for detecting the user's touch input. Depending on the embodiment, the touch input unit 213 may be formed integrally with the display unit 251 to implement a touch screen. Such a touch screen may together provide an input interface and an output interface between the vehicle 100 and the user.
The mechanical input unit 214 may include at least one of a button, a dome switch, a jog wheel, and a jog switch. The electrical signal generated by the mechanical input unit 214 may be provided to the processor 270 or the control unit 170. The mechanical input unit 214 may be disposed on the steering wheel, center fascia, center console, cockpit module, door, etc.
The internal camera 220 may acquire an image of the vehicle interior. The processor 270 may detect the user's state based on the image of the vehicle interior. The processor 270 may obtain the user's gaze information from the image of the vehicle interior. The processor 270 may detect the user's gesture from the image of the vehicle interior.
The biometric detection unit 230 may acquire the user's biometric information. The biometric detection unit 230 may include a sensor capable of acquiring the user's biometric information, and may acquire the user's fingerprint information, heart rate information, etc. using the sensor. The biometric information may be used for user authentication.
The output unit 250 is for generating output related to vision, hearing, touch, or the like. The output unit 250 may include at least one of a display unit 251, an audio output unit 252, and a haptic output unit 253.
The display unit 251 may display graphic objects corresponding to various information. The display unit 251 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, a 3D display, and an e-ink display.
The display unit 251 may form a mutual layer structure with the touch input unit 213 or may be formed integrally with it, thereby implementing a touch screen.
The display unit 251 may be implemented as a head up display (HUD). When the display unit 251 is implemented as a HUD, the display unit 251 may be provided with a projection module and output information through an image projected onto the windshield or a window.
The display unit 251 may include a transparent display. The transparent display may be attached to the windshield or a window. The transparent display may display a predetermined screen while having a predetermined transparency. To have transparency, the transparent display may include at least one of a transparent TFEL (Thin Film Electroluminescent) display, a transparent OLED (Organic Light-Emitting Diode) display, a transparent LCD (Liquid Crystal Display), a transmissive transparent display, and a transparent LED (Light Emitting Diode) display. The transparency of the transparent display may be adjusted.
Meanwhile, the user interface device 200 may include a plurality of display units 251a to 251g.
The display unit 251 may be disposed in one area of the steering wheel, one area 251a, 251b, 251e of the instrument panel, one area 251d of the seat, one area 251f of each pillar, one area 251g of the door, one area of the center console, one area of the headlining, or one area of the sun visor, or may be implemented in one area 251c of the windshield or one area 251h of the window.
The audio output unit 252 converts the electrical signal provided from the processor 270 or the control unit 170 into an audio signal and outputs it. To this end, the audio output unit 252 may include one or more speakers.
The haptic output unit 253 generates a tactile output. For example, the haptic output unit 253 may operate to vibrate the steering wheel, the seat belts, and the seats 110FL, 110FR, 110RL, and 110RR so that the user can perceive the output.
The processor (hereinafter, may be referred to as a 'control unit') 270 may control the overall operation of each unit of the user interface device 200. Depending on the embodiment, the user interface device 200 may include a plurality of processors 270 or may not include a processor 270.
When the user interface device 200 does not include a processor 270, the user interface device 200 may be operated under the control of a processor of another device in the vehicle 100 or of the control unit 170.
Meanwhile, the user interface device 200 may be referred to as a vehicle display device. The user interface device 200 may be operated under the control of the control unit 170.
The object detection device 300 is a device for detecting objects located outside the vehicle 100. The objects may be various items related to the operation of the vehicle 100. Referring to FIGS. 5 and 6, the object O may include a lane OB10, another vehicle OB11, a pedestrian OB12, a two-wheeled vehicle OB13, traffic signals OB14 and OB15, light, a road, a structure, a speed bump, a landform, an animal, and the like.
The lane OB10 may be a driving lane, a lane next to the driving lane, or a lane in which an oncoming vehicle travels. The lane OB10 may be a concept including the left and right lines forming the lane.
The other vehicle OB11 may be a vehicle traveling around the vehicle 100. The other vehicle may be a vehicle located within a predetermined distance from the vehicle 100. For example, the other vehicle OB11 may be a vehicle preceding or following the vehicle 100.
The pedestrian OB12 may be a person located around the vehicle 100. The pedestrian OB12 may be a person located within a predetermined distance from the vehicle 100. For example, the pedestrian OB12 may be a person located on a sidewalk or a roadway.
The two-wheeled vehicle OB13 may refer to a vehicle located around the vehicle 100 and moving using two wheels. The two-wheeled vehicle OB13 may be a vehicle having two wheels located within a predetermined distance from the vehicle 100. For example, the two-wheeled vehicle OB13 may be a motorcycle or a bicycle located on a sidewalk or a roadway.
The traffic signals may include a traffic light OB15, a traffic sign OB14, and a pattern or text drawn on the road surface.
The light may be light generated from a lamp provided in another vehicle. The light may be light generated from a streetlight. The light may be sunlight.
The road may include a road surface, a curve, and a slope such as an uphill or downhill.
The structure may be an object located around the road and fixed to the ground. For example, the structure may include a streetlight, a street tree, a building, an electric pole, a traffic light, and a bridge.
The landform may include a mountain, a hill, and the like.
Meanwhile, objects may be classified into moving objects and fixed objects. For example, a moving object may be a concept including another vehicle and a pedestrian, and a fixed object may be a concept including a traffic signal, a road, and a structure.
The object detection device 300 may include a camera 310, a radar 320, a lidar 330, an ultrasonic sensor 340, an infrared sensor 350, and a processor 370.
Depending on the embodiment, the object detection device 300 may further include components other than the components described, or may not include some of the components described.
The camera 310 may be located at an appropriate position on the exterior of the vehicle in order to acquire images of the outside of the vehicle. The camera 310 may be a mono camera, a stereo camera 310a, an AVM (Around View Monitoring) camera 310b, or a 360-degree camera.
For example, the camera 310 may be disposed close to the front windshield, inside the vehicle, in order to acquire an image of the front of the vehicle. Alternatively, the camera 310 may be disposed around the front bumper or the radiator grille.
For example, the camera 310 may be disposed close to the rear glass, inside the vehicle, in order to acquire an image of the rear of the vehicle. Alternatively, the camera 310 may be disposed around the rear bumper, the trunk, or the tailgate.
For example, the camera 310 may be disposed close to at least one of the side windows, inside the vehicle, in order to acquire an image of a side of the vehicle. Alternatively, the camera 310 may be disposed around a side mirror, a fender, or a door.
The camera 310 may provide the acquired image to the processor 370.
The radar 320 may include an electromagnetic wave transmitter and a receiver. The radar 320 may be implemented as a pulse radar or a continuous wave radar according to the principle of radio wave emission. Among continuous wave radar schemes, the radar 320 may be implemented in an FMCW (Frequency Modulated Continuous Wave) scheme or an FSK (Frequency Shift Keying) scheme, depending on the signal waveform.
The radar 320 may detect an object via electromagnetic waves based on a TOF (Time of Flight) scheme or a phase-shift scheme, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
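For reference, a minimal sketch of the TOF range calculation mentioned above for the electromagnetic-wave case (hypothetical helpers, not part of the claimed device): the round-trip travel time of the reflected wave gives the distance, and the change in distance between two measurements gives the relative speed.

C = 299_792_458.0  # propagation speed of the electromagnetic wave (m/s)

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the object: the wave travels out and back, so divide by 2."""
    return C * round_trip_time_s / 2.0

def relative_speed(d1_m: float, d2_m: float, dt_s: float) -> float:
    """Approximate relative speed from two consecutive distance measurements."""
    return (d2_m - d1_m) / dt_s

# Example: an echo received 2 microseconds after transmission is ~299.8 m away.
d = tof_distance(2e-6)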
The radar 320 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or to the side of the vehicle.
The lidar 330 may include a laser transmitter and a receiver. The lidar 330 may be implemented in a TOF (Time of Flight) scheme or a phase-shift scheme.
The lidar 330 may be implemented as a driven type or a non-driven type.
When implemented as a driven type, the lidar 330 is rotated by a motor and can detect objects around the vehicle 100.
When implemented as a non-driven type, the lidar 330 can detect, through optical steering, objects located within a predetermined range with respect to the vehicle 100. The vehicle 100 may include a plurality of non-driven lidars 330.
The lidar 330 may detect an object via laser light based on a TOF (Time of Flight) scheme or a phase-shift scheme, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
The lidar 330 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or to the side of the vehicle.
The ultrasonic sensor 340 may include an ultrasonic transmitter and a receiver. The ultrasonic sensor 340 may detect an object based on ultrasonic waves, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
The ultrasonic sensor 340 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or to the side of the vehicle.
The infrared sensor 350 may include an infrared transmitter and a receiver. The infrared sensor 350 may detect an object based on infrared light, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
The infrared sensor 350 may be disposed at an appropriate position on the exterior of the vehicle in order to detect an object located in front of, behind, or to the side of the vehicle.
The processor 370 may control the overall operation of each unit of the object detection device 300.
The processor 370 may detect and track an object based on the acquired image. The processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object through an image processing algorithm.
The processor 370 may detect and track an object based on reflected electromagnetic waves, which are transmitted electromagnetic waves reflected by the object and returned. The processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object, based on the electromagnetic waves.
The processor 370 may detect and track an object based on reflected laser light, which is transmitted laser light reflected by the object and returned. The processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object, based on the laser light.
The processor 370 may detect and track an object based on reflected ultrasonic waves, which are transmitted ultrasonic waves reflected by the object and returned. The processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object, based on the ultrasonic waves.
The processor 370 may detect and track an object based on reflected infrared light, which is transmitted infrared light reflected by the object and returned. The processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed with respect to the object, based on the infrared light.
Depending on the embodiment, the object detection device 300 may include a plurality of processors 370 or may not include the processor 370. For example, the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 may each individually include a processor.
When the processor 370 is not included in the object detection device 300, the object detection device 300 may be operated under the control of a processor of another device in the vehicle 100 or under the control of the control unit 170.
The object detection device 300 may be operated under the control of the control unit 170.
The communication device 400 is a device for performing communication with an external device. Here, the external device may be another vehicle, a mobile terminal, or a server.
The communication device 400 may include at least one of a transmitting antenna, a receiving antenna, an RF (Radio Frequency) circuit capable of implementing various communication protocols, and an RF element in order to perform communication.
The communication device 400 may include a short-range communication unit 410, a location information unit 420, a V2X communication unit 430, an optical communication unit 440, a broadcast transceiver 450, and a processor 470.
Depending on the embodiment, the communication device 400 may further include components other than the components described, or may not include some of the components described.
The short-range communication unit 410 is a unit for short-range communication. The short-range communication unit 410 may support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), Infrared Data Association (IrDA), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.
The short-range communication unit 410 may form a short-range wireless area network to perform short-range communication between the vehicle 100 and at least one external device.
The location information unit 420 is a unit for acquiring location information of the vehicle 100. For example, the location information unit 420 may include a GPS (Global Positioning System) module or a DGPS (Differential Global Positioning System) module.
The V2X communication unit 430 is a unit for performing wireless communication with a server (V2I: Vehicle to Infrastructure), another vehicle (V2V: Vehicle to Vehicle), or a pedestrian (V2P: Vehicle to Pedestrian). The V2X communication unit 430 may include an RF circuit capable of implementing the V2I, V2V, and V2P communication protocols.
The optical communication unit 440 is a unit for performing communication with an external device via light. The optical communication unit 440 may include an optical transmitter that converts an electrical signal into an optical signal and transmits it to the outside, and an optical receiver that converts a received optical signal into an electrical signal.
Depending on the embodiment, the optical transmitter may be formed to be integrated with a lamp included in the vehicle 100.
The broadcast transceiver 450 is a unit for receiving a broadcast signal from an external broadcast management server or transmitting a broadcast signal to the broadcast management server through a broadcast channel. The broadcast channels may include satellite channels and terrestrial channels. The broadcast signals may include TV broadcast signals, radio broadcast signals, and data broadcast signals.
The processor 470 may control the overall operation of each unit of the communication device 400.
Depending on the embodiment, the communication device 400 may include a plurality of processors 470 or may not include the processor 470.
When the processor 470 is not included in the communication device 400, the communication device 400 may be operated under the control of a processor of another device in the vehicle 100 or under the control of the control unit 170.
Meanwhile, the communication device 400 may implement a vehicle display device together with the user interface device 200. In this case, the vehicle display device may be referred to as a telematics device or an AVN (Audio Video Navigation) device.
The communication device 400 may be operated under the control of the control unit 170.
The driving control device 500 is a device that receives user input for driving.
In the manual mode, the vehicle 100 may be operated based on signals provided by the driving control device 500.
The driving control device 500 may include a steering input device 510, an acceleration input device 530, and a brake input device 570.
The steering input device 510 may receive an input of the traveling direction of the vehicle 100 from the user. The steering input device 510 is preferably formed in the shape of a wheel so that a steering input can be made by rotation. Depending on the embodiment, the steering input device may be formed in the form of a touch screen, a touch pad, or a button.
The acceleration input device 530 may receive an input for accelerating the vehicle 100 from the user. The brake input device 570 may receive an input for decelerating the vehicle 100 from the user. The acceleration input device 530 and the brake input device 570 are preferably formed in the form of pedals. Depending on the embodiment, the acceleration input device or the brake input device may be formed in the form of a touch screen, a touch pad, or a button.
The driving control device 500 may be operated under the control of the control unit 170.
The vehicle driving device 600 is a device that electrically controls the driving of various devices in the vehicle 100.
The vehicle driving device 600 may include a power train driving unit 610, a chassis driving unit 620, a door/window driving unit 630, a safety device driving unit 640, a lamp driving unit 650, and an air conditioning driving unit 660.
Depending on the embodiment, the vehicle driving device 600 may further include components other than the components described, or may not include some of the components described.
Meanwhile, the vehicle driving device 600 may include a processor. Each unit of the vehicle driving device 600 may individually include a processor.
The power train driving unit 610 may control the operation of a power train device.
The power train driving unit 610 may include a power source driving unit 611 and a transmission driving unit 612.
The power source driving unit 611 may control the power source of the vehicle 100.
For example, when a fossil-fuel-based engine is the power source, the power source driving unit 611 may perform electronic control of the engine. Thereby, the output torque of the engine and the like can be controlled. The power source driving unit 611 may adjust the engine output torque under the control of the control unit 170.
For example, when an electric-energy-based motor is the power source, the power source driving unit 611 may control the motor. The power source driving unit 611 may adjust the rotational speed, torque, and the like of the motor under the control of the control unit 170.
The transmission driving unit 612 may control a transmission. The transmission driving unit 612 may adjust the state of the transmission. The transmission driving unit 612 may adjust the state of the transmission to drive (D), reverse (R), neutral (N), or park (P).
Meanwhile, when the engine is the power source, the transmission driving unit 612 may adjust the gear engagement state in the drive (D) state.
The chassis driving unit 620 may control the operation of a chassis device. The chassis driving unit 620 may include a steering driving unit 621, a brake driving unit 622, and a suspension driving unit 623.
The steering driving unit 621 may perform electronic control of the steering apparatus in the vehicle 100. The steering driving unit 621 may change the traveling direction of the vehicle.
The brake driving unit 622 may perform electronic control of the brake apparatus in the vehicle 100. For example, the speed of the vehicle 100 may be reduced by controlling the operation of the brakes disposed on the wheels.
Meanwhile, the brake driving unit 622 may individually control each of a plurality of brakes. The brake driving unit 622 may control the braking force applied to a plurality of wheels differently from one another.
The suspension driving unit 623 may perform electronic control of the suspension apparatus in the vehicle 100. For example, when the road surface is uneven, the suspension driving unit 623 may control the suspension apparatus so that vibration of the vehicle 100 is reduced. Meanwhile, the suspension driving unit 623 may individually control each of a plurality of suspensions.
The door/window driving unit 630 may perform electronic control of the door apparatus or the window apparatus in the vehicle 100.
The door/window driving unit 630 may include a door driving unit 631 and a window driving unit 632.
The door driving unit 631 may control the door apparatus. The door driving unit 631 may control the opening and closing of a plurality of doors included in the vehicle 100. The door driving unit 631 may control the opening or closing of the trunk or the tailgate. The door driving unit 631 may control the opening or closing of the sunroof.
The window driving unit 632 may perform electronic control of the window apparatus. The window driving unit 632 may control the opening or closing of a plurality of windows included in the vehicle 100.
The safety device driving unit 640 may perform electronic control of various safety apparatuses in the vehicle 100.
The safety device driving unit 640 may include an airbag driving unit 641, a seat belt driving unit 642, and a pedestrian protection device driving unit 643.
The airbag driving unit 641 may perform electronic control of the airbag apparatus in the vehicle 100. For example, the airbag driving unit 641 may control the airbag to be deployed when danger is detected.
The seat belt driving unit 642 may perform electronic control of the seat belt apparatus in the vehicle 100. For example, when danger is detected, the seat belt driving unit 642 may control the passengers to be secured to the seats 110FL, 110FR, 110RL, and 110RR using the seat belts.
The pedestrian protection device driving unit 643 may perform electronic control of the hood lift and the pedestrian airbag. For example, the pedestrian protection device driving unit 643 may control the hood to be lifted up and the pedestrian airbag to be deployed when a collision with a pedestrian is detected.
The lamp driving unit 650 may perform electronic control of various lamp apparatuses in the vehicle 100.
The air conditioning driving unit 660 may perform electronic control of the air conditioner in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioning driving unit 660 may control the air conditioner to operate so that cool air is supplied into the vehicle.
The vehicle driving device 600 may include a processor. Each unit of the vehicle driving device 600 may individually include a processor.
The vehicle driving device 600 may be operated under the control of the control unit 170.
The operation system 700 is a system that controls various operations of the vehicle 100. The operation system 700 may be operated in the autonomous driving mode.
The operation system 700 may include a driving system 710, a park-out system 740, and a parking system 750.
Depending on the embodiment, the operation system 700 may further include components other than the components described, or may not include some of the components described.
Meanwhile, the operation system 700 may include a processor. Each unit of the operation system 700 may individually include a processor.
Meanwhile, depending on the embodiment, when the operation system 700 is implemented in software, it may be a sub-concept of the control unit 170.
Meanwhile, depending on the embodiment, the operation system 700 may be a concept including at least one of the user interface device 200, the object detection device 300, the communication device 400, the vehicle driving device 600, and the control unit 170.
The driving system 710 may perform driving of the vehicle 100.
The driving system 710 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100. The driving system 710 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100. The driving system 710 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
The park-out system 740 may perform park-out (exit from a parking space) of the vehicle 100.
The park-out system 740 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to take the vehicle 100 out of a parking space. The park-out system 740 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to take the vehicle 100 out of a parking space. The park-out system 740 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600 to take the vehicle 100 out of a parking space.
The parking system 750 may perform parking of the vehicle 100.
The parking system 750 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to park the vehicle 100. The parking system 750 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to park the vehicle 100. The parking system 750 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600 to park the vehicle 100.
The navigation system 770 may provide navigation information. The navigation information may include at least one of map information, set destination information, route information according to the set destination, information on various objects on the route, lane information, and current location information of the vehicle.
The navigation system 770 may include a memory and a processor. The memory may store the navigation information. The processor may control the operation of the navigation system 770.
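As a minimal sketch of how such navigation information might be grouped (hypothetical field names; the disclosure does not prescribe a concrete layout):

from dataclasses import dataclass, field

@dataclass
class NavigationInfo:
    map_info: dict                                     # map data (e.g., tiles, road graph)
    destination: tuple | None = None                   # set destination (lat, lon)
    route: list = field(default_factory=list)          # route points toward the destination
    route_objects: list = field(default_factory=list)  # objects known along the route
    lane_info: dict = field(default_factory=dict)      # lane-level information
    current_position: tuple = (0.0, 0.0)               # current vehicle location (lat, lon)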
Depending on the embodiment, the navigation system 770 may receive information from an external device through the communication device 400 and update pre-stored information.
Depending on the embodiment, the navigation system 770 may be classified as a sub-component of the user interface device 200.
The sensing unit 120 may sense the state of the vehicle. The sensing unit 120 may include a posture sensor (e.g., a yaw sensor, a roll sensor, a pitch sensor), a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, an illuminance sensor, an accelerator pedal position sensor, a brake pedal position sensor, and the like.
The sensing unit 120 may acquire sensing signals for vehicle posture information, vehicle collision information, vehicle direction information, vehicle location information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, the steering wheel rotation angle, the illuminance outside the vehicle, the pressure applied to the accelerator pedal, the pressure applied to the brake pedal, and the like.
The sensing unit 120 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
The vehicle interface unit 130 may serve as a passage to various types of external devices connected to the vehicle 100. For example, the vehicle interface unit 130 may include a port connectable to a mobile terminal and may be connected to the mobile terminal through the port. In this case, the vehicle interface unit 130 may exchange data with the mobile terminal.
Meanwhile, the vehicle interface unit 130 may serve as a passage for supplying electrical energy to a connected mobile terminal. When the mobile terminal is electrically connected to the vehicle interface unit 130, the vehicle interface unit 130 may provide the electrical energy supplied from the power supply unit 190 to the mobile terminal under the control of the control unit 170.
The memory 140 is electrically connected to the control unit 170. The memory 140 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In terms of hardware, the memory 140 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 140 may store various data for the overall operation of the vehicle 100, such as programs for the processing or control of the control unit 170.
Depending on the embodiment, the memory 140 may be formed integrally with the control unit 170 or may be implemented as a sub-component of the control unit 170.
The control unit 170 may control the overall operation of each unit in the vehicle 100. The control unit 170 may be referred to as an ECU (Electronic Control Unit).
The power supply unit 190 may supply power required for the operation of each component under the control of the control unit 170. In particular, the power supply unit 190 may receive power from a battery or the like inside the vehicle.
One or more processors and the control unit 170 included in the vehicle 100 may be implemented using at least one of ASICs (application specific integrated circuits), DSPs (digital signal processors), DSPDs (digital signal processing devices), PLDs (programmable logic devices), FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
Meanwhile, the AR display device 800 according to the present invention may display, in real time, an AR graphic interface representing the driving state of the vehicle on the front image of the vehicle (or on the windshield of the vehicle) through AR merging, based on the navigation information of the vehicle 100 and data received from an AR camera.
To this end, the AR display device 800 may include a communication module 810 for communicating with other devices/systems, servers, and the vehicle, a processor 820 that controls the overall operation of the AR display device 800, and a display 830 for displaying a navigation screen including a front image on which the AR graphic interface is rendered.
The 'front image' disclosed herein may include not only an image captured through a camera sensor (or smart glasses including such a function), but also an image reflected on an LCD screen through a camera sensor, an image of real space shown on the windshield/dashboard, and/or a digital-twinned three-dimensional image.
The 'navigation screen including a front image (or driving image)' disclosed herein may mean that a front image, implemented in the form of one of an image captured through the vehicle's camera, an image reflected on an LCD screen, an image of real space shown through the windshield or the like, and/or a digital-twinned three-dimensional image, is layered on a navigation screen generated based on the current location and the navigation information.
The 'parking area' disclosed herein is used to include both a charging station including chargers and a parking lot including parking spaces.
The navigation screen may be an AR navigation screen to which AR technology is applied.
The 'AR graphic interface' disclosed herein is a graphical user interface to which augmented reality (AR) technology is applied, and is AR-merged with the front image of the vehicle in real time.
In this specification, the AR graphic interface may be an AR graphic image representing the current driving state of the vehicle. Further, the AR graphic interface disclosed herein may be an AR graphic image that simultaneously presents the current driving state of the vehicle and a guide to the driving situation of the vehicle. At this time, the guide to the driving situation is displayed on the front image of the vehicle a certain distance and/or a certain time ahead of the corresponding driving situation. Further, the AR graphic interface disclosed herein may be implemented as an AR graphic image that moves in accordance with the current driving state of the vehicle and/or the driving situation of the vehicle.
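By way of illustration only (hypothetical parameters; the embodiment does not fix concrete values), displaying the guide 'a certain distance and/or a certain time ahead' can be reduced to a simple lookahead computation:

def lookahead_distance_m(speed_mps: float, t_ahead_s: float = 3.0, d_min_m: float = 30.0) -> float:
    """Distance ahead of the vehicle at which the guide is anchored.

    Uses a time-based lookahead (speed * t_ahead) but never less than a
    fixed minimum distance, so the guide stays visible at low speeds.
    """
    return max(speed_mps * t_ahead_s, d_min_m)

# Example: at 20 m/s (72 km/h) with a 3 s lookahead, the guide is drawn ~60 m ahead.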
Referring to FIG. 7, the AR display device 800 according to an embodiment of the present invention may be implemented as part of the electrical equipment or systems of the vehicle 100, or may be implemented as a separate, independent device or system. Alternatively, the AR display device 800 may be implemented in the form of a program consisting of instructions executed by a processor such as a user terminal of the vehicle 100.
The AR display device 800 may communicate with the vehicle 100, other devices, and/or servers to receive the front image of the vehicle acquired through an AR camera and sensing data acquired through sensors provided in the vehicle (e.g., a gyroscope sensor, an acceleration sensor, a gravity sensor, a geomagnetic sensor, a temperature sensor, etc.).
The AR display device 800 may run a preset application, for example, an (AR) navigation application.
The AR display device 800 may render an AR graphic interface representing the current driving state of the vehicle based on map data (e.g., information such as routes and POIs), sensing data, and the front image acquired by the camera, and may provide it in real time to the AR GUI surface and the AR camera surface of the navigation application.
The AR display device 800 may render an AR object separated from the AR graphic interface so as to present a guide to the driving situation of the vehicle, based on map data (e.g., information such as routes and POIs), sensing data, and the front image acquired by the camera, and may provide it in real time to the AR GUI surface and the AR camera surface of the navigation application.
At this time, the separated AR object may be referred to as a 'second AR object', and the part of the AR graphic interface remaining after the second AR object is separated may be referred to as a 'first AR object'. That is, the AR graphic interface can be said to include a first AR object representing the current driving state of the vehicle and a second AR object displaying a guide to the driving situation of the vehicle.
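A minimal sketch of this two-part interface (hypothetical class and field names, chosen only for illustration):

from dataclasses import dataclass

@dataclass
class ARObject:
    position: tuple      # where the object is anchored in the front image
    heading_deg: float   # orientation of the object

@dataclass
class ARGraphicInterface:
    first: ARObject          # shows the current driving state (stays with the vehicle)
    second: ARObject         # shows the guide for the upcoming situation
    separated: bool = False  # False: combined shape; True: second object guides ahead

    def separate(self, guide_position: tuple, guide_heading_deg: float) -> None:
        """Detach the second AR object and move it to the guide position."""
        self.separated = True
        self.second.position = guide_position
        self.second.heading_deg = guide_heading_deg

    def combine(self) -> None:
        """Re-attach the second AR object to the first (combined display)."""
        self.separated = False
        self.second.position = self.first.position
        self.second.heading_deg = self.first.heading_deg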
Hereinafter, FIG. 8 is a detailed block diagram related to the processor 820 of the AR display device 800 according to the embodiment of the present invention described above.
The conceptual diagram shown in FIG. 8 may include configurations related to operations performed by the processor 820 of the AR display device 800, and the information, data, and programs used for them. In this respect, the block diagram shown in FIG. 8 may also be used to mean a service provided through the processor 820 and/or a system executed/implemented by the processor 820. Hereinafter, for convenience of explanation, it will be referred to as the processor 820.
FIG. 9 is a diagram referenced for explaining a navigation screen according to an embodiment of the present invention, and FIG. 10 is a diagram referenced for explaining an operation of generating the navigation screen of FIG. 9.
Referring to FIG. 8, the processor 820 may include, and/or operate in conjunction with and drive, a navigation engine 910, an AR engine (Augmented Reality Engine) 920, a navigation application 930, and sensors and maps 940.
The navigation engine 910 may receive map data and GPS data from the vehicle or the like. The navigation engine 910 may perform map matching based on the map data and the GPS data. The navigation engine 910 may perform route planning according to the map matching. The navigation engine 910 may display a map and perform route guidance. The navigation engine 910 may provide route guidance information to the navigation application 930. A rough sketch of this flow is shown below.
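A minimal sketch of the navigation engine flow described above (hypothetical function names; the matching and planning algorithms are not specified by the disclosure, so trivial placeholders are used):

def map_match(map_data, gps_fix):
    """Snap the raw GPS fix to the nearest node of the road network (naive)."""
    return min(map_data["nodes"],
               key=lambda n: (n[0] - gps_fix[0]) ** 2 + (n[1] - gps_fix[1]) ** 2)

def plan_route(map_data, start, destination):
    """Placeholder planner: a direct start-to-destination route."""
    return [start, destination]

def build_route_guidance(route, position):
    """Guidance data handed to the navigation application."""
    return {"next_waypoint": route[1], "remaining_legs": len(route) - 1}

def navigation_engine_step(map_data, gps_fix, destination):
    """One engine pass: map matching -> route planning -> route guidance."""
    position = map_match(map_data, gps_fix)
    route = plan_route(map_data, position, destination)
    return build_route_guidance(route, position)

# Example
m = {"nodes": [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]}
print(navigation_engine_step(m, (0.9, 0.1), (1.0, 1.0)))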
The navigation engine 910 may include a navigation controller 911. The navigation controller 911 may receive map matching data, map display data, and route guidance data.
The navigation controller 911 may provide route data, POI (point of interest) data, and the like to the AR engine 920 based on the received map matching data, map display data, and route guidance data.
The navigation controller 911 may provide route guidance data, map display frames, and the like to the navigation application 930.
The AR engine 920 may include an adapter 921 and a renderer 922. The adapter 921 may receive front image data acquired from a camera (e.g., an AR camera), and sensing data acquired from vehicle sensors, for example, a gyroscope, an accelerometer, a gravity sensor, a magnetometer, and/or a thermometer.
The AR engine 920 may receive sensing data acquired from ADAS sensors (e.g., camera, radar, lidar, ultrasonic, sonar). For example, driving-related sensing data such as the driving direction and speed and the distance to the lane may be acquired as sensing data through the ADAS sensors.
The AR engine 920 may receive high-precision map data and programs related thereto. Here, the high-precision map (HD Map) is a map for providing detailed information on roads and the surrounding terrain to an autonomous vehicle in advance; it has an accuracy within an error range of about 10 cm and contains, in three-dimensional digital form, lane-level information such as road center lines and boundary lines, as well as information on traffic lights, signs, curbs, road surface markings, and various structures.
The AR engine 920 may receive sensing data, reception data, control data, and programs related thereto acquired from a TCU (Transmission Control Unit) (e.g., third-party services, V2X, ITS communication, etc.).
The TCU (Transmission Control Unit) of the sensors and maps 940 is a communication control device mounted on the vehicle, which enables communication with, for example, V2X (vehicle to everything), a communication technology for communicating with various elements on the road for autonomous vehicles (e.g., situation data collectable through V2V and V2I), and ITS (Intelligent Transport Systems) or C-ITS (Cooperative Intelligent Transport Systems), which are cooperative intelligent transport system technologies.
The AR engine 920 may perform calibration on the front image based on data provided from a calibration factor database. The AR engine 920 may perform object detection based on the front image data and the route data. The AR engine 920 may perform prediction and interpolation based on the detected objects.
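A minimal sketch of the prediction-and-interpolation step (a constant-velocity assumption chosen purely for illustration; the disclosure does not specify the motion model):

def predict_position(p_prev: tuple, p_curr: tuple, dt: float, t_ahead: float) -> tuple:
    """Constant-velocity prediction of a detected object's next position."""
    vx = (p_curr[0] - p_prev[0]) / dt
    vy = (p_curr[1] - p_prev[1]) / dt
    return (p_curr[0] + vx * t_ahead, p_curr[1] + vy * t_ahead)

def interpolate(p_a: tuple, p_b: tuple, alpha: float) -> tuple:
    """Linear interpolation between two detections for frames without one (0 <= alpha <= 1)."""
    return (p_a[0] + (p_b[0] - p_a[0]) * alpha,
            p_a[1] + (p_b[1] - p_a[1]) * alpha)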
The renderer 922 may perform rendering based on the route data, the POI data, and the prediction and interpolation result data. The renderer 922 may provide AR GUI (graphical user interface) frames and AR camera frames to the navigation application 930.
The navigation application 930 may generate an AR navigation screen.
Referring to FIG. 9, the AR navigation screen 900 may include a navigation map surface 901, an AR camera surface 902, an AR GUI surface 903, and a navigation GUI surface 904.
The navigation application 930 may generate the navigation map surface 901 based on the map display frame provided by the navigation controller 911. The navigation application 930 may generate the AR camera surface 902 based on the AR camera frame provided from the renderer 922. The navigation application 930 may generate the AR GUI surface 903 based on the AR GUI frame provided from the renderer 922. The navigation application 930 may generate the navigation GUI surface 904 based on the route guidance data provided from the navigation controller 911.
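For illustration, this screen composition can be sketched as a simple mapping from input frames to surfaces (hypothetical names; the actual surface API is not part of this description):

def compose_ar_navigation_screen(map_display_frame, ar_camera_frame, ar_gui_frame, route_guidance_data):
    """Build the four surfaces of the AR navigation screen from their sources."""
    return {
        "navigation_map_surface": map_display_frame,    # from navigation controller 911
        "ar_camera_surface": ar_camera_frame,           # from renderer 922
        "ar_gui_surface": ar_gui_frame,                 # from renderer 922
        "navigation_gui_surface": route_guidance_data,  # from navigation controller 911
    }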
Referring to FIGS. 8 and 10 together, when the navigation application 930 is run, the navigation application 930 may generate the navigation map surface 901, the AR camera surface 902, the AR GUI surface 903, and the navigation GUI surface 904.
The navigation application 930 may provide the parameters of the AR camera surface 902 and the parameters of the AR GUI surface 903 to the AR engine 920.
The AR engine 920 may register a callback function for receiving front image data from the camera server 1001. The camera server 1001 may be understood as a concept included in, for example, the memory of the AR display device 800.
The AR engine 920 may receive the front image data and crop it. Cropping may include adjusting the size or position of the image, editing a partial area, adjusting transparency, and the like. The navigation application 930 may display the cropped front image on the AR camera surface 902. The AR engine 920 may perform AR merging in real time. In addition, the navigation application 930 may display the AR GUI on the AR GUI surface 903 based on the cropped front image.
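A rough sketch of the callback registration and cropping described above (a toy camera-server interface; only the shape of the interaction is illustrated, not the actual API):

class CameraServer:
    """Toy stand-in for the camera server 1001: pushes frames to subscribers."""
    def __init__(self):
        self._callbacks = []

    def register_callback(self, fn):
        self._callbacks.append(fn)

    def publish(self, frame):
        for fn in self._callbacks:
            fn(frame)

def crop(frame, x, y, w, h):
    """Crop a frame given as a 2D list of rows (size/position adjustment)."""
    return [row[x:x + w] for row in frame[y:y + h]]

server = CameraServer()
server.register_callback(lambda frame: print("cropped:", crop(frame, 1, 0, 2, 2)))
server.publish([[1, 2, 3], [4, 5, 6], [7, 8, 9]])  # -> cropped: [[2, 3], [5, 6]]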
도 11은 본 발명의 실시예에 따른 AR 그래픽 인터페이스를 내비게이션 화면이 표시하는 방법(1100)을 설명하는데 참조되는 흐름도이다. FIG. 11 is a flowchart referenced to explain a method 1100 of displaying an AR graphic interface on a navigation screen according to an embodiment of the present invention.
도 11의 각 과정은 다른 언급이 없다면 프로세서(또는, AR 엔진)에 의해 수행될 수 있다. 또한, 도 11의 과정은 위에서 도 8 내지 도 10을 참조하여 설명한 프로세서(820)에 의한 내비게이션 엔진(910), AR 엔진(920), 내비게이션 애플리케이션(930)의 동작들을 포함하여 수행하거나 또는 그 동작들 중 적어도 일부가 도 11의 과정 이전 또는 이후에 수행될 수 있다.Each process in FIG. 11 may be performed by a processor (or AR engine) unless otherwise specified. In addition, the process of FIG. 11 includes or performs operations of the navigation engine 910, AR engine 920, and navigation application 930 by the processor 820 described above with reference to FIGS. 8 to 10. At least some of these may be performed before or after the process of FIG. 11.
도 11을 참조하면, 기설정된 애플리케이션이 구동되는 것으로 방법이 개시된다(S10). Referring to FIG. 11, the method begins with running a preset application (S10).
상기 기설정된 애플리케이션은, AR 디스플레이 장치(800)에 미리 설치되거나 또는 이와 연동된 다른 장치/서버에 의해, 예를 들어 차량의 AR 모드의 실행에 따라 구동될 수 있다. 상기 기설정된 애플리케이션은, 예를 들어 차량의 주행시 AR 모드에서 실행되는 내비게이션 애플리케이션일 수 있다. The preset application may be preinstalled on the AR display device 800 or driven by another device/server linked thereto, for example, when the AR mode of the vehicle is executed. The preset application may be, for example, a navigation application that runs in AR mode while driving the vehicle.
내비게이션 애플리케이션은, 예를 들어 내비게이션 엔진으로부터 맵 데이터 및 GPS 데이터에 기반한 루트 가이던스와 맵 디스플레이 프레임을 제공받아, 각각 내비게이션 GUI 렌더링 및 맵 디스플레이 서피스를 생성한다. The navigation application, for example, receives route guidance and map display frames based on map data and GPS data from the navigation engine, and generates navigation GUI rendering and map display surfaces, respectively.
또, 내비게이션 애플리케이션은, 예를 들어 AR 엔진으로부터 AR GUI 프레임을 제공받아 AR GUI 서피스를 생성하고, AR 카메라 프레임을 제공받아 AR 카메라 서피스를 생성한다. 내비게이션 애플리케이션은 생성된 맵 디스플레이 서피스, AR 카메라 서피스, 및 AR GUI 서피스를 내비게이션 GUI 서피스에 렌더링한다.In addition, the navigation application, for example, receives an AR GUI frame from the AR engine to create an AR GUI surface, and receives an AR camera frame to create an AR camera surface. The navigation application renders the generated map display surface, AR camera surface, and AR GUI surface to the navigation GUI surface.
Based on map data acquired from a server, memory, or the vehicle, and on the vehicle's sensing data, the processor generates an AR graphic interface including a first AR object that displays the driving state of the vehicle and a second AR object that displays a guide for the vehicle's driving situation, and renders it so as to be superimposed on the front image of the vehicle (S20).
The processor can merge (AR merging) the AR graphic interface, which is generated in real time, with the front image of the vehicle in real time.
The processor displays (renders) the AR graphic interface with the first and second AR objects combined. When a preset condition is satisfied, the processor displays (renders) the AR graphic interface with the second AR object separated from it.
Here, the preset condition may include a case where a change in the vehicle's driving situation is predicted from the current driving state based on the vehicle's sensing data. The preset condition may also include a case where a situation is detected in which a change in the vehicle's driving situation, or a need for guidance, is predicted from the current driving state based on at least one of ADAS sensing data, high-precision map data, and TCU communication data such as V2X, ITS, and C-ITS.
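A minimal sketch of how such a separation condition might be evaluated, with hypothetical boolean flags standing in for the listed data sources (ADAS, high-precision map, TCU communication):

from dataclasses import dataclass

@dataclass
class DrivingContext:
    adas_change_predicted: bool = False   # from ADAS sensing data
    map_change_predicted: bool = False    # from high-precision map data
    tcu_change_predicted: bool = False    # from V2X / ITS / C-ITS messages

def should_separate(ctx: DrivingContext) -> bool:
    # The second AR object separates when any source predicts a change in the
    # driving situation or a need for guidance; otherwise it stays combined.
    return (ctx.adas_change_predicted
            or ctx.map_change_predicted
            or ctx.tcu_change_predicted)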
Next, the processor displays the front image, on which the AR graphic interface is superimposed, on the navigation screen (S30).
The processor may render the AR graphic interface onto the front image with the first and second AR objects combined. The processor may provide an AR GUI frame and an AR camera frame corresponding to the AR graphic interface to the navigation application, thereby generating an AR GUI surface and an AR camera surface, respectively.
Afterwards, the generated AR GUI surface and AR camera surface are rendered onto the navigation GUI surface, so that the front image on which the AR graphic interface is rendered is included (displayed) in the navigation screen.
Meanwhile, this AR graphic interface can be varied based on a driving situation whose change is predicted from the map data and the vehicle's sensing data.
At this time, the varying AR graphic interface is presented with its plurality of AR objects in separated form, so that the driver can intuitively grasp both the display of the current driving state and the guide display for the predicted change in the driving situation.
FIGS. 12A and 12B are examples of the AR graphic interface according to an embodiment of the present invention, and are diagrams referenced in explaining its separation into, and recombination from, the first and second AR objects based on a predicted change in the driving situation.
Referring to the drawings, the AR graphic interface 1200 may be implemented as an AR image of a specific shape in 3D form, and through this AR image it can present road information and the like, in addition to the vehicle's current driving direction, driving speed, and steering information.
The AR graphic interface 1200 may be implemented as a form in which a first object and a second object are combined.
Here, the first object may be implemented, for example, in the form of a 3D spade (e.g., a shovel-shaped image), and the second object may be implemented in the form of a 3D chevron (e.g., an A- or V-shaped image) extending from the first object. However, this does not mean that the first and second objects are limited to these shapes.
The AR graphic interface 1200 may be combined in such an extended form that the inner frame of the second object and the outer frame of the first object come into contact. At this time, the first and second objects may be expressed in different colors so that they can be visually distinguished.
The AR graphic interface 1200 may be expressed such that the first and second objects, while combined, move at the same or different twist angles to indicate the vehicle's current driving state.
The generated AR graphic interface 1200 is displayed so as to be superimposed on the vehicle front image included in the navigation screen. Specifically, the processor 820 generates, based on the map data and the vehicle's sensing data, the AR graphic interface 1200 indicating the vehicle's current driving state, renders it based on route and POI information, and provides it to the navigation application 930, so that the AR graphic interface 1200 is displayed in overlapping form on the vehicle front image included in the navigation screen.
Referring to FIG. 12B, the processor 820 may separate the first and second AR objects 1210 and 1220 of the AR graphic interface based on a change in the driving situation predicted from the map data and the vehicle's sensing data, render the separated second AR object 1210 so as to display a guide related to the changed driving situation, and thereby update the AR GUI surface and AR camera surface of the navigation application 930.
The condition under which the first and second AR objects 1210 and 1220 are separated may include a case where a change in the vehicle's driving situation is predicted from the vehicle's current driving state based on the vehicle's sensing data.
Alternatively, the condition under which the first and second AR objects 1210 and 1220 are separated may include a case where a situation is detected in which a change in the vehicle's driving situation, or a need for guidance, is predicted from the vehicle's current driving state based on at least one of the vehicle's ADAS sensing data, high-precision map data, and TCU communication data such as V2X, ITS, and C-ITS.
Meanwhile, the separated second AR object 1210 is displayed extending from the display position of the first AR object 1220. Since the first AR object 1220 represents the vehicle's current driving state (e.g., the vehicle's current location and driving direction), the driver can intuitively grasp when, and in which direction, the vehicle should be driven according to the guide expressed by the second AR object 1210.
The separation distance between the first and second AR objects 1210 and 1220 may correspond to the predicted time until, or distance to, the change in the vehicle's driving situation.
In addition, although not shown in detail, the separated second AR object 1210 may be expressed as a plurality of fragments. A certain spacing may be maintained between the plurality of fragments.
In addition, the direction indicated by each of the plurality of fragments may be expressed so as to point progressively toward the predicted situation occurrence point (or situation end point). For example, when the separated second AR object 1210 is expressed as a total of five fragments, each of the five fragments may point toward the same location (e.g., the predicted situation occurrence point) at a different twist angle.
The plurality of fragments may be expressed as moving a certain distance ahead of the first AR object 1220. That is, rather than appearing fixed at a specific point or moment, the plurality of fragments are implemented to express a driving guide for the occurrence of the predicted situation while moving based on the vehicle's current location and driving state.
The moving speed of the plurality of fragments may correspond to how quickly the vehicle is approaching (e.g., its driving speed).
In addition, the number and/or display length of the plurality of fragments may be proportional to the duration or distance over which the predicted situation is maintained. For example, a situation that is maintained for a long time may be expressed with a greater number of fragments, or with a longer total display length, than one that is not.
Among the plurality of fragments, a fragment close to the first AR object 1220 displays its guide so as to be associated with the driving state expressed by the first AR object 1220.
Among the plurality of fragments, a fragment far from the first AR object 1220 displays its guide so as to be associated with the predicted situation.
That is, the plurality of fragments of the separated second AR object 1210 guide the situation predicted from the current driving state corresponding to the first AR object 1220 in a more gradual and seamless manner.
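As an illustration of the fragment behavior described above (count proportional to duration, poses laid out ahead of the first AR object, twist angles blending from the current heading toward the predicted situation point), here is a minimal geometric sketch; the names, the one-fragment-per-second rate, and the naive angle blend are all assumptions.

import math

def fragment_poses(vehicle_xy, heading_rad, target_xy, duration_s, per_s=1.0):
    # Fragment count grows with how long the predicted situation lasts.
    n = max(2, int(duration_s * per_s))
    vx, vy = vehicle_xy
    tx, ty = target_xy
    bearing = math.atan2(ty - vy, tx - vx)  # direction to the situation point
    poses = []
    for i in range(1, n + 1):
        t = i / n
        # Fragments are spaced evenly ahead of the first AR object,
        # toward the predicted situation occurrence point.
        x = vx + (tx - vx) * t
        y = vy + (ty - vy) * t
        # Near fragments follow the current heading; far fragments align with
        # the bearing to the predicted point (linear blend, ignoring wrap-around).
        yaw = (1.0 - t) * heading_rad + t * bearing
        poses.append((x, y, yaw))
    return poses

# Five fragments for a situation expected to last five seconds.
print(fragment_poses((0.0, 0.0), 0.0, (20.0, 10.0), duration_s=5))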
When the situation corresponding to the separation condition ends, the separated second AR object 1210 is displayed again combined with the first AR object 1220. That is, it can again be displayed as the AR graphic interface 1200 shown in FIG. 12A.
Hereinafter, the AR display device 800 according to the present disclosure may recognize that the vehicle has entered a parking area including a charging area, search for an available parking area based on at least one of sensing data (e.g., ADAS sensing data) and control data of the parking area, and variably display the AR graphic interface in real time so as to guide the vehicle to the found available parking area.
FIG. 13 is a representative flowchart referenced in explaining a method 1300 of providing a UX display related to parking/charging of a vehicle using the AR graphic interface, as an operating method of an AR display device according to an embodiment of the present invention.
Unless otherwise specified, each step in FIG. 13 may be performed by the processor 820 of the AR display device. In addition, each step may be performed so as to include some of the operations of the navigation engine 910, AR engine 920, and navigation application 930 described above with reference to FIGS. 8 to 10, or at least some of those operations may be performed before or after the process of FIG. 13 below.
Referring to FIG. 13, while the vehicle is driving, the AR display device 800 displays a front image in which the AR graphic interface, indicating the vehicle's driving state and a guide related to the driving situation, is rendered (S1310).
Specifically, the processor 820 runs a preset application (e.g., a navigation application) and renders the AR graphic interface, in the form of a combined first AR object displaying the vehicle's driving state and second AR object displaying a guide for the vehicle's driving situation, so as to be superimposed on the front image acquired through an AR camera.
The processor 820 may present the front image on which the AR graphic interface is rendered on the display 830 (e.g., as an image reflected onto an LCD screen, an image over the real space seen through the vehicle's windshield/dashboard, or a digital-twinned three-dimensional image).
The AR display device 800 may recognize that the vehicle has entered a parking area including a charging area (S1320).
In the present disclosure, 'parking area' is used, as described above, to encompass both a charging station including chargers and a parking lot including parking spaces. In addition, a 'parking area' according to the present disclosure may cover both the case where a control server is present and the case where it is not.
In the case of a parking area without a control server (or whose control server is not operating), the AR display device 800 may recognize that the vehicle has entered the parking area based on the vehicle's sensing data, map data, and/or ADAS sensing data.
In the case of a parking area including a control server, the control server may recognize the vehicle's entry based on control data, that is, sensing data from sensors provided in the parking area (e.g., cameras, lidar, radar, etc.), and/or provide this to the AR display device 800, whereby it can be recognized that the vehicle has entered the parking area.
The processor 820 may generate and output, together with or through the AR graphic interface, an indication that the vehicle has entered the parking area.
Subsequently, the processor 820 may search for an available parking area or available charging area based on at least one of the vehicle's sensing data and the control data of the parking area (S1330).
In the case of a parking area without a control server (or whose control server is not operating), the processor 820 may search for an available parking area or charging area within the parking area based on the vehicle's sensing data and/or ADAS sensing data.
In the case of a parking area including a control server, the control server may search for an available parking area or charging area within the parking area based on control data, that is, sensing data from sensors provided in the parking area (e.g., cameras, lidar, radar, etc.), and then provide the result to the AR display device 800.
In this way, when an available parking area or charging area is found through the vehicle's sensing data and/or ADAS sensing data, or through the control data, the processor 820 may variably render the AR graphic interface so as to guide the vehicle to the found available parking area or charging area (S1340).
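The search step S1330 can be summarized as a dispatch on whether a control server is available; a hedged sketch, with the dictionary keys invented for illustration:

def find_available_areas(vehicle, control_server=None):
    # Control data takes precedence when a control server exists and works;
    # otherwise fall back to on-board sensing (ADAS / vehicle sensors).
    if control_server is not None and control_server.get("online"):
        return control_server.get("available_areas", [])
    return scan_with_adas(vehicle)

def scan_with_adas(vehicle):
    # Stand-in for fusing front/rear/side ADAS detections into empty slots.
    return vehicle.get("detected_empty_slots", [])

# Without a server, the device relies on its own sensing:
print(find_available_areas({"detected_empty_slots": ["P1411", "P1412"]}))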
The AR graphic interface may provide the UX with the first and second AR objects in separated form in order to guide the vehicle to the available parking area or charging area. In addition, the AR graphic interface may further include a third AR object, along with the first and second AR objects, to display a specific event within the parking area.
FIGS. 14A, 14B, 14C, and 14D are conceptual diagrams for explaining guiding to an available parking area using an AR graphic interface varied based on ADAS sensing data, according to an embodiment of the present invention.
The AR display device 800 according to the present disclosure can guide the vehicle to an available parking area or charging area by varying and providing (outputting) the AR graphic interface in real time based on ADAS sensing data, together with the vehicle's sensing data (e.g., CAN data (steering wheel angle, driving speed, yaw rate) and GPS location/direction information) and map data (e.g., navigation/map data (lane geometry)).
ADAS stands for Advanced Driver Assistance Systems, and ADAS sensing data refers to sensing data acquired through the ADAS (system). Through the ADAS, both objects around the vehicle and the vehicle's environment can be detected.
The processor 820 receives ADAS sensing data, the vehicle's sensing data (e.g., CAN data), and map data such as navigation/map/GPS data, and may display the assistance functions that can be provided, based on the received data, through the separated second AR object. At this time, the separated second AR object may be displayed together with additional information (e.g., charging information such as remaining charging time and charging fee).
Based on the vehicle having entered the parking lot/charging station, the processor 820 may search for an available parking area or charging area based on ADAS sensing data.
At this time, when there are a plurality of available parking areas or charging areas, this may be implemented so that the optimal parking space/charger meeting preset criteria (e.g., proximity to the vehicle's current location, proximity to the exit, priority for fast charging, etc.) is automatically selected, or a plurality of selectable locations (or routes) may be presented so that one can be chosen through user input.
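One plausible way to rank multiple candidates against such preset criteria is a weighted score; the weights below are arbitrary placeholders, not values from the patent.

def pick_best_slot(slots, vehicle_xy, exit_xy, prefer_fast_charging=True):
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    def cost(slot):
        c = dist(slot["xy"], vehicle_xy)        # closer to the vehicle is better
        c += 0.5 * dist(slot["xy"], exit_xy)    # closer to the exit is better
        if prefer_fast_charging and slot.get("fast_charger"):
            c -= 10.0                           # fast chargers take priority
        return c

    return min(slots, key=cost)

slots = [{"id": "P1411", "xy": (5, 2)}, {"id": "P1412", "xy": (9, 1), "fast_charger": True}]
print(pick_best_slot(slots, vehicle_xy=(0, 0), exit_xy=(10, 0)))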
Afterwards, based on the vehicle having come within a predetermined distance of the location of the parking space/charger selected automatically or through user input, the processor 820 updates the display of the AR graphic interface so that the separated second AR object moves to the location of the selected parking space/charger and a guide route connecting the vehicle's current location to the location of the parking space/charger is displayed.
When the vehicle reaches the location of (or the area in front of) the selected parking space/charger along the guide route provided by the separated second AR object, the second AR object moves back to the first AR object, so that the AR graphic interface is provided in combined form.
After this, the vehicle enters parking mode. Seeing that the first and second AR objects have recombined (and/or through the additional information provided with them, such as 'parking mode running'), the driver can intuitively grasp that parking mode has been entered.
Hereinafter, with reference to FIGS. 14A to 14D, an embodiment of searching for an available parking area and guiding the route using an AR graphic interface varied based on ADAS sensing data will be described in detail.
Referring to FIG. 14A, when the vehicle enters a parking lot/charging station, the areas in front of, behind, and beside the vehicle are scanned through the ADAS system (e.g., ADAS sensing data). Accordingly, as the vehicle enters the parking lot/charging station, the processor 820 may search for an available parking area or charging area based on ADAS sensing data.
Although not shown, during the search, the second AR object may be separated from the first AR object and an animation effect of rotating 360 degrees around the first AR object may be output. When the search ends (e.g., with search success/failure), the first and second AR objects are displayed in combined form again.
Based on the ADAS sensing data, the processor 820 may present, on the vehicle front image 1401, an indication of the nearby available parking areas 1411 and 1412 found with reference to the vehicle's current location. At this time, the AR graphic interface 1400, with the first and second AR objects combined, displays the vehicle's current driving state.
Next, as shown in FIG. 14B, the processor 820 may display selection options 1421 and 1422 for the plurality of found available parking areas 1411 and 1412 on the vehicle front image. At this time, the selection options 1421 and 1422 may be displayed together with additional information (e.g., driving distance, proximity to the vehicle's current location, proximity to the exit, etc.).
Based on an input for the selection options 1421 and 1422, the processor 820 selects one available parking area 1412, and the second AR object is accordingly separated and moves to the location of the selected available parking area 1412.
Afterwards, as shown in FIG. 14C, the second AR object 1410 generates a guide trajectory connecting the first AR object, which indicates the vehicle's current location, to the location of the selected available parking area 1412, thereby providing a guidance route.
At this time, the guidance route may be generated in a direction that makes the subsequent parking of the vehicle convenient.
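A guide trajectory that arrives "in a direction convenient for parking" can be sketched as a quadratic Bezier whose control point is pulled back along the slot's approach axis; the 8 m pull-back distance is an assumed value, and the names are hypothetical.

import math

def guide_trajectory(start_xy, slot_xy, slot_heading_rad, pull_back_m=8.0, points=20):
    # Control point pulled back along the slot's approach axis, so the path
    # arrives from the side that leaves the follow-up parking manoeuvre easy.
    cx = slot_xy[0] - pull_back_m * math.cos(slot_heading_rad)
    cy = slot_xy[1] - pull_back_m * math.sin(slot_heading_rad)
    traj = []
    for i in range(points + 1):
        t = i / points
        x = (1 - t) ** 2 * start_xy[0] + 2 * (1 - t) * t * cx + t ** 2 * slot_xy[0]
        y = (1 - t) ** 2 * start_xy[1] + 2 * (1 - t) * t * cy + t ** 2 * slot_xy[1]
        traj.append((x, y))
    return traj

print(guide_trajectory((0.0, 0.0), (15.0, 6.0), math.pi / 2))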
Meanwhile, as shown in FIG. 14D, when the vehicle has entered against the direction of travel, or when the entry direction into the selected available parking area 1412 is opposite to the vehicle's driving direction, a no-entry notice may be displayed through the separated second AR object.
The processor 820 recognizes, based on the vehicle's current location and driving state, that the vehicle has entered in a non-drivable direction, and may update the rendering so that, in accordance with this recognition, the second AR object is separated to display a warning notification and a guide indicating the drivable direction.
Specifically, since the first AR object 1420 indicates the vehicle's current driving direction through its amount of rotation, it is displayed facing the selected available parking area 1412.
The separated second AR objects 1410S-1 and 1410S-2 do not face the selected available parking area 1412, but rotate so as to point in the same direction as the entry direction 1430. That is, by pointing opposite to the direction indicated by the first AR object 1420, they present the drivable direction.
The separated second AR objects 1410S-1 and 1410S-2 may branch in both directions with the first AR object 1420 between them.
The first part 1410S-1 of the separated second AR object connects the location of the selected available parking area 1412 and the first AR object 1420. And the second part 1410S-2 of the separated second AR object guides the route from the first AR object 1420 toward the drivable direction. At this time, the trajectories of both the first and second parts 1410S-1 and 1410S-2 head in the direction opposite to the one indicated by the first AR object 1420, that is, the drivable direction.
The separated second AR objects 1410S-1 and 1410S-2 may display the no-entry warning through a color change, shape change, blinking, highlighting, and the like.
For example, the color of the separated second AR object when guiding the route in the entry direction (e.g., green) may differ from its color when warning about a non-enterable direction (e.g., an orange tone or red).
The processor 820 may change the display method and/or notification level of the no-entry warning according to the vehicle's driving situation (e.g., another vehicle entering from the entry direction, the degree of parking congestion, the separation distance from that vehicle, etc.).
By checking the second AR object displaying the warning about the non-enterable direction, the driver can intuitively change the vehicle's driving direction, decelerate the vehicle, stop it, and so on.
Afterwards, when the vehicle drives in the entry direction (or drivable direction) along the guide trajectory of the second part 1410S-2 of the separated second AR object, the no-entry warning disappears, the color, shape, etc. of the separated second AR objects 1410S-1 and 1410S-2 are restored to their previous state, and the first and second AR objects are then displayed in combined form again.
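The warn-then-restore behavior around a non-drivable entry can be reduced to a small state check; the half-pi tolerance and the color values are illustrative assumptions, not values from the patent.

import math

def wrong_way_state(vehicle_heading, drivable_heading, tol_rad=math.pi / 2):
    # Smallest absolute difference between the two headings.
    diff = abs((vehicle_heading - drivable_heading + math.pi) % (2 * math.pi) - math.pi)
    if diff > tol_rad:
        # Entered against the drivable direction: warn in a distinct color
        # and point the separated guide the opposite way (the drivable one).
        return {"warning": True, "color": "orange", "guide_heading": drivable_heading}
    # Aligned again: restore the previous color; the objects may recombine.
    return {"warning": False, "color": "green", "guide_heading": drivable_heading}

print(wrong_way_state(vehicle_heading=math.pi, drivable_heading=0.0))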
Alternatively, the processor 820 may update the display of the AR graphic interface so as to display again, in the area around the drivable direction, the location of an available parking area re-searched based on ADAS sensing data.
FIGS. 15A, 15B, 15C, and 15D are conceptual diagrams for explaining guiding to an available parking area using an AR graphic interface varied based on control information, according to an embodiment of the present invention.
The AR display device 800 according to the present disclosure can guide the vehicle to an available parking area or charging area by varying and providing (outputting) the AR graphic interface in real time based on the control data of the parking lot/charging station, together with the vehicle's sensing data (e.g., CAN data (steering wheel angle, driving speed, yaw rate) and GPS location/direction information) and map data (e.g., navigation/map data (lane geometry)).
The control data includes data and information generated by the control server based on sensing data from sensors provided in the parking lot/charging station (e.g., lidar, cameras, radar, a location sensor platform utilizing UWB/BLE, etc.).
When the vehicle enters the parking lot/charging station, the control server is connected to the AR display device 800 and can, for example using a digital twin, control the events (situations, operations, functions, etc.) occurring within the parking lot/charging station and the devices installed in the parking lot/charging station (e.g., sensors, chargers, other linked devices/equipment, etc.).
The control server may transmit the acquired control data, or information or data generated based on it, to the vehicle 100 or the AR display device 800.
A digital twin is an implementation in which objects existing in reality (things, spaces, environments, processes, procedures, etc.) are expressed as a digital data model on a computer so that they are replicated identically and can react to one another in real time. Such a digital twin can turn assets such as physical objects, spaces, environments, people, and processes into virtual models using software, allowing them to be operated, or the same actions to be tried out, as in the real world.
The control server holds the internal 3D geometry of the parking lot/charging station building through the digital twin and, based on sensing data from the sensors provided in the parking lot/charging station (e.g., lidar, cameras, radar, a location sensor platform utilizing UWB/BLE, etc.), can provide entry routes, charging/exit routes, storage/retrieval routes, and the like to the AR display device 800.
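The patent lists what the control server provides but not a message format; a hypothetical container for that payload might look like this (every field name is an assumption for illustration only).

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ControlData:
    # Hypothetical payload for what the control server pushes to the device.
    entry_route: List[Tuple[float, float]] = field(default_factory=list)
    charging_route: List[Tuple[float, float]] = field(default_factory=list)
    exit_route: List[Tuple[float, float]] = field(default_factory=list)
    available_slots: List[dict] = field(default_factory=list)  # fused lidar/camera/radar/UWB/BLE detections
    twin_scene_id: str = ""  # reference into the 3D digital twin of the facility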
The processor 820 receives the control data, the vehicle's sensing data (e.g., CAN data), and map data such as navigation/map/GPS data, and may display the assistance functions that can be provided, based on the received data, through the separated second AR object. At this time, the separated second AR object may be displayed together with additional information (e.g., charging information such as remaining charging time and charging fee).
Based on the vehicle having entered the parking lot/charging station, the processor 820 may recognize an available parking area or charging area based on the control data.
At this time, when there are a plurality of available parking areas or charging areas, the optimal parking space/charger meeting preset criteria (e.g., proximity to the vehicle's current location, proximity to the exit, priority for fast charging, etc.) may be selected automatically or through user input.
Based on the vehicle having come within a predetermined distance of the location of the parking space/charger selected automatically or through user input, the processor 820 updates the display of the AR graphic interface so that the separated second AR object moves to the location of the selected parking space/charger and a guide route connecting the vehicle's current location to the location of the parking space/charger is displayed.
Afterwards, when the vehicle reaches the location of (or the area in front of) the selected parking space/charger along the guide route provided by the separated second AR object, the processor 820 renders the second AR object so that it moves back to the first AR object and the AR graphic interface is displayed in combined form again.
After this, the vehicle enters parking mode. Seeing that the first and second AR objects have recombined (and/or through the additional information provided with them, such as 'parking mode running'), the driver can intuitively grasp that parking mode has been entered.
Hereinafter, with reference to FIGS. 15A to 15D, an embodiment of searching for an available parking area and guiding the route using an AR graphic interface varied based on the control data of the parking lot/charging station will be described in detail.
Referring to FIG. 15A, when the vehicle enters a parking lot/charging station, the control server detects the vehicle's entry and searches for an available parking area or charging area (for example, using the digital twin). Accordingly, as the vehicle enters the parking lot/charging station, the processor 820 may learn the result of the search for an available parking area or charging area based on the control data.
Although not shown, during the search by the control server (or until control data is received from the control server), the second AR object may be separated from the first AR object and an animation effect of rotating 360 degrees around the first AR object may be output. When the search ends (e.g., with search success/failure), the first and second AR objects are displayed in combined form again.
Based on the control data, the processor 820 may present, on the vehicle front image 1501, an indication of the nearby available parking areas 1511 and 1512 found by the control server with reference to the vehicle's current location. At this time, the AR graphic interface 1500, with the first and second AR objects combined, displays the vehicle's current driving state.
Next, as shown in FIG. 15B, the processor 820 may display selection options 1521 and 1522 for the plurality of found available parking areas 1511 and 1512 on the vehicle front image. At this time, the selection options 1521 and 1522 may be displayed together with additional information (e.g., driving distance, proximity to the vehicle's current location, proximity to the exit, etc.).
Based on an input for the selection options 1521 and 1522, the processor 820 selects one available parking area 1512, and the second AR object is accordingly separated and moves to the location of the selected available parking area 1512.
Afterwards, as shown in FIG. 15C, the second AR object 1510 generates a guide trajectory connecting the first AR object 1520, which indicates the vehicle's current location, to the location of the selected available parking area 1512, thereby providing a guidance route.
At this time, the guidance route may be generated in a direction that makes the subsequent parking of the vehicle convenient.
Meanwhile, as shown in FIG. 15D, when the vehicle has entered against the direction of travel, or when the entry direction into the selected available parking area 1512 is opposite to the vehicle's driving direction, a no-entry notice may be displayed through the separated second AR object.
The processor 820 recognizes, based on the vehicle's current location and driving state, that the vehicle has entered in a non-drivable direction, and may update the rendering so that, in accordance with this recognition, the second AR object is separated to display a warning notification and a guide indicating the drivable direction.
Specifically, since the first AR object 1520 indicates the vehicle's current driving direction through its amount of rotation, it is displayed facing the selected available parking area 1512.
The separated second AR objects 1510S-1 and 1510S-2 do not face the selected available parking area 1512, but rotate so as to point in the same direction as the entry direction 1530 (or the drivable direction). That is, by pointing opposite to the direction indicated by the first AR object 1520, they present the drivable direction.
The separated second AR objects 1510S-1 and 1510S-2 may branch again in both directions with the first AR object 1520 between them.
The first part 1510S-1 of the separated second AR object connects the location of the selected available parking area 1512 and the first AR object 1520. The second part 1510S-2 of the separated second AR object guides the route from the first AR object 1520 toward the drivable direction. At this time, the trajectories of both the first and second parts 1510S-1 and 1510S-2 head in the direction opposite to the one indicated by the first AR object 1520, that is, the drivable direction.
The separated second AR objects 1510S-1 and 1510S-2 may display the no-entry warning through a color change, shape change, blinking, highlighting, and the like.
For example, the color of the separated second AR object when guiding the route in the entry direction (e.g., green) may differ from its color when warning about a non-enterable direction (e.g., an orange tone or red).
The processor 820 may change the display method and/or notification level of the no-entry warning according to the vehicle's driving situation (e.g., another vehicle entering from the entry direction, the degree of parking congestion, the separation distance from that vehicle, etc.).
By checking the second AR object displaying the warning about the non-enterable direction, the driver can intuitively change the vehicle's driving direction, decelerate the vehicle, stop it, and so on.
Afterwards, when the vehicle drives in the entry direction (or drivable direction) along the guide trajectory of the second part 1510S-2 of the separated second AR object, the no-entry warning disappears, the color, shape, etc. of the separated second AR objects 1510S-1 and 1510S-2 are restored to their previous state, and the first and second AR objects are then displayed in combined form again.
Alternatively, the processor 820 may update the display of the AR graphic interface so as to display again, in the area around the drivable direction, the location of an available parking area re-searched based on the control data.
FIGS. 16A and 16B are conceptual diagrams related to updating the display of the AR graphic interface according to the parking type, according to an embodiment of the present invention.
When the vehicle arrives at the available parking area or charging area via the separated second AR object, the separated second AR object recombines with the first AR object, after which parking mode is executed. In parking mode, a guide route for parking is provided through the AR graphic interface, and this route differs depending on the parking type.
Parking types include, for example, forward (head-in) parking, reverse (perpendicular) parking, angled parking, and parallel parking.
If the selected parking space/charger is laid out for forward parking or angled parking, guidance along a single parking guidance route through the AR graphic interface is sufficient. Therefore, the following describes in detail a method of providing a parking-guidance UX, taking as examples reverse (perpendicular) parking and parallel parking, which require a change of driving direction and include reverse driving.
The processor 820 of the AR display device 800 according to the present disclosure may determine a feasible parking type based on the vehicle approaching the selected parking area (or selected charging area), and, in accordance with that determination, update the rendering of the AR graphic interface so that the second AR object is separated again to display the parking guide line along which the vehicle should drive.
The separated second AR object displays, according to the determined parking type, the driving direction and driving distance the vehicle should follow. The first AR object indicates the current driving direction and steering angle (amount of rotation) of the vehicle traveling along the guided parking route.
Specifically, based on the parking type for the selected parking area (or charging area) having been determined, the processor 820 may calculate the expected change point for reverse driving based on the vehicle's current location and the location of the selected parking area (and the ADAS sensing data).
Here, the change point means the point at which the vehicle must change its driving direction in order to park in the selected parking area (or charging area). That is, the change point means the position and steering angle at which the vehicle must change from the forward (reverse) direction to the reverse (forward) direction.
Subsequently, the processor 820 displays, through the separated second AR object, a first guide line heading toward the change point, and then, based on the vehicle's current location corresponding to the first AR object approaching the change point, updates the display of the AR graphic interface so as to display a second guide line heading, via the vehicle's reverse driving, toward the selected parking area.
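The change-point computation and the switch from the first to the second guide line can be sketched as follows; the 6 m slot offset and 2 m arrival tolerance are assumed values, and the straight-segment guide lines are a simplification of the curved trajectories described in the text.

import math

def plan_reverse_parking(vehicle_xy, slot_xy, slot_heading_rad, offset_m=6.0):
    # Change point: offset out of the slot along its axis, where the driver
    # must switch from forward to reverse gear.
    cp = (slot_xy[0] + offset_m * math.cos(slot_heading_rad),
          slot_xy[1] + offset_m * math.sin(slot_heading_rad))
    first_guide = [vehicle_xy, cp]    # forward leg toward the change point
    second_guide = [cp, slot_xy]      # reverse leg backing into the slot
    return cp, first_guide, second_guide

def active_guide(vehicle_xy, change_point, first_guide, second_guide, near_m=2.0):
    # Switch the displayed guide line once the vehicle reaches the change point.
    if math.hypot(vehicle_xy[0] - change_point[0],
                  vehicle_xy[1] - change_point[1]) <= near_m:
        return "second", second_guide
    return "first", first_guide

cp, g1, g2 = plan_reverse_parking((0.0, 0.0), (10.0, 5.0), math.pi / 2)
print(active_guide((9.8, 10.5), cp, g1, g2))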
차량이 제1 안내선 또는 제2 안내선을 이탈하여 주행하는 것으로 감지되면, 프로세서(820)는 제1 안내선 및/또는 제2 안내선의 컬러, 형상 등을 변경하여, 차량이 안내선에 따라 주행하도록 유도한다.When it is detected that the vehicle is driving deviating from the first guide line or the second guide line, the processor 820 changes the color, shape, etc. of the first guide line and/or the second guide line to guide the vehicle to drive according to the guide line. .
도 16a 및 도 16b는 평행 주차(가로 주차)시, AR 그래픽 인터페이스를 통해 주차를 안내하는 UX 예시이다. 평행 주차(가로 주차)는 전진주행 및 후진주행의 조합으로 이루어진다.Figures 16a and 16b are examples of UX that provide parking guidance through an AR graphic interface when parallel parking (horizontal parking). Parallel parking (horizontal parking) consists of a combination of forward and reverse driving.
먼저, 차량(100)이 전진방향으로 주행을 안내하는 제1 안내선을 따라 주행하는 동안, 전방 영상(1601)에는 차량의 전진 주행을 나타내는 제1 AR 오브젝트(1620) 및 제1 안내선으로 표시되는 분리된 제2 AR 오브젝트(1610)가 표시된다. First, while the vehicle 100 is traveling along the first guide line that guides driving in the forward direction, the front image 1601 includes the first AR object 1620 indicating the forward driving of the vehicle and the separation indicated by the first guide line. The second AR object 1610 is displayed.
제2 AR 오브젝트(1610)는 후진주행의 변경 포인트를 포함하며, 제2 AR 오브젝트(1610)에 의해 표시되는 제1 안내선은, 제1 AR 오브젝트(1620)와 후진주행의 변경 포인트를 연결한다. 상기 변경 포인트는 예를 들어, 제1 안내선의 목적지일 수 있다. 상기 변경 포인트는 상기 제1 안내선을 구성하는 다른 가이드 궤적과 구별되는 컬러, 형상으로 표시될 수 있다.The second AR object 1610 includes a change point for backward travel, and the first guide line displayed by the second AR object 1610 connects the first AR object 1620 and the change point for reverse travel. The change point may be, for example, the destination of the first guide line. The change point may be displayed in a color or shape that is distinct from other guide trajectories constituting the first guide line.
차량이 제2 AR 오브젝트(1610)에 의해 표시되는 제1 안내선을 따라 후진주행의 변경 포인트에 가까이 접근하면, 프로세서(820)는 상기 변경 포인트의 컬러, 형상 등을 변경 표시하여, 후진주행으로의 변경 알림을 제공한다. 이와 함께, 변경 포인트와 관련된 부가 정보(예, '후진(R)주행으로 변경하세요')가 표시될 수 있다.When the vehicle approaches the reverse driving change point along the first guide line displayed by the second AR object 1610, the processor 820 changes the color, shape, etc. of the change point to display the change point to reverse driving. Provide change notification. In addition, additional information related to the change point (e.g., ‘Change to reverse (R) driving’) may be displayed.
차량이 변경 포인트에 도착하면, 프로세서(820)는 제2 AR 오브젝트(1610)에 의해 제1 안내선을 표시하는 대신, 차량의 후진방향으로 목적 주차 영역(PI)을 향하는 제2 안내선을 생성 및 표시하도록, 렌더링 업데이트한다.When the vehicle arrives at the change point, the processor 820 generates and displays a second guide line toward the destination parking area (PI) in the vehicle's reverse direction instead of displaying the first guide line by the second AR object 1610. To do this, update the rendering.
차량 전방 영상(1602)에는, 차량의 현재 주행방향(전진방향)을 나타내는 제1 AR 오브젝트(1620')와 함께 차량의 후진방향으로 목적 주차 영역(PI)까지 안내하는 제2 안내선이 제2 AR 오브젝트(1610R)로 표시된다. In the vehicle front image 1602, there is a first AR object 1620' indicating the current driving direction (forward direction) of the vehicle, and a second guide line guiding the vehicle to the target parking area (PI) in the reverse direction of the vehicle. It is displayed as object 1610R.
이때, 상기 제2 안내선의 컬러 및/또는 그 가이드 궤적이 가리키는 방향과 전술한 제1 안내선의 컬러 및/또는 그 가이드 궤적이 가리키는 방향은 서로 상이하게 표시될 수 있다. 그에 따라, 운전자는 차량의 전진, 후진 제어 안내를 직관적으로 인식할 수 있다.At this time, the color of the second guide line and/or the direction pointed by its guide trace may be displayed differently from the color of the above-described first guide line and/or the direction pointed by its guide trace. Accordingly, the driver can intuitively recognize the vehicle's forward and reverse control guidance.
도 16c 및 도 16d는 후진 주차(세로 주차)시, AR 그래픽 인터페이스를 통해 주차를 안내하는 UX 예시이다. 후진 주차(세로 주차)도 마찬가지로 전진주행과 후진주행의 조합으로 이루어지나, 후진주행시의 회전량(또는, 회전각)이 평행 주차(가로 주차)의 경우보다 크다. Figures 16c and 16d are examples of UX that provide parking guidance through an AR graphic interface when reverse parking (vertical parking). Reverse parking (vertical parking) similarly consists of a combination of forward and backward driving, but the amount of rotation (or rotation angle) during reverse parking is larger than that of parallel parking (horizontal parking).
차량(100)이 전진방향으로 주행을 안내하는 제1 안내선을 따라 주행하는 동안, 전방 영상(1603)에는 차량의 전진 주행을 나타내는 제1 AR 오브젝트(1620) 및 제1 안내선으로 표시되는 분리된 제2 AR 오브젝트(1610')가 표시된다. 제2 AR 오브젝트(1610')에는 부가 정보로, 주차 형태에 관한 정보(예, 후진 주차)가 표시될 수 있다. While the vehicle 100 is traveling along the first guide line that guides driving in the forward direction, the front image 1603 includes a first AR object 1620 indicating the forward driving of the vehicle and a separated first guide line indicated by the first guide line. 2 AR object 1610' is displayed. Information on parking type (eg, reverse parking) may be displayed as additional information on the second AR object 1610'.
제2 AR 오브젝트(1610')는 후진주행의 변경 포인트를 포함하며, 제2 AR 오브젝트(1610')에 의해 표시되는 제1 안내선은, 제1 AR 오브젝트(1620)와 후진주행의 변경 포인트를 연결한다. 상기 변경 포인트는 예를 들어, 제1 안내선의 목적지일 수 있다. 상기 변경 포인트는 상기 제1 안내선을 구성하는 다른 가이드 궤적과 구별되는 컬러, 형상으로 표시될 수 있다.The second AR object 1610' includes a change point for backward travel, and the first guide line displayed by the second AR object 1610' connects the first AR object 1620 and the change point for reverse travel. do. The change point may be, for example, the destination of the first guide line. The change point may be displayed in a color or shape that is distinct from other guide trajectories constituting the first guide line.
차량이 제2 AR 오브젝트(1610')에 의해 표시되는 제1 안내선을 따라 후진주행의 변경 포인트에 가까이 접근하면, 프로세서(820)는 상기 변경 포인트의 컬러, 형상 등을 변경 표시하여, 후진주행으로의 변경 알림을 제공한다. 이와 함께, 변경 포인트와 관련된 부가 정보(예, '후진(R)주행으로 변경하세요')가 표시될 수 있다.When the vehicle approaches the reverse driving change point along the first guide line displayed by the second AR object 1610', the processor 820 changes the color, shape, etc. of the change point and displays it to reverse drive. Provides change notification. In addition, additional information related to the change point (e.g., ‘Change to reverse (R) driving’) may be displayed.
차량이 변경 포인트에 도착하면, 프로세서(820)는 제2 AR 오브젝트(1610')에 의해 제1 안내선을 표시하는 대신, 차량의 후진방향으로 목적 주차 영역을 향하는 제2 안내선을 생성 및 표시하도록, 렌더링 업데이트한다. When the vehicle arrives at the change point, the processor 820 generates and displays a second guide line toward the destination parking area in the vehicle's reverse direction instead of displaying the first guide line by the second AR object 1610'. Update rendering.
이때, 상기 제2 안내선에 의해 표시되는 가이드 궤적의 커브는 전술한 평행 주차(가로 주차)의 경우보다 더 크다. 이는, 상기 제2 안내선을 따라 후진주행해야할 차량의 회전량도 더 크게 증가시켜야함을 의미한다. 이에, 제2 AR 오브젝트(1610R')는 상기 제2 안내선과 관련된 부가 정보로, 차량의 회전량 안내 정보(예, '스티어링 휠을 끝까지 돌리세요')가 더 표시될 수 있다.At this time, the curve of the guide trajectory indicated by the second guide line is larger than that in the case of parallel parking (horizontal parking) described above. This means that the rotation amount of the vehicle to drive backwards along the second guide line must also be greatly increased. Accordingly, the second AR object 1610R' is additional information related to the second guide line and may further display vehicle rotation amount guide information (e.g., 'Turn the steering wheel all the way').
In the vehicle front image 1604, a first AR object 1620' indicating the vehicle's current driving direction (forward) is displayed together with a second guide line, rendered as a second AR object 1610R', guiding the vehicle in the reverse direction to the target parking area.
Here, the color of the second guide line and/or the direction indicated by its guide trajectory may be displayed differently from the color of the first guide line described above and/or the direction indicated by its guide trajectory. Accordingly, the driver can intuitively recognize the forward and reverse control guidance for the vehicle.
FIG. 17 is a flowchart illustrating guidance of a chargeable area and of charging through the AR graphic interface based on control information, according to an embodiment of the present invention, and FIGS. 18A, 18B, and 18C are conceptual diagrams referenced in describing FIG. 17.
Unless otherwise noted, each step shown in FIG. 17 may be performed by the processor 820 of the AR display device 800. In addition, each step may be performed together with some of the operations of the navigation engine 910, AR engine 920, and navigation application 930 described above with reference to FIGS. 8 to 10, or at least some of those operations may be performed before or after the process of FIG. 17 below.
Referring to FIG. 17, the AR display device 800 according to the present disclosure may receive parking lot/charging station map data (or a digital twin) and parking lot/charging station related information from the parking/charging control server, based on the vehicle having entered the parking lot/charging station (1710).
Based on the received information, the processor 820 may display the available parking area/chargeable area through the AR graphic interface (1720).
Specifically, in response to the vehicle entering a parking lot or charging station (the parking area), the processor 820 searches for an available parking area or chargeable area based on at least one of the sensing data and the control data, and varies and displays the AR graphic interface to indicate the location of the found available parking area or chargeable area. The processor 820 provides an AR GUI frame, generated based on the variation information for the AR graphic interface, to the navigation application 930 so that the navigation application 930 updates the AR GUI surface.
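As one hedged reading of this hand-off, the Python sketch below shows a processor packaging variation information into an AR GUI frame and passing it to a navigation application that refreshes its AR GUI surface. The class and method names are illustrative assumptions; the disclosure does not specify this API.

```python
# A minimal sketch, assuming an API shape the disclosure does not specify:
# the processor builds an "AR GUI frame" from variation information and the
# navigation application redraws its AR GUI surface from that frame.
import json

class NavigationApplication:
    """Stands in for the navigation application 930."""
    def __init__(self):
        self.ar_gui_surface = None  # last frame drawn onto the AR GUI surface

    def update_ar_gui_surface(self, frame: dict) -> None:
        self.ar_gui_surface = frame  # a real system would re-render an overlay here

class ARProcessor:
    """Stands in for the processor 820 side of the hand-off."""
    def __init__(self, nav_app: NavigationApplication):
        self.nav_app = nav_app

    def on_area_found(self, area_id: str, xy: tuple) -> None:
        # variation info -> AR GUI frame -> navigation application
        frame = {"objects": [{"type": "area_marker", "id": area_id, "pos": xy}]}
        self.nav_app.update_ar_gui_surface(frame)

nav = NavigationApplication()
ARProcessor(nav).on_area_found("P-27", (12.0, 34.5))
print(json.dumps(nav.ar_gui_surface))
```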
Depending on the embodiment, the processor 820 may update the rendering of the AR graphic interface so that charging-related information for the found chargeable area is displayed. Here, the charging-related information may include at least one of a charging method and a charging cost.
Meanwhile, based on a failure to find a chargeable area, the processor 820 may update the rendering of the AR graphic interface so that remaining charging time information for each nearby charging area is displayed relative to the vehicle's current location.
Also, based on a failure to find an available parking area, the processor 820 may continue searching the surrounding parking spaces until an end-of-search input is received. During this time, the second AR object of the AR graphic interface separates and outputs an animation effect of rotating 360 degrees around the first AR object, indicating that the search is in progress.
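Purely to make the animation concrete, here is a small Python sketch that generates one 360-degree orbit of the separated second AR object around the first. The radius, frame rate, and period are assumed values.

```python
# A minimal sketch of the "searching" animation: the separated second AR object
# makes one 360-degree orbit around the first AR object. Radius, frame rate,
# and period are assumed values.
import math

def orbit_positions(center, radius=1.0, fps=30, period_s=2.0):
    """Yield per-frame positions for one full orbit around `center`."""
    frames = int(fps * period_s)
    for i in range(frames):
        theta = 2 * math.pi * i / frames
        yield (center[0] + radius * math.cos(theta),
               center[1] + radius * math.sin(theta))

# One orbit around a first AR object at the origin: 60 frames at 30 fps.
path = list(orbit_positions((0.0, 0.0)))
assert len(path) == 60
```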
Also, when the search finds no chargeable area or available parking area, the processor 820 may display, through the AR graphic interface and based on the received control data, the charging wait time for the charging areas around the vehicle or for a selected area on the vehicle front image.
When multiple available parking areas/chargeable areas are found, the processor 820 determines whether the automatic selection option is activated (1730) and decides the next AR graphic interface display based on the determination result.
When the automatic selection option is activated, the optimal parking space/charging area is automatically selected by the parking/charging control server or the processor 820 (1740). Notification information indicating that the optimal parking space/charging area has been automatically selected is then displayed/output through the AR graphic interface.
When the automatic selection option is not activated, the processor 820 displays the AR graphic interface including selection options for the multiple found available parking areas/chargeable areas. The parking space/charging area is then selected based on an input for the displayed selection options (1780).
When a parking space/charging area is selected, the processor 820 generates a guidance path to the location of the selected parking space/charging area based on the ADAS sensing data and/or the control data of the parking lot/charging station (1750).
The guidance path may be implemented through the display of a guide trajectory by the separated second AR object.
Specifically, the processor 820 may display the first AR object of the AR graphic interface so that it rotates in correspondence with the vehicle's driving direction, separate the second AR object from the first AR object, and update the AR graphic interface so that, through the separated second AR object, a guide trajectory is displayed from the first AR object toward the location of the found or selected parking space/charging area.
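The disclosure does not fix a planning algorithm for step 1750; as a stand-in, the Python sketch below generates a guidance path with a plain breadth-first search over an assumed drivable/blocked occupancy grid derived from the map data. The grid encoding is an assumption for illustration.

```python
# A minimal sketch, not the disclosed planner: guidance-path generation for
# step 1750 as a breadth-first search over an assumed occupancy grid, where
# '.' marks drivable cells and '#' marks blocked cells.
from collections import deque

def guidance_path(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None."""
    prev, queue, seen = {}, deque([start]), {start}
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = [cur]
            while path[-1] != start:       # walk the predecessor links back
                path.append(prev[path[-1]])
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == '.' and nxt not in seen):
                seen.add(nxt)
                prev[nxt] = cur
                queue.append(nxt)
    return None  # no drivable route to the selected space

# Tiny lot: the path detours around the blocked cells.
print(guidance_path([".#.", "...", ".#."], (0, 0), (2, 2)))
```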
Meanwhile, the processor 820 may determine whether the generated guidance path is in a drivable direction (1760). If it is not a drivable direction, that is, a no-entry direction, the processor updates the display through the separated second AR object to show a no-entry warning and the correct entry direction (1770). To this end, the color, shape, blinking, and highlight effects of the separated second AR object may be changed/applied.
If the generated guidance path is in a drivable direction, the guidance path is displayed through the separated second AR object. When the vehicle drives along the guidance path and reaches the corresponding parking space/charging area (1790), a smart parking mode may be performed (1795).
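One possible form of the drivability check in steps 1760-1770 is sketched below in Python, under the assumption that the control data tags each lane segment with an allowed heading; the angle tolerance and the returned fields are illustrative.

```python
# A minimal sketch of the drivability check (steps 1760-1770), assuming the
# control data tags each lane segment with an allowed heading; the tolerance
# and the returned fields are illustrative assumptions.
ONE_WAY_TOLERANCE_DEG = 90  # beyond this, the path enters against the lane

def check_entry(path_heading_deg: float, lane_heading_deg: float) -> dict:
    """Compare the path's first-segment heading with the lane's allowed heading."""
    diff = abs((path_heading_deg - lane_heading_deg + 180) % 360 - 180)
    if diff > ONE_WAY_TOLERANCE_DEG:
        # step 1770: no-entry warning plus the allowed entry direction
        return {"drivable": False, "warning": "No entry",
                "entry_direction_deg": lane_heading_deg}
    return {"drivable": True}  # step 1760 yes-branch: show the path as-is

print(check_entry(170.0, 0.0))  # against a one-way lane -> no-entry warning
print(check_entry(10.0, 0.0))   # with the lane -> drivable
```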
The smart parking mode may be performed with the parking/charging control server and the AR display device 800 working in conjunction, based on the GPS information, authority information (vehicle control rights), vehicle information, etc. that the parking/charging control server receives from the vehicle or the AR display device 800 when the vehicle 100 enters the parking lot/charging station.
In addition, while parking driving is performed according to the execution of the smart parking mode, the processor 820 may display the vehicle's driving state and the position and direction in which it should travel in real time through the AR graphic interface.
Now, with reference to FIGS. 18A to 18C, an example UX is described in which the chargeable area and charging are guided using the AR graphic interface varied based on control information.
Upon the search and detection of a parking lot/charging station location, the vehicle 100 may be provided with guidance through the AR graphic interface up to the entrance of the found parking lot/charging station. To this end, the AR display device 800 may perform the search, detection, and guidance-path generation for the parking lot/charging station location based on map data and ADAS sensing data.
When the vehicle 100 enters the parking lot/charging station, the parking/charging control server detects this through sensors (e.g., camera, lidar, radar, positioning sensor platform, etc.) and may transmit a connection request (e.g., a request to transmit GPS information, authority information (vehicle control rights), vehicle information, etc.) to the AR display device 800.
When the parking/charging control server and the AR display device 800 are connected based on the response of the AR display device 800 (e.g., transmission of vehicle information such as GPS information, authority information (vehicle control rights), and battery information), the AR display device 800 may receive control data obtained through the parking/charging control server.
The control data includes map data and charging information of the parking lot/charging station.
For example, it may include data, information, and programs concerning a 3D spatial map of the parking lot/charging station, available parking areas based on incoming-vehicle information or information on available (fast or slow) chargers, and real-time parking lot/charging station information (e.g., charging unit price (rates for ultra-fast/fast/slow charging), charging-vehicle occupancy, charging wait time, charger failure information, etc.).
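To make the shape of this control data concrete, here is one hypothetical Python encoding of the fields just listed. Every field name and type is an assumption; the disclosure does not define a payload format.

```python
# A minimal, hypothetical encoding of the control data listed above; every
# field name and type is an assumption, since the disclosure does not define
# a payload format.
from dataclasses import dataclass, field

@dataclass
class ChargerInfo:
    charger_id: str
    speed: str                  # "ultra-fast" | "fast" | "slow"
    unit_price_per_kwh: float   # charging unit price for this speed class
    occupied: bool              # charging-vehicle occupancy
    wait_time_min: int          # charging wait time
    out_of_order: bool = False  # charger failure information

@dataclass
class ControlData:
    map_3d_url: str                         # 3D spatial map / digital twin handle
    available_parking: list = field(default_factory=list)
    chargers: list = field(default_factory=list)

data = ControlData(
    map_3d_url="https://example.invalid/lot-42/twin",
    available_parking=["P-27", "P-31"],
    chargers=[ChargerInfo("C-01", "slow", 0.25, False, 0),
              ChargerInfo("C-02", "fast", 0.40, True, 18)],
)
print(len(data.chargers))  # 2
```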
Based on the received control data, the AR display device 800 may vary and display the AR graphic interface for parking/charging guidance. In addition, the parking/charging control server may generate the above-described digital twin based on the control data and the vehicle information of the vehicle 100, and provide parking/charging guidance using the digital twin.
In FIG. 18A, an AR graphic interface 1800 indicating the driving state, including the vehicle's current location (e.g., the entrance of the parking lot/charging station), may be displayed on the vehicle front image 1801 (or through the digital twin).
Based on the control data from the parking/charging control server, the processor 820 may display, through the AR graphic interface, the locations of the found chargeable areas (or available parking areas) and the charging information for each chargeable area.
To this end, the processor 820 may receive the charger usage status and information on the ultra-fast/fast/slow chargers from the parking/charging control server as control data.
For example, based on the charger usage status and the ultra-fast/fast/slow charger information received from the parking/charging control server, the processor 820 may display 'slow' as the charging information 1821 for the found first chargeable area 1811, and 'fast' as the charging information 1822 for the found second chargeable area 1812.
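A minimal sketch of that labeling step follows, assuming each chargeable area maps to one charger record with a speed class and availability flags; the field names are illustrative.

```python
# A minimal sketch of the labeling step, assuming each chargeable area maps to
# one charger record with a speed class and availability flags (field names
# are illustrative).
def area_labels(area_to_charger: dict) -> dict:
    """Label each free, working chargeable area with its charger speed class."""
    return {area: info["speed"] for area, info in area_to_charger.items()
            if not info["occupied"] and not info["out_of_order"]}

labels = area_labels({
    "1811": {"speed": "slow", "occupied": False, "out_of_order": False},
    "1812": {"speed": "fast", "occupied": False, "out_of_order": False},
})
print(labels)  # {'1811': 'slow', '1812': 'fast'}
```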
Thereafter, on the vehicle front image (or through the digital twin), as shown in FIG. 18B, the second AR object of the AR graphic interface separates to display a guidance path 1810a to the first chargeable area 1811 and a guidance path 1810b to the second chargeable area 1812.
The first AR object 1820 continues to display the vehicle's current location and driving direction.
When a selection input (or automatic selection) for the chargeable areas 1811 and 1812 is received, one of the displayed guidance paths, the path 1810a to the first chargeable area 1811 or the path 1810b to the second chargeable area 1812, is confirmed as the guidance path.
The processor 820 may provide a recommendation indication (e.g., a color change, a highlight, or a display of additional information) for the guidance path selected/suggested according to preset criteria among the multiple guidance paths 1810a and 1810b.
Here, the selection according to the preset criteria may consider proximity to the vehicle's current location, proximity to the parking lot/charging station exit, priority for fast charging, and the like.
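One plausible way to combine those criteria into a single recommendation is sketched below in Python; the weights and speed bonuses are purely illustrative assumptions, not values from the disclosure.

```python
# A minimal sketch of ranking candidate guidance paths by the preset criteria
# named above; the weights and speed bonuses are purely illustrative
# assumptions, not values from the disclosure.
SPEED_BONUS = {"ultra-fast": 2.0, "fast": 1.0, "slow": 0.0}
W_VEHICLE, W_EXIT, W_SPEED = 1.0, 0.5, 20.0  # assumed weights

def recommend(candidates):
    """Pick the candidate guidance path with the lowest weighted score."""
    return min(candidates, key=lambda c: (W_VEHICLE * c["dist_to_vehicle_m"]
                                          + W_EXIT * c["dist_to_exit_m"]
                                          - W_SPEED * SPEED_BONUS[c["speed"]]))

best = recommend([
    {"path": "1810a", "dist_to_vehicle_m": 20, "dist_to_exit_m": 60, "speed": "slow"},
    {"path": "1810b", "dist_to_vehicle_m": 35, "dist_to_exit_m": 40, "speed": "fast"},
])
print(best["path"])  # '1810b': fast charging outweighs the extra distance here
```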
For example, in FIG. 18B, a recommendation indication (e.g., a highlight) may be output on the guidance path 1810b to the second chargeable area 1812, which is marked with the 'fast' charging information 1822.
When the second chargeable area 1812 is selected as the charging area, whether by a selection input following the selection/suggestion or by the automatic selection option, only the guidance path 1810b remains on the vehicle front image. Accordingly, the path to the charging area 1812 is guided through the separated second AR object, as shown in FIG. 18C.
Here, the guidance path 1810b displayed through the second AR object is generated and provided in a direction convenient for parking, taking into account the parking mode to follow. In addition, if the vehicle's current driving direction indicated by the first AR object 1820 is a no-entry direction, a no-entry warning and the correct entry direction (e.g., rotated to be opposite to the vehicle's current driving direction) may be displayed on the guidance path 1810b indicated through the second AR object.
When the parking mode ends, the AR graphic interface is displayed with the first and second AR objects combined. In addition, while the charging mode is performed after parking in the charging area 1812, the AR display device 800 may display charging information and related information (e.g., remaining charging time, charging fee, events/promotions linked to the charging station, etc.) in real time on the vehicle front image, based on the control data received from the parking/charging control server.
Thereafter, when the end of the vehicle's charging (e.g., charging interruption or charging completion) is detected, the processor 820 generates and displays, through the separated second AR object, a path guiding the vehicle to the exit of the parking lot/charging station, based on the control data received from the parking/charging control server.
Meanwhile, what has been described with reference to FIG. 17 and FIGS. 18A to 18C applies similarly when the AR display device 800 operates based on ADAS sensing data. For example, based on ADAS sensing data, the AR display device 800 may display, through separation, combination, and transformation of the AR graphic interface, the search for chargeable areas, paths to the found chargeable areas, charging information, parking guidance paths according to the parking type, no-entry indications, guidance paths to the exit after charging is complete, and the like.
In addition, according to the AR display device and its operating method in some embodiments of the present invention, guidance on the vehicle's current location and the predicted driving situation is simultaneously provided as AR objects on a front image calibrated without separate settings, so that more intuitive and realistic AR guidance can be provided to the vehicle. Also, when the vehicle enters a parking lot/charging station, the search, routes, and necessary information can be provided through a more intuitive AR graphic interface. Also, so that the vehicle can park precisely in the desired parking space or in front of the desired charger, the device recognizes the selected parking space or the front of the charger and varies the AR graphic interface in real time to sequentially present, in correspondence with the vehicle's current driving state, the forward travel of the parking guide path, the point at which to switch to reverse travel, the reverse travel, and so on. Also, when the vehicle enters a parking lot or charging station, by communicating with the control server of that location or through ADAS sensing, route guidance to available parking/charging areas, parking/charging-related information, and route guidance at exit are displayed through a more intuitive AR graphic interface, providing a more direct and smarter parking/charging UX.
The present invention described above can be implemented as computer-readable code (or an application or software) on a medium on which a program is recorded. The control method of the autonomous vehicle described above can be realized by code stored in a memory or the like.
The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include an HDD (hard disk drive), SSD (solid state disk), SDD (silicon disk drive), ROM, RAM, CD-ROM, magnetic tape, floppy disk, and optical data storage device, and it also includes implementation in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include a processor or controller. Accordingly, the above detailed description should not be construed as restrictive in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention fall within the scope of the present invention.

Claims (12)

  1. An AR display device comprising:
    a camera that acquires a front image of a vehicle;
    a communication module that receives sensing data of the vehicle;
    a processor that runs a preset application and renders an AR graphic interface, in which a first AR object indicating a driving state of the vehicle and a second AR object indicating a guide for a driving situation of the vehicle are combined, so that the AR graphic interface overlaps the front image; and
    a display that displays, according to the rendering, the front image overlapped with the AR graphic interface,
    wherein, in response to the vehicle entering a parking area including a charging area, the processor searches for an available parking area based on at least one of the sensing data and control data of the parking area, and varies and displays the AR graphic interface to guide the vehicle to the found available parking area.
  2. The AR display device of claim 1, wherein the processor displays a location of the found available parking area and, based on the displayed available parking area being selected, separates the second AR object and updates the AR graphic interface to display a guide toward the selected parking area.
  3. The AR display device of claim 1, wherein the processor displays the second AR object so that it separates and rotates around the first AR object during the search for the available parking area, and updates the rendering so that the AR graphic interface with the first and second AR objects combined is displayed in response to the end of the search.
  4. The AR display device of claim 1, wherein the processor recognizes, based on the vehicle's current location and driving state, that the vehicle has entered a non-drivable direction and, according to the recognition, separates the second AR object and updates the rendering to display a warning notification and a guide indicating a drivable direction.
  5. The AR display device of claim 4, wherein the processor updates the display of the AR graphic interface to show, in the area surrounding the drivable direction, the location of an available parking area re-searched based on the sensing data and the control data.
  6. The AR display device of claim 2, wherein the processor determines a possible parking type based on the vehicle approaching the selected parking area and, according to the determination, separates the second AR object and updates the rendering of the AR graphic interface to display a parking guide line along which the vehicle should travel.
  7. The AR display device of claim 6, wherein the processor, based on the possible parking type being determined, calculates an expected reverse-travel change point based on the vehicle's current location and the location of the selected parking area, and updates the display of the AR graphic interface so that, through the separated second AR object, a first guide line toward the change point is displayed and thereafter, based on the vehicle's current location corresponding to the first AR object approaching the change point, a second guide line toward the selected parking area in the vehicle's reverse direction is displayed.
  8. The AR display device of claim 1, wherein the processor, in response to the vehicle entering the parking area, searches for a chargeable area among the charging areas based on at least one of the sensing data and the control data, and varies and displays the AR graphic interface to indicate the location of the found chargeable area.
  9. The AR display device of claim 8, wherein the processor updates the rendering of the AR graphic interface so that charging-related information for the found chargeable area is displayed, the charging-related information including at least one of a charging method and a charging cost.
  10. The AR display device of claim 8, wherein the processor, based on a failure to find the chargeable area, updates the rendering of the AR graphic interface so that remaining charging time information for each nearby charging area is displayed relative to the vehicle's current location.
  11. The AR display device of claim 1, wherein the processor displays the first AR object so that it rotates in correspondence with the vehicle's driving direction, and updates the AR graphic interface so that the second AR object separates and displays a guide trajectory from the first AR object toward the location of the found available parking area.
  12. The AR display device of claim 11, wherein the guide trajectory for the found available parking area is a guidance path generated based on at least one of ADAS sensing data of the vehicle and the control data.
PCT/KR2022/095146 2022-06-10 2022-10-19 Ar display device for vehicle and operation method thereof WO2023239003A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020237018106A KR102611338B1 (en) 2022-06-10 2022-10-19 Vehicle AR display device and method of operation thereof
EP23175933.3A EP4290475A1 (en) 2022-06-10 2023-05-30 Ar display device for vehicle and method for operating same
CN202310686375.7A CN117218882A (en) 2022-06-10 2023-06-09 AR display device for vehicle and operation method thereof
US18/208,540 US20230398868A1 (en) 2022-06-10 2023-06-12 Ar display device for vehicle and method for operating same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0070770 2022-06-10
KR20220070770 2022-06-10

Publications (1)

Publication Number Publication Date
WO2023239003A1 true WO2023239003A1 (en) 2023-12-14

Family

ID=89118487

Family Applications (4)

Application Number Title Priority Date Filing Date
PCT/KR2022/095146 WO2023239003A1 (en) 2022-06-10 2022-10-19 Ar display device for vehicle and operation method thereof
PCT/KR2022/095145 WO2023239002A1 (en) 2022-06-10 2022-10-19 Vehicle ar display device and operating method therefor
PCT/KR2022/015979 WO2023238992A1 (en) 2022-06-10 2022-10-19 Ar display device of vehicle and operating method thereof
PCT/KR2022/015982 WO2023238993A1 (en) 2022-06-10 2022-10-19 Ar display device of vehicle and operation method thereof

Family Applications After (3)

Application Number Title Priority Date Filing Date
PCT/KR2022/095145 WO2023239002A1 (en) 2022-06-10 2022-10-19 Vehicle ar display device and operating method therefor
PCT/KR2022/015979 WO2023238992A1 (en) 2022-06-10 2022-10-19 Ar display device of vehicle and operating method thereof
PCT/KR2022/015982 WO2023238993A1 (en) 2022-06-10 2022-10-19 Ar display device of vehicle and operation method thereof

Country Status (1)

Country Link
WO (4) WO2023239003A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012032811A (en) * 2010-07-09 2012-02-16 Toshiba Corp Display device, image data generation apparatus, image data generation program and display method
US20160129836A1 (en) * 2013-07-05 2016-05-12 Clarion Co., Ltd. Drive assist device
KR20170101758A (en) * 2016-02-26 2017-09-06 자동차부품연구원 Augmented Reality Head Up Display Navigation
KR20190078676A (en) * 2017-12-12 2019-07-05 엘지전자 주식회사 Vehicle control device mounted on vehicle and method for controlling the vehicle
KR20190136691A (en) * 2018-05-31 2019-12-10 주식회사 엘지유플러스 Mobile device and method for switching walking navigation thereof
JP2022058537A (en) * 2019-02-14 2022-04-12 株式会社デンソー Display control device

Also Published As

Publication number Publication date
WO2023238993A1 (en) 2023-12-14
WO2023238992A1 (en) 2023-12-14
WO2023239002A1 (en) 2023-12-14

Similar Documents

Publication Publication Date Title
WO2017222299A1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
WO2018044098A1 (en) Vehicle user interface apparatus and vehicle
WO2019031852A1 (en) Apparatus for providing map
WO2019098434A1 (en) In-vehicle vehicle control device and vehicle control method
WO2021045257A1 (en) Route providing device and method for providing route by same
WO2019117333A1 (en) Display device provided in vehicle and control method of display device
EP3475134A1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
WO2018088615A1 (en) Vehicle driving control device and method
WO2018230768A1 (en) Vehicle control device installed in vehicle and vehicle control method
WO2019035652A1 (en) Driving assistance system and vehicle comprising the same
WO2018097465A1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
WO2021141142A1 (en) Route providing device and route providing method therefor
WO2018079919A1 (en) Autonomous vehicle and operating method for autonomous vehicle
WO2018110789A1 (en) Vehicle controlling technology
WO2018088614A1 (en) Vehicle user interface device, and vehicle
WO2021045256A1 (en) Route provision apparatus and route provision method therefor
WO2022154369A1 (en) Display device interworking with vehicle and operating method thereof
WO2019066477A1 (en) Autonomous vehicle and method of controlling the same
EP3545380A1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
WO2020149427A1 (en) Route providing device and route providing method therefor
WO2021157760A1 (en) Route provision apparatus and route provision method therefor
WO2017155199A1 (en) Vehicle control device provided in vehicle, and vehicle control method
WO2021141145A1 (en) Video output device and method for controlling same
WO2020149431A1 (en) Route providing device and control method therefor
WO2020017677A1 (en) Image output device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22945962

Country of ref document: EP

Kind code of ref document: A1