WO2019044652A1 - Information processing apparatus, information processing method, program and movable object - Google Patents


Info

Publication number
WO2019044652A1
Authority
WO
WIPO (PCT)
Prior art keywords
movable object
reference image
unit
map
estimating
Prior art date
Application number
PCT/JP2018/031155
Other languages
English (en)
Inventor
Shinichiro Abe
Masahiko Toyoshi
Shun Lee
Keitaro Yamamoto
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation filed Critical Sony Corporation
Priority to US16/640,970 priority Critical patent/US11288860B2/en
Priority to EP18769241.3A priority patent/EP3676752A1/fr
Publication of WO2019044652A1 publication Critical patent/WO2019044652A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Definitions

  • the present technology relates to an information processing apparatus, an information processing method, a program, and a movable object, and particularly to an information processing apparatus, an information processing method, a program, and a movable object suitable for use when performing matching between an image of map data and an observation image.
  • an apparatus for estimating a position of a movable object comprises a processor in communication with a memory.
  • the processor being configured to execute instructions stored in the memory that cause the processor to generate a dynamic reference image based on an estimated environment and a reference image extracted from a map, and estimate a position of the movable object based on the dynamic reference image and an observation image of an area around the movable object.
  • a non-transitory computer-readable storage medium comprising computer-executable instructions that, when executed by a processor, perform a method for estimating a position of a movable object.
  • the method comprises generating a dynamic reference image based on an estimated environment and a reference image extracted from a map, and estimating a position of the movable object based on the dynamic reference image and an observation image of an area around the movable object.
  • the matching accuracy between an image of map data and an observation image is improved.
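By way of illustration, the claimed method can be sketched as follows: each reference image extracted from the map is adapted to the estimated environment (reduced here to a single brightness gain standing in for, e.g., day versus night), and the pose attached to the best-matching dynamic reference image is returned. The function names, the brightness model, and the sum-of-squared-differences matching score are illustrative assumptions, not the patent's prescribed implementation.

```python
import numpy as np

def generate_dynamic_reference(reference_image, brightness_gain):
    """Adapt a stored reference image to the estimated environment
    (here, a simple global brightness change, e.g. day vs. night)."""
    return np.clip(reference_image * brightness_gain, 0.0, 1.0)

def estimate_position(observation, key_frames, brightness_gain):
    """Return the map pose whose dynamic reference image best matches
    the observation (sum of squared differences as the score)."""
    best_pose, best_score = None, np.inf
    for ref_image, pose in key_frames:
        dyn = generate_dynamic_reference(ref_image, brightness_gain)
        score = np.sum((dyn - observation) ** 2)
        if score < best_score:
            best_pose, best_score = pose, score
    return best_pose

# Two 4x4 grayscale key frames with (x, y, yaw) poses attached.
bright = np.full((4, 4), 0.8)
dark = np.full((4, 4), 0.2)
key_frames = [(bright, (0.0, 0.0, 0.0)), (dark, (5.0, 0.0, 0.0))]

# A night-time observation matching the first key frame after dimming.
observation = bright * 0.5
print(estimate_position(observation, key_frames, brightness_gain=0.5))
```

Without the environment adaptation step, the dimmed observation would match neither stored image well, which is the mismatch the dynamic reference image is meant to remove.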
  • FIG. 1 is a block diagram showing a schematic functional configuration example of a vehicle control system 100 as an example of a movable object control system to which the present technology can be applied.
  • the vehicle control system 100 is a system that is installed in a vehicle 10 and performs various types of control on the vehicle 10. Note that when distinguishing the vehicle 10 from another vehicle, it is referred to as the own car or the own vehicle.
  • the vehicle control system 100 includes an input unit 101, a data acquisition unit 102, a communication unit 103, a vehicle interior device 104, an output control unit 105, an output unit 106, a driving-system control unit 107, a driving system 108, a body-system control unit 109, a body system 110, a storage unit 111, and a self-driving control unit 112.
  • the input unit 101, the data acquisition unit 102, the communication unit 103, the output control unit 105, the driving-system control unit 107, the body-system control unit 109, the storage unit 111, and the self-driving control unit 112 are connected to each other via a communication network 121.
  • the communication network 121 includes an on-vehicle communication network or a bus conforming to an arbitrary standard such as CAN (Controller Area Network), LIN (Local Interconnect Network), LAN (Local Area Network), and FlexRay (registered trademark). Note that the respective units of the vehicle control system 100 are directly connected to each other not via the communication network 121 in some cases.
  • the data acquisition unit 102 includes various sensors for acquiring data to be used for processing performed in the vehicle control system 100, or the like, and supplies the acquired data to the respective units of the vehicle control system 100.
  • the data acquisition unit 102 includes various sensors for detecting the state and the like of the vehicle 10.
  • the data acquisition unit 102 includes a gyro sensor, an acceleration sensor, an inertial measurement unit (IMU), and sensors for detecting the operational amount of an accelerator pedal, the operational amount of a brake pedal, the steering angle of a steering wheel, the engine r.p.m., the motor r.p.m., or the wheel rotation speed.
  • the data acquisition unit 102 includes various sensors for detecting information outside the vehicle 10.
  • the data acquisition unit 102 includes an imaging apparatus such as a ToF (Time Of Flight) camera, a stereo camera, a monocular camera, an infrared camera, and other cameras.
  • the data acquisition unit 102 includes an environment sensor for detecting weather, a meteorological phenomenon, or the like, and an ambient information detection sensor for detecting an object in the vicinity of the vehicle 10.
  • the environment sensor includes, for example, a raindrop sensor, a fog sensor, a sunshine sensor, a snow sensor, or the like.
  • the ambient information detection sensor includes, for example, an ultrasonic sensor, a radar, a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a sonar, or the like.
  • the data acquisition unit 102 includes various sensors for detecting the current position of the vehicle 10.
  • the data acquisition unit 102 includes a GNSS receiver that receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite, or the like.
  • the data acquisition unit 102 includes various sensors for detecting vehicle interior information.
  • the data acquisition unit 102 includes an imaging apparatus that captures an image of a driver, a biological sensor for detecting biological information regarding the driver, a microphone for collecting sound in the interior of the vehicle, and the like.
  • the biological sensor is provided, for example, on a seating surface, a steering wheel, or the like, and detects biological information regarding the passenger sitting on a seat or the driver holding the steering wheel.
  • the communication unit 103 communicates with the vehicle interior device 104 as well as various devices, servers, base stations, and the like outside the vehicle, transmits data supplied from the respective units of the vehicle control system 100, and supplies received data to the respective units of the vehicle control system 100.
  • the communication protocol supported by the communication unit 103 is not particularly limited, and the communication unit 103 may support a plurality of types of communication protocols.
  • the communication unit 103 performs wireless communication with the vehicle interior device 104 via a wireless LAN, Bluetooth (registered trademark), NFC (Near Field Communication), WUSB (Wireless USB), or the like. Further, for example, the communication unit 103 performs wired communication with the vehicle interior device 104 by USB (Universal Serial Bus), HDMI (High-Definition Multimedia Interface), MHL (Mobile High-definition Link), or the like via a connection terminal (not shown) (and, if necessary, a cable).
  • the communication unit 103 communicates with a device (e.g., an application server or a control server) on an external network (e.g., the Internet, a cloud network, or a network unique to the operator) via a base station or an access point. Further, for example, the communication unit 103 communicates with a terminal (e.g., a terminal of a pedestrian or a shop, and an MTC (Machine Type Communication) terminal) in the vicinity of the vehicle 10 by using P2P (Peer To Peer) technology.
  • the communication unit 103 performs V2X communication such as vehicle-to-vehicle communication, vehicle-to-infrastructure communication, communication between the vehicle 10 and a house, and vehicle-to-pedestrian communication.
  • the communication unit 103 includes a beacon reception unit, receives radio waves or electromagnetic waves transmitted from a radio station or the like placed on a road, and acquires information such as the current position, traffic congestion, traffic regulation, or necessary travel time.
  • the vehicle interior device 104 includes, for example, a mobile device or a wearable device owned by the passenger, an information device carried in or attached to the vehicle 10, and a navigation apparatus that searches for a path to an arbitrary destination.
  • the output control unit 105 controls output of various types of information to the passenger of the vehicle 10 or to the outside of the vehicle 10.
  • the output control unit 105 generates an output signal containing at least one of visual information (e.g., image data) and auditory information (e.g., audio data), supplies the signal to the output unit 106, and thereby controls output of the visual information and the auditory information from the output unit 106.
  • the output control unit 105 combines data of images captured by different imaging apparatuses of the data acquisition unit 102 to generate an overhead image, a panoramic image, or the like, and supplies an output signal containing the generated image to the output unit 106.
  • the output control unit 105 generates audio data containing warning sound, a warning message, or the like for danger such as collision, contact, and entry into a dangerous zone, and supplies an output signal containing the generated audio data to the output unit 106.
  • the output unit 106 includes an apparatus capable of outputting visual information or auditory information to the passenger of the vehicle 10 or the outside of the vehicle 10.
  • the output unit 106 includes a display apparatus, an instrument panel, an audio speaker, a headphone, a wearable device such as a spectacle-type display to be attached to the passenger, a projector, a lamp, and the like.
  • the display apparatus included in the output unit 106 is not limited to the apparatus including a normal display, and may be, for example, an apparatus for displaying visual information within the field of view of the driver, such as a head-up display, a transmissive display, and an apparatus having an AR (Augmented Reality) display function.
  • the driving-system control unit 107 generates various control signals, supplies the signals to the driving system 108, and thereby controls the driving system 108. Further, the driving-system control unit 107 supplies the control signal to the respective units other than the driving system 108 as necessary, and notifies them of the control state of the driving system 108, and the like.
  • the body system 110 includes various body-system apparatuses equipped on the vehicle body.
  • the body system 110 includes a keyless entry system, a smart key system, a power window apparatus, a power seat, a steering wheel, an air conditioner, various lamps (e.g., a head lamp, a back lamp, a brake lamp, a blinker, and a fog lamp), and the like.
  • the storage unit 111 includes, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a magnetic storage device such as an HDD (Hard Disc Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage unit 111 stores various programs, data, and the like to be used by the respective units of the vehicle control system 100.
  • the storage unit 111 stores map data of a three-dimensional high precision map such as a dynamic map, a global map that has a lower precision and covers a wider area than the high precision map, and a local map containing information regarding the surroundings of the vehicle 10.
  • the vehicle exterior information detection unit 141 performs processing of detecting information outside the vehicle 10 on the basis of the data or signal from the respective units of the vehicle control system 100. For example, the vehicle exterior information detection unit 141 performs processing of detecting, recognizing, and tracking an object in the vicinity of the vehicle 10, and processing of detecting the distance to the object.
  • the object to be detected includes, for example, a vehicle, a human, an obstacle, a structure, a road, a traffic signal, a traffic sign, and a road sign. Further, for example, the vehicle exterior information detection unit 141 performs processing of detecting the ambient environment of the vehicle 10.
  • the ambient environment to be detected includes, for example, weather, temperature, humidity, brightness, condition of a road surface, and the like.
  • the vehicle exterior information detection unit 141 supplies the data indicating the results of the detection processing to the self-position estimation unit 132, a map analysis unit 151, a traffic rule recognition unit 152, and a situation recognition unit 153 of the situation analysis unit 133, and an emergency event avoidance unit 171 of the operation control unit 135, for example.
  • the vehicle state detection unit 143 performs processing of detecting the state of the vehicle 10 on the basis of the data or signal from the respective units of the vehicle control system 100.
  • the state of the vehicle 10 to be detected includes, for example, speed, acceleration, steering angle, presence/absence and content of abnormality, the state of the driving operation, position and inclination of the power seat, the state of the door lock, the state of other on-vehicle devices, and the like.
  • the vehicle state detection unit 143 supplies the data indicating the results of the detection processing to the situation recognition unit 153 of the situation analysis unit 133, and the emergency event avoidance unit 171 of the operation control unit 135, for example.
  • the self-position estimation unit 132 performs processing of estimating a position, a posture, and the like of the vehicle 10 on the basis of the data or signal from the respective units of the vehicle control system 100, such as the vehicle exterior information detection unit 141 and the situation recognition unit 153 of the situation analysis unit 133. Further, the self-position estimation unit 132 generates a local map (hereinafter, referred to as the self-position estimation map) to be used for estimating a self-position as necessary.
  • the self-position estimation map is, for example, a high precision map using a technology such as SLAM (Simultaneous Localization and Mapping).
  • the self-position estimation unit 132 supplies the data indicating the results of the estimation processing to the map analysis unit 151, the traffic rule recognition unit 152, and the situation recognition unit 153 of the situation analysis unit 133, for example. Further, the self-position estimation unit 132 causes the storage unit 111 to store the self-position estimation map.
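A minimal sketch of such self-position estimation, assuming a simple predict/correct scheme in place of a full SLAM pipeline (the patent does not fix one): the pose is dead-reckoned from odometry and then corrected against a landmark whose map position is known. All names and the correction rule are illustrative assumptions.

```python
import math

def predict(pose, v, omega, dt):
    """Dead-reckoning motion update of an (x, y, yaw) pose from
    forward speed v and yaw rate omega over time step dt."""
    x, y, yaw = pose
    return (x + v * math.cos(yaw) * dt,
            y + v * math.sin(yaw) * dt,
            yaw + omega * dt)

def correct(pose, landmark_map_xy, observed_range_bearing):
    """Place the vehicle so that a known map landmark appears at the
    observed range and bearing (no uncertainty is modelled here)."""
    r, b = observed_range_bearing
    lx, ly = landmark_map_xy
    _, _, yaw = pose
    return (lx - r * math.cos(yaw + b), ly - r * math.sin(yaw + b), yaw)

# Drive 1 m forward, then correct against a landmark known to be at
# (3, 0) in the map and observed 2 m straight ahead.
pose = predict((0.0, 0.0, 0.0), v=1.0, omega=0.0, dt=1.0)
pose = correct(pose, (3.0, 0.0), (2.0, 0.0))
print(pose)  # -> (1.0, 0.0, 0.0)
```

A real system would fuse many landmarks with uncertainty (e.g., a Kalman filter), which is what makes the stored self-position estimation map useful across revisits.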
  • the situation analysis unit 133 performs processing of analyzing the situation of the vehicle 10 and the surroundings thereof.
  • the situation analysis unit 133 includes the map analysis unit 151, the traffic rule recognition unit 152, the situation recognition unit 153, and a situation prediction unit 154.
  • the traffic rule recognition unit 152 performs processing of recognizing a traffic rule in the vicinity of the vehicle 10 on the basis of the data or signal from the respective units of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, and the map analysis unit 151. By this recognition processing, for example, the position and state of the traffic signal in the vicinity of the vehicle 10, content of the traffic regulation in the vicinity of the vehicle 10, a drivable lane, and the like are recognized.
  • the traffic rule recognition unit 152 supplies the data indicating the results of the recognition processing to the situation prediction unit 154 and the like.
  • the situation recognition unit 153 performs processing of recognizing the situation regarding the vehicle 10 on the basis of the data or signal from the respective units of the vehicle control system 100, such as the self-position estimation unit 132, the vehicle exterior information detection unit 141, the vehicle interior information detection unit 142, the vehicle state detection unit 143, and the map analysis unit 151.
  • the situation recognition unit 153 performs processing of recognizing the situation of the vehicle 10, the situation of the surroundings of the vehicle 10, the state of the driver of the vehicle 10, and the like.
  • the situation recognition unit 153 generates a local map (hereinafter, referred to as the situation recognition map) to be used for recognizing the situation of the surroundings of the vehicle 10, as necessary.
  • the situation recognition map is, for example, an occupancy grid map.
  • the situation of the vehicle 10 to be recognized includes, for example, the position, posture, and movement (e.g., speed, acceleration, and moving direction) of the vehicle 10, presence/absence of and content of abnormality, and the like.
  • the situation of the surroundings of the vehicle 10 to be recognized includes, for example, the type and position of a stationary object of the surroundings, the type, position, and movement (e.g., speed, acceleration, and moving direction) of a movable body of the surroundings, the configuration of a road of the surroundings, the condition of a road surface, weather, temperature, humidity, and brightness of the surroundings, and the like.
  • the state of the driver to be recognized includes, for example, physical condition, arousal degree, concentration degree, fatigue degree, movement of the line of sight, driving operation, and the like.
  • the situation recognition unit 153 supplies the data (including the situation recognition map as necessary) indicating the results of the recognition processing to the self-position estimation unit 132 and the situation prediction unit 154, for example. Further, the situation recognition unit 153 causes the storage unit 111 to store the situation recognition map.
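Since the situation recognition map is described as an occupancy grid map, its update can be sketched with the standard log-odds scheme: cells where a range sensor registers a hit become more likely occupied, traversed cells more likely free. The grid layout and the update constants below are illustrative assumptions.

```python
import numpy as np

def update_occupancy_grid(grid, hits, misses, l_occ=0.85, l_free=-0.4):
    """Log-odds update of an occupancy grid: cells with a range-sensor
    hit become more likely occupied, traversed cells more likely free."""
    for r, c in hits:
        grid[r, c] += l_occ
    for r, c in misses:
        grid[r, c] += l_free
    return grid

def occupancy_prob(grid):
    """Convert log-odds back to occupancy probabilities."""
    return 1.0 / (1.0 + np.exp(-grid))

grid = np.zeros((5, 5))  # log-odds 0 means unknown: p = 0.5
grid = update_occupancy_grid(grid,
                             hits=[(2, 4)],             # obstacle cell
                             misses=[(2, 1), (2, 2), (2, 3)])  # ray path
p = occupancy_prob(grid)
print(round(p[2, 4], 2), round(p[2, 2], 2))  # occupied vs. freed cell
```

Log-odds accumulation lets repeated observations of the same cell reinforce or revise each other, which suits a map that is regenerated "as necessary" while driving.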
  • the situation prediction unit 154 performs processing of predicting the situation regarding the vehicle 10 on the basis of the data or signal from the respective units of the vehicle control system 100, such as the map analysis unit 151, the traffic rule recognition unit 152 and the situation recognition unit 153. For example, the situation prediction unit 154 performs processing of predicting the situation of the vehicle 10, the situation of the surroundings of the vehicle 10, the state of the driver, and the like.
  • the situation of the vehicle 10 to be predicted includes, for example, the behavior of the vehicle 10, occurrence of abnormality, a drivable distance, and the like.
  • the situation of the surroundings of the vehicle 10 to be predicted includes, for example, the behavior of a movable body in the vicinity of the vehicle 10, change of the state of a traffic signal, change of the environment such as weather, and the like.
  • the state of the driver to be predicted includes, for example, the behavior, physical condition, and the like of the driver.
  • the situation prediction unit 154 supplies the data indicating the results of the prediction processing to, for example, the route planning unit 161, the action planning unit 162, and the operation planning unit 163 of the planning unit 134, together with the data from the traffic rule recognition unit 152 and the situation recognition unit 153.
  • the action planning unit 162 plans an action of the vehicle 10 for safely driving on the route planned by the route planning unit 161 within the planned time period on the basis of the data or signal from the respective units of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154.
  • the action planning unit 162 makes plans for starting, stopping, travelling directions (e.g., forward, backward, turning left, turning right, and changing direction), driving lane, driving speed, overtaking, and the like.
  • the action planning unit 162 supplies the data indicating the planned action of the vehicle 10 to the operation planning unit 163, for example.
  • the operation planning unit 163 plans the operation of the vehicle 10 for realizing the action planned by the action planning unit 162 on the basis of the data or signal from the respective units of the vehicle control system 100, such as the map analysis unit 151 and the situation prediction unit 154. For example, the operation planning unit 163 makes plans for acceleration, deceleration, running track, and the like.
  • the operation planning unit 163 supplies the data indicating the planned operation of the vehicle 10 to an acceleration/deceleration control unit 172 and a direction control unit 173 of the operation control unit 135, for example.
  • the operation control unit 135 controls the operation of the vehicle 10.
  • the operation control unit 135 includes the emergency event avoidance unit 171, the acceleration/deceleration control unit 172, and the direction control unit 173.
  • the acceleration/deceleration control unit 172 performs acceleration/deceleration control for realizing the operation of the vehicle 10 planned by the operation planning unit 163 or the emergency event avoidance unit 171.
  • the acceleration/deceleration control unit 172 calculates a control target value of a driving-force generation apparatus or a braking apparatus for realizing the planned acceleration, deceleration, or sudden stop, and supplies a control command indicating the calculated control target value to the driving-system control unit 107.
  • the direction control unit 173 controls the direction for realizing the operation of the vehicle 10 planned by the operation planning unit 163 or the emergency event avoidance unit 171. For example, the direction control unit 173 calculates a control target value of a steering mechanism for realizing the running track or sudden turn planned by the operation planning unit 163 or the emergency event avoidance unit 171, and supplies a control command indicating the calculated control target value to the driving-system control unit 107.
  • the present technology is a technology relating to processing of generating map data to be used for the processing performed mainly by the data acquisition unit 102 and the self-position estimation unit 132 of the vehicle control system 100 shown in Fig. 1, and for the self-position estimation processing.
  • FIG. 2 is a block diagram showing one embodiment of a map generation system to which the present technology is applied.
  • a map generation system 200 is installed in, for example, a vehicle 11 different from the vehicle 10 in which the vehicle control system 100 shown in Fig. 1 is installed. Then, the map generation system 200 generates a key frame map that is map data to be used for the self-position estimation processing in the vehicle control system 100.
  • the key frame map includes a plurality of registration images (hereinafter, referred to as the key frames) generated on the basis of a plurality of images (hereinafter, referred to as map images) captured from the vehicle 11 at different positions and postures, and metadata of each key frame.
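The key frame map described above can be sketched as a simple data structure holding registration images with position/posture metadata. The field names and the nearest-neighbour query are illustrative assumptions; the patent does not fix a schema.

```python
from dataclasses import dataclass, field

@dataclass
class KeyFrame:
    """One registration image plus its metadata."""
    image: list      # registration image (e.g. a pixel array)
    position: tuple  # (x, y, z) of the vehicle at capture time
    posture: tuple   # (roll, pitch, yaw) at capture time

@dataclass
class KeyFrameMap:
    key_frames: list = field(default_factory=list)

    def register(self, key_frame):
        """Register a key frame together with its metadata."""
        self.key_frames.append(key_frame)

    def near(self, position, radius):
        """Key frames captured within `radius` of a query position."""
        def dist2(p, q):
            return sum((a - b) ** 2 for a, b in zip(p, q))
        return [kf for kf in self.key_frames
                if dist2(kf.position, position) <= radius ** 2]

kf_map = KeyFrameMap()
kf_map.register(KeyFrame([[0]], (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
kf_map.register(KeyFrame([[1]], (50.0, 0.0, 0.0), (0.0, 0.0, 0.0)))
print(len(kf_map.near((1.0, 0.0, 0.0), radius=5.0)))  # -> 1
```

Storing the pose alongside each image is what later allows a matched key frame to yield the vehicle's position directly.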
  • the map generation system 200 includes a map data acquisition unit 201, a map generation unit 202, a material table storage unit 203, and a map storage unit 204.
  • the stereo camera 211 includes a left camera 211L and a right camera 211R.
  • the left camera 211L and the right camera 211R capture images (stereo photographing) of the front of the vehicle 11 from different, i.e., left and right directions, respectively, and supply the resulting captured images (map images) to the map generation unit 202.
  • the depth sensor 212 detects the distance (depth) to each object in front of the vehicle 11, and supplies depth information indicating the detection results to the map generation unit 202.
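Depth can also be recovered from the stereo pair itself via the standard pinhole relation Z = f * B / d for rectified cameras with a horizontal baseline; the numeric values below are illustrative assumptions, not parameters from the patent.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Pinhole stereo model: depth Z = f * B / d, with focal length f
    in pixels, baseline B in metres, and disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: f = 700 px, baseline = 0.3 m, disparity = 21 px -> 10 m.
print(depth_from_disparity(21.0, 700.0, 0.3))
```

Note the inverse relation: halving the distance doubles the disparity, so stereo depth is most precise for nearby objects.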
  • the map generation unit 202 performs processing of generating a key frame to be registered in a key frame map, and metadata regarding each key frame, and registering them in the key frame map stored in the map storage unit 204.
  • the map generation unit 202 includes a position/posture estimation unit 221, a registration determination unit 222, a registration image generation unit 223, and a metadata generation unit 224.
  • the position/posture estimation unit 221 estimates the position and posture of the vehicle 11 on the basis of the map data supplied from the map data acquisition unit 201.
  • the position/posture estimation unit 221 supplies the estimation results to the registration determination unit 222 and the metadata generation unit 224.
  • the registration image generation unit 223 performs processing of generating a key frame that is a registration image of a key frame map.
  • the registration image generation unit 223 includes the movable-body-area invalidation unit 231, a CG (Computer Graphics) processing unit 232, and a shadow removal unit 233.
  • the movable-body-area invalidation unit 231 performs processing of detecting a movable body area where there is a movable body in the map image on the basis of the depth information and the like, and invalidating the movable body area.
  • the movable-body-area invalidation unit 231 supplies the map image in which the movable body area is invalidated and the depth information to the CG processing unit 232.
  • the CG processing unit 232 performs CG processing on the map image, and thereby generates a polygon mesh.
  • the CG processing unit 232 supplies the generated polygon mesh and the depth information to the shadow removal unit 233.
  • the metadata generation unit 224 generates metadata regarding the key frame. Then, the metadata generation unit 224 registers the key frame and the metadata in association with each other in the key frame map stored in the map storage unit 204.
  • map generation processing executed by the map generation system 200 will be described. This processing is started when a command for starting the map generation processing is input to the map generation system 200, and finished when a command for finishing the map generation processing is input to the map generation system 200.
  • the left camera 211L and the right camera 211R of the stereo camera 211 capture images of the front of the vehicle 11, and supply the obtained left map image and right map image to the map generation unit 202, respectively.
  • the depth sensor 212 detects the depth to each object in front of the vehicle 11, and supplies depth information indicating the detection results to the map generation unit 202.
  • Each sensor of the sensor unit 213 performs processing of detecting various types of data, and supplies sensor data indicating the detection results to the map generation unit 202.
  • As a method of estimating the position and posture of the vehicle 11, an arbitrary method can be adopted.
  • a high-performance estimation method using an RTK (Real Time Kinematic)-GNSS, a LiDAR, or the like is used.
  • a LiDAR is provided in the map data acquisition unit 201 shown in Fig. 2.
  • the position/posture estimation unit 221 supplies the estimation results of the position and posture of the vehicle 11 to the registration determination unit 222 and the metadata generation unit 224.
  • In Step S3, the registration determination unit 222 determines whether or not a key frame is to be added. For example, in the case where the amount of change of the position and posture of the vehicle 11 since a previous key frame was added is not less than a predetermined threshold value, the registration determination unit 222 determines that a key frame is to be added. Meanwhile, in the case where the amount of change of the position and posture of the vehicle 11 since a previous key frame was added is less than the predetermined threshold value, the registration determination unit 222 determines that a key frame is not to be added.
  • Alternatively, for example, the registration determination unit 222 extracts, from the key frames registered in the key frame map stored in the map storage unit 204, a key frame whose position/posture information, added as metadata, is close to the current position and posture of the vehicle 11. Next, the registration determination unit 222 calculates the degree of similarity between the extracted key frame and the latest map image. Then, in the case where the degree of similarity is less than a predetermined threshold value, i.e., there is no key frame registration record in the vicinity of the current vehicle 11, the registration determination unit 222 determines that a key frame is to be added. Meanwhile, in the case where the degree of similarity is not less than the predetermined threshold value, i.e., there is a key frame registration record in the vicinity of the current vehicle 11, the registration determination unit 222 determines that a key frame is not to be added.
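The two registration criteria above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the threshold values, the pose representation (a simple position vector), and the similarity measure are all assumptions, since the patent only states that predetermined thresholds are used.

```python
import numpy as np

# Hypothetical thresholds; the patent does not specify their magnitudes.
POSE_CHANGE_THRESHOLD = 1.0   # e.g. metres of translation
SIMILARITY_THRESHOLD = 0.8    # normalised image similarity in [0, 1]

def should_add_keyframe(current_pose, last_keyframe_pose,
                        similarity_to_nearest_keyframe):
    """Sketch of the registration decision in Step S3.

    A key frame is added either when the vehicle has moved far enough
    since the previous key frame was added, or when no sufficiently
    similar key frame is registered near the current position.
    """
    pose_change = np.linalg.norm(np.asarray(current_pose, dtype=float) -
                                 np.asarray(last_keyframe_pose, dtype=float))
    if pose_change >= POSE_CHANGE_THRESHOLD:
        return True
    # No key frame registration record in the vicinity of the vehicle.
    if similarity_to_nearest_keyframe < SIMILARITY_THRESHOLD:
        return True
    return False
```

In practice the two criteria would be evaluated against the key frame map; here they are reduced to two scalar comparisons.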
  • In the case where it is determined in Step S3 that a key frame is not to be added, the processing returns to Step S1. After that, the processing from Step S1 to Step S3 is repeatedly executed until it is determined in Step S3 that a key frame is to be added.
  • In Step S4, the movable-body-area invalidation unit 231 invalidates the movable body area of the map image.
  • the movable-body-area invalidation unit 231 performs semantic segmentation on each of the right and left map images by using the depth information and the like to detect a movable body area in which there is a movable body.
  • the movable body to be detected includes, for example, a vehicle, a human, an animal, and the like. Further, the type of the movable body to be detected can be arbitrarily set. Further, for example, the type of the movable body to be detected may be set in advance, or may be dynamically set on the basis of information acquired from a server or the like.
  • That is, the movable-body-area invalidation unit 231 invalidates the movable body area, which is one dynamic element of the right and left map images that changes as time passes.
  • Here, invalidating the movable body area of the map image means that the movable body area is not to be processed in the subsequent processing.
  • an arbitrary method can be adopted as the method of invalidating the movable body area.
  • For example, the movable-body-area invalidation unit 231 removes a movable body area from the map image by filling the movable body area with a predetermined color (e.g., black or white).
  • the movable-body-area invalidation unit 231 may interpolate an image of the background of the movable body in the removed area.
  • a map image 252 is generated from a map image 251 shown in Fig. 4.
  • a vehicle and humans are removed from the map image 251 and images of the background thereof are interpolated. Note that in Fig. 4, only one of the right and left map images is shown.
  • a map image captured at a similar position and posture at another time of day, or a key frame in a similar position and posture may be used for interpolating the image of the background.
  • a model generated by prior learning processing may be used for estimating an image of the background, and the estimated image may be interpolated.
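As a rough sketch of the invalidation step, the semantic segmentation output can be used as a mask to fill movable body pixels with a predetermined color. The class ids and the segmentation map are hypothetical stand-ins; in the system above they would come from the semantic segmentation performed by the movable-body-area invalidation unit 231, and background interpolation is omitted.

```python
import numpy as np

# Hypothetical class ids; the actual set of movable body types can be
# set arbitrarily (e.g. vehicles, humans, animals).
MOVABLE_CLASSES = {1, 2}  # e.g. 1 = vehicle, 2 = human

def invalidate_movable_areas(image, class_map, fill_value=0):
    """Fill every pixel labelled as a movable body with a predetermined
    color (here black), so the area is ignored by later processing.

    image:     H x W x 3 array
    class_map: H x W array of per-pixel semantic class ids
    """
    mask = np.isin(class_map, list(MOVABLE_CLASSES))
    result = image.copy()          # keep the original map image intact
    result[mask] = fill_value      # broadcast over the color channels
    return result
```

Filling with a constant is only one option; as noted above, the removed area could instead be interpolated from another capture or from a learned background model.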
  • In Step S5, the CG processing unit 232 generates a point cloud. Specifically, the CG processing unit 232 converts each of the right and left map images in which the movable body area has been invalidated into a point cloud on the basis of the depth information.
  • the point cloud corresponding to the left map image and the point cloud corresponding to the right map image will be respectively referred to as the left point cloud and the right point cloud hereinafter.
  • the polygon mesh corresponding to the left point cloud and the polygon mesh corresponding to the right point cloud will be respectively referred to as the left polygon mesh and the right polygon mesh hereinafter.
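The depth-to-point-cloud conversion in Step S5 can be sketched with a standard pinhole back-projection. The intrinsics (fx, fy, cx, cy) are hypothetical placeholders: the patent does not specify a camera model, and the real system would also carry color per point for the later texture mapping step.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image into a 3-D point cloud using a pinhole
    camera model with intrinsics (fx, fy, cx, cy), assumed known from
    calibration."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    # Drop invalidated pixels (zero depth), e.g. removed movable bodies.
    return points[points[:, 2] > 0]
```

The resulting left and right point clouds would then be converted into the left and right polygon meshes.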
  • Next, the CG processing unit 232 performs texture mapping. Specifically, the CG processing unit 232 performs texture mapping on the right and left polygon meshes by using color information of the right map image and color information of the left map image, respectively. At this time, the CG processing unit 232 obtains, for example, the offsets of the position and posture between the stereo camera 211 and the depth sensor 212 by calibration in advance, and maps the color information onto each polygon of the polygon mesh on the basis of those offsets. The CG processing unit 232 supplies the right and left polygon meshes on which texture mapping has been performed to the shadow removal unit 233.
  • In Step S8, the shadow removal unit 233 removes a shadow. Specifically, the shadow removal unit 233 removes shadows from the texture mapped on the right and left polygon meshes. Accordingly, the shadow in the image, which is another dynamic element of the right and left map images, is invalidated.
  • As a result, a key frame 253 including a polygon mesh obtained by removing the shadow from the map image 252 shown in Fig. 4 is generated. Note that in this figure, the key frame 253 is shown not as a polygon mesh but as a normal image in order to make the figure easy to understand.
  • For example, the shadow removal unit 233 removes shadows from a polygon mesh by using a model trained in advance by deep learning on color images.
  • the shadow removal unit 233 estimates the material properties of each polygon of a polygon mesh on the basis of the material table stored in the material table storage unit 203. Next, the shadow removal unit 233 estimates the position of a light source on the basis of the map image, the depth information, and the estimation results of the material properties. Then, the shadow removal unit 233 removes a shadow due to the influence of the light source from the texture of each polygon on the basis of the estimation results of the material properties of each polygon.
  • the shadow removal unit 233 supplies, as the right and left key frames, the right and left polygon meshes from which the shadow has been removed to the metadata generation unit 224.
  • In Step S9, the metadata generation unit 224 generates position/posture information. Specifically, the metadata generation unit 224 generates position/posture information indicating the position and posture of the vehicle 11 estimated in the processing of Step S2.
  • In Step S11, the metadata generation unit 224 adds a key frame. Specifically, the metadata generation unit 224 adds, to the key frame map stored in the map storage unit 204, the right and left key frames and metadata containing the position/posture information and the material information in association with each other.
  • a left key frame 254L generated from the left map image and a right key frame 254R generated from the right map image are added to the key frame map.
  • Although the left key frame 254L and the right key frame 254R each actually include a polygon mesh, they are each shown as a normal image in order to make the figure easy to understand.
  • Hereinafter, a pair of right and left key frames will be referred to as a key frame pair. Accordingly, a plurality of key frame pairs are registered in the key frame map.
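For illustration only, a key frame map entry as described above can be modelled as a small data structure. The field names are hypothetical, and the key frame payload (a textured polygon mesh in the system above) is left abstract.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class KeyFramePair:
    """One right/left pair of key frames plus its metadata."""
    left_key_frame: Any           # e.g. textured polygon mesh
    right_key_frame: Any
    metadata: Dict[str, Any]      # position/posture info, material info

@dataclass
class KeyFrameMap:
    pairs: List[KeyFramePair] = field(default_factory=list)

    def add(self, left, right, position_posture, material_info):
        """Register a key frame pair with its associated metadata."""
        self.pairs.append(KeyFramePair(
            left, right,
            {"position_posture": position_posture,
             "material": material_info}))
```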
  • After that, the processing returns to Step S1, and the processing of Step S1 and subsequent Steps is executed.
  • As described above, a key frame pair in which at least a part of the dynamic elements of the right and left map images is invalidated is generated on the basis of right and left map images captured at different positions and postures while the vehicle 11 runs, and is added to the key frame map. Further, metadata containing position/posture information and material information is added to each key frame pair.
  • Fig. 6 is a block diagram showing an embodiment of a self-position estimation system to which the present technology is applied.
  • the self-position estimation system 300 includes an observation data acquisition unit 301, a self-position estimation unit 302, and a map storage unit 303.
  • the observation data acquisition unit 301 acquires data (hereinafter, referred to as the observation data) to be used for self-position estimation processing.
  • the observation data acquisition unit 301 includes a stereo camera 311, a depth sensor 312, and a sensor unit 313.
  • the depth sensor 312 detects the distance (depth) to each object in front of the vehicle 10, and supplies depth information indicating the detection results to the self-position estimation unit 302.
  • the self-position estimation unit 302 performs self-position estimation processing for estimating the position and posture of the vehicle 10.
  • the self-position estimation unit 302 includes an environment estimation unit 321, a provisional position/posture estimation unit 322, a reference image generation unit 323, a depth estimation unit 324, a movable-body-area invalidation unit 325, a matching unit 326, and a final position/posture estimation unit 327.
  • the environment estimation unit 321 estimates the environment in the vicinity of the vehicle 10 on the basis of the observation data and environment information supplied from an environment information server 340.
  • the environment information contains, for example, information regarding weather in the vicinity of the vehicle 10.
  • the environment estimation unit 321 supplies the estimation results of the environment in the vicinity of the vehicle 10 to the reference image generation unit 323.
  • the provisional position/posture estimation unit 322 estimates the provisional position and the provisional posture of the vehicle 10 on the basis of the observation data.
  • the provisional position/posture estimation unit 322 supplies the estimation results of the provisional position and the provisional posture of the vehicle 10 to the reference image generation unit 323.
  • the reference image generation unit 323 extracts, from the key frame map stored in the map storage unit 303, a key frame pair (right and left key frames) generated at a position and posture close to the provisional position and the provisional posture of the vehicle 10.
  • the reference image generation unit 323 processes the extracted right and left key frames on the basis of the environment in the vicinity of the vehicle 10 (environment in which the vehicle 10 moves), and thereby generates dynamic key frames that are right and left reference images.
  • the reference image generation unit 323 supplies the generated right and left dynamic key frames to the depth estimation unit 324 and the matching unit 326.
  • The depth estimation unit 324 performs stereo matching between the right and left dynamic key frames, and thereby estimates the depth of each pixel of each of the dynamic key frames.
  • the depth estimation unit 324 supplies the estimation results of the depth of each pixel of the dynamic key frames to the final position/posture estimation unit 327.
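Stereo matching between the right and left dynamic key frames ultimately yields a per-pixel disparity; depth then follows from the standard relation depth = f · B / d. The focal length and baseline below are placeholders, and the matching itself (e.g. block matching) is omitted; only the final disparity-to-depth conversion is sketched.

```python
import numpy as np

def disparity_to_depth(disparity, focal_length_px, baseline_m):
    """Convert a disparity map (pixels) into a depth map (metres) for a
    rectified stereo pair with the given focal length and baseline."""
    disparity = np.asarray(disparity, dtype=float)
    depth = np.full_like(disparity, np.inf)   # no match -> depth unknown
    valid = disparity > 0
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth
```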
  • the movable-body-area invalidation unit 325 invalidates a movable body area of the observation image similarly to the movable-body-area invalidation unit 231 of the map generation system 200 shown in Fig. 2.
  • the movable-body-area invalidation unit 325 supplies the observation image in which the movable body area is invalidated to the matching unit 326.
  • the matching unit 326 performs matching processing on the observation image and the dynamic key frame.
  • the matching unit 326 supplies the observation image, the dynamic key frame, and the results of the matching processing to the final position/posture estimation unit 327.
  • the final position/posture estimation unit 327 estimates the final position and posture of the vehicle 10 on the basis of the observation image, the dynamic key frame, the estimation results of the depth of each pixel of the dynamic key frame, and the results of the matching processing of the observation image and the dynamic key frame.
  • the final position/posture estimation unit 327 outputs the final estimation results of the position and posture of the vehicle 10.
  • the map storage unit 303 stores the key frame map generated by the map generation system 200 shown in Fig. 2.
  • The map storage unit 303 does not necessarily need to be provided in the vehicle 10, and may be provided in an external server or the like. In the case where the map storage unit 204 and the map storage unit 303 are both provided in an external server, for example, the map storage unit 204 shown in Fig. 2 and the map storage unit 303 can be shared.
  • In Step S101, the observation data acquisition unit 301 acquires the observation data.
  • the left camera 311L and the right camera 311R of the stereo camera 311 capture images of the front of the vehicle 10, and supply the resulting left observation image and right observation image to the self-position estimation unit 302.
  • The depth sensor 312 detects the depth to each object in front of the vehicle 10, and supplies depth information indicating the detection results to the self-position estimation unit 302.
  • Each sensor of the sensor unit 313 performs processing of detecting various types of data, and supplies sensor data indicating the detection results to the self-position estimation unit 302.
  • In Step S102, the provisional position/posture estimation unit 322 estimates the provisional position and posture of the vehicle 10. Specifically, the provisional position/posture estimation unit 322 estimates the relative position and the relative posture of the current vehicle 10 with respect to the latest absolute position and absolute posture of the vehicle 10 estimated in the previous processing of Step S111 (to be described later).
  • As the method of estimating the relative position and the relative posture of the vehicle 10, an arbitrary method can be adopted.
  • For example, a method such as SLAM using the observation image and the depth information, inertial navigation using an IMU, or odometry using vehicle speed pulses can be used.
  • Further, estimation results obtained by a plurality of methods may be combined by using a statistical filter such as a Kalman filter or a particle filter.
  • Then, the provisional position/posture estimation unit 322 estimates the provisional position and the provisional posture of the current vehicle 10 by adding the estimated relative position and relative posture to the latest absolute position and absolute posture of the vehicle 10.
  • the provisional position/posture estimation unit 322 supplies the estimation results of the provisional position and the provisional posture to the reference image generation unit 323.
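The composition in Step S102 — adding the estimated relative motion onto the latest absolute pose — can be sketched in the plane. A full implementation would compose SE(3) transforms; the planar (x, y, theta) representation here is a simplification for illustration.

```python
import math

def compose_pose(abs_pose, rel_pose):
    """Compose a relative planar pose (dx, dy, dtheta), expressed in the
    vehicle frame, onto the latest absolute pose (x, y, theta) to obtain
    the provisional pose."""
    x, y, theta = abs_pose
    dx, dy, dtheta = rel_pose
    # Rotate the relative translation into the world frame, then add it.
    nx = x + dx * math.cos(theta) - dy * math.sin(theta)
    ny = y + dx * math.sin(theta) + dy * math.cos(theta)
    return (nx, ny, theta + dtheta)
```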
  • the CPU 501 loads a program stored in the storage unit 508 to the RAM 503 via the input/output interface 505 and the bus 504 and executes the program, thereby executing the series of processes described above.
  • In the case where one step includes a plurality of processes, the plurality of processes in the one step can be performed by one apparatus or shared by a plurality of apparatuses.
  • a computerized method for estimating a position of a movable object comprising: generating a dynamic reference image based on an environment and a reference image extracted from a map; and estimating a position of the movable object based on the dynamic reference image and an observation image of an area around the movable object.
  • the method according to (1) further comprising: acquiring the observation image; extracting the reference image from the map based on a provisional position of the movable object; and estimating the environment based on received sensor data around the movable object.
  • estimating the environment comprises estimating one or more of a light source, weather, a wind direction, a wind speed, or some combination thereof, around the movable object.
  • acquiring the observation image of the area around the movable object comprises acquiring a stereo pair of images, comprising a left observation image and a right observation image; extracting the reference image from the map comprises extracting a reference image pair comprising a left reference image and a right reference image from the map; generating the dynamic reference image comprises: generating a left dynamic reference image based on the estimated environment and the left reference image; and generating a right dynamic reference image based on the estimated environment and the right reference image; and estimating the position of the movable object comprises estimating the position based on the left or right dynamic reference images and the left or right observation images.
  • the instructions are further operable to cause the processor to estimate a provisional position of the movable object, wherein the provisional position comprises the provisional position of the movable object and a provisional posture of the movable object; and wherein estimating the position of the movable object comprises estimating an updated position of the movable object and an updated posture of the movable object.
  • the method further comprising estimating a provisional position of the movable object, wherein the provisional position comprises the provisional position of the movable object and a provisional posture of the movable object; and wherein estimating the position of the movable object comprises estimating an updated position of the movable object and an updated posture of the movable object.
  • A movable object configured to estimate a position of the movable object, the movable object comprising one or more processors in communication with a memory, the one or more processors being configured to execute instructions stored in the memory that cause the one or more processors to: generate a dynamic reference image based on an estimated environment and a reference image extracted from a map; estimate a position of the movable object based on the dynamic reference image and an observation image of an area around the movable object; and generate a control signal to control movement of the movable object based on the estimated position.
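The claimed flow can be summarised in a toy end-to-end sketch. Every function body here is a hypothetical stand-in (positions collapsed to scalars, the environment merely attached to the reference image); it only illustrates the order of operations: extract a reference image near the provisional position, adapt it to the estimated environment, then estimate the position by matching.

```python
def extract_reference_image(key_frame_map, provisional_position):
    # Pick the reference registered closest to the provisional position.
    return min(key_frame_map,
               key=lambda kf: abs(kf["position"] - provisional_position))

def generate_dynamic_reference(reference, environment):
    # Stand-in: tag the reference with the estimated environment
    # (re-lighting / weather adjustment in the system described above).
    return {**reference, "environment": environment}

def estimate_position(observation_image, provisional_position,
                      environment, key_frame_map):
    reference = extract_reference_image(key_frame_map, provisional_position)
    dynamic_reference = generate_dynamic_reference(reference, environment)
    # Stand-in for matching: nudge the provisional position toward the
    # position of the matched dynamic reference.
    return 0.5 * (provisional_position + dynamic_reference["position"])
```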

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Automation & Control Theory (AREA)
  • Software Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

Methods and apparatus for estimating a position of a movable object are described. A dynamic reference image is generated based on an environment and a reference image extracted from a map. A position of the movable object is estimated based on the dynamic reference image and an observation image of an area around the movable object.
PCT/JP2018/031155 2017-08-29 2018-08-23 Appareil de traitement d'information, procédé de traitement d'information, programme et objet mobile WO2019044652A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/640,970 US11288860B2 (en) 2017-08-29 2018-08-23 Information processing apparatus, information processing method, program, and movable object
EP18769241.3A EP3676752A1 (fr) 2017-08-29 2018-08-23 Appareil de traitement d'information, procédé de traitement d'information, programme et objet mobile

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-164532 2017-08-29
JP2017164532A JP7043755B2 (ja) 2017-08-29 2017-08-29 情報処理装置、情報処理方法、プログラム、及び、移動体

Publications (1)

Publication Number Publication Date
WO2019044652A1 true WO2019044652A1 (fr) 2019-03-07

Family

ID=63557664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/031155 WO2019044652A1 (fr) 2017-08-29 2018-08-23 Appareil de traitement d'information, procédé de traitement d'information, programme et objet mobile

Country Status (4)

Country Link
US (1) US11288860B2 (fr)
EP (1) EP3676752A1 (fr)
JP (1) JP7043755B2 (fr)
WO (1) WO2019044652A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069593A (zh) * 2019-04-24 2019-07-30 百度在线网络技术(北京)有限公司 图像处理方法及***、服务器、计算机可读介质
FR3098326A1 (fr) * 2019-07-02 2021-01-08 Psa Automobiles Sa Procédé de détection d’une anomalie affectant une partie extérieure d’un véhicule automobile
FR3098325A1 (fr) * 2019-07-02 2021-01-08 Psa Automobiles Sa Procédé de détection d’une anomalie affectant une partie extérieure d’un véhicule automobile
DE102020109789A1 (de) 2020-04-08 2021-10-14 Valeo Schalter Und Sensoren Gmbh Verfahren zum Durchführen einer Selbstlokalisierung eines Fahrzeugs auf der Grundlage einer reduzierten digitalen Umgebungskarte, Computerprogrammprodukt sowie ein Selbstlokalisierungssystem

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6915512B2 (ja) * 2017-11-28 2021-08-04 トヨタ自動車株式会社 サーバ装置、故障車両推定方法および故障車両推定プログラム
WO2020008758A1 (fr) * 2018-07-06 2020-01-09 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JP7132037B2 (ja) * 2018-08-29 2022-09-06 フォルシアクラリオン・エレクトロニクス株式会社 車載処理装置
JP2020149317A (ja) 2019-03-13 2020-09-17 株式会社デンソー 車両用装置
CN112001968B (zh) * 2019-05-27 2022-07-15 浙江商汤科技开发有限公司 相机定位方法及装置、存储介质
JP7298343B2 (ja) * 2019-07-01 2023-06-27 日本電信電話株式会社 故障影響推定装置、故障影響推定方法、及びプログラム
EP4011764A4 (fr) 2019-10-25 2022-09-28 Sony Group Corporation Dispositif de traitement d'informations, procédé de traitement d'informations, programme et corps volant
KR20210071584A (ko) * 2019-12-06 2021-06-16 팅크웨어(주) 위성 항법 시스템 측정 위치의 보정을 통해 위치를 판단하는 위치 측정 방법, 위치 측정 장치 및 이를 수행하는 전자 기기
JP7173062B2 (ja) 2020-01-23 2022-11-16 トヨタ自動車株式会社 変化点検出装置及び地図情報配信システム
JP2023037043A (ja) * 2020-02-19 2023-03-15 株式会社Nttドコモ マップデータ生成装置及び測位装置
WO2021224955A1 (fr) * 2020-05-07 2021-11-11 三菱電機株式会社 Dispositif d'estimation de propre position, dispositif de commande de vol, dispositif de génération d'image, système de satellite artificiel et procédé d'estimation de propre position
KR20220033924A (ko) * 2020-09-10 2022-03-17 삼성전자주식회사 증강 현실 장치 및 그 제어 방법
WO2022068818A1 (fr) * 2020-09-29 2022-04-07 Chan Chun Hei Appareil et procédé d'étalonnage de scanner tridimensionnel et de raffinement de données de nuage de points
JP2022069007A (ja) * 2020-10-23 2022-05-11 株式会社アフェクション 情報処理システム、情報処理方法および情報処理プログラム
JP7254222B1 (ja) 2022-01-20 2023-04-07 三菱電機エンジニアリング株式会社 環境地図生成装置、環境地図生成方法およびプログラム
TWI831440B (zh) * 2022-10-26 2024-02-01 財團法人車輛研究測試中心 電腦視覺式車輛定位融合系統及方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1975565A2 (fr) * 2007-03-30 2008-10-01 Aisin AW Co., Ltd. Appareil de collecte d'informations caractéristiques et procédé de collecte d'informations caractéristiques
US20090245657A1 (en) * 2008-04-01 2009-10-01 Masamichi Osugi Image search apparatus and image processing apparatus
GB2495807A (en) * 2011-10-20 2013-04-24 Ibm Activating image processing in a vehicle safety system based on location information
US20140214255A1 (en) * 2013-01-25 2014-07-31 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3575679B2 (ja) * 2000-03-31 2004-10-13 日本電気株式会社 顔照合方法と該照合方法を格納した記録媒体と顔照合装置
US7417738B2 (en) * 2004-01-27 2008-08-26 Tradewind Scientific Ltd. Determining surface properties of a roadway or runway from a moving vehicle
JP2005326168A (ja) * 2004-05-12 2005-11-24 Fuji Photo Film Co Ltd 運転支援システム、車両、および運転支援方法
JP5375249B2 (ja) * 2009-03-25 2013-12-25 株式会社Ihi 移動経路計画装置、移動体制御装置及び移動体
JP5278108B2 (ja) * 2009-03-30 2013-09-04 マツダ株式会社 移動体検出システム及び移動体検出方法
JP5062498B2 (ja) * 2010-03-31 2012-10-31 アイシン・エィ・ダブリュ株式会社 風景マッチング用参照データ生成システム及び位置測位システム
US8532336B2 (en) * 2010-08-17 2013-09-10 International Business Machines Corporation Multi-mode video event indexing
JP6182866B2 (ja) * 2012-03-21 2017-08-23 株式会社リコー 校正装置、距離計測装置及び車両
WO2014007175A1 (fr) * 2012-07-03 2014-01-09 クラリオン株式会社 Dispositif de reconnaissance de l'environnement monté dans un véhicule
WO2014065856A1 (fr) * 2012-10-25 2014-05-01 Massachusetts Institute Of Technology Localisation de véhicule à l'aide d'un radar pénétrant la surface
JP6432182B2 (ja) * 2014-07-02 2018-12-05 富士通株式会社 サービス提供装置、方法、及びプログラム
CA3067160A1 (fr) * 2015-02-10 2016-08-18 Mobileye Vision Technologies Ltd. Carte eparse pour la navigation d'un vehicule autonome
US11137255B2 (en) * 2015-08-03 2021-10-05 Tomtom Global Content B.V. Methods and systems for generating and using localization reference data
US20170359561A1 (en) * 2016-06-08 2017-12-14 Uber Technologies, Inc. Disparity mapping for an autonomous vehicle
EP3784989B1 (fr) * 2018-05-15 2024-02-14 Mobileye Vision Technologies Ltd. Systèmes et procédés de navigation de véhicule autonome

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1975565A2 (fr) * 2007-03-30 2008-10-01 Aisin AW Co., Ltd. Appareil de collecte d'informations caractéristiques et procédé de collecte d'informations caractéristiques
US20090245657A1 (en) * 2008-04-01 2009-10-01 Masamichi Osugi Image search apparatus and image processing apparatus
GB2495807A (en) * 2011-10-20 2013-04-24 Ibm Activating image processing in a vehicle safety system based on location information
US20140214255A1 (en) * 2013-01-25 2014-07-31 Google Inc. Modifying behavior of autonomous vehicles based on sensor blind spots and limitations
JP2016520882A (ja) 2013-01-25 2016-07-14 グーグル インコーポレイテッド センサー検出不能場所及びセンサーの制限に基づく自律走行車両の動作の修正

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069593A (zh) * 2019-04-24 2019-07-30 百度在线网络技术(北京)有限公司 图像处理方法及***、服务器、计算机可读介质
CN110069593B (zh) * 2019-04-24 2021-11-12 百度在线网络技术(北京)有限公司 图像处理方法及***、服务器、计算机可读介质
FR3098326A1 (fr) * 2019-07-02 2021-01-08 Psa Automobiles Sa Procédé de détection d’une anomalie affectant une partie extérieure d’un véhicule automobile
FR3098325A1 (fr) * 2019-07-02 2021-01-08 Psa Automobiles Sa Procédé de détection d’une anomalie affectant une partie extérieure d’un véhicule automobile
DE102020109789A1 (de) 2020-04-08 2021-10-14 Valeo Schalter Und Sensoren Gmbh Verfahren zum Durchführen einer Selbstlokalisierung eines Fahrzeugs auf der Grundlage einer reduzierten digitalen Umgebungskarte, Computerprogrammprodukt sowie ein Selbstlokalisierungssystem

Also Published As

Publication number Publication date
US11288860B2 (en) 2022-03-29
EP3676752A1 (fr) 2020-07-08
JP2019045892A (ja) 2019-03-22
US20200175754A1 (en) 2020-06-04
JP7043755B2 (ja) 2022-03-30

Similar Documents

Publication Publication Date Title
US11288860B2 (en) Information processing apparatus, information processing method, program, and movable object
US11531354B2 (en) Image processing apparatus and image processing method
US11450026B2 (en) Information processing apparatus, information processing method, and mobile object
JP7259749B2 (ja) 情報処理装置、および情報処理方法、プログラム、並びに移動体
CN108139211B (zh) 用于测量的装置和方法以及程序
JP7143857B2 (ja) 情報処理装置、情報処理方法、プログラム、及び、移動体
US11363235B2 (en) Imaging apparatus, image processing apparatus, and image processing method
JP7320001B2 (ja) 情報処理装置、情報処理方法、プログラム、移動体制御装置、及び、移動体
US11501461B2 (en) Controller, control method, and program
US20200263994A1 (en) Information processing apparatus, information processing method, program, and moving body
US20200191975A1 (en) Information processing apparatus, self-position estimation method, and program
US20230215196A1 (en) Information processing apparatus, information processing method, and program
US20220058428A1 (en) Information processing apparatus, information processing method, program, mobile-object control apparatus, and mobile object
US20200230820A1 (en) Information processing apparatus, self-localization method, program, and mobile body
KR20200136398A (ko) 노광 제어 장치, 노광 제어 방법, 프로그램, 촬영 장치, 및 이동체
CN115485723A (zh) 信息处理装置、信息处理方法和程序
US20220277556A1 (en) Information processing device, information processing method, and program
US20220292296A1 (en) Information processing device, information processing method, and program
JP7483627B2 (ja) 情報処理装置、情報処理方法、プログラム、移動体制御装置、及び、移動体
WO2022059489A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations, et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18769241

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018769241

Country of ref document: EP

Effective date: 20200330