WO2018142533A1 - Position and orientation estimation device and position and orientation estimation method - Google Patents

Position and orientation estimation device and position and orientation estimation method

Info

Publication number
WO2018142533A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
recognition
azimuth
distance marker
orientation
Prior art date
Application number
PCT/JP2017/003757
Other languages
English (en)
Japanese (ja)
Inventor
隆博 加島
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2017/003757 priority Critical patent/WO2018142533A1/fr
Priority to JP2018562138A priority patent/JP6479296B2/ja
Priority to TW106113892A priority patent/TW201830336A/zh
Publication of WO2018142533A1 publication Critical patent/WO2018142533A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Definitions

  • the present invention relates to a position / orientation estimation apparatus and a position / orientation estimation method for estimating the position and orientation of an estimation object.
  • Patent Literature 1 describes a vehicle position detection device that corrects a vehicle position using position information of a kilometer post (distance marker). This vehicle position detection device specifies, among pre-stored position information and feature information of kilometer posts, the position information of the kilometer post corresponding to the feature information recognized from a photographed image, and corrects the vehicle position using this position information.
  • On a road where the lanes are separated into an up line and a down line, a kilometer post is installed every 100 meters starting from the start point of the down line and the end point of the up line.
  • A kilometer post indicates the number of kilometers, which is the distance from the starting point set on the road to the kilometer post, so the same number of kilometers is marked on a kilometer post on the up line and on the kilometer post on the down line at the corresponding position.
  • In the device of Patent Literature 1, the number of kilometers is the feature information. Therefore, although the position information of the kilometer post on the up line should be specified, there is a possibility that the position information of the kilometer post on the down line at the corresponding position is erroneously specified instead.
  • The present invention solves the above problem, and aims to provide a position and orientation estimation apparatus and a position and orientation estimation method capable of improving the estimation accuracy of the position and orientation of an estimation object.
  • a position and orientation estimation apparatus includes an object search unit, an orientation calculation unit, an object identification unit, and a position and orientation estimation unit.
  • The object search unit searches a database, in which the position coordinates of reference points set for recognition objects in a three-dimensional space are associated with the marking information indicated by the recognition objects, for a recognition object corresponding to the marking information recognized from the photographed image.
  • The azimuth calculation unit calculates the azimuth of the recognition object based on the position coordinates of the reference points set for the recognition object searched by the object search unit.
  • The object specifying unit specifies the recognition object in the captured image from among the recognition objects searched by the object search unit, based on the shooting azimuth of the estimation object that captured the image and the azimuth of the recognition object calculated by the azimuth calculation unit.
  • The position and orientation estimation unit estimates the position and orientation of the estimation object based on the correspondence between the position coordinates, in the three-dimensional space, of the reference points set for the recognition object specified by the object specifying unit and their position coordinates in the captured image.
  • According to the present invention, the recognition object in the photographed image can be accurately specified, from among recognition objects indicating the same marking information, based on the shooting azimuth of the estimation object and the azimuth of the recognition object.
  • The position and orientation of the estimation object can then be accurately estimated based on the correspondence between the position coordinates, in the three-dimensional space, of the reference points of the accurately specified recognition object and their position coordinates in the captured image.
  • FIG. 3A is a block diagram showing a hardware configuration for realizing the function of the position / orientation estimation apparatus according to Embodiment 1.
  • FIG. 3B is a block diagram illustrating a hardware configuration that executes software that implements the functions of the position and orientation estimation apparatus according to Embodiment 1.
  • FIG. 4 is a flowchart showing the operation of the position and orientation estimation apparatus according to Embodiment 1. FIG. 5 is a diagram showing an outline of the processing for recognizing a distance marker from a captured image.
  • FIG. 1 is a block diagram showing a functional configuration of a position / orientation estimation apparatus 1 according to Embodiment 1 of the present invention.
  • the position / orientation estimation device 1 is connected to each of the imaging device 2, the sensor device 3, and the distance marker database 4, and estimates the position and orientation of the imaging device 2.
  • the imaging device 2 is an estimation target whose position and orientation are estimated by the position / orientation estimation device 1, and is installed at a predetermined location of the vehicle to image the periphery of the vehicle.
  • The position indicates the position of the estimation object in the three-dimensional space, and the posture is the tilt angle or rotation angle of the estimation object in the three-dimensional space.
  • the imaging device 2 is assumed to be an in-vehicle camera that images the front of the vehicle.
  • the sensor device 3 is a sensor mounted on the vehicle, and is a three-axis acceleration sensor and a three-axis geomagnetic sensor.
  • the acceleration sensor detects acceleration along the x-axis, acceleration along the y-axis, and acceleration along the z-axis in a three-dimensional space.
  • the geomagnetic sensor detects geomagnetism along each of the x-axis, y-axis, and z-axis. In the vehicle, the positional relationship between the imaging device 2 and the sensor device 3 is fixed.
  • the distance marker database 4 is a database in which the position coordinates of the reference point set in the distance marker in the three-dimensional space are associated with the number of kilometers of the distance marker.
  • the distance marker is a recognition object whose number of kilometers is recognized from the photographed image photographed by the photographing device 2.
  • the number of kilometers is the marking information marked on the distance marker that is the recognition object, and indicates the distance from the starting point set on the road to the distance marker.
  • FIG. 2 is a diagram showing an example of the distance marker database 4.
  • The distance marker is a rectangular sign board, and the reference points are the four corner points of the surface on which the number of kilometers is marked.
  • The distance markers registered in the distance marker database 4 shown in FIG. 2 are distance markers installed on a road whose lanes are separated into an up line and a down line.
  • Distance markers indicating the same number of kilometers are therefore a distance marker on the up-line side and the distance marker on the down-line side at the corresponding position.
  • the reference points registered in the distance marker database 4 are points in a three-dimensional space representing the real space.
  • The three-dimensional space is represented by a three-dimensional coordinate system having a specific point on the earth as its origin, which defines, for example, an x-axis in the east-west direction (positive in the east direction), a y-axis in the north-south direction (positive in the north direction), and a z-axis in the elevation direction (positive in the height direction of the distance marker).
  • the position coordinates of the four corner points are distances along the x-axis, y-axis, and z-axis from the origin to the four corner points, expressed in meters.
  • the three-dimensional coordinate system representing the real space is referred to as a “world coordinate system”.
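  • For illustration only (this sketch is not part of the original publication), the distance marker database 4 and the search by kilometer number can be pictured as follows in Python; the field names, lane labels, and coordinate values are hypothetical.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class DistanceMarker:
        kilometers: float                                 # marking information, e.g. 123.4
        lane: str                                         # "up" or "down" (hypothetical label)
        corners_world: List[Tuple[float, float, float]]   # four reference points (x, y, z) in metres

    # Hypothetical contents: two markers on opposite lanes share the kilometer number 123.4.
    DISTANCE_MARKER_DB = [
        DistanceMarker(123.4, "up",   [(10.0, 50.0, 1.2), (10.5, 50.0, 1.2), (10.5, 50.0, 0.8), (10.0, 50.0, 0.8)]),
        DistanceMarker(123.4, "down", [(13.5, 60.0, 1.2), (13.0, 60.0, 1.2), (13.0, 60.0, 0.8), (13.5, 60.0, 0.8)]),
    ]

    def search_markers(kilometers: float, db=DISTANCE_MARKER_DB) -> List[DistanceMarker]:
        """Role of the distance marker search unit 7: return every marker with the recognized number."""
        return [m for m in db if abs(m.kilometers - kilometers) < 1e-6]

    print(len(search_markers(123.4)))  # -> 2 (the up-line and down-line markers)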
  • the position / orientation estimation apparatus 1 includes a distance marker recognition unit 5, an orientation estimation unit 6, a distance marker search unit 7, an orientation calculation unit 8, a distance marker identification unit 9, and a position / orientation estimation unit 10.
  • The distance marker recognition unit 5 is a component corresponding to the object recognition unit, and recognizes, from the captured image captured by the imaging device 2, the number of kilometers marked on the distance marker and the position coordinates, in the two-dimensional coordinate system of the captured image, of the reference points set for the distance marker. For example, the distance marker recognition unit 5 recognizes the number of kilometers of the distance marker captured in the image and the position coordinates of the reference points by performing pattern recognition processing or character recognition processing on the two-dimensional captured image.
  • the two-dimensional coordinate system of the captured image is referred to as an “image coordinate system”.
  • the orientation estimation unit 6 estimates the imaging orientation of the imaging device 2 based on the detection information of the sensor device 3.
  • The shooting azimuth of the imaging device 2 is the azimuth in which the viewpoint of the imaging device 2 is directed.
  • This azimuth may be expressed as an angle, for example with north as 0 degrees, east as 90 degrees, south as 180 degrees, and west as 270 degrees, or it may be expressed in radians.
  • the distance marker search unit 7 is a component corresponding to the object search unit, and searches the distance marker database 4 for a distance marker corresponding to the number of kilometers recognized by the distance marker recognition unit 5 from the photographed image.
  • In the distance marker database 4 shown in FIG. 2, a distance marker on the up-line side and the distance marker on the down-line side at the corresponding position have the same number of kilometers, so two distance markers corresponding to the recognized number are retrieved. For example, if the number of kilometers recognized by the distance marker recognition unit 5 is 123.4, the two distance markers corresponding to this number are retrieved, and the position coordinates of the reference points set for each of these distance markers are output to the azimuth calculation unit 8.
  • the azimuth calculating unit 8 calculates the normal direction of the surface on which the number of kilometers is indicated as the azimuth of the distance marker based on the position coordinates of the reference point set in the distance marker searched by the distance marker searching unit 7.
  • The azimuth of the distance marker may be an angle indicating the azimuth of the distance marker with respect to the imaging device 2. For example, as with the azimuth estimation unit 6, the angle may be expressed in degrees with north as 0 degrees, east as 90 degrees, south as 180 degrees, and west as 270 degrees, or it may be expressed in radians.
  • The distance marker specifying unit 9 is a component corresponding to the object specifying unit, and specifies the distance marker in the captured image from among the distance markers searched by the distance marker search unit 7, based on the shooting azimuth of the imaging device 2 and the azimuth of the distance marker.
  • the shooting direction of the shooting apparatus 2 is the direction estimated by the direction estimation unit 6, and the direction of the distance marker is the direction calculated by the direction calculation unit 8.
  • The position / orientation estimation unit 10 estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates, in the three-dimensional space, of the reference points set for the distance marker specified by the distance marker specifying unit 9 and their position coordinates in the captured image. For example, the position / orientation estimation unit 10 solves a PnP (Perspective-n-Point) problem using the internal parameters of the imaging device 2, based on the correspondence between the position coordinates of the reference points in the world coordinate system and those in the image coordinate system, thereby estimating the position and orientation of the imaging device 2.
  • the internal parameters are information including the focal length and principal point of the photographing lens provided in the photographing apparatus 2.
  • the distance marker database 4 may be constructed on a storage area of an external storage device provided separately from the position and orientation estimation device 1.
  • In this case, the distance marker search unit 7 searches for the distance marker by accessing the distance marker database 4 in the external storage device via a communication line such as the Internet or an intranet.
  • the distance marker recognizing unit 5 may be a constituent element included in the photographing apparatus 2.
  • In this case, the distance marker recognition unit 5 is removed from the position / orientation estimation device 1, the marking information recognized from the captured image is output from the imaging device 2 to the distance marker search unit 7, and the position coordinates of the reference points recognized from the captured image are output from the imaging device 2 to the position / orientation estimation unit 10.
  • the direction estimation unit 6 may be a component included in the sensor device 3.
  • In this case, the azimuth estimation unit 6 is removed from the position / orientation estimation device 1, and information indicating the shooting azimuth of the imaging device 2 is output from the sensor device 3 to the distance marker specifying unit 9.
  • FIG. 3A is a block diagram showing a hardware configuration for realizing the function of the position / orientation estimation apparatus 1.
  • the imaging device 100, the sensor device 101, the storage device 102, and the processing circuit 103 are connected to each other by a bus.
  • FIG. 3B is a block diagram illustrating a hardware configuration that executes software that implements the functions of the position and orientation estimation apparatus 1.
  • the imaging device 100, the sensor device 101, the storage device 102, the CPU (Central Processing Unit) 104, and the memory 105 are connected to each other by a bus.
  • The imaging device 100 is the imaging device 2 that captures the periphery of the vehicle on which the position and orientation estimation device 1 is mounted, and the sensor device 101 is the sensor that constitutes the sensor device 3.
  • The storage device 102 stores the distance marker database 4.
  • the storage device 102 is realized by, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an HDD (Hard Disk Drive), or the like, and may be a storage device that combines these. Further, a part or all of the storage area of the storage device 102 may be provided in the external device.
  • In this case, the position / orientation estimation device 1 communicates with the external device via a communication line such as the Internet or an intranet, and the distance marker search process is executed.
  • the functions of the distance marker recognition unit 5, the orientation estimation unit 6, the distance marker search unit 7, the orientation calculation unit 8, the distance marker identification unit 9, and the position / orientation estimation unit 10 are realized by a processing circuit.
  • the position / orientation estimation apparatus 1 includes a processing circuit for executing these functions.
  • the processing circuit may be dedicated hardware or a CPU that executes a program stored in a memory.
  • When the processing circuit is dedicated hardware, the processing circuit 103 may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • The functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position / orientation estimation unit 10 may each be realized by a separate processing circuit, or the functions may be collectively realized by a single processing circuit.
  • When the processing circuit is the CPU 104 shown in FIG. 3B, the functions of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position and orientation estimation unit 10 are realized by software, firmware, or a combination of software and firmware. The software and the firmware are described as programs and stored in the memory 105.
  • The CPU 104 realizes each function by reading and executing the programs stored in the memory 105. That is, the position / orientation estimation device 1 includes the memory 105 for storing programs that, when executed by the CPU 104, result in the execution of the processing from step ST1 to step ST9 shown in FIG. 4. These programs cause a computer to execute the procedures or methods of the distance marker recognition unit 5, the azimuth estimation unit 6, the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position and orientation estimation unit 10.
  • The memory 105 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM, a ROM, a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically Erasable Programmable ROM), or to a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
  • For example, the functions of the distance marker recognition unit 5 and the azimuth estimation unit 6 may be realized by a processing circuit as dedicated hardware, while the functions of the distance marker search unit 7, the azimuth calculation unit 8, the distance marker specifying unit 9, and the position and orientation estimation unit 10 may be realized by the CPU 104 executing the programs stored in the memory 105.
  • the processing circuit can realize the above-described functions by hardware, software, firmware, or a combination thereof.
  • FIG. 4 is a flowchart showing the operation of the position / orientation estimation apparatus 1 and shows a series of processes from obtaining a captured image in front of the vehicle until the position and orientation of the imaging apparatus 2 are estimated.
  • When the distance marker recognition unit 5 receives a captured image of the area in front of the vehicle from the imaging device 2 (step ST1), it recognizes the distance marker from the captured image (step ST2).
  • FIG. 5 is a diagram showing an outline of processing for recognizing the distance marker 401 from the captured image 2A.
  • In FIG. 5, the pixel at the upper left corner of the captured image 2A is the origin (0, 0), the axis extending rightward from the origin is defined as the x-axis, and the axis extending downward from the origin is defined as the y-axis. Each point on the captured image 2A is expressed in units of pixels and represents a position in the image coordinate system.
  • the definition of the origin and the coordinate axes described above is merely an example, and the present invention is not limited to this.
  • When the distance marker recognition unit 5 recognizes the distance marker 401 in the captured image 2A by performing image processing on the captured image 2A (step ST3; YES), it outputs the number of kilometers indicated on the distance marker 401 to the distance marker search unit 7. In FIG. 5, the number information “123.4” is output to the distance marker search unit 7. The distance marker recognition unit 5 also recognizes the position coordinates of the four corner points of the distance marker 401 in the image coordinate system and outputs them to the position and orientation estimation unit 10. The four corner points of the distance marker 401 are the points on the surface on which the number of kilometers is marked, namely the points 402a, 402b, 402c, and 402d shown in FIG. 5. On the other hand, when no distance marker appears in the captured image, the distance marker recognition unit 5 cannot recognize a distance marker (step ST3; NO), and the process returns to step ST1.
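  • As a rough, non-authoritative sketch of what the distance marker recognition unit 5 does (the publication does not disclose a specific algorithm), the four corner points of a roughly rectangular, frontally viewed sign can be located with standard OpenCV operations; reading the kilometer digits themselves would require character recognition or a trained detector and is left as a stub here.

    import cv2

    def recognize_marker(image_bgr):
        """Find a rectangular sign and return (kilometers, corner pixels); kilometers stays None in this sketch."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        for contour in sorted(contours, key=cv2.contourArea, reverse=True):
            approx = cv2.approxPolyDP(contour, 0.02 * cv2.arcLength(contour, True), True)
            if len(approx) == 4 and cv2.contourArea(approx) > 500:
                corners = approx.reshape(4, 2).astype(float)   # reference points in the image coordinate system
                kilometers = None   # placeholder: character recognition of the digits would go here
                return kilometers, corners
        return None, None           # corresponds to step ST3; NO (no marker recognized)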
  • Next, the distance marker search unit 7 searches the distance marker database 4 for the distance markers corresponding to the number of kilometers recognized from the captured image 2A by the distance marker recognition unit 5 (step ST4). For example, when the distance marker search unit 7 searches the distance marker database 4 shown in FIG. 2 based on “123.4”, the number of kilometers input from the distance marker recognition unit 5, the information on the two distance markers corresponding to the number of kilometers “123.4” is read out and output to the azimuth calculation unit 8.
  • the above-mentioned distance marker information is position coordinates in the world coordinate system of each point at the four corners of the distance marker.
  • The azimuth calculation unit 8 calculates the azimuth of each distance marker with respect to the imaging device 2 based on the position coordinates, in the world coordinate system, of the four corner points of the distance marker input from the distance marker search unit 7 (step ST5). As described above, when two distance markers corresponding to the number of kilometers “123.4” recognized from the captured image 2A are retrieved from the distance marker database 4, the azimuth calculation unit 8 calculates the azimuth of each of these distance markers. The azimuth calculation unit 8 outputs the calculated azimuth of each distance marker, together with the position coordinates in the world coordinate system of the four corner points (reference points) of the distance marker for which the azimuth was calculated, to the distance marker specifying unit 9.
  • FIG. 6 is a diagram showing an outline of processing for calculating the azimuth of the distance marker 601 and shows a state where the distance marker 601 is viewed from the sky.
  • the positive direction of the x axis is the east direction
  • the positive direction of the y axis is the north direction.
  • a point 602 is a point at the upper left corner of the distance marker 601
  • a point 603 is a point at the upper right corner of the distance marker 601. Points 602 and 603 are the reference points described above.
  • An arrow 604 indicates the installation direction of the distance marker 601, that is, the normal direction of the surface on which the kilometer is marked.
  • the angle 605 is an angle indicating the azimuth of the distance marker 601 calculated by the azimuth calculation unit 8.
  • The azimuth calculation unit 8 calculates the angle 605 using the position coordinates of the upper left corner point 602 and the upper right corner point 603 of the distance marker 601 and an inverse trigonometric function.
  • the azimuth calculation unit 8 calculates the azimuth for all the distance markers searched by the distance marker search unit 7 and outputs the calculated azimuth to the distance marker identification unit 9.
  • In the above description, the azimuth of the distance marker 601 is calculated using the position coordinates of the upper left corner point 602 and the upper right corner point 603 of the distance marker 601; instead, the azimuth of the distance marker 601 may be calculated using the position coordinates of the lower left corner point and the lower right corner point, or using a combination of the upper left corner point, the upper right corner point, the lower left corner point, and the lower right corner point.
  • The azimuth calculation unit 8 may also calculate the azimuth of the distance marker based on the cross product of the vector from the upper left corner point to the lower left corner point and the vector from the upper left corner point to the upper right corner point.
  • In this case, the azimuth calculation unit 8 calculates a vector perpendicular to the surface on which the number of kilometers is marked, based on the value of the cross product, and calculates the azimuth of the distance marker using the calculated vector and an inverse trigonometric function.
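  • A minimal numerical sketch of the two calculations described above (not taken from the publication; the corner ordering and the sign of the normal are assumptions of this sketch), with x as the east axis and y as the north axis of the world coordinate system and the azimuth returned in degrees clockwise from north:

    import math
    import numpy as np

    def azimuth_from_upper_corners(upper_left, upper_right):
        """Azimuth of the marker face normal from the two upper corner points (x east, y north)."""
        ex, ey = upper_right[0] - upper_left[0], upper_right[1] - upper_left[1]
        nx, ny = ey, -ex                      # top edge rotated by -90 degrees gives the assumed outward normal
        return math.degrees(math.atan2(nx, ny)) % 360.0

    def azimuth_from_cross_product(upper_left, upper_right, lower_left):
        """Same azimuth via the cross product of two edge vectors (3-D reference points)."""
        v_down = np.subtract(lower_left, upper_left)
        v_right = np.subtract(upper_right, upper_left)
        normal = np.cross(v_down, v_right)    # perpendicular to the surface on which the kilometers are marked
        return math.degrees(math.atan2(normal[0], normal[1])) % 360.0

    # A marker whose face points due south:
    print(azimuth_from_upper_corners((10.0, 50.0), (10.5, 50.0)))                                   # -> 180.0
    print(azimuth_from_cross_product((10.0, 50.0, 1.2), (10.5, 50.0, 1.2), (10.0, 50.0, 0.8)))      # -> 180.0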
  • Next, the azimuth estimation unit 6 estimates the shooting azimuth of the imaging device 2 based on the sensor values acquired from the sensor device 3 (step ST7).
  • Specifically, the azimuth estimation unit 6 calculates the attitude of the imaging device 2 with respect to the ground based on the gravitational acceleration detected by the acceleration sensor, and estimates the shooting azimuth of the imaging device 2 based on the calculated attitude and the geomagnetic values detected by the geomagnetic sensor.
  • Alternatively, the azimuth estimation unit 6 may estimate the shooting azimuth of the imaging device 2 using only the sensor information of the geomagnetic sensor.
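  • A minimal sketch of the tilt-compensated azimuth estimation described above (not from the publication; it assumes the accelerometer output points toward the ground when the device is stationary and that the camera's optical axis is the sensor y-axis):

    import numpy as np

    def shooting_azimuth(accel, mag, forward=(0.0, 1.0, 0.0)):
        """Shooting azimuth in degrees (north = 0, east = 90) from 3-axis accelerometer and geomagnetic values."""
        down = np.asarray(accel, dtype=float)
        down /= np.linalg.norm(down)             # attitude of the device with respect to the ground
        east = np.cross(down, np.asarray(mag, dtype=float))
        east /= np.linalg.norm(east)             # horizontal east direction expressed in the sensor frame
        north = np.cross(east, down)             # horizontal north direction expressed in the sensor frame
        f = np.asarray(forward, dtype=float)
        return float(np.degrees(np.arctan2(f @ east, f @ north)) % 360.0)

    # Device held level, magnetic field pointing north and slightly downward: forward axis faces north.
    print(shooting_azimuth(accel=(0.0, 0.0, -9.8), mag=(0.0, 30.0, -40.0)))  # -> 0.0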
  • FIG. 7 is a diagram showing an outline of the distance marker specifying process.
  • A shooting azimuth 701 is the shooting azimuth of the imaging device 2 estimated by the azimuth estimation unit 6, and its azimuth angle is 60 degrees.
  • An azimuth 702 and an azimuth 703 shown with broken lines are the azimuths of the two retrieved distance markers calculated by the azimuth calculation unit 8.
  • The azimuth angle of the azimuth 702 is 50 degrees.
  • the distance marker specifying unit 9 inverts the azimuth angle of the imaging azimuth 701 by 180 degrees in order to facilitate comparison with the azimuth of the distance marker.
  • an azimuth 704 is an azimuth obtained by inverting the azimuth angle of the shooting azimuth 701 by 180 degrees, and the azimuth angle is 240 degrees.
  • the distance marker specifying unit 9 compares the azimuth 704 obtained by inverting the imaging azimuth 701 by 180 degrees with the azimuths 702 and 703 of the two distance markers.
  • the distance marker specifying unit 9 calculates the absolute value 705 of the difference between the azimuth angle of the azimuth 704 and the azimuth angle of the azimuth 702 and calculates the absolute value 706 of the difference between the azimuth angle of the azimuth 704 and the azimuth angle of the azimuth 703.
  • the absolute value 705 and the absolute value 706 are compared in magnitude.
  • the distance marker specifying unit 9 specifies the distance marker having the smallest absolute value of the azimuth angle difference as the distance marker captured by the imaging device 2, that is, the distance marker in the captured image. Since the absolute value 705 is 170 degrees and the absolute value 706 is 20 degrees, the distance marker of the azimuth 703 is specified as the distance marker in the captured image.
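  • A minimal sketch of the comparison performed by the distance marker specifying unit 9 (the 220-degree value below is not stated in the publication; it is chosen only so that the difference from the inverted shooting azimuth reproduces the 20 degrees given for the absolute value 706):

    def angular_difference(a_deg, b_deg):
        """Absolute difference between two azimuth angles, wrapped into the range 0 to 180 degrees."""
        d = abs(a_deg - b_deg) % 360.0
        return min(d, 360.0 - d)

    def identify_marker(shooting_azimuth_deg, marker_azimuths_deg):
        """Invert the shooting azimuth by 180 degrees and pick the candidate with the smallest difference."""
        inverted = (shooting_azimuth_deg + 180.0) % 360.0
        return min(range(len(marker_azimuths_deg)),
                   key=lambda i: angular_difference(inverted, marker_azimuths_deg[i]))

    # Shooting azimuth 701 = 60 degrees, candidate azimuths 702 = 50 degrees and 703 = 220 degrees (assumed):
    # the differences are 170 and 20 degrees, so the marker of azimuth 703 (index 1) is selected.
    print(identify_marker(60.0, [50.0, 220.0]))  # -> 1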
  • the distance marker specifying unit 9 outputs the position coordinates in the world coordinate system of each point (reference point) at the four corners of the specified distance marker to the position and orientation estimating unit 10.
  • Distance markers are installed along the road, and each indicates the number of kilometers that is the distance from the starting point to that distance marker.
  • The same number of kilometers is marked on the up-line side distance marker 800a and on the down-line side distance marker 800b at the position corresponding to the distance marker 800a.
  • Likewise, the same number of kilometers is marked on the up-line side distance marker 801a and on the down-line side distance marker 801b at the corresponding position.
  • The distance marker specifying unit 9 specifies the distance marker in the captured image from among the distance markers searched by the distance marker search unit 7, based on the azimuth of the distance marker calculated by the azimuth calculation unit 8 and the shooting azimuth of the imaging device 2 estimated by the azimuth estimation unit 6. This makes it possible to accurately identify the up-line side distance marker 800a that is actually captured in the image.
  • The position / orientation estimation unit 10 estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates, in the world coordinate system, of the four corner points of the distance marker specified by the distance marker specifying unit 9 and their position coordinates in the image coordinate system (step ST9). Since the distance marker specified by the distance marker specifying unit 9 is the distance marker captured by the imaging device 2, the position / orientation estimation unit 10 can acquire the position coordinates, in the image coordinate system, of its four corner points from the distance marker recognition unit 5.
  • For example, the position / orientation estimation unit 10 solves the PnP problem using the internal parameters of the imaging device 2, based on the correspondence between the position coordinates of the four corner points of the distance marker in the world coordinate system and those in the image coordinate system, thereby estimating the position and orientation, including the rotation component, of the imaging device 2.
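  • For illustration, a minimal sketch of solving the PnP problem with OpenCV (all numeric values are hypothetical placeholders, not data from the publication; cv2.solvePnP is used here as one possible solver):

    import numpy as np
    import cv2

    object_points = np.array([                       # four corner points in the world coordinate system (metres)
        [10.0, 50.0, 1.2], [10.5, 50.0, 1.2], [10.5, 50.0, 0.8], [10.0, 50.0, 0.8]], dtype=np.float64)
    image_points = np.array([                        # the same corners in the image coordinate system (pixels)
        [612.0, 180.0], [676.0, 182.0], [674.0, 240.0], [610.0, 238.0]], dtype=np.float64)
    camera_matrix = np.array([[1000.0, 0.0, 640.0],  # internal parameters: focal length and principal point
                              [0.0, 1000.0, 360.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(5)

    ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs,
                                  flags=cv2.SOLVEPNP_IPPE)    # planar four-point solver; other flags also work
    R, _ = cv2.Rodrigues(rvec)                       # rotation taking world coordinates into the camera frame
    camera_position_world = (-R.T @ tvec).ravel()    # position of the imaging device in the world coordinate system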
  • Based on the estimated position and orientation of the imaging device 2 and the fixed positional relationship between the imaging device 2 and the vehicle, the position of the vehicle can be accurately estimated. Since the vehicle position can be estimated accurately, navigation accuracy can be improved.
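  • A sketch of how the vehicle position could then be derived from the estimated camera pose and an assumed fixed camera-to-vehicle transform (the frame conventions and values here are assumptions of this sketch, matching the solvePnP output above):

    import numpy as np

    def vehicle_pose_from_camera_pose(R_wc, t_wc, R_cv, t_cv):
        """R_wc, t_wc map world points into the camera frame; R_cv, t_cv map camera points into the vehicle frame."""
        R_wv = R_cv @ R_wc                           # world -> vehicle rotation
        t_wv = R_cv @ t_wc + t_cv                    # world -> vehicle translation
        vehicle_position_world = -R_wv.T @ t_wv      # origin of the vehicle frame in world coordinates
        return R_wv, vehicle_position_world

    # With an identity mounting transform the vehicle position equals the camera position.
    R_id, t_id = np.eye(3), np.zeros((3, 1))
    print(vehicle_pose_from_camera_pose(R_id, np.array([[1.0], [2.0], [3.0]]), R_id, t_id)[1].ravel())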
  • Since there is no dependency between the processing from step ST1 to step ST5 and the processing from step ST6 to step ST7, the former processing and the latter processing may be executed in parallel.
  • The case where the recognition object is a distance marker and the marking information is the number of kilometers has been described, but Embodiment 1 is not limited to this.
  • For example, the recognition object may be equipment installed along a road, and the marking information may be code information such as numbers, character strings, images, or two-dimensional barcodes marked on the equipment. That is, the recognition object and the marking information may be anything that can be recognized from the captured image.
  • The case where the reference points are the four corner points of the distance marker has been described, but the present invention is not limited to this.
  • For example, a reference point may be a point on an edge of the distance marker or a point on the surface on which the number of kilometers is marked.
  • The case where the position / orientation estimation device 1 is mounted on a vehicle has been described, but the present invention is not limited to this.
  • The position / orientation estimation device 1 may be mounted on a moving body such as a railway vehicle or an aircraft, or may be realized by a mobile terminal such as a smartphone or a tablet PC and carried by a person.
  • As described above, the position / orientation estimation device 1 according to Embodiment 1 searches the distance marker database 4 for the position coordinates of the reference points of the distance markers corresponding to the number of kilometers recognized from the captured image, calculates the azimuth of each retrieved distance marker based on the position coordinates of its reference points, specifies the distance marker in the captured image based on the calculated azimuths of the distance markers and the shooting azimuth of the imaging device 2, and estimates the position and orientation of the imaging device 2 based on the correspondence between the position coordinates of the reference points of the specified distance marker in the world coordinate system and their position coordinates in the image coordinate system.
  • With this configuration, the distance marker in the captured image can be accurately specified from among the distance markers indicating the same number of kilometers, and the position and orientation of the imaging device 2 can be accurately estimated based on the position coordinates of the reference points of the accurately specified distance marker. Since the position of the photographed distance marker can also be specified accurately, the position / orientation estimation device 1 can be applied to recording road maintenance and inspection work. Further, since the position and orientation of the position / orientation estimation device 1 can be accurately estimated based on the position and orientation of the imaging device 2, it can also be applied to augmented reality applications that require the accurate position and orientation of an object.
  • In the position / orientation estimation device 1 according to Embodiment 1, the azimuth estimation unit 6 estimates the attitude of the imaging device 2 with respect to the ground based on the detection information of the acceleration sensor, and estimates the shooting azimuth of the imaging device 2 based on the estimated attitude and the detection information of the geomagnetic sensor. With this configuration, the shooting azimuth of the imaging device 2 can be accurately estimated.
  • any component of the embodiment can be modified or any component of the embodiment can be omitted within the scope of the invention.
  • the position / orientation estimation apparatus can improve the estimation accuracy of the position and orientation of the estimation object, and is suitable, for example, for an in-vehicle navigation apparatus.
  • 1 position and orientation estimation device, 2, 100 imaging device, 2A captured image, 3, 101 sensor device, 4 distance marker database, 5 distance marker recognition unit, 6 azimuth estimation unit, 7 distance marker search unit, 8 azimuth calculation unit, 9 distance marker specifying unit, 10 position and orientation estimation unit, 102 storage device, 103 processing circuit, 104 CPU, 105 memory, 401, 601 distance marker, 604 arrow, 605 angle, 701 shooting azimuth, 702 to 704 azimuth, 705, 706 absolute value, 800a, 800b, 801a, 801b distance markers.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a position/orientation estimation device (1) that: retrieves, from a distance marker database (4), the position coordinates of the reference points of the distance markers corresponding to the number of kilometers recognized in a photographed image; calculates the azimuth of each distance marker based on the retrieved position coordinates of its reference points; specifies the distance marker in the photographed image based on the calculated azimuths of the distance markers and the shooting azimuth of an imaging device (2); and estimates the position and orientation of the imaging device (2) based on the correspondence between the position coordinates of the reference points of the specified distance marker in a world coordinate system and their position coordinates in the image coordinate system.
PCT/JP2017/003757 2017-02-02 2017-02-02 Dispositif d'estimation de position/d'orientation et procédé d'estimation de position/d'orientation WO2018142533A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/JP2017/003757 WO2018142533A1 (fr) 2017-02-02 2017-02-02 Dispositif d'estimation de position/d'orientation et procédé d'estimation de position/d'orientation
JP2018562138A JP6479296B2 (ja) 2017-02-02 2017-02-02 位置姿勢推定装置および位置姿勢推定方法
TW106113892A TW201830336A (zh) 2017-02-02 2017-04-26 位置姿勢推定裝置以及位置姿勢推定方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/003757 WO2018142533A1 (fr) 2017-02-02 2017-02-02 Dispositif d'estimation de position/d'orientation et procédé d'estimation de position/d'orientation

Publications (1)

Publication Number Publication Date
WO2018142533A1 true WO2018142533A1 (fr) 2018-08-09

Family

ID=63039486

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/003757 WO2018142533A1 (fr) 2017-02-02 2017-02-02 Dispositif d'estimation de position/d'orientation et procédé d'estimation de position/d'orientation

Country Status (3)

Country Link
JP (1) JP6479296B2 (fr)
TW (1) TW201830336A (fr)
WO (1) WO2018142533A1 (fr)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020045345A1 (fr) * 2018-08-31 2020-03-05 株式会社デンソー Système et procédé de reconnaissance de panneau indicateur
CN112639905A (zh) * 2018-08-31 2021-04-09 株式会社电装 标示物识别***以及标示物识别方法
CN113188439A (zh) * 2021-04-01 2021-07-30 深圳市磐锋精密技术有限公司 一种基于互联网的手机摄像用自动定位方法
WO2023013160A1 (fr) * 2021-08-02 2023-02-09 ミネベアミツミ株式会社 Dispositif d'estimation de distance, dispositif d'antenne, et système, dispositif ainsi que procédé d'alimentation électrique
WO2023013171A1 (fr) * 2021-08-02 2023-02-09 ミネベアミツミ株式会社 Dispositif d'estimation de distance, dispositif d'antenne, et système, dispositif ainsi que procédé d'alimentation électrique
JP7407213B2 (ja) 2022-02-22 2023-12-28 本田技研工業株式会社 方角特定装置、及び方角特定方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09119851A (ja) * 1995-10-24 1997-05-06 Nissan Diesel Motor Co Ltd 車両の位置検出装置
JP2007147515A (ja) * 2005-11-29 2007-06-14 Denso Corp 車両用ナビゲーション装置
JP2009250718A (ja) * 2008-04-03 2009-10-29 Nissan Motor Co Ltd 車両位置検出装置及び車両位置検出方法
JP2016018405A (ja) * 2014-07-09 2016-02-01 東日本高速道路株式会社 車載用キロポスト値表示端末及びそれに用いるキロポスト値更新方法

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020045345A1 (fr) * 2018-08-31 2020-03-05 株式会社デンソー Système et procédé de reconnaissance de panneau indicateur
CN112639905A (zh) * 2018-08-31 2021-04-09 株式会社电装 标示物识别***以及标示物识别方法
CN112639905B (zh) * 2018-08-31 2023-09-29 株式会社电装 标示物识别***以及标示物识别方法
US11830255B2 (en) 2018-08-31 2023-11-28 Denso Corporation Method and system for recognizing sign
CN113188439A (zh) * 2021-04-01 2021-07-30 深圳市磐锋精密技术有限公司 一种基于互联网的手机摄像用自动定位方法
WO2023013160A1 (fr) * 2021-08-02 2023-02-09 ミネベアミツミ株式会社 Dispositif d'estimation de distance, dispositif d'antenne, et système, dispositif ainsi que procédé d'alimentation électrique
WO2023013171A1 (fr) * 2021-08-02 2023-02-09 ミネベアミツミ株式会社 Dispositif d'estimation de distance, dispositif d'antenne, et système, dispositif ainsi que procédé d'alimentation électrique
JP7407213B2 (ja) 2022-02-22 2023-12-28 本田技研工業株式会社 方角特定装置、及び方角特定方法

Also Published As

Publication number Publication date
TW201830336A (zh) 2018-08-16
JP6479296B2 (ja) 2019-03-06
JPWO2018142533A1 (ja) 2019-03-14

Similar Documents

Publication Publication Date Title
CN110411441B (zh) 用于多模态映射和定位的***和方法
JP6479296B2 (ja) 位置姿勢推定装置および位置姿勢推定方法
US10515458B1 (en) Image-matching navigation method and apparatus for aerial vehicles
WO2021185218A1 (fr) Procédé d'acquisition de coordonnées 3d et de dimensions d'objet pendant un mouvement
EP2917754B1 (fr) Méthode de traitement d'image, en particulier utilisée dans la localisation visuelle d'un dispositif
CN110570477B (zh) 一种标定相机和旋转轴相对姿态的方法、装置和存储介质
WO2018142900A1 (fr) Dispositif de traitement d'informations, dispositif de gestion de données, système de gestion de données, procédé et programme
EP2491529B1 (fr) Obtention d'un descripteur associé à au moins une caractéristique d'une image
Wendel et al. Natural landmark-based monocular localization for MAVs
WO2016199605A1 (fr) Dispositif, procédé et programme de traitement d'image
JPWO2015045834A1 (ja) マーカ画像処理システム
CN107044853B (zh) 用于确定地标的方法和装置以及用于定位的方法和装置
JP4132068B2 (ja) 画像処理装置及び三次元計測装置並びに画像処理装置用プログラム
CN111279354A (zh) 图像处理方法、设备及计算机可读存储介质
JP2022042146A (ja) データ処理装置、データ処理方法およびデータ処理用プログラム
JP6410231B2 (ja) 位置合わせ装置、位置合わせ方法及び位置合わせ用コンピュータプログラム
JP6922348B2 (ja) 情報処理装置、方法、及びプログラム
JP2011112556A (ja) 捜索目標位置特定装置及び捜索目標位置特定方法並びにコンピュータプログラム
Huttunen et al. A monocular camera gyroscope
US20230079899A1 (en) Determination of an absolute initial position of a vehicle
JP6886136B2 (ja) 位置合わせ装置、位置合わせ方法及び位置合わせ用コンピュータプログラム
CN113011212B (zh) 图像识别方法、装置及车辆
CN114842224A (zh) 一种基于地理底图的单目无人机绝对视觉匹配定位方案
KR20220062709A (ko) 모바일 디바이스 영상에 기반한 공간 정보 클러스터링에 의한 재난 상황 인지 시스템 및 방법
Amorós et al. Towards relative altitude estimation in topological navigation tasks using the global appearance of visual information

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018562138

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17895095

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17895095

Country of ref document: EP

Kind code of ref document: A1