WO2019182082A1 - Estimation device, control method, program, and storage medium - Google Patents


Info

Publication number: WO2019182082A1
Authority: WO (WIPO (PCT))
Prior art keywords: unit, vehicle, estimation, measurement, acceleration
Application number: PCT/JP2019/011977
Other languages: French (fr), Japanese (ja)
Inventor: 加藤 正浩
Original Assignee: パイオニア株式会社
Application filed by パイオニア株式会社
Priority to JP2020507909A priority Critical patent/JPWO2019182082A1/en
Publication of WO2019182082A1 publication Critical patent/WO2019182082A1/en
Priority to JP2022074663A priority patent/JP2022115927A/en
Priority to JP2023190044A priority patent/JP2024016186A/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C15/00: Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/16: Anti-collision systems

Definitions

  • The present invention relates to a technique for estimating the attitude of a measurement unit.
  • Patent Document 1 discloses a technique for estimating a self-position by collating the output of a measurement sensor with the position information of features registered in advance on a map.
  • Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
  • Data obtained from measurement units such as radars and cameras are values in a coordinate system based on the measurement unit, and thus depend on the attitude of the measurement unit with respect to the vehicle; they need to be converted into the vehicle-based coordinate system. Therefore, when a deviation occurs in the attitude of the measurement unit, it is necessary to detect the deviation accurately and reflect it in the data of the measurement unit.
  • The present invention has been made to solve the above problems, and a main object thereof is to provide an estimation device that can appropriately estimate the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object.
  • The invention described in the claims is an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object. The estimation device includes an estimation unit configured to estimate the attitude of the measurement unit with respect to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit when the moving body is traveling with acceleration or deceleration.
  • The invention described in the claims is also a control method executed by an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object. The control method includes an estimation step of estimating the attitude of the measurement unit with respect to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit when the moving body is traveling with acceleration or deceleration.
  • The invention described in the claims is also a program executed by a computer that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object. The program causes the computer to function as an estimation unit that estimates the attitude of the measurement unit with respect to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit when the moving body is traveling with acceleration or deceleration.
  • In a preferred embodiment, an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object includes an estimation unit configured to estimate the attitude of the measurement unit with respect to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit when the moving body is traveling with acceleration or deceleration.
  • According to this aspect, the estimation device can suitably estimate the attitude of the measurement unit in the yaw direction with respect to the vehicle.
  • Note that "traveling with acceleration/deceleration" includes both traveling with acceleration and traveling with deceleration.
  • The estimation unit estimates the attitude of the measurement unit in the roll direction and the pitch direction based on the acceleration data output by the acceleration detection unit when the moving body is traveling at a constant speed or stopped, and estimates the attitude in the yaw direction based on the acceleration data output by the acceleration detection unit when the moving body is traveling with acceleration or deceleration, together with the estimated attitudes in the roll direction and the pitch direction.
  • According to this aspect, the estimation device can appropriately estimate the attitude of the measurement unit in the roll direction and the pitch direction with respect to the vehicle when the moving body is traveling at a constant speed or stopped, and the attitude in the yaw direction with respect to the vehicle when the moving body is traveling with acceleration or deceleration.
  • The estimation unit estimates the amount of change in the attitude based on the estimated attitude of the measurement unit and the attitude of the measurement unit stored in the storage unit.
  • According to this aspect, the estimation device can suitably estimate the amount of change in the attitude of the measurement unit.
  • The estimation unit estimates the position of the measurement unit in the height direction based on measurement data of the measurement unit indicating the position of the road surface in the height direction.
  • According to this aspect, the estimation device can suitably estimate the position of the measurement unit in the height direction.
  • The estimation unit estimates the position of the measurement unit in the front-rear direction of the moving body based on the distance between a road point where the gradient changes and the moving body at the moment that road point is measured by the measurement unit. According to this aspect, the estimation device can appropriately estimate the position of the measurement unit in the front-rear direction of the moving body.
  • The estimation unit calculates the distance based on the time difference between a change in the data output from the inclination detection unit, which detects the inclination of the moving body in the pitch direction, and a change in the measurement data output from the measurement unit.
  • According to this aspect, the estimation device can suitably calculate the distance between the road point where the gradient changes and the moving body at the moment that road point is measured by the measurement unit, and can use it to estimate the position of the measurement unit in the front-rear direction of the moving body.
  • The estimation unit estimates the position of the measurement unit in the left-right direction of the moving body based on the left-right acceleration data of the moving body output by the acceleration detection unit while the moving body is turning, the left-right acceleration data of the moving body output by an acceleration sensor mounted on the moving body, and the yaw rate of the moving body output by a gyro sensor mounted on the moving body.
  • According to this aspect, the estimation device can suitably estimate the position of the measurement unit in the left-right direction of the moving body.
  • The estimation unit estimates the amount of change in the position based on the estimated position of the measurement unit and the position of the measurement unit stored in the storage unit.
  • According to this aspect, the estimation device can suitably estimate the amount of change in the position of the measurement unit.
  • The estimation device further includes a correction unit that corrects the measurement data output from the measurement unit based on the amount of change.
  • According to this aspect, even when a deviation occurs in the measurement unit, the estimation device can correct the measurement data of the measurement unit so that the deviation has no influence.
  • The estimation device further includes a stop control unit that stops processing based on the measurement data output from the measurement unit when the amount of change is equal to or greater than a predetermined amount. According to this aspect, the estimation device can reliably suppress the decrease in accuracy of various processes that would result from using measurement data of a measurement unit with a large attitude or position deviation.
  • In another preferred embodiment, a control method executed by an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object includes an estimation step of estimating the attitude of the measurement unit with respect to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit when the moving body is traveling with acceleration or deceleration.
  • By executing this control method, the estimation device can suitably estimate the attitude of the measurement unit in the yaw direction with respect to the vehicle.
  • In another preferred embodiment, a program is executed by a computer that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object.
  • The program causes the computer to function as an estimation unit that estimates the attitude of the measurement unit with respect to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit when the moving body is traveling with acceleration or deceleration.
  • By executing this program, the computer can suitably estimate the attitude of the measurement unit in the yaw direction with respect to the vehicle.
  • The program is stored in a storage medium.
  • FIG. 1 is a schematic configuration diagram of a driving support system according to the present embodiment.
  • The driving support system shown in FIG. 1 is mounted on a vehicle and includes an in-vehicle device 1 that performs control related to driving support of the vehicle, a lidar (LiDAR: Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2, a gyro sensor 3, a vehicle body acceleration sensor 4, and a lidar acceleration sensor 5.
  • The in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle body acceleration sensor 4, and the lidar acceleration sensor 5, and acquires their output data. The in-vehicle device 1 also holds a map database (DB: DataBase) 10.
  • The in-vehicle device 1 estimates the position of the vehicle (also referred to as the "own vehicle position") based on the output data described above and the map DB 10, and performs control related to driving support of the vehicle, such as automatic driving control, based on the estimation result of the own vehicle position.
  • Furthermore, the in-vehicle device 1 estimates the attitude and position of the lidar 2 based on the outputs of the lidar 2, the gyro sensor 3, the vehicle body acceleration sensor 4, and the lidar acceleration sensor 5, and performs processing such as correcting the measurement data output by the lidar 2.
  • The in-vehicle device 1 is an example of the "estimation device" in the present invention.
  • The lidar 2 discretely measures the distance to objects existing in the outside world by emitting pulsed laser light over a predetermined angle range in the horizontal and vertical directions, and generates three-dimensional point cloud information indicating the positions of those objects.
  • The lidar 2 includes an irradiation unit that emits laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) of the irradiated laser light, and an output unit that outputs scan data based on the light reception signal produced by the light receiving unit.
  • The scan data is generated based on the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction, specified from the light reception signal described above, and is supplied to the in-vehicle device 1.
  • The lidar 2 is provided at each of a front portion and a rear portion of the vehicle.
  • The lidar 2 is an example of the "measurement unit" in the present invention.
  • The gyro sensor 3 is provided in the vehicle and supplies an output signal corresponding to the yaw rate of the vehicle body to the in-vehicle device 1.
  • The vehicle body acceleration sensor 4 is a three-axis acceleration sensor provided in the vehicle, and supplies detection signals corresponding to the three-axis acceleration along the traveling direction, lateral direction, and height direction of the vehicle body to the in-vehicle device 1.
  • The gyro sensor 3 and the vehicle body acceleration sensor 4 are examples of the "inclination detection unit" in the present invention.
  • The lidar acceleration sensor 5 is a three-axis acceleration sensor provided in each lidar 2, and supplies detection signals corresponding to the three-axis acceleration of the lidar 2 in which it is installed to the in-vehicle device 1.
  • The lidar acceleration sensor 5 is an example of the "acceleration detection unit" in the present invention.
  • FIG. 2 is a block diagram showing the functional configuration of the in-vehicle device 1.
  • The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
  • The interface 11 acquires output data from sensors such as the lidar 2, the gyro sensor 3, the vehicle body acceleration sensor 4, and the lidar acceleration sensor 5, and supplies the data to the control unit 15. In addition, the interface 11 supplies signals related to the traveling control of the vehicle generated by the control unit 15 to the electronic control unit (ECU: Electronic Control Unit) of the vehicle.
  • The storage unit 12 stores programs executed by the control unit 15 and information necessary for the control unit 15 to execute predetermined processing.
  • The storage unit 12 stores the map DB 10 and lidar installation information IL.
  • The lidar installation information IL is information on the relative three-dimensional position and attitude of each lidar 2 with respect to the vehicle at a certain reference time (for example, immediately after alignment adjustment of the lidar 2, when there is no attitude or position deviation).
  • The attitude of the lidar 2 is represented by a roll angle, a pitch angle, and a yaw angle (that is, Euler angles).
  • The lidar installation information IL may be information on the position and attitude actually measured at the reference time described above, or may be information on the position and attitude of the lidar 2 estimated by the in-vehicle device 1 through the lidar position and attitude estimation processing described later.
  • The input unit 14 is, for example, a button, a touch panel, a remote controller, or a voice input device operated by the user, and receives inputs such as specifying a destination for route search and switching automatic driving on or off.
  • The information output unit 16 is, for example, a display or a speaker that produces output under the control of the control unit 15.
  • The control unit 15 includes a CPU that executes programs, and controls the entire in-vehicle device 1.
  • The control unit 15 estimates the own vehicle position based on the output signals of the sensors supplied from the interface 11 and the map DB 10, and performs control related to driving support of the vehicle, including automatic driving control, based on the estimation result.
  • Based on the attitude and position of the lidar 2 recorded in the lidar installation information IL, the control unit 15 converts the measurement data output by the lidar 2 from the coordinate system based on the lidar 2 into the coordinate system based on the vehicle.
  • Furthermore, the control unit 15 estimates the current (that is, at the processing reference time) attitude and position of the lidar 2 with respect to the vehicle, calculates the amounts of change from the position and attitude recorded in the lidar installation information IL, and corrects the measurement data output by the lidar 2 based on those amounts of change. As a result, even when a deviation occurs in the attitude or position of the lidar 2, the control unit 15 corrects the measurement data output by the lidar 2 so that the deviation has no influence.
  • The control unit 15 is an example of the "estimation unit", the "correction unit", the "stop control unit", and the "computer" that executes a program in the present invention.
  • The in-vehicle device 1 executes the following processing for each lidar 2.
  • FIG. 3 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system, represented in two-dimensional coordinates.
  • The vehicle coordinate system has a coordinate axis "x b" along the traveling direction of the vehicle and a coordinate axis "y b" along the lateral direction of the vehicle, with the vehicle center as the origin.
  • The lidar coordinate system has a coordinate axis "x L" along the front direction of the lidar 2 (see arrow A2) and a coordinate axis "y L" along the lateral direction of the lidar 2.
  • The measurement point [x b (k), y b (k)] T at time "k", viewed from the vehicle coordinate system, is converted into the coordinates [x L (k), y L (k)] T of the lidar coordinate system by the following equation (1), using the rotation matrix "C ψ0".
  • Conversely, the conversion from the lidar coordinate system to the vehicle coordinate system may be performed using the inverse matrix (transpose matrix) of the rotation matrix. Therefore, the measurement point [x L (k), y L (k)] T obtained at time k in the lidar coordinate system can be converted into the coordinates [x b (k), y b (k)] T of the vehicle coordinate system by the following equation (2).
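Equations (1) and (2) themselves are not reproduced in this text, but the two-dimensional conversion they describe can be sketched as follows, assuming C ψ0 is the standard planar rotation by the yaw angle L ψ0 (function and variable names are illustrative):

```python
import numpy as np

def c_psi(psi):
    # Assumed form of the rotation matrix "C psi0": a planar rotation by psi
    return np.array([[np.cos(psi), -np.sin(psi)],
                     [np.sin(psi),  np.cos(psi)]])

def vehicle_to_lidar_2d(p_b, psi0):
    # Counterpart of equation (1): vehicle frame -> lidar frame
    return c_psi(psi0).T @ p_b

def lidar_to_vehicle_2d(p_l, psi0):
    # Counterpart of equation (2): lidar frame -> vehicle frame,
    # using the inverse (transpose) of the rotation matrix
    return c_psi(psi0) @ p_l
```

Because the rotation matrix is orthonormal, its transpose is its inverse, which is why the same matrix serves both directions of conversion.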
  • FIG. 4 is a diagram illustrating the relationship between the vehicle coordinate system and the lidar coordinate system, represented in three-dimensional coordinates.
  • The coordinate axis perpendicular to the coordinate axes x b and y b is "z b", and the coordinate axis perpendicular to the coordinate axes x L and y L is "z L".
  • Let the roll angle of the lidar 2 with respect to the vehicle coordinate system be "L φ0", the pitch angle "L θ0", the yaw angle "L ψ0", the position of the lidar 2 on the coordinate axis x b "L x0", the position on the coordinate axis y b "L y0", and the position on the coordinate axis z b "L z0".
  • The measurement point [x b0 (k), y b0 (k), z b0 (k)] T in the vehicle coordinate system is converted into the lidar coordinate system by the following equation (3), using the direction cosine matrix "C 0" composed of the rotation matrices "C φ0", "C θ0", and "C ψ0" corresponding to roll, pitch, and yaw.
  • Conversely, the conversion from the lidar coordinate system to the vehicle coordinate system may be performed using the inverse matrix (transpose matrix) of the direction cosine matrix. Therefore, the measurement point [x L0 (k), y L0 (k), z L0 (k)] T acquired at time k in the lidar coordinate system can be converted into the coordinates [x b0 (k), y b0 (k), z b0 (k)] T of the vehicle coordinate system by the following equation (4).
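A sketch of the three-dimensional conversion of equations (3) and (4), assuming the direction cosine matrix C 0 composes the roll, pitch, and yaw rotations in Z-Y-X order (the rotation order is an assumption, since the equations are not reproduced here):

```python
import numpy as np

def c_roll(phi):
    return np.array([[1, 0, 0],
                     [0, np.cos(phi), -np.sin(phi)],
                     [0, np.sin(phi),  np.cos(phi)]])

def c_pitch(theta):
    return np.array([[ np.cos(theta), 0, np.sin(theta)],
                     [0, 1, 0],
                     [-np.sin(theta), 0, np.cos(theta)]])

def c_yaw(psi):
    return np.array([[np.cos(psi), -np.sin(psi), 0],
                     [np.sin(psi),  np.cos(psi), 0],
                     [0, 0, 1]])

def dcm(phi, theta, psi):
    # Assumed form of the direction cosine matrix "C 0"
    return c_yaw(psi) @ c_pitch(theta) @ c_roll(phi)

def vehicle_to_lidar(p_b, phi, theta, psi):
    # Counterpart of equation (3): vehicle frame -> lidar frame
    return dcm(phi, theta, psi).T @ p_b

def lidar_to_vehicle(p_l, phi, theta, psi):
    # Counterpart of equation (4): the transpose of the DCM is its inverse
    return dcm(phi, theta, psi) @ p_l
```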
  • Hereinafter, the directions along the coordinate axes x b, y b, and z b of the vehicle coordinate system are also simply referred to as the "x direction", "y direction", and "z direction", respectively.
  • First, the in-vehicle device 1 estimates the roll angle L φ0 and the pitch angle L θ0 of the lidar 2 based on the three-axis acceleration output values obtained from the lidar acceleration sensor 5 installed in the target lidar 2.
  • The lidar acceleration sensor 5 measures the three-axis acceleration in the lidar coordinate system.
  • FIG. 5(A) is a diagram showing the vector of the gravitational acceleration g in the vehicle coordinate system, and FIG. 5(B) is a diagram showing the vector of the gravitational acceleration g in the lidar coordinate system. Accordingly, the output value [α x, α y, α z] T of the lidar acceleration sensor 5 satisfies the following equation (5).
  • The roll angle L φ0 of the lidar 2 is expressed by the following equation (7), which does not use the gravitational acceleration g.
  • The pitch angle L θ0 of the lidar 2 is expressed by the following equation (10), which does not use the gravitational acceleration g.
  • When the vehicle is stopped or traveling at a constant speed in a horizontal place, the in-vehicle device 1 can calculate the roll angle L φ0 and the pitch angle L θ0 of the lidar 2 from the output values of the lidar acceleration sensor 5 by referring to equations (7) and (10).
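Equations (5)-(10) are not reproduced here, but the stationary gravity-vector relations they encode lead to the standard accelerometer leveling formulas, which can be sketched as follows (a reconstruction under the assumed sensor model, not the patent's exact equations):

```python
import numpy as np

def roll_pitch_from_gravity(alpha):
    # alpha: lidar accelerometer output [ax, ay, az] while the vehicle is
    # stopped or at constant speed on level ground, so only gravity is sensed.
    ax, ay, az = alpha
    phi = np.arctan2(ay, az)                   # roll, counterpart of eq. (7)
    theta = np.arctan2(-ax, np.hypot(ay, az))  # pitch, counterpart of eq. (10)
    return phi, theta
```

Both angles are ratios of accelerometer outputs, so g cancels, matching the text's remark that equations (7) and (10) do not use the gravitational acceleration g.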
  • The in-vehicle device 1 may determine whether the current position is a horizontal place based on the output of the gyro sensor 3 or the vehicle body acceleration sensor 4, or by referring to the map DB 10 for information on the inclination angle in the road data corresponding to the current position of the vehicle. Further, the in-vehicle device 1 may determine whether the vehicle is stopped or traveling at a constant speed based on the output of the vehicle body acceleration sensor 4, or based on the output of a vehicle speed sensor (not shown).
  • Next, using the calculated roll angle L φ0 and pitch angle L θ0 of the lidar 2, the in-vehicle device 1 estimates the yaw angle L ψ0 based on the three-axis acceleration output values obtained from the lidar acceleration sensor 5 while the vehicle is accelerating or decelerating on a straight road.
  • Equation (18) is obtained by adding the equation obtained by multiplying equation (17) by sin L θ0 to the equation obtained by multiplying α x in equation (11) by cos L θ0.
  • The yaw angle L ψ0 of the lidar 2 is then expressed by the following equation (20), which uses neither the acceleration α nor the gravitational acceleration g.
  • Thus, the in-vehicle device 1 can calculate the yaw angle L ψ0 of the lidar 2 from the output values of the lidar acceleration sensor 5 obtained while the vehicle accelerates or decelerates on a straight road, by referring to equation (20).
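With roll and pitch known, the yaw angle can be isolated from the accelerometer outputs during straight-line acceleration so that both the vehicle acceleration and g cancel. The sketch below reconstructs a formula of the kind equation (20) describes, under the same assumed Z-Y-X sensor model as above (the exact form of the patent's equations (11)-(20) is not reproduced here):

```python
import numpy as np

def yaw_from_accel(alpha, phi0, theta0):
    # alpha: lidar accelerometer output [ax, ay, az] during acceleration or
    # deceleration on a straight, level road; phi0 / theta0: roll and pitch
    # estimated beforehand from equations (7) and (10).
    ax, ay, az = alpha
    # Proportional to a*sin(psi0): the gravity terms cancel exactly
    a_sin = az * np.sin(phi0) - ay * np.cos(phi0)
    # Proportional to a*cos(psi0): combination analogous to equation (18)
    a_cos = ax * np.cos(theta0) + (ay * np.sin(phi0) + az * np.cos(phi0)) * np.sin(theta0)
    # The unknown acceleration a cancels in the ratio
    return np.arctan2(a_sin, a_cos)
```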
  • The in-vehicle device 1 may determine whether the vehicle is traveling on a straight road based on the output of the vehicle body acceleration sensor 4, or by referring to the map DB 10 for the road data of the road corresponding to the current position.
  • The in-vehicle device 1 may determine whether the vehicle is accelerating or decelerating based on the output of the vehicle body acceleration sensor 4, or based on the output of a vehicle speed sensor (not shown).
  • Next, estimation of the position of the lidar 2 will be described. Let the position of the lidar 2 on the coordinate axis x b at the reference time be "L x0", the position on the coordinate axis y b "L y0", and the position on the coordinate axis z b "L z0", and let the position on the coordinate axis x b at the processing reference time be "L x", the position on the coordinate axis y b "L y", and the position on the coordinate axis z b "L z".
  • As a result, the measurement point at time k obtained from the lidar 2 changes from [x L0 (k), y L0 (k), z L0 (k)] T to [x L (k), y L (k), z L (k)] T.
  • The conversion of the measurement point at time k from the vehicle coordinate system to the lidar coordinate system is as shown in equation (23).
  • The roll angle L φ, the pitch angle L θ, and the yaw angle L ψ of the lidar 2 at the processing reference time can be calculated using equations similar to equations (7), (10), and (20). Therefore, by taking the differences between the roll angle L φ, pitch angle L θ, and yaw angle L ψ and the roll angle L φ0, pitch angle L θ0, and yaw angle L ψ0 recorded in the lidar installation information IL, the in-vehicle device 1 can suitably calculate the roll angle change amount ΔL φ, the pitch angle change amount ΔL θ, and the yaw angle change amount ΔL ψ.
  • Note that the in-vehicle device 1 may store, as the lidar installation information IL, the roll angle L φ0, pitch angle L θ0, and yaw angle L ψ0 of the lidar 2 estimated based on equations (7), (10), and (20) at the first timing at which the attitude estimation processing of the lidar 2 can be executed after the lidar 2 is attached to the vehicle (that is, after alignment adjustment).
  • Next, the in-vehicle device 1 calculates the position change amount "ΔL z" of the lidar 2 in the z direction based on the z-direction values of the measurement points of the lidar 2 on the road surface while the vehicle is stopped on a flat road surface or traveling on it at a constant speed.
  • The coordinates [x b (k), y b (k), z b (k)] T of the measurement point at time k in the vehicle coordinate system at the processing reference time are expressed, similarly to the coordinates [x b0 (k), y b0 (k), z b0 (k)] T, by the following equation (24) using the direction cosine matrix "C".
  • FIG. 6(A) shows the measured value z b0 (k) in the z direction of the road surface measured by the lidar 2 when the vehicle is traveling on a flat road surface before the position L z of the lidar 2 changes, and FIG. 6(B) shows the measured value z b (k) in the z direction of the road surface measured by the lidar 2 when the vehicle is traveling on a flat road surface after the position L z of the lidar 2 changes.
  • As illustrated, the measured value z b0 (k) and the measured value z b (k) are equal in magnitude to the positions L z0 and L z of the lidar 2 in the z direction, respectively.
  • It can therefore be seen that the difference between the road-surface measured value z b0 (k) calculated by equation (4) and the road-surface measured value z b (k) after the position change is equal to the z-direction change amount ΔL z.
  • The measured value z b0 (k) and the measured value z b (k) used here are preferably averaged over a plurality of scan lines and over time.
  • The in-vehicle device 1 calculates the change amount ΔL z based on the following equation (25).
  • The in-vehicle device 1 calculates the average of the road-surface measured values z b0 (k) when the vehicle is stopped on a flat road surface or traveling on it at a constant speed, and the initial position L z0 of the lidar 2 is recorded in the lidar installation information IL.
  • By referring to the lidar installation information IL, the in-vehicle device 1 can suitably calculate the position L z and the change amount ΔL z at the processing reference time.
  • Note that the in-vehicle device 1 may estimate the suspension stroke amount of the vehicle (that is, the amount of sinking from the fully extended position) at the time of measuring the initial position L z0 and at the processing reference time, and may correct the z-direction change amount ΔL z based on the difference between the estimated stroke amounts.
  • The in-vehicle device 1 may measure the suspension stroke amount using a stroke sensor or the like provided in the vehicle suspension, or may estimate it based on the number of passengers in the vehicle. In this way, the change amount ΔL z can be calculated more accurately.
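Per the text above, the z-direction change estimate reduces to a difference of averaged road-surface measurements; a minimal sketch (equation (25) is not reproduced, and the sign convention is an assumption):

```python
def delta_lz(z_b0_samples, z_b_samples):
    # z_b0_samples: road-surface z measurements (vehicle frame) at the
    # reference time; z_b_samples: the same at the processing reference time.
    # Both are averaged over several scan lines and over time before
    # differencing, as the text recommends.
    avg0 = sum(z_b0_samples) / len(z_b0_samples)
    avg = sum(z_b_samples) / len(z_b_samples)
    return avg0 - avg  # assumed counterpart of equation (25)
```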
  • Next, the in-vehicle device 1 calculates the position L x of the lidar 2 in the x direction based on the time difference between a change in the z-direction measured value of the lidar 2 and a change in the pitch rate obtained from the gyro sensor 3 when the vehicle passes the start point or end point of a slope, or a bump on the road surface.
  • FIGS. 7(A) to 7(G) are diagrams showing, by line segments 51 to 57, the magnitude of the measured value z b (k) in the z direction of a specific scan line of the lidar 2 of a vehicle traveling near the start point 50 of an uphill.
  • FIG. 8(A) is a graph showing the time change of the measured value z b (k) measured while the vehicle shown in FIGS. 7(A) to 7(G) travels, and FIG. 8(B) is a graph showing the time change of the vehicle body pitch angle (the integrated value of the pitch rate) over the same period. The numbers 51 to 57 in FIGS. 8(A) and 8(B) correspond to the measured values z b (k) indicated by the line segments 51 to 57 in FIGS. 7(A) to 7(G), respectively.
  • As shown, the z-direction measured value of the lidar 2, which takes the road surface as its measurement point, changes near the start point or end point of a slope.
  • The measured value z b (k) is minimized at time "t2" (see FIG. 7(D)), when the front wheel of the vehicle body reaches the start point 50.
  • The in-vehicle device 1 therefore calculates the time interval "Δt" from time t1 (see FIG. 7(B)), at which the measured value z b (k) starts to decrease, to the time at which the measured value z b (k) is minimized (see FIG. 7(D)).
  • The time interval Δt corresponds to the time required for the vehicle to travel the distance d shown in FIG. 7(A), and the distance d corresponds to the distance between the start point 50 and the front wheel of the vehicle at the moment the start point 50 of the slope is detected on the specific scan line.
  • The time interval Δt is an example of the "time difference" in the present invention.
  • The in-vehicle device 1 calculates the distance d from the time interval Δt and the vehicle speed.
  • Note that the time t2, at which the front wheel of the vehicle body reaches the gradient change point (the start point 50 in FIG. 7), can also be determined from the pitch angle measured by a gyro sensor mounted on the vehicle body or on the lidar 2 (the vehicle body pitch angle in FIG. 8(B)).
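Since Δt is the time the vehicle needs to cover the distance d, the calculation reduces to a speed-times-time product; a minimal sketch (equation (26) is not reproduced, so its exact form is an assumption, and the names are illustrative):

```python
def distance_to_front_wheel(delta_t_s, speed_mps):
    # d: distance between the gradient-change point (at the moment it is first
    # detected on the scan line) and the front wheel. The vehicle covers d
    # during delta_t, so d = v * delta_t (assumed counterpart of eq. (26)).
    return speed_mps * delta_t_s
```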
  • FIGS. 9(A) to 9(G) are diagrams showing, by line segments 61 to 66, the magnitude of the measured value z b (k) in the z direction of the lidar 2 of a vehicle traveling before and after a bump 60.
  • FIG. 10(A) is a graph showing the time change of the measured value z b (k) obtained while the vehicle shown in FIGS. 9(A) to 9(G) travels, and FIG. 10(B) is a graph showing the time change of the vehicle body pitch rate over the same period. The numbers 61 to 66 in FIGS. 10(A) and 10(B) correspond to the line segments 61 to 66 in FIGS. 9(A) to 9(G), respectively.
  • The z-direction measured value z b (k) of the lidar 2 temporarily decreases at time "t3" (see FIG. 9(B)), when the laser light strikes the bump 60, and temporarily increases at time "t4" (see FIG. 9(D)), when the front wheel of the vehicle body passes over the bump 60. The in-vehicle device 1 therefore calculates the time interval Δt from time t3 to time t4.
  • The in-vehicle device 1 can then suitably calculate the distance d based on equation (26).
  • Note that the time t4, at which the front wheel of the vehicle body passes over the bump 60, can also be determined from the pitch rate measured by the gyro sensor 3 mounted on the vehicle body or on the lidar 2 (the vehicle body pitch rate in FIG. 10(B)).
  • The in-vehicle device 1 stores the distance "d 0" measured before the position of the lidar 2 changed, and calculates the x-direction change amount ΔL x by taking the difference from the distance d, as shown in the following equation (27).
  • For example, the in-vehicle device 1 calculates the distance d_0 when it first passes near the start point or end point of a slope, or when it first passes over a bump on the road surface.
  • The initial position L_x0 of the lidar 2 is recorded in the lidar installation information IL.
  • By referring to the lidar installation information IL, the in-vehicle device 1 can suitably calculate, based on equation (27), the position L_x and the change amount ΔL_x at the time of processing.
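As a hedged illustration (not code from the patent), the x-direction estimate described above can be sketched as follows: the lidar observes the gradient change (or bump) at one time, the gyro observes the body pitch change when the front wheel reaches the same point, and equations (26) and (27) give d = v·Δt and ΔL_x = d − d_0. All names and values are assumptions.

```python
def estimate_x_offset_change(t_lidar_event, t_pitch_event, speed_mps, d0):
    """Sketch of equations (26)/(27): the lidar detects the road-surface
    change at t_lidar_event; the gyro detects the body pitch change at
    t_pitch_event when the front wheel reaches the same point. The
    distance travelled in between approximates the lidar-to-wheel
    distance d, and the difference from the stored d0 is the x shift."""
    dt = t_pitch_event - t_lidar_event   # time interval Δt
    d = speed_mps * dt                   # equation (26): d = v * Δt
    delta_Lx = d - d0                    # equation (27): ΔL_x = d − d_0
    return d, delta_Lx

# Assumed numbers: lidar sees the slope 0.5 s before the wheel reaches it,
# at 20 m/s, with a stored reference distance d0 = 9.5 m → d = 10.0, ΔL_x = 0.5.
d, dLx = estimate_x_offset_change(10.0, 10.5, 20.0, 9.5)
print(d, dLx)
```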
  • Next, the in-vehicle device 1 calculates the position L_y of the lidar 2 in the y direction from the y-direction output value of the lidar acceleration sensor 5, the y-direction output value of the vehicle body acceleration sensor 4, and the output value of the gyro sensor 3 while the vehicle is turning.
  • FIG. 11 is a diagram schematically showing the effects that arise in a turning vehicle.
  • The speed "V_G" and the centripetal acceleration "α_G" of the vehicle center of gravity during turning are given by the following equations (28) and (29).
  • Here, r_G in equation (28) denotes the distance from the turning center point to the vehicle center of gravity. The directions of the velocity V_G and the centripetal acceleration α_G are orthogonal. Since the vehicle body is a rigid body, the angular velocity is the same everywhere, so the velocity "V_A" and the centripetal acceleration "α_A" at the vehicle coordinate origin, set at a position [A_x, A_y]^T away from the center of gravity, are given by the following equations (30) and (31).
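The equations themselves are not reproduced in this excerpt. Under the rigid-body, steady-turn assumption just stated, they would take the standard circular-motion form (a hedged reconstruction, writing ω for the vehicle yaw rate and r_A for the distance from the turning center to the vehicle coordinate origin A):

```latex
V_G = r_G\,\omega, \qquad \alpha_G = r_G\,\omega^2 \quad \text{(28), (29)}
```
```latex
V_A = r_A\,\omega, \qquad \alpha_A = r_A\,\omega^2 \quad \text{(30), (31)}
```

This reconstruction is consistent with equation (36) described later, in which dividing a lateral-acceleration difference by ω² recovers a distance.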
  • The centripetal acceleration α_A and its vehicle lateral component "α_Ay" have the relationship expressed by the following equation (32).
  • The vehicle lateral component α_Ay on the left side of equation (33) can be measured as the output in the y-axis direction of an acceleration sensor mounted at the position of point A of the vehicle.
  • Similarly to equation (33), the following equation (34) is established for the vehicle lateral component "α_(L+A)y" at the position of the lidar 2, and the following equation (35) is obtained by using this equation (34) together with equation (33).
  • Equation (36) shows that dividing the difference between the outputs of the vehicle body acceleration sensor 4 and the lidar acceleration sensor 5 mounted on the vehicle by the square of the vehicle yaw rate yields the position of the lidar 2 in the y direction with respect to the vehicle coordinate system origin.
  • Therefore, by calculating equation (36), the in-vehicle device 1 can obtain the position L_y of the lidar 2.
  • The in-vehicle device 1 can also calculate the change amount ΔL_y by referring to the initial position L_y0 stored in the lidar installation information IL.
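A minimal sketch of equation (36) as described above (lateral-acceleration difference divided by the squared yaw rate); the sign convention, the guard threshold, and all numbers are assumptions, not values from the patent:

```python
def estimate_y_position(acc_body_y, acc_lidar_y, yaw_rate):
    """Sketch of equation (36): during a turn, the difference between the
    lateral acceleration at the vehicle coordinate origin (body sensor)
    and at the lidar (lidar sensor), divided by the squared yaw rate,
    gives the lidar's y offset. Sign convention here is an assumption."""
    if abs(yaw_rate) < 1e-3:   # guard: the relation is undefined near ω = 0
        raise ValueError("yaw rate too small for a reliable estimate")
    return (acc_body_y - acc_lidar_y) / yaw_rate ** 2

# Assumed readings: body sensor 2.0 m/s², lidar sensor 1.8 m/s²,
# yaw rate 0.5 rad/s → L_y = 0.2 / 0.25 = 0.8 m.
L_y = estimate_y_position(2.0, 1.8, 0.5)
delta_Ly = L_y - 0.75   # ΔL_y against an assumed stored initial position L_y0
print(L_y, delta_Ly)
```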
  • FIG. 12 is an example of a flowchart illustrating a processing procedure for correcting the output of the lidar 2.
  • The in-vehicle device 1 repeatedly executes the processing shown in FIG. 12 at predetermined timings.
  • First, the in-vehicle device 1 calculates the change amount ΔL_φ of the roll angle and the change amount ΔL_θ of the pitch angle of the lidar 2 from the output values of the lidar acceleration sensor 5 while the vehicle is stopped on a horizontal road or traveling at a constant speed (step S101).
  • Specifically, the in-vehicle device 1 calculates the roll angle L_φ and the pitch angle L_θ of the lidar 2 based on equations equivalent to equations (7) and (10), and calculates the differences from the roll angle L_φ0 and the pitch angle L_θ0 recorded in the lidar installation information IL as the change amounts ΔL_φ and ΔL_θ.
  • Next, the in-vehicle device 1 calculates the change amount ΔL_ψ of the yaw angle of the lidar 2 from the output values of the lidar acceleration sensor 5 while the vehicle is accelerating or decelerating on a straight road (step S102).
  • Specifically, the in-vehicle device 1 calculates the yaw angle L_ψ of the lidar 2 based on an equation equivalent to equation (20), and calculates the difference from the yaw angle L_ψ0 recorded in the lidar installation information IL as the change amount ΔL_ψ.
  • Next, the in-vehicle device 1 calculates the position change amount ΔL_z in the z direction from the z-direction measured values of the lidar 2 irradiating the road surface while the vehicle is stopped on a flat road or traveling at a constant speed (step S103).
  • Specifically, the in-vehicle device 1 calculates the average of the measured values z_b(k) of the road surface while the vehicle is stopped on a flat road surface or traveling at a constant speed, and calculates the change amount ΔL_z by taking the difference from the average of the measured values z_b0(k) of the road surface recorded in the lidar installation information IL under the same conditions (see equation (25)).
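A minimal sketch of this averaging step (equation (25)); the function name, list representation, and sample values are illustrative assumptions:

```python
def estimate_z_offset_change(z_now, z_ref):
    """Sketch of equation (25): ΔL_z is the difference between the average
    of the current road-surface measurements z_b(k) and the average of
    the reference measurements z_b0(k) recorded in the installation
    information IL under the same flat-road conditions."""
    mean_now = sum(z_now) / len(z_now)   # average of z_b(k)
    mean_ref = sum(z_ref) / len(z_ref)   # average of z_b0(k)
    return mean_now - mean_ref           # sign convention assumed

# Assumed samples: current readings average 1.50 m, reference 1.40 m.
print(estimate_z_offset_change([1.52, 1.48, 1.50], [1.40, 1.42, 1.38]))
```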
  • Next, near the start point or end point of a hill, or when passing over a bump on the road surface, the in-vehicle device 1 calculates the position change amount ΔL_x in the x direction from the time interval Δt between the change in the z-direction measured value of the lidar 2 and the change in the output value of the gyro sensor 3 (step S104).
  • Specifically, the in-vehicle device 1 calculates the distance d by multiplying the time interval Δt by the traveling speed v of the vehicle, and calculates the difference from the distance d_0 stored beforehand in the lidar installation information IL as the change amount ΔL_x (see equations (26) and (27)).
  • Next, while the vehicle is turning, the in-vehicle device 1 calculates the position change amount ΔL_y in the y direction from the y-direction output value of the lidar acceleration sensor 5, the y-direction output value of the vehicle body acceleration sensor 4, and the output value of the gyro sensor 3 (step S105). Specifically, the in-vehicle device 1 calculates the position L_y at the time of processing based on equation (36), and calculates the difference from the initial position L_y0 stored in the lidar installation information IL as the position change amount ΔL_y in the y direction.
  • Next, the in-vehicle device 1 determines whether any of the change amounts ΔL_φ, ΔL_θ, ΔL_ψ, ΔL_x, ΔL_y, ΔL_z calculated in steps S101 to S105 is equal to or greater than a predetermined threshold value (step S106).
  • The above-described threshold value is set for determining whether the measurement data of the lidar 2 can continue to be used after undergoing the correction processing of step S108 described later.
  • When any of the change amounts ΔL_φ, ΔL_θ, ΔL_ψ, ΔL_x, ΔL_y, ΔL_z calculated in steps S101 to S105 is equal to or greater than the predetermined threshold (step S106; Yes), the in-vehicle device 1 causes the information output unit 16 to output information indicating a warning that use of the output data of the target lidar 2 (that is, use for obstacle detection, own vehicle position estimation, and the like) must be stopped and that the target lidar 2 needs to be realigned (step S107).
  • The above threshold is an example of the "predetermined amount" in the present invention.
  • On the other hand, when none of the change amounts ΔL_φ, ΔL_θ, ΔL_ψ, ΔL_x, ΔL_y, ΔL_z calculated in steps S101 to S105 is equal to or greater than the predetermined threshold (step S106; No), the in-vehicle device 1 corrects each measurement value of the point cloud data output by the lidar 2 based on these change amounts (step S108). In this case, for example, the in-vehicle device 1 stores a map or the like indicating the correction amount of the measurement value for each change amount, and corrects the above-described measurement values by referring to that map. Alternatively, the in-vehicle device 1 may correct the measurement values using a predetermined ratio of the change amount as the correction amount.
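The decision in steps S106 to S108 can be sketched as follows. The threshold value and the simple subtraction-based correction are illustrative assumptions standing in for the patent's map-based correction amounts:

```python
def check_and_correct(changes, points, threshold=0.05):
    """Sketch of steps S106-S108: if any estimated change amount reaches
    the threshold, stop using the lidar output and warn (S107);
    otherwise correct the point cloud (S108). Here the correction just
    subtracts the translational changes, an assumed stand-in for the
    patent's per-change correction map."""
    if any(abs(v) >= threshold for v in changes.values()):   # step S106; Yes
        return None, "warning: stop using lidar output; realignment required"
    dx = changes.get("dLx", 0.0)
    dy = changes.get("dLy", 0.0)
    dz = changes.get("dLz", 0.0)
    corrected = [(x - dx, y - dy, z - dz) for (x, y, z) in points]   # step S108
    return corrected, "corrected"

# Assumed small shifts stay below the threshold, so the cloud is corrected.
pts, status = check_and_correct({"dLx": 0.01, "dLy": 0.0, "dLz": 0.02},
                                [(10.0, 1.0, 0.5)])
print(status, pts)
```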
  • As described above, the in-vehicle device 1 in the present embodiment estimates at least the attitude, with respect to the vehicle, of the lidar 2 that measures the distance to an object, and performs processing for estimating that attitude based on the detection result of the lidar acceleration sensor 5 provided in the lidar 2 while the vehicle is traveling with acceleration or deceleration. Thereby, the in-vehicle device 1 can accurately convert the measurement data output by the lidar 2 into the vehicle coordinate system, and can determine whether the lidar 2 can continue to be used.
  • In step S108 of FIG. 12, instead of correcting each measurement value of the point cloud data output by the lidar 2 based on the change amounts calculated in steps S101 to S105, the in-vehicle device 1 may convert each measurement value into the vehicle body coordinate system based on the estimated values of the attitude and position of the lidar 2 at the time of processing calculated in steps S101 to S105.
  • That is, the in-vehicle device 1 may use the roll angle L_φ, the pitch angle L_θ, the yaw angle L_ψ, the x-direction position L_x, the y-direction position L_y, and the z-direction position L_z calculated in steps S101 to S105 to convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system to the vehicle body coordinate system, and may execute own vehicle position estimation and automatic driving control based on the converted data.
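A hedged sketch of that conversion: rotate each lidar-frame point by the estimated Euler angles and translate by the estimated position. The Z-Y-X rotation order is an assumption, since this excerpt does not state the patent's convention:

```python
import math

def lidar_to_body(point, roll, pitch, yaw, position):
    """Rotate a lidar-frame point (x, y, z) into the vehicle body frame
    using roll L_phi, pitch L_theta, yaw L_psi, then translate by the
    lidar position (L_x, L_y, L_z). Rotation order is an assumption."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    # rows of R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    R = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    x, y, z = point
    rotated = [R[i][0] * x + R[i][1] * y + R[i][2] * z for i in range(3)]
    return tuple(r + t for r, t in zip(rotated, position))

# With all angles zero the transform reduces to a pure translation:
# a point 1 m ahead of a lidar mounted at (1.5, 0.0, 1.2) maps to (2.5, 0.0, 1.2).
print(lidar_to_body((1.0, 0.0, 0.0), 0.0, 0.0, 0.0, (1.5, 0.0, 1.2)))
```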
  • When an adjustment mechanism such as an actuator for correcting the attitude and position of each lidar 2 is provided, the in-vehicle device 1 may, instead of the processing in step S108, drive the adjustment mechanism based on the change amounts calculated in steps S101 to S105.
  • The configuration of the driving support system shown in FIG. 1 is an example, and the configuration of driving support systems to which the present invention can be applied is not limited to the configuration shown in FIG. 1.
  • For example, instead of being performed by the in-vehicle device 1, the processing shown in FIG. 12 may be executed by the electronic control device of the vehicle.
  • In this case, the lidar installation information IL is stored in, for example, a storage unit in the vehicle, and the electronic control device of the vehicle is configured to be able to receive the output data of the various sensors such as the lidar 2.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A vehicle-mounted device 1 estimates at least the attitude, with respect to a vehicle, of a lidar 2 that measures the distance to a target object, and performs processing for estimating the attitude of the lidar 2 with respect to the vehicle on the basis of the detection result, acquired by a lidar acceleration sensor 5 provided on the lidar 2, at a time when the vehicle is traveling while accelerating or decelerating.

Description

Estimation device, control method, program, and storage medium
The present invention relates to a technique for estimating the attitude of a measurement unit.
Conventionally, techniques for estimating the own vehicle position based on measurement data from a measurement unit such as a radar or a camera have been known. For example, Patent Document 1 discloses a technique for estimating a self-position by collating the output of a measurement sensor with the position information of features registered in advance on a map. Patent Document 2 discloses a vehicle position estimation technique using a Kalman filter.
Patent Document 1: JP 2013-257742 A
Patent Document 2: JP 2017-72422 A
Data obtained from a measurement unit such as a radar or a camera are values in a coordinate system referenced to the measurement unit and depend on the attitude of the measurement unit with respect to the vehicle, so they need to be converted into values in a coordinate system referenced to the vehicle. Therefore, when a shift occurs in the attitude of the measurement unit, it is necessary to detect that shift accurately and reflect it in the data of the measurement unit.
The present invention has been made to solve the above problems, and its main object is to provide an estimation device capable of suitably estimating the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object.
The invention described in the claims is an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object, the estimation device comprising an estimation unit that estimates the attitude of the measurement unit with respect to the moving body based on the detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration.
The invention described in the claims is also a control method executed by an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object, the control method comprising an estimation step of estimating the attitude of the measurement unit with respect to the moving body based on the detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration.
The invention described in the claims is also a program executed by a computer that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object, the program causing the computer to function as an estimation unit that estimates the attitude of the measurement unit with respect to the moving body based on the detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration.
FIG. 1 is a schematic configuration diagram of a driving support system. FIG. 2 is a block diagram showing the functional configuration of the in-vehicle device. FIG. 3 shows the relationship between the vehicle coordinate system and the lidar coordinate system expressed in two-dimensional coordinates. FIG. 4 shows the relationship between the vehicle coordinate system and the lidar coordinate system expressed in three-dimensional coordinates. FIG. 5 shows the gravitational acceleration vectors in the vehicle coordinate system and the lidar coordinate system. FIG. 6 shows the z-direction measured values of a flat road surface obtained by the lidar while the vehicle is traveling, before and after a change in the z-direction position of the lidar. FIG. 7 shows the magnitude of the z-direction measured values of the lidar of a vehicle traveling before and after the start point of an uphill. FIG. 8 is a graph showing the change over time of the measured values and the vehicle body pitch angle obtained while the vehicle is traveling. FIG. 9 shows the magnitude of the z-direction measured values of the lidar of a vehicle traveling before and after a bump. FIG. 10 is a graph showing the change over time of the measured values and the vehicle body pitch rate obtained while the vehicle is traveling. FIG. 11 schematically shows the effects that arise in a turning vehicle. FIG. 12 is an example of a flowchart showing the procedure of the processing for correcting the output of the lidar.
According to a preferred embodiment of the present invention, there is provided an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object, the estimation device comprising an estimation unit that estimates the attitude of the measurement unit with respect to the moving body based on the detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration. With this aspect, the estimation device can suitably estimate the attitude of the measurement unit in the yaw direction with respect to the vehicle. Note that "traveling with acceleration or deceleration" includes both traveling while accelerating and traveling while decelerating.
In one aspect of the above estimation device, the estimation unit estimates the attitude of the measurement unit in the roll and pitch directions based on acceleration data output by the acceleration detection unit while the moving body is traveling at a predetermined speed or stopped, and estimates the attitude of the measurement unit in the yaw direction based on the acceleration data output by the acceleration detection unit while the moving body is traveling with acceleration or deceleration together with the estimated roll- and pitch-direction attitudes. With this aspect, the estimation device can suitably estimate the attitude of the measurement unit in the roll and pitch directions with respect to the vehicle while the moving body is traveling at a predetermined speed or stopped, and can suitably estimate its attitude in the yaw direction while the moving body is traveling with acceleration or deceleration.
In another aspect of the above estimation device, the estimation unit estimates the amount of change in the attitude based on the estimated attitude of the measurement unit and the attitude of the measurement unit stored in a storage unit. This allows the estimation device to suitably estimate the amount of change of the current attitude relative to the standard attitude stored in the storage unit.
In another aspect of the above estimation device, the estimation unit estimates the position of the measurement unit in the height direction based on measurement data of the measurement unit indicating the position of the road surface in the height direction. With this aspect, the estimation device can suitably estimate the position of the measurement unit in the height direction.
In another aspect of the above estimation device, the estimation unit estimates the position of the measurement unit in the front-rear direction of the moving body based on the distance between the moving body and a road point where the gradient changes, at the time that road point is measured by the measurement unit. With this aspect, the estimation device can suitably estimate the position of the measurement unit in the front-rear direction of the moving body.
In another aspect of the above estimation device, the estimation unit calculates the distance based on the time difference between a change in data output by an inclination detection unit that detects the inclination of the moving body in the pitch direction and a change in measurement data output by the measurement unit. With this aspect, the estimation device suitably calculates the distance between the moving body and the road point where the gradient changes at the time that point is measured by the measurement unit, and can use it to estimate the position of the measurement unit in the front-rear direction of the moving body.
In another aspect of the above estimation device, the estimation unit estimates the position of the measurement unit in the left-right direction of the moving body based on the left-right acceleration data of the moving body output by the acceleration detection unit while the moving body is turning, the left-right acceleration data of the moving body output by an acceleration sensor mounted on the moving body, and the yaw rate of the moving body output by a gyro sensor mounted on the moving body. With this aspect, the estimation device can suitably estimate the position of the measurement unit in the left-right direction of the moving body.
In another aspect of the above estimation device, the estimation unit estimates the amount of change in the position based on the estimated position of the measurement unit and the position of the measurement unit stored in a storage unit. This allows the estimation device to suitably estimate the amount of change of the current position relative to the standard position of the measurement unit stored in the storage unit.
In another aspect, the above estimation device further includes a correction unit that corrects the measurement data output by the measurement unit based on the amount of change. With this aspect, even when a shift has occurred in the attitude or position of the measurement unit, the estimation device can correct the measurement data of the measurement unit so that the shift does not affect it.
In another aspect, the above estimation device further includes a stop control unit that stops processing based on the measurement data output by the measurement unit when the amount of change is equal to or greater than a predetermined amount. With this aspect, the estimation device can reliably prevent the accuracy of the various processes that use the measurement data from degrading as a result of using measurement data from a measurement unit with a large attitude or position shift.
According to another preferred embodiment of the present invention, there is provided a control method executed by an estimation device that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object, the control method comprising an estimation step of estimating the attitude of the measurement unit with respect to the moving body based on the detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration. By using this control method, the estimation device can suitably estimate the attitude of the measurement unit in the yaw direction with respect to the vehicle.
According to another preferred embodiment of the present invention, there is provided a program executed by a computer that estimates the attitude, with respect to a moving body, of a measurement unit that measures the distance to an object, the program causing the computer to function as an estimation unit that estimates the attitude of the measurement unit with respect to the moving body based on the detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration. By executing this program, the computer can suitably estimate the attitude of the measurement unit in the yaw direction with respect to the vehicle. Preferably, the program is stored in a storage medium.
Hereinafter, preferred embodiments of the present invention will be described with reference to the drawings.
[Schematic configuration]
FIG. 1 is a schematic configuration diagram of a driving support system according to the present embodiment. The driving support system shown in FIG. 1 is mounted on a vehicle and includes an in-vehicle device 1 that performs control related to driving support of the vehicle, a lidar (Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2, a gyro sensor 3, a vehicle body acceleration sensor 4, and a lidar acceleration sensor 5.
The in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle body acceleration sensor 4, and the lidar acceleration sensor 5, and acquires their output data. It also stores a map database (map DB 10) that holds road data and feature information on features provided near roads. Based on the above output data and the map DB 10, the in-vehicle device 1 estimates the position of the vehicle (also referred to as the "own vehicle position") and, based on the estimation result, performs control related to driving support of the vehicle such as automatic driving control. Furthermore, the in-vehicle device 1 estimates the attitude and position of the lidar 2 based on the outputs of the lidar 2, the gyro sensor 3, the vehicle body acceleration sensor 4, and the lidar acceleration sensor 5, and based on this estimation result performs processing such as correcting each measurement value of the point cloud data output by the lidar 2. The in-vehicle device 1 is an example of the "estimation device" in the present invention.
The lidar 2 discretely measures the distances to objects in the external environment by emitting pulsed laser light over predetermined angular ranges in the horizontal and vertical directions, and generates three-dimensional point cloud information indicating the positions of those objects. In this case, the lidar 2 has an irradiation unit that emits laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) of the emitted laser light, and an output unit that outputs scan data based on the light reception signal output by the light receiving unit. The scan data are generated based on the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction, specified from the light reception signal, and are supplied to the in-vehicle device 1. In this embodiment, as an example, lidars 2 are provided at the front and rear portions of the vehicle. The lidar 2 is an example of the "measurement unit" in the present invention.
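For illustration only (not code from the patent), one scan return defined by an irradiation direction and a measured distance can be converted into a point in the lidar coordinate system as follows; the axis and angle conventions are assumptions:

```python
import math

def scan_to_point(distance, azimuth_rad, elevation_rad):
    """Convert one lidar return (range plus horizontal and vertical
    irradiation angles) into an (x, y, z) point in the lidar coordinate
    system. x forward, y left, z up is an assumed convention."""
    ch = math.cos(elevation_rad)
    return (distance * ch * math.cos(azimuth_rad),
            distance * ch * math.sin(azimuth_rad),
            distance * math.sin(elevation_rad))

# A return 10 m straight ahead with no elevation lands on the x axis.
print(scan_to_point(10.0, 0.0, 0.0))   # (10.0, 0.0, 0.0)
```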
The gyro sensor 3 is provided in the vehicle and supplies an output signal corresponding to the yaw rate of the vehicle body to the in-vehicle device 1. The vehicle body acceleration sensor 4 is a three-axis acceleration sensor provided in the vehicle, and supplies to the in-vehicle device 1 detection signals corresponding to three-axis acceleration data in the traveling, lateral, and height directions of the vehicle body. The gyro sensor 3 and the vehicle body acceleration sensor 4 are examples of the "inclination detection unit" in the present invention. The lidar acceleration sensor 5 is a three-axis acceleration sensor provided in each lidar 2, and supplies detection signals corresponding to the three-axis acceleration data of the lidar 2 in which it is installed to the in-vehicle device 1. The lidar acceleration sensor 5 is an example of the "acceleration detection unit" in the present invention.
FIG. 2 is a block diagram showing the functional configuration of the in-vehicle device 1. The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
The interface 11 acquires output data from the sensors such as the lidar 2, the gyro sensor 3, the vehicle body acceleration sensor 4, and the lidar acceleration sensor 5, and supplies them to the control unit 15. The interface 11 also supplies signals relating to the traveling control of the vehicle, generated by the control unit 15, to the electronic control unit (ECU: Electronic Control Unit) of the vehicle.
The storage unit 12 stores the programs executed by the control unit 15 and the information the control unit 15 needs in order to execute predetermined processing. In this embodiment, the storage unit 12 stores the map DB 10 and lidar installation information IL. The lidar installation information IL is information on the relative three-dimensional position and attitude of each lidar 2 at a certain reference time (for example, immediately after alignment adjustment of the lidar 2, when no attitude or position deviation has occurred). In this embodiment, the attitude of the lidar 2 and the like is expressed by a roll angle, a pitch angle, and a yaw angle (i.e., Euler angles). The lidar installation information IL may be information on the position and attitude actually measured at the above reference time, or may be information on the position and attitude of the lidar 2 estimated by the in-vehicle device 1 through the position and attitude estimation processing of the lidar 2 described later.
The input unit 14 is a button, touch panel, remote controller, voice input device, or the like operated by the user, and accepts inputs such as an input specifying a destination for route search and an input switching automatic driving on or off. The information output unit 16 is, for example, a display or a speaker that produces output under the control of the control unit 15.
The control unit 15 includes a CPU that executes programs and the like, and controls the entire in-vehicle device 1. The control unit 15 estimates the own vehicle position based on the output signals of the sensors supplied from the interface 11 and on the map DB 10, and, based on the estimation result, performs control relating to driving support of the vehicle, including automatic driving control. When using the output data of a lidar 2, the control unit 15 converts the measurement data output by that lidar 2 from the coordinate system referenced to the lidar 2 into the coordinate system referenced to the vehicle, using the attitude and position of the lidar 2 recorded in the lidar installation information IL. Furthermore, in this embodiment, the control unit 15 estimates the current (i.e., at the processing reference time) position and attitude of the lidar 2 relative to the vehicle, calculates the amount of change from the position and attitude recorded in the lidar installation information IL, and corrects the measurement data output by the lidar 2 based on that amount of change. Thereby, even when a deviation arises in the position or attitude of the lidar 2, the control unit 15 corrects the measurement data output by the lidar 2 so that they are not affected by the deviation. The control unit 15 is an example of the “estimation unit”, the “correction unit”, the “stop control unit”, and the “computer” that executes a program in the present invention.
[Lidar position and attitude estimation]
Next, a method for estimating the position and attitude of the lidar 2 will be described. The in-vehicle device 1 executes the processing described below for each lidar 2.
(1) Conversion of coordinate systems
The three-dimensional coordinates indicated by each measurement point of the three-dimensional point cloud data acquired by the lidar 2 are expressed in a coordinate system referenced to the position and attitude of the lidar 2 (also called the “lidar coordinate system”), and must be converted into a coordinate system referenced to the position and attitude of the vehicle (also called the “vehicle coordinate system”). First, the conversion between the lidar coordinate system and the vehicle coordinate system will be described.
FIG. 3 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system expressed in two-dimensional coordinates. Here, the vehicle coordinate system has its origin at the center of the vehicle, a coordinate axis “x_b” along the traveling direction of the vehicle, and a coordinate axis “y_b” along the lateral direction of the vehicle. The lidar coordinate system has a coordinate axis “x_L” along the front direction of the lidar 2 (see arrow A2) and a coordinate axis “y_L” along the lateral direction of the lidar 2.
Here, when the yaw angle of the lidar 2 with respect to the vehicle coordinate system is “L_ψ0” and the position of the lidar 2 is [L_x0, L_y0]^T, the measurement point [x_b(k), y_b(k)]^T at time “k” seen from the vehicle coordinate system is converted into the coordinates [x_L(k), y_L(k)]^T of the lidar coordinate system by the following Equation (1) using the rotation matrix “C_ψ0”.
\[
\begin{bmatrix} x_L(k) \\ y_L(k) \end{bmatrix}
= C_{\psi 0}\left(
\begin{bmatrix} x_b(k) \\ y_b(k) \end{bmatrix}
- \begin{bmatrix} L_{x0} \\ L_{y0} \end{bmatrix}
\right),
\qquad
C_{\psi 0} =
\begin{bmatrix} \cos L_{\psi 0} & \sin L_{\psi 0} \\ -\sin L_{\psi 0} & \cos L_{\psi 0} \end{bmatrix}
\tag{1}
\]
Conversely, the conversion from the lidar coordinate system to the vehicle coordinate system uses the inverse (transpose) of the rotation matrix. Therefore, the measurement point [x_L(k), y_L(k)]^T at time k acquired in the lidar coordinate system can be converted into the coordinates [x_b(k), y_b(k)]^T of the vehicle coordinate system by the following Equation (2).
\[
\begin{bmatrix} x_b(k) \\ y_b(k) \end{bmatrix}
= C_{\psi 0}^{T}
\begin{bmatrix} x_L(k) \\ y_L(k) \end{bmatrix}
+ \begin{bmatrix} L_{x0} \\ L_{y0} \end{bmatrix}
\tag{2}
\]
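As a concrete illustration, the pair of conversions of Equations (1) and (2) can be sketched in Python as follows. This is a minimal sketch; the function and variable names are illustrative and not part of the embodiment.

```python
import math

def vehicle_to_lidar_2d(xb, yb, L_x0, L_y0, L_psi0):
    # Eq. (1): translate by the lidar mounting offset, then rotate by C_psi0.
    c, s = math.cos(L_psi0), math.sin(L_psi0)
    dx, dy = xb - L_x0, yb - L_y0
    return c * dx + s * dy, -s * dx + c * dy

def lidar_to_vehicle_2d(xL, yL, L_x0, L_y0, L_psi0):
    # Eq. (2): rotate back with the transpose of C_psi0, then translate back.
    c, s = math.cos(L_psi0), math.sin(L_psi0)
    return c * xL - s * yL + L_x0, s * xL + c * yL + L_y0
```

Applying Equation (1) and then Equation (2) must return the original vehicle-frame point, which provides a convenient self-check of the two matrices.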
FIG. 4 is a diagram showing the relationship between the vehicle coordinate system and the lidar coordinate system expressed in three-dimensional coordinates. Here, the coordinate axis perpendicular to the coordinate axes x_b and y_b is denoted “z_b”, and the coordinate axis perpendicular to the coordinate axes x_L and y_L is denoted “z_L”.
When the roll angle of the lidar 2 with respect to the vehicle coordinate system is “L_φ0”, the pitch angle is “L_θ0”, the yaw angle is “L_ψ0”, and the positions of the lidar 2 on the coordinate axes x_b, y_b, and z_b are “L_x0”, “L_y0”, and “L_z0”, respectively, the measurement point [x_b0(k), y_b0(k), z_b0(k)]^T at time “k” seen from the vehicle coordinate system is converted into the coordinates [x_L0(k), y_L0(k), z_L0(k)]^T of the lidar coordinate system by the following Equation (3) using the direction cosine matrix “C_0” composed of the rotation matrices “C_φ0”, “C_θ0”, and “C_ψ0” corresponding to roll, pitch, and yaw.
\[
\begin{bmatrix} x_{L0}(k) \\ y_{L0}(k) \\ z_{L0}(k) \end{bmatrix}
= C_0\left(
\begin{bmatrix} x_{b0}(k) \\ y_{b0}(k) \\ z_{b0}(k) \end{bmatrix}
- \begin{bmatrix} L_{x0} \\ L_{y0} \\ L_{z0} \end{bmatrix}
\right),
\qquad
C_0 = C_{\phi 0}\, C_{\theta 0}\, C_{\psi 0}
\tag{3}
\]
\[
C_{\phi 0} =
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos L_{\phi 0} & \sin L_{\phi 0} \\ 0 & -\sin L_{\phi 0} & \cos L_{\phi 0} \end{bmatrix},
\quad
C_{\theta 0} =
\begin{bmatrix} \cos L_{\theta 0} & 0 & -\sin L_{\theta 0} \\ 0 & 1 & 0 \\ \sin L_{\theta 0} & 0 & \cos L_{\theta 0} \end{bmatrix},
\quad
C_{\psi 0} =
\begin{bmatrix} \cos L_{\psi 0} & \sin L_{\psi 0} & 0 \\ -\sin L_{\psi 0} & \cos L_{\psi 0} & 0 \\ 0 & 0 & 1 \end{bmatrix}
\]
Conversely, the conversion from the lidar coordinate system to the vehicle coordinate system uses the inverse (transpose) of the direction cosine matrix. Therefore, the measurement point [x_L0(k), y_L0(k), z_L0(k)]^T at time k acquired in the lidar coordinate system can be converted into the coordinates [x_b0(k), y_b0(k), z_b0(k)]^T of the vehicle coordinate system by the following Equation (4).
\[
\begin{bmatrix} x_{b0}(k) \\ y_{b0}(k) \\ z_{b0}(k) \end{bmatrix}
= C_0^{T}
\begin{bmatrix} x_{L0}(k) \\ y_{L0}(k) \\ z_{L0}(k) \end{bmatrix}
+ \begin{bmatrix} L_{x0} \\ L_{y0} \\ L_{z0} \end{bmatrix}
\tag{4}
\]
Hereinafter, the directions along the coordinate axes x_b, y_b, and z_b of the vehicle coordinate system are also simply called the “x direction”, “y direction”, and “z direction”, respectively.
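The three-dimensional conversions of Equations (3) and (4) can be sketched in the same way. The sketch below assumes the common aerospace-style elementary rotation matrices for C_φ0, C_θ0, and C_ψ0; all names are illustrative only.

```python
import math

def rot_roll(a):   # elementary rotation C_phi about the x axis
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, s], [0.0, -s, c]]

def rot_pitch(a):  # elementary rotation C_theta about the y axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]]

def rot_yaw(a):    # elementary rotation C_psi about the z axis
    c, s = math.cos(a), math.sin(a)
    return [[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def matvec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def direction_cosine(phi, theta, psi):
    # C0 = C_phi0 * C_theta0 * C_psi0, as in Eq. (3).
    return matmul(rot_roll(phi), matmul(rot_pitch(theta), rot_yaw(psi)))

def vehicle_to_lidar(p_b, L0, C0):
    # Eq. (3): subtract the lidar position, then rotate into the lidar frame.
    return matvec(C0, [p_b[i] - L0[i] for i in range(3)])

def lidar_to_vehicle(p_L, L0, C0):
    # Eq. (4): rotate back with the transpose (inverse), then add the position.
    q = matvec(transpose(C0), p_L)
    return [q[i] + L0[i] for i in range(3)]
```

Because C_0 is orthonormal, its transpose is its inverse, so a round trip through Equations (3) and (4) reproduces the original vehicle-frame point.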
(2) Estimation of roll angle and pitch angle
Next, a method for estimating the roll angle L_φ0 and pitch angle L_θ0 of the lidar 2 will be described. As described below, the in-vehicle device 1 estimates the roll angle L_φ0 and pitch angle L_θ0 of the lidar 2 based on the three-axis acceleration output values obtained from the lidar acceleration sensor 5 installed on the target lidar 2. Hereinafter, for convenience of explanation, it is assumed that the lidar acceleration sensor 5 measures the three-axis accelerations of the lidar coordinate system.
When the vehicle is stopped at a horizontal location or traveling at a constant speed, the only acceleration in the vehicle coordinate system is the gravitational acceleration g in the z direction. FIG. 5(A) shows the vector of the gravitational acceleration g in the vehicle coordinate system, and FIG. 5(B) shows the vector of the gravitational acceleration g in the lidar coordinate system. Accordingly, the following Equation (5) holds for the output values [α_x, α_y, α_z]^T of the lidar acceleration sensor 5 in the lidar coordinate system.
\[
\begin{bmatrix} \alpha_x \\ \alpha_y \\ \alpha_z \end{bmatrix}
= C_0 \begin{bmatrix} 0 \\ 0 \\ g \end{bmatrix}
= \begin{bmatrix} -g\sin L_{\theta 0} \\ g\cos L_{\theta 0}\sin L_{\phi 0} \\ g\cos L_{\theta 0}\cos L_{\phi 0} \end{bmatrix}
\tag{5}
\]
Here, calculating “α_y/α_z” using α_y and α_z of Equation (5) gives the following Equation (6).
\[
\frac{\alpha_y}{\alpha_z}
= \frac{g\cos L_{\theta 0}\sin L_{\phi 0}}{g\cos L_{\theta 0}\cos L_{\phi 0}}
= \tan L_{\phi 0}
\tag{6}
\]
Therefore, the roll angle L_φ0 of the lidar 2 is expressed by the following Equation (7), which does not use the gravitational acceleration g.
\[
L_{\phi 0} = \tan^{-1}\!\frac{\alpha_y}{\alpha_z}
\tag{7}
\]
Also, calculating “α_y² + α_z²” using α_y and α_z of Equation (5) gives the following Equation (8), and further eliminating the gravitational acceleration g using Equation (8) and α_x of Equation (5) gives the following Equation (9).
\[
\alpha_y^2 + \alpha_z^2 = g^2\cos^2 L_{\theta 0}
\tag{8}
\]
\[
\frac{\alpha_x}{\sqrt{\alpha_y^2 + \alpha_z^2}}
= \frac{-g\sin L_{\theta 0}}{g\cos L_{\theta 0}}
= -\tan L_{\theta 0}
\tag{9}
\]
Therefore, the pitch angle L_θ0 of the lidar 2 is expressed by the following Equation (10), which does not use the gravitational acceleration g.
\[
L_{\theta 0} = -\tan^{-1}\!\frac{\alpha_x}{\sqrt{\alpha_y^2 + \alpha_z^2}}
\tag{10}
\]
From the above, while the vehicle is stopped or traveling at a constant speed at a horizontal location, the in-vehicle device 1 can calculate the roll angle L_φ0 and pitch angle L_θ0 of the lidar 2 based on the output values of the lidar acceleration sensor 5 by referring to Equations (7) and (10). The in-vehicle device 1 may determine whether the current position is a horizontal location based on the output of the gyro sensor 3 or the vehicle body acceleration sensor 4, or by referring to the map DB 10 for information on the inclination angle in the road data of the road corresponding to the current position of the vehicle. The in-vehicle device 1 may also determine whether the vehicle is stopped or traveling at a constant speed based on the output of the vehicle body acceleration sensor 4 or on the output of a vehicle speed sensor (not shown).
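Under the stated stationary or constant-speed condition, Equations (7) and (10) reduce to the following computation. This is a sketch; the helper name and the simulated sensor values in the self-check are illustrative, and it assumes the sign conventions of Equation (5).

```python
import math

def roll_pitch_from_gravity(ax, ay, az):
    # Eq. (7): roll from the ratio of the y and z accelerometer outputs.
    roll = math.atan2(ay, az)
    # Eq. (10): pitch from the x output and the magnitude of the y/z pair.
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch
```

Note that g cancels in both ratios, so the result does not depend on the exact value of the gravitational acceleration.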
(3) Estimation of yaw angle
Next, a method for estimating the yaw angle L_ψ0 of the lidar 2 will be described. As described below, using the calculated roll angle L_φ0 and pitch angle L_θ0 of the lidar 2, the in-vehicle device 1 estimates the yaw angle L_ψ0 based on the three-axis acceleration output values obtained from the lidar acceleration sensor 5 while the vehicle is accelerating or decelerating on a straight road.
When the vehicle is accelerating or decelerating on a straight road with acceleration “α”, an acceleration α arises in the x direction of the vehicle coordinate system and the gravitational acceleration g arises in the z direction, so the following Equation (11) holds for the output values [α_x, α_y, α_z]^T of the lidar acceleration sensor 5 in the lidar coordinate system.
\[
\begin{bmatrix} \alpha_x \\ \alpha_y \\ \alpha_z \end{bmatrix}
= C_0 \begin{bmatrix} \alpha \\ 0 \\ g \end{bmatrix}
= \begin{bmatrix}
\alpha\cos L_{\psi 0}\cos L_{\theta 0} - g\sin L_{\theta 0} \\
-\alpha\sin L_{\psi 0}\cos L_{\phi 0} + \left(\alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}\right)\sin L_{\phi 0} \\
\alpha\sin L_{\psi 0}\sin L_{\phi 0} + \left(\alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}\right)\cos L_{\phi 0}
\end{bmatrix}
\tag{11}
\]
Here, in order to eliminate the acceleration α and the gravitational acceleration g, the calculations described below are performed.
First, multiplying α_y of Equation (11) by cos L_φ0 and α_z of Equation (11) by sin L_φ0 gives the following Equations (12) and (13), respectively.
\[
\alpha_y\cos L_{\phi 0}
= -\alpha\sin L_{\psi 0}\cos^2 L_{\phi 0}
+ \left(\alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}\right)\sin L_{\phi 0}\cos L_{\phi 0}
\tag{12}
\]
\[
\alpha_z\sin L_{\phi 0}
= \alpha\sin L_{\psi 0}\sin^2 L_{\phi 0}
+ \left(\alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}\right)\sin L_{\phi 0}\cos L_{\phi 0}
\tag{13}
\]
Then, subtracting Equation (13) from Equation (12) gives the following Equation (14).
\[
\alpha_y\cos L_{\phi 0} - \alpha_z\sin L_{\phi 0} = -\alpha\sin L_{\psi 0}
\tag{14}
\]
Similarly, multiplying α_y of Equation (11) by sin L_φ0 and α_z of Equation (11) by cos L_φ0 gives the following Equations (15) and (16), respectively.
\[
\alpha_y\sin L_{\phi 0}
= -\alpha\sin L_{\psi 0}\sin L_{\phi 0}\cos L_{\phi 0}
+ \left(\alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}\right)\sin^2 L_{\phi 0}
\tag{15}
\]
\[
\alpha_z\cos L_{\phi 0}
= \alpha\sin L_{\psi 0}\sin L_{\phi 0}\cos L_{\phi 0}
+ \left(\alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}\right)\cos^2 L_{\phi 0}
\tag{16}
\]
Then, adding Equation (15) and Equation (16) gives the following Equation (17).
\[
\alpha_y\sin L_{\phi 0} + \alpha_z\cos L_{\phi 0}
= \alpha\cos L_{\psi 0}\sin L_{\theta 0} + g\cos L_{\theta 0}
\tag{17}
\]
Furthermore, adding Equation (17) multiplied by sin L_θ0 to α_x of Equation (11) multiplied by cos L_θ0 gives the following Equation (18).
\[
\left(\alpha_y\sin L_{\phi 0} + \alpha_z\cos L_{\phi 0}\right)\sin L_{\theta 0} + \alpha_x\cos L_{\theta 0}
= \alpha\cos L_{\psi 0}
\tag{18}
\]
Then, dividing Equation (14) by Equation (18) gives the following Equation (19).
\[
\frac{\alpha_y\cos L_{\phi 0} - \alpha_z\sin L_{\phi 0}}
{\left(\alpha_y\sin L_{\phi 0} + \alpha_z\cos L_{\phi 0}\right)\sin L_{\theta 0} + \alpha_x\cos L_{\theta 0}}
= -\tan L_{\psi 0}
\tag{19}
\]
Therefore, the yaw angle L_ψ0 of the lidar 2 is expressed by the following Equation (20), which uses neither the acceleration α nor the gravitational acceleration g.
\[
L_{\psi 0} = -\tan^{-1}\!
\frac{\alpha_y\cos L_{\phi 0} - \alpha_z\sin L_{\phi 0}}
{\left(\alpha_y\sin L_{\phi 0} + \alpha_z\cos L_{\phi 0}\right)\sin L_{\theta 0} + \alpha_x\cos L_{\theta 0}}
\tag{20}
\]
From the above, the in-vehicle device 1 can calculate the yaw angle L_ψ0 of the lidar 2 by referring to Equation (20) based on the output values of the lidar acceleration sensor 5 obtained while accelerating or decelerating on a straight road. The in-vehicle device 1 may determine whether the vehicle is traveling on a straight road based on the output of the vehicle body acceleration sensor 4, or by referring to the map DB 10 for the road data of the road corresponding to the current position. The in-vehicle device 1 may also determine whether the vehicle is accelerating or decelerating based on the output of the vehicle body acceleration sensor 4 or on the output of a vehicle speed sensor (not shown).
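The yaw computation of Equation (20) can be sketched as follows, given roll and pitch already estimated via Equations (7) and (10). The names are illustrative, and the self-check synthesizes accelerometer outputs according to Equation (11) under its assumed sign conventions.

```python
import math

def yaw_from_accel(ax, ay, az, roll, pitch):
    # Numerator: Eq. (14), equal to -alpha * sin(L_psi0).
    num = ay * math.cos(roll) - az * math.sin(roll)
    # Denominator: Eq. (18), equal to alpha * cos(L_psi0).
    den = (ay * math.sin(roll) + az * math.cos(roll)) * math.sin(pitch) \
          + ax * math.cos(pitch)
    # Eq. (20): both alpha and g have cancelled out of the ratio.
    return math.atan2(-num, den)
```

Because both the numerator and denominator scale with α, the result is independent of how strongly the vehicle accelerates, as long as α is nonzero.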
(4) Calculation of attitude change amounts
Next, the calculation of the change amounts of the roll angle, pitch angle, and yaw angle of the lidar 2 from the generation of the lidar installation information IL to the processing reference time, which is the current time, will be supplementarily described. In the following, the roll angle, pitch angle, and yaw angle of the lidar 2 recorded in the lidar installation information IL (i.e., at the initial time when no attitude or position deviation of the lidar 2 had occurred) are denoted “L_φ0”, “L_θ0”, and “L_ψ0”, respectively.
Suppose that, due to some influence, the roll angle, pitch angle, and yaw angle of the lidar 2 have changed by “ΔL_φ”, “ΔL_θ”, and “ΔL_ψ”, respectively, since the generation of the lidar installation information IL, so that the roll angle, pitch angle, and yaw angle at the processing reference time have become “L_φ”, “L_θ”, and “L_ψ”, as shown in the following Equation (21).
\[
\begin{bmatrix} L_{\phi} \\ L_{\theta} \\ L_{\psi} \end{bmatrix}
= \begin{bmatrix} L_{\phi 0} + \Delta L_{\phi} \\ L_{\theta 0} + \Delta L_{\theta} \\ L_{\psi 0} + \Delta L_{\psi} \end{bmatrix}
\tag{21}
\]
Similarly, suppose that the positions of the lidar 2 on the coordinate axes x_b, y_b, and z_b have changed by “ΔL_x”, “ΔL_y”, and “ΔL_z”, respectively, since the generation of the lidar installation information IL, so that the positions on the coordinate axes x_b, y_b, and z_b at the processing reference time have become “L_x”, “L_y”, and “L_z”, as shown in the following Equation (22).
\[
\begin{bmatrix} L_x \\ L_y \\ L_z \end{bmatrix}
= \begin{bmatrix} L_{x0} + \Delta L_x \\ L_{y0} + \Delta L_y \\ L_{z0} + \Delta L_z \end{bmatrix}
\tag{22}
\]
Further, suppose that, due to the above changes in the attitude and position of the lidar 2, the measurement point at time k obtained from the lidar 2 has changed from [x_L0(k), y_L0(k), z_L0(k)]^T to [x_L(k), y_L(k), z_L(k)]^T. In this case, the conversion of the measurement point at time k from the vehicle coordinate system to the lidar coordinate system is given by Equation (23).
\[
\begin{bmatrix} x_L(k) \\ y_L(k) \\ z_L(k) \end{bmatrix}
= C\left(
\begin{bmatrix} x_b(k) \\ y_b(k) \\ z_b(k) \end{bmatrix}
- \begin{bmatrix} L_x \\ L_y \\ L_z \end{bmatrix}
\right),
\qquad
C = C_{\phi}\, C_{\theta}\, C_{\psi}
\tag{23}
\]
Here, since Equation (23) has the same form as Equation (3), the roll angle L_φ, pitch angle L_θ, and yaw angle L_ψ of the lidar 2 can be calculated using the same equations as Equations (7), (10), and (20). Therefore, by calculating the differences between these roll angle L_φ, pitch angle L_θ, and yaw angle L_ψ and the roll angle L_φ0, pitch angle L_θ0, and yaw angle L_ψ0 recorded in the lidar installation information IL, the in-vehicle device 1 can suitably calculate the roll angle change amount ΔL_φ, the pitch angle change amount ΔL_θ, and the yaw angle change amount ΔL_ψ. Note that the in-vehicle device 1 may store, as the lidar installation information IL, the roll angle L_φ0, pitch angle L_θ0, and yaw angle L_ψ0 of the lidar 2 estimated based on Equations (7), (10), and (20) at the first timing at which the attitude estimation processing of the lidar 2 can be executed after the lidar 2 is attached to the vehicle (i.e., after alignment adjustment).
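For the roll and pitch components, the change amounts of Equation (21) are simply differences between the angles re-estimated at the processing reference time and the angles stored in the lidar installation information IL. A minimal sketch, with illustrative names and assuming the gravity-only condition of Equation (5) for both measurements:

```python
import math

def roll_pitch_from_gravity(ax, ay, az):
    # Eqs. (7) and (10) applied to one accelerometer reading.
    return math.atan2(ay, az), math.atan2(-ax, math.hypot(ay, az))

def attitude_change(accel_ref, accel_now):
    # Eq. (21): (roll, pitch) deltas relative to the installation-time reference.
    r0, p0 = roll_pitch_from_gravity(*accel_ref)
    r1, p1 = roll_pitch_from_gravity(*accel_now)
    return r1 - r0, p1 - p0
```

The yaw delta is obtained the same way from Equation (20) when straight-road acceleration data are available.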
(5) Calculation of position change amount in the z direction
Next, a method for calculating the position change amount of the lidar 2 in the z direction will be described. The in-vehicle device 1 calculates the position change amount “ΔL_z” of the lidar 2 in the z direction based on the z-direction values of the measurement points at which the lidar 2 measures the road surface while the vehicle is stopped or traveling at a constant speed on a flat road.
The coordinates [x_b(k), y_b(k), z_b(k)]^T of the measurement point at time k in the vehicle coordinate system at the processing reference time are expressed by the following Equation (24) using the direction cosine matrix “C”, in the same way as the coordinates [x_b0(k), y_b0(k), z_b0(k)]^T given by Equation (4).
\[
\begin{bmatrix} x_b(k) \\ y_b(k) \\ z_b(k) \end{bmatrix}
= C^{-1}
\begin{bmatrix} x_L(k) \\ y_L(k) \\ z_L(k) \end{bmatrix}
+ \begin{bmatrix} L_x \\ L_y \\ L_z \end{bmatrix}
\tag{24}
\]
Here, when the roll angle L_φ, pitch angle L_θ, and yaw angle L_ψ at the processing reference time have been obtained accurately, “C^{-1}[x_L(k), y_L(k), z_L(k)]^T” is correctly converted into values in the vehicle coordinate system. Therefore, when the vehicle is stopped or traveling at a constant speed on a flat road surface, the pitch variation of the vehicle is small, and the z-direction measurement value of the lidar 2 irradiating the road surface corresponds to the height from the road surface to the mounting position of the lidar 2 (i.e., the position L_z).
FIG. 6(A) shows the z-direction measurement value z_b0(k) of the road surface measured by the lidar 2 while the vehicle travels on a flat road surface before the position L_z of the lidar 2 changes, and FIG. 6(B) shows the z-direction measurement value z_b(k) of the road surface measured by the lidar 2 while the vehicle travels on a flat road surface after the position L_z of the lidar 2 changes. As shown in FIGS. 6(A) and 6(B), the measurement values z_b0(k) and z_b(k) have the same lengths as the z-direction positions L_z0 and L_z of the lidar 2, respectively.
Therefore, the difference between the road-surface measurement value z_b0(k) calculated by Equation (4) and the road-surface measurement value z_b(k) after the attitude change is equal to the z-direction change amount ΔL_z. The measurement values z_b0(k) and z_b(k) used in this case are desirably averaged over a plurality of scan lines and over time.
Taking the above into account, the in-vehicle device 1 calculates the change amount ΔL_z based on the following Equation (25).
\[
\Delta L_z = \overline{z_b(k)} - \overline{z_{b0}(k)}
\tag{25}
\]
where the overlines denote the averages over a plurality of scan lines and over time.
Therefore, after measuring the initial position L_z0 of the lidar 2, the in-vehicle device 1 calculates the average of the road-surface measurement values z_b0(k) while the vehicle is stopped or traveling at a constant speed on a flat road surface, and records it in the lidar installation information IL together with the initial position L_z0 of the lidar 2. Thereby, by referring to the lidar installation information IL, the in-vehicle device 1 can suitably calculate the position L_z and the change amount ΔL_z at the processing reference time.
Note that the in-vehicle device 1 may estimate the stroke amount of the vehicle suspension (i.e., the amount of compression from the fully extended position) at the time of measuring the initial position L_z0 and at the processing reference time, and correct the z-direction change amount ΔL_z based on the difference between the estimated stroke amounts. In this case, the in-vehicle device 1 may measure the suspension stroke amount based on a stroke sensor or the like provided on the vehicle suspension, or may estimate the suspension stroke amount based on the number of occupants of the vehicle. This makes it possible to calculate the change amount ΔL_z more accurately.
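The height-change computation of Equation (25) then amounts to differencing averaged road-surface returns. A sketch, where the sample values are illustrative and the averaging over scan lines and time is reduced to a simple arithmetic mean:

```python
def delta_lz(z_road_ref, z_road_now):
    # Eq. (25): difference between the averaged road-surface measurements
    # taken at the reference time and at the processing reference time.
    mean = lambda values: sum(values) / len(values)
    return mean(z_road_now) - mean(z_road_ref)
```

The stored reference average plays the role of z_b0(k) recorded in the lidar installation information IL.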
(6) Calculation of position change amount in the x direction
Next, a method for calculating the position change amount of the lidar 2 in the x direction will be described. The in-vehicle device 1 calculates the x-direction position change amount ΔL_x based on the time difference between the change in the z-direction measurement value of the lidar 2 and the change in the pitch rate obtained from the gyro sensor 3, near the start or end point of a slope or when passing over a bump on the road surface.
FIGS. 7(A) to 7(G) are diagrams in which the magnitude of the z-direction measurement value z_b(k) of a specific scan line of the lidar 2 of a vehicle traveling around the start point 50 of an uphill slope is represented by line segments 51 to 57. FIG. 8(A) is a graph showing the time change of the measurement value z_b(k) measured while the vehicle travels as shown in FIGS. 7(A) to 7(G), and FIG. 8(B) is a graph showing the time change of the vehicle body pitch angle (the integral of the pitch rate) over the same period as FIG. 8(A). The numbers 51 to 57 in FIGS. 8(A) and 8(B) indicate the positions corresponding to the measurement values z_b(k) represented by the line segments 51 to 57 in FIGS. 7(A) to 7(G).
Immediately before the start or end of a sloped road surface such as an uphill or a downhill, the z-direction measurement value of the lidar 2 whose measurement point lies on the road surface changes near the start or end point of the slope. In the example of FIG. 7, as shown in FIG. 8(A), the measurement value z_b(k) changes gradually from the time “t1” at which the lidar 2 irradiates the start point 50, i.e., the point where the gradient changes (see FIG. 7(B)), and becomes minimal at the time “t2” at which the front wheels of the vehicle body reach the start point 50 (see FIG. 7(D)). Thereafter, the measurement value z_b(k) gradually increases, and when the rear wheels reach the start point 50 (see FIG. 7(F)), it becomes the same as the measurement value z_b(k) obtained when traveling on a flat road surface.
Taking the above into account, the in-vehicle device 1 calculates the time interval “Δt” from the time t1 at which the measurement value z_b(k) starts to decrease (see FIG. 7(B)) to the time t2 at which the measurement value z_b(k) becomes minimal (see FIG. 7(D)).
Here, the time interval Δt corresponds to the time required for the vehicle to travel the distance d shown in FIG. 7(A), and the distance d corresponds to the distance from the start point 50 of the slope, as detected on the specific scan line, to the front wheels of the vehicle. The time interval Δt is an example of the “time difference” in the present invention.
Then, as shown in the following Equation (26), the in-vehicle device 1 obtains the traveling speed “v” of the vehicle from vehicle speed pulses or the like and multiplies it by the time interval Δt to calculate the distance d.
\[
d = v\,\Delta t
\tag{26}
\]
As shown in FIG. 8(B), the in-vehicle device 1 can also determine the time t2 at which the front wheels of the vehicle body reach the point where the gradient changes (the start point 50 in FIG. 7) from the pitch angle measured by the gyro sensor 3 mounted on the vehicle body or on the lidar 2 (the vehicle body pitch angle in FIG. 8(B)).
 The vehicle-mounted device 1 can likewise measure the distance d when the vehicle passes over a bump on the road surface. FIGS. 9(A) to 9(G) show, by line segments 61 to 66, the magnitude of the z-direction measurement value z_b(k) of the lidar 2 of a vehicle traveling before and after a bump 60. FIG. 10(A) is a graph showing the change over time of the measurement value z_b(k) obtained during the travel shown in FIGS. 9(A) to 9(G), and FIG. 10(B) is a graph showing the change over time of the vehicle body pitch rate during the same period as FIG. 10(A). The numbers 61 to 66 in FIGS. 10(A) and 10(B) indicate the positions corresponding to the measurement values z_b(k) represented by the line segments 61 to 66 in FIGS. 9(A) to 9(G). In this case, as shown in FIG. 10(A), the z-direction measurement value z_b(k) of the lidar 2 temporarily decreases at the time "t3" when the beam irradiates the bump 60 (see FIG. 9(B)), and temporarily increases at the time "t4" when the front wheel of the vehicle passes over the bump 60 (see FIG. 9(D)). The vehicle-mounted device 1 therefore calculates, as the time interval Δt, the interval from the time t3 at which the measurement value z_b(k) temporarily decreases to the time t4 at which it temporarily increases. In this way as well, the vehicle-mounted device 1 can suitably calculate the distance d based on equation (26). As shown in FIG. 10(B), the vehicle-mounted device 1 can also determine the time t4 at which the front wheel passes over the bump 60 from the pitch rate measured by the gyro sensor 3 mounted on the vehicle body or on the lidar 2 (the vehicle body pitch rate in FIG. 10(B)).
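As a rough sketch, the Δt detection described above could look like the following Python fragment. The dip/rise threshold, the synthetic samples, and the function name are illustrative assumptions, not part of the patent; only the relation d = v·Δt of equation (26) is taken from the text.

```python
def estimate_lidar_offset_distance(times, z_meas, speed, threshold=0.05):
    """Estimate the distance d between the lidar's footprint on the road and
    the front wheel from the z-direction road measurements z_b(k).

    t3: first time the measurement temporarily drops (beam hits the bump).
    t4: first later time it temporarily rises (front wheel climbs the bump).
    Returns d = speed * (t4 - t3), per equation (26), or None if no bump seen.
    """
    baseline = z_meas[0]
    t3 = t4 = None
    for t, z in zip(times, z_meas):
        if t3 is None and z < baseline - threshold:
            t3 = t                      # beam irradiates the bump (FIG. 9(B))
        elif t3 is not None and z > baseline + threshold:
            t4 = t                      # front wheel passes over the bump (FIG. 9(D))
            break
    if t3 is None or t4 is None:
        return None
    return speed * (t4 - t3)

# synthetic example: vehicle at 10 m/s, dip seen at t=1.0 s, rise at t=1.2 s
times = [0.9, 1.0, 1.1, 1.2, 1.3]
z_meas = [-2.0, -2.3, -2.0, -1.7, -2.0]   # z_b(k): dip at t3, rise at t4
d = estimate_lidar_offset_distance(times, z_meas, speed=10.0)
```

With these synthetic samples, Δt = 0.2 s, so d comes out to about 2 m.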
 Next, a method for calculating the position change amount ΔL_x from the distance d will be described. The vehicle-mounted device 1 stores the distance "d_0" measured before the position of the lidar 2 changed and, as shown in the following equation (27), calculates the position change amount ΔL_x of the lidar 2 in the x direction by taking the difference from the distance d.
[Equation (27) image]
 In this case, for example, after the initial position L_x0 of the lidar 2 is measured, the vehicle-mounted device 1 calculates the distance d_0 the first time the vehicle passes near the start or end point of a slope, or passes over a bump on the road surface, and records it in the lidar installation information IL together with the initial position L_x0 of the lidar 2. By referring to the lidar installation information IL, the vehicle-mounted device 1 can then suitably calculate the position L_x at the processing reference time and the change amount ΔL_x based on equation (27).
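The equation image for (27) is not reproduced in this text; from the description above it can be reconstructed as follows. The sign convention, i.e. whether the difference is taken as d − d_0 or d_0 − d, is an assumption here.

```latex
% reconstructed from the verbal description; sign convention assumed
\Delta L_x = d - d_0 \qquad \text{(27)}
```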
 (7) Calculation of the position change amount in the y direction
 Next, a method for calculating the position change amount of the lidar 2 in the y direction will be described. While the vehicle is turning, the vehicle-mounted device 1 calculates the position L_y of the lidar 2 in the y direction from the y-direction output value of the lidar acceleration sensor 5, the y-direction output value of the vehicle body acceleration sensor 4, and the output value of the gyro sensor 3.
 FIG. 11 schematically shows the forces acting on a turning vehicle. In general, the speed "V_G" and the centripetal acceleration "α_G" of the vehicle's center of gravity during a turn are given by the following equations (28) and (29).
[Equation (28) image]
[Equation (29) image]
 Here, "r_G" in equation (28) denotes the distance from the turning center point to the vehicle's center of gravity, and the velocity V_G is orthogonal to the centripetal acceleration α_G. In a rigid body the angular velocity is the same everywhere, so the velocity "V_A" and the centripetal acceleration "α_A" at point A, the vehicle coordinate origin set at a position [A_x, A_y]^T away from the center of gravity, are given by the following equations (30) and (31).
[Equation (30) image]
[Equation (31) image]
 As shown in FIG. 11, the relationship between the centripetal acceleration α_A and its vehicle lateral component "α_Ay" is given by the following equation (32).
[Equation (32) image]
 Therefore, the vehicle lateral component α_Ay is expressed by the following equation (33).
[Equation (33) image]
 The vehicle lateral component α_Ay on the left side of equation (33) can be measured as the y-axis output of an acceleration sensor mounted at point A of the vehicle. If the lidar 2 is mounted at a position [L_x, L_y]^T away from point A, then the vehicle lateral component "α_(L+A)y" at the position of the lidar 2 satisfies the following equation (34), analogous to equation (33), and combining equations (34) and (33) yields the following equation (35).
[Equation (34) image]
[Equation (35) image]
 Therefore, the position L_y is expressed by the following equation (36).
[Equation (36) image]
 Equation (36) shows that dividing the difference between the outputs of the vehicle body acceleration sensor 4 and the lidar acceleration sensor 5 mounted on the vehicle by the square of the vehicle's yaw rate gives the position of the lidar 2 in the y direction relative to the vehicle coordinate system origin. The vehicle-mounted device 1 can therefore obtain the position L_y of the lidar 2 by evaluating equation (36). It can also calculate the change amount ΔL_y by referring to the initial position L_y0 stored in the lidar installation information IL.
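The equation images (28) to (36) are not reproduced in this text. The key step can be reconstructed from the verbal description above as follows; the axis sign conventions are an assumption. For a rigid body rotating at yaw rate ω, the centripetal acceleration of a point at position r relative to the turn center is −ω²r, so the lateral components measured at point A and at the lidar position A + [L_x, L_y]^T differ by ω²L_y:

```latex
% reconstructed; sign conventions assumed
\alpha_{(L+A)y} - \alpha_{Ay} = -\,\omega^{2} L_y
\quad\Longrightarrow\quad
L_y = \frac{\alpha_{Ay} - \alpha_{(L+A)y}}{\omega^{2}} \qquad \text{(36)}
```

This matches the statement that the difference between the two sensor outputs, divided by the squared yaw rate, gives L_y.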
 [Processing flow]
 FIG. 12 is an example of a flowchart showing the procedure for correcting the output of the lidar 2. The vehicle-mounted device 1 repeatedly executes the process shown in FIG. 12 at a predetermined timing.
 First, while the vehicle is stopped or traveling at a constant speed on a horizontal road, the vehicle-mounted device 1 calculates the change amount ΔL_φ of the roll angle and the change amount ΔL_θ of the pitch angle of the lidar 2 from the output values of the lidar acceleration sensor 5 (step S101). In this case, the vehicle-mounted device 1 calculates the roll angle L_φ and the pitch angle L_θ of the lidar 2 based on equations equivalent to equations (7) and (10), and calculates the differences from the roll angle L_φ0 and the pitch angle L_θ0 recorded in the lidar installation information IL as the change amounts ΔL_φ and ΔL_θ.
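Equations (7) and (10) are outside this excerpt; the standard accelerometer tilt formulas they presumably correspond to are sketched below. The formulas, the axis conventions (x forward, y left, z up), and the numerical values are assumptions, not taken from the patent.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Standard static-tilt estimate from a 3-axis accelerometer reading
    (gravity only: vehicle stopped or at constant speed on level ground).
    Axis convention (assumed): x forward, y left, z up."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# step S101: change amounts relative to the recorded installation values
L_phi0, L_theta0 = 0.010, -0.005          # from lidar installation info IL (hypothetical)
L_phi, L_theta = roll_pitch_from_accel(0.05, 0.10, 9.80)
dL_phi, dL_theta = L_phi - L_phi0, L_theta - L_theta0
```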
 Next, while the vehicle is accelerating or decelerating on a straight road, the vehicle-mounted device 1 calculates the change amount ΔL_ψ of the yaw angle of the lidar 2 from the output value of the lidar acceleration sensor 5 (step S102). In this case, the vehicle-mounted device 1 calculates the yaw angle L_ψ of the lidar 2 based on an equation equivalent to equation (20), and calculates the difference from the yaw angle L_ψ0 recorded in the lidar installation information IL as the change amount ΔL_ψ.
 Next, while the vehicle is stopped or traveling at a constant speed on a flat road, the vehicle-mounted device 1 calculates the position change amount ΔL_z in the z direction from the z-direction measurement values of the lidar 2 irradiating the road surface (step S103). In this case, the vehicle-mounted device 1 calculates the average of the road-surface measurement values z_b(k) obtained while the vehicle is stopped or traveling at a constant speed on a flat road surface, and calculates the change amount ΔL_z by taking the difference from the average of the road-surface measurement values z_b0(k) recorded under the same conditions in the lidar installation information IL (see equation (25)).
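Step S103 reduces to a difference of averages. A minimal sketch follows; equation (25) itself is outside this excerpt, and the sample values and sign convention are assumptions.

```python
def z_position_change(z_current, z_recorded):
    """Change amount dL_z: mean of the current road-surface measurements
    z_b(k) minus mean of the measurements z_b0(k) recorded under the same
    conditions in the lidar installation information IL (sign assumed)."""
    mean_now = sum(z_current) / len(z_current)
    mean_ref = sum(z_recorded) / len(z_recorded)
    return mean_now - mean_ref

# hypothetical samples: road surface now measured 0.1 m closer in z
dLz = z_position_change([-1.98, -2.02, -2.00], [-2.10, -2.10, -2.10])
```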
 Next, near the start or end point of a slope, or when passing over a bump on the road surface, the vehicle-mounted device 1 calculates the position change amount ΔL_x in the x direction by calculating the time interval Δt between the change in the z-direction measurement value of the lidar 2 and the change in the output value of the gyro sensor (step S104). In this case, the vehicle-mounted device 1 calculates the distance d by multiplying the time interval Δt by the traveling speed v of the vehicle, and calculates the difference from the distance d_0 stored in advance in the lidar installation information IL as the change amount ΔL_x (see equation (26)).
 Next, while the vehicle is turning, the vehicle-mounted device 1 calculates the position change amount ΔL_y in the y direction from the y-direction output value of the lidar acceleration sensor 5, the y-direction output value of the vehicle body acceleration sensor 4, and the output value of the gyro sensor 3 (step S105). Specifically, the vehicle-mounted device 1 calculates the position L_y at the processing reference time based on equation (36), and calculates the difference from the initial position L_y0 stored in the lidar installation information IL as the position change amount ΔL_y.
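Step S105, following the verbal description of equation (36) (the difference between the two lateral acceleration outputs divided by the squared yaw rate), might be sketched as follows. The sign convention, the guard threshold, and the sample values are assumptions.

```python
def lidar_y_position(a_body_y, a_lidar_y, yaw_rate):
    """Estimate L_y from the y outputs of the vehicle body and lidar
    acceleration sensors during a turn and the gyro yaw rate, per the
    verbal form of equation (36). Sign convention assumed."""
    if abs(yaw_rate) < 1e-3:
        raise ValueError("yaw rate too small; the vehicle must be turning")
    return (a_body_y - a_lidar_y) / yaw_rate ** 2

# synthetic turn: yaw rate 0.5 rad/s, lidar mounted with L_y = 0.8 m
omega = 0.5
a_body = 3.0                               # body sensor lateral output (hypothetical)
a_lidar = a_body - omega ** 2 * 0.8        # lidar sensor output consistent with L_y = 0.8
L_y = lidar_y_position(a_body, a_lidar, omega)
```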
 Next, the vehicle-mounted device 1 determines whether any of the change amounts ΔL_φ, ΔL_θ, ΔL_ψ, ΔL_x, ΔL_y, ΔL_z calculated in steps S101 to S105 is equal to or greater than a predetermined threshold (step S106). This threshold is used to determine whether the measurement data of the lidar 2 can still be used after the correction process of step S108, described later, and is set in advance, for example, based on experiments. If any of the change amounts ΔL_φ, ΔL_θ, ΔL_ψ, ΔL_x, ΔL_y, ΔL_z calculated in steps S101 to S105 is equal to or greater than the predetermined threshold (step S106; Yes), the vehicle-mounted device 1 stops using the output data of the target lidar 2 (i.e., its use for obstacle detection, host-vehicle position estimation, and the like) and outputs, via the information output unit 16, a warning that the target lidar 2 needs to be realigned (step S107). This reliably prevents the loss of safety that would result from using the measurement data of a lidar 2 whose attitude or position has shifted significantly due to an accident or the like. The above threshold is an example of the "predetermined amount" in the present invention.
 On the other hand, if none of the change amounts ΔL_φ, ΔL_θ, ΔL_ψ, ΔL_x, ΔL_y, ΔL_z is equal to or greater than the predetermined threshold (step S106; No), the vehicle-mounted device 1 corrects each measurement value of the point cloud data output by the lidar 2 based on these change amounts (step S108). In this case, the vehicle-mounted device 1 stores, for example, a map indicating the correction amount of the measurement value for each magnitude of change amount, and corrects the measurement values by referring to the map. Alternatively, it may correct the measurement values using a predetermined fraction of each change amount as the correction amount.
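A minimal sketch of the branch in steps S106 to S108 follows. The threshold value, the proportional correction with a gain, and all names are illustrative assumptions; the patent also describes a map-based correction, which is not reproduced here.

```python
def check_and_correct(changes, points, threshold=0.1, gain=1.0):
    """changes: dict of change amounts, e.g. {'dL_x': ..., 'dL_y': ...}.
    points: list of (x, y, z) lidar measurements.
    Returns corrected points, or None after a warning if any change
    amount reaches the threshold (realignment required, step S107)."""
    if any(abs(v) >= threshold for v in changes.values()):
        print("warning: lidar must be realigned; output data disabled")  # step S107
        return None
    # step S108 (illustrative): undo only the translational shifts
    dx = gain * changes.get('dL_x', 0.0)
    dy = gain * changes.get('dL_y', 0.0)
    dz = gain * changes.get('dL_z', 0.0)
    return [(x - dx, y - dy, z - dz) for (x, y, z) in points]

pts = check_and_correct({'dL_x': 0.02, 'dL_y': 0.0, 'dL_z': -0.01},
                        [(1.0, 2.0, 0.0)])
```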
 As described above, the vehicle-mounted device 1 in this embodiment estimates at least the attitude, relative to the vehicle, of the lidar 2 that measures the distance to an object, and performs processing such as estimating the attitude of the lidar 2 with respect to the vehicle based on the detection results of the lidar acceleration sensor 5 provided on the lidar 2 while the vehicle is traveling with acceleration or deceleration. This enables the vehicle-mounted device 1 to convert the measurement data output by the lidar 2 into the vehicle coordinate system with high accuracy, and to determine whether the lidar 2 can be used.
 [Modifications]
 Modifications suitable for the embodiment are described below. The following modifications may be applied to the embodiment in combination.
 (Modification 1)
 In step S108 of FIG. 12, instead of correcting each measurement value of the point cloud data output by the lidar 2 based on the change amounts calculated in steps S101 to S105, the vehicle-mounted device 1 may convert each measurement value into the vehicle coordinate system based on the estimated attitude and position of the lidar 2 at the processing reference time calculated in steps S101 to S105.
 In this case, using the roll angle L_φ, pitch angle L_θ, yaw angle L_ψ, x-direction position L_x, y-direction position L_y, and z-direction position L_z calculated in steps S101 to S105, the vehicle-mounted device 1 may convert each measurement value of the point cloud data output by the lidar 2 from the lidar coordinate system into the vehicle body coordinate system based on equation (24), and may perform host-vehicle position estimation, automatic driving control, and the like based on the converted data.
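Equation (24) itself is outside this excerpt; the lidar-to-body transform it refers to is presumably the standard rigid-body change of frame, sketched below under an assumed Z-Y-X (yaw-pitch-roll) rotation order and axis conventions.

```python
import math

def lidar_to_body(p, roll, pitch, yaw, offset):
    """Transform one lidar-frame point into the vehicle body frame using the
    estimated attitude (L_phi, L_theta, L_psi) and position (L_x, L_y, L_z).
    Rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll), then translation
    (order and conventions assumed)."""
    x, y, z = p
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    xb = cy * cp * x + (cy * sp * sr - sy * cr) * y + (cy * sp * cr + sy * sr) * z
    yb = sy * cp * x + (sy * sp * sr + cy * cr) * y + (sy * sp * cr - cy * sr) * z
    zb = -sp * x + cp * sr * y + cp * cr * z
    return (xb + offset[0], yb + offset[1], zb + offset[2])

# identity attitude: the transform reduces to a pure translation
pt = lidar_to_body((1.0, 0.0, 0.0), 0.0, 0.0, 0.0, (2.0, 0.5, 1.2))
```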
 In another example, if each lidar 2 is provided with an adjustment mechanism such as an actuator for correcting its attitude and position, the vehicle-mounted device 1 may, instead of the process of step S108, drive the adjustment mechanism so as to correct the attitude and position of the lidar 2 by the change amounts calculated in steps S101 to S105.
 (Modification 2)
 The configuration of the driving support system shown in FIG. 1 is an example, and the configuration of a driving support system to which the present invention can be applied is not limited to it. For example, instead of the system including the vehicle-mounted device 1, an electronic control unit of the vehicle may execute the processing shown in FIG. 12 and elsewhere. In this case, the lidar installation information IL is stored, for example, in a storage unit in the vehicle, and the electronic control unit of the vehicle is configured to be able to receive the output data of the various sensors such as the lidar 2.
DESCRIPTION OF SYMBOLS
 1 Vehicle-mounted device
 2 Lidar
 3 Gyro sensor
 4 Vehicle body acceleration sensor
 5 Lidar acceleration sensor
 10 Map DB

Claims (13)

  1.  An estimation device that estimates an attitude, relative to a moving body, of a measurement unit that measures a distance to an object, the estimation device comprising:
     an estimation unit configured to estimate the attitude of the measurement unit relative to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration.
  2.  The estimation device according to claim 1, wherein the estimation unit estimates the attitude of the measurement unit in a roll direction and a pitch direction based on acceleration data output by the acceleration detection unit while the moving body is traveling at a predetermined speed or is stopped, and estimates the attitude of the measurement unit in a yaw direction based on acceleration data output by the acceleration detection unit while the moving body is traveling with acceleration or deceleration and on the estimated attitudes in the roll direction and the pitch direction.
  3.  The estimation device according to claim 1 or 2, wherein the estimation unit estimates a change amount of the attitude based on the estimated attitude of the measurement unit and the attitude of the measurement unit stored in a storage unit.
  4.  The estimation device according to any one of claims 1 to 3, wherein the estimation unit estimates the position of the measurement unit in a height direction based on measurement data of the measurement unit indicating the position of a road surface in the height direction.
  5.  The estimation device according to any one of claims 1 to 4, wherein the estimation unit estimates the position of the measurement unit in a front-rear direction of the moving body based on the distance between the moving body and a road point at which the gradient changes, or a bump on the road surface, when that road point or bump is measured by the measurement unit.
  6.  The estimation device according to claim 5, wherein the estimation unit calculates the distance based on a time difference between a change in data output by an inclination detection unit that detects an inclination of the moving body in a pitch direction and a change in measurement data output by the measurement unit.
  7.  The estimation device according to any one of claims 1 to 6, wherein the estimation unit estimates the position of the measurement unit in a left-right direction of the moving body based on acceleration data of the moving body in the left-right direction output by the acceleration detection unit while the moving body is turning, acceleration data of the moving body in the left-right direction output by an acceleration sensor mounted on the moving body, and a yaw rate of the moving body output by a gyro sensor mounted on the moving body.
  8.  The estimation device according to any one of claims 4 to 7, wherein the estimation unit estimates a change amount of the position based on the estimated position of the measurement unit and the position of the measurement unit stored in a storage unit.
  9.  The estimation device according to claim 3 or 8, further comprising a correction unit configured to correct measurement data output by the measurement unit based on the change amount.
  10.  The estimation device according to claim 3 or 8, further comprising a stop control unit configured to stop processing based on measurement data output by the measurement unit when the change amount is equal to or greater than a predetermined amount.
  11.  A control method executed by an estimation device that estimates an attitude, relative to a moving body, of a measurement unit that measures a distance to an object, the control method comprising:
     an estimation step of estimating the attitude of the measurement unit relative to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration.
  12.  A program executed by a computer that estimates an attitude, relative to a moving body, of a measurement unit that measures a distance to an object, the program causing the computer to function as:
     an estimation unit configured to estimate the attitude of the measurement unit relative to the moving body based on a detection result of an acceleration detection unit provided in the measurement unit while the moving body is traveling with acceleration or deceleration.
  13.  A storage medium storing the program according to claim 12.
PCT/JP2019/011977 2018-03-23 2019-03-22 Estimation device, control method, program, and storage medium WO2019182082A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2020507909A JPWO2019182082A1 (en) 2018-03-23 2019-03-22 Estimator, control method, program and storage medium
JP2022074663A JP2022115927A (en) 2018-03-23 2022-04-28 Estimation device, control method, program, and storage media
JP2023190044A JP2024016186A (en) 2018-03-23 2023-11-07 Estimation device, control method, program, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018056741 2018-03-23
JP2018-056741 2018-03-23

Publications (1)

Publication Number Publication Date
WO2019182082A1 true WO2019182082A1 (en) 2019-09-26

Family

ID=67987394

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/011977 WO2019182082A1 (en) 2018-03-23 2019-03-22 Estimation device, control method, program, and storage medium

Country Status (2)

Country Link
JP (3) JPWO2019182082A1 (en)
WO (1) WO2019182082A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022208748A1 (en) * 2021-03-31 2022-10-06 三菱重工機械システム株式会社 Vehicle on-board device, server, vehicle motion detection method, and program
WO2024101175A1 (en) * 2022-11-11 2024-05-16 ヌヴォトンテクノロジージャパン株式会社 Information processing method, information processing device, and program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016149711A (en) * 2015-02-13 2016-08-18 株式会社デンソー Camera calibration device
WO2018038257A1 (en) * 2016-08-26 2018-03-01 株式会社Zmp Object detecting method and device therefor


Also Published As

Publication number Publication date
JPWO2019182082A1 (en) 2021-03-11
JP2024016186A (en) 2024-02-06
JP2022115927A (en) 2022-08-09

Similar Documents

Publication Publication Date Title
CN107084743B (en) Offset and misalignment compensation for six degree of freedom inertial measurement units using GNSS/INS data
CN106053879B (en) Pass through the vehicle speed estimation of the expiration operation of data fusion
US9796416B2 (en) Automated driving apparatus and automated driving system
US10260889B2 (en) Position estimation device and position estimation method
US9135825B2 (en) Risk degree calculation device
JP2024016186A (en) Estimation device, control method, program, and storage medium
US10323947B2 (en) Travel control method and travel control apparatus
JP6787297B2 (en) Display control device and display control program
JP4899626B2 (en) Travel control device
JP4277907B2 (en) Driving control device for automobile
JP4876847B2 (en) Vehicle traveling direction estimation device and driving support system
JP2016206976A (en) Preceding vehicle track calculation device for driving support control of vehicle
JP5752633B2 (en) Speed detection device, travel position calculation device, and speed calculation method
JP2007271605A (en) Acceleration estimation device and vehicle
JP2007271606A (en) Acceleration estimation device and vehicle
JP2008058256A (en) Device for calculating speed
CN110114634A (en) Extraneous identifying system
JP2005041360A (en) Driving operation assisting device for vehicle, and vehicle having it
JP2008287480A (en) Travel support device for vehicle
US7657395B2 (en) Two-axis accelerometer for detecting inclination without the effect of common acceleration
JP6594546B2 (en) Angle measuring device
JP2007137306A (en) Device and method for horizontal travel determination of movable body
JP2013129284A (en) Pitching angle processing apparatus
JP6454857B2 (en) Posture detection apparatus and posture detection method
JP6409711B2 (en) Driving environment recognition device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19771795

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020507909

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19771795

Country of ref document: EP

Kind code of ref document: A1