WO2020026677A1 - Detection device, processing device, detection method, and processing program - Google Patents

Detection device, processing device, detection method, and processing program

Info

Publication number
WO2020026677A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
reference position
point
detection device
calculation unit
Prior art date
Application number
PCT/JP2019/026218
Other languages
English (en)
Japanese (ja)
Inventor
源洋 中川
幹也 田中
貴次 青山
Original Assignee
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ニコン
Priority to JP2020534123A (patent JP7024876B2)
Publication of WO2020026677A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Definitions

  • The present invention relates to a detection device, a processing device, a detection method, and a processing program.
  • As a technique for detecting an object, there is, for example, the technique described in Patent Document 1 below.
  • In a moving object such as a walking human body, the left foot and the right foot may be substantially at the same position in the shoulder width direction of the human body, making it difficult to distinguish the left foot from the right foot.
  • According to an aspect, a detection device includes: a detection unit that detects position information of each point of a moving object; a reference calculation unit that calculates, as a reference position, a position where a change amount of the position information satisfies a predetermined condition in an intersecting direction intersecting a moving direction of the object and a vertical direction; and a determination unit that determines, based on the reference position calculated by the reference calculation unit, a first portion of the object disposed on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second portion of the object disposed on a second side opposite to the first side with respect to the reference plane.
  • According to another aspect, a position where the amount of change in the position information of each point on the surface of the object satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the moving object and the vertical direction is calculated as a reference position.
  • According to another aspect, in a detection method, the position information of each point of the moving object is detected, and a position where the amount of change of the position information satisfies a predetermined condition in the intersecting direction intersecting the moving direction of the object and the vertical direction is calculated as a reference position.
  • According to another aspect, a processing program causes a computer to execute: calculating, as a reference position, a position where the amount of change in the position information of each point on the surface of a moving object satisfies a predetermined condition in a direction intersecting the moving direction of the object and the vertical direction; and determining, based on the reference position, a first portion of the object disposed on a first side with respect to a reference plane including the movement direction and the vertical direction, and a second portion of the object disposed on a second side opposite to the first side with respect to the reference plane.
  • FIG. 2 is a diagram illustrating a detection unit according to the first embodiment.
  • FIG. 3 is a diagram illustrating a process of a point cloud data generation unit according to the first embodiment.
  • FIGS. 4A to 4C are diagrams illustrating a process of a reference calculation unit according to the first embodiment.
  • FIGS. 5A and 5B are diagrams illustrating a process of the reference calculation unit according to the first embodiment.
  • FIG. 6 is a diagram illustrating a process of a partial determination unit according to the first embodiment.
  • FIG. 7 is a diagram illustrating a process of a posture estimating unit according to the first embodiment. Also provided are a flowchart illustrating a detection method according to the first embodiment and diagrams showing detection devices according to the second, third, and fourth embodiments.
  • FIG. 1 is a diagram illustrating a detection device according to the first embodiment.
  • the detection device 1 (detection system) is, for example, a motion capture device, a motion detection system, an exercise support system, or the like.
  • the detection device 1 is used for posture analysis or three-dimensional modeling. In these cases, the detection device 1 detects the object M that moves in one direction or a plurality of directions in a predetermined time range.
  • the detection device 1 may detect the object M that moves linearly, or may detect the object M that moves meandering.
  • the movement path (eg, trajectory) of the object M when the detection device 1 detects the object M may include a straight line, may include a curve, or may include a straight line and a curve.
  • The detection device 1 determines a first portion M1 of the object M arranged on a first side (eg, the left side) with respect to the moving direction MD of the object M, and a second portion M2 of the object M arranged on the side opposite to the first side (eg, the right side).
  • the object M is movable in a target area AR to be detected by the detection device 1 (eg, the detection region of the detection device 1, the field of view).
  • the object M includes, for example, a human body or a non-human animal, a human-type or non-human animal-type robot, or a non-animal-type robot.
  • the object M is a human body, and the object M is appropriately referred to as a human body M.
  • the object M (in this case, the human body M) is, for example, an object whose posture and / or shape changes with movement.
  • The detection device 1 can determine the first portion M1 disposed on the first side (eg, the left side) and the second portion M2 disposed on the side opposite to the first side (eg, the right side), with the moving direction MD of the human body M or the midline of the human body M (eg, the left-right center line on the body surface) as a boundary.
  • The moving human body M is, for example, a human body that moves for a sport such as fencing, baseball, soccer, golf, kendo, American football, ice hockey, or gymnastics, for walking or running, for posing, exercise, yoga, bodybuilding, a fashion show, or a game, or for person authentication or work.
  • the object M may include a moving human body and an object attached to the moving human body (eg, clothing, wearing equipment, exercise equipment, and armor).
  • In the following description, an XYZ orthogonal coordinate system shown in the drawings is used as appropriate. In this XYZ orthogonal coordinate system, the X direction is the moving direction of the object M (eg, the human body M), the Y direction is the vertical direction, and the Z direction is a crossing direction (eg, an orthogonal direction) that intersects (eg, is orthogonal to) both the moving direction of the object M (the human body M) and the vertical direction.
  • The Z direction includes the direction of the shoulder width of the human body M. The first part M1 is at least a part of the left half of the human body M, and the second part M2 is at least a part of the right half of the human body M.
  • In each direction, the side indicated by the arrow is appropriately referred to as the + side (eg, the +Z side), and the opposite side is referred to as the − side (eg, the −Z side). In the present embodiment, the first side is the −Z side, and the second side is the +Z side.
  • the detection device 1 includes a position detection unit 2 and a processing device 3.
  • the position detector 2 detects position information of the moving human body M.
  • the position detecting unit 2 detects point group data including three-dimensional coordinates of each point on the surface of the human body M as position information of the human body M.
  • the moving direction of the human body M is predetermined with respect to the target area AR (for example, the field of view of the position detection unit 2).
  • the vertical direction and the moving direction of the human body M in the target area AR are given to the detection device 1 as known information.
  • the detection result (eg, point cloud data)
  • the position detector 2 includes a detector 4 and a point cloud data generator 5.
  • The detection unit 4 is, for example, at least a part of a portable (mobile) device.
  • the detection unit 4 may be at least a part of a stationary device.
  • the detection unit 4 may be provided inside the processing device 3.
  • the point cloud data generating unit 5 may be provided in a device outside the processing device 3.
  • the position detection unit 2 is a device external to the processing device 3 and may include the detection unit 4 and the point cloud data generation unit 5.
  • a part or the whole of the detection device 1 may be a portable device (eg, an information terminal, a smartphone, a tablet, a camera-equipped mobile phone, and a wearable terminal).
  • a part or the whole of the detection device 1 may be a stationary device (eg, a fixed-point camera).
  • FIG. 2 is a diagram illustrating the detection unit according to the first embodiment.
  • the detection unit 4 detects depth as position information of the human body M.
  • the detection unit 4 includes, for example, a depth sensor (eg, a depth camera).
  • The detection unit 4 detects a depth (eg, position information, distance, depth) from a predetermined point to each point on the surface of an object placed in the target area AR.
  • The predetermined point is, for example, a point at a position serving as a reference for detection by the detection unit 4 (eg, a viewpoint, a detection source point, a point representing the position of the detection unit 4, a pixel position of the imaging element 8 described later).
  • the detection unit 4 includes an irradiation unit 6, an optical system 7, and an imaging device 8.
  • the irradiation unit 6 irradiates (eg, projects) light La (eg, pattern light, irradiation light) to the target area AR (space, detection area).
  • the optical system 7 includes, for example, an imaging optical system (imaging optical system).
  • the imaging device 8 includes, for example, a CMOS image sensor or a CCD image sensor.
  • the imaging element 8 has a plurality of pixels arranged two-dimensionally.
  • the imaging element 8 captures an image of the target area AR via the optical system 7.
  • the imaging element 8 detects light Lb (infrared light, return light) emitted from the object in the target area AR due to the irradiation of the light La.
  • The detection unit 4 detects, for example, based on a pattern (eg, intensity distribution) of the light La emitted from the irradiation unit 6 and a pattern (eg, intensity distribution, captured image) of the light Lb detected by the image sensor 8, the depth from each pixel of the image sensor 8 to the point in the target area AR corresponding to that pixel.
  • the detection unit 4 outputs a depth map (e.g., a depth image, depth information, and distance information) representing a depth distribution in the target area AR to the processing device 3 (see FIG. 1) as a detection result.
  • the detection unit 4 outputs a depth map to the processing device 3 as position information of the human body M.
  • the detection unit 4 may be a device that detects a depth by a time-of-flight (TOF) method.
  • the detection unit 4 may be a device that detects depth by a method other than the TOF method.
  • the detection unit 4 may include, for example, a laser scanner (for example, a laser range finder), and may detect the depth by laser scanning.
  • the detection unit 4 may include, for example, a phase difference sensor, and may detect the depth by a phase difference method.
  • The detection unit 4 may detect the depth by, for example, a DFD (depth from defocus) method.
  • the detection unit 4 may irradiate the object M with light other than infrared light (eg, visible light) and detect light (eg, visible light) emitted from the object M.
  • the detection unit 4 may include, for example, a stereo camera, and may detect (eg, image) the object M from a plurality of viewpoints.
  • the detection unit 4 may detect depth by triangulation using captured images obtained by capturing the object M from a plurality of viewpoints.
  • the detection unit 4 may detect the depth by a method other than an optical method (for example, scanning by ultrasonic waves).
  • the point cloud data generation unit 5 generates point cloud data as position information of the human body M based on the depth detected by the detection unit 4.
  • a detection unit 4 is provided outside the processing device 3, and a point cloud data generation unit 5 is provided inside the processing device 3.
  • the processing device 3 is communicably connected to the detection unit 4.
  • the detection unit 4 outputs the detection result to the processing device 3.
  • the processing device 3 processes the detection result output from the detection unit 4.
  • the processing device 3 includes a point cloud data generation unit 5, a reference calculation unit 11, a partial determination unit 12, a posture estimation unit 13, and a storage unit 14.
  • the storage unit 14 is, for example, a nonvolatile memory, a hard disk (HDD), a solid state drive (SSD), or the like.
  • the storage unit 14 stores original data processed by the processing device 3 and data processed by the processing device 3 (eg, data generated by the processing device 3).
  • the storage unit 14 stores the depth map output from the detection unit 4 as the original data of the point cloud data.
  • the point cloud data generation unit 5 executes point cloud processing for generating point cloud data as position information of the human body M.
  • the point cloud data includes three-dimensional coordinates of a plurality of points on the human body M in the target area AR.
  • the point cloud data may include three-dimensional coordinates of a plurality of points on an object (eg, a wall, a floor) around the human body M.
  • the point cloud data generation unit 5 calculates point cloud data of the human body M based on the position information (for example, depth) detected by the detection unit 4.
  • the point cloud data generation unit 5 reads out the detection result (eg, depth map) of the detection unit 4 stored in the storage unit 14 and calculates point cloud data.
  • the point cloud data generating unit 5 may generate point cloud data as model information (eg, shape information) of the object M (eg, a human body M).
  • FIG. 3 is a diagram illustrating a process of the point cloud data generation unit according to the first embodiment.
  • Symbol D1 is a depth map (eg, a depth image) corresponding to the detection result of the detection unit 4.
  • The depth map D1 is information (eg, an image) representing the spatial distribution of the depth measurement values obtained by the detection unit 4.
  • the depth map D1 is a grayscale image in which the depth at each point of the target area AR is represented by a gradation value.
  • In the depth map D1, a portion having a relatively high gradation value (eg, a white portion, a bright portion) represents a portion having a relatively small depth (eg, a portion relatively close to the position detection unit 2), and a portion having a relatively low gradation value (eg, a black portion, a dark portion) represents a portion having a relatively large depth (eg, a portion relatively far from the position detection unit 2).
  • The point cloud data generation unit 5 reads out the data of the depth map D1 from the storage unit 14, calculates, based on the gradation value (eg, the measured depth value) of each pixel of the depth map D1, the three-dimensional coordinates of the point in real space corresponding to that pixel, and generates the point cloud data D2.
  • the three-dimensional coordinates of one point are appropriately referred to as point data.
  • the point cloud data D2 is data obtained by grouping a plurality of point data.
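  • As an illustration of the back-projection from the depth map D1 to the point cloud data D2, the following is a minimal sketch assuming a pinhole-camera model; the intrinsic parameters fx, fy, cx, cy and the array layout are hypothetical stand-ins for whatever calibration the detection unit 4 actually uses.

```python
import numpy as np

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (HxW, in metres) into an Nx3 point cloud.

    A standard pinhole-camera model is assumed here; the real detection
    unit 4 may use a different projection or calibration.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx                            # lateral coordinate
    y = (v - cy) * z / fy                            # vertical coordinate (image convention)
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # drop pixels with no depth measurement
```

  • In this sketch each valid pixel of the depth map becomes one entry of point data, and the collection of entries corresponds to the point cloud data D2.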
  • the processing device 3 (eg, the point cloud data generation unit 5) performs segmentation, pattern recognition, and the like on the entire point cloud data of the target area AR, for example, to extract point cloud data D2 of the human body M.
  • the point cloud data generation unit 5 executes an extraction process (eg, segmentation) of extracting position information of a part of the object from position information of the object arranged in the target area AR.
  • the objects arranged in the target area AR include an object M (eg, a human body) and objects around the object M (eg, floor, wall, and background objects).
  • the point cloud data generator 5 extracts (eg, segments, separates) the position information of the object M (eg, a human body M) from the position information detected by the position detector 2.
  • Prior to the above-described extraction processing, the position detection unit 2 detects the position information of the objects in the target area AR in each of a first state in which the object M is not arranged in the target area AR and a second state in which the object M is arranged in the target area AR.
  • The point cloud data generation unit 5 calculates the difference between the detection result of the position detection unit 2 in the first state and the detection result of the position detection unit 2 in the second state, thereby extracting the position information of the object M (eg, the human body M).
  • The first state may be a state in which the object M (eg, the human body M) is arranged in the target area AR at a position different from that in the second state. For example, the first state is a state in which the object M (eg, the human body M) is arranged at a first position in the target area AR, and the second state is a state in which the object M (eg, the human body M) is arranged at a second position different from the first position in the target area AR.
  • the above-described extraction processing may be executed by a processing unit other than the point cloud data generation unit 5. This processing unit may be provided in the processing device 3 or may be provided in a device external to the processing device 3. The above-described extraction processing need not be performed.
  • the processing device 3 may remove noise from the detection result of the position detection unit 2.
  • The noise removal processing includes, for example, processing for removing spatial noise from the depth map generated by the position detection unit 2. For example, when the difference in depth between a first region (eg, a first pixel) and a second region (eg, a second pixel) adjacent to each other in the depth map exceeds a threshold, the processing device 3 determines that the depth in the first region or the depth in the second region is noise. If the processing device 3 determines that the depth in the first region is noise, it estimates the depth of the first region by interpolation using the depths of regions around the first region (eg, a third pixel and a fourth pixel), and removes the spatial noise by updating (replacing, correcting) the depth of the first region with the estimated depth.
  • the point cloud data generation unit 5 may generate point cloud data based on the depth map from which spatial noise has been removed.
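  • The spatial noise removal described above can be sketched as follows, assuming the depth map is a NumPy array; the 8-neighbour window, the noise threshold, and the median interpolation are illustrative choices rather than the exact procedure of the processing device 3.

```python
import numpy as np

def remove_spatial_noise(depth, threshold=0.2):
    """Replace isolated depth outliers with the median of their 8 neighbours."""
    cleaned = depth.copy()
    h, w = depth.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            neighbours = depth[i - 1:i + 2, j - 1:j + 2].ravel()
            neighbours = np.delete(neighbours, 4)        # drop the centre pixel itself
            # a pixel that differs strongly from all of its neighbours is treated as noise
            if np.min(np.abs(depth[i, j] - neighbours)) > threshold:
                cleaned[i, j] = np.median(neighbours)    # interpolate from the surroundings
    return cleaned
```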
  • the noise removal processing for removing noise includes, for example, processing for removing temporal noise (for example, noise that changes over time) from the depth map generated by the position detection unit 2.
  • the position detection unit 2 repeats detection at a predetermined sampling frequency, and the processing device 3 compares detection results (eg, depth maps) of the position detection units 2 at different detection timings.
  • the processing device 3 calculates the depth change amount (eg, time change amount) of each pixel using the depth map in the first frame and the depth map in the second frame following the first frame.
  • When the change amount exceeds a threshold, the processing device 3 determines that the depth of the pixel in the first frame or the depth of the pixel in the second frame is noise. If the processing device 3 determines that the depth in the second frame is noise, it estimates the depth of that pixel in the second frame by interpolation using the depth in the first frame and the depth in a third frame following the second frame. The processing device 3 removes the temporal noise by updating (replacing, correcting) the depth of the pixel determined to be noise with the estimated depth.
  • the point cloud data generating unit 5 may generate point cloud data based on the depth map from which temporal noise has been removed.
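  • A corresponding sketch for the temporal noise removal, under the same assumptions; comparing three consecutive frames per pixel and interpolating linearly between the first and third frames is one simple realization.

```python
import numpy as np

def remove_temporal_noise(frame1, frame2, frame3, threshold=0.3):
    """Replace depths in frame2 that jump relative to frame1 and frame3.

    A pixel whose depth changes by more than `threshold` from frame1 to frame2
    but is consistent between frame1 and frame3 is treated as temporal noise
    and re-estimated by interpolation between frame1 and frame3.
    """
    jump = np.abs(frame2 - frame1) > threshold
    consistent = np.abs(frame3 - frame1) <= threshold
    noisy = jump & consistent
    estimated = (frame1 + frame3) / 2.0          # interpolate across the noisy frame
    return np.where(noisy, estimated, frame2)
```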
  • the noise removal processing may not include processing for removing spatial noise or processing for removing temporal noise.
  • the noise removal processing may be executed by a part other than the processing device 3 (for example, the position detection unit 2). The detection device 1 does not need to execute the noise removal processing.
  • The processing device 3 sets a reference position BP associated with the shape of the human body M based on a pattern in which the cross-sectional shape of the human body M changes in a predetermined direction, and determines the left body part and the right body part of the human body M with respect to the set reference.
  • the reference calculation unit 11 sets a direction different from both the moving direction (in this case, the X direction) and the vertical direction (in this case, the Y direction) (eg, the cross direction, the Z direction) as the predetermined direction.
  • The reference calculation unit 11 sets the reference position BP using a cross-sectional feature, that is, a pattern in which the cross-sectional shape (eg, contour) of the human body M in a plane orthogonal to the predetermined direction (eg, the XY plane or a plane parallel to the XY plane) changes in the predetermined direction.
  • The reference calculation unit 11 uses, for example, a cross-sectional feature of the human body M in a side view, that is, when the human body M is viewed from the side with respect to the movement direction (eg, a pattern in which the contour of the human body M in the XY plane changes in the Z direction), to determine characteristic portions (eg, arm, shoulder, head) of the human body and to set the reference position BP.
  • the cross-sectional feature is specified by the amount of change in position information of a point on the surface of the human body M.
  • the position information of the point is, for example, information in which coordinates in the X direction (X coordinates), coordinates in the Y direction (Y coordinates), and coordinates in the Z direction (Z coordinates) are combined. .
  • the change amount of the position information includes a change amount of the Y coordinate in the X direction (eg, dy / dx), a change amount of the Z coordinate in the X direction (eg, dz / dx), and a change amount of the X coordinate in the Y direction ( For example, dx / dy), the change amount of the Z coordinate in the Y direction (eg, dz / dy), the change amount of the X coordinate in the Z direction (eg, dx / dz), and the change amount of the Y coordinate in the Z direction. (Eg, dy / dz).
  • the cross-sectional features will be described with reference to FIGS. 4A to 4C, FIGS. 5A and 5B, and the like.
  • the reference calculation unit 11 executes a reference calculation process for calculating the reference position BP.
  • The reference calculation unit 11 calculates, as the reference position BP, a position where the amount of change in the position information in a cross direction (eg, the Z direction) intersecting the moving direction of the human body M (here, the X direction) and the vertical direction (here, the Y direction) satisfies a predetermined condition.
  • the reference calculation unit 11 executes a reference calculation process described later based on a detection result (eg, a one-frame depth image) in which the position detection unit 2 detects the human body M at an arbitrary timing.
  • The reference position BP includes a first reference position BP1 described later with reference to FIG. 4B and a second reference position BP2 described later with reference to FIG. 5B.
  • the reference position BP is a position used as a reference in the partial determination processing for determining the first part M1 and the second part M2.
  • the change amount of the position information includes, for example, a change amount (eg, a difference) of coordinates between a plurality of points representing the surface of the human body M at an arbitrary timing (time).
  • the change amount of the position information may be a slope (eg, a derivative) of a curve or a straight line representing the surface (eg, an outline, a contour, a silhouette) of the human body M.
  • the reference calculation unit 11 estimates, as the reference position BP, the position of a structure (eg, a head or a torso) whose number is one of the structures included in the object M (eg, the human body M).
  • the reference calculation unit 11 estimates, as the reference position BP, the position of a portion including the center (for example, the center line CL) of the human body M in the cross direction (for example, the Z direction).
  • the reference calculation unit 11 calculates a reference position based on the point cloud data.
  • the reference calculation unit 11 reads out the point cloud data of the human body M generated by the point cloud data generation unit 5 from the storage unit 14 and executes a reference calculation process.
  • FIGS. 4 (A) to 4 (C), FIGS. 5 (A) and 5 (B) are diagrams showing the processing of the reference calculating unit according to the first embodiment.
  • the reference calculation unit 11 executes a first reference calculation process of calculating a first reference position BP1 (see FIG. 4B) as the reference position BP.
  • the first reference position BP1 is a first reference position BP1a on the first side (eg, ⁇ Z side, left body) of the object M and a second side (eg, + Z side, right body) of the object M. ) Of the first reference position BP1b.
  • the reference calculation unit 11 performs a second reference calculation process of calculating a second reference position BP2 (see FIG. 5B) based on the first reference position BP1 as the reference position BP.
  • The reference calculation unit 11 estimates, as the first reference position BP1a, the position of a predetermined part of the human body M (eg, the base of the neck, between the neck and the torso, between the shoulder and the head) based on the characteristics of the change in the position information of the human body M. For example, the reference calculation unit 11 calculates, as the first reference position BP1, a position that satisfies the condition that the amount of change in the coordinates of the points included in the point cloud data D2 is equal to or greater than a threshold.
  • The process of the reference calculation unit 11 assumes that the object to be detected is a walking human body, and is based on the assumed shape (eg, posture) of the human body at an arbitrary timing.
  • the surface of the human body significantly changes from the arm (eg, elbow) to the shoulder (eg, the amount of change in the position information is equal to or greater than a threshold).
  • the human body has a gradual change in the position of the surface from the shoulder to the neck (eg, the amount of change in the position information is less than the threshold).
  • the position of the surface of the human body changes significantly from the shoulder to the head as compared to the shoulder portion (eg, the amount of change in the position information becomes equal to or larger than a threshold).
  • the reference calculation unit 11 calculates, as the first reference position BP1, a position that satisfies a condition that a change amount of the position information (eg, the Y coordinate of the point data) is equal to or larger than a threshold.
  • the above-mentioned arm includes, for example, a scapulohumeral joint, and is a portion from the scapulohumeral joint to the fingertip.
  • the above-mentioned shoulder is, for example, a portion between the scapulohumeral joint of the left arm and the scapulohumeral joint of the right arm, and a lower body side of the seventh cervical vertebra (base of the neck).
  • the head includes, for example, a portion surrounded by the skull, a portion on the surface side of the skull, and a neck.
  • the head includes, for example, the seventh cervical vertebra and the parietal region, and is a portion from the seventh cervical vertebra to the parietal region.
  • the reference calculation unit 11 divides the area in the Z direction, and calculates a candidate for the first reference position BP1 for each of the divided areas.
  • Reference numeral DAi denotes a divided region (divided area, partial region).
  • Zi is the coordinate in the Z direction representing the area DAi (eg, the coordinate of the center of the area DAi in the Z direction).
  • The reference calculation unit 11 performs the first reference calculation process on the assumption that the Z coordinate of the point data included in the area DAi is Zi.
  • The reference calculation unit 11 calculates, for the points included in the point cloud data D2, the maximum value of the coordinate in the vertical direction (Y direction) at each coordinate in the cross direction (eg, the Z direction). That is, the reference calculation unit 11 calculates, for each divided area DAi, the maximum value of the Y coordinates of the point data.
  • Ymax(Zi) in FIG. 4A is the maximum value of the Y coordinates of the point data in the area DAi.
  • The reference calculation unit 11 sets Ymax(Zi) as the maximum value of the Y coordinates of the point data at each coordinate Zi.
  • FIG. 4B shows a plot of Ymax(Zi) with respect to the coordinate (Z coordinate) in the cross direction.
  • The change amount of the position information with respect to the coordinate (Z coordinate) in the cross direction is represented by ΔYmax(Zi).
  • ΔYmax(Zi) is represented, for example, by the following equation (1).
  • ΔYmax(Zi) = Ymax(Zi+1) − Ymax(Zi)   … Equation (1)
  • FIG. 4C shows a plot of the absolute value |ΔYmax(Zi)| with respect to the coordinate (Z coordinate) in the cross direction.
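  • The per-region maxima Ymax(Zi) and the change amount of Equation (1) can be computed, for example, as in the following sketch; the bin width dz and the (N, 3) point-cloud layout with columns (X, Y, Z) are assumptions for illustration.

```python
import numpy as np

def ymax_profile(points, dz=0.02):
    """Compute Ymax(Zi) over divided regions DAi and dYmax(Zi) = Ymax(Zi+1) - Ymax(Zi).

    points: (N, 3) array of point data with columns (X, Y, Z).
    dz: width of each divided region DAi along the Z direction (assumed value).
    """
    z_min, z_max = points[:, 2].min(), points[:, 2].max()
    edges = np.arange(z_min, z_max + dz, dz)
    z_centers = 0.5 * (edges[:-1] + edges[1:])           # Zi representing each region DAi
    ymax = np.full(len(z_centers), np.nan)
    for i in range(len(z_centers)):
        in_bin = (points[:, 2] >= edges[i]) & (points[:, 2] < edges[i + 1])
        if in_bin.any():
            ymax[i] = points[in_bin, 1].max()            # Ymax(Zi)
    dymax = np.diff(ymax)                                # Equation (1)
    return z_centers, ymax, dymax
```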
  • reference numeral Q1 is a part corresponding to the arm of the human body M, and a part corresponding to the left arm is represented by reference numeral Q1a, and a part corresponding to the right arm is represented by reference numeral Q1b.
  • Reference numeral Q2 is a part corresponding to the shoulder of the human body M. A part corresponding to the left shoulder is represented by a reference numeral Q2a, and a part corresponding to the right shoulder is represented by a reference numeral Q2b.
  • Reference numeral Q3 is a portion corresponding to the head of the human body M.
  • The absolute value |ΔYmax(Zi)| of ΔYmax(Zi), which is the amount of change in the position information, is large in the arm portion Q1 and smaller in the shoulder portion Q2 than in the arm portion Q1.
  • the reference calculation unit 11 calculates the reference position BP (eg, the first reference position BP1a) based on the feature (eg, change pattern) of the amount of change in the position information.
  • The reference calculation unit 11 sets a portion where |ΔYmax(Zi)| is less than a first threshold value V1 as a first characteristic portion of the human body M (eg, a left shoulder portion Q2a, a right shoulder portion Q2b).
  • For example, the reference calculation unit 11 evaluates |ΔYmax(Zi)| sequentially from one side (eg, the first side, the −Z side) to the other side (eg, the second side, the +Z side) in the cross direction (eg, the Z direction).
  • The reference calculation unit 11 determines the position at which the magnitude relationship between |ΔYmax(Zi)| and the first threshold value V1 changes, and sets the first characteristic portion (eg, the left shoulder portion Q2a, the right shoulder portion Q2b).
  • The reference calculation unit 11 also compares |ΔYmax(Zi)| with a second threshold value V2, and a portion where |ΔYmax(Zi)| is equal to or greater than the second threshold value V2 is defined as a second characteristic portion of the human body M (eg, a head portion Q3).
  • The reference calculation unit 11 determines the position at which the magnitude relationship between |ΔYmax(Zi)| and the second threshold value V2 changes, and sets (eg, specifies, identifies, determines) this position as the boundary between the first characteristic portion and the second characteristic portion (eg, the base end of the left shoulder, the end on the −Z side of the head, the neck, the base of the neck).
  • For example, when |ΔYmax(Zi)| is smaller than the second threshold value V2 and |ΔYmax(Zi+1)| is equal to or greater than the second threshold value V2, the reference calculation unit 11 sets the position of Zi as the end on the +Z side of the first characteristic portion (the left shoulder portion Q2a) and sets the position of Zi+1 as the end on the −Z side of the second characteristic portion (the head portion Q3).
  • the reference calculation unit 11 calculates a first reference position BP1 corresponding to the head of the human body M as a reference position, based on the amount of change in position information from the arm to the head of the human body M.
  • the reference calculation unit 11 calculates, as the first reference position BP1, a position representing a boundary between a first feature portion (eg, a left shoulder portion Q2a) and a second feature portion (a head portion Q3).
  • the reference calculation unit 11 calculates, as the first reference position BP1a, the position of the end on the + Z side in the first characteristic portion (the left shoulder portion Q2a).
  • the reference calculation unit 11 calculates the first reference position BP1b in the same manner as the first reference position BP1a.
  • In calculating the first reference position BP1b, the reference calculation unit 11 sets the first characteristic portion and the second characteristic portion in order from the other side (eg, the +Z side) of the human body M toward the opposite side (eg, the −Z side).
  • the reference calculation unit 11 calculates, as the first reference position BP1b, the position of the end on the ⁇ Z side in the first characteristic portion (the right shoulder portion Q2b).
  • The reference calculation unit 11 stores the calculated first reference position BP1 (eg, its three-dimensional coordinate values) in the storage unit 14.
  • The reference calculation unit 11 may set the position of the end on the −Z side of the second characteristic portion (the head portion Q3) as the first reference position BP1a, and the position of the end on the +Z side of the second characteristic portion (the head portion Q3) as the first reference position BP1b.
  • The reference calculation unit 11 may also set a position (eg, the center) between the +Z side end of the first characteristic portion (the left shoulder portion Q2a) and the −Z side end of the second characteristic portion (the head portion Q3) as the first reference position BP1a.
  • Similarly, the reference calculation unit 11 may set a position (eg, the center) between the −Z side end of the first characteristic portion (the right shoulder portion Q2b) and the +Z side end of the second characteristic portion (the head portion Q3) as the first reference position BP1b. Further, the reference calculation unit 11 may calculate only one of the first reference position BP1a and the first reference position BP1b as the first reference position BP1.
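  • One possible way to locate the first reference positions from |ΔYmax(Zi)| is to scan for the threshold crossings from each side, as in the sketch below; using the single threshold V2 and the handling of empty regions are simplifications of the procedure described above.

```python
import numpy as np

def first_reference_positions(z_centers, dymax, v2=0.05):
    """Scan |dYmax(Zi)| for the shoulder-to-head transitions from both sides.

    Returns (bp1a_z, bp1b_z): Z coordinates where |dYmax| first rises to >= v2
    when scanning from the -Z side (left shoulder to head boundary) and from
    the +Z side (right shoulder to head boundary).
    """
    mag = np.abs(np.nan_to_num(dymax))
    bp1a_z = bp1b_z = None
    # scan from the -Z side: small change (shoulder) followed by large change (head)
    for i in range(len(mag) - 1):
        if mag[i] < v2 and mag[i + 1] >= v2:
            bp1a_z = z_centers[i]
            break
    # scan from the +Z side in the same way
    for i in range(len(mag) - 1, 0, -1):
        if mag[i] < v2 and mag[i - 1] >= v2:
            bp1b_z = z_centers[i]
            break
    return bp1a_z, bp1b_z
```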
  • The reference calculation unit 11 selects (eg, sets, determines) the second reference position BP2 as the reference position BP from a predetermined area AR2 (eg, an area in the Z direction corresponding to the head portion Q3) set with respect to the first reference position BP1.
  • The reference calculation unit 11 sets a predetermined area (eg, an area including the reference position) between the first reference position BP1a on the first side (eg, the −Z side) of the object M and the first reference position BP1b on the second side (eg, the +Z side). It is assumed that the center line CL of the human body M in the Z direction passes through the region between the first reference position BP1a and the first reference position BP1b (the predetermined region, the head portion Q3).
  • the reference calculation unit 11 calculates a position satisfying a predetermined condition in a region between the first reference position BP1a and the first reference position BP1b as a second reference position BP2.
  • the reference calculation unit 11 divides the region in the Z direction as in FIG. 4A, and calculates a candidate for the second reference position BP2 for each divided region.
  • In the second reference calculation process, the minimum value of the coordinates in the traveling direction, with the front side (+X side) taken as positive, is calculated.
  • the reference calculation unit 11 extracts point data on the front side of the human body M (front side in the traveling direction, + X side in the X direction). For example, the reference calculation unit 11 divides the area in the Y direction, and calculates the maximum value of the X coordinate of the point data for each of the divided areas.
  • Reference numeral DBj denotes a divided region (divided area, partial region).
  • Yj is the coordinate in the Y direction representing the area DBj (eg, the coordinate of the center of the area DBj in the Y direction).
  • The reference calculation unit 11 executes the second reference calculation process on the assumption that the Y coordinate of the point data included in the area DBj is Yj.
  • Xmax(Yj) in FIG. 5A is the maximum value of the X coordinates of the point data in the area DBj.
  • The reference calculation unit 11 sets Xmax(Yj) as the maximum value of the X coordinates of the point data at each coordinate Yj.
  • Xmax(Yj) corresponds to the X coordinate of the point data on the front side in the traveling direction (X direction) at each coordinate Yj in the vertical direction (Y direction).
  • The reference calculation unit 11 calculates the minimum value of Xmax(Yj) with respect to the Y coordinate.
  • The position at which Xmax(Yj) is minimum is, for example, a position above the shoulder of the human body M and below the chin.
  • FIG. 5B shows a plot representing the distribution of Xmin(Zi) with respect to the Z coordinate.
  • The reference calculation unit 11 selects the second reference position BP2 based on the change amount of the minimum value Xmin(Zi) with respect to the coordinate in the cross direction (eg, the Z direction).
  • When the neck of the human body M is approximated by an elliptic cylinder, the surface of the neck is convex forward in the traveling direction, and the position of its tip corresponds to the position of the center line CL of the human body M.
  • The reference calculation unit 11 calculates the Z coordinate (Zc in FIG. 5B) at which Xmin(Zi) with respect to the Z coordinate becomes maximum (or a local maximum).
  • The reference calculation unit 11 sets Xmin(Zc) as the X coordinate of the second reference position BP2.
  • The reference calculation unit 11 sets the Y coordinate corresponding to Xmin(Zc) (Yk in FIG. 5A) as the Y coordinate of the second reference position BP2, and sets Zc as the Z coordinate of the second reference position BP2.
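  • The second reference calculation can be sketched as follows under the same point-cloud layout assumptions; restricting the search to the region between BP1a and BP1b and the bin widths dy and dz are illustrative simplifications.

```python
import numpy as np

def second_reference_position(points, z_a, z_b, dy=0.02, dz=0.02):
    """Estimate BP2 (a point near the centre line CL at the front of the neck).

    points: (N, 3) array with columns (X, Y, Z); +X is the moving direction.
    z_a, z_b: Z coordinates of the first reference positions BP1a and BP1b;
    only the region between them (assumed to contain the head) is used.
    """
    region = points[(points[:, 2] >= min(z_a, z_b)) & (points[:, 2] <= max(z_a, z_b))]
    # for each Z bin DAi: take the front-surface X per Y bin DBj (Xmax(Yj)),
    # then the minimum of those front-surface values -> Xmin(Zi)
    z_edges = np.arange(region[:, 2].min(), region[:, 2].max() + dz, dz)
    best = None
    for i in range(len(z_edges) - 1):
        col = region[(region[:, 2] >= z_edges[i]) & (region[:, 2] < z_edges[i + 1])]
        if len(col) == 0:
            continue
        y_edges = np.arange(col[:, 1].min(), col[:, 1].max() + dy, dy)
        xmax_per_y = []
        for j in range(len(y_edges) - 1):
            row = col[(col[:, 1] >= y_edges[j]) & (col[:, 1] < y_edges[j + 1])]
            if len(row):
                xmax_per_y.append((row[:, 0].max(), 0.5 * (y_edges[j] + y_edges[j + 1])))
        if not xmax_per_y:
            continue
        x_min, y_at_min = min(xmax_per_y)                    # Xmin(Zi), above the shoulder / below the chin
        zc = 0.5 * (z_edges[i] + z_edges[i + 1])
        if best is None or x_min > best[0]:                  # Zc: where Xmin(Zi) is maximal
            best = (x_min, y_at_min, zc)
    return best  # (X, Y, Z) of the second reference position BP2, or None
```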
  • As described above, the reference calculation unit 11 according to the present embodiment treats the point cloud data as two-dimensional data for each divided area DAi when executing the reference calculation process, and can therefore reduce the processing load, for example. Note that the reference calculation unit 11 may execute the reference calculation process by treating the point cloud data as three-dimensional data.
  • the reference calculation unit 11 calculates the reference position BP (for example, the second reference position BP2) using the front-side outer shape characteristics of the human body M. Since a general human body has a remarkable change in shape between the chin, the throat, and the chest, the reference calculation unit 11 can calculate the second reference position BP2 with high accuracy. Note that the reference calculation unit 11 may calculate the reference position BP using the external features on the rear side of the human body M (eg, the rear side and the back side in the traveling direction).
  • the reference calculation unit 11 causes the storage unit 14 to store the calculated reference position BP.
  • the partial determination unit 12 performs a partial determination process.
  • In the partial determination processing, the part determination unit 12 determines (eg, judges, identifies) a first part M1 (eg, at least a part of the left body) and a second part M2 (eg, at least a part of the right body) of the object M (the human body M).
  • the first portion M1 is a part of the object M arranged on a first side (eg, ⁇ Z side) with respect to a reference plane BF described later.
  • the second portion M2 is a part of the object M arranged on a second side (eg, + Z side) opposite to the first side with respect to the reference plane BF.
  • the reference plane BF is a plane including the moving direction (X direction) and the vertical direction (Y direction).
  • the reference plane BF is, for example, a plane parallel to the movement direction (X direction) and the vertical direction (Y direction) and including the reference position BP (second reference position BP2).
  • the partial determination unit 12 performs a partial determination process based on the reference position BP calculated by the reference calculation unit 11.
  • the partial determination unit 12 sets a reference plane BF based on the reference position BP calculated by the reference calculation unit 11.
  • the partial determination unit 12 reads the reference position BP calculated by the reference calculation unit 11 from the storage unit 14 and sets a reference plane BF.
  • the partial determination unit 12 sets a plane parallel to the X direction and the Y direction and including the second reference position BP2 as the reference plane BF.
  • the partial determination unit 12 performs a partial determination process based on the point cloud data generated by the point cloud data generation unit 5.
  • The partial determination unit 12 reads out the point cloud data generated by the point cloud data generation unit 5 from the storage unit 14, and determines whether a portion represented by one or more pieces of point data included in the point cloud data is on the first side (eg, the −Z side) or the second side (eg, the +Z side) with respect to the reference plane BF.
  • The second reference position BP2 is a position estimated by the reference calculation unit 11 as the position of a point on the center line CL of the human body M, and the partial determination unit 12 uses the second reference position BP2 as the reference position BP to execute the partial determination process.
  • the part determining unit 12 determines a part disposed on the first side (eg, the ⁇ Z side) with respect to the second reference position BP2 as the first part M1.
  • the part determination unit 12 determines a part arranged on the second side (eg, + Z side) with respect to the second reference position BP2 as the second part M2.
  • FIG. 6 is a diagram illustrating a process of the partial determination unit according to the first embodiment.
  • a symbol QX is a part (eg, point, area) to be subjected to the part determination processing.
  • a portion to be subjected to the partial determination process is referred to as a determination target portion.
  • the determination target portion QX is set in advance.
  • For example, the determination target portion QX is set to a part (eg, a hand, a foot) of the human body M, and the partial determination unit 12 determines whether the determination target portion QX (eg, a foot) belongs to the left half body (eg, the left foot) or the right half body (eg, the right foot).
  • the determination target portion QX may be set to a plurality of characteristic portions (for example, the whole body) of the human body M.
  • the part discriminating unit 12 may sequentially set each part of the human body M as the discrimination target part QX and execute the part discriminating process for each part of the human body M.
  • the partial determination unit 12 calculates a geodesic line SL connecting the reference position BP and the determination target portion QX of the object M (human body M).
  • the geodesic line SL is the shortest line connecting two points along the surface of the object M (human body M).
  • the partial determination unit 12 calculates the geodesic SL based on the shape of the surface of the human body M obtained from the point cloud data.
  • The part determination unit 12 uses the second reference position BP2 as the reference position BP and calculates a geodesic line SL connecting the second reference position BP2 and the determination target portion QX.
  • When the determination target portion QX is an area, the part determination unit 12 calculates the geodesic line SL connecting a point selected from the determination target portion QX (eg, the center of the determination target portion QX) and the second reference position BP2.
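  • A geodesic line over point cloud data is commonly approximated by a shortest path on a nearest-neighbour graph; the sketch below uses SciPy for that purpose, and the neighbour count k and the graph construction are assumptions rather than the method prescribed by the embodiment.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import lil_matrix
from scipy.sparse.csgraph import dijkstra

def geodesic_path(points, start_idx, goal_idx, k=8):
    """Approximate the geodesic line SL between two points of a point cloud.

    A k-nearest-neighbour graph over the points stands in for the surface;
    the shortest path on that graph approximates the shortest line along the surface.
    """
    tree = cKDTree(points)
    dists, nbrs = tree.query(points, k=k + 1)          # first neighbour is the point itself
    n = len(points)
    graph = lil_matrix((n, n))
    for i in range(n):
        for d, j in zip(dists[i, 1:], nbrs[i, 1:]):
            graph[i, j] = d
    _, predecessors = dijkstra(graph.tocsr(), directed=False,
                               indices=start_idx, return_predecessors=True)
    path = [goal_idx]
    while path[-1] != start_idx and predecessors[path[-1]] >= 0:
        path.append(predecessors[path[-1]])
    return points[path[::-1]]                          # ordered points along the geodesic
```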
  • the part determination unit 12 determines the first part M1 and the second part M2 based on the relative position between the geodesic line SL and the reference plane BF.
  • First, a case where the entire geodesic line SL is disposed on the first side (eg, the −Z side) or the second side (eg, the +Z side) with respect to the reference plane BF will be described.
  • When the geodesic line SL is on the first side (eg, the −Z side) with respect to the reference plane BF, the partial determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF as being on the first side. When the geodesic line SL is on the second side (eg, the +Z side) with respect to the reference plane BF, the partial determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF as being on the second side.
  • The part determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on a feature amount (first feature amount) of the portion of the geodesic line SL arranged on the first side (eg, the −Z side) with respect to the reference plane BF and a feature amount (second feature amount) of the portion of the geodesic line SL arranged on the second side (eg, the +Z side) with respect to the reference plane BF.
  • the partial determination unit 12 specifies a relative position between the geodesic line SL and the reference plane BF based on a magnitude relationship or a ratio between the first feature amount and the second feature amount.
  • the feature amount is, for example, a distance between a point on the geodesic line SL and the reference plane BF.
  • For example, the partial determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on the distance (first feature amount) between a point on the geodesic line SL located on the first side (eg, the −Z side) with respect to the reference plane BF and the reference plane BF, and the distance (second feature amount) between a point on the geodesic line SL located on the second side (eg, the +Z side) and the reference plane BF.
  • For example, the partial determination unit 12 calculates the average (first feature amount) of the distances between the reference plane BF and each of a plurality of points (a first point set) on the geodesic line SL arranged on the first side (eg, the −Z side) with respect to the reference plane BF. Similarly, the part determination unit 12 calculates the average (second feature amount) of the distances between the reference plane BF and each of a plurality of points (a second point set) on the geodesic line SL arranged on the second side (eg, the +Z side) with respect to the reference plane BF.
  • When the first feature amount is larger than the second feature amount, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the first side (eg, the −Z side) with respect to the reference plane BF. Further, when the second feature amount is larger than the first feature amount, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the second side (eg, the +Z side) with respect to the reference plane BF. The partial determination unit 12 may also specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
  • The partial determination unit 12 selects a plurality of points on the geodesic line SL.
  • The partial determination unit 12 may select the plurality of points regularly (eg, at predetermined intervals) or irregularly (eg, randomly). When a selected point is arranged on the first side (eg, the −Z side) with respect to the reference plane BF, this point is classified into the first point set, and when it is arranged on the second side (eg, the +Z side) with respect to the reference plane BF, this point is classified into the second point set; the first point set and the second point set are set in this way.
  • The partial determination unit 12 may set the first point set and the second point set such that the number of points belonging to the first point set is the same as the number of points belonging to the second point set. In this case, the partial determination unit 12 may use the sum of the distances instead of the average of the distances as the feature amount. For example, the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF using the sum of the distances for the first point set as the first feature amount and the sum of the distances for the second point set as the second feature amount.
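  • The comparison of the first and second feature amounts can be sketched as follows, assuming the reference plane BF is the plane Z = Z(BP2) parallel to the X and Y directions and the geodesic is given as an ordered array of points; averaging the distances is the variant described above.

```python
import numpy as np

def classify_by_geodesic(geodesic_points, z_bp2):
    """Decide whether a geodesic lies on the first (-Z) or second (+Z) side of BF.

    geodesic_points: (M, 3) array of points along the geodesic line SL.
    z_bp2: Z coordinate of the second reference position BP2 (defines the plane BF).
    """
    signed = geodesic_points[:, 2] - z_bp2               # signed distance to the plane BF
    first_side = -signed[signed < 0]                     # distances of the first point set
    second_side = signed[signed > 0]                     # distances of the second point set
    f1 = first_side.mean() if len(first_side) else 0.0   # first feature amount
    f2 = second_side.mean() if len(second_side) else 0.0 # second feature amount
    return 'first part M1' if f1 > f2 else 'second part M2'
```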
  • the first feature value may be a distance between a point on the geodesic line SL farthest from the reference plane BF to the first side (eg, the ⁇ Z side) and the reference plane BF.
  • the second feature value may be a distance between a point on the geodesic line SL farthest from the reference plane BF to the second side (+ Z side) and the reference plane BF.
  • In this case as well, when the first feature amount is larger than the second feature amount, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the first side (eg, the −Z side) with respect to the reference plane BF; when the second feature amount is larger than the first feature amount, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the second side (eg, the +Z side) with respect to the reference plane BF.
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
  • the first feature amount may be the length of a portion arranged on the first side (eg, ⁇ Z side) with respect to the reference plane BF in the geodesic line SL.
  • the second feature value may be the length of a portion arranged on the second side (eg, + Z side) with respect to the reference plane BF in the geodesic line SL.
  • In this case as well, when the first feature amount is larger than the second feature amount, the partial determination unit 12 specifies that the geodesic line SL is arranged at a relative position on the first side (eg, the −Z side) with respect to the reference plane BF; when the second feature amount is larger than the first feature amount, the partial determination unit 12 specifies that the geodesic line SL is arranged at a relative position on the second side (eg, the +Z side) with respect to the reference plane BF.
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
  • The partial determination unit 12 need not set (generate) the first point set and the second point set, and may specify the relative position between the geodesic line SL and the reference plane BF without using the first feature amount and the second feature amount.
  • For example, the partial determination unit 12 selects a plurality of points on the geodesic line SL regularly (eg, at predetermined intervals) or irregularly (eg, randomly), and calculates the distance from each selected point to the reference plane BF. When a selected point is arranged on one side (eg, the first side, the −Z side) with respect to the reference plane BF, the distance between the point and the reference plane BF is represented by a negative value, and when it is arranged on the other side (eg, the second side, the +Z side), the distance is represented by a positive value.
  • the partial determination unit 12 calculates the sum of the distances between each point represented by a positive value, a negative value, or 0 and the reference plane BF for a plurality of points on the geodesic line SL.
  • When the sum is negative, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the one side (eg, the first side, the −Z side) with respect to the reference plane BF; when the sum is positive, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the other side (eg, the second side, the +Z side) with respect to the reference plane BF.
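  • The variant using the sum of signed distances can be sketched in the same setting; the sign convention (negative on the first side, positive on the second side) follows the description above, and the boundary case of a zero sum is handled arbitrarily here.

```python
import numpy as np

def classify_by_signed_sum(geodesic_points, z_bp2):
    """Variant: sum the signed distances of sampled geodesic points to the plane BF."""
    signed = geodesic_points[:, 2] - z_bp2   # negative on the first (-Z) side, positive on the second
    total = signed.sum()
    if total < 0:
        return 'first part M1'               # geodesic lies mainly on the first side
    return 'second part M2'                  # geodesic lies mainly on the second side (or on BF)
```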
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF based on the relative position between a predetermined part of the geodesic line SL and the reference plane BF.
  • the above-mentioned predetermined portion may be a point (farthest point) farthest from the reference plane BF on the geodesic line SL.
  • When the farthest point is located on the first side (eg, the −Z side) with respect to the reference plane BF, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the first side (eg, the −Z side) with respect to the reference plane BF. When the farthest point is located on the second side (eg, the +Z side) with respect to the reference plane BF, the partial determination unit 12 specifies that the geodesic line SL is located at a relative position on the second side (eg, the +Z side) with respect to the reference plane BF.
  • The predetermined portion may be other than the farthest point, for example, a portion of the geodesic line SL closer to the second reference position BP2 than to the determination target portion QX (eg, a portion of a predetermined length starting from the second reference position BP2).
  • For example, when the relative position between the geodesic line SL and the reference plane BF is specified as being on the second side, the part determination unit 12 may determine that the determination target portion QX belongs to the second portion M2.
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by combining the above methods.
  • the geodesic line SL extends to the + Z side starting from the second reference position BP2, and reaches the determination target portion QX via the + Z side of the reference plane BF.
  • the part determination unit 12 determines that the determination target part QX belongs to the second part M2 (for example, the right half).
  • In a walking human body, the left foot and the right foot may intersect or be aligned on the same line; for example, the right foot may be located to the left of, or at the same position as, the left foot in the direction of the shoulder width of the human body. Therefore, it may be difficult for a conventional device to automatically determine whether a detected foot is the right foot or the left foot.
  • the detection device 1 uses at least the relative position between the geodesic line SL and the reference plane BF, thereby detecting at least the geodesic line SL. Since a part is arranged on the + Z side or the ⁇ Z side with respect to the reference plane BF, it is possible to automatically determine with high accuracy whether the detected foot is the left foot or the right foot.
  • The partial determination unit 12 performs the partial determination processing on at least a part of the object M (human body M) represented by the point cloud data, and stores the processing result in the storage unit 14.
  • For example, the part determination unit 12 performs the part determination processing for each part of the object M (human body M) represented by the point cloud data, and stores, as the processing result, determination information (e.g., a flag, attribute information) indicating whether the part belongs to the first part M1 or the second part M2 in the storage unit 14.
  • the posture estimating unit 13 performs a posture estimating process of estimating the posture of the object M based on the first part M1 and the second part M2 determined by the part determining unit 12.
  • The posture estimating unit 13 generates, for example, position information of a characteristic portion (e.g., a feature part, a feature point) of the human body M.
  • the characteristic part of the human body M is, for example, a part that can be distinguished from other parts of the human body M.
  • the characteristic portion of the human body M includes, for example, at least one of a distal end (a finger, a toe, a head), a joint, or an intermediate portion between the distal end and the joint or between the two joints.
  • the posture estimating unit 13 performs, for example, a recognition process (eg, pattern recognition, shape recognition, skeleton recognition) on the shape of the human body M obtained from the point cloud data.
  • the posture estimating unit 13 generates position information of the above-described characteristic portion by a recognition process.
  • the position information of the characteristic portion includes, for example, the coordinates of a point representing the characteristic portion (eg, three-dimensional coordinates).
  • the posture estimating unit 13 calculates the coordinates of the point representing the characteristic portion by the above-described recognition processing. Then, the posture estimating unit 13 causes the storage unit 14 to store information of the specified portion (eg, coordinates of a point representing a characteristic portion).
  • FIG. 7 is a diagram illustrating a process of the posture estimating unit according to the first embodiment.
  • reference characters Q11 to Q30 are characteristic portions of the human body M specified by the posture estimating unit 13.
  • Symbols Q11 to Q15 are characteristic portions corresponding to the terminal portions, where Q11 is a head, Q12 is a left foot, Q13 is a left hand, Q14 is a right foot, and Q15 is a right hand.
  • Reference characters Q16 to Q27 are characteristic portions corresponding to joints: Q16 is the left ankle, Q17 is the left knee, Q18 is the left hip joint (base of the left foot), Q19 is the right ankle, Q20 is the right knee, Q21 is the right hip joint (base of the right foot), Q22 is the left wrist, Q23 is the left elbow, Q24 is the left scapulohumeral joint, Q25 is the right wrist, Q26 is the right elbow, and Q27 is the right scapulohumeral joint.
  • Symbols Q28 to Q30 are characteristic portions corresponding to intermediate portions between a distal end and a joint or between two joints: Q28 is the waist (the center of the left and right hip joints), Q29 is the neck (the center of the left and right scapulohumeral joints), and Q30 is the back (the center between the waist Q28 and the neck Q29).
  • Because the partial determination unit 12 can determine the left body part and the right body part with high accuracy, the posture estimating unit 13 of the present embodiment can specify the characteristic portions (e.g., the left ankle, the right ankle) of the human body M with high accuracy. As a result of the recognition process, the posture estimating unit 13 generates posture information (e.g., skeleton information, skeleton data) as a set of the position of each characteristic portion, information on each characteristic portion (e.g., attribute information, a name such as right knee or right hip joint), and connection information indicating the connection relationship between two characteristic portions. The posture estimating unit 13 stores the above posture information in the storage unit 14.
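  • As one hedged illustration of how such posture information (positions, attribute information, and connection information) might be laid out as data; the dictionary layout and the coordinate values below are hypothetical examples, not taken from the original:

```python
# Hypothetical posture (skeleton) information: characteristic portions and their connections.
posture_info = {
    "points": {
        "Q19": {"name": "right ankle",     "xyz": (0.12, 0.09, 0.31)},
        "Q20": {"name": "right knee",      "xyz": (0.15, 0.48, 0.30)},
        "Q21": {"name": "right hip joint", "xyz": (0.14, 0.92, 0.28)},
    },
    # Connection information: each pair of characteristic portions that has a connection relationship.
    "connections": [
        ("Q19", "Q20"),  # right shin Q31
        ("Q20", "Q21"),  # right thigh Q32
    ],
}
```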
  • the posture estimating unit 13 estimates the posture of the human body M based on the posture information.
  • The posture estimating unit 13 uses the relative position (e.g., an angle) between a first line connecting a pair of characteristic portions having a connection relationship and a second line that is connected to the first line and connects another pair of characteristic portions having a connection relationship, to estimate the posture of the portion including the first line and the second line.
  • For example, the first line is the right shin Q31 connecting the right ankle Q19 and the right knee Q20, and the second line is the right thigh Q32 connecting the right knee Q20 and the right hip joint Q21.
  • The posture estimating unit 13 calculates the angle θ formed between the right shin Q31 and the right thigh Q32.
  • When the angle θ formed by the right shin Q31 and the right thigh Q32 is equal to or smaller than a threshold, the posture estimating unit 13 determines that the right leg of the human body M, which includes the right shin Q31 and the right thigh Q32, is in a bent posture.
  • When the angle θ is larger than the threshold, the posture estimating unit 13 determines that the right leg of the human body M including the right shin Q31 and the right thigh Q32 is in an extended posture.
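  • A short sketch of this bent/extended decision, assuming each characteristic portion is given as a 3-D coordinate and the threshold is user-defined (the 120-degree value below is only an example):

```python
import numpy as np

def leg_posture(ankle, knee, hip, bent_threshold_deg=120.0):
    """Classify a leg as 'bent' or 'extended' from the shin-thigh angle at the knee."""
    shin = np.asarray(ankle, float) - np.asarray(knee, float)   # right shin Q31 (knee -> ankle)
    thigh = np.asarray(hip, float) - np.asarray(knee, float)    # right thigh Q32 (knee -> hip)
    cos_theta = shin @ thigh / (np.linalg.norm(shin) * np.linalg.norm(thigh))
    theta = np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))  # angle theta at the knee
    return ("bent" if theta <= bent_threshold_deg else "extended"), theta
```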
  • the posture estimating unit 13 estimates the posture of a part or the whole of the human body M by estimating the posture as described above, for example, for each of the characteristic portions set in advance by the user.
  • the posture estimating unit 13 may estimate the posture with reference to posture definition information that defines the posture of an object (eg, a human body).
  • the above-described posture definition information is, for example, information in which information representing the type of posture and information defining the relative position of each characteristic portion are paired.
  • the information indicating the type of the posture is, for example, a posture name such as a walking posture, a sitting position, and a yoga pose name.
  • The information defining the relative position of each characteristic portion is, for example, a range of the angle θ formed by the right shin Q31 and the right thigh Q32, or a threshold value.
  • the posture estimation unit 13 may estimate (identify) the posture of the human body M by comparing the generated posture information with the posture definition information.
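  • A hedged sketch of comparing generated posture information with posture definition information, where each definition pairs a posture name with an allowed knee-angle range; the names and ranges below are illustrative, not taken from the original:

```python
# Hypothetical posture definition information: posture name paired with an angle range (degrees).
posture_definitions = [
    {"name": "sitting", "knee_angle_deg": (60.0, 120.0)},
    {"name": "walking", "knee_angle_deg": (120.0, 180.0)},
]

def identify_posture(knee_angle_deg, definitions=posture_definitions):
    """Return the first posture whose defined angle range contains the measured knee angle."""
    for definition in definitions:
        low, high = definition["knee_angle_deg"]
        if low <= knee_angle_deg <= high:
            return definition["name"]
    return "unknown"
```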
  • FIG. 8 is a flowchart illustrating the detection method according to the first embodiment.
  • For the configuration of the detection device 1 and the processing by each unit, refer to FIGS. 1 to 7 as appropriate.
  • In step S1, the position detection unit 2 detects position information of each point on the surface of the object M.
  • the position detector 2 detects depth as position information, and generates point cloud data representing position information as a detection result (see FIG. 3).
  • the processing in step S1 includes the processing in step S2 and the processing in step S3.
  • In step S2, the detection unit 4 detects the depth from a predetermined point (viewpoint) to each point in the target area AR.
  • the detection unit 4 causes the storage unit 14 to store a depth map (depth spatial distribution) representing the detection result.
  • In step S3, the point cloud data generation unit 5 generates point cloud data based on the depth detection result.
  • the point cloud data generation unit 5 causes the storage unit 14 to store the generated point cloud data.
  • In step S4, the reference calculation unit 11 calculates, as the reference position BP, a position where the amount of change in the position information satisfies a predetermined condition.
  • the reference calculation unit 11 calculates a first reference position BP1 and a second reference position BP2 as the reference position BP.
  • the processing in step S4 includes the processing in step S5 and the processing in step S6.
  • In step S5, the reference calculation unit 11 calculates the first reference position BP1 based on the amount of change in the coordinates (position information) in the vertical direction (Y direction) (see FIG. 4).
  • the reference calculation unit 11 causes the storage unit 14 to store the calculated first reference position BP1.
  • In step S6, the reference calculation unit 11 calculates the second reference position BP2 based on the first reference position BP1 (see FIG. 5).
  • the reference calculation unit 11 sets a predetermined area AR2 based on the first reference position BP1 calculated in step S5, and calculates a position satisfying a predetermined condition in the predetermined area AR2 as a second reference position BP2.
  • the reference calculation unit 11 causes the storage unit 14 to store the calculated second reference position BP2.
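  • The exact conditions used for BP1 and BP2 are defined with reference to FIGS. 4 and 5 and are not reproduced here; the following is only a rough sketch of the scan-along-Z idea, in which the per-bin feature and the selection rules are assumptions for illustration:

```python
import numpy as np

def reference_positions(points, n_bins=64, window=8):
    """Rough stand-ins for BP1 (step S5) and BP2 (step S6) from an (N, 3) point cloud.

    The per-bin feature (lowest Y in each Z bin) and the selection rules below are
    assumptions for illustration, not the conditions defined in the original text.
    """
    points = np.asarray(points, float)
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(z, edges) - 1, 0, n_bins - 1)

    # Per-bin feature: the minimum Y (lowest point) within each Z bin.
    feat = np.array([points[bin_idx == i, 1].min() if np.any(bin_idx == i) else np.nan
                     for i in range(n_bins)])
    change = np.abs(np.diff(feat))
    bp1_bin = int(np.nanargmax(change))            # BP1: where the feature changes most

    # BP2: search a predetermined area AR2 around BP1 (+/- 'window' bins) for, e.g.,
    # the bin with the smallest feature value.
    low, high = max(0, bp1_bin - window), min(n_bins, bp1_bin + window)
    bp2_bin = low + int(np.nanargmin(feat[low:high]))

    centers = (edges[:-1] + edges[1:]) / 2
    return centers[bp1_bin], centers[bp2_bin]
```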
  • In step S7, based on the reference position BP, the part determination unit 12 determines the first part M1 on the first side (−Z side) and the second part M2 on the second side (+Z side) with respect to the reference plane BF.
  • the partial determination unit 12 sets the reference plane BF based on the second reference position BP2 calculated in step S6. Further, the partial determination unit 12 calculates a geodesic line SL connecting the determination target portion QX of the partial determination process and the second reference position BP2.
  • the part determination unit 12 determines whether the determination target part QX belongs to the first part M1 or the second part M2 based on the relative position between the reference plane BF and the geodesic line SL.
  • the partial determination unit 12 generates, as a processing result of the partial determination processing, determination information indicating which of the first part M1 and the second part M2 the determination target part QX belongs to.
  • the partial determination unit 12 causes the storage unit 14 to store the generated determination information.
  • the posture estimating unit 13 estimates the posture of the object M.
  • the posture estimating unit 13 executes a posture estimating process of estimating the posture of the object M (human body M) based on the processing result (eg, the discrimination information) of the partial determining unit 12.
  • the posture estimating unit 13 performs a recognition process on the shape of the human body M obtained from the point cloud data, and specifies a characteristic portion of the human body M (see FIG. 7).
  • the posture estimating unit 13 estimates the posture of the human body M based on the relative positions of the specified characteristic portions.
  • As described above, in the detection device 1, the reference calculation unit 11 calculates the reference position BP based on the amount of change in the position information, and the part determination unit 12 determines, based on the reference position BP, the first portion M1 of the object M on the first side (e.g., the −Z side, the left side) and the second portion M2 on the second side (e.g., the +Z side, the right side). Therefore, the detection device 1 can determine the first portion M1 (e.g., the left foot) and the second portion M2 (e.g., the right foot) with high accuracy.
  • As a result, the posture estimating unit 13 can estimate the posture of the object M (the human body M) with high accuracy.
  • the reference calculation unit 11 estimates the position of a portion including the center of the object M in the cross direction (eg, the Z direction) as the reference position BP.
  • Since the part determination unit 12 performs the part determination processing based on the position of the portion including the center of the object M, it can accurately determine the first part M1 on the first side and the second part M2 on the second side.
  • The reference position BP may be shifted from the center of the object M in the Z direction. For example, assume that the reference position BP is shifted from the center of the object M toward the first side (+Z side) by a predetermined shift amount.
  • In this case, the maximum distance between the geodesic line SL (see FIG. 6) and the reference plane BF is larger than the amount of displacement between the reference position BP and the center of the object M.
  • The part determination unit 12 calculates a feature amount of the portion of the geodesic line SL arranged on the first side (+Z side) with respect to the reference plane BF (e.g., the ratio of that portion to the entire geodesic line SL, or its distance from the reference plane BF), and when this feature amount satisfies a predetermined condition (e.g., exceeds a threshold), the part determination unit 12 may determine that the determination target portion QX is a part arranged on the first side.
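  • A small sketch of this ratio-style check, again assuming sampled geodesic points and a point-plus-normal reference plane; the 0.5 threshold in the comment is illustrative:

```python
import numpy as np

def fraction_on_positive_side(geodesic_points, plane_point, plane_normal):
    """Fraction of sampled geodesic points lying on the +normal side of the reference plane."""
    n = np.asarray(plane_normal, float) / np.linalg.norm(plane_normal)
    signed_dist = (np.asarray(geodesic_points, float) - plane_point) @ n
    return float(np.mean(signed_dist > 0.0))

# e.g. treat the determination target portion QX as lying on that side when more than
# half of the geodesic is there (an illustrative condition):
# on_first_side = fraction_on_positive_side(points, plane_point, plane_normal) > 0.5
```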
  • the reference calculation unit 11 may calculate the reference position BP by processing the depth detected by the position detection unit 2 (detection unit 4).
  • The reference calculation unit 11 may calculate the reference position BP based on shape information (e.g., polygon data) of the object M obtained from the depth detected by the position detection unit 2 (detection unit 4) (this will be described later).
  • the reference calculation unit 11 does not have to set the predetermined area AR2.
  • the reference calculation unit 11 may set the center between the first reference position BP1a and the first reference position BP1b to the second reference position BP2. Further, the reference calculation unit 11 may calculate only one of the first reference position BP1a and the first reference position BP1b as the first reference position BP1. In this case, the reference calculation unit 11 may set a position shifted by a predetermined amount in the Z direction from the first reference position BP1 as the second reference position BP2.
  • the predetermined amount may be set, for example, based on the size (scale) of the object M.
  • For example, the predetermined amount may be set to a predetermined ratio (for example, 25%) of the size of the object M in the Z direction (for example, the distance between the left end of the left shoulder and the right end of the right shoulder).
  • the reference calculation unit 11 does not need to calculate the first reference position BP1.
  • In that case, the reference calculation unit 11 may determine the pattern in which a feature amount (e.g., Xmin(Zi)) of the front side or the rear side of the object M changes in the Z direction (e.g., FIG. 5(B)) and calculate the reference position BP (e.g., the second reference position BP2) from that pattern.
  • the detection device 1 does not need to include the attitude estimation unit 13 described above.
  • the detection device 1 may calculate the feature amounts (eg, dimensions, outer peripheral length, cross-sectional area) of the part determined by the part determination unit 12.
  • the detection device 1 may segment the object M using the processing result of the partial determination processing performed by the partial determination unit 12.
  • The detection device 1 may use the processing result of the partial determination processing by the partial determination unit 12 (e.g., information identifying whether a detected "foot" is the "left foot" or the "right foot") to generate other information (described later with reference to FIG. 10).
  • FIG. 9 is a diagram illustrating a detection device according to the second embodiment.
  • In the first embodiment, the moving direction is determined in advance with respect to the target area AR (e.g., the field of view of the position detection unit 2); in the present embodiment, the detection device 1 detects the moving direction of the object M (the human body M).
  • the detection device 1 includes a direction calculation unit 21 that calculates (derives and detects) the moving direction of the object M (human body M) based on the detection result of the position detection unit 2.
  • the detecting unit 4 repeatedly detects the human body M, and the point cloud data generating unit 5 generates point cloud data of the human body M for each detection timing as position information.
  • the direction calculation unit 21 calculates the moving direction of the human body M based on the time change of the position information of the human body M detected by the position detection unit 2. For example, the direction calculation unit 21 calculates a trajectory of the human body M (for example, a time history of a predetermined position of the human body M), and sets a direction along the trajectory as a moving direction of the human body M.
  • the predetermined position is a position on the human body M determined by default setting or user setting.
  • the direction calculation unit 21 calculates the center of gravity of a plurality of points included in the point cloud data as the predetermined position of the human body M.
  • The direction calculation unit 21 calculates the moving direction based on the temporal change of the calculated center of gravity. For example, the direction calculation unit 21 calculates a vector whose start point is the predetermined position of the human body M obtained from the detection result of the detection unit 4 at a first time and whose end point is the predetermined position of the human body M obtained from the detection result of the detection unit 4 at a second time after the first time, and determines a direction vector (e.g., a unit vector) parallel to this vector as the moving direction of the human body M at the first time.
  • the direction calculation unit 21 causes the storage unit 14 to store the calculated movement direction (eg, movement information).
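  • A minimal sketch of this centroid-based direction calculation, assuming the point cloud data of the human body M at two detection timings are given as arrays (all names are illustrative):

```python
import numpy as np

def moving_direction(points_t1, points_t2):
    """Moving direction at the first time from the centroid trajectory of two point clouds."""
    c1 = np.asarray(points_t1, float).mean(axis=0)  # center of gravity at the first time
    c2 = np.asarray(points_t2, float).mean(axis=0)  # center of gravity at the second time
    v = c2 - c1                                     # vector from the first to the second position
    return v / np.linalg.norm(v)                    # unit direction vector (the moving direction)
```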
  • the reference calculation unit 11 calculates a reference position BP based on the moving direction calculated by the direction calculation unit 21.
  • the reference calculation unit 11 reads the movement direction calculated by the direction calculation unit 21 from the storage unit 14, sets the movement direction to the X direction, and executes a reference calculation process.
  • the part determination unit 12 determines the first part M1 and the second part M2 based on the moving direction calculated by the direction calculation unit 21.
  • the part determination unit 12 sets a plane parallel to the vertical direction and the movement direction calculated by the direction calculation unit 21 and including the second reference position BP2 as the reference plane BF.
  • the part determination unit 12 determines whether the determination target part QX is located on the first side (eg, ⁇ Z side) or the second side (eg, + Z side) with respect to the set reference plane BF. Determine.
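  • A small sketch of one way such a reference plane could be represented, assuming the movement direction and the vertical direction are given as 3-D vectors and the second reference position BP2 as a point; representing the plane as a point plus a unit normal is an implementation choice, not from the original:

```python
import numpy as np

def reference_plane(move_dir, vertical_dir, bp2):
    """Plane containing the movement and vertical directions and passing through BP2.

    Returns (point_on_plane, unit_normal); the normal points along the crossing (Z) direction.
    """
    normal = np.cross(np.asarray(move_dir, float), np.asarray(vertical_dir, float))
    return np.asarray(bp2, float), normal / np.linalg.norm(normal)
```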
  • the predetermined position may be the position of the characteristic portion (eg, the head of the human body M) described in the first embodiment. Further, the predetermined position may be a position of a characteristic part (eg, right foot, left foot) of the human body M derived based on the determination result of the partial determination unit 12. Further, the predetermined position may be a reference position BP calculated by the reference calculation unit 11 (eg, a second reference position BP2). For example, the direction calculation unit 21 may calculate a candidate for the movement direction based on a trajectory of a first predetermined position (for example, the center of gravity) set in advance.
  • the reference calculation unit 11 may calculate a candidate for the reference position BP (for example, the second reference position BP2), using the candidate for the movement direction calculated by the direction calculation unit 21 as the X direction.
  • The direction calculation unit 21 may then use the candidate for the second reference position BP2 calculated by the reference calculation unit 11 as a second predetermined position, and calculate (e.g., recalculate) the moving direction based on the trajectory of the second predetermined position.
  • the reference calculation unit 11 may calculate (recalculate) the reference position BP with the movement direction calculated using the second predetermined position as the X direction.
  • the direction calculation unit 21 may calculate the moving direction of the object M based on the position information of the object M obtained from the global positioning system (GPS). Further, an acceleration sensor may be provided on the object M, and the direction calculation unit 21 may calculate the moving direction of the object M based on the detection result of the acceleration sensor.
  • The human body M may move while carrying a mobile terminal (for example, a smartphone) that includes one or both of a receiving unit that receives information from GPS and the acceleration sensor, and the direction calculation unit 21 may acquire the position information (e.g., GPS information, acceleration) of the human body M from the mobile terminal and calculate the moving direction of the object M.
  • The detection device 1 may include a vertical detection unit (for example, a sensor that detects the direction of gravitational acceleration) that detects the vertical direction, set the detection result of the vertical detection unit as the Y direction, and execute one or both of the reference calculation processing and the partial determination processing.
  • the detection device 1 may not include the vertical detection unit.
  • The vertical detection unit may be provided on the object M (carried by the human body M), and the detection device 1 may execute one or both of the reference calculation processing and the partial determination processing based on the vertical direction acquired from the vertical detection unit.
  • the vertical detection unit may be provided separately from both the detection device 1 and the object M.
  • the vertical detection unit may be provided in a place where the object M is detected (for example, equipment in which the detection device 1 is installed).
  • FIG. 10 is a diagram illustrating a detection device according to the third embodiment.
  • the detection device 1 includes a model generation unit 22 that generates model information of an object M (in this case, a human body M).
  • the model information is, for example, three-dimensional CG model data, and includes shape information of the object M.
  • the model generation unit 22 calculates model information of the object M based on the first part M1 and the second part M2 determined by the part determination unit 12.
  • the model generation unit 22 includes the point cloud data generation unit 5 and the surface information generation unit 23.
  • the point cloud data generation unit 5 generates point cloud data as shape information based on the depth of the object M detected by the detection unit 4 as described in the first embodiment.
  • the surface information generation unit 23 performs a surface process for calculating surface information as shape information.
  • the surface information includes, for example, at least one of polygon data, vector data, and draw data.
  • the surface information includes coordinates of a plurality of points on the surface of the object and connection information between the plurality of points.
  • the connection information (for example, attribute information) includes, for example, information that associates points at both ends of a line corresponding to a ridge line (for example, an edge) on the surface of an object. Further, the connection information includes, for example, information that associates a plurality of lines corresponding to the contour of the object surface (surface) with each other.
  • the surface information generation unit 23 estimates a surface between a point selected from a plurality of points included in the point cloud data and a nearby point in the surface processing.
  • The surface information generation unit 23 segments the point cloud data using the processing result of the partial determination unit 12 when estimating a surface between a point and a nearby point. For example, when generating the surface information of the left foot of the human body M, the surface information generation unit 23 distinguishes the point cloud data of the left foot from the point cloud data of the right foot based on the processing result of the partial determination unit 12. In this case, for example, even when the left knee and the right knee are close to each other (for example, in contact), the surface information generation unit 23 avoids estimating a surface that straddles the left knee and the right knee, so accurate surface information can be generated.
  • the surface information generation unit 23 converts the point cloud data into polygon data having plane information between points.
  • the surface information generation unit 23 converts the point cloud data into polygon data by, for example, an algorithm using the least squares method. This algorithm may be, for example, an algorithm that is published in a point cloud processing library.
  • the surface information generation unit 23 causes the storage unit 14 to store the calculated surface information.
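  • As a hedged sketch of this segment-then-mesh flow, using the Open3D library purely as one possible implementation (the text only refers to an algorithm published in a point cloud processing library); the left/right labels are assumed to come from the partial determination result:

```python
import numpy as np
import open3d as o3d

def meshes_per_part(points, labels):
    """Mesh each part separately so that no estimated surface straddles the left and right parts.

    points: (N, 3) point cloud of the human body M.
    labels: (N,) array, e.g. 0 for the first part M1 (left) and 1 for the second part M2 (right).
    """
    points = np.asarray(points, float)
    labels = np.asarray(labels)
    meshes = {}
    for part in np.unique(labels):
        pcd = o3d.geometry.PointCloud()
        pcd.points = o3d.utility.Vector3dVector(points[labels == part])
        pcd.estimate_normals()  # normals are required by Poisson surface reconstruction
        mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)
        meshes[int(part)] = mesh
    return meshes
```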
  • the model information may include texture information of the object M.
  • the model generation unit 22 may generate texture information of a plane defined by three-dimensional point coordinates and related information.
  • the texture information includes, for example, at least one information of a character, a graphic, a pattern, a texture, a pattern, information defining irregularities, a specific image, and a color (eg, chromatic color, achromatic color) on the surface of the object.
  • the model generation unit 22 may cause the storage unit 14 to store the generated texture information.
  • the model information may include spatial information (eg, lighting conditions, light source information) of the image.
  • the light source information includes a position of a light source that irradiates the object M with light (eg, illumination light), a direction in which light is emitted from the light source to the object M (irradiation direction), a wavelength of light emitted from this light source, And information of at least one of the types of the light sources.
  • the model generation unit 22 may calculate the light source information using, for example, a model assuming Lambertian reflection, a model including Albedo estimation, or the like.
  • the model generation unit 22 may cause the storage unit 14 to store the generated spatial information.
  • the model generation unit 22 does not need to generate one or both of the texture information and the spatial information.
  • FIG. 11 is a diagram illustrating a detection device according to the fourth embodiment.
  • In the present embodiment, the detection device 1 (detection system) includes a model generation unit 22, a rendering processing unit 24 (a rendering processing device, an information processing device), an input device 25, and a display device 26.
  • The rendering processing unit 24 includes, for example, a graphics processing unit (GPU). Note that the rendering processing unit 24 may be configured so that a CPU and a memory execute each process according to an image processing program.
  • the rendering processing unit 24 performs, for example, at least one of drawing processing, texture mapping processing, and shading processing.
  • the rendering processing unit 24 can calculate, for example, an estimated image (eg, a reconstructed image) obtained by viewing the shape defined in the shape information of the model information from an arbitrary viewpoint.
  • the shape indicated by the shape information is referred to as a model shape.
  • the rendering processing unit 24 can reconstruct a model shape (eg, an estimated image) from the model information (eg, shape information) by, for example, a drawing process.
  • the rendering processing unit 24 causes the storage unit 14 to store the data of the calculated estimated image, for example.
  • the rendering processing unit 24 can calculate, for example, an estimated image in which the image indicated by the texture information of the model information is pasted on the surface of the object on the estimated image.
  • the rendering processing unit 24 can calculate an estimated image in which a texture different from the target object is pasted on the surface of the object on the estimated image.
  • the rendering processing unit 24 can calculate, for example, an estimated image in which a shadow formed by the light source indicated by the light source information of the model information is added to an object on the estimated image. In the shading process, the rendering processing unit 24 can calculate, for example, an estimated image in which a shadow formed by an arbitrary light source is added to an object on the estimated image.
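  • A minimal sketch of the shading idea alone (per-vertex intensity for a directional light under a Lambertian reflection assumption); the actual rendering processing unit 24 also performs drawing and texture mapping, and all names below are illustrative:

```python
import numpy as np

def lambert_shading(vertex_normals, light_dir, albedo=0.8, ambient=0.1):
    """Per-vertex intensity for a directional light, assuming Lambertian reflection."""
    n = np.asarray(vertex_normals, float)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    l = np.asarray(light_dir, float) / np.linalg.norm(light_dir)
    diffuse = np.clip(n @ l, 0.0, None)  # cosine term, clamped to zero for back-facing normals
    return ambient + albedo * diffuse    # shaded intensity per vertex
```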
  • the input device 25 is used for inputting various information (eg, data, commands) to the processing device 3.
  • the user can input various information to the processing device 3 by operating the input device 25.
  • the input device 25 includes, for example, at least one of a keyboard, a mouse, a trackball, a touch panel, and a voice input device (eg, a microphone).
  • the display device 26 displays the image based on the image data output from the processing device 3.
  • the processing device 3 outputs the data of the estimated image generated by the rendering processing unit 24 to the display device 26.
  • the display device 26 displays the estimated image based on the data of the estimated image output from the processing device 3.
  • the display device 26 includes, for example, a liquid crystal display.
  • the display device 26 and the input device 25 may be a touch panel or the like.
  • the detection device 1 does not need to include the input device 25.
  • the detection device 1 may be in a form in which various commands and information are input via communication.
  • the detection device 1 does not need to include the display device 26.
  • the detection device 1 may output data of an estimated image generated by the rendering process to a display device via communication, and the display device may display the estimated image.
  • the rendering processing unit 24 is provided in the processing device 3 in FIG. 11, but may be provided in a device external to the processing device 3.
  • the external device may be a cloud computer communicably connected to the processing device 3.
  • the processing device 3 includes, for example, a computer system.
  • the processing device 3 reads a processing program stored in the storage unit 14 and executes various processes according to the processing program.
  • This processing program causes a computer to execute: calculating, as a reference position, a position where the amount of change in the position information of each point on the surface of the object satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the moving object and the vertical direction; and determining, based on the reference position, a first portion of the object disposed on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second portion of the object disposed on a second side opposite to the first side with respect to the reference plane.
  • This program may be provided by being recorded on a computer-readable storage medium (e.g., a non-transitory recording medium, a non-transitory tangible medium).

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

[Problem] To determine the portion of a moving object on one side and the portion on the other side with respect to a crossing direction that intersects the moving direction of the object. [Solution] The present invention relates to a detection device comprising: a detection unit that detects position information of points on a moving object; a reference calculation unit that calculates, as a reference position, a position where the amount of change of the position information in a direction intersecting the moving direction of the object and the vertical direction satisfies a predetermined condition; and a determination unit that determines, with respect to a reference plane that is based on the reference position calculated by the reference calculation unit and that contains the moving direction and the vertical direction, a first portion of the object disposed on a first side and a second portion of the object disposed on a second side opposite to the first side with respect to the reference plane.
PCT/JP2019/026218 2018-07-31 2019-07-02 Dispositif de détection, dispositif de traitement, procédé de détection et programme de traitement WO2020026677A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020534123A JP7024876B2 (ja) 2018-07-31 2019-07-02 検出装置、処理装置、検出方法、及び処理プログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018143687 2018-07-31
JP2018-143687 2018-07-31

Publications (1)

Publication Number Publication Date
WO2020026677A1 true WO2020026677A1 (fr) 2020-02-06

Family

ID=69230965

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026218 WO2020026677A1 (fr) 2018-07-31 2019-07-02 Dispositif de détection, dispositif de traitement, procédé de détection et programme de traitement

Country Status (2)

Country Link
JP (1) JP7024876B2 (fr)
WO (1) WO2020026677A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007107927A (ja) * 2005-10-11 2007-04-26 Nittetsu Hokkaido Control Systems Corp 長さ測定装置、長さ測定方法および長さ測定用コンピュータプログラム
JP2008032489A (ja) * 2006-07-27 2008-02-14 Kanazawa Inst Of Technology 人体の3次元形状データ生成方法および3次元形状データ生成装置
JP2012215555A (ja) * 2011-03-30 2012-11-08 Advanced Telecommunication Research Institute International 計測装置,計測方法および計測プログラム
WO2017119154A1 (fr) * 2016-01-07 2017-07-13 三菱電機株式会社 Dispositif et procédé de détection

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7261342B1 (ja) 2022-09-22 2023-04-19 三菱ケミカルグループ株式会社 情報処理装置、方法、プログラム、およびシステム
WO2024062642A1 (fr) * 2022-09-22 2024-03-28 株式会社Shosabi Dispositif de traitement d'informations, procédé, programme et système associés
JP2024045823A (ja) * 2022-09-22 2024-04-03 三菱ケミカルグループ株式会社 情報処理装置、方法、プログラム、およびシステム
WO2024116444A1 (fr) * 2022-11-28 2024-06-06 株式会社Jvcケンウッド Dispositif de traitement d'image et programme de traitement d'image

Also Published As

Publication number Publication date
JPWO2020026677A1 (ja) 2021-08-02
JP7024876B2 (ja) 2022-02-24

Similar Documents

Publication Publication Date Title
US9594950B2 (en) Depth mapping with enhanced resolution
US9898651B2 (en) Upper-body skeleton extraction from depth maps
US9842405B2 (en) Visual target tracking
US8565476B2 (en) Visual target tracking
US8577084B2 (en) Visual target tracking
US7974443B2 (en) Visual target tracking using model fitting and exemplar
US8682028B2 (en) Visual target tracking
US9767611B2 (en) Information processing apparatus and method for estimating depth values using an approximate plane
US8577085B2 (en) Visual target tracking
US20140010425A1 (en) Extraction of skeletons from 3d maps
US8565477B2 (en) Visual target tracking
JP2016071645A (ja) オブジェクト3次元モデル復元方法、装置およびプログラム
JP7363962B2 (ja) 処理装置、検出装置、システム及びプログラム
JP7024876B2 (ja) 検出装置、処理装置、検出方法、及び処理プログラム
JP7200994B2 (ja) 処理装置、検出装置、処理方法、及び処理プログラム
JP7234595B2 (ja) 検出装置、処理装置、検出方法、及び処理プログラム
JP7447956B2 (ja) 処理装置、姿勢解析システム、及びプログラム
JP2022037506A (ja) 検出装置、処理装置、検出方法、及び処理プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19845513

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020534123

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19845513

Country of ref document: EP

Kind code of ref document: A1