WO2020026677A1 - Detection device, processing device, detection method, and processing program - Google Patents

Detection device, processing device, detection method, and processing program

Info

Publication number
WO2020026677A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
reference position
point
detection device
calculation unit
Prior art date
Application number
PCT/JP2019/026218
Other languages
French (fr)
Japanese (ja)
Inventor
源洋 中川
幹也 田中
貴次 青山
Original Assignee
株式会社ニコン
Application filed by 株式会社ニコン
Priority to JP2020534123A (JP7024876B2)
Publication of WO2020026677A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Definitions

  • the present invention relates to a detection device, a processing device, a detection method, and a processing program.
  • As a technique for detecting an object, for example, there is a technique described in Patent Document 1 below.
  • In a moving object such as a walking human body, the left foot and the right foot are substantially at the same position in the shoulder width direction of the human body, and it is difficult to distinguish between the left foot and the right foot.
  • A detection device includes: a detection unit that detects position information of each point of a moving object; a reference calculation unit that calculates, as a reference position, a position where the amount of change of the position information satisfies a predetermined condition in a cross direction intersecting the moving direction of the object and the vertical direction; and a determination unit that determines, based on the reference position calculated by the reference calculation unit, a first portion of the object disposed on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second portion of the object disposed on a second side opposite to the first side with respect to the reference plane.
  • a position where the amount of change in the position information of each point on the surface of the object satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the moving object and the vertical direction is calculated as a reference position.
  • In a detection method, the position information of each point of a moving object is detected; a position where the amount of change of the position information satisfies a predetermined condition in the cross direction intersecting the moving direction of the object and the vertical direction is calculated as a reference position; and, based on the reference position, a first portion and a second portion of the object are determined.
  • A processing program causes a computer to execute: calculating, as a reference position, a position where the amount of change in the position information of each point on the surface of a moving object satisfies a predetermined condition in the cross direction intersecting the moving direction of the object and the vertical direction; and determining, based on the reference position, a first portion of the object disposed on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second portion of the object disposed on a second side opposite to the first side with respect to the reference plane.
  • FIG. 2 is a diagram illustrating the detection unit according to the first embodiment.
  • FIG. 3 is a diagram illustrating a process of the point cloud data generation unit according to the first embodiment.
  • FIGS. 4A to 4C are diagrams illustrating a process of the reference calculation unit according to the first embodiment.
  • FIGS. 5A and 5B are diagrams illustrating a process of the reference calculation unit according to the first embodiment.
  • FIG. 6 is a diagram illustrating a process of the partial determination unit according to the first embodiment.
  • FIG. 7 is a diagram illustrating a process of the posture estimating unit according to the first embodiment. A further figure is a flowchart illustrating the detection method according to the first embodiment, and further figures illustrate detection devices according to a second, a third, and a fourth embodiment.
  • FIG. 1 is a diagram illustrating a detection device according to the first embodiment.
  • the detection device 1 (detection system) is, for example, a motion capture device, a motion detection system, an exercise support system, or the like.
  • the detection device 1 is used for posture analysis or three-dimensional modeling. In these cases, the detection device 1 detects the object M that moves in one direction or a plurality of directions in a predetermined time range.
  • the detection device 1 may detect the object M that moves linearly, or may detect the object M that moves meandering.
  • the movement path (eg, trajectory) of the object M when the detection device 1 detects the object M may include a straight line, may include a curve, or may include a straight line and a curve.
  • The detection device 1 determines a first portion M1 of the object M arranged on a first side (eg, the left side) with respect to the moving direction MD of the object M, and a second portion M2 of the object M arranged on the side opposite to the first side (eg, the right side).
  • the object M is movable in a target area AR to be detected by the detection device 1 (eg, the detection region of the detection device 1, the field of view).
  • the object M includes, for example, a human body or a non-human animal, a human-type or non-human animal-type robot, or a non-animal-type robot.
  • the object M is a human body, and the object M is appropriately referred to as a human body M.
  • the object M (in this case, the human body M) is, for example, an object whose posture and / or shape changes with movement.
  • The detection device 1 can determine the first portion M1 disposed on the first side (eg, the left side) and the second portion M2 disposed on the side opposite to the first side (eg, the right side), with the moving direction MD of the human body M or the midline of the human body M (eg, the left-right center line on the body surface) as a boundary.
  • The moving human body M is, for example, a human body performing a sport such as fencing, baseball, soccer, golf, kendo, American football, ice hockey, or gymnastics; walking or running; exercise such as yoga or bodybuilding; posing as in a fashion show; playing a game; or moving for person authentication or work.
  • the object M may include a moving human body and an object attached to the moving human body (eg, clothing, wearing equipment, exercise equipment, and armor).
  • In the following description, the XYZ orthogonal coordinate system shown in FIG. 1 is used. In this XYZ orthogonal coordinate system, the X direction is the moving direction of the object M (eg, the human body M), the Y direction is the vertical direction, and the Z direction is a cross direction (eg, an orthogonal direction) intersecting both the moving direction of the object M (the human body M) and the vertical direction.
  • the Z direction includes the direction of the shoulder width of the human body M
  • the first part M1 is at least a part of the left half of the human body M
  • the second part M2 is at least a part of the right half of the human body M.
  • the side indicated by the arrow is appropriately referred to as the + side (eg, +Z side), and the opposite side is referred to as the - side (eg, -Z side).
  • the first side is the -Z side
  • the second side is the + Z side.
  • the detection device 1 includes a position detection unit 2 and a processing device 3.
  • the position detector 2 detects position information of the moving human body M.
  • the position detecting unit 2 detects point group data including three-dimensional coordinates of each point on the surface of the human body M as position information of the human body M.
  • the moving direction of the human body M is predetermined with respect to the target area AR (for example, the field of view of the position detection unit 2).
  • the vertical direction and the moving direction of the human body M in the target area AR are given to the detection device 1 as known information.
  • The detection result (eg, point cloud data) of the position detector 2 is output to the processing device 3.
  • the position detector 2 includes a detector 4 and a point cloud data generator 5.
  • the detection unit 4 is, for example, at least a part of a portable device.
  • the detection unit 4 may be at least a part of a stationary device.
  • the detection unit 4 may be provided inside the processing device 3.
  • the point cloud data generating unit 5 may be provided in a device outside the processing device 3.
  • the position detection unit 2 is a device external to the processing device 3 and may include the detection unit 4 and the point cloud data generation unit 5.
  • a part or the whole of the detection device 1 may be a portable device (eg, an information terminal, a smartphone, a tablet, a camera-equipped mobile phone, and a wearable terminal).
  • a part or the whole of the detection device 1 may be a stationary device (eg, a fixed-point camera).
  • FIG. 2 is a diagram illustrating the detection unit according to the first embodiment.
  • the detection unit 4 detects depth as position information of the human body M.
  • the detection unit 4 includes, for example, a depth sensor (eg, a depth camera).
  • The detection unit 4 detects the depth (eg, position information, distance) from a predetermined point to each point on the surface of the object placed in the target area AR.
  • The predetermined point is, for example, a point at a position serving as a reference for detection by the detection unit 4 (eg, a viewpoint, a detection source point, a point representing the position of the detection unit 4, a pixel position of the imaging element 8 described later).
  • the detection unit 4 includes an irradiation unit 6, an optical system 7, and an imaging device 8.
  • the irradiation unit 6 irradiates (eg, projects) light La (eg, pattern light, irradiation light) to the target area AR (space, detection area).
  • the optical system 7 includes, for example, an imaging optical system.
  • the imaging device 8 includes, for example, a CMOS image sensor or a CCD image sensor.
  • the imaging element 8 has a plurality of pixels arranged two-dimensionally.
  • the imaging element 8 captures an image of the target area AR via the optical system 7.
  • the imaging element 8 detects light Lb (infrared light, return light) emitted from the object in the target area AR due to the irradiation of the light La.
  • The detection unit 4 detects the depth from the point in the target area AR corresponding to each pixel of the imaging element 8 to that pixel, based on, for example, the pattern (eg, intensity distribution) of the light La emitted from the irradiation unit 6 and the pattern (eg, intensity distribution, captured image) of the light Lb detected by the imaging element 8.
  • the detection unit 4 outputs a depth map (e.g., a depth image, depth information, and distance information) representing a depth distribution in the target area AR to the processing device 3 (see FIG. 1) as a detection result.
  • the detection unit 4 outputs a depth map to the processing device 3 as position information of the human body M.
  • the detection unit 4 may be a device that detects a depth by a time-of-flight (TOF) method.
  • the detection unit 4 may be a device that detects depth by a method other than the TOF method.
  • the detection unit 4 may include, for example, a laser scanner (for example, a laser range finder), and may detect the depth by laser scanning.
  • the detection unit 4 may include, for example, a phase difference sensor, and may detect the depth by a phase difference method.
  • the detection unit 4 may detect the depth by, for example, a DFD (depth from defocus) method.
  • the detection unit 4 may irradiate the object M with light other than infrared light (eg, visible light) and detect light (eg, visible light) emitted from the object M.
  • the detection unit 4 may include, for example, a stereo camera, and may detect (eg, image) the object M from a plurality of viewpoints.
  • the detection unit 4 may detect depth by triangulation using captured images obtained by capturing the object M from a plurality of viewpoints.
  • the detection unit 4 may detect the depth by a method other than an optical method (for example, scanning by ultrasonic waves).
  • the point cloud data generation unit 5 generates point cloud data as position information of the human body M based on the depth detected by the detection unit 4.
  • a detection unit 4 is provided outside the processing device 3, and a point cloud data generation unit 5 is provided inside the processing device 3.
  • the processing device 3 is communicably connected to the detection unit 4.
  • the detection unit 4 outputs the detection result to the processing device 3.
  • the processing device 3 processes the detection result output from the detection unit 4.
  • the processing device 3 includes a point cloud data generation unit 5, a reference calculation unit 11, a partial determination unit 12, a posture estimation unit 13, and a storage unit 14.
  • the storage unit 14 is, for example, a nonvolatile memory, a hard disk (HDD), a solid state drive (SSD), or the like.
  • the storage unit 14 stores original data processed by the processing device 3 and data processed by the processing device 3 (eg, data generated by the processing device 3).
  • the storage unit 14 stores the depth map output from the detection unit 4 as the original data of the point cloud data.
  • the point cloud data generation unit 5 executes point cloud processing for generating point cloud data as position information of the human body M.
  • the point cloud data includes three-dimensional coordinates of a plurality of points on the human body M in the target area AR.
  • the point cloud data may include three-dimensional coordinates of a plurality of points on an object (eg, a wall, a floor) around the human body M.
  • the point cloud data generation unit 5 calculates point cloud data of the human body M based on the position information (for example, depth) detected by the detection unit 4.
  • the point cloud data generation unit 5 reads out the detection result (eg, depth map) of the detection unit 4 stored in the storage unit 14 and calculates point cloud data.
  • the point cloud data generating unit 5 may generate point cloud data as model information (eg, shape information) of the object M (eg, a human body M).
  • FIG. 3 is a diagram illustrating a process of the point cloud data generation unit according to the first embodiment.
  • Symbol D1 is a depth map (eg, a depth image) corresponding to the detection result of the detection unit 4.
  • the depth map D1 is information (for example, an image) representing the spatial distribution of the depth measurement values obtained by the detection unit 4.
  • the depth map D1 is a grayscale image in which the depth at each point of the target area AR is represented by a gradation value.
  • In the depth map D1, a portion having a relatively high gradation value (eg, a white portion, a bright portion) represents a portion having a relatively small depth (eg, a portion relatively close to the position detection unit 2), and a portion having a relatively low gradation value (eg, a black portion, a dark portion) represents a portion having a relatively large depth (eg, a portion relatively far from the position detection unit 2).
  • The point cloud data generation unit 5 reads the data of the depth map D1 from the storage unit 14, calculates, for each pixel of the depth map D1, the three-dimensional coordinates of the corresponding point in real space based on the gradation value (eg, the measured depth) assigned to that pixel, and generates the point cloud data D2.
  • the three-dimensional coordinates of one point are appropriately referred to as point data.
  • the point cloud data D2 is data obtained by grouping a plurality of point data.
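  • As an illustration of this depth-to-point-cloud conversion, the following is a minimal Python sketch assuming a simple pinhole camera model; the intrinsic parameters fx, fy, cx, cy and the function name are hypothetical and are not taken from the publication.

```python
import numpy as np

def depth_map_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (H x W, metres) into an N x 3 point cloud.

    Assumes a pinhole camera model; fx, fy, cx, cy are hypothetical
    intrinsic parameters, not values taken from the publication.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))   # pixel coordinates
    z = depth
    x = (u - cx) * z / fx                            # back-projection along the optical axis
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                  # drop pixels with no measured depth
```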
  • the processing device 3 (eg, the point cloud data generation unit 5) performs segmentation, pattern recognition, and the like on the entire point cloud data of the target area AR, for example, to extract point cloud data D2 of the human body M.
  • the point cloud data generation unit 5 executes an extraction process (eg, segmentation) of extracting position information of a part of the object from position information of the object arranged in the target area AR.
  • the objects arranged in the target area AR include an object M (eg, a human body) and objects around the object M (eg, floor, wall, and background objects).
  • the point cloud data generator 5 extracts (eg, segments, separates) the position information of the object M (eg, a human body M) from the position information detected by the position detector 2.
  • Prior to the above-described extraction processing, the position detection unit 2 detects the position information of the objects in the target area AR in each of a first state in which the object M is not arranged in the target area AR and a second state in which the object M is arranged in the target area AR.
  • The point cloud data generation unit 5 extracts the position information of the object M (eg, the human body M) by calculating the difference between the detection result of the position detection unit 2 in the first state and the detection result of the position detection unit 2 in the second state.
  • the first state may be a state in which the object M (for example, a human body M) is arranged in the target area AR and the position of the object M (for example, the human body M) in the target area AR is different from the second state.
  • For example, the first state may be a state in which the object M (eg, the human body M) is arranged at a first position in the target area AR, and the second state may be a state in which the object M (eg, the human body M) is arranged at a second position in the target area AR different from the first position.
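  • The difference-based extraction described above can be sketched as follows, assuming the two detection results are held as depth maps; the threshold value and the function name are illustrative assumptions, not part of the publication.

```python
import numpy as np

def extract_object_depth(depth_background, depth_scene, min_diff=0.05):
    """Keep only pixels whose depth changed between the two states.

    depth_background: depth map of the first state (object M absent).
    depth_scene:      depth map of the second state (object M present).
    min_diff:         assumed threshold (metres) separating object from background.
    """
    diff = np.abs(depth_scene - depth_background)
    object_mask = diff > min_diff
    extracted = np.where(object_mask, depth_scene, 0.0)  # 0 marks "no object"
    return extracted, object_mask
```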
  • the above-described extraction processing may be executed by a processing unit other than the point cloud data generation unit 5. This processing unit may be provided in the processing device 3 or may be provided in a device external to the processing device 3. The above-described extraction processing need not be performed.
  • the processing device 3 may remove noise from the detection result of the position detection unit 2.
  • The noise removal processing includes, for example, processing for removing spatial noise from the depth map generated by the position detection unit 2. For example, when the difference in depth between a first region (eg, a first pixel) and a second region (eg, a second pixel) adjacent to each other in the depth map exceeds a threshold, the processing device 3 determines that the depth in the first region or the depth in the second region is noise. If the processing device 3 determines that the depth in the first region is noise, the processing device 3 estimates the depth of the first region by interpolation using the depths of regions around the first region (eg, a third pixel and a fourth pixel), and removes the spatial noise by updating (replacing, correcting) the depth of the first region with the estimated depth.
  • the point cloud data generation unit 5 may generate point cloud data based on the depth map from which spatial noise has been removed.
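  • The spatial noise removal described above can be illustrated roughly as follows; the publication interpolates a noisy depth from surrounding regions, which this sketch approximates with a simple median of the neighbouring pixels, and the threshold is an assumed value.

```python
import numpy as np

def remove_spatial_noise(depth, threshold=0.1):
    """Replace depth values that differ strongly from their neighbourhood.

    threshold is an assumed value (metres); the median-based interpolation is
    an approximation of the surrounding-region interpolation described above.
    """
    cleaned = depth.copy()
    h, w = depth.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = depth[y - 1:y + 2, x - 1:x + 2].ravel()
            median = np.median(neighbours)
            if np.abs(depth[y, x] - median) > threshold:
                cleaned[y, x] = median   # interpolate from the surrounding pixels
    return cleaned
```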
  • the noise removal processing for removing noise includes, for example, processing for removing temporal noise (for example, noise that changes over time) from the depth map generated by the position detection unit 2.
  • the position detection unit 2 repeats detection at a predetermined sampling frequency, and the processing device 3 compares detection results (eg, depth maps) of the position detection units 2 at different detection timings.
  • the processing device 3 calculates the depth change amount (eg, time change amount) of each pixel using the depth map in the first frame and the depth map in the second frame following the first frame.
  • When the amount of change exceeds a threshold, the processing device 3 determines that the depth of the pixel in the first frame or the depth of the pixel in the second frame is noise. If the processing device 3 determines that the depth in the second frame is noise, the processing device 3 estimates the depth of the pixel in the second frame by interpolation using the depth in the first frame and the depth in a third frame following the second frame. The processing device 3 removes the temporal noise by updating (replacing, correcting) the depth of the pixel determined to be noise with the estimated depth.
  • the point cloud data generating unit 5 may generate point cloud data based on the depth map from which temporal noise has been removed.
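  • The temporal noise removal can be sketched in the same spirit, assuming three consecutive depth frames; the threshold and the averaging-based interpolation are assumptions used only for illustration.

```python
import numpy as np

def remove_temporal_noise(prev_frame, cur_frame, next_frame, threshold=0.1):
    """Estimate noisy pixels of the current depth frame from neighbouring frames.

    A pixel is treated as noise when its depth change from the previous frame
    exceeds `threshold` (an assumed value); it is then replaced by the average
    of the previous and following frames as a simple form of interpolation.
    """
    change = np.abs(cur_frame - prev_frame)
    noisy = change > threshold
    interpolated = 0.5 * (prev_frame + next_frame)
    return np.where(noisy, interpolated, cur_frame)
```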
  • the noise removal processing may not include processing for removing spatial noise or processing for removing temporal noise.
  • the noise removal processing may be executed by a part other than the processing device 3 (for example, the position detection unit 2). The detection device 1 does not need to execute the noise removal processing.
  • The processing device 3 sets a reference position BP associated with the shape of the human body M based on a pattern in which the cross-sectional shape of the human body M changes in a predetermined direction, and determines the left body part and the right body part of the human body M with respect to the set reference.
  • the reference calculation unit 11 sets a direction different from both the moving direction (in this case, the X direction) and the vertical direction (in this case, the Y direction) (eg, the cross direction, the Z direction) as the predetermined direction.
  • The reference calculation unit 11 sets the reference position BP using a pattern (eg, a cross-sectional feature) in which the cross-sectional shape (eg, contour) of the human body M in a plane orthogonal to the predetermined direction (eg, the XY plane or a plane parallel to the XY plane) changes in the predetermined direction.
  • the reference calculation unit 11 uses, for example, a cross-sectional feature of the human body M in a side view when the human body M is viewed from the side with respect to the movement direction (eg, a pattern in which the contour of the human body M changes in the XY plane in the Z direction). Then, the characteristic portion (eg, arm, shoulder, head) of the human body is determined, and the reference position BP is set.
  • the cross-sectional feature is specified by the amount of change in position information of a point on the surface of the human body M.
  • the position information of the point is, for example, information in which coordinates in the X direction (X coordinates), coordinates in the Y direction (Y coordinates), and coordinates in the Z direction (Z coordinates) are combined. .
  • The change amount of the position information includes the change amount of the Y coordinate in the X direction (eg, dy/dx), the change amount of the Z coordinate in the X direction (eg, dz/dx), the change amount of the X coordinate in the Y direction (eg, dx/dy), the change amount of the Z coordinate in the Y direction (eg, dz/dy), the change amount of the X coordinate in the Z direction (eg, dx/dz), and the change amount of the Y coordinate in the Z direction (eg, dy/dz).
  • the cross-sectional features will be described with reference to FIGS. 4A to 4C, FIGS. 5A and 5B, and the like.
  • the reference calculation unit 11 executes a reference calculation process for calculating the reference position BP.
  • The reference calculation unit 11 calculates, as the reference position BP, a position where the amount of change in the position information in the cross direction (eg, the Z direction) intersecting the moving direction (here, the X direction) and the vertical direction (here, the Y direction) of the human body M satisfies a predetermined condition.
  • the reference calculation unit 11 executes a reference calculation process described later based on a detection result (eg, a one-frame depth image) in which the position detection unit 2 detects the human body M at an arbitrary timing.
  • the reference position BP includes a first reference position BP1 described later with reference to FIG. 4B and a second reference position BP2 described later with reference to FIG. 5B.
  • the reference position BP is a position used as a reference in the partial determination processing for determining the first part M1 and the second part M2.
  • the change amount of the position information includes, for example, a change amount (eg, a difference) of coordinates between a plurality of points representing the surface of the human body M at an arbitrary timing (time).
  • the change amount of the position information may be a slope (eg, a derivative) of a curve or a straight line representing the surface (eg, an outline, a contour, a silhouette) of the human body M.
  • The reference calculation unit 11 estimates, as the reference position BP, the position of a structure of which the object M (eg, the human body M) includes only one (eg, a head or a torso).
  • the reference calculation unit 11 estimates, as the reference position BP, the position of a portion including the center (for example, the center line CL) of the human body M in the cross direction (for example, the Z direction).
  • the reference calculation unit 11 calculates a reference position based on the point cloud data.
  • the reference calculation unit 11 reads out the point cloud data of the human body M generated by the point cloud data generation unit 5 from the storage unit 14 and executes a reference calculation process.
  • FIGS. 4 (A) to 4 (C), FIGS. 5 (A) and 5 (B) are diagrams showing the processing of the reference calculating unit according to the first embodiment.
  • the reference calculation unit 11 executes a first reference calculation process of calculating a first reference position BP1 (see FIG. 4B) as the reference position BP.
  • The first reference position BP1 includes a first reference position BP1a on the first side (eg, the -Z side, the left body) of the object M and a first reference position BP1b on the second side (eg, the +Z side, the right body) of the object M.
  • the reference calculation unit 11 performs a second reference calculation process of calculating a second reference position BP2 (see FIG. 5B) based on the first reference position BP1 as the reference position BP.
  • The reference calculation unit 11 estimates, as the first reference position BP1a, the position of a predetermined part of the human body M (eg, the base of the neck, between the neck and the torso, between the shoulder and the head) based on the characteristics of the change in the position information of the human body M. The reference calculation unit 11 calculates, as the first reference position BP1, a position that satisfies the condition that the amount of change in the coordinates of the points included in the point cloud data D2 is equal to or greater than a threshold.
  • the process of the reference calculation unit 11 assumes that the object assumed as the detection target is a walking human body, and assumes the shape (eg, posture) of the human body at an arbitrary timing.
  • the surface of the human body significantly changes from the arm (eg, elbow) to the shoulder (eg, the amount of change in the position information is equal to or greater than a threshold).
  • the human body has a gradual change in the position of the surface from the shoulder to the neck (eg, the amount of change in the position information is less than the threshold).
  • the position of the surface of the human body changes significantly from the shoulder to the head as compared to the shoulder portion (eg, the amount of change in the position information becomes equal to or larger than a threshold).
  • the reference calculation unit 11 calculates, as the first reference position BP1, a position that satisfies a condition that a change amount of the position information (eg, the Y coordinate of the point data) is equal to or larger than a threshold.
  • the above-mentioned arm includes, for example, a scapulohumeral joint, and is a portion from the scapulohumeral joint to the fingertip.
  • the above-mentioned shoulder is, for example, a portion between the scapulohumeral joint of the left arm and the scapulohumeral joint of the right arm, and a lower body side of the seventh cervical vertebra (base of the neck).
  • the head includes, for example, a portion surrounded by the skull, a portion on the surface side of the skull, and a neck.
  • the head includes, for example, the seventh cervical vertebra and the parietal region, and is a portion from the seventh cervical vertebra to the parietal region.
  • the reference calculation unit 11 divides the area in the Z direction, and calculates a candidate for the first reference position BP1 for each of the divided areas.
  • Reference numeral DAi denotes a divided area (divided region, partial region).
  • Z i is the coordinate in the Z direction representing the area DA i (eg, the coordinate of the center of the area DA i in the Z direction).
  • the reference calculation unit 11 performs the first reference calculation process on the assumption that the Z coordinate of the point data included in the area DA i is Z i .
  • The reference calculation unit 11 calculates, for the points included in the point cloud data D2, the maximum value of the coordinate in the vertical direction (Y direction) at each coordinate in the cross direction (eg, the Z direction). That is, the reference calculation unit 11 calculates, for each divided area DAi, the maximum value of the Y coordinate of the point data.
  • Ymax (Z i ) in FIG. 4A is the maximum value of the Y coordinate of the point data in the area DA i .
  • the reference calculation unit 11 sets Ymax (Z i ) as the maximum value of the Y coordinate of the point data at each coordinate (Z i ).
  • FIG. 4B shows a plot of Ymax (Z i ) with respect to the coordinates (Z coordinates) in the cross direction.
  • the change amount of the position information with respect to the coordinate (Z coordinate) in the cross direction is represented by ΔYmax(Zi).
  • ΔYmax(Zi) is represented, for example, by the following equation (1).
  • ΔYmax(Zi) = Ymax(Zi+1) - Ymax(Zi)   ... Equation (1)
  • FIG. 4C shows a plot of the absolute value |ΔYmax(Zi)| with respect to the coordinates (Z coordinates) in the cross direction.
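  • The computation of Ymax(Zi) and ΔYmax(Zi) can be sketched as follows, assuming the point cloud data D2 is held as an N x 3 array of (X, Y, Z) coordinates; the bin width of the divided areas DAi is an assumed value.

```python
import numpy as np

def delta_ymax_profile(points, bin_width=0.02):
    """Compute Ymax(Z_i) per Z-direction bin and the finite difference
    dYmax(Z_i) = Ymax(Z_{i+1}) - Ymax(Z_i) from equation (1).

    points: N x 3 array of (X, Y, Z) point data; bin_width is an assumed value.
    """
    z = points[:, 2]
    edges = np.arange(z.min(), z.max() + bin_width, bin_width)
    idx = np.clip(np.digitize(z, edges) - 1, 0, len(edges) - 2)  # index of area DA_i
    ymax = np.full(len(edges) - 1, np.nan)
    for i in range(len(edges) - 1):
        in_bin = points[idx == i, 1]             # Y coordinates of points in DA_i
        if in_bin.size:
            ymax[i] = in_bin.max()
    z_centers = 0.5 * (edges[:-1] + edges[1:])   # Z_i representing each area
    delta = np.diff(ymax)                        # equation (1)
    return z_centers, ymax, delta
```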
  • reference numeral Q1 is a part corresponding to the arm of the human body M, and a part corresponding to the left arm is represented by reference numeral Q1a, and a part corresponding to the right arm is represented by reference numeral Q1b.
  • Reference numeral Q2 is a part corresponding to the shoulder of the human body M. A part corresponding to the left shoulder is represented by a reference numeral Q2a, and a part corresponding to the right shoulder is represented by a reference numeral Q2b.
  • Reference numeral Q3 is a portion corresponding to the head of the human body M.
  • The absolute value |ΔYmax(Zi)| of the change amount of the position information is large in the arm portion Q1, and is smaller in the shoulder portion Q2 than in the arm portion Q1.
  • the reference calculation unit 11 calculates the reference position BP (eg, the first reference position BP1a) based on the feature (eg, change pattern) of the amount of change in the position information.
  • For example, the reference calculation unit 11 sets a portion where |ΔYmax(Zi)| is less than a first threshold value V1 as a first characteristic portion (eg, a left shoulder portion Q2a, a right shoulder portion Q2b) of the human body M.
  • For example, the reference calculation unit 11 compares |ΔYmax(Zi)| with the first threshold value V1 sequentially from one side (eg, the first side, the -Z side) toward the other side (eg, the second side, the +Z side) in the cross direction (eg, the Z direction), and determines the position at which the magnitude relationship between |ΔYmax(Zi)| and the first threshold value V1 switches.
  • The reference calculation unit 11 also sets a portion where |ΔYmax(Zi)| is equal to or more than a second threshold value V2 as a second characteristic portion of the human body M (eg, a head portion Q3).
  • The reference calculation unit 11 compares |ΔYmax(Zi)| with the second threshold value V2 and determines the position at which the magnitude relationship between |ΔYmax(Zi)| and the second threshold value V2 switches; the boundary (eg, the base end of the left shoulder, the end on the -Z side of the head, the neck, the base of the neck) is thereby set (eg, specified, identified, determined).
  • For example, when |ΔYmax(Zi)| is smaller than the second threshold value V2 and |ΔYmax(Zi+1)| is equal to or more than the second threshold value V2, the reference calculation unit 11 sets the position of Zi as the end on the +Z side of the first characteristic portion (the left shoulder portion Q2a), and sets the position of Zi+1 as the end on the -Z side of the second characteristic portion (the head portion Q3).
  • the reference calculation unit 11 calculates a first reference position BP1 corresponding to the head of the human body M as a reference position, based on the amount of change in position information from the arm to the head of the human body M.
  • the reference calculation unit 11 calculates, as the first reference position BP1, a position representing a boundary between a first feature portion (eg, a left shoulder portion Q2a) and a second feature portion (a head portion Q3).
  • the reference calculation unit 11 calculates, as the first reference position BP1a, the position of the end on the + Z side in the first characteristic portion (the left shoulder portion Q2a).
  • the reference calculation unit 11 calculates the first reference position BP1b in the same manner as the first reference position BP1a.
  • For example, the reference calculation unit 11 sets the first characteristic portion and the second characteristic portion in order from the other side (eg, the +Z side) of the human body M toward the opposite side (eg, the -Z side).
  • The reference calculation unit 11 calculates, as the first reference position BP1b, the position of the end on the -Z side in the first characteristic portion (the right shoulder portion Q2b).
  • the reference calculation unit 11 stores the calculated first reference position BP1 (eg, its three-dimensional coordinate value) in the storage unit 14 shown in FIG. 1.
  • Note that the reference calculation unit 11 may set the position of the end on the -Z side of the second characteristic portion (the head portion Q3) as the first reference position BP1a, and may set the position of the end on the +Z side of the second characteristic portion (the head portion Q3) as the first reference position BP1b.
  • The reference calculation unit 11 may also set, as the first reference position BP1a, a position (eg, the center) between the position of the +Z side end of the first characteristic portion (the left shoulder portion Q2a) and the position of the -Z side end of the second characteristic portion (the head portion Q3).
  • Similarly, the reference calculation unit 11 may set, as the first reference position BP1b, a position (eg, the center) between the position of the -Z side end of the first characteristic portion (the right shoulder portion Q2b) and the position of the +Z side end of the second characteristic portion (the head portion Q3). Further, the reference calculation unit 11 may calculate only one of the first reference position BP1a and the first reference position BP1b as the first reference position BP1.
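  • A rough sketch of the first reference calculation process is shown below; it scans the |ΔYmax(Zi)| profile from the previous sketch for the threshold crossings described above. The value of the second threshold V2 is an assumption.

```python
import numpy as np

def first_reference_positions(z_centers, delta_ymax, v2=0.05):
    """Locate the first reference positions BP1a / BP1b as the Z coordinates
    where |dYmax| crosses the second threshold V2 (shoulder-to-head boundary).

    z_centers and delta_ymax come from the previous sketch; v2 is an assumed
    threshold. Returns (bp1a_z, bp1b_z), either of which may be None.
    """
    above = np.abs(delta_ymax) >= v2
    bp1a_z = bp1b_z = None
    # scan from the -Z side: last shoulder bin before the head region starts
    for i in range(len(above) - 1):
        if not above[i] and above[i + 1]:
            bp1a_z = z_centers[i]
            break
    # scan from the +Z side for the right-body counterpart
    for i in range(len(above) - 1, 0, -1):
        if not above[i] and above[i - 1]:
            bp1b_z = z_centers[i]
            break
    return bp1a_z, bp1b_z
```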
  • The reference calculation unit 11 selects (eg, sets, determines) the second reference position BP2 from a predetermined area AR2 (eg, an area in the Z direction corresponding to the head portion Q3) set with respect to the first reference position BP1 (see FIG. 4B).
  • In the second reference calculation process, the reference calculation unit 11 uses the first reference position BP1a on the first side (eg, the -Z side) of the object M and the first reference position BP1b on the second side (eg, the +Z side).
  • The reference calculation unit 11 sets a predetermined area (eg, an area including the reference position) between the first reference position BP1a on the first side and the first reference position BP1b on the second side; it is assumed that the center line CL of the human body M in the Z direction passes through the region between the first reference position BP1a and the first reference position BP1b (the predetermined region, the head portion Q3).
  • the reference calculation unit 11 calculates a position satisfying a predetermined condition in a region between the first reference position BP1a and the first reference position BP1b as a second reference position BP2.
  • the reference calculation unit 11 divides the region in the Z direction as in FIG. 4A, and calculates a candidate for the second reference position BP2 for each divided region.
  • For each divided region, the minimum value of the coordinates in the moving direction (X direction), with the front side (+X side) being positive, is calculated.
  • the reference calculation unit 11 extracts point data on the front side of the human body M (front side in the traveling direction, + X side in the X direction). For example, the reference calculation unit 11 divides the area in the Y direction, and calculates the maximum value of the X coordinate of the point data for each of the divided areas.
  • Reference numeral DBj denotes a divided area (divided region, partial region).
  • Y j is a coordinate in the Y direction representing the area DB j (eg, a coordinate of the center of the area DB j in the Y direction).
  • the reference calculation unit 11 executes the second reference calculation process, assuming that the Y coordinate of the point data included in the area DB j is Y j .
  • Xmax (Y j ) in FIG. 5A is the maximum value of the X coordinate of the point data in the area DB j .
  • the reference calculation unit 11 sets Xmax (Y j ) as the maximum value of the X coordinate of the point data at each coordinate (Y j ).
  • Xmax (Y j ) corresponds to the X coordinate of point data on the front side in the traveling direction (X direction) in each coordinate (Y j ) in the vertical direction (Y direction).
  • The reference calculation unit 11 calculates the minimum value of Xmax(Yj) with respect to the Y coordinate; this minimum value is used as Xmin(Zi) for the divided region at the coordinate Zi in the cross direction.
  • The position at which Xmax(Yj) takes its minimum value is, for example, a position above the shoulder of the human body M and below the chin.
  • FIG. 5B shows a plot representing the distribution of Xmin (Z i ) with respect to the Z coordinate.
  • The reference calculation unit 11 selects the second reference position BP2 based on the change of the minimum value Xmin(Zi) with respect to the coordinate in the cross direction (eg, the Z direction).
  • When the neck of the human body M is approximated by an elliptic cylinder, the surface of the neck is convex forward in the traveling direction, and the position of its tip corresponds to the position of the center line CL of the human body M.
  • the reference calculation unit 11 calculates the Z coordinate (Zc in FIG. 5B) at which Xmin(Zi) becomes maximum (or takes a local maximum) with respect to the Z coordinate.
  • the reference calculation unit 11 sets Xmin (Zc) as the X coordinate of the second reference position BP2.
  • the reference calculation unit 11 sets the Y coordinate (Yk in FIG. 5A) corresponding to Xmin (Zc) as the Y coordinate of the second reference position BP2. Further, the reference calculation unit 11 sets Zc as the Z coordinate of the second reference position BP2.
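  • The second reference calculation process can be sketched as follows, again assuming the point cloud as an N x 3 array of (X, Y, Z) coordinates; the bin widths are assumed values, and the helper follows the Xmax(Yj) / Xmin(Zi) construction described above.

```python
import numpy as np

def second_reference_position(points, bp1a_z, bp1b_z, y_bin=0.02, z_bin=0.02):
    """Select the second reference position BP2 inside the region between BP1a
    and BP1b: for each Z slice, take the most recessed point of the front (+X)
    profile (the neck), then keep the slice where that point protrudes furthest
    forward. Bin widths are assumed values.
    """
    lo, hi = sorted((bp1a_z, bp1b_z))
    region = points[(points[:, 2] >= lo) & (points[:, 2] <= hi)]
    z_edges = np.arange(lo, hi + z_bin, z_bin)
    best = None  # candidate (Xmin(Zc), Yk, Zc)
    for zi in range(len(z_edges) - 1):
        zsel = region[(region[:, 2] >= z_edges[zi]) & (region[:, 2] < z_edges[zi + 1])]
        if zsel.size == 0:
            continue
        # Xmax(Y_j): front-most X coordinate for each Y bin in this Z slice
        y_edges = np.arange(zsel[:, 1].min(), zsel[:, 1].max() + y_bin, y_bin)
        xmax, ymid = [], []
        for yi in range(len(y_edges) - 1):
            ysel = zsel[(zsel[:, 1] >= y_edges[yi]) & (zsel[:, 1] < y_edges[yi + 1])]
            if ysel.size:
                xmax.append(ysel[:, 0].max())
                ymid.append(0.5 * (y_edges[yi] + y_edges[yi + 1]))
        if not xmax:
            continue
        k = int(np.argmin(xmax))                        # Xmin(Z_i): recessed neck point
        candidate = (xmax[k], ymid[k], 0.5 * (z_edges[zi] + z_edges[zi + 1]))
        if best is None or candidate[0] > best[0]:      # Zc maximises Xmin(Z_i)
            best = candidate
    return best  # (X, Y, Z) of BP2, or None if the region was empty
```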
  • As described above, the reference calculation unit 11 according to the present embodiment performs the reference calculation process by treating the point cloud data as two-dimensional data for each divided area DAi, and can therefore reduce the processing load. Note that the reference calculation unit 11 may perform the reference calculation process by treating the point cloud data as three-dimensional data.
  • the reference calculation unit 11 calculates the reference position BP (for example, the second reference position BP2) using the front-side outer shape characteristics of the human body M. Since a general human body has a remarkable change in shape between the chin, the throat, and the chest, the reference calculation unit 11 can calculate the second reference position BP2 with high accuracy. Note that the reference calculation unit 11 may calculate the reference position BP using the external features on the rear side of the human body M (eg, the rear side and the back side in the traveling direction).
  • the reference calculation unit 11 causes the storage unit 14 to store the calculated reference position BP.
  • the partial determination unit 12 performs a partial determination process.
  • In the partial determination process, the partial determination unit 12 determines (eg, judges, identifies, distinguishes) the first portion M1 (eg, at least a part of the left body) and the second portion M2 (eg, at least a part of the right body) of the object M (the human body M).
  • the first portion M1 is a part of the object M arranged on the first side (eg, the -Z side) with respect to a reference plane BF described later.
  • the second portion M2 is a part of the object M arranged on a second side (eg, + Z side) opposite to the first side with respect to the reference plane BF.
  • the reference plane BF is a plane including the moving direction (X direction) and the vertical direction (Y direction).
  • the reference plane BF is, for example, a plane parallel to the movement direction (X direction) and the vertical direction (Y direction) and including the reference position BP (second reference position BP2).
  • the partial determination unit 12 performs a partial determination process based on the reference position BP calculated by the reference calculation unit 11.
  • the partial determination unit 12 sets a reference plane BF based on the reference position BP calculated by the reference calculation unit 11.
  • the partial determination unit 12 reads the reference position BP calculated by the reference calculation unit 11 from the storage unit 14 and sets a reference plane BF.
  • the partial determination unit 12 sets a plane parallel to the X direction and the Y direction and including the second reference position BP2 as the reference plane BF.
  • the partial determination unit 12 performs a partial determination process based on the point cloud data generated by the point cloud data generation unit 5.
  • For example, the partial determination unit 12 reads the point cloud data generated by the point cloud data generation unit 5 from the storage unit 14, and determines whether a portion represented by one or more pieces of point data included in the point cloud data is on the first side (eg, the -Z side) or the second side (eg, the +Z side) with respect to the reference plane BF.
  • The second reference position BP2 is a position estimated by the reference calculation unit 11 as the position of a point on the center line CL of the human body M, and the partial determination unit 12 executes the partial determination process using the second reference position BP2 as the reference position BP.
  • the part determining unit 12 determines a part disposed on the first side (eg, the -Z side) with respect to the second reference position BP2 as the first part M1.
  • the part determination unit 12 determines a part arranged on the second side (eg, + Z side) with respect to the second reference position BP2 as the second part M2.
  • FIG. 6 is a diagram illustrating a process of the partial determination unit according to the first embodiment.
  • a symbol QX is a part (eg, point, area) to be subjected to the part determination processing.
  • a portion to be subjected to the partial determination process is referred to as a determination target portion.
  • the determination target portion QX is set in advance.
  • For example, the determination target portion QX is set to a part (eg, a hand, a foot) of the human body M, and the partial determination unit 12 determines whether the determination target portion QX (eg, a foot) belongs to the left half body (eg, the left foot) or the right half body (eg, the right foot).
  • the determination target portion QX may be set to a plurality of characteristic portions (for example, the whole body) of the human body M.
  • the part discriminating unit 12 may sequentially set each part of the human body M as the discrimination target part QX and execute the part discriminating process for each part of the human body M.
  • the partial determination unit 12 calculates a geodesic line SL connecting the reference position BP and the determination target portion QX of the object M (human body M).
  • the geodesic line SL is the shortest line connecting two points along the surface of the object M (human body M).
  • the partial determination unit 12 calculates the geodesic SL based on the shape of the surface of the human body M obtained from the point cloud data.
  • Using the second reference position BP2 as the reference position BP, the partial determination unit 12 calculates the geodesic line SL connecting the second reference position BP2 and the determination target portion QX.
  • When the determination target portion QX is an area, the partial determination unit 12 calculates the geodesic line SL connecting a point selected from the determination target portion QX (eg, the center of the determination target portion QX) and the second reference position BP2.
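  • The publication does not specify how the geodesic line SL is obtained from the point cloud; one common approximation, sketched below under that assumption, is a shortest path on a k-nearest-neighbour graph built over the points (here using SciPy, with k as an assumed parameter).

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra
from scipy.spatial import cKDTree

def geodesic_path(points, start_idx, goal_idx, k=8):
    """Approximate the geodesic line SL between two points of the cloud as the
    shortest path on a k-nearest-neighbour graph (k is an assumed parameter)."""
    tree = cKDTree(points)
    dist, nbr = tree.query(points, k=k + 1)          # first neighbour is the point itself
    rows = np.repeat(np.arange(len(points)), k)
    cols = nbr[:, 1:].ravel()
    weights = dist[:, 1:].ravel()
    graph = csr_matrix((weights, (rows, cols)), shape=(len(points), len(points)))
    d, predecessors = dijkstra(graph, directed=False, indices=start_idx,
                               return_predecessors=True)
    # walk the predecessor chain back from the goal to recover the path
    path = [goal_idx]
    while path[-1] != start_idx and predecessors[path[-1]] >= 0:
        path.append(predecessors[path[-1]])
    return d[goal_idx], points[path[::-1]]           # path length and ordered points
```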
  • the part determination unit 12 determines the first part M1 and the second part M2 based on the relative position between the geodesic line SL and the reference plane BF.
  • First, a case where the entire geodesic line SL is disposed on the first side (eg, the -Z side) or the second side (eg, the +Z side) with respect to the reference plane BF will be described.
  • When the entire geodesic line SL is disposed on the first side (eg, the -Z side) with respect to the reference plane BF, the partial determination unit 12 specifies, as the relative position between the geodesic line SL and the reference plane BF, that the geodesic line SL is on the first side (eg, the -Z side).
  • Similarly, when the entire geodesic line SL is disposed on the second side (eg, the +Z side) with respect to the reference plane BF, the partial determination unit 12 specifies, as the relative position between the geodesic line SL and the reference plane BF, that the geodesic line SL is on the second side (eg, the +Z side).
  • The partial determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on the feature amount (first feature amount) of the portion of the geodesic line SL arranged on the first side (eg, the -Z side) with respect to the reference plane BF and the feature amount (second feature amount) of the portion of the geodesic line SL arranged on the second side (eg, the +Z side) with respect to the reference plane BF.
  • the partial determination unit 12 specifies a relative position between the geodesic line SL and the reference plane BF based on a magnitude relationship or a ratio between the first feature amount and the second feature amount.
  • the feature amount is, for example, a distance between a point on the geodesic line SL and the reference plane BF.
  • For example, the partial determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on the distance (first feature amount) between the reference plane BF and a point on the geodesic line SL arranged on the first side (eg, the -Z side) with respect to the reference plane BF, and the distance (second feature amount) between the reference plane BF and a point on the geodesic line SL arranged on the second side (eg, the +Z side).
  • For example, the partial determination unit 12 calculates the average (first feature amount) of the distances between the reference plane BF and each of a plurality of points (a first point set) on the geodesic line SL arranged on the first side (eg, the -Z side) with respect to the reference plane BF.
  • Similarly, the partial determination unit 12 calculates the average (second feature amount) of the distances between the reference plane BF and each of a plurality of points (a second point set) on the geodesic line SL arranged on the second side (eg, the +Z side) with respect to the reference plane BF.
  • When the first feature amount is larger than the second feature amount, the partial determination unit 12 specifies that the geodesic line SL is at a relative position arranged on the first side (eg, the -Z side) with respect to the reference plane BF. When the second feature amount is larger than the first feature amount, the partial determination unit 12 specifies that the geodesic line SL is at a relative position arranged on the second side (eg, the +Z side) with respect to the reference plane BF. The partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
  • the partial determination unit 12 selects a plurality of points on the geodesic line SL.
  • The partial determination unit 12 may select the plurality of points regularly (eg, at predetermined intervals) or irregularly (eg, randomly). When a selected point is arranged on the first side (eg, the -Z side) with respect to the reference plane BF, the point is classified into the first point set; when it is arranged on the second side (eg, the +Z side) with respect to the reference plane BF, the point is classified into the second point set. The first point set and the second point set are thus set.
  • The partial determination unit 12 may set the first point set and the second point set such that the number of points belonging to the first point set is the same as the number of points belonging to the second point set. In this case, the partial determination unit 12 may use the sum of the distances instead of the average value of the distances as the feature amount. For example, the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF using the sum of the distances for the first point set as the first feature amount and the sum of the distances for the second point set as the second feature amount.
  • the first feature value may be a distance between a point on the geodesic line SL farthest from the reference plane BF to the first side (eg, the -Z side) and the reference plane BF.
  • the second feature value may be a distance between a point on the geodesic line SL farthest from the reference plane BF to the second side (+ Z side) and the reference plane BF.
  • When the first feature amount is larger than the second feature amount, the partial determination unit 12 specifies that the geodesic line SL is located on the first side (eg, the -Z side) with respect to the reference plane BF.
  • When the second feature amount is larger than the first feature amount, the partial determination unit 12 specifies that the geodesic line SL is located on the second side (eg, the +Z side) with respect to the reference plane BF.
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
  • the first feature amount may be the length of a portion arranged on the first side (eg, the -Z side) with respect to the reference plane BF in the geodesic line SL.
  • the second feature value may be the length of a portion arranged on the second side (eg, + Z side) with respect to the reference plane BF in the geodesic line SL.
  • When the first feature amount is larger than the second feature amount, the partial determination unit 12 specifies that the geodesic line SL is at a relative position arranged on the first side (eg, the -Z side) with respect to the reference plane BF.
  • When the second feature amount is larger than the first feature amount, the partial determination unit 12 specifies that the geodesic line SL is at a relative position arranged on the second side (eg, the +Z side) with respect to the reference plane BF.
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
  • The partial determination unit 12 may specify the relative position of the geodesic line SL with respect to the reference plane BF without setting (generating) the first point set and the second point set and without using the first feature amount and the second feature amount.
  • For example, the partial determination unit 12 selects a plurality of points on the geodesic line SL regularly (eg, at predetermined intervals) or irregularly (eg, randomly), and calculates the distance from each of the selected points to the reference plane BF. When a selected point is arranged on one side (eg, the first side, the -Z side) with respect to the reference plane BF, the distance between the point and the reference plane BF is represented by a negative value; when the point is arranged on the other side (eg, the second side, the +Z side), the distance is represented by a positive value.
  • the partial determination unit 12 calculates the sum of the distances between each point represented by a positive value, a negative value, or 0 and the reference plane BF for a plurality of points on the geodesic line SL.
  • When the sum of the distances is a negative value, the partial determination unit 12 specifies that the geodesic line SL is at a relative position located on one side (eg, the first side, the -Z side) with respect to the reference plane BF.
  • When the sum of the distances is a positive value, the partial determination unit 12 specifies that the geodesic line SL is at a relative position located on the other side (eg, the second side, the +Z side) with respect to the reference plane BF.
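  • Because the reference plane BF is parallel to the XY plane and passes through the second reference position BP2, the signed distance of a geodesic point to BF reduces to its Z coordinate minus the Z coordinate of BP2. A minimal sketch of the signed-sum determination described above, with the function and label strings as illustrative assumptions:

```python
import numpy as np

def determine_side(geodesic_points, bp2_z):
    """Decide whether the determination target portion QX belongs to the first
    portion M1 (-Z side) or the second portion M2 (+Z side).

    geodesic_points: K x 3 array of points sampled along the geodesic line SL.
    bp2_z:           Z coordinate of the second reference position BP2.
    """
    signed = geodesic_points[:, 2] - bp2_z   # negative on the first side, positive on the second
    total = signed.sum()                     # sum of signed distances along the geodesic
    return "M1 (first side, -Z)" if total < 0 else "M2 (second side, +Z)"
```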
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF based on the relative position between a predetermined part of the geodesic line SL and the reference plane BF.
  • the above-mentioned predetermined portion may be a point (farthest point) farthest from the reference plane BF on the geodesic line SL.
  • When the farthest point is located on the first side (eg, the -Z side) with respect to the reference plane BF, the partial determination unit 12 specifies that the geodesic line SL is at a relative position located on the first side (eg, the -Z side) with respect to the reference plane BF.
  • When the farthest point is located on the second side (eg, the +Z side) with respect to the reference plane BF, the partial determination unit 12 specifies that the geodesic line SL is at a relative position located on the second side (eg, the +Z side) with respect to the reference plane BF.
  • The predetermined portion may be other than the farthest point, and may be, for example, a portion of the geodesic line SL closer to the second reference position BP2 than the determination target portion QX (eg, a portion having a predetermined length starting from the second reference position BP2).
  • When the geodesic line SL is specified as being located on the second side (eg, the +Z side) with respect to the reference plane BF, the partial determination unit 12 may determine that the determination target portion QX belongs to the second portion M2.
  • the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by combining the above methods.
  • the geodesic line SL extends to the + Z side starting from the second reference position BP2, and reaches the determination target portion QX via the + Z side of the reference plane BF.
  • the part determination unit 12 determines that the determination target part QX belongs to the second part M2 (for example, the right half).
  • In a walking human body, there is a timing at which the left foot and the right foot intersect or are aligned on the same line.
  • In such a case, the right foot may be located on the left side of, or at the same position as, the left foot in the shoulder width direction of the human body. Therefore, it may be difficult for a conventional device to automatically determine whether a detected foot is the right foot or the left foot.
  • the detection device 1 uses at least the relative position between the geodesic line SL and the reference plane BF, thereby detecting at least the geodesic line SL. Since a part is arranged on the + Z side or the ⁇ Z side with respect to the reference plane BF, it is possible to automatically determine with high accuracy whether the detected foot is the left foot or the right foot.
  • the partial determination unit 12 performs the partial determination process on at least a part of the object M (human body M) represented by the point cloud data, and causes the storage unit 14 to store the processing result.
  • for example, the part determination unit 12 performs the part determination process for each part of the object M (human body M) represented by the point cloud data, and stores, as a processing result, determination information (eg, a flag, attribute information) indicating whether the part belongs to the first part M1 or the second part M2 in the storage unit 14.
  • the posture estimating unit 13 performs a posture estimating process of estimating the posture of the object M based on the first part M1 and the second part M2 determined by the part determining unit 12.
  • the posture estimating unit 13 generates, for example, position information of a characteristic portion (eg, a characteristic part, a characteristic point) of the human body M.
  • the characteristic part of the human body M is, for example, a part that can be distinguished from other parts of the human body M.
  • the characteristic portion of the human body M includes, for example, at least one of a distal end (a finger, a toe, a head), a joint, or an intermediate portion between the distal end and the joint or between the two joints.
  • the posture estimating unit 13 performs, for example, a recognition process (eg, pattern recognition, shape recognition, skeleton recognition) on the shape of the human body M obtained from the point cloud data.
  • the posture estimating unit 13 generates position information of the above-described characteristic portion by a recognition process.
  • the position information of the characteristic portion includes, for example, the coordinates of a point representing the characteristic portion (eg, three-dimensional coordinates).
  • the posture estimating unit 13 calculates the coordinates of the point representing the characteristic portion by the above-described recognition processing. Then, the posture estimating unit 13 causes the storage unit 14 to store information of the specified portion (eg, coordinates of a point representing a characteristic portion).
  • FIG. 7 is a diagram illustrating a process of the posture estimating unit according to the first embodiment.
  • reference characters Q11 to Q30 are characteristic portions of the human body M specified by the posture estimating unit 13.
  • Symbols Q11 to Q15 are characteristic portions corresponding to the terminal portions, where Q11 is a head, Q12 is a left foot, Q13 is a left hand, Q14 is a right foot, and Q15 is a right hand.
  • Reference characters Q16 to Q27 are characteristic portions corresponding to joints: Q16 is the left ankle, Q17 is the left knee, Q18 is the left hip joint (base of the left foot), Q19 is the right ankle, Q20 is the right knee, Q21 is the right hip joint (base of the right foot), Q22 is the left wrist, Q23 is the left elbow, Q24 is the left scapulohumeral joint, Q25 is the right wrist, Q26 is the right elbow, and Q27 is the right scapulohumeral joint.
  • Reference characters Q28 to Q30 are characteristic portions corresponding to intermediate portions between a distal end and a joint or between two joints: Q28 is the waist (the center between the left and right hip joints), Q29 is the neck (the center between the left and right scapulohumeral joints), and Q30 is the back (the center between the waist Q28 and the neck Q29).
  • because the partial determination unit 12 can determine the left body part and the right body part with high accuracy, the posture estimating unit 13 of the present embodiment can specify the characteristic portions of the human body M (eg, the left ankle, the right ankle) with high accuracy. As a result of the recognition process, the posture estimating unit 13 generates posture information (eg, skeleton information, skeleton data) as a set of the position of each characteristic portion, information on each characteristic portion (eg, attribute information, a name such as right knee or right hip joint), and connection information indicating the connection relationship between two characteristic portions. The posture estimating unit 13 stores the above posture information in the storage unit 14.
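  • One possible in-memory representation of such posture information (positions, attribute information, and connection information) is sketched below; the field names, identifiers, and coordinate values are hypothetical and are not the format actually used by the posture estimating unit 13.

```python
from dataclasses import dataclass, field

@dataclass
class FeaturePoint:
    name: str        # attribute information, e.g. "right knee"
    position: tuple  # three-dimensional coordinates (x, y, z)

@dataclass
class PostureInfo:
    points: dict = field(default_factory=dict)       # identifier -> FeaturePoint
    connections: list = field(default_factory=list)  # pairs of identifiers with a connection relationship

skeleton = PostureInfo()
skeleton.points["Q19"] = FeaturePoint("right ankle", (0.30, 0.05, 0.10))
skeleton.points["Q20"] = FeaturePoint("right knee", (0.28, 0.45, 0.12))
skeleton.points["Q21"] = FeaturePoint("right hip joint", (0.25, 0.90, 0.11))
skeleton.connections += [("Q19", "Q20"), ("Q20", "Q21")]  # right shin Q31 and right thigh Q32
```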
  • the posture estimating unit 13 estimates the posture of the human body M based on the posture information.
  • for example, the posture estimating unit 13 estimates the posture of a portion including a first line and a second line by using the relative position (eg, the angle) between the first line, which connects a pair of characteristic portions having a connection relationship, and the second line, which is connected to the first line and connects another pair of characteristic portions having a connection relationship.
  • for example, the first line is the right shin Q31 connecting the right ankle Q19 and the right knee Q20, and the second line is the right thigh Q32 connecting the right knee Q20 and the right hip joint Q21.
  • the posture estimating unit 13 calculates the angle θ formed between the right shin Q31 and the right thigh Q32.
  • when the angle θ formed by the right shin Q31 and the right thigh Q32 is equal to or smaller than a threshold, the posture estimating unit 13 determines that the right leg of the human body M including the right shin Q31 and the right thigh Q32 is in a bent posture; when the angle θ exceeds the threshold, the posture estimating unit 13 determines that the right leg is in an extended posture.
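  • The bent/extended decision amounts to computing the angle at the knee between two limb vectors and comparing it with a threshold. The following sketch assumes NumPy, hypothetical coordinates, and an arbitrary threshold value; it is only an illustration of the calculation, not the estimating unit itself.

```python
import numpy as np

def knee_angle(ankle, knee, hip):
    """Angle (degrees) at the knee between the shin (knee->ankle) and the thigh (knee->hip)."""
    shin = np.asarray(ankle) - np.asarray(knee)
    thigh = np.asarray(hip) - np.asarray(knee)
    cos_t = shin @ thigh / (np.linalg.norm(shin) * np.linalg.norm(thigh))
    return np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))

BEND_THRESHOLD_DEG = 150.0  # assumed threshold, for illustration only
theta = knee_angle(ankle=(0.30, 0.05, 0.10), knee=(0.28, 0.45, 0.12), hip=(0.25, 0.90, 0.11))
posture = "bent" if theta <= BEND_THRESHOLD_DEG else "extended"
```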
  • the posture estimating unit 13 estimates the posture of a part or the whole of the human body M by estimating the posture as described above, for example, for each of the characteristic portions set in advance by the user.
  • the posture estimating unit 13 may estimate the posture with reference to posture definition information that defines the posture of an object (eg, a human body).
  • the above-described posture definition information is, for example, information in which information representing the type of posture and information defining the relative position of each characteristic portion are paired.
  • the information indicating the type of the posture is, for example, a posture name such as a walking posture, a sitting position, and a yoga pose name.
  • the information defining the relative position of each characteristic portion is, for example, a range of the angle θ formed by the right shin Q31 and the right thigh Q32, or a threshold value.
  • the posture estimation unit 13 may estimate (identify) the posture of the human body M by comparing the generated posture information with the posture definition information.
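  • Read this way, the posture definition information can be pictured as a table pairing a posture name with an allowed range of a relative position such as the angle θ; the names and ranges below are hypothetical values used only to illustrate the comparison.

```python
# Hypothetical posture definition information: posture name -> allowed range of theta (degrees).
POSTURE_DEFINITIONS = {
    "sitting posture": (60.0, 120.0),
    "walking posture": (120.0, 180.0),
}

def classify_posture(theta_deg):
    """Return the first posture whose defined angle range contains theta, or None if no match."""
    for name, (lo, hi) in POSTURE_DEFINITIONS.items():
        if lo <= theta_deg <= hi:
            return name
    return None
```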
  • FIG. 8 is a flowchart illustrating the detection method according to the first embodiment.
  • for the configuration of the detection device 1 and the processing by each unit, refer to FIGS. 1 to 7 as appropriate.
  • in step S1, the position detection unit 2 detects position information of each point on the surface of the object M.
  • the position detector 2 detects depth as position information, and generates point cloud data representing position information as a detection result (see FIG. 3).
  • the processing in step S1 includes the processing in step S2 and the processing in step S3.
  • in step S2, the detection unit 4 detects a depth from a predetermined point (viewpoint) to each point of the target area AR.
  • the detection unit 4 causes the storage unit 14 to store a depth map (depth spatial distribution) representing the detection result.
  • in step S3, the point cloud data generation unit 5 generates point cloud data based on the depth detection result.
  • the point cloud data generation unit 5 causes the storage unit 14 to store the generated point cloud data.
  • in step S4, the reference calculation unit 11 calculates, as the reference position BP, a position where the amount of change in the position information satisfies a predetermined condition.
  • the reference calculation unit 11 calculates a first reference position BP1 and a second reference position BP2 as the reference position BP.
  • the processing in step S4 includes the processing in step S5 and the processing in step S6.
  • in step S5, the reference calculation unit 11 calculates the first reference position BP1 based on the amount of change in the coordinates (position information) in the vertical direction (Y direction) (see FIG. 4).
  • the reference calculation unit 11 causes the storage unit 14 to store the calculated first reference position BP1.
  • in step S6, the reference calculation unit 11 calculates a second reference position BP2 based on the first reference position BP1 (see FIG. 5).
  • the reference calculation unit 11 sets a predetermined area AR2 based on the first reference position BP1 calculated in step S5, and calculates a position satisfying a predetermined condition in the predetermined area AR2 as a second reference position BP2.
  • the reference calculation unit 11 causes the storage unit 14 to store the calculated second reference position BP2.
  • in step S7, based on the reference position BP, the part determination unit 12 determines the first part M1 on the first side (-Z side) and the second part M2 on the second side (+Z side) with respect to the reference plane BF.
  • the partial determination unit 12 sets the reference plane BF based on the second reference position BP2 calculated in step S6. Further, the partial determination unit 12 calculates a geodesic line SL connecting the determination target portion QX of the partial determination process and the second reference position BP2.
  • the part determination unit 12 determines whether the determination target part QX belongs to the first part M1 or the second part M2 based on the relative position between the reference plane BF and the geodesic line SL.
  • the partial determination unit 12 generates, as a processing result of the partial determination processing, determination information indicating which of the first part M1 and the second part M2 the determination target part QX belongs to.
  • the partial determination unit 12 causes the storage unit 14 to store the generated determination information.
  • the posture estimating unit 13 estimates the posture of the object M.
  • the posture estimating unit 13 executes a posture estimating process of estimating the posture of the object M (human body M) based on the processing result (eg, the discrimination information) of the partial determining unit 12.
  • the posture estimating unit 13 performs a recognition process on the shape of the human body M obtained from the point cloud data, and specifies a characteristic portion of the human body M (see FIG. 7).
  • the posture estimating unit 13 estimates the posture of the human body M based on the relative positions of the specified characteristic portions.
  • as described above, in the detection device 1, the reference calculation unit 11 calculates the reference position BP based on the amount of change in the position information, and the partial determination unit 12 determines, based on the reference position BP, the first portion M1 of the object M on the first side (eg, the -Z side, the left side) and the second portion M2 on the second side (eg, the +Z side, the right side). Therefore, the detection device 1 can determine the first portion M1 (eg, the left foot) and the second portion M2 (eg, the right foot) with high accuracy.
  • as a result, the posture estimating unit 13 can estimate the posture of the object M (the human body M) with high accuracy.
  • in the present embodiment, the reference calculation unit 11 estimates, as the reference position BP, the position of a portion including the center of the object M in the intersecting direction (eg, the Z direction).
  • since the part determination unit 12 performs the part determination processing based on the position of the portion including the center of the object M, the first part M1 on the first side and the second part M2 on the second side can be determined with high accuracy.
  • the reference position BP may be shifted from the center of the object M in the Z direction.
  • the reference position BP is shifted from the center of the object M to the first side (+ Z side) by a predetermined shift amount.
  • the maximum distance between the geodesic line SL (see FIG. 6) and the reference plane BF is larger than the amount of displacement between the reference position BP and the center of the object M.
  • in such a case, the part determination unit 12 calculates a feature amount of the portion of the geodesic line SL arranged on the first side (+Z side) with respect to the reference plane BF (eg, the ratio of that portion to the entire geodesic line SL, or its distance from the reference plane BF), and when the feature amount satisfies a predetermined condition, the part determination unit 12 may determine that the determination target portion QX is a part arranged on the first side.
  • the reference calculation unit 11 may calculate the reference position BP by processing the depth detected by the position detection unit 2 (detection unit 4).
  • the reference calculation unit 11 may calculate the reference position BP based on shape information (eg, polygon data) of the object M obtained from the depth detected by the position detection unit 2 (detection unit 4) (this will be described later).
  • the reference calculation unit 11 does not have to set the predetermined area AR2.
  • the reference calculation unit 11 may set the center between the first reference position BP1a and the first reference position BP1b to the second reference position BP2. Further, the reference calculation unit 11 may calculate only one of the first reference position BP1a and the first reference position BP1b as the first reference position BP1. In this case, the reference calculation unit 11 may set a position shifted by a predetermined amount in the Z direction from the first reference position BP1 as the second reference position BP2.
  • the predetermined amount may be set, for example, based on the size (scale) of the object M.
  • for example, the predetermined amount may be set to a predetermined ratio (for example, 25%) of the size of the object M in the Z direction (for example, the distance between the left end of the left shoulder and the right end of the right shoulder).
  • the reference calculation unit 11 does not need to calculate the first reference position BP1.
  • for example, the reference calculation unit 11 may calculate the reference position BP based on a pattern in which a feature amount (eg, Xmin(Zi)) on the front side or the rear side of the object M changes in the Z direction (eg, FIG. 5(B)).
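  • The idea of reading a reference position off the change pattern of a feature amount such as Xmin(Zi) can be sketched as follows. The slice width and the "largest change" criterion are assumptions of this example (the actual condition is defined with reference to the patent's figures), and the code only illustrates the general approach.

```python
import numpy as np

def xmin_profile(points, z_edges):
    """Minimum X coordinate of the points falling in each Z slice [z_edges[i], z_edges[i+1])."""
    prof = np.full(len(z_edges) - 1, np.nan)
    for i in range(len(z_edges) - 1):
        in_slice = (points[:, 2] >= z_edges[i]) & (points[:, 2] < z_edges[i + 1])
        if in_slice.any():
            prof[i] = points[in_slice, 0].min()
    return prof

def reference_z_from_profile(points, n_slices=50):
    """Example criterion: the slice boundary where Xmin(Zi) changes the most."""
    z_edges = np.linspace(points[:, 2].min(), points[:, 2].max(), n_slices + 1)
    prof = xmin_profile(points, z_edges)
    i = int(np.nanargmax(np.abs(np.diff(prof))))
    return z_edges[i + 1]  # Z coordinate of the boundary between slices i and i+1
```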
  • the detection device 1 does not need to include the attitude estimation unit 13 described above.
  • the detection device 1 may calculate the feature amounts (eg, dimensions, outer peripheral length, cross-sectional area) of the part determined by the part determination unit 12.
  • the detection device 1 may segment the object M using the processing result of the partial determination processing performed by the partial determination unit 12.
  • the detection device 1 uses the processing result of the partial determination processing by the partial determination unit 12 (eg, information for identifying whether the detected “foot” is the “left foot” or the “right foot”), and Information may be generated (described later with reference to FIG. 10).
  • FIG. 9 is a diagram illustrating a detection device according to the second embodiment.
  • in the first embodiment, the moving direction is determined in advance with respect to the target area AR (eg, the field of view of the position detection unit 2); in the present embodiment, by contrast, the detection device 1 detects the moving direction of the object M (the human body M).
  • the detection device 1 includes a direction calculation unit 21 that calculates (derives and detects) the moving direction of the object M (human body M) based on the detection result of the position detection unit 2.
  • the detecting unit 4 repeatedly detects the human body M, and the point cloud data generating unit 5 generates point cloud data of the human body M for each detection timing as position information.
  • the direction calculation unit 21 calculates the moving direction of the human body M based on the time change of the position information of the human body M detected by the position detection unit 2. For example, the direction calculation unit 21 calculates a trajectory of the human body M (for example, a time history of a predetermined position of the human body M), and sets a direction along the trajectory as a moving direction of the human body M.
  • the predetermined position is a position on the human body M determined by default setting or user setting.
  • the direction calculation unit 21 calculates the center of gravity of a plurality of points included in the point cloud data as the predetermined position of the human body M.
  • the direction calculation unit 21 calculates the moving direction based on the calculated temporal change of the center of gravity. For example, the direction calculation unit 21 calculates a vector starting from a first predetermined position of the human body M obtained from the detection result of the detection unit 4 at a first time and ending at a second predetermined position of the human body M obtained from the detection result of the detection unit 4 at a second time after the first time, and determines a direction vector (eg, a unit vector) parallel to this vector as the moving direction of the human body M at the first time.
  • the direction calculation unit 21 causes the storage unit 14 to store the calculated movement direction (eg, movement information).
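  • The centroid-based calculation described above reduces to differencing the centres of gravity at two detection timings and normalising the result. A minimal NumPy sketch, assuming the centre of gravity as the predetermined position:

```python
import numpy as np

def moving_direction(points_t1, points_t2):
    """Moving direction of the human body M between two detection timings.

    points_t1, points_t2: (N, 3) point cloud data of the human body M at a first time
    and at a second time after the first time.
    Returns a unit vector parallel to the displacement of the centre of gravity.
    """
    c1 = points_t1.mean(axis=0)  # predetermined position at the first time (centre of gravity)
    c2 = points_t2.mean(axis=0)  # predetermined position at the second time
    v = c2 - c1
    return v / np.linalg.norm(v)
```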
  • the reference calculation unit 11 calculates a reference position BP based on the moving direction calculated by the direction calculation unit 21.
  • the reference calculation unit 11 reads the movement direction calculated by the direction calculation unit 21 from the storage unit 14, sets the movement direction to the X direction, and executes a reference calculation process.
  • the part determination unit 12 determines the first part M1 and the second part M2 based on the moving direction calculated by the direction calculation unit 21.
  • the part determination unit 12 sets a plane parallel to the vertical direction and the movement direction calculated by the direction calculation unit 21 and including the second reference position BP2 as the reference plane BF.
  • the part determination unit 12 determines whether the determination target part QX is located on the first side (eg, ⁇ Z side) or the second side (eg, + Z side) with respect to the set reference plane BF. Determine.
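  • Because the reference plane BF contains the calculated movement direction and the vertical direction and passes through the second reference position BP2, its normal can be taken as the cross product of those two directions, and the side of the determination target portion QX follows from the sign of a signed distance. The sketch below is an illustration under that assumption (including which sign is mapped to which side), not the determination logic of the part determination unit 12.

```python
import numpy as np

def side_of_target(qx, bp2, move_dir, vertical=(0.0, 1.0, 0.0)):
    """Return which side of the reference plane BF the determination target portion QX lies on.

    BF contains the movement direction and the vertical direction and passes through BP2,
    so its normal is taken as the cross product of the two directions (an assumption of
    this sketch, as is the sign-to-side mapping).
    """
    normal = np.cross(np.asarray(move_dir, dtype=float), np.asarray(vertical, dtype=float))
    normal /= np.linalg.norm(normal)
    d = (np.asarray(qx, dtype=float) - np.asarray(bp2, dtype=float)) @ normal
    return "second side (+Z)" if d > 0 else "first side (-Z)"
```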
  • the predetermined position may be the position of the characteristic portion (eg, the head of the human body M) described in the first embodiment. Further, the predetermined position may be a position of a characteristic part (eg, right foot, left foot) of the human body M derived based on the determination result of the partial determination unit 12. Further, the predetermined position may be a reference position BP calculated by the reference calculation unit 11 (eg, a second reference position BP2). For example, the direction calculation unit 21 may calculate a candidate for the movement direction based on a trajectory of a first predetermined position (for example, the center of gravity) set in advance.
  • the reference calculation unit 11 may calculate a candidate for the reference position BP (for example, the second reference position BP2), using the candidate for the movement direction calculated by the direction calculation unit 21 as the X direction.
  • the direction calculation unit 21 may use the candidate for the second reference position BP2 calculated by the reference calculation unit 11 as the second predetermined position, and may calculate (eg, recalculate) the moving direction based on the trajectory of the second predetermined position.
  • the reference calculation unit 11 may calculate (recalculate) the reference position BP with the movement direction calculated using the second predetermined position as the X direction.
  • the direction calculation unit 21 may calculate the moving direction of the object M based on the position information of the object M obtained from the global positioning system (GPS). Further, an acceleration sensor may be provided on the object M, and the direction calculation unit 21 may calculate the moving direction of the object M based on the detection result of the acceleration sensor.
  • for example, the human body M may move while carrying a mobile terminal (for example, a smartphone) including one or both of a receiving unit that receives information from GPS and the acceleration sensor, and the direction calculation unit 21 may acquire the position information (eg, GPS information, acceleration) of the human body M from the mobile terminal and calculate the moving direction of the object M.
  • the detection device 1 may include a vertical detection unit (for example, a sensor that detects the direction of gravitational acceleration) that detects the vertical direction, and may perform one or both of the reference calculation process and the partial determination process with the detection result of the vertical detection unit set as the Y direction.
  • the detection device 1 may not include the vertical detection unit.
  • for example, the vertical detection unit may be provided on the object M (carried by the human body M), and the detection device 1 may perform one or both of the reference calculation process and the partial determination process based on the vertical direction acquired from the vertical detection unit.
  • the vertical detection unit may be provided separately from both the detection device 1 and the object M.
  • the vertical detection unit may be provided in a place where the object M is detected (for example, equipment in which the detection device 1 is installed).
  • FIG. 10 is a diagram illustrating a detection device according to the third embodiment.
  • the detection device 1 includes a model generation unit 22 that generates model information of an object M (in this case, a human body M).
  • the model information is, for example, three-dimensional CG model data, and includes shape information of the object M.
  • the model generation unit 22 calculates model information of the object M based on the first part M1 and the second part M2 determined by the part determination unit 12.
  • the model generation unit 22 includes the point cloud data generation unit 5 and the surface information generation unit 23.
  • the point cloud data generation unit 5 generates point cloud data as shape information based on the depth of the object M detected by the detection unit 4 as described in the first embodiment.
  • the surface information generation unit 23 performs a surface process for calculating surface information as shape information.
  • the surface information includes, for example, at least one of polygon data, vector data, and draw data.
  • the surface information includes coordinates of a plurality of points on the surface of the object and connection information between the plurality of points.
  • the connection information (for example, attribute information) includes, for example, information that associates points at both ends of a line corresponding to a ridge line (for example, an edge) on the surface of an object. Further, the connection information includes, for example, information that associates a plurality of lines corresponding to the contour of the object surface (surface) with each other.
  • the surface information generation unit 23 estimates a surface between a point selected from a plurality of points included in the point cloud data and a nearby point in the surface processing.
  • the surface information generation unit 23 segments the point group data using the processing result of the partial determination unit 12 when estimating a surface between a point and a nearby point. For example, when generating the surface information of the left foot of the human body M, the surface information generation unit 23 distinguishes the point group data of the left foot from the point group data of the right foot based on the processing result of the partial determination unit 12. In this case, for example, when the left knee and the right knee are approaching (for example, in contact with each other), the surface information generation unit 23 avoids estimating a surface that straddles the left knee and the right knee. Accurate surface information can be generated.
  • the surface information generation unit 23 converts the point cloud data into polygon data having plane information between points.
  • the surface information generation unit 23 converts the point cloud data into polygon data by, for example, an algorithm using the least squares method. This algorithm may be, for example, an algorithm that is published in a point cloud processing library.
  • the surface information generation unit 23 causes the storage unit 14 to store the calculated surface information.
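  • At its smallest scale, converting point cloud data to surface information "by an algorithm using the least squares method" can be pictured as fitting a plane to a selected point and its nearby points to estimate the local surface. The NumPy-only sketch below is not the library algorithm referred to above, and it assumes the neighbourhood has already been segmented (eg, left-foot points kept separate from right-foot points, as described above).

```python
import numpy as np

def fit_local_plane(neighborhood):
    """Least-squares plane through a selected point and its nearby points.

    neighborhood: (K, 3) array of the point and its neighbours.
    Returns (centroid, unit normal); the normal is the eigenvector of the covariance
    matrix with the smallest eigenvalue, i.e. the direction of least spread.
    """
    c = neighborhood.mean(axis=0)
    cov = np.cov((neighborhood - c).T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return c, eigvecs[:, 0]
```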
  • the model information may include texture information of the object M.
  • the model generation unit 22 may generate texture information of a plane defined by three-dimensional point coordinates and related information.
  • the texture information includes, for example, at least one information of a character, a graphic, a pattern, a texture, a pattern, information defining irregularities, a specific image, and a color (eg, chromatic color, achromatic color) on the surface of the object.
  • the model generation unit 22 may cause the storage unit 14 to store the generated texture information.
  • the model information may include spatial information (eg, lighting conditions, light source information) of the image.
  • the light source information includes, for example, information on at least one of: the position of a light source that irradiates the object M with light (eg, illumination light), the direction in which light is emitted from the light source toward the object M (irradiation direction), the wavelength of the light emitted from the light source, and the type of the light source.
  • the model generation unit 22 may calculate the light source information using, for example, a model assuming Lambertian reflection, a model including Albedo estimation, or the like.
  • the model generation unit 22 may cause the storage unit 14 to store the generated spatial information.
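  • A model assuming Lambertian reflection relates observed brightness to the surface normal roughly as I ≈ albedo × max(0, n · l). Given normals and intensities for several surface points, a single light direction can be estimated by least squares, as in the simplified sketch below; this is only an illustration of that kind of model, not the method actually used by the model generation unit 22.

```python
import numpy as np

def estimate_light_direction(normals, intensities):
    """Estimate one light direction under a Lambertian model: I ~ n . s.

    normals:     (N, 3) unit surface normals of points on the object.
    intensities: (N,)   observed brightness of those points.
    The albedo and light intensity are folded into the magnitude of the
    least-squares solution and discarded; only the direction is returned.
    """
    s, *_ = np.linalg.lstsq(normals, intensities, rcond=None)
    return s / np.linalg.norm(s)
```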
  • the model generation unit 22 does not need to generate one or both of the texture information and the spatial information.
  • FIG. 11 is a diagram illustrating a detection device according to the fourth embodiment.
  • the detection device 1 (detection system) includes a model generation unit 22, a rendering processing unit 24 (rendering processing device, information processing device), an input device 25, and a display device 26.
  • the rendering processing unit 24 includes, for example, a graphics processing unit (GPU). Note that the rendering processing unit 24 may be configured such that a CPU and a memory execute each processing according to an image processing program.
  • the rendering processing unit 24 performs, for example, at least one of drawing processing, texture mapping processing, and shading processing.
  • the rendering processing unit 24 can calculate, for example, an estimated image (eg, a reconstructed image) obtained by viewing the shape defined in the shape information of the model information from an arbitrary viewpoint.
  • the shape indicated by the shape information is referred to as a model shape.
  • the rendering processing unit 24 can reconstruct a model shape (eg, an estimated image) from the model information (eg, shape information) by, for example, a drawing process.
  • the rendering processing unit 24 causes the storage unit 14 to store the data of the calculated estimated image, for example.
  • the rendering processing unit 24 can calculate, for example, an estimated image in which the image indicated by the texture information of the model information is pasted on the surface of the object on the estimated image.
  • the rendering processing unit 24 can calculate an estimated image in which a texture different from the target object is pasted on the surface of the object on the estimated image.
  • the rendering processing unit 24 can calculate, for example, an estimated image in which a shadow formed by the light source indicated by the light source information of the model information is added to an object on the estimated image. In the shading process, the rendering processing unit 24 can calculate, for example, an estimated image in which a shadow formed by an arbitrary light source is added to an object on the estimated image.
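  • In its simplest form, the "estimated image from an arbitrary viewpoint" of the drawing process can be pictured as transforming model-shape points into a camera coordinate system and applying a pinhole projection. The sketch below projects points only (no rasterisation, texture mapping, or shading), and all camera parameters are assumptions of the example.

```python
import numpy as np

def project_points(points, cam_pos, cam_rot, focal=500.0, center=(320.0, 240.0)):
    """Pinhole projection of model-shape points into an estimated image plane.

    points:  (N, 3) vertices of the model shape.
    cam_pos: (3,)   position of the arbitrary viewpoint.
    cam_rot: (3, 3) rotation from world coordinates to camera coordinates.
    Returns (N, 2) pixel coordinates; points behind the camera come back as NaN.
    """
    pc = (np.asarray(points, dtype=float) - cam_pos) @ np.asarray(cam_rot).T
    uv = np.full((len(pc), 2), np.nan)
    front = pc[:, 2] > 0
    uv[front, 0] = focal * pc[front, 0] / pc[front, 2] + center[0]
    uv[front, 1] = focal * pc[front, 1] / pc[front, 2] + center[1]
    return uv
```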
  • the input device 25 is used for inputting various information (eg, data, commands) to the processing device 3.
  • the user can input various information to the processing device 3 by operating the input device 25.
  • the input device 25 includes, for example, at least one of a keyboard, a mouse, a trackball, a touch panel, and a voice input device (eg, a microphone).
  • the display device 26 displays the image based on the image data output from the processing device 3.
  • the processing device 3 outputs the data of the estimated image generated by the rendering processing unit 24 to the display device 26.
  • the display device 26 displays the estimated image based on the data of the estimated image output from the processing device 3.
  • the display device 26 includes, for example, a liquid crystal display.
  • the display device 26 and the input device 25 may be a touch panel or the like.
  • the detection device 1 does not need to include the input device 25.
  • the detection device 1 may be in a form in which various commands and information are input via communication.
  • the detection device 1 does not need to include the display device 26.
  • the detection device 1 may output data of an estimated image generated by the rendering process to a display device via communication, and the display device may display the estimated image.
  • the rendering processing unit 24 is provided in the processing device 3 in FIG. 11, but may be provided in a device external to the processing device 3.
  • the external device may be a cloud computer communicably connected to the processing device 3.
  • the processing device 3 includes, for example, a computer system.
  • the processing device 3 reads a processing program stored in the storage unit 14 and executes various processes according to the processing program.
  • This processing program causes the computer to execute: calculating, as the reference position, a position where the amount of change in the position information of each point on the surface of the object satisfies a predetermined condition in the intersecting direction intersecting the moving direction of the moving object and the vertical direction; and determining, based on the reference position, a first portion of the object disposed on the first side with respect to the reference plane including the movement direction and the vertical direction, and a second portion of the object disposed on the second side opposite to the first side with respect to the reference plane.
  • This program may be provided by being recorded on a computer-readable storage medium (eg, a non-transitory recording medium, a non-transitory tangible medium).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

[Problem] To determine the portion of a moving object on one side and the portion on the other side with respect to a crossing direction crossing the movement direction of the object. [Solution] This detection device is provided with: a detection unit which detects position information of points on a moving object; a reference calculation unit which, as a reference position, calculates a position at which the amount of change in position information in a direction which crosses the movement direction of the object and the vertical direction satisfies a prescribed condition; and a determination unit which, relative to a reference plane that is based on the reference position calculated by the reference calculation unit and that contains the movement direction and the vertical direction, determines a first portion of an object disposed on a first side and a second portion of the object disposed on the second side opposite of the first side with respect to the reference plane.

Description

検出装置、処理装置、検出方法、及び処理プログラムDetection device, processing device, detection method, and processing program
 本発明は、検出装置、処理装置、検出方法、及び処理プログラムに関する。 The present invention relates to a detection device, a processing device, a detection method, and a processing program.
 物体を検出する技術として、例えば下記の特許文献1に記載された技術がある。歩行する人体等の移動する物体を検出する場合、移動方向に対して左側の部分と右側の部分とを判別することが難しいことがある。例えば、歩行時やランニング時における人体を検出する場合、人体の肩幅方向において左足と右足とがほぼ同じ位置になり、左足と右足との判別が難しい。 As a technique for detecting an object, for example, there is a technique described in Patent Document 1 below. When detecting a moving object such as a walking human body, it may be difficult to distinguish between a left portion and a right portion with respect to the moving direction. For example, when detecting a human body during walking or running, the left foot and the right foot are substantially at the same position in the shoulder width direction of the human body, and it is difficult to distinguish between the left foot and the right foot.
特開2010-134546号公報JP 2010-134546 A
 本発明の態様に従えば、移動する物体の各点の位置情報を検出する検出部と、物体の移動方向と鉛直方向とに交差する交差方向において位置情報の変化量が所定の条件を満たす位置を基準位置として算出する基準算出部と、基準算出部が算出した基準位置に基づいた、移動方向と鉛直方向とを含む基準面に対して、第1側に配置される物体の第1部分と、基準面に対して第1側と反対の第2側に配置される物体の第2部分とを判別する判別部と、を備える検出装置が提供される。 According to an aspect of the present invention, a detection unit that detects position information of each point of a moving object, and a position where a change amount of the position information satisfies a predetermined condition in an intersecting direction intersecting a moving direction of the object and a vertical direction. And a first portion of the object disposed on the first side with respect to a reference plane including the moving direction and the vertical direction based on the reference position calculated by the reference calculation unit. A determination unit configured to determine a second portion of the object disposed on the second side opposite to the first side with respect to the reference plane.
 本発明の態様に従えば、移動する物体の移動方向と鉛直方向とに交差する交差方向において、物体の表面上の各点の位置情報の変化量が所定の条件を満たす位置を基準位置として算出する基準算出部と、基準位置に基づいて、物体のうち移動方向と鉛直方向とを含む基準面に対して第1側に配置される第1部分と、物体のうち基準面に対して第1側と反対の第2側に配置される第2部分とを判別する判別部と、を備える処理装置が提供される。 According to the aspect of the present invention, a position where the amount of change in the position information of each point on the surface of the object satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the moving object and the vertical direction is calculated as a reference position. A reference calculation unit, a first portion disposed on a first side with respect to a reference plane including a movement direction and a vertical direction of the object based on the reference position, and a first portion disposed on a first side with respect to the reference plane of the object. And a discriminating unit for discriminating a second portion disposed on a second side opposite to the side.
 本発明の態様に従えば、移動する物体の各点の位置情報を検出することと、物体の移動方向と鉛直方向とに交差する交差方向において位置情報の変化量が所定の条件を満たす位置を基準位置として算出することと、基準位置に基づいた、移動方向と鉛直方向とを含む基準面に対して、第1側に配置される物体の第1部分と、基準面に対して第1側と反対の第2側に配置される物体の第2部分とを判別することと、を含む検出方法が提供される。 According to the aspect of the present invention, the position information of each point of the moving object is detected, and the position where the amount of change of the position information satisfies a predetermined condition in the intersecting direction intersecting the moving direction of the object and the vertical direction is determined. Calculating a reference position, a first portion of an object disposed on a first side with respect to a reference plane including a moving direction and a vertical direction based on the reference position, and a first side with respect to the reference plane Discriminating a second portion of the object located on the second side opposite to the second portion.
 本発明の態様に従えば、コンピュータに、移動する物体の移動方向と鉛直方向とに交差する交差方向において、物体の表面上の各点の位置情報の変化量が所定の条件を満たす位置を基準位置として算出することと、基準位置に基づいて、移動方向と鉛直方向とを含む基準面に対して第1側に配置される物体の第1部分と、基準面に対して第1側と反対の第2側に配置される物体の第2部分とを判別することと、を実行させる処理プログラムが提供される。 According to an aspect of the present invention, the computer determines, based on the position where the amount of change in the position information of each point on the surface of the object satisfies a predetermined condition, in the direction intersecting the moving direction of the moving object and the vertical direction. Calculating as a position, and, based on the reference position, a first portion of the object disposed on a first side with respect to a reference plane including a movement direction and a vertical direction, and opposite to the first side with respect to the reference plane. And determining a second portion of the object arranged on the second side.
第1実施形態に係る検出装置を示す図である。It is a figure showing the detecting device concerning a 1st embodiment. 第1実施形態に係る検出部を示す図である。FIG. 3 is a diagram illustrating a detection unit according to the first embodiment. 第1実施形態に係る点群データ生成部の処理を示す図である。FIG. 4 is a diagram illustrating a process of a point cloud data generation unit according to the first embodiment. (A)から(C)は、第1実施形態に係る基準算出部の処理を示す図である。FIGS. 3A to 3C are diagrams illustrating a process of a reference calculation unit according to the first embodiment. (A)、(B)は、第1実施形態に係る基準算出部の処理を示す図である。FIGS. 4A and 4B are diagrams illustrating a process of a reference calculation unit according to the first embodiment. 第1実施形態に係る部分判別部の処理を示す図である。FIG. 4 is a diagram illustrating a process of a partial determination unit according to the first embodiment. 第1実施形態に係る姿勢推定部の処理を示す図である。FIG. 5 is a diagram illustrating processing of a posture estimating unit according to the first embodiment. 第1実施形態に係る検出方法を示すフローチャートである。5 is a flowchart illustrating a detection method according to the first embodiment. 第2実施形態に係る検出装置を示す図である。It is a figure showing a detecting device concerning a 2nd embodiment. 第3実施形態に係る検出装置を示す図である。It is a figure showing the detecting device concerning a 3rd embodiment. 第4実施形態に係る検出装置を示す図である。It is a figure showing the detecting device concerning a 4th embodiment.
[第1実施形態]
 第1実施形態について説明する。図1は、第1実施形態に係る検出装置を示す図である。検出装置1(検出システム)は、例えば、モーションキャプチャ装置、動作検出システム、運動支援システムなどである。また、検出装置1は、姿勢解析又は3次元モデリングなどに使われる。これらの場合、検出装置1は、所定の時間範囲において一方向又は複数の方向に移動する物体Mを検出する。検出装置1は、直線的に移動する物体Mを検出してもよいし、蛇行して移動する物体Mを検出してもよい。検出装置1が物体Mを検出する際の物体Mの移動経路(例、軌跡)は、直線を含んでもよいし、曲線を含んでもよく、直線および曲線を含んでもよい。検出装置1は、物体Mの移動方向MDに対する側方の第1側(例、左側)に配置される物体Mの第1部分M1と、第1側と反対側(例、右側)に配置される物体Mの第2部分M2とを判別する。
[First Embodiment]
A first embodiment will be described. FIG. 1 is a diagram illustrating a detection device according to the first embodiment. The detection device 1 (detection system) is, for example, a motion capture device, a motion detection system, an exercise support system, or the like. The detection device 1 is used for posture analysis or three-dimensional modeling. In these cases, the detection device 1 detects the object M that moves in one direction or a plurality of directions in a predetermined time range. The detection device 1 may detect the object M that moves linearly, or may detect the object M that moves meandering. The movement path (eg, trajectory) of the object M when the detection device 1 detects the object M may include a straight line, may include a curve, or may include a straight line and a curve. The detection device 1 is arranged on a first portion M1 of the object M arranged on a first side (eg, left side) with respect to the moving direction MD of the object M, and is arranged on a side opposite to the first side (eg, right side). The second part M2 of the object M is determined.
 物体Mは、検出装置1による検出の対象となる対象領域AR(例、検出装置1の検出領域、視野)において移動可能である。物体Mは、例えば、人体または人以外の動物、人型または人以外の動物型のロボット、若しくは動物型以外のロボットを含む。以下の説明において、物体Mが人体であるとし、適宜、物体Mを人体Mと表す。物体M(この場合、人体M)は、例えば、移動に伴って姿勢と形状との一方または双方が変化する物体である。例えば、人体Mは、移動(例、歩行、走行、運動、動作)する際に、第1部分M1(例、左足)と第2部分M2(例、右足)とが交互に移動方向に移動して、人体Mの少なくとも一部の形状(例、姿勢)が変化する。人体Mの場合、検出装置1は、人体Mの移動方向MD又は人体Mの正中線(例、体表上の左右の中心線)を境界にして第1側(例、左側)に配置される第1部分M1と、第1側と反対側(例、右側)に配置される第2部分M2とを判別できる。 The object M is movable in a target area AR to be detected by the detection device 1 (eg, the detection region of the detection device 1, the field of view). The object M includes, for example, a human body or a non-human animal, a human-type or non-human animal-type robot, or a non-animal-type robot. In the following description, it is assumed that the object M is a human body, and the object M is appropriately referred to as a human body M. The object M (in this case, the human body M) is, for example, an object whose posture and / or shape changes with movement. For example, when the human body M moves (eg, walks, runs, exercises, and moves), the first portion M1 (eg, left foot) and the second portion M2 (eg, right foot) alternately move in the movement direction. Thus, the shape (eg, posture) of at least a part of the human body M changes. In the case of the human body M, the detection device 1 is disposed on the first side (eg, left side) with the moving direction MD of the human body M or the midline of the human body M (eg, left and right center lines on the body surface) as a boundary. The first portion M1 and the second portion M2 disposed on the opposite side (eg, right side) from the first side can be determined.
 移動する人体Mは、例えば、フェンシング、野球、サッカー、ゴルフ、剣道、アメリカンフットボール、アイスホッケー、又は体操などのスポーツ、ランニング、エクササイズ、ヨガ、ボディビル、ファッションショーなどのウォーキング又はポージング、ゲーム、人物認証、若しくは仕事において動作する人体である。物体Mは、移動する人体と、移動する人体に付帯する物体(例、衣類、装着物、運動器具、防具)とを含んでもよい。 The moving human body M is, for example, a sport such as fencing, baseball, soccer, golf, kendo, American football, ice hockey, or gymnastics, walking or posing such as running, exercise, yoga, bodybuilding, a fashion show, a game, or a person. It is a human body that works for authentication or work. The object M may include a moving human body and an object attached to the moving human body (eg, clothing, wearing equipment, exercise equipment, and armor).
 以下の説明において、図1などに示すXYZ直交座標系を参照する。このXYZ直交座標系において、X方向は物体M(例、人体M)の移動方向であり、Y方向は鉛直方向であり、Z方向は物体M(人体M)の移動方向と鉛直方向とに交差(例、直交)する交差方向(例、直交方向)である。物体Mが人体である場合に、Z方向は人体Mの肩幅の方向を含み、第1部分M1は人体Mの左半身の少なくとも一部であり、第2部分M2は人体Mの右半身の少なくとも一部である。また、X、Y、Zの各方向において、適宜、矢印が指す側を+側(例、+Z側)と称し、その反対側を-側(例、-Z側)と称する。例えば、上記第1側は-Z側であり、上記第2側は+Z側である。 In the following description, an XYZ orthogonal coordinate system shown in FIG. In this XYZ rectangular coordinate system, the X direction is the moving direction of the object M (eg, the human body M), the Y direction is the vertical direction, and the Z direction intersects the moving direction of the object M (the human body M) with the vertical direction. (Eg, orthogonal) crossing directions (eg, orthogonal directions). When the object M is a human body, the Z direction includes the direction of the shoulder width of the human body M, the first part M1 is at least a part of the left half of the human body M, and the second part M2 is at least a part of the right half of the human body M. Part. In each of the X, Y, and Z directions, the side indicated by the arrow is appropriately referred to as a + side (eg, + Z side), and the opposite side is referred to as a − side (eg, −Z side). For example, the first side is the -Z side, and the second side is the + Z side.
 検出装置1は、位置検出部2と処理装置3とを備える。位置検出部2は、移動する人体Mの位置情報を検出する。位置検出部2は、人体Mの位置情報として、人体Mの表面上の各点の三次元座標を含む点群データを検出する。本実施形態において、人体Mの移動方向は、対象領域AR(例、位置検出部2の視野)に対して予め定められている。対象領域ARにおける鉛直方向および人体Mの移動方向は、既知の情報として検出装置1に与えられている。位置検出部2の検出結果(例、点群データ)において、X、Y、Zの各方向は上記既知の情報に基づいて設定される。位置検出部2は、検出部4と、点群データ生成部5とを含む。 The detection device 1 includes a position detection unit 2 and a processing device 3. The position detector 2 detects position information of the moving human body M. The position detecting unit 2 detects point group data including three-dimensional coordinates of each point on the surface of the human body M as position information of the human body M. In the present embodiment, the moving direction of the human body M is predetermined with respect to the target area AR (for example, the field of view of the position detection unit 2). The vertical direction and the moving direction of the human body M in the target area AR are given to the detection device 1 as known information. In the detection result (eg, point cloud data) of the position detection unit 2, the X, Y, and Z directions are set based on the known information. The position detector 2 includes a detector 4 and a point cloud data generator 5.
 検出部4は、例えば携帯型の装置(携帯機器)の少なくとも一部である。検出部4は、据え置き型の装置の少なくとも一部でもよい。検出部4は、処理装置3の内部に設けられてもよい。また、点群データ生成部5は、処理装置3の外部の装置に設けられてもよい。例えば、位置検出部2は、処理装置3の外部の装置であって、検出部4と点群データ生成部5とを内蔵してもよい。検出装置1の一部または全体は、携帯可能な装置(例、情報端末、スマートフォン、タブレット、カメラ付き携帯電話、ウェアラブル端末)でもよい。また、検出装置1の一部または全体は、据え置き型の装置(例、定点カメラ)でもよい。 The detection unit 4 is, for example, at least a part of a portable device (portable device). The detection unit 4 may be at least a part of a stationary device. The detection unit 4 may be provided inside the processing device 3. Further, the point cloud data generating unit 5 may be provided in a device outside the processing device 3. For example, the position detection unit 2 is a device external to the processing device 3 and may include the detection unit 4 and the point cloud data generation unit 5. A part or the whole of the detection device 1 may be a portable device (eg, an information terminal, a smartphone, a tablet, a camera-equipped mobile phone, and a wearable terminal). Further, a part or the whole of the detection device 1 may be a stationary device (eg, a fixed-point camera).
 図2は、第1実施形態に係る検出部を示す図である。検出部4は、人体Mの位置情報としてデプスを検出する。検出部4は、例えばデプスセンサ(例、デプスカメラ)を含む。検出部4は、所定の点から、対象領域ARに配置される物体の表面における各点までのデプス(例、位置情報、距離、奥行き、深度)を検出する。所定の点は、例えば、検出部4による検出の基準になる位置の点(例、視点、検出元の点、検出部4の位置を表す点、後述する撮像素子8の画素の位置)である。 FIG. 2 is a diagram illustrating the detection unit according to the first embodiment. The detection unit 4 detects depth as position information of the human body M. The detection unit 4 includes, for example, a depth sensor (eg, a depth camera). The detection unit 4 detects a depth (eg, position information, distance, depth, depth) from a predetermined point to each point on the surface of the object placed in the target area AR. The predetermined point is, for example, a point at a position serving as a reference for detection by the detection unit 4 (eg, a viewpoint, a detection source point, a point representing the position of the detection unit 4, a pixel position of the imaging element 8 described later). .
 検出部4は、照射部6、光学系7、及び撮像素子8を備える。照射部6は、対象領域AR(空間、検出領域)に光La(例、パターン光、照射光)を照射(例、投影)する。光学系7は、例えば結像光学系(撮像光学系)を含む。撮像素子8は、例えば、CMOSイメージセンサあるいはCCDイメージセンサを含む。撮像素子8は、2次元的に配列された複数の画素を有する。撮像素子8は、光学系7を介して、対象領域ARを撮像する。撮像素子8は、光Laの照射によって対象領域ARの物体から放射される光Lb(赤外光、戻り光)を検出する。 The detection unit 4 includes an irradiation unit 6, an optical system 7, and an imaging device 8. The irradiation unit 6 irradiates (eg, projects) light La (eg, pattern light, irradiation light) to the target area AR (space, detection area). The optical system 7 includes, for example, an imaging optical system (imaging optical system). The imaging device 8 includes, for example, a CMOS image sensor or a CCD image sensor. The imaging element 8 has a plurality of pixels arranged two-dimensionally. The imaging element 8 captures an image of the target area AR via the optical system 7. The imaging element 8 detects light Lb (infrared light, return light) emitted from the object in the target area AR due to the irradiation of the light La.
 検出部4は、例えば、照射部6から照射される光Laのパターン(例、強度分布)と、撮像素子8によって検出された光Lbのパターン(例、強度分布、撮像画像)とに基づいて、撮像素子8の各画素に対応する対象領域AR上の点から、撮像素子8の各画素までのデプスを検出する。検出部4は、その検出結果として、対象領域ARにおけるデプスの分布を表したデプスマップ(例、デプス画像、奥行き情報、距離情報)を処理装置3(図1参照)に出力する。検出部4は、人体Mの位置情報として、デプスマップを処理装置3に出力する。 The detection unit 4 is based on, for example, a pattern (eg, intensity distribution) of the light La emitted from the irradiation unit 6 and a pattern (eg, intensity distribution, captured image) of the light Lb detected by the image sensor 8. , The depth from the point on the target area AR corresponding to each pixel of the image sensor 8 to each pixel of the image sensor 8 is detected. The detection unit 4 outputs a depth map (e.g., a depth image, depth information, and distance information) representing a depth distribution in the target area AR to the processing device 3 (see FIG. 1) as a detection result. The detection unit 4 outputs a depth map to the processing device 3 as position information of the human body M.
 なお、検出部4は、TOF(time of flight)法によってデプスを検出するデバイスでもよい。検出部4は、TOF法以外の手法でデプスを検出するデバイスでもよい。検出部4は、例えば、レーザスキャナ(例、レーザ測距器)を含み、レーザスキャンによってデプスを検出してもよい。検出部4は、例えば、位相差センサを含み、位相差法によってデプスを検出してもよい。検出部4は、例えば、DFD(depth from defocus)法によってデプスを検出してもよい。 Note that the detection unit 4 may be a device that detects a depth by a time-of-flight (TOF) method. The detection unit 4 may be a device that detects depth by a method other than the TOF method. The detection unit 4 may include, for example, a laser scanner (for example, a laser range finder), and may detect the depth by laser scanning. The detection unit 4 may include, for example, a phase difference sensor, and may detect the depth by a phase difference method. The detection unit 4 may detect the depth by, for example, a DFD (depth @ from @ defocus) method.
 なお、検出部4は、赤外光以外の光(例、可視光)を物体Mに照射し、物体Mから出射する光(例、可視光)を検出してもよい。検出部4は、例えばステレオカメラなどを含み、複数の視点から物体Mを検出(例、撮像)してもよい。検出部4は、複数の視点から物体Mを撮像した撮像画像を用いて、三角測量によってデプスを検出してもよい。検出部4は、光学的な手法以外の手法(例、超音波によるスキャン)でデプスを検出してもよい。 The detection unit 4 may irradiate the object M with light other than infrared light (eg, visible light) and detect light (eg, visible light) emitted from the object M. The detection unit 4 may include, for example, a stereo camera, and may detect (eg, image) the object M from a plurality of viewpoints. The detection unit 4 may detect depth by triangulation using captured images obtained by capturing the object M from a plurality of viewpoints. The detection unit 4 may detect the depth by a method other than an optical method (for example, scanning by ultrasonic waves).
 図1の説明に戻り、点群データ生成部5は、検出部4が検出したデプスに基づいて、人体Mの位置情報として点群データを生成する。図1において、検出部4が処理装置3の外部に設けられ、点群データ生成部5が処理装置3の内部に設けられる。処理装置3は、検出部4と通信可能に接続される。検出部4は、その検出結果を処理装置3に出力する。処理装置3は、検出部4から出力された検出結果を処理する。 1 Returning to the description of FIG. 1, the point cloud data generation unit 5 generates point cloud data as position information of the human body M based on the depth detected by the detection unit 4. In FIG. 1, a detection unit 4 is provided outside the processing device 3, and a point cloud data generation unit 5 is provided inside the processing device 3. The processing device 3 is communicably connected to the detection unit 4. The detection unit 4 outputs the detection result to the processing device 3. The processing device 3 processes the detection result output from the detection unit 4.
 処理装置3は、点群データ生成部5、基準算出部11、部分判別部12、姿勢推定部13、及び記憶部14を備える。記憶部14は、例えば、不揮発性のメモリ、ハードディスク(HDD)、ソリッドステートドライブ(SSD)などである。記憶部14は、処理装置3で処理される元データ、及び処理装置3で処理されたデータ(例、処理装置3で生成されたデータ)を記憶する。記憶部14は、点群データの元データとして、検出部4から出力されるデプスマップを記憶する。 The processing device 3 includes a point cloud data generation unit 5, a reference calculation unit 11, a partial determination unit 12, a posture estimation unit 13, and a storage unit 14. The storage unit 14 is, for example, a nonvolatile memory, a hard disk (HDD), a solid state drive (SSD), or the like. The storage unit 14 stores original data processed by the processing device 3 and data processed by the processing device 3 (eg, data generated by the processing device 3). The storage unit 14 stores the depth map output from the detection unit 4 as the original data of the point cloud data.
 点群データ生成部5は、人体Mの位置情報として点群データを生成する点群処理を実行する。点群データは、対象領域ARにおける人体M上の複数の点の3次元座標を含む。点群データは、人体Mの周囲の物体(例、壁、床)上の複数の点の3次元座標を含んでもよい。点群データ生成部5は、検出部4が検出した位置情報(例、デプス)に基づいて、人体Mの点群データを算出する。点群データ生成部5は、記憶部14に記憶されている検出部4の検出結果(例、デプスマップ)を読み出して、点群データを算出する。点群データ生成部5は、物体M(例、人体M)のモデル情報(例、形状情報)として点群データを生成してもよい。 The point cloud data generation unit 5 executes point cloud processing for generating point cloud data as position information of the human body M. The point cloud data includes three-dimensional coordinates of a plurality of points on the human body M in the target area AR. The point cloud data may include three-dimensional coordinates of a plurality of points on an object (eg, a wall, a floor) around the human body M. The point cloud data generation unit 5 calculates point cloud data of the human body M based on the position information (for example, depth) detected by the detection unit 4. The point cloud data generation unit 5 reads out the detection result (eg, depth map) of the detection unit 4 stored in the storage unit 14 and calculates point cloud data. The point cloud data generating unit 5 may generate point cloud data as model information (eg, shape information) of the object M (eg, a human body M).
 図3は、第1実施形態に係る点群データ生成部の処理を示す図である。符号D1は、検出部4の検出結果に相当するデプスマップ(例、デプス画像)である。デプスマップD1は、検出部4によるデプスの測定値の空間分布を表す情報(例、画像)である。デプスマップD1は、対象領域ARの各点におけるデプスを階調値で表したグレースケールの画像である。デプスマップD1において、階調値が相対的に高い部分(例、白い部分、明るい部分)は、デプスが相対的に小さい部分(例、位置検出部2から相対的に近い部分)である。デプスマップD1において、階調値が相対的に低い部分(例、黒い部分、暗い部分)は、デプスが相対的に大きい部分(例、位置検出部2から相対的に遠い部分)である。 FIG. 3 is a diagram illustrating a process of the point cloud data generation unit according to the first embodiment. Symbol D1 is a depth map (eg, a depth image) corresponding to the detection result of the detection unit 4. The depth map D <b> 1 is information (for example, an image) representing a spatial distribution of a depth measurement value obtained by the detection unit 4. The depth map D1 is a grayscale image in which the depth at each point of the target area AR is represented by a gradation value. In the depth map D1, a portion having a relatively high tone value (eg, a white portion, a bright portion) is a portion having a relatively small depth (eg, a portion relatively close to the position detection unit 2). In the depth map D1, a portion having a relatively low gradation value (eg, a black portion, a dark portion) is a portion having a relatively large depth (eg, a portion relatively far from the position detection unit 2).
 点群データ生成部5(図1参照)は、記憶部14からデプスマップD1のデータを読み出し、デプスマップD1の各画素の階調値(例、デプスの測定値)に基づいて、各画素に相当する実空間上の点の3次元座標を算出し、点群データD2を生成する。以下の説明において、適宜、1つの点の三次元座標を点データと称する。点群データD2は、複数の点データを一組にしたデータである。 The point cloud data generation unit 5 (see FIG. 1) reads out the data of the depth map D1 from the storage unit 14 and assigns the data to each pixel based on the gradation value (eg, the measured value of the depth) of each pixel of the depth map D1. The three-dimensional coordinates of the corresponding point in the real space are calculated, and the point group data D2 is generated. In the following description, the three-dimensional coordinates of one point are appropriately referred to as point data. The point cloud data D2 is data obtained by grouping a plurality of point data.
 The processing device 3 (e.g., the point cloud data generation unit 5) performs, for example, segmentation, pattern recognition, and the like on the point cloud data of the entire target area AR to extract the point cloud data D2 of the human body M. The point cloud data generation unit 5 executes an extraction process (e.g., segmentation) of extracting the position information of some of the objects from the position information of the objects arranged in the target area AR. Here, it is assumed that the objects arranged in the target area AR include the object M (e.g., the human body) and objects around the object M (e.g., the floor, walls, and background objects). The point cloud data generation unit 5 extracts (e.g., segments, separates) the position information of the object M (e.g., the human body M) from the position information detected by the position detection unit 2.
 Prior to the above extraction process, the position detection unit 2 detects the position information of the objects in the target area AR in each of a first state in which the object M is not arranged in the target area AR and a second state in which the object M is arranged in the target area AR. In the extraction process, the point cloud data generation unit 5 extracts the position information of the object M (e.g., the human body M) by calculating the difference between the detection result of the position detection unit 2 in the first state and the detection result of the position detection unit 2 in the second state. A sketch of such a difference-based extraction is given after this paragraph group.
 The first state may instead be a state in which the object M (e.g., the human body M) is arranged in the target area AR and the position of the object M (e.g., the human body M) in the target area AR differs from that in the second state. For example, the first state may be a state in which the object M (e.g., the human body M) is arranged at a first position in the target area AR, and the second state may be a state in which the object M (e.g., the human body M) is arranged at a second position in the target area that differs from the first position. The above extraction process may be executed by a processing unit other than the point cloud data generation unit 5. This processing unit may be provided in the processing device 3 or in a device external to the processing device 3. The above extraction process need not be executed.
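 A minimal sketch of the difference-based extraction, assuming the first and second states are available as depth maps on the same pixel grid; the threshold diff_thresh is an illustrative parameter, not a value prescribed by the embodiment.

```python
import numpy as np

def extract_object_depth(depth_without, depth_with, diff_thresh=0.05):
    """Keep only pixels whose depth changed between the first state
    (object M absent) and the second state (object M present)."""
    changed = np.abs(depth_with - depth_without) > diff_thresh
    return np.where(changed, depth_with, 0.0)  # 0 marks "no object" pixels
```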
 The processing device 3 may also remove noise from the detection result of the position detection unit 2. The noise removal process includes, for example, a process of removing spatial noise from the depth map generated by the position detection unit 2. For example, when the difference in depth between a first region (e.g., a first pixel) and a second region (e.g., a second pixel) adjacent to each other in the depth map exceeds a threshold, the processing device 3 determines that the depth in the first region or the depth in the second region is noise. When the processing device 3 determines that the depth in the first region is noise, it estimates the depth of the first region, for example by interpolation using the depths of regions around the first region (e.g., a third pixel and a fourth pixel), and removes the spatial noise by updating (replacing, correcting) the depth of the first region with the estimated depth. The point cloud data generation unit 5 may generate the point cloud data based on the depth map from which the spatial noise has been removed.
 The noise removal process also includes, for example, a process of removing temporal noise (e.g., noise that changes over time) from the depth map generated by the position detection unit 2. For example, the position detection unit 2 repeats detection at a predetermined sampling frequency, and the processing device 3 compares detection results (e.g., depth maps) of the position detection unit 2 obtained at different detection timings. For example, the processing device 3 calculates the amount of change in depth (e.g., the amount of temporal change) at each pixel between the depth map of a first frame and the depth map of a second frame following the first frame. When the amount of change in depth at a pixel satisfies a predetermined condition, the processing device 3 determines that the depth of that pixel in the first frame or in the second frame is noise. When the processing device 3 determines that the depth in the second frame is noise, it estimates the depth of the second frame, for example by interpolation using the depth in the first frame and the depth in a third frame following the second frame. The processing device 3 removes the temporal noise by updating (replacing, correcting) the depth of the pixel corresponding to the noise with the estimated depth. The point cloud data generation unit 5 may generate the point cloud data based on the depth map from which the temporal noise has been removed.
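 A minimal sketch of the temporal noise removal, assuming three consecutive depth frames are available; the change threshold and the use of the mean of the first and third frames as the interpolated value are illustrative assumptions.

```python
import numpy as np

def remove_temporal_noise(frame1, frame2, frame3, change_thresh=0.1):
    """Replace second-frame depths flagged as temporal noise with an
    interpolation (here: the mean) of the first and third frames."""
    noisy = np.abs(frame2 - frame1) > change_thresh
    estimated = 0.5 * (frame1 + frame3)
    return np.where(noisy, estimated, frame2)
```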
 Note that the noise removal process need not include the process of removing spatial noise or the process of removing temporal noise. The noise removal process may also be executed by a part other than the processing device 3 (e.g., the position detection unit 2). The detection device 1 need not execute the noise removal process.
 Returning to the description of FIG. 1, the processing device 3 sets a reference position BP associated with the shape of the human body M based on a pattern in which the cross-sectional shape of the human body M changes along a predetermined direction, and determines, with respect to the set reference, the left-half portion and the right-half portion of the human body M. The reference calculation unit 11 sets, as the predetermined direction, a direction (e.g., the intersecting direction, the Z direction) different from both the movement direction (here, the X direction) and the vertical direction (here, the Y direction). The reference calculation unit 11 sets the reference position BP using a pattern (e.g., a cross-sectional feature of the human body M) in which the cross-sectional shape (e.g., contour) of the human body M in a plane orthogonal to the predetermined direction (e.g., the XY plane or a plane parallel to the XY plane) changes along the predetermined direction.
 The reference calculation unit 11 uses, for example, a cross-sectional feature of the human body M in a side view in which the human body M is viewed from the side with respect to the movement direction (e.g., a pattern in which the contour of the human body M in a plane parallel to the XY plane changes in the Z direction) to identify characteristic portions of the human body (e.g., the arms, shoulders, head) and to set the reference position BP. The cross-sectional feature is specified by the amount of change in the position information of points on the surface of the human body M. The position information of a point (here, the point data) is, for example, a set of a coordinate in the X direction (X coordinate), a coordinate in the Y direction (Y coordinate), and a coordinate in the Z direction (Z coordinate). The amount of change in the position information is expressed by at least one of the amount of change in the Y coordinate with respect to the X direction (e.g., dy/dx), the amount of change in the Z coordinate with respect to the X direction (e.g., dz/dx), the amount of change in the X coordinate with respect to the Y direction (e.g., dx/dy), the amount of change in the Z coordinate with respect to the Y direction (e.g., dz/dy), the amount of change in the X coordinate with respect to the Z direction (e.g., dx/dz), and the amount of change in the Y coordinate with respect to the Z direction (e.g., dy/dz). The cross-sectional feature will be described with reference to FIGS. 4(A) to 4(C), FIGS. 5(A) and 5(B), and the like.
 The reference calculation unit 11 executes a reference calculation process for calculating the reference position BP. The reference calculation unit 11 calculates, as the reference position BP, a position at which the amount of change in the position information satisfies a predetermined condition in the intersecting direction (e.g., the Z direction) intersecting the movement direction (here, the X direction) and the vertical direction (here, the Y direction) of the human body M. The reference calculation unit 11 executes the reference calculation process described later based on a detection result (e.g., a one-frame depth image) obtained when the position detection unit 2 detects the human body M at an arbitrary timing.
 The reference position BP includes a first reference position BP1 described later with reference to FIG. 4(B) and a second reference position BP2 described later with reference to FIG. 5(B). The reference position BP is the position used as the reference in the part determination process of determining the first portion M1 and the second portion M2. The amount of change in the position information includes, for example, the amount of change (e.g., the difference) in coordinates between a plurality of points representing the surface of the human body M at an arbitrary timing (time). The amount of change in the position information may also be the slope (e.g., the derivative) of a curve or a straight line representing the surface (e.g., the outer shape, contour, silhouette) of the human body M.
 Next, the reference calculation process will be described. The reference calculation unit 11 estimates, as the reference position BP, the position of a structure (e.g., the head, the torso) of which the object M (e.g., the human body M) contains only one. The reference calculation unit 11 estimates, as the reference position BP, the position of a portion including the center (e.g., the center line CL) of the human body M in the intersecting direction (e.g., the Z direction). The reference calculation unit 11 calculates the reference position based on the point cloud data. The reference calculation unit 11 reads the point cloud data of the human body M generated by the point cloud data generation unit 5 from the storage unit 14 and executes the reference calculation process.
 FIGS. 4(A) to 4(C), 5(A), and 5(B) are diagrams illustrating the processing of the reference calculation unit according to the first embodiment. The reference calculation unit 11 executes a first reference calculation process of calculating a first reference position BP1 (see FIG. 4(B)) as the reference position BP. In the present embodiment, the first reference position BP1 includes a first reference position BP1a on a first side of the object M (e.g., the −Z side, the left half of the body) and a first reference position BP1b on a second side of the object M (e.g., the +Z side, the right half of the body). The reference calculation unit 11 also executes a second reference calculation process of calculating, as the reference position BP, a second reference position BP2 (see FIG. 5(B)) based on the first reference position BP1.
 First, the first reference calculation process will be described. The reference calculation unit 11 estimates, as the first reference position BP1a, the position of a predetermined part of the human body M (e.g., the base of the neck, between the neck and the torso, between a shoulder and the head) based on the characteristics of the change in the position information of the human body M. The reference calculation unit 11 calculates, as the first reference position BP1, a position that satisfies, as the predetermined condition, the condition that the amount of change in the coordinates of the points included in the point cloud data D2 is equal to or greater than a threshold.
 Here, the processing of the reference calculation unit 11 assumes that the object expected as the detection target is a walking human body, and considers the shape (e.g., posture) of the human body at an arbitrary timing. On the human body, the position of the surface changes markedly from the arm (e.g., the elbow) to the shoulder (e.g., the amount of change in the position information is equal to or greater than a threshold). From the shoulder to the base of the neck, the change in the surface position is gradual (e.g., the amount of change in the position information is less than the threshold). From the shoulder to the head, however, the surface position again changes markedly compared with the shoulder portion (e.g., the amount of change in the position information is equal to or greater than a threshold). The reference calculation unit 11 calculates, as the first reference position BP1, a position that satisfies the condition that the amount of change in the position information (e.g., the Y coordinate of the point data) is equal to or greater than a threshold.
 Note that the above arm (arm portion) includes, for example, the scapulohumeral joint and is the portion from the scapulohumeral joint to the fingertips. The above shoulder (shoulder portion) is, for example, the portion between the scapulohumeral joint of the left arm and the scapulohumeral joint of the right arm and on the lower-body side of the seventh cervical vertebra (the base of the neck). The above head includes, for example, the portion surrounded by the skull, the portion on the surface side of the skull, and the neck. The head includes, for example, the seventh cervical vertebra and the crown, and is the portion from the seventh cervical vertebra to the crown.
 The reference calculation unit 11 divides the region in the Z direction and calculates a candidate for the first reference position BP1 for each divided region. In FIG. 4(A), reference symbol DAi denotes a divided region (divided region, partial region). The subscript i is a number assigned to each divided region and is, for example, an integer of 0 or more (i = 0, 1, 2, ...). Zi is the coordinate in the Z direction representing the region DAi (e.g., the coordinate of the center of the region DAi in the Z direction). The reference calculation unit 11 executes the first reference calculation process by treating the point cloud data D2 in the region DAi as a set of point data in a two-dimensional region (e.g., ignoring the value of the Z coordinate). The reference calculation unit 11 executes the first reference calculation process on the assumption that the Z coordinate of the point data included in the region DAi is Zi.
 The reference calculation unit 11 calculates, for the points included in the point cloud data D2, the maximum value of the coordinate in the vertical direction (Y direction) at each coordinate in the intersecting direction (e.g., the Z direction). The reference calculation unit 11 calculates the maximum value of the Y coordinate of the point data for each divided region DAi. Ymax(Zi) in FIG. 4(A) is the maximum value of the Y coordinate of the point data in the region DAi. The reference calculation unit 11 takes Ymax(Zi) as the maximum value of the Y coordinate of the point data at each coordinate (Zi). The reference calculation unit 11 repeats the process of calculating Ymax(Zi) while changing i, and calculates Ymax(Zi) for each of i = 0, 1, 2, ....
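 A minimal sketch of this per-slice maximum, assuming the point cloud data D2 is an N x 3 array of (X, Y, Z) coordinates; the slice width dz is an illustrative parameter.

```python
import numpy as np

def ymax_per_z_slice(points, dz=0.02):
    """For each Z slice DA_i of width dz, return (Z_i, Ymax(Z_i)):
    the slice centre and the maximum Y coordinate of the points in it."""
    z0 = points[:, 2].min()
    idx = ((points[:, 2] - z0) / dz).astype(int)   # slice index i for each point
    z_centres, ymax = [], []
    for i in np.unique(idx):
        in_slice = points[idx == i]
        z_centres.append(z0 + (i + 0.5) * dz)      # Z_i
        ymax.append(in_slice[:, 1].max())          # Ymax(Z_i)
    return np.array(z_centres), np.array(ymax)
```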
 FIG. 4(B) shows a plot of Ymax(Zi) against the coordinate in the intersecting direction (Z coordinate). Here, the amount of change in the position information with respect to the coordinate in the intersecting direction (Z coordinate) is denoted by ΔYmax(Zi). ΔYmax(Zi) is expressed, for example, by the following equation (1).

 ΔYmax(Zi) = Ymax(Zi+1) − Ymax(Zi)   ... Equation (1)
 FIG. 4(C) shows a plot of the absolute value of ΔYmax(Zi) against the coordinate in the intersecting direction (Z coordinate). In FIGS. 4(B) and 4(C), reference symbol Q1 denotes the portion corresponding to the arms of the human body M; where appropriate, the portion corresponding to the left arm is denoted by Q1a and the portion corresponding to the right arm by Q1b. Reference symbol Q2 denotes the portion corresponding to the shoulders of the human body M; where appropriate, the portion corresponding to the left shoulder is denoted by Q2a and the portion corresponding to the right shoulder by Q2b. Reference symbol Q3 denotes the portion corresponding to the head of the human body M.
 The absolute value |ΔYmax(Zi)| of ΔYmax(Zi), which is the amount of change in the position information, is large in the arm portion Q1 and smaller in the shoulder portion Q2 than in the arm portion Q1. |ΔYmax(Zi)| then increases sharply from the shoulder portion Q2 toward the head portion Q3. The reference calculation unit 11 calculates the reference position BP (e.g., the first reference position BP1a) based on such characteristics (e.g., the change pattern) of the amount of change in the position information. In the reference calculation process, the reference calculation unit 11 sets (e.g., identifies, specifies, determines) the portion in which |ΔYmax(Zi)| is equal to or less than a first threshold V1 as a first characteristic portion (e.g., the left shoulder portion Q2a, the right shoulder portion Q2b).
 First, the process of calculating the first reference position BP1a on the first side (e.g., the −Z side, the left-half side) of the human body M will be described. The reference calculation unit 11 compares |ΔYmax(Zi)| with the first threshold V1 in order from one side (e.g., the first side, the −Z side) toward the opposite, other side (e.g., the second side, the +Z side) in the intersecting direction (e.g., the Z direction). The reference calculation unit 11 sets (e.g., specifies, identifies) the position at which the magnitude relationship between |ΔYmax(Zi)| and the first threshold V1 switches as the boundary between the left-arm portion Q1a and the left-shoulder portion Q2a (e.g., the end of the arm, the end of the shoulder). For example, when |ΔYmax(Zi)| is greater than the first threshold V1 and |ΔYmax(Zi+1)| is equal to or less than the first threshold V1, the reference calculation unit 11 sets the position of Zi as the +Z-side end of the left-arm portion Q1 and sets the position of Zi+1 as the −Z-side end of the left-shoulder portion Q2.
 The reference calculation unit 11 also sets, on the other side (e.g., the second side, the +Z side) of the first characteristic portion (e.g., the left shoulder portion Q2a) of the human body M, the portion in which |ΔYmax(Zi)| is equal to or greater than a second threshold V2 as a second characteristic portion of the human body M (e.g., the head portion Q3). For example, the reference calculation unit 11 compares |ΔYmax(Zi)| with the second threshold V2 in order from the position set as the −Z-side end of the first characteristic portion (e.g., the left shoulder portion Q2a) toward the other side (e.g., the +Z side). The reference calculation unit 11 sets (e.g., specifies, identifies, determines) the position at which the magnitude relationship between |ΔYmax(Zi)| and the second threshold V2 switches as the boundary between the first characteristic portion (e.g., the left shoulder portion Q2a) and the second characteristic portion (e.g., the head portion Q3) (e.g., the base of the left shoulder, the −Z-side end of the head, the neck, the base of the neck). For example, when ΔYmax(Zi) is smaller than the second threshold V2 and ΔYmax(Zi+1) is equal to or greater than the second threshold V2, the reference calculation unit 11 sets the position of Zi as the +Z-side end of the first characteristic portion (the left shoulder portion Q2) and sets the position of Zi+1 as the −Z-side end of the second characteristic portion (the head portion Q3).
 Further, the reference calculation unit 11 calculates, as the reference position, the first reference position BP1 corresponding to the head of the human body based on the amount of change in the position information from the arm to the head of the human body M. The reference calculation unit 11 calculates, as the first reference position BP1, a position representing the boundary between the first characteristic portion (e.g., the left shoulder portion Q2a) and the second characteristic portion (the head portion Q3). For example, the reference calculation unit 11 calculates, as the first reference position BP1a, the position of the +Z-side end of the first characteristic portion (the left shoulder portion Q2a).
 Next, the reference calculation unit 11 calculates the first reference position BP1b in the same manner as the first reference position BP1a. In the process of calculating the first reference position BP1b, the reference calculation unit 11 sets the first characteristic portion and the second characteristic portion in order from the other side (e.g., the +Z side) of the human body M toward the opposite side (e.g., the −Z side). The reference calculation unit 11 then calculates, as the first reference position BP1b, the position of the −Z-side end of the first characteristic portion (the right shoulder portion Q2b). The reference calculation unit 11 stores the calculated first reference position BP1 (e.g., three-dimensional coordinate values) in the storage unit 14 of FIG. 1.
 Note that the reference calculation unit 11 may use the position of the −Z-side end of the second characteristic portion (the head portion Q3) as the first reference position BP1a and the position of the +Z-side end of the second characteristic portion (the head portion Q3) as the first reference position BP1b. The reference calculation unit 11 may also use a position (e.g., the center) between the +Z-side end of the first characteristic portion (the left shoulder portion Q2a) and the −Z-side end of the second characteristic portion (the head portion Q3) as the first reference position BP1a. In this case, the reference calculation unit 11 may use a position (e.g., the center) between the −Z-side end of the first characteristic portion (the right shoulder portion Q2b) and the +Z-side end of the second characteristic portion (the head portion Q3) as the first reference position BP1b. The reference calculation unit 11 may also calculate only one of the first reference position BP1a and the first reference position BP1b as the first reference position BP1.
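 A minimal sketch of the threshold scan described above, assuming Ymax(Zi) has already been computed per Z slice (for example with the earlier sketch); the function name, the return convention (a slice index rather than a coordinate), and the failure handling are illustrative, while V1 and V2 correspond to the first and second thresholds.

```python
import numpy as np

def first_reference_index(ymax, v1, v2):
    """Scan |dYmax| from the -Z side: first find where it drops to <= V1
    (arm -> shoulder boundary), then where it rises to >= V2
    (shoulder -> head boundary).  Returns the slice index of BP1a,
    or None if the pattern is not found."""
    dy = np.abs(np.diff(ymax))            # |dYmax(Z_i)| = |Ymax(Z_i+1) - Ymax(Z_i)|
    i = 0
    while i + 1 < len(dy) and not (dy[i] > v1 and dy[i + 1] <= v1):
        i += 1                             # still in the arm portion
    for k in range(i + 1, len(dy) - 1):
        if dy[k] < v2 and dy[k + 1] >= v2:
            return k                       # +Z-side end of the shoulder portion -> BP1a
    return None
```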
 Next, the second reference calculation process will be described with reference to FIGS. 5(A) and 5(B). The reference calculation unit 11 selects (e.g., sets, determines), as the reference position BP, the second reference position BP2 from a predetermined region AR2 (e.g., the region in the Z direction corresponding to the head portion Q3) set with respect to the first reference position BP1 shown in FIG. 4(B). As described with reference to FIGS. 4(B) and 4(C), the reference calculation unit 11 calculates the first reference position BP1a on the first side (e.g., the −Z side) of the object M and the first reference position BP1b on the second side (e.g., the +Z side). The reference calculation unit 11 sets the region between the first reference position BP1a on the first side and the first reference position BP1b on the second side as the predetermined region (e.g., the region including the reference position). The center line CL of the human body M in the Z direction is assumed to pass through the region between the first reference position BP1a and the first reference position BP1b (the predetermined region, the head portion Q3). The reference calculation unit 11 calculates, as the second reference position BP2, a position satisfying a predetermined condition within the region between the first reference position BP1a and the first reference position BP1b.
 In the second reference calculation process, the reference calculation unit 11 divides the region in the Z direction in the same manner as in FIG. 4(A) and calculates a candidate for the second reference position BP2 for each divided region. The reference calculation unit 11 executes the second reference calculation process by treating the point cloud data D2 in the divided region DAi as a set of point data in a two-dimensional region (e.g., ignoring the value of the Z coordinate).
 In the second reference calculation process, the reference calculation unit 11 calculates, for the points included in the point cloud data, the local minimum of the coordinate, taken as positive toward the front (+X side) of the movement direction (X direction), at each coordinate (e.g., Z = Zi) in the intersecting direction (e.g., the Z direction). First, the reference calculation unit 11 extracts the point data on the front side of the human body M (the front side in the traveling direction, the +X side in the X direction). For example, the reference calculation unit 11 divides the region in the Y direction and calculates the maximum value of the X coordinate of the point data for each divided region.
 In FIG. 5(A), reference symbol DBj denotes a divided region (divided region, partial region). The subscript j is a number assigned to each divided region and is, for example, an integer of 0 or more (j = 0, 1, 2, ...). Yj is the coordinate in the Y direction representing the region DBj (e.g., the coordinate of the center of the region DBj in the Y direction). The reference calculation unit 11 executes the second reference calculation process by treating the point cloud data D2 in the region DBj as a set of point data in a one-dimensional region (e.g., ignoring the value of the Y coordinate). The reference calculation unit 11 executes the second reference calculation process on the assumption that the Y coordinate of the point data included in the region DBj is Yj.
 Xmax(Yj) in FIG. 5(A) is the maximum value of the X coordinate of the point data in the region DBj. The reference calculation unit 11 takes Xmax(Yj) as the maximum value of the X coordinate of the point data at each coordinate (Yj). Xmax(Yj) corresponds to the X coordinate of the point data on the front side in the traveling direction (X direction) at each coordinate (Yj) in the vertical direction (Y direction). The reference calculation unit 11 repeats the process of calculating Xmax(Yj) while changing j, and calculates Xmax(Yj) for each of j = 0, 1, 2, ....
 The reference calculation unit 11 calculates the local minimum of Xmax(Yj) with respect to the Y coordinate based on the distribution of the maximum value Xmax(Yj) of the X coordinate of the point data against the Y coordinate (Yj). Xmin(Zi) in FIG. 5(A) is the local minimum of Xmax(Yj) with respect to the Y coordinate in the region DAi whose Z coordinate is Zi (Z = Zi in the figure), and Yk is the Y coordinate of the point data corresponding to Xmin(Zi). The position giving Xmin(Zi) is, for example, above the shoulder of the human body M and below the chin.
 The reference calculation unit 11 repeats the process of calculating Xmin(Zi) while changing i, and calculates Xmin(Zi) for each of i = 0, 1, 2, .... FIG. 5(B) shows a plot representing the distribution of Xmin(Zi) against the Z coordinate. The reference calculation unit 11 selects the second reference position BP2 based on the amount of change in the local minimum (Xmin(Zi)) with respect to the coordinate in the intersecting direction (e.g., the Z direction). When the neck of the human body M is approximated by an elliptic cylinder, the surface of the neck is convex toward the front in the traveling direction, and the position of its tip corresponds to the position of the center line CL of the human body M. The reference calculation unit 11 calculates the Z coordinate (Zc in FIG. 5(B)) at which Xmin(Zi) with respect to the Z coordinate is a local maximum (or the maximum).
 The reference calculation unit 11 takes Xmin(Zc) as the X coordinate of the second reference position BP2, takes the Y coordinate corresponding to Xmin(Zc) (Yk in FIG. 5(A)) as the Y coordinate of the second reference position BP2, and takes Zc as the Z coordinate of the second reference position BP2. As described above, the reference calculation unit 11 according to the present embodiment executes the reference calculation process by treating the point cloud data of each divided region DAi as two-dimensional data, so that, for example, the processing load can be reduced. Note that the reference calculation unit 11 may execute the reference calculation process by treating the point cloud data as three-dimensional data.
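 A minimal sketch of this second reference calculation under the same array assumptions as above: for each Z slice between the first reference positions, the front-most X coordinate is taken per Y slice, its minimum over Y gives Xmin(Zi) and Yk, and BP2 is taken at the slice where Xmin(Zi) is largest. The slice widths and the restriction of the Z range to [z_lo, z_hi] are parameters of this illustration, not values prescribed by the embodiment.

```python
import numpy as np

def second_reference_position(points, z_lo, z_hi, dz=0.02, dy=0.02):
    """Estimate BP2 = (X, Y, Z) inside the Z range [z_lo, z_hi]
    (the region between BP1a and BP1b)."""
    candidates = []  # (Xmin(Z_i), Yk, Z_i) for each Z slice
    for zc in np.arange(z_lo, z_hi, dz):
        in_z = points[np.abs(points[:, 2] - zc) < dz / 2]
        if len(in_z) == 0:
            continue
        best = None  # (Xmax(Y_j), Y_j) with the smallest Xmax over the Y slices
        for yc in np.arange(in_z[:, 1].min(), in_z[:, 1].max(), dy):
            in_y = in_z[np.abs(in_z[:, 1] - yc) < dy / 2]
            if len(in_y) == 0:
                continue
            xmax = in_y[:, 0].max()              # Xmax(Y_j): front-most point
            if best is None or xmax < best[0]:
                best = (xmax, yc)                # running Xmin(Z_i) and Yk
        if best is not None:
            candidates.append((best[0], best[1], zc))
    if not candidates:
        return None
    # BP2 is the candidate whose Xmin(Z_i) is largest (Z = Zc)
    x, y, z = max(candidates, key=lambda c: c[0])
    return np.array([x, y, z])
```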
 As described above, the reference calculation unit 11 according to the present embodiment calculates the reference position BP (e.g., the second reference position BP2) using the outer-shape features on the front side of the human body M. Since, in a typical human body, the change in shape from the chin to the throat and the chest is pronounced, the reference calculation unit 11 can calculate the second reference position BP2 with high accuracy. Note that the reference calculation unit 11 may calculate the reference position BP using the outer-shape features on the rear side of the human body M (e.g., the rear side in the traveling direction, the back side).
 Returning to the description of FIG. 1, the reference calculation unit 11 stores the calculated reference position BP in the storage unit 14. The part determination unit 12 executes a part determination process. In the part determination process, the part determination unit 12 determines (e.g., judges, identifies, specifies) a first portion M1 (e.g., at least a part of the left half of the body) and a second portion M2 (e.g., at least a part of the right half of the body) of the object M (the human body M). The first portion M1 is the part of the object M arranged on a first side (e.g., the −Z side) with respect to a reference plane BF described later. The second portion M2 is the part of the object M arranged on a second side (e.g., the +Z side) opposite to the first side with respect to the reference plane BF. The reference plane BF is a plane including the movement direction (X direction) and the vertical direction (Y direction). The reference plane BF is, for example, a plane parallel to the movement direction (X direction) and the vertical direction (Y direction) and including the reference position BP (the second reference position BP2).
 The part determination unit 12 executes the part determination process based on the reference position BP calculated by the reference calculation unit 11. The part determination unit 12 sets the reference plane BF based on the reference position BP calculated by the reference calculation unit 11. The part determination unit 12 reads the reference position BP calculated by the reference calculation unit 11 from the storage unit 14 and sets the reference plane BF. The part determination unit 12 sets, as the reference plane BF, the plane parallel to the X direction and the Y direction and including the second reference position BP2.
 The part determination unit 12 then executes the part determination process based on the point cloud data generated by the point cloud data generation unit 5. The part determination unit 12 reads the point cloud data generated by the point cloud data generation unit 5 from the storage unit 14 and determines whether the part represented by one or more point data included in the point cloud data is arranged on the first side (e.g., the −Z side) or the second side (e.g., the +Z side) with respect to the reference plane BF.
 In the present embodiment, the second reference position BP2 is the position estimated by the reference calculation unit 11 as the position of a point on the center line CL of the human body M, and the part determination unit 12 executes the part determination process using the second reference position BP2 as the reference position BP. The part determination unit 12 determines a part arranged on the first side (e.g., the −Z side) with respect to the second reference position BP2 as the first portion M1, and determines a part arranged on the second side (e.g., the +Z side) with respect to the second reference position BP2 as the second portion M2.
 FIG. 6 is a diagram illustrating the processing of the part determination unit according to the first embodiment. In FIG. 6, reference symbol QX denotes the part (e.g., a point, a region) that is the target of the part determination process. In the following description, the part that is the target of the part determination process is referred to as the determination target part. The determination target part QX is set in advance. For example, the determination target part QX is set to a part of the human body M (e.g., a hand, a foot), and the part determination unit 12 determines whether the determination target part QX (e.g., a foot) belongs to the left half of the body (e.g., the left foot) or the right half of the body (e.g., the right foot). The determination target part QX may also be set to a plurality of characteristic parts of the human body M (e.g., the whole body). In this case, the part determination unit 12 may set each part of the human body M in turn as the determination target part QX and execute the part determination process for each part of the human body M.
 In the part determination process, the part determination unit 12 calculates a geodesic line SL connecting the reference position BP and the determination target part QX of the object M (the human body M). The geodesic line SL is the shortest line connecting two points along the surface of the object M (the human body M). The part determination unit 12 calculates the geodesic line SL based on the shape of the surface of the human body M obtained from the point cloud data. The part determination unit 12 uses the second reference position BP2 as the reference position BP and calculates the geodesic line SL connecting the second reference position BP2 and the determination target part QX. When the determination target part QX is a region, the part determination unit 12 calculates the geodesic line SL connecting a point selected from the determination target part QX (e.g., the center of the determination target part QX) and the second reference position BP2.
 The part determination unit 12 determines the first portion M1 and the second portion M2 based on the relative position between the geodesic line SL and the reference plane BF. First, the case in which the entire geodesic line SL is arranged on the first side (e.g., the −Z side) or the second side (e.g., the +Z side) with respect to the reference plane BF will be described. When the entire geodesic line SL is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF, the part determination unit 12 specifies, as the relative position between the geodesic line SL and the reference plane BF, that the geodesic line SL is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF. When the entire geodesic line SL is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF, the part determination unit 12 specifies, as the relative position between the geodesic line SL and the reference plane BF, that the geodesic line SL is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF.
 Next, the case in which part of the geodesic line SL is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF and part of the geodesic line SL is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF will be described. The part determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on a feature amount (first feature amount) of the portion of the geodesic line SL arranged on the first side (e.g., the −Z side) with respect to the reference plane BF and a feature amount (second feature amount) of the portion of the geodesic line SL arranged on the second side (e.g., the +Z side) with respect to the reference plane BF. The part determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on the magnitude relationship or the ratio between the first feature amount and the second feature amount.
 The above feature amount is, for example, the distance between a point on the geodesic line SL and the reference plane BF. The part determination unit 12 specifies the relative position between the geodesic line SL and the reference plane BF based on the distance (first feature amount) between the reference plane BF and a point on the geodesic line SL arranged on the first side (e.g., the −Z side) with respect to the reference plane BF, and the distance (second feature amount) between the reference plane BF and a point on the geodesic line SL arranged on the second side (+Z side) with respect to the reference plane BF.
 For example, the part determination unit 12 calculates, for a plurality of points (a first point set) on the geodesic line SL arranged on the first side (e.g., the −Z side) with respect to the reference plane BF, the average (first feature amount) of the distances between each point and the reference plane BF. The part determination unit 12 also calculates, for a plurality of points (a second point set) on the geodesic line SL arranged on the second side (e.g., the +Z side) with respect to the reference plane BF, the average (second feature amount) of the distances between each point and the reference plane BF. When the first feature amount is larger than the second feature amount, the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF. When the second feature amount is larger than the first feature amount, the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF. The part determination unit 12 may also specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
 When setting (generating) the first point set and the second point set, the part determination unit 12 selects a plurality of points on the geodesic line SL. The part determination unit 12 may select the plurality of points regularly (e.g., at predetermined intervals) or irregularly (e.g., randomly). Then, for each selected point, the part determination unit 12 classifies the point into the first point set when it is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF and into the second point set when it is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF, thereby setting the first point set and the second point set.
 The part determination unit 12 may set the first point set and the second point set so that the number of points belonging to the first point set is the same as the number of points belonging to the second point set. In this case, the part determination unit 12 may use the sum of the distances instead of the average of the distances as the feature amount. For example, the part determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF using the sum of the distances for the first point set as the first feature amount and the sum of the distances for the second point set as the second feature amount.
 The first feature amount may also be the distance between the reference plane BF and the point on the geodesic line SL farthest from the reference plane BF toward the first side (e.g., the −Z side). The second feature amount may be the distance between the reference plane BF and the point on the geodesic line SL farthest from the reference plane BF toward the second side (the +Z side). When the first feature amount is larger than the second feature amount (the distance is longer), the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF. When the second feature amount is larger than the first feature amount (the distance is longer), the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF. The part determination unit 12 may also specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
 The first feature amount may also be the length of the portion of the geodesic line SL arranged on the first side (e.g., the −Z side) with respect to the reference plane BF. The second feature amount may be the length of the portion of the geodesic line SL arranged on the second side (e.g., the +Z side) with respect to the reference plane BF. When the first feature amount is larger (or longer) than the second feature amount, the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the first side (e.g., the −Z side) with respect to the reference plane BF. When the second feature amount is larger (or longer) than the first feature amount, the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the second side (e.g., the +Z side) with respect to the reference plane BF. The part determination unit 12 may also specify the relative position between the geodesic line SL and the reference plane BF by comparing the ratio between the first feature amount and the second feature amount with a threshold.
 The part determination unit 12 also need not set (generate) the first point set and the second point set, and may specify the relative position between the geodesic line SL and the reference plane BF without using the first feature amount and the second feature amount. For example, the part determination unit 12 selects a plurality of points on the geodesic line SL regularly (e.g., at predetermined intervals) or irregularly (e.g., randomly) and calculates, for each selected point, the distance to the reference plane BF. The part determination unit 12 expresses the distance to the reference plane BF as a negative value for each selected point arranged on one side (e.g., the first side, the −Z side) with respect to the reference plane BF, and as a positive value for each selected point arranged on the other side (e.g., the second side, the +Z side) with respect to the reference plane BF. The part determination unit 12 then calculates, for the plurality of points on the geodesic line SL, the sum of the distances between each point and the reference plane BF, each expressed as a positive value, a negative value, or 0. When the calculated sum of the distances is a negative value, the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the one side (e.g., the first side, the −Z side) with respect to the reference plane BF. When the calculated sum of the distances is a positive value, the part determination unit 12 specifies that the relative position is one in which the geodesic line SL is arranged on the other side (e.g., the second side, the +Z side) with respect to the reference plane BF.
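 A minimal sketch of this signed-distance variant, assuming the geodesic line SL is available as an ordered array of sampled three-dimensional points and that the reference plane BF is the plane Z = z_bp2 passing through the second reference position; the function name and return labels are illustrative.

```python
import numpy as np

def classify_by_geodesic(geodesic_points, z_bp2):
    """Sum the signed Z offsets of sampled geodesic points from the
    reference plane BF (Z = z_bp2).  A negative sum indicates the -Z side
    (first portion M1); a positive sum indicates the +Z side (second portion M2)."""
    signed = geodesic_points[:, 2] - z_bp2   # negative on the -Z side, positive on the +Z side
    total = signed.sum()
    if total < 0:
        return "M1"   # determination target part belongs to the first portion
    if total > 0:
        return "M2"   # determination target part belongs to the second portion
    return "undetermined"
```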
 なお、部分判別部12は、測地線SLにおける所定の部分と基準面BFとの相対位置に基づいて、測地線SLと基準面BFとの相対位置を特定してもよい。上記の所定の部分は、測地線SLにおいて基準面BFから最も離れた点(最遠点)でもよい。部分判別部12は、上記最遠点が基準面BFに対して第1側(例、-Z側)に配置される場合、測地線SLが基準面BFに対して第1側(例、-Z側)に配置される相対位置である特定する。また、部分判別部12は、上記最遠点が基準面BFに対して第2側(例、+Z側)に配置される場合、測地線SLが基準面BFに対して第2側(例、+Z側)に配置される相対位置であると特定する。上記所定の部分は、最遠点以外でもよく、例えば、測地線SLにおいて判別対象部分QXよりも第2基準位置BP2に近い部分(例、第2基準位置BP2を起点とする所定の長さの部分)でもよい。部分判別部12は、測地線SLのうち第2基準位置BP2を起点として所定の長さの部分が第2側(+Z側)に延びる場合に、判別対象部分QXが第2部分M2に属すると判別してもよい。部分判別部12は、上述の手法を組み合わせて、測地線SLと基準面BFとの相対位置を特定してもよい。 Note that the partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF based on the relative position between a predetermined part of the geodesic line SL and the reference plane BF. The above-mentioned predetermined portion may be a point (farthest point) farthest from the reference plane BF on the geodesic line SL. When the farthest point is located on the first side (eg, −Z side) with respect to the reference plane BF, the partial determination unit 12 determines that the geodesic line SL is on the first side (eg, −− side) with respect to the reference plane BF. The relative position located on the (Z side) is specified. When the farthest point is located on the second side (eg, + Z side) with respect to the reference plane BF, the partial determination unit 12 determines that the geodesic line SL is on the second side (eg, with respect to the reference plane BF). (+ Z side). The predetermined portion may be other than the farthest point. For example, a portion closer to the second reference position BP2 than the determination target portion QX in the geodesic line SL (for example, a portion having a predetermined length starting from the second reference position BP2). Part). When the portion having a predetermined length from the second reference position BP2 of the geodesic line SL extends to the second side (+ Z side) in the geodesic line SL, the portion determination unit 12 determines that the determination target portion QX belongs to the second portion M2. It may be determined. The partial determination unit 12 may specify the relative position between the geodesic line SL and the reference plane BF by combining the above methods.
In FIG. 6, the geodesic line SL extends toward the +Z side from the second reference position BP2 and reaches the determination target portion QX via the +Z side of the reference plane BF. In such a case, the partial determination unit 12 determines that the determination target portion QX belongs to the second portion M2 (e.g., the right half of the body). Here, assume a state in which the left foot and the right foot cross or are aligned on the same line. In this state, the right foot is located at the same position as, or to the left of, the left foot in the shoulder-width direction of the human body, so it may be difficult for a conventional device to automatically determine whether a detected foot is the right foot or the left foot. Even when the left foot and the right foot cross or are aligned on the center line of the human body, the detection device 1 according to the embodiment uses, for example, the relative position between the geodesic line SL and the reference plane BF; since at least a part of the geodesic line SL is arranged on the +Z side or the -Z side of the reference plane BF, the detection device 1 can automatically determine with high accuracy whether the detected foot is the left foot or the right foot.
Returning to the description of FIG. 1, as described above, the partial determination unit 12 executes the partial determination processing on at least a part of the object M (human body M) represented by the point cloud data and stores the processing result in the storage unit 14. For example, the partial determination unit 12 executes the partial determination processing for each portion of the object M (human body M) represented by the point cloud data, and stores in the storage unit 14, as the processing result, determination information (e.g., a flag, attribute information) indicating whether the portion belongs to the first portion M1 or the second portion M2.
The posture estimating unit 13 then executes posture estimation processing that estimates the posture of the object M based on the first portion M1 and the second portion M2 determined by the partial determination unit 12. The partial determination unit 12 generates, for example, position information of characteristic portions (e.g., feature portions, feature points) of the human body M. A characteristic portion of the human body M is, for example, a portion that can be distinguished from other portions of the human body M. The characteristic portions of the human body M include, for example, at least one of an extremity of the human body (a hand tip, a foot tip, the head), a joint, and an intermediate portion between an extremity and a joint or between two joints.
The posture estimating unit 13 executes, for example, recognition processing (e.g., pattern recognition, shape recognition, skeleton recognition) on the shape of the human body M obtained from the point cloud data. Through the recognition processing, the posture estimating unit 13 generates the position information of the above characteristic portions. The position information of a characteristic portion includes, for example, the coordinates (e.g., three-dimensional coordinates) of a point representing the characteristic portion. The posture estimating unit 13 calculates the coordinates of the points representing the characteristic portions by the recognition processing, and stores information on the specified portions (e.g., the coordinates of the points representing the characteristic portions) in the storage unit 14.
FIG. 7 is a diagram showing processing of the posture estimating unit according to the first embodiment. In FIG. 7, reference signs Q11 to Q30 denote characteristic portions of the human body M specified by the posture estimating unit 13. Reference signs Q11 to Q15 are characteristic portions corresponding to extremities: Q11 is the head, Q12 is the left foot tip, Q13 is the left hand tip, Q14 is the right foot tip, and Q15 is the right hand tip. Reference signs Q16 to Q27 are characteristic portions corresponding to joints: Q16 is the left ankle, Q17 is the left knee, Q18 is the left hip joint (base of the left leg), Q19 is the right ankle, Q20 is the right knee, Q21 is the right hip joint (base of the right leg), Q22 is the left wrist, Q23 is the left elbow, Q24 is the left scapulohumeral joint, Q25 is the right wrist, Q26 is the right elbow, and Q27 is the right scapulohumeral joint. Reference signs Q28 to Q30 are characteristic portions corresponding to intermediate portions between an extremity and a joint or between two joints: Q28 is the waist (the center of the left and right hip joints), Q29 is the neck (the center of the left and right scapulohumeral joints), and Q30 is the back (the center between the waist Q28 and the neck Q29).
Since the partial determination unit 12 can discriminate between the left-half and right-half portions of the body with high accuracy, the posture estimating unit 13 of this embodiment can specify the characteristic portions of the human body M (e.g., the left ankle and the right ankle) with high accuracy. As a result of the recognition processing, the posture estimating unit 13 generates posture information (e.g., skeleton information, skeleton data) in which the position of each characteristic portion, information on each characteristic portion (e.g., attribute information, names such as right knee and right hip joint), and connection information indicating the connection relationship between two characteristic portions are grouped as one set. The posture estimating unit 13 stores the posture information in the storage unit 14 of FIG. 1.
The posture estimating unit 13 estimates the posture of the human body M based on the posture information. Based on the relative position (e.g., the angle) between a first line connecting a pair of characteristic portions having a connection relationship and a second line connected to the first line and connecting a pair of characteristic portions having a connection relationship, the posture estimating unit 13 estimates the posture of the portion including the first line and the second line. Here, let the first line be the right shin Q31 connecting the right ankle Q19 and the right knee Q20, and let the second line be the right thigh Q32 connecting the right knee Q20 and the right hip joint Q21. The posture estimating unit 13 calculates the angle α formed by the right shin Q31 and the right thigh Q32. When the angle α formed by the right shin Q31 and the right thigh Q32 is equal to or smaller than a threshold, the posture estimating unit 13 determines that the right leg of the human body M, including the right shin Q31 and the right thigh Q32, is in a bent posture. When the angle α is larger than the threshold, the posture estimating unit 13 determines that the right leg of the human body M, including the right shin Q31 and the right thigh Q32, is in an extended posture.
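For reference, the bent/extended decision described above can be illustrated by the following non-limiting Python sketch. The joint coordinates, the interpretation of α as the interior angle at the knee, and the 150-degree threshold are illustrative assumptions only.

```python
import numpy as np

def knee_angle(ankle, knee, hip):
    """Interior angle alpha (degrees) at the knee between the shin (knee->ankle)
    and the thigh (knee->hip) segments."""
    shin = np.asarray(ankle, float) - np.asarray(knee, float)
    thigh = np.asarray(hip, float) - np.asarray(knee, float)
    cos_a = np.dot(shin, thigh) / (np.linalg.norm(shin) * np.linalg.norm(thigh))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

def right_leg_posture(q19_ankle, q20_knee, q21_hip, threshold_deg=150.0):
    """Bent posture if alpha is at or below the threshold, extended posture otherwise."""
    alpha = knee_angle(q19_ankle, q20_knee, q21_hip)
    return "bent" if alpha <= threshold_deg else "extended"
```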
The posture estimating unit 13 estimates the posture of a part or the whole of the human body M by estimating the posture as described above for, for example, each characteristic portion set in advance by the user. Note that the posture estimating unit 13 may estimate the posture by referring to posture definition information that defines postures of an object (e.g., a human body). The posture definition information is, for example, information in which information representing a posture type and information defining the relative positions of the characteristic portions are grouped as one set. The information representing the posture type is, for example, the name of a posture, such as a walking posture, a sitting position, or the name of a yoga pose. The information defining the relative positions of the characteristic portions is, for example, a range or a threshold of the angle α formed by the right shin Q31 and the right thigh Q32. The posture estimating unit 13 may estimate (identify) the posture of the human body M by comparing the generated posture information against the posture definition information.
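As a further non-limiting sketch, the comparison against posture definition information could take the following form; the posture names and angle ranges in the dictionary are illustrative assumptions.

```python
# Hypothetical posture definition information: posture name -> allowed range of
# the shin-thigh angle alpha in degrees. The entries are illustrative only.
POSTURE_DEFINITIONS = {
    "walking (leg extended)": (150.0, 180.0),
    "sitting (leg bent)": (60.0, 120.0),
}

def identify_posture(alpha_deg, definitions=POSTURE_DEFINITIONS):
    """Return the names of all defined postures whose angle range contains alpha."""
    return [name for name, (lo, hi) in definitions.items() if lo <= alpha_deg <= hi]
```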
Next, a detection method according to the embodiment will be described based on the configuration of the detection device 1 described above. FIG. 8 is a flowchart showing the detection method according to the first embodiment. For the configuration of the detection device 1 and the processing of each unit, FIGS. 1 to 7 are referred to as appropriate. In step S1, the position detection unit 2 detects the position information of each point on the surface of the object M. The position detection unit 2 detects depth as the position information and generates, as the detection result, point cloud data representing the position information (see FIG. 3). The processing of step S1 includes the processing of step S2 and the processing of step S3. In step S2, the detection unit 4 detects the depth from a predetermined point (viewpoint) to each point in the target area AR and stores a depth map (the spatial distribution of depth) representing the detection result in the storage unit 14. In step S3, the point cloud data generation unit 5 generates point cloud data based on the depth detection result and stores the generated point cloud data in the storage unit 14.
In step S4, the reference calculation unit 11 calculates, as the reference position BP, a position where the amount of change in the position information satisfies a predetermined condition. The reference calculation unit 11 calculates a first reference position BP1 and a second reference position BP2 as the reference position BP. The processing of step S4 includes the processing of step S5 and the processing of step S6.
In step S5, the reference calculation unit 11 calculates the first reference position BP1 based on the amount of change in the coordinate (position information) in the vertical direction (Y direction) (see FIG. 4), and stores the calculated first reference position BP1 in the storage unit 14. In step S6, the reference calculation unit 11 calculates the second reference position BP2 based on the first reference position BP1 (see FIG. 5). The reference calculation unit 11 sets a predetermined region AR2 based on the first reference position BP1 calculated in step S5, calculates a position satisfying a predetermined condition within the predetermined region AR2 as the second reference position BP2, and stores the calculated second reference position BP2 in the storage unit 14.
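For reference, the following non-limiting Python sketch illustrates one way steps S5 and S6 could be realized on point cloud data, using the criteria described for the first and second reference positions (a jump in the per-Z maximum of the vertical coordinate, and the Z coordinate at which the minimum coordinate in the moving direction is largest within the region between the two first reference positions). The bin width, thresholds, and axis ordering (X: moving direction, Y: vertical, Z: intersecting direction) are illustrative assumptions.

```python
import numpy as np

def first_reference_positions(points, z_bin=0.02, jump_threshold=0.15):
    """Z coordinates where the per-bin maximum of the vertical (Y) coordinate jumps
    by at least jump_threshold between adjacent Z bins (candidate BP1 positions).
    points: (N, 3) array ordered as (X: moving direction, Y: vertical, Z: crossing)."""
    z = points[:, 2]
    bins = np.arange(z.min(), z.max() + z_bin, z_bin)
    idx = np.digitize(z, bins)
    y_max = np.array([points[idx == i, 1].max() if np.any(idx == i) else np.nan
                      for i in range(1, len(bins))])
    jumps = np.abs(np.diff(y_max))
    return bins[1:-1][jumps >= jump_threshold]

def second_reference_position(points, z_lo, z_hi, z_bin=0.02):
    """Within the region between two BP1 candidates, pick the Z whose minimum X
    (forward of the moving direction taken as positive) is largest (candidate BP2)."""
    region = points[(points[:, 2] >= z_lo) & (points[:, 2] <= z_hi)]
    bins = np.arange(z_lo, z_hi + z_bin, z_bin)
    idx = np.digitize(region[:, 2], bins)
    x_min = np.array([region[idx == i, 0].min() if np.any(idx == i) else -np.inf
                      for i in range(1, len(bins))])
    return bins[int(np.argmax(x_min))]
```

The returned Z coordinates correspond to candidates for the first reference position BP1 and the second reference position BP2, respectively.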
Next, in step S7, the partial determination unit 12 discriminates, based on the reference position BP, between the first portion M1 on the first side (-Z side) and the second portion M2 on the second side (+Z side) with respect to the reference plane BF. The partial determination unit 12 sets the reference plane BF based on the second reference position BP2 calculated in step S6. The partial determination unit 12 also calculates the geodesic line SL connecting the determination target portion QX of the partial determination processing and the second reference position BP2.
The partial determination unit 12 determines whether the determination target portion QX belongs to the first portion M1 or the second portion M2 based on the relative position between the reference plane BF and the geodesic line SL. As the processing result of the partial determination processing, the partial determination unit 12 generates determination information indicating whether the determination target portion QX belongs to the first portion M1 or the second portion M2, and stores the generated determination information in the storage unit 14.
In step S8, the posture estimating unit 13 estimates the posture of the object M. The posture estimating unit 13 executes the posture estimation processing that estimates the posture of the object M (human body M) based on the processing result (e.g., the determination information) of the partial determination unit 12. The posture estimating unit 13 executes the recognition processing on the shape of the human body M obtained from the point cloud data and specifies the characteristic portions of the human body M (see FIG. 7). The posture estimating unit 13 estimates the posture of the human body M based on the relative positions of the specified characteristic portions.
As described above, in the detection device 1 according to the embodiment, the reference calculation unit 11 calculates the reference position BP based on the amount of change in the position information, and the partial determination unit 12 discriminates, based on the reference position BP, between the first portion M1 of the object M on the first side (e.g., the -Z side, the left side) and the second portion M2 on the second side (e.g., the +Z side, the right side). Therefore, the detection device 1 can discriminate between the first portion M1 (e.g., the left foot) and the second portion M2 (e.g., the right foot) with high accuracy, and, for example, the posture estimating unit 13 can estimate the posture of the object M (human body M) with high accuracy.
The reference calculation unit 11 according to this embodiment also estimates, as the reference position BP, the position of the portion including the center of the object M in the intersecting direction (e.g., the Z direction). In this case, the partial determination unit 12 executes the partial determination processing based on the position of the portion including the center of the object M, and can therefore discriminate between the first portion M1 on the first side and the second portion M2 on the second side with high accuracy.
Next, a modification will be described. The reference position BP may be shifted from the center of the object M in the Z direction. For example, assume that the reference position BP is shifted from the center of the object M toward the first side (+Z side) by a predetermined shift amount, and that the maximum distance between the geodesic line SL (see FIG. 6) and the reference plane BF is larger than the shift amount between the reference position BP and the center of the object M. In this case, the partial determination unit 12 may determine on which side of the reference plane BF the geodesic line SL is arranged based on a feature amount of the portion of the geodesic line SL arranged on the first side (+Z side) of the reference plane BF (e.g., the ratio of that portion to the whole geodesic line SL, or its maximum distance from the reference plane BF). For example, the partial determination unit 12 may determine that the determination target portion QX is a portion arranged on the first side when the ratio of the portion of the geodesic line SL arranged on the first side (+Z side) of the reference plane BF is equal to or greater than a threshold.
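For reference, a short non-limiting sketch of the ratio-based criterion of this modification is shown below; it reuses the hypothetical signed_distances helper from the earlier sketch, and the threshold value is an illustrative assumption.

```python
import numpy as np

def side_by_ratio(geodesic_points, plane_point, plane_normal, ratio_threshold=0.6):
    """Assign the first side (+Z in this modification) when the fraction of sampled
    geodesic points lying on that side of BF is at or above the threshold."""
    d = signed_distances(geodesic_points, plane_point, plane_normal)  # from the earlier sketch
    ratio_first_side = float(np.mean(d > 0))  # +Z taken as the first side here
    return "first side (+Z)" if ratio_first_side >= ratio_threshold else "second side (-Z)"
```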
Note that the position detection unit 2 does not have to calculate point cloud data as the position information. In this case, the reference calculation unit 11 may calculate the reference position BP by processing the depth detected by the position detection unit 2 (detection unit 4). The reference calculation unit 11 may also calculate the reference position BP based on shape information (e.g., polygon data) of the object M obtained from the depth detected by the position detection unit 2 (detection unit 4) (described later with reference to FIG. 10).
Note that the reference calculation unit 11 does not have to set the predetermined region AR2. For example, the reference calculation unit 11 may set the center between the first reference position BP1a and the first reference position BP1b as the second reference position BP2. The reference calculation unit 11 may also calculate only one of the first reference position BP1a and the first reference position BP1b as the first reference position BP1. In this case, the reference calculation unit 11 may set, as the second reference position BP2, a position shifted from the first reference position BP1 by a predetermined amount in the Z direction. The predetermined amount may be set based on, for example, the size (scale) of the object M; for example, it may be set to an amount corresponding to a predetermined ratio (e.g., 25%) of the size of the object M in the Z direction (e.g., the distance between the left end of the left shoulder and the right end of the right shoulder). The reference calculation unit 11 also does not have to calculate the first reference position BP1. For example, based on a pattern (see, e.g., FIG. 5(B)) in which a feature amount on the front side or the rear side of the object M (e.g., Xmin(Zi)) changes in the Z direction as shown in FIG. 5(A), the reference calculation unit 11 may calculate the position X = Xmin(Zc), Y = Yk, Z = Zc as the reference position BP.
Note that the detection device 1 does not have to include the posture estimating unit 13 described above. The detection device 1 may calculate feature amounts (e.g., dimensions, circumferential length, cross-sectional area) of the portion determined by the partial determination unit 12. The detection device 1 may also segment the object M using the processing result of the partial determination processing by the partial determination unit 12. The detection device 1 may generate model information of the object M using the processing result of the partial determination processing by the partial determination unit 12 (e.g., information identifying whether a detected "foot" is the "left foot" or the "right foot") (described later with reference to FIG. 10).
[Second embodiment]
Next, a second embodiment will be described. In this embodiment, the same components as those in the above-described embodiment are denoted by the same reference signs, and their description is omitted or simplified. FIG. 9 is a diagram showing a detection device according to the second embodiment. In the first embodiment, the moving direction is predetermined with respect to the target area AR (e.g., the field of view of the position detection unit 2); in this embodiment, the detection device 1 detects the moving direction of the object M (e.g., the human body M). The detection device 1 includes a direction calculation unit 21 that calculates (derives, detects) the moving direction of the object M (human body M) based on the detection result of the position detection unit 2.
In this embodiment, the detection unit 4 repeatedly detects the human body M, and the point cloud data generation unit 5 generates, as the position information, point cloud data of the human body M for each detection timing. The direction calculation unit 21 calculates the moving direction of the human body M based on the temporal change of the position information of the human body M detected by the position detection unit 2. For example, the direction calculation unit 21 calculates a trajectory of the human body M (e.g., a time history of a predetermined position on the human body M) and takes the direction along the trajectory as the moving direction of the human body M. The predetermined position is a position on the human body M determined by a default setting or a user setting.
The direction calculation unit 21 calculates, as the predetermined position on the human body M, the center of gravity of the plurality of points included in the point cloud data, and calculates the moving direction based on the temporal change of the calculated center of gravity. For example, the direction calculation unit 21 calculates a vector whose start point is a first predetermined position of the human body M obtained from the detection result of the detection unit 4 at a first time and whose end point is a second predetermined position of the human body M obtained from the detection result of the detection unit 4 at a second time later than the first time, and determines a direction vector (e.g., a unit vector) parallel to this vector as the moving direction of the human body M at the first time.
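For reference, the centroid-based calculation of the moving direction can be sketched as follows in non-limiting form; the representation of each detection frame as an (N, 3) array and the zero-vector guard are illustrative assumptions.

```python
import numpy as np

def moving_direction(points_t1, points_t2, eps=1e-9):
    """Unit vector from the centroid of the point cloud at the first time to the
    centroid at the later second time (the moving direction MD at the first time)."""
    c1 = np.asarray(points_t1, float).mean(axis=0)
    c2 = np.asarray(points_t2, float).mean(axis=0)
    v = c2 - c1
    norm = np.linalg.norm(v)
    return v / norm if norm > eps else np.zeros(3)
```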
The direction calculation unit 21 stores the calculated moving direction (e.g., movement information) in the storage unit 14. The reference calculation unit 11 calculates the reference position BP based on the moving direction calculated by the direction calculation unit 21. For example, the reference calculation unit 11 reads the moving direction calculated by the direction calculation unit 21 from the storage unit 14, sets the moving direction as the X direction, and executes the reference calculation processing. The partial determination unit 12 discriminates between the first portion M1 and the second portion M2 based on the moving direction calculated by the direction calculation unit 21. For example, the partial determination unit 12 sets, as the reference plane BF, a plane that is parallel to the vertical direction and to the moving direction calculated by the direction calculation unit 21 and that includes the second reference position BP2. The partial determination unit 12 then determines whether the determination target portion QX is arranged on the first side (e.g., the -Z side) or the second side (e.g., the +Z side) with respect to the set reference plane BF.
Note that the predetermined position may be the position of a characteristic portion described in the first embodiment (e.g., the head of the human body M). The predetermined position may also be the position of a characteristic portion of the human body M (e.g., the right foot, the left foot) derived based on the determination result of the partial determination unit 12, or the reference position BP calculated by the reference calculation unit 11 (e.g., the second reference position BP2). For example, the direction calculation unit 21 may calculate a candidate for the moving direction based on the trajectory of a preset first predetermined position (e.g., the center of gravity). The reference calculation unit 11 may then calculate a candidate for the reference position BP (e.g., the second reference position BP2) using the candidate moving direction calculated by the direction calculation unit 21 as the X direction. The direction calculation unit 21 may use the candidate for the second reference position BP2 calculated by the reference calculation unit 11 as a second predetermined position and calculate (e.g., recalculate) the moving direction based on the trajectory of the second predetermined position. The reference calculation unit 11 may calculate (recalculate) the reference position BP using, as the X direction, the moving direction calculated from the second predetermined position.
Note that the direction calculation unit 21 may calculate the moving direction of the object M based on position information of the object M acquired from a global positioning system (GPS). An acceleration sensor may also be provided on the object M, and the direction calculation unit 21 may calculate the moving direction of the object M based on the detection result of the acceleration sensor. For example, the human body M may move while carrying a portable terminal (e.g., a smartphone) including one or both of a receiving unit that receives information from GPS and the acceleration sensor, and the direction calculation unit 21 may acquire position information (e.g., GPS information, acceleration) of the human body M from the portable terminal and calculate the moving direction of the object M.
Note that the detection device 1 may include a vertical detection unit (e.g., a sensor that detects the direction of gravitational acceleration) that detects the vertical direction, and may execute one or both of the reference calculation processing and the partial determination processing with the detection result of the vertical detection unit set as the Y direction. The detection device 1 does not have to include the vertical detection unit. For example, the vertical detection unit may be provided on the object M (carried by the human body M), and the detection device 1 may execute one or both of the reference calculation processing and the partial determination processing based on the vertical direction acquired from the vertical detection unit. The vertical detection unit may also be provided separately from both the detection device 1 and the object M; for example, it may be provided at the place where the object M is detected (e.g., the facility in which the detection device 1 is installed).
[Third embodiment]
Next, a third embodiment will be described. In this embodiment, the same components as those in the above-described embodiments are denoted by the same reference signs, and their description is omitted or simplified. FIG. 10 is a diagram showing a detection device according to the third embodiment. The detection device 1 includes a model generation unit 22 that generates model information of the object M (in this case, the human body M). The model information is, for example, three-dimensional CG model data and includes shape information of the object M. The model generation unit 22 calculates the model information of the object M based on the first portion M1 and the second portion M2 determined by the partial determination unit 12.
The model generation unit 22 according to this embodiment includes the point cloud data generation unit 5 and a surface information generation unit 23. As described in the first embodiment, the point cloud data generation unit 5 generates point cloud data as the shape information based on the depth of the object M detected by the detection unit 4. The surface information generation unit 23 executes surface processing that calculates surface information as the shape information.
The surface information includes, for example, at least one of polygon data, vector data, and draw data. The surface information includes the coordinates of a plurality of points on the surface of the object and connection information between the plurality of points. The connection information (e.g., attribute information) includes, for example, information that associates the points at both ends of a line corresponding to a ridge line (e.g., an edge) of the object surface with each other. The connection information also includes, for example, information that associates a plurality of lines corresponding to the contour of an object surface with each other.
In the surface processing, the surface information generation unit 23 estimates a surface between a point selected from the plurality of points included in the point cloud data and points in its vicinity. When estimating the surface between a point and its neighboring points, the surface information generation unit 23 segments the point cloud data using the processing result of the partial determination unit 12. For example, when generating surface information of the left leg of the human body M, the surface information generation unit 23 distinguishes the point cloud data of the left leg from the point cloud data of the right leg based on the processing result of the partial determination unit 12. In this case, for example when the left knee and the right knee are close to (e.g., in contact with) each other, the surface information generation unit 23 avoids estimating a surface that spans the left knee and the right knee, and can therefore generate highly accurate surface information.
In the surface processing, the surface information generation unit 23 also converts the point cloud data into polygon data having plane information between points. The surface information generation unit 23 converts the point cloud data into polygon data by, for example, an algorithm using the least squares method. This algorithm may be, for example, one to which an algorithm published in a point cloud processing library is applied. The surface information generation unit 23 stores the calculated surface information in the storage unit 14.
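For reference, the following non-limiting Python sketch illustrates how a local least-squares plane fit and the left/right segmentation could be combined so that no face is estimated across the left leg and the right leg. The neighborhood radius, the label array, and the function names are illustrative assumptions and do not represent the point cloud processing library algorithm mentioned above.

```python
import numpy as np

def fit_plane_least_squares(pts):
    """Least-squares plane through pts: returns (centroid, unit normal).
    The normal is the right singular vector with the smallest singular value."""
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[-1]

def local_plane_same_part(points, labels, seed_index, radius=0.05):
    """Fit a local plane around points[seed_index] using only neighbors that share
    its left/right label (the partial determination result), so the estimated
    surface never bridges the first portion M1 and the second portion M2."""
    seed = points[seed_index]
    same_part = labels == labels[seed_index]
    near = np.linalg.norm(points - seed, axis=1) <= radius
    neighborhood = points[same_part & near]
    if len(neighborhood) < 3:
        return None  # not enough same-part points to define a plane
    return fit_plane_least_squares(neighborhood)
```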
Note that the model information may include texture information of the object M. The model generation unit 22 may generate texture information of a surface defined by three-dimensional point coordinates and their related information. The texture information includes, for example, at least one of characters, figures, patterns, texture, information defining unevenness of the object surface, a specific image, and color (e.g., chromatic color, achromatic color). The model generation unit 22 may store the generated texture information in the storage unit 14.
The model information may also include spatial information of an image (e.g., illumination conditions, light source information). The light source information includes information on at least one of the position of a light source that irradiates the object M with light (e.g., illumination light), the direction in which the light is emitted from the light source toward the object M (irradiation direction), the wavelength of the light emitted from the light source, and the type of the light source. The model generation unit 22 may calculate the light source information using, for example, a model assuming Lambertian reflection or a model including albedo estimation. The model generation unit 22 may store the generated spatial information in the storage unit 14. The model generation unit 22 does not have to generate one or both of the texture information and the spatial information.
[Fourth embodiment]
Next, a fourth embodiment will be described. In this embodiment, the same components as those in the above-described embodiments are denoted by the same reference signs, and their description is omitted or simplified. FIG. 11 is a diagram showing a detection device according to the fourth embodiment. The detection device 1 (detection system) according to this embodiment includes the model generation unit 22, a rendering processing unit 24 (rendering processing device, information processing device), an input device 25, and a display device 26.
The rendering processing unit 24 includes, for example, a graphics processing unit (GPU). The rendering processing unit 24 may instead be configured such that a CPU and a memory execute each process according to an image processing program. The rendering processing unit 24 performs, for example, at least one of drawing processing, texture mapping processing, and shading processing.
In the drawing processing, the rendering processing unit 24 can calculate, for example, an estimated image (e.g., a reconstructed image) in which the shape defined by the shape information of the model information is viewed from an arbitrary viewpoint. In the following description, the shape indicated by the shape information is referred to as a model shape. The rendering processing unit 24 can reconstruct a model shape (e.g., an estimated image) from the model information (e.g., the shape information) by the drawing processing, and stores, for example, the data of the calculated estimated image in the storage unit 14.
In the texture mapping processing, the rendering processing unit 24 can calculate, for example, an estimated image in which the image indicated by the texture information of the model information is pasted onto the surface of the object in the estimated image. The rendering processing unit 24 can also calculate an estimated image in which a texture different from that of the target object is pasted onto the surface of the object in the estimated image.
In the shading processing, the rendering processing unit 24 can calculate, for example, an estimated image in which a shadow formed by the light source indicated by the light source information of the model information is added to the object in the estimated image. The rendering processing unit 24 can also calculate, in the shading processing, an estimated image in which a shadow formed by an arbitrary light source is added to the object in the estimated image.
The input device 25 is used for inputting various information (e.g., data, commands) to the processing device 3. The user can input various information to the processing device 3 by operating the input device 25. The input device 25 includes, for example, at least one of a keyboard, a mouse, a trackball, a touch panel, and a voice input device (e.g., a microphone).
The display device 26 displays an image based on the image data output from the processing device 3. For example, the processing device 3 outputs the data of the estimated image generated by the rendering processing unit 24 to the display device 26, and the display device 26 displays the estimated image based on the data of the estimated image output from the processing device 3. The display device 26 includes, for example, a liquid crystal display. The display device 26 and the input device 25 may be a touch panel or the like.
Note that the detection device 1 does not have to include the input device 25; for example, various commands and information may be input via communication. The detection device 1 also does not have to include the display device 26; for example, the detection device 1 may output the data of the estimated image generated by the rendering processing to a display device via communication, and that display device may display the estimated image. Although the rendering processing unit 24 is provided in the processing device 3 in FIG. 11, it may be provided in a device external to the processing device 3. The external device may be a cloud computer communicably connected to the processing device 3.
In the above embodiments, the processing device 3 includes, for example, a computer system. The processing device 3 reads a processing program stored in the storage unit 14 and executes various processes according to the processing program. This processing program causes a computer to execute, for example: calculating, as a reference position, a position where the amount of change in the position information of each point on the surface of a moving object satisfies a predetermined condition in an intersecting direction that intersects the moving direction of the object and the vertical direction; and determining, based on the reference position, a first portion of the object arranged on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second portion of the object arranged on a second side opposite to the first side with respect to the reference plane. This program may be provided by being recorded on a computer-readable storage medium (e.g., a non-transitory recording medium, non-transitory tangible media).
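For reference, the overall sequence that the processing program causes the computer to execute (reference calculation followed by part determination) can be sketched in non-limiting form as follows; it reuses the hypothetical helpers from the earlier sketches and replaces the geodesic-based test with a simple plane-side test for brevity.

```python
import numpy as np

def process_frame(points):
    """Placeholder pipeline: reference calculation followed by part determination.
    first_reference_positions, second_reference_position and signed_distances are
    the hypothetical sketches given above."""
    bp1_candidates = first_reference_positions(points)
    z_lo, z_hi = bp1_candidates.min(), bp1_candidates.max()
    bp2_z = second_reference_position(points, z_lo, z_hi)
    # Reference plane BF: contains the moving (X) and vertical (Y) directions at Z = bp2_z.
    plane_point = np.array([0.0, 0.0, bp2_z])
    plane_normal = np.array([0.0, 0.0, 1.0])
    # Label each point by its side of BF (a simplified stand-in for the geodesic-based test).
    side = signed_distances(points, plane_point, plane_normal)
    return {"first_portion_M1": points[side < 0], "second_portion_M2": points[side >= 0]}
```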
Note that the technical scope of the present invention is not limited to the aspects described in the above embodiments and the like. One or more of the requirements described in the above embodiments and the like may be omitted, and the requirements described in the above embodiments and the like may be combined as appropriate. To the extent permitted by law, the disclosure of Japanese Patent Application No. 2018-143687 and the disclosures of all documents cited in the above embodiments and the like are incorporated herein by reference and made a part of this description.
DESCRIPTION OF SYMBOLS: 1: detection device; 2: position detection unit; 3: processing device; 4: detection unit; 5: point cloud data generation unit; 11: reference calculation unit; 12: partial determination unit; 13: posture estimating unit; 14: storage unit; 21: direction calculation unit; 22: model generation unit; BF: reference plane; BP: reference position; BP1, BP1a, BP1b: first reference position; BP2: second reference position; CL: center line; D2: point cloud data; M: object (human body); M1: first portion (left half of body); M2: second portion (right half of body); MD: moving direction; SL: geodesic line; V1: first threshold; V2: second threshold

Claims (25)

1. A detection device comprising:
a detection unit that detects position information of each point of a moving object;
a reference calculation unit that calculates, as a reference position, a position where the amount of change in the position information satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the object and the vertical direction; and
a determination unit that determines, based on the reference position calculated by the reference calculation unit, a first portion of the object arranged on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second portion of the object arranged on a second side opposite to the first side with respect to the reference plane.
2. The detection device according to claim 1, wherein the reference position is a position of a portion including the center of the object in the intersecting direction.
3. The detection device according to claim 1 or 2, wherein the detection unit detects, as the position information, point cloud data including three-dimensional coordinates of each point on the surface of the object, and the reference position is obtained based on the point cloud data.
4. The detection device according to claim 3, wherein the reference position is based on a first reference position at which the amount of change in the coordinates of the point cloud data is equal to or greater than a predetermined value.
5. The detection device according to claim 4, wherein, for the reference position, a maximum value of the coordinate in the vertical direction is calculated at each coordinate in the intersecting direction for the points included in the point cloud data, and a position where the amount of change of the maximum value with respect to the coordinate in the intersecting direction is equal to or greater than a predetermined value is taken as the first reference position.
6. The detection device according to claim 4 or 5, wherein the reference position is a second reference position within a predetermined region determined by the first reference position, and the determination unit determines a portion arranged on the first side with respect to the second reference position as the first portion and determines a portion arranged on the second side with respect to the second reference position as the second portion.
7. The detection device according to claim 6, wherein the first reference position includes a position on the first side and a position on the second side of the object, the region between the position on the first side and the position on the second side is taken as the predetermined region, and the second reference position is obtained from the predetermined region.
8. The detection device according to claim 6 or 7, wherein the reference calculation unit calculates, for the points included in the point cloud data, a local minimum value of the coordinate along the moving direction, with the forward side of the moving direction taken as positive, at each coordinate in the intersecting direction, and selects the second reference position based on the amount of change of the local minimum value with respect to the coordinate in the intersecting direction.
9. The detection device according to claim 8, wherein the reference calculation unit calculates, as the second reference position, a position where the local minimum value is maximum.
10. The detection device according to any one of claims 6 to 9, wherein the determination unit determines the first portion and the second portion using a plane including the second reference position as the reference plane.
11. The detection device according to any one of claims 1 to 10, wherein the determination unit determines the first portion and the second portion based on a relative position between the reference plane and a geodesic line connecting the reference position and a portion of the object.
12. The detection device according to claim 11, wherein the determination unit specifies the relative position between the geodesic line and the reference plane based on a first feature amount of a portion of the geodesic line arranged on the first side with respect to the reference plane and a second feature amount of a portion of the geodesic line arranged on the second side with respect to the reference plane, and determines the first portion and the second portion based on the specified relative position between the geodesic line and the reference plane.
13. The detection device according to claim 12, wherein the first feature amount is an average of the distances between the reference plane and each of a plurality of points on the geodesic line arranged on the first side with respect to the reference plane, and the second feature amount is an average of the distances between the reference plane and each of a plurality of points on the geodesic line arranged on the second side with respect to the reference plane.
14. The detection device according to claim 12 or 13, wherein the first feature amount is a distance between the reference plane and the point on the geodesic line farthest from the reference plane toward the first side, and the second feature amount is a distance between the reference plane and the point on the geodesic line farthest from the reference plane toward the second side.
15. The detection device according to any one of claims 1 to 14, further comprising a model generation unit that calculates model information of the object based on the first portion and the second portion determined by the determination unit.
16. The detection device according to any one of claims 1 to 15, further comprising a posture estimation unit that estimates a posture of the object based on the first portion and the second portion determined by the determination unit.
17. The detection device according to any one of claims 1 to 16, wherein, in the object, the first portion and the second portion alternately move in the moving direction.
18. The detection device according to any one of claims 1 to 17, wherein the object includes a human body, the first portion is at least a part of the left half of the body, and the second portion is at least a part of the right half of the body.
19. The detection device according to claim 18, wherein the reference calculation unit calculates, as the reference position, a first reference position corresponding to the head of the human body based on the amount of change in the position information from the arm to the head of the human body.
20. The detection device according to claim 19, wherein the reference calculation unit calculates, as the reference position, a second reference position corresponding to the position of the center line of the human body in the intersecting direction, based on the first reference position corresponding to the left half of the human body and the first reference position corresponding to the right half of the human body.
  21.  Further comprising a direction calculation unit that calculates the moving direction of the object based on a detection result of the detection unit,
     The reference calculation unit calculates the reference position based on the moving direction calculated by the direction calculation unit,
     The determination unit determines the first part and the second part based on the moving direction calculated by the direction calculation unit,
     The detection device according to any one of claims 1 to 20.
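One plausible way to realise the direction calculation unit of claim 21 is to track the centroid of the detected points over successive frames and treat the centroid displacement as the moving direction. This is a sketch under that assumption, not the specific method disclosed here.

```python
import numpy as np

def moving_direction(points_prev, points_curr):
    """Unit vector from the previous frame's point-cloud centroid to the current one."""
    displacement = points_curr.mean(axis=0) - points_prev.mean(axis=0)
    norm = np.linalg.norm(displacement)
    return displacement / norm if norm > 0.0 else displacement
```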
  22.  The detection unit detects position information of each point in a target region including the object, and extracts the position information of each point on the surface of the object from the position information of each point in the target region,
     The detection device according to any one of claims 1 to 21.
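Claim 22 separates detection of the whole target region from extraction of the object's surface points. A minimal sketch of such an extraction, assuming the object can be isolated from the background by a simple distance threshold in the sensor frame, is shown below; a real implementation would normally use a more robust segmentation.

```python
import numpy as np

def extract_object_points(region_points, max_distance):
    """Keep only the points of the target region within `max_distance` of the sensor.

    `region_points` is an (N, 3) array of positions measured in the sensor frame; the
    returned subset approximates the points on the surface of the object.
    """
    distances = np.linalg.norm(region_points, axis=1)
    return region_points[distances < max_distance]
```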
  23.  A processing device comprising:
     a reference calculation unit that calculates, as a reference position, a position where the amount of change in the position information of each point on the surface of a moving object satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the object and the vertical direction; and
     a determination unit that determines, based on the reference position, a first part of the object arranged on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second part of the object arranged on a second side opposite to the first side with respect to the reference plane.
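The processing device of claim 23 combines the two central steps: finding the reference position where the change of the position information in the intersecting direction satisfies a condition, and splitting the object's points about the reference plane spanned by the moving direction and the vertical direction. The sketch below is one possible reading of those steps; the axis indices, the threshold, and the "first change above threshold" condition are assumptions for illustration only.

```python
import numpy as np

def reference_position(points, cross_axis, value_axis, threshold):
    """Reference position along the intersecting direction.

    The points are ordered along the intersecting axis (`cross_axis`); the coordinate at
    which the change of the position information taken from `value_axis` between
    neighbouring points first exceeds `threshold` is returned as the reference position.
    """
    order = np.argsort(points[:, cross_axis])
    ordered = points[order]
    change = np.abs(np.diff(ordered[:, value_axis]))
    idx = int(np.argmax(change > threshold))
    return ordered[idx, cross_axis]

def split_by_reference_plane(points, ref_pos, cross_axis):
    """Split the object's points into a first part and a second part.

    The reference plane contains the moving direction and the vertical direction and
    passes through `ref_pos` on the intersecting axis, so the split reduces to comparing
    the intersecting-direction coordinate against `ref_pos`.
    """
    first_part = points[points[:, cross_axis] >= ref_pos]
    second_part = points[points[:, cross_axis] < ref_pos]
    return first_part, second_part
```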
  24.  A detection method comprising:
     detecting position information of each point of a moving object;
     calculating, as a reference position, a position where the amount of change in the position information satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the object and the vertical direction; and
     determining, based on the reference position, a first part of the object arranged on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second part of the object arranged on a second side opposite to the first side with respect to the reference plane.
  25.  A processing program that causes a computer to execute:
     calculating, as a reference position, a position where the amount of change in the position information of each point on the surface of a moving object satisfies a predetermined condition in an intersecting direction intersecting the moving direction of the object and the vertical direction; and
     determining, based on the reference position, a first part of the object arranged on a first side with respect to a reference plane including the moving direction and the vertical direction, and a second part of the object arranged on a second side opposite to the first side with respect to the reference plane.
PCT/JP2019/026218 2018-07-31 2019-07-02 Detection device, processing device, detection method, and processing program WO2020026677A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020534123A JP7024876B2 (en) 2018-07-31 2019-07-02 Detection device, processing device, detection method, and processing program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018143687 2018-07-31
JP2018-143687 2018-07-31

Publications (1)

Publication Number Publication Date
WO2020026677A1 (en)

Family

ID=69230965

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026218 WO2020026677A1 (en) 2018-07-31 2019-07-02 Detection device, processing device, detection method, and processing program

Country Status (2)

Country Link
JP (1) JP7024876B2 (en)
WO (1) WO2020026677A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007107927A (en) * 2005-10-11 2007-04-26 Nittetsu Hokkaido Control Systems Corp Length measuring device, length measuring method, and computer program for length measurement
JP2008032489A (en) * 2006-07-27 2008-02-14 Kanazawa Inst Of Technology Three-dimensional shape data creation method and apparatus for human body
JP2012215555A (en) * 2011-03-30 2012-11-08 Advanced Telecommunication Research Institute International Measurement device, measurement method, and measurement program
WO2017119154A1 (en) * 2016-01-07 2017-07-13 三菱電機株式会社 Detection device and detection method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7261342B1 (en) 2022-09-22 2023-04-19 三菱ケミカルグループ株式会社 Information processing device, method, program, and system
WO2024062642A1 (en) * 2022-09-22 2024-03-28 株式会社Shosabi Information processing device, method, program, and system
JP2024045823A (en) * 2022-09-22 2024-04-03 三菱ケミカルグループ株式会社 Information processing device, method, program, and system
WO2024116444A1 (en) * 2022-11-28 2024-06-06 株式会社Jvcケンウッド Image processing device and image processing program

Also Published As

Publication number Publication date
JP7024876B2 (en) 2022-02-24
JPWO2020026677A1 (en) 2021-08-02

Similar Documents

Publication Publication Date Title
US9594950B2 (en) Depth mapping with enhanced resolution
US9898651B2 (en) Upper-body skeleton extraction from depth maps
US9842405B2 (en) Visual target tracking
US8565476B2 (en) Visual target tracking
US8577084B2 (en) Visual target tracking
US7974443B2 (en) Visual target tracking using model fitting and exemplar
US8682028B2 (en) Visual target tracking
US9767611B2 (en) Information processing apparatus and method for estimating depth values using an approximate plane
US8577085B2 (en) Visual target tracking
US20140010425A1 (en) Extraction of skeletons from 3d maps
US8565477B2 (en) Visual target tracking
JP2016071645A (en) Object three-dimensional model restoration method, device, and program
JP7363962B2 (en) Processing equipment, detection equipment, systems and programs
JP7024876B2 (en) Detection device, processing device, detection method, and processing program
JP7200994B2 (en) Processing device, detection device, processing method, and processing program
JP7234595B2 (en) Detection device, processing device, detection method, and processing program
JP7447956B2 (en) Processing device, attitude analysis system, and program
JP2022037506A (en) Detection device, processing device, detection method, and processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19845513

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020534123

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19845513

Country of ref document: EP

Kind code of ref document: A1