WO2019155727A1 - Information processing device, tracking method, and tracking program

Information processing device, tracking method, and tracking program

Info

Publication number
WO2019155727A1
WO2019155727A1 (PCT/JP2018/043204)
Authority
WO
WIPO (PCT)
Prior art keywords
coordinates
detection
detection target
coordinate
representative
Prior art date
Application number
PCT/JP2018/043204
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
晋 飯野
順司 助野
正英 小池
聡 道籏
Original Assignee
三菱電機株式会社
Priority date
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to JP2019570309A priority Critical patent/JP6789421B2/ja
Priority to CN201880088321.8A priority patent/CN111670456B/zh
Publication of WO2019155727A1 publication Critical patent/WO2019155727A1/ja

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion

Definitions

  • the present invention relates to an information processing apparatus, a tracking method, and a tracking program.
  • Techniques for tracking a person or an object have been proposed (see Patent Documents 1 and 2).
  • The passing-person counting device disclosed in Patent Document 1 extracts a person from image data captured by a plurality of cameras, and tracks the extracted person.
  • the control device disclosed in Patent Document 2 captures images from a plurality of cameras, extracts features indicating the vehicle on the images, and tracks the movement of the vehicle.
  • the movement trajectory of a person or an object is calculated by integrating a plurality of pieces of position information.
  • Outlier position information may be included in the plurality of position information.
  • An outlier is position information obtained when the position of a person or an object is erroneously detected. Integrating a plurality of pieces of position information that include outliers degrades the accuracy of the movement trajectory of the person or object.
  • This degradation of movement-trajectory accuracy is not solved simply by increasing the frame rate of the camera, nor by shortening the sampling period of the sensor.
  • An object of the present invention is to improve the accuracy of the movement trajectory.
  • An information processing apparatus includes: a position detection unit that detects, from a plurality of pieces of detection information in which each of a plurality of detection devices periodically detects a first detection target, a plurality of detection target positions indicating positions of the first detection target; a conversion unit that converts the plurality of detection target positions into coordinates based on a space in which the plurality of detection devices are installed; a storage unit that stores first coordinates, which are coordinates, based on the space, of the position where the first detection target existed before the plurality of detection devices detected it; a classification unit that acquires the plurality of converted coordinates obtained by converting the plurality of detection target positions into coordinates based on the space, and extracts, from the acquired converted coordinates, a plurality of second coordinates predicted to have a relationship with the first coordinates; and a representative coordinate calculation unit that calculates a representative coordinate based on the plurality of second coordinates and determines the calculated representative coordinate as the position to which the first detection target has moved from the first coordinates.
  • the accuracy of the movement trajectory can be improved.
  • FIG. 1 is a first diagram illustrating the tracking system according to the first embodiment.
  • FIG. 2 is a second diagram illustrating the tracking system according to the first embodiment.
  • FIG. 3 is a diagram illustrating the main hardware configuration of the information processing apparatus according to the first embodiment.
  • FIG. 4 is a functional block diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
  • FIG. 5 is a diagram illustrating an example of the movement trajectory table according to the first embodiment.
  • FIG. 6 is a flowchart illustrating the process for storing converted coordinates according to the first embodiment.
  • FIG. 7 is a flowchart illustrating the process for calculating representative coordinates according to the first embodiment.
  • FIG. 8 is a diagram (part 1) illustrating a specific example of the tracking process according to the first embodiment.
  • FIG. 9 is a diagram (part 2) illustrating a specific example of the tracking process according to the first embodiment.
  • FIG. 10 is a diagram (part 3) illustrating a specific example of the tracking process according to the first embodiment.
  • FIG. 11 is a functional block diagram illustrating the configuration of the information processing apparatus according to the second embodiment.
  • FIG. 12 is a flowchart (part 1) illustrating the representative coordinate calculation process according to the second embodiment.
  • FIG. 13 is a flowchart (part 2) illustrating the representative coordinate calculation process according to the second embodiment.
  • FIG. 1 is a diagram (part 1) illustrating the tracking system according to the first embodiment.
  • the tracking system includes an information processing apparatus 100 and cameras 200, 201, and 202.
  • the information processing apparatus 100 and the cameras 200, 201, and 202 are connected via a network.
  • the information processing apparatus 100 can track the subject to be photographed.
  • the information processing apparatus 100 is a computer.
  • the information processing apparatus 100 acquires images captured by the cameras 200, 201, and 202.
  • Cameras 200, 201, and 202 are also referred to as imaging devices or image generation devices.
  • the cameras 200, 201, and 202 may be sensors.
  • Cameras 200, 201, and 202 may include sensors.
  • a camera, a sensor, or a camera including a sensor is also referred to as a detection device.
  • FIG. 1 shows a case where there are three cameras. However, the number of cameras is not limited to three.
  • Cameras 200, 201, and 202 are installed in a shooting target space.
  • the shooting target space is a space where a camera is installed, and is a space in a range where the camera can shoot.
  • the cameras 200, 201, and 202 are installed above the shooting target space.
  • the cameras 200, 201, and 202 are installed on a ceiling in a room that is a shooting target space.
  • Cameras 200, 201, and 202 capture an image of an object to be photographed from above.
  • the captured image is acquired by the information processing apparatus 100.
  • The capture time is associated with the captured image.
  • FIG. 1 shows a state in which the cameras 200, 201, and 202 are installed on the indoor ceiling.
  • the information processing apparatus 100 may exist in the room, or may exist in a different place from the room.
  • FIG. 2 is a diagram (part 2) illustrating the tracking system according to the first embodiment.
  • FIG. 2 shows the arrangement of FIG. 1 as viewed from the side.
  • Part of the range that can be captured by the camera 200 and the camera 201 overlaps.
  • Part of the range that can be captured by the camera 201 and the camera 202 overlaps.
  • When parts of the ranges that can be captured by a plurality of cameras overlap, an imaging target in the overlapping part is captured by the plurality of cameras.
  • the information processing apparatus 100 is not shown.
  • FIG. 3 is a diagram illustrating a main hardware configuration of the information processing apparatus according to the first embodiment.
  • the information processing apparatus 100 includes a processor 101, a volatile storage device 102, and a nonvolatile storage device 103.
  • the processor 101 controls the entire information processing apparatus 100.
  • the processor 101 is a CPU (Central Processing Unit) or an FPGA (Field Programmable Gate Array).
  • the processor 101 may be a multiprocessor.
  • the information processing apparatus 100 may be realized by a processing circuit, or may be realized by software, firmware, or a combination thereof.
  • the processing circuit may be a single circuit or a composite circuit.
  • the volatile storage device 102 is a main storage device of the information processing apparatus 100.
  • the volatile storage device 102 is a RAM (Random Access Memory).
  • the nonvolatile storage device 103 is an auxiliary storage device of the information processing apparatus 100.
  • the non-volatile storage device 103 is an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • FIG. 4 is a functional block diagram illustrating the configuration of the information processing apparatus according to the first embodiment.
  • the information processing apparatus 100 includes a position detection unit 110, a conversion unit 120, a coordinate storage unit 130, a storage unit 140, a classification unit 150, a representative coordinate calculation unit 160, a display control unit 170, and a timer 180.
  • the position detection unit 110 includes object detection units 111, 112, and 113.
  • the conversion unit 120 includes coordinate conversion units 121, 122, and 123.
  • Some or all of the position detection unit 110, the object detection units 111, 112, and 113, the conversion unit 120, the coordinate conversion units 121, 122, and 123, the classification unit 150, the representative coordinate calculation unit 160, and the display control unit 170 may be realized by the processor 101, or may be realized as modules of a program executed by the processor 101. The program is stored in the volatile storage device 102 or the nonvolatile storage device 103. The program is a tracking program.
  • In other words, the position detection unit 110, the object detection units 111, 112, and 113, the conversion unit 120, the coordinate conversion units 121, 122, and 123, the classification unit 150, the representative coordinate calculation unit 160, and the display control unit 170 can be realized as modules of a tracking program executed by the processor 101 included in the information processing apparatus 100 (for example, a computer).
  • the coordinate accumulation unit 130 and the storage unit 140 are realized as a storage area secured in the volatile storage device 102 or the nonvolatile storage device 103.
  • the position detection unit 110 detects a plurality of image coordinates indicating the position of the imaging target from a plurality of captured images obtained by each of the cameras 200, 201, and 202 periodically capturing the imaging target.
  • An imaging target is also called a detection target.
  • the imaging target may be expressed as a first detection target.
  • the captured image is also referred to as detection information.
  • Image coordinates are also referred to as detection target positions.
  • The plurality of captured images may be images in which the cameras 200, 201, and 202 each captured the imaging target in the same cycle, or images captured in different cycles.
  • the conversion unit 120 converts a plurality of image coordinates into coordinates based on a space in which the cameras 200, 201, and 202 are installed.
  • the space may be expressed as a space obtained by integrating the shooting target spaces of the cameras 200, 201, and 202. Coordinates based on the space are called common system coordinates.
  • the coordinate storage unit 130 stores a plurality of converted coordinates obtained by converting a plurality of image coordinates into common system coordinates.
  • the storage unit 140 stores a plurality of pieces of position information where the shooting target has existed in the past.
  • the plurality of pieces of position information are indicated by coordinates.
  • the coordinates are common system coordinates.
  • the display control unit 170 can generate a movement trajectory by integrating a plurality of pieces of position information (that is, a plurality of coordinates).
  • the movement trajectory may be expressed as a flow line.
  • the coordinates corresponding to the latest time among the plurality of coordinates that are the basis of the movement locus are referred to as the latest coordinates.
  • The latest coordinates may also be expressed as the leading coordinates of the movement trajectory.
  • the storage unit 140 stores the coordinates of the position where the shooting target exists before the cameras 200, 201, and 202 detect a shooting target (also referred to as a first detection target, for example).
  • the coordinates are coordinates based on space (that is, common system coordinates). As will be described later, the coordinates are the latest coordinates of the imaging target closest to the converted coordinates, and may be expressed as first coordinates.
  • the classification unit 150 extracts a plurality of coordinates predicted to have a relationship with the latest coordinate from the plurality of converted coordinates stored in the coordinate storage unit 130.
  • the plurality of coordinates are also referred to as a plurality of second coordinates.
  • the representative coordinate calculation unit 160 calculates representative coordinates based on the plurality of coordinates.
  • the representative coordinate calculation unit 160 determines the representative coordinates as the position where the imaging target has moved from the latest coordinates.
  • the display control unit 170 generates a two-dimensional map indicating a movement locus based on a plurality of pieces of position information (that is, a plurality of coordinates) stored in the storage unit 140.
  • the two-dimensional map is a bird's-eye view from above the space, and represents a movement trajectory of a person or an object.
  • the display control unit 170 displays a two-dimensional map on the display included in the information processing apparatus 100. Thereby, the user can recognize the movement trajectory of the person or the object.
  • FIG. 5 is a diagram illustrating an example of a movement trajectory table according to the first embodiment.
  • the movement trajectory table 141 is stored in the storage unit 140.
  • the movement trajectory table 141 has items of item number, data content, data format, and data size.
  • the item of the item number indicates an identifier.
  • the data content item indicates the data content.
  • the data format item indicates the data format.
  • the data size item indicates the data size.
  • the unit of information registered in the data size item is bytes.
  • item number 2 indicates that the number of movement trajectory IDs (identifiers) is N (N is a positive integer).
  • In item number 3, the coordinates of the start position of the movement of the imaging target whose movement trajectory ID is T1 (hereinafter referred to as "movement trajectory ID: T1") are registered.
  • An imaging target ID corresponding to the movement trajectory ID: T1 is also registered.
  • In addition, the number of coordinates used when calculating the movement trajectory of the imaging target ID corresponding to the movement trajectory ID: T1 is registered.
  • FIG. 5 shows that the number of the coordinates is m1.
  • The last update time of the movement trajectory ID: T1 is registered as well. From item number 7 onward, the positions to which the imaging target corresponding to the movement trajectory ID: T1 has moved are registered in chronological order.
  • FIG. 5 illustrates the case of three-dimensional coordinates, but there are also cases where two-dimensional coordinates are used.
  • FIG. 5 shows the latest coordinates of the movement trajectory ID: T1.
  • The latest coordinates of the movement trajectory ID: T1 are an x coordinate m1x, a y coordinate m1y, and a z coordinate m1z.
  • The display control unit 170 can generate the movement trajectory of the imaging target ID corresponding to the movement trajectory ID: T1 by using the m1 coordinates.
  • information relating to the movement trajectory ID: T2,... TN is registered after the information relating to the movement trajectory ID: T1.
  • The latest coordinates are registered for each movement trajectory ID, as with the movement trajectory ID: T1.
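As a reading aid, the table layout described above can be expressed as a small data structure. The following Python sketch is a hypothetical rendering of the movement trajectory table 141; the class and field names are assumptions chosen to mirror the item descriptions, not the patent's actual data format.

```python
from dataclasses import dataclass, field

@dataclass
class Trajectory:
    """One record of the movement trajectory table 141 (e.g. movement trajectory ID: T1)."""
    trajectory_id: str                 # movement trajectory ID, e.g. "T1"
    target_id: str                     # imaging-target ID linked to this trajectory
    start: tuple                       # coordinates of the start position of the movement
    last_update: float                 # last update time of this trajectory
    points: list = field(default_factory=list)  # moved-to positions in chronological order

    @property
    def latest(self):
        # the latest (leading) coordinates are the newest registered position
        return self.points[-1] if self.points else self.start

# movement trajectory table 141: N trajectories keyed by movement trajectory ID
trajectory_table = {"T1": Trajectory("T1", "target-01", (0.0, 0.0, 0.0), 0.0)}
```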
  • the processing executed by the object detection unit 111 is the same as the processing executed by the object detection units 112 and 113. Therefore, in FIG. 6, a process executed by the object detection unit 111 will be described. Description of the processing executed by the object detection units 112 and 113 is omitted.
  • the processing executed by the coordinate conversion unit 121 is the same as the processing executed by the coordinate conversion units 122 and 123. Therefore, in FIG. 6, a process executed by the coordinate conversion unit 121 will be described. Description of the processing executed by the coordinate conversion units 122 and 123 is omitted.
  • FIG. 6 is a flowchart showing the storage processing of converted coordinates according to the first embodiment. The process of FIG. 6 is executed every time the camera 200 captures an image. In the description of the processing in FIG. 6, reference is made to the preceding figures.
  • the object detection unit 111 acquires an image captured by the camera 200.
  • the object detection unit 111 performs a recognition process on the image and detects an imaging target.
  • the recognition process is a background difference process, an inter-frame difference process, a general object recognition technique, or a specific object recognition technique. Further, the object detection unit 111 detects a plurality of shooting targets when there are a plurality of shooting targets in the image.
  • The object detection unit 111 detects the position of the imaging target in the image; that is, the object detection unit 111 detects image coordinates. The image coordinates are relative positions with respect to the camera 200. Moreover, the object detection unit 111 detects a plurality of image coordinates when a plurality of imaging targets exist in the image.
  • The coordinate conversion unit 121 converts the image coordinates into common system coordinates. For this conversion, the positions in the common system coordinates corresponding to the installation positions of the cameras 200, 201, and 202 and the orientations of the cameras are measured in advance, and parameters for converting the coordinates are calculated beforehand. The coordinate conversion unit 121 converts image coordinates into common system coordinates using these parameters.
  • The coordinate conversion unit 121 may convert the image coordinates into two-dimensional common system coordinates or into three-dimensional common system coordinates. For example, when two-dimensional common system coordinates are used, the coordinate conversion unit 121 obtains them by projecting the position onto a known plane such as the ground or the floor.
  • the coordinate conversion unit 121 stores the converted coordinates obtained by converting the image coordinates into common system coordinates in the coordinate storage unit 130.
  • the coordinate storage unit 130 stores a plurality of converted coordinates based on images periodically taken by the cameras 200, 201, and 202.
  • a post-conversion coordinate obtained by converting the image coordinate is associated with a photographing time when an image including the image coordinate is photographed.
  • the shooting time is also referred to as detection time.
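As a concrete illustration of the conversion above, the sketch below maps an image coordinate to two-dimensional common system coordinates with a precomputed planar homography. The patent states only that conversion parameters are measured and calculated in advance from the installation positions and orientations of the cameras; the homography form and the matrix values here are assumptions.

```python
import numpy as np

# Hypothetical parameters prepared in advance for camera 200: a homography
# that projects image pixels onto the floor plane of the common coordinate system.
H_CAMERA_200 = np.array([
    [0.01, 0.00, -3.2],
    [0.00, 0.01, -2.4],
    [0.00, 0.00,  1.0],
])

def to_common_coords(image_xy, H):
    """Convert one image coordinate (u, v) into common system coordinates."""
    u, v = image_xy
    x, y, w = H @ np.array([u, v, 1.0])
    return (x / w, y / w)  # projective division onto the known floor plane

# usage: to_common_coords((320, 240), H_CAMERA_200)
```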
  • When the camera 200 or the like is a sensor, the sensor is, for example, an infrared sensor, and detects a detection target using infrared rays or the like.
  • the object detection unit 111 acquires detection information for detecting a detection target from the sensor.
  • the object detection unit 111 detects a detection target position from the detection information.
  • the detection target position is information indicating the distance from the sensor to the detection target.
  • the coordinate conversion unit 121 converts the detection target position into common system coordinates.
  • the coordinate conversion unit 121 stores the converted coordinates obtained by converting the detection target position into the common system coordinates in the coordinate accumulation unit 130.
  • the information processing apparatus 100 stores the converted coordinates obtained by converting the detection target position in the coordinate accumulation unit 130.
  • the coordinates are associated with the detection time at which the detection target is detected.
  • the information processing apparatus 100 performs the same process as the process illustrated in FIG. 6 even when the camera 200 is a sensor.
  • FIG. 7 is a flowchart showing the representative coordinate calculation process according to the first embodiment.
  • the process of FIG. 7 is started when the classification unit 150 receives a periodic trigger.
  • The periodic trigger is generated by the timer 180 and transmitted to the classification unit 150.
  • In the description of the processing in FIG. 7, reference is made to the preceding figures.
  • The classification unit 150 acquires the latest coordinates of each movement trajectory ID from the movement trajectory table 141.
  • For example, the classification unit 150 acquires the latest coordinates of the movement trajectory ID: T1, the latest coordinates of the movement trajectory ID: T2, and so on, from the movement trajectory table 141.
  • The classification unit 150 acquires, from the coordinate storage unit 130, the converted coordinates obtained from images captured between a predetermined time before the start of step S22 and the start of step S22. That is, the classification unit 150 acquires a plurality of converted coordinates from the coordinate storage unit 130 based on the shooting time (that is, the detection time).
  • The predetermined time is longer than the sampling period of the cameras 200, 201, and 202. The predetermined time may also be determined based on the time resolution required in actual operation; for example, it is about 0.1 to 2 seconds.
  • the predetermined time is also referred to as a first time. In this way, the classification unit 150 can acquire a plurality of converted coordinates in accordance with the operation timing of the classification unit 150. Further, the classification unit 150 may acquire all the converted coordinates stored in the coordinate accumulation unit 130.
  • Alternatively, the classification unit 150 may acquire, based on the storage time, the plurality of converted coordinates stored in the coordinate storage unit 130 between a predetermined time before the start of step S22 and the start of step S22.
  • Step S23 The classification unit 150 selects one post-conversion coordinate from the plurality of post-conversion coordinates acquired in step S22.
  • Step S24 The classification unit 150 adds, to the converted coordinate selected in Step S23, the movement trajectory ID of the latest coordinate closest to that converted coordinate among the latest coordinates of the movement trajectory IDs. For example, the classification unit 150 calculates the distance between the latest coordinate of each movement trajectory ID and the converted coordinate selected in Step S23, and as a result specifies that the latest coordinate closest to the converted coordinate selected in Step S23 is, for example, that of the movement trajectory ID: T1.
  • the classification unit 150 adds the movement trajectory ID: T1 to the converted coordinates selected in step S23.
  • the latest coordinates closest to the converted coordinates selected in step S23 are also referred to as first coordinates.
  • the post-conversion coordinates to which the movement locus ID is added can be said to be coordinates that are predicted to have a relationship with the latest coordinates of the movement locus ID.
  • Step S25 The classification unit 150 determines whether or not all of the plurality of converted coordinates acquired in Step S22 have been selected. If all of the plurality of converted coordinates acquired in step S22 have not been selected (No in step S25), the classification unit 150 advances the process to step S23. When all the plurality of converted coordinates acquired in step S22 are selected (Yes in step S25), the classification unit 150 advances the process to step S26.
  • the classification unit 150 can classify the plurality of converted coordinates for each movement trajectory ID by executing Step S24. That is, the classification unit 150 performs clustering by executing step S24.
  • a plurality of converted coordinates are classified for each movement trajectory ID.
  • Here, the latest coordinate of the movement trajectory ID: T1 is set as the first coordinate.
  • The latest coordinates of the movement trajectory IDs other than the movement trajectory ID: T1 are set as a plurality of third coordinates.
  • The plurality of converted coordinates to which the movement trajectory ID: T1 is added can be said to be coordinates whose distance to the first coordinate is the smallest among the plurality of coordinates consisting of the first coordinate and the plurality of third coordinates.
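A minimal sketch of the classification in steps S23 to S25 follows: every converted coordinate is labeled with the movement trajectory ID whose latest coordinate is nearest. The function names are assumptions, and Euclidean distance is assumed where the patent speaks only of "distance".

```python
import math

def classify(converted_coords, latest_by_id):
    """Steps S23-S25: assign each converted coordinate to the movement
    trajectory ID whose latest coordinate is closest (clustering)."""
    clusters = {tid: [] for tid in latest_by_id}
    for coord in converted_coords:
        nearest_id = min(latest_by_id,
                         key=lambda tid: math.dist(coord, latest_by_id[tid]))
        clusters[nearest_id].append(coord)
    return clusters

# usage: classify([(1.0, 0.9), (5.2, 4.8)], {"T1": (1.1, 1.0), "T2": (5.0, 5.0)})
```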
  • Step S26 The representative coordinate calculation unit 160 calculates a representative coordinate for each movement trajectory ID added in step S24. That is, the representative coordinate calculation unit 160 calculates the representative coordinates based on a plurality of converted coordinates classified for each movement locus ID.
  • For example, the representative coordinate calculation unit 160 selects one converted coordinate from the plurality of converted coordinates having the same movement trajectory ID.
  • The representative coordinate calculation unit 160 calculates the distance between the selected converted coordinate and each of the other converted coordinates with that movement trajectory ID, and sums these distances.
  • The representative coordinate calculation unit 160 calculates this sum of distances for every one of the converted coordinates.
  • The representative coordinate calculation unit 160 determines the converted coordinate having the smallest sum of distances as the representative coordinate.
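Read this way, the representative coordinate is the medoid of the cluster: the member whose summed distance to all other members is smallest. A minimal sketch under that reading:

```python
import math

def medoid(coords):
    """Return the converted coordinate whose sum of distances to every
    other converted coordinate in the same cluster is smallest."""
    return min(coords, key=lambda c: sum(math.dist(c, o) for o in coords))
```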
  • Alternatively, the representative coordinate calculation unit 160 randomly selects some converted coordinates from among the plurality of converted coordinates assigned the same movement trajectory ID, and calculates the average coordinate of the selected converted coordinates.
  • The representative coordinate calculation unit 160 determines the average coordinate as the representative coordinate when the number of converted coordinates existing within a predetermined range centered on the average coordinate is equal to or greater than a threshold value.
  • Otherwise, the representative coordinate calculation unit 160 executes the process of randomly selecting converted coordinates again.
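This alternative resembles a RANSAC-style consensus: average a random subset, then accept the average only if enough converted coordinates lie within a fixed radius of it. In the sketch below, the sample size, radius, acceptance threshold, and retry limit are all assumed parameters; the patent does not specify them.

```python
import math
import random

def sample_average(coords, k=3, radius=0.5, threshold=4, max_tries=20):
    """Average a random sample of converted coordinates and accept it as
    the representative coordinate once enough coordinates lie nearby."""
    for _ in range(max_tries):
        sample = random.sample(coords, min(k, len(coords)))
        avg = tuple(sum(axis) / len(sample) for axis in zip(*sample))
        inliers = sum(1 for c in coords if math.dist(c, avg) <= radius)
        if inliers >= threshold:
            return avg
    return None  # no consensus reached; the retry limit is an assumption
```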
  • The representative coordinate calculation unit 160 determines the representative coordinate calculated for each movement trajectory ID as the position moved from the latest coordinate of that movement trajectory ID. For example, the representative coordinate calculation unit 160 determines the representative coordinate calculated from the plurality of converted coordinates to which the movement trajectory ID: T1 is added as the position to which the imaging target corresponding to the movement trajectory ID: T1 has moved from the latest coordinates of the movement trajectory ID: T1.
  • Step S27 The representative coordinate calculation unit 160 adds the representative coordinates as new latest coordinates for each movement locus ID to the movement locus table 141. Further, the representative coordinate calculation unit 160 registers the time when the representative coordinates are added in the movement trajectory table 141. For example, the representative coordinate calculation unit 160 registers the time added to the representative coordinates of the movement locus ID: T1 as the last update time of the movement locus ID: T1 in the movement locus table 141.
  • Step S28 The classification unit 150 waits for a fixed time.
  • the classification unit 150 proceeds with the process to step S21 after waiting.
  • The fixed time is longer than the sampling period of the cameras 200, 201, and 202.
  • For example, the fixed time is at least twice the sampling period.
  • The plurality of converted coordinates classified by the classification unit 150 are coordinates based on images captured at a plurality of shooting times. Spatial outlier removal and temporal outlier removal are therefore performed simultaneously, so the movement trajectory table 141 does not include outliers. As a result, the information processing apparatus 100 can generate a highly accurate movement trajectory based on the movement trajectory table 141; that is, it can improve the accuracy of the movement trajectory.
  • the first representative coordinates of each movement trajectory ID may be calculated in any way.
  • In step S22, the converted coordinates obtained from the image coordinates of images captured between a predetermined time before the start of step S22 and the start of step S22 are acquired from the coordinate accumulation unit 130.
  • the predetermined time may coincide with the standby time that the classification unit 150 waits in step S28, or may be longer than the standby time.
  • In FIG. 7, the classification unit 150 adds a movement trajectory ID to each of the plurality of converted coordinates acquired in step S22. However, the classification unit 150 may add the movement trajectory ID of the nearest latest coordinate to the converted coordinate selected in step S23 only when the distance between the converted coordinate and that nearest latest coordinate is equal to or less than a threshold Th3 (also referred to as a third threshold).
  • This may be considered as follows: among the plurality of converted coordinates to which the same movement trajectory ID is added in step S24, the classification unit 150 leaves unchanged those whose distance from the latest coordinate of the movement trajectory ID is equal to or less than the threshold Th3, and removes the movement trajectory ID from those whose distance exceeds the threshold Th3.
  • In this case, the representative coordinate calculation unit 160 calculates the representative coordinates based on the plurality of converted coordinates whose distance from the latest coordinate of the movement trajectory ID is equal to or less than the threshold Th3. Thereby, the classification unit 150 can exclude converted coordinates that are far from any existing trajectory (see the sketch below).
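A sketch of this Th3 filtering, applied to the clusters produced by the classification sketch above; the value of Th3 is an assumption:

```python
import math

def filter_by_th3(clusters, latest_by_id, th3=1.5):
    """Keep only converted coordinates within Th3 of the trajectory's latest
    coordinate; coordinates beyond Th3 lose their movement trajectory ID."""
    return {tid: [c for c in coords if math.dist(c, latest_by_id[tid]) <= th3]
            for tid, coords in clusters.items()}
```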
  • the classification unit 150 may execute the processing described below.
  • When the distance between the converted coordinate selected in step S23 and the nearest latest coordinate exceeds a threshold Th1 (also referred to as a first threshold), the classification unit 150 does not add any movement trajectory ID to the converted coordinate selected in step S23.
  • That is, the converted coordinates to which no movement trajectory ID is added are coordinates whose distance from each of the latest coordinates of the movement trajectory IDs exceeds the threshold Th1.
  • the classification unit 150 extracts the converted coordinates to which the movement trajectory ID is not added from the plurality of converted coordinates acquired in Step S22.
  • the post-conversion coordinates to which the movement locus ID is not added are also referred to as first post-conversion coordinates.
  • This may be expressed as follows: the classification unit 150 extracts a plurality of first converted coordinates whose distance from each of the plurality of coordinates, consisting of the first coordinate and the plurality of third coordinates, exceeds the first threshold.
  • the classification unit 150 detects a plurality of features based on the converted coordinates to which the movement trajectory ID is not added.
  • the classification unit 150 classifies the converted coordinates, to which the movement trajectory ID is not added, for each feature.
  • the classification unit 150 extracts post-conversion coordinates (also referred to as second post-conversion coordinates) from the post-conversion coordinates to which the movement trajectory ID is not added based on the first feature among the plurality of features.
  • As the definition of the first feature, for example, a definition that the number of other converted coordinates existing within a predetermined range centered on the coordinate is equal to or greater than a threshold Th6 may be used.
  • When such converted coordinates are extracted, the classification unit 150 determines that a person or an object has been newly detected.
  • the representative coordinate calculating unit 160 calculates the representative coordinates based on the extracted converted coordinates.
  • the representative coordinate calculation unit 160 registers the representative coordinates in the movement trajectory table 141 as a position where a newly detected person or object is detected.
  • the representative coordinate calculation unit 160 adds a new movement trajectory ID to the newly detected person or object and registers the new movement trajectory ID in the movement trajectory table 141.
  • In this way, the classification unit 150 can detect the initial position of a new person or object for which tracking starts, as in the sketch below.
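The new-detection logic (threshold Th1 for "far from every existing trajectory", threshold Th6 for "dense enough to be a real detection") might be sketched as follows; Th1, Th6, and the density range are assumed values:

```python
import math

def find_new_targets(converted_coords, latest_coords, th1=2.0,
                     density_range=0.5, th6=3):
    """Extract converted coordinates farther than Th1 from every latest
    coordinate, then keep those with at least Th6 neighbors in range
    (the 'first feature'): candidates for a newly detected person or object."""
    unassigned = [c for c in converted_coords
                  if all(math.dist(c, latest) > th1 for latest in latest_coords)]
    return [c for c in unassigned
            if sum(1 for o in unassigned
                   if o is not c and math.dist(c, o) <= density_range) >= th6]
```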
  • FIG. 8 is a diagram (part 1) illustrating a specific example of the tracking process according to the first embodiment.
  • FIG. 8 shows a state in which the cameras 200 and 201 are installed on the ceiling of a room.
  • the photographing subject U1 and the photographing subject U2 are walking on the floor 300 in the room so as to pass each other.
  • FIG. 9 is a diagram (part 2) illustrating a specific example of the tracking process according to the first embodiment.
  • The situation information 400 shows the position of the subject U1 detected based on the image captured by the camera 200 at time T1 and the position of the subject U2 detected based on the image captured by the camera 201 at time T1.
  • The position of the subject detected based on the image captured by the camera 200 and the position of the subject detected based on the image captured by the camera 201 are converted coordinates that have been converted into common system coordinates.
  • An area 301 indicates a range that can be captured by the camera 200.
  • An area 302 indicates a range that can be captured by the camera 201.
  • An area 303 is an area captured by both the camera 200 and the camera 201.
  • An area captured by a plurality of cameras in this way is defined as an overlapping area. In FIG. 9, the shaded area is the overlapping area.
  • The situation information 401 shows the position of the subject U1 detected based on the image captured by the camera 200 at time T2 and the positions of the subjects U1 and U2 detected based on the image captured by the camera 201 at time T2.
  • The situation information 402 shows the positions of the subjects U1 and U2 detected based on the image captured by the camera 200 at time T3 and the positions of the subjects U1 and U2 detected based on the image captured by the camera 201 at time T3.
  • The situation information 403 shows the positions of the subjects U1 and U2 detected based on the image captured by the camera 200 at time T4 and the position of the subject U1 detected based on the image captured by the camera 201 at time T4.
  • The situation information 404 shows the position of the subject U2 detected based on the image captured by the camera 200 at time T5 and the position of the subject U1 detected based on the image captured by the camera 201 at time T5.
  • FIG. 9 shows a case where the coordinate conversion by the position detection unit 110 and the conversion unit 120 is ideally performed.
  • the positions of the subjects to be photographed detected based on the images photographed by the camera 200 and the camera 201 are substantially the same.
  • In this case, it is possible to estimate the positions of the subject U1 and the subject U2 with high accuracy. By generating a movement trajectory using positions estimated with high accuracy, a highly accurate movement trajectory can be generated.
  • FIG. 10 is a diagram (part 3) illustrating a specific example of the tracking process according to the first embodiment.
  • FIG. 10 differs from FIG. 9 in that the situation information 402 is changed to situation information 402a.
  • The situation information 402a shows the positions of the subjects U1 and U2 detected based on the image captured by the camera 200 at time T3' and the positions of the subjects U1 and U2 detected based on the image captured by the camera 201 at time T3'.
  • The situation information 402a differs from the situation information 402 in the position of the subject U1 detected based on the image captured by the camera 200 at time T3'.
  • The position of the subject U1 in the situation information 402a results from erroneous detection of the subject U1 by the position detection unit 110, or from causes such as insufficient accuracy of the conversion parameters of the conversion unit 120.
  • FIG. 10 illustrates an example in which the detection position of the subject U1 is greatly deviated at time T3 ′.
  • Generating the movement trajectory of the subject U1 using this greatly deviated detection position lowers the accuracy of the movement trajectory of the subject U1.
  • the first embodiment can improve the accuracy of the movement trajectory of the photographing subject U1 even when the detection position of the photographing subject U1 is greatly deviated. The reason will be described below.
  • The coordinate storage unit 130 stores the information of the situation information 401, 402a, and 403. That is, the coordinate storage unit 130 stores the positions of the subjects U1 and U2 detected based on the images captured by the cameras 200 and 201 at times T2, T3', and T4.
  • The classification unit 150 extracts, from the positions of the subjects U1 and U2 detected based on the images captured by the cameras 200 and 201 at times T2, T3', and T4, the positions closest to the position of the subject U1 at time T1.
  • Specifically, the classification unit 150 extracts the positions (that is, the converted coordinates) in the areas 501, 502, and 503 in FIG. 10.
  • the extracted position is a position that is predicted to have a relationship with the position of the subject U1 at time T1.
  • The representative coordinate calculation unit 160 determines the representative coordinates based on the positions (that is, the converted coordinates) in the areas 501, 502, and 503 in FIG. 10.
  • The position of the subject U1 in the situation information 402a is an outlier and is excluded when the representative coordinates are determined.
  • That is, the representative coordinates are determined from positions other than the position of the subject U1 in the situation information 402a.
  • The determined representative coordinates are determined as the position to which the subject U1 has moved from its position in the situation information 400.
  • the information processing apparatus 100 does not include the position of the subject U1 in the situation information 402a in the coordinates (that is, position information) indicating the movement trajectory of the subject U1.
  • the accuracy of the movement trajectory of the subject U1 can be increased.
  • Embodiment 2. Next, the second embodiment will be described. Items that differ from the first embodiment will mainly be described, and description of matters common to the first embodiment is omitted. In the description of the second embodiment, reference is made to the preceding figures.
  • FIG. 11 is a functional block diagram illustrating a configuration of the information processing apparatus according to the second embodiment.
  • the information processing apparatus 100a includes a classification unit 150a and a representative coordinate calculation unit 160a.
  • the functions of the classification unit 150a and the representative coordinate calculation unit 160a will be described in detail later.
  • A configuration in FIG. 11 that is the same as or corresponds to a configuration shown in FIG. 4 is assigned the same reference numeral as in FIG. 4.
  • FIG. 12 is a flowchart (part 1) illustrating the representative coordinate calculation process according to the second embodiment.
  • the process of FIG. 12 is started when the classification unit 150a receives a periodic trigger.
  • the periodic trigger is generated by the timer 180 and transmitted to the classification unit 150a.
  • The coordinate storage unit 130 stores a plurality of converted coordinates. In the description of the processing in FIG. 12, reference is made to the preceding figures.
  • Step S31 The classification unit 150a acquires the latest coordinates of each movement locus ID from the movement locus table 141.
  • In the second embodiment, current representative coordinate candidates are used.
  • A current representative coordinate candidate is information used in the processing from step S34 onward. Each of the plurality of current representative coordinate candidates is associated with a different movement trajectory ID.
  • Step S32 The classification unit 150a sets the latest coordinates of each movement locus ID as the current representative coordinate candidate of each movement locus ID. That is, the classification unit 150a sets the latest coordinates of the movement locus ID as the current representative coordinate candidate having the same movement locus ID as the movement locus ID. For example, the classification unit 150a sets the latest coordinate of the movement locus ID: T1 as the current representative coordinate candidate of the movement locus ID: T1. The latest coordinates of each movement locus ID are also referred to as fourth coordinates.
  • Step S33 The classification unit 150a acquires, from the coordinate storage unit 130, the converted coordinates obtained from images captured between a predetermined time before the start of step S33 and the start of step S33. That is, the classification unit 150a acquires a plurality of converted coordinates from the coordinate storage unit 130 based on the shooting time (that is, the detection time). Alternatively, the classification unit 150a may acquire all the converted coordinates stored in the coordinate accumulation unit 130.
  • Step S34 The classification unit 150a selects one converted coordinate from the plurality of converted coordinates acquired in Step S33.
  • Step S35 The classification unit 150a adds, to the converted coordinate selected in Step S34, the movement trajectory ID of the current representative coordinate candidate closest to that converted coordinate among the current representative coordinate candidates of each movement trajectory ID.
  • Step S36 The classification unit 150a determines whether or not all of the plurality of converted coordinates acquired in Step S33 have been selected. If all the plurality of converted coordinates acquired in step S33 are not selected (No in step S36), the classification unit 150a advances the process to step S34. When all the plurality of converted coordinates acquired in step S33 are selected (Yes in step S36), the classification unit 150a advances the process to step S37.
  • In this way, the classification unit 150a extracts, from the plurality of converted coordinates acquired in step S33, a plurality of coordinates predicted to have a relationship with each of the plurality of fourth coordinates (that is, with the latest coordinates of each movement trajectory ID).
  • Step S37 The representative coordinate calculation unit 160a calculates a representative coordinate for each movement trajectory ID added in step S35. That is, the representative coordinate calculation unit 160a calculates the representative coordinates based on a plurality of converted coordinates classified for each movement locus ID. The method for calculating the representative coordinates is the same as in step S26. Then, the representative coordinate calculation unit 160a advances the process to step S41.
  • FIG. 13 is a flowchart (part 2) illustrating the representative coordinate calculation process according to the second embodiment.
  • Step S41 The representative coordinate calculation unit 160a calculates, for each movement trajectory ID, the distance between the current representative coordinate candidate and the representative coordinate. For example, the representative coordinate calculation unit 160a calculates the distance between the current representative coordinate candidate of the movement trajectory ID: T1 and the representative coordinate of the movement trajectory ID: T1 calculated in step S37.
  • Step S42 The representative coordinate calculation unit 160a determines whether every distance calculated for each movement trajectory ID is equal to or less than a threshold Th4 (also referred to as a fourth threshold), or whether the number of repetitions exceeds a threshold Th5. For example, the representative coordinate calculation unit 160a advances the process to step S43 when at least one of the distances calculated in step S41 is longer than the threshold Th4.
  • Here, the number of repetitions is the number of times that the determination in step S42 has been No, the processing of step S43 and subsequent steps has been executed, and the determination of step S42 has been reached again. The first time the determination in step S42 is No counts as the first repetition.
  • If the condition is satisfied (Yes in step S42), the representative coordinate calculation unit 160a advances the process to step S44; if not (No in step S42), it advances the process to step S43.
  • Step S43 The representative coordinate calculation unit 160a sets the representative coordinates of each movement trajectory ID as the current representative coordinate candidates of each movement trajectory ID. That is, the representative coordinate calculation unit 160a sets the representative coordinate of a movement trajectory ID as the current representative coordinate candidate having the same movement trajectory ID. For example, the representative coordinate calculation unit 160a sets the representative coordinate of the movement trajectory ID: T1 calculated in step S37 as the current representative coordinate candidate of the movement trajectory ID: T1. Then, the representative coordinate calculation unit 160a advances the process to step S34.
  • Step S44 The representative coordinate calculation unit 160a adds the representative coordinates of each movement trajectory ID to the movement trajectory table 141 as the latest coordinates. Further, the representative coordinate calculation unit 160a registers the time when the representative coordinates were added in the movement trajectory table 141.
  • Step S45 The classification unit 150a waits for a predetermined time. The classification unit 150a advances the process to step S31 after waiting.
  • The information processing apparatus 100a converges to appropriate representative coordinates by repeating the processing until the condition in step S42 is satisfied. Therefore, it can be said that the representative coordinates registered in the movement trajectory table 141 represent the position of the imaging target with high accuracy.
  • the information processing apparatus 100a can generate a highly accurate movement locus by using the coordinates registered in the movement locus table 141.
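The loop of steps S34 to S43 behaves like a small k-means-style refinement: classify the converted coordinates against the current candidates, recompute one representative per trajectory, and stop once every representative moves by at most Th4 or the repetition count reaches Th5. The sketch below follows that reading, reusing the medoid calculation from Embodiment 1; Th4, Th5, and all names are assumptions.

```python
import math

def refine_representatives(converted_coords, latest_by_id, th4=0.1, th5=10):
    """Embodiment 2 (steps S31-S43): iterate classification and
    representative-coordinate calculation until convergence."""
    candidates = dict(latest_by_id)           # S32: seed with the latest coordinates
    for _ in range(th5):                      # repetition limit (threshold Th5)
        clusters = {tid: [] for tid in candidates}
        for c in converted_coords:            # S34-S36: label by nearest candidate
            tid = min(candidates, key=lambda t: math.dist(c, candidates[t]))
            clusters[tid].append(c)
        reps = {tid: (min(pts, key=lambda p: sum(math.dist(p, o) for o in pts))
                      if pts else candidates[tid])
                for tid, pts in clusters.items()}  # S37: medoid per trajectory
        if all(math.dist(reps[t], candidates[t]) <= th4 for t in reps):
            return reps                       # S41-S42: every shift within Th4
        candidates = reps                     # S43: representatives become candidates
    return candidates
```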
  • In the first and second embodiments, the cameras 200, 201, and 202 have been described as examples. However, the cameras 200, 201, and 202 may be sensors that can detect the relative position of the imaging target as two-dimensional coordinates from above, at least within the imaging target space.
  • For example, the sensor is an image sensor including an imaging element.
  • When the subject to be photographed is limited to a person, the image sensor may be an infrared image sensor or a thermal image sensor.
  • In such an image, a person region appears as a region having a higher temperature than the surroundings.
  • In the background difference process, the position detection unit 110 can extract a person region from the difference image between an infrared image captured when no person is present and an infrared image acquired from the infrared image sensor. Further, the position detection unit 110 can extract a circular area as a human head. Thereby, the position detection unit 110 can detect the position of a person.
  • the cameras 200, 201, and 202 may be ToF (Time of Flight) sensors.
  • The position detection unit 110 can detect the relative position of the imaging target as two-dimensional coordinates from above by comparing the depth map of the information acquired from the ToF sensor with that of the information when no imaging target is present.
  • A ToF sensor measures distance by measuring the time of flight of light, and the output of the sensor is obtained as a depth image from the center of the sensor.
  • Since a person region appears as a region of small depth in the depth image, the position detection unit 110 obtains a difference image by subtracting the depth image captured when no person is present (background difference), and extracts a region of small depth as a human head region.
  • Using a ToF sensor has the effect that detection of the imaging target is more stable than with an image sensor.
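A sketch of the depth-based extraction just described, assuming the ToF sensor delivers depth images as 2D arrays and using a simple background difference with an assumed threshold in meters:

```python
import numpy as np

def extract_head_position(depth, background, min_diff=0.3):
    """Subtract the current depth image from the no-person background image;
    pixels that became markedly shallower (smaller depth) are candidate
    head regions seen from above. Returns a crude (x, y) centroid."""
    diff = background - depth          # a person is closer to the ceiling sensor
    mask = diff > min_diff             # assumed threshold in meters
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (float(xs.mean()), float(ys.mean()))
```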
  • the cameras 200, 201, and 202 may be image sensors or ToF sensors.
  • the image sensor or ToF sensor is also referred to as a detection device, an imaging device, or an image generation device.

PCT/JP2018/043204 2018-02-08 2018-11-22 情報処理装置、追跡方法、及び追跡プログラム WO2019155727A1 (ja)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2019570309A JP6789421B2 (ja) 2018-02-08 2018-11-22 情報処理装置、追跡方法、及び追跡プログラム
CN201880088321.8A CN111670456B (zh) 2018-02-08 2018-11-22 信息处理装置、追踪方法和记录介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-020592 2018-02-08
JP2018020592 2018-02-08

Publications (1)

Publication Number Publication Date
WO2019155727A1 true WO2019155727A1 (ja) 2019-08-15

Country Status (3)

Country Link
JP (1) JP6789421B2 (zh)
CN (1) CN111670456B (zh)
WO (1) WO2019155727A1 (zh)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015087730A1 (ja) * 2013-12-10 2015-06-18 株式会社日立国際電気 監視システム
JP2017016356A (ja) * 2015-06-30 2017-01-19 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022514726A (ja) * 2019-11-15 2022-02-15 ベイジン センスタイム テクノロジー ディベロップメント カンパニー リミテッド 同行人を検出する方法および装置、システム、電子機器、記憶媒体及びコンピュータプログラム
CN112949375A (zh) * 2019-12-11 2021-06-11 株式会社东芝 计算***、计算方法及存储介质
JP2021093037A (ja) * 2019-12-11 2021-06-17 株式会社東芝 算出システム、算出方法、プログラム、及び記憶媒体
JP7516037B2 (ja) 2019-12-11 2024-07-16 株式会社東芝 算出システム、算出方法、プログラム、及び記憶媒体
WO2021210213A1 (ja) * 2020-04-13 2021-10-21 パナソニックIpマネジメント株式会社 移動体検知システム及び情報管理装置
WO2021240570A1 (ja) * 2020-05-25 2021-12-02 三菱電機株式会社 空調制御システム、コントローラおよび空調制御方法
JPWO2021240570A1 (zh) * 2020-05-25 2021-12-02
JP7459248B2 (ja) 2020-05-25 2024-04-01 三菱電機株式会社 空調制御システム、コントローラおよび空調制御方法

Also Published As

Publication number Publication date
JPWO2019155727A1 (ja) 2020-05-28
CN111670456A (zh) 2020-09-15
JP6789421B2 (ja) 2020-11-25
CN111670456B (zh) 2023-09-15

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18905147; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019570309; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18905147; Country of ref document: EP; Kind code of ref document: A1)