WO2019181047A1 - Information processing apparatus, person search system, place estimation method, and non-transitory computer-readable medium storing a program - Google Patents
- Publication number
- WO2019181047A1 (PCT/JP2018/039858)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- location
- presence information
- person
- time
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/215—Motion-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30242—Counting objects in image
Definitions
- the present invention relates to an information processing apparatus, a person search system, a place estimation method, and a program.
- Patent Document 1 discloses generating an existence probability map that represents a probability that an object appears in an image based on an input video image.
- Patent Document 2 discloses a technology that uses images captured by surveillance cameras to search, at an airport, for a passenger who is scheduled to board but has not yet boarded even though the departure time is approaching.
- The location of a search target person identified from a captured image of that person may differ from the person's actual current location. Even in such a case, it is required to be able to find the search target person.
- Patent Document 1 describes a technique for generating an existence probability map indicating the probability that an arbitrary person exists, and cannot be used for searching for a specific person such as a search target person.
- The technique of Patent Document 2 is premised on the assumption that a non-boarded passenger, once imaged, does not move, and therefore cannot sufficiently cope with a search in which the search target person moves.
- One of the objects to be achieved by the embodiments disclosed in the present specification is to provide an information processing apparatus, a person search system, a place estimation method, and a program capable of searching for a search target person who is expected to move.
- An information processing apparatus according to an aspect includes presence information acquisition means for acquiring presence information that indicates the location where a person was captured and that is generated based on images captured at each of a plurality of predetermined locations, and presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired by the presence information acquisition means. The presence information includes time information of the time of capture, and the presence location estimation means estimates the current location of the specific person based on the probability that the specific person is present at a predetermined location, calculated from the elapsed time since the time of capture.
- A person search system according to an aspect includes presence information generating means for generating, based on images captured at each predetermined location, presence information indicating the location of a person at the time of capture, and presence location estimating means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information. The presence information includes time information of the time of capture, and the presence location estimating means estimates the current location of the specific person based on the probability that the specific person is present at a predetermined location, calculated from the elapsed time since the time of capture.
- In a place estimation method according to an aspect, presence information indicating the location of a person at the time of capture, generated based on images captured at each of a plurality of predetermined locations, is acquired, and the current location of a specific person is estimated from among the plurality of predetermined locations based on the acquired presence information. The presence information includes time information of the time of capture, and in the estimation, the current location of the specific person is estimated based on the probability that the specific person is present at a predetermined location, calculated from the elapsed time since the time of capture.
- A program according to an aspect causes a computer to execute a presence information acquisition step of acquiring presence information that indicates the location where a person was captured and that is generated based on images captured at each of a plurality of predetermined locations, and a presence location estimation step of estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired in the presence information acquisition step. The presence information includes time information of the time of capture, and in the presence location estimation step, the current location of the specific person is estimated based on the probability that the specific person is present at a predetermined location, calculated from the elapsed time since the time of capture.
- According to the above aspects, it is possible to provide an information processing apparatus, a person search system, a place estimation method, and a program capable of searching for a search target person who is expected to move.
- FIG. 1 is a block diagram illustrating an example of the configuration of the information processing apparatus 1 according to the outline of the embodiment.
- the information processing apparatus 1 includes a presence information acquisition unit 2 and a presence location estimation unit 3.
- The presence information acquisition unit 2 acquires presence information generated based on images captured at each of a plurality of predetermined locations.
- the presence information is information indicating the location of the person at the time of shooting.
- the presence information includes time information at the time of shooting, that is, shooting time information.
- The acquisition method used by the presence information acquisition unit 2 is arbitrary. The presence information acquisition unit 2 may acquire, via a network, presence information generated by another device, or may read presence information stored in a storage device (not shown) of the information processing apparatus 1.
- the presence location estimation unit 3 estimates the current location of a specific person from the plurality of predetermined locations based on the presence information acquired by the presence information acquisition unit 2.
- The presence location estimation unit 3 estimates the current location of the specific person based on the probability that the specific person is present at a predetermined location, calculated from the elapsed time since the time of capture.
- the presence location estimation unit 3 calculates the presence probability of a specific person based on the elapsed time from the shooting time, and estimates the current location of the person. For this reason, it is possible to perform a search considering the movement of a person.
- FIG. 2 is a block diagram illustrating an example of a configuration of the person search system 10 according to the embodiment.
- the person search system 10 includes a plurality of cameras 100, a video analysis server group 250 including a plurality of video analysis servers 200, a control server 300, and a search reception server 400.
- the person search system 10 is a system for searching for a person by analyzing images taken by the camera 100 arranged at each of a plurality of predetermined locations.
- the camera 100 is arranged for each predetermined place, and photographs the surroundings of the arranged place.
- Each of the cameras 100 is connected to the video analysis server group 250 via a wired or wireless network, and each camera 100 transmits captured video data to the video analysis server group 250.
- the camera 100 transmits the captured video data to one of the video analysis servers 200 in the video analysis server group 250 in accordance with an instruction from the video acquisition control unit 201 described later.
- the video data includes shooting time information.
- the installation location of the camera 100 is arbitrary.
- the installation location of the camera 100 may be an airport, a harbor, a theme park, a shopping center, a stadium, or the like.
- the installation locations of the camera 100 are not limited to one building but may be scattered in a plurality of buildings.
- a forward path is predetermined for these installation locations. Therefore, the person moves along this forward path.
- the video analysis server group 250 includes a plurality of video analysis servers 200.
- the video analysis server 200 is a server that performs image recognition processing on the video data transmitted by the camera 100 and recognizes a person in the shooting region of the camera 100 (that is, around the installation location of the camera 100). In the present embodiment, the video analysis server 200 recognizes a person through recognition processing for the person's face, but may recognize the person using other appearance features of the person.
- the video analysis server 200 is connected to the control server 300 via a wired or wireless network, and transmits the analysis processing result to the control server 300.
- the search reception server 400 is a server that receives a search request from a client terminal (not shown).
- the search reception server 400 receives a search request including information specifying a person to be searched (searched) (hereinafter referred to as a search target person).
- The search target person may be, for example, a scheduled passenger whose whereabouts are unknown. Needless to say, the search target person is not limited to such a passenger and may be any person.
- the search reception server 400 is connected to the control server 300 via a wired or wireless network, and transmits search target person information to the control server 300.
- the control server 300 is a server that estimates the current location of the search target person based on the analysis processing result of the video analysis server group 250. In addition, the control server 300 changes the system configuration or setting based on the estimation result so that the video analysis server group 250 can focus on video analysis of a place where a search target person is likely to be present.
- FIG. 3 is a block diagram illustrating an example of the hardware configuration of the video analysis server 200, the control server 300, and the search reception server 400.
- the video analysis server 200, the control server 300, and the search reception server 400 each include, for example, a network interface 50, a memory 51, and a processor 52.
- the network interface 50 is used to communicate with other devices.
- the network interface 50 may include, for example, a network interface card (NIC).
- the memory 51 is constituted by a combination of a volatile memory and a nonvolatile memory, for example.
- the video analysis server 200, the control server 300, and the search reception server 400 may have a storage device such as a hard disk in addition to the memory 51.
- the memory 51 is used for storing software (computer program) including one or more instructions executed by the processor 52.
- This program can be stored using various types of non-transitory computer readable media and supplied to a computer.
- Non-transitory computer-readable media include various types of tangible storage media. Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM, CD-R, CD-R/W, and semiconductor memories (e.g., mask ROM, programmable ROM (PROM), erasable PROM (EPROM), flash ROM, random access memory (RAM)).
- the program may also be supplied to the computer by various types of transitory computer readable media.
- Examples of transitory computer readable media include electrical signals, optical signals, and electromagnetic waves.
- A transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or an optical fiber, or via a wireless communication path.
- the processor 52 reads out and executes software (computer program) from the memory 51, thereby performing processing of the video analysis server 200, processing of the control server 300, or processing of the search reception server 400 described later.
- the video analysis server 200, the control server 300, and the search reception server 400 have a function as a computer.
- the processor 52 may be, for example, a microprocessor, an MPU (Micro Processor Unit), or a CPU (Central Processing Unit).
- the processor 52 may include a plurality of processors.
- the memory 51 or the storage device may be used as the presence information storage unit 302 described later.
- the camera 100 also has a hardware configuration similar to that shown in FIG. 3, and has a function as a computer.
- FIG. 4 is a block diagram illustrating an example of functional configurations of the video analysis server 200, the control server 300, and the search reception server 400.
- the video analysis server 200 includes a video acquisition control unit 201, a presence information generation unit 202, and a presence information output unit 203.
- the video acquisition control unit 201 acquires video data from one or more cameras 100.
- the data acquired by the video acquisition control unit 201 includes a captured image of the camera 100 and shooting time information.
- The video analysis server 200 analyzes the video data of the cameras 100 designated by the control unit 305 of the control server 300. For this reason, the video acquisition control unit 201 acquires video data from the cameras 100 specified by the control unit 305.
- the video acquisition control unit 201 requests video data from the camera 100 specified by an instruction from the control unit 305 of the control server 300.
- the camera 100 transmits the captured video data to the video analysis server 200 requested by the video acquisition control unit 201.
- the presence information generation unit 202 generates presence information based on the captured image for each predetermined place.
- the presence information is information indicating the location of the person at the time of shooting.
- The presence information generation unit 202 generates presence information by performing real-time analysis of a captured image through image recognition processing and identifying a face included in the captured image. Specifically, for example, the presence information generation unit 202 generates presence information by matching a predetermined feature amount (a feature amount of a face image) representing the appearance of a person included in an image captured by the camera 100 against a predetermined feature amount representing the appearance of a specific person.
- When the two feature amounts match, the presence information generation unit 202 generates presence information indicating that the specific person was present, at the time the image was captured, at a location within the shooting range of that camera 100. Therefore, the presence information includes time information of the time of capture (that is, shooting time information).
- presence information is used for searching for a person, but it may also be used as entrance / exit information by face authentication for a predetermined place in an airport or the like, for example.
- the presence information is information that specifies who, when, and where.
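The matching and record-generation step described above can be sketched as follows. This is a minimal illustration only: the similarity measure (cosine similarity over face embeddings), the threshold value, and all function names are hypothetical and are not specified in the publication.

```python
import math

MATCH_THRESHOLD = 0.8  # hypothetical similarity threshold, not from the publication

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def make_presence_record(person_id, person_embedding, detected_embedding,
                         camera_location, shot_time):
    """Return a (who, when, where) presence record if the detected face
    matches the known feature amount of person_id, else None."""
    if cosine_similarity(person_embedding, detected_embedding) >= MATCH_THRESHOLD:
        return {"person": person_id, "time": shot_time, "place": camera_location}
    return None
```

The record deliberately carries exactly the three elements the text names: who (person), when (shooting time), and where (camera installation location).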
- the presence information output unit 203 outputs the presence information generated by the presence information generation unit 202 to the control server 300.
- the search reception server 400 includes a search request reception unit 401 and a search target person information output unit 402.
- the search request receiving unit 401 receives a search request.
- the search request is information including information specifying a search target person.
- the information specifying the search target person includes, for example, identification information (hereinafter referred to as search target person information) such as the name, ID, and passport information of the search target person.
- the user designates a search target person by operating a client terminal that is connected to the search reception server 400 via a wired or wireless network.
- a search request is transmitted from the client terminal to the search reception server 400.
- the search target person information output unit 402 outputs the search target person information specified in the search request received by the search request reception unit 401 to the control server 300.
- the control server 300 includes a presence information acquisition unit 301, a presence information storage unit 302, a search target person information acquisition unit 303, a presence location estimation unit 304, a control unit 305, and a search result output. Part 306.
- the presence information acquisition unit 301 acquires the presence information generated based on the captured images for each of a plurality of predetermined locations.
- the presence information acquisition unit 301 acquires the presence information transmitted by the video analysis server 200.
- the presence information acquisition unit 301 stores the acquired presence information in the presence information storage unit 302.
- the presence information storage unit 302 is a storage area for storing presence information. Presence information is sequentially accumulated in the presence information storage unit 302 by real-time analysis of video data from each camera 100.
- the search target person information acquisition unit 303 acquires the search target person information transmitted from the search reception server 400.
- The presence location estimation unit 304 estimates the current location of a specific person from among the plurality of predetermined locations where the cameras 100 are installed, based on the presence information acquired by the presence information acquisition unit 301. Specifically, the presence location estimation unit 304 first searches the presence information storage unit 302 for the latest presence information about the search target person specified by the search target person information. Next, the presence location estimation unit 304 estimates the current location of the specific person (that is, the search target person) based on the existence probability calculated from the elapsed time since the shooting time indicated by the retrieved presence information.
- the presence location estimation unit 304 estimates the current location based on the presence probability calculated based on the elapsed time from the shooting time of the captured image used for generating the latest presence information about the specific person.
- the existence probability is a probability that the specific person exists in a predetermined place where the camera 100 is installed.
- The presence location estimation unit 304 calculates the existence probability using a monotonic function with the elapsed time as a variable.
- For example, a monotonically increasing function f(t) expressed by the following formula (1) is used.
- T_XYi is the time expected for a person who appeared at place X to appear at place Yi (hereinafter referred to as the scheduled travel time), and is a value set in advance based on statistics of people who move from place X to place Yi.
- the place X is an existing place indicated by the latest presence information on the search target person among the places where the camera 100 is installed, and is also referred to as a first place.
- the place Yi is a place that can be reached from the place X among the places where the camera 100 is installed, and is also referred to as a second place. Note that there are n places (n is an integer of 1 or more) that can be reached from the place X. That is, “i” of Yi is an index number of 1 or more and n or less, and the places that can be reached from the place X are n places in total from the places Y1 to Yn.
- Using the function f(t), the presence location estimation unit 304 calculates, according to the following equation (2), the existence probability p_Yi(t) that the search target person is currently at place Yi. The presence location estimation unit 304 also calculates, by the following equation (3), the existence probability p_X(t) that the search target person is currently still at place X.
- P_XYi is the probability that a person leaving place X moves to place Yi. That is, P_XYi is the movement probability from place X to each place Yi.
- The movement probability P_XYi is a value set in advance based on statistics of people who move from place X to place Yi.
- In this way, the presence location estimation unit 304 estimates the current location using the predetermined movement probability for each place Yi reachable from place X together with the existence probability calculated from the elapsed time t. By using the movement probability, the existence probability for each destination can be calculated even when there are a plurality of destinations from place X.
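The calculation above can be sketched as follows. Equations (1) to (3) themselves are not reproduced in this text, so this sketch assumes a simple linear ramp f(t) = min(t / T_XYi, 1) as the monotonically increasing function, p_Yi(t) = P_XYi · f(t), and p_X(t) = 1 − Σ p_Yi(t); the actual equations in the publication may differ.

```python
def f(t, scheduled_travel_time):
    """Hypothetical monotonically increasing function of elapsed time t:
    ramps linearly from 0 to 1 over the scheduled travel time T_XYi."""
    return min(t / scheduled_travel_time, 1.0)

def existence_probabilities(t, transitions):
    """Existence probability for each place Yi reachable from place X,
    plus the probability of still being at X.
    transitions: {place_Yi: (movement probability P_XYi, scheduled time T_XYi)}"""
    p = {yi: p_xyi * f(t, t_xyi) for yi, (p_xyi, t_xyi) in transitions.items()}
    p["X"] = 1.0 - sum(p.values())  # remaining mass stays at the first place
    return p
```

With this form, the probability mass drains out of place X toward each destination as the elapsed time approaches each scheduled travel time, which matches the qualitative behavior the text describes.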
- FIG. 5 is a schematic diagram showing an application example of the person search system 10 according to the present embodiment.
- FIG. 5 shows an example of searching for a person at an airport.
- Cameras 100 are installed at a boarding procedure area (referred to as place A), an immigration area (place B), a duty-free shop (place C), a lounge (place D), and a boarding gate (place E).
- The forward route is determined in advance. Specifically, a person who arrives at place A then moves to place B; the movement probability P_AB from place A to place B is set in advance to 100%, and the scheduled travel time T_AB to 5 minutes. A person who arrives at place B then moves to place C, D, or E. The movement probability P_BC from place B to place C is set in advance to 10%, with a scheduled travel time T_BC of 1 minute, and the movement probability P_BD from place B to place D is set in advance to 40%, with a scheduled travel time T_BD of 5 minutes.
- A person who arrives at place C then moves to place E. The movement probability P_CE from place C to place E is set in advance to 100%, and the scheduled travel time T_CE to 30 minutes.
- A person who arrives at place D then moves to place E. The movement probability P_DE from place D to place E is set in advance to 100%, and the scheduled travel time T_DE to 60 minutes.
- For example, when the latest presence information for the search target person indicates place B, the presence location estimation unit 304 estimates the current location as follows. That is, the presence location estimation unit 304 sets place B as place X and places C, D, and E as places Yi, calculates the search target person's existence probabilities at places B, C, D, and E as described above, and estimates the current location.
- The presence location estimation unit 304 may estimate the current location of the search target person from the existence probabilities using an arbitrary method. For example, the presence location estimation unit 304 may estimate, as the current location of the search target person, a place whose existence probability exceeds a predetermined threshold, or the place where the existence probability is largest.
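The two decision rules just mentioned (threshold and maximum) can be sketched as follows; the function name and default behavior are illustrative, not part of the publication.

```python
def estimate_current_location(probabilities, threshold=None):
    """Pick the search target person's estimated current location(s).
    probabilities: {place: existence probability}.
    With a threshold, return every place exceeding it; otherwise return
    the single place with the maximum existence probability."""
    if threshold is not None:
        return [place for place, p in probabilities.items() if p > threshold]
    return max(probabilities, key=probabilities.get)
```

In the airport example, a threshold rule can yield several candidate places to monitor, while the maximum rule always yields exactly one.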
- the control unit 305 performs control to change the configuration or setting related to the generation of the presence information in accordance with the estimation result by the presence location estimation unit 304. For example, the control unit 305 changes the configuration so that resources corresponding to the presence probability of the location are allocated to the process for generating the presence information for the captured image of the location where the camera 100 is installed. Specifically, for example, the control unit 305 controls to change the number of cameras 100 that the video analysis server 200 is in charge of.
- FIG. 6 is a schematic diagram illustrating the configuration change by the control unit 305. In FIG. 6, the left side shows the correspondence between the camera 100 before the configuration change by the control unit 305 and the video analysis server 200, and the right side shows the camera 100 and the video analysis server 200 after the configuration change by the control unit 305.
- Before the change, each video analysis server 200 is in charge of analyzing the video data of two cameras 100.
- The control unit 305 changes the configuration to focus on analyzing the video data of the camera 100 at place D (camera D) and the camera 100 at place E (camera E). That is, the control unit 305 changes the configuration so that the video analysis server 200 that processes the video data of camera D (video analysis server H) can concentrate its resources on analyzing the video data of camera D (see the right side of FIG. 6).
- Similarly, the control unit 305 changes the configuration so that the video analysis server 200 that processes the video data of camera E (video analysis server I) can concentrate its resources on analyzing the video data of camera E (see the right side of FIG. 6).
- Furthermore, the control unit 305 reallocates to video analysis server G the analysis of the video data of camera C, which video analysis server H had been handling, and the analysis of the video data of camera F, which video analysis server I had been handling. This makes it possible to search for the search target person while focusing on the areas where the search target person is likely to be.
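The reassignment of cameras to analysis servers can be sketched as follows. This is a simplified illustration of the FIG. 6 change only; the data shapes and the choice of a single spare server are assumptions, not the publication's actual mechanism.

```python
def reallocate(assignments, focus_cameras, spare_server):
    """Rebalance camera-to-server assignments so that each server handling
    a focus camera handles only that camera; its other cameras move to the
    spare server. assignments: {server: [cameras]}."""
    new = {server: list(cams) for server, cams in assignments.items()}
    for server, cams in assignments.items():
        for cam in cams:
            if cam in focus_cameras and len(new[server]) > 1:
                moved = [c for c in new[server] if c != cam]
                new[server] = [cam]                      # dedicate this server
                new[spare_server] = new.get(spare_server, []) + moved
    return new
```

Applied to the FIG. 6 scenario, servers H and I each end up dedicated to cameras D and E respectively, while server G absorbs the displaced cameras C and F.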
- The control unit 305 may also change the settings regarding the image recognition processing performance for generating presence information from images captured at the installation locations of the cameras 100, according to the estimation result of the presence location estimation unit 304.
- the control unit 305 changes the setting so as to improve the performance of the image recognition processing performed when the video data of the camera D is analyzed.
- the control unit 305 changes the setting so as to improve the performance of the image recognition processing performed when the video data of the camera E is analyzed.
- The setting to be changed may be, for example, the frame rate (FPS: frames per second) of the video to be analyzed, or the number of face images that can be detected within one frame.
- Conversely, the control unit 305 may change the settings so as to reduce the performance of the image recognition processing for the video data of each camera handled by video analysis server G, which has been changed to take charge of four cameras, so that video analysis server G does not run short of resources.
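One way to express such a setting policy is a per-camera analysis frame rate that rises with the estimated existence probability and falls with the load on the server handling the camera. The baseline rate and the scaling rule below are hypothetical; the publication does not specify them.

```python
BASE_FPS = 10  # hypothetical baseline analysis frame rate

def analysis_fps(presence_probability, cameras_on_server):
    """Raise the analysis frame rate where the search target person is
    likely to be, and lower it on servers handling many cameras
    (hypothetical policy)."""
    fps = BASE_FPS * (1 + presence_probability)    # boost likely places
    return max(1, round(fps / cameras_on_server))  # share server capacity
```

A dedicated server analyzing a high-probability camera thus runs at a higher frame rate than a server spread across four low-probability cameras.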
- The control unit 305 instructs the video analysis server 200 to make these configuration or setting changes.
- The video acquisition control unit 201 of the video analysis server 200 acquires the video data of the cameras 100 designated by the control unit 305 of the control server 300.
- The presence information generation unit 202 of the video analysis server 200 performs analysis processing with the settings instructed by the control unit 305 of the control server 300.
- The search result output unit 306 searches for presence information indicating the current location of the search target person specified by the search target person information. For example, the search result output unit 306 searches for presence information whose shooting time is close to the current time, in other words, presence information for which the difference between the current time and the shooting time is within a predetermined period. In this way, the search result output unit 306 obtains presence information about the search target person that was generated in the video analysis server 200 after the changes made by the control unit 305 and accumulated in the presence information storage unit 302. The search result output unit 306 outputs the obtained presence information as the current location of the search target person. The search result output unit 306 may output the obtained presence information to the search reception server 400, or to a device such as a client terminal. The presence information output by the search result output unit 306 can be presented to the user as a search result by any method.
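The recency check described above (presence information whose shooting time is within a predetermined period of the current time) might look like the following sketch; the record layout, the function name, and the 5-minute default window are assumptions for illustration.

```python
from datetime import datetime, timedelta

def latest_recent_presence(records, person_id, now, window=timedelta(minutes=5)):
    """Return the newest presence record for `person_id` whose shooting time
    lies within `window` of `now`, or None if there is none."""
    recent = [r for r in records
              if r["person"] == person_id
              and timedelta(0) <= now - r["time"] <= window]
    return max(recent, key=lambda r: r["time"], default=None)
```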
- FIG. 7 is a sequence chart showing an operation example of the person search system 10. First, the flow of normal operation will be described.
- In step 100, the video acquisition control unit 201 of the video analysis server 200 acquires the video data transmitted from the camera 100.
- In step 101, the presence information generation unit 202 of the video analysis server 200 analyzes the video data and generates presence information for various persons.
- In step 102, the presence information acquisition unit 301 of the control server 300 acquires the presence information output by the presence information output unit 203 of the video analysis server 200, and stores it in the presence information storage unit 302. Presence information is generated sequentially and accumulated in the presence information storage unit 302.
- In step 200, the search request receiving unit 401 of the search reception server 400 receives a search request from a user.
- In step 201, based on the search request, the search target person information output unit 402 of the search reception server 400 transmits the search target person information to the control server 300, and the search target person information acquisition unit 303 of the control server 300 acquires it.
- In step 202, the presence location estimation unit 304 of the control server 300 estimates the current location of the search target person based on the presence information accumulated in step 102.
- In step 203, the control unit 305 of the control server 300 decides on changes to the system configuration and settings according to the estimation result of step 202.
- In step 204, the control unit 305 of the control server 300 instructs the video analysis server 200 to make the changes.
- In step 205, the video acquisition control unit 201 instructs each camera 100 of the transmission destination for its video data, in accordance with the instruction notified from the control server 300 in step 204. That is, the video acquisition control unit 201 notifies the cameras 100 it is in charge of so that their video data reaches its own server.
- Each camera 100 then transmits its video data to the designated video analysis server 200.
- Thereafter, the presence information generation unit 202 performs analysis to search for the search target person. In this way, search processing can be focused on locations where the search target person is likely to be.
- When presence information for the search target person is generated, the search result output unit 306 of the control server 300 obtains it from the presence information storage unit 302 and outputs it as a search result.
- The embodiment has been described above.
- In the present embodiment, the current location of the search target person is estimated based on the presence probability calculated from the elapsed time since the shooting time. This makes it possible to find a search target person who is expected to have moved.
- Furthermore, the configuration or settings of the system are changed according to the estimated location of the search target person. Processing can therefore focus on video analysis of the places where the search target person is more likely to be found, so that limited computational resources are used efficiently.
- Note that the present invention is not limited to the above-described embodiment and may be modified as appropriate without departing from its spirit.
- For example, in the above embodiment the person search system 10 has a plurality of video analysis servers 200, but the number of video analysis servers 200 may be one.
- In that case, the control unit 305 of the control server 300 changes the settings so that the resources within the single video analysis server 200 are allocated to the analysis processing for each camera 100 according to the estimation result of the presence location estimation unit 304.
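In the single-server case, allocating resources according to presence probability could be sketched as a probability-weighted split of an analysis budget. The FPS budget, the proportional rule, and the per-camera floor below are assumptions for illustration, not details taken from the patent.

```python
# Illustrative sketch: divide a total analysis budget (frames per second)
# across cameras in proportion to the estimated presence probability at each
# camera's location, guaranteeing every camera a minimum rate.
def allocate_fps(presence_probs, total_fps_budget, min_fps=1):
    spare = total_fps_budget - min_fps * len(presence_probs)
    total_p = sum(presence_probs.values())
    return {cam: min_fps + round(spare * p / total_p)
            for cam, p in presence_probs.items()}
```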
- Presence information acquisition means for acquiring presence information that is generated based on captured images at a plurality of predetermined locations and indicates the location of a person at the time of shooting;
- presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired by the presence information acquisition means,
- wherein the presence information includes time information of the shooting time, and
- the presence location estimation means estimates the current location of the specific person based on the presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
- The presence location estimation means estimates the current location of the specific person based on the presence probability calculated from a predetermined movement probability from a first location for each second location reachable from the first location and the elapsed time since the shooting time. The first location is, among the plurality of predetermined locations, the location of the specific person at the shooting time, and the second location is, among the plurality of predetermined locations, a location reachable from the first location (the information processing apparatus according to Appendix 1). (Appendix 3) The information processing apparatus according to Appendix 1 or 2 further comprises control means for changing a configuration or setting relating to the generation of the presence information according to the estimation result of the presence location estimation means.
- In the information processing apparatus, the setting relating to image recognition processing performance is a setting of the number of face images that can be detected within one frame.
- Presence information generation means for generating, based on captured images for each predetermined location, presence information indicating the location of a person at the time of shooting;
- presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information,
- wherein the presence information includes time information of the shooting time, and
- the presence location estimation means estimates the current location of the specific person based on the presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
- In the person search system according to Appendix 8, the presence information generation means generates the presence information by identifying a face included in a captured image.
- In the place estimation method, presence information that is generated based on captured images at a plurality of predetermined locations and indicates the location of a person at the time of shooting is acquired, and the current location of a specific person is estimated from among the plurality of predetermined locations based on the acquired presence information.
- The presence information includes time information of the shooting time, and in the estimation of the location, the current location of the specific person is estimated based on the presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
- The presence information includes time information of the shooting time, and
- a non-transitory computer readable medium stores a program for estimating the current location of the specific person based on the presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
Description
Prior to the detailed description of the embodiment, an overview of the embodiment will be given. FIG. 1 is a block diagram showing an example of the configuration of an information processing apparatus 1 according to the overview of the embodiment. As shown in FIG. 1, the information processing apparatus 1 includes a presence information acquisition unit 2 and a presence location estimation unit 3.
Next, the embodiment will be described in detail. FIG. 2 is a block diagram showing an example of the configuration of a person search system 10 according to the embodiment. As shown in FIG. 2, the person search system 10 includes a plurality of cameras 100, a video analysis server group 250 including a plurality of video analysis servers 200, a control server 300, and a search reception server 400. The person search system 10 searches for a person by analyzing images captured by the cameras 100 placed at each of a plurality of predetermined locations.
The program can be stored and supplied to a computer using various types of non-transitory computer readable media. Non-transitory computer readable media include various types of tangible storage media. Examples of non-transitory computer readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), Compact Disc Read Only Memory (CD-ROM), CD-R, CD-R/W, and semiconductor memory (e.g., mask ROM, Programmable ROM (PROM), Erasable PROM (EPROM), flash ROM, Random Access Memory (RAM)). The program may also be supplied to the computer by various types of transitory computer readable media. Examples of transitory computer readable media include electric signals, optical signals, and electromagnetic waves. A transitory computer readable medium can supply the program to a computer via a wired communication path such as an electric wire or optical fiber, or via a wireless communication path.
The presence information output unit 203 outputs the presence information generated by the presence information generation unit 202 to the control server 300.
The search target person information output unit 402 outputs, to the control server 300, the search target person information specified in the search request received by the search request receiving unit 401.
The presence information storage unit 302 is a storage area for presence information. Presence information is sequentially accumulated in the presence information storage unit 302 through real-time analysis of the video data from each camera 100.
The search target person information acquisition unit 303 acquires the search target person information transmitted from the search reception server 400.
A person who has arrived at location B is then expected to move to location C, D, or E. The movement probability PBC from location B to location C is preset to 10%, and the expected travel time TBC is preset to 1 minute. The movement probability PBD from location B to location D is preset to 40%, and the expected travel time TBD is preset to 5 minutes. The movement probability PBE from location B to location E is preset to 50%, and the expected travel time TBE is preset to 1 minute.
Further, a person who has arrived at location C is then expected to move to location E. The movement probability PCE from location C to location E is preset to 100%, and the expected travel time TCE is preset to 30 minutes. Similarly, a person who has arrived at location D is then expected to move to location E. The movement probability PDE from location D to location E is preset to 100%, and the expected travel time TDE is preset to 60 minutes.
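As an illustration of this movement model, the transition table below encodes the example probabilities and expected travel times given above (PBC = 10%, TBC = 1 minute, and so on). The function name, the data layout, and the simple rule that a destination becomes a candidate once its expected travel time has elapsed are assumptions for the sketch, not details taken from the patent.

```python
# Hypothetical sketch of the presence-probability estimation described above.
# TRANSITIONS maps a location to its reachable destinations, each with a
# (movement probability, expected travel time in minutes) pair.
TRANSITIONS = {
    "B": {"C": (0.10, 1), "D": (0.40, 5), "E": (0.50, 1)},
    "C": {"E": (1.00, 30)},
    "D": {"E": (1.00, 60)},
}

def estimate_current_location(last_seen_at, elapsed_minutes):
    """Return candidate current locations ranked by presence probability."""
    candidates = {}
    for dest, (prob, travel_min) in TRANSITIONS.get(last_seen_at, {}).items():
        if elapsed_minutes >= travel_min:
            # Enough time has passed for the person to have reached `dest`.
            candidates[dest] = prob
    if not candidates:
        # Too little time to reach any destination: assume the person is
        # still at the place where they were last photographed.
        candidates[last_seen_at] = 1.0
    return sorted(candidates.items(), key=lambda kv: kv[1], reverse=True)
```

For example, two minutes after a person is photographed at location B, locations E and C are reachable but D is not, so E (50%) ranks above C (10%).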
First, the flow of normal operation will be described.
Next, in step 101 (S101), the presence information generation unit 202 of the video analysis server 200 analyzes the video data and generates presence information for various persons.
Next, in step 102 (S102), the presence information acquisition unit 301 of the control server 300 acquires the presence information output by the presence information output unit 203 of the video analysis server 200. The presence information acquisition unit 301 then stores the acquired presence information in the presence information storage unit 302.
Presence information is generated sequentially and accumulated in the presence information storage unit 302.
In step 200 (S200), the search request receiving unit 401 of the search reception server 400 receives a search request from a user.
Next, in step 201 (S201), based on the search request, the search target person information output unit 402 of the search reception server 400 transmits the search target person information to the control server 300, and the search target person information acquisition unit 303 of the control server 300 acquires it.
Next, in step 202 (S202), the presence location estimation unit 304 of the control server 300 estimates the current location of the search target person based on the presence information accumulated in step 102.
Next, in step 203 (S203), the control unit 305 of the control server 300 decides on changes to the system configuration and settings according to the estimation result of step 202.
Next, in step 204 (S204), the control unit 305 of the control server 300 instructs the video analysis server 200 to make the changes.
In step 205 (S205), the video acquisition control unit 201 instructs each camera 100 of the transmission destination for its video data, in accordance with the instruction notified from the control server 300 in step 204. That is, the video acquisition control unit 201 notifies the cameras 100 it is in charge of so that their video data reaches its own server. Each camera 100 transmits its video data to the designated video analysis server 200. Thereafter, under the configuration and settings changed by the control unit 305, the presence information generation unit 202 performs analysis to search for the search target person. In this way, search processing can be focused on locations where the search target person is likely to be.
When presence information for the search target person is generated in any of the video analysis servers 200, the search result output unit 306 of the control server 300 obtains it from the presence information storage unit 302 and outputs it as a search result.
An information processing apparatus comprising:
presence information acquisition means for acquiring presence information that is generated based on captured images at each of a plurality of predetermined locations and indicates the location of a person at the time of shooting; and
presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired by the presence information acquisition means,
wherein the presence information includes time information of the shooting time, and
the presence location estimation means estimates the current location of the specific person based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
(Supplementary note 2)
The information processing apparatus according to Supplementary note 1, wherein the presence location estimation means estimates the current location of the specific person based on the presence probability calculated from a predetermined movement probability from a first location for each second location reachable from the first location and the elapsed time since the shooting time,
the first location is, among the plurality of predetermined locations, the location of the specific person at the shooting time, and
the second location is, among the plurality of predetermined locations, a location reachable from the first location.
(Supplementary note 3)
The information processing apparatus according to Supplementary note 1 or 2, further comprising control means for changing a configuration or setting relating to generation of the presence information according to an estimation result of the presence location estimation means.
(Supplementary note 4)
The information processing apparatus according to Supplementary note 3, wherein the control means allocates resources to the processing for generating the presence information from captured images of a predetermined location according to the magnitude of the presence probability at that location.
(Supplementary note 5)
The information processing apparatus according to Supplementary note 3 or 4, wherein the control means changes a setting relating to image recognition processing performance for generating the presence information from captured images of the predetermined location according to the estimation result of the presence location estimation means.
(Supplementary note 6)
The information processing apparatus according to Supplementary note 5, wherein the setting relating to image recognition processing performance is a setting of the frame rate of the images to be analyzed.
(Supplementary note 7)
The information processing apparatus according to Supplementary note 5, wherein the setting relating to image recognition processing performance is a setting of the number of face images detectable within one frame.
(Supplementary note 8)
A person search system comprising:
presence information generation means for generating, based on captured images for each predetermined location, presence information indicating the location of a person at the time of shooting; and
presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information,
wherein the presence information includes time information of the shooting time, and
the presence location estimation means estimates the current location of the specific person based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
(Supplementary note 9)
The person search system according to Supplementary note 8, wherein the presence information generation means generates the presence information by identifying a face included in a captured image.
(Supplementary note 10)
A place estimation method comprising:
acquiring presence information that is generated based on captured images at each of a plurality of predetermined locations and indicates the location of a person at the time of shooting; and
estimating the current location of a specific person from among the plurality of predetermined locations based on the acquired presence information,
wherein the presence information includes time information of the shooting time, and
in the estimation of the location, the current location of the specific person is estimated based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
(Supplementary note 11)
A non-transitory computer readable medium storing a program that causes a computer to execute:
a presence information acquisition step of acquiring presence information that is generated based on captured images at each of a plurality of predetermined locations and indicates the location of a person at the time of shooting; and
a presence location estimation step of estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired in the presence information acquisition step,
wherein the presence information includes time information of the shooting time, and
in the presence location estimation step, the current location of the specific person is estimated based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
2 Presence information acquisition unit
3 Presence location estimation unit
10 Person search system
50 Network interface
51 Memory
52 Processor
100 Camera
200 Video analysis server
201 Video acquisition control unit
202 Presence information generation unit
203 Presence information output unit
250 Video analysis server group
300 Control server
301 Presence information acquisition unit
302 Presence information storage unit
303 Search target person information acquisition unit
304 Presence location estimation unit
305 Control unit
306 Search result output unit
400 Search reception server
401 Search request receiving unit
402 Search target person information output unit
Claims (11)
- An information processing apparatus comprising:
presence information acquisition means for acquiring presence information that is generated based on captured images at each of a plurality of predetermined locations and indicates the location of a person at the time of shooting; and
presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired by the presence information acquisition means,
wherein the presence information includes time information of the shooting time, and
the presence location estimation means estimates the current location of the specific person based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
- The information processing apparatus according to claim 1, wherein the presence location estimation means estimates the current location of the specific person based on the presence probability calculated from a predetermined movement probability from a first location for each second location reachable from the first location and the elapsed time since the shooting time,
the first location is, among the plurality of predetermined locations, the location of the specific person at the shooting time, and
the second location is, among the plurality of predetermined locations, a location reachable from the first location.
- The information processing apparatus according to claim 1 or 2, further comprising control means for changing a configuration or setting relating to generation of the presence information according to an estimation result of the presence location estimation means.
- The information processing apparatus according to claim 3, wherein the control means allocates resources to the processing for generating the presence information from captured images of a predetermined location according to the magnitude of the presence probability at that location.
- The information processing apparatus according to claim 3 or 4, wherein the control means changes a setting relating to image recognition processing performance for generating the presence information from captured images of the predetermined location according to the estimation result of the presence location estimation means.
- The information processing apparatus according to claim 5, wherein the setting relating to image recognition processing performance is a setting of the frame rate of the images to be analyzed.
- The information processing apparatus according to claim 5, wherein the setting relating to image recognition processing performance is a setting of the number of face images detectable within one frame.
- A person search system comprising:
presence information generation means for generating, based on captured images for each predetermined location, presence information indicating the location of a person at the time of shooting; and
presence location estimation means for estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information,
wherein the presence information includes time information of the shooting time, and
the presence location estimation means estimates the current location of the specific person based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
- The person search system according to claim 8, wherein the presence information generation means generates the presence information by identifying a face included in a captured image.
- A place estimation method comprising: acquiring presence information that is generated based on captured images at each of a plurality of predetermined locations and indicates the location of a person at the time of shooting; and estimating the current location of a specific person from among the plurality of predetermined locations based on the acquired presence information, wherein the presence information includes time information of the shooting time, and in the estimation of the location, the current location of the specific person is estimated based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
- A non-transitory computer readable medium storing a program that causes a computer to execute: a presence information acquisition step of acquiring presence information that is generated based on captured images at each of a plurality of predetermined locations and indicates the location of a person at the time of shooting; and a presence location estimation step of estimating the current location of a specific person from among the plurality of predetermined locations based on the presence information acquired in the presence information acquisition step, wherein the presence information includes time information of the shooting time, and in the presence location estimation step, the current location of the specific person is estimated based on a presence probability of the specific person at the predetermined location, calculated from the elapsed time since the shooting time.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/040,210 US11501568B2 (en) | 2018-03-23 | 2018-10-26 | Information processing apparatus, person search system, place estimation method, and non-transitory computer readable medium storing program |
AU2018414269A AU2018414269A1 (en) | 2018-03-23 | 2018-10-26 | Information processing apparatus, person search system, place estimation method, and non-transitory computer readable medium storing program |
JP2020507329A JP7067613B2 (ja) | 2018-03-23 | 2018-10-26 | 情報処理装置、人物検索システム、場所推定方法及びプログラム |
EP18910933.3A EP3770855A1 (en) | 2018-03-23 | 2018-10-26 | Information processing device, person search system, place estimation method, and non-temporary computer-readable medium in which program is stored |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018-056862 | 2018-03-23 | ||
JP2018056862 | 2018-03-23 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019181047A1 true WO2019181047A1 (ja) | 2019-09-26 |
Family
ID=67987595
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/039858 WO2019181047A1 (ja) | 2018-03-23 | 2018-10-26 | 情報処理装置、人物検索システム、場所推定方法及びプログラムが格納された非一時的なコンピュータ可読媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US11501568B2 (ja) |
EP (1) | EP3770855A1 (ja) |
JP (1) | JP7067613B2 (ja) |
AU (1) | AU2018414269A1 (ja) |
WO (1) | WO2019181047A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021135473A (ja) * | 2020-02-28 | 2021-09-13 | 株式会社日立製作所 | 捜索支援システム、捜索支援方法 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11792501B2 (en) * | 2020-12-17 | 2023-10-17 | Motorola Solutions, Inc. | Device, method and system for installing video analytics parameters at a video analytics engine |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013092955A (ja) | 2011-10-27 | 2013-05-16 | Hitachi Ltd | 映像解析装置及びシステム |
WO2014061342A1 (ja) * | 2012-10-18 | 2014-04-24 | 日本電気株式会社 | 情報処理システム、情報処理方法及びプログラム |
WO2014148395A1 (ja) | 2013-03-21 | 2014-09-25 | 株式会社日立国際電気 | 映像監視システム、映像監視方法、および映像監視装置 |
JP2016143312A (ja) * | 2015-02-04 | 2016-08-08 | 沖電気工業株式会社 | 予測システム、予測方法およびプログラム |
JP2016163328A (ja) * | 2015-03-05 | 2016-09-05 | キヤノン株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2018032950A (ja) * | 2016-08-23 | 2018-03-01 | キヤノン株式会社 | 情報処理装置及びその方法、コンピュータプログラム |
JP2018056862A (ja) | 2016-09-30 | 2018-04-05 | ブラザー工業株式会社 | 通信装置 |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4506801B2 (ja) * | 2007-09-25 | 2010-07-21 | カシオ計算機株式会社 | 画像認識装置、画像認識方法、画像認識プログラム |
US9684835B2 (en) * | 2012-09-13 | 2017-06-20 | Nec Corporation | Image processing system, image processing method, and program |
US9794525B2 (en) * | 2014-03-25 | 2017-10-17 | Ecole Polytechnique Federale De Lausanne (Epfl) | Systems and methods for tracking interacting objects |
JP6428144B2 (ja) * | 2014-10-17 | 2018-11-28 | オムロン株式会社 | エリア情報推定装置、エリア情報推定方法、および空気調和装置 |
WO2017163282A1 (ja) * | 2016-03-25 | 2017-09-28 | パナソニックIpマネジメント株式会社 | 監視装置及び監視システム |
-
2018
- 2018-10-26 US US17/040,210 patent/US11501568B2/en active Active
- 2018-10-26 WO PCT/JP2018/039858 patent/WO2019181047A1/ja active Application Filing
- 2018-10-26 EP EP18910933.3A patent/EP3770855A1/en not_active Withdrawn
- 2018-10-26 AU AU2018414269A patent/AU2018414269A1/en not_active Abandoned
- 2018-10-26 JP JP2020507329A patent/JP7067613B2/ja active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021135473A (ja) * | 2020-02-28 | 2021-09-13 | 株式会社日立製作所 | 捜索支援システム、捜索支援方法 |
JP7407018B2 (ja) | 2020-02-28 | 2023-12-28 | 株式会社日立製作所 | 捜索支援システム、捜索支援方法 |
Also Published As
Publication number | Publication date |
---|---|
AU2018414269A1 (en) | 2020-10-08 |
US11501568B2 (en) | 2022-11-15 |
JP7067613B2 (ja) | 2022-05-16 |
JPWO2019181047A1 (ja) | 2021-02-25 |
US20210012095A1 (en) | 2021-01-14 |
EP3770855A1 (en) | 2021-01-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7184148B2 (ja) | 監視システム、管理装置および監視方法 | |
JP2015002553A (ja) | 情報処理システムおよびその制御方法 | |
JP7040463B2 (ja) | 解析サーバ、監視システム、監視方法及びプログラム | |
JP5899518B2 (ja) | サーバ装置、システム制御方法及びシステム制御プログラム | |
JP2007219948A (ja) | ユーザ異常検出装置、及びユーザ異常検出方法 | |
JP7056564B2 (ja) | 映像処理装置、映像処理方法及びプログラム | |
JP7338739B2 (ja) | 情報処理装置、制御方法、及びプログラム | |
WO2019181047A1 (ja) | 情報処理装置、人物検索システム、場所推定方法及びプログラムが格納された非一時的なコンピュータ可読媒体 | |
JP6769475B2 (ja) | 情報処理システム、認証対象の管理方法、及びプログラム | |
Kim et al. | Stabilized adaptive sampling control for reliable real-time learning-based surveillance systems | |
CN110889314A (zh) | 图像处理方法、装置、电子设备、服务器及*** | |
JP2022117996A (ja) | 情報処理装置、データ生成方法、及びプログラム | |
JP2015233204A (ja) | 画像記録装置及び画像記録方法 | |
JP6941457B2 (ja) | 監視システム | |
KR102099816B1 (ko) | 실시간 로드 영상을 통해 유동 인구 데이터를 수집하는 방법 및 장치 | |
JP2014215747A (ja) | 追跡装置、追跡システム、及び、追跡方法 | |
JP6558178B2 (ja) | 迷惑行為者推定システム、迷惑行為者推定システムの制御方法及び制御プログラム | |
JP7067593B2 (ja) | 情報処理システム、認証対象の管理方法、及びプログラム | |
JP6927585B2 (ja) | 管理装置、自動改札機制御方法、及びプログラム | |
JP7129920B2 (ja) | 撮像装置 | |
JP2019139716A (ja) | 移動体管理装置、移動体管理システム、および移動体管理方法 | |
JP6879336B2 (ja) | 迷惑行為者推定システム、迷惑行為者推定システムの制御方法及び制御プログラム | |
JP2019180017A (ja) | 画像処理装置、撮影装置、画像処理方法、およびプログラム | |
JP2019192155A (ja) | 画像処理装置、撮影装置、画像処理方法およびプログラム | |
JP7327571B2 (ja) | 情報処理システム、端末装置、認証対象の管理方法、及びプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18910933 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020507329 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2018414269 Country of ref document: AU Date of ref document: 20181026 Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2018910933 Country of ref document: EP |