WO2016147581A1 - Monitoring device, monitoring method, monitoring program, and monitoring system - Google Patents
- Publication number
- WO2016147581A1 (PCT/JP2016/001104)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mobile terminal
- camera
- position information
- tracking
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/08—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using communication transmission lines
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M11/00—Telephonic communication systems specially adapted for combination with other electrical systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to a monitoring device, a monitoring method, a monitoring program, a monitoring system, and a facility monitoring system.
- Patent Document 1 discloses a technique in which, when an abnormal state of a user of a mobile communication terminal is detected, the user's position information is transmitted to a monitoring camera, and an image of the user acquired by the monitoring camera is transmitted to a management center.
- In Patent Document 1, the user of the mobile communication terminal is the monitoring target, and the user's situation and state can be detected.
- However, Patent Document 1 assumes that the user of the mobile communication terminal does not move. When the user is moving, the user may therefore not appear in the captured image even if the camera is directed at the position acquired from the mobile communication terminal. This was a problem.
- An object of the present invention is to provide a monitoring device, a monitoring method, a monitoring program, a monitoring system, and a facility monitoring system for solving the above problems.
- the monitoring system includes a mobile terminal and a management device that communicates with the mobile terminal, and the mobile terminal includes a first transmission unit that transmits position information of the mobile terminal to the management device at a predetermined interval.
- the management device includes a control unit that controls the orientation of the camera based on the position information when the position information is acquired.
- The monitoring system according to the present invention includes a mobile terminal and a management device that communicates with the mobile terminal. The mobile terminal includes a first transmission unit that transmits location information of the mobile terminal to the management device. When the position information is acquired, the management device controls the orientation of the camera based on the position information; a tracking unit extracts tracking-target candidates from the video acquired by the camera; and a display unit displays the video and highlights the tracking candidates in the video so that they can be selected.
- The mobile terminal according to the present invention includes a first transmission unit that transmits position information of its own device to the management device at a predetermined interval, and the first transmission unit starts transmitting the position information to the management device when a predetermined operation is detected.
- The management device includes an acquisition unit that acquires position information of the mobile terminal at predetermined intervals, and a control unit that controls the orientation of the camera based on the position information when the position information of the mobile terminal is acquired.
- the camera can be appropriately controlled even when the position of the user of the mobile terminal changes or when the monitoring target is different from the user of the mobile terminal.
- FIG. 1 is a block diagram illustrating the configuration of the monitoring system according to the first embodiment.
- FIG. 2 is a flowchart showing information transmission by the mobile terminal according to the first embodiment.
- FIG. 3 is a flowchart showing camera control by the management apparatus according to the first embodiment.
- FIG. 4 is a flowchart in which the management apparatus according to the first embodiment tracks a tracking target.
- FIG. 5 is an overhead view showing a specific example of the operation when the guard according to the first embodiment finds a suspicious person.
- An overhead view shows a specific example of camera control by the management apparatus according to the first embodiment, and a diagram shows a specific example of the video displayed by the management apparatus according to the first embodiment.
- A flowchart illustrates camera control by the management apparatus according to the second embodiment, an overhead view shows a specific example of that camera control, and a diagram shows a specific example of the display unit of the management apparatus according to the second embodiment.
- A flowchart shows information transmission by the mobile terminal according to the third embodiment, and a flowchart shows the management apparatus according to the third embodiment tracking a tracking target.
- A diagram shows the hardware configuration of the monitoring system according to the first embodiment.
- FIG. 1 is a block diagram illustrating the configuration of a monitoring system 1000 according to this embodiment.
- The monitoring system 1000 monitors a facility (a port, airport, station platform, power plant, plant, dam, or other critical facility, or a warehouse, leisure facility, stadium, commercial facility, building, or city street).
- the monitoring system 1000 includes a mobile terminal 1100, a management device 1200, and cameras 300-1 to 300-N.
- the mobile terminal 1100 includes a position acquisition unit 1110, an operation detection unit 1120, and a communication unit 1130.
- Examples of the mobile terminal 1100 include a wireless device, a mobile phone, a smartphone, and a tablet terminal.
- The position acquisition unit 1110, the operation detection unit 1120, and the like can be realized by causing the computer constituting the mobile terminal to execute a computer program for each process using its hardware. That is, as shown in FIG. 20, a CPU (Central Processing Unit) 100 provided in the mobile terminal 1100 is connected to a memory 120 provided in the mobile terminal 1100, and the position acquisition unit 1110, the operation detection unit 1120, and the like may be realized by the CPU 100 operating on a predetermined program or various data read from the memory 120.
- More specific examples of the CPU 100 include a processor and a microprocessor. The CPU 100 of the mobile terminal 1100 may be connected to the communication unit 1130, the input receiving unit 110, and the display unit 130 via an input/output circuit.
- the memory 120 may be configured as a RAM (Random Access Memory) or a ROM (Read Only Memory).
- the position acquisition unit 1110 is connected to the operation detection unit 1120 and the communication unit 1130.
- The position acquisition unit 1110 acquires real-world coordinates, which are position information, from GPS (Global Positioning System) satellites.
- In this embodiment, GPS satellites are used to acquire position information, but a system that transmits and receives data by wireless communication may instead be used to calculate position information from access-point location information.
- The position information consists of real-world coordinates, a unified coordinate system for the real world, but it is not limited to this information; it may be supplemented with information such as the current time.
- The real-world coordinates may be expressed as longitude and latitude.
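As a concrete illustration, a position report of the kind described (real-world coordinates, optionally supplemented with the current time) might be assembled as below. This is only a sketch; the field names are hypothetical, not taken from the patent.

```python
import time

def make_position_report(latitude: float, longitude: float) -> dict:
    """Bundle real-world coordinates (here: latitude/longitude) with the
    current time, the optional supplementary information mentioned above.
    Field names are illustrative."""
    return {
        "latitude": latitude,
        "longitude": longitude,
        "timestamp": time.time(),  # optional: current time
    }
```

A terminal would hand such a report to its first communication unit at each acquisition.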
- The operation detection unit 1120 is connected to the position acquisition unit 1110 and the communication unit 1130.
- When a predetermined operation by the guard is detected, the operation detection unit 1120 sends the communication unit 1130 control switching information requesting that the camera control mode of the cameras 300-1 to 300-N be switched.
- The predetermined operation is an operation of the mobile terminal 1100 by the guard. More specifically, when the guard finds a suspicious person or a suspicious object while on patrol, the guard operates the mobile terminal, and control switching information requesting that the camera control mode of each camera 300-1 to 300-N be switched is sent to the communication unit 1130.
- The guard may also capture an image of the suspicious person or object with a camera (not shown) of the mobile terminal 1100 so that the captured image is sent to the communication unit 1130 together with the position information of the mobile terminal 1100.
- the camera control mode will be described later.
- the communication unit 1130 includes a first communication unit 1131 and a second communication unit 1132.
- the communication unit 1130 communicates with the management apparatus 1200 via a communication network.
- the first communication unit 1131 is connected to the position acquisition unit 1110.
- The first communication unit 1131 transmits the location information of the mobile terminal 1100 acquired by the location acquisition unit 1110 to the management device 1200 via the communication network, and can also receive information from the management device 1200. The first communication unit 1131 may also transmit an image of a suspicious person or object captured by a camera (not shown) of the mobile terminal 1100 to the management device 1200 together with the position information of the mobile terminal 1100.
- the second communication unit 1132 is connected to the operation detection unit 1120.
- When the operation detection unit 1120 detects a predetermined operation by the guard, the second communication unit 1132 transmits the control switching information to the management device 1200 via the communication network; it can also receive information from the management device 1200.
- The first communication unit 1131 and the second communication unit 1132 need only be functionally separate and able to transmit and receive independently; they need not be separate pieces of hardware and can be implemented as a single piece of hardware.
- the second communication unit 1132 may be configured to transmit the control switching information and images of suspicious persons, suspicious objects, etc. taken by a camera (not shown) of the mobile terminal 1100 together.
- the management apparatus 1200 includes a communication unit 1210, a control unit 1220, and a display unit 1230.
- The management device 1200 is installed, for example, in the building of a security company that monitors the entire facility, or in a security room within the monitored facility.
- The management device 1200 can also be realized as a management server installed outside the facility.
- The management device 1200 need not exist as an independent device; it can also be incorporated as a component of the camera 300-1 described later.
- The control unit 1220 and the like can also be realized by causing the computer constituting the management apparatus 1200 to execute a computer program for each process using its hardware.
- That is, the CPU 200 provided in the management apparatus 1200 is connected to the memory 220 provided in the management apparatus 1200, and the control unit 1220 may be realized by the CPU 200 operating on a predetermined program or various data read from the memory 220. More specific examples of the CPU 200 include a processor and a microprocessor. The CPU 200 of the management device 1200 may be connected to the communication unit 1210, the input receiving unit 210, and the display unit 1230 via an input/output circuit. In addition, the connection unit 230 of the management apparatus 1200 is connected to each camera 300-1 to 300-N.
- the memory 220 may be configured as a RAM or a ROM.
- the communication unit 1210 includes a first communication unit 1211 and a second communication unit 1212.
- the communication unit 1210 can exchange information with the communication unit 1130 of the mobile terminal 1100.
- the communication unit 1210 sends the received information to the control unit 1220.
- the first communication unit 1211 communicates with the first communication unit 1131 of the mobile terminal 1100 and receives the position information of the mobile terminal 1100 described above.
- the first communication unit 1211 sends the received location information of the mobile terminal 1100 to the control unit 1220.
- the first communication unit 1211 can also transmit information to the first communication unit 1131.
- The first communication unit 1211 may acquire the position information of the mobile terminal 1100 together with images of suspicious persons, suspicious objects, and the like taken by a camera (not shown) of the mobile terminal 1100.
- The second communication unit 1212 communicates with the second communication unit 1132 of the mobile terminal 1100 via the communication network and receives the control switching information described above. The second communication unit 1212 sends the control switching information to the control unit 1220, and can also transmit information to the second communication unit 1132.
- The first communication unit 1211 and the second communication unit 1212 need only be functionally separate and able to transmit and receive independently. That is, they need not be separate pieces of hardware and can be implemented as a single piece of hardware. The second communication unit 1212 may also be configured to acquire the control switching information together with images of suspicious persons, suspicious objects, and the like taken by a camera (not shown) of the mobile terminal 1100.
- the control unit 1220 includes a tracking unit 1221 and a storage unit 1222.
- the control unit 1220 is connected to the communication unit 1210, the display unit 1230, and the cameras 300-1 to 300-N.
- the control unit 1220 controls the camera based on the control switching information acquired from the communication unit 1210. More specifically, the control unit 1220 switches from the camera control in the normal mode to the camera control in the tracking mode when the control switching information is received.
- A monitoring area is set for each of the cameras 300-1 to 300-N, and in the normal mode the control unit 1220 controls each camera 300-1 to 300-N so that it captures its assigned monitoring area.
- Tracking means collating an object detected in the previous video frame (frame n-1, where n is a natural number) with an object detected in the current video frame (frame n) and determining whether the two can be regarded as the same object. More specifically, it refers to controlling the pan, tilt, and zoom of the camera so that the tracking target remains within the video obtained by the camera 300-1 or the like.
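One simple way to realise the frame-to-frame collation just described is to compare bounding boxes by intersection-over-union. The patent does not fix the criterion, so the metric and threshold below are assumptions for illustration only.

```python
def same_object(box_prev, box_curr, iou_threshold=0.3):
    """Decide whether a detection in frame n-1 and a detection in frame n
    are the same object, using intersection-over-union (IoU) of their
    (x1, y1, x2, y2) bounding boxes."""
    ax1, ay1, ax2, ay2 = box_prev
    bx1, by1, bx2, by2 = box_curr
    # Overlap along each axis (zero if the boxes do not intersect)
    ix = max(0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = ((ax2 - ax1) * (ay2 - ay1)
             + (bx2 - bx1) * (by2 - by1) - inter)
    return inter / union >= iou_threshold
```

A real tracker would combine this with appearance features, but the overlap test alone already captures the "can be collated as the same object" check.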
- the tracking unit 1221 selects a tracking candidate based on the video captured by the cameras 300-1 to 300-N when the tracking mode described above is entered. Furthermore, the tracking unit 1221 starts tracking the tracking target designated by the operator of the management apparatus 1200 from the tracking candidates. A detailed tracking flow will be described later.
- The storage unit 1222 is a storage medium such as an HDD (Hard Disk Drive).
- The storage unit 1222 stores images captured by the cameras 300-1 to 300-N, the positions at which the cameras 300-1 to 300-N are installed, the traveling direction of the mobile terminal 1100, the acceleration direction of the mobile terminal 1100, the control switching information, the position information of the mobile terminal 1100, feature quantities of specific objects (people, animals, small airplanes, helicopters, etc.), features of the guards, their clothing, and the like.
- Display unit 1230 includes a liquid crystal display device.
- the display unit 1230 displays the camera image obtained by the control unit 1220.
- the display unit 1230 displays information on tracking candidates that are candidates for tracking, and displays information that prompts the operator to select tracking candidates to be tracked.
- Each of the cameras 300-1 to 300-N is a CMOS (Complementary Metal Oxide Semiconductor) camera, and each can change its shooting range by PTZ (Pan-Tilt-Zoom) control. Although a CMOS camera is used in this embodiment, a CCD (Charge Coupled Device) camera, an infrared camera, or any other video acquisition unit that can acquire video from which feature quantities can be extracted may be used. The cameras 300-1 to 300-N need not all be the same camera; various cameras can be employed. In this embodiment, the camera 300-1 and the management apparatus 1200 are configured as separate apparatuses.
- the position acquisition unit 1110 acquires the position information of its own device from the GPS satellite.
- the acquired position information is sent to the first communication unit 1131.
- the position acquisition unit 1110 is configured to acquire position information at a predetermined interval (for example, every 10 seconds).
- a variable period or a random period may be employed as the predetermined interval.
- The position acquisition unit 1110 may instead acquire the position information when the guard makes a request on the mobile terminal 1100; in this case, the predetermined interval is each time the guard makes such a request.
- In step S120, the operation detection unit 1120 receives from the guard control switching information, which is a request to switch the monitoring control (more specifically, the camera control) from the normal mode to the tracking mode.
- the operation detection unit 1120 transmits the acquired control switching information to the second communication unit 1132.
- The position acquisition unit 1110 and the operation detection unit 1120 are configured to acquire the position information and the control switching information of the mobile terminal 1100 separately and sequentially, but they may be configured to acquire them together and simultaneously.
- the operation detection unit 1120 may be configured to acquire the control switching information and images of suspicious persons, suspicious objects, and the like taken by a camera (not shown) of the mobile terminal 1100 together.
- In step S130, the first communication unit 1131 transmits the acquired location information of the mobile terminal 1100 to the management apparatus 1200 via the first communication unit 1211. Further, the second communication unit 1132 transmits the acquired control switching information to the management device 1200 via the second communication unit 1212.
- The first communication unit 1131 transmits information to the management apparatus 1200 every time the position acquisition unit 1110 acquires the position information of the mobile terminal 1100 at the predetermined interval (for example, periodically). That is, the first communication unit 1131 transmits the location information of the mobile terminal 1100 to the management device 1200 at a predetermined interval.
- the first communication unit 1131 or the second communication unit 1132 may be configured to transmit an image of a suspicious person, a suspicious object, or the like captured by a camera (not shown) of the mobile terminal 1100.
- the predetermined interval for acquiring the position information of the position acquisition unit 1110 and the predetermined interval for transmitting the position information of the first communication unit 1131 may be synchronized or may be asynchronous.
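The acquire-then-transmit loop described above can be sketched as below. `acquire` and `send` stand in for the position acquisition unit and the first communication unit; the injectable `sleep` exists only so the sketch can be exercised without real delays. None of these names come from the patent.

```python
import time

def transmit_periodically(acquire, send, interval_s=10.0, iterations=3,
                          sleep=time.sleep):
    """Acquire the terminal's position and send it to the management
    device once per predetermined interval (e.g. every 10 seconds).
    Returns the list of transmitted reports."""
    sent = []
    for _ in range(iterations):
        position = acquire()   # e.g. real-world coordinates from GPS
        send(position)         # first communication unit -> management device
        sent.append(position)
        sleep(interval_s)
    return sent
```

In the asynchronous variant the text allows, acquisition and transmission would run on independent timers instead of one loop.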
- FIG. 3 is a control flow diagram in which the management apparatus 1200 controls the imaging range of the cameras 300-1 to 300-N.
- The first communication unit 1211 receives the location information of the mobile terminal 1100 from the mobile terminal 1100 and sends it to the control unit 1220. The second communication unit 1212 receives the control switching information from the mobile terminal 1100 and sends it to the control unit 1220. The first communication unit 1211 or the second communication unit 1212 may also acquire images of suspicious persons, suspicious objects, and the like captured by a camera (not shown) of the mobile terminal 1100.
- In step S220, the control unit 1220 switches the camera control mode from the normal mode to the tracking mode based on the control switching information. Further, the control unit 1220 controls the pan, tilt, and zoom of the cameras 300-1 to 300-N based on the acquired position information of the mobile terminal 1100.
- The storage unit 1222 stores in advance the range that each camera can shoot for each combination of pan, tilt, and zoom.
- The control unit 1220 extracts, from among the combinations of pan, tilt, and zoom stored in the storage unit 1222, those combinations whose shootable range contains the real-world coordinates given by the acquired position information.
- the control unit 1220 controls the pan, tilt, and zoom of the camera 300-1 with the combination closest to the current combination among the extracted combinations of pan, tilt, and zoom as the target value.
- Here, the combination closest to the current combination is selected, but the control unit 1220 may instead select a combination that places the real-world coordinates of the acquired position information at the center of the camera's field of view.
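The closest-combination selection can be sketched as follows. The distance metric (summed squared difference over the pan, tilt, and zoom axes) is an assumption; the text does not specify how "closest" is measured.

```python
def choose_ptz(current, candidates):
    """From the extracted (pan, tilt, zoom) combinations whose shootable
    range contains the target coordinates, pick the one requiring the
    smallest change from the current camera setting."""
    def change(c):
        # Squared per-axis difference from the current (pan, tilt, zoom)
        return sum((a - b) ** 2 for a, b in zip(c, current))
    return min(candidates, key=change)
```

Minimising the change keeps camera motion short, so the target enters the frame quickly.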
- The control unit 1220 can also detect the guard's direction of travel and perform camera control that predicts the guard's current position. Specifically, the control unit 1220 periodically receives the location information and transmission time of the mobile terminal 1100. The control unit 1220 then calculates the guard's traveling direction and moving speed from the position information and transmission time received this time and those received last time. Next, the control unit 1220 calculates a predicted movement amount from the moving speed and the difference between the time the current position information was received and the current time. The control unit 1220 predicts where the mobile terminal 1100 will be at the current time from the calculated traveling direction and predicted movement amount. Finally, the control unit 1220 controls the pan, tilt, and zoom of each camera 300-1 to 300-N by the method described above so that the predicted position is included in the imaging range.
- The control unit 1220 can likewise predict the guard's future position: it performs the same processing as the current-position prediction described above, but for a future time.
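The prediction described above amounts to constant-velocity dead reckoning from the last two position reports. A minimal two-dimensional sketch (coordinates and times are illustrative):

```python
def predict_position(prev, prev_t, curr, curr_t, now):
    """Estimate where the guard will be at time `now`, assuming the
    velocity derived from the last two (x, y) position reports stays
    constant. Setting `now` past `curr_t` gives the future-position
    variant; setting it to the present time gives the current-position
    variant."""
    dt = curr_t - prev_t
    vx = (curr[0] - prev[0]) / dt   # traveling direction and speed
    vy = (curr[1] - prev[1]) / dt
    lead = now - curr_t             # time elapsed since the last report
    return (curr[0] + vx * lead, curr[1] + vy * lead)
```

The predicted point is then fed into the pan/tilt/zoom selection so the imaging range covers it.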
- In step S230, the control unit 1220 transmits the control amount calculated in step S220 to the cameras 300-1 to 300-N. The flow then ends.
- The control unit 1220 may be configured to sequentially update the shooting ranges of the cameras 300-1 to 300-N based on the periodically acquired position information of the mobile terminal. With this configuration, even when the guard holding the mobile terminal 1100 has moved away from the previously acquired position, each camera 300-1 to 300-N updates its shooting range, improving the chances of capturing the guard.
- (Tracking control by management device) Next, the tracking control flow of the management apparatus 1200 will be described with reference to FIG. 4. FIG. 4 is a flowchart of the tracking control performed by the management apparatus 1200.
- In step S310, the control unit 1220 changes the camera control from the normal mode to the tracking mode when it acquires the mode change request.
- In step S320, the tracking unit 1221 obtains the images captured by the cameras 300-1 to 300-N.
- the tracking unit 1221 stores the video acquired from each of the cameras 300-1 to 300-N in the storage unit 1222.
- the tracking unit 1221 extracts a feature amount from the video acquired from each of the cameras 300-1 to 300-N.
- The tracking unit 1221 compares the extracted feature quantities with person-identifying feature quantities stored in advance in the storage unit 1222 and extracts person candidates.
- the tracking unit 1221 selects the extracted person candidate as a tracking candidate.
- The tracking unit 1221 may acquire a motion vector indicating the movement amount and direction of a moving body and select only moving bodies as tracking candidates. More specifically, the tracking unit 1221 acquires the motion vector by reading a plurality of frames and comparing the image data between frames.
- The tracking unit 1221 may also extract feature quantities from the image of the suspicious person captured by the camera (not shown) of the mobile terminal 1100 and acquired by the first communication unit 1211 or the second communication unit 1212, and extract tracking candidates by comparing these feature quantities with those extracted from the videos acquired from the cameras 300-1 to 300-N.
- The tracking control of the management device 1200 is configured here to extract persons as tracking candidates, but candidates are not limited to persons; they may be animals, moving bodies (cars, motorcycles), or flying bodies (airplanes, helicopters).
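A minimal sketch of this candidate extraction, assuming detections carry feature vectors and that cosine similarity is the comparison measure (the patent does not specify one); the threshold value is likewise an assumption.

```python
import math

def extract_tracking_candidates(detections, reference, threshold=0.8):
    """Keep the detections whose feature vector is similar enough to a
    stored reference feature (e.g. a person template, or a feature taken
    from the suspicious-person image sent by the mobile terminal)."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)
    return [d for d in detections if cosine(d["feature"], reference) >= threshold]
```

The surviving detections are the tracking candidates handed to the display unit for highlighting.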
- In step S340, the tracking unit 1221 outputs the video of each camera 300-1 to 300-N captured in step S320 and the tracking candidates extracted in step S330 to the display unit 1230.
- The display unit 1230 displays the video of each camera 300-1 to 300-N and highlights the tracking candidates extracted in step S330 in the video.
- In step S350, the display unit 1230 accepts the operator's designation of one or more tracking targets.
- Here the display unit 1230 accepts the operator's designation, but this is not a limitation; the control unit 1220 may accept it instead.
- In step S360, the tracking unit 1221 tracks the tracking target based on the designation of the tracking target.
- a specific tracking target detection algorithm is shown below.
- the tracking unit 1221 predicts the position of the tracking target on the current frame based on the position of the tracking target detected in the previous frame.
- various existing methods such as a method using a Kalman filter or a particle filter can be used.
- the tracking unit 1221 performs tracking based on the predicted position of the tracking target on the current frame and the video of the current frame.
- the tracking is not limited to the above-described form; tracking can also be performed by associating objects in the current frame with the tracking target based on the similarity or likelihood of their feature amounts.
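As one concrete illustration of the prediction step mentioned above, the sketch below implements a minimal constant-velocity Kalman filter in one dimension. All class and parameter names, and the noise values, are assumptions made for illustration; a real tracker would use a 2D state with tuned covariances, or a particle filter as the text notes.

```python
# Minimal constant-velocity Kalman filter for predicting a tracking
# target's 1D position on the next frame (illustrative sketch only).

class Kalman1D:
    def __init__(self, pos, vel=0.0, dt=1.0,
                 process_var=1e-2, measure_var=1.0):
        self.x = [pos, vel]                 # state: position, velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]   # state covariance
        self.dt = dt
        self.q = process_var
        self.r = measure_var

    def predict(self):
        """Project the state one frame ahead: pos += vel * dt."""
        dt = self.dt
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        p00, p01 = self.P[0]
        p10, p11 = self.P[1]
        self.P = [[p00 + dt * (p10 + p01) + dt * dt * p11 + self.q,
                   p01 + dt * p11],
                  [p10 + dt * p11, p11 + self.q]]
        return self.x[0]                    # predicted position

    def update(self, z):
        """Correct the state with a measured position z."""
        s = self.P[0][0] + self.r           # innovation covariance
        k0 = self.P[0][0] / s               # Kalman gain (position)
        k1 = self.P[1][0] / s               # Kalman gain (velocity)
        y = z - self.x[0]                   # innovation
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p00, p01 = self.P[0]
        self.P = [[(1 - k0) * p00, (1 - k0) * p01],
                  [self.P[1][0] - k1 * p00, self.P[1][1] - k1 * p01]]

# Track an object moving roughly 2 px/frame, then predict the next frame.
kf = Kalman1D(pos=0.0)
for z in [2.1, 3.9, 6.0, 8.1]:
    kf.predict()
    kf.update(z)
print(kf.predict())  # approaches the true next position of 10
```

The predicted position is then matched against detections in the current frame, as the surrounding text describes.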
- the control unit 1220 may be configured to transmit the video captured by each of the cameras 300-1 to 300-N to the mobile terminal 1100 via the communication unit 1210. With this configuration, even if the security guard loses sight of the suspicious person, tracking can be resumed.
- the control unit 1220 may also be configured to transmit the video to a mobile terminal other than the mobile terminal that acquired the position information.
- the control unit 1220 is configured to analyze the video captured by each of the cameras 300-1 to 300-N and generate metadata related to the tracking target while tracking the tracking target by the tracking unit 1221. A configuration in which the image is stored in the storage unit 1222 in association with the video may be employed.
- the metadata includes, for example, the color of the tracking target's clothes and the feature amount of its face.
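The metadata generation can be sketched as follows. This is a hypothetical illustration of only the clothing-color part (the patent also mentions facial feature amounts); the function names are mine, and the clothing-region pixels are assumed to come from an upstream person detector.

```python
# Sketch of per-frame metadata generation during tracking: compute a
# dominant clothing colour and store it alongside the video record.

from collections import Counter

def dominant_color(pixels):
    """Most frequent quantised RGB colour in the clothing region.
    Quantising to 32-level buckets lets similar shades count together."""
    buckets = Counter((r // 32, g // 32, b // 32) for r, g, b in pixels)
    (r, g, b), _ = buckets.most_common(1)[0]
    return (r * 32, g * 32, b * 32)

def make_metadata(track_id, frame_time, clothing_pixels):
    """Metadata record to be stored with the video in the storage unit."""
    return {"track_id": track_id,
            "time": frame_time,
            "clothing_color": dominant_color(clothing_pixels)}

# Mostly dark-blue clothing region with a few red outlier pixels:
pixels = [(10, 20, 200)] * 8 + [(200, 30, 30)] * 2
print(make_metadata(1, 12.5, pixels)["clothing_color"])  # (0, 0, 192)
```

Storing such records keyed by track and time is what later allows the stored video to be searched by appearance.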
- FIG. 5 is a bird's-eye view showing a specific example of the operation when the guard in the first embodiment finds a suspicious person.
- the building A is provided with a camera 300-1 and a camera 300-2.
- the building B is provided with a camera 300-3.
- when the guard finds a suspicious person, the guard performs an input operation on the mobile terminal 1100 to request a mode change.
- the mobile terminal 1100 that has received the input transmits control switching information to the management apparatus 1200 together with the location information of the mobile terminal 1100. Thereafter, the guard starts moving toward the suspicious person in order to apprehend the suspicious person. While the guard is moving, the mobile terminal 1100 sequentially transmits the acquired position information to the management apparatus 1200.
- FIG. 6 shows an overhead view of camera control of the management apparatus according to the first embodiment.
- when the management apparatus 1200 acquires control switching information from the mobile terminal 1100, the management apparatus 1200 switches the mode for controlling the cameras 300-1 to 300-3 to the tracking mode. Also, based on the acquired position information of the mobile terminal 1100, the management device 1200 directs the line-of-sight directions of the cameras 300-1 to 300-3 toward the position acquired from the mobile terminal 1100 (where the guard should be).
- FIG. 7 is a diagram showing a video example of the camera 300-1.
- a security guard and a suspicious person are photographed at a long distance by the camera 300-1.
- FIG. 8 is a diagram showing an example of video from the camera 300-2.
- a security guard and a suspicious person are photographed at a medium distance by the camera 300-2.
- FIG. 9 is a diagram illustrating an example of the video of the camera 300-3. Although the security guard is photographed at a short distance by the camera 300-3, only the upper body of the suspicious person is captured. Since the control unit 1220 sets the focus target of the camera 300-3 to the position of the mobile terminal 1100, the camera 300-3 cannot fully capture the suspicious person. Videos obtained by the cameras 300-1 to 300-3 are sent to the management apparatus 1200 and stored in the storage unit 1222.
- FIG. 10 is a diagram illustrating a specific example of the display unit of the management apparatus according to the first embodiment.
- the control unit 1220 displays images taken by the cameras 300-1 to 300-3 on the display unit 1230.
- the control unit 1220 extracts the tracking candidates 1 to 5 that are shown in each camera video.
- the control unit 1220 highlights the tracking candidates 1 to 5 in the images of the cameras 300-1 to 300-3 by surrounding each candidate with a frame.
- one or more tracking candidates can be selected so that tracking targets can be determined from among the tracking candidates 1 to 5.
- the management device operator selects the tracking candidate 1 and the tracking candidate 3 and presses the decision button, thereby determining the tracking candidate 1 and the tracking candidate 3 as tracking targets.
- the tracking unit 1221 performs tracking on the determined tracking targets. More specifically, the control unit 1220 controls the camera 300-1 so that the tracking target 1 is photographed near the center of the screen. Further, the control unit 1220 controls the camera 300-2 so that the tracking target 3 is photographed near the center of the screen.
- next, a modification of FIG. 10 is shown in FIG. 11. In FIG. 10, the operator needs to execute both a selection step and a determination step, that is, two operations are necessary.
- as shown in FIG. 11, a configuration may be adopted in which the tracking candidates highlighted in the images acquired by the cameras 300-1 to 300-3 can be selected directly. More specifically, "tracking target 1" in FIG. 11 is gray, indicating a state in which it is selected as a tracking target, while "tracking target 2" is white, indicating a state in which it is not selected as a target.
- when the operator presses "tracking target 1" in FIG. 11, the button turns gray and tracking starts. When the operator presses "tracking target 1" again, the button turns white and tracking stops.
- according to this modification, the operator's visibility and operability can be improved.
(Function and effect)
- since the position information of the mobile terminal can be acquired sequentially, the possibility that the camera can capture the guard or the suspicious person is improved even while the guard is moving. Further, by performing camera control based on the position information of the mobile terminal and displaying the tracking candidates in a selectable manner, it is possible to select a tracking target other than the guard who is the user of the mobile terminal. In addition, by having the mobile terminal send images of the tracking target, such as the suspicious person, captured by the mobile terminal to the management apparatus together with the transmission of the position information, the reliability of the tracking-candidate extraction in the management apparatus can be increased and the tracking candidates can be narrowed down.
(Embodiment 2)
- FIG. 12 is a block diagram illustrating the configuration of the monitoring system 2000 according to the second embodiment.
- the monitoring system 2000 includes a mobile terminal 2100, a management device (management server) 2200, and cameras 300-1 to 300-N.
- the mobile terminal 2100 includes at least a position acquisition unit 1110, a motion detection unit 2120, a communication unit 1130, and an acceleration sensor 2140.
- the mobile terminal 2100 includes all the hardware configurations of the mobile terminal 1100 according to the first embodiment illustrated in FIG. 20, and further includes an acceleration sensor 2140.
- the position acquisition unit 1110 and the communication unit 1130 are the same as those in the first embodiment.
- the motion detection unit 2120 is connected to the position acquisition unit 1110, the communication unit 1130, and the acceleration sensor 2140. The motion detection unit 2120 detects a predetermined motion of the guard based on the output value of the acceleration sensor 2140. When the predetermined motion of the guard is detected, the motion detection unit 2120 sends control switching information requesting a change of the camera control mode to the tracking mode to the communication unit 1130. In addition, the motion detection unit 2120 sends the control switching information to the position acquisition unit 1110.
- the predetermined operation means that the security guard has found a suspicious person or a suspicious object and has entered a tracking state.
- the acceleration sensor 2140 may also be described as a motion detection device that detects vertical and horizontal accelerations of the mobile terminal 2100.
- the management device 2200 includes a communication unit 1210, a control unit 2220, and a display unit 1230.
- the management apparatus 2200 has the same hardware configuration as that of the management apparatus 1200 according to the first embodiment illustrated in FIG.
- the communication unit 1210 and the display unit 1230 are the same as those in the first embodiment.
- the control unit 2220 includes a tracking unit 2221, a storage unit 1222, and a selection unit 2223.
- the control unit 2220 performs camera control similar to that of the control unit 1220 of the first embodiment.
- the control unit 2220 controls the camera based on the control switching information acquired by the communication unit 1210. More specifically, as in the first embodiment, the camera control mode is changed from the normal mode to the tracking mode to control the camera.
- when the tracking mode is set, the tracking unit 2221 extracts a provisional tracking target from the images of the cameras 300-1 to 300-N. In addition, the tracking unit 2221 extracts tracking candidates from the images of the cameras 300-1 to 300-N. Further, the tracking unit 2221 performs tracking for the tracking target designated by the operator of the management apparatus 2200 from among the extracted tracking candidates.
- the storage unit 1222 is the same as that in the first embodiment.
- the selection unit 2223 selects the camera closest to the position of the mobile terminal 2100 based on the acquired position information of the mobile terminal 2100 and the position information of each of the cameras 300-1 to 300-N stored in the storage unit 1222 in advance.
- here, the closest camera is selected, but the selection is not limited to the closest camera. For example, the farthest camera may be selected. Further, the number of cameras to be selected need not be one; all the cameras whose distance from the mobile terminal 2100 is within a predetermined range may be selected.
- FIG. 13 is a flowchart in which the mobile terminal 2100 transmits information to the management apparatus 2200.
- step S410 processing similar to that in step S110 in FIG. 2 is performed.
- step S420 the motion detection unit 2120 receives the acceleration of the mobile terminal 2100 from the acceleration sensor 2140.
- step S430 the motion detection unit 2120 determines whether or not the acquired acceleration is equal to or greater than a predetermined value.
- the motion detection unit 2120 determines that the guard has performed the predetermined motion when the acceleration is equal to or greater than the predetermined value.
- in this case, the motion detection unit 2120 sends the control switching information to the second communication unit 1132, and the process proceeds to step S440.
- the motion detection unit 2120 determines that the guard is not performing the predetermined motion, returns to step S420, and detects the acceleration again.
- here, the motion detection unit 2120 determines that the guard has performed the predetermined motion simply by checking whether the acceleration reaches the predetermined value; alternatively, it may determine that the guard has performed the predetermined motion only when the acceleration remains at or above the predetermined value for a predetermined period.
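The two determination variants described above can be sketched as follows. The threshold and period values are illustrative assumptions, not values from the specification.

```python
# Detect the guard's "predetermined motion" from acceleration samples.
# Two variants: an instantaneous threshold check, and a check that the
# threshold must be held for a sustained number of consecutive samples.

THRESHOLD = 15.0   # m/s^2, e.g. a sudden dash; hypothetical value
PERIOD = 3         # consecutive samples required for the sustained variant

def detected_simple(sample):
    """True as soon as one sample reaches the threshold."""
    return sample >= THRESHOLD

def detected_sustained(samples):
    """True only if the threshold is held for PERIOD consecutive samples."""
    run = 0
    for a in samples:
        run = run + 1 if a >= THRESHOLD else 0
        if run >= PERIOD:
            return True
    return False

print(detected_simple(16.0))                        # True
print(detected_sustained([16, 17, 9, 16, 17, 18]))  # True
print(detected_sustained([16, 17, 9, 16, 17, 9]))   # False
```

The sustained variant trades a short detection delay for robustness against single-sample spikes such as the terminal being bumped.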
- step S440 the same processing as in step S130 of FIG. 2 is performed, and the flow ends.
- FIG. 14 is a control flow diagram in which the management apparatus 2200 controls the shooting ranges of the cameras 300-1 to 300-N.
- Table 1 shows the position information of each camera 300-1 to 300-N stored in the storage unit 1222.
- step S510 processing similar to that in step S210 in FIG. 3 is performed.
- in step S520, the selection unit 2223 compares the acquired position information of the mobile terminal 2100 with the position information of each of the cameras 300-1 to 300-N shown in Table 1 below, and selects the camera closest to the position of the mobile terminal 2100. More specifically, when the real-world coordinates acquired as the position information of the mobile terminal 2100 are 35 degrees 63 minutes 81.100 seconds north latitude and 139 degrees 74 minutes 72.894 seconds east longitude, the selection unit 2223 compares the north latitude and east longitude of each camera with the north latitude and east longitude of the acquired location information of the mobile terminal 2100 and calculates the difference. The selection unit 2223 selects the camera with the smallest calculated difference from among the cameras 300-1 to 300-N.
- here, the difference is calculated using the north latitude and east longitude, and the camera with the smallest difference is selected. More specifically, the seconds value of north latitude and the seconds value of east longitude are compared, and the camera having the smallest difference is selected. When a plurality of cameras have the same smallest difference, all of them are selected as target cameras. In this specific example, the camera 300-N is selected.
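The selection logic can be sketched as below. The camera coordinates here are invented for illustration (they are not the values of Table 1, which is not reproduced in this excerpt), and the tie-handling follows the reading that all cameras sharing the smallest difference are selected.

```python
# Select the camera nearest to the mobile terminal by comparing
# latitude/longitude differences, as the selection unit 2223 does.

def dms_to_deg(d, m, s):
    """Degrees/minutes/seconds to decimal degrees."""
    return d + m / 60 + s / 3600

def select_cameras(terminal, cameras):
    """Return all cameras tied for the smallest coordinate difference."""
    def diff(cam):
        return (abs(cam["lat"] - terminal["lat"])
                + abs(cam["lon"] - terminal["lon"]))
    best = min(diff(c) for c in cameras)
    return [c["name"] for c in cameras if abs(diff(c) - best) < 1e-12]

terminal = {"lat": dms_to_deg(35, 38, 1.1), "lon": dms_to_deg(139, 44, 52.9)}
cameras = [
    {"name": "300-1", "lat": dms_to_deg(35, 38, 9.0), "lon": dms_to_deg(139, 44, 40.0)},
    {"name": "300-2", "lat": dms_to_deg(35, 38, 5.0), "lon": dms_to_deg(139, 44, 50.0)},
    {"name": "300-N", "lat": dms_to_deg(35, 38, 1.5), "lon": dms_to_deg(139, 44, 52.0)},
]
print(select_cameras(terminal, cameras))  # ['300-N']
```

Over the small distances involved, a coordinate difference is an adequate proxy for distance; for larger areas a proper geodesic distance would be used instead.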
- step S530 processing similar to that in step S220 in FIG. 3 is performed.
- control unit 2220 periodically receives location information and transmission time information of mobile terminal 2100.
- the control unit 2220 receives the acceleration and acceleration direction of the mobile terminal 2100 acquired by the acceleration sensor 2140 of the mobile terminal 2100.
- the control unit 2220 calculates the predicted movement amount of the guard from the most recently received position information, transmission time information, and acceleration of the mobile terminal 2100.
- the control unit 2220 predicts the position where the mobile terminal 2100 will be located at the current time based on the calculated predicted movement amount and the acquired acceleration direction.
- the control unit 2220 controls pan, tilt, and zoom of each camera 300-1 to 300-N by the above-described method so that the predicted position of the mobile terminal 2100 is included in the imaging range.
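The position prediction in the steps above can be sketched as a kinematic dead-reckoning calculation. The function name and the optional speed parameter are assumptions for illustration; a real controller might also use map constraints or the Kalman prediction described in Embodiment 1.

```python
# Predict where the mobile terminal is now from its last reported
# position, the elapsed time since that report, and the acceleration
# (with direction) reported by the terminal's sensor.

def predict_position(last_pos, last_time, now, accel, direction, speed=0.0):
    """last_pos: (x, y) in metres; direction: unit vector of the
    acceleration; accel in m/s^2; speed: guard's speed at last_time."""
    dt = now - last_time
    # distance travelled = v*t + a*t^2/2 along the acceleration direction
    d = speed * dt + 0.5 * accel * dt * dt
    return (last_pos[0] + d * direction[0],
            last_pos[1] + d * direction[1])

# Guard last reported at (0, 0) two seconds ago, accelerating 1 m/s^2 east:
print(predict_position((0.0, 0.0), 10.0, 12.0, 1.0, (1.0, 0.0)))  # (2.0, 0.0)
```

The predicted point is then fed to the pan/tilt/zoom control so that the imaging range covers where the guard is expected to be now rather than where the last report placed them.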
- step S540 the same process as in step S230 of FIG. 3 is performed, and the flow ends.
(Tracking control by management device)
- next, a tracking control flow of the management apparatus 2200 will be described with reference to FIG. 15. FIG. 15 is a flowchart of tracking control in which the management apparatus 2200 performs tracking.
- step S610 processing similar to that in step S310 in FIG. 4 is performed.
- step S620 the same processing as in step S320 of FIG. 4 is performed.
- step S630 the tracking unit 2221 extracts a feature amount from the acquired video.
- the tracking unit 2221 compares the acquired feature amount with the guard's feature amount stored in the storage unit 1222 in advance, and extracts the guard. Then, the tracking unit 2221 determines the extracted guard as the provisional tracking target. Furthermore, the tracking unit 2221 tracks the determined provisional tracking target.
- the tracking method is the same as the tracking algorithm of the first embodiment.
- step S640 processing similar to that in step S330 in FIG. 4 is performed.
- step S650 the tracking unit 2221 displays the video of each camera 300-1 to 300-N acquired in step S620, the temporary tracking target set in step S630, and the tracking candidate extracted in step S640 on the display unit 1230.
- the display unit 1230 displays the video of each camera 300-1 to 300-N acquired in step S620, and highlights the temporary tracking target set in step S630 and the tracking candidate extracted in step S640 on the screen.
- as the highlighting method, various methods can be employed, such as surrounding the detection target with a line or overlaying it with a marker.
- step S660 processing similar to that in step S350 in FIG. 4 is performed.
- step S670 the tracking unit 2221 changes the tracking target from the temporary tracking target to the specified tracking target based on the designation of the tracking target, and continues tracking.
- the specific tracking target detection algorithm is the same as in the first embodiment.
(Specific example of Embodiment 2)
- the configuration in FIG. 5 is the same as that described in the first embodiment.
- the acceleration sensor 2140 of the mobile terminal 2100 acquires the guard's acceleration.
- the motion detection unit 2120 transmits control switching information to the management device 2200 via the second communication unit 1132 when detecting that the acquired acceleration is equal to or greater than a predetermined acceleration.
- FIG. 16 is an overhead view of camera control of the management apparatus according to the second embodiment.
- the control unit 2220 acquires video from each of the cameras 300-1 to 300-3.
- the control unit 2220 selects the camera 300-2 closest to the position of the mobile terminal 2100 based on the position information of each of the cameras 300-1 to 300-3 and the position information of the mobile terminal 2100. Further, the control unit 2220 directs the line-of-sight direction of the camera 300-2 toward the position of the mobile terminal 2100.
- FIG. 17 is a diagram illustrating a specific example of the display unit of the management apparatus according to the second embodiment.
- the control unit 2220 displays images taken by the cameras 300-1 to 300-3 on the display unit 1230.
- the control unit 2220 extracts the security guard shown in each camera video and determines the guard as the temporary tracking target.
- the control unit 2220 extracts the tracking candidate 1 shown in each camera video.
- the display unit 1230 displays the images of the cameras 300-1 to 300-3, and surrounds and highlights the temporary tracking target and the tracking candidate 1.
- the control unit 2220 causes the display unit 1230 to perform display so that one or more tracking candidates can be selected so that a tracking target can be determined from the tracking candidates.
- the management device operator selects the tracking candidate 1 and presses the decision button, thereby determining the tracking candidate 1 as the tracking target. Further, the control unit 2220 performs camera control for tracking the determined tracking target.
(Function and effect)
- since the acceleration sensor of the mobile terminal is used to estimate whether the guard is in pursuit, and control switching information is transmitted from the mobile terminal to the management device when the guard is estimated to be in pursuit, the need for a mode change can be communicated to the management device even when the guard cannot directly operate the mobile terminal to send the control switching information, that is, when a suspicious person or suspicious object must be pursued immediately.
- a monitoring camera to be controlled is selected from a plurality of monitoring cameras, it is possible to limit the monitoring cameras and capture the target with the minimum number of cameras.
- further, since tracking is started with the guard as the temporary tracking target, the traffic volume and the communication time with the mobile terminal can be reduced.
(Embodiment 3)
(Characteristics of the invention)
- in this embodiment, the cycle for acquiring the position of the mobile terminal is shortened.
- the mobile terminal ends the control for shortening the cycle when the operator of the management apparatus determines the tracking target.
- FIG. 18 is a flowchart in which the mobile terminal 2100 transmits information to the management apparatus 2200.
- step S710 the same processing as in step S110 of FIG. 2 is performed.
- step S720 processing similar to that in step S120 in FIG. 2 is performed.
- step S730 the position acquisition unit 1110 acquires control switching information from the operation detection unit 2120.
- in this case, the position acquisition unit 1110 changes the acquisition cycle of the position information it acquires to a high-speed cycle that is shorter than the normal low-speed cycle. Specifically, when no control switching information has been received, the position acquisition unit 1110 acquires its own position at a cycle of 10 seconds (low-speed cycle); upon receiving control switching information, it switches to acquisition at a cycle of 2 seconds (high-speed cycle).
- here, the first communication unit 1131 is configured to transmit the position information to the management device 2200 every time the position acquisition unit 1110 acquires it, but the first communication unit 1131 may likewise be configured to change its transmission cycle according to the control switching information.
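The cycle switching described above can be sketched as a small state holder. The class and method names are assumptions; only the 10-second and 2-second values come from the text.

```python
# Sketch of the position-acquisition cycle switching in Embodiment 3:
# normally the position is acquired every 10 s (low-speed cycle); when
# control switching information arrives the cycle drops to 2 s, and it
# returns to 10 s once the management device reports that a tracking
# target has been set.

LOW_SPEED_CYCLE = 10.0   # seconds, normal mode
HIGH_SPEED_CYCLE = 2.0   # seconds, tracking mode

class PositionAcquirer:
    def __init__(self):
        self.cycle = LOW_SPEED_CYCLE

    def on_control_switching(self):
        """Control switching information received: speed up acquisition."""
        self.cycle = HIGH_SPEED_CYCLE

    def on_tracking_target_set(self):
        """Tracking target determined: revert to the normal cycle."""
        self.cycle = LOW_SPEED_CYCLE

acq = PositionAcquirer()
print(acq.cycle)              # 10.0
acq.on_control_switching()
print(acq.cycle)              # 2.0
acq.on_tracking_target_set()
print(acq.cycle)              # 10.0
```

Reverting on the tracking-target-set notification is what keeps the high-speed (and battery- and traffic-hungry) cycle from running longer than needed.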
- step S740 processing similar to that in step S130 in FIG. 2 is executed.
- step S750 the position acquisition unit 1110 determines whether or not the tracking unit 2221 of the control unit 2220 has acquired information indicating that the tracking target has been set. If the tracking unit 2221 has not acquired information indicating that the tracking target has been set, the position acquisition unit 1110 remains in step S750. If the position acquisition unit 1110 has acquired information indicating that the tracking unit 2221 has set the tracking target, the process proceeds to step S760.
- step S760 the position acquisition unit 1110 changes the high-speed cycle to the low-speed cycle when acquiring information indicating that the tracking unit 2221 has set the tracking target. Thereafter, the flow ends.
- the camera control by the management apparatus is a control flow similar to that in FIG.
(Tracking control by management device)
- FIG. 19 is a flowchart of tracking control in which the management apparatus 2200 performs tracking.
- step S810 processing similar to that in step S310 in FIG. 4 is performed.
- step S820 processing similar to that in step S320 in FIG. 4 is performed.
- step S830 processing similar to that in step S630 in FIG. 15 is performed.
- step S840 the control unit 2220 transmits information indicating that the provisional tracking target is set to the mobile terminal 2100 via the communication unit 1210.
- here, the control unit 2220 transmits information indicating that the provisional tracking target has been set to the mobile terminal 2100; alternatively, the control unit 2220 may transmit to the mobile terminal 2100 information indicating that the tracking target has been set.
- in step S850, the tracking unit 2221 detects, from the images of the cameras 300-1 to 300-N, a moving object whose comparison result with the temporary tracking target satisfies a predetermined condition. Then, the tracking unit 2221 sets the detected moving object as the new tracking target. That is, the tracking unit 2221 changes the tracking target from the temporary tracking target, which is the guard, to the detected moving body.
- a more specific determination algorithm is shown below. First, the tracking unit 2221 detects a provisional tracking target and at least one other moving body from a plurality of frames in the images of the cameras 300-1 to 300-N.
- the tracking unit 2221 then compares the image data between the read frames, thereby acquiring a motion vector indicating the movement amount and movement direction of the temporary tracking target in the image (within the angle of view) and a motion vector indicating the movement amount and movement direction of the other moving object.
- the tracking unit 2221 sets, as a new tracking target, another moving body in which the motion vector of the provisional tracking target and the motion vector of the other moving body are in substantially the same direction and the difference in the moving amount is equal to or less than a predetermined value.
- when there is a moving body but no other moving body that satisfies the predetermined condition, or when a plurality of other moving bodies that satisfy the predetermined condition are detected, the tracking unit 2221 highlights the extracted moving bodies on the display unit 1230 as tracking candidates.
- the control unit 2220 requests the operator of the management apparatus 2200 to select a tracking target.
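The determination algorithm above can be sketched as follows: another moving body whose motion vector points in substantially the same direction as the guard's, with a movement-amount difference at or below a threshold, becomes the new target; with no match or several matches, the operator is asked to choose. The angle and amount thresholds are illustrative assumptions.

```python
# Decide the new tracking target from motion-vector similarity (step S850).

import math

MAX_ANGLE_DEG = 20.0    # "substantially the same direction"
MAX_AMOUNT_DIFF = 5.0   # allowed difference in movement amount (pixels)

def angle_between(v, w):
    """Angle in degrees between two 2D motion vectors."""
    dot = v[0] * w[0] + v[1] * w[1]
    nv, nw = math.hypot(*v), math.hypot(*w)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nv * nw)))))

def matches(guard_vec, body_vec):
    same_dir = angle_between(guard_vec, body_vec) <= MAX_ANGLE_DEG
    close = abs(math.hypot(*guard_vec) - math.hypot(*body_vec)) <= MAX_AMOUNT_DIFF
    return same_dir and close

def new_tracking_target(guard_vec, bodies):
    """bodies: {name: motion_vector}. Returns (target, candidates)."""
    hits = [n for n, v in bodies.items() if matches(guard_vec, v)]
    if len(hits) == 1:
        return hits[0], []    # unique match: switch the target automatically
    return None, hits         # none or several: let the operator choose

guard = (10.0, 0.0)           # guard runs right at 10 px/frame
bodies = {"A": (9.0, 1.0),    # fleeing alongside the guard
          "B": (-8.0, 0.0),   # moving the opposite way
          "C": (0.0, 0.5)}    # nearly stationary
print(new_tracking_target(guard, bodies))  # ('A', [])
```

The intuition is that the person the guard is chasing moves in nearly the same direction and at nearly the same speed as the guard; bodies moving differently are excluded.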
- in step S860, the tracking unit 2221 changes the tracking target to the new tracking target set in step S850 and continues tracking. Thereafter, the flow ends.
- the specific tracking target detection algorithm is the same as in the first embodiment.
(Function and effect)
- until the tracking target is determined, shortening the cycle for acquiring the position of the mobile terminal improves the real-time accuracy of the position information that can be transmitted to the management device, and using the determination of the tracking target as the trigger for ending the cycle-shortening control prevents the shortened acquisition cycle from continuing longer than necessary. In addition, since the temporary tracking target can be automatically switched to the tracking target that should actually be tracked, the tracking target can be captured quickly.
- A monitoring system comprising: a mobile terminal; and a management device that communicates with the mobile terminal, wherein the mobile terminal includes a first transmission unit that transmits the position information of the mobile terminal to the management device at a predetermined interval, and the management device includes a control unit that controls the orientation of a camera based on the position information when the position information is acquired.
- (Appendix 2) The monitoring system according to appendix 1, wherein the control unit controls the orientation of the camera so that the shooting range covers the position indicated by the position information.
- The monitoring system according to appendix 1, wherein the mobile terminal further includes a second transmission unit that transmits control switching information for switching camera control of the camera to the management device when a predetermined operation is detected, and the control unit controls the orientation of the camera based on the position information of the mobile terminal when the control switching information is received.
- The monitoring system according to appendix 1 or 2, wherein the management device has a normal mode for monitoring a predetermined range and a tracking mode for monitoring based on the position information, and when the control switching information is received, the control unit switches from the normal mode to the tracking mode and controls the orientation of the camera based on the position information of the mobile terminal.
- The monitoring system according to any one of appendices 1 to 4, wherein the control unit acquires the position information of the camera and selects at least one camera to be controlled from among the cameras based on the acquired position information of the camera and the acquired position information of the mobile terminal.
- The management device further includes a tracking unit that extracts tracking target candidates based on the video acquired from the camera.
- The monitoring device wherein the tracking unit identifies the holder of the mobile terminal based on the video acquired from the camera and tracks the identified holder as the temporary tracking target.
- The monitoring device according to any one of appendices 6 to 8, wherein the tracking unit further extracts a moving body different from the holder of the mobile terminal based on the video acquired from the camera, and the display unit highlights the extracted moving body as the tracking target candidate.
- Appendix 10 The monitoring system according to any one of appendices 1 to 9, wherein when the predetermined operation is detected, the mobile terminal shortens an interval for acquiring the position information, compared to a case where the predetermined operation is not detected.
- The monitoring system wherein, when the tracking unit determines the tracking target, the management device changes the interval at which the position information is acquired back to the acquisition interval used when the predetermined operation is not detected.
- Appendix 12 The monitoring system according to any one of appendices 2 to 11, wherein the first transmission unit transmits position information of the mobile terminal to the management device at a predetermined interval when receiving an input to the mobile terminal.
- the mobile terminal further includes an acceleration sensor,
- the first transmission unit transmits the position information of the mobile terminal to the management device at a predetermined interval when the acceleration acquired by the acceleration sensor exceeds a predetermined value.
- A facility monitoring system wherein the mobile terminal includes a first transmission unit that transmits the position information of the mobile terminal to the management device at a predetermined interval, and the management device includes a control unit that controls the direction of the camera based on the position information when the position information is acquired.
- A mobile terminal comprising a first transmission unit that transmits the position information of its own device to the management device at a predetermined interval, wherein the first transmission unit starts transmission of the position information to the management device when a predetermined operation is detected.
- Appendix 16 The mobile terminal according to appendix 15, wherein the first transmission unit transmits the position information to the management apparatus at a predetermined interval when receiving an input to the mobile terminal.
- (Appendix 18) A management device comprising: an acquisition unit that acquires the position information of a mobile terminal at a predetermined interval; and a control unit that controls the orientation of the camera based on the position information when the position information of the mobile terminal is acquired.
- (Appendix 19) The management device according to appendix 18, wherein the control unit controls the orientation of the camera based on the position information of the mobile terminal when receiving control switching information for switching camera control from the mobile terminal.
- (Appendix 20) A monitoring method comprising: acquiring the position information of a mobile terminal at a predetermined interval; and controlling the direction of a camera based on the position information when the position information is acquired.
- (Appendix 21) A monitoring program that causes a computer to execute: a process of acquiring the position information of a mobile terminal at a predetermined interval; and a process of controlling the orientation of a camera based on the position information when the position information is acquired.
- A monitoring system comprising: a mobile terminal; and a management device that communicates with the mobile terminal, wherein the mobile terminal includes a first transmission unit that transmits the position information of the mobile terminal to the management device, and the management device includes: a control unit that controls the orientation of the camera based on the position information when the position information is acquired; a tracking unit that extracts tracking target candidates based on the video acquired from the camera; and a display unit that displays the video and highlights the tracking target candidates included in the video in a selectable manner.
- The monitoring system according to any one of appendices 22 to 24, wherein the tracking unit further extracts a moving body different from the holder of the mobile terminal based on the video acquired from the camera, and the display unit highlights the extracted moving body as the tracking target candidate.
- The present invention can be used for monitoring devices, monitoring methods, monitoring programs, monitoring systems, and facility monitoring systems that perform surveillance.
Abstract
Description
A monitoring system according to the present invention includes a mobile terminal and a management device that communicates with the mobile terminal. The mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device. The management device has: a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information; a tracking unit that extracts tracking target candidates based on video acquired from the camera; and a display unit that displays the video and highlights, in a selectable manner, the tracking target candidates included in the video.
(Features of the Invention)
In this embodiment, a monitoring system controls surveillance cameras so that they capture the location of a mobile terminal carried by a security guard, based on the position information of that terminal. Tracking candidates extracted from the video obtained by the surveillance cameras are displayed to the operator of the management device, and the operator is prompted to select one of the displayed candidates.
(Configuration)
FIG. 1 is a block diagram illustrating the configuration of a monitoring system 1000 according to this embodiment. In this embodiment, the monitoring system 1000 is a system for monitoring facilities and the like (critical facilities such as harbors, airports, station platforms, power stations, plants, and dams, as well as warehouse districts, leisure facilities, stadiums, commercial facilities, buildings, and cities or streets). More specifically, an embodiment is described in which the monitored facility is under both patrol monitoring by security guards and camera monitoring. Referring to FIG. 1, the monitoring system 1000 comprises a mobile terminal 1100, a management device 1200, and cameras 300-1 to 300-N.
(Control of the Mobile Terminal)
Next, the control flow in the mobile terminal 1100 is described with reference to FIG. 2. FIG. 2 is a flow diagram of the mobile terminal 1100 transmitting information to the management device 1200.
(Camera Control by the Management Device)
Next, the flow in which the management device 1200 controls the cameras 300-1 to 300-N is described with reference to FIG. 3. FIG. 3 is a control flow diagram of the management device 1200 controlling the capturing ranges of the cameras 300-1 to 300-N.
(Tracking Control by the Management Device)
Next, the tracking control flow of the management device 1200 is described with reference to FIG. 4. FIG. 4 is a flow diagram of the tracking control performed by the management device 1200.
(Specific Example of Embodiment 1)
Next, a specific example of Embodiment 1 is described with reference to FIGS. 5 to 10. FIG. 5 is an overhead view showing a specific example of the operation when a security guard finds a suspicious person in Embodiment 1.
(Effects)
Since the position information of the mobile terminal can be acquired successively, the likelihood that a camera captures the security guard or the suspicious person improves even while the guard is moving. In addition, by performing camera control based on the position information of the mobile terminal and displaying the tracking candidates in a selectable manner, a tracking target other than the security guard who carries the mobile terminal can be selected. Furthermore, by having the mobile terminal send the management device images it has taken of the tracking target, such as a suspicious person, together with the position information, the certainty of tracking-candidate extraction at the management device can be raised and the candidates can be narrowed down.
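The camera-orientation control described above can be sketched roughly as follows. This is only an illustration: the flat-ground coordinate convention, the camera mounting height, and the function name `compute_pan_tilt` are assumptions and do not appear in the disclosure.

```python
import math

def compute_pan_tilt(camera_pos, target_pos, camera_height=5.0):
    """Aim a fixed camera at the ground position reported by a mobile
    terminal. Positions are (x, y) in metres on a flat plane; the camera
    height is an assumed mounting height."""
    dx = target_pos[0] - camera_pos[0]
    dy = target_pos[1] - camera_pos[1]
    pan = math.degrees(math.atan2(dy, dx))            # horizontal bearing
    tilt = -math.degrees(math.atan2(camera_height,    # downward angle
                                    math.hypot(dx, dy)))
    return pan, tilt

pan, tilt = compute_pan_tilt((0.0, 0.0), (10.0, 10.0))
print(round(pan, 1), round(tilt, 1))  # 45.0 -19.5
```

In practice the computed angles would be sent to the camera's pan/tilt drive each time a new terminal position arrives.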
[Embodiment 2]
In this embodiment, the monitoring system uses the acceleration sensor of the mobile terminal to estimate whether the security guard is pursuing someone, and when it is determined that the guard is pursuing, the mobile terminal transmits control switching information to the management device. In addition, one surveillance camera to be controlled is selected from among the plurality of surveillance cameras. Furthermore, before tracking candidates are extracted and the operator of the management device is asked to select a tracking target, tracking is started with the security guard as a provisional tracking target.
(Configuration)
FIG. 12 is a block diagram illustrating the configuration of a monitoring system 2000 according to this embodiment. Referring to FIG. 12, the monitoring system 2000 comprises a mobile terminal 2100, a management device (management server) 2200, and cameras 300-1 to 300-N.
(Control of the Mobile Terminal)
Next, the control flow in the mobile terminal 2100 is described with reference to FIG. 13. FIG. 13 is a flow diagram of the mobile terminal 2100 transmitting information to the management device 2200.
(Camera Control by the Management Device)
Next, the control flow in which the management device 2200 controls the cameras 300-1 to 300-N is described with reference to FIG. 14 and Table 1 below. FIG. 14 is a control flow diagram of the management device 2200 controlling the capturing range of each of the cameras 300-1 to 300-N. Table 1 shows the position information of each of the cameras 300-1 to 300-N stored in the storage unit 1222.
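A camera-selection step of this kind can be sketched as follows. The table contents and the nearest-camera criterion are assumptions for illustration; the disclosure states only that at least one camera is selected based on the camera positions and the terminal position.

```python
import math

# Hypothetical position table for cameras 300-1 to 300-3, standing in
# for the per-camera position information held in the storage unit 1222.
CAMERA_POSITIONS = {
    "300-1": (0.0, 0.0),
    "300-2": (50.0, 0.0),
    "300-3": (0.0, 50.0),
}

def select_camera(terminal_pos, cameras=CAMERA_POSITIONS):
    """Pick the camera closest to the terminal's reported position."""
    return min(cameras, key=lambda cid: math.dist(cameras[cid], terminal_pos))

print(select_camera((45.0, 5.0)))  # 300-2
```

Limiting control to the nearest camera is one simple way to realize the "select at least one camera" step; a real system might also weigh field of view or occlusions.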
(Tracking Control by the Management Device)
Next, the tracking control flow of the management device 2200 is described with reference to FIG. 15. FIG. 15 is a flow diagram of the tracking control performed by the management device 2200.
(Specific Example of Embodiment 2)
Next, a specific example of Embodiment 2 is shown with reference to FIGS. 5, 16, and 17. FIG. 5 shows the same configuration as described in Embodiment 1. In FIG. 5, when the security guard finds a suspicious person, the guard moves toward the suspicious person to apprehend him or her without operating the mobile terminal 2100. The acceleration sensor 2140 of the mobile terminal 2100 acquires the guard's acceleration. When the action detection unit 2120 detects that the acquired acceleration is equal to or greater than a predetermined acceleration, it transmits control switching information to the management device 2200 via the second communication unit 1132.
(Effects)
Because the acceleration sensor of the mobile terminal is used to estimate whether the security guard is pursuing someone, and control switching information is transmitted from the mobile terminal to the management device when the guard is estimated to be pursuing, the information needed for a mode change can be conveyed to the management device even in situations where the guard cannot directly operate the mobile terminal to send it, that is, when a suspicious person or object must be pursued immediately. Furthermore, by selecting the surveillance camera to be controlled from among the plurality of surveillance cameras, the cameras involved can be limited and the target can be captured with a minimum number of cameras. In addition, by starting tracking with the security guard as a provisional tracking target before tracking candidates are extracted and the operator of the management device selects a tracking target, the amount and duration of communication with the mobile terminal can be reduced.
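The acceleration-based switching can be illustrated with a minimal sketch. The threshold value and the function name are assumptions; the disclosure specifies only a comparison against a predetermined acceleration.

```python
ACCEL_THRESHOLD = 3.0  # m/s^2, assumed value of the "predetermined acceleration"

def should_send_control_switch(accel_samples, threshold=ACCEL_THRESHOLD):
    # Send control switching information as soon as any sampled
    # acceleration magnitude reaches the threshold (e.g. the guard
    # breaks into a run toward a suspicious person).
    return any(a >= threshold for a in accel_samples)

print(should_send_control_switch([0.4, 1.1, 3.2]))  # True
print(should_send_control_switch([0.4, 1.1]))       # False
```

A real terminal would evaluate this on a sliding window of sensor readings rather than a fixed list.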
[Embodiment 3]
(Features of the Invention)
In a configuration in which the mobile terminal periodically transmits its position information to the management device, the period at which the position of the mobile terminal is acquired is shortened when control switching information is transmitted. The mobile terminal ends this shortened-period control when the operator of the management device decides on a tracking target. In addition, the system automatically switches from the provisional tracking target to the tracking target, that is, the target that should actually be tracked.
(Configuration)
The functional blocks and hardware configuration of Embodiment 3 are the same as those of Embodiment 2.
(Control of the Mobile Terminal)
Next, the control flow in the mobile terminal 2100 is described with reference to FIG. 18. FIG. 18 is a flow diagram of the mobile terminal 2100 transmitting information to the management device 2200.
(Camera Control by the Management Device)
Camera control by the management device follows the same control flow as FIG. 14 of Embodiment 2, so its description is omitted.
(Tracking Control by the Management Device)
Next, the tracking control flow in the management device 2200 is described with reference to FIG. 19. FIG. 19 is a flow diagram of the tracking control performed by the management device 2200.
(Effects)
By shortening the period at which the position of the mobile terminal is acquired until the tracking target is decided, the real-time quality of the position information that can be sent to the management device improves; and by using the decision of the tracking target as the trigger to end the shortened-period control, the duration for which the acquisition period is shortened is kept from becoming excessively long. In addition, since the switch from the provisional tracking target to the tracking target, the target that should actually be tracked, is performed automatically, the tracking target can be captured quickly.
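The period-switching logic above can be sketched in a few lines. The two interval values and the function name are illustrative assumptions; the disclosure specifies only that the period is shortened after control switching and restored once the tracking target is decided.

```python
NORMAL_INTERVAL = 10.0   # s, assumed default position-reporting period
TRACKING_INTERVAL = 1.0  # s, assumed shortened period during pursuit

def next_interval(control_switched, target_decided):
    # Shorten the period once control switching information has been
    # sent, and restore the default when the operator of the management
    # device decides on the tracking target.
    if control_switched and not target_decided:
        return TRACKING_INTERVAL
    return NORMAL_INTERVAL

print(next_interval(True, False), next_interval(True, True))  # 1.0 10.0
```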
(Appendix 1) A monitoring system comprising: a mobile terminal; and a management device that communicates with the mobile terminal, wherein the mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device at predetermined intervals, and the management device has a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information.
(Appendix 2) The monitoring system according to appendix 1, wherein the control unit controls the orientation of the camera toward a capturing range in which the position indicated by the position information is captured.
(Appendix 3) The monitoring system according to appendix 1, wherein the mobile terminal has a second transmission unit that, when a predetermined action is detected, transmits to the management device control switching information for switching camera control of the camera, and the control unit, when the control switching information is received, controls the orientation of the camera based on the position information of the mobile terminal.
(Appendix 4) The monitoring system according to appendix 1 or 2, wherein the management device has a normal mode for monitoring a predetermined range and a tracking mode for monitoring based on the position information, and the control unit, when the control switching information is received, switches from the normal mode to the tracking mode and controls the orientation of the camera based on the position information of the mobile terminal.
(Appendix 5) The monitoring system according to any one of appendices 1 to 4, wherein the control unit acquires position information of the cameras and selects, based on the acquired position information of the cameras and the acquired position information of the mobile terminal, at least one camera to be controlled from among the cameras.
(Appendix 6) The monitoring system according to any one of appendices 1 to 5, wherein the management device has: a tracking unit that extracts tracking target candidates based on video acquired from the camera; and a display unit that displays the video and highlights, in a selectable manner, the tracking target candidates included in the video.
(Appendix 7) The monitoring system according to appendix 6, wherein the tracking unit sets a selected tracking target candidate as a tracking target and tracks the tracking target.
(Appendix 8) The monitoring system according to appendix 6 or 7, wherein the tracking unit identifies the owner of the mobile terminal based on the video acquired from the camera and tracks the identified owner as a provisional tracking target.
(Appendix 9) The monitoring system according to any one of appendices 6 to 8, wherein the tracking unit further extracts, based on the video acquired from the camera, a moving object different from the owner of the mobile terminal, and the display unit highlights the extracted moving object as a tracking target candidate.
(Appendix 10) The monitoring system according to any one of appendices 1 to 9, wherein the mobile terminal, when the predetermined action is detected, shortens the interval at which the position information is acquired compared with when the predetermined action is not detected.
(Appendix 11) The monitoring system according to appendix 7, wherein the management device, when the tracking unit decides the tracking target, changes the interval at which the position information is acquired to the acquisition interval used when the predetermined action is not detected.
(Appendix 12) The monitoring system according to any one of appendices 2 to 11, wherein the first transmission unit, when an input to the mobile terminal is accepted, transmits the position information of the mobile terminal to the management device at predetermined intervals.
(Appendix 13) The monitoring system according to any one of appendices 2 to 11, wherein the mobile terminal further has an acceleration sensor, and the first transmission unit, when the acceleration acquired by the acceleration sensor becomes equal to or greater than a predetermined value, transmits the position information of the mobile terminal to the management device at predetermined intervals.
(Appendix 14) A facility monitoring system comprising: a mobile terminal carried by a security guard who monitors a specific facility; a management device that communicates with the mobile terminal and monitors the specific facility; and at least one camera capable of capturing at least a part of the specific facility, wherein the mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device at predetermined intervals, and the management device has a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information.
(Appendix 15) A mobile terminal comprising a first transmission unit that transmits position information of the terminal itself to a management device at predetermined intervals, wherein the first transmission unit starts transmitting the position information to the management device when a predetermined action is detected.
(Appendix 16) The mobile terminal according to appendix 15, wherein the first transmission unit, when an input to the mobile terminal is accepted, transmits the position information to the management device at predetermined intervals.
(Appendix 17) The mobile terminal according to appendix 15, further comprising an acceleration sensor, wherein the first transmission unit, when the acceleration acquired by the acceleration sensor becomes equal to or greater than a predetermined value, transmits the position information to the management device at predetermined intervals.
(Appendix 18) A management device comprising: an acquisition unit that acquires position information of a mobile terminal at predetermined intervals; and a control unit that, when the position information of the mobile terminal is acquired, controls the orientation of a camera based on the position information.
(Appendix 19) The management device according to appendix 18, wherein the control unit, when control switching information for switching camera control is received from the mobile terminal, controls the orientation of the camera based on the position information of the mobile terminal.
(Appendix 20) A monitoring method comprising: acquiring position information of a mobile terminal at predetermined intervals; and, when the position information is acquired, controlling the orientation of a camera based on the position information.
(Appendix 21) A monitoring program that causes a computer to execute: a process of acquiring position information of a mobile terminal at predetermined intervals; and a process of controlling the orientation of a camera based on the position information when the position information is acquired.
(Appendix 22) A monitoring system comprising: a mobile terminal; and a management device that communicates with the mobile terminal, wherein the mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device, and the management device has: a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information; a tracking unit that extracts tracking target candidates based on video acquired from the camera; and a display unit that displays the video and highlights, in a selectable manner, the tracking target candidates included in the video.
(Appendix 23) The monitoring system according to appendix 22, wherein the tracking unit sets a selected tracking target candidate as a tracking target and tracks the tracking target.
(Appendix 24) The monitoring system according to appendix 22 or 23, wherein the tracking unit identifies the owner of the mobile terminal based on the video acquired from the camera and tracks the identified owner as a provisional tracking target.
(Appendix 25) The monitoring system according to any one of appendices 22 to 24, wherein the tracking unit further extracts, based on video acquired from a camera, a moving object different from the owner of the mobile terminal, and the display unit highlights the extracted moving object as a tracking target candidate.
1100, 2100 mobile terminal
1110 position acquisition unit
1120, 2120 action detection unit
1130 communication unit
1131 first communication unit
1132 second communication unit
2140 acceleration sensor
1200, 2200 management device
1210 communication unit
1211 first communication unit
1212 second communication unit
1220, 2220 control unit
1221, 2221 tracking unit
1222 storage unit
2223 selection unit
1230 display unit
300-1 to 300-N camera
100, 200 CPU
130 display unit
110, 210 input reception unit
120, 220 memory
230 connection unit
Claims (25)
- A monitoring system comprising: a mobile terminal; and a management device that communicates with the mobile terminal, wherein the mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device at predetermined intervals, and the management device has a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information.
- The monitoring system according to claim 1, wherein the control unit controls the orientation of the camera toward the position indicated by the position information.
- The monitoring system according to claim 1, wherein the mobile terminal has a second transmission unit that, when a predetermined action is detected, transmits to the management device control switching information for switching camera control of the camera, and the control unit, when the control switching information is received, controls the orientation of the camera based on the position information of the mobile terminal.
- The monitoring system according to claim 3, wherein the management device has a normal mode for monitoring a predetermined range and a tracking mode for monitoring based on the position information, and the control unit, when the control switching information is received, switches from the normal mode to the tracking mode and controls the orientation of the camera based on the position information of the mobile terminal.
- The monitoring system according to any one of claims 1 to 4, wherein the control unit acquires position information of the cameras and selects, based on the acquired position information of the cameras and the acquired position information of the mobile terminal, at least one camera to be controlled from among the cameras.
- The monitoring system according to any one of claims 1 to 5, wherein the management device has: a tracking unit that extracts tracking target candidates based on video acquired from the camera; and a display unit that displays the video and highlights, in a selectable manner, the tracking target candidates included in the video.
- The monitoring system according to claim 6, wherein the tracking unit sets a selected tracking target candidate as a tracking target and tracks the tracking target.
- The monitoring system according to claim 6 or 7, wherein the tracking unit identifies the owner of the mobile terminal based on the video acquired from the camera and tracks the identified owner as a provisional tracking target.
- The monitoring system according to any one of claims 6 to 8, wherein the tracking unit further extracts, based on the video acquired from the camera, a moving object different from the owner of the mobile terminal, and the display unit highlights the extracted moving object as a tracking target candidate.
- The monitoring system according to any one of claims 1 to 9, wherein the mobile terminal, when the predetermined action is detected, shortens the interval at which the position information is acquired compared with when the predetermined action is not detected.
- The monitoring system according to claim 7, wherein the management device, when the tracking unit decides the tracking target, changes the interval at which the position information is acquired to the acquisition interval used when the predetermined action is not detected.
- The monitoring system according to any one of claims 2 to 11, wherein the first transmission unit, when an input to the mobile terminal is accepted, transmits the position information of the mobile terminal to the management device at predetermined intervals.
- The monitoring system according to any one of claims 2 to 11, wherein the mobile terminal further has an acceleration sensor, and the first transmission unit, when the acceleration acquired by the acceleration sensor becomes equal to or greater than a predetermined value, transmits the position information of the mobile terminal to the management device at predetermined intervals.
- A facility monitoring system comprising: a mobile terminal carried by a security guard who monitors a specific facility; a management device that communicates with the mobile terminal and monitors the specific facility; and at least one camera capable of capturing at least a part of the specific facility, wherein the mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device at predetermined intervals, and the management device has a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information.
- A mobile terminal comprising a first transmission unit that transmits position information of the terminal itself to a management device at predetermined intervals, wherein the first transmission unit starts transmitting the position information to the management device when a predetermined action is detected.
- The mobile terminal according to claim 15, wherein the first transmission unit, when an input to the mobile terminal is accepted, transmits the position information to the management device at predetermined intervals.
- The mobile terminal according to claim 15, further comprising an acceleration sensor, wherein the first transmission unit, when the acceleration acquired by the acceleration sensor becomes equal to or greater than a predetermined value, transmits the position information to the management device at predetermined intervals.
- A management device comprising: an acquisition unit that acquires position information of a mobile terminal at predetermined intervals; and a control unit that, when the position information of the mobile terminal is acquired, controls the orientation of a camera based on the position information.
- The management device according to claim 18, wherein the control unit, when control switching information for switching camera control is received from the mobile terminal, controls the orientation of the camera based on the position information of the mobile terminal.
- A monitoring method comprising: acquiring position information of a mobile terminal at predetermined intervals; and, when the position information is acquired, controlling the orientation of a camera based on the position information.
- A monitoring program that causes a computer to execute: a process of acquiring position information of a mobile terminal at predetermined intervals; and a process of controlling the orientation of a camera based on the position information when the position information is acquired.
- A monitoring system comprising: a mobile terminal; and a management device that communicates with the mobile terminal, wherein the mobile terminal has a first transmission unit that transmits position information of the mobile terminal to the management device, and the management device has: a control unit that, when the position information is acquired, controls the orientation of a camera based on the position information; a tracking unit that extracts tracking target candidates based on video acquired from the camera; and a display unit that displays the video and highlights, in a selectable manner, the tracking target candidates included in the video.
- The monitoring system according to claim 22, wherein the tracking unit sets a selected tracking target candidate as a tracking target and tracks the tracking target.
- The monitoring system according to claim 22 or 23, wherein the tracking unit identifies the owner of the mobile terminal based on the video acquired from the camera and tracks the identified owner as a provisional tracking target.
- The monitoring system according to any one of claims 22 to 24, wherein the tracking unit further extracts, based on the video acquired from the camera, a moving object different from the owner of the mobile terminal, and the display unit highlights the extracted moving object as a tracking target candidate.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017506064A JP6551511B2 (ja) | 2015-03-17 | 2016-03-01 | 監視装置、監視方法、監視プログラム、及び監視システム |
EP16764414.5A EP3273672B1 (en) | 2015-03-17 | 2016-03-01 | Monitoring device, monitoring method, monitoring program, and monitoring system |
US15/558,643 US20180077355A1 (en) | 2015-03-17 | 2016-03-01 | Monitoring device, monitoring method, monitoring program, and monitoring system |
US16/401,624 US10887526B2 (en) | 2015-03-17 | 2019-05-02 | Monitoring system, monitoring method, and monitoring program |
US16/401,647 US10728460B2 (en) | 2015-03-17 | 2019-05-02 | Monitoring system, monitoring method, and monitoring program |
US17/108,063 US11533436B2 (en) | 2015-03-17 | 2020-12-01 | Monitoring system, monitoring method, and monitoring program |
US17/988,032 US20230083918A1 (en) | 2015-03-17 | 2022-11-16 | Monitoring system, monitoring method, and monitoring program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-052795 | 2015-03-17 | ||
JP2015052795 | 2015-03-17 |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/558,643 A-371-Of-International US20180077355A1 (en) | 2015-03-17 | 2016-03-01 | Monitoring device, monitoring method, monitoring program, and monitoring system |
US16/401,624 Continuation US10887526B2 (en) | 2015-03-17 | 2019-05-02 | Monitoring system, monitoring method, and monitoring program |
US16/401,647 Continuation US10728460B2 (en) | 2015-03-17 | 2019-05-02 | Monitoring system, monitoring method, and monitoring program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016147581A1 true WO2016147581A1 (ja) | 2016-09-22 |
Family
ID=56919926
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/001104 WO2016147581A1 (ja) | 2015-03-17 | 2016-03-01 | 監視装置、監視方法、監視プログラム、及び監視システム |
Country Status (4)
Country | Link |
---|---|
US (5) | US20180077355A1 (ja) |
EP (1) | EP3273672B1 (ja) |
JP (6) | JP6551511B2 (ja) |
WO (1) | WO2016147581A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018116493A1 (ja) * | 2016-12-22 | 2018-06-28 | 日本電気株式会社 | 配置サーバ、警備システム、警備員配置方法及びプログラム |
EP3361723A1 (en) * | 2017-02-14 | 2018-08-15 | Beijing Xiaomi Mobile Software Co., Ltd. | Monitoring vehicle involved in a collision |
CN110245546A (zh) * | 2018-12-06 | 2019-09-17 | 浙江大华技术股份有限公司 | 一种目标跟踪***、方法及存储介质 |
JP2019203770A (ja) * | 2018-05-23 | 2019-11-28 | 株式会社リアルグローブ | 測位装置及び方法、並びに、コンピュータプログラム |
JP2020162067A (ja) * | 2019-03-28 | 2020-10-01 | 日本電気株式会社 | 通信装置、通信端末、通信方法、および通信プログラム |
WO2021131935A1 (ja) * | 2019-12-26 | 2021-07-01 | 株式会社コロプラ | プログラム、方法および情報処理装置 |
WO2023112286A1 (ja) * | 2021-12-16 | 2023-06-22 | 日本電気株式会社 | 監視システム、監視方法、情報処理装置、及びコンピュータ可読媒体 |
EP3598744B1 (en) * | 2017-03-16 | 2024-04-17 | Hangzhou Hikvision Digital Technology Co., Ltd. | Pan-tilt control method, device and system |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104796611A (zh) * | 2015-04-20 | 2015-07-22 | 零度智控(北京)智能科技有限公司 | 移动终端遥控无人机实现智能飞行拍摄的方法及*** |
US10867376B2 (en) * | 2015-08-28 | 2020-12-15 | Nec Corporation | Analysis apparatus, analysis method, and storage medium |
CN107666590B (zh) * | 2016-07-29 | 2020-01-17 | 华为终端有限公司 | 一种目标监控方法、摄像头、控制器和目标监控*** |
US11070729B2 (en) * | 2018-07-27 | 2021-07-20 | Canon Kabushiki Kaisha | Image processing apparatus capable of detecting moving objects, control method thereof, and image capture apparatus |
CN111291585B (zh) * | 2018-12-06 | 2023-12-08 | 杭州海康威视数字技术股份有限公司 | 一种基于gps的目标跟踪***、方法、装置及球机 |
CN111601233B (zh) * | 2019-02-21 | 2022-06-28 | 昆山纬绩资通有限公司 | 定位装置的监控方法与*** |
JP7302251B2 (ja) * | 2019-04-15 | 2023-07-04 | 東洋製罐株式会社 | 情報管理方法および識別情報付与装置 |
CN112215037B (zh) * | 2019-07-10 | 2024-04-09 | 浙江宇视科技有限公司 | 对象追踪方法及装置、电子设备及计算机可读存储介质 |
US11593951B2 (en) * | 2020-02-25 | 2023-02-28 | Qualcomm Incorporated | Multi-device object tracking and localization |
KR102657374B1 (ko) * | 2023-11-23 | 2024-04-16 | 주식회사 엠큐리티 | 인공지능 기반 경비원 관리 및 예측 시스템 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003264640A (ja) * | 2002-03-08 | 2003-09-19 | Matsushita Electric Ind Co Ltd | パーソナルモニタリングシステム |
JP2006014206A (ja) * | 2004-06-29 | 2006-01-12 | Kyocera Corp | 監視カメラシステムとその方法および監視カメラ制御装置、携帯無線端末 |
JP2007243571A (ja) * | 2006-03-08 | 2007-09-20 | Nec Corp | 携帯電話端末装置および防犯連絡方法 |
JP2009098774A (ja) * | 2007-10-15 | 2009-05-07 | Mitsubishi Electric Corp | 人物追跡システム及び人物追跡方法及び人物追跡プログラム |
JP2011018094A (ja) * | 2009-07-07 | 2011-01-27 | Nec Corp | 巡回警備支援システム、方法及びプログラム |
JP2012008742A (ja) * | 2010-06-23 | 2012-01-12 | Toshiba Corp | 行動監視システム |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6697103B1 (en) * | 1998-03-19 | 2004-02-24 | Dennis Sunga Fernandez | Integrated network for monitoring remote objects |
JP2002010240A (ja) * | 2000-06-21 | 2002-01-11 | Matsushita Electric Ind Co Ltd | 監視システム |
JP2002074565A (ja) | 2000-08-31 | 2002-03-15 | Ntt Docomo Inc | 移動体端末装置、緊急通報センタ装置、救助用移動体端末装置および緊急通報システム |
US7151454B2 (en) * | 2003-01-02 | 2006-12-19 | Covi Technologies | Systems and methods for location of objects |
JP2004328333A (ja) | 2003-04-24 | 2004-11-18 | Hitachi Ltd | 携帯通信端末及び異常事態報知システム |
US20100220836A1 (en) * | 2005-09-08 | 2010-09-02 | Feke Gilbert D | Apparatus and method for multi-modal imaging |
JP5101160B2 (ja) * | 2006-05-10 | 2012-12-19 | 株式会社九電工 | 携帯端末装置 |
CN201112942Y (zh) * | 2007-10-12 | 2008-09-10 | 富士康(昆山)电脑接插件有限公司 | 电连接器 |
DE202008007520U1 (de) * | 2008-06-05 | 2008-08-21 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Vorrichtung zur Verfolgung eines beweglichen Objekts |
JP2010277444A (ja) * | 2009-05-29 | 2010-12-09 | Fujitsu Ltd | 監視システム、監視装置、及び監視方法 |
IL201129A (en) * | 2009-09-23 | 2014-02-27 | Verint Systems Ltd | A system and method for automatically switching cameras according to location measurements |
JP5674307B2 (ja) | 2009-12-17 | 2015-02-25 | グローリー株式会社 | 対象者検出システムおよび対象者検出方法 |
JP5115572B2 (ja) * | 2010-03-01 | 2013-01-09 | 日本電気株式会社 | カメラ管理サーバ、防犯サービス管理方法および防犯サービス管理プログラム |
JP5715775B2 (ja) * | 2010-06-30 | 2015-05-13 | 株式会社日立国際電気 | 画像監視システムおよび画像監視方法 |
JP2012156752A (ja) | 2011-01-26 | 2012-08-16 | Canon Inc | 監視領域制御方法 |
KR20150031985A (ko) * | 2013-09-17 | 2015-03-25 | 한국전자통신연구원 | 모바일 기기와 협력하여 위험 상황을 추적하기 위한 시스템 및 그 방법 |
US9451062B2 (en) * | 2013-09-30 | 2016-09-20 | Verizon Patent And Licensing Inc. | Mobile device edge view display insert |
US9454889B2 (en) * | 2014-07-28 | 2016-09-27 | Dan Kerning | Security and public safety application for a mobile device |
JP6689566B2 (ja) | 2014-09-25 | 2020-04-28 | 綜合警備保障株式会社 | 警備システム及び警備方法 |
US9582975B2 (en) | 2015-01-27 | 2017-02-28 | Honeywell International Inc. | Alarm routing in integrated security system based on security guards real-time location information in the premises for faster alarm response |
-
2016
- 2016-03-01 US US15/558,643 patent/US20180077355A1/en not_active Abandoned
- 2016-03-01 JP JP2017506064A patent/JP6551511B2/ja active Active
- 2016-03-01 EP EP16764414.5A patent/EP3273672B1/en active Active
- 2016-03-01 WO PCT/JP2016/001104 patent/WO2016147581A1/ja active Application Filing
-
2019
- 2019-05-02 US US16/401,624 patent/US10887526B2/en active Active
- 2019-05-02 US US16/401,647 patent/US10728460B2/en active Active
- 2019-07-02 JP JP2019123639A patent/JP6696615B2/ja active Active
-
2020
- 2020-04-15 JP JP2020072855A patent/JP7131843B2/ja active Active
- 2020-12-01 US US17/108,063 patent/US11533436B2/en active Active
-
2021
- 2021-12-20 JP JP2021205683A patent/JP7184148B2/ja active Active
-
2022
- 2022-11-16 US US17/988,032 patent/US20230083918A1/en active Pending
- 2022-11-17 JP JP2022183819A patent/JP2023014141A/ja active Pending
-
2024
- 2024-01-23 JP JP2024007812A patent/JP2024032829A/ja active Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003264640A (ja) * | 2002-03-08 | 2003-09-19 | Matsushita Electric Ind Co Ltd | パーソナルモニタリングシステム |
JP2006014206A (ja) * | 2004-06-29 | 2006-01-12 | Kyocera Corp | 監視カメラシステムとその方法および監視カメラ制御装置、携帯無線端末 |
JP2007243571A (ja) * | 2006-03-08 | 2007-09-20 | Nec Corp | 携帯電話端末装置および防犯連絡方法 |
JP2009098774A (ja) * | 2007-10-15 | 2009-05-07 | Mitsubishi Electric Corp | 人物追跡システム及び人物追跡方法及び人物追跡プログラム |
JP2011018094A (ja) * | 2009-07-07 | 2011-01-27 | Nec Corp | 巡回警備支援システム、方法及びプログラム |
JP2012008742A (ja) * | 2010-06-23 | 2012-01-12 | Toshiba Corp | 行動監視システム |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018116493A1 (ja) * | 2016-12-22 | 2018-06-28 | 日本電気株式会社 | 配置サーバ、警備システム、警備員配置方法及びプログラム |
JPWO2018116493A1 (ja) * | 2016-12-22 | 2019-11-07 | 日本電気株式会社 | 配置サーバ、警備システム、警備員配置方法及びプログラム |
EP3361723A1 (en) * | 2017-02-14 | 2018-08-15 | Beijing Xiaomi Mobile Software Co., Ltd. | Monitoring vehicle involved in a collision |
US10846954B2 (en) | 2017-02-14 | 2020-11-24 | Beijing Xiaomi Mobile Software Co., Ltd. | Method for monitoring vehicle and monitoring apparatus |
EP3598744B1 (en) * | 2017-03-16 | 2024-04-17 | Hangzhou Hikvision Digital Technology Co., Ltd. | Pan-tilt control method, device and system |
JP2019203770A (ja) * | 2018-05-23 | 2019-11-28 | 株式会社リアルグローブ | 測位装置及び方法、並びに、コンピュータプログラム |
CN110245546A (zh) * | 2018-12-06 | 2019-09-17 | 浙江大华技术股份有限公司 | 一种目标跟踪***、方法及存储介质 |
JP2020162067A (ja) * | 2019-03-28 | 2020-10-01 | 日本電気株式会社 | 通信装置、通信端末、通信方法、および通信プログラム |
JP7298237B2 (ja) | 2019-03-28 | 2023-06-27 | 日本電気株式会社 | 通信方法、及び制御装置 |
WO2021131935A1 (ja) * | 2019-12-26 | 2021-07-01 | 株式会社コロプラ | プログラム、方法および情報処理装置 |
JP7514076B2 (ja) | 2019-12-26 | 2024-07-10 | 株式会社コロプラ | プログラム、方法および情報処理装置 |
WO2023112286A1 (ja) * | 2021-12-16 | 2023-06-22 | 日本電気株式会社 | 監視システム、監視方法、情報処理装置、及びコンピュータ可読媒体 |
Also Published As
Publication number | Publication date |
---|---|
JP7184148B2 (ja) | 2022-12-06 |
JP2020123973A (ja) | 2020-08-13 |
JP6696615B2 (ja) | 2020-05-20 |
JP2024032829A (ja) | 2024-03-12 |
JP2019201413A (ja) | 2019-11-21 |
US20180077355A1 (en) | 2018-03-15 |
US10887526B2 (en) | 2021-01-05 |
JP2023014141A (ja) | 2023-01-26 |
EP3273672A4 (en) | 2018-12-05 |
US20230083918A1 (en) | 2023-03-16 |
US20210105409A1 (en) | 2021-04-08 |
JP2022040141A (ja) | 2022-03-10 |
JP6551511B2 (ja) | 2019-07-31 |
JP7131843B2 (ja) | 2022-09-06 |
EP3273672A1 (en) | 2018-01-24 |
US10728460B2 (en) | 2020-07-28 |
US20190260942A1 (en) | 2019-08-22 |
EP3273672B1 (en) | 2020-12-30 |
US20190260941A1 (en) | 2019-08-22 |
JPWO2016147581A1 (ja) | 2017-12-28 |
US11533436B2 (en) | 2022-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7184148B2 (ja) | 監視システム、管理装置および監視方法 | |
US20230412925A1 (en) | Video surveillance system and video surveillance method | |
US9509900B2 (en) | Camera control method, and camera control device for same | |
JP6128468B2 (ja) | 人物追尾システム及び人物追尾方法 | |
JP2006523043A (ja) | 監視を行なう方法及びシステム | |
US9977429B2 (en) | Methods and systems for positioning a camera in an incident area | |
KR20140052357A (ko) | 다중 카메라를 이용하여 객체의 이동을 추적하는 객체 추적 시스템 및 객체 추적 방법 | |
WO2021068553A1 (zh) | 一种监控方法、装置和设备 | |
KR101780929B1 (ko) | 움직이는 물체를 추적하는 영상감시 시스템 | |
KR20190050113A (ko) | 이동 물체 자동 추적 영상 감시 시스템 | |
KR20150019230A (ko) | 복수의 카메라를 이용한 객체 추적 방법 및 장치 | |
JPWO2013175836A1 (ja) | 監視カメラ管理装置、監視カメラ管理方法およびプログラム | |
US11431255B2 (en) | Analysis system, analysis method, and program storage medium | |
JP6483326B2 (ja) | 監視システムおよび端末装置 | |
WO2005120070A2 (en) | Method and system for performing surveillance | |
KR102646190B1 (ko) | 복수개의 차량을 관제하는 방법, 서버 및 시스템 | |
KR20180099074A (ko) | 드론 카메라를 이용한 융합형 감시 시스템 | |
KR101576772B1 (ko) | Cctv 카메라 제어방법 | |
KR20240051767A (ko) | 원격 감시 제어가능한 cctv카메라 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16764414 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2017506064 Country of ref document: JP Kind code of ref document: A |
|
REEP | Request for entry into the european phase |
Ref document number: 2016764414 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15558643 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |