WO2008004578A1 - Monitoring system, device, and method - Google Patents

Monitoring system, device, and method

Info

Publication number
WO2008004578A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
identification information
monitoring system
area
unit
Prior art date
Application number
PCT/JP2007/063368
Other languages
English (en)
Japanese (ja)
Inventor
Hideaki Oi
Tsugihiro Kurihara
Ryousuke Iida
Kazuhiko Miyamoto
Original Assignee
Panasonic Corporation
Priority date
Filing date
Publication date
Application filed by Panasonic Corporation filed Critical Panasonic Corporation
Publication of WO2008004578A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions

Definitions

  • Monitoring system, monitoring apparatus, and monitoring method
  • the present invention relates to a monitoring system, a monitoring apparatus, and a monitoring method for monitoring a person with a captured image.
  • surveillance cameras are installed along the streets of the school route, and each child carries a wireless IC tag that can identify the individual.
  • the camera acquires images and stores them on a Web server.
  • the guardian can monitor the child's safety by accessing the Web server from a personal computer or the like at home and acquiring images of the child.
  • the position information is acquired from the wireless IC tag held by the child, and the camera's pan / tilt / zoom operation is performed based on the position information to capture and maintain an image of only the target child.
  • a monitoring system has been proposed in which guardians view the images (see Patent Document 1). In this conventional monitoring system, it is possible to prevent other children from appearing in the image, and privacy protection is ensured.
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2002-269209
  • Disclosure of the invention
  • the present invention has been made in view of the above circumstances, and an object thereof is to provide a monitoring system, a monitoring apparatus, and a monitoring method capable of protecting privacy without obtaining accurate position information of the target person and without matching the shooting range of the camera only to the target person.
  • another object of the present invention is to provide a monitoring system, a monitoring apparatus, and a monitoring method capable of obtaining a monitoring image without missing an opportunity even when a large number of target persons simultaneously enter the shooting range of the camera.
  • the monitoring system of the present invention is a monitoring system for monitoring a person with a photographed image, and includes a signal receiving unit that receives a signal from a transmission source possessed by the target person and acquires identification information included in the received signal, a photographing unit that photographs an image, a region detection unit that detects specific regions in the photographed image, a region identification unit that identifies the target specific region corresponding to the identification information from the detected specific regions, a masking unit that generates a mask image by masking the other specific regions excluding the identified target specific region, an image data holding unit that holds the mask image in association with the identification information, and an image transmission unit that transmits the held mask image.
  • the present invention is the above monitoring system, further comprising an authentication information transmitting unit that transmits authentication information associated with the identification information, an image receiving unit that receives the transmitted mask image, and a display unit that displays the received mask image, wherein the image transmission unit specifies the identification information corresponding to the authentication information and transmits the mask image related to that identification information to the transmission source of the authentication information.
  • the monitoring person can view the mask image generated by the monitoring system on a browsing terminal or the like, and only the target person is identifiable in the mask image.
  • the present invention is the monitoring system described above, wherein the area detection unit detects face areas in the image as the specific areas, and the area identification unit identifies the face area of the target person corresponding to the identification information as the target specific area.
  • the present invention is the monitoring system described above, including a database in which facial feature amounts are registered, wherein the region identification unit acquires the facial feature amount of the target person registered in the database using the identification information, compares the acquired facial feature amount with the facial feature amounts of the detected face areas, and identifies the face area of the target person corresponding to the identification information from the comparison result.
  • the face area can be automatically detected by registering the facial feature amount of the target person in the database in advance.
  • the present invention is the monitoring system described above, wherein the masking unit masks a face area other than the face area of the target person corresponding to the identification information.
  • the present invention is the monitoring system described above, wherein the masking unit masks all areas other than the face area of the target person corresponding to the identification information. This can significantly enhance privacy protection for persons other than the target person. In addition, privacy can be maintained even when face areas other than the target person cannot be detected accurately.
  • the present invention is the monitoring system described above, wherein the masking unit does not generate the mask image when the region identification unit cannot identify a target specific region corresponding to the identification information.
  • processing for generating unnecessary images can be suppressed, and processing efficiency can be improved.
  • privacy is infringed by distributing an image that has been incorrectly masked due to erroneous detection and identification. Can be prevented.
  • the present invention is the monitoring system described above, wherein the area detection unit detects an area of a moving object in the image as the specific area.
  • the processing load can be reduced compared to the detection of the face area.
  • the present invention is the monitoring system described above, wherein the region identification unit identifies the region of the moving object corresponding to the identification information as the target specific region, using feature data associated with the identification information.
  • the area of the moving object corresponding to the target person can be easily identified.
  • the present invention is the monitoring system described above, wherein the area identification unit uses, as the feature data associated with the identification information, at least one of the height of a moving object, its color, the presence or absence of an object with a specific shape, and the pattern of an object with a specific shape.
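As a rough illustration of how such feature data might be checked against a detected moving object, the following sketch tests a candidate object against a feature-data record. The attribute names (`height_range`, `color`, `has_specific_shape`, `pattern`) and the dictionary layout are our own assumptions, not taken from the patent:

```python
def matches_feature_data(obj, feature_data):
    """Return True if a detected moving object is consistent with the
    feature data associated with the identification information.
    Checks height range, dominant color, presence of a specific-shape
    object, and pattern; all attribute names here are hypothetical."""
    if "height_range" in feature_data:
        lo, hi = feature_data["height_range"]
        if not (lo <= obj.get("height", -1) <= hi):
            return False
    if "color" in feature_data and obj.get("dominant_color") != feature_data["color"]:
        return False
    if feature_data.get("has_specific_shape") and not obj.get("has_specific_shape", False):
        return False
    if "pattern" in feature_data and obj.get("pattern") != feature_data["pattern"]:
        return False
    return True
```

Any subset of the attributes can be registered; an absent attribute is simply not checked, mirroring the claim's "at least one of" wording.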
  • the present invention is the monitoring system described above, wherein the masking unit masks a region of a moving object other than a region of a target person corresponding to the identification information.
  • the privacy protection regarding the person other than the target person can be enhanced by masking the entire area of the moving object other than the area of the target person.
  • the present invention is the monitoring system described above, wherein a masking method can be selected, and the masking unit masks the regions other than the target specific region according to the selected masking method.
  • the monitoring method of the present invention is a monitoring method for monitoring a person with a photographed image, and includes a step of receiving a signal from a transmission source possessed by the target person and acquiring identification information included in the received signal, an image data holding step of holding the mask image in association with the identification information, and an image transmission step of transmitting the held mask image.
  • the monitoring device of the present invention is a monitoring device that monitors a person with a captured image, and includes a data receiving unit that receives image data of a captured image and identification information of the person included in the image, A region detection unit for detecting a specific region in the captured image, a region identification unit for identifying a target specific region corresponding to the identification information from the detected specific region, and the identified target specification A masking unit that generates a mask image by masking other specific areas excluding an area, an image data holding unit that holds the mask image in association with the identification information, and transmits the held mask image And an image transmission unit.
  • according to the present invention, a monitoring system, a monitoring apparatus, and a monitoring method capable of the above privacy protection can be provided. Further, it is possible to provide a monitoring system, a monitoring apparatus, and a monitoring method capable of obtaining a monitoring image without missing an opportunity even when a large number of target persons enter the shooting range of the camera all at once.
  • FIG. 1 is a diagram showing a configuration of a monitoring system according to a first embodiment of the present invention.
  • FIG. 2 is a flowchart showing a monitoring processing procedure in the monitoring apparatus of the first embodiment.
  • FIG. 3 shows a photographed image and an image subjected to image processing in the first embodiment.
  • FIG. 4 is a flowchart showing a processing procedure for storing image data and related information in the data center of the first embodiment.
  • FIG. 5 is a flowchart showing a processing procedure when browsing a mask image in the browsing terminal and data center of the first embodiment.
  • FIG. 6 is a diagram showing a list of selection screens displayed on the portable terminal according to the first embodiment.
  • FIG. 7 is a diagram showing a configuration of a monitoring system according to a second embodiment of the present invention.
  • FIG. 8 is a flowchart showing an operation processing procedure in the monitoring node according to the second embodiment.
  • FIG. 9 is a diagram showing an example of a captured image in which a moving object is detected in the second embodiment.
  • FIG. 10 is a flowchart showing a first example of a processing procedure for storing image data and related information in the monitoring center according to the second embodiment.
  • FIG. 11 is a view showing an image on which masking processing is performed in the second embodiment.
  • FIG. 12 is a flowchart showing a second example of a processing procedure for storing image data and related information in the monitoring center of the second embodiment.
  • FIG. 13 is a flowchart showing a third example of a processing procedure for storing image data and related information in the monitoring center according to the second embodiment.
  • FIG. 14 is a flowchart showing a fourth example of a processing procedure for storing image data and related information in the monitoring center according to the second embodiment.
  • FIG. 15 is a flowchart showing a fifth example of a processing procedure for storing image data and related information in the monitoring center according to the second embodiment.
  • As the monitoring system, a configuration example applied to a system with which a guardian watches over a child attending school is shown.
  • This monitoring system is configured as a remote monitoring system in which images taken by a camera installed along the school route are transmitted via a network and displayed on a guardian's browsing terminal.
  • FIG. 1 is a diagram showing a configuration of a monitoring system according to the first embodiment of the present invention.
  • the monitoring system has a configuration in which a photographing apparatus 101, a data center 105, and a browsing terminal 111 are connected to each other via a network 104 such as the Internet or a public line so that data communication can be performed.
  • In FIG. 1, only one monitoring device and two browsing terminals 111 are depicted, but in practice any number of devices can be connected.
  • the imaging device 101 includes a camera 102 and an IC tag signal receiving device 103.
  • the camera 102 corresponds to an example of a photographing unit, and is a fixed camera installed in the vicinity of the IC tag signal receiving device 103 that photographs a preset shooting range. In this embodiment, a part of the child's school route is set as the shooting range. Further, the camera according to the present embodiment need not have a pan/tilt/zoom function.
  • the IC tag signal receiving device 103 corresponds to an example of a signal receiving unit, and communicates with a wireless IC tag 125 (corresponding to an example of a transmission source) built into a name tag or the like carried by a child who is a target person.
  • When the IC tag signal receiving device 103 receives a signal (IC tag signal) output from the wireless IC tag 125 as the child approaches, it outputs a shooting start signal to the camera 102 and also acquires the identification information contained in the IC tag signal.
  • wireless IC tags include active tags, which transmit a signal by themselves, and passive tags, which transmit a signal in response to a signal from the receiving device; the type of wireless IC tag is not particularly limited.
  • the image data based on the image captured by the image capturing device 101 and the personal identification information acquired by the IC tag signal receiving device 103 are transmitted to the data center 105 via the network 104.
  • the data center 105 includes an image data management device 106, a face area detection device 107, a face identification device 108, a masking device 109, and a face feature quantity database 110 (DB).
  • In the face feature quantity DB 110, the face feature quantities of the persons to be monitored by this monitoring system (in this embodiment, children) are registered and accumulated in advance.
  • the face area detecting device 107 corresponds to an example of an area detecting unit, and detects a face area as a specific area in the received image.
  • As a method for detecting a face area, for example, a template-matching face detection method can be used in which areas judged to be similar to eyes are extracted from the captured image and it is determined whether or not each such area is a face.
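The patent does not give the detector's details, so as a loose stand-in the sketch below shows the general shape of template-based detection: every window of the image is scored by normalized cross-correlation against a face (or eye) template and windows above a threshold are reported as face regions. The template, threshold, and box format are assumptions for illustration:

```python
import numpy as np

def detect_face_regions(image, template, threshold=0.8):
    """Slide a template over a grayscale image and return (x, y, w, h)
    boxes whose normalized cross-correlation with the template exceeds
    the threshold. A toy stand-in for the template-matching face
    detector described in the text, not the patent's algorithm."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    boxes = []
    H, W = image.shape
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            if denom == 0:
                continue  # flat patch: no correlation defined
            score = float((p * t).sum() / denom)
            if score >= threshold:
                boxes.append((x, y, tw, th))
    return boxes
```

A production system would use a multi-scale detector (e.g. a cascade classifier) rather than a single fixed-size template.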
  • the face identification device 108 corresponds to an example of an area identification unit; it refers to the face feature DB 110 based on the personal identification information and acquires the face feature amount of the monitored child who carries the wireless IC tag. Then, from all the face areas detected by the face area detecting device 107, the face area whose feature amount matches the child's face feature amount is identified as the target specific area.
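The matching step above can be sketched as a nearest-neighbor search over feature vectors: the registered feature amount is compared with the feature amount of every detected face, and the closest one within a tolerance is taken as the target. Euclidean distance and the 0.6 cutoff are illustrative choices, not values from the patent:

```python
import numpy as np

def identify_target_face(target_feature, detected_features, max_distance=0.6):
    """Return the index of the detected face area whose feature vector is
    closest to the target child's registered feature vector (looked up
    in the face feature DB via the personal identification information),
    or None if no face is within max_distance."""
    target = np.asarray(target_feature, dtype=float)
    best_idx, best_dist = None, float(max_distance)
    for i, feat in enumerate(detected_features):
        dist = float(np.linalg.norm(np.asarray(feat, dtype=float) - target))
        if dist < best_dist:
            best_idx, best_dist = i, dist
    return best_idx
```

Returning None corresponds to the case, mentioned later, in which no target specific region can be identified and no mask image is generated.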
  • the masking device 109 corresponds to an example of a masking unit; among the face regions detected by the face region detection device 107, it masks the face regions other than the child's face region (target specific region) identified by the face identification device 108, generating a mask image in which persons not subject to monitoring cannot be identified. The masking device 109 then transmits the generated image to the image data management device 106.
  • the image data management device 106 has the functions of an image data holding unit and an image transmission unit. It holds the image data masked by the masking device 109 in association with the personal identification information and publishes it so that it can be browsed from the outside. Accordingly, the image data management device 106 stores a large amount of image data (including mask images) taken at a plurality of points.
  • each function of the image data management device 106, the face area detection device 107, the face identification device 108, and the masking device 109 is a program stored in a recording medium in a computer provided in the data center 105. It is realized by executing.
  • the browsing terminal 111 is for browsing the image data stored in the image data management device 106, and is constituted by a general-purpose personal computer (PC) installed at the home of the child's guardian, a mobile phone, a portable information terminal, or the like.
  • the browsing terminal 111 has a communication unit (not shown) that transmits and receives data via the network 104, and realizes functions of an authentication information transmission unit and an image reception unit.
  • the browsing terminal 111 includes a display unit that displays the received mask images.
  • FIG. 2 is a flowchart showing a monitoring processing procedure in the photographing apparatus 101. This process is repeatedly performed by the processor executing a program stored in a storage medium provided in the photographing apparatus 101.
  • the IC tag signal receiving device 103 determines whether or not a wireless IC tag signal has been received.
  • When the wireless IC tag signal is received (step S1), the personal identification information included in the wireless IC tag signal is acquired (step S2). In synchronization with the reception of the wireless IC tag signal, a shooting start signal is output from the IC tag signal receiving device 103 and an image is shot by the camera 102 (step S3). The image data of the captured image and the acquired personal identification information are transmitted to the data center 105 via the network 104 (step S4). Thereafter, this process is terminated. On the other hand, if the wireless IC tag signal is not received in step S1, this process is terminated as well.
  • In this way, the IC tag signal receiving device 103 receives a wireless IC tag signal from the wireless IC tag and sends a shooting start signal to the camera 102.
  • When the camera 102 receives a shooting start signal from the IC tag signal receiving device 103, it photographs the shooting range set in the vicinity of the IC tag signal receiving device 103 and generates image data.
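The flow of FIG. 2 (steps S1 to S4) can be sketched as one cycle over three collaborators. The `receive`/`capture`/`send` interfaces are our own minimal abstraction of the IC tag signal receiving device 103, the camera 102, and the network link to the data center 105:

```python
def monitoring_cycle(tag_receiver, camera, data_center):
    """One pass through the flow of FIG. 2. Returns the acquired personal
    identification information, or None when no tag signal arrived."""
    signal = tag_receiver.receive()           # S1: wireless IC tag signal received?
    if signal is None:
        return None                           # no signal: end this cycle
    person_id = signal["id"]                  # S2: acquire personal identification info
    image = camera.capture()                  # S3: shoot the preset range
    data_center.send(image=image, person_id=person_id)  # S4: transmit data
    return person_id
```

In the actual apparatus this cycle is repeated by a processor executing a stored program, as the text notes.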
  • FIG. 3 is a diagram showing a photographed image and an image subjected to image processing.
  • Figure 3 (A) shows the photographed image.
  • In the photographed image 201, a large number of people appear in addition to the child 201a who is the photographing target.
  • When the image capturing apparatus 101 acquires an image captured by the camera 102, it transmits the image data and related information to the data center 105 via the network 104.
  • the related information includes the personal identification information obtained from the wireless IC tag signal.
  • the personal identification information and the image data are associated with each other and held in the image data management device 106 in the data center 105.
  • At this stage the image is not yet published via the network, so it cannot be viewed from the browsing terminal 111. In other words, guardians cannot view images that may show an unspecified number of people.
  • FIG. 4 is a flowchart showing a processing procedure for storing image data and related information in the data center 105. This process is performed by the processor executing a program stored in a storage medium provided in each device constituting the data center 105.
  • First, it is determined whether or not image data and personal identification information are received from the photographing apparatus 101 (step S11). If image data and personal identification information are received, the image data is associated with the personal identification information and stored so that it cannot be viewed (step S12). One way to store data so that it cannot be browsed is to save it in a storage area that is not disclosed. The stored image data is then read and all face areas in the image are detected (step S13).
  • FIG. 3B shows an image 202 in which a face area is detected with respect to the captured image 201 in FIG. 3A.
  • image 202 all detected face regions 202a to 202d are drawn so as to be surrounded by a square frame with a broken line.
  • Next, the registered face feature amount of the child possessing the wireless IC tag is acquired (step S14).
  • Face recognition is performed using the acquired face feature amount, and the face region among all face regions in the image 202 that matches the face feature amount is identified (step S15).
  • masking processing is performed on other face areas excluding the matched face area so that the face area is not displayed in the image (step S16).
  • FIG. 3C shows an image 203 obtained by performing masking on the image 202 shown in FIG. 3B.
  • the masked face areas 203b, 203c, and 203d are drawn with a solid color in a square frame.
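The masking of step S16 amounts to painting over every detected face box except the identified target. The (x, y, w, h) box format and the solid fill value are assumptions for this sketch:

```python
import numpy as np

def mask_other_faces(image, face_boxes, target_box, fill=0):
    """Generate a mask image: every detected face box except the
    identified target box is painted with a solid fill so that
    non-target persons cannot be recognized. Returns a new array;
    the original captured image is left untouched."""
    masked = image.copy()
    for box in face_boxes:
        if box == target_box:
            continue  # leave the target child's face visible
        x, y, w, h = box
        masked[y:y + h, x:x + w] = fill
    return masked
```

Other masking methods (blurring, pixelation) could be substituted here, matching the selectable masking method mentioned in the claims.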
  • the image data subjected to the masking process is associated with the personal identification information and stored in the image data management device 106 so as to be viewable (step S17).
  • One way to store the data so that it is viewable is, for example, to save it in a public storage area of the image data management apparatus 106.
  • This process is then complete. If image data and personal identification information are not received, the process is terminated as it is.
  • Hereinafter, the masked image is referred to as a mask image.
  • To browse images, the guardian accesses the image data management device 106 in the data center 105 from a browsing terminal 111 such as a personal computer with browsing software (a browser application) installed.
  • FIG. 5 is a flowchart showing a processing procedure when browsing a mask image in the browsing terminal 111 and the data center 105.
  • FIG. 5(A) is a flowchart showing the browsing processing procedure on the browsing terminal 111. This processing is performed by the processor executing a program stored in a storage medium provided in the browsing terminal 111.
  • FIG. 5B is a flowchart showing a browsing process procedure in the image data management apparatus 106 in the data center 105. This processing is performed by the processor executing a program stored in a storage medium provided in the image data management apparatus 106.
  • the browsing terminal 111 side waits until a guardian inputs a user ID and a password (step S21).
  • When they are input, the authentication information is transmitted to the image data management device 106 (step S22), and the terminal waits until browsing is permitted (step S23).
  • a list of selection screens representing time-series image data held in the image data management device 106 is transmitted from the image data management device 106 to the browsing terminal 111.
  • FIG. 6 shows the list of selection screens displayed on the browsing terminal 111.
  • the selection screen list includes a selection button 281 for requesting transmission of an image of each point where the target child was photographed.
  • the guardian can request a selection screen by selecting a desired selection button 281.
  • A request for the selection screen designated by the guardian is transmitted to the data center 105 (step S24). In this selection screen request, the guardian can specify the point whose camera image is to be displayed.
  • The terminal then waits until image data corresponding to the request is received from the image data management device 106 (step S25).
  • When the image data is received, the image is displayed on the screen (step S26). This allows the guardian to view mask images showing only the target child.
  • Next, it is determined whether or not logout is input by the guardian (step S27). If logout is not input, the process returns to step S24. If logout is input, this process ends. If browsing is not permitted in step S23, the process is also terminated.
  • On the image data management device 106 side, it is determined whether or not a user ID and a password are received as authentication information from the browsing terminal 111 (step S31). If a user ID and password are received, authentication is performed (step S32), and it is determined whether or not the authentication is confirmed (step S33). If the authentication is confirmed, browsing permission is transmitted (step S34).
  • As the authentication information, in addition to a user ID and password, individual information of the browsing terminal, address information, or information on a wireless IC tag, an IC card, or the like can be used.
  • Next, the process waits until a request for a selection screen is received from the browsing terminal 111 (step S35).
  • When the request for the selection screen is received, the image data of the screen corresponding to the request is transmitted to the browsing terminal 111 (step S36). Then, it is determined whether or not the user has logged out (step S37). When logout is performed, this process is terminated. On the other hand, if logout is not performed, the process returns to step S35. If the authentication cannot be confirmed in step S33, or if a user ID and password are not received in step S31, the process is also terminated.
  • the image data management apparatus 106 requests input of a user ID and a password.
  • the guardian inputs this information.
  • the image data management device 106 manages the session information (information from login to logout) for the guardian, always holds the personal identification information of the child corresponding to the guardian, and allows only the images associated with that identification information to be viewed.
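The session binding described above can be sketched as two small functions: one that checks the guardian's credentials and binds the session to the child's personal identification information, and one that filters the stored images to that child. The record layouts (`auth_db`, `sessions`, `image_records`) are hypothetical:

```python
def authenticate(auth_db, sessions, user_id, password):
    """Steps S31-S34 plus session binding: verify the guardian's
    credentials and, on success, remember which child's personal
    identification information this session may access."""
    record = auth_db.get(user_id)
    if record is None or record["password"] != password:
        return False                         # authentication not confirmed
    sessions[user_id] = record["child_id"]   # hold the child's ID for the session
    return True                              # browsing permission granted

def viewable_images(sessions, user_id, image_records):
    """Return only the mask images whose associated identification
    information matches the child bound to this guardian's session."""
    child_id = sessions.get(user_id)
    return [rec for rec in image_records if rec["child_id"] == child_id]
```

A real deployment would store salted password hashes and compare them in constant time rather than keeping plaintext passwords as this sketch does.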
  • the guardian can select the image he or she wants to view from among the browsable images and check it on the screen of the browsing terminal 111. In this way, the guardian can only view mask images in which the faces of children other than the registered child are masked.
  • As described above, with the monitoring system of the first embodiment, the children's mutual privacy can be protected even when a large number of children are registered in the system.
  • Besides registered persons, ordinary members of the public may appear in the camera image. Even in places where an unspecified number of people come and go, such as streets, introducing this system therefore reduces the psychological resistance of people nearby and makes deployment easier.
  • Moreover, the camera does not require pan/tilt/zoom functions, which is advantageous in terms of cost.
  • By identifying faces, the target person can be accurately specified, and the face portion, where privacy is most easily infringed, can be hidden.
  • Instead of masking only the face areas of persons other than the monitoring target, all areas of the captured image other than the face area corresponding to the personal identification information may be masked to further enhance privacy.
  • FIG. 7 is a diagram showing a configuration of a monitoring system according to the second embodiment of the present invention.
  • In the first embodiment, face recognition is performed on the captured image and face areas are masked. In the second embodiment, an example is shown in which moving objects are detected and their areas are masked.
  • the monitoring system of the second embodiment has a configuration in which a plurality of monitoring nodes 310 and a monitoring center 330 are connected to each other via a VPN (Virtual Private Network) 340 so that data communication is possible.
  • the monitoring center 330 is connected to the user terminal 370 via the network 360.
  • the monitoring node 310 includes a camera 311, an electronic tag reader 312, a motion detection unit 313, and a controller 314.
  • the camera 311 is a fixed camera installed at a monitoring target location such as a street, and shoots a preset area (shooting range) to generate image data.
  • the electronic tag reader 312 reads the wireless IC tag signal from the wireless IC tag possessed by a child. This wireless IC tag signal includes personal identification information, as in the first embodiment.
  • the motion detection unit 313 corresponds to an example of a region detection unit. It obtains images from the camera 311 and calculates the pixel-wise difference between two temporally different images; by the inter-frame difference method, which estimates that a moving object exists wherever the difference exceeds a predetermined value, it detects the moving object areas in the image and creates metadata.
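The inter-frame difference method described here can be sketched directly: threshold the pixel-wise absolute difference of two frames and summarize the motion pixels as metadata. For brevity this sketch reports a single overall bounding box; a real system would label each connected region separately (that simplification, the threshold value, and the metadata layout are ours, not the patent's):

```python
import numpy as np

def detect_motion(prev_frame, cur_frame, diff_threshold=30):
    """Inter-frame difference: pixels whose absolute difference between
    the two frames exceeds diff_threshold are treated as belonging to
    moving objects; returns metadata with a count and (x, y, w, h)
    bounding boxes of the motion area."""
    diff = np.abs(cur_frame.astype(np.int32) - prev_frame.astype(np.int32))
    motion = diff > diff_threshold
    if not motion.any():
        return {"count": 0, "boxes": []}
    ys, xs = np.nonzero(motion)
    # One overall bounding box as minimal metadata.
    box = (int(xs.min()), int(ys.min()),
           int(xs.max() - xs.min() + 1), int(ys.max() - ys.min() + 1))
    return {"count": 1, "boxes": [box]}
```

As the text notes later, this approach is cheaper than face detection, at the cost of coarser localization.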
  • When a processing request signal is received from the controller 314, the motion detection unit 313 provides the image associated with the metadata.
  • the controller 314 acquires the image and metadata from the motion detection unit 313, and transmits the personal identification information, the image, and the metadata to the monitoring center 330 via the VPN 340.
  • the metadata includes the number of moving objects and their coordinates.
  • the VPN 340 is a virtual dedicated line that connects the monitoring node 310 and the monitoring center 330.
  • Instead of a physical dedicated line, a general-purpose line such as the VPN 340 may be used.
  • the monitoring center 330 includes a router 331, a hub (HUB) 332, a Web server 333, a mail server 334, a system management device 335, an image processing device 336, and a database (DB) 337.
  • Upon receiving an image, metadata, and personal identification information from the monitoring node 310, the system management device 335 stores the child's passage information in the database 337 in association with the personal identification information, and has a function of sending the passage information to the user terminal 370 through the mail server 334.
  • the image processing device 336 includes a moving object identification function (corresponding to an example of an area identification unit) that identifies the moving object corresponding to the personal identification information, and a masking function (corresponding to an example of the masking unit) that masks the areas of moving objects other than the moving object corresponding to the personal identification information.
  • the Web server 333 permits access from the user terminal 370 by means of a login ID and password authentication function, and has a function of extracting only the information associated with the authenticated user from the database 337 and transmitting passage information including an image or a mask image (corresponding to an example of an image transmission unit).
  • the database 337 corresponds to an example of an image data holding unit; in it, feature data associated with the personal identification information is registered in addition to the image data of captured images and mask images.
  • The feature data includes face and body features, specific colors, specific shapes, specific patterns, and the like.
  • The Web server 333 and the mail server 334 are connected to the network 360 through the router 331.
  • The user terminal 370 is a personal computer 371 or a mobile phone device 372.
  • the cellular phone device 372 can be connected to the network 360 through the cellular phone operator 380.
  • FIG. 8 is a flowchart showing an operation processing procedure in the watching node 310. This processing is performed by the processor executing a program stored in a storage medium provided in the watching node 310.
  • In step S51, it is determined whether a wireless IC tag signal has been received by the electronic tag reader 312 (step S51).
  • If the wireless IC tag signal has been received, the personal identification information included in the signal is acquired (step S52). Then, an image of the shooting range set for the camera 311 is captured (step S53). Using this captured image, moving objects in the image are detected by the inter-frame difference method (step S54).
  • FIG. 9 is a diagram illustrating an example of a captured image in which a moving object is detected.
  • In this image, two children are detected as moving objects in addition to the child who carries the wireless IC tag 401. Each moving object is marked with one of the dashed frames a, b, and c.
  • the number of detected moving objects and their coordinates are further acquired as metadata (step S55).
  • These pieces of information (the personal identification information and the number and coordinates of the moving objects) are then transmitted to the watching center 330 together with the image data.
  • If no wireless IC tag signal has been received in step S51, the process is terminated.
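To make steps S53 to S55 concrete, here is a minimal pure-Python sketch of moving-object detection by the inter-frame difference method. The function names, the brightness threshold, and the 4-connected flood fill are illustration-only assumptions; the patent does not specify an implementation.

```python
THRESHOLD = 30  # assumed minimum per-pixel brightness change that counts as motion

def detect_moving_objects(prev_frame, curr_frame, threshold=THRESHOLD):
    """Return bounding boxes (top, left, bottom, right) of moving regions.

    Frames are 2-D lists of grayscale values. Pixels whose absolute
    difference exceeds `threshold` are grouped into 4-connected regions.
    """
    h, w = len(curr_frame), len(curr_frame[0])
    motion = [[abs(curr_frame[y][x] - prev_frame[y][x]) > threshold
               for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if motion[y][x] and not seen[y][x]:
                # Flood-fill one connected motion region, tracking its extent.
                stack = [(y, x)]
                seen[y][x] = True
                t = b = y
                l = r = x
                while stack:
                    cy, cx = stack.pop()
                    t, b = min(t, cy), max(b, cy)
                    l, r = min(l, cx), max(r, cx)
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and motion[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                boxes.append((t, l, b, r))
    return boxes

def build_metadata(boxes):
    """The metadata of step S55 is simply the count and the coordinates."""
    return {"count": len(boxes), "coordinates": boxes}
```

The metadata dictionary corresponds to what the controller 314 forwards to the watching center alongside the image.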
  • FIG. 10 is a flowchart showing a first example of a processing procedure for storing image data and related information in the watching center 330. This processing is performed by the processor executing a program stored in a storage medium provided in the system management device 335 and the image processing device 336.
  • In step S61, it is determined whether image data in which a moving object was detected, the metadata, and the personal identification information have been received from the watching node 310 (step S61).
  • If the image data, metadata, and personal identification information have been received, the image data is associated with the personal identification information and stored so that it cannot be viewed (step S62).
  • Next, a masking process is performed on the areas of moving objects other than the moving object corresponding to the child carrying the wireless IC tag 401 (step S63).
  • In this masking process, the facial feature data of the child associated with the personal identification information registered in the database 337 is compared with facial feature data extracted from the moving-object coordinate areas of the image data. Based on the comparison result, the moving object corresponding to the personal identification information is identified, and the areas of the moving objects other than the identified one are masked.
  • FIG. 11 is a diagram showing an image that has been subjected to masking processing.
  • The moving-object area surrounded by dashed frame a corresponds to the target person specified by the personal identification information, and only this area is left as it is. The areas surrounded by dashed frames b and c, which are the other moving-object areas, are masked by being filled with a solid color, so that no person can be identified in them.
  • In step S64, the masked image data is associated with the personal identification information and stored so as to be viewable.
  • One way to store data viewably is to save it in a publicly accessible storage area.
  • Thereafter, this process ends.
  • If the image data, metadata, and personal identification information have not been received in step S61, the process is terminated as it is.
  • As a result, the mask image can be browsed from the user terminal 370.
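The identify-then-mask idea of the first example (steps S63 and S64) can be illustrated as follows. The patent only states that registered and extracted facial feature data are compared; the feature vectors, the Euclidean distance measure, and the acceptance threshold below are assumptions for the sketch.

```python
def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def identify_target(registered_features, region_features, max_distance=1.0):
    """Return the index of the region best matching the registered person,
    or None if nothing comes within `max_distance` (an assumed threshold)."""
    best, best_d = None, max_distance
    for i, feat in enumerate(region_features):
        d = euclidean(registered_features, feat)
        if d < best_d:
            best, best_d = i, d
    return best

def mask_other_regions(image, boxes, target_index, fill=0):
    """Solid-fill every moving-object bounding box except the target's,
    in place, mirroring the solid-color masking of FIG. 11."""
    for i, (t, l, b, r) in enumerate(boxes):
        if i == target_index:
            continue
        for y in range(t, b + 1):
            for x in range(l, r + 1):
                image[y][x] = fill
    return image
```

Only the region matching the registered child survives unmasked; every other detected region is filled, so other people remain unidentifiable.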
  • FIG. 12 is a flowchart showing a second example of a processing procedure for storing image data and related information in the watching center 330. This processing is performed by the processor executing a program stored in a storage medium provided in the system management device 335 and the image processing device 336.
  • In step S71, it is determined whether image data in which a moving object was detected, the metadata, and the personal identification information have been received from the watching node 310 (step S71).
  • If the image data, metadata, and personal identification information have been received, the image data is associated with the personal identification information and stored so that it cannot be viewed (step S72).
  • One way to save data so that it cannot be viewed is to save it in a storage area that is not open to the public.
  • Next, a masking process is performed on the areas of moving objects other than the moving object corresponding to the child carrying the wireless IC tag 401 (step S73).
  • In this masking process, body feature data such as the height of the child associated with the personal identification information registered in the database 337 is compared with body feature data such as height extracted from the moving-object coordinate areas of the image data. Based on the comparison result, the moving object corresponding to the personal identification information is identified, and the areas of the moving objects other than the identified one are masked. As in the first example, this masking process masks the areas of the other moving objects that do not correspond to the personal identification information, as shown in FIG. 11, and generates an image in which no other person can be identified.
  • In step S74, the masked image data is associated with the personal identification information and stored so as to be viewable.
  • One way to store data viewably is to save it in a publicly accessible storage area.
  • Thereafter, this process ends.
  • If the image data, metadata, and personal identification information have not been received in step S71, this process is terminated.
  • As a result, the mask image can be browsed from the user terminal 370.
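As a rough reading of the second example's height comparison, one might pick the detected box whose apparent height is closest to the registered height. The pixel-to-centimetre scale and the tolerance below are invented for the sketch; the patent only says body feature data such as height is compared.

```python
def pick_by_height(registered_height_cm, boxes, cm_per_pixel=2.0, tolerance_cm=15.0):
    """boxes are (top, left, bottom, right) tuples; return the index of the
    box whose estimated height is nearest the registered height, or None
    if none falls within the assumed tolerance."""
    best, best_err = None, tolerance_cm
    for i, (t, _, b, _) in enumerate(boxes):
        height_cm = (b - t + 1) * cm_per_pixel  # assumed fixed camera scale
        err = abs(height_cm - registered_height_cm)
        if err < best_err:
            best, best_err = i, err
    return best
```

The returned index would then feed the same masking step as in the first example.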
  • FIG. 13 is a flowchart showing a third example of a processing procedure for storing image data and related information in the watching center 330. This processing is performed by the processor executing a program stored in a storage medium provided in the system management device 335 and the image processing device 336.
  • In step S81, it is determined whether image data in which a moving object was detected, the metadata, and the personal identification information have been received from the watching node 310 (step S81).
  • If the image data, metadata, and personal identification information have been received, the image data is associated with the personal identification information and stored so that it cannot be viewed (step S82).
  • One way to save data so that it cannot be viewed is to save it in a storage area that is not open to the public.
  • Next, a masking process is performed on the areas of moving objects other than the moving object corresponding to the child carrying the wireless IC tag 401 (step S83).
  • In this masking process, color feature data, such as the colors of the clothes and school bag worn by the child associated with the personal identification information registered in the database 337, is compared with color feature data extracted from the moving-object coordinate areas of the image data. Based on the comparison result, the moving object corresponding to the personal identification information is identified, and the areas of the moving objects other than the identified one are masked. As in the first example, this masking process masks the areas of the other moving objects that do not correspond to the personal identification information, as shown in FIG. 11, and generates an image in which no other person can be identified.
  • In step S84, the masked image data is associated with the personal identification information and stored so as to be viewable.
  • One way to store data viewably is to save it in a publicly accessible storage area.
  • Thereafter, this process ends.
  • If the image data, metadata, and personal identification information have not been received in step S81, this process ends.
  • As a result, the mask image can be browsed from the user terminal 370.
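The color comparison of the third example could, for instance, compare a registered clothing color with the mean color inside each moving-object box. The RGB representation and the distance threshold are assumptions; the patent does not define the color feature.

```python
def mean_color(image, box):
    """Average (R, G, B) over a bounding box of an image stored as a
    2-D list of RGB tuples."""
    t, l, b, r = box
    n = (b - t + 1) * (r - l + 1)
    sums = [0, 0, 0]
    for y in range(t, b + 1):
        for x in range(l, r + 1):
            for c in range(3):
                sums[c] += image[y][x][c]
    return tuple(s / n for s in sums)

def pick_by_color(image, boxes, registered_rgb, max_dist=60.0):
    """Return the index of the box whose mean color is nearest the
    registered color, or None if none is within the assumed threshold."""
    best, best_d = None, max_dist
    for i, box in enumerate(boxes):
        m = mean_color(image, box)
        d = sum((a - b) ** 2 for a, b in zip(m, registered_rgb)) ** 0.5
        if d < best_d:
            best, best_d = i, d
    return best
```

A histogram comparison would be a natural refinement, but the mean-color version keeps the sketch short.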
  • FIG. 14 is a flowchart showing a fourth example of a processing procedure for storing image data and related information in the watching center 330. This processing is performed by the processor executing a program stored in a storage medium provided in the system management device 335 and the image processing device 336.
  • In step S91, it is determined whether image data in which a moving object was detected, the metadata, and the personal identification information have been received from the watching node 310 (step S91).
  • If the image data, metadata, and personal identification information have been received, the image data is associated with the personal identification information and stored so that it cannot be viewed (step S92).
  • One way to save data so that it cannot be viewed is to save it in a storage area that is not open to the public.
  • Next, a masking process is performed on the areas of moving objects other than the moving object corresponding to the child carrying the wireless IC tag 401 (step S93).
  • In this masking process, the presence or absence of a specific object, such as the hat or school bag worn by the child associated with the personal identification information and registered in the database 337, is determined. Based on the result, the moving object corresponding to the personal identification information is identified, and the areas of the moving objects other than the identified one are masked. As in the first example, this masking process masks the areas of the other moving objects that do not correspond to the personal identification information, as shown in FIG. 11, and generates an image in which no other person can be identified.
  • In step S94, the masked image data is associated with the personal identification information and stored so as to be viewable.
  • One way to store data viewably is to save it in a publicly accessible storage area.
  • Thereafter, this process ends.
  • If the image data, metadata, and personal identification information have not been received in step S91, this process ends.
  • As a result, the mask image can be browsed from the user terminal 370.
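One simple way to read the fourth example's "presence of a specific object" test is as a color-range check: a region is deemed to carry the object (say, a yellow hat) if enough of its pixels fall inside an assumed color range. The range and the fraction below are invented for the sketch.

```python
def has_object(image, box, lo=(180, 180, 0), hi=(255, 255, 80), min_fraction=0.05):
    """True if at least `min_fraction` of the pixels in the box lie inside
    the assumed per-channel color range [lo, hi] (RGB tuples)."""
    t, l, b, r = box
    total = (b - t + 1) * (r - l + 1)
    hits = 0
    for y in range(t, b + 1):
        for x in range(l, r + 1):
            px = image[y][x]
            if all(lo[c] <= px[c] <= hi[c] for c in range(3)):
                hits += 1
    return hits / total >= min_fraction

def pick_by_object(image, boxes):
    """Return the first region that appears to carry the object, or None."""
    for i, box in enumerate(boxes):
        if has_object(image, box):
            return i
    return None
```

In practice a shape or template test could replace the color-range test; the surrounding masking logic stays the same.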
  • FIG. 15 is a flowchart showing a fifth example of a processing procedure for storing image data and related information in the watching center 330. This processing is performed by the processor executing a program stored in a storage medium provided in the system management device 335 and the image processing device 336.
  • In step S101, it is determined whether image data in which a moving object was detected, the metadata, and the personal identification information have been received from the watching node 310 (step S101).
  • If the image data, metadata, and personal identification information have been received, the image data is associated with the personal identification information and stored so that it cannot be viewed (step S102).
  • One way to store data so that it cannot be viewed is to save it in a storage area that is not open to the public.
  • Next, a masking process is performed on the areas of moving objects other than the moving object corresponding to the child carrying the wireless IC tag 401 (step S103).
  • In this masking process, the presence or absence of a specific pattern, such as the pattern of the hat or school bag worn by the child associated with the personal identification information and registered in the database 337, is determined. Based on the result, the moving object corresponding to the personal identification information is identified, and the areas of the moving objects other than the identified one are masked. As in the first example, this masking process masks the areas of the other moving objects that do not correspond to the personal identification information, as shown in FIG. 11, and generates an image in which no other person can be identified.
  • In step S104, the masked image data is associated with the personal identification information and stored so as to be viewable.
  • One way to store data viewably is to save it in a publicly accessible storage area. Thereafter, this process ends.
  • If the image data, metadata, and personal identification information have not been received in step S101, this process is terminated. As a result, the mask image can be browsed from the user terminal 370.
  • In these examples, the moving-object areas are detected from the captured image, so the processing load for detecting the masking target areas can be reduced compared with detecting face areas. In addition, the area of the target moving object can be identified easily, and masking the entire areas of the other moving objects enhances privacy protection.
  • In the above examples, masking is performed by solid filling, but masking may also be performed by various other methods such as mosaicking, blurring, or replacement with a specific image.
  • It is also possible to obtain a desired mask image by allowing the user to select the masking method from the user terminal.
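Of the alternative mask styles mentioned above, mosaicking is easy to sketch on a grayscale region: each tile of the masked box is replaced by its average value. The tile size is a free choice, not something the patent fixes.

```python
def mosaic(image, box, block=2):
    """Replace each block x block tile inside the box with its average
    value (grayscale image as a 2-D list of ints), in place."""
    t, l, b, r = box
    for y0 in range(t, b + 1, block):
        for x0 in range(l, r + 1, block):
            ys = range(y0, min(y0 + block, b + 1))
            xs = range(x0, min(x0 + block, r + 1))
            vals = [image[y][x] for y in ys for x in xs]
            avg = sum(vals) // len(vals)
            for y in ys:
                for x in xs:
                    image[y][x] = avg
    return image
```

Solid filling is the degenerate case of writing one constant over the box; blurring would average over a sliding window instead of disjoint tiles.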
  • When identifying a person in a captured image, two or more of the authentications described above (face, height, color, shape, and pattern) may be combined instead of using each one individually.
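The combination of authentications could take many forms; one minimal reading is a vote among independent checks, where a region is accepted only if enough checks pass. The vote rule below is an assumption, since the patent does not specify how the results are fused.

```python
def combined_match(region, checks, min_votes=2):
    """Return True if at least `min_votes` of the match predicates accept
    the region. `checks` is a list of callables taking the region."""
    votes = sum(1 for check in checks if check(region))
    return votes >= min_votes
```

Weighted scores or a requirement that all checks pass would be equally valid fusion rules under the same sentence of the description.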
  • If the child cannot be authenticated from the photographed image, the image may be kept private without being masked. This avoids generating unnecessary images.
  • In that case, the detection information and time information of the wireless IC tag may be disclosed instead.
  • In the above embodiment, moving objects are detected by the inter-frame difference method, but the detection method is not particularly limited; for example, moving objects may be detected by the template matching method.
  • As devices for detecting and authenticating the person to be monitored, a wireless IC tag and an IC tag signal receiving device are used, but the system is not limited to a combination of devices that transmit and receive radio signals; a combination of devices that transmit and receive optical signals may also be used.
  • In the above embodiments, a monitoring system in which a guardian watches over a child attending school has been shown, but the monitoring system of the present invention is not limited to this application; it can also be applied to other uses that require privacy protection.
  • As described above, the present invention can protect the privacy of persons other than the target person without requiring accurate position information of the target person or matching the camera's shooting range to the target person alone, and has the effect of obtaining a monitoring image without missing the opportunity even when many target persons enter the camera's shooting range at once. It is therefore useful as a monitoring system, monitoring apparatus, and monitoring method.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention provides a monitoring system for a subject that does not need to adjust a camera's shooting range to cover a human subject and that can protect the privacy of other individuals. A data center (105) receives a photographed image captured when a human subject carrying a wireless tag (125) passes near an imaging device (101), and a face area detection device (107) detects face areas from the photographed image as specific areas. Furthermore, a face identification device (108) acquires the facial feature quantity of the human subject registered in a database (110) according to the personal identification information obtained from the wireless IC tag (125), and identifies, among the detected face areas, the face area that matches the subject's. A masking device (109) then masks everything other than the identified face area of the subject and generates a masked image, and an image data management device (106) stores the masked image in association with the personal identification information for disclosure. When a browsing terminal device (111) requests viewing, the monitoring system transmits the masked image corresponding to the authenticated personal identification information.
PCT/JP2007/063368 2006-07-05 2007-07-04 Monitoring system, monitoring apparatus, and monitoring method WO2008004578A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-185214 2006-07-05
JP2006185214A JP5150067B2 (ja) Monitoring system, monitoring apparatus, and monitoring method

Publications (1)

Publication Number Publication Date
WO2008004578A1 true WO2008004578A1 (fr) 2008-01-10

Family

ID=38894545

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/063368 WO2008004578A1 (fr) Monitoring system, monitoring apparatus, and monitoring method

Country Status (2)

Country Link
JP (1) JP5150067B2 (fr)
WO (1) WO2008004578A1 (fr)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010070507A1 (fr) * 2008-12-19 2010-06-24 Nokia Corporation Maintien optimisé de sécurité et d'intégrité
CN101819687A (zh) * 2010-04-16 2010-09-01 阜新力兴科技有限责任公司 一种人脸识别学生考勤装置及方法
WO2016084304A1 (fr) * 2014-11-26 2016-06-02 パナソニックIpマネジメント株式会社 Dispositif d'imagerie, dispositif d'enregistrement, et dispositif de commande de sortie vidéo
CN106851226A (zh) * 2017-03-29 2017-06-13 宁夏宁信信息科技有限公司 基于用户行为识别的摄像头自动调整的监控方法及***
EP3119075A4 (fr) * 2014-03-10 2017-11-15 Sony Corporation Appareil de traitement d'informations, support d'informations et procédé de commande
CN108200382A (zh) * 2017-12-15 2018-06-22 北京奇虎科技有限公司 一种视频监控的方法和装置
CN112567728A (zh) * 2018-08-31 2021-03-26 索尼公司 成像设备、成像***、成像方法和成像程序
CN112977823A (zh) * 2021-04-15 2021-06-18 上海工程技术大学 一种监控人流量数据无人机及监控方法
CN113438420A (zh) * 2021-06-29 2021-09-24 维沃软件技术有限公司 图像处理方法、装置、电子设备及存储介质
US11889177B2 (en) 2018-08-31 2024-01-30 Sony Semiconductor Solutions Corporation Electronic device and solid-state imaging device

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5022956B2 (ja) * 2008-03-19 2012-09-12 株式会社東芝 画像処理装置、画像処理方法および画像処理システム
JP5159381B2 (ja) * 2008-03-19 2013-03-06 セコム株式会社 画像配信システム
JP2010146094A (ja) * 2008-12-16 2010-07-01 Nec Corp 画像処理装置、画像処理方法及び画像処理プログラム
JP5883577B2 (ja) * 2011-05-25 2016-03-15 富士通株式会社 セキュリティ確保型リモート監視装置および方式
JP5865710B2 (ja) * 2012-01-12 2016-02-17 セコム株式会社 画像処理装置
JP6028457B2 (ja) * 2012-08-24 2016-11-16 ソニー株式会社 端末装置、サーバ及びプログラム
JP6180244B2 (ja) * 2013-08-30 2017-08-16 Kddi株式会社 カメラ映像処理方法、装置およびプログラム
US9912838B2 (en) 2015-08-17 2018-03-06 Itx-M2M Co., Ltd. Video surveillance system for preventing exposure of uninteresting object
JP6665590B2 (ja) * 2016-03-03 2020-03-13 沖電気工業株式会社 情報処理装置、情報処理方法、プログラム、及び情報処理システム
JP6665591B2 (ja) * 2016-03-03 2020-03-13 沖電気工業株式会社 情報処理装置、情報処理方法、プログラム、及び情報処理システム
JP2017169131A (ja) * 2016-03-17 2017-09-21 日本電信電話株式会社 画像処理装置及び画像処理方法
KR101951605B1 (ko) * 2018-11-07 2019-02-22 이종원 영상의 유출을 방지하기 위한 cctv 영상 보안 시스템
KR102238939B1 (ko) * 2018-11-29 2021-04-14 오지큐 주식회사 무선단말을 이용한 초상권 보호 방법
JP2020102817A (ja) * 2018-12-25 2020-07-02 凸版印刷株式会社 監視対象識別装置、監視対象識別システム、および、監視対象識別方法
JP2020141212A (ja) * 2019-02-27 2020-09-03 沖電気工業株式会社 画像処理システム、画像処理装置、画像処理プログラム、画像処理方法、及び表示装置
JP6796294B2 (ja) * 2019-04-10 2020-12-09 昌樹 加藤 監視カメラ
CN112073689A (zh) * 2020-09-03 2020-12-11 国网北京市电力公司 一种有限空间作业人员进出管控装置

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003158735A (ja) * 2001-11-22 2003-05-30 Matsushita Electric Works Ltd 映像監視システム
JP2004048519A (ja) * 2002-07-15 2004-02-12 Hitachi Ltd 防犯装置
JP2006101028A (ja) * 2004-09-28 2006-04-13 Hitachi Kokusai Electric Inc 監視システム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3979902B2 (ja) * 2001-08-30 2007-09-19 株式会社日立国際電気 監視映像配信システムおよび監視映像配信方法
JP3793487B2 (ja) * 2002-07-12 2006-07-05 ナイルス株式会社 撮像システム

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003158735A (ja) * 2001-11-22 2003-05-30 Matsushita Electric Works Ltd 映像監視システム
JP2004048519A (ja) * 2002-07-15 2004-02-12 Hitachi Ltd 防犯装置
JP2006101028A (ja) * 2004-09-28 2006-04-13 Hitachi Kokusai Electric Inc 監視システム

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102257804A * 2008-12-19 2011-11-23 Nokia Corporation Improved maintenance of security and integrity
US8515211B2 2008-12-19 2013-08-20 Nokia Corporation Methods, apparatuses, and computer program products for maintaining of security and integrity of image data
WO2010070507A1 * 2008-12-19 2010-06-24 Nokia Corporation Optimized maintaining of security and integrity
CN101819687A * 2010-04-16 2010-09-01 Fuxin Lixing Technology Co., Ltd. Face recognition student attendance device and method
EP3119075A4 * 2014-03-10 2017-11-15 Sony Corporation Information processing apparatus, storage medium, and control method
WO2016084304A1 * 2014-11-26 2016-06-02 Panasonic IP Management Co., Ltd. Imaging device, recording device, and video output control device
CN106851226B * 2017-03-29 2018-07-31 Ningxia Ningxin Information Technology Co., Ltd. Monitoring method and system with automatic camera adjustment based on user behavior recognition
CN106851226A * 2017-03-29 2017-06-13 Ningxia Ningxin Information Technology Co., Ltd. Monitoring method and system with automatic camera adjustment based on user behavior recognition
CN108200382A * 2017-12-15 2018-06-22 Beijing Qihoo Technology Co., Ltd. Video surveillance method and device
CN112567728A * 2018-08-31 2021-03-26 Sony Corporation Imaging device, imaging system, imaging method, and imaging program
EP3846441A4 * 2018-08-31 2021-10-13 Sony Group Corporation Imaging device, imaging system, imaging method, and imaging program
US11595608B2 2018-08-31 2023-02-28 Sony Corporation Imaging apparatus, imaging system, imaging method, and imaging program including sequential recognition processing on units of readout
CN112567728B * 2018-08-31 2023-06-09 Sony Corporation Imaging device, imaging system, and imaging method
US11704904B2 2018-08-31 2023-07-18 Sony Corporation Imaging apparatus, imaging system, imaging method, and imaging program
US11741700B2 2018-08-31 2023-08-29 Sony Corporation Imaging apparatus, imaging system, imaging method, and imaging program
US11763554B2 2018-08-31 2023-09-19 Sony Corporation Imaging apparatus, imaging system, imaging method, and imaging program
US11889177B2 2018-08-31 2024-01-30 Sony Semiconductor Solutions Corporation Electronic device and solid-state imaging device
CN112977823A * 2021-04-15 2021-06-18 Shanghai University of Engineering Science Drone and method for monitoring pedestrian flow data
CN113438420A * 2021-06-29 2021-09-24 Vivo Software Technology Co., Ltd. Image processing method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
JP2008017093A (ja) 2008-01-24
JP5150067B2 (ja) 2013-02-20

Similar Documents

Publication Publication Date Title
JP5150067B2 (ja) Monitoring system, monitoring apparatus, and monitoring method
CN101093542B (zh) Inquiry system, imaging device, inquiry device, and information processing method
US20060104483A1 (en) Wireless digital image capture device with biometric readers
CN104838642B (zh) Method and device for tagging media with creator identity or scenario
JP4701356B2 (ja) Privacy-protecting image generation device
US10165178B2 (en) Image file management system and imaging device with tag information in a communication network
WO2004105383A1 (fr) Imaging system
JP4820636B2 (ja) Object behavior detection and notification system and controller device
JP2007158421A (ja) Surveillance camera system and face image tracking and recording method
JP6028457B2 (ja) Terminal device, server, and program
JP4958600B2 (ja) Watching system and masking processing method
JP5500639B2 (ja) Terminal device, authentication system, and program
JP2014154129A (ja) Equipment recognition system
JP5883577B2 (ja) Security-ensured remote monitoring apparatus and method
WO2016194275A1 (fr) Flow line analysis system, camera device, and flow line analysis method
KR101863846B1 (ko) Method and system for detecting events and providing on-site photo information
JP2015015570A (ja) Image management system, image management apparatus, and image management program
JP2005229265A (ja) Image encryption system and image encryption method
JP5909712B1 (ja) Flow line analysis system, camera device, and flow line analysis method
KR20160108236A (ko) System for providing video of an observed person and tag worn by the observed person
JP2006352908A (ja) Monitoring device, surveillance video distribution system, and surveillance video distribution method
KR101861732B1 (ko) Observed-person location tracking system
JP2016225813A (ja) Flow line analysis system, camera device, and flow line analysis method
JP6921550B2 (ja) Monitoring system and monitoring method
JP2012083915A (ja) Image publishing system, image publishing method, and imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07768126

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 07768126

Country of ref document: EP

Kind code of ref document: A1