WO2022064854A1 - Hardware accelerator, image processing device, and image processing method - Google Patents

Hardware accelerator, image processing device, and image processing method

Info

Publication number
WO2022064854A1
WO2022064854A1 (application PCT/JP2021/028661)
Authority
WO
WIPO (PCT)
Prior art keywords
image
image processing
feature
threshold value
unit
Prior art date
Application number
PCT/JP2021/028661
Other languages
French (fr)
Japanese (ja)
Inventor
康佑 松本
Original Assignee
カシオ計算機株式会社
Priority date
Filing date
Publication date
Application filed by カシオ計算機株式会社 (Casio Computer Co., Ltd.)
Publication of WO2022064854A1 publication Critical patent/WO2022064854A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis

Definitions

  • the present invention relates to a hardware accelerator, an image processing device, and an image processing method.
  • Feature point detection is used in applications such as object detection and alignment. In feature point detection, sorting processing is often performed in any of the steps.
  • an object of the present invention is to provide a hardware accelerator, an image processing device, and an image processing method suitable for object detection at a high frame rate.
  • To achieve the above object, the hardware accelerator of the present invention comprises:
  • a frequency distribution derivation unit that derives a frequency distribution of response values of feature points from an image; and
  • a threshold value derivation unit that derives a threshold value based on the frequency distribution,
  • and causes an image processing apparatus to generate feature amounts of feature points having response values equal to or higher than the threshold value.
  • FIG. 1 is a diagram showing a security system according to an embodiment of the present invention. FIG. 2 is a block diagram of the image processing apparatus according to the embodiment. FIG. 3 is a flowchart of the feature amount generation process of the image processing apparatus. FIG. 4 is a flowchart of the response value count-up process of the image processing apparatus. FIGS. 5A and 5B are explanatory diagrams of the response value count-up process of FIG. 4. FIG. 6 is a flowchart of the threshold value derivation process of the image processing apparatus. FIG. 7 is an explanatory diagram of the threshold value derivation process of FIG. 6. FIG. 8 is a block diagram of an image processing apparatus according to a modification of the present invention.
  • the image processing device provided with the accelerator according to the embodiment of the present invention is a component of a security system that detects a high-speed moving object or the like from an image of a surveillance camera and notifies a security company or the like.
  • A security system 1 installed in, for example, a bank includes an image processing device 10 that detects high-speed moving objects and the like, and a security device 90 that discriminates the objects detected by the image processing device 10 and notifies a security guard, who is the user.
  • The image processing device 10 captures images of objects 100, such as a vehicle 101, a person 102, and an article 103, moving in an imaging range P within the security range of the security system, and detects the vehicle 101, person 102, article 103, and the like from the captured images.
  • Since the vehicle 101 may in some cases move at a considerably high speed, it may be difficult to detect it from an image captured at a normal frame rate.
  • the image processing device 10 takes an image at a high frame rate, 240 fps (frames per second) in the present embodiment, and makes it possible to detect the vehicle 101 from the captured image.
  • In the image processing device 10, the feature point detector 30 is the part implemented as the hardware accelerator.
  • The image processing apparatus 10 detects useful feature points without performing sorting, thereby making the feature point detector 30, which is the hardware accelerator A, small-scale and low-latency.
  • the image processing device 10 includes a control unit 20, a feature point detector 30, a storage unit 40, an image pickup unit 50, a communication unit 60, a display unit 70, and an input unit 80.
  • The control unit 20 is composed of a CPU (Central Processing Unit) or the like, and realizes the functions of each unit described later (feature point acquisition unit 21, feature amount generation unit 22, feature amount transmission unit 23) by executing programs stored in the storage unit 40. The control unit 20 also has a clock (not shown) and can acquire the current date and time, count elapsed time, and the like.
  • the feature point detector 30 which is the hardware accelerator A, is for executing various processes for extracting feature points from the image V on behalf of the control unit 20.
  • The feature point detector 30 includes a CPU and the like, is composed of an integrated circuit (IC: Integrated Circuit) in which electronic components such as semiconductors are integrated, and realizes the functions of each unit described later (image acquisition unit 31, corner detection unit 32, histogram generation unit 33, threshold value derivation unit 34) by executing programs stored in the storage unit 40.
  • the storage unit 40 is composed of a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and a part or all of the ROM is composed of an electrically rewritable memory (flash memory, etc.).
  • the storage unit 40 functionally includes a captured image storage unit 41, a feature point list storage unit 42, a feature amount storage unit 43, and various setting storage units 44.
  • the ROM stores a program executed by the control unit 20 and the feature point detector 30 which is the hardware accelerator A, and data necessary for executing the program in advance. Data that is created or modified during program execution is stored in the RAM.
  • The image V captured by the image pickup unit 50 is stored in the captured image storage unit 41.
  • Although the number of image pickup devices 51 described later is one for simplicity of description, when a plurality of image pickup devices 51 are in operation, the captured image storage unit 41 can store the captured images separately for each image pickup device 51.
  • the feature point list storage unit 42 stores the feature point list generated by the feature amount generation process described later.
  • In the feature point list, the coordinates of the feature points and their response values are stored in association with each other.
  • the feature amount storage unit 43 stores the feature amount generated from the feature points having a response value equal to or higher than the threshold value described later in the feature point list stored in the feature point list storage unit 42. This feature amount is transmitted to the security device 90.
  • Various settings, such as the frame rate for capturing the image V, the imaging range P, and the response value ranges, are stored in the various setting storage unit 44, either in advance or as input by the user via the input unit 80.
  • the image pickup unit 50 has an image pickup device 51 for capturing an image V and a drive device 52 for controlling the posture of the image pickup device 51.
  • The image pickup apparatus 51 includes a CMOS (Complementary Metal Oxide Semiconductor) camera.
  • the image pickup apparatus 51 images the image pickup range P at a high frame rate of 240 fps (frames per second) and generates an image V.
  • Although the frame rate is initially set to 240 fps, it can be changed manually by the user or automatically according to the assumed moving speed of the imaged object and the like.
  • the drive device 52 adjusts the image pickup range P by moving the position and direction of the image pickup device 51 according to the instructions of the input device 81 and the like described later.
  • the communication unit 60 has a communication device 61 which is a module for communicating with a security device 90, an external device, and the like.
  • the communication device 61 is a wireless module including an antenna when communicating with an external device.
  • the communication device 61 is a wireless module for performing short-range wireless communication based on Bluetooth (registered trademark).
  • the image processing device 10 can exchange data such as an image V and a feature amount with a security device 90 or an external device.
  • the display unit 70 includes a display device 71 composed of a liquid crystal display panel (LCD: Liquid Crystal Display).
  • As the display device 71, a thin-film-transistor (TFT) liquid crystal display, an organic EL display, or the like can be adopted.
  • the image V, the feature amount list, and the like are displayed on the display device 71.
  • the input unit 80 is a resistance film type touch panel (input device 81) provided in the vicinity of the display unit 70 or integrally with the display unit 70.
  • the touch panel may be an infrared operation method, a projection type capacitance method, or the like, and the input unit may be a keyboard, a mouse, or the like instead of the touch panel.
  • the user can change the frame rate by manual operation via the input unit 80, or set the image pickup range P or the like using the display unit 70.
  • Next, the control unit 20 and the feature point detector 30 of the image processing device 10 will be described.
  • the control unit 20 realizes the functions of the feature point acquisition unit 21, the feature amount generation unit 22, and the feature amount transmission unit 23, and performs the feature amount generation process described later together with the feature point detector 30 which is the hardware accelerator A.
  • the feature point acquisition unit 21 reads the coordinates of the feature point from the feature point list and transmits the coordinates to the feature amount generation unit 22.
  • When the response value R of a feature point is equal to or greater than the threshold value Th, the feature amount generation unit 22 generates the feature amount of the feature point and stores it in the feature amount storage unit 43.
  • The result D of the DoG (Difference of Gaussians) image is calculated from the following equation (3).
  • the two-dimensional Hessian matrix H shown in the following equation (4) is calculated for the key point candidate.
  • Subpixel estimation is used for the position of key points in scale space.
  • the DoG output at the sub-pixel position is calculated, and those whose absolute value of the DoG output is smaller than the threshold value are excluded from the key point candidates.
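The DoG computation and the low-contrast exclusion described above can be sketched in code. Equations (3) to (6) are not reproduced in this text, so the standard SIFT formulation is assumed (D = L(kσ) − L(σ), with candidates whose |D| falls below a contrast threshold excluded); the function names, the scale factor k = 1.6, and the contrast threshold 0.03 are illustrative assumptions, not values given by the patent.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Smoothed image L(x, y, sigma): separable Gaussian convolution."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x ** 2 / (2 * sigma ** 2))
    kernel /= kernel.sum()
    padded = np.pad(img.astype(np.float64), radius, mode="edge")
    # Convolve rows, then columns (Gaussian kernels are separable).
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="valid"), 0, rows)

def dog(img, sigma, k=1.6):
    """DoG output D = L(x, y, k*sigma) - L(x, y, sigma)."""
    return gaussian_blur(img, k * sigma) - gaussian_blur(img, sigma)

def is_low_contrast(D, y, x, threshold=0.03):
    """Candidates whose |DoG output| is below the threshold are excluded."""
    return abs(D[y, x]) < threshold
```

For a single bright pixel, the DoG output at its location is negative (the wider Gaussian flattens the peak more), while flat regions give a DoG output of zero and are excluded as low-contrast.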
  • The feature amount generation unit 22 calculates the orientation for each detected key point: the gradient intensity m and the gradient direction θ of the smoothed image L in which the key point was detected are calculated by the following equations (7), (8), and (9).
  • the weighted direction histogram h is calculated from the equations (10) and (11).
  • h is a histogram obtained by quantizing the gradient direction into 36 directions
  • w is a weight for the reference pixel
  • a Gaussian kernel that becomes heavier as it gets closer to the key point is used.
  • δ is a delta function that returns 1 when θ equals the quantized gradient direction θ'.
  • The key point orientations are the peaks that are at 80% or more of the maximum value of the 36-direction histogram.
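The orientation computation just described can be sketched as follows. Equations (7) to (11) are not reproduced in this text, so the standard SIFT formulation is assumed: gradient magnitude m and direction θ from finite differences of the smoothed image L, a 36-bin histogram weighted by m and by a Gaussian w that grows heavier nearer the key point, and peaks at 80% or more of the maximum taken as orientations. The function names, radius, and Gaussian width are illustrative.

```python
import math

def orientation_histogram(L, cy, cx, radius=4, sigma=1.5):
    """36-bin weighted direction histogram h around key point (cy, cx).

    For each reference pixel: m = sqrt(dLx^2 + dLy^2), theta = atan2(dLy, dLx),
    and w * m is added to the bin selected by quantizing theta into 36
    directions (the delta function picking the matching theta')."""
    h = [0.0] * 36
    for y in range(cy - radius, cy + radius + 1):
        for x in range(cx - radius, cx + radius + 1):
            if not (0 < y < len(L) - 1 and 0 < x < len(L[0]) - 1):
                continue  # skip pixels whose central difference would leave the image
            dx = L[y][x + 1] - L[y][x - 1]
            dy = L[y + 1][x] - L[y - 1][x]
            m = math.hypot(dx, dy)
            theta = math.atan2(dy, dx) % (2 * math.pi)
            # Gaussian weight: heavier as the pixel gets closer to the key point.
            w = math.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
            h[int(theta / (2 * math.pi) * 36) % 36] += w * m
    return h

def orientations(h):
    """Key point orientations: bins peaking at 80% or more of the maximum."""
    peak = max(h)
    return [b for b, v in enumerate(h) if v > 0 and v >= 0.8 * peak]
```

For an image whose intensity increases purely along x, every gradient points in the +x direction, so only bin 0 is populated and it is the single orientation.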
  • the feature amount transmission unit 23 transmits the feature amount stored in the feature amount storage unit 43 and the feature point list stored in the feature point list storage unit 42 to the security device 90.
  • the feature point detector 30 realizes the functions of the image acquisition unit 31, the corner detection unit 32, the histogram generation unit 33, and the threshold value derivation unit 34, and performs feature quantity generation processing and the like together with the control unit 20.
  • The image acquisition unit 31 acquires, from the captured image storage unit 41, the image V captured by the image pickup unit 50 over the imaging range P, under exposure conditions set in advance in the image processing device 10 or set by the user.
  • the image acquisition unit 31 transmits the acquired image V to the corner detection unit 32.
  • The corner detection unit 32 detects feature points from the image V transmitted from the image acquisition unit 31 by the Harris corner detection method. The details of the detection method will be described in the feature amount generation process described later.
  • the corner detection unit 32 stores the coordinates of the detected feature points and the response value R in the feature point list storage unit 42.
  • The histogram generation unit (frequency distribution derivation unit) 33 generates a histogram as shown in FIGS. 5A and 5B using the response values R of the feature points stored in the feature point list storage unit 42. The method of generating the histogram will be described in the response value count-up process described later.
  • the histogram generation unit 33 can change the response value range described later according to the result of the obtained histogram or arbitrarily by the user.
  • the threshold value derivation unit 34 derives the threshold value Th based on the response value R in the threshold value derivation process described later. Details of the method for deriving the threshold value Th will be described later.
  • The control unit 20 and the feature point detector 30 have been described above.
  • Next, the feature amount generation process will be described, in which feature points are extracted from the captured image V and feature amounts are generated for those extracted feature points whose response values are equal to or higher than a predetermined threshold value.
  • In this process, a frequency distribution is used instead of sorting, so that the feature point detector 30 can be configured as the accelerator A with a small circuit.
  • the image pickup unit 50 captures a predetermined imaging range P at a frame rate of 240 fps, and stores the captured image V in the image pickup image storage unit 41 (step S1).
  • The feature point acquisition unit 21 reads the n-th pixel from the image V stored in the captured image storage unit 41 and transmits the acquired pixel to the corner detection unit 32 (step S3).
  • the corner detection unit 32 calculates the response value R for the received pixel using the following equations (12) to (15) (step S4).
  • a rectangular window or a Gaussian window is used for the window function w (x, y) in the equation (12).
  • Taylor expansion or the like is performed on the equation (12) to obtain the equation (13).
  • M in Eq. (13) is defined as in Eq. (14).
  • I x and I y represent the gradients of the image V in the x direction and the y direction, respectively.
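Since equations (12) to (15) are not reproduced in this text, the following sketch assumes the standard Harris formulation: the structure tensor M is built from the gradient products Ix², Iy², IxIy summed over a window w, and the response is R = det(M) − k·trace(M)². The 3×3 rectangular window and the constant k = 0.04 are common choices, not values given by the patent.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Per-pixel Harris corner response R = det(M) - k * trace(M)^2."""
    img = img.astype(np.float64)
    # I_y, I_x: gradients of the image in the y and x directions.
    Iy, Ix = np.gradient(img)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def window_sum(a):
        # Sum over a 3x3 rectangular window w (a Gaussian window may be
        # used instead, as noted for equation (12)).
        p = np.pad(a, 1, mode="edge")
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = window_sum(Ixx), window_sum(Iyy), window_sum(Ixy)
    det = Sxx * Syy - Sxy * Sxy    # det(M)
    trace = Sxx + Syy              # trace(M)
    return det - k * trace * trace
```

Corners (strong gradients in both directions) give a large positive R, edges give a negative R, and flat regions give R near zero, which is what the thresholding of step S5 exploits.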
  • When the response value R exceeds the threshold value F (step S5: Yes), the corner detection unit 32 proceeds to step S6.
  • When the response value R is equal to or less than the threshold value F (step S5: No), the response value R is clipped to 0 (step S7), 1 is added to n, and the process returns to step S3.
  • The threshold value F is determined in advance by statistically analyzing the target image V.
  • In step S6, the corner detection unit 32 performs non-maximum suppression (Non Maximum Suppression) on the response values R, thereby excluding all points other than those where the edge strength is maximal. If the pixel is a maximum point (feature point) (step S6: Yes), the process proceeds to step S8; if it is not a maximum point and is excluded, 1 is added to n (step S9) and the process returns to step S3.
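The non-maximum suppression of step S6 can be sketched as follows, assuming a 3×3 neighborhood (the text does not specify the neighborhood size): only pixels whose positive response is the unique maximum of their neighborhood are kept as maximum points, and all other responses are set to 0.

```python
import numpy as np

def non_maximum_suppression(response):
    """Keep only strict 3x3 local maxima of a positive response map."""
    r = np.asarray(response, dtype=np.float64)
    # Pad with -inf so border pixels compare only against real neighbors.
    padded = np.pad(r, 1, mode="constant", constant_values=-np.inf)
    out = np.zeros_like(r)
    height, width = r.shape
    for i in range(height):
        for j in range(width):
            window = padded[i:i + 3, j:j + 3]
            is_unique_max = (r[i, j] == window.max()
                             and (window == r[i, j]).sum() == 1)
            if r[i, j] > 0 and is_unique_max:
                out[i, j] = r[i, j]  # maximum point: kept as a feature point
    return out
```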
  • In step S8, the corner detection unit 32 stores the coordinates of the feature point and its response value R in the feature point list storage unit 42 as a feature point list.
  • In step S10, the histogram generation unit 33 executes the response value count-up process described later.
  • In step S12, the threshold value derivation unit 34 executes the threshold value derivation process described later, which derives the threshold value Th of the response value R corresponding to the top M entries of the histogram.
  • M is 10% of the number of feature points in the feature point list in step S8.
  • The feature point acquisition unit 21 determines whether or not the response value R of each of the Z detected feature points (1 ≤ m ≤ Z) is equal to or higher than the threshold value Th derived by the threshold value derivation process (step S14).
  • the feature point acquisition unit 21 reads the coordinates of the feature point from the feature point list and transmits the coordinates to the feature amount generation unit 22.
  • the feature amount generation unit 22 generates a feature amount of the feature point by SIFT (step S15), stores the feature amount in the feature amount storage unit 43, and proceeds to step S16.
  • When the response value R of the feature point is less than the threshold value Th (step S14: No), the process proceeds to step S16.
  • The response value count-up process executed in step S10 of the feature amount generation process will be described with reference to FIGS. 4, 5A, and 5B.
  • First, the response value range (see FIG. 5A) to which the response value R of the feature point belongs is determined (step S21).
  • The response value ranges may be stored in advance in the various setting storage unit 44, or may be set arbitrarily by the user. Alternatively, when the response values R are concentrated in one (or two, three, etc.) bin ranges, the response value ranges may be changed so that the response values R are not concentrated in a specific range. Further, the response value ranges may be set according to the frequency distribution of the response values R; for example, an appropriate frequency distribution can be obtained by giving the classes exponentially (or logarithmically) increasing widths, as shown in FIG. 5B.
  • In step S22, the bin of the corresponding response value range is counted up, and the process ends.
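Steps S21 and S22 amount to locating the response value range that contains R and incrementing that bin. A minimal sketch, with illustrative bin edges (the exponentially widening class widths follow the suggestion of FIG. 5B; the actual ranges come from the various setting storage unit 44 or the user):

```python
def count_up(histogram, bin_edges, response_value):
    """Determine the response value range containing `response_value`
    (step S21) and count up the corresponding bin (step S22)."""
    for t in range(len(histogram)):
        if bin_edges[t] <= response_value < bin_edges[t + 1]:
            histogram[t] += 1
            return
    # The text leaves out-of-range values unspecified; clamp them
    # into the last bin here.
    histogram[-1] += 1

hist = [0, 0, 0, 0]
edges = [0, 10, 100, 1000, 10000]  # exponentially widening classes (cf. FIG. 5B)
for r in (3, 12, 12, 450, 99999):
    count_up(hist, edges, r)
```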
  • the threshold value derivation process executed in step S12 of the feature amount generation process will be described.
  • The threshold value derivation unit 34 adds the number F1 of response values belonging to the first bin (the bin containing the largest response values R) and the number F2 of response values belonging to the bin with the next largest response values R, to obtain Q2.
  • Next, the threshold value derivation unit 34 adds, to the count F1 + F2 of the first and second bins, the number F3 of response values belonging to the bin with the next largest response values R after the second bin, to obtain Q3.
  • In this way, the threshold value derivation unit 34 sequentially adds the counts of the bins in descending order of response value R to obtain Qt.
  • The threshold value derivation unit 34 compares the number Qt of response values obtained in step S32 with M. When Qt is equal to or less than M (step S33: No), 1 is added to t (step S34) and the process returns to step S32. When Qt exceeds M (step S33: Yes), the threshold value derivation unit 34 sets the bin one step before as the threshold value Th (step S35) and ends the threshold value derivation process.
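The derivation of steps S31 to S35 can be sketched as follows: bin counts are accumulated starting from the bin with the largest response values, and as soon as the cumulative count Qt would exceed M, the boundary of the bin one step before is returned as Th, so that at most M feature points satisfy R ≥ Th. The ascending bin ordering and the choice of edge are interpretations of the text, and the names are illustrative.

```python
def derive_threshold(histogram, bin_edges, M):
    """Accumulate bin counts from the highest-response bin downward
    (steps S31-S32); when the cumulative count Qt would exceed M
    (step S33: Yes), return the lower edge of the bin one step
    before as the threshold value Th (step S35)."""
    qt = 0
    for t in range(len(histogram) - 1, -1, -1):  # highest bin first
        if qt + histogram[t] > M:
            return bin_edges[t + 1]  # lower edge of the previously accepted bin
        qt += histogram[t]
    return bin_edges[0]  # all feature points already fit within M

hist = [50, 30, 15, 5]             # counts per ascending response-value bin
edges = [0, 10, 100, 1000, 10000]  # bin boundaries
th = derive_threshold(hist, edges, 25)  # top-M selection without any sort
```

With M = 25 here, the bins counting 5 and 15 response values are accepted (Qt = 20), the next bin would push Qt to 50, so Th = 100 and the 20 feature points with R ≥ 100 are selected.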
  • As described above, the image processing apparatus 10 detects feature points from the captured image, generates a histogram of the response values of the feature points, sets the threshold value Th of the response value R from the histogram, and generates the feature amounts of feature points having response values R equal to or higher than the threshold value Th. The feature amounts can therefore be generated without performing any sorting. Since no sorting is performed, the hardware accelerator A of the image processing device 10 can be configured as a small-scale, low-latency circuit. In addition, the internal data-storage memory of the image processing device 10 can be reduced.
  • The feature amount transmission unit 23 of the image processing device 10 transmits the generated feature amounts to the security device 90, and the security device 90 determines, based on the received feature amounts, what the object 100 shown in the image V is, and notifies the security guard, who is the user.
  • In the above embodiment, the feature point detector 30 of the image processing apparatus 10 is the hardware accelerator A, but the scope of the hardware accelerator may differ: for example, it may cover only the part that generates the histogram, or the histogram generation and the threshold value derivation, or some other range. The range to be implemented as the hardware accelerator may be determined as needed, and the range excluded from the hardware accelerator may be handled by the control unit and the storage unit. For example, as shown in FIG. 8, an image processing device 10A provided with a hardware accelerator B having an accelerator storage unit 40B may be used. Alternatively, an image processing device equipped with two hardware accelerators A can derive the threshold value at a higher speed.
  • corner detection by Harris is performed, but other corner detection, for example, Plessey corner detection, Moravec corner detection, or the like may be used.
  • In the above embodiment, the feature amount is generated by SIFT, but the feature amount may be generated by ORB (Oriented FAST and Rotated BRIEF), SURF (Speeded-Up Robust Features), or the like.
  • In the above embodiment, the image processing device 10 is used in a security system, but it may instead be linked with, for example, a factory line sensor or an in-vehicle camera.
  • the image processing apparatus 10 generates a histogram of the response value R, but another frequency distribution, for example, a frequency distribution table may be used.
  • In the above embodiment, M is 10% of the number of feature points in the feature point list, but it may be any other value, or M may be determined by the size of the image.
  • The threshold value F in step S5 may also be reset, and the threshold value Th then determined again from the newly detected feature points.
  • Each function of the image processing apparatus 10 of the present invention can also be carried out by a computer such as a normal PC (Personal Computer).
  • a computer such as a normal PC (Personal Computer).
  • the exposure compensation processing and the image processing programs performed by the image processing apparatus 10 have been described as being stored in the ROM of the storage unit 40 in advance.
  • However, the program may be stored and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), or an MO (Magneto-Optical Disc).
  • the present invention is applicable to hardware accelerators, image processing devices and image processing methods suitable for object detection at a high frame rate.
  • 70 ... Display unit, 71 ... Display device, 80 ... Input unit, 81 ... Input device, 90 ... Security device, 100 ... Object, 101 ... Vehicle, 102 ... Person, 103 ... Article, A, B ... Hardware accelerator, P ... Imaging range, V ... Image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

A hardware accelerator according to the present invention comprises: a frequency distribution derivation unit which derives the frequency distribution of response values R of features from an image; and a threshold derivation unit which derives a threshold on the basis of the frequency distribution. The hardware accelerator causes an image processing device to generate a feature value of a feature which has a response value R that is equal to or higher than the threshold. An image processing device according to the present invention comprises a hardware accelerator, a control unit which generates a feature value with respect to a feature that is equal to or higher than a threshold, and a storage unit in which an image, response values, and the feature value are stored.

Description

Hardware accelerator, image processing device, and image processing method
The present invention relates to a hardware accelerator, an image processing device, and an image processing method.
Feature point detection is used in applications such as object detection and alignment. In feature point detection, sorting is often performed in one of the processing steps.
For example, in the feature point detection described in Patent Document 1, sorting is performed to arrange the response values of the feature points in a specific order (ascending, descending, etc.), an arbitrary number of feature points with the highest response values are selected, and feature amounts are generated for the selected feature points. Such feature point detection is currently used when imaging at a normal frame rate (30 fps (frames per second) or the like).
Japanese Unexamined Patent Publication No. 2014-120056
When performing object detection at a high frame rate, there is a limit to how fast a CPU (Central Processing Unit) can perform feature point detection that involves sorting, so part of the processing is delegated to dedicated hardware (hereinafter, the dedicated hardware is referred to as a "hardware accelerator") to reduce the processing load on the CPU. However, simply replacing the sort processing on the CPU with processing on a hardware accelerator incurs a large circuit cost and increases the circuit scale of the hardware accelerator.
Therefore, the present invention has been made to solve the above problems, and an object of the present invention is to provide a hardware accelerator, an image processing device, and an image processing method suitable for object detection at a high frame rate.
In order to achieve the above object, the hardware accelerator of the present invention comprises:
a frequency distribution derivation unit that derives a frequency distribution of response values of feature points from an image; and
a threshold value derivation unit that derives a threshold value based on the frequency distribution,
and causes an image processing apparatus to generate feature amounts of feature points having response values equal to or higher than the threshold value.
According to the present invention, it is possible to provide a hardware accelerator, an image processing device, and an image processing method suitable for object detection at a high frame rate.
FIG. 1 is a diagram showing a security system according to an embodiment of the present invention. FIG. 2 is a block diagram of the image processing apparatus according to the embodiment. FIG. 3 is a flowchart of the feature amount generation process of the image processing apparatus. FIG. 4 is a flowchart of the response value count-up process of the image processing apparatus. FIGS. 5A and 5B are explanatory diagrams of the response value count-up process of FIG. 4. FIG. 6 is a flowchart of the threshold value derivation process of the image processing apparatus. FIG. 7 is an explanatory diagram of the threshold value derivation process of FIG. 6. FIG. 8 is a block diagram of an image processing apparatus according to a modification of the present invention.
An image processing apparatus provided with an accelerator according to an embodiment of the present invention will be described in detail below with reference to the drawings.
The image processing device provided with the accelerator according to the embodiment of the present invention is a component of a security system that detects high-speed moving objects and the like from images of a surveillance camera and notifies a security company or the like.
[Security system configuration]
As shown in FIG. 1, a security system 1 installed in, for example, a bank includes an image processing device 10 that detects high-speed moving objects and the like, and a security device 90 that discriminates the objects detected by the image processing device 10 and notifies a security guard, who is the user. The image processing device 10 captures images of objects 100, such as a vehicle 101, a person 102, and an article 103, moving in an imaging range P within the security range of the security system, and detects the vehicle 101, person 102, article 103, and the like from the captured images. Of these, the vehicle 101 may in some cases move at a considerably high speed, so it may be difficult to detect it from an image captured at a normal frame rate.
Therefore, the image processing device 10 captures images at a high frame rate, 240 fps (frames per second) in the present embodiment, making it possible to detect the vehicle 101 from the captured images. If processing that relies on sorting-based feature point detection were performed as when processing images captured at a normal frame rate, it would be difficult to carry out with the CPU alone. Therefore, in the image processing device 10, the feature point detector 30 is implemented as a hardware accelerator. However, even with the feature point detector 30 as a hardware accelerator, supporting the sort processing would increase circuit cost and circuit scale. For this reason, the image processing apparatus 10 detects useful feature points without performing sorting, thereby making the feature point detector 30, which is the hardware accelerator A, small-scale and low-latency.
(Configuration of image processing device)
As shown in FIG. 2, the image processing device 10 includes a control unit 20, a feature point detector 30, a storage unit 40, an imaging unit 50, a communication unit 60, a display unit 70, and an input unit 80.
The control unit 20 is composed of a CPU (Central Processing Unit) or the like, and realizes the functions of the units described later (the feature point acquisition unit 21, the feature amount generation unit 22, and the feature amount transmission unit 23) by executing programs stored in the storage unit 40. The control unit 20 also has a clock (not shown) and can acquire the current date and time, count elapsed time, and so on.
The feature point detector 30, which is the hardware accelerator A, executes, on behalf of the control unit 20, various processes for extracting feature points from an image V. The feature point detector 30 includes a CPU and the like, is composed of an integrated circuit (IC) in which electronic components such as semiconductors are integrated, and realizes the functions of the units described later (the image acquisition unit 31, the corner detection unit 32, the histogram generation unit 33, and the threshold derivation unit 34) by executing programs stored in the storage unit 40.
The storage unit 40 is composed of a ROM (Read Only Memory), a RAM (Random Access Memory), and the like, and part or all of the ROM is composed of electrically rewritable memory (flash memory or the like). The storage unit 40 functionally includes a captured image storage unit 41, a feature point list storage unit 42, a feature amount storage unit 43, and a various-settings storage unit 44. The ROM stores the programs executed by the control unit 20 and by the feature point detector 30, which is the hardware accelerator A, together with the data required in advance to execute those programs. The RAM stores data created or modified during program execution.
In the present embodiment, the captured image storage unit 41 stores the image V captured by the imaging unit 50. For simplicity of explanation, one imaging device 51 (described later) is assumed in the present embodiment, but when a plurality of imaging devices 51 are in operation, the captured image storage unit 41 can store the captured images for each imaging device 51.
The feature point list storage unit 42 stores the feature point list generated by the feature amount generation process described later. In the feature point list, the coordinates of each feature point are stored in association with its response value.
The feature amount storage unit 43 stores the feature amounts generated from those feature points in the feature point list, stored in the feature point list storage unit 42, whose response values are equal to or greater than the threshold value described later. These feature amounts are transmitted to the security device 90.
The various-settings storage unit 44 stores various settings, such as the frame rate at which the image V is captured, the imaging range P over which the image V is captured, and the response value ranges, either set in advance or entered by the user via the input unit 80.
The imaging unit 50 has an imaging device 51 that captures the image V and a drive device 52 for controlling the posture of the imaging device 51.
In the present embodiment, the imaging device 51 includes a CMOS (Complementary Metal Oxide Semiconductor) camera. The imaging device 51 captures the imaging range P at a high frame rate of 240 fps (frames per second) to generate the image V. Although the frame rate is initially set to 240 fps, it can be changed manually by the user, or automatically, according to the assumed moving speed of the object to be imaged, and the like.
The drive device 52 adjusts the imaging range P by moving the position and direction of the imaging device 51 in accordance with instructions from the input device 81 and the like described later.
The communication unit 60 has a communication device 61, which is a module for communicating with the security device 90, external devices, and the like. When communicating with an external device, the communication device 61 is a wireless module including an antenna. For example, the communication device 61 is a wireless module for short-range wireless communication based on Bluetooth (registered trademark). By using the communication unit 60, the image processing device 10 can exchange data such as the image V and the feature amounts with the security device 90 and external devices.
The display unit 70 includes a display device 71 composed of a liquid crystal display (LCD) panel.
As the display device 71, a thin film transistor (TFT) display, a liquid crystal display, an organic EL display, or the like can be adopted. The display device 71 displays the image V, the feature amount list, and the like.
The input unit 80 is a resistive-film touch panel (input device 81) provided close to or integrally with the display unit 70. The touch panel may instead use an infrared or projected-capacitance method, and the input unit may be a keyboard and mouse instead of a touch panel. The user can change the frame rate by manual operation via the input unit 80, or set the imaging range P and the like using the display unit 70.
Next, the functional configurations of the control unit 20 and the feature point detector 30 of the image processing device 10 will be described.
The control unit 20 realizes the functions of the feature point acquisition unit 21, the feature amount generation unit 22, and the feature amount transmission unit 23, and performs the feature amount generation process described later together with the feature point detector 30, which is the hardware accelerator A.
When the response value R of a feature point is equal to or greater than the threshold value Th described later, the feature point acquisition unit 21 reads the coordinates of that feature point from the feature point list and transmits them to the feature amount generation unit 22.
When the response value R of a feature point is equal to or greater than the threshold value Th, the feature amount generation unit 22 generates the feature amount of that feature point and stores it in the feature amount storage unit 43. In the present embodiment, the feature amount generation unit 22 generates feature amounts by SIFT (Scale-Invariant Feature Transform). Specifically, keypoint candidates are first calculated from DoG (Difference of Gaussian) images, which are differences between smoothed images L obtained by convolving the input image I with Gaussian functions G of different scales, as shown in the following equations (1) and (2).
L(u,v,σ) = G(x,y,σ) × I(u,v) … (1)
G(x,y,σ) = (1/(2πσ²)) exp(−(x² + y²)/(2σ²)) … (2)
The result D of the DoG image is calculated from the following equation (3).
D(u,v,σ) = (G(x,y,kσ) − G(x,y,σ)) × I(u,v) = L(u,v,kσ) − L(u,v,σ) … (3)
Here, the two-dimensional Hessian matrix H shown in the following equation (4) is calculated for each keypoint candidate.
H = [Dxx Dxy]
    [Dxy Dyy] … (4)
With α and β denoting the eigenvalues calculated from H, and γ such that α = γβ, the relationship between the sum of the diagonal components (trace) of H and its determinant is shown in the following equation (5).
tr(H)²/det(H) = (α + β)²/(αβ) = (γβ + β)²/(γβ²) = (γ + 1)²/γ … (5)
In SIFT, unnecessary keypoint candidates lying on edges are excluded by thresholding with γth as shown in the following equation (6).
tr(H)²/det(H) < (γth + 1)²/γth … (6)
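As a sketch, the edge-rejection test of equations (5) and (6) could be implemented as below; the function name and the choice of γth = 10 are illustrative assumptions, not part of the embodiment.

```python
def is_edge_like(dxx, dxy, dyy, gamma_th=10.0):
    """Return True if a keypoint candidate lies on an edge and should be
    discarded, using the tr(H)^2/det(H) test of equations (5) and (6).
    dxx, dxy, dyy are the second derivatives forming the Hessian H."""
    tr = dxx + dyy               # trace(H) = Dxx + Dyy
    det = dxx * dyy - dxy * dxy  # det(H) = Dxx*Dyy - Dxy^2
    if det <= 0:                 # eigenvalues of opposite sign: reject
        return True
    # keep the candidate only when tr^2/det < (gamma_th + 1)^2 / gamma_th
    return tr * tr / det >= (gamma_th + 1.0) ** 2 / gamma_th
```

With γth = 10, a strongly anisotropic Hessian (one eigenvalue much larger than the other) fails the test, while an isotropic one passes.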
Subpixel estimation is used for the positions of keypoints in scale space. The DoG output at the subpixel position is calculated, and candidates whose absolute DoG output is smaller than a threshold are excluded. To calculate the orientation of each detected keypoint, the feature amount generation unit 22 then calculates the gradient magnitude m and gradient direction θ of the smoothed image L in which the keypoint was detected, from the following equations (7), (8), and (9).
m(x,y) = √(fx(x,y)² + fy(x,y)²) … (7)
θ(x,y) = tan⁻¹(fy(x,y)/fx(x,y)) … (8)
fx(x,y) = L(x+1, y) − L(x−1, y), fy(x,y) = L(x, y+1) − L(x, y−1) … (9)
Using the calculated m and θ, a weighted direction histogram h is calculated from equations (10) and (11).
h(θ') = Σx Σy w(x,y)·δ[θ', θ(x,y)] … (10)
w(x,y) = G(x,y,σ)·m(x,y) … (11)
Here h is a histogram with the gradient direction quantized into 36 directions, and w is the weight for each reference pixel, using a Gaussian kernel that becomes heavier closer to the keypoint. δ is a delta function that returns 1 when θ equals the quantized gradient direction θ'. Peaks at 80 percent or more of the maximum of the 36-direction histogram are taken as keypoint orientations.
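The 36-bin weighted orientation histogram of equations (7) to (11) can be sketched as follows; the square patch extraction and the value σ = 1.5 are simplifying assumptions for illustration.

```python
import math

def orientation_histogram(L, cx, cy, radius=4, sigma=1.5):
    """36-bin gradient-direction histogram around keypoint (cx, cy) on
    smoothed image L (a list of rows), following equations (7)-(11)."""
    h = [0.0] * 36
    for y in range(cy - radius, cy + radius + 1):
        for x in range(cx - radius, cx + radius + 1):
            fx = L[y][x + 1] - L[y][x - 1]               # eq. (9)
            fy = L[y + 1][x] - L[y - 1][x]
            m = math.hypot(fx, fy)                        # eq. (7)
            theta = math.atan2(fy, fx) % (2 * math.pi)    # eq. (8)
            # Gaussian weight, heavier near the keypoint (eq. (11))
            g = math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
            h[int(theta / (2 * math.pi) * 36) % 36] += g * m  # eq. (10)
    return h
```

For an image whose intensity increases purely in the x direction, all of the weight falls into the bin for direction 0.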
Finally, using the keypoint orientation, the region in which the feature amount is described is rotated, and the feature amount of the keypoint is described based on the gradient information of the region around the keypoint. The keypoint descriptor divides the area around the keypoint into, for example, 4 × 4 = 16 blocks and creates an 8-direction histogram (45° per direction) for each block, describing the keypoint's feature amount as a 128-dimensional feature vector.
The feature amount transmission unit 23 transmits the feature amounts stored in the feature amount storage unit 43 and the feature point list stored in the feature point list storage unit 42 to the security device 90.
The feature point detector 30 realizes the functions of the image acquisition unit 31, the corner detection unit 32, the histogram generation unit 33, and the threshold derivation unit 34, and performs the feature amount generation process and the like together with the control unit 20.
The image acquisition unit 31 acquires, from the captured image storage unit 41, the image V captured of the imaging range P by the imaging unit 50 under exposure conditions set in advance in the image processing device 10 or set by the user. The image acquisition unit 31 transmits the acquired image V to the corner detection unit 32.
In the present embodiment, the corner detection unit 32 detects feature points in the image V transmitted from the image acquisition unit 31 by the Harris corner detection method. The details of the detection method are explained in the feature amount generation process described later. The corner detection unit 32 stores the coordinates of the detected feature points and their response values R in the feature point list storage unit 42.
The histogram generation unit (frequency distribution derivation unit) 33 uses the response values R of the feature points stored in the feature point list storage unit 42 to generate histograms such as those shown in FIGS. 5A and 5B. The method of generating the histogram is explained in the response value count-up process described later. The histogram generation unit 33 can change the response value ranges described later according to the resulting histogram, or arbitrarily at the user's direction.
The threshold derivation unit 34 derives the threshold value Th based on the response values R in the threshold derivation process described later. The details of the method for deriving the threshold value Th are described later.
The functional configurations of the control unit 20 and the feature point detector 30 have been described above. Hereinafter, with reference to FIG. 3 and subsequent figures, the feature amount generation process will be described in detail: feature points are extracted from the captured image V, and feature amounts are generated for those extracted feature points whose response values are equal to or greater than a predetermined threshold. Because this feature amount generation process performs frequency distribution processing instead of sorting, the feature point detector 30 can be configured as the accelerator A with a small circuit.
(Feature amount generation process)
First, the imaging unit 50 captures a predetermined imaging range P at a frame rate of 240 fps and stores the captured image V in the captured image storage unit 41 (step S1).
Next, the n-th (1 ≤ n ≤ k) pixel read begins (step S2). The feature point acquisition unit 21 reads the n-th pixel from the image V stored in the captured image storage unit 41 and transmits the acquired pixel to the corner detection unit 32 (step S3). The number of pixels of the image V is k (1 ≤ n ≤ k), and pixels are acquired in the order n = 1, 2, 3, …, k.
The corner detection unit 32 calculates the response value R for the received pixel using the following equations (12) to (15) (step S4). A rectangular window or a Gaussian window is used for the window function w(x,y) in equation (12). To maximize E(u,v), a Taylor expansion and the like are applied to equation (12) to obtain equation (13). M in equation (13) is defined as in equation (14). Ix and Iy represent the gradients of the image V in the x and y directions, respectively.
E(u,v) = Σx,y w(x,y)[I(x+u, y+v) − I(x,y)]² … (12)
E(u,v) ≅ [u v] M [u v]ᵀ … (13)
M = Σx,y w(x,y) [Ix²  IxIy]
                [IxIy Iy² ] … (14)
R = det(M) − k(trace(M))²
  det(M) = λ1λ2 … (15)
  trace(M) = λ1 + λ2
  λ1, λ2: eigenvalues of M
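A minimal sketch of the per-pixel Harris response of equations (12) to (15) follows; the 3×3 rectangular window, central-difference gradients, and k = 0.04 are illustrative assumptions, since the embodiment leaves these choices open.

```python
def harris_response(I, x, y, k=0.04):
    """Response value R = det(M) - k*(trace(M))^2 (eq. (15)) at pixel
    (x, y), with M accumulated over a 3x3 rectangular window (eq. (14)).
    I is a grayscale image given as a list of rows."""
    sxx = sxy = syy = 0.0
    for v in range(y - 1, y + 2):
        for u in range(x - 1, x + 2):
            ix = (I[v][u + 1] - I[v][u - 1]) / 2.0  # gradient Ix
            iy = (I[v + 1][u] - I[v - 1][u]) / 2.0  # gradient Iy
            sxx += ix * ix
            sxy += ix * iy
            syy += iy * iy
    det_m = sxx * syy - sxy * sxy   # det(M) = lambda1 * lambda2
    trace_m = sxx + syy             # trace(M) = lambda1 + lambda2
    return det_m - k * trace_m ** 2
```

On a synthetic step image, R comes out positive at the corner of the step, negative along the straight edge, and zero in flat regions, which is the behavior the corner detection unit 32 relies on.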
When the calculated response value R exceeds the threshold value F (step S5: Yes), the corner detection unit 32 proceeds to step S6. When the response value R is equal to or less than the threshold value F (step S5: No), R is clipped to 0 (step S7), 1 is added to n, and the process returns to step S3. The threshold value F is determined in advance by statistically analyzing target images V.
In step S6, the corner detection unit 32 applies non-maximum suppression to the response values R, excluding everything except the points where the edge strength is a local maximum. If the point is a local maximum (feature point) (step S6: Yes), the process proceeds to step S8; if it is not a local maximum and is excluded, 1 is added to n (step S9) and the process returns to step S3.
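The non-maximum suppression of step S6 might be sketched as follows; the 3×3 neighborhood is an assumption, since the embodiment does not specify the suppression window.

```python
def is_local_maximum(R, x, y):
    """True when the response map R (a list of rows) is strictly greater
    at (x, y) than at every other pixel of its 3x3 neighborhood,
    i.e. the point survives non-maximum suppression (step S6)."""
    center = R[y][x]
    for v in range(y - 1, y + 2):
        for u in range(x - 1, x + 2):
            if (u, v) != (x, y) and R[v][u] >= center:
                return False
    return True
```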
In step S8, the corner detection unit 32 stores the coordinates of the feature point and its response value R in the feature point list storage unit 42 as a feature point list entry.
Proceeding to step S10, the histogram generation unit 33 executes the response value count-up process described later.
Proceeding to step S11, if n = k (step S11: Yes), the process proceeds to step S12; if n < k (step S11: No), 1 is added to n (step S13) and the process returns to step S3.
In step S12, the threshold derivation unit 34 executes the threshold derivation process, described later, for deriving from the histogram the threshold value Th of the response value R that selects the top M feature points. In the present embodiment, M is 10 percent of the number of feature points in the feature point list of step S8.
Proceeding to step S14, the feature point acquisition unit 21 determines, for each of the Z detected feature points (1 ≤ m ≤ Z), whether its response value R is equal to or greater than the threshold value Th derived by the threshold derivation process. When the response value R of the feature point is equal to or greater than the threshold value Th (step S14: Yes), the feature point acquisition unit 21 reads the coordinates of the feature point from the feature point list and transmits them to the feature amount generation unit 22. The feature amount generation unit 22 generates the feature amount of the feature point by SIFT (step S15), stores it in the feature amount storage unit 43, and proceeds to step S16. When the response value R of the feature point is less than the threshold value Th (step S14: No), the process proceeds to step S16.
In step S16, when m = Z (step S16: Yes), the feature amount generation process ends; when m < Z (step S16: No), 1 is added to m (step S17) and the process returns to step S14.
Next, the response value count-up process executed in step S10 of the feature amount generation process will be described with reference to FIGS. 4, 5A, and 5B.
First, the response value range (see FIG. 5A) to which the response value R of the feature point belongs is determined (step S21). The response value ranges may be those stored in advance in the various-settings storage unit 44, or may be set arbitrarily by the user. Alternatively, when the response values R concentrate in one (or two, three, etc.) bins, the response value ranges can be changed so that the response values R do not concentrate in a specific range. The response value ranges may also be chosen according to the frequency distribution of the response values R so that an appropriate frequency distribution is obtained, for example by making the class widths exponential (or logarithmic), as shown in FIG. 5B.
Next, proceeding to step S22, the bin of the corresponding response value range is counted up (step S22), and the process ends.
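Steps S21 and S22 amount to a single increment of a fixed-bin histogram. A minimal sketch, assuming ascending bin edges and clamping overflow into the top bin (the embodiment's exponential-width variant would only change the edges):

```python
def count_up(hist, bin_edges, response):
    """Increment the bin of `hist` whose response value range contains
    `response` (steps S21-S22). bin_edges is ascending, with one more
    entry than hist; values beyond the last edge go to the top bin."""
    for i in range(len(hist) - 1):
        if response < bin_edges[i + 1]:  # step S21: find the range
            hist[i] += 1                 # step S22: count up
            return
    hist[-1] += 1  # clamp overflow into the top bin
```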
Next, the threshold derivation process executed in step S12 of the feature amount generation process will be described with reference to FIGS. 6 and 7. As shown in FIG. 7, in the present embodiment the bins (BIN) are divided into five, numbered 0 to 4, where bin 4 covers the largest response values R and bin 0 the smallest. It is also assumed that M = 6.
First, in the first round (t = 1), the threshold derivation unit 34 derives the number F1 (= Q1) of response values belonging to the bin covering the largest response values R (step S32). In the second round, the threshold derivation unit 34 adds to F1, the number of response values in the first-round bin, the number F2 of response values belonging to the bin with the next-largest response values R, obtaining Q2. In the third round, the threshold derivation unit 34 adds to F1 + F2, the counts of the first- and second-round bins, the number F3 of response values belonging to the bin with the next-largest response values after the second-round bin, obtaining Q3. Thereafter, in each round, the threshold derivation unit 34 adds in turn the count of the bin with the next-largest response values R, obtaining Qt. In the present embodiment, the cumulative counts are 1 in the first and second rounds (Q1 = Q2 = 1), 5 in the third round (Q3 = 5), 7 in the fourth round (Q4 = 7), and 8 in the fifth round (Q5 = 8).
Proceeding to step S33, the threshold derivation unit 34 compares the number Qt of response values obtained in step S32 with M. When Qt is less than or equal to M (step S33: No), 1 is added to t (step S34) and the process returns to step S32. When Qt exceeds M (step S33: Yes), the threshold derivation unit 34 sets the bin of the previous round as the threshold value Th (step S35) and ends the threshold derivation process.
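The cumulative scan of steps S32 to S35 can be sketched as follows. The bin counts used in the test reproduce the worked example of FIG. 7; they are an assumption consistent with the stated cumulative values Q1 to Q5 = 1, 1, 5, 7, 8, and the first-round edge case is resolved by returning the first bin itself, which the embodiment leaves unspecified.

```python
def derive_threshold_bin(bin_counts, M):
    """Scan bins from the largest-response bin downward, accumulating
    counts Qt; when Qt first exceeds M, return the index of the bin
    visited one round earlier (steps S32-S35).
    bin_counts[i] is the count of bin i; higher i = larger responses."""
    q = 0
    prev_bin = len(bin_counts) - 1        # highest bin to start
    for b in range(len(bin_counts) - 1, -1, -1):
        q += bin_counts[b]                # step S32: Qt accumulation
        if q > M:                         # step S33: compare with M
            return prev_bin               # step S35: previous round's bin
        prev_bin = b                      # step S34: next round
    return 0  # fewer than M + 1 feature points: fall back to the lowest bin
```

With bin counts (bin 0 to bin 4) of 1, 2, 4, 0, 1 and M = 6, the cumulative sums are 1, 1, 5, 7; Q4 = 7 first exceeds M in the fourth round, so the threshold is the bin of the third round, bin 2, matching the example in the text.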
As described above, through the feature amount generation process, the image processing device 10 detects feature points from the captured image, generates a histogram of the feature points' response values, sets the threshold value Th for the response value R from the histogram, and generates feature amounts for the feature points whose response values R are equal to or greater than the threshold value Th. Feature amounts can therefore be generated without any sorting process. Because no sorting is performed, the hardware accelerator A of the image processing device 10 can be configured as a small circuit with low latency. The internal data-storage memory of the image processing device 10 can also be made small.
The feature amount transmission unit 23 of the image processing device 10 transmits the generated feature amounts to the security device 90, and the security device 90 determines what the object 100 shown in the image V is based on the received feature amounts and notifies the security guard, who is the user.
[Modifications]
In the above embodiment, the feature point detector 30 of the image processing device 10 is the hardware accelerator A, but the part implemented as a hardware accelerator may be, for example, only the histogram generation, the histogram generation together with the threshold derivation, or some other range. The range implemented as a hardware accelerator may be decided as needed, and the range excluded from the hardware accelerator may be covered by the control unit and the storage unit. For example, as shown in FIG. 8, an image processing device 10A may be provided with a hardware accelerator B having an accelerator storage unit 40B. Alternatively, an image processing device may be configured with two hardware accelerators A so that the threshold can be derived at an even higher speed.
In the above embodiment, Harris corner detection is used, but other corner detection methods, such as Plessey corner detection or Moravec corner detection, may also be used.
Also, in the above embodiment, feature amounts are generated by SIFT, but they may instead be generated by ORB (Oriented FAST and Rotated BRIEF), SURF (Speeded-Up Robust Features), or the like.
In the above embodiment, the image processing device 10 is used with a security device, but it may also be linked with, for example, a factory line sensor or an in-vehicle camera.
Also, in the above embodiment, the image processing device 10 generates a histogram of the response values R, but another frequency distribution, for example a frequency distribution table, may be used.
In the above embodiment, M is 10 percent of the number of feature points in the feature point list, but it may be any value other than 10 percent, or M may be determined by the size of the image.
In the above embodiment, when the number of detected feature points is insufficient, so that M cannot be determined and hence the threshold value Th cannot be determined, the threshold value Th may be determined, for example, by resetting the threshold value F in step S5 and detecting additional feature points.
Each function of the image processing device 10 of the present invention can also be implemented by a computer such as an ordinary PC (Personal Computer). Specifically, in the above embodiment, the programs for the exposure correction processing and the image processing performed by the image processing device 10 are described as being stored in advance in the ROM of the storage unit 40. However, the programs may be stored and distributed on a computer-readable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a DVD (Digital Versatile Disc), or an MO (Magneto-Optical Disc), and a computer capable of realizing each of the above functions may be configured by reading and installing the programs on the computer.
 Although preferred embodiments of the present invention have been described above, the present invention is not limited to these specific embodiments; the present invention encompasses the inventions described in the claims and their equivalents.
 This application is based on Japanese Patent Application No. 2020-158335 filed on September 23, 2020. The specification, claims, and all drawings of Japanese Patent Application No. 2020-158335 are incorporated herein by reference.
 The present invention is applicable to hardware accelerators, image processing devices, and image processing methods suitable for object detection at a high frame rate.
1 ... security system; 10, 10A ... image processing device; 20 ... control unit; 21 ... feature point acquisition unit; 22 ... feature quantity generation unit; 23 ... feature quantity transmission unit; 30 ... feature point detector; 31 ... image acquisition unit; 32 ... corner detection unit; 33 ... histogram generation unit; 34 ... threshold derivation unit; 40 ... storage unit; 40B ... accelerator storage unit; 41, 41A ... captured image storage unit; 42, 42A ... feature point list storage unit; 43, 43A ... feature quantity storage unit; 44 ... various settings storage unit; 50 ... imaging unit; 51 ... imaging device; 52 ... drive device; 60 ... communication unit; 61 ... communication device; 70 ... display unit; 71 ... display device; 80 ... input unit; 81 ... input device; 90 ... security device; 100 ... object; 101 ... vehicle; 102 ... person; 103 ... article; A, B ... hardware accelerator; P ... imaging range; V ... image

Claims (8)

  1.  A hardware accelerator comprising:
     a frequency distribution derivation unit that derives, from an image, a frequency distribution of response values of feature points; and
     a threshold value derivation unit that derives a threshold value based on the frequency distribution,
     wherein the hardware accelerator causes an image processing apparatus to generate feature quantities of feature points having response values equal to or greater than the threshold value.
  2.  The hardware accelerator according to claim 1, wherein the threshold value is determined according to the number of the detected feature points.
  3.  The hardware accelerator according to claim 1, wherein the threshold value is determined according to the size of the image.
  4.  The hardware accelerator according to any one of claims 1 to 3, wherein the frequency distribution is represented by a histogram.
  5.  The hardware accelerator according to claim 2, wherein, when the number of the detected feature points is insufficient for the threshold value to be determined, the threshold value is determined by detecting further feature points.
  6.  The hardware accelerator according to claim 4, wherein the widths of the classes of the histogram increase or decrease exponentially.
  7.  An image processing apparatus comprising:
     the hardware accelerator according to any one of claims 1 to 6;
     a control unit that generates feature quantities for feature points at or above the threshold value; and
     a storage unit that stores the image, the response values, and the feature quantities.
  8.  An image processing method comprising:
     causing a hardware accelerator to derive, from an image, a frequency distribution of response values of feature points, and to derive a threshold value based on the frequency distribution; and
     causing an image processing apparatus to generate feature quantities of feature points having response values equal to or greater than the threshold value.
PCT/JP2021/028661 2020-09-23 2021-08-02 Hardware accelerator, image processing device, and image processing method WO2022064854A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020158335A JP7200983B2 (en) 2020-09-23 2020-09-23 HARDWARE ACCELERATOR, IMAGE PROCESSING DEVICE, AND FEATURE POINT DETECTION METHOD
JP2020-158335 2020-09-23

Publications (1)

Publication Number Publication Date
WO2022064854A1 true WO2022064854A1 (en) 2022-03-31

Family

ID=80845051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/028661 WO2022064854A1 (en) 2020-09-23 2021-08-02 Hardware accelerator, image processing device, and image processing method

Country Status (2)

Country Link
JP (1) JP7200983B2 (en)
WO (1) WO2022064854A1 (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014120056A (en) * 2012-12-18 2014-06-30 Olympus Corp Image processing apparatus and image processing method

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014120056A (en) * 2012-12-18 2014-06-30 Olympus Corp Image processing apparatus and image processing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
FUJIYOSHI, HIRONOBU: "Large-Scale Data Processing: Image Local Feature Value SIFT and the Latest Approaches", JOURNAL OF JAPANESE SOCIETY FOR ARTIFICIAL INTELLIGENCE, vol. 25, no. 6, 1 November 2010 (2010-11-01), pages 753 - 760, XP009535211, ISSN: 2188-2266 *

Also Published As

Publication number Publication date
JP7200983B2 (en) 2023-01-10
JP2022052139A (en) 2022-04-04

Similar Documents

Publication Publication Date Title
CN109961009B (en) Pedestrian detection method, system, device and storage medium based on deep learning
US20190156157A1 (en) Information processing apparatus, information processing method, and non-transitory computer-readable storage medium
Pang et al. Distributed object detection with linear SVMs
WO2019041519A1 (en) Target tracking device and method, and computer-readable storage medium
US9697442B2 (en) Object detection in digital images
EP2336949B1 (en) Apparatus and method for registering plurality of facial images for face recognition
CN107424160A (en) The system and method that image center line is searched by vision system
EP2864933A1 (en) Method, apparatus and computer program product for human-face features extraction
US9058655B2 (en) Region of interest based image registration
WO2021090771A1 (en) Method, apparatus and system for training a neural network, and storage medium storing instructions
CN112767354A (en) Defect detection method, device and equipment based on image segmentation and storage medium
US10891740B2 (en) Moving object tracking apparatus, moving object tracking method, and computer program product
CN105208263B (en) Image processing apparatus and its control method
CN114390201A (en) Focusing method and device thereof
CN109816628B (en) Face evaluation method and related product
CN110569921A (en) Vehicle logo identification method, system, device and computer readable medium
WO2022064854A1 (en) Hardware accelerator, image processing device, and image processing method
CN111080683B (en) Image processing method, device, storage medium and electronic equipment
US11238309B2 (en) Selecting keypoints in images using descriptor scores
CN110619304A (en) Vehicle type recognition method, system, device and computer readable medium
Prasertsakul et al. Camera operation estimation from video shot using 2D motion vector histogram
CN114119423A (en) Image processing method, image processing device, electronic equipment and storage medium
Ashiba Dark infrared night vision imaging proposed work for pedestrian detection and tracking
Tzou et al. Detect safety net on the construction site based on YOLO-v4
CN114979607B (en) Image processing method, image processor and electronic equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 21871986; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 21871986; Country of ref document: EP; Kind code of ref document: A1