WO2022264332A1 - Registration device, registration method, and program - Google Patents


Info

Publication number
WO2022264332A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
similarity
unit
failure
teacher
Prior art date
Application number
PCT/JP2021/022919
Other languages
English (en)
Japanese (ja)
Inventor
俊介 金井
晴久 野末
憲男 山本
文香 浅井
健一 田山
Original Assignee
日本電信電話株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電信電話株式会社
Priority to JP2023528857A (JPWO2022264332A1)
Priority to PCT/JP2021/022919 (WO2022264332A1)
Publication of WO2022264332A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00: Machine learning

Definitions

  • the embodiments relate to a registration device, a registration method, and a program.
  • the failure history information includes, for example, location of failure, cause of failure, and coping method for failure.
  • Embodiments provide a registration device, a registration method, and a program for reliably and easily registering information for creating rules for estimating failures.
  • the registration device of the embodiment includes an acquisition unit, a first calculation unit, a determination unit, a reception unit, a second calculation unit, and a registration unit.
  • the acquisition unit acquires teacher data including information about failures, and acquires operation data including information about operations of one or more devices.
  • the first calculation unit calculates a first degree of similarity between the teacher data and the operational data for each failure.
  • the determination unit determines, for each piece of operational data, teacher data having a first degree of similarity equal to or greater than a threshold.
  • the reception unit receives correct data based on the teacher data whose first degree of similarity is equal to or greater than the threshold.
  • the second calculation unit calculates second degrees of similarity between the correct data and the teacher data using a plurality of similarity calculation methods.
  • the registration unit registers, as new teacher data, the correct data corresponding to the highest degree of similarity among the plurality of second degrees of similarity.
  • the embodiment can reliably and easily register information for creating rules for estimating failures.
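The acquisition, first-similarity, determination, second-similarity, and registration steps summarized in the bullets above can be sketched as follows. This is purely an illustration, not part of the patent: the Jaccard token-overlap similarity stands in for the patent's unspecified similarity calculation methods, and all function and variable names are assumptions.

```python
def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity in [0, 1] (stand-in for a concrete method)."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def candidate_teacher_data(teacher_db, operational_data, threshold):
    """First similarity: keep teacher data whose similarity to the
    operational data is equal to or greater than the threshold."""
    return [t for t in teacher_db if jaccard(t, operational_data) >= threshold]

def register_correct_data(teacher_db, correct_data, methods):
    """Second similarity: score the correct data against all teacher data
    with several methods, then register it as new teacher data."""
    scores = {name: max(fn(correct_data, t) for t in teacher_db)
              for name, fn in methods.items()}
    teacher_db.append(correct_data)       # registration step
    return max(scores, key=scores.get)    # method yielding the highest similarity
```

In this sketch the user's selection of correct data is elided; the point is only the two-stage similarity flow ending in registration.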
  • FIG. 1 is a diagram showing the hardware configuration of a failure information registration device according to an embodiment.
  • FIG. 2 is a diagram illustrating functions of the fault information registration device according to the embodiment.
  • FIG. 3 is a diagram showing an example of teacher data included in the teacher DB of FIG.
  • FIG. 4 is a diagram illustrating a part of operational data acquired by the failure information registration device according to the embodiment;
  • FIG. 5 is a diagram showing an example of the degree of similarity between teacher data and operational data.
  • FIG. 6 is a flowchart showing selection of teacher data, which is an example of the operation of the failure information registration device according to the embodiment.
  • FIG. 7 is a flowchart showing candidate calculation of teacher data, which is an example of the operation of the failure information registration device according to the embodiment.
  • FIG. 8 is a diagram showing an example of information displayed in step S711 of FIG. 7.
  • FIG. 9 is a flowchart showing determination and learning of correct data, which is an example of the operation of the failure information registration device according to the embodiment.
  • FIG. 10 is a diagram showing an example of a concept for explaining steps S901 and S902 in FIG. 9.
  • FIG. 11 is a diagram showing an example of a concept for explaining step S909 in FIG. 9.
  • FIG. 12 is a flow chart showing a modification of the operation from the start of FIG. 9 to step S904.
  • the fault information registration device 100 of this embodiment includes a processor 101, a ROM 102, a RAM 103, an interface 104, a display 105, and a storage 106.
  • the processor 101 is a processing device that controls the fault information registration device 100 as a whole.
  • the processor 101 is, for example, a CPU (Central Processing Unit).
  • the processor 101 is not limited to a CPU.
  • an ASIC (Application Specific Integrated Circuit), for example, may be used instead.
  • the number of processors 101 may be two or more instead of one.
  • the ROM 102 is a read-only storage device.
  • the ROM 102 stores firmware and various programs necessary for the operation of the fault information registration device 100.
  • the RAM 103 is an arbitrarily writable storage device.
  • the RAM 103 is used as a work area for the processor 101 and temporarily stores the firmware and the like stored in the ROM 102.
  • the interface 104 is a device for exchanging information with an external device.
  • the interface 104 receives, for example, operational data, teacher data, and input from the user.
  • the interface 104 may also transmit and receive information to and from an external server or the like.
  • the display 105 is a display device that displays various screens.
  • the display 105 may be a liquid crystal display, an organic EL display, or the like.
  • the display 105 may include a touch panel.
  • the storage 106 is a storage device such as a hard disk.
  • the storage 106 stores, for example, various applications executed by the processor 101, data used as input to the applications, and data obtained by executing the applications.
  • the failure information registration device 100 of the present embodiment includes, as functional blocks, a teacher data acquisition unit 201, a teacher DB (hereinafter, database is abbreviated as DB) 202, a teacher data selection unit 203, an operational data acquisition unit 204, a similarity calculation method and threshold DB 205, a similarity calculation unit 206, a corresponding data determination unit 207, a corresponding data display unit 208, a correct data reception unit 209, a correct data extraction unit 210, and a teacher data registration unit 211.
  • the teacher data acquisition unit 201, the operational data acquisition unit 204, and the correct data reception unit 209 are realized by the interface 104, for example.
  • the teacher data selection unit 203, the similarity calculation unit 206, the corresponding data determination unit 207, the correct data extraction unit 210, and the teacher data registration unit 211 are implemented by the processor 101, the ROM 102, the RAM 103, and the storage 106, for example.
  • the teacher DB 202 and the similarity calculation method and threshold DB 205 are realized by the storage 106, for example.
  • the corresponding data display unit 208 is realized by the display 105, for example.
  • the teacher data acquisition unit 201 acquires teacher data from an external DB or the like, connecting directly or via a network.
  • the teacher data includes information for creating rules for estimating the location of a failure (also called the failure location).
  • the teacher data includes, for example, at least one of information on one or more failures, information on coping with the failure, or information on recovery from the failure.
  • the failure location indicates a location on the network and is identified by, for example, at least one of a host name and an IP address. The teacher data will be explained later with reference to FIG. 3.
  • the teacher DB 202 stores the teacher data acquired by the teacher data acquisition unit 201.
  • if the degree of similarity between teacher data is equal to or greater than a threshold, the teacher data selection unit 203 regards the data as having a common portion and processes them as a single piece of teacher data. On the other hand, if the degree of similarity between the teacher data is less than the threshold, the teacher data selection unit 203 treats each teacher data as separate teacher data, assuming there is no common portion. For example, when a data portion included in the teacher data is common, the teacher data selection unit 203 extracts only the common portion as new teacher data and deletes the data before extraction.
  • the teacher data selection unit 203 maintains the teacher data stored in the teacher DB 202 using a similarity calculation method obtained from the similarity calculation method and threshold DB 205. As a result of this maintenance, the failure information registration device 100 can register information for creating rules that accurately estimate the location of a failure.
  • the operational data acquisition unit 204 acquires operational data from an external DB or the like via a network or by direct connection.
  • Operational data includes information regarding the operation of one or more devices.
  • Operational data includes, for example, at least one of information on failure of one or more devices, information on dealing with this failure, or information on recovery from this failure.
  • the operational data acquisition unit 204 outputs the acquired operational data to the similarity calculation unit 206 as soon as it is received. Alternatively, the operational data acquisition unit 204 may output the operational data to the similarity calculation unit 206 once a predetermined amount of operational data has accumulated in a buffer (for example, within the operational data acquisition unit 204). Operational data will be described later with reference to FIG. 4.
  • the similarity calculation method and threshold DB 205 stores method data on methods for calculating the similarity between teacher data, the similarity between teacher data and operational data (first similarity), or the similarity between correct data and teacher data (second similarity), together with threshold data giving the threshold for each similarity calculation method.
  • the correct data includes correct-answer information for resolving the failure.
  • these method data and threshold data are preset and stored in the similarity calculation method and threshold DB 205.
  • the similarity calculation method and threshold DB 205 may also store method data and threshold data set based on instructions from the similarity calculation unit 206.
  • the similarity calculation unit 206 calculates the similarity between the teacher data obtained from the teacher DB 202 and the operational data obtained from the operational data acquisition unit 204, using a similarity calculation method obtained from the similarity calculation method and threshold DB 205.
  • the similarity calculation unit 206 calculates one or more degrees of similarity between one piece of operational data and the one or more teacher data stored in the teacher DB 202.
  • the similarity calculation unit 206 sets the method data or the threshold data in the similarity calculation method and threshold DB 205 based on an instruction from the correct data extraction unit 210.
  • the similarity calculation unit 206, as a method setting unit, sets the similarity calculation method that calculates the highest similarity value, and, as a threshold setting unit, sets the threshold for this similarity based on that value.
  • the threshold is set, for example, to a certain percentage (e.g., 90%) of the highest similarity calculated by the corresponding similarity calculation method.
  • the similarity calculation unit 206 calculates the similarity between the correct data extracted by the correct data extraction unit 210 and the teacher data, using the similarity calculation method corresponding to that correct data and teacher data. Since correct data are determined for each of the plural similarity calculation methods, the similarity between the correct data and the teacher data is calculated once for each similarity calculation method.
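The method setting and threshold setting described above can be sketched as follows. Both similarity functions are illustrative stand-ins (the patent does not name concrete methods), and the 90% ratio follows the example in the text:

```python
def token_sim(a, b):
    """Illustrative method 1: token-overlap (Jaccard) similarity."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def char_sim(a, b):
    """Illustrative method 2: character-set overlap similarity."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def set_method_and_threshold(correct_data, teacher_db, methods, ratio=0.90):
    """Choose the similarity calculation method that yields the highest
    similarity between the correct data and any teacher data, and set
    its threshold to a fixed percentage (e.g. 90%) of that value."""
    best_name, best_sim = max(
        ((name, max(fn(correct_data, t) for t in teacher_db))
         for name, fn in methods.items()),
        key=lambda pair: pair[1])
    return best_name, best_sim * ratio
```

The returned pair corresponds to what the method setting unit and threshold setting unit would store in the similarity calculation method and threshold DB 205.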
  • the corresponding data determination unit 207 determines, for each combination of teacher data and operational data, whether the similarity calculated by the one or more similarity calculation methods set by the similarity calculation unit 206 is equal to or greater than the set threshold. For example, the corresponding data determination unit 207 determines, for the location where a failure is estimated to have occurred as identified from the operational data (for example, the position of the failure device) and for the locations around the failure location (for example, the positions of the peripheral devices), whether the similarity between the operational data and one or more teacher data is equal to or greater than the threshold (see FIG. 8).
  • the corresponding data determination unit 207 also identifies the correct data received by the correct data reception unit 209 and passes it to the correct data extraction unit 210. Further, the corresponding data determination unit 207 receives, from the similarity calculation unit 206, the similarity calculated by the similarity calculation method corresponding to the correct data and the teacher data. The corresponding data determination unit 207 passes the set of the correct data and the degree of similarity to the correct data extraction unit 210.
  • the corresponding data display unit 208 displays the content determined by the corresponding data determining unit 207.
  • the corresponding data display unit 208 shows, for each teacher data, whether or not the degree of similarity between one or more operational data and one or more teacher data is equal to or greater than a threshold, at the location of the failure device and at the locations of the peripheral devices of the failure device.
  • the corresponding data display unit 208 displays, for example, the contents shown in FIG. 8, which will be described later. Note that the display need not serve as the presentation unit: the information may instead be presented by voice to the user or to a device acting in place of the user. As long as the information reaches the user or such a device, the corresponding data display unit 208 may convey it by any other visual or audible means.
  • the correct data reception unit 209 accepts instruction information by which the user, or a device acting on behalf of the user, selects the teacher data assumed to be most relevant to the correct answer according to the contents displayed by the corresponding data display unit 208 (the presence or absence of a log for each teacher data). The correct data reception unit 209 then receives the correct data determined as corresponding to the teacher data selected by the user or the device representing the user. Correct data is log information corresponding to teacher data. Log information is information output as a log related to each device, and is, for example, Syslog information.
  • the correct data extraction unit 210 extracts one or more correct data acquired from the corresponding data determination unit 207 and instructs the similarity calculation unit 206 to calculate the similarity between the one or more correct data and the teacher data using a plurality of similarity calculation methods.
  • the correct data extraction unit 210 extracts the set having the maximum similarity from the one or more sets of correct data and similarity received from the corresponding data determination unit 207.
  • the correct data extraction unit 210 passes the correct data included in the extracted set having the maximum similarity to the teacher data registration unit 211.
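The extraction performed by the correct data extraction unit 210 reduces to picking the set with the maximum similarity; a minimal sketch (names and data are illustrative assumptions):

```python
def extract_best_correct_data(pairs):
    """From one or more (correct data, similarity) sets, return the
    correct data belonging to the set with the maximum similarity."""
    best_data, _best_sim = max(pairs, key=lambda pair: pair[1])
    return best_data
```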
  • the teacher data registration unit 211 registers the correct data received from the correct data extraction unit 210 in the teacher DB 202 as new teacher data.
  • the teacher data includes at least one of information on failure, information on coping with this failure, or information on recovery from this failure.
  • the teacher data 1 shown in FIG. 3 includes information on failure "failure A” and information on coping with the failure "restart port: ID”.
  • the teacher data 2 shown in FIG. 3 includes information about the failure "failure A” and information about the recovery of the failure "restart-OK”.
  • the teacher data 3 shown in FIG. 3 includes information about a failure called “failure D" and information about how to deal with the failure called "re-insert card: ID”.
  • the teacher data 4 shown in FIG. 3 includes information about the failure "failure D” and information about the recovery of the failure "re-insert-OK”.
  • "failure A" and "failure D" are associated with, for example, information specifying the state of the failure.
  • "failure A" indicates information about the state of the failure.
  • "restart port: ID" indicates that the port with the given ID is to be restarted.
  • the ID is a variable, and corresponds to a specific ID number (for example, a natural number) in the operational data.
  • "restart-OK" indicates that the relevant device (the device corresponding to "failure A" in FIG. 3) has restarted (recovered).
  • "re-insert card: ID" indicates that the failure is handled by reinserting the card with the given ID.
  • "re-insert-OK" indicates that reinsertion into the corresponding device (the device corresponding to "failure D" in FIG. 3) succeeded (recovered).
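The four teacher-data records of FIG. 3, and the substitution of the ID variable by a concrete number taken from the operational data, can be represented as follows. The field names and the two-digit zero-padding are assumptions for illustration:

```python
# Teacher data records as described for FIG. 3.
teacher_data = [
    {"failure": "failure A", "coping": "restart port: ID"},    # teacher data 1
    {"failure": "failure A", "recovery": "restart-OK"},        # teacher data 2
    {"failure": "failure D", "coping": "re-insert card: ID"},  # teacher data 3
    {"failure": "failure D", "recovery": "re-insert-OK"},      # teacher data 4
]

def instantiate(template: str, id_number: int) -> str:
    """Replace the ID variable with the concrete number found in the
    operational data (e.g. 'restart port: ID' -> 'restart port: 03')."""
    return template.replace("ID", f"{id_number:02d}")
```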
  • Operational data includes at least one of information about the failure of a certain device or information about the content of the failure.
  • Information about a failure of a device includes, for example, the date and time when the failure occurred, the location of the failure (host name, IP address, etc.), and the degree of urgency to deal with the failure (Emerg, Alert, Notice, Info, etc.).
  • the information about the content of the failure includes, for example, at least one of information about how to deal with this failure and information about recovery from this failure.
  • regarding operational data, attention will be focused on the location of the failure, information on how to deal with the failure, and information on recovery from the failure, which are the items mainly used in this embodiment. Operational data is therefore described in this embodiment as including at least one of these pieces of information of interest. Note that, in addition to the information of interest, operational data also includes the information indicated above even where not explicitly stated.
  • operational data 1 shown in FIG. 4 includes information indicating that a failure has occurred in "device A" at location "XX.XX.XX.XX" and information regarding the handling of the failure, "restart port: 03".
  • operational data 2 shown in FIG. 4 includes information indicating that a failure has occurred in "device A" at location "XX.XX.XX.X" and information regarding failure recovery, "restart-OK".
  • operational data 3 shown in FIG. 4 includes information indicating that a failure has occurred in "device Z" at location "ZZ.ZZ.XX.YY" and information regarding failure handling, "re-insert card: 04". Operational data 4 shown in FIG. 4 includes information indicating that a failure has occurred in "device Z" at location "ZZ.ZZ.XX.YY" and information regarding failure recovery, "re-insert-NG" (in this case, a recovery failure).
  • "restart port: 03" indicates that the specific number 03 is specified and that port 03 will be restarted.
  • "re-insert card: 04" indicates that the specific number 04 is specified and that card 04 will be reinserted.
  • "re-insert-NG" indicates that reinsertion into the corresponding device (device Z in FIG. 4) did not recover the failure.
  • the similarity calculation unit 206 calculates, for each operational data, the similarity with all the teacher data in the teacher DB 202. The larger the similarity, the more similar the two data are. As shown in FIG. 5, in this embodiment the similarity is calculated between 0 and 1, for example: the closer the similarity is to 1, the more similar the operational data and the teacher data are, and the closer it is to 0, the less similar they are.
  • for operational data 1, the degree of similarity with teacher data 1 is the highest at 0.97, and the similarity with teacher data 3 is the lowest at 0.25. Since the difference between 0.97 and 1 is smaller than the difference between 0.25 and 0, it is reasonable to adopt the similarity of 0.97. Therefore, operational data 1 is determined to be similar to teacher data 1. That is, according to FIG. 5, it is known that failure A indicated by teacher data 1 occurred in device A according to operational data 1, and that the coping "restart port: 03" of operational data 1 was carried out in response to the coping "restart port: ID" of teacher data 1.
  • for operational data 2, the degree of similarity with teacher data 2 is the highest at 1.00, and the similarity with teacher data 4 is the lowest at 0.32. Since the difference between 1.00 and 1 is smaller than the difference between 0.32 and 0, it is reasonable to adopt the similarity of 1.00. Therefore, operational data 2 is determined to be similar to teacher data 2. That is, according to FIG. 5, it is known that failure A indicated by teacher data 2 occurred in device A according to operational data 2, and that "restart-OK" of operational data 2 was carried out in response to the recovery "restart-OK" of teacher data 2.
  • for operational data 3, the degree of similarity with teacher data 3 is the highest at 0.97, and the similarity with teacher data 1 is the lowest at 0.25. Since the difference between 0.97 and 1 is smaller than the difference between 0.25 and 0, it is reasonable to adopt the similarity of 0.97. Therefore, operational data 3 is determined to be similar to teacher data 3. That is, according to FIG. 5, it is known that failure D indicated by teacher data 3 occurred in device Z according to operational data 3, and that "re-insert card: 04" of operational data 3 was carried out in response to the coping "re-insert card: ID" of teacher data 3.
  • for operational data 4, the degree of similarity with teacher data 3 is the highest at 0.53, and the similarity with teacher data 4 is the lowest at 0.21. Since the difference between 0.53 and 1 is greater than the difference between 0.21 and 0, it is reasonable to adopt the similarity of 0.21. Therefore, operational data 4 is determined to be hardly similar to teacher data 4. That is, according to FIG. 5, it is found that failure D indicated by teacher data 4 occurred in device Z according to operational data 4, but the recovery "re-insert-OK" of teacher data 4 was not achieved, resulting instead in "re-insert-NG".
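The four determinations above can be reproduced directly from the FIG. 5 values. The dictionary below simply encodes the similarities quoted in the text (values not quoted are omitted), and the 0.9 threshold is an assumed cut-off that yields the same conclusions:

```python
# Similarities quoted in the text from FIG. 5.
fig5 = {
    "operational data 1": {"teacher data 1": 0.97, "teacher data 3": 0.25},
    "operational data 2": {"teacher data 2": 1.00, "teacher data 4": 0.32},
    "operational data 3": {"teacher data 3": 0.97, "teacher data 1": 0.25},
    "operational data 4": {"teacher data 3": 0.53, "teacher data 4": 0.21},
}

def most_similar(scores, threshold=0.9):
    """Return the best-matching teacher data, or None when even the
    highest similarity stays below the threshold (the operational data 4 case)."""
    name, sim = max(scores.items(), key=lambda kv: kv[1])
    return name if sim >= threshold else None
```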
  • in step S601, the teacher data selection unit 203 reads the teacher data from the teacher DB 202.
  • in step S602, the teacher data selection unit 203 determines whether there are teacher data with the same number. Each teacher data is numbered. If the teacher data selection unit 203 determines that there are teacher data with the same number, the process proceeds to step S603; if it determines that there are none, the process proceeds to step S607.
  • the teacher data includes, for example, a number with three parts: manufacturer number_log number_set number.
  • in step S603, the teacher data selection unit 203 calculates the degree of similarity between teacher data belonging to the same group, that is, those determined to have the same number.
  • a certain similarity calculation method from the similarity calculation method and threshold DB 205 is adopted as the method for calculating this similarity.
  • the teacher data selection unit 203 may employ, for example, a similarity calculation method different from the one used to calculate the similarity between teacher data and operational data.
  • the teacher data selection unit 203 may employ, for example, the similarity calculation method related to the correct data used when the teacher data registration unit 211 registers the teacher data.
  • in step S604, the teacher data selection unit 203 compares the average similarity calculated in step S603 with a threshold.
  • the threshold associated with the similarity calculation method used in step S603 is used.
  • the average similarity is calculated from the similarities calculated for each item of the teacher data.
  • the teacher data shown in FIG. 3 have three items, failure, coping, and recovery, and the degree of similarity is calculated for each item.
  • the average value of the similarity between teacher data 1 and teacher data 3 is the average of the failure similarity and the coping similarity.
  • the average value of the similarity between teacher data 1 and teacher data 2 equals the failure similarity, because the only item to be compared is the failure.
  • in step S604, the teacher data selection unit 203 proceeds to step S605 if the average similarity is equal to or greater than the threshold, and proceeds to step S606 if it is smaller than the threshold.
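The item-wise averaging in step S604 can be sketched as below. Teacher data are represented as dictionaries with the three items named in the text, and the per-item similarity is left abstract (here a trivial equality check, purely for illustration):

```python
def equal_sim(a, b):
    """Illustrative per-item similarity: 1.0 if identical, else 0.0."""
    return 1.0 if a == b else 0.0

def average_similarity(t1, t2, sim=equal_sim):
    """Average the similarities over the items (failure / coping /
    recovery) that both teacher data actually contain (step S604)."""
    shared = [k for k in t1 if k in t2]
    return sum(sim(t1[k], t2[k]) for k in shared) / len(shared)
```

For teacher data 1 and 2 of FIG. 3 only the failure item is shared, so the average equals the failure similarity, matching the bullet above.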
  • in step S605, the teacher data selection unit 203 sets the common part of the teacher data as teacher data.
  • the teacher data selection unit 203 may add information indicating that the teacher data have a common part.
  • a common part is the same information in the same item of the teacher data. For example, assume that one teacher data includes "Link Down" and the other teacher data includes "Link is Down" in the same item. In this case, the teacher data selection unit 203 determines that the common part is "Link Down" and sets this item of the teacher data uniformly to "Link Down". The teacher data "Link Down" and "Link is Down" are then both regarded as "Link Down" when the similarity is calculated.
  • in step S605, if there is no common part, the teacher data selection unit 203 performs the same processing as in step S606.
  • in step S606, the teacher data selection unit 203 sets each teacher data as teacher data as it is, without integrating them.
  • the teacher data selection unit 203 may add information indicating that the teacher data have no common part. For example, assume that one teacher data includes "link is broken" and the other teacher data includes "link is down" in the same item. In this case, the teacher data selection unit 203 sets both "link is broken" and "link is down" as teacher data. In step S606, the teacher data selection unit 203 can be interpreted as having determined that the teacher data have no common part.
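The integration in steps S605 and S606 can be sketched as follows. The Jaccard similarity and the 0.6 threshold are assumptions chosen so that the two examples above ("Link Down" vs. "Link is Down", and "link is broken" vs. "link is down") fall on opposite sides of the threshold:

```python
def jaccard(a, b):
    """Token-overlap similarity used as the stand-in method."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def common_part(a, b):
    """Tokens of `a` that also occur in `b`, kept in order."""
    tb = set(b.split())
    return " ".join(tok for tok in a.split() if tok in tb)

def select_teacher_data(a, b, threshold=0.6):
    """Step S605: integrate into the common part when similar enough;
    step S606: otherwise keep both teacher data as they are."""
    if jaccard(a, b) >= threshold:
        return [common_part(a, b)]
    return [a, b]
```

Note that the decision to integrate is driven by the similarity-versus-threshold check, not by the mere existence of shared tokens.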
  • in step S607, the teacher data selection unit 203 determines whether there are teacher data with the same number in another category (for example, another manufacturer). If the teacher data selection unit 203 determines that there are such teacher data, the process returns to step S601; otherwise, the processing in FIG. 6 ends.
  • in step S701, the similarity calculation unit 206 reads a similarity calculation method from the similarity calculation method and threshold DB 205, and in step S702, the similarity calculation unit 206 reads teacher data from the teacher DB 202.
  • as for the similarity calculation method read in step S701, if a similarity calculation method has already been determined in step S909 of FIG. 9, that method is read.
  • otherwise, the default similarity calculation method, or the similarity calculation method selected by the user or a device acting on behalf of the user, is loaded. Note that the processing order of steps S701 and S702 may be reversed.
  • in step S703, the similarity calculation unit 206 acquires operational data.
  • in step S704, the similarity calculation unit 206 determines whether or not the teacher data in the teacher DB 202 have a common part. That is, the similarity calculation unit 206 determines whether the teacher data were set in step S605 (there is a common part) or in step S606 (there is no common part). If the similarity calculation unit 206 determines that the teacher data have a common part, the process advances to step S705; if it determines that they have no common part, the process advances to step S706.
  • in step S705, the similarity calculation unit 206 calculates the similarity between one piece of operational data and the teacher data having a common part.
  • in step S706, the similarity calculation unit 206 calculates the similarities between one piece of operational data and the plurality of teacher data having no common part.
  • the similarity calculation unit 206 then extracts the maximum similarity from these multiple similarities.
  • in step S707, the threshold for the similarity calculation method read in step S701 is read from the similarity calculation method and threshold DB 205.
  • as for the threshold that is read: if a threshold has already been determined in step S909 of FIG. 9, that threshold is read; if step S909 has not yet been performed, the default threshold, or a threshold specified by the user or a device acting on behalf of the user, is read.
  • in step S708, the corresponding data determination unit 207 determines the location of the failure device identified from the operational data as the estimated failure location, together with the positions of the peripheral devices of this failure device.
  • in step S709, the corresponding data determination unit 207 determines, for each set of teacher data and operational data, whether the similarity calculated by the similarity calculation method is equal to or greater than the threshold read in step S707. If the corresponding data determination unit 207 determines that the degree of similarity is equal to or greater than the threshold, the process proceeds to step S710; if it determines that the degree of similarity is less than the threshold, the process proceeds to step S711.
  • in step S710, the corresponding data determination unit 207 determines that the corresponding teacher data "has a log".
  • in step S711, the corresponding data determination unit 207 determines that the corresponding teacher data has "no log".
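Steps S709 to S711 amount to a per-teacher-data threshold test; a minimal sketch with assumed names:

```python
def log_presence(similarities, threshold):
    """Mark each teacher data 'has a log' when its similarity to the
    operational data is at or above the threshold (steps S709-S711)."""
    return {name: ("has a log" if sim >= threshold else "no log")
            for name, sim in similarities.items()}
```

The resulting per-teacher-data labels are what step S712 then puts on the display, as in the FIG. 8 table.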
  • in step S712, the corresponding data display unit 208 displays on the display 105, for each teacher data, whether or not the degree of similarity between the operational data and the teacher data is equal to or greater than the threshold, at the location of the failure device and at the locations of the peripheral devices of the failure device.
  • the corresponding data display unit 208 displays, for example, the table shown in FIG. 8 on the display 105.
  • in step S713, the corresponding data determination unit 207 determines whether there is other operational data.
  • the corresponding data determination unit 207 returns to step S703 if it determines that there is other operational data, and terminates the processing of FIG. 7 if it determines that there is none.
  • FIG. 8 is an example of a table displayed in step S712.
  • FIG. 8 is an example of a table when the cosine similarity calculation method is read as the similarity calculation method in step S701.
  • Teacher data determined in step S710 to have a log is displayed with "Yes" to the right of the data name; in FIG. 8 it is shown in bold and underlined.
  • Teacher data determined in step S711 to have no log is displayed with "no log" to the right of the data name.
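The "has a log" / "no log" labeling of steps S709 to S711 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, the dictionary of per-teacher-data similarity scores, and the threshold value are all assumptions introduced for clarity.

```python
# Illustrative sketch of steps S709-S711: each piece of teacher data is
# labeled "has a log" when its similarity to the current operational data
# meets the threshold for the chosen similarity calculation method, and
# "no log" otherwise. Names and values are hypothetical.

def label_teacher_data(similarities, threshold):
    """similarities: mapping of teacher-data name to similarity score."""
    return {name: "has a log" if score >= threshold else "no log"
            for name, score in similarities.items()}

labels = label_teacher_data(
    {"teacher data 1": 0.95, "teacher data 2": 0.40}, threshold=0.8)
```

A display step like S712 would then render these labels next to each data name, as in the table of FIG. 8.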
  • The correct data reception unit 209 receives, for each similarity calculation method, the teacher data selected by the user, or by a device acting on behalf of the user, with reference to the content displayed by the corresponding data display unit 208.
  • The correct data reception unit 209 receives, for example, information designating the teacher data selected for each similarity calculation method from among the teacher data shown in FIG. 8.
  • The user, or a device acting on behalf of the user, determines the correct data after obtaining the information necessary to determine it. In other words, when the network fails, the failure location is estimated, countermeasures are taken, and the correct data is determined and registered after recovery; by that time, the information constituting the correct data is already known.
  • In step S901, the correct data reception unit 209 accepts, as correct data, the data that the user, or a device acting in place of the user, has determined to correspond to the teacher data selected for each similarity calculation method. Then, in step S902, the corresponding data determination unit 207 identifies the correct data received by the correct data reception unit 209 and passes it to the correct data extraction unit 210.
  • The correct data reception unit 209 receives correct data selected for teacher data, as shown, for example, in FIG. 10. Note that this teacher data and this correct data contain the same contents as "teacher data 1", "teacher data 2", "teacher data 3", "teacher data 4", and so on shown in the table of FIG. 8. In addition, information on countermeasures included in the operational data of FIG. 4 (e.g., "restart port: 03") and information on countermeasures included in the teacher data of FIG. 3 (e.g., "restart port: 03") can also serve as correct data.
  • In step S903, the similarity calculation unit 206 calculates the similarity between the correct data extracted by the correct data extraction unit 210 and the teacher data, using the similarity calculation method that was used when the teacher data was received in step S901.
  • The similarity calculation unit 206 calculates the similarity between the correct data and the teacher data for each category of the teacher data, calculates the average of the similarities over all categories, and uses this average as the similarity between the correct data and the teacher data.
  • The categories are, for example, "failure", "countermeasure", and "recovery" described with reference to FIGS. 3 and 4.
  • An example of the similarity calculated in this way is shown in FIG. 5.
  • An example of the calculation of the similarity between the correct data and the teacher data for each category of the teacher data is shown in FIG. 11.
  • FIG. 11 shows the similarity calculated for each of the categories "failure", "countermeasure", and "recovery".
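The per-category averaging of step S903 can be illustrated with a small sketch. Cosine similarity is one of the methods the description names (in connection with FIG. 8); the whitespace tokenization, the function names, and the record layout are assumptions made for this example, not the patent's implementation.

```python
# Hedged sketch: compute a text similarity per category ("failure",
# "countermeasure", "recovery") and average the results, as in step S903.
# Bag-of-words cosine similarity over whitespace tokens is an assumption.
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    a, b = Counter(text_a.split()), Counter(text_b.split())
    dot = sum(a[t] * b[t] for t in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

CATEGORIES = ("failure", "countermeasure", "recovery")

def average_similarity(correct, teacher, categories=CATEGORIES):
    # Per-category similarities are averaged into a single score
    sims = [cosine_similarity(correct[c], teacher[c]) for c in categories]
    return sum(sims) / len(sims)
```

With identical category texts the average is 1.0, and with disjoint vocabularies it is 0.0, matching the intuition of the per-category table in FIG. 11.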
  • In step S904, the correct data extraction unit 210 determines whether, among the received correct data based on the teacher data corresponding to the similarity calculation method received in step S901, there exists correct data for which the similarity calculation unit 206 has not yet calculated the similarity. If the correct data extraction unit 210 determines that such correct data exists, the process returns to step S902; if it determines that no such correct data exists, the process proceeds to step S905.
  • In step S905, the corresponding data determination unit 207 obtains information on the similarity calculation method used by the similarity calculation unit 206 to calculate the similarity between the correct data and the teacher data, together with the threshold of this similarity calculation method, and passes these data to the correct data extraction unit 210. There are as many sets of these data as there are similarity calculation methods.
  • In step S906, the correct data extraction unit 210 instructs the similarity calculation unit 206, and the similarity calculation unit 206 determines whether a learned similarity calculation method and threshold are stored in the similarity calculation method and threshold DB 205. If the similarity calculation unit 206 determines that a learned similarity calculation method and threshold are stored, the process proceeds to step S907; if it determines that they are not stored, the process proceeds to step S908.
  • In step S907, the similarity calculation unit 206 stores, in the similarity calculation method and threshold DB 205, the average of the stored threshold of the learned similarity calculation method and the new threshold of the same method, as the threshold of this similarity calculation method. For a similarity calculation method different from any learned similarity calculation method, the similarity calculation unit 206 stores that method's threshold in the similarity calculation method and threshold DB 205 as-is.
  • In step S908, the similarity calculation unit 206 stores the threshold for each similarity calculation method in the similarity calculation method and threshold DB 205.
  • In step S909, the similarity calculation unit 206 calculates the similarity between the teacher data selected in step S901 and the correct data using all the similarity calculation methods. The similarity calculation unit 206 then learns the similarity calculation method yielding the maximum similarity among the similarities calculated in step S903, together with the threshold corresponding to this similarity calculation method, and stores them in the similarity calculation method and threshold DB 205.
  • In step S910, the correct data extraction unit 210 determines the correct data corresponding to the similarity calculation method determined in step S909, and the teacher data registration unit 211 adds and registers this correct data to the teacher DB 202 as teacher data.
  • As described above, in the present embodiment, teacher data including information about failures is prepared, and, based on teacher data having a high degree of similarity to operational data concerning the operation of one or more devices, correct data containing the correct failure information is received. The similarity between the correct data and the teacher data is then calculated by a plurality of similarity calculation methods, and the correct data corresponding to the highest similarity is registered as new teacher data.
  • According to the fault information registration device of the present embodiment, it is possible to reduce the resources required to input the many types of data necessary for learning rules that include fault causes and fault alarms.
  • Because these resources are reduced, the present embodiment has the effect of shortening the time required to create a more accurate database for resolving failures. Therefore, according to this embodiment, the time from failure recovery to learning is shortened.
  • In step S1201, the corresponding data determination unit 207 determines whether it can select teacher data based on the information already held by the fault information registration device 100, without receiving information from the user or a device acting on behalf of the user, and identify the correct data for that teacher data. If the corresponding data determination unit 207 determines that it can select the teacher data and identify its correct data based on the information already held by the fault information registration device 100, the process proceeds to step S1202; if it determines that it cannot, the process proceeds to step S901.
  • The corresponding data determination unit 207 makes this determination when the process has already proceeded to step S910 in the past and a similarity calculation method and threshold determined in step S909 exist.
  • The criterion for this determination is based on the number of times a similarity calculation method has been determined in step S909 and on the threshold of that method.
  • The criterion is, for example, that the same similarity calculation method has been determined consecutively in step S909 a first number of times or more, and that the thresholds of that similarity calculation method are all equal to or greater than a second value. If the criterion is satisfied in step S1201, the process proceeds to step S1202. More specifically, for example, the first value of the criterion is 5 and the second value is 0.9. These criteria may be changed as appropriate and are not limited to these values; for example, the first value may be 10 and the second value 0.8.
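The automation criterion of step S1201 can be sketched as a simple check over the history of step-S909 outcomes. The history format (a list of method/threshold pairs) and the function name are assumptions introduced for illustration.

```python
# Minimal sketch of the step-S1201 criterion: the same similarity
# calculation method must have been determined consecutively at least
# `first_value` times, and all of those thresholds must be at least
# `second_value`. The history representation is an assumption.

def can_automate(history, first_value=5, second_value=0.9):
    """history: list of (method_name, threshold) pairs from past runs of S909."""
    if len(history) < first_value:
        return False
    recent = history[-first_value:]
    same_method = len({method for method, _ in recent}) == 1
    thresholds_ok = all(t >= second_value for _, t in recent)
    return same_method and thresholds_ok
```

With the example values above (first value 5, second value 0.9), five consecutive selections of the same method with thresholds of at least 0.9 would allow the process to proceed to step S1202.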
  • In step S1202, the corresponding data determination unit 207 selects, for each similarity calculation method, the teacher data determined in step S710 to "have a log".
  • In step S902, instead of the teacher data for each similarity calculation method selected in step S901, the teacher data for each similarity calculation method selected by the corresponding data determination unit 207 in step S1202 is used.
  • The correct data reception unit 209 then receives the data determined, by a device acting in place of the user, to be the correct data corresponding to this teacher data.
  • At least one of the teacher DB 202 and the similarity calculation method and threshold DB 205 need not be included in the failure information registration device 100 and may be located outside it.
  • For example, at least one of the teacher DB 202 and the similarity calculation method and threshold DB 205 may be included in an external server or the like.
  • In this case, the failure information registration device 100 exchanges information with at least one of the teacher DB 202 and the similarity calculation method and threshold DB 205 via the interface 104.
  • The device of the embodiment can also be realized by a computer and a program, and the program can be recorded on a recording medium (or storage medium) or provided via a network.
  • Each of the above devices and their parts can be implemented either as a hardware configuration or as a combined configuration of hardware resources and software.
  • In the combined configuration, the software is a program that is installed in advance in a computer from a network or a computer-readable recording medium (or storage medium) and is executed by the computer's processor to realize the operations (or functions) of each device.
  • The present invention is not limited to the above-described embodiments and can be variously modified at the implementation stage without departing from the gist of the invention. The embodiments may also be combined as appropriate, in which case combined effects are obtained. Furthermore, the above embodiments include various inventions, which can be extracted by combinations selected from the plurality of disclosed constituent elements. For example, even if some constituent elements are deleted from all the constituent elements shown in the embodiments, as long as the problem can be solved and the effects obtained, the configuration with those constituent elements deleted can be extracted as an invention.


Abstract

A registration device according to an embodiment of the present invention comprises an acquisition unit, a first calculation unit, a determination unit, a reception unit, a second calculation unit, and a registration unit. The acquisition unit acquires teacher data including failure information, and acquires operation data including information on the operations of one or more devices. The first calculation unit calculates, for each failure, a first degree of similarity between the teacher data and the operation data. The determination unit determines, for each piece of operation data, teacher data whose first degree of similarity is equal to or greater than a threshold value. The reception unit receives correct data based on the teacher data whose first degree of similarity is equal to or greater than the threshold value. The second calculation unit uses a plurality of similarity calculation techniques to calculate second degrees of similarity between the correct data and the teacher data. The registration unit registers, as new teacher data, the correct data corresponding to the highest of the plurality of second degrees of similarity.
PCT/JP2021/022919 2021-06-16 2021-06-16 Dispositif d'enregistrement, procédé d'enregistrement, et programme WO2022264332A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023528857A JPWO2022264332A1 (fr) 2021-06-16 2021-06-16
PCT/JP2021/022919 WO2022264332A1 (fr) 2021-06-16 2021-06-16 Dispositif d'enregistrement, procédé d'enregistrement, et programme

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/022919 WO2022264332A1 (fr) 2021-06-16 2021-06-16 Dispositif d'enregistrement, procédé d'enregistrement, et programme

Publications (1)

Publication Number Publication Date
WO2022264332A1 true WO2022264332A1 (fr) 2022-12-22

Family

ID=84527304

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/022919 WO2022264332A1 (fr) 2021-06-16 2021-06-16 Dispositif d'enregistrement, procédé d'enregistrement, et programme

Country Status (2)

Country Link
JP (1) JPWO2022264332A1 (fr)
WO (1) WO2022264332A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013025367A * 2011-07-15 2013-02-04 Wakayama Univ Equipment condition monitoring method and apparatus
JP2020091561A * 2018-12-04 2020-06-11 Hitachi Global Life Solutions, Inc. Abnormality diagnosis device and abnormality diagnosis method
WO2021033274A1 * 2019-08-20 2021-02-25 Nippon Telegraph And Telephone Corporation Pattern extraction and rule generation device, method, and program


Also Published As

Publication number Publication date
JPWO2022264332A1 (fr) 2022-12-22


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21946009

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023528857

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21946009

Country of ref document: EP

Kind code of ref document: A1