CN112016689B - Information processing device, prediction discrimination system, and prediction discrimination method - Google Patents


Info

Publication number
CN112016689B
CN112016689B (application CN202010283149.0A)
Authority
CN
China
Prior art keywords
unit
model
evaluation
causal
cause
Prior art date
Legal status
Active
Application number
CN202010283149.0A
Other languages
Chinese (zh)
Other versions
CN112016689A (en)
Inventor
西纳修一
前田真彰
樱井祐市
矢崎彻
Current Assignee
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Publication of CN112016689A publication Critical patent/CN112016689A/en
Application granted granted Critical
Publication of CN112016689B publication Critical patent/CN112016689B/en

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00 Computing arrangements using knowledge-based models
    • G06N5/04 Inference or reasoning models
    • G06N5/046 Forward inferencing; Production systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0635 Risk analysis of enterprise or organisation activities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/04 Manufacturing
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing


Abstract

The invention relates to an information processing apparatus, a prediction discrimination system, and a prediction discrimination method capable of estimating a causal model with high applicability. The apparatus comprises: a causal model estimation unit that takes as input measurement data containing explanatory variables and a target variable obtained from a discrimination object, and estimates one or more causal models representing the relationships between the explanatory variables and the target variable; an evaluation unit that evaluates the one or more causal models using an index of prediction or discrimination performance for the target variable, and outputs any causal model whose evaluation result satisfies a predetermined condition; and an editing unit that outputs the causal model and the evaluation result produced by the evaluation unit to a display unit.

Description

Information processing device, prediction discrimination system, and prediction discrimination method
Technical Field
The present invention relates to the construction of a causal model and techniques for its application.
Background
At a manufacturing site, improving production efficiency requires predicting failures of manufacturing apparatus so that maintenance can be performed in advance, and determining the cause of defective products at an early stage so that countermeasures can be taken.
Models for such prediction and discrimination can be constructed with statistical methods such as regression analysis and discriminant analysis, or with machine learning methods such as neural networks. These methods take the configuration parameters, sensor data, and the like of the manufacturing apparatus and production line as input variables and output discrimination or prediction results as target variables. To make the basis of a decision clear, the following approach can be considered: construct a causal model representing the relationships between the input variables and the target variables, and use that model for prediction and discrimination.
For automatic estimation of causal models, methods such as the SGS (Spirtes, Glymour and Scheines) algorithm and path analysis are generally known, and a causal model can be estimated automatically so as to maximize its fit to the data. However, when the amount of data is small, an erroneous causal model may be estimated. Therefore, before such a model is used at a manufacturing site, it must be corrected in advance according to the judgment of a person familiar with the site or its constituent elements (hereinafter, a domain knowledge holder). Patent document 1 and others describe mechanisms for automatically estimating such a model and editing the result.
Prior art literature
Patent literature
Patent document 1: Japanese Patent Laid-Open No. 2008-217711
However, when the goal of causal-model estimation is not the graph structure itself but an application such as prediction or discrimination, selecting a causal model by the fit of the whole graph to the data is not optimal. This is because the data fit of the entire model does not necessarily correspond to the prediction and discrimination performance for a specific target variable (hereinafter, the applicability index). To improve applicability, the subgraph of variables related to the target variable must, for example, be weighted more heavily in the model than the subgraph of variables unrelated to it.
Disclosure of Invention
Accordingly, an object of the present invention is to provide a technique capable of estimating a causal model with high applicability.
An information processing apparatus according to an aspect of the present invention includes: a causal model estimation unit that takes as input measurement data containing explanatory variables and a target variable obtained from a discrimination object, and estimates one or more causal models representing the relationships between the explanatory variables and the target variable; an evaluation unit that evaluates the one or more causal models using an index of prediction or discrimination performance for the target variable, and outputs any causal model whose evaluation result satisfies a predetermined condition; and an editing unit that outputs the causal model and the evaluation result produced by the evaluation unit to a display unit.
Effects of the invention
According to an aspect of the present invention, a causal model with high applicability can be estimated.
Drawings
Fig. 1 is a schematic configuration diagram of a prediction discrimination device in embodiment 1.
Fig. 2 is a process flow of the prediction discrimination device in embodiment 1.
Fig. 3 is a data format in example 1.
Fig. 4 is a set of assumptions of the causal model in example 1.
Fig. 5 is a schematic configuration diagram of the model evaluation unit 102 in embodiment 1.
Fig. 6 is an example of the estimated parameters in embodiment 1.
Fig. 7 is an example of the user display editing unit 103 in embodiment 1.
Fig. 8 is a schematic configuration diagram of embodiment 2.
Fig. 9 is a schematic configuration diagram of embodiment 2.
Fig. 10 is a data format in example 3.
Reference numerals illustrate:
100: prediction discrimination device
101: causal model estimation unit
102: model evaluation unit
201: data dividing unit
202: learning unit
203: test unit
204: comprehensive evaluation unit
103: user display editing unit
104: model effective use unit
111: data storage unit
112: display unit
Detailed Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. The following description and drawings are illustrative examples for explaining the present invention and are omitted or simplified as appropriate for clarity. The invention can be implemented in various other forms. Unless otherwise limited, each constituent element may be singular or plural.
For ease of understanding the present invention, the positions, sizes, shapes, ranges, etc. of the respective constituent elements shown in the drawings may not be indicative of actual positions, sizes, shapes, ranges, etc. Accordingly, the present invention is not necessarily limited to the positions, sizes, shapes, ranges, etc. disclosed in the drawings.
In the following description, various information is sometimes described in terms such as "table" or "list", but it may be represented in other data structures. To indicate independence from any particular data structure, an "XX table" or "XX list" is sometimes called "XX information". In describing identification information, terms such as "identification information", "identifier", "name", "ID", and "number" are used, but these terms are interchangeable.
When a plurality of constituent elements having the same or similar functions are provided, different subscripts may be given to the same reference numerals. Note that, when it is not necessary to distinguish between these plural components, a description may be given by omitting a subscript.
In the following description, processing is sometimes described as being performed by executing a program; because a program performs its specified processing when executed by a processor (e.g., a CPU or GPU) while appropriately using memory resources (e.g., memory) and/or interface devices (e.g., communication ports), the subject of the processing may be regarded as the processor. Similarly, the subject of processing performed by executing a program may be a controller, apparatus, system, computer, or node having a processor. The subject of such processing may be an arithmetic unit, and may include dedicated circuitry (e.g., an FPGA or ASIC) for specific processing.
The program may be installed from a program source into a device such as a computer. The program source may be, for example, a program distribution server or a computer-readable storage medium. In the case where the program source is a program distribution server, the program distribution server may include a processor and a storage resource storing the program to be distributed, and the processor of the program distribution server may distribute the program to be distributed to other computers. In the following description, 2 or more programs may be implemented as 1 program, or 1 program may be implemented as 2 or more programs.
[ example 1 ]
Here, detecting wear of a cutting machine, an example of the equipment, devices, and tools used at a manufacturing site, is taken as an illustration, but the present invention is not limited to this; various objects can serve as detection and discrimination targets. As shown in fig. 3, each time a workpiece is machined by the cutting machine, the prediction discrimination device 100 receives measurement data containing 5 variables ((1) manufacturing ID, (2) material, (3) vibration, (4) machining type, (5) presence or absence of machining abnormality) transmitted repeatedly from the cutting machine, and stores the measurement data in the data storage unit 111. Fig. 3 shows the prediction discrimination device 100 receiving and accumulating measurement data from cutting machines with manufacturing IDs such as "ID-09" and "ID-05". For example, the measurement data received from "ID-09" indicates that a workpiece made of "steel" was machined, the vibration was "small" (below a predetermined threshold), the machining type was "class-A" (metal cutting), and there was no machining abnormality. The prediction discrimination device 100 is to build a model that determines (5) the presence or absence of a machining abnormality from the measured values of (1) to (4). In this case, (1) to (4) are explanatory variables and (5) is the target variable.
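As a rough sketch, one record of the five-variable measurement data described for fig. 3 might be held in a structure like the following (field names and concrete values are illustrative, not taken from the patent figures):

```python
# Hypothetical sketch of one measurement record in the Fig. 3 format.
from dataclasses import dataclass

@dataclass
class MeasurementRecord:
    manufacturing_id: str   # (1) e.g. "ID-09"
    material: str           # (2) e.g. "steel"
    vibration: str          # (3) "small" / "large" relative to a threshold
    machining_type: str     # (4) e.g. "class-A" (metal cutting)
    abnormality: bool       # (5) target variable: machining abnormality

# One record as described in the text: ID-09, steel, small vibration,
# class-A machining, no abnormality.
record = MeasurementRecord("ID-09", "steel", "small", "class-A", False)
```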
As shown in fig. 1, the prediction discrimination device 100 includes a data storage unit 111, a cause and effect model estimation unit 101, a model evaluation unit 102, and a user display editing unit 103.
The data storage unit 111 is implemented in hardware by a general storage device such as an HDD (Hard Disk Drive) or SSD (Solid State Drive), and stores the measurement data.
The causal model estimation unit 101 reads measurement data from the data storage unit 111 and estimates one or more causal models. The model evaluation unit 102 evaluates the causal models generated by the causal model estimation unit 101 and selects, as candidates to present to the user, those whose applicability index (the prediction and discrimination performance for the target variable) is equal to or higher than a predetermined reference value.
The user display editing unit 103 outputs the structure of the causal model estimated by the causal model estimation unit 101 and the performance evaluation result produced by the model evaluation unit 102 to the display unit 112, typically a display device such as an LCD (Liquid Crystal Display), and presents them to the user. The detailed processing of the causal model estimation unit 101, model evaluation unit 102, and user display editing unit 103 is described later.
The prediction discrimination device 100 is implemented in hardware by a general information processing apparatus such as a PC (Personal Computer) or a server. The causal model estimation unit 101, model evaluation unit 102, and user display editing unit 103 are realized by executing a program. For example, their functions are realized when the CPU (Central Processing Unit) of the prediction discrimination device 100 reads a program from ROM (Read Only Memory) and executes it. The program may be supplied to the prediction discrimination device 100 by reading it from a storage medium such as a USB (Universal Serial Bus) memory, or by downloading it from another computer via a network. In the present embodiment the display unit 112 is external to the prediction discrimination device 100, but the prediction discrimination device 100 may include the display unit 112.
Next, the processing in the present embodiment is described in accordance with the flowchart of fig. 2.
Step 201:
the causal model estimation unit 101 reads measurement data from the data storage unit 111.
Step 202:
the causal model estimation unit 101 estimates causal models using the measurement data read from the data storage unit 111 as input. At this point, one or more models are generated as hypotheses. A causal model specifies the relationships between variables and can be expressed visually as a graph structure in which nodes corresponding to variables are connected by links where a relationship exists; concretely, a Bayesian network, a Markov network, or the like can be used. As an estimation method, for example, the SGS (Spirtes, Glymour and Scheines) algorithm can be used. With SGS, the thresholds for the independence tests between particular nodes can be varied to derive different causal models. Alternatively, multiple hypotheses may be generated by randomly changing the presence or absence of links starting from a model that maximizes an index such as MDL (Minimum Description Length). As a result, for example, the set of causal models shown in fig. 4 can be estimated; fig. 4 shows, as an example, four estimated causal models (a) to (d).
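The hypothesis-generation idea in this step, randomly changing the presence or absence of links around a base model, can be sketched as follows. The variable names, the base graph, and the search procedure are simplified stand-ins for the SGS- and MDL-based methods named above, not the patent's actual algorithm:

```python
import itertools
import random

def random_hypotheses(base_edges, variables, n_models=4, n_flips=1, seed=0):
    """Generate candidate causal-model hypotheses by randomly toggling
    directed links around a base graph (a simplified stand-in for the
    SGS / MDL-based hypothesis generation described in the text)."""
    rng = random.Random(seed)
    all_pairs = list(itertools.permutations(variables, 2))
    hypotheses = [set(base_edges)]           # keep the base model itself
    while len(hypotheses) < n_models:
        edges = set(base_edges)
        for _ in range(n_flips):
            # toggle one randomly chosen link (add if absent, remove if present)
            edges.symmetric_difference_update({rng.choice(all_pairs)})
        if edges not in hypotheses:          # keep only distinct graphs
            hypotheses.append(edges)
    return hypotheses

# Illustrative variables and base edges loosely modeled on Fig. 4 (a).
variables = ["ID", "material", "type", "vibration", "abnormal"]
base = {("ID", "material"), ("ID", "type"), ("material", "vibration"),
        ("material", "abnormal"), ("type", "abnormal")}
models = random_hypotheses(base, variables)
```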
Step 203:
the model evaluation unit 102 evaluates the one or more causal models generated in step 202 using the applicability index and selects candidate models to present to the user.
Fig. 5 shows the structure of the model evaluation unit 102. As shown in fig. 5, the model evaluation unit 102 includes a data dividing unit 201, a learning unit 202, a testing unit 203, and a comprehensive evaluation unit 204.
If the input data from the data storage unit 111 were used directly as evaluation data, the data used to estimate the causal model and the data used to evaluate it would overlap, and only a single evaluation could be performed, so a robust evaluation would not be possible. Accordingly, the data dividing unit 201 divides the input data into learning data and test data; the learning unit 202 learns from the learning data and the test unit 203 tests with the test data. Learning and testing are repeated while the division is varied, and the comprehensive evaluation unit 204 evaluates each run and aggregates the results into an overall evaluation. This structure enables a robust evaluation.
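The repeated division into learning and test data can be sketched with a plain k-fold scheme. The patent does not fix a particular division method, so this is only one possible choice:

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k train/test pairs, as one way to
    realize the repeated learn/test cycle of the data dividing unit 201.
    Returns a list of (train_indices, test_indices) tuples."""
    folds = [list(range(i, n, k)) for i in range(k)]
    return [(sorted(set(range(n)) - set(f)), f) for f in folds]

# 10 records, 5 folds: each fold tests on 2 records and learns on 8.
splits = k_fold_indices(10, 5)
```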
Here, the learning unit 202 keeps the model structure fixed, feeds the data from each division into the causal model, and re-estimates the parameters. For example, the learning unit 202 can calculate parameters as shown in fig. 6. In this example the causal model is expressed as a Bayesian network, so the parameters are obtained as probabilities. Fig. 6 shows, as an example, parameters estimated for the causal model of fig. 4 (a). In fig. 6, A, B, C, D, and E denote the manufacturing ID, material, machining type, vibration, and machining abnormality respectively, and the estimated parameters are the tabulated probability distributions P(A), P(B|A), P(C|A), P(D|B), and P(E|B,C). In general, maximum likelihood estimation or EAP (Expected A Posteriori) estimation can be used to estimate the parameters. When P(A) is estimated by maximum likelihood, the number of occurrences of each value of A (manufacturing ID) in the data is counted and divided by the total count. For example, if ID-05 and ID-09 each appear 15 times, the total is 30 and each probability P(A) is 15/30 = 0.5. The learning unit 202 performs this estimation for every causal model generated in step 202.
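The maximum likelihood estimation of a marginal parameter such as P(A) can be sketched as follows, reproducing the worked example of 15 occurrences each of ID-05 and ID-09:

```python
from collections import Counter

def mle_marginal(values):
    """Maximum likelihood estimate of a marginal distribution such as
    P(A): occurrence counts divided by the total count."""
    counts = Counter(values)
    total = sum(counts.values())
    return {v: c / total for v, c in counts.items()}

# The worked example: ID-05 and ID-09 each appear 15 times (total 30),
# so P(A) = 15/30 = 0.5 for each.
data_a = ["ID-05"] * 15 + ["ID-09"] * 15
p_a = mle_marginal(data_a)
```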
The test unit 203 then uses the parameters estimated by the learning unit 202 to predict the target variable (here, machining abnormality) from the input variables of the test data (here, the four variables other than machining abnormality). As a method for predicting a target variable from estimated parameters, the join tree algorithm and the like are known, which remain efficient even when the graph structure is large.
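As a toy illustration of predicting the target variable from estimated parameters, the following uses a naive lookup in the conditional table P(E | B, C) of the fig. 6 example; it is not the join tree algorithm the text mentions, and the table layout and values are assumptions:

```python
def predict_target(p_e_given_bc, b, c):
    """Pick the most probable value of the target variable E from its
    estimated conditional table P(E | B, C) -- a naive lookup standing
    in for proper Bayesian-network inference."""
    dist = p_e_given_bc[(b, c)]
    return max(dist, key=dist.get)

# Illustrative table: for (material=steel, type=class-A),
# P(abnormal)=0.2 and P(normal)=0.8.
table = {("steel", "class-A"): {"abnormal": 0.2, "normal": 0.8}}
prediction = predict_target(table, "steel", "class-A")
```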
The comprehensive evaluation unit 204 can evaluate a causal model using, for example, accuracy or the F-measure, both discrimination accuracy indexes, as the applicability index. The processing flow of the data dividing unit 201, learning unit 202, test unit 203, and comprehensive evaluation unit 204 is executed for each model generated in step 202. Finally, the comprehensive evaluation unit 204 selects the causal model with the highest applicability index. In the present embodiment the comprehensive evaluation unit 204 selects the causal model with the best applicability index, but it may instead select one or more causal models whose applicability index satisfies a reference value as a given condition; this yields candidate causal models that meet a certain level. The reference value can be determined according to the type of measurement data and the required evaluation accuracy.
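The applicability indexes and the selection rule of the comprehensive evaluation unit 204 might be sketched as follows. The reference value of 0.5 and the per-model scores are illustrative, not values from the patent:

```python
def accuracy(y_true, y_pred):
    """Fraction of correct predictions."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f_measure(y_true, y_pred, positive=True):
    """Harmonic mean of precision and recall for the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return (2 * precision * recall / (precision + recall)
            if precision + recall else 0.0)

def select_models(scores, reference=0.5):
    """Keep models whose applicability index meets the reference value,
    best first (one reading of the selection rule in the text)."""
    return sorted((m for m, s in scores.items() if s >= reference),
                  key=lambda m: -scores[m])

# Illustrative applicability indexes for three hypothesis models.
scores = {"model-a": 0.60, "model-b": 0.45, "model-c": 0.72}
candidates = select_models(scores, reference=0.5)
```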
Step 204:
the user display editing unit 103 presents the structure of the causal model selected by the comprehensive evaluation unit 204 together with its performance evaluation by the applicability index. For example, as shown in fig. 7, the user display editing unit 103 displays the causal model 701 and the evaluation result 702 on the display unit 112. Fig. 7 shows the model name of the causal model of fig. 4 (a) and the value of its applicability index (accuracy: 60%) displayed on the display unit 112.
Step 205:
when the user, a domain knowledge holder, checks the structure of the displayed causal model, decides that no editing is necessary, and finishes editing, the prediction discrimination device 100 receives a press of the edit end button 711 via the display unit 112 or another input device, such as a keyboard or mouse, connected to the prediction discrimination device 100. The user display editing unit 103 determines whether the edit end button 711 has been pressed; if so (YES in step 205), the processing ends. Otherwise (NO in step 205), the processing proceeds to step 206.
Step 206:
when it is determined that the edit end button 711 has not been pressed (NO in step 205), the user display editing unit 103 judges that editing may be required and stands by. When the prediction discrimination device 100 determines that an operation indicating that editing is necessary has been received from the user (for example, a press of the link delete button 712 or the link add button 713), the user display editing unit 103 changes the structure by toggling links of the displayed causal model 701. For example, the causal model 701 of fig. 4 (a) displayed on the display unit 112 is edited into fig. 4 (b) by such operations.
The display and editing need not take the form shown in fig. 7; text-based editing is also conceivable. For example, the user display editing unit 103 may generate a question such as "Is the machining type thought to have an influence on the material?", and when a negative answer ("no") is input, change the causal model in the manner described above.
Thereafter, the process returns to step 202, and the prediction discrimination device 100 repeats the subsequent processing. In step 202 after editing, the causal model estimation unit 101 adds hypotheses newly created from the causal model edited by the user to the hypotheses generated in step 202 before editing. For example, the causal model estimation unit 101 may randomly change the presence or absence of links between other variables while preserving the change made by the user, or may simply add the user's editing result to the existing set of hypotheses.
In step 203 after editing, the model evaluation unit 102 evaluates the set of model hypotheses created in step 202 after editing. Here, the model evaluation unit 102 may remember models that have already been evaluated and evaluate only those not yet evaluated, reducing the effort and time the evaluation requires.
In step 204 after editing, the user display editing unit 103 presents the user with the change in the applicability index caused by the user's edits, based on the evaluation result of step 203 after editing. For example, the user display editing unit 103 displays a list of changes in the accuracy of the models created and evaluated during the editing process so far, as in the edit history 705 of fig. 7. In fig. 7, the edit history 705 shows, in time series, the applicability index of the causal model for the past five iterations, including the causal model currently displayed. Because the evaluation of the currently displayed fifth causal model 701 is lower than the previous (fourth) evaluation, when the user selects the fourth evaluation value 7051 in the chart of the edit history 705, the user display editing unit 103 displays a pop-up screen 706 showing the information of that iteration. The causal model 7061 and evaluation result 7062 of that iteration appear on the pop-up screen 706, and the user presses the button 714 to return to it. In that case, the user display editing unit 103 replaces the causal model and evaluation result on the main screen with those of the selected iteration. The data of each past iteration may be stored in the data storage unit 111 each time the processing of fig. 2 is performed, and the user display editing unit 103 may display the causal models and evaluation results stored up to that point as the history when fig. 7 is displayed.
In this way, the user display editing unit 103 allows not only the currently displayed causal model 701 but also past models to be edited retrospectively.
In step 205 after editing, the user examines whether the edits are correct based on the information so far, and if so, ends editing. As in step 205 described above, when the user display editing unit 103 determines that the edit end button 711 has been pressed (YES in step 205), the processing ends; otherwise (NO in step 205), the processing proceeds to step 206.
In this way, even when the desired causal model is not obtained, a domain knowledge holder can easily edit the causal model. Even when the domain knowledge holder cannot decide how the causal model should be changed, a causal model with high applicability can be edited by referring to the past edit history, and a model with still higher applicability can be created.
[ example 2 ]
Fig. 8 and 9 show, as embodiment 2, examples in which the model prepared in embodiment 1 is further applied at a manufacturing site. The prediction discrimination device 800 of embodiment 2 differs from the prediction discrimination device 100 of embodiment 1 in that it has a model effective use unit 104. The prediction discrimination device 800 is connected to a result output unit 113 that outputs data obtained from the model effective use unit 104, and to a device group 114 (devices 114a, 114b, and 114c) that serves as the source of the measurement data.
While fig. 8 shows a configuration in which the model effective use unit 104 is included in the prediction discrimination device 800 (the 2nd device), it may instead be configured as a separate device, as shown in fig. 9. In fig. 9, the model effective use unit 104 is included in a model effective use device 902 (3rd device), the data storage unit 111 is included in a device 901 (1st device), and the result output unit 113 is included in a result output device 903 (output device). These can be included in the prediction discrimination device 800 as desired. With the configuration of fig. 9, both the evaluation of the model and its effective use can be performed in a stable environment. For example, because the processing of the model evaluation unit 102 and that of the model effective use unit 104 run on different devices, the processing load on each device is reduced, and these functions can be realized on lower-specification hardware. Further, because the display unit 112 and the result output unit 113 are separate devices, the system can be used even when the user who edits the model evaluation results and the user who makes effective use of the model are in different environments across a network.
Hereinafter, an example is described in which, after the editing in step 205 of embodiment 1 is completed, the user display editing unit 103 outputs the causal model to the model effective use unit 104 for effective use.
The model effective use unit 104 acquires measurement data from the device group 114, and acquires, from the model evaluation unit 102, a causal model whose applicability index, as selected by the comprehensive evaluation unit 204, satisfies a reference value as a predetermined condition (for example, the causal model with the best applicability index), together with the manufacturing ID included in the measurement data used as input to that causal model. The model effective use unit 104 then extracts, from the measurement data received from the device group 114, the data having the same manufacturing ID, and outputs the extracted data to the result output unit 113 together with the causal model satisfying the reference value (for example, the optimal causal model).
The result output unit 113 stores the result, including the causal model and the manufacturing ID received from the model effective use unit 104, in a storage device, or presents it on an output device such as a display device or a sound output device. This makes it possible to adopt the most highly evaluated causal model as the one to be effectively used, and to grasp the content of the measurement data that was input to the causal model for evaluation. In embodiment 1, measures such as issuing an alarm were considered when a processing abnormality is determined; in this embodiment, the product for which the alarm was issued can additionally be identified from the manufacturing ID output by the result output unit 113.
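The selection and extraction steps above can be sketched in a few lines. This is an illustrative sketch only: the function names `select_model` and `filter_by_manufacturing_id`, and the dictionary fields, are assumptions for the example and are not defined by the patent.

```python
def select_model(evaluated_models, reference_value):
    """Return the causal model whose applicability index satisfies the
    reference value; among those, the one with the best (highest) index."""
    candidates = [m for m in evaluated_models if m["applicability"] >= reference_value]
    if not candidates:
        return None
    return max(candidates, key=lambda m: m["applicability"])

def filter_by_manufacturing_id(measurement_data, manufacturing_id):
    """Extract the records whose manufacturing ID matches the ID that was
    used as input to the selected causal model."""
    return [rec for rec in measurement_data if rec["manufacturing_id"] == manufacturing_id]

# Toy data standing in for the outputs of the model evaluation unit (102)
# and the device group (114).
models = [
    {"name": "model_a", "applicability": 0.72},
    {"name": "model_b", "applicability": 0.91},
    {"name": "model_c", "applicability": 0.85},
]
best = select_model(models, reference_value=0.8)

records = [
    {"manufacturing_id": "ID-001", "vibration": 0.4},
    {"manufacturing_id": "ID-002", "vibration": 0.9},
    {"manufacturing_id": "ID-001", "vibration": 0.5},
]
matched = filter_by_manufacturing_id(records, "ID-001")
```

The pair (`best`, `matched`) corresponds to what the model effective use unit 104 would hand to the result output unit 113 in this scenario.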
Further, while the causal model is being effectively used, the data storage unit 111 continues to store measurement data acquired from the device group 114, and the newly stored measurement data is input to the causal model estimating unit 101 for model relearning. In this way, causal models are re-estimated and evaluated using the latest measurement data stored at any time, and when a new causal model becomes the most highly evaluated one, the model effective use unit 104 can acquire it and put it to effective use. In this case, the causal model estimating unit 101 may decide, depending on the use environment and the like, whether to re-estimate only the learning parameters, to also re-estimate the structure, or to have a holder of domain knowledge edit the model.
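A minimal sketch of this relearning cycle is shown below. The functions `estimate` and `evaluate` are hypothetical stand-ins for the causal model estimating unit (101) and the model evaluation unit (102); a "model" is reduced to a single score so the control flow is visible.

```python
def relearn_if_better(current_model, new_data, estimate, evaluate):
    """Re-estimate a causal model on newly accumulated measurement data and
    adopt it only when its evaluation exceeds the current model's."""
    candidate = estimate(new_data)
    if evaluate(candidate) > evaluate(current_model):
        return candidate   # effective-use unit switches to the new model
    return current_model   # otherwise keep using the existing model

# Toy stand-ins: estimation averages the accumulated scores, and the
# evaluation index of a "model" is simply its own value.
estimate = lambda data: sum(data) // len(data)
evaluate = lambda model: model

model = relearn_if_better(60, [70, 90], estimate, evaluate)  # candidate scores 80 > 60
```

In a real deployment this function would be triggered each time the data storage unit 111 accumulates a new batch of measurement data.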
[Embodiment 3]
Embodiments 1 and 2 show examples of discriminating the current state, but future prediction is possible as well. For example, consider a case where measurement data is stored in the data storage unit 111 in time series, as shown in fig. 10, which shows such measurement data accumulated from the past to the present at times tn, tn-1, tn-2, and so on. The learning unit 202 takes the stored measurement data as input and generates a causal model for predicting the target variable at a later time point (for example, m time points later) from the explanatory variables at the current time (for example, time tn-1). In this way, if the target variable is set to the presence or absence of a fault and the explanatory variables are set to sensor values and set values from before the fault occurrence time, a model for predicting a future state from a past state can be constructed.
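The pairing of past explanatory variables with a future target variable can be sketched as follows. This is an assumed illustration, not the patent's implementation: the field names `sensors` and `fault` and the function name `make_training_pairs` are chosen for the example.

```python
def make_training_pairs(series, m):
    """Pair the explanatory variables at each time t with the target value
    m time points later, producing supervised training data for the
    learning unit."""
    pairs = []
    for t in range(len(series) - m):
        x = series[t]["sensors"]    # sensor values and set values at time t
        y = series[t + m]["fault"]  # fault flag m time points later
        pairs.append((x, y))
    return pairs

# Toy time series in storage order (oldest first).
series = [
    {"sensors": [1.0, 0.2], "fault": 0},
    {"sensors": [1.1, 0.3], "fault": 0},
    {"sensors": [1.4, 0.9], "fault": 1},
]
pairs = make_training_pairs(series, m=1)  # (t0 -> t1) and (t1 -> t2)
```

Feeding such pairs to any supervised learner yields a model that predicts the state m time points ahead, as described above.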
[Embodiment 4]
In addition, when the prediction discrimination device is used to discriminate whether a device or the like is normal or abnormal, abnormal data is not necessarily required. For example, suppose that various parameters of the device and the vibration value of a specific portion can be measured, and it is known that the vibration value is associated with abnormality of the device. In this case, even if only vibration-value data from normal operation is available, a model for predicting the vibration value during normal operation can be built. In effective use, this model predicts the vibration value from data other than the vibration value, the difference from the actual vibration value is calculated, and abnormal vibration is detected when the difference exceeds a specific threshold.
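The residual check described above reduces to a one-line comparison. In this sketch the predicted value is simply given as a constant; in practice it would come from the normal-operation model, and the function name and threshold are illustrative assumptions.

```python
def detect_abnormal_vibration(predicted, actual, threshold):
    """Flag abnormal vibration when the difference between the predicted
    and actual vibration values exceeds the threshold."""
    return abs(actual - predicted) > threshold

# Vibration value predicted from data other than the vibration value
# (placeholder constant standing in for the normal-time model's output).
predicted = 0.50
is_normal_case = detect_abnormal_vibration(predicted, actual=0.52, threshold=0.1)
is_fault_case = detect_abnormal_vibration(predicted, actual=0.95, threshold=0.1)
```

Because the model is trained only on normal data, a large residual signals that the device has left its normal operating regime, with no abnormal training samples required.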
The present invention is not limited to the above-described embodiments and includes various modifications. For example, the above embodiments are described in detail for easy understanding of the present invention, and the invention is not necessarily limited to having all the described structures. A part of the structure of one embodiment may be replaced with the structure of another embodiment, and the structure of another embodiment may be added to the structure of one embodiment. For a part of the structure of each embodiment, addition, deletion, and substitution of other structures are possible. The above structures, functions, processing units, and the like may be partially or entirely realized in hardware, for example by designing them as integrated circuits. They may also be realized in software by a processor interpreting and executing programs that realize the respective functions. Information such as programs, tables, and files for realizing the functions can be placed in a recording device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.

Claims (3)

1. A prediction discrimination system, characterized by comprising:
a causal model estimating unit that receives, as input, measurement data including an explanatory variable, a target variable, and a manufacturing ID obtained from a discrimination object that outputs the measurement data, and estimates one or more causal models representing a relationship between the explanatory variable and the target variable;
an evaluation unit that evaluates the one or more causal models using an index indicating prediction or discrimination performance with respect to the target variable, and outputs a causal model whose evaluation result satisfies a predetermined condition;
an editing unit configured to output the causal model output from the evaluation unit and the evaluation result to a display unit; and
a model effective utilization unit configured to acquire, from the evaluation unit, the causal model whose evaluation result satisfies the predetermined condition and the manufacturing ID included in the measurement data input to that causal model, extract, from the measurement data acquired from the discrimination object, data whose manufacturing ID matches the acquired manufacturing ID, and output the extracted data to a display unit together with the causal model satisfying the predetermined condition.
2. The prediction discrimination system according to claim 1, wherein
the prediction discrimination system comprises:
a 1st device that accumulates measurement data output from the discrimination object;
a 2nd device having the causal model estimating unit, the evaluation unit, and the editing unit;
a 3rd device that acquires measurement data output from the discrimination object and has the model effective utilization unit;
a display device having the display unit as an output destination of the editing unit; and
an output device serving as an output destination of the model effective utilization unit.
3. A prediction discrimination method for predicting or discriminating an abnormality in a manufacturing apparatus to be discriminated, characterized in that:
a causal model estimating unit receives, as input, measurement data including an explanatory variable, a target variable, and a manufacturing ID obtained from the discrimination object that outputs the measurement data, the explanatory variable including setting parameters and sensing data of the manufacturing apparatus and the target variable indicating the presence or absence of an abnormality in the manufacturing apparatus, and estimates one or more causal models representing a relationship between the explanatory variable and the target variable;
an evaluation unit evaluates the one or more causal models using an index indicating prediction or discrimination performance with respect to the target variable, and outputs a causal model whose evaluation result satisfies a predetermined condition;
an editing unit outputs the causal model output by the evaluation unit and the evaluation result to a display unit; and
a model effective utilization unit acquires, from the evaluation unit, the causal model whose evaluation result satisfies the predetermined condition and the manufacturing ID included in the measurement data input to that causal model, extracts, from the measurement data acquired from the discrimination object, data whose manufacturing ID matches the acquired manufacturing ID, and outputs the extracted data to a display unit together with the causal model satisfying the predetermined condition.
CN202010283149.0A 2019-05-28 2020-04-10 Information processing device, prediction discrimination system, and prediction discrimination method Active CN112016689B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-099053 2019-05-28
JP2019099053A JP7247021B2 (en) 2019-05-28 2019-05-28 Information processing device, prediction discrimination system, and prediction discrimination method

Publications (2)

Publication Number Publication Date
CN112016689A CN112016689A (en) 2020-12-01
CN112016689B true CN112016689B (en) 2023-08-18

Family

ID=73506507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010283149.0A Active CN112016689B (en) 2019-05-28 2020-04-10 Information processing device, prediction discrimination system, and prediction discrimination method

Country Status (2)

Country Link
JP (1) JP7247021B2 (en)
CN (1) CN112016689B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113112311B (en) * 2021-05-12 2023-07-25 北京百度网讯科技有限公司 Method for training causal inference model and information prompting method and device
JP7409421B2 (en) 2022-03-24 2024-01-09 いすゞ自動車株式会社 Model creation device and model creation method
WO2024004384A1 (en) * 2022-06-27 2024-01-04 ソニーグループ株式会社 Information processing device, information processing method, and computer program
WO2024053020A1 (en) * 2022-09-07 2024-03-14 株式会社日立製作所 System and method for estimating factor of difference between simulation result and actual result

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008003866A (en) * 2006-06-22 2008-01-10 Omron Corp Causal structure acquiring device, causal structure acquiring method, causal structure acquiring program and computer readable medium recording it
JP2008217711A (en) * 2007-03-07 2008-09-18 Omron Corp Apparatus for deciding causal structure, control method therefor, and control program therefor
JP2009265713A (en) * 2008-04-22 2009-11-12 Toyota Central R&D Labs Inc Model construction device and program
WO2015122362A1 (en) * 2014-02-14 2015-08-20 オムロン株式会社 Causal network generating system and causal relation data structure
CN106796618A (en) * 2014-10-21 2017-05-31 株式会社日立制作所 Time series forecasting device and time sequence forecasting method
JP2017194730A (en) * 2016-04-18 2017-10-26 株式会社日立製作所 Decision Support System and Decision Support Method

Also Published As

Publication number Publication date
JP2020194320A (en) 2020-12-03
JP7247021B2 (en) 2023-03-28
CN112016689A (en) 2020-12-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant