CN115098862A - Water surface unmanned system intelligent algorithm model safety credibility evaluation method and device - Google Patents
- Publication number
- CN115098862A (application number CN202210628895.8A)
- Authority
- CN
- China
- Prior art keywords
- intelligent algorithm
- water surface
- algorithm model
- unmanned system
- model
- Prior art date
- Legal status
- Granted
Classifications
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
- G06F11/3676—Test management for coverage analysis
- G06F11/3684—Test management for test design, e.g. generating new test cases
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
- G06N3/04—Neural networks; Architecture, e.g. interconnection topology
- G06N3/08—Neural networks; Learning methods
- Y02T90/00—Enabling technologies or technologies with a potential or indirect contribution to GHG emissions mitigation
Abstract
The invention discloses a method and a device for evaluating the safety and credibility of intelligent algorithm models of an unmanned surface system. The method comprises the following steps: constructing a plurality of groups of intelligent algorithm models of the unmanned surface system; establishing a first test case set for the intelligent algorithm models; establishing a second, noise-perturbed test case set based on the first test case set; and evaluating the safety and credibility of the intelligent algorithm models by a model behavior differentiation method. The invention uses several groups of functionally similar models jointly: the input data of one model is changed while the other models keep their current inputs, driving that model to make predictions that differ from the others. This elicits more differentiated model behavior and exposes errors near the decision boundaries.
Description
Technical Field
The invention relates to the technical field of deep learning, in particular to a method and a device for evaluating the safety and the credibility of an intelligent algorithm model of a water surface unmanned system, electronic equipment and a computer readable storage medium.
Background
With the rapid development of a new round of intelligent technology represented by deep learning, the safety and reliability of artificial intelligence are drawing ever greater attention. If intelligent equipment exhibits uncontrolled, abnormal behavior in actual combat, the potential harm is severe and unacceptable.
From a technical point of view, the safety and credibility problem of artificial intelligence has two aspects: the safety and credibility of the algorithm itself, and the ability to defend against attacks. Regarding the former, current artificial intelligence technology is constrained by the limitations of training data and the black-box nature of neural networks, and therefore suffers from unstable performance, difficulty adapting to environmental changes, and similar problems in practical applications.
In the intelligent algorithm models used by unmanned surface systems, the high complexity of deep-learning networks makes thorough testing difficult. Experience shows that, owing to training-data bias, overfitting, insufficient model capacity, and similar factors, intelligent algorithms often exhibit unexpected or incorrect behavior on extreme test cases.
Disclosure of Invention
In view of the above, the present invention provides a method, an apparatus, an electronic device, and a computer-readable storage medium for the safety and credibility evaluation of intelligent algorithm models of unmanned surface systems, which overcome, or at least partially solve, the above problems.
One embodiment of the invention provides a security credibility assessment method for an intelligent algorithm model of a water surface unmanned system, which comprises the following steps:
constructing a plurality of groups of intelligent algorithm models of the water surface unmanned system;
establishing a first test case set of an intelligent algorithm model of the water surface unmanned system;
establishing a second test case set with noise based on the first test case set;
and performing safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting a model behavior differentiation method.
Optionally, the establishing a first test case set of the intelligent algorithm model of the unmanned surface system includes:
and establishing a first test case set of the intelligent algorithm model of the water surface unmanned system by taking the maximized neuron coverage rate as a target.
Optionally, the performing security credibility evaluation on the intelligent algorithm model of the water surface unmanned system by using a model behavior differentiation method includes:
selecting a target water surface unmanned system intelligent algorithm model from the multiple groups of water surface unmanned system intelligent algorithm models;
inputting the first test case set into other intelligent algorithm models of the water surface unmanned system except the target intelligent algorithm model of the water surface unmanned system to obtain a first output result;
inputting the second test case set into the target water surface unmanned system intelligent algorithm model to obtain a second output result;
and performing safety credible evaluation on the intelligent algorithm model of the water surface unmanned system according to the first output result and the second output result.
Optionally, when the difference between the first output result and the second output result is smaller than a preset threshold, it is determined that the intelligent algorithm model of the water surface unmanned system passes a safety credibility assessment test.
Another embodiment of the present invention provides a security credible evaluation apparatus for an intelligent algorithm model of an unmanned surface system, including:
the model building unit is used for building a plurality of groups of intelligent algorithm models of the water surface unmanned system;
the first test case set establishing unit is used for establishing a first test case set of the intelligent algorithm model of the water surface unmanned system;
the second test case set establishing unit is used for establishing a second test case set with noise on the basis of the first test case set;
and the safety credibility evaluation unit is used for carrying out safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting a model behavior differentiation method.
Optionally, the first test case set creating unit is further configured to:
and establishing a first test case set of the intelligent algorithm model of the water surface unmanned system by taking the maximized neuron coverage rate as a target.
Optionally, the secure trusted evaluation unit is further configured to:
selecting a target water surface unmanned system intelligent algorithm model from the multiple groups of water surface unmanned system intelligent algorithm models;
inputting the first test case set into other intelligent algorithm models of the water surface unmanned system except the target intelligent algorithm model of the water surface unmanned system to obtain a first output result;
inputting the second test case set into the target water surface unmanned system intelligent algorithm model to obtain a second output result;
and performing safety credible evaluation on the intelligent algorithm model of the water surface unmanned system according to the first output result and the second output result.
Optionally, when the difference between the first output result and the second output result is smaller than a preset threshold, it is determined that the intelligent algorithm model of the water surface unmanned system passes a safety credibility assessment test.
Another embodiment of the present invention provides an electronic device, wherein the electronic device includes:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the method described above.
Another embodiment of the present invention provides a computer-readable storage medium, wherein the computer-readable storage medium stores one or more programs which, when executed by a processor, implement the above-described method.
The method has the following advantages: the first test case set and the second test case set are input into designated models, and the input data of one model is changed, so that the intelligent algorithm models of the unmanned surface system are evaluated for safety and credibility by a model behavior differentiation method. The invention uses several groups of functionally similar models jointly: changing the input data of one model while the other models keep their current inputs drives that model to make predictions different from the others, elicits more differentiated behavior, and exposes errors near the decision boundaries.
The invention discovers and mines extreme cases by maximizing neuron coverage so as to find vulnerabilities in the neural network, achieving a safety and credibility evaluation of the intelligent algorithm with higher scene coverage and stronger event traversability.
Drawings
FIG. 1 is a schematic flow chart of a security credibility assessment method for an intelligent algorithm model of an unmanned surface system according to one embodiment of the invention;
fig. 2 is a schematic structural diagram of a security and credibility assessment device for an intelligent algorithm model of an unmanned surface system according to an embodiment of the invention;
FIG. 3 shows a schematic structural diagram of an electronic device according to one embodiment of the invention;
fig. 4 shows a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Facing the stringent requirements of unmanned surface systems for the safety and credibility of intelligent algorithm models, the invention explores and establishes a task-oriented safety and credibility evaluation method based on a brain-like cognitive mechanism, carries out simulation verification in typical task scenarios, and supports the efficient transfer and application of intelligent algorithm models in unmanned surface systems.
Fig. 1 is a schematic flow chart of a security credibility assessment method for an intelligent algorithm model of a water surface unmanned system according to an embodiment of the present invention. As shown in fig. 1, the method includes:
s11: constructing a plurality of groups of intelligent algorithm models of the water surface unmanned system;
in practical application, each group of intelligent algorithm models of the water surface unmanned system are constructed by adopting a neural network.
S12: establishing a first test case set of an intelligent algorithm model of the water surface unmanned system;
it can be understood that, in order to fully explore the behavior boundary of the neural network in the intelligent algorithm model of the unmanned surface system, a test case set needs to be purposefully generated so as to activate most neurons in the neural network as much as possible and ensure that the model is completely tested integrally.
In deep-learning networks, insufficient or biased training data often causes the algorithm to produce undesirable results in some extreme test environments; extreme test cases expose these vulnerabilities and thereby help ensure the reliability of the algorithm in the actual environment. Accordingly, the safety and reliability of an algorithm model can be revealed by test cases with higher scene coverage and stronger event traversability.
S13: establishing a second test case set with noise based on the first test case set;
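The noise-perturbed second test case set might be sketched as follows; the `add_noise` helper, the Gaussian noise model, and the `sigma` level are assumptions, since the patent does not specify the noise type:

```python
import random

def add_noise(test_cases, sigma=0.05, seed=0):
    """Build a second test case set by perturbing every input of the
    first set with Gaussian noise (sigma is a hypothetical level)."""
    rng = random.Random(seed)
    return [[x + rng.gauss(0.0, sigma) for x in case] for case in test_cases]

first_set = [[0.1, 0.2, 0.3, 0.4], [0.5, 0.1, 0.0, 0.2]]
second_set = add_noise(first_set)
```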
s14: and performing safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting a model behavior differentiation method.
According to the safety and credibility evaluation method for the intelligent algorithm model of the unmanned surface system provided by the embodiment of the invention, the first test case set and the second test case set are input into designated models and the input data of one model is changed, so that the intelligent algorithm model is evaluated by a model behavior differentiation method. The embodiment uses several groups of functionally similar models jointly: changing the input data of one model while the other models keep their current inputs drives that model to make predictions different from the others, elicits more differentiated behavior, and exposes errors near the decision boundaries.
In an optional implementation manner of the embodiment of the present invention, the establishing a first test case set of an intelligent algorithm model of an unmanned surface system includes:
and establishing a first test case set of the intelligent algorithm model of the water surface unmanned system by taking the coverage rate of the maximum neurons as a target.
It will be appreciated that the first test case set is generated with neuron coverage as the quantitative indicator, i.e., the percentage of neurons in the neural network that are activated by the test samples, with the goal of maximizing this coverage.
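The neuron-coverage indicator just described can be sketched as the fraction of neurons activated by at least one test case; the flattened activation layout and the `threshold` parameter below are simplifying assumptions:

```python
def neuron_coverage(activations_per_case, threshold=0.0):
    """Fraction of neurons activated (above threshold) by at least one
    test case. `activations_per_case` holds one flattened activation
    vector per test case, one value per neuron (simplified layout)."""
    n_neurons = len(activations_per_case[0])
    covered = set()
    for acts in activations_per_case:
        for idx, a in enumerate(acts):
            if a > threshold:
                covered.add(idx)
    return len(covered) / n_neurons

# Two test cases over a network with 4 (flattened) neurons:
coverage = neuron_coverage([[0.9, 0.0, 0.0, 0.2],
                            [0.0, 0.7, 0.0, 0.0]])
# neurons 0, 1 and 3 fire at least once, so coverage = 3/4
```

A test-case generator would then keep the cases that raise this fraction, discarding those that activate no new neurons.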
According to the safety and credibility evaluation method of the embodiment of the invention, extreme cases are discovered and mined by maximizing neuron coverage so as to find vulnerabilities in the neural network, achieving a safety and credibility evaluation of the intelligent algorithm with higher scene coverage and stronger event traversability.
Specifically, the performing security credibility evaluation on the intelligent algorithm model of the water surface unmanned system by using a model behavior differentiation method includes:
selecting a target water surface unmanned system intelligent algorithm model from the multiple groups of water surface unmanned system intelligent algorithm models;
inputting the first test case set into other intelligent algorithm models of the water surface unmanned system except the target intelligent algorithm model of the water surface unmanned system to obtain a first output result;
inputting the second test case set into the target water surface unmanned system intelligent algorithm model to obtain a second output result;
and performing safety credible evaluation on the intelligent algorithm model of the water surface unmanned system according to the first output result and the second output result.
Further, when the difference between the first output result and the second output result is smaller than a preset threshold value, the intelligent algorithm model of the water surface unmanned system is judged to pass a safety credible evaluation test.
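The pass criterion above (output difference below a preset threshold) might be sketched as follows; the `passes_evaluation` name, the max-difference metric, and the threshold value are illustrative assumptions, since the patent does not define the difference measure:

```python
def passes_evaluation(first_outputs, second_outputs, threshold=0.1):
    """Model behavior differentiation check (sketch): the target model,
    fed the noisy second set, passes if its outputs stay within the
    preset threshold of the reference models' outputs on the clean set."""
    diffs = [abs(a - b)
             for out1, out2 in zip(first_outputs, second_outputs)
             for a, b in zip(out1, out2)]
    return max(diffs) < threshold

ref = [[0.8, 0.1, 0.1], [0.2, 0.7, 0.1]]   # first output result
tgt = [[0.78, 0.12, 0.10], [0.22, 0.68, 0.10]]  # second output result
# Largest per-class discrepancy is 0.02 < 0.1, so the model passes.
```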
Specifically, the objective function of the optimization problem in the embodiment of the present invention is as follows:
Obj_joint = Σ_{i≠j} ( F_i(x)[c] − F_j(x)[c] )
wherein Obj_joint is the optimization objective, j indexes the target model whose input data is changed, F_i(x)[c] is the output of model i (whose input data is unchanged) for class c, and F_j(x)[c] is the output of the target model for the same class c.
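A minimal sketch of computing this objective, assuming j indexes the target model whose input was changed and i ranges over the remaining models (the function name and argument layout are hypothetical):

```python
def obj_joint(outputs, j, c):
    """Obj_joint = sum over i != j of (F_i(x)[c] - F_j(x)[c]).
    `outputs[i][c]` is model i's score for class c; model j is the
    target model evaluated on the perturbed input."""
    return sum(outputs[i][c] - outputs[j][c]
               for i in range(len(outputs)) if i != j)

# Three models, class c = 0; model j = 2 is the perturbed target.
score = obj_joint([[0.9, 0.1], [0.8, 0.2], [0.3, 0.7]], j=2, c=0)
# (0.9 - 0.3) + (0.8 - 0.3) = 1.1: maximizing this drives the target
# model's prediction away from the other models'.
```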
Fig. 2 is a schematic structural diagram of a security and trust evaluation device for an intelligent algorithm model of an unmanned surface system according to an embodiment of the invention. As shown in fig. 2, the apparatus includes:
the model building unit 21 is used for building a plurality of groups of intelligent algorithm models of the water surface unmanned system;
the first test case set establishing unit 22 is used for establishing a first test case set of the intelligent algorithm model of the water surface unmanned system;
a second test case set creating unit 23 for creating a second test case set with noise based on the test case set;
and the safety credibility evaluation unit 24 is used for carrying out safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting a model behavior differentiation method.
According to the safety and credibility evaluation apparatus for the intelligent algorithm model of the unmanned surface system provided by the embodiment of the invention, the first test case set and the second test case set are input into designated models and the input data of one model is changed, so that the intelligent algorithm model is evaluated by a model behavior differentiation method. The embodiment uses several groups of functionally similar models jointly: changing the input data of one model while the other models keep their current inputs drives that model to make predictions different from the others, elicits more differentiated behavior, and exposes errors near the decision boundaries.
In an optional implementation manner of the embodiment of the present invention, the first test case set creating unit 22 is further configured to:
and establishing a first test case set of the intelligent algorithm model of the water surface unmanned system by taking the maximized neuron coverage rate as a target.
The secure trusted evaluation unit 24 is further configured to:
selecting a target water surface unmanned system intelligent algorithm model from the multiple groups of water surface unmanned system intelligent algorithm models;
inputting the first test case set into other intelligent algorithm models of the water surface unmanned system except the target intelligent algorithm model of the water surface unmanned system to obtain a first output result;
inputting the second test case set into the target water surface unmanned system intelligent algorithm model to obtain a second output result;
and performing safety credible evaluation on the intelligent algorithm model of the water surface unmanned system according to the first output result and the second output result.
Further, when the difference between the first output result and the second output result is smaller than a preset threshold value, the intelligent algorithm model of the water surface unmanned system is judged to pass a safety credibility assessment test.
It should be noted that the safety and credibility evaluation apparatus of the foregoing embodiment can execute the methods of the embodiments described above; a detailed description is therefore omitted.
In summary, the first test case set and the second test case set are input into designated models and the input data of one model is changed, so that the intelligent algorithm model of the unmanned surface system is evaluated for safety and credibility by a model behavior differentiation method. The invention uses several groups of functionally similar models jointly: changing the input data of one model while the other models keep their current inputs drives that model to make predictions different from the others, elicits more differentiated behavior, and exposes errors near the decision boundaries.
The invention discovers and mines extreme cases by maximizing neuron coverage so as to find vulnerabilities in the neural network, achieving a safety and credibility evaluation of the intelligent algorithm with higher scene coverage and stronger event traversability.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may be used with the teachings herein. The required structure for constructing an arrangement of this type will be apparent from the description above. Moreover, the present invention is not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the present invention as described herein, and any descriptions above of specific languages are provided for disclosure of enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules in the devices in an embodiment may be adaptively changed and arranged in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component and furthermore may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, not others, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. It will be appreciated by those skilled in the art that a microprocessor or digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components of the safety and credibility evaluation apparatus according to embodiments of the present invention. The present invention may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
For example, fig. 3 shows a schematic structural diagram of an electronic device according to an embodiment of the invention. The electronic device conventionally comprises a processor 31 and a memory 32 arranged to store computer-executable instructions (program code). The memory 32 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. The memory 32 has a storage space 33 storing program code 34 for performing the method steps shown in fig. 1 and in any of the embodiments. For example, the storage space 33 for storing the program code may comprise respective program codes 34 for implementing the various steps in the above method, respectively. The program code can be read from or written to one or more computer program products. These computer program products comprise a program code carrier such as a hard disk, a Compact Disc (CD), a memory card or a floppy disk. Such a computer program product is typically a computer readable storage medium such as that shown in fig. 4. The computer readable storage medium may have memory segments, memory spaces, etc. arranged similarly to the memory 32 in the electronic device of fig. 3. The program code may be compressed, for example, in a suitable form. Typically, the memory space stores program code 41 for performing the steps of the method according to the invention, i.e. there may be program code, such as read by the processor 31, which, when executed by the electronic device, causes the electronic device to perform the steps of the method as described above.
While the foregoing is directed to embodiments of the present invention, other modifications and variations of the present invention may be devised by those skilled in the art in light of the above teachings. It should be understood by those skilled in the art that the foregoing detailed description is for the purpose of better explaining the present invention, and the scope of the present invention should be determined by the scope of the claims.
Claims (10)
1. A safety credibility evaluation method for an intelligent algorithm model of a water surface unmanned system, characterized by comprising the following steps:
constructing a plurality of groups of intelligent algorithm models of the water surface unmanned system;
establishing a first test case set of an intelligent algorithm model of the water surface unmanned system;
establishing a second test case set with noise based on the first test case set;
and performing safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting a model behavior differentiation method.
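As a rough illustration of the noise-injection step of claim 1 (the claim itself does not specify the noise model), a second test case set could be derived from the first by perturbing each input feature. The Gaussian noise model, the `sigma` value, and the function name below are assumptions for this sketch:

```python
import random

def make_noisy_set(first_set, sigma=0.05, seed=0):
    """Derive a second, noise-perturbed test case set from the first set.

    first_set: list of test cases, each a list of numeric features.
    sigma: standard deviation of the additive Gaussian noise (assumed).
    """
    rng = random.Random(seed)  # fixed seed keeps the noisy set reproducible
    return [[x + rng.gauss(0.0, sigma) for x in case] for case in first_set]
```

A small `sigma` keeps the noisy cases semantically close to the originals, which is what makes the later output comparison meaningful.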
2. The method of claim 1, wherein the establishing a first set of test cases of the intelligent algorithm model of the unmanned surface system comprises:
and establishing a first test case set of the intelligent algorithm model of the water surface unmanned system with the goal of maximizing neuron coverage.
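Neuron coverage, as used in deep-learning testing work such as the survey cited in the non-patent literature, typically counts a neuron as covered once its activation on at least one test case exceeds a threshold; coverage-guided selection then keeps only cases that cover new neurons. A minimal sketch under those assumptions (the 0.5 threshold and function names are illustrative, not from the claims):

```python
def neuron_coverage(activations_per_case, threshold=0.5):
    """Fraction of neurons activated above `threshold` by at least one case.

    activations_per_case: one list of neuron activations per test case.
    """
    n = len(activations_per_case[0])
    covered = [False] * n
    for acts in activations_per_case:
        for i, a in enumerate(acts):
            if a > threshold:
                covered[i] = True
    return sum(covered) / n

def greedy_select(candidates, activations_of, threshold=0.5):
    """Keep a candidate case only if it activates a previously uncovered neuron."""
    selected, covered = [], set()
    for case in candidates:
        newly = {i for i, a in enumerate(activations_of(case)) if a > threshold}
        if newly - covered:  # the case raises coverage, so keep it
            covered |= newly
            selected.append(case)
    return selected
```

The greedy loop is one simple way to drive the first test case set toward maximal neuron coverage; gradient-based test generation is another common choice.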
3. The method of claim 1, wherein the performing safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting the model behavior differentiation method comprises:
selecting a target water surface unmanned system intelligent algorithm model from the multiple groups of water surface unmanned system intelligent algorithm models;
inputting the first test case set into other intelligent algorithm models of the water surface unmanned system except the target intelligent algorithm model of the water surface unmanned system to obtain a first output result;
inputting the second test case set into the target water surface unmanned system intelligent algorithm model to obtain a second output result;
and performing safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system according to the first output result and the second output result.
4. The method according to claim 3, wherein the intelligent algorithm model of the water surface unmanned system is determined to pass the safety credibility evaluation test when the difference between the first output result and the second output result is smaller than a preset threshold.
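The comparison described in claims 3 and 4 can be sketched as follows, assuming models that map an input to a scalar output. Averaging the reference-model outputs into a single first output result, and the default threshold value, are assumptions of this sketch rather than details fixed by the claims:

```python
def differential_evaluate(target_model, reference_models,
                          first_set, second_set, threshold=0.1):
    """Return (passed, difference) for the model behavior differentiation test.

    First output result: the reference models run on the first test case set.
    Second output result: the target model run on the noisy second set.
    first_set and second_set are assumed to be the same length, case for case.
    The target passes when the mean absolute difference stays below threshold.
    """
    first_out = [sum(m(x) for m in reference_models) / len(reference_models)
                 for x in first_set]
    second_out = [target_model(x) for x in second_set]
    diff = sum(abs(a - b) for a, b in zip(first_out, second_out)) / len(first_out)
    return diff < threshold, diff
```

A target model whose outputs on the noisy cases track the reference models' outputs on the clean cases is treated as behaving consistently, and therefore as safe and credible under this test.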
5. A safety credibility evaluation apparatus for an intelligent algorithm model of a water surface unmanned system, characterized by comprising:
the model building unit is used for building a plurality of groups of intelligent algorithm models of the water surface unmanned system;
the first test case set establishing unit is used for establishing a first test case set of the intelligent algorithm model of the water surface unmanned system;
the second test case set establishing unit is used for establishing a second test case set with noise based on the first test case set;
and the safety credibility evaluation unit is used for carrying out safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system by adopting a model behavior differentiation method.
6. The apparatus of claim 5, wherein the first set of test cases setup unit is further configured to:
and establishing a first test case set of the intelligent algorithm model of the water surface unmanned system with the goal of maximizing neuron coverage.
7. The apparatus of claim 5, wherein the safety credibility evaluation unit is further configured to:
selecting a target water surface unmanned system intelligent algorithm model from the multiple groups of water surface unmanned system intelligent algorithm models;
inputting the first test case set into other intelligent algorithm models of the water surface unmanned system except the target intelligent algorithm model of the water surface unmanned system to obtain a first output result;
inputting the second test case set into the target water surface unmanned system intelligent algorithm model to obtain a second output result;
and performing safety credibility evaluation on the intelligent algorithm model of the water surface unmanned system according to the first output result and the second output result.
8. The apparatus of claim 7, wherein the intelligent algorithm model of the water surface unmanned system is determined to pass the safety credibility evaluation test when the difference between the first output result and the second output result is smaller than a preset threshold.
9. An electronic device, characterized in that the electronic device comprises:
a processor; and,
a memory arranged to store computer executable instructions that, when executed, cause the processor to perform the method of any one of claims 1-4.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores one or more programs which, when executed by a processor, implement the method of any of claims 1-4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210628895.8A CN115098862B (en) | 2022-06-06 | 2022-06-06 | Safety and credibility assessment method and device for intelligent algorithm model of unmanned water surface system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115098862A true CN115098862A (en) | 2022-09-23 |
CN115098862B CN115098862B (en) | 2023-12-08 |
Family
ID=83289331
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210628895.8A Active CN115098862B (en) | 2022-06-06 | 2022-06-06 | Safety and credibility assessment method and device for intelligent algorithm model of unmanned water surface system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115098862B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030212678A1 (en) * | 2002-05-10 | 2003-11-13 | Bloom Burton H. | Automated model building and evaluation for data mining system |
CN108615071A (en) * | 2018-05-10 | 2018-10-02 | 阿里巴巴集团控股有限公司 | The method and device of model measurement |
CN111445029A (en) * | 2020-03-30 | 2020-07-24 | 北京嘉楠捷思信息技术有限公司 | Model evaluation method, model evaluation device and computer-readable storage medium |
CN113642622A (en) * | 2021-08-03 | 2021-11-12 | 浙江数链科技有限公司 | Data model effect evaluation method, system, electronic device and storage medium |
CN113762335A (en) * | 2021-07-27 | 2021-12-07 | 北京交通大学 | Intelligent system test data generation method based on uncertainty |
WO2022026022A1 (en) * | 2020-07-31 | 2022-02-03 | Microsoft Technology Licensing, Llc | Model selection and parameter estimation for anomaly detection |
CN114443506A (en) * | 2022-04-07 | 2022-05-06 | 浙江大学 | Method and device for testing artificial intelligence model |
CN114492764A (en) * | 2022-02-21 | 2022-05-13 | 深圳市商汤科技有限公司 | Artificial intelligence model testing method and device, electronic equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
Wang Zan; Yan Ming; Liu Shuang; Chen Junjie; Zhang Dongdi; Wu Zhuo; Chen Xiang: "Survey on Testing of Deep Neural Networks", Journal of Software, no. 05, pages 19-39 *
Also Published As
Publication number | Publication date |
---|---|
CN115098862B (en) | 2023-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10387655B2 (en) | Method, system and product for using a predictive model to predict if inputs reach a vulnerability of a program | |
US20200169585A1 (en) | Network security system with cognitive engine for dynamic automation | |
CN113965404A (en) | Network security situation self-adaptive active defense system and method | |
EP3474174B1 (en) | System and method of adapting patterns of dangerous behavior of programs to the computer systems of users | |
Shin et al. | Intelligent sensor attack detection and identification for automotive cyber-physical systems | |
RU2587429C2 (en) | System and method for evaluation of reliability of categorisation rules | |
CN114579427A (en) | Fuzzing a software system | |
CN107231382A (en) | A kind of Cyberthreat method for situation assessment and equipment | |
US11003772B2 (en) | System and method for adapting patterns of malicious program behavior from groups of computer systems | |
CN115795483A (en) | Software vulnerability detection method based on artificial fish swarm algorithm | |
RU2746685C2 (en) | Cybersecurity system with a differentiated ability to cope with complex cyber attacks | |
CN111159833B (en) | Method and device for evaluating unmanned vehicle algorithm | |
CN113098827B (en) | Network security early warning method and device based on situation awareness | |
CN115098862A (en) | Water surface unmanned system intelligent algorithm model safety credibility evaluation method and device | |
de Santiago et al. | Testing environmental models supported by machine learning | |
CN115038087A (en) | Security assessment method and device for Internet of vehicles | |
CN114692295A (en) | Method and device for determining vehicle performance boundary, terminal equipment and storage medium | |
CN113839963A (en) | Network security vulnerability intelligent detection method based on artificial intelligence and big data | |
Usman et al. | Rule-Based Testing of Neural Networks | |
Usman et al. | Rule-Based Runtime Mitigation Against Poison Attacks on Neural Networks | |
Anandhi et al. | Malware Detection using Dynamic Analysis | |
CN112131582A (en) | SELinux rule generation method and device and electronic equipment | |
Dhonthi et al. | Backdoor mitigation in deep neural networks via strategic retraining | |
Edwards et al. | Identifying Security Vulnerabilities Early in the ECU Software Development Lifecycle | |
CN107679398A (en) | Virtual machine I/O data stream detection method and device, computing device, storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||