CN116562810A - Teaching resource conversion method, system, equipment and medium for capturing production information - Google Patents
- Publication number
- CN116562810A (application number CN202310517936.0A)
- Authority
- CN
- China
- Prior art keywords
- data
- production
- teaching
- production information
- scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/10—Office automation; Time management
- G06Q10/103—Workflow collaboration or project management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/602—Providing cryptographic facilities or services
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/606—Protecting data by securing the transmission between two devices or processes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/20—Education
- G06Q50/205—Education administration or guidance
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The disclosure relates to the field of industry-education integration, in particular to a teaching resource conversion method, system, equipment and medium for capturing production information. The method comprises the following steps: acquiring production information, wherein the production information comprises production scene data and production event data; desensitizing the production information to obtain desensitized data; performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data; and converting the effective data into teaching resources. By capturing and analyzing practical scenes and events in industrial settings, the present disclosure can transform the events, data and information arising in a production process into teaching resources of educational value and provide them to students for learning and practice. The invention can realize the integration of production and teaching, fully mine and utilize industrial practice scenes and events, improve the practicality and fidelity of teaching resources, and promote the improvement of students' practical abilities and skills.
Description
Technical Field
The disclosure relates to the field of production and teaching integration, in particular to a teaching resource conversion method, system, equipment and medium for capturing production information.
Background
The traditional teaching mode is often overly theoretical, struggles to meet the need to cultivate students' practical abilities and skills, and is far removed from the actual requirements of industry. Industry-education integration has therefore become a hot topic in the current education field. However, how to transform industrial practice scenes and events into teaching resources remains an open problem, mainly manifested in the following aspects:
Teaching resources are not updated in a timely manner: traditional teaching resources are expensive to produce, take a long time to build, and are costly to update and maintain, so they lag behind the rapid development of information technology. With the present method, actual production scenes and events can be quickly converted into teaching resources, greatly reducing update and maintenance costs, so that teaching resources can be refreshed quickly and respond to market demand in time.
Teaching resource quality is low: traditional teaching resources lack the support of real scenes, making the expected teaching effect hard to achieve. The present method can use real production scenes and events as teaching resources, making them easier for students to understand and remember while raising their interest and enthusiasm for learning.
Teaching resources are disconnected from actual production: traditional teaching resources are often divorced from real production, so it is difficult for students to truly master the skills and knowledge required on the job. The present method converts production scenes and events into teaching resources, so that students master practical skills and knowledge through practice, achieving the goal of industry-education integration.
Teaching resources lack personalization: traditional teaching resources cannot meet the individual needs of different students, making it difficult to teach students according to their aptitude. With the present method, personalized teaching resources can be produced according to the characteristics and needs of different students, improving learning outcomes and satisfaction. In addition, content unsuitable for public disclosure is sometimes leaked to the network, which can have adverse social effects.
Disclosure of Invention
The present disclosure provides a method, system, apparatus, and medium for converting teaching resources for production information capture, capable of solving at least one of the problems mentioned in the background. In order to solve the technical problems, the present disclosure provides the following technical solutions:
as an aspect of the embodiments of the present disclosure, there is provided a teaching resource conversion method for capturing production information, including the steps of:
acquiring production information, wherein the production information comprises production scene data and production event data;
desensitizing the production information to obtain desensitized data;
performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
and converting the effective data into teaching resources.
Optionally, the production scene data includes manually and/or automatically acquired image data;
the production event data includes event inputs corresponding to the scene time at which the production scene data was generated.
Optionally, after the image data is acquired, the method further includes the steps of:
and (3) carrying out ID marking on the image data according with a standard format, and outputting the image and text data of the standard scene and the event.
Optionally, desensitizing the production information to obtain desensitized data, including the steps of:
encrypting data in the transmission process of uploading the production scene data and the production event data to the server;
after decrypting the encrypted data at the server, partitioning the decrypted data with a security level;
and performing desensitization operation on the classified data according to the security level, wherein the desensitization operation comprises one or more of the following operations: masking, deforming, encrypting or bleaching;
and carrying out data minimization detection on the desensitized data, and if the data minimization standard is not met, carrying out data minimization operation to obtain final desensitized data.
Optionally, performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data, including the following steps:
category analysis: performing specialized and skill classification on the desensitized data;
availability analysis: judging whether the classified data is completely consistent with the scene and event of the existing data, and judging the authenticity and rationality of the classified data; judging whether the classified data can reach the teaching element detection standard or not;
and carrying out value analysis on the data meeting the requirements after the judgment: and the value judgment is completed through the value module so as to obtain effective data.
Optionally, the method converts the effective data into teaching resources, and comprises the following steps:
and carrying out scene conversion and event conversion on the effective data, and correlating the converted scene and event to form standard teaching resources.
Optionally, the scene conversion includes at least one of the following: a non-modeling manner, in which the effective data is directly output in a standard format; a manual modeling manner, which forms a standard scene output; or 3D scan modeling. And/or, associating the converted scenes and events to form a standard teaching resource comprises: splicing the scene and event IDs through time stamps to generate a standard teaching resource, including a teaching image package formed from images and text, or VR images.
As another aspect of an embodiment of the present disclosure, there is provided a teaching resource conversion system for capturing production information, including:
the production information module acquires production information, wherein the production information comprises production scene data and production event data;
the safety processing module is used for desensitizing the production information to obtain desensitized data;
the effective data acquisition module is used for carrying out category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
and the resource conversion module is used for converting the effective data into teaching resources.
As another aspect of the embodiments of the present disclosure, there is provided an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the teaching resource conversion method for capturing production information when executing the computer program.
As another aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the teaching resource conversion method of production information capture described above.
Compared with the prior art, the embodiments of the disclosure desensitize the data, delete content unsuitable for disclosure, and convert the data that meets the standard into teaching resources after category analysis, availability analysis and value analysis, so that the resulting standard teaching resources are safer, more practical, and better targeted. Embodiments of the present disclosure can also, by capturing and analyzing practical scenes and events in industrial settings, convert the events, data and information arising in the production process into teaching resources of educational value and provide them to students for learning and practice. The invention can realize the integration of production and teaching, fully mine and utilize industrial practice scenes and events, improve the practicality and fidelity of teaching resources, and promote the improvement of students' practical abilities and skills.
Drawings
FIG. 1 is a flow chart of a teaching resource conversion method of production information capture in embodiment 1 of the present disclosure;
FIG. 2 is a flowchart of the steps for desensitizing the production information to desensitized data in accordance with example 1 of the present disclosure;
fig. 3 is a schematic block diagram of a teaching resource conversion system for production information capture in embodiment 1 of the present disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the disclosure will be described in detail below with reference to the drawings. In the drawings, like reference numbers indicate identical or functionally similar elements. Although various aspects of the embodiments are illustrated in the accompanying drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association between associated objects, meaning that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist together, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B and C" may mean including any one or more elements selected from the group consisting of A, B and C.
Furthermore, numerous specific details are set forth in the following detailed description in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements, and circuits well known to those skilled in the art have not been described in detail in order not to obscure the present disclosure.
It will be appreciated that the above-mentioned method embodiments of the present disclosure may be combined with each other to form combined embodiments without departing from the principle and logic; details are not repeated in this disclosure for the sake of brevity.
In addition, the disclosure further provides a teaching resource conversion system, an electronic device, a computer-readable storage medium and a program for capturing production information, all of which can be used to implement any teaching resource conversion method for capturing production information provided in the disclosure; for the corresponding technical solutions and descriptions, refer to the method sections, which are not repeated here.
The execution subject of the teaching resource conversion method for capturing production information may be a computer or other device capable of implementing teaching resource conversion for capturing production information, for example, the method may be executed by a terminal device or a server or other processing device, where the terminal device may be a User Equipment (UE), a mobile device, a User terminal, a cellular phone, a cordless phone, a personal digital assistant (Personal Digital Assistant, PDA), a handheld device, a computing device, an in-vehicle device, a wearable device, or the like. In some possible implementations, the tutorial resource transformation method of production information capture may be implemented by way of a processor invoking computer readable instructions stored in a memory.
Example 1
As an aspect of the embodiments of the present disclosure, there is provided a teaching resource conversion method for capturing production information, as shown in fig. 1, including the steps of:
s10, acquiring production information, wherein the production information comprises production scene data and production event data;
s20, desensitizing the production information to obtain desensitized data;
s30, performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
s40, converting the effective data into teaching resources.
Based on this configuration, the embodiments of the disclosure can desensitize the data, delete content unsuitable for disclosure, and convert data that meets the standard into teaching resources after category analysis, availability analysis and value analysis, so that the resulting standard teaching resources are safer, more practical, and better targeted.
The steps of the embodiments of the present disclosure are described in detail below, respectively.
S10, acquiring production information, wherein the production information comprises production scene data and production event data;
wherein, the production scene data comprises image data acquired manually and/or automatically; preferably, after the image data is acquired, the method further comprises the steps of:
and (3) carrying out ID marking on the image data according with a standard format, and outputting the image and text data of the standard scene and the event.
For example, capture may be performed by cameras, video cameras and the like, and includes: 1) panoramic and continuous-shot ("one-take") image material of the production scene; 2) 360-degree close-up image material of characteristic objects, based on the characteristic libraries of the various subjects present in the production scene. After capture, the material is uploaded in a standard format, and the system inspects it against the material acquisition standard, for example: pixel requirements for panoramic photos, and the angle and quantity standards for close-up material acquisition; image material that meets the standard is then marked with an ID. Related algorithms, such as those in OpenCV, can also be used to capture images of the production scene, including panoramas and characteristic objects, with the same standard-format upload and inspection applied. The production scene image is first filtered to remove noise, using any of a Gaussian, mean or median filter. Next, the irrelevant background in the production scene image is removed: a background image is taken first, the image containing the detection target is differenced against the background image, and what remains after removing the background is the detection target. The detection target can also be obtained with a cascade classifier based on Haar features, which is a series of weak classifiers; using an integral image, the required area is computed by adding and subtracting the areas of rectangles formed from the origin to each point.
Because the calculation formula is fixed, this method is not affected by enlargement or reduction of the image. Object tracking is then performed: to track a region of interest (ROI), points are chosen according to the color histogram and the spatial centroid is calculated. If the centroid is at the center of the region, the object is not moving; if the centroid is not at the center, the object is moving in some direction. A limitation is that the bounding box size is not allowed to change. In this way, the region of interest can be taken as the captured production scene image.
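The background-subtraction and centroid-tracking logic described above can be sketched in a few lines. This is a simplified pure-Python illustration on grayscale pixel grids; a real implementation would use OpenCV, and all function names, thresholds and the ROI convention here are illustrative assumptions.

```python
def subtract_background(frame, background, threshold=30):
    """Return a binary mask of pixels that differ from the background image."""
    h, w = len(frame), len(frame[0])
    return [[1 if abs(frame[y][x] - background[y][x]) > threshold else 0
             for x in range(w)] for y in range(h)]

def centroid(mask):
    """Spatial centroid (x, y) of the foreground pixels, or None if empty."""
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))

def is_moving(mask, roi, tol=0.5):
    """The object is considered moving if the centroid drifts away from the
    center of the region of interest (roi = (x, y, width, height))."""
    c = centroid(mask)
    if c is None:
        return False
    cx, cy = roi[0] + roi[2] / 2, roi[1] + roi[3] / 2
    return abs(c[0] - cx) > tol or abs(c[1] - cy) > tol
```

As the text notes, this fixed-formula approach does not depend on image scale, but the bounding box size cannot change between frames.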
The production event data includes event inputs corresponding to the scene time at which the production scene data was generated. Production event data is typically captured manually; event collection is performed through a standard input system, with fields such as: time of occurrence, expertise, skill class, event class, and a chronological statement of the event. Meanwhile, the corresponding logs (standard format), videos (standard format) and monitoring records (standard format) of the event must be uploaded for verification. The event data carries the same ID as the scene image data, so the two can be recognized as the same material, and the time axis of the event details must be aligned with the scene time axis, including persons, objects, information and the like.
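The standard event input and its two validation rules (same ID as the scene material, timestamp aligned with the scene time axis) can be sketched as follows; the field names are illustrative assumptions, not the patent's normative schema.

```python
from dataclasses import dataclass

@dataclass
class EventRecord:
    material_id: str   # must match the scene image data ID
    occurred_at: float # seconds on the shared scene/event time axis
    expertise: str
    skill_class: str
    event_class: str
    statement: str     # chronological event statement

def validate_event(event, scene_id, scene_start, scene_end):
    """Accept an event only if its ID matches the scene material and its
    timestamp falls within the scene's time axis."""
    if event.material_id != scene_id:
        return False, "ID mismatch: not the same material"
    if not (scene_start <= event.occurred_at <= scene_end):
        return False, "event time not aligned with scene time axis"
    return True, "ok"
```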
S20, desensitizing the production information to obtain desensitized data;
the desensitization of the production information to obtain desensitized data, as shown in fig. 2, includes the following steps:
s201, data encryption is carried out in the transmission process of uploading production scene data and production event data to a server; the data encryption is needed in the transmission process from the raw material uploading end to the processing end, and encryption algorithms including symmetry, asymmetry, hash and the like can be adopted.
S202, after decrypting the encrypted data at the server, the decrypted data is partitioned by security level. After the server receives the data and completes decryption, it partitions the data into three security levels: high, medium and low.
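The three-level partitioning can be sketched as a rule-driven bucketing of record fields. The keyword lists below are illustrative placeholders; a real system would apply the organization's own classification policy.

```python
HIGH, MEDIUM, LOW = "high", "medium", "low"

# Hypothetical field-name rules standing in for a real classification policy.
HIGH_KEYWORDS = {"id_number", "salary", "password"}
MEDIUM_KEYWORDS = {"name", "phone", "workstation"}

def security_level(field_name: str) -> str:
    if field_name in HIGH_KEYWORDS:
        return HIGH
    if field_name in MEDIUM_KEYWORDS:
        return MEDIUM
    return LOW

def partition(record: dict) -> dict:
    """Split one decrypted record into high/medium/low security buckets."""
    buckets = {HIGH: {}, MEDIUM: {}, LOW: {}}
    for k, v in record.items():
        buckets[security_level(k)][k] = v
    return buckets
```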
S203, a desensitization operation is performed on the partitioned data according to its security level, the desensitization operation comprising one or more of: masking, deforming, encrypting or bleaching. The key information in the data is desensitized using desensitization techniques or desensitization equipment, including masking, deforming, encrypting, bleaching and the like.
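Two of the desensitization operations named above, masking and deforming, can be sketched as follows (a minimal illustration; real desensitization equipment would apply far richer rules, and the scaling factor here is an arbitrary example).

```python
def mask(value: str, keep_head: int = 1, keep_tail: int = 1) -> str:
    """Masking: keep the ends of the string, replace the middle with '*'."""
    if len(value) <= keep_head + keep_tail:
        return "*" * len(value)
    hidden = len(value) - keep_head - keep_tail
    return value[:keep_head] + "*" * hidden + value[-keep_tail:]

def deform(value: float, factor: float = 1.1) -> float:
    """Deforming: perturb a numeric value so the original cannot be read back."""
    return round(value * factor, 2)

def desensitize(record: dict, rules: dict) -> dict:
    """Apply a per-field desensitization rule; fields without a rule pass through."""
    return {k: rules.get(k, lambda v: v)(v) for k, v in record.items()}
```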
S204, data minimization detection is performed on the desensitized data; if the data minimization standard is not met, a data minimization operation is performed to obtain the final desensitized data. The desensitized data is checked against the minimization model standard required for teaching resource conversion; for example, for a software development production requirement, the embodiment of the disclosure can check data minimization against the minimum-necessary principle of requirements investigation, and perform a data minimization operation if the principle is not satisfied.
The minimization model means reducing the material to the minimum needed for teaching use. For example, if a wrench and a screwdriver appear in a picture or video of a room but are not necessary for teaching, these objects are removed during minimization detection. For the initial minimization model, a minimization model library is built according to industry characteristics, and the model is then trained by manual labeling.
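The wrench-and-screwdriver example above amounts to checking detected objects against a per-subject teaching whitelist; the sketch below assumes a hypothetical whitelist library standing in for the trained minimization model.

```python
# Hypothetical minimization model library: per subject, the only objects
# needed for teaching. A real system would train this per industry.
TEACHING_WHITELIST = {
    "network_ops": {"switch", "router", "patch_panel", "cable"},
}

def minimization_check(objects, subject):
    """Detect whether the material already meets the minimization standard."""
    allowed = TEACHING_WHITELIST.get(subject, set())
    extras = [o for o in objects if o not in allowed]
    return len(extras) == 0, extras

def minimize(objects, subject):
    """Minimization operation: drop objects unnecessary for teaching."""
    allowed = TEACHING_WHITELIST.get(subject, set())
    return [o for o in objects if o in allowed]
```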
S30, performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
In this embodiment, the category analysis includes the following: based on the classification information entered at the capture end, the desensitized non-standard data is classified by profession and skill;
in this embodiment, the usability analysis includes the following:
a. Judging whether the classified data coincides with existing data, i.e., whether it is exactly the same scene or exactly the same event;
b. Judging the authenticity and rationality of the classified data; the authenticity judgment is completed by training an authenticity model. For example, in a financial-industry data center, the appearance of an AP would be considered unreasonable. After the machine judgment is completed, a manual review is added.
The authenticity model is a basic model formed from industry and professional technical principles, the rationality of the appearance of characteristic objects, empirical data and the like, and is used to judge the authenticity and rationality of the material. After the initial model is formed, the authenticity model is refined with manually labeled or automatically labeled data. For example, when HSRP is used to implement network high availability, the theoretical convergence time of the network is no less than 3 seconds; if the event material claims a convergence time of 1 s with HSRP, the event is marked as unreal.
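The HSRP example above is a rule drawn from technical principles; a basic authenticity model can be sketched as such a rule base. The rule set and event schema here are illustrative assumptions.

```python
# Illustrative rule base built from technical principles: e.g., HSRP timers
# make a sub-3-second convergence claim implausible.
RULES = {
    ("HSRP", "convergence_time_s"): lambda v: v >= 3,
}

def authenticity_check(event: dict):
    """Flag an event as unreal if it violates any principle-based rule."""
    for (tech, field), rule in RULES.items():
        if event.get("technology") == tech and field in event:
            if not rule(event[field]):
                return False, f"{field}={event[field]} violates {tech} principle"
    return True, "plausible"
```

In the described workflow a manual review would follow this machine judgment.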
c. Judging whether the data meets the conversion conditions: teaching element detection is performed on the classified data (implemented according to a teaching element model), and items whose difference falls below the threshold are reported. The input of the teaching element model is the material that has passed the authenticity judgment; data that fails the teaching element detection is recorded and fed back, while data that passes is processed for further conversion into teaching resources. Classification is based on the standard text fields of the event data, such as industry, expertise, and the like. The teaching element model can be trained through manual data labeling.
In this embodiment, the value analysis includes the following: the value judgment is completed by training a value model on indicators such as the occurrence probability of the event category, the duplicate-check rate of the event content, and the duplicate-check rate of the equipment and technology involved in the event, with the judgment made through a series of set and association algorithms.
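The duplicate-check rate and the combination of indicators can be sketched with a set-based similarity measure. The Jaccard measure and the weights below are illustrative choices, not the patent's specified model.

```python
def jaccard(a: str, b: str) -> float:
    """Set-based word overlap between two event texts."""
    sa, sb = set(a.split()), set(b.split())
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def duplicate_rate(candidate: str, corpus) -> float:
    """Highest similarity of the candidate against existing teaching events."""
    return max((jaccard(candidate, doc) for doc in corpus), default=0.0)

def value_score(category_probability, dup_rate, w_rarity=0.6, w_novelty=0.4):
    # Rarer event categories and less-duplicated content score higher.
    return w_rarity * (1 - category_probability) + w_novelty * (1 - dup_rate)
```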
S40, converting the effective data into teaching resources;
wherein, 1. Scene conversion includes the following:
a. Without modeling, standard output is performed directly through the standard output format of the teaching material end;
b. A standard scene model, such as VR, is formed by manual modeling;
c. Modeling is performed by 3D scanning techniques.
2. Event conversion
a. Conversion to standard event output is performed through standard text formats.
b. The event is reproduced through various simulation devices, real equipment, and software systems to form standard event output.
c. The event is reproduced through a conversational modeling component, such as ChatGPT, to form standard event output.
d. Existing events are recombined through various machine learning and association analysis algorithms to form new simulated events, which are output as standard events.
3. The scene and the event are associated to form a standard teaching resource. Scenes and events are spliced by their IDs and aligned by timestamp; the generated formats include teaching image packages (images and text record entries) and VR images, and the covered categories include design, implementation, operation and maintenance, and management.
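The ID-and-timestamp splicing in step 3 can be sketched as a join followed by a time sort. This is a minimal sketch under assumed record layouts (`id`, `timestamp`, `image`, `text` fields are hypothetical); the real system would emit full teaching image packages rather than dictionaries.

```python
# Join scene records with event records sharing the same ID, ordered by
# the event timestamps so the event timeline follows the scene timeline.
def splice(scenes: list[dict], events: list[dict]) -> list[dict]:
    by_id = {s["id"]: s for s in scenes}
    resources = []
    for ev in sorted(events, key=lambda e: e["timestamp"]):
        scene = by_id.get(ev["id"])
        if scene is not None:  # only events with a matching scene are kept
            resources.append({"id": ev["id"], "scene": scene["image"],
                              "event": ev["text"],
                              "timestamp": ev["timestamp"]})
    return resources
```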
By capturing and analyzing practical scenes and events in industrial practice, embodiments of the present disclosure convert the events, data, and information generated in production into teaching resources with educational value for students to learn from and practice with. The invention thus fuses production with teaching, fully mines and utilizes industrial practice scenes and events, improves the practicality and fidelity of teaching resources, and promotes the improvement of students' practical ability and skills.
Example 2
As another aspect of an embodiment of the present disclosure, there is provided a teaching resource conversion system 100 for production information capturing, as shown in fig. 3, including:
the method comprises the steps that a production information module obtains 1 and production information, wherein the production information comprises production scene data and production event data;
the safety processing module 2 desensitizes the production information to obtain desensitized data;
the effective data acquisition module 3 is used for carrying out category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
and the resource conversion module 4 converts the effective data into teaching resources.
Based on this configuration, the embodiment of the disclosure can desensitize the data, delete content unsuitable for disclosure, and convert data meeting the standard into teaching resources after category analysis, availability analysis, and value analysis, so that the resulting standard teaching resources are more secure, closer to real practice, and better targeted.
The steps of the embodiments of the present disclosure are described in detail below, respectively.
In the production information acquisition module 1, the production scene data comprises image data acquired manually and/or automatically; preferably, after the image data is acquired, the method further comprises the steps of:
performing ID marking on image data that conforms to the standard format, and outputting image and text data of the standard scene and event.
For example, capture by camera or video camera includes: 1) panoramic and continuous-shot ("one mirror to the end") image material of the production scene; and 2) 360-degree close-up image material of characteristic objects, based on the subject characteristic libraries for the production scene. After capture, the material is uploaded in a standard format, and the system performs material inspection against the acquisition standard, for example the pixel requirements for panoramic photos and the angle and quantity standards for close-up material; image material meeting the standard is marked with an ID. Image capture of production scenes, including panoramas and characteristic objects, can also be performed with related algorithms such as OpenCV, followed by the same standard-format upload, material inspection, and ID marking.
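The inspection-then-mark step can be sketched as a resolution check followed by ID issuance. This is a hypothetical sketch: the minimum panorama resolution and the `SCENE-` ID scheme are placeholders, not values from the patent.

```python
# Material inspection sketch: verify a panoramic image's pixel dimensions
# against an assumed acquisition standard and issue an ID only for
# conforming material.
import uuid

MIN_WIDTH, MIN_HEIGHT = 3840, 1920  # assumed panorama pixel requirement

def inspect_and_mark(width: int, height: int):
    """Return (ok, material_id); the ID is issued only when the image conforms."""
    if width < MIN_WIDTH or height < MIN_HEIGHT:
        return False, None
    return True, f"SCENE-{uuid.uuid4().hex[:8]}"
```

A capture pipeline built on OpenCV would read the dimensions from the decoded frame before calling a check like this.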
The production event data includes event inputs corresponding to the scene time generated by the production scene data. Production event data is typically captured manually; event collection is performed through a standard input system covering, for example: time of occurrence, expertise, skill class, event class, and a chronological event statement. The corresponding logs, videos, and monitoring records (all in standard formats) must also be uploaded for verification. The event data carries the same ID as the scene image data, so the two are judged to be the same material, and the event timeline must be aligned with the scene timeline, including persons, objects, information, and the like.
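The two verification rules stated here, same ID and timeline alignment, can be sketched as a single predicate. The record fields (`id`, `start`, `end`, `timestamps`) are illustrative assumptions.

```python
# An event record is accepted only when its ID matches a captured scene and
# every event timestamp falls inside the scene's time span.
def is_aligned(scene: dict, event: dict) -> bool:
    if scene["id"] != event["id"]:
        return False  # not the same material
    return all(scene["start"] <= t <= scene["end"]
               for t in event["timestamps"])
```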
In the security processing module 2, the production information is desensitized to obtain desensitized data, as shown in fig. 2, including the following steps:
encrypting data during transmission when the production scene data and the production event data are uploaded to the server; data must be encrypted in transit from the raw material uploading end to the processing end, and encryption algorithms including symmetric, asymmetric, and hash algorithms can be adopted.
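Of the three algorithm families named, the hash family is the simplest to sketch with the standard library; this sketch shows a SHA-256 integrity check on an uploaded payload. It is not the patent's transport design: a real deployment would add actual encryption in transit (e.g., TLS or AES), which is out of scope for a stdlib-only sketch.

```python
# Integrity check for uploaded material using the hash family (SHA-256).
import hashlib

def fingerprint(payload: bytes) -> str:
    """Digest computed at the uploading end and sent with the material."""
    return hashlib.sha256(payload).hexdigest()

def verify(payload: bytes, expected: str) -> bool:
    """Recomputed at the processing end; a mismatch means tampering or corruption."""
    return fingerprint(payload) == expected
```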
after decrypting the encrypted data at the server, partitioning the decrypted data by security level; once the server receives and decrypts the data, a partitioning operation divides it into three security levels: high, medium, and low.
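A three-level partition can be sketched as bucketed routing. The keyword-based classification rule and the marker field names are assumptions for illustration; the text does not specify how records map to levels.

```python
# Route decrypted records into the three security levels from the text.
HIGH_MARKERS = {"password", "id_number"}    # assumed high-sensitivity fields
MEDIUM_MARKERS = {"name", "phone"}          # assumed medium-sensitivity fields

def partition(records: list[dict]) -> dict:
    buckets = {"high": [], "medium": [], "low": []}
    for rec in records:
        keys = set(rec)
        if keys & HIGH_MARKERS:
            buckets["high"].append(rec)
        elif keys & MEDIUM_MARKERS:
            buckets["medium"].append(rec)
        else:
            buckets["low"].append(rec)
    return buckets
```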
and performing a desensitization operation on the classified data according to its security level, the desensitization operation comprising one or more of the following: masking, deforming, encrypting, or bleaching; the key information in the data is desensitized using desensitization technologies or equipment including masking, deforming, encrypting, and bleaching.
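Of the four operations, masking is the easiest to show concretely; this sketch blanks the middle digits of an 11-digit phone number. The regex pattern is an assumption (one common masking rule), and deforming, encrypting, and bleaching would be separate passes.

```python
# Masking sketch: replace the middle four digits of an 11-digit phone
# number with asterisks, leaving other text untouched.
import re

def mask_phone(text: str) -> str:
    return re.sub(r"\b(\d{3})\d{4}(\d{4})\b", r"\1****\2", text)
```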
and performing data minimization detection on the desensitized data; if the data minimization standard is not met, a data minimization operation is performed to obtain the final desensitized data. The desensitized data is checked against the minimization model standard required for teaching resource conversion; for example, for a software development production requirement, the embodiment of the disclosure can detect data minimization against the minimum-necessary principle of requirements investigation, and if that principle is not satisfied, a data minimization operation is performed.
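The detect-then-minimize loop can be sketched as a field whitelist. The minimal field set for a software-development requirement record is a hypothetical example; the patent's minimization model standard is not specified.

```python
# Minimization sketch: a record is minimal when it carries only the fields
# needed for teaching conversion; minimize() drops everything else.
MINIMAL_FIELDS = {"requirement_id", "description", "acceptance_criteria"}

def is_minimal(record: dict) -> bool:
    return set(record) <= MINIMAL_FIELDS

def minimize(record: dict) -> dict:
    return {k: v for k, v in record.items() if k in MINIMAL_FIELDS}
```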
In the effective data acquisition module 3, the category analysis includes the following: based on the classification information input at the capturing end, the desensitized non-standard data is classified by specialty and skill.
In the effective data acquisition module 3, the availability analysis includes the following:
a. Judging whether the classified data is completely consistent with existing data, i.e., whether it is exactly the same scene or exactly the same event.
b. Judging the authenticity and rationality of the classified data; the authenticity judgment is completed through authenticity model training. For example, in a data center in the financial industry, the appearance of an AP is considered unreasonable. After the machine judgment is completed, a manual review is added.
c. Whether conversion conditions are met: teaching element detection is performed on the classified data (implemented by the teaching element model), and missing or deviating items below the threshold are reported.
In this embodiment, the value analysis includes the following: value judgment is completed through value model training on signals such as event category occurrence probability, event content duplicate-check rate, and the duplicate-check rate of the equipment and technologies involved, judged through a series of set and association algorithms.
In the resource conversion module 4, 1. Scene conversion includes the following:
a. without modeling: the data is output directly in the standard output format of the teaching material end;
b. manual modeling: a standard scene model, such as a VR scene, is formed;
c. 3D scan modeling: the scene is modeled through 3D scanning techniques.
2. Event conversion
a. Conversion to standard event output is performed through standard text formats.
b. The event is reproduced through various simulation devices, real equipment, and software systems to form standard event output.
c. The event is reproduced through a conversational modeling component, such as ChatGPT, to form standard event output.
d. Existing events are recombined through various machine learning and association analysis algorithms to form new simulated events, which are output as standard events.
3. The scene and the event are associated to form a standard teaching resource. Scenes and events are spliced by their IDs and aligned by timestamp; the generated formats include teaching image packages (images and text record entries) and VR images, and the covered categories include design, implementation, operation and maintenance, and management.
By capturing and analyzing practical scenes and events in industrial practice, embodiments of the present disclosure convert the events, data, and information generated in production into teaching resources with educational value for students to learn from and practice with. The invention thus fuses production with teaching, fully mines and utilizes industrial practice scenes and events, improves the practicality and fidelity of teaching resources, and promotes the improvement of students' practical ability and skills.
Example 3
An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, the processor implementing the teaching resource conversion method of production information capture in embodiment 1 when the computer program is executed.
Embodiment 3 of the present disclosure is merely an example, and should not be construed as limiting the functionality and scope of use of the embodiments of the present disclosure.
The electronic device may be in the form of a general purpose computing device, which may be a server device, for example. Components of an electronic device may include, but are not limited to: at least one processor, at least one memory, a bus connecting different system components, including the memory and the processor.
The buses include a data bus, an address bus, and a control bus.
The memory may include volatile memory such as Random Access Memory (RAM) and/or cache memory, and may further include Read Only Memory (ROM).
The memory may also include program means having a set (at least one) of program modules including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
The processor executes various functional applications and data processing by running computer programs stored in the memory.
The electronic device may also communicate with one or more external devices (e.g., a keyboard or pointing device); such communication may occur through an input/output (I/O) interface. The electronic device may also communicate with one or more networks, such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet, through a network adapter, which communicates with the other modules of the electronic device via the bus. Although not shown, other hardware and/or software modules may be used with the electronic device, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID (disk array) systems, tape drives, and data backup storage systems.
It should be noted that although several units/modules or sub-units/modules of an electronic device are mentioned in the above detailed description, such a division is merely exemplary and not mandatory. Indeed, the features and functionality of two or more units/modules described above may be embodied in one unit/module according to embodiments of the present application. Conversely, the features and functions of one unit/module described above may be further divided into ones that are embodied by a plurality of units/modules.
Example 4
A computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the teaching resource conversion method of production information capture in embodiment 1.
More specifically, among others, readable storage media may be employed including, but not limited to: portable disk, hard disk, random access memory, read only memory, erasable programmable read only memory, optical storage device, magnetic storage device, or any suitable combination of the foregoing.
In a possible embodiment, the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps of the teaching resource conversion method implementing the production information capture described in example 1, when said program product is run on the terminal device.
Wherein the program code for carrying out the present disclosure may be written in any combination of one or more programming languages, which program code may execute entirely on the user device, partly on the user device, as a stand-alone software package, partly on the user device, partly on the remote device or entirely on the remote device.
Although embodiments of the present disclosure have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the disclosure, the scope of which is defined in the appended claims and their equivalents.
Claims (10)
1. The teaching resource conversion method for capturing production information is characterized by comprising the following steps:
acquiring production information, wherein the production information comprises production scene data and production event data;
desensitizing the production information to obtain desensitized data;
performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
and converting the effective data into teaching resources.
2. The method for converting teaching resources for capturing production information according to claim 1, wherein the production scene data includes image data acquired manually and/or automatically;
the production event data includes event inputs corresponding to a scene time generated by the production scene data.
3. The teaching resource conversion method for capturing production information according to claim 1 or 2, characterized by further comprising the steps of, after acquiring the image data:
and (3) carrying out ID marking on the image data according with a standard format, and outputting the image and text data of the standard scene and the event.
4. The teaching resource conversion method for capturing production information according to claim 1, wherein desensitizing the production information to obtain desensitized data comprises the steps of:
encrypting data in the transmission process of uploading the production scene data and the production event data to the server;
after decrypting the encrypted data at the server, partitioning the decrypted data with a security level;
and performing desensitization operation on the classified data according to the security level, wherein the desensitization operation comprises one or more of the following operations: masking, deforming, encrypting or bleaching;
and carrying out data minimization detection on the desensitized data, and if the data minimization standard is not met, carrying out data minimization operation to obtain final desensitized data.
5. The teaching resource conversion method for capturing production information according to any one of claims 1 to 2 and 4, wherein performing category analysis, availability analysis and value analysis on the desensitized data to obtain effective data comprises the steps of:
category analysis: performing specialized and skill classification on the desensitized data;
availability analysis: judging whether the classified data is completely consistent with the scene and event of the existing data, and judging the authenticity and rationality of the classified data; judging whether the classified data can reach the teaching element detection standard or not;
and performing value analysis on the data meeting the requirements after the above judgments: value judgment is completed through the value module to obtain effective data.
6. The teaching resource conversion method for capturing production information according to any one of claims 1-2 and 4, wherein the method for converting the effective data into teaching resources comprises the steps of:
and carrying out scene conversion and event conversion on the effective data, and correlating the converted scene and event to form standard teaching resources.
7. The method for converting educational resources for capturing production information of claim 6, wherein said scene conversion comprises at least one of the following: non-modeling manner: directly outputting the effective data in a standard format; or, manual modeling mode: forming a standard scene output; or, 3D scan modeling; and/or associating the transformed scenes and events to form a standard teaching resource comprises: and splicing the scene and the event ID through the time stamp to generate a standard teaching resource including a teaching mirror image package or VR images formed by images and texts.
8. A teaching resource conversion system for capturing production information, comprising:
the production information module acquires production information, wherein the production information comprises production scene data and production event data;
the safety processing module is used for desensitizing the production information to obtain desensitized data;
the effective data acquisition module is used for carrying out category analysis, availability analysis and value analysis on the desensitized data to obtain effective data;
and the resource conversion module is used for converting the effective data into teaching resources.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the teaching resource conversion method of production information capture of any of claims 1 to 7 when the computer program is executed by the processor.
10. A computer-readable storage medium having stored thereon a computer program, wherein the program when executed by a processor implements the teaching resource conversion method of production information capture of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310517936.0A CN116562810A (en) | 2023-05-10 | 2023-05-10 | Teaching resource conversion method, system, equipment and medium for capturing production information |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116562810A true CN116562810A (en) | 2023-08-08 |
Family
ID=87489263
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310517936.0A Pending CN116562810A (en) | 2023-05-10 | 2023-05-10 | Teaching resource conversion method, system, equipment and medium for capturing production information |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116562810A (en) |
- 2023-05-10 CN CN202310517936.0A patent/CN116562810A/en active Pending
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||