CN115841670A - Operation error question collecting system based on image recognition - Google Patents


Info

Publication number
CN115841670A
Authority
CN
China
Prior art keywords: outline, character, text, characters, unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310102859.2A
Other languages
Chinese (zh)
Other versions
CN115841670B (en)
Inventor
苏楠明 (Su Nanming)
梁城栋 (Liang Chengdong)
黄富强 (Huang Fuqiang)
陈建勇 (Chen Jianyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Luming Education Technology Co., Ltd.
Original Assignee
Fujian Luming Education Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Luming Education Technology Co., Ltd.
Priority to CN202310102859.2A
Publication of CN115841670A
Application granted
Publication of CN115841670B
Legal status: Active
Anticipated expiration

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to the field of image data processing, in particular to an operation error question collecting system based on image recognition.

Description

Operation error question collecting system based on image recognition
Technical Field
The invention relates to the field of image data processing, in particular to an operation error question collecting system based on image recognition.
Background
With the continuous progress of image recognition technology, related techniques have been applied in many fields, especially in office and education scenarios such as document scanning, document recognition and homework correction.
Chinese patent publication No. CN111242045A discloses a method and system for automatically indicating right and wrong answers in exercises. The method is as follows: collect, within a colored reference frame, the image to be judged of a workbook or test paper awaiting correction; compare the image to be judged with a standard exercise set to find the standard exercise image with the highest matching degree, where each standard exercise image contains the standard answer characters, positions and sizes of all of its answer blocks; obtain from the image to be judged a sub-answer image for each answer block, recognize the answer characters in it through a standard function, compare them with the standard answer characters of the corresponding answer block in the standard exercise image, and thereby determine whether the answer in each answer block of the image to be judged is right or wrong; and project a colored check or cross mark onto the workbook or test paper at the position corresponding to each answer block. That invention achieves automatic, high-speed indication of right and wrong answers.
However, the prior art still has a problem: when the outline pattern of a character is unclear or difficult to recognize, the phrase composition of the unrecognized character is not inferred from its adjacent characters, so the recognition rate of such characters is not improved.
Disclosure of Invention
In order to solve the problem of low character recognition accuracy in the prior art, the invention provides an operation error question collecting system based on image recognition, which comprises:
the database module comprises a first storage unit and a second storage unit, wherein the first storage unit is used to store the association relations among characters, and the second storage unit is used to store a plurality of outline patterns, each outline pattern having a pre-established correspondence with a character;
the data acquisition module is connected with the user side and used for receiving the picture information sent by the user side;
a data processing module comprising a first data comparison unit, a second data comparison unit, a first analysis unit and a second analysis unit, wherein
the first data comparison unit is connected to the data acquisition module and is used to receive the picture information, extract the character outlines of the answer in the picture information, and screen the character outlines based on the definition parameter of each character outline;
the second data comparison unit is connected to the first data comparison unit and the database module respectively, and is used to calculate the overlap degree between each character outline screened by the first data comparison unit and each outline pattern stored in the database module, and to compare each overlap degree with an overlap degree comparison threshold or a corrected overlap degree comparison threshold so as to obtain an overlap degree comparison result between the character outline and each outline pattern;
the first analysis unit is connected to the second data comparison unit and the database module respectively, and is used to determine, under a first overlap degree comparison result, the character corresponding to a character outline based on the ranking of the overlap degrees between the character outline and the outline patterns, and to generate text information from all the determined characters;
the second analysis unit is connected to the second data comparison unit and the database module respectively, and is used to determine, under a second overlap degree comparison result, the characters corresponding to the character outlines adjacent to the character outline, determine the outline patterns corresponding to the associated characters of each such character, and send these outline patterns to the second data comparison unit, so that the second data comparison unit obtains a new overlap degree comparison result between the character outline and the received outline patterns after correcting the overlap degree comparison threshold;
and the proofreading module is connected to the first analysis unit and is used to compare the text information generated by the first analysis unit with preset reference answer information and to judge whether the text information is wrong.
Further, the first overlap degree comparison result is that the overlap degree between the character outline and at least one outline pattern is greater than or equal to the overlap degree comparison threshold or the corrected overlap degree comparison threshold;
the second overlap degree comparison result is that the overlap degrees between the character outline and all outline patterns are smaller than the overlap degree comparison threshold or the corrected overlap degree comparison threshold.
Further, the first data comparison unit calculates the definition parameter D of each character outline according to formula (1),

D = f(S, S0, C, C0)   (1)  [formula image not reproduced in source]

in formula (1), S represents the area of the character outline, S0 represents the mean area of all character outlines, C represents the chroma value of the character outline, and C0 represents the mean chroma value of all character outlines.
Further, the first data comparison unit compares the definition parameter D of a character outline with a preset definition comparison parameter D1 and decides whether to select the character outline according to the comparison result, wherein,
if the comparison result meets a first preset condition, the first data comparison unit selects the character outline for recognition;
the first preset condition is D ≥ D1.
Furthermore, under the first overlap degree comparison result, the first analysis unit ranks the overlap degrees between the character outline and the outline patterns, and takes the character corresponding to the outline pattern with the highest overlap degree in the ranking as the character corresponding to the character outline.
Further, the second analysis unit obtains the overlap degree comparison results of the character outlines adjacent to the character outline, wherein,
if any adjacent character outline meets the first overlap degree comparison result, the second analysis unit obtains the character determined by the first analysis unit for that adjacent character outline, determines the associated characters that have an association relation with the obtained character based on the association relations among characters stored in the first storage unit, and records them to generate an associated character set;
if all adjacent character outlines meet the second overlap degree comparison result, the second analysis unit judges that the character corresponding to the character outline cannot be recognized.
Further, the first storage unit constructs the association relations among characters as follows: a plurality of words are stored in the first storage unit, the characters composing each word are determined, and for each word an association relation is established among the characters that compose it.
Further, the second data comparison unit calculates a dispersion parameter E according to formula (2),

E = g(G(1), …, G(n))   (2)  [formula image not reproduced in source]

in formula (2), G(i) represents the mean overlap degree between the outline pattern corresponding to the i-th associated character and the outline patterns corresponding to the remaining characters in the associated character set, and n represents the number of characters in the associated character set, n being an integer greater than zero.
Further, the second data comparison unit receives the associated character set and determines the outline pattern corresponding to each associated character in it; when obtaining the overlap degree comparison result between the character outline and each of these outline patterns, the second data comparison unit compares the dispersion parameter E with a preset comparison parameter E0 and corrects the overlap degree comparison threshold H0 according to the comparison result, wherein,
in the first correction mode, the overlap degree comparison threshold H0 is corrected to the threshold H according to a first preset correction parameter h1, with H = H0 - h1;
in the second correction mode, the overlap degree comparison threshold H0 is corrected to the threshold H according to a second preset correction parameter h2, with H = H0 - h2;
where h1 < h2, the first correction mode applies when E < E0, and the second correction mode applies when E ≥ E0.
Further, the preset reference answer information is a text pre-stored in the proofreading module, and the proofreading module compares the text information with the preset reference answer information, wherein,
if the text information is identical to the preset reference answer information, the proofreading module judges that the text information is correct;
if the text information differs from the preset reference answer information, the proofreading module judges that the text information is wrong.
Compared with the prior art, the invention sets up a database module, a data acquisition module, a data processing module and a proofreading module. The system acquires the picture information uploaded by the user side, extracts the character outlines corresponding to the answer in the picture, and compares each character outline with the outline patterns stored in the database to determine the corresponding character. When the overlap degree between a character outline and the outline patterns is low, the system considers the associated characters of the adjacent character outlines, adjusts the overlap degree comparison threshold, and re-obtains the overlap degree comparison result between the character outline and the outline patterns corresponding to the associated characters, thereby improving the recognition rate of the characters corresponding to the character outlines.
In particular, the data processing module screens the character outlines based on their definition parameters and determines the corresponding characters only for the screened outlines. In practice, the character outlines in the picture information uploaded by the user side vary greatly because of differences in handwriting, and some character outlines cannot be recognized because their definition is poor or because they intersect other character outlines; screening avoids wasting processing on such outlines.
In particular, different analysis units are invoked for data processing according to the overlap degree comparison result between the character outline and the outline patterns, which distributes the data processing load. When the overlap degree between the character outline and at least one outline pattern is greater than or equal to the overlap degree comparison threshold or the corrected overlap degree comparison threshold, the character outline closely matches an outline pattern stored in the database module, so the character corresponding to the character outline can be determined directly from the ranking of the overlap degrees, which improves recognition efficiency.
In particular, when the overlap degrees between the character outline and all outline patterns are smaller than the overlap degree comparison threshold or the corrected overlap degree comparison threshold, none of the stored outline patterns matches the character outline well, and the associated characters of the characters corresponding to the adjacent character outlines must then be determined.
In particular, the correction amount applied to the overlap degree comparison threshold is determined from the dispersion parameter of the associated character set. When the dispersion parameter is low, the outline patterns corresponding to the associated characters in the set overlap one another closely; in that case, if any one outline pattern overlaps the character outline poorly, the other outline patterns are likely to overlap it poorly as well, so the overlap degree comparison threshold needs to be reduced by a larger amount to avoid obtaining the second overlap degree comparison result again after the reduction.
Drawings
FIG. 1 is a schematic structural diagram of an operation error question collecting system based on image recognition according to an embodiment of the present invention;
FIG. 2 is a block diagram of a data processing module according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a database module structure of an embodiment of the invention.
Detailed Description
In order that the objects and advantages of the invention may be more clearly understood, the invention is further described in conjunction with the following examples; it should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit it.
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and do not limit the scope of the present invention.
It should be noted that, in the description of the present invention, the directions or positional relationships indicated by terms such as "upper", "lower", "left", "right", "inner" and "outer" are based on those shown in the drawings; they are used only for convenience of description and do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention.
Furthermore, it should be noted that, in the description of the present invention, unless otherwise explicitly specified or limited, the terms "mounted", "connected" and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable or integral; mechanical or electrical; direct, or indirect through an intermediate medium; or an internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation.
Referring to fig. 1, fig. 2 and fig. 3, which are schematic structural diagrams of an operation error question collecting system based on image recognition, of its data processing module, and of its database module according to an embodiment of the present invention, the operation error question collecting system based on image recognition comprises:
the database module comprises a first storage unit and a second storage unit, wherein the first storage unit is used to store the association relations among characters, and the second storage unit is used to store a plurality of outline patterns, each outline pattern having a pre-established correspondence with a character;
the data acquisition module is connected to the user side and is used to receive the picture information sent by the user side;
the data processing module comprises a first data comparison unit, a second data comparison unit, a first analysis unit and a second analysis unit, wherein
the first data comparison unit is connected to the data acquisition module and is used to receive the picture information, extract the character outlines of the answer in the picture information, and screen the character outlines based on the definition parameter of each character outline;
the second data comparison unit is connected to the first data comparison unit and the database module respectively, and is used to calculate the overlap degree between each character outline screened by the first data comparison unit and each outline pattern stored in the database module, and to compare each overlap degree with an overlap degree comparison threshold or a corrected overlap degree comparison threshold so as to obtain an overlap degree comparison result between the character outline and each outline pattern;
the first analysis unit is connected to the second data comparison unit and the database module respectively, and is used to determine, under a first overlap degree comparison result, the character corresponding to a character outline based on the ranking of the overlap degrees between the character outline and the outline patterns, and to generate text information from all the determined characters;
the second analysis unit is connected to the second data comparison unit and the database module respectively, and is used to determine, under a second overlap degree comparison result, the characters corresponding to the character outlines adjacent to the character outline, determine the outline patterns corresponding to the associated characters of each such character, and send these outline patterns to the second data comparison unit, so that the second data comparison unit obtains a new overlap degree comparison result between the character outline and the received outline patterns after correcting the overlap degree comparison threshold;
and the proofreading module is connected to the first analysis unit and is used to compare the text information generated by the first analysis unit with preset reference answer information and to judge whether the text information is wrong.
Specifically, the present invention does not limit the specific structures of the database module, the data acquisition module, the data processing module and the proofreading module; they may be functional programs running on a computer, as long as they fulfill the functions of data storage, data processing and data exchange.
Specifically, the present invention does not limit how the character outlines of the answer in the picture information are obtained. The position of the answer may be determined from features in the picture, for example from the position of an underline, after which the character outlines at that position are extracted; alternatively, the position of the answer in the picture may be preset and the character outlines at that preset position recognized. Those skilled in the art may substitute other approaches according to specific needs.
Specifically, the present invention does not limit the method used to calculate the overlap degree between a character outline and an outline pattern. In the prior art, most arbitrary-shape text detectors represent text instances in the spatial domain of the image, and spatial-domain representations fall into two classes: pixel-mask representations and contour-point-sequence representations. Pixel-mask representations may require complex and time-consuming post-processing and tend to demand larger training sets; contour-point-sequence representations have limited capacity to express highly curved text. Because a Fourier series can in theory fit any closed curve, and text contours concentrate their energy in the low-frequency components, characterizing an irregular scene-text instance in the Fourier domain handles these problems well. The method may therefore apply a Fourier transform to the text-instance contour and model it in the Fourier domain rather than the spatial domain, which robustly and simply approximates any closed contour step by step, and then calculate the overlap degree. Of course, contour overlap calculation is prior art, and those skilled in the art may choose other methods to calculate the overlap degree between the character outline and the outline pattern in the present invention.
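The Fourier-domain contour comparison described above can be sketched as follows. This is an illustrative approximation only: the descriptor normalization and the mapping from descriptor distance to an overlap-like score in [0, 1] are assumptions of this sketch, not the patent's method.

```python
import numpy as np

def fourier_descriptors(contour, k=8):
    # contour: (N, 2) array of (x, y) boundary points of a closed curve.
    z = contour[:, 0] + 1j * contour[:, 1]   # complex representation of the curve
    coeffs = np.fft.fft(z) / len(z)          # Fourier coefficients of the contour
    # Keep the k lowest positive and negative non-DC frequencies (text contours
    # concentrate on low frequencies); normalize by the first harmonic's
    # magnitude so the descriptor is scale-invariant.
    low = np.concatenate([coeffs[1:1 + k], coeffs[-k:]])
    return np.abs(low) / (np.abs(coeffs[1]) + 1e-12)

def overlap_degree(contour_a, contour_b, k=8):
    # A simple similarity score in (0, 1]: 1.0 means identical descriptors.
    fa = fourier_descriptors(contour_a, k)
    fb = fourier_descriptors(contour_b, k)
    return 1.0 / (1.0 + np.linalg.norm(fa - fb))
```

A character outline would be compared against every stored outline pattern with `overlap_degree`, and the resulting scores compared against the threshold H0.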
Specifically, the first overlap degree comparison result is that the overlap degree between the character outline and at least one outline pattern is greater than or equal to the overlap degree comparison threshold or the corrected overlap degree comparison threshold;
the second overlap degree comparison result is that the overlap degrees between the character outline and all outline patterns are smaller than the overlap degree comparison threshold or the corrected overlap degree comparison threshold.
Specifically, the first data comparison unit calculates the definition parameter D of each character outline according to formula (1),

D = f(S, S0, C, C0)   (1)  [formula image not reproduced in source]

in formula (1), S represents the area of the character outline, S0 represents the mean area of all character outlines, C represents the chroma value of the character outline, and C0 represents the mean chroma value of all character outlines.
Specifically, the first data comparison unit compares the definition parameter D of a character outline with a preset definition comparison parameter D1 and decides whether to select the character outline according to the comparison result, wherein,
if the comparison result meets a first preset condition, the first data comparison unit selects the character outline for recognition;
the first preset condition is D ≥ D1.
Specifically, the data processing module screens the character outlines based on their definition parameters and determines the corresponding characters only for the screened outlines. In practice, the character outlines in the picture information uploaded by the user side vary greatly because of differences in handwriting, and some character outlines cannot be recognized because their definition is poor or because they intersect other character outlines; screening avoids wasting processing on such outlines.
Specifically, under the first overlap degree comparison result, the first analysis unit ranks the overlap degrees between the character outline and the outline patterns, and takes the character corresponding to the outline pattern with the highest overlap degree in the ranking as the character corresponding to the character outline.
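Under the first comparison result, picking the character reduces to an arg-max over the overlap degrees; a minimal sketch (the candidate characters and scores below are made up for illustration):

```python
def recognize_character(overlaps):
    # overlaps: candidate character -> overlap degree with one character outline.
    # The first comparison result guarantees at least one entry meets the
    # threshold, so the highest-overlap candidate is taken as the character.
    return max(overlaps, key=overlaps.get)

def generate_text(per_outline_overlaps):
    # Concatenate the recognized character of every screened outline, in order,
    # to produce the text information passed to the proofreading module.
    return "".join(recognize_character(o) for o in per_outline_overlaps)
```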
Specifically, the second analysis unit obtains the overlap degree comparison results of the character outlines adjacent to the character outline, wherein,
if any adjacent character outline meets the first overlap degree comparison result, the second analysis unit obtains the character determined by the first analysis unit for that adjacent character outline, determines the associated characters that have an association relation with the obtained character based on the association relations among characters stored in the first storage unit, and records them to generate an associated character set;
if all adjacent character outlines meet the second overlap degree comparison result, the second analysis unit judges that the character corresponding to the character outline cannot be recognized.
Specifically, different analysis units are invoked for data processing according to the overlap degree comparison result between the character outline and the outline patterns, which distributes the data processing load. When the overlap degree between the character outline and at least one outline pattern is greater than or equal to the overlap degree comparison threshold or the corrected overlap degree comparison threshold, the character outline closely matches an outline pattern stored in the database module, so the character corresponding to the character outline can be determined directly from the ranking of the overlap degrees, which improves recognition efficiency.
Specifically, when the overlap degrees between the character outline and all outline patterns are smaller than the overlap degree comparison threshold or the corrected overlap degree comparison threshold, none of the stored outline patterns matches the character outline well, and the data processing module must then determine the associated characters of the characters corresponding to the adjacent character outlines.
Specifically, the first storage unit constructs the association relations among characters as follows: a plurality of words are stored in the first storage unit, the characters composing each word are determined, and for each word an association relation is established among the characters that compose it.
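Building the association relations from a stored vocabulary can be sketched as follows; the sample words in the test are illustrative only.

```python
from collections import defaultdict

def build_association_relations(vocabulary):
    # For every stored word, every pair of distinct characters composing it
    # becomes mutually associated, as the first storage unit prescribes.
    relations = defaultdict(set)
    for word in vocabulary:
        for ch in word:
            relations[ch].update(c for c in word if c != ch)
    return relations
```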
Specifically, the second data comparison unit calculates a dispersion parameter E according to formula (2),

E = g(G(1), …, G(n))   (2)  [formula image not reproduced in source]

in formula (2), G(i) represents the mean overlap degree between the outline pattern corresponding to the i-th associated character in the associated character set and the outline patterns corresponding to the remaining characters, and n represents the number of characters in the associated character set, n being an integer greater than zero.
Specifically, the second data comparison unit receives the associated character set and determines the outline pattern corresponding to each associated character in it; when obtaining the overlap degree comparison result between the character outline and each of these outline patterns, the second data comparison unit compares the dispersion parameter E with a preset comparison parameter E0 and corrects the overlap degree comparison threshold H0 according to the comparison result, wherein,
in the first correction mode, the overlap degree comparison threshold H0 is corrected to the threshold H according to a first preset correction parameter h1, with H = H0 - h1;
in the second correction mode, the overlap degree comparison threshold H0 is corrected to the threshold H according to a second preset correction parameter h2, with H = H0 - h2;
where h1 < h2, the first correction mode applies when E < E0, and the second correction mode applies when E ≥ E0.
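The two correction modes map directly onto a small function; the numeric values of H0, E0, h1 and h2 below are illustrative, since the disclosure only states that they are preset and that h1 < h2.

```python
def corrected_threshold(h0, e, e0, h1=0.05, h2=0.10):
    # Implements the two correction modes of the disclosure:
    #   first mode  (E <  E0): H = H0 - h1
    #   second mode (E >= E0): H = H0 - h2, with h1 < h2.
    return h0 - h1 if e < e0 else h0 - h2
```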
Specifically, the preset reference answer information is a text pre-stored in the proofreading module, and the proofreading module compares the text information with the preset reference answer information, wherein,
if the text information is identical to the preset reference answer information, the proofreading module judges that the text information is correct;
if the text information differs from the preset reference answer information, the proofreading module judges that the text information is wrong.
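The proofreading rule, and the wrong-question collection it enables, can be sketched as follows; the question-id/answer layout is a hypothetical structure, not specified by the patent.

```python
def proofread(text_information, reference_answer):
    # The proofreading module's rule: an exact match means the answer is correct.
    return text_information == reference_answer

def collect_wrong_questions(recognized, answers):
    # recognized / answers: question id -> recognized text / reference answer.
    # Questions whose recognized text fails proofreading are collected as
    # "wrong questions", which is the system's overall purpose.
    return [q for q, text in recognized.items()
            if not proofread(text, answers.get(q))]
```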
Specifically, the correction amount applied to the overlap degree comparison threshold is determined from the dispersion parameter of the associated character set. When the dispersion parameter is low, the outline patterns corresponding to the associated characters in the set overlap one another closely; in that case, if any one outline pattern overlaps the character outline poorly, the other outline patterns are likely to overlap it poorly as well, so the overlap degree comparison threshold needs to be reduced by a larger amount to avoid obtaining the second overlap degree comparison result again after the reduction.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.

Claims (10)

1. An operation error question collecting system based on image recognition, characterized by comprising:
a database module comprising a first storage unit and a second storage unit, wherein the first storage unit is used for storing association relations among characters, the second storage unit is used for storing a plurality of outline patterns, and a correspondence is pre-established between each outline pattern and a character;
a data acquisition module connected with a user side and used for receiving picture information sent by the user side;
a data processing module comprising a first data comparison unit, a second data comparison unit, a first analysis unit and a second analysis unit, wherein
the first data comparison unit is connected with the data acquisition module and is used for receiving the picture information, extracting the character outlines of the answer in the picture information, and screening the character outlines based on a definition parameter of each character outline;
the second data comparison unit is connected with the first data comparison unit and the database module respectively, and is used for calculating the coincidence degree between each character outline screened by the first data comparison unit and each outline pattern stored in the database module, and comparing each coincidence degree with a coincidence degree comparison threshold or a corrected coincidence degree comparison threshold, so as to obtain a coincidence degree comparison result between the character outline and each outline pattern;
the first analysis unit is connected with the second data comparison unit and the database module respectively, and is used for determining, under a first coincidence degree comparison result, the character corresponding to each character outline based on the ranking of the coincidence degrees between the character outline and the outline patterns, and generating text information from all the determined characters;
the second analysis unit is connected with the second data comparison unit and the database module respectively, and is used for, under a second coincidence degree comparison result, determining the characters corresponding to the character outlines adjacent to the character outline, determining the outline patterns corresponding to the associated characters of those characters, and sending all of these outline patterns to the second data comparison unit, so that after the second data comparison unit corrects the coincidence degree comparison threshold it obtains anew a coincidence degree comparison result between the character outline and the received outline patterns;
and a proofreading module connected with the first analysis unit and used for comparing the text information generated by the first analysis unit with preset comparison answer information to judge whether the text information is wrong.
2. The operation error question collecting system based on image recognition according to claim 1, wherein the first coincidence degree comparison result is that the coincidence degree between the character outline and at least one outline pattern is greater than or equal to the coincidence degree comparison threshold or the corrected coincidence degree comparison threshold;
and the second coincidence degree comparison result is that the coincidence degrees between the character outline and all outline patterns are smaller than the coincidence degree comparison threshold or the corrected coincidence degree comparison threshold.
3. The operation error question collecting system based on image recognition according to claim 1, wherein the first data comparison unit calculates the definition parameter D corresponding to each character outline according to formula (1),
[formula (1) appears only as an image in the original publication; it defines D in terms of S, S0, C and C0]
(1)
in formula (1), S denotes the area of the character outline, S0 denotes the average area of all character outlines, C denotes the chroma value of the character outline, and C0 denotes the average chroma value of all character outlines.
4. The operation error question collecting system based on image recognition according to claim 3, wherein the first data comparison unit compares the definition parameter D corresponding to the character outline with a preset definition comparison parameter D1 and screens the character outline according to the comparison result, wherein,
if the comparison result satisfies a first preset condition, the first data comparison unit judges that the character outline is selected;
the first preset condition being that D is greater than or equal to D1.
5. The operation error question collecting system based on image recognition according to claim 1, wherein, under the first coincidence degree comparison result, the first analysis unit ranks the coincidence degrees between the outline patterns and the character outline, and selects the character corresponding to the outline pattern with the highest coincidence degree in the ranking result as the character corresponding to the character outline.
6. The operation error question collecting system based on image recognition according to claim 1, wherein the first storage unit constructs the association relations among characters as follows:
a plurality of vocabulary entries are pre-stored in the first storage unit, the characters forming each entry are determined, and for any entry, an association relation is established among the characters that form it.
7. The operation error question collecting system based on image recognition according to claim 1, wherein the second analysis unit acquires the coincidence degree comparison results of the character outlines adjacent to the character outline, wherein,
if any adjacent character outline meets the first coincidence degree comparison result, the second analysis unit acquires the character that the first analysis unit determined for that adjacent character outline, and, based on the association relations among characters stored in the first storage unit, determines the associated characters having an association relation with the acquired character, so as to record them and generate an associated character set;
if all adjacent character outlines meet the second coincidence degree comparison result, the second analysis unit judges that the character corresponding to the character outline cannot be recognized.
8. The operation error question collecting system based on image recognition according to claim 7, wherein the second data comparison unit calculates a discrete parameter E according to formula (2),
[formula (2) appears only as an image in the original publication; it defines the discrete parameter E in terms of G(i) and n]
(2)
in formula (2), G(i) denotes the average coincidence degree between the outline pattern corresponding to the i-th associated character in the associated character set and the outline patterns corresponding to the remaining characters, and n denotes the number of characters in the associated character set, n being an integer greater than zero.
9. The operation error question collecting system based on image recognition according to claim 8, wherein the second data comparison unit receives the associated character set and determines the outline pattern corresponding to each associated character in the set; when obtaining the coincidence degree comparison result between the character outline and each of these outline patterns, the second data comparison unit compares the discrete parameter E with a discrete comparison parameter E0 and corrects the coincidence degree comparison threshold H0 according to the comparison result, wherein,
in the first correction mode, the coincidence degree comparison threshold H0 is corrected to the corrected coincidence degree comparison threshold H according to a first preset correction parameter h1, with H = H0 - h1;
in the second correction mode, the coincidence degree comparison threshold H0 is corrected to the corrected coincidence degree comparison threshold H according to a second preset correction parameter h2, with H = H0 - h2;
where h1 is less than h2, the first correction mode requires that E is less than E0, and the second correction mode requires that E is greater than or equal to E0.
10. The operation error question collecting system based on image recognition according to claim 8, wherein the preset comparison answer information is text pre-stored in the proofreading module, and the proofreading module compares the text information with the preset comparison answer information, wherein,
if the text information is identical to the preset comparison answer information, the proofreading module judges that the text information is correct,
and if the text information differs from the preset comparison answer information, the proofreading module judges that the text information is wrong.
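To illustrate how claims 2 and 5 interact, here is a minimal sketch of the first analysis unit's ranking step (the character set, coincidence scores, and function name are illustrative assumptions, not from the patent):

```python
def best_match(coincidences: dict[str, float], threshold: float):
    """Claim 5: rank coincidence degrees and pick the top character.

    Returns the character whose outline pattern has the highest
    coincidence degree when the first comparison result holds
    (claim 2: at least one degree >= threshold); returns None under
    the second comparison result, deferring to the second analysis unit.
    """
    char, score = max(coincidences.items(), key=lambda kv: kv[1])
    return char if score >= threshold else None

# First coincidence degree comparison result: a pattern meets the threshold.
print(best_match({"cat": 0.91, "car": 0.74}, threshold=0.8))  # cat
# Second comparison result: no pattern meets it; recognition is deferred.
print(best_match({"cat": 0.42, "car": 0.39}, threshold=0.8))  # None
```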
CN202310102859.2A 2023-02-13 2023-02-13 Operation wrong question collecting system based on image recognition Active CN115841670B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310102859.2A CN115841670B (en) 2023-02-13 2023-02-13 Operation wrong question collecting system based on image recognition

Publications (2)

Publication Number Publication Date
CN115841670A true CN115841670A (en) 2023-03-24
CN115841670B CN115841670B (en) 2023-05-12

Family

ID=85579623

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102982330A (en) * 2012-11-21 2013-03-20 新浪网技术(中国)有限公司 Method and device recognizing characters in character images
CN107977659A (en) * 2016-10-25 2018-05-01 北京搜狗科技发展有限公司 A kind of character recognition method, device and electronic equipment
CN108154132A (en) * 2018-01-10 2018-06-12 马上消费金融股份有限公司 Method, system and equipment for extracting characters of identity card and storage medium
CN109472014A (en) * 2018-10-30 2019-03-15 南京红松信息技术有限公司 A kind of wrong topic collection automatic identification generation method and its device
CN111104883A (en) * 2019-12-09 2020-05-05 平安国际智慧城市科技股份有限公司 Job answer extraction method, device, equipment and computer readable storage medium
CN111242045A (en) * 2020-01-15 2020-06-05 西安汇永软件科技有限公司 Automatic operation exercise right and wrong indication method and system
CN112287926A (en) * 2019-07-23 2021-01-29 小船出海教育科技(北京)有限公司 Method, device and equipment for correcting graphic questions
CN112347997A (en) * 2020-11-30 2021-02-09 广东国粒教育技术有限公司 Test question detection and identification method and device, electronic equipment and medium
CN115393865A (en) * 2022-08-31 2022-11-25 苏州市职业大学 Character retrieval method, character retrieval equipment and computer-readable storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Zhenyu; JIANG Heyun; FAN Mingyu: "An efficient artificial intelligence method for automated text recognition of bank bills", Journal of Wenzhou University (Natural Science Edition) *

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant