CN115083230A - Learning assisting method and device - Google Patents

Learning assisting method and device

Info

Publication number
CN115083230A
CN115083230A
Authority
CN
China
Prior art keywords
tracing
traced
user
characters
character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210833277.7A
Other languages
Chinese (zh)
Inventor
肖瑜
陈亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Bo Xue Guang Yue Education Technology Co ltd
Original Assignee
Beijing Bo Xue Guang Yue Education Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Bo Xue Guang Yue Education Technology Co ltd filed Critical Beijing Bo Xue Guang Yue Education Technology Co ltd
Priority to CN202210833277.7A priority Critical patent/CN115083230A/en
Publication of CN115083230A publication Critical patent/CN115083230A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B11/00: Teaching hand-writing, shorthand, drawing, or painting

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The application provides a method and a device for assisting learning, wherein the method comprises the following steps: displaying a character to be traced in a canvas area on a user interface, wherein the character to be traced is an ancient character; receiving a tracing result of the user for the character to be traced in the canvas area; and obtaining a tracing score based on the tracing result and the character to be traced. By letting the user study ancient characters by tracing them on a user interface, the technical scheme of the application improves the convenience of learning ancient characters, and by outputting a tracing score it also lets the user understand his or her writing of the ancient characters intuitively and objectively.

Description

Learning assisting method and device
Technical Field
The application relates to the technical field of character tracing, in particular to a method and a device for assisting learning.
Background
At present, people have shown strong interest in studying ancient characters (for example, oracle-bone, bronze, and seal scripts), but the ways of learning them remain traditional: the user gets to know and master the ancient characters by, for example, practicing their writing in a paper notebook, which makes learning them inconvenient. In addition, by tracing ancient characters in a paper notebook, the user cannot check his or her own writing intuitively and objectively.
Therefore, how to let a user understand his or her own learning progress intuitively while improving the convenience of learning ancient characters has become an urgent technical problem to be solved.
Disclosure of Invention
In view of this, embodiments of the present application provide a method and an apparatus for assisting learning, which can improve the convenience of learning ancient characters and enable the user to understand his or her writing of ancient characters intuitively and objectively.
In a first aspect, an embodiment of the present application provides a method for assisting learning, where the method includes: displaying characters to be traced in a canvas area on a user interface, wherein the characters to be traced are ancient characters; receiving a tracing result of a user on characters to be traced in a canvas area; and obtaining a tracing score based on the tracing result and the character to be traced.
In some embodiments of the present application, receiving a tracing result of the user for the character to be traced in the canvas area includes: receiving configuration information of the brush input by the user, wherein the configuration information includes a color; step a: acquiring position information of a click made by the user with the brush in the canvas area; step b: determining a pixel area corresponding to the position information; step c: coloring each first pixel point in the pixel area with the color to obtain a pixel coloring result; and repeating steps a to c until the tracing result is obtained, wherein the tracing result includes a plurality of pixel coloring results.
In certain embodiments of the present application, the method further comprises: acquiring a first frame image and a second frame image while the user draws the character to be traced, wherein the first frame image and the second frame image are two consecutive frame images; acquiring a first position drawn by the brush in the first frame image and a second position drawn by the brush in the second frame image; and filling at least one second pixel point between the first position and the second position with the color, so as to realize continuous drawing.
In some embodiments of the present application, obtaining a tracing score based on the tracing result and the character to be traced includes: acquiring the number of third pixel points included in the character to be traced; acquiring the number of fourth pixel points of the tracing result that lie on the character to be traced, and the number of fifth pixel points that overflow the character to be traced; and acquiring the tracing score based on the number of third pixel points, the number of fourth pixel points, and the number of fifth pixel points.
In some embodiments of the present application, obtaining the tracing score based on the number of third pixel points, the number of fourth pixel points, and the number of fifth pixel points includes: determining a tracing coverage rate from the ratio of the number of fourth pixel points to the number of third pixel points; determining a tracing overflow rate from the ratio of the number of fifth pixel points to the sum of the number of fourth pixel points and the number of fifth pixel points; and determining the tracing score based on the tracing coverage rate and the tracing overflow rate.
In certain embodiments of the present application, the method further comprises: displaying, on the user interface, the simplified character corresponding to the character to be traced.
In some embodiments of the present application, before receiving the tracing result of the user for the character to be traced in the canvas area, the method further includes: detecting the position of the user's click on the user interface; and receiving the user's drawing operation when the click position is located in the canvas area.
In some embodiments of the present application, the character to be traced is a character of any one of oracle-bone script, bronze script, and seal script.
In a second aspect, an embodiment of the present application provides an apparatus for assisting learning, including: the display module is used for displaying characters to be traced in a canvas area on the user interface, wherein the characters to be traced are ancient characters; the receiving module is used for receiving a tracing result of a character to be traced in the canvas area by a user; and the acquisition module is used for acquiring a tracing score based on the tracing result and the character to be traced.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, where a computer program is stored, and the computer program is configured to execute the method for assisting learning according to the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a processor; a memory for storing processor executable instructions, wherein the processor is adapted to perform the method of assisted learning of the first aspect.
The embodiments of the application provide a method and a device for assisting learning. A user interface is provided on which the user can trace ancient characters, and a tracing score is obtained from the tracing result and the character to be traced. Tracing deepens the user's impression of the ancient characters, while outputting the tracing score lets the user understand his or her writing of the ancient characters intuitively and objectively, enhancing the user's enthusiasm for and interest in learning.
Drawings
Fig. 1 is a flowchart illustrating a method for assisting learning according to an exemplary embodiment of the present application.
Fig. 2 is a flowchart illustrating a method for assisting learning according to another exemplary embodiment of the present application.
Fig. 3 is a flowchart illustrating a method for assisting learning according to another exemplary embodiment of the present application.
Fig. 4 is a flowchart illustrating a method for assisting learning according to still another exemplary embodiment of the present application.
Fig. 5 is a flowchart illustrating a method for assisting learning according to still another exemplary embodiment of the present application.
Fig. 6 is a flowchart illustrating a method for assisting learning according to still another exemplary embodiment of the present application.
Fig. 7 is a schematic structural diagram of a user interface provided in an exemplary embodiment of the present application.
Fig. 8 is a schematic structural diagram of an apparatus for assisting learning according to an exemplary embodiment of the present application.
Fig. 9 is a block diagram of an electronic device for assisting learning provided by an exemplary embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Summary of the application
At present, the ways of learning ancient characters remain traditional: the user learns them by, for example, practicing their writing in a paper notebook. The user therefore needs both reference materials on the ancient characters and a paper notebook, which makes learning inconvenient. Meanwhile, when the user writes ancient characters in a notebook on his or her own, the writing cannot be checked intuitively and objectively.
In view of the above problems, embodiments of the present application provide a method for assisting learning, and various non-limiting embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Exemplary method
Fig. 1 is a flowchart illustrating a method for assisting learning according to an exemplary embodiment of the present application. The method of fig. 1 is performed by a computing device, e.g., a processor of a user terminal. As shown in fig. 1, the method of assisted learning includes the following steps.
110: the text to be traced is displayed within a canvas area on a user interface.
In one embodiment, the text to be traced is ancient text.
Specifically, the user interface may be a display interface presented on a display screen of the user terminal, where the user terminal may be a mobile phone, a tablet computer, a palmtop computer, or the like. The user interface may be an interface of an application program that learns ancient texts.
A canvas area may be displayed on the user interface of the user terminal, where the canvas area may be a canvas for tracing created using a RenderTexture. A clearing control may also be displayed on the user interface; the image data of the canvas area is recorded at initialization, and the clearing control restores the canvas to that initial state. An undo control may also be displayed for reverting at least one operation performed by the user, where the number of undo steps can be set flexibly according to the actual situation. It should be noted that the processor may store the historical image data drawn by the user, and when the user clicks the undo control, the processor reads the stored historical image data, thereby implementing the undo function.
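The clearing and undo (cancel) controls described above can be sketched as a snapshot-based canvas. The following is an illustrative Python model, not the patent's actual implementation: the canvas is a dict mapping pixel positions to colors, the initial image data is recorded for the clearing control, and a bounded history of snapshots backs the undo control.

```python
class TracingCanvas:
    def __init__(self, initial_pixels, max_undo=10):
        # Record the image data at initialization so the clearing control
        # can restore the canvas to its starting state.
        self._initial = dict(initial_pixels)
        self.pixels = dict(initial_pixels)
        self._history = []          # snapshots of historical image data
        self._max_undo = max_undo   # undo depth is flexibly configurable

    def draw(self, changes):
        """Apply a drawing operation, saving a snapshot for undo first."""
        self._history.append(dict(self.pixels))
        if len(self._history) > self._max_undo:
            self._history.pop(0)
        self.pixels.update(changes)

    def undo(self):
        """Revert the most recent drawing operation, if any."""
        if self._history:
            self.pixels = self._history.pop()

    def clear(self):
        """Restore the canvas to the image data recorded at initialization."""
        self.pixels = dict(self._initial)
        self._history.clear()
```

In practice the snapshots would be RenderTexture contents rather than dicts; the bounded history mirrors the flexibly configurable number of undo steps mentioned above.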
In an embodiment, the canvas area may display the character to be traced. The processor of the user terminal may call a Graphics Processing Unit (GPU) and, according to a simplified character input by the user or the simplified character specified by the course, copy the pixel information of the corresponding ancient-character picture stored in the memory of the user terminal, so as to obtain the character to be traced displayed in the canvas area. The pixel information of the ancient-character picture can be understood as the color points of the ancient character in the picture, which are the most basic units of the picture.
It should be noted that the embodiment of the present application can invoke the GPU directly, so drawing does not occupy the processing resources of the user terminal's processor, which improves the fluency of the user's drawing.
The character to be traced may be displayed in the center of the canvas area, or at any other suitable position in the canvas area. The font size of the character to be traced may be determined by the size of the canvas area; for example, the character may occupy two thirds of the canvas area. The character to be traced may be rendered in a light color to make tracing easier, for example light gray; the embodiment of the present application places no particular limit on the position or color of the character to be traced. The user draws directly on top of the character to be traced, thereby tracing it.
In one embodiment, the character to be traced may be a character of any one of oracle-bone script, bronze script, seal script (e.g., small seal script), and stone-drum script.
120: and receiving a tracing result of the character to be traced in the canvas area by the user.
Specifically, the user configures the brush through a touch object (such as a finger or a mouse); the configuration information may include the color and thickness of the brush and the shape of the brush tip. The processor then acquires the position information of the user's click with the brush in the canvas area, converts the position information into a corresponding pixel area, and colors each first pixel point in the pixel area with the color the user configured for the brush, thereby realizing the brush's drawing function. These steps (acquire the click position, convert it to a pixel area, color each first pixel point in the area) are repeated until the tracing result is obtained.
The tracing result is the user's drawing of the character to be traced in the canvas area; it lies on top of the character to be traced, and its color is the color the user configured for the brush. That is, the configured brush color is the color in which the tracing result is rendered.
Further, the color of the user's tracing result and the color of the character to be traced may differ; for example, the character to be traced may be light gray while the tracing result is yellow. This makes it possible, in the subsequent steps, to count the pixel points of the tracing result and of the character to be traced separately, and thus to calculate the tracing score.
130: and obtaining a tracing score based on the tracing result and the character to be traced.
Specifically, the processor of the user terminal evaluates the similarity between the tracing result input by the user and the character to be traced, and obtains a tracing score based on a preset similarity criterion. The tracing score indicates how well the user traced the character: the closer the tracing result is to the character to be traced, the higher the score.
In one embodiment, the tracing coverage rate and the tracing overflow rate are obtained from the number of third pixel points forming the character to be traced and the numbers of fourth and fifth pixel points forming the tracing result; the tracing score is then obtained from the tracing coverage rate and the tracing overflow rate according to the preset similarity criterion. A fourth pixel point is a pixel point of the tracing result that corresponds to the character to be traced, i.e., a pixel point lying on the character. A fifth pixel point is a pixel point of the tracing result drawn outside the character to be traced, i.e., a pixel point overflowing the character.
In an embodiment, the user interface may display a "submit" control, and after the user draws the text to be traced, the user clicks the "submit" control to complete tracing and obtain tracing scores.
In one embodiment, the tracing achievements are presented on the user interface in a pop-up window for the user to view.
It should be noted that the embodiment of the present application may be implemented as a virtual-brush tracing technique based on the Unity3D engine, which allows an ancient character to be drawn and traced according to its pattern while learning it, and a completion score to be given.
Thus, the embodiment of the application provides a user interface on which the user can trace ancient characters, and obtains a tracing score from the tracing result and the character to be traced. Tracing deepens the user's impression of the ancient characters, while outputting the tracing score lets the user understand his or her writing of the ancient characters intuitively and objectively, enhancing the user's enthusiasm for and interest in learning.
Fig. 2 is a flowchart illustrating a method for assisting learning according to another exemplary embodiment of the present application. The embodiment of fig. 2 is a further definition of step 120 of the embodiment of fig. 1. As shown in fig. 2, this step 120 further includes the following.
210: configuration information for a stylus input by a user is received.
In one embodiment, the configuration information includes a color.
Specifically, configuration information of the brush, input by the user with a touch object (e.g., a finger or a mouse), is received. The configuration information may include parameters such as the color, the thickness (i.e., the size of the brush), the shape of the brush tip, and the drawing frequency of the brush.
It should be noted that, in the embodiment of the present application, a base picture of a circle that fades to transparent from its center to its edge may be used as the default style of the brush (or brush tip) displayed on the user interface, and this default style may be changed to other shapes.
220: step a: and acquiring the position information of the user application brush clicked in the canvas area.
Specifically, the user interface may establish a coordinate system whose origin may be the upper-left corner or the center of the user interface.
In an embodiment, the processor of the user terminal may acquire, in real time, the position of the click input by the user with the brush in the canvas area, and convert that position into coordinate information of the canvas area, where the coordinate information is based on the coordinate system established for the user interface.
It should be noted that when the processor detects that the user's click is not on the canvas area, no drawing operation is performed.
230: step b: and determining a pixel area corresponding to the position information.
Specifically, UV coordinates may be established for the canvas area. The UV coordinates describe a two-dimensional plane, where U represents the horizontal direction and V the vertical direction; any pixel point of the character to be traced on the canvas area can be addressed through its UV coordinates. Note that the canvas area containing the character to be traced can be regarded as a two-dimensional picture, and the UV coordinates can address any pixel point of the character in the canvas area.
In one embodiment, the position obtained from the user's click in the canvas area is converted into coordinate information of the canvas area, where the coordinate information is determined based on the coordinate system established for the user interface. This coordinate information can be understood as a coordinate point local to the canvas area.
Further, the local coordinate information of the canvas area is converted, based on the UV coordinates, into a pixel area on the canvas; this UV position information represents the pixel area corresponding to the click position.
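The two conversions just described (interface coordinates to canvas-local coordinates, then to UV coordinates and a texture pixel) can be sketched as follows. This is a hedged illustration with assumed parameters: the canvas rectangle's origin and size, and the texture resolution are not specified in the embodiment, and the actual engine-side conversion may differ.

```python
def position_to_pixel(click_x, click_y,
                      canvas_x, canvas_y, canvas_w, canvas_h,
                      tex_w, tex_h):
    """Convert a click position on the user interface into a pixel
    coordinate on the canvas texture, via normalized UV coordinates.

    (canvas_x, canvas_y) is the canvas's top-left corner in interface
    coordinates; (tex_w, tex_h) is the texture resolution in pixels.
    Returns None when the click falls outside the canvas area.
    """
    # Interface coordinates -> coordinates local to the canvas rectangle.
    local_x = click_x - canvas_x
    local_y = click_y - canvas_y
    if not (0 <= local_x < canvas_w and 0 <= local_y < canvas_h):
        return None  # clicks outside the canvas area trigger no drawing

    # Local coordinates -> UV coordinates: U horizontal, V vertical, in [0, 1).
    u = local_x / canvas_w
    v = local_y / canvas_h

    # UV coordinates -> integer pixel indices on the texture.
    return (int(u * tex_w), int(v * tex_h))
```

The `None` return models the check above that the user cannot draw when the click is not on the canvas area.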
240: step c: and dyeing each first pixel point in the pixel region by using the color to obtain a pixel dyeing result.
Specifically, each first pixel point in the obtained pixel area is colored with the color configured for the brush, yielding a pixel coloring result. For example, when the color the user configured for the brush is yellow, the resulting pixel coloring result is also yellow.
It should be noted that, in the embodiment of the present application, the rendering calculation on the canvas area may be performed by the GPU, which realizes the basic drawing function of the brush.
250: and repeating the steps a to c until a tracing result is obtained.
In one embodiment, the tracing result includes a plurality of pixel staining results.
Specifically, repeatedly executing steps a to c can be understood as the process in which the user keeps drawing the character to be traced until the finished tracing result is obtained. That is, the tracing result is composed of a plurality of pixel coloring results.
In an embodiment, after the user clicks a "submit" control displayed on the user interface, it indicates that the user has finished tracing the character to be traced, and at this time, the content displayed in the canvas area is the tracing result input by the user.
Thus, the position clicked by the user is converted into pixel-area information, and each pixel point (first pixel point) in the pixel area is colored, thereby realizing the drawing function of the brush.
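Steps a to c can be sketched in Python as follows. The round brush tip, the `radius` parameter, and the dict-based canvas are illustrative assumptions of this sketch; the embodiment performs the equivalent coloring on the GPU.

```python
def stamp_brush(canvas, center, radius, color):
    """Steps b and c: determine the pixel area corresponding to a click
    position and color every pixel (the 'first pixel points') in it.

    `canvas` maps (x, y) -> color; `center` is the pixel position obtained
    from the click; `radius` models the configured brush size. Returns the
    set of pixels colored (one 'pixel coloring result').
    """
    cx, cy = center
    stained = set()
    for x in range(cx - radius, cx + radius + 1):
        for y in range(cy - radius, cy + radius + 1):
            # A round brush tip: keep pixels within `radius` of the center.
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                canvas[(x, y)] = color
                stained.add((x, y))
    return stained

def trace(clicks, radius, color):
    """Repeat steps a to c for every click; the tracing result is the
    union of all pixel coloring results."""
    canvas = {}
    result = set()
    for center in clicks:
        result |= stamp_brush(canvas, center, radius, color)
    return canvas, result
```

For example, `trace([(5, 5)], 1, "yellow")` colors the clicked pixel and its four direct neighbors.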
Fig. 3 is a flowchart illustrating a method for assisting learning according to another exemplary embodiment of the present application. The embodiment of fig. 3 is an example of the embodiment of fig. 1, and the same parts are not repeated herein, and the differences are mainly described here. As shown in fig. 3, the method of assisted learning includes the following steps.
It should be noted that the basic principle of the brush's drawing function is to stamp the configured brush-tip picture repeatedly at small distance intervals so as to form a line. The conventional practice in the prior art is to capture the position of the touch object (e.g., a mouse or finger) every frame and draw once per frame. However, this approach draws at a low rate: when the user moves the mouse or a finger, the distance moved per frame varies, so the drawn line becomes discontinuous. To address this, the embodiment of the present application provides the following method.
310: acquiring a first frame image and a second frame image when a user draws a character to be traced.
In one embodiment, the first frame image and the second frame image are two consecutive frame images.
Specifically, two consecutive frame images captured while the user draws the character to be traced are acquired, namely a first frame image and a second frame image. They may be any two consecutive frames; the embodiment of the present application places no particular limit on the first and second frame images.
320: and acquiring a first position drawn by the brush in the first frame image and a second position drawn by the brush in the second frame image.
Specifically, a first position and a second position drawn by a brush in two continuous frames of images are respectively obtained.
330: and filling at least one second pixel point between the first position and the second position by using the color so as to realize the function of continuous drawing.
Specifically, at least one uncolored second pixel point located between the first position and the second position drawn by the brush is filled with the color configured for the brush, thereby realizing continuous drawing.
It should be noted that, in the embodiment of the present application, this can be understood as uniformly filling in the missing drawing information according to the distance between the positions drawn by the brush in the two successive frames, and then drawing it.
Thus, in the embodiment of the application, the gap between the two points drawn by the brush in two consecutive frame images is filled during drawing, realizing the brush's continuous-drawing function.
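The gap-filling between two successive frame positions can be sketched as uniform linear interpolation. The `spacing` parameter (the small distance interval between stamped brush points) is an assumption of this sketch; the embodiment does not fix a specific value.

```python
import math

def interpolate_stroke(p1, p2, spacing=1.0):
    """Fill the gap between the brush position in one frame (p1) and the
    next frame (p2) with evenly spaced intermediate points, so the drawn
    line stays continuous even when the pointer moves quickly.
    Returns the points to stamp, excluding p1 and including p2.
    """
    x1, y1 = p1
    x2, y2 = p2
    dist = math.hypot(x2 - x1, y2 - y1)
    steps = max(1, int(dist / spacing))  # at least one stamp per frame pair
    points = []
    for i in range(1, steps + 1):
        t = i / steps
        points.append((x1 + (x2 - x1) * t, y1 + (y2 - y1) * t))
    return points
```

Each returned point would then be stamped with the brush tip, so the per-frame position capture no longer leaves visible gaps at high pointer speeds.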
Fig. 4 is a flowchart illustrating a method for assisting learning according to still another exemplary embodiment of the present application. The embodiment of fig. 4 is a further definition of step 130 of the embodiment of fig. 1. As shown in fig. 4, this step 130 further includes the following.
410: and acquiring the number of third pixel points included in the character to be traced.
Specifically, the character to be traced may be displayed in light gray within the canvas area, and, to make tracing easier for the user, it may be displayed in a bold style. The character to be traced thus consists of a plurality of third pixel points.
It should be noted that a pixel point (e.g., a third pixel point) can be understood as a color point constituting the character to be traced, the most basic unit of the character.
In an embodiment, the canvas area presented by the user interface may provide a function for counting the pixels of a specified color. The canvas area can then obtain the number of third pixel points of the character to be traced from the character's color (for example, light gray); this number is the total number of pixel points making up the character to be traced.
420: and acquiring the quantity of fourth pixel points at the positions corresponding to the tracing result and the character to be traced and the quantity of fifth pixel points overflowing the character part to be traced.
Specifically, the tracing result input by the user comprises a plurality of pixel points, which may include a plurality of fourth pixel points and a plurality of fifth pixel points; that is, the color points constituting the tracing result are the fourth and fifth pixel points.
A fourth pixel point is a pixel point of the tracing result that corresponds to the character to be traced, i.e., a pixel point traced on top of the character. A fifth pixel point is a pixel point of the tracing result beyond the character to be traced, i.e., a pixel point overflowing the character.
In an embodiment, the canvas area presented by the user interface may provide a function for counting the pixels of a specified color: it detects the color specified by the user (i.e., the color configured for the brush) and obtains the number of pixels of that color in the canvas area. That is to say, from the color of the tracing result, the canvas area can obtain both the number of fourth pixel points lying on the character to be traced and the number of fifth pixel points overflowing it.
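The pixel counting described above can be sketched as follows. The set-based representation of the character and the tracing result is an illustrative assumption; in the embodiment, the fourth and fifth pixel points would be distinguished on the canvas itself by color and position.

```python
def count_pixels(canvas, color):
    """The 'count pixels of a specified color' function the canvas area
    is assumed to provide: how many pixels currently carry `color`.
    `canvas` maps (x, y) positions to colors."""
    return sum(1 for c in canvas.values() if c == color)

def classify_tracing(char_pixels, traced_pixels):
    """Split the tracing result into fourth pixel points (on the character)
    and fifth pixel points (overflowing it); the third pixel points are the
    character's own. Both arguments are sets of (x, y) positions."""
    third = len(char_pixels)
    fourth = len(traced_pixels & char_pixels)   # covered part
    fifth = len(traced_pixels - char_pixels)    # overflow part
    return third, fourth, fifth
```

These three counts are exactly the inputs the next step needs to compute the coverage rate, the overflow rate, and finally the tracing score.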
430: and acquiring the tracing score based on the number of the third pixel points, the number of the fourth pixel points and the number of the fifth pixel points.
Specifically, the tracing coverage rate is determined from the ratio of the number of fourth pixel points to the number of third pixel points, and the tracing overflow rate is determined from the number of fifth pixel points and the total number of pixel points of the tracing result (i.e., the sum of the numbers of fourth and fifth pixel points). The tracing score is then determined from the tracing coverage rate and the tracing overflow rate.
It should be noted that, for the detailed description of step 430, please refer to the record of the embodiment in fig. 5 for details, which are not described herein again.
Thus, the embodiment of the application determines the tracing score by counting the pixel points of the character to be traced and the pixel points of the tracing result, which improves the accuracy of the similarity judgment.
Fig. 5 is a flowchart illustrating a method for assisting learning according to still another exemplary embodiment of the present application. The embodiment of fig. 5 is a further definition of step 430 in the embodiment of fig. 4. As shown in fig. 5, step 430 further includes the following.
510: and determining the tracing coverage rate according to the ratio of the number of the fourth pixel points to the number of the third pixel points.
Specifically, the tracing coverage rate is determined from the ratio of the number of fourth pixel points in the tracing result lying on the character to be traced to the number of third pixel points of the character to be traced. That is, the tracing coverage rate is the fraction whose numerator is the fourth pixel point count and whose denominator is the third pixel point count.
520: and determining the tracing overflow rate according to the ratio of the number of the fifth pixel points to the sum of the number of the fourth pixel points and the number of the fifth pixel points.
Specifically, the sum of the fourth and fifth pixel point counts can be understood as the total number of pixel points in the tracing result. The tracing overflow rate is determined from the ratio of the number of fifth pixel points overflowing the character to be traced to this total. That is, the tracing overflow rate is the fraction whose numerator is the fifth pixel point count and whose denominator is the total pixel point count of the tracing result.
530: and determining the tracing achievement based on the tracing coverage rate and the tracing overflow rate.
Specifically, the tracing score is obtained from the tracing coverage rate and the tracing overflow rate according to a preset similarity criterion. The preset similarity criterion may combine a first value based on the tracing coverage rate, namely the coverage rate multiplied by 1.05, with a second value based on the tracing overflow rate, namely the overflow rate divided by 3.
In one embodiment, the tracing score may be computed as (tracing coverage rate × 1.05 - tracing overflow rate / 3) × 100.
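The whole scoring chain (steps 510 to 530) can be sketched as follows. The grouping of the ×100 factor is ambiguous in the text, so this sketch assumes it scales the entire expression, giving a perfect, overflow-free trace a score of 105:

```python
def tracing_score(n_third, n_fourth, n_fifth):
    """Score a trace from the three pixel counts.

    n_third:  pixels of the character to be traced
    n_fourth: traced pixels landing on the character
    n_fifth:  traced pixels overflowing the character
    """
    coverage = n_fourth / n_third              # step 510: tracing coverage rate
    overflow = n_fifth / (n_fourth + n_fifth)  # step 520: tracing overflow rate
    # step 530: assumed grouping of the formula in the text
    return (coverage * 1.05 - overflow / 3) * 100

print(round(tracing_score(1000, 800, 200), 1))  # 77.3
```

With this reading, covering 80% of the glyph while letting 20% of the strokes spill outside it scores about 77 points, and any increase in overflow lowers the score.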
Therefore, the tracing score is determined based on the tracing coverage rate and the tracing overflow rate, and the accuracy of similarity judgment is improved.
In an embodiment of the present application, the method further includes: and displaying the simplified characters corresponding to the characters to be traced on the user interface.
Specifically, the user interface of the user terminal may include a simplified text corresponding to the text to be traced.
In an embodiment, the simplified text may be displayed over a canvas area.
In another embodiment, the simplified text may be displayed in a floating window of the user interface, and the display manner and the position of the simplified text are not specifically limited in the embodiment of the present application.
Therefore, by displaying the simplified character corresponding to the character to be traced on the user interface, the embodiment of the application helps the user understand the specific meaning of the character to be traced and learn it more conveniently.
In an embodiment of the present application, before receiving a tracing result of a user drawing a text to be traced in a canvas area, the method further includes: detecting the click position of a user on a user interface; when the click position is located in the canvas area, a drawing operation of a user is received.
Specifically, before step 120 in the embodiment of fig. 1, the method may further include: detecting the click position at which the user touches the user interface with a touch object (e.g., a finger or a mouse). When the click position is located in the canvas area, the user's tracing operation on the character to be traced is executed; otherwise, the user cannot trace the character to be traced.
In one embodiment, a coordinate system may be established on the user interface. The processor may determine whether the current click operation is located in the canvas area by determining coordinates of a click position where the user clicks with the touch object. It should be noted that, in order to determine whether the click position is located in the canvas area, the area coordinates of the canvas area may be stored in advance in the processor of the user terminal.
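A minimal sketch of this pre-check, assuming the stored canvas area is an axis-aligned rectangle in UI coordinates (the concrete rectangle values below are hypothetical):

```python
def in_canvas(click, canvas_rect):
    """Return True if a click (x, y) lies inside the canvas rectangle."""
    x, y = click
    left, top, right, bottom = canvas_rect
    return left <= x <= right and top <= y <= bottom

# Hypothetical area coordinates stored in advance on the user terminal.
CANVAS_RECT = (100, 100, 500, 500)
print(in_canvas((250, 300), CANVAS_RECT))  # True  -> drawing accepted
print(in_canvas((50, 300), CANVAS_RECT))   # False -> click ignored
```

Treating the boundary as inside (the `<=` comparisons) is one design choice; the text does not say how boundary clicks are handled.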
Therefore, by checking the click position in advance, the user's drawing is confined to the canvas area, which safeguards the subsequent evaluation of the tracing score.
In an embodiment of the application, the character to be traced is a character in any one of oracle bone script, bronze script, and seal script.
Specifically, the character to be traced may be a character in any script such as oracle bone script, bronze script, seal script (for example, small seal script), or stone-drum script; the embodiment of the present application does not specifically limit the script of the character to be traced.
For example, the character to be traced may be the oracle bone script form of the character for "force".
Therefore, because the character to be traced may come from any of multiple ancient scripts, the user can select the ancient script to learn according to his or her own needs.
Fig. 6 is a flowchart illustrating a method for assisting learning according to still another exemplary embodiment of the present application. The method for assisting learning comprises the following steps.
610: receiving the operation that the user starts to learn the ancient characters, namely receiving the operation that the user clicks the ancient character learning application program. That is, the user terminal detects that the user starts learning the ancient words.
620: and reading the canvas area initialized by the ancient character image information, namely acquiring the canvas area when the tracing operation is not executed, wherein the canvas area comprises the characters to be traced.
630: configuration information for the brush input by a user is received, wherein the configuration information comprises color, thickness and the like.
640: and detecting the clicking position of the user on the user interface.
650: and judging whether the click position is in the canvas area, if so, executing step 660, otherwise, executing step 640.
660: the method comprises the steps of obtaining position information of a user application brush clicked in a canvas area, and converting the position information into coordinate information of a coordinate system on a user interface.
670: the pixel area (i.e., UV location) corresponding to the coordinate information is determined.
680: and dyeing each first pixel point in the pixel area through GPU rendering.
690: and drawing and supplementing the first position and the second position drawn by the brush in the front frame image and the back frame image to form a line.
695: and submitting the tracing result and finishing drawing.
It should be understood that the specific working procedures and functions of steps 610 to 695 in the foregoing embodiment may refer to the description of the method for assisting learning provided in the foregoing embodiments of fig. 1 to 5, and are not described herein again to avoid repetition.
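Step 690's line completion between two consecutive frames can be sketched with simple linear interpolation; the actual rendering of each interpolated pixel (the GPU dyeing of step 680) is abstracted away here:

```python
def fill_between(p1, p2):
    """Integer pixel positions connecting two brush samples (step 690)."""
    (x1, y1), (x2, y2) = p1, p2
    # Step along the longer axis so consecutive positions are adjacent.
    steps = max(abs(x2 - x1), abs(y2 - y1))
    if steps == 0:
        return [p1]
    return [
        (round(x1 + (x2 - x1) * t / steps),
         round(y1 + (y2 - y1) * t / steps))
        for t in range(steps + 1)
    ]

# Brush positions sampled in two consecutive frame images.
print(fill_between((0, 0), (3, 1)))  # [(0, 0), (1, 0), (2, 1), (3, 1)]
```

Each returned position would then be dyed like a direct click, so fast brush movements still leave an unbroken stroke.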
Fig. 7 is a schematic structural diagram of a user interface 700 provided in an exemplary embodiment of the present application. The user interface 700 includes a canvas area 701, a simplified-character display area 702, a brush configuration area 703, and a submit control 704. The canvas area 701 includes a character 7011 to be traced (e.g., the oracle bone script form of "force"). The simplified-character display area 702 displays the simplified character corresponding to the character to be traced (e.g., "force"). The brush configuration area 703 lets the user input configuration information of the brush, such as thickness and color. The user traces over the character 7011 to be traced in the canvas area 701, thereby learning the character.
Exemplary devices
Fig. 8 is a schematic structural diagram of an apparatus 800 for assisting learning according to an exemplary embodiment of the present application. As shown in fig. 8, the apparatus 800 for assisting learning includes: a display module 810, a receiving module 820, and an obtaining module 830.
The display module 810 is configured to display a text to be traced in a canvas area on the user interface, where the text to be traced is an ancient text; the receiving module 820 is used for receiving a tracing result of a character to be traced in the canvas area by a user; the obtaining module 830 is configured to obtain a tracing score based on the tracing result and the text to be traced.
The embodiment of the application provides a device for assisting learning. By offering a user interface on which the user can trace ancient characters, and obtaining a tracing score from the tracing result and the character to be traced, the device deepens the user's impression of the ancient characters during tracing; meanwhile, by outputting the tracing score, the user gains an intuitive, objective understanding of how well he or she writes the ancient characters, which strengthens the user's enthusiasm and interest in learning.
According to an embodiment of the present application, the receiving module 820 is configured to receive configuration information of the brush input by the user, where the configuration information includes a color; step a: acquiring the position at which the user clicks with the brush in the canvas area; step b: determining a pixel area corresponding to the position information; step c: dyeing each first pixel point in the pixel area with the color to obtain a pixel dyeing result; and repeating steps a to c until the tracing result, comprising a plurality of pixel dyeing results, is obtained.
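Steps a to c can be sketched as follows, assuming the pixel area around a click is a square of a given radius and the canvas is a plain grid of RGB tuples (both are assumptions, since the text leaves the pixel-area shape and the canvas representation unspecified):

```python
def dye_area(canvas, center, radius, color):
    """Dye every pixel of the square pixel area around `center` (steps a-c)."""
    cx, cy = center
    height, width = len(canvas), len(canvas[0])
    # Clamp to the canvas bounds so edge clicks stay inside the grid.
    for y in range(max(0, cy - radius), min(height, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(width, cx + radius + 1)):
            canvas[y][x] = color  # each "first pixel point" takes the brush color

WHITE, RED = (255, 255, 255), (255, 0, 0)
canvas = [[WHITE] * 5 for _ in range(5)]
dye_area(canvas, center=(2, 2), radius=1, color=RED)
print(sum(px == RED for row in canvas for px in row))  # 9 pixels dyed
```

Repeating this for every sampled click position accumulates the pixel dyeing results into the tracing result.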
According to an embodiment of the present application, the receiving module 820 is configured to obtain a first frame image and a second frame image when a user draws a character to be traced, where the first frame image and the second frame image are two consecutive frame images; acquiring a first position drawn by a brush in a first frame image and a second position drawn by the brush in a second frame image; and filling at least one second pixel point between the first position and the second position by using the color so as to realize the function of continuous drawing.
According to an embodiment of the present application, the obtaining module 830 is configured to obtain the number of third pixel points included in the text to be traced; acquiring the number of fourth pixel points at the corresponding positions of the tracing result and the character to be traced and the number of fifth pixel points overflowing the character part to be traced; and acquiring the tracing score based on the number of the third pixel points, the number of the fourth pixel points and the number of the fifth pixel points.
According to an embodiment of the present application, the obtaining module 830 is configured to determine the tracing coverage according to a ratio of the number of the fourth pixel points to the number of the third pixel points; determining the tracing overflow rate according to the ratio of the number of the fifth pixel points to the sum of the number of the fourth pixel points and the number of the fifth pixel points; and determining the tracing achievement based on the tracing coverage rate and the tracing overflow rate.
According to an embodiment of the present application, the display module 810 is configured to display a simplified text corresponding to the text to be traced on the user interface.
According to an embodiment of the present application, the receiving module 820 is configured to detect a click position of a user on a user interface; when the click position is located in the canvas area, a drawing operation of a user is received.
According to an embodiment of the application, the character to be traced is a character in any one of oracle bone script, bronze script, and seal script.
It should be understood that, for the specific working processes and functions of the display module 810, the receiving module 820 and the obtaining module 830 in the foregoing embodiments, reference may be made to the description of the method for assisting learning provided in the foregoing embodiments of fig. 1 to 7, and in order to avoid repetition, detailed description is not repeated here.
Exemplary electronic device and computer-readable storage Medium
Fig. 9 is a block diagram of an electronic device 900 for assisting learning provided by an exemplary embodiment of the present application.
Referring to fig. 9, electronic device 900 includes a processing component 910 that further includes one or more processors, and memory resources, represented by memory 920, for storing instructions, such as applications, that are executable by processing component 910. The application program stored in memory 920 may include one or more modules that each correspond to a set of instructions. Further, the processing component 910 is configured to execute instructions to perform the above-described method of assisted learning.
The electronic device 900 may also include a power component configured to perform power management for the electronic device 900, a wired or wireless network interface configured to connect the electronic device 900 to a network, and an input/output (I/O) interface. The electronic device 900 may operate based on an operating system stored in the memory 920, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
A non-transitory computer readable storage medium having instructions stored thereon that, when executed by a processor of the electronic device 900, enable the electronic device 900 to perform a method of assisting learning, comprising: displaying characters to be traced in a canvas area on a user interface, wherein the characters to be traced are ancient characters; receiving a tracing result of a user on characters to be traced in a canvas area; and obtaining a tracing score based on the tracing result and the character to be traced.
All the above optional technical solutions can be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application, or the portions thereof that substantially contribute over the prior art, may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in the description of the present application, the terms "first", "second", "third", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, the meaning of "a plurality" is two or more unless otherwise specified.
The above description is only a preferred embodiment of the present application and should not be taken as limiting the present application, and any modifications, equivalents and the like that are within the spirit and scope of the present application should be included.

Claims (11)

1. A method of assisted learning, comprising:
displaying a character to be traced in a canvas area on a user interface, wherein the character to be traced is an ancient character;
receiving a tracing result of a user on the character to be traced in the canvas area;
and acquiring a tracing score based on the tracing result and the character to be traced.
2. The learning-aided method according to claim 1, wherein the receiving of the tracing result of the user on the character to be traced in the canvas area comprises:
receiving configuration information of a brush input by the user, wherein the configuration information comprises a color;
step a: acquiring position information of a click made by the user with the brush in the canvas area;
step b: determining a pixel area corresponding to the position information;
step c: dyeing each first pixel point in the pixel area by using the color to obtain a pixel dyeing result;
repeating the steps a to c until obtaining the tracing result, wherein the tracing result comprises a plurality of pixel dyeing results.
3. The method of assisted learning of claim 2, further comprising:
acquiring a first frame image and a second frame image when the user draws the character to be traced, wherein the first frame image and the second frame image are two continuous frame images;
acquiring a first position drawn by the brush in the first frame image and a second position drawn by the brush in the second frame image;
and filling at least one second pixel point between the first position and the second position by using the color so as to realize the function of continuous drawing.
4. The learning-assisting method according to claim 1, wherein the obtaining of the tracing score based on the tracing result and the character to be traced comprises:
acquiring the number of third pixel points included in the character to be traced;
acquiring the number of fourth pixel points at the position of the tracing result corresponding to the character to be traced and the number of fifth pixel points overflowing the character part to be traced;
and acquiring the tracing score based on the number of the third pixel points, the number of the fourth pixel points and the number of the fifth pixel points.
5. The learning-aided method according to claim 4, wherein the obtaining the tracing achievement based on the number of the third pixels, the number of the fourth pixels and the number of the fifth pixels comprises:
determining the tracing coverage rate according to the ratio of the number of the fourth pixel points to the number of the third pixel points;
determining a tracing overflow rate according to the ratio of the number of the fifth pixel points to the sum of the number of the fourth pixel points and the number of the fifth pixel points;
determining the tracing performance based on the tracing coverage and the tracing overflow rate.
6. The method of assisted learning of claim 1, further comprising:
and displaying the simplified characters corresponding to the characters to be traced on the user interface.
7. The learning-assisted method according to claim 1, further comprising, before receiving the tracing result of the user on the character to be traced in the canvas area:
detecting the clicking position of the user on the user interface;
and when the clicking position is located in the canvas area, receiving the drawing operation of the user.
8. The learning-assisting method according to any one of claims 1 to 7, wherein the character to be traced is a character in any one of oracle bone script, bronze script, and seal script.
9. An apparatus for assisting learning, comprising:
the display module is used for displaying characters to be traced in a canvas area on a user interface, wherein the characters to be traced are ancient characters;
the receiving module is used for receiving a tracing result of the user on the character to be traced in the canvas area;
and the obtaining module is used for obtaining a tracing score based on the tracing result and the character to be traced.
10. A computer-readable storage medium, characterized in that the storage medium stores a computer program for executing the method of assisted learning of any of the preceding claims 1 to 8.
11. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions,
wherein the processor is configured to perform the method for assisting learning of any one of claims 1 to 8.
CN202210833277.7A 2022-07-15 2022-07-15 Learning assisting method and device Pending CN115083230A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210833277.7A CN115083230A (en) 2022-07-15 2022-07-15 Learning assisting method and device


Publications (1)

Publication Number Publication Date
CN115083230A true CN115083230A (en) 2022-09-20

Family

ID=83259220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210833277.7A Pending CN115083230A (en) 2022-07-15 2022-07-15 Learning assisting method and device

Country Status (1)

Country Link
CN (1) CN115083230A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101079191A (en) * 2007-06-07 2007-11-28 深圳市和而泰电子科技有限公司 Touching electronic copybook
CN101339703A (en) * 2008-08-14 2009-01-07 武汉瑞中教育管理有限责任公司 Character calligraph exercising method based on computer
CN104391651A (en) * 2014-12-11 2015-03-04 北京轩文文化发展有限公司 Calligraphic handwriting presentation method based on optical principle
CN106373455A (en) * 2016-09-21 2017-02-01 陈新德 Micro projection imitation display apparatus and display method
CN107466415A (en) * 2016-07-11 2017-12-12 深圳市柔宇科技有限公司 A kind of electronic copybook, calligraphy practising pen and system
CN110244870A (en) * 2019-05-08 2019-09-17 深圳市战音科技有限公司 A kind of electronic drawing board copying method and relevant device
CN114253435A (en) * 2021-12-17 2022-03-29 安徽淘云科技股份有限公司 Handwriting display method and device, electronic equipment and storage medium
CN114415918A (en) * 2022-01-20 2022-04-29 深圳市闪联信息技术有限公司 Electronic equipment, method and device for simulating pen touch and computer readable medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination