CN113031793A - Contour acquisition method and device and intelligent pen - Google Patents

Contour acquisition method and device and intelligent pen

Info

Publication number
CN113031793A
CN113031793A (Application CN202110369015.5A)
Authority
CN
China
Prior art keywords
pen
detection module
coordinate
contour
posture information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110369015.5A
Other languages
Chinese (zh)
Other versions
CN113031793B (en)
Inventor
洪小康
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110369015.5A priority Critical patent/CN113031793B/en
Publication of CN113031793A publication Critical patent/CN113031793A/en
Application granted granted Critical
Publication of CN113031793B publication Critical patent/CN113031793B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a contour acquisition method and device and an intelligent pen, and belongs to the field of communication. The contour acquisition method includes: when a drawing mode is enabled, acquiring first posture information and a first coordinate through the first detection module, and acquiring second posture information through the second detection module; determining a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information; and sending the second coordinate to a target device so that the target device displays the contour of the object surface stroked by the pen tip, wherein the contour is derived from the spatial trajectory formed by the second coordinates.

Description

Contour acquisition method and device and intelligent pen
Technical Field
The application belongs to the technical field of communication, and particularly relates to a contour acquisition method, a contour acquisition device and an intelligent pen.
Background
With the development of science and technology, the three-dimensional contour of an object needs to be acquired in more and more application scenarios. A conventional contour acquisition apparatus generally uses structured-light detection to acquire the three-dimensional contour of an object. For example, height contour lines of the object are obtained one by one through raster scanning based on the triangulation principle; the contour lines are then registered in a unified coordinate system to form point cloud data of the three-dimensional contour, and the three-dimensional contour data of the object can be obtained from the point cloud data and the intrinsic parameters of the device.
However, for the conventional contour collection device, because the three-dimensional contour of the object needs to be collected based on the structured light detection, the contour collection device has a complex internal structure, high cost, large volume and mass, is not portable, and is difficult to adapt to the requirement of the user for collecting the three-dimensional contour of the object in different scenes.
Disclosure of Invention
The embodiment of the application aims to provide a contour acquisition method and device, an intelligent pen and a readable storage medium, so as to solve the problems that existing contour acquisition equipment has a complex internal structure, high cost, and a large size and mass that make it inconvenient to carry.
In a first aspect, an embodiment of the present application provides a contour collection method, which is used for an intelligent pen, where the intelligent pen includes a pen body and a pen point portion arranged at an end of the pen body, a first detection module is arranged in the pen body, a second detection module is arranged in the pen point portion, and the contour collection method includes:
under the condition that a drawing mode is started, acquiring first posture information and a first coordinate through the first detection module, and acquiring second posture information through the second detection module;
determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate, the distance between the pen point and the first detection module, the first posture information and the second posture information;
sending the second coordinate to a target device to enable the target device to display the contour of the surface of the object stroked by the pen point; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
In a second aspect, an embodiment of the present application provides a contour acquisition device for an intelligent pen, where the intelligent pen includes a pen body and a pen point portion arranged at an end of the pen body, a first detection module is arranged in the pen body, and a second detection module is arranged in the pen point portion. The contour acquisition device includes:
the acquisition module is used for acquiring first posture information and a first coordinate through the first detection module and acquiring second posture information through the second detection module under the condition that a drawing mode is started;
the determining module is used for determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate, the distance between the pen point and the first detection module, the first posture information and the second posture information;
the sending module is used for sending the second coordinate to a target device so that the target device displays the contour of the surface of the object stroked by the pen point; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored in the memory and executable on the processor, where the program or instructions, when executed by the processor, implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the application, when the smart pen is in the drawing mode, the first detection module acquires first posture information and a first coordinate, and the second detection module acquires second posture information; a second coordinate of the pen tip in the spatial coordinate system is determined according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information; and the second coordinate is sent to a target device so that the target device displays the contour of the object surface stroked by the pen tip, where the contour is derived from the spatial trajectory formed by the second coordinates. In this way, the contour of an object can be formed simply by stroking the pen tip of the smart pen across the surface of the object; the method is simple, and the smart pen has a simple structure and low cost. Moreover, the smart pen is small and easy to carry, so it can meet users' needs to acquire object contours in different scenes, which greatly enriches the user experience.
Drawings
FIG. 1 is a schematic structural diagram of a smart pen of the present application;
FIG. 2 is a schematic diagram of the internal modules of the smart pen of FIG. 1;
FIG. 3 is a flow chart of steps of a contour collection method according to an embodiment of the present application;
FIG. 4 is a flow chart of steps of another contour collection method of an embodiment of the present application;
FIG. 5 is a block diagram of a profile collection device according to an embodiment of the present application;
fig. 6 is a structural diagram of an electronic device according to an embodiment of the present application;
fig. 7 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments that can be derived by one of ordinary skill in the art from the embodiments given herein are intended to be within the scope of the present disclosure.
The terms "first", "second" and the like in the description and claims of the present application are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It should be appreciated that data so used may be interchanged under appropriate circumstances, so that the embodiments of the application can be practiced in orders other than those illustrated or described herein. Moreover, the terms "first", "second" and the like are used in a generic sense and do not limit the number of objects; for example, a first object may be one or more than one. In addition, "and/or" in the description and claims means at least one of the connected objects, and the character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The following describes in detail the contour acquisition method provided by the embodiment of the present application with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, a schematic structural diagram of a smart pen according to the present application is shown. As shown in fig. 1, the smart pen may specifically include a pen body 10 and a pen tip portion 11 arranged at the end of the pen body 10, where a first detection module A is arranged in the pen body 10 and a second detection module B is arranged in the pen tip portion 11. In practical applications, the second detection module B may be disposed in the pen tip portion 11 close to the pen tip of the smart pen; since the second detection module B is very close to the pen tip, the second detection module B may be regarded as coinciding with the pen tip in order to simplify the calculation.
In the embodiment of the present application, the first detection module A may include an accelerometer, a gyroscope, and a magnetometer. The accelerometer may be used to detect the linear acceleration at the first detection module A, the gyroscope may be used to detect the angular velocity at the first detection module A, and the magnetometer may be used to measure the magnetic field strength and direction at the first detection module A and thereby determine the orientation of the device. Specifically, the accelerometer detects the accelerations at point A in different directions in real time, and the first coordinates (Lx, Ly, Lz) of the first detection module A in the spatial coordinate system can be determined by integrating the acceleration twice. The parameters measured by the gyroscope and the magnetometer may serve as the first posture information of the first detection module A; specifically, the first posture information may include the angular velocity information and the magnetic field strength and orientation information of the first detection module A.
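The double integration mentioned above can be illustrated with a short sketch. The following Python code is illustrative only: the function name, the assumption that the acceleration samples are already expressed in the fixed spatial frame with gravity removed, and the fixed sampling interval are assumptions rather than text from the patent, and a real implementation would also need drift correction.

```python
import numpy as np

def integrate_position(accel_samples, dt):
    """Estimate the first coordinates (Lx, Ly, Lz) of detection module A by
    integrating linear acceleration twice over time.

    accel_samples: iterable of (ax, ay, az) in m/s^2, assumed to be rotated
                   into the fixed spatial frame and gravity-compensated.
    dt:            sampling interval in seconds.
    Returns the displacement from the starting position, in metres.
    """
    velocity = np.zeros(3)
    position = np.zeros(3)
    for a in accel_samples:
        velocity += np.asarray(a, dtype=float) * dt  # first integration: acceleration -> velocity
        position += velocity * dt                    # second integration: velocity -> position
    return position  # interpreted as (Lx, Ly, Lz)
```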
In practical applications, since the first detection module A integrates a relatively large number of sensors and is correspondingly larger in volume, the first detection module A may be disposed in the pen body 10, which has a larger internal space.
Note that fig. 1 only shows a case where the first detection module a is disposed at a position where the pen body 10 and the pen tip 11 meet each other, but in practical applications, the first detection module a may be disposed at any position in the pen body 10 according to practical situations, for example, at the middle or the tail of the pen body 10, and the specific position of the first detection module a in the pen body 10 in the embodiments of the present application may not be limited.
In the embodiment of the present application, the second detection module B may include a gyroscope and a magnetometer. The gyroscope may be used to detect the angular velocity at the second detection module B, and the magnetometer may be used to measure the magnetic field strength and direction at the second detection module B and thereby determine the orientation of the device. The parameters measured by the gyroscope and the magnetometer may serve as the second posture information of the second detection module B; specifically, the second posture information may include the angular velocity information and the magnetic field strength and orientation information of the second detection module B.
Referring to fig. 2, which shows a schematic diagram of the internal modules of the smart pen shown in fig. 1, as shown in fig. 2, the smart pen may further include: a processor C, a memory D, a power module E, a communication module F and an antenna unit G; the processor C is electrically connected to the first detection module A, the second detection module B, the memory D, the power module E, the communication module F and the antenna unit G, respectively.
Specifically, the processor C may obtain the first coordinate and the first posture information from the information collected by the sensors in the first detection module A, and obtain the second posture information from the information collected by the sensors in the second detection module B. The memory D may store the distance between the pen tip and the first detection module A in advance, and the processor C may determine the second coordinate of the pen tip in the spatial coordinate system according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information.
In practical application, the smart pen can be in communication connection with a target device through the communication module F, and based on the communication connection, the smart pen can send the second coordinate to the target device, so that the target device forms the contour of the surface of the object stroked by the pen point based on the second coordinate. The power module E may be configured to supply power to each module inside the smart pen.
For example, the form of the communication connection between the smart pen and the target device may include, but is not limited to, any one of a wired connection and a wireless connection, and the communication connection between the smart pen and the target device in the embodiments of the present application may not be limited. The target device may include any one of a mobile phone, a computer, or a wearable device, and the specific type of the target device may not be limited in this application embodiment.
In practical applications, a touch button 12 is further disposed in the smart pen, and when a first input to the touch button by a user is received, the smart pen may start a drawing mode in response to the first input, and when the first input to the touch button is received again, close the drawing mode in response to the first input.
Referring to fig. 3, a flowchart illustrating steps of a contour collection method according to an embodiment of the present application is shown, where the method specifically includes the following steps:
step 301: and under the condition that the drawing mode is started, acquiring first posture information and first coordinates through the first detection module, and acquiring second posture information through the second detection module.
In the embodiment of the application, the smart pen may have a drawing mode and a non-drawing mode. When the drawing mode is enabled, first posture information and first coordinates can be acquired through the first detection module A, and second posture information can be acquired through the second detection module B, while the pen tip of the smart pen slides across the surface of the object. In the non-drawing mode, the first detection module A and the second detection module B may stop collecting information, and the smart pen may be used as an ordinary smart pen.
In practical application, when a user needs to use the smart pen to acquire a three-dimensional contour of an object, a pen point of the smart pen may be placed on a surface of the object, and then a drawing mode of the smart pen is started. After the drawing mode is initiated, the pen tip of the smart pen may be stroked across the surface of the object to capture pose information and coordinate information for as many contour points on the surface of the object as possible. In order to be able to form as complete a contour as possible, the tip of the smart pen should be moved across every surface of the object as far as possible.
Specifically, the first detection module A and the second detection module B may collect the coordinate information and posture information of the different contour points of each part of the object at a preset frequency. The preset frequency may correspond to a time interval on the order of microseconds, for example 3 microseconds, 4 microseconds or 6 microseconds.
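As a rough illustration of this acquisition loop, the sketch below polls both detection modules at a fixed interval while the drawing mode stays enabled. The callback names (`read_module_a`, `read_module_b`, `is_drawing`) and the use of `time.sleep` are assumptions made for illustration; real firmware would use a hardware timer and the actual sensor drivers.

```python
import time

def sample_contour_points(read_module_a, read_module_b, is_drawing, period_s=4e-6):
    """Collect raw samples while the drawing mode is enabled (step 301).

    read_module_a(): assumed to return (first_posture_info, first_coordinate)
    read_module_b(): assumed to return second_posture_info
    is_drawing():    assumed to return True while the drawing mode is enabled
    period_s:        preset sampling interval (a few microseconds in the text)
    """
    samples = []
    while is_drawing():
        first_posture, first_coord = read_module_a()
        second_posture = read_module_b()
        samples.append((first_coord, first_posture, second_posture))
        time.sleep(period_s)  # placeholder for a hardware timer
    return samples
```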
Step 302: determining a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information.
In the embodiment of the application, in the process of using the intelligent pen to stroke on the surface of the object, since the pen holding posture changes, the position of the second detection module B at the pen point relative to the first detection module a also changes correspondingly in the spatial coordinate system.
In practical applications, in order to obtain the second coordinate of the pen tip in the spatial coordinate system, the displacement variable of the second detection module B at the pen tip relative to the first detection module A at each acquisition time is first determined according to the distance H between the pen tip and the first detection module, the first posture information and the second posture information. Then, the second coordinate of the pen tip in the spatial coordinate system is determined according to the first coordinate of the first detection module A and the displacement variable.
Specifically, since the pen tip is drawn across the surface of the object, the second coordinate of the pen tip is the same as the coordinate of the contour point it touches. By drawing the pen tip across as many contour points on the surface of the object as possible, as many second coordinates as possible are acquired, so that the contour of the object surface can be formed from the coordinate information of these contour points.
Step 303: sending the second coordinate to a target device to enable the target device to display the contour of the surface of the object stroked by the pen tip; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
In this embodiment of the application, since the smart pen is in communication connection with the target device, the smart pen can send the second coordinate of the pen tip to the target device over that connection. Because the pen tip is drawn across the surface of the object, the second coordinate of the pen tip is the same as the coordinate of each contour point it touches; therefore, after receiving the second coordinates, the target device can form the spatial trajectory drawn by the pen tip and, from that trajectory, the contour of the object surface stroked by the pen tip. The contour can also be displayed, so that the user can directly see the result of the contour drawing.
In practical applications, the smart pen may send the second coordinate of the pen tip to the target device at a certain frequency, or send the second coordinates to the target device after all contour points of the object surface have been collected; this is not limited in the embodiments of the present application. In addition, the target device may display the contour synchronously while it is being formed, so that the user can intuitively follow the progress, or it may display the contour only after the complete contour has been formed; this is likewise not limited in the embodiments of the present application.
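On the target-device side, the received second coordinates simply need to be accumulated into a trajectory and rendered. The sketch below is a minimal illustration of that idea using matplotlib; the function name and the choice of a 3D line plot are assumptions, not part of the patent.

```python
import numpy as np
import matplotlib.pyplot as plt

def show_contour(second_coordinates):
    """Render the spatial trajectory formed by the received second coordinates
    as the contour of the surface stroked by the pen tip.

    second_coordinates: sequence of (x, y, z) points received from the smart pen.
    """
    points = np.asarray(second_coordinates, dtype=float)
    ax = plt.figure().add_subplot(projection="3d")
    ax.plot(points[:, 0], points[:, 1], points[:, 2])  # trajectory drawn by the pen tip
    ax.set_xlabel("x")
    ax.set_ylabel("y")
    ax.set_zlabel("z")
    plt.show()
```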
In summary, the contour acquisition method according to the embodiment of the present application may include at least the following advantages:
In the embodiment of the application, when the smart pen enables the drawing mode, the first detection module acquires first posture information and a first coordinate, and the second detection module acquires second posture information; a second coordinate of the pen tip in the spatial coordinate system is determined according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information; and the second coordinate is sent to a target device so that the target device displays the contour of the object surface stroked by the pen tip, where the contour is derived from the spatial trajectory formed by the second coordinates. In this way, the contour of an object can be formed simply by stroking the pen tip of the smart pen across the surface of the object; the method is simple, and the smart pen has a simple structure and low cost. Moreover, the smart pen is small and easy to carry, so it can meet users' needs to acquire object contours in different scenes, which greatly enriches the user experience.
Referring to fig. 4, a flowchart illustrating steps of another contour acquisition method according to an embodiment of the present application is shown, and as shown in fig. 4, the method may specifically include:
step 401: a first input is received for a touch button.
In the embodiment of the application, a touch button is further arranged in the smart pen, the touch button can be used for receiving a first input, and the first input can be used for starting or exiting a drawing mode of the smart pen.
For example, the touch button may be disposed at a top of the smart pen, or may be disposed at a side of the smart pen, and the like. The touch button can be triggered in a pressing and rotating mode, and the triggering mode of the touch button in the embodiment of the application is not limited.
Step 402: in response to the first input, a drawing mode is enabled.
In an embodiment of the application, the smart pen may initiate a drawing mode in response to the first input.
Step 403: and under the condition that the drawing mode is started, acquiring first posture information and first coordinates through the first detection module, and acquiring second posture information through the second detection module.
In the embodiment of the present application, the specific implementation process of step 403 is the same as that of step 301 in the foregoing embodiment, and is not described herein again.
Step 404: determining a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information.
In an optional embodiment of the present application, the method of determining the second coordinate of the pen tip in the spatial coordinate system may comprise the sub-steps of:
substep S11: and determining a displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the first posture information and the second posture information.
In the embodiment of the application, in the process of using the intelligent pen to stroke on the surface of the object, since the pen holding posture changes, the position of the second detection module B at the pen point relative to the first detection module a also changes correspondingly in the spatial coordinate system. Therefore, the displacement variation of the second detection module B at the pen point relative to the first detection module a at different times of acquisition needs to be determined.
In an alternative embodiment of the present application, the displacement variable of the second detection module B relative to the first detection module a at the pen tip may be determined according to the following method:
First, a polar angle and an azimuth angle of the pen tip relative to the first detection module are determined according to the first posture information and the second posture information.
In practical applications, the first posture information may include the angular velocity, magnetic field strength and direction measured at the first detection module A, and the second posture information may include the angular velocity, magnetic field strength and direction measured at the second detection module B. With the first detection module A as the origin, the polar angle θ and the azimuth angle λ of the second detection module B relative to the first detection module A can be calculated from the first posture information and the second posture information.
Then, the displacement variable of the pen tip relative to the first detection module is determined according to the distance between the pen tip and the first detection module, the polar angle and the azimuth angle.
In practical applications, the distance H between the pen tip and the first detection module A may be stored in the memory D of the smart pen in advance. From the distance H, the polar angle θ and the azimuth angle λ of the second detection module B relative to the first detection module A, the displacement variables σLx, σLy, σLz of the second detection module B at the pen tip relative to the first detection module A can be determined by the standard spherical-to-Cartesian relations:

σLx = H·sin θ·cos λ

σLy = H·sin θ·sin λ

σLz = H·cos θ
substep S12: determining a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate and the displacement variable.
In the embodiment of the present application, since the first coordinates (Lx, Ly, Lz) of the first detection module A have already been determined in step 403, the second coordinates (Lx + σLx, Ly + σLy, Lz + σLz) of the pen tip in the spatial coordinate system can be determined from the first coordinates (Lx, Ly, Lz) of the first detection module A and the displacement variables σLx, σLy, σLz of the second detection module B relative to the first detection module A.
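The two sub-steps above can be summarised in a short sketch. The code below is a minimal, illustrative Python version; the function name, the use of radians, and the spherical-to-Cartesian form of the displacement variables are assumptions consistent with the description rather than text taken from the patent.

```python
import math

def pen_tip_coordinate(first_coord, H, polar_angle, azimuth):
    """Sub-steps S11/S12: derive the displacement variables from the distance H,
    the polar angle and the azimuth angle, then add them to the first coordinate
    (Lx, Ly, Lz) to obtain the second coordinate of the pen tip.

    Angles are in radians.
    """
    sigma_lx = H * math.sin(polar_angle) * math.cos(azimuth)
    sigma_ly = H * math.sin(polar_angle) * math.sin(azimuth)
    sigma_lz = H * math.cos(polar_angle)
    lx, ly, lz = first_coord
    return (lx + sigma_lx, ly + sigma_ly, lz + sigma_lz)
```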
Specifically, since the pen tip is drawn across the surface of the object, the second coordinate of the pen tip is the same as the coordinate of the contour point it touches. By drawing the pen tip across as many contour points on the surface of the object as possible, as many second coordinates as possible are acquired, so that the contour of the object surface can be formed from the coordinate information of these contour points.
Step 405: sending the second coordinate to a target device to enable the target device to display the contour of the surface of the object stroked by the pen tip; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
In the embodiment of the present application, the specific implementation process of step 405 is the same as that in step 303 in the foregoing embodiment, and is not described herein again.
Step 406: in a case where a contour completion instruction sent by the target device is received, interrupting the communication connection between the smart pen and the target device; wherein the contour completion instruction is determined by the target device after it forms the contour of the object surface.
In practical applications, the target device may generate a contour completion instruction after forming the contour of the object. The contour completion instruction may be generated automatically after the target device has formed the contour, or may be generated according to an operation performed by the user on the target device. After generating the contour completion instruction, the target device may send it to the smart pen.
In the embodiment of the application, when the smart pen receives the contour completion instruction sent by the target device, it can interrupt the communication connection between the smart pen and the target device and stop sending the second coordinate of the pen tip in the spatial coordinate system to the target device, which avoids repeated drawing of the object contour by the target device and saves power on both the smart pen and the target device.
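A minimal sketch of this shutdown logic is shown below. The `connection` object with `poll`, `send` and `close` methods, and the message name "CONTOUR_COMPLETE", are hypothetical placeholders used only to illustrate step 406.

```python
def stream_until_contour_complete(connection, next_second_coordinate):
    """Keep sending second coordinates until the target device reports that the
    contour is complete, then interrupt the communication connection."""
    while True:
        if connection.poll() == "CONTOUR_COMPLETE":
            connection.close()  # interrupt the communication connection
            break               # stop sending coordinates to save power
        connection.send(next_second_coordinate())
```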
Step 407: in a case where the first input for the touch button is received again, the drawing mode is turned off in response to the first input.
In this embodiment of the application, when the smart pen receives the first input to the touch button again, the drawing mode may be closed in response to the first input, so that the smart pen may operate in a non-drawing mode, and switching between the drawing mode and the non-drawing mode is achieved.
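The mode switching described in steps 402 and 407 amounts to a simple toggle, as in the sketch below (the class and method names are illustrative only).

```python
class DrawingModeSwitch:
    """Toggle the drawing mode on the first input and off when it is received again."""

    def __init__(self):
        self.drawing_mode = False

    def on_first_input(self):
        self.drawing_mode = not self.drawing_mode  # enable on first press, disable on the next
        return self.drawing_mode
```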
For example, when the smart pen operates in the non-drawing mode, it may be used to input instructions to a computer screen, a mobile device, a drawing board or another device with a touch screen, and the user may select a file or a drawing by tapping the touch screen with the smart pen.
In summary, the contour acquisition method according to the embodiment of the present application may include at least the following advantages:
In the embodiment of the application, when the smart pen has the drawing mode enabled, the contour of an object is formed by stroking the pen tip of the smart pen across the surface of the object; the method is simple, and the smart pen has a simple structure and low cost. Moreover, the smart pen is small and easy to carry, so it can meet users' needs to acquire object contours in different scenes, which greatly enriches the user experience. In addition, the smart pen can switch between the drawing mode and the non-drawing mode in response to the first input to the touch button, which enriches the functions of the smart pen.
It should be noted that, in the contour acquisition method provided in the embodiments of the present application, the execution body may be a contour acquisition device, or a control module of the contour acquisition device for executing the contour acquisition method. In the embodiments of the present application, a contour acquisition device executing the contour acquisition method is taken as an example to describe the contour acquisition device provided by the embodiments of the present application.
Referring to fig. 5, a block diagram of a profile collection device according to an embodiment of the present application is shown, where the profile collection device may be used in a smart pen, where the smart pen includes a pen body and a pen tip portion disposed at an end of the pen body, a first detection module is disposed in the pen body, and a second detection module is disposed in the pen tip portion, where the profile collection device 500 specifically includes:
an obtaining module 501, configured to obtain, by the first detecting module, first pose information and a first coordinate, and obtain, by the second detecting module, second pose information when the drawing mode is enabled.
A determining module 502, configured to determine a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information, and the second posture information.
A sending module 503, configured to send the second coordinate to a target device, so that the target device displays the contour of the surface of the object stroked by the pen tip; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
Optionally, the determining module 502 may include:
the first determining submodule is used for determining a displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the first posture information and the second posture information;
and the second determining submodule is used for determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate and the displacement variable.
Optionally, the first determining sub-module includes:
a first determining unit, configured to determine a polar angle and an azimuth angle of the pen tip relative to the first detection module according to the first posture information and the second posture information;
and a second determining unit, configured to determine the displacement variable of the pen tip relative to the first detection module according to the distance between the pen tip and the first detection module, the polar angle and the azimuth angle.
Optionally, the contour acquisition apparatus 500 may further include:
the interruption module is used for interrupting the communication connection between the intelligent pen and the target equipment under the condition of receiving a contour completion instruction sent by the target equipment; wherein the contour completion instruction is determined by the target device after forming the contour of the object surface.
Optionally, the smart pen further includes a touch button, and the contour acquisition device 500 may further include:
a receiving module for receiving a first input for the touch button;
an enabling module to enable a drawing mode in response to the first input.
Optionally, the contour acquisition apparatus 500 may further include:
a closing module, configured to close the drawing mode in response to a first input to the touch button when the first input is received again.
In the embodiment of the application, when the smart pen enables the drawing mode, the first detection module acquires first posture information and a first coordinate, and the second detection module acquires second posture information; a second coordinate of the pen tip in the spatial coordinate system is determined according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information; and the second coordinate is sent to a target device so that the target device displays the contour of the object surface stroked by the pen tip, where the contour is derived from the spatial trajectory formed by the second coordinates. In this way, the contour of an object can be formed simply by stroking the pen tip of the smart pen across the surface of the object; the method is simple, and the smart pen has a simple structure and low cost. Moreover, the smart pen is small and easy to carry, so it can meet users' needs to acquire object contours in different scenes, which greatly enriches the user experience.
The contour acquisition device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The contour acquisition device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The contour acquisition device provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 3 to 4, and is not described here again to avoid repetition.
Optionally, as shown in fig. 6, an electronic device 600 is further provided in this embodiment of the present application, and includes a processor 601, a memory 602, and a program or an instruction stored in the memory 602 and executable on the processor 601, where the program or the instruction is executed by the processor 601 to implement each process of the foregoing embodiment of the contour collection method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 7 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 700 includes, but is not limited to: a radio frequency unit 701, a network module 702, an audio output unit 703, an input unit 704, a sensor 705, a display unit 706, a user input unit 707, an interface unit 708, a memory 709, and a processor 710.
Those skilled in the art will appreciate that the electronic device 700 may also include a power supply (e.g., a battery) for powering the various components, and the power supply may be logically coupled to the processor 710 via a power management system, such that the functions of managing charging, discharging, and power consumption may be performed via the power management system. The electronic device structure shown in fig. 7 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is omitted here.
A processor 710, configured to, when the drawing mode is enabled, obtain first posture information and first coordinates through the first detection module, and obtain second posture information through the second detection module; determine a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information; and send the second coordinate to a target device to enable the target device to display the contour of the surface of the object stroked by the pen tip; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
In the embodiment of the application, when the smart pen enables the drawing mode, the first detection module acquires first posture information and a first coordinate, and the second detection module acquires second posture information; a second coordinate of the pen tip in the spatial coordinate system is determined according to the first coordinate, the distance between the pen tip and the first detection module, the first posture information and the second posture information; and the second coordinate is sent to a target device so that the target device displays the contour of the object surface stroked by the pen tip, where the contour is derived from the spatial trajectory formed by the second coordinates. In this way, the contour of an object can be formed simply by stroking the pen tip of the smart pen across the surface of the object; the method is simple, and the smart pen has a simple structure and low cost. Moreover, the smart pen is small and easy to carry, so it can meet users' needs to acquire object contours in different scenes, which greatly enriches the user experience.
A processor 710, further configured to determine a displacement variable of the pen tip relative to the first detection module according to the distance between the pen tip and the first detection module, the first posture information, and the second posture information; and obtain a second coordinate of the pen tip in the spatial coordinate system according to the first coordinate and the displacement variable.
A processor 710, further configured to determine a polar angle and an azimuth angle of the pen tip relative to the first detection module based on the first posture information and the second posture information; and determine the displacement variable of the pen tip relative to the first detection module according to the distance between the pen tip and the first detection module, the polar angle and the azimuth angle.
A processor 710 further configured to receive a first input for the touch button; in response to the first input, a drawing mode is enabled.
The processor 710 is further configured to turn off the drawing mode in response to the first input if the first input for the touch button is received again.
In the embodiment of the application, when the smart pen has the drawing mode enabled, the contour of an object is formed by stroking the pen tip of the smart pen across the surface of the object; the method is simple, and the smart pen has a simple structure and low cost. Moreover, the smart pen is small and easy to carry, so it can meet users' needs to acquire object contours in different scenes, which greatly enriches the user experience. In addition, the smart pen can switch between the drawing mode and the non-drawing mode in response to the first input to the touch button, which enriches the functions of the smart pen.
It should be understood that, in the embodiment of the present application, the input unit 704 may include a Graphics Processing Unit (GPU) 7041 and a microphone 7042, where the graphics processing unit 7041 processes image data of still pictures or videos obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The display unit 706 may include a display panel 7061, and the display panel 7061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 707 includes a touch panel 7071 and other input devices 7072. The touch panel 7071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. The other input devices 7072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 709 may be used to store software programs as well as various data, including but not limited to applications and an operating system. The processor 710 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, etc., and a modem processor, which mainly handles wireless communication. It can be appreciated that the modem processor may alternatively not be integrated into the processor 710.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the foregoing contour acquisition method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above-mentioned contour acquisition method embodiment, and can achieve the same technical effect, and is not described here again to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A contour acquisition method for a smart pen, wherein the smart pen comprises a pen body and a pen point portion arranged at an end of the pen body, a first detection module is arranged in the pen body, and a second detection module is arranged in the pen point portion, and the contour acquisition method comprises:
under the condition that a drawing mode is started, acquiring first posture information and a first coordinate through the first detection module, and acquiring second posture information through the second detection module;
determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate, the distance between the pen point and the first detection module, the first posture information and the second posture information;
sending the second coordinate to a target device to enable the target device to display the contour of the surface of the object stroked by the pen point; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
2. The contour acquisition method of claim 1, wherein the determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate, the distance between the pen point and the first detection module, the first posture information, and the second posture information comprises:
determining a displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the first posture information and the second posture information;
and obtaining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate and the displacement variable.
3. The contour acquisition method of claim 2, wherein the determining a displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the first posture information and the second posture information comprises:
determining a polar angle and an azimuth angle of the pen point relative to the first detection module according to the first posture information and the second posture information;
and determining the displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the polar angle and the azimuth angle.
4. The contour acquisition method according to claim 1, wherein the smart pen further comprises a touch button, and before the acquiring of the first posture information and the first coordinate through the first detection module and the acquiring of the second posture information through the second detection module when the drawing mode is enabled, the contour acquisition method further comprises:
receiving a first input for the touch button;
in response to the first input, a drawing mode is enabled.
5. The contour acquisition method according to claim 4, wherein after the sending the second coordinate to a target device to cause the target device to display the contour of the surface of the object stroked by the pen point, the contour acquisition method further comprises:
in a case where the first input for the touch button is received again, the drawing mode is turned off in response to the first input.
6. A contour acquisition device for a smart pen, wherein the smart pen comprises a pen body and a pen point portion arranged at an end of the pen body, a first detection module is arranged in the pen body, and a second detection module is arranged in the pen point portion, and the contour acquisition device comprises:
the acquisition module is used for acquiring first posture information and a first coordinate through the first detection module and acquiring second posture information through the second detection module under the condition that a drawing mode is started;
the determining module is used for determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate, the distance between the pen point and the first detection module, the first posture information and the second posture information;
the sending module is used for sending the second coordinate to a target device so that the target device displays the contour of the surface of the object stroked by the pen point; wherein the contour is derived based on a spatial trajectory formed by the second coordinates.
7. The contour acquisition device of claim 6, wherein the determining module comprises:
the first determining submodule is used for determining a displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the first posture information and the second posture information;
and the second determining submodule is used for determining a second coordinate of the pen point in the spatial coordinate system according to the first coordinate and the displacement variable.
8. The contour acquisition device of claim 7, wherein the first determining submodule comprises:
a first determining unit, configured to determine a polar angle and an azimuth angle of the pen point relative to the first detection module according to the first posture information and the second posture information;
and a second determining unit, configured to determine the displacement variable of the pen point relative to the first detection module according to the distance between the pen point and the first detection module, the polar angle and the azimuth angle.
9. The contour acquisition device of claim 6, wherein the smart pen further comprises a touch button, and the contour acquisition device further comprises:
a receiving module for receiving a first input for the touch button;
an enabling module to enable a drawing mode in response to the first input.
10. The contour acquisition device of claim 9, further comprising:
a closing module, configured to close the drawing mode in response to a first input to the touch button when the first input is received again.
11. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions when executed by the processor implementing the steps of the contour acquisition method according to any one of claims 1-5.
12. A readable storage medium, on which a program or instructions are stored which, when executed by a processor, carry out the steps of the contour acquisition method according to any one of claims 1 to 5.
CN202110369015.5A 2021-04-06 2021-04-06 Contour acquisition method and device and intelligent pen Active CN113031793B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110369015.5A CN113031793B (en) 2021-04-06 2021-04-06 Contour acquisition method and device and intelligent pen

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110369015.5A CN113031793B (en) 2021-04-06 2021-04-06 Contour acquisition method and device and intelligent pen

Publications (2)

Publication Number Publication Date
CN113031793A true CN113031793A (en) 2021-06-25
CN113031793B CN113031793B (en) 2024-05-31

Family

ID=76454376

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110369015.5A Active CN113031793B (en) 2021-04-06 2021-04-06 Contour acquisition method and device and intelligent pen

Country Status (1)

Country Link
CN (1) CN113031793B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009223839A (en) * 2008-03-19 2009-10-01 Ricoh Co Ltd Pen type input device and handwriting input method
CN101833668A (en) * 2010-04-23 2010-09-15 清华大学 Detection method for similar units based on profile zone image
CN109099827A (en) * 2018-07-18 2018-12-28 厦门丰臻电子科技有限公司 A method of pen body posture is detected with electromagnetic location dual sensor by capacitor
US20190084331A1 (en) * 2017-09-21 2019-03-21 Casio Computer Co., Ltd. Contour detecting device, printing device, contour detecting method and storage medium
CN112132080A (en) * 2020-09-29 2020-12-25 深圳棒棒帮科技有限公司 Method and device for solving pen point image coordinates of intelligent pen, medium and intelligent pen

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009223839A (en) * 2008-03-19 2009-10-01 Ricoh Co Ltd Pen type input device and handwriting input method
CN101833668A (en) * 2010-04-23 2010-09-15 清华大学 Detection method for similar units based on profile zone image
US20190084331A1 (en) * 2017-09-21 2019-03-21 Casio Computer Co., Ltd. Contour detecting device, printing device, contour detecting method and storage medium
CN109099827A (en) * 2018-07-18 2018-12-28 厦门丰臻电子科技有限公司 A method of pen body posture is detected with electromagnetic location dual sensor by capacitor
CN112132080A (en) * 2020-09-29 2020-12-25 深圳棒棒帮科技有限公司 Method and device for solving pen point image coordinates of intelligent pen, medium and intelligent pen

Also Published As

Publication number Publication date
CN113031793B (en) 2024-05-31

Similar Documents

Publication Publication Date Title
US20140300542A1 (en) Portable device and method for providing non-contact interface
EP2775424A2 (en) Method for providing augmented reality, machine-readable storage medium, and portable terminal
CN111176764B (en) Display control method and terminal equipment
CN104081307A (en) Image processing apparatus, image processing method, and program
EP3842106A1 (en) Method and device for processing control information, electronic equipment, and storage medium
CN109558000B (en) Man-machine interaction method and electronic equipment
CN112929860B (en) Bluetooth connection method and device and electronic equipment
CN112214118A (en) Touch pen, control method thereof and electronic equipment
CN106598422B (en) hybrid control method, control system and electronic equipment
CN112929734A (en) Screen projection method and device and electronic equipment
WO2024012268A1 (en) Virtual operation method and apparatus, electronic device, and readable storage medium
CN112702527A (en) Image shooting method and device and electronic equipment
CN112287708A (en) Near Field Communication (NFC) analog card switching method, device and equipment
CN114489314A (en) Augmented reality image display method and related device
CN113031793B (en) Contour acquisition method and device and intelligent pen
CN112950535B (en) Video processing method, device, electronic equipment and storage medium
CN113592874B (en) Image display method, device and computer equipment
CN111147750B (en) Object display method, electronic device, and medium
CN112861565B (en) Method, apparatus, computer device and storage medium for determining track similarity
WO2021159450A1 (en) Nfc card simulation mode starting method and apparatus, and terminal and storage medium
CN111093030B (en) Equipment control method and electronic equipment
CN112328164B (en) Control method and electronic equipment
CN112540683B (en) Intelligent ring, handwritten character recognition method and electronic equipment
CN112333494B (en) Method and device for acquiring article information and electronic equipment
CN112328156B (en) Input device control method and device and electronic device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant