WO2020151119A1 - Augmented reality method for dental surgery and associated device - Google Patents

Augmented reality method for dental surgery and associated device

Info

Publication number
WO2020151119A1
WO2020151119A1 (PCT/CN2019/084455, CN2019084455W)
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
coordinate
visual marker
transformation matrix
pixel
Prior art date
Application number
PCT/CN2019/084455
Other languages
English (en)
Chinese (zh)
Inventor
王利峰
Original Assignee
雅客智慧(北京)科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 雅客智慧(北京)科技有限公司 filed Critical 雅客智慧(北京)科技有限公司
Publication of WO2020151119A1

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61CDENTISTRY; APPARATUS OR METHODS FOR ORAL OR DENTAL HYGIENE
    • A61C8/00Means to be fixed to the jaw-bone for consolidating natural teeth or for fixing dental prostheses thereon; Dental implants; Implanting tools

Definitions

  • the embodiments of the application relate to the technical field of medical robots, and in particular to an augmented reality method and device for dental surgery.
  • Dental implant surgery requires precise operations in a narrow space. Because the operating space of the oral cavity is small, the implant site is difficult to observe and manipulate directly, so implant positioning depends on the surgeon's experience; this can make the placement insufficiently accurate and easily leads to failure of the implant surgery.
  • the embodiments of the present application provide an augmented reality method and device for dental surgery.
  • an embodiment of the present application proposes an augmented reality method for dental surgery, including:
  • a transformation matrix between a second coordinate system established based on the second visual marker and a first coordinate system established based on the first visual marker is obtained according to the position and posture of the first visual marker and the position and posture of the second visual marker;
  • a coordinate set of the object to be positioned in the first coordinate system is obtained according to the coordinate set of the object to be positioned in a virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system; wherein the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance;
  • a pixel coordinate set of the object to be positioned in the pixel coordinate system is obtained according to the coordinate set of the object to be positioned in the first coordinate system and a projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope of the implant handpiece, and the object to be positioned is displayed on the two-dimensional image corresponding to the endoscope according to the pixel coordinate set; the projection matrix is obtained in advance.
  • an embodiment of the present application provides an augmented reality device for dental surgery, including:
  • the acquiring unit is configured to acquire the position and posture of the first visual marker and the position and posture of the second visual marker; wherein the first visual marker corresponds to the implant handpiece, and the second visual marker corresponds to the oral jaw;
  • the first obtaining unit is configured to obtain, according to the position and posture of the first visual marker and the position and posture of the second visual marker, the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker;
  • the second obtaining unit is configured to, according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the second coordinate system and the first coordinate system A transformation matrix between coordinate systems to obtain a coordinate set of the object to be positioned in the first coordinate system; wherein the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance;
  • the display unit is configured to obtain the pixel coordinate set of the object to be positioned in the pixel coordinate system according to the coordinate set of the object to be positioned in the first coordinate system and the projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope of the implant handpiece, and to display the object to be positioned on the two-dimensional image corresponding to the endoscope according to the pixel coordinate set.
  • an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and capable of running on the processor; when the program is executed, the processor implements the steps of the augmented reality method for dental surgery described in any of the above embodiments.
  • an embodiment of the present application provides a non-transitory computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the augmented reality method for dental surgery described in any of the above embodiments are implemented.
  • the augmented reality method and device for dental surgery can acquire the position and posture of the first visual marker and the position and posture of the second visual marker; obtain from them the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker; obtain the coordinate set of the object to be positioned in the first coordinate system according to its coordinate set in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system; then obtain the pixel coordinate set of the object to be positioned in the pixel coordinate system according to the coordinate set in the first coordinate system and the projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope of the implant handpiece; and finally display the object to be positioned on the two-dimensional image corresponding to the endoscope according to the pixel coordinate set.
  • FIG. 1 is a schematic flowchart of an augmented reality method for dental surgery provided by an embodiment of the application
  • Figure 2 is a schematic diagram of camera calibration provided by an embodiment of the application.
  • FIG. 3 is a schematic flowchart of an augmented reality method for dental surgery provided by another embodiment of the application.
  • FIG. 4 is a schematic structural diagram of an augmented reality device for dental surgery provided by an embodiment of the application.
  • FIG. 5 is a schematic structural diagram of an augmented reality device for dental surgery provided by another embodiment of the application.
  • FIG. 6 is a schematic diagram of the physical structure of an electronic device provided by an embodiment of the application.
  • Augmented Reality (AR) technology has received increasing attention and has played an important role in many industries, showing great potential.
  • Augmented reality technology calculates the position and angle of the image captured by a camera in real time and overlays a corresponding virtual image, so that real-world information and virtual-world information are displayed simultaneously, complementing and superimposing each other.
  • the augmented reality method for dental surgery provided by the embodiments of the present application can expand the observation range in dental surgery and improve the positioning accuracy of the part that needs dental surgery.
  • FIG. 1 is a schematic flowchart of an augmented reality method for dental surgery provided by an embodiment of the application.
  • the augmented reality method for dental surgery provided by an embodiment of the present application includes:
  • an implant handpiece with an endoscope is used.
  • the endoscope is set on the head of the implant handpiece and can extend into the patient's mouth.
  • the camera installed in the endoscope can photograph the oral cavity of the patient and obtain a two-dimensional image of the oral cavity of the patient.
  • the augmented reality device used for dental surgery (hereinafter referred to as the augmented reality device) can track the first visual marker and the second visual marker in real time through optical or electromagnetic navigation instruments commonly used in navigated surgery, thereby obtaining the position and posture of the first visual marker and the position and posture of the second visual marker.
  • the first visual marker corresponds to the implant handpiece and can be set on it; the first visual marker includes at least three first visual marking points that are not on a straight line, and may be set on the motor end of the implant handpiece;
  • the second visual marker corresponds to the oral jaw and includes at least three second visual marking points that are not on a straight line; the second visual marker may be set on a dental tray or implanted in the alveolar process or jaw bone of the patient.
  • the specific installation positions and installation methods of the first visual mark and the second visual mark are selected based on actual experience, and are not limited in the embodiment of the present application.
  • based on the position and posture of the first visual marker and the position and posture of the second visual marker provided by the optical or electromagnetic navigation instrument, the augmented reality device obtains the transformation matrix between the second coordinate system and the first coordinate system.
  • the second coordinate system is a three-dimensional coordinate system established by the optical or electromagnetic navigation instrument based on the second visual marker;
  • the first coordinate system is a three-dimensional coordinate system established by an optical or electromagnetic navigation instrument based on the first visual mark.
  • the transformation matrix between the second coordinate system and the first coordinate system is configured to convert coordinates in the second coordinate system into coordinates in the first coordinate system.
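  • converting a coordinate with such a transformation matrix can be sketched as follows. This NumPy snippet is illustrative only and not part of the patent; the example matrix (a rotation about Z plus a translation) is invented for demonstration:

```python
import numpy as np

def transform_point(T, p):
    """Apply a 4x4 homogeneous transformation matrix T to a 3D point p."""
    ph = np.append(np.asarray(p, dtype=float), 1.0)  # homogeneous coordinates
    return (T @ ph)[:3]

# Hypothetical transform: rotate 90 degrees about Z, then translate by (1, 0, 0)
T21 = np.array([[0.0, -1.0, 0.0, 1.0],
                [1.0,  0.0, 0.0, 0.0],
                [0.0,  0.0, 1.0, 0.0],
                [0.0,  0.0, 0.0, 1.0]])
print(transform_point(T21, [1.0, 0.0, 0.0]))  # -> [1. 1. 0.]
```

The same pattern applies to any of the transformation matrices discussed here, since they are all rigid transforms between three-dimensional coordinate systems.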
  • the three-dimensional model of the patient's mouth can be reconstructed from CT scan data before the dental operation, and the three-dimensional model of the object to be positioned can be established in computer-aided design software; the position, angle and depth of the object to be positioned can then be pre-planned in the three-dimensional model of the patient's mouth. Objects to be positioned include, but are not limited to, implants and temporary crowns.
  • the virtual coordinate system is the three-dimensional coordinate system to which the three-dimensional model of the object to be positioned belongs.
  • the three-dimensional model of the object to be positioned needs to be mapped into the two-dimensional field of view of the endoscope. The coordinate set of the object to be positioned in the virtual coordinate system comprises the coordinates corresponding to the three-dimensional model of the object to be positioned in the virtual coordinate system, for example the coordinates of its outer contour; this coordinate set can be preset according to actual needs.
  • the augmented reality device obtains the coordinate set of the object to be positioned in the first coordinate system according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system.
  • the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance.
  • for each coordinate a in the coordinate set of the object to be positioned in the virtual coordinate system, the augmented reality device may use the transformation matrix between the virtual coordinate system and the second coordinate system to obtain the corresponding coordinate b in the second coordinate system, and then use coordinate b and the transformation matrix between the second coordinate system and the first coordinate system to obtain the corresponding coordinate in the first coordinate system; in this way the coordinate set of the object to be positioned in the first coordinate system is obtained.
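  • the chained mapping virtual → second → first coordinate system described above can be sketched as follows (illustrative NumPy code, not part of the patent; the translation-only matrices are invented placeholders):

```python
import numpy as np

def chain_to_first(points_virtual, T_sv, T_fs):
    """Map points from the virtual frame into the first frame.

    T_sv: 4x4 matrix, virtual -> second coordinate system (obtained in advance).
    T_fs: 4x4 matrix, second -> first coordinate system (from the markers).
    """
    pts = np.asarray(points_virtual, dtype=float)
    ph = np.hstack([pts, np.ones((pts.shape[0], 1))])  # N x 4 homogeneous
    b = (T_sv @ ph.T).T                                # coordinates b in the second frame
    c = (T_fs @ b.T).T                                 # coordinates c in the first frame
    return c[:, :3]

# Toy, made-up matrices: pure translations for clarity
T_sv = np.eye(4); T_sv[:3, 3] = [1.0, 0.0, 0.0]
T_fs = np.eye(4); T_fs[:3, 3] = [0.0, 2.0, 0.0]
print(chain_to_first([[0.0, 0.0, 0.0]], T_sv, T_fs))  # -> [[1. 2. 0.]]
```

Because both steps are linear, the two matrices could equally be pre-multiplied once (T_fs @ T_sv) and applied to the whole coordinate set in a single product.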
  • At least three feature points are selected on the tooth model included in the three-dimensional model of the patient's mouth, and the coordinates of the at least three feature points in the virtual coordinate system are obtained.
  • the probe of the optical navigation instrument is then used to acquire, in the second coordinate system, the actual points on the patient's teeth corresponding to the at least three feature points.
  • from these point correspondences, the transformation matrix between the virtual coordinate system and the second coordinate system can be obtained.
  • the feature points are selected based on actual experience, which is not limited in the embodiment of the present application.
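  • one common way to compute a rigid transformation from at least three non-collinear point pairs, as in the registration step above, is the Kabsch/SVD method; the patent does not prescribe a specific algorithm, so the following NumPy sketch is purely illustrative:

```python
import numpy as np

def rigid_registration(src, dst):
    """Least-squares rigid transform (Kabsch/SVD) mapping src onto dst.

    src, dst: N x 3 arrays of corresponding points (N >= 3, not collinear).
    Returns a 4x4 homogeneous matrix T such that dst ~= R @ src + t.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)     # centroids
    H = (src - cs).T @ (dst - cd)                   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Synthetic example: three non-collinear "feature points" shifted by a known offset
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
dst = src + np.array([5.0, -2.0, 1.0])
T = rigid_registration(src, dst)
```

With real probe data more than three points and a least-squares fit of exactly this form would typically be used, since the measured points are noisy.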
  • after the augmented reality device obtains the coordinate set of the object to be positioned in the first coordinate system, it can obtain the pixel coordinate set of the object to be positioned in the pixel coordinate system according to that coordinate set and the projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope; each coordinate in the coordinate set of the object to be positioned in the first coordinate system corresponds to one pixel coordinate in the pixel coordinate set.
  • the augmented reality device may display the object to be positioned on the two-dimensional image corresponding to the endoscope according to the set of pixel coordinates, thereby mapping the three-dimensional model of the object to be positioned into the two-dimensional field of view of the endoscope; the two-dimensional image corresponding to the endoscope is the image captured by the camera of the endoscope.
  • the pixel coordinate system is a two-dimensional coordinate system corresponding to a two-dimensional image captured by a camera of the endoscope; the projection matrix is obtained in advance.
  • the augmented reality method for dental surgery provided by the embodiment of the application acquires the position and posture of the first visual marker and the position and posture of the second visual marker; obtains from them the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker; obtains the coordinate set of the object to be positioned in the first coordinate system according to its coordinate set in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system; and displays the object to be positioned on the corresponding two-dimensional image, which improves the positioning accuracy of the object to be positioned in dental surgery.
  • obtaining the projection matrix includes:
  • the projection matrix is obtained according to the internal parameter matrix of the camera of the endoscope, the transformation matrix between the third coordinate system and the first coordinate system, and the external parameter matrix of the camera of the endoscope; wherein the transformation matrix between the third coordinate system and the first coordinate system is obtained in advance.
  • the internal parameter matrix of the camera of the endoscope is M 1
  • the external parameter matrix of the camera of the endoscope is M 2
  • the transformation matrix between the third coordinate system and the first coordinate is obtained in advance, and may be configured to convert coordinates in the third coordinate system into coordinates in the first coordinate system ;
  • the internal parameter matrix and external parameter matrix of the camera can be obtained by Zhang Zhengyou calibration method.
  • the internal parameter matrix of the camera is the matrix formed by the camera's internal parameters;
  • the external parameter matrix of the camera is the matrix formed by the camera's external parameters relative to the third coordinate system.
  • the coordinates (u, v) in the pixel coordinate system of the camera and the coordinates (X 1 , Y 1 , Z 1 ) in the first coordinate system have the following relationship: Z c [u, v, 1] T = M [X 1 , Y 1 , Z 1 , 1] T , where M is the 3×4 projection matrix from the first coordinate system to the pixel coordinate system, composed of the internal parameter matrix M 1 , the external parameter matrix M 2 and the transformation matrix between the third coordinate system and the first coordinate system;
  • Z c is the coordinate value corresponding to the pixel coordinate (u, v) in the direction perpendicular to the pixel coordinate system;
  • M 1 is the internal parameter matrix of the camera.
  • the internal parameter matrix and the external parameter matrix of the camera can be obtained according to Zhang Zhengyou's calibration method.
  • FIG. 2 is a schematic diagram of camera calibration provided by an embodiment of the application.
  • the head of the implant handpiece 1 is provided with an endoscope, and the motor end of the implant handpiece 1 is provided with the first visual marker;
  • the first visual marker includes three first visual marker points 2
  • the calibration board 3 is a flat plate, a black and white checkerboard pattern is set on the calibration board 3
  • the third visual marker is set on the calibration board 3;
  • the third visual marker includes three third visual marker points 4, and the coordinate system established based on the third visual marker is the third coordinate system.
  • the calibration board 3 faces the endoscope of the implant handpiece 1 so that the camera of the endoscope can capture the checkerboard pattern; every square of the checkerboard pattern has the same size, and the coordinates of each feature point on the pattern in the third coordinate system can be obtained by accurate measurement. A feature point is a black-and-white corner point on the checkerboard pattern.
  • the coordinates (u, v) in the pixel coordinate system of the camera and the coordinates (X 3 , Y 3 , Z 3 ) in the third coordinate system have the following relationship: Z c [u, v, 1] T = M 1 M 2 [X 3 , Y 3 , Z 3 , 1] T = M 3 [X 3 , Y 3 , Z 3 , 1] T (formula (3))
  • Z c is the coordinate value corresponding to the pixel coordinate (u, v) in the direction perpendicular to the pixel coordinate system
  • M 1 is the internal parameter matrix of the camera
  • M 2 is the external parameter matrix of the camera.
  • M 3 is called a projection matrix from the third coordinate system to the pixel coordinate system
  • m ij is an element of the projection matrix M 3 , where i and j are positive integers, i is less than or equal to 3, and j is less than or equal to 4.
  • the pixel coordinates of more than six feature points can be accurately obtained by image-processing techniques such as the Harris corner detection algorithm or the Shi-Tomasi corner detection algorithm, while their coordinates in the third coordinate system are known from measurement.
  • for each feature point, three linear equations can be obtained from formula (3); after eliminating Z c , two linear equations in the elements m ij remain.
  • solving the linear equations assembled from at least six feature points yields the value of every element of M 3 ; M 3 is then decomposed to obtain M 1 and M 2 , that is, the internal parameter matrix and the external parameter matrix of the camera.
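  • the linear system described above can be solved with the direct linear transform (DLT). The following NumPy sketch is illustrative and not part of the patent: it assembles the two equations per feature point, solves for the null vector by SVD, and recovers the projection matrix up to scale from a synthetic, made-up example; the decomposition of M 3 into M 1 and M 2 is omitted here:

```python
import numpy as np

def estimate_projection_matrix(world_pts, pixel_pts):
    """Direct linear transform: estimate the 3x4 projection matrix from
    at least six 3D-2D correspondences (two linear equations per point)."""
    A = []
    for (X, Y, Z), (u, v) in zip(world_pts, pixel_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    M = Vt[-1].reshape(3, 4)          # null vector of A, defined up to scale
    return M / M[-1, -1]              # normalize the arbitrary scale

# Synthetic check with an invented projection matrix and seven generic points
M_true = np.array([[800.0, 0.0, 320.0, 10.0],
                   [0.0, 800.0, 240.0, 20.0],
                   [0.0, 0.0, 1.0, 2.0]])
world = np.array([[0, 0, 1], [1, 0, 2], [0, 1, 3], [1, 1, 1],
                  [2, 1, 2], [1, 2, 3], [2, 2, 1]], dtype=float)
proj = (M_true @ np.hstack([world, np.ones((7, 1))]).T).T
pixels = proj[:, :2] / proj[:, 2:3]
M_est = estimate_projection_matrix(world, pixels)
```

The points must be in general position (in particular, not coplanar) for the twelve matrix elements to be determined up to a single scale factor.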
  • obtaining a transformation matrix between the third coordinate system and the first coordinate system includes:
  • the third visual marker corresponds to a calibration board, and the calibration board is configured to calibrate the camera;
  • a transformation matrix between the third coordinate system and the first coordinate system is obtained; wherein the coordinate system established based on the third visual marker is the third coordinate system.
  • the position and posture of the third visual marker, as well as the position and posture of the first visual marker, can be obtained through the optical or electromagnetic navigation instrument.
  • the third visual mark corresponds to a calibration board, and the third visual mark can be set on the calibration board.
  • the transformation matrix between the third coordinate system and the first coordinate system can then be obtained.
  • the coordinate system established based on the third visual mark is the third coordinate system.
  • FIG. 3 is a schematic flow chart of an augmented reality method for dental surgery provided by another embodiment of the application.
  • obtaining the coordinate set of the object to be positioned in the first coordinate system includes:
  • the augmented reality device may obtain, according to each coordinate in the coordinate set of the object to be positioned in the virtual coordinate system and the transformation matrix between the virtual coordinate system and the second coordinate system, the corresponding coordinate of each such coordinate in the second coordinate system; these corresponding coordinates constitute the coordinate set of the object to be positioned in the second coordinate system.
  • the augmented reality device may then obtain, according to the coordinate set of the object to be positioned in the second coordinate system and the transformation matrix between the second coordinate system and the first coordinate system, the corresponding coordinate of each coordinate in the first coordinate system; these corresponding coordinates constitute the coordinate set of the object to be positioned in the first coordinate system.
  • for example, if one coordinate in the coordinate set of the object to be positioned in the second coordinate system is (X 2 , Y 2 , Z 2 ) and the transformation matrix between the second coordinate system and the first coordinate system is M 21 , the corresponding coordinate in the first coordinate system is obtained as [X 1 , Y 1 , Z 1 , 1] T = M 21 [X 2 , Y 2 , Z 2 , 1] T .
  • obtaining the pixel coordinate set of the object to be positioned in the pixel coordinate system corresponding to the endoscope of the implant handpiece from the coordinate set in the first coordinate system includes:
  • M is a 3×4 matrix, and three linear equations can be obtained from the formula. Since (X 1 , Y 1 , Z 1 ) and the value of each element in the projection matrix M are known, Z c can be eliminated from the three linear equations to solve for u and v, thus obtaining the pixel coordinates (u, v).
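  • the elimination of Z c reduces, in practice, to a perspective division; this can be sketched as follows (illustrative NumPy code; the projection matrix values are invented for demonstration):

```python
import numpy as np

def project_to_pixels(M, points_first):
    """Project 3D points in the first coordinate system to pixel coordinates
    using a 3x4 projection matrix M; Z_c is eliminated by the division."""
    pts = np.asarray(points_first, dtype=float)
    ph = np.hstack([pts, np.ones((pts.shape[0], 1))])   # N x 4 homogeneous
    proj = (M @ ph.T).T                                  # rows: (Zc*u, Zc*v, Zc)
    return proj[:, :2] / proj[:, 2:3]                    # (u, v) per point

# Hypothetical projection matrix: 800-pixel focal length, principal point (320, 240)
M = np.array([[800.0, 0.0, 320.0, 0.0],
              [0.0, 800.0, 240.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
uv = project_to_pixels(M, [[0.0, 0.0, 2.0]])
print(uv)  # a point on the optical axis lands at the principal point: [[320. 240.]]
```

Applying this to every coordinate in the first-coordinate-system set yields the full pixel coordinate set used for display.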
  • FIG. 4 is a schematic structural diagram of an augmented reality device for dental surgery provided by an embodiment of the application.
  • the augmented reality device for dental surgery provided by an embodiment of the present application includes an acquiring unit 401, a first obtaining unit 402, a second obtaining unit 403 and a display unit 404.
  • the acquiring unit 401 is configured to acquire the position and posture of the first visual marker and the position and posture of the second visual marker; wherein the first visual marker corresponds to the implant handpiece, and the second visual marker corresponds to the oral jaw;
  • the first obtaining unit 402 is configured to obtain, according to the position and posture of the first visual marker and the position and posture of the second visual marker, the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker;
  • the second obtaining unit 403 is configured to obtain the coordinate set of the object to be positioned in the first coordinate system according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system; wherein the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance;
  • the display unit 404 is configured to obtain the pixel coordinate set of the object to be positioned in the pixel coordinate system according to the coordinate set of the object to be positioned in the first coordinate system and the projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope, and to display the object to be positioned on the two-dimensional image corresponding to the endoscope according to the pixel coordinate set; wherein the projection matrix is obtained in advance.
  • an implant handpiece with an endoscope is used.
  • the endoscope is set on the head of the implant handpiece and can extend into the patient's mouth.
  • the camera installed in the endoscope can photograph the oral cavity of the patient and obtain a two-dimensional image of the oral cavity of the patient.
  • the acquiring unit 401 can track the first visual marker and the second visual marker in real time through optical or electromagnetic navigation instruments commonly used in navigated surgery, so as to obtain the position and posture of the first visual marker and the position and posture of the second visual marker.
  • the first visual marker corresponds to the implant handpiece and can be set on it; the first visual marker includes at least three first visual marking points that are not on a straight line, and may be set on the motor end of the implant handpiece;
  • the second visual marker corresponds to the oral jaw and includes at least three second visual marking points that are not on a straight line; the second visual marker may be set on a dental tray or implanted in the alveolar process or jaw bone of the patient.
  • the specific installation positions and installation methods of the first visual mark and the second visual mark are selected based on actual experience, and are not limited in the embodiment of the present application.
  • based on the position and posture of the first visual marker and the position and posture of the second visual marker provided by the optical or electromagnetic navigation instrument, the first obtaining unit 402 obtains the transformation matrix between the second coordinate system and the first coordinate system.
  • the second coordinate system is a three-dimensional coordinate system established based on the second visual marker by an optical or electromagnetic navigation instrument;
  • the first coordinate system is a three-dimensional coordinate system established based on the first visual marker by an optical or electromagnetic navigation instrument.
  • the transformation matrix between the second coordinate system and the first coordinate system is configured to convert coordinates in the second coordinate system into coordinates in the first coordinate system.
  • the three-dimensional model of the patient's mouth can be reconstructed from CT scan data, and the three-dimensional model of the object to be positioned can be established in computer-aided design software; the position, angle and depth of the three-dimensional model of the object to be positioned in the three-dimensional model of the patient's mouth can then be planned in advance. Objects to be positioned include, but are not limited to, implants and temporary crowns.
  • the virtual coordinate system is the three-dimensional coordinate system to which the three-dimensional model of the object to be positioned belongs.
  • the three-dimensional model of the object to be positioned needs to be mapped into the two-dimensional field of view of the endoscope. The coordinate set of the object to be positioned in the virtual coordinate system comprises the coordinates corresponding to the three-dimensional model of the object to be positioned in the virtual coordinate system, for example the coordinates of its outer contour; this coordinate set can be preset according to actual needs.
  • the second obtaining unit 403 obtains the coordinate set of the object to be positioned in the first coordinate system according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system.
  • the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance.
  • the display unit 404 may obtain the pixel coordinate set of the object to be positioned in the pixel coordinate system according to the coordinate set of the object to be positioned in the first coordinate system and the projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope; each coordinate in the coordinate set of the object to be positioned in the first coordinate system corresponds to one pixel coordinate in the pixel coordinate set.
  • the display unit 404 may display the object to be positioned on the two-dimensional image corresponding to the endoscope according to the pixel coordinate set, thereby mapping the three-dimensional model of the object to be positioned into the endoscope's field of view; the two-dimensional image corresponding to the endoscope is the two-dimensional image captured by the camera of the endoscope.
  • the pixel coordinate system is a two-dimensional coordinate system corresponding to a two-dimensional image captured by a camera of the endoscope; the projection matrix is obtained in advance.
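The mapping into pixel coordinates can be sketched as follows; the 3×4 projection matrix here uses an assumed focal length and principal point purely for illustration:

```python
import numpy as np

def project_to_pixels(points_first, projection):
    """Project (N, 3) points in the first coordinate system to (N, 2) pixel
    coordinates using a 3x4 projection matrix (perspective divide included)."""
    homogeneous = np.hstack([points_first, np.ones((points_first.shape[0], 1))])
    uvw = (projection @ homogeneous.T).T          # (N, 3) homogeneous pixels
    return uvw[:, :2] / uvw[:, 2:3]               # divide by w -> (u, v)

# Illustrative projection matrix: focal length 800 px, principal point
# (320, 240), camera frame aligned with the first coordinate system.
P = np.array([[800.0,   0.0, 320.0, 0.0],
              [  0.0, 800.0, 240.0, 0.0],
              [  0.0,   0.0,   1.0, 0.0]])

points_first = np.array([[0.0, 0.0, 10.0],    # point on the optical axis
                         [1.0, 0.0, 10.0]])
pixels = project_to_pixels(points_first, P)
print(pixels)
```

Each 3D coordinate yields exactly one (u, v) pixel coordinate, matching the one-to-one correspondence the device relies on when overlaying the object on the endoscope image.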
  • the augmented reality device for dental surgery provided by the embodiments of the application can acquire the position and posture of the first visual marker and of the second visual marker, and from these obtain the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker. Based on the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system, it obtains the coordinate set of the object to be positioned in the first coordinate system.
  • finally, the object to be positioned is displayed on the corresponding two-dimensional image according to the pixel coordinate set, which improves the positioning accuracy of the object to be positioned in dental surgery.
  • FIG. 5 is a schematic structural diagram of an augmented reality device for dental surgery provided by another embodiment of the application. As shown in FIG. 5, on the basis of the foregoing embodiments, the augmented reality device for dental surgery provided by this embodiment of the present application further includes a third obtaining unit 405, wherein:
  • the third obtaining unit 405 is configured to obtain the projection matrix based on the internal parameter matrix of the camera of the endoscope, the transformation matrix between the third coordinate system and the first coordinate system, and the external parameter matrix of the camera of the endoscope; wherein the transformation matrix between the third coordinate system and the first coordinate system is obtained in advance.
  • if the third obtaining unit 405 obtains the internal parameter matrix of the camera of the endoscope as M1, the external parameter matrix of the camera of the endoscope as M2, and the transformation matrix between the third coordinate system and the first coordinate system as T, then the projection matrix can be obtained as M = M1·M2·T.
  • the transformation matrix between the third coordinate system and the first coordinate system is obtained in advance and may be configured to convert coordinates in the third coordinate system into coordinates in the first coordinate system.
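The composition of the projection matrix can be sketched as below. The direction of T in this sketch is assumed to take points from the first coordinate system into the camera frame, which is what the projection chain needs; all numeric values are illustrative:

```python
import numpy as np

# Illustrative matrices: M1 is the camera's 3x3 intrinsic matrix, M2 its 3x4
# extrinsic matrix, and T a 4x4 transform assumed to take first-coordinate-
# system points into the camera frame (the document's third coordinate system).
M1 = np.array([[800.0,   0.0, 320.0],
               [  0.0, 800.0, 240.0],
               [  0.0,   0.0,   1.0]])
M2 = np.hstack([np.eye(3), np.zeros((3, 1))])   # identity rotation, zero translation
T  = np.eye(4)
T[:3, 3] = [0.0, 0.0, 10.0]                     # scene placed 10 units along z

# Projection matrix M = M1 @ M2 @ T maps homogeneous points in the first
# coordinate system directly to homogeneous pixel coordinates.
M = M1 @ M2 @ T
point_first = np.array([0.0, 0.0, 0.0, 1.0])    # origin of the first system
u, v, w = M @ point_first
print(u / w, v / w)                             # pixel coordinates after divide
```

Precomputing M collapses the intrinsic, extrinsic, and registration transforms into a single 3×4 matrix, so each point needs only one multiplication at display time.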
  • the internal parameter matrix and the external parameter matrix of the camera can be obtained by Zhang Zhengyou's calibration method (Zhang's method).
  • the internal parameter matrix of the camera refers to the matrix formed by the internal parameters of the camera, and the external parameter matrix of the camera refers to the matrix formed by the external parameters of the camera relative to the first coordinate system.
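Calibration quality is commonly checked via reprojection error: project a known 3D calibration point with the recovered intrinsic matrix K and extrinsics [R|t], then compare against the pixel location where it was actually detected. A sketch with illustrative numbers (the detected pixel here is chosen to match exactly, so the error is zero):

```python
import numpy as np

# Illustrative calibration result (e.g. from Zhang's method): intrinsic matrix
# K plus extrinsic rotation R and translation t. Values are made up.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([[0.0], [0.0], [10.0]])

world_point = np.array([[1.0], [0.5], [0.0]])   # a calibration-board corner
detected_pixel = np.array([400.0, 280.0])       # where it was seen in the image

projected = K @ (R @ world_point + t)           # pinhole projection
projected = projected[:2, 0] / projected[2, 0]  # perspective divide -> (u, v)
reprojection_error = np.linalg.norm(projected - detected_pixel)
print(projected, reprojection_error)
```

A small average reprojection error across all calibration corners indicates the recovered internal and external parameter matrices describe the camera well.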
  • FIG. 6 is a schematic diagram of the physical structure of an electronic device provided by an embodiment of the application.
  • the electronic device may include: a processor 610, a communication interface 620, a memory 630, and a communication bus 640, wherein the processor 610, the communication interface 620, and the memory 630 communicate with each other through the communication bus 640.
  • the processor 610 may call the logic instructions in the memory 630 to execute the following method: obtaining the position and posture of the first visual marker and the position and posture of the second visual marker, wherein the first visual marker corresponds to the implant handpiece and the second visual marker corresponds to the oral jawbone; obtaining, according to the position and posture of the first visual marker and the position and posture of the second visual marker, the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker; obtaining the coordinate set of the object to be positioned in the first coordinate system according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system, wherein the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance; and, according to the coordinate set of the object to be positioned in the first coordinate system and the projection matrix from the first coordinate system to the pixel coordinate system corresponding to the endoscope, obtaining the pixel coordinate set of the object to be positioned and displaying the object to be positioned on the corresponding two-dimensional image.
  • the above-mentioned logic instructions in the memory 630 can be implemented in the form of software functional units and, when sold or used as independent products, can be stored in a computer-readable storage medium.
  • the technical solution of this application, in essence, or the part that contributes to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions used to cause a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and other media that can store program code.
  • the computer program product includes a computer program stored on a non-transitory computer-readable storage medium.
  • the computer program includes program instructions.
  • when the program instructions are executed by a computer, the computer can execute the methods provided in the foregoing method embodiments, for example including: acquiring the position and posture of the first visual marker and the position and posture of the second visual marker, wherein the first visual marker corresponds to the implant handpiece and the second visual marker corresponds to the oral jawbone; obtaining, according to the position and posture of the first visual marker and the position and posture of the second visual marker, the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker; and obtaining the coordinate set of the object to be positioned in the first coordinate system according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system, wherein the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance.
  • this embodiment provides a non-transitory computer-readable storage medium storing a computer program that causes the computer to execute the methods provided in the foregoing method embodiments, for example including: obtaining the position and posture of the first visual marker and the position and posture of the second visual marker, wherein the first visual marker corresponds to the implant handpiece and the second visual marker corresponds to the oral jawbone; obtaining, according to the position and posture of the first visual marker and the position and posture of the second visual marker, the transformation matrix between the second coordinate system established based on the second visual marker and the first coordinate system established based on the first visual marker; obtaining the coordinate set of the object to be positioned in the first coordinate system according to the coordinate set of the object to be positioned in the virtual coordinate system, the transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system, wherein the transformation matrix between the virtual coordinate system and the second coordinate system is obtained in advance; and, according to the coordinate set of the object to be positioned in the first coordinate system and the projection matrix from the first coordinate system to the pixel coordinate system, obtaining the pixel coordinate set of the object to be positioned and displaying the object to be positioned on the corresponding two-dimensional image according to the pixel coordinate set.
  • the device embodiments described above are merely illustrative.
  • the units described as separate components may or may not be physically separate, and components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement them without creative work.
  • each implementation manner can be realized by means of software plus a necessary general-purpose hardware platform, and of course can also be realized by hardware.
  • the computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes a number of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in each embodiment or in some parts of the embodiments.

Landscapes

  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Orthopedic Medicine & Surgery (AREA)
  • Dentistry (AREA)
  • Epidemiology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)
  • Dental Tools And Instruments Or Auxiliary Dental Instruments (AREA)

Abstract

The invention relates to an augmented reality method for dental surgery, and to an associated apparatus. The method comprises: acquiring the positions and postures of a first visual marker and of a second visual marker (S101); obtaining, according to the positions and postures of the first and second visual markers, a transformation matrix between a second coordinate system and a first coordinate system (S102); obtaining, according to a coordinate set of an object to be positioned in a virtual coordinate system, a transformation matrix between the virtual coordinate system and the second coordinate system, and the transformation matrix between the second coordinate system and the first coordinate system, a coordinate set of the object to be positioned in the first coordinate system (S103); and obtaining, according to the coordinate set of the object to be positioned in the first coordinate system and a projection matrix from the first coordinate system to a pixel coordinate system, a pixel coordinate set of the object to be positioned, and displaying the object to be positioned in a two-dimensional image according to the pixel coordinate set (S104). The augmented reality method for dental surgery and the associated apparatus improve the positioning accuracy of an object to be positioned.
PCT/CN2019/084455 2019-01-22 2019-04-26 Méthode d'opération dentaire par réalité augmentée et appareil associé WO2020151119A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910060460.6 2019-01-22
CN201910060460.6A CN109700550B (zh) 2019-01-22 2019-01-22 一种用于牙科手术的增强现实方法及装置

Publications (1)

Publication Number Publication Date
WO2020151119A1 true WO2020151119A1 (fr) 2020-07-30

Family

ID=66262538

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/084455 WO2020151119A1 (fr) 2019-01-22 2019-04-26 Méthode d'opération dentaire par réalité augmentée et appareil associé

Country Status (2)

Country Link
CN (1) CN109700550B (fr)
WO (1) WO2020151119A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112972027A (zh) * 2021-03-15 2021-06-18 四川大学 一种利用混合现实技术的正畸微种植体植入定位方法

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110664483A (zh) * 2019-07-09 2020-01-10 苏州迪凯尔医疗科技有限公司 根尖外科手术的导航方法、装置、电子设备和存储介质
CN110459083B (zh) 2019-08-22 2020-08-04 北京众绘虚拟现实技术研究院有限公司 一种视觉-触觉融合的增强现实口腔手术技能训练模拟器
CN111297501B (zh) * 2020-02-17 2021-07-30 北京牡丹电子集团有限责任公司 一种口腔种植手术增强现实导航方法和***
CN111445453B (zh) * 2020-03-25 2023-04-25 森兰信息科技(上海)有限公司 摄像机获取的琴键图像的偏移判断方法、***、介质及装置
CN111162840B (zh) * 2020-04-02 2020-09-29 北京外号信息技术有限公司 用于设置光通信装置周围的虚拟对象的方法和***
CN112037314A (zh) * 2020-08-31 2020-12-04 北京市商汤科技开发有限公司 图像显示方法、装置、显示设备及计算机可读存储介质
CN112168392A (zh) * 2020-10-21 2021-01-05 雅客智慧(北京)科技有限公司 牙科导航手术配准方法及***
CN112885436B (zh) * 2021-02-25 2021-11-30 刘春煦 一种基于增强现实三维成像的牙科手术实时辅助***
CN113440281B (zh) * 2021-07-21 2022-07-26 雅客智慧(北京)科技有限公司 手术路径规划方法、装置及自动植牙***
CN114521962B (zh) * 2022-04-24 2022-12-16 杭州柳叶刀机器人有限公司 手术机器人轨迹跟踪方法、装置、机器人及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106560163A (zh) * 2015-09-30 2017-04-12 合肥美亚光电技术股份有限公司 手术导航***及手术导航***的配准方法
CN108292175A (zh) * 2015-11-25 2018-07-17 特里纳米克斯股份有限公司 用于光学检测至少一个对象的检测器
CN108433834A (zh) * 2018-04-09 2018-08-24 上海术凯机器人有限公司 一种牙科种植钻针配准装置和方法
CN108742876A (zh) * 2018-08-02 2018-11-06 雅客智慧(北京)科技有限公司 一种手术导航装置
CN108784832A (zh) * 2017-04-26 2018-11-13 中国科学院沈阳自动化研究所 一种脊柱微创手术增强现实导航方法
US20180368930A1 (en) * 2017-06-22 2018-12-27 NavLab, Inc. Systems and methods of providing assistance to a surgeon for minimizing errors during a surgical procedure
US20190000570A1 (en) * 2017-06-29 2019-01-03 NavLab, Inc. Guiding a robotic surgical system to perform a surgical procedure

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5476036B2 (ja) * 2009-04-30 2014-04-23 国立大学法人大阪大学 網膜投影型ヘッドマウントディスプレイ装置を用いた手術ナビゲーションシステムおよびシミュレーションイメージの重ね合わせ方法
CN107822720A (zh) * 2017-10-26 2018-03-23 上海杰达齿科制作有限公司 齿科钻头、种植导板及其使用方法
CN208114666U (zh) * 2018-01-16 2018-11-20 浙江工业大学 基于增强现实的人机协作机器人种牙***
CN108399638B (zh) * 2018-02-08 2021-07-20 重庆爱奇艺智能科技有限公司 一种基于标记的增强现实交互方法、装置及电子设备
CN108742898B (zh) * 2018-06-12 2021-06-01 中国人民解放军总医院 基于混合现实的口腔种植导航***
CN109035414A (zh) * 2018-06-20 2018-12-18 深圳大学 增强现实手术图像的生成方法、装置、设备及存储介质
CN109077822B (zh) * 2018-06-22 2020-11-03 雅客智慧(北京)科技有限公司 一种基于视觉测量的牙科种植手机标定***及方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106560163A (zh) * 2015-09-30 2017-04-12 合肥美亚光电技术股份有限公司 手术导航***及手术导航***的配准方法
CN108292175A (zh) * 2015-11-25 2018-07-17 特里纳米克斯股份有限公司 用于光学检测至少一个对象的检测器
CN108784832A (zh) * 2017-04-26 2018-11-13 中国科学院沈阳自动化研究所 一种脊柱微创手术增强现实导航方法
US20180368930A1 (en) * 2017-06-22 2018-12-27 NavLab, Inc. Systems and methods of providing assistance to a surgeon for minimizing errors during a surgical procedure
US20190000570A1 (en) * 2017-06-29 2019-01-03 NavLab, Inc. Guiding a robotic surgical system to perform a surgical procedure
CN108433834A (zh) * 2018-04-09 2018-08-24 上海术凯机器人有限公司 一种牙科种植钻针配准装置和方法
CN108742876A (zh) * 2018-08-02 2018-11-06 雅客智慧(北京)科技有限公司 一种手术导航装置

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112972027A (zh) * 2021-03-15 2021-06-18 四川大学 一种利用混合现实技术的正畸微种植体植入定位方法

Also Published As

Publication number Publication date
CN109700550B (zh) 2020-06-26
CN109700550A (zh) 2019-05-03

Similar Documents

Publication Publication Date Title
WO2020151119A1 (fr) Méthode d'opération dentaire par réalité augmentée et appareil associé
EP3903724B1 (fr) Procédé et dispositif d'étalonnage pour navigation chirurgicale en implantologie dentaire
US10980621B2 (en) Dental design transfer
US11963845B2 (en) Registration method for visual navigation in dental implant surgery and electronic device
EP2564375B1 (fr) Imagerie céphalométrique virtuelle
EP2134290B1 (fr) Création assistée par ordinateur d'un agencement dentaire particularisé à l'aide d'une analyse faciale
WO2021227548A1 (fr) Système de guidage numérique pour ostéotomie mandibulaire
US10111595B2 (en) Method for checking tooth positions
US20130273491A1 (en) System and method for evaluating orthodontic treatment
KR20160004864A (ko) 치과 시술 시뮬레이션을 위한 치아모델 생성 방법
JPWO2006033483A1 (ja) 人体情報抽出装置、人体撮影情報の基準面変換方法および断面情報検出装置
Lin et al. Point-based superimposition of a digital dental model on to a three-dimensional computed tomographic skull: an accuracy study in vitro
CN109598703B (zh) 牙齿图像的处理方法、***、计算机可读存储介质及设备
EP4113440A1 (fr) Examen parodontal non invasif
EP4103103B1 (fr) Suivi de progression à domicile à l'aide d'une caméra de téléphone
US12048605B2 (en) Tracking orthodontic treatment using teeth images
WO2016003256A1 (fr) Méthode permettant de mettre en oeuvre une procédure virtuelle destinée à une procédure orthodontique
CN109589179B (zh) 用于确定牙科器械的空间坐标的混合现实***和方法
Andrews Validation of the Apple i-Phone© combined with the Bellus© three-dimensional photogrammetry application for facial imaging
CN114521911A (zh) 基于头颅侧位的增强现实显示方法、***及存储介质
TWM589538U (zh) 兼具光學導航功能的數位化種植導板及種植系統
CN116919594A (zh) 一种结构光光学种植导航***及导航方法
CN112137744A (zh) 兼具光学导航功能的数字化种植导板及其使用方法
Aoki et al. 3D head model construction of individuals utilizing standard model and photogrammetry

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19911258

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16/11/2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19911258

Country of ref document: EP

Kind code of ref document: A1