CN213030699U - Imaging system - Google Patents

Imaging system

Info

Publication number
CN213030699U
CN213030699U (application CN202021322995.0U)
Authority
CN
China
Prior art keywords
projection
image
imaging
target
assembly
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202021322995.0U
Other languages
Chinese (zh)
Inventor
邵鹏飞 (Shao Pengfei)
吴柄萱 (Wu Bingxuan)
刘鹏 (Liu Peng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Science and Technology of China USTC
Original Assignee
University of Science and Technology of China USTC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Science and Technology of China USTC filed Critical University of Science and Technology of China USTC
Priority to CN202021322995.0U priority Critical patent/CN213030699U/en
Application granted granted Critical
Publication of CN213030699U publication Critical patent/CN213030699U/en

Landscapes

  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Endoscopes (AREA)

Abstract

The utility model provides an imaging system that includes an imaging projection device, an ultrasonic probe, and an upper computer. A position marker is arranged on the surface of the ultrasonic probe, which acquires multi-frame ultrasound images of an observation site on an object. The imaging projection device includes an imaging assembly and a projection assembly whose optical paths are coaxial; the imaging assembly acquires a target image comprising a fluorescence image of the observation site and an image of the position marker. Based on the multi-frame ultrasound images and the target image, the upper computer generates a projection image containing the position distribution information of objects of interest at the observation site and transmits it to the projection assembly, which projects it onto a preset area: the surface region of the object perpendicular to the observation site. By viewing the projection image cast onto the surface of the observation site, the operator can see how the objects of interest are distributed there, achieving accurate positioning of all objects of interest within the observation site.

Description

Imaging system
Technical Field
The application relates to the technical field of images, in particular to an imaging system.
Background
With the development of imaging technology, it has become common to use imaging to detect lesions in a living body or defects in a physical material.
For example, with the development of medical imaging technology, imaging methods are widely used in diagnosis and treatment for locating lesion tissue in the human body. For sentinel lymph nodes, commonly used imaging-based localization methods include ultrasound, fluorescence, and the like. However, these methods cannot comprehensively and accurately locate sentinel lymph nodes in the human body, so more comprehensive and accurate localization of sentinel lymph nodes has become an urgent problem to solve.
Summary of the Utility Model
The applicant's research found that, in the commonly used ultrasound frequency range (e.g., 2 MHz–10 MHz), ultrasound image positioning displays deep structures well (e.g., vascular and deep lymphatic structures more than 1 cm below the body surface) but cannot display vascular information of shallow structures (lymphatic structures less than 1 cm below the body surface). High-frequency ultrasound, which can display shallow information, is rarely used clinically and cannot display deep information. Ultrasound image positioning alone therefore cannot comprehensively locate the distribution of sentinel lymph nodes in the human body.
Fluorescence image positioning displays shallow structures well; however, because the penetration depth of the excitation light (e.g., near-infrared or light-emitting-diode excitation light) in tissue is limited and most of the fluorescence is absorbed by tissue, the imaging depth below the epidermis does not exceed 1 cm. Fluorescence imaging therefore cannot comprehensively locate the distribution of sentinel lymph nodes in the human body either.
The present application therefore combines ultrasound image positioning with fluorescence image positioning to comprehensively locate sentinel lymph nodes at different depths below the epidermis of the human body.
In order to achieve the above object, the present application provides the following technical solutions:
an imaging system, comprising:
the system comprises an imaging projection device, an ultrasonic probe and an upper computer; the imaging projection equipment and the ultrasonic probe are respectively connected with the upper computer; a position calibration object is arranged on the surface of the ultrasonic probe;
the ultrasonic probe is used for acquiring multi-frame ultrasonic images of an observation part of an object;
the imaging projection equipment comprises an imaging component and a projection component; the imaging assembly is used for acquiring a target image comprising a fluorescence image of the observation part and the position calibration object image; the optical path of the imaging component and the optical path of the projection component are coaxial;
the upper computer is used for generating, based on the multi-frame ultrasonic images and the target image, a projection image comprising position distribution information of an object of interest at the observation site, and for transmitting the projection image to the projection assembly, so that the projection assembly projects the projection image onto a preset area, the preset area being the surface region of the object perpendicular to the observation site.
In the above system, optionally, the position marker includes at least three reflective balls or at least one position identification code.
In the above system, optionally, the upper computer includes a data processor.
In the above system, optionally, the imaging assembly at least includes an image collecting component, an optical filter, a first light splitter, and a first lens component; the projection assembly at least comprises a projector, a second light splitting sheet and a second lens assembly.
In the above system, optionally, the first light splitter and the second light splitter are the same target light splitter;
the first lens assembly and the second lens assembly are the same target lens assembly, and the target lens assembly comprises a plurality of lenses;
the target beamsplitter is configured to couple the optical path of the imaging assembly and the optical path of the projection assembly to an optical axis of the target lens assembly.
The system described above, optionally, further comprising a near-infrared laser emitter;
the image acquisition component in the imaging assembly is a near-infrared camera;
the near-infrared laser emitter is used to irradiate the position marker so that the near-infrared camera can capture the position marker image.
In the above system, optionally, the projector further includes a projection mode setting module, which includes a projection mode display and a memory;
the projection mode display is used for displaying the plurality of projection modes and storing the projection mode selected by the user into the memory;
the memory is used for storing the plurality of projection modes in advance, and for marking and storing the projection mode selected by the user as a target projection mode, so that the projector projects the projection image according to the target projection mode.
The above system, optionally, further comprises an adjustable platform, wherein the imaging projection device is mounted on the adjustable platform;
the adjustable platform comprises a translation regulator and an angle regulator, the translation regulator is used for regulating the translation amount of the imaging projection equipment, and the angle regulator is used for regulating the rotating angle of the imaging projection equipment.
The above system, optionally, further comprises a support frame, and the imaging projection device is movably connected to the support frame.
The system optionally further includes a display, connected to the upper computer, and the display is configured to show an image of the distribution of the object of interest at the observation position and a depth distance value of the object of interest from the surface area.
The imaging system provided by the utility model includes an imaging projection device, an ultrasonic probe, and an upper computer. A position marker is arranged on the surface of the ultrasonic probe, which acquires multi-frame ultrasound images of the observation site of an object. The imaging projection device includes an imaging assembly and a projection assembly with coaxial optical paths; the imaging assembly acquires a target image comprising a fluorescence image of the observation site and an image of the position marker. Based on the multi-frame ultrasound images and the target image, the upper computer generates a projection image including the position distribution information of objects of interest at the observation site and transmits it to the projection assembly, which projects it onto a preset area: the surface region of the object perpendicular to the observation site.
Features of objects of interest (such as lesions) more than 1 cm below the body surface can be acquired from the ultrasound images, and features of objects of interest within 1 cm of the surface can be acquired from the fluorescence image, so a projection image generated by the upper computer from both covers all objects of interest below the body surface. Because the projection image contains the position distribution information of the objects of interest at the observation site, and the optical path of the projection assembly is coaxial with that of the imaging assembly, the projection assembly can project the projection image back along the same optical path by which the imaging assembly acquired its images. The operator can therefore accurately read the position distribution of the objects of interest from the projection image cast onto the body surface of the observation site, achieving accurate positioning of all objects of interest within it.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic structural diagram of an imaging system according to an embodiment of the present application;
FIG. 2 is a schematic structural diagram of an imaging and projection apparatus provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of an imaging system according to an embodiment of the present disclosure;
FIG. 4 is a schematic view of a scene for acquiring information of a lesion of a patient by using an imaging system according to an embodiment of the present application;
fig. 5 is a schematic view of a scene projected by an imaging system according to an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a schematic structural diagram of an imaging system provided in an embodiment of the present application, and includes an imaging projection apparatus 11, an ultrasound probe 12, and an upper computer 13. The imaging projection device 11 and the ultrasonic probe 12 are respectively connected with an upper computer 13, and a position calibration object 14 is arranged on the surface of the ultrasonic probe 12.
The ultrasonic probe 12 is used for acquiring multi-frame ultrasonic images of the observation part of the object.
The imaging projection device 11 includes an imaging component and a projection component, the imaging component is used for collecting a target image including a fluorescence image of an observation part and a position calibration object image, and a light path of the imaging component and a light path of the projection component are coaxial.
The upper computer 13 is used for generating, based on the multi-frame ultrasound images and the target image, a projection image including the position distribution information of the object of interest at the observation site, and for transmitting the projection image to the projection assembly so that the projection assembly projects it onto a preset area, the preset area being a surface region of the object perpendicular to the observation site.
In this embodiment, the object may be a living body, or may be another material to be detected, for example, a metal material to be detected. In the case where the object is a living body, the object of interest is a lesion included in the observation site, and in the case where the object is a material to be detected, the object of interest is a defective site included in the observation site, such as a foreign substance. In this embodiment, optionally, the description will be given by taking the object as an organism and the object of interest as a lesion.
The connection between the imaging projection device 11 and the upper computer 13 may be a wireless communication connection or a wired communication connection, and similarly, the connection between the ultrasound probe 12 and the upper computer 13 may be a wireless communication connection or a wired communication connection.
The ultrasound probe 12 may be a hand-held ultrasound probe.
The position marker 14 includes at least three light-reflective balls or at least one position identification code. The position marker 14 may be configured to allow the imaging assembly to locate the position marker 14 by reflecting excitation light, such as near infrared light, and to acquire an image of the position marker 14. Specifically, the position identification code may be a two-dimensional code coated with a reflective material.
In the imaging projection device 11, the optical path of the projection module is coaxial with the optical path of the imaging module, so that the projection module can project a projection image on the original path according to the imaging optical path of the image collected by the imaging module.
The upper computer 13 may be a data processor. To generate a projection image including the position distribution information of the object of interest at the observation site based on the multi-frame ultrasound images and the target image, the data processor is specifically configured to: determine the three-dimensional spatial position information of the ultrasonic probe 12 based on the position marker image included in the target image; segment each frame of ultrasound image to obtain the interest points in that frame; determine the three-dimensional spatial position of the interest points in each frame based on the three-dimensional spatial position information of the ultrasonic probe 12; obtain, from those positions, a three-dimensional ultrasound image representing the position distribution of the object of interest at the observation site; separate the fluorescence image of the observation site from the target image and preprocess it to obtain a two-dimensional fluorescence image representing the position distribution of the object of interest at the observation site; and take the three-dimensional ultrasound image and the two-dimensional fluorescence image as the projection images.
In the case that the position marker 14 consists of small reflective balls, the data processor determines the three-dimensional spatial position information of the ultrasonic probe 12, and the three-dimensional spatial positions of the interest points in each ultrasound frame, as follows: global binarization is performed on the target image to separate the fluorescence image from the reflective-ball image; the two-dimensional coordinates of the centers of the reflective balls are determined; the three-dimensional spatial coordinates and attitude of the ultrasonic probe 12 are obtained by analyzing those center coordinates; and the three-dimensional spatial positions of the interest points in the ultrasound image are computed by combining the spatial transformation matrix between the ultrasonic probe and the ultrasound image.
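The binarization and center-extraction steps above can be sketched in pure Python/NumPy as follows (the threshold value, the 4-connectivity blob labeling, and all function names are illustrative assumptions, not the patent's actual algorithm):

```python
import numpy as np

def binarize(image, threshold):
    """Global binarization: True where the pixel is brighter than the threshold."""
    return image > threshold

def ball_centroids(mask):
    """Return the 2-D centroid of each connected bright blob (4-connectivity BFS),
    in row-major scan order. Each blob stands in for one reflective ball."""
    seen = np.zeros_like(mask, dtype=bool)
    centroids = []
    h, w = mask.shape
    for r in range(h):
        for c in range(w):
            if mask[r, c] and not seen[r, c]:
                stack, pixels = [(r, c)], []
                seen[r, c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                centroids.append(np.mean(pixels, axis=0))
    return np.array(centroids)
```

With at least three such centroids and a known marker geometry, the probe's 3-D coordinates and attitude can then be solved for; the patent does not specify the solver.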
In the case that the position marker 14 is a position identification code, for example a rectangular two-dimensional code, the two-dimensional coordinates of the four vertices of the code can be determined from the two-dimensional coordinates of its center point and its side length; the three-dimensional spatial coordinates and attitude of the ultrasonic probe are obtained by analyzing the center-point and vertex coordinates; and the three-dimensional spatial positions of the interest points in the ultrasound image are computed by combining the spatial transformation matrix between the ultrasonic probe and the ultrasound image.
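Under the simplifying assumption that the code appears axis-aligned and square in the image, the vertex computation reduces to simple arithmetic (function and parameter names are hypothetical):

```python
import numpy as np

def qr_vertices(center, side):
    """Four vertex coordinates of an axis-aligned square code, derived from
    its center point (cx, cy) and side length, in clockwise order from the
    top-left corner."""
    cx, cy = center
    h = side / 2.0
    return np.array([[cx - h, cy - h],
                     [cx + h, cy - h],
                     [cx + h, cy + h],
                     [cx - h, cy + h]])
```

In practice these 2-D vertices, paired with the known physical size of the code, would feed a pose solver (e.g., a PnP-style algorithm) to recover the probe's 3-D coordinates and attitude; the patent does not name a specific solver.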
A specific implementation by which the data processor obtains a three-dimensional ultrasound image representing the position distribution of the object of interest at the observation site from the three-dimensional spatial positions of the interest points in each frame is: according to those positions, take the image formed by the interest points as the object of interest. Because the object of interest is composed of interest points, a three-dimensional ultrasound image of its distribution at the observation site can be obtained from the three-dimensional spatial positions of the interest points at the observation site.
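The per-frame accumulation described above can be sketched roughly as follows (the intensity-threshold segmentation, the probe-to-image calibration matrix, and all names are illustrative assumptions; the real system would use its calibrated transforms and a proper segmentation method):

```python
import numpy as np

def segment_interest_points(us_frame, threshold=0.5):
    """Segment one ultrasound frame; return (row, col) coords of interest points."""
    return np.argwhere(us_frame > threshold)        # shape (K, 2)

def pixels_to_3d(pixels, probe_pose, probe_to_image):
    """Map 2-D ultrasound pixels to 3-D world coordinates.

    probe_pose: 4x4 world pose of the probe (from the position marker).
    probe_to_image: 4x4 calibration transform between probe and image plane.
    """
    # lift (row, col) pixels into the image plane (z = 0, homogeneous coords)
    pts = np.c_[pixels, np.zeros(len(pixels)), np.ones(len(pixels))]   # (K, 4)
    world = (probe_pose @ probe_to_image @ pts.T).T                    # (K, 4)
    return world[:, :3]

def build_3d_ultrasound(frames, poses, probe_to_image):
    """Accumulate the interest points of every frame into one 3-D point cloud,
    which then represents the object of interest at the observation site."""
    cloud = [pixels_to_3d(segment_interest_points(f), p, probe_to_image)
             for f, p in zip(frames, poses)]
    return np.vstack(cloud)
```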
The data processor may refer to the prior art for a specific implementation of preprocessing the fluorescence image to obtain a two-dimensional fluorescence image representing the position distribution of the object of interest at the observation site.
The imaging system provided by this embodiment includes an imaging projection device, an ultrasonic probe, and an upper computer. A position marker is arranged on the surface of the ultrasonic probe, which acquires multi-frame ultrasound images of the observation site of an object. The imaging projection device includes an imaging assembly and a projection assembly with coaxial optical paths; the imaging assembly acquires a target image comprising a fluorescence image of the observation site and an image of the position marker. Based on the multi-frame ultrasound images and the target image, the upper computer generates a projection image including the position distribution information of the object of interest at the observation site and transmits it to the projection assembly, which projects it onto a preset area: the surface region of the object perpendicular to the observation site.
Features of objects of interest (such as lesions) more than 1 cm below the body surface can be acquired from the ultrasound images, and features of objects of interest within 1 cm of the surface can be acquired from the fluorescence image, so the projection image generated by the upper computer from both covers all objects of interest below the body surface. Because the projection image contains the position distribution information of the objects of interest at the observation site, and the optical path of the projection assembly is coaxial with that of the imaging assembly, the projection assembly can project the projection image back along the same optical path by which the imaging assembly acquired its images. Medical staff can therefore accurately read the position distribution of the objects of interest from the projection image cast onto the body surface of the observation site, achieving accurate positioning of all objects of interest within it.
Furthermore, the imaging system uses the imaging assembly of the imaging projection device to simultaneously acquire target images that include both the fluorescence image of the observation site and the position marker image, and the upper computer separates the two to locate the ultrasonic probe. Compared with existing systems that locate the ultrasonic probe with multiple information-acquisition cameras, this localization process is simpler, more reliable, and lower in cost.
Furthermore, because the imaging system acquires three-dimensional data, the imaging projection device can be placed at any position, as long as the fields of view of its imaging assembly and projection assembly cover the observation site and the movement track of the ultrasonic probe, which makes operation convenient.
In the imaging projection device 11 of fig. 1, the imaging assembly includes an image capturing component, an optical filter, a first light splitter, and a first lens assembly; the projection assembly includes a projector, a second light splitter, and a second lens assembly, where each lens assembly comprises a plurality of lenses.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an imaging projection apparatus, in which a first light splitter of an imaging assembly and a second light splitter of a projection assembly are a same target light splitter 23, a first lens assembly of the imaging assembly and a second lens assembly of the projection assembly are a same target lens assembly 24, and the target lens assembly 24 includes a plurality of lenses.
The imaging assembly is composed of an image acquisition unit 21, a filter 22, the target beam splitter 23, and the target lens assembly 24; the projection assembly is composed of the target beam splitter 23, the target lens assembly 24, and a projector 25.
As shown in FIG. 2, the target beamsplitter 23 is used to couple the optical paths of the imaging assembly and the projection assembly to the optical axis of the target lens assembly 24.
Further, the projector 25 further includes a projection mode setting module, which includes a projection mode display, and a memory;
the projection mode display is used for displaying a plurality of projection modes and storing the projection mode selected by the user into the memory; and the memory is used for pre-storing a plurality of projection modes, marking the light projection mode selected by the user as a target light projection mode and storing the target light projection mode, so that the projector projects the projection image according to the target light projection mode.
The projection modes include an alternating bright/dark projection mode, a specified-color projection mode, an ultrasound-image-only mode, a fluorescence-image-only mode, and a mode that projects the ultrasound and fluorescence images simultaneously. For example, the alternating bright/dark mode can be combined with the ultrasound-image-only mode, so that the projector projects the ultrasound image with alternating bright and dark frames.
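One way to model these selectable, combinable modes (the names and structure here are illustrative, not taken from the patent):

```python
from dataclasses import dataclass
from typing import Optional

# hypothetical set of image sources a mode may project
SOURCES = {"ultrasound", "fluorescence", "both"}

@dataclass
class ProjectionMode:
    source: str = "both"         # which image(s) to project
    alternate: bool = False      # alternate bright/dark frames to boost contrast
    color: Optional[str] = None  # optional fixed projection color

    def __post_init__(self):
        if self.source not in SOURCES:
            raise ValueError(f"unknown projection source: {self.source}")

# Example: project only the ultrasound image with bright/dark alternation.
mode = ProjectionMode(source="ultrasound", alternate=True)
```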
The operator can select different projection modes as needed. For example, projecting only the ultrasound image or only the fluorescence image lets the operator distinguish the lesions corresponding to each modality, while the alternating bright/dark mode enhances contrast and helps the operator determine lesion edges.
In current surgical navigation systems, the image display screen is separated from the operator's surgical field; the operator must switch their gaze back and forth between the surgical area and the screen, which is error-prone and reduces efficiency. In this application, the optical path of the imaging assembly in the imaging projection device is coaxial with that of the projection assembly: the imaging assembly continuously collects information from the surgical area and transmits it to the upper computer in real time, and after the upper computer generates the projection image, the projection assembly projects it in situ in real time. With this in-situ projected display, the operator avoids repeatedly switching viewpoints between a display and the surgical area, which greatly improves convenience. Meanwhile, the projector's projection mode setting module allows the projection mode to be set flexibly as required, greatly increasing the speed and precision of the operation.
Fig. 3 is a schematic structural diagram of another imaging system provided in an embodiment of the present application. Compared with fig. 1, it further includes a near-infrared laser emitter 15, and the image capturing component in the imaging assembly is a near-infrared camera.
The near-infrared laser emitter 15 is used to emit near-infrared light to assist in locating the position marker 14. The near-infrared light it emits irradiates the position marker 14 so that the light reflected by the marker is collected by the near-infrared camera. This greatly increases the brightness of the position marker 14, helps the upper computer 13 distinguish the marker image within the target image, and enables real-time tracking of the position marker 14.
When the image acquisition component is a near-infrared camera, the fluorescence image is specifically a near-infrared fluorescence image. By fusing ultrasound contrast with near-infrared fluorescence imaging, the imaging system compensates both for the inability of common-frequency-band ultrasound to display shallow information while displaying deep information, and for the inability of near-infrared fluorescence imaging to display deep information while displaying shallow information, thereby achieving imaging over a wide depth range.
The imaging system shown in fig. 1 further comprises a display (not shown in fig. 1) connected to the upper computer 13, for displaying an image of the distribution of the objects of interest at the observation site and the depth distance values of the objects of interest from the surface area, enabling the operator to obtain quantitative information about the positions of the objects of interest.
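The displayed depth distance value follows directly from the 3-D interest-point positions; a minimal sketch, assuming the body surface is locally approximated by a plane with an outward unit normal (function and parameter names are hypothetical):

```python
import numpy as np

def depth_below_surface(point, surface_point, surface_normal):
    """Signed distance from a 3-D interest point to the body-surface plane
    through `surface_point` with outward normal `surface_normal`
    (positive values mean the point lies below the surface)."""
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return float(np.dot(np.asarray(surface_point, dtype=float)
                        - np.asarray(point, dtype=float), n))
```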
The imaging system shown in fig. 1 further includes an adjustable platform, which is not shown in fig. 1, the imaging projection device 11 is installed on the adjustable platform, the adjustable platform includes a translation adjuster and an angle adjuster, the translation adjuster is used for adjusting the translation amount of the imaging projection device, and the angle adjuster is used for adjusting the rotation angle of the imaging projection device. By operating the translation regulator and the angle regulator, the imaging projection equipment can be changed to acquire information in the field of view at different angles.
The imaging system shown in fig. 1 may further include a support frame (not shown in fig. 1); the imaging projection device is movably connected to the support frame, so that by translating or rotating to different spatial positions it can more conveniently acquire information in the field of view from different angles.
The imaging system shown in fig. 1 or fig. 3 may be applied in medical clinics, and referring to fig. 4 and fig. 5, fig. 4 is a schematic view of a scene provided by an embodiment of the present application for acquiring information of a lesion of a patient by using the imaging system. Fig. 5 is a schematic view of a scene projected by an imaging system according to an embodiment of the present disclosure.
As shown in fig. 4, 1 denotes a deep lesion, 2 denotes a shallow lesion, 3 denotes the body surface, 4 denotes a hand-held ultrasonic probe, 5 denotes a reflective bead on the surface of the hand-held ultrasonic probe, 6 denotes the upper computer, and 7 and 7' denote the imaging projection device at different working angles.
As shown in fig. 4, the patient lies down under the imaging projection device in a certain posture, and the hand-held ultrasonic probe 4 is used to sweep the body surface area of the patient, acquiring ultrasonic images of the deep lesion below the body surface and transmitting them to the upper computer 6. The imaging projection device 7 or 7' collects a fluorescence image of the shallow lesion below the body surface area and transmits it to the upper computer 6, and the upper computer 6 processes the ultrasonic images and the fluorescence image to generate a projection image.
After the upper computer generates the projection image, it transmits the projection image to the imaging projection device, which projects it onto the body surface. Fig. 5 is a schematic view of a scene in which the imaging system projects the lesions. As shown in fig. 5, 1 denotes a deep lesion, 2 denotes a shallow lesion, 3 denotes the body surface, 4 and 4' are lesion images projected onto the body surface by the imaging projection device at different angles, 5 denotes the upper computer, and 6 and 6' denote the imaging projection device at different working angles.
In the imaging projection device 6 or 6', the optical path of the imaging assembly is coaxial with the optical path of the projection assembly, so the projection assembly projects the projection image back along the same optical path by which the imaging assembly acquired the image.
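Because the two optical paths share one axis, the upper computer's fused output can be sent to the projector pixel-for-pixel without an extra registration step. The following is a minimal, hypothetical sketch of such a fusion step; the function and array names are illustrative and not from the patent, and the color assignment (deep lesions in green, shallow in red) is an assumed example of the different-color display described later:

```python
import numpy as np

def compose_projection(us_mask, fluo_mask):
    """Color-code deep (ultrasound) and shallow (fluorescence) lesions.

    us_mask, fluo_mask: 2-D boolean arrays on the projector's pixel grid.
    Returns an RGB uint8 image: deep lesions in green, shallow in red.
    """
    h, w = us_mask.shape
    rgb = np.zeros((h, w, 3), dtype=np.uint8)
    rgb[us_mask, 1] = 255    # deep lesions from ultrasound -> green channel
    rgb[fluo_mask, 0] = 255  # shallow lesions from fluorescence -> red channel
    return rgb

# Toy input: one deep lesion pixel and one shallow lesion pixel.
us = np.zeros((4, 4), dtype=bool); us[1, 1] = True
fl = np.zeros((4, 4), dtype=bool); fl[2, 2] = True
img = compose_projection(us, fl)
```

Since the projection path retraces the imaging path, `img` can be handed directly to the projector and lands in situ on the body surface.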
One important clinical application of the imaging system provided by the present application is lymph node dissection in breast surgery. Before the operation, the patient is given an intravenous injection of a multimodal contrast agent, which enables multimodal imaging with both ultrasound and fluorescence. The surface of the patient's breast is swept with the hand-held ultrasonic probe while the image acquisition component in the imaging projection device acquires data; the upper computer analyzes the ultrasonic and fluorescence data to generate lymphatic distribution information, and the projector in the imaging projection device performs in-situ projection onto the body surface, assisting the medical operator in locating lymphatic vessels and lymph nodes.
The lymph node dissection workflow in breast surgery using the imaging system provided by the embodiments of the present application is as follows:
1. the patient lies down, an imaging projection device of the imaging system is placed above the patient, and the information acquisition visual field of the imaging projection device is ensured to comprise the chest of the patient;
2. injecting a multimodal contrast agent intravenously into a patient;
3. after the preset time, scanning the surface of the breast by using a handheld ultrasonic probe, and acquiring information of deep lesion parts below the surface of the breast;
4. an image acquisition component of the imaging projection equipment automatically acquires information of a superficial lesion part below the surface of the breast;
5. the upper computer automatically completes analysis of data uploaded by the handheld ultrasonic probe and the imaging projection equipment to obtain projection images of the distribution of lymphatic vessels and lymph nodes in the superficial layer and the deep layer, and transmits the projection images to a projector of the imaging projection equipment;
6. the projector projects projection images of the distribution of deep and shallow lymphatic vessels and lymph nodes to the body surface of a patient in situ;
7. the operator completes the lymph node dissection according to the projected lymphatic vessel and lymph node images.
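The data flow of steps 3 through 6 above can be sketched as a small pipeline. This is a hypothetical illustration only; the record structure and function name are placeholders for the upper computer's actual processing:

```python
def build_projection(deep_lesions, shallow_lesions):
    """Combine deep findings (from the ultrasound sweep, step 3) and shallow
    findings (from the fluorescence image, step 4) into one projection plan
    (step 5), tagging each entry with its source layer so the projector
    (step 6) can render the two layers differently."""
    plan = [{"xy": p, "layer": "deep"} for p in deep_lesions]
    plan += [{"xy": p, "layer": "shallow"} for p in shallow_lesions]
    return plan

# Toy lesion coordinates on the body-surface grid (illustrative values).
plan = build_projection(deep_lesions=[(12, 30)], shallow_lesions=[(8, 5), (9, 7)])
```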
Of course, the imaging system provided by the application has wide clinical applicability: it can be used not only in operations requiring sentinel lymph node dissection, such as breast surgery, but also for abdominal surgery navigation, puncture guidance, and the like.
The imaging system provided by the application has the following advantages in medical clinical application:
Compared with traditional display-based ultrasonic or fluorescence surgical navigation systems, the projection mode is more intuitive and convenient for the operator, who no longer needs to repeatedly switch gaze between a display and the patient's surgical site. At the same time, the projection navigation mode breaks through the limitations of traditional surgery and increases the information available to the doctor during the operation, which is of great significance for improving surgical precision and reducing surgical trauma.
Furthermore, the imaging system integrates ultrasonic and fluorescence images. It overcomes the limitation that conventional-frequency ultrasound can display organ and vessel information at depth but cannot simultaneously display shallow information, and the limitation that fluorescence imaging can display shallow vessel information but not deep information, thereby achieving vessel visualization over a wide depth range.
Furthermore, the imaging system requires no prior medical image scanning or modeling of the patient. In use, multimodal ultrasonic and fluorescence imaging is achieved simply by intravenous injection of a multimodal contrast agent, making the imaging process fast and efficient.
Furthermore, ultrasonic data acquisition requires only a single sweep of the ultrasonic probe; once the three-dimensional ultrasonic data and the fluorescence image have been acquired, the projection can be maintained throughout the operation.
Furthermore, through calibration and real-time three-dimensional tracking of the ultrasonic probe, the imaging system can determine a three-dimensional image of the lesion in real time and project it, together with the fluorescence image, onto the body surface in situ.
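The tracking step above amounts to a coordinate transform: once the reflective markers give the probe's pose as a homogeneous matrix, each pixel of a 2-D ultrasound frame can be lifted into the 3-D world frame. The sketch below assumes the ultrasound plane coincides with the probe frame's x-y plane; the scale factor, pose, and function name are illustrative assumptions, not details from the patent:

```python
import numpy as np

def ultrasound_pixel_to_world(u, v, mm_per_px, T_probe_to_world):
    """Map ultrasound image pixel (u, v) to 3-D world coordinates (in mm).

    The ultrasound plane is taken as the probe frame's x-y plane (z = 0);
    T_probe_to_world is the tracked 4x4 homogeneous pose of the probe.
    """
    p_probe = np.array([u * mm_per_px, v * mm_per_px, 0.0, 1.0])
    return (T_probe_to_world @ p_probe)[:3]

# Example pose: probe translated 10 mm along world x, no rotation.
T = np.eye(4)
T[0, 3] = 10.0
pt = ultrasound_pixel_to_world(100, 50, mm_per_px=0.1, T_probe_to_world=T)
# pixel (100, 50) at 0.1 mm/px -> probe frame (10, 5, 0) -> world (20, 5, 0)
```

Re-evaluating this transform every time the tracker reports a new pose is what keeps the projected three-dimensional lesion image registered to the body surface in real time.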
Furthermore, the information projected in situ by the imaging system helps with preoperative planning of the surgical path and can also be used for surgical teaching.
Furthermore, during projection display the imaging system projects the ultrasonic image and the fluorescence image in different colors, and can selectively project only the ultrasonic image or only the fluorescence image, so that the operator can distinguish the positions and depths of different lesion areas, improving surgical speed and precision.
Furthermore, during projection display the imaging system enhances information contrast through an alternating bright-and-dark projection mode, which helps the operator determine the edges of a lesion area.
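One simple way such an alternating mode could work is temporal modulation of the lesion overlay, so its boundary flickers against the static body surface. This is a hypothetical sketch, not the patent's implementation; the period parameter and function name are assumptions:

```python
import numpy as np

def modulated_frames(overlay, n_frames, period=2):
    """Yield the overlay alternately at full and zero intensity,
    switching every period//2 frames."""
    for k in range(n_frames):
        bright = (k % period) < period // 2
        yield overlay * (1.0 if bright else 0.0)

ov = np.ones((2, 2))          # toy lesion overlay
frames = list(modulated_frames(ov, 4))
# frames 0 and 2 are bright; frames 1 and 3 are dark
```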
The embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts among the embodiments are referred to each other.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An imaging system, comprising:
the system comprises an imaging projection device, an ultrasonic probe and an upper computer; the imaging projection equipment and the ultrasonic probe are respectively connected with the upper computer; a position calibration object is arranged on the surface of the ultrasonic probe;
the ultrasonic probe is used for acquiring multi-frame ultrasonic images of an observation part of an object;
the imaging projection equipment comprises an imaging component and a projection component; the imaging assembly is used for acquiring a target image comprising a fluorescence image of the observation part and the position calibration object image; the optical path of the imaging component and the optical path of the projection component are coaxial;
the upper computer is used for generating a projection image comprising position distribution information of an interested object at the observation position based on the multi-frame ultrasonic images and the target image, and transmitting the projection image to the projection assembly, so that the projection assembly projects the projection image in a preset area, and the preset area is the surface area of the object perpendicular to the observation position.
2. The system of claim 1, wherein the position marker comprises at least three light-reflective balls or at least one position identification code.
3. The system of claim 1, wherein the upper computer comprises a data processor.
4. The system of claim 1, wherein the imaging assembly comprises at least an image acquisition component, a filter, a first light splitter, and a first lens assembly; the projection assembly at least comprises a projector, a second light splitting sheet and a second lens assembly.
5. The system of claim 4, wherein the first and second light splitters are the same target light splitter;
the first lens assembly and the second lens assembly are the same target lens assembly, and the target lens assembly comprises a plurality of lenses;
the target beamsplitter is configured to couple the optical path of the imaging assembly and the optical path of the projection assembly to an optical axis of the target lens assembly.
6. The system of claim 4, further comprising a near-infrared laser emitter;
the image acquisition component in the imaging assembly is a near-infrared camera;
the infrared laser transmitter is used for irradiating the position calibration object so as to enable the near-infrared camera to collect the position calibration object image.
7. The system of claim 4, wherein the projector further comprises a projection mode setting module comprising a projection mode display, and a memory;
the projection mode display is used for displaying a plurality of projection modes and storing the projection mode selected by the user into the memory;
the memory is used for pre-storing the plurality of projection modes, and for marking and storing the projection mode selected by the user as a target projection mode, so that the projector projects the projection image according to the target projection mode.
8. The system of claim 1, further comprising an adjustable platform to which the imaging projection device is mounted;
the adjustable platform comprises a translation regulator and an angle regulator, the translation regulator is used for regulating the translation amount of the imaging projection equipment, and the angle regulator is used for regulating the rotating angle of the imaging projection equipment.
9. The system of claim 1, further comprising a support frame, wherein the imaging projection device is movably coupled to the support frame.
10. The system of claim 1, further comprising a display connected to the upper computer, the display being configured to display an image of a distribution of positions of the object of interest at the observation site and a depth distance value of the object of interest from the surface region.
CN202021322995.0U 2020-07-08 2020-07-08 Imaging system Active CN213030699U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202021322995.0U CN213030699U (en) 2020-07-08 2020-07-08 Imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202021322995.0U CN213030699U (en) 2020-07-08 2020-07-08 Imaging system

Publications (1)

Publication Number Publication Date
CN213030699U true CN213030699U (en) 2021-04-23

Family

ID=75524774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202021322995.0U Active CN213030699U (en) 2020-07-08 2020-07-08 Imaging system

Country Status (1)

Country Link
CN (1) CN213030699U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113974832A (en) * 2021-12-01 2022-01-28 辽宁北镜医疗科技有限公司 Near-infrared fluorescence surgery navigation system and method with projection navigation function


Similar Documents

Publication Publication Date Title
US20230389801A1 (en) Methods and systems for tracking and guiding sensors and instruments
US20220047244A1 (en) Three dimensional mapping display system for diagnostic ultrasound
US20180308247A1 (en) Tissue imaging system and method for tissue imaging
US20210186355A1 (en) Model registration system and method
CA2161126C (en) System for locating relative positions of objects
CN110584783B (en) Surgical navigation system
EP1804705B1 (en) Aparatus for navigation and for fusion of ecographic and volumetric images of a patient which uses a combination of active and passive optical markers
US20230103969A1 (en) Systems and methods for correlating regions of interest in multiple imaging modalities
CN109549689A (en) A kind of puncture auxiliary guide device, system and method
JP2004243140A (en) Reference marker embedded in part of human body
WO2022027251A1 (en) Three-dimensional display method and ultrasonic imaging system
CN107105972A (en) Model register system and method
CN108420532B (en) Handheld fluorescent image navigation positioning device
JP2022502126A (en) Breast mapping and abnormal localization
CN115334963A (en) System and method for generating tissue image biomarkers
CN213030699U (en) Imaging system
CN111671466A (en) Imaging system
CN113545735A (en) OCT image display adjustment method and device
CN113974830B (en) Surgical navigation system for ultrasonic guided thyroid tumor thermal ablation
WO2024158804A1 (en) Devices and methods for freehand multimodality imaging
WO2024097249A1 (en) 3d spatial mapping in a 3d coordinate system of an ar headset using 2d images
CN113974830A (en) Surgical navigation system for ultrasonically guiding thyroid tumor thermal ablation
CN117398065A (en) Blood vessel body surface real-time naked eye visualization method and device based on photoacoustic imaging
Watson Development of an interactive image-guided neurosurgical system

Legal Events

Date Code Title Description
GR01 Patent grant