EP3599982A1 - A 3d reconstruction system - Google Patents
A 3d reconstruction system
Info
- Publication number
- EP3599982A1 (application EP18771352.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- light
- structured light
- features
- reconstruction system
- computer system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00009—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
- A61B1/000094—Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope extracting biological structures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10068—Endoscopic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20076—Probabilistic image processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30028—Colon; Small intestine
Definitions
- the invention relates to a 3D reconstruction system for determining a 3D profile of an object.
- the 3D reconstruction system is in particular suitable for use in surgery, such as minimally invasive surgery.
- Minimally invasive surgery (MIS), such as laparoscopy, has been used increasingly in recent years due to its benefits compared to conventional open surgery: it reduces the trauma to the patient's skin and optionally further tissue, leaves smaller scars, minimizes post-surgical pain and enables faster recovery of the patient.
- MIS, such as laparoscopy, endoscopy, arthroscopy and thoracoscopy, is a method of performing both diagnostic and surgical procedures.
- a body cavity such as the abdominal or pelvic cavity
- An endoscope such as a laparoscope may be inserted through an incision and be conventionally connected to a monitor, thereby enabling the surgeon to see the inside of the body cavity, such as an abdominal or pelvic cavity.
- a surgical instrument is inserted through the same or usually another incision.
- the body cavity sometimes called "surgery cavity”
- the body cavity is inflated with a fluid, preferably gas e.g.
- US 2013/0296712 describes an apparatus for determining endoscopic dimensional measurements, including a light source for projecting light patterns on a surgical sight including shapes with actual dimensional measurements and fiducials, and means for analysing the projecting light patterns on the surgical site by comparing the actual dimensional
- WO 2013/163391 describes a system for generating an image, which the surgeon may use for measuring the size of or distance between structures in the surgical field by using an invisible light for marking a pattern to the surgical field.
- the system comprises a first camera; a second camera; a light source producing light at a frequency invisible to the human eye; a dispersion unit projecting a predetermined pattern of light from the invisible light source; an instrument projecting the predetermined pattern of invisible light onto a target area; a band pass filter directing visible light to the first camera and the predetermined pattern of invisible light to the second camera; wherein the second camera images the target area and the predetermined pattern of invisible light, and computes a three-dimensional image.
- US2008071140 discloses an endoscopic surgical navigation system which comprises a tracking subsystem to capture data representing positions and orientations of a flexible endoscope during an endoscopic procedure, to allow co-registration of live endoscopic video with intra-operative and/or pre- operative scan images. Positions and orientations of the endoscope are detected using one or more sensors and/or other signal-producing elements disposed on the endoscope.
- US6503195 describes a real-time structured light depth extraction system which includes a projector for projecting structured light patterns, comprising a positive pattern and an inverse pattern, onto an object of interest.
- a camera samples light reflected from the object synchronously with the projection of structured light patterns and outputs digital signals indicative of the reflected light.
- An image processor/controller receives the digital signals from the camera and processes the digital signals to extract depth information of the object in real time.
- a system for generating augmented reality vision of surgical cavities for viewing internal structures of the organs of a patient to determine the minimal distance to a cavity surface or organ of a patient is described in "Augmented reality in laparoscopic surgical oncology" by Stephane Nicolau et al., Surgical Oncology 20 (2011) 189-201, and in "An effective visualization technique for depth perception in augmented reality-based surgical navigation" by Choi Hyunseok et al., The International Journal of Medical Robotics and Computer Assisted Surgery, 2015 May 5, doi: 10.1002/rcs.1657.
- the surgical instrument comprises a laser pointing instrument to project laser spots.
- the distance between instrument and organ may be estimated by using images of optical markers mounted on the tip of the instrument and images of the laser spots projected by the same instrument.
- These systems generally comprise a projector and a camera which are spatially interconnected.
- An object of the present invention is to provide a 3D reconstruction system for performing a determination of a tissue field in 3D space with high accuracy.
- the 3D reconstruction system of the invention is especially suitable for performing a determination of a tissue field, such as a field for performing diagnostics and/or a surgery field.
- tissue field is herein used to designate any surface areas of a mammal body, such as natural surface areas including external organs, e.g. skin areas and/or internal surface areas of natural openings and surface areas exposed by surgery and/or surfaces of a minimally invasive surgery cavity.
- the tissue field is advantageously an in vivo tissue field.
- the tissue field may include areas of organs, such as internal organs that have been exposed by surgery, e.g. surfaces of a heart, a spleen or a gland.
- the phrase "determination of a tissue field in 3D space" is herein used to designate a determination of a property of the tissue field or a part thereof, and/or a determination of the tissue field or a part thereof relative to a selected unit such as a surgical tool.
- the property may be a tissue type determination and/or a size determination, such as a topologic size
- the 3D reconstruction system of the invention is preferably suitable for performing a determination of a tissue field in 3D space, more preferably for performing real time determination in 3D space.
- 3D determinations may be performed with a very high accuracy.
- the 3D reconstruction system comprises
- the frames are digital frames and each frame comprises a set of pixel data associated with a time attribute, such as an actual time or a relative time, e.g. a time from start of a procedure or from a start time set by an operator.
- the structured light beam has a centre axis which may advantageously be determined as the centre axis of the structured light beam.
- the structured light beam comprises a cross-sectional light structure, which means the light beam as seen in a cross-sectional view, e.g. as projected perpendicularly onto a plane surface.
- the cross-sectional light structure comprises a plurality of light features which are recognizable by the computer system from the set of pixel data.
- the cross-sectional light structure may comprise an indefinite number of light features, such as an indefinite number of fractions of the cross-sectional light structure which may be recognized by the computer system.
- the light features may be optically recognizable by comprising an optically recognizable attribute, such as a geometrical attribute (e.g. a local shape), an intensity attribute and/or a wavelength attribute.
- the optically recognizable attributes are recognizable from the pixel data.
- the set of pixel data comprises at least one value for each pixel.
- the value may be 0 for a pixel that does not detect any light.
- the values of the respective pixels may for example represent one or more wavelengths, the intensity of one or more wavelengths, total intensity, etc.
- Values of a group of pixels may represent a geometrical attribute e.g. a line and/or a pattern of pixels with corresponding values.
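- As an illustration of the frame and pixel data model described above, the following sketch shows one possible in-memory representation; the names (Frame, time_attribute, pixel_data) are illustrative assumptions and not terms defined by the invention.

```python
# A minimal sketch of a frame carrying a set of pixel data and a time attribute.
from dataclasses import dataclass
import numpy as np

@dataclass
class Frame:
    time_attribute: float      # e.g. seconds from the start of the procedure
    pixel_data: np.ndarray     # H x W intensities, or H x W x C wavelength channels

frame = Frame(time_attribute=12.34,
              pixel_data=np.zeros((480, 640), dtype=np.uint16))

# A pixel that detects no light keeps the value 0; a group of bright pixels
# (values above a threshold) may together represent a geometrical attribute,
# e.g. a projected line of the cross-sectional light structure.
bright_pixels = frame.pixel_data > 200
```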
- the computer system may be a single computer or a group of computers which are in data communication with each other e.g. by wire or wireless.
- the computer system comprises a processor, such as a multi-core processor.
- the computer system forms part of a robot, such as a robot controller processing system configured for operating and controlling movement of the robot.
- the 3D reconstruction system may be operated with a relatively low processing power (CPU) while at the same time be operating with a high accuracy in real time.
- the computer system is configured for storing data representing the projected structured light beam.
- the data set representing the projected light beam may be transmitted or determined by the computer system.
- the computer system comprises a memory configured for storing the projected structured light beam in the form of the reference structured light data set.
- the memory optionally stores the reference structured light data set or, as will be elaborated, a plurality of reference data sets each associated with properties of a structured light beam including data representing recognizable light features of the light beam.
- projected structured light beam means the structured light beam as projected from the projector device.
- the projected structured light beam has the orientation and position (pose) corresponding to the beam as projected.
- the pose of the projected structured light beam can therefore be estimated to be the same as the pose of the projector device.
- the projected structured light beam includes a group of electromagnetic waves projected from the projector and propagating along parallel or diverging directions, wherein the light is textured as seen in a cross-sectional view orthogonal to a center axis (herein also referred to as the optical axis) of the group of electromagnetic waves, i.e. the light has areas of higher intensity and areas of lower or no intensity, which is not the natural Gaussian intensity distribution of a light beam.
- the terms "light pattern” and "light texture” are used interchangeably.
- the data representing the projected light beam is referred to as a "set of reference structured light data" or a "reference structured light data set".
- the projected structured light beam may be stored in the form of a reference structured light data set.
- the reference structured light data set comprises at least a set of the light features of the projected structured light beam.
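- One possible, purely illustrative layout of such a reference structured light data set is sketched below; the field names are assumptions, and an actual implementation might store additional beam properties.

```python
# Hedged sketch of a reference structured light data set holding the
# recognizable light features of the projected beam and selected beam properties.
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class ReferenceStructuredLightDataSet:
    # 2D positions of the recognizable light features in the cross-sectional
    # light structure (projector coordinates), one row per feature.
    feature_positions: np.ndarray
    # Optional per-feature descriptors (e.g. intensity, wavelength, local shape).
    feature_descriptors: Optional[np.ndarray] = None
    # Optional beam property, e.g. the angle of divergence in degrees.
    divergence_deg: Optional[float] = None

reference = ReferenceStructuredLightDataSet(
    feature_positions=np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]),
    divergence_deg=30.0,
)
```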
- the computer system is configured for
- frame means a frame comprising reflections of the structured light beam from the tissue field, i.e. a frame acquired while the projector is projecting the structured light beam.
- the computer system is configured for recognizing a plurality of the set of light features including a plurality of primary light features from two or more received sets of pixel data having corresponding time attribute, such as sets of pixel data of frames of a multi camera image acquisition device.
- Matching of features of stereo pairs of images is known in the art, but heretofore it has never been considered to perform feature matching between a projected light beam and an image to estimate the spatial position of the projector device.
- thereby a very effective method and system for 3D reconstruction of acquired image(s) is provided, which may perform a real time 3D reconstruction with a high accuracy using relatively simple algorithms, and which algorithms further may be processed using a relatively low processing power (CPU).
- the matching of recognized primary light features may be performed according to principles known from the art of feature matching of stereo images for example by applying homographical iterative closest match algorithms and/or as described in the article "Wide Baseline Stereo Matching" by Philip Pritchett and Andrew Zisserman, Robotics Research Group,
- the matching of the recognized primary light features may preferably comprise matching the pixel data representing the primary light features with pixel data of the reference structured light data set.
- the computer system may estimate the spatial position of the projector device relative to at least a part of the tissue field determined e.g. from the position of the image acquisition device.
- the spatial position of the projector device relative to at least a part of the tissue field may be determined from the position of the image acquisition device at the time of acquiring the processed image.
- the computer system is configured for matching the recognized primary light features with corresponding light features of the projected structured light beam and based on the matches estimating the spatial position of the projector device relative to at least a part of the tissue field as the spatial position determined from the position of the image acquisition device.
- the projector device need not be within the field of view of the image acquisition device since the computer system based on the light feature matches may determine the spatial position of the projector device relative to at least a part of the tissue field as it would have been imaged if it had been within the field of view of the image acquisition device.
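- As a hedged illustration of this matching and pose estimation principle, the sketch below matches recognized light features against the reference features of the projected beam and estimates the projector pose with a standard perspective-n-point solve from OpenCV. It assumes the reference features have known 3D coordinates in the projector/beam frame and that the camera intrinsics are known; it is a generic computer vision sketch, not the specific algorithm of the invention.

```python
import cv2
import numpy as np

def estimate_projector_pose(ref_points_3d, matched_image_points, camera_matrix):
    """Estimate the pose of the projected structured light beam (and hence the
    projector device) relative to the camera from matched light features.

    ref_points_3d        : N x 3 feature coordinates in the projector/beam frame
    matched_image_points : N x 2 pixel coordinates of the same features in the frame
    camera_matrix        : 3 x 3 camera intrinsics
    """
    ok, rvec, tvec = cv2.solvePnP(
        ref_points_3d.astype(np.float32),
        matched_image_points.astype(np.float32),
        camera_matrix,
        None,                          # no lens distortion assumed
    )
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # orientation of the projector
    return rotation, tvec              # position in camera coordinates
```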
- the image acquisition device and/or the projector device may be
- the field of view of the image acquisition device may be relatively narrow and preferably focused predominantly onto the tissue field.
- the acquired images of the tissue field may be of a very high quality and reveal many details which may not have been revealed using an image acquisition device with a larger field of view and/or depth of focus.
- the computer system may be configured for receiving a reference structured light data set via a calibration step.
- the system as such may not require calibration once the computer has the reference structured light data.
- the computer system may perform the one or more determinations of the tissue field based on the spatial position of the projector device and the recognized light features e.g. using trigonometrical algorithms for example as described in US
- the computer system may perform the one or more determinations of the tissue field based on the spatial position of the projector device estimated as described herein and the recognized light features using the reconstruction models and algorithms described in WO2015/151098.
- the computer system may now calculate in a simple way the one or more determinations of the tissue field even where the projector and/or the image acquisition device is moved independently of each other.
- the phrase "estimate the spatial position of the projector device" includes a determination e.g. a calculation of the spatial position of the projector device which may be further refined e.g. as explained below.
- body cavity is herein used to denote any gas and/or liquid filled cavity within a mammal body.
- the cavity may be a natural cavity or it may be an artificial cavity which has been filled with a fluid (in particular gas) to reach a desired size.
- the cavity may be a natural cavity which has been enlarged by being filled with a fluid.
- the body cavity is a minimally invasive surgical cavity.
- distal and proximal should be interpreted in relation to the orientation of tools used in connection with diagnostics and/or surgery, such as minimally invasive surgery.
- real time is herein used to mean the time required by the computer to receive and process optionally changing data, such as intraoperative data, optionally in combination with other data, such as predetermined data, a reference data set or estimated data, which may be non-real-time data such as constant data or data changing with a frequency of above 1 minute, in order to return the real time information to the operator.
- estimated data may include a short delay, such as up to 5 seconds, preferably within 1 second, more preferably within 0.1 second of an occurrence.
- the term "operator” is used to designate a human operator (human surgeon) or a robotic operator i.e. a robot programmed to perform a minimally invasive diagnostics or surgical procedure on a patient.
- the term “operator” also includes a combined human and robotic operator, such as a robotic assisted human surgeon.
- access port means a port into a body cavity provided by a cannula inserted into an incision through the mammal skin and through which cannula an instrument may be inserted.
- penetration hole means a hole through the mammal skin without any cannula.
- the term “rigid connection” means a connection which ensures that the relative position between rigidly connected elements is substantially constant during normal use.
- cannula means herein a hollow tool adapted for being inserted into an incision to provide an access port as defined above.
- projector means “projector device” unless otherwise specified.
- a camera baseline means the distance between cameras or camera units. The distance is - unless otherwise specified - determined as the distance between the lens center points (optical axes), corresponding to the distance between the centers of the images acquired by the two cameras or camera units.
- a projector-camera baseline means the distance between the camera/camera unit and the projector. The distance is - unless otherwise specified - determined as the distance between the camera lens center of the camera/camera unit and the center of the projector. Often the surface of the tissue field may be very curved.
- target area or "target site” of the tissue field e.g. of the minimally invasive surgical cavity is herein used to designate an area which the surgeon may have focus on, e.g. for diagnostic purpose and/or for surgical purpose.
- tissue site may be any site of the tissue field e.g. a target site.
- the tissue field may e.g. comprise a surgical field of an open surgery or a minimally invasive surgery.
- the tissue field comprises surfaces of the intestine and the throat.
- skin is herein used to designate the skin of a mammal.
- the skin may include additional tissue which is or is to be penetrated by a penetrator tip or through which an incision for an access port is made or may be made.
- minimally surgical instrument means herein a surgical instrument which is suitable for use in surgery performed in natural and/or artificial body openings of a human or animal, such as for use in minimally invasive surgery.
- corresponding time attributes is used to mean attributes that represent a substantially identical time.
- the set of pixel data may advantageously be subjected to error detection and correction, e.g. to detect and correct and/or discard corrupted data, such as data that have been corrupted during transmission, data that include erroneous reflections e.g. due to moisture at the tissue field, or data that are missing due to occlusions or absorption.
- the error detection and correction may e.g. be provided by adding some redundancy to the set of pixel data prior to transmission to the computer system, e.g. by adding extra data which the computer system may use to check the consistency of the set of pixel data and to recover data that have been determined to be corrupted.
- redundancy is incorporated into the structured light pattern by designing the cross-sectional light structure of the projected structured light beam to provide that the set of pixel data comprise redundant data.
- Error detection and correction schemes are well known in the art and the skilled person will be capable of adapting such schemes for use in the present invention.
- the error detection and correction of the set(s) of pixel data may comprise subtracting values representing background frame(s). This will be described in further detail below.
- the reflected light is subjected to an optical filtering, which may further be useful in obtaining high quality frames. This is also described in further detail below.
- the structured light from the projector device is subjected to an optical filtering, which may further be useful in obtaining high quality frames. This is also described in further detail below.
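- A simple way to realize the background-frame subtraction mentioned above is sketched here: a frame acquired with the structured light switched off is subtracted from a frame acquired while the beam is projected, so that ambient illumination largely cancels and mainly the reflected light structure remains. The function and threshold are assumptions for illustration only.

```python
import numpy as np

def subtract_background(frame_with_light, background_frame, noise_floor=5):
    """Subtract a background frame (structured light off) from a frame acquired
    while the structured light beam is projected onto the tissue field."""
    diff = frame_with_light.astype(np.int32) - background_frame.astype(np.int32)
    diff[diff < noise_floor] = 0       # discard values dominated by noise
    return diff.astype(np.uint16)
```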
- the front of the projector device from where the structured light beam is projected and the front of the image acquisition device collecting the light for imaging the surface of the tissue field onto where the structured light is impinging and from where the image is acquired may be arranged in a triangular configuration.
- a very accurate determination of the spatial position of the projector device may be obtained using for example algorithm based on geometrical math.
- the computer system comprises the reference structured light data set and comprises an algorithm that, from the matched primary light features, their orientation and optionally the distortion of primary recognized features, may determine the triangular configuration between the projector device, the image acquisition device and the tissue field, and based on this perform 3D determinations of the tissue field, such as 3D distances and/or topographical configurations of the tissue field, where desired using trigonometrical, kinematic calculations for determining the spatial position and orientation of the projector device relative to at least a part of the tissue field.
- the triangular configuration between the projector device, the image acquisition device and the tissue field may be determined without the projector device being within the field of view of the image acquisition device.
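- The triangular configuration can be exploited with elementary geometry: once the projector pose has been estimated, each matched light feature defines one ray from the projector and one ray from the image acquisition device, and the corresponding tissue point lies where the two rays (nearly) intersect. The midpoint construction below is a generic sketch of such a trigonometrical determination, not the invention's own algorithm.

```python
import numpy as np

def triangulate_point(cam_origin, cam_dir, proj_origin, proj_dir):
    """Return the 3D point closest to both the camera ray and the projector
    ray (midpoint of their common perpendicular)."""
    d1 = cam_dir / np.linalg.norm(cam_dir)
    d2 = proj_dir / np.linalg.norm(proj_dir)
    w0 = cam_origin - proj_origin
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b              # ~0 only if the rays are parallel
    s = (b * e - c * d) / denom        # parameter along the camera ray
    t = (a * e - b * d) / denom        # parameter along the projector ray
    p_cam = cam_origin + s * d1
    p_proj = proj_origin + t * d2
    return (p_cam + p_proj) / 2.0
```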
- the determination of the spatial position and orientation of the projector device may be performed at a stationary or variable frequency, e.g. the frequency may be increased where the movement of the projector and/or the image acquisition device increases.
- the 3D reconstruction system may operate with high accuracy even where the distance between the projector and the image acquisition device is relatively high, such as up to about 45 cm, such as up to about 30 cm, such as up to about 15 cm, such as up to about 10 cm, such as up to about 5 cm, such as up to about 3 cm, such as up to about 2 cm.
- the estimated spatial position comprises an estimated distance in 3D space between the tissue field and the projector device.
- the distance in 3D space between the tissue field and the projector device may be preprogrammed or operator selectable and for example be a shortest distance, a distance in a selected vector direction, a distance from a center of the front of the projector device from where the structured light beam is projected, a distance to a specific point of the tissue field, e.g. to a protruding point of the tissue field and/or a target site of the tissue field, e.g. a nerve.
- the distance in 3D space between the tissue field and the projector device is the shortest Euclidean distance between the tissue field and the point of the projector device corresponding to the center axis of the projected structured light beam, preferably the shortest Euclidean distance together with a coordinate vector direction of the Euclidean distance.
- the distance in 3D space between the tissue field and the projector device is given by the x, y, and z coordinates in a 3D coordinate system.
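- If a reconstructed point cloud of the tissue field is available, the shortest Euclidean distance and its coordinate vector direction could be obtained as in the following sketch (illustrative only).

```python
import numpy as np

def shortest_distance_to_tissue(projector_point, tissue_points):
    """Shortest Euclidean distance from the projector point on the beam's
    centre axis to a tissue point cloud, plus the unit direction vector."""
    vectors = tissue_points - projector_point        # N x 3
    distances = np.linalg.norm(vectors, axis=1)
    i = int(np.argmin(distances))
    return distances[i], vectors[i] / distances[i]
```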
- the estimated spatial position comprises an estimated distance in 3D space between the tissue field and the projector device from a point of view of the image acquisition device, such as a minimum distance between the tissue field and the projector device.
- the estimated spatial position comprises the estimated distance in 3D space between the tissue field and the projector device as well as an estimated relative orientation of the projector.
- the computer system is configured for generating a representation of the spatial position of the projector device as seen from a point of view which is different from the image acquisition device.
- the computer system may comprise algorithm(s) for performing the required geometrical determinations.
- the algorithm for performing the calculation of the representation of the spatial position of the projector device as seen from a point of view which is different from the image acquisition device may for example use optional geometrical distortions of recognized light features as a part of the basis for the determination, e.g. for determining the angle between the projector device and the image acquisition device.
- the computer system may in an embodiment know the spatial or a relative spatial position (preferably including distance and angle) of the image acquisition device which may additionally be applied for improving the accuracy of the 3D determination.
- the system comprises a sensor arrangement arranged for determining the spatial or a relative spatial position (preferably including distance and angle) of the image acquisition device e.g. for determining the distance between the projector and the image acquisition device.
- the sensor arrangement is preferably configured for determining the distance and the relative orientation between the projector and the image acquisition device.
- the sensor arrangement may in principle be any kind of sensor arrangement capable of providing the distance and optionally the orientation determination(s), such as for example a sensor arrangement comprising a transmitter and a receiver located at or associated with, respectively, the projector and the image acquisition device.
- the term "associated with" means in this connection that there is a known and/or rigid interconnection with the projector or the image acquisition device with which the sensor is associated.
- the sensor arrangement may e.g. comprise a first sensor on or associated with a first robot arm configured for being connected to the projector e.g. via an instrument and a second sensor on or associated with a second robot arm configured for being connected to the image acquisition device e.g. via an instrument.
- the computer system comprises or is supplied with data representing the divergence of the projected structured light beam.
- This data representing the divergence may advantageously form part of the reference structured light data set.
- the estimated spatial position comprises an estimated orientation, e.g. comprising a vector coordinate set and/or comprising at least one orientation parameter selected from yaw, roll and pitch or any combination thereof.
- the estimated spatial position comprises two or more, such as all of the orientation parameters yaw, roll and pitch.
- the orientation parameters yaw, roll and pitch and their relation are generally known within the art of airborne LIDAR technology.
- the estimated spatial position comprises an estimated distance and at least one orientation parameter, such as an orientation parameter selected from yaw, roll and pitch, preferably the estimated spatial position comprises two or more, such as all of the orientation parameters yaw, roll and pitch.
- the estimated spatial position comprises an estimated shortest or longest distance between a selected point of the projector and the tissue field.
- the estimated spatial position comprises an estimated distance described by 3 values in 3D space (e.g. x, y, and z values in a 3D coordinate system) and an estimated orientation e.g. described by 3 values in 3D space (e.g. x, y, and z values in a 3D coordinate system).
- the values in 3D space representing distance and orientations are preferably values in a common coordinate system.
- the estimated spatial position and orientation is described by two end points in a coordinate system e.g. end points defined by two sets of x, y, and z values in a 3D coordinate system.
- the estimated spatial position comprises an estimated distance represented by 2 sets of values in a common 3D coordinate system, each set of values comprising an x, a y and a z value.
- the estimation of the spatial position comprises estimating the pose (position and orientation) of the structured light as projected from the projector relative to the orientation and position of the reflected light from the tissue field.
- the estimation of the spatial position of the projector device comprises estimating the pose of the projected structured light beam. In an embodiment the estimation of the spatial position of the projector device is determined to be the estimation of the pose of the projected structured light beam, preferably determined as projected i.e. at the position of the projector.
- the estimated orientation and optionally the estimated spatial position of the projector is determined using quaternion based geometrical algorithms.
- Mathematical methods based on or including quaternions are well known. The quaternion model was first described by the Irish mathematician William Rowan Hamilton in 1843. The quaternion model provides a convenient mathematical model for representing orientations and rotations of objects in three dimensions. Further information about quaternions may be found in Altmann, S.L., 2005, Rotations, Quaternions, and Double Groups, Courier Corporation, and/or in D. Scharstein and R. Szeliski, A taxonomy and evaluation of dense two-frame stereo correspondence algorithms, International Journal of Computer Vision, 47(1/2/3):7-42, April-June 2002.
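- As a small illustration of a quaternion based representation, the sketch below converts an estimated rotation matrix of the projector to a quaternion and back to yaw, pitch and roll angles using SciPy; the choice of Euler convention ("zyx") is an assumption made for the example.

```python
import numpy as np
from scipy.spatial.transform import Rotation

rotation_matrix = np.eye(3)                                   # placeholder estimated orientation
quaternion = Rotation.from_matrix(rotation_matrix).as_quat()  # (x, y, z, w)

# Convert back to yaw, pitch and roll (degrees) for reporting or for fusion
# with other orientation sensors.
yaw, pitch, roll = Rotation.from_quat(quaternion).as_euler("zyx", degrees=True)
```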
- the set of pixel data may be subjected to outlier removal i.e. removing data values that lie in the tail of the statistical distribution of a set of data values and therefore are estimated to likely be incorrect.
- the computer system is configured for estimating one or more of the orientation parameters yaw, roll and pitch at least partly based on the matches of light features.
- information relating to the orientation may be transmitted to the computer system from another source e.g. another sensor.
- one or more of the orientation parameters yaw, roll and pitch is/are transmitted to the computer system and/or determined by the computer system independently from the light features.
- one or more of the orientation parameters yaw, roll and pitch is/are at least partly obtained from a sensor located at or associated with the structured light arrangement and/or the image acquisition device to sense at least one of the orientation parameters yaw, roll and pitch.
- the sensor located at or associated with the structured light arrangement and/or the image acquisition device may advantageously be configured to determine the relative orientation between the structured light arrangement and the image acquisition device.
- the computer system is configured for estimating the homographic transformation between the matched features and based on the homographic transformation determining the spatial position, such as the pose of the projector device.
- the matching of one single recognized light feature with the corresponding light feature of the projected structured light beam may suffice where the tissue field is relatively plane, the light feature comprises both orientation and position attributes and/or where the light feature is perfectly recognized.
- the number of recognized primary light features which are matched to corresponding light features of the projected structured light beam is at least 2, preferably at least 3, such as at least 5, such as at least 7. Thereby a higher accuracy may be obtained.
- the computer system is configured for identifying a first number of pairs of matched features with a homographic transformation that corresponds within a threshold and preferably for applying this homographic transformation as the estimated homographic transformation e.g. to thereby estimate the pose of the projector device.
- the first number of pairs of matched features is at least two, such as at least 3, such as at least 5.
- the computer system may be configured for identifying one or more second pairs of matched features with a transformation which differs beyond the threshold from the homographic transformation of the first number of pairs of matched features.
- the computer system may further be configured for correcting and/or discharging the pixel data representing the recognized feature(s) of the second pairs of matched features in particular where the transformation of the second pair(s) of matched light features differs far beyond the threshold and/or where the transformation of the second pair of matched light features is determined from one single pair of matched light features.
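- A common way to obtain such a threshold-consistent homographic transformation is a robust (e.g. RANSAC) fit; the OpenCV sketch below is offered only as an assumed illustration and needs at least four matched pairs.

```python
import cv2
import numpy as np

def estimate_homography(ref_feature_pts, image_feature_pts, threshold_px=3.0):
    """Estimate the homographic transformation between matched light features.

    Returns the homography together with a boolean mask separating pairs whose
    transformation corresponds within the threshold (inliers) from pairs that
    differ beyond it and may be corrected or discarded (outliers)."""
    H, mask = cv2.findHomography(
        ref_feature_pts.astype(np.float32),
        image_feature_pts.astype(np.float32),
        cv2.RANSAC,
        threshold_px,
    )
    return H, mask.ravel().astype(bool)
```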
- recognized light features reflected from areas of the tissue field having large topographical height differences relative to the average tissue field may be disregarded or corrected for the recognized light features to be used as primary recognized light features in the estimation of the spatial position of the projector device.
- a timely associated 2D image of an element associated to the projector may be used to confirm the estimation of the spatial position of the projector or e.g. to rule out erroneous estimates of the spatial position of the projector, e.g. due to undesired light reflections which may for example occur at very curved surfaces or where the angle between the emitted light beam and the surface of the body cavity reflecting the light pattern is very far from normal, e.g. 10 degrees or less. It has been observed that in such situations some of the estimations, e.g. 1 out of 10-100, may be erroneous.
- the timely associated 2D image may for example be a frame acquired by the image acquisition device with corresponding time attribute as the set(s) of pixel data used for the estimation of the spatial position of the projector.
- the element associated to the projector may be an instrument, such as a minimally invasive surgical instrument to which the projector is fixed or mounted.
- the 2D image may e.g. comprise a tip of the instrument together with a surface part of the tissue field.
- the image of the tip may reveal the tip orientation, which may be correlated to the determination of the projector orientation, and the relation between the tissue field and the tip may be correlated to the spatial position of the projector.
- the image acquisition device may comprise a single camera or several cameras e.g. a stereo camera.
- the image acquisition device comprises a single camera configured for acquiring the frames.
- the sets of pixel data of the respective frames are associated with respective consecutive time attributes
- where the image acquisition device has one single camera, it may be desired to use a higher number of recognized features for matching with the corresponding light features of the projected structured light beam.
- the computer system may be configured for matching recognized features of one set of pixel data with corresponding recognized features of a subsequent or previous set of pixel data.
- the matching of recognized light features from one set of pixel data with corresponding recognized features of a subsequent or previous set of pixel data is referred to as time shifted matching.
- the computer system is configured for repeating the steps of • receiving a set of pixel data associated with a time attribute and representing a single frame
- the number of recognized primary features for matching may advantageously be at least 3, such as at least 5, such as from 6 to 100, such as from 8 to 25.
- the frames comprise a plurality of frames acquired with a wide projector-camera baseline, i.e. with a relatively large distance between the projector and the camera relative to the distance between the projector and the tissue field.
- the computer system may, in one or more of the repeating steps, further be configured for performing time shifted matching comprising matching the primary light features with corresponding primary light features of at least one set of pixel data associated with a previous time attribute, and for applying the time shifted matching in the estimation of the spatial position of the projector device.
- the previous time attribute is up to about 10 seconds, such as up to about 5 seconds, such as up to one second earlier than the time attribute of the set of pixel data processed in the current processing step.
- the time shifted matching may comprise feature matching over 3, 4 or more sets of pixel data of subsequently acquired images.
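- The time shifted matching could, in a simple nearest-neighbour form, look like the sketch below; the dictionary keys and the distance criterion are assumptions made only for illustration.

```python
import numpy as np

def time_shifted_matches(current, previous, max_age_s=10.0, max_px_dist=15.0):
    """Match primary light features of the current set of pixel data with
    corresponding features of a previous set, provided the previous time
    attribute is no more than max_age_s earlier."""
    if current["time"] - previous["time"] > max_age_s:
        return []
    matches = []
    for i, point in enumerate(current["features"]):      # N x 2 pixel positions
        dists = np.linalg.norm(previous["features"] - point, axis=1)
        j = int(np.argmin(dists))
        if dists[j] < max_px_dist:
            matches.append((i, j))
    return matches
```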
- the image acquisition device may advantageously comprise a multi camera configured for acquiring sets of the frames where each set of frames comprises at least two simultaneously acquired frames and the sets of pixel data of the frames of a set of frames are associated with corresponding time attribute representing the time of acquisition.
- corresponding time is used to mean a substantially identical time attribute.
- the number of recognized features in this embodiment is at least 2, such as at least 3, such as from 5 to 100, such as from 6 to 25.
- where the image acquisition device comprises two or more cameras, such as at least one stereo camera, the computer system is advantageously configured for performing stereo matching comprising matching the primary light features of two or more of the respective sets of pixel data with each other.
- the stereo matching may e.g. be performed in one or more repetitions of the above steps.
- the stereo matching may advantageously be a wide camera baseline stereo matching e.g. using epipolar geometry.
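- One generic way to realize such wide camera baseline stereo matching with epipolar geometry is to fit a fundamental matrix to candidate matches and keep only the pairs that satisfy the epipolar constraint; the sketch below uses OpenCV and is an assumed illustration rather than the invention's own method.

```python
import cv2
import numpy as np

def epipolar_consistent_matches(pts_left, pts_right, max_residual_px=1.0):
    """Estimate the fundamental matrix from candidate stereo matches and return
    it with a boolean mask of the matches consistent with the epipolar geometry."""
    F, mask = cv2.findFundamentalMat(
        pts_left.astype(np.float32),
        pts_right.astype(np.float32),
        cv2.FM_RANSAC,
        max_residual_px,
    )
    return F, mask.ravel().astype(bool)
```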
- the two or more cameras of the multi camera may be integrated in one unit or in separate units.
- the computer system is configured for performing time shifted stereo matching comprising matching the primary light features with corresponding primary light features of at least one set of pixel data associated with a previous time attribute, and for applying the time shifted matching in the estimation of the spatial position of the projector device.
- the previous time attribute is up to about 10 seconds, such as up to about 5 seconds, such as up to one second earlier than the time attribute of the set of pixel data processed in the current processing step.
- the multi camera may advantageously comprise a stereo camera comprising two coordinated image acquisition units.
- the image acquisition units may be arranged to acquire wide camera baseline images.
- the image acquisition units are arranged with a fixed relative distance to each other, preferably of at least about 5 mm, such as at least about 1 cm, such as at least about 2 cm.
- the distance between two cameras arranged for acquiring images of a surface that is to be determined should be equal, or as close to equal as possible, to the distance between the cameras and the surface.
- the distance(s) between image acquisition units is advantageously relatively small, such as less than 5 mm, such as from about 0.1 to about 3 mm.
- the camera baseline, i.e. the distance between the camera units, will be relatively narrow compared to the distance to the tissue field. This may for example be compensated by operating using a wide projector-camera baseline, e.g. as described below.
- the 3D reconstruction system is configured for operating using both a wide camera baseline and a wide projector-camera baseline.
- the computer system may further be configured for performing stereo matching and preferably wide field stereo matching of tissue field features - i.e. features that represent local tissue field areas having a characteristic attribute, such as a protrusion, a depression and/or a local lesion. Since the 3D reconstruction system is capable of determining sizes, the computer system may determine the size and/or volume of, for example, a hernia and/or a protrusion, a depression and/or a local lesion.
- the multi camera comprises coordinated image acquisition units arranged with substantially parallel optical axes and/or centre axes.
- the multi camera comprises coordinated image acquisition units arranged with optical axes having a relative angle to each other of up to about 45 degrees, such as up to about 30 degrees, such as up to about 15 degrees.
- the coordinated image acquisition units are arranged to have an overlapping field of view.
- the overlapping field of view is relatively large since only light features recognized in two or more images of the multi camera may be matched.
- the coordinated image acquisition units have an at least about 10 % overlapping field of view, such as at least about 25 % overlapping field of view, such as at least about 50% overlapping field of view.
- the overlapping field of view is generally determined as an angular field of view.
- the image acquisition device has a stereo field of view - determined as the maximal overlapping field of view - of up to about 60 degrees, such as up to about 50 degrees, such as up to about 40 degrees, such as from about 5 degrees to about 50 degrees.
- the image acquisition device has a maximal field of view of from at least about 5 degrees to about 160 degrees, such as up to about 120 degrees, such as from about 10 to about 100 degrees, such as from about 15 to about 50 degrees, such as from about 20 to about 40 degrees.
- the maximal field of view may be determined in any rotational orientation, including horizontal, vertical or diagonal orientation.
- the image acquisition device has a field of view adapted to cover the tissue field without including the projector device.
- the 3D reconstruction system need not acquire images including the projector device and thus the 3D reconstruction system is a very flexible system which may operate with limited field(s) of view while simultaneously providing highly accurate 3D determinations.
- the structured light arrangement and the image acquisition device are located on the same movable instrument. In an embodiment the structured light arrangement is located on one movable instrument and the image acquisition device is located on another independently movable instrument.
- the one or more cameras of the image acquisition device may be any one or more cameras of the image acquisition device.
- the one or more cameras is/are located on a cannula.
- the 3D reconstruction system is advantageously adapted for operating using a wide projector- camera baseline.
- the term "projector-camera baseline” means the distance between the camera and the projector.
- the projector-camera baseline is dynamic and variable and may be selected by the surgeon to ensure the desired depth resolution.
- the projector-camera baseline advantageously is selected in dependence of the distance between the projector and the tissue field, preferably such that the projector-camera baseline is matching the distance between the projector and the tissue field, which has been found to provide very accurate determinations.
- the projector-camera baseline is from about 1/16 to about 16 times the distance between the camera and the tissue field, such as from about 1/4 to about 4 times the distance between the camera and the tissue field, such as from about half to about 2 times the distance between the camera and the tissue field, such as 1 time the distance between the camera and the tissue field.
- the 3D reconstruction system is adapted for operating using a wide projector-camera baseline comprising a projector-camera distance up to about 45 cm, such as up to about 30 cm, such as up to about 15 cm, such as up to about 10 cm, such as up to about 5 cm, such as up to about 3 cm, such as up to about 2 cm. It has been found that operating using a wide baseline ensures an even higher accuracy for size determinations, such as tissue field volume determinations and/or tissue field topologic size determinations.
- the reconstruction system is adapted for operating using a varying projector-camera baseline, preferably comprising that the projector and the image acquisition device are movable independently of each other.
- the computer system is adapted for determining the projector-camera baseline. This may be provided by the estimation of the homographic transformation between the matched features at a given time.
- the computer system is adapted for determining the projector-camera baseline as a function of time.
- the computer system is configured for associating a plurality of determined projector-camera baseline(s) with timely associated set(s) of pixel data via a data link.
- the data link may e.g. be provided by the time attributes, for example to provide that each determined projector-camera baseline is associated to a time attribute representing the time of the determined projector-camera baseline.
- the determined projector-camera baselines and the sets of pixel data may thereafter be linked using their respective time attributes, for example such that a determined projector-camera baseline is linked to the set of pixel data having the closest time attribute match.
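- The closest time attribute linking could be as simple as the following sketch, assuming each determined baseline and each set of pixel data carries a time stamp (the data layout is an assumption for illustration).

```python
def link_baselines_to_pixel_sets(baselines, pixel_sets):
    """Link each determined projector-camera baseline to the set of pixel data
    whose time attribute is the closest match.

    baselines  : list of (time_attribute, baseline_distance) tuples
    pixel_sets : list of dicts, each with a "time" key
    """
    linked = []
    for time_attr, baseline in baselines:
        closest = min(pixel_sets, key=lambda s: abs(s["time"] - time_attr))
        linked.append((baseline, closest))
    return linked
```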
- the 3D reconstruction system is adapted for operating using a non-rigid structure from motion technique.
- This is in particular desired for size determinations, such as tissue field volume determinations and/or tissue field topologic size determinations, e.g. where local motion may occur, e.g. near a vessel or nerve.
- the non-rigid structure from motion technique is e.g.
- the computer system comprises data representing an angle of divergence of the projected structured light beam.
- the computer system may determine the spatial position of the projector device with an even higher accuracy using relatively simple algorithms.
- the projected structured light beam has an angle of divergence and the computer system stores or is configured for storing the angle of divergence in its memory.
- the angle of divergence data may e.g. be a part of the reference structured light data set.
- the angle of divergence may for example be at least about 10 degrees, such as at least about 20 degrees, such as at least about 30 degrees, such as at least about 40 degrees relative to the centre axis of the structured light.
- the optimal angle of divergence may advantageously be selected in dependence of how close the projector device is adapted for being located relative to the tissue field.
- the angle of divergence may advantageously be substantially rotationally symmetrical, thereby the structured light arrangement may be provided using relatively simple optical structures.
- the computer system is configured for acquiring the angle of divergence from an operator via a user interface and/or by a calibration.
- the angle of divergence may be tunable, e.g. by preprogramming or by operator intervention. For example, an operator may use a larger angle of divergence where the projector device is closer to the tissue field and a smaller angle of divergence where the projector device is further from the tissue field.
- the angle of divergence is fixed. In an embodiment the angle of divergence is tunable according to a preprogrammed routine and/or by operator instructions.
- the computer system is configured for determining the angle of divergence by a calibration.
- the calibration may for example comprise projecting the projected structured light beam from a preselected distance and toward a known surface area, recording the reflected structured light and determining the angle of divergence, such as projecting the projected structured light beam from a preselected distance and with its centre axis orthogonal to the known surface area.
- the angle of divergence may for example be determined as the beam divergence, Θ = 2 · arctan((D2 − D1) / (2 · L)), wherein:
- D1 is the largest cross-sectional dimension orthogonal to the centre axis of the structured light beam as projected from the projector device or at a first distance from the projector device,
- D2 is the largest cross-sectional dimension orthogonal to the centre axis of the structured light at a second, larger distance from the projector, e.g. at a surface, and
- L is the distance between D1 and D2, wherein the distances are determined along the centre axis of the light pattern.
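- In code, the divergence determined from the two cross-sectional measurements could be computed as below, assuming the full-angle convention written out above.

```python
import math

def beam_divergence_deg(d1, d2, length):
    """Beam divergence from two cross-sectional dimensions D1 and D2 measured a
    distance L apart along the centre axis of the structured light beam."""
    return math.degrees(2.0 * math.atan((d2 - d1) / (2.0 * length)))

# Example: D1 = 5 mm at the projector and D2 = 55 mm at L = 100 mm
# give a divergence of roughly 28 degrees.
print(beam_divergence_deg(5.0, 55.0, 100.0))
```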
- the data representing the angle of divergence for the or each projected structured light beam is included in the or in each respective reference structured light data set(s)
- the reference structured light data set for each projected structured light beam will be known to the computer system and may be applied in the at least one determination of the tissue field.
- the structured light beam may in practice have any kind of optically detectable structure.
- the wavelength(s), structure and intensity of the structured light beam are advantageously selected such that the beam is reflected from the tissue field, such as a field of soft tissue, such as a tissue field exposed by surgery.
- the cross-sectional light structure of the projected structured light beam comprises optically distinguished areas, such as a pattern of areas of light and areas of no-light and/or areas of light of a first quality of a character and areas of light of a second quality of the character, wherein the character advantageously is selected from light intensity, wavelength and/or range of wavelengths.
- the structured light beam may for example be a pattern of light of a certain wavelength range with intermediate areas of no light or areas of light with a narrower range of wavelengths.
- the pattern may e.g. be strips, cross hatched lines or any other lines, or shapes.
- the structured light beam may e.g. be provided by providing the projector device with one or more optical filters and/or by a projector device comprising a diffractive optical element (DOE), a spatial light modulator, a multi-order diffractive lens, a holographic lens, a Fresnel lens, a mirror arrangement, a digital micromirror device (DMD) and/or a computer regulated optical element, such as a computer regulated mechanically optical element e.g. a mems (micro-electro-mechanical) element.
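- Purely as an illustration of a cross-sectional light structure with recognizable features, the sketch below builds a binary cross-hatched grid with one asymmetric marker (so the pattern has no symmetry plane); it does not represent any of the optical elements listed above.

```python
import numpy as np

def crosshatch_pattern(size=512, spacing=32, line_width=2):
    """Binary mask of a cross-hatched light pattern with an asymmetric marker."""
    mask = np.zeros((size, size), dtype=np.uint8)
    for k in range(0, size, spacing):
        mask[k:k + line_width, :] = 1   # horizontal lines
        mask[:, k:k + line_width] = 1   # vertical lines
    mask[10:40, 10:20] = 1              # corner marker breaking the symmetry
    return mask
```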
- the structured light arrangement comprises light blocking element(s) that blocks parts of the light to form the structuring or part of the structuring of the structured light beam.
- the blocking elements may e.g. be blocking strips arranged on or forming part of the projector device.
- the structured light arrangement comprises a fiber optic probe comprising the projector device configured to project a structured light beam onto at least a section of the tissue field.
- the fiber optic probe advantageously comprises a structured light generating and projecting device and a bundle of fiber guiding the structured light to the projector device for projecting the structured light beam onto at least a section of the tissue field.
- the structured light generating and projecting device will in the following be referred to as the structured light device.
- the structured light device may be any device that can generate a suitable structured light.
- the size of the structured light device is not very important.
- the fibers of the fiber bundle each have a light receiving end and a light emitting end.
- the fiber bundle has a light receiving end and a light emitting end.
- the light receiving end of the fiber bundle is operatively coupled to the structured light device for receiving at least a part of the structured light from the structured light device.
- the operatively coupling may include one or more lenses and/or objectives e.g. for focusing the structured light to be received by the light receiving end of the fiber bundle.
- the light emitting ends of the fibers are arranged in an encasing to thereby form a probe-head comprising the projector device.
- the probe-head may comprise one or more lenses for ensuring a desired projection of the structured light beam.
- the fiber bundle advantageously comprises at least 10 optical fibers, such as at least 50 optical fibers, such as from about 100 to about 2000 optical fibers, such as from about 200 to about 1000 optical fibers.
- the fibers of the fiber bundle may be identical or they may differ, e.g. where the structured light comprises a structuring of different wavelengths.
- the fibers of the fiber bundle are substantially identical. In an embodiment, fibers of the fiber bundle are partly or fully fused to ensure a fixed relative location of the fiber ends.
- the structured light arrangement is mounted to a minimally surgical instrument
- the structured light arrangement and the minimally surgical instrument are advantageously in the form of a fiber optic probe instrument assembly as described below.
- the structured light beam is provided as described in DK PA 2016 71005.
- the cross-sectional light structure comprises a symmetrical or an asymmetrical light pattern which may be repeating or non-repeating.
- the cross-sectional light structure is asymmetrical and has no symmetry plane. Thus the risk of erroneous matching of light features may be reduced or even avoided.
- the light pattern advantageously comprises a plurality of light dots, an arch shape, ring or semi-ring shaped lines, a plurality of angled lines, a coded structured light configuration or any combinations thereof, preferably the pattern comprises a grid of lines, a crosshatched pattern optionally comprising substantially parallel lines.
- the light pattern comprises a bar code, such as a QR code.
- the light features comprise local light fractions comprising at least one optically detectable attribute.
- the local light fractions may for example independently of each other each comprise an intensity attribute, a wavelength attribute, a geometric attribute or any combinations thereof.
- Preferably at least one of the attributes is a location attribute and/or an orientation attribute.
- each local light fraction has a beam area fraction of up to about 25 % of the area of the cross-sectional light structure.
- each local light fraction has a beam area fraction of up to about 20 %, such as up to about 10 %, such as up to about 5 %, such as up to about 3 % of the area of the cross-sectional light structure.
- the area of the cross-sectional light structure of a light feature may overlap with, form part of or fully include the area of the cross-sectional light structure of another light feature.
- the centre to centre distance between at least two of the light features is at least about 0.1 %, such as at least about 1 % of the maximal dimension of the cross-sectional light structure, preferably the centre to centre distance between at least two of the light features is at least about 10 %, such as at least about 25 %, such as at least about 50 % of the maximal dimension of the cross-sectional light structure.
- the centre to centre distance of light features is determined as the distance of the light features at the projected structured light beam.
- the distance of two corner features may be determined as the 2D Euclidean distance between the corners, e.g. diametrically determined.
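As a worked illustration of the centre-to-centre criterion, the sketch below checks the 2D Euclidean distance between two feature centres against a fraction of the maximal dimension of the cross-sectional light structure; the function name and the example values are hypothetical.

```python
import numpy as np

def centre_distance_ok(c1, c2, max_dimension, min_fraction=0.10):
    """True if the 2D Euclidean centre-to-centre distance is at least
    min_fraction of the maximal dimension of the light structure."""
    d = float(np.linalg.norm(np.asarray(c1, float) - np.asarray(c2, float)))
    return d >= min_fraction * max_dimension

# Two corner features of a structure whose maximal dimension is 640 pixels:
print(centre_distance_ok((10, 10), (600, 400), max_dimension=640))  # True
```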
- each of the light features are represented by a local light fraction of the cross-sectional light structure having an optically detectable attribute, preferably each light feature comprises a local and characteristic light fraction of the projected structured light.
- each of the light features independently of each other comprise a light fraction comprising two or more crossing lines, v-shaped lines, a single dot, a group of dots, a corner section, a pair of parallel lines, a circle or any combinations thereof and or any other geometrical shape(s).
- each of the light features comprises at least one of a location attribute and an orientation attribute.
- the one or more light features advantageously comprise a combined location and orientation attribute.
- the set of light features may be a predefined set of light features or the set of light features may be selected by the computer system e.g. by selecting light features which may be relatively simple to recognize by the computer system.
- the set of light features comprises predefined light features.
- the computer system is advantageously configured for searching for at least some of the predefined light features in the set of pixel data and preferably for recognizing the predefined light features if present and preferably without being distorted beyond a threshold.
- Data representing the predefined light features may be transmitted to the computer system e.g. together with the reference structured light data set and/or the computer system may acquire the predefined light features from a database e.g. together with the reference structured light data set.
- the computer system is configured for defining the set of light features from the reference structured light data set representing the projected structured light beam.
- the computer system may be configured for defining the light features of the set of light features as light features with attributes which make the light features relatively simple to be recognized from the set of pixel data of the acquired images.
- the computer system may be programmed to define the light features of the set of light features according to preferred attributes.
- the computer system is advantageously configured for searching for at least some of the defined light features in the received set of pixel data.
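One conventional way to search a set of pixel data for predefined light features is normalized template matching; the OpenCV-based sketch below is offered as an illustration under stated assumptions (the threshold and names are hypothetical), not as the patented method.

```python
import cv2
import numpy as np

def find_feature(frame_gray, template_gray, threshold=0.8):
    """Return pixel centres where the template matches above the threshold."""
    scores = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    ys, xs = np.where(scores >= threshold)
    h, w = template_gray.shape
    # Shift each match from the top-left corner to the feature centre.
    return [(int(x) + w // 2, int(y) + h // 2) for x, y in zip(xs, ys)]
```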
- the primary light features may be preprogrammed in the computer system or preferably the computer system is configured to select the primary light features from the recognized light features.
- the computer system may e.g. be configured for selecting the primary light features according to a set of selection rules, for example comprising selecting recognized light features having both orientation attribute and position attribute, selecting recognized light features which have a distance beyond a threshold distance to one or more other already selected recognized light features, selecting recognized light features representing corner segments of the cross-sectional light structure, selecting recognized light features representing square pattern segments of the cross-sectional light structure and/or combinations thereof.
- the selection of the primary light features may preferably be performed as the light features are recognized and continue until a sufficient number of primary light features have been selected.
- the sufficient number of primary light features may be determined by the computer system or it may be a predefined number programmed into the computer system and/or transmitted to the computer system.
- the computer system may have a preprogrammed number as the sufficient number of primary light features and the computer may be configured to overwrite this number upon instruction of an operator to apply another number as the sufficient number of primary light features.
- the primary light features include at least 3 primary light features arranged with a triangular configuration to each other.
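The selection rules above can be pictured with the following sketch, which keeps recognized features carrying both attributes and lying beyond a threshold distance from already selected features, stopping at a sufficient count; the dictionary layout and numeric values are assumptions made for illustration.

```python
import numpy as np

def select_primary(features, min_distance=30.0, sufficient=3):
    """features: dicts like
    {'centre': (x, y), 'has_location': bool, 'has_orientation': bool}."""
    selected = []
    for f in features:
        if not (f['has_location'] and f['has_orientation']):
            continue  # keep only features with both attributes
        far_enough = all(
            np.linalg.norm(np.subtract(f['centre'], s['centre'])) >= min_distance
            for s in selected)
        if far_enough:
            selected.append(f)
        if len(selected) >= sufficient:
            break  # a sufficient number of primary light features found
    return selected
```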
- the computer system comprises an artificial intelligent processing system configured for selecting qualified light features.
- the artificial intelligent processing system may be any artificial intelligent processing system suitable for selecting qualified light features - i.e. light features which comprise at least one position attribute and/or at least one orientation attribute.
- the artificial intelligent processing system may be trained to select qualified light features by being presented with a number of qualified light features (labeled as qualified) and a number of non-qualified light features (labeled as non-qualified) and, based on these presentations and labels, being trained to distinguish between qualified and non-qualified light features.
- the artificial intelligent processing system comprises machine learning algorithms, such as machine learning algorithms for supervised deep learning and/or machine learning algorithms for non-supervised deep learning.
- the artificial intelligent processing system is configured for including pre-operation data and/or inter-operation data, e.g. from a cloud of data, in the selection of qualified light features.
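A supervised-learning stand-in for such a trained system might look like the scikit-learn sketch below, fitted on descriptors labeled qualified/non-qualified; the descriptor contents and values are invented for illustration and do not come from the disclosure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical descriptors: [intensity, asymmetry score, centre distance].
X_train = np.array([[0.9, 0.8, 40.0], [0.2, 0.1, 5.0],
                    [0.8, 0.7, 35.0], [0.3, 0.2, 8.0]])
y_train = np.array([1, 0, 1, 0])  # 1 = qualified, 0 = non-qualified

clf = LogisticRegression().fit(X_train, y_train)

def is_qualified(descriptor):
    """Predict whether a light-feature descriptor is qualified."""
    return bool(clf.predict(np.asarray(descriptor, float).reshape(1, -1))[0])
```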
- the computer system is configured to recognize the primary light features from the received pixel data.
- the computer system is configured for selecting the primary light features from the recognized light features and the computer system comprises a primary light feature threshold for selecting light features qualified for representing primary light features, the primary light features threshold preferably comprises a location attribute sub-threshold and an orientation attribute sub-threshold.
- the primary light feature threshold may for example comprise a minimum intensity, a minimum identity probability, a minimum orientation probability, a minimum asymmetry, a minimum centre distance, etc.
- the primary light features are identified as light features comprising at least one of a location attribute and an orientation attribute.
- a plurality of the primary light features is identified as light features comprising both a location attribute and an orientation attribute.
- An orientation attribute is generally an attribute with a degree of asymmetry, preferably rotational asymmetry, more preferably at most two-fold symmetry and preferably one-fold symmetry.
- the asymmetry may be an asymmetry in light intensity, in geometrical shape, in wavelength and/or range of wavelengths.
- the computer system may be configured for determining the orientation of the light feature in the cross-sectional plane of the structured light and thereby also the rotational orientation of the structured light relative to the projector device.
- the location attribute is represented by a light point, such as the cross of crossing lines, the tip of v-shaped lines, the position of a constraint along a line, the amplitude of a wave shaped line or the centre of a light dot, etc.
- the orientation attribute is represented by one or more asymmetrical geometrical shapes, such as the orientation of the lines of crossing lines, the orientation of the lines of v-shaped lines, orientations of the wave of a wave shaped line or the orientation of imaginary lines between dots of a group of dots.
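A standard way to extract such an orientation attribute from an asymmetric intensity patch is the principal-axis angle of its second central moments, sketched below with NumPy; this is a textbook technique given as an illustration only, not the patented algorithm.

```python
import numpy as np

def patch_orientation(patch):
    """Principal-axis angle (radians) of an intensity patch; only meaningful
    for patches without full rotational symmetry."""
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    w = patch.astype(float)
    m = w.sum()
    cx, cy = (xs * w).sum() / m, (ys * w).sum() / m  # intensity centroid
    mu20 = ((xs - cx) ** 2 * w).sum()
    mu02 = ((ys - cy) ** 2 * w).sum()
    mu11 = ((xs - cx) * (ys - cy) * w).sum()
    return 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
```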
- the computer system comprises an artificial intelligent processing system configured for selecting the primary light features from the recognized light features.
- the artificial intelligent processing system may for example be a trained artificial intelligent processing system which has been trained to select qualified light features e.g. according to the above
- the image acquisition device may in principle be any image acquisition device capable of acquisition of digital images.
- the image acquisition device comprises at least one image acquisition unit comprising a pixel sensor array, such as charge-coupled device (CCD) image sensor, or a complementary metal-oxide-semiconductor (CMOS) image sensor.
- the image acquisition unit comprises an array of pixel sensors each comprising a photodetector, such as an avalanche photodiode (APD), a photomultiplier or a metal-semiconductor-metal (MSM) photodetector.
- the image acquisition unit comprises active pixel sensors (APS).
- each pixel comprises an amplifier, which makes the operation of the image acquisition unit faster; more preferably the image acquisition unit comprises at least about 0.1 megapixels, such as at least about 1 megapixel, such as at least about 5 megapixels.
- the image acquisition device comprises and/or is associated with an optical filter.
- the optical filter may for example be a wavelength filter and/or a polarizing filter, for example a linear or a circular polarizer.
- the optical filter is arranged to provide that at least some of the light reflected from the tissue field is filtered prior to reaching the camera(s) of the image acquisition device.
- Such an optical filter may be applied to further ensure that the pixel data of the frames are subjected to as little noise as possible.
- the image acquisition device comprises and/or is associated with at least one linear polarization filter. In an embodiment, the 3D reconstruction system is configured for acquiring one or more frames of reflected light of the structured light beam from the tissue field where the reflected light has been filtered by the at least one linear polarization filter, which ensures that light polarized in a first direction is blocked while the remaining light passes through.
- the 3D reconstruction system may further be configured for acquiring one or more other frames of reflected light of the structured light beam from the tissue field where the reflected light has been filtered by the at least one linear polarization filter such that light polarized in a direction orthogonal to the first direction is blocked.
- Light reflecting off a surface will tend to be polarized, with the direction of polarization (the way that the electric field vectors are pointing) being parallel to the plane of the interface.
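Assuming frame pairs acquired through orthogonal linear polarizers as described, one conceivable post-processing of them is sketched below: the per-pixel minimum approximates the diffuse reflection while the difference highlights the polarized specular component. This is an illustrative idea, not a step taken from the disclosure.

```python
import numpy as np

def split_reflections(frame_pol0, frame_pol90):
    """frame_pol0 / frame_pol90: uint8 frames acquired through orthogonal
    linear polarizers."""
    diffuse = np.minimum(frame_pol0, frame_pol90)       # mostly unpolarized light
    specular = np.abs(frame_pol0.astype(int) - frame_pol90.astype(int))
    return diffuse, specular.astype(frame_pol0.dtype)   # polarized glare estimate
```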
- the computer system may acquire further data for performing a 3D reconstruction of a part of or the entire tissue field.
- the frame rate may be selected in dependence of the intended use of the 3D reconstruction system and in particular in dependence on how fast it is intended to move the structured light arrangement and/or the image acquisition device.
- the image acquisition unit has a frame rate of at least about 10 Hz, such as at least about 25 Hz, such as at least about 50 Hz, such as at least about 75 Hz.
- the image acquisition units are advantageously timely coordinated to acquire images simultaneously.
- the two or more image acquisition units preferably have the same frame rate.
- the frame rate is from about 10 to about 75 Hz.
- the image acquisition unit has an even higher frame rate, such as up to about 300 Hz, such as up to about 500 Hz, such as up to about 1000 Hz.
- the structured light arrangement is configured for being pulsed, preferably having a pulse duration and a pulse frequency.
- "pulsed" herein means that the structured light arrangement projects the structured light beam in pulses.
- the pulses may e.g. be provided by pulsing the light source e.g. by a light source driver and/or using a shutter and/or by any other means capable of switching the light on and off.
- the pulse duration is the time of one pulse of projected structured light.
- the time between pulses is referred to as inter pulse time.
- the 3D reconstruction system may perform other light based measurements in the inter pulse time between pulses, e.g. using one or more other light sources.
- the 3D reconstruction system may comprise an imaging light arrangement for imaging the tissue field, and advantageously the imaging light arrangement is pulsed asynchronously relative to the structured light arrangement.
- the structured light arrangement has a pulse duration, i.e. the temporal length of the pulse, which is from about half to about twice an inter pulse time between pulses, such as from about 0.01 to about 1.5 times the inter pulse time, such as from 0.05 to about 1 times the inter pulse time, such as from 0.1 to about 0.5 times the inter pulse time.
- the pulse duration and/or the pulse rate may advantageously be selectable by the surgeon, e.g. regulated by the computer system.
- measurements and/or determinations using one or more other projected light beams e.g. additional structured light beams and/or beams comprising IR wavelength may be performed in the inter pulse time without disturbing the 3D determination of the tissue field by the 3D reconstruction system.
- the structured light arrangement has a pulse rate adjusted relative to the frame rate of the image acquisition unit.
- the structured light arrangement has a pulse rate adjusted relative to the frame rate of the image acquisition unit, to provide that the 3D reconstruction system is configured for acquiring the plurality of frames comprising reflected structured light and for acquiring a plurality of background frames between pulses of the projected structured light.
- the background frames are acquired in the inter pulse time and preferably while any optional illumination light is also shut off.
- the structured light arrangement has a pulse rate adjusted relative to the frame rate of the image acquisition unit to provide that the image acquisition units acquires one or more background frames during the inter pulse time.
- the structured light arrangement has a pulse rate adjusted relative to the frame rate of the image acquisition unit to provide that the image acquisition units acquire every 2nd, 3rd, 4th, 5th or 6th frame as background frames during the inter pulse time and preferably the remaining frames during the pulse time when the structured light beam is on.
- the structured light arrangement may advantageously have a structured light controller for adjusting the pulse rate.
- the structured light controller may also be configured for controlling the pulse length of the structured light beam.
- the structured light controller is preferably in data communication with the computer system or it forms part of the computer system.
- the computer system may advantageously control both pulse rate and frame rate and be configured for the timing control thereof.
- the computer system is configured for subtracting pixel values of one or more background frames from the respective sets of pixel data to thereby reduce noise as indicated above.
- the computer system is configured for subtracting pixel values of the timely nearest background frame from each of the respective sets of pixel data. This noise reduction is preferably performed prior to recognizing a plurality of the light features of the set of light features in the set of pixel data.
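A minimal sketch of this noise reduction, under assumed data structures (timestamped frame lists), subtracts the timely nearest background frame from each structured-light frame before feature recognition:

```python
import numpy as np

def subtract_nearest_background(light_frames, background_frames):
    """Both arguments: lists of (timestamp, uint8 pixel array) tuples."""
    denoised = []
    for t, pix in light_frames:
        # Pick the background frame whose time attribute is nearest.
        _, bg = min(background_frames, key=lambda fb: abs(fb[0] - t))
        clean = np.clip(pix.astype(int) - bg.astype(int), 0, 255)
        denoised.append((t, clean.astype(pix.dtype)))
    return denoised
```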
- the time attribute may be a relative time attribute or an actual time attribute or a combination thereof.
- a first set of pixel data may be associated with an actual time attribute and subsequent sets of pixel data may be associated with relative time attributes, which are relative with respect to the actual time attribute of the first set of pixel data.
- the computer system is advantageously configured for communicating with the image acquisition device and preferably the structured light arrangement by wire or wireless.
- the computer system preferably comprises at least one processor, at least one memory, and at least one user interface, such as a graphical interface, a command line interface, an audible interface, a touch based interface, a holographic interface or any other user interface.
- the user interface advantageously comprises at least a display/monitor (a screen e.g. a touch screen) and/or a printer.
- the computer system may advantageously be configured for receiving patient data via a user interface and/or for acquiring patient data from a database.
- the patient data may for example comprise data representing the tissue field e.g. at another point of time and/or data that represent a similar tissue field e.g. a tissue field from a patient of similar age and/or gender or from a group of similar patients.
- the patient data comprise pre-operation data (data obtained before starting a procedure e.g. a surgical procedure) and/or inter-operation data (data obtained during a procedure e.g. a surgical procedure), such as data obtained by a scanning or other measuring methods, such as a CT scanning, a MR scanning, an ultrasound scanning, a fluorescence imaging and/or a PET scanning, and/or such data estimated and/or calculated for groups of patients.
- suitable pre-operation and/or inter-operation data comprise for example data representing measurements and/or estimations obtained by the methods described in the review article "Novel methods for mapping the cavernous nerves during radical prostatectomy" by Fried, N. M. & Burnett, A. L. Nat. Rev. Urol. 12, 451–460 (2015); published online 10 August 2015; doi:10.1038/nrurol.2015.174.
- Fluorescence imaging has been shown to be a helpful tool during or prior to surgery, e.g. for improved identification for repair of damaged tissues. Further information about fluorescence imaging for surgical guidance is for example disclosed in "Fluorescence Imaging in Surgery" by Ryan K. Orosco et al., IEEE Rev Biomed Eng. 2013; 6: 178-187 (published online 2013 Jan 15).
- the computer system is configured for applying the patient data for validating pixel data and/or for repairing incomplete pixel data.
- the reference structured light data set may be loaded to the computer system by any method.
- the reference structured light data set may e.g. be transmitted to the computer system by an operator and/or the computer system may be configured for acquiring the reference structured light data set from a database e.g. in response to an instruction from an operator and/or based on a code included in the structured light beam, such as an optically detectable code.
- Such a reference structured light database, comprising a plurality of reference structured light data sets each linked to a unique code, may e.g. form part of the 3D reconstruction system.
- In an embodiment the computer system switches to projecting another structured light beam and simultaneously applies the set of reference structured light data associated with this other structured light beam.
- the computer system is configured for receiving and storing reference structured light data set representing the structured light beam including the set of light features.
- the computer system is preferably configured for receiving the reference structured light data set via a user interface.
- the computer system may be configured for using the reference structured light data set for recognizing the light features from pixel data and preferably for identifying and/or selecting the primary features.
- the calibration is preferably performed by arranging the projector of the structured light arrangement and image acquisition unit(s) of the image acquisition device in predefined spatial positions relative to a plane surface, projecting the structured light to impinge on the plane surface and acquiring reflected light by the image acquisition device.
- the computer system may also determine and store data representing the angle of divergence of the structured light beam.
- the computer system may advantageously be configured for matching the primary features and corresponding features of the reference structured light data set using homographical matching principles, e.g. involving trigonometric algorithms and/or epipolar geometric algorithms.
- the computer system is configured for estimating the spatial position of the projector device based on the matches between the primary features and corresponding features of the reference structured light data set.
- the computer system may be configured for performing the at least one determination of the tissue field based on the spatial position of the projector device and the recognized light features by using trigonometric algorithms, e.g. to determine surface topography e.g. comprising height differences and/or other surface shapes.
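With a calibrated camera, one standard realization of such trigonometric processing is a planar perspective-n-point solve on the matched features; the OpenCV sketch below uses placeholder intrinsics and point values and is not asserted to be the patented computation.

```python
import cv2
import numpy as np

# Reference pattern features on their own plane (z = 0), e.g. in millimetres.
object_pts = np.array([[0, 0, 0], [40, 0, 0], [40, 40, 0], [0, 40, 0]],
                      dtype=np.float32)
# The corresponding recognized pixel positions in the acquired frame.
image_pts = np.array([[102, 98], [341, 95], [349, 330], [99, 338]],
                     dtype=np.float32)
# Placeholder camera intrinsics (focal lengths and principal point).
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
# rvec/tvec give the pattern plane's pose in the camera frame; distances and
# surface heights can then be derived by triangulation from such estimates.
```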
- the computer system is configured for performing the at least one determination of the tissue field, wherein the at least one determination comprises a distance between the projector device and the tissue field, a 3D structure of at least a part of the tissue field, and/or a size determination of at least a part of the tissue field, such as a size and/or a volume of an organ section.
- the at least one determination of the tissue field comprises a determination of a position of a nerve and/or a vein e.g. relative to a tool and/or to the projector.
- the at least one determination of the tissue field comprises a volume determination e.g. of a cancer nodule.
- the determination of the tissue field is advantageously determinations in 3D space - i.e. actual determinations not limited by a point of view.
- a determination of the tissue field may advantageously comprise an actual distance between the projector device and the tissue field (not limited to a view direction, such as a view from the image acquisition device) e.g. such as the closest point of the tissue field and/or a point of the tissue field selected by an operator.
- Where the projector device is fixed to an instrument, such as a minimally invasive surgical instrument for surgical and/or diagnostic use, the instrument may be moved with very high accuracy relative to the tissue field.
- the computer system may be configured for performing the at least one determination of the tissue field based on pixel data having corresponding time attribute.
- the computer system may be configured to supplement with data from set(s) of pixel data having time attribute(s) within e.g. up to 1 s of the data in question, such as within 0.5 s, such as within 0.1 s of the data in question.
- the computer system is configured for performing the at least one determination of the tissue field based on pixel data having two or more different time attributes.
- the computer system may advantageously be configured to display the at least one determination of the tissue field on a display, such as a screen e.g. continuously in real time and/or upon request from an operator.
- the projector of the structured light arrangement and the image acquisition device may be fixed relative to each other or they may be independent. In an embodiment the projector of the structured light arrangement and the image acquisition device are fixed to each other with an angle of up to about 45 degrees.
- the projector of the structured light arrangement and the image acquisition device are independently movable.
- the 3D reconstruction system comprises a sensor arrangement for determining the position and orientation of the image acquisition device e.g. relative to the projector device and/or relative to a robot.
- the computer is configured for calibrating the position and orientation of the image acquisition device relative to the projector device.
- the computer system may thereby acquire data representing the position and orientation of the image acquisition device relative to the projector device.
- the computer system may be configured for further refining the estimation of the spatial position of the projector device relative to the tissue field and/or the determination of the tissue field in 3D space.
- the 3D reconstruction system comprises a sensor arrangement for determining the position and orientation of the projector device e.g. relative to the image acquisition device and/or relative to a robot.
- the computer system is preferably configured for repeating the above described estimation of the spatial position of the projector device relative to the tissue field and/or the determination of the tissue field in real time, as the computer system acquires the sets of pixel data representing the consecutive images acquired by the image acquisition device, to thereby provide a real time determination of the tissue field.
- the 3D reconstruction system may be applied in surgery to expose pulsating areas of the tissue field and thus warn a surgeon to avoid accidentally cutting into an artery or similar. Also accidents caused by other patient movements may be avoided by using the 3D reconstruction system. It has been found that the 3D reconstruction system may be applied for determining changes of the tissue field in real time, such as movements caused by patient movements, e.g. local movements such as peristaltic movements causing movements of the tissue field. Thus, the 3D reconstruction system may in particular be beneficial for use in surgery e.g. to ensure accurate cutting, avoiding undesired damage of tissue and shortening the time of operation.
- the at least one determination of the tissue field comprises determining a local movement of the tissue field, such as a pulsating movement and/or a movement caused by manipulation of the tissue e.g. caused by an instrument, such as a laparoscope.
- the computer system of the 3D reconstruction system is adapted for controlling movement of one or more instruments.
- the computer system of the 3D reconstruction system is adapted for controlling movement of a robot e.g. a robotic surgeon, preferably connected to or forming part of (integrated with) the 3D reconstruction system.
- the 3D reconstruction system ensures that the operator has a high 3D perception including a perception of sizes and distances.
- the operator may use the 3D reconstruction system for performing volumetric determinations of selected tissue parts, such as nodules, protuberances and thickened tissue parts.
- the computer system is configured for repeating in real time the determination of the tissue field for consecutive sets of pixel data of the received frames.
- the structured light beam may advantageously be substantially constant for each determination of the tissue field.
- In some uses of the 3D reconstruction system it may be desired to change the angle of divergence of the structured light beam and/or to change the structure of the structured light beam, e.g. from one structured light beam to another.
- the computer system may be configured for changing the structured light beam e.g. upon receipt of an instruction from an operator.
- the computer system is configured for running a routine in real time comprising repeating steps i-iii, i. calculating a spatial position of the projector device from one or more sets of pixel data having corresponding and/or subsequent time attribute(s)
- the computer system is configured for running a routine in real time comprising repeating steps i-iii, i. calculating a spatial position of the projector device from one or more sets of pixel data having corresponding time attribute
- the 3D reconstruction system or parts thereof may be mounted to or incorporated in one or more surgical and/or diagnostic instruments for optimizing movements of such instruments during a diagnostic procedure and/or a surgical procedure.
- At least the projector device of the structured light arrangement is mounted to or integrated with a minimally surgical instrument, such as a minimally surgical instrument selected from a penetrator, an endoscope, an ultrasound transducer instrument or an invasive surgical instrument comprising a grasper, a suture grasper, a stapler, forceps, a dissector, hooks, scissors, suction instrument, clamp instrument, electrode, curette, ablators, scalpels, a biopsy and retractor instrument, a trocar or any combination comprising one or more of the abovementioned.
- At least image acquisition unit(s) of the image acquisition device is mounted to or integrated with a minimally surgical instrument, such as a minimally surgical instrument selected from a penetrator, an endoscope, an ultrasound transducer instrument or an invasive surgical instrument comprising a grasper, a suture grasper, a stapler, forceps, a dissector, hooks, scissors, suction instrument, clamp instrument, electrode, curette, ablators, scalpels, a biopsy and retractor instrument, a trocar or any combination comprising one or more of the abovementioned.
- At least the projector device of the structured light arrangement and at least the image acquisition unit(s) of the image acquisition device is mounted to or integrated with separate minimally surgical instruments, such as minimally surgical instruments independently of each other selected from a penetrator, an endoscope, an ultrasound transducer instrument or an invasive surgical instrument comprising a grasper, a suture grasper, a stapler, forceps, a dissector, hooks, scissors, suction instrument, clamp instrument, electrode, curette, ablators, scalpels, a biopsy and retractor instrument, a trocar or any combination comprising one or more of the abovementioned.
- the minimally surgical instrument may be a rigid or a bendable minimally surgical instrument.
- the minimally surgical instrument comprises an articulating length section e.g. a distal length section.
- the structured light arrangement and the ultrasound transducer instrument is preferably in the form of a structured light ultrasound instrument as described below.
- the structured light arrangement may be as the projector probe disclosed in the co-pending application DK PA 2016 71005.
- the structured light arrangement comprises a light source optically connected to the projector device.
- the light source may in principle be any kind of light source.
- the light source may be a coherent light source or an incoherent light source.
- Examples of light sources include a semiconductor light source, such as a laser diode and/or a VCSEL light source as well as any kind of laser sources including narrow bandwidth sources and broad band sources.
- the light source comprises a laser light source, such as a laser diode or a fibre laser.
- the bandwidth of light is determined as the full width at half maximum (FWHM) unless otherwise specified or clear from the context.
- the light source is a fibre laser and/or a semiconductor laser, the light source preferably comprises a VCSEL or a light emitting diode (LED).
- the light source is adapted for emitting modulated light, such as pulsed or continuous-wave (CW) modulated light, preferably with a frequency of at least about 200 Hz, such as at least about 100 kHz, such as at least about 1 MHz, such as at least about 20 MHz, such as up to about 200 MHz or more.
- the wavelength or wavelengths may in principle comprise any wavelengths, such as from the low UV light to high IR light e.g. up to 3 μm or larger.
- the wavelength(s) of the light source for forming the structured light beam is invisible to the human eye.
- the light source is configured for emitting at least one electromagnetic wavelength within the UV range of from about 10 nm to about 400 nm, such as from about 200 to about 400 nm. In an embodiment the light source is configured for emitting at least one electromagnetic wavelength within the visible range of from about 400 nm to about 700 nm, such as from about 500 to about 600 nm. In an embodiment the light source is configured for emitting at least one electromagnetic wavelength within the IR range of from about 700 nm to about 1 mm, such as from about 800 to about 2500 nm.
- the bandwidth of the light source may be narrow or wide; however, often it is desired to use a relatively narrow bandwidth for cost reasons and optionally for allowing distinguishing between light emitted from or projected from different elements e.g. from a 3D reconstruction system of an
- the light source has a bandwidth of up to about 50 nm, such as from 1 nm to about 40 nm.
- the light source has a bandwidth which is larger than about 50 nm, such as a supercontinuum bandwidth spanning over at least about 100 nm, such as at least about 500 nm.
- the structured light arrangement may comprise two or more light sources, such as two LEDs having different wavelengths.
- the 3D reconstruction system comprises two or more structured light arrangements.
- the two or more structured light arrangements may be adapted to operate simultaneously, independently of each other or asynchronously.
- the two or more structured light arrangements are adapted to operate independently of each other.
- the two or more structured light arrangements are adapted to operate asynchronously.
- the two or more structured light arrangements may comprise respective light sources that differ from each other e.g. with respect to intensity and/or wavelength(s).
- At least one light source of the 3D reconstruction system is an IR (infrared) light containing light source comprising light waves in the interval of from about 0.7 μm to about 4 μm, such as below 2 μm. It has been found that using IR light may provide a very effective system for determining sub tissue surface structures such as a vein. By identifying the subsurface position of for example a vein or another critical structure the surgeon may ensure not to damage such structure e.g. during a surgical intervention at the tissue field.
- the two or more light sources are pulsed asynchronously, preferably such that they do not have temporally overlapping pulse durations.
- the computer system is configured for determining one or more properties of a target site in the tissue field based on the wavelength of light reflected from the target site.
- the computer system may comprise or be in communication with a spectroscope, such as a digital spectroscope for recognizing wavelengths in the reflected light.
- the spectroscope is an IR spectroscope.
- the spectroscope may e.g. form part of the image acquisition device or it may be an independent spectroscope, such as a spectroscope comprising an IR transmitter and a spectroscopic sensor.
- Based on the reflected light, certain properties of the tissue may be determined. This can for example be the oxygen level in the tissue and changes thereof, and the type of tissue.
- the reflected light can be used to determine what kind of organ the tissue is part of, which indicates to the surgeon which organs are which and thereby assists the surgeon in finding an area of interest.
- the computer system is adapted to determine the oxygen level of a tissue site, changes thereof and the type of tissue at the tissue site, where the tissue site may be the entire tissue field, an organ at the tissue field, a section of the tissue field and/or a tissue structure or another structure at a preselected depth of the tissue site, such as a sub tissue surface vein.
- the tissue site may e.g. be a target site for the surgeon.
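For concreteness, a classic two-wavelength ratio (as used in pulse oximetry) is sketched below as one conceivable wavelength-based oxygenation index; the wavelengths and the linear calibration are well-known textbook assumptions, not parameters of this system.

```python
def oxygen_ratio(refl_660nm, refl_940nm):
    """Red/IR reflectance ratio; shifts with oxy-/deoxyhemoglobin balance."""
    return refl_660nm / refl_940nm

def estimated_spo2(ratio, a=110.0, b=25.0):
    """Hypothetical linear calibration SpO2 ~= a - b * ratio, clamped."""
    return max(0.0, min(100.0, a - b * ratio))

print(estimated_spo2(oxygen_ratio(0.30, 0.55)))  # illustrative value only
```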
- the at least one light source may preferably be wavelength tunable.
- the wavelength(s) of the light source may for example be selectable by the computer system and/or the surgeon. In an embodiment the wavelength(s) of the light source is selectable based on a feedback signal from the computer system.
- the computer system is configured for determining a boundary about a target site having at least one different property than the tissue surrounding the target site.
- the computer system may be configured for determining a size of the target site based on the determined boundary, such as a periphery, an area or preferably a volume.
- the computer system is configured for performing frame stitching comprising stitching at least two sets of pixel data of the frames comprising reflections of the structured light beam from the tissue field.
- the stitched set of pixel data preferably comprises a stitched image data set representing a larger tissue field than each set of pixel data.
- the frame stitching comprises stitching sets of pixel data associated with different time attributes.
- the different time attributes are preferably consecutive time attributes.
- the computer system is configured for continuously stitching the frames received in real time to the stitched image data set.
- the computer system may be configured for unstitching and/or removing pixel data, e.g. by removing pixel data having a time attribute older than a preselected time and/or by removing pixel data from the stitched image data set where the pixel data represents a site of the larger tissue field having a distance to a target site and/or a center site which is larger than a preselected distance.
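The unstitching rule just described can be pictured as the following sketch over an assumed record structure, dropping stitched pixels whose time attribute is too old or which lie too far from a target site:

```python
import numpy as np

def prune_stitched(records, now, max_age=5.0, target=(0.0, 0.0), max_dist=200.0):
    """records: dicts like {'t': timestamp, 'xy': (x, y), 'value': ...}."""
    kept = []
    for r in records:
        too_old = (now - r['t']) > max_age
        too_far = np.linalg.norm(np.subtract(r['xy'], target)) > max_dist
        if not (too_old or too_far):
            kept.append(r)
    return kept
```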
- the 3D reconstruction system is configured for performing a plurality of topological determinations of the tissue field and the computer system is configured for performing topological stitching comprising stitching at least two of the topological determinations, such as from 3 to 100 of the topological determinations, such as from 5 to 50 of the topological determinations.
- the topological determination may e.g. comprise determining a plurality of points of the tissue field in 3D, for example comprising the spatial relation between the points, to obtain a point cloud, and the topological stitching may comprise stitching point clouds of the topological determinations to a super point cloud comprising the point clouds spatially combined with each other to represent a larger and/or refined topological determination of the tissue field.
- the computer system may be configured to perform further 3D determinations, such as volume determinations from the super point cloud.
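Assuming each topological determination yields a point cloud together with a known rigid pose, a minimal sketch of the stitching to a super point cloud and of a coarse volume determination (here simplified to a convex-hull volume via SciPy) is:

```python
import numpy as np
from scipy.spatial import ConvexHull

def stitch_point_clouds(clouds_with_poses):
    """clouds_with_poses: iterable of (points Nx3, R 3x3, t length-3)."""
    parts = [pts @ R.T + t for pts, R, t in clouds_with_poses]
    return np.vstack(parts)  # the spatially combined super point cloud

def hull_volume(super_cloud):
    """Coarse volume of the region spanned by the super point cloud."""
    return ConvexHull(super_cloud).volume
```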
- the computer system is configured for performing topological stitching of a plurality of topological determinations obtained from consecutively acquired frames, preferably such that the frames comprise frames obtained with the projector and/or the image acquisition device at different positions and/or angles relative to the tissue field.
- the invention also comprises a method for performing a determination of a tissue field.
- the method comprises:
- performing at least one determination of the tissue field based on the spatial position of the projector device and the recognized light features.
- Preferred embodiments of the method comprise the methods that the computer system of the 3D reconstruction system is configured to perform as described above.
- the invention also comprises a robot comprising the 3D reconstruction system as described above.
- the robot is advantageously a surgery robot e.g. for performing minimally invasive surgery.
- the projector device of the structured light arrangement and the image acquisition device are disposed on individually movable arms of the robot.
- the projector device and/or the image acquisition device may e.g. be disposed on respective surgical instrument e.g. held by one of the individually movable arms of the robot.
- the "disposed" as used in the explanation that the projector device and the image acquisition device are disposed on respective arms of the robot is used to mean that the projector device/image acquisition device may be integrated with, mounted to, held by, instated through or in other way engaged with the robot arm(s).
- the robot has a controller processing system which comprises or is in data communication with the computer system of the 3D reconstruction system.
- the computer system comprises a feedback algorithm for controlling movements of at least one of the individually movable arms of the robot in response to the determinations of the tissue field.
- the 3D reconstruction system comprises a sensor arrangement for determining the position and orientation of the image acquisition device relative to the projector device and/or relative to a location of the robot e.g. a location of a robot arm.
- the robot may e.g. be as described in WO16057980, W013116869 and/or in US213030571 with the difference that the robot comprises the 3D reconstruction system.
- the robot comprises two or more robot arms e.g. as a robot of the ALF-X system or the SurgiBot system as marketed and disclosed by TransEnterix, Inc., where the projector device of the structured light arrangement and the image acquisition device are disposed on individually movable arms.
- the ALF-X System robot has been granted a CE Mark in Europe for use in abdominal and pelvic surgery and comprises a multi-port robotic surgery robot which allows up to four arms to control robotic instruments and a camera.
- the projector device and the image acquisition device may be disposed on any of these robot arms.
- the SurgiBot System robot is a single-incision, patient-side robotic-assisted surgery system and comprises a robot with a number of flexible, articulating robot arms held together in a single collar, such that the robot arms with instruments may be introduced through the collar via a single incision for performing minimally invasive surgery within a cavity. All features of the inventions and embodiments of the invention as described herein, including ranges and preferred ranges, may be combined in various ways within the scope of the invention, unless there are specific reasons not to combine such features.
- the invention also relates to a fiber optic probe instrument assembly suitable for use in minimally invasive surgery.
- the fiber optic probe instrument assembly comprises a fiber optic probe and a minimally surgical instrument.
- the minimally surgical instrument may for example be as described above.
- the minimally surgical instrument is advantageously selected from a penetrator, an endoscope, an ultrasound transducer instrument or an invasive surgical instrument comprising a grasper, a suture grasper, a stapler, forceps, a dissector, hooks, scissors, suction instrument, clamp instrument, electrode, curette, ablators, scalpels, a biopsy and retractor instrument, a trocar or any combination comprising one or more of the abovementioned minimally surgical instruments.
- the fiber optic probe comprises a structured light generating and projecting device (generally called structured light device), a bundle of optical fibers and a projector device.
- the structured light device is configured for generating a structured light.
- the structured light device may in principle have any size, because the structured light device is not adapted to be near the surgical site e.g. it is not adapted for being inserted into any natural or artificial cavities of a human or animal patient subjected to surgery.
- the structured light generated by the structured light device may e.g. be as described above. It should be observed that the structured light generated by the structured light device may have a relatively large cross-sectional area compared to the cross-sectional area of the structured light delivered to and emitted by the projector device as the structured light beam.
- the structured light generated by the structured light device is generated by a pixel based image projector, where each fiber input end is arranged to receive light/no light from one or more pixels.
- the pattern may be a dynamic pattern which may be changed dynamically or in desired steps.
- the structured light generated by the structured light device may for example include a structure of wavelength variations and/or intensity variation over the cross-section of the structured light.
- the cross-sectional light structure of the structured light generated by the structured light device comprises optically distinguished areas, such as a pattern of areas of light and areas of no-light and/or areas of light of a first quality of a character and areas of light of a second quality of the character, wherein the character advantageously is selected from light intensity, wavelength and/or range of wavelengths.
- the structured light generated by the structured light device may for example be a pattern of light of a certain wavelength range with intermediate areas of no light or areas of light with a narrower range of wavelengths.
- the pattern may e.g. be strips, cross hatched lines or any other lines, or shapes.
- the bundle of fibers has a light receiving end and a light emitting end and is arranged for receiving at least a portion of the structured light from the structured light generating device at its light receiving end and for delivering at least a portion of the light to the projector device.
- the fiber bundle advantageously comprises at least 10 optical fibers, such as at least 50 optical fibers, such as from about 100 to about 2000 optical fibers, such as from about 200 to about 1000 optical fibers.
- the optical fibers are advantageously very thin and closely packed, such that the total cross-sectional area of the fiber bundle at least at a portion of its length nearest the light emitting end is sufficiently narrow for being inserted into any natural or artificial cavities of a human or animal patient subjected to surgery.
- the total cross-sectional area of the fiber bundle corresponds to or is smaller than the projecting area of the projector device.
- the fibers of the fiber bundle may be identical or they may differ, e.g. where the structured light comprises a structuring of different wavelengths.
- the fibers of the fiber bundle are substantially identical.
- fibers of the fiber bundle are partly or fully fused over at least a portion of their length nearest the light emitting end to ensure a fixed relative location of the fiber ends.
- At least the projector device is mounted to or integrated with the minimally surgical instrument.
- the projector device is configured to project the structured light as a structured light beam onto at least a section of a tissue field.
- the light receiving end of the fiber bundle is operatively coupled to the structured light device for receiving at least a part of the structured light from the structured light device.
- the operatively coupling may include one or more lenses and/or objectives e.g. for focusing the structured light to be received by the light receiving end of the fiber bundle.
- the fiber optic probe instrument assembly comprises one or more lenses and/or objectives arranged between the light emitting end of the bundle of fibers and the projector, preferably the projector comprises a micro lens.
- the light emitting ends of the fibers are arranged in an encasing to thereby form a probe-head comprising the projector device.
- the probe-head may comprise one or more lenses for ensuring a desired projection of the structured light.
- the projector is formed by a protecting coating or cover for the emitting end of the fiber bundle, preferably forming part of the encasing.
- the light emitting end of the bundle of fibers are arranged in an encasing to form a probe-head comprising the projector device, preferably the probe-head comprises one or more lenses, such as micro lenses.
- the probe-head may advantageously have a maximal cross-sectional diameter of up to about 1 cm, such as up to about 8 mm, such as up to about 6 mm, thereby ensuring that the probe-head may be inserted, together with a distal end portion of the minimally surgical instrument to which it is mounted or integrated, into a natural or artificial cavity of a human or animal patient subjected to surgery.
- the projector device and/or probe-head is advantageously mounted to the minimally surgical instrument at or near the distal end portion of the minimally surgical instrument, to ensure that the projector device may be inserted, together with the distal end portion of the minimally surgical instrument to which it is mounted or integrated, into a natural or artificial cavity of a human or animal patient subjected to surgery.
- the minimally surgical instrument has a distal portion that may be articulated at an articulating length section thereof.
- the projector device and/or probe-head is advantageously mounted to the minimally surgical instrument at the distal end portion at a location at or closer to the distal end of the minimally surgical instrument than the articulating length section.
- the position of the distal end of the minimally surgical instrument relative to the projector device may be determined by a computer system which comprises data representing the articulating state of the articulating length section.
- the invention also relates to a structured light ultrasound instrument.
- the structured light ultrasound instrument comprises an ultrasound transducer instrument and a structured light arrangement, wherein the structured light arrangement comprises a projector device for projecting a structured light beam to a tissue field, such as a tissue field within a natural or artificial body cavity.
- the structured light arrangement may be as described above.
- At least the projector device is mounted to or integrated with the ultrasound transducer instrument.
- the ultrasound transducer instrument may be any ultrasound transducer instrument for imaging before or during surgery.
- Such prior art ultrasound transducer instruments are for example marketed by BK Ultrasound.
- With such prior art instruments it has been a major problem to navigate the ultrasound transducer instrument relative to the surface of the tissue that is scanned by the ultrasound transducer instrument and, in particular, it has been very difficult to match the obtained ultrasound images with actual images of the tissue surface.
- While the prior art ultrasound transducer instruments have been capable of identifying damaged or malignant tissue areas, it has been difficult for the surgeon to actually find the exact location of the damaged or malignant tissue areas.
- With the structured light ultrasound instrument of the invention it is now possible to obtain an improved correlation between an ultrasound image and a surface image of a tissue area and surface.
- the structured light ultrasound instrument may advantageously form part of the above described 3D reconstruction system.
- the ultrasound transducer instrument has a distal portion with a distal end and an ultrasound head located at the distal end.
- the ultrasound head preferably has a maximal cross-sectional diameter of up to about 2 cm, such as up to about 1.5 cm, such as up to about 1 cm, such as up to about 8 mm, such as up to about 6 mm.
- the ultrasound transducer instrument may be suitable for use in artificial or natural openings of a human or animal.
- the ultrasound transducer instrument has an articulating length section at the distal portion, the articulating length section is preferably arranged proximally to the ultrasound head.
- To ensure that the projected light beam reaches the tissue surface, it is desired that the projector device is located at the distal portion of the ultrasound transducer instrument.
- the projector device is located distally to the articulating length section. In another embodiment the projector device is located proximally to the articulating length section.
- the invention also comprises a minimally invasive surgery navigation system suitable for ensuring a desired and improved navigation of an ultrasound transducer instrument during minimally invasive surgery.
- the minimally invasive surgery navigation system comprises a structured light ultrasound instrument as described above, an endoscope and a computer system.
- the endoscope comprises an image acquisition device configured for recording data representing reflected rays from the emitted pattern and for transmitting data representing the rays reflected from a surface section of a tissue field to the computer system.
- the image acquisition device may be as described above.
- the computer system is configured
- the computer system may be as described above wherein the computer is further configured for
- the minimally invasive surgery navigation system and the 3D reconstruction system form a combined system.
- the surface data is 2D surface data e.g. a simple surface image.
- the surface data is 3D data e.g. determined by the 3D reconstruction system.
- the minimally invasive surgery navigation system is preferably configured for operating in real time.
- a timely associated 2D image of the surface section of the surgical field may be used to confirm or improve the correlation of the ultrasound image to the 2D and/or 3D surface data.
- the timely associated 2D image may for example be a frame acquired by the image acquisition device with corresponding time attribute as the set(s) of data representing reflected rays.
- the light emitting end of the bundle of fibers are arranged in an encasing to form a probe-head comprising the projector device, preferably the probe-head comprises one or more lenses, such as micro lenses.
- the probe-head may advantageously have a maximal cross-sectional diameter of up to about 1 cm, such as up to about 8 mm, such as up to about 6 mm. Thereby ensuring that the probe-head may be inserted together with a distal end portion of the minimally surgical instrument to which it is mounted or integrated into a natural or artificial cavity of a human or animal patient subjected to surgery.
- the projector device and/or probe-head is advantageously mounted to the minimally surgical instrument at the distal end portion of the minimally surgical instrument or near the distal end, to ensure that the projector device may be inserted into a natural or artificial cavity together with the distal end portion of the minimally surgical instrument to which it is mounted or integrated of a human or animal patient subjected to surgery.
- the projector device and/or probe-head is advantageously mounted to the minimally surgical instrument at the distal end portion at a location at or closer to the distal end of the minimally surgical instrument than the articulating length section.
- the relative position between the distal end of the minimally surgery relative to the projector device may be determined by a computer system which comprises data representing the articulating state of the articulating length section.
- the invention also relates to a structured light ultrasound instrument.
- the structured light ultrasound instrument comprises an ultrasound transducer instrument and a structured light arrangement, wherein the structured light arrangement comprises a projector device for projecting a structured light beam to a tissue field, such as a tissue field within a natural or artificial body cavity.
- the structured light arrangement may be as described above.
- At least the projector device is mounted to or integrated with the ultrasound transducer instrument.
- ultrasound transducer instruments are known for imaging before or during surgery.
- Such prior art ultrasound transducer instruments are for example marketed by BK Ultrasound.
- With such prior art instruments it has been a major problem to navigate the ultrasound transducer instrument relative to the surface of the tissue that is scanned by the ultrasound transducer instrument, and in particular it has been very difficult to match the obtained ultrasound images with actual images of the tissue surface.
- While the prior art ultrasound transducer instruments have been capable of identifying damaged or malignant tissue areas, it has been difficult for the surgeon to actually find the exact location of the damaged or malignant tissue areas.
- With the structured light ultrasound instrument of the invention it is now possible to obtain an improved correlation between an ultrasound image and a surface image of a tissue area and surface.
- the structured light ultrasound instrument may advantageously form part of the above described 3D reconstruction system.
- the ultrasound transducer instrument has a distal portion with a distal end and an ultrasound head located at the distal end.
- the ultrasound transducer instrument has an articulating length section at the distal portion; the articulating length section is preferably arranged proximally to the ultrasound head. To ensure that the projected light beam reaches the tissue surface, it is desired that the projector device is located at the distal portion of the ultrasound transducer instrument.
- the projector device is located distally to the articulating length section.
- the projector device is located proximally to the articulating length section.
- FIG. 1 is a schematic illustration of an embodiment of a 3D reconstruction system of the invention in use for performing a 3D determination of a tissue field.
- Figure 2 is a schematic illustration of an embodiment of a 3D reconstruction system of the invention in use for performing a 3D determination of a tissue field in a minimally invasive surgical cavity.
- Figure 3 is a schematic illustration of another embodiment of a 3D reconstruction system of the invention.
- Figure 4 illustrates an example of a flow chart of data processing of a 3D reconstruction system of an embodiment of the invention.
- Figure 5 illustrates another example of a flow chart of data processing of a 3D reconstruction system of an embodiment of the invention.
- Figure 6 illustrates a further example of a flow chart of data processing of a 3D reconstruction system of an embodiment of the invention.
- Figure 7 is a schematic illustration of an image of a tissue field reflecting a structured light pattern projected from a projector device relative to a reference structured light data set.
- Figure 8 illustrates a method where a 3D reconstruction system is performing a volumetric determination of a tissue field.
- Figure 9 illustrates a method where a 3D reconstruction system is performing a size determination of a tissue field.
- Figure 10 illustrates an example of a structured light.
- Figure 10a illustrates examples of light features of the structured light of figure 10.
- Figure 11 illustrates another example of a structured light.
- Figure 11a illustrates examples of light features of the structured light of figure 11.
- Figure 12 illustrates a further example of a structured light.
- Figure 12a illustrates examples of light features of the structured light of Figure 12.
- Figures 13a, 13b and 13c illustrate further examples of light features.
- Figures 14a, 14b and 14c illustrate further examples of structured light.
- Figure 15 is a schematic illustration of stereo image feature matching.
- Figure 16 is a schematic view of a portion of a penetrator for use in minimally invasive surgery and where a projector device of a 3D reconstruction system of an embodiment of the invention is disposed at a tip of the penetrator.
- Figures 17a and 17b illustrate a part of a penetrator member with a projector device of a 3D reconstruction system of an embodiment of the invention disposed near the tip of the penetrator, wherein the projector device has a first folded position and a second unfolded/pivoted position.
- Figure 18 is a schematic illustration of a structured light arrangement comprising a light source - waveguide - optical projector and focusing lens assembly suitable for forming part of an embodiment of 3D reconstruction system of the invention.
- Figure 19 is a schematic illustration of a projector probe which may form part of a structured light arrangement of an embodiment of the 3D reconstruction system of the invention.
- Figure 20 is a schematic illustration of a beam expanding lens arrangement which may form part of a structured light arrangement of an embodiment of the 3D reconstruction system of the invention.
- Figure 21 is a schematic illustration of a robot comprising a 3D reconstruction system of an embodiment of the invention.
- Figures 22a-22c illustrate examples of image acquisition devices suitable for a 3D reconstruction system of embodiments of the invention.
- Figure 23 is a schematic illustration of a fiber optic probe.
- Figure 24 is a schematic illustration of a fiber optic probe instrument assembly of an embodiment of the invention comprising the fiber optic probe of figure 23.
- Figures 25a-25d illustrate cross-sectional views of examples of fiber bundles.
- Figure 26 is a schematic illustration of a distal portion of a fiber optic probe instrument assembly of an embodiment of the invention comprising the fiber optic probe of figure 23.
- Figures 27-33 illustrate a structured light ultrasound instrument and a minimally invasive surgery navigation system in use during minimally invasive surgery.
- the 3D reconstruction system illustrated in figure 1 comprises a structured light arrangement 1 with a projector device 1a configured to project a structured light beam onto at least a section of a tissue field 3.
- the 3D reconstruction system also comprises an image acquisition device 2
- the image acquisition device 2 comprises a stereo camera 2a for acquiring digital images (frames). Each frame comprises a set of pixel data and the set of pixel data is associated with a time attribute by the image acquisition device or by the computer system 6, which also forms part of the 3D reconstruction system. As illustrated with the waves W, the structured light arrangement 1 and the image acquisition device are both in data communication with the computer system.
- the image acquisition device is configured for transmitting the acquired frames - in the form of sets of pixel data - in real time to the computer system 6 and the computer system is configured for receiving the frames in real time - in the form of sets of pixel data.
- the computer system 6 is configured for receiving data from the structured light arrangement 1 representing the projected structured light beam. These data are stored in a memory of the computer system 6 as a reference structured light data set.
- the computer system 6 is further configured for
• recognizing a plurality of light features, including a plurality of primary light features, from the received set(s) of pixel data having corresponding time attribute and optionally from previous set(s) of pixel data,
• estimating the spatial position of the projector device relative to the tissue field, and
• performing at least one determination of the tissue field 3 based on the spatial position of the projector device and the recognized light features.
- the tissue field may be rather curved, and the 3D reconstruction system may e.g. be configured for determining the shortest Euclidean distance between a not shown instrument and the tissue field, i.e. the actual distance irrespective of the point of view.
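A minimal sketch of such a shortest-distance query, assuming the tissue field has already been reconstructed as a 3D point cloud and using a k-d tree for the nearest-neighbour search (an implementation choice for the sketch, not mandated by the disclosure):

```python
import numpy as np
from scipy.spatial import cKDTree

def shortest_distance(instrument_tip, tissue_points):
    """Shortest Euclidean distance from an instrument tip (3-vector) to a
    reconstructed tissue surface given as an (N, 3) point cloud; this is
    the actual distance irrespective of the point of view."""
    tree = cKDTree(tissue_points)
    dist, idx = tree.query(instrument_tip)
    return dist, tissue_points[idx]

# Example with a synthetic, strongly curved tissue patch
xx, yy = np.meshgrid(np.linspace(-0.05, 0.05, 50), np.linspace(-0.05, 0.05, 50))
zz = 0.01 * np.sin(40.0 * xx)  # curvature makes line-of-sight distance misleading
cloud = np.column_stack([xx.ravel(), yy.ravel(), zz.ravel()])
d, closest = shortest_distance(np.array([0.0, 0.0, 0.03]), cloud)
```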
- the 3D reconstruction system illustrated in figure 2 is here applied in a minimally invasive surgical procedure and comprises a structured light arrangement 11 with a projector 11a projecting a structured light beam as illustrated with rays 11b onto at least a section of a tissue field 13.
- the 3D reconstruction system also comprises an image acquisition device 12 configured for acquiring frames comprising reflections 15 of the structured light beam from the tissue field 13.
- the image acquisition device 12 comprises a not shown camera, also referred to as an image acquisition unit.
- the image acquisition device 12 is wired to the computer 16a of the computer system for transmitting in real time sets of pixel data representing acquired frames.
- the structured light arrangement 11 is wired to the computer 16a of the computer system for transmitting data representing the projected structured light beam - the data represent at least a set of light features of the projected structured light beam.
- a tip portion comprising the projector device 11a of the structured light arrangement 11 is inserted through the skin 10 of a patient into the minimally invasive surgical cavity to project the rays 11b onto an intestine area I of the tissue field.
- a portion comprising the camera 12a of the image acquisition device 12 is inserted through the skin 10 of the patient into the minimally invasive surgical cavity via a cannula port 12c to acquire the frames of the reflected structured light 15.
- the computer system comprises a display (a screen) 16b and the computer 16a is configured for
- the computer system 16a, 16b may also be configured for controlling the operation of the image acquisition device 12 and/or the structured light arrangement 11.
- the 3D reconstruction system illustrated in figure 3 comprises a structured light arrangement 2, with a portion comprising a projector device inserted through the skin 20 of a patient into a minimally invasive surgical cavity to project the rays onto the tissue field.
- the 3D reconstruction system also comprises an image acquisition device 22 partly inserted into the minimally invasive surgical cavity and configured for acquiring frames comprising reflections 25 of the structured light beam from the tissue field.
- the image acquisition device 22 comprises a camera for acquiring digital frames which are transmitted to a not shown computer system of the 3D reconstruction system.
- the computer system is configured for
• recognizing a plurality of light features, including a plurality of primary light features, from the received set(s) of pixel data having corresponding time attribute and optionally from previous set(s) of pixel data, and matching the recognized primary light features with corresponding light features of the projected structured light beam,
- the 3D determination may e.g. be a 3D reconstruction, e.g. topological reconstruction, determining an augmented reality view of the tissue field, performing volumetric measures, tracking an instrument relative to the tissue field, etc.
- the computer system may receive pre-operation data and/or intra-operation data which may e.g. be used for refining the recognition step, the matching step, the estimation of projector device spatial position and/or the 3D determination.
- the flow chart of figure 4 illustrates a process scheme of data processing steps which the computer system may be configured to perform.
- step 4a) "image capture", the computer system receives a set of pixel data representing an acquired frame and with an associated time attribute.
- the computer system may store previous set(s) of pixel data (set(s) of pixel data representing previously acquired frames with a time attribute representing an earlier point in time).
- step 4b) "Recognizing light features"
- the computer system searches for light features in the set(s) of pixel data.
- step 4c) "Selecting primary light features" the computer system selects light features which are qualified for being used primary light features e.g. in respect to one or more thresholds and preferably light features with at least an orientation attribute, a position attribute or a combination thereof.
- the selected light features are deemed to be primary light features.
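Purely as an illustration of steps 4b) and 4c), recognition of dot-type light features (cf. the dot pattern of figure 12) could be sketched with OpenCV's blob detector; the detector choice, its parameters and the area criterion for qualifying primary features are assumptions of this sketch, not part of the disclosure.

```python
import cv2
import numpy as np

def detect_dot_features(frame_bgr, min_area=5.0):
    """Detect bright dot-like light features in a frame and return their
    pixel positions; blobs below min_area are rejected as noise and are
    thus not qualified as primary light features."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = min_area
    params.filterByColor = True
    params.blobColor = 255  # bright blobs: reflected structured-light dots
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)
```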
- step 4d) "Match primary light features", the computer system matches the primary light features with corresponding light features of the reference structured light data set.
- step 4e) "Estimating spatially position of the projector device” the computer system is estimating the spatial position of the projector device using the best match of features.
- the computer system may be configured for applying an iterative estimation procedure to find the estimation where most of the matched features are valid. Thereby primary features reflected from very curved areas of the tissue field may be ignored for the estimation of the spatial position of the projector device.
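In spirit this iterative procedure resembles a robust consensus (RANSAC-style) estimation. A generic sketch is given below; the minimal-set pose solver and the residual function are deliberately left as user-supplied callables, since the patent does not prescribe a particular solver.

```python
import numpy as np

def ransac_pose(matches, solve_pose, residual, n_min=4, iters=200, thresh=2.0):
    """Repeatedly fit a projector pose from a minimal subset of feature
    matches and keep the hypothesis with most inliers, so that matches
    reflected from strongly curved tissue areas are ignored as outliers."""
    rng = np.random.default_rng()
    best_pose, best_inliers = None, np.zeros(len(matches), dtype=bool)
    for _ in range(iters):
        subset = rng.choice(len(matches), size=n_min, replace=False)
        pose = solve_pose([matches[i] for i in subset])
        if pose is None:
            continue  # degenerate minimal set
        errors = np.array([residual(pose, m) for m in matches])
        inliers = errors < thresh
        if inliers.sum() > best_inliers.sum():
            best_pose, best_inliers = pose, inliers
    if best_pose is not None and best_inliers.any():
        # final refinement on all inliers
        best_pose = solve_pose([m for m, ok in zip(matches, best_inliers) if ok])
    return best_pose, best_inliers
```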
- step 4f) "Estimating location of features on tissue field”, the computer system is estimating the location of recognized light features (e.g. including primary light features) on the tissue field inclusive the spatial location relative to the now estimated spatial position of the projector device.
- step 4g) "Calculate tissue field map", the computer system calculates topological data (3D data) of the tissue field.
- the one or more steps may be performed iteratively. Further, the computer system may apply additional steps, such as data rectifying steps, outlier removal steps, epipolar matching where a stereo camera has been used, etc.
- the flow chart of figure 5 illustrates a process scheme of another example of data processing steps which the computer system may be configured to perform.
- step 5a) "image capture", the computer system receives a set of pixel data representing an acquired frame and with an associated time attribute.
- the computer system may store previous sets of pixel data (sets of pixel data representing previously acquired frames with a time attribute representing an earlier point in time).
- step 5b) "Image rectifying", the computer system rectifies the set(s) of pixel data by subjecting the data to an error and correction procedure e.g. as described above.
- step 5c) "Detect features” the computer system is recognizing light features e.g. by searching for light features in the set(s) of pixel data. The computer system further selects which are qualified for being used a primary light features e.g. in respect to one or more thresholds and preferably light features with at least an orientation attribute, a position attribute or a combination thereof. The selected light features are deemed to be primary light features.
- step 5d) "Match features" the computer system matches the primary light features with corresponding light features of the reference structured light data set.
- Steps 5c) and 5d) may for example include the steps i-iv of extracting light features, e.g. represented by local pattern fractions such as corners, corner connections, square arrangements and any other fractions of the structured light, e.g. as described above, and matching these light features in an iterative process.
- step 5e) "Outlier removal and rough pose & orientation estimation"
- the computer system performs a rough estimation of the spatial position of the projector device using the match of features, refines the estimation, removes outlier data and repeats as many times as required, e.g. until further modifications are below a preselected threshold.
- Step 5f) "Refine pose and orientation" is a continuation of step 5e) to perform the final estimation of the spatial position of the projector device.
- step 5g) "Dense 3D reconstruction"
- the computer system estimates the location of recognized light features on the tissue field, including the spatial location relative to the now estimated spatial position of the projector device, and calculates topological data (3D data) of the tissue field.
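For the dense reconstruction, each matched feature defines one ray from the camera and one ray from the projector, whose pose is now known; a standard closest-point triangulation then recovers the feature's 3D location. The sketch below is one conventional formulation, not necessarily the computation used in the patent.

```python
import numpy as np

def triangulate_midpoint(o_c, d_c, o_p, d_p):
    """Triangulate one light feature as the midpoint of the shortest segment
    between the camera ray (origin o_c, unit direction d_c) and the projector
    ray (o_p, d_p), all expressed in the same coordinate frame."""
    w = o_c - o_p
    a, b, c = d_c @ d_c, d_c @ d_p, d_p @ d_p
    d, e = d_c @ w, d_p @ w
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return None  # near-parallel rays: depth is unreliable
    s = (b * e - c * d) / denom  # parameter along the camera ray
    t = (a * e - b * d) / denom  # parameter along the projector ray
    return 0.5 * ((o_c + s * d_c) + (o_p + t * d_p))
```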
- the flow chart of figure 6 illustrates a process scheme of a further example of data processing steps which the computer system may be configured to perform.
- step 6a) "capture stereo image"
- the computer system receives sets of pixel data representing stereo-acquired frames having corresponding associated time attributes.
- step 6b) "Recognizing light features from each image”
- the computer system searches for corresponding light features in each of the sets of pixel data and selects corresponding primary light features.
- step 6c) "Matching features to reference light data”
- the computer system matches the primary light features from one or preferably both sets of pixel data with corresponding light features of the reference structured light data set.
- step 6d) "Epipolar feature matching”, the computer system matches the primary light features and optionally other light features between the sets of pixel data of the stereo frames.
- the match relative to the reference data set and the epipolar matching may be performed as an iterative process.
- the step may
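The epipolar feature matching of step 6d) can be illustrated with the standard epipolar constraint: given the fundamental matrix F between the stereo views, a feature's counterpart in the other image must lie close to the corresponding epipolar line. F, the pixel coordinates and the distance threshold below are assumptions of the sketch.

```python
import numpy as np

def epipolar_match(pts_left, pts_right, F, max_dist=1.5):
    """For each left-image feature, keep the right-image feature closest to
    its epipolar line l = F x (distance in pixels), rejecting candidates
    farther than max_dist from the line."""
    matches = []
    for i, (x, y) in enumerate(pts_left):
        l = F @ np.array([x, y, 1.0])  # epipolar line in the right image
        norm = np.hypot(l[0], l[1])
        dists = np.abs(pts_right @ l[:2] + l[2]) / norm
        j = int(np.argmin(dists))
        if dists[j] < max_dist:
            matches.append((i, j))
    return matches
```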
- step 6e) "Estimating spatially position of the projector device” the computer system estimates the spatial position of the projector device using the best match of features including the epipolar matching.
- the computer system may be configured for applying an iterative estimation procedure to find the estimation where most of the matched features are valid.
- step 6f) "Estimating location of features on tissue field”
- the computer system estimates the location of recognized light features (e.g. including primary light features) on the tissue field, including the spatial location relative to the now estimated spatial position of the projector device.
- the computer system may for example receive and apply pre-operation data and/or intra-operative data in the location estimation to thereby further refine the 3D determination.
- step 6g) "Calculate tissue field map"
- the computer system calculates topological data (3D data) of the tissue field.
- Figure 7 illustrates the matching of light features of data of a set of pixel data representing an image 30 with data of a reference structured light data set representing the structured light 31 as projected by the projector device (the projected structured light beam). As illustrated with the lines L, light features comprising corners, crossing lines, etc. may be matched.
- Figure 8 illustrates that the 3D reconstruction system is performing a volumetric determination of a tissue field based on the light features of the set of pixel data representing an image 30 after having estimated the spatial position of the projector device.
- the computer system can calculate the geometric spatial positions of the light features and form a map M in which formations may be detected, e.g. by fitting the data representing position attributes of light features 32 to a circle as shown. Thereby the size and the volume of a protrusion may be determined.
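The circle fitting mentioned here can be done in closed form; a least-squares (Kåsa) fit over the 2D position attributes of the light features 32 is one standard, illustrative choice.

```python
import numpy as np

def fit_circle(xy):
    """Algebraic (Kasa) least-squares circle fit to (N, 2) feature positions;
    returns centre (cx, cy) and radius r of the detected formation."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x**2 + y**2  # model: x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    coef, _, _, _ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = coef[0] / 2.0, coef[1] / 2.0
    r = np.sqrt(coef[2] + cx**2 + cy**2)
    return cx, cy, r
```

From the fitted radius a size estimate follows directly; for an approximately hemispherical protrusion the volume estimate would be 2πr³/3, though any other solid model can be substituted.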
- Figure 9 illustrates that the 3D reconstruction system is performing a size determination of a tissue field based on the light features of the set of pixel data representing an image 30 after having estimated the spatial position of the projector device.
- Figure 10 illustrates an example of a structured light in the form of a grid pattern. Examples of light features which may be extracted from the grid pattern are shown in figure 10a including for example a sub-grid, a square, a cross and a corner.
- a position attribute may e.g. be provided as the crossing point in a sub-grid or a cross, or as a corner edge of a sub-grid or a corner.
- An orientation attribute may for example represent the orientation of the lines of the shown light features.
- Figure 11 illustrates an example of a structured light in the form of a grid pattern with random dots. Examples of light features which may be extracted from the grid pattern are shown in figure 11a including for example a grid feature, a grid with dots, a group of dots or a single dot.
- Figure 12 illustrates an example of a structured light in the form of a pattern of dots. Examples of light features which may be extracted from the pattern are shown in figure 12a. As seen, the group of dots can be matched to the pattern of dots as illustrated with the dotted ring.
- Figures 13a, 13b and 13c illustrate further examples of light features comprising light features with various characteristic shapes, light features with various colors and light features with various intensities of light.
- Figure 14b illustrates a structured light comprising a number of parallel lines, e.g. with different attributes such as size, colour and structure.
- Figure 14c illustrates a structured light in the form of a bar coded structure.
- Figure 15 is a schematic illustration of stereo image feature matching showing the matching of light features between the first image 42 and the second image 43 of stereo images. Corresponding features of at least one of the images are matched to the reference structured light data set 41.
- Figure 16 shows a distal portion of a penetrator 50.
- the penetrator comprises a channel 55 for supplying light e.g. via an optical fiber arrangement to a projector device 56 arranged at the tip of the penetrator.
- the penetrator 60 of figures 17a and 17b has a distal portion 62 with a tip 64, an obstruction 65 which ensures that the penetrator 60 does not penetrate too deep into a minimally invasive surgical cavity, and a proximal portion 63 for handling of the penetrator 60 by an operator.
- the penetrator 60 comprises a projector device 66 forming part of a structured light arrangement of a 3D reconstruction system of an embodiment of the invention.
- the projector device 66 is in a folded position where the projector device 66 is folded into a sleeve of the penetrator 60. When the projector device 66 is in this first position the penetrator 60 may penetrate through the skin of a patient.
- the structured light arrangement illustrated in figure 18 comprises a light source 72, a waveguide 71, an optical projector 76 and focusing lens 75.
- the waveguide 71 is an optical fiber and the optical projector 76 is a DOE (diffractive optical element).
- the light pattern is projected in the desired direction and focused by the focusing lens 75.
- the projected pattern has a diverging angle θ.
- the projector probe illustrated in figure 19 comprises an optical fiber 81 with a proximal end 81' and a distal end, a beam expanding lens 82 and a projector 86 with a distal front face 86'.
- the optical fiber 81, the beam expanding lens 82 and the projector 86 are fused at the interfaces F.
- the optical fiber 81, the beam expanding lens 82 and the projector 86 are arranged in a hermetic metal housing 80, preferably using an epoxy seal 89.
- When a light beam is pumped from a not shown light source into the proximal end 81' of the optical fiber 81, the light will propagate through the optical fiber 81 collimated in the core of the optical fiber. From the fiber 81 the light will pass into the beam expanding lens 82, which is advantageously a GRIN lens, and the beam will expand as the light propagates through the beam expanding lens 82. At the exit of the beam expanding lens 82 the light will be collimated and it will propagate into the projector, which is advantageously a DOE.
- the light pattern will be shaped and the projector will project a divergent light pattern.
- the projector may advantageously comprise an optical filter or an optical filter layer as described above to prevent and/or remove fog/mist.
- the optional optical filter or an optical filter layer is indicated with the dotted part 86a of the projector 86.
- the beam expanding lens arrangement illustrated in figure 20 comprises a beam expanding lens 92 having a length L.
- the light is fed from a not shown light source via an optical fiber 91 to the proximal end of the beam expanding lens 92.
- the light enters the beam expanding lens and, due to a continuous change of the refractive index within the lens material, the light rays r1 are continuously bent to thereby expand the diameter of the beam as the light propagates through the beam expanding lens 92 along its length L.
- at the exit of the beam expanding lens 92 the light is collimated to form a beam with substantially parallel rays r2 of light.
- the collimated light may be transmitted further to the DOE of a projector probe as illustrated in figure 19.
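The continuous bending of the rays inside such a lens is, for a standard parabolic-index (GRIN) rod, commonly modeled by the paraxial ray equation; the profile below is the textbook assumption, not a limitation of the embodiment.

```latex
n(r) = n_0\left(1 - \tfrac{A}{2}\,r^2\right), \qquad
\frac{\mathrm{d}^2 r}{\mathrm{d}z^2} \approx -A\,r
\;\;\Rightarrow\;\;
r(z) = r_0\cos\!\left(\sqrt{A}\,z\right) + \frac{r_0'}{\sqrt{A}}\,\sin\!\left(\sqrt{A}\,z\right)
```

Choosing the length L as a quarter pitch, L = π/(2√A), turns light diverging from a point on the axis into the collimated output described above.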
- FIG 21 is a schematic illustration of a robot 100 comprising a first movable robot arm 100a and a second movable robot arm 100b.
- the robot 100 comprises a minimally invasive surgery tool 107a and a projector device 107 of a structured light arrangement disposed on its first robot arm 100a and an image acquisition device 108 disposed on its second robot arm 100b.
- the minimally invasive surgery tool 107a with the projector device 107 and the image acquisition device 108 are passed through not shown cannula ports through the skin 106 of a patient into a minimally invasive surgical cavity with a tissue field.
- as shown, the robot arms 100a and 100b are outside the cavity. In practice it is desired that the robot arms 100a and 100b are inserted through the cannula ports into the cavity to thereby increase the movability of the projector device and the image acquisition device.
- the projector device 107 projects a structured light beam SB to impinge onto the tissue field and at least a part of the light of the structured light beam SB is reflected as a reflected light pattern RP.
- the image acquisition device 108 acquires frames comprising at least a part of the reflected light pattern RP.
- the robot also comprises a computer system comprising a data collecting arrangement 102, a computer 101 with processing capability and a controller processing system 104 for controlling the operation of the robot and in particular the robot arms 100a, 100b.
- the frames acquired by the image acquisition device are collected in real time in the data collecting arrangement 102 and stored in the form of sets of pixel data with associated time attributes.
- the data collecting arrangement 102 also stores the reference structured light data set.
- the computer 101 requests data from the data collecting arrangement 102 and processes the data as described above to estimate the spatial position of the projector device; further, the computer 101 performs 3D determinations of the tissue field.
- the 3D determinations may be transmitted to a display 103 to be visualized for a human surgeon.
- the 3D determinations may further be transmitted to the controller processing system 104 for being used in the algorithm determining the movement of the robot arms.
- the controller processing system 104 may further provide feedback to the computer 101, including data describing previous or expected moves of the robot arms 100a, 100b, and the computer 101 may apply these feedback data to refine the 3D determinations.
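As one hypothetical way of applying such feedback, the optically estimated projector position could be blended with the position predicted from the controller's commanded move; the complementary-filter form and the weighting below are illustrative assumptions only.

```python
import numpy as np

def refine_with_feedback(measured_pos, predicted_pos, w=0.7):
    """Blend the optical estimate of the projector position with the position
    predicted from the robot controller's expected move; w weights the
    optical measurement against the motion prior."""
    return w * np.asarray(measured_pos) + (1.0 - w) * np.asarray(predicted_pos)

# Example: the controller expects a 2 mm move along x since the last estimate
last = np.array([0.100, 0.050, 0.080])
predicted = last + np.array([0.002, 0.0, 0.0])
fused = refine_with_feedback(np.array([0.1025, 0.050, 0.080]), predicted)
```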
- the image acquisition device also comprises a projector which projects a second structured light beam SBa towards the tissue field and this second structured light beam SBa is at least partly reflected as a reflected pattern RPa.
- the wavelength(s) of the two structured light beams SB, SBa preferably differ such that the computer 101 may distinguish between the two reflected light patterns RP, RPa and features thereof.
- FIG 22a illustrates an example of an image acquisition device comprising one or more single cameras 112a, 112b incorporated into separate camera housings 111a, 111b. Where there is more than one camera 112a, 112b, it is desirable that the cameras 112a, 112b can be moved separately, e.g. by independently tilting and/or twisting the camera housings.
- the cameras 112a, 112b may be operated by a common electronic circuitry to ensure that the cameras 112a, 112b are operating concurrently with each other and preferably with same frame rate.
- Figure 22b illustrates an example of an image acquisition device comprising a stereo camera with a stereo camera housing 113 comprising flexible housing arms 113a, 113b, each encasing a camera 114a, 114b.
- the flexible housing arms 113a, 113b ensure that the cameras 114a, 114b can be positioned with a variable base line within a selected range.
- An embodiment of the 3D reconstruction system comprising a stereo camera with a variable base line is very advantageous because the base line may be optimized during a procedure, resulting in a very detailed 3D analysis of the tissue field, including highly accurate determinations of sizes of protruding parts and/or cavities of the tissue field and optionally volume determinations of such protrusions and/or cavities.
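The benefit of a variable base line follows from the standard stereo depth model for pinhole cameras with base line b and focal length f (in pixels): for a disparity d with measurement noise δd, the depth uncertainty shrinks as the base line grows, which is what permits the more detailed analysis.

```latex
Z = \frac{f\,b}{d}, \qquad \delta Z \approx \frac{Z^{2}}{f\,b}\,\delta d
```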
- the fiber optic probe shown in figure 23 comprises a structured light device 122, a bundle of optical fibers 124 and a probe-head 125 comprising a projector device.
- a not shown light source is arranged to feed light to the structured light device 122 via a fiber 121.
- the light source forms an integrated part of the structured light device 122.
- the light source may be as the light sources disclosed above.
- the light source is e.g. a laser light source or a LED.
- the structured light device 122 is configured for generating a structured light. Since the size of the structured light device 122 is not very important, any device suitable for generating the structured light may be applied, e.g. as described above.
- the structured light device 122 is arranged to deliver at least a portion of the structured light to the light receiving end 124a of the fiber bundle 124.
- a number of lenses and/or objectives 123 are arranged between an output end 122a of the structured light device 122 and the light receiving end 124a of the fiber bundle 124. The lenses and/or objectives 123 ensure an effective focusing of the structured light to be received by the light receiving end 124a of the fiber bundle 124.
- the light propagates to the probe-head 125.
- For high stability it is desired that the light emitting end 124b of the fiber bundle 124 is encased in and forms part of the probe-head 125.
- One or more not shown lenses may e.g. be arranged between the light emitting end 124b of the fiber bundle 124 and the projector.
- the probe-head shown comprises a micro lens 125a, which may be the projector device; alternatively a not shown projector device is arranged in front of the micro lens 125a.
- the projector device of the probe-head is configured to project the structured light as a structured light beam e.g. onto at least a section of a tissue field.
- Figure 24 shows the fiber optic probe mounted to an instrument 127 to form a fiber optic probe instrument assembly of an embodiment of the invention.
- the minimally invasive surgical instrument 127 comprises a distal portion 127a adapted to be inserted into any natural or artificial cavity of a human or animal patient subjected to surgery.
- the minimally invasive surgical instrument is a grasper, comprising a pair of grasper arms 127b which form the distal end and the tip of the instrument 127.
- the probe-head 125 is mounted to the minimally invasive surgical instrument 127 at its distal portion 127a.
- Figures 25a-25d illustrate cross-sectional views of examples of fiber bundles suitable for use in a fiber optic probe instrument assembly of embodiments of the invention.
- Figure 26 shows a distal portion of a fiber optic probe instrument assembly, where the minimally invasive surgical instrument is an endoscope 137.
- the endoscope 137 comprises a camera channel 137b and a probe channel 137a. In the shown embodiment a camera has not been inserted through the camera channel. A probe-head 135 and a length portion of the fiber bundle 134 have been passed through the probe channel.
- Figures 27-33 illustrate a structured light ultrasound instrument and a minimally invasive surgery navigation system in use during minimally invasive surgery.
- the illustration of figure 27 shows a tissue surface 141 within a body cavity.
- the tissue surface has a tumor 142 which may be malignant and is to be diagnosed and optionally removed.
- a distal portion of a structured light ultrasound instrument 143 as described above is inserted into the cavity.
- the structured light ultrasound instrument 143 comprises an ultrasound head 144 comprising a transceiver for emitting and receiving ultrasound.
- Proximally to the ultrasound head 144 the distal portion of the structured light ultrasound instrument 143 has an articulating length section 145, which can be articulated with a high degree of freedom.
- the articulating movements are preferably computer controlled, or at least the computer system comprises data representing the movements and articulated position of the articulating length section 145, such that the computer system can determine the relative position between the projector device 146 and the ultrasound head 144.
- the projector device is located proximally to the articulating length section 145 to ensure a desired location for emitting the light pattern onto a desired tissue field of the tissue surface 141.
- the structured light 147, here in the form of a plurality of light dots, impinges onto the tissue field and a portion of the light is reflected.
- An endoscope comprising an image acquisition device is configured for acquiring frames comprising reflections of the structured light 147.
- Each frame (image) comprises a set of pixel data associated with a time attribute, and advantageously the set of pixel data is transmitted to the computer system for being computed according to the 3D reconstruction system as described above.
- the computer system is configured for performing a 3D reconstruction in real time using these acquired frames as described above, and at the same time the ultrasound head 144 is scanning the tissue below the tissue surface.
- the combined 3D reconstruction system/minimally invasive surgery navigation system is configured to determine in real time the spatial location and orientation of the non-articulated portion of the structured light ultrasound instrument 143, of the ultrasound head 144 and of the endoscope (camera) 148.
- As shown in figure 28, the combined 3D reconstruction system/minimally invasive surgery navigation system further overlays a depiction 143a, 144a, 148a of the spatial location and orientation of, respectively, the non-articulated portion of the structured light ultrasound instrument 143, the ultrasound head 144 and the endoscope (camera) 148.
- the surgeon has full information about the coordinates (spatial location) and the orientation of the non-articulated portion of the structured light ultrasound instrument 143, of the ultrasound head 144 and the endoscope (camera) 148.
- the structured light has been changed to an angular pattern.
- Figure 29 illustrates a view seen from the endoscope (the probe spatial location is shown, but not the orientation). The system advantageously allows the surgeon to switch these data on and off.
- Figure 30 illustrates the image of the tissue field and a corresponding 2D ultrasound image 149. In real time, both images will continuously be replaced by real time images.
- the tissue near the tumor may comprise a vein 140 or another critical structure which the surgeon should aim not to damage during surgery. Due to the present invention, the surgeon may locate such critical structures and thereby avoid unnecessary damage.
- Figure 31 illustrates the image of the tissue field and a corresponding 3D ultrasound image 150. In real time, both images will continuously be replaced by real time images.
- To couple the 3D ultrasound image 150, i.e. to determine the spatial location and orientation of the 3D ultrasound image 150, it is required to know the coordinates and orientation of the ultrasound head 144 and at least the orientation, and advantageously also the coordinates, of the endoscope 148.
- Figure 32 shows a view of the image of the tissue field and the corresponding 3D ultrasound image 150, where the 3D ultrasound image 150 and the 2D and/or 3D surface data determined from the frames of the endoscope have been correlated in 3D orientation, spatial position and size.
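A hedged sketch of this coupling, assuming the navigation system tracks 4x4 homogeneous poses of the ultrasound head 144 and the endoscope 148 in a common frame (all names and frame conventions below are illustrative):

```python
import numpy as np

def ultrasound_points_in_endoscope_frame(T_world_head, T_world_endo, p_us_h):
    """Express points of the 3D ultrasound image, given homogeneously as an
    (N, 4) array in the ultrasound-head frame, in the endoscope frame used
    for the 2D/3D surface data, so both can be correlated in orientation,
    spatial position and size."""
    T_endo_head = np.linalg.inv(T_world_endo) @ T_world_head
    return (T_endo_head @ p_us_h.T).T
```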
- Figure 33 shows two different views corresponding to the view of figure 32, where the surgeon may dynamically change the x,y,z angle of one or both of the views.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Medical Informatics (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radiology & Medical Imaging (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Theoretical Computer Science (AREA)
- Heart & Thoracic Surgery (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Signal Processing (AREA)
- Quality & Reliability (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
Claims
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DKPA201770193 | 2017-03-20 | ||
DKPA201770430 | 2017-06-01 | ||
PCT/DK2018/050048 WO2018171851A1 (en) | 2017-03-20 | 2018-03-20 | A 3d reconstruction system |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3599982A1 true EP3599982A1 (en) | 2020-02-05 |
EP3599982A4 EP3599982A4 (en) | 2020-12-23 |
Family
ID=63585007
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP18771352.4A Pending EP3599982A4 (en) | 2017-03-20 | 2018-03-20 | A 3d reconstruction system |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3599982A4 (en) |
WO (1) | WO2018171851A1 (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20200015899A1 (en) | 2018-07-16 | 2020-01-16 | Ethicon Llc | Surgical visualization with proximity tracking features |
CN109870386B (en) * | 2019-04-03 | 2024-06-07 | 浙江省工程物探勘察设计院有限公司 | Sample density testing system for rock-soil investigation test |
CN110349237B (en) * | 2019-07-18 | 2021-06-18 | 华中科技大学 | Fast volume imaging method based on convolutional neural network |
CN110495900B (en) * | 2019-08-19 | 2023-05-26 | 武汉联影医疗科技有限公司 | Image display method, device, equipment and storage medium |
US11776144B2 (en) | 2019-12-30 | 2023-10-03 | Cilag Gmbh International | System and method for determining, adjusting, and managing resection margin about a subject tissue |
US12002571B2 (en) | 2019-12-30 | 2024-06-04 | Cilag Gmbh International | Dynamic surgical visualization systems |
US11284963B2 (en) | 2019-12-30 | 2022-03-29 | Cilag Gmbh International | Method of using imaging devices in surgery |
US11759283B2 (en) | 2019-12-30 | 2023-09-19 | Cilag Gmbh International | Surgical systems for generating three dimensional constructs of anatomical organs and coupling identified anatomical structures thereto |
US11896442B2 (en) | 2019-12-30 | 2024-02-13 | Cilag Gmbh International | Surgical systems for proposing and corroborating organ portion removals |
US11744667B2 (en) | 2019-12-30 | 2023-09-05 | Cilag Gmbh International | Adaptive visualization by a surgical system |
US11832996B2 (en) | 2019-12-30 | 2023-12-05 | Cilag Gmbh International | Analyzing surgical trends by a surgical system |
US11484245B2 (en) | 2020-03-05 | 2022-11-01 | International Business Machines Corporation | Automatic association between physical and visual skin properties |
US11659998B2 (en) * | 2020-03-05 | 2023-05-30 | International Business Machines Corporation | Automatic measurement using structured lights |
CN111637850B (en) * | 2020-05-29 | 2021-10-26 | 南京航空航天大学 | Self-splicing surface point cloud measuring method without active visual marker |
WO2022209156A1 (en) * | 2021-03-30 | 2022-10-06 | ソニーグループ株式会社 | Medical observation device, information processing device, medical observation method, and endoscopic surgery system |
US20230062782A1 (en) * | 2021-09-02 | 2023-03-02 | Cilag Gmbh International | Ultrasound and stereo imaging system for deep tissue visualization |
CN116740325B (en) * | 2023-08-16 | 2023-11-28 | 广州和音科技文化发展有限公司 | Image stitching method and system based on exhibition hall scene three-dimensional effect design |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9216743D0 (en) * | 1992-08-07 | 1992-09-23 | Epstein Ruth | A device to calibrate adsolute size in endoscopy |
US6503195B1 (en) | 1999-05-24 | 2003-01-07 | University Of North Carolina At Chapel Hill | Methods and systems for real-time structured light depth extraction and endoscope using real-time structured light depth extraction |
JP3887377B2 (en) * | 2004-01-19 | 2007-02-28 | 株式会社東芝 | Image information acquisition apparatus, image information acquisition method, and image information acquisition program |
US7824328B2 (en) | 2006-09-18 | 2010-11-02 | Stryker Corporation | Method and apparatus for tracking a surgical instrument during surgery |
JP5118867B2 (en) * | 2007-03-16 | 2013-01-16 | オリンパス株式会社 | Endoscope observation apparatus and operation method of endoscope |
JP2009240621A (en) | 2008-03-31 | 2009-10-22 | Hoya Corp | Endoscope apparatus |
IT1401669B1 (en) | 2010-04-07 | 2013-08-02 | Sofar Spa | ROBOTIC SURGERY SYSTEM WITH PERFECT CONTROL. |
DE102010050227A1 (en) | 2010-11-04 | 2012-05-10 | Siemens Aktiengesellschaft | Endoscope with 3D functionality |
EP2689708B1 (en) | 2011-04-27 | 2016-10-19 | Olympus Corporation | Endoscopic apparatus and measurement method |
US11510600B2 (en) * | 2012-01-04 | 2022-11-29 | The Trustees Of Dartmouth College | Method and apparatus for quantitative and depth resolved hyperspectral fluorescence and reflectance imaging for surgical guidance |
KR102088541B1 (en) | 2012-02-02 | 2020-03-13 | 그레이트 빌리프 인터내셔널 리미티드 | Mechanized multiinstrument surgical system |
WO2013163391A1 (en) | 2012-04-25 | 2013-10-31 | The Trustees Of Columbia University In The City Of New York | Surgical structured light system |
US20130296712A1 (en) | 2012-05-03 | 2013-11-07 | Covidien Lp | Integrated non-contact dimensional metrology tool |
TWI533675B (en) | 2013-12-16 | 2016-05-11 | 國立交通大學 | Optimal dynamic seam adjustment system and method for images stitching |
US11116383B2 (en) | 2014-04-02 | 2021-09-14 | Asensus Surgical Europe S.à.R.L. | Articulated structured light based-laparoscope |
US20150371420A1 (en) | 2014-06-19 | 2015-12-24 | Samsung Electronics Co., Ltd. | Systems and methods for extending a field of view of medical images |
BR112017005251A2 (en) * | 2014-09-17 | 2017-12-12 | Taris Biomedical Llc | Methods and Systems for Bladder Diagnostic Mapping |
GB2545588B (en) | 2014-09-22 | 2018-08-15 | Shanghai United Imaging Healthcare Co Ltd | System and method for image composition |
EP3204420B1 (en) | 2014-10-10 | 2020-09-02 | The United States of America, as represented by The Secretary, Department of Health and Human Services | Methods to eliminate cancer stem cells by targeting cd47 |
US10368720B2 (en) | 2014-11-20 | 2019-08-06 | The Johns Hopkins University | System for stereo reconstruction from monoscopic endoscope images |
JP6450589B2 (en) | 2014-12-26 | 2019-01-09 | 株式会社モルフォ | Image generating apparatus, electronic device, image generating method, and program |
US9866815B2 (en) * | 2015-01-05 | 2018-01-09 | Qualcomm Incorporated | 3D object segmentation |
JP6618704B2 (en) * | 2015-04-10 | 2019-12-11 | オリンパス株式会社 | Endoscope system |
US10512508B2 (en) | 2015-06-15 | 2019-12-24 | The University Of British Columbia | Imagery system |
-
2018
- 2018-03-20 EP EP18771352.4A patent/EP3599982A4/en active Pending
- 2018-03-20 WO PCT/DK2018/050048 patent/WO2018171851A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
EP3599982A4 (en) | 2020-12-23 |
WO2018171851A1 (en) | 2018-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3599982A1 (en) | A 3d reconstruction system | |
US20230380659A1 (en) | Medical three-dimensional (3d) scanning and mapping system | |
US20220241013A1 (en) | Quantitative three-dimensional visualization of instruments in a field of view | |
US20210345855A1 (en) | Real time correlated depiction system of surgical tool | |
US11357593B2 (en) | Endoscopic imaging with augmented parallax | |
CN107260117B (en) | Chest endoscope for surface scan | |
EP3359012B1 (en) | A laparoscopic tool system for minimally invasive surgery | |
Maier-Hein et al. | Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery | |
US9220399B2 (en) | Imaging system for three-dimensional observation of an operative site | |
CN118284386A (en) | Surgical system with intra-and extra-luminal cooperative instruments | |
CN112741689B (en) | Method and system for realizing navigation by using optical scanning component | |
CN111281534B (en) | System and method for generating a three-dimensional model of a surgical site | |
US20230196595A1 (en) | Methods and systems for registering preoperative image data to intraoperative image data of a scene, such as a surgical scene | |
US20230062782A1 (en) | Ultrasound and stereo imaging system for deep tissue visualization | |
CN118302122A (en) | Surgical system for independently insufflating two separate anatomical spaces | |
CN118284368A (en) | Surgical system with devices for endoluminal and extraluminal access | |
EP4228492A1 (en) | Stereoscopic endoscope with critical structure depth estimation | |
EP4149340A1 (en) | Systems and methods for image mapping and fusion during surgical procedures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20191002 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20201119 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 1/05 20060101ALI20201113BHEP Ipc: G06T 7/00 20170101ALI20201113BHEP Ipc: A61B 1/04 20060101AFI20201113BHEP Ipc: A61B 1/00 20060101ALI20201113BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: CILAG GMBH INTERNATIONAL |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06T 7/00 20060101ALI20230220BHEP Ipc: A61B 1/00 20060101ALI20230220BHEP Ipc: A61B 1/05 20060101ALI20230220BHEP Ipc: A61B 1/04 20060101AFI20230220BHEP |
|
INTG | Intention to grant announced |
Effective date: 20230313 |
|
GRAJ | Information related to disapproval of communication of intention to grant by the applicant or resumption of examination proceedings by the epo deleted |
Free format text: ORIGINAL CODE: EPIDOSDIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
INTC | Intention to grant announced (deleted) | ||
17Q | First examination report despatched |
Effective date: 20230718 |