WO2011069469A1 - Stereoscopic visualization system for surgery - Google Patents

Stereoscopic visualization system for surgery Download PDF

Info

Publication number
WO2011069469A1
Authority
WO
WIPO (PCT)
Prior art keywords
stereoscopic
display
map
pixel
pixels
Prior art date
Application number
PCT/CN2010/079658
Other languages
French (fr)
Inventor
Ka-Shun Carrison Tong
Original Assignee
Hospital Authority
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from HK09111675A external-priority patent/HK1134743A2/en
Priority claimed from GB0921894A external-priority patent/GB2476245A/en
Application filed by Hospital Authority filed Critical Hospital Authority
Publication of WO2011069469A1 publication Critical patent/WO2011069469A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/139 - Format conversion, e.g. of frame-rate or size
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/156 - Mixing image signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/334 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers
    • H04N13/332 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/337 - Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 - Surgical systems with images on a monitor during operation
    • A61B2090/371 - Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 - Surgical robots
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 - Image-producing devices, e.g. surgical cameras


Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A stereoscopic visualization system, particularly for surgery, combines two views into a combined image signal for display on a secondary 3D display, for example an LCD TV. Left and right views are combined rapidly using a pixel map recording which pixels of the 3D display are for the right eye and which for the left.

Description

STEREOSCOPIC VISUALIZATION SYSTEM FOR SURGERY
Field of the Invention
The present invention relates to stereoscopic visualization of surgery for staff present in an operating theatre.
Background of the invention
Robotic surgery is becoming increasingly popular due to its many benefits over traditional open surgeries, including quicker recovery time, reduced pain, reduced scarring, smaller incisions and greater precision.
A known robotic surgery system called the da Vinci Surgical™ System ("dVSS") is provided by Intuitive Surgical, Inc. and comprises two main components: a master console 1 and a slave robot 2. These components are shown in Figure 1 (which otherwise illustrates an example of the invention). The master console 1 is operated by the surgeon 3. Using manipulators on the console the surgeon controls the arms of the robot and the surgical implements attached to their ends to perform the surgery. The end of one of the arms is provided with a pair of cameras 4, spaced apart by a few millimetres, having a stereoscopic view of the surgery; these cameras provide the console with respective video feeds. The console displays the video feeds on respective displays having respective optical systems which project their respective images into respective ones of the surgeon's eyes, providing him or her with a detailed 3D view of the surgery. This requires the surgeon to place his or her head into a recess in the console 1 to position his or her eyes appropriately in the optical system.
Summary of the Invention
The present invention improves on the visual information provided to other members of staff in the operating theatre. Preferred examples have advantages of easy set up of new 3D monitors and a depth control for the 3D image.
In accordance with the present invention there is provided a stereoscopic surgery display system and method of performing surgery as defined in the appended claims.
Description of the Drawings
Examples of the invention will now be described, with reference to the accompanying drawings, of which:
FIGURE 1 shows an example of the invention for robotic surgery;
FIGURE 2 shows an example of the stereoscopic video system;
FIGURE 3 illustrates the interlacing method of the stereoscopic video system;
FIGURE 4 illustrates a third method of generating the pixel polarization map;
FIGURE 5 shows the pixel offset for depth control;
FIGURE 6 shows an example of the invention for laparoscopic surgery.
Detailed Description
Figure 1 shows an example system in accordance with the present invention for robotic surgery. This example system includes components of an existing robotic surgery system and so the invention can be retrofitted to such a system. The existing system includes the master console 1, the slave surgery robot 2, the pair of endoscopic cameras 4 mounted on one of the robot arms and an image processing unit 5 that provides signals to the separate respective displays in the master console for the surgeon to see a 3D display.
Note that the camera may in use be located either inside the person receiving surgery or outside the body looking in, e.g. through an incision.
In this example of the invention there is provided
additionally a stereoscopic video system 6 and a 3D LCD flat panel monitor 7 for other staff members 9 to view. The
stereoscopic video system 6 is connected to receive from the image processing unit 5 respective video signals representing the views seen by the respective stereoscopic cameras 4 at the end of one of the robot arms. The stereoscopic video system is shown in detail in Figure 2. This comprises a computer 10 configured with two video capture cards 11 and 12. Those are respectively connected to receive from the image processing unit video signals representing the respective views from the two endoscopic cameras 4. (In other examples the cameras may be connected directly to the stereoscopic video system.) The video capture cards convert those video signals into
respective series of video frames 13 and 14 stored in the RAM 15 of the computer 10. These frames are combined into combined frames 16. (In particular, frames from the two series that were captured at the same time are combined together to make one frame of the combined series 16.) This processing is carried out by the central processing unit 17 of the computer under the control of an application program 18 provided for that purpose. The combined frames are converted to a video signal 21 by a video display card 19 in the computer, which provides that signal at a combined image output. The 3D LCD monitor 7 is connected to receive that video signal and to display it.
Figure 3 shows the image combining process for one particular example of an LCD 3D display. (Note that the monitor can use other technologies, such as plasma, to generate the pixels.) This monitor has an input for a combined signal 21 comprising pixels of both the left and right images, has its pixels arranged in rows and has mounted over each row a strip of polarizing material. The strips of polarizing material for alternate rows have opposite polarizations, for example left circular polarization and right circular polarization (or vertical polarization and horizontal polarization). The application program operates (i) to take pixels from the left video signal and provide them in rows of the combined video signal 21 that are displayed on rows of the display 7 that have a filter providing a first polarization and (ii) to take pixels from the right video signal and display them on rows having a filter of the second, opposite, polarization. The viewer 23 is provided with spectacles 24 having a polarizing filter for the left eye that allows through the pixels having the first polarization and a filter for the right eye that allows through pixels having the second, opposite, polarization. In this way the left and right eyes see respectively only the left and right images and so the viewer perceives 25 the view in 3D. The rows and pixels of each left or right frame are, of course, kept in their original order. Further, corresponding rows from the left and right frames are displayed only a small vertical distance from each other; in this example, where the polarizing filter strips are only one row of pixels high, corresponding rows from the left and right images are displayed next to each other.
So, in summary the method of the application program 18 in this example is to build up the rows of the combined frame in order by taking rows, in order, alternately from the left and right frames. The combined signal is displayed on the 3D monitor in the same single combined image area.
If the sum of the number of rows in the left image and that in the right image exceeds the number that can be displayed on the monitor 7 then only a fraction of the rows from the left and right images are used, with the other rows being dropped; preferably the dropped rows are spread across the image so that no large area at the top or bottom is omitted. For example, if the left and right images each have 1080 rows and the monitor has 1080 rows then only every other row from each of the images is used and the others are dropped - the resulting image therefore has 540 + 540 = 1080 rows, as required for the monitor. (See Figure 5 later, which is in accordance with this example: the combined image has only odd-numbered rows from the left image and even-numbered rows from the right image.)
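The row-interlacing step just described can be sketched in a few lines of code. The following is only an illustrative sketch in Python with NumPy (the patent does not specify any programming language or library); it assumes left and right frames of the same size as the display and, as in the 1080-row example above, implicitly drops the unused rows by copying only every other row from each source frame.

    import numpy as np

    def interlace_rows(left, right):
        """Build the combined frame by taking rows alternately from the
        left and right frames. Which parity of display row carries which
        eye depends on the monitor's filter layout; here even display rows
        are assumed to carry left-eye rows."""
        assert left.shape == right.shape
        combined = np.empty_like(left)
        combined[0::2] = left[0::2]   # left-image rows shown on even display rows
        combined[1::2] = right[1::2]  # right-image rows shown on odd display rows
        return combined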
In another example a 3D monitor has vertical strips of
polarizing filter one pixel wide with alternate columns of pixels being given opposite polarization. Since pixels of the frames are usually organised in RAM in a raster pattern of one row after the next, the application program in this case steps through following that pattern taking pixels from the left and right images alternately.
In this case, if the sum of the number of pixels across the left image and the number across the right image is too great to display then pixels in certain columns are dropped; for example if the left and right frames have the same width in pixels as the display 7 then pixels in alternate columns of the left and right frames are dropped.
A further example of how the filters on a 3D monitor may be arranged is to arrange them in a chequerboard pattern with the polarization of the filter changing to the opposite every pixel both in the vertical and horizontal directions. Here the application program works through the pixels of each row in order taking pixels alternately from the corresponding pixels of the left and right frames. The first pixel of each row is taken alternately from the left and right frames.
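For the chequerboard layout the selection reduces to a parity test on the row and column indices. The sketch below is again only an assumed Python/NumPy illustration, not the patent's own program; which parity corresponds to which eye depends on the particular monitor.

    import numpy as np

    def interlace_chequerboard(left, right):
        """Combine frames so the source eye alternates every pixel both
        horizontally and vertically. Pixels where (row + column) is even
        are assumed here to come from the left frame."""
        assert left.shape == right.shape
        rows, cols = left.shape[:2]
        left_mask = (np.add.outer(np.arange(rows), np.arange(cols)) % 2) == 0
        return np.where(left_mask[..., None], left, right)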
Note that in the examples above the left, right and combined frames are stored in the main RAM of the computer and the application program is executed by the central processing unit of the computer 10. Video capture and display cards also have RAM and processing units, and in alternative examples of the invention the RAM of any of these or the main RAM 15, and the processors of any of these or the CPU 17, may be used, alone or in combination.
These methods of combining the pixels of the left and right frames can be explicitly coded into the application program for each type of filter layout. This, however, is not very flexible, since the user of the system may wish to set up a new 3D display without having the application program specially coded for the pattern of that particular display. Problems would also occur if the monitor of an installed system were to be upgraded to a different one having a different layout. A preferred example of the application program 18 avoids these inconveniences.
In that preferred example of the application program a pixel polarization map 30 of the display being used is first derived. An example of such a map is shown in Figure 4 at 30. It comprises a two-dimensional array of values each corresponding to a pixel of the display 7.
In a first example of deriving the map the information
necessary to compile it is taken from the documentation
supplied with the 3D display.
In a second example of a method of deriving the map the
display 7 is used to display a frame which is completely white. The display is then viewed through a magnifying glass and one of the left and right filters of the 3D spectacles that are provided to view the display. This will reveal the pattern of pixels in the display for that eye. This information is then used to compile the map. This method is useful if there is no relevant documentation supplied with the 3D display.
Figure 4 illustrates a third example of a method of deriving the map. Some 3D monitors are supplied with a test program that takes two static images provided by the user, a left image and right image, and interlaces them in an appropriate manner for display on the 3D monitor. For the derivation of the map 30 this test program is provided with a completely black image 31 as the left image and a completely white image 32 as the right image to produce an output image. (Other pairs of colours may be used.) The resultant image is not a 3D image
(in contrast to the purpose of the test program) but is in fact a map of the filter polarizations for the pixels. Since the left image is all black the pixels having the value (0,0,0)
(in 24 bit RGB representation) are those for the left image pixels and those having the value (255,255,255) (in 24 bit RGB representation) are those for the right image pixels.
Therefore the resultant image can be saved as it is for later use as the map. This can be in RAM or in long term storage such as a disc drive. (Note that it is not necessary to discover the method by which the test program interlaces the pixels from the left and right images - only the resultant image, i.e. the map 30, is required. Nor in fact is it necessary to display the test pattern on the 3D monitor.) (Since images in computer systems are usually stored as RGB values there are in fact three sub-pixel maps 30, as shown in Figure 4, one for each of red, green and blue, but since white and black test images are used they contain the same values. If different pairs of colours are used, again only one sub-pixel of the three need be tested, because although the red, green and blue values will in general differ from each other, each still indicates by itself whether the pixel is for the left or right eye - except, of course, where the two colours share the same value in that sub-pixel; for a colour pair both having the same red value, for example, testing the red value only is not sensible.)
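Reading the saved test image back as the pixel polarization map then amounts to thresholding a single channel, since the black/white test pair makes the three sub-pixel maps identical. A minimal sketch under those assumptions (Python with Pillow and NumPy; the file name is purely hypothetical) could be:

    import numpy as np
    from PIL import Image

    def load_polarization_map(path):
        """Return a boolean map of the 3D display: True where the pixel is
        for the right eye (white, 255 in the test image), False where it is
        for the left eye (black, 0)."""
        img = np.asarray(Image.open(path).convert("RGB"))
        # Black/white test images make all three sub-pixel maps identical,
        # so testing one channel (red here) is sufficient.
        return img[..., 0] == 255

    # Hypothetical usage with an image saved from the monitor's test program:
    # right_eye_map = load_polarization_map("monitor_pixel_map.png")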
The preferred example of the application program 18 uses the map as follows. As before the video capture cards 11 and 12 provide series of frames of the left and right images in the RAM 15 of the computer 10. The application program 18 processes each pair of frames from the left and right series as follows. For each pixel of the combined frame 16 the application program first looks up the value of that pixel in the map. If it indicates that the pixel is for the left image (pixel = (0,0,0) in 24 bit RGB representation) then it takes the value of the corresponding pixel from the left frame and copies it to the pixel at the same position in the combined frame; on the other hand if it indicates that the pixel is for the right image (pixel = (255,255,255) in 24 bit RGB representation) then it takes the value of the corresponding pixel from the right frame and copies it to the corresponding pixel in the combined frame. (Note that since each sub-pixel map contains the same information, the value of the pixel in the map can be checked by checking just one of the sub-pixels, as well as by testing all three of them. Note also that it does not matter in which order the application program works through the pixels, but the raster pattern of working across each row in turn is employed in this example.)
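Under the same assumptions, the per-pixel look-up described above collapses into a single masked copy. The following sketch (an assumed NumPy formulation, not the actual application program 18) takes the boolean map from the previous sketch and frames of the same size as the display:

    import numpy as np

    def combine_with_map(left, right, right_eye_map):
        """For every pixel of the combined frame, copy the pixel from the
        right frame where the map marks a right-eye pixel, otherwise copy
        the pixel from the left frame."""
        assert left.shape == right.shape
        assert right_eye_map.shape == left.shape[:2]
        return np.where(right_eye_map[..., None], right, left)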
In the simplest example of this method the left, right and combined frames are of the same height and width in pixels. No special steps are required to drop pixels since for each pixel of the combined frame only one or other of the left and right pixels is selected. If however the left and right frames are of a different size to the combined frame, a rule for mapping the pixels between the two is adopted, but the basic method of selecting the pixel to be from the left or right frame on the basis of the map is unaffected. For example if the left and right frames are half height and half width then the mapping rule could be that a 2x2 block of four pixels in the combined frame maps to a respective single pixel in the left and right frames, which will result (with the more likely arrangements of polarization filters on the 3D display) in each block of 2x2 pixels in the combined frame showing each corresponding pixel of the left and right frames twice, with of course the correct polarizations for each.
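One way to express that 2x2 mapping rule, continuing the assumed NumPy sketches above, is to enlarge the half-size frames by pixel repetition before applying the map:

    import numpy as np

    def upscale_half_size(frame):
        """Map each pixel of a half-height, half-width frame onto a 2x2
        block of the combined frame by repeating it in both directions."""
        return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

    # combined = combine_with_map(upscale_half_size(left_half),
    #                             upscale_half_size(right_half), right_eye_map)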
In an alternative version of the preferred example of the application program 18 each sub-pixel of each pixel of the map is checked individually and in response to each such test the corresponding sub-pixel of the left or right image is copied to the resultant image. Although this is less efficient in terms of the number of tests, the resultant image is the same.
The application program (whether using a pixel polarization map or not) also provides a depth control. This takes the form of a pixel offset value which is used when copying pixels from one of the left and right images to the combined image so that corresponding pixels in the left and right images are offset from each other horizontally by that number of pixels. The offset can be set to zero or to positive values, for which the pixels for the left image are offset in the display to the left of the corresponding pixels of the right image.
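One way to realise the depth control is to shift the left frame horizontally by the offset before the left and right pixels are selected, whether by interlacing or via the map. The sketch below is an assumed implementation detail; in particular, filling the exposed right-hand columns with black is a choice the patent does not specify.

    import numpy as np

    def apply_depth_offset(left, offset):
        """Shift the left frame 'offset' pixels to the left so that, in the
        combined image, its pixels sit to the left of the corresponding
        right-frame pixels. Exposed columns are filled with black."""
        if offset <= 0:
            return left
        shifted = np.zeros_like(left)
        shifted[:, : left.shape[1] - offset] = left[:, offset:]
        return shifted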
Figure 5 illustrates the final positions of the pixels for an offset of 2. The notation used in the Figure is L = left, R = right, and the pixel coordinates from those images are given as (row, column). The case shown is for a monitor having horizontal strips of polarization filter one pixel high.
Figure 6 shows an example of the invention for laparoscopic surgery. Here the surgeon 3 works beside the patient. The pair of cameras 4 are provided in an endoscope. The camera control unit 8 to which the cameras are connected provides a pair of video signals representing the images from the endoscopic cameras to a stereoscopic video system 6, which is the same as for the robotic surgery example and operates in the same manner to provide a 3D display on the 3D LCD monitor 7. This example illustrates that the surgeon 3 can use the 3D monitor (using cooperating spectacles 24) to see the view provided by the cameras 4 in 3D. This provides the surgeon with depth perception, which is lacking in traditional 2D displays for laparoscopy. However it does not isolate the surgeon from the patient and his or her colleagues in the operating theatre, as can happen with an immersive console 1 such as is used for robotic surgery. Other staff members 9 can share the monitor to see the same 3D display that the surgeon sees (in 3D by using the 3D spectacles).
The system can be used to drive further 3D monitors. These can be in the operating theatre. It is also possible to show students and others 27, for teaching or other purposes, the procedure in 3D in another room, for example a lecture theatre. For both the robotic and laparoscopic surgery examples shown in Figures 1 and 6 there is shown an additional 3D display 7' in a lecture theatre being viewed using cooperating spectacles 24. The extra monitor 7' is driven by an additional stereoscopic video system 6' connected to receive the same video signals from the camera control unit or the image processing system 5 as does the first mentioned stereoscopic video system 6. (If the monitors 7 and 7' are the same then a single common stereoscopic video system can be provided with its output video signal passed through a splitter to drive the two displays 7 and 7'.)
Note that while visualization of surgery has been discussed above the invention also applies where the aim is merely to view the inside of the body, whether that is via a natural orifice or an incision.
Note also that while the example of the stereoscopic video unit described above involves a specially configured and programmed computer, it could instead be constructed as a special purpose integrated circuit, or set of such circuits.
The application program may be provided as a separate item, for example on a computer readable medium, such as a DVD or magnetic disc, or via an internet download.

Claims

CLAIMS :
1. A stereo processing unit comprising a first image input for receiving a first one of two stereoscopic images; a second image input for receiving the second one of the two stereoscopic images; a processing unit connected to receive images from the first and second inputs and to intersperse pixels from those to form a combined image, for display on a 3D display; and a combined image output connected to output a signal representing the combined image, wherein the processing unit (i) comprises a map of which pixels in the 3D display are to be displayed to the left eye of a viewer and which to the right eye, and
(ii) is arranged to check the map and to include in the combined image a corresponding pixel from the left image where a left eye pixel is indicated by the map and a corresponding pixel from the right image where a right eye pixel is
indicated by the map.
2. A computer program product which when executed on a processor performs the following steps: receiving first and second ones of two stereoscopic images; interspersing pixels from the first and second images to form a combined image, for display on a 3D display, by:
(i) receiving a map of which pixels in the 3D display are to be displayed to the left eye of a viewer and which to the right eye, and
(ii) checking the map and including in the combined image a corresponding pixel from the left image where a left eye pixel is indicated by the map and a corresponding pixel from the right image where a right eye pixel is indicated by the map.
3. A method of providing a 3D pixel map for a 3D display comprising : supplying an interspersing computer program for the 3D display that will combine two stereoscopic views of a scene into a combined image that when displayed on the 3D display will provide a 3D image, with left and right images each of a single colour, the left and right images being of different colours, and recording the resultant output of the interspersing program as the 3D pixel map.
4. A stereoscopic surgery display system comprising: a stereoscopic surgical camera having respective outputs representing two stereoscopic views of the surgery, a stereo processing unit connected to receive the output of the stereoscopic camera and to combine the two stereoscopic images into a combined image signal; and a 3D display connected to receive and display the
combined image signal .
5. A stereoscopic surgery display system as claimed in claim 4 wherein the 3D display comprises polarizing filters arranged to polarize the light emitted by the pixels of the display, the filters being of opposite polarization types and the filters of the two types being interspersed among each other across the display.
6. A stereoscopic surgery display system as claimed in claim 5 wherein the filters of the two polarization types are
interspersed with each other in rows or columns.
7. A stereoscopic surgery display system as claimed in claim 5 wherein the filters of the two polarization types are
interspersed with each other in a chequerboard pattern.
8. A stereoscopic surgery display system as claimed in any one of claims 5 to 7 wherein the filters are circularly polarized.
9. A stereoscopic surgery display system as claimed in any one of claims 5 to 7 wherein the filters are linearly polarized.
10. A stereoscopic surgery display system as claimed in any one of claims 4 to 9 wherein the stereoscopic camera and the said 3D display are located in the same operating theatre.
11. A stereoscopic surgery display system as claimed in any one of claims 4 to 10 wherein the camera is mounted in a
laparoscopic surgical implement.
12. A stereoscopic surgery display system as claimed in any one of claims 4 to 10 wherein the camera is mounted on a
robotic surgery arm.
13. A stereoscopic surgery display system as claimed in any one of claims 4 to 11 wherein the 3D display has a single image area displaying the combined image.
14. A stereoscopic surgery display system as claimed in any one of claims 4 to 13 wherein the 3D display is a flat panel display.
15. A stereoscopic surgery display system as claimed in any one of claims 4 to 14 wherein the 3D display is an LCD display.
16. A stereoscopic surgery display system as claimed in any one of claims 4 to 15 wherein the 3D display has a combined image signal port connected to the stereo processing unit to receive therefrom the combined image signal .
17. A stereoscopic surgery display system as claimed in any one of claims 4 to 16 wherein the stereo processing unit is arranged to receive the outputs of the camera via an
intermediate image processing system.
18. A stereoscopic surgery display system as claimed in any one of claims 4 to 17 comprising a pair of 3D spectacles which cooperate with the 3D display to form a 3D image for a person viewing the 3D display through those spectacles.
19. A stereoscopic surgery display system as claimed in any one of claims 4 to 18 further comprising a console connected to receive the respective stereoscopic outputs of the surgical camera and to display that in 3D.
20. A stereoscopic surgery display system as claimed in any one of claims 4 to 19 wherein the stereo processing unit comprises a computer.
21. A stereoscopic surgery display system as claimed in claim 20 wherein the computer comprises: respective video capture cards connected to receive the respective outputs representing two stereoscopic views of the surgery; a processor and an application program arranged to cooperate to combine images captured by the video capture cards into combined image data; and a video display card connected to receive the combined image data and to provide the combined image signal .
22. A stereoscopic surgery display system as claimed in any one of claims 4 to 21 wherein the stereo processing unit is arranged to intersperse, in the combined image, pixels from the respective stereoscopic views.
23. A stereoscopic surgery display system as claimed in claim 22 wherein the stereo processing unit is arranged to
intersperse, in the combined image, rows of pixels from the respective stereoscopic views.
24. A stereoscopic surgery display system as claimed in claim 22 wherein the stereo processing unit is arranged to
intersperse, in the combined image, columns of pixels from the respective stereoscopic views.
25. A stereoscopic surgery display system as claimed in claim 22 wherein the stereo processing unit is arranged to
intersperse, in the combined image, pixels from the respective stereoscopic views in a chequerboard pattern.
26. A stereoscopic surgery display system as claimed in any one of claims 22 to 25 wherein the stereo processing unit
(i) comprises a map of which pixels in the 3D display are to be displayed to the left eye of a viewer and which to the right eye, and
(ii) is arranged to check the map and to include in the combined image a corresponding pixel from the left image where a left eye pixel is indicated by the map and a corresponding pixel from the right image where a right eye pixel is
indicated by the map.
27. A stereoscopic surgery display system as claimed in any one of claims 22 to 25 wherein the stereo processing unit
(i) comprises a sub-pixel map of which sub-pixels in the 3D display are to be displayed to the left eye of a viewer and which to the right eye, and
(ii) is arranged to check the map and to include in the combined image a corresponding sub-pixel from the left image where a left eye sub-pixel is indicated by the map and a corresponding sub-pixel from the right image where a right eye sub-pixel is indicated by the map.
28. A stereoscopic surgery display system as claimed in any one of claims 22 to 27 wherein the stereo processing unit is arranged to offset horizontally in the combined image pixels from a left one of the stereoscopic views with respect to the pixels of the right one of the stereoscopic views.
29. A method of displaying images comprising: capturing two views of the inside of a person using a stereoscopic camera, processing signals representing those two views of the surgery into a combined image signal, displaying the combined image on a 3D display.
30. A method of displaying images as claimed in claim 29 wherein the 3D display is located in the same room as the person .
31. A method of displaying images as claimed in claim 30 wherein the room is an operating theatre.
32. A method of displaying images as claimed in any one of claims 29 to 31 wherein the images are made available to a surgeon .
33. A method of displaying images as claimed in any one of claims 29 to 32 wherein the images are made available to a person other than the surgeon or the person being imaged.
34. A method of displaying images as claimed in any one of claims 29 to 33 wherein the camera is external to the person.
35. A method of displaying images as claimed in any one of claims 29 to 33 wherein the camera is internal to the person.
36. A method of displaying images as claimed in any one of claims 29 to 35 including viewing the 3D display through 3D spectacles .
37. A method of displaying images as claimed in any one of claim 29 to 36 wherein the processing comprises interspersing, in the combined image, pixels from the respective stereoscopic views .
38. A method of displaying images as claimed in claim 37 wherein the interspersing comprises interspersing, in the combined image, rows of pixels from the respective
stereoscopic views.
39. A method of displaying images as claimed in claim 37 wherein the interspersing comprises interspersing, in the combined image, columns of pixels from the respective
stereoscopic views.
40. A method of displaying images as claimed in claim 37 wherein the interspersing comprises interspersing, in the combined image, pixels from the respective stereoscopic views in a chequerboard pattern.
41. A method of displaying images as claimed in any one of claims 37 to 40 wherein the interspersing comprises:
(i) providing a map of which pixels in the 3D display are to be displayed to the left eye of a viewer and which to the right eye, and
(ii) checking the map and including in the combined image a corresponding pixel from the left image where a left eye pixel is indicated by the map and a corresponding pixel from the right image where a right eye pixel is indicated by the map .
42. A method of displaying images as claimed in any one of claims 37 to 40 wherein the interspersing comprises:
(i) providing a sub-pixel map of which sub-pixels in the 3D display are to be displayed to the left eye of a viewer and which to the right eye, and
(ii) checking the map and including in the combined image a corresponding sub-pixel from the left image where a left eye sub-pixel is indicated by the map and a corresponding sub- pixel from the right image where a right eye sub-pixel is indicated by the map.
43. A method of displaying images as claimed in any one of claims 29 to 42 wherein the processing comprises offsetting horizontally in the combined image pixels from a left one of the stereoscopic views with respect to the pixels of the right one of the stereoscopic views.
PCT/CN2010/079658 2009-12-11 2010-12-10 Stereoscopic visualization system for surgery WO2011069469A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
HK09111675.0 2009-12-11
HK09111675A HK1134743A2 (en) 2009-12-11 2009-12-11 Stereoscopic visualization system for robotic and laparoscopic surgeries
GB0921894A GB2476245A (en) 2009-12-15 2009-12-15 Stereoscopic display system for surgery
GB0921894.2 2009-12-15

Publications (1)

Publication Number Publication Date
WO2011069469A1 (en)

Family

ID=44145109

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2010/079658 WO2011069469A1 (en) 2009-12-11 2010-12-10 Stereoscopic visualization system for surgery

Country Status (1)

Country Link
WO (1) WO2011069469A1 (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997003378A1 (en) * 1995-07-07 1997-01-30 International Telepresence Corporation System with movable lens for producing three-dimensional images
CN101170961A (en) * 2005-03-11 2008-04-30 布拉科成像S.P.A.公司 Methods and devices for surgical navigation and visualization with microscope
CN101193603A (en) * 2005-06-06 2008-06-04 直观外科手术公司 Laparoscopic ultrasound robotic surgical system
CN1985773A (en) * 2005-12-22 2007-06-27 天津市华志计算机应用技术有限公司 Celebral operating robot system based on optical tracking and closed-loop control and its realizing method
US20070167702A1 (en) * 2005-12-30 2007-07-19 Intuitive Surgical Inc. Medical robotic system providing three-dimensional telestration
CN101518438A (en) * 2009-03-27 2009-09-02 南开大学 Binocular endoscope operation visual system

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9936863B2 (en) 2012-06-27 2018-04-10 Camplex, Inc. Optical assembly providing a surgical microscope view for a surgical visualization system
US10925589B2 (en) 2012-06-27 2021-02-23 Camplex, Inc. Interface for viewing video from cameras on a surgical visualization system
US10925472B2 (en) 2012-06-27 2021-02-23 Camplex, Inc. Binocular viewing assembly for a surgical visualization system
US9629523B2 (en) 2012-06-27 2017-04-25 Camplex, Inc. Binocular viewing assembly for a surgical visualization system
US9642606B2 (en) 2012-06-27 2017-05-09 Camplex, Inc. Surgical visualization system
US9681796B2 (en) 2012-06-27 2017-06-20 Camplex, Inc. Interface for viewing video from cameras on a surgical visualization system
US9723976B2 (en) 2012-06-27 2017-08-08 Camplex, Inc. Optics for video camera on a surgical visualization system
US10022041B2 (en) 2012-06-27 2018-07-17 Camplex, Inc. Hydraulic system for surgical applications
US9615728B2 (en) 2012-06-27 2017-04-11 Camplex, Inc. Surgical visualization system with camera tracking
US9492065B2 (en) 2012-06-27 2016-11-15 Camplex, Inc. Surgical retractor with video cameras
US10231607B2 (en) 2012-06-27 2019-03-19 Camplex, Inc. Surgical visualization systems
US11889976B2 (en) 2012-06-27 2024-02-06 Camplex, Inc. Surgical visualization systems
US11389146B2 (en) 2012-06-27 2022-07-19 Camplex, Inc. Surgical visualization system
US10555728B2 (en) 2012-06-27 2020-02-11 Camplex, Inc. Surgical visualization system
US9216068B2 (en) 2012-06-27 2015-12-22 Camplex, Inc. Optics for video cameras on a surgical visualization system
US11166706B2 (en) 2012-06-27 2021-11-09 Camplex, Inc. Surgical visualization systems
US11129521B2 (en) 2012-06-27 2021-09-28 Camplex, Inc. Optics for video camera on a surgical visualization system
US9782159B2 (en) 2013-03-13 2017-10-10 Camplex, Inc. Surgical visualization systems
US10932766B2 (en) 2013-05-21 2021-03-02 Camplex, Inc. Surgical visualization systems
US10568499B2 (en) 2013-09-20 2020-02-25 Camplex, Inc. Surgical visualization systems and displays
US10881286B2 (en) 2013-09-20 2021-01-05 Camplex, Inc. Medical apparatus for use with a surgical tubular retractor
US11147443B2 (en) 2013-09-20 2021-10-19 Camplex, Inc. Surgical visualization systems and displays
US10028651B2 (en) 2013-09-20 2018-07-24 Camplex, Inc. Surgical visualization systems and displays
US10702353B2 (en) 2014-12-05 2020-07-07 Camplex, Inc. Surgical visualizations systems and displays
US11154378B2 (en) 2015-03-25 2021-10-26 Camplex, Inc. Surgical visualization systems and displays
US10966798B2 (en) 2015-11-25 2021-04-06 Camplex, Inc. Surgical visualization systems and displays
US10918455B2 (en) 2017-05-08 2021-02-16 Camplex, Inc. Variable light source
WO2019036005A3 (en) * 2017-08-16 2019-04-18 Covidien Lp Optimizing perception of stereoscopic visual content

Similar Documents

Publication Publication Date Title
WO2011069469A1 (en) Stereoscopic visualization system for surgery
Fergo et al. Three-dimensional laparoscopy vs 2-dimensional laparoscopy with high-definition technology for abdominal surgery: a systematic review
US9812052B2 (en) 2D/3D image displaying apparatus
KR101222975B1 (en) Three-dimensional image Display
JP3807721B2 (en) Image synthesizer
Schwab et al. Evolution of stereoscopic imaging in surgery and recent advances
US5510832A (en) Synthesized stereoscopic imaging system and method
EP2494402B1 (en) Stereo display systems
JP2020173481A (en) Generation of observation image of object region
Livatino et al. Stereoscopic visualization and 3-D technologies in medical endoscopic teleoperation
TWI357987B (en) A three-dimension image display device and a displ
US20070008314A1 (en) Stereoscopic image display device
Khoshabeh et al. Multiview glasses-free 3-D laparoscopy
TW201142356A (en) Display with adaptable parallax barrier
US9077982B2 (en) Device and method for displaying 3D image and device and method for receiving 3D image by using light of different wavelengths
CN108135740A (en) Surgical operation microscope, image processing apparatus and image processing method
US9408528B2 (en) Stereoscopic endoscope system
US20080291126A1 (en) Viewing direction image data generator, directional display image data generator, directional display device, directional display system, viewing direction image data generating method, and directional display image data generating method
TWI471607B (en) Hybrid multiplexed 3d display and displaying method of hybrid multiplexed 3d image
Kwon et al. High-definition 3D stereoscopic microscope display system for biomedical applications
Pastoor 3D Displays
Fergason et al. An innovative beamsplitter-based stereoscopic/3D display design
US8400493B2 (en) Virtual stereoscopic camera
Minami et al. Portrait and landscape mode convertible stereoscopic display using parallax barrier
GB2476245A (en) Stereoscopic display system for surgery

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10835504

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10835504

Country of ref document: EP

Kind code of ref document: A1