WO2008050823A1 - Medical image display device - Google Patents

Medical image display device

Info

Publication number
WO2008050823A1
WO2008050823A1 (PCT/JP2007/070775)
Authority
WO
WIPO (PCT)
Prior art keywords
image
displayed
display
display device
dimensional
Prior art date
Application number
PCT/JP2007/070775
Other languages
English (en)
Japanese (ja)
Inventor
Hiroto Kokubun
Takashi Shirahata
Osamu Miyazaki
Original Assignee
Hitachi Medical Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Medical Corporation filed Critical Hitachi Medical Corporation
Priority to JP2008541015A (JP5285427B2)
Publication of WO2008050823A1

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/05 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves
    • A61B5/055 Detecting, measuring or recording for diagnosis by means of electric currents or magnetic fields; Measuring using microwaves or radio waves involving electronic [EMR] or nuclear [NMR] magnetic resonance, e.g. magnetic resonance imaging
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B8/461 Displaying means of special interest
    • A61B8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display

Definitions

  • the present invention relates to a medical image display device, and more particularly to a medical image display device that efficiently presents a three-dimensional image of a specific region of a human body to an operator.
  • Medical imaging apparatuses include, for example, an X-ray CT apparatus, an X-ray apparatus, an MRI apparatus, and an ultrasonic image diagnostic apparatus.
  • For example, there is a method of extracting, by threshold separation processing or a region growing method, a three-dimensional region (an object) corresponding to the human body part to be observed, and displaying it three-dimensionally.
  • Patent Document 1 proposes a three-dimensional image display method capable of clearly displaying an arbitrary specific object by performing shading based on the CT pixel value and the magnitude of the pixel value gradient.
  • Patent Document 1: Japanese Patent Laid-Open No. 10-11604
  • However, although the method of Patent Document 1 can distinguish and display a specific object from other objects, it gives no consideration to displaying an optimal image when a plurality of objects are to be displayed clearly and the number of objects increases.
  • It is conceivable, for example, that a plurality of objects could be displayed clearly by synthesizing three-dimensional images in which specific objects are clearly displayed by the method disclosed in Patent Document 1.
  • However, since the operator cannot grasp the shape of an object registered in a list until it is actually displayed in 3D, the operator must decide whether or not to display each object while repeating the 3D display.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a medical image display apparatus that, by efficiently synthesizing a plurality of three-dimensional images showing a plurality of types of objects, reduces the operator's labor and the time required for image diagnosis using the synthesized image.
  • To achieve this object, the medical image display apparatus of the present invention comprises: image acquisition means for acquiring a plurality of three-dimensional images showing a plurality of types of objects to be diagnosed, extracted from image data of a subject acquired by a medical imaging apparatus; display means having two or more display areas for displaying the plurality of three-dimensional images; instruction means for selecting an arbitrary three-dimensional image from a single three-dimensional image displayed in each display area of the display means, or from a composite image obtained by combining and displaying a plurality of three-dimensional images, and for instructing movement of the selected three-dimensional image to another display area of the display means; and display control means for displaying the three-dimensional image instructed to move in the corresponding display area of the display means and for combining and displaying a plurality of three-dimensional images when they are to be displayed in one display area.
  • With this configuration, a plurality of three-dimensional images showing a plurality of types of objects are displayed on display means having two or more display areas. With the instruction means, one 3D image displayed in a display area, or an arbitrary 3D image selected from a composite image obtained by combining and displaying a plurality of 3D images, is selected, and an instruction is given to move the selected 3D image to another display area of the display means. The 3D image instructed to move is then displayed by the display control means in the corresponding display area of the display means.
  • Further, in the medical image display apparatus of the present invention, the instruction means selects an arbitrary three-dimensional image from the plurality of three-dimensional images constituting the composite image by associating at least one of the input time, the input count, and the input pressure of the signal input by the instruction means with the depth direction of the composite image. With this configuration, an arbitrary 3D image can be selected from the plurality of 3D images constituting the composite image through such an association.
  • Further, in the medical image display apparatus of the present invention, when the three-dimensional image instructed to move by the instruction means is moved out of the display area, the display control means deletes that three-dimensional image from the display area where it was displayed. With this configuration, when a three-dimensional image is moved out of the display area with the pointing device, an image with that three-dimensional image deleted is displayed in the display area where it had been shown.
  • Further, in the medical image display apparatus of the present invention, the display control means creates a cross-sectional image of the three-dimensional image selected by the instruction means at the position in the depth direction indicated by at least one of the input time, the number of inputs, and the input pressure of the signal input by the instruction means, and displays the created cross-sectional image superimposed on the composite image in the display area.
  • Further, in the medical image display apparatus of the present invention, until the instruction to move a three-dimensional image from the display area where it is currently displayed to another display area is confirmed, the display control means displays the three-dimensional image in the destination display area using a simplified rendering with a small amount of image processing. With this configuration, the combined three-dimensional image is displayed virtually instantaneously, so a composite image can be created while visually confirming each object, and three-dimensional images can be combined easily.
  • FIG. 1 is a hardware configuration diagram showing the overall configuration of a medical image display apparatus to which the present invention is applied.
  • FIG. 2 is a flowchart showing a processing flow of the first embodiment of the medical image display device.
  • FIG. 3 shows an example of a selection screen for selecting an object to be displayed in the display area of the first embodiment of the medical image display device.
  • FIG. 4 is a display example of the first embodiment of the medical image display device.
  • FIG. 5 is an explanatory diagram for explaining a method for creating a composite image of the first embodiment of the medical image display device.
  • FIG. 6 is an explanatory diagram for explaining a method of creating a composite image of the first embodiment of the medical image display device.
  • FIG. 7 is an explanatory diagram for explaining a method for creating a composite image of the first embodiment of the medical image display device.
  • FIG. 8 is an explanatory diagram for explaining a method for creating a composite image of the first embodiment of the medical image display device.
  • FIG. 9 is an explanatory diagram for explaining a method of creating a composite image of the first embodiment of the medical image display device.
  • FIG. 10 is an explanatory diagram for explaining a method for selecting an edit object on the composite image of the first embodiment of the medical image display device.
  • FIG. 11 is a diagram in which the edit object of the first embodiment of the medical image display device is highlighted.
  • FIG. 12 is a diagram in which a cross-sectional view of an editing object according to the first embodiment of the medical image display device is displayed so as to overlap with a composite image.
  • FIG. 13 is an explanatory diagram for explaining a method of creating a cross-sectional view of an editing object according to the first embodiment of the medical image display device.
  • FIG. 14 is an explanatory diagram for explaining a method of creating a composite image of the first embodiment of the medical image display device.
  • FIG. 15 is an explanatory diagram for explaining a method of creating a composite image of the first embodiment of the medical image display device.
  • FIG. 16 is an explanatory diagram for explaining a method of creating a composite image of the first embodiment of the medical image display device.
  • FIG. 17 is an explanatory diagram illustrating a method for deleting an object according to the first embodiment of the medical image display apparatus.
  • FIG. 18 is an explanatory diagram for explaining a method for deleting an object according to the first embodiment of the medical image display apparatus.
  • FIG. 19 is an explanatory diagram illustrating a method for deleting an object according to the first embodiment of the medical image display apparatus.
  • FIG. 20 is an explanatory diagram for explaining an example of a method for selecting an editing object on the composite image of the first embodiment of the medical image display device.
  • FIG. 21 is an explanatory diagram for explaining an example of a method for selecting an edit object on the composite image of the first embodiment of the medical image display device.
  • FIG. 1 is a hardware configuration diagram showing the overall configuration of the medical image display apparatus according to the first embodiment of the present invention.
  • As shown in FIG. 1, the medical image display device 10 is connected, via a network such as a LAN 3, to a medical imaging device 1 that captures images of a subject and to an image database 2 that stores the images captured by the medical imaging device 1.
  • The medical imaging device 1 is an apparatus capable of capturing an image (preferably a three-dimensional image) of a subject, such as an X-ray CT apparatus, an MRI apparatus, or an ultrasonic imaging apparatus. Although not illustrated, the medical image display device 10 may incorporate the medical imaging device 1 and the image database 2.
  • The medical image display device 10 mainly comprises: a central processing unit (CPU) 11 serving as a control device that controls the operation of each component; a main memory 12 that stores the control program for the device and provides a work area when the program is executed; a data recording device 13 that stores the operating system (OS), peripheral device drivers, and various application software including programs for processing such as chest wall thickness measurement; a display memory 14 that temporarily stores display data; a display 15, such as a CRT monitor or LCD monitor, that displays images based on the data from the display memory 14; a pointing device 17, such as a mouse, trackball, or touch panel, for operating soft switches on the display 15, together with its controller 16, which outputs the status signal of the pointing device 17 to the CPU 11; an external input device 18 for the operator to input instructions; and a bus 19 that connects the above components.
  • the CPU 11 reads the program from the data recording device 13, loads it into the main memory 12, and executes it.
  • In FIG. 1, the data recording device 13 is connected as a storage device separate from the main memory 12. As the data recording device 13, a memory built into or externally attached to the medical image display device 10, a storage device such as a magnetic disk, a device that writes data to and reads data from a removable external medium, a device that transmits and receives data to and from an external storage device via a network, or the like may be used.
  • FIG. 2 is a flowchart showing a flow of processing for displaying a plurality of three-dimensional images showing a plurality of types of objects of the medical image display device 10 in the display area.
  • the CPU 11 operates according to this flowchart.
  • The following processing starts after image data of the subject has been read from the medical imaging device 1 or the image database 2 into the medical image display device 10 and a plurality of 3D images (objects) corresponding to the regions to be observed in the subject have been extracted by a method such as threshold separation processing or a region growing method.
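  • As a rough illustration of this extraction step (a sketch only; the library choice, function name, and threshold values are assumptions, not taken from the patent), threshold separation followed by region growing can be written as:

    import numpy as np
    from scipy import ndimage  # connected-component labeling

    def extract_object(volume: np.ndarray, lo: float, hi: float,
                       seed: tuple[int, int, int]) -> np.ndarray:
        """Return a boolean mask of the object containing `seed`, using
        threshold separation followed by connected-component growth."""
        mask = (volume >= lo) & (volume <= hi)   # threshold separation
        labels, _ = ndimage.label(mask)          # label 3D connected regions
        seed_label = labels[seed]
        if seed_label == 0:
            raise ValueError("seed voxel falls outside the thresholded range")
        return labels == seed_label              # the region grown from the seed

    # e.g. bone in CT might use lo=300, hi=3000 (illustrative Hounsfield values)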
  • First, object selection processing (steps S10 to S12) is performed.
  • The body part to be displayed on the screen is selected with the pointing device 17 or the like (step S10). The selection is made on an initial screen such as that shown in FIG. 3: the operator presses the button of the part to be displayed (head, chest, abdomen, etc.) with the pointing device 17 to select the part, and presses the enter button to confirm the selection. In the following, the case where the head is selected will be described.
  • In step S11, an object to be displayed as a three-dimensional image is selected with the pointing device 17 or the like.
  • When the head is selected, buttons for selecting head objects are displayed as shown in FIG. 3. The operator presses the buttons of the objects to be displayed (skin, bone, tumor, blood vessel, brain, etc.) with the pointing device 17 to select them, and presses the enter button to confirm the selection.
  • For example, to display only bone, the enter button may be pressed while "bone" is selected by pressing the "bone" button; to display the tumor and the brain, the enter button may be pressed while both are selected by pressing the two buttons "tumor" and "brain".
  • In the following, the case where four types of objects (skin, bone, tumor, and blood vessel) are selected will be described.
  • Next, it is determined whether the objects selected in step S11 include a plurality of objects to be displayed as one 3D image (step S12).
  • If every 3D image to be displayed consists of a single object (NO in step S12), the object selection processing ends.
  • If a plurality of objects are to be displayed as one 3D image (YES in step S12), a composite image is created by synthesizing those objects at anatomically matching positions (step S13).
  • Next, the number of three-dimensional images selected in step S11 is counted (step S14), and display areas for the selected three-dimensional images are set on the display 15 (step S15). For example, an m × n arrangement of display areas (m and n natural numbers) is set such that the total number of display areas is the smallest number that can accommodate the selected three-dimensional images. The number of display areas is not limited as long as it is two or more.
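  • The layout rule above can be made concrete with a small helper (a sketch; the function name and the near-square tie-breaking are assumptions) that finds the smallest m × n grid holding the selected images:

    import math

    def smallest_grid(num_images: int) -> tuple[int, int]:
        """Return (m, n) with m * n >= num_images and m * n minimal,
        preferring a near-square grid among equal-area layouts."""
        best = None
        for m in range(1, num_images + 1):
            n = math.ceil(num_images / m)
            key = (m * n, abs(m - n))  # smallest area first, then squarest
            if best is None or key < best[0]:
                best = (key, (m, n))
        return best[1]

    # smallest_grid(4) -> (2, 2); smallest_grid(5) -> (1, 5), since 5 areas suffice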
  • The three-dimensional images selected in step S11 are then placed in the set display areas (step S16).
  • In the example of FIG. 4, four display areas are set, and a skin object, a bone object, a tumor object, and a blood vessel object are displayed in the respective display areas.
  • The image displayed in each display area may be a single object per display area, a composite image per display area, or a mixture in which some display areas show a single object and others show a composite image.
  • In FIG. 4, the three-dimensional images are displayed by volume rendering processing, but other three-dimensional display methods, such as surface rendering processing, maximum value projection processing, minimum value projection processing, and pixel value integration processing, may also be used.
  • In FIG. 4, the display area name (area A, etc.) and the object name (skin, etc.) are displayed together with the object in each display area, but only the 3D image may be displayed, or the 3D image together with either the display area name or the object name.
  • Next, the process of selecting an object to be combined (an edit object) from the 3D images displayed in the display areas of the medical image display device 10 and creating a composite image by combining the edit object with other objects will be described.
  • The following process is performed by the CPU 11 after the process of FIG. 2, which displays a plurality of three-dimensional images showing a plurality of types of objects in the display areas, is completed.
  • When the pointing device 17 is pressed, the object pointed to by the pointer is selected as the edit object. For example, in FIG. 5, since the pointer is in area B when the pointing device 17 is pressed, the bone object is selected as the edit object.
  • Next, the pointer is moved with the edit object held selected by the pointing device 17 (drag operation), thereby moving the edit object. The edit object is moved into the display area where the object to be combined with it is displayed, and the pointing device 17 is released (drop operation).
  • This instructs the CPU 11 to combine and display the edit object and the object displayed in the display area where the edit object was dropped. For example, in FIG. 6, dragging the bone object displayed in area B to area D inputs to the CPU 11 an instruction to combine and display the bone object and the blood vessel object displayed in area D.
  • As a result, the blood vessel object and the bone object are combined and displayed in area D.
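  • The bookkeeping behind this drag-and-drop can be modeled minimally as display areas holding lists of object names, where a drop moves the edit object between areas (and, as described later, a drop outside all areas deletes it). The class and method names are illustrative assumptions; rendering of the resulting composite is out of scope:

    class DisplayAreas:
        """Toy model of display-area contents."""
        def __init__(self, areas: dict[str, list[str]]):
            self.areas = areas  # e.g. {"A": ["skin"], "B": ["bone"], "D": ["vessel"]}

        def drop(self, obj: str, src: str, dst: str | None) -> None:
            """Move `obj` from area `src` to `dst`; dst=None means the drop
            landed outside every display area, so the object is deleted."""
            self.areas[src].remove(obj)       # leaving a composite decomposes it
            if dst is not None:
                self.areas[dst].append(obj)   # >1 object => composite display

    areas = DisplayAreas({"A": ["skin"], "B": ["bone"], "D": ["vessel"]})
    areas.drop("bone", "B", "D")   # area D now holds a bone + vessel composite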
  • In the following, the case where the blood vessel object is selected from this composite image and a composite image of the blood vessel object and the skin object displayed in area A is displayed in area A will be described as an example.
  • The composite image displayed in area D may be one obtained by combining individual objects that were displayed in separate display areas, or it may be a composite image from the beginning, created by selecting two objects at the object selection stage, that is, a composite image synthesized in step S13 of FIG. 2.
  • First, the pointer, that is, the cursor indicating the designated point A(x, y), is moved to the desired position on the object. With the pointer at the desired position on the object, the pointing device 17 is held down to fix the position of the cursor, thereby specifying the designated point A(x, y).
  • FIG. 10(a) is a side view of the composite image displayed in FIG. 9. When objects overlap, the edit object cannot be determined automatically from the designated point A(x, y) alone.
  • One approach is to select the object closest in the depth direction (z direction) as the edit object, but a problem can arise depending on how the composite is displayed.
  • As in FIG. 10(a), when a composite image is created in which the opacity of the bone object is lowered so that the blood vessel object shows through it, the blood vessel object is what is clearly displayed in the display area. The above method nevertheless selects the bone object positioned in front of the blood vessel object, so an object different from the operator's intention is selected, which is inconvenient.
  • One method of avoiding this is to take the amount of reflected light of each object into account and select as the edit object the object with the largest contribution to the integrated reflected light.
  • Volume rendering is a method of creating a three-dimensional image by casting virtual rays onto the display object and integrating the reflected light.
  • By selecting the object whose contribution to the integrated reflected light is largest, the edit object is unlikely to differ from the object the operator intends, even when multiple objects are combined and displayed with different opacities; the intended object is selected automatically.
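  • This contribution rule matches standard front-to-back compositing, where each sample's weight is its opacity attenuated by the transparency accumulated in front of it. A sketch (the sampling scheme and data layout are assumptions):

    def object_with_max_contribution(samples: list[tuple[str, float]]) -> str | None:
        """samples: (object_name, alpha) pairs ordered front-to-back along the
        virtual ray through the designated point. The object accumulating the
        largest composited weight is chosen as the edit object."""
        contribution: dict[str, float] = {}
        transparency = 1.0
        for name, alpha in samples:
            contribution[name] = contribution.get(name, 0.0) + transparency * alpha
            transparency *= 1.0 - alpha
        return max(contribution, key=contribution.get) if contribution else None

    # A translucent bone in front of a nearly opaque vessel: the vessel wins.
    ray = [("bone", 0.1)] * 5 + [("vessel", 0.9)] * 3
    assert object_with_max_contribution(ray) == "vessel"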
  • When the display method is maximum value projection, the same can be achieved by selecting the object containing the maximum value in the depth direction; for minimum value projection, by selecting the object containing the minimum value in the depth direction.
  • The above method selects as the edit object the object with the largest contribution to the integrated reflected light, taking the amount of reflected light of each object into account. Another method selects the edit object using the input time of the signal input by the operator.
  • The input time is, for example, the duration for which the pointing device 17 is clicked when the designated point A(x, y) is selected.
  • The range in the depth direction (z direction) is divided and, as shown in FIG. 10(b), a selection area is set for each object. The boundary of a selection area may be set at the center position of the object, or the range from the position coordinates of an object to the point where the next object appears in the depth direction may be used.
  • In FIG. 10(b), the bone A object is selected when the click time is 1.0 second, and the blood vessel object is selected when the click time is 1.5 seconds. Thus, by selecting as the edit object the object in the selection area corresponding to the input time of the signal input at the designated point A(x, y), the object the operator intends to select is selected automatically.
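  • This time-to-depth mapping can be sketched as follows; the depth-per-second constant and the selection-area boundaries are assumptions chosen so that the 1.0 s / 1.5 s example above holds:

    def select_by_click_time(selection_areas: list[tuple[str, float, float]],
                             click_seconds: float,
                             depth_per_second: float = 40.0) -> str | None:
        """selection_areas: (name, z_near, z_far) ordered front-to-back.
        The click duration advances a virtual depth; the object whose
        selection area contains that depth is chosen, clamping to the
        deepest object if the click is held past the last boundary."""
        if not selection_areas:
            return None
        depth = click_seconds * depth_per_second
        for name, z_near, z_far in selection_areas:
            if z_near <= depth < z_far:
                return name
        return selection_areas[-1][0]

    areas_z = [("bone A", 0.0, 50.0), ("blood vessel", 50.0, 90.0)]
    assert select_by_click_time(areas_z, 1.0) == "bone A"        # depth 40
    assert select_by_click_time(areas_z, 1.5) == "blood vessel"  # depth 60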
  • The selection position may be set to stop at the innermost position in the depth direction, or to return toward the front after reaching the innermost position. It is also possible to advance in the depth direction by left-clicking the pointing device 17 and to return toward the front by right-clicking, so that any object can be selected.
  • With the above methods the edit object can be selected as the operator intends, but depending on the front-to-back relationship of the selection candidate objects, the current edit object candidate may not be visually apparent to the operator.
  • If the selected edit object is displayed so as to be distinguishable from the other objects, the operator can recognize the edit object being selected at a glance, which makes it possible to create composite images efficiently.
  • One way to make the edit object distinguishable is to highlight it.
  • FIG. 11 shows the blood vessel object highlighted by drawing its outline with a bold line in the composite image of the bone object and the blood vessel object.
  • Other highlighting methods include coloring (displaying the outline and interior of the edit object in a color different from the other objects), patterning (applying a pattern to the interior of the edit object), and blinking (displaying the edit object blinking).
  • Another way is to display a cross-sectional image of the edit object superimposed on the composite image so that the edit object is distinguished from the other objects.
  • FIG. 12 shows, in a composite image of a bone object and a blood vessel object, a cross-sectional image of the blood vessel object superimposed on the composite image.
  • FIG. 13 shows a method of determining the cross-sectional position at which the cross-sectional image of the edit object is created.
  • Since the coordinate in the depth direction (z direction) is determined by the click time of the designated point A(x, y) input with the pointing device 17, the cross-sectional position is determined at the same time as the edit object is selected. As shown in FIG. 13, when the click time is 0.5 seconds, the bone A object is selected as the edit object and z1 is determined as the cross-sectional position; when the click time is 1.5 seconds, the blood vessel object is selected as the edit object and z2 is determined as the cross-sectional position.
  • FIG. 12 shows a display example when the blood vessel object is selected as the editing object and z2 is determined as the cross-sectional position.
  • Because the cross-sectional image of the blood vessel object is displayed overlaid on the composite image, the bone A object located in front of the blood vessel object is cut away in the display, and the selected blood vessel object can be recognized easily.
  • In this way, displaying the cross-sectional image of the edit object superimposed on the composite image makes the edit object easy to distinguish from the other objects, and the operator can select the edit object easily.
  • The method of determining the cross-sectional position is not limited to the above; the cross-sectional position may be updated successively as the click time is extended. For example, if a click is first held for 0.5 seconds, the cross section of the bone A object at z1 is determined; if the click is then held for a further 1.0 second, the total click time becomes 1.5 seconds and the cross section of the blood vessel object at z2 is determined.
  • In the above, a cross-sectional image orthogonal to the depth direction (z direction) is displayed, but the invention is not limited to this; a cross-sectional image at an angle at which the selection candidate object is clearly displayed may be shown instead.
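  • Since the same time-to-depth mapping fixes the cut plane, extracting the displayed cross-section reduces to indexing the volume at the reached depth. A sketch (the (z, y, x) array layout and the mapping constant are assumptions):

    import numpy as np

    def section_at_click(volume: np.ndarray, total_click_seconds: float,
                         depth_per_second: float = 40.0) -> np.ndarray:
        """Return the slice orthogonal to the depth axis (z) at the depth
        reached by the accumulated click time; holding the click longer
        pushes the cut plane deeper, as in the 0.5 s -> z1, 1.5 s -> z2 example."""
        z = int(total_click_seconds * depth_per_second)
        z = min(max(z, 0), volume.shape[0] - 1)  # clamp inside the volume
        return volume[z]

    vol = np.zeros((100, 64, 64), dtype=np.float32)
    slice_z1 = section_at_click(vol, 0.5)  # z = 20
    slice_z2 = section_at_click(vol, 1.5)  # z = 60, after a further 1.0 s click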
  • Next, the pointer is moved with the edit object held selected by the pointing device 17 (drag operation), thereby moving the edit object. The edit object is moved into the display area where the object to be combined with it is displayed, and the pointing device 17 is released (drop operation).
  • This inputs to the CPU 11 an instruction to remove the edit object from the composite image in which it was displayed, to combine the edit object with the object displayed in the display area where it was dropped so that they match anatomically, and to display the result in the drop destination display area (composite display).
  • For example, when the blood vessel object in the composite image displayed in area D is determined as the edit object and dragged from area D to area A, the CPU 11 deletes the blood vessel object from the composite image displayed in area D, combines the blood vessel object with the skin object displayed in area A so that they match anatomically, and displays the composite image in area A.
  • Since this series of operations for creating and displaying the composite image is performed with the pointing device 17, the operator can create a composite image intuitively and in a short working time.
  • During the drag, the destination area may show the prospective composite by simplified processing, which refers to image processing that draws at high speed by reducing the amount of image processing, for example by lowering the resolution.
  • FIG. 15 shows an example in which, by dragging the bone object displayed in area B over area D, a simplified composite image of the bone object and the blood vessel object is displayed in area D.
  • A composition condition may also be set for the composite display.
  • FIG. 16 shows an example in which, when the blood vessel object is selected as the edit object and the bone object and the blood vessel object are combined and displayed, the composition ratio of the bone object and the blood vessel object is changed according to the time taken for the drag operation: the longer the blood vessel object is dragged, the lower the opacity of the bone object and the higher the opacity of the blood vessel object.
  • By changing the composition ratio between objects according to the drag time in this way, the operator can create the composite image while visually checking the simplified composite display, so three-dimensional images can be combined easily.
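  • The composition-ratio rule of FIG. 16 amounts to a simple ramp over drag time; in this sketch the ramp length is an illustrative constant, not a value from the patent:

    def blend_opacities(drag_seconds: float,
                        ramp_seconds: float = 4.0) -> tuple[float, float]:
        """Return (bone_alpha, vessel_alpha) for the simplified preview:
        the longer the vessel is dragged, the more transparent the bone
        and the more opaque the vessel."""
        t = min(max(drag_seconds / ramp_seconds, 0.0), 1.0)
        return 1.0 - t, t  # bone fades out, vessel fades in

    # During the drag loop the preview would recomposite with these alphas:
    for t_drag in (0.0, 1.0, 2.0, 4.0):
        print(t_drag, blend_opacities(t_drag))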
  • When an object displayed in a display area is no longer needed, it can be deleted. The object is deleted from the display area by pointing at it with the pointer and dragging and dropping it outside the display area. For example, the tumor object displayed in area C is deleted by dragging and dropping it outside the display area.
  • An object included in a composite image can be deleted in the same way. When the bone object included in the composite image displayed in area D is selected as the edit object and dragged and dropped outside the display area, the bone object is deleted from the composite image displayed in area D, and only the blood vessel object remains displayed in area D.
  • A composite image can also be decomposed again by the same procedure. As shown in FIG. 19, by selecting the bone object included in the composite image displayed in area D and moving it from area D to area B, the composite image in area D is decomposed: the blood vessel object is displayed in area D and the bone object in area B.
  • In this way, since an image with the 3D image deleted is displayed in the display area from which the 3D image was moved, a plurality of objects can be combined and separated easily, and the target composite image can be composed efficiently.
  • In the above, an example was shown in which a pointing device is used as the instruction means to select and move objects, but the same effect can be obtained with other input means having an equivalent function, or by combining key inputs by the operator. The number of objects selected at a time is also not limited to one.
  • For example, when the composite image displayed in area D (the blood vessel object and the bone object) is dragged and dropped onto area A, where the skin object is displayed, a composite image of the skin object, the blood vessel object, and the bone object is displayed in area A. A plurality of desired objects can be selected from the objects included in a composite image by combining pointer operations with the pointing device and operations with the external input device.
  • In the above, the objects to be displayed in the display areas are selected using the object names, but the objects to be displayed may instead be selected automatically using a criterion such as the CT value or whether the image was acquired by angiography, and the automatically selected objects may then be displayed automatically in the display areas.
  • In the above, the click time of the pointing device when selecting the designated point A(x, y) is used, but the invention is not limited to this; the number of clicks of the pointing device or the click pressure of the pointing device may be used instead.
  • When the number of clicks is used, objects may be selected sequentially in the depth direction according to the count, for example the bone A object when the number of clicks is 1, the blood vessel object when it is 2, and the bone B object when it is 3.
  • When the click pressure is used, the pressure is detected by a pressure sensor installed inside the pointing device. For example, as shown in FIG. 21, the bone A object may be selected when the click pressure is 2 kPa and the blood vessel object when it is 3 kPa.
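  • The click-count and click-pressure variants reduce to the same front-to-back lookup, indexing the object order by count or thresholding a pressure reading. In this sketch, wrapping the count past the deepest object and the pressure bands around the 2 kPa / 3 kPa example are assumptions:

    def select_by_click_count(ordered_objects: list[str], clicks: int) -> str:
        """Each extra click steps one object deeper (1 = frontmost)."""
        return ordered_objects[(clicks - 1) % len(ordered_objects)]

    def select_by_pressure(pressure_kpa: float) -> str:
        """Illustrative pressure bands following FIG. 21."""
        return "bone A" if pressure_kpa < 2.5 else "blood vessel"

    order = ["bone A", "blood vessel", "bone B"]
    assert select_by_click_count(order, 2) == "blood vessel"
    assert select_by_click_count(order, 3) == "bone B"
    assert select_by_pressure(2.0) == "bone A"
    assert select_by_pressure(3.0) == "blood vessel"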

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Optics & Photonics (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The invention relates to a medical image display device that efficiently synthesizes a plurality of three-dimensional images in order to reduce the operator's labor and shorten the time required for image diagnosis using the synthesized image. Objects or synthesized images are displayed in two or more display areas. The medical image display device selects an object displayed in a display area with a pointing device and moves the selected object to another display area. The moved object is synthesized with an object displayed in the other area, and the synthesized object is displayed in the corresponding area. Furthermore, when a plurality of objects are to be displayed in one display area, the plurality of objects are synthesized and the synthesized objects are displayed in that display area.
PCT/JP2007/070775 2006-10-26 2007-10-25 Medical image display device WO2008050823A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008541015A JP5285427B2 (ja) 2006-10-26 2007-10-25 Medical image display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-291647 2006-10-26
JP2006291647 2006-10-26

Publications (1)

Publication Number Publication Date
WO2008050823A1 true WO2008050823A1 (fr) 2008-05-02

Family

Family ID: 39324609

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/070775 WO2008050823A1 (fr) 2006-10-26 2007-10-25 Medical image display device

Country Status (2)

Country Link
JP (1) JP5285427B2 (fr)
WO (1) WO2008050823A1 (fr)



Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11299782A * 1998-04-24 1999-11-02 Toshiba Iyo System Engineering Kk Medical image processing apparatus
JP2000353255A * 1999-06-11 2000-12-19 Ge Yokogawa Medical Systems Ltd Image display method and image diagnostic apparatus
JP2005185405A * 2003-12-25 2005-07-14 Ziosoft Inc Medical image processing apparatus, region-of-interest extraction method, and program
JP2007282656A * 2006-04-12 2007-11-01 Toshiba Corp Medical image display device

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010238097A (ja) * 2009-03-31 2010-10-21 Sony Ericsson Mobilecommunications Japan Inc Image compositing apparatus, image compositing control method, and image compositing control program
JP2014171656A (ja) * 2013-03-08 2014-09-22 Toshiba Corp Medical image analysis result display device
US10062184B2 2013-07-31 2018-08-28 Fujifilm Corporation Medical image display control device, method of operation for same, and medical image display control program
WO2015015777A1 (fr) * 2013-07-31 2015-02-05 Fujifilm Corporation Medical image display control device, operating method therefor, and medical image display control program
JP2015029518A (ja) * 2013-07-31 2015-02-16 Fujifilm Corporation Medical image display control device, operating method therefor, and medical image display control program
JP2019103818A (ja) * 2014-12-26 2019-06-27 Nemoto Kyorindo Co., Ltd. Medical image display terminal and medical image display program
JP2016123584A (ja) * 2014-12-26 2016-07-11 Nemoto Kyorindo Co., Ltd. Medical image display terminal and medical image display program
JP2023015213A (ja) * 2017-03-30 2023-01-31 Hologic, Inc. System and method for targeted object enhancement to generate synthetic breast tissue images
JP2021000206A (ja) * 2019-06-20 2021-01-07 Nihon Kohden Corporation Biological information display device
JP7319103B2 (ja) 2019-06-20 2023-08-01 Nihon Kohden Corporation Biological information display device
JP2021148711A (ja) * 2020-03-23 2021-09-27 Konica Minolta Japan, Inc. Medical image display device, medical image display method, and program
JP7447595B2 (ja) 2020-03-23 2024-03-12 Konica Minolta, Inc. Medical image display device, medical image display method, and program
JP2021006261A (ja) * 2020-09-15 2021-01-21 Nemoto Kyorindo Co., Ltd. Medical image display terminal and medical image display program
JP7107590B2 (ja) 2020-09-15 2022-07-27 Nemoto Kyorindo Co., Ltd. Medical image display terminal and medical image display program

Also Published As

Publication number Publication date
JPWO2008050823A1 (ja) 2010-02-25
JP5285427B2 (ja) 2013-09-11

Similar Documents

Publication Publication Date Title
WO2008050823A1 (fr) Medical image display device
JP4798712B2 (ja) Medical image display apparatus and method, and program
US9078566B2 Dual display CT scanner user interface
JP4820680B2 (ja) Medical image display device
US7603182B2 Ultrasonograph, work flow edition system, and ultrasonograph control method
CN104516627B Display device and image display method using the same
US9078565B2 Dual display CT scanner user interface
CN101896124B Method for creating a protocol in a diagnostic imaging system
JPH10137190A (ja) Medical image processing apparatus
WO2007142607A2 (fr) Method and system for selective visualization and interaction with three-dimensional image data in a tunnel viewer
CA3017071A1 (fr) Display of information relating to a skin lesion
JPWO2012132840A1 (ja) Image management apparatus, method of operating image management apparatus, image management program, and capsule endoscope system
JPH10155746A (ja) Medical image display method and system
JP2007029341A (ja) Medical image display device
WO2007122896A1 (fr) Medical image display system and medical image display program
EP3998038A1 (fr) Multiple bone density display method for establishing an implant procedure plan, and image processing device therefor
WO2007141995A1 (fr) Display processing device
JP2015016067A (ja) Image display method, apparatus, and program
JP2003116838A (ja) Medical image diagnostic apparatus and medical image diagnostic system
JP3989896B2 (ja) Medical image processing apparatus, region-of-interest extraction method, and program
US20140095993A1 Medical image display apparatus, medical image display method, and recording medium
JP4202461B2 (ja) Medical image display device
JP2008119252A (ja) Medical image generating apparatus, method, and program
JP2009080545A (ja) Medical information processing apparatus
JP2016001393A (ja) Medical image display apparatus, control method therefor, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07830509

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2008541015

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07830509

Country of ref document: EP

Kind code of ref document: A1