WO2018008664A1 - Control device, control method, control system, and program - Google Patents


Info

Publication number
WO2018008664A1
Authority
WO
WIPO (PCT)
Prior art keywords
probe
image
photoacoustic
unit
information
Prior art date
Application number
PCT/JP2017/024575
Other languages
French (fr)
Japanese (ja)
Inventor
加藤 謙介
野歩 宮沢
浩 荒井
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016229311A external-priority patent/JP2018011927A/en
Application filed by キヤノン株式会社 filed Critical キヤノン株式会社
Priority to CN201780042494.1A priority Critical patent/CN109414254A/en
Publication of WO2018008664A1 publication Critical patent/WO2018008664A1/en
Priority to US16/239,330 priority patent/US20190150894A1/en


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13: Tomography

Definitions

  • the present invention relates to a control device, a control method, a control system, and a program.
  • Patent Document 1 discloses a photoacoustic measurement device capable of switching, by an operation on a mode switch provided on the probe, between an operation mode including detection of a photoacoustic signal and an operation mode not including detection of a photoacoustic signal.
  • In an imaging apparatus that acquires an ultrasonic signal and a photoacoustic signal, imaging is performed while switching operation modes related to the detection of the ultrasonic signal and the photoacoustic signal.
  • To switch operation modes, the user may have to interrupt the operation of the probe. There is a possibility that the user cannot observe a desired image because of body movement of the subject or displacement of the probe position during the interruption.
  • The control device disclosed in this specification includes: a first acquisition means for acquiring an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject and outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation of the subject; a second acquisition means for acquiring information related to the movement of the probe; and a display control means for displaying, on a display unit, a photoacoustic image generated using the photoacoustic signal, based on the information related to the movement of the probe.
  • This reduces the trouble of performing an operation to switch the operation mode related to the detection of the ultrasonic signal and the photoacoustic signal.
  • An acoustic wave generated by irradiating a subject with light and causing expansion inside the subject is referred to as a photoacoustic wave.
  • an acoustic wave transmitted from the transducer or a reflected wave (echo) in which the transmitted acoustic wave is reflected inside the subject is referred to as an ultrasonic wave.
  • an imaging method using ultrasonic waves and an imaging method using photoacoustic waves are used.
  • In the imaging method using ultrasonic waves, for example, ultrasonic waves transmitted from the transducer are reflected by tissue inside the subject according to differences in acoustic impedance, and an image is generated based on the time until the reflected wave returns to the transducer and the intensity of the reflected wave, as in the example below.
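  • As a worked illustration (an addition for clarity, not part of the original disclosure), assuming a typical speed of sound in soft tissue of roughly c ≈ 1540 m/s, the depth d of a reflector follows from the round-trip echo time t as

        d = \frac{c\,t}{2}, \qquad \text{e.g.}\; d = \frac{1540\ \mathrm{m/s} \times 65\ \mu\mathrm{s}}{2} \approx 5\ \mathrm{cm}.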
  • An image imaged using ultrasound is hereinafter referred to as an ultrasound image.
  • the user can operate while changing the angle of the probe and observe ultrasonic images of various cross sections in real time. Ultrasound images depict the shapes of organs and tissues and are used to find tumors.
  • The imaging method using photoacoustic waves is a method of generating an image based on, for example, ultrasonic waves (photoacoustic waves) generated by adiabatic expansion of tissue inside a subject irradiated with light.
  • An image imaged using the photoacoustic wave is hereinafter referred to as a photoacoustic image.
  • In the photoacoustic image, information related to optical characteristics, such as the degree of light absorption of each tissue, is depicted.
  • In photoacoustic images, for example, it is known that blood vessels can be depicted owing to the optical characteristics of hemoglobin, and the use of such images for evaluating the malignancy of tumors is being studied.
  • various information may be collected by imaging different phenomena on the same part of the subject based on different principles.
  • For example, diagnosis of cancer is performed by combining morphological information obtained from a CT (Computed Tomography) image and functional information relating to metabolism obtained from a PET (Positron Emission Tomography) image.
  • It is considered effective for improving diagnostic accuracy to perform diagnosis using information obtained by imaging different phenomena based on different principles.
  • an imaging device for obtaining an image obtained by combining the respective characteristics has been studied.
  • both an ultrasonic image and a photoacoustic image are imaged using ultrasonic waves from a subject
  • It is conceivable that the user wants to operate the probe in the same manner as in conventional ultrasonic imaging. That is, it is conceivable that the user places the probe on the surface of the subject and operates it while observing an image displayed based on the information acquired by the probe. At that time, if the operation mode related to signal acquisition or image display is switched via, for example, a switch provided on the probe or an input device provided on the console of the imaging apparatus, the user must interrupt the probe operation performed while observing the image. For this reason, body movement of the subject may occur during the operation input to the switch or the input device of the console, or the probe position may shift.
  • An object of the first embodiment is to provide a control device that can switch an image to be displayed without deteriorating operability when a user observes an image.
  • FIG. 1 is a diagram illustrating an example of a configuration of a system including a control device 101 according to the first embodiment.
  • An imaging system 100 that can generate an ultrasonic image and a photoacoustic image is connected to various external devices via a network 110.
  • Each configuration and various external devices included in the imaging system 100 do not need to be installed in the same facility, and may be connected to be communicable.
  • the imaging system 100 includes a control device 101, a probe 102, a detection unit 103, a display unit 104, and an operation unit 105.
  • The control device 101 is a device that can acquire an ultrasonic signal and a photoacoustic signal from the probe 102 and display an ultrasonic image and a photoacoustic image on the display unit 104 based on information related to the movement of the probe 102 acquired from the detection unit 103.
  • the control device 101 acquires information related to an examination including imaging of an ultrasonic image and a photoacoustic image from the ordering system 112, and controls the probe 102, the detection unit 103, and the display unit 104 when the examination is performed.
  • the control device 101 outputs the generated ultrasonic image, photoacoustic image, and superimposed image obtained by superimposing the photoacoustic image on the ultrasonic image to the PACS 113.
  • the control device 101 transmits and receives information to and from external devices such as the ordering system 112 and the PACS 113 in accordance with standards such as HL7 (Health level 7) and DICOM (Digital Imaging and Communications in Medicine). Details of processing performed by the control device 101 will be described later.
  • the probe 102 is operated by a user and transmits an ultrasonic signal and a photoacoustic signal to the control device 101.
  • the probe 102 includes a transmission / reception unit 106 and an irradiation unit 107.
  • the probe 102 transmits an ultrasonic wave from the transmission / reception unit 106 and receives the reflected wave by the transmission / reception unit 106. Further, the probe 102 irradiates the subject with light from the irradiation unit 107, and the photoacoustic wave is received by the transmission / reception unit 106.
  • the probe 102 converts the received reflected wave and photoacoustic wave into an electric signal, and transmits it to the control device 101 as an ultrasonic signal and a photoacoustic signal.
  • It is preferable that the probe 102 be controlled so that, when information indicating contact with the subject is received, transmission of ultrasonic waves for acquiring an ultrasonic signal and light irradiation for acquiring a photoacoustic signal are executed.
  • The probe 102 acquires an ultrasonic signal and a photoacoustic signal; it may acquire these alternately, simultaneously, or in a predetermined manner.
  • the transmission / reception unit 106 includes at least one transducer (not shown), a matching layer (not shown), a damper (not shown), and an acoustic lens (not shown).
  • the transducer (not shown) is made of a material exhibiting a piezoelectric effect, such as PZT (lead zirconate titanate) or PVDF (polyvinylidene difluoride).
  • The transducer (not shown) may be other than a piezoelectric element; for example, a capacitive micromachined ultrasonic transducer (CMUT) or a transducer using a Fabry-Perot interferometer may be used.
  • the ultrasonic signal is composed of frequency components of 2 to 20 MHz and the photoacoustic signal is composed of frequency components of 0.1 to 100 MHz, and a transducer (not shown) that can detect these frequencies is used.
  • the signal obtained by the transducer (not shown) is a time-resolved signal.
  • the amplitude of the received signal represents a value based on the sound pressure received by the transducer at each time.
  • the transmission / reception unit 106 includes a circuit (not shown) or a control unit for electronic focusing.
  • the array form of transducers (not shown) is, for example, a sector, a linear array, a convex, an annular array, or a matrix array.
  • the transmitting / receiving unit 106 may include an amplifier (not shown) that amplifies a time-series analog signal received by a transducer (not shown).
  • the transmission / reception unit 106 may include an A / D converter that converts a time-series analog signal received by a transducer (not shown) into a time-series digital signal.
  • the transducer (not shown) may be divided into a transmitter and a receiver depending on the purpose of imaging an ultrasonic image. Further, the transducer (not shown) may be divided into an ultrasonic image capturing unit and a photoacoustic image capturing unit.
  • the irradiation unit 107 includes a light source (not shown) for acquiring a photoacoustic signal and an optical system (not shown) that guides pulsed light emitted from the light source (not shown) to the subject.
  • the pulse width of light emitted from a light source (not shown) is, for example, 1 ns or more and 100 ns or less.
  • The wavelength of the light emitted by the light source (not shown) is, for example, 400 nm or more and 1600 nm or less.
  • A wavelength of 400 nm or more and 700 nm or less, which is strongly absorbed in blood vessels, is preferable in some cases.
  • In other cases, a wavelength of 700 nm or more and 1100 nm or less, which is not easily absorbed by tissues such as water and fat, is preferable.
  • the light source (not shown) is, for example, a laser or a light emitting diode.
  • the irradiation unit 107 may use a light source that can convert wavelengths in order to acquire a photoacoustic signal using light of a plurality of wavelengths.
  • the irradiation unit 107 may include a plurality of light sources that generate light of different wavelengths, and may be configured to be able to irradiate light of different wavelengths alternately from each light source.
  • the laser is, for example, a solid laser, a gas laser, a dye laser, or a semiconductor laser.
  • a pulsed laser such as an Nd: YAG laser or an alexandrite laser may be used.
  • A Ti:sapphire (Ti:sa) laser or an OPO (optical parametric oscillator) laser that uses Nd:YAG laser light as excitation light may be used as the light source (not shown).
  • a microwave source may be used as a light source (not shown).
  • optical elements such as lenses, mirrors, and optical fibers are used.
  • the optical system may include a diffusion plate that diffuses the emitted light.
  • the optical system may include a lens or the like so that the beam can be focused.
  • the detection unit 103 acquires information regarding the displacement of the probe 102.
  • the detection unit 103 will be described using an example in which the detection unit 103 includes the magnetic transmitter 503 and the magnetic sensor 502 illustrated in FIG. 5.
  • The detection unit 103 acquires, for example, information indicating the speed of movement of the probe 102 relative to the subject, information regarding the speed of rotation of the probe 102, and information indicating the degree of pressing against the subject as information regarding the movement of the probe 102.
  • the detection unit 103 transmits the acquired information regarding the movement of the probe 102 to the control device 101.
  • the display unit 104 displays an image captured by the imaging system 100 and information related to the inspection based on control from the control device 101.
  • the display unit 104 provides an interface for receiving user instructions based on control from the control device 101.
  • the display unit 104 is a liquid crystal display, for example.
  • the operation unit 105 transmits information related to user operation input to the control apparatus 101.
  • the operation unit 105 is, for example, a keyboard, a trackball, and various buttons for performing operation inputs related to inspection.
  • the display unit 104 and the operation unit 105 may be integrated as a touch panel display. Further, the control device 101, the display unit 104, and the operation unit 105 do not need to be separate devices, and these configurations may be integrated as in the console 501 in FIG.
  • the control device 101 may have a plurality of probes.
  • a HIS (Hospital Information System) 111 is a system that supports hospital operations.
  • the HIS 111 includes an electronic medical record system, an ordering system, and a medical accounting system.
  • the ordering system of the HIS 111 transmits order information to the ordering system 112 for each department.
  • the ordering system 112 which will be described later, manages the execution of the order.
  • the ordering system 112 is a system that manages inspection information and manages the progress of each inspection in the imaging apparatus.
  • the ordering system 112 may be configured for each department that performs inspection.
  • the ordering system 112 is, for example, RIS (Radiology Information System) in the radiation department.
  • RIS Radiology Information System
  • the ordering system 112 transmits information on examinations performed by the imaging system 100 to the control apparatus 101.
  • the ordering system 112 receives information related to the progress of the inspection from the control device 101.
  • the ordering system 112 transmits information indicating that the inspection is completed to the HIS 111.
  • the ordering system 112 may be integrated into the HIS 111.
  • a PACS (Picture Archiving and Communication System) 113 is a database system that holds images obtained by various imaging devices inside and outside the facility.
  • The PACS 113 includes a storage unit (not shown) that stores medical images together with their imaging conditions, incidental parameters such as image processing parameters including reconstruction parameters, and patient information, and a controller (not shown) that manages the information stored in the storage unit.
  • the PACS 113 stores an ultrasonic image, a photoacoustic image, and a superimposed image output from the control device 101. It is preferable that communication between the PACS 113 and the control device 101 and various images stored in the PACS 113 comply with standards such as HL7 and DICOM. In various images output from the control device 101, incidental information is stored in various tags in accordance with the DICOM standard.
  • the Viewer 114 is a terminal for image diagnosis, and reads an image stored in the PACS 113 and displays it for diagnosis.
  • the doctor displays an image on the Viewer 114 for observation, and records information obtained as a result of the observation as an image diagnosis report.
  • the diagnostic imaging report created using the Viewer 114 may be stored in the Viewer 114, or may be output and stored in the PACS 113 or a report server (not shown).
  • the Printer 115 prints an image stored in the PACS 113 or the like.
  • the Printer 115 is, for example, a film printer, and outputs an image stored in the PACS 113 or the like by printing it on a film.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of the control device 101.
  • the control device 101 includes a CPU 201, ROM 202, RAM 203, HDD 204, USB 205, communication circuit 206, GPU board 207, and HDMI (registered trademark) 208. These are communicably connected via an internal bus.
  • a CPU (Central Processing Unit) 201 is a control circuit that integrally controls the control device 101 and each unit connected thereto.
  • the CPU 201 performs control by executing a program stored in the ROM 202. Further, the CPU 201 executes a display driver which is software for controlling the display unit 104 and performs display control on the display unit 104. Further, the CPU 201 performs input / output control for the operation unit 105.
  • A ROM (Read Only Memory) 202 stores programs and data describing the control procedures executed by the CPU.
  • a RAM (Random Access Memory) 203 is a memory for storing a program for executing processing in the control device 101 and each unit connected thereto, and various parameters used in image processing.
  • the RAM 203 stores a control program executed by the CPU 201, and temporarily stores various data when the CPU 201 executes various controls.
  • HDD (Hard Disk Drive) 204 is an auxiliary storage device that stores various data such as an ultrasonic image and a photoacoustic image.
  • a USB (Universal Serial Bus) 205 is a connection unit connected to the operation unit 105.
  • the communication circuit 206 is a circuit for communicating with each unit constituting the imaging system 100 and various external devices connected to the network 110.
  • the communication circuit 206 may be realized by a plurality of configurations in accordance with a desired communication form.
  • the GPU board 207 is a general-purpose graphics board including a GPU and a video memory.
  • the GPU board 207 constitutes part or all of the image processing unit 303, and performs, for example, a photoacoustic image reconstruction process. By using such an arithmetic device, it is possible to perform operations such as reconstruction processing at high speed without requiring dedicated hardware.
  • An HDMI (registered trademark) (High Definition Multimedia Interface) 208 is a connection unit connected to the display unit 104.
  • the CPU 201 and the GPU are examples of processors.
  • the ROM 202, RAM 203, and HDD 204 are examples of memories.
  • the control device 101 may have a plurality of processors. In the first embodiment, the function of each unit of the control device 101 is realized by the processor of the control device 101 executing a program stored in the memory.
  • control device 101 may have a CPU or GPU that performs a specific process exclusively. Further, the control device 101 may have a field-programmable gate array (FPGA) in which specific processing or all processing is programmed. Furthermore, the control device 101 may have an SSD (Solid State Drive) as a memory.
  • the control apparatus 101 may include an SSD instead of the HDD 204, or may include both the HDD 204 and the SSD.
  • FIG. 3 is a diagram illustrating an example of a functional configuration of the control device 101.
  • the control device 101 includes an inspection control unit 300, a signal acquisition unit 301, a position acquisition unit 302, an image processing unit 303, a determination unit 304, a display control unit 305, and an output unit 306.
  • the inspection control unit 300 controls inspection performed in the imaging system 100.
  • the inspection control unit 300 acquires inspection order information from the ordering system 112.
  • the examination order includes information on a patient who undergoes an examination and information on imaging procedures.
  • the inspection control unit 300 controls the probe 102 and the detection unit 103 based on information on the imaging technique.
  • the inspection control unit 300 causes the display unit 104 to display information on the inspection via the display control unit 305 in order to present information related to the inspection to the user.
  • the information on the examination displayed on the display unit 104 includes information on the patient undergoing the examination, information on the imaging technique included in the examination, and an image already generated after imaging.
  • the inspection control unit 300 transmits information regarding the progress of the inspection to the ordering system 112. For example, when the inspection is started by the user, the system 112 is notified of the start, and when imaging by all the imaging techniques included in the inspection is completed, the system 112 is notified of the completion.
  • the inspection control unit 300 acquires information related to the probe 102 used for imaging.
  • Information related to the probe 102 includes information such as the type of probe, center frequency, sensitivity, acoustic focus, electronic focus, and observation depth.
  • the user connects the probe 102 to, for example, a probe connector port (not shown) of the control apparatus 101, inputs an operation to the control apparatus 101, validates the probe 102, and inputs imaging conditions and the like.
  • the inspection control unit 300 acquires information regarding the activated probe 102.
  • The inspection control unit 300 appropriately transmits information related to the probe 102 to the image processing unit 303, the determination unit 304, and the display control unit 305. These are an example of the second acquisition means that acquires the information related to the movement of the probe 102.
  • The signal acquisition unit 301 acquires an ultrasonic signal and a photoacoustic signal from the probe 102. Specifically, the signal acquisition unit 301 distinguishes and acquires the ultrasonic signal and the photoacoustic signal from the information acquired from the probe 102, based on information from the inspection control unit 300 and the position acquisition unit 302. For example, when the acquisition timings of the ultrasonic signal and the photoacoustic signal are specified by the imaging technique being performed, the signal acquisition unit 301 distinguishes and acquires the ultrasonic signal and the photoacoustic signal from the information acquired from the probe 102 based on the acquisition timing information acquired from the inspection control unit 300.
  • the signal acquisition unit 301 is an example of a first acquisition unit that acquires at least one of an ultrasonic signal and a photoacoustic signal from the probe 102.
  • the position acquisition unit 302 acquires information related to the displacement of the probe 102 based on the information from the detection unit 103.
  • Based on the information from the detection unit 103, the position acquisition unit 302 acquires at least one of: information on the position and orientation of the probe 102, the speed of movement with respect to the subject, information on the speed of rotation, the acceleration of movement with respect to the subject, and information indicating the degree of pressing against the subject. That is, the position acquisition unit 302 acquires information indicating how the user operates the probe 102 with respect to the subject.
  • The position acquisition unit 302 can thereby determine, for example, whether the user is holding the probe 102 stationary in contact with the subject or is moving it at a predetermined speed or more.
  • the position acquisition unit 302 preferably acquires the position information of the probe 102 from the detection unit 103 at regular time intervals, preferably in real time.
  • the position acquisition unit 302 appropriately transmits information related to the displacement of the probe 102 to the inspection control unit 300, the image processing unit 303, the determination unit 304, and the display control unit 305.
  • the position acquisition unit 302 is an example of a second acquisition unit that acquires information related to the displacement of the probe 102.
  • the image processing unit 303 generates an ultrasonic image, a photoacoustic image, and a superimposed image obtained by superimposing the photoacoustic image on the ultrasonic image.
  • the image processing unit 303 generates an ultrasonic image to be displayed on the display unit 104 from the ultrasonic signal acquired by the signal acquisition unit 301.
  • The image processing unit 303 generates an ultrasound image suitable for the set mode based on the imaging technique information acquired from the inspection control unit 300. For example, when the Doppler mode is set as the imaging technique, the image processing unit 303 calculates the flow velocity inside the subject based on the difference between the frequency of the ultrasonic signal acquired by the signal acquisition unit 301 and the transmission frequency, and generates an image showing the calculated flow velocity.
  • the image processing unit 303 generates a photoacoustic image based on the photoacoustic signal acquired by the signal acquisition unit 301.
  • The image processing unit 303 reconstructs, based on the photoacoustic signal, the distribution of acoustic waves generated when the light is irradiated (hereinafter referred to as the initial sound pressure distribution).
  • the image processing unit 303 acquires the light absorption coefficient distribution in the subject by dividing the reconstructed initial sound pressure distribution by the light fluence distribution of the subject irradiated with the light. Further, the concentration distribution of the substance in the subject is obtained from the absorption coefficient distribution for a plurality of wavelengths by utilizing the fact that the degree of light absorption in the subject varies depending on the wavelength of the light irradiated to the subject.
  • For example, the image processing unit 303 acquires the concentration distributions of oxyhemoglobin and deoxyhemoglobin as substance concentration distributions in the subject. Further, the image processing unit 303 acquires the oxygen saturation distribution as the ratio of the oxyhemoglobin concentration to the total hemoglobin concentration (the sum of the oxyhemoglobin and deoxyhemoglobin concentrations), as expressed below.
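  • Expressed as formulas (a sketch of the standard relations; the Grüneisen coefficient Γ is shown explicitly here as an assumption, while the text above mentions only the division by the light fluence):

        \mu_a(\mathbf{r}) = \frac{p_0(\mathbf{r})}{\Gamma\,\Phi(\mathbf{r})}, \qquad
        \mathrm{SO_2}(\mathbf{r}) = \frac{C_{\mathrm{HbO_2}}(\mathbf{r})}{C_{\mathrm{HbO_2}}(\mathbf{r}) + C_{\mathrm{Hb}}(\mathbf{r})}

    where p_0 is the reconstructed initial sound pressure, Φ is the light fluence, and C_HbO2 and C_Hb are the oxy- and deoxyhemoglobin concentrations estimated from μ_a at multiple wavelengths.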
  • the photoacoustic image generated by the image processing unit 303 is an image indicating information such as the above-described initial sound pressure distribution, light fluence distribution, absorption coefficient distribution, substance concentration distribution, and oxygen saturation distribution. That is, the image processing unit 303 is an example of a generation unit that generates an ultrasonic image based on the ultrasonic signal and generates a photoacoustic image based on the photoacoustic signal.
  • the determination unit 304 determines whether to display a photoacoustic image on the display unit 104 via the display control unit 305 based on the information regarding the displacement of the probe 102 acquired by the position acquisition unit 302. That is, the determination unit 304 is an example of a determination unit that determines whether to display a photoacoustic image on the display unit 104.
  • For example, when the position acquisition unit 302 acquires information indicating that the probe 102 is moving at a predetermined speed or less, or information indicating that the probe 102 is pressed against the subject at a predetermined pressure or higher, the determination unit 304 determines that the photoacoustic image is to be displayed.
  • the photoacoustic image is displayed on the display unit 104 when the user is performing an operation of observing a specific region of the subject. The user can observe the ultrasonic image and the photoacoustic image without performing a special operation input such as pressing a switch having a physical structure.
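  • A minimal sketch of this determination logic is shown below. The function name and the threshold values are hypothetical; the predetermined values are design choices not fixed by the specification.

        # Hypothetical sketch of determination unit 304: display the photoacoustic
        # image when the probe is nearly still or pressed firmly against the subject.
        def should_display_photoacoustic(speed_mm_per_s: float,
                                         pressure_kpa: float,
                                         speed_threshold: float = 5.0,
                                         pressure_threshold: float = 10.0) -> bool:
            moving_slowly = speed_mm_per_s <= speed_threshold
            pressed_firmly = pressure_kpa >= pressure_threshold
            return moving_slowly or pressed_firmly

        # Example: probe held almost still and pressed firmly -> photoacoustic image shown.
        print(should_display_photoacoustic(speed_mm_per_s=1.2, pressure_kpa=12.0))  # True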
  • When the determination unit 304 determines to display the photoacoustic image on the display unit 104, the image processing unit 303 generates a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image, and the superimposed image is displayed on the display unit 104 via the display control unit 305. That is, the mode is switched from the mode for displaying the ultrasonic image to the mode for displaying the ultrasonic image and the photoacoustic image.
  • When it is determined to display the photoacoustic image, the inspection control unit 300 controls the irradiation unit 107 and the signal acquisition unit 301 to acquire a photoacoustic signal.
  • the image processing unit 303 generates a photoacoustic image by performing reconstruction processing based on the photoacoustic signal acquired according to the determination.
  • the display control unit 305 causes the display unit 104 to display the generated photoacoustic image.
  • the examination control unit 300 is an example of an irradiation control unit that controls the irradiation unit 107 to perform light irradiation on the subject when it is determined to display the photoacoustic image on the display unit 104.
  • the display control unit 305 controls the display unit 104 to display information on the display unit 104.
  • the display control unit 305 causes the display unit 104 to display information in accordance with inputs from the inspection control unit 300, the image processing unit 303, the determination unit 304, and user operation inputs via the operation unit 105.
  • the display control unit 305 is an example of a display control unit.
  • the display control unit 305 is an example of a display control unit that displays the photoacoustic image on the display unit 104 based on the result of the determination unit 304 determining that the photoacoustic image is displayed on the display unit 104.
  • the output unit 306 outputs information from the control device 101 to an external device such as the PACS 113 via the network 110.
  • the output unit 306 outputs the ultrasonic image and the photoacoustic image generated by the image processing unit 303 and a superimposed image thereof to the PACS 113.
  • the image output from the output unit 306 includes incidental information attached as various tags according to the DICOM standard by the inspection control unit 300.
  • The incidental information includes, for example, patient information, information indicating the imaging apparatus that captured the image, an image ID for uniquely identifying the image, and an examination ID for uniquely identifying the examination in which the image was captured. Further, the incidental information includes information associating an ultrasonic image and a photoacoustic image captured during a series of probe operations.
  • The information associating the ultrasonic image with the photoacoustic image is, for example, information indicating, among the plurality of frames constituting the ultrasonic image, the frame whose acquisition timing is closest to that of the photoacoustic image. Furthermore, as incidental information, the position information of the probe 102 acquired by the detection unit 103 may be attached to each frame of the ultrasonic image and the photoacoustic image. That is, the output unit 306 outputs, attached to the ultrasonic image, information indicating the position of the probe 102 at which the ultrasonic signal for generating the ultrasonic image was acquired. The output unit 306 outputs, attached to the photoacoustic image, information indicating the position of the probe 102 at which the photoacoustic signal for generating the photoacoustic image was acquired. The output unit 306 is an example of an output unit.
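  • As an illustration only (not the disclosed implementation), incidental information of this kind could be attached with a DICOM toolkit such as pydicom. The values below are placeholders, and the handling of per-frame probe position is noted only as a comment because it is not a standard DICOM attribute.

        from pydicom import Dataset
        from pydicom.uid import generate_uid

        # Hypothetical sketch: attach incidental information to an output image.
        ds = Dataset()
        ds.PatientID = "PAT0001"                  # patient information (placeholder)
        ds.Modality = "US"                        # imaging apparatus that captured the image
        ds.SOPInstanceUID = generate_uid()        # image ID, unique per image
        ds.StudyInstanceUID = generate_uid()      # examination (study) ID
        # Probe position per frame is not a standard attribute; a real system would
        # use private tags or a structured report to carry it.
        print(ds)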
  • FIG. 4 is a diagram illustrating an example of an ultrasonic image, a photoacoustic image, and a superimposed image displayed on the display unit 104 by the display control unit 305.
  • FIG. 4A is an example of an ultrasonic image, which is an example of a tomographic image in which the amplitude of the reflected wave is expressed by luminance, that is, an image generated in the B mode.
  • In the first embodiment, a case where a B-mode image is generated as the ultrasound image will be described.
  • the ultrasound image acquired by the control device 101 in the first embodiment is not limited to a B-mode image.
  • the acquired ultrasonic image may be generated by any other method such as A mode, M mode, or Doppler mode, or may be a harmonic image or a tissue elasticity image.
  • the region in the subject from which an ultrasound image is captured by the imaging system 100 is, for example, a circulatory region, a breast, a liver, or a pancreas.
  • an ultrasound image of a subject to which an ultrasound contrast agent using microbubbles is administered may be captured.
  • FIG. 4B is an example of a photoacoustic image, which is an example of a blood vessel image drawn based on the absorption coefficient distribution and the hemoglobin concentration.
  • The photoacoustic image acquired by the control apparatus 101 may be any image indicating information such as the generated sound pressure (initial sound pressure) of the photoacoustic wave, the light absorption energy density, the light absorption coefficient, or the concentration of a substance constituting the subject, or an image generated by combining these.
  • The region in the subject from which a photoacoustic image is captured by the imaging system 100 is, for example, a region such as a circulatory organ region.
  • a blood vessel region including a new blood vessel and a plaque on a blood vessel wall may be set as a target for imaging a photoacoustic image in accordance with the characteristics relating to light absorption in the subject.
  • A case where a photoacoustic image is captured while an ultrasonic image is being captured will be described as an example.
  • The region in the subject where a photoacoustic image is captured by the imaging system 100 does not necessarily have to match the region where the ultrasonic image is captured.
  • A contrast agent may be administered to the subject, such as a dye, for example methylene blue or indocyanine green, gold fine particles, or a substance obtained by accumulating or chemically modifying them.
  • FIG. 4C is a superimposed image in which the ultrasonic image and the photoacoustic image illustrated in FIGS. 4A and 4B are superimposed.
  • the image processing unit 303 aligns the ultrasonic image and the photoacoustic image to generate a superimposed image.
  • The image processing unit 303 may use any method as the alignment method. For example, the image processing unit 303 performs alignment based on a characteristic region depicted in common in the ultrasonic image and the photoacoustic image. In another example, based on the information on the position of the probe 102 acquired by the position acquisition unit 302, an ultrasonic image and a photoacoustic image that can be determined to have been rendered from signals from substantially the same region of the subject are superimposed to generate the superimposed image.
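  • A minimal sketch of the position-based variant is given below. It is an assumption for illustration only: a translation-only shift derived from the lateral probe positions, ignoring tilt and out-of-plane motion.

        import numpy as np

        def align_pa_to_us(pa_image: np.ndarray, probe_pos_us_mm: np.ndarray,
                           probe_pos_pa_mm: np.ndarray, pixel_pitch_mm: float = 0.2) -> np.ndarray:
            """Shift the photoacoustic frame so that it lines up with the ultrasonic frame,
            using only the lateral offset between the probe positions at the two acquisitions."""
            offset_px = int(round((probe_pos_us_mm[0] - probe_pos_pa_mm[0]) / pixel_pitch_mm))
            return np.roll(pa_image, offset_px, axis=1)  # translation-only; ignores tilt

        pa = np.random.rand(256, 256)  # stand-in photoacoustic frame
        aligned = align_pa_to_us(pa, np.array([10.4, 0.0, 0.0]), np.array([10.0, 0.0, 0.0]))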
  • FIG. 5 is a diagram illustrating an example of the configuration of the imaging system 100.
  • the imaging system 100 includes a console 501, a probe 102, a magnetic sensor 502, a magnetic transmitter 503, and a gantry 504.
  • the console 501 is obtained by integrating the control device 101, the display unit 104, and the operation unit 105.
  • the control device according to the first embodiment is the control device 101 or the console 501.
  • the magnetic sensor 502 and the magnetic transmitter 503 are an example of the detection unit 103.
  • the gantry 504 supports the subject.
  • the magnetic sensor 502 and the magnetic transmitter 503 are devices for acquiring position information of the probe 102.
  • the magnetic sensor 502 is a magnetic sensor attached to the probe 102.
  • the magnetic transmitter 503 is a device that is arranged at an arbitrary position and forms a magnetic field toward the outside centering on itself. In the first embodiment, the magnetic transmitter 503 is installed in the vicinity of the gantry 504.
  • the magnetic sensor 502 detects a three-dimensional magnetic field formed by the magnetic transmitter 503. Then, the magnetic sensor 502 acquires positions (coordinates) of a plurality of points of the probe 102 in a space having the magnetic transmitter 503 as the origin, based on the detected magnetic field information.
  • the position acquisition unit 302 acquires three-dimensional position information of the probe 102 based on the position (coordinate) information acquired from the magnetic sensor 502.
  • The three-dimensional position information of the probe 102 includes the coordinates of the transmission / reception unit 106. Based on the coordinates of the transmission / reception unit 106, the position of the contact surface with the subject is acquired. Further, the three-dimensional position information of the probe 102 includes information on the tilt (angle) of the probe 102 with respect to the subject. Then, the position acquisition unit 302 acquires the information regarding the displacement of the probe 102 based on the change of the three-dimensional position information over time, for example as sketched below.
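  • The following is a minimal sketch of deriving movement information from time-stamped sensor readings. The data layout and sampling rate are assumptions; the magnetic tracker's actual interface is not specified in the disclosure.

        import numpy as np

        def probe_motion(positions_mm: np.ndarray, angles_rad: np.ndarray,
                         timestamps_s: np.ndarray) -> tuple[float, float]:
            """Estimate translational speed (mm/s) and rotational speed (rad/s)
            from consecutive probe poses by finite differences."""
            dt = np.diff(timestamps_s)
            v = np.linalg.norm(np.diff(positions_mm, axis=0), axis=1) / dt
            w = np.abs(np.diff(angles_rad)) / dt
            return float(v.mean()), float(w.mean())

        t = np.array([0.0, 0.1, 0.2])
        pos = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [1.0, 0.0, 0.0]])  # mm
        ang = np.array([0.00, 0.01, 0.02])  # rad, tilt about one axis
        print(probe_motion(pos, ang, t))  # approximately (5.0 mm/s, 0.1 rad/s)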
  • FIG. 6 is a flowchart illustrating an example of processing in which the control device according to the first embodiment displays a photoacoustic image on the display unit 104 based on the operation of the probe 102 by the user.
  • A case will be described in which the user acquires at least an ultrasonic signal using the probe 102, operates the probe 102 while the ultrasonic image is displayed on the display unit 104, and further causes the photoacoustic image to be displayed on the display unit 104.
  • In step S600, the inspection control unit 300 acquires preset information regarding the display of the photoacoustic image.
  • Prior to the examination, the user performs the settings related to the display of the photoacoustic image by operating the console 501.
  • the setting related to the display of the photoacoustic image includes a setting related to the acquisition of the photoacoustic signal and a setting related to the display of the photoacoustic image generated based on the acquired photoacoustic signal.
  • For the setting related to the acquisition of the photoacoustic signal, it is determined in which mode the probe 102 is to be operated among a first acquisition mode in which an ultrasonic signal and a photoacoustic signal are both acquired, a second acquisition mode in which the photoacoustic signal is acquired in accordance with the operation of the probe 102, and a third acquisition mode in which only the ultrasonic signal is acquired.
  • The first acquisition mode includes a case where an ultrasonic signal and a photoacoustic signal are alternately acquired at predetermined time intervals and a case where the ultrasonic signal and the photoacoustic signal are acquired in a manner determined by the order information acquired from the ordering system 112.
  • The settings related to the display of the photoacoustic image include a first display mode in which a photoacoustic image is reconstructed from the photoacoustic signal and sequentially displayed, and a second display mode in which the photoacoustic image is not displayed until imaging is completed even if the photoacoustic signal has been acquired.
  • If the setting relating to the acquisition of the photoacoustic signal is the second acquisition mode and the setting relating to the display of the photoacoustic image is the first display mode, the process proceeds to step S601. Otherwise, the process proceeds to step S603.
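  • A compact sketch of how these preset settings could be represented and branched on is shown below. The enum names are illustrative, not terms from the specification.

        from enum import Enum, auto

        class AcquisitionMode(Enum):
            FIRST = auto()    # ultrasonic and photoacoustic signals both acquired
            SECOND = auto()   # photoacoustic acquisition gated by probe operation
            THIRD = auto()    # ultrasonic signal only

        class DisplayMode(Enum):
            FIRST = auto()    # reconstruct and display photoacoustic images sequentially
            SECOND = auto()   # hold photoacoustic display until imaging is completed

        def next_step(acq: AcquisitionMode, disp: DisplayMode) -> str:
            # Branch corresponding to step S600 in FIG. 6 (S601: motion check, S603: preset check).
            if acq is AcquisitionMode.SECOND and disp is DisplayMode.FIRST:
                return "S601"
            return "S603"

        print(next_step(AcquisitionMode.SECOND, DisplayMode.FIRST))  # S601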
  • In step S601, the determination unit 304 determines whether the moving speed of the probe 102 is equal to or less than a predetermined value. Specifically, the position acquisition unit 302 first acquires information indicating the position of the probe 102 from the magnetic sensor 502 and acquires information indicating the speed of movement of the probe 102 based on the change of the position over time. The position acquisition unit 302 transmits the information indicating the moving speed of the probe 102 to the determination unit 304. The determination unit 304 determines whether or not the moving speed of the probe 102 is equal to or less than the predetermined value.
  • the determination unit 304 determines that the probe 102 is moving at a speed smaller than a predetermined speed even when the probe 102 is stationary with respect to the subject, that is, when the movement speed is zero. For example, the position acquisition unit 302 stores the position information of the probe 102 acquired by the magnetic sensor 502 for a certain period of time. Then, the position acquisition unit 302 acquires a velocity vector related to the movement of the probe 102 and transmits it to the determination unit 304. The determination unit 304 determines that the position of the probe 102 has not changed sufficiently if the velocity of the probe 102 is equal to or less than a predetermined value for a predetermined period.
  • If the determination unit 304 determines that the moving speed of the probe 102 is equal to or less than the predetermined value, the process proceeds to step S602. If the moving speed of the probe 102 is greater than the predetermined value, the process proceeds to step S605.
  • In step S602, the determination unit 304 determines whether the rotation speed of the probe 102 is equal to or less than a predetermined value. Specifically, in the same manner as in step S601, the position acquisition unit 302 first acquires information indicating the position of the probe 102 from the magnetic sensor 502 and acquires information indicating the rotation speed of the probe 102 based on the change of the position over time. The position acquisition unit 302 transmits the information indicating the rotation speed of the probe 102 to the determination unit 304. The determination unit 304 determines whether or not the rotation speed of the probe 102 is equal to or less than the predetermined value.
  • The determination unit 304 determines that the probe 102 is rotating at a speed smaller than the predetermined speed even when the probe 102 is stationary with respect to the subject, that is, when the rotation speed is zero. For example, the position acquisition unit 302 acquires a velocity vector related to the movement of the probe 102 in the same manner as in step S601 and transmits it to the determination unit 304. For example, when the predetermined value is π/6 rad/sec and the probe 102 has been rotating at a speed equal to or lower than the predetermined value for 3 seconds, the determination unit 304 determines that the rotation speed of the probe 102 is equal to or lower than the predetermined value. If the probe 102 is rotating at a speed smaller than the predetermined speed, the process proceeds to step S604. If the probe 102 is rotating at a speed greater than the predetermined speed, the process proceeds to step S605.
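  • A minimal sketch of the checks in steps S601 and S602 is shown below. The translation threshold, sampling interval, and hold time are assumed values; the specification gives only π/6 rad/s held for 3 seconds as an example for rotation.

        import math

        def motion_allows_display(speeds_mm_s: list[float], rot_speeds_rad_s: list[float],
                                  speed_limit: float = 5.0,
                                  rot_limit: float = math.pi / 6,
                                  hold_s: float = 3.0, sample_dt: float = 0.1) -> bool:
            """Return True when both the translation and rotation of the probe have
            stayed at or below their limits for at least `hold_s` seconds."""
            needed = int(hold_s / sample_dt)
            recent_v = speeds_mm_s[-needed:]
            recent_w = rot_speeds_rad_s[-needed:]
            if len(recent_v) < needed or len(recent_w) < needed:
                return False
            return all(v <= speed_limit for v in recent_v) and all(w <= rot_limit for w in recent_w)

        # Example: 3 s of samples at 10 Hz with the probe nearly still -> proceed to S604.
        print(motion_allows_display([1.0] * 30, [0.05] * 30))  # True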
  • In step S603, the process branches based on the preset information acquired by the inspection control unit 300 in step S600. If the setting related to the acquisition of the photoacoustic signal is the first acquisition mode and the setting related to the display of the photoacoustic image is the first display mode, the process proceeds to step S604. Otherwise, the process proceeds to step S605.
  • In step S604, the display control unit 305 causes the display unit 104 to display a photoacoustic image.
  • The image processing unit 303 reconstructs a photoacoustic image based on a photoacoustic signal acquired appropriately based on the information regarding the displacement of the probe 102, or based on a photoacoustic signal acquired at a predetermined timing.
  • the display control unit 305 displays the photoacoustic image on the display unit 104.
  • For example, the image processing unit 303 generates a superimposed image by superimposing the photoacoustic image on the ultrasonic image generated from the ultrasonic signal acquired at a time close to the time at which the photoacoustic signal was acquired.
  • the display control unit 305 causes the display unit 104 to display the superimposed image. That is, the display control unit 305 causes the display unit 104 to display a photoacoustic image generated from the photoacoustic signal based on information regarding the displacement of the probe 102.
  • the user acquires an ultrasonic signal with the probe 102 and operates the probe 102 while observing an ultrasonic image displayed on the display unit 104.
  • When the moving speed or rotational speed of the probe 102 is smaller than the predetermined value, it is assumed that the user intends to observe a specific area in the subject in more detail.
  • a photoacoustic image is displayed on the display unit 104 based on such a change in the operation of the user's probe 102. Accordingly, the photoacoustic image can be displayed on the display unit 104 at an appropriate timing without preventing the user from observing the ultrasonic image in order to search for a region to be observed in detail.
  • For example, the display control unit 305 may display the photoacoustic image in the superimposed image with higher transparency as the moving speed of the probe 102 increases, and may hide the photoacoustic image when the moving speed of the probe 102 becomes larger than a predetermined value. That is, the display control unit 305 changes the manner in which the photoacoustic image is displayed on the display unit 104 according to the degree of displacement of the probe 102, as sketched below.
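  • A sketch of that speed-dependent transparency follows. The linear mapping from speed to opacity and the cutoff value are assumptions; the text above describes only the monotonic behaviour.

        import numpy as np

        def pa_opacity(speed_mm_s: float, speed_cutoff: float = 10.0) -> float:
            """Opacity of the photoacoustic layer: fully opaque when the probe is still,
            fading linearly with speed, and hidden beyond the cutoff speed."""
            if speed_mm_s >= speed_cutoff:
                return 0.0
            return 1.0 - speed_mm_s / speed_cutoff

        def blend(us_img: np.ndarray, pa_img: np.ndarray, speed_mm_s: float) -> np.ndarray:
            a = pa_opacity(speed_mm_s)
            return (1.0 - a) * us_img + a * pa_img

        fused = blend(np.random.rand(256, 256), np.random.rand(256, 256), speed_mm_s=2.0)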
  • In step S605, the display control unit 305 does not display the photoacoustic image on the display unit 104.
  • the image processing unit 303 generates an ultrasound image based on the ultrasound signal acquired by the probe 102, and the display control unit 305 causes the display unit 104 to display the ultrasound image.
  • This concludes the process shown in FIG. 6. In the example illustrated in FIG. 6, the photoacoustic image is displayed on the display unit 104 according to the operation of the probe 102 or the preset settings, but the present invention is not limited to the display of the photoacoustic image.
  • the superimposed image or the photoacoustic image generated by the image processing unit 303 may be stored at the same time when the photoacoustic image is displayed on the display unit 104 according to the operation of the probe 102.
  • the saving is not limited to saving in the memory in the control apparatus 101.
  • the saving may be performed by outputting to the external apparatus such as the PACS 113 via the output unit 306.
  • When it is determined, by the processing from step S600 to step S603, that the photoacoustic image is not to be displayed, it is assumed that the user is searching for a region to be observed in detail while operating the probe 102, so it is not always necessary to save the moving image acquired during the search. Therefore, by storing the superimposed image when it is determined to display the photoacoustic image, images that are the object of detailed observation by the user can be selectively stored, and the capacity of the memory and the external device can be used efficiently.
  • Step S601 and step S602 may be processed simultaneously or in parallel. That is, the position acquisition unit 302 may transmit information on the moving speed and the rotation speed of the probe 102 simultaneously or in parallel to the determination unit 304 based on the information indicating the position of the probe 102 acquired from the magnetic sensor 502. The determination unit 304 then determines whether the moving speed of the probe 102 is equal to or less than a predetermined value and whether the rotation speed is equal to or less than a predetermined value. If the moving speed of the probe 102 is equal to or less than the predetermined value and the rotation speed is equal to or less than the predetermined value, the process proceeds to step S604. If at least one of the moving speed and the rotation speed of the probe 102 exceeds its predetermined value, the process proceeds to step S605. In another example, only one of step S601 and step S602 may be processed. That is, the determination unit 304 may determine whether to display the photoacoustic image based on only one of the moving speed and the rotation speed.
  • information for guiding the probe 102 to a region where the photoacoustic signal of the subject is to be acquired may be further displayed on the display unit 104.
  • the information for guiding is information for guiding the position of the probe 102 or the tilt with respect to the subject to a target state, for example.
  • the position acquisition unit 302 acquires the position information of the probe 102 based on the position information from the detection unit 103.
  • the determination unit 304 stores position information of the probe 102 when it is determined that a photoacoustic image is displayed on the display unit 104 during operation of the probe 102.
  • the position of the probe 102 when the photoacoustic image is displayed last time is referred to as a target position.
  • the determination unit 304 acquires the position information of the probe 102 from the position acquisition unit 302, for example, as described above in the description of the processing in step S602 and step S603.
  • the determination unit 304 generates guide information for guiding the probe 102 to the target position based on the target position and the current position of the probe 102.
  • the guide information includes information indicating a movement direction, a movement amount, an inclination angle, a rotation direction, and a rotation amount for moving the probe 102 to the target position.
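  • A minimal sketch of deriving such guide information from the stored target pose and the current pose is shown below. The representation (a single tilt angle, a unit direction vector) is a hypothetical simplification; the disclosure does not fix a coordinate convention.

        import numpy as np

        def guide_info(current_pos_mm: np.ndarray, target_pos_mm: np.ndarray,
                       current_angle_rad: float, target_angle_rad: float) -> dict:
            """Direction and amount of movement/rotation needed to bring the probe
            back to the position where the photoacoustic image was last displayed."""
            delta = target_pos_mm - current_pos_mm
            distance = float(np.linalg.norm(delta))
            direction = delta / distance if distance > 0 else np.zeros_like(delta)
            return {
                "move_direction": direction,                                 # unit vector toward the target
                "move_amount_mm": distance,                                  # how far to move
                "rotate_amount_rad": target_angle_rad - current_angle_rad,   # signed tilt correction
            }

        print(guide_info(np.array([10.0, 0.0, 0.0]), np.array([12.0, 0.0, 0.0]), 0.0, 0.1))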
  • the determination unit 304 is an example of a guide unit that generates guide information for guiding the probe 102 to a specific position.
  • For example, the determination unit 304 generates guide information when the probe 102 has been operated in the vicinity of the target position for a predetermined time or more but no operation that would be determined to display the photoacoustic image has been performed. Thereby, the photoacoustic image and the ultrasonic image of the region that the user observed in detail can easily be reproduced.
  • the guide information generated by the determination unit 304 is displayed on the display unit 104 by the display control unit 305.
  • the display control unit 305 displays on the display unit 104 a guide image that serves as an objective index indicating the movement direction, movement amount, tilt angle, rotation direction, and rotation amount for moving the probe 102 to the target position.
  • the guide image may be any image as long as it is an objective index of the guide information.
  • the guide image is an image of an arrow having a size corresponding to the amount of movement or rotation and a direction corresponding to the direction of movement, rotation, or tilt.
  • the guide image is a figure that has a size corresponding to the amount of movement or rotation, and whose shape is deformed according to the direction of movement, rotation, or inclination.
  • the guide image is displayed on the display unit 104 in a manner that does not hinder observation of an area (hereinafter referred to as a target area) drawn in an ultrasonic image or a photoacoustic image when the probe 102 is moved to the target position.
  • the guide image is displayed in an area where the ultrasonic image, the photoacoustic image, and the superimposed image are not displayed.
  • In another example, while the probe 102 is being guided to the target position, the guide image may be displayed at a position overlapping an area near the target area, and when the target area is rendered, the guide image may be deformed so that it does not obstruct the view.
  • The guide information generated by the determination unit 304 may also be notified to the user by generating a sound whose interval becomes shorter as the probe 102 approaches the target position.
  • the determination unit 304 may determine to generate guide information, notify the position acquisition unit 302 to generate guide information, and the position acquisition unit 302 may generate guide information. Further, guide information may be generated by providing a module different from the position acquisition unit 302 and the determination unit 304.
  • In the above example, the position of the probe 102 that can depict the region observed in detail by the user is stored for generating the guide information, but the present invention is not limited to this.
  • For example, the position of the probe 102 capable of rendering an area designated based on the ultrasonic image being operated, an ultrasonic image observed in the past, a photoacoustic image, or another medical image may be stored as the target of guidance.
  • In the above description, the position of the probe 102 for which guide information is to be generated is automatically stored when it is determined to display a photoacoustic image, but the present invention is not limited to this; the position may be specified by a user operation, for example.
  • a case will be described in which a three-dimensional photoacoustic image of a specific region is acquired in accordance with an inspection order or a user operation input.
  • The image processing unit 303 identifies, based on the photoacoustic signal transmitted from the signal acquisition unit 301 and the position information of the probe 102 transmitted from the position acquisition unit 302, the signals that are still insufficient for generating the three-dimensional photoacoustic image.
  • The position acquisition unit 302 generates guide information for guiding the probe 102 to a position of the probe 102 at which the insufficient signal can be acquired, and the guide information is displayed on the display unit 104 via the display control unit 305. Thereby, a three-dimensional photoacoustic image can be generated efficiently.
  • the present invention is not limited to this.
  • FIG. 7 is a diagram illustrating an example of the configuration of the imaging system 100.
  • the imaging system 100 includes a console 501, a probe 102, a gantry 504, and a motion sensor 700.
  • the motion sensor 700 is an example of a detection unit 103 that tracks position information of the probe 102.
  • the motion sensor 700 is provided or embedded in a portion different from the transmission / reception unit 106 of the probe 102 and the light source (not shown).
  • the motion sensor 700 is composed of, for example, a micro electro mechanical system, and provides 9-axis motion sensing including a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetic compass.
  • the position acquisition unit 302 acquires information related to the displacement of the probe 102 sensed by the motion sensor 700.
  • FIG. 8 is a diagram illustrating an example of the configuration of the imaging system 100.
  • The imaging system 100 includes a console 501, a probe 102, a gantry 504, a transmission / reception unit 106, and a pressure sensor 801.
  • the pressure sensor 801 is an example of the detection unit 103.
  • the pressure sensor 801 acquires information indicating the degree to which the user presses the probe 102 against the subject as information regarding the displacement mode of the probe 102.
  • the transmission / reception unit 106 is provided as a semi-fixed floating structure inside the probe 102.
  • the pressure sensor 801 is provided on the side opposite to the surface where the transmission / reception unit 106 is in contact with the subject, and measures the pressure received by the transmission / reception unit 106.
  • the pressure sensor 801 may be a diaphragm type pressure sensor provided on the contact surface of the probe 102 with the subject.
  • the position acquisition unit 302 acquires information regarding the pressure measured by the pressure sensor 801.
  • FIG. 9 is a flowchart illustrating an example of processing in which the control device according to the second embodiment displays a photoacoustic image on the display unit 104 based on the operation of the probe 102 by the user.
  • the user acquires an ultrasonic signal using at least the probe 102, operates the probe 102 while displaying the ultrasonic image on the display unit 104, and further displays the photoacoustic image on the display unit 104.
  • the processes in steps S600, 603, 604, and 605 are the same as those in the first embodiment described with reference to FIG.
  • In step S900, the determination unit 304 determines whether or not the user is pressing the probe 102 against the subject with a constant pressure. Specifically, the position acquisition unit 302 transmits information acquired from the pressure sensor 801 to the determination unit 304. The determination unit 304 determines that the user is pressing the probe 102 against the subject with a constant pressure when the pressure received by the transmission / reception unit 106 remains within a predetermined range for a predetermined time or longer. If the user is pressing the probe 102 against the subject with a constant pressure, the process proceeds to step S604. When the user presses the probe 102 with a constant pressure, it is assumed that a specific region of the subject is being observed, so the photoacoustic image can be displayed on the display unit 104. If the user is not pressing the probe 102 with a constant pressure, the process proceeds to step S605, and the photoacoustic image is not displayed.
  • In step S604, the image processing unit 303 generates, for example, a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image, and causes the display unit 104 to display the superimposed image. Further, in the second embodiment, the image processing unit 303 may acquire pressure information from the position acquisition unit 302 and display the photoacoustic image on the display unit 104 based on the pressure information. It is assumed that the longer the user presses the probe 102 with a constant pressure, the higher the degree of attention the user is paying to the region rendered at that time. Therefore, the longer the time during which the pressure value of the pressure sensor 801 remains constant, the lower the image processing unit 303 sets the transparency of the photoacoustic image in the superimposed image. That is, the display control unit 305 changes the manner in which the photoacoustic image is displayed on the display unit 104 according to the degree of displacement of the probe 102. Thereby, the user can observe the photoacoustic image according to the degree of attention.
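The behaviour described in steps S900 and S604 can be pictured with the following sketch: the probe is treated as pressed with constant pressure when recent readings stay within a band for long enough, and the opacity of the superimposed photoacoustic image increases with the duration of the steady press. The band width, hold time, sampling rate, and opacity ramp are illustrative assumptions, not values from the embodiment.
```python
from collections import deque

class PressureHoldDetector:
    """Decides whether the probe is pressed with roughly constant pressure."""

    def __init__(self, band_kpa: float = 2.0, hold_time_s: float = 1.0, rate_hz: float = 50.0):
        self.band_kpa = band_kpa                              # allowed variation (assumption)
        self.window = deque(maxlen=int(hold_time_s * rate_hz))

    def update(self, pressure_kpa: float) -> bool:
        """Feed one pressure sample; return True while the hold condition is met."""
        self.window.append(pressure_kpa)
        if len(self.window) < self.window.maxlen:
            return False
        return max(self.window) - min(self.window) <= self.band_kpa

def photoacoustic_alpha(hold_duration_s: float, ramp_s: float = 3.0) -> float:
    """Opacity of the superimposed photoacoustic image: longer steady press -> less transparent."""
    return min(1.0, hold_duration_s / ramp_s)

if __name__ == "__main__":
    det = PressureHoldDetector()
    steady = [det.update(10.0 + 0.1 * (i % 3)) for i in range(100)]
    print("steady press detected:", steady[-1])
    print("alpha after 1.5 s of steady press:", photoacoustic_alpha(1.5))
```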
  • the probe 102 may include a magnetic sensor 502 and a motion sensor 700.
  • In that case, the determination unit 304 may determine whether to display a photoacoustic image based not only on the pressure with which the probe 102 is pressed against the subject but also on information such as the position of the probe 102 and its angle with respect to the subject. That is, the display control unit 305 may display the photoacoustic image on the display unit 104 when the position acquisition unit 302 acquires at least one of information indicating that the probe 102 is moving with respect to the subject at a speed lower than a predetermined speed and information indicating that the probe 102 is pressing the subject with a constant pressure.
  • FIG. 10 is a flowchart illustrating an example of a process in which the control device according to the third embodiment displays a photoacoustic image according to the characteristics of the probe 102 and the purpose of the inspection.
  • the user acquires an ultrasonic signal using at least the probe 102, operates the probe 102 while displaying the ultrasonic image on the display unit 104, and further displays the photoacoustic image on the display unit 104.
  • a plurality of probes may be connected to the console 501, and the user selects a probe to be used according to the purpose of the examination, such as an area for observing the subject.
  • the processing in steps S604 and 605 is the same as that in the first embodiment described with reference to FIG.
  • the determination unit 304 determines whether the ultrasonic image can be complemented with the photoacoustic image. Specifically, the inspection control unit 300 acquires imaging conditions for the ultrasonic image and the photoacoustic image, and transmits them to the determination unit 304.
  • the position acquisition unit 302 acquires information on the probe 102 used by the user for observation, and transmits the information to the determination unit 304.
  • the information on the probe 102 includes, for example, the arrangement of the transducers (not shown) of the probe 102, the initial settings applied when the probe is connected to the console 501, scanning method information, and information indicating the presence or absence of the irradiation unit 107.
  • the characteristics of the acquired ultrasonic image differ depending on the imaging conditions such as the transducer arrangement, scanning method, and signal acquisition settings.
  • With the convex electronic scanning method, an ultrasonic image with a wide field of view in the deep part of the subject is obtained, and this method is mainly used for observing the abdominal region. With the sector electronic scanning method, an ultrasonic image with a wide field of view is obtained from a narrow contact area, and this method is mainly used for observation of a circulatory region. When an ultrasonic signal is acquired using high-frequency ultrasonic waves, an ultrasonic image with fine resolution can be obtained; on the other hand, the transmission power of the ultrasonic signal is weak, so the region of the subject depicted in the ultrasonic image becomes shallower.
  • the determination unit 304 determines whether to display the photoacoustic image on the display unit 104 according to such features. For example, when the depth of the subject depicted in the photoacoustic image is greater than the depth of the subject depicted in the ultrasonic image, the determination unit 304 determines that the ultrasonic image can be complemented with the photoacoustic image. Depending on such features of the two images, the determination unit 304 determines either that the ultrasonic image can be complemented with the photoacoustic image or that it cannot.
  • the determination unit 304 determines whether to display a photoacoustic image on the display unit 104 based on the characteristics of the probe 102 used for observation.
  • the features of the ultrasonic image and the photoacoustic image to be drawn depend on the characteristics of the probe 102. Therefore, the determination unit 304 performs the determination based on the characteristics of the probe 102 such as the imaging conditions and the configuration of the probe 102 related to the characteristics of the ultrasonic image and the photoacoustic image.
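A simplified version of this determination might compare the imaging depth and resolution expected from the current ultrasound conditions with those of the photoacoustic image, as in the sketch below. The attribute names, units, and the specific rule are assumptions for illustration, not the embodiment's criteria.
```python
from dataclasses import dataclass

@dataclass
class ImagingConditions:
    depth_mm: float        # depth of the subject region depicted in the image
    resolution_mm: float   # approximate in-plane resolution

def can_complement(us: ImagingConditions, pa: ImagingConditions) -> bool:
    """Decide whether the photoacoustic image adds information to the ultrasound image.

    Here the photoacoustic image is considered complementary when it depicts
    deeper regions or finer detail than the ultrasound image (illustrative rule).
    """
    return pa.depth_mm > us.depth_mm or pa.resolution_mm < us.resolution_mm

if __name__ == "__main__":
    us = ImagingConditions(depth_mm=30.0, resolution_mm=0.3)   # e.g. high-frequency linear probe
    pa = ImagingConditions(depth_mm=45.0, resolution_mm=0.5)
    print("display photoacoustic image:", can_complement(us, pa))
```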
  • the position acquisition unit 302 that acquires information related to the characteristics of the probe 102 is an example of a third acquisition unit that acquires information related to the characteristics of the ultrasonic image rendered based on the ultrasonic signal acquired by the probe 102.
  • the example in which the ultrasonic image is complemented with the photoacoustic image according to the depth and resolution of the subject depicted in the image has been described.
  • a criterion for determining whether or not to complement an ultrasonic image with a photoacoustic image may be set appropriately by the user.
  • the present invention is not limited thereto.
  • a superimposed image in which the photoacoustic image is superimposed only on a part of the region of the subject displayed on the display unit 104 may be displayed. Accordingly, since the photoacoustic image is not superimposed in the region where the ultrasonic image describes the structure of the subject in detail, observation of the ultrasonic image is not hindered.
  • the transparency of the superposed photoacoustic image may be varied according to the depth and the degree of resolution described above.
  • the probe 102 may include a magnetic sensor 502 and a motion sensor 700.
  • the determination unit 304 may determine whether to display a photoacoustic image based not only on the parameters of the ultrasonic image but also on information such as the pressure with which the probe 102 is pressed against the subject, the position of the probe 102, and its angle with respect to the subject.
  • the user may be notified that the probe being used is inappropriate. For example, notification is made by displaying a message or an image indicating inappropriateness on the display unit 104. Alternatively, the acquisition of the photoacoustic signal may be invalidated to notify the user that it has been invalidated.
  • An inappropriate case is, for example, a case where a probe that does not have the irradiation unit 107 for acquiring a photoacoustic signal is used even though acquisition of a photoacoustic image is requested in an inspection order.
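A minimal sketch of such a suitability check is shown below; the order and probe attributes are hypothetical stand-ins for the inspection-order and probe information that the system is described as handling.
```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProbeInfo:
    name: str
    has_irradiation_unit: bool  # whether this probe carries a light source for photoacoustics

@dataclass
class ExamOrder:
    requires_photoacoustic: bool

def check_probe_suitability(order: ExamOrder, probe: ProbeInfo) -> Optional[str]:
    """Return a warning message when the connected probe cannot satisfy the order."""
    if order.requires_photoacoustic and not probe.has_irradiation_unit:
        return (f"Probe '{probe.name}' has no irradiation unit; "
                "photoacoustic acquisition is disabled for this exam.")
    return None

if __name__ == "__main__":
    message = check_probe_suitability(ExamOrder(True), ProbeInfo("linear-US-only", False))
    print(message or "probe OK")
```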
  • When it is determined to display a photoacoustic image, the inspection control unit 300 may control the irradiation unit 107 so as to acquire a photoacoustic signal. A photoacoustic image reconstructed based on the photoacoustic signal acquired in accordance with this determination may then be displayed on the display unit 104.
  • FIG. 11 is a flowchart illustrating an example of processing from when the irradiation unit 107 is controlled based on the determination of the determination unit 304 until a photoacoustic image is acquired and displayed on the display unit 104.
  • In step S1100, the determination unit 304 determines whether or not to display the photoacoustic image on the display unit 104.
  • Step S1100 corresponds to, for example, the processing in steps S600 to S603 in the first embodiment, the processing in steps S600, 603, and 900 in the second embodiment, and step S1000 in the third embodiment. If it is determined to be displayed, the process proceeds to step S1101, and if it is determined not to be displayed, the process proceeds to step S1102.
  • In step S1101, the inspection control unit 300 controls the irradiation unit 107 to irradiate the subject with light.
  • the signal acquisition unit 301 acquires a photoacoustic signal from the probe 102.
  • the image processing unit 303 reconstructs a photoacoustic image from the photoacoustic signal.
  • the display control unit 305 displays the photoacoustic image on the display unit 104.
  • Step S1101 corresponds to step S604 in the first to third embodiments, for example.
  • In step S1102, the position acquisition unit 302 acquires information related to the state of the probe 102. If information indicating that the photoacoustic signal is being acquired is acquired, the process proceeds to step S1103. If information indicating that the photoacoustic signal is not being acquired is acquired, the process proceeds to step S1104.
  • In step S1103, the inspection control unit 300 controls the irradiation unit 107 to stop the light irradiation on the subject.
  • the processes in steps S1102 and S1103 correspond to, for example, step S605 in the first to third embodiments.
  • In step S1104, the inspection control unit 300 determines whether or not to end the inspection for imaging the ultrasonic image and the photoacoustic image.
  • the user can instruct the end of the inspection through an operation input to the console 501.
  • Alternatively, the inspection control unit 300 may acquire the position information of the probe 102 from the position acquisition unit 302 and determine to end the inspection when, for example, the state in which the probe 102 is not in contact with the subject continues for a certain time or longer. When it is determined to end the inspection based on the position information, it is preferable to display, on the display unit 104 via the display control unit 305, a screen for allowing the user to confirm whether or not to end the inspection. If there is no instruction to end the inspection, the process returns to step S1100; if there is an instruction to end the inspection, the process shown in FIG. 11 ends.
  • the irradiation unit 107 is controlled by the signal acquisition unit 301, for example. It is preferable that the signal acquisition unit 301 performs light irradiation in a period in which the influence of body movement due to respiration or pulsation is small, and controls each component of the irradiation unit 107 so as to acquire a photoacoustic signal. For example, the signal acquisition unit 301 may control the irradiation unit 107 to perform light irradiation within 250 ms after it is determined in step S1100 to display a photoacoustic image. Further, the time from the determination to the light irradiation may be a predetermined value, or may be designated by the user via the operation unit 105.
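The timing constraint mentioned above, namely irradiating soon after the display decision so that body motion has little effect, could be expressed as a simple deadline check. The sketch below assumes a hypothetical trigger callable for the irradiation unit and uses 250 ms only as an example value.
```python
import time
from typing import Callable

MAX_DELAY_S = 0.25  # irradiate within 250 ms of the display decision (example value)

def fire_light_if_fresh(decision_time: float, trigger_irradiation: Callable[[], None]) -> bool:
    """Trigger light irradiation only while the display decision is still fresh.

    `trigger_irradiation` is a hypothetical callable that commands the irradiation
    unit; if the deadline has passed, the decision should be re-evaluated instead
    of firing late.
    """
    if time.monotonic() - decision_time <= MAX_DELAY_S:
        trigger_irradiation()
        return True
    return False

if __name__ == "__main__":
    t0 = time.monotonic()
    fired = fire_light_if_fresh(t0, lambda: print("irradiation pulse requested"))
    print("fired within deadline:", fired)
```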
  • The example in which the determination unit 304 determines whether to display the photoacoustic image on the display unit 104 has been described above.
  • the process for displaying the photoacoustic image on the display unit 104 based on the determination of the determination unit 304 is not limited to the example described above.
  • the control device 101 may continuously acquire the ultrasonic signal and the photoacoustic signal, and may generate the photoacoustic image when it is determined to display the photoacoustic image.
  • Alternatively, the control apparatus 101 may acquire a photoacoustic signal only when it is determined to display a photoacoustic image.
  • the aspect which displays a photoacoustic image on the display part 104 is not restricted to the example mentioned above.
  • For example, when the ultrasonic image is displayed on the display unit 104, the display may be switched to the display of the photoacoustic image, the ultrasonic image and the photoacoustic image may be displayed side by side, or a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image may be displayed.
  • the determination unit 304 performs determination based on information regarding the displacement of the probe 102, that is, information indicating how the user has operated the probe 102.
  • the determination of the determination unit 304 is not limited to this.
  • the control device 101 may be provided with a sound collecting microphone, and an instruction by a user's voice may be received.
  • the control apparatus 101 may store and execute a voice recognition program.
  • Further, when the parameters of the probe 102 are adjusted, whether or not to display a photoacoustic image may be determined based on whether or not a certain time has elapsed since the adjustment. It is assumed that the user adjusts parameters such as the sensitivity, focus, and depth of the probe 102 by operation inputs to the console 501 and the probe 102. In this case, the determination unit 304 determines not to display the photoacoustic image on the display unit 104 until a predetermined time has elapsed after the adjustment of a parameter. Thereby, the photoacoustic image is displayed when the user intends to continue the observation with the changed parameters, and is not displayed while the parameters may still be changed. The user can easily adjust the parameters while observing the ultrasonic image, and the workflow can be improved.
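One way to realise this behaviour is to reset a lockout timer whenever a probe parameter (sensitivity, focus, depth, and so on) is changed and to allow photoacoustic display only after the timer expires, as in the following sketch; the lockout period here is an arbitrary example value.
```python
import time

class ParameterLockout:
    """Suppresses photoacoustic display for a while after any parameter change."""

    def __init__(self, lockout_s: float = 2.0):
        self.lockout_s = lockout_s
        self._last_change = float("-inf")

    def on_parameter_changed(self) -> None:
        self._last_change = time.monotonic()

    def photoacoustic_allowed(self) -> bool:
        return time.monotonic() - self._last_change >= self.lockout_s

if __name__ == "__main__":
    lock = ParameterLockout(lockout_s=0.1)
    lock.on_parameter_changed()
    print("right after change:", lock.photoacoustic_allowed())   # False
    time.sleep(0.15)
    print("after lockout:", lock.photoacoustic_allowed())        # True
```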
  • the user may be notified that light irradiation is performed by the probe 102.
  • a notification image for notifying that light irradiation is performed by the probe 102 is displayed on the display unit 104.
  • the probe 102 may be provided with an LED light that is turned on during light irradiation.
  • the control device 101 may generate a notification sound during light irradiation.
  • the display control unit 305 that displays the notification image on the display unit 104, the LED light provided on the probe 102, and the sound generation unit that generates a notification sound are examples of a notification means for notifying the user that light irradiation for acquiring a photoacoustic signal is performed.
  • FIG. 12 is a flowchart showing an example of processing for canceling the superimposed display of the photoacoustic image superimposed on the ultrasonic image.
  • Step 1200 is a process executed after the photoacoustic image is displayed on the ultrasonic image. That is, this embodiment can be combined with any of the above-described embodiments.
  • In step 1200, the determination unit 304 determines whether the moving speed of the probe 102 is greater than a predetermined value. Specifically, the position acquisition unit 302 first acquires information indicating the position of the probe 102 from the magnetic sensor 502, and acquires information indicating the speed of movement of the probe 102 based on a change with time of the position. The position acquisition unit 302 transmits information indicating the movement speed of the probe 102 to the determination unit 304.
  • the determination unit 304 acquires information indicating the movement speed of the probe 102 transmitted from the position acquisition unit 302, and based on the acquired information indicating the movement speed of the probe 102, the movement speed of the probe 102 is less than a predetermined value. Determine whether it is larger.
  • the predetermined value is a value similar to the predetermined value used in step 601, for example. If the determination unit 304 determines that the moving speed of the probe 102 is greater than the predetermined value, the process proceeds to step 1201. If the determination unit 304 determines that the moving speed of the probe 102 is equal to or less than the predetermined value, the process returns to step 1200 again.
  • the determination unit 304 may determine that the moving speed of the probe 102 is greater than the predetermined value when the state in which the moving speed of the probe 102 exceeds the predetermined value continues for a predetermined period.
  • In step 1201, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. That is, the display control unit 305 causes the display unit 104 to display, in real time, an ultrasonic image on which no photoacoustic image is superimposed.
  • Thus, when the user wants to observe in detail an ultrasonic image on which no photoacoustic image is superimposed, the user can display the desired ultrasonic image on the screen with a simple operation on the probe 102.
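The cancellation condition of steps 1200 and 1201 can be sketched as follows: the superimposed display is dropped once the probe's moving speed stays above a threshold for a sustained period. The threshold, hold time, and units are placeholders rather than values from the embodiment.
```python
class SpeedCancelMonitor:
    """Cancels the superimposed photoacoustic display on sustained fast motion."""

    def __init__(self, speed_threshold: float = 20.0, hold_s: float = 0.3):
        self.speed_threshold = speed_threshold  # mm/s, placeholder value
        self.hold_s = hold_s                    # how long the speed must stay high
        self._above_since = None

    def update(self, speed: float, now: float) -> bool:
        """Feed (speed, timestamp); return True when superimposition should be cancelled."""
        if speed > self.speed_threshold:
            if self._above_since is None:
                self._above_since = now
            return now - self._above_since >= self.hold_s
        self._above_since = None
        return False

if __name__ == "__main__":
    mon = SpeedCancelMonitor()
    for speed, t in [(5.0, 0.0), (25.0, 0.1), (30.0, 0.3), (28.0, 0.5)]:
        print(f"t={t}s speed={speed} -> cancel overlay: {mon.update(speed, t)}")
```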
  • In the example described above, the superimposed display of the photoacoustic image is stopped using the moving speed of the probe 102, but the display control unit 305 may stop the superimposed display of the photoacoustic image using other information. For example, the rotational speed of the probe 102 may be used instead of the moving speed of the probe 102; in that case, the display control unit 305 may stop the superimposed display of the photoacoustic image when the rotational speed is greater than a predetermined value. The predetermined value compared with the rotational speed of the probe 102 is, for example, the same value as the predetermined value used in step 602. The display control unit 305 may also stop the superimposed display of the photoacoustic image when the moving speed and the rotational speed of the probe 102 are each greater than their respective predetermined values. Likewise, the acceleration of the probe 102 may be used instead of the moving speed of the probe 102, and the display control unit 305 may stop the superimposed display of the photoacoustic image based on the acceleration.
  • Step 1210 is a process executed after the photoacoustic image is displayed on the ultrasonic image. That is, this embodiment can be combined with any of the above-described embodiments.
  • the determination unit 304 determines whether or not the moving speed of the probe 102 is within a predetermined range.
  • the determination unit 304 acquires information indicating the movement speed of the probe 102 transmitted from the position acquisition unit 302, and the movement speed of the probe 102 is within a predetermined range based on the acquired information indicating the movement speed of the probe 102. It is determined whether or not.
  • the predetermined range is, for example, a range that is larger than the predetermined value used in step 601 and smaller than another predetermined value. If the determination unit 304 determines that the moving speed of the probe 102 is within the predetermined range, the process proceeds to step 1211. If the determination unit 304 determines that the moving speed of the probe 102 is out of the predetermined range, the process proceeds to step 1212.
  • the determination unit 304 may determine that the moving speed of the probe 102 is within the predetermined range only when the state in which the moving speed of the probe 102 is within the predetermined range continues for a predetermined period.
  • In step 1211, the display control unit 305 changes the superposition state of the photoacoustic image. For example, if the photoacoustic image was superimposed on the ultrasonic image before step 1211, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image, an ultrasonic image on which the photoacoustic image is not superimposed. If the photoacoustic image was not superimposed on the ultrasonic image before step 1211, the display control unit 305 causes the display unit 104 to display, in place of the ultrasonic image, an ultrasonic image on which the photoacoustic image is superimposed. In this way, the switching of the superposition state of the photoacoustic image is executed in step 1211.
  • the determination unit 304 may not execute the determination in step 1210 within a predetermined period after the superposition state is changed in step 1211 so that the superposition state is not frequently changed. The same applies to other examples described later.
  • In step 1212, the determination unit 304 determines whether or not the moving speed of the probe 102 is equal to or higher than another predetermined value (threshold value) that is the upper limit of the predetermined range. If the determination unit 304 determines that the moving speed of the probe 102 is equal to or higher than the threshold, the process proceeds to step 1213. If the determination unit 304 determines that the moving speed of the probe 102 is smaller than the threshold (that is, not more than the predetermined value used in step 601), the process returns to step 1210. That is, according to the example of the process shown in FIG. 12B, once the superposition state of the photoacoustic image is changed, the display state is maintained even if the probe is stopped.
  • In step 1213, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. That is, the display control unit 305 causes the display unit 104 to display, in real time, an ultrasonic image on which no photoacoustic image is superimposed. If the photoacoustic image is not superimposed on the ultrasonic image before step 1213, the display control unit 305 causes the display unit 104 to display an ultrasonic image on which the photoacoustic image is not superimposed.
  • whether or not to superimpose the photoacoustic image on the ultrasonic image can be switched by a simple operation on the probe 102. Therefore, the user can observe in detail an ultrasonic image on which the photoacoustic image is not superimposed and displayed by a simple operation on the probe 102. Furthermore, the user can superimpose the photoacoustic image again on the ultrasonic image by a simple operation on the probe 102.
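The toggle behaviour of steps 1210 through 1213 can be pictured as a small state machine: a speed inside a "toggle band" flips the superposition state once per entry into the band, a speed at or above the band's upper limit forces the plain ultrasonic display, and the state is otherwise kept even while the probe is stopped. The band limits below are illustrative assumptions.
```python
class SuperpositionToggle:
    """State machine mirroring the toggle/cancel logic of steps 1210-1213 (sketch)."""

    def __init__(self, band_low: float = 5.0, band_high: float = 20.0):
        self.band_low = band_low       # lower edge of the toggle band (placeholder)
        self.band_high = band_high     # upper edge; at or above this, superposition is cancelled
        self.superimposed = True       # whether the photoacoustic image is overlaid
        self._toggled_in_band = False  # avoid repeated toggling while staying in the band

    def update(self, speed: float) -> bool:
        if self.band_low < speed < self.band_high:
            if not self._toggled_in_band:          # toggle once per entry into the band
                self.superimposed = not self.superimposed
                self._toggled_in_band = True
        else:
            self._toggled_in_band = False
            if speed >= self.band_high:            # fast motion: show plain ultrasound
                self.superimposed = False
        return self.superimposed

if __name__ == "__main__":
    toggle = SuperpositionToggle()
    for s in [2.0, 10.0, 12.0, 2.0, 10.0, 30.0, 2.0]:
        print(f"speed={s:5.1f} -> superimposed={toggle.update(s)}")
```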
  • In the example described above, the superimposition state of the photoacoustic image is changed using the moving speed of the probe 102, but the display control unit 305 may change the superposition state of the photoacoustic image using other information. For example, the rotational speed of the probe 102 may be used instead of the moving speed of the probe 102; in that case, the display control unit 305 may change the superimposed state of the photoacoustic image when the rotational speed is within a predetermined range. The display control unit 305 may also change the superimposed state of the photoacoustic image when the moving speed and the rotational speed of the probe 102 are each within respective predetermined ranges. Likewise, the acceleration of the probe 102 may be used instead of the moving speed of the probe 102, and the display control unit 305 may change the superimposed state of the photoacoustic image based on the acceleration.
  • In the example described above, the display control unit 305 causes the photoacoustic image to be displayed superimposed on the ultrasonic image in accordance with the moving speed of the probe 102, but the pressure at which the probe 102 is pressed against the subject may be further used. For example, the display control unit 305 may cause the display unit 104 to display the photoacoustic image superimposed on the ultrasonic image when the moving speed of the probe 102 is equal to or lower than a predetermined value and the pressure at which the probe 102 is pressed against the subject is equal to or higher than a predetermined value.
  • In this case as well, the display control unit 305 changes the superimposed state of the photoacoustic image. That is, when the photoacoustic image has been superimposed on the ultrasonic image beforehand, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image, an ultrasonic image on which the photoacoustic image is not superimposed. When the photoacoustic image has not been superimposed on the ultrasonic image beforehand, the display control unit 305 causes the display unit 104 to display, in place of the ultrasonic image, an ultrasonic image on which the photoacoustic image is superimposed. The display control unit 305 can likewise return to displaying, on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. By the above processing, whether or not to superimpose the photoacoustic image on the ultrasonic image can be switched by a simple operation on the probe 102. Therefore, the user can observe in detail an ultrasonic image on which the photoacoustic image is not superimposed, and can superimpose the photoacoustic image on the ultrasonic image again, each by a simple operation on the probe 102.
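Such a combined condition might look like the following check; thresholds and units are again placeholders chosen only for illustration.
```python
def should_superimpose(speed_mm_s: float, pressure_kpa: float,
                       speed_max: float = 10.0, pressure_min: float = 8.0) -> bool:
    """Superimpose the photoacoustic image when the probe is slow AND firmly pressed."""
    return speed_mm_s <= speed_max and pressure_kpa >= pressure_min

if __name__ == "__main__":
    print(should_superimpose(speed_mm_s=4.0, pressure_kpa=12.0))   # True: slow and pressed
    print(should_superimpose(speed_mm_s=25.0, pressure_kpa=12.0))  # False: moving too fast
```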
  • In the examples described above, the display control unit 305 causes the photoacoustic image to be displayed superimposed on the ultrasonic image on the display unit 104 based on information regarding the displacement of the probe 102. In addition, the display control unit 305 may change the superimposed state of the photoacoustic image based on information indicating the angle of the probe 102 detected by the gyro sensor. For example, the display control unit 305 changes the superposition state of the photoacoustic image when the moving speed of the probe 102 is equal to or less than a predetermined value and the change in angle of the probe 102 within a predetermined period is equal to or greater than a predetermined value. In this case, when the photoacoustic image has been superimposed on the ultrasonic image beforehand, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image, an ultrasonic image on which the photoacoustic image is not superimposed. When the photoacoustic image has not been superimposed on the ultrasonic image beforehand, the display control unit 305 causes the display unit 104 to display, in place of the ultrasonic image, an ultrasonic image on which the photoacoustic image is superimposed.
  • the display control unit 305 causes the display unit 104 to display an ultrasonic image on which the photoacoustic image is not superimposed when the moving speed of the probe 102 exceeds a predetermined value.
  • By the above processing, whether or not to superimpose the photoacoustic image on the ultrasonic image can be switched by a simple operation on the probe 102. Therefore, the user can observe in detail an ultrasonic image on which the photoacoustic image is not superimposed, and can superimpose the photoacoustic image on the ultrasonic image again, each by a simple operation on the probe 102.
  • In the example described above, the display control unit 305 changes the superposition state in step 1211, but the present invention is not limited to this.
  • Suppose that the display control unit 305 has controlled the display unit 104 so that the photoacoustic image is not superimposed on the ultrasonic image while the moving speed of the probe 102 is within the predetermined range. Thereafter, even when the moving speed of the probe 102 again falls within the predetermined range, the display control unit 305 may refrain from superimposing the photoacoustic image on the ultrasonic image.
  • In that case, in order to superimpose the photoacoustic image again, the probe 102 may be moved as follows. The probe 102 is first moved so that its moving speed exceeds the upper limit of the predetermined range, and is then moved so that its moving speed becomes equal to or less than the predetermined value used in step 601. That is, when the determination unit 304 determines that the moving speed of the probe 102 has become equal to or less than the predetermined value used in step 601 after having exceeded the upper limit of the predetermined range, the display control unit 305 causes the display unit 104 to display the ultrasonic image on which the photoacoustic image is superimposed.
  • A similar modification applies to the example in which the display control unit 305 changes the superposition state of the photoacoustic image based on the moving speed of the probe 102 and the pressure with which it is pressed against the subject. Suppose that the display control unit 305 has controlled the display unit 104 so that the photoacoustic image is not superimposed on the ultrasonic image because the moving speed of the probe 102 became larger than the predetermined value while the pressure at which the probe 102 is pressed against the subject was equal to or higher than the predetermined value. After this control, the display control unit 305 may refrain from superimposing the photoacoustic image on the ultrasonic image even when the moving speed of the probe 102 again becomes equal to or lower than the predetermined value and the pressure at which the probe 102 is pressed against the subject is equal to or higher than the predetermined value.
  • In order to display the ultrasonic image on which the photoacoustic image is superimposed on the display unit 104 again, the probe 102 may be operated as follows, for example. After the pressure at which the probe 102 is pressed against the subject is once brought below the predetermined value, the probe 102 is operated so that its moving speed is equal to or lower than the predetermined value and the pressure at which it is pressed against the subject is equal to or higher than the predetermined value. In this case, the display control unit 305 causes the display unit 104 to display the photoacoustic image superimposed on the ultrasonic image again.
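The re-arming behaviour described here, where the superimposed display does not return until the pressure has first dropped below the threshold and the slow-and-pressed condition is then satisfied again, is essentially a latch. The sketch below uses placeholder thresholds and is only one possible reading of this behaviour.
```python
class SuperpositionLatch:
    """Once cancelled, superposition returns only after the probe is released and re-pressed."""

    def __init__(self, speed_max: float = 10.0, pressure_min: float = 8.0):
        self.speed_max = speed_max        # mm/s, placeholder
        self.pressure_min = pressure_min  # kPa, placeholder
        self.superimposed = False
        self._armed = True   # becomes False after cancellation, re-armed by releasing pressure

    def update(self, speed: float, pressure: float) -> bool:
        if pressure < self.pressure_min:
            self._armed = True                 # releasing the probe re-arms the latch
        if speed > self.speed_max:
            self.superimposed = False          # fast motion cancels the overlay
            self._armed = False
        elif self._armed and pressure >= self.pressure_min:
            self.superimposed = True           # slow and firmly pressed: overlay again
        return self.superimposed

if __name__ == "__main__":
    latch = SuperpositionLatch()
    for v, p in [(5, 12), (20, 12), (5, 12), (5, 2), (5, 12)]:
        print(f"speed={v} pressure={p} -> superimposed={latch.update(v, p)}")
```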
  • With this configuration, the photoacoustic image is not easily superimposed on the ultrasonic image again, so the user is less likely to be distracted by the display switching and can concentrate on observing the ultrasonic image.
  • In the example described above, the superposition state of the photoacoustic image is changed when the moving speed of the probe 102 is equal to or less than a predetermined value and the change in angle of the probe 102 within a predetermined period is equal to or greater than a predetermined value, but the present invention is not limited to this. Suppose that the display control unit 305 has controlled the display unit 104 so that the photoacoustic image is not superimposed on the ultrasonic image when the moving speed of the probe 102 is equal to or less than the predetermined value and the change in angle of the probe 102 within the predetermined period is equal to or greater than the predetermined value. Thereafter, the display control unit 305 may refrain from superimposing the photoacoustic image on the ultrasonic image even when the moving speed of the probe 102 is again equal to or less than the predetermined value and the change in angle of the probe 102 within the predetermined period is again equal to or greater than the predetermined value. In order to display the superimposed image again, the probe 102 may be operated as follows: for example, after the moving speed of the probe 102 is made larger than the predetermined value, the moving speed of the probe 102 is brought back to the predetermined value or less. In this case, the display control unit 305 causes the display unit 104 to display the ultrasonic image on which the photoacoustic image is superimposed. Accordingly, even when the angle of the probe 102 is changed without moving the probe 102, or the probe 102 is moved only slightly, the ultrasonic image on which the photoacoustic image is not superimposed can continue to be displayed. With this configuration, the photoacoustic image is not easily superimposed on the ultrasonic image again, so the user is less likely to be distracted by the display switching and can concentrate on observing the ultrasonic image.
  • The present invention can also be realized by supplying a program that implements one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and causing one or more processors in a computer of the system or apparatus to read and execute the program. It can also be realized by a circuit (for example, an ASIC: application specific integrated circuit) that implements one or more functions.
  • The control device in each of the above-described embodiments may be realized as a single device, or a plurality of devices may be combined so as to be able to communicate with each other and execute the above-described processing; both cases are included in the embodiments of the present invention.
  • the above-described processing may be executed by a common server device or server group.
  • the plurality of devices constituting the control device and the control system need only be able to communicate at a predetermined communication rate, and do not need to exist in the same facility or in the same country.
  • The embodiments of the present invention also include a form in which a software program that realizes the functions of the above-described embodiments is supplied to a system or apparatus, and a computer of the system or apparatus reads and executes the code of the supplied program. Accordingly, in order for the processing according to the embodiments to be realized by the computer, the program code itself installed in the computer is also one embodiment of the present invention. In addition, an OS or the like running on the computer may perform part or all of the actual processing, and the functions of the above-described embodiments may be realized by that processing.
  • Embodiments appropriately combining the above-described embodiments are also included in the embodiments of the present invention.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

This control device acquires an ultrasonic signal and a photoacoustic signal from a probe, which outputs the ultrasonic signal by means of transmission/reception of ultrasonic waves to and from a subject, and which outputs the photoacoustic signal by receiving photoacoustic waves generated due to light irradiation to the subject. The control device acquires information relating to displacement of the probe, and on the basis of the information relating to the displacement of the probe, the control device causes a display unit to display a photoacoustic image.

Description

制御装置、制御方法、制御システム及びプログラムControl device, control method, control system, and program
 本発明は、制御装置、制御方法、制御システム及びプログラムに関する。 The present invention relates to a control device, a control method, a control system, and a program.
 被検体内部の状態を低侵襲に画像化する撮像装置として、超音波撮像装置や光音響撮像装置が利用されている。特許文献1には、プローブに設けられたモード切替スイッチに対する操作により、光音響信号の検出を含む動作モードと、光音響信号の検出を含まない動作モードとを切替可能な光音響計測装置が開示されている。 An ultrasonic imaging device or a photoacoustic imaging device is used as an imaging device that images a state inside a subject in a minimally invasive manner. Patent Document 1 discloses a photoacoustic measurement device capable of switching between an operation mode including detection of a photoacoustic signal and an operation mode not including detection of a photoacoustic signal by an operation on a mode switch provided on the probe. Has been.
特開2012-196430号公報JP 2012-196430 A
 超音波信号と光音響信号とを取得する撮像装置において、超音波信号や光音響信号の検出に関する動作モードを切り替えながら撮像が行われることが想定される。しかしながら、当該動作モードを切り替えるために、プローブに設けられたモード切替スイッチを操作する必要がある場合、ユーザはプローブの操作を中断する場合がある。中断している間に被検体の体動やプローブ位置のずれが生じて、ユーザは所望の画像を観察できないおそれがある。 In an imaging apparatus that acquires an ultrasonic signal and a photoacoustic signal, it is assumed that imaging is performed while switching operation modes related to the detection of the ultrasonic signal and the photoacoustic signal. However, when it is necessary to operate a mode switch provided on the probe to switch the operation mode, the user may interrupt the operation of the probe. There is a possibility that the user cannot observe a desired image due to the body movement of the subject or the displacement of the probe position during the interruption.
 本明細書が開示する制御装置は、被検体に対する超音波の送受信により超音波信号を出力し、被検体への光照射により発生する光音響波を受信することにより光音響信号を出力するプローブから、前記超音波信号と前記光音響信号とを取得する第1の取得手段と、前記プローブの移動に関する情報を取得する第2の取得手段と、前記移動に関する情報に基づいて、前記光音響信号を用いて生成される光音響画像を表示部に表示させる表示制御手段と、を有することを特徴とする。 The control device disclosed in this specification outputs an ultrasonic signal by transmitting / receiving ultrasonic waves to / from a subject, and receives a photoacoustic wave generated by light irradiation on the subject, from a probe that outputs a photoacoustic signal. Based on the information related to the movement, the first acquisition means for acquiring the ultrasonic signal and the photoacoustic signal, the second acquisition means for acquiring information related to the movement of the probe, and the photoacoustic signal Display control means for displaying on the display unit a photoacoustic image generated by use.
 本発明によれば、プローブの移動に関する情報に基づいて光音響信号から生成される光音響画像を表示部に表示させることができるので、超音波信号や光音響信号の検出に関する動作モードを切り替えるための操作を行う手間を低減できる。 According to the present invention, since the photoacoustic image generated from the photoacoustic signal based on the information related to the movement of the probe can be displayed on the display unit, the operation mode related to the detection of the ultrasonic signal and the photoacoustic signal is switched. The trouble of performing the operation can be reduced.
The drawings are as follows: a diagram showing an example of the configuration of a system including the control device according to an embodiment of the present invention; a diagram showing an example of the hardware configuration of the control device according to the embodiment; a diagram showing an example of the functional configuration of the control device according to the embodiment; a diagram showing an example of an image displayed on the display unit by the control device according to the embodiment; a diagram showing an example of a configuration including the control device according to the first embodiment; a flowchart showing an example of processing by the control device according to the first embodiment; a diagram showing an example of a configuration including the control device according to the first embodiment; a diagram showing an example of a configuration including the control device according to the second embodiment; a flowchart showing an example of processing by the control device according to the second embodiment; a flowchart showing an example of processing by the control device according to the third embodiment; a flowchart showing an example of processing by the control device according to an embodiment of the present invention; and a flowchart showing an example of processing by the control device according to an embodiment of the present invention.
 以下、図面を参照して本発明の実施形態を説明する。 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
 [第1の実施形態]
 本願明細書では、被検体に光を照射し、被検体内部で生じた膨張によって発生する音響波を光音響波と称する。また、トランスデューサから送信された音響波または当該送信された音響波が被検体内部で反射した反射波(エコー)を超音波と称する。
[First Embodiment]
In the specification of the present application, an acoustic wave generated by irradiating a subject with light and expanding inside the subject is referred to as a photoacoustic wave. In addition, an acoustic wave transmitted from the transducer or a reflected wave (echo) in which the transmitted acoustic wave is reflected inside the subject is referred to as an ultrasonic wave.
 被検体内部の状態を低侵襲に画像化する方法として、超音波を用いた画像化の方法や光音響波を用いた画像化の手法が利用されている。超音波を用いた画像化の方法は、たとえばトランスデューサから発振された超音波が被検体内部の組織で音響インピーダンスの差に応じて反射され、反射波がトランスデューサに到達するまでの時間や反射波の強度に基づいて画像を生成する方法である。超音波を用いて画像化された画像を以下では超音波画像と称する。ユーザはプローブの角度等を変えながら操作し、様々な断面の超音波画像をリアルタイムに観察することができる。超音波画像には臓器や組織の形状が描出され、腫瘍の発見等に活用されている。また、光音響波を用いた画像化の方法は、たとえば光を照射された被検体内部の組織が断熱膨張することにより発生する超音波(光音響波)に基づいて画像を生成する方法である。光音響波を用いて画像化された画像を以下では光音響画像と称する。光音響画像には各組織の光の吸収の度合いといった光学特性に関連した情報が描出される。光音響画像では、たとえばヘモグロビンの光学特性により血管を描出できることが知られており、腫瘍の悪性度の評価等への活用が検討されている。 As a method for imaging a state inside a subject in a minimally invasive manner, an imaging method using ultrasonic waves and an imaging method using photoacoustic waves are used. The method of imaging using ultrasonic waves is, for example, that the ultrasonic waves oscillated from the transducer are reflected by the tissue inside the subject according to the difference in acoustic impedance, and the time until the reflected wave reaches the transducer and the reflected wave This is a method for generating an image based on intensity. An image imaged using ultrasound is hereinafter referred to as an ultrasound image. The user can operate while changing the angle of the probe and observe ultrasonic images of various cross sections in real time. Ultrasound images depict the shapes of organs and tissues and are used to find tumors. The imaging method using photoacoustic waves is a method of generating an image based on, for example, ultrasonic waves (photoacoustic waves) generated by adiabatic expansion of tissue inside a subject irradiated with light. . An image imaged using the photoacoustic wave is hereinafter referred to as a photoacoustic image. In the photoacoustic image, information related to optical characteristics such as the degree of light absorption of each tissue is depicted. In photoacoustic images, for example, it is known that blood vessels can be drawn by the optical characteristics of hemoglobin, and its use for evaluating the malignancy of tumors is being studied.
 診断の精度を高めるために、被検体の同一部位を、異なる原理に基づいて異なる現象を画像化することにより、様々な情報を収集する場合がある。たとえば、CT(Computed Tomography)画像で得られた形態情報と、PET(positronemission tomography)画像で得られた代謝に関する機能情報とを組み合わせて、がんに関する診断を行う場合がある。このように、異なる原理に基づいて異なる現象を画像化して得られた情報を用いて診断を行うことは、診断の精度向上に有効であると考えられる。 In order to improve the accuracy of diagnosis, various information may be collected by imaging different phenomena on the same part of the subject based on different principles. For example, there is a case where diagnosis about cancer is performed by combining morphological information obtained from a CT (Computed Tomography) image and functional information relating to metabolism obtained from a PET (Positronization Tomography) image. As described above, it is considered effective to improve diagnosis accuracy to perform diagnosis using information obtained by imaging different phenomena based on different principles.
 上述した超音波画像と光音響画像に関しても、それぞれの特性を組み合わせた画像を得るための撮像装置が検討されている。特に、超音波画像も光音響画像も被検体からの超音波を利用して画像化されることから、超音波画像の撮像と光音響画像の撮像とを同じ撮像装置で行うことも可能である。より具体的には、被検体に照射した反射波と光音響波とを同じトランスデューサで受信する構成とすることができる。これにより、超音波信号と光音響信号とを一つのプローブで取得することができ、ハードウェア構成が複雑にならずに、超音波画像の撮像と光音響画像の撮像とを行う撮像装置を実現できる。 As for the above-described ultrasonic image and photoacoustic image, an imaging device for obtaining an image obtained by combining the respective characteristics has been studied. In particular, since both an ultrasonic image and a photoacoustic image are imaged using ultrasonic waves from a subject, it is also possible to perform imaging of an ultrasonic image and photoacoustic image with the same imaging device. . More specifically, it can be configured such that the reflected wave and the photoacoustic wave irradiated to the subject are received by the same transducer. As a result, it is possible to acquire an ultrasonic signal and a photoacoustic signal with a single probe, and realize an imaging device that captures an ultrasonic image and a photoacoustic image without complicating the hardware configuration. it can.
 超音波画像の撮像と光音響画像の撮像とを行う撮像装置において、ユーザは従来の超音波画像の撮像と同様にプローブの操作を行いたい場合が想定される。すなわちユーザはプローブを被検体の表面に接触させ、当該プローブにより取得された情報に基づいて表示される画像を観察しながらプローブを操作することが考えられる。その際に、信号取得や画像表示に関する動作モードの切り替えを、たとえばプローブに設けられたスイッチや、当該撮像装置の操作卓に設けられた入力デバイスを介して行うと、ユーザは画像を観察しながらのプローブ操作を中断する必要がある。そのため、スイッチや操作卓の入力デバイスへの操作入力の間に被検体の体動が生じたり、プローブ位置がずれたりすることが考えられる。 In an imaging apparatus that captures an ultrasonic image and a photoacoustic image, it is assumed that the user wants to operate the probe in the same manner as a conventional ultrasonic image. That is, it is conceivable that the user touches the surface of the subject and operates the probe while observing an image displayed based on information acquired by the probe. At that time, if the operation mode related to signal acquisition or image display is switched via, for example, a switch provided on the probe or an input device provided on the console of the imaging device, the user observes the image. It is necessary to interrupt the probe operation. For this reason, it is conceivable that the body movement of the subject occurs during the operation input to the switch or the input device of the console, or the probe position shifts.
 たとえば、上述の例のように超音波画像と光音響画像とを組み合わせて観察し、腫瘍の悪性度を評価する場合を考える。ユーザは超音波画像を観察しながらプローブを操作したところ、腫瘍の可能性がある部位を発見し、光音響画像を取得して血管の情報を収集したいとする。このとき、光音響画像を表示するための動作モードに切り替えるために上述したスイッチや操作卓の入力デバイスへの操作入力の間に、腫瘍の可能性があると考えた部位を観察できる位置からプローブがずれてしまうおそれがある。第1の実施形態は、ユーザが画像を観察する際の操作性を低下させずに、表示させる画像を切り替えることができる制御装置を提供することを目的とする。 For example, let us consider a case in which the malignancy of a tumor is evaluated by observing a combination of an ultrasonic image and a photoacoustic image as in the above example. A user operates a probe while observing an ultrasound image, finds a site that may be a tumor, acquires a photoacoustic image, and collects blood vessel information. At this time, in order to switch to the operation mode for displaying the photoacoustic image, the probe from a position where it is possible to observe the part considered to be a tumor during the operation input to the switch or the input device of the console described above. May shift. An object of the first embodiment is to provide a control device that can switch an image to be displayed without deteriorating operability when a user observes an image.
 図1は、第1の実施形態に係る制御装置101を含むシステムの構成の一例を示す図である。超音波画像と光音響画像とを生成可能な撮像システム100は、ネットワーク110を介して各種の外部装置と接続されている。撮像システム100に含まれる各構成及び各種の外部装置は、同じ施設内に設置されている必要はなく、通信可能に接続されていればよい。 FIG. 1 is a diagram illustrating an example of a configuration of a system including a control device 101 according to the first embodiment. An imaging system 100 that can generate an ultrasonic image and a photoacoustic image is connected to various external devices via a network 110. Each configuration and various external devices included in the imaging system 100 do not need to be installed in the same facility, and may be connected to be communicable.
 撮像システム100は、制御装置101、プローブ102、検知部103、表示部104、操作部105を含む。制御装置101は、プローブ102から超音波信号と光音響信号とを取得し、検知部103から取得されるプローブ102の移動に関する情報に基づいて、超音波画像と光音響画像とを表示部104に表示可能な装置である。また、制御装置101は、超音波画像ならびに光音響画像の撮像を含む検査に関する情報をオーダリングシステム112から取得し、当該検査が行われる際にプローブ102や検知部103や表示部104を制御する。制御装置101は、生成された超音波画像、光音響画像、超音波画像に光音響画像を重畳した重畳画像をPACS113に出力する。制御装置101は、HL7(Health level 7)やDICOM(Digital Imaging and Communications in Medicine)といった規格に準じて、オーダリングシステム112やPACS113といった外部装置との間で情報の送受信を行う。制御装置101により行われる処理についての詳細は、後述する。 The imaging system 100 includes a control device 101, a probe 102, a detection unit 103, a display unit 104, and an operation unit 105. The control device 101 acquires an ultrasonic signal and a photoacoustic signal from the probe 102, and displays an ultrasonic image and a photoacoustic image on the display unit 104 based on information related to movement of the probe 102 acquired from the detection unit 103. It is a displayable device. In addition, the control device 101 acquires information related to an examination including imaging of an ultrasonic image and a photoacoustic image from the ordering system 112, and controls the probe 102, the detection unit 103, and the display unit 104 when the examination is performed. The control device 101 outputs the generated ultrasonic image, photoacoustic image, and superimposed image obtained by superimposing the photoacoustic image on the ultrasonic image to the PACS 113. The control device 101 transmits and receives information to and from external devices such as the ordering system 112 and the PACS 113 in accordance with standards such as HL7 (Health level 7) and DICOM (Digital Imaging and Communications in Medicine). Details of processing performed by the control device 101 will be described later.
 プローブ102は、ユーザにより操作され、超音波信号と光音響信号とを制御装置101に送信する。プローブ102は、送受信部106と照射部107とを含む。プローブ102は、送受信部106から超音波を送信し、反射波を送受信部106で受信する。また、プローブ102は照射部107から被検体に光を照射し、光音響波を送受信部106で受信する。プローブ102は受信した反射波ならびに光音響波を電気信号に変換し、超音波信号ならびに光音響信号として制御装置101に送信する。プローブ102は、被検体との接触を示す情報を受信したときに、超音波信号を取得するための超音波の送信ならびに光音響信号を取得するための光照射が実行されるように制御されることが好ましい。プローブ102は、超音波信号と光音響信号とを取得し、これらを交互に取得してもよいし、同時に取得してもよいし、予め定められた態様で取得してもよい。 The probe 102 is operated by a user and transmits an ultrasonic signal and a photoacoustic signal to the control device 101. The probe 102 includes a transmission / reception unit 106 and an irradiation unit 107. The probe 102 transmits an ultrasonic wave from the transmission / reception unit 106 and receives the reflected wave by the transmission / reception unit 106. Further, the probe 102 irradiates the subject with light from the irradiation unit 107, and the photoacoustic wave is received by the transmission / reception unit 106. The probe 102 converts the received reflected wave and photoacoustic wave into an electric signal, and transmits it to the control device 101 as an ultrasonic signal and a photoacoustic signal. The probe 102 is controlled so that, when information indicating contact with the subject is received, transmission of ultrasonic waves for acquiring an ultrasonic signal and light irradiation for acquiring a photoacoustic signal are executed. It is preferable. The probe 102 acquires an ultrasonic signal and a photoacoustic signal, may acquire these alternately, may acquire simultaneously, and may acquire in a predetermined aspect.
 送受信部106は、少なくとも1つのトランスデューサ(不図示)と、整合層(不図示)、ダンパー(不図示)、音響レンズ(不図示)を含む。トランスデューサ(不図示)はPZT(lead zirconate titanate)やPVDF(polyvinylidene difluoride)といった、圧電効果を示す物質からなる。トランスデューサ(不図示)は圧電素子以外のものでもよく、たとえば静電容量型トランスデューサ(CMUT:capacitive micro-machined ultrasonic transducers)、ファブリペロー干渉計を用いたトランスデューサである。典型的には、超音波信号は2~20MHz、光音響信号は0.1~100MHzの周波数成分からなり、トランスデューサ(不図示)はこれらの周波数を検出できるものが用いられる。トランスデューサ(不図示)により得られる信号は時間分解信号である。受信された信号の振幅は各時刻にトランスデューサで受信される音圧に基づく値を表したものである。送受信部106は、電子フォーカスのための回路(不図示)もしくは制御部を含む。トランスデューサ(不図示)の配列形は、たとえばセクタ、リニアアレイ、コンベックス、アニュラアレイ、マトリクスアレイである。 The transmission / reception unit 106 includes at least one transducer (not shown), a matching layer (not shown), a damper (not shown), and an acoustic lens (not shown). The transducer (not shown) is made of a material exhibiting a piezoelectric effect, such as PZT (lead zirconate titanate) or PVDF (polyvinylidene difluoride). The transducer (not shown) may be other than a piezoelectric element, for example, a transducer using a capacitive transducer (CMUT: capacitive ultrasonic transducer) or a Fabry-Perot interferometer. Typically, the ultrasonic signal is composed of frequency components of 2 to 20 MHz and the photoacoustic signal is composed of frequency components of 0.1 to 100 MHz, and a transducer (not shown) that can detect these frequencies is used. The signal obtained by the transducer (not shown) is a time-resolved signal. The amplitude of the received signal represents a value based on the sound pressure received by the transducer at each time. The transmission / reception unit 106 includes a circuit (not shown) or a control unit for electronic focusing. The array form of transducers (not shown) is, for example, a sector, a linear array, a convex, an annular array, or a matrix array.
 送受信部106は、トランスデューサ(不図示)が受信した時系列のアナログ信号を増幅する増幅器(不図示)を備えていてもよい。また、送受信部106は、トランスデューサ(不図示)が受信した時系列のアナログ信号を時系列のデジタル信号に変換するA/D変換器を備えていてもよい。トランスデューサ(不図示)は、超音波画像の撮像の目的に応じて、送信用と受信用とに分割されてもよい。また、トランスデューサ(不図示)は、超音波画像の撮像用と、光音響画像の撮像用とに分割されてもよい。 The transmitting / receiving unit 106 may include an amplifier (not shown) that amplifies a time-series analog signal received by a transducer (not shown). The transmission / reception unit 106 may include an A / D converter that converts a time-series analog signal received by a transducer (not shown) into a time-series digital signal. The transducer (not shown) may be divided into a transmitter and a receiver depending on the purpose of imaging an ultrasonic image. Further, the transducer (not shown) may be divided into an ultrasonic image capturing unit and a photoacoustic image capturing unit.
 照射部107は、光音響信号を取得するための光源(不図示)と、光源(不図示)から射出されたパルス光を被検体へ導く光学系(不図示)とを含む。光源(不図示)が射出する光のパルス幅は、たとえば1ns以上、100ns以下のパルス幅である。また、光源(不図示)が射出する光の波長は、たとえば400nm以上、1600nm以下の波長である。被検体の表面近傍の血管を高解像度でイメージングする場合は、400nm以上、700nm以下の、血管での吸収が大きい波長が好ましい。また、被検体の深部をイメージングする場合には、700nm以上、1100nm以下の、水や脂肪といった組織で吸収されにくい波長が好ましい。 The irradiation unit 107 includes a light source (not shown) for acquiring a photoacoustic signal and an optical system (not shown) that guides pulsed light emitted from the light source (not shown) to the subject. The pulse width of light emitted from a light source (not shown) is, for example, 1 ns or more and 100 ns or less. Moreover, the wavelength of the light which a light source (not shown) injects is a wavelength of 400 nm or more and 1600 nm or less, for example. When imaging a blood vessel in the vicinity of the surface of the subject with a high resolution, a wavelength of 400 nm or more and 700 nm or less and a large absorption in the blood vessel is preferable. Moreover, when imaging the deep part of a test object, the wavelength of 700 nm or more and 1100 nm or less which is hard to be absorbed by tissues such as water and fat is preferable.
 光源(不図示)は、たとえばレーザーや発光ダイオードである。照射部107は、複数の波長の光を用いて光音響信号を取得するために、波長を変換できる光源を用いてもよい。あるいは、照射部107は、互いに異なる波長の光を発生する複数の光源を備え、それぞれの光源から交互に異なる波長の光を照射できる構成であってもよい。レーザーは、たとえば固体レーザー、ガスレーザー、色素レーザー、半導体レーザーである。光源(不図示)として、Nd:YAGレーザーやアレキサンドライトレーザーといったパルスレーザーを用いてもよい。また、Nd:YAGレーザーの光を励起光とするTi:saレーザーやOPO(optical parametric oscillators)レーザーを光源(不図示)として用いてもよい。また、光源(不図示)として、マイクロウェーブ源を用いてもよい。 The light source (not shown) is, for example, a laser or a light emitting diode. The irradiation unit 107 may use a light source that can convert wavelengths in order to acquire a photoacoustic signal using light of a plurality of wavelengths. Alternatively, the irradiation unit 107 may include a plurality of light sources that generate light of different wavelengths, and may be configured to be able to irradiate light of different wavelengths alternately from each light source. The laser is, for example, a solid laser, a gas laser, a dye laser, or a semiconductor laser. As a light source (not shown), a pulsed laser such as an Nd: YAG laser or an alexandrite laser may be used. Further, a Ti: sa laser or an OPO (optical parametric oscillators) laser that uses Nd: YAG laser light as excitation light may be used as a light source (not shown). A microwave source may be used as a light source (not shown).
Optical elements such as lenses, mirrors, and optical fibers are used in the optical system (not shown). When the subject is a breast, it is preferable to irradiate the pulsed light with a widened beam diameter, so the optical system (not shown) may include a diffusion plate that diffuses the emitted light. Alternatively, to increase resolution, the optical system (not shown) may include a lens or the like so that the beam can be focused.
The detection unit 103 acquires information on the displacement of the probe 102. In the first embodiment, the detection unit 103 is described taking as an example a case where it includes the magnetic transmitter 503 and the magnetic sensor 502 shown in FIG. 5. As information on the movement of the probe 102, the detection unit 103 acquires, for example, information indicating the speed of movement of the probe 102 relative to the subject, information on the speed of rotation of the probe 102, and information indicating the degree of pressure applied to the subject. The detection unit 103 transmits the acquired information on the movement of the probe 102 to the control device 101.
The display unit 104 displays, under the control of the control device 101, images captured by the imaging system 100 and information related to the examination. The display unit 104 also provides, under the control of the control device 101, an interface for receiving user instructions. The display unit 104 is, for example, a liquid crystal display.
The operation unit 105 transmits information on the user's operation inputs to the control device 101. The operation unit 105 is, for example, a keyboard, a trackball, and various buttons for entering operation inputs related to the examination.
The display unit 104 and the operation unit 105 may be integrated as a touch-panel display. The control device 101, the display unit 104, and the operation unit 105 need not be separate devices, and these components may be integrated as in the console 501 of FIG. 5. The control device 101 may have a plurality of probes.
The HIS (Hospital Information System) 111 is a system that supports hospital operations. The HIS 111 includes an electronic medical record system, an ordering system, and a medical accounting system. The HIS 111 enables coordinated management from the issuance of an examination order through billing. The ordering system of the HIS 111 transmits order information to the ordering system 112 of each department, and the execution of the order is managed by the ordering system 112 described below.
The ordering system 112 is a system that manages examination information and manages the progress of each examination on the imaging apparatus. The ordering system 112 may be configured for each department that performs examinations; in the radiology department, for example, the ordering system 112 is an RIS (Radiology Information System). In response to an inquiry from the control device 101, the ordering system 112 transmits information on the examination to be performed by the imaging system 100 to the control device 101. The ordering system 112 receives information on the progress of the examination from the control device 101, and when it receives information indicating that the examination has been completed, it transmits information indicating that the examination has been completed to the HIS 111. The ordering system 112 may be integrated into the HIS 111.
The PACS (Picture Archiving and Communication System) 113 is a database system that holds images obtained by various imaging apparatuses inside and outside the facility. The PACS 113 has a storage unit (not shown) that stores medical images together with incidental information such as the imaging conditions of those images, image processing parameters including reconstruction parameters, and patient information, and a controller (not shown) that manages the information stored in the storage unit. The PACS 113 stores the ultrasonic images, photoacoustic images, and superimposed images output from the control device 101. Communication between the PACS 113 and the control device 101, and the various images stored in the PACS 113, preferably conform to standards such as HL7 and DICOM. In the various images output from the control device 101, incidental information is stored in various tags in accordance with the DICOM standard.
The Viewer 114 is a terminal for image diagnosis; it reads images stored in the PACS 113 or the like and displays them for diagnosis. A doctor displays an image on the Viewer 114, observes it, and records the information obtained from the observation as an image diagnosis report. An image diagnosis report created using the Viewer 114 may be stored in the Viewer 114 itself, or may be output to and stored in the PACS 113 or a report server (not shown).
The Printer 115 prints images stored in the PACS 113 or the like. The Printer 115 is, for example, a film printer, and outputs an image stored in the PACS 113 or the like by printing it on film.
FIG. 2 is a diagram illustrating an example of the hardware configuration of the control device 101. The control device 101 includes a CPU 201, a ROM 202, a RAM 203, an HDD 204, a USB 205, a communication circuit 206, a GPU board 207, and an HDMI (registered trademark) 208. These are communicably connected via an internal bus.
The CPU (Central Processing Unit) 201 is a control circuit that integrally controls the control device 101 and the units connected to it. The CPU 201 performs this control by executing programs stored in the ROM 202. The CPU 201 also executes a display driver, which is software for controlling the display unit 104, and performs display control of the display unit 104. The CPU 201 further performs input/output control of the operation unit 105.
The ROM (Read Only Memory) 202 stores programs and data that define the control procedures executed by the CPU.
The RAM (Random Access Memory) 203 is a memory for storing programs for executing the processing of the control device 101 and the units connected to it, as well as various parameters used in image processing. The RAM 203 stores the control programs executed by the CPU 201 and temporarily stores various data while the CPU 201 executes various controls.
The HDD (Hard Disk Drive) 204 is an auxiliary storage device that stores various data such as ultrasonic images and photoacoustic images.
The USB (Universal Serial Bus) 205 is a connection unit that connects to the operation unit 105.
The communication circuit 206 is a circuit for communicating with the units constituting the imaging system 100 and with various external devices connected to the network 110. The communication circuit 206 may be realized by a plurality of components according to the desired communication form.
The GPU board 207 is a general-purpose graphics board including a GPU and video memory. The GPU board 207 constitutes part or all of the image processing unit 303 and performs, for example, reconstruction processing of the photoacoustic image. By using such an arithmetic device, operations such as reconstruction processing can be performed at high speed without requiring dedicated hardware.
The HDMI (registered trademark) (High-Definition Multimedia Interface) 208 is a connection unit that connects to the display unit 104.
The CPU 201 and the GPU are examples of processors, and the ROM 202, the RAM 203, and the HDD 204 are examples of memories. The control device 101 may have a plurality of processors. In the first embodiment, the functions of the units of the control device 101 are realized by the processor of the control device 101 executing programs stored in the memory.
The control device 101 may have a CPU or GPU dedicated to specific processing. The control device 101 may also have an FPGA (Field-Programmable Gate Array) programmed with specific processing or with all of the processing. Furthermore, the control device 101 may have an SSD (Solid State Drive) as a memory; it may have an SSD instead of the HDD 204, or may have both the HDD 204 and an SSD.
FIG. 3 is a diagram illustrating an example of the functional configuration of the control device 101. The control device 101 includes an examination control unit 300, a signal acquisition unit 301, a position acquisition unit 302, an image processing unit 303, a determination unit 304, a display control unit 305, and an output unit 306.
The examination control unit 300 controls the examinations performed in the imaging system 100. The examination control unit 300 acquires examination order information from the ordering system 112. An examination order includes information on the patient undergoing the examination and information on the imaging procedures. The examination control unit 300 controls the probe 102 and the detection unit 103 based on the information on the imaging procedure. The examination control unit 300 also causes the display unit 104, via the display control unit 305, to display information on the examination in order to present it to the user. The examination information displayed on the display unit 104 includes information on the patient undergoing the examination, information on the imaging procedures included in the examination, and images already generated for completed acquisitions. The examination control unit 300 further transmits information on the progress of the examination to the ordering system 112; for example, when the user starts the examination it notifies the ordering system 112 of the start, and when imaging by all the imaging procedures included in the examination is completed it notifies the ordering system 112 of the completion.
The examination control unit 300 further acquires information on the probe 102 used for imaging. The information on the probe 102 includes the probe type, center frequency, sensitivity, acoustic focus, electronic focus, and observation depth. The user connects the probe 102 to, for example, a probe connector port (not shown) of the control device 101, performs an operation input on the control device 101 to activate the probe 102, and enters the imaging conditions and the like. The examination control unit 300 acquires the information on the activated probe 102 and transmits it to the image processing unit 303, the determination unit 304, and the display control unit 305 as appropriate.
The signal acquisition unit 301 acquires an ultrasonic signal and a photoacoustic signal from the probe 102. Specifically, the signal acquisition unit 301 distinguishes the ultrasonic signal and the photoacoustic signal in the information acquired from the probe 102 based on information from the examination control unit 300 and the position acquisition unit 302. For example, when the timing of ultrasonic signal acquisition and photoacoustic signal acquisition is prescribed for the imaging procedure being performed, the signal acquisition unit 301 distinguishes the ultrasonic signal and the photoacoustic signal in the information acquired from the probe 102 based on the acquisition timing information obtained from the examination control unit 300. When, as in an example described later, the photoacoustic signal is acquired based on information on the movement of the probe 102, the two signals are distinguished in the information acquired from the probe 102 based on the information on the movement of the probe 102 obtained from the position acquisition unit 302. The signal acquisition unit 301 is an example of a first acquisition means that acquires at least one of an ultrasonic signal and a photoacoustic signal from the probe 102.
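As a rough illustration of how received frames might be separated into the two signal types by their excitation timing, a minimal sketch is given below; the frame structure, the field names, and the idea of tagging each reception window with the excitation that preceded it are assumptions introduced for illustration, not elements taken from this disclosure.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class Frame:
        timestamp: float   # acquisition time in seconds
        samples: list      # time-resolved transducer data for this window
        excitation: str    # "ultrasound_tx" or "light_pulse" (assumed tag)

    def split_signals(frames: List[Frame]) -> Tuple[List[Frame], List[Frame]]:
        """Separate frames into ultrasonic and photoacoustic signals according
        to which excitation preceded each reception window."""
        ultrasonic = [f for f in frames if f.excitation == "ultrasound_tx"]
        photoacoustic = [f for f in frames if f.excitation == "light_pulse"]
        return ultrasonic, photoacoustic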
The position acquisition unit 302 acquires information on the displacement of the probe 102 based on information from the detection unit 103. For example, based on the information from the detection unit 103, the position acquisition unit 302 acquires at least one of the position and orientation of the probe 102, the speed of its movement relative to the subject, the speed of its rotation, the acceleration of its movement relative to the subject, and information indicating the degree of pressure applied to the subject. That is, the position acquisition unit 302 acquires information indicating how the user is operating the probe 102 with respect to the subject. Based on the information from the detection unit 103, the position acquisition unit 302 can determine, for example, whether the user is holding the probe 102 still in contact with the subject or is moving it at or above a predetermined speed. The position acquisition unit 302 preferably acquires the position information of the probe 102 from the detection unit 103 at regular time intervals, preferably in real time.
The position acquisition unit 302 transmits the information on the displacement of the probe 102 to the examination control unit 300, the image processing unit 303, the determination unit 304, and the display control unit 305 as appropriate. The position acquisition unit 302 is an example of a second acquisition means that acquires information on the displacement of the probe 102.
The image processing unit 303 generates an ultrasonic image, a photoacoustic image, and a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image. The image processing unit 303 generates, from the ultrasonic signal acquired by the signal acquisition unit 301, an ultrasonic image to be displayed on the display unit 104. The image processing unit 303 generates an ultrasonic image suited to the set mode based on the imaging procedure information acquired from the examination control unit 300. For example, when the Doppler mode is set as the imaging procedure, the image processing unit 303 generates an image indicating the flow velocity inside the subject based on the difference between the frequency of the ultrasonic signal acquired by the signal acquisition unit 301 and the transmission frequency.
The image processing unit 303 also generates a photoacoustic image based on the photoacoustic signal acquired by the signal acquisition unit 301. The image processing unit 303 reconstructs, from the photoacoustic signal, the distribution of the acoustic waves generated when the light was irradiated (hereinafter referred to as the initial sound pressure distribution). The image processing unit 303 obtains the light absorption coefficient distribution within the subject by dividing the reconstructed initial sound pressure distribution by the light fluence distribution of the light irradiated onto the subject. Furthermore, using the fact that the degree of light absorption within the subject differs depending on the wavelength of the irradiated light, the concentration distribution of substances within the subject is obtained from the absorption coefficient distributions for a plurality of wavelengths. For example, the image processing unit 303 obtains the concentration distributions of oxyhemoglobin and deoxyhemoglobin within the subject, and further obtains the oxygen saturation distribution as the ratio of the oxyhemoglobin concentration to the deoxyhemoglobin concentration. The photoacoustic image generated by the image processing unit 303 is an image representing information such as the initial sound pressure distribution, light fluence distribution, absorption coefficient distribution, substance concentration distribution, or oxygen saturation distribution described above. That is, the image processing unit 303 is an example of a generation means that generates an ultrasonic image based on the ultrasonic signal and generates a photoacoustic image based on the photoacoustic signal.
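The quantities in this paragraph are related as follows; this is a sketch of the conventional formulation, in which the Grüneisen coefficient Γ and the molar absorption coefficients ε are written explicitly as assumptions (the paragraph itself only describes the division by the fluence).

    \mu_a(\mathbf{r}) \;=\; \frac{p_0(\mathbf{r})}{\Gamma\,\Phi(\mathbf{r})},
    \qquad
    \begin{pmatrix} \mu_a^{\lambda_1}(\mathbf{r}) \\ \mu_a^{\lambda_2}(\mathbf{r}) \end{pmatrix}
    =
    \begin{pmatrix}
      \varepsilon_{\mathrm{HbO_2}}^{\lambda_1} & \varepsilon_{\mathrm{Hb}}^{\lambda_1} \\
      \varepsilon_{\mathrm{HbO_2}}^{\lambda_2} & \varepsilon_{\mathrm{Hb}}^{\lambda_2}
    \end{pmatrix}
    \begin{pmatrix} C_{\mathrm{HbO_2}}(\mathbf{r}) \\ C_{\mathrm{Hb}}(\mathbf{r}) \end{pmatrix}

Here p_0 is the reconstructed initial sound pressure and Φ the light fluence of the irradiated light; solving the two-wavelength linear system for C_HbO2 and C_Hb gives the hemoglobin concentration maps from which the oxygen saturation distribution described above is derived.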
The determination unit 304 determines, based on the information on the displacement of the probe 102 acquired by the position acquisition unit 302, whether to display a photoacoustic image on the display unit 104 via the display control unit 305. That is, the determination unit 304 is an example of a determination means that determines whether to display a photoacoustic image on the display unit 104.
For example, the determination unit 304 determines that a photoacoustic image is to be displayed when the position acquisition unit 302 acquires information indicating that the probe 102 is moving at or below a predetermined speed, or information indicating that the probe 102 is being pressed against the subject at or above a predetermined pressure. As a result, the photoacoustic image is displayed on the display unit 104 while the user is performing an operation intended to observe a specific region of the subject. The user can thus observe the ultrasonic image and the photoacoustic image without performing a special operation input such as pressing a switch having a physical structure.
When the determination unit 304 determines that a photoacoustic image is to be displayed on the display unit 104, the image processing unit 303, for example, generates a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image, and the superimposed image is displayed on the display unit 104 via the display control unit 305. That is, the mode is switched from a mode that displays the ultrasonic image to a mode that displays the ultrasonic image and the photoacoustic image. In another example, when the determination unit 304 determines that a photoacoustic image is to be displayed on the display unit 104, the examination control unit 300 controls the irradiation unit 107 and the signal acquisition unit 301 to acquire a photoacoustic signal. The image processing unit 303 then generates a photoacoustic image by performing reconstruction processing based on the photoacoustic signal acquired in response to that determination, and the display control unit 305 causes the display unit 104 to display the generated photoacoustic image. In this respect, the examination control unit 300 is an example of an irradiation control means that controls the irradiation unit 107 to irradiate the subject with light when it is determined that a photoacoustic image is to be displayed on the display unit 104.
The display control unit 305 controls the display unit 104 and causes it to display information. The display control unit 305 causes the display unit 104 to display information in response to inputs from the examination control unit 300, the image processing unit 303, and the determination unit 304, and to user operation inputs via the operation unit 105. The display control unit 305 is an example of a display control means. The display control unit 305 is also an example of a display control means that causes the display unit 104 to display a photoacoustic image based on the result of the determination unit 304 determining that a photoacoustic image is to be displayed on the display unit 104.
The output unit 306 outputs information from the control device 101 to an external device such as the PACS 113 via the network 110. For example, the output unit 306 outputs the ultrasonic image and the photoacoustic image generated by the image processing unit 303, and their superimposed image, to the PACS 113. The images output from the output unit 306 include incidental information attached by the examination control unit 300 as various tags in accordance with the DICOM standard. The incidental information includes, for example, patient information, information indicating the imaging apparatus that captured the image, an image ID for uniquely identifying the image, and an examination ID for uniquely identifying the examination in which the image was captured. The incidental information also includes information that associates an ultrasonic image and a photoacoustic image captured during a series of probe operations. The information associating the ultrasonic image and the photoacoustic image is, for example, information indicating which of the plurality of frames constituting the ultrasonic image is closest in time to the acquisition of the photoacoustic image. Furthermore, the position information of the probe 102 acquired by the detection unit 103 may be attached as incidental information to each frame of the ultrasonic image and of the photoacoustic image. That is, the output unit 306 outputs information indicating the position of the probe 102 at which the ultrasonic signal for generating the ultrasonic image was acquired, attached to that ultrasonic image, and outputs information indicating the position of the probe 102 at which the photoacoustic signal for generating the photoacoustic image was acquired, attached to that photoacoustic image. The output unit 306 is an example of an output means.
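One way of realizing the association between a photoacoustic image and the ultrasonic frame closest in time might look like the sketch below; the timestamp representation is an assumption, and the DICOM tag into which the resulting frame reference would be written is not specified by this disclosure and is left open.

    from bisect import bisect_left

    def closest_frame_index(us_frame_times: list, pa_time: float) -> int:
        """Return the index of the ultrasonic frame whose acquisition time is
        nearest to the photoacoustic acquisition time.
        `us_frame_times` is assumed to be sorted in ascending order."""
        i = bisect_left(us_frame_times, pa_time)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(us_frame_times)]
        return min(candidates, key=lambda j: abs(us_frame_times[j] - pa_time))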
FIG. 4 shows examples of the ultrasonic image, the photoacoustic image, and the superimposed image displayed on the display unit 104 by the display control unit 305. FIG. 4(a) is an example of an ultrasonic image, namely a tomographic image in which the amplitude of the reflected wave is expressed as luminance, that is, an image generated in B mode. In the following, a case in which a B-mode image is generated as the ultrasonic image is described as an example, but the ultrasonic image acquired by the control device 101 in the first embodiment is not limited to a B-mode image. The acquired ultrasonic image may be generated by any other method, such as A mode, M mode, or Doppler mode, and may be a harmonic image or a tissue elasticity image. The region within the subject whose ultrasonic image is captured by the imaging system 100 is, for example, a circulatory organ region, the breast, the liver, or the pancreas. The imaging system 100 may also capture an ultrasonic image of a subject to which an ultrasound contrast agent using microbubbles has been administered.
FIG. 4(b) is an example of a photoacoustic image, namely a blood vessel image rendered based on the absorption coefficient distribution and the hemoglobin concentration. The photoacoustic image acquired by the control device 101 in the first embodiment may be any image generated from the generated sound pressure (initial sound pressure) of the photoacoustic wave, the light absorption energy density, the light absorption coefficient, information on the concentrations of the substances constituting the subject, or a combination of these. The region within the subject whose photoacoustic image is captured by the imaging system 100 is, for example, a circulatory organ region, the breast, the neck, the abdomen, or the limbs including the fingers and toes. In particular, depending on the light absorption characteristics within the subject, a blood vessel region including new blood vessels or plaque on a blood vessel wall may be targeted for photoacoustic imaging. In the following, a case in which a photoacoustic image is captured while an ultrasonic image is being captured is described as an example, but the region within the subject whose photoacoustic image is captured by the imaging system 100 does not necessarily have to coincide with the region whose ultrasonic image is captured. The imaging system 100 may also capture a photoacoustic image of a subject to which a dye such as methylene blue or indocyanine green, gold microparticles, or a substance obtained by accumulating or chemically modifying these has been administered as a contrast agent.
FIG. 4(c) is a superimposed image in which the ultrasonic image and the photoacoustic image illustrated in FIGS. 4(a) and 4(b) are superimposed. The image processing unit 303 registers the ultrasonic image and the photoacoustic image and generates the superimposed image. The image processing unit 303 may use any method for this registration. For example, the image processing unit 303 performs the registration based on a characteristic region depicted in common in the ultrasonic image and the photoacoustic image. In another example, based on the position information of the probe 102 acquired by the position acquisition unit 302, an ultrasonic image and a photoacoustic image that can be determined to have been rendered from signals of substantially the same region of the subject may be superimposed to generate the superimposed image.
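A simple alpha blend of an already registered photoacoustic image onto the B-mode image is one possible way of forming such a superimposed image; the sketch below assumes the registration and the color mapping of the photoacoustic data have already been performed, and the array shapes are illustrative.

    import numpy as np

    def superimpose(us_img: np.ndarray, pa_img: np.ndarray, alpha: float = 0.5) -> np.ndarray:
        """Blend a registered photoacoustic image (H x W x 3, values 0-1) onto a
        B-mode image (H x W, values 0-1) with the given opacity."""
        us_rgb = np.repeat(us_img[..., np.newaxis], 3, axis=-1)
        mask = pa_img.max(axis=-1, keepdims=True) > 0   # blend only where the PA image has signal
        return np.where(mask, (1 - alpha) * us_rgb + alpha * pa_img, us_rgb)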
FIG. 5 is a diagram illustrating an example of the configuration of the imaging system 100. The imaging system 100 includes a console 501, the probe 102, a magnetic sensor 502, a magnetic transmitter 503, and a gantry 504. The console 501 integrates the control device 101, the display unit 104, and the operation unit 105. The control device according to the first embodiment is the control device 101 or the console 501. The magnetic sensor 502 and the magnetic transmitter 503 are an example of the detection unit 103. The gantry 504 supports the subject.
The magnetic sensor 502 and the magnetic transmitter 503 are devices for acquiring the position information of the probe 102. The magnetic sensor 502 is a magnetic sensor attached to the probe 102. The magnetic transmitter 503 is a device that is placed at an arbitrary position and forms a magnetic field outward around itself. In the first embodiment, the magnetic transmitter 503 is installed in the vicinity of the gantry 504.
The magnetic sensor 502 detects the three-dimensional magnetic field formed by the magnetic transmitter 503. Based on the detected magnetic field information, the magnetic sensor 502 acquires the positions (coordinates) of a plurality of points on the probe 102 in a space whose origin is the magnetic transmitter 503. The position acquisition unit 302 acquires the three-dimensional position information of the probe 102 based on the position (coordinate) information acquired from the magnetic sensor 502. The three-dimensional position information of the probe 102 includes the coordinates of the transmission/reception unit 106, from which the position of the contact surface with the subject is obtained. The three-dimensional position information of the probe 102 also includes information on the tilt (angle) of the probe 102 with respect to the subject. The position acquisition unit 302 then acquires information on the displacement of the probe 102 based on the change of the three-dimensional position information over time.
FIG. 6 is a flowchart illustrating an example of processing in which the control device according to the first embodiment displays a photoacoustic image on the display unit 104 based on the user's operation of the probe 102. In the following, the case is described in which the user acquires at least an ultrasonic signal using the probe 102, operates the probe 102 while an ultrasonic image is displayed on the display unit 104, and further has a photoacoustic image displayed on the display unit 104.
In step S600, the examination control unit 300 acquires preset information related to the display of photoacoustic images. Prior to the examination, the user configures the settings related to the display of photoacoustic images through operation inputs on the console 501. The settings related to the display of photoacoustic images include settings related to the acquisition of the photoacoustic signal and settings related to the display of the photoacoustic image generated from the acquired photoacoustic signal. The settings related to photoacoustic signal acquisition determine in which of the following modes the probe 102 operates: a first acquisition mode in which the ultrasonic signal and the photoacoustic signal are acquired at predetermined timings, a second acquisition mode in which the photoacoustic signal is acquired in response to the user's probe operation while the ultrasonic signal is being acquired, and a third acquisition mode in which only the ultrasonic signal is acquired. The first acquisition mode includes the case in which the ultrasonic signal and the photoacoustic signal are acquired alternately for predetermined periods, and the case in which they are acquired in the manner prescribed by the order information acquired from the ordering system 112. The settings related to the display of the photoacoustic image include a first display mode in which photoacoustic images are displayed successively as soon as they are reconstructed from the photoacoustic signal, and a second display mode in which no photoacoustic image is displayed until imaging is completed, even if reconstruction from the photoacoustic signal has been performed. If the setting related to photoacoustic signal acquisition is the second acquisition mode and the setting related to photoacoustic image display is the first display mode, the processing proceeds to step S601; otherwise, it proceeds to step S603.
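The branching of step S600 could be expressed as follows; this is only a sketch, and the enum names are invented for illustration.

    from enum import Enum, auto

    class AcquisitionMode(Enum):
        FIXED_TIMING = auto()      # first acquisition mode
        PROBE_DRIVEN = auto()      # second acquisition mode
        ULTRASOUND_ONLY = auto()   # third acquisition mode

    class DisplayMode(Enum):
        IMMEDIATE = auto()         # first display mode
        DEFERRED = auto()          # second display mode

    def needs_motion_check(acq: AcquisitionMode, disp: DisplayMode) -> bool:
        """True when the probe-motion judgement of steps S601/S602 is required,
        i.e. the second acquisition mode combined with the first display mode."""
        return acq is AcquisitionMode.PROBE_DRIVEN and disp is DisplayMode.IMMEDIATE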
In step S601, the determination unit 304 determines whether the moving speed of the probe 102 is equal to or less than a predetermined value. Specifically, the position acquisition unit 302 first acquires information indicating the position of the probe 102 from the magnetic sensor 502 and acquires information indicating the speed of movement of the probe 102 based on the change of that position over time. The position acquisition unit 302 transmits the information indicating the moving speed of the probe 102 to the determination unit 304, and the determination unit 304 determines whether the moving speed of the probe 102 is equal to or less than the predetermined value. The determination unit 304 also determines that the probe 102 is moving at a speed smaller than the predetermined value when the probe 102 is held stationary against the subject, that is, when the moving speed is zero. For example, the position acquisition unit 302 stores the position information of the probe 102 acquired by the magnetic sensor 502 for a certain period of time, acquires a velocity vector of the movement of the probe 102, and transmits it to the determination unit 304. If the speed of the probe 102 remains at or below the predetermined value for a predetermined period, the determination unit 304 determines that the position of the probe 102 has not changed substantially. For example, with the predetermined value set to 50 mm/s, the determination unit 304 determines that the moving speed of the probe 102 is equal to or less than the predetermined value when the probe 102 has moved at a speed equal to or less than that value for 3 seconds. If the moving speed of the probe 102 is equal to or less than the predetermined value, the processing proceeds to step S602; if it is greater than the predetermined value, the processing proceeds to step S605.
In step S602, the determination unit 304 determines whether the rotational speed of the probe 102 is equal to or less than a predetermined value. Specifically, in the same manner as in step S601, the position acquisition unit 302 first acquires information indicating the position of the probe 102 from the magnetic sensor 502 and acquires information indicating the rotational speed of the probe 102 based on the change of that position over time. The position acquisition unit 302 transmits the information indicating the rotational speed of the probe 102 to the determination unit 304, and the determination unit 304 determines whether the rotational speed of the probe 102 is equal to or less than the predetermined value. The determination unit 304 also determines that the probe 102 is rotating at a speed smaller than the predetermined value when the probe 102 is held stationary against the subject, that is, when the rotational speed is zero. For example, the position acquisition unit 302 acquires a velocity vector of the movement of the probe 102 in the same manner as in step S601 and transmits it to the determination unit 304. For example, with the predetermined value set to π/6 rad/s, the determination unit 304 determines that the rotational speed of the probe 102 is equal to or less than the predetermined value when the probe 102 has rotated at a speed equal to or less than that value for 3 seconds. If the probe 102 is rotating at a speed smaller than the predetermined value, the processing proceeds to step S604; if it is rotating at a speed greater than the predetermined value, the processing proceeds to step S605.
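The two judgements of steps S601 and S602 can be sketched as follows, using the example thresholds given in the text (50 mm/s, π/6 rad/s, sustained for 3 seconds); the pose representation (a 3-D position plus a single orientation angle per sample) and the sampling assumptions are illustrative only.

    import numpy as np

    def should_display_photoacoustic(positions_mm: np.ndarray,
                                     angles_rad: np.ndarray,
                                     timestamps_s: np.ndarray,
                                     v_max: float = 50.0,
                                     w_max: float = np.pi / 6,
                                     hold_s: float = 3.0) -> bool:
        """Judge steps S601/S602 over the last `hold_s` seconds of probe pose samples:
        True when both the translation speed and the rotation speed stayed at or
        below their thresholds for the whole window."""
        recent = timestamps_s >= timestamps_s[-1] - hold_s
        if np.count_nonzero(recent) < 2:
            return False                          # not enough samples in the window to judge
        dt = np.diff(timestamps_s[recent])
        speed = np.linalg.norm(np.diff(positions_mm[recent], axis=0), axis=1) / dt
        omega = np.abs(np.diff(angles_rad[recent])) / dt
        return bool(np.all(speed <= v_max) and np.all(omega <= w_max))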
In step S603, the processing branches based on the preset information acquired by the examination control unit 300 in step S600. If the setting related to photoacoustic signal acquisition is the first acquisition mode and the setting related to photoacoustic image display is the first display mode, the processing proceeds to step S604; otherwise, it proceeds to step S605.
In step S604, the display control unit 305 causes the display unit 104 to display a photoacoustic image. Specifically, the image processing unit 303 reconstructs a photoacoustic image from a photoacoustic signal acquired as appropriate based on the information on the displacement of the probe 102, or from a photoacoustic signal acquired at a predetermined timing, and the display control unit 305 causes the display unit 104 to display the photoacoustic image. In the first embodiment, the image processing unit 303 generates a superimposed image in which the photoacoustic image is superimposed on an ultrasonic image generated from an ultrasonic signal acquired at a time close to the time at which the photoacoustic signal was acquired, and the display control unit 305 causes the display unit 104 to display that superimposed image. That is, the display control unit 305 causes the display unit 104 to display a photoacoustic image generated from the photoacoustic signal based on the information on the displacement of the probe 102.
When the first acquisition mode is set, the user acquires an ultrasonic signal with the probe 102 and operates the probe 102 while observing the ultrasonic image displayed on the display unit 104. When the moving speed or the rotational speed of the probe 102 is smaller than the predetermined value, it can be assumed that the user is trying to observe a specific region in the subject in more detail. In the first embodiment, the photoacoustic image is displayed on the display unit 104 based on such a change in the user's operation of the probe 102. This makes it possible to display the photoacoustic image on the display unit 104 at an appropriate timing without interfering with the user's observation of the ultrasonic image while searching for the region to be observed in detail. The display control unit 305 may further increase the transparency of the photoacoustic image in the superimposed image as the moving speed of the probe 102 increases, and may stop displaying the photoacoustic image when the moving speed of the probe 102 exceeds a predetermined value. That is, the display control unit 305 changes the manner in which the photoacoustic image is displayed on the display unit 104 according to the degree of displacement of the probe 102.
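The speed-dependent transparency mentioned above could be realized by mapping the probe speed to an opacity value, for example as sketched below; the linear fade and the 50 mm/s limit are assumptions reused from the earlier example threshold.

    def photoacoustic_opacity(speed_mm_s: float, v_max: float = 50.0) -> float:
        """Opacity of the photoacoustic layer in the superimposed image: fully opaque
        when the probe is stationary, fading linearly to fully transparent at v_max."""
        if speed_mm_s >= v_max:
            return 0.0        # above the threshold: the photoacoustic layer is not shown
        return 1.0 - speed_mm_s / v_max

The returned value could be passed, for instance, as the blending weight of the photoacoustic layer when the superimposed image is composed.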
In step S605, the display control unit 305 does not display a photoacoustic image on the display unit 104. The image processing unit 303 generates an ultrasonic image based on the ultrasonic signal acquired by the probe 102, and the display control unit 305 causes the display unit 104 to display that ultrasonic image.
This concludes the processing shown in FIG. 6. In the example of FIG. 6, the photoacoustic image is displayed on the display unit 104 in accordance with the operation of the probe 102 or the presets, but the present invention is not limited to the display of the photoacoustic image. For example, at the same time as the photoacoustic image is displayed on the display unit 104 in accordance with the operation of the probe 102, the superimposed image or the photoacoustic image generated by the image processing unit 303 may be saved. The saving is not limited to saving in the memory of the control device 101; for example, the image may be saved in an external device such as the PACS 113 by outputting it to that device via the output unit 306. When the processing of steps S600 to S603 determines that the photoacoustic image is not to be displayed, it is assumed that the user is operating the probe 102 while searching for the region to be observed in detail, and a moving image of such a search does not necessarily need to be saved. Therefore, by saving the superimposed image only when it is determined that the photoacoustic image is to be displayed, the images that are the object of the user's detailed observation can be saved selectively, and the capacity of the memory or the external device can be used efficiently.
Steps S601 and S602 may be processed simultaneously or in parallel. That is, based on the information indicating the position of the probe 102 acquired from the magnetic sensor 502, the position acquisition unit 302 may transmit the information on the moving speed and the rotational speed of the probe 102 to the determination unit 304 simultaneously or in parallel. The determination unit 304 then determines whether the moving speed of the probe 102 is equal to or less than the predetermined value and the rotational speed is equal to or less than the predetermined value. If both conditions are satisfied, the processing proceeds to step S604; if at least one of the moving speed and the rotational speed of the probe 102 exceeds the predetermined value, the processing proceeds to step S605. In another example, only one of steps S601 and S602 may be processed; that is, the determination unit 304 may determine whether to display the photoacoustic image based on only one of the moving speed and the rotational speed.
[Modification of the First Embodiment]
In the first embodiment, information for guiding the probe 102 to a region of the subject where a photoacoustic signal should be acquired may additionally be displayed on the display unit 104. The guiding information is, for example, information for guiding the position of the probe 102 and its tilt with respect to the subject to a target state. Specifically, first, in the second acquisition mode, the position acquisition unit 302 acquires the position information of the probe 102 based on the position information from the detection unit 103.
The determination unit 304 stores the position information of the probe 102 at the time it determined, during operation of the probe 102, that a photoacoustic image was to be displayed on the display unit 104. In the following, the position of the probe 102 at which the photoacoustic image was last displayed is referred to as the target position. The determination unit 304 acquires the position information of the probe 102 from the position acquisition unit 302, as described above in connection with the processing of steps S602 and S603. Based on the target position and the current position of the probe 102, the determination unit 304 generates guide information for guiding the probe 102 to the target position. The guide information includes information indicating the movement direction, movement amount, tilt angle, rotation direction, and rotation amount for moving the probe 102 to the target position. In this respect, the determination unit 304 is an example of a guide means that generates guide information for guiding the probe 102 to a specific position.
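Guide information of the kind described here might be derived from the current and target poses as in the sketch below; the poses are reduced to a 3-D position plus a single rotation angle about the probe axis purely for illustration.

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class GuideInfo:
        direction: np.ndarray   # unit vector from the current toward the target position
        distance_mm: float      # amount of translation still required
        rotation_rad: float     # signed rotation still required about the probe axis

    def make_guide(current_pos, current_angle, target_pos, target_angle) -> GuideInfo:
        """Compute how the probe should be moved to reproduce the stored target pose."""
        delta = np.asarray(target_pos, float) - np.asarray(current_pos, float)
        distance = float(np.linalg.norm(delta))
        direction = delta / distance if distance > 0 else np.zeros(3)
        # wrap the angular difference into [-pi, pi)
        rotation = (target_angle - current_angle + np.pi) % (2 * np.pi) - np.pi
        return GuideInfo(direction, distance, rotation)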
For example, the determination unit 304 generates the guide information when the probe 102 has been operated in the vicinity of the target position for a predetermined time or longer but the operation is not such that it would be determined that a photoacoustic image is to be displayed. This makes it possible to easily reproduce the photoacoustic image and the ultrasonic image of a region that the user examined in detail during the observation.
The guide information generated by the determination unit 304 is displayed on the display unit 104 by the display control unit 305. Specifically, the display control unit 305 causes the display unit 104 to display a guide image that serves as an objective indicator of the movement direction, movement amount, tilt angle, rotation direction, and rotation amount for moving the probe 102 to the target position. The guide image may be anything that serves as an objective indicator of this guide information. For example, the guide image is an arrow whose size corresponds to the amount of movement or rotation and whose orientation corresponds to the direction of movement, rotation, or tilt. In another example, the guide image is a figure whose size corresponds to the amount of movement or rotation and whose shape deforms according to the direction of movement, rotation, or tilt. The guide image is displayed on the display unit 104 in a manner that does not hinder observation of the region that would be depicted in the ultrasonic image or photoacoustic image when the probe 102 is moved to the target position (hereinafter referred to as the target region). For example, the guide image is displayed in an area in which the ultrasonic image, the photoacoustic image, and the superimposed image are not displayed. In another example, while the probe 102 is being guided toward the target position, the guide image may be displayed at a position overlapping an area near the target region, and when the target region is depicted, the guide image may be deformed into a shape that is no longer visible.
 In yet another example, the guide information generated by the determination unit 304 may be conveyed to the user by emitting a sound whose interval becomes shorter as the probe 102 approaches the target position.
 Note that the determination unit 304 may instead determine that guide information is to be generated and notify the position acquisition unit 302, which then generates the guide information. Alternatively, a module separate from the position acquisition unit 302 and the determination unit 304 may be provided to generate the guide information.
 In the example described above, the position of the probe 102 at which the region the user examined in detail during observation can be rendered is stored for generating the guide information, but the present invention is not limited to this. For example, the position of the probe 102 at which a region designated on an ultrasonic image acquired while operating the probe 102, a previously observed ultrasonic image, a photoacoustic image, or another medical image can be rendered may be stored as the position for which guide information is to be generated. In the example described above, the position of the probe 102 for which guide information is to be generated is stored automatically at the time of the determination to display the photoacoustic image, but the present invention is not limited to this, and the user may designate the position by an operation input on the console 501.
 Also, in the example described above, guide information is generated for reproducing the image of the region that the user examined in detail during observation, but the present invention is not limited to this. For example, consider a case in which a three-dimensional photoacoustic image of a specific region is acquired in accordance with an examination order or a user operation input. When photoacoustic signals are acquired while the user operates the probe 102, sufficient signals must be acquired to generate the three-dimensional photoacoustic image. Based on the photoacoustic signals transmitted from the signal acquisition unit 301 and the position information of the probe 102 transmitted from the position acquisition unit 302, the image processing unit 303 generates information on the signals that are still lacking for generating the three-dimensional photoacoustic image. The position acquisition unit 302 generates guide information for guiding the probe 102 to a position at which the lacking signals can be acquired, and causes the display unit 104 to display the guide information via the display control unit 305. In this way, the three-dimensional photoacoustic image can be generated efficiently.
 In the first embodiment, the magnetic sensor 502 and the magnetic transmitter 503 were described as an example of the detection unit 103, but the present invention is not limited to this.
 FIG. 7 is a diagram illustrating an example of the configuration of the imaging system 100. The imaging system 100 includes the console 501, the probe 102, the gantry 504, and a motion sensor 700. The motion sensor 700 is an example of the detection unit 103 that tracks the position information of the probe 102. The motion sensor 700 is attached to or embedded in a portion of the probe 102 other than the transmission/reception unit 106 and the light source (not shown). The motion sensor 700 is composed of, for example, micro electro mechanical systems (MEMS) and provides nine-axis motion sensing with a three-axis accelerometer, a three-axis gyroscope, and a three-axis magnetic compass. The position acquisition unit 302 acquires information on the displacement of the probe 102 sensed by the motion sensor 700.
 [Second Embodiment]
 In the second embodiment, an example is described in which a photoacoustic image is displayed on the display unit 104 in accordance with the pressure with which the probe 102 is pressed against the subject. Only the parts that differ from the first embodiment are described; for the parts in common with the first embodiment, the description above is incorporated by reference and omitted here. The control devices according to the second embodiment are the control device 101 and the console 501.
 FIG. 8 is a diagram illustrating an example of the configuration of the imaging system 100. The imaging system 100 includes the console 501, the probe 102, the gantry 504, the transmission/reception unit 106, and a pressure sensor 801.
 The pressure sensor 801 is an example of the detection unit 103. As information on the manner of displacement of the probe 102, the pressure sensor 801 acquires information indicating the degree to which the user presses the probe 102 against the subject. The transmission/reception unit 106 is provided inside the probe 102 as a semi-fixed floating structure. The pressure sensor 801 is provided on the side opposite to the surface of the transmission/reception unit 106 that contacts the subject, and measures the pressure applied to the transmission/reception unit 106. The pressure sensor 801 may instead be a diaphragm-type pressure sensor provided on the surface of the probe 102 that contacts the subject. The position acquisition unit 302 acquires information on the pressure measured by the pressure sensor 801.
 FIG. 9 is a flowchart illustrating an example of processing in which the control device according to the second embodiment displays a photoacoustic image on the display unit 104 based on the user's operation of the probe 102. The following description assumes that the user acquires at least ultrasonic signals with the probe 102, operates the probe 102 while an ultrasonic image is displayed on the display unit 104, and further causes a photoacoustic image to be displayed on the display unit 104. The processing in steps S600, S603, S604, and S605 is the same as in the first embodiment described with reference to FIG. 6.
 ステップS900において、判定部304はユーザがプローブ102を一定の圧力で被検体を押圧しているか否かを判定する。具体的には、位置取得部302は圧力センサ801から取得した情報を判定部304に送信する。判定部304は、送受信部106が受ける圧力が所定の時間以上、当該圧力が所定の範囲内に含まれている場合に、ユーザがプローブ102を一定の圧力で被検体に押圧していると判定する。ユーザがプローブ102を一定の圧力で被検体に押圧している場合には、ステップS604に進む。ユーザがプローブ102を一定の圧力で押圧している場合には、被検体の特定の領域を観察していると想定される。これにより、ユーザが被検体の特定の領域を観察したい場合に光音響画像を表示部104に表示させることができる。ユーザがプローブ102を一定の圧力で押圧していない場合には、ステップS605に進み、光音響画像は表示されない。 In step S900, the determination unit 304 determines whether or not the user is pressing the subject with the probe 102 at a constant pressure. Specifically, the position acquisition unit 302 transmits information acquired from the pressure sensor 801 to the determination unit 304. The determination unit 304 determines that the user is pressing the probe 102 against the subject with a constant pressure when the pressure received by the transmission / reception unit 106 is within a predetermined range for a predetermined time or more. To do. If the user is pressing the probe 102 against the subject with a constant pressure, the process proceeds to step S604. When the user presses the probe 102 with a constant pressure, it is assumed that a specific region of the subject is being observed. Thereby, when the user wants to observe a specific region of the subject, the photoacoustic image can be displayed on the display unit 104. If the user does not press the probe 102 with a constant pressure, the process proceeds to step S605, and the photoacoustic image is not displayed.
 In step S604, the image processing unit 303 generates, for example, a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image and causes the display unit 104 to display it. In the second embodiment, the image processing unit 303 may further acquire the pressure information from the position acquisition unit 302 and display the photoacoustic image on the display unit 104 based on that information. The longer the user presses the probe 102 with a constant pressure, the more strongly the user is assumed to be focusing on the region rendered at that time. Therefore, the longer the pressure value of the pressure sensor 801 remains constant, the lower the image processing unit 303 sets the transparency of the photoacoustic image in the superimposed image. In other words, the display control unit 305 varies the manner in which the photoacoustic image is displayed on the display unit 104 according to the degree of displacement of the probe 102. This allows the user to observe the photoacoustic image in accordance with the degree of attention.
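 The relation between how long the pressure has stayed constant and the transparency of the superimposed photoacoustic image could, for instance, be a clamped linear mapping; the sketch below is illustrative only, and its numeric values are assumptions.

def photoacoustic_alpha(constant_pressure_time_s: float,
                        min_alpha: float = 0.3,
                        max_alpha: float = 1.0,
                        full_attention_time_s: float = 5.0) -> float:
    """Map the duration of constant pressure to the opacity of the photoacoustic
    overlay: the longer the pressure stays constant, the lower the transparency
    (i.e. the higher the opacity) of the superimposed photoacoustic image."""
    ratio = min(constant_pressure_time_s / full_attention_time_s, 1.0)
    return min_alpha + (max_alpha - min_alpha) * ratio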
 Although the second embodiment has described an example in which the photoacoustic image is displayed on the display unit 104 based on the pressure with which the probe 102 is pressed against the subject, the present invention is not limited to this. The probe 102 may include the magnetic sensor 502 or the motion sensor 700. The determination unit 304 may determine whether to display the photoacoustic image based not only on the pressure with which the probe 102 is pressed against the subject but also on information such as the position of the probe 102 and its angle with respect to the subject. That is, the display control unit 305 may cause the display unit 104 to display the photoacoustic image when the position acquisition unit 302 acquires at least one of information indicating that the probe 102 is moving relative to the subject at a speed lower than a predetermined speed and information indicating that the probe 102 is being pressed against the subject with a constant pressure.
 [Third Embodiment]
 In the third embodiment, an example is described in which a photoacoustic image is displayed on the display unit 104 in accordance with the characteristics of the probe 102 that the user is using to observe the subject and the purpose of the examination. Only the parts that differ from the first embodiment are described; for the parts in common with the first embodiment, the description above is incorporated by reference and omitted here. The control devices according to the third embodiment are the control device 101 and the console 501.
 FIG. 10 is a flowchart illustrating an example of processing in which the control device according to the third embodiment displays a photoacoustic image in accordance with the characteristics of the probe 102 and the purpose of the examination. The following description assumes that the user acquires at least ultrasonic signals with the probe 102, operates the probe 102 while an ultrasonic image is displayed on the display unit 104, and further causes a photoacoustic image to be displayed on the display unit 104. A plurality of probes may be connected to the console 501, and the user selects the probe to use according to the purpose of the examination, such as the region of the subject to be observed. The processing in steps S604 and S605 is the same as in the first embodiment described with reference to FIG. 6.
 In step S1000, the determination unit 304 determines whether the ultrasonic image can be complemented with the photoacoustic image. Specifically, the examination control unit 300 acquires the imaging conditions of the ultrasonic image and the photoacoustic image and transmits them to the determination unit 304. The position acquisition unit 302 acquires information on the probe 102 that the user is using for observation and transmits it to the determination unit 304. The information on the probe 102 includes the arrangement of the transducers (not shown) of the probe 102, the initial settings applied when the probe is connected to the console 501, information on the scan method, and information indicating whether the irradiation unit 107 is present. If the determination unit 304 determines that the ultrasonic image can be complemented with the photoacoustic image, the processing proceeds to step S604. If the determination unit 304 determines that the ultrasonic image cannot be complemented with the photoacoustic image, the processing proceeds to step S605.
 The characteristics of the acquired ultrasonic image differ depending on the imaging conditions, such as the transducer arrangement, the scan method, and the signal acquisition settings. For example, the convex electronic scan method provides a wide-field ultrasonic image at deep regions of the subject and is mainly used for observing the abdominal region. The sector electronic scan method provides a wide-field ultrasonic image from a narrow contact area and is mainly used for observing the cardiovascular region. Furthermore, acquiring ultrasonic signals with high-frequency ultrasonic waves yields an ultrasonic image with fine resolution, but because the penetration of such signals is weak, the region of the subject rendered in the ultrasonic image becomes shallow. Since the characteristics of the rendered ultrasonic image thus vary with the imaging conditions, the determination unit 304 determines whether to display the photoacoustic image on the display unit 104 according to those characteristics. For example, when the depth of the subject rendered in the photoacoustic image is greater than the depth of the subject rendered in the ultrasonic image, the determination unit 304 determines that the ultrasonic image can be complemented with the photoacoustic image.
 Alternatively, when ultrasonic signals are acquired with medium-frequency ultrasonic waves to prioritize imaging depth, the resolution of the rendered ultrasonic image may be insufficient for detailed observation. In such a case, observing the photoacoustic image together is expected to be effective for compensating for the lack of resolution. For example, when the resolution of the photoacoustic image is higher than the resolution of the ultrasonic image, the determination unit 304 determines that the ultrasonic image can be complemented with the photoacoustic image.
 If the probe 102 does not have the irradiation unit 107 and supports only the acquisition of ultrasonic signals, photoacoustic signals cannot be acquired. In that case, the determination unit 304 determines that the ultrasonic image cannot be complemented with the photoacoustic image.
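 The determination of step S1000 can be summarized as: the probe must have an irradiation unit, and the photoacoustic image must add either depth or resolution beyond what the ultrasonic image provides. The sketch below only illustrates that summary; the field and function names are hypothetical.

from dataclasses import dataclass

@dataclass
class ImagingCharacteristics:
    depth_mm: float        # depth of the subject rendered in the image
    resolution_mm: float   # smaller value = finer resolution

def can_complement(us: ImagingCharacteristics,
                   pa: ImagingCharacteristics,
                   probe_has_irradiation_unit: bool) -> bool:
    """Step S1000: decide whether the photoacoustic image can complement
    the ultrasonic image for the probe currently in use."""
    if not probe_has_irradiation_unit:
        return False                                  # photoacoustic signals cannot be acquired
    reaches_deeper = pa.depth_mm > us.depth_mm        # photoacoustic image renders deeper regions
    resolves_finer = pa.resolution_mm < us.resolution_mm  # photoacoustic image resolves finer detail
    return reaches_deeper or resolves_finer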
 In other words, the determination unit 304 determines whether to display the photoacoustic image on the display unit 104 based on the characteristics of the probe 102 used for observation. Both the characteristics of the rendered ultrasonic image and those of the photoacoustic image depend on the characteristics of the probe 102. The determination unit 304 therefore makes this determination based on characteristics of the probe 102 that affect the ultrasonic image and the photoacoustic image, such as the imaging conditions and the configuration of the probe 102. The position acquisition unit 302, which acquires information on the characteristics of the probe 102, is also an example of a third acquisition unit that acquires information on the characteristics of the ultrasonic image rendered based on the ultrasonic signals acquired by the probe 102.
 In the example described above, the ultrasonic image is complemented with the photoacoustic image according to the depth and resolution of the subject rendered in the images. The criteria for determining whether to complement the ultrasonic image with the photoacoustic image may be set by the user as appropriate by specifying parameters such as depth and resolution.
 In the example described above, whether to display the photoacoustic image on the display unit 104 is determined, but the present invention is not limited to this. A superimposed image in which the photoacoustic image is superimposed on only a part of the region of the subject displayed on the display unit 104 may be displayed. Since the photoacoustic image is not superimposed in regions where the ultrasonic image renders the structure of the subject in detail, observation of the ultrasonic image is not hindered. In regions where the ultrasonic image does not render the structure of the subject in detail, superimposing the photoacoustic image can assist the user in observing those regions. The transparency of the superimposed photoacoustic image may also be varied according to the depth and degree of resolution described above.
 In the example described above, whether to display the photoacoustic image is determined based on parameters of the ultrasonic image. Whether to display a photoacoustic image may instead be set in advance for each of the plurality of probes connected to the console 501.
 The probe 102 according to the third embodiment may include the magnetic sensor 502 or the motion sensor 700. The determination unit 304 may determine whether to display the photoacoustic image based not only on the parameters of the ultrasonic image but also on information such as the pressure with which the probe 102 is pressed against the subject, the position of the probe 102, and its angle with respect to the subject.
 In addition, when a probe 102 that is unsuitable for the examination order acquired from the ordering system 112 is in use, the user may be notified that the probe in use is inappropriate. For example, the notification is made by displaying a message or an image indicating the inappropriateness on the display unit 104. Alternatively, acquisition of photoacoustic signals may be disabled and the user notified of the disablement. An inappropriate case is, for example, one in which a probe that does not have the irradiation unit 107 for acquiring photoacoustic signals is used even though acquisition of a photoacoustic image is requested in the examination order.
 [Fourth Embodiment]
 In the first to third embodiments, the photoacoustic image generated by the image processing unit 303 is displayed on the display unit 104, but the present invention is not limited to this. For example, as described above, when the determination unit 304 determines that the photoacoustic image is to be displayed on the display unit 104, the examination control unit 300 may control the irradiation unit 107 to acquire photoacoustic signals. The photoacoustic image reconstructed based on the photoacoustic signals acquired in response to that determination may then be displayed on the display unit 104.
 FIG. 11 is a flowchart illustrating an example of the processing from the control of the irradiation unit 107 based on the determination of the determination unit 304 to the acquisition of a photoacoustic image and its display on the display unit 104.
 In step S1100, the determination unit 304 determines whether to display the photoacoustic image on the display unit 104. Step S1100 corresponds, for example, to the processing of steps S600 to S603 in the first embodiment, the processing of steps S600, S603, and S900 in the second embodiment, and step S1000 in the third embodiment. If it is determined that the photoacoustic image is to be displayed, the processing proceeds to step S1101; if it is determined that the photoacoustic image is not to be displayed, the processing proceeds to step S1102.
 In step S1101, the examination control unit 300 controls the irradiation unit 107 to irradiate the subject with light. The signal acquisition unit 301 acquires photoacoustic signals from the probe 102. The image processing unit 303 reconstructs a photoacoustic image from the photoacoustic signals. The display control unit 305 displays the photoacoustic image on the display unit 104. Step S1101 corresponds, for example, to step S604 in the first to third embodiments.
 In step S1102, the position acquisition unit 302 acquires information on the state of the probe 102. If information indicating that photoacoustic signals are being acquired is obtained, the processing proceeds to step S1103. If information indicating that photoacoustic signals are not being acquired is obtained, the processing proceeds to step S1104.
 In step S1103, the examination control unit 300 controls the irradiation unit 107 to stop the light irradiation of the subject. The processing of steps S1102 and S1103 corresponds, for example, to step S605 in the first to third embodiments.
 In step S1104, the examination control unit 300 determines whether to end the examination for capturing the ultrasonic image and the photoacoustic image. For example, the user can instruct the end of the examination by an operation input to the console 501. Alternatively, the examination control unit 300 may acquire the position information of the probe 102 from the position acquisition unit 302 and determine that the examination is to be ended when, for example, the probe 102 has remained out of contact with the subject for a certain time or longer. When it is determined that the examination is to be ended based on the position information, it is preferable to display, on the display unit 104 via the display control unit 305, a screen asking the user to confirm whether to end the examination. If there is no instruction to end the examination, the processing returns to step S1100; if there is an instruction to end the examination, the processing shown in FIG. 11 ends.
 This makes it possible to irradiate the subject with light only when the photoacoustic image needs to be displayed, improving the safety of the user and the subject.
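 Viewed as pseudocode, the flow of FIG. 11 is a loop that fires the light source only while display of the photoacoustic image is required. The sketch below abstracts the units of FIG. 11 behind hypothetical callables and is not the actual implementation.

def fig11_loop(should_display_pa, irradiate_and_show_pa, is_irradiating,
               stop_irradiation, examination_finished):
    """Control loop corresponding to steps S1100 to S1104. All five arguments
    are hypothetical callables standing in for the determination unit 304,
    the examination control unit 300, and the other units described above."""
    while True:
        if should_display_pa():           # step S1100
            irradiate_and_show_pa()       # step S1101: irradiate, acquire, reconstruct, display
        else:
            if is_irradiating():          # step S1102
                stop_irradiation()        # step S1103
            elif examination_finished():  # step S1104
                break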
 The irradiation unit 107 is controlled by, for example, the signal acquisition unit 301. The signal acquisition unit 301 preferably controls each component of the irradiation unit 107 so that light irradiation is performed and photoacoustic signals are acquired during a period in which the influence of body motion due to respiration or pulsation can be regarded as small. For example, the signal acquisition unit 301 may control the irradiation unit 107 to perform light irradiation within 250 ms after it is determined in step S1100 that the photoacoustic image is to be displayed. The time from that determination to the light irradiation may be a predetermined value or may be specified by the user via the operation unit 105.
 [Modification 1]
 In the first to fourth embodiments, the determination unit 304 determines whether to display the photoacoustic image on the display unit 104. The processing for displaying the photoacoustic image on the display unit 104 based on the determination of the determination unit 304 is not limited to the examples described above. The control device 101 may continuously acquire ultrasonic signals and photoacoustic signals and generate the photoacoustic image when it is determined that the photoacoustic image is to be displayed. The control device 101 may also acquire photoacoustic signals when it is determined that the photoacoustic image is to be displayed. The manner of displaying the photoacoustic image on the display unit 104 is likewise not limited to the examples described above. While the ultrasonic image is displayed on the display unit 104, the display may be switched to the photoacoustic image, the ultrasonic image and the photoacoustic image may be displayed side by side, or a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image may be displayed.
 In the first to fourth embodiments, the determination unit 304 makes its determination based on information on the displacement of the probe 102, that is, information indicating how the user has operated the probe 102. The determination by the determination unit 304 is not limited to this. For example, the control device 101 may include a sound-collecting microphone and accept instructions by the user's voice. To interpret the user's voice instructions, the control device 101 may store and execute a speech recognition program.
 In addition to the first to fourth embodiments, whether to display the photoacoustic image may further be determined based on whether a certain time has elapsed since a parameter of the probe 102 was adjusted. It is assumed that the user adjusts parameters such as the sensitivity, focus, and depth of the probe 102 by operation inputs to the console 501 or the probe 102. In this case, the determination unit 304 determines that the photoacoustic image is not to be displayed on the display unit 104 until a certain time has elapsed after the parameter adjustment. In this way, the photoacoustic image is displayed when the user intends to continue observation with the changed parameters, and is not displayed while the parameters may still be changed. The user can easily adjust the parameters while observing the ultrasonic image, which improves the workflow.
 In the first to fourth embodiments, the user may also be notified that light irradiation is being performed by the probe 102. For example, a notification image indicating that light irradiation is being performed by the probe 102 is displayed on the display unit 104. When the notification image is displayed on the display unit 104, it is preferably displayed near the image of the subject that the user is observing. In another example, the probe 102 may be provided with an LED light that is lit during light irradiation. In yet another example, the control device 101 may emit a notification sound during light irradiation. In this respect, the display control unit 305 that displays the notification image on the display unit 104, the LED light provided on the probe 102, and a sound generation unit that emits the notification sound are examples of notification means for indicating that light irradiation is being performed to acquire photoacoustic signals. In this way, even when there is an interval between controlling the probe 102 to acquire photoacoustic signals and the display of the photoacoustic image on the display unit 104, the user can know that light is being emitted from the probe 102, improving the safety of the user and the subject.
 [Modification 2]
 In the embodiments above, examples in which the photoacoustic image is superimposed on the ultrasonic image have been described. This modification describes methods for hiding the photoacoustic image superimposed on the ultrasonic image.
 FIG. 12 is a flowchart illustrating an example of processing for canceling the superimposed display of the photoacoustic image on the ultrasonic image. First, an example of a method for hiding the photoacoustic image superimposed on the ultrasonic image is described with reference to FIG. 12(a).
 Step 1200 is processing executed after the photoacoustic image has been displayed on the ultrasonic image. That is, this processing can be combined with any of the embodiments described above.
 In step 1200, the determination unit 304 determines whether the moving speed of the probe 102 is greater than a predetermined value. Specifically, the position acquisition unit 302 first acquires information indicating the position of the probe 102 from the magnetic sensor 502 and derives information indicating the moving speed of the probe 102 from the change of that position over time. The position acquisition unit 302 transmits the information indicating the moving speed of the probe 102 to the determination unit 304.
 The determination unit 304 receives the information indicating the moving speed of the probe 102 transmitted from the position acquisition unit 302 and, based on that information, determines whether the moving speed of the probe 102 is greater than the predetermined value. Here, the predetermined value is, for example, the same value as that used in step 601. If the determination unit 304 determines that the moving speed of the probe 102 is greater than the predetermined value, the processing proceeds to step 1201. If the determination unit 304 determines that the moving speed of the probe 102 is equal to or less than the predetermined value, the processing returns to step 1200.
 The determination unit 304 may instead determine that the moving speed of the probe 102 is greater than the predetermined value only when the moving speed has remained greater than the predetermined value for a predetermined period.
 In step 1201, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. That is, the display control unit 305 causes the display unit 104 to display, in real time, an ultrasonic image without a superimposed photoacoustic image.
 According to the example of the processing shown in FIG. 12(a), when the user wants to examine in detail an ultrasonic image without a superimposed photoacoustic image, the user can display the desired ultrasonic image on the display unit with a simple operation of the probe 102.
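 As a minimal sketch of FIG. 12(a), the check of step 1200 and the switch of step 1201 could be written as follows; the threshold value and names are assumptions made for illustration.

SPEED_THRESHOLD_MM_S = 20.0   # assumed value corresponding to the one used in step 601

def update_overlay_fig12a(probe_speed_mm_s: float, overlay_visible: bool) -> bool:
    """Steps 1200/1201: hide the superimposed photoacoustic image as soon as
    the probe moves faster than the predetermined value; otherwise keep the state."""
    if probe_speed_mm_s > SPEED_THRESHOLD_MM_S:
        return False              # display the plain ultrasonic image in real time
    return overlay_visible        # no change while the probe moves slowly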
 In the above example, the superimposed display of the photoacoustic image is canceled using the moving speed of the probe 102, but the display control unit 305 may cancel the superimposed display of the photoacoustic image using other information. For example, the rotation speed of the probe 102 may be used instead of the moving speed of the probe 102; when the rotation speed of the probe 102 is greater than a predetermined value, the display control unit 305 may cancel the superimposed display of the photoacoustic image. The predetermined value compared with the rotation speed of the probe 102 is, for example, the same value as that used in step 602.
 The display control unit 305 may also cancel the superimposed display of the photoacoustic image when both the moving speed of the probe 102 and the rotation speed of the probe 102 are greater than the respective predetermined values with which they are compared.
 Furthermore, the acceleration of the probe 102 may be used instead of the moving speed of the probe 102. For example, when the acceleration of the probe 102 is greater than a predetermined value, the display control unit 305 may cancel the superimposed display of the photoacoustic image.
 Next, an example of a method for switching between displaying and hiding the photoacoustic image superimposed on the ultrasonic image is described with reference to FIG. 12(b).
 Step 1210 is processing executed after the photoacoustic image has been displayed on the ultrasonic image. That is, this processing can be combined with any of the embodiments described above.
 In step 1210, the determination unit 304 determines whether the moving speed of the probe 102 is within a predetermined range. The determination unit 304 receives the information indicating the moving speed of the probe 102 transmitted from the position acquisition unit 302 and, based on that information, determines whether the moving speed of the probe 102 is within the predetermined range. Here, the predetermined range is, for example, a range greater than the predetermined value used in step 601 and smaller than another predetermined value. If the determination unit 304 determines that the moving speed of the probe 102 is within the predetermined range, the processing proceeds to step 1211. If the determination unit 304 determines that the moving speed of the probe 102 is outside the predetermined range, the processing proceeds to step 1212.
 The determination unit 304 may instead determine that the moving speed of the probe 102 is within the predetermined range only when the moving speed has remained within the predetermined range for a predetermined period.
 In step 1211, the display control unit 305 changes the superimposed state of the photoacoustic image. For example, if the photoacoustic image was superimposed on the ultrasonic image before step 1211, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. If the photoacoustic image was not superimposed on the ultrasonic image before step 1211, the display control unit 305 causes the display unit 104 to display, in place of the ultrasonic image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is superimposed. That is, in step 1211, the superimposed state of the photoacoustic image is toggled. To prevent the superimposed state from being changed too frequently, the determination unit 304 may refrain from performing the determination of step 1210 for a predetermined period after the superimposed state has been changed in step 1211. The same applies to the other examples described later.
 In step 1212, the determination unit 304 determines whether the moving speed of the probe 102 is equal to or greater than the other predetermined value (threshold) that is the upper limit of the predetermined range. If the determination unit 304 determines that the moving speed of the probe 102 is equal to or greater than the threshold, the processing proceeds to step 1213. If the determination unit 304 determines that the moving speed of the probe 102 is less than the threshold (which, since the speed is outside the predetermined range, means it is equal to or less than the predetermined value used in step 601), the processing returns to step 1210. That is, according to the example of the processing shown in FIG. 12(b), once the superimposed state of the photoacoustic image has been changed, the display state is maintained even if the probe is then held still.
 In step 1213, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. That is, the display control unit 305 causes the display unit 104 to display, in real time, an ultrasonic image without a superimposed photoacoustic image. If the photoacoustic image was not superimposed on the ultrasonic image before step 1213, the display control unit 305 continues to display on the display unit 104 an ultrasonic image without a superimposed photoacoustic image.
 According to the example of the processing shown in FIG. 12(b), whether to superimpose the photoacoustic image on the ultrasonic image can be switched by a simple operation of the probe 102. The user can therefore examine in detail an ultrasonic image without a superimposed photoacoustic image by a simple operation of the probe 102, and can also superimpose the photoacoustic image on the ultrasonic image again by a simple operation of the probe 102.
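 A compact way to read FIG. 12(b) is as a small state machine over the probe speed: a speed inside the intermediate range toggles the overlay, a speed at or above the upper limit forces it off, and any other speed keeps the current state. The sketch below, with assumed range boundaries, illustrates that reading only.

RANGE_LOW_MM_S = 20.0    # assumed lower bound (the value used in step 601)
RANGE_HIGH_MM_S = 60.0   # assumed upper bound (the "other predetermined value")

def update_overlay_fig12b(probe_speed_mm_s: float, overlay_visible: bool) -> bool:
    """Steps 1210 to 1213: toggle or cancel the superimposed photoacoustic image
    according to which speed band the probe is currently in."""
    if RANGE_LOW_MM_S < probe_speed_mm_s < RANGE_HIGH_MM_S:
        return not overlay_visible    # step 1211: toggle the superimposed state
    if probe_speed_mm_s >= RANGE_HIGH_MM_S:
        return False                  # step 1213: show the plain ultrasonic image
    return overlay_visible            # at or below the lower value: keep the state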
 In the above example, the superimposed state of the photoacoustic image is changed using the moving speed of the probe 102, but the display control unit 305 may change the superimposed state of the photoacoustic image using other information. For example, the rotation speed of the probe 102 may be used instead of the moving speed of the probe 102; when the rotation speed of the probe 102 is within a predetermined range, the display control unit 305 may change the superimposed state of the photoacoustic image.
 The display control unit 305 may also change the superimposed state of the photoacoustic image when both the moving speed of the probe 102 and the rotation speed of the probe 102 are within the respective predetermined ranges with which they are compared.
 Furthermore, the acceleration of the probe 102 may be used instead of the moving speed of the probe 102. For example, when the acceleration of the probe 102 is within a predetermined range, the display control unit 305 may change the superimposed state of the photoacoustic image.
 In the first embodiment, the display control unit 305 superimposes the photoacoustic image on the ultrasonic image according to the moving speed of the probe 102, but the pressure with which the probe 102 is pressed against the subject may additionally be used. For example, when the moving speed of the probe 102 is equal to or less than a predetermined value and the pressure with which the probe 102 is pressed against the subject is equal to or greater than a predetermined value, the display control unit 305 may cause the display unit 104 to display the photoacoustic image superimposed on the ultrasonic image. Then, while the photoacoustic image is superimposed on the ultrasonic image and displayed on the display unit 104, if the moving speed of the probe 102 becomes greater than the predetermined value while the pressure with which the probe 102 is pressed against the subject remains equal to or greater than the predetermined value, the display control unit 305 changes the superimposed state of the photoacoustic image. That is, if the photoacoustic image was previously superimposed on the ultrasonic image, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. If the photoacoustic image was not previously superimposed on the ultrasonic image, the display control unit 305 causes the display unit 104 to display, in place of the ultrasonic image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is superimposed.
 When the pressure with which the probe 102 is pressed against the subject is less than the predetermined value, the display control unit 305 causes the display unit 104 to display an ultrasonic image on which the photoacoustic image is not superimposed.
 This processing also makes it possible to switch whether to superimpose the photoacoustic image on the ultrasonic image by a simple operation of the probe 102. The user can therefore examine in detail an ultrasonic image without a superimposed photoacoustic image by a simple operation of the probe 102, and can also superimpose the photoacoustic image on the ultrasonic image again by a simple operation of the probe 102.
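 The combined speed-and-pressure condition described above could be sketched as follows; the threshold values and names are illustrative assumptions, not values from the disclosure.

SPEED_LIMIT_MM_S = 20.0      # assumed predetermined speed value
PRESSURE_LIMIT_KPA = 10.0    # assumed predetermined pressure value

def update_overlay_speed_pressure(speed_mm_s: float, pressure_kpa: float,
                                  overlay_visible: bool) -> bool:
    """Show the overlay when the probe is slow and firmly pressed, toggle the
    superimposed state when the probe is fast and firmly pressed, and hide the
    overlay when the probe is not firmly pressed against the subject."""
    if pressure_kpa < PRESSURE_LIMIT_KPA:
        return False                      # pressure below the predetermined value: plain ultrasonic image
    if speed_mm_s <= SPEED_LIMIT_MM_S:
        return True                       # slow and pressed: superimpose the photoacoustic image
    return not overlay_visible            # fast and pressed: change the superimposed state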
 In the first embodiment, the display control unit 305 causes the display unit 104 to display the photoacoustic image superimposed on the ultrasonic image when the moving speed of the probe 102 becomes equal to or less than a predetermined value. In this case, the display control unit 305 may change the superimposed state of the photoacoustic image based on information indicating the angle of the probe 102 detected by a gyro sensor. For example, when the moving speed of the probe 102 is equal to or less than the predetermined value and the change in the angle of the probe 102 over a predetermined period is equal to or greater than a predetermined value, the display control unit 305 changes the superimposed state of the photoacoustic image. That is, for example, when the user intends to change only the angle of the probe 102 without changing the position of its tip, the display control unit 305 changes the superimposed state of the photoacoustic image. Accordingly, if the photoacoustic image was previously superimposed on the ultrasonic image, the display control unit 305 causes the display unit 104 to display, in place of the superimposed image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is not superimposed. If the photoacoustic image was not previously superimposed on the ultrasonic image, the display control unit 305 causes the display unit 104 to display, in place of the ultrasonic image displayed on the display unit 104, an ultrasonic image on which the photoacoustic image is superimposed.
 Note that the display control unit 305 causes the display unit 104 to display an ultrasonic image on which the photoacoustic image is not superimposed when the moving speed of the probe 102 becomes greater than the predetermined value.
 This processing likewise makes it possible to switch whether to superimpose the photoacoustic image on the ultrasonic image by a simple operation of the probe 102, so the user can examine in detail an ultrasonic image without a superimposed photoacoustic image, and can superimpose the photoacoustic image on the ultrasonic image again, each by a simple operation of the probe 102.
 In the above example, the display control unit 305 changes the superimposed state in step 1211 whenever the moving speed of the probe 102 is within the predetermined range, but the present invention is not limited to this. For example, after the moving speed of the probe 102 falls within the predetermined range and the display control unit 305 controls the display unit 104 so that the photoacoustic image is not superimposed on the ultrasonic image, the display control unit 305 may refrain from superimposing the photoacoustic image on the ultrasonic image even if the moving speed of the probe 102 falls within the predetermined range again. To display the ultrasonic image with the superimposed photoacoustic image on the display unit 104 again, the probe 102 may, for example, be moved as follows: move the probe 102 so that its moving speed exceeds the upper limit of the predetermined range, and then move the probe 102 so that its moving speed becomes equal to or less than the predetermined value used in step 601. That is, when the determination unit 304 determines that the moving speed of the probe 102 has exceeded the upper limit of the predetermined range and then become equal to or less than the predetermined value used in step 601, the display control unit 305 causes the display unit 104 to display the ultrasonic image with the superimposed photoacoustic image. Accordingly, after switching from the ultrasonic image with the superimposed photoacoustic image to the ultrasonic image without it, the ultrasonic image without the superimposed photoacoustic image can remain displayed even if the probe 102 is stopped or moved only slightly.
 In other words, according to the above form, once the display has been changed so that the photoacoustic image is not superimposed on the ultrasonic image, the photoacoustic image is not easily superimposed on the ultrasonic image again; the user is therefore less likely to be distracted by the operation of the probe 102 and can concentrate on observing the ultrasonic image.
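 A minimal sketch of this latching behaviour, assuming concrete values for the predetermined speed range and for the threshold used in step 601, could look as follows in Python; the range limits, the threshold value, and the state-variable names are illustrative only.

RANGE_LOW, RANGE_HIGH = 2.0, 8.0  # assumed predetermined speed range (mm/s)
REARM_THRESHOLD = 2.0             # assumed value corresponding to the threshold used in step 601

class OverlayLatch:
    def __init__(self):
        self.superimpose_pa = True  # the overlay is initially shown
        self.rearmed = False        # set once the speed has exceeded RANGE_HIGH

    def update(self, probe_speed: float) -> bool:
        if self.superimpose_pa:
            if RANGE_LOW <= probe_speed <= RANGE_HIGH:
                # Speed entered the predetermined range: remove the overlay and latch.
                self.superimpose_pa = False
                self.rearmed = False
        else:
            if probe_speed > RANGE_HIGH:
                # Exceeding the upper limit of the range re-arms the latch ...
                self.rearmed = True
            elif self.rearmed and probe_speed <= REARM_THRESHOLD:
                # ... and slowing down below the step-601 threshold restores the overlay.
                self.superimpose_pa = True
        return self.superimpose_pa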
 In addition, the superimposition state of the photoacoustic image is changed by the display control unit 305 when the moving speed of the probe 102 is greater than a predetermined value and the pressure with which the probe 102 is pressed against the subject is equal to or greater than a predetermined value; however, the present invention is not limited to this. For example, after the moving speed of the probe 102 has exceeded the predetermined value, the pressure with which the probe 102 is pressed against the subject has reached or exceeded the predetermined value, and the display control unit 305 has accordingly controlled the display unit 104 so that the photoacoustic image is not superimposed on the ultrasonic image, the display control unit 305 may refrain from superimposing the photoacoustic image on the ultrasonic image even if the moving speed of the probe 102 again exceeds the predetermined value and the pressure again reaches or exceeds the predetermined value. To cause the display unit 104 to display the ultrasonic image with the superimposed photoacoustic image again, the probe 102 may be operated, for example, as follows: after the pressure with which the probe 102 is pressed against the subject has been reduced below the predetermined value, the probe 102 is operated so that its moving speed is equal to or below the predetermined value and the pressure is equal to or above the predetermined value. In this case, the display control unit 305 causes the display unit 104 to display the photoacoustic image superimposed on the ultrasonic image again.
 According to the above form, once the display has been changed so that the photoacoustic image is not superimposed on the ultrasonic image, the photoacoustic image is not easily superimposed again; the user is therefore less likely to be distracted by the operation of the probe 102 and can concentrate on observing the ultrasonic image.
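 The pressure-based variant can be sketched in the same way; here the latch is re-armed by releasing the probe pressure. The speed and pressure thresholds and the class name are again assumptions for illustration, not values from the embodiment.

SPEED_THRESHOLD = 5.0     # mm/s, assumed predetermined speed value
PRESSURE_THRESHOLD = 3.0  # N, assumed predetermined pressure value

class PressureLatch:
    def __init__(self):
        self.superimpose_pa = True
        self.pressure_released = False

    def update(self, probe_speed: float, pressure: float) -> bool:
        if self.superimpose_pa:
            if probe_speed > SPEED_THRESHOLD and pressure >= PRESSURE_THRESHOLD:
                # Fast motion while pressing: remove the overlay and latch.
                self.superimpose_pa = False
                self.pressure_released = False
        else:
            if pressure < PRESSURE_THRESHOLD:
                # Releasing the probe re-arms the latch ...
                self.pressure_released = True
            elif self.pressure_released and probe_speed <= SPEED_THRESHOLD:
                # ... pressing again while nearly stationary restores the overlay.
                self.superimpose_pa = True
        return self.superimpose_pa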
 Furthermore, the superimposition state of the photoacoustic image is changed when the moving speed of the probe 102 is equal to or below a predetermined value and the change in the angle of the probe 102 over a predetermined period is equal to or greater than a predetermined value; however, the present invention is not limited to this. For example, after the moving speed of the probe 102 has fallen to or below the predetermined value, the angle change over the predetermined period has reached or exceeded the predetermined value, and the display control unit 305 has accordingly controlled the display unit 104 so that the photoacoustic image is not superimposed on the ultrasonic image, the display control unit 305 may refrain from superimposing the photoacoustic image on the ultrasonic image even if the moving speed of the probe 102 again falls to or below the predetermined value and the angle change over the predetermined period again reaches or exceeds the predetermined value. To cause the display unit 104 to display the ultrasonic image with the superimposed photoacoustic image again, the probe 102 may be operated, for example, as follows: the moving speed of the probe 102 is first made greater than the predetermined value and is then reduced to or below the predetermined value. That is, when the determination unit 304 determines that the moving speed of the probe 102 has exceeded the predetermined value and has then fallen to or below it, the display control unit 305 causes the display unit 104 to display the ultrasonic image with the photoacoustic image superimposed. Therefore, even when the angle of the probe 102 is changed without moving the probe 102, or while moving it only slightly, the ultrasonic image without the superimposed photoacoustic image can remain displayed.
 According to the above form, once the display has been changed so that the photoacoustic image is not superimposed on the ultrasonic image, the photoacoustic image is not easily superimposed again; the user is therefore less likely to be distracted by the operation of the probe 102 and can concentrate on observing the ultrasonic image.
 The present invention can also be realized by processing in which a program that implements one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
 The control device in each of the above-described embodiments may be realized as a single device, or a plurality of devices capable of communicating with one another may be combined to execute the above-described processing; both forms are included in the embodiments of the present invention. The above-described processing may also be executed by a common server device or server group. The plurality of devices constituting the control device and the control system need only be able to communicate at a predetermined communication rate, and need not be located in the same facility or in the same country.
 The embodiments of the present invention include a form in which a software program that implements the functions of the above-described embodiments is supplied to a system or apparatus, and a computer of the system or apparatus reads out and executes the code of the supplied program.
 Therefore, the program code itself installed in a computer in order to realize the processing according to the embodiments on that computer is also one embodiment of the present invention. In addition, an OS or the like running on the computer may perform part or all of the actual processing based on instructions included in the program read by the computer, and the functions of the above-described embodiments can also be realized by that processing.
 Forms in which the above-described embodiments are combined as appropriate are also included in the embodiments of the present invention.
 The present invention is not limited to the above embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Therefore, the following claims are attached in order to make the scope of the present invention public.
 This application claims priority based on Japanese Patent Application No. 2016-136107 filed on July 8, 2016, and Japanese Patent Application No. 2016-229311 filed on November 25, 2016, the entire contents of which are incorporated herein by reference.

Claims (25)

  1.  A control device comprising:
     first acquisition means for acquiring an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject and outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation on the subject;
     second acquisition means for acquiring information relating to displacement of the probe; and
     display control means for causing a display unit to display, based on the information relating to the displacement, a photoacoustic image generated using the photoacoustic signal.
  2.  The control device according to claim 1, wherein the display control means causes the display unit to display the photoacoustic image, based on the information relating to the displacement, when an ultrasonic image generated using the ultrasonic signal is being displayed on the display unit.
  3.  The control device according to claim 1 or 2, wherein the second acquisition means acquires, as the information relating to the displacement, at least one of information on a position and orientation of the probe with respect to the subject, information on a speed of movement with respect to the subject, information on a speed of rotation of the probe, information on an acceleration of movement with respect to the subject, and information indicating a degree of pressing against the subject.
  4.  The control device according to any one of claims 1 to 3, wherein the display control means causes the display unit to display the photoacoustic image when at least one of information indicating that the probe is moving with respect to the subject at a speed lower than a predetermined speed and information indicating that the probe is pressed against the subject at a constant pressure is acquired.
  5.  The control device according to any one of claims 1 to 4, wherein the display control means varies the manner in which the photoacoustic image is displayed on the display unit in accordance with the degree of the displacement.
  6.  The control device according to claim 5, wherein the display control means displays the photoacoustic image on the display unit with greater transparency as the speed of movement with respect to the subject increases.
  7.  The control device according to any one of claims 1 to 6, further comprising determination means for determining, based on the information relating to the displacement of the probe, whether to display the photoacoustic image on the display unit,
     wherein the display control means causes the display unit to display the photoacoustic image based on a result of the determination means determining that the photoacoustic image is to be displayed on the display unit.
  8.  The control device according to claim 7, wherein the determination means determines that the photoacoustic image is to be displayed on the display unit when at least one of information indicating that the probe is moving with respect to the subject at a speed lower than a predetermined speed and information indicating that the probe is pressed against the subject at a pressure higher than a predetermined pressure is acquired by the acquisition means.
  9.  The control device according to claim 7 or 8, further comprising irradiation control means for controlling an irradiation unit to irradiate the subject with light when the determination means determines that the photoacoustic image is to be displayed on the display unit.
  10.  The control device according to any one of claims 1 to 9, further comprising generation means for generating an ultrasonic image based on the ultrasonic signal acquired by the first acquisition means and generating a photoacoustic image based on the photoacoustic signal.
  11.  The control device according to claim 10, further comprising output means for associating the ultrasonic image and the photoacoustic image generated by the generation means with each other and outputting them to an external device.
  12.  The control device according to claim 11, wherein the output means attaches information associating the ultrasonic image and the photoacoustic image with each other to each of the images and outputs the images.
  13.  The control device according to claim 10, further comprising output means for outputting, to an external device, a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image generated by the generation means.
  14.  The control device according to any one of claims 11 to 13, wherein the output means attaches, to the ultrasonic image, information indicating the position of the probe at which the ultrasonic signal for generating the ultrasonic image was acquired.
  15.  The control device according to any one of claims 11 to 14, wherein the output means attaches, to the photoacoustic image, information indicating the position of the probe at which the photoacoustic signal for generating the photoacoustic image was acquired.
  16.  The control device according to any one of claims 1 to 15, further comprising guide means for generating guide information for guiding the probe to a specific position.
  17.  The control device according to any one of claims 1 to 16, further comprising notification means for notifying that the probe is performing the light irradiation in order to acquire the photoacoustic signal.
  18.  The control device according to any one of claims 1 to 17, wherein the display control means causes the display unit to display an ultrasonic image generated from the ultrasonic signal and, based on the information relating to the displacement of the probe, displays the photoacoustic image superimposed on the ultrasonic image.
  19.  The control device according to any one of claims 1 to 18, wherein the second acquisition means acquires information relating to displacement of the probe in a magnetic field based on information acquired from a magnetic sensor provided in the probe.
  20.  The control device according to any one of claims 1 to 19, wherein the second acquisition means acquires the information relating to the displacement of the probe based on information acquired from a pressure sensor provided in the probe.
  21.  The control device according to any one of claims 1 to 20, wherein the information relating to the displacement of the probe corresponds to a manner in which a user operates the probe.
  22.  A control device comprising:
     first acquisition means for acquiring at least one of an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject and outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation on the subject;
     third acquisition means for acquiring information relating to a feature of an ultrasonic image acquired based on the ultrasonic signal; and
     display control means for causing a display unit to display, based on the acquired information relating to the feature of the ultrasonic image, a photoacoustic image generated from the photoacoustic signal.
  23.  The control device according to claim 22, wherein the feature of the ultrasonic image is determined by characteristics of the probe and imaging conditions for acquiring the ultrasonic signal.
  24.  An imaging system comprising:
     a light source for irradiating a subject with light;
     a transducer for transmitting and receiving ultrasonic waves;
     first acquisition means for acquiring, as an ultrasonic signal, a reflected wave of the ultrasonic wave transmitted by the transducer, and acquiring, as a photoacoustic signal, a photoacoustic wave generated by the light irradiated onto the subject from the light source;
     second acquisition means for acquiring information relating to displacement of a probe comprising the transducer; and
     display control means for causing a display unit to display, based on the information relating to the displacement, a photoacoustic image generated from the photoacoustic signal.
  25.  A control method comprising:
     a first step of acquiring an ultrasonic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject;
     a second step of acquiring information relating to displacement of the probe;
     a third step of acquiring, based on the information relating to the displacement of the probe, a photoacoustic signal from the probe, which outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation on the subject; and
     a fourth step of causing a display unit to display a photoacoustic image generated from the photoacoustic signal.
PCT/JP2017/024575 2016-07-08 2017-07-05 Control device, control method, control system, and program WO2018008664A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780042494.1A CN109414254A (en) 2016-07-08 2017-07-05 Control equipment, control method, control system and program
US16/239,330 US20190150894A1 (en) 2016-07-08 2019-01-03 Control device, control method, control system, and non-transitory storage medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016136107 2016-07-08
JP2016-136107 2016-07-08
JP2016-229311 2016-11-25
JP2016229311A JP2018011927A (en) 2016-07-08 2016-11-25 Control device, control method, control system, and program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/239,330 Continuation US20190150894A1 (en) 2016-07-08 2019-01-03 Control device, control method, control system, and non-transitory storage medium

Publications (1)

Publication Number Publication Date
WO2018008664A1 true WO2018008664A1 (en) 2018-01-11

Family

ID=60912810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024575 WO2018008664A1 (en) 2016-07-08 2017-07-05 Control device, control method, control system, and program

Country Status (1)

Country Link
WO (1) WO2018008664A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013111432A (en) * 2011-12-01 2013-06-10 Fujifilm Corp Photoacoustic image generation apparatus and photoacoustic image generation method
JP2013150787A (en) * 2011-12-28 2013-08-08 Fujifilm Corp Acoustic image generating device and method for displaying progress when generating images using the same
JP2014061124A (en) * 2012-09-21 2014-04-10 Fujifilm Corp Photoacoustic measuring apparatus, detection method of light scanning state of photoacoustic measuring apparatus, and sheet-like member for use in method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022510696A (en) * 2018-12-04 2022-01-27 Fujifilm Sonosite, Inc. Photoacoustic ECG-synchronized kilohertz visualization
JP7321266B2 (en) 2018-12-04 2023-08-04 Fujifilm Sonosite, Inc. Photoacoustic electrocardiogram-gated kilohertz visualization

Similar Documents

Publication Publication Date Title
JP5019205B2 (en) Ultrasonic diagnostic equipment
US20190150894A1 (en) Control device, control method, control system, and non-transitory storage medium
KR20180006308A (en) Apparatus, method, and program for obtaining information derived from ultrasonic waves and photoacoustic waves
JP6576424B2 (en) Display control apparatus, image display method, and program
US20150173721A1 (en) Ultrasound diagnostic apparatus, medical image processing apparatus and image processing method
WO2018008439A1 (en) Apparatus, method and program for displaying ultrasound image and photoacoustic image
JP2018057695A (en) Image display system, image display method, and program
WO2018008661A1 (en) Control device, control method, control system, and program
WO2018008664A1 (en) Control device, control method, control system, and program
US11510630B2 (en) Display control apparatus, image display method, and non-transitory computer-readable medium
EP3329843B1 (en) Display control apparatus, display control method, and program
JP2018011928A (en) Control device, control method, control system, and program
WO2018097030A1 (en) Information processing device, information processing method, information processing system, and program
Nayak et al. Technological Evolution of Ultrasound Devices: A Review
JP2015006260A (en) Ultrasonic diagnostic apparatus
US20200113541A1 (en) Information processing apparatus, information processing method, and storage medium
US20190321005A1 (en) Subject information acquisition apparatus, subject information processing method, and storage medium using probe to receive acoustic wave
JP2020028669A (en) Image processing device, image processing method, and program
JP7129158B2 (en) Information processing device, information processing method, information processing system and program
JP6929204B2 (en) Information processing equipment, information processing methods, and programs
JP2017042603A (en) Subject information acquisition apparatus
WO2018097050A1 (en) Information processing device, information processing method, information processing system, and program
US20180299763A1 (en) Information processing apparatus, object information acquiring apparatus, and information processing method
WO2020040174A1 (en) Image processing device, image processing method, and program
JP2020162746A (en) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17824267

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17824267

Country of ref document: EP

Kind code of ref document: A1