WO2021153314A1 - Système de traitement d'informations médicales, dispositif de traitement d'informations médicales, procédé de commande de système de traitement d'informations médicales, et programme - Google Patents

Medical information processing system, medical information processing device, method for controlling a medical information processing system, and program (Système de traitement d'informations médicales, dispositif de traitement d'informations médicales, procédé de commande de système de traitement d'informations médicales, et programme)

Info

Publication number
WO2021153314A1
WO2021153314A1 (PCT/JP2021/001491)
Authority
WO
WIPO (PCT)
Prior art keywords
learning model
medical information
accuracy
determination
information processing
Prior art date
Application number
PCT/JP2021/001491
Other languages
English (en)
Japanese (ja)
Inventor
一誠 小原
Original Assignee
Canon Inc. (キヤノン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc. (キヤノン株式会社)
Publication of WO2021153314A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00: Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining for computer-aided diagnosis, e.g. based on medical expert systems

Definitions

  • The present invention relates to a medical information processing system, a medical information processing device, a control method of the medical information processing system, and a program.
  • Patent Document 1 discloses a technique of calculating the determination accuracy of classifiers created by machine learning and using the classifier with the highest determination accuracy for the determination processing.
  • However, the determination performance of such a function often depends on the learning data. For example, when a function for recognizing the irradiation field of an X-ray image is trained mostly on data of the chest region, its determination performance is expected to deteriorate when data of regions other than the chest is input.
  • In other words, the determination performance may improve under specific conditions while deteriorating under other conditions.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide a medical information processing technique capable of performing determination processing with a learning model that matches the conditions of the data.
  • To achieve this object, the medical information processing system includes: acquisition means for acquiring medical information; condition setting means for setting conditions for sorting the data used for machine learning; and determination means for performing determination processing on target data using learning models that are generated by the machine learning using the medical information and the conditions and that differ for each condition.
  • The determination means selects, from the learning models differing for each condition, a learning model that matches the conditions of the target data, and performs the determination processing using the selected learning model.
  • With this configuration, the determination processing can be performed by a learning model that matches the conditions of the data.
  • FIG. 1 is a system configuration diagram showing an example of the medical information system, and FIG. 2 is a flowchart showing the procedure of the processing performed by the machine learning control device according to the first embodiment.
  • FIG. 5 is a flowchart showing the procedure of the processing performed by the machine learning control device according to the second embodiment.
  • FIG. 6 is a flowchart showing the procedure of the processing performed by the machine learning control device according to the third embodiment.
  • In the present specification, radiation includes not only X-rays but also α-rays, β-rays, γ-rays, various particle beams, and the like.
  • FIG. 1 is a diagram showing an example of the configuration of the medical information processing system 10 according to the first embodiment of the present invention.
  • the medical information processing system 10 includes a radiography control device 101 and a machine learning control device 113 that functions as a determination accuracy evaluation device.
  • the radiography control device 101 and the machine learning control device 113 are communicably connected to the HIS 117, the RIS 118, the PACS 119, the printer 120, and the report server 121 via the network 122.
  • HIS 117: Hospital Information System
  • RIS 118: Radiology Information System
  • PACS 119: Picture Archiving and Communication System
  • the image interpretation report created by the image interpretation doctor is stored in the report server 121.
  • HIS117 may include a server that manages accounting information.
  • An examination instruction is input from a terminal of the HIS 117 and transmitted to the radiology department, which is the request destination.
  • This request information is called an examination order, and the examination order includes the name of the requesting department, examination items, personal data of the subject 130, and the like.
  • When the radiology department receives the examination order via the RIS 118, it adds imaging conditions and the like and transfers the order to the radiography control device 101.
  • The radiography control device 101 performs radiography according to the received examination order. Examination information is added to the image captured under the radiography control of the radiography control device 101, and the image is transferred to the PACS 119 or printed out by the printer 120. In addition, information on the examination performed by the radiography control device 101 is transferred to the HIS 117.
  • The examination implementation information transferred to the HIS 117 is used not only for examination progress management but also for post-examination accounting.
  • The image interpreting doctor checks the image transferred to the PACS 119 or the image printed by the printer 120, and creates an image interpretation report describing the interpretation result with a report creation device (not shown). The interpretation report is stored in the report server 121.
  • Each of these devices is connected via, for example, a network 122 composed of a LAN (Local Area Network), a WAN (Wide Area Network), or the like.
  • Each of these devices includes one or more computers.
  • the computer is provided with, for example, a main control unit such as a CPU, and a storage unit such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
  • the computer may be provided with a communication unit such as a network card, an input / output unit such as a keyboard, a display or a touch panel, and the like.
  • The units of each computer are connected via a bus or the like, and are controlled by the main control unit reading and executing a program stored in the storage unit.
  • the radiography control device 101 includes a display unit 102, an operation unit 103, a determination unit 104, a radiation generation control unit 105, a display control unit 106, and an imaging control unit 107.
  • the radiation generation control unit 105 is connected to the radiation generation unit 110 via a cable 111, and controls the irradiation of radiation from the radiation generation unit 110.
  • the radiation generating unit 110 is realized by, for example, a radiation tube, and irradiates the subject 130 (for example, a specific part of the patient) with radiation.
  • the imaging control unit 107 controls the processing in the radiography control device 101 in an integrated manner.
  • the display unit 102 is realized by, for example, a liquid crystal display or the like, and displays various information to a user (photographer, doctor, etc.).
  • the operation unit 103 is realized by, for example, a mouse, an operation button, or the like, and inputs various instructions from the user into the device.
  • the display unit 102 and the operation unit 103 may be realized by a touch panel in which they are integrated.
  • The determination unit 104 is a discriminator that performs inference based on a learning model created by machine learning using medical information obtained in past diagnoses, and performs disease inference and image processing.
  • the medical information may be information included in any one of the examination order, the image, and the interpretation report, or may be a combination of the information included in the examination order, the image, and the interpretation report.
  • the imaging control unit 107 is connected to the radiation detector 109 via a cable 112, and a power supply, an image signal, a control signal, etc. are exchanged between the two by the cable 112.
  • the radiation detector 109 functions as a detector that detects the radiation transmitted through the subject 130 and acquires an image (radiation image) based on the radiation transmitted through the subject 130. That is, the radiation photographing unit is realized by operating the radiation generating unit 110 and the radiation detector 109 in cooperation with each other.
  • the radiation detector 109 is installed on, for example, a standing or lying position imaging table 108.
  • The imaging control unit 107 functions as an instruction unit for instructing the start of radiographic imaging corresponding to at least one piece of the order information received from the RIS 118.
  • the order information received from the RIS 118 includes, for example, subject information and one or more imaging sites for the subject.
  • The operation unit 103 receives an input from the user to instruct the start of radiography.
  • Alternatively, the imaging control unit 107 may select the order information to be imaged and instruct the start of imaging.
  • When radiography is performed, the captured image (radiation image) is displayed on the display unit 102.
  • the user can perform image processing, cropping, annotation, geometric transformation, and other image editing on the displayed image via the operation unit 103. These image edits may be automatically performed by the determination unit 104.
  • the above is an explanation of an example of the configuration of the radiography control device 101 in the medical information processing system according to the first embodiment.
  • the configuration shown in FIG. 1 is merely an example and can be changed as appropriate.
  • various devices are connected to the radiography control device 101 via a network 122, but the radiography control device 101 does not necessarily have to be connected to such a device.
  • the diagnostic image may be output to a portable medium such as a DVD and input to various devices via the portable medium.
  • the network 122 may be configured by wire or may be partially configured by a wireless signal transmission line. A part of the cable 111 and the cable 112 may also be composed of a wireless signal transmission line.
  • The machine learning control device 113 includes a learning/evaluation data acquisition unit 114, a learning/evaluation data storage unit 115, a determination accuracy evaluation unit 116, a condition setting unit 123, and a machine learning unit 124.
  • The learning/evaluation data acquisition unit 114 acquires the data (medical information) used for machine learning training or evaluation from any device of the medical information processing system 10 connected via the network 122. For example, when training with a medical image as input, the learning/evaluation data acquisition unit 114 acquires images from the PACS 119.
  • the data acquired as input data for machine learning may be either a single data or a plurality of data.
  • the acquired data is stored in the learning / evaluation data storage unit 115.
  • the learning / evaluation data acquisition unit 114 may not store the acquired data in the learning / evaluation data storage unit 115, but may output the data to the condition setting unit 123.
  • The condition setting unit 123 sets the conditions for sorting the data used for machine learning. For example, conditions such as the imaging region and the imaging direction are set.
  • the condition setting is not limited to the setting by the condition setting unit 123, and the user may set it manually.
  • In step S203, the machine learning unit 124 creates a plurality of learning models, one differing for each condition, by machine learning using the data (medical information) and the conditions set in step S202. That is, the machine learning unit 124 receives the data acquired by the learning/evaluation data acquisition unit 114 (or held in the learning/evaluation data storage unit 115) and the conditions set by the condition setting unit 123, and creates multiple learning models that differ for each condition. For example, if the condition of the imaging region is set in step S202, a learning model is created for each imaging region; if the condition of the imaging direction is set, a learning model is created for each imaging direction.
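The per-condition model creation described above can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation: the record layout, the condition key `imaging_region`, and the stand-in `train_model` callable are all hypothetical.

```python
from collections import defaultdict

def create_models_per_condition(records, condition_key, train_model):
    """Split the learning data by the set condition and train one
    learning model per condition value (e.g. per imaging region)."""
    groups = defaultdict(list)
    for record in records:
        groups[record[condition_key]].append(record)
    # One learning model per condition, as in step S203.
    return {condition: train_model(data) for condition, data in groups.items()}

# Toy stand-in for the machine learning unit 124: "training" here just
# counts the samples so the example stays self-contained.
records = [
    {"imaging_region": "chest", "image": "img1"},
    {"imaging_region": "chest", "image": "img2"},
    {"imaging_region": "abdomen", "image": "img3"},
]
models = create_models_per_condition(records, "imaging_region", train_model=len)
print(models)  # {'chest': 2, 'abdomen': 1}
```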
  • In step S204, the determination accuracy evaluation unit 116 evaluates the accuracy of the determination processing using the learning models newly created by the machine learning unit 124. That is, based on the medical information acquired by the learning/evaluation data acquisition unit 114 (or held in the learning/evaluation data storage unit 115) and the conditions set by the condition setting unit 123, it evaluates the accuracy of the plurality of newly created per-condition learning models. As the accuracy evaluation, the determination accuracy evaluation unit 116 obtains the condition used for the evaluation (the evaluation data condition) and the accuracy of the determination processing (the determination accuracy) when evaluation data corresponding to that condition is input.
  • As the conditions of the evaluation data, the determination accuracy evaluation unit 116 uses the same data conditions as those used by the machine learning unit 124 to create the learning models. For example, for a learning model created under the chest-region condition, it uses chest-region data as the evaluation data.
  • In step S205, the determination accuracy evaluation unit 116 decides, based on the result of the accuracy evaluation of the created learning models, whether to apply a learning model newly created by the training of the machine learning unit 124 to the determination unit 104. For example, the determination accuracy evaluation unit 116 evaluates the accuracy based on a comparison between the accuracy of the determination processing by the newly created learning model and the accuracy of the determination processing by the learning model currently used in the determination unit 104.
  • Specifically, evaluation data corresponding to each condition is input to the newly created learning model, and the accuracy of the determination processing output from that model is obtained. The accuracy is then evaluated by comparing it with the accuracy of the determination processing output from the learning model currently used by the determination unit 104 when the same evaluation data is input to it.
  • When, based on this comparison, the determination accuracy of the new learning model is equal to or lower than that of the learning model currently used by the determination unit 104 (S205: No), the determination accuracy evaluation unit 116 ends the processing.
  • When the determination accuracy of the new learning model is higher than that of the learning model currently used by the determination unit 104 (S205: Yes), the determination accuracy evaluation unit 116 advances the processing to step S206 and decides to output the newly created learning model to the determination unit 104.
  • In step S206, when the accuracy of the determination processing by the newly created learning model is higher than that by the learning model used in the determination unit 104, the determination accuracy evaluation unit 116 outputs the newly created learning model and its condition to the determination unit 104.
  • That is, the determination accuracy evaluation unit 116 outputs the learning model newly created by the training of the machine learning unit 124 and the conditions set in step S202 to the radiography control device 101.
  • When the radiography control device 101 acquires the conditions and the learning models output from the determination accuracy evaluation unit 116, the learning model for each condition is applied to the setting of the determination unit 104.
  • For example, when the radiography control device 101 acquires model A trained under the chest-region condition and model B trained under the non-chest condition, a learning model suited to the condition of the determination target data is selected and used in the determination processing of the determination unit 104.
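The apply-or-keep decision of steps S205 and S206 can be sketched as a per-condition accuracy comparison. This is a minimal sketch under the assumption that determination accuracies are available as plain numbers; the function and variable names are hypothetical, not from the disclosure.

```python
def select_models_to_apply(new_accuracy, current_accuracy):
    """For each condition, keep the new learning model only when its
    determination accuracy exceeds that of the model currently in use
    (steps S205-S206); otherwise the current model is retained."""
    to_apply = {}
    for condition, new_acc in new_accuracy.items():
        if new_acc > current_accuracy.get(condition, 0.0):
            to_apply[condition] = new_acc
    return to_apply

# Accuracy of the newly created models vs. the model in use (model A).
new_accuracy = {"chest": 0.95, "abdomen": 0.80}
current_accuracy = {"chest": 0.90, "abdomen": 0.85}
print(select_models_to_apply(new_accuracy, current_accuracy))  # {'chest': 0.95}
```

Only the chest-region model clears the S205 comparison here, so only it would be output to the determination unit 104.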
  • In step S301, the determination unit 104 acquires the determination target data, which is the input parameter for executing the inference processing, from the imaging control unit 107.
  • For example, the determination target data is the captured X-ray image.
  • Since the learning model is switched according to the conditions, information such as the imaging region is also included in the determination target data.
  • In step S302, the determination unit 104 acquires the learning models and conditions output from the machine learning control device 113 in step S206.
  • In step S303, the determination unit 104 selects a learning model that matches the conditions based on the acquired determination target data.
  • That is, the determination unit 104 selects, from the plurality of per-condition learning models, a learning model that matches the conditions of the determination target data, and performs determination processing using the selected learning model. For example, when performing irradiation field recognition on an X-ray image of the chest region, the determination unit 104 selects the learning model corresponding to the chest-region condition and applies it to the machine learning classifier.
  • In step S304, the determination unit 104 performs determination processing by machine learning using the learning model selected in step S303. This completes the processing of the radiography control device 101.
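Steps S301 to S304 amount to looking up the model whose condition matches the determination target data and running inference with it. The sketch below is hypothetical: plain functions stand in for trained classifiers, and the `imaging_region` key stands in for the condition information carried with the target data.

```python
def determine(target, models, default_model):
    """Steps S301-S304: pick the learning model whose condition matches
    the determination target data, falling back to the default model
    when no condition-specific model exists."""
    model = models.get(target["imaging_region"], default_model)
    return model(target["image"])

# Hypothetical per-condition "models": plain functions standing in for
# trained irradiation-field classifiers.
models = {"chest": lambda img: f"chest-field({img})"}
default_model = lambda img: f"generic-field({img})"

print(determine({"imaging_region": "chest", "image": "x1"}, models, default_model))
print(determine({"imaging_region": "knee", "image": "x2"}, models, default_model))
```

A chest image is routed to the chest-specific model, while any other region falls back to the generic model.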
  • FIG. 4 is a diagram showing an example of a learning model applied to the determination unit 104.
  • 4A of FIG. 4 shows the state before the learning model created by the training of the machine learning unit 124 is applied to the determination unit 104, and 4B of FIG. 4 shows the state after the created learning model is applied.
  • Before the application, the learning model to be used is not switched by the condition (the region, in the example of 4A of FIG. 4), and the determination unit 104 performs determination processing for all regions using the already-set learning model A.
  • After the application, the determination unit 104 switches the model to be used according to the condition (the region, in the example of 4B of FIG. 4). For example, when the determination target data input to the determination unit 104 is an X-ray image of the chest region, learning model B, which suits the chest-region condition, is used.
  • As described above, learning models trained for each condition are created at training time in the machine learning control device 113, and in the determination processing in the determination unit 104 of the radiography control device 101, a learning model suited to the condition can be selected and the determination processing performed with the selected model.
  • When a newly created learning model is applied, the determination unit 104 updates the learning model used in the determination processing corresponding to that condition with the newly created learning model.
  • When target data corresponding to the condition is input, the determination unit 104 performs determination processing on the target data using the updated learning model. When target data that does not match the condition corresponding to the updated learning model is input, the determination unit 104 performs determination processing on the target data using the learning model previously used in the determination unit 104.
  • For example, if a learning model updated under the chest-region condition were used for determination target data other than the chest region, the determination accuracy would be expected to decrease.
  • Therefore, the previously used learning model is used as-is for determination target data other than the chest region, and the updated learning model is used for chest-region determination target data.
  • The example of the region condition has been described, but the present invention is not limited to this example; medical information data such as sensor information, imaging conditions, subject information, image processing parameters, and image information can also be used as conditions. That is, a learning model trained under the condition of such medical information data is created at training time in the machine learning control device 113, and in the determination processing in the determination unit 104 of the radiography control device 101, a learning model suited to the medical information data can be selected using the medical information data as the condition, and determination processing can be performed using the selected model.
  • When the determination target data is data other than the medical information data used at training time, the conventionally used learning model is used as-is; when the determination target data matches the medical information data used at training time, the processing switches to the learning model trained under that condition. As a result, the determination accuracy does not decrease when the determination target data is other than the training-time medical information data, and the determination accuracy can be improved when the determination target data is the training-time medical information data.
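The update behaviour described above, replacing only the retrained condition's model while keeping the previous model for every other condition, can be sketched with a simple model table. The table layout and names are illustrative, not from the disclosure.

```python
def update_models(models_in_use, new_models):
    """Replace only the learning models whose condition was retrained
    and approved; for all other conditions the previously used model
    is kept as-is, so accuracy does not drop for unaffected data."""
    updated = dict(models_in_use)
    updated.update(new_models)
    return updated

models_in_use = {"default": "model A"}
# Only the chest-region model was newly created and approved in S205/S206.
updated = update_models(models_in_use, {"chest": "model B"})
print(updated)  # {'default': 'model A', 'chest': 'model B'}
```

Chest-region target data would now hit model B, while all other data continues to use model A unchanged.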
  • In the second embodiment, the condition setting unit 123 sets the conditions based on all combinations of a plurality of parameters, and the machine learning unit 124 performs learning using the medical information and the combinations of the plurality of parameters.
  • FIG. 5 is a flowchart showing a procedure of processing performed by the machine learning control device 113 according to the second embodiment.
  • The processing contents of steps S201 and S203 to S206 are the same as in the flowchart of FIG. 2; the difference from FIG. 2 is the condition setting method (S501).
  • step S501 when a plurality of parameters are included in the condition, the condition setting unit 123 sets the condition based on all combinations of the plurality of parameters.
  • the condition setting unit 123 automatically sets a condition based on all combinations of a plurality of possible parameters.
  • For example, if the conditions for sorting the data used for machine learning include a parameter that can take three values (A1, A2, A3) and a parameter that can take four values (B1, B2, B3, B4), a total of 12 patterns of conditions are set based on all combinations of the parameters.
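The exhaustive condition setting of step S501 corresponds to a Cartesian product over the parameter values, as can be illustrated with the parameter values from the example above:

```python
from itertools import product

# Parameter A can take three values and parameter B four values, so
# conditions are set for all 3 x 4 = 12 combinations (step S501).
params_a = ["A1", "A2", "A3"]
params_b = ["B1", "B2", "B3", "B4"]
conditions = [(a, b) for a, b in product(params_a, params_b)]
print(len(conditions))  # 12
```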
  • In step S203, the machine learning unit 124 performs learning on the medical information and the conditions based on all combinations of the plurality of parameters, and in step S204 the determination accuracy evaluation unit 116 evaluates the accuracy of the learning models.
  • step S205 the determination accuracy evaluation unit 116 determines whether or not to apply the newly created learning model to the determination unit 104 based on the result of the accuracy evaluation of the created learning model.
  • In step S303, the determination unit 104 selects a learning model that matches the conditions based on the acquired determination target data, as in the processing described in the first embodiment; here, a learning model that matches the combination of conditions is selected based on the determination target data.
  • According to the medical information processing system of the present embodiment, it is possible to comprehensively train and evaluate not only on the parameters set by the user but also on conditions based on all combinations of parameters.
  • In the first embodiment, the conditions are limited by the user's intention; by training on conditions based on all parameter combinations, every possibility of performance improvement, not limited by the user's intention, can be reflected in the conditions when creating and evaluating learning models. Note that this embodiment is merely an example, and may include processing that switches the learning model by determining whether the determination target data matches the set conditions.
  • In the third embodiment, an example of processing will be described in which the user is notified of the accuracy evaluation of the learning model in step S204 of the first embodiment, and whether to apply the newly created learning model to the determination unit 104 is determined based on the result of the accuracy evaluation.
  • the configuration of the medical information processing system according to the present embodiment is the same as the configuration described with reference to FIG. 1 of the first embodiment.
  • Here, the processing of notifying the user of the accuracy evaluation of the learning model, which differs from the processing in the first embodiment, will be mainly described.
  • the determination accuracy evaluation unit 116 notifies the result of the accuracy evaluation for each condition for the newly created learning model.
  • FIG. 6 is a flowchart showing a procedure of processing performed by the machine learning control device 113 according to the third embodiment.
  • The processing contents of steps S201 to S206 are the same as in the flowchart of FIG. 2; the difference from FIG. 2 is the added processing (S601) of notifying the evaluation result.
  • step S601 the determination accuracy evaluation unit 116 notifies the evaluation result of the learning model for each condition.
  • 7A of FIG. 7 is a diagram showing an example of the information notified by the determination accuracy evaluation unit 116 of the machine learning control device 113.
  • The accuracy evaluation results include information indicating the learning model to be evaluated, the condition used for the accuracy evaluation (the evaluation data condition), and the accuracy of the determination processing (the determination accuracy) when evaluation data corresponding to the condition is input.
  • Model A is the learning model currently in use by the determination unit 104, i.e., the model before the learning model created by the training of the machine learning unit 124 is applied to the determination unit 104.
  • Model A corresponds to the learning model described in 4A of FIG.
  • the model B is a learning model newly created by the learning of the machine learning unit 124, and corresponds to the learning model (model B) described in 4B of FIG.
  • When the determination target data is data other than the medical information data used at training time (data other than the chest), the previously used learning model (model A in FIG. 7) is used as-is; when the determination target data matches the medical information data used at training time, the processing switches to the learning model trained under the condition of that medical information data (model B in FIG. 7).
  • As a result, the determination accuracy does not decrease when the determination target data is other than the training-time medical information data, and the determination accuracy can be improved when the determination target data is the training-time medical information data.
  • The determination accuracy evaluation unit 116 notifies, as message information (7B in FIG. 7), the per-condition evaluation results of the learning models together with a comparison between the newly created learning model (model B) and the learning model currently in use (model A).
  • That is, the determination accuracy evaluation unit 116 generates message information (7B in FIG. 7) including the condition (evaluation data condition), the learning model (model B), and the accuracy, and notifies the user.
  • In this way, the determination accuracy evaluation unit 116 notifies with a combination of the accuracy evaluation results and message information comparing the accuracy of the learning model used in the determination unit 104 with that of the newly created learning model.
  • the notification of the evaluation result 7A and the message information 7B shown in FIG. 7 is output via the network 122.
  • the display control unit 106 of the radiography control device 101 causes the display unit 102 to display the combination of the accuracy evaluation result and the message information.
  • the evaluation result and the message information output from the determination accuracy evaluation unit 116 are displayed on the display unit 102 based on the control of the display control unit 106.
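A notification combining a per-condition evaluation table (as in 7A) and a comparison message (as in 7B) could be assembled as below. The layout, names, and accuracy values are hypothetical; the disclosure does not specify the exact format.

```python
def build_notification(results, current_model, new_model):
    """Build a per-condition evaluation report (cf. 7A of FIG. 7) and a
    short message comparing the new model with the one in use (cf. 7B).
    `results` maps condition -> (current accuracy, new accuracy)."""
    lines = [f"evaluation data condition | {current_model} | {new_model}"]
    improved = []
    for condition, (cur, new) in results.items():
        lines.append(f"{condition} | {cur:.2f} | {new:.2f}")
        if new > cur:
            improved.append(condition)
    message = (f"{new_model} improves accuracy for: {', '.join(improved)}"
               if improved else f"{new_model} shows no improvement")
    return "\n".join(lines), message

report, message = build_notification(
    {"chest": (0.90, 0.95), "abdomen": (0.85, 0.80)}, "model A", "model B")
print(report)
print(message)  # model B improves accuracy for: chest
```

The display control unit 106 could then render such a report and message pair on the display unit 102.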
  • this embodiment is merely an example, and is not limited to the configuration of this embodiment as long as it includes a mechanism for notifying the evaluation result for each combination of the learning model and the condition (evaluation data condition).
  • the format of the notification shown in FIG. 7 can be changed and is not necessarily limited to the format shown in FIG. 7.
  • the configuration having the radiography control device 101 and the machine learning control device 113 as the medical information processing system 10 has been described, but the configuration is not limited to this configuration, and the medical information of the device itself is not limited to this. It can also be configured as a processing device.
  • the functional configuration of the machine learning control device 113 shown in FIG. 1 can be provided inside the radiography control device 101, and likewise the functional configuration of the radiography control device 101 can be provided inside the machine learning control device 113. When configured in this way as a single medical information processing device, the same effects as those realized by the medical information processing system 10 described above can be obtained.
  • the determination process can be performed by a learning model that matches the data conditions.
  • determination performance is improved when the determination process uses a learning model that matches the data condition, while a decrease in determination performance can be suppressed under other conditions.
  • the present invention can also be realized by processing in which a program that implements one or more functions of the above-described embodiment is supplied to a system or apparatus via a network or storage medium, and one or more processors in a computer of that system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
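As an illustration of the per-condition accuracy comparison and message-information notification described in the bullets above, the following is a minimal, hypothetical sketch. The function names, accuracy values, condition labels, and message format are assumptions of this rendering and do not appear in the patent disclosure.

```python
# Hypothetical sketch of what the determination accuracy evaluation unit 116
# might do: compare the accuracy of the model in use (model A) with a newly
# created model (model B) for each evaluation data condition, and build
# message information analogous to 7B in FIG. 7.
# All names and values below are illustrative assumptions.

def build_message(condition, accuracy_a, accuracy_b):
    """Return message information comparing model A and model B
    for one evaluation data condition."""
    better = "model B" if accuracy_b > accuracy_a else "model A"
    return (f"Condition '{condition}': model A accuracy {accuracy_a:.2f}, "
            f"model B accuracy {accuracy_b:.2f} -> {better} is more accurate.")

def evaluate_models(results):
    """Build one message per condition.

    results: dict mapping condition -> (accuracy of model A, accuracy of model B)
    """
    return [build_message(cond, a, b) for cond, (a, b) in results.items()]

messages = evaluate_models({"chest/standing": (0.91, 0.95),
                            "abdomen/supine": (0.88, 0.86)})
for m in messages:
    print(m)
```

In practice such messages would be shown on the display unit 102 under control of the display control unit 106; the sketch only covers composing the comparison text.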

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Primary Health Care (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Analysis (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The present invention relates to a medical information processing system comprising: an acquisition unit that acquires medical information; a condition setting unit that sets a condition for sorting data to be used for machine learning; and a determination unit that performs a determination process on target data using learning models that differ for each condition and that are generated by machine learning using the medical information and the condition. The determination unit selects, from among the learning models that differ for each condition, a learning model appropriate to the condition of the target data, and performs the determination process using the selected learning model.
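The selection behavior described in the abstract, choosing from per-condition learning models the one matching the target data's condition, can be sketched as below. The condition keys, model names, fallback model, and function name are assumptions made for illustration only.

```python
# Hypothetical sketch of the determination unit's model selection:
# pick the learning model registered for the target data's condition,
# falling back to a general model when no condition matches.
# Condition keys and model names are illustrative assumptions.

def select_learning_model(models_by_condition, target_condition, default_model):
    """Return the model registered for target_condition, else default_model."""
    return models_by_condition.get(target_condition, default_model)

models_by_condition = {
    ("chest", "standing"): "model_chest_standing",
    ("abdomen", "supine"): "model_abdomen_supine",
}

chosen = select_learning_model(models_by_condition, ("chest", "standing"),
                               "model_general")
print(chosen)  # model_chest_standing
```

The fallback to a general model reflects the body text's point that determination performance should not degrade under conditions for which no specialized model exists.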
PCT/JP2021/001491 2020-01-29 2021-01-18 Système de traitement d'informations médicales, dispositif de traitement d'informations médicales, procédé de commande de système de traitement d'informations médicales, et programme WO2021153314A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-012887 2020-01-29
JP2020012887A JP2021117926A (ja) 2020-01-29 2020-01-29 医用情報処理システム、医用情報処理装置、医用情報処理システムの制御方法、及びプログラム

Publications (1)

Publication Number Publication Date
WO2021153314A1 true WO2021153314A1 (fr) 2021-08-05

Family

ID=77078366

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/001491 WO2021153314A1 (fr) 2020-01-29 2021-01-18 Système de traitement d'informations médicales, dispositif de traitement d'informations médicales, procédé de commande de système de traitement d'informations médicales, et programme

Country Status (2)

Country Link
JP (1) JP2021117926A (fr)
WO (1) WO2021153314A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7487159B2 (ja) * 2021-10-12 2024-05-20 キヤノン株式会社 医用画像処理装置、医用画像処理方法およびプログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014155690A1 (fr) * 2013-03-29 2014-10-02 富士通株式会社 Procédé, dispositif et programme de mise à jour de modèle
JP2018147023A (ja) * 2017-03-01 2018-09-20 ヤフー株式会社 提供装置、提供方法、及び提供プログラム
JP2019109620A (ja) * 2017-12-15 2019-07-04 ヤフー株式会社 推定装置、推定方法、及び推定プログラム
JP2020010805A (ja) * 2018-07-17 2020-01-23 大日本印刷株式会社 特定装置、プログラム、特定方法、情報処理装置及び特定器
JP2020010823A (ja) * 2018-07-18 2020-01-23 キヤノンメディカルシステムズ株式会社 医用情報処理装置、医用情報処理システム及び医用情報処理プログラム

Also Published As

Publication number Publication date
JP2021117926A (ja) 2021-08-10

Similar Documents

Publication Publication Date Title
US11047809B2 (en) Radiation imaging system, radiation imaging method, control apparatus, and computer-readable medium
CN110338823B (zh) 信息处理装置及方法、放射线摄影装置及***和存储介质
KR20130103689A (ko) 정보처리장치 및 정보처리방법
WO2021153314A1 (fr) Système de traitement d'informations médicales, dispositif de traitement d'informations médicales, procédé de commande de système de traitement d'informations médicales, et programme
JP2006198042A (ja) 画像データ読影システム、アクセス権管理装置及び画像データ読影方法
JP6132483B2 (ja) 放射線撮影制御装置および方法
JP7094691B2 (ja) 放射線撮影システム、放射線撮影方法、制御装置及びプログラム
JP2018055278A (ja) 医用情報処理装置、医用情報処理システム、医用情報処理方法およびプログラム
JP6004704B2 (ja) 医療用検査装置および方法
JP2017192453A (ja) 情報処理装置、情報処理システム、情報処理方法及びプログラム
JP7289638B2 (ja) 医用情報処理システム及びプログラム
JP2022035719A (ja) 写損判断支援装置及びプログラム
JP2005124812A (ja) 医用画像システムおよび画像処理方法
WO2021153355A1 (fr) Système de traitement d'informations médicales, dispositif de traitement d'informations médicales, procédé de commande de système de traitement d'informations médicales, et programme
JP2006092132A (ja) 医用画像管理システム、医用画像管理サーバ装置及びプログラム
US20210043305A1 (en) Medical image diagnosis system, medical image processing method, and storage medium
US20220304642A1 (en) Dynamic analysis device and storage medium
JP7428055B2 (ja) 診断支援システム、診断支援装置及びプログラム
JP2022072572A (ja) 医用情報処理システム、医用情報処理方法、及び、プログラム
JP2009125147A (ja) 画像処理装置、画像処理方法及びプログラム
JP2008229251A (ja) 医用画像処理装置、医用画像処理方法及びプログラム
JP2024086195A (ja) 放射線撮影装置及びその制御方法、放射線撮影システム、情報処理装置、並びに、プログラム
JP2013048695A (ja) 放射線撮影システム及びその制御方法、並びに、プログラム
JP2016129543A (ja) 医用画像処理装置
JP2023049838A (ja) 情報処理装置、情報処理方法、及びプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21748302

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21748302

Country of ref document: EP

Kind code of ref document: A1