WO2023024974A1 - Medical device and *** - Google Patents


Info

Publication number
WO2023024974A1
Authority
WO
WIPO (PCT)
Prior art keywords: node, tutorial, instruction, image, teaching
Prior art date
Application number
PCT/CN2022/112781
Other languages
English (en)
French (fr)
Inventor
周玉钰
朱皓
Original Assignee
武汉联影医疗科技有限公司
Priority date
Filing date
Publication date
Priority claimed from CN202110987077.2A
Priority claimed from CN202111102067.2A
Application filed by 武汉联影医疗科技有限公司
Publication of WO2023024974A1


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 — Arrangements for program control, e.g. control units
    • G06F 9/06 — Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 — Arrangements for executing specific programs
    • G06F 9/451 — Execution arrangements for user interfaces

Definitions

  • This specification relates to the technical field of medical equipment, and in particular to a teaching method, an auxiliary method, and a system for a medical device.
  • Ultrasound imaging diagnosis has become an important means of medical diagnosis.
  • With the continuous advancement of ultrasound imaging technology, ultrasound equipment is developing toward full touch screens and intelligence: many functions no longer provide physical buttons but use touch-screen buttons, and many functions provide intelligent assistance without manual user operation. As a result, novices who cannot yet operate ultrasound imaging equipment proficiently must first learn its operation.
  • Ultrasound equipment also provides many scanning parameters (e.g., dynamic range, harmonic imaging, spatial compounding). The physical meaning of these parameters is complicated for some non-professional or less experienced doctors, who are not fully clear about the effect of these parameters.
  • At present, the teaching of ultrasound imaging equipment is mainly carried out through methods such as actively reading the product manual, learning the teaching group or preset parameters built into the equipment, and automatically recognizing image features for prompts.
  • These teaching methods have the problems of being insufficiently intuitive, poorly interactive, and inefficient.
  • the teaching method for a medical device includes: displaying a teaching task list in response to a use instruction triggered on a display interface, wherein the teaching task list includes a plurality of candidate teaching tasks; acquiring a user-triggered selection instruction based on the teaching task list; and determining the target teaching task according to the selection instruction.
  • the candidate teaching tasks may be pre-created based on the editing interface.
  • the editing interface includes a creation area
  • the process of creating each of the candidate teaching tasks in the creation area may include: obtaining a creation instruction based on the creation area, and displaying a node menu in response to the creation instruction; obtaining a node selection instruction based on the node menu, the node selection instruction including the node identifier of the selected tutorial node; configuring the tutorial node corresponding to the node identifier based on the creation area to obtain the configuration information of each tutorial node; and creating the candidate teaching task according to the tutorial nodes and their configuration information.
  • the configuration box of each tutorial node may be displayed according to the node selection instruction; based on the configuration box of each tutorial node, the configuration information of each tutorial node is obtained.
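As a rough sketch of this creation flow — every name here (function, dictionary keys, menu shape) is hypothetical, since the text describes UI behavior rather than an API:

```python
def create_candidate_task(node_menu, selections, config_boxes):
    """Pick tutorial nodes from the menu by identifier, read each node's
    configuration box, and assemble the candidate teaching task.
    All names and data shapes are illustrative, not from the patent."""
    nodes = []
    for node_id in selections:                  # node selection instruction
        if node_id not in node_menu:
            raise KeyError(f"unknown node identifier: {node_id}")
        config = config_boxes.get(node_id, {})  # per-node configuration box
        nodes.append({"id": node_id,
                      "type": node_menu[node_id],
                      "config": config})
    return {"kind": "candidate_teaching_task", "nodes": nodes}
```

A caller would supply the menu (identifier to node type) and whatever the user typed into each configuration box.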
  • the node menu may include a basic node menu and an ultrasound application node menu
  • the creation instruction may be acquired based on the creation area, and the basic node menu and the ultrasound application node menu may be displayed in response to the creation instruction.
  • the node selection instruction may include a first node selection instruction and a second node selection instruction. The first node selection instruction may be obtained based on the basic node menu and includes the basic node identifier of the selected tutorial node; the second node selection instruction may be obtained based on the ultrasound application node menu and includes the ultrasound application node identifier of the selected tutorial node.
  • the editing interface further includes an information prompt area
  • the teaching method of the medical device may further include: acquiring a setting instruction based on the editing interface and/or the information prompt area; and setting the node data and node attributes of each tutorial node in response to the setting instruction.
  • the editing interface may further include an editing area
  • the teaching method of the medical device may further include: acquiring an editing instruction based on the editing area, wherein the editing instruction indicates a node to be edited; displaying the editing interface of the node to be edited in the editing area in response to the editing instruction; and obtaining editing information based on the editing interface and performing at least one of a verification operation, a saving operation, a setting operation, and a search operation on the node to be edited according to the editing information.
  • the teaching method for medical equipment may further include running the target teaching task.
  • in response to a target parameter among a plurality of initial parameters associated with the scanning protocol being selected or adjusted, the images corresponding to the target parameter before and after pre-adjustment may be displayed on the display interface.
  • the first tutorial node in the target teaching task can be stored in a stack, and the following steps are executed in a loop until all tutorial nodes in the stack have been taken out: take a tutorial node from the top of the stack as the current tutorial node; in response to the current tutorial node containing input data that includes invalid data, store the tutorial node connected to the input pin corresponding to the invalid data into the stack; or, in response to the current tutorial node containing input data that is all valid, execute the operations contained in the current tutorial node and store the tutorial nodes corresponding to its outputs into the stack.
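The stack-based traversal above can be sketched as follows. The node structure and two details (the re-push of the current node while its inputs are pending, and the guard against re-executing a node) are my additions to make the loop terminate; the patent text does not specify them:

```python
from dataclasses import dataclass, field

@dataclass
class TutorialNode:
    # Illustrative structure; field names are not from the patent.
    name: str
    inputs: dict = field(default_factory=dict)         # input pin -> data (None = invalid)
    input_sources: dict = field(default_factory=dict)  # input pin -> upstream node
    wires: list = field(default_factory=list)          # (downstream node, its input pin)

def run_task(first_node):
    """Push the first tutorial node, then loop until every node placed
    on the stack has been taken out."""
    executed, seen = [], set()
    stack = [first_node]
    while stack:
        node = stack.pop()                 # take the node from the top of the stack
        if node.name in seen:
            continue                       # guard against re-visits (my addition)
        invalid = [p for p, v in node.inputs.items() if v is None]
        if invalid:
            stack.append(node)             # retry after dependencies (my addition)
            for pin in invalid:            # store the node wired to the invalid pin
                stack.append(node.input_sources[pin])
        else:
            executed.append(node.name)     # execute the node's operation
            seen.add(node.name)
            for child, pin in node.wires:  # produce data for downstream input pins
                child.inputs[pin] = node.name
                stack.append(child)
    return executed
```

Starting the task at a node with an invalid input thus first executes the upstream node that supplies that pin, then resumes the current node.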
  • a new teaching task may be created as the target teaching task, or at least one of the candidate teaching tasks may be edited as the target teaching task.
  • At least one of the candidate teaching tasks can be determined as a teaching task to be edited; the teaching task to be edited is decoded into a graphical interface; at least one tutorial node is obtained according to the node logical connection information in the script of the teaching task to be edited; the display position of the at least one tutorial node is determined based on its type information and position information; the at least one tutorial node is connected based on the node logical connection information; and the name information, input/output connection information, and data connection information of the at least one tutorial node are obtained based on its internal logic and setting information and displayed in the editing interface.
  • the teaching method for a medical device may further include updating at least one of the plurality of candidate teaching tasks.
  • the system includes: a first display module configured to display a teaching task list in response to a use instruction triggered on a display interface, wherein the teaching task list includes a plurality of candidate teaching tasks; and a first acquisition module configured to acquire a user-triggered selection instruction based on the teaching task list and determine the target teaching task according to the selection instruction.
  • One of the embodiments of this specification provides a computer-readable storage medium, the storage medium stores computer instructions, and after the computer reads the computer instructions in the storage medium, the computer executes the teaching method of the medical device.
  • the teaching task creation method includes: obtaining a creation instruction based on the creation area, and displaying a node menu in response to the creation instruction; obtaining a node selection instruction based on the node menu, the node selection instruction including the node identifier of the selected tutorial node; configuring the tutorial node corresponding to the node identifier based on the creation area to obtain the configuration information of each tutorial node; and creating the candidate teaching task according to the tutorial nodes and their configuration information.
  • the configuration box of each tutorial node may be displayed according to the node selection instruction; based on the configuration box of each tutorial node, the configuration information of each tutorial node is obtained.
  • the node menu may include a basic node menu and an ultrasound application node menu
  • the creation instruction may be acquired based on the creation area
  • the basic node menu and the ultrasound application node menu may be displayed in response to the creation instruction.
  • the node selection instruction may include a first node selection instruction and a second node selection instruction, the first node selection instruction may be obtained based on the basic node menu, and the first node selection instruction includes the basic node identifier of the selected tutorial node; and acquiring the second node selection instruction based on the ultrasound application node menu, where the second node selection instruction includes the ultrasound application node identifier of the selected tutorial node.
  • the candidate teaching task may be created based on an editing interface, the editing interface further including an information prompt area, and the method for creating a teaching task may further include: acquiring a setting instruction based on the editing interface and/or the information prompt area; and setting the node data and node attributes of each tutorial node in response to the setting instruction.
  • the teaching task creation method may further include: generating a teaching task script corresponding to the candidate teaching task based on each tutorial node and its configuration information; performing a compilation check on the script; and, in response to the compilation check passing, uploading the teaching task script, or, in response to the compilation check failing, prompting the location of the setting error in the teaching task script.
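A minimal sketch of this generate–check–upload flow, assuming a JSON script format and a single illustrative check (every wired-to node must exist); the real compilation check and script format are not specified in the text:

```python
import json

def compile_check(nodes):
    """Illustrative check only: every node identifier that a node wires to
    must exist in the script. A real check would also validate pin types,
    required settings, cycles, and so on."""
    ids = {n["id"] for n in nodes}
    return [(n["id"], t) for n in nodes for t in n.get("next", []) if t not in ids]

def publish_teaching_task(nodes, upload):
    """Generate the task script, compile-check it, then upload on success
    or report the location of the setting error. Names are hypothetical."""
    errors = compile_check(nodes)
    if errors:
        node_id, target = errors[0]
        return f"setting error at node '{node_id}': unknown target '{target}'"
    script = json.dumps({"nodes": nodes})  # serialized teaching task script
    upload(script)                         # e.g. push to the cloud platform
    return "uploaded"
```

The `upload` callable stands in for whatever transport delivers the script to the cloud platform mentioned below.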
  • the teaching task script can be uploaded to the cloud platform.
  • the tutorial nodes may include basic nodes and/or ultrasound application nodes.
  • the basic nodes may include at least one of an event receiving node, an event sending node, a mathematical variable node, a logic calculation node, an execution function node, a presentation node, a logic waiting node, and an end node; the ultrasound application nodes may include at least one of a mode switch node, a measurement package switch node, a probe simulation data node, an image parameter setting node, a function activation node, and a B-mode image comparison node.
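The two node families listed above can be captured as enumerations. The member and value names below are paraphrases of the node types named in the text, not identifiers from any real implementation:

```python
from enum import Enum

class BasicNode(Enum):
    """Basic (non-ultrasound-specific) tutorial node types."""
    EVENT_RECEIVE = "event_receiving"
    EVENT_SEND = "event_sending"
    MATH_VARIABLE = "mathematical_variable"
    LOGIC_CALC = "logic_calculation"
    EXEC_FUNCTION = "execution_function"
    PRESENTATION = "presentation"
    LOGIC_WAIT = "logic_waiting"
    END = "end"

class UltrasoundAppNode(Enum):
    """Ultrasound-application tutorial node types."""
    MODE_SWITCH = "mode_switch"
    MEASUREMENT_PACKAGE_SWITCH = "measurement_package_switch"
    PROBE_SIMULATION_DATA = "probe_simulation_data"
    IMAGE_PARAMETER_SETTING = "image_parameter_setting"
    FUNCTION_ACTIVATION = "function_activation"
    B_MODE_IMAGE_COMPARISON = "b_mode_image_comparison"
```

Splitting the taxonomy into two enums mirrors the two node menus (basic vs. ultrasound application) described earlier.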
  • the medical equipment assistance method includes: acquiring a scanning protocol associated with a plurality of initial parameters; receiving an auxiliary function trigger instruction; receiving a target parameter selected from the plurality of initial parameters; and displaying, on the display interface, a first image corresponding to the target parameter before pre-adjustment and a second image corresponding to the target parameter after pre-adjustment.
  • the medical equipment assistance method may further include: displaying a teaching task list in response to a use instruction triggered based on the display interface, wherein the teaching task list includes a plurality of candidate teaching tasks; based on the The teaching task list acquires a selection instruction triggered by the user; according to the selection instruction, a target teaching task is determined to execute the medical equipment assistance method.
  • the plurality of initial parameters may include first-level initial parameters and second-level initial parameters
  • the medical device assistance method may further include at least one of the following operations:
  • in response to the scanning protocol or the auxiliary function trigger instruction, displaying the first-level initial parameters on the display interface; and, in response to a display trigger instruction for displaying the second-level initial parameters, displaying the second-level initial parameters on the display interface.
  • the first image and the second image may be determined by querying a pre-stored database and displayed on the display interface, wherein the pre-stored database includes images corresponding to each value of the plurality of initial parameters.
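For the static assist case, the lookup described above amounts to indexing a database keyed by parameter and value. The key shape and function name are assumptions for illustration:

```python
def fetch_comparison_images(db, parameter, current_value, preview_value):
    """Static assist: look up the before/after images for a target parameter
    in a pre-stored database mapping (parameter, value) -> image."""
    first = db[(parameter, current_value)]   # image before pre-adjustment
    second = db[(parameter, preview_value)]  # image after pre-adjustment
    return first, second
```

With one stored image per parameter value, no image processing is needed at display time; the comparison pair is retrieved directly.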
  • when the auxiliary function trigger instruction is a dynamic auxiliary function trigger instruction, at least one of the following operations may be performed before displaying the plurality of initial parameters on the display interface: collecting an initial image based on the scanning protocol; and displaying the initial image on the display interface. Displaying the first image (corresponding to the target parameter before pre-adjustment) and the second image (corresponding to the target parameter after pre-adjustment) on the display interface includes: using the initial image as the first image; processing the first image based on the target parameter to determine the second image; and displaying the first image and the second image on the display interface.
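In the dynamic case the "after" image is computed rather than looked up. The sketch below treats an image as a flat list of 8-bit pixels, and the per-parameter transforms are placeholders, not the patent's actual image processing:

```python
def preview_dynamic_adjustment(initial_image, parameter, value):
    """Dynamic assist: the live initial image is the 'before' (first) image;
    the 'after' (second) image is derived by applying the pre-adjusted
    parameter. Both transforms below are illustrative placeholders."""
    first = list(initial_image)
    if parameter == "gain":
        # placeholder: additive brightness shift, clamped to 8-bit range
        second = [min(255, px + value) for px in first]
    elif parameter == "dynamic_range":
        # placeholder: simple percentage rescale
        second = [px * value // 100 for px in first]
    else:
        second = list(first)  # unknown parameter: show the image unchanged
    return first, second
```

The display layer would then show `first` and `second` side by side so the user sees the parameter's effect before committing the adjustment.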
  • the target parameters may include a first target parameter and a second target parameter.
  • the first image corresponding to the target parameters before pre-adjustment may include the image corresponding to the first target parameter and the second target parameter before pre-adjustment; the second image may include an image corresponding to the first target parameter and the second target parameter after pre-adjustment, an image corresponding to the first target parameter after pre-adjustment, and an image corresponding to the second target parameter after pre-adjustment.
  • the first image and the second image may comprise static images or dynamic images.
  • the scanned object presented in the first image and the second image may be related to the scanning protocol.
  • the medical device may be an ultrasound device
  • the scanning protocol may include at least one of a probe type and a scanning mode.
  • the medical equipment auxiliary system includes: an acquisition module configured to acquire a scanning protocol associated with a plurality of initial parameters; a receiving module configured to receive an auxiliary function trigger instruction and a target parameter selected from the plurality of initial parameters; and a display module configured to display, on the display interface, the first image corresponding to the target parameter before pre-adjustment and the second image corresponding to the target parameter after pre-adjustment.
  • One of the embodiments of the present specification provides a computer-readable storage medium, the storage medium stores computer instructions, and after the computer reads the computer instructions in the storage medium, the computer executes the medical equipment assistance method.
  • the teaching task list can be displayed, wherein the list includes a plurality of candidate teaching tasks, each pre-created based on the editing interface. Through the visual display, the target teaching task can be intuitively selected from the multiple candidates; the selected task, itself created on the visual interface, can then be run to help users understand the ultrasound equipment (for example, the physical meaning of scanning parameters) and to ensure that users practice and master its functions and operations.
  • Fig. 1 is a schematic diagram of an application scenario of a medical system according to some embodiments of this specification
  • Fig. 2 is an exemplary block diagram of a teaching system for medical equipment according to some embodiments of this specification
  • Fig. 3 is an exemplary block diagram of a medical equipment auxiliary system shown according to some embodiments of this specification.
  • Fig. 4 is an exemplary flow chart of a teaching method for a medical device according to some embodiments of this specification
  • Fig. 5 is an exemplary flow chart of a teaching task creation method according to some embodiments of this specification.
  • Fig. 6a is a schematic structural diagram of a teaching task editing interface according to some embodiments of the present specification.
  • Fig. 6b is a schematic diagram of basic nodes in a teaching task according to some embodiments of this specification.
  • Fig. 6c is a schematic diagram of an ultrasound application node in a teaching task according to some embodiments of the present specification.
  • Fig. 7 is a schematic diagram of creating a teaching task according to some embodiments of this specification.
  • Fig. 8 is an exemplary flow chart of a medical device assisting method according to some embodiments of this specification.
  • Fig. 9 is a schematic diagram of at least part of the display interface corresponding to a single target parameter in the static auxiliary function of the ultrasound device according to some embodiments of the present specification;
  • Fig. 10 is a schematic diagram of at least part of the display interface corresponding to the combined target parameters in the static auxiliary function of the ultrasound device according to some embodiments of the present specification;
  • Fig. 11 is a schematic diagram of at least part of the display interface corresponding to a single target parameter in the dynamic assistance function of an ultrasound device according to some embodiments of the present specification;
  • Fig. 12 is a schematic diagram of a teaching task of adjusting a depth value in mode B of an ultrasound device according to some embodiments of the present specification.
  • The word "system" is used to distinguish different components, elements, parts, sections, or assemblies at different levels.
  • These words may be replaced by other expressions if the other expressions can achieve the same purpose.
  • Fig. 1 is a schematic diagram of an application scenario of an exemplary medical device teaching system according to some embodiments of the present application.
  • a medical device teaching system 100 may include a medical device 110 , a network 120 , a terminal device 130 , a processing device 140 and a storage device 150 .
  • the components of the medical device teaching system 100 can be connected in various ways.
  • processing device 140 may be connected to medical device 110 via network 120 .
  • the processing device 140 may be directly connected to the medical device 110 .
  • a terminal device (e.g., 131, 132, 133, etc.)
  • the medical device 110 can be used to scan a target object or a part thereof within its detection area and generate an image related to the target object or a part thereof.
  • the target object may include a human body, an animal (for example, a laboratory mouse or other animal), a phantom, etc., or any combination thereof.
  • the target object may include specific parts of the human body, such as the head, chest, abdomen, etc., or any combination thereof.
  • target objects may include specific organs such as the heart, thyroid, esophagus, trachea, stomach, gallbladder, small intestine, colon, bladder, ureters, uterus, fallopian tubes, and the like.
  • medical equipment 110 may include ultrasound equipment, computed tomography (CT) equipment, magnetic resonance imaging (MRI) equipment, positron emission tomography (PET) equipment, single-photon emission computed tomography (SPECT) equipment, etc., or any combination thereof.
  • Network 120 may facilitate the exchange of information and/or data.
  • one or more components of the medical device teaching system 100 can exchange information and/or data with other components of the system through the network 120.
  • processing device 140 may obtain scan data from medical device 110 via network 120 .
  • network 120 may be any type of wired or wireless network or combination thereof.
  • network 120 may include a cable network, a wired network, a wireless network, a fiber optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), etc., or any combination thereof.
  • network 120 may include one or more network access points.
  • network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points. Through a network access point, one or more components of the medical device teaching system 100 may connect to the network 120 to exchange data and/or information.
  • the terminal device 130 can enable the user to interact with other components in the teaching system 100 of medical equipment. For example, the user may send a request to access the teaching task of the medical device 110 to the processing device 140 through the terminal device 130 . For another example, the terminal device 130 may also receive the scan image acquired by the medical device 110 through the network 120 .
  • the terminal device 130 may include a mobile device 131, a tablet computer 132, a notebook computer 133, etc. or any combination thereof.
  • the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof.
  • the processing device 140 may process information and/or data obtained from the medical device 110 , the terminal device 130 and/or the storage device 150 .
  • the processing device 140 may acquire user-triggered instructions (for example, use instructions, selection instructions, obtain scan protocol instructions, auxiliary function trigger instructions, etc.), so as to display operations related to the instructions on the interactive interface (for example, display teaching task lists, identifying targeted teaching tasks, displaying multiple initial parameters, etc.).
  • the processing device 140 may display corresponding images of the target parameters before and after pre-adjustment on the interactive interface according to the target parameters selected from multiple initial parameters.
  • processing device 140 may be a single server or a group of servers. Server groups can be centralized or distributed.
  • processing device 140 may be local or remote.
  • the processing device 140 may access information and/or data from the medical device 110 , the terminal device 130 and/or the storage device 150 through the network 120 .
  • the processing device 140 may be directly connected to the medical device 110, the terminal device 130 and/or the storage device 150 to access information and/or data.
  • the processing device 140 may include one or more processing units (eg, single-core processing engines or multi-core processing engines).
  • processing device 140 may include a central processing unit (CPU), an application specific integrated circuit (ASIC), an application specific instruction set processor (ASIP), a graphics processing unit (GPU), a physical processing unit (PPU), a digital signal processor (DSP), field programmable gate array (FPGA), programmable logic device (PLD), controller, microcontroller unit, reduced instruction set computer (RISC), microprocessor, etc., or any combination thereof.
  • the processing device 140 may be implemented on a cloud platform.
  • the cloud platform may include one or a combination of private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, cross-cloud, multi-cloud, etc.
  • the processing device 140 may be a part of the medical device 110 or the terminal device 130 .
  • Storage device 150 may store data, instructions and/or any other information.
  • the storage device 150 can store data obtained from the medical device 110 , the terminal device 130 and/or the processing device 140 .
  • storage device 150 may store images corresponding to respective values of parameters related to medical device 110 .
  • the storage device 150 may store scan data (or image data) of the scan object obtained from the medical device 110 .
  • storage device 150 may store data and/or instructions that may be executed or used by processing device 140 to perform the exemplary methods described herein.
  • the storage device 150 may include one or a combination of mass storage, removable storage, volatile read-write storage, and read-only memory (ROM).
  • the storage device 150 may be implemented through the cloud platform described in this application.
  • the storage device 150 can be connected to the network 120 to realize communication with one or more components in the medical device teaching system 100 (eg, the medical device 110 , the processing device 140 , the terminal device 130 , etc.).
  • One or more components in the teaching system 100 for medical equipment can read data or instructions in the storage device 150 through the network 120 .
  • the storage device 150 may be a part of the processing device 140 or may be independent, and is directly or indirectly connected to the processing device 140 .
  • the above description of the teaching system 100 for medical equipment is only for the purpose of illustration, and is not intended to limit the scope of the present application. It can be understood that, after understanding the principle of the system, those skilled in the art can make various modifications and changes in the form and details of the application fields implementing the above system without departing from this principle. However, these changes and modifications do not depart from the scope of the present application.
  • the medical device 110, the processing device 140, and the terminal device 130 may share one storage device 150, or may each have their own storage device. Such variations are within the protection scope of this specification.
  • Fig. 2 is an exemplary block diagram of a teaching system for medical equipment according to some embodiments of this specification.
  • the teaching system 200 for a medical device may be implemented by the processing device 140 .
  • the teaching system 200 for medical equipment may include a first display module 210 and a first acquisition module 220 .
  • the first display module 210 may be configured to display a list of teaching tasks in response to a use instruction triggered based on a display interface.
  • the teaching task list includes multiple candidate teaching tasks.
  • the candidate teaching tasks may be pre-created based on the editing interface.
  • the first acquiring module 220 may be configured to acquire a user-triggered selection instruction based on the teaching task list, and determine a target teaching task from candidate teaching tasks according to the selection instruction.
  • the first acquiring module 220 may also be used to create a new teaching task as the target teaching task, or edit at least one of the candidate teaching tasks as the target teaching task.
  • the first acquisition module 220 can be used to: determine at least one of the candidate teaching tasks as a teaching task to be edited; decode the teaching task to be edited into a graphical interface; obtain at least one tutorial node according to the node logical connection information in the script of the teaching task to be edited; determine the display position of the at least one tutorial node based on its type information and location information; connect the at least one tutorial node based on the node logical connection information; and obtain the name information, input/output connection information, and data connection information of the at least one tutorial node based on its internal logic and setting information and display them in the editing interface.
  • the medical device teaching system 200 may further include an update module (not shown).
  • An update module can be used to update at least one of the plurality of candidate teaching tasks.
  • the medical device teaching system 200 may further include a running module 230 for running a target teaching task.
  • the running module 230 can be used to receive a target parameter selected from multiple initial parameters associated with the scan protocol, and to display on the display interface the first image corresponding to the target parameter before pre-adjustment and the second image corresponding to the target parameter after pre-adjustment.
  • the running module 230 can be used to store the first tutorial node in the target teaching task into a stack, and perform the following steps in a loop until all tutorial nodes in the stack have been taken out: take a tutorial node from the top of the stack as the current tutorial node; in response to the current tutorial node containing input data that includes invalid data, store the tutorial node connected to the input pin corresponding to the invalid data into the stack; or, in response to the current tutorial node containing input data that is all valid, execute the operation contained in the current tutorial node and store the tutorial nodes corresponding to its outputs into the stack.
  • the editing interface may include a creation area.
  • the medical equipment teaching system 200 may also include: a second display module, a second acquisition module, a configuration module, a creation module, etc. (not shown).
• the second display module can be used to obtain the creation instruction based on the creation area, and display the node menu in response to the creation instruction.
  • the second obtaining module can be used to obtain node selection instructions based on the node menu.
  • the node selection instruction includes the node identifier of the selected tutorial node.
  • the configuration module can be used to configure the tutorial node corresponding to the node identifier based on the creation area, and obtain the configuration information of each tutorial node.
  • the creation module can be used to create candidate teaching tasks according to the tutorial node and the configuration information of the tutorial node.
• the configuration information may include at least one of: the type information of the tutorial node, the logic information of the tutorial node, the setting information of the tutorial node, the location information of the tutorial node, the logic pin connection information of the tutorial node, the data pin connection information of the tutorial node, the script attribute information of the tutorial script, the configuration information of the tutorial script, and the like.
  • the configuration module may include a display unit and an acquisition unit.
  • the display unit can be used to display the configuration box of each tutorial node according to the node selection instruction.
• the acquisition unit can be used to obtain the configuration information of each tutorial node based on the configuration box of each tutorial node.
  • the node menu may include a base node menu and/or an ultrasound application node menu.
• the second display module may be used to obtain a creation instruction based on the creation area, and display the basic node menu and/or the ultrasound application node menu in response to the creation instruction.
• the node selection instruction may include a first node selection instruction and/or a second node selection instruction.
  • the second obtaining module may be used to obtain the first node selection instruction based on the basic node menu, wherein the first node selection instruction may include the basic node identifier of the selected tutorial node; obtain the second node selection instruction based on the ultrasound application node menu, wherein, the second node selection instruction may include the ultrasound application node identifier of the selected tutorial node.
  • the editing interface may also include an information prompt area.
  • the medical equipment teaching system 200 may also include a third acquiring module and a setting module.
  • the third obtaining module may be used to obtain setting instructions based on the editing interface and/or the information prompt area.
  • the setting module can be used to set node data and/or node attributes of each tutorial node in response to a setting instruction.
  • the editing interface may also include an editing area.
  • the medical equipment teaching system 200 may also include a fourth acquiring module, a third displaying module and an editing module.
  • the fourth obtaining module may be used to obtain an editing instruction based on the editing area, where the editing instruction indicates a node to be edited.
  • the third display module may be used to display the editing interface of the node to be edited on the editing area in response to the editing instruction.
  • the editing module can be used to obtain editing information based on the editing interface, and perform at least one of a verification operation, a saving operation, a setting operation, and a searching operation on the node to be edited according to the editing information.
  • Fig. 3 is an exemplary block diagram of a medical equipment auxiliary system according to some embodiments of the present specification.
  • the medical device assistance system 300 may be implemented by the processing device 140 .
  • the medical equipment auxiliary system 300 may include an acquisition module 310 , a reception module 320 and/or a display module 330 .
  • the acquiring module 310 can be used to acquire the scan protocol.
  • the scanning protocol may be associated with multiple initial parameters.
  • the initial parameters can be divided into multiple levels or categories of initial parameters for the display module 330 to display by level or category respectively.
  • the receiving module 320 may be used to receive an auxiliary function trigger instruction. Wherein, the auxiliary function triggering instruction may trigger the auxiliary function of the medical device. In some embodiments, the auxiliary function triggering instruction may include a static auxiliary function triggering instruction and a dynamic auxiliary function triggering instruction. In some embodiments, the receiving module 320 may also be configured to receive a target parameter selected from a plurality of initial parameters. Wherein, the target parameter may include the type of the selected initial parameter and its corresponding pre-adjustment value.
• the display module 330 may be configured to display, on the display interface, the first image corresponding to the target parameter before pre-adjustment and the second image corresponding to the target parameter after pre-adjustment. In other words, the display module 330 can simultaneously display the corresponding images before and after the target parameter pre-adjustment for the user to compare, so that the user can better understand the effect of applying the target parameter to the image. In some embodiments, the display module 330 can also be used to display multiple initial parameters on the display interface, and the user can select from the displayed parameters to determine the target parameter.
  • the systems, devices and modules shown in Fig. 2 and Fig. 3 may be implemented in various ways.
  • the system, apparatus and modules thereof may be realized by hardware, software, or a combination of software and hardware.
  • the hardware part can be implemented by using dedicated logic; the software part can be stored in a memory and executed by an appropriate instruction execution system, such as a microprocessor or specially designed hardware.
• the methods and systems described above can be implemented using computer-executable instructions and/or processor control code, carried for example on a carrier medium such as a magnetic disk, CD or DVD-ROM, on programmable memory such as a read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier.
• the system and its modules of the present application can be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also, for example, by software executed by various types of processors, or by a combination of the above hardware circuits and software (for example, firmware).
  • the receiving module 320 may include two units, eg, an instruction accepting unit and a parameter receiving unit, to respectively receive the auxiliary function trigger instruction and the selected target parameter.
• each module may share one storage device, or each module may have its own storage device. Such variations are within the protection scope of the present application.
  • Fig. 4 is an exemplary flowchart of a teaching method for a medical device according to some embodiments of the present specification. As shown in Fig. 4, the process 400 includes the following steps. In some embodiments, the process 400 may be performed by the processing device 140 .
• Step 410: display a teaching task list in response to a use instruction triggered on the display interface.
  • the teaching task list includes multiple candidate teaching tasks.
  • step 410 may be performed by the first presentation module 210 .
  • the teaching task list refers to a list showing teaching tasks.
  • the teaching task refers to a process for training a user on functions and operations of the medical device.
  • the teaching task list may be a list arranged vertically or horizontally.
  • multiple candidate teaching tasks related to operating the ultrasound imaging device may be displayed in the teaching task list, for example, imaging tasks, adjustment tasks, and the like.
• the candidate teaching task may include multiple tutorial nodes for performing the teaching task and the logical relationships among these tutorial nodes; the logical relationships among the multiple tutorial nodes reflect the execution logic and the execution content of the candidate teaching task, so that the corresponding candidate teaching task can be executed according to that execution logic and execution content.
  • a tutorial node is a node that represents a single tutorial in a teaching task.
  • the logical relationship between tutorial nodes refers to the relationship between multiple tutorial nodes related to the execution logic of the tutorial, for example, dependency relationship, sequence relationship, and so on.
  • a dependency could be that the execution of tutorial node 2 needs to use the output of tutorial node 1, and an order relationship could be that tutorial node 3 must be executed before tutorial node 4.
• the user may trigger a use instruction on the display interface of the medical device, so that the medical device, in response to the use instruction triggered by the user, calls the teaching task list and displays it.
• if the medical device is a touch-screen ultrasound imaging device, the user can trigger a teaching instruction by long-pressing the touch screen, and a teaching task list will pop up at the long-press position for display;
• if the medical device is a non-contact medical device, the user can trigger teaching instructions through gestures, voice, eye movement, consciousness and other methods; in addition, the user can also trigger teaching instructions through a mouse, keyboard, trackball, etc.
  • the candidate teaching tasks may be pre-created based on the editing interface.
  • the editing interface may include a creation area.
• the candidate teaching task can be created in the creation area through the following process: obtain a creation instruction based on the creation area, and display a node menu in response to the creation instruction; obtain a node selection instruction based on the node menu, wherein the node selection instruction includes the node identifier of the selected tutorial node; configure the tutorial node corresponding to the node identifier based on the creation area, and obtain the configuration information of each tutorial node; and create the candidate teaching task according to the tutorial nodes and their configuration information.
• for more information on how to create teaching tasks in the creation area, refer to the relevant description in FIG. 5; details are not repeated here.
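As an illustrative sketch only (the class, menu entries, and field names below are assumptions, not the claimed implementation), the creation flow above — creation instruction, node menu, node selection instruction, configuration, task creation — might look like:

```python
NODE_MENU = ["ClickEvent", "MathVariable", "DisplayVideo", "ModeSwitch"]  # illustrative

class TeachingTaskBuilder:
    """Collects node selections and configurations into a candidate teaching task."""

    def __init__(self):
        self.nodes = []

    def show_node_menu(self, creation_instruction):
        # respond to a creation instruction (e.g. long-press in the creation area)
        return NODE_MENU

    def select_node(self, node_identifier):
        # a node selection instruction carries the identifier of the chosen node
        if node_identifier not in NODE_MENU:
            raise ValueError(f"unknown node: {node_identifier}")
        node = {"id": len(self.nodes), "type": node_identifier, "config": {}}
        self.nodes.append(node)
        return node

    def configure(self, node, **configuration):
        # configuration info: type, logic, settings, position, pin connections, ...
        node["config"].update(configuration)

    def create_task(self, name):
        return {"name": name, "nodes": self.nodes}
```

For example, selecting a `ClickEvent` node from the menu and configuring its position would yield a candidate task containing that one configured node.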
  • the editing interface may also include an information prompt area.
  • the node data and/or node attributes of the tutorial node can be set in the information prompt area.
  • the setting instruction can be acquired based on the editing interface and/or the information prompt area.
  • the setting instruction can be obtained from the information prompt area included in the editing interface, or can be obtained from the entire editing interface.
• for example, FIG. 6a shows an editing interface; the setting instruction can be obtained from the prompt information area 630 in FIG. 6a, or from the entire editing interface shown in FIG. 6a.
  • node data and/or node attributes of each tutorial node may be set in response to the acquired setting instruction.
  • the node data of each tutorial node can be the input data and output data of the node, for example, the input data of the node can be events, mathematical variables, strings, custom types, etc.
  • the node attribute of each tutorial node may be the type of each tutorial node, for example, an event receiving node, a mathematical variable node, a mode switching node, and the like.
• the working status of the editor of the medical device can also be displayed in the editing interface (for example, in the information prompt area). For example, it may be displayed that the current working status of the editor is a tutorial-script verifying state, a tutorial-script saving state, or the like.
• the node data and node attributes of each tutorial node are set through the editing interface and/or the information prompt area, so that they can be set accurately and the tutorial nodes can adapt to a wider range of application scenarios.
  • the editing interface may also include an editing area.
  • the node to be edited can be edited in the editing area.
  • the editing instruction can be obtained based on the editing area.
  • the editing instruction indicates the node to be edited.
  • the editing area included in the editing interface may be the area 610 shown in FIG. 6a.
• the user-triggered editing instruction can be obtained in the editing area, and the node to be edited is determined according to the editing instruction.
  • an editing interface of a node to be edited may be displayed on the editing area in response to the editing instruction. Specifically, after receiving the editing instruction, the editing interface of the node to be edited can be displayed on the editing area.
  • the editing information may be obtained based on the editing interface, and at least one of a verification operation, a saving operation, a setting operation, and a searching operation, etc. may be performed on the node to be edited according to the editing information.
  • the displayed editing interface may include editing information, the editing information may be acquired from the editing interface, and various editing operations may be performed on the node to be edited according to the acquired editing information.
  • the editing operation may include at least one of a verifying operation, a saving operation, a setting operation, a searching operation and the like.
• the editing interface of the node to be edited is displayed through the editing area and edited there, so that the node to be edited can be edited and adjusted in real time, allowing the edited node to adapt to a wider range of application scenarios.
• Step 420: obtain, based on the teaching task list, a selection instruction triggered by the user.
  • step 420 may be performed by the first acquiring module 220 .
  • a selection can be made from multiple candidate teaching tasks in the teaching task list to select at least one candidate teaching task that needs to be performed.
• when a candidate teaching task is selected in the teaching task list, a selection instruction is triggered.
  • the triggered selection instruction may be retrieved.
  • the selection instruction includes the information of the selected candidate teaching task, for example, the number, name, ID, etc. of the candidate teaching task.
• Step 430: determine the target teaching task.
  • step 430 may be performed by the first acquiring module 220 .
  • the candidate teaching task corresponding to the candidate teaching task information in the selection instruction may be determined as the target teaching task.
  • the number, name, ID, etc. corresponding to the target teaching task may be determined according to the number, name, ID, etc. of the candidate teaching task.
  • the target teaching task may include one or more candidate teaching tasks in the teaching task list.
• Step 440: run the target teaching task.
  • step 440 may be performed by the running module 230 .
• the target teaching task may be executed by running at least one tutorial node included in it according to the logical relationships among these tutorial nodes.
• the first tutorial node in the target teaching task can be stored in a stack, and the following steps are cyclically executed to push and pop the tutorial nodes of the target teaching task until all tutorial nodes in the stack have been taken out: take a tutorial node from the top of the stack as the current tutorial node; in response to the current tutorial node containing input data that includes invalid data, store the tutorial node connected to the input pin corresponding to the invalid data into the stack; or, in response to the current tutorial node containing input data that is all valid, execute the operations contained in the current tutorial node and store the tutorial node corresponding to the output of the current tutorial node into the stack.
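The push/pop loop described above can be sketched as a small interpreter. This is a hypothetical illustration only: the node structure, pin bookkeeping, and validity test (input value of `None` meaning "invalid data") are assumptions, not the claimed implementation.

```python
class TutorialNode:
    """Hypothetical tutorial-graph node; field names are illustrative."""

    def __init__(self, name, operation=lambda values: True):
        self.name = name
        self.operation = operation
        self.inputs = {}        # input pin -> [source_node, value or None]
        self.consumers = []     # nodes connected to this node's output pins

    def connect_data(self, pin, consumer):
        """Wire this node's data output to input `pin` of `consumer`."""
        consumer.inputs[pin] = [self, None]
        self.consumers.append(consumer)

    def connect_logic(self, consumer):
        """Wire this node's logic output to `consumer`."""
        self.consumers.append(consumer)


def run_teaching_task(first_node):
    """Execute tutorial nodes with the stack discipline described above."""
    executed, done = [], set()
    stack = [first_node]
    while stack:
        current = stack.pop()                    # take the top of the stack
        if current in done:
            continue
        missing = [p for p, (src, val) in current.inputs.items() if val is None]
        if missing:
            # invalid input data: push the producers of that data; once they
            # run, they push this node back onto the stack via their outputs
            for pin in missing:
                stack.append(current.inputs[pin][0])
            continue
        result = current.operation({p: v for p, (s, v) in current.inputs.items()})
        done.add(current)
        executed.append(current.name)
        for consumer in current.consumers:       # propagate output, push successors
            for pin, slot in consumer.inputs.items():
                if slot[0] is current:
                    slot[1] = result
            stack.append(consumer)
    return executed
```

A node whose input data is still invalid is thus deferred until the producers of that data have run, which is what the invalid-data branch models.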
  • the process of running the teaching task can be regarded as the process of executing the teaching task script.
• if the teaching task is the one shown in Figure 7, the process may include the following steps:
  • the receiving event node of the teaching task script is triggered, and the ClickEvent node is pushed into the stack.
• the output pin EventName of the ClickEvent node is set by the system event to "into PW mode".
• since the ClickEvent node has output, its logical output node, the DisplayVideo node, is put into the stack, and the isEqual node connected to its data output is put into the stack.
• the isEqual node has two input pins; EventName has already been assigned by the output of the ClickEvent node, so the node connected to the other input pin — a string mathematical variable node — is put into the stack.
• the logic of the DisplayVideo node is that when the value of its input pin condition is true, the video configured in this node is triggered and played at the corresponding position on the screen of the ultrasound device.
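The walk-through above could correspond to a teaching task script serialized roughly as follows; the JSON layout, field names, positions, and the video file name are assumptions for illustration only, since the actual script format is not specified here.

```python
import json

# Hypothetical serialization of the Fig. 7 teaching task: node types,
# editor positions, and logic/data pin connections.
teaching_task_script = {
    "name": "EnterPWModeTutorial",
    "nodes": [
        {"id": 1, "type": "ClickEvent",   "pos": [40, 80]},
        {"id": 2, "type": "StringVar",    "pos": [40, 200], "value": "into PW mode"},
        {"id": 3, "type": "isEqual",      "pos": [220, 140]},
        {"id": 4, "type": "DisplayVideo", "pos": [400, 80],
         "settings": {"video": "pw_mode_intro.mp4", "region": "top-right"}},
    ],
    "logic_connections": [{"from": 1, "to": 4}],
    "data_connections": [
        {"from": [1, "EventName"], "to": [3, "EventName"]},
        {"from": [2, "value"],     "to": [3, "other"]},
        {"from": [3, "result"],    "to": [4, "condition"]},
    ],
}

# such a script can round-trip through text for upload/download
encoded = json.dumps(teaching_task_script)
decoded = json.loads(encoded)
```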
  • teaching tasks can be downloaded in the medical device through the user account.
• the process of downloading teaching tasks may include: the user requests to log in to the server using an account and password; after authentication passes, the server queries the user's teaching task list and the teaching task lists shared within the authority domain, and sends the query to the cloud storage; the cloud storage queries the teaching task list, obtains the teaching task scripts and verification data, and returns them to the server; the server adds sorting information to the obtained teaching task scripts and verification data and returns them to the local ultrasound system where the user is located;
• the local ultrasound system first decides, according to the verification data, whether to overwrite the local teaching tasks, and sorts the novice teaching tasks according to their sorting information. After that, the user can select and activate a suitable teaching task according to current needs.
  • the ultrasound system will analyze the teaching task script and run the corresponding teaching task nodes in sequence when the execution conditions are met.
  • a new teaching task can be created as the target teaching task, or at least one of the candidate teaching tasks can be edited and used as the target teaching task.
• for the creation of a new teaching task, refer to the relevant description in FIG. 5; details are not repeated here.
• at least one of the candidate teaching tasks can be determined as a teaching task to be edited; the teaching task to be edited is decoded into a graphical interface; at least one tutorial node is obtained according to the node logic connection information in the script of the teaching task to be edited; the display position of the at least one tutorial node is determined based on its type information and position information; the at least one tutorial node is connected based on the node logic connection information; and, based on the internal logic and setting information of the at least one tutorial node, its name information, input/output connection information and data connection information are obtained and displayed in the editing interface.
  • the ultrasound imaging device will decode the teaching task script into a graphical interface.
• the processor module of the ultrasound imaging device obtains the nodes sequentially according to the node logic pin connection information in the teaching task script; using the node type information and position information, the rendering module renders each type of node at the corresponding position; using the node pin connection information, the rendering module connects the nodes with rendered line segments; and using the internal logic and setting information of each node, the rendering module renders the node name, input/output logic pins, and data pins in the editor.
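The decoding steps just described — obtain nodes from the connection information, render each by type and position, then connect them with line segments — can be sketched as a small walker over an assumed script layout. The `render_node` and `render_edge` callbacks stand in for the rendering module; all field names are assumptions.

```python
def decode_to_editor(script, render_node, render_edge):
    """Walk a (hypothetical) teaching-task script and emit render calls.

    `render_node(node_type, position, name)` draws one node control and
    returns a handle; `render_edge(src, dst)` draws a connecting segment.
    """
    placed = {}
    for node in script["nodes"]:
        # type and position information determine where each node is drawn
        placed[node["id"]] = render_node(node["type"], tuple(node["pos"]),
                                         node.get("name", node["type"]))
    for conn in script.get("logic_connections", []):
        render_edge(placed[conn["from"]], placed[conn["to"]])
    for conn in script.get("data_connections", []):
        render_edge(placed[conn["from"][0]], placed[conn["to"][0]])
    return placed
```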
  • At least one of the plurality of candidate teaching tasks may be updated. Specifically, after at least one candidate teaching task is edited, the corresponding candidate teaching task before editing can be replaced with the edited candidate teaching task.
  • the teaching method for the medical device may include the following steps:
• after the user logs in to the ultrasound imaging device, the device automatically requests the user's teaching task data list from the server, and updates the locally available teaching tasks after obtaining the list data.
• the list data can include teaching tasks created by the user's own account, as well as tasks in the permission domain that other accounts have set to be shared. In the list, both activated and inactive teaching tasks can be seen.
• the user checks whether there is a teaching task that meets the current demand. If so, the user selects and activates that teaching task, and the task is executed when its conditions are met. If not, the user clicks the "New Teaching Task" or "Edit Teaching Task" button on the interface to enter the teaching task creation and editing interface.
  • a new event node is created by long-pressing the screen on the canvas or clicking an output pin of an existing node and dragging a lead line.
  • the teaching task can be saved in the cloud storage through the user account.
• the process of saving teaching tasks may include: after the user logs in with an account, the teaching task flow script and the data required for sorting are uploaded to the server; after authentication passes, the server generates the verification data of the teaching task script and uploads the novice teaching task script and verification data to the cloud storage; the cloud storage decides, according to the verification data, whether to overwrite the user's previous teaching task script, and returns the processing result to the server; the server determines from the returned processing result whether to update the teaching task list, and returns the final result to the local ultrasound system; the local ultrasound system then proceeds according to the returned final result: if archiving succeeds, success is prompted; if archiving fails, the teaching task script is saved locally to await the next save, and the above steps are retried.
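The verification data used above to decide whether to overwrite a previously stored script could, for instance, be a content digest of the script. The SHA-256 choice below is an assumption for illustration; the text does not specify the actual scheme.

```python
import hashlib

def verification_data(script_bytes: bytes) -> str:
    """Digest of a teaching-task script, used to detect content changes."""
    return hashlib.sha256(script_bytes).hexdigest()

def should_overwrite(stored_digest: str, uploaded_script: bytes) -> bool:
    """Overwrite the stored script only if the uploaded content differs."""
    return verification_data(uploaded_script) != stored_digest
```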
  • the corresponding value of the target parameter before pre-adjustment is displayed on the display interface.
• teaching tasks can be intuitively selected and run in a visual manner, with a friendly interface that is easy to use and convenient for new users; by creating and editing teaching tasks based on the visual interface and then running them, teaching tasks can be created quickly and easily, thereby helping users better understand the medical equipment (for example, ultrasound equipment) and guiding them, through hands-on practice, to master its various functions and operations.
  • Fig. 5 is an exemplary flow chart of a method for creating a teaching task according to some embodiments of this specification. As shown in FIG. 5 , the process 500 includes the following steps. In some embodiments, the process 500 may be performed by the processing device 140 .
• Step 510: acquire a creation instruction based on the creation area, and display a node menu in response to the creation instruction.
• a triggered creation instruction may be retrieved from the creation area, and the node menu is presented in response to that creation instruction.
  • FIG. 6a is a schematic diagram of an editing interface, where the creation area may be the area 620 in FIG. 6a, and the displayed node menu may be the menu 640 in FIG. 6a.
  • the medical device may include an ultrasound imaging device, and each menu item in the node menu corresponds to a type of tutorial node.
  • tutorial nodes refer to nodes corresponding to various elements in the tutorial.
  • the tutorial node may include multiple types of nodes, for example, at least one of a basic node, an ultrasound application node, and the like.
• the basic nodes may include multiple types of nodes, for example, at least one of event receiving nodes, event sending nodes, mathematical variable nodes, logical calculation nodes, execution function nodes, presentation nodes, logical waiting nodes, end nodes, and the like.
• the event receiving node refers to a node that, after receiving a triggered system-provided event or user-defined event, executes the nodes connected to it; for example, after the user clicks a certain interface button, opens a certain interface, or completes a certain input, the nodes following the event node are executed.
• the event sending node customizes an event and sends it together with the input data of the node.
• a mathematical variable node is a node that contains a variable of some type (e.g., Boolean, numeric, string, vector, object, etc.); a mathematical variable node can also be used as the input of other types of nodes for calculation.
  • Logic calculation nodes can perform logic calculations on data, such as addition, subtraction, multiplication, and division of numeric and vector types, true and false discrimination of Boolean types, size and equality judgment of numeric types, equality judgment of string types, etc.
• the execution function node may choose to perform a function of the medical device, for example, mode switching, freezing, measurement package selection, etc.
  • the display node can display an interface through the screen, and the interface can prompt the user how to operate in the form of text, pictures, and videos.
• the logic waiting node continues to execute backwards only after its logic input has been received.
  • An end node may indicate the end of a teaching task.
  • the node control appearance features of the basic node may be as shown in FIG. 6b.
• each node is drawn as a rectangular graphic frame, which can have a different wireframe color or background color according to the node type.
  • the interior of the node is divided into upper and lower areas.
• the upper area contains the node name (the node's name can be customized), and the lower area contains the node data.
  • Node data represents the logic, data input and output pins of the node, with the data object name displayed next to the pins.
  • the data input and output pin areas feature marker symbols, for example, logic connection pins are identified with a triangle symbol, and data connection pins are identified with a dot.
  • Logic and data input pins are on the left and logic and data output pins are on the right.
  • a pin of a node may include at least one of a logic input, a logic output, a data input, and a data output.
• for example, an event node has no logic-in or data-in pins, and an end node has no logic-out or data-out pins.
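The pin conventions above — name area on top and pin area below, triangle markers for logic connection pins and dots for data connection pins, inputs drawn on the left and outputs on the right, with event nodes having no input pins — can be captured in a small data structure. All field names here are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Pin:
    name: str
    kind: str       # "logic" or "data"
    side: str       # "in" (drawn on the left) or "out" (drawn on the right)

    @property
    def marker(self) -> str:
        # logic connection pins use a triangle symbol, data pins a dot
        return "triangle" if self.kind == "logic" else "dot"

@dataclass
class NodeControl:
    name: str                                  # upper area: customizable name
    pins: list = field(default_factory=list)   # lower area: node data pins

    def inputs(self):
        return [p for p in self.pins if p.side == "in"]

    def outputs(self):
        return [p for p in self.pins if p.side == "out"]

# An event node has no logic-in or data-in pins, only outputs:
click_event = NodeControl("ClickEvent", [
    Pin("exec", "logic", "out"),
    Pin("EventName", "data", "out"),
])
```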
  • a node creation menu can be called out at a corresponding position by long pressing the screen.
• by selecting a pin of a node and sliding to pull out a connection line to a pin of another node, the two nodes can be connected; when the line is instead released in a blank space of the creation area (for example, area 620 in FIG. 6a), the node creation menu is called out automatically.
  • connection rules can be set as required. For example, data can flow out from an output pin and flow into data input pins of multiple nodes, but the input pin of a node can only flow in from the output pin of a certain node.
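The example rule above — data may flow out from one output pin into the data input pins of multiple nodes, while an input pin may only receive from a single output pin — can be enforced with a simple bookkeeping check. The pin addressing scheme below is an assumption for illustration.

```python
class ConnectionGraph:
    """Tracks pin-to-pin connections under the fan-out/fan-in rule above."""

    def __init__(self):
        self.source_of = {}   # (node, input_pin) -> (node, output_pin)
        self.targets_of = {}  # (node, output_pin) -> list of connected inputs

    def connect(self, src_node, out_pin, dst_node, in_pin):
        dst = (dst_node, in_pin)
        if dst in self.source_of:
            # an input pin can only flow in from one output pin
            raise ValueError(f"input {dst} already connected")
        self.source_of[dst] = (src_node, out_pin)
        # an output pin may feed the input pins of multiple nodes
        self.targets_of.setdefault((src_node, out_pin), []).append(dst)
```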
• a node can freely change (move) its position in the creation area, and a node can be selected to modify its attributes or to delete it.
• ultrasound application nodes may include multiple types of nodes, for example, at least one of mode switching nodes, measurement package switching nodes, probe simulation data nodes, image parameter setting nodes, function activation nodes, B-mode image comparison nodes, and the like.
  • the mode switching node can trigger mode switching, and will change the parameter configuration of the ultrasound machine after passing in the corresponding mode parameters.
  • the measurement package switching node can switch the measurement package, so that the user operates under the specified measurement package.
• the probe simulation data node can select original ultrasound image data saved on the machine as the simulation data of the teaching task. After this node is executed, the ultrasound equipment no longer takes the real data from the connected probe as input, but uses the simulated data instead.
  • the image parameter setting node can change the parameter data of the medical imaging device by setting the parameter data under the node.
  • the function activation node can activate specific functions of the medical device, such as freezing, saving images, viewing patients, viewing playback, etc.
• the B-mode image comparison node can, according to its settings, take the ultrasound image data and parameters for a specific input parameter and display the simulated B-mode ultrasound image in a set area.
  • the node control of the ultrasound application node is similar to the node control of the basic node, and its appearance characteristics may be as shown in FIG. 6c.
  • the node menu may include a basic node menu, an ultrasound application node menu, and the like.
  • the node type included in the basic node menu is a basic node
  • the node type included in the ultrasound application node menu is an ultrasound application node.
• a creation instruction may be obtained from the creation area, and a basic node menu, an ultrasound application node menu, and the like are displayed in response to the creation instruction.
  • a node selection instruction is obtained based on the node menu, and the node selection instruction includes the node identifier of the selected tutorial node.
• the user may select a node in the displayed node menu, thereby triggering the node selection instruction.
  • the node selection instruction can be obtained from the displayed node menu, wherein the obtained node selection instruction includes the node identifier of the selected tutorial node.
  • the node identifier of the selected tutorial node may include at least one of a node name, a node ID, and the like of the selected tutorial node.
  • the node name in the node identification corresponds to the node name in the node menu.
  • for example, if the name of the selected tutorial node is ClickEvent, then the node name in the node identifier is also ClickEvent.
  • the node selection instruction may include a first node selection instruction and a second node selection instruction.
  • the first node selection instruction can be obtained based on the basic node menu, wherein the first node selection instruction includes the basic node identifier of the selected tutorial node.
  • the second node selection instruction can be acquired based on the ultrasound application node menu, wherein the second node selection instruction includes the ultrasound application node identifier of the selected tutorial node.
  • Step 530 configure the tutorial node corresponding to the node identifier based on the creation area, and obtain the configuration information of each tutorial node.
  • the tutorial node corresponding to the node identifier can be configured based on the creation area, so as to obtain the configuration information of each tutorial node.
  • the configuration of the tutorial node can be completed by configuring the node attributes and node data of the tutorial node.
  • the configuration information of a tutorial node refers to information related to the tutorial node, which may include at least one of node type information, node logic information, node setting information, node location information, node logic pin connection information, node data pin connection information, script attribute information of the tutorial script, configuration information of the tutorial script, and the like.
  • the configuration information includes node attributes and node data of the tutorial node.
  • node data can include node logic information, node logic pin connection information, node data pin connection information, etc.
  • node attributes can include node type information, node setting information, node location information, script attribute information of the tutorial script, configuration information of the tutorial script, etc.
  • a configuration box for each tutorial node may be displayed according to a node selection instruction.
  • the tutorial node corresponding to the node identifier of the selected tutorial node has a corresponding configuration box
  • the configuration box includes configuration information of the tutorial node.
  • the configuration information of each tutorial node can be obtained based on the configuration box of each tutorial node.
  • the configuration information selected in the configuration box of the tutorial node may be determined as the configuration information of the tutorial node.
  • when configuring the tutorial nodes corresponding to the node identifiers of the selected tutorial nodes, each node may be processed sequentially along the logic pin connections, starting from the node corresponding to the first node identifier. First, the internal logic and setting information of each node can be analyzed; then, its input data pin connection information is checked, and if a pin is connected, the input connection line is traversed in reverse to find the output data pin connection information of the previous node, until all preceding data meet the requirements, so as to obtain the configuration information of the target tutorial node corresponding to each node identifier.
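The traversal described above (walk the logic-pin chain forward, and resolve each node's input data pins by following connections back upstream) can be sketched as follows. This is a minimal illustration; all names (`Node`, `resolve_inputs`, `collect_configuration`) are assumptions for demonstration, not the actual implementation.

```python
class Node:
    def __init__(self, node_id, settings=None):
        self.node_id = node_id
        self.settings = settings or {}
        self.next_logic = None   # logic pin connection to the next node
        self.input_pins = {}     # pin name -> upstream Node providing the value

def resolve_inputs(node, resolved):
    """Resolve a node's input data pins by walking connection lines in reverse."""
    for pin, upstream in node.input_pins.items():
        if upstream.node_id not in resolved:
            resolve_inputs(upstream, resolved)   # recurse until all pre-data is met
            resolved[upstream.node_id] = upstream.settings
    return {pin: up.settings for pin, up in node.input_pins.items()}

def collect_configuration(first_node):
    """Process nodes sequentially along the logic chain; return config per node id."""
    config, resolved = {}, {}
    node = first_node
    while node is not None:
        inputs = resolve_inputs(node, resolved)
        config[node.node_id] = {"settings": node.settings, "inputs": inputs}
        node = node.next_logic
    return config

# Example mirroring the tutorial nodes below: an isEqual logic node whose
# input pin is fed by a mathematical variable node holding "PW Mode".
mode_var = Node("node2", {"type": "math_variable", "value": "PW Mode"})
is_equal = Node("node1", {"type": "logic", "name": "isEqual"})
is_equal.input_pins["a"] = mode_var
cfg = collect_configuration(is_equal)
```

Only nodes on the logic chain appear as keys in `cfg`; purely data-providing nodes contribute through the `inputs` of their consumers.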
  • Step 540 create candidate teaching tasks according to the tutorial nodes and the configuration information of the tutorial nodes.
  • candidate teaching tasks can be created based on the editing interface, wherein the editing interface also includes an information prompt area.
  • the setting instruction can be acquired based on the editing interface and/or the information prompt area, wherein the setting instruction includes information for setting node data and node attributes of the tutorial node.
  • the newly created tutorial node 1 type is a logic node, and its name is isEqual;
  • the newly created tutorial node 2 type is a mathematical variable node, and the input data value is "PW Mode".
  • the node data and node attributes of each tutorial node are set in response to the setting instruction. For example, in response to the node data and node attribute information of the tutorial nodes in the setting instruction, tutorial node 1 is set as a logic node named isEqual, and tutorial node 2 is set as a mathematical variable node with the input data value "PW Mode".
  • teaching task scripts corresponding to candidate teaching tasks can be generated based on the above-mentioned tutorial nodes and the configuration information of each tutorial node.
  • each tutorial node can be used as a tutorial node included in a candidate teaching task, and the configuration information of each tutorial node can be used as a logical relationship between the tutorial nodes included in a candidate teaching task, so as to obtain a candidate teaching task.
  • a teaching task script corresponding to the teaching task may be generated according to the obtained tutorial nodes and configuration information of the candidate teaching task.
  • compilation checks can be performed on the generated teaching task scripts. In some embodiments, if the compilation check is passed, the teaching task script may be uploaded; if the compilation check is not passed, a setting error location of the teaching task script may be prompted.
  • the generated teaching task scripts can be uploaded to various local or network platforms, for example, cloud platforms and the like.
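The generate-check-upload flow above can be sketched as follows. The line format, the check rule, and the function names (`compile_check`, `publish`) are illustrative assumptions, not the actual teaching task script format.

```python
def compile_check(script_lines):
    """Return (ok, error_line). Here a valid line must look like 'node_id: action'."""
    for i, line in enumerate(script_lines, start=1):
        if ":" not in line:
            return False, i   # report the setting error location
    return True, None

def publish(script_lines, upload):
    """Upload the script if the compile check passes; otherwise prompt the error."""
    ok, err = compile_check(script_lines)
    if ok:
        upload(script_lines)   # e.g. push to a local or cloud platform
        return "uploaded"
    return f"error at line {err}"

uploaded = []
result = publish(["start: ClickEvent", "n2: SetValue"], uploaded.extend)
```

A failing script (`publish(["bad line"], ...)`) is not uploaded and the first offending line is reported instead.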
  • the creation instruction is obtained through the creation area included in the editing interface, and the node menu is displayed in response to the creation instruction, so that the user can independently select tutorial nodes through the intuitively displayed node menu;
  • based on the node selection instruction, the tutorial node corresponding to the node identifier can be accurately configured in the creation area; with the selected tutorial nodes and the corresponding configuration information, the teaching task can be created quickly and accurately, reducing the complexity of creating teaching tasks and improving creation efficiency.
  • Fig. 8 is an exemplary flow chart of a medical device assisting method according to some embodiments of this specification. As shown in FIG. 8, the process 800 includes the following steps. In some embodiments, the process 800 may be performed by the processing device 140 . In some embodiments, the process 800 may be included in the process of running the target teaching task in step 440 .
  • Step 810 acquire the scanning protocol.
  • step 810 may be performed by the acquisition module 310 .
  • the scan protocol may be the program used in the medical device for the actual scan. In some embodiments, the scan protocol may only be related to the parameters and/or images to be recalled, not to the actual scan. For example, when using a medical device (for example, the medical device 110 ) to scan an object to be scanned, the user may select or set a corresponding scanning protocol according to the information of the object to be scanned.
  • the processing device 140 may receive the scan protocol, and control the medical device to scan and image the scanned object according to the scan protocol.
  • the user can select a scanning protocol on the terminal device (for example, terminal device 130), and can select or set parameters related to the scanning protocol, so as to display the corresponding images before and after parameter pre-adjustment on the display device of the terminal device.
  • the medical device may include an ultrasound device, a CT device, an MRI device, a PET device, a SPECT device, etc., or any combination thereof.
  • a scanning protocol can be associated with one or more initial parameters. Different medical devices may correspond to different initial parameters.
  • for ease of description, an ultrasound device is used as an example of a medical device in this specification, which does not limit the scope of this application.
  • the ultrasound device may include a one-dimensional ultrasound device, a two-dimensional ultrasound device, and/or a three-dimensional ultrasound device.
  • the medical device may be a handheld ultrasound device.
  • the scanning protocol may include at least one of probe types, scanning modes, and the like.
  • the probes may include linear array probes, phased array probes, 4D probes, intracavitary probes, transesophageal probes, gastroscopic probes, and the like.
  • the scanning mode may include an M scanning mode, a PW scanning mode, scanning modes corresponding to different scan objects, and the like. Users can enter the scanning page by selecting probes, scanning modes, etc.
  • the scan protocol may be related to the objects in the image to be displayed in the auxiliary function of the medical device.
  • if the type of probe included in the scan protocol is a probe for a certain part, when the auxiliary function of the medical device is called for that scan protocol, the object in the displayed image is that part. For example, if the probe included in the scan protocol is a cardiac probe, the scan object displayed in the auxiliary function of the medical device is the heart.
  • similarly, if the scan mode included in the scan protocol is a scan mode for a certain part, when the auxiliary function of the medical device is called for that scan protocol, the object in the displayed image is that part. For example, if the scan modes included in the scan protocol are scan modes applied to the abdomen, heart, thyroid, etc., then the scan objects displayed in the auxiliary functions of the medical device correspond to the abdomen, heart, thyroid, etc., respectively.
  • the initial parameters related to the scanning protocol of the ultrasound device may be related to the ultrasound device and/or the ultrasound image.
  • initial parameters may include dynamic range, depth, transmit power, gain, false color, line density, speckle noise suppression, time gain compensation (TGC), harmonic imaging, multi-beam imaging, spatial compounding, brightness, focus adjustment, etc., or any combination thereof.
  • initial parameters may include multi-level or multiple types of initial parameters. For example, the initial parameters may be divided into multi-level initial parameters according to the frequency of use of each initial parameter in clinical applications (that is, the frequency with which the user adjusts each initial parameter when performing image optimization). For another example, initial parameters may be divided into multiple levels according to the adjustment effect of each initial parameter on the image.
  • for example, frequently used parameters such as dynamic range, brightness, and gain can be assigned to one level of initial parameters, while less frequently used parameters can be assigned to another level of initial parameters.
  • Each initial parameter may correspond to multiple values for the user to adjust to optimize the image.
  • the initial parameter may be a continuously valued parameter.
  • continuously valued parameters may include depth, gain, brightness, etc.
  • the initial parameters may also be graded (binned) parameters.
  • for example, the "speckle noise suppression" parameter can be quantified and graded by intensity and denoising index.
  • Step 820 receiving an auxiliary function trigger instruction.
  • step 820 may be performed by the receiving module 320 .
  • the user can trigger the auxiliary function of the medical device by sending an auxiliary function trigger instruction to the processing device 140, so as to know the application effect of each parameter (or initial parameter) of the medical device on the image.
  • the auxiliary function triggering instruction may include a static auxiliary function triggering instruction and a dynamic auxiliary function triggering instruction.
  • the static auxiliary function triggering instruction is related to the static auxiliary function of the medical device.
  • Static auxiliary functions of the medical device may be associated with a pre-stored database.
  • the pre-stored database may include images corresponding to each value of each initial parameter.
  • the pre-stored database may also include images corresponding to individual combination values of multiple combination parameters.
  • a combination parameter is a combination of two or more initial parameters.
  • the processing device 140 may display an image corresponding to the target parameter by querying a pre-stored database based on the selected target parameter (eg, the parameter and its value that the user wants to know).
  • the processing device 140 does not calculate the image to be displayed according to the target parameters in real time, but calls the corresponding pre-stored image for display.
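The static auxiliary function's lookup can be sketched as a simple keyed database: given a target parameter (or combination parameter) and its value, return the pre-stored image rather than computing one in real time. The keys and image names below are illustrative assumptions.

```python
# Pre-stored database: (parameter, value) -> stored image reference.
PRESTORED = {
    ("dynamic_range", 30): "dr30.png",
    ("dynamic_range", 60): "dr60.png",
    # A combination parameter maps a tuple of values to one image.
    ("dynamic_range,gain", (60, 40)): "dr60_gain40.png",
}

def lookup_image(parameter, value):
    """Return the pre-stored image for a target parameter, or None if absent."""
    return PRESTORED.get((parameter, value))
```

Because no real-time calculation is involved, this style of lookup is what allows the static auxiliary function to run off-device, e.g. as an APP or web page.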
  • the static auxiliary function of the medical device can be directly loaded on the medical device, so as to be called by the user when the scan is not performed or during the scan.
  • the static auxiliary function of the medical device can also be loaded on the terminal device in the form of an application program (Application, APP), web page, etc., so that the user can call it anytime and anywhere.
  • the dynamic auxiliary function triggering instruction is related to the dynamic auxiliary function of the medical device.
  • the dynamic assistance function of the medical device can be correlated with the images acquired in real time.
  • the processing device 140 may obtain the image to be displayed through real-time calculation based on the image collected in real time and/or the selected target parameters. Therefore, the dynamic assistance function of the medical equipment needs to be loaded on the medical equipment so as to collect images in real time through the medical equipment.
  • the user may send an auxiliary function triggering instruction to the processing device 140 through physical keys, touch screen, mouse, voice, gesture, eye movement, brain wave, and the like.
  • an auxiliary function trigger button may be set on the medical device, and the user may click the button to send an auxiliary function trigger instruction to the processing device 140 .
  • an auxiliary function triggering module is provided on the user interface, and the user can trigger the module by selecting with a mouse or touching the screen to send an auxiliary function triggering instruction to the processing device 140 .
  • the user may send an auxiliary function triggering instruction to the processing device 140 by preset triggering rules (for example, quickly double-clicking, long-pressing a certain parameter button).
  • for example, when the ultrasound device is scanning, the user can quickly double-click or long-press the "Dynamic Range" parameter button to trigger the auxiliary function corresponding to the dynamic range, that is, to send the auxiliary function trigger instruction corresponding to the dynamic range to the processing device 140.
  • after triggering, the auxiliary function interface for the dynamic range can be entered, so that the user can further set the pre-adjustment value of the dynamic range in the auxiliary function interface.
  • the acquisition of the scanning protocol and the auxiliary function trigger instruction by the processing device 140 need not follow a particular order.
  • the processing device 140 may acquire the scanning protocol first, and then acquire the auxiliary function triggering instruction; or may first acquire the auxiliary function triggering instruction, and then acquire the scanning protocol.
  • the processing device may simultaneously acquire the scanning protocol and the auxiliary function triggering instruction.
  • the APP can provide a button that triggers a specific scanning protocol and the auxiliary function trigger instruction at the same time; when the user clicks the button, the static auxiliary function interface related to that scanning protocol can be entered directly.
  • Step 830 receiving a target parameter selected from a plurality of initial parameters.
  • step 830 may be performed by the receiving module 320 .
  • a plurality of initial parameters may be displayed on the display interface based on a scanning protocol or an auxiliary function trigger instruction.
  • the processing device 140 may display initial parameters associated with the scan protocol on the display interface. For example, for the static auxiliary function loaded on the terminal device in the form of an APP, when the user opens the APP and selects the scanning protocol, the initial parameters related to the scanning protocol can be displayed on the display device of the terminal device for the user to select and study. As another example, for the dynamic auxiliary function loaded on the medical device, after the user selects or sets the scanning protocol, the initial parameters related to the scanning protocol can be displayed on the display device (or display interface) of the medical device. In some embodiments, the processing device 140 may display the initial parameters in response to receiving the auxiliary function trigger instruction. For example, after the processing device 140 acquires the scan protocol, the initial parameters related to the scan protocol may be displayed on the display device of the medical device in response to the user triggering the auxiliary function.
  • the processing device 140 may respectively display initial parameters of various levels/types on the display interface.
  • the processing device 140 may display the primary parameters on the display interface based on the scanning protocol or the auxiliary function trigger instruction.
  • in response to a display trigger instruction for advanced parameters (for example, the user selecting an "Advanced" button displayed on the display interface), the processing device 140 may display the advanced parameters on the display interface.
  • the processing device 140 may display the types of various initial parameters based on the scan protocol or the auxiliary function trigger instruction. Further, in response to displaying a display trigger instruction of a specific category (for example, a first-type initial parameter display trigger instruction), the processing device 140 may display initial parameters corresponding to the category on the display interface.
  • the user may select a target parameter based on the displayed multiple initial parameters.
  • when the auxiliary function of the medical device is loaded on the medical device, the initial parameters may not be displayed. Specifically, after the processing device 140 receives the scan protocol and the auxiliary function trigger instruction, the initial parameters may not be displayed on the display interface; instead, the user can select and/or set the initial parameters through the various parameter buttons provided on the medical device to determine the target parameters.
  • processing device 140 may receive a selected target parameter from a plurality of initial parameters.
  • the target parameter includes the type of the selected initial parameter and its corresponding pre-adjustment value.
  • the user may select one or more initial parameters displayed on the display interface as target parameters. That is to say, the user can select single or multiple selections for the initial parameters. For example, a plurality of initial parameters may be listed on the display interface, and a selection box is set before each parameter, and the user may select (for example, single-click, double-click) the selection box before the parameter that the user wants to know. Further, after the selection is completed, the user may determine the parameters corresponding to the selected one or more selection boxes as the target parameters through a confirmation operation (for example, clicking the "Apply" button).
  • dynamic range, TGC, and gain can be combined to form a new parameter "image optimization".
  • the parameter values of dynamic range, TGC, and gain can be set in different combinations for adjusting penetration performance and resolution, so as to correspond to different gears of "image optimization" (for example, penetration level 1, penetration level 2, penetration level 3, resolution level 1, resolution level 2, resolution level 3, etc.). Users can simultaneously select the combined values of dynamic range, TGC, and gain by clicking the "Image Optimization" button and setting the corresponding gear.
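The gear mechanism above can be sketched as a mapping from each "image optimization" gear to a combined value of the three underlying parameters. All numeric values below are invented for illustration; the actual combinations would be tuned for penetration and resolution.

```python
# Hypothetical gear table: one gear sets dynamic range, TGC, and gain together.
GEARS = {
    "penetration_1": {"dynamic_range": 40, "tgc": 1.0, "gain": 55},
    "penetration_2": {"dynamic_range": 35, "tgc": 1.2, "gain": 60},
    "penetration_3": {"dynamic_range": 30, "tgc": 1.4, "gain": 65},
    "resolution_1":  {"dynamic_range": 60, "tgc": 0.8, "gain": 45},
}

def apply_gear(gear):
    """Selecting one gear selects the combined value of all three parameters."""
    if gear not in GEARS:
        raise KeyError(gear)
    return dict(GEARS[gear])  # copy, so callers cannot mutate the table
```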
  • the user can select target parameters through physical keys, touch screen, mouse, voice, gesture, eye movement, brain wave and other means. After the target parameter is selected, the user can further select or set the pre-adjustment value of the target parameter. For example, the user can select several levels of "depth”, set the value of the gain, and so on.
  • Step 840 displaying, on the display interface, the first image corresponding to before the pre-adjustment of the target parameter and the second image corresponding to after the pre-adjustment of the target parameter.
  • step 840 may be performed by the display module 330.
  • Pre-adjustment refers to the process of determining an image of interest based on the target parameters. Note that the processing device 140 does not actually adjust the parameters during the scanning process, but only obtains and displays the images of interest (for example, the first image corresponding to before the pre-adjustment of the target parameter and the second image corresponding to after the pre-adjustment). By previewing the images of interest and then deciding whether to adjust the target parameters, the user can optimize the captured image.
  • the first image and/or the second image may comprise static images (eg, image frames) or dynamic images.
  • a dynamic image may refer to an image composed of multiple continuous still images.
  • the dynamic image can be stored or processed in Graphics Interchange Format (GIF), Moving Picture Experts Group (MPEG) format, MP4, Audio Video Interleaved (AVI), and other formats.
  • if the object associated with the scan protocol is the heart, the first image and the second image may be dynamic images to better display the state of the heart; if the object associated with the scanning protocol is a blood vessel (for example, in vascular color Doppler ultrasound), the first image and the second image may be dynamic images to better show blood flow conditions.
  • the first image and the second image may be determined by querying a pre-stored database based on the target parameters. For each target parameter (for example, each initial parameter or a combination of multiple initial parameters), the processing device 140 or another processing device may determine a target image (which may be a static image or a dynamic image) corresponding to the target parameter based on the initial image and the target parameter (for example, a pre-adjusted value), and may store the initial image and the target image corresponding to each target parameter in the pre-stored database.
  • each target parameter may be a default setting of the medical device assistance system 100 .
  • the processing device 140 may collect an initial image based on a scanning protocol, and display the initial image on a display interface. After receiving the selected target parameters, the processing device 140 may use the collected initial image as the first image. The processing device 140 may process the initial image to determine the second image based on the target parameter (eg, a pre-adjusted value of the initial parameter). The processing device 140 may display the determined first image and the second image on the display interface. In some embodiments, the first image and the second image may change in real time as the user moves the probe.
  • when the user triggers the dynamic assistance function, the user can use the probe to collect an image of the object (for example, a phantom) in real time. Further, the user can select the target parameter (for example, dynamic range); at this time, the initial image (that is, the first image) can be displayed on the left side of the display device, and the second image determined based on the target parameter and the initial image can be displayed on the right side.
  • the processing device 140 may simultaneously display on the display interface the initial images corresponding to multiple target parameters before pre-adjustment, the image corresponding to each target parameter after pre-adjustment, and an image corresponding to at least two target parameters acting together on the initial image.
  • the target parameters may include a first target parameter and a second target parameter, and the processing device 140 may first determine an initial image corresponding to the first target parameter and the second target parameter before pre-adjustment. The processing device 140 may determine the first sub-image corresponding to the pre-adjustment of the first target parameter based on the initial image and the first target parameter.
  • the processing device 140 may also determine the second sub-image corresponding to the pre-adjustment of the second target parameter based on the initial image and the second target parameter.
  • the processing device 140 may also determine a third sub-image corresponding to the pre-adjustment of the first target parameter and the second target parameter based on the initial image, the first target parameter and the second target parameter.
  • the processing device 140 may display the initial image, the first sub-image, the second sub-image and the third sub-image on the display interface.
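The four-view display described above (initial image, each parameter alone, both parameters together) can be sketched schematically. Here an "image" is a stand-in dictionary and `apply` is a placeholder for the real image processing; the function names are illustrative assumptions.

```python
def apply(image, **params):
    """Placeholder for processing an image under the given target parameters."""
    out = dict(image)
    out.update(params)
    return out

def build_views(initial, p1, p2):
    """Return the initial image plus sub-images for p1, p2, and p1+p2 together."""
    return {
        "initial": initial,                       # before pre-adjustment
        "first_sub": apply(initial, **p1),        # first target parameter only
        "second_sub": apply(initial, **p2),       # second target parameter only
        "third_sub": apply(initial, **p1, **p2),  # both parameters acting together
    }

views = build_views({"depth": 10}, {"gain": 50}, {"dynamic_range": 60})
```

All four views derive from the same initial image, so the user can attribute each visible change to one parameter or to their combination.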
  • when the auxiliary function trigger instruction is a dynamic auxiliary function trigger instruction, the initial image corresponding to the first target parameter and the second target parameter before pre-adjustment may be the currently collected image; when the auxiliary function trigger instruction is a static auxiliary function trigger instruction, the initial image may be the image pre-stored in the database corresponding to the parameter types of the first target parameter and the second target parameter.
  • during an actual scan, the user can trigger the auxiliary function of the medical device and set the pre-adjustment value of the parameter of interest, so as to preview the second image corresponding to the pre-adjusted parameter. Further, if the user is satisfied with the optimization effect of the parameter on the currently captured image, the user may accept the adjustment of the parameter. For example, the user may click the "Apply" button on the display interface, and at this time, only the image corresponding to the adjusted parameter may be displayed on the display interface.
  • the processing device 140 can also display descriptions of the target parameters (for example, physical meaning, function description, etc.) on the display interface to assist users in understanding (for example, 930 in FIG. 9, 1030 in FIG. 10, and 1130 in FIG. 11).
  • the processing device 140 may also mark on the first image and/or the second image in order to facilitate the user to observe the change of the target parameter before and after the pre-adjustment. For example, processing device 140 may compare the first image and the second image. The processing device 140 may mark, on the first image and/or the second image, places where the image signal difference is higher than the threshold with markers (for example, as shown by 915 and 925 in FIG. 9 ) according to the comparison result.
  • for example, the processing device 140 may use a circle "○" to mark the place with the greatest brightening or darkening in the second image, and the same position in the first image may also be marked with "○". For another example, when the "depth" parameter value is reduced, the processing device 140 may mark the structures that appear in the first image but not in the second image. The processing device 140 may also add a text description next to the marker.
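The comparison step, marking the places where the image signal difference exceeds a threshold, can be sketched as follows. Images are represented as plain 2D lists of intensity values for illustration; the threshold and data are invented.

```python
def mark_differences(first, second, threshold):
    """Return (x, y) coordinates where |first - second| exceeds the threshold."""
    marks = []
    for y, (row_a, row_b) in enumerate(zip(first, second)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > threshold:
                marks.append((x, y))  # this position gets a marker on both images
    return marks

before = [[10, 10],
          [10, 10]]
after = [[10, 80],   # one region brightened strongly after pre-adjustment
         [10, 10]]
```

The same coordinates would then be circled on both the first and second images, so the user can compare the marked region directly.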
  • the corresponding images before and after the pre-adjustment of each parameter are displayed to the user at the same time, which greatly saves the time needed for the user to understand each parameter. Through the static auxiliary function installed on the mobile terminal in the form of an APP, the user can conveniently study the parameters of interest anytime and anywhere, improving the user experience. By displaying the images before and after parameter pre-adjustment, the user can preview the effect of the parameter and then decide whether to adjust it, so that each parameter can be used more flexibly.
  • Fig. 9 is a schematic diagram of at least part of the display interface corresponding to a single target parameter in the static auxiliary function of the ultrasound device according to some embodiments of the present specification.
  • the target parameter is the dynamic range.
  • the display interface 900 can display the first image 910 corresponding to before the dynamic range pre-adjustment (for example, a dynamic range value of 30) and the second image 920 corresponding to after the dynamic range pre-adjustment (for example, a dynamic range value of 60). Both the first image 910 and the second image 920 are still images.
  • parameter information 930 about the dynamic range may also be displayed at the bottom of the display interface 900 .
  • the parameter information 930 may include the physical meaning of the parameter of the dynamic range, the function effect of the parameter of the dynamic range, and the like.
  • places where the differences between the first image 910 and the second image 920 are relatively large can be marked on the first image 910 and the second image 920 .
  • for example, the boundary of the portion corresponding to the circle 925 (i.e., the liver) is displayed more clearly after pre-adjustment. In this way, the user (for example, a doctor) can more intuitively understand the application effect of the dynamic range parameter on the image, which greatly saves the doctor's time.
  • Fig. 10 is a schematic diagram of at least part of the display interface corresponding to the combined target parameters in the static auxiliary function of the ultrasound device according to some embodiments of the present specification.
  • the combined target parameters are "image optimization" parameters, including dynamic range, gain and TGC.
  • the display interface 1000 may display a first image 1010 corresponding to "image optimization" pre-adjustment and a second image 1020 corresponding to "image optimization" pre-adjustment. Both the first image 1010 and the second image 1020 are still images.
  • the lower part of the display interface 1000 may also display parameter information 1030 about "image optimization".
  • places with large differences between the first image 1010 and the second image 1020 can be marked on the first image 1010 and the second image 1020 (for example, the parts indicated by circles 1015 and 1025 ).
  • Fig. 11 is a schematic diagram of at least part of the display interface corresponding to a single target parameter in the dynamic assistance function of the ultrasound device according to some embodiments of the present specification.
  • the display interface 1100 corresponding to a single target parameter in the dynamic assistance function may be similar to the display interface 900 corresponding to a single target parameter in the static assistance function shown in FIG. 9 . The difference is that the images displayed in the dynamic assist function are dynamic images.
  • the target parameter is dynamic range.
  • the display interface 1100 may display a first image 1110 corresponding to before the dynamic range pre-adjustment and a second image 1120 corresponding to after the dynamic range pre-adjustment. Both the first image 1110 and the second image 1120 are dynamic images.
  • the first image 1110 and the second image 1120 may change in real time as the user moves the probe.
  • parameter information 1130 about the dynamic range may also be displayed at the bottom of the display interface 1100 .
  • places with large differences between the first image 1110 and the second image 1120 can be marked on the first image 1110 and the second image 1120 (for example, the parts indicated by circles 1115 and 1125 ).
  • Fig. 12 is a schematic diagram of a teaching task of adjusting a depth value in mode B of an ultrasound device according to some embodiments of the present specification.
  • the teaching task shown in FIG. 12 can simulate parameter adjustment in the B-mode environment. The first image may be an image before adjustment, and the second image may be an image after adjustment. When the depth parameter is adjusted to an appropriate range, the quality of the image presented in that range is obviously better than that of the unadjusted image; that is, the quality of the second image is obviously better than that of the first image. This enables the user to learn, through hands-on practice, how to adjust the depth parameter in the ultrasound machine so that the image achieves the desired effect.
  • the teaching task shown in FIG. 12 is set to be triggered after receiving a system click event (i.e., the ClickEvent node). If the event is an event for switching to B mode, the simulated B mode will be activated.
  • the B-mode switching node (that is, the ActiveBMode node) can receive three input data.
  • Condition is used as the trigger condition for the logic operation of this node
  • ProbeData is used as the analog probe data of this mode
  • BModeParams is used as the parameter data of this mode.
  • after the simulated probe data takes effect, the system will no longer use real probe data as input. In this example, the simulated data is multi-frame kidney ultrasound data. Since the kidney is located deep in the abdomen, it is necessary to adjust the depth parameter to an appropriate range to see the kidney image.
  • in the image parameter setting node (i.e., the BModeParams node), the Depth parameter can be set to 5, and other parameters still use the current system parameter values.
  • the complete kidney cannot be seen in the images generated under the current simulated environmental conditions.
  • a string of text prompts can be displayed through the text display node (i.e., the DisplayText node). For example, "Adjusting the Depth parameter can change the visible depth distance in B mode. If you want to see deeper tissue, turn up the parameter; if you want to see a larger shallow tissue image, turn down the parameter. Now please adjust the Depth parameter so that the kidney tissue is clearly visible."
  • the above prompt can always be displayed on the interface, and the flow enters the logic waiting node (i.e., the WaitForEvent node) to wait for the user to adjust the parameters.
  • when the user adjusts a parameter, a parameter adjustment event node (i.e., a ParamChangeEvent node) is triggered. The output of this node is two data items carried when the system sends the event: one is the parameter name ParamName, and the other is the parameter value ParamValue.
  • these two parameter values can pass through several logical computation nodes. After the logical operations of these nodes, when it is judged that the user has adjusted the Depth parameter to more than 15, a true condition can be output to the logic waiting node.
  • when the logic waiting node receives input data whose condition is true, it can continue to execute the following B-mode image comparison node (i.e., the BModeImageCompare node).
  • the BModeImageCompare node can compare the ultrasound image data and parameters of a specific input parameter according to the settings of the node, that is, compare the real-time image with the image in a certain set area.
  • the contrast mode may be a dual-image mode. For example, the image on the left is the real-time image (i.e., the first image) with real-time parameters, while the image on the right is the ultrasound image data (i.e., the second image) with the parameters set above. After the user exits the comparison mode, the flow can enter the end node (i.e., the EndTask node).
  • the end node can prompt the user that the guidance for the Depth parameter in B mode is complete, and can provide text and other prompts on the interface, for example, "Congratulations, you have adjusted the Depth parameter to the correct range; now you can see the complete kidney, and you have completed this tutorial!" After confirmation by the user, the system exits the simulated B-mode environment and returns to the real system environment.
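The flow described above (ClickEvent → ActiveBMode → DisplayText → WaitForEvent → ParamChangeEvent → logic nodes → BModeImageCompare → EndTask) can be sketched as a small event-driven script. This is a minimal illustration only: the function and variable names (`run_depth_tutorial`, `SIM_DEPTH_PARAMS`, the event-tuple format) are assumptions, not the patent's actual implementation.

```python
# Minimal sketch of the Fig. 12 depth-adjustment tutorial as an event-driven
# flow. All names here are illustrative assumptions.

SIM_DEPTH_PARAMS = {"Depth": 5}   # BModeParams node: Depth preset to 5
DEPTH_TARGET = 15                 # logic nodes: Depth must exceed 15

def run_depth_tutorial(events):
    """Consume a stream of (name, payload) events and return the tutorial log."""
    log = []
    state = "idle"
    for name, payload in events:
        if state == "idle" and name == "ClickEvent" and payload == "B mode":
            # ActiveBMode node: enter simulated B mode with kidney probe data
            log.append("ActiveBMode: simulated kidney data, Depth=%d"
                       % SIM_DEPTH_PARAMS["Depth"])
            log.append("DisplayText: adjust the Depth parameter ...")
            state = "waiting"        # WaitForEvent node
        elif state == "waiting" and name == "ParamChangeEvent":
            param, value = payload   # ParamName / ParamValue outputs
            if param == "Depth" and value > DEPTH_TARGET:
                log.append("BModeImageCompare: before vs after")
                log.append("EndTask: tutorial complete")
                state = "done"
    return log
```

Events that do not match the awaited condition (e.g., adjusting Gain instead of Depth) simply leave the tutorial in its waiting state, which mirrors how the WaitForEvent node blocks until its condition input becomes true.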
  • beneficial effects include but are not limited to: (1) by displaying candidate teaching tasks in response to instructions triggered on a visual interface, and selecting and running a target teaching task from the candidate teaching tasks according to the instructions, teaching tasks can be selected and run in a visual and intuitive way, with a friendly interface that is easy to use and convenient for new users; (2) by creating and editing teaching tasks based on the visual interface and then running them, teaching tasks can be created quickly and easily, helping users better understand medical devices (for example, ultrasound devices) and master their various functions and operations through guided hands-on practice; (3) by obtaining various instructions through a visual interface to create and edit tutorial nodes, teaching tasks can be created quickly and accurately, reducing the complexity of creating teaching tasks and improving creation efficiency; (4) by displaying the images corresponding to before and after the pre-adjustment of each parameter to the user at the same time, and through various static and dynamic assistance functions, the time for users to understand each parameter is saved, the user experience is improved, and users can intuitively understand the meaning and adjustment effect of each parameter.
  • numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of the embodiments are modified in some examples by the terms "about", "approximately", or "substantially". Unless otherwise stated, "about", "approximately", or "substantially" indicates that the stated figure allows for a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that can vary depending upon the desired characteristics of individual embodiments. In some embodiments, numerical parameters should take into account the specified significant digits and adopt a general method of retaining digits. Although the numerical ranges and parameters used in some embodiments of this specification to confirm the breadth of a range are approximations, in specific embodiments such numerical values are set as precisely as practicable.

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)

Abstract

A teaching method, system, and storage medium for a medical device. The method is executed by a processor and includes: in response to a use instruction triggered on a display interface, displaying a teaching task list, where the teaching task list includes multiple candidate teaching tasks; obtaining, based on the teaching task list, a selection instruction triggered by a user; and determining a target teaching task according to the selection instruction.

Description

Medical device and system
Cross-reference to related applications
This application claims priority to Chinese patent application 202110987077.2, filed on August 26, 2021 and entitled "Teaching method and apparatus for ultrasound device, ultrasound imaging device, and storage medium", and Chinese patent application 202111102067.2, filed on September 18, 2021 and entitled "Medical device assistance method, apparatus, and storage medium", the entire contents of which are incorporated herein by reference.
Technical field
This specification relates to the technical field of medical devices, and in particular to a teaching method, an assistance method, and a system for a medical device.
Background
At present, ultrasound imaging diagnosis has become a very important means of medical diagnosis. With the continuous advancement of ultrasound imaging technology, ultrasound devices are developing toward full touch screens and intelligence: many functions no longer provide physical buttons but use touch-screen buttons, and many functions provide intelligent assistance without manual operation by the user. As a result, novices who cannot use ultrasound imaging devices proficiently need to learn how to operate them first. In particular, ultrasound devices provide many scanning parameters (for example, dynamic range, harmonic imaging, spatial compounding, etc.) whose physical meaning is relatively complex for some non-specialist or less experienced doctors, who do not fully understand what these parameters do. Usually, teaching for ultrasound imaging devices is mainly carried out by actively reading the product manual, learning teaching groups or preset parameters built into the device, or automatically recognizing image features and giving prompts. However, these teaching methods are not intuitive enough, have poor interactivity, and are inefficient.
Therefore, it is desirable to provide a medical device and system with enhanced interactivity and improved teaching efficiency.
Summary
One embodiment of this specification provides a teaching method for a medical device, executed by a processor. The teaching method includes: in response to a use instruction triggered on a display interface, displaying a teaching task list, where the teaching task list includes multiple candidate teaching tasks; obtaining, based on the teaching task list, a selection instruction triggered by a user; and determining a target teaching task according to the selection instruction.
In some embodiments, the candidate teaching tasks may be created in advance based on an editing interface.
In some embodiments, the editing interface includes a creation area, and the process of creating each candidate teaching task in the creation area may include: obtaining a creation instruction based on the creation area, and displaying a node menu in response to the creation instruction; obtaining a node selection instruction based on the node menu, where the node selection instruction includes a node identifier of a selected tutorial node; configuring, based on the creation area, the tutorial node corresponding to the node identifier to obtain configuration information of each tutorial node; and creating the candidate teaching task according to the tutorial nodes and their configuration information.
In some embodiments, a configuration box of each tutorial node may be displayed according to the node selection instruction, and the configuration information of each tutorial node may be obtained based on the configuration box of each tutorial node.
In some embodiments, the node menu may include a basic node menu and an ultrasound application node menu; the creation instruction may be obtained based on the creation area, and the basic node menu and the ultrasound application node menu may be displayed in response to the creation instruction.
In some embodiments, the node selection instruction may include a first node selection instruction and a second node selection instruction; the first node selection instruction may be obtained based on the basic node menu and includes a basic node identifier of the selected tutorial node; and the second node selection instruction may be obtained based on the ultrasound application node menu and includes an ultrasound application node identifier of the selected tutorial node.
In some embodiments, the editing interface further includes an information prompt area, and the teaching method may further include: obtaining a setting instruction based on the editing interface and/or the information prompt area; and setting the node data and node attributes of each tutorial node in response to the setting instruction.
In some embodiments, the editing interface may further include an editing area, and the teaching method may further include: obtaining an editing instruction based on the editing area, where the editing instruction indicates a node to be edited; displaying an editing interface of the node to be edited in the editing area in response to the editing instruction; and obtaining editing information based on the editing interface, and performing, according to the editing information, at least one of a verification operation, a save operation, a setting operation, and a search operation on the node to be edited.
In some embodiments, the teaching method may further include running the target teaching task.
In some embodiments, during the running of the target teaching task, in response to a target parameter among multiple initial parameters associated with a scanning protocol being selected or adjusted, a first image corresponding to before the pre-adjustment of the target parameter and a second image corresponding to after the pre-adjustment of the target parameter may be displayed on the display interface.
In some embodiments, the first tutorial node of the target teaching task may be pushed onto a stack, and the following steps may be executed in a loop until all tutorial nodes in the stack have been taken out: taking a tutorial node from the top of the stack as the current tutorial node; in response to the current tutorial node containing input data that includes invalid data, pushing onto the stack the tutorial node connected to the input pin corresponding to the invalid data; or, in response to the current tutorial node containing input data that is all valid, executing the operations contained in the current tutorial node and pushing onto the stack the tutorial node corresponding to the output of the current tutorial node.
In some embodiments, a new teaching task may be created as the target teaching task, or at least one of the candidate teaching tasks may be edited and used as the target teaching task.
In some embodiments, at least one of the candidate teaching tasks may be determined as a teaching task to be edited; the teaching task to be edited may be decoded into a graphical interface; at least one tutorial node may be obtained according to the node logical connection information in the script of the teaching task to be edited; the display position of the at least one tutorial node may be determined based on its type information and position information; the at least one tutorial node may be connected based on the node logical connection information; and, based on the internal logic and setting information of the at least one tutorial node, its name information, input/output connection information, and data connection information may be obtained and displayed in the editing interface.
In some embodiments, the teaching method may further include updating at least one of the multiple candidate teaching tasks.
One embodiment of this specification provides a teaching system for a medical device. The system includes: a first display module configured to display a teaching task list in response to a use instruction triggered on a display interface, where the teaching task list includes multiple candidate teaching tasks; and a first obtaining module configured to obtain, based on the teaching task list, a selection instruction triggered by the user, and to determine a target teaching task according to the selection instruction.
One embodiment of this specification provides a computer-readable storage medium storing computer instructions. After a computer reads the computer instructions in the storage medium, the computer executes the teaching method for a medical device.
One embodiment of this specification provides a teaching task creation method, executed by a processor. The teaching task creation method includes: obtaining a creation instruction based on a creation area, and displaying a node menu in response to the creation instruction; obtaining a node selection instruction based on the node menu, where the node selection instruction includes a node identifier of a selected tutorial node; configuring, based on the creation area, the tutorial node corresponding to the node identifier to obtain configuration information of each tutorial node; and creating the candidate teaching task according to the tutorial nodes and their configuration information.
In some embodiments, a configuration box of each tutorial node may be displayed according to the node selection instruction, and the configuration information of each tutorial node may be obtained based on the configuration box of each tutorial node.
In some embodiments, the node menu may include a basic node menu and an ultrasound application node menu; the creation instruction may be obtained based on the creation area, and the basic node menu and the ultrasound application node menu may be displayed in response to the creation instruction.
In some embodiments, the node selection instruction may include a first node selection instruction and a second node selection instruction; the first node selection instruction may be obtained based on the basic node menu and includes a basic node identifier of the selected tutorial node; and the second node selection instruction may be obtained based on the ultrasound application node menu and includes an ultrasound application node identifier of the selected tutorial node.
In some embodiments, the candidate teaching task may be created based on an editing interface, and the editing interface further includes an information prompt area. The teaching task creation method may further include: obtaining a setting instruction based on the editing interface and/or the information prompt area; and setting the node data and node attributes of each tutorial node in response to the setting instruction.
In some embodiments, the teaching task creation method may further include: generating a teaching task script corresponding to the candidate teaching task based on each tutorial node and its configuration information; performing a compilation check on the teaching task script; in response to the compilation check passing, uploading the teaching task script; or, in response to the compilation check failing, indicating the location of the setting error in the teaching task script.
In some embodiments, the teaching task script may be uploaded to a cloud platform.
In some embodiments, the tutorial nodes may include basic nodes and/or ultrasound application nodes.
In some embodiments, the basic nodes may include at least one of an event receiving node, an event emitting node, a mathematical variable node, a logical computation node, a function execution node, a display node, a logic waiting node, and an end node; the ultrasound application nodes may include at least one of a mode switching node, a measurement package switching node, a probe simulation data node, an image parameter setting node, a function activation node, and a B-mode image comparison node.
One embodiment of this specification provides a medical device assistance method. The medical device assistance method includes: obtaining a scanning protocol, where the scanning protocol is associated with multiple initial parameters; receiving an assistance function trigger instruction; receiving a target parameter selected from the multiple initial parameters; and displaying, on a display interface, a first image corresponding to before the pre-adjustment of the target parameter and a second image corresponding to after the pre-adjustment of the target parameter.
In some embodiments, the medical device assistance method may further include: in response to a use instruction triggered on the display interface, displaying a teaching task list, where the teaching task list includes multiple candidate teaching tasks; obtaining, based on the teaching task list, a selection instruction triggered by a user; and determining a target teaching task according to the selection instruction, so as to execute the medical device assistance method.
In some embodiments, the multiple initial parameters may include first-level initial parameters and second-level initial parameters. After receiving the assistance function trigger instruction, the medical device assistance method may further include at least one of the following operations: displaying the first-level initial parameters on the display interface based on the scanning protocol or the assistance function trigger instruction; and, in response to a display trigger instruction for displaying the second-level initial parameters, displaying the second-level initial parameters on the display interface.
In some embodiments, when the assistance function trigger instruction is a static assistance function trigger instruction, the first image and the second image may be determined by querying a pre-stored database and displayed on the display interface, where the pre-stored database includes images corresponding to each value of the multiple initial parameters.
In some embodiments, when the assistance function trigger instruction is a dynamic assistance function trigger instruction, before displaying the multiple initial parameters on the display interface, at least one of the following operations may be included: acquiring an initial image based on the scanning protocol; and displaying the initial image on the display interface. Displaying the first image corresponding to before the pre-adjustment of the target parameter and the second image corresponding to after the pre-adjustment of the target parameter on the display interface includes: taking the initial image as the first image; processing the first image based on the target parameter to determine the second image; and displaying the first image and the second image on the display interface.
In some embodiments, the target parameter may include a first target parameter and a second target parameter. The first image corresponding to before the pre-adjustment of the target parameter may include an image corresponding to before the pre-adjustment of the first target parameter and the second target parameter; the second image corresponding to after the pre-adjustment of the target parameter may include an image corresponding to after the pre-adjustment of both the first target parameter and the second target parameter, an image corresponding to after the pre-adjustment of the first target parameter, and an image corresponding to after the pre-adjustment of the second target parameter.
In some embodiments, the first image and the second image may include static images or dynamic images.
In some embodiments, the scanned object presented in the first image and the second image may be related to the scanning protocol.
In some embodiments, the medical device may be an ultrasound device, and the scanning protocol may include at least one of a probe type and a scanning mode.
One embodiment of this specification provides a medical device assistance system, including: an obtaining module configured to obtain a scanning protocol, where the scanning protocol is associated with multiple initial parameters; a receiving module configured to receive an assistance function trigger instruction and a target parameter selected from the multiple initial parameters; and a display module configured to display, on a display interface, a first image corresponding to before the pre-adjustment of the target parameter and a second image corresponding to after the pre-adjustment of the target parameter.
One embodiment of this specification provides a computer-readable storage medium storing computer instructions. After a computer reads the computer instructions in the storage medium, the computer executes the medical device assistance method.
In some embodiments of this specification, in response to a use instruction triggered on a display interface, a teaching task list can be displayed, where the teaching task list includes multiple candidate teaching tasks, each created in advance based on an editing interface. Through visual display, a target teaching task can be intuitively selected from the multiple candidate teaching tasks, and the selected target teaching task is likewise created based on a visual interface. The selected target teaching task can then be run to help the user understand the ultrasound device (for example, the physical meaning of scanning parameters) and ensure that the user practices hands-on and masters the functions and operation of the ultrasound device. As a result, users can learn and become familiar with the operation of medical devices (for example, ultrasound devices) through a teaching method with a friendly interface and strong interactivity, shortening the teaching process and improving teaching quality and efficiency.
In some embodiments of this specification, various static and dynamic assistance functions make it convenient for users to learn about parameters of interest anytime and anywhere, improving the user experience. By simultaneously displaying the images corresponding to before and after the pre-adjustment of each parameter, the time for users to understand each parameter is saved: the user can first preview the effect before and after the pre-adjustment of a parameter and then decide whether to adjust it. This allows users to use parameters more flexibly to adjust images, and to better understand and master each parameter, which helps them master the functions and operation of the medical device.
Brief description of the drawings
This specification will be further illustrated by way of exemplary embodiments, which will be described in detail with reference to the accompanying drawings. These embodiments are not limiting; in these embodiments, the same reference numerals denote the same structures, where:
Fig. 1 is a schematic diagram of an application scenario of a medical system according to some embodiments of this specification;
Fig. 2 is an exemplary block diagram of a teaching system for a medical device according to some embodiments of this specification;
Fig. 3 is an exemplary block diagram of a medical device assistance system according to some embodiments of this specification;
Fig. 4 is an exemplary flowchart of a teaching method for a medical device according to some embodiments of this specification;
Fig. 5 is an exemplary flowchart of a teaching task creation method according to some embodiments of this specification;
Fig. 6a is a schematic structural diagram of a teaching task editing interface according to some embodiments of this specification;
Fig. 6b is a schematic diagram of a basic node in a teaching task according to some embodiments of this specification;
Fig. 6c is a schematic diagram of an ultrasound application node in a teaching task according to some embodiments of this specification;
Fig. 7 is a schematic diagram of creating a teaching task according to some embodiments of this specification;
Fig. 8 is an exemplary flowchart of a medical device assistance method according to some embodiments of this specification;
Fig. 9 is a schematic diagram of at least part of a display interface corresponding to a single target parameter in the static assistance function of an ultrasound device according to some embodiments of this specification;
Fig. 10 is a schematic diagram of at least part of a display interface corresponding to combined target parameters in the static assistance function of an ultrasound device according to some embodiments of this specification;
Fig. 11 is a schematic diagram of at least part of a display interface corresponding to a single target parameter in the dynamic assistance function of an ultrasound device according to some embodiments of this specification;
Fig. 12 is a schematic diagram of a teaching task of adjusting the depth value in B mode of an ultrasound device according to some embodiments of this specification.
Detailed description
To more clearly illustrate the technical solutions of the embodiments of this specification, the drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are merely some examples or embodiments of this specification; for those of ordinary skill in the art, this specification can also be applied to other similar scenarios according to these drawings without creative effort. Unless obvious from the context or otherwise stated, the same reference numerals in the drawings represent the same structures or operations.
It should be understood that "system", "apparatus", "unit", and/or "module" as used herein is a way to distinguish different components, elements, parts, portions, or assemblies at different levels. However, these words may be replaced by other expressions if they achieve the same purpose.
As shown in this specification and the claims, unless the context clearly indicates an exception, the words "a", "an", "one", and/or "the" do not specifically refer to the singular and may also include the plural. In general, the terms "include" and "comprise" only indicate that the explicitly identified steps and elements are included; these steps and elements do not constitute an exclusive list, and a method or device may also include other steps or elements.
Flowcharts are used in this specification to illustrate operations performed by systems according to embodiments of this specification. It should be understood that the preceding or following operations are not necessarily performed exactly in order; instead, the steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic diagram of an application scenario of an exemplary teaching system for a medical device according to some embodiments of this application. As shown in Fig. 1, the teaching system 100 for a medical device may include a medical device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150. The components of the teaching system 100 may be connected in various ways. Merely by way of example, as shown in Fig. 1, the processing device 140 may be connected to the medical device 110 through the network 120. As another example, the processing device 140 may be directly connected to the medical device 110. As yet another example, a terminal device (e.g., 131, 132, 133) may be directly connected to the processing device 140, or connected to the processing device 140 through the network 120.
The medical device 110 may be used to scan a target object or a part thereof located in its detection area and generate an image related to the target object or the part thereof. In some embodiments, the target object may include a human body, an animal (e.g., a laboratory mouse or other animals), a phantom, etc., or any combination thereof. In some embodiments, the target object may include a specific part of the human body, such as the head, chest, or abdomen, or any combination thereof. In some embodiments, the target object may include a specific organ, such as the heart, thyroid, esophagus, trachea, stomach, gallbladder, small intestine, colon, bladder, ureter, uterus, or fallopian tube. In some embodiments, the medical device 110 may include an ultrasound device, a computed tomography (CT) device, a magnetic resonance imaging (MRI) device, a positron emission computed tomography (PET) device, a single-photon emission computed tomography (SPECT) device, etc., or any combination thereof.
The network 120 may facilitate the exchange of information and/or data. In some embodiments, one or more components of the teaching system 100 (e.g., the medical device 110, the terminal device 130, the processing device 140, the storage device 150) may exchange information and/or data with other components of the teaching system 100 through the network 120. For example, the processing device 140 may obtain scan data from the medical device 110 via the network 120. In some embodiments, the network 120 may be any type of wired or wireless network or a combination thereof. Merely by way of example, the network 120 may include a cable network, a wired network, a wireless network, a fiber-optic network, a telecommunication network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), etc., or any combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the teaching system 100 may connect to the network 120 to exchange data and/or information.
The terminal device 130 may enable user interaction with other components of the teaching system 100. For example, a user may send a request to the processing device 140 through the terminal device 130 to access a teaching task of the medical device 110. As another example, the terminal device 130 may also receive scan images acquired by the medical device 110 through the network 120. In some embodiments, the terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop 133, etc., or any combination thereof. In some embodiments, the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, etc., or any combination thereof.
The processing device 140 may process information and/or data obtained from the medical device 110, the terminal device 130, and/or the storage device 150. For example, the processing device 140 may obtain instructions triggered by a user (e.g., a use instruction, a selection instruction, an instruction to obtain a scanning protocol, an assistance function trigger instruction) and present operations related to the instructions on an interactive interface (e.g., displaying a teaching task list, determining a target teaching task, displaying multiple initial parameters). As another example, the processing device 140 may display, on an interactive interface, the images respectively corresponding to before and after the pre-adjustment of a target parameter selected from multiple initial parameters. In some embodiments, the processing device 140 may be a single server or a server group; the server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data from the medical device 110, the terminal device 130, and/or the storage device 150 through the network 120. As another example, the processing device 140 may be directly connected to the medical device 110, the terminal device 130, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 140 may include one or more processing units (e.g., single-core or multi-core processing engines). Merely by way of example, the processing device 140 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, etc., or any combination thereof. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include one or a combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, etc. In some embodiments, the processing device 140 may be part of the medical device 110 or the terminal device 130.
The storage device 150 may store data, instructions, and/or any other information. In some embodiments, the storage device 150 may store data obtained from the medical device 110, the terminal device 130, and/or the processing device 140. For example, the storage device 150 may store images corresponding to each value of parameters related to the medical device 110. As another example, the storage device 150 may store scan data (or image data) of a scanned object obtained from the medical device 110. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform the exemplary methods described in this application. In some embodiments, the storage device 150 may include one or a combination of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), etc. In some embodiments, the storage device 150 may be implemented by a cloud platform as described in this application.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components of the teaching system 100 (e.g., the medical device 110, the processing device 140, the terminal device 130). One or more components of the teaching system 100 may read data or instructions in the storage device 150 through the network 120. In some embodiments, the storage device 150 may be part of the processing device 140, or may be independent and directly or indirectly connected to the processing device 140.
It should be noted that the above description of the teaching system 100 is for illustration purposes only and is not intended to limit the scope of this application. It can be understood that, after understanding the principle of the system, those of ordinary skill in the art may make various modifications and changes in form and detail to the application fields in which the system is implemented without departing from this principle. However, such changes and modifications do not depart from the scope of this application. For example, the medical device 110, the processing device 140, and the terminal device 130 may share one storage device 150, or may each have their own storage device. Variations such as these are all within the protection scope of this specification.
Fig. 2 is an exemplary block diagram of a teaching system for a medical device according to some embodiments of this specification. In some embodiments, the teaching system 200 may be implemented by the processing device 140. As shown in Fig. 2, the teaching system 200 may include a first display module 210 and a first obtaining module 220.
In some embodiments, the first display module 210 may be configured to display a teaching task list in response to a use instruction triggered on a display interface, where the teaching task list includes multiple candidate teaching tasks. In some embodiments, the candidate teaching tasks may be created in advance based on an editing interface.
In some embodiments, the first obtaining module 220 may be configured to obtain, based on the teaching task list, a selection instruction triggered by the user, and determine a target teaching task from the candidate teaching tasks according to the selection instruction.
In some embodiments, the first obtaining module 220 may also be configured to create a new teaching task as the target teaching task, or edit at least one of the candidate teaching tasks and use it as the target teaching task.
In some embodiments, the first obtaining module 220 may be configured to determine at least one of the candidate teaching tasks as a teaching task to be edited; decode the teaching task to be edited into a graphical interface; obtain at least one tutorial node according to the node logical connection information in the script of the teaching task to be edited; determine the display position of the at least one tutorial node based on its type information and position information; connect the at least one tutorial node based on the node logical connection information; and obtain the name information, input/output connection information, and data connection information of the at least one tutorial node based on its internal logic and setting information, and display them in the editing interface.
In some embodiments, the teaching system 200 may further include an update module (not shown). The update module may be configured to update at least one of the multiple candidate teaching tasks.
In some embodiments, the teaching system 200 may further include a running module 230 configured to run the target teaching task.
In some embodiments, the running module 230 may be configured to receive a target parameter selected from multiple initial parameters associated with a scanning protocol, and display, on the display interface, a first image corresponding to before the pre-adjustment of the target parameter and a second image corresponding to after the pre-adjustment of the target parameter.
In some embodiments, the running module 230 may be configured to display, in response to at least one of the multiple initial parameters being selected or adjusted, the first image corresponding to before the pre-adjustment of the target parameter and the second image corresponding to after the pre-adjustment of the target parameter on the display interface.
In some embodiments, the running module 230 may be configured to push the first tutorial node of the target teaching task onto a stack and execute the following steps in a loop until all tutorial nodes in the stack have been taken out: taking a tutorial node from the top of the stack as the current tutorial node; in response to the current tutorial node containing input data that includes invalid data, pushing onto the stack the tutorial node connected to the input pin corresponding to the invalid data; or, in response to the current tutorial node containing input data that is all valid, executing the operations contained in the current tutorial node and pushing onto the stack the tutorial node corresponding to the output of the current tutorial node.
In some embodiments, the editing interface may include a creation area. The teaching system 200 may further include a second display module, a second obtaining module, a configuration module, and a creation module (not shown). The second display module may be configured to obtain a creation instruction based on the creation area and display a node menu in response to the creation instruction. The second obtaining module may be configured to obtain a node selection instruction based on the node menu, where the node selection instruction includes the node identifier of a selected tutorial node. The configuration module may be configured to configure, based on the creation area, the tutorial node corresponding to the node identifier to obtain configuration information of each tutorial node. The creation module may be configured to create a candidate teaching task according to the tutorial nodes and their configuration information.
In some embodiments, the configuration information may include at least one of: type information of the tutorial node, logic information of the tutorial node, setting information of the tutorial node, position information of the tutorial node, logic pin connection information of the tutorial node, data pin connection information of the tutorial node, script attribute information of the tutorial script, and configuration information of the tutorial script.
In some embodiments, the configuration module may include a display unit and an obtaining unit. The display unit may be configured to display the configuration box of each tutorial node according to the node selection instruction. The obtaining unit may be configured to obtain the configuration information of each tutorial node based on its configuration box.
In some embodiments, the node menu may include a basic node menu and/or an ultrasound application node menu. The second display module may be configured to obtain the creation instruction based on the creation area and display the basic node menu and/or the ultrasound application node menu in response to the creation instruction.
In some embodiments, the node selection instruction may include a first selection instruction and/or a second selection instruction. The second obtaining module may be configured to obtain the first node selection instruction based on the basic node menu, where the first node selection instruction may include the basic node identifier of the selected tutorial node, and to obtain the second node selection instruction based on the ultrasound application node menu, where the second node selection instruction may include the ultrasound application node identifier of the selected tutorial node.
In some embodiments, the editing interface may further include an information prompt area. The teaching system 200 may further include a third obtaining module and a setting module. The third obtaining module may be configured to obtain a setting instruction based on the editing interface and/or the information prompt area. The setting module may be configured to set the node data and/or node attributes of each tutorial node in response to the setting instruction.
In some embodiments, the editing interface may further include an editing area. The teaching system 200 may further include a fourth obtaining module, a third display module, and an editing module. The fourth obtaining module may be configured to obtain an editing instruction based on the editing area, where the editing instruction indicates a node to be edited. The third display module may be configured to display the editing interface of the node to be edited in the editing area in response to the editing instruction. The editing module may be configured to obtain editing information based on the editing interface and perform, according to the editing information, at least one of a verification operation, a save operation, a setting operation, and a search operation on the node to be edited.
Fig. 3 is an exemplary block diagram of a medical device assistance system according to some embodiments of this specification. In some embodiments, the medical device assistance system 300 may be implemented by the processing device 140. As shown in Fig. 3, the medical device assistance system 300 may include an obtaining module 310, a receiving module 320, and/or a display module 330.
In some embodiments, the obtaining module 310 may be configured to obtain a scanning protocol, where the scanning protocol may be associated with multiple initial parameters. In some embodiments, the initial parameters may be divided into multiple levels or categories, so that the display module 330 can display them by level or category.
In some embodiments, the receiving module 320 may be configured to receive an assistance function trigger instruction, which may trigger the assistance function of the medical device. In some embodiments, the assistance function trigger instruction may include a static assistance function trigger instruction and a dynamic assistance function trigger instruction. In some embodiments, the receiving module 320 may also be configured to receive a target parameter selected from the multiple initial parameters, where the target parameter may include the type of the selected initial parameter and its corresponding pre-adjustment value.
In some embodiments, the display module 330 may be configured to display, on a display interface, a first image corresponding to before the pre-adjustment of the target parameter and a second image corresponding to after the pre-adjustment of the target parameter. In other words, the display module 330 may simultaneously display the images corresponding to before and after the pre-adjustment of the target parameter for the user to compare, making it easier for the user to understand the effect of the target parameter on the image. In some embodiments, the display module 330 may also be configured to display the multiple initial parameters on the display interface; the user may determine the target parameter by selecting among the displayed parameters.
It should be understood that the systems, apparatuses, and modules shown in Figs. 2 and 3 may be implemented in various ways. For example, in some embodiments, they may be implemented by hardware, software, or a combination of both. The hardware part may be implemented using dedicated logic; the software part may be stored in a memory and executed by an appropriate instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will understand that the methods and systems described above may be implemented using computer-executable instructions and/or processor control code, for example provided on a carrier medium such as a disk, CD, or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of this application may be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the systems, apparatuses, and modules is for convenience of description only and does not limit this application to the scope of the illustrated embodiments. It can be understood that, after understanding the principle of the system, those skilled in the art may arbitrarily combine the modules or form subsystems connected to other modules without departing from this principle. For example, in some embodiments, the receiving module 320 may include two units, such as an instruction receiving unit and a parameter receiving unit, to respectively receive the assistance function trigger instruction and the selected target parameter. As another example, the modules may share one storage device, or each module may have its own storage device. Variations such as these are all within the protection scope of this application.
Fig. 4 is an exemplary flowchart of a teaching method for a medical device according to some embodiments of this specification. As shown in Fig. 4, the process 400 includes the following steps. In some embodiments, the process 400 may be executed by the processing device 140.
Step 410: in response to a use instruction triggered on a display interface, display a teaching task list, where the teaching task list includes multiple candidate teaching tasks. In some embodiments, step 410 may be performed by the first display module 210.
The teaching task list refers to a list that presents teaching tasks. A teaching task refers to a process for training a user in the functions and operation of a medical device. In some embodiments, the teaching task list may be arranged vertically or horizontally. In some embodiments, the teaching task list may present multiple candidate teaching tasks related to operating an ultrasound imaging device, such as imaging tasks and adjustment tasks. In some embodiments, a candidate teaching task may include multiple tutorial nodes for executing the teaching task and the logical relationships among these tutorial nodes; the logical relationships among the tutorial nodes reflect the execution logic and execution content of the corresponding candidate teaching task, so that the candidate teaching task can be executed according to that logic and content. A tutorial node is a node representing a single tutorial in a teaching task. The logical relationships among tutorial nodes are relationships related to the execution logic of the tutorials, such as dependency relationships and order relationships. Merely by way of example, a dependency relationship may be that the execution of tutorial node 2 requires the output of tutorial node 1, and an order relationship may be that tutorial node 3 must be executed before tutorial node 4.
In some embodiments, when a user needs to use the teaching function of a medical device (e.g., an ultrasound imaging device), the user may trigger a use instruction on the display interface of the medical device, so that the medical device responds to the use instruction, retrieves the teaching task list, and presents it. In some embodiments, if the medical device is a touch-screen ultrasound imaging device, the user may trigger the teaching instruction by long-pressing the touch screen, and the teaching task list pops up at the long-press position; if the medical device is a contactless medical device, the user may trigger the teaching instruction through gestures, voice, eye movement, mental intent, etc.; the teaching instruction may also be triggered through a mouse, keyboard, trackball, etc.
In some embodiments, the candidate teaching tasks may be created in advance based on an editing interface.
In some embodiments, the editing interface may include a creation area. In some embodiments, a candidate teaching task may be created in the creation area through the following process: obtaining a creation instruction based on the creation area and displaying a node menu in response to the creation instruction; obtaining a node selection instruction based on the node menu, where the node selection instruction includes the node identifier of a selected tutorial node; configuring, based on the creation area, the tutorial node corresponding to the node identifier to obtain configuration information of each tutorial node; and creating the candidate teaching task according to the tutorial nodes and their configuration information. For more details on how to create a teaching task in the creation area, refer to the description of Fig. 5, which will not be repeated here.
In some embodiments, the editing interface may further include an information prompt area. In some embodiments, the node data and/or node attributes of tutorial nodes may be set in the information prompt area.
In some embodiments, a setting instruction may be obtained based on the editing interface and/or the information prompt area. Specifically, the setting instruction may be obtained from the information prompt area included in the editing interface, or from the entire editing interface. For example, Fig. 6a shows an editing interface: the setting instruction may be obtained from the prompt information area 630 in Fig. 6a, or from the entire editing interface shown in Fig. 6a.
In some embodiments, the node data and/or node attributes of each tutorial node may be set in response to the obtained setting instruction. The node data of a tutorial node may be the node's input data, output data, etc.; for example, a node's input data may be an event, a mathematical variable, a string, a custom type, etc. The node attribute of a tutorial node may be its type, for example, an event receiving node, a mathematical variable node, a mode switching node, etc.
In some embodiments, the working state of the editor of the medical device may also be shown in the editing interface (e.g., the information prompt area). For example, the current working state of the editor may be shown as one of verifying the tutorial script, saving the tutorial script, etc.
In some embodiments of this specification, setting the node data and node attributes of each tutorial node through the editing interface and/or the information prompt area enables accurate configuration of the tutorial nodes, allowing them to adapt to a wider range of application scenarios.
In some embodiments, the editing interface may further include an editing area. In some embodiments, a node to be edited may be edited in the editing area.
In some embodiments, an editing instruction may be obtained based on the editing area, where the editing instruction indicates the node to be edited. For example, the editing area included in the editing interface may be the area 610 shown in Fig. 6a. In some embodiments, an editing instruction triggered by the user may be obtained in the information prompt area, and the node to be edited may be determined according to the editing instruction.
In some embodiments, the editing interface of the node to be edited may be displayed in the editing area in response to the editing instruction. Specifically, after the editing instruction is received, the editing interface of the node to be edited may be displayed in the editing area.
In some embodiments, editing information may be obtained based on the editing interface, and at least one of a verification operation, a save operation, a setting operation, and a search operation may be performed on the node to be edited according to the editing information. In some embodiments, the displayed editing interface may include editing information; the editing information may be obtained from the editing interface, and various editing operations, including at least one of verification, save, setting, and search operations, may be performed on the node to be edited according to it.
In some embodiments of this specification, displaying the editing interface of the node to be edited in the editing area and editing it enables real-time editing and adjustment of the node, so that the edited node can adapt to a wider range of application scenarios.
Step 420: obtain a selection instruction triggered by the user based on the teaching task list. In some embodiments, step 420 may be performed by the first obtaining module 220.
In some embodiments, after the teaching task list is displayed, a selection may be made among the multiple candidate teaching tasks in the list to choose at least one candidate teaching task to be executed; when a candidate teaching task is selected in the list, a selection instruction is triggered. In some embodiments, the triggered selection instruction may be obtained, where the selection instruction contains information about the selected candidate teaching task, such as its number, name, or ID.
Step 430: determine a target teaching task according to the selection instruction. In some embodiments, step 430 may be performed by the first obtaining module 220.
In some embodiments, according to the obtained selection instruction, the candidate teaching task corresponding to the candidate teaching task information in the selection instruction may be determined as the target teaching task. For example, the number, name, ID, etc. of the target teaching task may be determined according to the number, name, ID, etc. of the candidate teaching task. In some embodiments, the target teaching task may include one or more candidate teaching tasks in the teaching task list.
Step 440: run the target teaching task. In some embodiments, step 440 may be performed by the running module 230.
In some embodiments, after the target teaching task is determined, the at least one tutorial node included in the target teaching task, together with the logical relationships among these tutorial nodes, may be executed to run the target teaching task.
In some embodiments, the first tutorial node of the target teaching task may be pushed onto a stack, and the following steps may be executed in a loop, pushing and popping the tutorial nodes of the target teaching task until all tutorial nodes in the stack have been taken out: taking a tutorial node from the top of the stack as the current tutorial node; in response to the current tutorial node containing input data that includes invalid data, pushing onto the stack the tutorial node connected to the input pin corresponding to the invalid data; or, in response to the current tutorial node containing input data that is all valid, executing the operations contained in the current tutorial node and pushing onto the stack the tutorial node corresponding to the output of the current tutorial node.
In some embodiments, when a tutorial node is set as the current tutorial node, the operations contained in that tutorial node are executed.
Merely by way of example, when the medical device is an ultrasound imaging device, the process of running a teaching task can be regarded as the process of executing the teaching task script. If the teaching task is the one shown in Fig. 7, the process may include the following steps:
S1-1: The user logs into the system and downloads the installed teaching task script.
S1-2: The user clicks the PW mode switching button and the system sends this click event. The event receiving node of the teaching task script is then triggered, the ClickEvent node is pushed onto the stack, and the output pin EventName of the ClickEvent node is set to "PW mode" by the system event.
S1-3: The ClickEvent node is taken from the stack. It has no input data, so its node logic is executed (in fact, an event receiving node has no additional node logic).
S1-4: The ClickEvent node has outputs: its logic-output node, the DisplayVideo node, is pushed onto the stack, and the isEqual node connected to its data output is pushed onto the stack.
S1-5: There are two nodes in the stack that have not been taken out; the isEqual node at the top of the stack is taken out first.
S1-6: The isEqual node has two input pins. EventName has already been assigned by the output of the ClickEvent node, so the node connected to the other input pin, string, which is a mathematical variable node, is pushed onto the stack.
S1-7: The mathematical variable node at the top of the stack is taken out. It has no input data and no additional node logic; its output is a string with the value "PW mode". This string is assigned to the string input pin of the isEqual node, and the isEqual node is pushed onto the stack.
S1-8: The isEqual node at the top of the stack is taken out.
S1-9: Both input pins of this node now have values, so its node logic is executed to judge whether the values of the two input pins are equal. Here they are equal, so the value of the result output pin is true. The DisplayVideo node connected to this pin would be pushed onto the stack, but since it is already in the stack, it is not pushed again.
S1-10: The DisplayVideo node at the top of the stack is taken out. Its data input pin Condition already has the value true, so its node logic is executed.
S1-11: The logic of the DisplayVideo node is that, when the value of the input pin condition is true, a video configured in the node is triggered and played at the corresponding position on the ultrasound screen device.
S1-12: The EndTask node connected to the logic output of the DisplayVideo node is pushed onto the stack. This node is the termination node of the task, and since there are now no other nodes in the stack, the teaching task is completed.
In some embodiments, teaching tasks can be downloaded to the medical device through a user account. Taking an ultrasound device as an example, the download process may include: the user requests login from the server with an account and password; after authentication, the server queries the user's teaching task list and the teaching task lists shared within the permission domain, and sends them to the cloud storage; the cloud storage queries the teaching task lists, obtains the teaching task scripts and verification data, and returns them to the server; the server adds sorting information to the obtained teaching task scripts and verification data and returns them to the user's local ultrasound system; the local ultrasound system first decides, according to the verification data, whether to overwrite the local teaching tasks, and sorts the novice teaching tasks according to their sorting information. The user can then select and activate a suitable teaching task according to current needs, and the ultrasound system parses the teaching task script and runs the corresponding teaching task nodes in sequence when the execution conditions are met.
In some embodiments, a new teaching task may be created as the target teaching task, or at least one of the candidate teaching tasks may be edited and used as the target teaching task. For how to create a new teaching task, refer to the description of Fig. 5, which will not be repeated here.
In some embodiments, at least one of the candidate teaching tasks may be determined as a teaching task to be edited; the teaching task to be edited may be decoded into a graphical interface; at least one tutorial node may be obtained according to the node logical connection information in the script of the teaching task to be edited; the display position of the at least one tutorial node may be determined based on its type information and position information; the at least one tutorial node may be connected based on the node logical connection information; and, based on the internal logic and setting information of the at least one tutorial node, its name information, input/output connection information, and data connection information may be obtained and displayed in the editing interface.
Merely by way of example, when an existing teaching task is opened on the ultrasound imaging device in the teaching task editor, the ultrasound imaging device decodes the teaching task script into a graphical interface. The processor module of the ultrasound imaging device obtains the nodes in sequence according to the node logic pin connection information in the teaching task script; renders each node at the corresponding position through the rendering module according to its type and position information; connects the nodes with rendered line segments through the rendering module according to the pin connection information; and renders each node's name and its input/output logic pins and data pins in the editor through the rendering module according to the node's internal logic and setting information.
In some embodiments, at least one of the multiple candidate teaching tasks may be updated. Specifically, after at least one candidate teaching task is edited, the edited candidate teaching task may replace the corresponding pre-edit candidate teaching task.
Merely by way of example, when the medical device is a full-touch-screen ultrasound imaging device, the teaching method may include the following steps:
S2-1: After the user logs into the ultrasound imaging device, the device automatically requests the user's teaching task data list from the server, and updates the locally available teaching tasks after obtaining the list data. The list data may include tasks created by the user's own account as well as tasks set as shared by other accounts within the permission domain. Activated and inactivated teaching tasks can be seen in the list.
S2-2: The user checks whether there is a teaching task that meets the current needs. If so, the user selects and activates it, and the task will be executed when its conditions are met. If not, the user clicks the "New teaching task" or "Edit teaching task" button on the interface to enter the teaching task creation and editing interface.
S2-3: The user edits an existing teaching task already selected from the previous list, or creates and names a new teaching task; after either operation, the user enters the visual editing interface for teaching tasks.
S2-4: As shown in Fig. 7, in the creation area 620 of the visual editing interface, a new event node is created by long-pressing the screen on the canvas, or by clicking the output pin of an existing node and dragging out a connecting line.
S2-5: The user clicks the event node and sets its node name to ClickEvent. When the user clicks a control on the screen, this event node is triggered and outputs the name of the event emitted by that control.
S2-6: The user clicks the EventName pin of the ClickEvent node, long-presses and drags out a line segment to a suitable position, and releases the touch; the new-node menu pops up, and a logic node named isEqual is created and connected to the EventName pin of the previous ClickEvent node. The input data of the logic node is also automatically named the EventName type. Long-pressing a blank area pops up the new-node menu; a variable node is created, clicked, and its input data value is set to "PW Mode", and its output pin is connected to the second input pin of the logic node. When the logic node judges that the EventName string is "PW Mode", the Result pin outputs true; otherwise it outputs false.
S2-7: Using a method similar to steps S2-4 to S2-6, a display node named DisplayVideo is created. The logic output pin of the event node is connected to the logic input pin of the display node, the display node's attribute is set to play a video, the play condition is set to a bool variable, the video content is set, the position of the video control is set, and so on. The Result output pin of the previous logic node is connected to the Condition input pin of the video display node. Finally, an end node named EndTask is connected to end the teaching task. This teaching task is: when the user triggers a click on the "PW Mode" button, play a teaching video and end the teaching task.
S2-8: After editing the whole teaching task, the user clicks save and upload. The system compiles the teaching task: if compilation passes, the upload starts automatically; if not, the system indicates which part of the teaching task is set incorrectly. After compilation is completed, the system uploads the teaching task script to the cloud server under the user's account. If the task is set as domain-permission shared, other accounts within the permission domain can view and use the teaching task after upload.
S2-9: The user returns to the interface for selecting available teaching tasks and activates the selected teaching task.
In some embodiments, a teaching task can be saved to cloud storage through a user account. Taking an ultrasound device as an example, the saving process may include: after logging in with an account, the user uploads the teaching task flow script to be saved and the data required for sorting to the server; after authentication, the server generates verification data for the teaching task script and uploads the novice teaching task script and verification data to the cloud storage; the cloud storage decides, according to the verification data, whether to overwrite the user's previous teaching task script, and returns the processing result to the server; the server decides whether to update the teaching task list according to the returned processing result, and returns the final result to the local ultrasound system; the local ultrasound system handles the subsequent work according to the returned final result. If archiving succeeds, a success prompt is given; if archiving fails, the teaching task script is saved locally and the above steps are retried at the next save.
In some embodiments, during the running of the target teaching task, in response to a target parameter among multiple initial parameters associated with a scanning protocol being selected or adjusted, a first image corresponding to before the pre-adjustment of the target parameter and a second image corresponding to after the pre-adjustment of the target parameter may be displayed on the display interface. For more details on how to select and adjust the target parameter to obtain the images before and after adjustment, refer to the description of Fig. 8, which will not be repeated here.
In some embodiments of this specification, by displaying candidate teaching tasks in response to instructions triggered on a display interface and selecting and running a target teaching task from the candidate teaching tasks according to the instructions, teaching tasks can be selected and run intuitively in a visual way, with a friendly interface that is simple to use and convenient for new users. By creating and editing teaching tasks based on a visual interface and then running them, teaching tasks can be created quickly and easily, helping users better understand medical devices (e.g., ultrasound devices) and guiding them to master the functions and operation of the devices through hands-on practice.
Fig. 5 is an exemplary flowchart of a teaching task creation method according to some embodiments of this specification. As shown in Fig. 5, the process 500 includes the following steps. In some embodiments, the process 500 may be executed by the processing device 140.
Step 510: obtain a creation instruction based on the creation area, and display a node menu in response to the creation instruction.
In some embodiments, a triggered creation instruction may be obtained from the creation area, and a node menu may be displayed in response to the creation instruction. For example, Fig. 6a is a schematic diagram of the editing interface, where the creation area may be the area 620 in Fig. 6a and the displayed node menu may be the menu 640 in Fig. 6a.
In some embodiments, the medical device may include an ultrasound imaging device, and each menu item in the node menu corresponds to one type of tutorial node.
A tutorial node is a node corresponding to an element in a tutorial. In some embodiments, tutorial nodes may include multiple types of nodes, for example at least one of basic nodes and ultrasound application nodes.
In some embodiments, basic nodes may include multiple types, for example at least one of an event receiving node, an event emitting node, a mathematical variable node, a logical computation node, a function execution node, a display node, a logic waiting node, and an end node.
An event receiving node triggers and executes the nodes connected after it when it receives an event provided by the system or a user-defined event; for example, after the user clicks an interface button, opens an interface, or completes an input, the nodes following the event node are executed.
An event emitting node defines a custom event and emits it together with the node's input data.
A mathematical variable node is a node containing variables of various types (e.g., boolean, numeric, string, vector, object). These variable types can also serve as the data input and output types of at least some other node types (e.g., logical computation nodes, function execution nodes), and a mathematical variable node can also serve alone as the input of other node types for computation.
A logical computation node can perform logical computation on data, such as addition, subtraction, multiplication, and division of numeric and vector types, true/false judgment of boolean types, magnitude and equality judgment of numeric types, and equality judgment of string types.
A function execution node can select and execute one function of the medical device, for example mode switching, freezing, measurement package selection, and so on.
A display node can present an interface on the screen, in which text, pictures, or videos prompt the user how to operate.
A logic waiting node continues execution only after all of its logic inputs have been received.
An end node indicates the end of a teaching task.
Merely by way of example, the appearance of a basic node control may be as shown in Fig. 6b. A node is a rectangular frame, which may have a different outline color or background color depending on the node type. The interior of the node is divided into an upper area and a lower area: the upper area carries the node name (nodes can be custom-named), and the lower area carries the node data, which represents the node's logic and data input and output pins, with the data object name shown next to each pin. The data input and output pin areas carry marker symbols: for example, logic connection pins are marked with triangles and data connection pins with dots. The logic and data input pins are on the left, and the logic and data output pins are on the right. In some embodiments, a node's pins may include at least one of a logic input, a logic output, a data input, and a data output. For example, an event node has no logic input or data input pins, while an end node has no logic output or data output pins.
In some embodiments, for a full-touch-screen ultrasound device, the node creation menu can be called up at the corresponding position by long-pressing the screen.
In some embodiments, two nodes can be connected by selecting a pin of one node, sliding to pull out a connecting line, and releasing it at a pin of another node.
In some embodiments, by selecting a pin of a node, sliding to pull out a connecting line, and releasing it in a blank part of the creation area (e.g., the area 620 in Fig. 6a), the node creation menu is called up automatically.
In some embodiments, it can be judged whether the data and logic connections between nodes conform to the connection rules; if not, the connection can be canceled and an error prompted. The connection rules can be set as needed; for example, data can flow out of one output pin into the data input pins of multiple nodes, but the input pin of a node can only be fed from the output pin of a single node.
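The example connection rule above (one output pin may fan out to many input pins, but each input pin accepts at most one incoming connection) can be checked with a simple pass over the wiring. This is a sketch only: the `(node, pin)` tuple representation and the function name `validate_connections` are assumptions for illustration.

```python
# Sketch of the single-source connection rule: an output pin may feed many
# input pins, but a second wire into the same input pin is a violation.

def validate_connections(connections):
    """connections: list of ((src_node, out_pin), (dst_node, in_pin)) pairs.
    Return the input pins that violate the single-source rule."""
    seen, violations = set(), []
    for _src, dst in connections:
        if dst in seen:
            violations.append(dst)   # second wire into the same input pin
        else:
            seen.add(dst)
    return violations
```

An editor could run such a check when a connecting line is released and, on a non-empty result, cancel the connection and prompt an error, as the text describes.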
In some embodiments, nodes can be moved freely within the creation area, and a node can be selected to modify its attributes or to delete it.
In some embodiments, ultrasound application nodes may include multiple types, for example at least one of a mode switching node, a measurement package switching node, a probe simulation data node, an image parameter setting node, a function activation node, and a B-mode image comparison node.
A mode switching node can trigger mode switching, and after the corresponding mode parameters are passed in, it changes the parameter configuration of the ultrasound machine.
A measurement package switching node can switch the measurement package so that the user operates under the specified measurement package.
A probe simulation data node can select raw ultrasound image data saved on the machine as the simulation data for a teaching task. After the simulation data takes effect, the ultrasound device no longer takes the real data of the connected probe as input, but takes the simulation data as input instead.
An image parameter setting node can change the parameter data of the medical imaging device by setting the parameter data under the node.
A function activation node can activate a specific function of the medical device, for example freezing, saving an image, viewing a patient, or viewing a replay.
A B-mode image comparison node can, according to its settings, compare the ultrasound image data and parameters of a specific input parameter and display a simulated B-mode ultrasound image in a set area.
Merely by way of example, the node controls of ultrasound application nodes are similar to those of basic nodes, and their appearance may be as shown in Fig. 6c.
In some embodiments, the node menu may include a basic node menu, an ultrasound application node menu, etc., where the node types included in the basic node menu are basic nodes and those in the ultrasound application node menu are ultrasound application nodes.
In some embodiments, the creation instruction may be obtained from the creation area, and the basic node menu and the ultrasound application node menu may be displayed in response to the creation instruction.
Step 520: obtain a node selection instruction based on the node menu, where the node selection instruction includes the node identifier of the selected tutorial node.
In some embodiments, after the medical device displays the node menu in response to the triggered creation instruction, a node can be selected from the displayed node menu, thereby triggering a node selection instruction. In some embodiments, the node selection instruction may be obtained from the displayed node menu, where the obtained node selection instruction includes the node identifier of the selected tutorial node.
In some embodiments, the node identifier of the selected tutorial node may include at least one of the node name, node ID, etc. of the selected tutorial node, where the node name in the node identifier corresponds to the node name in the node menu. For example, if the name of the selected tutorial node is ClickEvent, the node name here is ClickEvent.
In some embodiments, if the node menu includes a basic node menu and an ultrasound application node menu, the node selection instruction may include a first selection instruction and a second selection instruction. In some embodiments, the first node selection instruction may be obtained from the basic node menu, where it includes the basic node identifier of the selected tutorial node. In some embodiments, the second node selection instruction may be obtained based on the ultrasound application node menu, where it includes the ultrasound application node identifier of the selected tutorial node.
Step 530: configure, based on the creation area, the tutorial node corresponding to the node identifier, to obtain the configuration information of each tutorial node.
In some embodiments, after the node identifier of the selected tutorial node is obtained, the tutorial node corresponding to the node identifier may be configured based on the creation area, thereby obtaining the configuration information of each tutorial node. Specifically, the configuration of a tutorial node can be completed by configuring its node attributes and node data.
The configuration information of a tutorial node refers to information related to the tutorial node, and may include at least one of multiple kinds of information, for example: node type information, node logic information, node setting information, node position information, node logic pin connection information, node data pin connection information, script attribute information of the tutorial script, configuration information of the tutorial script, etc. In some embodiments, the configuration information includes the node attributes and node data of the tutorial node, where the node data may include the node's logic information, logic pin connection information, data pin connection information, etc., and the node attributes may include the node's type information, setting information, position information, the script attribute information of the tutorial script, the configuration information of the tutorial script, etc.
In some embodiments, the configuration box of each tutorial node may be displayed according to the node selection instruction, where the tutorial node corresponding to the node identifier of the selected tutorial node has a corresponding configuration box containing the node's configuration information. In some embodiments, the configuration information of each tutorial node may be obtained based on its configuration box. Specifically, the configuration information selected in the configuration box of a tutorial node may be determined as the node's configuration information. The configuration box enables the user to intuitively see and select the configuration information of tutorial nodes, improving the accuracy and convenience of obtaining it.
In some embodiments, when configuring the tutorial nodes corresponding to the selected node identifiers, processing may start from the node corresponding to the first node identifier and proceed node by node along the logic pin connections. First, the internal logic and setting information of each node is parsed; then its input data pin connection information is checked, and if a pin is connected, the output data pin connection information of the preceding node is looked up backward along the input connection line until all preceding data meet the requirements, thereby obtaining the configuration information of the target tutorial node corresponding to each node identifier.
Step 540: create the candidate teaching task according to the tutorial nodes and their configuration information.
In some embodiments, the candidate teaching task may be created based on the editing interface, where the editing interface further includes an information prompt area. In some embodiments, a setting instruction may be obtained based on the editing interface and/or the information prompt area, where the setting instruction includes information for setting the node data, node attributes, etc. of the tutorial nodes. For example, newly created tutorial node 1 is of the logic node type and named isEqual; newly created tutorial node 2 is of the mathematical variable node type with an input data value of "PW Mode".
In some embodiments, the node data and node attributes of each tutorial node are set in response to the setting instruction. For example, in response to the node data and node attribute information in the setting instruction, the type of tutorial node 1 is set to logic node and its name to isEqual, and the type of tutorial node 2 is set to mathematical variable node with an input data value of "PW Mode".
In some embodiments, a teaching task script corresponding to the candidate teaching task may be generated based on the tutorial nodes and the configuration information of each tutorial node.
In some embodiments, the tutorial nodes may be taken as the tutorial nodes included in the candidate teaching task, and the configuration information of the tutorial nodes as the logical relationships among them, thereby obtaining the candidate teaching task. In some embodiments, the teaching task script corresponding to the teaching task may be generated according to the tutorial nodes and configuration information of the obtained candidate teaching task.
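Generating a teaching task script from the tutorial nodes and their configuration information can be sketched as straightforward serialization. The schema below (keys such as `"task"`, `"nodes"`, `"connections"`, and the function name `build_task_script`) is an assumed illustration, not the patent's actual script format.

```python
import json

# Sketch of serializing tutorial nodes and their configuration into a
# teaching-task script. The JSON schema here is an assumption.

def build_task_script(task_name, nodes, connections):
    """nodes: list of dicts with node id/type/name/position/settings.
    connections: list of (src_id, src_pin, dst_id, dst_pin) tuples."""
    script = {
        "task": task_name,
        "nodes": nodes,                       # node type/logic/position info
        "connections": [
            {"from": [s, sp], "to": [d, dp]}  # logic and data pin wiring
            for s, sp, d, dp in connections
        ],
    }
    return json.dumps(script, indent=2)
```

A script in this shape carries exactly the information the editing flow needs to rebuild the graph later: node types and positions for rendering, and pin wiring for reconnecting the nodes.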
In some embodiments, a compilation check may be performed on the generated teaching task script. In some embodiments, if the compilation check passes, the teaching task script may be uploaded; if it fails, the location of the setting error in the script may be indicated.
In some embodiments, the generated teaching task script may be uploaded to various local or network platforms, for example a cloud platform.
In some embodiments of this specification, a creation instruction is obtained through the creation area included in the editing interface, and a node menu is displayed in response, allowing the user to select tutorial nodes autonomously through an intuitively displayed node menu; a node selection instruction including node identifiers is obtained through the node menu, and the tutorial nodes corresponding to the node identifiers can be accurately configured based on the creation area; and teaching tasks are created quickly and accurately from the selected tutorial nodes and corresponding configuration information, reducing the complexity and improving the efficiency of creating teaching tasks.
图8是根据本说明书一些实施例所示的医疗设备辅助方法的示例性流程图。如图8所示,流程800包括下述步骤。在一些实施例中,流程800可以由处理设备140执行。在一些实施例中,流程800可以包括在步骤440的运行目标教学任务的过程中。
步骤810,获取扫查协议。在一些实施例中,步骤810可以由获取模块310执行。
在一些实施例中,扫查协议可以是医疗设备中用于实际扫查的程序。在一些实施例中,扫查协议可以仅与即将调用的参数和/或图像相关,而不用于实际扫查。例如,用户在利用医疗设备(例如,医疗设备110)对被扫查对象进行扫查时,可以根据被扫查对象的信息选择或设置相应的扫查协议。处理设备140可以接收该扫查协议,并根据该扫查协议控制医疗设备对被扫查对象进行扫查成像。又例如,用户可以在终端设备(例如,终端设备130)上选择扫查协议,用户可以选择或设置与扫查协议相关的参数,以在终端设备的显示设备上展示参数预调节前后对应的图像。
在一些实施例中,医疗设备可以包括超声设备、CT设备、MRI设备、PET设备、SPECT设备等,或其任意组合。扫查协议可以与一个或以上初始参数相关联。不同的医疗设备可以对应不同的初始参数。为了便于描述,本说明书中将以超声设备作为医疗设备的示例进行说明,其并不限制本申请的范围。在一些实施例中,超声设备可以包括一维超声设备、二维超声设备和/或三维超声设备。在一些实施例中,医疗设备可以为手持式超声设备。
在一些实施例中,扫查协议可以包括探头种类、扫查模式等中的至少一种。例如,探头可以包括线阵探头、相控阵探头、4D探头、腔内探头、经食道探头、胃镜探头等。扫查模式可以包括M扫查模式、PW扫查模式、与扫查对象一一对应的扫查模式等。用户可以通过选择探头、扫查模式等进入扫查页面。
在一些实施例中,扫查协议可以与医疗设备的辅助功能中所要展示的图像中的对象相关。具体地,在一些实施例中,当扫查协议中包括的探头种类是针对某个部位的探头时,若针对该扫查协议调用医疗设备的辅助功能,则展示的图像中的对象即为该部位。例如,若扫查协议中包括的探头是经食道探头,则医疗设备的辅助功能中展示的扫描对象为心脏。在一些实施例中,当扫查协议中包括的扫查模式是针对某个部位的扫查模式时,若针对该扫查协议调用医疗设备的辅助功能,则展示的图像中的对象即为该部位。例如,若扫查协议中包括的扫查模式是应用于腹部、心脏、甲状腺等的扫查模式,则医疗设备的辅助功能中展示的扫描对象分别对应为腹部、心脏、甲状腺等。
在一些实施例中,与超声设备的扫查协议相关的初始参数可以与超声设备和/或超声图像相关。在一些实施例中,初始参数可以包括动态范围、深度、发射功率、增益、伪彩、线密度、斑点噪声抑制、时间增益补偿(time gain compensation,TGC)、谐波成像、多波束成像、空间复合、亮度、焦点调节等,或其任意组合。在一些实施例中,初始参数可以包括多级或多类初始参数。例如,可以根据临床应用中各个初始参数的使用频次(即,用户在进行图像优化时调节各个初始参数的频次),将初始参数划分为多级初始参数。又例如,可以根据各个初始参数对图像的调节效果划分为多级初始参数。例如,动态范围、亮度、增益等参数可以划分为一级初始参数,其余参数可以划分为另一级初始参数。每个初始参数可以对应多个数值,以供用户对其进行调节来优化图像。在一些实施例中,初始参数可以是连续数值化的参数。例如,连续数值化的参数可以包括深度、增益、亮度等。在一些实施例中,初始参数可以是进行分档的参数。例如,对于“斑点噪声抑制”参数,可以用强度、去噪指数对其进行量化分档。
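按使用频次把初始参数划分为多级的做法,可以用如下草图示意。tier_params函数与频次阈值100均为假设性示例,仅用于说明划分思路:

```python
def tier_params(usage_counts: dict, threshold: int = 100) -> dict:
    """按临床使用频次把初始参数划分为两级:
    频次不低于阈值的为一级参数,其余为另一级参数。"""
    tiers = {"level1": [], "level2": []}
    for name, count in sorted(usage_counts.items(), key=lambda kv: -kv[1]):
        tiers["level1" if count >= threshold else "level2"].append(name)
    return tiers
```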
步骤820,接收辅助功能触发指令。在一些实施例中,步骤820可以由接收模块320执行。
用户可以通过向处理设备140发送辅助功能触发指令来触发医疗设备的辅助功能,以了解医疗设备的各个参数(或初始参数)对图像的应用效果。在一些实施例中,辅助功能触发指令可以包括静态辅助功能触发指令和动态辅助功能触发指令。
在本说明书中,静态辅助功能触发指令与医疗设备的静态辅助功能相关。医疗设备的静态辅助功能可以与预存数据库相关。预存数据库可以包括各个初始参数的各个值对应的图像。在一些实施例中,预存数据库还可以包括多个组合参数的各个组合值对应的图像。组合参数,即两个或以上初始参数的组合。用户触发静态辅助功能指令后,处理设备140可以基于所选择的目标参数(例如,用户想要了解的参数及其值),通过查询预存数据库来显示与目标参数相对应的图像。换句话说,处理设备140不是实时根据目标参数计算得到所要展示的图像,而是通过调用对应的预存图像来进行展示。在一些实施例中,医疗设备的静态辅助功能可以直接加载在医疗设备上,以供用户在未进行扫查时或在扫查过程中进行调用。在一些实施例中,医疗设备的静态辅助功能也可以以应用程序(Application,APP)、网页等形式加载在终端设备上以方便用户随时随地调用。
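静态辅助功能“查询预存数据库而非实时计算”的思路,可以用如下草图示意。以(参数名, 预调节值)作为数据库键仅是一种假设性的组织方式:

```python
def lookup_images(db: dict, param: str, value):
    """按目标参数查询预存数据库:
    返回(预调节前的第一图像, 预调节后的第二图像)。"""
    return db[(param, "initial")], db[(param, value)]
```

例如,预存数据库中可以为“动态范围”参数预先存入其初始图像以及各个预调节值对应的目标图像,触发静态辅助功能后直接调取展示。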
动态辅助功能触发指令与医疗设备的动态辅助功能相关。医疗设备的动态辅助功能可以与实时采集的图像相关。处理设备140可以基于实时采集的图像和/或所选择的目标参数通过实时计算得到所要展示的图像。因此,医疗设备的动态辅助功能需要加载在医疗设备上,以通过医疗设备实时采集图像。
在一些实施例中,用户可以通过物理按键、触摸屏幕、鼠标、语音、手势、眼动、脑电波等方式向处理设备140发送辅助功能触发指令。例如,医疗设备上可以设置有辅助功能触发按钮,用户可以通过点击该按钮向处理设备140发送辅助功能触发指令。又例如,用户界面上设置有辅助功能触发模块,用户可以通过鼠标选择或触摸屏幕触发该模块以向处理设备140发送辅助功能触发指令。再例如,用户可以通过预设触发规则(例如,快速双击、长按某个参数按钮),以向处理设备140发送辅助功能触发指令。例如,在一些实施例中,当超声设备正在进行扫查时,用户可以通过快速双击或长按“动态范围”参数按钮来触发与动态范围对应的辅助功能,即向处理设备140发送与动态范围对应的辅助功能触发指令。换句话说,通过快速双击或长按“动态范围”参数按钮,可以进入动态范围的辅助功能界面,以进一步供用户在该辅助功能界面设置动态范围的预调节值。
在一些实施例中,处理设备140对扫查协议和辅助功能触发指令的获取可以不存在先后顺序。换句话说,处理设备140可以先获取扫查协议,再获取辅助功能触发指令;也可以先获取辅助功能触发指令,再获取扫查协议。在一些实施例中,处理设备可以同时获取扫查协议和辅助功能触发指令。例如,当医疗设备的静态辅助功能以APP的形式加载在终端设备上时,APP上可以设置有同时触发特定扫查协议和辅助功能触发指令的按钮,当用户点击该按钮后,可以直接进入与扫查协议相关的静态辅助功能界面。
步骤830,接收多个初始参数中被选择的目标参数。在一些实施例中,步骤830可以由接收模块320执行。
在一些实施例中,可以基于扫查协议或辅助功能触发指令,在显示界面上展示多个初始参数。
在一些实施例中,当接收到扫查协议时,处理设备140即可以在显示界面显示与扫查协议相关联的初始参数。例如,对于以APP形式加载在终端设备上的静态辅助功能,当用户打开APP并选择扫查协议后,终端设备的显示设备上即可显示与扫查协议相关的初始参数,以供用户进行选择学习。又例如,对于在医疗设备上加载的动态辅助功能,当用户选择或设置扫查协议后,医疗设备的显示设备(或显示界面)上即可显示与扫查协议相关的初始参数。在一些实施例中,处理设备140可以响应于接收到辅助功能触发指令,显示初始参数。例如,当处理设备140获取到扫查协议后,响应于用户触发了辅助功能,可以在医疗设备的显示设备上展示与扫查协议相关的初始参数。
在一些实施例中,当初始参数包括多级/多类初始参数时,处理设备140可以在显示界面上分别展示各级/各类初始参数。例如,当初始参数包括初级参数(例如,使用频次较高的参数)和高级参数(例如,使用频次较低的参数)时,处理设备140可以基于扫查协议或辅助功能触发指令,在显示界面上展示初级参数。进一步地,响应于显示高级参数的显示触发指令(例如,用户可以选择显示界面上显示的“高级”按钮),处理设备140可以在显示界面上展示高级参数。再例如,当初始参数包括第一类初始参数、第二类初始参数时,处理设备140可以基于扫查协议或辅助功能触发指令,展示各类初始参数的类别。进一步地,响应于显示特定类别的显示触发指令(例如,第一类初始参数显示触发指令),处理设备140可以在显示界面上展示该类别对应的初始参数。
在一些实施例中,当在显示界面上展示多个初始参数后,用户可以基于展示的多个初始参数选择目标参数。
在一些实施例中,当医疗设备的辅助功能被加载在医疗设备上时,可以不展示初始参数。具体地,当处理设备140接收扫查协议和辅助功能触发指令后,可以不在显示界面上展示初始参数,此时,用户可以通过设置在医疗设备上的各个参数按钮来对初始参数进行选择和/或设置,以确定目标参数。
在一些实施例中,响应于用户选择目标参数,处理设备140可以接收多个初始参数中被选择的目标参数。目标参数包括所选择的初始参数的种类及其对应的预调节值。
在一些实施例中,用户可以选择显示界面上展示的一个或多个初始参数作为目标参数。也就是说,用户可以对初始参数进行单选或多选。例如,在显示界面上可以罗列多个初始参数,每个参数前设置有选择框,用户可以选中(例如,单击、双击)想要了解的参数前的选择框。进一步地,在选择完毕后,用户可以通过确认操作(例如,点击“应用”按钮)以确定选择的一个或多个选择框对应的参数作为目标参数。在一些实施例中,可以将特定的几个初始参数组合形成一个新参数,并将这几个参数的不同值的组合设置成新参数的不同档位,以达到对图像的不同调节效果。例如,如图10所示,可以将动态范围、TGC、增益组合形成一个新参数“图像优化”。可以将动态范围、TGC、增益的参数值针对调节穿透性能、分辨率,设置成不同的组合,以对应“图像优化”的不同档位(例如,穿透1档、穿透2档、穿透3档、分辨率1档、分辨率2档、分辨率3档等)。用户可以通过点击“图像优化”按钮并设置其对应的档位来实现同时选择动态范围、TGC、增益的组合值。在一些实施例中,用户可以通过物理按键、触摸屏幕、鼠标、语音、手势、眼动、脑电波等方式来选择目标参数。当目标参数被选择后,用户可以进一步对目标参数的预调节值进行选择或设置。例如,用户可以选择几档的“深度”、设置增益的数值等。
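将多个初始参数组合为“图像优化”新参数并映射为档位的做法,可以用如下草图示意。档位名称沿用上文,但三个参数的具体数值均为假设的预设组合:

```python
# 每个档位对应动态范围/TGC/增益的一组预设组合值(数值为假设)
GEARS = {
    "穿透1档":  {"dynamic_range": 40, "tgc": 0.8, "gain": 55},
    "穿透2档":  {"dynamic_range": 35, "tgc": 0.9, "gain": 60},
    "分辨率1档": {"dynamic_range": 60, "tgc": 0.6, "gain": 45},
}

def apply_gear(gear: str) -> dict:
    """选中“图像优化”的某一档位,即同时选中三个初始参数的组合值。"""
    return dict(GEARS[gear])
```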
步骤840,在显示界面上展示目标参数预调节前所对应的第一图像以及目标参数预调节后所对应的第二图像。在一些实施例中,步骤840可以由显示模块330执行。通过在显示界面上展示预调节前后的图像,可以让用户更为直观地了解目标参数对图像的应用效果。
预调节是指基于目标参数确定感兴趣图像的过程。需要知道的是,处理设备140并没有在扫查过程中对参数进行实际调节,而只是通过调用或计算等方式得到并展示感兴趣图像(例如,目标参数预调节前后对应的第一图像、第二图像)。用户可以通过预览感兴趣图像再确定是否对该目标参数进行调节来优化采集图像。
在一些实施例中,第一图像和/或第二图像可以包括静态图像(例如,图像帧)或动态图像。在本申请中,动态图像可以指由多个连续的静态图像所构成的图像。在一些实施例中,动态图像可以以图形交换格式(Graphics Interchange Format,GIF)、动态图像专家组(Moving Picture Experts Group,MPEG)格式、MP4、音频视频交错格式(Audio Video Interleaved,AVI)等格式进行存储或处理。例如,当与扫查协议关联的对象为心脏时,由于心脏时刻在跳动,第一图像和第二图像可以为动态图像以更好地展示心脏状态。又例如,当与扫查协议关联的对象为血管时(例如,血管彩超),第一图像和第二图像可以为动态图像以更好地了解血流情况。
在一些实施例中,当辅助功能触发指令为静态辅助功能触发指令时,第一图像和第二图像可以基于目标参数通过查询预存数据库进行确定。例如,每个目标参数(例如,每个初始参数或多个初始参数的组合的类别)可以对应一个初始图像(可以为静态图像,也可以为动态图像)。处理设备140或其他处理设备可以基于初始图像和目标参数(例如,预调节值),确定该目标参数对应的目标图像(可以为静态图像,也可以为动态图像)。处理设备140或其他处理设备可以将每个目标参数对应的初始图像和目标图像存储在预存数据库中。当用户选择目标参数(包括初始参数的种类及其对应的预调节值)后,处理设备140可以从预存数据库中调取目标参数对应的初始图像作为第一图像,以及目标参数对应的目标图像作为第二图像。在一些实施例中,每个目标参数(或组合目标参数)可以为医疗设备辅助***100的默认设置。
在一些实施例中,当辅助功能触发指令为动态辅助功能触发指令时,处理设备140可以基于扫查协议采集初始图像,并在显示界面上展示该初始图像。当接收到被选择的目标参数后,处理设备140可以将采集的初始图像作为第一图像。处理设备140可以基于目标参数(例如,初始参数的预调节值),处理初始图像来确定第二图像。处理设备140可以在显示界面上展示确定的第一图像和第二图像。在一些实施例中,第一图像和第二图像可以随着用户移动探头而实时变化。例如,当用户触发动态辅助功能后,用户可以使用探头实时采集对象(例如,模体)的图像,此时,医疗设备的显示设备上可以实时显示左右两幅相同的初始图像(例如,动态图像)。进一步地,用户可以选择目标参数(例如,动态范围),此时,显示设备上可以在左方展示初始图像(即,第一图像),右方展示基于目标参数和初始图像确定的第二图像。
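动态辅助功能基于实时采集的初始图像与目标参数计算第二图像的思路,可以用如下极简草图示意。此处以“增益”为例、用灰度像素列表近似图像,并用dB到线性系数的换算近似成像效果,均为演示性假设;真实系统中第二图像由成像链路实时计算得到:

```python
def apply_gain(image: list, gain_db: float) -> list:
    """按dB换算线性系数放大灰度像素,近似增益预调节后的效果。"""
    scale = 10 ** (gain_db / 20)          # dB转线性系数
    return [min(255, round(p * scale)) for p in image]

def preview(initial: list, gain_db: float):
    """返回(第一图像, 第二图像),供显示界面左右并排展示。"""
    return initial, apply_gain(initial, gain_db)
```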
在一些实施例中,当两个或以上目标参数被选择时,处理设备140可以在显示界面上同时展示多个目标参数预调节前对应的初始图像、每个目标参数预调节后对应的图像、以及至少两个目标参数对初始图像共同作用时对应的图像。例如,目标参数可以包括第一目标参数和第二目标参数,处理设备140可以首先确定第一目标参数和第二目标参数预调节前对应的初始图像。处理设备140可以基于初始图像和第一目标参数,确定第一目标参数预调节后对应的第一子图像。处理设备140还可以基于初始图像和第二目标参数,确定第二目标参数预调节后对应的第二子图像。处理设备140还可以基于初始图像、第一目标参数和第二目标参数,确定第一目标参数和第二目标参数预调节后对应的第三子图像。处理设备140可以在显示界面上展示初始图像、第一子图像、第二子图像和第三子图像。具体地,当辅助功能触发指令为动态辅助功能触发指令时,第一目标参数和第二目标参数预调节前对应的初始图像可以为当前采集的图像。当辅助功能触发指令为静态辅助功能触发指令时,第一目标参数和第二目标参数预调节前对应的初始图像可以为预存数据库中预存的第一目标参数和第二目标参数的参数种类所对应的图像。
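同时选择两个目标参数时“初始图像+两幅子图像+共同作用的第三子图像”的组织方式,可以用如下草图示意。f1、f2代表两个目标参数各自对应的图像处理函数,属假设性抽象,此处还假设两个参数的作用可以依次复合:

```python
def preview_two_params(initial, f1, f2):
    """返回初始图像、第一/第二目标参数各自预调节后的第一/第二子图像,
    以及两个目标参数共同作用后的第三子图像。"""
    return {
        "initial": initial,
        "sub1": f1(initial),
        "sub2": f2(initial),
        "sub3": f2(f1(initial)),   # 假设两个参数的作用可以依次复合
    }
```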
在一些实施例中,在医疗设备进行实时扫查的过程中,基于当前采集图像,若用户想了解某个参数(例如,新设置的医疗设备的参数)对当前采集图像的应用效果(例如,优化效果),可以触发医疗设备的辅助功能,并设置想要了解的该参数的参数值,此时,医疗设备的显示设备上可以同时显示当前采集图像(即,第一图像)和该参数预调节后的第二图像。进一步地,若用户满意该参数对当前采集图像的优化效果,用户可以接受对该参数的调节。例如,用户可以点击显示界面上的“应用”按钮,此时,显示界面上可以只显示该参数调节过后对应的图像。
在一些实施例中,处理设备140还可以在显示界面上展示关于目标参数的描述(例如,物理意义、功能描述等),以辅助用户进行理解(如图9中930、图10中1030、图11中1130所示)。在一些实施例中,为便于用户观察目标参数预调节前后的变化,处理设备140还可以在第一图像和/或第二图像上进行标注。例如,处理设备140可以将第一图像和第二图像进行对比。处理设备140可以根据对比结果,在第一图像和/或第二图像上用标记符号(例如,如图9中915和925所示)标出图像信号差异高于阈值的地方。具体地,例如,针对超声设备中的“动态范围”参数,处理设备140可以在第二图像中用“○”圈出变亮或变暗程度最大的地方。进一步地,还可以在第一图像的相同位置用“○”进行标记。再例如,当将“深度”参数值调小时,处理设备140可以在第一图像中标记出相对于第二图像多出的结构。处理设备140还可以在标记的旁边用文字进行说明。
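对比第一图像与第二图像、找出信号差异高于阈值之处的标注逻辑,可以用如下草图示意。此处用灰度像素列表近似图像,阈值30为假设值;返回的下标可供界面用“○”等标记符号标注:

```python
def mark_differences(img_a: list, img_b: list, threshold: int = 30) -> list:
    """返回两幅图像中信号差异高于阈值的像素下标。"""
    return [i for i, (a, b) in enumerate(zip(img_a, img_b))
            if abs(a - b) > threshold]
```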
本说明书一些实施例中,通过将各个参数预调节前后对应的图像同时展示给用户,大大节省了用户了解各个参数的时间;通过以APP的形式安装在移动终端的静态辅助功能,方便用户随时随地了解想要学习的参数,提高用户体验;通过展示参数预调节前后的图像,用户可以先通过预览该参数预调节前后的效果,再决定是否对该参数进行调节,使用户更灵活地运用各个参数。
应当注意的是,上述有关流程400、500、800的描述仅仅是为了示例和说明,而不限定本说明书的适用范围。对于本领域技术人员来说,在本说明书的指导下可以对流程400、500、800进行各种修正和改变。然而,这些修正和改变仍在本说明书的范围之内。例如,步骤810和步骤820的顺序可以交换。
图9是根据本说明书一些实施例所示的超声设备的静态辅助功能中针对单个目标参数所对应的至少部分显示界面的示意图。如图9所示,目标参数为动态范围。当用户触发静态辅助功能时,显示界面900可以显示动态范围预调节前(例如,动态范围值为30)所对应的第一图像910及动态范围预调节后(例如,动态范围值为60)所对应的第二图像920。第一图像910和第二图像920均为静态图像。在一些实施例中,显示界面900的下方还可以显示关于动态范围的参数信息930。参数信息930可以包括动态范围的参数物理意义、动态范围的参数功能效果等。在一些实施例中,可以在第一图像910和第二图像920上标注出第一图像910和第二图像920之间差异较大的地方。例如,圆圈925对应的部分(即肝脏)的边界相较于圆圈915对应的部分的边界更清晰。通过显示界面900上展示的动态范围预调节前后的图像(图像910和920),可以使用户(例如,医生)更为直观地了解动态范围这个参数对图像的应用效果,大大节省了医生的时间。
图10是根据本说明书一些实施例所示的超声设备的静态辅助功能中针对组合目标参数所对应的至少部分显示界面的示意图。如图10所示,组合目标参数为“图像优化”参数,包括动态范围、增益和TGC。当用户触发静态辅助功能时,显示界面1000可以显示“图像优化”预调节前所对应的第一图像1010及“图像优化”预调节后所对应的第二图像1020。第一图像1010和第二图像1020均为静态图像。在一些实施例中,显示界面1000的下方还可以显示关于“图像优化”的参数信息1030。在一些实施例中,可以在第一图像1010和第二图像1020上标注出第一图像1010和第二图像1020之间差异较大的地方(例如,圆圈1015和1025所示的部分)。通过展示的“图像优化”预调节前后的图像,可以使用户更为直观地了解“图像优化”对图像的应用效果,大大节省了医生的时间。
图11是根据本说明书一些实施例所示的超声设备的动态辅助功能中针对单个目标参数所对应的至少部分显示界面的示意图。动态辅助功能中针对单个目标参数所对应的显示界面1100可以与图9中所示的静态辅助功能中针对单个目标参数所对应的显示界面900相似。区别在于动态辅助功能中显示的图像为动态图像。如图11所示,目标参数为动态范围。当用户触发动态辅助功能时,显示界面1100可以显示动态范围预调节前所对应的第一图像1110及动态范围预调节后所对应的第二图像1120。第一图像1110和第二图像1120均为动态图像。第一图像1110和第二图像1120可以根据用户移动探头而实时变化。在一些实施例中,显示界面1100的下方还可以显示关于动态范围的参数信息1130。在一些实施例中,可以在第一图像1110和第二图像1120上标注出第一图像1110和第二图像1120之间差异较大的地方(例如,圆圈1115和1125所示的部分)。通过展示的动态范围预调节前后的图像,可以使用户更为直观地了解动态范围参数对动态图像的应用效果,大大节省了医生的时间。
图12是根据本说明书一些实施例所示的超声设备的B模式下调节深度值的教学任务的示意图。
如图12所示的教学任务可以模拟B模式环境下的参数调节。通过查看模拟探头数据显示的超声图像,手动调节B模式的深度参数到一定的范围,并呈现调节前后的图像。其中,第一图像可以为调节前的图像,第二图像可以为调节后的图像。在该范围下呈现的图像质量明显优于未调节前的图像,即第二图像的质量明显优于第一图像的质量。这使得使用者能够通过动手实践学习到超声机器中如何调节深度参数使得图像达到想要的效果。
图12中所示的教学任务设定为接收到一个***点击事件(即ClickEvent节点)后触发,该事件如果是切换B模式的事件,则激活模拟的B模式。
如图12所示,该B模式切换节点(即ActiveBMode节点)可以接收三个输入数据。其中,Condition作为该节点逻辑运行的触发条件,ProbeData作为该模式的模拟探头数据,BModeParams作为该模式参数数据。使用设定好的模拟探头数据后,***将不再以真实探头数据作为输入。例如,该模拟数据是一个多帧肾脏超声数据,由于肾脏位于腹部深处,所以需要将depth参数调整到合适范围才能看到肾脏图像。图像参数设置节点(即BModeParams节点)将改变当前超声***下的参数设置情况。如图12所示,可以将Depth参数设置成5,其他依然使用***当前参数值。在上述参数设置的基础上,当前的模拟环境情况下产生的图像中看不到完整的肾脏。
激活到模拟环境下的B模式后,可以接着通过文本显示节点(即DisplayText节点)显示一串文字提示。例如,“调节Depth参数可以改变B模式下可见深度距离。想要看见更深的组织情况,调大参数;想要看到更大浅层组织图像,调小参数。现在请调节Depth参数,使整个肾脏组织清晰可见。”
上述提示可以一直显示在界面上,并且进入逻辑等待节点(即WaitForEvent节点),等待用户进行参数调整。
当***发出参数调整事件(例如,通过用户手动操作、自动调整等)时,可以触发参数调整事件节点(即ParamChangeEvent节点)。该节点的输出是***发出事件时带的两个数据,一个是参数名ParamName,另一个是参数值ParamValue。这两个参数值可以经过几个逻辑计算节点。在经过这些逻辑计算节点的逻辑运算后,当判断用户将Depth参数调整到15以上时,可以将true的condition输出到逻辑等待节点。
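ParamChangeEvent节点之后几个逻辑计算节点的判断,可以合并为如下草图。此处将“调整到15以上”理解为大于等于15,属假设性解读:

```python
def on_param_change(param_name: str, param_value: float) -> bool:
    """逻辑计算节点的判断:仅当被调整的是Depth参数且其值达到15以上时,
    才向逻辑等待节点输出condition=True。"""
    return param_name == "Depth" and param_value >= 15
```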
当逻辑等待节点收到condition输入数据为true时,可以继续执行下面的B模式图像比较节点(即BModeImageCompare节点)。
BModeImageCompare节点可以根据该节点的设置,对比某个特定输入参数的超声图像数据和参数,即对比实时图像与在某个设定区域内的图像。在一些实施例中,该对比模式可以是双幅图像模式。例如,左边一幅图像为实时图像(即第一图像)和实时参数,而右边一幅图像是上述设定的超声图像数据(即第二图像)和参数。用户退出对比模式后,可以进入结束节点(即EndTask节点)。
结束节点可以提示用户完成了B模式下Depth参数引导,可以在界面上进行文字等提示。例如,显示“恭喜你将Depth参数调整到了正确范围,现在可以看到完整的肾脏了,您已完成本次教学!”。并且在用户确认后退出该模拟的B模式环境,回到真实的***环境下。
本说明书实施例可能带来的有益效果包括但不限于:(1)通过响应于基于可视化界面触发的指令展示候选教学任务,根据指令从候选教学任务选中并运行目标教学任务,从而可以通过可视化的方式直观地选择并运行教学任务,界面友好,使用简单,方便了新用户等的使用;(2)通过基于可视化界面创建、编辑教学任务,进而运行教学任务,可以快速简便地创建教学任务,从而帮助用户更好地了解医疗设备(例如,超声设备等),能够通过引导用户动手实践掌握医疗设备的各项使用功能及操作;(3)通过可视化界面获取各种指令,从而创建、编辑教程节点,快速准确地创建出教学任务,降低了创建教学任务的复杂性,提高了创建效率;(4)通过将各个参数预调节前后对应的图像同时展示给用户,并通过各种静态和动态辅助功能,节省了用户了解各个参数的时间,提高用户体验,使用户能够直观地了解各个参数的意义以及调节效果,从而方便用户更快、更有效地掌握医疗设备的功能和操作。需要说明的是,不同实施例可能产生的有益效果不同,在不同的实施例里,可能产生的有益效果可以是以上任意一种或几种的组合,也可以是其他任何可能获得的有益效果。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述详细披露仅仅作为示例,而并不构成对本说明书的限定。虽然此处并没有明确说明,本领域技术人员可能会对本说明书进行各种修改、改进和修正。该类修改、改进和修正在本说明书中被建议,所以该类修改、改进、修正仍属于本说明书示范实施例的精神和范围。
同时,本说明书使用了特定词语来描述本说明书的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本说明书至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一个替代性实施例”并不一定是指同一实施例。此外,本说明书的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,除非权利要求中明确说明,本说明书所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本说明书流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本说明书实施例实质和范围的修正和等价组合。例如,虽然以上所描述的***组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的服务器或移动设备上安装所描述的***。
同理,应当注意的是,为了简化本说明书披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本说明书实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本说明书对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述成分、属性数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值参数应考虑规定的有效数位并采用一般位数保留的方法。尽管本说明书一些实施例中用于确认其范围广度的数值域和参数为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。
针对本说明书引用的每个专利、专利申请、专利申请公开物和其他材料,如文章、书籍、说明书、出版物、文档等,特此将其全部内容并入本说明书作为参考。与本说明书内容不一致或产生冲突的申请历史文件除外,对本说明书权利要求最广范围有限制的文件(当前或之后附加于本说明书中的)也除外。需要说明的是,如果本说明书附属材料中的描述、定义、和/或术语的使用与本说明书所述内容有不一致或冲突的地方,以本说明书的描述、定义和/或术语的使用为准。
最后,应当理解的是,本说明书中所述实施例仅用以说明本说明书实施例的原则。其他的变形也可能属于本说明书的范围。因此,作为示例而非限制,本说明书实施例的替代配置可视为与本说明书的教导一致。相应地,本说明书的实施例不仅限于本说明书明确介绍和描述的实施例。

Claims (36)

  1. 一种医疗设备的教学方法,由处理器执行,其特征在于,所述方法包括:
    响应于基于显示界面触发的使用指令,展示教学任务列表,其中,所述教学任务列表包括多个候选教学任务;
    基于所述教学任务列表,获取用户触发的选择指令;根据所述选择指令,确定目标教学任务。
  2. 如权利要求1所述的方法,其特征在于,所述候选教学任务为预先基于编辑界面所创建的。
  3. 如权利要求2所述的方法,其特征在于,所述编辑界面包括创建区域,在所述创建区域创建各所述候选教学任务的过程包括:
    基于所述创建区域获取创建指令,响应于所述创建指令展示节点菜单;
    基于所述节点菜单获取节点选择指令,所述节点选择指令中包括被选中的教程节点的节点标识;
    基于所述创建区域对所述节点标识对应的教程节点进行配置,得到每个教程节点的配置信息;以及
    根据所述教程节点和所述教程节点的配置信息,创建所述候选教学任务。
  4. 如权利要求3所述的方法,其特征在于,所述基于所述创建区域对所述节点标识对应的教程节点进行配置,得到每个教程节点的配置信息,包括:
    根据所述节点选择指令显示各所述教程节点的配置框;
    基于各所述教程节点的配置框,得到各所述教程节点的配置信息。
  5. 如权利要求3所述的方法,其特征在于,所述节点菜单包括基础节点菜单和超声应用节点菜单,所述基于所述创建区域获取创建指令,响应于所述创建指令展示节点菜单,包括:
    基于所述创建区域获取所述创建指令,响应于所述创建指令展示所述基础节点菜单和所述超声应用节点菜单。
  6. 如权利要求5所述的方法,其特征在于,所述节点选择指令包括第一节点选择指令和第二节点选择指令,所述基于所述节点菜单获取节点选择指令,包括:
    基于所述基础节点菜单获取所述第一节点选择指令;所述第一节点选择指令中包括被选中的教程节点的基础节点标识;以及
    基于所述超声应用节点菜单获取所述第二节点选择指令;所述第二节点选择指令中包括被选中的教程节点的超声应用节点标识。
  7. 如权利要求3所述的方法,其特征在于,所述编辑界面还包括信息提示区域,所述方法还包括:
    基于所述编辑界面和/或所述信息提示区域获取设置指令;以及
    响应于所述设置指令对各所述教程节点的节点数据和节点属性进行设置。
  8. 如权利要求2-7中任一项所述的方法,其特征在于,所述编辑界面还包括编辑区域,所述方法还包括:
    基于所述编辑区域获取编辑指令,其中,所述编辑指令指示待编辑节点;
    响应于所述编辑指令在所述编辑区域上展示所述待编辑节点的编辑界面;以及
    基于所述编辑界面获取编辑信息,根据所述编辑信息对所述待编辑节点进行校验操作、保存操作、设置操作和搜索操作中的至少一个。
  9. 如权利要求1所述的方法,其特征在于,所述方法还包括:
    运行所述目标教学任务。
  10. 如权利要求9所述的方法,其特征在于,所述运行所述目标教学任务包括:
    响应于与扫查协议相关联的多个初始参数中的目标参数被选择或调节,在所述显示界面上展示所述目标参数预调节前所对应的第一图像以及所述目标参数预调节后所对应的第二图像。
  11. 如权利要求9所述的方法,其特征在于,所述运行所述目标教学任务包括:
    将所述目标教学任务中的第一个教程节点存入栈中,循环执行以下步骤,直到所述栈中所有教程节点已取出:
    从所述栈的栈顶取出一个教程节点作为当前教程节点;
    响应于所述当前教程节点包含输入数据,且所述输入数据包含无效数据,将与所述无效数据对应的输入引脚相连的教程节点存入栈中;或
    响应于所述当前教程节点包含输入数据,且所述输入数据为有效数据,执行所述当前教程节点中包含的操作,并将所述当前教程节点的输出所对应的教程节点存入栈中。
  12. 如权利要求1所述的方法,其特征在于,所述根据所述选择指令,确定目标教学任务,还包括:
    创建新的教学任务作为所述目标教学任务,或对所述候选教学任务中的至少一个进行编辑后作为目标教学任务。
  13. 如权利要求12所述的方法,其特征在于,所述对所述候选教学任务中的至少一个进行编辑包括:
    将所述候选教学任务中的至少一个确定为待编辑教学任务;
    将所述待编辑教学任务解码成图形化界面;
    根据所述待编辑教学任务的脚本中的节点逻辑连接信息得到至少一个教程节点;
    基于所述至少一个教程节点的类型信息和位置信息,确定所述至少一个教程节点的显示位置;
    基于所述节点逻辑连接信息,将所述至少一个教程节点之间进行连接;
    基于所述至少一个教程节点的内部逻辑和设置信息,得到所述至少一个教程节点的名称信息、输入输出连接信息和数据连接信息并显示在所述编辑界面中。
  14. 如权利要求1所述的方法,其特征在于,所述方法进一步包括:
    更新所述多个候选教学任务中的至少一个。
  15. 一种教学任务创建方法,由处理器执行,其特征在于,所述方法包括:
    基于创建区域获取创建指令,并响应于所述创建指令展示节点菜单;
    基于所述节点菜单获取节点选择指令,所述节点选择指令中包括被选中的教程节点的节点标识;
    基于所述创建区域对所述节点标识对应的教程节点进行配置,得到每个教程节点的配置信息;以及
    根据所述教程节点和所述教程节点的配置信息,创建候选教学任务。
  16. 如权利要求15所述的方法,其特征在于,所述基于所述创建区域对所述节点标识对应的教程节点进行配置,得到每个教程节点的配置信息,包括:
    根据所述节点选择指令显示各所述教程节点的配置框;
    基于各所述教程节点的配置框,得到各所述教程节点的配置信息。
  17. 如权利要求15所述的方法,其特征在于,所述节点菜单包括基础节点菜单和超声应用节点菜单,所述基于创建区域获取创建指令,并响应于所述创建指令展示节点菜单,包括:
    基于所述创建区域获取所述创建指令,并响应于所述创建指令展示所述基础节点菜单和所述超声应用节点菜单。
  18. 如权利要求17所述的方法,其特征在于,所述节点选择指令包括第一节点选择指令和第二节点选择指令,所述基于所述节点菜单获取节点选择指令,包括:
    基于所述基础节点菜单获取所述第一节点选择指令,所述第一节点选择指令中包括被选中的教程节点的基础节点标识;以及
    基于所述超声应用节点菜单获取所述第二节点选择指令,所述第二节点选择指令中包括被选中的教程节点的超声应用节点标识。
  19. 如权利要求15所述的方法,其特征在于,所述候选教学任务基于编辑界面所创建,所述编辑界面还包括信息提示区域,所述方法还包括:
    基于所述编辑界面和/或所述信息提示区域获取设置指令;以及
    响应于所述设置指令对各所述教程节点的节点数据和节点属性进行设置。
  20. 如权利要求15所述的方法,其特征在于,所述方法还包括:
    基于各所述教程节点和各所述教程节点的配置信息生成所述候选教学任务对应的教学任务脚本;
    对所述教学任务脚本进行编译检查;
    响应于所述编译检查被通过,上传所述教学任务脚本;或
    响应于所述编译检查未被通过,提示所述教学任务脚本的设置错误位置。
  21. 如权利要求20所述的方法,其特征在于,所述上传所述教学任务脚本包括:
    将所述教学任务脚本上传到云平台。
  22. 如权利要求15所述的方法,所述教程节点包括基础节点和/或超声应用节点。
  23. 如权利要求22所述的方法,所述基础节点包括事件接收节点、事件发出节点、数学变量节点、逻辑计算节点、执行功能节点、展示节点、逻辑等待节点和结束节点中的至少一个;所述超声应用节点包括模式切换节点、测量包切换节点、探头模拟数据节点、图像参数设置节点、功能激活节点和B模式图像对比节点中的至少一个。
  24. 一种医疗设备的教学***,其特征在于,所述***包括:
    第一展示模块,用于响应于基于显示界面触发的使用指令,展示教学任务列表,其中,所述教学任务列表包括多个候选教学任务;
    第一获取模块,用于基于所述教学任务列表,获取用户触发的选择指令;根据所述选择指令,确定目标教学任务。
  25. 一种计算机可读存储介质,所述存储介质存储计算机指令,当计算机读取存储介质中的计算机指令后,计算机执行如权利要求1~14任一项所述的医疗设备的教学方法。
  26. 一种医疗设备辅助方法,其特征在于,所述方法包括:
    获取扫查协议,所述扫查协议与多个初始参数相关联;
    接收辅助功能触发指令;
    接收所述多个初始参数中被选择的目标参数;以及
    在显示界面上展示所述目标参数预调节前所对应的第一图像以及所述目标参数预调节后所对应的第二图像。
  27. 根据权利要求26所述的方法,其特征在于,所述方法进一步包括:
    响应于基于所述显示界面触发的使用指令,展示教学任务列表,其中,所述教学任务列表包括多个候选教学任务;
    基于所述教学任务列表,获取用户触发的选择指令;根据所述选择指令,确定目标教学任务,以执行所述方法。
  28. 根据权利要求27所述的方法,其特征在于,所述多个初始参数包括第一级初始参数和第二级初始参数,在接收辅助功能触发指令之后,所述方法还包括:
    基于所述扫查协议或所述辅助功能触发指令,在所述显示界面上展示所述第一级初始参数;以及
    响应于显示所述第二级初始参数的显示触发指令,在所述显示界面上展示所述第二级初始参数。
  29. 根据权利要求27所述的方法,其特征在于,当所述辅助功能触发指令为静态辅助功能触发指令时,所述在所述显示界面上展示所述目标参数预调节前所对应的第一图像以及所述目标参数预调节后所对应的第二图像包括:
    通过查询预存数据库,确定所述第一图像和所述第二图像并在所述显示界面上展示,其中,所述预存数据库包括所述多个初始参数的各个值对应的图像。
  30. 根据权利要求27所述的方法,其特征在于,当所述辅助功能触发指令为动态辅助功能触发指令时,
    所述在显示界面上展示所述多个初始参数之前包括:
    基于所述扫查协议采集初始图像;
    在所述显示界面上展示所述初始图像;
    所述在所述显示界面上展示所述目标参数预调节前所对应的第一图像以及所述目标参数预调节后所对应的第二图像包括:
    将所述初始图像作为所述第一图像;
    基于所述目标参数,处理所述第一图像来确定所述第二图像;以及
    在所述显示界面上展示所述第一图像和所述第二图像。
  31. 根据权利要求27所述的方法,其特征在于,所述目标参数包括第一目标参数和第二目标参数,其中,
    所述目标参数预调节前所对应的第一图像包括所述第一目标参数和所述第二目标参数预调节前所对应的图像,
    所述目标参数预调节后所对应的第二图像包括所述第一目标参数和所述第二目标参数预调节后对应的图像、所述第一目标参数预调节后所对应的图像、以及所述第二目标参数预调节后所对应的图像。
  32. 根据权利要求27所述的方法,其特征在于,所述第一图像和所述第二图像包括静态图像或动态图像。
  33. 根据权利要求27所述的方法,其特征在于,所述第一图像和所述第二图像中呈现的被扫描对象与所述扫查协议相关。
  34. 根据权利要求27所述的方法,其特征在于,所述医疗设备为超声设备,所述扫查协议包括探头种类、扫查模式中的至少一种。
  35. 一种医疗设备辅助***,其特征在于,所述***包括:
    获取模块,用于获取扫查协议,所述扫查协议与多个初始参数相关联;
    接收模块,用于接收辅助功能触发指令以及所述多个初始参数中被选择的目标参数;以及
    显示模块,用于在显示界面上展示所述目标参数预调节前所对应的第一图像以及所述目标参数预调节后所对应的第二图像。
  36. 一种计算机可读存储介质,所述存储介质存储计算机指令,当计算机读取存储介质中的计算机指令后,计算机执行如权利要求26-34中任一项所述的医疗设备辅助方法。
PCT/CN2022/112781 2021-08-26 2022-08-16 医疗设备及*** WO2023024974A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202110987077.2A CN114035725B (zh) 2021-08-26 2021-08-26 超声设备的教学方法、装置、超声成像设备和存储介质
CN202110987077.2 2021-08-26
CN202111102067.2A CN114052793A (zh) 2021-09-18 2021-09-18 一种医疗设备辅助方法、装置及存储介质
CN202111102067.2 2021-09-18

Publications (1)

Publication Number Publication Date
WO2023024974A1 true WO2023024974A1 (zh) 2023-03-02

Family

ID=85322497

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/112781 WO2023024974A1 (zh) 2021-08-26 2022-08-16 医疗设备及***

Country Status (1)

Country Link
WO (1) WO2023024974A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102682421A (zh) * 2012-03-14 2012-09-19 飞依诺科技(苏州)有限公司 超声图像的实时放大方法
CN109567860A (zh) * 2018-10-19 2019-04-05 深圳迈瑞生物医疗电子股份有限公司 超声成像方法、设备和存储介质
CN112199007A (zh) * 2020-09-01 2021-01-08 北京达佳互联信息技术有限公司 菜单显示方法、装置、电子设备和存储介质
CN112870706A (zh) * 2021-03-19 2021-06-01 腾讯科技(深圳)有限公司 教学内容的显示方法、装置、设备及存储介质
CN114035725A (zh) * 2021-08-26 2022-02-11 武汉联影医疗科技有限公司 超声设备的教学方法、装置、超声成像设备和存储介质

Similar Documents

Publication Publication Date Title
US10158806B2 (en) Camera system and method for aligning images and presenting a series of aligned images
US8843852B2 (en) Medical interface, annotation and communication systems
US10372802B2 (en) Generating a report based on image data
US8860717B1 (en) Web browser for viewing a three-dimensional object responsive to a search query
US11900266B2 (en) Database systems and interactive user interfaces for dynamic conversational interactions
US11164314B2 (en) Systems and methods for lesion analysis
JP5274180B2 (ja) 画像処理装置、画像処理方法、コンピュータプログラム及び記憶媒体
JP2016539411A (ja) 医療情報用の進化型コンテキスト臨床データエンジン
JP2021191429A (ja) 医療画像のアノテーションのための装置、方法、及びシステム
KR101576047B1 (ko) 의료 영상 판독 과정에서 구조화된 관심 영역 정보 생성 방법 및 그 장치
US20140149910A1 (en) Method of displaying medical image acquisition information and medical image display apparatus
US20130187911A1 (en) Image and Annotation Display
CN107003404A (zh) 使用多个显示器提供信息的方法和超声波设备
US20200175756A1 (en) Two-dimensional to three-dimensional spatial indexing
JP2011078527A (ja) 医用画像管理装置、及び医用画像表示装置
CN101118574A (zh) 基于规则的体绘制和导航的***和方法
WO2023024974A1 (zh) 医疗设备及***
CN108352184A (zh) 医用图像处理装置、可安装到医用图像处理装置中的程序和医用图像处理方法
JP2018138087A (ja) 超音波画像処理装置
RU2750278C2 (ru) Способ и аппарат для модификации контура, содержащего последовательность точек, размещенных на изображении
CN114052793A (zh) 一种医疗设备辅助方法、装置及存储介质
CN114035713A (zh) 一种超声扫查流程控制方法和***
Whelan et al. Informatics in Radiology (info RAD) NeatVision: Visual Programming for Computer-aided Diagnostic Applications
Bøe Comparative Visualization Of Longitudinal PET/CT Data For Acute Myeloid Leukemia Treatment Analysis
US20240197292A1 (en) Systems and methods for ultrasound examination

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22860320

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE