CN110376922B - Operating room scene simulation system - Google Patents


Info

Publication number
CN110376922B
CN110376922B (application CN201910666857.XA)
Authority
CN
China
Prior art keywords
scene
operating room
interactive object
user
interactive
Prior art date
Legal status
Active
Application number
CN201910666857.XA
Other languages
Chinese (zh)
Other versions
CN110376922A (en)
Inventor
陈玉冰
张立臣
叶典
曹祖晟
叶子成
江嘉伟
曾博
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology
Priority to CN201910666857.XA
Publication of CN110376922A
Application granted
Publication of CN110376922B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00: Systems involving the use of models or simulators of said systems
    • G05B17/02: Systems involving the use of models or simulators of said systems, electric
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes, for medicine

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Optimization (AREA)
  • Chemical & Material Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Medicinal Chemistry (AREA)
  • Algebra (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Analysis (AREA)
  • Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of virtual reality, and in particular to an operating room scene simulation system comprising a scene modeling module and a scene interaction module. The scene modeling module is used for constructing an operating room simulation scene and importing it into the scene interaction module; the simulation scene contains a plurality of interactive objects, including an operating bed, a shadowless lamp, an X-ray film reader, a monitor, and a multifunctional anesthesia machine. The scene interaction module is used for receiving control instructions sent by a user and, according to the control operation corresponding to each instruction, simulating a human body performing human-computer interaction with each interactive object in the operating room simulation scene. By applying the system, a user can observe the internal structure of the operating room in an intuitive form and, through various control instructions, simulate the operating procedures a human body would perform in the operating room, thereby improving the efficiency of medical teaching.

Description

Operating room scene simulation system
Technical Field
The invention relates to the technical field of virtual reality, in particular to an operating room scene simulation system.
Background
With the continuous development of network and scientific technology, virtual reality technology is becoming increasingly mature. By constructing three-dimensional environment information with virtual reality technology, real-world objects and environments can be realistically simulated; a user can then interact within the three-dimensional virtual environment, simulating the various behaviors and operations the user would perform in a real scene.
At present, with the continuous development of medical technology, the technical level of medical personnel and students needs further improvement, and they must continuously expand their professional knowledge, including the use and explanation of medical instruments, the diagnosis and treatment of various diseases, and the study of various anatomical operations on the human body.
Disclosure of Invention
The technical problem to be solved by the invention is to provide an operating room scene simulation system that constructs an operating room simulation scene, so that a user can observe the internal structure of an operating room in an intuitive form and, through various control instructions, simulate the operating procedures a human body would perform in the operating room, thereby improving the efficiency of medical teaching.
An operating room scene simulation system comprising:
the system comprises a scene modeling module and a scene interaction module;
the scene modeling module is used for constructing an operating room simulation scene and guiding the operating room simulation scene into the scene interaction module, the operating room simulation scene comprises a plurality of interaction objects, and the interaction objects comprise an operating table, a shadowless lamp, an X-ray film reader, a monitor and a multifunctional anesthesia machine;
the scene interaction module is used for receiving each control instruction sent by a user and simulating a human body to perform human-computer interaction with each interactive object in the operating room simulation scene according to the control operation corresponding to each control instruction.
The above system, optionally, the scene modeling module includes:
a setting unit and a modeling unit;
the setting unit is used for setting a placing position corresponding to each interactive object;
and the modeling unit is used for applying preset 3Ds Max software to construct the operating room simulation scene according to the placing position corresponding to each interactive object.
Optionally, the modeling unit of the system is further configured to store the constructed operating room simulation scene as a target format file and import the target format file into the module program corresponding to the scene interaction module.
The above system, optionally, the modeling unit includes:
the device comprises an operating bed adjusting subunit, a shadowless lamp adjusting subunit, a film reading device control subunit, a monitor control subunit and an anesthesia machine control subunit;
the operating bed adjusting subunit is used for receiving an operating bed adjusting instruction sent by the user and adjusting the bed height, handle position, and backrest angle of the operating bed;
the shadowless lamp adjusting subunit is used for receiving a shadowless lamp adjusting instruction sent by the user, adjusting the lighting position and light brightness of the shadowless lamp, and controlling the on/off state of the shadowless lamp;
the film reader control subunit is used for receiving a film reading control instruction sent by the user and controlling the on/off state of the X-ray film reader;
the monitor control subunit is used for receiving a monitor operation instruction sent by the user and controlling the on/off state of the monitor;
and the anesthesia machine control subunit is used for receiving an anesthesia machine operation instruction sent by the user and controlling the on/off state of the multifunctional anesthesia machine.
The above system, optionally, the scene interaction module includes:
a character controller, a collision detector, an interactable object property storage device, and a UI interface;
the character controller is used for receiving a control instruction sent by the user through the UI interface and simulating, according to the control instruction, the behavior of the human body interacting with each interactive object in the operating room simulation scene;
the collision detector is used for detecting whether the human body is within the collision range corresponding to each interactive object for human-computer interaction, determining the object information of the current interactive object corresponding to the current human-computer interaction, and triggering the UI interface to display the object details and operation method corresponding to the current interactive object according to the object attributes of the current interactive object stored in the interactive object characteristic storage device;
the interactive object characteristic storage device is used for setting the object attributes of each interactive object, storing the common attributes of all the interactive objects in a preset common set, and storing the object-specific attributes of each interactive object in a separate set corresponding to that interactive object;
and the UI interface is used for displaying the operating room simulation scene and the object details and operation method of each interactive object, receiving the control instruction corresponding to each operation method selected by the user on the UI interface, sending the control instruction to the character controller, and triggering the character controller to execute the control operation corresponding to each control instruction.
The system, optionally, the character controller includes:
a displacement controller, a rotation controller, a camera and a role collision controller;
the displacement controller is used for controlling the displacement of the human body in the operating room simulation scene when receiving a displacement control instruction sent by the user through the UI interface;
the rotation controller is used for controlling the human body to rotate within a preset angle range when receiving a rotation control instruction sent by the user through the UI interface;
the camera is used for shooting the object viewed by the human body in the operating room simulation scene according to a preset shooting visual angle;
and the character collision controller is used for controlling the human body to interact with the interactive object to be interacted with when receiving a touch control instruction sent by the user through the UI interface.
The above system, optionally, the collision detector, comprises:
an object configurator and a plurality of object detectors;
the object configurator is used for detecting the object form of each interactive object, adding a rigid-body attribute to the object according to its form, and configuring the collision range corresponding to each object form;
each object detector is arranged on the corresponding interactive object;
each object detector is used for detecting whether the human body has currently entered the collision range corresponding to the current interactive object; if the human body is within the collision range corresponding to the current interactive object, the UI interface is triggered to display the object introduction and operation method corresponding to the current interactive object.
Compared with the prior art, the invention has the following advantages:
the invention provides an operating room scene simulation system, which comprises: the system comprises a scene modeling module and a scene interaction module; the scene modeling module is used for constructing an operating room simulation scene and guiding the operating room simulation scene into the scene interaction module, the operating room simulation scene comprises a plurality of interaction objects, and the interaction objects comprise an operating table, a shadowless lamp, an X-ray film reader, a monitor and a multifunctional anesthesia machine; the scene interaction module is used for receiving each control instruction sent by a user and simulating human bodies to carry out human-computer interaction with each interactive object in the operating room simulation scene according to the control operation corresponding to each control instruction. By applying the system, a user can observe the internal structure of the operating room in a visual mode, and the operating process of various control operations of the human body in the operating room is simulated through various control instructions, so that the medical teaching efficiency is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention and the technical solutions in the prior art, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a system structure diagram of an operating room scene simulation system according to an embodiment of the present invention;
FIG. 2 is a diagram of another system architecture of an operating room scene simulation system according to an embodiment of the present invention;
fig. 3 is a further system structure diagram of an operating room scene simulation system according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In this application, relational terms such as first and second may be used solely to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between them. The terms "comprise", "comprises", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The invention is operational with numerous general purpose or special purpose computing device environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multi-processor apparatus, distributed computing environments that include any of the above devices or equipment, and the like.
The embodiment of the invention provides an operating room scene simulation system, the system structure diagram of which is shown in fig. 1, and the system specifically comprises:
a scene modeling module 100 and a scene interaction module 200;
the scene modeling module 100 is configured to construct an operating room simulation scene, and introduce the operating room simulation scene into the scene interaction module, where the operating room simulation scene includes a plurality of interactive objects, and the interactive objects include an operating table, a shadowless lamp, an X-ray film reader, a monitor, and a multifunctional anesthesia machine;
the scene interaction module 200 is configured to receive each control instruction sent by a user, and simulate a human body to perform human-computer interaction with each interactive object in the operating room simulation scene according to a control operation corresponding to each control instruction.
In the operating room scene simulation system provided by the embodiment of the invention, the scene modeling module is used for constructing an operating room simulation scene with preset modeling and drawing software. The operating room simulation scene can be constructed according to the internal structure of a modern operating room. It comprises various interactive objects, which are the medical instruments a user needs for medical treatment in the simulated scene, specifically an operating bed, a shadowless lamp, an X-ray film reader, a monitor, and a multifunctional anesthesia machine. After the operating room simulation scene is constructed, it is imported into the scene interaction module, which simulates a human body performing human-computer interaction with each interactive object. The user can send control instructions through the scene interaction module, and the scene interaction module executes the corresponding control operations according to the control instructions to realize the process of human-computer interaction.
It should be noted that the scene interaction module may be constructed with Unity 3D software. After the scene modeling module completes construction of the operating room simulation scene, the scene can be imported into the Unity software, and the human-computer interaction process is realized by writing the software program corresponding to the scene interaction module.
Optionally, in the scene modeling module, in addition to the operating table, the shadowless lamp, the X-ray film reader, the monitor, and the multifunctional anesthesia machine in the operating room simulation scene, other interactive objects may be added, for example: a scalpel, an oxygen cylinder, etc.
By applying the system provided by the embodiment of the invention, the scene simulation of the operating room is realized through the scene modeling module and the scene interaction module, and various control operations of a human body in the operating room are simulated through various control instructions, thereby improving the efficiency of medical teaching.
In the system provided in the embodiment of the present invention, referring to fig. 2, the scene modeling module 100 may specifically include: a setting unit 110 and a modeling unit 120;
the setting unit is used for setting a placing position corresponding to each interactive object;
and the modeling unit is used for applying preset 3Ds Max software and constructing the operating room simulation scene according to the corresponding placing position of each interactive object.
In the system provided by the embodiment of the invention, the operating room simulation scene is constructed by the modeling unit. The scene comprises a plurality of interactive objects, so the specific placing position of each interactive object is set first. Optionally, the spatial relationship between interactive objects can be set according to 3D coordinates; for example, the multifunctional anesthesia machine can be arranged on either side of the operating bed, and once the position of the operating bed is determined, the coordinate position corresponding to the multifunctional anesthesia machine can be set according to the current scene. The drawing is performed with the 3Ds Max drawing software.
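As a minimal sketch of this placement step (all names, coordinates, and offsets below are hypothetical illustrations, not values from the patent), the placing position of each interactive object can be expressed as 3D coordinates, with dependent objects positioned relative to the operating bed:

```python
# Illustrative sketch of the setting unit: a placing position for each
# interactive object, expressed as (x, y, z) coordinates in scene space.
# All names and coordinates here are hypothetical examples.

OPERATING_BED = (0.0, 0.0, 0.0)  # the bed anchors the scene layout

def place_relative(anchor, offset):
    """Return a position offset from an anchor object's coordinates."""
    return tuple(a + o for a, o in zip(anchor, offset))

placements = {
    "operating_bed": OPERATING_BED,
    "shadowless_lamp": place_relative(OPERATING_BED, (0.0, 2.5, 0.0)),    # above the bed
    "anesthesia_machine": place_relative(OPERATING_BED, (1.2, 0.0, 0.0)),  # beside the bed
    "monitor": place_relative(OPERATING_BED, (1.2, 1.0, 0.5)),
    "xray_film_reader": place_relative(OPERATING_BED, (-2.0, 1.5, 0.0)),
}
```

A table like this can then be handed to the modeling step, which instantiates each model at its coordinates.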
By applying the system provided by the embodiment of the invention, the 3Ds Max software is adopted to construct the operating room simulation scene, so that the internal structure of the operating room can be displayed more intuitively.
In the system provided in the embodiment of the present invention, optionally, the modeling unit 120 is further configured to store the constructed operating room simulation scene as a target format file and import the target format file into the module program corresponding to the scene interaction module.
In the system provided by the embodiment of the invention, after the operating room simulation scene is constructed in the 3Ds Max software, it can be saved as a target format file, namely a file with the FBX suffix, according to the file format corresponding to the 3Ds Max software. The target format file is then imported into the scene interaction module, so that the process of human-computer interaction can be realized through the operating room simulation scene.
In the system provided in the embodiment of the present invention, referring to fig. 3, the modeling unit 120 specifically includes:
an operating bed adjusting subunit 1201, a shadowless lamp adjusting subunit 1202, a film reader control subunit 1203, a monitor control subunit 1204 and an anesthesia machine control subunit 1205;
an operating bed adjusting subunit 1201, configured to receive an operating bed adjusting instruction sent by the user, and adjust a bed height, a handle position, and a backrest angle of the operating bed;
the shadowless lamp adjusting subunit 1202 is configured to receive a shadowless lamp adjusting instruction sent by the user, adjust the lighting position and the light brightness of the shadowless lamp, and control the on/off of the shadowless lamp;
the film reader control subunit 1203 is configured to receive a film reading control instruction sent by the user, and control a switch of the X-ray film reader;
the monitor control subunit 1204 is configured to receive a monitor operation instruction sent by the user, and control a switch of the monitor;
the anesthesia machine control subunit 1205 is configured to receive an anesthesia machine operation instruction sent by the user, and control a switch of the multifunctional anesthesia machine.
In the system provided by the embodiment of the present invention, the modeling unit may include a control subunit or adjusting subunit for each interactive object. The operating bed adjusting subunit can be used to adjust the operating bed in the operating room simulation scene, such as the bed height, the handle position, and the backrest angle. The shadowless lamp adjusting subunit can adjust the brightness of the lamp light and the lighting position, move the lamp to change the lighting position, and switch the shadowless lamp on and off. The film reader control subunit, the monitor control subunit, and the anesthesia machine control subunit are used to switch the X-ray film reader, the monitor, and the multifunctional anesthesia machine on and off, respectively.
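A minimal sketch of such an adjusting subunit might look as follows; the class name, parameter ranges, and defaults are assumptions for illustration, not values specified by the patent:

```python
# Hypothetical sketch of the operating-bed adjusting subunit: it receives an
# adjustment instruction from the user and clamps each parameter to an assumed
# valid range. The ranges and defaults below are illustrative only.

def clamp(value, lo, hi):
    """Restrict value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

class OperatingBedAdjuster:
    def __init__(self):
        self.bed_height_cm = 75.0     # assumed default bed height
        self.handle_position = 0.0    # normalized handle position in [-1, 1]
        self.backrest_angle_deg = 0.0

    def adjust(self, bed_height_cm=None, handle_position=None, backrest_angle_deg=None):
        """Apply an operating-bed adjusting instruction sent by the user."""
        if bed_height_cm is not None:
            self.bed_height_cm = clamp(bed_height_cm, 60.0, 110.0)
        if handle_position is not None:
            self.handle_position = clamp(handle_position, -1.0, 1.0)
        if backrest_angle_deg is not None:
            self.backrest_angle_deg = clamp(backrest_angle_deg, 0.0, 80.0)
```

The other subunits (lamp, film reader, monitor, anesthesia machine) would follow the same pattern with their own parameters and on/off switches.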
Optionally, in addition to the operating table adjusting subunit, the shadowless lamp adjusting subunit, the film reader controlling subunit, the monitor controlling subunit and the anesthesia machine controlling subunit, if other interactive objects are also constructed in the operating room simulation scene, the modeling unit may set the controlling subunit corresponding to the interactive object according to the function and the using mode of the interactive object.
By applying the system provided by the invention, the operability of each interactive object can be more vividly displayed through the control subunit or the adjusting subunit corresponding to each interactive object, the function of each interactive object is more truly reflected, and a medical student can more intuitively know the operation method of each instrument in the learning process.
In the system provided in the embodiment of the present invention, referring to fig. 2, the scene interaction module 200 includes:
character control 210, collision detector 220, interactable object characteristics storage 230, and UI interface 240;
the role controller 210 is configured to receive a control instruction sent by the user through the UI interface, and simulate, according to the control instruction, a behavior of the human body interacting with each of the interactive objects in the operating room simulation scene;
the collision detector 220 is configured to detect whether the human body performs human-computer interaction in a collision range corresponding to each interactive object, determine object information of a current interactive object corresponding to current human-computer interaction, and trigger the UI interface to display object details and an operation method corresponding to the current interactive object according to the object attribute of the current interactive object stored in the interactive object characteristic storage device;
the interactive object property storage device 230 is configured to set an object property of each interactive object, and store a common property of all the interactive objects in a preset common set, where a different property of each interactive object is stored in a different property set corresponding to each interactive object;
the UI interface 240 is configured to display the operating room simulation scene and the object details and the operation methods of the interactive objects, receive a control instruction corresponding to each operation method selected on the UI interface by the user, send the control instruction to the role controller, and trigger and receive the control operation corresponding to each control instruction executed by the role controller.
In the system provided by the embodiment of the invention, the character controller simulates a human body performing human-computer interaction. When the user sends a control instruction through the UI interface, the human body in the operating room simulation scene is triggered to perform the human-computer interaction behavior corresponding to the control instruction, such as walking, turning, or touching; the human body performs the control operation from the user's first-person perspective in the operating room simulation scene. The collision detector is mainly used to detect whether the human body in the operating room simulation scene is currently performing a human-computer interaction behavior. If the human body interacts with a current interactive object, the object information of that object is determined, namely what the interactive object is and what it is called. After the object information of the current interactive object is determined, the UI interface is triggered to display the object details and operation method of the current interactive object according to the object attributes corresponding to the current interactive object stored in the interactive object characteristic storage device. For example, if the current interactive object is an operating bed, the UI interface is triggered to display information such as the structure, classification, and usage of the operating bed according to the object attributes of the operating bed. The object attributes of each interactive object are stored by the interactive object characteristic storage device; specifically, the common attributes of all interactive objects are stored in a preset common set, and the object-specific attributes of each interactive object are stored in a separate set corresponding to that interactive object.
For example, common attributes of an operating bed, a shadowless lamp, an X-ray film reader, a monitor and a multifunctional anesthesia machine include:
1) Interactive prompts need to be displayed;
2) An interactive UI interface needs to be displayed;
3) Whether a human body enters a collision range of an interactive object needs to be marked;
4) Event monitoring needs to be dynamically added to controls in the interactive UI interface;
If the simulated human body enters the collision range, an enter instruction is generated, and according to the enter instruction the human body is marked as currently performing human-computer interaction; similarly, if the human body leaves the collision range, an exit instruction is generated, and the original mark is cancelled according to the exit instruction, completing the human-computer interaction. Event monitoring refers to monitoring the control subunit or adjusting subunit corresponding to each interactive object and determining whether the current interactive object is performing human-computer interaction with the human body.
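The enter/exit marking logic described above can be sketched as follows, assuming a simple spherical collision range; the class and method names are hypothetical and not the patent's implementation:

```python
# Sketch of an object detector: it marks the simulated human body on entering
# an interactive object's collision range and cancels the mark on exit.
# A spherical range and the names below are illustrative assumptions.

import math

class ObjectDetector:
    def __init__(self, name, center, radius):
        self.name = name
        self.center = center        # (x, y, z) collision-range center
        self.radius = radius        # collision-range radius
        self.in_range = False       # the "mark": is the body interacting now?

    def update(self, body_position):
        """Return 'enter' or 'exit' when the mark changes, else None."""
        inside = math.dist(body_position, self.center) <= self.radius
        if inside and not self.in_range:
            self.in_range = True    # enter instruction: mark the interaction
            return "enter"
        if not inside and self.in_range:
            self.in_range = False   # exit instruction: cancel the mark
            return "exit"
        return None
```

On an "enter" result the UI interface would be triggered to show the object's introduction and operation method.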
The UI interface is the interactive interface through which the real operating user performs simulated interactive operations with the operating room simulation scene. The user operates through the UI interface; each control instruction selected by the user on the UI interface is sent to the character controller, and the character controller executes the corresponding control operation according to the control instruction.
It should be noted that, in the interactive object characteristic storage device, the common set may specifically be an InteractiveObj class, namely a common-attribute interaction class. The object-specific sets corresponding to the interactive objects may specifically include the classes Lamp, Bed, Invigilator, XRayMonitor, and Anasethesia. The Lamp class corresponds to the interactive implementation of the shadowless lamp; the Bed class corresponds to the interactive implementation of the operating bed; the Invigilator class corresponds to the interactive implementation of the monitor; the XRayMonitor class corresponds to the interactive implementation of the X-ray film reader; and the Anasethesia class corresponds to the interactive implementation of the multifunctional anesthesia machine. During interaction, not only are the function introduction and operation method of the related instrument available, but the instrument itself can also be operated, so that the layout of the operating room is fully understood and the user can learn and practice as if immersed in the real environment.
In this embodiment, the separation between the common and object-specific attributes of each interactive object can be realized with object-oriented polymorphism, that is, by abstracting the interactive objects so that common and specific behavior are separated. If a new type of interactive object is added, it only needs to inherit from the InteractiveObj class and implement its own specific behavior; the common behavior shared among interactive objects does not need to be reimplemented. Furthermore, if another class needs to call a method of an interactive object, it does not need to consider the concrete type of the object, because every such object is an InteractiveObj; the object itself determines which method should actually be called.
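This polymorphic design can be sketched as follows. The class names InteractiveObj, Lamp, and Bed come from the patent text (Invigilator, XRayMonitor, and Anasethesia would be defined analogously); the attributes and method bodies are illustrative assumptions:

```python
# Sketch of the common/specific separation: a base InteractiveObj class holds
# the shared attributes and behavior, and each instrument subclass implements
# only its own interaction. Method bodies are illustrative assumptions.

class InteractiveObj:
    """Common-attribute interaction class shared by every interactive object."""

    def __init__(self, name):
        self.name = name
        self.body_in_range = False  # marked when the body enters the collision range

    def show_prompt(self):
        """Common behavior: the interactive prompt displayed by the UI interface."""
        return f"Press to interact with {self.name}"

    def interact(self):
        """Object-specific behavior, implemented by each subclass."""
        raise NotImplementedError

class Lamp(InteractiveObj):
    """Interactive implementation of the shadowless lamp."""

    def __init__(self):
        super().__init__("shadowless lamp")
        self.on = False

    def interact(self):
        self.on = not self.on
        return f"{self.name} {'on' if self.on else 'off'}"

class Bed(InteractiveObj):
    """Interactive implementation of the operating bed."""

    def __init__(self):
        super().__init__("operating bed")
        self.height_cm = 75.0

    def interact(self):
        return f"{self.name} height {self.height_cm} cm"

def trigger(obj: InteractiveObj):
    """A caller need not know the concrete type: every object is an InteractiveObj."""
    return obj.interact()
```

Adding a new instrument then means writing one more subclass; `trigger` and the rest of the system remain unchanged.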
By applying the system provided by the embodiment of the invention, human-computer interaction in the operating room simulation scene is realized through the cooperation of the character controller, the collision detector, the interactable object property storage device and the UI interface, and the efficiency of medical teaching is improved.
In the system provided in the embodiment of the present invention, referring to fig. 3, the character controller 210 includes:
a displacement controller 2101, a rotation controller 2102, a camera 2103, and a character collision controller 2104;
the displacement controller 2101 is configured to control the displacement of the human body in the operating room simulation scene when receiving a displacement control instruction sent by the user through the UI interface;
the rotation controller 2102 is configured to control the human body to rotate within a preset angle range when receiving a rotation control instruction sent by the user through the UI interface;
the camera 2103 is used for shooting an object watched by the human body in the operating room simulation scene according to a preset shooting angle of view;
the character collision controller 2104 is configured to, when a touch control instruction sent by the user through the UI interface is received, control the human body to interact with an interactive object to be interacted.
In the system provided by the embodiment of the invention, the displacement controller can control the human body to move when receiving a displacement control instruction sent by the user through the UI interface. The displacement control instructions include forward, backward, left and right movement; that is, the human body can move forward, backward, leftward and rightward in the operating room simulation scene. The rotation controller can control the human body to turn its head or body, rotating according to a preset rotation angle. The camera is equivalent to the eyes of the human body: it realistically simulates the human body observing the surroundings of the operating room simulation scene and displays the captured content to the user through the UI interface. The character collision controller controls the human body to perform human-computer interaction with the object to be interacted with when a touch control instruction is received.
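The displacement and rotation controllers described above can be sketched as one character controller. This is a hypothetical illustration (the patent discloses no source code): the class name, the specific ±60° pitch limit, and the movement formulas are assumptions, chosen only to show displacement relative to the facing direction and rotation clamped to a preset angle range.

```python
import math

class CharacterController:
    MAX_PITCH = 60.0  # preset rotation range: assumed +/-60 degrees for this sketch

    def __init__(self):
        self.x, self.z = 0.0, 0.0   # position on the operating-room floor plane
        self.yaw = 0.0              # body orientation in degrees
        self.pitch = 0.0            # head tilt in degrees

    def move(self, forward, strafe, speed=1.0):
        # Displacement control: forward/backward and left/right movement,
        # projected onto the direction the body currently faces.
        rad = math.radians(self.yaw)
        self.x += speed * (forward * math.sin(rad) + strafe * math.cos(rad))
        self.z += speed * (forward * math.cos(rad) - strafe * math.sin(rad))

    def rotate(self, d_yaw, d_pitch):
        # Rotation control: yaw wraps around freely, while pitch is clamped
        # so the head cannot twist beyond the preset angle range.
        self.yaw = (self.yaw + d_yaw) % 360.0
        self.pitch = max(-self.MAX_PITCH,
                         min(self.MAX_PITCH, self.pitch + d_pitch))
```

A UI layer would translate button or key events into `move`/`rotate` calls, and a camera placed at the controller's position and orientation would then render what the human body "sees".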
By applying the system provided by the embodiment of the invention, the interaction of the human body with the operating room simulation scene is realized through the cooperation of the displacement controller, the rotation controller, the camera and the character collision controller, the realism of the scene is ensured, and the efficiency of medical teaching is further improved.
In the system provided in the embodiment of the present invention, referring to fig. 3, the collision detector 220 includes:
an object configurator 2201 and a plurality of object detectors 2202;
the object configurator 2201 is configured to detect an object form of each of the interactive objects, add an object rigid body attribute according to the object form of each of the interactive objects, and configure a collision range corresponding to each of the object forms;
each of the object detectors 2202 is disposed on an interactive object corresponding thereto;
each object detector 2202 is configured to detect whether the human body has currently entered the collision range corresponding to the interactive object associated with that object detector 2202; and if the human body has entered the collision range corresponding to the current interactive object, trigger the UI interface to display the object introduction and operation method corresponding to the current interactive object.
In the system provided by the embodiment of the invention, in the operating room simulation scene, the object configurator detects the form of each interactive object and adds a rigid body attribute to it. The rigid body attribute simulates a real physical object, giving it solidity, mass, surface smoothness and the like; without it, the human body could pass straight through an object, which does not occur in real life. For example, rigid body attributes are attached to the housing of the multifunctional anesthesia machine, and to the bed board, handrails and supports of the operating bed. When the human body performs human-computer interaction, the effect is equivalent to a real user touching a real object, and no interpenetration occurs. Since the shape, posture and components of each interactive object differ, a different rigid body attribute can be set for each interactive object. An object detector is arranged on each interactive object; that is, each interactive object has its own dedicated object detector. The object detector detects whether the human body has entered the collision range corresponding to that interactive object, and if so, the object introduction and operation method corresponding to the interactive object are displayed through the UI interface.
It should be noted that each object detector may set a collision range suited to the object it detects. For example, the object detector corresponding to the bed surface of the operating bed may be a mesh collider (MeshCollider), which automatically generates a collision range matching the size of the bed surface; for the multifunctional anesthesia machine, the X-ray film reader and the like, a resource-saving box collider (BoxCollider) is added instead, and its size is adjusted to fit each object. The collision range corresponding to each interactive object is thus set by a MeshCollider or a BoxCollider.
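The per-object detection described above can be sketched as follows. This is a simplified illustration, not the patent's implementation: a plain axis-aligned box stands in for a BoxCollider-style collision range, and all names and dimensions are assumptions. It shows the one-detector-per-object arrangement and the UI trigger that fires when the human body enters a range.

```python
class BoxRange:
    """Axis-aligned box collision range, sized to fit one object."""

    def __init__(self, center, size):
        self.center, self.size = center, size

    def contains(self, point):
        # Inside the box iff each coordinate is within half the extent of center.
        return all(abs(p - c) <= s / 2
                   for p, c, s in zip(point, self.center, self.size))


class ObjectDetector:
    """One detector per interactive object, as in the embodiment above."""

    def __init__(self, name, collision_range, show_on_ui):
        self.name = name
        self.range = collision_range
        self.show_on_ui = show_on_ui  # callback that drives the UI interface
        self.inside = False

    def update(self, body_position):
        entered = self.range.contains(body_position)
        if entered and not self.inside:
            # The human body has just entered the collision range: trigger
            # the UI to display the introduction and operation method.
            self.show_on_ui(self.name)
        self.inside = entered


# Example: a detector whose box range fits an anesthesia-machine footprint.
messages = []
detector = ObjectDetector(
    "Anasethesia",
    BoxRange(center=(2.0, 0.0, 3.0), size=(1.5, 2.0, 1.5)),
    show_on_ui=messages.append,
)
detector.update((0.0, 0.0, 0.0))   # far away: nothing shown
detector.update((2.2, 0.5, 3.1))   # inside the range: UI triggered once
detector.update((2.2, 0.5, 3.1))   # still inside: not re-triggered
```

A mesh-fitted range (the MeshCollider case) would replace `BoxRange` with a test against the object's actual surface geometry; the detector logic around it stays the same.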
By applying the system provided by the embodiment of the invention, the object configurator and the object detectors detect and configure the interactive objects, so that a user can really experience the interactive process, and the medical teaching efficiency is further improved.
Based on the system provided by the above embodiments, the human body is a virtual user that performs human-computer interaction in the operating room simulation scene, simulating a real user executing various control operations. The specific implementation procedures of the above embodiments, and derivatives thereof, are within the scope of the present invention.
All the embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from other embodiments. In particular, the system or system embodiments, which are substantially similar to the method embodiments, are described in a relatively simple manner, and reference may be made to some descriptions of the method embodiments for relevant points. The above-described system and system embodiments are only illustrative, wherein the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Those of skill would further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. An operating room scene simulation system, comprising:
the system comprises a scene modeling module and a scene interaction module;
the scene modeling module is used for constructing an operating room simulation scene and importing the operating room simulation scene into the scene interaction module, wherein the operating room simulation scene comprises a plurality of interactive objects, and the interactive objects comprise an operating bed, a shadowless lamp, an X-ray film reader, a monitor and a multifunctional anesthesia machine;
the scene interaction module is used for receiving each control instruction sent by a user and simulating a human body to perform human-computer interaction with each interactive object in the operating room simulation scene according to the control operation corresponding to each control instruction;
and the collision detector is used for detecting whether the human body has entered a collision range corresponding to any interactive object for human-computer interaction, determining object information of the current interactive object corresponding to the current human-computer interaction, and triggering a UI interface to display the object details and operation method corresponding to the current interactive object according to the object attribute of the current interactive object stored in the interactable object property storage device.
2. The system of claim 1, wherein the scene modeling module comprises:
a setting unit and a modeling unit;
the setting unit is used for setting a placing position corresponding to each interactive object;
and the modeling unit is used for applying preset 3Ds Max software and constructing the operating room simulation scene according to the corresponding placing position of each interactive object.
3. The system of claim 2, wherein the modeling unit is further configured to save the constructed operating room simulation scene as an object format file, and import the object format file into a module program corresponding to the scene interaction module.
4. The system of claim 2, wherein the modeling unit comprises:
the device comprises an operating bed adjusting subunit, a shadowless lamp adjusting subunit, a film reading device control subunit, a monitor control subunit and an anesthesia machine control subunit;
the operating bed adjusting subunit is used for receiving an operating bed adjusting instruction sent by the user and adjusting the bed height, the handle position and the backrest angle of the operating bed;
the shadowless lamp adjusting subunit is used for receiving a shadowless lamp adjusting instruction sent by the user, adjusting the lighting position and the light brightness of the shadowless lamp, and controlling the on-off of the shadowless lamp;
the film reader control subunit is used for receiving a film reading control instruction sent by the user and controlling the switch of the X-ray film reader;
the monitor control subunit is used for receiving a monitor operation instruction sent by the user and controlling the switch of the monitor;
and the anesthesia machine control subunit is used for receiving the anesthesia machine operation instruction sent by the user and controlling the switch of the multifunctional anesthesia machine.
5. The system of claim 1, wherein the scene interaction module comprises:
a character controller, a collision detector, an interactable object property storage device, and a UI interface;
the character controller is used for receiving a control instruction sent by the user through the UI interface and, according to the control instruction, simulating the behavior of the human body performing human-computer interaction with each interactive object in the operating room simulation scene;
the collision detector is used for detecting whether the human body has entered a collision range corresponding to any interactive object for human-computer interaction, determining object information of the current interactive object corresponding to the current human-computer interaction, and triggering the UI interface to display the object details and operation method corresponding to the current interactive object according to the object attributes of the current interactive object stored in the interactable object property storage device;
the interactable object property storage device is used for setting the object attribute of each interactive object, storing the common attributes of all interactive objects in a preset common set, and storing the individual attributes of each interactive object in an individual set corresponding to that interactive object;
the UI interface is used for displaying the operating room simulation scene and the object details and the operation method of each interactive object, receiving the control instruction corresponding to each operation method selected by the user on the UI interface, sending the control instruction to the role controller, and triggering and receiving the role controller to execute the control operation corresponding to each control instruction.
6. The system of claim 5, wherein the character controller comprises:
a displacement controller, a rotation controller, a camera and a role collision controller;
the displacement controller is used for controlling the human body displacement of the human body in the operating room simulation scene when receiving a displacement control instruction sent by the user through the UI interface;
the rotation controller is used for controlling the human body to rotate within a preset angle range when receiving a rotation control instruction sent by the user through the UI;
the camera is used for shooting an object watched by the human body in the operating room simulation scene according to a preset shooting visual angle;
and the character collision controller is used for controlling the human body to interact with the interactive object to be interacted when receiving a touch control instruction sent by the user through the UI.
7. The system of claim 5, wherein the collision detector comprises:
an object configurator and a plurality of object detectors;
the object configurator is used for detecting the object form of each interactive object, increasing the rigid body attribute of the object according to the object form of each interactive object, and configuring the collision range corresponding to each object form;
each object detector is arranged on the corresponding interactive object;
each object detector is used for detecting whether the human body has currently entered the collision range corresponding to the current interactive object; and if the human body has entered the collision range corresponding to the current interactive object, triggering the UI interface to display the object introduction and operation method corresponding to the current interactive object.
CN201910666857.XA 2019-07-23 2019-07-23 Operating room scene simulation system Active CN110376922B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910666857.XA CN110376922B (en) 2019-07-23 2019-07-23 Operating room scene simulation system

Publications (2)

Publication Number Publication Date
CN110376922A CN110376922A (en) 2019-10-25
CN110376922B true CN110376922B (en) 2022-10-21

Family

ID=68255081

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910666857.XA Active CN110376922B (en) 2019-07-23 2019-07-23 Operating room scene simulation system

Country Status (1)

Country Link
CN (1) CN110376922B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113730715B (en) * 2021-10-15 2023-10-03 核工业总医院 Remote anesthesia auxiliary control method and device, electronic equipment and storage medium

Citations (10)

Publication number Priority date Publication date Assignee Title
CN101441831A (en) * 2008-11-17 2009-05-27 江苏科技大学 Virtual operation artificial system based on force feedback
CN102254475A (en) * 2011-07-18 2011-11-23 广州赛宝联睿信息科技有限公司 Method for realizing endoscopic minimal invasive surgery simulated training 3D platform system
CN105303605A (en) * 2015-10-26 2016-02-03 哈尔滨理工大学 Orthopedic surgery operation simulation system on the basis of force feedback
CN107229388A (en) * 2017-05-02 2017-10-03 中南民族大学 A kind of Looper's height control operative training system and training method
CN108335754A (en) * 2018-03-30 2018-07-27 赵东生 A kind of interactive mode interventional cardiac procedures simulator
CN108919954A (en) * 2018-06-29 2018-11-30 蓝色智库(北京)科技发展有限公司 A kind of dynamic change scene actual situation object collision exchange method
CN109002167A (en) * 2018-08-07 2018-12-14 浙江冰峰科技有限公司 Eyeball tracking analogy method, device and wear display equipment
CN109192030A (en) * 2018-09-26 2019-01-11 郑州大学第附属医院 True hysteroscope Minimally Invasive Surgery simulation training system and method based on virtual reality
CN109285225A (en) * 2018-10-15 2019-01-29 东北大学 A kind of method for building up of the virtual reality auxiliary operation based on medical image
CN109791801A (en) * 2017-06-29 2019-05-21 威博外科公司 Virtual reality training, simulation and cooperation in robotic surgical system

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
CN102254476B (en) * 2011-07-18 2014-12-10 广州赛宝联睿信息科技有限公司 Endoscopic minimally invasive surgery simulation training method and system
CN103680279A (en) * 2013-12-31 2014-03-26 广州赛宝联睿信息科技有限公司 Cystoscope surgery simulated training method and system
CN105656956A (en) * 2014-11-12 2016-06-08 梁廷洲 Digital compound operation room remote maintenance system
CN106200982A (en) * 2016-07-26 2016-12-07 南京意斯伽生态科技有限公司 A kind of medical surgery training system based on virtual reality technology
CN109961682A (en) * 2017-12-26 2019-07-02 深圳先进技术研究院 Virtual plastic operation training device and terminal device
CN208808665U (en) * 2018-02-11 2019-05-03 深圳市汇健医疗工程有限公司 A kind of artificial intelligence operating room
CN108648548A (en) * 2018-04-19 2018-10-12 浙江工业大学 A kind of neuro-surgery virtual operation training system
CN109065147A (en) * 2018-07-30 2018-12-21 广州狄卡视觉科技有限公司 Medical Digital 3D model human body surgical simulation human-computer interaction system and method
CN109308739A (en) * 2018-10-11 2019-02-05 南京工程学院 A kind of soft tissue Minimally Invasive Surgery training method based on virtual reality
CN109658772B (en) * 2019-02-11 2021-01-26 三峡大学 Operation training and checking method based on virtual reality
CN109979600A (en) * 2019-04-23 2019-07-05 上海交通大学医学院附属第九人民医院 Orbital Surgery training method, system and storage medium based on virtual reality
CN109979266A (en) * 2019-04-30 2019-07-05 邵阳学院 A kind of human anatomy 3D tutoring system


Similar Documents

Publication Publication Date Title
US10438415B2 (en) Systems and methods for mixed reality medical training
CN108701429B (en) Method, system, and storage medium for training a user of a robotic surgical system
CN109432753B (en) Action correcting method, device, storage medium and electronic equipment
Hamrol et al. Virtual 3D atlas of a human body–development of an educational medical software application
CN105374251A (en) Mine virtual reality training system based on immersion type input and output equipment
Samosky et al. BodyExplorerAR: enhancing a mannequin medical simulator with sensing and projective augmented reality for exploring dynamic anatomy and physiology
CN107993545A (en) Children's acupuncture training simulation system and emulation mode based on virtual reality technology
WO2019203952A1 (en) Systems and methods for applications of augmented reality
CN110376922B (en) Operating room scene simulation system
Dhanasree et al. Hospital emergency room training using virtual reality and leap motion sensor
Zhou et al. H-GOMS: a model for evaluating a virtual-hand interaction system in virtual environments
Mihaľov et al. Potential of low cost motion sensors compared to programming environments
CN110942519A (en) Computer assembly virtual experiment system and implementation method thereof
KR101505174B1 (en) Methods and apparatuses of an learning simulation model using images
US10692401B2 (en) Devices and methods for interactive augmented reality
Chen A virtual environment system for the comparative study of dome and hmd
RU2494441C1 (en) Interactive learning complex
Shen et al. Immersive haptic eye tele-surgery training simulation
KR102684281B1 (en) Providing display pannel talble for education in oriental medicine
CN203520303U (en) Interactive device
Bressler A virtual reality training tool for upper limp prostheses
Bibb Determining principles for the development of mixed reality systems for command and control applications
König Design and evaluation of novel input devices and interaction techniques for large, high-resolution displays
Balani Investigation of Interaction Metaphors for Augmented and Virtual Reality on Multi-Platforms for Medical Applications
Sousa FUN2HELPELDERLY-COGNITIVE STIMULATION AND REHABILITATION OF HAND-EYE COORDINATION IN THE ELDERLY THROUGH SERIOUS GAMES

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant