CN117976246A - Instruction interaction method and system for doctors and patients - Google Patents


Info

Publication number
CN117976246A
Authority
CN
China
Prior art keywords
radar
target object
radar device
preset distance
instruction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211318867.2A
Other languages
Chinese (zh)
Inventor
闵佳乐
王晗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202211318867.2A priority Critical patent/CN117976246A/en
Publication of CN117976246A publication Critical patent/CN117976246A/en
Pending legal-status Critical Current

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The embodiments of this specification disclose a method and a system for instruction interaction between doctors and patients. The method comprises the following steps: acquiring detection data of a target object through a radar device; determining a motion pattern of the target object based on the detection data; and outputting the motion pattern and/or an instruction corresponding to the motion pattern to a console.

Description

Instruction interaction method and system for doctors and patients
Technical Field
The present disclosure relates to the field of information technology, and in particular, to a method and a system for instruction interaction between doctors and patients.
Background
The patient may need to communicate (interact) with the physician during scan imaging. For example, the patient may experience physical discomfort during a long scan and be unable to continue, in which case the patient needs to interact with the physician in time to indicate that the scan should be stopped. In some interaction schemes, the patient can instruct the technician to stop the scan by squeezing a warning ball. However, such operation is inconvenient for the patient; for example, the scanning bore is relatively dark, making it difficult for the patient to locate the warning ball.
In view of this, it is desirable to provide an interaction scheme that is convenient for the patient to operate.
Disclosure of Invention
One of the embodiments of the present disclosure provides a method for instruction interaction between doctors and patients, which may include: acquiring detection data of a target object through a radar device; determining a motion pattern of the target object based on the detection data; and outputting the motion pattern and/or an instruction corresponding to the motion pattern to a console.
In some embodiments, the detection data may include radar echo signals, and determining the motion pattern of the target object based on the detection data may include: for each preset distance unit of a plurality of preset distance units, acquiring the radar echo signal corresponding to that preset distance unit, and obtaining an image texture feature corresponding to that preset distance unit based on the radar echo signal; and processing the plurality of image texture features corresponding to the plurality of preset distance units with a preset model to determine the motion pattern of the target object.
In some embodiments, the plurality of preset distance units may be determined by: setting corner reflectors at one or more parts of a reference object to strengthen the radar echo signals scattered by the one or more parts; identifying the enhanced radar echo signal from the received radar echo signals to measure the distance unit in which the one or more parts are located; and determining the plurality of preset distance units based on multiple measurements of the distance units in which the one or more parts are located.
In some embodiments, the plurality of image texture features may be obtained by: obtaining a range-Doppler matrix based on each frame of radar echo signals, wherein each row of the range-Doppler matrix corresponds to one preset distance unit and is obtained based on the radar echo signal corresponding to that preset distance unit; each time the radar device receives a frame of radar echo signals, stitching the N range-Doppler matrices corresponding to the previous N frames of radar echo signals to obtain radar image data, where N is an integer greater than 1; and extracting the image texture features corresponding to the preset distance units from the radar image data.
In some embodiments, the radar device may include a first radar device for detecting at least a head of the target object and a second radar device for detecting at least a limb portion of the target object.
In some embodiments, the sum of the detection ranges of the first radar device and the second radar device can cover all parts of the target object.
In some embodiments, the operating frequencies of both the first radar device and the second radar device are not less than 60 GHz.
In some embodiments, the instructions may include a call instruction and a stop-scan instruction; the call instruction instructs that a call be opened between the target object and the console, and the stop-scan instruction instructs that the scan be stopped.
One of the embodiments of the present disclosure provides a system for instruction interaction between doctors and patients, which may include an acquisition module, a determination module, and an output module. The acquisition module is used to acquire detection data of the target object through a radar device. The determination module is used to determine a motion pattern of the target object based on the detection data. The output module is used to output the motion pattern and/or an instruction corresponding to the motion pattern to the console.
One of the embodiments of the present specification provides an instruction interaction apparatus for use between doctors and patients, including a scanning device, a first radar device and a second radar device disposed on the scanning device, a processor, and a console. The first radar device is used for detecting the head of the target object, and the second radar device is used for detecting the limb part of the target object. The processor is used for processing the detection data of the first radar device and the second radar device and sending the processing result to the console.
The embodiments of this specification provide a method and a system for instruction interaction between doctors and patients. The method detects the target object through a radar device and determines the motion pattern of the target object based on the detection data, so that the console can further determine the instruction corresponding to the motion pattern. In this manner, during scan imaging the patient can conveniently interact with the console (e.g., instruct it to stop the scan) through his or her own motion (e.g., a hand-waving motion).
Drawings
The present specification will be further elucidated by way of example embodiments, which will be described in detail with reference to the accompanying drawings. These embodiments are non-limiting; in the drawings, like numerals represent like structures, wherein:
FIG. 1 is a schematic illustration of an application scenario of a medical system according to some embodiments of the present description;
FIG. 2 is a schematic diagram illustrating the mounting location of radar apparatus at different perspectives according to some embodiments of the present description;
FIG. 3 is an integrated schematic diagram of a scanning device and a radar device shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary block diagram of an instruction interaction system for use between doctors and patients, according to some embodiments of the present description;
FIG. 5 is an exemplary flow chart of a method for instruction interaction between a doctor and a patient according to some embodiments of the present description;
FIG. 6 is an exemplary flow chart of radar signal/data processing shown in accordance with some embodiments of the present description;
FIG. 7 is a schematic view of the structure of a transmitting part and a receiving part of a radar device according to some embodiments of the present specification;
FIG. 8 is a schematic diagram of I/Q channel signal generation shown in accordance with some embodiments of the present description;
FIG. 9 is a schematic diagram of receive antenna beamforming according to some embodiments of the present description;
FIG. 10 is a schematic diagram of a process for processing data per frame according to some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies of different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this specification, the terms "a," "an," and/or "the" are not intended to be limiting, but rather are to be construed as covering the singular and the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the steps and elements are explicitly identified; they do not constitute an exclusive list, as a method or apparatus may also include other steps or elements.
A flowchart is used in this specification to describe the operations performed by the system according to embodiments of the present specification. It should be appreciated that the preceding or following operations are not necessarily performed precisely in order. Rather, the steps may be processed in reverse order or simultaneously. Also, other operations may be added to or removed from these processes.
Fig. 1 is a schematic view of an application scenario of a medical system according to some embodiments of the present description.
As shown in fig. 1, medical system 100 may include a scanning device 110, a radar device 120, a processor 130, and a console 140.
The scanning device 110 is used for acquiring scan data of a scan object. The scan data may be used for reconstruction of medical images, so the scanning device may also be referred to as a medical imaging device. In some embodiments, the medical imaging device 110 may transmit the scan data to other devices (e.g., the processor 130) so that image reconstruction is performed by those devices. In some embodiments, image reconstruction may also be performed by the medical imaging device 110.
In this specification, an object may refer to a person or a part of a human body, for example, a patient or a part of his body receiving a scan. For convenience of description, the terms "subject" and "patient" may be used interchangeably without causing ambiguity.
In some embodiments, scanning device 110 may receive information and/or instructions from console 140. For example, the scanning device 110 may receive scan parameters set by a user (e.g., a physician) from the console 140. As another example, scanning device 110 may receive various instructions from console 140 including, but not limited to, instructions to turn on a scan, instructions to stop a scan, instructions to turn on a call, instructions to adjust a scan room temperature.
In some embodiments, the scanning device 110 may include a CT (Computed Tomography) device, a PET (Positron Emission Computed Tomography) device, an MRI (Magnetic Resonance Imaging) device, or the like, or any combination thereof.
A radar apparatus (radar for short) is an electronic apparatus that detects targets using electromagnetic waves. The radar device 120 may be used to acquire detection data of a target object, where the target object may refer to a patient or a part of a patient's body. The detection data may be used to determine a movement pattern of the target object, e.g., to determine an action made by the patient, which may serve as an instruction. For more details on determining the movement pattern, reference may be made to FIG. 5 and its related description.
The radar may include one or more transmit antennas, a transmit (denoted Tx) portion, one or more receive antennas, and a receive (denoted Rx) portion. The Tx part may include a signal generation module and a Tx link. The Rx part may include an Rx link and a signal acquisition module. For a detailed description of radar internal signal flow, reference may be made to the relevant description of fig. 6 to 10.
In some embodiments, radar device 120 may be electromagnetically shielded to avoid interference between scanning device 110 and radar device 120.
In some embodiments, as shown in FIG. 1, a single radar device may be used to detect a target object. In some embodiments, two or more radar devices may be used to detect a target object. For example, as shown in FIG. 2, two radar devices (denoted a first radar device and a second radar device) may be used to achieve a larger detection range, where the first radar device detects at least the head of the target object and the second radar device detects at least the limbs of the target object.
In some embodiments, the sum of the detection ranges of the first radar device and the second radar device can cover all parts of the target object, which helps identify a movement pattern of any part of the patient's whole body.
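This coverage requirement can be illustrated with a small interval-union check; the sketch below (an illustration, not an implementation from the specification) treats each device's detection range as an interval projected along the patient's length and verifies that the intervals jointly cover the whole body:

```python
def ranges_cover(intervals, start, end):
    """Return True if the union of [lo, hi] intervals covers [start, end].

    intervals: list of (lo, hi) detection ranges, one per radar device,
    measured along the same axis as [start, end] (e.g., patient length in m).
    """
    pos = start
    for lo, hi in sorted(intervals):
        if lo > pos:
            # A gap before this interval begins: coverage fails.
            return False
        pos = max(pos, hi)
    return pos >= end

# Two overlapping device ranges covering a 1.8 m patient.
two_devices = [(0.0, 1.0), (0.8, 1.8)]
covered = ranges_cover(two_devices, 0.0, 1.8)
```

With `two_devices` as above, `covered` is `True`; removing the overlap (e.g., ranges ending at 0.7 m and starting at 0.9 m) leaves a blind zone and the check fails.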
The radar device 120 may be disposed on the scanning device 110. Specifically, the radar device 120 may be disposed on a coil of the scanning device 110 forming a scanning aperture. As shown in fig. 2, the radar device 120 may be embedded in a coil portion located above a patient (a scanner bed).
FIG. 3 is an integrated schematic diagram of a scanning device and a radar device according to some embodiments of the present description. When the scanning device 110 is a magnetic resonance imaging device, a ring-shaped (cylindrical) multi-layer structure is provided inside it, comprising a Volume Transmit Coil (VTC), gradient coils, and a main magnet. The radar front end is responsible for transceiving and processing radar signals; for example, the radar front end may identify user instructions based on received echo signals. As shown in FIG. 3, the radar front end may be embedded in the portion of the volume transmit coil above the scan object (e.g., the coil portion facing the patient). A Power and Data Transfer Unit (PWDT) based on an FPGA (Field Programmable Gate Array) is responsible for supplying power to the radar front end and communicates with it in real time through two optical fibers, e.g., receiving radar data (such as user instructions) and controlling the radar state. The PWDT may also communicate with a Scan Control Center (SCC) via the console 140. The SCC may be considered part of the scanning device 110 and runs a preset pulse program to enable gradient control of the gradient coils and signal transmission of the VTC (and thus scan imaging).
In some embodiments, radar device 120 may include one or more of millimeter wave radar, ultrasonic radar, laser radar, and the like.
The spatial resolution of a radar device (which reflects its detection accuracy) is closely related to its operating frequency, i.e., the frequency of the electromagnetic waves it emits. The higher the frequency, the higher the spatial resolution. The spatial resolution (detection accuracy) may be reflected in aspects such as distance, angle, and speed. In some embodiments, the operating frequency of the radar device 120 may be not less than 60 GHz. Specifically, the radar device 120 may include a millimeter wave radar; the millimeter wave band generally refers to 30-300 GHz. Besides high spatial resolution, millimeter wave radars are also small and light, which makes them particularly suitable for installation in scanning devices with limited installation conditions (for example, little available installation space). When a plurality of radar devices (e.g., two) are used, each may be a millimeter wave radar with an operating frequency not lower than 60 GHz.
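For a rough sense of scale, the carrier wavelength at 60 GHz and the theoretical FMCW range resolution can be computed as below. The 4 GHz sweep bandwidth is an assumed example value for illustration, not a figure from this specification:

```python
C = 3e8  # speed of light, m/s

def wavelength(f_hz: float) -> float:
    """Carrier wavelength in metres for a given operating frequency."""
    return C / f_hz

def range_resolution(bandwidth_hz: float) -> float:
    """Theoretical FMCW range resolution c / (2B); depends on the
    sweep bandwidth B rather than on the carrier frequency itself."""
    return C / (2 * bandwidth_hz)

lam_60ghz = wavelength(60e9)       # 5 mm carrier wavelength at 60 GHz
res_4ghz = range_resolution(4e9)   # 3.75 cm range bins for an assumed 4 GHz sweep
```

The millimetre-scale wavelength is what makes small-amplitude motion (e.g., a constrained hand wave inside the bore) produce measurable Doppler shifts.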
It is worth noting that the movement space of the patient in the scanning device is limited by factors such as the scanning aperture, i.e. the movement amplitude is limited. Therefore, the use of high-precision radars is of great importance for accurately identifying the patient's movement patterns (especially small amplitude movements).
The processor 130 may be configured to process the detection data of the radar device 120 and transmit the processing result to the console 140. In some embodiments, the processing result may include the motion pattern of the target object and/or an instruction corresponding to the motion pattern.
In some embodiments, the processor 130 may include one or more of an ARM (Advanced RISC Machine) processor, a DSP (Digital Signal Processor), an FPGA, and the like. In some embodiments, at least a portion of the processor 130 may be integrated with the radar device 120.
Console 140 may be used to control other devices within the system 100 and/or devices external to the system 100. In a real-world scenario, the environment in which the console 140 is located may be isolated from the environment in which the scanning device 110 is located. For example, the scanning device 110 may be installed inside the scanning room, while the console 140 may be installed outside the scanning room. In some embodiments, during a scan the scanning device 110 may be controlled through the console 140 to turn the scan on and/or off. In some embodiments, the system 100 may also include a call system (not shown) through which the patient may talk to the console 140 side, and the console 140 may control the state of the call system (e.g., on/off). In some embodiments, the console 140 may control the state (e.g., on/off, cooling/heating) of the scan room's refrigeration and/or heating systems.
The console 140 may receive the motion pattern of the target object and/or instructions corresponding to the motion pattern from the processor 130 to enable patient interaction with the console 140. For example only, the patient may be told before the scan that a hand-waving motion indicates a need to stop the scan; after identifying the patient's hand wave, the processor 130 may output the hand-waving pattern and/or a stop-scan instruction to the console 140. Further, the physician at the console 140 can, after confirming safety, control the scanning device 110 to stop scanning through the console 140.
The console 140 may have an output device (e.g., a display) to present the movement pattern of the target object and/or instructions corresponding to the movement pattern.
Fig. 4 is an exemplary block diagram of an instruction interaction system for use between doctors and patients, according to some embodiments of the present description. In some embodiments, system 400 may be implemented on processor 130.
As shown in fig. 4, the system 400 may include an acquisition module 410, a determination module 420, and an output module 430.
The acquisition module 410 may be used to acquire detection data of a target object through a radar device.
In some embodiments, the radar device may include a first radar device and a second radar device, wherein the first radar device may be configured to detect at least a head of the target object and the second radar device may be configured to detect at least a limb portion of the target object. In some embodiments, the sum of the detection ranges of the first radar device and the second radar device can cover all parts of the target object. In some embodiments, the operating frequency of both the first radar device and the second radar device is not less than 60GHz.
The determination module 420 may be configured to determine a movement pattern of the target object based on the detection data.
In some embodiments, the detection data may include radar echo signals, and the determination module 420 may be further configured to: for each preset distance unit of a plurality of preset distance units, acquire the radar echo signal corresponding to that preset distance unit, and obtain an image texture feature corresponding to that preset distance unit based on the radar echo signal; and process the plurality of image texture features corresponding to the plurality of preset distance units with a preset model to determine the motion pattern of the target object.
In some embodiments, the determination module 420 may be further configured to: each time the radar device receives a frame of radar echo signals, stitch the N range-Doppler matrices corresponding to the previous N frames of radar echo signals to obtain radar image data, where N is an integer greater than 1; and, for each of the plurality of preset distance units, extract the image texture feature corresponding to that preset distance unit from the radar image data.
The output module 430 may be configured to output the motion pattern and/or an instruction corresponding to the motion pattern to a console.
In some embodiments, the instructions may include a call instruction and a stop-scan instruction; the call instruction instructs that a call be opened between the target object and the console, and the stop-scan instruction instructs that the scan be stopped.
For more details on system 400 and its modules, reference may be made to FIG. 5 and its associated description.
It should be understood that the system shown in fig. 4 and its modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may then be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or special purpose design hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such as provided on a carrier medium such as a magnetic disk, CD or DVD-ROM, a programmable memory such as read only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system of the present specification and its modules may be implemented not only with hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, etc., or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also with software executed by various types of processors, for example, and with a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system and its modules is for convenience of description only and is not intended to limit the present description to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily or a subsystem may be constructed in connection with other modules without departing from such principles. For example, in some embodiments, the determination module 420 and the output module 430 may be two modules or may be combined into one module. Such variations are within the scope of the present description.
Fig. 5 is an exemplary flow chart of a method for instruction interaction between a doctor and a patient according to some embodiments of the present description. In some embodiments, the process 500 may be performed by the processor 130, and in particular, the instruction interaction system 400 implemented on the processor 130 for use between doctors and patients.
In step 510, detection data of a target object is acquired by a radar device. In some embodiments, step 510 may be performed by the acquisition module 410.
The target object refers to an individual being scanned, e.g., a patient. The detection data of the target object may refer to detection data of various parts of the target object, for example, of the head, arms, legs, and the like.
In some embodiments, the detection data may include one or more of radar echo signals, signals resulting from radar signal processing of the echo signals, signal analysis results, and the like. In some embodiments, the radar signal processing may include one or more of filtering (e.g., filtering out noise), amplifying, mixing, beamforming, and the like. In some embodiments, the signal analysis results may include one or more of the signal spectrum and the target's range, bearing, speed, and the like.
The acquisition module 410 may send instructions to a radar device (e.g., radar device 120) to control the radar device to emit electromagnetic waves to the target object (whole body region, or partial body region). In turn, the radar device may receive the radar echo signals scattered by the target object. It will be appreciated that the radar echo signal may be used directly as detection data or as a source signal for acquiring detection data.
Step 520, determining a movement pattern of the target object based on the detection data. In some embodiments, step 520 may be performed by determination module 420.
The movement pattern of the target object may indicate an intention (which may be considered as an instruction) of the target object. The patient may learn the intent of one or more movement pattern indications prior to scanning in order to convey his or her intent to the console during the scanning. The physician may also grasp the intent of the one or more movement pattern indications to determine the patient's intent after observing the patient's movement pattern through the console. In addition, the console may store a correspondence between the movement pattern and the instruction in advance so as to automatically convert the movement pattern of the target object into the corresponding instruction.
The motion pattern of the target object may belong to a predefined set of motion patterns. For example only, a waving motion may be predefined as a motion pattern, which may be used to indicate to stop scanning, or to indicate to turn on a call with the console.
Step 520 may be implemented by any pattern recognition means. For example only, the determination module 420 may identify the motion pattern of the target object by means of a machine learning model (also referred to as a classifier). The input of the classifier may comprise the detection data or the result of processing the detection data.
In some embodiments, the detection data may include radar echo signals (echo signals for short). For each preset distance unit of the plurality of preset distance units, the determination module 420 may obtain the echo signal corresponding to that preset distance unit, and obtain an image texture feature corresponding to that preset distance unit based on the echo signal. Furthermore, the determination module 420 may process the plurality of image texture features corresponding to the plurality of preset distance units with a preset model to determine the motion pattern of the target object.
The preset model is a machine learning model trained according to a preset algorithm; its input is the plurality of image texture features corresponding to the plurality of preset distance units, and its output is the motion pattern of the predicted object (such as the target object). In some embodiments, the preset model may include one or more of a neural network, a decision tree, a logistic regression model, a support vector machine, and the like.
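As an illustration only (the specification does not disclose the concrete feature set or model weights), the per-distance-unit texture feature and the preset model could be sketched as follows, using a simple adjacent-pixel contrast feature and a toy nearest-centroid classifier standing in for a trained neural network or SVM:

```python
import numpy as np

def contrast_feature(img: np.ndarray, levels: int = 8) -> float:
    """A single GLCM-style texture feature: mean squared difference
    between horizontally adjacent quantized pixels. A stand-in for
    whatever richer texture features a deployed system would extract."""
    q = np.clip((img * levels).astype(int), 0, levels - 1)
    left, right = q[:, :-1], q[:, 1:]
    return float(np.mean((left - right) ** 2))

class NearestCentroid:
    """Toy 'preset model': assign the motion pattern whose mean
    training feature vector is closest to the input features."""

    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        self.centroids_ = {
            c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
            for c in self.labels_
        }
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        return min(self.labels_,
                   key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Hypothetical 2-feature vectors (one feature per preset distance unit).
model = NearestCentroid().fit(
    [[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]],
    ["still", "still", "wave", "wave"],
)
```

A static scene yields low texture contrast in the radar image, so a feature vector near the origin classifies as "still", while strong Doppler texture classifies as "wave".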
It is understood that the distance units in which one or more parts of the target object are located may be selected as the plurality of preset distance units. For example only, when the preset motion patterns include only a hand-waving motion, the distance units in which the arm is located may be selected as the plurality of preset distance units. Compared with processing radar data (such as echo signals) for all distance units, processing radar data only for the preset distance units reduces the amount of data processing, thereby increasing the computation (pattern recognition) speed and helping convey the patient's needs in time.
In some embodiments, the plurality of preset distance units may be predetermined through multiple measurements. Specifically, a corner reflector may first be placed at one or more parts of the reference object to strengthen the echo signals scattered/reflected by those parts. The enhanced echo signal may then be identified from the received echo signals to measure the distance unit in which the one or more parts are located. The plurality of preset distance units may be determined based on multiple measurements of the distance units in which the one or more parts are located; for example, the multiple measurement results may be merged to obtain the plurality of preset distance units.
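A minimal sketch of this calibration step, assuming the corner reflector yields the strongest return in the range profile (so its range bin can be found by a simple peak search) and that repeated measurements are merged by taking their union:

```python
import numpy as np

def measure_range_bin(range_profile: np.ndarray) -> int:
    """Pick the bin with the strongest echo magnitude. A corner
    reflector dominates the return, so argmax locates its range bin."""
    return int(np.argmax(np.abs(range_profile)))

def preset_bins(measurements: list[list[int]]) -> list[int]:
    """Merge the bins found across repeated measurements into one
    sorted set of preset distance units."""
    bins: set[int] = set()
    for m in measurements:
        bins.update(m)
    return sorted(bins)

# Synthetic range profile with a reflector-like peak at bin 10.
profile = np.zeros(64, dtype=complex)
profile[10] = 5.0
```

Repeating the measurement with the reflector at each body part of interest, and across slight positioning variations, yields the merged set of preset distance units for that body type.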
It is understood that the reference object may be the target object itself or an object with the same (similar) body size as the target object. In practical applications, different patients may be divided into a plurality of body types; body type may be determined by height and weight, for example, a combination of a height range and a weight range may be regarded as one body type, and for each predefined body type, the plurality of preset distance units under that body type may be determined.
In some embodiments, the determination module 420 may obtain a range-Doppler matrix based on each frame of radar echo signals, where each row (denoted Line_i) of the range-Doppler matrix corresponds to a preset distance unit (denoted Bin_i) and is obtained based on the radar echo signal corresponding to that unit. On this basis, each time the radar device receives a frame of echo signals, the determination module 420 may stitch the N range-Doppler matrices corresponding to the previous N frames of radar echo signals to obtain radar image data, where N is an integer greater than 1. For more details on obtaining the range-Doppler matrix (RD matrix for short), reference may be made to the relevant descriptions of FIGS. 6 to 10.
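The per-frame processing described above can be sketched as follows, assuming a standard FMCW pipeline in which a frame of complex baseband samples (chirps x samples) is turned into an RD matrix by two FFTs, and the matrices of the last N frames are concatenated into one radar image; the exact transform used in the specification may differ:

```python
import numpy as np

def range_doppler(frame: np.ndarray) -> np.ndarray:
    """frame: (n_chirps, n_samples) complex baseband samples.
    FFT along fast time gives range; FFT along slow time gives Doppler
    (shifted so zero Doppler sits in the middle). The result is
    transposed so that each ROW corresponds to one range bin."""
    rng = np.fft.fft(frame, axis=1)                        # fast time -> range
    rd = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)  # slow time -> Doppler
    return np.abs(rd).T                                    # rows = range bins

def stitch(rd_mats: list[np.ndarray]) -> np.ndarray:
    """Concatenate the last N range-Doppler matrices along the Doppler
    axis, forming one radar 'image' per newly received frame."""
    return np.concatenate(rd_mats, axis=1)

# A static (all-ones) frame: 8 chirps of 16 samples each.
frame = np.ones((8, 16), dtype=complex)
rd = range_doppler(frame)          # shape (16 range bins, 8 Doppler bins)
img = stitch([rd, rd, rd])         # N = 3 frames -> (16, 24) image
```

For this static frame all energy lands at zero range frequency and zero Doppler, i.e., in the centre Doppler bin of range row 0, which is the behaviour texture features would pick up on when the scene starts moving.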
And step 530, outputting the motion mode and/or the instruction corresponding to the motion mode to a console. In some embodiments, step 530 may be performed by output module 430.
After learning the motion pattern of the target object and/or the instruction corresponding to it, a staff member of the console (such as a physician) can determine the patient's intention (e.g., that the patient's hand wave indicates the scan should stop), thereby achieving interaction between the patient and the doctor. Compared with operating a physical trigger device (e.g., a warning ball), this interaction mode is far more convenient for the patient.
In some embodiments, the instructions may include a talk instruction and a stop scan instruction. The call instruction may instruct to start a call with the console for the target object, and the scan stop instruction may instruct to stop scanning.
Different motion patterns may correspond to different instructions. For example only, the shaking motion may correspond to a talk command, the waving motion may correspond to a stop scan command, and the lifting motion may correspond to a command to increase room temperature.
In some embodiments, the correspondence between movement patterns and instructions may be adjusted for a particular patient. For example, for a patient whose arm is injured and difficult to move, the movement pattern corresponding to the stop-scan instruction may be changed from a hand-waving action to an action of a part that can still move normally, such as a head-shaking or leg-lifting action. In some embodiments, the correspondence between movement patterns and instructions may be set according to the patient's preference. For example, for each instruction in a set of preset instructions, the patient may choose a corresponding movement pattern from a set of preset movement patterns.
Fig. 6 is an exemplary flow chart for obtaining an RD matrix according to some embodiments of the present description. It is understood that flow 600 is a data processing flow for a single radar device.
As shown in fig. 6, the flow 600 may include the following steps.
At step 610, an electromagnetic wave signal is transmitted to a target object by a radar device.
Referring to fig. 7, a Tx (transmit) part of the radar apparatus may include a signal generation module and a Tx link, and a signal generated by the signal generation module is transmitted to a transmit antenna via the Tx link.
The signal generation module is responsible for generating the waveforms to be transmitted. It typically operates at baseband and usually consists of a Direct Digital Synthesizer (DDS) that generates samples corresponding to the desired waveform. To avoid spectral aliasing, the DDS output is filtered by a low-pass filter whose cut-off frequency is limited by the sampling frequency. The Tx link is responsible for up-converting the baseband signal into the radio-frequency range and then transmitting it over the air, which can be accomplished by a mixer and/or a frequency multiplier. Amplifiers and filters may also be provided in the Tx link to further shape the spectrum to be transmitted. A high-power amplifier, generally placed at the end of the Tx chain, may also be provided to increase the output signal level, thereby improving the detection of distant objects.
The radar device may have one or more transmitting antennas that may transmit frequency modulated continuous waves (Frequency Modulated Continuous Wave, FMCW) at the same frequency and initial phase.
For example only, for any time instant (denoted as t), the radar transmit signal (denoted as s_t(t)) may be represented as follows:

s_t(t) = A*exp(j*2π*(f_c*t + B*t²/(2T))) (1)

Wherein A is the amplitude (power) of the radar transmit signal, f_c is the frequency (operating frequency) of the transmitted electromagnetic wave, B is the operating bandwidth (the variation range of the operating frequency f_c), and T is the signal duration. The signal s_t(t), also called a chirp signal, is characterized by a linear variation of frequency with time (see fig. 10); its frequency waveform consists of a succession of pulses, the time interval between adjacent pulses being T. The duration of one frame, whether of the transmit signal or the echo signal, comprises a set number (greater than 2) of pulse intervals T.
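The chirp signal s_t(t) described above can be generated numerically. This is a minimal sketch in which every parameter value (carrier, bandwidth, duration, sample rate) is an illustrative assumption, not taken from the patent:

```python
import numpy as np

def chirp(fc=60e9, B=4e9, T=40e-6, fs=10e6, A=1.0):
    """One FMCW pulse s_t(t) = A*exp(j*2*pi*(fc*t + B*t**2/(2*T))).
    fc: carrier frequency, B: sweep bandwidth, T: pulse duration, fs: sample rate.
    """
    n = int(round(T * fs))          # number of samples in one pulse
    t = np.arange(n) / fs           # sample instants within the pulse
    S = B / T                       # chirp slope: frequency change per second
    return A * np.exp(2j * np.pi * (fc * t + 0.5 * S * t ** 2))

pulse = chirp()
```

The complex exponential has unit magnitude everywhere, so the generated pulse carries the linear frequency sweep purely in its phase.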
Step 620, receiving, by the radar device, radar echo signals scattered by the target object.
Referring to fig. 7, the echo signal, after being received by a receive (Rx) antenna, is processed by a mixer that combines it with the transmit signal to produce an intermediate-frequency (Intermediate Frequency, IF) signal (denoted as s_b(t)), also known as a beat signal.
For example only, the beat signal s_b(t) may be represented as follows:

s_b(t) = σ*A*exp(j*(4π*f_c*R(τ)/c + 4π*S*R(τ)*t/c + φ_RVP)) (2)

Wherein: σ is the amplitude of the received signal relative to the transmitted signal, the power of the transmitted signal being A; τ is the pulse number (also called slow time; see the description of the pulses above); c is the speed of light; S is the chirp slope, i.e., the variation of frequency per unit time; R(τ) is the distance of the target relative to the radar; and φ_RVP is the residual video phase (Residual Video Phase), which is negligible for near-field radar.
The Rx link of the radar apparatus may consist of a cascade of a low-noise amplifier (Low-Noise Amplifier, LNA), a mixer, and a filter, so that the received signal can be sufficiently conditioned. The low-noise amplifier reduces the noise level at the receiving end and thus improves the radar's sensitivity to weak targets. Referring to fig. 7 and fig. 8 in combination, In-Phase/Quadrature (I/Q) demodulation is performed after noise reduction: the signal is split into two branches, each branch is mixed separately, and the mixed signals are collected by the signal acquisition module. The signal acquisition module comprises an analog-to-digital converter (Analog-to-Digital Converter, ADC) for digitally sampling the mixed analog signal, i.e., converting the analog signal into a corresponding digital signal.
In step 630, target distance processing is performed on each chirp echo signal. As noted above, each chirp echo signal corresponds to one pulse. Target distance processing refers to processing only the echo signals corresponding to the preset distance units (i.e., the target distances).
For each receive (Rx) antenna, the I and Q channel signals can be combined into a complex signal, and a one-dimensional Fourier transform (called Range-FFT) performed on it to obtain the spectrum of the beat signal. Since the spectral information of the beat signal is correlated with the distance information of the scatterer (e.g., the target object), echo signals corresponding to different distance units (range bins) can be distinguished according to the spectrum of the beat signal. Furthermore, the echo signals corresponding to the preset distance units can be selected from the echo signals of all distance units and used for receive-antenna beamforming and static clutter filtering.
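As a minimal sketch of the Range-FFT step (all parameter values are illustrative assumptions, not from the patent): the beat frequency produced by a scatterer at range R is f_b = 2*S*R/c, so a one-dimensional FFT of the complex I/Q beat signal separates echoes by range bin, after which only the preset bins are kept:

```python
import numpy as np

fs, n_samples = 10e6, 256            # ADC sample rate and samples per chirp (illustrative)
S, c = 4e9 / 40e-6, 3e8              # chirp slope (Hz/s) and speed of light (m/s)
R = 1.2                              # simulated scatterer range in metres
f_b = 2 * S * R / c                  # beat frequency produced by that range

t = np.arange(n_samples) / fs
beat = np.exp(2j * np.pi * f_b * t)  # complex I + jQ beat signal

spectrum = np.fft.fft(beat)          # Range-FFT: one bin per distance unit
target_bin = int(np.argmax(np.abs(spectrum)))

preset_bins = [target_bin]           # keep only the preset distance units
selected = spectrum[preset_bins]
```

With these numbers f_b is 800 kHz against a bin spacing of about 39 kHz, so the spectral peak lands around bin 20; everything outside the preset bins is discarded, which is what keeps the downstream processing load small.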
Referring to fig. 9, receive-antenna beamforming refers to the weighted summation of the echo signals received by the multiple antennas of the receive array to form a signal with a desired receiving direction. This effectively improves the signal-to-noise ratio at the receiving end and makes the motion characteristics at each preset distance unit more distinct, which is beneficial for identifying the motion pattern of the target object.
By way of example only, when n (e.g., 10, 20, 50, 100, 200, etc.) receiving antennas are horizontally arranged, the composite signal s (t) obtained by beamforming may be expressed as:
s(t)=s1(t)*w1+…+sn(t)*wn (3)
Wherein: s_i(t) denotes the echo signal received by the i-th (i = 1, 2, …, n) receiving antenna; w_i = e^(−j·π·(i−1)·sinθ) denotes the complex weight of the i-th receiving antenna, where e is the natural constant, j is the imaginary unit, and θ denotes the angle corresponding to the receiving direction.
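Equation (3) can be sketched directly in NumPy. The antenna count and steering angle below are illustrative; half-wavelength element spacing is assumed, which is what makes w_i = e^(−jπ(i−1)sinθ) the steering weight:

```python
import numpy as np

def beamform(signals, theta):
    """Weighted sum of equation (3): signals is an (n_antennas, n_samples)
    complex array, theta the desired receive direction in radians."""
    n = signals.shape[0]
    w = np.exp(-1j * np.pi * np.arange(n) * np.sin(theta))  # w_i for i = 1..n
    return (w[:, None] * signals).sum(axis=0)

# A plane wave arriving exactly from theta is summed coherently: gain equals n.
n, theta = 8, np.deg2rad(20)
arriving = np.exp(1j * np.pi * np.arange(n) * np.sin(theta))
signals = arriving[:, None] * np.ones((1, 16))
combined = beamform(signals, theta)
```

Because each antenna's phase is exactly cancelled by its weight, the eight unit-amplitude signals sum to amplitude 8, illustrating the signal-to-noise gain the text describes.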
Static clutter filtering refers to averaging all beamformed chirp echo signals and subtracting this mean from each chirp echo signal to obtain clutter-filtered echo signals. If the target object is stationary (locally stationary) at a preset distance unit, the phase differences between its chirp echo signals are small, so after subtracting the mean each chirp echo signal has a small amplitude, which remains small after the Doppler Fourier transform; the influence of static reflection points is thereby weakened.
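The mean-subtraction step described above can be sketched as follows (array shapes are illustrative): returns whose phase is constant across chirps are cancelled, while a moving target whose phasors average to zero over the chirps passes through unchanged.

```python
import numpy as np

def remove_static_clutter(frame):
    """frame: (n_chirps, n_bins) complex array of beamformed chirp echoes.
    Subtract the per-bin mean over all chirps to suppress static reflectors."""
    return frame - frame.mean(axis=0, keepdims=True)

# A purely static return is cancelled exactly ...
static = np.ones((64, 4), dtype=complex)
filtered = remove_static_clutter(static)

# ... while a Doppler-shifted (moving) return, averaging to zero over the
# chirps, is left intact.
moving = np.exp(2j * np.pi * np.arange(64) / 64)[:, None] * np.ones((1, 4))
moved = remove_static_clutter(moving)
```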
At step 640, Doppler processing is performed on each frame of data at the target distances. Each frame of data here comprises the echo signals processed in step 630.
In step 640, all chirp echo signals in a single frame of data may be Fourier-transformed over the plurality of preset distance units; this transform is also known as the Doppler Fourier transform (Doppler-FFT). After the Doppler-FFT, the energy distribution over the range-Doppler two-dimensional plane is obtained and represented in matrix form, also called the range-Doppler matrix (RD matrix) or RD map. The processor 130 may obtain one RD matrix from each frame of data; for each of the plurality of preset distance units, the processor 130 obtains one row of the RD matrix, i.e., the number of rows of the RD matrix equals the number of preset distance units.
After the RD matrix is obtained, noise suppression can be performed on it so that the motion characteristics become more distinct. Specifically, the energy amplitude of each of the plurality of preset distance units can be measured in both the static state and the motion state, a noise threshold calculated from these statistics, and finally the grid points (elements) of the RD matrix whose energy amplitude is below the threshold set to zero.
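Step 640 and the noise suppression above can be sketched together; the array shapes, the target's Doppler bin, and the threshold value are all illustrative assumptions:

```python
import numpy as np

def rd_matrix(frame, noise_threshold=None):
    """Doppler-FFT across the chirps of one frame.

    frame: (n_chirps, n_bins) clutter-filtered chirp echoes. The result is
    transposed so each row corresponds to one preset distance unit, matching
    the RD-matrix layout described above.
    """
    rd = np.fft.fftshift(np.fft.fft(frame, axis=0), axes=0).T  # (n_bins, n_chirps)
    if noise_threshold is not None:
        rd = np.where(np.abs(rd) < noise_threshold, 0, rd)     # zero weak cells
    return rd

# One moving target in preset bin 1, cycling through 4 Doppler periods:
frame = np.zeros((32, 3), dtype=complex)
frame[:, 1] = np.exp(2j * np.pi * 4 * np.arange(32) / 32)
rd = rd_matrix(frame, noise_threshold=1.0)
```

The Doppler energy concentrates in a single cell of the row for bin 1 (Doppler index 4, shifted to column 20 by `fftshift`), and thresholding zeroes every other cell, leaving the motion signature isolated.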
Fig. 10 is a schematic diagram of the processing of each frame of data according to some embodiments of the present description. In fig. 10, the slow-time axis distinguishes different data frames, and its coordinates may be regarded as data-frame numbers; for example, the number of the n-th received frame may be n. The fast-time axis distinguishes different pulses within each frame of data, and its coordinates may be regarded as pulse numbers; for example, the number of the n-th pulse within each frame may be n. For specific implementations of the Range-FFT and Doppler-FFT, reference may be made to the relevant description above.
Step 650, extracting image texture features from the image data generated based on the previous N frames of data.
Each time the radar device receives one frame of echo signals, the processor 130 may stitch the N RD matrices corresponding to the previous N frames of data and treat the stitched matrix as image data. Referring to step 640, the processor 130 obtains one RD matrix from each frame of data, so the N RD matrices in step 650 are obtained from the previous N frames of data. Image texture features corresponding to the preset distance units can then be extracted from the image data. When multiple radar devices are used, the image texture features corresponding to the preset distance units extracted from the image data of the multiple radar devices may be jointly input into a preset model to identify the motion pattern of the target object.
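The per-frame sliding-window stitch can be sketched with a bounded deque; N and the matrix shape are illustrative, and the texture-feature extractor itself is not shown:

```python
import numpy as np
from collections import deque

N = 4                      # number of RD matrices stitched into one image
window = deque(maxlen=N)   # keeps only the most recent N matrices

def on_new_frame(rd):
    """Call once per received frame; returns the stitched image data once
    N RD matrices are available, else None."""
    window.append(rd)
    if len(window) < N:
        return None
    return np.concatenate(list(window), axis=1)  # (n_bins, N * n_chirps)

image = None
for _ in range(N):                               # feed N frames
    image = on_new_frame(np.zeros((3, 32)))
```

Because the deque discards its oldest entry automatically, every subsequent frame yields a fresh image covering exactly the previous N frames, matching the per-frame stitching described above.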
It should be noted that the above description of the flow is only for the purpose of illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to the flow may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
Possible benefits of embodiments of the present description include, but are not limited to: (1) the instruction interaction method used between doctors and patients is convenient for the patient to operate, as the patient can trigger interaction through autonomous movement; (2) processing the radar data only at the preset distance units reduces the amount of data processing and increases the computation (pattern recognition) speed, facilitating timely transmission of the patient's needs; (3) using high-precision radar equipment with an operating frequency of not less than 60 GHz to detect the target helps to accurately identify the motion pattern of a scanned patient in a narrow space. It should be noted that different embodiments may produce different advantages; in different embodiments, the advantages may be any one or a combination of those above, or any other advantage that may be obtained.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to limit the embodiments of the present disclosure. Although not explicitly stated herein, various modifications, improvements, and adaptations to the embodiments of the present disclosure may occur to those skilled in the art. Such modifications, improvements, and adaptations are suggested by the embodiments of this specification and therefore fall within the spirit and scope of the exemplary embodiments of this specification.
Meanwhile, the specification uses specific words to describe the embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present description. Thus, it should be emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this specification are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present description may be combined as suitable.
Furthermore, those skilled in the art will appreciate that aspects of the embodiments of the specification can be illustrated and described in terms of several patentable categories or conditions, including any novel and useful processes, machines, products, or compositions of matter, or any novel and useful improvements thereof. Accordingly, aspects of the embodiments of this specification may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.) or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," module, "" engine, "" unit, "" component, "or" system. Furthermore, aspects of embodiments of the present description may take the form of a computer product, including computer-readable program code, embodied in one or more computer-readable media.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or a combination of any of the foregoing.
Computer program code necessary for the operation of portions of the embodiments of the present disclosure may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet) or through a service such as software as a service (SaaS) in a cloud computing environment.
Furthermore, the order in which the elements and sequences are presented in the examples, the use of numerical letters, or other designations are used, unless specifically indicated in the claims, is not intended to limit the order in which the steps of the examples and methods are presented. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure, by way of various examples, it is to be understood that such details are merely illustrative and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements included within the spirit and scope of the embodiments of the present disclosure. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that, in order to simplify the description of the embodiments disclosed herein and thereby aid understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure is not to be interpreted as implying that the subject matter of the embodiments of the present specification requires more features than are recited in the claims. Indeed, claimed subject matter may lie in less than all features of a single embodiment disclosed above.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., referred to in this specification is incorporated herein by reference in its entirety. Except for application history files that are inconsistent or conflicting with the disclosure of this specification, files that are limiting to the broadest scope of the claims of the present application (currently or later in the application) are also excluded. It is noted that, if the description, definition and/or use of a term in an attached material in this specification does not conform to or conflict with what is described in this specification, the description, definition and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification are merely illustrative of the principles of the embodiments of this specification. Other variations are also possible within the scope of the embodiments of the present description. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present specification may be considered as consistent with the teachings of the present specification. Accordingly, the embodiments of the present specification are not limited to only the embodiments explicitly described and depicted in the present specification.

Claims (10)

1. A method for instruction interaction between doctors and patients, comprising:
Acquiring detection data of a target object through radar equipment;
determining a motion pattern of the target object based on the detection data;
and outputting the motion mode and/or the instruction corresponding to the motion mode to a console.
2. The method of claim 1, wherein the probe data comprises radar echo signals; the determining a motion pattern of the target object based on the detection data includes:
For each preset distance unit in a plurality of preset distance units, acquiring a radar echo signal corresponding to the preset distance unit, and acquiring an image texture feature corresponding to the preset distance unit based on the radar echo signal corresponding to the preset distance unit;
and processing a plurality of image texture features corresponding to the plurality of distance units by using a preset model to determine the motion mode of the target object.
3. The method of claim 2, wherein the plurality of preset distance units are determined by:
Setting corner reflectors at one or more parts of a reference object to strengthen radar echo signals scattered by the one or more parts;
Identifying an enhanced radar return signal from the received radar return signals to measure a range bin in which the one or more locations are located;
and determining the preset distance units based on the multiple measurement results of the distance units where the one or more parts are located.
4. The method of claim 2, wherein the plurality of image texture features are obtained by:
Obtaining a range-doppler matrix based on each frame of radar echo signals, wherein each row of the range-doppler matrix corresponds to a preset range unit, and the row is obtained based on the radar echo signals corresponding to the preset range unit;
Every time radar equipment receives a frame of radar echo signals, N distance Doppler matrixes corresponding to the previous N frames of radar echo signals are spliced to obtain radar image data; wherein N is an integer greater than 1;
and extracting image texture features corresponding to the preset distance units from the radar image data.
5. The method of claim 1, wherein the radar device comprises a first radar device for detecting at least a head of the target object and a second radar device for detecting at least a limb portion of the target object.
6. The method of claim 5, wherein a sum of detection ranges of the first radar device and the second radar device is capable of covering all locations of the target object.
7. The method of claim 5, wherein an operating frequency of the first radar device and the second radar device is not less than 60GHz.
8. The method of claim 1, wherein the instructions include a talk instruction that instructs the target object to open a talk with a console and a stop scan instruction that instructs to stop scanning.
9. The instruction interaction system for the doctor-patient interaction is characterized by comprising an acquisition module, a determination module and an output module;
the acquisition module is used for acquiring detection data of a target object through radar equipment;
The determining module is used for determining a motion mode of the target object based on the detection data;
The output module is used for outputting the motion mode and/or the instruction corresponding to the motion mode to the console.
10. An instruction interaction device used between doctors and patients is characterized by comprising a scanning device, a first radar device, a second radar device, a processor and a console, wherein the first radar device and the second radar device are arranged on the scanning device;
The first radar device is used for detecting the head of the target object, and the second radar device is used for detecting the limb part of the target object;
The processor is used for processing the detection data of the first radar device and the second radar device and sending the processing result to the console.
CN202211318867.2A 2022-10-26 2022-10-26 Instruction interaction method and system for doctors and patients Pending CN117976246A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211318867.2A CN117976246A (en) 2022-10-26 2022-10-26 Instruction interaction method and system for doctors and patients

Publications (1)

Publication Number Publication Date
CN117976246A true CN117976246A (en) 2024-05-03

Family

ID=90844811



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination