KR20160145315A - Method for displaying image including eye tracking and brain signal data - Google Patents

Method for displaying image including eye tracking and brain signal data

Info

Publication number
KR20160145315A
Authority
KR
South Korea
Prior art keywords
present
displaying
brain
storing
information
Prior art date
Application number
KR1020150081721A
Other languages
Korean (ko)
Inventor
채용욱
Original Assignee
주식회사 룩시드랩스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 룩시드랩스
Priority to KR1020150081721A
Publication of KR20160145315A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/003 - Reconstruction from projections, e.g. tomography
    • G06T 11/008 - Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 - Eye tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/015 - Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Neurology (AREA)
  • Neurosurgery (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The present invention relates to a method of storing and displaying data obtained while a user observes cross-sectional images of a three-dimensional structure. According to one aspect of the present invention, the method includes: loading at least one tomographic image of a stereoscopic object; storing gaze-position and brain-wave data for the tomographic image of the stereoscopic object; and combining the stored gaze-position and brain-wave data with the image and displaying the result.
In addition, other configurations may be further provided in accordance with the technical idea of the present invention.

Description

METHOD FOR DISPLAYING IMAGE INCLUDING EYE TRACKING AND BRAIN SIGNAL DATA

The present invention relates to a method of displaying images including eye-tracking and brain-signal data, and more particularly, to a method and system for storing a user's gaze and brain-wave information while the user views a planar image and for displaying that information so that it can be utilized more effectively.

In recent years, along with the spread of mobile smart devices that can be carried in one hand, such as laptops, smartphones, and smart pads, wearable devices such as smart glasses, smart watches, smart rings, and smart necklaces have gradually been expanding their range of uses. Among these, smart glasses are one of the wearable devices with the highest utility and market potential.

In such a situation, when a user wearing smart glasses views an external image, the subtle EEG changes that occur while the user recognizes the image, changes of which the user may not even be aware, can be recorded to generate meaningful data that can be utilized in medical and sports settings. However, according to the prior art, such (potential) intentions of the user could not be adequately supported.

The present invention relates to a method of storing and displaying data obtained while a user observes cross-sectional images of a three-dimensional structure.

According to one aspect of the present invention, the method includes: loading at least one tomographic image of a stereoscopic object; storing gaze-position and brain-wave data for the tomographic image of the stereoscopic object; and combining the stored gaze-position and brain-wave data with the image and displaying the result.

In addition, other configurations may be further provided in accordance with the technical idea of the present invention.

According to the present invention, the method can be actively used in the medical service field. In particular, in an environment in which a three-dimensional structure is imaged cross-section by cross-section using a computer, as in MRI or CT, and the shape of the whole structure is grasped by observing the tomographic images, an eye-tracking device or a brain-signal measuring device can provide important information for understanding the user's visual-information-processing process. For example, the images observed in a 3D environment can be expressed as layers, the gaze and brain-wave information can be stored for each voxel (a point in three-dimensional space), and this information can be reconstructed in three dimensions to analyze or simulate the process by which a human understands an object. In one embodiment, the way a lesion is observed on tomographic images such as MRI, CT, or fMRI, which are often used in medical information processing, can be recorded so that the data can be reviewed and displayed at a later time.
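As a minimal illustration of the per-voxel storage described above, the following Python sketch accumulates gaze dwell time and EEG-derived features for each voxel the user looks at; the class name, field names, and data layout are hypothetical assumptions for illustration, and the concentration and stress values are assumed to come from a separate EEG analysis step not shown here.

```python
from collections import defaultdict


class VoxelAttentionStore:
    """Accumulates gaze and EEG-derived features per voxel (x, y, slice).

    A minimal sketch: voxel keys are integer pixel/slice indices, and the
    EEG features (concentration, stress) are assumed to be supplied by an
    external analysis step.
    """

    def __init__(self):
        self._store = defaultdict(lambda: {"dwell_time": 0.0,
                                           "concentration": [],
                                           "stress": []})

    def record(self, x, y, slice_index, dwell_time, concentration, stress):
        # Add one fixation's worth of data to the voxel the user looked at.
        entry = self._store[(x, y, slice_index)]
        entry["dwell_time"] += dwell_time
        entry["concentration"].append(concentration)
        entry["stress"].append(stress)

    def attention_volume(self):
        """Return {voxel: summary} for 3D reconstruction or later replay."""
        return {
            voxel: {
                "dwell_time": e["dwell_time"],
                "mean_concentration": sum(e["concentration"]) / len(e["concentration"]),
                "mean_stress": sum(e["stress"]) / len(e["stress"]),
            }
            for voxel, e in self._store.items()
        }
```

The resulting attention volume could then be rendered in 3D or replayed slice by slice alongside the original images, as the embodiment describes.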

       In another example, the method can be utilized in programs used for 3D design, such as architectural design and computer graphics, to display or interact with the main (attended) parts and the non-main parts separately and effectively.
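One way the attended "main" parts of a design scene could be separated from the rest for display is sketched below, assuming an attention volume like the one produced above; the object representation and the dwell-time threshold are illustrative assumptions, not details given in the patent.

```python
def split_by_attention(objects, attention_volume, threshold=1.0):
    """Partition scene objects into attended and non-attended groups.

    `objects` maps an object name to the set of voxels it occupies;
    `attention_volume` is a per-voxel summary with a "dwell_time" field.
    """
    attended, others = [], []
    for name, voxels in objects.items():
        dwell = sum(attention_volume.get(v, {}).get("dwell_time", 0.0)
                    for v in voxels)
        (attended if dwell >= threshold else others).append(name)
    return attended, others
```

The attended group could then be highlighted while the remaining objects are dimmed or made non-interactive.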

FIG. 1 is a diagram illustrating a method of storing and displaying the eye movements and brain-wave signals of a user observing MRI or CT images according to an exemplary embodiment of the present invention.
FIG. 2 is a diagram illustrating a method of storing and displaying a user's gaze movements and EEG signals during stereoscopic (3D) graphics work on a computer according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating a method of storing and displaying a user's gaze movements and EEG signals during flat (2D) graphics work according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating a method of storing and utilizing the eye movements and brain-wave signals of a user watching a sports game according to an embodiment of the present invention.

The following detailed description of the invention refers to the accompanying drawings, which illustrate, by way of example, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. It should be understood that the various embodiments of the present invention are different from one another but need not be mutually exclusive. For example, certain features, structures, and characteristics described herein in connection with one embodiment may be implemented in other embodiments without departing from the spirit and scope of the invention. It is also to be understood that the position or arrangement of individual components within each disclosed embodiment may be varied without departing from the spirit and scope of the invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is limited only by the appended claims, along with the full scope of equivalents to which such claims are entitled, when appropriately interpreted. In the drawings, like reference numerals refer to the same or similar functions throughout the several views.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings, so that those skilled in the art can easily carry out the present invention.

Hereinafter, a detailed example of a method of providing a user interface between devices according to various embodiments of the present invention will be described in detail.

According to the present exemplary embodiment, more convenient interfacing and interaction can be achieved by storing the user's eye-movement and EEG data and by varying the way the screen is displayed according to the stored values.

FIG. 1 is a diagram illustrating a method of storing and displaying the eye movements and brain-wave signals of a user observing MRI or CT images according to an exemplary embodiment of the present invention. The MRI and CT images may be consecutive frames of images of cross-sections of the body. At this time, the image-frame playback time, the eye movements of the user looking at the images, and the brain-wave data can be matched by their time values and stored in real time.
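A minimal sketch of this matching by time value is shown below, assuming timestamps in seconds and simple (timestamp, payload) sample tuples; the patent does not prescribe a data format, so these structures are assumptions.

```python
import bisect


def match_samples_to_frames(frame_times, gaze_samples, eeg_samples):
    """Assign each gaze and EEG sample to the image frame being shown.

    frame_times: sorted list of frame playback start times (seconds).
    gaze_samples / eeg_samples: lists of (timestamp, payload) tuples.
    Returns {frame_index: {"gaze": [...], "eeg": [...]}}.
    """
    matched = {i: {"gaze": [], "eeg": []} for i in range(len(frame_times))}
    for kind, samples in (("gaze", gaze_samples), ("eeg", eeg_samples)):
        for t, payload in samples:
            # The sample belongs to the last frame that started before it.
            idx = bisect.bisect_right(frame_times, t) - 1
            if idx >= 0:
                matched[idx][kind].append((t, payload))
    return matched
```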

The gaze path and dwell time during the observation of each part are measured, and the concentration, stress, and activity level of each brain region are measured by analyzing the brain waves recorded at that time. The parts of the MRI or CT images identified as important can then be expressed in 3D, which can later be reproduced in 2D or played back on a display together with the brain-wave and gaze information.

FIG. 2 is a diagram illustrating a method of storing and displaying a user's gaze movements and EEG signals during stereoscopic (3D) graphics work on a computer according to an embodiment of the present invention. First, while 3D computer-graphics design work is performed, the observer's brain waves and gaze are measured in real time, and the gaze path and the time spent editing each part are recorded. By analyzing the brain waves at this time, brain-activity information such as concentration, stress, and the activity level of each brain region is measured in real time. By combining this information, the places where the observer does or does not concentrate can be identified for each voxel, and the cognitive and emotional information at that time can be stored in layers and expressed.

- Adjust the zoom level based on the user's concentration and eye fatigue, since a more detailed view is required for more detailed work (see the sketch after this list)

- Less noise than when eye tracking is used alone
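The zoom-adjustment rule from the list above could be realized roughly as follows; the normalization of the metrics to [0, 1] and the gain and limit constants are assumptions for illustration, not values given in the patent.

```python
def adjust_zoom(current_zoom, concentration, eye_fatigue,
                zoom_in_gain=0.1, zoom_out_gain=0.1,
                min_zoom=0.5, max_zoom=8.0):
    """Nudge the zoom level from EEG concentration and eye fatigue.

    concentration and eye_fatigue are assumed to be normalized to [0, 1].
    High concentration zooms in (finer work needs a more detailed view);
    high fatigue zooms out to reduce visual load.
    """
    zoom = current_zoom * (1.0 + zoom_in_gain * concentration
                           - zoom_out_gain * eye_fatigue)
    return max(min_zoom, min(max_zoom, zoom))
```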

FIG. 3 is a diagram illustrating a method of storing and displaying a user's gaze movements and EEG signals during flat (2D) graphics work according to an embodiment of the present invention. The observer's brain waves and gaze are measured in real time while 2D computer-graphics design work is performed. The gaze path and the dwell time during the observation of each area are measured. By analyzing the EEG, the concentration, stress, and activity level of each brain region are measured. By combining these, the places where the observer concentrates can be identified for each pixel, and the emotional and stress information at that time can be stored in the corresponding layer. The zoom level can be adjusted based on the user's concentration and eye fatigue, since a more detailed view is needed for more detailed work. In graphics editing, the change in each editor's perception can be observed, and the change in emotion after a layer is edited can be observed to check the updated cognitive and emotional information.
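The comparison of emotion before and after a layer edit could be computed as in the sketch below, assuming emotion and stress values have already been derived from the EEG; the sample format and function name are hypothetical.

```python
def emotion_change_after_edit(samples, edit_time):
    """Compare mean emotion/stress before and after a layer edit.

    samples: list of (timestamp, emotion, stress) tuples recorded while
    the editor works; edit_time marks when the layer edit was applied.
    """
    before = [(e, s) for t, e, s in samples if t < edit_time]
    after = [(e, s) for t, e, s in samples if t >= edit_time]

    def mean(pairs, i):
        return sum(p[i] for p in pairs) / len(pairs) if pairs else None

    return {
        "emotion_before": mean(before, 0), "emotion_after": mean(after, 0),
        "stress_before": mean(before, 1), "stress_after": mean(after, 1),
    }
```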

FIG. 4 is a diagram illustrating a method of storing and utilizing the eye movements and brain-wave signals of a user watching a sports game according to an embodiment of the present invention. By measuring the observer's brain waves and gaze in real time, the athletes' position information, the data collected from sports equipment, and other game records can be matched with the viewer's brain-wave and gaze records. By analyzing the brain waves at this time, the concentration, stress, and activity level of each brain region can be measured and combined to find out which athletes and pieces of equipment the observer does or does not concentrate on, and the emotional and stress information at the same point in time is stored. The position and motion information of the athletes and of the sports equipment is used to detect events; the user's gaze and EEG signals corresponding to each event are analyzed, reaction data for each athlete is recorded, and the result can be reflected in the game.
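A sketch of the event-based matching described above follows, assuming that game events have already been extracted from the athlete and equipment position data; the event fields and the reaction-window length are illustrative assumptions.

```python
def reactions_to_events(events, gaze_samples, eeg_samples, window=2.0):
    """Collect the viewer's gaze and EEG samples around each game event.

    events: list of dicts with at least a 'time' key (seconds) plus any
    descriptive fields such as the athlete involved.
    gaze_samples / eeg_samples: lists of (timestamp, payload) tuples.
    window: seconds after the event during which samples count as a reaction.
    """
    reactions = []
    for ev in events:
        start, end = ev["time"], ev["time"] + window
        reactions.append({
            "event": ev,
            "gaze": [p for t, p in gaze_samples if start <= t <= end],
            "eeg": [p for t, p in eeg_samples if start <= t <= end],
        })
    return reactions
```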


Claims (1)

Included in summary.
KR1020150081721A 2015-06-10 2015-06-10 Method for displaying image including eye tracking and brain signal data KR20160145315A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150081721A KR20160145315A (en) 2015-06-10 2015-06-10 Method for displaying image including eye tracking and brain signal data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150081721A KR20160145315A (en) 2015-06-10 2015-06-10 Method for displaying image including eye tracking and brain signal data

Publications (1)

Publication Number Publication Date
KR20160145315A true KR20160145315A (en) 2016-12-20

Family

ID=57734262

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150081721A KR20160145315A (en) 2015-06-10 2015-06-10 Method for displaying image including eye tracking and brain signal data

Country Status (1)

Country Link
KR (1) KR20160145315A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110333852A (en) * 2019-07-10 2019-10-15 南京邮电大学 A kind of real-time 3D display software design approach of EEG signals based on Qt


Similar Documents

Publication Publication Date Title
Arabadzhiyska et al. Saccade landing position prediction for gaze-contingent rendering
Qian et al. Comparison of optical see-through head-mounted displays for surgical interventions with object-anchored 2D-display
JP5828880B2 (en) Apparatus and method for examining visual ability of a subject
WO2016157677A1 (en) Information processing device, information processing method, and program
CN111602082B (en) Position tracking system for head mounted display including sensor integrated circuit
McDuff et al. Pulse and vital sign measurement in mixed reality using a HoloLens
CN111427150B (en) Eye movement signal processing method used under virtual reality head-mounted display and wearable device
Moon et al. Perceptual experience analysis for tone-mapped HDR videos based on EEG and peripheral physiological signals
CN106575039A (en) Head-up display with eye tracking device determining user spectacles characteristics
CN109901290B (en) Method and device for determining gazing area and wearable device
Wieczorek et al. GPU-accelerated rendering for medical augmented reality in minimally-invasive procedures.
US8434870B2 (en) Method and apparatus for simulating an optical effect of an optical lens
Vienne et al. The role of vertical disparity in distance and depth perception as revealed by different stereo-camera configurations
US10729370B2 (en) Mobile sensor system and methods for use
JP2018000308A (en) Image display device system, heart beat specification method, and heart beat specification program
Weber et al. Gaze3DFix: Detecting 3D fixations with an ellipsoidal bounding volume
KR20160145315A (en) Method for displaying image including eye tracking and brain signal data
Alghamdi et al. Fixation detection with ray-casting in immersive virtual reality
Yildiz et al. The conceptual understanding of depth rather than the low-level processing of spatial frequencies drives the corridor illusion
Chu et al. Tracking feature-based attention
Khaustova et al. An investigation of visual selection priority of objects with texture and crossed and uncrossed disparities
Du Plessis Graphical processing unit assisted image processing for accelerated eye tracking
Daniol et al. Eye-tracking in Mixed Reality for Diagnosis of Neurodegenerative Diseases
JP2021079041A (en) Empathy degree evaluation system, empathy degree evaluation method and program
Booth et al. Gaze3D: Framework for gaze analysis on 3D reconstructed scenes

Legal Events

Date Code Title Description
N231 Notification of change of applicant