KR101407249B1 - Method and apparatus for controlling augmented reality-based presentation

Method and apparatus for controlling augmented reality-based presentation

Info

Publication number
KR101407249B1
Authority
KR
South Korea
Prior art keywords
region
event
depth
area
extracting
Prior art date
Application number
KR1020130055567A
Other languages
Korean (ko)
Inventor
김진수
Original Assignee
한밭대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한밭대학교 산학협력단 filed Critical 한밭대학교 산학협력단
Priority to KR1020130055567A
Application granted granted Critical
Publication of KR101407249B1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to a method and a system for controlling an augmented reality-based presentation and, more specifically, to a method that installs a depth camera and a color camera on one side of a screen and controls the presentation through augmented reality. According to the present invention, a background image that is not affected by brightness can be generated from the distance information provided by the depth camera. Further, a moving region can be set as the region of interest using the depth information, and by extracting only that region of interest, errors due to the brightness differences that commonly occur with ordinary color cameras can be eliminated.

Description

METHOD AND APPARATUS FOR CONTROLLING AUGMENTED REALITY-BASED PRESENTATION

The present invention relates to a presentation control system and a method thereof, and more particularly, to a presentation control system that controls presentation operations through augmented reality using a color camera and a depth camera, and to a method thereof.

Typical presentation applications include PowerPoint, Keynote, and Forge; these are layout- and design-oriented programs. As hardware, pointing tools such as the mouse, keyboard, and laser pointer are used. Among these devices, the mouse and keyboard offer high accuracy but have the disadvantage of requiring an assistant, while the laser pointer can indicate an exact position but supports only two or three kinds of events. Studies have also used a color camera to detect specific body parts and motions of a person and apply them to presentations. However, a color camera alone is limited in extracting the objects of interest and providing a variety of presentation events accurately and in real time.

In recent years, with the development of depth-sensing technology, much research has been done on computer control using pose tracking and gesture recognition with real-time depth cameras. In particular, Microsoft's Kinect, launched at the end of 2010, and the Asus Xtion series have become widespread, and a great deal of research on presentation control software using depth cameras has been carried out.

Depth cameras come in two types: Time of Flight (TOF), which uses the phase difference between the emitted wave and the reflected wave, and stereo matching. Of these, the stereo matching method is the most widely used. In the Kinect and Xtion series, the transmitter that emits light is separated from the receiver that senses the reflected wave: infrared light with a specific pattern is projected, and depth information is extracted by stereo matching against the pattern reflected from the object. Because of this separation, errors occur along the left or right edges of objects.

Korean Patent No. 10-1019255
Korean Patent No. 10-0588042

SUMMARY OF THE INVENTION

The present invention has been conceived to solve the problems described above. It is an object of the present invention to provide a pointer extraction method that uses a low-cost stereo-matching depth camera and a color camera simultaneously, and a method of controlling a presentation through augmented reality using the depth information and spatial coordinates of the pointer.

The present invention is further directed to a method of controlling a presentation through augmented reality that generates a background image unaffected by brightness from the distance information provided by a depth camera, extracts a moving region of interest from the background image, labels the extracted region, compares the sizes of the labeled regions to extract a region of a specific size, and sets the range of an event and the event to be executed using depth information and spatial coordinates.

A method of controlling a presentation through augmented reality according to an embodiment of the present invention, in which a depth camera and a color camera are installed at one of the upper, lower, left, and right sides of a screen and the presentation operation is controlled through augmented reality, comprises: generating a background image from the distance information provided by the depth camera; extracting the region in which a moving object exists from the background image and labeling the extracted region; comparing the pixel counts of the labeled regions to extract a region of interest of a specific size, and extracting the minimum rectangular region containing that region of interest; converting the pixels of the corresponding region in the color image to the HSV (Hue, Saturation, Value) representation and extracting only the hue values to generate a histogram; searching the entire color image for the region corresponding to the histogram while varying the search range, and correcting the position when a portion occurs where the depth-image and color-image pixels do not coincide; extracting again the region in which the moving object exists within the extracted minimum rectangular region, selecting the pixels that form the outline polygon among all pixels, and connecting them to detect the outline; when the outline is detected, extracting the positions of the pixels included in the outline, obtaining the position of the pixel closest to the screen, and setting that pixel as the pointer for event determination; setting an event area and the event to be executed using the depth value and spatial coordinates corresponding to the pointer position in the depth information; and executing the set event when the pointer enters the set event area.

In the event-setting step, the range designation and settings for event determination are made by dividing and designating the horizontal and vertical axes using the spatial coordinates of the depth image and the background image over the photographed range, and by designating the event to be executed.

The step of generating the histogram includes obtaining all the depth information over the shooting range from the depth camera, and obtaining the HSV (Hue, Saturation, Value) signal of the extracted region of interest containing the moving object from the color camera.

In addition, the step of generating the background image may include: creating a storage space covering the range of depth information when the background image is generated; if the difference between the depth value D(i, j, t) at position (i, j) at time t and the subsequent depth value D(i, j, t+1) is less than or equal to a threshold T, including D(i, j, t+1) in the current threshold region and setting the largest and smallest depth values in that region as the upper and lower thresholds, as in the following equation; and continuing until a predetermined number of images have been input and the background image has accumulated.

$$\left|\,D(i,j,t+1) - D(i,j,t)\,\right| \le T \;\Rightarrow\; D(i,j,t+1) \in R(i,j),\qquad T_{up}(i,j) = \max_{d \in R(i,j)} d,\quad T_{low}(i,j) = \min_{d \in R(i,j)} d \tag{1}$$

In the labeling step, if the difference between depth values is greater than or equal to T, a new threshold region is generated, and upper and lower thresholds are set for the newly generated region as in the following equation. The pixel value of regions not included in the background image is changed to 255 and that of included regions to 0, and pixels with the value 255 are grouped and labeled sequentially from the first pixel of the background image.

$$\left|\,D(i,j,t+1) - D(i,j,t)\,\right| \ge T \;\Rightarrow\; \text{a new region } R'(i,j) \text{ is created},\qquad T_{up}(i,j) = \max_{d \in R'(i,j)} d,\quad T_{low}(i,j) = \min_{d \in R'(i,j)} d \tag{2}$$

In addition, the step of executing the event includes executing the event when the depth value and pixel position of the pointer fall within the depth value area and the spatial coordinate area of the event.

A presentation control system according to an embodiment of the present invention includes a depth camera and a color camera installed at one of the upper, lower, left, and right sides of a screen, and controls the presentation operation through augmented reality. The system includes: a background image generation unit that generates a background image from the distance information provided by the depth camera; a labeling unit that extracts the region with moving objects from the background image and labels the extracted region; a region-of-interest extraction unit that compares the pixel counts of the labeled regions to extract a region of interest of a specific size and extracts the minimum rectangular region containing it; a histogram generation unit that converts the pixels of the corresponding region in the color image to the HSV (Hue, Saturation, Value) representation and extracts only the hue values to generate a histogram; a region-of-interest correction unit that searches the entire color image for the region corresponding to the histogram while varying the search range and corrects the position when the depth-image and color-image pixels do not coincide; an outline detection unit that extracts again the region in which the moving object exists within the extracted minimum rectangular region, selects the pixels forming the outline polygon among all pixels, and connects them to detect the outline; a pointer setting unit that, when the outline is detected, extracts the positions of the pixels included in the outline, obtains the position of the pixel nearest the screen, and sets that pixel as the pointer for event determination; an event setting unit that sets an event area and the event to be executed using the depth value and spatial coordinates corresponding to the pointer position in the depth information; and an event execution unit that executes the set event when the pointer enters the set event area.

According to the present invention, the presentation control system and method can generate a background image uninfluenced by brightness from the distance information provided by the depth camera, and by extracting only the region of interest, can remove the errors due to brightness differences that appear in ordinary color cameras and can recognize events using the depth information and spatial coordinates of the pointer touching the screen.

FIG. 1 is a diagram illustrating a presentation control system according to an embodiment of the present invention.
FIG. 2 is a flowchart illustrating a method of controlling a presentation through augmented reality according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an example of depth information that is not included in a new threshold value after generation of a background image according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an example of a background image extracted from a color image and a depth image according to an embodiment of the present invention.
FIG. 5 is a diagram illustrating an example of extracting a region of interest using a histogram according to an embodiment of the present invention.
FIG. 6 is an exemplary diagram illustrating the detection of an outline in a region of interest according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an event area arrangement according to an exemplary embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

Hereinafter, a presentation control system and method according to an embodiment of the present invention will be described in detail with reference to the accompanying drawings. The following drawings are provided by way of example so that those skilled in the art can fully understand the spirit of the present invention. Therefore, the present invention is not limited to the following drawings but may be embodied in other forms. Like reference numerals designate like elements throughout the specification.

Unless otherwise defined, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. In the following description and the accompanying drawings, descriptions of known functions and configurations that may unnecessarily obscure the gist of the present invention are omitted.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of a presentation control system in which the presentation operation of the present invention is controlled through augmented reality.

The present invention relates to a presentation control system (1000) for controlling a presentation operation using an augmented reality.

The presentation control system 1000 according to the present invention includes a depth and color camera 10 installed at one of the upper, lower, left, and right sides of a screen, and comprises a background image generation unit 11, a labeling unit 12, a region-of-interest extraction unit 13, a histogram generation unit 14, a region-of-interest correction unit 15, an outline detection unit 16, a pointer setting unit 17, an event setting unit 18, and an event execution unit 19, and may include a control unit (not shown) for controlling each component.

It is also obvious to those skilled in the art that two or more of the above components may be combined into one component, or all of the components may be implemented in one microprocessor or ASIC chip.

The background image generation unit 11 among the components of the presentation control system 1000 generates a background image from the distance information provided by the depth camera.

The labeling unit 12 extracts an area having a motion object from the background image, and labels the extracted area.

The region-of-interest extraction unit 13 extracts a region of interest of a specific size by comparing the pixel counts of the labeled regions, and extracts the minimum rectangular region containing the region of interest.

The histogram generation unit 14 converts the pixels of the corresponding region in the color image, matching the extracted minimum rectangular region, to the HSV (Hue, Saturation, Value) representation, and extracts only the hue values from the converted pixels to generate a histogram.

The region-of-interest correction unit 15 searches the entire color image for the region corresponding to the histogram, extracts the matching region while varying the search range, and corrects the position when the depth-image and color-image pixels do not coincide.

The outline detection unit 16 extracts the region in which the moving object exists within the extracted minimum rectangular region, selects the pixels forming the outline polygon among all the pixels, and connects them to detect the outline.

When the outline is detected, the pointer setting unit 17 extracts the positions of the pixels included in the outline, obtains the position of the pixel closest to the screen, and sets the obtained pixel as a pointer for event determination.

The event setting unit 18 sets an event area and an event to be executed using the depth value and the spatial coordinates of the position corresponding to the pointer position in the depth information.

The event execution unit 19 executes the set event when the pointer enters the set event area.

FIG. 2 is a flowchart of the presentation control method of the presentation control system through augmented reality of the present invention.

As shown in FIG. 2, in the presentation control method using augmented reality according to the present invention, a depth camera and a color camera are installed at one of the upper, lower, left, and right sides of the screen, and the presentation operation is controlled through augmented reality.

The control method proceeds through the following steps: generating a background image from the distance information provided by the depth camera; extracting the region in which a moving object exists from the background image and labeling the extracted region; comparing the pixel counts of the labeled regions to extract a region of interest of a specific size and extracting the minimum rectangular region containing it; converting the pixels of the corresponding region in the color image to the HSV (Hue, Saturation, Value) representation and extracting only the hue values to generate a histogram; searching the entire color image for the region corresponding to the histogram while varying the search range and correcting the position when the depth-image and color-image pixels do not coincide; extracting the region with the moving object within the extracted minimum rectangular region, selecting the pixels forming the outline polygon, and connecting them to form the outline; when the outline is detected, extracting the positions of the pixels included in the outline, obtaining the position of the pixel closest to the screen, and setting that pixel as the pointer for event determination; setting an event area and the event to be executed using the depth value and spatial coordinates of the corresponding position; and executing the set event when the pointer enters the set event area.

In the presentation control method using augmented reality according to the present invention, a moving region is determined as the region of interest using depth information, and by extracting only that region of interest, errors due to brightness differences in ordinary color cameras can be removed. The event can be confirmed using the depth information and spatial coordinates of the pointer, and is executed when the pointer enters the set event area.

The present invention provides a method of controlling a presentation through augmented reality using a depth camera and a color camera simultaneously, obtaining all depth information over the shooting range from the depth camera, and obtaining the HSV (Hue, Saturation, Value) signal of the extracted region of interest from the color camera.

A method of controlling a presentation by an augmented reality according to the present invention will now be described in detail.

As shown in FIG. 2, in the present invention, the presentation control system first performs the depth information input step 100.

Depth information can contain noise caused by changes in light brightness. The noise does not appear continuously across all depth information but occurs discontinuously at certain moments. Therefore, to remove this noise factor, depth information is input and accumulated for a certain period of time, and a background image is generated over a specific range of depth information. When the background image is generated, a storage space is created covering the range of depth information. If the difference between the depth value D(i, j, t) at position (i, j) at time t and the subsequent depth value D(i, j, t+1) is less than or equal to the threshold T, D(i, j, t+1) is included in the current threshold region, and the largest and smallest depth values in the region are set as the upper and lower thresholds.

$$\left|\,D(i,j,t+1) - D(i,j,t)\,\right| \le T \;\Rightarrow\; D(i,j,t+1) \in R(i,j),\qquad T_{up}(i,j) = \max_{d \in R(i,j)} d,\quad T_{low}(i,j) = \min_{d \in R(i,j)} d \tag{1}$$

The setting of upper and lower thresholds continues until a certain number of frames have been input and the background image has accumulated (101).
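For illustration only, the per-pixel accumulation of Equations (1) and (2) might be sketched as below in Python with NumPy. The threshold T and the frame count are hypothetical values, and where the text keeps multiple threshold regions per pixel, this sketch simply resets the band when a difference reaches T.

```python
import numpy as np

T = 40            # hypothetical depth-difference threshold (sensor units)
NUM_FRAMES = 100  # hypothetical number of frames to accumulate

def accumulate_background(depth_frames):
    """Build per-pixel lower/upper depth thresholds from a depth stream."""
    frames = iter(depth_frames)
    prev = next(frames).astype(np.float32)
    t_low, t_up = prev.copy(), prev.copy()
    for _ in range(NUM_FRAMES - 1):
        cur = next(frames).astype(np.float32)
        same = np.abs(cur - prev) <= T            # Equation (1): same region
        # widen the current region's band with the new depth value,
        # or reset the band where the difference reached T (Equation (2))
        t_low = np.where(same, np.minimum(t_low, cur), cur)
        t_up = np.where(same, np.maximum(t_up, cur), cur)
        prev = cur
    return t_low, t_up
```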

FIG. 3 shows an example of depth information not included in a new threshold value after generation of a background image.

As shown in FIG. 3, if the difference in depth value is equal to or greater than T, a new threshold value region is generated. At this time, as shown in Equation (2), the upper and lower threshold values are set for the newly generated region as described above.

$$\left|\,D(i,j,t+1) - D(i,j,t)\,\right| \ge T \;\Rightarrow\; \text{a new region } R'(i,j) \text{ is created},\qquad T_{up}(i,j) = \max_{d \in R'(i,j)} d,\quad T_{low}(i,j) = \min_{d \in R'(i,j)} d \tag{2}$$

That is, a moving object appearing within the camera's shooting range, or noise that did not appear during generation of the background image, is not included in the threshold range. In this case, the region where the moving object exists is consistently described by Equation (2) and appears in the depth image as an area outside the threshold range. The pixel value of the area not included in the background image is changed to '255', and that of the included area to '0'.

FIG. 4 shows an example of a background image extracted from a color image and a depth image.

As shown in FIG. 4, pixels having the value '255' are grouped and labeled sequentially from the first pixel of the background image (103). In labeling order, '1' is stored for the first object, '2' for the second object, and so on. To remove noise and unspecified objects from among the labeled objects, only objects whose pixel count is at least Nmin and at most Nmax are extracted, yielding the region of interest (104).
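A sketch of this masking, grouping, and size filtering (steps 103 and 104), using OpenCV's connected-component labeling for the grouping; N_MIN and N_MAX are hypothetical stand-ins for the Nmin and Nmax of the text.

```python
import cv2
import numpy as np

N_MIN, N_MAX = 500, 20000  # hypothetical pixel-count bounds

def extract_roi(depth, t_low, t_up):
    # pixels outside the background threshold band become foreground ('255')
    fg = ((depth < t_low) | (depth > t_up)).astype(np.uint8) * 255
    # group connected '255' pixels and label them 1, 2, ... (step 103)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(fg)
    roi_mask = np.zeros_like(fg)
    boxes = []
    for k in range(1, num):                 # label 0 is the background
        area = stats[k, cv2.CC_STAT_AREA]
        if N_MIN <= area <= N_MAX:          # drop noise / oversized blobs (step 104)
            roi_mask[labels == k] = 255
            boxes.append(tuple(stats[k, :4]))  # x, y, w, h: minimum enclosing rectangle
    return roi_mask, boxes
```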

FIG. 5 shows an example of a region of interest extraction using a histogram.

Only the pixel group of the minimum square shape including the region of interest is extracted as in the example shown in FIG.

An accurate region of interest can be extracted by using the color image for the minimum rectangular region extracted from the background image. The pixels of the corresponding area in the color image are converted to the HSV representation (105). Hue values are extracted from the converted pixels and a histogram is created. The region corresponding to the histogram is then searched for in the whole color image; during the search, the search range is varied and the region corresponding to the histogram is extracted (106). When the histogram is generated, the position is corrected if part of the depth image does not match the pixels of the color image. The presentation pointer can then be extracted by using the histogram on the color image.
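The hue-histogram search with a varying window in steps (105) and (106) closely resembles histogram backprojection followed by CamShift, so a sketch under that assumption looks like this:

```python
import cv2

def track_roi(color_frame, roi_box):
    """Relocate the ROI in the color frame via its hue histogram."""
    x, y, w, h = roi_box
    hsv = cv2.cvtColor(color_frame, cv2.COLOR_BGR2HSV)    # step 105
    hist = cv2.calcHist([hsv[y:y + h, x:x + w]], [0], None,
                        [180], [0, 180])                  # hue values only
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    back = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    # CamShift varies the search window while locating the histogram match
    _, new_box = cv2.CamShift(back, (x, y, w, h), crit)   # step 106
    return new_box    # corrected ROI position in the color image
```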

In the rectangular region extracted from the background image, only the pixels with the value '255' are extracted again, and the pixels forming the outline polygon among all the pixels are selected (107). If the outline is detected, the positions of the pixels included in the outline are extracted (108). Among these, the position of the pixel closest to the screen is obtained, and the obtained pixel is used as the pointer for event determination (109). In the depth information, the horizontal and vertical axes are divided using the depth value and spatial coordinates of the position corresponding to the pointer and used for event determination, and the depth value of the pointer is compared with the spatial coordinates (110).
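Steps (107) to (109) might be sketched as follows; treating the smallest depth value on the outline as 'closest to the screen' is an assumption about the sensor placement rather than a detail given in the text.

```python
import cv2
import numpy as np

def find_pointer(roi_mask, depth):
    """Detect the outline in the ROI mask and pick the pointer pixel."""
    contours, _ = cv2.findContours(roi_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)   # step 107
    if not contours:
        return None
    outline = max(contours, key=cv2.contourArea)   # largest outline polygon
    pts = outline.reshape(-1, 2)                   # (x, y) positions, step 108
    depths = depth[pts[:, 1], pts[:, 0]]
    x, y = pts[np.argmin(depths)]                  # pixel closest to the screen
    return int(x), int(y), float(depths.min())     # pointer, step 109
```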

FIG. 6 shows an example of outline detection in a region of interest.

At this time, as shown in the example of FIG. 6, the corresponding event is executed when the pointer is included in an event range (111). The range designation and setting for event determination are performed by dividing the horizontal and vertical axes using the depth information and the spatial coordinates of the background image, and by setting the event to be executed.

FIG. 7 shows an example of an event area arrangement.

As shown in FIG. 7, the event is executed when the depth value and pixel position of the pointer correspond to the depth value area and spatial coordinate area of the event. Event types can include previous slide, zoom out, last slide, move left, move right, enter, resume, zoom in, close, and so on.
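A minimal sketch of the event check in steps (110) and (111): the event fires when the pointer's pixel coordinates and depth value fall inside an event's spatial rectangle and depth band. The event names follow the examples above; the coordinate and depth ranges are purely illustrative.

```python
# (name, x_min, x_max, y_min, y_max, d_min, d_max), all ranges illustrative
EVENTS = [
    ("previous slide", 0, 200, 0, 480, 900, 1000),
    ("move right", 440, 640, 0, 480, 900, 1000),
]

def dispatch(pointer):
    """Return the event whose area and depth band contain the pointer."""
    x, y, d = pointer
    for name, x0, x1, y0, y1, d0, d1 in EVENTS:
        if x0 <= x <= x1 and y0 <= y <= y1 and d0 <= d <= d1:
            return name    # the set event to execute (step 111)
    return None            # pointer outside every event area
```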

Meanwhile, the presentation control method through augmented reality according to the embodiment of the present invention can be implemented in the form of program instructions executable by various electronic information processing means and recorded on a storage medium. The storage medium may contain program instructions, data files, data structures, and the like, alone or in combination.

Program instructions recorded on the storage medium may be those specially designed and constructed for the present invention or those available to those skilled in software. Examples of storage media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. The medium may also be a transmission medium, such as an optical or metal wire or a waveguide, including a carrier wave carrying signals that designate program instructions, data structures, and the like. Examples of program instructions include not only machine code, such as that produced by a compiler, but also high-level language code that can be executed by a device that processes information electronically, for example through an interpreter.

As described above, the present invention has been described with reference to specific embodiments, such as specific components, and exemplary embodiments. However, the present invention is not limited to the above-described embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains.

Accordingly, the spirit of the present invention should not be construed as being limited to the described embodiments; the following claims and all equivalents of those claims fall within the scope of the present invention.

1000: Presentation control system
10: Depth and color camera
11: Background image generation unit
12: Labeling unit
13: Region-of-interest extraction unit
14: Histogram generation unit
15: Region-of-interest correction unit
16: Outline detection unit
17: Pointer setting unit
18: Event setting unit
19: Event execution unit

Claims (8)

A presentation control method using augmented reality, in which a depth camera and a color camera are installed at one of the upper, lower, left, and right sides of a screen and a presentation operation is controlled through augmented reality, the method comprising:
Generating a background image from distance information provided by a depth camera;
Extracting a region having a moving object from the background image, and labeling the extracted region;
Extracting a region of interest having a specific size by comparing sizes of pixels of the labeled region, and extracting a region having a minimum rectangular shape including the region of interest;
Changing a pixel of a corresponding region corresponding to the extracted minimum square shape region to a hue saturation value (HSV) type in a color image, extracting only a color value from the changed pixels, and generating a histogram;
Searching the entire color image for the region corresponding to the histogram, extracting the region corresponding to the histogram while varying the search range, and correcting the position when a portion occurs where the depth-image and color-image pixels do not coincide;
Extracting the region in which the moving object exists within the extracted minimum rectangular region, selecting the pixels forming the outline polygon among all the pixels, and connecting them to detect the outline;
Extracting a position of a pixel included in an outline, obtaining a position of a pixel closest to the screen, and setting the obtained pixel as a pointer for event determination;
Setting an event area and an event to be executed using the depth value and the spatial coordinates of the position corresponding to the position of the pointer in the depth information; And
Executing the set event when the pointer enters the set event area.
The method according to claim 1,
The step of setting the event
Wherein the range designation and setting for the determination of an event includes designating the event to be executed and dividing the horizontal and vertical axes using the spatial coordinates of the background image and the depth information over the range to be photographed.
The method according to claim 1,
The step of generating the histogram
Acquiring all depth information over the shooting range from the depth camera, and obtaining the HSV (Hue Saturation Value) signal of the extracted region of interest containing the moving object from the color camera.
The method according to claim 1,
The step of generating the background image
Creating a storage space covering the range of depth information when the background image is generated; including the depth value D(i, j, t+1) in the current threshold region if its difference from the depth value D(i, j, t) at position (i, j) at time t is less than or equal to the threshold T, as shown in the following equation, and setting the largest and smallest depth values in the region as the upper and lower thresholds; and continuing until a certain number of images have been input and the background image has accumulated.
$$\left|\,D(i,j,t+1) - D(i,j,t)\,\right| \le T \;\Rightarrow\; D(i,j,t+1) \in R(i,j),\qquad T_{up}(i,j) = \max_{d \in R(i,j)} d,\quad T_{low}(i,j) = \min_{d \in R(i,j)} d \tag{1}$$

The method of claim 4,
The labeling step
Wherein, if the difference between depth values is greater than or equal to T, a new threshold region is generated, upper and lower thresholds are set for the newly generated region as shown in the following equation, the pixel value of regions not included in the background image is changed to 255 and that of included regions to 0, and pixels having the value 255 are grouped and labeled sequentially from the first pixel of the background image.
$$\left|\,D(i,j,t+1) - D(i,j,t)\,\right| \ge T \;\Rightarrow\; \text{a new region } R'(i,j) \text{ is created},\qquad T_{up}(i,j) = \max_{d \in R'(i,j)} d,\quad T_{low}(i,j) = \min_{d \in R'(i,j)} d \tag{2}$$

The method of claim 1,
The step of executing the event
And executing an event when the depth value and the pixel position of the pointer correspond to the depth value area and the spatial coordinate area of the event.
A presentation control system comprising: a depth and color camera installed at one of upper, lower, right, and left sides of a screen, and for controlling a presentation operation by an augmented reality,
A background image generation unit for generating a background image from the distance information provided by the depth camera;
A labeling unit for extracting a region having a motion object from the background image and labeling the extracted region;
An interest region extracting unit for extracting a region of interest having a specific size by comparing sizes of pixels of the labeled region and extracting a region having a minimum rectangular shape including the region of interest;
A histogram generation unit for generating a histogram by extracting only the color values from the changed pixels by changing the pixels of the corresponding region corresponding to the extracted rectangular region in the color image to the HSV (Hue Saturation Value) type;
A region-of-interest correction unit that searches the entire color image for the region corresponding to the histogram, extracts the region corresponding to the histogram while varying the search range, and corrects the position when a portion occurs where the depth-image and color-image pixels do not coincide;
An outline detection unit that extracts the region in which the moving object exists within the extracted minimum rectangular region, selects the pixels forming the outline polygon among all the pixels, and connects them to detect the outline;
A pointer setting unit that extracts a position of a pixel included in an outline when an outline is detected, obtains a position of a pixel closest to the screen, and sets the obtained pixel as a pointer for event determination;
An event setting unit for setting an area of the event and an event to be executed using the depth value and the spatial coordinates of the position corresponding to the position of the pointer in the depth information; And
An event execution unit for executing a set event when a pointer enters the set event area;
A presentation control system comprising the above units.
A computer-readable recording medium storing a program for executing the method according to any one of claims 1 to 6.
KR1020130055567A 2013-05-16 2013-05-16 Method and apparatus for controlling augmented reality-based presentation KR101407249B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020130055567A KR101407249B1 (en) 2013-05-16 2013-05-16 Method and apparatus for controlling augmented reality-based presentation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020130055567A KR101407249B1 (en) 2013-05-16 2013-05-16 Method and apparatus for controlling augmented reality-based presentation

Publications (1)

Publication Number Publication Date
KR101407249B1 true KR101407249B1 (en) 2014-06-13

Family

ID=51132831

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020130055567A KR101407249B1 (en) 2013-05-16 2013-05-16 Method and apparatus for controlling augmented reality-based presentation

Country Status (1)

Country Link
KR (1) KR101407249B1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101806864B1 (en) * 2016-10-05 2017-12-08 연세대학교 산학협력단 Apparatus for controlling 3d object in augmmented reality environment and method thereof
CN113116367A (en) * 2019-12-30 2021-07-16 通用电气精准医疗有限责任公司 System and method for patient structure estimation during medical imaging

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20100049775A (en) * 2008-11-04 2010-05-13 한국전자통신연구원 Marker recognition apparatus using dynamic threshold and method thereof
KR20100106834A (en) * 2009-03-24 2010-10-04 주식회사 이턴 Surgical robot system using augmented reality and control method thereof
KR20110053288A (en) * 2009-11-09 2011-05-20 한국과학기술원 System and method of 3d object recognition using a tree structure
KR20110132260A (en) * 2010-05-29 2011-12-07 이문기 Monitor based augmented reality system


Similar Documents

Publication Publication Date Title
JP7004017B2 (en) Object tracking system, object tracking method, program
KR101636370B1 (en) Image processing apparatus and method
CN105493078B (en) Colored sketches picture search
EP4036855A1 (en) Depth determination method, depth determination device and electronic device
US10347000B2 (en) Entity visualization method
US20120163661A1 (en) Apparatus and method for recognizing multi-user interactions
WO2014136623A1 (en) Method for detecting and tracking objects in sequence of images of scene acquired by stationary camera
JP2015212849A (en) Image processor, image processing method and image processing program
US9965041B2 (en) Input device, apparatus, input method, and recording medium
JP2012191354A (en) Information processing apparatus, information processing method, and program
US20120155748A1 (en) Apparatus and method for processing stereo image
CN109671098B (en) Target tracking method and system applicable to multiple tracking
KR101407249B1 (en) Method and apparatus for controlling augmented reality-based presentation
US9727145B2 (en) Detecting device and detecting method
JP2006244272A (en) Hand position tracking method, device and program
KR101517538B1 (en) Apparatus and method for detecting importance region using centroid weight mask map and storage medium recording program therefor
CN106951077B (en) Prompting method and first electronic device
JP2020017136A (en) Object detection and recognition apparatus, method, and program
KR101350387B1 (en) Method for detecting hand using depth information and apparatus thereof
JP5217917B2 (en) Object detection and tracking device, object detection and tracking method, and object detection and tracking program
JP2017033556A (en) Image processing method and electronic apparatus
KR101853276B1 (en) Method for detecting hand area from depth image and apparatus thereof
KR20140123399A (en) Apparatus and Method of Body Parts Detection
KR101935969B1 (en) Method and apparatus for detection of failure of object tracking and retracking based on histogram
KR101893142B1 (en) Object extraction method and apparatus

Legal Events

A201: Request for examination
E701: Decision to grant or registration of patent right
GRNT: Written decision to grant
FPAY: Annual fee payment (payment date: 20170605; year of fee payment: 4)
FPAY: Annual fee payment (payment date: 20180605; year of fee payment: 5)
FPAY: Annual fee payment (payment date: 20190529; year of fee payment: 6)