KR20120135126A - Method for controlling augmented reality using pointing device and apparatus thereof - Google Patents

Method for controlling augmented reality using pointing device and apparatus thereof

Info

Publication number
KR20120135126A
Authority
KR
South Korea
Prior art keywords
user
pointing device
augmented reality
finger
location
Prior art date
Application number
KR1020120059566A
Other languages
Korean (ko)
Inventor
안건준
박승용
서종철
주윤선
조유숙
조성미
주효민
Original Assignee
크루셜소프트 주식회사
크루셜텍 (주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 크루셜소프트 주식회사 and 크루셜텍 (주)
Publication of KR20120135126A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/038 Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

PURPOSE: A method and apparatus for controlling AR (Augmented Reality) with a pointing device are provided, allowing a user to freely change his or her current location within the AR scene through a pointing device mounted on a portable terminal, and thus to conveniently obtain information about an object in the AR through finger manipulation alone. CONSTITUTION: A pointing device (100) generates an input signal according to finger movement. A display unit (110) superimposes virtual objects representing buildings or stores around the user's current location onto an image captured in real time and displays the result as augmented reality. When an input signal corresponding to a swipe motion is delivered from the pointing device, a control unit (160) detects the swipe displacement of the finger from the input signal and changes the user's location in the AR according to that displacement. [Reference numerals] (100) Pointing device; (110) Display unit; (130) Memory; (140) Communication unit; (150) Audio processing unit; (160) Control unit; (170) Photographing unit; (180) Location obtaining unit

Description

Augmented reality control method and device using a pointing device {METHOD FOR CONTROLLING AUGMENTED REALITY USING POINTING DEVICE AND APPARATUS THEREOF}

The present invention relates to an augmented reality control method and apparatus using a pointing device. More specifically, the present invention relates to a method and apparatus for controlling augmented reality displayed on a screen of a mobile terminal using a pointing device.

In general, advances in electronic communication technology have produced a wide variety of electronic devices, and these devices increasingly emphasize ease of operation and aesthetic design. One change highlighted by this trend is the shift from input devices represented by the keyboard or keypad to display-based input devices.

In particular, mobile terminals such as smartphones, tablet PCs, portable multimedia players (PMPs), and mobile phones minimize separate keyboards, keypads, and buttons for the sake of intuitive input, user convenience, and aesthetics, and instead adopt user input devices such as touch screens as their main input device.

A touch screen is an input device in which a sensor is embedded in the screen; user input is made when the sensor detects the user touching a specific object on the screen with a hand or a stylus pen.

Meanwhile, the functions of various portable electronic devices are being integrated into a single terminal, and intelligent high-performance terminals such as smartphones, which support computer functions such as Internet access and information retrieval, are rapidly becoming common. Accordingly, there is demand for an input method that lets the user access, check, and control detailed data related to specific content or programs more easily and quickly.

However, while a mobile terminal that employs a touch screen as its main input device can offer user input interfaces for various operation modes and allow intuitive input, it does not present an effective way for the user to more easily and quickly use, view, access, and control detailed data related to various contents or files.

For this reason, terminals have recently tended to mount both a touch screen and a pointing device.

Here, the pointing device is an input device that moves a cursor (or pointer) on the screen according to the direction in which the user's finger moves on its contact surface. The pointing device may use an infrared sensor, a laser, an electric-field sensor, a capacitive sensor, a thermistor, or the like.
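As a concrete illustration of these signals, the Kotlin sketch below models the two kinds of input the specification attributes to the pointing device: a swipe detection signal carrying a finger displacement, and a push detection signal from a press. This is a minimal sketch with hypothetical names, not the device's actual interface.

```kotlin
// Hypothetical model of the pointing device's two input signals:
// a swipe carrying the finger displacement, and a push from a press.
sealed class PointingInput {
    /** Swipe detection signal; dx/dy are the finger displacement in sensor units. */
    data class Swipe(val dx: Float, val dy: Float) : PointingInput()
    /** Push detection signal generated when the user presses the device. */
    object Push : PointingInput()
}
```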

FIG. 1 illustrates an example of an augmented reality screen provided by a recently released portable terminal.

Referring to FIG. 1, augmented reality provided by a portable terminal generally displays an image captured by the terminal's camera on the screen and overlays virtual objects on it. That is, virtual objects carrying additional information are combined with the image captured in real time and displayed as a single image. For example, when a user executes an augmented reality application on a mobile terminal, the trade names registered for the buildings or businesses around the user's current location are displayed on the screen; as the shooting direction changes, new information appears, and the user browses or searches the displayed information to obtain what he or she wants.

In the augmented reality provided by a conventional portable terminal, the names or trade names (i.e., virtual objects) of buildings located around the user's current location, identified through GPS or the like, are displayed. Virtual objects not currently shown appear on the screen only when the user physically moves with the terminal. In other words, to obtain information about an object located farther away, the user had to carry the portable terminal to where that object is located.

The present invention is intended to solve the above problems.

An object of the present invention is to let the user freely change his or her current position in augmented reality using the pointing device mounted on the mobile terminal, so that information about a desired object located farther away can be obtained easily through pointing device manipulation alone, without the user having to move there.

An augmented reality control apparatus using a pointing device according to an embodiment of the present invention includes: a pointing device that generates an input signal according to the movement of the user's finger; a display unit that superimposes, on an image captured in real time, a plurality of virtual objects representing buildings or shops near the user's current location and displays the result as augmented reality; and a control unit that, when an input signal corresponding to a swipe operation is transmitted from the pointing device, detects the swipe displacement of the user's finger from the transmitted signal and changes the user's position in the augmented reality according to the detected displacement.

In particular, the controller may highlight the virtual object of the neighboring building or shop closest to the user's position in the changed augmented reality, and may display, together with it, the moving distance from the user's current location to the building or shop corresponding to the highlighted virtual object.

In particular, the controller determines whether a push detection signal is input from the pointing device, and when the push detection signal is input, displays information related to the highlighted virtual object on the screen.

By using the pointing device mounted on the mobile terminal, the user can freely change his or her current position in augmented reality; information about a desired object located farther away can therefore be obtained easily through pointing device manipulation alone, without the user having to move there.

FIG. 1 illustrates an example of an augmented reality screen provided by a currently released portable terminal.
FIG. 2 is a perspective view showing a portable terminal according to the present invention.
FIG. 3 is a view explaining the components of FIG. 2 in detail.
FIGS. 4 to 7 are exemplary views explaining the augmented reality control method and apparatus using a pointing device according to the present invention.

DETAILED DESCRIPTION Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art may easily implement the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In the drawings, parts irrelevant to the description are omitted in order to clearly describe the present invention, and like reference numerals designate like parts throughout the specification.

Throughout the specification, when a part is said to 'include' a certain component, this means that it may further include other components rather than excluding them, unless otherwise stated.

FIG. 2 is a perspective view showing a portable terminal according to the present invention, FIG. 3 is a view explaining the components of FIG. 2 in detail, and FIGS. 4 to 7 are exemplary views explaining the augmented reality control method and apparatus using a pointing device according to the present invention. In FIG. 2, reference numeral 11 denotes the housing of the portable terminal, reference numeral 12 denotes the screen of the display unit, and reference numeral 100 denotes the pointing device.

Referring to FIGS. 2 to 7, the portable terminal 10 according to the present invention includes a pointing device 100, a display unit 110, a memory 130, a communication unit 140, an audio processing unit 150, a controller 160, a photographing unit 170, and a position obtaining unit 180.

The display unit 110 is controlled by the controller 160 to display operation states, operation results, and various information produced in the portable terminal 10. It visually provides the user with the terminal's menus, user data entered by the user, function setting information, and other information. In particular, the display unit 110 of the present invention displays augmented reality on the screen under the control of the controller 160.

Augmented reality in the mobile terminal according to the present invention combines virtual objects carrying additional information with the image captured in real time through the photographing unit 170 and shows them as a single image.

For example, when the user executes the augmented reality application of the portable terminal, the trade names registered for the buildings or businesses around the user's current location are displayed on the screen; as the direction of the photographing unit 170 changes, new information is displayed, and the user browses or searches the displayed information to obtain the desired information.

The display unit 110 may be configured as a display device such as an LCD, OLED, or LED display. In addition, since the display unit 110 according to the present invention includes a touch sensor (not shown), it can detect an input signal according to the user's touch position, touch intensity, and touch direction on the screen 12. The touch sensor may be implemented in various forms, such as a current-sensing type or a pressure-sensing type, and depending on the type applied, may detect the user's mere proximity without direct contact.

The memory 130 stores application programs for operating the functions according to an embodiment of the present invention, and includes a program area and a data area. The program area stores the operating system (OS) for booting the mobile terminal, the linkage between functional elements, and the program that changes the user's location (i.e., the location of the mobile terminal) in augmented reality according to the user's pointing device manipulation. The data area stores data generated as the mobile terminal 10 is used.

The communication unit 140 receives GPS signals transmitted from GPS satellites and passes them to the location acquisition unit 180, which uses them to calculate the current location of the mobile terminal 10. The calculated current location is used to determine the user's location in the augmented reality.

In addition, the communication unit 140 forms a communication channel between the mobile terminal 10 and a base station and handles the communication required to transmit and receive necessary signals.

The audio processing unit 150 reproduces audio signals output from the controller 160 and transmits audio signals, such as voice input from a microphone (MIC), to the controller 160. That is, under the control of the controller 160, the audio processing unit 150 converts voice and sound data into audible sound output through a speaker (SPK), and transfers audio signals such as voice received from the microphone to the controller 160.

The controller 160 controls the overall operation of the mobile terminal 10 and the signal flow between the internal blocks.

In particular, with the names of nearby buildings or shops displayed as virtual objects (i.e., icons) around the user's current location (i.e., the location of the mobile terminal; see FIG. 1), the controller 160 determines whether a signal corresponding to the user's finger movement is input from the pointing device 100 and executes the function according to the present invention in response. The input signal may be a swipe detection signal generated by the pointing device 100 according to a swipe gesture of the user's finger, or a push detection signal generated by the pointing device 100 when the user presses with the finger.

More specifically, with the surrounding buildings or shops displayed as icons around the current location of the mobile terminal 10 (i.e., the user's current location) on the screen (see FIG. 4), when a swipe detection signal is input from the pointing device 100, the controller 160 detects the swipe displacement of the user's finger from the input signal and changes the user's position in the augmented reality displayed on the screen 12 according to the detected displacement.

In other words, operating the pointing device 100 produces the same effect as if the user had moved, without the user actually carrying the portable terminal 10 anywhere.

For example, on a screen such as that of FIG. 1, when the user touches a finger to the pointing device 100 and swipes in a specific direction, the user's position in the augmented reality moves in the direction corresponding to the detected swipe displacement. When the user swipes the pointing device 100 in the 12 o'clock direction, the user's current position moves in the same direction as the shooting direction of the photographing unit 170.
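As a rough sketch of this movement rule, the fragment below shifts a virtual user location by a swipe displacement, treating a 12 o'clock swipe (positive dy) as motion along the camera heading. The scaling factor and coordinate model are assumptions for illustration; the patent does not specify them.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

/** Virtual user location in a local east/north frame, in meters (assumed model). */
data class VirtualLocation(var east: Double, var north: Double)

const val METERS_PER_SWIPE_UNIT = 5.0 // assumed scaling from sensor units to meters

/**
 * Shifts the virtual user location by a swipe displacement.
 * A 12 o'clock swipe (positive dy) moves the location along the camera
 * heading; horizontal swipe components move it sideways.
 */
fun applySwipe(loc: VirtualLocation, dx: Float, dy: Float, cameraHeadingRad: Double) {
    val forward = dy * METERS_PER_SWIPE_UNIT
    val sideways = dx * METERS_PER_SWIPE_UNIT
    loc.east += forward * sin(cameraHeadingRad) + sideways * cos(cameraHeadingRad)
    loc.north += forward * cos(cameraHeadingRad) - sideways * sin(cameraHeadingRad)
}
```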

Based on the user location changed by manipulating the pointing device 100 (i.e., the 'virtual user location'), the icons of buildings or stores are displayed, and the icon of the building or store closest to the virtual user location is highlighted and shown together with its name. The distance from the point where the mobile terminal 10 is actually located (the terminal's real location acquired through the location acquisition unit) to the building or shop indicated by the highlighted icon is displayed together with the name.

Through this configuration, the user can accurately grasp the distance and route from his or her current position to the building or shop to visit. Here, 'highlighting' means emphasizing an icon so that it is distinguished from the other icons.
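A minimal sketch of this selection step, assuming simple data types, might pick the place nearest the virtual user location and compute the displayed distance from the terminal's real GPS position with the haversine formula. All names here are hypothetical illustrations, not the claimed implementation.

```kotlin
import kotlin.math.*

data class GeoPoint(val latDeg: Double, val lonDeg: Double)
data class Place(val name: String, val position: GeoPoint)

/** Great-circle distance in meters between two GPS coordinates (haversine). */
fun distanceMeters(a: GeoPoint, b: GeoPoint): Double {
    val r = 6_371_000.0
    val dLat = Math.toRadians(b.latDeg - a.latDeg)
    val dLon = Math.toRadians(b.lonDeg - a.lonDeg)
    val h = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(a.latDeg)) * cos(Math.toRadians(b.latDeg)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(h))
}

/**
 * Picks the place nearest the virtual user location for highlighting and
 * pairs it with the distance from the terminal's actual location.
 */
fun selectHighlight(
    places: List<Place>,
    virtualLoc: GeoPoint,
    realLoc: GeoPoint
): Pair<Place, Double>? =
    places.minByOrNull { distanceMeters(it.position, virtualLoc) }
        ?.let { it to distanceMeters(realLoc, it.position) }
```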

Referring to FIG. 4, the highlighted icon 16 may be separately displayed on the minimap 15, and the center point of the minimap 15 represents the 'virtual user location'.

In the state shown in FIG. 4, when the user touches the pointing device 100 and swipes the finger a predetermined number of times in the 12 o'clock direction, the previously highlighted (activated) icon 16 disappears from the screen, and the icon 17 located behind it is enlarged and highlighted (see FIG. 5). Similarly, in the state shown in FIG. 5, when the user swipes the finger a predetermined number of times in the 12 o'clock direction, the previously highlighted icon 17 disappears from the screen, and the icon 18 located behind it is enlarged and highlighted.

Conversely, in the state of FIG. 4, when the user touches the pointing device 100 and swipes the finger toward the 6 o'clock direction, the highlighted icon 16 is pushed backward, its highlight is released, and the icons shrink, while icons of buildings or shops that were not previously displayed are brought onto the screen, highlighted, and displayed together with text (e.g., names and distances).
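This forward and backward cycling could be sketched as an index over the places ordered near-to-far from the virtual user location, advanced by a 12 o'clock swipe and stepped back by a 6 o'clock swipe. This is again a hypothetical illustration, reusing the Place type from the previous sketch.

```kotlin
/**
 * Cycles the highlight through places ordered near-to-far from the
 * virtual user location. A 12 o'clock swipe selects the next farther
 * place; a 6 o'clock swipe steps back to a nearer one.
 */
class HighlightCycler(placesByDistance: List<Place>) {
    private val ordered = placesByDistance
    private var index = 0

    /** Currently highlighted place, or null if the list is empty. */
    val current: Place? get() = ordered.getOrNull(index)

    fun swipeForward() {          // 12 o'clock swipe
        if (index < ordered.lastIndex) index++
    }

    fun swipeBackward() {         // 6 o'clock swipe
        if (index > 0) index--
    }
}
```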

The controller 160 determines whether a push detection signal is input from the pointing device 100 while a specific icon on the screen is highlighted. To this end, the pointing device 100 is provided with a dome switch capable of detecting the user's pressing operation.

When the push detection signal is input from the pointing device 100, the controller 160 displays information related to the highlighted icon on the screen. For example, as shown in FIG. 7, detailed information about the corresponding building or shop (e.g., distance, business type, user preferences, business hours, travel time, coupons) may be displayed.
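Tying these pieces together, a controller loop might dispatch the pointing device's signals as below. This is a hedged sketch that reuses the hypothetical PointingInput and HighlightCycler types from the earlier fragments; the actual controller 160 is not specified at this level of detail.

```kotlin
/**
 * Dispatches pointing-device signals: swipes cycle the highlight,
 * a push shows details for the currently highlighted place.
 */
fun handleInput(input: PointingInput, cycler: HighlightCycler) {
    when (input) {
        is PointingInput.Swipe ->
            if (input.dy > 0) cycler.swipeForward() else cycler.swipeBackward()
        PointingInput.Push ->
            cycler.current?.let { place ->
                // In the patent's example (FIG. 7), this would render
                // distance, business type, hours, coupons, etc.
                println("Showing details for ${place.name}")
            }
    }
}
```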

The photographing unit 170 transmits still images or video captured by photographing means such as a camera module to the controller 160.

The pointing device 100 applied to the present invention detects a movement of a subject (finger) and transmits a motion detection signal corresponding thereto to the controller 160.

The pointing device 100 may detect the movement of a subject using a touch pad, a display unit, a trackball, a scroll wheel, or a sensor such as an infrared (IR) sensor, a laser, an electric-field sensor, a capacitive sensor, or a thermistor. Since such pointing devices are well known and those skilled in the art can easily understand them from the description herein without further special knowledge, a more detailed description is omitted.

Here, the subject may generally be understood as a body part of the user, such as a finger, but it may also be any other object that allows the pointing device 100 to detect movement.

As described above, an optimal embodiment has been disclosed in the drawings and the specification. Although specific terms have been employed herein, they are used only to describe the present invention and are not intended to limit its meaning or the scope of the invention set forth in the claims. Therefore, those skilled in the art will understand that various modifications and equivalent embodiments are possible. Accordingly, the true technical scope of the present invention should be determined by the appended claims.

10: portable terminal 100: pointing device
110: display unit 130: memory
140: communication unit 150: audio processing unit
160: control unit 170: photographing unit
180: location obtaining unit

Claims (3)

An augmented reality control apparatus using a pointing device, comprising:
a pointing device generating an input signal according to a user's finger movement;
a display unit configured to superimpose, on an image captured in real time, a plurality of virtual objects representing buildings or shops near the user's current location and to display the result as augmented reality; and
a control unit which, when an input signal according to a swipe operation is transmitted from the pointing device, detects a swipe displacement of the user's finger based on the transmitted input signal and changes the user's position in the augmented reality according to the detected swipe displacement.
The apparatus according to claim 1, wherein the control unit highlights the virtual object of the neighboring building or shop closest to the user's position in the changed augmented reality, and displays together with it the moving distance from the user's current location to the building or shop corresponding to the highlighted virtual object.
The apparatus according to claim 2, wherein the control unit determines whether a push detection signal is input from the pointing device and, when the push detection signal is input, displays information related to the highlighted virtual object on a screen.
KR1020120059566A 2011-06-02 2012-06-04 Method for controlling augmented reality using pointing device and apparatus thereof KR20120135126A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20110053323 2011-06-02
KR1020110053323 2011-06-02

Publications (1)

Publication Number Publication Date
KR20120135126A (en) 2012-12-12

Family

ID=47903129

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120059566A KR20120135126A (en) 2011-06-02 2012-06-04 Method for controlling augmented reality using pointing device and apparatus thereof

Country Status (1)

Country Link
KR (1) KR20120135126A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9501871B2 (en) 2014-04-30 2016-11-22 At&T Mobility Ii Llc Explorable augmented reality displays
US9940755B2 (en) 2014-04-30 2018-04-10 At&T Mobility Ii Llc Explorable augmented reality displays
US10460522B2 (en) 2014-04-30 2019-10-29 At&T Mobility Ii Llc Explorable augmented reality displays
US11941162B2 (en) 2021-03-24 2024-03-26 Hyundai Motor Company Mobile apparatus and vehicle displaying augmented reality image

Similar Documents

Publication Publication Date Title
US20230359340A1 (en) Omnidirectional gesture detection
US9329714B2 (en) Input device, input assistance method, and program
US20080024454A1 (en) Three-dimensional touch pad input device
US9459704B2 (en) Method and apparatus for providing one-handed user interface in mobile device having touch screen
US20100328351A1 (en) User interface
KR20130142824A (en) Remote controller and control method thereof
KR20140035870A (en) Smart air mouse
US20150160731A1 (en) Method of recognizing gesture through electronic device, electronic device, and computer readable recording medium
CN101098533A (en) Keypad touch user interface method and mobile terminal using the same
KR20140105691A (en) Apparatus and Method for handling object in a user device having a touch screen
KR20140136855A (en) Function performing method and electronic device thereof
KR101339985B1 (en) Display apparatus, remote controlling apparatus and control method thereof
US20120262369A1 (en) Hand-mountable device for providing user input
GB2517284A (en) Operation input device and input operation processing method
KR20120135126A (en) Method for controlling augmented reality using pointing device and apparatus therefof
KR101165388B1 (en) Method for controlling screen using different kind of input devices and terminal unit thereof
EP2511792A1 (en) Hand-mountable device for providing user input
KR20120115620A (en) Method for controlling user interface of portable termianl using movement sensing device and apparatus therefof
KR101888902B1 (en) Method for displayng photo album of mobile termianl using movement sensing device and apparatus therefof
KR20120134469A (en) Method for displayng photo album image of mobile termianl using movement sensing device and apparatus therefof
KR20120134399A (en) Method for providing schedule information using movement sensing device and apparatus therefof
KR20120134485A (en) Method for searching index list using movement sensing device and apparatus therefof
KR20120134374A (en) Method for controlling 3d mode of navigation map using movement sensing device and apparatus therefof
KR101888904B1 (en) Method for displayng e-book of mobile termianl using movement sensing device and apparatus therefof
KR20120078816A (en) Providing method of virtual touch pointer and portable device supporting the same

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application