TW201132936A - Real-time augmented reality device, real-time augmented reality method and computer program product thereof - Google Patents

Real-time augmented reality device, real-time augmented reality method and computer program product thereof

Info

Publication number
TW201132936A
Authority
TW
Taiwan
Prior art keywords
instant
virtual
real
augmented reality
microprocessor
Prior art date
Application number
TW99108329A
Other languages
Chinese (zh)
Other versions
TWI408342B (en)
Inventor
Yu-Chang Chen
Yung-Chih Liu
Shih-Yuan Lin
Original Assignee
Inst Information Industry
Prosense Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inst Information Industry, Prosense Technology Corp filed Critical Inst Information Industry
Priority to TW99108329A priority Critical patent/TWI408342B/en
Publication of TW201132936A publication Critical patent/TW201132936A/en
Application granted granted Critical
Publication of TWI408342B publication Critical patent/TWI408342B/en

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

A real-time augmented reality device, a real-time augmented reality method and a computer program product are provided. The real-time augmented reality device may work with a navigation device, a reality image capture device and a user image capture device. The navigation device is configured to generate navigation information according to a current location of the navigation device. The reality image capture device is configured to capture a real-time reality image which comprises a reality object. The user image capture device is configured to capture a real-time user image which comprises a face object. The real-time augmented reality device is configured to generate guidance information according to the navigation information, the real-time reality image and the real-time user image.

Description

Description of the Invention

[Technical Field] The present invention relates to a real-time augmented reality device, a real-time augmented reality method, and a computer program product thereof. More specifically, the present invention relates to an augmented reality device, an augmented reality method, and a computer program product thereof that generate guidance information according to navigation information, a real-time reality image, and a real-time user image.
[Prior Art] In recent years, as positioning and navigation technology has rapidly matured, displays that assist positioning and navigation have become common in daily life, for example head-up displays (HUDs), head-mounted display glasses (HMDs), and window panels with embedded LCD/OLED displays. Among these, the head-up display has been widely accepted by the consumer market. The operation of a conventional HUD positioning and navigation system is described below.

A head-up display is a display technology that helps the driver take in all required driving information while keeping the line of sight forward. Using the principle of optical reflection, a HUD projects important driving-related information onto a piece of glass located in front of the driver; when the driver looks forward through the HUD, the outside scene and the information shown by the HUD merge together easily. The design intent is that the driver need not look down at the instruments: the displayed information can be seen through the HUD while keeping the head up.

In a conventional HUD positioning and navigation system, however, the display panel necessarily lies between the driver's eyes and the scene behind it, and the distance from the panel to the driver's eyes is far smaller than the distance to that scene. The information shown by the HUD is usually spatially correlated with the scene behind it. Consequently, with a fixed HUD, when the driver's head shifts (for example, the upper body leans to the left or right), the information shown on the HUD becomes offset from the original scene as seen from the driver's viewpoint; likewise, with a movable HUD, when the driver's gaze direction changes, the displayed information also becomes offset from the scene behind it.
In summary, how to overcome the drawback that the information displayed by a HUD becomes offset from the scene behind it, so that a HUD positioning and navigation system can automatically correct the display offset according to the driver's sight angle, is a problem that those skilled in this field urgently need to solve.

SUMMARY OF THE INVENTION

One object of the present invention is to provide a real-time augmented reality device that can be used with a navigation device, a reality image capture device, and a user image capture device. The navigation device generates navigation information according to a current location of the navigation device; the reality image capture device captures a real-time reality image comprising a reality object; and the user image capture device captures a real-time user image comprising a face object.

The real-time augmented reality device of the present invention comprises a transmit/receive interface, a storage, and a microprocessor. The transmit/receive interface is electrically connected to the navigation device, the reality image capture device, and the user image capture device; the microprocessor is electrically connected to the transmit/receive interface and the storage. The transmit/receive interface receives the navigation information, the real-time reality image, and the real-time user image. The storage stores an actual size of the reality object, a preset position of the face object, and a preset sight angle. The microprocessor determines a virtual sight angle that the face object has in the real-time user image, a virtual position that the face object has in the real-time user image, and a virtual size that the reality object has in the real-time reality image, and generates guidance information according to the actual size, the preset position, the preset sight angle, the virtual sight angle, the virtual position, the virtual size, and the navigation information.

Another object of the present invention is to provide a real-time augmented reality method for the aforementioned real-time augmented reality device, comprising the following steps: (A) causing a transmit/receive interface to receive the navigation information, the real-time reality image, and the real-time user image; (B) causing a microprocessor to determine a virtual sight angle that the face object has in the real-time user image; (C) causing the microprocessor to determine a virtual position that the face object has in the real-time user image; (D) causing the microprocessor to determine a virtual size that the reality object has in the real-time reality image; and (E) causing the microprocessor to generate guidance information according to the actual size, the preset position, the preset sight angle, the virtual sight angle, the virtual position, the virtual size, and the navigation information.

Yet another object of the present invention is to provide a computer program product storing a program for executing the real-time augmented reality method for the aforementioned device. After the program is loaded into the real-time augmented reality device, it executes: a program instruction A, causing a transmit/receive interface to receive the navigation information, the real-time reality image, and the real-time user image; a program instruction B, causing a microprocessor to determine a virtual sight angle that the face object has in the real-time user image; a program instruction C, causing the microprocessor to determine a virtual position that the face object has in the real-time user image; a program instruction D, causing the microprocessor to determine a virtual size that the reality object has in the real-time reality image; and a program instruction E, causing the microprocessor to generate guidance information according to the actual size, the preset position, the preset sight angle, the virtual sight angle, the virtual position, the virtual size, and the navigation information.

In summary, the real-time augmented reality device of the present invention can generate, from a real-time user image comprising a face object, a real-time reality image comprising a reality object, and navigation information, guidance information that matches the driver's visual recognition habits. In other words, the guidance information is produced with corrections for the driver's head offset and gaze change, so the drawback of conventional HUDs, namely that the displayed information becomes offset from the original scene behind it, is effectively overcome, increasing the overall added value of a HUD positioning and navigation system.

[Embodiments] The contents of the present invention are explained below through embodiments. The embodiments are not intended to limit the invention to any specific environment, application, or particular implementation described therein.
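The method steps (A) through (E) described above can be sketched as a minimal data-flow in code. Everything below is illustrative only: the names, types, and units are assumptions made for the sketch, not taken from the patent, and the real device would obtain the measured values from image analysis rather than as plain numbers.

```python
from dataclasses import dataclass

# Illustrative sketch of steps (A)-(E); all names and units are
# hypothetical, not from the patent.

@dataclass
class Inputs:
    navigation: tuple            # step (A): 2-D navigation info, e.g. (distance, bearing)
    virtual_sight_angle: float   # step (B): degrees, from the user image
    virtual_position: tuple      # step (C): pixel coordinates of the face
    virtual_size: tuple          # step (D): (length_px, width_px) of the reality object

@dataclass
class Stored:
    actual_size: tuple           # (length_m, width_m) of the reality object
    preset_position: tuple       # reference face position
    preset_sight_angle: float    # reference sight angle, degrees

def generate_guidance(inp: Inputs, stored: Stored) -> dict:
    """Step (E): combine stored references with the measured values."""
    angle_diff = inp.virtual_sight_angle - stored.preset_sight_angle
    pos_diff = (inp.virtual_position[0] - stored.preset_position[0],
                inp.virtual_position[1] - stored.preset_position[1])
    # A real device would go on to derive elevation/off angles from the
    # size ratio and warp the navigation arrow; here we just return the
    # intermediate quantities.
    return {"angle_diff": angle_diff, "pos_diff": pos_diff,
            "navigation": inp.navigation}
```

The differences computed here correspond to the sight-angle difference and position difference that the description derives before producing the guidance information.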
Therefore, the description of the following embodiments is for illustration only and is not a limitation of the invention. It should be noted that, in the following embodiments and drawings, elements not directly related to the present invention are omitted from depiction, and the dimensional relationships among the elements in the drawings are only for ease of understanding and are not intended to limit the actual scale.

The first embodiment of the present invention is shown in FIG. 1, a schematic diagram of a real-time augmented reality navigation display system 1. The system 1 comprises a real-time augmented reality device 11, a reality image capture device 13, a navigation device 15, a user image capture device 17, and a display device 19. In this embodiment the system 1 is used in a vehicle; in other embodiments it may, according to the user's actual needs, be applied to other means of transport, for example aircraft, boats, and motorcycles, without limiting the scope of application of the invention. The following describes how the real-time augmented reality device 11 works with the reality image capture device 13, the navigation device 15, the user image capture device 17, and the display device 19 to realize the system 1.

The navigation device 15 of the system 1 generates navigation information 150 according to its current location. The reality image capture device 13 captures a real-time reality image 130 comprising a reality object, and the user image capture device 17 captures a real-time user image 170 comprising a face object. The real-time augmented reality device 11 stores an actual size 1130 of the reality object, a preset position 1132 of the face object, and a preset sight angle 1134, and according to the actual size 1130, the preset position 1132, the preset sight angle 1134, the real-time reality image 130, the real-time user image 170, and the navigation information 150, generates and transmits guidance information 117 to the display device 19, so that the display device 19 can display the guidance information 117 for the vehicle driver's reference.

In this embodiment, the navigation device 15 uses GPS technology to determine the longitude and latitude, direction, speed, altitude, and other information of the navigation device 15 itself or of the place where it is installed, and uses inertial navigation aids such as an electronic compass, accelerometers, and gyroscopes to help compute information between GPS update periods. It then uses the positioning information together with map information to confirm the vehicle's location and decide the travel path, thereby producing the navigation information 150, which is two-dimensional information. In other embodiments the navigation device 15 may produce the navigation information 150 with other positioning technologies; the invention is not limited in this respect.

Furthermore, the real-time reality image 130 captured by the reality image capture device 13 may be input directly and in real time, for example from a camera mounted at the front of the vehicle. It may also be input indirectly and non-real-time; for example, a simulated cockpit may use recorded video, or computer 3D imagery derived from recorded video, as the real-time reality image 130.

For convenience of the following description, in this embodiment the system 1 is installed in a moving vehicle. The navigation information 150 generated by the navigation device 15 can be regarded as comprising the current location of the moving vehicle, while the real-time reality image 130 captured by the reality image capture device 13 can be regarded as the scenery surrounding the moving vehicle, such as the roads and roadside trees seen by the driver, and the real-time user image 170 captured by the user image capture device 17 comprises the driver's face as its face object. The following describes how the real-time augmented reality device 11 generates the guidance information 117.

As shown in FIG. 1, the real-time augmented reality device 11 comprises a transmit/receive interface 111, a storage 113, a projector 114, and a microprocessor 115. The transmit/receive interface 111 is electrically connected to the navigation device 15, the reality image capture device 13, the user image capture device 17, and the display device 19; the microprocessor 115 is electrically connected to the transmit/receive interface 111, the projector 114, and the storage 113. The storage 113 stores an actual size 1130 of the reality object (namely a road divider line), a preset position 1132 of the face object, and a preset sight angle 1134.

After the navigation device 15 generates the navigation information 150, the reality image capture device 13 captures the real-time reality image 130 comprising the road divider line, and the user image capture device 17 captures the real-time user image 170 comprising the face object, the transmit/receive interface 111 can receive the navigation information 150, the real-time user image 170, and the real-time reality image 130. The microprocessor 115 can then determine, according to an object edge-detection method, a virtual size that the road divider line has in the real-time reality image 130 (which may be a virtual length and a virtual width of the divider line) for subsequent processing. It should be noted that the object edge-detection method adopted in this embodiment can be achieved with known techniques, and that in other embodiments the microprocessor 115 may determine the virtual length and virtual width of the divider line in the real-time reality image 130 by other means; the invention is not limited in this respect.

Besides determining the virtual size that the road divider line has in the real-time reality image 130, the microprocessor 115 further determines a virtual sight angle that the face object has in the real-time user image 170, and a virtual position that the face object has in the real-time user image 170. It should be noted that the order in which the microprocessor 115 determines the virtual size, the virtual sight angle, and the virtual position may change with the actual application and is not limited thereto.
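The idea of measuring a "virtual size" in image space can be illustrated with a toy sketch. The patent leaves the edge-detection step to known techniques, so the code below assumes that step has already produced a binary mask (1 = divider-line pixel) and only shows how a virtual length and width could be read off it; the function name and mask layout are invented for the example.

```python
# Toy sketch: read a divider line's "virtual size" (in pixels) from a
# binary mask. A real system would first run an edge detector on the
# real-time reality image; here the mask is given directly.

def virtual_size(mask):
    """Return (virtual_length, virtual_width) in pixels from a binary mask."""
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    if not rows:
        return (0, 0)
    length = max(rows) - min(rows) + 1   # extent along the road in the image
    width = max(cols) - min(cols) + 1    # extent across the marking
    return (length, width)

mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(virtual_size(mask))  # → (3, 2)
```

A bounding box is of course a simplification; the described device only needs some consistent pixel measurement of length and width to form the ratios used later.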
Specifically, when the vehicle travels on the road surface, the driver's eyes and head translate and move up and down. To generate the guidance information 117 accurately, the microprocessor 115 determines from the real-time user image 170 the driver's current sight angle, which is the virtual sight angle in the real-time user image 170. On the other hand, when the driver's face moves within the vehicle, for example turning toward the right window or stretching back to look behind, the microprocessor 115 determines from the real-time user image 170 the moved face position, which is the virtual position in the real-time user image 170.

Next, to learn how much the sight angle of the face object has changed, the microprocessor 115 produces a sight-angle difference from the determined virtual sight angle and the preset sight angle 1134 stored in the storage 113. The microprocessor 115 then produces a position difference from the preset position 1132 stored in the storage 113 and the determined virtual position, to learn how much the position of the face object has changed.

The microprocessor 115 also calculates, from the actual size 1130 of the reality object (comprising its actual length and actual width) and the determined virtual size (comprising the virtual length and virtual width), an elevation angle between the image-capture direction of the reality image capture device 13 and a horizontal plane, and then calculates, from the actual length, actual width, virtual length, virtual width, and the navigation information 150, an off angle between the image-capture direction and the traveling direction of the navigation device 15. The microprocessor 115 then generates guidance information 117 according to the elevation angle, the off angle, the sight-angle difference, the position difference, and the navigation information 150.

In detail, under road regulations the actual length and actual width of a road divider line are fixed. After determining the virtual length and virtual width of the divider line, the microprocessor 115 uses the ratios of actual length to virtual length and of actual width to virtual width to calculate the elevation angle between the image-capture direction of the reality image capture device 13 and the horizontal plane, and further uses those ratios together with the navigation information 150 to calculate the off angle between the image-capture direction and the traveling direction of the navigation device 15. From the elevation angle, the off angle, the sight-angle difference, the position difference, and the navigation information 150, the microprocessor 115 can produce guidance information 117 that accounts for the vertical viewing depth.

More specifically, FIG. 2 is a schematic diagram of a vehicle 21, carrying the real-time augmented reality navigation display system 1, traveling on a road surface on which there is a road divider line 23 with actual length D and actual width W. The reality image capture device 13 captures the real-time reality image 130 along the image-capture direction seen from position 27. The road divider line contained in the real-time reality image 130 changes with the traveling direction of the vehicle and with the terrain or course of the road; in short, the virtual length and virtual width of the divider line in the real-time reality image 130 change with the traveling direction of the vehicle and with the terrain or course of the road.

As the microprocessor 115 continuously determines the virtual length and virtual width of the divider line, the current off angle between the image-capture direction and the traveling direction of the navigation device 15, and the elevation angle between the image-capture direction and the horizontal plane, can be computed continuously and in real time, so that the microprocessor 115 converts the two-dimensional navigation information of the navigation device 15 into three-dimensional guidance information. In other words, the microprocessor 115 converts distances in the two-dimensional map presented by the navigation device into depth in the captured image. When the guidance information is an arrow symbol, because it is produced in real time from the elevation angle, the off angle, the sight-angle difference, the position difference, and the navigation information 150, the arrow still lands correctly at the center of a sudden fork in the road without drifting, indicating to the driver the correct turn to take.
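The ratio-based angle recovery described above can be illustrated under a deliberately crude model. The assumptions below are not from the patent: a pinhole camera, a marking near the image center, a width that fixes the pixels-per-meter scale, and a length foreshortened by the sine of the elevation angle. The patent only states that the angle is computed from the length and width ratios; this is one simple way such a computation could look.

```python
import math

# Crude, illustrative recovery of the camera's elevation angle from the
# foreshortening of a road marking of known size. Model assumptions
# (not from the patent): width is unaffected by pitch and sets the
# scale, while apparent length shrinks by sin(elevation).

def elevation_angle(actual_len, actual_wid, virt_len, virt_wid):
    scale = virt_wid / actual_wid            # pixels per metre at the marking
    ratio = virt_len / (actual_len * scale)  # observed / unforeshortened length
    ratio = max(-1.0, min(1.0, ratio))       # guard the asin domain
    return math.degrees(math.asin(ratio))

# A 4 m x 0.15 m lane segment seen as 80 px x 6 px:
# scale = 40 px/m, unforeshortened length = 160 px, ratio = 0.5 → 30°.
print(round(elevation_angle(4.0, 0.15, 80.0, 6.0), 1))  # → 30.0
```

The off angle against the traveling direction would follow from an analogous comparison that additionally uses the navigation information, as the description states.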
It should be emphasized that the microprocessor 115 generates the guidance information 117 by way of a domain transform, based on the elevation angle, the off angle, the sight-angle difference, the position difference, and the navigation information 150. In other words, from data such as the elevation angle, the off angle, the sight-angle difference, and the position difference, a matrix can be computed, and the navigation information 150 is domain-transformed according to this matrix, so that the arrow symbol used to indicate the road direction has its upper and lower parts compressed by the matrix, becoming guidance information 117 that accounts for the vertical viewing depth.

On the other hand, in this embodiment the display device 19 may be a see-through head-up display (HUD) device, electrically connected to the transmit/receive interface 111. After the guidance information 117 is generated, the microprocessor 115, through the transmit/receive interface 111, transmits the guidance information 117 to the see-through head-up display device, so that the see-through head-up display device can display the guidance information 117.
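The matrix-based domain transform can be sketched as an ordinary projective warp of the arrow's vertices. The 3x3 matrix values below are made up purely for demonstration; in the described device the matrix would be derived from the elevation angle, off angle, sight-angle difference, and position difference.

```python
# Illustrative "domain transform": warp the 2-D guidance arrow by a
# 3x3 matrix so that its far (upper) end is compressed, giving a sense
# of depth. Matrix values are invented for the example.

def warp(points, H):
    """Apply a 3x3 projective matrix H to a list of (x, y) points."""
    out = []
    for x, y in points:
        xh = H[0][0] * x + H[0][1] * y + H[0][2]
        yh = H[1][0] * x + H[1][1] * y + H[1][2]
        wh = H[2][0] * x + H[2][1] * y + H[2][2]
        out.append((xh / wh, yh / wh))
    return out

# The perspective row shrinks geometry as y grows (farther up the road).
H = [[1.0, 0.0,  0.0],
     [0.0, 1.0,  0.0],
     [0.0, 0.01, 1.0]]

arrow = [(-1.0, 0.0), (1.0, 0.0), (0.0, 100.0)]  # a simple triangle
print(warp(arrow, H))  # → [(-1.0, 0.0), (1.0, 0.0), (0.0, 50.0)]
```

The far tip at y = 100 is pulled down to y = 50 while the near base is untouched, which is the "upper part compressed" effect the description attributes to the transform.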

Furthermore, in this embodiment the display device 19 may instead be a projection head-up display device: the microprocessor 115 may transmit the guidance information 117 through the projector 114 to the projection head-up display device, so that the projection head-up display device can display the guidance information 117. It should be noted that when the display device 19 is a see-through head-up display device, the projector 114 of the real-time augmented reality device 11 may be omitted to save hardware cost of the real-time augmented reality device 11.

The microprocessor 115 may further composite the guidance information 117 onto the real-time reality image 130 to produce a navigation image 119. The microprocessor 115 transmits the navigation image 119 through the transmit/receive interface 111 to the display device 19, so that the display device 19 can display the navigation image 119 for the vehicle driver's reference.

Specifically, the navigation image 119 is produced by the microprocessor 115 compositing the real-time reality image 130 with the guidance information 117. In other words, if the guidance information 117 is an arrow symbol, the real-time reality image 130 is composited with the arrow symbol, and the navigation image 119 seen by the driver is the real-time reality image 130 combined with guidance information 117 that accounts for the vertical viewing depth. It should be particularly noted that the guidance information 117 may also be other graphics; this does not limit the scope of the invention.

A second embodiment of the present invention, shown in FIGS. 3A-3E, is a flowchart of a real-time augmented reality method for the real-time augmented reality device described in the first embodiment. The device can be used with a navigation device, a reality image capture device, and a user image capture device; the navigation device generates navigation information according to its current location, the reality image capture device captures a real-time reality image comprising a reality object, and the user image capture device captures a real-time user image comprising a face object.

In addition, the real-time augmented reality device comprises a transmit/receive interface, a projector, a storage, and a microprocessor. The microprocessor is electrically connected to the transmit/receive interface and the storage; the transmit/receive interface is electrically connected to the navigation device, the reality image capture device, and the user image capture device; and the storage stores an actual size of the reality object, a preset position of the face object, and a preset sight angle.

The technical means adopted by the real-time augmented reality method of the second embodiment are substantially the same as those of the real-time augmented reality device of the first embodiment. Those of ordinary skill in the art can readily understand from the first embodiment how the method of the second embodiment is realized, so only a brief description follows.

The real-time augmented reality method of this embodiment comprises the following steps. Referring first to FIG. 3A, step 301 is executed to cause a transmit/receive interface to receive the navigation information, the real-time reality image, and the real-time user image; step 302 causes a microprocessor to determine a virtual sight angle that the face object has in the real-time user image; step 303 causes the microprocessor to determine a virtual position that the face object has in the real-time user image; and step 304 causes the microprocessor to determine a virtual size that the reality object has in the real-time reality image. Referring next to FIG. 3B, step 305 causes the microprocessor to produce a sight-angle difference from the virtual sight angle and the preset sight angle; step 306 causes the microprocessor to produce a position difference from the preset position and the virtual position; step 307 causes the microprocessor to calculate, from the actual size and the virtual size, an elevation angle between an image-capture direction of the reality image capture device and a horizontal plane; and step 308 causes the microprocessor to calculate, from the actual size, the virtual size, and the navigation information, an off angle between the image-capture direction and a traveling direction of the navigation device.

Referring to FIG. 3C, step 309 causes the microprocessor to generate guidance information from the elevation angle, the off angle, the sight-angle difference, the position difference, and the navigation information. If the display device is a see-through head-up display device, step 310 is executed, causing the microprocessor to transmit the guidance information through the transmit/receive interface to the see-through head-up display device, so that the see-through head-up display device can display the guidance information.

If the display device is a projection head-up display device, referring to FIG. 3D, after step 309, step 311 is executed, causing the microprocessor to transmit the guidance information through the projector to the projection head-up display device, so that the projection head-up display device can display the guidance information.

In addition, referring to FIG. 3E, after step 309, step 312 is executed, causing the microprocessor to composite the guidance information onto the real-time reality image to produce a navigation image, and step 313 causes the microprocessor to transmit the navigation image through the transmit/receive interface to the display device, so that the display device can display the navigation image.

Besides the above steps, the second embodiment can also execute all the operations and functions described in the first embodiment. Those of ordinary skill in the art can directly understand how the second embodiment performs these operations and functions based on the first embodiment, so they are not described again here.

Moreover, the real-time augmented reality method described in the second embodiment may be carried out by a computer program product. When the real-time augmented reality device loads the computer program product via a computer and executes the plurality of instructions contained in the product, the real-time augmented reality method of the second embodiment can be accomplished. The aforementioned computer program product may be stored in a computer-readable recording medium, for example a read-only memory (ROM), a flash memory, a floppy disk, a hard disk, an optical disc, a USB drive, a magnetic tape, a database accessible over a network, or any other storage medium with the same function known to those skilled in the art.
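The compositing in step 312 can be sketched as a simple per-pixel blend. The sketch below uses toy greyscale grids and an invented alpha parameter; a real implementation would blend full-resolution RGBA frames, but the structure is the same.

```python
# Minimal sketch of step 312: compositing the guidance graphic onto the
# real-time reality image to form the navigation image. Images are toy
# greyscale grids (lists of rows); alpha is an assumed blend weight.

def composite(reality, overlay, alpha=0.5):
    """Blend overlay onto reality wherever the overlay is non-zero."""
    nav = []
    for r_row, o_row in zip(reality, overlay):
        nav.append([
            round((1 - alpha) * r + alpha * o) if o else r
            for r, o in zip(r_row, o_row)
        ])
    return nav

reality = [[100, 100], [100, 100]]
overlay = [[0, 200], [0, 200]]      # arrow pixels in the right column
print(composite(reality, overlay))  # → [[100, 150], [100, 150]]
```

Pixels not covered by the guidance graphic pass through unchanged, so the driver sees the reality image with the warped arrow laid over it, which matches the description of the navigation image.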
In summary, the real-time augmented reality device of the present invention can generate, from a real-time user image comprising a face object, a real-time reality image comprising a reality object, and navigation information, guidance information that matches the driver's visual recognition habits. In other words, the guidance information is produced with corrections for the driver's head offset and gaze change, so the drawback of conventional HUDs, namely that the displayed information becomes offset from the original scene behind it, is effectively overcome, increasing the overall added value of a HUD positioning and navigation system.

The above embodiments are only intended to exemplify implementations of the present invention and to explain its technical features, not to limit its scope of protection. Any change or equivalent arrangement that can easily be accomplished by those familiar with this technology belongs to the scope claimed by the present invention; the scope of protection of the invention shall be determined by the claims.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic diagram of the first embodiment of the present invention; FIG. 2 is a schematic diagram of a vehicle, carrying the real-time augmented reality navigation display system of the first embodiment, traveling on a road surface; and FIGS. 3A to 3E are flowcharts of the second embodiment of the present invention.

REFERENCE NUMERALS

1: real-time augmented reality navigation display system; 11: real-time augmented reality device; 111: transmit/receive interface; 113: storage; 1130: actual size; 1132: preset position; 1134: preset sight angle; 115: microprocessor; 114: projector; 117: guidance information; 119: navigation image; 13: reality image capture device; 130: real-time reality image; 15: navigation device; 150: navigation information; 17: user image capture device; 170: real-time user image; 19: display device; 21: vehicle; 27: position of reality image capture device
The perspective display device can display the guidance information.显咖' = The device is a projection type head-up display, please refer to the figure. After that, the step (4) is used to transmit the information to the projection head-up display device. = Guidance information. In addition, please refer to the third level. After the step period, perform step 312' to make the micro-bee::::"1 synthesized on the live-action image to generate--navigation image, most like to display 1Γ13, the microprocessor transmits the navigation image display device through the transmission 7 receiving interface, and the display device can display the navigation image. In addition to the above steps, the second embodiment can also perform the description and functions of the first embodiment. Those skilled in the art can directly understand the first; . In addition, the 'fourth implementation method described in the second implementation system can be performed by _Electricity 2, when the instant augmented reality device carries the computer 1 via the computer and executes the plurality of fingers included in the electric hard-working product == Instant augmentation of real-world methods. The aforementioned computer program 22: R: = recorded media, such as read-only memory (a y - road r take two body, floppy disk, hard disk, CD, flash drive, tape, can be net ::::: :: or familiar with the artist - any phase 15 201132936 In summary, the instant augmented reality device of the present invention can be based on an instant user image containing a human face object, including an instant real scene Image and navigation information to generate guidance information in line with the driver's visual identification habits. In other words, the guidance information should be generated by the driver's head shift and line of sight changes, so that the HUD display information will be related to the original rear. 
The disadvantages of the scene offset are effectively overcome, thereby increasing the overall added value of the head-up display (HUD) positioning navigation system. The above embodiments are only used to exemplify the embodiments of the present invention, and to explain the technical features of the present invention. It is not intended to limit the scope of the invention, and any arrangement or change that can be easily accomplished by those skilled in the art is within the scope of the invention. The scope of protection should be based on the scope of the patent application. [Simplified description of the drawings] Fig. 1 is a schematic view showing a first embodiment of the present invention; Fig. 2 is a view showing the instant augmented reality navigation of the first embodiment. A schematic diagram of a vehicle running on a road surface; and 3A to 3E are flowcharts of a second embodiment of the present invention. [Explanation of main component symbols] 1: Instantaneous augmented reality navigation display system 11: Instant expansion Augmentation device 111: transmission/reception interface 113: memory 1130: actual size 1132: preset position 115: microprocessor 1134: preset line of sight angle 201132936 114: projector 119: navigation image 130: instant live image 150: Navigation information 17: User image capturing device 19: Display device turtle 27: Position of the live image capturing device 117: Guidance information 13: Real image capturing device 15: Navigation device 130: Instant user image 21: Vehicle
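The per-frame computation of steps 305 through 309 can be sketched as follows. This is a minimal illustration only: the function name, the pinhole-camera assumption, and all field conventions are assumptions introduced for exposition, not the patent's actual implementation.

```python
import math

def guidance_info(actual_size_m, virtual_size_px, focal_px,
                  preset_gaze_deg, virtual_gaze_deg,
                  preset_pos, virtual_pos,
                  travel_heading_deg, camera_heading_deg,
                  camera_pitch_offset_px=0.0):
    """Illustrative sketch of steps 305-309; all formulas are assumptions."""
    # Step 305: line-of-sight angle difference (how far the driver's gaze
    # has drifted from the stored preset line-of-sight angle)
    gaze_diff = virtual_gaze_deg - preset_gaze_deg

    # Step 306: head-position difference of the face object in the user image
    pos_diff = (virtual_pos[0] - preset_pos[0],
                virtual_pos[1] - preset_pos[1])

    # Step 307: distance to the known landmark from the actual-vs-virtual
    # size ratio (pinhole model), and the elevation angle of the capturing
    # direction above the horizontal plane from the vertical pixel offset
    distance_m = actual_size_m * focal_px / virtual_size_px
    elevation_deg = math.degrees(math.atan2(camera_pitch_offset_px, focal_px))

    # Step 308: deviation angle between the image-capturing direction and
    # the traveling direction reported by the navigation information
    deviation_deg = camera_heading_deg - travel_heading_deg

    # Step 309: fold all corrections into one guidance-information record
    return {
        "gaze_diff_deg": gaze_diff,
        "pos_diff_px": pos_diff,
        "elevation_deg": elevation_deg,
        "deviation_deg": deviation_deg,
        "object_distance_m": distance_m,
    }
```

Under these assumptions, a landmark of known height 10 m that appears 100 px tall to a 1000 px focal-length camera is placed at 100 m, and the gaze and heading corrections reduce to simple differences that the compositor can apply when anchoring the arrow symbol.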


Claims (1)

VII. Claims:

1. A real-time augmented reality device, operable with a navigation device, a reality image capture device and a user image capture device, wherein the navigation device generates navigation information according to a current position of the navigation device, the reality image capture device captures a real-time reality image comprising a reality object, and the user image capture device captures a real-time user image comprising a face object, the real-time augmented reality device comprising:
a transmitting/receiving interface, electrically connected to the navigation device, the reality image capture device and the user image capture device, for receiving the navigation information, the real-time reality image and the real-time user image;
a storage, for storing an actual size of the reality object, a preset position of the face object and a preset line-of-sight angle; and
a microprocessor, electrically connected to the transmitting/receiving interface and the storage, for:
determining a virtual line-of-sight angle of the face object in the real-time user image;
determining a virtual position of the face object in the real-time user image;
determining a virtual size of the reality object in the real-time reality image; and
generating guidance information according to the actual size, the preset position, the preset line-of-sight angle, the virtual line-of-sight angle, the virtual position, the virtual size and the navigation information.

2. The real-time augmented reality device of claim 1, wherein the microprocessor is further configured to:
generate a line-of-sight angle difference according to the virtual line-of-sight angle and the preset line-of-sight angle;
generate a position difference according to the preset position and the virtual position; and
generate the guidance information according to the actual size, the line-of-sight angle difference, the position difference, the virtual size and the navigation information.

3. The real-time augmented reality device of claim 2, wherein the microprocessor is further configured to:
calculate an elevation angle between an image-capturing direction of the reality image capture device and a horizontal plane according to the actual size and the virtual size;
calculate a deviation angle between the image-capturing direction and a traveling direction of the navigation device according to the actual size, the virtual size and the navigation information; and
generate the guidance information according to the elevation angle, the deviation angle, the line-of-sight angle difference, the position difference and the navigation information.

4. The real-time augmented reality device of claim 1, wherein the microprocessor determines the virtual size of the reality object in the real-time reality image according to an object edge recognition method.

5. The real-time augmented reality device of claim 1, wherein the real-time augmented reality device is further operable with a see-through head-up display device, the transmitting/receiving interface is further electrically connected to the see-through head-up display device, and the microprocessor is further configured to transmit the guidance information to the see-through head-up display device through the transmitting/receiving interface, so that the see-through head-up display device can display the guidance information.

6. The real-time augmented reality device of claim 1, wherein the real-time augmented reality device is further operable with a projection-type head-up display device and further comprises a projector electrically connected to the microprocessor, and the microprocessor is further configured to project the guidance information to the projection-type head-up display device through the projector, so that the projection-type head-up display device can display the guidance information.

7. The real-time augmented reality device of claim 1, wherein the microprocessor is further configured to composite the guidance information onto the real-time reality image to generate a navigation image.

8. A real-time augmented reality method for a real-time augmented reality device, the real-time augmented reality device being operable with a navigation device, a reality image capture device and a user image capture device, wherein the navigation device generates navigation information according to a current position of the navigation device, the reality image capture device captures a real-time reality image comprising a reality object, and the user image capture device captures a real-time user image comprising a face object, the real-time augmented reality device comprising a transmitting/receiving interface, a storage and a microprocessor, the microprocessor being electrically connected to the transmitting/receiving interface and the storage, the transmitting/receiving interface being electrically connected to the navigation device, the reality image capture device and the user image capture device, and the storage storing an actual size of the reality object, a preset position of the face object and a preset line-of-sight angle, the real-time augmented reality method comprising the steps of:
(A) enabling the transmitting/receiving interface to receive the navigation information, the real-time reality image and the real-time user image;
(B) enabling the microprocessor to determine a virtual line-of-sight angle of the face object in the real-time user image;
(C) enabling the microprocessor to determine a virtual position of the face object in the real-time user image;
(D) enabling the microprocessor to determine a virtual size of the reality object in the real-time reality image; and
(E) enabling the microprocessor to generate guidance information according to the actual size, the preset position, the preset line-of-sight angle, the virtual line-of-sight angle, the virtual position, the virtual size and the navigation information.

9. The real-time augmented reality method of claim 8, wherein the step (E) comprises the steps of:
enabling the microprocessor to generate a line-of-sight angle difference according to the virtual line-of-sight angle and the preset line-of-sight angle;
enabling the microprocessor to generate a position difference according to the preset position and the virtual position; and
enabling the microprocessor to generate the guidance information according to the actual size, the line-of-sight angle difference, the position difference, the virtual size and the navigation information.

10. The real-time augmented reality method of claim 9, wherein the step (E) further comprises the steps of:
enabling the microprocessor to calculate an elevation angle between an image-capturing direction of the reality image capture device and a horizontal plane according to the actual size and the virtual size;
enabling the microprocessor to calculate a deviation angle between the image-capturing direction and a traveling direction of the navigation device according to the actual size, the virtual size and the navigation information; and
enabling the microprocessor to generate the guidance information according to the elevation angle, the deviation angle, the line-of-sight angle difference, the position difference and the navigation information.

11. The real-time augmented reality method of claim 8, wherein the step (D) is a step of enabling the microprocessor to determine the virtual size of the reality object in the real-time reality image according to an object edge recognition method.

12. The real-time augmented reality method of claim 8, wherein the real-time augmented reality device is further operable with a see-through head-up display device, the transmitting/receiving interface is further electrically connected to the see-through head-up display device, and the real-time augmented reality method further comprises the step of:
enabling the microprocessor to transmit the guidance information to the see-through head-up display device through the transmitting/receiving interface, so that the see-through head-up display device can display the guidance information.

13. The real-time augmented reality method of claim 8, wherein the real-time augmented reality device is further operable with a projection-type head-up display device and further comprises a projector electrically connected to the microprocessor, and the real-time augmented reality method further comprises the step of:
enabling the microprocessor to project the guidance information to the projection-type head-up display device through the projector, so that the projection-type head-up display device can display the guidance information.

14. The real-time augmented reality method of claim 8, further comprising the step of:
enabling the microprocessor to composite the guidance information onto the real-time reality image to generate a navigation image.

15. A computer program product, storing a program for a real-time augmented reality method of a real-time augmented reality device, the real-time augmented reality device being operable with a navigation device, a reality image capture device and a user image capture device, wherein the navigation device generates navigation information according to a current position of the navigation device, the reality image capture device captures a real-time reality image comprising a reality object, and the user image capture device captures a real-time user image comprising a face object, the real-time augmented reality device comprising a transmitting/receiving interface, a storage and a microprocessor, the microprocessor being electrically connected to the transmitting/receiving interface and the storage, the transmitting/receiving interface being electrically connected to the navigation device, the reality image capture device and the user image capture device, and the storage storing an actual size of the reality object, a preset position of the face object and a preset line-of-sight angle, the program executing the following after being loaded into the real-time augmented reality device:
a program instruction A, enabling the transmitting/receiving interface to receive the navigation information, the real-time reality image and the real-time user image;
a program instruction B, enabling the microprocessor to determine a virtual line-of-sight angle of the face object in the real-time user image;
a program instruction C, enabling the microprocessor to determine a virtual position of the face object in the real-time user image;
a program instruction D, enabling the microprocessor to determine a virtual size of the reality object in the real-time reality image; and
a program instruction E, enabling the microprocessor to generate guidance information according to the actual size, the preset position, the preset line-of-sight angle, the virtual line-of-sight angle, the virtual position, the virtual size and the navigation information.

16. The computer program product of claim 15, wherein the program instruction E comprises:
a program instruction E1, enabling the microprocessor to generate a line-of-sight angle difference according to the virtual line-of-sight angle and the preset line-of-sight angle;
a program instruction E2, enabling the microprocessor to generate a position difference according to the preset position and the virtual position; and
a program instruction E3, enabling the microprocessor to generate the guidance information according to the actual size, the line-of-sight angle difference, the position difference, the virtual size and the navigation information.

17. The computer program product of claim 16, wherein the program instruction E further comprises:
a program instruction E4, enabling the microprocessor to calculate an elevation angle between an image-capturing direction of the reality image capture device and a horizontal plane according to the actual size and the virtual size;
a program instruction E5, enabling the microprocessor to calculate a deviation angle between the image-capturing direction and a traveling direction of the navigation device according to the actual size, the virtual size and the navigation information; and
a program instruction E6, enabling the microprocessor to generate the guidance information according to the elevation angle, the deviation angle, the line-of-sight angle difference, the position difference and the navigation information.

18. The computer program product of claim 15, wherein the program instruction D is a program instruction enabling the microprocessor to determine the virtual size of the reality object in the real-time reality image according to an object edge recognition method.

19. The computer program product of claim 15, wherein the real-time augmented reality device is further operable with a see-through head-up display device, the transmitting/receiving interface is further electrically connected to the see-through head-up display device, and the program further executes the following after being loaded into the real-time augmented reality device:
a program instruction F, enabling the microprocessor to transmit the guidance information to the see-through head-up display device through the transmitting/receiving interface, so that the see-through head-up display device can display the guidance information.

20. The computer program product of claim 15, wherein the real-time augmented reality device is further operable with a projection-type head-up display device and further comprises a projector electrically connected to the microprocessor, and the program further executes the following after being loaded into the real-time augmented reality device:
a program instruction G, enabling the microprocessor to project the guidance information to the projection-type head-up display device through the projector, so that the projection-type head-up display device can display the guidance information.

21. The computer program product of claim 15, wherein the program further executes the following after being loaded into the real-time augmented reality device:
a program instruction H, enabling the microprocessor to composite the guidance information onto the real-time reality image to generate a navigation image.
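The compositing recited in claim 7 and step 312 can be illustrated with a toy sketch. Here the "image" is a 2D grid of characters and the arrow is a single symbol shifted by the head-position difference; the function name and conventions are assumptions for exposition, and a real implementation would blend pixel buffers from the reality image capture device.

```python
def composite_navigation_image(frame, anchor, pos_diff, symbol="^"):
    """Overlay `symbol` at `anchor` shifted by the head-position difference,
    so the guidance mark stays aligned with the scene the driver sees.
    Illustrative sketch only; not the patent's implementation."""
    h, w = len(frame), len(frame[0])
    # Shift the anchor by the position difference, clamped to the frame
    x = min(max(anchor[0] + pos_diff[0], 0), w - 1)
    y = min(max(anchor[1] + pos_diff[1], 0), h - 1)
    nav = [row[:] for row in frame]  # leave the source frame untouched
    nav[y][x] = symbol               # composite the guidance symbol
    return nav

# Usage: an 8x4 "reality image" with the arrow anchor shifted by (2, 1)
frame = [["." for _ in range(8)] for _ in range(4)]
nav = composite_navigation_image(frame, anchor=(3, 1), pos_diff=(2, 1))
```

The design point mirrors the claim: the source frame is not modified, and the navigation image is a separate composite that can be sent to the display device.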
TW99108329A 2010-03-22 2010-03-22 Real-time augmented reality device, real-time augmented reality method and computer program product thereof TWI408342B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW99108329A TWI408342B (en) 2010-03-22 2010-03-22 Real-time augmented reality device, real-time augmented reality method and computer program product thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW99108329A TWI408342B (en) 2010-03-22 2010-03-22 Real-time augmented reality device, real-time augmented reality method and computer program product thereof

Publications (2)

Publication Number Publication Date
TW201132936A true TW201132936A (en) 2011-10-01
TWI408342B TWI408342B (en) 2013-09-11

Family

ID=46750996

Family Applications (1)

Application Number Title Priority Date Filing Date
TW99108329A TWI408342B (en) 2010-03-22 2010-03-22 Real-time augmented reality device, real-time augmented reality method and computer program product thereof

Country Status (1)

Country Link
TW (1) TWI408342B (en)


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI657409B (en) 2017-12-27 2019-04-21 財團法人工業技術研究院 Superimposition device of virtual guiding indication and reality image and the superimposition method thereof
JP6828934B1 (en) * 2020-08-18 2021-02-10 株式会社ビーブリッジ Navigation devices, navigation systems, navigation methods, navigation programs

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7071970B2 (en) * 2003-03-10 2006-07-04 Charles Benton Video augmented orientation sensor
TW201011259A (en) * 2008-09-12 2010-03-16 Wistron Corp Method capable of generating real-time 3D map images and navigation system thereof

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI691891B (en) * 2018-09-07 2020-04-21 財團法人工業技術研究院 Method and apparatus for displaying information of multiple objects
US10755456B2 (en) 2018-09-07 2020-08-25 Industrial Technology Research Institute Method and apparatus for displaying information of multiple objects

Also Published As

Publication number Publication date
TWI408342B (en) 2013-09-11

Similar Documents

Publication Publication Date Title
JP6780642B2 (en) Information processing equipment, information processing methods and programs
EP3338136B1 (en) Augmented reality in vehicle platforms
KR102289389B1 (en) Virtual object orientation and visualization
US9934614B2 (en) Fixed size augmented reality objects
KR101699922B1 (en) Display system and method using hybrid user tracking sensor
JP6536856B2 (en) Vehicle display device
WO2015174050A1 (en) Display device and display method
KR101921969B1 (en) augmented reality head-up display apparatus and method for vehicles
US8838381B1 (en) Automatic video generation for navigation and object finding
TWI408339B (en) Real-time augmented reality device, real-time augmented reality methode and computer program product thereof
JP2015114757A (en) Information processing apparatus, information processing method, and program
US20110288763A1 (en) Method and apparatus for displaying three-dimensional route guidance
JP2009020089A (en) System, method, and program for navigation
CN109462750A (en) A kind of head-up-display system, information display method, device and medium
TWI453462B (en) Telescopic observation for virtual reality system and method thereof using intelligent electronic device
WO2016118344A1 (en) Fixed size augmented reality objects
KR20140080720A (en) Augmented Reality imaging based sightseeing guide apparatus
CN102200445B (en) Real-time augmented reality device and method thereof
US10771707B2 (en) Information processing device and information processing method
KR20110114114A (en) Real 3d navigation implementing method
KR20120007781U (en) - Route guidance method using Augmented Reality and Head-up display
JP6345381B2 (en) Augmented reality system
CN109990797A (en) A kind of control method of the augmented reality navigation display for HUD
US11410330B2 (en) Methods, devices, and systems for determining field of view and producing augmented reality
TWI799000B (en) Method, processing device, and display system for information display

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees