CN103871045A - Display system and method - Google Patents
- Publication number
- CN103871045A CN103871045A CN201310182182.4A CN201310182182A CN103871045A CN 103871045 A CN103871045 A CN 103871045A CN 201310182182 A CN201310182182 A CN 201310182182A CN 103871045 A CN103871045 A CN 103871045A
- Authority
- CN
- China
- Prior art keywords
- eye tracking
- driver
- sight line
- video camera
- stereo
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
- G06T7/593—Depth or shape recovery from multiple images from stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
- G06T2207/10012—Stereo images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Ophthalmology & Optometry (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
Abstract
A display system includes a line-of-sight tracking camera that detects an eyeball of a driver, first and second stereo cameras, a controlling unit, and a storing unit. The first and second stereo cameras photograph a range corresponding to the driver's field of view based on stereo-camera-based line-of-sight information, which is derived from the tracking-camera-based line-of-sight information, and provide the photographed images. The controlling unit converts the tracking-camera-based line-of-sight information into first- and second-stereo-camera-based line-of-sight information using pre-stored position and rotation information of the tracking camera and the first and second stereo cameras, and projects the converted information onto the first and second stereo cameras, such that a three-dimensional line-of-sight coordinate is calculated. The storing unit stores information related to the system as well as the position and rotation information of the tracking camera and the first and second stereo cameras.
Description
Technical field
The present inventive concept relates to a display system and method.
Background art
At present, various vehicle safety devices have been developed for the convenience and safety of vehicle drivers, as disclosed, for example, in Patent Document 1.

More specifically, eye-tracking technology has been provided that obtains the line of sight of a driver in a vehicle and, based on the obtained line of sight, provides services such as a real-time view of, and alerts about, the road on which the vehicle is driving.

However, such eye-tracking technology has mainly been used to detect a gaze direction in a two-dimensional (2D) environment, for example to verify advertising efficiency or to optimize an interface using a display.

Meanwhile, because a driver views a three-dimensional (3D) environment, i.e., the actual environment, with the naked eye while driving, there is a limitation in accurately detecting the gaze direction using only a line of sight detected in a 2D environment.
[correlation technique file]
[patent document]
(patent document 1) KR10-2011-0139474A
Summary of the invention
Accordingly, the present inventive concept has been made to solve the above-mentioned problems occurring in the prior art while advantages achieved by the prior art are maintained intact.

One aspect of the present inventive concept relates to a display system and method for detecting a three-dimensional line-of-sight coordinate of a driver. The display system includes an eye-tracking camera configured to detect the driver's eyeball. First and second stereo cameras are configured to photograph a range corresponding to the driver's field of view according to stereo-camera-based driver's line-of-sight information, which is derived from eye-tracking-camera-based driver's line-of-sight information, and to provide the photographed images. A controlling unit is configured to convert the eye-tracking-camera-based driver's line-of-sight information into first- and second-stereo-camera-based driver's line-of-sight information using pre-stored position and rotation information of the eye-tracking camera and the first and second stereo cameras, and to project the converted information onto the first and second stereo cameras, thereby calculating a three-dimensional coordinate of the driver's line of sight. A storing unit is configured to store information related to the display system as well as the position and rotation information of the eye-tracking camera and the first and second stereo cameras.
The eye-tracking-camera-based driver's line-of-sight information may include an eye-tracking vector and an origin of the eye-tracking vector. The controlling unit may be configured to convert the eye-tracking-camera-based origin of the eye-tracking vector into a stereo-camera-based origin of the eye-tracking vector.
The controlling unit may be configured to convert the eye-tracking-camera-based origin (position) of the eye-tracking vector into the stereo-camera-based origin of the eye-tracking vector by Formula 1 described below, a rigid-body transform composed of a translation and rotations about the x, y, and z axes.
The eye-tracking-camera-based driver's line-of-sight information may include an eye-tracking vector and an origin of the eye-tracking vector, and the controlling unit may convert the eye-tracking-camera-based eye-tracking vector into a stereo-camera-based eye-tracking vector.
The controlling unit may be configured to convert the eye-tracking-camera-based eye-tracking vector into the stereo-camera-based eye-tracking vector (rotation) by Formula 2 described below, where Θx, Θy, and Θz represent the rotation information.
The controlling unit may be configured to calculate the driver's gaze point according to the stereo-camera-based eye-tracking vectors projected onto the first and second stereo cameras, and to calculate the three-dimensional coordinate of the driver's line of sight based on the calculated gaze point.
The controlling unit may be configured to generate, on the image projected onto the first stereo camera, a window corresponding to a preset main fixation range of a person's line of sight, move the origin of the generated window along the first-stereo-camera-based eye-tracking vector, perform template matching so that the moved window corresponds to the eye-tracking vector of the second stereo camera, and identify the highest-scoring position resulting from the template matching as the driver's gaze point.
The three-dimensional coordinate of the driver's line of sight may be P(Xp, Yp, Zp), with Xp = x1·T/(x1 − xr), Yp = y1·T/(x1 − xr), and Zp = f·T/(x1 − xr), where P1(x1, y1) is the point at which P is projected onto the imaging surface of the first stereo camera, Pr(xr, yr) is the point at which P is projected onto the imaging surface of the second stereo camera, f is the focal length of the cameras, T is the distance between the first and second stereo cameras, and d is the distance of the measured coordinate point divided by the focal length of the cameras (d = Zp/f).
Another aspect of the present inventive concept encompasses a display method that provides the driver's three-dimensional line of sight in a display system including an eye-tracking camera and first and second stereo cameras. The display method includes detecting the driver's eyeball by the eye-tracking camera to identify eye-tracking-camera-based driver's line-of-sight information. The eye-tracking-camera-based driver's line-of-sight information is converted into first- and second-stereo-camera-based driver's line-of-sight information based on pre-stored position and rotation information of the eye-tracking camera and the first and second stereo cameras. The driver's gaze point may be identified according to the first- and second-stereo-camera-based driver's line-of-sight information. The driver's gaze point is converted into a three-dimensional coordinate of the driver's line of sight.
The eye-tracking-camera-based driver's line-of-sight information may include an eye-tracking vector and an origin of the eye-tracking vector. In converting the eye-tracking-camera-based driver's line-of-sight information into the first- and second-stereo-camera-based driver's line-of-sight information, the eye-tracking-camera-based origin of the eye-tracking vector may be converted into a stereo-camera-based origin of the eye-tracking vector.
In converting the eye-tracking-camera-based driver's line-of-sight information into the first- and second-stereo-camera-based driver's line-of-sight information, the eye-tracking-camera-based origin (position) of the line-of-sight vector may be converted into the stereo-camera-based origin of the eye-tracking vector by Formula 1 described below.
The eye-tracking-camera-based driver's line-of-sight information may include an eye-tracking vector and an origin of the eye-tracking vector, and in converting the eye-tracking-camera-based driver's line-of-sight information into the first- and second-stereo-camera-based driver's line-of-sight information, the eye-tracking-camera-based eye-tracking vector may be converted into a stereo-camera-based eye-tracking vector.
In converting the eye-tracking-camera-based driver's line-of-sight information into the first- and second-stereo-camera-based driver's line-of-sight information, the eye-tracking-camera-based eye-tracking vector may be converted into the stereo-camera-based eye-tracking vector (rotation) by Formula 2 described below, where Θx, Θy, and Θz represent the rotation information.
In identifying the driver's gaze point, the first- and second-stereo-camera-based driver's line-of-sight information may be projected onto the first and second stereo cameras, and the driver's gaze point may be identified based on the projected information.
In identifying the driver's gaze point based on the information projected onto the first and second stereo cameras, a window corresponding to a preset main fixation range of a person's line of sight may be generated on the image projected onto the first stereo camera. The origin of the generated window may be moved along the first-stereo-camera-based eye-tracking vector. Template matching may be performed so that the moved window corresponds to the eye-tracking vector of the second stereo camera. The highest-scoring position resulting from the template matching may be identified as the driver's gaze point.
The three-dimensional coordinate of the driver's line of sight may be P(Xp, Yp, Zp), with Xp = x1·T/(x1 − xr), Yp = y1·T/(x1 − xr), and Zp = f·T/(x1 − xr), where P1(x1, y1) is the point at which P is projected onto the imaging surface of the first stereo camera, Pr(xr, yr) is the point at which P is projected onto the imaging surface of the second stereo camera, f is the focal length of the cameras, T is the distance between the first and second stereo cameras, and d is the distance of the measured coordinate point divided by the focal length of the cameras (d = Zp/f).
Various features and advantages of the present inventive concept will become more apparent from the following description with reference to the accompanying drawings.
The terms and words used in this specification and the claims should not be construed as being limited to their typical or dictionary definitions, but should be construed as having meanings and concepts consistent with the technical scope of the present inventive concept, based on the principle that an inventor may appropriately define the concepts of terms in order to best describe the best mode he or she knows for carrying out the inventive concept.
Brief description of the drawings
The above and other features of the present inventive concept will become more apparent from the following more particular description of embodiments of the inventive concept, as illustrated in the accompanying drawings, in which like reference characters may refer to the same or similar parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the embodiments of the inventive concept.
Fig. 1 is a view illustrating a configuration of a display system according to an exemplary embodiment of the present inventive concept.
Fig. 2 is a flowchart for describing a display method according to an exemplary embodiment of the present inventive concept.
Figs. 3 to 6 are views illustrating embodiments of the display method according to exemplary embodiments of the present inventive concept.
Reference numerals of elements in the drawings
100: display system
110: eye tracking video camera
120: the first stereo cameras
130: the second stereo cameras
140: control module
150: storage unit
Embodiment
The above and other objects, features and advantages of the present inventive concept will be more clearly understood from the following embodiments taken in conjunction with the accompanying drawings. In the specification, it should be noted that, in adding reference numerals to components throughout the drawings, like reference numerals denote like components even when the components are shown in different drawings. Further, when it is determined that a detailed description of the related prior art may obscure the gist of the present inventive concept, such a detailed description will be omitted. Hereinafter, exemplary embodiments of the present inventive concept will be described in detail with reference to the accompanying drawings.
Fig. 1 is a view illustrating a configuration of a display system according to an exemplary embodiment of the present inventive concept. The display system will be described with reference to Figs. 3 to 6, which illustrate embodiments of the display method.
As shown in Fig. 1, the display system 100 may be configured to include an eye-tracking camera 110, a first stereo camera 120, a second stereo camera 130, a controlling unit 140, and a storage unit 150.
More specifically, the eye-tracking camera 110 may detect the eyeball of a driver in a vehicle.
As shown in Fig. 3, the eye-tracking camera 110 may be installed at a position in the vehicle from which it can detect the driver's face, so as to detect the driver's eyeball.
The first and second stereo cameras 120 and 130 may photograph a range corresponding to the driver's field of view according to stereo-camera-based driver's line-of-sight information and provide the photographed images. The stereo-camera-based driver's line-of-sight information may be derived from the eye-tracking-camera-based driver's line-of-sight information.
As shown in Fig. 3, the first and second stereo cameras 120 and 130 may be arranged so as to be spaced apart from each other and to correspond to each other on either side of the driver.
In addition, the first and second stereo cameras 120 and 130 may share intrinsic parameters (focal length, principal point, skew, and distortion) and/or extrinsic parameters (rotation and translation) with each other, so that a single position can be restored as a three-dimensional position.
Here, the eye-tracking-camera-based driver's line-of-sight information may include an origin E-1 of the eye-tracking vector (see Fig. 3) and an eye-tracking vector E-2 (see Fig. 3).
More specifically, the controlling unit 140 may convert the eye-tracking-camera-based origin (position) of the eye-tracking vector into the stereo-camera-based origin of the eye-tracking vector by Formula 1 below.

[Formula 1]

Ps = T(Tx, Ty, Tz) · Rx(Θx) · Ry(Θy) · Rz(Θz) · Pe

In Formula 1, Pe is the eye-tracking-camera-based origin and Ps is the stereo-camera-based origin, both in homogeneous coordinates; Tx, Ty, and Tz represent the positional information, and Θx, Θy, and Θz represent the rotation information.

In addition, in Formula 1, the first to fourth matrix groups represent the translation, the x-axis rotation, the y-axis rotation, and the z-axis rotation, respectively.
In addition, the controlling unit 140 may convert the eye-tracking-camera-based eye-tracking vector into the stereo-camera-based eye-tracking vector (rotation) by Formula 2 below.

[Formula 2]

vs = Rx(Θx) · Ry(Θy) · Rz(Θz) · ve

In Formula 2, ve is the eye-tracking-camera-based eye-tracking vector and vs is the stereo-camera-based eye-tracking vector; Θx, Θy, and Θz represent the rotation information.

In addition, in Formula 2, the first to third matrix groups represent the x-axis rotation, the y-axis rotation, and the z-axis rotation, respectively.
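The conversions of Formulas 1 and 2 can be sketched in code as follows. This is a minimal illustration assuming an x-y-z rotation order; the function and variable names (eye_to_stereo, t, theta) are illustrative, not taken from the patent.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def eye_to_stereo(origin, vector, t, theta):
    """Convert a gaze origin and gaze vector from eye-tracking-camera
    coordinates into stereo-camera coordinates.
    origin, vector: 3-vectors; t: translation (Tx, Ty, Tz);
    theta: rotations (Θx, Θy, Θz) in radians."""
    R = rot_x(theta[0]) @ rot_y(theta[1]) @ rot_z(theta[2])
    origin_s = R @ origin + t   # Formula 1: rotate, then translate the origin
    vector_s = R @ vector       # Formula 2: the direction only rotates
    return origin_s, vector_s
```

With zero rotation the origin is simply shifted by t, and with zero translation a pure z-axis rotation of 90° maps the x unit vector onto the y unit vector, as expected.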
In addition, the controlling unit 140 may calculate the driver's gaze point according to the stereo-camera-based eye-tracking vectors projected onto the first and second stereo cameras 120 and 130, and calculate the three-dimensional coordinate of the driver's line of sight based on the calculated gaze point.
More specifically, the controlling unit 140 may generate, on the image projected onto the first stereo camera 120, a window corresponding to a preset main fixation range of a person's line of sight (for example, approximately ±3 to ±5 degrees), and move the origin of the generated window along the first-stereo-camera-based eye-tracking vector.
That is, the image region most similar with respect to the stereo-camera-based eye-tracking vectors projected onto the first and second stereo cameras 120 and 130 may be identified as the focus of the line of sight.
Although the case in which the stereo-camera-based eye-tracking vector is projected onto the first stereo camera 120 has been described above, the present inventive concept is not limited thereto. That is, the stereo-camera-based eye-tracking vector may also be projected onto the second stereo camera 130.
For example, Fig. 4A illustrates the stereo-camera-based eye-tracking vector projected onto the first stereo camera 120, and Fig. 4B illustrates the stereo-camera-based eye-tracking vector projected onto the second stereo camera 130.
Here, the controlling unit 140 may move the origin of the window projected onto the first stereo camera 120 from position ① through position ② to position ③ along the first-stereo-camera-based eye-tracking vector.
In addition, the controlling unit 140 may perform template matching so that the moved window corresponds to the eye-tracking vector of the second stereo camera 130, and identify the highest-scoring position resulting from the template matching as the driver's gaze point.
Here, template matching refers to a figure-recognition process of extracting from an image a given figure consistent with a template, by finding the maximum peak point in a cross-correlation scheme.
For example, as shown in Fig. 4B, the controlling unit 140 may perform template matching so that the moved window corresponds to the eye-tracking vector of the second stereo camera 130.
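The template-matching step described above — sliding the window over the second image, scoring by cross-correlation, and taking the highest peak — can be sketched as follows. This brute-force normalized-cross-correlation search is an illustrative stand-in, and match_template is a hypothetical name, not the patented implementation.

```python
import numpy as np

def match_template(image, template):
    """Exhaustive normalized cross-correlation: slide the template over
    every window position and return the (row, col) of the highest peak,
    i.e. the best-matching window origin, together with its score."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            w = w - w.mean()
            denom = np.sqrt((w ** 2).sum()) * tnorm
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```

A window cut from the image itself is recovered at its original position with a score of (nearly) 1.0; in practice a vectorized routine such as scikit-image's match_template would replace the double loop.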
The controlling unit 140 may then calculate the three-dimensional coordinate of the driver's line of sight, P(Xp, Yp, Zp).

Referring to Fig. 5, the three-dimensional coordinate of the driver's line of sight may be P(Xp, Yp, Zp), with Xp = x1·T/(x1 − xr), Yp = y1·T/(x1 − xr), and Zp = f·T/(x1 − xr).

Here, P1(x1, y1) may be the point at which P is projected onto the imaging surface of the first stereo camera, and Pr(xr, yr) may be the point at which P is projected onto the imaging surface of the second stereo camera. f may be the focal length of the cameras, T may be the distance between the first and second stereo cameras, and d may be the distance of the measured coordinate point divided by the focal length of the cameras (d = Zp/f).
In addition, in Fig. 5, Ileft denotes the imaging surface of the first (left) stereo camera 120 and Iright denotes the imaging surface of the second (right) stereo camera 130. Cl and Cr denote the image origins of the first and second stereo cameras, and Ol and Or denote the focal points of the first and second stereo cameras, respectively.
Meanwhile, the three-dimensional coordinate of the driver's line of sight may be used as a user interface of the vehicle.

For example, as shown in Fig. 6, the three-dimensional coordinate of the driver's line of sight may be used to turn a head-up display (HUD) service on or off when the gaze focus falls within the HUD region, and a forward-driving service may change its alarm intensity according to the line-of-sight distance when a vehicle ahead brakes suddenly.
Here, the position information and rotation information of the eye-tracking camera 110 and the first and second stereo cameras 120 and 130 may be identified and stored in advance by physical measurement or from information such as software-based camera calibration.
Fig. 2 is a flowchart for describing a display method according to an exemplary embodiment of the present inventive concept.
First, the display system 100 may detect the driver's eyeball by the eye-tracking camera 110 to identify the eye-tracking-camera-based driver's line-of-sight information (S101).
Here, the eye-tracking-camera-based driver's line-of-sight information may include an eye-tracking vector and an origin of the eye-tracking vector.
Then, based on the pre-stored position information and rotation information of the eye-tracking camera 110 and the first and second stereo cameras 120 and 130, the display system may convert the eye-tracking-camera-based driver's line-of-sight information into first- and second-stereo-camera-based driver's line-of-sight information (S103).
In this case, the display system 100 may convert the eye-tracking-camera-based origin (position) of the eye-tracking vector into the stereo-camera-based origin of the eye-tracking vector by Formula 1 above.
In addition, the display system 100 may convert the eye-tracking-camera-based eye-tracking vector into the stereo-camera-based eye-tracking vector (rotation) by Formula 2 above.
Then, the display system 100 may identify the driver's gaze point according to the first- and second-stereo-camera-based driver's line-of-sight information (S105).
More specifically, the display system 100 may project the first- and second-stereo-camera-based driver's line-of-sight information onto the first and second stereo cameras 120 and 130.
Then, the display system 100 may identify the driver's gaze point based on the information projected onto the first and second stereo cameras 120 and 130.
This may be accomplished by the steps of generating, on the image projected onto the first stereo camera 120, a window corresponding to a preset main fixation range of a person's line of sight; moving the origin of the generated window along the first-stereo-camera-based eye-tracking vector; performing template matching so that the moved window corresponds to the eye-tracking vector of the second stereo camera 130; and identifying the highest-scoring position resulting from the template matching as the driver's gaze point.
For example, Fig. 4A illustrates the stereo-camera-based eye-tracking vector projected onto the first stereo camera 120, and Fig. 4B illustrates the stereo-camera-based eye-tracking vector projected onto the second stereo camera 130.
Here, the display system 100 may move the origin of the window projected onto the first stereo camera 120 from position ① through position ② to position ③ along the first-stereo-camera-based eye-tracking vector.
In addition, as shown in Fig. 4B, the display system 100 may perform template matching so that the moved window corresponds to the eye-tracking vector of the second stereo camera 130.
Then, the display system 100 may convert the driver's gaze point into the three-dimensional coordinate of the driver's line of sight (S107).
Here, P1(x1, y1) may be the point at which P is projected onto the imaging surface of the first stereo camera, and Pr(xr, yr) may be the point at which P is projected onto the imaging surface of the second stereo camera. f may be the focal length of the cameras, T may be the distance between the first and second stereo cameras, and d may be the distance of the measured coordinate point divided by the focal length of the cameras.
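The flow of steps S101 to S107 can be strung together as a short sketch. The callback structure, function names, and calibration dictionary here are all assumptions for illustration, not taken from the patent.

```python
import numpy as np

def gaze_pipeline(eye_origin, eye_vector, calib, detect_gaze_point, triangulate):
    """Illustrative end-to-end flow of S101-S107.
    eye_origin / eye_vector (S101) come from the eye-tracking camera."""
    # S103: convert into stereo-camera coordinates (Formulas 1 and 2).
    R, t = calib["rotation"], calib["translation"]
    origin_s = R @ eye_origin + t
    vector_s = R @ eye_vector
    # S105: find the gaze point on both stereo images; the template-matching
    # search is delegated to the supplied detect_gaze_point callback.
    (x_l, y_l), (x_r, _y_r) = detect_gaze_point(origin_s, vector_s)
    # S107: triangulate the matched image points into P(Xp, Yp, Zp).
    return triangulate(x_l, y_l, x_r, calib["f"], calib["T"])
```

With an identity calibration and stub callbacks the pipeline reduces to plain stereo triangulation, which makes it easy to test in isolation.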
With the display system and method according to exemplary embodiments of the present inventive concept, since two-dimensional driver's line-of-sight information is converted into three-dimensional driver's line-of-sight information, the driver's gaze direction can be detected with more accurate three-dimensional depth than in the related art.
In addition, with the display system and method according to exemplary embodiments of the present inventive concept, since the driver's three-dimensional gaze focus can be identified, objects in the external environment can be determined easily and accurately, and the information can be widely utilized, for example for identifying the driver's intention.
Although exemplary embodiments of the present inventive concept have been disclosed for illustrative purposes, it should be understood that the inventive concept is not limited thereto, and those skilled in the art will appreciate that various modifications, additions and substitutions are possible without departing from the scope and spirit of the inventive concept.

Accordingly, any and all modifications, variations or equivalent arrangements should be considered to be within the scope of the inventive concept, and the detailed scope of the inventive concept will be disclosed by the appended claims.
Claims (16)
1. A display system, comprising:
an eye-tracking camera configured to detect a driver's eyeball;
first and second stereo cameras configured to photograph a range corresponding to the driver's field of view according to stereo-camera-based driver's line-of-sight information and to provide the photographed images, wherein the stereo-camera-based driver's line-of-sight information is derived from eye-tracking-camera-based driver's line-of-sight information;
a controlling unit configured to convert the eye-tracking-camera-based driver's line-of-sight information into first- and second-stereo-camera-based driver's line-of-sight information based on pre-stored position information and rotation information of the eye-tracking camera and the first and second stereo cameras, and to project the converted information onto the first and second stereo cameras, thereby calculating a three-dimensional coordinate of the driver's line of sight; and
a storing unit configured to store information related to the display system as well as the position information and rotation information of the eye-tracking camera and the first and second stereo cameras.
2. The display system according to claim 1, wherein:
the eye-tracking-camera-based driver's line-of-sight information comprises an eye-tracking vector and an origin of the eye-tracking vector, and
the controlling unit is configured to convert the eye-tracking-camera-based origin of the eye-tracking vector into a stereo-camera-based origin of the eye-tracking vector.
3. The display system according to claim 2, wherein the controlling unit is configured to convert the eye-tracking-camera-based origin (position) of the eye-tracking vector into the stereo-camera-based origin of the eye-tracking vector by Formula 1 described above.
4. The display system according to claim 1, wherein:
the eye-tracking-camera-based driver's line-of-sight information comprises an eye-tracking vector and an origin of the eye-tracking vector, and
the controlling unit is configured to convert the eye-tracking-camera-based eye-tracking vector into a stereo-camera-based eye-tracking vector.
5. The display system according to claim 4, wherein the controlling unit is configured to convert the eye-tracking-camera-based eye-tracking vector into the stereo-camera-based eye-tracking vector (rotation) by Formula 2 described above, wherein Θx, Θy, and Θz represent the rotation information.
6. The display system according to claim 1, wherein the controlling unit is configured to calculate the driver's gaze point according to the stereo-camera-based eye-tracking vectors projected onto the first and second stereo cameras, and to calculate the three-dimensional coordinate of the driver's line of sight based on the calculated gaze point.
7. The display system according to claim 6, wherein the control unit is configured to generate, on the image projected onto the first stereo camera, a window corresponding to a preset main fixation range of the human line of sight, to move the generated window to the monocular point according to the eye tracking vector based on the first stereo camera, to perform template matching so that the moved window corresponds to the eye tracking vector of the second stereo camera, and to identify the highest-matching position resulting from the template matching as the driver's gaze point.
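The window-and-template-matching step of claim 7 can be illustrated with a plain sum-of-squared-differences search along the same image row (i.e. assuming a rectified stereo pair); the patent does not specify the similarity measure, and all names here are illustrative:

```python
import numpy as np

def match_gaze_window(left_img, right_img, gaze_xy, win=8):
    """Cut a window around the gaze point projected into the first (left)
    stereo image and slide it along the same row of the second (right)
    image; the best-matching position (lowest SSD here) is taken as the
    driver's gaze point in the right image. Assumes the window lies
    entirely inside both images."""
    x, y = gaze_xy
    tmpl = left_img[y - win:y + win + 1, x - win:x + win + 1].astype(float)
    best_x, best_score = None, np.inf
    for cx in range(win, right_img.shape[1] - win):
        cand = right_img[y - win:y + win + 1, cx - win:cx + win + 1].astype(float)
        score = np.sum((cand - tmpl) ** 2)
        if score < best_score:
            best_x, best_score = cx, score
    return best_x, y

# Example: the right image is the left image shifted 5 px to the left, so a
# point at x=30 in the left image should match at x=25 in the right image.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, size=(40, 60)).astype(np.uint8)
right = np.roll(left, -5, axis=1)
matched = match_gaze_window(left, right, gaze_xy=(30, 20))
```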
8. The display system according to claim 6, wherein the three-dimensional coordinates of the driver's line of sight are P(Xp, Yp, Zp),
where P1(x1, y1) is the point at which P is projected onto the imaging surface of the first stereo camera, Pr(xr, yr) is the point at which P is projected onto the imaging surface of the second stereo camera, f is the focal length of the cameras, T is the distance between the first and second stereo cameras, and d is the distance to the measured coordinate point divided by the focal length of the cameras.
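The coordinate formulas referenced by claim 8 are image-only in this text, but with the quantities the claim defines (matched points P1 and Pr, focal length f, baseline T), the textbook rectified-stereo triangulation is the natural reading. A sketch under that assumption (not the patent's exact formula):

```python
def triangulate_gaze(x_l, y_l, x_r, f, T):
    """Recover the 3D gaze coordinate P = (Xp, Yp, Zp) from a matched point
    pair on a rectified stereo pair: f is the focal length in pixels,
    T the baseline between the two cameras, and (x_l - x_r) the disparity."""
    d = x_l - x_r          # disparity in pixels
    Zp = f * T / d         # depth from similar triangles
    Xp = x_l * Zp / f      # back-project the left-image point
    Yp = y_l * Zp / f
    return Xp, Yp, Zp

# Example: f = 500 px, baseline T = 0.12 m, disparity 60 px -> depth 1.0 m.
Xp, Yp, Zp = triangulate_gaze(x_l=100.0, y_l=50.0, x_r=40.0, f=500.0, T=0.12)
```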
9. A display method for providing a three-dimensional driver's line of sight in a display system comprising an eye tracking camera and first and second stereo cameras, the display method comprising:
detecting the driver's eyeball with the eye tracking camera to identify driver's line-of-sight information based on the eye tracking camera;
converting the driver's line-of-sight information based on the eye tracking camera into driver's line-of-sight information based on the first and second stereo cameras, using prestored positional information and rotation information of the eye tracking camera and the first and second stereo cameras;
identifying a driver's gaze point from the line-of-sight information based on the first and second stereo cameras; and
converting the driver's gaze point into three-dimensional coordinates of the driver's line of sight.
10. The display method according to claim 9, wherein:
the driver's line-of-sight information based on the eye tracking camera comprises an eye tracking vector and a monocular point of the eye tracking vector, and
converting the line-of-sight information based on the eye tracking camera into the line-of-sight information based on the first and second stereo cameras comprises converting the monocular point of the eye tracking vector based on the eye tracking camera into a monocular point of an eye tracking vector based on the stereo cameras.
11. The display method according to claim 10, wherein converting the line-of-sight information based on the eye tracking camera into the line-of-sight information based on the first and second stereo cameras comprises converting the monocular point (position) of the eye tracking vector based on the eye tracking camera into the monocular point of the eye tracking vector based on the stereo cameras by Formula 1 below.
12. The display method according to claim 9, wherein:
the driver's line-of-sight information based on the eye tracking camera comprises an eye tracking vector and a monocular point of the eye tracking vector, and
converting the line-of-sight information based on the eye tracking camera into the line-of-sight information based on the first and second stereo cameras comprises converting the eye tracking vector based on the eye tracking camera into an eye tracking vector based on the stereo cameras.
13. The display method according to claim 12, wherein converting the line-of-sight information based on the eye tracking camera into the line-of-sight information based on the first and second stereo cameras comprises converting the eye tracking vector based on the eye tracking camera into the eye tracking vector (rotation) based on the stereo cameras by Formula 2 below,
where Θx, Θy, and Θz represent the rotation information.
14. The display method according to claim 9, wherein identifying the driver's gaze point comprises:
projecting the driver's line-of-sight information based on the first and second stereo cameras onto the first and second stereo cameras; and
identifying the driver's gaze point based on the information projected onto the first and second stereo cameras.
15. The display method according to claim 14, wherein identifying the driver's gaze point based on the information projected onto the first and second stereo cameras comprises:
generating, on the image projected onto the first stereo camera, a window corresponding to a preset main fixation range of the human line of sight;
moving the generated window to the monocular point according to the eye tracking vector based on the first stereo camera;
performing template matching so that the moved window corresponds to the eye tracking vector of the second stereo camera; and
identifying the highest-matching position resulting from the template matching as the driver's gaze point.
16. The display method according to claim 9, wherein the three-dimensional coordinates of the driver's line of sight are P(Xp, Yp, Zp),
where P1(x1, y1) is the point at which P is projected onto the imaging surface of the first stereo camera, Pr(xr, yr) is the point at which P is projected onto the imaging surface of the second stereo camera, f is the focal length of the cameras, T is the distance between the first and second stereo cameras, and d is the distance to the measured coordinate point divided by the focal length of the cameras.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0143921 | 2012-12-11 | ||
KR1020120143921A KR101382772B1 (en) | 2012-12-11 | 2012-12-11 | Display system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103871045A true CN103871045A (en) | 2014-06-18 |
CN103871045B CN103871045B (en) | 2018-09-04 |
Family
ID=50656975
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201310182182.4A Active CN103871045B (en) | 2012-12-11 | 2013-05-16 | Display system and method |
Country Status (3)
Country | Link |
---|---|
US (1) | US9160929B2 (en) |
KR (1) | KR101382772B1 (en) |
CN (1) | CN103871045B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105700677A (en) * | 2015-12-29 | 2016-06-22 | 努比亚技术有限公司 | Mobile terminal and control method thereof |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014207397A1 (en) * | 2014-04-17 | 2015-10-22 | Bayerische Motoren Werke Aktiengesellschaft | Method, device, vehicle operating system, computer program for operating a vehicle and computer program product |
KR101892135B1 (en) * | 2016-06-30 | 2018-08-29 | 전자부품연구원 | Covering object transparent display device, method and system according to gaze direction |
US10007854B2 (en) * | 2016-07-07 | 2018-06-26 | Ants Technology (Hk) Limited | Computer vision based driver assistance devices, systems, methods and associated computer executable code |
US9874934B1 (en) | 2016-07-29 | 2018-01-23 | International Business Machines Corporation | System, method, and recording medium for tracking gaze with respect to a moving plane with a camera with respect to the moving plane |
EP3305176A1 (en) * | 2016-10-04 | 2018-04-11 | Essilor International | Method for determining a geometrical parameter of an eye of a subject |
KR20190013224A (en) * | 2017-08-01 | 2019-02-11 | 엘지전자 주식회사 | Mobile terminal |
US11620419B2 (en) | 2018-01-24 | 2023-04-04 | Toyota Research Institute, Inc. | Systems and methods for identifying human-based perception techniques |
JP7115276B2 (en) * | 2018-12-10 | 2022-08-09 | トヨタ自動車株式会社 | Driving support device, wearable device, driving support system, driving support method and program |
JP7223303B2 (en) * | 2019-03-14 | 2023-02-16 | 日本電気株式会社 | Information processing device, information processing system, information processing method and program |
PH12019050076A1 (en) * | 2019-05-06 | 2020-12-02 | Samsung Electronics Co Ltd | Enhancing device geolocation using 3d map data |
CN111522443B (en) * | 2020-04-13 | 2024-04-05 | 京东方科技集团股份有限公司 | Display method, system, equipment and storage medium of vehicle A column display assembly |
KR102426071B1 (en) * | 2021-07-21 | 2022-07-27 | 정희석 | Artificial intelligence based gaze recognition device and method, and unmanned information terminal using the same |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004259043A (en) * | 2003-02-26 | 2004-09-16 | Toyota Motor Corp | Direction detection device and direction detection method |
CN1659418A (en) * | 2002-04-22 | 2005-08-24 | 松下电器产业株式会社 | Camera corrector |
CN201307266Y (en) * | 2008-06-25 | 2009-09-09 | 韩旭 | Binocular sightline tracking device |
CN101549648A (en) * | 2008-03-31 | 2009-10-07 | 现代自动车株式会社 | Alarm system for alerting driver to presence of objects |
JP4397719B2 (en) * | 2004-03-30 | 2010-01-13 | 本田技研工業株式会社 | Gaze detection device |
CN102169364A (en) * | 2010-02-26 | 2011-08-31 | 原相科技股份有限公司 | Interaction module applied to stereoscopic interaction system and method of interaction module |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4032994B2 (en) | 2003-02-26 | 2008-01-16 | トヨタ自動車株式会社 | Gaze direction detection device and gaze direction detection method |
US6989754B2 (en) * | 2003-06-02 | 2006-01-24 | Delphi Technologies, Inc. | Target awareness determination system and method |
KR101122513B1 (en) * | 2005-04-04 | 2012-03-15 | (주)아이리스아이디 | Assuming system of eyeball position using 3-dimension position information and assuming method of eyeball position |
JP2008136789A (en) * | 2006-12-05 | 2008-06-19 | Nec Corp | Eyeball parameter estimating instrument and method |
WO2008097933A1 (en) * | 2007-02-04 | 2008-08-14 | Miralex Systems Incorporated | Systems and methods for gaze tracking using multiple images |
KR100941236B1 (en) | 2008-03-31 | 2010-02-10 | 현대자동차주식회사 | Driver distration warning system |
US8629784B2 (en) * | 2009-04-02 | 2014-01-14 | GM Global Technology Operations LLC | Peripheral salient feature enhancement on full-windshield head-up display |
JP2010281685A (en) | 2009-06-04 | 2010-12-16 | National Univ Corp Shizuoka Univ | System and method for measurement of position |
US8384534B2 (en) * | 2010-01-14 | 2013-02-26 | Toyota Motor Engineering & Manufacturing North America, Inc. | Combining driver and environment sensing for vehicular safety systems |
US20110310001A1 (en) * | 2010-06-16 | 2011-12-22 | Visteon Global Technologies, Inc | Display reconfiguration based on face/eye tracking |
KR101628394B1 (en) | 2010-11-22 | 2016-06-08 | 현대자동차주식회사 | Method for tracking distance of eyes of driver |
KR20120062541A (en) | 2010-12-06 | 2012-06-14 | 현대자동차주식회사 | Display system based on gaze direction vector |
KR101544524B1 (en) * | 2010-12-16 | 2015-08-17 | 한국전자통신연구원 | Display system for augmented reality in vehicle, and method for the same |
JP5706715B2 (en) | 2011-02-25 | 2015-04-22 | 株式会社ジオ技術研究所 | 3D map display system |
- 2012-12-11: KR KR1020120143921A, patent KR101382772B1/en, active (IP Right Grant)
- 2013-04-26: US US13/871,678, patent US9160929B2/en, active
- 2013-05-16: CN CN201310182182.4A, patent CN103871045B/en, active
Also Published As
Publication number | Publication date |
---|---|
KR101382772B1 (en) | 2014-04-08 |
US20140160249A1 (en) | 2014-06-12 |
US9160929B2 (en) | 2015-10-13 |
CN103871045B (en) | 2018-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103871045A (en) | Display system and method | |
US20210058608A1 (en) | Method and apparatus for generating three-dimensional (3d) road model | |
US9448758B2 (en) | Projecting airplane location specific maintenance history using optical reference points | |
JP6168833B2 (en) | Multimode data image registration using 3DGeoArc | |
JP6008397B2 (en) | AR system using optical see-through HMD | |
CN103557796B (en) | 3 D positioning system and localization method based on laser ranging and computer vision | |
JP2018519696A (en) | Estimating camera external parameters from image lines | |
US20180293450A1 (en) | Object detection apparatus | |
CN107122770B (en) | Multi-camera system, intelligent driving system, automobile, method and storage medium | |
WO2015146068A1 (en) | Information display device, information display method, and program | |
KR20140122126A (en) | Device and method for implementing augmented reality using transparent display | |
US10634918B2 (en) | Internal edge verification | |
US20170278269A1 (en) | Corneal reflection position estimation system, corneal reflection position estimation method, corneal reflection position estimation program, pupil detection system, pupil detection method, pupil detection program, gaze detection system, gaze detection method, gaze detection program, face orientation detection system, face orientation detection method, and face orientation detection program | |
CN107977082A (en) | A kind of method and system for being used to AR information be presented | |
JP2020057387A (en) | Vehicle positioning method, vehicle positioning device, electronic apparatus, and computer-readable storage medium | |
KR20150125862A (en) | Apparatus for augmented reality using 3d reconstruction and mehhod thereof | |
EP2926317B1 (en) | System and method for detecting pedestrians using a single normal camera | |
Kim et al. | External vehicle positioning system using multiple fish-eye surveillance cameras for indoor parking lots | |
US20130033597A1 (en) | Camera system and method for recognizing distance using the same | |
KR20130052400A (en) | Simulator for stereo vision system of intelligent vehicle and camera calibration method using the same | |
JP5727969B2 (en) | Position estimation apparatus, method, and program | |
JP2015041187A (en) | Traffic measurement device, and traffic measurement method | |
Li et al. | A combined vision-inertial fusion approach for 6-DoF object pose estimation | |
KR101316387B1 (en) | Method of object recognition using vision sensing and distance sensing | |
JP2017016460A (en) | Traffic flow measurement device and traffic flow measurement method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||