CN101931823A - Method and equipment for displaying 3D image - Google Patents

Method and equipment for displaying 3D image

Info

Publication number
CN101931823A
Authority
CN
China
Prior art keywords
viewpoint
face
viewer
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200910146364XA
Other languages
Chinese (zh)
Inventor
孔晓东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Priority to CN200910146364XA priority Critical patent/CN101931823A/en
Publication of CN101931823A publication Critical patent/CN101931823A/en
Pending legal-status Critical Current

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a method and equipment for displaying a 3D image on a 2D display device. The method comprises the following steps: capturing video of the viewer; analyzing the video to obtain the viewer's viewpoint; and determining the 2D projection image of the 3D image at that viewpoint, wherein, as the viewer's viewpoint changes, the 2D projection image associated with the corresponding viewpoint is displayed. The invention can therefore display a 3D image on a 2D display according to the results of detecting and tracking the viewer's face. With this scheme, the user can view the 3D image directly on the 2D display with the naked eye.

Description

Method and apparatus for displaying a 3D image
Technical field
The present invention relates to a method and apparatus for displaying a 3D image, capable of displaying a 3D image on a 2D display according to the results of detecting and tracking the viewer's face.
Background art
Various methods of displaying 3D images already exist, for example parallax-based 3D display methods. Patent document 1 (US5287437) discloses such a display method and apparatus. According to patent document 1, a complex 3D stereoscopic image is produced from a pair of images selected from a precomputed set of images in response to the viewer's real-time head movement. The precomputed images correspond to a predetermined set of possible viewpoints, and a precomputed image can be read out according to a prediction of the viewer's head movement.
The method disclosed in patent document 1 comprises the steps of: (a) generating precomputed images corresponding to a predetermined set of viewpoints of a virtual object, thereby defining the predetermined viewpoints and the virtual object in a display-panel coordinate space registered to the display screen of the display device; (b) determining a time index at which a precomputed image is to be displayed stereoscopically on the display device; (c) predicting, for the time index, an eye nodal point for each of the viewer's eyes, thereby locating the eye nodal points in the display-panel coordinate space; (d) determining the predetermined viewpoint in the display-panel coordinate space closest to the eye nodal points; (e) reading the precomputed image corresponding to that closest predetermined viewpoint and transferring it to a frame buffer; and (f) displaying the precomputed image in the frame buffer stereoscopically on the display device.
Patent document 1 suffers from inconvenient operation. The viewer must wear stereoscopic shutter glasses 120 and operate a 3D six-axis mouse 130. The 3D positions of the 3D mouse 130 and the shutter glasses 120 relative to the display surface of the stereoscopic CRT 24 are sensed by a six-axis head tracker 140 mounted on the stereoscopic CRT 24, and are triangulated from the ultrasonic transit times between the ultrasonic receivers 122 and 132 and the ultrasonic transmitters 142. The shutter glasses are therefore indispensable to that method, which is very inconvenient for viewing the 3D image.
Summary of the invention
The object of the invention is to propose a 3D image display method and apparatus that, as the viewer's viewpoint changes, display on a 2D screen the 2D projection image corresponding to the current viewpoint.
In one aspect of the invention, a method for displaying a 3D image on a 2D display device is proposed, comprising the steps of: capturing video of the viewer; analyzing the video to obtain the viewer's viewpoint; and determining the 2D projection image of the 3D image at that viewpoint; wherein, as the viewer's viewpoint changes, the 2D projection image associated with the corresponding viewpoint is displayed.
In another aspect of the invention, an apparatus for displaying a 3D image on a 2D display device is proposed, comprising: a video capture device for capturing video of the viewer; an analysis unit that analyzes the video to obtain the viewer's viewpoint; an updating unit that determines the 2D projection image of the 3D image at that viewpoint; and a display device that, as the viewer's viewpoint changes, displays the 2D projection image associated with the corresponding viewpoint.
The scheme of the embodiments of the invention solves the inconvenience of the prior art and allows the user to view the 3D image directly on a 2D display with the naked eye.
In addition, eye detection and tracking may replace face detection and tracking. Moreover, with eye detection and tracking, different 2D projection images can be calculated for the left eye and the right eye, so the method can also be used for stereoscopic display.
In addition, the viewpoint is controlled by face detection and tracking and can be updated in real time in 3D space, which satisfies the requirement for highly smooth viewpoint control.
In addition, the depth of the viewpoint is controlled according to the size of the viewer's face. By setting a reference size of the viewer's face together with a corresponding reference distance from the screen, the depth of the viewpoint can be calculated approximately.
In addition, other methods, such as laser ranging, ultrasonic ranging and infrared ranging, can be used to achieve more accurate distance measurement.
Brief description of the drawings
The above features and advantages of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic diagram of a 3D image display apparatus based on face detection and tracking according to an embodiment of the invention;
Fig. 2 is a structural block diagram of the 3D image display apparatus according to an embodiment of the invention;
Fig. 3 is a flowchart of the 3D image display method according to an embodiment of the invention;
Fig. 4 shows the face detection process according to an embodiment of the invention;
Fig. 5 shows the face tracking process according to an embodiment of the invention;
Fig. 6 shows the viewpoint update process according to an embodiment of the invention;
Fig. 7 is a schematic diagram of face detection by the method and apparatus according to an embodiment of the invention;
Fig. 8 is a schematic diagram of face tracking by the method and apparatus according to an embodiment of the invention;
Fig. 9 is a schematic diagram of calculating the 2D projection image of a 3D image;
Fig. 10 shows the 2D projection images generated at different viewpoints according to an embodiment of the invention.
Detailed description of embodiments
Hereinafter, preferred embodiments of the present invention are described in detail with reference to the accompanying drawings. In the drawings, the same reference numerals denote identical or similar components even when they appear in different figures. For clarity and conciseness, detailed descriptions of well-known functions and structures are omitted, as they would otherwise obscure the subject matter of the invention.
As shown in Fig. 1, the apparatus according to an embodiment of the invention comprises a display device 130, a camera 110 and a host 120. The camera 110 is arranged on the display device 130, faces the viewer, and is used to capture live video of the viewer. The host 120 processes the video captured by the camera 110 to obtain the viewer's viewpoint. In addition, the 3D image data is stored in the computer in advance, and the 2D projection image data of the 3D image at the obtained viewpoint is calculated from that viewpoint. After the 2D projection image at the viewpoint has been calculated, it is presented on the display device 130. In this way, as the viewer's viewpoint changes, the 2D projection image of the 3D image at the corresponding viewpoint is calculated and presented on the display, so that displaying the 3D image is realized.
Fig. 2 is a structural block diagram of the apparatus according to an embodiment of the invention. As shown in Fig. 2, the apparatus of this embodiment comprises the camera 110 that captures live video of the viewer, the host 120 and the display device 130.
The host 120 comprises a storage device 121 that stores the 3D image data and other related data in advance; a detection unit 122 for detecting the viewer's face information in the images captured by the camera; a tracking unit 123 for tracking the detected face in the captured video, thereby determining the viewer's viewpoint position; and an updating unit 124 for determining, based on the viewer's viewpoint, the 2D projection image of the 3D image at that viewpoint, i.e. updating the 2D projection image, which is then output to the display device 130 for display.
The above detection unit and tracking unit may collectively be referred to as an analysis unit, which realizes the functions of both units.
Fig. 3 is a flowchart of the 3D display method according to an embodiment of the invention.
According to an embodiment of the invention, the calculation and updating of the viewpoint in the 3D display method are realized by face detection and tracking. At step S10, the camera 110 captures images of the viewer's face; at step S11, the face image in each frame is then detected and tracked automatically. Since the viewer's viewpoint position coincides with the face position, the viewer's viewpoint can be obtained for each frame. At step S12, the 2D projection image of the 3D image stored in advance in the computer is calculated from this viewpoint. At step S13, as the viewer's viewpoint changes, the 2D projection images at the different viewpoints are displayed on the 2D display. In this way, a 3D image can be displayed on the 2D display.
The 3D display apparatus and method according to embodiments of the invention are described in detail below with reference to Figs. 4 to 6.
Fig. 4 shows the operation of the detection unit according to an embodiment of the invention. When the unit starts working, live video is input from the camera 110. At step S21 the video is examined to find the viewer's face. If no face is detected, the method continues to analyze new video. If a face is detected, the position and region of the face are calculated and saved, together with other information, to the storage device 121 at step S22. At step S23, all of this information is output as the detection result to the next unit. If necessary, the viewer's eyes, mouth and even nose can also be detected as additional information and likewise output to the next unit. As shown in Fig. 7, the viewer's face image is detected in the input image, as indicated by the box in Fig. 7(b).
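The patent does not prescribe a particular face detection algorithm. As an illustration only, a minimal version of steps S21-S23 is sketched below in Python using OpenCV's bundled Haar-cascade detector; the detector choice and the parameter values are assumptions, not taken from the disclosure:

```python
import cv2

# Minimal sketch of the detection unit (steps S21-S23), assuming OpenCV's
# bundled Haar cascade as the face detector; the patent itself does not
# name a specific detection algorithm.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_face(frame):
    """Return (x, y, w, h) of the largest detected face, or None (S21)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                      # no face: keep analyzing new frames
    # Keep the largest region as the viewer's face (S22); its position and
    # size would be stored and passed on to the tracking unit (S23).
    return max(faces, key=lambda r: r[2] * r[3])

cap = cv2.VideoCapture(0)                # live video from the camera
ok, frame = cap.read()
if ok:
    print("detected face region:", detect_face(frame))
cap.release()
```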
Fig. 5 shows the processing of the tracking unit. At step S31, the detected face is input to this unit. Then, at step S32, the unit creates a tracker for the detected face. According to embodiments of the invention, the tracker can be implemented with models such as mean shift, Kalman filtering or particle filtering. When the unit starts working, the video input from the camera is analyzed frame by frame at step S33. For each new frame, the tracker searches for the region most similar to the detected face. Once stable tracking is achieved, all information about the face, such as its position and region size, is obtained. At step S34, the tracking result is output, for example the position and size of the face and the relative positions of facial features such as the eyes, nose and mouth. Fig. 8 is a schematic diagram of the frame-by-frame face tracking. As the face moves in real time, the detected face in the images captured by the camera also moves over time, which yields the trajectory of the viewer's viewpoint.
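Mean shift, Kalman filtering and particle filtering are named as possible trackers without fixing one. A minimal constant-velocity Kalman filter over the face center, written in plain NumPy, sketches what steps S32-S34 could look like; the state model and noise parameters are assumptions:

```python
import numpy as np

class FaceKalmanTracker:
    """Toy constant-velocity Kalman filter over the face center (x, y).

    Illustrative only: Kalman filtering is one of the tracker options the
    text lists, but its state model and parameters are not specified there.
    """

    def __init__(self, x, y, dt=1.0):
        self.state = np.array([x, y, 0.0, 0.0])            # [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                           # state covariance
        self.F = np.array([[1, 0, dt, 0],                   # constant-velocity motion model
                           [0, 1, 0, dt],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                    # only x, y are observed
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * 0.01                           # process noise (assumed)
        self.R = np.eye(2) * 4.0                            # measurement noise (assumed)

    def update(self, measured_xy):
        # Predict where the face center should be in the new frame (S33).
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the face center measured in this frame.
        z = np.asarray(measured_xy, dtype=float)
        y = z - self.H @ self.state
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.state = self.state + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.state[:2]                               # smoothed face center (S34)

tracker = FaceKalmanTracker(320, 240)
for noisy_center in [(322, 241), (330, 243), (341, 240)]:
    print(tracker.update(noisy_center))
```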
Fig. 6 shows the operation of the updating unit. When the unit starts working, the face tracking result is input. At step S41, the viewer's viewpoint is updated according to the input tracking result. At step S42, the 2D projection image of the 3D image at the updated viewpoint is calculated, and at step S43 the updated 2D projection image is output. As mentioned above, since the viewer's viewpoint coincides with the position of the viewer's face, the viewpoint can be updated according to the change of the face position; a new 2D projection image is then calculated, and the displayed 2D projection image is updated.
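How the tracked face box becomes a 3D viewpoint is not spelled out beyond the statement that the viewpoint coincides with the face position. One plausible reading, sketched below with assumed calibration constants (image size, reference face width, reference distance and lateral scale are illustrative values, not from the patent), maps the offset of the face center from the image center to the lateral viewpoint coordinates and the face size to the depth:

```python
# Sketch of the viewpoint update (steps S41-S42); all constants below are
# illustrative assumptions rather than values given in the disclosure.
IMAGE_W, IMAGE_H = 640, 480     # camera image size (assumed)
REF_FACE_W = 120.0              # face width in pixels at the reference distance (assumed)
REF_DISTANCE = 60.0             # reference viewer-to-screen distance in cm (assumed)
CM_PER_PIXEL = 0.1              # lateral scale at the reference distance (assumed)

def update_viewpoint(face_box):
    """Map a tracked face box (x, y, w, h) in image coordinates to a 3D viewpoint."""
    x, y, w, h = face_box
    cx, cy = x + w / 2.0, y + h / 2.0
    # Lateral position: offset of the face center from the image center.
    vx = (cx - IMAGE_W / 2.0) * CM_PER_PIXEL
    vy = (IMAGE_H / 2.0 - cy) * CM_PER_PIXEL
    # Depth: a larger face means the viewer is closer (pinhole similar triangles).
    vz = REF_DISTANCE * REF_FACE_W / float(w)
    return (vx, vy, vz)

print(update_viewpoint((300, 200, 140, 140)))   # viewer slightly right of center, closer than reference
```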
Through the face detection, face tracking and viewpoint updating described above, the viewer sees images at different viewpoints. Since these images are 2D projection images of the same 3D image at different viewpoints, the viewer can watch a 3D image on the 2D display.
Figs. 9 and 10 show an example of 3D display, in which a 3D cube is displayed on a 2D screen. First, when the viewer appears in front of the screen, the camera captures real-time video of the viewer, and the video is input to the face detection module. Once the viewer's face is detected, the detection result is output to the face tracking module, in which the viewer's face is tracked frame by frame. Information about the face, for example its size, shape, position, color and the relative positions of the eyes, nose and mouth, is thereby obtained and output to the viewpoint update module. Since the viewer's viewpoint coincides with the position of the face, the viewpoint can be obtained from the face tracking result. The 2D projection image of the 3D cube at this viewpoint can then be calculated and presented on the display. Because the 2D projection image of the 3D image shown on the screen is updated in real time according to the change of the viewer's viewpoint position, the viewer perceives a 3D image. Displaying the 3D cube on a 2D display is thus realized.
The projection of a 3D image onto a 2D image can be described with the camera model of projective geometry. As shown in Fig. 9, let there be a point A in real 3D space; for convenience of the later calculation it is written in homogeneous coordinates as A = {X0, Y0, Z0, 1}. Through the camera center C = {0, 0, 0, 1}, this point has a projection point a on the 2D image plane, which can be calculated by the following formula:
a = PA
where P is the projection matrix, which can be expressed in homogeneous form as
P = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
Here the parameter f is the distance from the image plane to the camera center, i.e. the focal length. The concrete projection formula can therefore be written as:
\begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix} = a = PA = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X_0 \\ Y_0 \\ Z_0 \\ 1 \end{bmatrix}
The projection point a = {x0, y0, z0} obtained here is in homogeneous form; its true coordinates on the image are {x0/z0, y0/z0}. In this way we obtain the 2D projection point a of the 3D point A on the image plane.
Similarly, for a 3D object in real space, the projection of the object can be obtained by calculating, with the above method, the projection of every point on the object onto the image plane. The projection of a 3D object onto the 2D image plane is thus calculated.
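The projection a = PA above translates directly into a few lines of NumPy; the sketch below simply evaluates the stated formula for a set of points (the focal length value is an arbitrary example):

```python
import numpy as np

def project_points(points_3d, f=1.0):
    """Project Nx3 world points through the camera model a = PA.

    Implements the homogeneous pinhole projection described above:
    x = f*X, y = f*Y, z = Z, then dehomogenize to (x/z, y/z).
    """
    P = np.array([[f, 0, 0, 0],
                  [0, f, 0, 0],
                  [0, 0, 1, 0]], dtype=float)
    pts = np.asarray(points_3d, dtype=float)
    A = np.hstack([pts, np.ones((pts.shape[0], 1))])   # homogeneous coordinates
    a = (P @ A.T).T                                    # a = PA for every point
    return a[:, :2] / a[:, 2:3]                        # true image coordinates (x/z, y/z)

# Example: a single point A = (X0, Y0, Z0) = (1, 2, 4) with f = 1
print(project_points([[1.0, 2.0, 4.0]]))               # -> [[0.25 0.5 ]]
```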
As shown in Fig. 10, since the viewer's viewpoint coincides with the position of the viewer's face, the viewer's viewpoint can be obtained by tracking the face frame by frame. For each viewpoint, the 2D projection of the 3D cube at that viewpoint is calculated and displayed, so that the viewer sees the 3D cube on the 2D display. As shown in the figure, the viewer watches the image on the 2D display from four different viewpoints.
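To make the view-dependent rendering of Fig. 10 concrete, the sketch below projects the corners of a cube from four hypothetical viewpoints. For brevity the scene is only translated into a camera frame anchored at each viewpoint, with no rotation toward the cube; this is an illustrative simplification, not the rendering procedure prescribed by the patent:

```python
import numpy as np

# Eight corners of a cube of side 2 centered at the origin.
CUBE = np.array([[x, y, z] for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)], dtype=float)

def project_from_viewpoint(points, viewpoint, f=2.0):
    """Project points as seen from `viewpoint` (translation-only camera sketch)."""
    rel = points - np.asarray(viewpoint, dtype=float)   # express points in the camera frame
    # Standard pinhole: image coords (f*X/Z, f*Y/Z); assumes all Z values are nonzero.
    return f * rel[:, :2] / rel[:, 2:3]

# Four hypothetical viewpoints, e.g. the viewer moving left/right and up/down in front of the screen.
for vp in [(-3, 0, -8), (3, 0, -8), (0, 2, -8), (0, -2, -8)]:
    proj = project_from_viewpoint(CUBE, vp)
    print(vp, "->", np.round(proj[0], 3), "...")        # 2D image of the first cube corner
```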
According to embodiments of the invention, the inconvenience of the prior art is solved. According to the method of an embodiment of the invention, a camera first captures video of the viewer. The viewer's face is then detected and tracked frame by frame in the captured video, so that the viewer's face image is obtained for each frame. Since the viewer's viewpoint position coincides with the position of the face, the viewer's viewpoint can be obtained for each video frame. The 2D projection image of the pre-stored 3D image at this viewpoint is then calculated. Therefore, as the viewpoint changes, the viewer sees 2D projection images of the same 3D image at different viewpoints, and the 3D image is thereby presented on the 2D display. According to embodiments of the invention, the user can view the 3D image directly on the 2D display with the naked eye.
According to another embodiment of the invention, the detection and tracking of the face can be replaced by detection and tracking of the eyes. Moreover, with eye detection and tracking, different 2D projection images can be calculated for the left eye and the right eye, so the method of this embodiment can be used for stereoscopic display.
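For the stereoscopic variant, the only change is that two viewpoints are maintained, one per eye. A minimal sketch, assuming a fixed interpupillary distance (the value and the head viewpoint below are assumptions, not from the patent):

```python
# Sketch of per-eye viewpoints for the stereoscopic embodiment.
IPD_CM = 6.3                           # typical interpupillary distance (assumed constant)

def eye_viewpoints(head_viewpoint):
    """Split a tracked head viewpoint (x, y, z) into left/right eye viewpoints."""
    x, y, z = head_viewpoint
    left = (x - IPD_CM / 2.0, y, z)
    right = (x + IPD_CM / 2.0, y, z)
    return left, right

# Each eye's viewpoint would then drive its own 2D projection of the 3D image,
# and the two projections would be delivered separately to the left and right eye.
print(eye_viewpoints((4.0, 1.5, 55.0)))
```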
According to embodiments of the invention, since the viewpoint is controlled by face detection and tracking, the viewpoint can be updated in real time in 3D space, which satisfies the requirement for highly smooth viewpoint control.
According to another embodiment of the invention, the depth of the viewpoint is controlled according to the size of the viewer's face. By setting a reference size of the viewer's face together with a corresponding reference distance from the screen, the depth of the viewpoint can be calculated approximately. In addition, other methods, such as laser ranging, ultrasonic ranging and infrared ranging, can be used to achieve more accurate distance measurement.
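The face-size-to-depth relation follows from similar triangles in the pinhole camera model; a short worked example, with reference values chosen only for illustration:

```latex
% Assumed calibration: apparent face width w_ref = 120 px when the viewer
% sits at the reference distance Z_ref = 60 cm from the screen.
Z \approx Z_{\mathrm{ref}} \cdot \frac{w_{\mathrm{ref}}}{w},
\qquad \text{e.g. } w = 150\ \mathrm{px} \;\Rightarrow\;
Z \approx 60 \cdot \tfrac{120}{150} = 48\ \mathrm{cm}.
```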
The apparatus of the above embodiments of the invention can be applied to, for example, computers, mobile terminals, PDAs, electronic whiteboards and the like.
The specification and drawings merely illustrate the principles of the invention. It should therefore be appreciated that those skilled in the art can devise various arrangements which, although not explicitly described or shown herein, embody the principles of the invention and are included within its spirit and scope. Furthermore, all examples recited herein are principally intended for teaching purposes, to aid the reader in understanding the principles of the invention and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Moreover, all statements herein reciting principles, aspects and embodiments of the invention, as well as specific examples thereof, are intended to encompass equivalents thereof.

Claims (10)

1. A method for displaying a 3D image on a 2D display device, comprising the steps of:
capturing video of a viewer;
analyzing the video to obtain the viewer's viewpoint; and
determining the 2D projection image of the 3D image at this viewpoint;
wherein, as the viewer's viewpoint changes, the 2D projection image associated with the corresponding viewpoint is displayed.
2. The method of claim 1, wherein the step of analyzing the video to obtain the viewer's viewpoint comprises:
performing face detection on the captured video; and
performing frame-by-frame face tracking based on the detected face;
wherein the position of the detected face and the face tracking result are used as the viewer's viewpoint.
3. The method of claim 2, wherein the step of analyzing the video to obtain the viewer's viewpoint further comprises:
obtaining shape and size information of the detected face.
4. The method of claim 3, wherein the depth of the viewpoint is determined based on said size information.
5. The method of claim 3, wherein the step of analyzing the video to obtain the viewer's viewpoint further comprises:
detecting the positions of the viewer's eyes on the face as said viewpoint.
6. An apparatus for displaying a 3D image on a 2D display device, comprising:
a video capture device for capturing video of a viewer;
an analysis unit that analyzes the video to obtain the viewer's viewpoint;
an updating unit that determines the 2D projection image of the 3D image at this viewpoint; and
a display device that, as the viewer's viewpoint changes, displays the 2D projection image associated with the corresponding viewpoint.
7. The apparatus of claim 6, wherein the analysis unit comprises:
a detection unit that performs face detection on the captured video; and
a tracking unit that performs frame-by-frame face tracking based on the detected face;
wherein the position of the detected face and the face tracking result are used as the viewer's viewpoint.
8. The apparatus of claim 7, wherein the analysis unit also obtains shape and size information of the detected face.
9. The apparatus of claim 8, wherein the analysis unit determines the depth of the viewpoint based on said size information.
10. The apparatus of claim 8, wherein the analysis unit detects the positions of the viewer's eyes on the face as said viewpoint.
CN200910146364XA 2009-06-24 2009-06-24 Method and equipment for displaying 3D image Pending CN101931823A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910146364XA CN101931823A (en) 2009-06-24 2009-06-24 Method and equipment for displaying 3D image


Publications (1)

Publication Number Publication Date
CN101931823A true CN101931823A (en) 2010-12-29

Family

ID=43370697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910146364XA Pending CN101931823A (en) 2009-06-24 2009-06-24 Method and equipment for displaying 3D image

Country Status (1)

Country Link
CN (1) CN101931823A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0874303A1 (en) * 1997-04-25 1998-10-28 Texas Instruments France Video display system for displaying a virtual threedimensinal image
US20050219694A1 (en) * 2004-04-05 2005-10-06 Vesely Michael A Horizontal perspective display
CN101065783A (en) * 2004-04-05 2007-10-31 迈克尔·A·韦塞利 Horizontal perspective display
CN101739567A (en) * 2008-11-19 2010-06-16 索尼爱立信移动通信日本株式会社 Terminal apparatus, display control method, and display control program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
冯成志, 沈模卫: "Gaze tracking technology and its application in human-computer interaction" (视线跟踪技术及其在人机交互中的应用), Journal of Zhejiang University (浙江大学学报) *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102123291A (en) * 2011-02-12 2011-07-13 中山大学 Intelligent naked-eye stereoscopic display system and control method thereof
CN102123291B (en) * 2011-02-12 2013-10-09 中山大学 Intelligent naked-eye stereoscopic display system and control method thereof
CN102740154A (en) * 2011-04-14 2012-10-17 联发科技股份有限公司 Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
US9367218B2 (en) 2011-04-14 2016-06-14 Mediatek Inc. Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
CN102740154B (en) * 2011-04-14 2015-07-15 联发科技股份有限公司 Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
US8988512B2 (en) 2011-04-14 2015-03-24 Mediatek Inc. Method for adjusting playback of multimedia content according to detection result of user status and related apparatus thereof
CN102780902B (en) * 2011-05-13 2014-12-24 Lg电子株式会社 Apparatus and method for processing 3-dimensional image
CN102780902A (en) * 2011-05-13 2012-11-14 Lg电子株式会社 Apparatus and method for processing 3-dimensional image
CN104137538A (en) * 2011-12-23 2014-11-05 韩国科学技术研究院 Device for displaying multi-view 3d image using dynamic visual field expansion applicable to multiple observers and method for same
US10237543B2 (en) 2011-12-23 2019-03-19 Samsung Electronics Co., Ltd. Device for displaying multi-view 3D image using dynamic viewing zone expansion applicable to multiple observers and method for same
CN104137538B (en) * 2011-12-23 2017-09-29 三星电子株式会社 Can be applied to multiple observers is used to extend using dynamic vision area showing the devices and methods therefor of multiple views 3D rendering
CN103546733A (en) * 2012-07-17 2014-01-29 联想(北京)有限公司 Display method and electronic device
CN103546733B (en) * 2012-07-17 2017-05-24 联想(北京)有限公司 Display method and electronic device
CN102970574A (en) * 2012-11-21 2013-03-13 深圳市酷开网络科技有限公司 Eyeglass-based 3D (3-Dimensional) intelligent terminal and system
CN103354616A (en) * 2013-07-05 2013-10-16 南京大学 Method and system for realizing three-dimensional display on two-dimensional display
CN105138215B (en) * 2014-05-26 2018-12-14 联想(北京)有限公司 A kind of information processing method and electronic equipment
CN105138215A (en) * 2014-05-26 2015-12-09 联想(北京)有限公司 Information processing method and electronic device
CN104349155A (en) * 2014-11-25 2015-02-11 深圳超多维光电子有限公司 Method and equipment for displaying simulated three-dimensional image
US9961334B2 (en) 2014-11-25 2018-05-01 Superd Co. Ltd. Simulated 3D image display method and display device
US9749612B2 (en) 2015-02-04 2017-08-29 Boe Technology Group Co., Ltd. Display device and display method for three dimensional displaying
CN104581350A (en) * 2015-02-04 2015-04-29 京东方科技集团股份有限公司 Display method and display device
CN105100775A (en) * 2015-07-29 2015-11-25 努比亚技术有限公司 Image processing method and apparatus, and terminal
CN105592306A (en) * 2015-12-18 2016-05-18 深圳前海达闼云端智能科技有限公司 Three-dimensional stereo display processing method and device
CN105704475A (en) * 2016-01-14 2016-06-22 深圳前海达闼云端智能科技有限公司 Three-dimensional stereo display processing method of curved-surface two-dimensional screen and apparatus thereof
CN105704475B (en) * 2016-01-14 2017-11-10 深圳前海达闼云端智能科技有限公司 The 3 D stereo display processing method and device of a kind of curved surface two-dimensional screen
US20170351107A1 (en) * 2016-06-03 2017-12-07 GM Global Technology Operations LLC Display system and method of creating an apparent three-dimensional image of an object
CN107465905A (en) * 2016-06-03 2017-12-12 通用汽车环球科技运作有限责任公司 The method of the apparent 3-dimensional image of display system and establishment object
CN109845251A (en) * 2016-12-07 2019-06-04 三星电子株式会社 Electronic equipment for displaying images and method
US10681340B2 (en) 2016-12-07 2020-06-09 Samsung Electronics Co., Ltd. Electronic device and method for displaying image
CN109845251B (en) * 2016-12-07 2021-08-31 三星电子株式会社 Electronic device and method for displaying images
CN109523849A (en) * 2018-12-29 2019-03-26 济南科明数码技术股份有限公司 A kind of simulation control subsystem applied to electronic whiteboard teaching
CN110390686A (en) * 2019-07-24 2019-10-29 张天 Naked eye 3D display method and system

Similar Documents

Publication Publication Date Title
CN101931823A (en) Method and equipment for displaying 3D image
US11778159B2 (en) Augmented reality with motion sensing
Zollmann et al. Augmented reality for construction site monitoring and documentation
US9805509B2 (en) Method and system for constructing a virtual image anchored onto a real-world object
US9613262B2 (en) Object detection and tracking for providing a virtual device experience
CN106774880B (en) Three-dimensional tracking of user control devices in space
US7627447B2 (en) Method and apparatus for localizing and mapping the position of a set of points on a digital model
US9411413B2 (en) Three dimensional user interface effects on a display
US11783580B2 (en) Input apparatus, input method of input apparatus, and output apparatus
US20140320661A1 (en) Indoor scene capture system
JP7182976B2 (en) Information processing device, information processing method, and program
US20210225038A1 (en) Visual object history
CN102508548A (en) Operation method and system for electronic information equipment
US11537196B2 (en) Drift cancelation for portable object detection and tracking
JP2021136017A (en) Augmented reality system using visual object recognition and stored geometry to create and render virtual objects
Cheng et al. AR-based positioning for mobile devices
CN104345885A (en) Three-dimensional tracking state indicating method and display device
Piérard et al. I-see-3d! an interactive and immersive system that dynamically adapts 2d projections to the location of a user's eyes
US20210258476A1 (en) System for generating a three-dimensional scene of a physical environment
US9551922B1 (en) Foreground analysis on parametric background surfaces
Angladon Room layout estimation on mobile devices
Dong et al. Cost efficient virtual environment generation framework using annotated panoramic videos
JP2024072072A (en) Information processing device
WO2022019128A1 (en) Information processing device, information processing method, and computer-readable recording medium
KR20110136013A (en) Augmented reality device to display hologram object using the law of gravity

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20101229