CN107105215A - Method and display system for presenting an image - Google Patents

Method and display system for presenting an image

Info

Publication number
CN107105215A
CN107105215A (Application CN201710195608.8A)
Authority
CN
China
Prior art keywords
image
target object
scene
observer
presentation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710195608.8A
Other languages
Chinese (zh)
Other versions
CN107105215B (en)
Inventor
骆伟权
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201710195608.8A priority Critical patent/CN107105215B/en
Priority to CN202010029590.6A priority patent/CN111208906B/en
Publication of CN107105215A publication Critical patent/CN107105215A/en
Application granted granted Critical
Publication of CN107105215B publication Critical patent/CN107105215B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present disclosure provides a method for presenting an image and a display system. The method for presenting an image includes: capturing an image of a scene and presenting it; determining a target object of interest to an observer in the image, the target object being an object in the scene; and, when the target object moves, presenting the image of the scene on a display unit while maintaining the observer's visual perception of the target object in the scene.

Description

Method and display system for presenting an image
Technical field
The present disclosure relates to a method for presenting an image and a display system, and in particular to a method and display system capable of maintaining the presentation viewing angle of an object in an image.
Background art
With the rapid development of communication and computer technology, electronic devices are increasingly used in people's daily lives, and various technologies have emerged to improve the user experience.
Augmented reality (AR) is a technology that combines virtual images with the physical environment or space of the real world and presents the result to a user. With AR technology, an object appearing in a video is identified, information related to that object, i.e. augmentation information, is supplemented, and the object together with its augmentation information is presented to the user. The supplementary information can include graphics or text overlaid on frames of the video so that the object can be identified, defined or otherwise described to the user. In this way, AR technology can provide the user with an augmented real-time experience relative to the video that is captured and displayed in real time.
However, in current AR technology, when the image of a real scene is captured, supplemented and presented to the user, an object that moves in the real scene also moves in its presentation on the display device. If the user keeps paying attention to that object, the user has to continually move his or her eyes, neck or body along with the movement of the object in order to keep watching it. This is inconvenient for the user.
Summary of the invention
One aspect of the present disclosure provides a method for presenting an image, including: capturing an image of a scene and presenting it; determining a target object of interest to an observer in the image, the target object being an object in the scene; and, when the target object moves, presenting the image of the scene on a display unit while maintaining the observer's visual perception of the target object in the scene.
According to an embodiment of the present disclosure, the visual perception includes: a viewing angle of the observer with respect to the target object; and/or a viewing distance of the observer with respect to the target object.
According to an embodiment of the present disclosure, determining the target object of interest includes: obtaining a pupil image of the observer's eyes and determining the target object of interest according to the pupil image; or receiving an input from the observer and determining the target object of interest according to the input.
According to an embodiment of the present disclosure, when the target object moves, an image acquisition unit that captures the scene image is adjusted so as to maintain the visual perception.
Another aspect of the present disclosure provides a display system, including: a display unit for presenting images; a memory storing computer-readable instructions; and a processor configured to execute the computer-readable instructions in the memory so as to perform the following operations: presenting the captured image of a scene on the display unit; determining a target object of interest to an observer in the image, the target object being an object in the scene; and, when the target object moves, presenting the image of the scene on the display unit while maintaining the observer's visual perception of the target object in the scene.
Another aspect of the present disclosure provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the method according to the present disclosure.
Brief description of the drawings
For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which:
Fig. 1 schematically illustrates an application scenario of the method for presenting an image according to an embodiment of the present disclosure;
Fig. 2 schematically illustrates a flow chart of the method for presenting an image according to an embodiment of the present disclosure;
Fig. 3 shows a schematic diagram of the effect of the method for presenting an image according to an embodiment of the present disclosure;
Fig. 4 shows a flow chart of a method for determining the target object of interest according to an embodiment of the present disclosure;
Fig. 5 shows a flow chart of a method for determining the target object of interest according to another embodiment of the present disclosure; and
Fig. 6 schematically illustrates a block diagram of the structure of a display system according to an embodiment of the present disclosure.
Detailed description of embodiments
Other aspects, advantages and salient features of the present disclosure will become apparent to those skilled in the art from the following detailed description of exemplary embodiments of the disclosure taken in conjunction with the accompanying drawings.
In the present disclosure, the terms "comprising" and "including" and their derivatives mean inclusion without limitation; the term "or" is inclusive, meaning and/or.
In this specification, the various embodiments described below for explaining the principles of the present disclosure are illustrative only and should not be construed in any way as limiting the scope of the disclosure. The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the disclosure as defined by the claims and their equivalents. The description includes various specific details to assist understanding, but these details are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness. Throughout the drawings, the same reference numerals are used for the same functions and operations.
An embodiment of the present disclosure provides a method for presenting an image and a display device. The method includes capturing an image of a scene and presenting it. An observer views the presented image. A target object of interest to the observer in the image is then determined; the target object is an object in the scene. When the target object moves, the image of the scene continues to be presented while the observer's visual perception of the target object is maintained. In other words, if the target object is directly in front of the user at the moment it is determined that the user is paying attention to that target object in the scene image, and the target object then moves to the user's right, the captured scene image is adjusted after acquisition so that the target object is still displayed directly in front of the user, for example by shifting the whole scene image to the left. In this way the user can keep paying attention to the target object without making any body movement. At the same time, because the scene image as a whole is shifted to the left, the user also knows that it is the target object that is moving in the current scene rather than the other objects in the scene, so the user is not confused in any way. This improves the user experience.
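The whole-image shift described above can be sketched in Python as follows. This is a purely illustrative sketch and not part of the patent: NumPy, the function name shift_to_keep_target and its parameters are all assumptions made for the example.

import numpy as np

def shift_to_keep_target(frame: np.ndarray,
                         target_x: int,
                         anchor_x: int) -> np.ndarray:
    """Shift the scene image horizontally so that the target's current pixel
    column `target_x` lands on the fixed display column `anchor_x`.

    frame    : H x W x 3 scene image captured this frame
    target_x : current horizontal position of the tracked target (pixels)
    anchor_x : display column where the target was first locked
    """
    dx = anchor_x - target_x           # negative value -> shift the image left
    shifted = np.zeros_like(frame)     # the uncovered strip is simply left blank here
    if dx >= 0:
        shifted[:, dx:] = frame[:, :frame.shape[1] - dx]
    else:
        shifted[:, :dx] = frame[:, -dx:]
    return shifted

With anchor_x fixed at the column where the target was first locked, a target that has moved to the user's right makes dx negative and the whole frame slides to the left, exactly as in the example above.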
Fig. 1 schematically illustrates an application scenario of the method for presenting an image according to an embodiment of the present disclosure.
As shown in Fig. 1, a user is currently watching a performance. The user may watch it with the help of an auxiliary display device; for example, the user may wear a near-eye display, a head-mounted display or a head-up display. Fig. 1 shows the result of the user watching through a near-eye display such as a pair of glasses. As shown in Fig. 1(a), there is currently one person dancing on the stage, one person performing a cycling act, and one car. Fig. 1(b) shows the image the user sees on the worn glasses for the performance in the foreground; in addition to the people and the car on the stage, the image also shows brief introductions of the people and the car, providing the user with more information about the scene being watched.
In the scene shown in Fig. 1, the cyclist, for example, may move around, whereas the person performing the dance performs in place, i.e. does not change position while moving, and the car is stationary, i.e. remains parked where it is.
At this time the image seen by the user changes along with the movement of the cyclist; that is, the image presented on the glasses changes in real time.
If the user wants to keep paying attention to the cyclist, the user has to turn his or her eyes or head as the cyclist moves away.
Fig. 2 schematically illustrates a flow chart of a method 2000 for presenting an image according to an embodiment of the present disclosure.
As shown in Fig. 2, the method 2000 for presenting an image according to an embodiment of the present disclosure starts and proceeds to step S2100, where an image of the scene is captured and presented. In the situation shown in Fig. 1, the image of the scene on the stage watched by the user is captured and presented on the glasses worn by the user. In step S2200, a target object of interest to the observer in the image is determined, the target object being an object in the scene. For example, in the situation shown in Fig. 1, the target object of interest to the observer in the image is determined to be the cyclist, e.g. because the observer turns his or her eyes and keeps gazing at the cyclist in the image for several seconds. Finally, in step S2300, when the target object moves, the image of the scene is presented on the display unit while the observer's visual perception of the target object is maintained. In the situation shown in Fig. 1, when the cyclist moves, the scene image is shifted as it is presented on the glasses so that the image of the cyclist always remains to the front right of the user. In this way the user can keep paying attention to the cyclist without turning his or her eyes.
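As a rough illustration only, steps S2100 to S2300 could be arranged into a loop like the following Python sketch. The camera, display and gaze_tracker objects, the helper find_gazed_object and the earlier shift_to_keep_target sketch are assumed placeholders for hardware-specific components, not an API defined by the patent.

def present_images(display, camera, gaze_tracker):
    target = None
    anchor = None                                   # display position recorded at lock time
    while display.is_active():
        frame = camera.capture_frame()              # S2100: capture the scene image
        if target is None:
            target = find_gazed_object(frame, gaze_tracker)   # S2200: determine the target
            if target is not None:
                anchor = target.position_in(frame)
        if target is not None:                      # S2300: maintain the visual perception
            frame = shift_to_keep_target(frame,
                                         target.position_in(frame)[0],
                                         anchor[0])
        display.render(frame)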
According to an embodiment of the present disclosure, the visual perception includes the observer's viewing angle with respect to the target object and/or the observer's viewing distance with respect to the target object.
Fig. 3 shows a schematic diagram of the effect of the method 2000 for presenting an image according to an embodiment of the present disclosure.
For example, when the cyclist moves away to the user's right, Fig. 3(a) shows the image of the real scene at moment 1 and the image presented on the glasses worn by the user, and Fig. 3(b) shows the image of the real scene at moment 2 and the image presented on the glasses worn by the user. The reference line in Fig. 3 may, for example, be the centre line of the glasses, schematically indicating the position of the image shown on the glasses.
According to an embodiment of the present disclosure, while the target object of interest moves, images of the scene can be captured continuously, and when the scene image is presented, its display on the display unit is adjusted so as to maintain the visual perception of the target object in the scene. For example, the scene image is adjusted so that the target object is always presented at a predetermined position on the display unit, i.e. the observer's viewing angle with respect to the target object is maintained. According to another embodiment of the present disclosure, the image acquisition unit that captures the scene image can be adjusted so that the visual perception of the target object remains unchanged when the captured scene image is presented on the display unit. For example, a 360° camera or a movable/rotatable camera can be used: once the target object of interest is determined, the camera tracks the target object while capturing. When the target object moves, the camera also moves/rotates so that the target object always stays, for example, on the centre line of the camera. The target object is thus always at the centre of the captured scene image, and when the scene image is presented, the position of the target object on the display unit remains unchanged.
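A proportional pan correction of the kind just described, rotating the camera until the target sits on its centre line, might look like the sketch below. The camera.rotate_pan call and the gain and deadband values are assumptions for illustration, not part of the disclosure.

def pan_to_center_target(camera, target_bearing_deg: float,
                         gain: float = 0.5, deadband_deg: float = 1.0) -> None:
    """target_bearing_deg: horizontal angle of the target relative to the
    camera's optical axis; 0 means the target is already centred."""
    error = target_bearing_deg
    if abs(error) > deadband_deg:        # ignore tiny jitter around the centre line
        camera.rotate_pan(gain * error)  # rotate a fraction of the error each cycle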
In another example, when the cyclist moves directly away from the user, after moment 1 shown in Fig. 3(a), Fig. 3(c) shows the image of the real scene at moment 3 and the image presented on the glasses worn by the user.
According to an embodiment of the present disclosure, after the target object of interest is determined, images of the scene can be captured continuously, and when the scene image is presented, its display on the display unit is adjusted so as to maintain the visual perception of the target object in the scene. For example, the scene image is adjusted so as to maintain the size at which the target object is presented on the display unit, i.e. the observer's viewing distance with respect to the target object is maintained. According to another embodiment of the present disclosure, the image acquisition unit that captures the scene image can be adjusted so that the visual perception of the target object remains unchanged when the captured scene image is presented on the display unit. For example, a zoom camera can be used: once the target object of interest is determined, the camera tracks the target object while capturing. When the target object moves away or comes closer, the focal length is adjusted so that the size of the target object in the captured scene image remains unchanged, and the size of the target object on the display unit therefore remains unchanged when the scene image is presented.
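Because the on-image size of an object at a given distance scales roughly linearly with focal length, the zoom adjustment can be sketched as follows. The camera.set_focal_length call and the pixel-height inputs are assumed, illustrative names rather than anything specified by the patent.

def hold_apparent_size(camera, current_focal_mm: float,
                       target_height_px: float, locked_height_px: float) -> float:
    """Return (and apply) the focal length that restores the target's
    on-image height to the value recorded when the target was locked."""
    new_focal = current_focal_mm * locked_height_px / target_height_px
    camera.set_focal_length(new_focal)
    return new_focal

For example, if the cyclist's on-image height drops from 200 px to 100 px as he rides away, the focal length is doubled and his presented size is restored.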
In another example, when the observer is watching an air show and the target object the observer pays attention to is an aircraft in flight, the pitch angle of the camera can be adjusted so that the image of the aircraft always stays at mid-height on the display unit.
Fig. 4 shows a flow chart of a method 4000 for determining the target object of interest according to an embodiment of the present disclosure.
As shown in Fig. 4, at step S4100 a pupil image of the observer's eyes is obtained. Then, in step S4200, the object in the image that the eyes are gazing at is determined from the pupil image and the image presented on the display device. In step S4300, when it is determined that the eyes have been gazing at that object in the image for a predetermined duration, the gazed-at object is determined to be the target object of interest.
According to an embodiment of the present disclosure, the pupil image of the observer's eyes can be captured by a camera mounted on the display device.
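Steps S4100 to S4300 amount to dwell-time selection, which could be sketched as below. The eye_camera object and the helpers estimate_gaze_point and object_at are hypothetical stand-ins for the gaze-tracking and object-recognition components, and the 2-second dwell threshold is only an example of the "predetermined duration".

import time

def select_target_by_gaze(eye_camera, displayed_image, dwell_s: float = 2.0):
    gazed_object, gaze_start = None, None
    while True:
        pupil_image = eye_camera.capture()              # S4100: obtain the pupil image
        gaze_xy = estimate_gaze_point(pupil_image)      # map pupil pose to display coordinates
        obj = object_at(displayed_image, gaze_xy)       # S4200: object under the gaze point
        now = time.monotonic()
        if obj is not None and obj == gazed_object:
            if now - gaze_start >= dwell_s:             # S4300: dwell threshold reached
                return obj
        else:
            gazed_object, gaze_start = obj, now         # gaze moved; restart the timer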
Of course, the target object of interest can also be determined in other ways. Fig. 5 shows a flow chart of a method 5000 for determining the target object of interest according to another embodiment of the present disclosure.
As shown in Fig. 5, at step S5100 the captured image of the scene is continuously presented on the display unit. Then, at step S5200, the observer touches an input device on the display unit, for example presses a button. In step S5300, in response to the button press, an option for an operation that can be carried out is displayed on the display unit, for example "Track a target object?". In step S5400, the observer confirms selection of that option. In step S5500, the objects in the presented image, such as the dancer, the cyclist and the car shown in Fig. 1, are recognized and presented on the display device. In step S5600, the observer's selection of an object is recognized. In this way, the observer can manually determine the target object of interest.
According to an embodiment of the present disclosure, in method 5000 the presented options and the like can be superimposed on the image currently displayed on the display unit, or shown in a part of the display unit.
Of course, the method for determining the target object of interest according to the present disclosure is not limited to the methods described above; any other applicable method may be used to determine the target object. For example, the observer may gaze at the target object and blink several times at a predetermined interval to indicate attention to the target object.
According to an embodiment of the present disclosure, the observer can cancel the attention paid to the target object. For example, the observer can determine a new target object, or cancel the attention paid to the current target object, by the method shown in Fig. 5. As another example, the method shown in Fig. 4 can be executed periodically so that the lock on the target object is released when the observer no longer pays attention to it.
Fig. 6 schematically illustrates a block diagram of the structure of a display system 6000 according to an embodiment of the present disclosure.
As shown in Fig. 6, the display system 6000 includes a processor 610, a computer-readable storage medium 620, a signal transmitter 630, a signal receiver 640, a display unit 650 and an image acquisition unit 660. The display system can perform the methods described above with reference to Figs. 2 to 5, so that when it is determined that the observer is paying attention to a target object shown on the display unit, the observer's visual perception of the target object is maintained while the image of the scene is presented.
Specifically, the processor 610 may include, for example, a general-purpose microprocessor, an instruction-set processor and/or a related chipset and/or a special-purpose microprocessor (for example, an application-specific integrated circuit (ASIC)), and so on. The processor 610 may also include on-board memory for caching purposes. The processor 610 may be a single processing unit or multiple processing units for performing the different actions of the method flows according to the embodiments of the present disclosure described with reference to Figs. 2 to 5.
The computer-readable storage medium 620 may be, for example, any medium that can contain, store, communicate, propagate or transport instructions. For example, the readable storage medium may include, but is not limited to, an electric, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus, device or propagation medium. Specific examples of the readable storage medium include: a magnetic storage device such as a magnetic tape or hard disk (HDD); an optical storage device such as a compact disc (CD-ROM); a memory such as a random access memory (RAM) or flash memory; and/or a wired/wireless communication link.
The computer-readable storage medium 620 may include a computer program 621, and the computer program 621 may include code/computer-executable instructions which, when executed by the processor 610, cause the processor 610 to perform, for example, the method flows described above in conjunction with Figs. 2 to 5 and any variations thereof.
The computer program 621 may be configured with, for example, computer program code including computer program modules. For example, in an exemplary embodiment, the code in the computer program 621 may include one or more program modules, for example module 621A, module 621B and module 621C. When the processor executes module 621A, the processor is controlled to capture an image of the scene and to control the display unit 650 to display it. When the processor executes module 621B, the processor determines the target object of interest to the observer in the image. When the processor executes module 621C, the processor is controlled to present the captured image of the scene on the display unit 650 while maintaining the observer's visual perception of the target object.
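One way to picture the three modules, reusing the helpers sketched earlier (find_gazed_object, shift_to_keep_target), is shown below. This is an assumed decomposition for illustration only, not the actual structure of computer program 621.

class Module621A:
    """Capture the scene image and have display unit 650 show it."""
    def run(self, camera, display):
        frame = camera.capture_frame()
        display.render(frame)
        return frame

class Module621B:
    """Determine the observer's target object of interest in the image."""
    def run(self, frame, gaze_tracker):
        return find_gazed_object(frame, gaze_tracker)   # e.g. dwell-time gaze selection, Fig. 4

class Module621C:
    """Present the scene while keeping the target's visual perception unchanged."""
    def run(self, frame, target, anchor, display):
        adjusted = shift_to_keep_target(frame, target.position_in(frame)[0], anchor[0])
        display.render(adjusted)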
It should be noted that the way the modules are divided and their number are not fixed; those skilled in the art may combine suitable program modules or combinations of program modules according to the actual situation, and when these combinations of program modules are executed by the processor 610, the processor 610 can perform, for example, the method flows described above in conjunction with Figs. 2 to 5 and any variations thereof. For example, the code in the computer program 621 may also include other program modules, for example modules that, when executed by the processor, cause the processor to perform method 4000, method 5000, and so on.
According to an embodiment of the present disclosure, the processor 610 may use the signal transmitter 630 and the signal receiver 640 to perform the method flows described above in conjunction with Figs. 2 to 5 and any variations thereof.
According to an embodiment of the present disclosure, the display system 6000 may also include an image acquisition unit 660 that captures images of the scene; the captured images are processed by the processor to be presented on the display unit 650. According to an embodiment of the present disclosure, the processor may control and adjust the image acquisition unit 660 so that the size/position of the target object in the captured image remains unchanged, thereby maintaining the observer's visual perception of the target object.
Although the present disclosure has been shown and described with reference to certain exemplary embodiments thereof, those skilled in the art should understand that various changes in form and detail may be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents. Therefore, the scope of the present disclosure should not be limited to the embodiments described above, but should be determined not only by the appended claims but also by the equivalents of the appended claims.

Claims (9)

1. A method for presenting an image, comprising:
capturing an image of a scene and presenting it;
determining a target object of interest to an observer in the image, the target object being an object in the scene; and
when the target object moves, presenting the image of the scene on a display unit while maintaining the observer's visual perception of the target object in the scene.
2. The method according to claim 1, wherein the visual perception comprises:
a viewing angle of the observer with respect to the target object; and/or
a viewing distance of the observer with respect to the target object.
3. The method according to claim 1, wherein determining the target object of interest comprises:
obtaining a pupil image of the observer's eyes; and
determining the target object of interest according to the pupil image; or
receiving an input from the observer; and
determining the target object of interest according to the input.
4. The method according to claim 1, wherein, when the target object moves, an image acquisition unit that captures the scene image is adjusted so as to maintain the visual perception.
5. A display system, comprising:
a display unit for presenting images;
a memory storing computer-readable instructions; and
a processor configured to execute the computer-readable instructions in the memory so as to perform the following operations:
presenting a captured image of a scene on the display unit;
determining a target object of interest to an observer in the image, the target object being an object in the scene; and
when the target object moves, presenting the image of the scene on the display unit while maintaining the observer's visual perception of the target object in the scene.
6. The display system according to claim 5, wherein the visual perception comprises:
a viewing angle of the observer with respect to the target object; and/or
a viewing distance of the observer with respect to the target object.
7. The display system according to claim 5, wherein the processor is further configured to execute the computer-readable instructions in the memory to:
obtain a pupil image of the observer's eyes; and
determine the target object of interest according to the pupil image; or
receive an input from the observer; and
determine the target object of interest according to the input.
8. The display system according to claim 5, further comprising: an image acquisition unit for capturing images of the scene; wherein the processor is further configured to execute the computer-readable instructions in the memory to adjust the image acquisition unit so as to maintain the viewing angle.
9. A computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the method according to any one of claims 1-4.
CN201710195608.8A 2017-03-28 2017-03-28 Method and display system for presenting image Active CN107105215B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201710195608.8A CN107105215B (en) 2017-03-28 2017-03-28 Method and display system for presenting image
CN202010029590.6A CN111208906B (en) 2017-03-28 2017-03-28 Method and display system for presenting image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710195608.8A CN107105215B (en) 2017-03-28 2017-03-28 Method and display system for presenting image

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202010029590.6A Division CN111208906B (en) 2017-03-28 2017-03-28 Method and display system for presenting image

Publications (2)

Publication Number Publication Date
CN107105215A true CN107105215A (en) 2017-08-29
CN107105215B CN107105215B (en) 2020-02-21

Family ID=59675437

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201710195608.8A Active CN107105215B (en) 2017-03-28 2017-03-28 Method and display system for presenting image
CN202010029590.6A Active CN111208906B (en) 2017-03-28 2017-03-28 Method and display system for presenting image

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202010029590.6A Active CN111208906B (en) 2017-03-28 2017-03-28 Method and display system for presenting image

Country Status (1)

Country Link
CN (2) CN107105215B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111142821A (en) * 2019-12-26 2020-05-12 联想(北京)有限公司 Processing method and device, electronic equipment and output equipment
CN111208906A (en) * 2017-03-28 2020-05-29 联想(北京)有限公司 Method and display system for presenting image
CN113398596A (en) * 2021-07-30 2021-09-17 广州边在晓峰网络科技有限公司 AR processing system based on multidimensional game

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1512258A (en) * 2002-12-30 2004-07-14 上海科星自动化技术有限公司 Automatic following camera shooting device
CN101405680A (en) * 2006-03-23 2009-04-08 皇家飞利浦电子股份有限公司 Hotspots for eye track control of image manipulation
CN101943982A (en) * 2009-07-10 2011-01-12 北京大学 Method for manipulating image based on tracked eye movements
CN104880905A (en) * 2015-05-13 2015-09-02 北京康得新三维科技有限责任公司 Device and method for tilt-shift stereoscopic photography
CN105518555A (en) * 2014-07-30 2016-04-20 深圳市大疆创新科技有限公司 Systems and methods for target tracking
CN106296743A (en) * 2016-08-23 2017-01-04 常州轻工职业技术学院 A kind of adaptive motion method for tracking target and unmanned plane follow the tracks of system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2565482C2 (en) * 2010-03-22 2015-10-20 Конинклейке Филипс Электроникс Н.В. System and method for tracing point of observer's look
CN104239877B (en) * 2013-06-19 2019-02-05 联想(北京)有限公司 The method and image capture device of image procossing
CN107105215B (en) * 2017-03-28 2020-02-21 联想(北京)有限公司 Method and display system for presenting image

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1512258A (en) * 2002-12-30 2004-07-14 上海科星自动化技术有限公司 Automatic following camera shooting device
CN101405680A (en) * 2006-03-23 2009-04-08 皇家飞利浦电子股份有限公司 Hotspots for eye track control of image manipulation
CN101943982A (en) * 2009-07-10 2011-01-12 北京大学 Method for manipulating image based on tracked eye movements
CN105518555A (en) * 2014-07-30 2016-04-20 深圳市大疆创新科技有限公司 Systems and methods for target tracking
CN104880905A (en) * 2015-05-13 2015-09-02 北京康得新三维科技有限责任公司 Device and method for tilt-shift stereoscopic photography
CN106296743A (en) * 2016-08-23 2017-01-04 常州轻工职业技术学院 A kind of adaptive motion method for tracking target and unmanned plane follow the tracks of system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111208906A (en) * 2017-03-28 2020-05-29 联想(北京)有限公司 Method and display system for presenting image
CN111208906B (en) * 2017-03-28 2021-12-24 联想(北京)有限公司 Method and display system for presenting image
CN111142821A (en) * 2019-12-26 2020-05-12 联想(北京)有限公司 Processing method and device, electronic equipment and output equipment
CN113398596A (en) * 2021-07-30 2021-09-17 广州边在晓峰网络科技有限公司 AR processing system based on multidimensional game

Also Published As

Publication number Publication date
CN111208906B (en) 2021-12-24
CN111208906A (en) 2020-05-29
CN107105215B (en) 2020-02-21

Similar Documents

Publication Publication Date Title
US10438410B2 (en) Text enhancements for head-mounted displays
CN111602082B (en) Position tracking system for head mounted display including sensor integrated circuit
EP3526967B1 (en) Non-planar computational displays
EP3646140B1 (en) Systems and methods for displaying images in a virtual world environment
WO2016157677A1 (en) Information processing device, information processing method, and program
CN106484116B (en) The treating method and apparatus of media file
JP5986198B2 (en) Display device, head mounted display, display method, display program, and recording medium
CN106327584B (en) Image processing method and device for virtual reality equipment
CN104956252A (en) Peripheral display for a near-eye display device
JP2014219621A (en) Display device and display control program
WO2013179426A1 (en) Display device, head-mounted display, display method, display program, and recording medium
CN108040247A (en) A kind of wear-type augmented reality display device and method
JPWO2019058492A1 (en) Display system and display method
CN107105215A (en) The method and display system of image is presented
US11157078B2 (en) Information processing apparatus, information processing method, and program
EP3346375B1 (en) Program, recording medium, content provision device, and control method
US11747897B2 (en) Data processing apparatus and method of using gaze data to generate images
JP7258620B2 (en) Image processing system and image processing method
US10083675B2 (en) Display control method and display control apparatus
KR20150061476A (en) Method for providing video of non-gazing region

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant