EP3577540A1 - Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle - Google Patents

Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle

Info

Publication number
EP3577540A1
Authority
EP
European Patent Office
Prior art keywords
space
target
signal
rail
move
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18703077.0A
Other languages
German (de)
English (en)
French (fr)
Inventor
Matthieu MIKA
Christophe MION
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PSA Automobiles SA
Original Assignee
PSA Automobiles SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PSA Automobiles SA filed Critical PSA Automobiles SA
Publication of EP3577540A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Definitions

  • The invention relates to immersive virtual reality systems.
  • Immersive virtual reality systems are used to immerse a user in a virtual environment.
  • This immersion may be intended, for example, to teach a user to move about in a particular environment or to use objects or functions present in a particular environment, to analyze the behavior of a user in a particular environment, or else to observe a particular environment depending on the position of a user in relation to it.
  • Such immersive systems include:
  • at least one target capable of being secured to a user (or sometimes an object) able to move in a predefined space,
  • detection means able to detect the current position of this target in this predefined space and to deliver a signal representative of this current position,
  • at least one display means responsible for displaying, on at least one screen installed in the predefined space, the images intended for this screen, and
  • at least one computer responsible for defining in real time, for each associated screen, three-dimensional (possibly stereoscopic) images of a chosen environment, according to the current position of the/each target and the position of the associated screen in the predefined space.
  • Such an immersive system exhibits a time difference, or delay, between the moment when a user changes position and the moment when that user sees on each screen the image resulting from this change of position.
  • This time difference or delay results not only from the processing time of signals and data and the transmission time of signals, data and images, but also from the graphical rendering time of the computers and the time difference between the moment when the user is placed in a new position and the moment when the detection means detect the target (and therefore the user) in this new position.
  • the invention is therefore particularly intended to allow the determination of this last time difference.
  • The invention proposes an analysis device intended to perform analyses in an immersive virtual reality system comprising at least one target adapted to be secured to an object capable of moving in a space, and detection means capable of detecting the current position of this target in this space and of delivering a first signal representative of this current position.
  • This analysis device comprises a sensor capable of generating a second signal when the object reaches a known position in the space, and analysis means coupled to the sensor and to the detection means and able to determine a first instant of reception of a first signal, representative of the detected known position, and a second instant of reception of the second signal, then to determine a time difference between these first and second reception instants.
  • the analysis device according to the invention may comprise other characteristics that can be taken separately or in combination, and in particular:
  • it may comprise a rail on which the object is able to move and which is adapted to be placed in the space so that the object can move to the known position;
  • the rail can be secured to a support so as to be inclined at a predefined acute angle with respect to a horizontal plane of the space, and thus allow an automatic displacement of the object along the rail under gravity, between a starting position and at least the known position;
  • it may comprise electromagnetic means fixedly installed on the rail, able to immobilize the object in the starting position when they are placed in a first, attracting state, and to release the object, so that it can move to the known position, when they are placed in a second, non-attracting state;
  • the electromagnetic means may have an operation that is controllable remotely;
  • the object may be provided with an electric motor capable of inducing its movement during operation;
  • the electric motor can have an operation that is controllable remotely;
  • the sensor may be able to be placed in the vicinity of the known position and to generate the second signal when the object contacts it.
  • The invention also proposes an immersive virtual reality system comprising at least one target adapted to be secured to an object adapted to move in a space, detection means adapted to detect the current position of the target in this space and to deliver a first signal representative of this current position, and an analysis device of the type presented above.
  • FIG. 1 diagrammatically and functionally illustrates part of an immersive virtual reality system coupled to an exemplary embodiment of an analysis device according to the invention in which the object to be detected is placed in a starting position, and
  • FIG. 2 diagrammatically and functionally illustrates the part of the immersive virtual reality system of FIG. 1 with the object to be detected of the analysis device placed in a known (final) position.
  • The object of the invention is notably to propose an analysis device DA intended to perform analyses in an immersive virtual reality system SI in order to determine a time difference ect contributing to the latency time of the latter (SI).
  • By way of example, the immersive virtual reality system SI is intended to immerse a user in a virtual environment representative of at least a part of a vehicle, possibly of the automotive type (such as a car). But the invention is not limited to this type of virtual environment; it indeed concerns any type of virtual environment.
  • FIGS. 1 and 2 schematically show a small part of an immersive (virtual reality) system SI associated with a predefined space EP in which at least part of an exemplary embodiment of an analysis device DA according to the invention is installed.
  • This small part comprises only a target carrier PC, here secured to an object O that is mobile and forms part of the analysis device DA, and detection means MD capable of detecting the current position of the targets CD of the target carrier PC in this predefined space EP and of delivering a first signal s1 representative of this current position.
  • The target carrier PC here comprises four targets CD whose positions must be determined at each measurement instant by the detection means MD in order to deduce the current position of the object O at that instant. But it can include any number of targets CD, as long as this number is at least one (1).
  • the detection means MD here comprise two cameras each associated with an emitter of infrared photons and capable of filming in the infrared.
  • Each transmitter emits an infrared beam that will be reflected on the targets (or spheres) CD.
  • Each camera records images of the photons reflected by the targets (or spheres) CD, and sends each recorded image to an image-analysis computer which deduces the position in space of the target carrier PC at the instant considered (a purely illustrative triangulation sketch is given after this list).
  • the MD detection means could include more than two cameras.
  • an immersive (virtual reality) system SI also comprises at least one computer, and at least one display means.
  • The target carrier PC is intended to equip a user (or sometimes an object) that can move in the predefined space EP.
  • Each display means is responsible for displaying, on at least one screen installed in the predefined space EP, the images that are intended for this screen. It should be noted that each display means may comprise a screen and at least one projector, or a screen and an LCD-type panel with its associated electronic control means, for example.
  • the number of screens is generally between one and five.
  • Each screen is installed in the predefined EP space.
  • At least one computer is responsible for defining, in real time, three-dimensional (possibly stereoscopic) images of the chosen environment for at least one screen associated with it, according to the current position of the targets CD of the target carrier PC and the position of this associated screen in the predefined space EP.
  • each projector is responsible for projecting on the associated screen three-dimensional images determined by the associated computer and intended for this screen.
  • an analysis device DA comprises, in addition to the object O, a sensor CC and analysis means MA.
  • the object O is mobile so that it can move in the (predefined) space EP.
  • It must be equipped with at least one target CD, possibly forming part of a target carrier PC (as in the non-limiting example illustrated in Figures 1 and 2).
  • The sensor CC is capable of generating a second signal s2 when the object O reaches a known position p2 in the space EP.
  • This sensor CC can, for example and as illustrated without limitation in Figures 1 and 2, be adapted to be placed in the vicinity of the known position p2 and to generate the second signal s2 when the object O contacts it.
  • it may, for example, be of piezoelectric or capacitive or inductive or mechanical type.
  • the detection could be done without contact (and therefore at a distance), for example by interrupting a light beam passing through the known position p2.
  • The known position p2 serves as a reference position with respect to which the analysis means MA determine the time difference ect between the instant i2 at which the object O (which stands in for a user moving in the space EP) is placed in a "new position" (here p2) and the instant at which the detection means MD detect the target(s) CD (and therefore the object O) in this new position (here p2).
  • This time difference ect is particularly useful to know because it contributes significantly to the latency time of the immersive system SI, which it is moreover sought to reduce.
  • When the analysis means MA receive, at a given instant, a first signal s1 representing the detected known position p2, they record this instant as the first instant i1, and when they receive, at a given instant, a second signal s2, they record this instant as the second instant i2 (a minimal sketch of this measurement logic is given after this list).
  • Here the analysis means MA are part of a computer OR which is coupled (directly or indirectly) to the sensor CC and to the detection means MD of the immersive system SI. But this is not obligatory; indeed, they could constitute electronic equipment (for example comprising an oscilloscope and an electronic signal-analysis circuit) coupled (directly or indirectly) to the sensor CC and the detection means MD of the immersive system SI. These analysis means MA can therefore be made in the form of software modules (or "software"), or of a combination of electronic circuits (or "hardware") and software modules.
  • Moving the object O can be done in different ways.
  • The analysis device DA may, for example, comprise a rail R on which the object O is able to move and which is adapted to be placed in the space EP so that the object O can move to the known position p2. In this case the displacement of the object O is constrained.
  • This rail R may be a simple rod, possibly, but not necessarily, of circular cross-section.
  • The analysis device DA may also comprise a support SR on which the rail R is fixedly secured.
  • Such a support SR may, for example, be intended to be placed on the ground in the space EP. It can therefore allow the rail R to be placed either parallel to the ground or inclined at a predefined acute angle with respect to the ground, and therefore with respect to a horizontal plane of the space EP (as illustrated in a non-limiting manner in FIGS. 1 and 2).
  • In the first alternative (rail parallel to the ground), for the object O to move from a starting position to the known position p2, it must either receive an initial impulse from a person, or be provided with an electric motor, preferably one whose operation is remotely controllable (for example wirelessly).
  • In the second alternative (inclined rail), the displacement of the object O along the rail R can take place automatically under gravity between a starting position p1 (illustrated in FIG. 1) and at least the known position p2 (illustrated in FIG. 2). In other words, the displacement results from the fall of the object O along the rail R (along the arrow F1 of FIG. 1); a rough estimate of the corresponding fall time is sketched after this list.
  • The angle of inclination of the rail R relative to the ground is equal to 90°.
  • the analysis device DA may comprise electromagnetic means MEL fixedly installed on the rail R in the vicinity of the starting position p1.
  • These electromagnetic means MEL are able, on the one hand, to immobilize the object O in its starting position p1 when they are placed in a first, attracting state and, on the other hand, to release the object O, so that it can move to the known position p2, when they are placed in a second, non-attracting state.
  • These electromagnetic means MEL can, for example, be arranged in the form of an electromagnet which is attractive when it is supplied with current and not attractive when it is not supplied with current. Note that if the electromagnet is sufficiently powerful, it can also be used, when supplied with power, to raise the object O automatically from the known position p2 to its starting position p1.
  • Such electromagnetic means MEL may, for example, have remotely controllable operation, possibly wireless. This control can be done via a computer coupled to the electromagnetic means MEL, optionally the one (OR) that includes the analysis means MA, or via a remote control. It allows a user to trigger the fall of the object O remotely, without disturbing the subsequent detection of this fall by the detection means MD (a minimal release-control sketch is given after this list).
  • Here the sensor CC is fixedly secured to the rail R just below the known position p2 because this sensor CC provides contact-based detection.
  • However, the displacement of the object O is not necessarily constrained, for example by attachment to a rail R.
  • the object O may comprise wheels which are possibly rotated by an electric motor.
  • In the absence of such a motor, the object O moves from a starting position to the known position p2 by means of an initial impulse supplied by a person.
  • When an electric motor is provided, its operation induces the displacement of the object O from a starting position to the known position p2.
  • This operation is then preferably remotely controllable (possibly wirelessly). This control can be done via a computer coupled to the object O, optionally the one (OR) that comprises the analysis means MA, or via a remote control.
  • The object O may have many arrangements, depending in particular on the way in which it must move.
  • For example, it may be made in the form of a part (possibly metal) of generally parallelepipedal shape, provided either with a groove or coupling means suited to its movement along a rail R, or with wheels.
  • The movement could also take place on an air cushion, for example.
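
The description above leaves the image-analysis step of the detection means MD entirely functional (two infrared cameras and an image-analysis computer). Purely as an illustration of how such a computer might deduce a 3D position from two camera views, and not as the patent's own algorithm, the sketch below back-projects a target's pixel coordinates from two calibrated cameras into rays and returns the midpoint of the shortest segment between them. The function names, the pinhole-camera model and the calibration inputs are assumptions.

```python
import numpy as np

def pixel_to_ray(pixel_uv, K, R, t):
    """Back-project a pixel into a 3D ray (origin, unit direction) in the world frame.

    Assumes a pinhole camera without distortion, with intrinsics K and pose
    x_cam = R @ x_world + t (all of which would come from calibration).
    """
    u, v = pixel_uv
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing direction, camera frame
    origin = -R.T @ t                                  # camera centre, world frame
    direction = R.T @ d_cam                            # viewing direction, world frame
    return origin, direction / np.linalg.norm(direction)

def triangulate_midpoint(ray_a, ray_b):
    """Midpoint of the shortest segment between two rays (None if nearly parallel)."""
    (p1, d1), (p2, d2) = ray_a, ray_b
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    if abs(denom) < 1e-12:
        return None
    s = (b * (d2 @ w) - c * (d1 @ w)) / denom   # parameter along ray_a
    u = (a * (d2 @ w) - b * (d1 @ w)) / denom   # parameter along ray_b
    return ((p1 + s * d1) + (p2 + u * d2)) / 2.0
```

In practice the image-analysis computer would repeat this for each target CD of the target carrier PC at every measurement instant, and the resulting current position would be delivered as the first signal s1.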
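
The analysis means MA are likewise described only in functional terms. A minimal software sketch of the measurement logic, assuming that both signals can be timestamped against a common clock; the class and method names, the tolerance value and the clock source are illustrative choices, not taken from the patent:

```python
import math
import time

class DetectionDelayAnalyzer:
    """Illustrative sketch of the analysis means MA.

    i2 is the reception instant of the second signal s2 (contact sensor CC),
    i1 is the reception instant of the first signal s1 that reports the known
    position p2, and ect = i1 - i2 is the detection delay to be determined.
    """

    def __init__(self, known_position_p2, tolerance_m=0.005):
        self.p2 = known_position_p2     # known position p2, e.g. (x, y, z) in metres
        self.tol = tolerance_m          # acceptance radius around p2 (assumed value)
        self.i1 = None
        self.i2 = None

    def on_second_signal(self):
        """Second signal s2: the object O has just contacted the sensor CC."""
        if self.i2 is None:
            self.i2 = time.perf_counter()

    def on_first_signal(self, reported_position):
        """First signal s1: one position sample delivered by the detection means MD."""
        if self.i1 is None and math.dist(reported_position, self.p2) <= self.tol:
            self.i1 = time.perf_counter()

    def time_difference_ect(self):
        """Return ect once both instants have been recorded, otherwise None."""
        if self.i1 is None or self.i2 is None:
            return None
        return self.i1 - self.i2
```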
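
The patent gives no dimensions or timings for the gravitational displacement along the inclined rail R. Purely as an order-of-magnitude aid, and under the simplifying assumption of a frictionless slide from rest, the time for the object O to travel from p1 to p2 is t = sqrt(2L / (g sin θ)), where L is the travel distance and θ the rail's inclination; the dimensions in the example are hypothetical.

```python
import math

def fall_time_along_rail(travel_m, incline_deg, g=9.81):
    """Idealised travel time from p1 to p2 along a frictionless rail inclined
    at `incline_deg` to the horizontal, starting from rest:
    t = sqrt(2 * L / (g * sin(theta)))."""
    theta = math.radians(incline_deg)
    return math.sqrt(2.0 * travel_m / (g * math.sin(theta)))

# Hypothetical example: a 0.5 m travel on a rail inclined at 30 degrees takes
# about 0.45 s, which also bounds how quickly the measurement can be repeated.
print(round(fall_time_along_rail(0.5, 30.0), 3))   # -> 0.452
```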
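
Finally, the remotely controllable electromagnetic means MEL can be driven through any suitable interface; the patent does not prescribe one. The sketch below therefore accepts a caller-supplied callable that energises or de-energises the coil (for example through a relay, a serial command or a network request) and returns the instant at which the release command was issued, so that it can be compared with the instants recorded by the analysis means MA. The function name and the settling delay are assumptions.

```python
import time

def trigger_drop(set_coil_current, settle_s=0.5):
    """Release the object O by switching the electromagnet MEL to its
    non-attracting state.

    `set_coil_current(True)` must energise the coil (attracting state, object O
    held at p1); `set_coil_current(False)` must de-energise it (non-attracting
    state, object O falls towards p2).
    """
    set_coil_current(True)           # make sure the object O is held at p1
    time.sleep(settle_s)             # let the object settle against the magnet
    t_release = time.perf_counter()  # command instant, for later comparison
    set_coil_current(False)          # release: the object falls towards the sensor CC
    return t_release
```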

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
EP18703077.0A 2017-02-01 2018-01-25 Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle Withdrawn EP3577540A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1750821A FR3062489B1 (fr) 2017-02-01 2017-02-01 Dispositif d’analyse pour la determination d’un delai de detection contribuant a un temps de latence au sein d’un systeme immersif de realite virtuelle
PCT/FR2018/050169 WO2018142043A1 (fr) 2017-02-01 2018-01-25 Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle

Publications (1)

Publication Number Publication Date
EP3577540A1 true EP3577540A1 (fr) 2019-12-11

Family

ID=58501678

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18703077.0A Withdrawn EP3577540A1 (fr) 2017-02-01 2018-01-25 Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle

Country Status (5)

Country Link
US (1) US20190377427A1 (zh)
EP (1) EP3577540A1 (zh)
CN (1) CN110268372A (zh)
FR (1) FR3062489B1 (zh)
WO (1) WO2018142043A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115098005B (zh) * 2022-06-24 2023-01-24 北京华建云鼎科技股份公司 一种控制目标对象移动的数据处理***

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2662318C (en) * 2009-01-17 2014-12-02 Lockheed Martin Corporation Immersive collaborative environment using motion capture, head mounted display, and cave
US8339364B2 (en) * 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
FR2975198A1 (fr) * 2011-05-10 2012-11-16 Peugeot Citroen Automobiles Sa Dispositif d'affichage pour occultation dans systeme de realite virtuelle
US10055018B2 (en) * 2014-08-22 2018-08-21 Sony Interactive Entertainment Inc. Glove interface object with thumb-index controller
CN105138135B (zh) * 2015-09-15 2018-08-28 北京国承万通信息科技有限公司 头戴式虚拟现实设备及虚拟现实***
CN105807601A (zh) * 2016-03-10 2016-07-27 北京小鸟看看科技有限公司 一种测试虚拟现实设备延时的方法和***
CN105652279A (zh) * 2016-03-11 2016-06-08 北京维阿时代科技有限公司 一种实时空间定位***和方法及含该***的虚拟现实设备
CN105807931B (zh) * 2016-03-16 2019-09-17 成都电锯互动科技有限公司 一种虚拟现实的实现方法

Also Published As

Publication number Publication date
US20190377427A1 (en) 2019-12-12
CN110268372A (zh) 2019-09-20
FR3062489A1 (fr) 2018-08-03
FR3062489B1 (fr) 2020-12-25
WO2018142043A1 (fr) 2018-08-09

Similar Documents

Publication Publication Date Title
CA2859900C (fr) Procede d'estimation de flot optique a partir d'un capteur asynchrone de lumiere
US10003777B2 (en) Projection screen for specularly reflecting light
EP3358366A1 (en) Light detection and ranging device
EP2495531B1 (fr) Procédé de mesure de la stabilité d'une ligne de visée et senseur stellaire correspondant
FR3030091A1 (fr) Procede et systeme de detection automatique d'un desalignement en operation d'un capteur de surveillance d'un aeronef.
EP3152593A1 (fr) Dispositif de detection a plans croises d'un obstacle et procede de detection mettant en oeuvre un tel dispositif
WO2015185532A1 (fr) Dispositif de detection a plan horizontal d'obstacles et procede de detection mettant en oeuvre un tel dispositif
US10116874B2 (en) Adaptive camera field-of-view
EP3577540A1 (fr) Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle
EP2732238B1 (fr) Procede de representation des mouvements eventuels d'une structure pour un appareil de type ordiphone
EP3486815A1 (en) Model data of an object disposed on a movable surface
EP3577531A1 (fr) Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realitie virtuelle
EP3367353B1 (fr) Procédé de pilotage d'une caméra ptz, produit programme d'ordinateur et dispositif de pilotage associés
FR3098955A1 (fr) Procédé de commande d’une interface graphique pour afficher le lancer d’un projectile dans un environnement virtuel
US10254406B2 (en) Surveying physical environments and monitoring physical events
EP3147688B1 (fr) Procede de detection d'obstacles et vehicule muni d'un systeme de detection d'obstacles
EP3671548B1 (fr) Syste me de visualisation d'un espace de contro le par rajout de frontie res graphiques au niveau d'une image prise par une came ra
EP3577541B1 (fr) Dispositif d'analyse de la synchronisation d'images sur des voies d'affichage distinctes
FR3051617A1 (fr) Systeme de prise de vue
EP1772743A1 (fr) Procédé de détection d'un objet par traitement spatio-temporel et dispositif de mise en oeuvre du procédé

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20190726

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200821

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: PSA AUTOMOBILES SA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20210112