EP3577531A1 - Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realite virtuelle - Google Patents

Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realite virtuelle

Info

Publication number
EP3577531A1
Authority
EP
European Patent Office
Prior art keywords
signal
screen
space
known position
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18707070.1A
Other languages
German (de)
English (en)
French (fr)
Inventor
Matthieu MIKA
Christophe MION
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PSA Automobiles SA
Original Assignee
PSA Automobiles SA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PSA Automobiles SA filed Critical PSA Automobiles SA
Publication of EP3577531A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image

Definitions

  • the invention relates to immersive virtual reality systems.
  • immersive virtual reality systems are increasingly being used to immerse users in virtual environments. This is particularly the case, although not exclusively, in the field of vehicles, possibly of the automotive type.
  • Such an immersion may be intended, for example, to teach a user to move around in a particular environment or to use objects or functions present in a particular environment, or to analyze the behavior of a user in a particular environment, or to observe a particular environment according to the position of a user in relation to the latter.
  • an immersive system includes:
  • At least one target that can be secured to a user (or sometimes an object) that is able to move in a predefined space
  • detection means able to detect the current position of this target in this predefined space and to deliver a signal representative of this current position
  • At least one display means responsible for displaying on at least one screen, installed in the predefined space, images (possibly three-dimensional (or 3D)) intended for this screen, and
  • processing means responsible for defining, in real time for each associated screen, three-dimensional (possibly stereoscopic) images of a chosen environment, as a function of the current position of the/each target and of the position of the associated screen in the predefined space.
  • In such a system there is, however, a time difference (or latency) between the moment at which the user moves and the moment at which the displayed images take that movement into account. This time difference or delay results from the processing time of signals and data, the transmission time of signals, data and images, the graphical rendering time of the computers, and the time difference between the instant at which the user reaches a new position and the moment at which the detection means detect the target (and therefore the user) in this new position.
  • the invention is therefore particularly intended to improve the situation.
  • The invention more specifically proposes an analysis device intended to perform analyses in an immersive virtual reality system comprising:
  • at least one target capable of being secured to an object capable of moving in a space,
  • detection means able to detect the current position of this target in this space and to deliver a first signal representative of this current position
  • At least one display means responsible for displaying, on at least one screen, installed in the predefined space, images intended for this screen, and
  • processing means responsible for defining images for this screen as a function of the detected position of the object, the analysis device comprising:
  • a first sensor capable of generating a second signal when the object reaches a known position in this space
  • a second sensor capable of generating a third signal in the event of detection of an image change displayed on the screen, subsequent to the detection of the object in this known position by the detection means
  • analysis means coupled at least to the first and second sensors and suitable for determining a first instant of reception of the second signal and a second instant of reception of the third signal, then determining a first time difference (or latency) between the first and second reception instants determined.
  • the analysis device according to the invention may comprise other characteristics that can be taken separately or in combination, and in particular:
  • its analysis means may also be coupled to the detection means and processing means, and suitable for determining a third instant of reception of a first signal representative of the known position detected by the detection means, and a second time difference between the first and third reception instants determined, this second time difference being representative of a detection delay of the target in the known position by the detection means;
  • its analysis means may be able to determine a third time difference between the first and second time differences determined, this third time difference being representative of at least one image generation duration by the processing means during an image change;
  • its second sensor may be able to detect a variation in light intensity resulting from a change of image displayed on the screen
  • it may comprise a rail on which the object is able to move and which is adapted to be placed in the space so that the object can move to the known position; it can comprise a support on which the rail is fixedly secured;
  • the rail can be secured to the support so as to be inclined at a predefined acute angle with respect to a horizontal plane of the space, and thus allow an automatic gravitational displacement of the object with respect to the rail between a starting position and at least the known position;
  • it can comprise electromagnetic means fixedly installed on the rail and able, on the one hand, to immobilize the object in the starting position when placed in a first, attracting state and, on the other hand, to release the object so that it can move to the known position when placed in a second, non-attracting state;
  • its first sensor may be able to be placed in the vicinity of the known position and to generate the second signal when the object contacts it.
  • The invention also proposes an immersive virtual reality system comprising at least one target capable of being secured to an object capable of moving in a space, detection means able to detect the current position of the target in this space and to deliver a first signal representative of this current position, at least one display means responsible for displaying, on at least one screen installed in the predefined space, images intended for this screen, processing means responsible for defining images for the screen according to at least this detected current position, and an analysis device of the type presented above.
  • FIG. 1 diagrammatically and functionally illustrates an example of an immersive virtual reality system coupled to an exemplary embodiment of an analysis device according to the invention
  • FIG. 2 diagrammatically and functionally illustrates the analysis device of FIG. 1 with its object to be detected placed in a starting position
  • FIG. 3 schematically and functionally illustrates the analysis device of FIG. 1 with its object to be detected placed in a known (final) position.
  • The object of the invention is in particular to propose an analysis device DA intended to perform analyses in an immersive virtual reality system SI in order to determine at least the overall latency time of the latter (SI).
  • The immersive virtual reality system SI is intended to immerse a user in a virtual environment representative of at least a part of a vehicle, possibly of the automotive type (such as a car). But the invention is not limited to this type of virtual environment; it applies to any type of virtual environment.
  • FIG. 1 schematically shows an example of an immersive (virtual reality) system SI associated with a predefined space EP in which an exemplary embodiment of an analysis device DA according to the invention is at least partially installed.
  • an immersive (virtual reality) system SI comprises at least one CD target, detection means MD, processing means MT, and at least one display means PI, EA.
  • Each CD target is adapted to be secured to a user (or sometimes an OM object of the DA analysis device) which is adapted to move in a predefined EP space.
  • the detection means MD are able to detect the current position of each target CD in the predefined space EP and to deliver a first signal s1 which is representative of this current position.
  • Each EA screen is installed in the predefined EP space.
  • The processing means MT are responsible for defining, in real time for each associated screen EA, possibly three-dimensional (and possibly stereoscopic) images of a chosen environment, as a function of the detected position of the object OM (that is, of the current position of the/each target CD) and of the position of the associated screen EA in the predefined space EP.
  • The processing means MT are responsible, at the request of the analysis device DA, for defining for each associated screen EA a first image as long as the object OM has not been detected by the detection means MD in a known position p2 in the space EP (see FIG. 3), and a second image once the object OM has been detected by the detection means MD in this known position p2.
  • the first image may be all white, and the second image may be all black.
  • Each display means PI, EA is responsible for displaying on at least one EA screen, installed in the predefined space EP, images that are intended for this screen EA. It should be noted that each display means may comprise an EA screen and at least one PI projector, or an EA screen and an LCD type panel with its associated electronic control means, for example.
  • the number of EA screens is generally between one and five.
  • Each EA screen is installed in the predefined EP space.
  • At least one computer is responsible for defining in real time three-dimensional (possibly stereoscopic) images of the chosen environment for at least one associated screen EA, as a function of the current position of the targets CD (possibly of a target holder PC) and of the position of this associated screen EA in the predefined space EP.
  • Each projector PI is responsible for projecting onto the associated screen EA the three-dimensional images determined by the associated computer for this screen EA.
  • Here, the immersive system SI comprises only one display means, comprising a screen EA associated with a projector PI. However, it could comprise several (at least two) display means PI, EA. Moreover, each display means is generally associated with its own processing means MT. But the same (very powerful) processing means could also be considered to define images for several (at least two) display means.
  • The immersive system SI comprises several targets CD secured to a target holder PC intended to be secured to a user or to an object OM of the analysis device DA moving in the predefined space EP.
  • The target holder PC here comprises four targets CD whose positions must be determined at each measurement instant by the detection means MD in order to deduce, at each measurement instant, the current position of the object OM.
  • The target holder PC may include any number of targets CD, provided that this number is at least one (1).
  • the detection means MD here comprise two cameras each associated with an emitter of infrared photons and capable of filming in the infrared.
  • Each emitter emits an infrared beam that will be reflected on the targets (or spheres) CD.
  • Each camera records images of the photons reflected on the targets (or spheres) CD, and sends each recorded image to an image analysis computer which deduces from it the position in space of the target holder PC at the instant considered.
  • the MD detection means could include more than two cameras.
  • The processing means MT can be subdivided into several parts (here four (01-04)) when they have to define stereoscopic 3D images for at least one display means (here a projector PI associated with a screen EA).
  • the second part 02 may be a computer responsible for defining the images for the left eye.
  • the third part 03 may be a computer responsible for defining the images for the right eye.
  • The fourth part 04 may be a computer responsible for transmitting synchronously to the display means (here a projector PI associated with an EA screen) the images defined by the second 02 and third 03 parts, on the basis of the same current position detected by the detection means MD.
  • The first part 01 may be a computer coupled to the detection means MD and, here, to the second 02 and third 03 computers, and responsible for controlling the second 02 and third 03 computers according to the current positions detected by the detection means MD.
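As an illustration only (not part of the patent text), the following minimal sketch shows one way such a synchronising fourth part could pair left-eye and right-eye images rendered from the same tracking sample before handing them to the display means; the Frame structure, the sample_id tag and the display callback are assumptions introduced for this example.

```python
import queue
from dataclasses import dataclass
from typing import Callable

@dataclass
class Frame:
    sample_id: int  # identifier of the tracking sample (detected position) the frame was rendered from
    eye: str        # "left" or "right"
    pixels: bytes   # rendered image data

def synchronize(left_q: queue.Queue, right_q: queue.Queue,
                display: Callable[[Frame, Frame], None]) -> None:
    """Pair left/right frames rendered from the same detected position and
    hand them to the display means together, so that both eyes always show
    images corresponding to one and the same current position."""
    pending_left: dict[int, Frame] = {}
    pending_right: dict[int, Frame] = {}
    while True:
        # take one frame from each rendering computer (blocking)
        for q, pending in ((left_q, pending_left), (right_q, pending_right)):
            frame = q.get()
            pending[frame.sample_id] = frame
        # emit every pair for which both eyes are now available
        for sid in sorted(pending_left.keys() & pending_right.keys()):
            display(pending_left.pop(sid), pending_right.pop(sid))
```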
  • An analysis device DA comprises, in addition to the object OM, at least a first sensor C1, a second sensor C2 and analysis means MA.
  • the analysis device DA can inform the processing means MT so that they define for each associated screen EA the first image until the OM object has been detected by the detection means MD in the known position p2, and a second image when the object OM has been detected by the detection means MD in this known position p2.
  • the object OM is mobile so that it can move in the (predefined) space EP.
  • It must be equipped with at least one target CD, possibly forming part of a target holder PC (as in the non-limiting example illustrated in Figures 1 to 3).
  • the first sensor C1 is capable of generating a second signal s2 when the object OM reaches a known position p2 in the space EP (see FIG. 3).
  • This first sensor C1 can, for example and as illustrated without limitation in FIGS. 1 to 3, be adapted to be placed in the vicinity of the known position p2 and to generate the second signal s2 when the object OM contacts it.
  • it may, for example, be of piezoelectric or capacitive or inductive or mechanical type. But in an alternative embodiment the detection could be done without contact (and therefore at a distance), for example by interrupting a light beam passing through the known position p2.
  • the second sensor C2 is capable of generating a third signal s3 when an image change displayed on the screen EA is detected.
  • By a change of the displayed image is meant here the replacement of a first image by a second image immediately distinguishable from the first image by at least one characteristic. For example, when the first image is all white and the second image is all black, the second sensor C2 can generate a third signal s3 when it detects on the screen EA the transition from white to black.
  • This second sensor C2 may, for example, be able to detect a variation in light intensity resulting from an image change displayed on the EA screen.
  • For example, it may be a photodiode which delivers the third signal s3 when it no longer detects white.
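As an illustration only (not part of the patent text), a minimal sketch of how such a luminance-threshold detection could be implemented in software, assuming a read_photodiode callback returning a normalised light intensity; the callback name and the 0.5 threshold are assumptions introduced for this example.

```python
import time
from typing import Callable

WHITE_TO_BLACK_THRESHOLD = 0.5  # hypothetical normalised luminance below which the screen is no longer "white"

def wait_for_image_change(read_photodiode: Callable[[], float],
                          timeout_s: float = 5.0) -> float:
    """Poll the photodiode and return the instant (monotonic clock) at which
    the displayed image switches from the all-white first image to the
    all-black second image, i.e. the instant at which the third signal s3
    would be generated."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_photodiode() < WHITE_TO_BLACK_THRESHOLD:  # screen no longer white
            return time.monotonic()
    raise TimeoutError("no image change detected within the allotted time")
```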
  • This first time difference et1 constitutes the overall latency time of the immersive system SI, since it is equal to the difference between the instant i1 at which the object OM (representing a user) changes position (here, is detected at p2 by the first sensor C1) and the instant i2 at which the new (or second) image is displayed on the screen EA (for example all black, representing the image resulting from the detection of the object OM at p2 by the detection means MD).
  • The known position p2 serves as a reference position with respect to which the analysis means MA determine the first time difference (or overall latency) et1.
  • When the analysis means MA receive, at a given instant, a second signal s2 generated by the first sensor C1, they record this instant as the first instant i1, and when they receive, at a given instant, a third signal s3 generated by the second sensor C2, they record this instant as the second instant i2.
  • Here, the triggering of the image change is carried out automatically by the processing means MT (here the first part 01) when they receive from the detection means MD a first signal s1 representative of the known position p2 detected for the object OM.
  • Here, the analysis means MA are part of a computer OR which is coupled (directly or indirectly) to the first C1 and second C2 sensors and to the detection means MD (here via the first computer 01 of the processing means MT of the immersive system SI).
  • Alternatively, the analysis means MA could take the form of electronic equipment (for example comprising an oscilloscope and an electronic signal analysis circuit) coupled (directly or indirectly) to the first C1 and second C2 sensors and to the detection means MD of the immersive system.
  • The analysis means MA could also be implanted in the processing means MT (for example in the first computer 01, which is coupled to the detection means MD). These analysis means MA can therefore be made in the form of software modules (or "software"), or of a combination of electronic circuits (or "hardware") and software modules.
  • When the analysis means MA are coupled to the detection means MD and to the processing means MT, they may also be suitable for determining a third instant i3 of reception of the first signal s1 which is representative of the known position p2 detected. For example, when the analysis means MA receive, at a given instant, a first signal s1 which represents the known position p2 detected, they record this instant as the third instant i3. In this case, the analysis means MA are also able to determine a second time difference et2 between the first i1 and third i3 reception instants determined.
  • This second time difference et2 is representative of the detection delay of the target CD in the known position p2 by the detection means MD. Indeed, the instant i1 is the moment at which the object OM (which stands in for a user moving in the space EP) is placed in a "new position" (here p2), and the instant i3 is the moment at which the detection means MD detect the target(s) CD (and therefore the object OM) in this new position (here p2).
  • This second time difference et2 is particularly useful to know because it contributes significantly to the overall latency of the immersive system SI.
  • The analysis means MA may also be suitable for determining a third time difference et3 between the first et1 and second et2 time differences determined.
  • This third time difference et3 is representative of at least the image generation duration by the processing means MT during a change of image (that is to say, following the receipt of the first signal s1 representative of p2).
  • This third time difference et3 is also useful to know because it contributes significantly to the overall latency of the immersive system SI.
  • The analysis means MA may also, possibly, be informed by each of the parts 01 to 04 of the processing means MT of the reception of a signal, instructions or a data file, and/or of the transmission of a signal, instructions or a data file to another piece of equipment of the immersive system SI. This makes it possible to deduce intermediate processing times which also contribute to the overall latency of the immersive system SI. Thus, all the contributions to the overall latency of the immersive system SI can be obtained.
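As an illustration only (not part of the patent text), a minimal sketch of the computation the analysis means MA are described as performing, assuming the three reception instants i1, i2 and i3 have been timestamped on one common monotonic clock; the function and report names are introduced for this example.

```python
from dataclasses import dataclass

@dataclass
class LatencyReport:
    et1: float  # overall latency: image change (i2) minus object detected at p2 by C1 (i1)
    et2: float  # tracking detection delay: position report from MD (i3) minus i1
    et3: float  # image generation / display contribution: et1 - et2

def analyse(i1: float, i2: float, i3: float) -> LatencyReport:
    """Derive the three time differences from the recorded reception instants
    (all in seconds on one common monotonic clock):
      i1 -- reception of the second signal s2 (sensor C1: object OM reaches p2)
      i2 -- reception of the third signal s3 (sensor C2: image change on screen EA)
      i3 -- reception of the first signal s1 (detection means MD report p2)
    """
    et1 = i2 - i1
    et2 = i3 - i1
    return LatencyReport(et1, et2, et1 - et2)

# Hypothetical timestamps: tracking reports p2 after 18 ms and the new image
# appears on screen after 72 ms, giving et1 = 72 ms, et2 = 18 ms, et3 = 54 ms.
print(analyse(i1=0.000, i2=0.072, i3=0.018))
```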
  • Moving the OM object can be done in different ways.
  • The analysis device DA may, for example, comprise a rail R on which the object OM is able to move and which is adapted to be placed in the space EP so that the object OM can move to the known position p2. In this case the displacement of the object OM is constrained.
  • This rail R may be a simple shaft, possibly, but not necessarily, of circular cross-section.
  • the analysis device DA may also comprise a support SR on which the rail R is fixedly secured.
  • Such a support SR may, for example, be intended to be placed on the ground in the space EP. It can therefore allow the rail R to be placed in a position parallel to the ground or inclined at a predefined acute angle with respect to the ground, and therefore with respect to a horizontal plane of the space EP (as illustrated without limitation in FIGS. 1 to 3).
  • In the first alternative (rail parallel to the ground), for the object OM to move from a starting position to the known position p2, it must either receive an initial push from a person, or be provided with an electric motor whose operation is preferably remotely controllable (for example wirelessly).
  • In the second alternative (inclined rail), the displacement of the object OM with respect to the rail R can take place automatically by gravity between a starting position p1 (illustrated in FIGS. 1 and 2) and at least the known position p2 (illustrated in FIG. 3). In other words, the displacement results from the fall of the object OM along the rail R (along the arrow F1 of FIG. 2).
  • The angle of inclination of the rail R relative to the ground is here equal to 90°.
  • the analysis device DA may comprise electromagnetic means MEL fixedly installed on the rail R in the vicinity of the starting position p1.
  • electromagnetic means MEL are able, on the one hand, to immobilize the object OM in its starting position p1 when they are placed in a first state of attraction, and, on the other hand, to release the object OM so that it can move to the known position p2 when placed in a second non-attractive state.
  • These electromagnetic means MEL can, for example, be arranged in the form of an electromagnet which is attractive when it is supplied with current and not attractive when it is not supplied with current. Note that if the electromagnet is sufficiently powerful, it can also be used, when powered, to raise the OM object automatically from the known position p2 to its starting position p1.
  • Such electromagnetic means MEL may, for example, have an operation that is remotely controllable, possibly wirelessly. This control can be done via a computer coupled to the electromagnetic means MEL, which is optionally the one (OR) that can include the analysis means MA, or via a remote control. This allows a user to trigger the fall of the object OM from a distance, without risking hindering its subsequent detection, during its fall, by the detection means MD.
  • the first sensor C1 is fixedly secured to the rail R just below the known position p2 because the first sensor C1 provides contact detection.
  • the displacement of the object OM is not necessarily constrained, for example because of its attachment to a rail R.
  • For example, the object OM may be arranged so as to roll on the floor of the space EP.
  • To this end, it may comprise wheels, possibly driven in rotation by an electric motor.
  • In the absence of such a motor, the object OM moves from a starting position to the known position p2 by means of an initial push given by a person.
  • When such a motor is present, its operation induces the displacement of the object OM from a starting position to the known position p2.
  • This operation is then preferably remotely controllable (possibly wirelessly). This control can be done via a computer coupled to the object OM, which is optionally the one (OR) that can comprise the analysis means MA, or via a remote control.
  • the object OM may have a large number of arrangements, depending in particular on how it should move.
  • For example, it may take the form of a part (possibly made of metal) of generally parallelepipedal shape, provided either with a groove or coupling means adapted to its movement along a rail R, or with wheels.
  • The movements could also be performed on an air cushion, for example.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)
EP18707070.1A 2017-02-01 2018-01-25 Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realitie virtuelle Withdrawn EP3577531A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1750820A FR3062488B1 (fr) 2017-02-01 2017-02-01 Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realite virtuelle
PCT/FR2018/050170 WO2018142044A1 (fr) 2017-02-01 2018-01-25 Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realitie virtuelle

Publications (1)

Publication Number Publication Date
EP3577531A1 true EP3577531A1 (fr) 2019-12-11

Family

ID=58501677

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18707070.1A Withdrawn EP3577531A1 (fr) 2017-02-01 2018-01-25 Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realitie virtuelle

Country Status (5)

Country Link
US (1) US10551911B2 (zh)
EP (1) EP3577531A1 (zh)
CN (1) CN110291487B (zh)
FR (1) FR3062488B1 (zh)
WO (1) WO2018142044A1 (zh)

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2662318C (en) * 2009-01-17 2014-12-02 Lockheed Martin Corporation Immersive collaborative environment using motion capture, head mounted display, and cave
US8339364B2 (en) * 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
FR2975198A1 (fr) * 2011-05-10 2012-11-16 Peugeot Citroen Automobiles Sa Dispositif d'affichage pour occultation dans systeme de realite virtuelle
CN103197757A (zh) * 2012-01-09 2013-07-10 癸水动力(北京)网络科技有限公司 一种沉浸式虚拟现实***及其实现方法
US9841839B2 (en) 2013-10-07 2017-12-12 Tactual Labs Co. System for measuring latency on a touch device
US10007350B1 (en) * 2014-06-26 2018-06-26 Leap Motion, Inc. Integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
DE202014103729U1 (de) * 2014-08-08 2014-09-09 Leap Motion, Inc. Augmented-Reality mit Bewegungserfassung
US9767574B2 (en) * 2014-08-22 2017-09-19 Applied Research Associates, Inc. Techniques for accurate pose estimation
EP3268883A4 (en) * 2015-02-27 2019-01-02 Valorisation-Recherche, Limited Partnership Method of and system for processing signals sensed from a user
US20190079480A1 (en) * 2015-03-17 2019-03-14 Whirlwind VR, Inc. System and Method for Delivery of Variable Flow Haptics in an Immersive Environment with Latency Control
EP3364270A4 (en) * 2015-10-15 2018-10-31 Sony Corporation Information processing device and information processing method
US10046234B2 (en) * 2015-10-20 2018-08-14 OBE Gaming Inc. Interactive movement tracking system applicable to articles of clothing
WO2017151778A1 (en) * 2016-03-01 2017-09-08 ARIS MD, Inc. Systems and methods for rendering immersive environments
US9928661B1 (en) * 2016-03-02 2018-03-27 Meta Company System and method for simulating user interaction with virtual objects in an interactive space
CN105807601A (zh) * 2016-03-10 2016-07-27 北京小鸟看看科技有限公司 一种测试虚拟现实设备延时的方法和***

Also Published As

Publication number Publication date
US20190354167A1 (en) 2019-11-21
US10551911B2 (en) 2020-02-04
CN110291487A (zh) 2019-09-27
FR3062488A1 (fr) 2018-08-03
CN110291487B (zh) 2023-03-24
FR3062488B1 (fr) 2020-12-25
WO2018142044A1 (fr) 2018-08-09

Similar Documents

Publication Publication Date Title
CA2859900C (fr) Procede d'estimation de flot optique a partir d'un capteur asynchrone de lumiere
US9945936B2 (en) Reduction in camera to camera interference in depth measurements using spread spectrum
KR20160126060A (ko) 편광 시선 추적
EP3152593B1 (fr) Dispositif de detection a plans croises d'un obstacle et procede de detection mettant en oeuvre un tel dispositif
FR2902527A1 (fr) Dispositif de localisation tridimentionnelle de sources de rayonnement
EP3152592A1 (fr) Dispositif de detection a plan horizontal d'obstacles et procede de detection mettant en oeuvre un tel dispositif
EP2757530A1 (fr) Procédé et dispositif de détection de chute par analyse d'images
WO2018142043A1 (fr) Dispositif d'analyse pour la détermination d'un délai de détection contribuant à un temps de latence au sein d'un système immersif de réalité virtuelle
EP3577531A1 (fr) Dispositif d'analyse pour la determination d'un temps de latence d'un systeme immersif de realitie virtuelle
JP6149717B2 (ja) 撮像装置及び撮像方法
EP3999942A1 (fr) Procédé de commande d'une interface graphique pour afficher le lancer d'un projectile dans un environnement virtuel
EP3367353B1 (fr) Procédé de pilotage d'une caméra ptz, produit programme d'ordinateur et dispositif de pilotage associés
FR2975198A1 (fr) Dispositif d'affichage pour occultation dans systeme de realite virtuelle
EP3170205B1 (fr) Dispositif de détection de mouvement
WO2017149254A1 (fr) Dispositif d'interface homme machine avec des applications graphiques en trois dimensions
FR3033467A1 (fr) Camera adaptee pour travailler dans un environnement radioactif.
EP3577541B1 (fr) Dispositif d'analyse de la synchronisation d'images sur des voies d'affichage distinctes
FR3090950A1 (fr) Système de visualisation d’un espace de contrôle par rajout de frontières graphiques au niveau d’une image prise par une caméra
EP3839819A1 (fr) Assistant et procede d'assistance de recherche d'un element dans une zone
EP0528077A1 (fr) Système radar aéroporté muni d'une caméra pour poursuivre objets volants à basse altitude
FR3083906A1 (fr) Interface utilisateur pour un systeme d’alarme
WO2009121199A1 (fr) Procede et dispositif pour realiser une surface tactile multipoints a partir d'une surface plane quelconque et pour detecter la position d'un objet sur une telle surface
EP3853815A1 (fr) Dispositif, systeme et procede de localisation, par un module de traitement, d'un module d'acquisition par rapport a un equipement a controler
WO2015071426A1 (fr) Système et un procédé de caractérisation d'objets d'intérêt présents dans une scène
FR3051617A1 (fr) Systeme de prise de vue

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20190725

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

INTG Intention to grant announced

Effective date: 20200629

RAP1 Party data changed (applicant data changed or rights of an application transferred)

Owner name: PSA AUTOMOBILES SA

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20201110