WO2022185596A1 - Estimation system, estimation method, and program - Google Patents

Estimation system, estimation method, and program

Info

Publication number
WO2022185596A1
WO2022185596A1 (PCT/JP2021/037911)
Authority
WO
WIPO (PCT)
Prior art keywords
person
imaging
imaging means
unit
wavelength band
Prior art date
Application number
PCT/JP2021/037911
Other languages
English (en)
Japanese (ja)
Inventor
元貴 吉岡
博之 古屋
Original Assignee
パナソニックIpマネジメント株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パナソニックIpマネジメント株式会社
Publication of WO2022185596A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 - Photometry, e.g. photographic exposure meter
    • G01J1/02 - Details
    • G01J1/42 - Photometry, e.g. photographic exposure meter using electric radiation detectors
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis

Definitions

  • The present disclosure relates to an estimation system, an estimation method, and a program for estimating a person's state or recognizing a person.
  • Patent Document 1 discloses a technique for recognizing a person using a plurality of images captured at different times. For example, even if part of a person's face is covered by sunglasses or another wearable item, the position and orientation of the face change over time, so an image captured at some moment may show the part of the face that is otherwise hidden by the worn item. Therefore, by using a plurality of images captured at different times, it is possible to recognize a person who is wearing such an item.
  • However, a part of a person's face that is blocked by a worn item may not appear in any of the plurality of images captured at different times. In such cases, it may not be possible to estimate the state of the person wearing the item, or to recognize that person, with high accuracy.
  • The present disclosure therefore provides an estimation system and the like capable of estimating the state of a person wearing a wearable item, or recognizing such a person, with high accuracy.
  • An estimation system according to one aspect of the present disclosure is a system for estimating the state of a person or recognizing the person, comprising: an imaging unit having first imaging means for a first wavelength band and second imaging means for a second wavelength band; a detection unit that detects the presence or absence of an item worn by the person based on an image obtained by the imaging unit imaging the person; a determination unit that determines, based on the detected presence or absence of the worn item, which of the first imaging means and the second imaging means to use for imaging; and an estimation unit that estimates the state of the person or recognizes the person based on an image obtained by the imaging unit imaging the person with the determined imaging means.
  • With this estimation system and the like, it is possible to estimate the state of a person wearing a wearable item, or to recognize that person, with high accuracy.
  • FIG. 1 is a configuration diagram showing an example of an estimation system according to an embodiment.
  • FIG. 2 is a diagram for explaining the first wavelength band and the second wavelength band.
  • FIG. 3 is a flow chart showing an example of the operation of the estimation system according to the embodiment.
  • FIG. 4 is a diagram for explaining the operation flow of the estimation system according to the embodiment.
  • FIG. 5 is a diagram illustrating an example of the presentation unit in the embodiment.
  • FIG. 6A is a diagram for explaining imaging means determined based on the presence or absence of a wearable object and the content of specific information.
  • FIG. 6B is a diagram for explaining imaging means determined based on the presence or absence of a wearable object and the content of specific information.
  • FIG. 6C is a diagram for explaining imaging means determined based on the presence or absence of a wearable object and the content of specific information.
  • FIG. 7 is a diagram illustrating an example of the display unit according to the embodiment.
  • FIG. 1 is a configuration diagram showing an example of the estimation system 1 according to the embodiment.
  • The estimation system 1 is a system for estimating the state of a person or recognizing a person, and includes an imaging unit 10, a detection unit 20, a determination unit 30, an estimation unit 40, a first recognition model 41, a second recognition model 42, a presentation unit 50, and a display unit 60.
  • The state of a person is, for example, the state of the person's facial expression (emotion, drowsiness, or the like) or the state of the person's line of sight.
  • In the present embodiment, the state of a person's face is estimated.
  • The estimation system 1 includes a processor, memory, and the like.
  • The memory is, for example, ROM (Read Only Memory) or RAM (Random Access Memory), and can store a program executed by the processor.
  • The detection unit 20, the determination unit 30, and the estimation unit 40 are implemented by a processor or the like that executes the program stored in the memory.
  • The first recognition model 41 and the second recognition model 42 are stored in the memory.
  • The memory in which the program is stored, the memory in which the first recognition model 41 is stored, and the memory in which the second recognition model 42 is stored may be the same memory or different memories.
  • The imaging unit 10 is a component having first imaging means for the first wavelength band and second imaging means for the second wavelength band.
  • The first wavelength band and the second wavelength band are different wavelength bands, although a part of the first wavelength band and a part of the second wavelength band may overlap.
  • The first wavelength band and the second wavelength band will be described with reference to FIG. 2.
  • FIG. 2 is a diagram for explaining the first wavelength band and the second wavelength band.
  • In the present embodiment, the first wavelength band is the visible light wavelength band, and the second wavelength band is the infrared wavelength band. Therefore, an RGB image is obtained by the first imaging means for the visible light wavelength band, and an IR image is obtained by the second imaging means for the infrared wavelength band.
  • In an RGB image, facial skin wrinkles, facial expressions, contours, eye color, and hair color are clearly visible; however, when a person wears an item such as sunglasses or a mask, the part of the face occluded by that item is no longer visible.
  • The first imaging means may be imaging by a first camera that performs imaging in the first wavelength band, and the second imaging means may be imaging by a second camera that performs imaging in the second wavelength band.
  • That is, the imaging unit 10 may be realized by two cameras: a first camera and a second camera.
  • Alternatively, the first imaging means may be imaging using a second cut filter that cuts light in the second wavelength band, and the second imaging means may be imaging using a first cut filter that cuts light in the first wavelength band.
  • That is, the imaging unit 10 may be realized by, for example, one camera that can selectively use the first cut filter and the second cut filter, as sketched below.
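  • For illustration only (not part of the patent text), the following minimal Python sketch models this two-mode imaging unit as one camera with selectable cut filters; the camera driver object and its set_filter/grab_frame methods are hypothetical assumptions, not a real API.

        from enum import Enum, auto

        class Band(Enum):
            VISIBLE = auto()   # first wavelength band; yields an RGB image
            INFRARED = auto()  # second wavelength band; yields an IR image

        class ImagingUnit:
            # Models the imaging unit 10 as one camera with selectable cut
            # filters; a two-camera variant would instead dispatch to a
            # different device per band.
            def __init__(self, camera):
                self.camera = camera  # hypothetical camera driver object

            def capture(self, band):
                if band is Band.VISIBLE:
                    self.camera.set_filter("ir_cut")       # cut the second (IR) band
                else:
                    self.camera.set_filter("visible_cut")  # cut the first (visible) band
                return self.camera.grab_frame()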
  • The detection unit 20 detects the presence or absence of an item worn by the person based on an image obtained by the imaging unit 10 imaging the person.
  • The worn item is an item worn on a person's face, such as sunglasses or a mask.
  • The image used for detecting the presence or absence of a worn item is, for example, an RGB image obtained by the imaging unit 10 imaging the person with the first imaging means for the first wavelength band.
  • The method for detecting the presence or absence of a worn item is not particularly limited; for example, a trained model obtained by machine learning or the like may be used, as in the sketch below.
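  • As a hedged illustration (again not from the patent itself), detection could look like the following sketch, which builds on the ImagingUnit sketch above; wearable_model and its predict method are hypothetical stand-ins for a trained classifier.

        def detect_wearable(rgb_image, wearable_model):
            # wearable_model is a hypothetical trained classifier that returns
            # the probability that an occluding item (sunglasses or a mask)
            # appears in rgb_image.
            prob = wearable_model.predict(rgb_image)
            return prob > 0.5  # illustrative threshold, not from the patent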
  • The determination unit 30 determines which of the first imaging means and the second imaging means to use for imaging, based on the detected presence or absence of an item worn by the person.
  • Here, the imaging is imaging for obtaining the image used when the estimation unit 40 performs estimation or recognition, as described later.
  • For example, the determination unit 30 determines the second imaging means as the imaging means to be used when it is detected that the person is wearing an item, and determines the first imaging means when it is detected that the person is not wearing an item (see the sketch below).
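  • A minimal sketch of this rule, illustrative only and reusing the Band enum from the earlier sketch:

        def determine_imaging_means(wearable_present):
            # Embodiment rule: image through the worn item with IR when an
            # item is detected; otherwise prefer the sharper RGB means.
            return Band.INFRARED if wearable_present else Band.VISIBLE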
  • The first recognition model 41 is a recognition model for recognizing an image (RGB image) obtained by the imaging unit 10 imaging a person with the first imaging means.
  • For example, the first recognition model 41 is a trained model obtained by machine learning or the like on images obtained by the first imaging means. By inputting an image obtained by the first imaging means into the first recognition model 41, information indicating the state of the person in the image, or a recognition result for that person, is output.
  • The second recognition model 42 is a recognition model for recognizing an image (IR image) obtained by the imaging unit 10 imaging a person with the second imaging means.
  • For example, the second recognition model 42 is a trained model obtained by machine learning on images obtained by the second imaging means. By inputting an image obtained by the second imaging means into the second recognition model 42, information indicating the state of the person in the image, or a recognition result for that person, is output.
  • The estimation unit 40 estimates the state of the person or recognizes the person based on an image obtained by the imaging unit 10 imaging the person with the determined imaging means. For example, when the first imaging means is determined as the imaging means to be used, the estimation unit 40 estimates the state of the person or recognizes the person by recognizing, using the first recognition model 41, the image obtained by the imaging unit 10 imaging the person with the first imaging means. Likewise, when the second imaging means is determined, the estimation unit 40 estimates the state of the person or recognizes the person by recognizing, using the second recognition model 42, the image obtained by the imaging unit 10 imaging the person with the second imaging means. A sketch of this pairing follows below.
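  • Illustratively (not the patent's own code), the pairing of imaging means and recognition model can be sketched as follows; model objects exposing a predict method are assumptions.

        def estimate(imaging_unit, band, first_model, second_model):
            # Capture with the determined imaging means, then recognize the
            # image with the model trained on that band's images.
            image = imaging_unit.capture(band)
            model = first_model if band is Band.VISIBLE else second_model
            return model.predict(image)  # person's state or identity result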
  • For example, the estimation unit 40 estimates the state of a person's facial expression or the state of a person's line of sight, or recognizes a person (specifically, identifies the individual). The estimation unit 40 outputs the estimation or recognition result, for example, to a user's mobile terminal, personal computer, or the like.
  • The presentation unit 50 is a component that presents specific information.
  • For example, the presentation unit 50 is signage, or a display of a mobile terminal or a personal computer; it may also be a display of a communication robot.
  • The content of the specific information is, for example, a product menu, a store map, or a product suggestion. Details of the presentation unit 50 will be described later.
  • The determination unit 30 may acquire information indicating the content of the specific information from the presentation unit 50, and may determine which of the first imaging means and the second imaging means to use for imaging based on the content of the specific information presented by the presentation unit 50 in addition to the presence or absence of an item worn by the person.
  • The display unit 60 displays the recognition result of a person recognized using the first recognition model 41 and the recognition result of the person recognized using the second recognition model 42 side by side for comparison.
  • For example, the display unit 60 is a display of a mobile terminal or a personal computer. Details of the display unit 60 will be described later.
  • In the following, the first imaging means will also be referred to as RGB imaging means, and the second imaging means as IR imaging means.
  • FIG. 3 is a flow chart showing an example of the operation of the estimation system 1 according to the embodiment.
  • FIG. 4 is a diagram for explaining the operation flow of the estimation system 1 according to the embodiment.
  • First, the detection unit 20 detects the presence or absence of a worn item based on an image obtained by the imaging unit 10 imaging the person (step S11). For example, as shown on the left side of FIG. 4, when an image of a person wearing sunglasses is obtained, the detection unit 20 detects the sunglasses and determines that a worn item is present, as shown in the center of FIG. 4.
  • Next, the determination unit 30 determines which of the RGB imaging means and the IR imaging means to use for imaging, based on the detected presence or absence of a worn item (step S12). For example, as shown in the center of FIG. 4, when it is detected that a worn item is present, the determination unit 30 determines the IR imaging means as the imaging means to be used.
  • Then, the estimation unit 40 estimates the state of the person or recognizes the person based on an image obtained by the imaging unit 10 imaging the person with the determined imaging means (step S13). For example, when the IR imaging means is determined, the imaging unit 10 images the person with the IR imaging means and, as shown on the right side of FIG. 4, an IR image can be acquired that shows the part of the face occluded by the worn item (for example, the area around the eyes behind the sunglasses). The estimation unit 40 can thereby estimate a state such as the person's line of sight or emotion, or recognize the person.
  • On the other hand, when it is detected that the person is not wearing an item, the determination unit 30 determines the RGB imaging means as the imaging means to be used.
  • In this case, the estimation unit 40 may estimate the state of the person or recognize the person based on the RGB image already used for detecting the presence or absence of a worn item.
  • Alternatively, the imaging unit 10 may image the person again with the RGB imaging means, and the estimation unit 40 may estimate the state of the person or recognize the person based on the newly obtained RGB image.
  • In RGB images, facial skin wrinkles, facial expressions, contours, eye color, hair color, and the like are often clearer than in IR images, so when the person is not wearing an item, a clear RGB image can be used to estimate the person's state or recognize the person. The whole flow is sketched below.
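  • Pulling the steps together, a hedged sketch of one pass of the S11 to S13 flow, reusing the hypothetical helpers defined above, might read:

        def run_estimation(imaging_unit, wearable_model, first_model, second_model):
            rgb = imaging_unit.capture(Band.VISIBLE)         # image for detection
            present = detect_wearable(rgb, wearable_model)   # S11: detect worn item
            band = determine_imaging_means(present)          # S12: choose means
            if band is Band.VISIBLE:
                image = rgb  # reuse the detection image, as the text allows
            else:
                image = imaging_unit.capture(Band.INFRARED)  # re-image through the item
            model = first_model if band is Band.VISIBLE else second_model
            return model.predict(image)                      # S13: estimate or recognize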
  • Next, a specific example in which the determination unit 30 determines which of the RGB imaging means and the IR imaging means to use for imaging, based on the presence or absence of a worn item and the content of the specific information presented by the presentation unit 50, will be described with reference to FIGS. 5 to 6C.
  • FIG. 5 is a diagram showing an example of the presentation unit 50 in the embodiment.
  • FIGS. 6A to 6C are diagrams for explaining imaging means determined based on the presence or absence of a wearable object and the contents of specific information.
  • The presentation unit 50 is, for example, signage, and presents a product menu or a product suggestion as the specific information.
  • The imaging unit 10 is installed in the presentation unit 50 and images a person viewing the presentation unit 50.
  • FIG. 6A is an example of an image captured of a person looking at the presentation unit 50 when the person is wearing an item and the content of the presented specific information is a product menu.
  • In this case, since the person is wearing an item (specifically, sunglasses) and the content of the specific information is a product menu, the determination unit 30 determines the IR imaging means as the imaging means to be used. This is because, when the specific information is a product menu, seeing the person's line of sight through the sunglasses by means of the IR imaging means makes it possible to determine which product the person is paying attention to.
  • FIG. 6B is an example of an image captured of a person looking at the presentation unit 50 when the person is not wearing an item and the content of the presented specific information is a product menu.
  • In this case, since the person is not wearing an item (specifically, no sunglasses) and the content of the specific information is a product menu, the determination unit 30 determines the RGB imaging means as the imaging means to be used. Since the person's line of sight can be determined without using the IR imaging means when no sunglasses are worn, a clearer image can be obtained with the RGB imaging means.
  • FIG. 6C is an example of an image captured of a person looking at the presentation unit 50 when the person is wearing an item and the content of the presented specific information is a product suggestion.
  • In this case, the determination unit 30 determines the RGB imaging means as the imaging means to be used. When the content of the specific information is a product suggestion, even if the person's line of sight cannot be seen because the RGB imaging means does not see through the sunglasses, the person's emotions can be read from the appearance around the mouth, so the degree of interest in the suggested product can still be determined. The three cases are condensed in the sketch below.
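  • The three cases of FIGS. 6A to 6C can be condensed into a small decision rule; this sketch is illustrative only, and the content labels are hypothetical strings rather than values from the patent.

        def determine_with_content(wearable_present, content):
            if not wearable_present:
                return Band.VISIBLE   # FIG. 6B: a clear RGB image suffices
            if content == "product_menu":
                return Band.INFRARED  # FIG. 6A: gaze must be seen through sunglasses
            return Band.VISIBLE       # FIG. 6C: emotion read from the mouth area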
  • Next, a specific example in which the display unit 60 displays, for comparison, the recognition result of a person recognized using the first recognition model and the recognition result of the person recognized using the second recognition model will be described with reference to FIG. 7.
  • FIG. 7 is a diagram showing an example of the display section 60 in the embodiment.
  • The display unit 60 is, for example, a display, and displays the person imaged by the imaging unit 10 together with the recognition results for that person.
  • As the recognition results, the recognition result obtained using the first recognition model and the recognition result obtained using the second recognition model are displayed side by side for comparison.
  • The person shown on the left side of FIG. 7 is wearing an item, and the recognition result obtained by recognizing the IR image using the second recognition model is displayed.
  • At this time, the recognition result obtained by recognizing the RGB image using the first recognition model is also displayed for comparison.
  • A user of the estimation system 1 or the like who sees the display on the display unit 60 can thus make a final judgment on the recognition result.
  • Since the person shown on the right side of FIG. 7 is not wearing an item, in this case only the recognition result obtained by recognizing the RGB image using the first recognition model may be displayed.
  • As described above, the estimation system 1 is a system for estimating the state of a person or recognizing the person, and includes: an imaging unit 10 having first imaging means for a first wavelength band and second imaging means for a second wavelength band; a detection unit 20 that detects the presence or absence of an item worn by the person based on an image obtained by the imaging unit 10 imaging the person; a determination unit 30 that determines, based on the detected presence or absence of the worn item, which of the first imaging means and the second imaging means to use for imaging; and an estimation unit 40 that estimates the state of the person or recognizes the person based on an image obtained by the imaging unit 10 imaging the person with the determined imaging means.
  • With this configuration, when a person is wearing an item, the person can be imaged by the imaging means for whichever of the first wavelength band and the second wavelength band (for example, the infrared wavelength band) can image through the item.
  • Since an image showing the portion shielded by the worn item can then be used for estimating the person's state or recognizing the person, the state of a person wearing an item can be estimated, or such a person recognized, with high accuracy.
  • Conversely, when the person is not wearing an item, the person can be imaged by the imaging means for whichever of the first wavelength band and the second wavelength band (for example, the visible light wavelength band) images without passing through worn items.
  • Imaging means in a wavelength band that can image through worn items tends to obscure facial skin wrinkles, facial expressions, contours, eye color, hair color, and the like, so when no item is worn, images in which these features appear clearly can be used for estimating the person's state or recognizing the person.
  • For example, the first imaging means may be imaging by a first camera that performs imaging in the first wavelength band, and the second imaging means may be imaging by a second camera that performs imaging in the second wavelength band.
  • With this configuration, the imaging unit 10 can realize the first imaging means and the second imaging means by including the first camera and the second camera.
  • Alternatively, the first imaging means may be imaging using a second cut filter that cuts light in the second wavelength band, and the second imaging means may be imaging using a first cut filter that cuts light in the first wavelength band.
  • With this configuration, the imaging unit 10 may include, for example, one camera, and the first imaging means and the second imaging means can be realized by selectively using the first cut filter and the second cut filter.
  • The estimation system 1 may further include a presentation unit 50 that presents specific information, and the determination unit 30 may determine which of the first imaging means and the second imaging means to use for imaging based on the content of the presented specific information in addition to the presence or absence of a worn item.
  • Depending on the content of the specific information, the portion shielded by the worn item may be important for estimation or recognition, or it may be less important.
  • Therefore, imaging means suitable for obtaining the image used for estimating the person's state or recognizing the person can be determined according to the content of the specific information.
  • The estimation system 1 may further include a first recognition model 41 for recognizing images obtained by the imaging unit 10 imaging a person with the first imaging means, and a second recognition model 42 for recognizing images obtained by the imaging unit 10 imaging a person with the second imaging means.
  • When the first imaging means is determined as the imaging means to be used, the estimation unit 40 may estimate the state of the person or recognize the person by recognizing, using the first recognition model 41, the image obtained by the imaging unit 10 with the first imaging means.
  • When the second imaging means is determined as the imaging means to be used, the estimation unit 40 may estimate the state of the person or recognize the person by recognizing, using the second recognition model 42, the image obtained by the imaging unit 10 with the second imaging means.
  • By using the first recognition model 41 suited to recognizing images from the first imaging means and the second recognition model 42 suited to recognizing images from the second imaging means, the state of a person can be estimated, or a person recognized, with higher accuracy.
  • The estimation system 1 may further include a display unit 60 that displays, for comparison, the recognition result of a person recognized using the first recognition model 41 and the recognition result of the person recognized using the second recognition model 42.
  • With this configuration, the recognition results can be compared visually.
  • For example, the first wavelength band may be the visible light wavelength band, and the second wavelength band may be the infrared wavelength band.
  • In that case, the determination unit 30 may determine the second imaging means as the imaging means to be used when it is detected that the person is wearing an item.
  • With this configuration, when a person is wearing an item, the person can be imaged by the second imaging means in the infrared wavelength band.
  • Although the estimation system 1 of the above embodiment includes the presentation unit 50, the estimation system 1 need not include the presentation unit 50.
  • Likewise, although the estimation system 1 includes the first recognition model 41 and the second recognition model 42, it need not include them.
  • Likewise, although the estimation system 1 includes the display unit 60, it need not include the display unit 60.
  • The present disclosure can be realized not only as the estimation system 1 but also as an estimation method including the steps (processes) performed by the components of the estimation system 1.
  • Specifically, the estimation method is a method performed by the estimation system 1 for estimating the state of a person or recognizing the person, where the estimation system 1 includes an imaging unit 10 having first imaging means for a first wavelength band and second imaging means for a second wavelength band.
  • The estimation method includes: a detection step (step S11) of detecting the presence or absence of an item worn by a person based on an image obtained by the imaging unit 10 imaging the person; a determination step (step S12) of determining, based on the detected presence or absence of the worn item, which of the first imaging means and the second imaging means to use for imaging; and an estimation step (step S13) of estimating the state of the person or recognizing the person based on an image obtained by the imaging unit 10 imaging the person with the determined imaging means.
  • The steps in the estimation method may be executed by a computer (computer system), and the present disclosure can be realized as a program for causing a computer to execute the steps included in the estimation method.
  • Furthermore, the present disclosure can be realized as a non-transitory computer-readable recording medium, such as a CD-ROM, on which the program is recorded.
  • For example, when the present disclosure is realized as a program (software), each step is executed by running the program using hardware resources such as the CPU, memory, and input/output circuits of the computer: the CPU acquires data from the memory or an input/output circuit, performs computation, and outputs the result to the memory or an input/output circuit.
  • Each component included in the estimation system 1 of the above embodiment may be realized as a dedicated or general-purpose circuit.
  • Each component may also be realized as an LSI (Large Scale Integration), which is a type of integrated circuit (IC).
  • The integrated circuit is not limited to an LSI and may be realized by a dedicated circuit or a general-purpose processor; a programmable FPGA (Field Programmable Gate Array), or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
  • The present disclosure can be applied, for example, to a system that performs control according to the state of a person, to a system that recognizes a person, or the like.
  • Reference signs: 1 estimation system; 10 imaging unit; 20 detection unit; 30 determination unit; 40 estimation unit; 41 first recognition model; 42 second recognition model; 50 presentation unit; 60 display unit

Abstract

The present invention concerns an estimation system (1) for estimating the state of a person or recognizing a person, comprising: an imaging unit (10) having first imaging means using a first wavelength band and second imaging means using a second wavelength band; a detection unit (20) that uses an image, obtained by the imaging unit (10) imaging a person, to detect whether there is an item worn by the person; a determination unit (30) that determines whether to use the first imaging means or the second imaging means for imaging, on the basis of the detection of an item worn by the person; and an estimation unit (40) that estimates the state of the person or recognizes the person on the basis of an image obtained by the imaging unit (10) imaging a person with the determined imaging means.
PCT/JP2021/037911 2021-03-04 2021-10-13 Estimation system, estimation method, and program WO2022185596A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-034491 2021-03-04
JP2021034491 2021-03-04

Publications (1)

Publication Number Publication Date
WO2022185596A1 true WO2022185596A1 (fr) 2022-09-09

Family

ID=83154195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/037911 WO2022185596A1 (fr) 2021-03-04 2021-10-13 Estimation system, estimation method, and program

Country Status (1)

Country Link
WO (1) WO2022185596A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017011634A (ja) * 2015-06-26 2017-01-12 キヤノン株式会社 Imaging apparatus, control method thereof, and program
JP2017055250A (ja) * 2015-09-09 2017-03-16 富士通株式会社 Display control device, display control method, and display control program
JP2018018401A (ja) * 2016-07-29 2018-02-01 東芝アルパイン・オートモティブテクノロジー株式会社 Eyelid opening/closing detection device and eyelid opening/closing detection method
WO2019069599A1 (fr) * 2017-10-05 2019-04-11 ソニー株式会社 Image processing device and image processing method
WO2020049636A1 (fr) * 2018-09-04 2020-03-12 日本電気株式会社 Identification system, model presentation method, and model presentation program
JP2020125618A (ja) * 2019-02-04 2020-08-20 東京電力ホールディングス株式会社 Information processing method, program, water intake control system, and method for generating trained model
JP2020156903A (ja) * 2019-03-27 2020-10-01 Hoya株式会社 Endoscope processor, information processing device, program, information processing method, and method for generating learning model

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21929157; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 21929157; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)