WO2019073562A1 - Display control device, display control method, and vehicle-mounted apparatus provided with display control device - Google Patents


Info

Publication number
WO2019073562A1
Authority
WO
WIPO (PCT)
Prior art keywords
candidate
display control
unit
priority
image
Prior art date
Application number
PCT/JP2017/036909
Other languages
French (fr)
Japanese (ja)
Inventor
Kazunori Abe (阿部 和則)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Priority to PCT/JP2017/036909
Publication of WO2019073562A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates to a display control device, a display control method, and an in-vehicle apparatus including the display control device, for making the operator of an operation device known to the occupants of a vehicle.
  • A known gesture operation device (Patent Literature 1) includes a camera installed facing the operator, a gesture detection unit that detects the operator's gesture from the camera signal output by the camera, a gaze detection unit that detects the operator's line of sight, a display unit that displays an image toward the operator, and a control unit connected to these units. When the gesture detection unit detects a gesture of the operator and the gaze detection unit detects that the operator's line of sight is directed at the display unit, the gesture detection unit outputs the detected gesture to the control unit, and the control unit executes the operation corresponding to the gesture.
  • The present invention has been made to solve the above-described problem, and aims to provide a display control device capable of making the operator known to the occupants by displaying the operator of the operation device on the display unit of the operation device.
  • A display control device according to the present invention includes: a gaze detection unit that detects the line of sight of each occupant from an image captured by an imaging device that images the occupants; a gaze target calculation unit that calculates the target of each line of sight; a priority setting unit that determines occupants looking at the display unit as candidates and sets a priority order among the candidates; and a candidate display control unit that acquires from the gaze detection unit a candidate image of the candidate with the highest priority and displays it on the display unit.
  • According to the present invention, the operator can be made known to the occupants by displaying the operator of the operation device on the display unit of the operation device.
  • FIGS. 3A and 3B are diagrams showing examples of the hardware configuration of the display control device according to the first embodiment of the present invention. FIG. 4 is a flowchart showing the operation of the display control device according to the first embodiment.
  • FIGS. 6A, 6B, 6C, 6D, and 6E are diagrams showing display examples of candidate composite images according to the second embodiment of the present invention.
  • FIG. 1 is a block diagram showing the configuration of a display control apparatus according to the first embodiment.
  • The display control device 100 includes a gaze detection unit 1, a gaze target calculation unit 2, a priority setting unit 3, and a candidate display control unit 4. As shown in FIG. 1, the display control device 100 is connected to the imaging device 20 and the operation device 30.
  • The operation device 30 includes a control unit 31 and a display unit 32.
  • The operation device 30 in the first embodiment is a non-contact operation device that accepts non-contact operations such as voice operations or gesture operations.
  • The gaze detection unit 1 acquires an image captured by the imaging device 20 provided in the vicinity of the display unit 32 of the operation device 30.
  • The imaging device 20 is a wide-angle camera capable of simultaneously imaging the driver's seat and the front passenger seat.
  • The gaze detection unit 1 extracts a face image including at least the entire face of each occupant from the acquired image, and detects each occupant's line of sight from the face image.
  • The gaze detection unit 1 outputs the resulting line-of-sight information to the gaze target calculation unit 2.
  • The line-of-sight information includes the coordinates of each occupant's face image within the captured image (face coordinates) and the direction of the line of sight (gaze direction).
  • The line of sight can be determined, for example, from the orientation of the face or the rotation angle of the eyeballs, which are calculated from facial feature points such as the eyes, nose, and mouth.
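The feature-point approach above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: the function name and the simple ratio model (nose-tip offset from the eye midline, normalized by inter-eye distance) are stand-ins for whatever geometry the gaze detection unit 1 actually uses.

```python
def estimate_yaw(left_eye, right_eye, nose_tip):
    """Approximate horizontal face rotation from three 2D landmarks.

    Returns roughly -1.0..1.0: 0 when the nose tip lies on the eye
    midline (face toward the camera), negative when turned one way,
    positive the other, in image coordinates.
    """
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    inter_eye = abs(right_eye[0] - left_eye[0]) or 1.0  # avoid division by zero
    return (nose_tip[0] - mid_x) / inter_eye

# A frontal face: nose tip centred between the eyes.
print(estimate_yaw((100, 120), (160, 120), (130, 150)))  # 0.0
```

In practice the gaze direction would combine head pose with eyeball rotation; this sketch covers only the head-pose half the bullet mentions first.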
  • Although the imaging device 20 in the first embodiment uses a single wide-angle camera provided near the display unit 32 of the operation device 30, any configuration capable of imaging the driver's seat and the front passenger seat may be used; for example, a narrow-angle camera may be provided as the imaging device 20 for each seat.
  • The gaze target calculation unit 2 calculates the area at which each occupant is looking, based on the preset position of the display unit 32 and the line-of-sight information (the occupant's face coordinates and gaze direction), and thereby identifies the target of the line of sight.
  • When the gaze target calculation unit 2 detects an occupant whose line of sight is directed at the display unit 32, it adds to that occupant's line-of-sight information an indication that the occupant is gazing at the display unit 32, and outputs the occupant's line-of-sight information to the priority setting unit 3.
  • The priority setting unit 3 counts the time (gaze time) during which each such occupant continuously directs his or her line of sight at the display unit 32, and determines an occupant whose gaze time exceeds a preset threshold time as a candidate to operate the operation device 30. Counting the gaze time and requiring it to exceed the preset threshold in this way suppresses an occupant whose line of sight happens to fall on the display unit 32 accidentally from becoming a candidate.
  • This candidate determination method is only an example. For example, the threshold time may be omitted and an occupant may be determined as a candidate the moment he or she gazes at the display unit 32; alternatively, a gesture of an occupant whose line of sight is directed at the display unit 32 may be detected from the image captured by the imaging device 20, and the occupant may be determined as a candidate when a preset gesture representing the start of operation of the operation device 30 is detected.
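The gaze-time thresholding described above can be sketched as a small accumulator. The frame-based timing, class name, and threshold value are assumptions for illustration; the patent specifies only the behaviour, not an implementation.

```python
class GazeTimer:
    """Accumulates how long an occupant has continuously looked at the
    display unit, and reports when they qualify as a candidate."""

    def __init__(self, threshold_s=1.0):  # assumed threshold time
        self.threshold_s = threshold_s
        self.gaze_s = 0.0

    def update(self, looking_at_display, dt_s):
        if looking_at_display:
            self.gaze_s += dt_s
        else:
            self.gaze_s = 0.0  # the gaze must be continuous
        return self.gaze_s > self.threshold_s  # True -> becomes a candidate

timer = GazeTimer()
# 0.9 s of gazing is below the threshold; a glance away resets the count.
print(timer.update(True, 0.9))   # False
print(timer.update(False, 0.1))  # False (count reset)
print(timer.update(True, 1.1))   # True: determined as a candidate
```

Resetting on any frame where the gaze leaves the display is what makes an accidental glance insufficient to become a candidate.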
  • The priority setting unit 3 sets the priority of the candidates in the order in which they were determined as candidates (earlier first), and outputs the face coordinates of the candidate with the highest priority (the first-priority candidate) to the candidate display control unit 4. Note that the driver often performs more important operations than the other occupants and often needs to operate in a hurry; therefore, when the driver is determined as a candidate, the priority may be set so that the driver is given the highest priority regardless of the gaze times of the other occupants.
  • The priority setting unit 3 also counts, from each candidate's line-of-sight information, the time (non-gaze time) during which the candidate continuously keeps his or her line of sight away from the display unit 32. When a candidate's non-gaze time continues for a preset threshold time, the candidate is excluded from the candidates and the priority order is updated. For example, with three candidates, when the first-priority candidate continues to look away from the display unit 32 for the threshold time and is excluded, the second-priority candidate becomes first and the third-priority candidate becomes second. As a result, a candidate can withdraw simply by taking his or her eyes off the display unit 32, and the first-priority candidate for the operation device 30 can be replaced quickly.
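The priority updating just described amounts to an ordered list where exclusion promotes everyone behind the removed candidate. A minimal sketch, with an assumed list-based layout:

```python
class PriorityList:
    """Candidates in the order they became candidates; index 0 is the
    first-priority candidate."""

    def __init__(self):
        self._candidates = []

    def add(self, occupant_id):
        if occupant_id not in self._candidates:
            self._candidates.append(occupant_id)

    def exclude(self, occupant_id):
        # e.g. this candidate's non-gaze time exceeded the threshold
        if occupant_id in self._candidates:
            self._candidates.remove(occupant_id)

    def first(self):
        return self._candidates[0] if self._candidates else None

plist = PriorityList()
for occupant in ("driver", "passenger", "rear"):
    plist.add(occupant)
plist.exclude("driver")  # looked away for the threshold time
print(plist.first())     # passenger
```

Removing from a Python list shifts the remaining entries up, which reproduces the second-becomes-first, third-becomes-second behaviour in the example above.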
  • Vibration while driving or sunlight may cause the detected line of sight to waver, or may cause the candidate to look away briefly, regardless of the candidate's intention. Requiring the non-gaze time to continue for the threshold time prevents such momentary glances away from excluding the candidate.
  • This candidate exclusion method is also only an example. For example, a gesture of an occupant whose line of sight is directed at the display unit 32 may be detected from the image captured by the imaging device 20, and the candidate may be excluded when a preset gesture representing the end of operation of the operation device 30 is detected.
  • Based on the face coordinates of the first-priority candidate input from the priority setting unit 3, the candidate display control unit 4 acquires from the gaze detection unit 1 a candidate image, that is, an image including the entire face of the first-priority candidate. The candidate display control unit 4 then outputs to the control unit 31 of the operation device 30 first-priority candidate information including the candidate image and information capable of identifying the first-priority candidate.
  • The candidate display control unit 4 acquires a new candidate image each time the first-priority candidate changes, and outputs it to the control unit 31.
  • When the first-priority candidate information is input from the candidate display control unit 4, the control unit 31 disables non-contact operations (voice operation, gesture operation, and the like) on the operation device 30 by any occupant other than the occupant shown in the candidate image (the operator). Furthermore, when there is no first-priority candidate, the priority setting unit 3 notifies the control unit 31 of the absence of an operator, and on receiving this notification the control unit 31 disables non-contact operations on the operation device 30 entirely. As a result, the movement or speech of an occupant whose line of sight is not directed at the display unit 32, that is, an occupant with no intention of operating the operation device 30, is prevented from being erroneously detected as an operation on the operation device.
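The gating behaviour of the control unit 31 described above can be sketched as a simple filter on incoming commands. The class and the command-tagging scheme (each non-contact command arrives labelled with the occupant it came from) are illustrative assumptions:

```python
class OperationGate:
    """Accepts non-contact commands only from the current operator;
    rejects everything when no operator is present."""

    def __init__(self):
        self.operator_id = None  # None: all non-contact operation disabled

    def set_operator(self, occupant_id):
        self.operator_id = occupant_id  # may also be set back to None

    def accept(self, occupant_id, command):
        """Return True only when the command comes from the operator."""
        return self.operator_id is not None and occupant_id == self.operator_id

gate = OperationGate()
print(gate.accept("driver", "volume_up"))   # False: no operator yet
gate.set_operator("driver")
print(gate.accept("driver", "volume_up"))   # True
print(gate.accept("passenger", "next"))     # False: not the operator
```

Setting `operator_id` back to `None` models the notification of operator absence from the priority setting unit 3.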
  • FIG. 2 is a diagram of a display example of a candidate image in the first embodiment.
  • the control unit 31 displays the candidate image input from the candidate display control unit 4 on the display unit 32 as shown in FIG.
  • Each time a new candidate image is input, the candidate image displayed on the display unit 32 is updated.
  • The candidate display control unit 4 may alternatively be configured to have the function of the control unit 31, in which case the candidate display control unit 4 itself restricts non-contact operations on the operation device 30 and performs the display on the display unit 32.
  • FIGS. 3A and 3B are diagrams showing an example of the hardware configuration of the display control apparatus 100.
  • Each function of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 in the display control device 100 is realized by a processing circuit. That is, the display control device 100 includes a processing circuit for realizing these functions.
  • The processing circuit may be dedicated hardware, the processing circuit 200a shown in FIG. 3A, or the processor 200b shown in FIG. 3B, which executes a program stored in the memory 200c.
  • The processing circuit 200a may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or a combination thereof.
  • The functions of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 may each be realized by a separate processing circuit, or may be realized collectively by a single processing circuit.
  • When the processing circuit is the processor 200b, the function of each unit is realized by software, firmware, or a combination of software and firmware. The software or firmware is described as a program and stored in the memory 200c.
  • The processor 200b reads out and executes the programs stored in the memory 200c, thereby realizing the respective functions of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4. That is, the display control device 100 includes the memory 200c for storing programs which, when executed by the processor 200b, result in the execution of each step shown in FIG. 4 described later. It can also be said that these programs cause a computer to execute the procedures or methods of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4.
  • The processor 200b is, for example, a central processing unit (CPU), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a digital signal processor (DSP).
  • The memory 200c may be, for example, a nonvolatile or volatile semiconductor memory such as a random access memory (RAM), a read-only memory (ROM), a flash memory, an EPROM (erasable programmable ROM), or an EEPROM (electrically erasable programmable ROM); a magnetic disk such as a hard disk or a flexible disk; or an optical disk such as a mini disc, a CD (Compact Disc), or a DVD (Digital Versatile Disc).
  • Some of the functions of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 may be realized by dedicated hardware, and others by software or firmware.
  • In this way, the processing circuit in the display control device 100 can realize the above-described functions by hardware, software, firmware, or a combination thereof.
  • FIG. 4 is a flowchart showing an operation of face image display processing of the display control apparatus 100 according to the first embodiment.
  • In the following, the priority setting unit 3 determines an occupant whose gaze time on the display unit 32 exceeds the threshold time as a candidate, and sets the priorities of the candidates in the order in which they became candidates.
  • First, the gaze detection unit 1 acquires an image captured by the imaging device 20 (step ST1), and acquires the line-of-sight information of the occupants from the acquired image (step ST2).
  • Next, the gaze target calculation unit 2 determines, based on the line-of-sight information acquired in step ST2, whether there is an occupant whose line of sight is directed at the display unit 32 (step ST3). When there is no such occupant (step ST3; NO), the process returns to step ST1. When there is such an occupant (step ST3; YES), the priority setting unit 3 counts the occupant's gaze time on the display unit 32 (step ST4).
  • The priority setting unit 3 then determines whether there is an occupant whose gaze time counted in step ST4 exceeds the threshold time (step ST5). When there is no such occupant (step ST5; NO), the process returns to step ST1. When there is an occupant whose gaze time exceeds the threshold time (step ST5; YES), that occupant is determined as a candidate (step ST6), and priorities are set for the candidates in the order in which they were determined (step ST7).
  • The candidate display control unit 4 acquires from the gaze detection unit 1 the candidate image of the candidate with the highest priority set in step ST7 (step ST8), and outputs to the control unit 31 first-priority candidate information including the candidate image acquired in step ST8 and information capable of identifying the first-priority candidate (step ST9). The process then returns to step ST1.
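One pass of steps ST1 to ST9 above can be sketched as a single polling-loop step. All of the callables passed in (`capture`, `detect_gaze`, and so on) are hypothetical stand-ins for the units the flowchart names, injected as parameters so the step can be exercised with stubs:

```python
def face_display_step(capture, detect_gaze, gaze_on_display,
                      update_gaze_times, candidates_over_threshold,
                      set_priorities, get_candidate_image, output_to_control):
    """Run one pass of the FIG. 4 loop; return the first-priority
    candidate, or None when the pass falls through back to ST1."""
    frame = capture()                          # ST1: acquire image
    gaze_info = detect_gaze(frame)             # ST2: line-of-sight info
    watchers = gaze_on_display(gaze_info)      # ST3: who looks at display?
    if not watchers:
        return None                            # ST3; NO -> back to ST1
    update_gaze_times(watchers)                # ST4: count gaze time
    candidates = candidates_over_threshold()   # ST5/ST6: determine candidates
    if not candidates:
        return None                            # ST5; NO -> back to ST1
    ordered = set_priorities(candidates)       # ST7: set priority order
    image = get_candidate_image(ordered[0])    # ST8: first-priority image
    output_to_control(ordered[0], image)       # ST9: notify control unit
    return ordered[0]

sent = []
result = face_display_step(
    capture=lambda: "frame",
    detect_gaze=lambda f: {"driver": "at_display"},
    gaze_on_display=lambda info: ["driver"],
    update_gaze_times=lambda w: None,
    candidates_over_threshold=lambda: ["driver"],
    set_priorities=lambda c: c,
    get_candidate_image=lambda c: "face_image",
    output_to_control=lambda cand, img: sent.append((cand, img)),
)
print(result)  # driver
```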
  • As described above, the display control device 100 according to the first embodiment includes the gaze detection unit 1 that detects the line of sight of each occupant from an image captured by the imaging device 20 that images the occupants, the gaze target calculation unit 2 that calculates the target of each line of sight and determines the occupants whose lines of sight are directed at the display unit 32, the priority setting unit 3 that determines the occupants looking at the display unit 32 as candidates and sets priorities for the candidates, and the candidate display control unit 4 that acquires the candidate image of the candidate with the highest priority from the image and displays it on the display unit 32. Therefore, even when a plurality of occupants attempt to operate the operation device 30, the operator of the operation device 30 can be made known to the occupants who are trying to operate it.
  • The second embodiment describes a display control device that displays on the display unit 32 the face images of all the candidates, not only that of the first-priority candidate.
  • the display control apparatus according to the second embodiment will be described with reference to FIG.
  • FIG. 5 is a block diagram showing the configuration of a display control apparatus according to the second embodiment.
  • The display control device 100A of the second embodiment has the same configuration as the display control device 100 of the first embodiment, except that it includes a priority setting unit 3a and a candidate display control unit 4a in place of the priority setting unit 3 and the candidate display control unit 4.
  • Parts that are the same as or correspond to those of the display control device 100 according to the first embodiment are assigned the same reference numerals as in the first embodiment, and their description is omitted or simplified.
  • The priority setting unit 3a determines candidates by the method described in the first embodiment and sets their priorities. It then outputs to the candidate display control unit 4a data (candidate data) in which each candidate's priority and face coordinates are linked.
  • FIG. 6 is a diagram of a display example of a candidate composite image in the second embodiment.
  • The candidate display control unit 4a acquires candidate images from the gaze detection unit 1 based on the candidate data output from the priority setting unit 3a.
  • When there is only one candidate, that is, only the candidate with the highest priority, first-priority candidate information including the candidate image and information capable of identifying the candidate is output to the control unit 31, as in the first embodiment.
  • When there are a plurality of candidates, a candidate composite image, which is a single image in which the plurality of candidate images are put together, is created. The candidate composite image is processed so that the occupants can distinguish the candidate image of the first-priority candidate from the images of the other candidates. Specifically, as shown in FIG. 6A, an operator mark is added to the first-priority candidate's image within the candidate composite image. Candidate information including the candidate composite image with the operator mark and information capable of identifying the first-priority candidate is then output to the control unit 31. Outputting the plurality of candidate images to the control unit 31 as a single candidate composite image suppresses the amount of communication between the candidate display control unit 4a and the control unit 31.
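The composite-image step above can be sketched with a purely illustrative data model: face images are stood in for by labels, and the operator mark becomes a flag on the first-priority candidate's tile, so that a single object carries everything the control unit 31 needs.

```python
def make_composite(candidate_images):
    """candidate_images: list of (priority, image) pairs, in any order.

    Returns one list of tiles ordered by priority, with the operator
    mark set only on the first-priority candidate's tile."""
    ordered = sorted(candidate_images, key=lambda pi: pi[0])
    return [{"image": img, "operator_mark": prio == ordered[0][0]}
            for prio, img in ordered]

tiles = make_composite([(2, "face_B"), (1, "face_A"), (3, "face_C")])
print([t["image"] for t in tiles])  # ['face_A', 'face_B', 'face_C']
print(tiles[0]["operator_mark"])    # True
```

A real implementation would concatenate pixel data and draw the mark; the point here is only that one composite object replaces several per-candidate transfers.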
  • The candidate display control unit 4a creates a new candidate composite image each time the priorities of the candidates change, and outputs it to the control unit 31.
  • When the candidate composite image is input, the control unit 31 displays it on the display unit 32.
  • The candidate composite image displayed on the display unit 32 is thereby updated.
  • The method of emphasizing the first-priority candidate's image is not limited to the operator mark. For example, the candidate image of the first-priority candidate may be emphasized with a border as shown in FIG. 6B; conversely, the candidate images of the other candidates may be grayed out as shown in FIG. 6C, or made less noticeable by increasing their transparency as shown in FIG. 6D. Alternatively, as shown in FIG. 6E, the candidate composite image may first be displayed and then, after a predetermined time elapses, be replaced by the candidate image of the first-priority candidate alone, thereby emphasizing the first-priority candidate's image.
  • FIG. 7 is a diagram of an example of a candidate-combined image according to the second embodiment.
  • In the above, a configuration was described in which a plurality of candidate images are displayed on the display unit 32 as a single candidate composite image, but this is only an example. The candidate display control unit 4a may instead output to the control unit 31 data in which each candidate image is linked to its priority, and the control unit 31 may control the arrangement and the candidate images displayed on the display unit 32 so that the occupants can recognize who the operator is.
  • As in the first embodiment, the candidate display control unit 4a may also be configured to have the function of the control unit 31, in which case the candidate display control unit 4a restricts non-contact operations on the operation device 30 and performs the display on the display unit 32.
  • The priority setting unit 3a and the candidate display control unit 4a in the display control device 100A are realized by the processing circuit 200a shown in FIG. 3A, or by the processor 200b executing a program stored in the memory 200c shown in FIG. 3B.
  • FIG. 8 is a flow chart showing the operation of face image display processing of the display control apparatus 100A according to the second embodiment.
  • In the following, the candidate display control unit 4a gives an operator mark to the candidate image of the candidate with the highest priority.
  • Steps that are the same as those of the display control device 100 according to the first embodiment are denoted by the same reference numerals as used in FIG. 4, and their description is omitted or simplified.
  • The candidate display control unit 4a acquires from the gaze detection unit 1 the candidate images of the candidates determined in step ST6 (step ST28), and determines whether the acquired candidate images consist only of the candidate image of the first-priority candidate (step ST29). If so (step ST29; YES), first-priority candidate information including the candidate image and information capable of identifying the first-priority candidate is output to the control unit 31 (step ST30). When there are a plurality of candidates (step ST29; NO), a candidate composite image, which is a single image combining the plurality of candidate images, is created (step ST31).
  • Next, an operator mark is added to the candidate image of the first-priority candidate in the candidate composite image created in step ST31 (step ST32), and candidate information including the candidate composite image with the operator mark and information capable of identifying the first-priority candidate is output to the control unit 31 (step ST33). The process then returns to step ST1.
  • The display control device 100 according to the first embodiment displays only the candidate image of the first-priority candidate on the display unit 32. In contrast, the display control device 100A according to the second embodiment includes the priority setting unit 3a and the candidate display control unit 4a in place of the priority setting unit 3 and the candidate display control unit 4, and displays the candidate image of the first-priority candidate and the candidate images of the other candidates on the display unit 32 in such a way that the occupants can distinguish them.
  • By displaying on the display unit 32 the candidate images of the other candidates in addition to that of the first-priority candidate, each occupant can confirm that he or she is recognized as a candidate by the operation device 30. As a result, the occupants' anxiety about possibly not being recognized by the operation device 30 can be eliminated.
  • The third embodiment describes a display control device that has a content information storage unit 5, which stores the content information of the operation device 30 last operated by the first-priority candidate in association with that candidate, and a function to display the last-operated content again when the candidate operates the operation device 30 once more.
  • the display control apparatus according to the third embodiment will be described with reference to FIG.
  • FIG. 9 is a block diagram showing the configuration of a display control apparatus according to the third embodiment.
  • The display control device 100B of the third embodiment is configured by adding a content information storage unit 5 to the display control device 100 of the first embodiment, and by providing a candidate display control unit 4b in place of the candidate display control unit 4.
  • the same or corresponding parts as those of the display control apparatus 100 according to the first embodiment are indicated by the same reference numerals as the reference numerals used in the first embodiment, and the description will be omitted or simplified.
  • The content information storage unit 5 is an area in which the face coordinates of an occupant and the content information of the operation device 30 are stored in association with each other.
  • In FIG. 9, the content information storage unit 5 is provided in the display control device 100B, but it may instead be included in the operation device 30, in another in-vehicle device (not shown) mounted in the vehicle, or in an external server (not shown).
  • When the face coordinates of the first-priority candidate are input from the priority setting unit 3, the candidate display control unit 4b acquires the candidate image by the processing described in the first embodiment, and outputs the acquired candidate image and information capable of identifying the first-priority candidate to the control unit 31.
  • The candidate display control unit 4b also sequentially acquires from the control unit 31 the content information of the operation device 30 operated by the first-priority candidate.
  • When the face coordinates of a new first-priority candidate are input from the priority setting unit 3, the candidate display control unit 4b checks whether content information corresponding to those face coordinates is stored in the content information storage unit 5. If there is content information corresponding to the face coordinates of the new first-priority candidate, the candidate display control unit 4b acquires it from the content information storage unit 5 and outputs it to the control unit 31 together with the candidate image and the information capable of identifying the first-priority candidate.
  • When the content information is input, the control unit 31 causes the display unit 32 to display the content corresponding to the content information.
  • The candidate display control unit 4b in the display control device 100B is realized by the processing circuit 200a shown in FIG. 3A, or by the processor 200b executing a program stored in the memory 200c shown in FIG. 3B.
  • FIG. 10 is a flowchart showing the operation of the process in which the display control device 100B according to the third embodiment associates content information with a candidate and stores the result in the content information storage unit 5.
  • The candidate display control unit 4b determines whether the first-priority candidate has been excluded from the candidates (step ST41). If not (step ST41; NO), the determination of step ST41 is repeated. When the first-priority candidate has been excluded (step ST41; YES), the candidate display control unit 4b acquires from the control unit 31 the content information that the first-priority candidate last operated (step ST42), and stores the face coordinates of the first-priority candidate and the content information acquired in step ST42 in the content information storage unit 5 in association with each other (step ST43).
  • FIG. 11 is a flowchart showing the operation of acquiring content information from the content information storage unit 5 of the display control apparatus 100B according to the third embodiment and outputting it to the control unit 31.
  • when the candidate display control unit 4b receives the face coordinates of the highest-priority candidate from the priority setting unit 3, it checks whether content information linked to those face coordinates is stored in the content information storage unit 5 (step ST51). If it is not stored (step ST51; NO), the process proceeds to step ST8. On the other hand, when it is stored (step ST51; YES), the content information linked to the face coordinates of the highest-priority candidate is acquired from the content information storage unit 5 (step ST52) and output to the control unit 31 together with the candidate image and information identifying the highest-priority candidate (step ST53).
  • as described above, the display control apparatus 100B is configured by adding the content information storage unit 5 to the display control apparatus 100 according to the first embodiment and by including the candidate display control unit 4b in place of the candidate display control unit 4.
  • with this configuration, even if operation of the operation device 30 is temporarily interrupted and attempted again later, the operation can be resumed from the point of interruption, which improves operational convenience.
  • FIG. 12 is a block diagram showing the configuration of a display control apparatus according to the fourth embodiment.
  • the display control device 100C of the fourth embodiment is configured in the same manner as the display control device 100A of the second embodiment, except that the priority setting unit 3 and the candidate display control unit 4 are replaced by a priority setting unit 3c and a candidate display control unit 4c.
  • the operation device 30 includes a touch operation detection unit 33 that detects a touch operation of the occupant on the display unit 32.
  • FIG. 13 is a diagram showing an example of transferring the operator's authority in the display control apparatus according to the fourth embodiment.
  • the touch operation detection unit 33 detects a touch operation by an occupant on the candidate composite image displayed on the display unit 32 by the processing described in the second embodiment. When a touch operation is detected, the coordinates of the touch within the candidate composite image are output to the candidate display control unit 4c.
  • based on the coordinates input from the touch operation detection unit 33, the candidate display control unit 4c determines which candidate image in the candidate composite image was touched, and notifies the priority setting unit 3c of the candidate corresponding to the touched candidate image.
  • the priority setting unit 3c sets a flag for the candidate.
  • the candidate with the flag set is given the first priority regardless of gaze time. The priorities of the candidates for whom the flag is not set are determined from the line-of-sight information, as in the priority setting process of the second embodiment.
  • for example, when the flag is set for the candidate with the third priority, the priority setting unit 3c sets that candidate to the first priority, changes the former first-priority candidate to the second priority, and changes the former second-priority candidate to the third priority. The candidate with the fourth priority remains fourth.
  • the flag is released when another candidate image is touch-operated: the flag is transferred to the newly touched candidate, and the candidate deprived of the flag is again prioritized based on the line-of-sight information.
  • the priority setting unit 3c and the candidate display control unit 4c in the display control apparatus 100C are realized by the processing circuit 200a shown in FIG. 3A, or by the processor 200b that executes programs stored in the memory 200c shown in FIG. 3B.
  • FIG. 14 is a flowchart showing the operation of the flag setting process of the display control device 100C according to the fourth embodiment.
  • the candidate display control unit 4c determines whether a touch operation on coordinates within the candidate composite image has been input from the touch operation detection unit 33 (step ST61). If no touch operation has been input (step ST61; NO), the process returns to step ST61. When a touch operation has been input (step ST61; YES), the candidate display control unit 4c determines from the coordinates input in step ST61 which candidate image was touched (step ST62).
  • the priority setting unit 3c sets the flag for the candidate determined in step ST62 (step ST63), and the operation ends.
  • FIG. 15 is a flowchart showing the operation of setting the priority of the display control device 100C according to the fourth embodiment.
  • in FIG. 15, the same steps as those of the display control device 100A according to the second embodiment are denoted by the reference numerals used in FIG. 8, and their description is omitted or simplified.
  • the priority setting unit 3c determines whether there is a candidate for whom the flag is set (step ST64). If there is no such candidate (step ST64; NO), the process proceeds to step ST8. On the other hand, when there is a candidate for whom the flag is set (step ST64; YES), that candidate is set to the first priority and the priorities of the other candidates are updated (step ST65).
  • as described above, the display control apparatus 100C according to the fourth embodiment is configured by including the priority setting unit 3c and the candidate display control unit 4c in place of the priority setting unit 3 and the candidate display control unit 4 in the display control apparatus 100A according to the second embodiment.
  • 100, 100A, 100B, 100C display control device, 1 gaze detection unit, 2 gaze target calculation unit, 3, 3a, 3c priority setting unit, 4, 4a, 4b, 4c candidate display control unit, 5 content information storage unit, 20 imaging device, 30 operation device, 31 control unit, 32 display unit, 33 touch operation detection unit, 200a processing circuit, 200b processor, 200c memory
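The two mechanisms described in the bullets above — the Embodiment 3 content store that links a candidate's face coordinates to the content they last operated (steps ST41-ST43 and ST51-ST53), and the Embodiment 4 flag that promotes a touched candidate to first priority (steps ST64-ST65) — can be sketched as follows. This is only an illustrative sketch under assumed data shapes; the class and function names are hypothetical, not from the patent.

```python
# Illustrative sketch of the Embodiment 3 content store (keyed by face
# coordinates) and the Embodiment 4 flag-based priority override.
# All names here are hypothetical stand-ins for the units in the text.

class ContentStore:
    """Links a candidate's face coordinates to their last-operated content
    (steps ST41-ST43) so it can be restored later (steps ST51-ST53)."""

    def __init__(self):
        self._by_face = {}

    def save(self, face_coords, content_info):
        # ST42-ST43: store the content the excluded candidate last operated.
        self._by_face[face_coords] = content_info

    def restore(self, face_coords):
        # ST51-ST52: return the linked content info, or None if not stored
        # (the None case corresponds to the step ST8 path).
        return self._by_face.get(face_coords)


def order_candidates(candidates, flagged=None):
    """ST64-ST65: a touch-flagged candidate is promoted to first priority;
    the remaining candidates keep their gaze-based order."""
    ordered = list(candidates)  # input list is already in priority order
    if flagged in ordered:
        ordered.remove(flagged)
        ordered.insert(0, flagged)
    return ordered


store = ContentStore()
store.save((120, 80), "FM radio, station list page 2")
print(store.restore((120, 80)))   # the interrupted content comes back

# Four candidates in gaze order; the third is touch-flagged (cf. FIG. 13).
print(order_candidates(["A", "B", "C", "D"], flagged="C"))  # ['C', 'A', 'B', 'D']
```

A plain dictionary suffices here because face coordinates act as the candidate key, mirroring how the patent links stored content to face coordinates rather than to an identity.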

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention addresses the problem in the conventional example that, when a plurality of operators attempt to perform gesture operations, it is unclear which operator's gesture operation is being executed by the gesture operation device. The problem is solved by a display control device (100) that causes a face image of the operator of an operation apparatus (30) to be displayed on a display unit (32) of the operation apparatus (30), thereby letting the passengers know who the operator is.

Description

Display control device, display control method, and in-vehicle device provided with display control device
The present invention relates to a display control apparatus, a display control method, and an on-vehicle apparatus including the display control apparatus for making the operator of an operation device known to the occupants of a vehicle.
For example, there is a device comprising a camera installed facing the operator; a control unit connected to a gesture detection unit that detects the operator's gestures from the camera signal output by the camera; a gaze detection unit, connected to the control unit, that detects the operator's line of sight from the camera signal; and a display unit that displays an image toward the operator. When the gesture detection unit detects a gesture of the operator and the gaze detection unit detects that the operator's line of sight is directed at the display unit, the gesture detection unit outputs the detected gesture to the control unit, and the control unit executes the operation corresponding to that gesture (Patent Literature 1 below).
JP 2014-197252 A
In the above conventional example, when a plurality of occupants direct their line of sight at the display unit and attempt to operate the gesture operation device, there is a problem that it cannot be known which occupant's gesture the gesture operation device is executing.
The present invention has been made to solve the above problem, and aims to provide a display control device capable of making the operator known to the occupants by displaying the operator of the operation device on the display unit of the operation device.
A display control device according to the present invention includes: a gaze detection unit that detects the line of sight of each occupant from an image captured by an imaging device that images the occupants; a gaze target calculation unit that calculates the target of each line of sight and determines which occupant is directing their line of sight at the display unit of a non-contact operation device; a priority setting unit that determines the occupants directing their line of sight at the display unit to be candidates and sets a priority order among the candidates; and a candidate display control unit that acquires from the gaze detection unit a candidate image of the candidate with the highest priority and causes the display unit to display it.
The display control device of the present invention can make the operator known to the occupants by displaying the operator of the operation device on the display unit of the operation device.
FIG. 1 is a block diagram showing the configuration of a display control apparatus according to Embodiment 1 of the present invention.
FIG. 2 is a diagram of a display example of a candidate image according to Embodiment 1 of the present invention.
FIGS. 3A and 3B are diagrams showing example hardware configurations of the display control apparatus according to Embodiment 1 of the present invention.
FIG. 4 is a flowchart showing the face image display processing of the display control apparatus in Embodiment 1 of the present invention.
FIG. 5 is a block diagram showing the configuration of a display control apparatus according to Embodiment 2 of the present invention.
FIGS. 6A, 6B, 6C, 6D, and 6E are diagrams showing display examples of candidate composite images in Embodiment 2 of the present invention.
FIG. 7 is a diagram of an example of a candidate composite image according to Embodiment 2 of the present invention.
FIG. 8 is a flowchart showing the face image display processing of the display control apparatus according to Embodiment 2 of the present invention.
FIG. 9 is a block diagram showing the configuration of a display control apparatus according to Embodiment 3 of the present invention.
FIG. 10 is a flowchart showing the process of linking content information to a candidate and storing it in the content information storage unit 5 of the display control apparatus according to Embodiment 3 of the present invention.
FIG. 11 is a flowchart showing the process of acquiring content information from the content information storage unit 5 of the display control apparatus 100B according to Embodiment 3 of the present invention and outputting it to the operation device 30.
FIG. 12 is a block diagram showing the configuration of a display control apparatus according to Embodiment 4 of the present invention.
FIG. 13 is a diagram of an example of transferring the operator's authority in the display control apparatus according to Embodiment 4 of the present invention.
FIG. 14 is a flowchart showing the flag setting process of the display control apparatus 100C according to Embodiment 4 of the present invention.
FIG. 15 is a flowchart showing the priority setting process of the display control apparatus 100C according to Embodiment 4 of the present invention.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. The same or corresponding parts in the drawings are denoted by the same reference numerals, and redundant description is simplified or omitted as appropriate.
Embodiment 1
FIG. 1 is a block diagram showing the configuration of the display control apparatus according to Embodiment 1.
The display control device 100 includes a gaze detection unit 1, a gaze target calculation unit 2, a priority setting unit 3, and a candidate display control unit 4.
As shown in FIG. 1, the display control device 100 is connected to the imaging device 20 and the operation device 30. The operation device 30 includes a control unit 31 and a display unit 32.
The operation device 30 in Embodiment 1 is a non-contact operation device operated without physical contact, for example by voice operation or gesture operation.
The gaze detection unit 1 will be described.
The gaze detection unit 1 acquires an image captured by the imaging device 20 provided near the display unit 32 of the operation device 30. The imaging device 20 is a wide-angle camera capable of imaging the driver's seat and the passenger's seat simultaneously. The gaze detection unit 1 extracts from the acquired image a face image containing at least the entire face of each occupant, and detects each occupant's line of sight from the face image. When an occupant's line of sight is detected, line-of-sight information is output to the gaze target calculation unit 2. The line-of-sight information includes the coordinates of each occupant's face within the image (face coordinates) and the direction of the line of sight (gaze direction). The line of sight can be determined, for example, from the face orientation calculated from facial feature points such as the eyes, nose, and mouth, and from the rotation angle of the eyeballs.
Although the imaging device 20 in Embodiment 1 is a single wide-angle camera provided near the display unit 32 of the operation device 30, any configuration capable of imaging the driver's seat and the passenger's seat may be used; for example, a narrow-angle camera may be provided as the imaging device 20 for each seat.
Next, the gaze target calculation unit 2 will be described.
The gaze target calculation unit 2 calculates the area toward which each occupant directs their line of sight, based on the preset position of the display unit 32 and the line-of-sight information including the occupant's face coordinates and gaze direction, and identifies the object the occupant is looking at. When the gaze target calculation unit 2 detects an occupant directing their line of sight at the display unit 32, it adds to that occupant's line-of-sight information the information that the occupant is gazing at the display unit 32, and outputs the occupant's line-of-sight information to the priority setting unit 3.
Next, the priority setting unit 3 will be described.
When the line-of-sight information of an occupant directing their line of sight at the display unit 32 is input from the gaze target calculation unit 2, the priority setting unit 3 counts the time during which that occupant continuously directs their line of sight at the display unit 32 (gaze time). An occupant whose gaze time exceeds a preset threshold time is determined to be a candidate for operating the operation device 30. By counting the gaze time in this way and treating only occupants whose gaze time exceeds the threshold as candidates, an occupant whose line of sight merely happens to fall on the display unit 32 can be prevented from becoming a candidate.
This method of determining candidates is only an example. For instance, the threshold time may be omitted and an occupant may be determined to be a candidate the moment they gaze at the display unit 32; alternatively, a gesture of an occupant directing their line of sight at the display unit 32 may be detected from the image captured by the imaging device 20, and the occupant determined to be a candidate when a preset gesture indicating the start of operation of the operation device 30 is detected.
Furthermore, the priority setting unit 3 assigns priorities to the candidates in the order in which they were determined to be candidates, and outputs the face coordinates of the candidate with the highest priority (the highest-priority candidate) to the candidate display control unit 4.
Since the driver often performs more important operations than the other occupants and often needs to operate in a hurry, the priorities may be set so that, when the driver is determined to be a candidate, the driver is given the highest priority regardless of the gaze times of the other occupants.
The priority setting unit 3 also counts, from each candidate's line-of-sight information, the time during which the candidate continuously looks away from the display unit 32 (non-gaze time). When a candidate's non-gaze time continues for a preset threshold time, that candidate is excluded from the candidates and the priorities are updated.
For example, in a situation with three candidates, if the highest-priority candidate looks away from the display unit 32 continuously for the predetermined time and is excluded from the candidates, the second-priority candidate becomes the highest priority and the third-priority candidate becomes the second priority.
This allows a candidate to withdraw simply by looking away from the display unit 32, so that the highest-priority candidate for the operation device 30 can be replaced quickly.
In an actual vehicle, vibration during travel and sunlight cause the line of sight to waver or the candidate to glance away briefly, regardless of the candidate's intention. By judging that the candidate has looked away only when the non-gaze time continues for the predetermined time as described above, the influence of such wavering and brief glances can be suppressed.
This method of excluding candidates is only an example. For instance, a gesture of an occupant directing their line of sight at the display unit 32 may be detected from the image captured by the imaging device 20, and the occupant excluded from the candidates when a preset gesture indicating the end of operation of the operation device 30 is detected.
Next, the candidate display control unit 4 will be described.
Based on the face coordinates of the highest-priority candidate input from the priority setting unit 3, the candidate display control unit 4 acquires from the gaze detection unit 1 a candidate image, which is an image containing the entire face of the highest-priority candidate. The candidate display control unit 4 outputs to the control unit 31 of the operation device 30 highest-priority-candidate information that includes the candidate image and information identifying the highest-priority candidate. The candidate display control unit 4 also acquires a new candidate image and outputs it to the control unit 31 each time the highest-priority candidate changes.
When the highest-priority-candidate information is input from the candidate display control unit 4, the control unit 31 disables non-contact operations (voice operations, gesture operations, and the like) on the operation device 30 by anyone other than the occupant (operator) shown in the candidate image.
Furthermore, when there is no highest-priority candidate, the priority setting unit 3 notifies the control unit 31 that no operator is present. On receiving this notification from the priority setting unit 3, the control unit 31 disables non-contact operations (voice operations, gesture operations, and the like) on the operation device 30.
This suppresses erroneous detection, as operations on the operation device, of the movements or speech of occupants who are not directing their line of sight at the display unit 32, that is, occupants who have no intention of operating the operation device 30.
FIG. 2 is a diagram of a display example of a candidate image in Embodiment 1. The control unit 31 displays the candidate image input from the candidate display control unit 4 on the display unit 32 as shown in FIG. 2. When the candidate image input from the candidate display control unit 4 changes, the candidate image displayed on the display unit 32 is updated.
By displaying the candidate image on the display unit 32 in this way, the occupants can be made aware of the operator who can operate the operation device 30. In particular, by using an image containing the entire face as the candidate image, the occupants can easily recognize the operator.
In Embodiment 1, a configuration in which the candidate display control unit 4 and the control unit 31 are separate control units has been described; however, the candidate display control unit 4 may instead incorporate the functions of the control unit 31, so that the candidate display control unit 4 itself restricts non-contact operations on the operation device 30 and performs the display on the display unit 32.
Next, an example hardware configuration of the display control device 100 will be described.
FIGS. 3A and 3B are diagrams showing example hardware configurations of the display control apparatus 100.
Each function of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 in the display control apparatus 100 is realized by a processing circuit. That is, the display control device 100 includes a processing circuit for realizing these functions. The processing circuit may be the processing circuit 200a, which is dedicated hardware as shown in FIG. 3A, or the processor 200b that executes a program stored in the memory 200c as shown in FIG. 3B.
As shown in FIG. 3A, when the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 are dedicated hardware, the processing circuit 200a corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these. The function of each of these units may be realized by its own processing circuit, or the functions of the units may be realized together by a single processing circuit.
As shown in FIG. 3B, when the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 are realized by the processor 200b, the function of each unit is realized by software, firmware, or a combination of software and firmware. The software or firmware is written as a program and stored in the memory 200c. The processor 200b realizes each function by reading and executing the program stored in the memory 200c. That is, the display control apparatus 100 includes the memory 200c for storing programs that, when executed by the processor 200b, result in the execution of each step shown in FIG. 4 described later. These programs can also be said to cause a computer to execute the procedures or methods of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4.
Here, the processor 200b is, for example, a CPU (Central Processing Unit), a processing device, an arithmetic device, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor). The memory 200c may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), ROM (Read Only Memory), flash memory, EPROM (Erasable Programmable ROM), or EEPROM (Electrically EPROM); a magnetic disk such as a hard disk or flexible disk; or an optical disc such as a mini disc, CD (Compact Disc), or DVD (Digital Versatile Disc).
The functions of the gaze detection unit 1, the gaze target calculation unit 2, the priority setting unit 3, and the candidate display control unit 4 may be realized partly by dedicated hardware and partly by software or firmware. In this way, the processing circuit 200a in the display control device 100 can realize each of the functions described above by hardware, software, firmware, or a combination of these.
Next, the face image display processing of the display control device 100 will be described with reference to FIG. 4.
FIG. 4 is a flowchart showing the face image display processing of the display control apparatus 100 according to Embodiment 1. The flowchart of FIG. 4 describes an example in which the priority setting unit 3 determines occupants whose gaze time on the display unit 32 exceeds the threshold time to be candidates, and assigns priorities to the candidates in the order in which they became candidates.
The gaze detection unit 1 acquires an image captured by the imaging device 20 (step ST1) and acquires the occupants' line-of-sight information from the acquired image (step ST2). The gaze target calculation unit 2 determines, based on the line-of-sight information acquired in step ST2, whether any occupant is directing their line of sight at the display unit 32 (step ST3). If no occupant is directing their line of sight at the display unit 32 (step ST3; NO), the process returns to step ST1. If an occupant is directing their line of sight at the display unit 32 (step ST3; YES), the priority setting unit 3 counts that occupant's gaze time on the display unit 32 (step ST4).
The priority setting unit 3 determines whether any occupant's gaze time counted in step ST4 exceeds the threshold time (step ST5). If no occupant's gaze time exceeds the threshold time (step ST5; NO), the process returns to step ST1. If an occupant's gaze time exceeds the threshold time (step ST5; YES), that occupant is determined to be a candidate (step ST6), and priorities are assigned to the candidates in the order in which they were determined (step ST7).
The candidate display control unit 4 acquires, from the gaze detection unit 1, the candidate image of the candidate with the highest priority set in step ST7 (step ST8), and outputs to the control unit 31 priority-one candidate information containing the candidate image acquired in step ST8 and information identifying the highest-priority candidate (step ST9). The process then returns to step ST1.
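For illustration only, the candidate-selection loop of steps ST1 to ST9 can be sketched as follows. This is a minimal, hypothetical sketch: the class name, the per-frame update interface, and the threshold value are assumptions made for the sketch, not part of the disclosed device, and actual gaze detection from camera images is outside its scope.

```python
THRESHOLD = 2.0  # threshold gaze time in seconds (assumed value)

class PrioritySetter:
    """Decides candidates (ST5-ST6) and orders them by when they qualified (ST7)."""
    def __init__(self):
        self.gaze_time = {}    # occupant id -> accumulated gaze time (ST4)
        self.candidates = []   # occupant ids in priority order

    def update(self, occupant_id, looking_at_display, dt):
        # ST3: only occupants currently looking at the display accumulate time
        if not looking_at_display:
            return
        self.gaze_time[occupant_id] = self.gaze_time.get(occupant_id, 0.0) + dt
        # ST5/ST6: an occupant whose gaze time exceeds the threshold becomes a candidate
        if (self.gaze_time[occupant_id] > THRESHOLD
                and occupant_id not in self.candidates):
            # ST7: priority follows the order in which occupants became candidates
            self.candidates.append(occupant_id)

    def top_candidate(self):
        # ST8/ST9: the first candidate in the list has the highest priority
        return self.candidates[0] if self.candidates else None
```

In use, `update` would be called once per captured frame with the result of the gaze-target determination, and `top_candidate` would supply the occupant whose face image is sent to the control unit 31.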
As described above, the configuration according to Embodiment 1 comprises: the gaze detection unit 1, which detects the occupants' gazes from an image captured by the imaging device 20 that images the occupants; the gaze target calculation unit 2, which calculates the target of each gaze and determines which occupants are directing their gazes at the display unit 32 of the operation device 30; the priority setting unit 3, which determines the occupants directing their gazes at the display unit 32 to be candidates and assigns priorities to the candidates; and the candidate display control unit 4, which acquires the candidate image of the highest-priority candidate from the captured image and displays it on the display unit 32. Therefore, when a plurality of occupants attempt to operate the operation device 30, the occupants attempting the operation can be informed of who the operator of the operation device 30 is.
Embodiment 2
Embodiment 2 is a display control device that displays on the display unit 32 the face images of all the candidates, not only the priority-one candidate. The display control device of Embodiment 2 will be described with reference to FIG. 5.
FIG. 5 is a block diagram showing the configuration of the display control device according to Embodiment 2.
The display control device 100A of Embodiment 2 differs from the display control device 100 of Embodiment 1 in that it includes a priority setting unit 3a and a candidate display control unit 4a in place of the priority setting unit 3 and the candidate display control unit 4.
In the following, parts identical or corresponding to those of the display control device 100 according to Embodiment 1 are given the same reference numerals as those used in Embodiment 1, and their description is omitted or simplified.
First, the priority setting unit 3a will be described.
The priority setting unit 3a determines candidates and assigns priorities to them by the method described in Embodiment 1. It then outputs to the candidate display control unit 4a data (candidate data) in which the assigned priorities are linked to the candidates' face coordinates.
Next, the candidate display control unit 4a will be described with reference to FIG. 6.
FIG. 6 shows display examples of the candidate composite image in Embodiment 2.
The candidate display control unit 4a acquires candidate images from the gaze detection unit 1 based on the candidate data output from the priority setting unit 3a. When there is only one candidate, that is, only the priority-one candidate, it outputs to the control unit 31 priority-one candidate information containing the candidate image and information identifying the candidate.
When there are a plurality of candidates, on the other hand, the candidate display control unit 4a creates a candidate composite image, which is a single image combining the plurality of candidate images. It further processes the candidate composite image so that the occupants can distinguish the candidate image of the priority-one candidate from the candidate images of the other candidates. Specifically, as shown in FIG. 6A, an operator mark is added to the priority-one candidate image within the candidate composite image. The marked candidate composite image and candidate information containing information identifying the highest-priority candidate are then output to the control unit 31.
Outputting the plurality of candidate images to the control unit 31 as a single candidate composite image in this way reduces the amount of communication between the candidate display control unit 4a and the control unit 31.
The candidate display control unit 4a creates a new candidate composite image and outputs it to the control unit 31 every time the candidates' priorities change.
When a candidate composite image is input from the candidate display control unit 4a, the control unit 31 displays it on the display unit 32. When the candidate composite image input from the candidate display control unit 4a has changed, the control unit 31 updates the candidate composite image displayed on the display unit 32.
Adding an operator mark to the candidate image is only one example. For example, the priority-one candidate image may be emphasized with a border as shown in FIG. 6B; conversely, the candidate images other than the priority-one image may be grayed out as shown in FIG. 6C, or made less conspicuous by increasing their transparency as shown in FIG. 6D. Alternatively, as shown in FIG. 6E, the priority-one candidate image may be emphasized by displaying the candidate composite image, erasing it after a predetermined time has elapsed, and then displaying only the priority-one candidate image.
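The emphasis variants of FIGS. 6A to 6D can be summarized in a short sketch. The mode names and the (candidate, style) return format are illustrative assumptions of this sketch; the patent describes only the visual effects, not an implementation.

```python
def style_candidates(candidates, mode="operator_mark"):
    """Return (candidate, style) pairs; candidates[0] is the priority-one candidate."""
    styled = []
    for rank, cand in enumerate(candidates):
        is_top = (rank == 0)
        if mode == "operator_mark":      # FIG. 6A: mark on the top candidate
            style = "mark" if is_top else "plain"
        elif mode == "border":           # FIG. 6B: border around the top candidate
            style = "border" if is_top else "plain"
        elif mode == "gray_out":         # FIG. 6C: gray out the non-top candidates
            style = "plain" if is_top else "gray"
        else:                            # FIG. 6D: raise transparency of non-top
            style = "plain" if is_top else "transparent"
        styled.append((cand, style))
    return styled
```

Whatever mode is chosen, only the priority-one entry is rendered differently from the rest, which is what lets occupants identify the current operator at a glance.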
Furthermore, by arranging the candidate images within the candidate composite image so that, for example, priority increases from right to left as shown in FIG. 7, each occupant can easily recognize when his or her turn to operate the operation device 30 will come. FIG. 7 shows an example of the candidate composite image according to Embodiment 2.
Although Embodiment 2 has described a configuration in which the plurality of candidate images are displayed on the display unit 32 as a single candidate composite image, this is only an example. The candidate display control unit 4a may instead output to the control unit 31 data in which the candidate images are linked to their priorities, and the control unit 31 may control the arrangement and appearance of the candidate images displayed on the display unit 32 so that the occupants can recognize who the operator is.
Also, as in Embodiment 1, the candidate display control unit 4a may combine the functions of the control unit 31, in which case the candidate display control unit 4a restricts non-contact operation of the operation device 30 and controls the display on the display unit 32.
Next, a hardware configuration example of the display control device 100A will be described. Description of the configuration identical to that of Embodiment 1 is omitted.
The priority setting unit 3a and the candidate display control unit 4a in the display control device 100A are the processing circuit 200a shown in FIG. 3A, or the processor 200b that executes programs stored in the memory 200c shown in FIG. 3B.
Next, the face image display operation of the display control device 100A will be described with reference to FIG. 8.
FIG. 8 is a flowchart showing the face image display operation of the display control device 100A according to Embodiment 2. The flowchart of FIG. 8 illustrates an example in which the candidate display control unit 4a adds an operator mark to the candidate image of the priority-one candidate. In the following, steps identical to those of the display control device 100 according to Embodiment 1 are given the same reference numerals as those used in FIG. 4, and their description is omitted or simplified.
The candidate display control unit 4a acquires the candidate images of the candidates determined in step ST6 from the gaze detection unit 1 (step ST28) and determines whether the acquired candidate images consist only of the priority-one candidate's image (step ST29). If only the priority-one candidate's image was acquired (step ST29; YES), the unit outputs to the control unit 31 priority-one candidate information containing the candidate image and information identifying the highest-priority candidate (step ST30). If there are a plurality of candidates (step ST29; NO), the unit creates a candidate composite image combining the plurality of candidate images (step ST31), adds an operator mark to the priority-one candidate image within the candidate composite image created in step ST31 (step ST32), and outputs to the control unit 31 candidate information containing the marked candidate composite image and information identifying the highest-priority candidate (step ST33). The process then returns to step ST1.
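The branch of steps ST28 to ST33 can be sketched as follows. Plain strings stand in for candidate images, and the payload dictionary format is an assumption of this sketch, not the actual data exchanged with the control unit 31.

```python
def build_display_payload(candidate_images):
    """Sketch of steps ST28-ST33: one candidate -> its image as-is; several
    candidates -> a composite with a mark on the priority-one image.
    candidate_images[0] is the priority-one candidate's image."""
    if len(candidate_images) == 1:                       # ST29; YES
        return {"type": "single", "image": candidate_images[0]}   # ST30
    composite = list(candidate_images)                   # ST31: gather into one image
    composite[0] = composite[0] + "+mark"                # ST32: mark priority-one
    return {"type": "composite", "images": composite}    # ST33
```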
As described above, whereas the display control device 100 according to Embodiment 1 displays only the priority-one candidate's image on the display unit 32, the display control device 100A according to Embodiment 2 includes the priority setting unit 3a and the candidate display control unit 4a in place of the priority setting unit 3 and the candidate display control unit 4, and thereby displays on the display unit 32 the priority-one candidate's image together with the other candidates' images in a manner that lets the occupants distinguish between them.
By displaying the candidate images other than the priority-one image in addition to the priority-one candidate's image in this way, each occupant can confirm which candidates the operation device 30 has recognized. This removes the occupants' concern that they might not have been recognized by the operation device 30.
Embodiment 3
Embodiment 3 is a display control device obtained by adding to the display control device of Embodiment 1 a function that stores, in a content information storage unit 5, the content information of the operation device 30 that the priority-one candidate was last operating, linked to that candidate, and that restores and displays the last-operated content when the candidate operates the operation device 30 again.
The display control device of Embodiment 3 will be described with reference to FIG. 9.
FIG. 9 is a block diagram showing the configuration of the display control device according to Embodiment 3.
The display control device 100B of Embodiment 3 is configured by adding the content information storage unit 5 to the display control device 100 of Embodiment 1. It also includes a candidate display control unit 4b in place of the candidate display control unit 4.
Parts identical or corresponding to those of the display control device 100 according to Embodiment 1 are given the same reference numerals as those used in Embodiment 1, and their description is omitted or simplified.
The content information storage unit 5 is an area that stores the occupants' face coordinates linked to the content information of the operation device 30. The content information storage unit 5 is provided in the display control device 100B. Alternatively, the content information storage unit 5 may be provided in the operation device 30, in another vehicle-mounted device (not shown) installed in the vehicle, or in an external server (not shown).
Next, the candidate display control unit 4b will be described.
When the face coordinates of the priority-one candidate are input from the priority setting unit 3, the candidate display control unit 4b acquires the candidate image by the processing described in Embodiment 1 and outputs the acquired candidate image and information identifying the highest-priority candidate to the control unit 31.
The candidate display control unit 4b successively acquires from the control unit 31 the content information of the operation device 30 being operated by the priority-one candidate. When the priority-one candidate is removed from the candidates, the content information that was being operated immediately before the removal is linked to that candidate's face coordinates and stored in the content information storage unit 5.
The candidate display control unit 4b checks whether content information corresponding to the face coordinates of a priority-one candidate newly input from the priority setting unit 3 is stored in the content information storage unit 5. If content information corresponding to the new priority-one candidate's face coordinates exists, the unit acquires that content information from the content information storage unit 5 and outputs it to the control unit 31 together with the candidate image and information identifying the highest-priority candidate.
When content information is input from the candidate display control unit 4b, the control unit 31 displays the content corresponding to the content information on the display unit 32.
Next, a hardware configuration example of the display control device 100B will be described. Description of the configuration identical to that of Embodiment 1 is omitted.
The candidate display control unit 4b in the display control device 100B is the processing circuit 200a shown in FIG. 3A, or the processor 200b that executes programs stored in the memory 200c shown in FIG. 3B.
Next, the operation of the display control device 100B will be described separately for the operation of storing content information linked to a candidate in the content information storage unit 5 and the operation of acquiring content information from the content information storage unit 5 and outputting it to the control unit 31.
First, the operation of storing content information linked to a candidate in the content information storage unit 5 will be described with reference to the flowchart of FIG. 10.
FIG. 10 is a flowchart showing the operation of storing content information linked to a candidate in the content information storage unit 5 of the display control device 100B according to Embodiment 3.
The candidate display control unit 4b determines whether the priority-one candidate has been removed from the candidates (step ST41). If the candidate has not been removed (step ST41; NO), the process returns to step ST41. If the candidate has been removed (step ST41; YES), the unit acquires from the control unit 31 the content information that the priority-one candidate was last operating (step ST42), links the face coordinates of the priority-one candidate before the update to the content information acquired in step ST42, and stores them in the content information storage unit 5 (step ST43).
Next, the operation of acquiring content information from the content information storage unit 5 and outputting it to the control unit 31 will be described with reference to the flowchart of FIG. 11. In the following, steps identical to those of the display control device 100 according to Embodiment 1 are given the same reference numerals as those used in FIG. 4, and their description is omitted or simplified.
FIG. 11 is a flowchart showing the operation of acquiring content information from the content information storage unit 5 of the display control device 100B according to Embodiment 3 and outputting it to the control unit 31.
The candidate display control unit 4b receives the face coordinates of the priority-one candidate from the priority setting unit 3 and checks whether content information linked to those face coordinates is stored in the content information storage unit 5 (step ST51). If it is not stored in the content information storage unit 5 (step ST51; NO), the process proceeds to step ST8. If it is stored in the content information storage unit 5 (step ST51; YES), the unit acquires from the content information storage unit 5 the content information linked to the priority-one candidate's face coordinates (step ST52) and outputs the content information to the control unit 31 together with the candidate image and information identifying the highest-priority candidate (step ST53).
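The save path of FIG. 10 and the restore path of FIG. 11 amount to a small key-value store keyed by face coordinates. The sketch below is an illustrative assumption (the key and value formats are not specified by the patent): face coordinates are modeled as a tuple and content information as a dictionary.

```python
class ContentInfoStore:
    """Sketch of the content information storage unit 5: the last-operated
    content information, keyed by a candidate's face coordinates."""
    def __init__(self):
        self._by_face = {}

    def save(self, face_coords, content_info):
        # FIG. 10, ST41-ST43: store on removal of the priority-one candidate
        self._by_face[face_coords] = content_info

    def restore(self, face_coords):
        # FIG. 11, ST51-ST52: None means "not stored", i.e. fall back to step ST8
        return self._by_face.get(face_coords)
```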
As described above, the display control device 100B according to Embodiment 3 is configured by adding the content information storage unit 5 to the display control device 100 according to Embodiment 1 and by providing the candidate display control unit 4b in place of the candidate display control unit 4. As a result, when operation of the operation device 30 is interrupted and then resumed, the operation can be restarted from where it was interrupted, which improves the convenience of operation.
Embodiment 4
Embodiment 4 is a display control device obtained by adding to the display control device of Embodiment 2 a function that allows the authority of the operator of the operation device 30 to be transferred to the candidate of a candidate image that an occupant touches on the display unit 32.
The display control device of Embodiment 4 will be described with reference to FIG. 12.
FIG. 12 is a block diagram showing the configuration of the display control device according to Embodiment 4.
The display control device 100C of Embodiment 4 differs from the display control device 100A of Embodiment 2 in that it includes a priority setting unit 3c and a candidate display control unit 4c in place of the priority setting unit 3a and the candidate display control unit 4a. In addition, the operation device 30 includes a touch operation detection unit 33 that detects an occupant's touch operation on the display unit 32.
Parts identical or corresponding to those of the display control device 100A according to Embodiment 2 are given the same reference numerals as those used in Embodiment 2, and their description is omitted or simplified.
Next, the touch operation detection unit 33, the candidate display control unit 4c, and the priority setting unit 3c will be described with reference to FIG. 13.
FIG. 13 shows an example of transferring the operator's authority in the display control device according to Embodiment 4.
The touch operation detection unit 33 detects an occupant's touch operation on the candidate composite image displayed on the display unit 32 by the processing described in Embodiment 2. When a touch operation is detected, it outputs to the candidate display control unit 4c the coordinates within the candidate composite image at which the touch operation was performed.
Based on the coordinates input from the touch operation detection unit 33, the candidate display control unit 4c determines which candidate image within the candidate composite image was touched and notifies the priority setting unit 3c of the candidate of the touched candidate image.
When notified by the candidate display control unit 4c of the candidate whose image was touched, the priority setting unit 3c sets a flag for that candidate. A candidate for whom the flag is set is given the first priority regardless of gaze time. The priorities of the candidates for whom no flag is set are determined from the gaze information, as in the priority setting processing of Embodiment 2.
For example, when the candidate images of priorities one to four are displayed on the display unit 32 and an occupant touches the priority-three candidate image, the priority setting unit 3c sets the priority of the priority-three candidate to one, changes the candidate who had priority one to priority two, and changes the candidate who had priority two to priority three. The candidate who had priority four remains at priority four.
When the candidate for whom the flag is set is removed from the candidates, the flag is cleared. When the candidate image of another candidate is touched by an occupant, the flag is transferred to the touched candidate, and the priority of the candidate from whom the flag was taken is again determined based on the gaze information.
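The reordering rule described above can be sketched as a single list operation. This is a hypothetical illustration of the rule only; the function name and list representation are assumptions of the sketch.

```python
def reorder_with_flag(candidates, flagged):
    """Sketch of the flag rule of Embodiment 4: the flagged candidate moves to
    priority one, candidates previously ahead of it each drop one place, and
    candidates behind it keep their ranks (e.g. touching the third of four
    candidates: 3 -> 1, 1 -> 2, 2 -> 3, 4 -> 4)."""
    if flagged not in candidates:
        return list(candidates)          # no flagged candidate: order unchanged
    i = candidates.index(flagged)
    return [flagged] + candidates[:i] + candidates[i + 1:]
```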
Next, a hardware configuration example of the display control device 100C will be described. Description of the configuration identical to that of Embodiment 2 is omitted.
The priority setting unit 3c and the candidate display control unit 4c in the display control device 100C are the processing circuit 200a shown in FIG. 3A, or the processor 200b that executes programs stored in the memory 200c shown in FIG. 3B.
Next, the operator authority transfer operation of the display control device 100C will be described separately for the flag setting operation and the priority setting operation.
First, the flag setting operation will be described with reference to the flowchart of FIG. 14.
FIG. 14 is a flowchart showing the flag setting operation of the display control device 100C according to Embodiment 4.
The candidate display control unit 4c determines, from the touch operation detection unit 33, whether a touch operation has been performed at coordinates within the candidate composite image (step ST61). If no touch operation has been input (step ST61; NO), the process returns to step ST61. If a touch operation has been input (step ST61; YES), the candidate display control unit 4c determines from the coordinates input in step ST61 which candidate image was touched (step ST62). The priority setting unit 3c sets a flag for the candidate determined in step ST62 (step ST63), and the operation ends.
Next, the operation of the priority setting process will be described with reference to the flowchart of FIG. 15.
FIG. 15 is a flowchart showing the priority setting process of the display control device 100C according to the fourth embodiment. In the following, steps identical to those of the display control device 100A according to the second embodiment are denoted by the reference numerals used in FIG. 8, and their description is omitted or simplified.
The priority setting unit 3c determines whether a candidate with the flag set exists (step ST64). If no flagged candidate exists (step ST64; NO), the process proceeds to step ST8. If a flagged candidate exists (step ST64; YES), the flagged candidate is set to the first priority, and the priorities of the other candidates are updated (step ST65).
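The branch at steps ST64 and ST65 can be sketched as below. The dictionary keys and the rule for renumbering the remaining candidates (preserving their previous relative order) are assumptions made for illustration, not details stated in this disclosure.

```python
# Hypothetical sketch of steps ST64-ST65: if any candidate carries the flag
# set in step ST63, promote it to first priority and renumber the others in
# their previous order; otherwise leave the ordering to the normal flow
# (step ST8).

def update_priorities(candidates):
    """candidates: list of dicts with "priority" (1 = first) and optional "flagged"."""
    flagged = [c for c in candidates if c.get("flagged")]
    if not flagged:
        return candidates  # step ST64; NO -> proceed to step ST8 unchanged
    ordered = sorted(candidates, key=lambda c: c["priority"])
    head = flagged[0]
    rest = [c for c in ordered if c is not head]
    for rank, c in enumerate([head] + rest, start=1):
        c["priority"] = rank  # step ST65: flagged candidate becomes priority 1
    return candidates
```

Promoting the flagged candidate while shifting the others down keeps the priority sequence dense, so the next gaze-based update can proceed from a consistent ordering.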
As described above, the display control device 100C according to the fourth embodiment differs from the display control device 100 according to the first embodiment in that it includes the candidate display control unit 4b in place of the candidate display control unit 4. Consequently, when multiple candidates want to operate the operation device 30, the operator of the operation device 30 can be selected arbitrarily from among the candidates, which improves convenience.
100, 100A, 100B, 100C: display control device; 1: gaze detection unit; 2: gaze target calculation unit; 3, 3a, 3c: priority setting unit; 4, 4a, 4b, 4c: candidate display control unit; 5: content information storage unit; 20: imaging device; 30: operation device; 31: control unit; 32: display unit; 33: touch operation detection unit; 200a: processing circuit; 200b: processor; 200c: memory

Claims (14)

  1.  A display control device comprising:
     a gaze detection unit that detects a line of sight of an occupant from an image captured by an imaging device that images the occupant;
     a gaze target calculation unit that calculates a target of the line of sight and determines the occupant directing the line of sight toward a display unit of a non-contact operation device;
     a priority setting unit that determines, as a candidate, the occupant directing the line of sight toward the display unit, and sets a priority for the candidate; and
     a candidate display control unit that acquires, from the gaze detection unit, a candidate image of the first-priority candidate having the highest priority, and causes the display unit to display the candidate image.
  2.  The display control device according to claim 1, wherein the candidate image is an image including the entire face of the candidate.
  3.  The display control device according to claim 2, wherein the candidate display control unit acquires candidate images of all the candidates and causes the display unit to display them.
  4.  The display control device according to claim 3, wherein the candidate display control unit adds an operator mark to the candidate image of the first-priority candidate.
  5.  The display control device according to claim 3, wherein, when a touch operation of the occupant on a candidate image is detected, the priority setting unit sets, as the first priority, the person whose candidate image received the touch operation.
  6.  The display control device according to claim 3, wherein the candidate display control unit displays the candidate images of all the candidates and, after a predetermined time has elapsed, erases the candidate images other than that of the first-priority candidate.
  7.  The display control device according to any one of claims 1 to 6, wherein the priority setting unit counts a gaze time during which the line of sight is continuously directed at the display unit, and determines the occupant to be a candidate when the gaze time exceeds a preset threshold time.
  8.  The display control device according to claim 7, wherein the priority setting unit counts a non-gaze time during which the line of sight is continuously away from the display unit, and excludes the candidate when the non-gaze time exceeds a preset threshold time.
  9.  The display control device according to claim 8, wherein the candidate display control unit acquires the candidate images and causes the display unit to update the display each time the priority of a candidate is changed.
  10.  The display control device according to claim 1, further comprising:
     a content information storage unit that stores content information of the operation device in association with the candidate,
     wherein, when content information associated with the first-priority candidate is stored in the content information storage unit, the priority setting unit outputs the content information to the display unit together with the candidate image of the first-priority candidate.
  11.  A display control method comprising:
     detecting a line of sight of an occupant from an image captured by an imaging device that images the occupant;
     calculating a target of the line of sight and determining the occupant directing the line of sight toward a display unit of a non-contact operation device;
     determining, as a candidate, the occupant directing the line of sight toward the display unit, and setting a priority for the candidate; and
     acquiring, from the gaze detection unit, a candidate image of the first-priority candidate having the highest priority, and causing the display unit to display the candidate image.
  12.  A vehicle-mounted apparatus comprising:
     an imaging device that images an occupant;
     a non-contact operation device;
     a display control device that determines, based on a line of sight obtained from an image captured by the imaging device, an operator capable of operating the non-contact operation device, and generates an image including at least the entire face of the operator; and
     a display unit that displays the image including at least the entire face of the operator.
  13.  The vehicle-mounted apparatus according to claim 12, wherein the candidate display control unit invalidates non-contact operations on the operation device by anyone other than the operator.
  14.  The vehicle-mounted apparatus according to claim 13, wherein the candidate display control unit invalidates operations on the operation device when no operator exists.
PCT/JP2017/036909 2017-10-12 2017-10-12 Display control device, display control method, and vehicle-mounted apparatus provided with display control device WO2019073562A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/036909 WO2019073562A1 (en) 2017-10-12 2017-10-12 Display control device, display control method, and vehicle-mounted apparatus provided with display control device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/036909 WO2019073562A1 (en) 2017-10-12 2017-10-12 Display control device, display control method, and vehicle-mounted apparatus provided with display control device

Publications (1)

Publication Number Publication Date
WO2019073562A1 true WO2019073562A1 (en) 2019-04-18

Family

ID=66100526

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/036909 WO2019073562A1 (en) 2017-10-12 2017-10-12 Display control device, display control method, and vehicle-mounted apparatus provided with display control device

Country Status (1)

Country Link
WO (1) WO2019073562A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109539354A * 2018-12-28 2019-03-29 九阳股份有限公司 Gesture recognition control method for a range hood with accurate recognition, and range hood

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000347692A (en) * 1999-06-07 2000-12-15 Sanyo Electric Co Ltd Person detecting method, person detecting device, and control system using it
WO2013141161A1 (en) * 2012-03-23 2013-09-26 株式会社エヌ・ティ・ティ・ドコモ Information terminal, method for controlling input acceptance, and program for controlling input acceptance
JP2013254342A (en) * 2012-06-07 2013-12-19 Sharp Corp Image display device and image display method
JP2014197252A (en) * 2013-03-29 2014-10-16 パナソニック株式会社 Gesture operation apparatus, program thereof, and vehicle mounted with gesture operation apparatus
US20150019995A1 (en) * 2013-07-15 2015-01-15 Samsung Electronics Co., Ltd. Image display apparatus and method of operating the same
JP2016110269A (en) * 2014-12-03 2016-06-20 株式会社東海理化電機製作所 Manipulation input device

Similar Documents

Publication Publication Date Title
JP6316559B2 (en) Information processing apparatus, gesture detection method, and gesture detection program
JP3903968B2 (en) Non-contact information input device
US10061995B2 (en) Imaging system to detect a trigger and select an imaging area
EP3691926A1 (en) Display system in a vehicle
US20190004614A1 (en) In-Vehicle Device
CN110481419B (en) Human-vehicle interaction method, system, vehicle and storage medium
US9141185B2 (en) Input device
US20190236343A1 (en) Gesture detection device
CN113994312A (en) Method for operating a mobile terminal by means of a gesture recognition and control device, motor vehicle and head-mounted output device
JP5813562B2 (en) Operation support apparatus, operation system, operation support method, and program
JP6917697B2 (en) Information display method and information display device
US20180052563A1 (en) Touch panel control device and in-vehicle information device
JP5201480B2 (en) Vehicle control device
JP2021512392A (en) How to operate a head-mounted electronic display device for displaying virtual contents and a display system for displaying virtual contents
JPWO2018066023A1 (en) Driving authority shift determination device and driving authority shift determination method
US10003772B2 (en) Vehicular image control apparatus
WO2019073562A1 (en) Display control device, display control method, and vehicle-mounted apparatus provided with display control device
JP7038560B2 (en) Information processing equipment and information processing method
JP5057091B2 (en) Display control system, display control method, and display control program
JP2019002867A (en) Display controller and display control program
JP6847323B2 (en) Line-of-sight detection device and line-of-sight detection method
US20200218347A1 (en) Control system, vehicle and method for controlling multiple facilities
JP6188468B2 (en) Image recognition device, gesture input device, and computer program
US11451705B2 (en) Imaging control apparatus, imaging control method, and storage medium
US9600097B2 (en) On-vehicle device operation apparatus and on-vehicle device operation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17928145

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17928145

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP