US8077970B2 - Operation estimating apparatus and related article of manufacture - Google Patents

Operation estimating apparatus and related article of manufacture Download PDF

Info

Publication number
US8077970B2
US8077970B2 (US Application No. 11/987,800)
Authority
US
United States
Prior art keywords
human body
operator
body feature
estimation model
transitional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US11/987,800
Other languages
English (en)
Other versions
US20080130953A1 (en)
Inventor
Takahiro Ishikawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Assigned to DENSO CORPORATION reassignment DENSO CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ISHIKAWA, TAKAHIRO
Publication of US20080130953A1 publication Critical patent/US20080130953A1/en
Application granted granted Critical
Publication of US8077970B2 publication Critical patent/US8077970B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/24 Aligning, centring, orientation detection or correction of the image
    • G06V10/245 Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning

Definitions

  • the present invention relates to an operation estimating apparatus for estimating an operation to be performed by the operator.
  • a body model expressing a body by simulation is superimposed in an area corresponding to a human being in an image in accordance with the posture of the human being.
  • the posture or the like of the human being is estimated on the basis of locations of the human body feature points (refer to JP-2003-109015 A).
  • the posture or the like of a human being can be estimated.
  • the above technique cannot estimate whether or not the human being is going to operate some operational objects.
  • the above technique cannot estimate which operational object is to be operated.
  • the present invention is made in view of the above disadvantages. Thus, it is an objective of the present invention to address at least one of the above disadvantages.
  • an operation estimating apparatus which includes an image obtaining means, a human body feature point specifying means, and an operation estimating means.
  • the image obtaining means repeatedly obtains images. Each of the images has a plurality of operational objects and an operator that is positioned to be able to perform operations on the plurality of operational objects.
  • the human body feature point specifying means specifies a predetermined human body feature point of the operator in each of the images that are repeatedly obtained by the image obtaining means.
  • the operation estimating means estimates one of the operations, which the operator is going to perform, based on the human body feature points specified by the human body feature point specifying means in the images.
  • the operation estimating means compares an actual posture locus of the operator with a transitional estimation model for each of the operations by the operator to obtain a degree of approximation of the transitional estimation model to the actual posture locus, the transitional estimation model for the each of the operations being formed based on an estimated posture locus of the each of the operations.
  • the operator is estimated to track the estimated posture locus to operate one of the plurality of operational objects that corresponds to the estimated posture locus.
  • the actual posture locus of the operator is obtained based on the human body feature points specified by the human body feature point specifying means in the images.
  • the operation estimating means estimates that the operator is going to perform the one of the operations that corresponds to the estimated posture locus of the transitional estimation model having the degree of approximation that satisfies a predetermined threshold.
  • an article of manufacture which includes a computer readable medium readable by a computer system and program instructions carried by the computer readable medium for causing the computer system to execute various procedures as all elements of the above operation estimating apparatus.
  • FIG. 1 is a block diagram showing a general configuration of an operation support system
  • FIGS. 2A and 2B are diagrams showing an interior of a vehicle and an interior of the vehicle captured by a camera, respectively;
  • FIG. 3 is a block diagram showing functions in a microcomputer of an operation estimating apparatus
  • FIG. 4 is a flowchart showing a human body feature point specifying process
  • FIGS. 5A to 5J are images captured by the camera and showing a state where the driver operates operational objects
  • FIG. 6 is a diagram showing positions of human body feature points specified by a human body feature point specifying means
  • FIG. 7 is a flowchart showing an operation supporting process
  • FIGS. 8A to 8E are graphs showing examples of estimated stationary models and estimated transitional models
  • FIG. 9 is a flowchart showing a data collecting process
  • FIG. 10 is a flowchart showing a process for specifying the operator
  • FIG. 11 is a flowchart showing an estimated model correcting process
  • FIG. 12 is a diagram showing examples of feature points and feature point groups read from an operational table
  • FIG. 13 is a flowchart showing an approach notifying process
  • FIG. 14 is a flowchart showing a start notifying process.
  • An operation support system 1 is made of components mounted on a vehicle and, as shown in FIG. 1 , includes an operation estimating apparatus 10 as a known ECU having a microcomputer, a memory, and the like, a camera 20 for capturing an image of the interior of the vehicle, a lamp group 30 made of multiple lamps provided around operational objects disposed in the vehicle, a navigation apparatus 40 , an air conditioning apparatus 50 , a shift lever 60 , and a sensor group 70 made of multiple sensors (a proximity sensor, a touch sensor, a switch, and the like) provided around the multiple operational objects disposed in the vehicle.
  • an operation estimating apparatus 10 as a known ECU having a microcomputer, a memory, and the like
  • a camera 20 for capturing an image of the interior of the vehicle
  • a lamp group 30 made of multiple lamps provided around operational objects disposed in the vehicle
  • a navigation apparatus 40
  • an air conditioning apparatus 50
  • a shift lever 60
  • the operational objects disposed in the vehicle include, as shown in FIG. 2A , for example, an inside rear view mirror 82 , a steering wheel 84 , a passenger seat 86 , a center panel right air outlet 90 , a center panel left air outlet 92 , a glove box 94 , a console box 96 , the navigation apparatus 40 , the air conditioning apparatus 50 , and the shift lever 60 .
  • the camera 20 is attached in an area 82 a above the inside rear view mirror 82 in the vehicle and at the position where the camera 20 can capture images of at least a driver of the vehicle and all of the operational objects the driver can operate in the vehicle.
  • the camera 20 outputs a captured image (refer to FIG. 2B ) to the operation estimating apparatus 10 .
  • Each of the multiple lamps of the lamp group 30 is attached in a position such that the each lamp illuminates a corresponding operational object.
  • Each of the multiple sensors of the sensor group 70 is attached in a position such that the each sensor detects the operation on a corresponding operational object.
  • information indicative of the detection result is transmitted to the operation estimating apparatus 10 via a communication path (for example, a network in the vehicle) extending from the sensor to the operation estimating apparatus 10 .
  • the human body feature point specifying process is executed by making the microcomputer in the operation estimating apparatus 10 function as a function block made of an image obtaining means 110 for obtaining an image input from (e.g., image inputted through) the camera 20 , and of a human body feature point specifying means 120 for specifying one or more human body feature points in the driver in the image obtained by the image obtaining means 110 (refer to FIG. 3 ).
  • the image obtaining means 110 obtains an image input from the camera 20 (Step S 100 ).
  • images showing a state where the driver operates the operational objects are obtained (refer to FIGS. 5A to 5J ).
  • the human body feature point specifying means 120 specifies each of the one or more human body feature points in the driver in the images (Step S 120 ).
  • a body model expressing a human body by simulation is superimposed on the posture of the driver in an area corresponding to the driver in the image input from the camera 20 .
  • Each of the coordinates of the reference points in the body model in the image is specified as a human body feature point in the driver in the image.
  • the “reference points in the body model” are, for example, center positions in shoulders, elbows, wrists, and the like (refer to d 1 to d 6 in FIG. 6 ; hereinbelow, one or more human body feature points specified from one image will be simply called “human body feature points”).
  • the technique disclosed in JP-2003-109015 A may be employed.
  • After the human body feature point specifying means 120 registers the human body feature points specified at Step S 120 into a human body feature point list (Step S 130 ), the process returns to Step S 110 .
  • the human body feature point list is prepared in the memory for registration of the predetermined number N of human body feature points by the first-in first-out method. That is, in the human body feature point list, after the human body feature point specifying process is repeated a predetermined number of times, the predetermined number N of human body feature points counted from the latest one are always registered for each of the reference points in the body model, for example. In the human body feature point list, expected human body feature points of specific postures of the driver may be pre-registered as initial values.
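For illustration only, the first-in first-out behavior of the human body feature point list can be sketched in Python as below; the capacity N and the reference-point names d1 to d6 (taken from FIG. 6) are assumptions, since the description only specifies "a predetermined number N".

```python
from collections import deque

N = 100  # assumed capacity; the patent only says "a predetermined number N"

# One FIFO buffer per reference point of the body model (d1..d6 as in FIG. 6):
# once N entries are stored, the oldest entry drops out automatically.
feature_point_list = {ref: deque(maxlen=N) for ref in ("d1", "d2", "d3", "d4", "d5", "d6")}

def register_frame(points):
    """Register one frame's human body feature points (Step S130)."""
    for ref, coord in points.items():
        feature_point_list[ref].append(coord)  # coord: (x, y) image coordinates
```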
  • the operation supporting process is executed by making the microcomputer in the operation estimating apparatus 10 function as a function block made by an operation estimating means 210 for estimating an operation of the driver on the basis of human body feature points registered in the human body feature point list, and an operation supporting means 220 for controlling lighting of the lamp group 30 to support execution of the operation estimated by the operation estimating means 210 (refer to FIG. 3 ).
  • the operation estimating means 210 reads all of the human body feature points registered in the human body feature point list (Step S 210 ) and specifies the posture locus of the driver on the basis of the read human body feature points (Step S 220 ).
  • the human body feature points read at Step S 210 are projected to an eigenspace expressed by principal component analysis, and then, the locus of coordinates on the eigenspace, to which the human body feature points are projected, is specified as the locus of an actual posture of the driver (hereinbelow, called “actual posture locus”).
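As a hedged illustration of this projection step (not part of the patent text), the sketch below uses scikit-learn's PCA; the 12-dimensional feature vector layout (x, y for d1 to d6) and the 3-component eigenspace are assumptions made only to mirror the simplified FIG. 8A.

```python
import numpy as np
from sklearn.decomposition import PCA  # assumed tooling; the patent only names principal component analysis

# Each row: one frame's feature points flattened into a vector, e.g. (x, y) for d1..d6 -> 12 values.
frames = np.random.rand(100, 12)       # placeholder for the human body feature point list

pca = PCA(n_components=3)              # 3-D eigenspace, mirroring the simplified FIG. 8A
pca.fit(frames)

# The sequence of projected coordinates is the "actual posture locus".
actual_posture_locus = pca.transform(frames)
```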
  • the operation estimating means 210 estimates the operation of the driver (e.g., the operation that the driver intends to perform or is performing).
  • the estimation models are roughly classified into a stationary estimation model and a transitional estimation model.
  • the stationary estimation model is formed from points of an estimated posture (estimated posture points) of the driver who is continuously operating an operational object.
  • the stationary estimation models corresponding to the operational objects are prepared in the memory.
  • the transitional estimation model is formed by an estimated posture locus of the driver who performs an operation (e.g., transitional operation). When the driver performs the transitional operation, the driver switches from an operation on a certain operational object to an operation on another operational object, and the driver is estimated to track a certain posture locus (i.e., estimated posture locus) at this time.
  • a model corresponding to each of the transitional operations is prepared in the memory.
  • FIG. 8A shows a stationary estimation model in an eigenspace simplified to three dimensions.
  • FIGS. 8B to 8E show examples of the transitional estimation model in the eigenspace.
  • an angle is calculated between (a) each of the coordinates (i.e., posture points, that is, coordinates on the eigenspace) p01 to p0n (1≦n) in the actual posture locus specified at Step S 220 and (b) each of the coordinates p11 to p1m, p21 to p2m, . . . , px1 to pxm (1≦m) on the eigenspace in the estimated posture loci and the estimated posture points expressed in estimation models 1 to x (1≦x) (Step S 230 ).
  • for each of the coordinates in the actual posture locus, the smallest (minimum) angle among the angles calculated at Step S 230 is specified, and estimation information on the estimated posture locus or estimated posture point used in the calculation of that smallest angle is specified (Step S 240 ).
  • the “estimation information” refers to information such as an estimation model corresponding to the estimated posture locus or estimated posture point used in the calculation of the minimum angle, a transition direction of each of the coordinates in the estimated posture locus used in the calculation of the minimum angle (the direction from the start point to the end point in the estimated posture locus), and the like.
  • as the minimum angle becomes smaller, the estimated posture locus or estimated posture point used in the calculation of the minimum angle has a higher degree of approximation to the actual posture locus.
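One plausible reading of Steps S230/S240, sketched below for illustration only, treats the "angle formed by" two eigenspace coordinates as the angle between their position vectors measured from the eigenspace origin; the layout of estimation_models is an assumption.

```python
import numpy as np

def angle_deg(p, q):
    """Angle in degrees between two eigenspace position vectors."""
    c = np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def smallest_angle(posture_point, estimation_models):
    """Step S230/S240 sketch: for one actual posture point, return the minimum
    angle and the estimation model whose coordinate produced it.
    estimation_models: {model_id: [coordinate p_k1, ..., coordinate p_km]}"""
    best = None
    for model_id, coords in estimation_models.items():
        for q in coords:
            a = angle_deg(posture_point, q)
            if best is None or a < best[0]:
                best = (a, model_id)
    return best  # (minimum angle, model id) -> part of the "estimation information"
```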
  • a check is made to see whether or not all of the minimum angles specified at Step S 240 are less than a predetermined threshold (for example, 20 degrees) (Step S 250 ).
  • When it is determined at Step S 250 that all of the minimum angles are equal to or larger than the predetermined threshold (NO at Step S 250 ), it is estimated that there is no estimation model corresponding to the operation of the driver, that is, no operation is performed on any of the operational objects (Step S 260 ). After that, the process returns to Step S 210 . At Step S 260 , a process of turning off all of the lamps in the lamp group 30 is also performed.
  • when it is determined at Step S 250 that all of the minimum angles are less than the predetermined threshold (YES at Step S 250 ), a check is made to see whether all of the estimation models in the estimation information sets specified at Step S 240 are the same or not (Step S 270 ). In other words, a check is made at Step S 270 to see whether or not the estimation models in the estimation information specified at Step S 240 are identical with each other.
  • When it is determined at Step S 270 that all of the estimation models in the estimation information are not the same (NO at Step S 270 ), the process shifts to Step S 260 where it is estimated that no operation is performed on any of the operational objects. After that, the process returns to Step S 210 .
  • when it is determined at Step S 270 that all of the estimation models in the estimation information are the same (YES at Step S 270 ), a check is made to see whether the estimation model in the estimation information specified at Step S 240 is a stationary estimation model or not (Step S 290 ).
  • When it is determined at Step S 290 that the estimation model is a stationary estimation model (YES at Step S 290 ), the operation of the driver is estimated on the basis of the estimation models which are determined as the same at Step S 270 (Step S 300 ). Then, the process shifts to the following process (Step S 310 ).
  • At Step S 300 , when the estimation models determined as the same as each other at Step S 270 are transitional estimation models, it is estimated, on the basis of the combination of operational objects corresponding to the estimation models, that the driver is performing an operation of finishing the operation on a specific operational object and performing an operation on another operational object.
  • When the estimation models determined as the same at Step S 270 are stationary estimation models, it is estimated that the driver is operating the operational object corresponding to the estimation models.
  • When it is determined at Step S 290 that the estimation models are transitional estimation models (NO at Step S 290 ), a check is made to see whether all of the transition directions in the estimation information specified at Step S 240 are the same or not (Step S 320 ). In other words, a check is made at Step S 320 to see whether or not the transition directions in the estimation information specified at Step S 240 are identical with each other.
  • When it is determined at Step S 320 that all of the transition directions are not the same (NO at Step S 320 ), the process shifts to Step S 260 where it is estimated that no operation is performed on the operational object. After that, the process returns to Step S 210 .
  • when it is determined at Step S 320 that all of the transition directions are the same (YES at Step S 320 ), the process shifts to Step S 300 where the operation performed by the driver is estimated. After that, the process shifts to the following step (Step S 310 ).
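The branching of Steps S250 to S320 can be summarized in the following sketch; the Model dataclass, the 20-degree threshold, and the (angle, model, direction) tuple format are illustrative assumptions rather than the patent's data structures.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Model:
    name: str
    kind: str  # "stationary" or "transitional" (assumed representation)

THRESHOLD_DEG = 20.0  # example threshold value given in the description

def estimate_operation(matches):
    """matches: one (min_angle, model, direction) tuple per actual posture point,
    with direction None for stationary models. Returns the matched model or None."""
    if not matches or any(angle >= THRESHOLD_DEG for angle, _, _ in matches):
        return None                                  # S250 NO -> S260: no operation estimated
    models = {m for _, m, _ in matches}
    if len(models) != 1:
        return None                                  # S270 NO -> S260
    model = next(iter(models))
    if model.kind == "stationary":
        return model                                 # S290 YES -> S300: stationary operation
    if len({d for _, _, d in matches}) != 1:
        return None                                  # S320 NO -> S260: transition directions differ
    return model                                     # S290 NO, S320 YES -> S300: transitional operation
```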
  • the operation supporting means 220 supports (e.g., assists) the operation on the operational object (Step S 310 ).
  • the lamp illuminating the operational object corresponding to the stationary estimation model in the lamp group 30 is turned on, and the other lamps are turned off.
  • when the estimation result is based on the transitional estimation model, that is, on a combination of operational objects, the lamp(s) corresponding to one or both of the operational objects (in the present embodiment, only the operational object on the end point side in the transition direction) is/are turned on, and the other lamps are turned off.
  • Each lamp in the lamp group 30 is turned on by starting energization of that lamp, and turned off by ending the energization of that lamp.
  • After Step S 310 , the process returns to Step S 210 .
  • the data collecting process is executed by making the microcomputer in the operation estimating apparatus 10 function as a function block constructed by an operation state monitoring means 310 , an operator determining means 320 , and a locus storing means 330 (refer to FIG. 3 ).
  • the operation state monitoring means 310 detects that an operation is performed on an operational object prior to start of the data collecting process.
  • the operator determining means 320 determines whether the operation detected by the operation state monitoring means 310 is performed by the driver or not.
  • the locus storing means 330 stores the human body feature points registered in the human body feature point list into predetermined storage areas (a stationary operation table and a transitional operation table which will be described later) in the built-in memory.
  • the operator determining means 320 reads the latest human body feature point registered in the human body feature point list (Step S 410 ). On the basis of the human body feature point, the operator determining means 320 determines whether the operation performed on the operational object detected prior to the start is performed by the driver or not (Step S 420 ).
  • the latest human body feature point that has been registered in the human body feature point list is regarded as a human body feature point corresponding to an operational object detected prior to the start of the data collecting process.
  • a check is made to see whether the operation on the operational object is performed by the driver or not (to specify the operator).
  • At Step S 610 , the coordinates on the eigenspace expressed by the principal component analysis, obtained when the human body feature point read at Step S 410 is projected to the eigenspace, are specified.
  • the above coordinates are referred to as the posture point, more specifically, a posture point in the present location.
  • the angle is calculated, which is formed between (a) the posture point in the present location specified at Step S 610 and (b) each of the coordinates p11 to p1m, p21 to p2m, . . . , px1 to pxm (1≦m) on the eigenspace in the estimated posture locus or the estimated posture points shown in each of estimation models 1 to x (Step S 620 ).
  • the angle group (θ11 to θxm) corresponding to the present-location posture point is calculated.
  • an estimation model is specified, which corresponds to the estimated posture locus or the estimated posture point used in the calculation of the smallest (minimum) angle among the angle group calculated at Step S 620 (Step S 630 ).
  • When an operation corresponding to the estimation model specified at Step S 630 relates to the operational object on which the operation is detected prior to start of the data collecting process (YES at Step S 640 ), it is determined that the operation on the operational object is performed by the driver (Step S 650 ). When the above operation does not relate to the operational object (NO at Step S 640 ), it is determined that the operation on the operational object is not performed by the driver (Step S 660 ).
  • Step S 420 is performed through the above Steps S 610 to S 660 .
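A minimal sketch of this operator-specifying subroutine (Steps S610 to S660) follows; pca, estimation_models, and the model_to_objects lookup are assumed inputs introduced only for this illustration.

```python
import numpy as np

def nearest_model(point, estimation_models):
    """Steps S620/S630 sketch: id of the model whose coordinate forms the smallest angle."""
    def angle(p, q):
        c = np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q))
        return np.arccos(np.clip(c, -1.0, 1.0))
    return min((angle(point, q), mid) for mid, coords in estimation_models.items() for q in coords)[1]

def operation_by_driver(latest_feature_point, pca, estimation_models, model_to_objects, detected_object):
    """Steps S610-S660 sketch: attribute the detected operation to the driver only if
    the nearest estimation model relates to the operational object that reported it."""
    current_point = pca.transform([latest_feature_point])[0]       # S610: present-location posture point
    model_id = nearest_model(current_point, estimation_models)     # S620/S630
    return detected_object in model_to_objects[model_id]           # S640 -> S650 (True) / S660 (False)
```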
  • the data collecting process is finished immediately.
  • the locus storing means 330 registers the human body feature point read at Step S 410 in a stationary operation table corresponding to the operational object, the operation performed on which is detected prior to start of the data collecting process (Step S 430 ).
  • the above stationary operation table is selected from among multiple stationary operation tables prepared for the operational objects.
  • the stationary operation table is a data table capable of registering, by the first-in first-out method, a predetermined number of human body feature points obtained at the time an operation on the corresponding operational object is performed.
  • the stationary operation tables corresponding to the operational objects are prepared in the memory. That is, in the stationary operation table, after the data collecting process is repeated a predetermined number of times, the predetermined number of human body feature points counted (starting) from the latest one are registered as human body feature points on the corresponding operational object.
  • human body feature points expected at the time point when the corresponding operational object is operated may be registered as initial values.
  • the apparatus enters a standby state until the operation on the operational object, which is detected prior to start of the data collecting process, is finished (NO at Step S 440 ).
  • When the sensor for detecting the operation on the corresponding operational object out of the sensor group 70 does not detect the operation, it is determined that the operation is finished.
  • the locus storing means 330 reads the latest human body feature point registered in the human body feature point list (Step S 450 ).
  • the latest human body feature point is read as the human body feature point of the driver at the time of completion of the operation.
  • the locus storing means 330 generates an operation specifying table for registering human body feature points and specifying an operation.
  • In the operation specifying table, the human body feature point read at Step S 450 and the operational object, the operation on which is detected prior to start of the data collecting process, are registered so as to be associated with each other (Step S 460 ).
  • the operation specifying table is a data table for sequentially registering, in time series, human body feature points of the driver after an operation on an operational object is performed.
  • the operation state monitoring means 310 determines whether an operation is newly performed on an operation object or not (Step S 470 ).
  • When it is determined at Step S 470 that an operation on an operational object is not performed (NO at Step S 470 ), the locus storing means 330 determines whether the time that has elapsed since the determination of finishing the operation at Step S 440 exceeds a predetermined threshold value (timeout occurs) or not (Step S 480 ).
  • When it is determined at Step S 480 that the elapsed time does not exceed the predetermined threshold value (NO at Step S 480 ), the locus storing means 330 reads the latest human body feature point registered in the human body feature point list (Step S 490 ) and registers it into the operation specifying table (Step S 500 ). After that, the process returns to Step S 470 . In this case, the latest human body feature point, which corresponds to the human body feature point of the driver after completion of the operation, is read and registered.
  • when it is determined at Step S 480 that the elapsed time exceeds the threshold value (YES at Step S 480 ), the locus storing means 330 deletes the operation specifying table generated at Step S 460 (Step S 510 ). Then, the data collecting process is finished.
  • When it is determined at Step S 470 that an operation on an operational object is performed (YES at Step S 470 ), the locus storing means 330 reads the latest human body feature point registered in the human body feature point list (Step S 520 ). In a manner similar to Step S 420 , the operator determining means 320 determines, on the basis of the human body feature point, whether or not the operation on the operational object detected at Step S 470 is performed by the driver (Step S 530 ).
  • When it is determined at Step S 530 that the operation on the operational object is not performed by the driver (NO at Step S 530 ), the process returns to Step S 470 .
  • the locus storing means 330 determines whether the presently operated operational object is the same as the previously operated operational object, the operation on which has been detected to be finished at Step S 440 (Step S 540 ).
  • When it is determined at Step S 540 that the operational object is the same (YES at Step S 540 ), the process shifts to Step S 510 where the locus storing means erases the operation specifying table. After that, the data collecting process is finished.
  • the locus storing means 330 registers the human body feature points read at Step S 520 into a stationary operation table corresponding to the operational object determined to be operated at Step S 470 out of the multiple stationary operation tables prepared for the operational objects (Step S 550 ).
  • the human body feature points read at Step S 520 are registered in the stationary operation table as human body feature points of the driver who starts an operation on another operational object.
  • the locus storing means 330 registers the human body feature points read at Step S 520 and the operational object determined to be operated at Step S 470 into the operation specifying table generated at Step S 460 so as to be associated with each other (Step S 560 ).
  • the human body feature points read at Step S 520 are registered in the operation specifying table as human body feature points of the driver who has started the operation of the another operational object.
  • the locus storing means 330 registers each of the multiple human body feature points, which have been registered in the operation specifying table at this time point, into a transitional operation table of the operation corresponding to the data registered in the above operation specifying table (Step S 570 ).
  • the above transitional operation table is selected from among multiple transitional operation tables prepared for various operations.
  • the operation specifying table is erased (Step S 580 ) and, after that, the process returns to Step S 440 .
  • the transitional operation table is a data table capable of registering the predetermined number of human body feature points by the first-in first-out method.
  • the transitional operation table registers all of the human body feature points at time points during an interval between (a) the end of an operation on a certain operational object and (b) the start of an operation on another operational object as a single data record (human body feature point group).
  • the transitional operation tables corresponding to the various operations are prepared in advance in the memory.
  • in the transitional operation table, the predetermined number of human body feature point groups, counted from the latest one, are registered, each group being made of human body feature points for the corresponding operation.
  • the human body feature points registered in the operation specifying table are registered to the transitional operation table corresponding to a transitional operation starting from the operational object corresponding to the oldest human body feature point to the other operational object corresponding to the latest human body feature point registered in the operation specifying table.
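For illustration, a transitional operation table with the described first-in first-out behavior might be organized as below; the capacity of 30 records and the (from_object, to_object) key are assumptions.

```python
from collections import deque

MAX_RECORDS = 30   # assumed capacity; the patent only says "the predetermined number"

# One FIFO table per transitional operation, keyed by (from_object, to_object).
transitional_tables = {}

def register_transition(from_object, to_object, feature_point_group):
    """Store all feature points collected between the end of the operation on
    from_object and the start of the operation on to_object as one record."""
    table = transitional_tables.setdefault((from_object, to_object), deque(maxlen=MAX_RECORDS))
    table.append(list(feature_point_group))
```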
  • the estimation model correcting process is executed by making the microcomputer in the operation estimating apparatus 10 function as a functional block made by an estimation model correcting means 410 for correcting (updating) an estimation model stored in the memory to an estimation model generated on the basis of the operation tables (refer to FIG. 3 ).
  • the standby state is set until a correction condition to correct an estimation model is satisfied (NO at Step S 710 ). For example, it is determined that the correction condition is satisfied when the operation estimating apparatus 10 starts in association with the start of the vehicle. Also, the correction condition may be satisfied when a predetermined time period has elapsed since an estimation model was updated by the estimation model correcting process. Alternatively, the correction condition may be satisfied when a predetermined proportion or more of the operation tables has been updated since the estimation models were last updated by the estimation model correcting process. For example, the correction condition may be satisfied when at least one of the above conditions is satisfied (see the sketch below).
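As an illustration only, the correction condition can be treated as the logical OR of the example conditions; the concrete period and proportion thresholds below are assumptions, not values from the patent.

```python
def correction_condition_satisfied(started_with_vehicle,
                                   seconds_since_last_update,
                                   proportion_of_tables_updated,
                                   period_s=3600.0, proportion=0.5):
    """Step S710 sketch: any one of the example conditions suffices.
    period_s and proportion are assumed thresholds."""
    return (started_with_vehicle
            or seconds_since_last_update >= period_s
            or proportion_of_tables_updated >= proportion)
```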
  • the estimation model correcting means 410 reads data registered in the stationary operation table and the transitional operation table (Step S 720 ). In this case, the estimation model correcting means 410 reads (a) a predetermined number of sets of feature points for each of the operational objects registered in the stationary operation table and (b) a predetermined number of sets of the feature point groups for each of the operations registered in the transitional operation table.
  • a case where the estimation model correcting means 410 reads 30 sets of the feature points corresponding to 10 kinds of operational objects registered in the stationary operation tables and 30 sets of the feature point groups corresponding to 9 kinds of operations registered in the transitional operation tables will be described as an example (refer to FIG. 12 ).
  • the estimation model correcting means 410 eliminates outliers from the feature points and the feature point groups read at Step S 720 (Step S 730 ).
  • the feature points of a predetermined proportion (for example, 10 sets out of 30 sets), each of which deviates from the corresponding feature points in the other sets, are removed as outliers from the predetermined number of sets of the feature points for each of the operational objects.
  • the feature point groups of a predetermined proportion (the same as above), in which each of one or more human body feature points of the human body feature point group deviates from the corresponding human body feature points in the other feature point groups, are removed as outliers from the predetermined number of sets of the feature point groups for each of the transitional operations.
  • the estimation model correcting means 410 obtains an eigenspace by applying the principal component analysis to the feature points and feature point groups remaining after the removal of the outliers at Step S 730 .
  • the estimation model correcting means 410 projects each of the remaining feature points and feature point groups to the eigenspace obtained at Step S 740 for each of the operational objects and the operations.
  • the coordinates and the loci of the coordinates on the eigenspace are specified as actual posture points for the respective operational objects by the driver, and specified as actual posture loci for the transitional operations (Step S 750 ).
  • the estimation model correcting means 410 calculates an average of the above specified actual posture points (i.e., averaged actual posture points) of each operational object specified at Step S 750 as the stationary estimation model corresponding to the operational object. Also, at Step S 760 , the estimation model correcting means 410 calculates an average of the above specified actual posture loci for each operation specified at Step S 750 as the transitional estimation model corresponding to the operation. Specifically, an average value of coordinates as actual posture points is calculated for each of the operational objects. The averaged actual posture point is generated as the stationary estimation model for the operational object.
  • a parametric curve (such as cubic spline curve) is applied to the locus of the coordinates as the actual posture locus for each of the operations, and a sequence of coordinate points obtained in such a manner is generated as the transitional estimation model corresponding to the transitional operation.
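The correction of Steps S740 to S760 could be sketched as below; scikit-learn and SciPy are assumed tooling, outlier removal (Step S730) is assumed to have been done already, and resampling each locus with a cubic spline before averaging is only one possible reading of how the averaged locus and the parametric curve are combined.

```python
import numpy as np
from sklearn.decomposition import PCA            # assumed tooling for the principal component analysis
from scipy.interpolate import CubicSpline        # one possible "parametric curve (such as cubic spline curve)"

def correct_models(stationary_sets, transitional_sets, n_points=20):
    """Steps S740-S760 sketch (outliers assumed already removed at S730).
    stationary_sets:   {operational_object: array of shape (frames, features)}
    transitional_sets: {operation: list of arrays of shape (frames, features)}"""
    rows = np.vstack(list(stationary_sets.values()) +
                     [g for groups in transitional_sets.values() for g in groups])
    pca = PCA(n_components=3).fit(rows)                                  # S740: eigenspace

    # S750/S760: averaged actual posture point per operational object -> stationary model
    stationary_models = {obj: pca.transform(v).mean(axis=0) for obj, v in stationary_sets.items()}

    # S750/S760: resample each actual posture locus (several frames assumed per locus)
    # onto a common parameter grid, then average -> transitional model
    u = np.linspace(0.0, 1.0, n_points)
    transitional_models = {}
    for op, groups in transitional_sets.items():
        resampled = [CubicSpline(np.linspace(0.0, 1.0, len(g)), pca.transform(g), axis=0)(u)
                     for g in groups]
        transitional_models[op] = np.mean(resampled, axis=0)
    return pca, stationary_models, transitional_models
```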
  • the estimation model correcting means 410 replaces the prepared estimation models with the estimation models generated at Step S 760 for the corresponding operational objects and operations. In other words, the estimation model correcting means 410 updates the estimation models (Step S 770 ).
  • After Step S 770 , the process returns to Step S 710 and the standby state is set until the correction condition is satisfied again.
  • the approach notifying process is executed by making the microcomputer in the operation estimating apparatus 10 function as a functional block including the operation estimating means 210 , an approach detecting means 510 for detecting, by a radar, approach of a matter to the vehicle on which the operation support system 1 is mounted, and a first interruption notifying means 520 for sending a notification on the basis of a result of detection of the approach detecting means 510 and of a result of estimation of the operation estimating means 210 (refer to FIG. 3 ).
  • the approach detecting means 510 determines whether there is a matter approaching the vehicle or not on the basis of the radar (Step S 810 ).
  • the apparatus is in the standby state when it is determined at Step S 810 that a matter is not approaching the vehicle (NO at Step S 810 ).
  • the first interruption notifying means 520 determines whether or not it is a dangerous state such that the driver is not aware of the approach of the matter (Step S 820 ).
  • the first interruption notifying means 520 monitors the result of estimation of the operation estimating means 210 to determine a dangerous state when the result shows that an operational object other than the steering wheel 84 and the shift lever 60 is operated.
  • When the dangerous state is not determined at Step S 820 (NO at Step S 820 ), the process returns to Step S 810 .
  • the first interruption notifying means 520 notifies of the dangerous state (Step S 830 ), and the process returns to Step S 810 .
  • the notification is realized by outputting a message, which indicates that the driver is not aware of the approach of the matter and that this is therefore dangerous, through the speaker or the display of the navigation apparatus 40 .
  • the message to be output is not limited to the message indicating that the matter (e.g., object) is approaching.
  • a message which indicates that the driver should stop the operation on the operational object and drive the vehicle safely, may be output.
  • a message which indicates that the driver should stop the operation on the operational object at the end point of the posture locus in the operation and drive the vehicle safely, may be output.
  • the start notifying process is executed by making the microcomputer in the operation estimating apparatus 10 function as a function block including the above-described operation estimating means 210 , a traffic information obtaining means 610 for obtaining traffic information, a travel state detecting means 620 for detecting a travel state of the vehicle, and a second interruption notifying means 630 for sending a notification on the basis of the traffic information obtained by the traffic information obtaining means 610 , of the travel state detected by the travel state detecting means 620 , and of the result of estimation of the operation estimating means 210 (refer to FIG. 3 ).
  • the travel state detecting means 620 determines whether the vehicle stops or not on the basis of an output from a sensor capable of detecting travel speed in the sensor group 70 (Step S 910 ). In the present embodiment, it is determined that the vehicle stops when the travel speed detected by the sensor is less than a predetermined value (for example, 5 km/h).
  • the apparatus is in a standby mode when it is determined at Step S 910 that the vehicle does not stop (NO at Step S 910 ).
  • the traffic information obtaining means 610 obtains traffic information from the outside of the vehicle (e.g., from an apparatus mounted on a road and other vehicles) (Step S 920 ).
  • traffic information indicative of traffic conditions is obtained via a communication function of the navigation apparatus 40 .
  • the traffic conditions include whether or not a junction having a traffic signal exists ahead of the vehicle, the display state of the signal (green or red) in the travel direction of the vehicle, whether or not a crossing exists ahead of the vehicle, and the open/close state of the crossing bar of the crossing.
  • the second interruption notifying means 630 determines whether the vehicle should start or not on the basis of the traffic information obtained at Step S 920 (Step S 930 ). It is determined that the vehicle should start when the traffic information obtained at Step S 920 shows that a junction exists and the traffic signal is “green” or that a crossing exists and the crossing bar is “open.”
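For illustration only, the Step S930 decision could look like the following; the traffic_info field names are assumptions introduced for this sketch.

```python
def should_start(traffic_info):
    """Step S930 sketch: the vehicle should start when a junction ahead shows a green
    signal or a crossing ahead has its bar open (field names are assumed)."""
    return ((traffic_info.get("junction_ahead") and traffic_info.get("signal") == "green")
            or (traffic_info.get("crossing_ahead") and traffic_info.get("crossing_bar") == "open"))
```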
  • the travel state detecting means 620 determines whether the vehicle has started traveling or not on the basis of an output from the sensor capable of detecting the travel speed in the sensor group 70 (Step S 940 ).
  • When it is determined at Step S 940 that the vehicle has started traveling (YES at Step S 940 ), the process returns to Step S 910 . On the other hand, when it is determined that the vehicle has not started traveling (NO at Step S 940 ), the process returns to Step S 920 .
  • the travel state detecting means 620 determines whether the vehicle has started traveling or not on the basis of the output from the sensor capable of detecting the travel speed in the sensor group 70 (Step S 950 ).
  • When it is determined at Step S 950 that the vehicle has started traveling (YES at Step S 950 ), the process returns to Step S 910 . On the other hand, when it is determined that the vehicle has not started traveling (NO at Step S 950 ), a check is made to see whether predetermined time (for example, 1 second) has elapsed since the traffic information was obtained at Step S 920 or not (Step S 960 ).
  • At Step S 970 , the second interruption notifying means 630 monitors a result of estimation of the operation estimating means 210 . When the result shows that an operational object other than the steering wheel 84 or the shift lever 60 is operated, it is determined that the cause of the delay in start is the operation on the operational object.
  • When it is determined at Step S 970 that the cause of the delay in start is the operation on the operational object (YES at Step S 970 ), the second interruption notifying means 630 notifies of the above state (Step S 980 ) and the process returns to Step S 910 .
  • At Step S 980 , the notification is realized by outputting a message, which indicates that the start is delayed due to the operation on the operational object, through the speaker or the display of the navigation apparatus 40 .
  • a message indicating that the signal shows “green”, and thereby the vehicle should start may be output.
  • a message that the driver should stop the operation on the operational object and start the vehicle safely may be output.
  • When the estimation result of the operation estimating means 210 is based on the transitional estimation model, a message to interrupt the operation on the operational object at the end point of the posture locus in the corresponding operation and to start the vehicle safely may be output, alternatively.
  • when it is determined at Step S 970 that the operation on the operational object is not the cause of the delay in start (NO at Step S 970 ), the process returns to Step S 910 without performing Step S 980 .
  • the operation estimating apparatus 10 compares the actual posture locus of the driver with each of the estimated posture loci in the transitional estimation models and each of the estimated posture points in the stationary estimation models (Step S 220 to Step S 270 in FIG. 7 ) and, when there is an estimated posture locus having a degree of approximation that satisfies a predetermined threshold value, the operation estimating apparatus 10 can estimate that the driver is going to operate the operational object. Further, the operation estimating apparatus 10 can estimate that the driver is going to perform a certain operation corresponding to the estimated posture locus or the estimated posture points that satisfy the condition of the degree of approximation (Step S 300 in FIG. 7 ).
  • With the operation supporting means 220 , the operation on the operational object to be operated in the estimated operation can be supported (assisted) ahead of time, or the operation performed by the driver can be continuously supported (Step S 310 in FIG. 7 ). Consequently, execution of the operation can be facilitated.
  • Support of the operation on the operational object is realized by turning on a lamp corresponding to the operational object to illuminate the operational object to be operated. Since the operational object to be operated by the driver is illuminated by the lamp, the position of the operational object can be easily visually recognized. As a result, a hand or the like of the driver can be guided to the position of the operational object visually.
  • the operation estimating means 210 reads the estimation models stored in the memory (Step S 230 in FIG. 7 ) and compares the actual posture locus with each of the read estimation models.
  • the estimation models are stored in the memory as above, and the comparison of the actual posture locus with each of the stored estimation models, which are read from the memory, can be realized.
  • the stationary estimation model corresponding to the operation can be corrected (Step S 720 to Step S 770 in FIG. 11 ).
  • the stationary estimation model is corrected to a model in which the tendency of the posture loci in the operator is reflected.
  • the stationary estimation model adapted to the operator can be obtained. This becomes more remarkable as the same operation is repeatedly performed by the operator.
  • the stationary estimation model can be corrected at a predetermined timing on the basis of the actual posture locus obtained from the stored human body feature points (“YES” at Step S 710 in FIG. 11 ).
  • By updating (overwriting) the stationary estimation model corresponding to the operated operational object with an estimation model obtained by averaging the predetermined number of actual posture points obtained from the human body feature points stored in the storage, the stationary estimation model can be corrected (Step S 720 to Step S 770 in FIG. 11 ).
  • a transitional estimation model corresponding to the transitional operation can be corrected based on the actual posture locus obtained based on the human body feature points specified in the transitional operation (Step S 720 to Step S 770 in FIG. 11 ).
  • the transitional estimation model is corrected into a model in which the tendency of the posture loci in the operator is reflected.
  • the transitional estimation model adapted to the operator can be obtained. This becomes more remarkable as the same operation is repeatedly performed by the operator.
  • a transitional estimation model can be corrected at a predetermined timing on the basis of the actual posture loci obtained from the stored human body feature point groups (“YES” at Step S 710 in FIG. 11 ).
  • the transitional estimation model can be corrected by updating (overwriting) the transitional estimation model corresponding to the transition between the end of operation on a certain operational object and the start of operation on another operational object by using an estimation model obtained by averaging the predetermined number of actual posture loci obtained from the human body feature point groups stored in the storage (Step S 720 to Step S 770 in FIG. 11 ).
  • the operation estimating apparatus 10 of the present invention is mounted on a vehicle.
  • the vehicle to which the operation estimating apparatus is applied includes not only an automobile, but also an airplane or a motorcycle that requires the driver to operate it.
  • the operation estimating apparatus 10 can be applied to a machine tool requiring an operation by the operator.
  • any camera can be used provided that it can capture images including human body feature points which can be specified by the human body feature point specifying means 120 .
  • a black-and-white camera, a color camera, a stereo camera (made of multiple camera units) for obtaining a stereoscopic image as human body feature points, a three-dimensional image sensor (a sensor for outputting distance on the basis of components of pixels), or the like can be used.
  • the human body feature point specifying means 120 may specify three-dimensional coordinates as human body feature points.
  • the operation estimating means 210 determines whether or not all of the estimation models corresponding to one or more estimated posture loci (at least one estimated posture locus) having a high degree of approximation, specified by the comparison with the actual posture locus, are identical with each other. Only when all of the estimation models are the same (“YES” at Step S 270 in FIG. 7 ), it is estimated that the driver is going to perform an operation corresponding to the identical estimation models (Step S 270 to Step S 300 in FIG. 7 ).
  • the operation estimating means 210 may estimate that the driver is going to perform an operation for an estimation model that corresponds to the estimated posture locus having the maximum or minimum degree of approximation among one or more estimation posture loci having a high degree of approximation. For this purpose, when it is determined as “YES” at Step S 250 in FIG. 7 , the process may be shifted to Step S 290 without performing Step S 270 such that the operation estimating means 210 estimates that the driver is going to perform an operation corresponding to the estimated posture locus having the maximum degree of approximation (the minimum angle).
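A minimal sketch of this alternative (skipping the same-model check of Step S270 and taking the maximum degree of approximation, i.e. the minimum angle) is shown below, reusing the (angle, model, direction) tuple convention assumed in the earlier sketch.

```python
def estimate_operation_by_best_match(matches, threshold_deg=20.0):
    """Alternative: pick the model whose estimated posture locus or point gives the
    smallest angle, provided that angle is below the threshold (assumed convention)."""
    valid = [m for m in matches if m[0] < threshold_deg]
    if not valid:
        return None
    return min(valid, key=lambda m: m[0])[1]   # model with the minimum angle
```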
  • the operation supporting means 220 supports the operation on the operational object (Step S 310 in FIG. 7 ).
  • the support of the operation may be also realized by making the brightness of the corresponding lamp higher than that of the other lamps.
  • the operation supporting means 220 supports the operation on the operational object (Step S 310 in FIG. 7 ).
  • a specific configuration for supporting an operation on an operational object is not limited to the above-described configuration.
  • the following configuration may be employed.
  • For example, when a changing mechanism capable of changing a direction of each of one or more operational objects is provided, the changing mechanism for the operational object corresponding to a result of estimation of the operation estimating means 210 is controlled such that the direction of the operational object is changed toward the operator, thereby supporting the operation of the operator.
  • the lamp in the above description will be read as “the changing mechanism” and “turn on/off the lamp” will be read as “change/reset the direction of the operational object.”
  • the direction of the operational object corresponding to the result of estimation can be set toward the operator, so that the operational object can be operated more easily.
  • Another configuration may be employed when a vibration mechanism capable of vibrating a part or all of the operational object is provided for each of one or more operational objects.
  • the vibration mechanism corresponding to an operational object to be operated in an operation estimated by the operation estimating means 210 is controlled such that a part or all of the operational object vibrates.
  • the operation is supported.
  • “the lamp” in the above description will be read as “the vibration mechanism” and “turn on/off the lamp” will be read as “start/finish the vibration of the operational object.”
  • When the multiple operational objects include enclosing operational objects (for example, the glove box 94 and the console box 96 ) capable of enclosing a matter by opening/closing a cover and the enclosing operational object has an opening/closing mechanism for opening/closing the cover, the following configuration may be employed.
  • When an operational object to be operated in the operation estimated by the operation estimating means 210 is an enclosing operational object, the operation supporting means 220 controls the opening/closing mechanism in the enclosing operational object to open the cover, and thereby the operation is supported.
  • the lamp in the above description will be read as “the opening/closing mechanism” and “turn on/off the lamp” will be read as “open/close the cover of the enclosing operational object.”
  • the cover of the enclosing operation object can be opened ahead of time. Consequently, a part of an operation of enclosing a matter into the operational object or of an operation of taking the matter from the operational object can be omitted.
  • the operation supporting means 220 controls the opening/closing mechanism in the enclosing operational object to close the cover.
  • the opening/closing mechanism is controlled to close the cover.
  • the cover of the enclosing operational object can be automatically closed.
  • notification is realized by outputting a message indicating that the start is delayed due to the operation on the operational object through the speaker or the display.
  • the notification may be also realized by another method, for example, of turning on a lamp or driving a predetermined actuator.
  • each of the stationary operation table and the transitional operation table is constructed as a data table capable of registering data only by a predetermined number in the first-in first-out method.
  • the operation tables may be constructed as data tables capable of registering data without setting an upper limit. For example, data in the operation tables may be cleared at an arbitrary time or at a predetermined time.
  • the estimation model correcting means 410 extracts the predetermined number of human body feature points and human body feature point groups by excluding the outliers from the stationary operation table and the transitional operation table, and then corrects the estimation models (Steps S 730 to S 770 in FIG. 11 ).
  • the human body feature points and the human body feature point groups are registered to the operation tables each time a predetermined operation is performed. Consequently, when the correction condition is satisfied in a state where the operation is not performed by the sufficient number of times, the predetermined number of human body feature points and human body feature point groups to be essentially used for correction cannot be extracted.
  • the estimation model correcting means 410 may correct an estimation model on the basis of the human body feature points and the human body feature point groups of the number less than the predetermined number. It is also possible to make up the shortfall using prepared human body feature points or human body feature point groups to correct the model.
  • a static operation table, in which human body feature points and human body feature point groups are registered as fixed values, may be prepared for the operations.
  • the estimation model correcting means 410 may make up the shortage with the human body feature points and the human body feature point groups for the operation in question from the static operation table and thereby realize correction of the estimation models.
  • the estimation model can be corrected on the basis of the predetermined number of human body feature points or human body feature point groups.
  • the operation of the driver existing in the driver's seat in the vehicle is estimated. It is also possible to estimate the operation of an operator existing in an alternative one of other seats (for example, passenger's seat) other than the driver's seat in the vehicle. In the alternative case, the camera 20 is mounted to capture the images of the above operator in the alternative one of the other seats.
  • At Steps S 420 and S 530 in FIG. 9 , a check is made to see whether or not the operation is performed by the operator seated in the alternative one of the other seats.
  • At Steps S 420 and S 530 in FIG. 9 , a check is made to see whether the operation is performed by one of the driver and the operator.
  • the notification at Step S 830 in FIG. 13 and Step S 980 in FIG. 14 is performed only when it is estimated that the operation is performed by the driver.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006327403A JP4670803B2 (ja) 2006-12-04 2006-12-04 Operation estimating apparatus and program
JP2006-327403 2006-12-04

Publications (2)

Publication Number Publication Date
US20080130953A1 US20080130953A1 (en) 2008-06-05
US8077970B2 true US8077970B2 (en) 2011-12-13

Family

ID=39475812

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/987,800 Active 2030-10-11 US8077970B2 (en) 2006-12-04 2007-12-04 Operation estimating apparatus and related article of manufacture

Country Status (2)

Country Link
US (1) US8077970B2 (ja)
JP (1) JP4670803B2 (ja)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080052627A1 (en) * 2006-07-06 2008-02-28 Xanavi Informatics Corporation On-vehicle display device and display method adopted in on-vehicle display device
US20110060499A1 (en) * 2009-09-04 2011-03-10 Hyundai Motor Japan R&D Center, Inc. Operation system for vehicle
US9690292B1 (en) * 2015-01-13 2017-06-27 State Farm Mutual Automobile Insurance Company Apparatuses, systems and methods for transitioning between autonomous and manual modes of vehicle operations

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4420081B2 (ja) * 2007-08-03 2010-02-24 株式会社デンソー 行動推定装置
JP5168000B2 (ja) * 2008-07-17 2013-03-21 トヨタ自動車株式会社 操作支援装置および操作支援方法
WO2010053160A1 (ja) * 2008-11-07 2010-05-14 国立大学法人 北海道大学 コンテンツ検索装置およびコンテンツ検索プログラム
WO2010061448A1 (ja) * 2008-11-27 2010-06-03 パイオニア株式会社 操作入力装置、情報処理装置及び選択ボタン特定方法
JP4613999B2 (ja) * 2008-12-22 2011-01-19 株式会社デンソー 行動推定装置、プログラム
JP4683123B2 (ja) 2008-12-24 2011-05-11 株式会社デンソー 行動推定装置、プログラム
JP5200926B2 (ja) 2008-12-26 2013-06-05 トヨタ自動車株式会社 運転支援装置
JP4973667B2 (ja) * 2009-01-16 2012-07-11 株式会社デンソー 操作推定装置およびプログラム
JP5233977B2 (ja) * 2009-12-11 2013-07-10 株式会社デンソー 乗員姿勢推定装置
JP5407886B2 (ja) * 2010-01-15 2014-02-05 アイシン・エィ・ダブリュ株式会社 減速制御装置、方法およびプログラム
US8976230B1 (en) * 2010-06-28 2015-03-10 Vlad Vendrow User interface and methods to adapt images for approximating torso dimensions to simulate the appearance of various states of dress
JP2012118761A (ja) * 2010-12-01 2012-06-21 Panasonic Corp 操作入力装置
CN102958756B (zh) 2011-04-22 2016-07-06 松下知识产权经营株式会社 车辆用输入装置及车辆用输入方法
CN102214296B (zh) * 2011-06-03 2012-11-28 东南大学 一种基于空间比例的驾驶人姿态特征提取方法
JP2013037454A (ja) * 2011-08-05 2013-02-21 Ikutoku Gakuen 姿勢判定方法、プログラム、装置、システム
JP5944287B2 (ja) * 2012-09-19 2016-07-05 アルプス電気株式会社 動作予測装置及びそれを用いた入力装置
JP6303907B2 (ja) * 2014-08-08 2018-04-04 株式会社デンソー 運転者監視装置
JP6443393B2 (ja) * 2016-06-01 2018-12-26 トヨタ自動車株式会社 行動認識装置,学習装置,並びに方法およびプログラム
US9990553B1 (en) 2016-06-14 2018-06-05 State Farm Mutual Automobile Insurance Company Apparatuses, systems, and methods for determining degrees of risk associated with a vehicle operator
US9996757B1 (en) * 2016-06-14 2018-06-12 State Farm Mutual Automobile Insurance Company Apparatuses, systems, and methods for detecting various actions of a vehicle operator
US10452933B1 (en) * 2017-01-19 2019-10-22 State Farm Mutual Automobile Insurance Company Apparatuses, systems and methods for generating a vehicle driver model for a particular vehicle
US11281920B1 (en) * 2019-05-23 2022-03-22 State Farm Mutual Automobile Insurance Company Apparatuses, systems and methods for generating a vehicle driver signature
JP7402084B2 (ja) * 2020-03-05 2023-12-20 本田技研工業株式会社 乗員行動判定装置
DE102022119855B4 (de) 2022-08-08 2024-06-06 Bayerische Motoren Werke Aktiengesellschaft Vorrichtung und Verfahren zum Detektieren einer Ablenkung eines Fahrzeugführers

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11134090A (ja) 1997-10-30 1999-05-21 Tokai Rika Co Ltd 操作信号出力装置
JP2000280780A (ja) 1999-03-31 2000-10-10 Toshiba Corp ドライバ状態検知システム
JP2001216069A (ja) 2000-02-01 2001-08-10 Toshiba Corp 操作入力装置および方向検出方法
JP2002015322A (ja) 2000-06-29 2002-01-18 Nissan Motor Co Ltd 脇見状態検出装置
JP2002133401A (ja) 2000-10-18 2002-05-10 Tokai Rika Co Ltd 操作者判定方法及び操作者判定装置
JP2002236534A (ja) 2001-02-13 2002-08-23 Mitsubishi Motors Corp 車載機器操作装置
JP2003109015A (ja) 2001-10-01 2003-04-11 Masanobu Yamamoto 身体動作測定方式
US20030156756A1 (en) * 2002-02-15 2003-08-21 Gokturk Salih Burak Gesture recognition system using depth perceptive sensors
JP2005063090A (ja) 2003-08-11 2005-03-10 Keio Gijuku ハンドパターンスイッチ装置
US20050063564A1 (en) 2003-08-11 2005-03-24 Keiichi Yamamoto Hand pattern switch device
US20050134117A1 (en) 2003-12-17 2005-06-23 Takafumi Ito Interface for car-mounted devices
US7098812B2 (en) 2002-08-08 2006-08-29 Nissan Motor Co., Ltd. Operator identifying device
US20060210112A1 (en) * 1998-08-10 2006-09-21 Cohen Charles J Behavior recognition system
US20070110298A1 (en) * 2005-11-14 2007-05-17 Microsoft Corporation Stereo video for gaming
US20070195997A1 (en) * 1999-08-10 2007-08-23 Paul George V Tracking and gesture recognition system particularly suited to vehicular control applications

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002259990A (ja) * 2001-02-28 2002-09-13 Nippon Telegr & Teleph Corp <Ntt> 文字入力方法及び装置並びに文字入力プログラムとこのプログラムを記憶した記憶媒体
JP3867039B2 (ja) * 2002-10-25 2007-01-10 学校法人慶應義塾 ハンドパターンスイッチ装置
JP4035610B2 (ja) * 2002-12-18 2008-01-23 独立行政法人産業技術総合研究所 インタフェース装置
JP2006309448A (ja) * 2005-04-27 2006-11-09 Sony Corp ユーザインターフェース装置及び方法

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11134090A (ja) 1997-10-30 1999-05-21 Tokai Rika Co Ltd 操作信号出力装置
US20060210112A1 (en) * 1998-08-10 2006-09-21 Cohen Charles J Behavior recognition system
JP2000280780A (ja) 1999-03-31 2000-10-10 Toshiba Corp ドライバ状態検知システム
US20070195997A1 (en) * 1999-08-10 2007-08-23 Paul George V Tracking and gesture recognition system particularly suited to vehicular control applications
JP2001216069A (ja) 2000-02-01 2001-08-10 Toshiba Corp 操作入力装置および方向検出方法
JP2002015322A (ja) 2000-06-29 2002-01-18 Nissan Motor Co Ltd 脇見状態検出装置
JP2002133401A (ja) 2000-10-18 2002-05-10 Tokai Rika Co Ltd 操作者判定方法及び操作者判定装置
JP2002236534A (ja) 2001-02-13 2002-08-23 Mitsubishi Motors Corp 車載機器操作装置
JP2003109015A (ja) 2001-10-01 2003-04-11 Masanobu Yamamoto 身体動作測定方式
US20030156756A1 (en) * 2002-02-15 2003-08-21 Gokturk Salih Burak Gesture recognition system using depth perceptive sensors
US7098812B2 (en) 2002-08-08 2006-08-29 Nissan Motor Co., Ltd. Operator identifying device
US20050063564A1 (en) 2003-08-11 2005-03-24 Keiichi Yamamoto Hand pattern switch device
JP2005063090A (ja) 2003-08-11 2005-03-10 Keio Gijuku ハンドパターンスイッチ装置
US20050134117A1 (en) 2003-12-17 2005-06-23 Takafumi Ito Interface for car-mounted devices
US20070110298A1 (en) * 2005-11-14 2007-05-17 Microsoft Corporation Stereo video for gaming

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080052627A1 (en) * 2006-07-06 2008-02-28 Xanavi Informatics Corporation On-vehicle display device and display method adopted in on-vehicle display device
US8327291B2 (en) * 2006-07-06 2012-12-04 Xanavi Informatics Corporation On-vehicle display device and display method adopted in on-vehicle display device
US20110060499A1 (en) * 2009-09-04 2011-03-10 Hyundai Motor Japan R&D Center, Inc. Operation system for vehicle
US8849506B2 (en) * 2009-09-04 2014-09-30 Hyundai Motor Japan R&D Center, Inc. Operation system for vehicle
US9690292B1 (en) * 2015-01-13 2017-06-27 State Farm Mutual Automobile Insurance Company Apparatuses, systems and methods for transitioning between autonomous and manual modes of vehicle operations
US10241512B1 (en) * 2015-01-13 2019-03-26 State Farm Mutual Automobile Insurance Company Apparatuses, systems and method for transitioning between autonomous and manual modes of vehicle operations

Also Published As

Publication number Publication date
JP4670803B2 (ja) 2011-04-13
US20080130953A1 (en) 2008-06-05
JP2008140268A (ja) 2008-06-19

Similar Documents

Publication Publication Date Title
US8077970B2 (en) Operation estimating apparatus and related article of manufacture
CN110831819B (zh) 泊车辅助方法以及泊车辅助装置
US10163016B2 (en) Parking space detection method and device
CN108140312B (zh) 停车辅助方法及停车辅助装置
US10810446B2 (en) Parking space line detection method and device
JP6493552B2 (ja) 駐車支援情報の表示方法及び駐車支援装置
US9317759B2 (en) Driving assistance device and driving assistance method
US20150275840A1 (en) Idling stop control system for vehicle
WO2015098156A1 (ja) 視界支援装置、視界支援方法、及び視界支援プログラム
US20190092327A1 (en) Vehicle control device
US20190171211A1 (en) Control Method and Control Device of Automatic Driving Vehicle
JP2004203126A (ja) 車両用周辺監視装置
JP2010009235A (ja) 画像表示装置
CN104812649A (zh) 车辆控制装置
JP7215228B2 (ja) 制御装置、制御方法、制御プログラム
CN111731318B (zh) 车辆控制装置、车辆控制方法、车辆以及存储介质
JPWO2019193715A1 (ja) 運転支援装置
US11198392B2 (en) Vehicle monitoring device
WO2019187749A1 (ja) 車両用情報提供装置
JP2004051063A (ja) 車両周辺視認装置
CN111661064B (zh) 车辆控制装置、车辆控制方法、车辆以及存储介质
JP5083137B2 (ja) 運転支援装置
JP2015144406A (ja) 視界支援装置、視界支援方法、及び視界支援プログラム
JP2022140026A (ja) 画像処理装置、画像処理方法およびプログラム
JP2005030925A (ja) 車両情報算出装置、車両情報算出方法および車両情報算出プログラム

Legal Events

Date Code Title Description
AS Assignment

Owner name: DENSO CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ISHIKAWA, TAKAHIRO;REEL/FRAME:020441/0762

Effective date: 20071210

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12