CN108780564A - Image processing apparatus, image processing method and program - Google Patents

Image processing apparatus, image processing method and program

Info

Publication number
CN108780564A
CN108780564A CN201780017527.7A CN201780017527A CN108780564A
Authority
CN
China
Prior art keywords
mentioned
image
vehicle
image processing
birds
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201780017527.7A
Other languages
Chinese (zh)
Inventor
重村宗作
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Denso Corp
Original Assignee
Denso Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Denso Corp filed Critical Denso Corp
Publication of CN108780564A publication Critical patent/CN108780564A/en
Withdrawn legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/607Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

An image processing apparatus (1) includes: an image acquisition unit (7) configured to acquire images of the surroundings of a vehicle; a conversion unit (9) configured to convert the images into bird's-eye view images; a signal acquisition unit (11) configured to acquire a vehicle signal from an in-vehicle network; a position determination unit (13) configured to determine the positional relationship of a plurality of bird's-eye view images; a synthesis unit (15) configured to generate a composite image; a prediction unit (17) configured to predict the vehicle speed at the time preceding by the signal delay time; and a stop judgment unit (19) configured to judge whether the vehicle has stopped. When the vehicle is judged to have stopped, the position determination unit determines the positional relationship that applies while the vehicle is stationary.

Description

Image processing apparatus, image processing method and program
Cross reference to related applications
This international application claims priority from Japanese Patent Application No. 2016-056975, filed with the Japan Patent Office on March 22, 2016, the entire contents of which are incorporated herein by reference.
Technical field
This disclosure relates to an image processing apparatus, an image processing method, and a program.
Background technology
An image processing apparatus of the following kind is known. Using a camera mounted on a vehicle, the apparatus acquires a plurality of images of the vehicle's surroundings at time intervals. It converts each of the images into a bird's-eye view image. It acquires, from an in-vehicle network, a vehicle signal indicating the operating state of the vehicle, and determines the positional relationship of the bird's-eye view images on the basis of that signal. It then arranges the bird's-eye view images according to the determined positional relationship to generate a composite image. Such an image processing apparatus is disclosed in Patent Document 1 below.
Patent Document 1: Japanese Patent No. 4156214
Detailed study by the inventor revealed the following problem. The vehicle signal obtained from the in-vehicle network is delayed. When the operating state of the vehicle is not constant, generating a composite image by the above method from the delayed vehicle signal produces a deviation between the composite image and the actual situation.
Summary of the invention
One aspect of the present disclosure preferably provides an image processing apparatus, an image processing method, and a program capable of suppressing the deviation between a composite image and the actual situation.
An image processing apparatus according to one aspect of the disclosure includes: an image acquisition unit configured to acquire, using a camera mounted on a vehicle, a plurality of images of the vehicle's surroundings at time intervals; a conversion unit configured to convert each of the images into a bird's-eye view image, thereby generating a plurality of bird's-eye view images; a signal acquisition unit configured to acquire, from an in-vehicle network, a vehicle signal indicating the operating state of the vehicle when the image acquisition unit acquires an image; a position determination unit configured to determine the positional relationship of the plurality of bird's-eye view images using the vehicle signal acquired by the signal acquisition unit; and a synthesis unit configured to generate a composite image in which at least part of the plurality of bird's-eye view images is arranged according to the positional relationship determined by the position determination unit.
The image processing apparatus according to this aspect of the disclosure further includes: a prediction unit configured to predict, from the change over time of the vehicle signal acquired by the signal acquisition unit, the vehicle speed at the time preceding by the signal delay time of the in-vehicle network; and a stop judgment unit configured, when the speed predicted by the prediction unit is at or below a preset threshold, to judge whether the vehicle has stopped by comparing the images acquired by the camera at different times. The position determination unit is configured to determine, when the stop judgment unit judges that the vehicle has stopped, the positional relationship that applies while the vehicle is stationary.
With the image processing apparatus according to this aspect of the disclosure, the deviation between the composite image and the actual situation can be suppressed even when the operating state of the vehicle is not constant.
An image processing method according to another aspect of the disclosure acquires, using a camera mounted on a vehicle, a plurality of images of the vehicle's surroundings at time intervals, converts each of the images into a bird's-eye view image to generate a plurality of bird's-eye view images, acquires from an in-vehicle network a vehicle signal indicating the operating state of the vehicle when an image is acquired with the camera, determines the positional relationship of the plurality of bird's-eye view images using the vehicle signal, and generates a composite image in which at least part of the plurality of bird's-eye view images is arranged according to that positional relationship.
The image processing method according to this other aspect of the disclosure also predicts, from the change over time of the vehicle signal, the vehicle speed at the time preceding by the signal delay time of the in-vehicle network, and, when the predicted speed is at or below a preset threshold, judges whether the vehicle has stopped by comparing the images acquired by the camera at different times. When the vehicle is judged to have stopped, the method determines the positional relationship that applies while the vehicle is stationary.
With the image processing method according to this other aspect of the disclosure, the deviation between the composite image and the actual situation can be suppressed even when the operating state of the vehicle is not constant.
The reference numerals in parentheses in the claims indicate correspondence with specific means described in the embodiments below, and do not limit the technical scope of the present disclosure.
Description of the drawings
Fig. 1 is a block diagram showing the configuration of the image processing apparatus.
Fig. 2 is a block diagram showing the functional configuration of the image processing apparatus.
Fig. 3 is an explanatory diagram showing the arrangement of the camera and the display in the host vehicle.
Fig. 4 is a flowchart of the processing executed by the image processing apparatus.
Fig. 5 is an explanatory diagram showing the method of predicting the speed of the host vehicle at the time ΔT before the current time T.
Fig. 6 is an explanatory diagram showing the method of generating the composite image G(i) of the current cycle.
Fig. 7 is an explanatory diagram showing the method of generating the composite image G(i) of the current cycle when the movement amount ΔX is 0.
Fig. 8 is an explanatory diagram showing a display example of the display.
Fig. 9 is an explanatory diagram showing the method of comparing image P(i) with image P(i-1).
Fig. 10 is an explanatory diagram showing the host vehicle, poles, and a stop line.
Fig. 11 is an explanatory diagram showing the composite image at the time ΔT after the stop timing when it is generated from the movement amount ΔX based on the vehicle signal.
Fig. 12 is an explanatory diagram showing the composite image G(i) that the image processing apparatus generates at the time ΔT after the stop timing.
Fig. 13 is an explanatory diagram showing the method of comparing image P(i) with image P(i-1) based on the position of a landmark.
Description of embodiments
Embodiments of the present disclosure will be described with reference to the drawings.
<First embodiment>
1. Configuration of the image processing apparatus 1
The configuration of the image processing apparatus 1 will be described with reference to Figs. 1 to 3. The image processing apparatus 1 is an in-vehicle device mounted on a vehicle. Hereinafter, the vehicle on which the image processing apparatus 1 is mounted is referred to as the host vehicle. The image processing apparatus 1 comprises a microcomputer having a CPU 3 and semiconductor memories such as RAM, ROM, and flash memory (hereinafter, memory 5). The various functions of the image processing apparatus 1 are realized by the CPU 3 executing a program stored in a non-transitory tangible recording medium. In this example, the memory 5 corresponds to the non-transitory tangible recording medium storing the program. Executing the program performs the method corresponding to the program. The image processing apparatus 1 may have one microcomputer or a plurality of microcomputers.
As shown in Fig. 2, the image processing apparatus 1 has, as functional configurations realized by the CPU 3 executing the program, an image acquisition unit 7, a conversion unit 9, a signal acquisition unit 11, a position determination unit 13, a synthesis unit 15, a prediction unit 17, a stop judgment unit 19, and a display unit 20. The method of realizing these elements of the image processing apparatus 1 is not limited to software; some or all of them may be realized by hardware such as logic circuits and analog circuits, or by a combination thereof.
In addition to the image processing apparatus 1, the host vehicle includes a camera 21, a display 23, and an in-vehicle network 25. As shown in Fig. 3, the camera 21 is mounted on the rear of the host vehicle 27. The camera 21 captures the scenery behind the host vehicle 27 and generates images. The area behind the host vehicle corresponds to the surroundings of the host vehicle.
When the host vehicle 27 is viewed from above, the optical axis 29 of the camera 21 is parallel to the longitudinal axis of the host vehicle 27. The optical axis 29 also has a depression angle. Relative to the host vehicle 27, the optical axis 29 is always fixed. Therefore, if the host vehicle 27 is not tilted and the road is flat, the range 31 captured in the camera 21's image is always at a constant position relative to the host vehicle 27. The range 31 includes the road surface.
As shown in Fig. 3, the display 23 is located in the cabin of the host vehicle 27, where the driver of the host vehicle 27 can view it. The display 23 is controlled by the image processing apparatus 1 and shows images.
The in-vehicle network 25 is connected to the image processing apparatus 1. From the in-vehicle network 25, the image processing apparatus 1 can acquire a vehicle signal indicating the operating state of the host vehicle. Specifically, the vehicle signal is a signal indicating the speed of the host vehicle. An example of the in-vehicle network 25 is CAN (registered trademark).
The vehicle signal delivered over the in-vehicle network 25 is delayed by a signal delay time ΔT. That is, the vehicle signal that the image processing apparatus 1 obtains from the in-vehicle network 25 at time T indicates the operating state of the host vehicle at time (T − ΔT). The signal delay time ΔT is a positive value specific to the in-vehicle network 25 and is a known value.
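As a minimal illustration of this delay semantics, the mapping from the time a signal is acquired to the vehicle state it describes can be written as follows; the concrete value of ΔT is an assumed placeholder, not taken from the patent.

```python
SIGNAL_DELAY_S = 0.1  # ΔT in seconds: assumed placeholder; in practice a known constant of the network


def state_time(acquisition_time_s, delay_s=SIGNAL_DELAY_S):
    """A vehicle signal read from the in-vehicle network at time T
    describes the operating state of the host vehicle at time T - ΔT."""
    return acquisition_time_s - delay_s


# A signal read at t = 5.0 s describes the vehicle as it was at 4.9 s.
t_state = state_time(5.0)
```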
2. Image processing executed by the image processing apparatus 1
The image processing that the image processing apparatus 1 repeatedly executes at a constant period I will be described with reference to Figs. 4 to 9. The unit of the period I is time. Hereinafter, one execution of the processing shown in Fig. 4 is referred to as one cycle.
In step 1 of Fig. 4, the image acquisition unit 7 acquires one image using the camera 21. Since the image processing apparatus 1 repeats the cycle at the constant period I, the image acquisition unit 7 acquires a plurality of images separated by time intervals equal to the period I. For example, the image acquisition unit 7 acquires a first image in the cycle executed at time t0, a second image in the cycle executed at time (t0 + I), a third image in the cycle executed at time (t0 + 2I), and a fourth image in the cycle executed at time (t0 + 3I).
In step 2, the signal acquisition unit 11 acquires a vehicle signal from the in-vehicle network 25. Because every cycle includes both step 1 and this step 2, the signal acquisition unit 11 acquires a vehicle signal whenever the image acquisition unit 7 acquires an image.
In step 3, the signal acquisition unit 11 stores the vehicle signal acquired in step 2 in the memory 5, associated with the time at which the vehicle signal was acquired.
In step 4, the conversion unit 9 converts the image acquired in step 1 of the current cycle into a bird's-eye view image. A well-known method can be used for this conversion, for example the method described in Japanese Unexamined Patent Publication No. 10-211849.
Since the image processing apparatus 1 executes step 4 in every cycle, each execution of step 4 converts the image acquired in step 1 of the same cycle into a bird's-eye view image. In this way, the conversion unit 9 converts each of the images acquired in step 1 into a bird's-eye view image and stores the bird's-eye view image in the memory 5.
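The patent leaves the bird's-eye conversion to well-known methods and does not specify one. The sketch below illustrates the usual inverse-perspective idea under simplifying assumptions not stated in the patent: a pinhole camera at height h with depression angle theta, a flat road, and no lens distortion; all parameter values are hypothetical.

```python
import math


def pixel_to_ground(du, dv, f, h, theta):
    """Map a pixel offset from the principal point (du right, dv down, in
    pixels) to a point on a flat road plane, for a pinhole camera of focal
    length f (pixels) mounted at height h (metres) with an optical-axis
    depression angle theta (radians).

    Returns (x, y): x = ground distance along the viewing direction,
    y = lateral offset, both in metres."""
    denom = f * math.sin(theta) + dv * math.cos(theta)
    if denom <= 0:
        raise ValueError("ray does not intersect the road plane")
    x = h * (f * math.cos(theta) - dv * math.sin(theta)) / denom
    y = h * du / denom
    return x, y


# With a 45-degree depression angle, the principal point looks at the road
# point at distance h / tan(45°) = h behind (in front of) the camera.
x, y = pixel_to_ground(0.0, 0.0, f=1000.0, h=1.0, theta=math.radians(45))
```

Sampling the source image at the pixel that maps to each ground-plane grid cell yields the bird's-eye view image; cited methods of this kind typically precompute the mapping as a lookup table.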
In step 5, it is judged whether the timing for starting generation of the composite image has arrived. This timing is the point at which the host vehicle has moved a preset distance since a bird's-eye view image was first stored in step 4. If the timing has arrived, the processing proceeds to step 6; otherwise, the processing ends.
In step 6, the prediction unit 17 predicts, as follows, the speed of the host vehicle at the time ΔT before the current time T.
First, the prediction unit 17 reads the vehicle signals stored in step 3 of the current cycle and of the preceding cycles. As described above, each vehicle signal indicates the speed of the host vehicle. Next, as shown in Fig. 5, the prediction unit 17 plots the read vehicle signals on a chart whose vertical axis is speed and whose horizontal axis is the time at which each vehicle signal was acquired.
Next, using the change of speed over time shown in the chart, the prediction unit 17 calculates an approximation curve 33 expressing the relationship between acquisition time and speed. The approximation curve 33 may be a linear, quadratic, or cubic function of time. The prediction unit 17 then uses the approximation curve 33 to predict the speed of the host vehicle at the time ΔT before the current time.
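As an illustrative sketch (not the patent's implementation), a first-order approximation curve can be fitted to the (acquisition time, speed) samples by least squares and then evaluated at the prediction time; the sample values and the evaluation time below are hypothetical.

```python
def fit_line(times, speeds):
    """Least-squares straight line speed = a + b * t through the samples
    (a first-order instance of the approximation curve 33)."""
    n = len(times)
    mt = sum(times) / n
    ms = sum(speeds) / n
    var = sum((t - mt) ** 2 for t in times)
    cov = sum((t - mt) * (s - ms) for t, s in zip(times, speeds))
    b = cov / var
    a = ms - b * mt
    return a, b


def predict_speed(times, speeds, t_target):
    """Evaluate the approximation curve at t_target, e.g. a time chosen to
    compensate for the signal delay ΔT."""
    a, b = fit_line(times, speeds)
    return a + b * t_target


# Hypothetical samples of a decelerating vehicle: speed drops 2 m/s per second.
v = predict_speed([0.0, 1.0, 2.0, 3.0], [6.0, 4.0, 2.0, 0.0], 2.5)
```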
In step 7, the stop judgment unit 19 judges whether the speed predicted in step 6 is at or below a preset threshold. If the predicted speed exceeds the threshold, the processing proceeds to step 8; if the predicted speed is at or below the threshold, the processing proceeds to step 12.
In step 8, the movement amount ΔX of the host vehicle during the interval from the previous cycle to the current cycle is calculated. ΔX is the speed indicated by the vehicle signal acquired in step 2 of the current cycle multiplied by the period I. For example, if the speed is 60 km/h and the period I is 33 msec, ΔX is 0.55 m.
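The worked example above can be reproduced directly; this is a sketch in which unit conversion is the only nontrivial part.

```python
def movement_amount(speed_kmh, period_s):
    """ΔX in metres: the speed from the vehicle signal multiplied by the
    cycle period I (km/h converted to m/s, then multiplied by seconds)."""
    return speed_kmh / 3.6 * period_s


dx = movement_amount(60.0, 0.033)  # 60 km/h and I = 33 msec give ΔX = 0.55 m
```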
In step 9, the synthesis unit 15 generates a composite image as follows. If generation of the composite image began in the previous cycle or earlier, the synthesis unit 15 reads the content and position of the composite image stored in the memory 5 in step 10 of the previous cycle. The position of the composite image is its position when shown on the display 23. If generation of the composite image starts in the current cycle, no composite image is read.
Fig. 6 shows an example of the composite image that is read. In Fig. 6, composite image G(i-1) is the composite image generated and stored in the previous cycle. G(i-1) is, for example, an image combining the bird's-eye view images TV(i-1), TV(i-2), TV(i-3), and TV(i-4). TV(i-1) is the bird's-eye view image generated in step 4 of the cycle (n+1) cycles earlier, TV(i-2) that of the cycle (n+2) cycles earlier, TV(i-3) that of the cycle (n+3) cycles earlier, and TV(i-4) that of the cycle (n+4) cycles earlier. Here n is a fixed natural number of 1 or more.
Composite image G(i-1) is generated as follows. First, in step 9 of the cycle four cycles earlier, composite image G(i-4), consisting of bird's-eye view image TV(i-4), is generated. Next, in step 9 of the cycle three cycles earlier, bird's-eye view image TV(i-3) is added to G(i-4) to generate composite image G(i-3). Next, in step 9 of the cycle two cycles earlier, TV(i-2) is added to G(i-3) to generate G(i-2). Finally, in step 9 of the previous cycle, TV(i-1) is added to G(i-2) to generate G(i-1).
Next, the synthesis unit 15 moves the position of composite image G(i-1) by kΔX in the direction opposite to the traveling direction D of the host vehicle. kΔX is the value obtained by multiplying the movement amount ΔX by a constant k determined by the pixel pitch of the bird's-eye view image (the ground distance represented by one pixel, in m/pix); kΔX thus expresses the movement amount as a number of pixels.
Next, the synthesis unit 15 adds, to composite image G(i-1), the bird's-eye view image TV(i) generated one cycle after the newest bird's-eye view image TV(i-1) contained in G(i-1), thereby generating the composite image G(i) of the current cycle. For example, if TV(i-1) is the bird's-eye view image generated in step 4 of the cycle (n+1) cycles earlier, TV(i) is the bird's-eye view image generated in step 4 of the cycle n cycles earlier.
The position of the newly added bird's-eye view image TV(i) is always fixed. The positional relationship between TV(i) and TV(i-1), TV(i-2), TV(i-3), and TV(i-4) is therefore determined by kΔX; that is, kΔX corresponds to the positional relationship of the plurality of bird's-eye view images.
The position of TV(i) is offset by kΔX toward the direction D relative to TV(i-1). Where TV(i) overlaps composite image G(i-1), the content of the overlapping region in G(i) is that of TV(i).
The length of composite image G(i) in the vertical direction has a fixed upper limit. Until that length reaches the upper limit, no old bird's-eye view image is deleted even when TV(i) is added.
On the other hand, if adding TV(i) makes the vertical length of G(i) exceed the upper limit, the oldest bird's-eye view image contained in G(i) is deleted. In the example shown in Fig. 6, TV(i-4) is deleted.
In step 14, described later, when the movement amount ΔX is 0, the position of composite image G(i-1) is not moved, as shown in Fig. 7. The position of TV(i-1) within G(i-1) is then identical to the position where TV(i) is added, so TV(i-1) and TV(i) overlap in their entirety. G(i) is thus the image obtained by replacing the TV(i-1) part of G(i-1) with TV(i).
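The bookkeeping of step 9 can be sketched with the composite represented as a list of (vertical pixel offset, image id) entries, newest first; the shift, image length, and height limit below are hypothetical values. Earlier entries win in overlapping regions when rendered, which reproduces both the ΔX > 0 case and the ΔX = 0 replacement case.

```python
def update_composite(entries, new_id, shift_pix, img_len, max_len):
    """One cycle of composite-image maintenance.

    entries: list of (top_offset, image_id), newest first. The newest
    bird's-eye image is always placed at the fixed offset 0, so shifting
    the existing entries by shift_pix (= kΔX) encodes the positional
    relationship of the bird's-eye view images."""
    moved = [(off + shift_pix, iid) for off, iid in entries]
    moved.insert(0, (0, new_id))
    # Delete the oldest bird's-eye images once the composite would exceed
    # its fixed vertical length limit.
    while moved[-1][0] + img_len > max_len:
        moved.pop()
    return moved


g_prev = [(0, "TV(i-1)"), (50, "TV(i-2)"), (100, "TV(i-3)"), (150, "TV(i-4)")]

# Normal motion: TV(i-4) would extend past the 250-pixel limit and is
# deleted, as in Fig. 6.
g_curr = update_composite(g_prev, "TV(i)", shift_pix=50, img_len=100, max_len=250)

# Stopped (ΔX = 0): TV(i) lands exactly on TV(i-1), whose content it
# replaces in the rendered image, as in Fig. 7.
g_stop = update_composite(g_prev, "TV(i)", shift_pix=0, img_len=100, max_len=250)
```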
In step 10, the synthesis unit 15 stores the content and position of the composite image G(i) generated in step 9 in the memory 5.
In step 11, the display unit 20 shows the composite image G(i) generated in step 9 on the display 23. An example of this display is shown in Fig. 8. G(i) appears in the middle-left area of the display 23. The area labeled "image of the traveling direction" in Fig. 8 shows the live image from the camera 21. The area labeled "real image" shows the bird's-eye view image generated in step 4 of the current cycle. The area labeled "blank" shows no image.
The upper-left area of the display 23 is for showing a bird's-eye view image converted from an image captured by a front camera. Fig. 8 shows the display while the host vehicle is reversing; the front camera is not used while reversing, so in Fig. 8 the upper-left area of the display 23 is "blank". When the host vehicle moves forward, the upper-left area of the display 23 shows the bird's-eye view image converted from the front camera's image.
If the judgment in step 7 is affirmative, the processing proceeds to step 12. In step 12, the stop judgment unit 19 compares the image P(i) acquired in step 1 of the current cycle with the image P(i-1) acquired in step 1 of the previous cycle. P(i) and P(i-1) correspond to the plurality of images acquired by the camera 21 at different times.
The comparison is performed as follows. As shown in Fig. 9, pixels at the same coordinates in P(i) and P(i-1) are compared with each other, and the differences in their brightness, color, and so on are calculated. This processing is carried out for all pixels in P(i) and P(i-1), and the difference for the whole image is calculated.
In step 13, the stop judgment unit 19 judges, based on the comparison result of step 12, whether the host vehicle has stopped. That is, if the whole-image difference calculated in step 12 is at or below a preset threshold, the host vehicle is judged to have stopped; otherwise, the host vehicle is judged not to have stopped. If the host vehicle is judged to have stopped, the processing proceeds to step 14; otherwise, the processing proceeds to step 8.
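Steps 12 and 13 amount to a whole-image sum of per-pixel differences compared against a threshold. A grayscale sketch follows; the image values and threshold are hypothetical.

```python
def whole_image_difference(img_a, img_b):
    """Step 12: sum of absolute per-pixel differences between two images
    of equal size, each given as a list of rows of grayscale values."""
    return sum(abs(a - b)
               for row_a, row_b in zip(img_a, img_b)
               for a, b in zip(row_a, row_b))


def vehicle_stopped(img_a, img_b, threshold):
    """Step 13: the host vehicle is judged to have stopped when the
    whole-image difference is at or below a preset threshold."""
    return whole_image_difference(img_a, img_b) <= threshold


frame_prev = [[10, 10], [20, 20]]
frame_curr = [[10, 12], [20, 19]]  # small change: scene essentially static
```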
In step 14, the movement amount ΔX is set to 0; a movement amount of 0 corresponds to the host vehicle being stationary. The processing then proceeds to step 9.
3. Effects of the image processing apparatus 1
(1A) The image processing apparatus 1 predicts the speed of the host vehicle at the time ΔT before the current time. Then, when the predicted speed is at or below a preset threshold, the image processing apparatus 1 judges whether the host vehicle has stopped by comparing the images acquired by the camera 21 at different times. When it judges that the host vehicle has stopped, it sets the movement amount ΔX to 0.
As a result, the deviation between the composite image and the actual situation can be suppressed even when the operating state of the host vehicle changes. This effect is explained using the example shown in Figs. 10 to 12.
As shown in Fig. 10, the host vehicle 27 reverses between poles 35 on both sides and stops just before a stop line 37. That is, the operating state of the host vehicle 27 changes from reversing to stopped. Both while the host vehicle 27 is reversing and while it is stopped, the image processing apparatus 1 repeatedly generates composite images and shows them on the display 23.
Suppose that the processing of steps 6, 7, 12, 13, and 14 were omitted and the movement amount ΔX were always calculated in step 8. In that case, the composite image at the time ΔT after the host vehicle 27 stops just before the stop line 37 (hereinafter, the stop timing) would be as shown in Fig. 11. In that composite image, the pole 35A closest to the stop line 37 among the poles 35 is shown above its actual position 39; that is, the composite image deviates from the actual situation.
The reason is as follows. The image processing apparatus 1 obtains vehicle signals that lag by the signal delay time ΔT. Consequently, during the interval from the stop timing until the time ΔT later, the acquired vehicle signals still indicate the speed at which the host vehicle 27 was reversing. That is, during that interval the image processing apparatus 1 obtains vehicle signals indicating a reversing speed even though the host vehicle 27 has actually stopped. The movement amount ΔX calculated at the time ΔT after the stop timing therefore becomes larger than the actual movement.
Then, as shown in Fig. 6, when the composite image G(i) is generated at the time ΔT after the stop timing, composite image G(i-1) is moved upward farther than it should be. As a result, as shown in Fig. 11, the pole 35A shown in G(i-1) moves upward and appears in G(i) above its actual position 39. Note that in Fig. 11 and in Fig. 12 described later, the host vehicle 27 is an image rendered by computer graphics.
In contrast, after the stop timing, the image processing apparatus 1 makes an affirmative judgment in step 7 and then an affirmative judgment in step 13, and can therefore set the movement amount ΔX to 0. When it generates the composite image G(i) at the time ΔT after the stop timing, it does not move G(i-1), as shown in Fig. 7. As a result, as shown in Fig. 12, the pole 35A shown in G(i-1) does not move upward in G(i) and is correctly displayed at its actual position 39. That is, the deviation between the composite image and the actual situation is suppressed.
(1B) The vehicle signal acquired by the image processing apparatus 1 indicates the speed of the host vehicle. Therefore, the movement amount ΔX is easy to calculate.
(1C) The image processing apparatus 1 judges whether the host vehicle has stopped by comparing the pixels at the same coordinates in a plurality of images with each other. This makes it possible to judge easily and correctly whether the host vehicle has stopped.
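A minimal sketch of such a pixel-wise comparison, assuming two grayscale frames of identical shape; the mean-absolute-difference criterion and the threshold value are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def pixels_unchanged(frame_prev, frame_curr, mad_threshold=2.0):
    """Compare pixels at the same coordinates of two frames; a small mean
    absolute difference is taken to mean the vehicle has stopped."""
    # Widen to a signed type first so the uint8 subtraction cannot wrap around.
    diff = np.abs(frame_prev.astype(np.int16) - frame_curr.astype(np.int16))
    return float(diff.mean()) <= mad_threshold
```

A tolerance rather than exact equality is used because real camera frames differ slightly from sensor noise even when the vehicle is stationary.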
< Second embodiment >
1. Differences from the first embodiment
The basic configuration of the second embodiment is the same as that of the first embodiment, so the description of the common configuration is omitted and the following description focuses on the differences. Reference numerals identical to those of the first embodiment denote identical configurations, and the preceding description applies to them.
In step 12 described above, the stop judging unit 19 compares the image P(i) with the image P(i-1) by the following method.
As shown in Fig. 13, the stop judging unit 19 identifies the same landmark 41 in each of the images P(i) and P(i-1). The landmark 41 can be identified using a well-known image recognition technique. The landmark 41 is preferably an object that does not move relative to the ground, such as a white line, a building, or another structure. Whether the landmark identified in the image P(i) is the same as the landmark identified in the image P(i-1) can be judged from the similarity of the shapes, sizes, colors, and the like of the two.
Next, the stop judging unit 19 compares the position of the landmark 41 identified in the image P(i) with the position of the landmark 41 identified in the image P(i-1), and calculates the distance between them.
In step 13 described above, the stop judging unit 19 judges whether the host vehicle has stopped based on the comparison result in step 12. That is, if the distance calculated in step 12 is equal to or less than a preset threshold value, the stop judging unit 19 judges that the host vehicle has stopped; otherwise, it judges that the host vehicle has not stopped. The processing proceeds to step 14 when it is judged that the host vehicle has stopped, and proceeds to step 8 described above when it is judged that the host vehicle has not stopped.
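Under the same hedging (names and the pixel threshold are illustrative, not from the patent), steps 12 and 13 of this embodiment reduce to a distance test between the two landmark positions:

```python
import math

def vehicle_stopped(pos_prev, pos_curr, dist_threshold_px):
    """Step 12: compare the position of the same landmark in P(i-1) and
    P(i). Step 13: the vehicle is judged stopped if the landmark moved by
    at most the preset threshold."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    return math.hypot(dx, dy) <= dist_threshold_px
```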
2. Effects of the image processing apparatus 1
According to the second embodiment described in detail above, the following effect can be obtained in addition to the effects (1A) and (1B) of the first embodiment described above.
(2A) The image processing apparatus 1 identifies the same landmark 41 in each of the images P(i) and P(i-1). The image processing apparatus 1 then judges whether the host vehicle has stopped by comparing the positions of the landmark 41 in the images P(i) and P(i-1). This makes it possible to judge easily and correctly whether the host vehicle has stopped.
< Other embodiments >
The modes for carrying out the present disclosure have been described above, but the present disclosure is not limited to the above-described embodiments and can be implemented with various modifications.
(1) The camera 21 may be mounted at the front end of the host vehicle 27. In this case, the image processing apparatus 1 can use the camera 21 to acquire images of the area ahead of the host vehicle. The area ahead of the host vehicle corresponds to the periphery of the host vehicle. The image processing apparatus 1 can generate a composite image using the images of the area ahead of the host vehicle.
(2) The vehicle signal may be a signal indicating the position of the host vehicle or the movement amount of the host vehicle. For example, the vehicle signal may be a signal indicating at least one selected from a pulse counter value, the traveling direction of the host vehicle, and the steering angle of the host vehicle.
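As an illustration of how such alternative signals could be turned into a movement amount, the following dead-reckoning sketch converts a wheel-pulse counter and a traveling direction into a planar displacement. Everything here (names, the metres-per-pulse constant, the flat-ground assumption) is a hypothetical example, not part of the patent:

```python
import math

def displacement_from_pulses(pulse_delta, metres_per_pulse, heading_rad, reverse):
    """Estimate the planar movement of the host vehicle between two frames
    from a wheel-pulse counter, a heading, and the traveling direction."""
    dist = pulse_delta * metres_per_pulse  # pulses -> metres travelled
    if reverse:
        dist = -dist                       # reversing: negate the displacement
    # Project the travelled distance onto the ground plane.
    return (dist * math.cos(heading_rad), dist * math.sin(heading_rad))
```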
(3) The images compared in step 12 described above may be bird's-eye view images.
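Bird's-eye view conversion of this kind is typically a planar homography applied to each pixel. A hedged sketch (the homography `H` and the function are illustrative; the patent does not specify the conversion math):

```python
import numpy as np

def to_birds_eye_point(H, u, v):
    """Map an image pixel (u, v) to bird's-eye view ground coordinates using
    a 3x3 homography H obtained from camera calibration."""
    p = H @ np.array([u, v, 1.0])  # homogeneous coordinates
    return (p[0] / p[2], p[1] / p[2])  # perspective division
```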
(4) A plurality of functions of one constituent element in the above embodiments may be realized by a plurality of constituent elements, and one function of one constituent element may be realized by a plurality of constituent elements. A plurality of functions of a plurality of constituent elements may be realized by one constituent element, and one function realized by a plurality of constituent elements may be realized by one constituent element. Part of the configuration of the above embodiments may be omitted. At least part of the configuration of the above embodiments may be added to or replaced with the configuration of another embodiment described above. All modes included in the technical idea determined solely by the wording recited in the claims are embodiments of the present disclosure.
(5) Besides the image processing apparatus described above, the present disclosure can be realized in various forms, such as a system including the image processing apparatus as a constituent element, a program for causing a computer to function as the image processing apparatus, a non-transitory tangible recording medium such as a semiconductor memory on which the program is recorded, and an image processing method.

Claims (9)

1. An image processing apparatus (1), comprising:
an image acquisition unit (7) configured to acquire, using a camera (21) mounted on a vehicle (27), a plurality of images of the periphery of the vehicle across a time difference;
a conversion unit (9) configured to convert each image of the plurality of images into a bird's-eye view image to generate a plurality of bird's-eye view images;
a signal acquisition unit (11) configured to acquire, from an in-vehicle network (25), a vehicle signal indicating an operating state of the vehicle when the image acquisition unit acquires the images;
a position determination unit (13) configured to determine a positional relationship of the plurality of bird's-eye view images using the vehicle signal acquired by the signal acquisition unit;
a synthesis unit (15) configured to generate a composite image in which at least part of the plurality of bird's-eye view images is arranged according to the positional relationship determined by the position determination unit;
a prediction unit (17) configured to predict, using a change over time of the vehicle signal acquired by the signal acquisition unit, a speed of the vehicle at a time ahead by a signal delay time of the in-vehicle network; and
a stop judging unit (19) configured to judge, in a case where the speed predicted by the prediction unit is equal to or less than a preset threshold value, whether the vehicle has stopped by comparing a plurality of images acquired using the camera at different times,
wherein the position determination unit is configured to determine, in a case where the stop judging unit judges that the vehicle has stopped, the positional relationship for the case where the vehicle has stopped.
2. The image processing apparatus according to claim 1, wherein
the vehicle signal is a signal indicating at least one selected from a speed of the vehicle, a pulse counter, a traveling direction, and a steering angle.
3. The image processing apparatus according to claim 1 or 2, wherein
the stop judging unit is configured to judge whether the vehicle has stopped by comparing pixels at the same coordinates in the plurality of images with each other.
4. The image processing apparatus according to claim 1 or 2, wherein
the stop judging unit is configured to judge whether the vehicle has stopped by identifying a same landmark in each image of the plurality of images and comparing positions of the landmark in the plurality of images.
5. An image processing method, comprising:
acquiring, using a camera mounted on a vehicle, a plurality of images of the periphery of the vehicle across a time difference (S1);
converting each image of the plurality of images into a bird's-eye view image to generate a plurality of bird's-eye view images (S4);
acquiring, from an in-vehicle network, a vehicle signal indicating an operating state of the vehicle when the camera acquires the images (S2);
determining a positional relationship of the plurality of bird's-eye view images using the vehicle signal (S8, S15);
generating a composite image in which at least part of the plurality of bird's-eye view images is arranged according to the positional relationship (S9);
predicting, using a change over time of the vehicle signal, a speed of the vehicle at a time ahead by a signal delay time of the in-vehicle network (S5);
judging, in a case where the predicted speed is equal to or less than a preset threshold value, whether the vehicle has stopped by comparing a plurality of images acquired using the camera at different times (S13, S14); and
determining, in a case where it is judged that the vehicle has stopped, the positional relationship for the case where the vehicle has stopped (S15).
6. The image processing method according to claim 5, wherein
the vehicle signal is a signal indicating at least one selected from a speed of the vehicle, a pulse counter, a traveling direction, and a steering angle.
7. The image processing method according to claim 5 or 6, wherein
whether the vehicle has stopped is judged by comparing pixels at the same coordinates in the plurality of images with each other (S13).
8. The image processing method according to claim 5 or 6, wherein
whether the vehicle has stopped is judged by identifying a same landmark in each image of the plurality of images and comparing positions of the landmark in the plurality of images (S13).
9. A program for causing a computer to function as each unit of the image processing apparatus according to any one of claims 1 to 4.
CN201780017527.7A 2016-03-22 2017-03-22 Image processing apparatus, image processing method and program Withdrawn CN108780564A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016-056975 2016-03-22
JP2016056975A JP6512145B2 (en) 2016-03-22 2016-03-22 IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM
PCT/JP2017/011498 WO2017164245A1 (en) 2016-03-22 2017-03-22 Image processing apparatus, image processing method, and program

Publications (1)

Publication Number Publication Date
CN108780564A true CN108780564A (en) 2018-11-09

Family

ID=59900281

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780017527.7A Withdrawn CN108780564A (en) 2016-03-22 2017-03-22 Image processing apparatus, image processing method and program

Country Status (4)

Country Link
JP (1) JP6512145B2 (en)
CN (1) CN108780564A (en)
DE (1) DE112017001515T5 (en)
WO (1) WO2017164245A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116168362A (en) * 2023-02-27 2023-05-26 小米汽车科技有限公司 Pre-training method and device for vehicle perception model, electronic equipment and vehicle

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1473433A * 2001-06-13 2004-02-04 株式会社电装 Peripheral image processor of vehicle and recording medium
US20070057816A1 (en) * 2005-09-12 2007-03-15 Aisin Aw Co., Ltd. Parking assist method and parking assist apparatus
CN1953553A (en) * 2005-10-17 2007-04-25 三洋电机株式会社 Vehicle driving assistance system
US20120257790A1 (en) * 2011-04-08 2012-10-11 Sony Corporation Image processing apparatus, image processing method, and program
WO2012169353A1 (en) * 2011-06-07 2012-12-13 株式会社小松製作所 Work vehicle vicinity monitoring device
WO2013015130A1 (en) * 2011-07-26 2013-01-31 アイシン精機株式会社 Vehicle surroundings monitoring system
WO2013114617A1 (en) * 2012-02-03 2013-08-08 パイオニア株式会社 Image-display device, method for displaying image, and image-display program
CN104584100A (en) * 2012-07-20 2015-04-29 丰田自动车株式会社 Vehicle-surroundings monitoring device and vehicle-surroundings monitoring system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3663801B2 (en) 1997-01-30 2005-06-22 いすゞ自動車株式会社 Vehicle rear view support device
JP3847547B2 (en) * 2000-10-17 2006-11-22 三菱電機株式会社 Vehicle periphery monitoring support device
JP3886376B2 (en) * 2001-12-26 2007-02-28 株式会社デンソー Vehicle perimeter monitoring system
JP4770755B2 (en) * 2007-02-26 2011-09-14 株式会社デンソー Road sign recognition device
JP6326869B2 (en) * 2014-03-05 2018-05-23 株式会社デンソー Vehicle periphery image display device and vehicle periphery image display method
JP6312565B2 (en) 2014-09-08 2018-04-18 日立アプライアンス株式会社 Hot water storage water heater with bubble function


Also Published As

Publication number Publication date
DE112017001515T5 (en) 2018-12-06
JP6512145B2 (en) 2019-05-15
JP2017175268A (en) 2017-09-28
WO2017164245A1 (en) 2017-09-28

Similar Documents

Publication Publication Date Title
JP6269838B2 (en) Self-position calculation device and self-position calculation method
WO2017069191A1 (en) Calibration apparatus, calibration method, and calibration program
JP4725391B2 (en) Visibility measuring device for vehicle and driving support device
EP2437494A1 (en) Device for monitoring area around vehicle
US11283995B2 (en) Image display apparatus
KR102469650B1 (en) Driver assistance system
JP2007274377A (en) Periphery monitoring apparatus, and program
RU2621826C1 (en) Device for calculation of own position and method of calculation of own status
JP2009118415A (en) Method and apparatus for generating bird's-eye view image
US10965872B2 (en) Image display apparatus
JP4626400B2 (en) Overhead image display device and overhead image display method
CN111819571A (en) Panoramic looking-around system with adjusted and adapted projection surface
US20060115144A1 (en) Image information processing system, image information processing method, image information processing program, and automobile
CN108780564A (en) Image processing apparatus, image processing method and program
JPWO2018042976A1 (en) IMAGE GENERATION DEVICE, IMAGE GENERATION METHOD, RECORDING MEDIUM, AND IMAGE DISPLAY SYSTEM
JP7030607B2 (en) Distance measurement processing device, distance measurement module, distance measurement processing method, and program
JP2006080752A (en) Exposure controller for camera and exposure control method for camera
JP6398218B2 (en) Self-position calculation device and self-position calculation method
WO2021070814A1 (en) Synchronization device, synchronization method, and synchronization program
JP2018092603A (en) Information processing device, imaging device, apparatus control system, movable body, information processing method, and information processing program
CN113022590A (en) Vehicle periphery display device
JP4899151B2 (en) Parallax interpolation processing method and processing apparatus
KR101501678B1 (en) Image Picturing Apparatus for Vehicle using Controlling Exposure and Method thereof
EP3396620B1 (en) Display control device and display control method
JP4650935B2 (en) Vehicle driving support device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20181109