WO2023012049A1 - Method and system for determining direction of a traffic mirror - Google Patents

Method and system for determining direction of a traffic mirror

Info

Publication number
WO2023012049A1
Authority
WO
WIPO (PCT)
Prior art keywords
traffic mirror
vehicle
traffic
mirror
placement angle
Prior art date
Application number
PCT/EP2022/071315
Other languages
French (fr)
Inventor
Noriaki Itagaki
Yoshitomo ASAI
Original Assignee
Continental Autonomous Mobility Germany GmbH
Priority date
Filing date
Publication date
Application filed by Continental Autonomous Mobility Germany GmbH filed Critical Continental Autonomous Mobility Germany GmbH
Publication of WO2023012049A1 publication Critical patent/WO2023012049A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)

Abstract

Present disclosure provides a method and system (200) for determining direction of a traffic mirror (106). The method comprises detecting a region of a traffic mirror (106) from a real-time image of a field of view (108) of a vehicle (104). The real-time image of the field of view (108) of the vehicle (104) is captured during movement of the vehicle (104). Further, the method comprises identifying one or more edges of the traffic mirror (106) by processing the region of the traffic mirror (106) detected in the real-time image. Thereafter, the method comprises identifying a shape of the traffic mirror (106) based on analysis of the one or more edges. Furthermore, the method comprises measuring a placement angle of the traffic mirror (106) using a predefined angle measurement technique corresponding to the shape of the traffic mirror (106). Finally, the method comprises determining the direction of the traffic mirror (106) based on value of the placement angle of the traffic mirror (106).

Description

METHOD AND SYSTEM FOR DETERMINING DIRECTION OF A TRAFFIC MIRROR
TECHNICAL FIELD:
The present disclosure is generally related to autonomous vehicles and specifically related to a method and system for determining the direction and angle of a traffic mirror using image analysis.
BACKGROUND:
Traffic mirrors or road safety mirrors are used for enhancing traffic safety in streets, road junctions and curved or bent lanes. Traffic mirrors provide better visibility into blind spots on the roads and help prevent common hazards or accidents. In general, traffic mirrors help both drivers and pedestrians see whether another vehicle is approaching from the opposite side.
However, in order to further mitigate the risk of accidents, it is important that the traffic mirrors are clearly visible to the drivers. Moreover, for vehicles integrated with Advanced Driver Assistance System (ADAS) and Autonomous Driving (AD) mechanisms, it is also important to estimate direction and angle of the traffic mirrors to accurately predict direction of the traffic flow.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
SUMMARY:
Aspects of the present disclosure provide a method and a system for determining the direction of a traffic mirror by analyzing images of the traffic mirror.
One aspect of this disclosure provides a method for determining the direction of a traffic mirror. The method comprises detecting a region of a traffic mirror from a real-time image of a field of view of a vehicle. In an embodiment, the real-time image of the field of view of the vehicle is captured by an image sensor in the vehicle, during movement of the vehicle. Further, the method comprises identifying one or more edges of the traffic mirror by processing the region of the traffic mirror detected in the real-time image. Upon identifying the one or more edges, the method comprises identifying a shape of the traffic mirror based on analysis of the one or more edges. Once the shape of the traffic mirror is identified, the method comprises measuring a placement angle of the traffic mirror using a predefined angle measurement technique corresponding to the shape of the traffic mirror. Subsequently, the method comprises determining the direction of the traffic mirror based on the value of the placement angle of the traffic mirror.
Another aspect of the disclosure provides a safety monitoring system for determining the direction of a traffic mirror. The system comprises a processor and at least one memory coupled to the processor. The at least one memory stores instructions that are executable by the processor and cause the processor to detect a region of a traffic mirror from a real-time image of a field of view of a vehicle. In an embodiment, the real-time image of the field of view of the vehicle is captured by an image sensor of the vehicle, during movement of the vehicle. Further, the processor is configured to identify one or more edges of the traffic mirror by processing the region of the traffic mirror detected in the real-time image. After identifying the one or more edges, the processor identifies a shape of the traffic mirror based on analysis of the one or more edges. Thereafter, the processor measures a placement angle of the traffic mirror using a predefined angle measurement technique corresponding to the shape of the traffic mirror. Finally, the processor determines the direction of the traffic mirror based on the value of the placement angle of the traffic mirror.
Embodiments of the disclosure according to the above method and system may bring about several advantages. Firstly, the method and system of the disclosure are useful for determining a direction of the traffic mirrors on the roads. In some embodiments, direction and/or angle of the traffic mirror is an important input for Advanced Driver Assistance System (ADAS) and Autonomous Driving (AD) integrated vehicles for accurate risk prediction. Consequently, determining the direction of the traffic mirror enhances traffic safety and mitigates risk of traffic hazards or accidents.
The foregoing summary is illustrative only and is not intended to be in any way limiting. The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS:
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are used throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
FIG. 1 is an exemplary schematic diagram illustrating a vehicle capturing real-time image of a traffic mirror according to one embodiment of the present disclosure.
FIG. 2 shows a block diagram of a safety monitoring system for determining direction of a traffic mirror according to one embodiment of the present disclosure.
FIG. 3A - FIG. 3D illustrate sequence of steps involved in the method of determining direction of the traffic mirror according to one embodiment of the present disclosure.
FIG. 4A and 4B illustrate exemplary variations of the present disclosure according to alternative embodiments of the present disclosure.
FIG. 5 is a flow diagram illustrating an exemplary method for determining direction of the traffic mirror according to one embodiment of the present disclosure. It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
DETAILED DESCRIPTION:
In the following disclosure, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the specific forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the spirit and the scope of the disclosure.
The terms “comprises”, “comprising”, “includes”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises... a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
FIG. 1 is an exemplary schematic diagram illustrating a method of determining direction of a traffic mirror 106 using a real-time image of the traffic mirror 106 according to one embodiment of the present disclosure. The traffic mirror 106 may be a road safety mirror or a traffic safety mirror, which is placed at a predetermined location on a road 102 to avoid traffic collisions. Further, the real-time image of the traffic mirror 106 may be an image captured by a vehicle 104 in real-time, while the vehicle 104 is moving on the road 102. In an embodiment, suppose the safety monitoring system of the present disclosure has been integrated into the vehicle 104 shown in FIG. 1. Suppose the vehicle 104 is moving on the road 102, leading to a crossroad junction through which one or more other vehicles 110A and 110B pass by. Suppose a traffic mirror 106 has been positioned at the crossroad junction, such that the traffic mirror 106 displays the one or more other vehicles 110A, 110B converging and/or diverging from either side of the road 102. It must be appreciated by a person skilled in the art that the one or more other vehicles 110A, 110B shown in FIG. 1 are for representation purposes only; the reflections may also include other types of obstacles and hazards on the road 102 including, without limitation, potholes, pedestrians, and the like.
In an embodiment, the vehicle 104 may be, without limitation, an autonomous or self-driving vehicle, or an assisted-driving vehicle with Advanced Driver-Assistance Systems (ADAS). To avoid possibilities of collisions, accidents, or other types of hazards, it may be necessary for the vehicle 104 to read and analyze the reflections displayed on the traffic mirror 106. In addition, for accurately predicting the position of an incoming vehicle or hazard, the vehicle 104 may need to estimate the right direction and the angle in which the traffic mirror 106 is placed on the road 102. For this purpose, the vehicle 104 may continuously capture real-time images of the traffic mirror 106 when the traffic mirror 106 appears in the Field of View (FOV) 108 of image sensors associated with the vehicle 104. In an embodiment, the vehicle 104 may capture the real-time images of the traffic mirror 106 using the image sensors configured at the front portion of the vehicle 104.
In an embodiment, for determining the direction of the traffic mirror 106 for the reasons stated above, the safety monitoring system on the vehicle 104 may perform one or more Artificial Intelligence (AI) and Deep Learning (DL) based operations on the real-time image of the traffic mirror 106. For instance, a processor associated with the safety monitoring system may process the real-time image for detecting a region of a traffic mirror 106 in the real-time image. The region of the traffic mirror 106 in the real-time image may be a portion of the real-time image representing the traffic mirror 106. After detecting the region of the traffic mirror 106, the processor may identify one or more edges of the traffic mirror 106 by further processing the region of the traffic mirror 106 detected in the real-time image. The one or more edges of the traffic mirror 106 may represent a boundary of the traffic mirror 106 in the real-time image. After identifying the one or more edges, the processor may identify a shape of the traffic mirror 106 based on analysis of the one or more edges. Subsequently, the processor may measure a placement angle of the traffic mirror 106 using a predefined angle measurement technique corresponding to the shape of the traffic mirror 106. Finally, the processor may determine the direction of the traffic mirror 106 based on the value of the placement angle of the traffic mirror 106. A detailed illustration of the various steps and processes involved in processing the real-time image for determining the direction of the traffic mirror 106 is provided in further sections of the present disclosure.
In an embodiment, once the direction of the traffic mirror 106 has been determined, the direction of the traffic mirror 106 may be used as an input for accurately predicting the position, speed, and movement of the one or more other vehicles 110A, 110B on the road 102. For instance, in the case of autonomous/self-driving vehicles, the direction information of the traffic mirror 106 may be used to carefully scan and estimate the position and speed of the incoming vehicle (that is, one of the one or more other vehicles 110A, 110B) or other obstacles. Consequently, this helps in reducing the processing load on the autonomous path monitoring control units of the vehicle 104, since the control units may easily alter the course of driving based on the direction of the traffic mirror 106. Alternatively, in the case of driver-assisted or ADAS-equipped vehicles, the direction information of the traffic mirror 106 may be used for alerting the driver of the vehicle 104 about the position and speed of the one or more other vehicles 110A, 110B. Subsequently, the course of the vehicle 104 may be suitably altered to mitigate any risks.
FIG. 2 shows a block diagram of a safety monitoring system 200 for determining direction of a traffic mirror 106 according to one embodiment of the present disclosure.
In an embodiment, the safety monitoring system 200 may be configured within a vehicle 104 for determining direction of the traffic mirror 106 encountered during movement of the vehicle 104. Alternatively, the safety monitoring system 200 may be remotely connected to the vehicle 104. In an embodiment, the safety monitoring system 200 may comprise, without limiting to, a processor 202, a memory 204 and an I/O interface 206.
In an embodiment, the processor 202 may, for example, be a microcontroller or Graphics Processing Unit (GPU) capable of accessing the memory 204 to store information and execute instructions stored therein. Alternatively, the processor 202 may be a part of an Engine Control Unit (ECU) in the vehicle 104. In an embodiment, the processor 202 and memory 204 may be integrated on a single integrated circuit. In an embodiment, the memory 204 stores information accessible by the processor 202, such as instructions executable by the processor 202 and data which may be stored, retrieved, or otherwise used by the processor 202. For example, the processor 202 may execute a method for determining direction of a traffic mirror 106 according to some embodiments of the instant disclosure based on instructions stored in the memory 204. As an example, the data stored in the memory 204 may include, without limiting to, reference images of the one or more traffic mirrors, the measured placement angle of the traffic mirror 106 and the like. In an embodiment, the I/O interface 206 of the safety monitoring system 200 may be used for interfacing the safety monitoring system 200 with one or more other components of the vehicle 104.
In an embodiment, the safety monitoring system 200 may comprise one or more functional modules including an image sensor module 208, a region detection module 210, an edge detection module 212, a shape identification module 214, an angle measurement module 216 and a user interface 218. In an embodiment, each of the above modules may be communicatively coupled to each of the other modules via a Controller Area Network (CAN) bus. Further, each of the modules may be controlled and supervised by the processor 202 based on the instructions and data stored in the memory 204 for determining the direction of the traffic mirror 106.
In an embodiment, the image sensor module 208 may comprise one or more image sensors. The image sensors may be mounted externally at different parts of the vehicle 104. Further, the image sensors may be configured to continuously capture real-time images in the Field of View (FOV) 108 ahead of the vehicle 104, as long as the vehicle 104 is in movement. Alternatively, the image sensors may capture the real-time images only when a traffic mirror 106 has been detected in the FOV 108 of the vehicle 104. As an example, the image sensors may be vision image sensors such as mono cameras or wide-angle fisheye cameras mounted at the front bumper of the vehicle 104. In an embodiment, the image sensor module 208 may comprise other types and numbers of image sensors than the ones mentioned above, based on the requirements of the vehicle 104.
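For illustration only, a minimal capture loop of this kind could be sketched in Python with OpenCV as follows; the camera index, the frame-selection policy and the downstream call are assumptions made for the sketch and are not part of the disclosed system.

import cv2

capture = cv2.VideoCapture(0)   # hypothetical index of a front-mounted camera
while capture.isOpened():
    ok, frame = capture.read()  # one real-time image of the FOV ahead of the vehicle
    if not ok:
        break
    # Each frame would be handed to the region detection module described below,
    # e.g. a hypothetical detect_mirror_region(frame); whether every frame or only
    # selected frames are processed is an implementation choice.
capture.release()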
In an embodiment, the region detection module 210 may be configured for detecting a region of the traffic mirror 106 from the real-time image of the FOV 108 of the vehicle 104. As an example, the region detection module 210 may detect the region of the traffic mirror 106 using one or more predetermined computer vision techniques such as single-object localization or object detection. Alternatively, the region detection module 210 may use any other Artificial Intelligence (AI) and Deep Learning (DL) technique for localizing and detecting the region of the traffic mirror 106 in the real-time image. In an embodiment, the input to the region detection module 210 may be a raw real-time image captured by the image sensor module 208 and the output of the region detection module 210 may be a processed image highlighting the region of the traffic mirror 106 in the real-time image. An exemplary output of the region detection module 210 is shown in image 302 of FIG. 3A.
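As a purely illustrative sketch of the single-object localization step, a generic pretrained detector from torchvision could be used to localize and crop the candidate region. Note that a COCO-pretrained model has no traffic-mirror class, so a real system would fine-tune a detector on traffic-mirror images; the function name and the highest-score selection rule below are assumptions made for the sketch.

import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# Illustrative only: this COCO-pretrained detector knows nothing about traffic mirrors
# and would have to be fine-tuned; weights="DEFAULT" assumes torchvision >= 0.13.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_mirror_region(image_path):
    """Return the crop of the highest-scoring detection as the candidate mirror region."""
    image = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        prediction = model([to_tensor(image)])[0]
    if len(prediction["boxes"]) == 0:
        return None
    best = prediction["scores"].argmax().item()
    x1, y1, x2, y2 = prediction["boxes"][best].round().int().tolist()
    return image.crop((x1, y1, x2, y2))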
In an embodiment, the edge detection module 212 may be configured for identifying one or more edges of the traffic mirror 106 by processing the region of the traffic mirror 106 detected in the real-time image. In an embodiment, the edge detection module 212 may identify the one or more edges using a pretrained Convolution Neural Network (CNN). Alternatively, the edge detection module 212 may use any other Al and DL technique for identifying the one or more edges corresponding to the region of the traffic mirror 106. An exemplary output of the edge detection module 212 is shown in image 304 of FIG. 3B.
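The disclosure does not specify the CNN edge detector itself; as a simple stand-in, a classical Canny edge detector over the cropped region could be sketched as follows, where the blur kernel and thresholds are illustrative assumptions.

import cv2

def detect_mirror_edges(region_bgr):
    """Edge map of the cropped mirror region; a classical stand-in for the CNN edge detector."""
    gray = cv2.cvtColor(region_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    return cv2.Canny(blurred, 50, 150)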
In an embodiment, the shape identification module 214 may be configured for identifying the shape of the traffic mirror 106 by analysing the one or more detected edges of the traffic mirror 106. The shape identification module 214 may identify the shape by determining a basic geometric shape that may be formed using the one or more edges identified in the real-time image. As an example, the shape of the traffic mirror 106 may be identified as one of circular, elliptical or polygonal.
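One possible way to reduce the detected edges to a basic geometric shape is sketched below, under the assumptions that the largest contour belongs to the mirror and that a simple vertex-count heuristic separates polygonal from rounded shapes; both are illustrative choices, not the disclosed technique.

import cv2

def identify_mirror_shape(edge_map):
    """Classify the dominant contour of the edge map as polygonal, circular or elliptical."""
    contours, _ = cv2.findContours(edge_map, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None
    contour = max(contours, key=cv2.contourArea)
    perimeter = cv2.arcLength(contour, True)
    approx = cv2.approxPolyDP(contour, 0.02 * perimeter, True)
    if len(approx) <= 6 or len(contour) < 5:   # few vertices: treat as polygonal (e.g. square)
        return "polygonal", contour
    _, axes, _ = cv2.fitEllipse(contour)       # otherwise distinguish circle from ellipse
    minor, major = sorted(axes)
    shape = "circular" if minor / major > 0.95 else "elliptical"
    return shape, contour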
In an embodiment, the angle measurement module 216 may be configured for measuring a placement angle of the traffic mirror 106 based on analysis of the shape of the traffic mirror 106. In an embodiment, the placement angle may be the angle between the traffic mirror 106 and a horizontal axis representing the road 102 on which the vehicle 104 is moving. As an example, the placement angle may be 90 degrees when the traffic mirror 106 is placed perpendicularly and/or exactly in an upright position on the road 102. The placement angle may vary when the traffic mirror 106 is tilted and/or bent in a particular direction. In an embodiment, the angle measurement module 216 may use different angle measurement techniques for measuring the placement angle based on the identified shape of the traffic mirror 106. As an example, the angle measurement technique used for measuring the placement angle of a traffic mirror 106 of elliptical shape may be an existing technique such as, without limiting to, the ‘findContours’ function of OpenCV®. Additionally, the angle measurement module 216 may use a predetermined auto angle compensation technique for handling any irregularities in the identified shape before measuring the placement angle. As an example, one predetermined auto angle compensation technique may be to rotate the identified shape in a clockwise or anticlockwise direction by a predefined angle.

In an embodiment, the user interface 218 may be used for displaying and/or notifying the determined direction and angle of the traffic mirror 106 to a driver or a passenger in the vehicle 104. Further, the user interface 218 may be used for communicating audio and/or visual messages to the driver/passenger of the vehicle 104. In one variation, the user interface 218 may comprise one or more components such as an instrument panel, an electronic display, and an audio system. The instrument panel may be a dashboard or a centre display which displays, for example, a speedometer, tachometer, and warning light indicators. The user interface 218 may also comprise an electronic display such as an infotainment or heads-up display for communicating other visual messages to the driver/passenger and an audio system for playing audio messages, warnings, or music.
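Returning to the angle measurement module 216, the placement angle of an elliptical mirror can be illustrated with OpenCV's ellipse fitting. The mapping of OpenCV's reported rotation to the placement-angle convention above (tilted mirror axis versus the horizontal road axis) is an assumption that would have to be validated against the camera geometry, since image coordinates have the y-axis pointing downwards.

import cv2

def measure_placement_angle_ellipse(contour):
    """Placement angle, in degrees within [0, 180), of an elliptical mirror contour."""
    (_, _), (width, height), angle = cv2.fitEllipse(contour)
    # fitEllipse reports the rotation of the axis associated with `width`; if that is
    # the minor axis, the tilted major axis is perpendicular to it.
    return angle if width >= height else (angle + 90.0) % 180.0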
FIG. 3A - FIG. 3D illustrate the sequence of steps involved in the method of determining direction of the traffic mirror 106 according to one embodiment of the present disclosure.
FIG. 3A shows an exemplary real-time image 300 of the Field of View (FOV) 108 of the vehicle 104, captured by an image sensor module 208 of the vehicle 104, while the vehicle 104 is in movement and a traffic mirror 106 has been detected in the FOV 108 of the vehicle 104. In an embodiment, after capturing the real-time image 300, the safety monitoring system 200 configured in the vehicle 104 may process the real-time image for detecting a region of the traffic mirror 106 in the real-time image 300. A cropped and/or extracted portion of the real-time image 300, comprising the region of the traffic mirror 106 is shown in image 302 of FIG. 3A.
In an embodiment, once the region of the traffic mirror 106 is obtained, the safety monitoring system 200 may identify one or more edges of the traffic mirror 106 as shown in image 304 of FIG. 3B. Subsequently, the safety monitoring system 200 may identify the shape of the traffic mirror 106 based on analysis of the one or more edges detected in the image 304. In the instant example, since the traffic mirror 106, as shown in image 300 of FIG. 3A, is elliptical in shape, the safety monitoring system 200 may detect an elliptical boundary from the one or more edges, as shown in image 306 of FIG. 3C. In an embodiment, after detecting the shape of the traffic mirror 106, the safety monitoring system 200 may measure a placement angle of the traffic mirror 106 by calculating an angle of tilt and/or turn of the traffic mirror 106 with respect to a reference horizontal axis 309 corresponding to the vehicle 104. For example, as shown in image 308 of FIG. 3D, the reference horizontal axis 309 may be the x-coordinate axis, which represents the plane of the road 102 and/or the vehicle 104. Further, as shown in image 308, the placement angle may be calculated as the angle between a tilted axis of the detected shape (marked with dotted lines in image 308) and the reference horizontal axis 309. In the image 308, the placement angle is represented by the angle ‘θ’.
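Equivalently, if two points on the tilted axis of the detected shape are available (for example, the endpoints of a fitted major axis), the placement angle θ relative to the reference horizontal axis 309 can be computed directly; the sign handling below assumes image coordinates with the y-axis pointing downwards.

import math

def placement_angle_from_axis(x1, y1, x2, y2):
    """Angle, in degrees within [0, 180), between the tilted axis through (x1, y1) and
    (x2, y2) and the horizontal reference axis; y is negated because image rows grow
    downwards."""
    return math.degrees(math.atan2(-(y2 - y1), x2 - x1)) % 180.0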
In an embodiment, once the placement angle has been measured, the safety monitoring system 200 may determine the direction of the traffic mirror 106 based on the value of the placement angle. In an embodiment, the direction of the traffic mirror 106 may be determined as ‘right’ when the value of the placement angle is less than 90 degrees. In an alternate embodiment, the direction of the traffic mirror 106 may be determined as ‘left’ when the value of the placement angle is a value between 90 degrees and 180 degrees. In the instant example, the direction of the traffic mirror 106 may be determined as ‘left’ since the placement angle ‘θ’ is a value between 90 degrees and 180 degrees. The same steps may be repeated for determining the placement angle of a traffic mirror 106 of any shape and size.
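The decision rule just described reduces to a simple threshold on the placement angle; a placement angle of exactly 90 degrees corresponds to an upright mirror, and its assignment below is an assumption made for the sketch.

def mirror_direction(placement_angle_deg):
    """Map the placement angle to the mirror direction: less than 90 degrees means 'right',
    between 90 and 180 degrees means 'left'."""
    if not 0.0 <= placement_angle_deg <= 180.0:
        raise ValueError("placement angle expected in [0, 180] degrees")
    return "right" if placement_angle_deg < 90.0 else "left"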
FIG. 4A illustrates an alternative embodiment in which the traffic mirror 106 is tilted right. Here, as shown in image 404 of FIG. 4A, the placement angle ‘θ’ may be a value less than 90 degrees, and hence, the direction of the traffic mirror 106 may be determined as ‘right’.
FIG. 4B illustrates yet another embodiment of the proposed disclosure, for detecting the direction of a traffic mirror 106 which has a polygonal shape (for example, a square shape). In an embodiment, image 406 of FIG. 4B may be a real-time image of the traffic mirror 106. Image 408 may represent the region of the traffic mirror 106 extracted from the real-time image 406. In an embodiment, for traffic mirrors which are polygonal in shape, the placement angle may be measured by running an auto angle compensation technique on the region of the traffic mirror 106 shown in image 408. For example, since the region of the traffic mirror 106 in the image 408 is tilted right, the auto compensation technique may rotate the region of the traffic mirror 106 in an anticlockwise direction to compensate the tilted angle of the region of the traffic mirror 106. As an example, the image 408 may be rotated anticlockwise by 8 degrees to compensate the tilted angle and obtain a compensated image 410. Further, the placement angle of the traffic mirror 106 may be measured as the magnitude of the angle by which the image 408 has been rotated. In the instant example, the placement angle may be measured as 8 degrees since the image 408 was rotated by 8 degrees. Finally, the direction of the traffic mirror 106 of image 406 may be determined as ‘right’ since the placement angle is a value less than 90 degrees.
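For a polygonal mirror, the auto angle compensation could be sketched with the minimum-area bounding rectangle of the dominant contour, as below. The sign convention of cv2.minAreaRect differs between OpenCV versions, so the clockwise/anticlockwise mapping is an assumption to verify, and the edge-detection thresholds are illustrative.

import cv2

def measure_placement_angle_polygon(region_gray):
    """Estimate the tilt of a polygonal mirror region, rotate the region back by that tilt,
    and return the magnitude of the rotation as the placement angle together with the
    compensated image."""
    edges = cv2.Canny(cv2.GaussianBlur(region_gray, (5, 5), 0), 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, region_gray
    contour = max(contours, key=cv2.contourArea)
    _, _, rect_angle = cv2.minAreaRect(contour)
    tilt = rect_angle if rect_angle <= 45.0 else rect_angle - 90.0   # fold into (-45, 45]
    h, w = region_gray.shape[:2]
    rotation = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), tilt, 1.0)
    compensated = cv2.warpAffine(region_gray, rotation, (w, h))
    return abs(tilt), compensated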
In an embodiment, the process illustrated above may be used for measuring the placement angle of the traffic mirrors of any other size and shape.
FIG. 5 is a flow diagram illustrating an exemplary method 500 for determining direction of the traffic mirror 106 according to one embodiment of the present disclosure.
In an embodiment, the method 500 may be executed sequentially or in parallel with other embodiments of this disclosure for determining the direction of a traffic mirror 106. For instance, based on the requirements and the nature of the traffic, traffic mirrors may be used in different sizes and shapes including, but not limited to, elliptical, square, or other polygonal shapes. As such, two or more processes may be executed contemporaneously or sequentially to determine the direction of the traffic mirrors irrespective of their size and shape.
The operations of the method 500 will be described with reference to the safety monitoring system 200 of FIG. 2. However, it will be appreciated that other similar systems may also be suitable. The method 500 starts at step 502 and may be initiated upon the ignition of a vehicle 104 being switched on. Other events for initiating the start of the method 500 may also be suitable, and the method may also be initiated on demand. In step 502, the method 500 causes the processor 202 in the safety monitoring system 200 to start detecting a region of a traffic mirror 106 from a real-time image of a Field of View (FOV) 108 of a vehicle 104. In an embodiment, the vehicle 104 may be an autonomous vehicle or a vehicle 104 comprising an Advanced Driver Assistance System (ADAS). In an embodiment, the vehicle 104 may capture the real-time image of the FOV 108 of the vehicle 104, comprising the traffic mirror 106, using an image sensor configured in the vehicle 104. In an embodiment, the region of the traffic mirror 106 may be detected using a predetermined Artificial Intelligence (AI) technique such as, without limiting to, a pretrained Convolutional Neural Network (CNN).
In step 504, the method 500 causes the processor 202 in the safety monitoring system 200 to identify one or more edges of the traffic mirror 106 by processing the region of the traffic mirror 106 detected in the real-time image. In an embodiment, the one or more edges of the traffic mirror 106 may be identified using predetermined edge detection techniques.
In step 506, the method 500 causes the processor 202 in the safety monitoring system 200 to identify a shape of the traffic mirror 106 based on analysis of the one or more edges identified. In an embodiment, the shape of the traffic mirror 106 may be at least one of, without limitation, circular, elliptical or polygonal.
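By way of illustration only, steps 504 and 506 could be combined in a sketch such as the following (Python with OpenCV). The Canny thresholds, the vertex-count heuristic for polygons, and the aspect-ratio cut-off that separates circular from elliptical outlines are illustrative assumptions rather than values taken from the disclosure.

```python
import cv2
import numpy as np

def classify_mirror_shape(mirror_crop: np.ndarray) -> str:
    """Identify the mirror outline as 'circular', 'elliptical' or 'polygonal'
    from a grayscale crop of the region detected in step 502."""
    # Step 504: identify edges within the mirror region.
    edges = cv2.Canny(mirror_crop, 50, 150)

    # Step 506: analyse the dominant contour formed by those edges.
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return "unknown"
    outline = max(contours, key=cv2.contourArea)

    # A polygonal outline collapses to a few vertices under polygonal
    # approximation, whereas circles and ellipses do not.
    approx = cv2.approxPolyDP(outline, 0.02 * cv2.arcLength(outline, True), True)
    if len(approx) <= 6:
        return "polygonal"

    # Separate circular from elliptical outlines by the axis ratio of a fitted ellipse.
    _, axes, _ = cv2.fitEllipse(outline)
    minor_axis, major_axis = sorted(axes)
    return "circular" if minor_axis / major_axis > 0.9 else "elliptical"
```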
In step 508, the method 500 causes the processor 202 in the safety monitoring system 200 to measure a placement angle of the traffic mirror 106. In an embodiment, the placement angle may be an angle of placement of the traffic mirror 106 with respect to the road 102. In other words, the placement angle may be the tilted angle of the traffic mirror 106 with respect to the road 102. In one embodiment, the placement angle may be measured using a predefined angle measurement technique corresponding to the shape of the traffic mirror 106. In an embodiment, measuring the placement angle may further include applying an angle compensation technique on the identified shape before measuring the placement angle. The compensation technique may be selected based on the identified shape of the traffic mirror 106.

In step 510, the method 500 causes the processor 202 in the safety monitoring system 200 to determine the direction of the traffic mirror 106 based on the value of the placement angle of the traffic mirror 106. In an embodiment, the direction of the traffic mirror 106 may be determined as ‘right’ when the value of the placement angle is less than 90 degrees, that is, between zero and 90 degrees. Similarly, the direction of the traffic mirror 106 may be determined as ‘left’ when the value of the placement angle is between 90 degrees and 180 degrees.
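By way of illustration only, steps 508 and 510 could be sketched as follows for an elliptical mirror (Python with OpenCV). Using the rotation angle of a fitted ellipse as the placement angle, together with the edge-detection thresholds, is an assumption of this sketch; the disclosure leaves the angle measurement technique to be chosen per shape.

```python
import cv2
import numpy as np

def mirror_direction(mirror_crop: np.ndarray) -> str:
    """Measure the placement angle of an elliptical mirror region and map it to a
    direction: 'right' below 90 degrees, 'left' between 90 and 180 degrees."""
    edges = cv2.Canny(mirror_crop, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea)

    # cv2.fitEllipse reports the ellipse rotation in degrees within [0, 180),
    # used here as the placement angle relative to the horizontal reference axis.
    _, _, placement_angle = cv2.fitEllipse(outline)

    # Step 510: threshold the placement angle at 90 degrees.
    return "right" if placement_angle < 90.0 else "left"
```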
The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise. The enumerated listing of items does not imply that any or all the items are mutually exclusive, unless expressly specified otherwise.
The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise. A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be clear that a single device/article may be used in place of the more than one device or article, or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself. Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
REFERRAL NUMERALS:


CLAIMS:
1. A method for determining direction of a traffic mirror (106), the method comprising:
detecting, by a processor (202), a region of a traffic mirror (106) from a real-time image of a field of view (108) of a vehicle (104), wherein the real-time image of the field of view (108) of the vehicle (104) is captured during movement of the vehicle (104);
identifying, by the processor (202), one or more edges of the traffic mirror (106) by processing the region of the traffic mirror (106) detected in the real-time image;
identifying, by the processor (202), a shape of the traffic mirror (106) based on analysis of the one or more edges;
measuring, by the processor (202), a placement angle of the traffic mirror (106) using a predefined angle measurement technique corresponding to the shape of the traffic mirror (106); and
determining, by the processor (202), the direction of the traffic mirror (106) based on value of the placement angle of the traffic mirror (106).
2. The method of claim 1, wherein the vehicle (104) is an autonomous vehicle or a vehicle (104) comprising Advanced Driver Assistance Systems (ADAS).
3. The method of claim 1, wherein the region of the traffic mirror (106) is detected using a predetermined Artificial Intelligence (AI) technique.

4. The method of claim 1, wherein the shape of the traffic mirror (106) is at least one of circular, elliptical or polygonal.

5. The method of claim 1, wherein measuring the placement angle further comprises applying, by the processor (202), an angle compensation technique on the identified shape before measuring the placement angle.
6. The method of claim 5, wherein the placement angle is measured with respect to a reference horizontal axis corresponding to the vehicle (104).
7. The method of claim 1, wherein:
the direction of the traffic mirror (106) is determined as ‘right’ when the value of the placement angle is less than 90 degrees; and
the direction of the traffic mirror (106) is determined as ‘left’ when the value of the placement angle is between 90 degrees and 180 degrees.
8. A safety monitoring system (200) for determining direction of a traffic mirror (106), the system comprising:
a processor (202); and
at least one memory (204) coupled to the processor (202) and storing instructions executable by the processor (202), causing the processor (202) to:
detect a region of a traffic mirror (106) from a real-time image of a field of view (108) of a vehicle (104), wherein the real-time image of the field of view (108) of the vehicle (104) is captured during movement of the vehicle (104);
identify one or more edges of the traffic mirror (106) by processing the region of the traffic mirror (106) detected in the real-time image;
identify a shape of the traffic mirror (106) based on analysis of the one or more edges;
measure a placement angle of the traffic mirror (106) using a predefined angle measurement technique corresponding to the shape of the traffic mirror (106); and
determine the direction of the traffic mirror (106) based on value of the placement angle of the traffic mirror (106).
9. The system (200) of claim 8, wherein the vehicle is an autonomous vehicle or a vehicle comprising Advanced Driver Assistance Systems (ADAS).
10. The system (200) of claim 8, wherein the region of the traffic mirror (106) is detected using a predetermined Artificial Intelligence (AI) technique.
11. The system (200) of claim 8, wherein the shape of the traffic mirror (106) is at least one of circular, elliptical or polygonal.
12. The system (200) of claim 8, wherein measuring the placement angle further comprises applying an angle compensation technique on the identified shape before measuring the placement angle.
13. The system (200) of claim 12, wherein the placement angle is measured with respect to a reference horizontal axis corresponding to the vehicle (104).
14. The system (200) of claim 8, wherein:
the direction of the traffic mirror (106) is determined as ‘right’ when the value of the placement angle is less than 90 degrees; and
the direction of the traffic mirror (106) is determined as ‘left’ when the value of the placement angle is between 90 degrees and 180 degrees.
15. A non-transitory computer-readable storage medium comprising computer-readable instructions for carrying out the methods according to any of claims 1-7.
16. A vehicle (104) comprising a safety monitoring system (200) for determining direction of a traffic mirror (106) according to any of claims 8-14.
PCT/EP2022/071315 2021-08-03 2022-07-29 Method and system for determining direction of a traffic mirror WO2023012049A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB2111145.5 2021-08-03
GB2111145.5A GB2609464A (en) 2021-08-03 2021-08-03 Method and system for determining direction of a traffic mirror

Publications (1)

Publication Number Publication Date
WO2023012049A1 true WO2023012049A1 (en) 2023-02-09

Family

ID=77651396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/071315 WO2023012049A1 (en) 2021-08-03 2022-07-29 Method and system for determining direction of a traffic mirror

Country Status (2)

Country Link
GB (1) GB2609464A (en)
WO (1) WO2023012049A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2624627A (en) * 2022-11-22 2024-05-29 Continental Autonomous Mobility Germany GmbH A system and method of detecting curved mirrors within an image

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3249627A1 (en) * 2015-01-22 2017-11-29 Pioneer Corporation Driving assistance device and driving assistance method
DE102016215115A1 (en) * 2016-08-12 2018-02-15 Continental Automotive Gmbh Device and method for detecting road users in an environment of an ego vehicle
US20180342046A1 (en) * 2017-05-23 2018-11-29 Toyota Jidosha Kabushiki Kaisha Providing Traffic Mirror Content to a Driver
DE102019213791A1 (en) * 2019-09-11 2021-03-11 Robert Bosch Gmbh Use of information from traffic mirrors for automated driving

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MIDORI MORI ET AL: "Ergonomics Study of Direct and Indirect Visibility Evaluation at Uncontrolled Intersections Based on Three-Dimensional Computer Simulation", 21 July 2013, ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, pages: 70 - 77, XP047033693 *

Also Published As

Publication number Publication date
GB202111145D0 (en) 2021-09-15
GB2609464A (en) 2023-02-08

Similar Documents

Publication Publication Date Title
US10878256B2 (en) Travel assistance device and computer program
EP2492888B1 (en) Lane departure warning apparatus and system
CN112349144B (en) Monocular vision-based vehicle collision early warning method and system
JP4788426B2 (en) Vehicle display system
US20170293895A1 (en) Device and method for calculating damage repair cost
JP2009211624A (en) Driving support device, driving support method, and computer program
JP6129268B2 (en) Vehicle driving support system and driving support method
JP2007102691A (en) View-field support apparatus for vehicle
JP2000097714A (en) Car navigation apparatus
WO2022062000A1 (en) Driver assistance method based on transparent a-pillar
WO2018149539A1 (en) A method and apparatus for estimating a range of a moving object
WO2023012049A1 (en) Method and system for determining direction of a traffic mirror
JP2008134165A (en) Navigation system
CN108725319B (en) Image type car backing guidance method
JP2010176592A (en) Driving support device for vehicle
JP2008090683A (en) Onboard navigation device
JP2024528992A (en) Method and system for determining traffic mirror orientation - Patents.com
US20240247937A1 (en) Method and system for creating a virtual lane for a vehicle
US10380437B2 (en) Systems and methods for traffic sign assistance
CN110763244B (en) Electronic map generation system and method
JP2005205983A (en) Apparatus for visually recognizing surrounding of own vehicle
JP2017531268A5 (en)
JP2024528991A (en) Method and system for creating virtual lanes for vehicles
JP2020131957A (en) Vehicle, display method and program
EP4064220A1 (en) Method, system and device for detecting traffic light for vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22757585

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2024506705

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE