US20180122241A1 - Method and apparatus for warning of objects - Google Patents
Method and apparatus for warning of objects
- Publication number
- US20180122241A1 (application Ser. No. 15/342,795)
- Authority
- US
- United States
- Prior art keywords
- objects
- graphical indicator
- graphical
- condition
- displaying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/178—Warnings
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/16—Type of output information
- B60K2360/179—Distances to obstacles or vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0141—Head-up displays characterised by optical features characterised by the informative content of the display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
Definitions
- Apparatuses and methods consistent with exemplary embodiments relate to warning of objects. More particularly, apparatuses and methods consistent with exemplary embodiments relate to warning of objects that are visually obstructed by non-transparent components.
- One or more exemplary embodiments provide a method and an apparatus that detect objects or obstacles that are visually obstructed by non-transparent components of a vehicle. More particularly, one or more exemplary embodiments provide a method and an apparatus that detect objects that are visually obstructed by non-transparent components of a vehicle and warns a driver of the objects or obstacles.
- a method for warning of objects includes detecting a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine, determining whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, displaying a graphical indicator corresponding to the detected at least one object.
- the detected at least one object may include a plurality of objects, and the method may further include detecting depths of the plurality of objects.
- the displaying the graphical indicator may include displaying a plurality of graphical indicators with depth cues corresponding to the detected depths of the plurality of objects.
- the method may further include detecting trajectories and locations of the plurality of objects.
- the displaying the graphical indicator may include displaying the plurality of graphical indicators with depth, location and trajectory cues including one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator.
- the determining whether the at least one condition for displaying the detected at least one object is met may include: determining a distance between the at least one object and the machine, and determining that the condition is met if the distance is within a predetermined distance of the machine.
- the determining whether the at least one condition for displaying the detected at least one object is met may include: determining a position of the at least one object, and determining that the condition is met if the position of the at least one object is within the predetermined area.
- the determining whether the at least one condition for displaying the detected at least one object is met may further include: determining a distance between the at least one object and the machine, and determining that the condition is met if the distance is within a predetermined distance of the machine and the position of the at least one object is within the predetermined area corresponding to the position of the machine.
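The distance and position conditions described in the claims above can be sketched as follows. This is an illustrative sketch only: the names (`DetectedObject`, `should_display`), the 2-D machine-relative coordinate frame, and the rectangular form of the predetermined area are assumptions, not details from the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class DetectedObject:
    x: float  # meters, relative to the machine
    y: float

def within_distance(obj: DetectedObject, predetermined_distance: float) -> bool:
    # Condition: the object is within a predetermined distance of the machine.
    return math.hypot(obj.x, obj.y) <= predetermined_distance

def within_area(obj: DetectedObject, area: tuple) -> bool:
    # Condition: the object's position lies inside the predetermined
    # (visually obstructed) area, given here as (x_min, x_max, y_min, y_max).
    x_min, x_max, y_min, y_max = area
    return x_min <= obj.x <= x_max and y_min <= obj.y <= y_max

def should_display(obj: DetectedObject, predetermined_distance: float, area: tuple) -> bool:
    # Combined condition: both the distance and the position test must hold.
    return within_distance(obj, predetermined_distance) and within_area(obj, area)
```

With this sketch, an object at (1, 2) meters inside a 3 m × 3 m obstructed zone and within a 5 m threshold would trigger display, while an object 10 m away would not.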
- the displaying the graphical indicator may include displaying the graphical indicator on a portion of the machine visually obstructing the operator of the machine.
- the machine may be a vehicle and the portion may be a pillar of the vehicle, the pillar may be at least one from among an A-Pillar, a B-Pillar and a C-Pillar.
- the at least one condition for displaying the detected at least one object may include at least one from among a speed of the machine, a reaction time of the operator, and a number of the at least one object.
- the determining whether the at least one condition for displaying the detected at least one object is met may include: determining a trajectory of the at least one object or the machine, and determining that the condition is met if the determined trajectory indicates that a trajectory of the at least one object will be within a predetermined distance of a trajectory of the machine.
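The trajectory condition above, that the object's trajectory will pass within a predetermined distance of the machine's trajectory, can be illustrated with a closest-approach computation. The constant-velocity, straight-line model and the function names are assumptions for illustration; the patent does not specify how trajectories are computed.

```python
import math

def min_separation(p_obj, v_obj, p_machine, v_machine):
    # Minimum future distance between object and machine, assuming both
    # continue on straight-line trajectories at constant velocity.
    rx, ry = p_obj[0] - p_machine[0], p_obj[1] - p_machine[1]   # relative position
    vx, vy = v_obj[0] - v_machine[0], v_obj[1] - v_machine[1]   # relative velocity
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                       # no relative motion
        return math.hypot(rx, ry)
    t = -(rx * vx + ry * vy) / v2       # time of closest approach
    t = max(t, 0.0)                     # only future times count
    return math.hypot(rx + vx * t, ry + vy * t)

def trajectories_conflict(p_obj, v_obj, p_machine, v_machine, predetermined_distance):
    # Condition: the trajectories will pass within the predetermined distance.
    return min_separation(p_obj, v_obj, p_machine, v_machine) <= predetermined_distance
```

For example, an object 10 m ahead moving straight toward a stationary machine conflicts at any threshold, while an object tracking a parallel path 5 m to the side never comes within a 2 m threshold.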
- an apparatus for warning of objects includes at least one memory comprising computer executable instructions; and at least one processor configured to read and execute the computer executable instructions.
- the computer executable instructions causing the at least one processor to: detect a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determine whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, display a graphical indicator corresponding to the detected at least one object.
- a non-transitory computer readable medium comprising computer executable instructions executable by a processor to perform the method for warning of objects.
- the method includes detecting a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determining whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, displaying a graphical indicator corresponding to the detected at least one object on a display corresponding to the visually obstructed area.
- FIG. 1 shows a block diagram of an apparatus that warns of objects according to an exemplary embodiment.
- FIG. 2 shows a flowchart for a method for warning of objects according to an exemplary embodiment.
- FIG. 3 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment.
- FIG. 4 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment.
- Exemplary embodiments are described with reference to FIGS. 1-4 of the accompanying drawings, in which like reference numerals refer to like elements throughout.
- When a first element is "connected to," "attached to," "formed on," or "disposed on" a second element, the first element may be connected directly to, formed directly on, or disposed directly on the second element, or there may be intervening elements between the first element and the second element, unless it is stated that the first element is "directly" connected to, attached to, formed on, or disposed on the second element.
- When a first element exchanges information with a second element, the first element may send or receive the information directly to or from the second element, via a bus, via a network, or via intermediate elements, unless the first element is indicated to send or receive information "directly" to or from the second element.
- one or more of the elements disclosed may be combined into a single device or combined into one or more devices.
- individual elements may be provided on separate devices.
- Operators of devices or machines may encounter objects or obstacles during the operation of the devices or machines.
- a driver of a vehicle may encounter static or moving objects that may cross the path of the vehicle or that may be on a trajectory to cross a path of a moving vehicle.
- the vehicle may be a passenger car, but it should be appreciated that any other vehicle including motorcycles, trucks, sports utility vehicles (SUVs), recreational vehicles (RVs), marine vessels, aircraft, etc., can also be used.
- a device or machine must detect objects or obstacles and control the device or machine to avoid the objects or obstacles. By detecting and avoiding the objects or obstacles, damages and injuries may be avoided.
- an operator may use his/her own vision.
- an operator may not be able to visually detect these objects or obstacles.
- an apparatus that detects these objects or obstacles and outputs visual cues to the operator may assist an operator in avoiding these objects or obstacles that may be difficult to visually detect by the operator.
- FIG. 1 shows a block diagram of an apparatus 100 for warning of objects (i.e., an apparatus for warning of visually obstructed objects) according to an exemplary embodiment.
- the apparatus 100 includes a controller 101 , a power supply 102 , a storage 103 , an output 104 , a user input 106 , an object detector 107 (i.e., an object detecting sensor), and a communication device 108 .
- the apparatus 100 is not limited to the aforementioned configuration and may be configured to include additional elements and/or omit one or more of the aforementioned elements.
- the apparatus 100 may be implemented as part of a vehicle or as a standalone component.
- the controller 101 controls the overall operation and function of the apparatus 100 .
- the controller 101 may control one or more of a storage 103 , an output 104 , a user input 106 , an object detector 107 , and a communication device 108 of the apparatus 100 .
- the controller 101 may include one or more from among a processor, a microprocessor, a central processing unit (CPU), a graphics processor, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, circuitry, and a combination of hardware, software and firmware components.
- the controller 101 is configured to send and/or receive information from one or more of the storage 103 , the output 104 , the user input 106 , the object detector 107 , and the communication device 108 of the apparatus 100 .
- the information may be sent and received via a bus or network, or may be directly read or written to/from one or more of the storage 103 , the output 104 , the user input 106 , the object detector 107 , and the communication device 108 of the apparatus 100 .
- suitable network connections include a controller area network (CAN), a media oriented system transfer (MOST), a local interconnection network (LIN), a local area network (LAN), and other appropriate connections such as Ethernet.
- the power supply 102 provides power to one or more of the controller 101 , the storage 103 , the output 104 , the user input 106 , the object detector 107 , and the communication device 108 of the apparatus 100 .
- the power supply 102 may include one or more from among a battery, an outlet, a capacitor, a solar energy cell, a generator, a wind energy device, an alternator, etc.
- the storage 103 is configured for storing information and retrieving information used by the apparatus 100 .
- the storage 103 may be controlled by the controller 101 to store and retrieve information about an object or obstacle, information on a condition for displaying an object or obstacle, and information on graphical indicators corresponding to the objects or obstacles.
- the storage 103 may also include the computer instructions configured to be executed by a processor to perform the functions of the apparatus 100 .
- the storage 103 may include one or more from among floppy diskettes, optical disks, CD-ROMs (Compact Disc-Read Only Memories), magneto-optical disks, ROMs (Read Only Memories), RAMs (Random Access Memories), EPROMs (Erasable Programmable Read Only Memories), EEPROMs (Electrically Erasable Programmable Read Only Memories), magnetic or optical cards, flash memory, cache memory, and other types of media/machine-readable media suitable for storing machine-executable instructions.
- the output 104 outputs information in one or more forms including: visual, audible and/or haptic form.
- the output 104 may be controlled by the controller 101 to provide outputs to the user of the apparatus 100 .
- the output 104 may include one or more from among a speaker, a display, a transparent display, a centrally-located display, a head up display, a windshield display, a haptic feedback device, a vibration device, a tactile feedback device, a tap-feedback device, a holographic display, an instrument light, an indicator light, etc.
- the output 104 may also include a display located on an A-Pillar (front), a door, a B-Pillar (middle), or a C-Pillar (rear) of a vehicle.
- the output 104 may also include a transparent display located on one or more of a windshield, a rear window, side windows, and mirrors of a vehicle.
- the display may be a light emitting diode (LED) or organic light emitting diode (OLED) display embedded in the aforementioned pillars.
- the output 104 may output a notification including one or more from among an audible notification, a light notification, and a display notification.
- the notification may include information regarding one or more detected obstacles or objects.
- the output 104 may provide an output displaying a graphical indicator corresponding to the detected object or obstacle.
- the graphical indicator may be a silhouette of the object.
- the graphical indicator may also include depth, location and trajectory cues including one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator.
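Two of the cues listed above, size and transparency, can be derived from a detected depth in a straightforward way: nearer objects are drawn larger and more opaque. The function name, the 30 m range, and the scale factors below are illustrative placeholders, not values from the patent.

```python
def depth_cues(depth_m, max_depth_m=30.0):
    # Map a detected object's depth (distance in meters) to display cues.
    depth = min(max(depth_m, 0.0), max_depth_m)   # clamp to the sensed range
    nearness = 1.0 - depth / max_depth_m          # 1.0 = closest, 0.0 = farthest
    return {
        "scale": 0.5 + 0.5 * nearness,            # relative-size cue
        "opacity": 0.3 + 0.7 * nearness,          # transparency cue
    }
```

Under this sketch, an object at zero depth is rendered at full scale and full opacity, and an object at the far end of the range at half scale and 30% opacity, giving the operator a rough depth ordering at a glance.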
- the user input 106 is configured to provide information and commands to the apparatus 100 .
- the user input 106 may be used to provide user inputs, etc., to the controller 101 .
- the user input 106 may include one or more from among a touchscreen, a keyboard, a soft keypad, a button, a motion detector, a voice input detector, a microphone, a camera, a trackpad, a mouse, a touchpad, etc.
- the user input 106 may be configured to receive a user input to acknowledge or dismiss the notification output by the output 104 .
- the user input 106 may also be configured to receive a user input to cycle through notifications or different screens of a notification.
- the object detector 107 is configured to detect an object or obstacle.
- the object detector 107 may be one or more sensors from among a radar sensor, a microwave sensor, an ultrasonic sensor, a camera, an infrared sensor, a LIDAR, and a laser sensor.
- the object detector may receive object information from one or more sensors and detect an object or obstacle based on the object information received from the one or more sensors.
- the object detector 107 provides the object information including one or more from among a position of an object, a trajectory of the object, a speed of the object, an acceleration of the object, whether the object is in a predetermined area around the machine, and a distance between the object and the machine or vehicle being operated by an operator.
- the object information may be provided to the controller 101 via a bus, storage 103 or communication device 108 .
- the object detector 107 may be positioned in at least one from among a vehicle door, a vehicle dashboard, a vehicle mirror, a vehicle windshield, a vehicle hood, a vehicle bumper, a vehicle fender, a vehicle structural pillar (e.g., A-Pillar, B-Pillar, and/or C-Pillar), and a vehicle roof.
- the communication device 108 may be used by the apparatus 100 to communicate with various types of external apparatuses according to various communication methods.
- the communication device 108 may be used to send/receive object information to/from the controller 101 of the apparatus 100 .
- the communication device 108 may also be configured to transmit the notification of an object or a warning of an object that is not visible to the operator due to an obstruction or obstructing part.
- the notification or warning may be sent by the communication device 108 to an output device or display, such as the output 104 .
- the communication device 108 may include various communication modules such as one or more from among a telematics unit, a broadcast receiving module, a near field communication (NFC) module, a GPS receiver, a wired communication module, or a wireless communication module.
- the broadcast receiving module may include a terrestrial broadcast receiving module including an antenna to receive a terrestrial broadcast signal, a demodulator, and an equalizer, etc.
- the NFC module is a module that communicates with an external apparatus located at a nearby distance according to an NFC method.
- the GPS receiver is a module that receives a GPS signal from a GPS satellite and determines a current location.
- the wired communication module may be a module that receives information over a wired network such as a local area network, a controller area network (CAN), or an external network.
- the wireless communication module is a module that is connected to an external network by using a wireless communication protocol such as IEEE 802.11 protocols, WiMAX, or Wi-Fi, and communicates with the external network.
- the wireless communication module may further include a mobile communication module that accesses a mobile communication network and performs communication according to various mobile communication standards such as 3rd generation (3G), 3rd Generation Partnership Project (3GPP), long term evolution (LTE), Bluetooth, EVDO, CDMA, GPRS, EDGE or ZigBee.
- the controller 101 of the apparatus 100 may be configured to detect a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determine whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, display a graphical indicator corresponding to the detected at least one object.
- the controller 101 of the apparatus 100 may be configured to detect depths of the plurality of objects and control to display the graphical indicator by displaying a plurality of graphical indicators with depth cues corresponding to the detected depths of the plurality of objects.
- the controller 101 of the apparatus 100 may also be configured to detect the depths, locations and trajectories of the plurality of objects and control to display a plurality of graphical indicators with depth, location and trajectory cues corresponding to the detected depths of the plurality of objects.
- the depth, location and trajectory cues may include one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator.
- the size of one graphical indicator relative to the others and to contextual information may be a depth cue.
- if a graphical indicator assumes a known shape (such as the silhouette of a cyclist or a pedestrian), operators' mental models of the size of these familiar objects may impact their judgments.
- this may be a depth cue as to the location of an object and whether an object is in the foreground or background relative to another object.
- the location where a graphical indicator is displayed and the movement of a graphical indicator may be a cue for object location and trajectory.
- movement of the graphical indicator may be illustrated with common motion patterns (e.g., walking, cycling, stroller motion, etc.) to allow for object detection and recognition.
- the controller 101 of the apparatus 100 may also be configured to determine one or more from among a distance between the object and the machine, a position of the object, and a speed of the object. The controller 101 may then determine that the condition for displaying a graphical indicator is met if the distance is within a predetermined distance of the machine, if the position of the at least one object is within the predetermined area, and/or if the speed of the object is within a predetermined speed. The controller 101 may determine the at least one condition for displaying the detected object is met based on one or more from among a speed of the machine or vehicle, a reaction time of the operator of the machine or vehicle, and a number of objects.
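One plausible reading of the speed and reaction-time conditions above is that the predetermined distance grows with the machine's speed and the operator's reaction time, e.g. the distance covered during the reaction time plus a safety margin. The formula and the 2 m margin below are assumptions for illustration, not details specified in the patent.

```python
def warning_distance(speed_mps, reaction_time_s, margin_m=2.0):
    # Distance traveled during the operator's reaction time, plus a fixed
    # safety margin (the margin value is an illustrative assumption).
    return speed_mps * reaction_time_s + margin_m

def condition_met(distance_to_object_m, speed_mps, reaction_time_s):
    # Display the indicator when the object is inside the derived threshold.
    return distance_to_object_m <= warning_distance(speed_mps, reaction_time_s)
```

At 10 m/s with a 1.5 s reaction time the threshold is 17 m, so an object 15 m away would trigger a warning while one 20 m away would not.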
- the controller 101 of the apparatus 100 may also control to display the graphical indicator on a portion of the machine visually obstructing the operator of the machine such as an A-Pillar, a B-Pillar and a C-Pillar of a vehicle.
- FIG. 2 shows a flowchart for a method for warning of objects according to an exemplary embodiment.
- the method of FIG. 2 may be performed by the apparatus 100 or may be encoded into a computer readable medium as instructions that are executable by a computer to perform the method.
- a presence of at least one object located in a predetermined area that is visually obstructed from an operator of a machine is detected in operation S 210 . It is then determined whether at least one condition for displaying the detected at least one object is met in operation S 220 . In operation S 230 , in response to determining that the at least one condition is met, graphical indicator corresponding to the detected at least one object is displayed.
- FIG. 3 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment.
- the illustration of FIG. 3 shows a graphical indicator 302 displayed on the A-Pillar 301 .
- the apparatus 100 may control to display the graphical indicator 302 on the A-Pillar 301 .
- a display may be part of the A-Pillar 301 and used to display graphical indicator 302 .
- the graphical indicator 432 may be shaded differently to indicate that a person or object corresponding to graphical indicator 302 is in a potential collision zone 304 with the machine 305 .
- the graphical indicator 302 may be displayed with a depth, location or trajectory cues.
- the depth, location and trajectory cues may include one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator.
- At least one from among a position and a distance of an object or person may be determined and the at least one object or person may be displayed if one or more from among a position and a distance of an object meets a condition. For example, if the distance of the person or object is within a predetermined distance of the machine 305 and the position of the at least one object or person is within the predetermined area 304 corresponding to the position of the machine 305 . Further, the at least one condition for displaying the detected at least one object or person may include one or more from among a speed of the machine, a reaction time of the operator, and a number of the at least one object.
- the detected at least one object or person may be displayed if the machine or vehicle is traveling at, below, or above a predetermined speed.
- the detected at least one object or person may be displayed when the person or object is a specific distance from the vehicle and the specific distance is less than a safe distance that is determined based on a reaction time of a machine or vehicle operator.
- the detected at least one object or person may be displayed at a point in time corresponding to the operator's or driver's reaction time to allow an operator or driver to react effectively.
- the at least one condition may be that only a predetermined number of the at least one object may be displayed. For example, if there are four objects that are visually obstructed by an A-Pillar and the predetermined number of objects that may be displayed is two. The four objects may be ranked in importance from first to fourth and a graphical indicator corresponding to the top two ranked objects may be displayed. The four objects may be ranked based on one or more from among risk of collision with the machine or vehicle, speed of the object, distance from the machine or vehicle, trajectory, size, etc.
- the graphical indicator 302 is displayed on the A-Pillar 301 . It may be displayed at least one from among an A-Pillar, a B-Pillar, a C-Pillar, or other surface of the machine or vehicle that obstructs the view of the object or person from an operator of the vehicle or machine 305 .
- FIG. 4 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment.
- the illustration of FIG. 4 shows a graphical indicators 402 and 403 displayed on the A-Pillar 401 .
- the apparatus 100 may control to display graphical indicators 402 and 403 on the A-Pillar 401 .
- a display may be part of the A-Pillar 401 and used to display graphical indicators 402 and 403 .
- the graphical indicator 402 may be shaded differently than graphical indicator 403 to indicate that a person or object corresponding to graphical indicator 402 is closer to the vehicle or a more likely obstacle than a person or object corresponding to graphical indicator 403 .
- graphical indicator 402 may be larger than graphical indicator 403 to indicate that a person or object corresponding to graphical indicator 402 is closer to the vehicle or a more likely obstacle than a person or object corresponding to graphical indicator 403 .
- Graphical indicators 402 and 403 may be displayed according to one or more depth, location and trajectory cues including one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator.
- the processes, methods, or algorithms disclosed herein can be delivered to or implemented by a processing device, controller, or computer, which can include any existing programmable electronic control device or dedicated electronic control device.
- the processes, methods, or algorithms can be stored as data and instructions executable by a controller or computer in many forms including, but not limited to, information permanently stored on non-writable storage media such as ROM devices and information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media.
- the processes, methods, or algorithms can also be implemented in a software executable object.
- the processes, methods, or algorithms can be embodied in whole or in part using suitable hardware components, such as Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, controllers or other hardware components or devices, or a combination of hardware, software and firmware components.
Abstract
Description
- Apparatuses and methods consistent with exemplary embodiments relate to warning of objects. More particularly, apparatuses and methods consistent with exemplary embodiments relate to warning of objects that are visually obstructed by non-transparent components.
- One or more exemplary embodiments provide a method and an apparatus that detect objects or obstacles that are visually obstructed by non-transparent components of a vehicle. More particularly, one or more exemplary embodiments provide a method and an apparatus that detect objects that are visually obstructed by non-transparent components of a vehicle and warn a driver of the objects or obstacles.
- According to an aspect of an exemplary embodiment, a method for warning of objects is provided. The method includes detecting a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determining whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, displaying a graphical indicator corresponding to the detected at least one object.
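The three claimed steps can be sketched as a minimal control loop. This is an illustrative Python sketch, not the patent's implementation; `detect`, `condition_met`, and `display` are hypothetical callables standing in for the detection, condition, and display stages:

```python
def warn_of_objects(detect, condition_met, display):
    """Sketch of the claimed method: detect objects in the visually
    obstructed area, test the display condition, and display a
    graphical indicator for each object that meets it."""
    for obj in detect():
        if condition_met(obj):
            display(obj)

# Illustrative use with stand-in callables:
shown = []
warn_of_objects(
    detect=lambda: [{"id": 1, "distance_m": 8.0}, {"id": 2, "distance_m": 60.0}],
    condition_met=lambda o: o["distance_m"] <= 30.0,
    display=shown.append,
)
# Only the nearby object qualifies for display.
```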
- The detected at least one object may include a plurality of objects, and the method may further include detecting depths of the plurality of objects. The displaying the graphical indicator may include displaying a plurality of graphical indicators with depth cues corresponding to the detected depths of the plurality of objects.
- The method may further include detecting trajectories and locations of the plurality of objects. The displaying the graphical indicator may include displaying the plurality of graphical indicators with depth, location and trajectory cues including one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator.
- The determining whether the at least one condition for displaying the detected at least one object is met may include: determining a distance between the at least one object and the machine, and determining that the condition is met if the distance is within a predetermined distance of the machine.
- The determining whether the at least one condition for displaying the detected at least one object is met may include: determining a position of the at least one object, and determining that the condition is met if the position of the at least one object is within the predetermined area.
- The determining whether the at least one condition for displaying the detected at least one object is met may further include: determining a distance between the at least one object and the machine, and determining that the condition is met if the distance is within a predetermined distance of the machine and the position of the at least one object is within the predetermined area corresponding to the position of the machine.
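A minimal sketch of this combined distance-and-position check, assuming hypothetical field names and an arbitrary 30 m threshold (the patent does not fix a value):

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    distance_m: float          # distance between the object and the machine
    in_obstructed_area: bool   # whether the object lies in the predetermined area

def condition_met(obj: DetectedObject, predetermined_distance_m: float = 30.0) -> bool:
    """The condition is met only when the object is both within the
    predetermined distance and inside the predetermined area."""
    return obj.distance_m <= predetermined_distance_m and obj.in_obstructed_area
```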
- The displaying the graphical indicator may include displaying the graphical indicator on a portion of the machine visually obstructing the operator of the machine.
- The machine may be a vehicle and the portion may be a pillar of the vehicle, the pillar may be at least one from among an A-Pillar, a B-Pillar and a C-Pillar.
- The at least one condition for displaying the detected at least one object may include at least one from among a speed of the machine, a reaction time of the operator, and a number of the at least one object.
- The determining whether the at least one condition for displaying the detected at least one object is met may include: determining a trajectory of the at least one object or the machine, and determining that the condition is met if the determined trajectory indicates that a trajectory of the at least one object will be within a predetermined distance of a trajectory of the machine.
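Under a constant-velocity assumption (a simplification the patent does not prescribe), the trajectory condition could be tested by sampling predicted positions over a short horizon; positions, velocities, and thresholds below are illustrative:

```python
def trajectories_conflict(obj_pos, obj_vel, machine_pos, machine_vel,
                          horizon_s=5.0, step_s=0.1, predetermined_distance_m=2.0):
    """Return True if the object's predicted trajectory comes within the
    predetermined distance of the machine's predicted trajectory."""
    steps = int(horizon_s / step_s) + 1
    for i in range(steps):
        t = i * step_s
        ox, oy = obj_pos[0] + obj_vel[0] * t, obj_pos[1] + obj_vel[1] * t
        mx, my = machine_pos[0] + machine_vel[0] * t, machine_pos[1] + machine_vel[1] * t
        if ((ox - mx) ** 2 + (oy - my) ** 2) ** 0.5 <= predetermined_distance_m:
            return True
    return False
```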
- According to an aspect of another exemplary embodiment, an apparatus for warning of objects is provided. The apparatus includes at least one memory comprising computer executable instructions; and at least one processor configured to read and execute the computer executable instructions. The computer executable instructions cause the at least one processor to: detect a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determine whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, display a graphical indicator corresponding to the detected at least one object.
- According to an aspect of another exemplary embodiment, a non-transitory computer readable medium comprising computer executable instructions executable by a processor to perform a method for warning of objects is provided. The method includes detecting a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determining whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, displaying a graphical indicator corresponding to the detected at least one object on a display corresponding to the visually obstructed area.
- Other objects, advantages and novel features of the exemplary embodiments will become more apparent from the following detailed description of exemplary embodiments and the accompanying drawings.
- FIG. 1 shows a block diagram of an apparatus that warns of objects according to an exemplary embodiment;
- FIG. 2 shows a flowchart for a method for warning of objects according to an exemplary embodiment;
- FIG. 3 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment; and
- FIG. 4 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment.
- An apparatus and method that warn of objects will now be described in detail with reference to FIGS. 1-4 of the accompanying drawings, in which like reference numerals refer to like elements throughout.
- The following disclosure will enable one skilled in the art to practice the inventive concept. However, the exemplary embodiments disclosed herein are merely exemplary and do not limit the inventive concept to exemplary embodiments described herein. Moreover, descriptions of features or aspects of each exemplary embodiment should typically be considered as available for aspects of other exemplary embodiments.
- It is also understood that where it is stated herein that a first element is “connected to,” “attached to,” “formed on,” or “disposed on” a second element, the first element may be connected directly to, formed directly on or disposed directly on the second element or there may be intervening elements between the first element and the second element, unless it is stated that a first element is “directly” connected to, attached to, formed on, or disposed on the second element. In addition, if a first element is configured to “send” or “receive” information from a second element, the first element may send or receive the information directly to or from the second element, send or receive the information via a bus, send or receive the information via a network, or send or receive the information via intermediate elements, unless the first element is indicated to send or receive information “directly” to or from the second element.
- Throughout the disclosure, one or more of the elements disclosed may be combined into a single device or combined into one or more devices. In addition, individual elements may be provided on separate devices.
- Operators of devices or machines, such as vehicles, may encounter objects or obstacles during the operation of the devices or machines. For example, a driver of a vehicle may encounter static or moving objects that may cross the path of the vehicle or that may be on a trajectory to cross a path of a moving vehicle. The vehicle may be a passenger car, but it should be appreciated that any other vehicle including motorcycles, trucks, sports utility vehicles (SUVs), recreational vehicles (RVs), marine vessels, aircraft, etc., can also be used. Thus, an operator of a device or machine must detect objects or obstacles and control the device or machine to avoid the objects or obstacles. By detecting and avoiding the objects or obstacles, damages and injuries may be avoided.
- In order to detect these objects or obstacles, an operator may use his/her own vision. However, in the case in which the objects or obstacles are obstructed or occluded by a component of the device or machine, an operator may not be able to visually detect these objects or obstacles. Thus, an apparatus that detects these objects or obstacles and outputs visual cues to the operator may assist an operator in avoiding these objects or obstacles that may be difficult to visually detect by the operator.
-
FIG. 1 shows a block diagram of an apparatus 100 for warning of objects (i.e., an apparatus for warning of visually obstructed objects) according to an exemplary embodiment. As shown in FIG. 1, the apparatus 100, according to an exemplary embodiment, includes a controller 101, a power supply 102, a storage 103, an output 104, a user input 106, an object detector 107 (i.e., an object detecting sensor), and a communication device 108. However, the apparatus 100 is not limited to the aforementioned configuration and may be configured to include additional elements and/or omit one or more of the aforementioned elements. The apparatus 100 may be implemented as part of a vehicle or as a standalone component. - The
controller 101 controls the overall operation and function of the apparatus 100. The controller 101 may control one or more of a storage 103, an output 104, a user input 106, an object detector 107, and a communication device 108 of the apparatus 100. The controller 101 may include one or more from among a processor, a microprocessor, a central processing unit (CPU), a graphics processor, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, circuitry, and a combination of hardware, software and firmware components. - The
controller 101 is configured to send and/or receive information from one or more of the storage 103, the output 104, the user input 106, the object detector 107, and the communication device 108 of the apparatus 100. The information may be sent and received via a bus or network, or may be directly read or written to/from one or more of the storage 103, the output 104, the user input 106, the object detector 107, and the communication device 108 of the apparatus 100. Examples of suitable network connections include a controller area network (CAN), a media oriented system transfer (MOST), a local interconnection network (LIN), a local area network (LAN), and other appropriate connections such as Ethernet. - The
power supply 102 provides power to one or more of the controller 101, the storage 103, the output 104, the user input 106, the object detector 107, and the communication device 108 of the apparatus 100. The power supply 102 may include one or more from among a battery, an outlet, a capacitor, a solar energy cell, a generator, a wind energy device, an alternator, etc. - The
storage 103 is configured for storing information and retrieving information used by the apparatus 100. The storage 103 may be controlled by the controller 101 to store and retrieve information about an object or obstacle, information on a condition for displaying an object or obstacle, and information on graphical indicators corresponding to the objects or obstacles. The storage 103 may also include the computer instructions configured to be executed by a processor to perform the functions of the apparatus 100. - The
storage 103 may include one or more from among floppy diskettes, optical disks, CD-ROMs (Compact Disc-Read Only Memories), magneto-optical disks, ROMs (Read Only Memories), RAMs (Random Access Memories), EPROMs (Erasable Programmable Read Only Memories), EEPROMs (Electrically Erasable Programmable Read Only Memories), magnetic or optical cards, flash memory, cache memory, and other types of media/machine-readable media suitable for storing machine-executable instructions. - The
output 104 outputs information in one or more forms including visual, audible and/or haptic form. The output 104 may be controlled by the controller 101 to provide outputs to the user of the apparatus 100. The output 104 may include one or more from among a speaker, a display, a transparent display, a centrally-located display, a head up display, a windshield display, a haptic feedback device, a vibration device, a tactile feedback device, a tap-feedback device, a holographic display, an instrument light, an indicator light, etc. The output 104 may also include a display located on an A-Pillar (front), a door, a B-Pillar (middle), or a C-Pillar (rear) of a vehicle. In addition, the output 104 may also include a transparent display located on one or more of a windshield, a rear window, side windows, and mirrors of a vehicle. The display may be a light emitting diode (LED) or organic light emitting diode (OLED) display embedded in the aforementioned pillars. - The
output 104 may output a notification including one or more from among an audible notification, a light notification, and a display notification. The notification may include information regarding one or more detected obstacles or objects. The output 104 may provide an output displaying a graphical indicator corresponding to the detected object or obstacle. In one example, the graphical indicator may be a silhouette of the object. The graphical indicator may also include depth, location and trajectory cues including one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator. - The
user input 106 is configured to provide information and commands to the apparatus 100. The user input 106 may be used to provide user inputs, etc., to the controller 101. The user input 106 may include one or more from among a touchscreen, a keyboard, a soft keypad, a button, a motion detector, a voice input detector, a microphone, a camera, a trackpad, a mouse, a touchpad, etc. The user input 106 may be configured to receive a user input to acknowledge or dismiss the notification output by the output 104. The user input 106 may also be configured to receive a user input to cycle through notifications or different screens of a notification. - The
object detector 107 is configured to detect an object or obstacle. The object detector 107 may be one or more sensors from among a radar sensor, a microwave sensor, an ultrasonic sensor, a camera, an infrared sensor, a LIDAR, and a laser sensor. For example, the object detector may receive object information from one or more sensors and detect an object or obstacle based on the object information received from the one or more sensors. The object detector 107 provides the object information including one or more from among a position of an object, a trajectory of the object, a speed of the object, an acceleration of the object, whether the object is in a predetermined area around the machine, and a distance between the object and the machine or vehicle being operated by an operator. The object information may be provided to the controller 101 via a bus, the storage 103, or the communication device 108. The object detector 107 may be positioned in at least one from among a vehicle door, a vehicle dashboard, a vehicle mirror, a vehicle windshield, a vehicle hood, a vehicle bumper, a vehicle fender, a vehicle structural pillar (e.g., A-Pillar, B-Pillar, and/or C-Pillar), and a vehicle roof. - The
communication device 108 may be used by the apparatus 100 to communicate with various types of external apparatuses according to various communication methods. The communication device 108 may be used to send/receive object information to/from the controller 101 of the apparatus 100. The communication device 108 may also be configured to transmit the notification of an object or a warning of an object that is not visible to the operator due to an obstruction or obstructing part. The notification or warning may be sent by the communication device 108 to an output device or display, such as the output 104. - The
communication device 108 may include various communication modules such as one or more from among a telematics unit, a broadcast receiving module, a near field communication (NFC) module, a GPS receiver, a wired communication module, or a wireless communication module. The broadcast receiving module may include a terrestrial broadcast receiving module including an antenna to receive a terrestrial broadcast signal, a demodulator, and an equalizer, etc. The NFC module is a module that communicates with an external apparatus located at a nearby distance according to an NFC method. The GPS receiver is a module that receives a GPS signal from a GPS satellite and determines a current location. The wired communication module may be a module that receives information over a wired network such as a local area network, a controller area network (CAN), or an external network. The wireless communication module is a module that is connected to an external network by using a wireless communication protocol such as IEEE 802.11 protocols, WiMAX, Wi-Fi or IEEE communication protocol and communicates with the external network. The wireless communication module may further include a mobile communication module that accesses a mobile communication network and performs communication according to various mobile communication standards such as 3rd generation (3G), 3rd generation partnership project (3GPP), long term evolution (LTE), Bluetooth, EVDO, CDMA, GPRS, EDGE or ZigBee. - The
controller 101 of the apparatus 100 may be configured to detect a presence of at least one object in a predetermined area, the predetermined area being visually obstructed from an operator of a machine; determine whether at least one condition for displaying the detected at least one object is met; and in response to determining that the at least one condition is met, display a graphical indicator corresponding to the detected at least one object. - The
controller 101 of the apparatus 100 may be configured to detect depths of the plurality of objects and control to display the graphical indicator by displaying a plurality of graphical indicators with depth cues corresponding to the detected depths of the plurality of objects. - The
controller 101 of the apparatus 100 may also be configured to detect the depths, locations and trajectories of the plurality of objects and control to display a plurality of graphical indicators with depth, location and trajectory cues corresponding to the detected depths of the plurality of objects. The depth, location and trajectory cues may include one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator. - For example, when multiple graphical indicators are displayed, the size of one graphical indicator relative to the others and to contextual information (e.g., background/view through windshield/driver expectations) may be a depth cue. If a graphical indicator assumes a known shape (such as the silhouette of a cyclist or a pedestrian), then operators' mental models of the size of these familiar objects may impact their judgements. Further, if one graphical indicator overlaps another, then this may be a depth cue as to the location of an object and whether an object is in the foreground or background relative to another object. Further still, the location where a graphical indicator is displayed and the movement of a graphical indicator may be a cue for object location and trajectory. In addition, movement of the graphical indicator may be illustrated with common motion patterns (e.g., walking, cycling, stroller motion, etc.) to allow for object detection and recognition.
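As one illustration of the size and transparency cues above, a renderer might map detected distance to indicator scale and opacity; the specific mapping and the 40 m range are assumptions, not values from the patent:

```python
def depth_cue_style(distance_m, max_distance_m=40.0):
    """Map an object's distance to indicator size and opacity: nearer
    objects are drawn larger and more opaque, so relative size and
    shading act as depth cues."""
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    return {"scale": 0.5 + 0.5 * closeness, "opacity": 0.3 + 0.7 * closeness}
```

An indicator for a pedestrian 5 m away would then render larger and more solid than one 35 m away, matching the relative-size and shading cues described above.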
- The
controller 101 of the apparatus 100 may also be configured to determine one or more from among a distance between the object and the machine, a position of the object, and a speed of the object. The controller 101 may then determine that the condition for displaying a graphical indicator is met if the distance is within a predetermined distance of the machine, if the position of the at least one object is within the predetermined area, and/or if the speed of the object is within a predetermined speed. The controller 101 may determine the at least one condition for displaying the detected object is met based on one or more from among a speed of the machine or vehicle, a reaction time of the operator of the machine or vehicle, and a number of objects. - The
controller 101 of the apparatus 100 may also control to display the graphical indicator on a portion of the machine visually obstructing the operator of the machine, such as an A-Pillar, a B-Pillar and a C-Pillar of a vehicle. -
FIG. 2 shows a flowchart for a method for warning of objects according to an exemplary embodiment. The method of FIG. 2 may be performed by the apparatus 100 or may be encoded into a computer readable medium as instructions that are executable by a computer to perform the method. - Referring to
FIG. 2, a presence of at least one object located in a predetermined area that is visually obstructed from an operator of a machine is detected in operation S210. It is then determined whether at least one condition for displaying the detected at least one object is met in operation S220. In operation S230, in response to determining that the at least one condition is met, a graphical indicator corresponding to the detected at least one object is displayed. -
FIG. 3 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment. In particular, the illustration of FIG. 3 shows a graphical indicator 302 displayed on the A-Pillar 301. The apparatus 100 may control to display the graphical indicator 302 on the A-Pillar 301. - Referring to
FIG. 3, a display may be part of the A-Pillar 301 and used to display graphical indicator 302. The graphical indicator 302 may be shaded differently to indicate that a person or object corresponding to graphical indicator 302 is in a potential collision zone 304 with the machine 305. The graphical indicator 302 may be displayed with depth, location or trajectory cues. The depth, location and trajectory cues may include one or more from among a shading of a graphical indicator, a transparency of a graphical indicator, an outlining of a graphical indicator, a size of a graphical indicator, a size of a graphical indicator relative to other graphical indicators, a color of a graphical indicator, a shape of a graphical indicator, a location of a graphical indicator, a motion of a graphical indicator, an occlusion of a graphical indicator by another, and an image displayed with a graphical indicator. - Moreover, at least one from among a position and a distance of an object or person may be determined and the at least one object or person may be displayed if one or more from among a position and a distance of an object meets a condition. For example, if the distance of the person or object is within a predetermined distance of the
machine 305 and the position of the at least one object or person is within thepredetermined area 304 corresponding to the position of themachine 305. Further, the at least one condition for displaying the detected at least one object or person may include one or more from among a speed of the machine, a reaction time of the operator, and a number of the at least one object. - For example, the detected at least one object or person may be displayed if the machine or vehicle is traveling at, below, or above a predetermined speed. In another example, the detected at least one object or person may be displayed when the person or object is a specific distance from the vehicle and the specific distance is less than a safe distance that is determined based on a reaction time of a machine or vehicle operator. In yet another example, the detected at least one object or person may be displayed at a point in time corresponding to the operator's or driver's reaction time to allow an operator or driver to react effectively.
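The position, distance, and reaction-time conditions described above could be combined as in the following sketch. The function names, the default reaction time, and the distance threshold are illustrative assumptions; the only relationship taken from the text is that the safe distance depends on machine speed and operator reaction time.

```python
def safe_distance_m(speed_mps: float, reaction_time_s: float) -> float:
    # Distance the machine covers while the operator reacts; a fuller
    # model would also add braking distance (assumption for this sketch).
    return speed_mps * reaction_time_s

def should_display(distance_m: float, in_area: bool, speed_mps: float,
                   reaction_time_s: float = 1.5,
                   max_distance_m: float = 20.0) -> bool:
    """Combine the example conditions: position within the predetermined
    area, distance within a threshold, and distance less than the
    reaction-time-based safe distance."""
    within_area = in_area
    within_distance = distance_m <= max_distance_m
    closer_than_safe = distance_m < safe_distance_m(speed_mps, reaction_time_s)
    return within_area and within_distance and closer_than_safe
```

At 10 m/s with a 1.5 s reaction time the safe distance is 15 m, so an object 10 m away in the obstructed area would be displayed; at 5 m/s it would not.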
- The at least one condition may be that only a predetermined number of the at least one object may be displayed. For example, if there are four objects that are visually obstructed by an A-Pillar and the predetermined number of objects that may be displayed is two, the four objects may be ranked in importance from first to fourth, and graphical indicators corresponding to the top two ranked objects may be displayed. The four objects may be ranked based on one or more from among risk of collision with the machine or vehicle, speed of the object, distance from the machine or vehicle, trajectory, and size.
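The ranking just described can be sketched as a priority sort followed by truncation. The dictionary keys and the particular tie-break ordering (risk first, then speed, then nearness) are assumptions for illustration, not a ranking specified by the patent.

```python
def rank_objects(objects, max_displayed=2):
    """Return the max_displayed highest-priority objects.

    objects: list of dicts with 'collision_risk' (0..1), 'speed',
    and 'distance' fields (illustrative schema)."""
    def priority(obj):
        # Higher risk and speed, and smaller distance, raise priority.
        return (obj["collision_risk"], obj["speed"], -obj["distance"])
    return sorted(objects, key=priority, reverse=True)[:max_displayed]
```

With four obstructed objects and a display limit of two, only the two highest-priority objects would receive graphical indicators.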
- Although the graphical indicator 302 is shown displayed on the A-Pillar 301, it may be displayed on at least one from among an A-Pillar, a B-Pillar, a C-Pillar, or another surface of the machine or vehicle that obstructs the view of the object or person from an operator of the vehicle or machine 305. -
FIG. 4 shows an illustration of a warning provided to a vehicle operator according to an aspect of an exemplary embodiment. In particular, the illustration of FIG. 4 shows graphical indicators 402 and 403 displayed on the A-Pillar 401. The apparatus 100 may control to display graphical indicators 402 and 403 on the A-Pillar 401. - Referring to
FIG. 4 , a display may be part of the A-Pillar 401 and used to display graphical indicators 402 and 403. The graphical indicator 402 may be shaded differently than graphical indicator 403 to indicate that a person or object corresponding to graphical indicator 402 is closer to the vehicle or a more likely obstacle than a person or object corresponding to graphical indicator 403. In addition, graphical indicator 402 may be larger than graphical indicator 403 to indicate that a person or object corresponding to graphical indicator 402 is closer to the vehicle or a more likely obstacle than a person or object corresponding to graphical indicator 403. -
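The shading and sizing cues just described can be derived from distance, as in this minimal sketch: nearer (more likely) obstacles get a larger, more opaque indicator. The pixel and opacity ranges are invented for illustration; the patent does not specify any mapping.

```python
def depth_cues(distance_m: float, max_distance_m: float = 20.0):
    """Return (size_px, opacity) for a graphical indicator so that
    nearer objects are drawn larger and more opaque."""
    # Normalize distance to [0, 1]: 0 = at the machine, 1 = edge of range.
    t = min(max(distance_m / max_distance_m, 0.0), 1.0)
    size_px = int(round(64 - 32 * t))   # 64 px when close, 32 px when far
    opacity = 1.0 - 0.5 * t             # opaque when close, half-transparent far
    return size_px, opacity
```

Applied to FIG. 4, the closer object behind indicator 402 would get roughly double the size and noticeably less transparency than the farther object behind indicator 403.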
Graphical indicators 402 and 403 may likewise be displayed with one or more of the depth, location, and trajectory cues described above. - The processes, methods, or algorithms disclosed herein can be delivered to or implemented by a processing device, controller, or computer, which can include any existing programmable electronic control device or dedicated electronic control device. Similarly, the processes, methods, or algorithms can be stored as data and instructions executable by a controller or computer in many forms including, but not limited to, information permanently stored on non-writable storage media such as ROM devices and information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media. The processes, methods, or algorithms can also be implemented in a software executable object. Alternatively, the processes, methods, or algorithms can be embodied in whole or in part using suitable hardware components, such as Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, controllers, or other hardware components or devices, or a combination of hardware, software, and firmware components.
- One or more exemplary embodiments have been described above with reference to the drawings. The exemplary embodiments described above should be considered in a descriptive sense only and not for purposes of limitation. Moreover, the exemplary embodiments may be modified without departing from the spirit and scope of the inventive concept, which is defined by the following claims.
Claims (20)
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/342,795 US9959767B1 (en) | 2016-11-03 | 2016-11-03 | Method and apparatus for warning of objects |
CN201711031414.0A CN108021859B (en) | 2016-11-03 | 2017-10-27 | Method and apparatus for warning an object |
DE102017125470.6A DE102017125470B4 (en) | 2016-11-03 | 2017-10-30 | OBJECT WARNING DEVICE |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/342,795 US9959767B1 (en) | 2016-11-03 | 2016-11-03 | Method and apparatus for warning of objects |
Publications (2)
Publication Number | Publication Date |
---|---|
US9959767B1 US9959767B1 (en) | 2018-05-01 |
US20180122241A1 true US20180122241A1 (en) | 2018-05-03 |
Family
ID=61912395
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/342,795 Active US9959767B1 (en) | 2016-11-03 | 2016-11-03 | Method and apparatus for warning of objects |
Country Status (3)
Country | Link |
---|---|
US (1) | US9959767B1 (en) |
CN (1) | CN108021859B (en) |
DE (1) | DE102017125470B4 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016151631A1 (en) * | 2015-03-25 | 2016-09-29 | 河西工業株式会社 | Vehicle interior component |
US10981507B1 (en) | 2019-11-07 | 2021-04-20 | Focused Technology Solutions, Inc. | Interactive safety system for vehicles |
DE102021118730A1 (en) * | 2021-07-20 | 2023-01-26 | Bayerische Motoren Werke Aktiengesellschaft | Monitor system for a vehicle |
CN116620168B (en) * | 2023-05-24 | 2023-12-12 | 江苏泽景汽车电子股份有限公司 | Barrier early warning method and device, electronic equipment and storage medium |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20040003216A (en) * | 2002-07-02 | 2004-01-13 | 기아자동차주식회사 | A system for monitoring driving-dead space related to a front piller of vehicle |
JP4810953B2 (en) * | 2005-10-07 | 2011-11-09 | 日産自動車株式会社 | Blind spot image display device for vehicles |
US7804421B2 (en) * | 2007-02-21 | 2010-09-28 | Audiovox Corporation | Vehicle safety system |
JP4325705B2 (en) * | 2007-06-15 | 2009-09-02 | 株式会社デンソー | Display system and program |
JP4604103B2 (en) * | 2008-03-31 | 2010-12-22 | トヨタ自動車株式会社 | Intersection line-of-sight detection device |
WO2009157446A1 (en) * | 2008-06-24 | 2009-12-30 | トヨタ自動車株式会社 | Blind spot display device and driving support device |
US8547298B2 (en) | 2009-04-02 | 2013-10-01 | GM Global Technology Operations LLC | Continuation of exterior view on interior pillars and surfaces |
CN101513856A (en) * | 2009-04-03 | 2009-08-26 | 广东铁将军防盗设备有限公司 | Vehicle side-view imaging system |
US20140184399A1 (en) * | 2012-12-31 | 2014-07-03 | Kia Motors Corporation | Rear collision warning alert system and method |
US9514650B2 (en) * | 2013-03-13 | 2016-12-06 | Honda Motor Co., Ltd. | System and method for warning a driver of pedestrians and other obstacles when turning |
CN105378813A (en) * | 2013-07-05 | 2016-03-02 | 三菱电机株式会社 | Information display device |
JP6065296B2 (en) * | 2014-05-20 | 2017-01-25 | パナソニックIpマネジメント株式会社 | Image display system and display used in image display system |
US10486599B2 (en) * | 2015-07-17 | 2019-11-26 | Magna Mirrors Of America, Inc. | Rearview vision system for vehicle |
- 2016-11-03: US application US15/342,795 filed; granted as US9959767B1 (active)
- 2017-10-27: CN application CN201711031414.0A filed; granted as CN108021859B (active)
- 2017-10-30: DE application DE102017125470.6A filed; granted as DE102017125470B4 (active)
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220363194A1 (en) * | 2021-05-11 | 2022-11-17 | Magna Electronics Inc. | Vehicular display system with a-pillar display |
WO2022240811A1 (en) * | 2021-05-11 | 2022-11-17 | Gentex Corporation | "a" pillar detection system |
US11915590B2 (en) | 2021-05-11 | 2024-02-27 | Gentex Corporation | “A” pillar detection system |
Also Published As
Publication number | Publication date |
---|---|
CN108021859B (en) | 2022-03-01 |
CN108021859A (en) | 2018-05-11 |
DE102017125470A1 (en) | 2018-05-03 |
US9959767B1 (en) | 2018-05-01 |
DE102017125470B4 (en) | 2023-04-20 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CANELLA, DAVID A.;WEIGERT, NORMAN J.;SIGNING DATES FROM 20161027 TO 20161031;REEL/FRAME:040223/0803 |
AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION FILING DATE PREVIOUSLY RECORDED ON REEL 040223 FRAME 0803. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNORS:CANELLA, DAVID A.;WEIGERT, NORMAN J.;SIGNING DATES FROM 20161027 TO 20161031;REEL/FRAME:044551/0001 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |