WO2020036536A1 - A method, system and apparatus for connecting drivers of non-autonomous vehicles - Google Patents


Info

Publication number
WO2020036536A1
WO2020036536A1 (PCT/SG2019/050400)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle, driver, index, parameters, central server
Application number
PCT/SG2019/050400
Other languages
French (fr)
Inventor
Sathiyan PALANI
Santosh Prabhu
Sudheendra SHANTHARAM
Original Assignee
Kaha Pte. Ltd.
Application filed by Kaha Pte. Ltd.
Publication of WO2020036536A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/09: Arrangements for giving variable traffic instructions
    • G08G 1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096708: Systems where the received information might be used to generate an automatic action on the vehicle control
    • G08G 1/096716: Systems where the received information does not generate an automatic action on the vehicle control
    • G08G 1/096733: Systems where a selection of the information might take place
    • G08G 1/096741: Systems where the source of the transmitted information selects which information to transmit to each vehicle
    • G08G 1/096766: Systems characterised by the origin of the information transmission
    • G08G 1/096775: Systems where the origin of the information is a central station
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02: Services making use of location information
    • H04W 4/029: Location-based management or tracking services
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/40: Services for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/46: Services for vehicle-to-vehicle communication [V2V]

Definitions

  • the wearable devices 220A-N connect to the corresponding one of the vehicle units 210A-210N through network 240 and store the indexes and their values from time to time. Further, the wearable devices 220A-N may also append several health parameters measured from the person wearing the device.
  • example wearable devices include, but are not limited to, bands, watches, rings, bracelets, glasses, e-textiles, smart fabrics, etc. Further, devices belonging to wearable technologies include smart watches, smart jewelry, fitness trackers, smart clothing, head-mounted displays, etc.
  • the wearable device 220A-N may connect to the central server 270 and transfer the index values and the health parameters through the network 260.
  • the wearable devices 220A-N may receive the driver index associated with other vehicles and may set out various indications and warnings through one of sound, vibration or visual attention mechanisms, for example to draw the wearer's attention to approaching or nearby vehicles based on their driver indexes.
  • the mobile devices 230A-N similarly connect to the corresponding one of the vehicle units 210A-210N through network 240 and store the indexes and their values from time to time. Further, the mobile devices 230A-N may also connect to the wearable devices 220A-N on network 250 to receive the indexes and the health parameters. In one embodiment, the mobile devices 230A-N may connect to the central server 270 on network 260 and transfer the index values and the health parameters. In another embodiment, the mobile devices 230A-N may receive the driver index associated with other vehicles and may set out various indications and warnings through one of sound, vibration or visual attention mechanisms, for example to draw the user's attention to approaching or nearby vehicles based on the driver indexes. Alternatively, the mobile devices 230A-N may transfer the driver index to the wearable devices 220A-N to set out indications to the driver/user.
  • the communication networks 240, 250 and 260 connect the vehicle units 210A-N, wearable devices 220A-N, mobile devices 230A-N and central server 270 to each other, independently or in combination.
  • the communication networks 240 and 250 may employ communication standards and protocols such as Bluetooth, Wi-Fi, or alternatively any other short-range communication standard.
  • the communication network 260 may employ communication standards and protocols such as GSM, 3G, 4G, 5G, etc.
  • the central server 270 processes the information received from the vehicle units 210A-N, mobile devices 230A-N, wearable devices 220A-N and data storage 280 to generate the indexes, driver index and other comparison results such as indications, warnings and navigation control parameters.
  • the processed results and the information received are sent to relevant ones or all vehicle units 210A-N, mobile devices 230A-N, and wearable devices 220A-N.
  • the processing of information may be shared among the vehicle units 210A-N, mobile devices 230A-N, wearable devices 220A-N and central server 270 to exploit available processing power and reduce the communication overheads on the devices.
  • Data storage 280 stores data received from the central server 270 that may comprise the information received from the devices, processed results, indications and warnings, profiles of the users who are driving the vehicles, etc., for further processing and reference.
  • the central server 270 and the data storage 280 may be deployed as a plurality of servers and storage units, spread over multiple geographical locations or at the same place, that are interconnected to operate in conjunction to provide the processing and storage resources to manage, control and process the desired functionality of the system 200.
  • Example embodiments of the vehicle unit 210A-N are further described below.
  • FIG. 3A and 3B illustrate several example embodiments of the vehicle unit.
  • FIG. 3A is a block diagram of an example vehicle unit 301 and FIG. 3B is a graphical representation illustrating an example deployment of the elements of the vehicle unit 301.
  • the vehicle unit 301 is shown comprising communication module 310, sensors 320, memory 330, controller 340, vehicle built-in components 350 and communication bus 360, for example. Each element is further described below.
  • the communication module 310 operates to transmit and receive the information to and from the network 240-260.
  • the communication module 310 may comprise an encoder, modulator, frequency translator, RF front end and antenna (not shown), interfaced to the other elements on an integrated circuit for compact and power-efficient deployment.
  • the sensors 320 operate to sense conditions around the vehicle 302 and capture signals that determine the motional behavior, pattern, characteristic and tendency parameters (indexes) of the vehicle 302.
  • FIG. 3B illustrates an example deployment of sensors 320 on the vehicle 302.
  • the sensors 320 comprise sensors relating to inertial navigation, such as gyroscopes, accelerometers and magnetic compasses, as well as proximity sensors, motion sensors, infrared, Lidar, Radar, cameras, etc.
  • the sensors 320 configured in the vehicle 302 comprise specific sensors to capture, in a real-time manner, the proximity of one or more vehicles (moving or idle) and road characteristics (foot path, speed breaker, etc.) in one or more directions.
  • one sensor is placed adjacent to another sensor to sense and capture the proximity value simultaneously; sensors placed adjacent to each other assist (by sending data) the microcontroller 340 in precisely determining the proximity between vehicles.
  • the sensors 320 may also comprise sensor types to monitor and determine the turning, stopping and acceleration of the vehicle 302.
  • the sensors 320 in combination sense movement of people, road size, foot path (for example, proximity to the foot path), proximity between any two vehicles approaching the first vehicle, speed breakers, traffic, parking and any real-world activity related to the vehicle.
  • while the sensors 320 are shown deployed on a four-wheeler, it may be understood that the deployment may be extended to two-wheelers, three-wheelers, multi-axle vehicles and/or vehicles of any type and size.
  • the vehicle built-in components 350 detect the condition and status of the vehicle 302, like ignition on/off, engine rpm, speed, load, cabin temperature, occupancy, engine temperature, fuel, etc.
  • the vehicle built-in components 350 are interfaced to the vehicle unit 301 through communication bus 360, thus making them a part of the vehicle unit 301.
  • the communication bus 360 may be a data interface like the CAN bus or any other proprietary vehicle bus made available for interfacing the vehicle built-in components 350.
  • the data and profile of the vehicle 302 may also be transferred on the CAN bus to the controller 340.
  • the profile of the vehicle 302 may comprise engine type, make, power, braking, fuel type, air bags and analysis, gear assembly, axle weight, or any data from OBD or telematics.
  • the controller 340 receives data from the sensors 320, vehicle built-in components 350 and communication module 310 and performs various operations on the received data, such as signal processing and image processing, to generate indexes, notifications and alerts. Further, the controller 340 may transfer the data thus collected from the sensors 320 and vehicle built-in components 350 through the communication module 310 to the mobile device/wearable device/central server. The controller 340 may save the data received from the sensors 320/vehicle built-in components 350 in the memory 330 for further processing or while processing.
  • the controller 340 may also save the information received from the server/mobile device/wearable device through the communication module 310.
  • the controller 340 may operate in a client-server configuration with at least one of the wearable device/mobile device/central server to generate driver indexes.
  • the example driver indexes determined by the central server, mobile device, wearable device and vehicle unit 301, independently or in conjunction with each other, are further described below.
  • FIG. 4 is a table illustrating example parameters, indexes and values employed in the system 200.
  • the column 401 represents the parameters
  • column 402 represents the name of the indexes obtained from the corresponding parameter
  • column 403 represents the typical behavior of the vehicle 302 and the value of the corresponding index.
  • in one row, the parameter is vehicle jerks to the right
  • the corresponding index is referred to as Right Gauge Index (RGI)
  • the value of the RGI is represented in numbers, like total number of jerks to the right determined over time.
  • the jerks to the right may be determined from the sensors mounted on the vehicle 302. The accelerometer and gyroscopes measurements may be processed to determine the jerks to the right.
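The counting implied by the RGI and LGI bullets above can be sketched as follows. This is a minimal illustration, assuming jerks are detected as excursions of lateral acceleration beyond a fixed threshold; the threshold value, sign convention and function name are illustrative assumptions, not taken from the disclosure:

```python
def count_jerks(lateral_accel, threshold=3.0):
    """Count right/left jerk events in a stream of lateral acceleration
    samples (m/s^2). An event is registered once per excursion beyond the
    threshold; positive samples are taken as rightward (an assumption)."""
    right = left = 0
    in_event = False
    for a in lateral_accel:
        if not in_event and abs(a) > threshold:
            in_event = True
            if a > 0:
                right += 1   # contributes to the Right Gauge Index (RGI)
            else:
                left += 1    # contributes to the Left Gauge Index (LGI)
        elif abs(a) <= threshold:
            in_event = False
    return right, left

# Two rightward excursions and one leftward excursion:
rgi, lgi = count_jerks([0.1, 3.5, 3.8, 0.2, -3.2, -0.5, 0.3, 4.1, 0.0])
```

In practice the accelerometer and gyroscope streams would be fused and filtered before counting; this sketch shows only the counting step.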
  • in another row, the parameter is vehicle jerks to the left
  • the corresponding index is Left Gauge Index (LGI)
  • the value of the LGI is represented in numbers, like total number of jerks to the left determined over time.
  • the jerks to the left may also be determined from the sensors mounted on the vehicle 302.
  • rows 413 and 414 represent the forward and backward motion indexes (FMI and BMI), which measure the number of jerks in the forward and backward directions respectively.
  • the parameters in rows 415-417 represent braking indexes, namely the hard, late and early braking indexes (HBI, LBI and EBI) of the vehicle.
  • the HBI, LBI, and EBI may be determined from the sensors 320.
  • the HBI, LBI, and EBI may also be determined from the vehicle built in components in addition to making use of the sensors 320. For example, movement of the brake pad in the vehicle 302 may be sensed in addition to sudden retardation measured on the accelerometer.
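A hard-braking count of the kind the HBI bullet describes might be estimated from successive speed samples, as in the sketch below; the 4 m/s^2 deceleration threshold and 1 s sampling interval are illustrative assumptions, and a real deployment would combine this with brake-pad movement reported by the vehicle built-in components:

```python
def count_hard_brakes(speeds_kmh, dt_s=1.0, decel_threshold=4.0):
    """Count hard-braking events (toward the HBI) from speed samples
    taken dt_s seconds apart. Each sustained deceleration above the
    threshold is counted once."""
    events = 0
    braking = False
    for prev, cur in zip(speeds_kmh, speeds_kmh[1:]):
        decel = (prev - cur) / 3.6 / dt_s   # km/h difference -> m/s^2
        if decel >= decel_threshold and not braking:
            events += 1
            braking = True
        elif decel < decel_threshold:
            braking = False
    return events
```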
  • the parameters in rows 418-420 respectively represent lane indexes: lane change index (LCI), sudden lane change to the right (RLC) and sudden lane change to the left (LLC).
  • the LCI, RLC and LLC may be determined from sensors such as the camera, GPS, etc.
  • the parameters in rows 421-424 respectively represent lane speed limit (LSL), duration of traffic (DT), presence of traffic signal (PTS) and footpath data (FD).
  • the LSL, DT, PTS and FD may either be determined from the sensors 320, such as the camera and proximity sensors, or may be received from external reference data services (not shown).
  • the parameters in rows 425-429 comprise profile data of the users/persons associated with the one or more vehicles 302 at one time or another. As shown there, the profile data may comprise the name, age, gender and address of the drivers and owners of the vehicle.
  • FIG. 5 is a block diagram illustrating example driver index determined in the system 200.
  • the central server 270 determines a value of the index for each parameter/index 411-424 for each user or driver.
  • in one example, n is an integer value representing the number of jerks to the right counted while the first driver is driving the vehicle 302, and the total kilometers driven represents the total kilometers driven by the first driver; the value of the corresponding index may then be derived from these two quantities.
  • the central server 270 determines the value of the index of one parameter for all users, say u1 through un, and computes a median value M as Median of (u1, u2, ..., un). In one embodiment, the Median may be computed as an average.
  • the computation of the driver index may be performed using standard deviation, variance or any known statistical method, or any combination thereof.
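The per-driver index value and the population median described in the preceding bullets can be sketched as follows; the per-kilometre normalisation of the jerk count is an assumption suggested by the "total kilometers driven" wording, not an exact formula from the disclosure:

```python
def index_value(event_count, total_km):
    # Per-driver value of one index (e.g. RGI), normalised by distance
    # driven; the normalisation itself is an illustrative assumption.
    return event_count / total_km

def median(values):
    # Median M of (u1, u2, ..., un) across all drivers. The disclosure
    # notes the Median may also be computed as an average.
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

# Hypothetical drivers u1..u3 with (jerk count, km driven):
values = [index_value(12, 400), index_value(30, 500), index_value(5, 250)]
M = median(values)   # population median for this index
```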
  • the central server 270 compares the independent value of each index associated with the first driver with the median of that index. For example, the value of the index "jerks to the right" of the first driver is compared with the median of the values of the index "jerks to the right" of all the drivers in the system 200. In one embodiment, the comparison is performed as the difference between the independent value of each index associated with the first driver and the median of the index. The comparison result is stored as the delta of the index of the first driver.
  • the central server 270 generates a driver quality value for the index. For example, if the delta value of the jerks to the right is small, the driver quality value of "jerks to the right" is set to 1, indicating a good driver. On the other hand, if the delta value is large (for example, above an optimal value), then the driver quality value is set to zero, indicating a driver who is not good with respect to handling such a condition (jerks to the right).
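The comparison and driver quality value from the two bullets above reduce to a small function; the delta cut-off separating a "good" value from the rest is an assumed parameter, since the disclosure only distinguishes "small" from "above optimal value":

```python
def driver_quality(value, median_value, optimal_delta=0.5):
    """Compare a driver's index value against the population median.
    Returns the stored delta and a quality value: 1 for a good driver
    (small delta), 0 otherwise. optimal_delta is an assumption."""
    delta = value - median_value
    quality = 1 if abs(delta) <= optimal_delta else 0
    return delta, quality
```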
  • the central server 270 segregates the driver quality value for the index with respect to the driver's age.
  • the central server 270 may create age buckets covering different age ranges. For example, the central server 270 may create two buckets corresponding to the age ranges 18-24 and 55-65.
  • the central server 270 assigns a risk index factor to enhance or decrease the driver quality value of each age bucket. For example, the central server 270 may assign a risk index factor of 1.5 to age bucket 18-24 and a risk index factor of 2.5 to age bucket 55-65. Accordingly, the driver quality value for the index of the first driver is respectively multiplied by 1.5 or 2.5 to arrive at the final driver quality value for the index when the first driver is aged between 18-24 or 55-65.
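The age buckets and risk index factors of the example above can be applied as a simple lookup; the behaviour for ages outside the two example buckets is an assumption, since the disclosure defines only these two:

```python
# (age range, risk index factor) pairs from the example in the text.
RISK_FACTORS = [((18, 24), 1.5), ((55, 65), 2.5)]

def final_quality(quality_value, age):
    """Multiply the driver quality value by the risk index factor of
    the matching age bucket; unmatched ages keep the unscaled value."""
    for (lo, hi), factor in RISK_FACTORS:
        if lo <= age <= hi:
            return quality_value * factor
    return quality_value
```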
  • the central server 270 performs various analytical computations with respect to the vehicle profile, like engine capacity, to estimate an index for each driver.
  • the central server 270 may suggest the possible implications of the actions performed by the user as against the profile of the vehicle 302. Further, the central server 270 may determine a dependency factor between the vehicle profile and sensor data and determine the driver index.
  • various traffic rules and traffic data derived from a standard database may be employed to determine the driver index.
  • the central server 270 triggers the actions of blocks 510-560 of determining the driver quality value in real time, at least when one or more drivers in the system 200 come into the vicinity of another driver; for example, when a second driver is travelling in the opposite direction, moving towards the first driver.
  • the central server 270 may trigger notification only based on already computed driver quality value.
  • the central server 270 may trigger a notification to the second driver on detecting that the second driver is in the vicinity of the first driver and travelling towards the first driver, when the driver quality value of the index jerks to the right is higher than a threshold value.
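The real-time trigger condition described above combines vicinity, direction of travel and the threshold check; the 200 m vicinity radius and the boolean approaching flag are illustrative assumptions:

```python
def should_notify(distance_m, approaching, quality_index, threshold,
                  vicinity_m=200.0):
    """Return True when the second driver should be notified: the
    vehicles are within the vicinity radius, moving towards each other,
    and the relevant driver quality value index (e.g. for jerks to the
    right) exceeds the threshold."""
    return (distance_m <= vicinity_m and approaching
            and quality_index > threshold)
```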
  • through the alerting/notification mechanism, one or more of the users are notified before the occurrence of one or more incidents.
  • the notifications may enable one driver to take precautionary measures with respect to the other drivers (who are travelling on the same side or the opposite side of the road).
  • various machine learning algorithms may be employed to determine and alert one driver about one or more other drivers on the road travelling in the same or opposite direction.
  • the manner in which system 200 may operate to provide connectivity between the users/drivers of the vehicle is further described below.
  • FIG. 6 is a block diagram illustrating the operation of the system in an embodiment.
  • user wears wearable devices 220A-N that are connected to vehicle unit 210A-N of the vehicle 150.
  • the vehicle unit 210A-N receives all vehicle related information such as on-board diagnostics, one or more sensor values (for example, real-time sensor values from proximity sensors, placed at different parts of the vehicle body).
  • the wearable devices 220A-N detect the driver of the vehicle 150. Further, the wearable devices 220A-N detect at least one action from the user to identify/register the driver with the vehicle unit 210A-N.
  • the action may be one of a gesture action or any change in accelerometer/gyroscope values.
  • the vehicle unit 210A-N further authenticates the action of the user, retrieves the user profile and initiates the monitoring of the driver activity.
  • the driver does not require a smart wearable device 220A-N or mobile device 230A-N to connect to the vehicle unit 210A-N.
  • the driver can perform an action (or pattern) on the vehicle steering (for example, rotating the steering to the left twice, immediately after starting the vehicle).
  • the vehicle unit 210A-N identifies this action of the user and matches it against the configurations stored in the database; if a match is found (that is, the action authenticates and authorizes the driver), the vehicle unit 210A-N retrieves the associated user's profile. Further, the vehicle unit 210A-N initiates the recording of the driving activity.
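The steering-pattern matching above amounts to looking up the observed action against the stored configurations; the pattern encoding and profile identifier below are hypothetical:

```python
# Registered steering patterns -> user profiles (hypothetical data).
REGISTERED_PATTERNS = {
    ("left", "left"): "driver_profile_1",  # rotate left twice after start
}

def authenticate(observed_pattern):
    """Return the associated user profile if the observed steering
    pattern matches a stored configuration, else None."""
    return REGISTERED_PATTERNS.get(tuple(observed_pattern))
```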
  • the vehicle unit 210A-N records the driving activity of the user, monitors driving actions (parameters such as LGI, RGI, FMI, BMI, HBI, LBI, EBI, LCI, RLC, LLC and LSL) and stores them in the profile of the driver.
  • the vehicle unit 210A-N may always be connected to the central server 270 through any one of the networks 240-260.
  • a mobile device 230A-N is connected / paired to the vehicle unit 210A-N.
  • the mobile device 230A-N may identify user through one or more authentication methods.
  • the vehicle unit 210A-N can authenticate and identify the user (who is the driver) through the connected mobile device 230A-N.
  • the authentication methods may be one of, or a combination of, entering a pin, a password, any gesture action, or any change in accelerometer/gyroscope values.


Abstract

A system comprising a vehicle unit (301) to determine a plurality of parameters relating to movement of a first vehicle (302), a wearable device (220A-220N) to store and append health parameters of a wearer of the wearable device (220A-220N), a mobile device (230A-230N) to store the health parameters and the plurality of parameters, a central server (270) to process and generate a driver index and compare parameters of the first vehicle, and a data storage (280) to store the processed information from the central server (270), wherein the vehicle unit, the wearable device and the mobile device are connected through a first network (240) and a second network (250) while the central server (270) is connected through a third network (260) to set out indications and warnings related to the characteristic, behavior and tendency of the first vehicle's movement to nearby vehicles (160A-160N) in its vicinity.

Description

A Method, System and Apparatus for Connecting Drivers of Non-
Autonomous Vehicles
DESCRIPTION
FIELD OF INVENTION
[0001] Embodiments of the present disclosure relate generally to signal and data processing and more specifically to a method, system and apparatus for connecting drivers of non-autonomous vehicles.
RELATED ART
[0002] Non-autonomous vehicle generally refers to a vehicle at least partially controlled or driven by a human present within the vehicle. In such a vehicle, the presence of a human being within the vehicle is a necessity to move the vehicle. Besides various navigational aids, the person or the driver is provided with various controlling interfaces (levers and pedals) like the accelerator, brake, steering, etc., to drive the vehicle to the best of the acquired skill.
[0003] The movement of non-autonomous vehicles often exhibits characteristics and behavior that are directly or at least partially related to the behavior, skills and other human factors of the person driving the vehicle. Thus, the characteristic and behavioral aspects of the vehicle in motion have remained indeterministic to a large extent.
[0004] It is therefore necessary to develop a system, method and apparatus for determining the behavior and characteristics of a non-autonomous vehicle in motion and communicating the same to another non-autonomous vehicle in the vicinity, to reduce and limit the probability of occurrence of an accident.
SUMMARY
[0005] According to an aspect of the present disclosure, there is provided a system comprising a vehicle unit (301) to determine a plurality of parameters relating to movement of a first vehicle (302), a wearable device (220A-220N) to store and append health parameters of a wearer of the wearable device (220A-220N), a mobile device (230A-230N) to store the health parameters and the plurality of parameters, a central server (270) to process and generate a driver index and compare parameters of the first vehicle, and a data storage (280) to store the processed information from the central server (270), wherein the vehicle unit, the wearable device and the mobile device are connected through a first network (240) and a second network (250) while the central server (270) is connected through a third network (260) to set out indications and warnings related to the characteristic, behavior and tendency of the first vehicle's movement to nearby vehicles (160A-160N) in its vicinity.
[0006] Several aspects are described below with reference to diagrams. It should be understood that numerous specific details, relationships and methods are set forth to provide a full understanding of the present disclosure. Those skilled in the relevant art, however, will readily recognize that the present disclosure can be practiced without one or more of the specific details, or with other methods, etc. In other instances, well-known structures or operations are not shown in detail to avoid obscuring the features of the present disclosure.
BRIEF DESCRIPTION OF DRAWINGS
[0007] FIG. 1 is a block diagram illustrating an example scenario in which various aspects of the present disclosure may be seen.
[0008] FIG. 2 is a block diagram of an example system in which various aspects of the present invention may be seen.
[0009] FIG. 3A is a block diagram of an example vehicle unit.
[0010] FIG. 3B illustrates an example deployment of sensors on the vehicle.
[0011] FIG. 4 is a table illustrating example parameters, indexes and values employed in the system.
[0012] FIG. 5 is a block diagram illustrating example driver index determined in the system.
[0013] FIG. 6 is a block diagram illustrating the operation of the system in an embodiment.
DETAILED DESCRIPTION OF THE PREFERRED EXAMPLES
[0014] FIG. 1 is a block diagram illustrating an example scenario in which various aspects of the present disclosure may be seen. The scenario 100 is shown comprising non-autonomous vehicle 150 and other vehicles 160A-160N. The other vehicles 160A-160N may be autonomous and/or non- autonomous vehicles. The scenario 100 is further described below.
[0015] The non-autonomous vehicle 150 determines the vehicle movement behavior, characteristic and tendency based on the person driving the non-autonomous vehicle 150 and communicates the behavior, tendency and characteristic to the other vehicles 160A-160N so that the other vehicles 160A-160N may adjust their course of movement to avoid accidents. Further, the non-autonomous vehicle 150 may communicate the characteristic, behavior and tendency to the other vehicles 160A-160N at all times, when in the vicinity, or when the characteristic, behavior and tendency exhibit abnormality. As a result, the other vehicles 160A-160N may adjust their movement in relation to the tendency of the non-autonomous vehicle 150.
[0016] In one embodiment, the characteristic, behavior and tendency of the non-autonomous vehicle 150 are determined in respect of a person driving the non-autonomous vehicle 150. Thus, a set of characteristic, behavior and tendency values tagged to or associated with a driver is together referred to as the driver index of the non-autonomous vehicle 150. The manner in which the characteristic, behavior and tendency (driver index) of the non-autonomous vehicle 150 is determined and communicated to the other vehicles 160A-160N is further described below.

[0017] FIG. 2 is a block diagram of an example system in which various aspects of the present invention may be seen. The example system 200 is shown comprising vehicle units 210A-N, wearable devices 220A-N, mobile devices 230A-N, communication networks 240, 250 and 260, a central server 270, and a data storage 280. Each element is described in further detail below.
[0018] Each vehicle unit 210A-N determines a number of parameters (referred to as indexes) relating to the movement of the vehicle, such as speed, brake, drift, jerks, swing, etc. The determined indexes are then tagged to or associated with a profile of the person driving the respective vehicle (vehicle 150, for example) at the time of determination. In one embodiment, the number of indexes and their values are determined through a network or array of sensors mounted on the vehicle.
[0019] The wearable devices 220A-N connect to the corresponding one of the vehicle units 210A-210N through network 240 and store the indexes and their values from time to time. Further, the wearable devices 220A-N may also append several health parameters measured from the person wearing the device. Example wearable devices include, but are not limited to, bands, watches, rings, bracelets, glasses, e-textiles, smart fabrics, etc. Further, devices belonging to wearable technologies include smart watches, smart jewelry, fitness trackers, smart clothing, head-mounted displays, etc. In one embodiment, the wearable device 220A-N may connect to the central server 270 and transfer the index values and the health parameters through the network 260. In another embodiment, the wearable device 220A-N may receive the driver index associated with the other vehicles and may set out various indications and warnings through one of sound, vibration, or visual attention mechanisms to draw the attention of the wearer to the approaching or nearby vehicles based on the driver indexes, for example.
[0020] The mobile devices 230A-N similarly connect to the corresponding one of the vehicle units 210A-210N through network 240 and store the indexes and their values from time to time. Further, the mobile devices 230A-N may also connect to the wearable devices 220A-N on network 250 to receive the indexes and the health parameters. In one embodiment, the mobile devices 230A-N may connect to the central server 270 on network 260 and transfer the index values and the health parameters. In another embodiment, the mobile devices 230A-N may receive the driver index associated with the other vehicles and may set out various indications and warnings through one of sound, vibration, or visual attention mechanisms to draw the attention of the user to the approaching or nearby vehicles based on the driver indexes, for example. Alternatively, the mobile devices 230A-N may transfer the driver index to the wearable devices 220A-N to set out indications to the driver/user.
[0021] The communication networks 240, 250 and 260 connect the vehicle units 210A-N, wearable devices 220A-N, mobile devices 230A-N and central server 270 to each other independently or in combination. In one embodiment, the communication networks 240 and 250 may employ communication standards and protocols such as Bluetooth, Wi-Fi, or alternatively any other short-distance communication standard. Similarly, the communication network 260 may employ communication standards and protocols such as GSM, 3G, 4G, 5G, etc.
[0022] The central server 270 processes the information received from the vehicle units 210A-N, mobile devices 230A-N, wearable devices 220A-N, and data storage 280 to generate indexes, the driver index and other comparison results such as indications, warnings and navigation control parameters. The processed results and the information received are sent to relevant ones or all of the vehicle units 210A-N, mobile devices 230A-N, and wearable devices 220A-N. In one embodiment, the processing of information may be shared among the vehicle units 210A-N, mobile devices 230A-N, wearable devices 220A-N and central server 270 to exploit available processing power and reduce the communication overheads on the devices.
[0023] Data storage 280 stores data received from the central server 270, which may comprise the information received from the devices, processed results, indications and warnings, profiles of the users who are driving the vehicles, etc., for further processing and reference. In one embodiment, the central server 270 and the data storage 280 may be deployed in a plurality of servers and storage units, spread over multiple geographical locations or at the same place, that are interconnected to operate in conjunction to provide the processing and storage resources to manage, control and process the desired functionality of the system 200. Example embodiments of the vehicle units 210A-N are further described below.
[0024] FIGS. 3A and 3B illustrate several example embodiments of the vehicle unit. FIG. 3A is a block diagram of an example vehicle unit 301 and FIG. 3B is a graphical representation illustrating an example deployment of the elements of the vehicle unit 301. The vehicle unit 301 is shown comprising communication module 310, sensors 320, memory 330, controller 340, vehicle built-in components 350 and communication bus 360, for example. Each element is further described below.
[0025] The communication module 310 operates to transmit and receive information to and from the networks 240-260. The communication module 310 may comprise an encoder, a modulator, a frequency translator, an RF front end and an antenna (not shown), interfaced to the other elements on an integrated circuit for compact and power-efficient deployment.
[0026] The sensors 320 operate to sense conditions around the vehicle 302 and capture signals that determine the motional behavior, pattern and characteristics, and tendency parameters (indexes) of the vehicle 302. FIG. 3B illustrates an example deployment of the sensors 320 on the vehicle 302. In one embodiment, the sensors 320 comprise sensors relating to inertial navigation such as gyroscopes, accelerometers, a magnetic compass, proximity sensors, motion sensors, infrared sensors, Lidar, Radar, cameras, etc.

[0027] In an embodiment, the sensors 320 configured in the vehicle 302 comprise specific sensors to capture the proximity of one or more vehicles (moving or idle) and road characteristics (footpath, speed breaker, etc.) in one or more directions, in a real-time manner. In another embodiment, one sensor is placed adjacent to another sensor to sense and capture the proximity value simultaneously. Thus, sensors placed adjacent to each other assist (by sending data) the microcontroller 340 in precisely determining the proximity between vehicles. Further, the sensors 320 may also comprise sensor types to monitor and determine the turning, stopping and acceleration of the vehicle 302.
[0028] In an alternative embodiment, the sensors 320 in combination sense movement of people, road size, footpath (for example, proximity to the footpath), proximity between any two vehicles approaching the first vehicle, speed breakers, traffic, parking and any real-world activity related to the vehicle. Though the sensors 320 are shown to have been deployed on a four-wheeler, it may be understood that the deployment may be extended to two-wheelers, three-wheelers, multiple-axle vehicles and/or vehicles of any type and size.
[0029] The vehicle built-in components 350 detect conditions and status of the vehicle 302, like ignition on/off, engine rpm, speed, load, cabin temperature, occupancy, engine temperature, fuel, etc. In one embodiment, the vehicle built-in components 350 are interfaced to the vehicle unit 301 through the communication bus 360, thus making them a part of the vehicle unit 301. The communication bus 360 may be data interface lines like a CAN bus or any other proprietary vehicle bus made available for interfacing the vehicle built-in components 350. The data and profile of the vehicle 302 may also be transferred on the CAN bus to the controller 340. The profile of the vehicle 302 may comprise engine type, make, power, braking, fuel type, air bags and analysis, gear assembly, axle weight, or any data from OBD or telematics.
[0030] The controller 340 receives data from the sensors 320, the vehicle built-in components 350, and the communication module 310 and performs various operations, such as signal processing and image processing, on the data received to generate indexes, notifications, and alerts. Further, the controller 340 may transfer the data thus collected from the sensors 320 and the vehicle built-in components 350 through the communication module 310 to the mobile device/wearable device/central server. The controller 340 may save the data received from the sensors 320/vehicle built-in components 350 in the memory 330 for further processing or while processing.
[0031] In one embodiment, the controller 340 may also save the information received from the server/mobile device/wearable device through the communication module 310. In an alternative embodiment, the controller 340 may operate in a client-server configuration with at least one of the wearable device/mobile device/central server to generate the driver indexes. The example driver indexes determined by the central server, mobile device, wearable device, and vehicle unit 301, independently or in conjunction with each other, are further described below.

[0032] FIG. 4 is a table illustrating example parameters, indexes and values employed in the system 200. Column 401 represents the parameters, column 402 represents the name of the index obtained from the corresponding parameter, and column 403 represents the typical behavior of the vehicle 302 and the value of the corresponding index. For example, on row 411, the parameter is vehicle jerks to the right, the corresponding index is referred to as the Right Gauge Index (RGI), and the value of the RGI is represented in numbers, like the total number of jerks to the right determined over time. In one embodiment, the jerks to the right may be determined from the sensors mounted on the vehicle 302. The accelerometer and gyroscope measurements may be processed to determine the jerks to the right.
[0033] Similarly, on row 412, the parameter is vehicle jerks to the left, the corresponding index is the Left Gauge Index (LGI), and the value of the LGI is represented in numbers, like the total number of jerks to the left determined over time. In one embodiment, the jerks to the left may also be determined from the sensors mounted on the vehicle 302. Similarly, rows 413 and 414 represent the forward and backward motion indexes that measure jerks in terms of the number of jerks in the forward and backward directions respectively.
[0034] The parameters on rows 415-417 represent braking indexes like the hard, late and early braking indexes (HBI, LBI and EBI) of the vehicle. In one embodiment, the HBI, LBI and EBI may be determined from the sensors 320. As an alternative, the HBI, LBI and EBI may also be determined from the vehicle built-in components in addition to making use of the sensors 320. For example, movement of the brake pad in the vehicle 302 may be sensed in addition to sudden retardation measured on the accelerometer. The parameters on rows 418-420 respectively represent lane indexes like lane changes (LCI), sudden lane change to the right (RLC) and sudden lane change to the left (LLC). In one embodiment, the LCI, RLC and LLC may be determined from sensors such as the camera, GPS, etc.
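As one illustration of how the sudden retardation measured on the accelerometer could be thresholded into a hard-braking count for the HBI, the following minimal sketch assumes sampled longitudinal deceleration values and an illustrative 4.0 m/s² threshold; neither the sampling scheme nor the threshold value is specified in this disclosure:

```python
def hard_brake_events(decelerations_ms2, threshold=4.0):
    """Count hard-braking events toward the HBI from sampled longitudinal
    deceleration values (m/s^2). The 4.0 m/s^2 threshold is an assumed
    illustrative value, not one taken from this disclosure."""
    return sum(1 for a in decelerations_ms2 if a >= threshold)
```

For example, a trip sampled as `[1.2, 4.5, 6.0, 2.0]` would contribute two hard-braking events under this assumed threshold.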
[0035] The parameters on rows 421-424 respectively represent the lane speed limit (LSL), duration of traffic (DT), presence of traffic signal (PTS) and footpath data (FD). In one embodiment, the LSL, DT, PTS and FD may be determined from the sensors 320, such as the camera and proximity sensors, or may be received from external reference data services (not shown).
[0036] The parameters on rows 425-429 comprise profile data of the users/persons associated with the one or more vehicles 302 at one time or another. As shown there, the profile data may comprise the name, age, gender and address of the drivers and owners of the vehicle.
[0037] In one embodiment, the indexes on rows 411-424 are generated through the sensors 320. Alternatively, the indexes on rows 411-424 may be generated with the vehicle built-in components during the motion of the vehicle 302 for each driver. The manner in which the driver index may be generated from the indexes 411-424 is further described below.

[0038] FIG. 5 is a block diagram illustrating an example driver index determined in the system 200. In block 510, the central server 270 determines a value of the index for each parameter/index 411-424 for each user or driver. For example, for the index on row 411, the value of the index of the first user may be computed as u1 = (n/100) * (total kilometers driven). In that, n is an integer value representing the number of jerks to the right counted while the first driver is driving the vehicle 302, and the total kilometers driven represents the total kilometers driven by the first driver.
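The block 510 computation can be sketched directly from the formula in paragraph [0038]; the function name and argument names below are illustrative, not part of this disclosure:

```python
def index_value(event_count, total_km):
    """Block 510: value of one per-driver index, e.g. the RGI,
    computed as u1 = (n / 100) * (total kilometers driven),
    where n is the number of counted events (jerks to the right)."""
    return (event_count / 100) * total_km
```

For a driver with 40 counted right jerks over 500 km, `index_value(40, 500)` yields 200.0.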
[0039] In block 520, the central server 270 determines the value of the index of one parameter for all users, say u1 through un, and computes a median value M as Median of (u1, u2, ..., un). In one embodiment, the median may be computed as an average. In an embodiment, the computation of the driver index may be determined by the standard deviation, the variance, any known statistical method, or any combination thereof.
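Block 520 reduces one index across all drivers to a single reference value M. A minimal sketch follows, assuming the per-driver values u1 through un are available as a list; the embodiments permit the median to be replaced by an average or another statistic, which is reflected in the `method` parameter (an illustrative name):

```python
from statistics import mean, median, pstdev

def reference_value(values, method="median"):
    """Block 520: reference value M of one index over all drivers.
    The median is the default; the mean or the population standard
    deviation may be substituted per the described embodiments."""
    methods = {"median": median, "mean": mean, "pstdev": pstdev}
    return methods[method](values)
```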
[0040] In block 530, the central server 270 compares the independent values of each index associated with the first driver with the median of that index. For example, the value of the index "jerks to the right" of the first driver is compared with the median of the values of the index "jerks to the right" of all the drivers in the system 200. In one embodiment, the comparison is performed as the difference of the independent values of each index associated with the first driver from the median of the index. The comparison result is stored as the delta of the index of the first driver.
[0041] In block 540, the central server 270 generates a driver quality value for the index. For example, if the delta value of the jerks to the right is small, the driver quality value of the "jerks to the right" index is set to 1, indicating a good driver. On the other hand, if the delta value is large (for example, above an optimal value), then the driver quality value is set to zero, indicating a driver who is not good with respect to the handling of such a condition (jerks to the right).
[0042] In block 550, the central server 270 segregates the driver quality value for the index with respect to the driver age value. In one embodiment, the central server may create age buckets covering different age ranges. For example, the central server 270 may create two buckets corresponding to the age ranges 18-24 and 55-65.
[0043] In block 560, the central server 270 assigns a risk index factor to enhance or decrease the driver quality value of each age bucket. For example, the central server 270 may assign a risk index factor of 1.5 to the age bucket 18-24 and a risk index factor of 2.5 to the age bucket 55-65. Accordingly, the driver quality value for the index of the first driver is respectively multiplied by 1.5 or 2.5 to arrive at the final driver quality value for the index when the first driver is aged between 18-24 or 55-65.
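Blocks 530 through 560 can be combined into one small sketch: the delta against the median is mapped to a 0/1 driver quality value, which is then scaled by the risk index factor of the driver's age bucket. The smallness threshold is an assumed parameter (the disclosure says only "small" versus "above optimal value"); the 1.5 and 2.5 factors and the two age buckets follow the example in paragraph [0043]:

```python
# Illustrative age buckets and risk index factors from paragraph [0043].
RISK_FACTORS = {(18, 24): 1.5, (55, 65): 2.5}

def driver_quality(value, median_value, threshold):
    """Blocks 530-540: delta of the driver's index value against the
    population median; a small delta yields 1 (good), a large one 0."""
    delta = abs(value - median_value)
    return 1 if delta <= threshold else 0

def final_quality(value, median_value, threshold, age):
    """Blocks 550-560: scale the quality value by the risk index
    factor of the matching age bucket, if any."""
    quality = driver_quality(value, median_value, threshold)
    for (low, high), factor in RISK_FACTORS.items():
        if low <= age <= high:
            return quality * factor
    return float(quality)
```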
[0044] In an embodiment, the central server 270 performs various analytical computations with respect to the vehicle profile, like engine capacity, to estimate an index for each driver. The central server 270 may suggest the possible implications of the actions performed by the user as against the profile of the vehicle 302. Further, the central server 270 may determine a dependency factor between the vehicle profile and the sensor data and determine the driver index. In another embodiment, various traffic rules and traffic data derived from a standard database may be employed to determine the driver index.
[0045] In block 570, the central server 270 triggers the actions of blocks 510-560 of determining the driver quality value in real time, at least when one or more drivers in the system 200 come in the vicinity of another driver, for example, when a second driver is travelling (moving) in the opposite direction to the first driver. Alternatively, the central server 270 may trigger a notification based only on an already computed driver quality value. For example, the central server 270 may trigger a notification to the second driver on detecting that the second driver is in the vicinity of the first driver and travelling in the opposite direction, when the driver quality value of the index jerks to the right is higher than a threshold value.
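The notification trigger of block 570 amounts to a vicinity check combined with a threshold comparison. The sketch below uses a haversine distance between (latitude, longitude) positions and an assumed 0.5 km vicinity radius, since this disclosure does not specify how vicinity is measured; the function names are illustrative:

```python
from math import asin, cos, radians, sin, sqrt

def within_vicinity(pos_a, pos_b, radius_km=0.5):
    """Haversine check that two (lat, lon) positions are within an
    assumed vicinity radius; the 0.5 km default is illustrative."""
    lat1, lon1, lat2, lon2 = map(radians, (*pos_a, *pos_b))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a)) <= radius_km

def should_notify(quality_value, threshold, pos_first, pos_second):
    """Block 570: notify the second driver when the first driver's
    quality value for an index exceeds the threshold and the two
    vehicles are in each other's vicinity."""
    return quality_value > threshold and within_vicinity(pos_first, pos_second)
```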
[0046] As a result of the alerting mechanism/notification, one or more of the users are notified before the occurrence of one or more incidents. The notifications may enable one driver to take precautionary measures with respect to the other drivers (who are travelling on the same side or the opposite side of the road). Various machine learning algorithms may be employed to determine and alert one driver about one or more other drivers on the road travelling in the same or opposite directions. The manner in which the system 200 may operate to provide connectivity between the users/drivers of the vehicles is further described below.
[0047] FIG. 6 is a block diagram illustrating the operation of the system in an embodiment. In block 610, a user wears a wearable device 220A-N that is connected to the vehicle unit 210A-N of the vehicle 150. The vehicle unit 210A-N receives all vehicle-related information, such as on-board diagnostics and one or more sensor values (for example, real-time sensor values from proximity sensors placed at different parts of the vehicle body).
[0048] In block 620, the wearable device 220A-N detects the driver of the vehicle 150. Further, the wearable device 220A-N detects at least one action from the user to identify/register the driver to the vehicle unit 210A-N. The action may be one of a gesture action or any change in accelerometer/gyro values. The vehicle unit 210A-N further authenticates the action of the user, retrieves the user profile and initiates the monitoring of the driver activity.
[0049] In one embodiment, the driver does not require a smart wearable/mobile device 220A-N / 230A-N to connect to the vehicle unit 210A-N. Instead, the driver can perform an action (or pattern) on the vehicle steering (for example, the action may be rotating the vehicle steering in the left direction twice, immediately after starting the vehicle). The vehicle unit 210A-N identifies this action of the user and matches it with the configuration stored in the database; if a match is found (thereby authenticating and authorizing the driver), the vehicle unit 210A-N retrieves the associated user's profile. Further, the vehicle unit 210A-N initiates the recording of the driving activity. The vehicle unit 210A-N records the driving activity of the user, monitors the driving actions (parameters such as LGI, RGI, FMI, BMI, HBI, LBI, EBI, LCI, RLC, LLC and LSL) and stores them in the profile of the driver. As a further alternative, the vehicle unit 210A-N may be always connected to the central server 270 through any one of the networks 240-260.
[0050] In block 630, a mobile device 230A-N is connected/paired to the vehicle unit 210A-N. The mobile device 230A-N may identify the user through one or more authentication methods. Alternatively, the vehicle unit 210A-N can authenticate and identify the user (who is the driver) through the connected mobile device 230A-N. The authentication methods may be one of, or a combination of, entering a PIN, a password, any gesture action, or any change in accelerometer/gyro values.
[0051] While various embodiments of the present disclosure have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of the present disclosure should not be limited by any of the above-discussed embodiments but should be defined only in accordance with the following claims and their equivalents.

Claims

CLAIMS

We claim:
1. A system comprising:
a vehicle unit (301) to determine a plurality of parameters relating to movement of a first vehicle (302);
a wearable device (220A-220N) to store and append at least one health parameter of a wearer of the wearable device (220A-220N) to the plurality of parameters relating to the movement of the first vehicle (302);
a mobile device (230A-230N) connected to the vehicle unit (301) and the wearable device (220A-220N) to store the health parameter and the plurality of parameters relating to the movement of the first vehicle (302);
a central server (270) to process and generate a driver index and a comparison parameter of the first vehicle from the health parameters and the plurality of parameters relating to the movement of the first vehicle (302); and
a data storage (280) to store the processed information from the central server (270), wherein the vehicle unit (301), the wearable device (220A-220N) and the mobile device (230A-230N) are connected through a first network (240) and a second network (250) while the central server (270) is connected through a third network (260) to set out indications and warnings related to characteristic, behavior and tendency of the first vehicle movement to nearby vehicles (160A-160N) in its vicinity.
2. The system of claim 1, wherein the vehicle unit (301) further comprising:
a communication module (310) to transmit and receive a first data to and from the first, second and third networks (240, 250 and 260);
a network or array of sensors (320) to sense and capture a signal that determines a motional behavior, a pattern and characteristic, a tendency parameter of the first vehicle (302);
at least one vehicle built-in component (350) to detect a condition and a status of the first vehicle (302);
a memory (330) to save a second data received from the vehicle built-in components (350);
a controller (340) to process and transfer the first and second data to at least one of the wearable device (220A-220N), the mobile device (230A-230N) and the central server (270); and
a communication bus (360) to interface the vehicle built-in components (350) to the vehicle unit (301).
3. The system of claim 2, wherein the plurality of parameters are determined through the array of sensors (320) mounted on the first vehicle (302) and are associated to a profile of the wearer of the wearable device (220A-220N) at a time of determination.
4. The system of claim 3, wherein the array of sensors (320) comprises at least one of an inertial navigation sensor, a proximity sensor, an accelerometer, a gyroscope, a motion sensor, an infrared sensor, a Lidar, a Radar and a camera to determine the plurality of parameters relating to the movement of the first vehicle (302).
5. The system of claim 4, wherein the condition and status of the first vehicle (302) comprises at least one of an ignition status, an engine rpm, a speed, a load, a cabin temperature, an occupancy, an engine temperature, a fuel and its type, an engine type, an engine make, a braking and gear assembly.
6. The system of claim 5, wherein the plurality of parameters comprises at least one of the motional behavior, the pattern and characteristics and the tendency parameters such as speed, brake, drift, jerks and swing of the first vehicle (302).
7. The system of claim 6, wherein the wearable device (220A-220N) and the mobile device (230A-230N) receive the driver index associated with other vehicles (160A-160N) to set out the indications and warnings to draw attention of the wearer to the approaching or nearby vehicle (160A-160N).
8. A method of determining a driver index comprising:
determining a first set of value indexes (510) for a first set of parameters for each driver;
determining a second set of value indexes of a first parameter (520) for all users and computing a median value;
comparing the first set of indexes with the second set of indexes (530) associated with a first driver with the median of that index;
generating a driver quality value (540) for the index;
segregating the driver quality value (550) for the index with respect to a driver age value;
assigning a risk index factor (560) to enhance or decrease the driver quality value of each age bucket; and
triggering an action for a notification (570) when one or more drivers in a system come in the vicinity of another driver.
9. A method, system and apparatus providing one or more features as described in the paragraphs of this specification.
PCT/SG2019/050400 2018-08-14 2019-08-14 A method, system and apparatus for connecting drivers of non-autonomous vehicles WO2020036536A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201806878V 2018-08-14
SG10201806878VA SG10201806878VA (en) 2018-08-14 2018-08-14 A Method, System and Apparatus for Connecting Drivers of Non-Autonomous Vehicles

Publications (1)

Publication Number Publication Date
WO2020036536A1 true WO2020036536A1 (en) 2020-02-20


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120025969A1 (en) * 2009-04-07 2012-02-02 Volvo Technology Corporation Method and system to enhance traffic safety and efficiency for vehicles
US20150266484A1 (en) * 2012-10-10 2015-09-24 Freescale Semiconductor, In. Method and apparatus for generating an indicator of a risk level in operating systems
CA2912906A1 (en) * 2015-11-20 2017-05-20 Depura Partners Llc System for monitoring and classifying vehicle operator behavior
US20170154532A1 (en) * 2015-11-27 2017-06-01 Bragi GmbH Vehicle to vehicle communications using ear pieces
US20170263120A1 (en) * 2012-06-07 2017-09-14 Zoll Medical Corporation Vehicle safety and driver condition monitoring, and geographic information based road safety systems
US10460534B1 (en) * 2015-10-26 2019-10-29 Allstate Insurance Company Vehicle-to-vehicle accident detection


Also Published As

Publication number Publication date
SG10201806878VA (en) 2020-03-30

