GB2552489A - Apparatus and method for vehicle evaluation - Google Patents

Apparatus and method for vehicle evaluation

Info

Publication number
GB2552489A
Authority
GB
United Kingdom
Prior art keywords
vehicle
occupant
data
remote computer
computer system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1612843.1A
Other versions
GB2552489B (en)
GB201612843D0 (en)
Inventor
Skrypchuk Lee
Giacomin Joseph
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jaguar Land Rover Ltd
Original Assignee
Jaguar Land Rover Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jaguar Land Rover Ltd filed Critical Jaguar Land Rover Ltd
Priority to GB1612843.1A
Publication of GB201612843D0
Publication of GB2552489A
Application granted
Publication of GB2552489B
Legal status: Active
Anticipated expiration


Classifications

    • G07C5/008: Registering or indicating the working of vehicles communicating information to a remotely located station
    • B60W40/08: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to drivers or passengers
    • G06F3/012: Head tracking input arrangements
    • G06F3/0304: Detection arrangements using opto-electronic means
    • B60W2040/0872: Driver physiology
    • B60W2420/403: Image sensing, e.g. optical camera
    • B60W2540/18: Steering angle (input parameters relating to occupants)
    • B60W2556/45: External transmission of data to or from the vehicle
    • G06F2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G07C5/08: Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method of vehicle evaluation wherein the vehicle comprises: sensing means for measuring one or more attributes associated with the vehicle to determine vehicle data; occupant sensing means for determining a state of at least one occupant of the vehicle to determine occupant data; wireless communication means for wirelessly communicating with a remote computer system; and control means to receive the vehicle and occupant data and provide the remote computer system with that data. In response to the vehicle and/or occupant data, an audio communication channel may be established between the remote computer system and the occupant. The data might be transmitted following an evaluation event based on the data. The event may be determined by the data exceeding one or more predetermined values, a rate of change of the data, or a predetermined pattern of the data. The event may alternatively be determined by imaging the occupant's face and recognising an emotion based on a facial expression. The audio communication may be used to gather feedback and other evaluation data from the occupant about the vehicle.

Description

(54) Title of the Invention: Apparatus and method for vehicle evaluation
Abstract Title: Evaluating a vehicle and an occupant following an event
(57) A method of vehicle evaluation wherein the vehicle comprises: sensing means for measuring one or more attributes associated with the vehicle to determine vehicle data; occupant sensing means for determining a state of at least one occupant of the vehicle to determine occupant data; wireless communication means for wirelessly communicating with a remote computer system; and control means to receive the vehicle and occupant data and provide the remote computer system with that data. In response to the vehicle and/or occupant data, an audio communication channel may be established between the remote computer system and the occupant. The data might be transmitted following an evaluation event based on the data. The event may be determined by the data exceeding one or more predetermined values, a rate of change of the data, or a predetermined pattern of the data. The event may alternatively be determined by imaging the occupant's face and recognising an emotion based on a facial expression. The audio communication may be used to gather feedback and other evaluation data from the occupant about the vehicle.
[Drawings: Fig. 1, Fig. 2, Fig. 3 and Fig. 4; see the Brief Description of the Drawings below.]
APPARATUS AND METHOD FOR VEHICLE EVALUATION
TECHNICAL FIELD
The present disclosure relates to a method and apparatus for vehicle evaluation. Aspects of the invention relate to a method of vehicle evaluation, to a vehicle, to a system for vehicle evaluation, to a method of vehicle development, and to computer software for vehicle evaluation.
BACKGROUND
Two methods are currently known for evaluating a vehicle, or parts thereof. Firstly, the vehicle to be evaluated can be provided to a user, who uses it in the real world and reports at a later time, for example to a manufacturer of the vehicle or of a part thereof, on the use of the vehicle or component. This method suffers from imperfect recollection: because the user reports back some time after using the vehicle, their memories fade or their viewpoint changes in the interim. Alternatively, the vehicle may be provided to a user who uses it in a controlled environment, such as a test track or other private area. In this case the user may report to an evaluator immediately, or even while using the vehicle, i.e. the evaluator may be present in the vehicle. However, the presence of the evaluator within the vehicle may influence the user's behaviour or viewpoint, and use of the vehicle in such a private area may not accurately reflect real-world conditions.
It is an object of embodiments of the invention to at least mitigate one or more of the problems of the prior art.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide a method of vehicle evaluation, a vehicle, a system for vehicle evaluation, a method of vehicle development, and computer software for vehicle evaluation, as claimed in the appended claims.
According to an aspect of the invention, there is provided a method of vehicle evaluation, comprising providing, via a communication channel, vehicle data and occupant data to a remote computer system and, responsive to at least one of the vehicle data and the occupant data, establishing a communication channel between the remote computer system and at least one occupant of the vehicle.
Embodiments of the invention aim to determine from one or both of the vehicle and occupant data an emotional event or situation relevant to the vehicle evaluation. In response to determining said event or situation, communication can be established with the at least one occupant of the vehicle to ascertain information about the event or situation. Thus the occupant’s response to the vehicle may be evaluated. Such situations may be relatively frequent during use of the vehicle.
According to an aspect of the present invention, there is provided a method of vehicle evaluation, the vehicle comprising sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto, occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto, and communication means for operatively supporting a communication channel between the vehicle and a remote computer system, the method comprising providing, via the communication channel to the remote computer system, the vehicle data and the occupant data, responsive to at least one of the vehicle data and the occupant data, establishing an audio communication channel between the remote computer system and the at least one occupant of the vehicle. The communication may be initiated in response to one or both of the vehicle data and the occupant data and thus relevant information may be obtained from the one or more occupants at an appropriate time during use of the vehicle.
The sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto may comprise one or more measurement devices or sensors arranged to measure one or more attributes associated with the vehicle. Without limitation, said sensors may comprise pre-existing sensors forming part of the vehicle architecture, for example accessible via a vehicle data-bus, e.g. a Controller Area Network (CAN). Alternatively, or in addition, said sensors may be additional to the existing vehicle sensors.
The occupant monitoring means may comprise one or more contact sensors, e.g. a physiological sensor such as a temperature sensor, a heart condition monitor, a blood flow monitor, an electrocardiogram or an electroencephalogram. Alternatively, or in addition, the occupant monitoring means may comprise one or more non-contact sensors, e.g. a non-contact temperature sensor (a pyrometer), an imaging sensor or a camera.
Without limitation, the state of the at least one occupant of the vehicle may comprise one or more of the physical state, the physiological state, the psychological state, the emotional state and the state of consciousness of said at least one occupant.
The communication means may comprise a telecommunications network, for example a communication channel established at least periodically between the vehicle and the remote computer system. The communication means may operate according to one or more telecommunications protocols such as 3G, 4G, and Long Term Evolution (LTE) etc.
The method may comprise determining, at the vehicle, an evaluation event of one or both of the vehicle and the occupant based on, respectively, one or both of the vehicle data and the occupant data. The communication may be initiated responsive to the evaluation event, thus providing communication at an appropriate time.
Optionally the providing of the vehicle data and the occupant data to the remote location is performed in dependence on the determination of the evaluation event. This is advantageous in that data communication is reduced between the vehicle and the remote computer system.
The evaluation event may be determined in dependence on the vehicle data exceeding one or more predetermined values. Optionally the evaluation event corresponds to the vehicle parameters exceeding normally expected parameters.
The evaluation event of the vehicle may be determined in dependence on a rate of change of the vehicle data. The evaluation event may be responsive to rapid changes in the vehicle data.
The evaluation event of the vehicle may be determined in dependence on a predetermined pattern of the vehicle data. In an embodiment, the evaluation event is responsive to patterns of the vehicle data corresponding to evaluation worthy events.
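By way of illustration only, the following minimal Python sketch shows how the three triggers discussed above (a value threshold, a rate of change and a predetermined pattern) could be combined on successive vehicle data samples; the threshold, rate limit and the sign-alternation pattern used here are assumptions for the example and are not taken from the patent.

```python
from collections import deque


class EvaluationEventDetector:
    """Illustrative combination of the three evaluation-event triggers."""

    def __init__(self, threshold=90.0, max_rate=200.0, pattern_window=5):
        self.threshold = threshold      # magnitude limit (assumed units)
        self.max_rate = max_rate        # units per second (assumed)
        self.history = deque(maxlen=pattern_window)
        self.prev = None

    def update(self, value, dt):
        """Return True if the new sample suggests an evaluation event."""
        triggered = False

        # 1. Data exceeding one or more predetermined values
        if abs(value) > self.threshold:
            triggered = True

        # 2. Rate of change of the data (dx/dt)
        if self.prev is not None and dt > 0:
            if abs(value - self.prev) / dt > self.max_rate:
                triggered = True

        # 3. A predetermined pattern, here rapid sign alternation over a window
        self.history.append(value)
        if len(self.history) == self.history.maxlen:
            signs = [v > 0 for v in self.history]
            if all(a != b for a, b in zip(signs, signs[1:])):
                triggered = True

        self.prev = value
        return triggered
```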
The occupant monitoring means may comprise one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle. The occupant monitoring means may output the occupant data indicative thereof. This is advantageous in that the physical response of the occupant(s) may be determined.
The evaluation event of the occupant may be determined in dependence on the occupant data. In an embodiment, the evaluation event is determined based upon the physical response of the occupant.
The evaluation event of the occupant may be determined in dependence on one or more of the occupant data exceeding one or more predetermined values, a rate of change of the occupant data and a predetermined pattern of the occupant data. Optionally, the evaluation event is responsive to one or more of parameters exceeding normally expected parameters, rapid changes, and/or patterns of the vehicle data corresponding to evaluation worthy events.
The occupant monitoring means optionally comprises one or more imaging means for imaging the at least one occupant of the vehicle and outputting image data corresponding thereto. This provides the advantage that the occupant monitoring means is non-invasive. The evaluation event of the occupant may be determined in dependence on the image data.
Without limitation, the imaging means may comprise an imaging sensor or a camera, for example a CCD sensor, a CMOS camera, a time of flight (TOF) camera, a visible spectrum camera, an infrared camera, a thermal imager, a video camera etc.
The method may comprise determining, from the image data, a facial expression of the at least one occupant wherein determining the evaluation event of the occupant may be based on the facial expression. In this manner, the occupant’s expression and thus emotion may be used to determine the evaluation event.
The method may comprise recognising an emotion of the at least one occupant based on the facial expression. Thus, emotion may provide a reliable indicator of the evaluation event.
The vehicle data and the occupant data may be provided to the remote computer in substantially real-time. Thus the remote computer may initiate communication in a short period of time following an evaluation event.
The audio communication channel may be established between an operator of the remote computer system and the at least one occupant of the vehicle. This is beneficial in that the operator may communicate with the occupant.
Furthermore, the audio communication may comprise a bi-directional communications channel in which the at least one occupant and the operator may converse, in a form of technically mediated hotline support.
While the sensors, data reduction, event monitoring and communication technologies are all important elements of embodiments of the invention, another important dimension lies in the psychological and sociological interaction between the at least one vehicle occupant and the operator.
In one embodiment, the method comprises a questions-and-answers interaction protocol. The protocol may be predefined and/or may follow a predetermined structure or set of rules. By way of example, the method may include a scripted protocol to elicit from the at least one occupant in the vehicle any relevant thoughts and feelings in relation to the evaluation event, and to bring that information forward in an automotively relevant manner.
In a further embodiment, the method comprises real-time tracking of dialogue between the operator and the at least one occupant of the vehicle. The method may also comprise statistical analysis of said dialogue between the operator and the at least one occupant. Optionally, the method also includes compilation of metrics relating to the dialogue and one or more predetermined metric conditions which may trigger a specific response from the operator in dependence on the metric condition being satisfied.
Such real time tracking enables embodiments of the present invention to provide a set of statistical indices of semantic, lexical and creative content of the dialogue between the operator and the at least one occupant of the vehicle.
This dialogue tracking provides real-time statistical information about the amount of information contained in an on-going dialogue and about sources of redundancy. Thus, embodiments of the method are beneficial in that they support the operator to make decisions such as when to continue a dialogue and when to bring it to a close.
The audio communication channel may be established between an artificial operator module executing on the remote computer system and the at least one occupant of the vehicle. The artificial operator module reduces a human requirement at the remote computer system.
The method may comprise establishing a video communication channel between the remote computer system and the at least one occupant of the vehicle responsive to at least one of the vehicle data and the occupant data. An advantage of such a video communication channel is that the occupant may be visually assessed or monitored.
The method may comprise storing, at the remote computer system, one or both of the vehicle data and the occupant data.
The method may comprise storing, at the remote computer system, data indicative of communication between the remote computer system and the at least one occupant of the vehicle associated with the one or both of the vehicle data and the occupant data. Accordingly, the data may be made available for subsequent analysis.
According to another aspect of the present invention, there is provided a vehicle, comprising sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto, occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto, wireless communication means for operatively supporting a wireless communication channel between the vehicle and a remote computer system and control means arranged to receive the vehicle data and occupant monitoring data and to provide, via the communication means, a representation of the vehicle data and the occupant data to the remote computer system and to receive a request for an audio communication channel between the remote computer system and the at least one occupant of the vehicle.
The control means may comprise a controller, or an electronic control unit (ECU), or a microcontroller or a processor.
The control means may be arranged to determine one or both of an evaluation event of the vehicle based on the vehicle data and an evaluation event of the occupant based on the occupant data and, in dependence on the determination of the evaluation event, to provide the representation of the vehicle data and the occupant data to the remote location via the communication means.
The occupant monitoring means may comprise one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle and outputting the occupant data indicative thereof.
The occupant monitoring means may comprise one or more imaging means for imaging the at least one occupant of the vehicle and outputting image data corresponding thereto. The control means may comprise facial expression recognition means for determining, from the image data, a facial expression of the at least one occupant. The control means may be arranged to determine the evaluation event of the occupant based on the facial expression.
The control means may be arranged to determine the evaluation event of the vehicle in dependence on one or more of the vehicle data exceeding one or more predetermined values, a rate of change of the vehicle data and/or a predetermined pattern of the vehicle data.
The vehicle may comprise user interface means for outputting audio data received from the remote computer and determining audio data corresponding to sounds within the vehicle and communicating audio data to the remote computer corresponding thereto.
Without limitation, the user interface means may comprise an entertainment system, an infotainment system or an information system of the vehicle, for example comprising an audio amplifier, at least one loudspeaker, and at least one microphone.
The user interface means may be arranged to visually output an image corresponding to video data received from the remote computer. In this case, the user interface means may comprise a display, optionally a video display, e.g. an LED or LCD display.
According to another aspect of the invention, there is provided a system for evaluating a vehicle, comprising: a vehicle according to the previous aspect; and a remote computer system; wherein the remote computer system is at least periodically in wireless communication with the vehicle via a wireless network to receive one or both of the vehicle data or the occupant data and to establish an audio communication channel with the vehicle responsive to the received data.
During use, the system may determine, at the remote computer system, whether the at least one of the vehicle data and the occupant data corresponds to a vehicle evaluation event and may initiate the audio communication channel between the remote computer system and the at least one occupant of the vehicle in response thereto.
The system may store, at the remote computer system, one or both of the vehicle data and the occupant data.
In an embodiment, the system comprises an artificial operator module executing on the remote computer system, the artificial operator module being arranged to communicate with the one or more occupants of the vehicle.
According to an aspect of the present invention there is provided a method of vehicle evaluation, the vehicle comprising sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto, occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto, and communication means for operatively supporting a communication channel between the vehicle and a remote computer system, the method comprising providing, via the communication channel to the remote computer system, the vehicle data and the occupant data, responsive to at least one of the vehicle data and the occupant data, and establishing an audio communication channel between the remote computer system and the at least one occupant of the vehicle.
The method may comprise determining, at the remote computer system, whether the at least one of the vehicle data and the occupant data corresponds to a vehicle evaluation event and initiating the audio communication channel between the remote computer system and the at least one occupant of the vehicle in response thereto. An evaluation event may be an event, or period of time, that is relevant to the vehicle evaluation.
According to an aspect of the invention, there is provided a method of vehicle development, comprising providing a first vehicle according to a first design, providing, via a communication channel to a remote computer system, vehicle data indicative of one or more attributes associated with the vehicle and occupant data indicative of a state of at least one occupant of the vehicle, responsive to at least one of the vehicle data and the occupant data, establishing an audio communication channel between the remote computer system and the at least one occupant of the vehicle, and providing a second vehicle according to a second design in dependence on one or more of the vehicle data, the occupant data and the communication associated with the at least one occupant of the vehicle.
Embodiments of the invention provide a method which allows vehicle users to conveniently provide input to a vehicle design process.
Hence, the method of the present aspect provides a tool for co-design of vehicles by manufacturers in collaboration with potential customers. Rather than being merely an engineering process in which the vehicle manufacturer leads innovation in its capacity as technical expert, the present aspect of the invention creates a sociological framework for vehicle design and development based on linguistic discourse and social interaction.
To reiterate, the present system is not intended to follow the traditional expert-led innovation approach to vehicle design but instead to achieve business-customer co-design.
According to another aspect of the invention, there is provided computer software which, when executed by a computer, is arranged to perform a method according to one or more of the aspects described above.
The computer software may be stored on a computer readable medium, optionally a non-transitory computer readable medium.
Within the scope of this application it is expressly intended that the various aspects, embodiments, examples and alternatives set out in the preceding paragraphs, in the claims and/or in the following description and drawings, and in particular the individual features thereof, may be taken independently or in any combination. That is, all embodiments and/or features of any embodiment can be combined in any way and/or combination, unless such features are incompatible. The applicant reserves the right to change any originally filed claim or file any new claim accordingly, including the right to amend any originally filed claim to depend from and/or incorporate any feature of any other claim although not originally claimed in that manner.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows a vehicle evaluation system according to an embodiment of the invention;
Figure 2 shows a schematic illustration of a vehicle according to an embodiment of the invention;
Figure 3 shows a method according to an embodiment of the invention; and
Figure 4 shows an illustration of vehicle data according to an embodiment of the invention.
DETAILED DESCRIPTION
Figure 1 schematically illustrates operation of a vehicle evaluation system according to an embodiment of the invention. The system comprises a vehicle evaluation means in the form of a vehicle evaluation module (VEM) 100 which is associated with a vehicle 110 for evaluation. The VEM 100 may be integrated with the vehicle, i.e. form an integral part of the vehicle's systems, or may be a removable device that is at least temporarily associated with the vehicle 110, such as communicably coupled with the vehicle 110 via a wired or wireless connection thereto. The vehicle 110 is illustrated in Figure 1 as being a land-going vehicle, although it will be appreciated that embodiments of the invention are not limited in this respect and the vehicle may be a watercraft or an aircraft.
The VEM 100 represents a control means for evaluating the vehicle 110. The VEM 100 is communicably coupled, at least periodically, with a remote monitoring means in the form of a remote computer system 150. The remote computer system 150 is located at a location remote from the vehicle 110 and is therefore referred to as a remote computer 150. The remote computer 150 may be located in a control centre or control room 140. In some embodiments at least one human operator is present in the control centre 140 to monitor an output of the remote computer 150 and, responsive thereto, to communicate with one or more occupants of the vehicle 110, as will be explained.
A communication means 120, 131, 132 allows the VEM 100 and the remote computer 150 to exchange data. The communication means 120, 131, 132 may comprise a telecommunications network 120 which allows the VEM 100 to exchange data with the remote computer 150 whilst the vehicle 110 is mobile. The network 120 provides a communication channel 131, 132 at least periodically between the VEM 100 and the remote computer 150. The communication channel 131, 132 may be formed by a first communication path 131 between the VEM 100 and the network 120 and a second communication path 132 between the network 120 and the remote computer 150. In the illustrated embodiment the communication channel 131, 132 is bi-directional, as indicated by arrows 131, 132, allowing data to be sent from the VEM 100 to the remote computer 150 and from the remote computer 150 to the VEM 100. However it will be appreciated that in some embodiments first and second unidirectional channels may be established between the vehicle 110 and the remote computer 150. In some embodiments a first channel allows monitoring data to be sent from the VEM 100 to the remote computer 150 whilst a second channel allows data, particularly communications data, to be sent from the remote computer 150 to the VEM 100.
In operation the VEM 100 is arranged to provide monitoring data to the remote computer 150 via the communication channel 131, 132. The monitoring data is indicative of one or more attributes associated with the vehicle 110 and a state of at least one occupant of the vehicle 110. The monitoring data may be provided by the VEM 100 substantially in real time, i.e. to reflect ongoing or at least recent conditions of the vehicle 110 and one or more occupants. In some embodiments, the monitoring data is sent from the VEM 100 to the remote computer 150 in response to one or more evaluation conditions experienced by the vehicle 110 or by one or more occupants thereof.
The communications channel 131, 132 further allows communication between the remote computer 150 and the one or more occupants of the vehicle 110. In some embodiments the communication is between the operator of the remote computer 150 and the one or more occupants of the vehicle 110. In some embodiments, an artificial operator module (AOM) 160 executes on the remote computer 150 which is able to communicably interact, via the communication channel 131, 132, with the one or more occupants of the vehicle 110, as will be explained.
Referring to Figure 2, a schematic illustration of the vehicle 110 is provided. The vehicle 110 comprises a communication bus 205 which operatively allows various systems of the vehicle 110 to communicate. The communication bus 205 may be implemented as a CAN bus, although it is envisaged that the communication bus 205 may alternatively be an IP-based vehicle network, such as Ethernet, although other protocols may be used. In these cases each system on the vehicle network is assigned a unique address, such as an IP address, thereby allowing inter-system communication via the communication bus 205, as will be appreciated.
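For illustration, a hedged sketch of how vehicle data might be read from such a CAN bus using the python-can library is given below; the channel name, arbitration ID and signal scaling are hypothetical and would in practice come from the vehicle's own message definitions.

```python
import can  # python-can (assumed version 4.x)

STEERING_ANGLE_ID = 0x25  # hypothetical arbitration ID for a steering frame


def read_steering_angle(bus: can.BusABC, timeout: float = 1.0):
    """Poll the bus and decode a (hypothetical) steering-angle frame."""
    msg = bus.recv(timeout)
    if msg is None or msg.arbitration_id != STEERING_ANGLE_ID:
        return None
    raw = int.from_bytes(msg.data[0:2], byteorder="big", signed=True)
    return raw * 0.1  # assumed scaling: 0.1 degree per bit


if __name__ == "__main__":
    # 'can0' / socketcan are placeholders for the vehicle's actual interface.
    with can.interface.Bus(channel="can0", interface="socketcan") as bus:
        angle = read_steering_angle(bus)
        if angle is not None:
            print(f"steering angle: {angle:.1f} deg")
```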
The vehicle 110 comprises the VEM 100, as described above, which is communicably connected to the communication bus 205. The VEM 100 is communicable with occupant monitoring means 210, vehicle sensing means 220, user interface means 230 and communication means 240 which are also communicably coupled with the communication bus 205 as illustrated.
The occupant monitoring means 210 is arranged to operatively monitor one or more occupants of the vehicle 110. The occupant monitoring means 210 comprises one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle and outputting the occupant data indicative thereof to the VEM 100. The occupant monitoring means may comprise one or more sensing devices arranged to determine the one or more physical parameters of the at least one occupant. The physical parameters may include temperature, heart rate and respiration rate. It will be appreciated that other physical parameters may be determined.
In one embodiment, the occupant monitoring means 210 comprises one or more imaging devices or cameras 210 arranged to view at least one occupant of the vehicle 110. In some embodiments, each occupant seating position within the vehicle 110 may be associated with a respective one or more cameras. Each camera is arranged to provide image data indicative of at least one vehicle occupant to the VEM 100. The image data may be provided via the communication bus 205, although it will be appreciated that in some embodiments the one or more cameras 210 may be integrated with the VEM 100 and thus it is not necessary to communicate the image data via the communication bus 205. In other words, the one or more cameras 210 may provide image data directly to the VEM 100.
The vehicle sensing means 220 comprises one or more measurement devices 220 or sensors arranged to measure one or more attributes associated with the vehicle 110. The measurement devices or sensors 220 may be existing sensors of the vehicle 110. The vehicle sensing means 220 are arranged to measure one or more attributes associated with the vehicle and to output vehicle data corresponding thereto to the VEM 100.
At least some of the measurement devices 220 are arranged to provide vehicle data indicative of one or more of: user inputs to the vehicle 110, for example steering and other control inputs; conditions experienced by the vehicle 110, such as temperature, air pressure, humidity, weather, e.g. rain, and the like; and information indicative of a state of one or more systems of the vehicle 110.
The user interface means 230 comprises one or more input devices for receiving an input from an occupant of the vehicle 110 and one or more output devices for outputting information to the occupant. In one embodiment the user interface means 230 comprises an audio input/output (I/O) means such as one or more microphones and one or more speakers within the vehicle 110. The user interface means 230 may further comprise one or more visual display devices for outputting a graphical image thereon. The graphical image may be used to display a graphical user interface (GUI) to the occupant which may comprise, in some embodiments, a video image. The video image may be representative of the operator associated with the remote computer 150 at the control centre 140. The video image may alternatively be an image provided by the AOM 160 executing on the remote computer 150.
Video data is communicated from the remote computer 150 to the vehicle 110 via the communications channel 131, 132. Similarly, in some embodiments, video data representative of one or more occupants of the vehicle 110 is provided from the vehicle 110 to the remote computer 150. The video data may be provided from one or more cameras (not shown) associated with the remote computer 150. The one or more cameras 210 within the vehicle 110 may be used to provide the video data representative of the one or more occupants of the vehicle 110. In this way, in some embodiments, an audio call may be established between the one or more occupants of the vehicle 110 and the operator or the AOM 160 associated with the remote computer 150 via the communications channel 131, 132. Furthermore, in some embodiments a video call may be established between the one or more occupants of the vehicle 110 and the operator or AOM 160 associated with the remote computer 150. In particular, the audio and/or video call may be established at a request of the operator of the remote computer 150.
The communication means 240 is arranged to support wireless communication with the telecommunications network 120. The communication means 240 is, in some embodiments, a communications module 240. The communication means may operate according to one or more telecommunications protocols such as 3G, 4G, and Long Term Evolution (LTE) etc. The communication means allows systems of the vehicle, including the VEM 100, to communicate with external computing devices such as the remote computer 150.
The VEM 100 comprises logic 250 to determine the one or more evaluation conditions of the vehicle and/or occupant(s) thereof. The VEM 100 and logic 250 are arranged to receive the data indicative of one or more of the user inputs to the vehicle 110, conditions experienced by the vehicle 110 and the state of one or more systems of the vehicle 110. The VEM 100 and logic 250 may receive the image data indicative of at least one vehicle occupant.
An embodiment of the invention will be described in detail with reference to Figure 3 wherein the VEM 100 and logic 250 receives steering data indicative of a driver’s steering input to the vehicle 110 and image data indicative of at least a portion of the driver’s face. The logic provides a facial expression recognition means for determining a facial expression of the driver from the image data. The facial expression may be determined as one of a plurality of types of facial expression. It will be appreciated, however, that this is not limiting and that other data may be received by the VEM 100 and that image data indicative of other vehicle occupants may be utilised.
Figure 3 illustrates a method according to an embodiment of the invention. The method 300 is a method of vehicle evaluation. It will be appreciated that the method 300 may be used to evaluate only a part of a vehicle, rather than an entire vehicle 110. The method 300 will be described with reference to Figure 4 which illustrates the driver’s steering input to the vehicle 110 over time t.
The method comprises a step 310 of receiving vehicle data. The vehicle data is indicative of one or more vehicle conditions. In step 310 the vehicle data may be steering data 410 indicative of the driver's steering input to the vehicle 110, as illustrated in Figure 4. The steering data 410 may represent, for example, a position of a steering wheel or other input device for controlling the steering of the vehicle. As illustrated in Figure 4, the received steering data 410 is initially indicative of relatively stable, controlled steering inputs to the vehicle which vary relatively slowly over time. However, in the region identified with numeral 420 it can be observed that the steering data 420 is indicative of larger, more frequent, rapidly varying steering inputs.
Step 310 comprises receiving the image data indicative of the driver’s facial expressions. The data is received by the VEM 100 in step 310. The data may be received via the communication bus 205 of the vehicle 110.
The method comprises a step 320 of determining whether one or more evaluation conditions exist, based on the data received in step 310. In step 320 the logic 250 is operative to determine whether the received vehicle data is indicative of an evaluation event. The evaluation event may be an unexpected or evaluation-relevant condition of the vehicle 110 or of one or more systems thereof, or of the driver's or another vehicle occupant's input or interaction with the vehicle 110. Thus the evaluation event may be relevant to evaluation of the vehicle. In some embodiments, the evaluation conditions are one or more conditions not normally experienced during operation of the vehicle 110.
In step 320 the logic 250 may compare the data, such as the steering data 410 to data indicative of normal, or expected, steering inputs. The logic may determine that evaluation data is received based on one or more patterns being present in the received data, such as the data exceeding a predetermined magnitude, a rate of change of the data i.e. dx/dt, or one or more other predetermined patterns. The logic may be a trained logic, such as a neural network, which is trained based upon a data set comprising both normal and evaluation data to determine when evaluation data is received. The logic 250 is arranged to determine when the data indicative of the evaluation event, such as the excessive steering data 420, is received in step 320.
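As a sketch of one way the logic 250 might flag a region such as 420 in Figure 4, the example below uses a simple rule-based check (rather than the trained neural network mentioned above) that compares the mean rate of change of steering angle in each window against a baseline taken from an initial, assumed-calm period; the window length and multiplier are illustrative assumptions.

```python
import statistics


def steering_event_windows(angles, dt, window=50, factor=4.0):
    """Yield (start, end) index pairs of windows whose mean |d(angle)/dt|
    exceeds `factor` times a baseline taken from the first window."""
    # Per-sample absolute rate of change of the steering signal
    rates = [abs(b - a) / dt for a, b in zip(angles, angles[1:])]
    if len(rates) <= window:
        return
    baseline = statistics.mean(rates[:window]) or 1e-6  # assumes a calm start
    for start in range(window, len(rates) - window + 1, window):
        if statistics.mean(rates[start:start + window]) > factor * baseline:
            yield start, start + window
```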
In step 320 the logic 250 is further arranged to receive the image data. The image data may be representative of the facial expression of one or more occupants of the vehicle, such as the driver’s facial expression. The logic 250 is arranged to determine when the image data is indicative of one or more types of facial expression or types of emotion of one or more vehicle occupants, such as the driver of the vehicle 110.
The logic 250 comprises, in some embodiments, facial expression recognition logic, which is arranged, in use, to determine a facial expression of the occupant. The facial expression may be determined, or selected from amongst, a plurality of predetermined types of facial expression. The plurality of predetermined types of facial expression may comprise one or more of normal, anger, surprise, puzzlement, fear etc. If the facial expression is normal in step 320, then the driver is determined not to have an abnormal facial expression. However if the determined facial expression is one or more of the remaining types of facial expression, then the facial expression is determined to be abnormal.
In some embodiments the facial expression recognition logic is arranged to operatively determine an emotion of the occupant based on their facial expression. The facial expression recognition logic may be arranged to determine whether the occupant is experiencing one of a plurality of predetermined types of emotion, such as being happy, sad, angry, although it will be realised that other emotions may be considered.
The facial expression recognition logic 250 may be trained logic, such as a neural network, which is trained based upon a data set comprising image data indicative of a plurality of types of facial expressions and, in some embodiments, image data indicative of various emotions. The logic 250 is then arranged to determine when the occupant’s facial expression matches one of the plurality of types of facial expression or when the occupant’s emotion is one of a plurality of types of emotion.
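As a hedged illustration of this step, the sketch below locates a face using OpenCV's Haar cascade and then defers to `expression_model`, a placeholder standing in for the trained facial expression logic described above; that model, its label set and the 48x48 crop size are assumptions made for the example.

```python
import cv2

LABELS = ["normal", "anger", "surprise", "puzzlement", "fear"]

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def classify_expression(frame, expression_model):
    """Detect the first face in a BGR frame and classify its expression."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
    label_index = expression_model.predict(crop)  # hypothetical model call
    return LABELS[label_index]


def is_abnormal(label):
    # Per the description, any expression other than "normal" is abnormal.
    return label is not None and label != "normal"
```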
In step 320 if either the received data, such as the vehicle data or the image data, is indicative of the evaluation event, the method moves to step 330. If, however, no evaluation event is determined by the logic 250, the method returns to step 310.
If it is determined that the evaluation event exists in step 320, the method 300 proceeds to step 330 wherein data is transmitted indicative of the evaluation event. The data may be a portion of the vehicle data, such as that indicated by region 420, indicative of the evaluation event, i.e. excessive steering inputs, for example. The data may be a portion of the image data indicative of the facial expression of the vehicle occupant.
The data may be transmitted from the VEM 100 via the communications module 240 and telecommunications network 120 to the remote computer 150. In this way, only data indicative of evaluation events experienced by the vehicle or occupant(s) is communicated to the remote computer 150. In a step corresponding to step 330 at the remote computer 150, the data is received. The received data may be one or both of the vehicle data 420, such as that illustrated in Figure 4, and image data indicative of one or more occupants of the vehicle 110.
In step 340 it is determined at the remote computer 150 whether the received data is indicative of an evaluation event. An evaluation event may be an event, or period of time, that is relevant to the vehicle evaluation. That is, it is determined in step 340 whether the data corresponds to an event relevant to evaluation of the vehicle 110, or a part thereof. In some embodiments, it is determined in step 340 at the remote computer whether the received data is indicative of a notable vehicle usage condition or event. In particular, it may be determined whether the received data is indicative of an unexpected or evaluation-relevant vehicle usage condition. If it is determined that the received data is not indicative of the evaluation event, i.e. that the data relates to normal usage of the vehicle 110 or to a non-notable event, the method returns to step 310.
Step 340 may be performed by one or both of the operator in the control centre 140 by monitoring the output of the remote computer 150, or the AOM 160 executing on the remote computer 150. In one embodiment, the remote computer is arranged to provide an output indicative of the received data and the operator assesses the output to determine whether the data corresponds to an evaluation event. For example, the remote computer 150 comprises a visual display device which may operatively visually display a representation of the received vehicle data, such as illustrative of the output shown in Figure 4 showing the identified portion 420 of the vehicle data. The remote computer 150 may display the image data received from the vehicle 110. The remote computer 150 may, in some embodiments, display the representation of the received vehicle data 420 alongside the image data such that the operator is able to assess the vehicle data 420 and the occupant(s) response at the time of the vehicle data 420 being generated to determine whether the received data is indicative of an evaluation event. In some embodiments, the AOM 160 executing on the remote computer 150 may determine whether the received data is indicative of an evaluation event. The AOM 160 may perform automated analysis on the received data, in addition to that performed by the logic 250 of the VEM 100, to determine whether the received data is indicative of the evaluation event. The AOM 160 may comprise trained logic, such as a neural network, which is trained based upon a data set comprising both normal and evaluation data to determine when evaluation event data is received. In this sense, the AOM 160 performs further determination on the received data, in addition to that performed at the VEM 100, to determine whether the received data is indicative of the notable vehicle usage condition or event. If, in step 340 it is determined that the received data is indicative of the evaluation event, then the method moves to step 350.
In step 350 communication is performed between the control centre 140 and the vehicle 110. In particular, in some embodiments, the communication is between the one or more occupants of the vehicle 110 and the control centre 140. The communication at the control centre 140 may be performed by the operator of the remote computer 150 or, in other embodiments, between the AOM 160 and the one or more occupants of the vehicle 110.
In step 350 the operator of the remote computer 150 or the AOM 160 may initiate communication with the vehicle 110. The operator or AOM 160 may send a request to the VEM 100 to open a communication channel with the vehicle 110 over the telecommunications network 120. The communication channel may be bi-directional to allow communication from the remote computer 150 to the vehicle 110 and from the vehicle 110 to the remote computer 150. The communication may be audio communication or may be video communication including audio. In step 350 effectively an audio call or a video call is initiated between the remote computer 150 and the one or more occupants of the vehicle 110. The user interface means 230 of the vehicle 110 may be used to support the call at the vehicle. For example, an image of the operator of the remote computer 150 may be displayed upon the visual display device 230 within the vehicle 110.
The communication may allow the operator of the remote computer 150 to discuss the evaluation event with the one or more occupants of the vehicle 110. The discussion may include the operator asking the one or more occupants of the vehicle 110 questions about the evaluation event in order to obtain the views of the occupant(s) on the evaluation event. For example, the operator may ask questions about the driver's inputs to the vehicle corresponding to the vehicle data 420, or the driver's reactions to the vehicle's behaviour. In embodiments where the communication is between the AOM 160 and the one or more occupants of the vehicle 110, the AOM 160 comprises a speech synthesis module allowing spoken communication between the AOM 160 and the vehicle occupant(s). Similarly, the AOM 160 may comprise a speech recognition module for interpreting speech of the vehicle occupant(s). In this way, spoken communication may be effected between the AOM 160 and vehicle occupant(s). In some embodiments, the AOM 160 may comprise an avatar generation module which generates a visual image for display to the vehicle occupant(s) representative of a person or other character corresponding to the AOM 160. In this way, the vehicle occupant(s) are able to view the representation of the person or character during the communication. The AOM 160 may present, either audibly or audibly and visually, the one or more vehicle occupants with questions relating to the evaluation event in a similar manner to the operator of the remote computer. The questions may correspond to a script, wherein the responses of the vehicle occupant(s) to each question may allow selection of one or more further questions in response.
The communication between the operator of the remote computer 150 or the AOM 160 and the one or more occupants of the vehicle 110 may be conducted according to an interaction protocol. The interaction protocol defines how the operator or AOM 160 interacts with the one or more occupants. In some embodiments the interaction protocol is a questions-and-answers interaction protocol. The protocol may be a scripted protocol to extract from one or more occupants of the vehicle any relevant information, such as thoughts and feelings about the current context, and to bring that information forward in an automotive-relevant manner.
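A minimal sketch of what such a scripted questions-and-answers protocol might look like is given below; the question text, keywords and branching structure are purely illustrative assumptions, not content from the patent.

```python
# Hypothetical interaction script: each node holds a question and keyword-based
# branches that select the follow-up question from the occupant's answer.
SCRIPT = {
    "start": {
        "question": "We noticed some sharp steering just now. How did the car feel?",
        "next": {"unstable": "stability", "fine": "wrap_up"},
    },
    "stability": {
        "question": "Was the instability in the steering, the suspension, or both?",
        "next": {},
    },
    "wrap_up": {
        "question": "Is there anything else about that moment you would change?",
        "next": {},
    },
}


def next_node(current, answer_text):
    """Pick the follow-up question whose keyword appears in the answer."""
    for keyword, target in SCRIPT[current]["next"].items():
        if keyword in answer_text.lower():
            return target
    return "wrap_up"
```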
The remote computer system 150 may further comprise a statistical analysis module (SAM) (not specifically illustrated). The SAM is arranged to provide real-time dialogue tracking or dialogue statistical analysis of the communication between the remote computer 150 and the one or more occupants of the vehicle. In some embodiments the SAM provides statistical indices relating to the dialogue with the one or more occupants of the vehicle. The statistical indices relate to an amount of relevant information contained in the on-going dialogue with the occupant(s). Thus the operator of the remote computer is provided support by the SAM to make decisions such as when to continue a conversation and when to bring it to a close.
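By way of example, the sketch below shows the kind of running statistics such a statistical analysis module might compute over the occupant's dialogue turns; the specific indices and the closing threshold are assumptions rather than details taken from the patent.

```python
from collections import Counter


def dialogue_metrics(occupant_turns):
    """Compute simple lexical indices over a list of occupant utterances."""
    words = [w.lower() for turn in occupant_turns for w in turn.split()]
    counts = Counter(words)
    total = len(words)
    metrics = {
        "turns": len(occupant_turns),
        "total_words": total,
        "vocabulary": len(counts),
        # Type-token ratio as a crude measure of new information vs. redundancy
        "type_token_ratio": len(counts) / total if total else 0.0,
        "most_repeated": counts.most_common(3),
    }
    # Suggest closing the dialogue when new information appears to dry up
    # (threshold values assumed for illustration).
    metrics["suggest_close"] = (
        metrics["turns"] >= 5 and metrics["type_token_ratio"] < 0.4
    )
    return metrics
```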
In step 360 data indicative of the evaluation event is stored. The data may be stored in a data store, such as a data storage device, accessible to the remote computer 150. The data may comprise one or more of the vehicle data 420, image data corresponding to the one or more occupants of the vehicle, and data indicative of the communication between the control centre 140 and the vehicle 110. The data is stored for further analysis by a person evaluating the vehicle 110, or part thereof, under test.
Embodiments of the invention also provide a method of vehicle development which allows input from users of vehicles to be integrated into the vehicle design and development process.
In a first step a first vehicle is provided according to a first design. That is, the first vehicle comprises a first set of features or design elements. Then, during use of the first vehicle, vehicle data indicative of one or more attributes associated with the vehicle and occupant data indicative of a state of at least one occupant of the vehicle are provided, as described above. In response to at least one of the vehicle data and the occupant data, an audio communication channel is established between the remote computer system and the at least one occupant of the vehicle, such that a dialogue may be facilitated between the operator of the remote computer, or the AOM 160, as described above, and at least one occupant of the vehicle. Subsequent to the communication, a second vehicle is provided according to a second design, wherein the second design is influenced by one or more of the vehicle data, the occupant data and the audio communication associated with the at least one occupant of the vehicle. In this way, the occupant's input is conveniently included within the vehicle design process.
It will be appreciated that embodiments of the present invention can be realised in the form of hardware, software or a combination of hardware and software. Any such software may be stored in the form of volatile or non-volatile storage such as, for example, a storage device like a ROM, whether erasable or rewritable or not, or in the form of memory such as, for example, RAM, memory chips, devices or integrated circuits, or on an optically or magnetically readable medium such as, for example, a CD, DVD, magnetic disk or magnetic tape. It will be appreciated that the storage devices and storage media are embodiments of machine-readable storage that are suitable for storing a program or programs that, when executed, implement embodiments of the present invention. Accordingly, embodiments provide a program comprising code for implementing a system or method as claimed herein and a machine-readable storage storing such a program. Still further, embodiments of the present invention may be conveyed electronically via any medium such as a communication signal carried over a wired or wireless connection and embodiments suitably encompass the same.
All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and/or all of the steps of any method or process so disclosed, may be combined in any combination, except combinations where at least some of such features and/or steps are mutually exclusive.
Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise. Thus, unless expressly stated otherwise, each feature disclosed is one example only of a generic series of equivalent or similar features.
The invention is not restricted to the details of any foregoing embodiments. The invention extends to any novel one, or any novel combination, of the features disclosed in this specification (including any accompanying claims, abstract and drawings), or to any novel one, or any novel combination, of the steps of any method or process so disclosed. The claims should not be construed to cover merely the foregoing embodiments, but also any embodiments which fall within the scope of the claims.

Claims (25)

1. A method of vehicle evaluation, the vehicle comprising sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto, occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto, and communication means for operatively supporting a communication channel between the vehicle and a remote computer system, the method comprising:
providing, via the communication channel to the remote computer system, the vehicle data and the occupant data;
responsive to at least one of the vehicle data and the occupant data, establishing an audio communication channel between the remote computer system and the at least one occupant of the vehicle.
2. The method of claim 1, comprising:
determining, at the vehicle, an evaluation event of one or both of the vehicle and the occupant based on, respectively, one or both of the vehicle data and the occupant data; and wherein the providing of the vehicle data and the occupant data to the remote location is performed in dependence on the determination of the evaluation event.
3. The method of claim 2, wherein the evaluation event of the vehicle is determined in dependence on the vehicle data exceeding one or more predetermined values.
4. The method of claim 2 or 3, wherein the evaluation event of the vehicle is determined in dependence on a rate of change of the vehicle data.
5. The method of claim 2, 3 or 4, wherein the evaluation event of the vehicle is determined in dependence on a predetermined pattern of the vehicle data.
6. The method of claim 2 or any claim dependent thereon, wherein:
the occupant monitoring means comprises one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle and outputting the occupant data indicative thereof; and the evaluation event of the occupant is determined in dependence on the occupant data.
7. The method of claim 6, wherein the evaluation event of the occupant is determined in dependence on one or more of the occupant data exceeding one or more predetermined values, a rate of change of the occupant data and a predetermined pattern of the occupant data.
8. The method of claim 2 or any claim dependent thereon, wherein:
the occupant monitoring means comprises one or more imaging means for imaging the at least one occupant of the vehicle and outputting image data corresponding thereto; and the evaluation event of the occupant is determined in dependence on the image data.
9. The method of claim 8, comprising:
determining, from the image data, a facial expression of the at least one occupant; and determining the evaluation event of the occupant based on the facial expression.
10. The method of claim 9, comprising recognising an emotion of the at least one occupant based on the facial expression.
11. The method of claim 1 or 2, wherein the vehicle data and the occupant data is provided to the remote location in substantially real-time.
12. The method of any preceding claim, wherein the audio communication channel is established between an operator of the remote computer system and the at least one occupant of the vehicle.
13. The method of any of claims 1 to 11, wherein the audio communication channel is established between an artificial operator module executing on the remote computer system and the at least one occupant of the vehicle.
14. The method of any preceding claim, comprising establishing a video communication channel between the remote computer system and the at least one occupant of the vehicle responsive to at least one of the vehicle data and the occupant data.
15. The method of any preceding claim, comprising storing, at the remote computer system, one or both of the vehicle data and the occupant data.
16. The method of claim 15, comprising storing, at the remote computer system, data indicative of communication between the remote computer system and the at least one occupant of the vehicle associated with the one or both of the vehicle data and the occupant data.
17. A vehicle, comprising:
sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto;
occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto;
wireless communication means for operatively supporting a wireless communication channel between the vehicle and a remote computer system;
control means arranged to receive the vehicle data and occupant monitoring data and to provide, via the communication means, a representation of the vehicle data and the occupant data to the remote computer system and to receive a request for an audio communication channel between the remote computer system and the at least one occupant of the vehicle.
18. The vehicle of claim 17, wherein the control means is arranged to determine one or both of an evaluation event of the vehicle based on the vehicle data and an evaluation event of the occupant based on the occupant data and, in dependence on the determination of the evaluation event, to provide the representation of the vehicle data and the occupant data to the remote location via the communication means.
19. The vehicle of claim 17 or 18, wherein the occupant monitoring means comprises one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle and for outputting the occupant data indicative thereof.
20. The vehicle of claim 19 when dependent on claim 18, wherein:
the occupant monitoring means comprises one or more imaging means for imaging the at least one occupant of the vehicle and outputting image data corresponding thereto; and the control means comprises facial expression recognition means for determining, from the image data, a facial expression of the at least one occupant;
wherein the control means is arranged to determine the evaluation event of the occupant based on the facial expression.
21. The vehicle of claim 18, 19 or 20, wherein the control means is arranged to determine the evaluation event of the vehicle in dependence on one or more of:
the vehicle data exceeding one or more predetermined values;
a rate of change of the vehicle data; and
a predetermined pattern of the vehicle data.
22. The vehicle of any of claims 17 to 21, comprising user interface means for outputting audio data received from the remote computer and determining audio data corresponding to sounds within the vehicle and communicating audio data to the remote computer corresponding thereto.
23. The vehicle of claim 22, wherein the user interface means is arranged to visually output an image corresponding to video data received from the remote computer.
24. A system for evaluating a vehicle, comprising:
a vehicle according to any of claims 17 to 23; and a remote computer system;
wherein the remote computer system is at least periodically in wireless communication with the vehicle via a wireless network to receive one or both of the vehicle data or the occupant data and to establish an audio communication channel with the vehicle responsive to the received data.
25. The system of claim 24 comprising determining, at the remote computer system, whether the at least one of the vehicle data and the occupant data corresponds to a vehicle evaluation event and initiating the audio communication channel between the remote computer system and the at least one occupant of the vehicle in response thereto.
26. The system of claim 24 or 25, comprising storing, at the remote computer system, one or both of the vehicle data and the occupant data.
27. The system of claim 24, 25 or 26, comprising an artificial operator module executing on the remote computer system, the artificial operator module being arranged to communicate with the one or more occupants of the vehicle.
28. A method of vehicle development, comprising:
providing a first vehicle according to any of claims 17 to 23 according to a first design;
providing, via a communication channel to a remote computer system, vehicle data indicative of one or more attributes associated with the vehicle and occupant data indicative of a state of at least one occupant of the vehicle;
responsive to at least one of the vehicle data and the occupant data, establishing an audio communication channel between the remote computer system and the at least one occupant of the vehicle; and providing a second vehicle according to any of claims 17 to 23 according to a second design in dependence on one or more of the vehicle data, the occupant data and the audio communication associated with the at least one occupant of the vehicle.
29. Computer software which, when executed by a computer, is arranged to perform a method according to any of claims 1 to 16; optionally the computer software is stored on a computer readable medium.
30. A method, vehicle, system or computer software substantially as described hereinbefore with reference to the accompanying drawings.
Amended claims have been filed as follows:
1. A method of vehicle evaluation, the vehicle comprising sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto, occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto, and communication means for operatively supporting a communication channel between the vehicle and a remote computer system, and wherein the occupant monitoring means comprises one or more imaging means for imaging the at least one occupant of the vehicle and outputting image data corresponding thereto, the method comprising:
determining, at the vehicle, an evaluation event of one or both of the vehicle based on the vehicle data and the occupant based on one or more of the occupant data or the image data;
providing, via the communication channel to the remote computer system, the vehicle data, the occupant data and the image data in dependence on the determination of the evaluation event; and responsive to at least one of the vehicle data, the occupant data, and the image data, establishing an audio communication channel between the remote computer system and the at least one occupant of the vehicle.
2. The method of claim 1, wherein the evaluation event of the vehicle is determined in dependence on the vehicle data exceeding one or more predetermined values.
3. The method of claim 1 or 2, wherein the evaluation event of the vehicle is determined in dependence on a rate of change of the vehicle data.
4. The method of any preceding claim, wherein the evaluation event of the vehicle is determined in dependence on a predetermined pattern of the vehicle data.
5. The method of any preceding claim, wherein:
the occupant monitoring means comprises one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle and outputting the occupant data indicative thereof; and
the evaluation event of the occupant is determined in dependence on the occupant data.
6. The method of claim 5, wherein the evaluation event of the occupant is determined in dependence on one or more of the occupant data exceeding one or more predetermined values, a rate of change of the occupant data and a predetermined pattern of the occupant data.
7. The method of any preceding claim, comprising:
determining, from the image data, a facial expression of the at least one occupant; and determining the evaluation event of the occupant based on the facial expression.
8. The method of claim 7, comprising recognising an emotion of the at least one occupant based on the facial expression.
9. The method of any preceding claim, wherein the vehicle data and the occupant data is provided to the remote location in substantially real-time.
10. The method of any preceding claim, wherein the audio communication channel is established between an operator of the remote computer system and the at least one occupant of the vehicle.
11. The method of any of claims 1 to 9, wherein the audio communication channel is established between an artificial operator module executing on the remote computer system and the at least one occupant of the vehicle.
12. The method of any preceding claim, comprising establishing a video communication channel between the remote computer system and the at least one occupant of the vehicle responsive to at least one of the vehicle data and the occupant data.
13. The method of any preceding claim, comprising storing, at the remote computer system, one or both of the vehicle data and the occupant data.
14. The method of claim 13, comprising storing, at the remote computer system, data indicative of communication between the remote computer system and the at least one occupant of the vehicle associated with the one or both of the vehicle data and the occupant data.
15. A vehicle, comprising:
sensing means for measuring one or more attributes associated with the vehicle and outputting vehicle data corresponding thereto;
occupant monitoring means for determining a state of at least one occupant of the vehicle and outputting occupant data corresponding thereto wherein the occupant monitoring means comprises one or more imaging means for imaging the at least one occupant of the vehicle and outputting image data corresponding thereto;
wireless communication means for operatively supporting a wireless communication channel between the vehicle and a remote computer system;
control means arranged to receive the vehicle data and occupant monitoring data to determine one or both of an evaluation event of the vehicle based on the vehicle data and an evaluation event of the occupant based on one or both of the occupant data or the image data, to provide, in dependence on the determination of the evaluation event, the representation of one or more of the vehicle data, the occupant data or the image data to the remote location via the communication means and to receive a request for an audio communication channel between the remote computer system and the at least one occupant of the vehicle.
16. The vehicle of claim 15, wherein the occupant monitoring means comprises one or more sensing means for sensing one or more physical parameters of the at least one occupant of the vehicle and for outputting the occupant data indicative thereof.
17. The vehicle of claim 15 or 16, wherein:
the control means comprises facial expression recognition means for determining, from the image data, a facial expression of the at least one occupant;
wherein the control means is arranged to determine the evaluation event of the occupant based on the facial expression.
18. The vehicle of claim 15, 16 or 17, wherein the control means is arranged to determine the evaluation event of the vehicle in dependence on one or more of:
the vehicle data exceeding one or more predetermined values;
a rate of change of the vehicle data; and
a predetermined pattern of the vehicle data.
19. The vehicle of any of claims 15 to 18, comprising user interface means for outputting audio data received from the remote computer and determining audio data corresponding to sounds within the vehicle and communicating audio data to the remote computer corresponding thereto.
20. The vehicle of claim 19, wherein the user interface means is arranged to visually output an image corresponding to video data received from the remote computer.
21. A system for evaluating a vehicle, comprising:
a vehicle according to any of claims 15 to 20; and a remote computer system;
wherein the remote computer system is at least periodically in wireless communication with the vehicle via a wireless network to receive one or both of the vehicle data or the occupant data and to establish an audio communication channel with the vehicle responsive to the received data.
22. The system of claim 21 comprising determining, at the remote computer system, whether the at least one of the vehicle data and the occupant data corresponds to a vehicle evaluation event and initiating the audio communication channel between the remote computer system and the at least one occupant of the vehicle in response thereto.
23. The system of claim 21 or 22, comprising storing, at the remote computer system, one or both of the vehicle data and the occupant data.
24. The system of claim 21, 22 or 23, comprising an artificial operator module executing on the remote computer system, the artificial operator module being arranged to communicate with the one or more occupants of the vehicle.
25. Computer software which, when executed by a computer, is arranged to perform a method according to any of claims 1 to 14; optionally the computer software is stored on a computer readable medium.
GB1612843.1A 2016-07-25 2016-07-25 Apparatus and method for vehicle evaluation Active GB2552489B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1612843.1A GB2552489B (en) 2016-07-25 2016-07-25 Apparatus and method for vehicle evaluation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1612843.1A GB2552489B (en) 2016-07-25 2016-07-25 Apparatus and method for vehicle evaluation

Publications (3)

Publication Number Publication Date
GB201612843D0 GB201612843D0 (en) 2016-09-07
GB2552489A true GB2552489A (en) 2018-01-31
GB2552489B GB2552489B (en) 2019-05-22

Family

ID=56894482

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1612843.1A Active GB2552489B (en) 2016-07-25 2016-07-25 Apparatus and method for vehicle evaluation

Country Status (1)

Country Link
GB (1) GB2552489B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130226369A1 (en) * 2012-02-23 2013-08-29 Sirius XM Radio, Inc. Portable vehicle telematics systems and methods
US20140207535A1 (en) * 2013-01-24 2014-07-24 Ford Global Technologies, Llc Method and system for remote control of motor vehicles
US20150371456A1 (en) * 2014-06-24 2015-12-24 Hertz System, Inc. System and Method for Detecting and Remotely Assessing Vehicle Incidents and Dispatching Assistance

Also Published As

Publication number Publication date
GB2552489B (en) 2019-05-22
GB201612843D0 (en) 2016-09-07

Similar Documents

Publication Publication Date Title
CN105159111B (en) Intelligent interaction device control method and system based on artificial intelligence
CN112673378B (en) Device for generating estimator, monitoring device, method for generating estimator, and program for generating estimator
US7766828B2 (en) Estimation apparatus and its control method
KR101520524B1 (en) Alzheimers cognitive enabler
CN103561652B (en) Method and system for assisting patients
WO2019205642A1 (en) Emotion recognition-based soothing method, apparatus and system, computer device, and computer-readable storage medium
JP2017007652A (en) Method for recognizing a speech context for speech control, method for determining a speech control signal for speech control, and apparatus for executing the method
JP2012059107A (en) Emotion estimation device, emotion estimation method and program
GB2528083A (en) System and method for automated device control for vehicles using driver emotion
US20180248782A1 (en) In-vehicle infotainment system interoperability testing device
WO2021131446A1 (en) Analysis device, analysis method, and analysis program
US20210349433A1 (en) System and method for modifying an initial policy of an input/output device
CN108025752B (en) Method and device for assisting driving using a peripheral device for measuring at least one physiological parameter
KR102396794B1 (en) Electronic device and Method for controlling the electronic device thereof
JP7123856B2 (en) Presentation evaluation system, method, trained model and program, information processing device and terminal device
US20190193280A1 (en) Method for personalized social robot interaction
WO2018207619A1 (en) Data collection apparatus and learning apparatus
GB2552489A (en) Apparatus and method for vehicle evaluation
JP7068156B2 (en) Information processing equipment and programs
DE102015203875A1 (en) Method and device for the information output from a vehicle
WO2021004799A1 (en) Device and method for analysing the behaviour of a subject
JP7443908B2 (en) Control device, information processing system, and control method
EP3733068B1 (en) Assessing cognitive reaction to over-the-air updates
US20220371610A1 (en) Method for operating an assistance system depending on a personalised configuration set, assistance system, computer program and computer-readable medium
WO2021196751A1 (en) Digital human-based vehicle cabin interaction method, apparatus and vehicle