WO2024107403A1 - Systems and methods for controlling vehicles using voice commands - Google Patents

Systems and methods for controlling vehicles using voice commands

Info

Publication number
WO2024107403A1
Authority
WO
WIPO (PCT)
Prior art keywords
processor
command
vehicle
audio input
control system
Prior art date
Application number
PCT/US2023/037198
Other languages
English (en)
Inventor
Michael Brett Mcmickell
Thomas KREIDER
Original Assignee
Circlio, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Circlio, Inc. filed Critical Circlio, Inc.
Publication of WO2024107403A1

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/08 Speech classification or search
    • G10L 2015/088 Word spotting
    • G10L 15/22 Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223 Execution procedure of a spoken command
    • G10L 2015/226 Procedures used during a speech recognition process, e.g. man-machine dialogue, using non-speech characteristics
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R 16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/20 Control system inputs
    • G05D 1/22 Command input arrangements
    • G05D 13/00 Control of linear speed; Control of angular speed; Control of acceleration or deceleration, e.g. of a prime mover
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G06F 3/167 Audio in a user interface, e.g. using voice commands for navigating, audio feedback

Definitions

  • TITLE: SYSTEMS AND METHODS FOR CONTROLLING VEHICLES
  • the disclosure generally relates to controlling vehicles and, more specifically, to methods and systems for locally and/or remotely operating one or more vehicles using voice commands.
  • control system onboard a vehicle comprises a non-transitory computer-readable storage medium in electronic communication with an audio input device and a machine control system, having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations.
  • the operations comprise: identifying an operation mode of the vehicle, resulting in an identified operating mode; receiving, by the processor, an audio input from the audio input device in communication with the processor; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; comparing, by the processor, the audio input to a first plurality of possible voice commands to determine a first command; validating, by the processor, whether the first command is allowable based on the identified operating mode and the environmental input; and controlling, by the processor, the machine control system in response to the first command when the first command is allowable.
  • the operations further comprise recognizing, by the processor, a wake word or an external start signal.
  • the article of manufacture comprises a tangible, non-transitory computer-readable storage medium in electronic communication with an audio input device and a machine control system, having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations comprising: identifying, by the processor, an operation mode of a vehicle, resulting in an identified operating mode; receiving, by the processor, an audio input from the audio input device in communication with the processor; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; determining, by the processor, whether the audio input is a first command of a first plurality of possible voice commands; validating, by the processor, whether the first command is allowable based on the identified operating mode and the environmental input; and controlling, by the processor, the machine control system in response to the first command when the first command is allowable.
  • the article of manufacture can be used along with any of the systems or methods described herein.
  • a method of operating a vehicle comprises: identifying, by the processor, an operation mode of a vehicle, resulting in an identified operating mode; receiving, by the processor, an audio input from the audio input device in communication with the processor; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; determining, by the processor, whether the audio input is a first command of a first plurality of possible voice commands; validating, by the processor, whether the first command is allowable based on the identified operating mode and the environmental input; and controlling, by the processor, the machine control system in response to the first command when the first command is allowable.
  • the method can be used with any of the systems disclosed herein.
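  • As a non-limiting illustration of the onboard flow summarized above (identify the operating mode, receive the audio and environmental inputs, match a first command, validate it, and control the machine control system), a minimal sketch in Python follows. All names (Command, POSSIBLE_COMMANDS, process_voice_input, recognizer.transcribe, machine_control_system.execute, the "safe_to_proceed" key) are assumptions for illustration and are not the disclosed implementation.
```python
from dataclasses import dataclass


@dataclass
class Command:
    phrase: str          # spoken phrase to match against the audio input
    allowed_modes: set   # operating modes in which the command is allowable


# illustrative "first plurality of possible voice commands"
POSSIBLE_COMMANDS = [
    Command("turn off the engine", {"parked", "idle"}),
    Command("move forward 50 feet", {"idle", "driving"}),
]


def process_voice_input(operating_mode, audio_input, environmental_input,
                        recognizer, machine_control_system):
    """Identify, validate and execute a spoken command, per the flow above."""
    text = recognizer.transcribe(audio_input)            # audio input -> recognized text
    first_command = next((c for c in POSSIBLE_COMMANDS if c.phrase in text), None)
    if first_command is None:
        return None                                      # no match among possible voice commands
    # validate against the identified operating mode and the environmental input
    if operating_mode not in first_command.allowed_modes:
        return None
    if not environmental_input.get("safe_to_proceed", True):
        return None
    machine_control_system.execute(first_command.phrase)  # control the machine control system
    return first_command
```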
  • a vehicle control system comprises a cloud system comprising a cloud database and a cloud processor, and a non-transitory computer-readable storage medium in electronic communication with an audio input device, the cloud system and a machine control system, having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations.
  • the operations comprise: identifying, by the processor, an operation mode of a vehicle, resulting in an identified operating mode; receiving, by the processor, an audio input from the audio input device in communication with the processor; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; transmitting, by the processor, the identified operating mode of the vehicle, the audio input and the environmental input to the cloud processor via a network; receiving, by the processor, a first command from the cloud processor; and controlling, by the processor, the machine control system in response to the first command.
  • the operations further comprise recognizing, by the processor, a wake word or an external start signal.
  • the article of manufacture comprises a tangible, non-transitory computer-readable storage medium in electronic communication with an audio input device, a cloud system and a machine control system, having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations comprising: identifying, by the processor, an operation mode of a vehicle, resulting in an identified operating mode; receiving, by the processor, an audio input from the audio input device in communication with the processor; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; transmitting, by the processor, the identified operating mode of the vehicle, the audio input and the environmental input to the cloud processor via a network; receiving, by the processor, a first command from the cloud processor; and controlling, by the processor, the machine control system in response to the first command.
  • the article of manufacture can be used along with any of the systems or methods described herein.
  • a method of operating a vehicle comprises: identifying, by the processor, an operation mode of a vehicle, resulting in an identified operating mode; receiving, by the processor, an audio input from the audio input device in communication with the processor; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; transmitting, by the processor, the identified operating mode of the vehicle, the audio input and the environmental input to the cloud processor via a network; receiving, by the processor, a first command from the cloud processor; and controlling, by the processor, the machine control system in response to the first command.
  • the method can be used with any of the systems disclosed herein.
  • a vehicle control system comprises a processor and a cloud system comprising a cloud database and a non-transitory computer-readable storage medium in electronic communication with an audio input device and the processor, having instructions stored thereon that, in response to execution by a cloud processor, cause the cloud processor to perform operations.
  • the operations comprise: receiving, by the cloud processor, an identified operating mode of a vehicle, an audio input and an environmental input via a network; determining, by the cloud processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the cloud database; validating, by the cloud processor, whether the first command is allowable based on the identified operating mode and the environmental input; and sending, by the cloud processor, the first command to the processor.
  • the operations further comprise recognizing, by the cloud processor, a wake word or an external start signal.
  • the article of manufacture comprises a tangible, non-transitory computer-readable storage medium in electronic communication with an audio input device and a processor, having instructions stored thereon that, in response to execution by a cloud processor, cause the cloud processor to perform operations comprising: receiving, by the cloud processor, an identified operating mode of a vehicle, an audio input and an environmental input via a network; determining, by the cloud processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the cloud database; validating, by the cloud processor, whether the first command is allowable based on the identified operating mode and the environmental input; and sending, by the cloud processor, the first command to the processor.
  • the article of manufacture can be used along with any of the systems or methods described herein.
  • a method of operating a vehicle comprises: receiving, by the cloud processor, an identified operating mode of a vehicle, an audio input and an environmental input via a network; determining, by the cloud processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the cloud database; validating, by the cloud processor, whether the first command is allowable based on the identified operating mode and the environmental input; and sending, by the cloud processor, the first command to the processor.
  • the method can be used with any of the systems disclosed herein.
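  • As a non-limiting illustration of the cloud-side handling summarized above, a short sketch follows; the payload structure and the cloud_database and send_to_vehicle interfaces are assumptions, not the disclosed implementation.
```python
def handle_vehicle_request(payload, cloud_database, send_to_vehicle):
    """Cloud-side sketch: match and validate the audio input, then return a command."""
    operating_mode = payload["operating_mode"]
    environmental_input = payload["environmental_input"]
    text = cloud_database.recognize(payload["audio_input"])   # speech-to-text in the cloud

    for command in cloud_database.possible_commands():        # commands stored in the cloud database
        if command["phrase"] in text:
            allowable = (operating_mode in command["allowed_modes"]
                         and environmental_input.get("safe_to_proceed", True))
            if allowable:
                send_to_vehicle(command["phrase"])  # onboard processor then controls the machine control system
                return command["phrase"]
            return None                             # matched but not allowable
    return None                                     # no matching command
```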
  • a vehicle remote control system comprises a first vehicle and a remote controller comprising a non-transitory computer-readable storage medium in electronic communication with an audio input device and a processor, having instructions stored thereon that, in response to execution by a remote processor, cause the remote processor to perform operations.
  • the operations comprise: receiving, by the remote processor, an identified operating mode of the first vehicle, an audio input and an environmental input via a remote network; determining, by the remote processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the remote controller; validating, by the remote processor, whether the first command is allowable based on the identified operating mode and the environmental input; and transmitting, by the remote processor, the first command to the processor to control the first vehicle.
  • the operations further comprise recognizing, by the remote processor, a wake word or an external start signal.
  • the article of manufacture comprises a tangible, non-transitory computer-readable storage medium in electronic communication with an audio input device and a processor, having instructions stored thereon that, in response to execution by a remote processor, cause the remote processor to perform operations comprising: receiving, by the remote processor, an identified operating mode of the first vehicle, an audio input and an environmental input via a remote network; determining, by the remote processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the remote controller; validating, by the remote processor, whether the first command is allowable based on the identified operating mode and the environmental input; and transmitting, by the remote processor, the first command to the processor to control the first vehicle.
  • the article of manufacture can be used along with any of the systems or methods described herein.
  • a method of remotely operating a vehicle comprises: receiving, by the remote processor, an identified operating mode of the first vehicle, an audio input and an environmental input via a remote network; determining, by the remote processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the remote controller; validating, by the remote processor, whether the first command is allowable based on the identified operating mode and the environmental input; and transmitting, by the remote processor, the first command to the processor to control the first vehicle.
  • the method can be used with any of the systems disclosed herein.
  • a vehicle remote control system comprises a remote controller comprising a remote processor; and a first vehicle comprising a non-transitory computer-readable storage medium in electronic communication with a machine control system, having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations.
  • the operations comprise: identifying, by the processor, an operation mode of the first vehicle, resulting in an identified operating mode; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; transmitting, by the processor, the identified operating mode of the first vehicle and the environmental input to the remote processor; receiving, by the processor, a first command from the remote processor, and controlling, by the processor, the machine control system of the first vehicle in response to the first command.
  • the operations further comprise recognizing, by the processor, a wake word or an external start signal.
  • the article of manufacture comprises a tangible, non-transitory computer-readable storage medium in electronic communication with a machine control system, having instructions stored thereon that, in response to execution by a processor, cause the processor to perform operations comprising: identifying, by the processor, an operation mode of the first vehicle, resulting in an identified operating mode; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; transmitting, by the processor, the identified operating mode of the first vehicle and the environmental input to the remote processor; receiving, by the processor, a first command from the remote processor; and controlling, by the processor, the machine control system of the first vehicle in response to the first command.
  • the article of manufacture can be used along with any of the systems or methods described herein.
  • a method of remotely operating a vehicle comprises: identifying, by the processor, an operation mode of the first vehicle, resulting in an identified operating mode; receiving, by the processor, an environmental input from an environmental sensor input device in communication with the processor; transmitting, by the processor, the identified operating mode of the first vehicle and the environmental input to the remote processor; receiving, by the processor, a first command from the remote processor, and controlling, by the processor, the machine control system of the first vehicle in response to the first command.
  • the method can be used with any of the systems disclosed herein.
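  • For the remote-control variant summarized above, the vehicle-side loop can be pictured with the following sketch; the remote_link and machine_control_system interfaces are assumed names for illustration only.
```python
def remote_control_cycle(identify_operating_mode, read_environmental_input,
                         remote_link, machine_control_system):
    """Report vehicle state to the remote processor, then apply any validated command."""
    operating_mode = identify_operating_mode()
    environmental_input = read_environmental_input()
    # transmit the identified operating mode and the environmental input to the remote processor
    remote_link.transmit({"operating_mode": operating_mode,
                          "environmental_input": environmental_input})
    first_command = remote_link.receive_command()   # may be None if nothing was validated remotely
    if first_command is not None:
        machine_control_system.execute(first_command)
```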
  • FIG. 1 illustrates an off-road vehicle, in accordance with various embodiments
  • FIG. 2 illustrates a further off-road vehicle, in accordance with various embodiments
  • FIG. 3 illustrates a still further off-road vehicle, in accordance with various embodiments
  • FIG. 4 illustrates an interior cabin of an off-road vehicle, in accordance with various embodiments
  • FIG. 5 illustrates a block diagram for a vehicle control system, in accordance with various embodiments
  • FIG. 6 illustrates a block diagram for a vehicle control system, in accordance with various embodiments
  • FIG. 7 illustrates a block diagram for a vehicle control system, in accordance with various embodiments
  • FIG. 8 illustrates a method for controlling an off-road vehicle, in accordance with various embodiments
  • FIG. 9 illustrates a method for controlling an off-road vehicle, in accordance with various embodiments.
  • FIG. 10 illustrates a method for controlling an off-road vehicle, in accordance with various embodiments
  • FIG. 11 illustrates a method for controlling an off-road vehicle, in accordance with various embodiments
  • FIG. 12 illustrates a method for controlling an off-road vehicle, in accordance with various embodiments.
  • “communication” means communication of electronic signals with physical coupling (e.g., “electrical communication” or “electrically coupled”) or without physical coupling and via an electromagnetic field (e.g., “radio frequency communication” or “radio frequency coupled” or “radio frequency coupling”).
  • “transmit” may include sending electronic data from one system component to another via electronic communication between the components.
  • data may include encompassing information such as commands, queries, files, data for storage, and the like in digital or any other form which can be transmitted via electronic communication.
  • audio input device is any suitable hardware, software, and/or database components capable of sending and receiving audio data.
  • an audio input device may comprise a wired or wireless microphone, a wired or wireless lapel microphone, a wired or wireless headset/earpiece containing a microphone, and/or the like.
  • the audio input device is in electronic communication with a processor, a cloud processor via a network and/or a remote processor.
  • the audio input device can include frame-mounted devices aimed away from or toward the operator. The frame-mounted devices aimed away from the vehicle operator can be used to gather environmental noise that may be used to filter the audio input.
  • the audio input device can also be an audio output device such as a BLUETOOTH® headset.
  • Any databases discussed herein may include relational, hierarchical, graphical, blockchain, object-oriented structure, and/or any other database configurations.
  • Any database may also include a flat file structure wherein data may be stored in a single file in the form of rows and columns, with no structure for indexing and no structural relationships between records.
  • a flat file structure may include a delimited text file, a CSV (comma- separated values) file, and/or any other suitable flat file structure.
  • DB2® by IBM (Armonk, NY), various database products available from ORACLE® Corporation (Redwood Shores, CA), MICROSOFT ACCESS® or MICROSOFT SQL SERVER® by MICROSOFT® Corporation (Redmond, Washington), MYSQL® by MySQL AB (Uppsala, Sweden), MONGODB®, Redis, Apache Cassandra®, HBASE® by APACHE®, MapR-DB by the MAPR® corporation, or any other suitable database product.
  • any database may be organized in any suitable manner, for example, as data tables or lookup tables. Each record may be a single file, a series of files, a linked series of data fields, or any other data structure.
  • big data may refer to partially or fully structured, semistructured, or unstructured data sets including millions of rows and hundreds of thousands of columns.
  • Association of certain data may be accomplished through any desired data association technique such as those known or practiced in the art.
  • the association may be accomplished either manually or automatically.
  • Automatic association techniques may include, for example, a database search, a database merge, GREP, AGREP, SQL, using a key field in the tables to speed searches, sequential searches through all the tables and files, sorting records in the file according to a known order to simplify lookup, and/or the like.
  • the association step may be accomplished by a database merge function, for example, using a “key field” in pre-selected databases or data sectors.
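  • As a non-limiting illustration of such a key-field merge, a short sketch in Python using SQLite follows; the table and column names are hypothetical examples and are not part of the disclosure.
```python
import sqlite3

# Illustrative key-field association ("merge") of two tables on a shared vehicle_id key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE vehicles (vehicle_id TEXT, operating_mode TEXT);
    CREATE TABLE commands (vehicle_id TEXT, phrase TEXT);
    INSERT INTO vehicles VALUES ('v1', 'idle');
    INSERT INTO commands VALUES ('v1', 'move forward 50 feet');
""")
rows = conn.execute("""
    SELECT v.vehicle_id, v.operating_mode, c.phrase
    FROM vehicles v JOIN commands c ON v.vehicle_id = c.vehicle_id
""").fetchall()
print(rows)   # [('v1', 'idle', 'move forward 50 feet')]
```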
  • Various database tuning steps are contemplated to optimize database performance. For example, frequently used files such as indexes may be placed on separate file systems to reduce In/Out (“I/O”) bottlenecks.
  • any databases, systems, devices, servers, or other components of the system may consist of any combination thereof at a single location or at multiple locations, wherein each database or system includes any of various suitable security features, such as firewalls, access codes, encryption, decryption, public and private keys, and/or the like.
  • network includes any cloud, cloud computing system, or electronic communications system or method which incorporates hardware and/or software components. Communication among the parties may be accomplished through any suitable communication channels, such as, for example, a telephone network, an extranet, an intranet, internet, personal internet device, online communications, satellite communications, off-line communications, wireless communications, transponder communications, local area network (LAN), wide area network (WAN), virtual private network (VPN), networked or linked devices, keyboard, mouse, and/or any suitable communication or data input modality.
  • although the system is frequently described herein as being implemented with TCP/IP communications protocols, the system may also be implemented using IPX, APPLETALK®, IPv6, NetBIOS, any tunneling protocol (e.g., IPsec, SSH, etc.), or any number of existing or future protocols.
  • Specific information related to the protocols, standards, and application software utilized in connection with the internet is generally known to those skilled in the art and, as such, need not be detailed herein.
  • “Cloud” or “cloud computing” includes a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
  • Cloud computing may include location-independent computing, whereby shared servers provide resources, software, and data to computers and other devices on demand.
  • Computer programs are stored in main memory and/or secondary memory. Computer programs may also be treated as data and transmitted or received via a communications interface as a means of dissemination prior to execution. Such computer programs, when executed, enable the computer system to perform the features as discussed herein. In particular, the computer programs, when executed, enable the processor to perform the features of various embodiments. Accordingly, such computer programs represent controllers of the computer system.
  • Computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks as shown in the Figures.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • software may be stored in a computer program product and loaded into a computer system using a removable storage drive, hard disk drive, or communications interface.
  • the control logic (software), when executed by the processor, causes the processor to perform the functions of various embodiments as described herein.
  • hardware components may take the form of application specific integrated circuits (ASICs) or commercially available packaged integrated circuits such as micro-controller units (MCUs), central processing units (CPUs), or field programmable gate arrays (FPGAs). Implementation of the hardware so as to perform the functions described herein will be apparent to persons skilled in the relevant art(s).
  • the system may be embodied as a customization of an existing system, an add-on product, a processing apparatus executing upgraded software, a stand-alone system, a distributed system, a method, a data processing system, a device for data processing, and/or a computer program product.
  • any portion of the system or a module may take the form of a processing apparatus executing code, an internet based embodiment, an entirely hardware embodiment, or an embodiment combining aspects of the internet, software, and hardware.
  • the system may take the form of a computer program product on a computer-readable storage medium having computer-readable program code means embodied in the storage medium. Any suitable computer-readable storage medium may be utilized, including hard disks, CD-ROM, BLU-RAY DISC®, optical storage devices, magnetic storage devices, flash disk, flash memory and/or the like.
  • components, modules, and/or engines of the systems may be implemented as applications or apps.
  • Apps are typically deployed in the context of a mobile operating system, including for example, a WINDOWS® mobile operating system, an ANDROID® operating system, an APPLE® iOS operating system, a BLACKBERRY® company’s operating system, and the like.
  • the app may be configured to leverage the resources of the larger operating system and associated hardware via a set of predetermined rules which govern the operations of various operating systems and hardware resources. For example, where an app desires to communicate with a device or network other than the mobile device or mobile operating system, the app may leverage the communication protocol of the operating system and associated device hardware under the predetermined rules of the mobile operating system.
  • where the app desires an input from a user, the app may be configured to request a response from the operating system, which monitors various hardware components and then communicates a detected input from the hardware to the app.
  • steps described herein may comprise, in any number of configurations, the use of WINDOWS® / LINUX® / UNIX® applications, webpages, web forms, popup WINDOWS® / LINUX® / UNIX® applications, prompts, and the like. It should be further appreciated that the multiple steps as illustrated and described may be combined into single webpages and/or WINDOWS® / LINUX® / UNIX® applications but have been expanded for the sake of simplicity. In other cases, steps illustrated and described as single process steps may be separated into multiple webpages and/or WINDOWS® / LINUX® / UNIX® applications but have been combined for simplicity.
  • the computers discussed herein may provide a suitable website or other internet-based graphical user interface (GUI) which is accessible by users.
  • MICROSOFT® company’s Internet Information Services (IIS), Transaction Server (MTS) service, and an SQL SERVER® database may be used in conjunction with WINDOWS NT® web server software and, e.g., MICROSOFT® Commerce Server.
  • components such as ACCESS® software, SQL SERVER® database, ORACLE® software, SYBASE® software, INFORMIX® software, MYSQL® software, INTERBASE® software, etc.
  • the APACHE® web server is used in conjunction with a LINUX® operating system, a MYSQL® database, and
  • the system and method may be described herein in terms of functional block components, screen shots, optional selections, and various processing steps. It should be appreciated that such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions.
  • the system may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
  • the software elements of the system may be implemented with any programming or scripting language such as C, C++.
  • the system could be used to detect or prevent security issues with a client-side scripting language, such as JAVASCRIPT®, VBScript, or the like.
  • Computer-based system program instructions and/or processor instructions may be loaded onto a tangible, non-transitory computer readable medium having instructions stored thereon that, in response to execution by a processor, cause the processor to perform various operations.
  • the term “non-transitory” is to be understood to remove only propagating transitory signals per se from the claim scope and does not relinquish rights to all standard machine-readable media that are not only propagating transitory signals per se.
  • “non-transitory machine-readable medium” and “non-transitory machine-readable storage medium” should be construed to exclude only those types of transitory machine-readable media which were found in In re Nuijten to fall outside the scope of patentable subject matter under 35 U.S.C. § 101.
  • Operators of off-road vehicles can commit errors when command options are not fully known, or particular operating data is not available to the operator. Additionally, operators can commit errors by inputting commands that are inappropriate to the current operating mode of the off-road vehicle, whether intentionally or by accident. For example, operator errors can be caused by an operator entering incorrect data including, but not limited to, entering data outside the acceptable ranges for the current operating mode of the off-road vehicle.
  • Voice control systems for off-road vehicles can provide additional support and safety measures by identifying available commands and verifying that the commands are safe before executing them.
  • an off-road vehicle 100 having an interior cabin 102, a drive system comprising an engine 104 and tires 106 is illustrated, in accordance with various embodiments.
  • the off-road vehicle 100 comprises a tool system 108 such as a bucket or attachment used for mining and agricultural applications.
  • Off-road vehicle 100 also comprises an image sensor 110.
  • the image sensor 110 can be, for example, a camera, a RADAR sensor, a LiDAR sensor, or the like.
  • the image sensor 110 can be an IR sensor and/or a structured light 3D scanner which comprises an IR dot projector and an IR sensor for producing 3D images.
  • the image sensor 110 is used to map the terrain around the off-road vehicle 100.
  • the off-road vehicle 100 can be equipped with multiple image sensors 110 to provide a terrain output with a full 360 degree terrain map around the off-road vehicle 100.
  • the off-road vehicle 100 can comprise a global positioning system (GPS) receiver 107.
  • the GPS receiver 107 can also be a global navigation satellite system (GNSS) receiver.
  • the GPS receiver 107 can be a stand-alone device disposed on the roof of the off-road vehicle.
  • GPS and GNSS are networks of satellites that orbit Earth and can pinpoint the geographic location of a device using wireless signals.
  • An off-road vehicle location can be determined from a GPS location of the GPS receiver 107.
  • the GPS receiver 107 is configured to transmit the GPS location to a processor.
  • the GPS receiver 107 is configured to continuously transmit the GPS location to the processor.
  • an off-road vehicle 200 having an interior cabin 202, a drive system comprising an engine 204 and tires 206 is illustrated, in accordance with various embodiments.
  • the off-road vehicle 200 comprises a tool system 208 such as a dump body used for mining and agricultural applications to haul and dump materials.
  • Off-road vehicle 200 also comprises an image sensor 210.
  • the image sensor 210 can be the same as the image sensor 110 described herein.
  • the off-road vehicle 200 can comprise a GPS receiver 207.
  • the GPS receiver 207 can also be a GNSS receiver.
  • the GPS receiver 207 can be a stand-alone device disposed on the roof of the off-road vehicle.
  • the GPS receiver 207 can be the same as described herein for the GPS receiver 107.
  • an off-road vehicle 300 having an interior cabin 302, a drive system comprising an engine 304 and tires 306 is illustrated, in accordance with various embodiments.
  • the off-road vehicle 300 comprises a tool system 308 such as a hitch for a plow used in agricultural applications.
  • Off-road vehicle 300 also comprises an image sensor 310.
  • the image sensor 310 can be the same as the image sensor 110 described herein.
  • the off-road vehicle 300 can comprise a GPS receiver 307.
  • the GPS receiver 307 can also be a GNSS receiver.
  • the GPS receiver 307 can be a stand-alone device disposed on the roof of the off-road vehicle.
  • the GPS receiver 307 can be the same as described herein for the GPS receiver 107.
  • Interior cabin 400 can be used for any of the interior cabins 102, 202, and 302.
  • the interior cabin 400 can be used to seat the operator of the off-road vehicle in the event the off-road vehicle is being operated locally or the interior cabin 400 can be vacant if the off-road vehicle is being operated remotely.
  • the interior cabin 400 comprises a processor 402 in communication with an audio input device 404 (such as an audio input device as described herein).
  • the processor 402 can also be in electronic communication with a machine control system 403, a tool system (such as tool systems 108, 208, and 308), an image sensor (such as image sensors 110, 210, and 310), and a haptic device 410 embedded in a steering wheel 408 and/or a seat 409
  • a start button 411 is configured to transmit a start button signal to the processor 402 in response to being engaged.
  • the off-road vehicle can be operated remotely.
  • a remote audio input device is used to transmit remote audio input to a remote processor.
  • the remote processor can be in communication with a remote visual display, a remote speaker and a remote haptic device all disposed in a remote station.
  • the machine control system 403 may include one or more logic devices such as one or more of a central processing unit (CPU), a graphical processing unit (GPU), an accelerated processing unit (APU), a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or the like.
  • the processor 402 may further include any non-transitory memory known in the art and store instructions usable by the logic device to perform operations. Any appropriate computer-readable type/configuration may be utilized as a part of the logic device, and any appropriate data storage architecture may be utilized by the logic device, or both.
  • the processor 402 is configured to control the movement of the off-road vehicle through a drive output signal transmitted to the machine control system 403.
  • the machine control system 403 is in communication with a drive system (such as the drive systems described herein) and is configured to send commands to the drive system.
  • the drive output can control the machine control system 403 to command the engine to operate at a set output and for a set time until a desired distance is achieved.
  • the drive output can control the machine control system 403 to command the steering wheel 408 to turn for a set time until a desired orientation is achieved.
  • closed loop control is implemented, where a command is performed until a contingent event occurs, though open loop control is also possible, where a command is executed at a single time or is continually executed until a command to cease execution is received.
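  • As a non-limiting illustration of the closed-loop pattern described above, a short sketch follows; the machine_control_system and read_distance_travelled interfaces, the polling interval, and the contingent event chosen (a target distance) are assumptions for illustration only.
```python
import time


def drive_until_distance(machine_control_system, read_distance_travelled,
                         target_distance_ft, engine_output, poll_s=0.1):
    """Closed-loop execution: keep the command running until the contingent event occurs.

    An open-loop variant would simply issue the command once, or keep issuing it
    until an explicit command to cease execution is received.
    """
    machine_control_system.set_engine_output(engine_output)
    while read_distance_travelled() < target_distance_ft:   # contingent event: desired distance reached
        time.sleep(poll_s)
    machine_control_system.set_engine_output(0)              # stop once the distance is achieved
```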
  • the processor 402 may include one or more logic devices such as one or more of a central processing unit (CPU), a graphical processing unit (GPU), an accelerated processing unit (APU), a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or the like.
  • the processor 402 may further include any non-transitory memory known in the art and store instructions usable by the logic device to perform operations. Any appropriate computer-readable type/configuration may be utilized as a part of the logic device, and any appropriate data storage architecture may be utilized by the logic device, or both.
  • the processor 402 is configured to transmit a signal, such as an alert, to the speaker (as an audio output), to the visual display (as a visual output), and/or to the haptic device (as a vibrating output).
  • the processor 402 is configured to receive signals from the machine controller 403 which is in communication with an environmental input device, for example, the image sensor (such as image sensors 110, 210, and 310), the GPS receiver 407 and other sensors that may be used on the off-road vehicle to determine the safety of the off-road vehicle.
  • Some other sensors that could be used as environment input devices are wheel sensors embedded in the tire which can transmit a wheel spin rate to determine whether each wheel has lost traction or may soon lose traction with the driving surface, and fuel sensors connected to the fuel tank of the off-road vehicle to transmit fuel levels to help determine whether the off-road vehicle has enough fuel to complete a command.
  • the processor 402 is also connected to a cloud system via a network and is configured to transmit signals to and receive signals from the cloud system.
  • the processor 402 is in electronic communication with a remote system (such as another off-road vehicle, an on-site operator, or an off-site operator) via a network and the processor 402 is configured to transmit signals to and receive signals from the remote system.
  • the control system 500 can be a control system onboard a vehicle, such as off-road vehicles 100, 200, and 300.
  • the control system 500 comprises a processor 502 (such as processor 402).
  • the processor 502 may include one or more logic devices such as one or more of a central processing unit (CPU), a graphical processing unit (GPU), an accelerated processing unit (APU), a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or the like.
  • the processor 502 may further include any non-transitory memory known in the art and store instructions usable by the logic device to perform operations. Any appropriate computer-readable type/configuration may be utilized as a part of the logic device, and any appropriate data storage architecture may be utilized by the logic device, or both.
  • control system 500 further comprises an environmental input device 504 (such as any of the environmental input devices described herein).
  • environmental input device 504 is in communication with processor 502 and is configured to output an environmental input (such as GPS location, fuel levels, terrain output, and wheel spin rate). For example, wheel spin rate is measured as the number of wheel revolutions per minute.
  • control system 500 further comprises an audio input device 506 (such as the audio input device 404).
  • the audio input device 506 is in communication with processor 502 and is configured to electronically output an audio input.
  • the audio input device 506 can continuously transmit audio input to the processor 502.
  • control system 500 further comprises a machine control system 508 (such as machine control system 403).
  • the machine control system 508 is in communication with processor 502 and is configured to control the engine and steering wheel of the off-road vehicle based on a drive output from the processor 502.
  • the drive output can comprise an engine input and a steering wheel input to control the off-road vehicle.
  • the drive output is a command, for example “make speed 5 miles per hour” or “nudge vehicle right 6 inches”, and the drive output is then converted by the machine control system into the engine input and the steering wheel input.
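  • One way to picture this conversion of a high-level drive output into an engine input and a steering wheel input is sketched below; the drive-output structure and the proportional gains are placeholders assumed for illustration, not the disclosed control law.
```python
def convert_drive_output(drive_output, current_speed_mph):
    """Convert a high-level drive output into engine and steering wheel inputs."""
    engine_input, steering_input = 0.0, 0.0
    if drive_output["type"] == "set_speed":
        # simple proportional engine adjustment toward the commanded speed
        engine_input = 0.1 * (drive_output["mph"] - current_speed_mph)
    elif drive_output["type"] == "nudge_right":
        # small steering offset proportional to the requested lateral distance
        steering_input = 0.05 * drive_output["inches"]
    return {"engine_input": engine_input, "steering_input": steering_input}


# example: "make speed 5 miles per hour" while currently travelling at 3 mph
print(convert_drive_output({"type": "set_speed", "mph": 5}, current_speed_mph=3))
```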
  • control system 500 further comprises an alert device 510 (such as the haptic device 410, the visual display 412, and the speakers 414).
  • there may be more than one alert device 510, including any quantity and combination of alert devices 510.
  • Visual display 412 can be a screen display or a light (for example an LED) that can illuminate to indicate the type of alert being relayed to the operator.
  • the alert device 510 is in communication with processor 502 and is configured to output an alert notification.
  • the alert notification can be a vibrating output for a haptic device to vibrate, for example, a haptic device causing vibration of a steering wheel (such as steering wheel 408) or vibration of a seat (such as seat 409) to alert the operator.
  • the alert notification can be a visual output for a visual display to visually alert the operator.
  • the alert notification can be an audio output for a speaker to audially alert the operator.
  • the audio output can comprise a tone, a chime, a beep, and/or the like.
  • the audio output can also be a text-to-voice response, for example output through an attached internal or external speaker.
  • the alert notification can be used to alert the operator of a command that would be dangerous to execute given the environmental and/or operating conditions, and/or the alert notification can be used to confirm with the operator that the command determined by the processor is the desired command before outputting said command.
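  • A simple dispatcher along the lines described above might look like the following sketch; the device objects and their vibrate/show/play methods are assumed interfaces for illustration only.
```python
def send_alert(alert_devices, message, kinds=("haptic", "visual", "audio")):
    """Route an alert notification to the haptic device, visual display and/or speaker."""
    for device in alert_devices:
        if device.kind == "haptic" and "haptic" in kinds:
            device.vibrate()          # e.g. vibrate the steering wheel or seat
        elif device.kind == "visual" and "visual" in kinds:
            device.show(message)      # screen text or indicator light
        elif device.kind == "audio" and "audio" in kinds:
            device.play(message)      # tone, chime, beep, or text-to-voice response
```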
  • a method 800 for operating a vehicle may comprise identifying, by a processor (shown as processors 402 and 502) an operation mode of a vehicle, resulting in an identified operating mode (step 802).
  • the processor is configured to receive operation data from the vehicle and a machine control system (shown as machine control system 508), for example, an engine state (whether the engine is off or on), a vehicle speed, a tool state (whether the off-road vehicle's tool system is being used), and other similar operation data.
  • the processor, in response to receiving the operation data, can determine the identified operating mode.
  • the method 800 may comprise receiving, by the processor, an audio input from an audio input device (shown as audio input devices 404 and 506) in communication with the processor (step 804). In various embodiments, the method 800 may comprise receiving, by the processor, an environmental input from an environmental sensor input device (shown as environmental input device 504) in communication with the processor (step 806).
  • the method 800 may comprise determining, by the processor, whether the audio input is a first command of a first plurality of possible voice commands (step 808).
  • the processor comprises the first plurality of possible voice commands in the non-transitory memory of the processor.
  • the first plurality of possible voice commands can be a list of all executable commands and functions for the vehicle. If the audio input is determined to match with a command from the first plurality of possible voice commands, then the processor assigns the audio input as the first command. If the audio input is determined to not match with a command from the first plurality of possible voice commands, then the processor does not assign the first command and the processor can communicate the rejection using a default response signal which can be sent to any alert device.
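  • As a non-limiting illustration of this matching step, assuming the recognizer yields (phrase, confidence) pairs as mentioned later in connection with filtering, a sketch follows; the command structure, confidence threshold, and default response wording are assumptions for illustration only.
```python
def match_first_command(recognized, possible_commands, min_confidence=0.7):
    """Match recognized speech against the first plurality of possible voice commands.

    recognized: list of (phrase, confidence) pairs from speech recognition.
    Returns the matching command, or a default response signal when nothing matches.
    """
    for phrase, confidence in recognized:
        if confidence < min_confidence:
            continue                                  # ignore low-confidence recognitions
        for command in possible_commands:
            if command["phrase"] == phrase:
                return command                        # assigned as the first command
    return {"default_response": "command not recognized"}   # sent to an alert device
```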
  • the audio input received by the audio input device and transmitted to the processor can be continuous or substantially continuous, though it is understood that not all audio received by the audio input device is intended to be a command. Accordingly, further analysis of the audio input can be gated on a triggering event, which is useful in preventing miscellaneous audio from being further analyzed by the processor.
  • the triggering event can be the use of a wake word.
  • the audio input may be received continuously or substantially continuously from the audio input device and analyzed, for example in real time, by the processor for purposes of detecting the wake word. Upon identifying the wake word, the processor may regard this as the triggering event.
  • the triggering event can be another suitable electronic input, such as the engaging of a start button (such as start button 411) by the operator or sending of a start signal over the network from a user device to the processor.
  • the operations of method 800 can further comprise receiving, by the processor, a triggering event comprising at least one of a wake word, a start button signal, and a start signal.
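  • The triggering-event gate described above can be pictured with the minimal sketch below; the placeholder wake word and the function signature are assumptions for illustration only.
```python
WAKE_WORD = "vehicle"   # placeholder wake word; the disclosure does not specify one


def is_triggered(recent_words, start_button_pressed, network_start_signal):
    """Return True when a wake word, start button signal, or network start signal occurs."""
    if WAKE_WORD in recent_words:                 # wake word detected in the continuous audio stream
        return True
    return bool(start_button_pressed or network_start_signal)
```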
  • the method 800 may comprise validating, by the processor, whether the first command is allowable based on the identified operating mode and the environmental input (step 810).
  • the step of validating further comprises comparing, by the processor, the first command to the speed of the vehicle. For example, if the first command is turn off the engine, and the speed of the vehicle is 50 miles per hour, it would not be safe to turn off the engine at that speed. Therefore, the first command would not be validated as allowable. If the first command is, for example, move forward 50 feet, and the speed of the vehicle is 0 mph, it would be safe to command the machine control system to move the vehicle forward 50 feet. Therefore, the first command would be validated as allowable.
  • the step of validating further comprises comparing, by the processor, the first command to the state of the vehicle. For example, if the first command is to start autosteering and the vehicle is currently in park, it would not be appropriate to start autosteering. Therefore, the first command would not be validated as allowable. If the first command is start autosteering and the vehicle is in motion, it would be safe to control the machine control system to start autosteering. Therefore, the first command would be validated as allowable.
  • the step of validating further comprises comparing, by the processor, the first command to the terrain output. For example, if the first command is move forward 50 feet, and the terrain output shows a slope that exceeds a set negative grade, for example -10% or more, it would not be safe to move forward 50 feet in heavy off-road vehicles. Therefore, the first command would not be validated as allowable. If the first command is move forward 50 feet, and the terrain output shows no physical limitations or slope exceeding the set negative grade for the next 50 feet, it would be safe to control the machine control system to move the vehicle forward 50 feet. Therefore, the first command would be validated as allowable.
  • the off-road vehicle can comprise an Inertial Measurement Unit (IMU), in communication with the processor, configured to transmit an attitude of the off-road vehicle to the processor.
  • the attitude of the off-road vehicle can also be compared to the first command.
  • the off-road vehicle can comprise an angular velocity sensor, such as a gyroscopic sensor, for the detection of angular velocity.
  • the step of validating can comprise comparing one or more of the environmental sensor inputs (GPS location, terrain output, and speed of the vehicle) to validate the first command.
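  • Taken together, the validation examples above (engine-off at speed, autosteering while in park, excessive negative grade ahead) might be sketched as simple rules; only the -10% grade figure comes from the text, and the rule structure, command names, and environmental-input keys are assumptions for illustration.
```python
MAX_NEGATIVE_GRADE = -0.10   # -10% slope, per the example above


def validate_command(command, operating_mode, env):
    """Validate the first command against the identified operating mode and environmental input."""
    if command == "turn_off_engine" and env["speed_mph"] > 0:
        return False        # unsafe to turn off the engine while the vehicle is moving
    if command == "start_autosteering" and operating_mode == "park":
        return False        # autosteering requires the vehicle to be in motion
    if command == "move_forward_50ft" and env["grade_ahead"] <= MAX_NEGATIVE_GRADE:
        return False        # slope ahead exceeds the set negative grade
    return True
```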
  • the processor outputs an alert notification to one or more alert devices.
  • the alert notification can be an alert to the operator that the first command is not allowable.
  • the method 800 may comprise controlling, by the processor, the machine control system in response to the first command being allowable (step 812).
  • method 800 may comprise controlling, by the processor, a tool system (shown as tool systems 108, 208, and 308) in response to the first command being allowable.
  • the processor outputs a confirmation notification requesting a confirmatory response from the operator to confirm that the first command is correct.
  • the confirmation notification can be displayed by the visual display or played via audio signal through the speakers.
  • the method 800 may comprise receiving, by the processor, a confirmatory response through the audio input device, the start button, or the visual display.
  • the processor controls the vehicle system or the tool system with the first command in response to the processor receiving an affirmative confirmatory response (i.e., yes).
  • the method 800 can further comprise filtering, by the processor, the audio input based on the identified operating mode of the vehicle and/or an environmental audio input from a second audio input device (step 807).
  • the processor can filter the audio input to obtain a cleaner sample of audio input for processing, remove background noise from the audio input (identified using the identified operating mode of the vehicle and/or the second audio input), and convert the audio input to a list of recognized words and confidences for the processor to analyze against the first plurality of possible voice commands.
  • the second audio input device can be positioned and configured to only transmit a second audio input received from the drive system and the environment around the vehicle.
  • the processor (such as processors 402 and/or 502) compares the audio input to the second audio input to filter the audio input by removing overlapping audio data between the audio input and the second audio input. The resulting output is a filtered audio input.
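  • One common way to picture this removal of overlapping audio data is spectral subtraction; the disclosure does not specify a filtering algorithm, so the following NumPy sketch is only an assumed illustration.
```python
import numpy as np


def filter_audio(audio_input, second_audio_input):
    """Subtract the environmental (second) audio spectrum from the operator audio.

    Both inputs are assumed to be equal-length 1-D sample arrays at the same rate.
    Returns the filtered audio input.
    """
    spec = np.fft.rfft(audio_input)
    noise_spec = np.fft.rfft(second_audio_input)
    # subtract the noise magnitude while keeping the phase of the operator audio
    magnitude = np.maximum(np.abs(spec) - np.abs(noise_spec), 0.0)
    filtered_spec = magnitude * np.exp(1j * np.angle(spec))
    return np.fft.irfft(filtered_spec, n=len(audio_input))
```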
  • the processors 402 and 502 can comprise instructions stored thereon, that in response to execution by the processors 402 and 502, cause the processors 402 and 502 to perform operations comprising any of the steps and embodiments of method 800.
  • the method 800 can be performed by the processors 402 and 502.
  • the control system 600 can be a vehicle control system onboard a vehicle and connected to a network, such as off-road vehicles 100, 200, and 300.
  • the control system 600 comprises the processor 502.
  • the processor 502 is in electronic communication with a network 602.
  • a cloud system 604 comprising a cloud database 606 and a cloud processor 608 is also in electronic communication with the network 602.
  • the cloud database 606 may store instructions usable by the cloud processor 608 to perform operations. Any appropriate computer-readable type/configuration may be utilized as the cloud database 606, any appropriate data storage architecture may be utilized by the cloud database 606, or both.
  • the cloud processor 608 may include one or more logic devices such as one or more of a central processing unit (CPU), a graphical processing unit (GPU), an accelerated processing unit (APU), a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or the like.
  • the cloud processor 608 may further include any non-transitory memory known in the art and store instructions usable by a logic device to perform operations. Any appropriate computer-readable type/configuration may be utilized as a part of the logic device, and any appropriate data storage architecture may be utilized by the logic device, or both.
  • the cloud processor 608 can be an on-premise processor or a dedicated server processor.
  • the control system 600 further comprises the environmental input device 504. In various embodiments, there may be more than one environmental input device 504. The environmental input device 504 is in communication with processor 502 and is configured to output an environmental input. In various embodiments, the control system 600 further comprises the audio input device 506. In various embodiments, there may be more than one audio input device 506. In various embodiments, the audio input device 506 can continuously transmit audio input to the processor 502. In various embodiments, the control system 600 further comprises a machine control system 508. In various embodiments, the control system 600 further comprises an alert device 510 (such as the haptic device 410, the visual display 412, and the speakers 414). In various embodiments, there may be more than one alert device 510. The alert device 510 is in communication with processor 502 and is configured to output an alert notification as described herein.
  • a method 900 for operating a vehicle may comprise identifying, by a processor (shown as processors 402 and 502) an operation mode of a vehicle, resulting in an identified operating mode (step 902).
  • the processor is configured to receive operation data from the vehicle and a machine control system (shown as machine control system 508), for example, an engine state (whether the engine is off or on), a vehicle speed, a tool state (whether the off-road vehicle’s tool system is in use), and other similar operation data.
  • the processor, in response to receiving the operation data, can determine the identified operating mode.
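A minimal sketch, assuming a simple rule-based mapping that the specification does not prescribe: operation data such as engine state, vehicle speed, and tool state could be reduced to an identified operating mode as follows. The mode labels are hypothetical.

    # Illustrative sketch only: derive an identified operating mode from operation data.
    from dataclasses import dataclass

    @dataclass
    class OperationData:
        engine_on: bool
        speed_mph: float
        tool_active: bool

    def identify_operating_mode(data: OperationData) -> str:
        """Return a coarse operating-mode label from raw operation data."""
        if not data.engine_on:
            return "PARKED_ENGINE_OFF"
        if data.tool_active:
            return "TOOL_OPERATION"
        if data.speed_mph > 0.5:
            return "DRIVING"
        return "IDLE"

    if __name__ == "__main__":
        print(identify_operating_mode(OperationData(True, 12.0, False)))  # DRIVING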
  • the method 900 may comprise receiving, by the processor, an audio input from an audio input device (shown as audio input devices 404 and 506) in communication with the processor (step 904).
  • the method 900 may comprise receiving, by the processor, an environmental input from an environmental sensor input device (shown as environmental input device 504) in communication with the processor (step 906).
  • the method 900 may comprise transmitting, by the processor, the identified operating mode of the vehicle, the audio input and the environmental input to the cloud processor (shown as cloud processor 608) via a network (shown as network 602) (step 908).
  • the cloud processor is in communication with the cloud database.
  • the cloud database comprises a first plurality of possible voice commands in the non-transitory memory of the cloud database.
  • the first plurality of possible voice commands can be a list of all executable commands and functions for the vehicle. If the audio input is determined to match with a command from the first plurality of possible voice commands, then the processor assigns the audio input as the first command. If the audio input is determined to not match with a command from the first plurality of possible voice commands, then the processor does not assign the first command.
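For illustration only, matching the recognized words (and their confidences) against the first plurality of possible voice commands could look like the sketch below; the example phrase list and confidence threshold are assumptions, not taken from the specification.

    # Illustrative sketch only: assign the first command when a confident match exists.
    POSSIBLE_COMMANDS = ["move forward", "stop", "raise bucket", "turn off engine"]

    def match_command(recognized, min_confidence=0.8):
        """recognized: list of (word, confidence) pairs from a speech recognizer.
        Returns the matched command string, or None if no first command is assigned."""
        confident_words = [w.lower() for w, c in recognized if c >= min_confidence]
        phrase = " ".join(confident_words)
        for command in POSSIBLE_COMMANDS:
            if command in phrase:
                return command   # assign the audio input as the first command
        return None              # no match: the first command is not assigned

    if __name__ == "__main__":
        print(match_command([("move", 0.95), ("forward", 0.91), ("please", 0.4)]))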
  • the audio input received by the audio input device and transmitted to the processor can be continuous or substantially continuous, though it is understood that not all audio received by the audio input device is intended to be a command. Gating further analysis on a triggering event, described below, is useful in preventing miscellaneous audio from being further analyzed by the processor or by the cloud processor.
  • the audio input may be received continuously or substantially continuously from the audio input device and analyzed, for example in real time, by the processor for purposes of detecting the wake word. Upon identifying the wake word, the processor may regard this as the triggering event.
  • the triggering event can be another suitable electronic input, such as the engaging of a start button (such as start button 411) by the operator or sending of a start signal over the network from a user device to the processor.
  • the operations of method 900 can further comprise receiving, by the processor, a triggering event comprising at least one of a wake word, a start button signal, and a start signal.
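The following sketch illustrates, under hypothetical names, how continuous audio could be gated on a triggering event (wake word, start button signal, or network start signal) so that only audio following the event is analyzed further.

    # Illustrative sketch only: gate continuous audio on a triggering event.
    WAKE_WORD = "tractor"   # hypothetical wake word

    def triggering_event(transcript_words, start_button_pressed, network_start_signal):
        """Return True if any supported triggering event has occurred."""
        return (
            WAKE_WORD in (w.lower() for w in transcript_words)
            or start_button_pressed
            or network_start_signal
        )

    def gate_audio(audio_chunks, events):
        """Yield only the audio received at or after a triggering event."""
        armed = False
        for chunk, event in zip(audio_chunks, events):
            armed = armed or event
            if armed:
                yield chunk

    if __name__ == "__main__":
        chunks = ["noise", "tractor move forward", "stop"]
        events = [triggering_event(c.split(), False, False) for c in chunks]
        print(list(gate_audio(chunks, events)))  # audio from the wake word onward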
  • the method 900 may comprise receiving, by the processor, the first command from the cloud processor (step 910). In various embodiments, the method 900 may comprise controlling, by the processor, the machine control system in response to the first command (step 912). In various embodiments, method 900 may comprise controlling, by the processor, a tool system (shown as tool systems 108, 208, and 308) in response to the first command.
  • the processor outputs a confirmation notification requesting a confirmatory response from the operator to confirm that the first command is correct.
  • the confirmation notification can be displayed on the visual display, played through the speakers, delivered as haptic vibration in a seat or steering wheel, sent as a network message, or delivered via a wireless audio device.
  • the method 900 may comprise receiving, by the processor, a confirmatory response through the audio input device, the visual display, the start button, or a network message.
  • the processor controls the vehicle system or the tool system with the first command in response to the processor receiving an affirmative confirmatory response (i.e., yes).
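As an illustrative sketch of the confirm-before-execute flow described above, with placeholder callables standing in for the notification, response, and control paths (all names are hypothetical):

    # Illustrative sketch only: command the machine control system only after an
    # affirmative confirmatory response.
    def confirm_and_execute(first_command, notify, wait_for_response, control):
        """notify: presents the confirmation notification (display, speaker, haptics).
        wait_for_response: returns the operator's confirmatory response as a string.
        control: applies the command to the machine control system or tool system."""
        notify(f"Execute '{first_command}'? Respond yes to confirm.")
        response = wait_for_response()
        if response.strip().lower() in ("yes", "y", "confirm"):
            control(first_command)
            return True
        notify(f"'{first_command}' was not confirmed and will not be executed.")
        return False

    if __name__ == "__main__":
        confirm_and_execute(
            "move forward 50 feet",
            notify=print,
            wait_for_response=lambda: "yes",
            control=lambda cmd: print("controlling machine control system:", cmd),
        )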
  • the method 900 can further comprise filtering, by the cloud processor, the audio input based on the identified operating mode of the vehicle and/or an environmental audio input from a second audio input device.
  • the cloud processor can filter the audio input to obtain a cleaner sample for processing, to remove background noise from the audio input (which can be identified using the identified operating mode of the vehicle and/or the second audio input), and to convert the audio input to a list of recognized words and confidences for the cloud processor to compare against the first plurality of possible voice commands.
  • the second audio input device can be positioned and configured to only transmit a second audio input received from the drive system and the environment around the vehicle.
  • the cloud processor compares the audio input to the second audio input to filter the audio input by removing overlapping audio data between the audio input and the second audio input. The resulting output is a filtered audio input.
  • the processors 402 and 502 can comprise instructions stored thereon, that in response to execution by the processors 402 and 502, cause the processors 402 and 502 to perform operations comprising any of the steps and embodiments of method 900.
  • the method 900 can be performed by the processors 402 and 502.
  • a method 1000 for operating a vehicle may comprise receiving, by a cloud processor (shown as cloud processor 608), an identified operating mode of a vehicle (shown as off-road vehicles 100, 200, and 300), an audio input and an environmental input via a network (shown as network 602) (step 1002).
  • a processor (shown as processor 502) is configured to receive operation data from the vehicle and a machine control system (shown as machine control system 508), for example, an engine state (whether the engine is off or on), a vehicle speed, a tool state (whether the off-road vehicle’s tool system is in use), and other similar operation data.
  • the processor, in response to receiving the operation data, can determine the identified operating mode.
  • the processor is configured to transmit the identified operating mode of the vehicle, the audio input and the environmental input via the network.
  • the method 1000 may comprise determining, by the cloud processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the cloud database (step 1004).
  • the cloud database comprises the first plurality of possible voice commands in the non-transitory memory of the cloud database.
  • the first plurality of possible voice commands can be a list of all executable commands and functions for the vehicle. If the audio input is determined to match with a command from the first plurality of possible voice commands, then the cloud processor assigns the audio input as the first command. If the audio input is determined to not match with a command from the first plurality of possible voice commands, then the cloud processor does not assign the first command and the cloud processor can communicate the rejection using a default response signal which can be sent to any alert device.
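Purely as an illustration of the rejection path just described, the sketch below assigns the first command on a match and otherwise sends a default response signal to whichever alert devices are present; the device interfaces and the signal format are assumptions.

    # Illustrative sketch only: match or reject with a default response signal.
    def determine_or_reject(audio_text, possible_commands, alert_devices):
        """Return the matched first command, or send a default response and return None."""
        for command in possible_commands:
            if command in audio_text.lower():
                return command
        default_response = {"type": "default_response", "detail": "no matching command"}
        for device in alert_devices:   # e.g. visual display, speakers, haptic device
            device(default_response)
        return None

    if __name__ == "__main__":
        devices = [lambda sig: print("display:", sig), lambda sig: print("speaker:", sig)]
        print(determine_or_reject("please play some music", ["move forward", "stop"], devices))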
  • the method 1000 may comprise validating, by the cloud processor, whether the first command is allowable based on the identified operating mode and the environmental input (step 1006).
  • the step of validating further comprises comparing, by the cloud processor, the first command to the speed of the vehicle. For example, if the first command is turn off the engine, and the speed of the vehicle is 50 miles per hour, it would not be safe to turn off the engine at that speed. Therefore, the first command would not be validated as allowable. If the first command is move forward 50 feet, and the speed of the vehicle is 0 mph, it would be safe to control the machine control system to move the vehicle forward 50 feet. Therefore, the first command would be validated as allowable.
  • the step of validating further comprises comparing, by the cloud processor, the first command to the GPS location of the vehicle. For example, if the first command is move forward ten miles, and the GPS location of the vehicle is two miles from a body of water, it would not be safe to move forward ten miles. Therefore, the first command would not be validated as allowable. If the first command is move forward one mile, and the GPS location of the vehicle shows no physical barriers within one mile, it would be safe to control the machine control system to move the vehicle forward one mile. Therefore, the first command would be validated as allowable.
  • the step of validating further comprises comparing, by the cloud processor, the first command to the terrain output. For example, if the first command is move forward 50 feet, and the terrain output shows a slope that exceeds a set negative grade, for example -10% or more, it would not be safe to move forward 50 feet in heavy off-road vehicles. Therefore, the first command would not be validated as allowable. If the first command is move forward 50 feet, and the terrain output shows no physical limitations and no slope exceeding the set negative grade for the next 50 feet, it would be safe to control the machine control system to move the vehicle forward 50 feet. Therefore, the first command would be validated as allowable.
  • the off-road vehicle can comprise an Inertial Measurement Unit (IMU), in communication with the processor, configured to transmit an attitude of the off-road vehicle to the processor.
  • the attitude of the off-road vehicle can also be compared to the first command.
  • the off-road vehicle can comprise an angular velocity sensor, such as a gyroscopic sensor, for the detection of angular velocity.
  • the step of validating can comprise comparing the first command to each of the environmental sensor inputs (GPS location, terrain output, and speed of the vehicle) to validate the first command. In various embodiments, if the first command is not allowable then the processor outputs an alert notification to one or more alert devices. In various embodiments, the alert notification can be an alert to the operator that the first command is not allowable.
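A minimal validation sketch, assuming numeric thresholds like those in the examples above (engine-off only at rest, no travel that reaches a mapped hazard, no descent steeper than the set negative grade of -10%); the field names, thresholds, and command phrasing are hypothetical. The alert callable stands in for any of the alert devices described herein.

    # Illustrative sketch only: validate the first command against environmental inputs.
    from dataclasses import dataclass

    @dataclass
    class EnvironmentalInput:
        speed_mph: float
        distance_to_hazard_ft: float   # e.g. derived from the GPS location and map data
        grade_percent: float           # terrain output; negative = downhill

    def validate_command(first_command, env, alert, max_grade=-10.0):
        """Return True if the first command is allowable, else alert and return False."""
        if first_command == "turn off engine" and env.speed_mph > 0:
            alert("Not allowable: cannot turn off the engine while moving.")
            return False
        if first_command.startswith("move forward"):
            requested_ft = float(first_command.split()[2])   # e.g. "move forward 50 feet"
            if requested_ft >= env.distance_to_hazard_ft:
                alert("Not allowable: requested travel reaches a mapped hazard.")
                return False
            if env.grade_percent <= max_grade:
                alert("Not allowable: terrain grade exceeds the set negative grade.")
                return False
        return True

    if __name__ == "__main__":
        env = EnvironmentalInput(speed_mph=0.0, distance_to_hazard_ft=200.0, grade_percent=-4.0)
        ok = validate_command("move forward 50 feet", env, alert=print)
        print("validated as allowable:", ok)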
  • the method 1000 may comprise transmitting, by the cloud processor, the first command to a processor (shown as processor 402 and 502) (step 1008).
  • the method 1000 may comprise controlling, by the processor, the machine control system in response to the first command (step 1010).
  • method 1000 may comprise controlling, by the processor, a tool system (shown as tool systems 108, 208, and 308) in response to the first command.
  • the processor outputs a confirmation notification requesting a confirmatory response from the operator to confirm that the first command is correct.
  • the confirmation notification can be displayed by the visual display or played through the speakers.
  • the method 1000 may comprise receiving, by the processor, a confirmatory response through the audio input device or the visual display.
  • the processor controls the vehicle system or the tool system with the first command in response to the processor receiving an affirmative confirmatory response (i.e., yes).
  • the method 1000 can further comprise filtering, by the cloud processor, the audio input based on the identified operating mode of the vehicle and/or an environmental audio input from a second audio input device.
  • the cloud processor can filter the audio input to obtain a cleaner sample for processing, to remove background noise from the audio input (which can be identified using the identified operating mode of the vehicle and/or the second audio input), and to convert the audio input to a list of recognized words and confidences for the cloud processor to compare against the first plurality of possible voice commands.
  • the second audio input device can be positioned and configured to only transmit a second audio input received from the drive system and the environment around the vehicle.
  • the cloud processor compares the audio input to the second audio input to filter the audio input by removing overlapping audio data between the audio input and the second audio input. The resulting output is a filtered audio input.
  • the cloud processor 608 can comprise instructions stored thereon, that in response to execution by the cloud processor 608, cause cloud processor 608 to perform operations comprising any of the steps and embodiments of method 1000. The method 1000 can be performed by the cloud processor 608.
  • the control system 700 can be a vehicle remote control system used with remote operators of off-road vehicles such as off-road vehicles 100, 200, and 300. Control system 700 can be used to control one off-road vehicle at a time or entire fleets of off-road vehicles.
  • the control system 700 comprises the processor 502.
  • the processor 502 is in electronic communication with a network 702.
  • a remote system 704 comprising a remote database 706 and a remote processor 708 is also in electronic communication with the network 702.
  • the remote database 706 may store instructions usable by the remote processor 708 to perform operations. Any appropriate computer-readable type/configuration may be utilized as the remote database 706, any appropriate data storage architecture may be utilized by the remote database 706, or both.
  • the remote processor 708 may include one or more logic devices such as one or more of a central processing unit (CPU), a graphical processing unit (GPU), an accelerated processing unit (APU), a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or the like.
  • the remote processor 708 may further include any non-transitory memory known in the art and store instructions usable by a logic device to perform operations. Any appropriate computer-readable type/configuration may be utilized as a part of the logic device, any appropriate data storage architecture may be utilized by the logic device, or both.
  • control system 700 further comprises the environmental input device 504. In various embodiments, there can be more than one environmental input device 504.
  • the environmental input device 504 is in communication with processor 502 and is configured to output an environmental input over the network 702 to the remote system 704.
  • the control system 700 further comprises a machine control system 508.
  • the remote system 704 further comprises an audio input device 710 configured to receive an audio input from the remote operator.
  • the remote system 704 further comprises an alert device, such as a visual display 712 and/or a speaker 714, configured to output an alert notification received from the remote processor 708.
  • the visual display 712 and the speaker 714 are in communication with the remote processor 708.
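As a non-authoritative sketch of how the remote system could package a validated first command for transmission over the network to the on-vehicle processor; the message fields and the use of JSON are assumptions rather than specification text.

    # Illustrative sketch only: serialize and recover a command message on the network link.
    import json
    import time

    def build_command_message(vehicle_id, first_command, validated):
        """Remote side: serialize a command message for the network link to the vehicle."""
        return json.dumps({
            "vehicle_id": vehicle_id,
            "command": first_command,
            "validated": validated,
            "timestamp": time.time(),
        })

    def parse_command_message(raw):
        """Vehicle side: recover the command only if it was validated remotely."""
        message = json.loads(raw)
        return message["command"] if message.get("validated") else None

    if __name__ == "__main__":
        wire = build_command_message("loader-01", "raise bucket", validated=True)
        print(parse_command_message(wire))  # raise bucket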
  • a method 1100 for remotely operating a vehicle may comprise identifying, by a processor (shown as processors 402 and 502) an operation mode of a vehicle, resulting in an identified operating mode (step 1102).
  • the processor is configured to receive operation data from the vehicle and a machine control system (shown as machine control system 508), for example, an engine state (whether the engine is off or on), a vehicle speed, a tool state (whether the off-road vehicle’s tool system is in use), and other similar operation data.
  • the processor, in response to receiving the operation data, can determine the identified operating mode.
  • the method 1100 may comprise receiving, by the processor, an environmental input from an environmental sensor input device (shown as environmental input device 504) in communication with the processor (step 1104). In various embodiments, the method 1100 may comprise transmitting, by the processor, the identified operating mode of the vehicle and the environmental input to a remote processor (shown as remote processor 708) via a network (shown as network 702) (step 1106).
  • the remote processor is in communication with a remote database (shown as remote database 706).
  • the method 1100 may comprise receiving, by the processor, a first command from the remote processor (step 1108). In various embodiments, the method 1100 may comprise controlling, by the processor, the machine control system in response to the first command (step 1110). In various embodiments, method 1100 may comprise controlling, by the processor, a tool system (shown as tool systems 108, 208, and 308) in response to the first command.
  • the remote processor outputs a confirmation notification requesting a confirmatory response from the operator to confirm that the first command is correct.
  • the confirmation notification can be displayed by a visual display (shown as visual display 712) or played through a speaker (shown as speaker 714) of the remote system 704.
  • the method 1100 may comprise receiving, by the remote processor, a confirmatory response through an audio input device (shown as audio input device 710) or the visual display.
  • the processor controls the vehicle system or the tool system with the first command in response to the remote processor receiving an affirmative confirmatory response (i.e., yes).
  • the processors 402 and 502 can comprise instructions stored thereon, that in response to execution by the processors 402 and 502, cause the processors 402 and 502 to perform operations comprising any of the steps and embodiments of method 1100.
  • the method 1100 can be performed by the processors 402 and 502.
  • a method 1200 for remotely operating a vehicle may comprise receiving, by a remote processor (shown as remote processor 708), an identified operating mode of a vehicle (shown as vehicles 100, 200, and 300), an audio input and an environmental input via a network (shown as network 702) (step 1202).
  • a processor (shown as processor 502) is configured to receive operation data from the vehicle and a machine control system (shown as machine control system 508), for example, an engine state (whether the engine is off or on), a vehicle speed, a tool state (whether the off-road vehicle’s tool system is in use), and other similar operation data.
  • the processor, in response to receiving the operation data, can determine the identified operating mode.
  • the processor is configured to transmit the identified operating mode of the vehicle, the audio input and the environmental input via the network.
  • the remote system comprises an audio input device (shown as audio input device 710) configured to receive an audio input from the remote operator.
  • the method 1200 may comprise determining, by the remote processor, whether the audio input is a first command of a first plurality of possible voice commands stored in the remote database (shown as remote database 706) (step 1204).
  • the remote database comprises the first plurality of possible voice commands in the non-transitory memory of the remote database.
  • the first plurality of possible voice commands can be a list of all executable commands and functions for the vehicle.
  • if the audio input is determined to match with a command from the first plurality of possible voice commands, then the remote processor assigns the audio input as the first command. If the audio input is determined to not match with a command from the first plurality of possible voice commands, then the remote processor does not assign the first command and the remote processor can communicate the rejection using a default response signal which can be sent to any alert device.
  • the method 1200 may comprise validating, by the remote processor, whether the first command is allowable based on the identified operating mode and the environmental input (step 1206).
  • the step of validating further comprises comparing, by the remote processor, the first command to the speed of the vehicle. For example, if the first command is turn off the engine, and the speed of the vehicle is 50 miles per hour, it would not be safe to turn off the engine at that speed. Therefore, the first command would not be validated as allowable. If the first command is move forward 50 feet, and the speed of the vehicle is 0 mph, it would be safe to control the machine control system to move the vehicle forward 50 feet. Therefore, the first command would be validated as allowable.
  • the step of validating further comprises comparing, by the remote processor, the first command to the GPS location of the vehicle. For example, if the first command is move forward ten miles, and the GPS location of the vehicle is two miles from a body of water, it would not be safe to move forward ten miles. Therefore, the first command would not be validated as allowable. If the first command is move forward one mile, and the GPS location of the vehicle shows no physical barriers within one mile, it would be safe to control the machine control system to move the vehicle forward one mile. Therefore, the first command would be validated as allowable.
  • the step of validating further comprises comparing, by the remote processor, the first command to the terrain output. For example, if the first command is move forward 50 feet, and the terrain output shows a slope that exceeds a set negative grade, for example -10% or more, it would not be safe to move forward 50 feet in heavy off-road vehicles. Therefore, the first command would not be validated as allowable. If the first command is move forward 50 feet, and the terrain output shows no physical limitations and no slope exceeding the set negative grade for the next 50 feet, it would be safe to control the machine control system to move the vehicle forward 50 feet. Therefore, the first command would be validated as allowable.
  • the off-road vehicle can comprise an Inertial Measurement Unit (IMU), in communication with the processor, configured to transmit an attitude of the off-road vehicle to the processor.
  • the attitude of the off-road vehicle can also be compared to the first command.
  • the off-road vehicle can comprise an angular velocity sensor, such as a gyroscopic sensor, for the detection of angular velocity.
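For illustration, an attitude and angular-velocity check such as the sketch below could be one of the comparisons made during validation; the roll, pitch, and yaw-rate limits and field names are assumptions (the specification only states that the attitude from the IMU can be compared to the first command).

    # Illustrative sketch only: reject motion commands when the attitude or angular
    # velocity reported by the IMU / gyroscopic sensor is outside assumed limits.
    from dataclasses import dataclass

    @dataclass
    class Attitude:
        roll_deg: float
        pitch_deg: float
        yaw_rate_dps: float   # from the angular velocity (gyroscopic) sensor

    def attitude_allows_motion(att, max_roll=15.0, max_pitch=15.0, max_yaw_rate=30.0):
        """Return True only if the vehicle is level enough and not rotating too fast."""
        return (
            abs(att.roll_deg) <= max_roll
            and abs(att.pitch_deg) <= max_pitch
            and abs(att.yaw_rate_dps) <= max_yaw_rate
        )

    if __name__ == "__main__":
        print(attitude_allows_motion(Attitude(roll_deg=4.0, pitch_deg=-18.0, yaw_rate_dps=2.0)))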
  • the step of validating can comprise comparing the first command to each of the environmental sensor inputs (GPS location, terrain output, and speed of the vehicle) to validate the first command. In various embodiments, if the first command is not allowable then the processor outputs an alert notification to one or more alert devices. In various embodiments, the alert notification can be an alert to the operator that the first command is not allowable.
  • the method 1200 may comprise transmitting, by the remote processor, the first command to a processor (shown as processor 402 and 502) (step 1208).
  • the method 1200 may comprise controlling, by the processor, the machine control system in response to the first command.
  • method 1200 may comprise controlling, by the processor, a tool system (shown as tool systems 108, 208, and 308) in response to the first command.
  • the remote processor outputs a confirmation notification requesting a confirmatory response from the operator to confirm that the first command is correct.
  • the confirmation notification can be displayed by a visual display (shown as visual display 712) or played through a speaker (shown as speaker 714) of the remote system 704.
  • the method 1200 may comprise receiving, by the remote processor, a confirmatory response through the audio input device or the visual display.
  • the processor controls the vehicle system or the tool system with the first command in response to the remote processor receiving an affirmative confirmatory response (i.e., yes).
  • the remote processor 708 can comprise instructions stored thereon, that in response to execution by the remote processor 708, cause remote processor 708 to perform operations comprising any of the steps and embodiments of method 1200.
  • the method 1200 can be performed by the remote processor 708.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computational Linguistics (AREA)
  • Mechanical Engineering (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention relates to methods and systems for controlling vehicles by voice commands. Voice commands are received locally at the vehicle and/or remotely at a remote station. An example method of controlling a vehicle by voice command comprises the steps of: identifying an operating mode of the vehicle; receiving an audio input from an audio input device; receiving an environmental input from an environmental sensor input device; comparing the audio input to a first plurality of possible voice commands to determine a first command; validating whether the first command is allowable based on the identified operating mode and the environmental input; and controlling a machine control system in response to the first command when the first command is allowable. A control system may comprise a processor and an audio input device disposed on a vehicle and/or a remote station, and is configured to carry out the described methods.
PCT/US2023/037198 2022-11-16 2023-11-13 Systèmes et procédés de commande de véhicules par commandes vocales WO2024107403A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263425905P 2022-11-16 2022-11-16
US63/425,905 2022-11-16

Publications (1)

Publication Number Publication Date
WO2024107403A1 true WO2024107403A1 (fr) 2024-05-23

Family

ID=91085234

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/037198 WO2024107403A1 (fr) 2022-11-16 2023-11-13 Systèmes et procédés de commande de véhicules par commandes vocales

Country Status (1)

Country Link
WO (1) WO2024107403A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140136187A1 (en) * 2012-11-15 2014-05-15 Sri International Vehicle personal assistant
US20200278679A1 (en) * 2017-01-06 2020-09-03 Aurora Flight Sciences Corporation Collision-Avoidance System and Method for Unmanned Aircraft
US20200372908A1 (en) * 2019-05-22 2020-11-26 Ford Global Technologies, Llc Detecting and isolating competing speech for voice controlled systems
US20200398842A1 (en) * 2018-02-15 2020-12-24 Meidensha Corporation Vehicle Speed Control Device And Vehicle Speed Control Method
US20210081683A1 (en) * 2009-02-27 2021-03-18 Magna Electronics Inc. Vehicular control system
US20220101847A1 (en) * 2020-09-28 2022-03-31 Hill-Rom Services, Inc. Voice control in a healthcare facility


Similar Documents

Publication Publication Date Title
US11941976B2 (en) System and method for sharing data collected from the street sensors
US10812593B2 (en) Cloud integrated vehicle platform
US20180315317A1 (en) Method for determining whether object is in target area, and parking management device
US11060882B2 (en) Travel data collection and publication
US10311312B2 (en) System and method for vehicle occlusion detection
US11664043B2 (en) Real-time verbal harassment detection system
EP3901764A1 (fr) Messagerie de mise à jour pour dispositifs informatiques de véhicule
US8717193B2 (en) Method and system for providing traffic alerts
US20200166929A1 (en) Detection and Communication of Safety Events
US20220107637A1 (en) Backup control systems and methods for autonomous vehicles
US20200207306A1 (en) Controlling vehicle operations based on driver information
US20210201893A1 (en) Pattern-based adaptation model for detecting contact information requests in a vehicle
US10466691B2 (en) Coordinated control of self-driving vehicles under emergency situations
US10741076B2 (en) Cognitively filtered and recipient-actualized vehicle horn activation
US11180115B2 (en) Controlling vehicle operations based on vehicle information
US11914372B2 (en) Advanced flight processing system and/or method
WO2024107403A1 (fr) Systèmes et procédés de commande de véhicules par commandes vocales
El-Tawab et al. Enhanced interface for autonomously driven golf cart in a networked controlled environment
US10768284B1 (en) Systems and methods for using audio cue for alignment
US20170103751A1 (en) Method and system for using combined voice and customized instructions to trigger vehicle reports
CN113879313A (zh) Driver fatigue detection method and device
US11485374B2 (en) System and method for evacuation of a vehicle in condition
US20240255950A1 (en) Advanced flight processing system and/or method
CN116588015B (zh) Vehicle control method, vehicle control system, and storage medium
CN115171392A (zh) Method for providing early-warning information to a vehicle and vehicle-mounted terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23892234

Country of ref document: EP

Kind code of ref document: A1