CN114527923A - In-vehicle information display method and device and electronic equipment - Google Patents

In-vehicle information display method and device and electronic equipment Download PDF

Info

Publication number
CN114527923A
Authority
CN
China
Prior art keywords
information
vehicle
display
navigation image
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210012814.1A
Other languages
Chinese (zh)
Inventor
李君
沈刚
赵静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Evergrande New Energy Automobile Investment Holding Group Co Ltd
Original Assignee
Evergrande New Energy Automobile Investment Holding Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Evergrande New Energy Automobile Investment Holding Group Co Ltd filed Critical Evergrande New Energy Automobile Investment Holding Group Co Ltd
Priority to CN202210012814.1A priority Critical patent/CN114527923A/en
Publication of CN114527923A publication Critical patent/CN114527923A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 Road transport of goods or passengers
    • Y02T 10/80 Technologies aiming to reduce greenhouse gasses emissions common to all road transportation technologies
    • Y02T 10/84 Data processing systems or methods, management, administration

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiments of the invention provide an in-vehicle information display method, an in-vehicle information display device and an electronic device, wherein the method comprises the following steps: displaying a navigation image on a front windshield of the vehicle through an augmented reality head-up display; obtaining operation information of a user through a gesture recognition sensor; and operating the navigation image according to the operation information. By exploiting the large image size and high definition of the AR-HUD, the navigation image displayed on the windshield conveys the navigation information to the user more clearly and accurately, and the image can be operated by gesture at any time, which improves driving safety.

Description

In-vehicle information display method and device and electronic equipment
Technical Field
The invention relates to the technical field of vehicles, and in particular to an in-vehicle information display method and device and an electronic device.
Background
While driving, a driver must look down at the instrument panel every so often to check information such as vehicle speed and fuel consumption. Each such glance takes roughly 1-3 seconds, and switching between a distant target and the nearby instrument panel requires the eyes to refocus; frequent refocusing slows visual response and causes visual fatigue. In practice, a large number of traffic accidents are caused by distraction and driver fatigue.
A head-up display (HUD) projects driving information onto a separate piece of glass or onto the windshield, so that the driver can read instrument information without lowering his or her head while driving, greatly reducing interruptions to attention and loss of situational awareness.
However, when navigation information is displayed, only a small amount of simple information is shown, so the navigation information cannot be presented accurately.
Disclosure of Invention
The embodiments of the invention aim to provide an in-vehicle information display method, an in-vehicle information display device and an electronic device, so as to solve the problem that navigation information cannot be displayed accurately because only a small amount of simple information is shown when navigation information is displayed.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides an in-vehicle information display method, including:
displaying a navigation image on a front windshield of the vehicle through an augmented reality head-up display;
obtaining operation information of a user through a gesture recognition sensor;
and operating the navigation image according to the operation information.
Further, when the vehicle is in a parking state, the method further comprises:
and obtaining the operation information of the user through a touch screen arranged on the front windshield.
Further, the operation information includes selecting a target building in the map information, and the method further includes:
and displaying the relevant information of the target building in the navigation image.
Further, the method further comprises:
acquiring lane information of vehicle driving;
determining steering information of the vehicle according to the lane information and the navigation image;
and displaying the steering information through the augmented reality head-up display, and controlling a turn signal of the vehicle according to the steering information.
Further, when the vehicle is in a parking state, the method further comprises:
displaying image information of a video through the augmented reality head-up display;
operating the video according to the operation information;
wherein the video comprises at least one of:
an audio-video file;
a video call.
In a second aspect, an embodiment of the present invention provides an in-vehicle information display device, including:
the display module is used for displaying a navigation image on a front windshield of the vehicle through the augmented reality head-up display;
the interaction module is used for obtaining the operation information of the user through the gesture recognition sensor;
and the operation module is used for operating the navigation image according to the operation information.
Further, the interaction module is also used for obtaining operation information of a user through a touch screen arranged on the front windshield when the vehicle is in a parking state.
Further, the operation information includes a target building selected from the map information, and the operation module is further configured to display information related to the target building in the navigation image.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus; the processor, the communication interface and the memory communicate with one another through the bus; the memory is used for storing a computer program; and the processor is configured to execute the program stored in the memory to implement the in-vehicle information display method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps of the in-vehicle information display method according to the first aspect are implemented.
According to the technical solution provided by the embodiments of the invention, a navigation image is displayed on the front windshield of the vehicle through an augmented reality head-up display; operation information of a user is obtained through a gesture recognition sensor; and the navigation image is operated according to the operation information. By exploiting the large image size and high definition of the AR-HUD, the navigation image displayed on the windshield conveys the navigation information to the user more clearly and accurately, and the image can be operated by gesture at any time, which improves driving safety.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that other drawings can be derived from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of an in-vehicle information display method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an in-vehicle information display system according to an embodiment of the present invention;
fig. 3 is another schematic structural diagram of an in-vehicle information display system according to an embodiment of the present invention;
fig. 4 is another schematic structural diagram of an in-vehicle information display system according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an in-vehicle information display device according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides an in-vehicle information display method and device and electronic equipment.
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings. It is obvious that the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art on the basis of the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
As shown in fig. 1 and fig. 2, an execution subject of the method may be a vehicle control unit, and specifically, may be a cabin host 200 of a vehicle. The method may specifically comprise the steps of:
in step S110, the cabin host 200 displays a navigation image on the front windshield 300 of the vehicle via an Augmented Reality Head Up Display (AR-HUD) 210.
The in-vehicle information display system shown in fig. 2 mainly includes a cabin host 200, an AR-HUD 210, a map box 220, and a gesture recognition sensor 230; the cabin host 200 is connected to the AR-HUD 210, the map box 220, and the gesture recognition sensor 230, respectively. The map box 220 provides high-precision map information, and accurate positioning information can be obtained through a smart navigation antenna 221, such as a Global Navigation Satellite System (GNSS) antenna, connected to the map box 220. The cabin host 200 generates navigation information from the map information and positioning information provided by the map box 220, combined with the user's requirements and navigation settings, and transmits the navigation image to the AR-HUD 210, which projects it onto the front windshield 300 for display. High-speed Ethernet may be used for information transfer between the cabin host 200 and the map box 220 and the AR-HUD 210.
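As a purely illustrative, non-limiting sketch of this data flow (it is not part of the original disclosure, and all class and method names below are assumptions introduced only for the example), the cabin host could combine map and positioning data into a navigation image and push it to the AR-HUD roughly as follows:

    from dataclasses import dataclass

    @dataclass
    class Position:
        lat: float
        lon: float

    class MapBox:
        """Hypothetical stand-in for map box 220: high-precision map plus GNSS positioning."""
        def current_position(self) -> Position:
            return Position(31.2304, 121.4737)      # placeholder fix from GNSS antenna 221
        def map_tile(self, pos: Position) -> dict:
            return {"center": (pos.lat, pos.lon), "roads": [], "buildings": []}

    class ArHud:
        """Hypothetical stand-in for AR-HUD 210, reached over the in-vehicle Ethernet."""
        def project(self, image: dict) -> None:
            print(f"projecting navigation image onto windshield: {image['route']}")

    class CabinHost:
        """Sketch of cabin host 200 assembling the navigation image (step S110)."""
        def __init__(self, map_box: MapBox, hud: ArHud):
            self.map_box, self.hud = map_box, hud

        def refresh_navigation(self, destination: str) -> None:
            pos = self.map_box.current_position()
            image = {"map": self.map_box.map_tile(pos),
                     "route": f"{pos.lat:.4f},{pos.lon:.4f} -> {destination}"}
            self.hud.project(image)                  # projection onto front windshield 300

    CabinHost(MapBox(), ArHud()).refresh_navigation("city center")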
Through its specific optical imaging principle combined with a suitable light source, the AR-HUD 210 can greatly increase imaging brightness, so that the driver can see the image clearly even under bright ambient light. At the same time, the imaging size of the AR-HUD 210 can exceed 30 inches, so the displayed information is richer and more colorful, and a 3D projection effect can further be achieved.
The AR-HUD 210 may include a HUD host 211 and a HUD optical projection device 212. The HUD host 211 is connected to the cabin host 200, acquires the navigation image from the cabin host 200, and projects it onto the front windshield 300 for display through the HUD optical projection device 212.
Voice prompts of the navigation information can be played through a speaker 240 connected to the cabin host 200.
Step S120, obtaining operation information of the user through the gesture recognition sensor 230.
When the vehicle is in a driving state, the gesture recognition sensor 230 captures the user's gesture operations, recognizes the user's operation information from them, and sends the operation information to the cabin host 200.
And S130, operating the navigation image according to the operation information.
The cabin host 200 operates on the navigation information according to the recognized operation information and adjusts the navigation image accordingly. The operation information may include: cancelling the current navigation, selecting another navigation route, enlarging the navigation image, and so on.
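The mapping from recognized operations to actions on the navigation image can be pictured as a simple dispatch table. The sketch below is an illustration only; the operation names and the NavigationView model are assumptions and do not appear in the original text:

    class NavigationView:
        """Minimal model of the navigation image state held by the cabin host."""
        def __init__(self):
            self.active = True
            self.zoom = 1.0
            self.route = "route A"

        def cancel(self):
            self.active = False

        def select_route(self, route: str):
            self.route = route

        def zoom_in(self):
            self.zoom *= 1.2

    def handle_operation(view: NavigationView, operation: str) -> None:
        # Step S130: operate the navigation image according to the operation information.
        actions = {
            "cancel_navigation": view.cancel,
            "next_route": lambda: view.select_route("route B"),
            "enlarge": view.zoom_in,
        }
        action = actions.get(operation)
        if action is not None:
            action()

    view = NavigationView()
    handle_operation(view, "enlarge")   # e.g. a spread gesture recognized by sensor 230
    print(view.zoom)                    # 1.2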
According to the technical solution provided by the embodiments of the invention, a navigation image is displayed on the front windshield of the vehicle through an augmented reality head-up display; operation information of a user is obtained through a gesture recognition sensor; and the navigation image is operated according to the operation information. By exploiting the large image size and high definition of the AR-HUD, the navigation image displayed on the windshield conveys the navigation information to the user more clearly and accurately, and the image can be operated by gesture at any time, which improves driving safety.
Based on the above embodiment, when the vehicle is in a parking state, the method may further include: obtaining the operation information of the user through a touch screen arranged on the front windshield.
As shown in fig. 3, the in-vehicle information display system further includes a touch screen 250 arranged on the front windshield 300. The touch screen 250 is connected to the cabin host 200 and captures the user's touch operations in order to obtain the user's operation information. By touching the screen 250 within the display area of the navigation image on the front windshield 300, the user can operate the navigation image directly.
It should be appreciated that, for driving safety, the touch screen 250 may be set to operate only when the vehicle is in the parking state and to be disabled when the vehicle is in the driving state. The gesture recognition sensor 230 may be set to operate only when the vehicle is in the driving state, or to operate in both the driving state and the parking state.
In addition, the cabin host 200 can also interact with the user by voice through a voice acquisition device 270, such as a microphone, and obtain the user's operation information through voice recognition; the voice acquisition device 270 can be set to operate in both the driving state and the parking state.
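One way to express these enabling rules is a small state-gated check, as in the hedged sketch below (the function names, and the choice to allow gestures while parked, are illustrative assumptions rather than part of the disclosure):

    from enum import Enum

    class VehicleState(Enum):
        DRIVING = "driving"
        PARKED = "parked"

    def touch_screen_enabled(state: VehicleState) -> bool:
        # Touch screen 250 is usable only while parked, for driving safety.
        return state is VehicleState.PARKED

    def gesture_sensor_enabled(state: VehicleState, allow_when_parked: bool = True) -> bool:
        # Gesture sensor 230 works while driving, and optionally while parked as well.
        return state is VehicleState.DRIVING or (allow_when_parked and state is VehicleState.PARKED)

    def voice_input_enabled(state: VehicleState) -> bool:
        # Voice acquisition device 270 is available in both driving and parking states.
        return True

    assert touch_screen_enabled(VehicleState.PARKED) and not touch_screen_enabled(VehicleState.DRIVING)
    assert gesture_sensor_enabled(VehicleState.DRIVING) and voice_input_enabled(VehicleState.DRIVING)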
Further, the operation information obtained through the gesture recognition sensor 230 and the touch screen 250 may be various, and in one embodiment, the operation information may include selecting a target building in the map information, and the method may further include:
and displaying the relevant information of the target building in the navigation image.
The user may select a target building in the navigation image through gesture operation, and specifically, the gesture recognition sensor 230 may determine the position and gesture motion of the hand of the user, and then determine the target building selected by the user in combination with the navigation image.
Alternatively, the user may also directly click on the target building in the navigation image through the touch screen 250.
The cabin host 200 acquires the related information of the target building, including its name, address, purpose and the like, from the map box 220, and displays the related information in the navigation image.
In one embodiment, the user may select the target building as a navigation destination or a transit place, etc. for re-planning the path of the navigation information.
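A hedged sketch of this lookup and re-planning step is given below; the query interface and route representation are assumptions made only for illustration:

    from typing import Optional

    def building_info(map_data: dict, building_id: str) -> Optional[dict]:
        # Look up name, address and purpose of the selected building from the map box 220 data.
        return map_data.get(building_id)

    def replan_route(current_route: list, building_id: str, as_destination: bool) -> list:
        # Use the selected building as the new destination, or insert it as a transit place.
        if as_destination:
            return current_route[:1] + [building_id]
        return current_route[:-1] + [building_id, current_route[-1]]

    demo_map = {"B42": {"name": "Example Tower", "address": "1 Demo Rd", "purpose": "office"}}
    print(building_info(demo_map, "B42"))
    print(replan_route(["start", "old_destination"], "B42", as_destination=False))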
According to the technical scheme provided by the embodiments of the invention, when the vehicle is in a parking state, the operation information of the user is obtained through the touch screen arranged on the front windshield. Because the touch screen allows the navigation image on the front windshield to be operated directly, the accuracy of the navigation information and driving safety are improved.
Based on the above embodiment, further, the method may further include:
the lane information of the vehicle running is obtained, and the lane information can be obtained through high-precision map information and positioning information provided by the map box 220, and can also be obtained by combining a camera arranged in front of the vehicle, a radar and other devices.
And determining the steering information of the vehicle according to the lane information and the navigation image, wherein the steering information comprises information that the vehicle needs to go straight, turn left or turn right, or turn around and the like.
And displaying the steering information through the augmented reality head-up display, and controlling a steering lamp of the vehicle according to the steering information. For example, when the vehicle needs to turn around or turn around, the cockpit host 200 controls the turn lights in front and at the back of the vehicle to automatically flash, and controls the arrow signs of turning around or turning around and the like to be displayed on the navigation image projected by the AR-HUD210, so that the user does not need to manually dial a steering handle, and after the turning around or turning around is finished, the turn lights in front and at the back of the vehicle and the arrow signs on the navigation image are controlled to automatically turn off.
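As a rough illustration of this logic (not the patented implementation; the helper names and interfaces are hypothetical), the cabin host could derive the manoeuvre from the current lane and the next route segment and then drive the turn signals and the HUD arrow together:

    def derive_steering(current_lane: str, next_segment_direction: str) -> str:
        # Compare the lane the vehicle occupies with the direction required by the route.
        if next_segment_direction == "straight":
            return "straight"
        return next_segment_direction        # e.g. "left", "right", "u_turn"

    def apply_steering(manoeuvre: str, hud, turn_signals) -> None:
        if manoeuvre in ("left", "right", "u_turn"):
            turn_signals.flash(manoeuvre)    # front and rear indicators flash automatically
            hud.show_arrow(manoeuvre)        # arrow sign overlaid on the navigation image
        else:
            turn_signals.off()               # manoeuvre finished or going straight
            hud.hide_arrow()

    class _Stub:
        """Trivial stand-in that just prints whatever call it receives."""
        def __getattr__(self, name):
            return lambda *args: print(name, args)

    apply_steering(derive_steering("lane_2", "left"), hud=_Stub(), turn_signals=_Stub())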
In one embodiment, the user's level of fatigue can be monitored while the vehicle is in the driving state; if fatigued driving is detected, the user can be alerted by changing the color of the display interface projected by the AR-HUD 210 on the front windshield, which improves the safety factor.
According to the technical scheme provided by the embodiments of the invention, lane information for the travelling vehicle is obtained; steering information of the vehicle is determined according to the lane information and the navigation image; and the steering information is displayed through the augmented reality head-up display while a turn signal of the vehicle is controlled according to the steering information. The embodiments of the invention thereby improve driving convenience and driving safety.
Based on the above embodiment, further, as shown in fig. 4, when the vehicle is in a parking state, the method may further include:
displaying image information of a video through the augmented reality head-up display;
operating the video according to the operation information;
wherein the video comprises at least one of:
an audio-video file;
a video call.
As shown in fig. 4, the in-vehicle information display system may further include a telematics box (T-BOX) 260 and a video capture device 280, such as a camera, both connected to the cabin host 200.
When the vehicle is in a parking state, the cabin host 200 can transmit the image information of an audio-video file to the AR-HUD 210 according to the user's requirements, and the image information is projected onto the front windshield for playback. The audio-video file may include movies, videos, music, photos and the like, and may be stored in the cabin host 200 or provided by another storage medium or a cloud server connected to the cabin host 200.
While the audio-video file is playing, the user's operation information is recognized through the touch screen 250 or the gesture recognition sensor 230, and operations such as fast-forward, rewind, zoom in, zoom out and pause are performed on the file being played; the sound signal of the audio-video file is played through the speaker 240 connected to the cabin host 200.
When the vehicle is in a parking state, the user can also connect to an external mobile network through the T-BOX 260 to hold a two-way video call between the inside and the outside of the vehicle. During the video call, the AR-HUD 210 projects the image information of the call onto the front windshield, and the sound signal of the call is played through the speaker 240. Meanwhile, the user's voice signal and video image are captured through the voice acquisition device 270 and the video capture device 280. The user may operate the video call through the touch screen 250 or the gesture recognition sensor 230; for example, the user may place a Bluetooth video call via the touch screen 250.
It should be understood that when the vehicle is parked, low-voltage power can be converted to supply the AR-HUD 210, which mitigates battery drain while audio-video files are viewed or video calls are made.
The display of video image information by the AR-HUD 210 and the functions of the touch screen 250 described above are all disabled when the vehicle is in a driving state. However, the user can still use the gesture recognition sensor 230 or voice interaction to answer a voice call, switch music, pause playback, and so on.
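A minimal sketch of this state-dependent gating of the media functions is shown below; the action names are assumptions introduced only to illustrate the idea:

    def allowed_media_actions(vehicle_state: str) -> set:
        # While parked, full playback and video-call control via touch or gesture is available.
        # While driving, HUD video and the touch screen are disabled, but a reduced set of
        # gesture or voice actions remains usable.
        if vehicle_state == "parked":
            return {"play_video", "video_call", "pause", "seek", "zoom", "answer_call", "switch_music"}
        return {"answer_call", "switch_music", "pause"}

    print(allowed_media_actions("driving"))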
According to the technical scheme provided by the embodiments of the invention, image information of a video is displayed through the augmented reality head-up display, and the video is operated according to the operation information. The embodiments of the invention thereby improve the entertainment and communication functions of the vehicle in the parking state.
On the basis of the same technical concept, an embodiment of the present invention further provides an in-vehicle information display device corresponding to the in-vehicle information display method provided in the foregoing embodiments. Fig. 5 is a schematic diagram of the modules of the in-vehicle information display device provided in the embodiment of the present invention. The device is configured to execute the in-vehicle information display method described in fig. 1 to 4 and, as shown in fig. 5, includes: a display module 501, an interaction module 502 and an operation module 503.
The display module 501 is used for displaying a navigation image on a front windshield of a vehicle through an augmented reality head-up display; the interaction module 502 is configured to obtain operation information of a user through a gesture recognition sensor; the operation module 503 is configured to operate the navigation image according to the operation information.
According to the technical solution provided by the embodiments of the invention, a navigation image is displayed on the front windshield of the vehicle through an augmented reality head-up display; operation information of a user is obtained through a gesture recognition sensor; and the navigation image is operated according to the operation information. By exploiting the large image size and high definition of the AR-HUD, the navigation image displayed on the windshield conveys the navigation information to the user more clearly and accurately, and the image can be operated by gesture at any time, which improves driving safety.
Based on the above embodiment, further, the interaction module is further configured to obtain operation information of a user through a touch screen disposed on the front windshield when the vehicle is in a parking state.
Further, the operation information includes a target building selected from the map information, and the operation module is further configured to display information related to the target building in the navigation image.
According to the technical scheme provided by the embodiments of the invention, when the vehicle is in a parking state, the operation information of the user is obtained through the touch screen arranged on the front windshield. Because the touch screen allows the navigation image on the front windshield to be operated directly, the accuracy of the navigation information and driving safety are improved.
Based on the above embodiment, the operation module is further configured to obtain lane information for the travelling vehicle and to determine steering information of the vehicle according to the lane information and the navigation image; the display module is further configured to display the steering information through the augmented reality head-up display and to control a turn signal of the vehicle according to the steering information.
According to the technical scheme provided by the embodiments of the invention, lane information for the travelling vehicle is obtained; steering information of the vehicle is determined according to the lane information and the navigation image; and the steering information is displayed through the augmented reality head-up display while a turn signal of the vehicle is controlled according to the steering information. The embodiments of the invention thereby improve driving convenience and driving safety.
Based on the above embodiment, further, the display module is further configured to display image information of a video through the augmented reality head-up display;
the operation module is also used for operating the video according to the operation information;
wherein the video comprises at least one of:
an audio-video file;
a video call.
According to the technical scheme provided by the embodiments of the invention, image information of a video is displayed through the augmented reality head-up display, and the video is operated according to the operation information. The embodiments of the invention thereby improve the entertainment and communication functions of the vehicle in the parking state.
The in-vehicle information display device provided by the embodiment of the invention can realize each process in the embodiment corresponding to the in-vehicle information display method, and is not repeated here to avoid repetition.
It should be noted that the in-vehicle information display apparatus provided in the embodiment of the present invention and the in-vehicle information display method provided in the embodiment of the present invention are based on the same inventive concept, and therefore, for specific implementation of the embodiment, reference may be made to implementation of the in-vehicle information display method described above, and repeated parts are not described again.
On the basis of the same technical concept, an embodiment of the present invention further provides an electronic device for executing the in-vehicle information display method provided in the above embodiments. Fig. 6 is a schematic structural diagram of an electronic device for implementing the various embodiments of the present invention. Electronic devices may vary widely in configuration or performance and may include one or more processors 601 and a memory 602, where the memory 602 may store one or more application programs or data. The memory 602 may be transient or persistent storage. An application program stored in the memory 602 may include one or more modules (not shown), and each module may include a series of computer-executable instructions for the electronic device. Further, the processor 601 may be arranged to communicate with the memory 602 and to execute the series of computer-executable instructions in the memory 602 on the electronic device. The electronic device may also include one or more power supplies 603, one or more wired or wireless network interfaces 604, one or more input-output interfaces 605, and one or more keyboards 606.
Specifically, in this embodiment, the electronic device includes a processor, a communication interface, a memory, and a communication bus; the processor, the communication interface and the memory complete mutual communication through a bus; the memory is used for storing a computer program; the processor is used for executing the program stored in the memory and realizing the following method steps:
displaying a navigation image on a front windshield of the vehicle through an augmented reality head-up display;
obtaining operation information of a user through a gesture recognition sensor;
and operating the navigation image according to the operation information.
An embodiment of the present application further provides a computer-readable storage medium, in which a computer program is stored, and when executed by a processor, the computer program implements the following method steps:
displaying a navigation image on a front windshield of the vehicle through an augmented reality head-up display;
obtaining operation information of a user through a gesture recognition sensor;
and operating the navigation image according to the operation information.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, an electronic device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, apparatus or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. An in-vehicle information display method, characterized by comprising:
displaying a navigation image on a front windshield of the vehicle through an augmented reality head-up display;
obtaining operation information of a user through a gesture recognition sensor;
and operating the navigation image according to the operation information.
2. The method of claim 1, wherein when the vehicle is in a parked state, the method further comprises:
and obtaining the operation information of the user through a touch screen arranged on the front windshield.
3. The method of claim 1 or 2, wherein the operation information comprises selecting a target building in the map information, the method further comprising:
and displaying the relevant information of the target building in the navigation image.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
acquiring lane information of vehicle driving;
determining steering information of the vehicle according to the lane information and the navigation image;
and displaying the steering information through the augmented reality head-up display, and controlling a turn signal of the vehicle according to the steering information.
5. The method according to claim 1 or 2, wherein when the vehicle is in a parked state, the method further comprises:
displaying image information of a video through the augmented reality head-up display;
operating the video according to the operation information;
wherein the video comprises at least one of:
an audio-video file;
a video call.
6. An in-vehicle information display device based on a head-up display, characterized by comprising:
the display module is used for displaying a navigation image on a front windshield of the vehicle through the augmented reality head-up display;
the interaction module is used for obtaining the operation information of the user through the gesture recognition sensor;
and the operation module is used for operating the navigation image according to the operation information.
7. The device of claim 6, wherein the interaction module is further configured to obtain the operation information of the user through a touch screen disposed on the front windshield when the vehicle is in a parking state.
8. The apparatus of claim 6 or 7, wherein the operation information comprises a selection of a target building in the map information, and the operation module is further configured to display information related to the target building in the navigation image.
9. An electronic device comprising a processor, a communication interface, a memory, and a communication bus; the processor, the communication interface and the memory complete mutual communication through a bus; the memory is used for storing a computer program; the processor is used for executing the program stored in the memory to realize the steps of the in-vehicle information display method based on the head-up display according to any one of claims 1-5.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon which, when being executed by a processor, carries out the in-vehicle information display method steps based on a head-up display according to any one of claims 1 to 5.
CN202210012814.1A 2022-01-06 2022-01-06 In-vehicle information display method and device and electronic equipment Pending CN114527923A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210012814.1A CN114527923A (en) 2022-01-06 2022-01-06 In-vehicle information display method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210012814.1A CN114527923A (en) 2022-01-06 2022-01-06 In-vehicle information display method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114527923A (en) 2022-05-24

Family

ID=81620573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210012814.1A Pending CN114527923A (en) 2022-01-06 2022-01-06 In-vehicle information display method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114527923A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115469751A (en) * 2022-10-10 2022-12-13 Fudan University Multi-mode human-computer interaction system and vehicle


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103786632A (en) * 2012-10-31 2014-05-14 现代摩比斯株式会社 Lighting system for vehicle and control method thereof
CN104943544A (en) * 2014-03-24 2015-09-30 福特全球技术公司 System and method for enabling touchscreen by passenger in moving vehicle
CN104875680A (en) * 2015-06-03 2015-09-02 深圳市光晕网络科技有限公司 HUD (head up display) device combining voice and video recognition
CN109945887A (en) * 2017-12-20 2019-06-28 上海博泰悦臻网络技术服务有限公司 AR air navigation aid and navigation equipment
CN111098865A (en) * 2018-10-25 2020-05-05 福特全球技术公司 Method and apparatus for facilitating navigation using a windshield display
CN111845542A (en) * 2019-05-21 2020-10-30 北京嘀嘀无限科技发展有限公司 Prompting method, prompting device, terminal and computer readable storage medium
CN112241204A (en) * 2020-12-17 2021-01-19 宁波均联智行科技有限公司 Gesture interaction method and system of vehicle-mounted AR-HUD


Similar Documents

Publication Publication Date Title
CN109388467B (en) Map information display method, map information display device, computer equipment and storage medium
US11326891B2 (en) Method for providing image to vehicle and electronic device therefor
US10007264B2 (en) Autonomous vehicle human driver takeover mechanism using electrodes
JP6986699B2 (en) Display control system, display system, mobile body, display control method and program
US9057624B2 (en) System and method for vehicle navigation with multiple abstraction layers
US11610342B2 (en) Integrated augmented reality system for sharing of augmented reality content between vehicle occupants
JP2013112269A (en) In-vehicle display device
CN113237490A (en) AR navigation method, system, electronic device and storage medium
JP2022095778A (en) Method, apparatus, electronic device, computer-readable storage medium, and computer program for pushing information
KR20210129575A (en) Vehicle infotainment apparatus using widget and operation method thereof
CN111064936A (en) Road condition information display method and AR equipment
CN114527923A (en) In-vehicle information display method and device and electronic equipment
WO2020059188A1 (en) Navigation system, navigation display method, and navigation display program
EP2957448B1 (en) Display control apparatus, display control method, display control program, and display apparatus
JP2014234139A (en) On-vehicle display device and program
CN110576790A (en) Information prompting method and system based on automobile rear window glass display screen and vehicle-mounted terminal
CN107908356B (en) Information resident method and device based on intelligent equipment
US20230393801A1 (en) Synchronized rendering
JP2014223824A (en) Display device, display method and display program
JPWO2011121788A1 (en) Navigation device, information display device, navigation method, navigation program, and recording medium
CN114089890A (en) Vehicle driving simulation method, device, storage medium and program product
WO2014049705A1 (en) Display control device, display control method, program, and server device
JP2012233904A (en) Information display device, information display method, information display program, and recording medium
WO2014038044A1 (en) Display device, display method, program, and recording medium
CN203732076U (en) Novel screen-less navigation system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination