CN113900510A - Vehicle information display method, device and storage medium - Google Patents


Info

Publication number
CN113900510A
CN113900510A
Authority
CN
China
Prior art keywords
vehicle
view
component
control
perspective
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111033506.9A
Other languages
Chinese (zh)
Inventor
王晓彤
崔登学
文雅
王丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Urban Network Neighbor Information Technology Co Ltd
Beijing Chengshi Wanglin Information Technology Co Ltd
Original Assignee
Beijing Chengshi Wanglin Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chengshi Wanglin Information Technology Co Ltd filed Critical Beijing Chengshi Wanglin Information Technology Co Ltd
Priority to CN202111033506.9A priority Critical patent/CN113900510A/en
Publication of CN113900510A publication Critical patent/CN113900510A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487 — Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 — Commerce
    • G06Q 30/02 — Marketing; Price estimation or determination; Fundraising
    • G06Q 30/0281 — Customer communication at a business location, e.g. providing product or service information, consulting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a vehicle information display method, a vehicle information display device, and a storage medium. In these embodiments, an electronic device can display a VR interactive interface and present vehicle information through it. Specifically, a control can be associated with a key vehicle component of a vehicle object; when the control is triggered, the state information of that key component is displayed. The VR viewing angle can also be switched, revealing the controls associated with the vehicle's other key components. Throughout this process, even without a physical vehicle present, the user can intuitively learn the vehicle's detailed condition, with the key components the user cares about given focused inspection, which effectively strengthens the user's trust.

Description

Vehicle information display method, device and storage medium
Technical Field
The present application relates to the field of virtual reality technologies, and in particular, to a method, a device, and a storage medium for displaying vehicle information.
Background
When choosing a used vehicle, a user typically wants to carefully inspect and understand certain problems common to used vehicles. During an offline test drive, a salesperson can answer questions on site; but if the vehicle is not on site, or the user cannot test-drive it in person, the user has no direct way to learn the vehicle's detailed condition. This has become a difficult problem in used-vehicle trading scenarios.
Disclosure of Invention
Various aspects of this application provide a vehicle information display method, device, and storage medium, so that a user can intuitively learn the detailed condition of a vehicle, with the key vehicle components the user cares about given focused inspection, which effectively strengthens the user's trust.
An embodiment of this application provides a vehicle information display method. A graphical user interface is provided through an electronic terminal; the content displayed on the graphical user interface includes a VR interactive interface, on which a first vehicle view is displayed. The first vehicle view includes at least a first key vehicle component, with which a first control is associated. The method includes: in response to a trigger operation on the first control, displaying state information of the first key vehicle component in an area associated with the first control; and in response to a VR viewing-angle switching operation, switching the first vehicle view displayed on the VR interactive interface to a second vehicle view, where the second vehicle view includes at least a second key vehicle component, with which a second control is associated, the second control being used to trigger display of state information of the second key vehicle component in an area associated with the second control; where the first vehicle view and the second vehicle view present the same vehicle object and/or its components from different viewing angles.
An embodiment of this application further provides an electronic terminal. The electronic terminal provides a graphical user interface whose displayed content includes a VR interactive interface; a first vehicle view is displayed on the VR interactive interface, the first vehicle view includes at least a first key vehicle component, and a first control is associated with the first key vehicle component. The electronic terminal includes a memory and a processor: the memory stores a computer program, and the processor, coupled with the memory, executes the computer program to: in response to a trigger operation on the first control, display state information of the first key vehicle component in an area associated with the first control; and in response to a VR viewing-angle switching operation, switch the first vehicle view displayed on the VR interactive interface to a second vehicle view, where the second vehicle view includes at least a second key vehicle component, with which a second control is associated, the second control being used to trigger display of state information of the second key vehicle component in an area associated with the second control; where the first vehicle view and the second vehicle view present the same vehicle object and/or its components from different viewing angles.
Embodiments of this application further provide a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the steps of the vehicle information display method provided in the embodiments of this application.
Accordingly, embodiments of this application further provide a computer program product comprising a computer program/instructions which, when executed by a processor, cause the processor to implement the steps of the vehicle information display method provided by this application.
In the embodiments of this application, an electronic device can display a VR interactive interface and present vehicle information through it. Specifically, a control can be associated with a key vehicle component of a vehicle object; when the control is triggered, the state information of that key component is displayed. The VR viewing angle can also be switched, revealing the controls associated with the vehicle's other key components. Throughout this process, even without a physical vehicle present, the user can intuitively learn the vehicle's detailed condition, with the key components the user cares about given focused inspection, which effectively strengthens the user's trust.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic flow chart diagram illustrating a vehicle information display method according to an exemplary embodiment of the present application;
FIG. 2a is a schematic illustration of a first vehicle view provided by an exemplary embodiment of the present application;
FIG. 2b is a schematic illustration of paint surface data for a vehicle object provided in an exemplary embodiment of the present application;
FIG. 2c is a paint data schematic of another vehicle object provided by an exemplary embodiment of the present application;
FIG. 2d is a corresponding global perspective view of a vehicle model provided in an exemplary embodiment of the present application;
FIG. 2e is a corresponding partial perspective view of a vehicle model provided in accordance with an exemplary embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic terminal according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
To solve the prior-art problem that a user cannot intuitively learn the detailed condition of a vehicle, the embodiments of this application display a VR interactive interface through an electronic device and present vehicle information through that interface. Specifically, a control can be associated with a key vehicle component of a vehicle object; when the control is triggered, the state information of that key component is displayed. The VR viewing angle can also be switched, revealing the controls associated with the vehicle's other key components. Throughout this process, even without a physical vehicle present, the user can intuitively learn the vehicle's detailed condition, with the key components the user cares about given focused inspection, which effectively strengthens the user's trust.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
The vehicle information display method in the embodiments of this application can run on an electronic terminal or on a server. The electronic terminal may be a local electronic terminal. When the method runs on a server, the vehicle information can be presented via cloud display.
In an optional embodiment, cloud display refers to an information display manner based on cloud computing. In the cloud display mode, the entity that runs the information processing program is separated from the entity that displays the vehicle information picture: the data storage and computation corresponding to the vehicle information display method are completed on a cloud display server, while a cloud display client is responsible for sending and receiving data and rendering the picture. The cloud display client may be a display device with data transmission capability close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer; the electronic terminal that actually processes the information data, however, is the cloud display server in the cloud. When browsing information, the user operates the cloud display client to send an operation instruction to the cloud display server; the cloud display server renders the relevant picture according to the instruction, encodes and compresses the data, and returns it over the network to the cloud display client, which finally decodes the data and outputs the corresponding picture.
In another optional embodiment, the electronic terminal may be a local electronic terminal. The local electronic terminal stores an application program and presents the application interface, interacting with the user through a graphical user interface; that is, the application is downloaded, installed, and run on the electronic device in the conventional way. The local electronic terminal may provide the graphical user interface to the user in a variety of ways: for example, the interface may be rendered on the terminal's display screen or provided to the user by holographic projection. For example, the local electronic terminal may include a display screen for presenting the graphical user interface, which includes the application picture, and a processor for running the application, generating the graphical user interface, and controlling its display on the display screen.
When the electronic terminal is a local electronic terminal, it may be a desktop computer, a notebook computer, a tablet computer, a mobile terminal, a Virtual Reality (VR) device, or the like. The VR device may include a computer, a VR headset, a VR control device, and so on; through the vehicle-source picture shown on the display of the VR headset, the user can roam within a specified area as though actually roaming a virtual space, while interacting with the virtual vehicle through the VR control device.
The terminal can run application programs, such as lifestyle applications, audio applications, and game applications. Lifestyle applications can be further divided by type, for example into vehicle rental and sales applications, housing rental and sales applications, home service applications, and entertainment applications. The embodiments of this application take a vehicle rental and sales application running on a mobile terminal as an example, though it should be understood that the application is not limited thereto.
Fig. 1 is a flowchart of a vehicle information display method according to an exemplary embodiment of this application. A graphical user interface is provided through an electronic terminal; the content displayed on the graphical user interface includes a VR interactive interface, on which a first vehicle view is displayed. The first vehicle view includes at least a first key vehicle component, with which a first control is associated. As shown in Fig. 1, the method includes:
101. In response to a trigger operation on the first control, display state information of the first key vehicle component in an area associated with the first control;
102. In response to a VR viewing-angle switching operation, switch the first vehicle view displayed on the VR interactive interface to a second vehicle view, where the second vehicle view includes at least a second key vehicle component, with which a second control is associated, the second control being used to trigger display of state information of the second key vehicle component in an area associated with the second control; where the first vehicle view and the second vehicle view present the same vehicle object and/or its components from different viewing angles.
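The two numbered steps above can be sketched as a minimal interaction model. This is an illustrative Python sketch only; every class and method name here (VehicleView, VRInterface, trigger_control, switch_perspective) is an assumption for the example and not part of the application:

```python
# Illustrative model of steps 101/102: each view holds key vehicle
# components with associated status text; triggering a component's
# control shows its status, and a perspective switch swaps views.

class VehicleView:
    def __init__(self, name, components):
        # components: dict mapping a key vehicle component name to its
        # status text; each component implicitly carries one control.
        self.name = name
        self.components = components

class VRInterface:
    def __init__(self, first_view, second_view):
        self.views = {first_view.name: first_view,
                      second_view.name: second_view}
        self.current = first_view
        self.shown_status = None  # text shown in the control's area

    def trigger_control(self, component):
        # Step 101: show the component's status in the control's area.
        self.shown_status = self.current.components[component]
        return self.shown_status

    def switch_perspective(self, view_name):
        # Step 102: switch the displayed vehicle view; the new view's
        # controls become available and any shown status is cleared.
        self.current = self.views[view_name]
        self.shown_status = None
        return self.current.name
```

A caller would build two views of the same vehicle object (e.g. "interior" and "exterior"), trigger a control to read a component's status, then switch perspective to reach the other view's controls.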
In this embodiment, the vehicle object may be a live-action vehicle obtained by capturing images of a physical vehicle, or a vehicle model obtained by modeling a physical vehicle, for example a 3D vehicle model obtained by three-dimensional modeling, or an animated vehicle model obtained by capturing images of the physical vehicle and synthesizing an animation from them. A vehicle view may show the overall appearance of the vehicle object from a certain viewing angle, a partial appearance from a certain viewing angle, or part of the interior space from a certain viewing angle. For ease of distinction and description, two vehicle views showing the same vehicle object from different viewing angles are referred to as the first vehicle view and the second vehicle view, respectively.
In this embodiment, what the first vehicle view and the second vehicle view show of the vehicle object is not limited. In an optional embodiment, the first and second vehicle views show part of the interior space of the vehicle object from different viewing angles. In another optional embodiment, the first and second vehicle views show the overall appearance of the vehicle object from different viewing angles; or they show partial appearances of the vehicle object from different viewing angles; or one shows the overall appearance and the other a partial appearance. In a further optional embodiment, one of the two views shows part of the interior space of the vehicle object while the other shows its overall or partial appearance.
In this embodiment, however the first and second vehicle views present the vehicle object, both may contain key vehicle components. Key vehicle components are the vehicle components of major interest to the user; in a used-vehicle sales scenario, for example, they may include but are not limited to: the gear controller, engine, sunroof, brakes, doors, roof, tires, or windows. For ease of distinction and description, a key vehicle component in the first vehicle view is referred to as a first key vehicle component, and one in the second vehicle view as a second key vehicle component.
In this embodiment, key vehicle components need to be inspected, and guidance or prompts are provided to the user indicating which key vehicle components deserve focused attention, or the user is prompted in advance to verify them. On this basis, state information of the key vehicle components can be obtained and provided to the user as guidance or prompts.
Optionally, the state information of a key vehicle component includes, but is not limited to, at least one of: detection information for the component and an expert judgment result. The detection information is obtained by testing the component for the problems it is prone to, and includes the detection time, the number of detections, the detection result, and so on. The expert judgment result is the conclusion on whether the component is qualified, reached on the basis of its detection information, and includes information such as the inspecting expert's name and whether the component passed; for example, "inspecting expert X1: the sunroof passed the water-leak test", or "inspecting expert Y1: the sunroof failed the water-leak test". The detection information differs by component. If the key vehicle component is the gear controller, the detection information includes at least the smoothness of gear shifting; if it is the sunroof, the detection information includes at least water-leak detection information, which in turn includes at least one of the number of water-leak tests, the test time, and the test result; if it is the engine, the detection information includes at least leak detection information, which includes at least one of the leak-test time, the leak-test personnel, and the leak-test result.
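The state information described above (detection record plus expert judgment) can be modelled as a small record type. This is an illustrative Python sketch; the field names and the summary format are assumptions, not part of the application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectionInfo:
    # Detection record for one key vehicle component, e.g. a sunroof
    # water-leak test or an engine oil-leak test.
    test_name: str                      # e.g. "water leak"
    times: int = 0                      # number of times the test was run
    last_checked: Optional[str] = None  # e.g. "2021-08-01"
    result: Optional[str] = None        # e.g. "pass" / "fail"

@dataclass
class ExpertJudgment:
    expert_name: str   # e.g. "X1"
    qualified: bool    # whether the component is judged qualified

@dataclass
class ComponentStatus:
    component: str     # e.g. "sunroof", "gear controller"
    detection: DetectionInfo
    judgment: Optional[ExpertJudgment] = None

    def summary(self) -> str:
        # Text a floating frame or menu could display to the user.
        verdict = ("qualified"
                   if (self.judgment and self.judgment.qualified)
                   else "not qualified")
        return (f"{self.component}: {self.detection.test_name} "
                f"x{self.detection.times}, {verdict}")
```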
Note that when the key vehicle component is the sunroof and the first vehicle view shows the interior space of the vehicle object, the first vehicle view cannot show the sunroof's exterior; in that case a partial exterior image of the sunroof may be captured and shown as part of the sunroof's state information.
In this embodiment, a control may be associated with a key vehicle component on the VR interactive interface, and triggering the control displays the state information corresponding to that component. The control may include, but is not limited to: a button control, a slider, a scroll bar, or the like, where button controls include at least text buttons, icon buttons, or picture buttons. For ease of distinction and description, the control associated with the first key vehicle component is referred to as the first control, and the control associated with the second key vehicle component as the second control. Specifically, a graphical user interface is provided through the electronic terminal; its displayed content includes a VR interactive interface, on which a first vehicle view is displayed, the first vehicle view including at least a first key vehicle component with an associated first control. The first control may be triggered by, for example, a viewer of the VR interactive interface, who may include but is not limited to: a vehicle purchaser, a broker, or a shopping guide. The first control may be displayed directly on the VR interactive interface, in which case it can be triggered directly; or it may be in a hidden state, in which case, when a trigger operation is initiated on the first key vehicle component, the first control is displayed in an area associated with that component in response to the operation.
In this embodiment, when the first control is triggered, the state information of the first key vehicle component is displayed in an area associated with the first control in response to the trigger operation. The way the state information is displayed is not limited. For example, it may be displayed in a floating frame: when the first control is triggered, the frame switches from hidden to shown and displays the state information; when the first control is triggered again, the frame switches back to hidden and the state information is hidden with it. As another example, the state information may be displayed in a slide-out menu: when the first control is triggered, the menu slides out from the left or right side of the VR interactive interface and displays the state information; when the first control is triggered again, the menu slides back in, hiding both the menu and the state information. Fig. 2a shows an example of a first vehicle view. The first key vehicle component is implemented as the gear controller, whose state information reads "checked for you by the platform: gear-shifting smoothness qualified, shifting is very smooth"; the first control is implemented as a circular control, and when it is triggered the gear controller's state information is displayed in the first vehicle view in the form of a floating frame.
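The show/hide behaviour of the floating frame can be sketched as a simple toggle. This is an illustrative Python sketch; the class and method names are assumptions:

```python
class FloatingFrame:
    """Floating frame that toggles between hidden and shown each time
    the associated control is triggered, as described above."""

    def __init__(self, status_text):
        self.status_text = status_text  # state info of the component
        self.visible = False            # frame starts hidden

    def on_control_triggered(self):
        # First trigger shows the status text; triggering the same
        # control again hides it (returns None when hidden).
        self.visible = not self.visible
        return self.status_text if self.visible else None
```

The same toggle logic would apply to the slide-out menu variant, with the animation differing but the visible/hidden state machine unchanged.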
In this embodiment, switching between different vehicle views can be achieved on the VR interactive interface through a VR viewing-angle switching operation, which may include but is not limited to: clicking, sliding, or rotating. When the VR viewing-angle switching operation is triggered, the first vehicle view displayed on the VR interactive interface is switched to the second vehicle view in response; the second vehicle view includes at least a second key vehicle component, with which a second control is associated, the second control being used to trigger display of the second key vehicle component's state information in an area associated with the second control. The second control may be displayed directly on the VR interactive interface, in which case it can be triggered directly; or it may be hidden, in which case, when a trigger operation is initiated on the second key vehicle component, the second control is displayed in an area associated with that component in response.
In the embodiments of this application, an electronic device can display a VR interactive interface and present vehicle information through it. Specifically, a control can be associated with a key vehicle component of a vehicle object; when the control is triggered, the state information of that key component is displayed. The VR viewing angle can also be switched, revealing the controls associated with the vehicle's other key components. Throughout this process, even without a physical vehicle present, the user can intuitively learn the vehicle's detailed condition, with the key components the user cares about given focused inspection, which effectively strengthens the user's trust.
In an optional embodiment, the second vehicle view may need to show an open-close region on the outside of the vehicle object, i.e. an exterior region containing an open-close component of the vehicle object, which may include but is not limited to: a front region containing the hood, a rear region containing the trunk lid, or a door region containing a door. The states of an open-close region include: the open state, the closed state, and the intermediate states during opening or closing. When the VR viewing-angle switching operation is triggered, the interface dynamically switches from the first vehicle view to the closed-state view of the open-close region in response, and displays a first prompt control associated with that region; the first prompt control is used to trigger the opening of the open-close component. In response to a trigger operation on the first prompt control, the opening process of the open-close component is displayed dynamically, and after the component opens, the open-state view of the region is shown, which can further reveal the corresponding interior component view. The first prompt control may include but is not limited to: a button control, a slider, a scroll bar, or the like, where button controls include at least text buttons, icon buttons, or picture buttons; for example, the first prompt control may be implemented as an icon bearing "can be opened and closed" or similar text. The first vehicle view here may show part of the interior space of the vehicle object, or its overall or partial appearance.
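The sequence just described (closed view, prompt control, opening animation, open view) amounts to a small state machine per open-close region. The following Python sketch is illustrative only; state names and callbacks are assumptions:

```python
class OpenCloseRegion:
    """State machine for an openable region of the vehicle exterior
    (hood, trunk lid, door): closed -> opening -> open."""

    def __init__(self, component):
        self.component = component  # e.g. "hood", "trunk lid", "door"
        self.state = "closed"       # closed-state view shown first

    def on_prompt_triggered(self):
        # Triggering the first prompt control starts the dynamic
        # display of the opening process.
        if self.state == "closed":
            self.state = "opening"
        return self.state

    def on_animation_finished(self):
        # After the opening process plays out, the open-state view is
        # shown, revealing the interior component view (e.g. engine bay).
        if self.state == "opening":
            self.state = "open"
        return self.state
```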
If the first vehicle view is already the closed-state view corresponding to the open-close region, the opening process of the open-close component can be dynamically displayed directly in response to the VR perspective switching operation, and the open-state view of the region is displayed after the component is open.
In an optional embodiment, when the first vehicle view or the second vehicle view shows the whole or partial appearance of the vehicle object, both views can show the vehicle paint data of the vehicle object. The paint data includes paint information and a paint thickness value: the paint information can indicate whether the corresponding part of the vehicle is scratched, and the paint thickness value can indicate whether the corresponding part has been repainted, whether the paint is uniform, and so on. The paint data can be recorded in advance into the hard disk or memory of the electronic terminal; when paint detection is performed, the data is retrieved and displayed at the corresponding positions of the vehicle view. It should be noted that the first key vehicle component displayed in the first vehicle view and the second key vehicle component displayed in the second vehicle view are exterior components of the vehicle object on which paint data can be shown; for example, these may include, but are not limited to: the front exterior, rear exterior, door exterior, or roof exterior, etc.
Specifically, when the first vehicle view is displayed on the VR interactive interface, a vehicle paint detection control is also displayed there, as shown in fig. 2b. When the vehicle paint detection control is triggered, paint detection is performed on the vehicle surface in response to the triggering operation: a paint-detection light effect is dynamically shown on the displayed vehicle surface, and the paint data of the displayed first key vehicle component is presented. As shown in fig. 2b, a first key vehicle component of one vehicle object comprises a hood, a side door, a window frame and a rear cover, with paint thicknesses of 154 micrometers (μm) for the hood, 175 μm for the side door, 144 μm for the window frame and 170 μm for the rear cover; as shown in fig. 2c, a first key vehicle component of another vehicle object comprises the same parts, with paint thicknesses of 154 μm for the hood, 150 μm for the side door, 152 μm for the window frame and 152 μm for the rear cover. When a problem area exists on the vehicle surface, the problem area and the details of the problem are highlighted. A problem area is an area where a paint problem has occurred, for example an area with paint scratches, a repainted area, or an area where the paint is uneven. The manner of highlighting is not limited: the problem area may be shown in a highlight color, marked with a highlight line, or annotated with text information.
The problem details may include, but are not limited to: vehicle paint data, quality control information, or the like. As shown in fig. 2b, the side-door area of the vehicle surface is the problem area, and the problem detail is "paint repair".
Optionally, in response to an operation of viewing the vehicle paint detection result, the result may also be displayed on the VR interactive interface; it includes at least one of a paint status rating and a number of problem areas. The rating may be expressed as "excellent", "good", and "pass", or as a, b, c, or d, without limitation. The number of problem areas may be one, several, or zero, i.e. no problem areas.
The manner of triggering the viewing of the paint detection result is not limited. For example, triggering the vehicle paint detection control may automatically trigger the viewing of the result once the paint data of the first key vehicle component has been displayed, and the result is then shown on the VR interactive interface. As shown in fig. 2b, after the vehicle paint detection control is triggered, the paint data of the vehicle surface is displayed together with the detection result, in which the paint status is rated "good" and the number of problem areas is "1 obvious paint repair mark". For another example, the VR interactive interface includes a viewing control in addition to the vehicle paint detection control; when the viewing control is triggered, the viewing of the detection result is triggered and the result is displayed on the VR interactive interface. As shown in fig. 2c, when the viewing control on the VR interactive interface is triggered, the detection result is displayed, in which the paint status is rated "excellent" and there are no repair traces.
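The rating and problem-area count of a paint detection result could be derived from per-component paint data along the lines below. This is a hypothetical sketch: the thickness thresholds and rating rules are assumptions for illustration, not values from the application; the example dictionaries reuse the thickness figures quoted for figures 2b and 2c.

```python
def paint_report(thickness_um: dict, problem_areas: list) -> dict:
    """Summarize paint detection: a status rating plus the problem-area count.

    Thresholds are illustrative assumptions: a small thickness spread with no
    problem areas rates "excellent"; a single problem area rates "good".
    """
    values = list(thickness_um.values())
    spread = max(values) - min(values)
    if not problem_areas and spread <= 10:
        rating = "excellent"
    elif len(problem_areas) <= 1:
        rating = "good"
    else:
        rating = "pass"
    return {"rating": rating, "problem_areas": len(problem_areas)}

# Thickness figures quoted in the text for figures 2b and 2c:
car_b = {"hood": 154, "side door": 175, "window frame": 144, "rear cover": 170}
car_c = {"hood": 154, "side door": 150, "window frame": 152, "rear cover": 152}
```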
In addition, in this embodiment, a vehicle bottom projection with a first visual attribute can be displayed at the bottom of the vehicle, the first visual attribute indicating that a problem area exists on the vehicle surface. The first visual attribute may include, but is not limited to: the color of the projection, the shape of its outline, or the line style or line width of the outline, etc. Optionally, when no problem area exists on the vehicle surface, a vehicle bottom projection with a second visual attribute is displayed at the bottom, the second visual attribute indicating the absence of problem areas and being different from the first visual attribute. As shown in fig. 2b, the first visual attribute indicating a problem area is drawn with a dashed outline, and as shown in fig. 2c, the second visual attribute indicating no problem area is drawn with a solid outline.
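Choosing between the first and second visual attribute of the underbody projection is a simple branch on the detection result. A minimal sketch (the dashed/solid encoding mirrors figures 2b and 2c; the dictionary keys are illustrative assumptions):

```python
def underbody_projection(problem_areas: list) -> dict:
    """Pick the projection's visual attribute from the paint-detection result:
    a first attribute (dashed outline) when problem areas exist, a distinct
    second attribute (solid outline) when none do."""
    if problem_areas:
        return {"outline": "dashed", "meaning": "problem areas present"}
    return {"outline": "solid", "meaning": "no problem areas"}
```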
Similarly, when a second vehicle view is displayed on the VR interactive interface, a vehicle paint detection control is also displayed there. When the vehicle paint detection control is triggered, paint detection is performed on the vehicle surface in response to the triggering operation: a paint-detection light effect is dynamically shown on the displayed vehicle surface, and the paint data of the displayed second key vehicle component is presented; when a problem area exists on the vehicle surface, the problem area and the details of the problem are highlighted. In addition, a vehicle bottom projection with the first visual attribute can be displayed at the bottom of the vehicle, the first visual attribute indicating that a problem area exists on the vehicle surface. For the details of performing paint detection when the second vehicle view is displayed, reference may be made to the foregoing embodiments, which are not repeated here.
In this embodiment, the offline paint-inspection process is reproduced online and presented visually through the VR interactive interface, saving the user the cost of inspecting the car in person. At the same time, the user can view the paint data from 720 degrees through interactive operations, compensating for the fact that paint thickness cannot be conveyed visually when viewing the car in person.
In an optional embodiment, the vehicle object is a vehicle model obtained by modeling a physical vehicle. The vehicle model may be a 3D model obtained by three-dimensional modeling of the physical vehicle, or a model synthesized from collected images of it; for example, the physical vehicle is shot in a circle at multiple shooting angles to obtain multiple vehicle images, which are then synthesized into an animated vehicle model. When the first or second vehicle view shows the overall appearance of the vehicle model, the model can further undergo perspective processing so that a perspective view corresponding to it is shown on that view. For example, global perspective processing may be performed on the vehicle model to present its global perspective view on the VR interactive interface. For another example, local perspective processing may be performed to present a local perspective view. For another example, both global and local perspective processing are supported: when global perspective processing is performed, the global perspective view is displayed on the VR interactive interface, and when local perspective processing is performed, the local perspective view is displayed.
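The circular-shooting synthesis step can be sketched as ordering the captured images by shooting angle before animation synthesis. The function below is an illustrative assumption (real synthesis would operate on image data, not file names):

```python
def build_animation_model(shots: list) -> list:
    """Order circularly shot images by shooting angle so they can be
    synthesized into an animated vehicle model. Each shot is a tuple
    (angle_degrees, image_id); image_id stands in for real image data."""
    return [img for _, img in sorted(shots, key=lambda s: s[0] % 360)]

shots = [(270, "rear-left.jpg"), (0, "front.jpg"),
         (90, "right.jpg"), (180, "rear.jpg")]
```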
Case a 1: electronic terminal supporting global perspective processing
In this embodiment, a first vehicle view and a first perspective control are displayed on the VR interactive interface. When the first perspective control is triggered, in response to the global perspective trigger operation initiated on it, the first vehicle view on the VR interactive interface is switched to the global perspective view corresponding to the vehicle model. For example, the first perspective control is implemented as a button; if the button is triggered, a global perspective trigger operation is automatically initiated. When a problem component exists in the physical vehicle, the problem component is highlighted on the global perspective view, for example by its color, line shape, or line width. For instance, if the engine of the physical vehicle has been replaced with a new one, the engine is highlighted in red in the global perspective view of the corresponding vehicle model; if the second prompt control of the engine is triggered, details such as the replacement time and engine brand can be displayed.
In addition, the problem component is configured with a second prompt control, which is used to trigger the display of the problem details of that component. The second prompt control may be implemented directly as the problem component itself, so that triggering the component initiates a triggering operation on the control; alternatively, it may be implemented as a button control, a slider, a scroll bar, or the like. When the second prompt control is triggered, the problem details of the problem component are displayed in response. FIG. 2d is a global perspective view of a vehicle model containing a problem component, for example a vehicle shock absorber; in fig. 2d the problem component is shown in phantom and the second prompt control as a circular button control. Fig. 2d also includes problem details, which comprise: the problem component name (vehicle shock absorber), the warranty status, and "under official warranty". Here a problem component is a vehicle component that has been repaired or replaced. The problem details of a problem component may include, but are not limited to: its repair record, whether it was within the warranty period when repaired, whether the repair was official, the replacement time and brand if it was replaced, whether the new component is an original factory part, and so on.
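The global perspective behavior — list every component, highlight repaired or replaced ones, and attach a second prompt control to them — can be sketched as a pure data transform. Names and fields are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    repaired: bool = False
    details: str = ""  # problem details shown by the second prompt control

def global_perspective(components: list) -> dict:
    """Render (as data) the global perspective view: every component is
    listed, and repaired/replaced ones are highlighted and given a second
    prompt control."""
    return {
        c.name: {"highlight": c.repaired, "prompt_control": c.repaired}
        for c in components
    }

parts = [
    Component("engine"),
    Component("shock absorber", repaired=True,
              details="repaired under official warranty"),
]
```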
Or displaying a second vehicle view and the first perspective control on the VR interactive interface, and when the first perspective control is triggered, responding to a global perspective trigger operation initiated on the first perspective control, and switching the second vehicle view on the VR interactive interface to a global perspective view corresponding to the vehicle model. In the case of a problem component in a physical vehicle, highlighting the problem component on the global perspective, the problem component being configured with a second prompt control, and in response to a triggering operation of the second prompt control, displaying problem details of the problem component. Among them, the problem part is a part that has been subjected to repair or replacement among vehicle parts.
Case a 2: electronic terminal supporting local perspective processing
In this embodiment, the VR interactive interface displays a first or second vehicle view and, in addition, a second perspective control. When the second perspective control is triggered, in response to the local perspective trigger operation initiated on it, a movable local perspective frame is displayed on the vehicle model; as the frame moves, the area it covers on the vehicle model is displayed as the corresponding local perspective view, and, when a problem component is included in the local perspective view, its problem details are displayed synchronously. In fig. 2e, the problem component is an engine, and the problem detail is "engine replaced". The shape of the local perspective frame is not limited and may be, for example, a circle, a square, or a triangle. In fig. 2e the frame is illustrated as a circle, but the present invention is not limited thereto.
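Determining which components the movable local perspective frame currently covers is essentially a hit test. A minimal sketch assuming a circular frame and 2D screen-space component positions (both assumptions for illustration):

```python
import math

def components_in_frame(frame_center, frame_radius, component_positions):
    """Return the components covered by a circular local perspective frame.
    Positions are 2D screen-space coordinates; components inside the frame
    would be rendered as the local perspective view."""
    cx, cy = frame_center
    return [name for name, (x, y) in component_positions.items()
            if math.hypot(x - cx, y - cy) <= frame_radius]

positions = {"engine": (10, 5), "battery": (40, 5), "gearbox": (12, 8)}
```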
Case a 3: electronic terminal supporting global perspective processing and local perspective processing
In this embodiment, a first vehicle view and a third perspective control are displayed on the VR interactive interface, and either a global or a local perspective trigger operation may be initiated on the third perspective control. For example, the third perspective control may be implemented as a composite button: when the button displays "global perspective", a global perspective trigger operation may be initiated on it; after that operation, the button displays "local perspective" and a local perspective trigger operation may be initiated. For another example, a click on the third perspective control may initiate a global perspective trigger operation, while a slide on it may initiate a local perspective trigger operation.
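The composite-button scheme for the third perspective control can be sketched as a label that always shows the next available mode. The flip back to "global perspective" after a local trigger is an assumption; the application leaves that behavior open:

```python
class ThirdPerspectiveControl:
    """Composite third perspective control: its label shows the next
    available mode; triggering it initiates that mode and flips the label."""
    def __init__(self):
        self.label = "global perspective"

    def trigger(self) -> str:
        if self.label == "global perspective":
            self.label = "local perspective"  # next trigger will be local
            return "global"
        # Assumption: after local perspective the label flips back.
        self.label = "global perspective"
        return "local"

control = ThirdPerspectiveControl()
```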
In this embodiment, when a global perspective trigger operation is initiated for the third perspective control, in response to the global perspective trigger operation initiated for the third perspective control, the first vehicle view on the VR interactive interface is switched to a global perspective view corresponding to the vehicle model, and in a case that a problem part exists in the physical vehicle, the problem part is highlighted on the global perspective view, and the problem part is configured with a second prompt control, and in response to the trigger operation for the second prompt control, the problem details of the problem part are displayed. When a local perspective triggering operation is initiated aiming at the third perspective control, responding to the local perspective triggering operation initiated on the third perspective control, displaying a movable local perspective frame on the vehicle model, and displaying an area covered by the local perspective frame on the vehicle model as a local perspective corresponding to the area along with the movement of the local perspective frame; and synchronizing display of the problem details of the problem component in the case where the problem component is included in the partial perspective view.
Similarly, the implementation of displaying the second vehicle view together with the third perspective control on the VR interactive interface is the same as or similar to that of displaying the first vehicle view with the third perspective control; for details, reference may be made to the foregoing embodiments, which are not repeated here.
In this embodiment, when purchasing a second-hand vehicle, the details inside the vehicle cannot be fully understood from its appearance alone. With the present application, the interior of the vehicle object can be inspected directly through its global perspective view, and the vehicle components that have been repaired or replaced are highlighted, so that during the VR experience the user can intuitively learn the vehicle's repair history and the problems that deserve close attention.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 101 to 102 may be device a; for another example, the execution subject of step 101 may be device a, and the execution subject of step 102 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 101, 102, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
Fig. 3 is a schematic structural diagram of an electronic terminal according to an exemplary embodiment of the present application. As shown in fig. 3, the electronic terminal includes: a memory 34 and a processor 35.
The memory 34 is used for storing computer programs and may be configured to store other various data to support operations on the electronic terminal. Examples of such data include instructions for any application or method operating on the electronic terminal.
The memory 34 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In this embodiment, the electronic terminal further includes: a display 37. The electronic terminal provides a graphical user interface through the display 37; the content displayed by the graphical user interface includes a VR interactive interface, on which a first vehicle view is displayed, the first vehicle view at least including a first key vehicle component with which a first control is associated. A processor 35, coupled to the memory 34, executes the computer program in the memory 34 to: in response to a triggering operation on the first control, display state information of the first key vehicle component in an associated area of the first control; in response to a VR perspective switching operation, switch the first vehicle view on the VR interactive interface to a second vehicle view, the second vehicle view at least including a second key vehicle component with which a second control is associated, the second control being used to trigger the display of state information of the second key vehicle component in an associated area of the second control; wherein the first vehicle view and the second vehicle view are used to present the same vehicle object and/or its components from different perspectives.
In an alternative embodiment, the first vehicle view and the second vehicle view are used to show a portion of the interior space of the vehicle object from different perspectives; or the first vehicle view and the second vehicle view are used for showing the whole or partial appearance of the vehicle object from different visual angles; or in the first vehicle view and the second vehicle view, one is used for showing a part of the interior space of the vehicle object and the other is used for showing the whole or partial appearance of the vehicle object.
In an optional embodiment, in a case where the second vehicle view shows the open-close region outside the vehicle object, when the processor 35 switches the first vehicle view on the VR interactive interface to the second vehicle view in response to the VR perspective switching operation, it specifically: dynamically switches from the first vehicle view to the closed-state view corresponding to the open-close region in response to the VR perspective switching operation, and displays a first prompt control associated with the open-close region, the first prompt control being used to trigger the opening of the open-close component; and, in response to a triggering operation on the first prompt control, dynamically displays the opening process of the open-close component and, after the component is open, displays the open-state view of the region so as to further show the corresponding interior component view.
In an alternative embodiment, the open-close region outside the vehicle object includes a head region, a tail region or a door region.
In an alternative embodiment, where the first or second vehicle view shows the overall or partial appearance of the vehicle object, a vehicle paint detection control is also displayed on the VR interactive interface; the processor 35 is further configured to: dynamically displaying a vehicle paint detection light effect on the displayed vehicle surface and displaying paint surface data of the displayed first or second key vehicle component in response to a triggering operation on the vehicle paint detection control; and in the case of a problem area in the vehicle surface, highlighting the problem area and details of the problem on the vehicle surface and displaying a vehicle underbody projection having a first visual attribute at the bottom of the vehicle, the first visual attribute indicating that the problem area is present on the vehicle surface.
In an alternative embodiment, processor 35 is further configured to: when there is no problem area on the vehicle surface, display a vehicle bottom projection with a second visual attribute at the bottom of the vehicle, the second visual attribute indicating that no problem area exists on the vehicle surface and being different from the first visual attribute.
In an alternative embodiment, processor 35 is further configured to: in response to an operation of viewing the vehicle paint detection result, display the result on the VR interactive interface, the result including at least one of a paint status rating and a number of problem areas.
In an optional embodiment, the vehicle object is a vehicle model obtained by modeling a physical vehicle, and a first perspective control is further displayed on the VR interactive interface when the first or second vehicle view shows the overall appearance of the vehicle model; the processor 35 is further configured to: in response to a global perspective trigger operation initiated on the first perspective control, switch the first or second vehicle view on the VR interactive interface to the global perspective view corresponding to the vehicle model, and, when a problem component exists in the physical vehicle, highlight the problem component on the global perspective view, wherein the problem component is configured with a second prompt control, and, in response to a triggering operation on the second prompt control, display the problem details of the problem component.
In an optional embodiment, the vehicle object is a vehicle model obtained by modeling a physical vehicle, and a second perspective control is further displayed on the VR interactive interface under the condition that the first or second vehicle view shows the overall appearance of the vehicle model; the processor 35 is further configured to: in response to a local perspective triggering operation initiated on the second perspective control, displaying a local perspective frame which can be moved on the vehicle model, and displaying an area covered by the local perspective frame on the vehicle model as a local perspective corresponding to the area along with the movement of the local perspective frame; and synchronizing display of the problem details of the problem component in the case where the problem component is included in the partial perspective view.
Further, as shown in fig. 3, the electronic terminal further includes: communication components 36, power components 38, audio components 39, and the like. Only some of the components are schematically shown in fig. 3, and the electronic terminal is not meant to include only the components shown in fig. 3. It should be noted that the components within the dashed line frame in fig. 3 are optional components, not necessary components, and may be determined according to the product form of the electronic terminal.
Accordingly, embodiments of the present application also provide a computer readable storage medium storing a computer program, which, when executed by a processor, causes the processor to implement the steps of the method shown in fig. 1.
Accordingly, embodiments of the present application also provide a computer program product, which includes computer programs/instructions that, when executed by a processor, cause the processor to implement the steps of the method shown in fig. 1.
The communication component of fig. 3 described above is configured to facilitate wired or wireless communication between its host device and other devices. The host device can access a wireless network based on a communication standard, such as WiFi, or a 2G, 3G, 4G/LTE, or 5G mobile communication network, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display in fig. 3 described above includes a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply assembly of fig. 3 described above provides power to the various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component of fig. 3 described above may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (11)

1. A vehicle information presentation method, wherein a graphical user interface is provided through an electronic terminal, content displayed by the graphical user interface comprises a VR interactive interface, a first vehicle view is displayed on the VR interactive interface, the first vehicle view comprises at least a first key vehicle component, and a first control is associated with the first key vehicle component, the method comprising:
in response to a triggering operation of the first control, displaying state information of the first key vehicle component in an associated area of the first control;
in response to a VR perspective switching operation, switching the first vehicle view displayed on the VR interactive interface to a second vehicle view, wherein the second vehicle view comprises at least a second key vehicle component, and a second control is associated with the second key vehicle component and is used to trigger display of state information of the second key vehicle component in an associated area of the second control;
wherein the first and second vehicle views are used to present the same vehicle object and/or part thereof from different perspectives.
2. The method of claim 1, wherein the first vehicle view and second vehicle view are used to show a portion of an interior space of the vehicle object from different perspectives;
or
The first and second vehicle views are for displaying the overall or partial appearance of the vehicle object from different perspectives;
or
In the first and second vehicle views, one is used to show a part of the interior space of the vehicle object and the other is used to show the overall or partial appearance of the vehicle object.
3. The method of claim 1 or 2, wherein, in a case where the second vehicle view shows an open-close region on the exterior of the vehicle object, the switching the first vehicle view displayed on the VR interactive interface to a second vehicle view in response to a VR perspective switching operation comprises:
in response to the VR perspective switching operation, dynamically switching from the first vehicle view to a closed-state view corresponding to the open-close region, and displaying a first prompt control for the open-close region;
in response to a triggering operation of the first prompt control, dynamically displaying the opening process of the open-close component corresponding to the open-close region, and displaying an open-state view of the open-close region after the open-close component is opened, so as to further display the corresponding interior component view revealed by the opened component.
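The open/close flow of claim 3 amounts to a small state machine: closed-state view with a prompt, an opening animation, then an open-state view revealing an interior component view. The state names and the callback standing in for animation frames below are assumptions:

```typescript
// Illustrative state machine for the claim-3 open/close flow; names are assumed.
type OpenCloseState = "closed" | "opening" | "open";

class OpenCloseRegion {
  state: OpenCloseState = "closed";
  promptVisible = true; // the first prompt control shown with the closed-state view

  // Triggering the prompt plays the opening process (reported frame by frame
  // via onFrame), then returns the interior view revealed once the part is open.
  triggerPrompt(onFrame: (s: OpenCloseState) => void): string {
    this.promptVisible = false;
    this.state = "opening";
    onFrame(this.state); // dynamic display of the opening process
    this.state = "open";
    onFrame(this.state); // open-state view of the region
    return "interior-component-view"; // assumed identifier for the revealed view
  }
}
```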
4. The method of claim 3, wherein the open-close region outside the vehicle object comprises a nose region, a tail region, or a door region.
5. The method of claim 2, wherein, in a case where the first or second vehicle view shows the overall or partial appearance of the vehicle object, a vehicle paint detection control is further displayed on the VR interactive interface, the method further comprising:
in response to a triggering operation of the vehicle paint detection control, dynamically displaying a paint-detection light effect on the displayed vehicle surface and displaying paint surface data of the displayed first or second key vehicle component; and
in a case where a problem area exists on the vehicle surface, highlighting the problem area and its problem details on the vehicle surface, and displaying an underbody projection with a first visual attribute at the bottom of the vehicle, the first visual attribute indicating that a problem area exists on the vehicle surface.
6. The method of claim 5, further comprising:
in a case where no problem area exists on the vehicle surface, displaying an underbody projection with a second visual attribute at the bottom of the vehicle, wherein the second visual attribute indicates that no problem area exists on the vehicle surface and is different from the first visual attribute.
7. The method of claim 5, further comprising:
in response to an operation of viewing the vehicle paint detection results, displaying the vehicle paint detection results on the VR interactive interface, the vehicle paint detection results including a paint status rating and a number of problem areas.
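Claims 5-7 together describe a paint-detection result and an underbody projection whose visual attribute encodes whether any problem area was found. A hypothetical data shape (the field names and the "first"/"second" attribute encoding are assumptions, not the patent's):

```typescript
// Assumed shapes for the paint-detection flow of claims 5-7.
interface ProblemArea {
  location: string; // e.g. "left front door"
  detail: string;   // e.g. "scratch, ~3 cm"
}

interface PaintDetectionResult {
  paintStatusRating: string; // claim 7: paint status rating
  problemAreaCount: number;  // claim 7: number of problem areas
  underbodyProjection: "first" | "second"; // claims 5-6: projection attribute
}

function detectPaint(rating: string, problems: ProblemArea[]): PaintDetectionResult {
  return {
    paintStatusRating: rating,
    problemAreaCount: problems.length,
    // first attribute when problem areas exist, a different second one otherwise
    underbodyProjection: problems.length > 0 ? "first" : "second",
  };
}
```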
8. The method of claim 2, wherein the vehicle object is a vehicle model modeling a physical vehicle, and wherein a first perspective control is further displayed on the VR interactive interface with the first or second vehicle view showing an overall appearance of the vehicle model, the method further comprising:
in response to a global perspective trigger operation initiated on the first perspective control, switching a first or second vehicle view on the VR interactive interface to a global perspective view corresponding to the vehicle model, and in the event that a problem component exists with the physical vehicle, highlighting the problem component on the global perspective view, wherein the problem component is configured with a second prompt control, and in response to a trigger operation on the second prompt control, displaying problem details of the problem component.
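The global perspective view of claim 8 reduces to filtering the model's components for problems and attaching a prompt carrying the details. A sketch with assumed data shapes:

```typescript
// Hypothetical claim-8 behavior; component and field names are illustrative.
interface PerspectiveComponent {
  name: string;
  problem?: string; // present when the corresponding physical part has an issue
}

// Returns the components to highlight on the global perspective view, each
// with a (second) prompt whose trigger would reveal the problem details.
function globalPerspective(components: PerspectiveComponent[]) {
  return components
    .filter((c) => c.problem !== undefined)
    .map((c) => ({ name: c.name, highlighted: true, promptDetail: c.problem! }));
}
```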
9. The method of claim 2, wherein the vehicle object is a vehicle model modeling a physical vehicle, and wherein a second perspective control is also displayed on the VR interactive interface with the first or second vehicle view showing an overall appearance of the vehicle model, the method further comprising:
in response to a local perspective trigger operation initiated on the second perspective control, displaying a movable local perspective frame on the vehicle model, and, as the local perspective frame moves, displaying the region of the vehicle model covered by the frame as a corresponding local perspective view; and, in a case where a problem component is included in the local perspective view, synchronously displaying the problem details of the problem component.
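The movable frame of claim 9 can be illustrated with simplified one-dimensional geometry; the interval-overlap test, the region names, and the result shape below are all assumptions:

```typescript
// Illustrative claim-9 behavior: a frame over the model reveals a local
// perspective of whatever regions it covers (geometry simplified to 1-D).
interface ModelRegion {
  start: number; // span of the region along the vehicle model
  end: number;
  name: string;
  problemDetail?: string; // present when the part has a known problem
}

function localPerspective(frameStart: number, frameEnd: number, regions: ModelRegion[]) {
  // A region is in the local perspective view when it overlaps the frame.
  const covered = regions.filter((r) => r.end > frameStart && r.start < frameEnd);
  return {
    visible: covered.map((r) => r.name),
    // problem details shown synchronously when a problem part is in the frame
    problems: covered.flatMap((r) => (r.problemDetail ? [r.problemDetail] : [])),
  };
}
```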
10. An electronic terminal providing a graphical user interface, wherein content displayed by the graphical user interface comprises a VR interactive interface on which a first vehicle view is displayed, the first vehicle view comprising at least a first key vehicle component with which a first control is associated;
the electronic terminal includes: a memory and a processor; the memory for storing a computer program; the processor, coupled with the memory, to execute the computer program to:
in response to a triggering operation of the first control, display state information of the first key vehicle component in an associated area of the first control; in response to a VR perspective switching operation, switch the first vehicle view displayed on the VR interactive interface to a second vehicle view, wherein the second vehicle view comprises at least a second key vehicle component, and a second control is associated with the second key vehicle component and is used to trigger display of state information of the second key vehicle component in an associated area of the second control; wherein the first and second vehicle views are used to present the same vehicle object and/or part thereof from different perspectives.
11. A computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, causes the processor to carry out the steps of the method according to any one of claims 1 to 9.
CN202111033506.9A 2021-09-03 2021-09-03 Vehicle information display method, device and storage medium Pending CN113900510A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111033506.9A CN113900510A (en) 2021-09-03 2021-09-03 Vehicle information display method, device and storage medium

Publications (1)

Publication Number Publication Date
CN113900510A true CN113900510A (en) 2022-01-07

Family

ID=79188652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111033506.9A Pending CN113900510A (en) 2021-09-03 2021-09-03 Vehicle information display method, device and storage medium

Country Status (1)

Country Link
CN (1) CN113900510A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115016721A (en) * 2022-05-09 2022-09-06 北京城市网邻信息技术有限公司 Information display method and device, electronic equipment and storage medium
CN115033133A (en) * 2022-05-13 2022-09-09 北京五八信息技术有限公司 Progressive information display method and device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104641400A (en) * 2012-07-19 2015-05-20 戈拉夫·瓦茨 User-controlled 3D simulation for providing realistic and enhanced digital object viewing and interaction experience
CN107153978A (en) * 2016-03-02 2017-09-12 腾讯科技(北京)有限公司 Vehicle methods of exhibiting and system
CN108369344A (en) * 2015-12-22 2018-08-03 奥迪股份公司 Method and virtual reality system for running virtual reality system
CN109949121A (en) * 2019-01-21 2019-06-28 广东康云科技有限公司 A kind of intelligence sees the data processing method and system of vehicle
CN210348303U (en) * 2019-06-21 2020-04-17 厦门伟特思汽车科技有限公司 Virtual display system of passenger car
CN111372055A (en) * 2020-03-25 2020-07-03 东风汽车集团有限公司 Vehicle bottom image display system and method
CN112824183A (en) * 2019-11-20 2021-05-21 华为技术有限公司 Automatic parking interaction method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20220107)