WO2019184889A1 - Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device - Google Patents

Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device

Info

Publication number
WO2019184889A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
virtual model
electronic device
reality virtual
positional relationship
Prior art date
Application number
PCT/CN2019/079588
Other languages
English (en)
Chinese (zh)
Inventor
王健
谭筱
蓝和
邹奎
Original Assignee
Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2019184889A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present application relates to the field of electronic devices, and in particular, to a method, an apparatus, a storage medium, and an electronic device for adjusting an augmented reality model.
  • Augmented Reality (AR) is a technology that "seamlessly" integrates real-world information with virtual-world information. Physical information that is difficult to experience within a given time and space of the real world (visual information, sound, taste, touch, and so on) is simulated by computer and other scientific techniques, and the real environment and virtual objects are then superimposed in real time on the same picture or in the same space, so that the virtual information applied to the real world can be perceived by human senses, achieving a sensory experience that transcends reality. Augmented reality technology displays real-world information and virtual information at the same time, with the two kinds of information complementing and superimposing each other.
  • The embodiments of the present application provide a method, an apparatus, a storage medium, and an electronic device for adjusting an augmented reality model, which can fuse the augmented reality virtual model with a real scene and can greatly improve the accuracy of the fusion.
  • an embodiment of the present application provides an adjustment method of an augmented reality model, including:
  • acquiring an image corresponding to a scene where the electronic device is currently located, and establishing a spatial coordinate system according to the image;
  • adding an augmented reality virtual model to the scene;
  • acquiring a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system; and
  • the augmented reality virtual model is adjusted according to the positional relationship.
  • the embodiment of the present application further provides an apparatus for adjusting an augmented reality model, including: an image acquiring module, an adding module, a position acquiring module, and an adjusting module;
  • the image acquisition module is configured to acquire an image corresponding to a scene where the electronic device is currently located, and establish a spatial coordinate system according to the image;
  • the adding module is configured to add an augmented reality virtual model to the scene
  • the location obtaining module is configured to acquire a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system;
  • the adjusting module is configured to adjust the augmented reality virtual model according to the positional relationship.
  • the embodiment of the present application further provides a storage medium on which a computer program is stored, and when the computer program is executed by a processor, the steps of the above method for adjusting an augmented reality model are implemented.
  • an embodiment of the present application further provides an electronic device, including a processor and a memory, where the memory stores a plurality of instructions, and the processor loads the instructions in the memory to perform the following steps:
  • acquiring an image corresponding to a scene where the electronic device is currently located, and establishing a spatial coordinate system according to the image;
  • adding an augmented reality virtual model to the scene;
  • acquiring a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system; and
  • the augmented reality virtual model is adjusted according to the positional relationship.
  • FIG. 1 is a schematic flowchart of a method for adjusting an augmented reality model according to an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of selecting an augmented reality virtual model according to an embodiment of the present application.
  • FIG. 3 is another schematic flowchart of a method for adjusting an augmented reality model according to an embodiment of the present application.
  • FIG. 4 is a schematic structural diagram of an apparatus for adjusting an augmented reality model according to an embodiment of the present application.
  • FIG. 5 is another schematic structural diagram of an apparatus for adjusting an augmented reality model according to an embodiment of the present application.
  • FIG. 6 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
  • FIG. 7 is another schematic structural diagram of an electronic device according to an embodiment of the present application.
  • the principles of the present application operate using many other general purpose or special purpose computing, communication environments, or configurations.
  • Examples of well-known computing systems, environments, and configurations suitable for use in the present application may include, but are not limited to, hand-held phones, personal computers, servers, multi-processor systems, microcomputer-based systems, mainframe computers, and distributed computing environments including any of the above systems or devices.
  • the device may be integrated into an electronic device, and the electronic device may be a network-enabled electronic device such as a mobile internet device (for example, a smart phone or a tablet computer).
  • An embodiment of the present application provides an adjustment method of an augmented reality model, including:
  • acquiring an image corresponding to a scene where the electronic device is currently located, and establishing a spatial coordinate system according to the image;
  • adding an augmented reality virtual model to the scene;
  • acquiring a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system; and
  • the augmented reality virtual model is adjusted according to the positional relationship.
  • the positional relationship includes a distance
  • the size of the augmented reality virtual model is adjusted according to a distance between the augmented reality virtual model and the electronic device.
  • the positional relationship includes a relative angle;
  • the orientation of the augmented reality virtual model is adjusted according to the relative angle between the augmented reality virtual model and the electronic device.
  • the adding an augmented reality virtual model in the scene includes: acquiring a target object in the scene and generating a corresponding augmented reality virtual model according to the target object; extracting coordinate information of the target object in the spatial coordinate system; and adding the augmented reality virtual model according to the coordinate information.
  • the method further includes: deeply fusing the adjusted augmented reality virtual model with the scene; and
  • the augmented reality picture after the depth fusion is displayed.
  • establishing a spatial coordinate system according to the image comprises: establishing a three-dimensional model of the current scene according to the image; and
  • a spatial coordinate system is established in the three-dimensional model.
  • acquiring a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system includes: acquiring coordinate information of the augmented reality virtual model and of the electronic device in the spatial coordinate system, and determining the positional relationship according to the coordinate information.
  • FIG. 1 is a schematic flowchart of a method for adjusting an augmented reality model according to an embodiment of the present application, including the following steps:
  • Step S101 Acquire an image corresponding to a scene where the electronic device is currently located, and establish a spatial coordinate system according to the image.
  • An image of the current scene may be captured by the camera to acquire a scene image.
  • The above scene may include a person, an animal, or a landscape that the user wants to capture.
  • The image the camera acquires is the picture taken of the current scene.
  • The format of the image can be BMP, JPG, or other formats.
  • the image may also be pre-processed, and the pre-processing may include noise reduction processing and smoothing processing.
  • the three-dimensional model of the current scene may also be established according to the scene image.
  • The basic modeling process can be: data acquisition and processing (training site data, device size data, etc.), modeling in professional 3D software (such as Pro/E or 3DMax), correction in visual editing software (for example, correcting errors and adding materials and interactions in Cosmo Worlds), and refinement in text editing software (for example, using VrmlPad to optimize the structure, reduce code, and compress files).
  • The three-dimensional model may then be image-rendered; rendering is the final stage of making the image conform to the 3D scene.
  • A variety of rendering software is available: each CG package comes with its own rendering engine, and there are also dedicated renderers such as RenderMan.
  • A three-dimensional scene image is obtained, and the three-dimensional scene image is then projected onto the display screen of the augmented reality device, where the user can view it.
  • a spatial coordinate system can be established in the three-dimensional model, for example, using the Marker-less technique to determine the initial coordinate plane, and initializing the coordinate system with the true scale.
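The application does not detail how the Marker-less initialization builds the coordinate frame. As an illustrative sketch (all function names are hypothetical, not from the application), an orthonormal world frame can be constructed from a detected ground-plane normal and an in-plane hint direction:

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def build_world_frame(plane_normal, hint):
    """Build an orthonormal frame (x, y, z) for the scene, taking the
    detected plane normal as the 'up' axis y and projecting the hint
    direction into the plane to fix the x axis."""
    y = normalize(plane_normal)
    d = sum(h * c for h, c in zip(hint, y))            # hint . y
    x = normalize(tuple(h - d * c for h, c in zip(hint, y)))
    z = cross(x, y)                                    # completes the frame
    return x, y, z
```

With the ground plane's normal as (0, 1, 0) and the camera's horizontal view direction as the hint, this yields a y-up frame in which model and device coordinates can then be expressed with true scale.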
  • Step S102 adding an augmented reality virtual model to the scene.
  • the electronic device may generate a corresponding augmented reality virtual model according to the current scene, so as to combine the acquired virtual model with the recognized real scene based on the augmented reality technology to generate a scene image of the augmented reality.
  • The augmented reality virtual model may also be selected by the user. As shown in FIG. 2, after the electronic device acquires the current scene, preset virtual models are automatically recommended according to the scene, and the user can select a target virtual model by clicking on it, thereby adding the augmented reality virtual model to the scene.
  • The augmented reality virtual model may have multiple forms of expression, such as text, image, or video, so that various types of virtual models can be combined with the recognized real scene to obtain an augmented reality scene picture.
  • For example, the augmented reality virtual model can be superimposed on the real scene in the form of text and projected on the display screen of the electronic device for display. Alternatively, the image information of the augmented reality virtual model can be superimposed on the real scene in combination with audio information and projected on the display screen of the electronic device for display, with the audio information played through the speaker of the electronic device.
  • merging the virtual model with the real scene requires omnidirectional alignment of the computer generated virtual model with the user's real environment.
  • the purpose of applying augmented reality is to combine virtual environments with real environments to make them look like a whole. If there is no exact match, the virtual model will appear to float above the real environment, so accurate 3D matching is critical for AR.
  • the above three-dimensional matching may include various methods such as a reference point method, a surface based method, a template method, an indefinite standard, and the like.
  • The reference point method is taken as an example to describe the embodiment of the present application.
  • The method is to manually place some reference points in the real environment and implement three-dimensional matching by matching these reference points.
  • These reference points can be LEDs or specific landmarks. Their positions in the real environment are known and can be identified by image processing; the transformation matrices between the different spaces are then calculated, and the matching is finally done by the least squares method. This method simplifies the calculation and is accurate.
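The application does not specify the least-squares procedure. One standard concrete instance (an assumption here, not necessarily the embodiment's exact method) is the Kabsch algorithm, which recovers the rigid transform mapping the known reference-point positions onto their measured positions:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with R @ src_i + t ~ dst_i,
    computed by the Kabsch algorithm."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src.mean(axis=0)                   # centroids
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

Given at least three non-collinear reference points, the returned matrix and translation align the coordinate spaces in the least-squares sense, which is why the reference point method "simplifies the calculation and is accurate."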
  • Step S103 Acquire a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system.
  • The coordinate information of the augmented reality virtual model may be directly acquired in the spatial coordinate system of the current scene. Further, since the body of the electronic device does not exist in the current scene, acquiring the coordinate information of the electronic device requires calculation; for example, the shooting posture information of the electronic device is acquired from the image capturing the current scene, where the shooting posture information may include shooting height and direction information, and the coordinate information of the electronic device in the coordinate system of the current scene is then calculated according to the shooting posture information.
  • The positional relationship between the augmented reality virtual model and the electronic device may also be obtained by acquiring the depth information of the virtual model. For example, after the augmented reality virtual model is added to the current scene, the location at which the virtual model is added to the scene is determined, and a depth-of-field camera in the electronic device then obtains the positional relationship between that location and the electronic device.
  • the location information may include distance information and relative angle information.
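Once both coordinates are known in the spatial coordinate system, the two components of the positional relationship reduce to elementary geometry. A minimal sketch (function name is illustrative; a y-up frame is assumed):

```python
import math

def positional_relationship(model_pos, device_pos):
    """Distance and horizontal relative angle (degrees) between the
    augmented reality virtual model and the electronic device, both
    given as (x, y, z) in the scene's spatial coordinate system."""
    dx = model_pos[0] - device_pos[0]
    dy = model_pos[1] - device_pos[1]
    dz = model_pos[2] - device_pos[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    angle = math.degrees(math.atan2(dx, dz))  # bearing in the ground plane
    return distance, angle
```

A model three units straight ahead of the device gives distance 3 and relative angle 0; a model three units to the right gives distance 3 and relative angle 90.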
  • Step S104 adjusting the augmented reality virtual model according to the positional relationship.
  • The positional relationship may include the distance between the augmented reality virtual model and the electronic device. In the prior art, the virtual model often does not change with its distance from the electronic device, which reduces realism. Therefore, this embodiment may change the size of the virtual model according to the distance between the augmented reality virtual model and the electronic device; that is, the step of adjusting the augmented reality virtual model according to the positional relationship may include:
  • the size of the augmented reality virtual model is adjusted according to the distance between the augmented reality virtual model and the electronic device.
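A simple way to realize this size adjustment (a sketch, not the claimed implementation) is to scale the model inversely with distance, so that it keeps a chosen base size at a reference distance and shrinks as the device moves away:

```python
def scaled_size(base_size, reference_distance, distance):
    """Scale for the virtual model: base_size at the reference distance,
    half that size at twice the distance, and so on. The small floor
    avoids division by zero right at the camera position."""
    return base_size * reference_distance / max(distance, 1e-6)
```

For example, a model sized 1.0 at 2 m would be drawn at 0.5 when viewed from 4 m, matching how real objects shrink with distance.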
  • the positional relationship may further include a relative angle between the augmented reality virtual model and the electronic device.
  • Considering that the virtual model is presented differently at different positions relative to the electronic device, the orientation of the virtual model can also be adjusted. For a cartoon character model, for example, the virtual character can be made to always face the electronic device, improving both fun and realism. That is, the step of adjusting the augmented reality virtual model according to the positional relationship may further include:
  • the orientation of the augmented reality virtual model is adjusted according to the relative angle between the augmented reality virtual model and the electronic device.
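For the cartoon-character example, the orientation adjustment amounts to a "billboard" rotation about the vertical axis so the character always faces the device. A hedged sketch in the same y-up frame (function name is illustrative):

```python
import math

def facing_yaw(model_pos, device_pos):
    """Yaw angle (degrees about the vertical axis) that turns the
    virtual model at model_pos to face the electronic device."""
    dx = device_pos[0] - model_pos[0]
    dz = device_pos[2] - model_pos[2]
    return math.degrees(math.atan2(dx, dz))
```

Re-evaluating this yaw each frame as the relative angle changes keeps the character turned toward the user.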
  • The electronic device may be any device having a shooting function, such as a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile internet device (MID), or a wearable device.
  • The embodiment of the present application acquires an image corresponding to the current scene of the electronic device, establishes a spatial coordinate system according to the image, adds an augmented reality virtual model to the scene, obtains the positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system, and adjusts the augmented reality virtual model according to the positional relationship.
  • the application can fuse the augmented reality virtual model with the real scene, and adjust the augmented reality virtual model according to the positional relationship, thereby greatly improving the accuracy of the fusion.
  • FIG. 3 is a schematic flowchart of another method for adjusting an augmented reality model according to an embodiment of the present application, including the following steps:
  • Step S201 Acquire an image corresponding to the current scene of the electronic device, and establish a spatial coordinate system according to the image.
  • An image of the current scene may be captured by the camera to acquire a scene image.
  • The above scene may include a person, an animal, or a landscape that the user wants to capture.
  • a spatial coordinate system can be established in the three-dimensional model, for example, using the Marker-less technique to determine the initial coordinate plane, and initializing the coordinate system with the true scale.
  • Step S202 Acquire a target object in the scene, and generate a corresponding augmented reality virtual model according to the target object.
  • The target object in the above scenario may be automatically selected by the electronic device or manually selected by the user. For example, the user may tap an object in the current scene within the viewfinder frame to determine the target object.
  • an augmented reality virtual model corresponding to the target object is generated, wherein the augmented reality virtual model may include a plurality of different types of virtual models, such as text information, image information, and/or audio information.
  • Step S203 extracting coordinate information of the target object in the space coordinate system.
  • Step S204 adding an augmented reality virtual model according to the coordinate information.
  • According to the different types of the augmented reality virtual model, virtual models of one type may be combined with the current scene in sequence, and the corresponding virtual models of other types may be superimposed and displayed together with them to generate the augmented reality scene.
  • the virtual model can be superimposed in the real scene in the form of text and projected onto the display screen of the augmented reality device for display.
  • The image information of the virtual model may be superimposed on the real scene in combination with the audio information and projected on the display screen of the augmented reality device for display, with the audio information played through the speaker of the augmented reality device. The image information can be a virtual three-dimensional image fitted to the real scene for seamless display, and the text information is displayed semi-transparently next to the junction of the virtual three-dimensional image and the real scene as a related description.
  • the virtual model content is projected onto the display screen of the augmented reality device by means of text information, image information and audio information.
  • Step S205 Acquire a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system.
  • The coordinate information of the augmented reality virtual model may be directly acquired in the spatial coordinate system of the current scene. Further, since the body of the electronic device does not exist in the current scene, acquiring the coordinate information of the electronic device requires calculation; for example, the shooting posture information of the electronic device is acquired from the image capturing the current scene, where the shooting posture information may include shooting height and direction information, and the coordinate information of the electronic device in the coordinate system of the current scene is then calculated according to the shooting posture information.
  • Step S206 adjusting the augmented reality virtual model according to the positional relationship.
  • the location relationship may include a distance between the augmented reality virtual model and the electronic device, so the embodiment may change the size of the virtual model according to the distance between the augmented reality virtual model and the electronic device.
  • The positional relationship may further include the relative angle between the augmented reality virtual model and the electronic device. Therefore, this embodiment may adjust the orientation of the augmented reality virtual model according to the relative angle between the augmented reality virtual model and the electronic device.
  • Step S207 deeply integrating the adjusted augmented reality virtual model with the scene.
  • This embodiment may further obtain real-scene depth data using a depth camera, and then perform depth fusion with the virtual object distance data obtained from the three-dimensional engine, thereby achieving an occlusion effect so that real objects and the virtual model can interact directly.
  • This makes the application robust and enables real objects and virtual data to occlude one another interactively.
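The occlusion effect described above can be sketched as a per-pixel depth test: wherever the depth camera reports a real surface nearer than the virtual object's distance from the three-dimensional engine, the real pixel wins. This is an illustrative simplification on flat pixel lists, not the embodiment's renderer:

```python
def composite_with_occlusion(real_depth, virtual_depth, virtual_rgb, real_rgb):
    """Per-pixel depth fusion: draw the virtual model only where it is
    nearer than the real surface measured by the depth camera, so real
    objects can occlude it. A virtual_depth entry of None means no
    virtual content covers that pixel."""
    out = []
    for rd, vd, vc, rc in zip(real_depth, virtual_depth, virtual_rgb, real_rgb):
        if vd is not None and vd < rd:
            out.append(vc)   # virtual model in front of the real surface
        else:
            out.append(rc)   # real object occludes, or no virtual pixel
    return out
```

With a virtual object at depth 2 and real surfaces at depths 1 and 3, the first pixel stays real (occluded) and the second shows the model, which is exactly the occlusion interaction the depth fusion provides.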
  • Step S208 displaying the augmented reality picture after the depth fusion.
  • The electronic device may be any device capable of shooting, such as a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile internet device (MID), or a wearable device.
  • The embodiment of the present application acquires an image corresponding to the current scene of the electronic device and establishes a spatial coordinate system according to the image, acquires a target object in the scene and generates a corresponding augmented reality virtual model according to the target object, extracts the coordinate information of the target object in the spatial coordinate system, adds the augmented reality virtual model according to the coordinate information, obtains the positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system, adjusts the augmented reality virtual model according to the positional relationship, deeply fuses the adjusted augmented reality virtual model with the scene, and displays the augmented reality picture after the depth fusion.
  • the embodiment of the present application can fuse the augmented reality virtual model with the real scene, and adjust the augmented reality virtual model according to the positional relationship, thereby greatly improving the accuracy of the fusion.
  • Based on the above method for adjusting an augmented reality model, the embodiment of the present application further provides an apparatus for adjusting an augmented reality model.
  • The meanings of the terms are the same as those in the above method for adjusting an augmented reality model.
  • FIG. 4 is a schematic structural diagram of an apparatus for adjusting an augmented reality model according to an embodiment of the present disclosure.
  • The apparatus for adjusting the augmented reality model includes: an image acquiring module 301, an adding module 302, a location acquiring module 303, and an adjusting module 304;
  • the image obtaining module 301 is configured to acquire an image corresponding to a scene where the electronic device is currently located, and establish a spatial coordinate system according to the image;
  • the adding module 302 is configured to add an augmented reality virtual model to the scene
  • the location obtaining module 303 is configured to acquire a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system;
  • the adjustment module 304 is configured to adjust the augmented reality virtual model according to the positional relationship.
  • the positional relationship includes a distance
  • the adjusting module is specifically configured to adjust a size of the augmented reality virtual model according to a distance between the augmented reality virtual model and the electronic device.
  • the positional relationship includes a relative angle
  • the adjusting module is specifically configured to adjust an orientation of the augmented reality virtual model according to a relative angle between the augmented reality virtual model and the electronic device.
  • the adding module 302 may specifically include: an obtaining submodule 3021, an extracting submodule 3022, and an adding submodule 3023;
  • the acquiring sub-module 3021 is configured to acquire a target object in the scene, and generate a corresponding augmented reality virtual model according to the target object;
  • the extraction submodule 3022 is configured to extract coordinate information of the target object in a spatial coordinate system
  • the adding submodule 3023 is configured to add an augmented reality virtual model according to the coordinate information.
  • the augmented reality model adjustment device 30 may further include: a fusion module 305 and a display module 306;
  • the fusion module 305 is configured to deeply integrate the adjusted augmented reality virtual model with the scene;
  • the display module 306 is configured to display the augmented reality picture after the depth fusion.
  • The adjusting device 30 of the augmented reality model provided by the embodiment of the present application can acquire an image corresponding to the current scene of the electronic device, establish a spatial coordinate system according to the image, add an augmented reality virtual model to the scene, obtain the positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system, and adjust the augmented reality virtual model according to the positional relationship.
  • the application can fuse the augmented reality virtual model with the real scene, and adjust the augmented reality virtual model according to the positional relationship, thereby greatly improving the accuracy of the fusion.
  • The application further provides a storage medium on which a computer program is stored; when the computer program is executed by a processor, the method for adjusting an augmented reality model provided by the method embodiments is implemented.
  • the application further provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor loads instructions in the memory to perform the following steps:
  • acquiring an image corresponding to a scene where the electronic device is currently located, and establishing a spatial coordinate system according to the image;
  • adding an augmented reality virtual model to the scene;
  • acquiring a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system; and
  • the augmented reality virtual model is adjusted according to the positional relationship.
  • the positional relationship includes a distance, and the processor is configured to perform the following step:
  • adjusting the size of the augmented reality virtual model according to the distance between the augmented reality virtual model and the electronic device.
  • the positional relationship includes a relative angle, and the processor is configured to perform the following step:
  • adjusting the orientation of the augmented reality virtual model according to the relative angle between the augmented reality virtual model and the electronic device.
  • when the augmented reality virtual model is added to the scene, the processor is configured to perform the following steps:
  • acquiring a target object in the scene, and generating a corresponding augmented reality virtual model according to the target object;
  • extracting coordinate information of the target object in the spatial coordinate system; and
  • adding the augmented reality virtual model according to the coordinate information.
  • the processor is further configured to perform the following steps:
  • deeply fusing the adjusted augmented reality virtual model with the scene; and
  • the augmented reality picture after the depth fusion is displayed.
  • when the spatial coordinate system is established according to the image, the processor is configured to perform the following steps:
  • establishing a three-dimensional model of the current scene according to the image; and
  • a spatial coordinate system is established in the three-dimensional model.
  • when acquiring the positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system, the processor is configured to perform the following step:
  • acquiring coordinate information of the augmented reality virtual model and of the electronic device in the spatial coordinate system, and determining the positional relationship according to the coordinate information.
  • an electronic device is further provided, and the electronic device may be a device such as a smart phone or a tablet computer.
  • the electronic device 400 includes a processor 401 and a memory 402.
  • the processor 401 is electrically connected to the memory 402.
  • The processor 401 is the control center of the electronic device 400; it connects the various parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or loading applications stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the electronic device as a whole.
  • In this embodiment, the processor 401 in the electronic device 400 loads the instructions corresponding to the processes of one or more applications into the memory 402, and runs the applications stored in the memory 402, thereby implementing various functions and performing the following steps:
  • acquiring an image corresponding to a scene where the electronic device is currently located, and establishing a spatial coordinate system according to the image;
  • adding an augmented reality virtual model to the scene;
  • acquiring a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system; and
  • the augmented reality virtual model is adjusted according to the positional relationship.
  • FIG. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
  • The electronic device 500 can include a radio frequency (RF) circuit 501, a memory 502 including one or more computer-readable storage media, an input unit 503, a display unit 504, a sensor 505, an audio circuit 506, a wireless fidelity (WiFi) module 507, a processor 508 including one or more processing cores, a power supply 509, and the like.
  • The structure shown in FIG. 7 does not constitute a limitation on the electronic device, which may include more or fewer components than illustrated, a combination of certain components, or a different arrangement of components.
  • The radio frequency circuit 501 can be used for transmitting and receiving information, or for receiving and transmitting signals during a call. Specifically, after downlink information from a base station is received, it is handed over to one or more processors 508 for processing; in addition, uplink data is sent to the base station.
  • The radio frequency circuit 501 includes, but is not limited to, an antenna, at least one amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a low noise amplifier (LNA), a duplexer, and the like.
  • The wireless communication can use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), e-mail, Short Messaging Service (SMS), and the like.
  • Memory 502 can be used to store applications and data.
  • the applications stored in the memory 502 contain executable code and can form various functional modules.
  • the processor 508 executes various functional applications and data processing by running an application stored in the memory 502.
  • the memory 502 may mainly include a storage program area and a storage data area. The storage program area may store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data created according to use of the electronic device (such as audio data, a phone book, etc.).
  • memory 502 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid state storage device. Accordingly, memory 502 may also include a memory controller to provide the processor 508 and the input unit 503 with access to the memory 502.
  • the input unit 503 can be configured to receive input numbers, characters, or user characteristic information (such as fingerprints), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
  • input unit 503 can include a touch-sensitive surface as well as other input devices.
  • Touch-sensitive surfaces, also known as touch screens or touch pads, collect the user's touch operations on or near them (such as operations performed by the user on or near the touch-sensitive surface with a finger, a stylus, or any other suitable object or accessory), and drive the corresponding connection device according to a preset program.
  • the touch sensitive surface may include two parts of a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends the coordinates to the processor 508; it can also receive commands sent by the processor 508 and execute them.
  • Display unit 504 can be used to display information entered by the user or information provided to the user, as well as various graphical user interfaces of the electronic device, which can be composed of graphics, text, icons, video, and any combination thereof.
  • the display unit 504 can include a display panel.
  • the display panel can be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
  • the touch-sensitive surface can cover the display panel. When the touch-sensitive surface detects a touch operation on or near it, the operation is transmitted to the processor 508 to determine the type of the touch event, and the processor 508 then provides a corresponding visual output on the display panel according to the type of the touch event.
  • although the touch-sensitive surface and the display panel are implemented as two separate components to perform the input and output functions, in some embodiments the touch-sensitive surface can be integrated with the display panel to implement the input and output functions.
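The touch pipeline described above (detection device → touch controller → contact coordinates → processor → visual output) can be sketched as follows. All class, method, and handler names here are hypothetical illustrations, not part of the disclosed device.

```python
# Illustrative sketch of the touch pipeline: the touch controller converts a
# raw (normalized) touch signal into pixel contact coordinates, and the
# processor dispatches a visual-output handler based on the touch event type.
# Every name below is an assumption made for illustration.

class TouchController:
    def __init__(self, screen_width_px, screen_height_px):
        self.width = screen_width_px
        self.height = screen_height_px

    def to_contact_coordinates(self, raw_signal):
        """Convert a normalized raw touch signal (0.0-1.0 per axis) to pixels."""
        nx, ny = raw_signal
        return (round(nx * self.width), round(ny * self.height))

class Processor:
    def __init__(self):
        # Map touch event types to handlers producing a visual output.
        self.handlers = {
            "tap": lambda xy: f"highlight at {xy}",
            "long_press": lambda xy: f"context menu at {xy}",
        }

    def dispatch(self, event_type, coordinates):
        handler = self.handlers.get(event_type)
        return handler(coordinates) if handler else "ignored"

controller = TouchController(1080, 1920)
processor = Processor()
xy = controller.to_contact_coordinates((0.5, 0.25))
print(xy)                             # (540, 480)
print(processor.dispatch("tap", xy))  # highlight at (540, 480)
```

The split mirrors the two-part structure in the text: coordinate conversion stays in the controller, while event interpretation and visual output belong to the processor.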
  • the electronic device can also include at least one type of sensor 505, such as a light sensor, motion sensor, and other sensors.
  • the light sensor may include an ambient light sensor and a proximity sensor. The ambient light sensor may adjust the brightness of the display panel according to the brightness of the ambient light, and the proximity sensor may turn off the display panel and/or the backlight when the electronic device moves to the ear.
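As a rough illustration of the light-sensor behaviour just described, the sketch below maps an ambient-light reading to a backlight level and lets a proximity reading switch the panel off. The lux scale, dim floor, and function name are assumptions for demonstration, not values from the disclosure.

```python
# Hypothetical sketch: panel brightness tracks ambient brightness, and the
# proximity sensor turns the panel off when the device is held to the ear.
# The 1000-lux full-brightness point and the 0.1 dim floor are assumptions.

def panel_brightness(ambient_lux, proximity_near, max_lux=1000.0):
    """Return a backlight level in [0.0, 1.0]; 0.0 means the panel is off."""
    if proximity_near:  # device held to the ear: panel and backlight off
        return 0.0
    level = ambient_lux / max_lux
    return min(max(level, 0.1), 1.0)  # clamp between a dim floor and full brightness

print(panel_brightness(500.0, False))  # 0.5
print(panel_brightness(5.0, False))    # 0.1 (dim floor)
print(panel_brightness(800.0, True))   # 0.0 (proximity: panel off)
```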
  • as a motion sensor, the gravity acceleration sensor can detect the magnitude of acceleration in all directions (usually three axes) and, when stationary, can detect the magnitude and direction of gravity. It can be used in applications that recognize the posture of the electronic device (such as switching between landscape and portrait screens, related games, and magnetometer attitude calibration) and in vibration-recognition-related functions (such as a pedometer or tapping). The electronic device can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described herein again.
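The landscape/portrait recognition mentioned above can be approximated from a stationary three-axis gravity reading. The sketch below is a simplified assumption (fixed thresholds, no hysteresis or smoothing), not the patented method.

```python
# Hypothetical sketch of orientation recognition from a stationary
# accelerometer reading: when the device is at rest, the sensor measures
# gravity, so the dominant axis indicates how the device is held.
# Axis convention and the 0.9*g threshold are illustrative assumptions.

def orientation_from_gravity(ax, ay, az, g=9.81):
    """Classify device posture from three-axis acceleration (m/s^2)."""
    if abs(az) > 0.9 * g:
        return "flat"  # gravity mostly along the screen normal
    # Otherwise compare the magnitudes of the two in-plane components.
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(orientation_from_gravity(0.1, 9.7, 0.4))  # portrait
print(orientation_from_gravity(9.6, 0.2, 0.5))  # landscape
print(orientation_from_gravity(0.0, 0.1, 9.8))  # flat
```

A real implementation would also smooth readings over time and add hysteresis so that the screen does not flicker between orientations near the 45° boundary.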
  • the audio circuit 506 can provide an audio interface between the user and the electronic device through a speaker and a microphone.
  • the audio circuit 506 can convert the received audio data into an electrical signal, which is transmitted to a speaker, and converted into a sound signal output by the speaker.
  • on the other hand, the microphone converts the collected sound signal into an electrical signal, which the audio circuit 506 receives and converts into audio data; after the audio data is processed by the processor 508, it is transmitted via the RF circuit 501 to, for example, another electronic device, or output to the memory 502 for further processing.
  • the audio circuit 506 may also include an earbud jack to provide communication of the peripheral earphones with the electronic device.
  • Wireless Fidelity is a short-range wireless transmission technology.
  • through the wireless fidelity module 507, the electronic device can help users send and receive e-mail, browse web pages, and access streaming media; the module provides users with wireless broadband Internet access.
  • although FIG. 7 shows the wireless fidelity module 507, it can be understood that the module is not an essential part of the electronic device and may be omitted as needed without changing the essence of the invention.
  • the processor 508 is the control center of the electronic device. It connects the various parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the applications stored in the memory 502 and calling the data stored in the memory 502, thereby monitoring the electronic device as a whole.
  • the processor 508 may include one or more processing cores; preferably, the processor 508 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
  • the modem processor primarily handles wireless communications. It will be appreciated that the above described modem processor may also not be integrated into the processor 508.
  • the electronic device also includes a power source 509 (such as a battery) that supplies power to the various components.
  • the power source can be logically coupled to the processor 508 through a power management system, so that functions such as charging, discharging, and power consumption management are handled through the power management system.
  • the power supply 509 may also include any one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
  • the electronic device may further include a camera, a Bluetooth module, and the like, and details are not described herein again.
  • the foregoing modules may each be implemented as a separate entity, or may be combined arbitrarily and implemented as one or several entities.
  • for the specific implementation of the foregoing modules, reference may be made to the foregoing method embodiments; details are not described herein again.
  • the storage medium may include: a read only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
  • the functional modules may be integrated into one processing chip, or each module may exist physically separately. Two or more modules can be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the principles and implementations of the present application are described herein using specific examples; the description of the above embodiments is only intended to help understand the method of the present application and its core ideas. Meanwhile, those skilled in the art may, according to the ideas of the present application, make changes to the specific implementations and the scope of application. In summary, the contents of this specification should not be construed as limiting the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Disclosed is a method for adjusting an augmented reality model, comprising: obtaining an image corresponding to a scene of an electronic device, and creating a spatial coordinate system according to the image; adding an augmented reality virtual model to the scene; obtaining a positional relationship between the augmented reality virtual model and the electronic device according to the spatial coordinate system; and adjusting the augmented reality virtual model according to the positional relationship. Also disclosed are an apparatus for adjusting an augmented reality model, a storage medium, and an electronic device.
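The four claimed steps can be sketched end-to-end as follows. The distance-based scaling rule used for the adjustment step is an illustrative assumption; the claim only requires that the model be adjusted according to the positional relationship, without binding it to this formula.

```python
import math

# Illustrative sketch of the claimed flow: place an AR virtual model in a
# spatial coordinate system, compute its positional relationship to the
# electronic device, and adjust the model accordingly. The class and
# function names, and the scaling rule, are assumptions for demonstration.

class ARModel:
    def __init__(self, position, scale=1.0):
        self.position = position  # (x, y, z) in the scene's spatial coordinates
        self.scale = scale

def positional_relationship(device_pos, model):
    """Device-to-model distance in the spatial coordinate system."""
    return math.dist(device_pos, model.position)

def adjust_model(model, device_pos, reference_distance=1.0):
    """Shrink the model as the device moves away, enlarge it as it approaches."""
    distance = positional_relationship(device_pos, model)
    model.scale = reference_distance / max(distance, 1e-6)
    return model.scale

model = ARModel(position=(0.0, 0.0, 2.0))
print(adjust_model(model, device_pos=(0.0, 0.0, 0.0)))  # 0.5: model is 2 m away
```

In practice the spatial coordinate system would come from a tracking framework that builds it from the camera image, and the positional relationship could also drive occlusion or orientation changes rather than scale alone.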
PCT/CN2019/079588 2018-03-26 2019-03-25 Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device WO2019184889A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810252914.5A 2018-03-26 2018-03-26 Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device
CN201810252914.5 2018-03-26

Publications (1)

Publication Number Publication Date
WO2019184889A1 true WO2019184889A1 (fr) 2019-10-03

Family

ID=63484794

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/079588 WO2019184889A1 (fr) 2018-03-26 2019-03-25 Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device

Country Status (2)

Country Link
CN (1) CN108537889A (fr)
WO (1) WO2019184889A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112911356A (zh) * 2020-05-29 2021-06-04 腾讯科技(深圳)有限公司 Virtual reality (VR) video playback method and related device

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108537889A (zh) * 2018-03-26 2018-09-14 广东欧珀移动通信有限公司 Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device
CN109345560B (zh) * 2018-09-20 2021-02-19 网易(杭州)网络有限公司 Motion tracking accuracy testing method and apparatus for an augmented reality device
CN110070600B (zh) * 2018-11-02 2023-09-15 北京微播视界科技有限公司 Three-dimensional model generation method, apparatus, and hardware device
CN111325798B (zh) * 2018-12-13 2023-08-18 浙江宇视科技有限公司 Camera model correction method and apparatus, AR implementation device, and readable storage medium
CN111464577B (zh) * 2019-01-21 2022-05-27 阿里巴巴集团控股有限公司 Device control method and apparatus
CN109981983B (zh) * 2019-03-26 2021-04-23 Oppo广东移动通信有限公司 Augmented reality image processing method and apparatus, electronic device, and storage medium
CN109961523B (zh) * 2019-04-18 2023-07-04 广州市百果园信息技术有限公司 Virtual target update method, apparatus, ***, device, and storage medium
CN110111428B (zh) * 2019-05-28 2023-06-20 艾瑞迈迪科技石家庄有限公司 Virtual target calibration method and apparatus applied to augmented reality
CN110232744A (zh) * 2019-06-18 2019-09-13 北京华捷艾米科技有限公司 Display method and apparatus
CN110456957B (zh) * 2019-08-09 2022-05-03 北京字节跳动网络技术有限公司 Display interaction method, apparatus, device, and storage medium
CN110706300A (zh) * 2019-09-19 2020-01-17 腾讯科技(深圳)有限公司 Virtual image generation method and apparatus
CN111739169A (zh) * 2019-10-31 2020-10-02 北京京东尚科信息技术有限公司 Augmented-reality-based product display method, ***, medium, and electronic device
CN111077999B (zh) * 2019-11-14 2021-08-13 联想(北京)有限公司 Information processing method, device, and ***
CN112807683B (zh) * 2019-11-18 2024-01-16 深圳云天励飞技术有限公司 Game character adjustment method and related apparatus
CN111026276A (zh) * 2019-12-12 2020-04-17 Oppo(重庆)智能科技有限公司 Visual assistance method and related product
CN111127661B (zh) * 2019-12-17 2023-08-29 北京超图软件股份有限公司 Data processing method and apparatus, and electronic device
WO2021197016A1 (fr) * 2020-04-01 2021-10-07 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Système et procédé permettant d'améliorer des sujets dans des vidéos
CN111652979A (zh) * 2020-05-06 2020-09-11 福建工程学院 AR implementation method and ***
CN111580658B (zh) * 2020-05-09 2024-04-26 维沃移动通信有限公司 AR-based conference method and apparatus, and electronic device
CN111651047B (zh) * 2020-06-05 2023-09-19 浙江商汤科技开发有限公司 Virtual object display method and apparatus, electronic device, and storage medium
CN114078102A (zh) * 2020-08-11 2022-02-22 北京芯海视界三维科技有限公司 Image processing apparatus and virtual reality device
CN112422812B (zh) * 2020-09-01 2022-03-29 华为技术有限公司 Image processing method, mobile terminal, and storage medium
CN115248501B (zh) * 2021-04-27 2023-11-21 广州视享科技有限公司 Display method and apparatus for an augmented reality device, and augmented reality device
CN113838201B (zh) * 2021-09-23 2022-06-07 北京百度网讯科技有限公司 Model adaptation method and apparatus, electronic device, and readable storage medium
CN114418918B (zh) * 2022-03-15 2022-07-05 深圳市北海轨道交通技术有限公司 Augmented-reality-based intelligent assisted station announcement method and ***

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140207434A1 (en) * 2013-01-21 2014-07-24 GM Global Technology Operations LLC Virtual model merging systems and methods
CN106548519A (zh) * 2016-11-04 2017-03-29 上海玄彩美科网络科技有限公司 Realistic augmented reality method based on ORB-SLAM and a depth camera
CN107393017A (zh) * 2017-08-11 2017-11-24 北京铂石空间科技有限公司 Image processing method and apparatus, electronic device, and storage medium
CN108537889A (zh) * 2018-03-26 2018-09-14 广东欧珀移动通信有限公司 Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102147658B (zh) * 2011-02-12 2013-01-09 华为终端有限公司 Method for implementing augmented reality interaction, augmented reality interaction apparatus, and mobile terminal
CN103035003B (zh) * 2012-12-11 2015-09-09 华为技术有限公司 Method and apparatus for implementing augmented reality
CN104539925B (zh) * 2014-12-15 2016-10-05 北京邮电大学 Depth-information-based method and *** for augmented reality in three-dimensional scenes
CN105005970B (zh) * 2015-06-26 2018-02-16 广东欧珀移动通信有限公司 Augmented reality implementation method and apparatus
CN106200914B (zh) * 2016-06-28 2019-04-05 Oppo广东移动通信有限公司 Augmented reality triggering method and apparatus, and photographing device
CN106896940B (zh) * 2017-02-28 2020-01-07 杭州乐见科技有限公司 Virtual item presentation effect control method and apparatus
CN106909223B (zh) * 2017-02-28 2020-07-10 杭州乐见科技有限公司 3D-scene-based camera orientation correction method and apparatus


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112911356A (zh) * 2020-05-29 2021-06-04 腾讯科技(深圳)有限公司 Virtual reality (VR) video playback method and related device
CN112911356B (zh) * 2020-05-29 2022-04-05 腾讯科技(深圳)有限公司 Virtual reality (VR) video playback method and related device

Also Published As

Publication number Publication date
CN108537889A (zh) 2018-09-14

Similar Documents

Publication Publication Date Title
WO2019184889A1 (fr) Method and apparatus for adjusting an augmented reality model, storage medium, and electronic device
US11605214B2 (en) Method, device and storage medium for determining camera posture information
US10055879B2 (en) 3D human face reconstruction method, apparatus and server
CN111417028B (zh) Information processing method and apparatus, storage medium, and electronic device
WO2018171429A1 (fr) Image stitching method, device, terminal, and storage medium
CN110059685B (zh) Text region detection method, apparatus, and storage medium
WO2019233229A1 (fr) Image fusion method, device, and storage medium
JP2021524957A (ja) Image processing method and apparatus, terminal, and computer program
US9760998B2 (en) Video processing method and apparatus
EP3561667B1 (fr) Method for displaying a 2D application in a VR device, and terminal
EP3748533A1 (fr) Method, apparatus, and storage medium for obtaining object information
WO2019029379A1 (fr) Interaction object control method and device, terminal, and computer-readable storage medium
KR102242324B1 (ko) Method for illuminating a virtual environment with camera light data
CN110717964B (zh) Scene modeling method, terminal, and readable storage medium
WO2019196871A1 (fr) Modeling method and related device
WO2019071562A1 (fr) Data processing method and terminal
US20230326147A1 (en) Helper data for anchors in augmented reality
WO2021078182A1 (fr) Playback method and system
CN108829600B (zh) Algorithm library testing method and apparatus, storage medium, and electronic device
KR102084161B1 (ko) Electronic device for correcting an image and control method therefor
CN113780291A (zh) Image processing method and apparatus, electronic device, and storage medium
CN114093020A (zh) Motion capture method and apparatus, electronic device, and storage medium
CN113849142B (zh) Image display method and apparatus, electronic device, and computer-readable storage medium
KR102069228B1 (ko) Image processing method and apparatus for painterly expression of an image
CN108959073B (zh) Algorithm library testing method and apparatus, storage medium, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19775567

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19775567

Country of ref document: EP

Kind code of ref document: A1