CN111913645B - Three-dimensional image display method and device, electronic equipment and storage medium - Google Patents


Info

Publication number: CN111913645B (application CN202010827340.7A)
Authority: CN (China)
Prior art keywords: dimensional image, dimensional space, parameter, determining
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN111913645A (en)
Inventors: 黄仲华, 周成富
Current and original assignee: Guangdong Schen Industrial Investment Co., Ltd. (the listed assignee may be inaccurate)
Priority to CN202010827340.7A; publication of application CN111913645A, followed by grant and publication of CN111913645B

Classifications

    • G06F 3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04845 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06Q 30/0643 — Electronic shopping [e-shopping]: shopping interfaces; graphical representation of items or shoppers
    • G06T 15/00 — 3D [Three Dimensional] image rendering
    • H04M 2250/12 — Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Human Computer Interaction (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Development Economics (AREA)
  • Computer Graphics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a three-dimensional image display method and device, an electronic device, and a storage medium. The specific implementation scheme is as follows: in response to a user touching and/or sliding on a three-dimensional image displayed by a mobile terminal, acquire the touch position change parameters of the touch and/or slide on the three-dimensional image; in response to the user moving and/or rotating the mobile terminal, acquire the position change parameters and/or posture change parameters of the mobile terminal itself in three-dimensional space; according to the touch position change parameters and the mobile terminal's own position change and/or posture change parameters in three-dimensional space, simulate the forces acting on the three-dimensional image in the three-dimensional space, so as to determine the visual angle adjustment parameters and/or viewpoint adjustment parameters of the three-dimensional image in the three-dimensional space; and adjust the visual angle and/or viewpoint of the three-dimensional image according to those adjustment parameters. Embodiments of the application allow the three-dimensional image to be presented more realistically.

Description

Three-dimensional image display method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of three-dimensional display technologies, and in particular, to a method and an apparatus for displaying a three-dimensional image, an electronic device, and a storage medium.
Background
As technology develops and shopping habits change, people increasingly tend to shop online. When a merchant lists a product on a shopping website, the merchant photographs the product and adds the pictures, together with a product description, to the product page.
Typically, the display page provides a three-dimensional image of the product, which the user can view in a client on a mobile phone. During viewing, the three-dimensional image changes its display visual angle or viewpoint as the user touches or slides on the interface. When the touch or slide stops, the three-dimensional image displayed on the interface also stops moving or rotating, but the stop is rather abrupt.
Disclosure of Invention
The application provides a three-dimensional image display method and device, an electronic device and a storage medium, so as to solve or alleviate one or more technical problems in the prior art.
According to an aspect of the present application, there is provided a method of three-dimensional image presentation, performed by an electronic device having a display, comprising:
in response to a user touching and/or sliding on a three-dimensional image displayed by the electronic device, acquiring a touch position change parameter of the user's touch and/or slide on the three-dimensional image;
in response to the user moving and/or rotating the electronic device, acquiring a self position change parameter and/or a posture change parameter of the electronic device in a three-dimensional space;
simulating the forces acting on the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic device and the self position change parameter and/or the posture change parameter of the electronic device in the three-dimensional space, so as to determine a visual angle adjustment parameter and/or a viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space; and
adjusting the visual angle and/or the viewpoint of the three-dimensional image according to the visual angle adjustment parameter and/or the viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space.
In one embodiment, determining the viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space includes:
determining a thrust on the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic device and the position change parameter of the electronic device in the three-dimensional space;
determining a moving speed and a displacement of the three-dimensional image in the three-dimensional space according to the thrust on the three-dimensional image in the three-dimensional space and a preset inertial resistance;
and determining the viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space according to the moving speed and the displacement of the three-dimensional image in the three-dimensional space.
In one embodiment, the touch position change parameter includes a current touch position coordinate and a starting touch position coordinate of the user's touch and/or slide on the three-dimensional image displayed by the electronic device, the self position change parameter includes a reference point position coordinate and a current position coordinate of the electronic device, and determining the thrust on the three-dimensional image in the three-dimensional space includes:
determining a first thrust on the three-dimensional image in the three-dimensional space and an acting time of the first thrust according to the starting touch position coordinate, the current touch position coordinate and the touch time of the user's touch and/or slide on the three-dimensional image displayed by the electronic device; and
determining a second thrust on the three-dimensional image in the three-dimensional space and an acting time of the second thrust according to the reference point position coordinate and the current position coordinate of the electronic device and the time of the position change.
In one embodiment, determining the moving speed and the displacement of the three-dimensional image in the three-dimensional space according to the thrust on the three-dimensional image in the three-dimensional space and the preset inertial resistance includes:
determining a weight of the three-dimensional image and the inertial resistance it is subjected to when pushed; and
determining the speed and the position at which the three-dimensional image moves in the three-dimensional space according to the first thrust and its acting time on the three-dimensional image, the second thrust and its acting time on the three-dimensional image, the inertial resistance on the three-dimensional image, and the weight of the three-dimensional image.
In one embodiment, determining the visual angle adjustment parameter of the three-dimensional image in the three-dimensional space includes:
determining a deflection force on the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user's operation on the electronic device and the posture change parameter of the electronic device in the three-dimensional space;
determining a deflection speed and a deflection angle of the three-dimensional image in the three-dimensional space according to the deflection force on the three-dimensional image in the three-dimensional space and a preset air resistance;
and determining the visual angle adjustment parameter of the three-dimensional image in the three-dimensional space according to the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space.
In one embodiment, the touch position change parameter includes a current touch position coordinate and a starting touch position coordinate of the user's touch and/or slide on the three-dimensional image displayed by the electronic device, the posture change parameter includes a reference point deflection angle and a current deflection angle of the electronic device, and determining the deflection force on the three-dimensional image in the three-dimensional space includes:
determining a first deflection force on the three-dimensional image in the three-dimensional space and an acting time of the first deflection force according to the starting touch position coordinate, the current touch position coordinate and the touch time of the user's touch and/or slide on the three-dimensional image displayed by the electronic device;
and determining a second deflection force on the three-dimensional image in the three-dimensional space and an acting time of the second deflection force according to the reference point deflection angle and the current deflection angle of the electronic device.
In one embodiment, determining the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space includes:
determining a weight of the three-dimensional image and the air resistance it is subjected to under a deflection force; and
determining the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space according to the first deflection force and its acting time, the second deflection force and its acting time, the air resistance on the three-dimensional image, and the weight of the three-dimensional image.
According to an aspect of the present application, there is provided a device for three-dimensional image display, implemented in an electronic device having a display, comprising:
a first parameter acquisition module, configured to acquire, in response to a user touching and/or sliding on a three-dimensional image displayed by the electronic device, a touch position change parameter of the user's touch and/or slide on the three-dimensional image;
a second parameter acquisition module, configured to acquire, in response to the user moving and/or rotating the electronic device, a self position change parameter and/or a posture change parameter of the electronic device in a three-dimensional space;
a third parameter determination module, configured to simulate the forces acting on the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic device and the self position change parameter and/or the posture change parameter of the electronic device in the three-dimensional space, so as to determine a visual angle adjustment parameter and/or a viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space; and
an adjustment module, configured to adjust the visual angle and/or the viewpoint of the three-dimensional image according to the visual angle adjustment parameter and/or the viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space.
According to an aspect of the present application, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform a method provided by any embodiment of the present application.
According to an aspect of the present application, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform a method provided by an embodiment of the present application.
In the embodiments of the application, the touch position changes obtained from the user's touch and/or slide operations, together with the position changes and posture changes of the mobile terminal itself, are used to simulate the forces acting on the three-dimensional image, and the resulting changes of the visual angle and viewpoint of the three-dimensional image in three-dimensional space are determined under those forces. The visual angle and viewpoint of the three-dimensional image can then be adjusted according to those changes. Under such forces, the three-dimensional image moves and rotates naturally.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
In the drawings, like reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are therefore not to be considered limiting of its scope.
FIG. 1 is a schematic diagram of a method of three-dimensional image presentation of an embodiment of the present application;
fig. 2 is a schematic diagram of a process of determining a viewpoint change of a three-dimensional image according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a process for determining a change in perspective of a three-dimensional image according to an embodiment of the present application;
FIG. 4 is a block diagram of an apparatus for three-dimensional image display provided in an embodiment of the present application;
FIG. 5 is a block diagram of an electronic device of a method of three-dimensional image presentation according to an embodiment of the present application.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present application. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
Fig. 1 shows a schematic diagram of a method of three-dimensional image presentation according to an embodiment of the present application. The method may be performed by an electronic device having a display, for example a smart phone, a tablet, an LED television or an OLED television. By way of example, the method of the embodiment of the present application will be described below taking a mobile terminal as the electronic device; it may include the following steps:
and S100, responding to the touch and/or sliding of the three-dimensional image displayed by the mobile terminal by the user, and acquiring the touch position change parameter of the touch and/or sliding of the three-dimensional image by the user.
Step S200, responding to the movement and/or rotation operation of the user to the mobile terminal, and acquiring the self position change parameter and/or the posture change parameter of the mobile terminal in the three-dimensional space.
Step S300, simulating the stress condition of the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the mobile terminal, the self position change parameter and/or the posture change parameter of the mobile terminal in the three-dimensional space, so as to determine the visual angle adjusting parameter and/or the viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space.
And S400, adjusting the visual angle and/or the viewpoint of the three-dimensional image according to the visual angle adjusting parameter and/or the viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space.
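The patent discloses no source code; as a rough, purely hypothetical sketch of how steps S100 through S400 could fit together (all names, the gain and the drag constant are invented for illustration), in Python:

```python
import math
from dataclasses import dataclass

@dataclass
class TouchDelta:        # step S100: touch/slide on the displayed image
    dx: float            # touch displacement on screen
    dy: float
    dt: float            # touch duration, seconds

@dataclass
class DeviceDelta:       # step S200: movement/rotation of the terminal itself
    dist: float          # displacement of the device in 3-D space
    dangle: float        # change of the device's deflection angle, degrees
    dt: float

def simulate_forces(touch, device, gain=1.0):
    """Step S300: map the observed deltas onto a simulated thrust
    (drives the viewpoint) and deflection force (drives the visual angle)."""
    thrust = gain * (math.hypot(touch.dx, touch.dy) / touch.dt   # first thrust
                     + device.dist / device.dt)                  # second thrust
    deflection = gain * abs(device.dangle) / device.dt
    return thrust, deflection

def adjust_view(thrust, deflection, drag=0.5):
    """Step S400: turn the forces into adjustment parameters; drag stands in
    for the preset inertial resistance / air resistance."""
    return max(0.0, thrust - drag), max(0.0, deflection - drag)

t, d = simulate_forces(TouchDelta(3.0, 4.0, 1.0), DeviceDelta(2.0, 10.0, 2.0))
print(adjust_view(t, d))   # (5.5, 4.5)
```

This only shows the data flow between the four steps; the sections below refine the force model.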
In the embodiment of the application, the user's operations on the three-dimensional image on the mobile terminal may include touching and/or sliding on the three-dimensional image, and moving and/or rotating the mobile terminal. These operations yield the following information: the touch position change parameter of the user's touch and/or slide on the three-dimensional image, and the self position change parameter and posture change parameter of the mobile terminal in three-dimensional space. With these parameters, the forces the three-dimensional image is subjected to in three-dimensional space, such as thrust or deflection forces, are simulated. The three-dimensional image may also be given a certain weight and a resistance to its motion. Its motion during display can then conform to the influence of these forces, so that the motion of the three-dimensional image looks more realistic to the user.
Here, the motion of the three-dimensional image includes motion that produces a displacement in three-dimensional space relative to some reference point in the space (non-self-deflecting motion), and deflecting motion about the image's own center of gravity or some other position on it.
Adjusting the viewpoint of the three-dimensional image displays the motion in which the image is displaced in three-dimensional space. Adjusting the visual angle of the three-dimensional image displays the motion in which the image deflects in three-dimensional space about some position on itself.
The touch position change parameter of the user on the mobile terminal can be acquired as follows: record the starting touch position and starting touch time, and the current touch position and current touch time, when the user touches or slides on the three-dimensional image displayed on the mobile terminal. These may be obtained from a touch-sensitive screen, or from a touch frame mounted above the screen of the mobile terminal.
The touch position change parameter can influence both the position change of the three-dimensional image in three-dimensional space and the rotation of the three-dimensional image about itself.
When the user moves and/or rotates the mobile terminal, the position and posture of the mobile terminal in three-dimensional space change, from which the position change parameter and posture change parameter of the mobile terminal can be obtained. The position change parameter can be obtained as follows: first determine a number of network access points for the mobile terminal, for example three or four. Then, the coordinates of the mobile terminal at the reference point position and at the current position can be determined from the signal strengths of the network signals of these access points as received by the mobile terminal at the two positions.
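The text only says that position is inferred from received signal strength at several access points; one hypothetical realization (not stated in the patent) is RSSI ranging via a log-distance path-loss model followed by 2-D trilateration. The transmit power and path-loss exponent below are illustrative:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-40.0, n=2.0):
    """Log-distance path-loss model: estimated distance in metres
    from received power (illustrative constants)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))

def trilaterate(aps, dists):
    """2-D position from three access points (x, y) and ranged distances,
    by subtracting circle equations to obtain a linear 2x2 system."""
    (x1, y1), (x2, y2), (x3, y3) = aps
    r1, r2, r3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# A device equidistant from APs at three corners of a unit square:
print(trilaterate([(0, 0), (1, 0), (0, 1)], [0.7071, 0.7071, 0.7071]))  # (0.5, 0.5)
```

Running this at the reference point and again at the current position would yield the two coordinates the text refers to.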
The attitude change parameter of the mobile terminal can be obtained by the following method: acquiring sensing parameters of a gyroscope of the mobile terminal; and then, determining the reference point deflection angle and the current deflection angle of the mobile terminal according to the sensing parameters of the gyroscope.
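The sensing parameters of a gyroscope are angular rates, so one plausible way (not specified in the text) to obtain the reference point deflection angle and the current deflection angle is to integrate the sampled rates over time:

```python
def integrate_gyro(samples, dt, reference_angle=0.0):
    """Accumulate gyroscope angular-rate samples (deg/s), taken every dt
    seconds, into a current deflection angle relative to the reference."""
    angle = reference_angle
    for rate in samples:
        angle += rate * dt
    return angle

# 0.5 s of a steady 90 deg/s rotation, sampled at 100 Hz:
current = integrate_gyro([90.0] * 50, dt=0.01)
print(round(current, 1))   # 45.0 degrees past the reference angle
```

The difference between the current and reference angles is then the posture change the later steps consume.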
Because adjusting the viewpoint of the three-dimensional image displays the motion in which the image is displaced in three-dimensional space, the viewpoint adjustment parameter needs to be determined. As shown in fig. 2, the process of determining the viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space may include the following steps:
step S310, determining thrust of the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the mobile terminal and the position change parameter of the mobile terminal in the three-dimensional space.
And S320, determining the moving speed and displacement of the three-dimensional image in the three-dimensional space according to the thrust force and preset inertial resistance of the three-dimensional image in the three-dimensional space.
And step S330, determining a viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space according to the moving speed and the displacement of the three-dimensional image in the three-dimensional space.
In the embodiment of the application, the three-dimensional image is subjected to at least two thrusts and one inertial resistance. One thrust is a first thrust generated by the user touching or sliding on the three-dimensional image on the mobile terminal; the other is a second thrust generated by the mobile terminal moving in three-dimensional space. The times and durations for which the first and second thrusts act on the three-dimensional image may differ; of course, they may also coincide.
Specifically, since the touch position variation parameter may include a start touch position coordinate and a current touch position coordinate of a touch and/or a slide of the user on the three-dimensional image displayed by the mobile terminal, based on the start touch position coordinate, the current touch position coordinate and the touch time, the first thrust force applied to the three-dimensional image in the three-dimensional space and the acting time of the first thrust force on the three-dimensional image may be determined. For example, the magnitude of the displacement between the current touch position coordinates and the start touch position coordinates may determine the magnitude of the first pushing force. The direction of this displacement may determine the direction of the first thrust. This touch time may determine the time of the first push force acting on the three-dimensional image.
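The exact mapping from displacement to force is not given; assuming the simplest proportional rule described above (the gain is an invented constant), the first thrust could be computed as follows, and the second thrust would be computed analogously from the terminal's reference point and current position coordinates:

```python
import math

def touch_thrust(start, current, touch_time, gain=1.0):
    """First thrust: magnitude from the size of the touch displacement,
    direction from its angle, acting for the duration of the touch."""
    dx = current[0] - start[0]
    dy = current[1] - start[1]
    magnitude = gain * math.hypot(dx, dy)
    direction = math.atan2(dy, dx)      # radians, in screen coordinates
    return magnitude, direction, touch_time

mag, ang, act = touch_thrust((0, 0), (30, 40), touch_time=0.25)
print(mag, round(math.degrees(ang), 1), act)   # 50.0 53.1 0.25
```

A swipe of 50 pixels over 0.25 s thus becomes a force of magnitude 50 (in these arbitrary units) at about 53 degrees, acting for 0.25 s.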
Since the self position change parameter of the mobile terminal in the three-dimensional space may include the reference point position coordinate of the mobile terminal, the current position coordinate of the mobile terminal and the time of the position change, the second thrust on the three-dimensional image in the three-dimensional space and the acting time of the second thrust on the three-dimensional image may be determined based on these values. For example, the magnitude of the displacement between the reference point coordinate and the current position coordinate of the mobile terminal may determine the magnitude of the second thrust, the direction of this displacement may determine the direction of the second thrust, and the time of the position change may determine the acting time of the second thrust.
Illustratively, the three-dimensional image is assigned a weight and an inertial resistance, e.g. friction, that it experiences during motion. The three-dimensional image starts to move when subjected to the first and second thrusts, and is affected by the inertial resistance throughout the motion. In this way, the three-dimensional image can start from an initial speed of zero and accelerate under the combined influence of the first thrust, the second thrust and the inertial resistance; when the first and second thrusts are removed, it does not stop abruptly, but decelerates under the inertial resistance until its speed reaches zero.
The weight given to the three-dimensional image, and the inertial resistance it encounters while moving in three-dimensional space, can be set as desired. For example, if the three-dimensional image should appear to move slowly through three-dimensional space, its weight can be set heavier and its inertial resistance larger; if it should appear to move quickly, its weight can be set lighter and its inertial resistance smaller. Likewise, if the image should come to rest soon after the thrust is removed, the inertial resistance can be set larger; if it should coast for longer, the inertial resistance can be set smaller.
Based on this, the changes in the speed and position of the three-dimensional image moving in the three-dimensional space can be determined from the first thrust and its acting time on the three-dimensional image, the second thrust and its acting time on the three-dimensional image, the inertial resistance on the three-dimensional image, and the weight of the three-dimensional image. The viewpoint adjustment parameter can then be determined from those changes in speed and position. Finally, the viewpoint of the three-dimensional image in the three-dimensional space can be adjusted based on the viewpoint adjustment parameter.
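A minimal one-dimensional sketch of this force model (explicit Euler integration; all constants invented): the image accelerates while the net thrust acts, then the inertial resistance alone decelerates it to a stop, yielding the displacement and stopping time from which a viewpoint adjustment parameter could be read off.

```python
def simulate_motion(thrust, push_time, mass, resistance, dt=0.01):
    """Integrate speed v and position x for a 1-D push: net acceleration
    (F - R)/m while the thrust acts, then -R/m until v decays to zero."""
    v = x = t = 0.0
    while True:
        f = thrust if t < push_time else 0.0
        a = (f - resistance) / mass
        v = max(0.0, v + a * dt)   # resistance never reverses the motion
        x += v * dt
        t += dt
        if t >= push_time and v == 0.0:
            return x, t

x, t = simulate_motion(thrust=2.0, push_time=1.0, mass=1.0, resistance=0.5)
# x ≈ 3.0 units of displacement, coming to rest roughly 4 s after the push began
```

Making the mass larger or the resistance stronger shortens the coast, exactly as the paragraph above describes.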
Since the visual angle of the three-dimensional image is adjusted, it can be shown that the three-dimensional image rotates around itself in the three-dimensional space, and the visual angle adjusting parameters need to be determined. In the step S300, as shown in fig. 3, the process of determining the viewing angle adjustment parameter of the three-dimensional image in the three-dimensional space may include the following steps:
s330, determining the deflection force received by the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user operation on the mobile terminal and the posture change parameter of the mobile terminal in the three-dimensional space;
s340, determining the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space according to the deflection force of the three-dimensional image in the three-dimensional space and the preset air resistance;
and S350, determining a visual angle adjusting parameter of the three-dimensional image in the three-dimensional space according to the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space.
In the embodiments of the present application, the three-dimensional image is subjected to at least two deflection forces and to a resistance, e.g., air resistance, that opposes its rotation. One is a first deflection force generated by the user touching or sliding on the three-dimensional image; the other is a second deflection force generated by the mobile terminal rotating about its own axis in the three-dimensional space. The magnitudes of the first and second deflection forces, and the times and durations over which they act on the three-dimensional image, may differ. Of course, they may also coincide.
Specifically, the touch position variation parameter may include the starting touch position coordinates and the current touch position coordinates of the user's touch and/or slide on the three-dimensional image displayed by the mobile terminal, together with the touch time. From the starting and current touch position coordinates, the magnitude and direction (angle) of the first deflection force received by the three-dimensional image in the three-dimensional space can be determined; from the touch time, the start time, end time, and duration of its action can be determined. The attitude change parameter may include the reference point deflection angle, the current deflection angle, and the deflection time of the mobile terminal. From the reference point deflection angle and the current deflection angle, the magnitude and direction (angle) of the second deflection force received by the three-dimensional image in the three-dimensional space can be determined; from the deflection time of the mobile terminal, the start time, end time, and duration of the second deflection force's action can be determined.
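For instance, the first deflection force can be derived from the touch coordinates roughly as follows. The linear gain from slide distance to force magnitude, and the function name, are assumptions made for illustration only:

```python
import math

def first_deflection_force(start_xy, current_xy, t_start, t_end, gain=1.0):
    """Map a slide gesture to a force: magnitude from the slide distance,
    direction (angle) from the slide vector, duration from the touch time."""
    dx = current_xy[0] - start_xy[0]
    dy = current_xy[1] - start_xy[1]
    magnitude = gain * math.hypot(dx, dy)   # proportional-to-distance assumption
    angle = math.atan2(dy, dx)              # direction of the slide, in radians
    return magnitude, angle, t_end - t_start
```

A second helper of the same shape could map the terminal's reference point deflection angle and current deflection angle to the second deflection force.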
Illustratively, a weight and an air resistance to be experienced during the movement are set for the three-dimensional image. The three-dimensional image starts to deflect when subjected to the first deflection force and the second deflection force, and is influenced by the air resistance throughout the deflection motion. In this way, the three-dimensional image starts moving from an initial velocity of zero and accelerates its deflection under the combined influence of the first deflection force, the second deflection force, and the air resistance. Once the first and second deflection forces are withdrawn, the three-dimensional image does not stop immediately; under the influence of the air resistance it decelerates until its velocity falls to zero.
The weight of the three-dimensional image and the air resistance it receives when moving in the three-dimensional space can be set as desired. If the deflection of the three-dimensional image in the three-dimensional space should appear slow from the visual point of view, the weight may be set heavier and the air resistance larger. If the deflection should appear rapid, the weight may be set lighter and the air resistance smaller. During the deflection of the three-dimensional image, if the time from removal of the deflection force to the stopping of motion should be shorter, the air resistance may be set larger; if that time should be longer, the air resistance may be set smaller.
Based on this, the change of the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space can be determined from the first deflection force and its acting time, the second deflection force and its acting time, the weight of the three-dimensional image, and the air resistance received by the three-dimensional image. Then, from this change of deflection speed and deflection angle, the viewing angle adjustment parameter of the three-dimensional image in the three-dimensional space can be determined.
Finally, based on the viewing angle adjustment parameter of the three-dimensional image, the viewing angle of the three-dimensional image in the three-dimensional space, or the deflection of that viewing angle, may be adjusted.
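The rotational counterpart of the translational sketch above follows the same pattern: accelerate while the deflection forces act, then decelerate under air resistance. Names and values are again illustrative assumptions, not the patented implementation:

```python
def simulate_deflection(weight, air_resistance, forces, dt=0.01, t_end=5.0):
    """Integrate the deflection of the three-dimensional image.

    forces: list of (deflection_force, t_start, t_stop) tuples; the image's
    weight stands in for its rotational inertia in this simplification.
    Returns the final deflection angle.
    """
    omega = theta = t = 0.0
    while t < t_end:
        tau = sum(f for f, t0, t1 in forces if t0 <= t < t1)
        if omega > 0.0:
            tau -= air_resistance            # air resistance slows the rotation
        omega = max(0.0, omega + (tau / weight) * dt)
        theta += omega * dt                  # accumulate the deflection angle
        t += dt
    return theta
```

As with the translational case, a heavier weight or a larger air resistance produces a smaller final deflection angle for the same forces.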
Referring to fig. 4, which is a structural diagram of a three-dimensional image display apparatus according to an embodiment of the present application, the apparatus may include:
a first parameter obtaining module 100, configured to, in response to a touch and/or slide of a user on a three-dimensional image displayed by the mobile terminal, obtain a touch position variation parameter of the touch and/or slide;
a second parameter obtaining module 200, configured to obtain a position change parameter and/or an attitude change parameter of the mobile terminal in a three-dimensional space in response to a movement and/or rotation operation of the mobile terminal by the user;
a third parameter determining module 300, configured to simulate, according to a touch position change parameter of the user on the mobile terminal, a self-position change parameter and/or a posture change parameter of the mobile terminal in a three-dimensional space, a stress condition of the three-dimensional image in the three-dimensional space, so as to determine a viewing angle adjusting parameter and/or a viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space; and
an adjusting module 400, configured to adjust a viewing angle and/or a viewpoint of the three-dimensional image according to a viewing angle adjusting parameter and/or a viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space.
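The four modules form a simple pipeline from raw events to a display adjustment. A minimal wiring sketch follows, with hypothetical class and method names (only the module numbering 100–400 comes from the description):

```python
class ThreeDImageDisplayDevice:
    """Chains the four modules: parameter acquisition (100, 200),
    force simulation (300), and view adjustment (400)."""

    def __init__(self, first, second, third, adjuster):
        self.first = first          # module 100: touch position parameters
        self.second = second        # module 200: position/attitude parameters
        self.third = third          # module 300: simulate forces -> view params
        self.adjuster = adjuster    # module 400: apply the adjustment

    def on_user_event(self, touch_event, motion_event):
        touch_params = self.first(touch_event)
        pose_params = self.second(motion_event)
        view_params = self.third(touch_params, pose_params)
        return self.adjuster(view_params)
```

Each module can be any callable, which keeps the pipeline testable in isolation.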
Fig. 5 is a block diagram of an electronic device for the three-dimensional image presentation method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the present application described and/or claimed herein.
As shown in fig. 5, the electronic apparatus includes: one or more processors 601, a memory 602, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected by different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, as desired. Likewise, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 5, one processor 601 is taken as an example.
The memory 602 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for three-dimensional image presentation provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the method of three-dimensional image presentation provided herein.
The memory 602, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as program instructions/modules corresponding to the method of three-dimensional image presentation in the embodiments of the present application. The processor 601 executes various functional applications of the server and data processing by running non-transitory software programs, instructions and modules stored in the memory 602, namely, implements the method of three-dimensional image presentation in the above method embodiments.
The memory 602 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the electronic device of the method of three-dimensional image presentation, and the like. Further, the memory 602 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 602 optionally includes memory remotely located from the processor 601, and these remote memories may be connected to the electronic device of the method of three-dimensional image presentation via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the method for three-dimensional image presentation may further include: an input device 603 and an output device 604. The processor 601, the memory 602, the input device 603, and the output device 604 may be connected by a bus or in other manners; in fig. 5, connection by a bus is taken as an example.
The input device 603 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus of the method of three-dimensional image presentation, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or the like. The output devices 604 may include a display device, auxiliary lighting devices (e.g., LEDs), and tactile feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application-specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical scheme of the application, the stress condition of the three-dimensional image in the three-dimensional space can be simulated from the user's touch operation and from the position and/or posture change of the electronic device, so that the viewing angle and/or viewpoint of the three-dimensional image can be adjusted accordingly.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present invention is not limited thereto as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. A method of three-dimensional image presentation, performed by an electronic device having a display, comprising:
responding to the touch and/or sliding of a user on a three-dimensional image displayed by the electronic equipment, and acquiring a touch position change parameter of the touch and/or sliding of the user on the three-dimensional image;
responding to the movement and/or rotation operation of the user on the electronic equipment, and acquiring a self position change parameter and/or an attitude change parameter of the electronic equipment in a three-dimensional space;
simulating the stress condition of the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic equipment, the self position change parameter and/or the posture change parameter of the electronic equipment in the three-dimensional space, so as to determine the visual angle adjusting parameter and/or the viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space; and
adjusting the visual angle and/or the viewpoint of the three-dimensional image according to the visual angle adjusting parameter and/or the viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space;
wherein the determining of the viewpoint adjustment parameter of the three-dimensional image in the three-dimensional space comprises:
determining thrust of the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic equipment and the position change parameter of the electronic equipment in the three-dimensional space;
determining the moving speed and displacement of the three-dimensional image in the three-dimensional space according to the thrust force and preset inertial resistance of the three-dimensional image in the three-dimensional space;
and determining the viewpoint adjusting parameters of the three-dimensional image in the three-dimensional space according to the moving speed and the displacement of the three-dimensional image in the three-dimensional space.
2. The method of claim 1, wherein the touch location change parameters comprise a start touch location coordinate and a current touch location coordinate of the user's touch and/or slide on the three-dimensional image displayed by the electronic device, wherein the self location change parameters comprise a reference point location coordinate and a current location coordinate of the electronic device, and wherein the determining the thrust force experienced by the three-dimensional image in the three-dimensional space comprises:
determining a first thrust and action time of the first thrust received by the three-dimensional image in the three-dimensional space according to an initial touch position coordinate, a current touch position coordinate and touch time of the touch and/or sliding of the three-dimensional image displayed by the electronic equipment by the user; and
and determining a second thrust received by the three-dimensional image in the three-dimensional space and the action time of the second thrust according to the reference point position coordinates of the electronic equipment, the current position coordinates, and the time of position change.
3. The method of claim 2, wherein determining the speed and displacement of the three-dimensional image moving in the three-dimensional space according to the thrust force and the preset inertial resistance of the three-dimensional image in the three-dimensional space comprises:
determining the weight of the three-dimensional image and the inertial resistance when subjected to thrust; and
and determining the speed and the position of the three-dimensional image moving in the three-dimensional space according to the acting time of the first thrust and the first thrust on the three-dimensional image, the acting time of the second thrust and the second thrust on the three-dimensional image, the inertial resistance suffered by the three-dimensional image and the weight of the three-dimensional image.
4. The method of claim 1, wherein determining a view angle adjustment parameter of the three-dimensional image in three-dimensional space comprises:
determining the deflection force of the three-dimensional image in the three-dimensional space according to the touch position change parameter operated by the user on the electronic equipment and the posture change parameter of the electronic equipment in the three-dimensional space;
determining the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space according to the deflection force of the three-dimensional image in the three-dimensional space and preset air resistance;
and determining a visual angle adjusting parameter of the three-dimensional image in the three-dimensional space according to the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space.
5. The method of claim 4, wherein the touch location change parameters include a start touch location coordinate and a current touch location coordinate of a touch and/or a slide of the three-dimensional image displayed by the electronic device by the user, wherein the gesture change parameters include a reference point deflection angle and a current deflection angle of the electronic device, and wherein determining a deflection force experienced by the three-dimensional image in the three-dimensional space comprises:
determining a first deflection force and action time of the first deflection force on the three-dimensional image in the three-dimensional space according to an initial touch position coordinate, a current touch position coordinate and touch time of the touch and/or sliding of the three-dimensional image displayed by the electronic equipment by the user;
and determining a second deflection force received by the three-dimensional image in the three-dimensional space and the action time of the second deflection force according to the reference point deflection angle and the current deflection angle of the electronic equipment.
6. The method of claim 5, wherein said determining a deflection speed and a deflection angle of said three-dimensional image in said three-dimensional space comprises:
determining the weight of the three-dimensional image and the air resistance when subjected to a deflection force; and
and determining the deflection speed and the deflection angle of the three-dimensional image in the three-dimensional space according to the action time of the first deflection force and the first deflection force, the action time of the second deflection force and the second deflection force, the air resistance of the three-dimensional image and the weight of the three-dimensional image.
7. An apparatus for three-dimensional image presentation, performed by an electronic device having a display, comprising:
the first parameter acquisition module is used for responding to the touch and/or sliding of a user on the three-dimensional image displayed by the electronic equipment and acquiring the touch position change parameter of the touch and/or sliding of the user on the three-dimensional image;
the second parameter acquisition module is used for responding to the movement and/or rotation operation of the user on the electronic equipment, and acquiring the self position change parameter and/or the posture change parameter of the electronic equipment in a three-dimensional space;
the third parameter determining module is used for simulating the stress condition of the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic equipment, the self position change parameter and/or the posture change parameter of the electronic equipment in the three-dimensional space, so as to determine the visual angle adjusting parameter and/or the viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space; and
the adjusting module is used for adjusting the visual angle and/or the viewpoint of the three-dimensional image according to the visual angle adjusting parameter and/or the viewpoint adjusting parameter of the three-dimensional image in the three-dimensional space;
the third parameter determination module is to:
determining thrust of the three-dimensional image in the three-dimensional space according to the touch position change parameter of the user on the electronic equipment and the position change parameter of the electronic equipment in the three-dimensional space;
determining the moving speed and displacement of the three-dimensional image in the three-dimensional space according to the thrust force and preset inertial resistance of the three-dimensional image in the three-dimensional space;
and determining the viewpoint adjusting parameters of the three-dimensional image in the three-dimensional space according to the moving speed and the displacement of the three-dimensional image in the three-dimensional space.
8. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
The memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
9. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-6.
CN202010827340.7A 2020-08-17 2020-08-17 Three-dimensional image display method and device, electronic equipment and storage medium Active CN111913645B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010827340.7A CN111913645B (en) 2020-08-17 2020-08-17 Three-dimensional image display method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010827340.7A CN111913645B (en) 2020-08-17 2020-08-17 Three-dimensional image display method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111913645A CN111913645A (en) 2020-11-10
CN111913645B true CN111913645B (en) 2022-04-19

Family

ID=73278202

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010827340.7A Active CN111913645B (en) 2020-08-17 2020-08-17 Three-dimensional image display method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111913645B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113947670A (en) * 2021-09-18 2022-01-18 北京城市网邻信息技术有限公司 Information display method and device, electronic equipment and readable medium
CN114398118B (en) * 2021-12-21 2023-03-24 深圳市易图资讯股份有限公司 Intelligent positioning system and method for smart city based on space anchor

Citations (5)

Publication number Priority date Publication date Assignee Title
CN106454401A (en) * 2016-10-26 2017-02-22 乐视网信息技术(北京)股份有限公司 Method and device for playing video
CN107659851A (en) * 2017-03-28 2018-02-02 腾讯科技(北京)有限公司 The displaying control method and device of panoramic picture
CN107945282A (en) * 2017-12-05 2018-04-20 洛阳中科信息产业研究院(中科院计算技术研究所洛阳分所) The synthesis of quick multi-view angle three-dimensional and methods of exhibiting and device based on confrontation network
CN108245887A (en) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 virtual object control method, device, electronic device and storage medium
CN109683729A (en) * 2018-12-22 2019-04-26 威创集团股份有限公司 Three-dimensional scenic control method and device

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
US20170186219A1 (en) * 2015-12-28 2017-06-29 Le Holdings (Beijing) Co., Ltd. Method for 360-degree panoramic display, display module and mobile terminal
CN108986230A (en) * 2018-07-19 2018-12-11 北京知道创宇信息技术有限公司 Image aspects management method, device and electronic equipment
CN111163303B (en) * 2018-11-08 2021-08-31 中移(苏州)软件技术有限公司 Image display method, device, terminal and storage medium
CN111027192B (en) * 2019-12-02 2023-09-15 西安欧意特科技有限责任公司 Method and system for determining performance parameters

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
CN106454401A (en) * 2016-10-26 2017-02-22 乐视网信息技术(北京)股份有限公司 Method and device for playing video
CN107659851A (en) * 2017-03-28 2018-02-02 腾讯科技(北京)有限公司 The displaying control method and device of panoramic picture
CN107945282A (en) * 2017-12-05 2018-04-20 洛阳中科信息产业研究院(中科院计算技术研究所洛阳分所) The synthesis of quick multi-view angle three-dimensional and methods of exhibiting and device based on confrontation network
CN108245887A (en) * 2018-02-09 2018-07-06 腾讯科技(深圳)有限公司 virtual object control method, device, electronic device and storage medium
CN109683729A (en) * 2018-12-22 2019-04-26 威创集团股份有限公司 Three-dimensional scenic control method and device

Also Published As

Publication number Publication date
CN111913645A (en) 2020-11-10

Similar Documents

Publication Publication Date Title
US11543891B2 (en) Gesture input with multiple views, displays and physics
EP3908906B1 (en) Near interaction mode for far virtual object
US10452249B2 (en) Tooltip feedback for zoom using scroll wheel
US11054894B2 (en) Integrated mixed-input system
JP6619005B2 (en) Selective pairing between application presented in virtual space and physical display
CN102362251B (en) For the user interface providing the enhancing of application programs to control
CN107533374A (en) Switching at runtime and the merging on head, gesture and touch input in virtual reality
KR102233807B1 (en) Input Controller Stabilization Technique for Virtual Reality System
US20180046363A1 (en) Digital Content View Control
CN111722245B (en) Positioning method, positioning device and electronic equipment
CN110215685B (en) Method, device, equipment and storage medium for controlling virtual object in game
KR20160081809A (en) Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications
US20140049558A1 (en) Augmented reality overlay for control devices
WO2018118537A1 (en) Facilitating selection of holographic keyboard keys
JP6389581B1 (en) Program, electronic apparatus, and method
US11430192B2 (en) Placement and manipulation of objects in augmented reality environment
CN111913645B (en) Three-dimensional image display method and device, electronic equipment and storage medium
CN108431734A (en) Touch feedback for non-touch surface interaction
KR20110030341A (en) System for interacting with objects in a virtual environment
US20170371432A1 (en) Integrated free space and surface input device
EP2538308A2 (en) Motion-based control of a controllled device
CN107391005B (en) Method for controlling cursor movement on host screen and game handle
US20140111551A1 (en) Information-processing device, storage medium, information-processing method, and information-processing system
US20160098160A1 (en) Sensor-based input system for mobile devices
CN108572744B (en) Character input method and system and computer readable recording medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant