CN111459263A - Virtual content display method and device, terminal equipment and storage medium - Google Patents


Info

Publication number
CN111459263A
Authority
CN
China
Prior art keywords
shaking
virtual content
interactive
content
virtual
Prior art date
Legal status
Granted
Application number
CN201910060758.7A
Other languages
Chinese (zh)
Other versions
CN111459263B (en)
Inventor
卢智雄
戴景文
贺杰
Current Assignee
Guangdong Virtual Reality Technology Co Ltd
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd filed Critical Guangdong Virtual Reality Technology Co Ltd
Priority to CN201910060758.7A
Publication of CN111459263A
Application granted
Publication of CN111459263B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the application disclose a virtual content display method and apparatus, a terminal device, and a storage medium. The display method is applied to a terminal device connected to an interactive device, and includes the following steps: acquiring position and posture information of the interactive device relative to the terminal device; displaying virtual content according to the position and posture information; detecting the motion state of the interactive device according to change information of at least one of its position and posture; obtaining shaking parameters of the interactive device when it is in a shaking state; and controlling the display of the virtual content according to the shaking parameters so that the displayed virtual content corresponds to the shaking state of the interactive device. The method enables better interaction with virtual content.

Description

Virtual content display method and device, terminal equipment and storage medium
Technical Field
The present application relates to the field of display technologies, and in particular, to a method and an apparatus for displaying virtual content, a terminal device, and a storage medium.
Background
In recent years, with advances in science and technology, technologies such as Augmented Reality (AR) have become research hot spots at home and abroad. Augmented reality is a technology that enhances a user's perception of the real world with information provided by a computer system: computer-generated content objects such as virtual objects, scenes, or system prompt information are superimposed on the real scene to enhance or modify the perception of the real-world environment or of data representing it. In augmented reality display technology, interaction with the displayed content is a key issue affecting practical applications.
Disclosure of Invention
The embodiments of the application provide a virtual content display method and apparatus, a terminal device, and a storage medium, so as to better realize interaction with displayed content.
In a first aspect, an embodiment of the application provides a method for displaying virtual content, applied to a terminal device connected to an interactive device. The method includes: acquiring position and posture information of the interactive device relative to the terminal device; displaying virtual content according to the position and posture information; detecting the motion state of the interactive device according to change information of at least one of its position and posture; obtaining shaking parameters of the interactive device when it is in a shaking state; and controlling the display of the virtual content according to the shaking parameters so that the displayed virtual content corresponds to the shaking state of the interactive device.
In a second aspect, an embodiment of the application provides an apparatus for displaying virtual content, applied to a terminal device connected to an interactive device. The apparatus includes a position acquisition module, a content display module, a state detection module, a parameter acquisition module, and a content control module. The position acquisition module acquires position and posture information of the interactive device relative to the terminal device; the content display module displays virtual content according to the position and posture information; the state detection module detects the motion state of the interactive device according to change information of at least one of its position and posture; the parameter acquisition module obtains shaking parameters of the interactive device when it is in a shaking state; and the content control module controls the display of the virtual content according to the shaking parameters so that the displayed virtual content corresponds to the shaking state of the interactive device.
In a third aspect, an embodiment of the present application provides a terminal device, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method of displaying virtual content as provided in the first aspect above.
In a fourth aspect, an embodiment of the application provides a computer-readable storage medium storing program code, and the program code can be called by a processor to execute the method for displaying virtual content provided in the first aspect.
The scheme provided by the application is applied to a terminal device connected to an interactive device. By obtaining the position and posture information of the interactive device relative to the terminal device and displaying virtual content accordingly, the user can observe the effect of the virtual content superimposed on the real world. When the interactive device is determined to be in a shaking state according to change information of its position and/or posture, the shaking parameters of the interactive device are obtained and the display of the virtual content is controlled according to them. Interaction with the displayed virtual content is thus better realized, and interactivity is improved.
Drawings
To illustrate the technical solutions in the embodiments of the application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 shows a schematic diagram of an application scenario suitable for use in an embodiment of the present application.
Fig. 2 shows a schematic structural diagram of an interaction device provided according to an embodiment of the present application.
Fig. 3 shows a flow chart of a method of displaying virtual content according to an embodiment of the application.
Fig. 4 shows a schematic diagram of a display effect according to an embodiment of the application.
Fig. 5 shows a flowchart of a method of displaying virtual content according to another embodiment of the present application.
Fig. 6 shows a flowchart of a method of displaying virtual content according to yet another embodiment of the present application.
Fig. 7 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 8 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 9 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 10 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 11 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 12 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 13 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 14 is a flowchart illustrating a method of displaying virtual content according to still another embodiment of the present application.
Fig. 15 shows a schematic diagram of a display effect according to a further embodiment of the present application.
Fig. 16 shows a schematic diagram of a display effect according to yet another embodiment of the present application.
FIG. 17 shows a block diagram of a display device of virtual content according to one embodiment of the present application.
Fig. 18 is a block diagram of a terminal device for executing a display method of virtual content according to an embodiment of the present application.
Fig. 19 is a storage unit for storing or carrying program codes for implementing a display method of virtual content according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
With the development of display technology, Augmented Reality (AR) display technology is gradually entering people's lives. AR technology can superimpose content objects such as computer-generated virtual objects, scenes, or system prompts onto real scenes to enhance or modify the perception of the real-world environment or of data representing it. At present, a virtual image can be displayed at a corresponding position on the display screen of a mobile terminal or the display component of a head-mounted display, so that the virtual image and the real scene are displayed in an overlapping manner and the user enjoys a science-fiction-like viewing experience.
Through long-term research, the inventors found that conventional AR display technology usually implements interaction with virtual content through an additional controller, or by rotating the head to change the orientation of a device such as a head-mounted display, and that the interactivity of these approaches is poor. Based on these problems, the inventors propose the virtual content display method, apparatus, terminal device, and storage medium of the embodiments of the application, so as to better implement interaction with the displayed virtual content.
An application scenario of the display method of virtual content provided in the embodiment of the present application is described below.
Referring to fig. 1, an application scenario diagram of a display method of virtual content provided in an embodiment of the present application is shown, where the application scenario includes a display system 10, and the display system 10 includes: the terminal device 100 and the interactive device 200, wherein the terminal device 100 is connected with the interactive device 200.
In the embodiment of the present application, the terminal device 100 may be a head-mounted display device, or a mobile device such as a mobile phone or a tablet. When the terminal device 100 is a head-mounted display device, it may be an integrated (standalone) head-mounted display device. The terminal device 100 may also be an intelligent terminal such as a mobile phone connected to an external head-mounted display device; that is, the terminal device 100 may serve as the processing and storage device of the head-mounted display device, plugging into or connecting with the external head-mounted display device so that virtual content is displayed in it.
In an embodiment of the present application, the interactive device 200 may be a polyhedral marker that includes a plurality of faces, edges, and vertices. The interactive device 200 includes a plurality of marking surfaces, and at least two non-coplanar marking surfaces each have a marker disposed thereon. In some embodiments, a marker may include at least one sub-marker having one or more feature points.
In the embodiment of the present application, the specific shape and structure of the interactive device 200 are not limited; it may be a polyhedron combining planes with curved surfaces, or one combining only curved surfaces. In some embodiments, the interactive device 200 may be any one or a combination of the following structures: a pyramid, a prism, a frustum, or a polyhedron, including a sphere, which can be understood as a polyhedron formed by innumerable faces.
In the embodiment of the present application, images of the markers are stored in the terminal device 100. A marker may include at least one sub-marker having one or more feature points. When a marker is within the visual field of the terminal device 100, the terminal device 100 can capture an image containing the marker, recognize it, and obtain spatial position information such as the position and orientation of the marker relative to the terminal device 100, as well as recognition results such as the marker's identity information. The terminal device 100 can thus locate and track the interactive device 200 according to the markers, and display corresponding virtual content based on information such as the spatial position of the interactive device 200 relative to the terminal device 100. It is to be understood that the specific interactive device 200 and markers are not limited in the embodiment of the present application; they only need to be identifiable and trackable by the terminal device 100.
In some embodiments, different markers on the interactive device 200 may rotate and/or translate within the visual field of the terminal device 100, so that the terminal device 100 can identify the markers on the interactive device 200 in real time, acquire the spatial position information of the interactive device 200, and display the corresponding virtual content according to that information.
Referring to fig. 2, which shows a schematic diagram of an interactive device according to an embodiment of the present application, the interactive device 200 is a twenty-six-faced polyhedron comprising eighteen square faces and eight triangular faces. The eighteen square faces are all marking surfaces, each provided with a marker, and the marker patterns on the faces differ from one another. In one approach, the interactive device 200 has a first marker 211 disposed on a first surface 220 and a second marker 212, distinct from the first marker 211, disposed on a second surface 230. The terminal device 100 recognizes either or both of the first marker 211 and the second marker 212 to acquire the spatial position information of the interactive device 200.
In addition, the terminal device 100 may recognize a change in the spatial position information of the interactive device 200 according to the marker 210, thereby detecting a motion state (e.g., a shaking state, a moving state, etc.) of the interactive device 200. The terminal device 100 may also detect a motion state (e.g., a shaking state, etc.) of the interactive device 200 according to six-degree-of-freedom information detected by an Inertial Measurement Unit (IMU) of the interactive device 200.
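The IMU-based detection mentioned above can be sketched as follows. This is a minimal illustration: the window size, acceleration threshold, and reversal count are arbitrary values chosen for the example, not figures taken from the patent.

```python
from collections import deque


class ShakeDetector:
    """Flags a shaking state from one-axis IMU acceleration samples.

    Rapid back-and-forth motion produces many strong sign reversals in
    the acceleration, which is what this detector counts. All thresholds
    here are hypothetical.
    """

    def __init__(self, window=20, accel_threshold=8.0, min_reversals=4):
        self.samples = deque(maxlen=window)  # recent acceleration samples
        self.accel_threshold = accel_threshold
        self.min_reversals = min_reversals

    def update(self, accel):
        """Feed one acceleration sample; return True if shaking."""
        self.samples.append(accel)
        return self.is_shaking()

    def is_shaking(self):
        # Keep only samples strong enough to count as deliberate motion,
        # then count how often their sign flips.
        strong = [a for a in self.samples if abs(a) > self.accel_threshold]
        reversals = sum(
            1 for a, b in zip(strong, strong[1:]) if (a > 0) != (b > 0)
        )
        return reversals >= self.min_reversals
```

Feeding alternating strong accelerations (vigorous back-and-forth motion) trips the detector, while small steady readings do not.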
Based on the display system, an embodiment of the application provides a virtual content display method applied to the terminal device of the display system. The method displays virtual content according to the position and posture information of the interactive device relative to the terminal device and, when the interactive device is determined to be in a shaking state, controls the display of the virtual content according to the shaking parameters of the interactive device, so that interaction with the virtual content is better realized. The method is described in detail below.
Referring to fig. 3, an embodiment of the present application provides a method for displaying virtual content, which is applicable to the terminal device, and the method for displaying virtual content may include:
Step S110: acquiring the position and posture information of the interactive device relative to the terminal device.
In the embodiment of the application, the terminal device may obtain the position and posture information of the interactive device relative to the terminal device, so as to display the virtual content according to this information. The posture information may include the orientation and rotation angle of the interactive device relative to the terminal device.
In this embodiment, the interactive device is a marker with a polyhedral structure; it may be a tetrahedral marker, a hexahedral marker, a twenty-six-faced marker, or a polyhedral marker with another number of faces, which are not listed here one by one. The polyhedral marker comprises a plurality of marking surfaces, at least one of which is provided with a marker.
Furthermore, the marker on the interactive device can be identified by the terminal device, and the position and posture information of the interactive device relative to the terminal device can be obtained.
In some embodiments, the marker may include at least one sub-marker, and a sub-marker may be a pattern with a certain shape. In one embodiment, each sub-marker may have one or more feature points, the shape of which is not limited: it may be a dot, a ring, a triangle, or another shape. In addition, the distribution rules of the sub-markers differ between markers, so each marker may carry different identity information. The terminal device may acquire the identity information corresponding to a marker by identifying the sub-markers it contains; the identity information may be any information that uniquely identifies the marker, such as a code, but is not limited thereto.
In one embodiment, the outline of the marker may be rectangular, though other shapes are possible; the rectangular region and the plurality of sub-markers within it constitute one marker. It should be noted that the shape, style, size, color, number of feature points, and distribution of a specific marker are not limited in this embodiment; the marker only needs to be identifiable and trackable by the terminal device.
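As a toy illustration of how sub-marker distribution can carry identity information, the sketch below reads the presence or absence of feature points in a grid as a binary code. The grid encoding is an assumption made for this example; the patent does not define a concrete scheme.

```python
def decode_marker_id(grid):
    """Decode a marker's identity from its sub-marker layout.

    `grid` is a 2D list of 0/1 cells, where 1 means a sub-marker or
    feature point is present at that cell. The row-major bit string is
    read as an integer ID. Hypothetical encoding for illustration only.
    """
    bits = [cell for row in grid for cell in row]
    marker_id = 0
    for bit in bits:
        marker_id = (marker_id << 1) | bit  # append each bit to the ID
    return marker_id
```

With this scheme, two markers whose sub-markers are distributed differently necessarily decode to different IDs, which is what lets the terminal device tell the faces apart.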
In the embodiment of the application, the interactive device may be placed within the visual field of the terminal device. When the terminal device needs to display virtual content, its image acquisition device can capture an image containing the interactive device and at least one of its markers. The visual field of the terminal device refers to the field of view of its image acquisition device, which may be determined by the size of that field of view.
When actually capturing an image containing the interactive device and at least one of its markers, the spatial position of the interactive device can be adjusted so that the interactive device and at least one of its markers are within the visual field of the image acquisition device, allowing the terminal device to capture and recognize the image.
In some embodiments, the interactive device may include at least two different non-coplanar markers. When an image of the marker on a certain face of the interactive device needs to be captured, the orientation and rotation angle of the interactive device relative to the terminal device can be changed by rotating the interactive device, so that the terminal device can capture that marker. Similarly, markers on multiple faces of the interactive device may be captured by rotating it.
After the terminal device captures an image containing the interactive device and at least one of its markers, it can recognize the image to acquire the position and posture information of the interactive device relative to the terminal device.
It will be appreciated that, since at least one marking surface of the interactive device is provided with a marker, the number of markers in the image may be one or more. In one mode, when at least one marker appears in the image, the position and posture information between the interactive device and the terminal device may be obtained by recognizing the spatial position of that marker and using the pre-stored positions of the marker relative to the other markers of the interactive device. In another mode, when multiple markers appear in the image, the position and posture information of the interactive device relative to the terminal device may be obtained by recognizing the spatial position of each of the markers. Naturally, capturing an image containing multiple markers of the interactive device allows the position and posture information to be acquired more accurately.
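To make the marker-based position recovery above concrete, here is a minimal sketch under a pinhole camera model: depth is estimated from the marker's apparent size, and lateral offsets from its image coordinates. Real systems typically solve the full perspective-n-point problem from the marker's corner points instead; the simplified estimator and every parameter name here are illustrative.

```python
def estimate_marker_position(focal_px, marker_size_m, apparent_size_px,
                             u, v, cx, cy):
    """Estimate a marker's 3D camera-space position in meters.

    focal_px: camera focal length in pixels
    marker_size_m: the marker's known physical edge length in meters
    apparent_size_px: the marker edge length as observed in the image
    (u, v): the marker center in image coordinates
    (cx, cy): the camera's principal point

    A hypothetical depth-from-apparent-size sketch, not a full pose solver.
    """
    z = focal_px * marker_size_m / apparent_size_px  # depth along optical axis
    x = (u - cx) * z / focal_px                      # lateral offset, right
    y = (v - cy) * z / focal_px                      # lateral offset, down
    return (x, y, z)
```

A marker of known 5 cm edge seen at 50 px by a 500 px focal-length camera sits half a meter away; a marker centered on the principal point has no lateral offset.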
Step S120: displaying the virtual content according to the position and posture information.
In some embodiments, after the position and posture information of the interactive device relative to the terminal device is obtained, the display position of the virtual content to be displayed may be obtained according to the position and posture information, and the virtual content to be displayed is displayed. The display position may be a position of the virtual content that can be seen by the user through the terminal device, that is, rendering coordinates of the virtual content in the virtual space.
Further, the terminal device may obtain the display position of the virtual content according to the relative positional relationship between the virtual content to be displayed and the interactive device, together with the position and posture information of the interactive device relative to the terminal device. It can be understood that, to superimpose the virtual content on the real world where the interactive device is located, the spatial coordinates of the interactive device in real space may be obtained; these coordinates may represent the positional relationship between the interactive device and the image acquisition device of the head-mounted display device, or between the interactive device and the terminal device.
After the position and posture information of the interactive device relative to the terminal device is obtained, the spatial coordinates of the interactive device in real space can be obtained and converted into virtual coordinates in the virtual space. The rendering coordinates of the virtual content in the virtual space are then obtained according to the relative positional relationship between the virtual content to be displayed and the interactive device, yielding the display position of the virtual content for display.
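The conversion from real-space coordinates to rendering coordinates can be sketched as below, assuming a simplified virtual space that shares the real-world origin and differs only by a uniform scale; the scale factor and the fixed content offset are assumptions made for the example.

```python
def to_render_coords(device_pos_real, world_to_virtual_scale, content_offset):
    """Map the interactive device's real-space position to the rendering
    coordinates of the virtual content.

    The device position is scaled into virtual space, then the content
    is placed at a fixed offset relative to the device. The identity-like
    mapping (shared origin, uniform scale) is an assumed simplification.
    """
    device_virtual = tuple(c * world_to_virtual_scale for c in device_pos_real)
    # Place the content at its configured offset from the device.
    return tuple(d + o for d, o in zip(device_virtual, content_offset))
```

For instance, with a unit scale and a content offset of one unit above the device, the crystal ball of fig. 4 would render directly above the tracked position.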
After the display position of the virtual content is obtained, the virtual content may be rendered according to its data and the obtained display position. The data of the virtual content may include model data, that is, data used for rendering the virtual content; for example, the color data, vertex coordinate data, and contour data used to build the corresponding virtual content. The data of the virtual content may be stored in the terminal device, or acquired from another electronic device such as the interactive device or a server. In some embodiments, the data of the virtual content may be obtained according to the identity information of at least one marker of the interactive device; that is, the data of the corresponding virtual content may be read according to the marker's identity information, so that the displayed virtual content corresponds to a certain marker of the interactive device.
In this way, the virtual content can be displayed in the virtual space, and the user sees the virtual content superimposed on the real world through the terminal device, realizing an augmented-reality display effect and improving the display of the virtual content. For example, as shown in fig. 4, the user can see the interactive device 200 in the real world through the terminal device, and can see the virtual content, a crystal ball 30, displayed in the virtual space superimposed at the corresponding position of the interactive device 200.
Step S130: detecting the motion state of the interactive device according to change information of at least one of its position and posture.
In this embodiment of the application, after the virtual content is displayed according to the position and posture information of the interactive device relative to the terminal device, the terminal device may detect the motion state of the interactive device so as to control the display of the virtual content when that state is a target state. The terminal device can detect the motion state according to change information of at least one of the position and the posture of the interactive device. It is understood that change information of the position refers to a change of the interactive device's position relative to the terminal device, and change information of the posture refers to a change of its posture relative to the terminal device. The terminal device may obtain this change information by recognizing and tracking the interactive device.
In some embodiments, the motion state of the interactive device may be detected according to the change information of its position, of its posture, or of both.
When detecting the motion state according to change information of at least one of the position and the posture, the terminal device may determine that the interactive device is in a motion state upon detecting a change of its position and/or posture, and determine the specific motion state, such as uniform motion, shaking, acceleration, or deceleration, from the specific change. That is to say, motion parameters such as the speed and direction of motion can be determined from the specific change of position and/or posture, and the specific motion state determined from those parameters. Of course, the specific manner of detecting the motion state of the interactive device is not limited here.
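The decision just described can be sketched as a small classifier over a short history of one-axis positions: near-zero velocities mean static, repeated strong velocity sign reversals mean shaking, and anything else counts as ordinary motion. The thresholds are hypothetical.

```python
def classify_motion(positions, dt, shake_speed=0.5, reversal_count=3):
    """Classify sampled one-axis positions into 'static', 'moving',
    or 'shaking'. `dt` is the sampling interval in seconds.

    Illustrative thresholds: a velocity below 1e-3 m/s is treated as
    rest, and `reversal_count` strong sign reversals mark shaking.
    """
    velocities = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    if all(abs(v) < 1e-3 for v in velocities):
        return "static"
    # Count strong back-and-forth reversals: consecutive velocities with
    # opposite signs, both above the shake-speed threshold.
    reversals = sum(
        1 for v, w in zip(velocities, velocities[1:])
        if v * w < 0 and abs(v) > shake_speed and abs(w) > shake_speed
    )
    return "shaking" if reversals >= reversal_count else "moving"
```

A device oscillating over 10 cm at 20 Hz classifies as shaking, while a steady drift in one direction classifies as moving.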
Step S140: obtaining shaking parameters of the interactive device when the interactive device is in a shaking state.
In the embodiment of the application, when the motion state of the interactive device is detected and the interactive device is determined to be in a shaking state, the trigger condition for the terminal device to control the display of the virtual content is met, and the display of the virtual content is controlled. The shaking state refers to the interactive device shaking according to certain motion parameters such as motion frequency, amplitude, and direction; for example, the interactive device reciprocating at a certain frequency within a certain distance in the horizontal direction.
Furthermore, when the interactive device is detected to be in a shaking state, its shaking parameters can be acquired, so that the terminal device controls the display of the virtual content according to the shaking parameters and the control applied to the virtual content matches the shaking of the interactive device. The shaking parameters may include at least one of a shaking frequency, a shaking direction, a shaking amplitude, and the like. The shaking frequency may refer to the number of times the interactive device shakes within a certain time, for example, the number of shakes within 1 s; the shaking direction refers to the moving direction of the interactive device while shaking; and the shaking amplitude refers to the range of position change, the range of posture-angle change, and the like, while the interactive device shakes.
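The shaking parameters just defined can be sketched, for illustration only, as a small Python structure plus a frequency measure; the field names, the reversal-counting definition of frequency, and the 1 s window are assumptions, not part of the patent:

```python
from dataclasses import dataclass

@dataclass
class ShakeParams:
    frequency: float   # shakes per second, e.g. reversals counted over 1 s
    direction: str     # dominant movement direction while shaking
    amplitude: float   # positional (or posture-angle) range of the shake

def shake_frequency(reversal_times, window_s=1.0):
    """Shaking frequency: direction reversals counted in the last `window_s`
    seconds of the recording, divided by the window length."""
    if not reversal_times:
        return 0.0
    t_end = max(reversal_times)
    return sum(1 for t in reversal_times if t > t_end - window_s) / window_s
```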
In this embodiment of the application, the shaking parameters may be determined from change information of the position and/or posture of the interactive device within a certain time period, for example, within a set time period before the interactive device is detected to be in the shaking state. Of course, the specific manner of obtaining the shaking parameters of the interactive device is not limited here.
Step S150: control the display of the virtual content according to the shaking parameters, so that the displayed virtual content corresponds to the shaking state of the interactive device.
After the terminal device obtains the shaking parameters of the interactive device, it can control the display of the virtual content according to the shaking parameters, so that the displayed virtual content is controlled correspondingly by shaking the interactive device.
In the embodiment of the application, different shaking parameters correspond to different control effects on the virtual content, and these control effects can cause the virtual content to display different effects. For example, the moving direction of the virtual content can be controlled according to the shaking direction, so that the moving direction of the virtual content is consistent with the shaking direction; the update speed of the virtual content can be controlled according to the shaking frequency, so that the update speed is consistent with the shaking frequency; and different display effects on the virtual content can be triggered according to the shaking amplitude. Of course, the control of the virtual content corresponding to a specific shaking parameter is not limited in the embodiment of the present application.
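As an illustrative sketch only, the three example mappings above (direction to move direction, frequency to update speed, amplitude to a triggered effect) can be written out as follows; the dictionary keys, the 1:1 proportionality, and the 0.2 amplitude threshold are assumptions:

```python
def control_effect(shake):
    """Map shaking parameters to illustrative control effects: the move
    direction follows the shake direction, the update speed follows the
    shake frequency, and a large amplitude triggers an extra effect."""
    effect = {
        "move_direction": shake["direction"],  # consistent with shaking direction
        "update_speed": shake["frequency"],    # 1:1 proportionality, an assumption
    }
    if shake["amplitude"] > 0.2:               # threshold is an assumption
        effect["trigger"] = "special_display_effect"
    return effect
```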
The virtual content display method above is applied to the terminal device: the virtual content is displayed according to the position and posture information of the interactive device relative to the terminal device, so that the user can observe the effect of the virtual content superimposed on the real world. When the interactive device is determined to be in a shaking state according to at least one of the change information of its position and posture, the shaking parameters of the interactive device are obtained, and the display of the virtual content is controlled according to the shaking parameters. The virtual content is thus controlled by shaking the interactive device, achieving good interaction with the displayed virtual content and improving the interactivity of the virtual content display.
Referring to fig. 5, an embodiment of the present application provides another virtual content display method, which is applicable to the terminal device, where the virtual content display method includes:
step S210: and acquiring the position and posture information of the interactive equipment relative to the terminal equipment.
Step S220: and displaying the virtual content according to the position and posture information.
In the embodiment of the present application, step S210 and step S220 may refer to the contents of the above embodiments, and are not described herein again.
Step S230: judge whether the change frequency of the position and/or posture of the interactive device within a specified duration is greater than a frequency threshold.
In the embodiment of the application, after the terminal device displays the virtual content according to the position and posture information of the interactive device relative to the terminal device, the motion state of the interactive device can be detected according to at least one of the change information of the position and posture of the interactive device, so as to determine the shaking state of the interactive device.
In some embodiments, the change of the position and posture of the interactive device, and whether that change occurs at a certain frequency, may be determined from the obtained change information of the position and posture. When the position and posture are determined to change at a certain frequency, that frequency can be taken as the change frequency of the position and posture of the interactive device.
Further, it may be determined whether the change frequency of the position of the interactive device is greater than a frequency threshold, so as to determine whether the interactive device is in a shaking state. It will be appreciated that when the interactive device is in a shaking state, it moves at a frequency greater than the frequency threshold. Therefore, when the change frequency of the position of the interactive device is greater than the frequency threshold, the interactive device can be determined to be in a shaking state; when it is not greater than the frequency threshold, the interactive device is not in a shaking state.
Similarly, whether the change frequency of the posture of the interactive device is greater than the frequency threshold can be judged in the same way as for the change frequency of the position: when the change frequency of the posture is greater than the frequency threshold, the interactive device is in a shaking state, and when it is not greater than the frequency threshold, the interactive device is not in a shaking state.
In addition, the shake state of the interactive device may also be determined jointly with the change frequency of the position and the change frequency of the posture of the interactive device, that is, when both the change frequency of the position and the change frequency of the posture of the interactive device are greater than the frequency threshold, the interactive device is determined to be in the shake state.
In the embodiment of the present application, the specific value of the frequency threshold is not limited; for example, the frequency threshold may be 1 time per second, 2 times per second, 3 times per second, and so on.
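The frequency-threshold test of step S230, in both its either/or form and the joint form described above, can be sketched for illustration as follows; the function name and the default threshold of 2 changes per second (one of the example values) are assumptions:

```python
def is_shaking(pos_change_freq, att_change_freq,
               threshold=2.0, require_both=False):
    """Frequency-threshold shake test.

    With require_both=False, either the position or the posture changing
    faster than the threshold counts as shaking; with require_both=True
    both change frequencies must exceed it (the joint criterion)."""
    if require_both:
        return pos_change_freq > threshold and att_change_freq > threshold
    return pos_change_freq > threshold or att_change_freq > threshold
```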
Step S240: when the change frequency of the position and/or posture is greater than the frequency threshold, determine that the interactive device is in a shaking state.
In this embodiment of the application, when it is detected in step S230 that the change frequency of the position and/or the posture of the interactive device is greater than the frequency threshold, it may be determined that the interactive device is in a shaking state.
In some embodiments, when the change frequency of the position and/or posture of the interactive device is determined to be greater than the frequency threshold, it may further be judged whether the position change range of the interactive device is within a certain range, whether the direction of the position change is a preset direction, and whether the posture change range (for example, the change range of the posture angle) is within a certain range, before the shaking state is confirmed.
Step S250: when the interactive device is in a shaking state, acquire the attitude parameters of the interactive device within a preset time period, and determine the variation range of the attitude parameters.
When the interactive device is detected to be in a shaking state, its shaking parameters can be acquired, so that the terminal device controls the display of the virtual content according to the shaking parameters and the control applied to the virtual content matches the shaking of the interactive device.
In the embodiment of the application, the attitude parameters of the interactive device within a preset time period can be obtained, and the variation range of the attitude parameters can be determined, so as to determine the shaking parameters of the interactive device. The preset time period may be a period of specified duration after the interactive device is detected to be in the shaking state, for example, the 2 s, 3 s, or 5 s period following the moment of detection.
In some embodiments, acquiring the attitude parameters of the interactive device within the preset time period includes: acquiring, within the preset time period, marker images containing at least one marker provided on the interactive device, and obtaining the attitude parameters of the interactive device from the marker images. The attitude parameters may include the attitude angle, rotation direction, angular velocity, acceleration, and the like of the interactive device; the specific attitude parameters are not limited in this embodiment.
It can be understood that, by identifying the marker on the interactive device in real time within the preset time period, the terminal device can obtain real-time spatial position information of the interactive device, including its attitude parameters and position, and thus obtain the attitude parameters of the interactive device within the preset time period. The manner in which the terminal device identifies the marker on the interactive device may refer to the contents of the above embodiments and is not described here again.
In some embodiments, acquiring the attitude parameters of the interactive device within the preset time period includes: receiving the attitude parameters detected and sent by the interactive device within the preset time period.
It is understood that the interactive device may include an inertial measurement unit (IMU), which can detect six-degree-of-freedom information of the interactive device. The six-degree-of-freedom information may include the translational and rotational degrees of freedom of the interactive device along three orthogonal coordinate axes (the X, Y, and Z axes) in space, and these degrees of freedom may constitute the attitude parameters of the interactive device. Therefore, the terminal device may obtain the attitude parameters within the preset time period as the parameters detected by the IMU and sent by the interactive device within that period.
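For illustration only, a six-degree-of-freedom IMU sample and a simple reduction of a window of samples to attitude parameters can be sketched as below; the field names, units, and the choice of per-axis extremes are assumptions, not the patent's specification:

```python
from dataclasses import dataclass

@dataclass
class ImuReading:
    """One six-degree-of-freedom IMU sample: translation along and rotation
    about the three orthogonal axes (X, Y, Z), as the text describes."""
    tx: float; ty: float; tz: float   # movement along X/Y/Z
    rx: float; ry: float; rz: float   # rotation about X/Y/Z, degrees

def attitude_ranges(samples):
    """Fold a window of IMU samples into per-axis (min, max) rotation
    extremes, a simple stand-in for 'attitude parameters over the period'."""
    return {axis: (min(getattr(s, axis) for s in samples),
                   max(getattr(s, axis) for s in samples))
            for axis in ("rx", "ry", "rz")}
```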
Of course, in the embodiment of the present application, a specific manner of obtaining the posture parameter of the interactive device may not be limited.
After the attitude parameters of the interactive device within the preset time period are acquired, the variation range of the attitude parameters can be determined. For example, the variation range of the attitude angle may be determined, such as a range of 20° to 170°; the variation range of the attitude direction, that is, the change of the interactive device from one attitude direction to another; or the variation range of the marker surface, that is, the change from one marker surface to another. Of course, the variation range of the acceleration may also be determined, and the specific attitude parameter whose variation range is determined is not limited.
Step S260: determine the shaking parameters of the interactive device based on the variation range of the attitude parameters.
After the variation range of the attitude parameters is obtained, the terminal device can determine the shaking parameters of the interactive device from it. In some embodiments, the variation range of the attitude angle can serve as the shaking amplitude of the interactive device, and the shaking direction can be determined from the variation range of the attitude direction. In addition, the shaking frequency can be determined from the number of posture changes within the preset time period, so that shaking parameters such as the shaking frequency, shaking amplitude, and shaking direction are acquired. Of course, the specific way of determining the shaking parameters is not limited in the embodiment of the present application; the attitude parameters of the interactive device may also be used directly as its shaking parameters.
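The derivation just described (angle span as amplitude, posture changes per second as frequency, a dominant change direction) can be sketched for illustration as follows; representing the direction as the most frequent label in the window is an assumption on top of the text:

```python
from collections import Counter

def shake_params_from_attitude(angle_range, change_directions, window_s):
    """Turn the attitude variation over a preset window into shaking
    parameters: amplitude = span of the attitude angle, frequency = posture
    changes per second, direction = the most frequent change direction."""
    low, high = angle_range
    direction = (Counter(change_directions).most_common(1)[0][0]
                 if change_directions else None)
    return {"amplitude": high - low,
            "frequency": len(change_directions) / window_s,
            "direction": direction}
```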
Step S270: generate a control instruction corresponding to the shaking parameters according to the correspondence between shaking parameters and control instructions, and control the display of the virtual content according to the control instruction, so that the displayed virtual content matches the position and/or posture change of the physical object.
After the terminal device obtains the shaking parameters of the interactive device, the display of the virtual content can be controlled according to the shaking parameters, so that the aim of correspondingly controlling the displayed virtual content through the shaking of the interactive device is fulfilled.
In the embodiment of the application, the terminal device can generate a control instruction corresponding to the shaking parameters and control the virtual content according to that instruction. Specifically, the correspondence between shaking parameters and control instructions is stored in the terminal device in advance; the correspondence may be set by the user, set by default when the terminal device leaves the factory, or acquired by the terminal device from a server.
After a control instruction is generated from the acquired shaking parameters, the display of the virtual content may be controlled according to that instruction. Different control instructions correspond to different control effects, which cause the virtual content to display different effects. For example, a control instruction generated from a higher shaking frequency updates the virtual content faster, and control instructions generated from different shaking amplitudes add different rendering effects to the virtual content. Of course, the above control effects are only examples, and the control effect corresponding to a specific control instruction is not limited in the embodiment of the present application.
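One way to picture a pre-stored correspondence table like the one described above is a rule list checked in order; everything here, the rule conditions, thresholds, and instruction names, is a hypothetical illustration, not the patent's stored table:

```python
# Hypothetical pre-stored correspondence between shaking parameters and
# control instructions; every entry and threshold here is an assumption.
INSTRUCTION_TABLE = [
    (lambda p: p["frequency"] >= 4.0, "update_content_fast"),
    (lambda p: p["frequency"] >= 2.0, "update_content"),
    (lambda p: p["amplitude"] >= 90.0, "add_rendering_effect"),
]

def instruction_for(params):
    """Return the first control instruction whose condition matches."""
    for condition, instruction in INSTRUCTION_TABLE:
        if condition(params):
            return instruction
    return "no_op"
```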
The virtual content display method above is applied to the terminal device: the virtual content is displayed according to the position and posture information of the interactive device relative to the terminal device, so that the user can observe the effect of the virtual content superimposed on the real world. The change frequency of the position and/or posture of the interactive device is compared with the frequency threshold; when the interactive device is determined to be in a shaking state, its shaking parameters are determined from the variation range of its posture, a control instruction is generated from the shaking parameters, and the display of the virtual content is controlled according to the control instruction. The virtual content is thus controlled by shaking the interactive device, achieving good interaction with the displayed virtual content and improving the interactivity of the virtual content display.
Referring to fig. 6, an embodiment of the present application provides another virtual content display method, which is applicable to the terminal device, where the virtual content display method includes:
step S310: and acquiring the position and posture information of the interactive equipment relative to the terminal equipment.
Step S320: and displaying the virtual content according to the position and posture information.
Step S330: and detecting the motion state of the interactive equipment according to at least one of the change information of the position and the posture of the interactive equipment.
Step S340: and when the interactive equipment is in a shaking state, obtaining shaking parameters of the interactive equipment.
In the embodiment of the present application, steps S310 to S340 may refer to the contents of the above embodiments, and are not described herein again.
Step S350: control the virtual content to perform at least one of content interaction, content addition, movement, rotation, content selection, and scaling adjustment according to the shaking parameters.
After the shaking parameters of the interactive device are obtained, the virtual content may be controlled to perform at least one of content interaction, content addition, movement, rotation, content selection, and scaling adjustment according to the shaking parameters. Of course, other controls may also be applied to the virtual content, such as copying or splitting the virtual content.
Controlling the virtual content to perform content interaction may refer to interaction between virtual contents in terms of display effect. When the virtual content includes multiple parts, interaction between them means that different parts influence each other's display effect; for example, one part of the virtual content applies a display effect to another part. Naturally, the interaction may also occur between the virtual content displayed by the terminal device and virtual content displayed by another terminal device.
In some embodiments, the virtual content displayed by the terminal device may include first virtual content and second virtual content; that is, the displayed virtual content may be composed of the first virtual content and the second virtual content. Of course, other virtual contents may also be included.
In addition, the shaking parameters of the interactive device may at least include a shaking direction, which is the moving direction of the interactive device while shaking, for example the horizontal direction or the vertical direction.
As one way, controlling the virtual content to perform content interaction according to the shaking parameters includes:
controlling the virtual content to display a shaking effect according to the shaking parameters and the shaking direction, wherein the first virtual content corresponds to a first shaking effect and the second virtual content corresponds to a second shaking effect; and controlling the first virtual content and the second virtual content to perform an interactive operation according to the first shaking effect and the second shaking effect.
It can be understood that when the virtual content is controlled to perform content interaction according to the shaking parameters, it can be controlled to display a shaking effect whose direction matches the shaking direction, so that the user sees the virtual content shake in the same direction as the interactive device. Moreover, the shaking effect may be specific to the virtual content: when the first virtual content differs from the second virtual content, their shaking effects differ as well. Thus, the first virtual content displays the first shaking effect according to the shaking direction, and the second virtual content displays the second shaking effect according to the shaking direction. For example, referring to fig. 7, the displayed virtual content includes first virtual content, second virtual content, and third virtual content: the first virtual content is a virtual sea 31, the second virtual content is a virtual ship 32 located on the virtual sea 31, and the third virtual content is a virtual crystal ball 30 containing the virtual sea 31 and the virtual ship 32. Through the terminal device, the user can see the virtual crystal ball 30 superimposed on the interactive device; when the user holds the interactive device for interaction, the virtual crystal ball 30 remains superimposed on it, so the user visually feels that the crystal ball is held in the hand. As shown in fig. 8, when the interactive device is in a shaking state, the terminal device may control the first virtual content and the second virtual content to display shaking effects according to the shaking parameters, that is, control the virtual sea 31 and the virtual ship 32 to display shaking effects respectively. Referring to fig. 9, the first shaking effect corresponding to the virtual sea 31 may be waves generated on its water surface, and the second shaking effect corresponding to the virtual ship 32 may be the ship rocking with the water surface of the virtual sea 31. Of course, the above is merely an example, and the application scenario is not limited thereto.
Of course, the terminal device may also determine the shaking effect of the virtual content in combination with other shaking parameters; for example, the shaking effect may also be determined according to the shaking frequency and/or the shaking amplitude.
In addition, the terminal device can control the first virtual content and the second virtual content to perform a corresponding interactive operation according to the first shaking effect and the second shaking effect. It can be understood that after the first and second virtual contents display their different shaking effects, a corresponding interactive operation can occur between them: the first virtual content applies a display effect to the second virtual content, and the second virtual content applies a display effect to the first virtual content, where each applied display effect may depend on the content itself and on both shaking effects. For example, in the application scenario shown in fig. 9, after the water surface of the virtual sea 31 is controlled to generate a wave-shaking effect and the virtual ship 32 is controlled to rock with the water surface, the virtual sea 31 may spray water onto the hull of the virtual ship 32, and the virtual ship 32 may in turn churn the sea water as it rocks.
For another example, in an application scenario simulating a chemical experiment, if the first virtual content is chemical solid 1 and the second virtual content is chemical liquid 1, the two may undergo a chemical reaction according to the first shaking effect of the solid and the second shaking effect of the liquid, and the effect of the chemical reaction is displayed. Of course, the above is merely an example, and the application scenario is not limited thereto.
Controlling the virtual content to add content according to the shaking parameters may mean adding other virtual content to the virtual space on the basis of the virtual content currently displayed by the terminal device, so that the currently displayed virtual content and the added virtual content are displayed together.
In some embodiments, the added virtual content may be extended content related to the virtual content currently displayed by the terminal device; the data corresponding to the added content may be stored in the terminal device in advance or acquired from another device (e.g., a server). For example, in an application scenario simulating cooking, when the displayed virtual content is a virtual dish, the added virtual content may be a seasoning, another dish, and the like. For another example, when the displayed virtual content is a map of a certain place, the added content may be the map around the current map, so that the two are displayed together and the displayed map area is enlarged. For another example, when the displayed virtual content is chemical solid 2, the added virtual content may be chemical liquid 2, so that chemical solid 2 can subsequently react with chemical liquid 2, simulating a chemical experiment. Of course, the above is merely an example, and the application scenario is not limited thereto.
Controlling the virtual content to move according to the shaking parameters may refer to moving the virtual content, or a part of it, in any direction. For example, in a game scene, the movement of a game object can be controlled according to the shaking parameters, so that by shaking the interactive device the game object can be placed at different positions in the virtual space and controlled to perform different operations. Of course, the above is merely an example, and the application scenario is not limited thereto.
In some embodiments, the shaking parameter may include at least a shaking frequency, and the shaking frequency may refer to a number of times the interactive device is shaken within a certain time.
Controlling the virtual content to move according to the shaking parameters may include: moving the virtual content at a speed corresponding to the shaking frequency; and, when the shaking frequency reaches a preset threshold, controlling the virtual content to change from a current first display state to a second display state.
It can be understood that when the virtual content is controlled to move according to the shaking parameters, a speed corresponding to the shaking frequency can be determined and used as the moving speed of the virtual content, and the terminal device controls the virtual content to move at that speed. The correspondence between shaking frequency and moving speed may be proportional, that is, the higher the shaking frequency, the higher the moving speed, so the user can control the moving speed of the virtual content by controlling how fast the interactive device is shaken.
In addition, whether the shaking frequency reaches a preset threshold can be detected, and when it does, the display state of the virtual content is changed; specifically, the virtual content may be controlled to change from the current first display state to a second display state. For example, referring to fig. 9 and fig. 10, in the application scenario shown in fig. 9, virtual rain 33, virtual clouds 34, virtual lightning 35, and the like may be added to display a raining effect. Controlling the change to the second display state may also be performed by changing the color or size of the virtual content, or by controlling the virtual content to display a specific effect such as a fire or smoke effect. Of course, the display state to which the virtual content changes is not limited in the embodiment of the present application.
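The two rules above, moving speed proportional to shaking frequency and a state switch at a preset threshold, can be sketched for illustration as follows; the proportionality constant and the threshold value are assumptions:

```python
def move_speed_and_state(shake_freq, speed_per_hz=0.1, state_threshold=5.0):
    """Moving speed proportional to the shaking frequency, plus a display
    state switch once the frequency reaches a preset threshold; the
    proportionality constant and threshold are assumptions."""
    speed = speed_per_hz * shake_freq  # higher frequency -> faster movement
    state = "second" if shake_freq >= state_threshold else "first"
    return speed, state
```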
Controlling the virtual content to rotate according to the shaking parameters may refer to rotating the virtual content in a predetermined direction (for example, horizontally, vertically, or freely) in a two-dimensional plane or three-dimensional space, that is, rotating it about a rotation axis in the predetermined direction to change the posture (such as the orientation) of the displayed virtual content. In some embodiments, the rotation direction may be set to a specific direction, so that once the interactive device is detected to be shaking, the virtual content rotates in that direction. In other embodiments, the rotation direction may correspond to the shaking direction, so that the virtual content rotates in the direction corresponding to the shaking direction. For example, when the displayed virtual content is an earth model, its rotation can be controlled according to the shaking parameters so as to display the model from different viewing angles.
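As a purely illustrative sketch of "the rotation direction corresponds to the shaking direction", one possible mapping is below; the specific pairing of shake direction to rotation axis is an assumption, since the text leaves the correspondence open:

```python
def rotation_axis_for(shake_direction):
    """Illustrative mapping from shaking direction to the rotation axis of
    the displayed content; the pairing itself is an assumption (a horizontal
    shake spins the model about the vertical axis, and vice versa)."""
    mapping = {"horizontal": "vertical_axis", "vertical": "horizontal_axis"}
    return mapping.get(shake_direction, "free_axis")
```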
Controlling the virtual content to perform content selection according to the shaking parameter may refer to selecting the virtual content or a part of the virtual content in the two-dimensional plane, so that the virtual content or the part thereof is in a selected state. In an application scenario, the virtual content may be a plurality of virtual option contents for dishes to be selected by a user; when one of the virtual option contents is at a designated position and the shaking parameter is a designated parameter, for example, when the shaking direction is a designated direction and the shaking frequency is higher than a set frequency, that virtual option content may be controlled to be in a selected state.
Controlling the virtual content to perform scaling adjustment according to the shaking parameter may refer to adjusting the model of the virtual content by an enlargement ratio or a reduction ratio, where the enlargement ratio and the reduction ratio are ratios of the size of the displayed virtual content to its original size. In some embodiments, whether the model of the virtual content is enlarged or reduced may be determined by the shaking direction; for example, when the shaking direction is a first direction, the model of the virtual content is enlarged, and when the shaking direction is a second direction, the model of the virtual content is reduced. In addition, the ratio by which the model of the virtual content is enlarged or reduced may be determined according to the shaking frequency and/or the shaking amplitude; for example, the higher the shaking frequency, the larger the enlargement or reduction ratio, and likewise, the larger the shaking amplitude, the larger the enlargement or reduction ratio.
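The scaling rule above can be sketched as follows; the base step of 0.1 and the direction labels "first"/"second" are assumed values chosen for the illustration, not specified by this application.

```python
# Illustrative sketch: the shaking direction selects enlargement or reduction,
# while the shaking frequency and amplitude set the magnitude of the ratio.

def compute_scale(shake_direction, shake_frequency, shake_amplitude):
    base_step = 0.1  # assumed sensitivity constant
    # Higher frequency or larger amplitude -> larger enlargement/reduction.
    magnitude = 1.0 + base_step * shake_frequency * shake_amplitude
    if shake_direction == "first":
        return magnitude          # enlarge the model of the virtual content
    if shake_direction == "second":
        return 1.0 / magnitude    # reduce the model of the virtual content
    return 1.0                    # other directions: keep the original size
```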
In some embodiments, the above-mentioned specific controls of the virtual content may also be performed in combination with one another. In one application scenario, referring to fig. 7, the virtual content may include a virtual crystal ball 30, a virtual sea 31, and a virtual ship 32. Referring to fig. 8 and fig. 9, when the terminal device detects that the interactive device is in a shaking state, the virtual sea 31 and the virtual ship 32 may be controlled to perform content interaction, so that the water surface of the virtual sea 31 generates a wave shaking effect following the shaking direction, and the virtual ship 32 shakes along with the water surface; referring to fig. 10, when the shaking frequency reaches a preset threshold, content addition may be performed, for example, virtual content such as virtual cloud 34, virtual rain 33, and virtual lightning 35 may be added to generate a display effect of a rain event; as shown in fig. 11, after the rain event is triggered, content addition may continue, a virtual monster 36 may be added to the virtual sea 31, and a display effect of the virtual monster 36 extending out of the sea surface to strike the virtual ship 32 may be displayed; referring to fig. 12 and 13, a user may move the virtual content by shaking the interactive device, so that the terminal device moves the virtual content according to the detected current shaking parameter of the interactive device, and may specifically control the virtual ship 32 to move in the virtual sea 31 to avoid the attack of the virtual monster 36.
Of course, the control of the virtual content based on the shaking parameter is not limited to the above; for example, the virtual content may also be controlled based on the shaking parameter to perform copying or the like. In addition, the terminal device may play audio corresponding to the virtual content, or the like, according to the shaking parameter.
According to the virtual content display method provided by the embodiment of the present application, the terminal device displays the virtual content according to the position and posture information of the interactive device relative to the terminal device, so that a user can observe the effect of the virtual content being superimposed on the real world. When the interactive device is determined to be in a shaking state according to the change information of the position and/or posture of the interactive device, the shaking parameter of the interactive device is determined by utilizing the posture change range of the interactive device, and the virtual content is controlled to perform content interaction, content addition, movement, rotation, content selection, scaling adjustment, and the like according to the shaking parameter, thereby achieving good interaction with the displayed virtual content and improving the interactivity of virtual content display.
Referring to fig. 14, an embodiment of the present application provides another virtual content display method, which is applicable to the terminal device, and the virtual content display method may include:
step S410: and acquiring the position and posture information of the interactive equipment relative to the terminal equipment.
Step S420: and displaying the virtual content according to the position and posture information.
Step S430: and when a control trigger instruction sent by the interactive equipment is received, detecting the motion state of the interactive equipment according to at least one of the change information of the position and the posture of the interactive equipment, wherein the control trigger instruction is generated by the interactive equipment according to the control operation detected by the control area.
In an embodiment of the present application, the interaction device may be provided with at least one control area, and the control area may include at least one of a key and a touch screen. The control operation can be detected by the control area of the interactive device, and the control operation can be the key operation of a user on a key and can also be the touch operation on a touch screen.
When the control operation is detected in the control area of the interactive device, the interactive device may generate a control trigger instruction according to the control operation. The control trigger instruction is used for triggering the terminal device to control the virtual content according to the shaking parameters, that is, the terminal device can control the virtual content according to the shaking parameters when detecting the shaking state after receiving the control trigger instruction sent by the interactive device.
Therefore, when the terminal device receives the control trigger instruction sent by the interactive device, the terminal device may detect the motion state of the interactive device according to at least one of the change information of the position and the posture of the interactive device. The specific manner of detecting the motion state of the interactive device may refer to the contents of the above embodiments, which are not described herein again.
Step S440: and when the interactive equipment is in a shaking state, obtaining shaking parameters of the interactive equipment.
In the embodiment of the present application, step S440 may refer to the contents of the above embodiments, and is not described herein again.
Step S450: and receiving a control instruction sent by the interactive equipment according to the control operation detected by the control area.
In this embodiment of the application, the terminal device may further receive a control instruction sent by the interactive device, where the control instruction is generated by the interactive device according to the detected control operation and sent to the terminal device, and is used for the terminal device to perform corresponding control on the virtual content. After receiving the control instruction, the terminal device may subsequently perform corresponding control on the virtual content according to the control instruction.
Step S460: and performing first control on the first content according to the shaking parameter, and performing second control on the second content according to the control instruction.
When the terminal device obtains the shaking parameters of the interactive device and receives the control instruction sent by the interactive device, the terminal device can control the virtual content according to the shaking parameters and the control instruction.
In the embodiment of the application, the virtual content displayed by the terminal device may include the first content and the second content. That is, the virtual content may be composed of the first content and the second content. Of course, the virtual content displayed by the terminal device may also include other content.
When the terminal device controls the virtual content according to the shaking parameter and the control instruction, the terminal device may control the first content and the second content respectively. Specifically, the terminal device may perform first control on the first content according to the shaking parameter, and perform second control on the second content according to the control instruction. The first control and the second control may be the same control or different controls on the virtual content; the specific control contents are not limited in the embodiment of the present application. Therefore, the virtual content can be controlled jointly through the shaking of the interactive device and the control operation detected by the interactive device, achieving a better interactive effect.
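The joint control of step S460 can be sketched as below; the dictionary field names, the "attack" instruction, and the specific effects are illustrative assumptions standing in for whatever first and second controls an implementation chooses.

```python
# Sketch of step S460: the shaking parameter drives a first control on the
# first content, and the control instruction drives a second control on the
# second content. Field names and effects are illustrative assumptions.

def apply_joint_control(first_content, second_content,
                        shake_params, control_instruction):
    # First control: e.g. a wave effect whose height follows the amplitude.
    first_content["wave_height"] = 0.5 * shake_params.get("amplitude", 0.0)
    # Second control: e.g. trigger an effect when an instruction arrives.
    if control_instruction == "attack":
        second_content["effect"] = "lightning_strike"
    return first_content, second_content
```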
In an application scenario, when the virtual crystal ball 30 is displayed and a control trigger instruction of the interactive device is detected, the terminal device may detect the motion state of the interactive device. As shown in fig. 15, when the control trigger instruction of the interactive device is detected, virtual seawater may be displayed being injected into the virtual crystal ball 30 to form a virtual sea 31. As shown in fig. 7, after the virtual sea 31 is displayed, a virtual ship 32 may be displayed on the virtual sea 31. Referring to fig. 8 and 9, when the interactive device is detected to be in a shaking state, the water surface of the virtual sea 31 may be controlled to generate a wave shaking effect according to the shaking parameters, and the virtual ship 32 may be controlled to shake along with the water surface of the virtual sea 31. As shown in fig. 10, when the shaking frequency is greater than the preset threshold, virtual contents such as virtual dark clouds 34, virtual rain water 33, and virtual lightning 35 are displayed, generating a display effect of a rain event. As shown in fig. 11, after these virtual contents are displayed, a virtual monster 36 may be added to the virtual sea 31, and a display effect of the virtual monster 36 extending from the sea surface to strike the virtual ship 32 may be displayed. In addition, as shown in fig. 16, the terminal device may receive a manipulation instruction sent by the interactive device according to the detected control operation, and control the virtual lightning 35 to attack the virtual monster 36 according to the manipulation instruction. Of course, the application scenario is not limited to this, and other scenarios may also be used.
In another application scenario, the virtual content may also be a chemical. When a chemical liquid 3 and a chemical solid 3 are displayed, the chemicals may likewise be controlled jointly through a manipulation instruction sent by the interactive device and the shaking parameters of the interactive device. For example, the chemical solid 3 may be controlled to display a shaking effect according to the shaking parameters, and heating of the chemical liquid 3 may be simulated according to the manipulation instruction, thereby achieving the effect of a simulated chemistry experiment.
Of course, the application scenario of the display method of the virtual content provided in the embodiment of the present application is not limited to this, and may also be other application scenarios.
According to the method for displaying virtual content provided by this embodiment, the terminal device displays the virtual content according to the position and posture information of the interactive device relative to the terminal device, so that a user can observe the effect of the virtual content being superimposed on the real world. After receiving the control trigger instruction sent by the interactive device, the terminal device detects the shaking state of the interactive device, determines the shaking parameters of the interactive device when it is determined that the interactive device is in a shaking state, further receives the control instruction sent by the interactive device, and controls the virtual content jointly according to the shaking parameters and the control instruction, thereby improving the interactive effect with the displayed virtual content and enhancing the interest of virtual content display.
Referring to fig. 17, a block diagram of a virtual content display apparatus 400 according to the present application is shown. The virtual content display apparatus 400 is applied to a terminal device, and the terminal device is connected with an interactive device. The virtual content display apparatus 400 includes: a position acquisition module 410, a content display module 420, a state detection module 430, a parameter acquisition module 440, and a content control module 450. The position acquisition module 410 is configured to acquire position and posture information of the interactive device relative to the terminal device; the content display module 420 is configured to display the virtual content according to the position and posture information; the state detection module 430 is configured to detect the motion state of the interactive device according to at least one of the change information of the position and the posture of the interactive device; the parameter acquisition module 440 is configured to acquire shaking parameters of the interactive device when the interactive device is in a shaking state; and the content control module 450 is configured to control the display of the virtual content according to the shaking parameters, so that the displayed virtual content corresponds to the shaking state of the interactive device.
In the embodiment of the application, the shaking parameter is obtained according to at least one of the change information of the position and the posture of the interactive device. The content control module 450 may be specifically configured to: and generating a control instruction corresponding to the shaking parameter according to the corresponding relation between the shaking parameter and the control instruction, and controlling the display of the virtual content according to the control instruction so as to enable the displayed virtual content to be matched with the position and/or posture change of the entity object.
In this embodiment, the content control module 450 may be specifically configured to: and controlling the virtual content to perform at least one of content interaction, content addition, movement, rotation, content selection and scaling adjustment according to the shaking parameters.
In some implementations, the shaking parameter includes a shaking direction, and the virtual content includes first virtual content and second virtual content. The content control module 450 controlling the virtual content to perform content interaction according to the shaking parameter includes: controlling the virtual content to display a shaking effect in the shaking direction according to the shaking parameter, wherein the first virtual content corresponds to a first shaking effect and the second virtual content corresponds to a second shaking effect; and controlling the first virtual content and the second virtual content to execute an interactive operation according to the first shaking effect and the second shaking effect.
In some embodiments, the shaking parameter includes a shaking frequency. The content control module 450 controlling the virtual content to move according to the shaking parameter includes: moving the virtual content at a speed corresponding to the shaking frequency according to the shaking parameter; and when the shaking frequency reaches a preset threshold value, controlling the virtual content to change from a current first display state to a second display state.
In this embodiment of the present application, the state detection module 430 may be specifically configured to: judging whether the change frequency of the position and/or the posture of the interactive equipment in the specified duration is greater than a frequency threshold value; and when the change frequency of the position and/or the posture is larger than the frequency threshold value, determining that the interactive equipment is in a shaking state.
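The frequency check performed by the state detection module can be sketched as below; the specified duration of 1.0 second and the threshold of 3.0 changes per second are assumed values, and the representation of pose changes as timestamps is a simplification for illustration.

```python
# Sketch of the state detection module's check: the interactive device is
# judged to be in a shaking state when the frequency of position/posture
# changes within a specified duration exceeds a frequency threshold.

def is_shaking(change_timestamps, now, duration=1.0, freq_threshold=3.0):
    """change_timestamps: times (in seconds) at which a position or posture
    change of the interactive device was detected; `now` is the current time."""
    recent = [t for t in change_timestamps if now - t <= duration]
    return len(recent) / duration > freq_threshold
```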
In this embodiment of the application, the parameter obtaining module 440 may be specifically configured to: acquiring attitude parameters of the interactive equipment in a preset time period, and determining the variation range of the attitude parameters of the interactive equipment; and determining the shaking parameters of the interactive equipment based on the variation range of the attitude parameters.
In this embodiment of the present application, the parameter acquisition module 440 obtaining the posture parameters of the interactive device within a preset time period includes: acquiring, within the preset time period, a marker image containing at least one marker provided on the interactive device, and obtaining the posture parameters of the interactive device according to the marker image; or receiving the posture parameters detected and sent by the interactive device within the preset time period.
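A simplified sketch of deriving shaking parameters from the variation range of the posture parameters follows; treating the posture as a single rotation angle and estimating frequency by counting direction reversals are assumptions made for the illustration, not the method mandated by this application.

```python
# Sketch: derive a shaking amplitude and an estimated shaking frequency from
# posture parameters (here, a rotation angle in degrees) sampled over a
# preset time period.

def shake_parameters(angle_samples, period_seconds):
    # Amplitude: half the variation range of the posture parameter.
    amplitude = (max(angle_samples) - min(angle_samples)) / 2.0
    # Frequency estimate: count points where the motion reversed direction.
    reversals = 0
    for i in range(1, len(angle_samples) - 1):
        d_prev = angle_samples[i] - angle_samples[i - 1]
        d_next = angle_samples[i + 1] - angle_samples[i]
        if d_prev * d_next < 0:   # the motion changed direction here
            reversals += 1
    frequency = (reversals / 2.0) / period_seconds  # full cycles per second
    return amplitude, frequency
```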
In this embodiment of the present application, the state detection module 430 may be specifically configured to: and when a control trigger instruction sent by the interactive equipment is received, detecting the motion state of the interactive equipment according to at least one of the change information of the position and the posture of the interactive equipment, wherein the control trigger instruction is generated by the interactive equipment according to the control operation detected by the control area.
In this embodiment, the content control module 450 may be specifically configured to: receiving a control instruction sent by the interactive equipment according to the control operation detected by the control area; and performing first control on the first content according to the shaking parameter, and performing second control on the second content according to the control instruction.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling. In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
To sum up, the solution provided by the present application is applied to a terminal device connected with an interactive device. By acquiring the position and posture information of the interactive device relative to the terminal device, the virtual content is displayed, so that a user can observe the effect of the virtual content being superimposed on the real world. When it is determined, according to at least one of the change information of the position and/or posture of the interactive device, that the interactive device is in a shaking state, the shaking parameters of the interactive device are obtained, and the display of the virtual content is controlled according to the shaking parameters, thereby better realizing interaction with the displayed virtual content and improving interactivity.
Referring to fig. 18, a block diagram of a terminal device according to an embodiment of the present application is shown. The terminal device 100 may be a terminal device capable of running an application, such as a smart phone, a tablet computer, a head-mounted display device, and the like. The terminal device 100 in the present application may include one or more of the following components: a processor 110, a memory 120, an image acquisition apparatus 130, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more programs configured to perform a method as described in the aforementioned method embodiments.
The processor 110 may include one or more processing cores. The processor 110 connects various parts of the entire terminal device 100 using various interfaces and lines, and performs various functions of the terminal device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and by calling data stored in the memory 120. Alternatively, the processor 110 may be implemented in at least one hardware form of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 110 may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like, where the CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem is used for handling wireless communication. It can be understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The Memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (Read-Only Memory). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing various method embodiments described below, and the like. The storage data area may also store data created by the terminal 100 in use, and the like.
In the embodiment of the present application, the image capturing device 130 is used to capture an image of a marker. The image capturing device 130 may be an infrared camera or a color camera, and the specific type of the camera is not limited in the embodiment of the present application.
Referring to fig. 19, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer readable storage medium 800 has storage space for program code 810 to perform any of the method steps of the method described above. The program code can be read from or written to one or more computer program products. The program code 810 may be compressed, for example, in a suitable form.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (12)

1. A method for displaying virtual content is applied to a terminal device, wherein the terminal device is connected with an interactive device, and the method comprises the following steps:
acquiring position and posture information of the interactive equipment relative to the terminal equipment;
displaying the virtual content according to the position and posture information;
detecting the motion state of the interactive equipment according to at least one of the change information of the position and the posture of the interactive equipment;
when the interactive equipment is in a shaking state, obtaining shaking parameters of the interactive equipment;
and controlling the display of the virtual content according to the shaking parameters so as to enable the displayed virtual content to correspond to the shaking state of the interactive equipment.
2. The method of claim 1, wherein the shaking parameter is obtained according to at least one of change information of a position and a posture of the interactive device;
the controlling the display of the virtual content according to the shaking parameter comprises:
and generating a control instruction corresponding to the shaking parameter according to the corresponding relation between the shaking parameter and the control instruction, and controlling the display of the virtual content according to the control instruction so as to match the displayed virtual content with the position and/or posture change of the entity object.
3. The method of claim 1, wherein the controlling the display of the virtual content according to the shaking parameter comprises:
and controlling the virtual content to perform at least one of content interaction, content addition, movement, rotation, content selection and scaling adjustment according to the shaking parameters.
4. The method of claim 3, wherein the shaking parameter comprises a shaking direction, and wherein the virtual content comprises first virtual content and second virtual content;
the controlling the virtual content to perform content interaction according to the shaking parameter comprises the following steps:
controlling the virtual content to display a shaking effect according to the shaking direction according to the shaking parameter, wherein the first virtual content corresponds to a first shaking effect, and the second virtual content corresponds to a second shaking effect;
and controlling the first virtual content and the second virtual content to execute interactive operation according to the first shaking effect and the second shaking effect.
5. The method of claim 3, wherein the shaking parameter comprises a shaking frequency, and wherein controlling the virtual content to move according to the shaking parameter comprises:
moving the virtual content at a speed corresponding to the shaking frequency according to the shaking parameters;
and when the shaking frequency reaches a preset threshold value, controlling the virtual content to change from a current first display state to a second display state.
6. The method according to claim 1, wherein the detecting the motion state of the interactive device according to at least one of the change information of the position and the posture of the interactive device comprises:
judging whether the change frequency of the position and/or the posture of the interactive equipment in a specified duration is greater than a frequency threshold value;
and when the change frequency of the position and/or the posture is larger than the frequency threshold value, determining that the interactive equipment is in a shaking state.
7. The method of claim 1, wherein obtaining the shaking parameters of the interactive device comprises:
acquiring attitude parameters of the interactive equipment within a preset time period, and determining the variation range of the attitude parameters of the interactive equipment;
and determining the shaking parameters of the interactive equipment based on the variation range of the attitude parameters.
8. The method of claim 7, wherein the obtaining of the posture parameter of the interactive device within a preset time period comprises:
acquiring, within a preset time period, a marker image containing at least one marker provided on the interaction device, and acquiring posture parameters of the interactive device according to the marker image; or
receiving the posture parameters detected and sent by the interactive device within a preset time period.
9. The method of claim 1, wherein the interactive device comprises a manipulation area, wherein the virtual content comprises a first content and a second content, and wherein the controlling the display of the virtual content according to the shaking parameter comprises:
receiving a control instruction sent by the interaction equipment according to the control operation detected by the control area;
and performing first control on the first content according to the shaking parameter, and performing second control on the second content according to the control instruction.
10. A virtual content display device is applied to a terminal device, the terminal device is connected with an interactive device, and the device comprises: a position acquisition module, a content display module, a state detection module, a parameter acquisition module and a content control module, wherein,
the position acquisition module is used for acquiring the position and posture information of the interactive equipment relative to the terminal equipment;
the content display module is used for displaying the virtual content according to the position and posture information;
the state detection module is used for detecting the motion state of the interactive equipment according to at least one of the change information of the position and the posture of the interactive equipment;
the parameter acquisition module is used for acquiring shaking parameters of the interactive equipment when the interactive equipment is in a shaking state;
the content control module is used for controlling the display of the virtual content according to the shaking parameters so as to enable the displayed virtual content to correspond to the shaking state of the interactive equipment.
11. A terminal device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method of any of claims 1-9.
12. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 9.
CN201910060758.7A 2019-01-21 2019-01-21 Virtual content display method and device, terminal equipment and storage medium Active CN111459263B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910060758.7A CN111459263B (en) 2019-01-21 2019-01-21 Virtual content display method and device, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111459263A 2020-07-28
CN111459263B 2023-11-03

Family

ID=71682283

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910060758.7A Active CN111459263B (en) 2019-01-21 2019-01-21 Virtual content display method and device, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111459263B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019569A (en) * 2012-12-28 2013-04-03 西安Tcl软件开发有限公司 Interactive device and interactive method thereof
US9383895B1 (en) * 2012-05-05 2016-07-05 F. Vinayak Methods and systems for interactively producing shapes in three-dimensional space
CN205644439U (en) * 2016-04-22 2016-10-12 邻元科技(北京)有限公司 Human machine interactive device of dice shape
CN106662926A (en) * 2014-05-27 2017-05-10 厉动公司 Systems and methods of gestural interaction in a pervasive computing environment
CN108269307A (en) * 2018-01-15 2018-07-10 歌尔科技有限公司 A kind of augmented reality exchange method and equipment
CN108958471A (en) * 2018-05-17 2018-12-07 中国航天员科研训练中心 The emulation mode and system of virtual hand operation object in Virtual Space
CN109240484A (en) * 2017-07-10 2019-01-18 北京行云时空科技有限公司 Exchange method, device and equipment in a kind of augmented reality system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112346580A (en) * 2020-10-27 2021-02-09 努比亚技术有限公司 Motion attitude detection method and device and computer readable storage medium
CN112346580B (en) * 2020-10-27 2024-01-12 努比亚技术有限公司 Motion gesture detection method, device and computer readable storage medium
CN113359988A (en) * 2021-06-03 2021-09-07 北京市商汤科技开发有限公司 Information display method and device, computer equipment and storage medium
CN113359988B (en) * 2021-06-03 2022-11-29 北京市商汤科技开发有限公司 Information display method and device, computer equipment and storage medium
WO2023015895A1 (en) * 2021-08-10 2023-02-16 青岛小鸟看看科技有限公司 Position change-based vr interaction method and system
CN114764327A (en) * 2022-05-09 2022-07-19 北京未来时空科技有限公司 Method and device for manufacturing three-dimensional interactive media and storage medium

Also Published As

Publication number Publication date
CN111459263B (en) 2023-11-03

Similar Documents

Publication Publication Date Title
CN111459263B (en) Virtual content display method and device, terminal equipment and storage medium
JP7256283B2 (en) Information processing method, processing device, electronic device and storage medium
CN111078003B (en) Data processing method and device, electronic equipment and storage medium
WO2019153750A1 (en) Method, apparatus and device for view switching of virtual environment, and storage medium
JP2022521324A (en) Game character control methods, devices, equipment and storage media
US8882593B2 (en) Game processing system, game processing method, game processing apparatus, and computer-readable storage medium having game processing program stored therein
CN109743892B (en) Virtual reality content display method and device
CN107890664A (en) Information processing method and device, storage medium, electronic equipment
US11513657B2 (en) Method and apparatus for controlling movement of virtual object, terminal, and storage medium
CN111383345B (en) Virtual content display method and device, terminal equipment and storage medium
US11087545B2 (en) Augmented reality method for displaying virtual object and terminal device therefor
EP2051208A2 (en) Generating an asset for interactive entertainment using digital image capture
CN110737414B (en) Interactive display method, device, terminal equipment and storage medium
CN108553895A (en) User interface element and the associated method and apparatus of three-dimensional space model
CN111223187A (en) Virtual content display method, device and system
CN111273777A (en) Virtual content control method and device, electronic equipment and storage medium
JP2022526512A (en) Interactive object drive methods, devices, equipment, and storage media
CN110737326A (en) Virtual object display method and device, terminal equipment and storage medium
US11100723B2 (en) System, method, and terminal device for controlling virtual image by selecting user interface element
CN110908508B (en) Control method of virtual picture, terminal device and storage medium
CN111913639B (en) Virtual content interaction method, device, system, terminal equipment and storage medium
CN108986228B (en) Method and device for displaying interface in virtual reality
US10497151B2 (en) Storage medium, information processing apparatus, information processing system and information processing method
CN111913564A (en) Virtual content control method, device and system, terminal equipment and storage medium
CN110598605B (en) Positioning method, positioning device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant