CN117093124A - Method, device, equipment and medium for adjusting AR display interface


Info

Publication number
CN117093124A
Authority
CN
China
Prior art keywords
display interface
user
display
equipment
adjusting
Prior art date
Legal status
Pending
Application number
CN202311055567.4A
Other languages
Chinese (zh)
Inventor
刘威
李政
夏勇峰
Current Assignee
Beijing Beehive Century Technology Co ltd
Original Assignee
Beijing Beehive Century Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Beehive Century Technology Co., Ltd.
Priority to CN202311055567.4A
Publication of CN117093124A
Status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the application provide a method, apparatus, device, and medium for adjusting an AR display interface. The method includes: acquiring the display interface of the current AR device; adjusting the display mode of the display interface according to user pose information to obtain an adjusted display interface, where the user pose information is collected from the pose of the user wearing the current AR device; and displaying the adjusted display interface. According to some embodiments of the application, the display mode can be adjusted through the user's pose information, so that the display interface always changes its display mode to follow the user's pose, which improves the user experience.

Description

Method, device, equipment and medium for adjusting AR display interface
Technical Field
Embodiments of the application relate to the field of AR display, and in particular to a method, apparatus, device, and medium for adjusting an AR display interface.
Background
In the related art, augmented reality (AR) technology fuses virtual information with the real world. While an AR device displays its interface, the user is required to manually adjust the device's parameters so that the interface is presented in the most suitable display mode. Manual adjustment, however, easily leaves the display interface tilted or out of focus, which harms the viewing experience and lowers the efficiency of adjusting the AR display interface.
How to adjust the AR display interface quickly has therefore become a problem to be solved.
Disclosure of Invention
Embodiments of the application provide a method, apparatus, device, and medium for adjusting an AR display interface. In at least some embodiments, the display mode can be adjusted through user pose information so that the display interface always changes its display mode to follow the user's pose, thereby achieving rapid adjustment of the AR display interface.
In a first aspect, the application provides a method for adjusting an AR display interface, the method comprising: acquiring the display interface of the current AR device; adjusting the display mode of the display interface according to user pose information to obtain an adjusted display interface, wherein the user pose information is collected from the pose of the user wearing the current AR device; and displaying the adjusted display interface.
By adjusting the display mode of the display interface according to the user's pose information, the embodiments of the application enable the display interface to always change its display mode to follow the user's pose, thereby achieving rapid adjustment of the AR display interface and improving the user experience.
With reference to the first aspect, in an embodiment of the application, the user pose information includes a user head pose, and the user head pose includes the position, angle, and movement direction of the user's head. Adjusting the display mode of the display interface according to the user pose information to obtain an adjusted display interface includes: determining the projection position and orientation of the display interface according to the position and angle of the user's head, and correspondingly adjusting the display angle of the display interface according to the movement direction to obtain the adjusted display interface.
By adjusting the display interface according to the position and movement direction of the user's head, the embodiments of the application keep the display interface in a relatively fixed position with respect to the user, improving the viewing experience.
With reference to the first aspect, in an embodiment of the application, the movement direction includes upward, downward, leftward, or rightward head movement. Adjusting the display angle of the display interface according to the movement direction includes: when the movement direction is leftward or rightward, moving the display interface along with the movement direction; when the movement direction is upward, controlling the display interface to tilt downward along with the movement direction; and when the movement direction is downward, controlling the display interface to tilt upward along with the movement direction.
By controlling the display interface to move along the user's movement direction, the embodiments of the application ensure that the display interface always stays directly in front of the user.
With reference to the first aspect, in an implementation of the application, adjusting the display mode of the display interface according to the user pose information to obtain an adjusted display interface includes: adjusting the display mode of the display interface according to the user pose information and environment information to obtain the adjusted display interface, wherein the environment information includes the position of a target object, and the target object is a person or object other than the user wearing the current AR device.
By adjusting the display interface according to environment information, the embodiments of the application ensure that the display interface is not affected by the surrounding environment, giving the user a better viewing experience.
With reference to the first aspect, in an implementation of the application, adjusting the display mode of the display interface according to the user pose information and the environment information to obtain an adjusted display interface includes: determining occlusion information from the position of the user's head and the position of the target object, wherein the occlusion information includes the extent to which the target object occludes the display interface; and moving the display interface based on the occlusion information to obtain the adjusted display interface.
By adjusting the display interface according to the occlusion relationship between the display interface and the environment, the embodiments of the application ensure that the display interface is not blocked by objects or people while it is displayed.
With reference to the first aspect, in an embodiment of the application, determining occlusion information from the position of the user's head and the position of the target object includes: tracking the position of the target object relative to the position of the user's head, and predicting the movement trajectory of the target object; and determining the occlusion information according to the movement trajectory.
By predicting the movement trajectory of the target object, the embodiments of the application can quickly adjust the position of the display interface while the target object moves, improving the efficiency of adjusting the display interface.
With reference to the first aspect, in an implementation of the application, acquiring the display interface of the current AR device includes: receiving the display interface sent to the current AR device by another AR device through sharing, wherein the other AR device is a device other than the current AR device in a network composed of a plurality of AR devices, and the plurality of AR devices can share the display interface.
Through interface sharing, the embodiments of the application enable real-time communication and scene interaction among multiple users, adding interest.
With reference to the first aspect, in an implementation of the application, after receiving the display interface sent by the other AR device to the current AR device through sharing, the method further includes: if the display interface is in an adjustable state, adjusting the display content of the display interface to obtain adjusted display content; and sending the adjusted display content to the other AR device through sharing.
By confirming the adjustable state before adjusting the display content of the display interface, the embodiments of the application prevent multiple users from changing the display content at the same time.
In a second aspect, the application provides an apparatus for adjusting an AR display interface, the apparatus comprising: a receiving module configured to acquire the display interface of the current AR device; an adjusting module configured to adjust the display mode of the display interface according to user pose information to obtain an adjusted display interface, wherein the user pose information is collected from the pose of the user wearing the current AR device; and a display module configured to display the adjusted display interface.
With reference to the second aspect, in an embodiment of the application, the user pose information includes a user head pose including the position, angle, and movement direction of the user's head; the adjusting module is further configured to: determine the projection position and orientation of the display interface according to the position and angle of the user's head, and correspondingly adjust the display angle of the display interface according to the movement direction to obtain the adjusted display interface.
With reference to the second aspect, in an embodiment of the application, the movement direction includes upward, downward, leftward, or rightward head movement; the adjusting module is further configured to: move the display interface along with the movement direction when the movement direction is leftward or rightward; control the display interface to tilt downward along with the movement direction when the movement direction is upward; and control the display interface to tilt upward along with the movement direction when the movement direction is downward.
With reference to the second aspect, in an embodiment of the application, the adjusting module is further configured to: adjust the display mode of the display interface according to the user pose information and environment information to obtain the adjusted display interface, wherein the environment information includes the position of a target object, and the target object is a person or object other than the user wearing the current AR device.
With reference to the second aspect, in an embodiment of the application, the adjusting module is further configured to: determine occlusion information from the position of the user's head and the position of the target object, wherein the occlusion information includes the extent to which the target object occludes the display interface; and move the display interface based on the occlusion information to obtain the adjusted display interface.
With reference to the second aspect, in an embodiment of the application, the adjusting module is further configured to: track the position of the target object relative to the position of the user's head, and predict the movement trajectory of the target object; and determine the occlusion information according to the movement trajectory.
With reference to the second aspect, in an embodiment of the application, the receiving module is further configured to: receive the display interface sent to the current AR device by another AR device through sharing, wherein the other AR device is a device other than the current AR device in a network composed of a plurality of AR devices, and the plurality of AR devices can share the display interface.
With reference to the second aspect, in an embodiment of the application, the receiving module is further configured to: if the display interface is in an adjustable state, adjust the display content of the display interface to obtain adjusted display content; and send the adjusted display content to the other AR device through sharing.
In a third aspect, the present application provides an electronic device, comprising: a processor, a memory, and a bus; the processor is connected to the memory via the bus, the memory storing a computer program which, when executed by the processor, performs the method according to any embodiment of the first aspect.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when executed, performs a method according to any embodiment of the first aspect.
Drawings
FIG. 1 is a schematic diagram of a scene for adjusting an AR display interface according to an embodiment of the application;
FIG. 2 is a flowchart of a method for adjusting an AR display interface according to an embodiment of the application;
FIG. 3 is a schematic diagram of an apparatus for adjusting an AR display interface according to an embodiment of the application;
FIG. 4 is a schematic diagram of an electronic device according to an embodiment of the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, but not all embodiments of the present application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art without any inventive effort, are intended to be within the scope of the present application based on the embodiments of the present application.
Embodiments of the application can be applied to scenes in which the display interface of an AR device is adjusted. To solve the problems described in the background, in some embodiments of the application the display mode of the display interface is adjusted according to the user's pose information to obtain an adjusted display interface, so that the display interface always changes its display mode to follow the user's pose, thereby achieving rapid adjustment of the AR display interface and improving the user experience.
The method steps in the embodiments of the present application are described in detail below with reference to the drawings.
FIG. 1 provides a schematic diagram of a scene for adjusting an AR display interface according to some embodiments of the application. The scene includes other AR devices 110 and the current AR device 120. Specifically, after a user of one of the other AR devices 110 in the sharing network selects a display interface to share, that device sends the display interface to the current AR device 120 through sharing; the current AR device 120 then adjusts the display mode of the display interface according to the user's pose information, obtains the adjusted display interface, and displays it.
The following describes, by way of example, how the current AR device carries out the adjustment of an AR display interface provided by some embodiments of the application.
To solve at least the problems described in the background, as shown in FIG. 2, some embodiments of the application provide a method for adjusting an AR display interface, which includes:
S210: Acquire the display interface of the current AR device.
It should be noted that the AR device may be any device capable of presenting an augmented reality interface; for example, it may be a pair of AR glasses. The AR glasses worn by each user include a display screen, a camera, sensors, a connection module, and a vision sensor. The display screen adopts high-resolution display technology, generally an OLED screen, to provide a clear and realistic image. The camera mainly captures the user's head movements and gestures so that the virtual television controller can be operated; it typically uses depth imaging, such as time-of-flight (ToF) or structured-light imaging, to achieve accurate head tracking and gesture recognition. The sensors include a gyroscope, an accelerometer, a magnetometer, and the like, and detect the pose and motion state of the AR glasses to enable accurate projection of, and interaction with, the virtual television program. The connection module connects wirelessly to other devices, such as a television or a smartphone, to receive and project the video frames of the virtual television program. The vision sensor may be a depth camera or an infrared camera that captures depth information and detects objects in the user's surroundings, providing a more realistic virtual television viewing experience and richer environmental interaction.
In one embodiment of the application, the display interface sent by another AR device to the current AR device through sharing is received.
It should be noted that the other AR device is a device other than the current AR device in a network composed of multiple AR devices, and the multiple AR devices can share the display interface.
That is, the current AR device can carry out multi-user sharing with other AR devices. After several users put on AR glasses, shared viewing of, and interaction with, a virtual television program can be achieved through the following steps, sketched in code below. S1: The AR glasses communicate over a wireless network, enabling real-time communication and scene interaction among the users. S2: Once one user selects a virtual television program, the AR glasses of the other users receive the corresponding signal and automatically switch to the same program, so that multiple users watch the same virtual television program at the same time.
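A minimal sketch of step S2's broadcast behaviour, assuming a simple in-process hub that stands in for the wireless network of step S1; the class and method names are illustrative assumptions, not the patent's implementation:

```python
from typing import Callable

# Stand-in for the wireless network of step S1: a hub that relays the
# selected program to every connected pair of glasses. Illustrative only.
class SharingHub:
    def __init__(self):
        self._subscribers: list[Callable[[str], None]] = []

    def join(self, on_program_selected: Callable[[str], None]) -> None:
        """Register one pair of glasses to receive program-selection signals."""
        self._subscribers.append(on_program_selected)

    def select_program(self, program_id: str) -> None:
        # Step S2: one user's choice is pushed to all glasses in the network,
        # which then switch to the same virtual television program.
        for notify in self._subscribers:
            notify(program_id)

hub = SharingHub()
hub.join(lambda p: print(f"glasses A now showing {p}"))
hub.join(lambda p: print(f"glasses B now showing {p}"))
hub.select_program("channel-42")
```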
In one embodiment of the application, after the current AR device receives the display interface and determines that the display interface is in an adjustable state, it adjusts the display content of the interface to obtain adjusted display content, and then sends the adjusted content to the other AR devices through sharing.
That is, the system of each AR device contains a synchronization-lock mechanism. When a user wants to adjust the picture, the device first attempts to acquire the adjustment permission. If acquisition succeeds, the adjustment proceeds directly, and the screen state on the server is set to a locked state; while the state is locked, no device other than the one performing the adjustment has permission to adjust the picture. When the user finishes or times out (for example, performs no adjustment operation for more than 5 seconds), the screen state is set back to the adjustable state, and only then do the other devices regain permission to adjust the picture. If acquisition fails, the user is prompted that someone else is operating and to try again later.
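A minimal sketch of such a synchronization lock, assuming a single in-process server object; the class and method names (ScreenLockServer, try_acquire, touch, release) are illustrative assumptions rather than the patent's implementation, and only the 5-second idle timeout comes from the description above:

```python
import threading
import time

class ScreenLockServer:
    ADJUST_TIMEOUT = 5.0  # seconds without adjustment before the lock expires

    def __init__(self):
        self._mutex = threading.Lock()
        self._holder = None          # device id currently allowed to adjust
        self._last_adjust_time = 0.0

    def try_acquire(self, device_id: str) -> bool:
        """Return True if device_id now holds the adjustment permission."""
        with self._mutex:
            expired = (time.monotonic() - self._last_adjust_time) > self.ADJUST_TIMEOUT
            if self._holder is None or self._holder == device_id or expired:
                self._holder = device_id
                self._last_adjust_time = time.monotonic()
                return True
            return False  # someone else is adjusting; caller should prompt the user

    def touch(self, device_id: str) -> bool:
        """Record an adjustment operation, refreshing the idle timeout."""
        with self._mutex:
            if self._holder != device_id:
                return False
            self._last_adjust_time = time.monotonic()
            return True

    def release(self, device_id: str) -> None:
        """Explicitly return the screen to the adjustable state."""
        with self._mutex:
            if self._holder == device_id:
                self._holder = None
```

A client whose try_acquire call returns False would show the "someone else is operating, try again later" prompt and retry.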
Optionally, users can also talk by voice through the microphone and speaker in the AR glasses, or exchange text through a chat window on the virtual television interface. While watching a virtual television program, users can discuss and comment with each other, or select interactive features in the program such as games and quizzes.
S220: Adjust the display mode of the display interface according to the user pose information to obtain the adjusted display interface.
In one embodiment of the application, the virtual television controller in the AR glasses performs operations such as selecting a television program, adjusting the volume, and switching channels through the following steps (a sketch of the event-to-command mapping follows):
S1: The user controls the virtual television controller through head movements or gestures. S2: The camera in the AR glasses captures the head movements and gestures and converts them into the corresponding commands. S3: The virtual television controller interacts in real time through the display screen according to the captured head movements and gestures; the user can select television programs, adjust the volume, and switch channels on the virtual television interface by touch-style operations.
Optionally, the virtual television controller can also be operated through voice commands recognized by speech-recognition technology; the user may select television programs, adjust the volume, and switch channels by voice.
In one embodiment of the application, the user pose information includes a user head pose, and the user head pose includes the position, angle, and movement direction of the user's head. The specific steps for adjusting the display mode of the display interface are as follows:
Determine the projection position and orientation of the display interface according to the position and angle of the user's head, and correspondingly adjust the display angle of the display interface according to the movement direction to obtain the adjusted display interface.
Specifically, the movement direction includes upward, downward, leftward, or rightward head movement, and the display angle is adjusted accordingly: when the movement direction is leftward or rightward, the display interface moves along with it; when the movement direction is upward, the display interface is controlled to tilt downward along with it; and when the movement direction is downward, the display interface is controlled to tilt upward along with it.
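The mapping just described can be sketched as follows; the pose fields and the step sizes are illustrative assumptions, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class InterfacePose:
    x: float         # horizontal position of the interface (metres, +x to the right)
    tilt_deg: float  # tilt of the interface in degrees (+ tilts the top upward)

# Illustrative step sizes; a real device would derive these from head velocity.
MOVE_STEP = 0.05   # metres per detected head move
TILT_STEP = 5.0    # degrees per detected head move

def adjust_display(pose: InterfacePose, direction: str) -> InterfacePose:
    """Apply the rules above: follow left/right moves, counter-tilt up/down moves."""
    if direction == "left":
        pose.x -= MOVE_STEP          # interface follows the head to the left
    elif direction == "right":
        pose.x += MOVE_STEP          # interface follows the head to the right
    elif direction == "up":
        pose.tilt_deg -= TILT_STEP   # head moves up: interface tilts downward
    elif direction == "down":
        pose.tilt_deg += TILT_STEP   # head moves down: interface tilts upward
    return pose
```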
That is, the camera in the AR glasses captures the pose of the user's head, and the video frame of the virtual television program is projected onto the display screen according to the captured head pose.
Specifically, S1: Capture the head pose and motion state. The camera in the AR glasses continuously captures the pose and motion state of the user's head. This can be achieved by directing the camera's field of view at the user's head and using visual algorithms, such as simultaneous localization and mapping (SLAM), that identify the position, angle, and movement direction of the head. It can be understood that the user's motion state may also be determined by a motion-sensing wristband worn by the user.
S2: Spatial mapping. The captured head pose and motion state must be spatially mapped to the video frame of the virtual television program to determine the correct projection position and orientation. This can be done with a geometric-transformation algorithm that converts the captured head pose and motion state into projection coordinates, orientation, and size on the display screen.
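A minimal sketch of such a geometric transformation, assuming a pinhole-style projection with made-up intrinsics; none of the matrices or constants below come from the patent:

```python
import numpy as np

# Illustrative camera intrinsics for the glasses' display (assumed values).
FX, FY = 800.0, 800.0      # focal lengths in pixels
CX, CY = 640.0, 360.0      # principal point (screen centre) in pixels

def head_pose_to_projection(R: np.ndarray, t: np.ndarray,
                            anchor_world: np.ndarray) -> tuple[float, float, float]:
    """Map a world-space anchor point of the virtual screen into display
    coordinates, given the head pose as rotation R (3x3) and position t (3,).

    Returns (u, v, scale): pixel coordinates plus a size factor that shrinks
    the picture as the anchor moves farther from the user.
    """
    # Express the anchor in the head's coordinate frame.
    p_head = R.T @ (anchor_world - t)
    z = max(p_head[2], 1e-6)          # depth in front of the user
    u = FX * p_head[0] / z + CX       # perspective projection
    v = FY * p_head[1] / z + CY
    scale = 1.0 / z                   # farther away means a smaller picture
    return u, v, scale

# Example: head at the origin looking down +z, screen anchored 2 m ahead.
R = np.eye(3)
t = np.zeros(3)
u, v, s = head_pose_to_projection(R, t, np.array([0.0, 0.0, 2.0]))
```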
S3: Video projection. The spatially mapped projection coordinates are associated with the video frame of the virtual television program. This can be achieved through projection technology integrated in the AR glasses, for example a laser projector, or by transmitting the video frame onto the display of the AR glasses.
S4: Video frame adjustment. The projected video frame is adjusted dynamically according to the captured head pose and motion state. For example, when the user turns the head left or right, the projected frame moves left or right accordingly; the viewing angle and size of the projected frame can likewise be adjusted according to up-and-down head movement.
It can be understood that the pose and motion state of the user's head may also be captured so that the video frame of the virtual television program is projected onto the display screen according to both of them.
Specifically, the user can adjust the viewing angle and size of the virtual television program by moving the head; both change with the captured head pose and motion state.
Moving the head left and right: when the user moves the head left or right, the viewing angle of the virtual television program is adjusted to the left or right accordingly. For example, when the user moves the head to the right, the projected video frame moves to the left, so that the user perceives the television program as moving to the right within the field of view.
Moving the head up and down: when the user moves the head up or down, the viewing angle and size of the virtual television program are adjusted accordingly. For example, when the user moves the head upward, the projected video frame tilts downward, so that the user perceives the program as tilting upward within the field of view. Moreover, the further the user moves the head upward, the smaller the video frame gradually becomes, so that the program feels farther away.
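A toy illustration of this tilt-and-shrink response to upward head movement; the coefficients are assumptions chosen only to make the relationship concrete:

```python
# Illustrative response to vertical head movement, per the paragraph above:
# upward movement tilts the picture downward and gradually shrinks it.
TILT_PER_METRE = 30.0   # degrees of downward tilt per metre of upward head motion
SHRINK_PER_METRE = 0.4  # fraction of size lost per metre of upward head motion

def vertical_response(head_dy: float) -> tuple[float, float]:
    """head_dy > 0 means the head moved up by head_dy metres.

    Returns (tilt_deg, scale): negative tilt means the interface leans
    downward; scale is clamped so the picture never vanishes entirely.
    """
    tilt_deg = -TILT_PER_METRE * head_dy
    scale = max(0.2, 1.0 - SHRINK_PER_METRE * max(head_dy, 0.0))
    return tilt_deg, scale

# Moving the head up 0.5 m tilts the picture down 15 degrees at 80 % size.
print(vertical_response(0.5))  # (-15.0, 0.8)
```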
Through the above steps, the AR glasses can automatically adjust the viewing angle and size of the virtual television program according to the pose and motion state of the user's head, so that the user can watch comfortably.
In one embodiment of the application, the display mode of the display interface is adjusted according to the user pose information and environment information to obtain the adjusted display interface.
It should be noted that the environment information includes the position of a target object, which is a person or object other than the user wearing the current AR device.
Specifically, occlusion information is first determined from the position of the user's head and the position of the target object; it can be understood that the occlusion information includes the extent to which the target object occludes the display interface. The display interface is then moved based on the occlusion information, thereby obtaining the adjusted display interface.
As a specific embodiment of the application, the position of the target object is first tracked relative to the position of the user's head and the movement trajectory of the target object is predicted; the occlusion information is then determined from the trajectory.
As another specific embodiment of the application, after the extent to which the target object occludes the display interface has been determined, the display interface is controlled to move by a distance greater than that extent, so that the target object always remains outside the display interface.
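A minimal sketch of this move-past-the-occluder rule in one screen dimension; the rectangle representation and the safety margin are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    right: float

    def overlap(self, other: "Rect") -> float:
        """Width of the horizontal overlap between two rectangles (0 if none)."""
        return max(0.0, min(self.right, other.right) - max(self.left, other.left))

MARGIN = 0.02  # extra clearance so the occluder ends up strictly outside

def dodge_occluder(interface: Rect, occluder: Rect) -> Rect:
    """Move the interface horizontally by slightly more than the occluded
    extent, away from the occluder's centre, as described above."""
    occluded = interface.overlap(occluder)
    if occluded == 0.0:
        return interface  # nothing blocks the interface; leave it in place
    shift = occluded + MARGIN
    # Move away from the occluder: shift left if the occluder sits to the right.
    occ_centre = (occluder.left + occluder.right) / 2
    ui_centre = (interface.left + interface.right) / 2
    if occ_centre >= ui_centre:
        shift = -shift
    return Rect(interface.left + shift, interface.right + shift)

# Example: an occluder covering the right 20 % of the interface pushes it left.
print(dodge_occluder(Rect(0.0, 1.0), Rect(0.8, 1.2)))  # Rect(left=-0.22, right=0.78)
```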
Specifically, object recognition and target tracking are techniques based on computer vision and deep-learning algorithms; the projection position and size of the virtual television program can be adjusted according to the position and motion of the target through the following steps (a sketch of the trajectory prediction follows the list):
S1: Visual recognition. The system uses computer-vision algorithms to recognize, in real time, people and objects in the user's surroundings. These algorithms can identify faces, bodies, objects, and so on by analyzing features and patterns in the image or video stream. By assigning each target a unique identifier, the system can track the position and movement of these targets.
S2: Target tracking. Once the system has recognized a target, it can track the target's position and motion across successive frames by comparing features of the target, such as its position, shape, and size, between frames. The system may also use a motion-estimation algorithm to estimate the target's speed and direction.
S3: Projection position and size adjustment. Once the system is tracking the target's position and motion, it can adjust the projection position and size of the virtual television program according to where the target is and how it moves in the real world. For example, if the user moves while watching a television program, the system adjusts the projection position accordingly so that the view of the virtual program always remains in the user's field of view; if the target is farther from the user, the system adjusts the size of the virtual program accordingly to ensure its visibility.
By continually recognizing and tracking the position and movement of targets, the system can dynamically adjust the projection position and size of the virtual television program, providing a smoother, personalized experience for each user: people can move freely while keeping a good viewing experience, and sharing the virtual television program among multiple users gains better environmental adaptability.
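The trajectory prediction mentioned in step S2 can be sketched with a constant-velocity model; this is an illustrative assumption, since the patent does not name a specific motion-estimation algorithm:

```python
import numpy as np

def predict_trajectory(positions: list[np.ndarray], horizon: int) -> list[np.ndarray]:
    """Predict the next `horizon` positions of a tracked target.

    Assumes constant velocity estimated from the last two observed
    positions (each a 2-D point in screen or world coordinates).
    """
    if len(positions) < 2:
        return [positions[-1]] * horizon  # not enough history; predict standstill
    velocity = positions[-1] - positions[-2]  # displacement per frame
    return [positions[-1] + velocity * (k + 1) for k in range(horizon)]

# Example: a target moving right by 0.1 units per frame.
history = [np.array([0.0, 0.0]), np.array([0.1, 0.0])]
future = predict_trajectory(history, horizon=3)
# Predicted positions: (0.2, 0), (0.3, 0), (0.4, 0); the occlusion check in
# the sketch above can then be run against these predicted positions.
```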
In one embodiment of the application, the user can personalize what he or she sees in the following ways:
The user can select different styles, pictures, and sound effects on the virtual television interface to match personal preferences; these settings are made through the personal-settings application built into the AR glasses.
The user can choose whether to synchronize these personalized settings to other users, achieving a unified experience and shared fun among multiple users; synchronization is performed through wireless communication between the AR glasses.
Visual recognition and target tracking: people and objects in each user's surroundings are recognized and tracked using computer vision and deep-learning algorithms, and the system automatically adjusts the projection position and size of the virtual television program according to the target's position and movement to provide a smoother experience for every user.
S230: Display the adjusted display interface.
The above describes specific embodiments of a method for adjusting an AR display interface; an apparatus for adjusting an AR display interface is described below.
As shown in FIG. 3, some embodiments of the application provide an apparatus 300 for adjusting an AR display interface, the apparatus comprising: a receiving module 310, an adjusting module 320, and a display module 330.
The receiving module 310 is configured to acquire the display interface of the current AR device; the adjusting module 320 is configured to adjust the display mode of the display interface according to user pose information to obtain an adjusted display interface, wherein the user pose information is collected from the pose of the user wearing the current AR device; and the display module 330 is configured to display the adjusted display interface.
In one embodiment of the application, the user pose information includes a user head pose including the position, angle, and movement direction of the user's head; the adjusting module 320 is further configured to: determine the projection position and orientation of the display interface according to the position and angle of the user's head, and correspondingly adjust the display angle of the display interface according to the movement direction to obtain the adjusted display interface.
In one embodiment of the application, the movement direction includes upward, downward, leftward, or rightward head movement; the adjusting module 320 is further configured to: move the display interface along with the movement direction when the movement direction is leftward or rightward; control the display interface to tilt downward along with the movement direction when the movement direction is upward; and control the display interface to tilt upward along with the movement direction when the movement direction is downward.
In one embodiment of the application, the adjusting module 320 is further configured to: adjust the display mode of the display interface according to the user pose information and environment information to obtain the adjusted display interface, wherein the environment information includes the position of a target object, and the target object is a person or object other than the user wearing the current AR device.
In one embodiment of the application, the adjusting module 320 is further configured to: determine occlusion information from the position of the user's head and the position of the target object, wherein the occlusion information includes the extent to which the target object occludes the display interface; and move the display interface based on the occlusion information to obtain the adjusted display interface.
In one embodiment of the application, the adjusting module 320 is further configured to: track the position of the target object relative to the position of the user's head, and predict the movement trajectory of the target object; and determine the occlusion information according to the movement trajectory.
In one embodiment of the application, the receiving module 310 is further configured to: receive the display interface sent to the current AR device by another AR device through sharing, wherein the other AR device is a device other than the current AR device in a network composed of a plurality of AR devices, and the plurality of AR devices can share the display interface.
In one embodiment of the application, the receiving module 310 is further configured to: if the display interface is in an adjustable state, adjust the display content of the display interface to obtain adjusted display content; and send the adjusted display content to the other AR device through sharing.
In an embodiment of the application, the modules shown in FIG. 3 can implement the processes of the method embodiments of FIG. 1 and FIG. 2. The operations and/or functions of the individual modules in FIG. 3 serve to realize the corresponding flows of those method embodiments; refer to the descriptions in the method embodiments above, with detailed descriptions omitted here where appropriate to avoid repetition.
As shown in FIG. 4, an embodiment of the application provides an electronic device 400, comprising a processor 410, a memory 420, and a bus 430. The processor is connected to the memory via the bus, and the memory stores computer-readable instructions which, when executed by the processor, implement the method of any of the embodiments above; refer to the descriptions in the method embodiments, which are not repeated here.
The bus is used for direct connection and communication between these components. The processor in the embodiments of the application may be an integrated-circuit chip with signal-processing capability. It may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. The methods, steps, and logic blocks disclosed in the embodiments of the application may be implemented or executed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may be, but is not limited to, random-access memory (RAM), read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), or electrically erasable programmable read-only memory (EEPROM). The memory stores computer-readable instructions which, when executed by the processor, perform the method described in the embodiments above.
It can be appreciated that the configuration shown in FIG. 4 is only illustrative and may include more or fewer components than shown in FIG. 4, or have a configuration different from that shown in FIG. 4. The components shown in FIG. 4 may be implemented in hardware, software, or a combination of the two.
Embodiments of the application also provide a computer-readable storage medium on which a computer program is stored; when the program is executed by a server, the method of any of the above embodiments is implemented. Refer to the descriptions in the method embodiments above; detailed descriptions are omitted here where appropriate to avoid repetition.
The above are only preferred embodiments of the application and are not intended to limit it; those skilled in the art may make various modifications and variations. Any modification, equivalent replacement, or improvement made within the spirit and principles of the application shall fall within its scope of protection. It should be noted that like reference numerals and letters denote like items in the figures, so that once an item is defined in one figure it need not be further defined or explained in subsequent figures.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily recognize that variations or substitutions are within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. A method of adjusting an AR display interface, the method comprising:
acquiring the display interface of the current AR device;
adjusting the display mode of the display interface according to user pose information to obtain an adjusted display interface, wherein the user pose information is collected from the pose of the user wearing the current AR device;
and displaying the adjusted display interface.
2. The method of claim 1, wherein the user pose information comprises a user head pose comprising a position, an angle, and a movement direction of the user's head;
wherein adjusting the display mode of the display interface according to the user pose information to obtain an adjusted display interface comprises:
determining the projection position and orientation of the display interface according to the position and angle of the user's head, and correspondingly adjusting the display angle of the display interface according to the movement direction to obtain the adjusted display interface.
3. The method of claim 2, wherein the movement direction comprises upward, downward, leftward, or rightward head movement;
wherein adjusting the display angle of the display interface according to the movement direction comprises:
when the movement direction is leftward or rightward, moving the display interface along with the movement direction;
when the movement direction is upward, controlling the display interface to tilt downward along with the movement direction;
and when the movement direction is downward, controlling the display interface to tilt upward along with the movement direction.
4. The method of any one of claims 1 to 3, wherein adjusting the display mode of the display interface according to the user pose information to obtain an adjusted display interface comprises:
adjusting the display mode of the display interface according to the user pose information and environment information to obtain the adjusted display interface, wherein the environment information comprises the position of a target object, and the target object is a person or object other than the user wearing the current AR device.
5. The method of claim 4, wherein adjusting the display mode of the display interface according to the user pose information and the environment information to obtain the adjusted display interface comprises:
determining occlusion information from the position of the user's head and the position of the target object, wherein the occlusion information comprises the extent to which the target object occludes the display interface;
and moving the display interface based on the occlusion information to obtain the adjusted display interface.
6. The method of claim 5, wherein determining occlusion information from the position of the user's head and the position of the target object comprises:
tracking the position of the target object relative to the position of the user's head, and predicting the movement trajectory of the target object;
and determining the occlusion information according to the movement trajectory.
7. The method of any one of claims 1 to 3, wherein acquiring the display interface of the current AR device comprises:
receiving the display interface sent to the current AR device by another AR device through sharing, wherein the other AR device is a device other than the current AR device in a network composed of a plurality of AR devices, and the plurality of AR devices can share the display interface.
8. The method of claim 7, wherein after receiving the display interface sent by the other AR device to the current AR device through sharing, the method further comprises:
if the display interface is in an adjustable state, adjusting the display content of the display interface to obtain adjusted display content;
and sending the adjusted display content to the other AR device through sharing.
9. An apparatus for adjusting an AR display interface, the apparatus comprising:
a receiving module configured to acquire the display interface of the current AR device;
an adjusting module configured to adjust the display mode of the display interface according to user pose information to obtain an adjusted display interface, wherein the user pose information is collected from the pose of the user wearing the current AR device;
and a display module configured to display the adjusted display interface.
10. An electronic device, comprising: a processor, a memory, and a bus;
the processor is connected to the memory via the bus, the memory storing a computer program which, when executed by the processor, performs the method according to any of claims 1-8.
11. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a computer program which, when executed, implements the method according to any of claims 1-8.
CN202311055567.4A 2023-08-21 2023-08-21 Method, device, equipment and medium for adjusting AR display interface Pending CN117093124A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311055567.4A CN117093124A (en) 2023-08-21 2023-08-21 Method, device, equipment and medium for adjusting AR display interface

Publications (1)

Publication Number Publication Date
CN117093124A 2023-11-21

Family

ID=88769334

Country Status (1)

Country Link
CN (1) CN117093124A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination