CN115113957A - Object display method and device, smart glasses and storage medium - Google Patents

Object display method and device, smart glasses and storage medium

Info

Publication number
CN115113957A
Authority
CN
China
Prior art keywords
target object
target
main interface
range
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110298285.1A
Other languages
Chinese (zh)
Inventor
王聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Shixiang Technology Co Ltd
Original Assignee
Guangzhou Shixiang Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Shixiang Technology Co Ltd filed Critical Guangzhou Shixiang Technology Co Ltd
Priority to CN202110298285.1A priority Critical patent/CN115113957A/en
Publication of CN115113957A publication Critical patent/CN115113957A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/163Wearable computers, e.g. on a belt

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an object display method, an object display apparatus, smart glasses, and a storage medium. The method comprises the following steps: displaying at least one target object within a target space range, the target space range being at least one space range preset within the spatial range of the user's field of view; detecting the real-time position of the target object during display; and, when the target object exceeds the target space range, moving the target object in reverse so that it is displayed within the corresponding target space range. With this scheme, when the device moves, the target object follows and remains displayed within the target space range, so the focused state of the target object is maintained continuously and the user does not lose its position; in addition, slight shaking of the user's head does not cause the target object to move, improving the user's experience of the application.

Description

Object display method and device, smart glasses and storage medium
Technical Field
The application relates to the technical field of head-mounted smart devices, and in particular to an object display method and device, smart glasses, and a storage medium.
Background
With the continuous development of head-mounted smart devices such as smart glasses, related technologies such as Virtual Reality (VR) and Augmented Reality (AR) have become widely used. In particular, Mixed Reality (MR), which encompasses both augmented reality and augmented virtuality, refers to a new visual environment generated by merging the real world and a virtual world, in which physical and digital objects coexist and interact in real time. Mixed reality is a development trend in scenarios such as film, broadcasting, and live performance. By combining camera tracking with real-time content-rendering techniques, smart glasses can provide the user with an immersive virtual environment, placing the person in a mixed real-and-virtual world and generating a visual environment that contains both physical entities and virtual information.
At present, smart-glasses systems mainly display a target object in a fixed manner. Taking the main interface as an example, its spatial position generally remains unchanged; when the user moves in real space, it is difficult to keep the target object in focus continuously, so its position is easily lost. In addition, when the user's head shakes slightly, the target object also moves frequently within the view, which affects the display effect.
Disclosure of Invention
The present application aims to address one of the above technical drawbacks, and provides an object display method, an object display apparatus, smart glasses, and a storage medium.
An object display method comprising the steps of:
displaying at least one target object within a target spatial range; the target space range is at least one preset space range in the space range of the field angle of the user;
detecting the real-time position of the target object in the display process;
and when the target object exceeds the target space range, reversely moving the target object so as to display the target object in the corresponding target space range.
The present application also provides an object display apparatus, including:
a display module for displaying at least one target object within a target spatial range; the target space range is at least one preset space range in the space range of the field angle of the user;
the detection module is used for detecting the real-time position of the target object in the display process;
and the adjusting module is used for reversely moving the target object when the target object exceeds the target space range so as to display the target object in the corresponding target space range.
The present application also provides smart glasses comprising glasses, one or more processors, and a memory;
the glasses are used for displaying image information;
the memory stores one or more application programs configured to be executed by the one or more processors, the one or more programs performing the object display method described above.
The present application also provides a computer storage medium having stored thereon a computer program which, when executed by a processor, implements an object display method as described above.
The technical scheme of the application has the following beneficial effects:
At least one target object is displayed within at least one target space range preset within the spatial range of the user's field of view; the position of the target object is detected in real time during display; and when the target object exceeds the target space range, the target object is moved in reverse so that it is displayed within the corresponding target space range. With this scheme, when the device moves, the target object follows and remains displayed within the target space range, so the focused state of the target object is maintained continuously and the user does not lose its position; in addition, slight shaking of the user's head does not cause the target object to move, improving the user's experience of the application.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic view of the field of view of a user of an MR system;
FIG. 2 is a flow chart of an object display method;
FIG. 3 is a schematic diagram of a target object;
FIG. 4 is a schematic view of the reverse movement;
FIG. 5 is a schematic diagram of a display position of a main interface in an initial state;
FIG. 6 is a diagram of an example display of a main interface;
FIG. 7 is another exemplary illustration of a main interface display;
FIG. 8 is a main interface following flow diagram of an embodiment;
FIG. 9 is a schematic view of the detection point not exceeding the detection area;
FIG. 10 is a schematic view of a detection point exceeding a detection area;
FIG. 11 is a schematic diagram of opening a new application window;
FIG. 12 is a schematic diagram of an object display apparatus.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those within the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Referring to fig. 1, which is a schematic view of the field of view of a user of an MR system, the gray background part is the user's field-of-view range. After entering the main interface, when the user wears the smart glasses and moves in real space, the view changes with the user's viewing angle; if the spatial position of the main interface remains unchanged, the position of the main interface is easily lost from the user's view.
Accordingly, the present application provides an object display method, referring to fig. 2, where fig. 2 is a flowchart of the object display method, and mainly includes the following steps:
step S10, displaying at least one target object within the target space range; the target space range is at least one preset space range in the space range of the field angle of the user.
In this step, within the space covered by the field of view of the user of the glasses device, at least one space range is constructed as the target space range, and at least one target object is displayed within it. A target space range has a definite range of spatial coordinates. A target object may be any object that needs to be displayed in space, such as an interface or an icon, and may be a two-dimensional or a three-dimensional object.
In one embodiment, the target object may be a partial image or an entire image of the display object; the target space range may be a plurality of independent spaces or a single space; and the reverse movement at least includes movement in a direction having a component opposite to the overrun direction.
Specifically, taking the main interface of a two-dimensional MR system as an example, referring to fig. 3, a schematic diagram of a target object, the target object may be the entire image of the main interface or the image of a certain portion of it (e.g., the dashed-frame portion in the drawing); the target space range may be a plurality of independent spaces constructed in space, or a single space. For the reverse movement, as shown in fig. 4, a schematic diagram of the reverse movement, it at least includes movement in a direction with a component opposite to the overrun direction, that is, movement whose angle to the original overrun direction is greater than 90°, so that the target object at least has a motion component opposite to the overrun direction and does not move further beyond the target space range.
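The "component opposite to the overrun direction" condition above reduces to a simple vector test: a candidate move counts as a reverse movement when the angle between it and the overrun direction exceeds 90°, i.e. their dot product is negative. A minimal sketch (the function name and 2D tuples are illustrative, not from the patent):

```python
def is_reverse_move(move_dir, overrun_dir):
    """A move qualifies as a 'reverse movement' when it has a motion
    component opposite to the overrun direction, i.e. the angle between
    the two vectors exceeds 90 degrees (negative dot product)."""
    dot = sum(m * o for m, o in zip(move_dir, overrun_dir))
    return dot < 0
```

For example, with an overrun to the right `(1, 0)`, a move up-and-left such as `(-0.2, 1)` still counts as reverse, because its horizontal component opposes the overrun.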
As an application example, after the user first enters the main interface of the MR system, or after the user presses the home (HOME) key to call up the main interface, the main interface is displayed at a specified position within the user's field-of-view range. As shown in fig. 5, a schematic diagram of the display position of the main interface in the initial state, the main interface is placed at the center of the user's field of view, and then the distance between the main interface and the user is set. Initially, the main interface may be displayed at the center of the user's field-of-view range, with its normal vector set to point toward the user's position and its distance from the user set to 2.3 meters.
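The initial placement just described can be sketched as follows: the panel sits a set distance along the gaze direction, with its normal pointing back at the user. This is a hedged illustration (the coordinate convention, tuple vectors, and function name are assumptions; only the 2.3 m distance and the normal-toward-user rule come from the description):

```python
import math

def place_main_interface(user_pos, gaze_dir, distance=2.3):
    """Place the main interface 'distance' metres along the user's gaze
    direction (the centre of the field of view), with the panel normal
    pointing back toward the user, as in the 2.3 m example above."""
    norm = math.sqrt(sum(g * g for g in gaze_dir))
    gaze = tuple(g / norm for g in gaze_dir)            # unit gaze direction
    position = tuple(p + distance * g for p, g in zip(user_pos, gaze))
    normal = tuple(-g for g in gaze)                    # panel faces the user
    return position, normal
```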
Step S20, detecting the real-time position of the target object during the display process.
In this step, the real-time position of the displayed target object is detected during movement to determine whether it exceeds the predetermined display range. When the smart glasses move, for example when the user wears them while moving through real space or turns the head, the main interface changes with the user's viewing angle. It is then judged whether the position of the detection point exceeds the coverage range corresponding to the detection area; if it does, the adjustment of the next step is performed.
In one embodiment, the detection may be performed by selecting a reference point on the target object and checking whether that reference point lies outside the predetermined display range.
For example, taking the main interface as an example, a detection point that the main interface follows may be selected on the main interface, and a detection area that the main interface follows may be selected as a display area within the user's field-of-view range; by selecting the detection point and the detection area and monitoring the detection point, the main interface is controlled to follow within a certain range.
In one scheme, a plurality of reference points of the main interface may be selected on the main interface and set as the detection points that the main interface follows; a reference point may be the center point of the main interface, or the four edge vertices of the main interface. In another scheme, a coverage area of a specific shape may be selected in the middle of the user's field-of-view range according to the shape and size of the main interface and set as the detection area that the main interface follows; this coverage area may be a rectangular area in the middle of the user's field-of-view range, or it may be set according to actual requirements, including other irregular shapes.
Referring to fig. 6, an example of the main interface display, assume that the center point of the main interface is selected as the detection point O, and a rectangle ABCD is selected as the detection area within the user's field-of-view range.
Referring to fig. 7, another example of the main interface display, assume that the four edge vertices of the main interface are selected as the detection points M, N, J, K, and a rectangle ABCD is selected as the detection area within the user's field-of-view range.
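Both schemes in fig. 6 and fig. 7 come down to the same containment test: every chosen detection point (the centre O, or the four vertices M, N, J, K) must lie inside the axis-aligned detection rectangle ABCD. A minimal sketch, assuming 2D view-space coordinates (function names are illustrative):

```python
def inside_rect(point, rect_min, rect_max):
    """True if a detection point lies inside the axis-aligned detection
    rectangle (the coverage range), given its min and max corners."""
    return all(lo <= p <= hi for p, lo, hi in zip(point, rect_min, rect_max))

def interface_within_area(detection_points, rect_min, rect_max):
    """The main interface needs no adjustment only while every detection
    point (centre O, or vertices M, N, J, K) stays inside the area."""
    return all(inside_rect(p, rect_min, rect_max) for p in detection_points)
```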
And step S30, when the target object exceeds the target space range, moving the target object reversely to display the target object in the corresponding target space range.
In this step, the detection points are monitored; when a detection point exceeds the coverage range, the main interface moves in the direction of re-entering the detection area until the detection point is back within the coverage range. The main interface is thus confined to a certain range, and the user is prevented from losing its position.
Several embodiments of the technical solution of the present application are explained below with reference to the drawings.
Taking the central point of the main interface as the detection point O and the rectangle ABCD as the detection area as an example, referring to fig. 8, fig. 8 is a flowchart of the main interface following of an embodiment, which includes the following steps:
s801, detecting 3DoF movement of the user's head;
s802, judging whether the detection point O exceeds the range of the rectangle ABCD; if yes, executing s803, otherwise executing s805;
s803, moving the main interface in the direction of re-entering the rectangle ABCD;
s804, judging whether the detection point O has entered the rectangle ABCD; if yes, executing s805, otherwise returning to s803;
s805, the main interface stops following the movement.
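One iteration of the loop above can be sketched as a per-frame step. This is a hedged illustration: the 2D coordinates, per-axis step size, and clamping strategy are assumptions; the patent only specifies moving toward the rectangle until the detection point re-enters it. In a real loop, the detection point would be recomputed from the interface pose each frame.

```python
import math

def follow_step(detect_point, rect_min, rect_max, interface_pos, speed=0.05):
    """One iteration of the following flow s801-s805: if detection point O
    has left rectangle ABCD, nudge the main interface toward the rectangle;
    once O is back inside, the interface stops following."""
    # Nearest in-range position for O: clamp it onto the rectangle.
    target = tuple(min(max(p, lo), hi)
                   for p, lo, hi in zip(detect_point, rect_min, rect_max))
    delta = tuple(t - p for t, p in zip(target, detect_point))
    if all(d == 0 for d in delta):           # s802 "no" branch -> s805
        return interface_pos, False          # interface stops following
    # s803: move toward the rectangle, at most 'speed' per axis per frame.
    step = tuple(math.copysign(min(abs(d), speed), d) if d else 0.0
                 for d in delta)
    new_pos = tuple(i + s for i, s in zip(interface_pos, step))
    return new_pos, True                     # s804 loops back to s803
```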
In the above moving process, referring to fig. 9, a schematic diagram of the detection point not exceeding the detection area, the detection point O follows the left side AD of the rectangle ABCD without exceeding its coverage range, which is also the range in which the user can focus. Referring to fig. 10, a schematic diagram of the detection point exceeding the detection area, the detection point O follows the left side AD of the rectangle ABCD and exceeds its coverage range; at this time, the main interface is controlled to move toward the interior of the rectangle ABCD until the detection point O re-enters its coverage range.
The technical scheme of the application can be applied to an MR system with 6DoF. When the user's head moves with 3DoF, the main interface follows and remains displayed, so the focused state of the main interface is maintained continuously after the user calls it up; at the same time, frequent movement of the main interface caused by slight shaking of the user's head is avoided.
In an embodiment, as shown in fig. 11, a schematic diagram of opening a new application window: after the display position of the main interface has been adjusted, when the user opens a new application window from the main interface, the display of the main interface is closed and the application window is displayed at the center of the user's current field of view, so that the user can quickly focus on the new application window.
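The window-opening behaviour just described is a two-step sequence: hide the main interface, then show the new window at the centre of the current field of view. A minimal sketch using injected callbacks (the callback signatures, tuple vectors, and default distance are assumptions, not from the patent):

```python
def open_application_window(hide_main, show_window, user_pos, gaze_dir,
                            distance=2.3):
    """On opening a new application window from the main interface (fig. 11):
    first close the main interface display, then show the window at the
    centre of the user's current field of view."""
    hide_main()                                    # main interface disappears
    norm = sum(g * g for g in gaze_dir) ** 0.5
    centre = tuple(p + distance * g / norm
                   for p, g in zip(user_pos, gaze_dir))
    show_window(centre)                            # window lands centre-of-view
    return centre
```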
Embodiments of the object display apparatus are set forth below.
Referring to fig. 12, fig. 12 is a schematic structural view of an object display apparatus, including:
a display module 10 for displaying at least one target object within a target spatial range; the target space range is at least one preset space range in the space range of the field angle of the user;
a detection module 20, configured to detect a real-time position of the target object during a display process;
and the adjusting module 30 is configured to, when the target object exceeds the target space range, move the target object in a reverse direction, so that the target object is displayed in the corresponding target space range.
With this scheme, when the device moves, the target object follows and remains displayed within the target space range, so the focused state of the target object is maintained continuously and the user does not lose its position; in addition, slight shaking of the user's head does not cause the target object to move, improving the user's experience of the application.
The object display apparatus corresponds one-to-one with the object display method; the embodiments and technical effects described for the object display method embodiments apply equally to the object display apparatus embodiments, which is hereby stated.
An embodiment of smart glasses is set forth below.
The present application also provides smart glasses comprising glasses, one or more processors, and a memory;
the glasses are used for displaying image information;
the memory stores one or more application programs configured to be executed by the one or more processors, the one or more programs performing the object display method described above.
Embodiments of the computer storage medium are set forth below.
A computer storage medium is provided, having stored thereon a computer program which, when executed by a processor, implements the object display method of any of the embodiments described above.
The technical scheme of the embodiment has the following beneficial effects:
At least one target object is displayed within at least one target space range preset within the spatial range of the user's field of view; the position of the target object is detected in real time during display; and when the target object exceeds the target space range, the target object is moved in reverse so that it is displayed within the corresponding target space range. With this scheme, when the device moves, the target object follows and remains displayed within the target space range, so the focused state of the target object is maintained continuously and the user does not lose its position; in addition, slight shaking of the user's head does not cause the target object to move, improving the user's experience of the application.
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, there is no strict ordering restriction, and the steps may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different times, and whose execution order is not necessarily sequential; they may be performed in turn or in alternation with other steps, or with at least a portion of the sub-steps or stages of other steps.
The foregoing is only a few embodiments of the present application and it should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present application, and that these improvements and modifications should also be considered as the protection scope of the present application.

Claims (10)

1. An object display method, comprising the steps of:
displaying at least one target object within a target spatial range; the target space range is at least one preset space range in the space range of the field angle of the user;
detecting the real-time position of the target object in the display process;
and when the target object exceeds the target space range, reversely moving the target object so as to display the target object in the corresponding target space range.
2. The object display method according to claim 1, wherein the target object is a partial image or an entire image in a display object;
the target space range is a plurality of independent spaces or a single space;
the reverse movement comprises at least a movement in a direction having a component opposite to the direction of the excess.
3. The object display method according to claim 2, wherein the target object is a main interface of an MR system;
the target spatial range is a user field angle range at a set distance from a user viewport.
4. The object display method of claim 3, wherein the steps of detecting the real-time position of the target object during the display process and, when the target object exceeds the target space range, reversely moving the target object so as to display the target object in the corresponding target space range, comprise:
in the display process, judging whether the position of the detection point exceeds the coverage range corresponding to the detection area; the detection point is a reference point which is selected on a main interface to be followed by the main interface, and the detection area is an area which is selected within the field angle range of the user to be followed by the main interface;
and when the detection point exceeds the coverage range, controlling the main interface to move towards the direction of entering the detection area until the detection point enters the coverage range.
5. The object display method according to claim 4, further comprising:
after entering a main interface, the main interface is placed at the center of the field angle of the user, the distance between the main interface and the user is set, and the normal vector of the main interface is set to point to the position of the user.
6. The object display method according to claim 4, further comprising:
selecting a plurality of reference points of a main interface on the main interface, and setting the reference points as detection points followed by the main interface;
and selecting a coverage area with a specific shape in the middle of the user field angle range according to the shape and the size of the main interface, and setting the coverage area as a detection area followed by the main interface.
7. The object display method according to claim 6, wherein the reference point is a center point of the main interface, or the reference points are four edge vertices of the main interface; the coverage area is a rectangular area in the middle of the user field angle range;
the method further comprises the following steps: and when the detection point does not exceed the coverage range, controlling the space coordinate of the main interface to be unchanged.
8. An object display apparatus, comprising:
a display module for displaying at least one target object within a target spatial range; the target space range is at least one preset space range in the space range of the field angle of the user;
the detection module is used for detecting the real-time position of the target object in the display process;
and the adjusting module is used for reversely moving the target object when the target object exceeds the target space range so as to display the target object in the corresponding target space range.
9. Smart glasses, comprising glasses, one or more processors, and a memory;
the glasses are used for displaying image information;
the memory stores one or more application programs configured to be executed by the one or more processors, the one or more programs performing the object display method of any of claims 1-8.
10. A computer storage medium on which a computer program is stored, the program, when executed by a processor, implementing an object display method according to any one of claims 1 to 8.
CN202110298285.1A 2021-03-19 2021-03-19 Object display method and device, smart glasses and storage medium Pending CN115113957A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110298285.1A CN115113957A (en) 2021-03-19 2021-03-19 Object display method and device, smart glasses and storage medium


Publications (1)

Publication Number Publication Date
CN115113957A true CN115113957A (en) 2022-09-27

Family

ID=83324262

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110298285.1A Pending CN115113957A (en) 2021-03-19 2021-03-19 Object display method and device, smart glasses and storage medium

Country Status (1)

Country Link
CN (1) CN115113957A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010040570A1 (en) * 1997-12-24 2001-11-15 John J. Light Method and apparatus for automated dynamics of three-dimensional graphics scenes for enhanced 3d visualization
CN105930044A (en) * 2016-04-20 2016-09-07 乐视控股(北京)有限公司 Display page locating method and system
CN107037873A (en) * 2016-10-09 2017-08-11 深圳市金立通信设备有限公司 A kind of display methods and terminal of virtual reality main interface
CN109426419A (en) * 2017-09-05 2019-03-05 菜鸟智能物流控股有限公司 Interface display method and related equipment
CN111198608A (en) * 2018-11-16 2020-05-26 广东虚拟现实科技有限公司 Information prompting method and device, terminal equipment and computer readable storage medium


Similar Documents

Publication Publication Date Title
US8640047B2 (en) Asynchronous handling of a user interface manipulation
CN109891465B (en) Method and device for processing virtual reality image
CN110166758B (en) Image processing method, image processing device, terminal equipment and storage medium
EP3662662B1 (en) Parallax viewer system for 3d content
US20040164957A1 (en) Three-dimensional object manipulating apparatus, method and computer program
US10890966B2 (en) Graphics processing systems
WO2019059020A1 (en) Control device, control method and program
US8938093B2 (en) Addition of immersive interaction capabilities to otherwise unmodified 3D graphics applications
KR20190125526A (en) Method and apparatus for displaying an image based on user motion information
US11375170B2 (en) Methods, systems, and media for rendering immersive video content with foveated meshes
CN109743892A (en) The display methods and device of virtual reality content
US11610380B2 (en) Method and computing device for interacting with autostereoscopic display, autostereoscopic display system, autostereoscopic display, and computer-readable storage medium
WO2018122448A1 (en) Method and apparatus for determining and varying the panning speed of an image based on saliency
JP2020119095A (en) Control device, control method and program
CN113286138A (en) Panoramic video display method and display equipment
CN106383577B (en) Scene control implementation method and system for VR video playing device
US7750907B2 (en) Method and apparatus for generating on-screen display using 3D graphics
CN115113957A (en) Object display method and device, smart glasses and storage medium
CN111833374A (en) Path planning method, system, storage medium and terminal based on video fusion
CN113282167B (en) Interaction method and device of head-mounted display equipment and head-mounted display equipment
CN114979614A (en) Display mode determining method and display mode determining device
CN114385062A (en) Display scheme switching method and device, readable storage medium and electronic equipment
CN109814703B (en) Display method, device, equipment and medium
CN105975165B (en) Display control method of fisheye menu
US11962743B2 (en) 3D display system and 3D display method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination