WO2022068434A1 - Method, apparatus, device and storage medium for controlling interface display - Google Patents

Method, apparatus, device and storage medium for controlling interface display

Info

Publication number
WO2022068434A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
game
touch
mobile terminal
sub
Prior art date
Application number
PCT/CN2021/112672
Other languages
English (en)
French (fr)
Inventor
程俊彰
Original Assignee
腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Co., Ltd. (腾讯科技(深圳)有限公司)
Priority to EP21874096.7A, published as EP4102346A4 (en)
Priority to JP2022565833A, published as JP2023523442A (ja)
Publication of WO2022068434A1 (zh)
Priority to US17/949,031, published as US20230017694A1 (en)


Classifications

    • A63F13/42 Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/2145 Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • G06F3/013 Eye tracking input arrangements
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment
    • G06F3/04845 GUI techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 GUI techniques using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 GUI techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Touch-screen or digitiser input of data by handwriting, e.g. gesture or text
    • G06F3/04886 Partitioning the display area of the touch-screen into independently controllable areas, e.g. virtual keyboards or menus
    • A63F2300/1075 Input arrangements specially adapted to detect the point of contact of the player on a surface, using a touch screen
    • A63F2300/308 Details of the user interface
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Definitions

  • The present application relates to the technical field of interface display, and in particular to controlling interface display.
  • Games running on mobile terminals mainly include stand-alone games and online games. After downloading a game application, a user can play the game on the mobile terminal.
  • The game interface is displayed on the mobile terminal in full-screen mode, and the user operates the game on the display interface of the mobile terminal.
  • A mobile terminal, however, provides many services besides games. If the user needs to handle other business during a game, the game must be paused or switched to the background, and resuming it requires starting the game application again. This makes the whole process cumbersome, requires frequent human-computer interaction, and consumes the processing resources of the mobile terminal.
  • The embodiments of the present application provide a method, apparatus, device, and storage medium for controlling interface display, so that a user can operate other services on the mobile terminal while the game continues, without frequently switching interfaces, thereby simplifying user operations.
  • The number of human-computer interactions is reduced, and so is the consumption of the mobile terminal's processing resources.
  • One aspect of the present application provides a method for controlling interface display, applied to a mobile terminal, the method including:
  • reducing the first game interface and displaying it on a first sub-interface if a switching mode is triggered, and displaying a first virtual key on a first touch interface, where the first sub-interface and the first touch interface are created on the user main interface when the switching mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
  • Another aspect of the present application provides an interface display control apparatus deployed on a mobile terminal, the apparatus including:
  • a display module configured to display a first game interface and a first virtual key on the user main interface in full-screen mode when a first game application is running, the first virtual key being used to control the first game application;
  • the display module being further configured to, if a switching mode is triggered, reduce the first game interface and display it on a first sub-interface, and display the first virtual key on a first touch interface, where the first sub-interface and the first touch interface are created on the user main interface when the switching mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
  • Another aspect of the present application provides a mobile terminal, which may include a memory and a processor, where the memory is used to store a program, and the processor is used to execute the program in the memory so as to perform, according to instructions in the program, the methods of the above aspects.
  • Another aspect of the present application provides a computer-readable storage medium storing instructions which, when executed on a computer, cause the computer to perform the methods of the above aspects.
  • Another aspect of the present application provides a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium.
  • The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device performs the methods provided by the above aspects.
  • The embodiments of the present application have the following advantages:
  • A method for controlling interface display is provided.
  • When a first game application is running, the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode. If the switching mode is triggered, a first sub-interface and a first touch interface are created on the user main interface, where the first touch interface is used to display virtual keys for controlling the first game application; the first game interface is reduced and displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface.
  • Because a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the user main interface, the user can operate other services on the mobile terminal while the game continues, without frequently switching interfaces, thus simplifying user operations.
  • The number of human-computer interactions is reduced, and so is the consumption of the mobile terminal's processing resources.
  • FIG. 1 is an environmental schematic diagram of a method for controlling interface display in an embodiment of the present application;
  • FIG. 2 is a schematic flowchart of a method for controlling interface display in an embodiment of the present application;
  • FIG. 3 is a schematic diagram of an embodiment of a method for controlling interface display in an embodiment of the present application;
  • FIG. 4 is a schematic diagram of displaying a first game interface in full-screen mode in an embodiment of the present application;
  • FIG. 5 is a schematic diagram of creating a first sub-interface and a first touch interface in an embodiment of the present application;
  • FIG. 6 is a schematic diagram of switching to the user main interface in an embodiment of the present application;
  • FIG. 7 is a schematic diagram of another embodiment of a method for controlling interface display in an embodiment of the present application;
  • FIG. 8 is a schematic diagram of switching to the first game interface in an embodiment of the present application;
  • FIG. 9 is a schematic diagram of triggering a click operation on the first game interface in an embodiment of the present application;
  • FIG. 10 is a schematic diagram of triggering a pressing operation on the first game interface in an embodiment of the present application;
  • FIG. 11 is a schematic diagram of triggering a sliding operation on the first game interface in an embodiment of the present application;
  • FIG. 12 is a schematic diagram of triggering a double-click operation on the first game interface in an embodiment of the present application;
  • FIG. 13 is a schematic diagram of triggering a multi-touch-point operation on the first game interface in an embodiment of the present application;
  • FIG. 14 is a schematic diagram of triggering a multi-touch-point operation on the first sub-interface in an embodiment of the present application;
  • FIG. 15 is a schematic diagram of an embodiment of switching the use state of the device in an embodiment of the present application;
  • FIG. 16 is a schematic diagram of another embodiment of switching the use state of the device in an embodiment of the present application;
  • FIG. 17 is a schematic diagram of the angle between the line connecting the eye coordinates and the horizontal direction of the mobile terminal in an embodiment of the present application;
  • FIG. 18 is a schematic diagram of implementing interface switching based on face recognition in an embodiment of the present application;
  • FIG. 19 is a schematic diagram of implementing interface switchback based on face recognition in an embodiment of the present application;
  • FIG. 20 is a schematic diagram of implementing interface switching based on an incoming call reminder in an embodiment of the present application;
  • FIG. 21 is a schematic diagram of implementing interface display based on an embedded interface in an embodiment of the present application;
  • FIG. 22 is a schematic diagram of implementing interface display based on a floating interface in an embodiment of the present application;
  • FIG. 23 is a schematic diagram of a drag operation performed on the first sub-interface in an embodiment of the present application;
  • FIG. 24 is a schematic diagram of another drag operation performed on the first sub-interface in an embodiment of the present application;
  • FIG. 25 is a schematic diagram of an embodiment of the first touch interface in an embodiment of the present application;
  • FIG. 26 is a schematic diagram of an embodiment of selecting parameters of the first touch interface in an embodiment of the present application;
  • FIG. 27 is a schematic diagram of another embodiment of a method for controlling interface display in an embodiment of the present application;
  • FIG. 28 is a schematic diagram of another embodiment of a method for controlling interface display in an embodiment of the present application;
  • FIG. 29 is a schematic diagram of another embodiment of a method for controlling interface display in an embodiment of the present application;
  • FIG. 30 is a schematic diagram of an embodiment of an interface display control apparatus in an embodiment of the present application;
  • FIG. 31 is a schematic structural diagram of a mobile terminal in an embodiment of the present application.
  • Embodiments of the present application provide a method, apparatus, device, and storage medium for controlling interface display, which allow a user to operate other services on the mobile terminal while the game continues, without frequently switching interfaces, thereby simplifying user operations.
  • The number of human-computer interactions is reduced, and so is the consumption of the mobile terminal's processing resources.
  • The game interface is displayed on the mobile terminal in full-screen mode, and the user can operate the game on the display interface of the mobile terminal.
  • The present application provides a method for controlling interface display, which allows the user to operate other services on the mobile terminal while the game continues, without frequently switching interfaces, thereby simplifying user operations.
  • When a user uses a mobile terminal to play a game application, the game interface is displayed on the mobile terminal in full-screen mode.
  • A sub-interface and a touch interface can be created on the user main interface, and the game interface can be reduced and displayed on the sub-interface.
  • Virtual keys can be displayed on the touch interface.
  • The user can control the game application through these virtual keys. On this basis, the user can operate other services on the mobile terminal while the game continues, without frequently switching interfaces, which simplifies user operations and improves the user experience.
  • The number of human-computer interactions is reduced, and so is the consumption of the mobile terminal's processing resources.
  • FIG. 1 is an environmental schematic diagram of a method for controlling interface display in an embodiment of the present application.
  • A game interface display control system includes a mobile terminal, on which an application program, such as a game application, is deployed.
  • The mobile terminal involved in this application may be a smartphone, a tablet computer, a palmtop computer, or the like, but is not limited thereto.
  • This application takes the game application as the first game application as an example. In FIG. 1, A1 indicates the first game interface in full-screen mode, A2 indicates the first sub-interface, A3 indicates the first game interface displayed on the first sub-interface, A4 indicates the first touch interface, and A5 indicates the first virtual key.
  • When the mobile terminal runs the first game application, the first game interface and the first virtual key are displayed on the user main interface in full-screen mode.
  • If the switching mode is triggered, the mobile terminal creates a first sub-interface and a first touch interface on the user main interface, reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface; the first virtual key is used to control the first game application.
  • If the switch-back mode is triggered, the mobile terminal again displays the first game interface and the first virtual key on the user main interface in full-screen mode.
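  • Once the game interface is reduced onto the first sub-interface, a touch on the sub-interface must be mapped back into the coordinate space of the full game interface. A minimal sketch of that mapping, assuming the sub-interface is an axis-aligned, uniformly scaled copy of the game interface (all class and method names are illustrative, not taken from the patent):

```java
// Sketch: map a touch point on the reduced first sub-interface back into the
// coordinate space of the full-screen first game interface. Assumes the
// sub-interface is an axis-aligned, uniformly scaled copy of the game
// interface; names are illustrative.
class SubInterfaceMapper {
    private final float subX, subY, subW, subH; // sub-interface rectangle on the main UI
    private final float gameW, gameH;           // size of the full game interface

    SubInterfaceMapper(float subX, float subY, float subW, float subH,
                       float gameW, float gameH) {
        this.subX = subX; this.subY = subY; this.subW = subW; this.subH = subH;
        this.gameW = gameW; this.gameH = gameH;
    }

    /** Returns {x, y} in game coordinates, or null if the touch is outside the sub-interface. */
    float[] toGameCoords(float touchX, float touchY) {
        if (touchX < subX || touchX > subX + subW
                || touchY < subY || touchY > subY + subH) {
            return null; // touch landed outside the sub-interface
        }
        float gx = (touchX - subX) / subW * gameW;
        float gy = (touchY - subY) / subH * gameH;
        return new float[]{gx, gy};
    }
}
```

A touch at the centre of the sub-interface then maps to the centre of the game interface, regardless of where the sub-interface has been dragged on the main UI.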
  • When the game engine renders an image to the display interface of the mobile terminal, the image is presented through a surface view (SurfaceView). Because a SurfaceView has an independent drawing surface, that is, it does not share the same drawing surface with its host window, the user interface (UI) of a SurfaceView can be drawn in a separate thread.
  • A SurfaceView does not occupy main-thread resources, so it can implement a complex and efficient UI without the user's operations failing to receive a timely response.
  • The SurfaceView can be presented in a picture-in-picture mode or a floating window mode.
  • Picture-in-picture is a way of presenting images in which, while the mobile terminal displays one picture, another picture is simultaneously played in a small area of it. The floating window mode, by contrast, overlays a movable window on top of the application; it should be understood that the mobile terminal may also need to obtain authorization from the system to use the floating window mode.
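  • The choice between the two presentation modes can be sketched as a simple fallback: use the floating window only when the system has authorized it, otherwise fall back to picture-in-picture. The permission flag below is an assumed input rather than a real Android API call, and the names are illustrative.

```java
// Sketch: choose the presentation style for the reduced game surface.
// The overlay-authorization flag is an assumed input, not an Android API call.
class PresentationChooser {
    enum Style { FLOATING_WINDOW, PICTURE_IN_PICTURE }

    static Style choose(boolean prefersFloatingWindow, boolean overlayAuthorized) {
        if (prefersFloatingWindow && overlayAuthorized) {
            return Style.FLOATING_WINDOW; // movable window on top of the application
        }
        return Style.PICTURE_IN_PICTURE;  // needs no extra system authorization here
    }
}
```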
  • FIG. 2 is a schematic flowchart of a method for controlling interface display in an embodiment of the present application, as shown in the figure:
  • In step S1, the mobile terminal initiates an authorization request to the user, where the permissions included in the request may be the window-on-top display permission, the front-camera shooting permission, the gravity-sensing permission, and the incoming-call monitoring permission.
  • In step S2, the mobile terminal determines in real time whether the switching mode is currently triggered; if so, step S3 is performed.
  • The switching mode includes, but is not limited to: the user clicks a virtual button in the game interface, the screen rotates, a phone call comes in, or the orientation of the user's face matches the orientation of the smartphone screen.
  • In step S3, the mobile terminal creates a sub-interface and a touch area on its user main interface; the sub-interface and touch area may be in floating window mode or picture-in-picture mode, which is not limited here.
  • In step S4, the mobile terminal renders the game interface to the sub-interface and switches the game to run in the background.
  • In step S5, the user can drag the sub-interface, operate on it, and control the game process through the touch area.
  • In step S6, the mobile terminal determines in real time whether the switch-back mode is currently triggered; if so, the game returns to the foreground, the touch area disappears, and the game interface is displayed in full-screen mode.
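  • The steps above amount to a two-state display-mode machine: any switching trigger shrinks the game onto the sub-interface, and the switch-back trigger restores full-screen mode. A minimal sketch, with trigger names that are illustrative assumptions rather than the patent's terminology:

```java
// Sketch of the display-mode state machine behind steps S2-S6.
// Trigger names are illustrative assumptions.
class DisplayModeController {
    enum Mode { FULL_SCREEN, SUB_INTERFACE }
    enum Trigger { SWITCH_BUTTON, SCREEN_ROTATED, INCOMING_CALL, FACE_ORIENTATION, SWITCH_BACK }

    private Mode mode = Mode.FULL_SCREEN;

    Mode getMode() { return mode; }

    /** Applies a trigger and returns the resulting display mode. */
    Mode onTrigger(Trigger t) {
        if (mode == Mode.FULL_SCREEN && t != Trigger.SWITCH_BACK) {
            mode = Mode.SUB_INTERFACE; // S3-S4: create sub-interface and touch area
        } else if (mode == Mode.SUB_INTERFACE && t == Trigger.SWITCH_BACK) {
            mode = Mode.FULL_SCREEN;   // S6: game returns to the foreground
        }
        return mode;
    }
}
```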
  • FIG. 3 is a schematic diagram of an embodiment of a method for controlling interface display in an embodiment of the present application.
  • An embodiment of controlling interface display includes the following steps:
  • The mobile terminal runs the first game application.
  • The mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the first virtual key being used to control the first game application.
  • The first game interface and the first virtual key are displayed on the user main interface in full-screen mode.
  • The full-screen mode may mean that the mobile terminal displays the first game interface in the entire displayable area of the screen, or that it displays the first game interface in most of the displayable area: a small portion at the top of the screen displays a status bar (such as operator, time, and battery level information), while the rest of the screen displays the first game interface.
  • FIG. 4 is a schematic diagram of displaying the first game interface in full-screen mode in an embodiment of the present application. As shown in the figure, FIG. 4(A) shows the first game interface displayed in most of the displayable area of the screen of the mobile terminal, and FIG. 4(B) shows the first game interface displayed in the entire displayable area of the screen.
  • when the switching mode is triggered, the first game interface is reduced and displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface, wherein the first sub-interface and the first touch interface are created on the user main interface when the switching mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
  • the mobile terminal may create a first sub-interface and a first touch interface on the user main interface.
  • the user main interface represents the first user interface seen after starting the mobile terminal.
  • the user main interface includes commonly used application icons, time, battery power information, and operator information.
  • FIG. 5 is a schematic diagram of creating a first sub-interface and a first touch interface in an embodiment of the present application.
  • B1 is used to indicate the user main interface
  • B2 is used to indicate the first sub-interface, and B3 is used to indicate the first touch interface.
  • the first sub-interface and the first touch interface may also be a circle, a triangle, or a square, among other shapes.
  • the shape of the first sub-interface and the first touch interface, and their display positions in the user main interface, can be adjusted according to the actual situation. For example, the display position of the first touch interface in the user main interface may be independent of the display position of the first sub-interface, or may coincide with the display position of the first sub-interface, which is not limited here.
  • the mobile terminal can reduce the first game interface and display it on the first sub-interface. Since the first virtual key is a virtual key for controlling the first game application, the first virtual key can be displayed on the first touch interface. It should be noted that the first virtual key displayed on the first touch interface and the first virtual key displayed in the full-screen mode are both used to control the first game application; the two may be the same or different in the number of keys and the shape of the keys, which is not limited here.
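The "reduction process" described here amounts to scaling the full-screen game interface so that it fits inside the sub-interface while preserving its aspect ratio. A hedged sketch: the patent does not prescribe a specific scaling rule, so the function name and the fit-inside-a-box approach are illustrative assumptions.

```python
def reduce_to_subinterface(game_w, game_h, box_w, box_h):
    """Scale the full-screen game interface (game_w x game_h) to fit a
    sub-interface bounding box (box_w x box_h), preserving aspect ratio."""
    scale = min(box_w / game_w, box_h / game_h)  # largest uniform scale that fits
    return round(game_w * scale), round(game_h * scale)

# a 1920x1080 landscape game interface reduced into a 480x480 sub-interface box
print(reduce_to_subinterface(1920, 1080, 480, 480))  # (480, 270)
```

Using the minimum of the two axis ratios guarantees neither dimension overflows the sub-interface.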
  • when the user operates other services on the mobile terminal, the user can control the first game application through the first virtual key displayed on the first touch interface, so as to continue the progress of the first game application.
  • FIG. 6 is a schematic diagram of switching to the main user interface in an embodiment of the present application.
  • (A) in FIG. 6 shows the first game interface that has not undergone reduction processing
  • (B) in FIG. 6 shows the first game interface after the reduction processing
  • (C) in Figure 6 shows the main user interface
  • C1 is used to indicate the first sub-interface
  • C2 is used to indicate the first touch interface
  • C3 is used to indicate the first virtual key on the first touch interface.
  • the first game interface that has been reduced and processed is displayed on the first sub-interface
  • the first virtual key is displayed on the first touch interface.
  • a method for controlling interface display is provided.
  • a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the user main interface, so that the user can operate other services on the mobile terminal while the game progress continues, without the need to switch interfaces frequently, thereby simplifying user operation.
  • the number of human-computer interactions is reduced, and the consumption of processing resources of the mobile terminal is reduced.
  • FIG. 7 is a schematic diagram of another embodiment of the control interface display method in this embodiment of the present application.
  • An embodiment of the control interface display in the example includes:
  • when the mobile terminal runs the first game application, the mobile terminal displays the first game interface and the first virtual key on the user main interface in a full-screen mode, and the first virtual key is used to control the first game application.
  • the mobile terminal performs a reduction process on the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, wherein the first sub-interface and the first touch interface are created on the user main interface when the switching mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
  • steps 201 to 202 are similar to those described in steps 101 to 102, and thus are not described herein again.
  • the mobile terminal when the switch-back mode is triggered, the mobile terminal will switch to the full-screen mode to display the first game interface and the first virtual key corresponding to the first game application, and no longer display the first touch area separately.
  • FIG. 8 is a schematic diagram of switching to the first game interface in the embodiment of the application.
  • (A) in FIG. 8 shows the main user interface
  • D1 is used to indicate the first sub-interface
  • D2 is used to indicate the first touch interface
  • D3 is used to indicate the first virtual key.
  • the user can also view the game interface of the first game application through the first sub-interface, and control the first game application through the first virtual key displayed on the first touch interface.
  • the user can turn the mobile terminal to the landscape orientation, or click the second virtual button in the first sub-interface, thereby switching to the view shown in (B) in FIG. 8.
  • another method for controlling the display of an interface is provided.
  • a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the user main interface, so that the user can operate other services on the mobile terminal while the game progress continues, without the need to switch interfaces frequently, thus simplifying user operation.
  • the game progress can also continue during the switching process, so on the basis of satisfying user needs, user operation is further simplified. At the same time, the number of human-computer interactions is reduced, and the consumption of processing resources of the mobile terminal is reduced.
  • after the mobile terminal displays the first game interface and the first virtual key on the user main interface in a full-screen mode, the following steps are also included:
  • when detecting a preset switching operation of a touch object on the touch screen with respect to the first game interface, the mobile terminal determines to trigger the switching mode, wherein the preset switching operation includes at least one of a click operation, a pressing operation, a sliding operation, a double-click operation, and a multi-touch point operation.
  • a method for determining a trigger switching mode through a preset switching operation is introduced.
  • the mobile terminal displays the first game interface and the first virtual key on the user's main interface in full-screen mode, if the user needs to operate other applications (for example, video applications, instant messaging applications, shooting applications, or ordering applications, etc.)
  • the user can perform a preset switching operation on the first game interface on the touch screen. Therefore, when the mobile terminal detects a preset switching operation of the touch object on the touch screen with respect to the first game interface, it determines to trigger the switching mode.
  • the touch object may be the user's finger, knuckles or other touchable objects, which are not specifically limited here.
  • the first sub-interface and the first touch interface can be created on the user main interface.
  • the method of creating the first sub-interface and the first touch interface has been introduced in step 102, so it is not repeated here. It can be understood that the preset switching operations include, but are not limited to, click operations, pressing operations, sliding operations, double-click operations, and multi-touch point operations.
  • a method for determining a trigger switching mode through a preset switching operation is provided.
  • the user adopts the preset switching operation to realize the interface switching, thereby improving the feasibility of the solution implementation.
  • the switching operation can be at least one of various operations, and the user can perform operations according to requirements, thereby increasing the flexibility of the solution.
  • when a click operation on the first virtual button in the first game interface is detected, the mobile terminal determines to trigger the switching mode;
  • when a pressing operation is detected, the mobile terminal determines to trigger the switching mode, wherein the pressing time of the pressing operation is greater than or equal to the first time threshold;
  • when a sliding operation is detected, the mobile terminal determines to trigger the switching mode, wherein the sliding trajectory of the sliding operation is generated based on the starting position of the touch object and the ending position of the touch object;
  • when a double-click operation is detected, the mobile terminal determines to trigger the switching mode, wherein the double-click time interval of the double-click operation is less than or equal to the second time threshold;
  • when a multi-touch point operation is detected, the mobile terminal determines to trigger the switching mode, wherein the multi-touch point operation is generated based on the inward approach of at least two touch points.
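The trigger conditions listed above can be summarized in one decision function. A Python sketch under assumed names; the threshold values (2 s for the press, 0.5 s for the double-click) are only example values mentioned elsewhere in the text, not fixed by the patent.

```python
# Example thresholds from the text: the first time threshold may be 2 s (or 3 s),
# the second time threshold 0.5 s (or 1 s); both are configurable in practice.
FIRST_TIME_THRESHOLD = 2.0   # seconds, minimum press duration
SECOND_TIME_THRESHOLD = 0.5  # seconds, maximum double-click interval

def triggers_switching_mode(operation, *, press_seconds=0.0,
                            double_click_interval=None,
                            touch_points_move_inward=False):
    """Return True when the detected operation satisfies its trigger condition."""
    if operation == "click":          # click on the first virtual button
        return True
    if operation == "press":          # press time >= first time threshold
        return press_seconds >= FIRST_TIME_THRESHOLD
    if operation == "slide":          # trajectory from start to end position
        return True
    if operation == "double_click":   # interval <= second time threshold
        return (double_click_interval is not None
                and double_click_interval <= SECOND_TIME_THRESHOLD)
    if operation == "multi_touch":    # at least two points approaching inward
        return touch_points_move_inward
    return False

print(triggers_switching_mode("press", press_seconds=2.5))                 # True
print(triggers_switching_mode("double_click", double_click_interval=0.8))  # False
```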
  • the first method is a click operation. If the user clicks the first virtual button in the first game interface, the click operation is triggered.
  • FIG. 9 is a schematic diagram of triggering the click operation on the first game interface in the embodiment of the application.
  • (A) in FIG. 9 shows the first game interface, and E1 is used to indicate the first virtual button.
  • the mobile terminal can detect the click operation on the touch screen with respect to the first virtual button and determine to trigger the switching mode; based on this, it enters (B) in FIG. 9.
  • FIG. 9(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • the first virtual button may also be a circle, an ellipse, a triangle, or a pentagram, etc., and the first virtual button may be located at any position in the first game interface; based on this, the specific shape and position of the first virtual button can be flexibly determined according to the actual situation.
  • FIG. 10 is a schematic diagram of triggering the pressing operation on the first game interface in the embodiment of the application.
  • The second method is a pressing operation. (A) in FIG. 10 shows the first game interface. When the user performs a pressing operation on the touch screen and the pressing time is greater than or equal to the first time threshold, the mobile terminal can determine to trigger the switching mode; based on this, it enters (B) in FIG. 10.
  • FIG. 10(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • the third way is the sliding operation. If the user performs a sliding operation on the first game interface, and the sliding trajectory is generated based on the start position of the touch object and the end position of the touch object, the sliding operation is triggered.
  • the sliding operation can also be triggered by starting the camera of the mobile terminal to capture the touch object (e.g., a hand or a stylus) performing the sliding operation in the air, in which case the operation does not need to be in contact with the touch screen.
  • taking a touch object contacting the touch screen as an example, please refer to FIG. 11.
  • FIG. 11 is a schematic diagram of triggering the sliding operation on the first game interface in the embodiment of the application.
  • FIG. 11 (A) shows the first game interface
  • F1 is used to indicate the starting position of the finger on the touch screen
  • F2 is used to indicate the end position of the finger on the touch screen
  • F3 is used to indicate the sliding track of the sliding operation
  • based on the sliding trajectory from the starting position to the ending position, the mobile terminal can determine to trigger the switching mode and enter (B) in FIG. 11.
  • FIG. 11(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • the fourth method is a double-click operation. If the time interval between two consecutive clicks by the user on the first game interface is less than or equal to the second time threshold, a double-click operation is triggered.
  • the second time threshold may be set according to actual requirements, for example, may be 0.5 seconds or 1 second, which is not limited here.
  • FIG. 12 is a schematic diagram of triggering the double-click operation on the first game interface in the embodiment of the application.
  • (A) in FIG. 12 shows the first game interface. When the user double-clicks on the touch screen and the double-click time interval is less than or equal to the second time threshold, the mobile terminal can determine to trigger the switching mode.
  • FIG. 12(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • FIG. 13 is a schematic diagram of triggering the multi-touch point operation on the first game interface in an embodiment of the present application.
  • The fifth method is a multi-touch point operation, which is triggered when at least two touch points approach each other inward on the first game interface.
  • (A) in FIG. 13 shows the first game interface, G1 is used to indicate the touch point A, and G2 is used to indicate the touch point B.
  • when the touch point A and the touch point B approach each other inward, the mobile terminal can determine to trigger the switching mode and, based on this, enter (B) in FIG. 13.
  • FIG. 13(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • the preset switching operation includes a variety of different operations. Based on this, the user can perform different preset switching operations according to requirements, further enhancing the flexibility of the scheme.
  • after the mobile terminal performs a reduction process on the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the following steps are further included:
  • the mobile terminal determines to trigger the switchback mode, where the preset switchback operation includes at least one of a click operation, a pressing operation, a sliding operation, a double-click operation, and a multi-touch point operation.
  • a method for determining the triggering of the switchback mode through a preset switchback operation is introduced.
  • after the mobile terminal shrinks the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, if the user wishes to return to the full-screen mode to play the game, the user can perform a preset switchback operation on the first sub-interface on the touch screen. Therefore, when the mobile terminal detects a preset switchback operation of the touch object on the touch screen with respect to the first sub-interface, the mobile terminal determines to trigger the switchback mode.
  • the touch object may be the user's finger, knuckles or other touchable objects, which are not specifically limited here.
  • the first game interface will be displayed in a full-screen mode.
  • the manner of displaying the first game interface in the full screen mode has already been introduced in step 204, so it is not repeated here.
  • the preset switch-back operation includes, but is not limited to, a click operation, a press operation, a slide operation, a double-click operation, and a multi-touch point operation.
  • a method for determining the triggering of the switchback mode by using a preset switchback operation is provided.
  • the user uses the preset switchback operation to implement interface switching, thereby improving the feasibility of the solution implementation.
  • the preset switchback operation can be at least one of various operations, and the user can perform operations according to requirements, thereby increasing the flexibility of the solution.
  • the step in which the mobile terminal determines that the switchback mode is triggered may include the following steps:
  • when a click operation on the second virtual button in the first sub-interface is detected, the mobile terminal determines to trigger the switchback mode;
  • when a pressing operation is detected, the mobile terminal determines to trigger the switchback mode, wherein the pressing time of the pressing operation is greater than or equal to the first time threshold;
  • when a sliding operation is detected, the mobile terminal determines to trigger the switchback mode, wherein the sliding trajectory of the sliding operation is generated based on the starting position of the touch object and the ending position of the touch object;
  • when a double-click operation is detected, the mobile terminal determines to trigger the switchback mode, wherein the double-click time interval of the double-click operation is less than or equal to the second time threshold;
  • when a multi-touch point operation is detected, the mobile terminal determines to trigger the switchback mode, wherein the multi-touch point operation is generated based on the outward extension of at least two touch points.
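The two multi-touch conditions (inward approach triggers the switching mode, outward extension triggers the switchback mode) reduce to comparing the distance between two touch points at the start and at the end of the gesture. An illustrative sketch; a real implementation would also apply a minimum-change threshold to ignore finger jitter.

```python
import math

def pinch_direction(p_a_start, p_b_start, p_a_end, p_b_end):
    """Classify a two-finger gesture: 'inward' corresponds to the switching
    trigger, 'outward' to the switchback trigger. Points are (x, y) tuples."""
    d_start = math.dist(p_a_start, p_b_start)  # initial distance between points
    d_end = math.dist(p_a_end, p_b_end)        # final distance between points
    if d_end < d_start:
        return "inward"    # fingers approach each other
    if d_end > d_start:
        return "outward"   # fingers spread apart
    return "none"

print(pinch_direction((100, 300), (300, 300), (150, 300), (250, 300)))  # inward
print(pinch_direction((150, 300), (250, 300), (100, 300), (300, 300)))  # outward
```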
  • the first method is a click operation.
  • the user can click on the second virtual button in the first sub-interface on the touch screen, and the mobile terminal can detect the click operation.
  • the click operation is similar to the example described in FIG. 9 and will not be repeated here.
  • the second way is pressing operation.
  • the user can perform a pressing operation on the first sub-interface, and when the pressing time of the pressing operation is greater than or equal to the first time threshold, the mobile terminal can detect the pressing operation.
  • the first time threshold is set according to actual requirements, for example, it may be 2 seconds or 3 seconds, etc., which is not limited here.
  • the pressing operation is similar to the example in FIG. 10 , and will not be repeated here.
  • the third way is the sliding operation.
  • the user can perform a sliding operation on the first sub-interface, and the sliding trajectory of the sliding operation is generated based on the starting position of the touch object and the ending position of the touch object, and the mobile terminal can detect the sliding operation.
  • the specific sliding operation is similar to the example in FIG. 11, and details are not repeated here.
  • the fourth method is a double-click operation.
  • the user can perform a double-click operation on the first sub-interface, and when the double-click time interval of the double-click operation is less than or equal to the second time threshold, the mobile terminal can detect the double-click operation. The second time threshold is set according to actual needs; for example, it can be 1 second or 2 seconds, which is not limited here.
  • the specific double-click operation is similar to the example in FIG. 12, and is not repeated here.
  • the fifth way is multi-touch point operation.
  • the user can use at least two fingers to simultaneously slide outward on the first sub-interface, that is, trigger a multi-touch point operation; at this time, the mobile terminal can detect the multi-touch point operation.
  • FIG. 14 is a schematic diagram of triggering the multi-touch point operation on the first sub-interface in an embodiment of the present application.
  • (A) in FIG. 14 shows the user main interface, H1 is used to indicate the first sub-interface, H2 is used to indicate the touch point A, and H3 is used to indicate the touch point B.
  • FIG. 14(B) shows that the first game interface is displayed in a full-screen mode.
  • the preset switchback operation includes a variety of different operations. Based on this, the user can perform different preset switchback operations according to requirements, thereby further improving the flexibility of the scheme.
  • after the mobile terminal displays the first game interface and the first virtual key on the user main interface in a full-screen mode, the following steps are further included:
  • the mobile terminal acquires the device usage state through the gravity sensor, wherein the device usage state is used to describe the posture of the mobile terminal when it is held;
  • when the device usage state is the vertical screen use state, the mobile terminal determines to trigger the switching mode.
  • a method for determining a trigger switching mode based on a device usage state is introduced.
  • the device usage state can be obtained through the gravity sensor.
  • the device usage state is used to describe the posture of the mobile terminal when it is held, and mainly includes the vertical screen use state and the horizontal screen use state. Since the first game interface displayed in the full-screen mode usually corresponds to the horizontal screen use state, when the mobile terminal is in the vertical screen use state, it can be determined that the switching mode is triggered.
  • the gravity sensor can determine the state of the mobile terminal by measuring the magnitude of the component forces in two orthogonal directions of the mobile terminal's gravity, that is, to obtain the device use state.
  • the device usage state can also be obtained through the gyroscope built into the mobile terminal. The rotation axis of a high-speed rotating object tends toward the vertical direction under an external force that changes its direction; when the rotating object is tilted horizontally, gravity acts in the direction of increasing the tilt while the axis moves in the vertical direction, producing a head-shaking movement (i.e., a precession movement).
  • when the gyro axis of the gyroscope rotates about the horizontal axis, it is subjected to a vertical rotation force due to the rotation of the earth, and the rotating body of the gyroscope produces a precession movement toward the direction of the meridian in the horizontal plane, so that the device usage state can be obtained.
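The gravity-sensor check described above can be sketched by comparing the gravity components measured along the screen's two orthogonal axes: when gravity lies mostly along the long (y) axis, the terminal is held upright. The axis naming and the simple magnitude comparison are illustrative assumptions, not the patent's wording.

```python
def device_usage_state(gx, gy):
    """Infer the holding posture from gravity components (m/s^2) along the
    screen's x (short) and y (long) axes, as measured by a gravity sensor."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(device_usage_state(0.5, 9.7))  # portrait: gravity mostly along the long axis
print(device_usage_state(9.7, 0.5))  # landscape: gravity mostly along the short axis
```

Production code would typically add hysteresis around the 45° boundary so that small tilts do not cause the interface to flip back and forth.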
  • FIG. 15 is a schematic diagram of an embodiment of switching the use state of the device in the embodiment of the application.
  • (A) in FIG. 15 shows the first game interface displayed in the full-screen mode; when the user turns the mobile terminal from landscape to portrait, the state shown in (B) in FIG. 15 is obtained, and (B) in FIG. 15 shows the vertical screen use state
  • a method for determining a trigger switching mode based on the use state of the device is provided.
  • determining the triggering of the switching mode through the device usage state further clarifies the specific switching manner, thereby improving the feasibility of the solution.
  • after the mobile terminal performs a reduction process on the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the following steps are further included:
  • the mobile terminal acquires the device usage state through the gravity sensor, wherein the device usage state is used to describe the posture of the mobile terminal when it is held;
  • when the device usage state is the horizontal screen use state, the mobile terminal determines to trigger the switchback mode.
  • a method for determining a trigger switchback mode based on a device usage state is introduced.
  • the mobile terminal can obtain the device usage state through the gravity sensor, and the device usage state is used to describe the posture of the mobile terminal when it is held.
  • the device usage state mainly includes the vertical screen use state and the horizontal screen use state. Since the first sub-interface and the first virtual key are usually displayed in the vertical screen use state, when the mobile terminal is in the horizontal screen use state, it can be determined that the switchback mode is triggered.
  • FIG. 16 is a schematic diagram of another embodiment of switching the use state of the device in this embodiment of the present application.
  • (A) in FIG. 16 shows the user main interface including the first sub-interface and the first touch interface; when the user turns the mobile terminal from portrait to landscape, the state shown in (B) in FIG. 16 is obtained, which is the landscape use state.
  • the user main interface includes the first sub-interface and the first touch interface.
  • the mobile terminal determines to trigger the switchback mode; based on this, it enters (C) in FIG. 16, and (C) in FIG. 16 shows the first game interface displayed in the full-screen mode in the landscape use state.
  • a method for determining the triggering switchback mode based on the use state of the device is provided.
  • since a user in the horizontal screen use state usually wants the first game interface to be displayed in the full-screen mode, the triggering of the switchback mode can be determined through the device usage state as the user adjusts it, which further clarifies the specific switchback manner, thereby improving the feasibility of this solution.
  • after the mobile terminal displays the first game interface and the first virtual key on the user main interface in a full-screen mode, the following steps are further included:
  • the mobile terminal obtains the face image through the front camera
  • the mobile terminal determines the left eye coordinate point corresponding to the left eye and the right eye coordinate point corresponding to the right eye according to the face image;
  • the mobile terminal determines the eye coordinate connection line according to the left eye coordinate point and the right eye coordinate point;
  • when the included angle between the eye coordinate connection line and the horizontal direction of the mobile terminal is less than or equal to the angle threshold, the mobile terminal determines to trigger the switching mode, where the horizontal direction of the mobile terminal indicates the horizontal direction when the mobile terminal is in a vertical screen use state.
  • a method for triggering a switching mode based on a face recognition situation is introduced.
  • the mobile terminal can also capture a face image in real time through the front camera, determine the left eye coordinate point and the right eye coordinate point from the face image, and then generate the eye coordinate connection line according to the left eye coordinate point and the right eye coordinate point. If the included angle between the eye coordinate connection line and the horizontal direction of the mobile terminal is less than or equal to the angle threshold, it is determined that the switching mode is triggered.
  • the horizontal direction of the mobile terminal refers to the horizontal direction when the mobile terminal is in a vertical screen use state.
  • the angle threshold can be set according to actual needs, for example, it can be 30°. In practical applications, the angle threshold can also be set to other angles such as 45° or 20°, which are not limited here.
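The face-recognition trigger reduces to computing the angle between the eye coordinate connection line and the terminal's horizontal direction, then comparing it with the angle threshold. A sketch using the 30° example value from the text; the coordinate conventions (x along the terminal's horizontal direction) and function names are assumptions.

```python
import math

ANGLE_THRESHOLD = 30.0  # degrees; the text gives 30 deg as one example value

def eye_line_angle(left_eye, right_eye):
    """Included angle between the eye coordinate connection line and the
    terminal's horizontal direction, folded into [0, 90] degrees."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    return min(angle, 180.0 - angle)  # line orientation, not vector direction

def triggers_switching(left_eye, right_eye):
    # angle <= threshold -> trigger the switching mode
    # (angle > threshold would correspondingly trigger the switchback mode)
    return eye_line_angle(left_eye, right_eye) <= ANGLE_THRESHOLD

print(round(eye_line_angle((0, 0), (100, 36.4)), 1))  # 20.0 -> below the threshold
print(triggers_switching((0, 0), (100, 36.4)))        # True
```

Folding the angle into [0, 90] makes the result independent of which eye is detected first, matching the 0°/90° parallel and perpendicular cases illustrated in FIG. 17.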
  • FIG. 17 is a schematic diagram of the angle between the eye coordinate connection line and the horizontal direction of the mobile terminal in the embodiment of the application.
  • (A) in FIG. 17 shows a mobile terminal in the vertical screen use state
  • (B) in FIG. 17 shows a mobile terminal in the horizontal screen use state
  • I1 is used to indicate the horizontal direction of the mobile terminal
  • I2 is used to indicate the left eye
  • I3 is used to indicate right eye.
  • it can be seen from (A) in FIG. 17 that the connection line between the left eye and the right eye is parallel to the horizontal direction of the mobile terminal, that is, the included angle between the eye coordinate connection line (formed by the left eye coordinate point corresponding to the left eye and the right eye coordinate point corresponding to the right eye) and the horizontal direction of the mobile terminal is 0°. It can be seen from (B) in FIG. 17 that the connection line between the left eye and the right eye is perpendicular to the horizontal direction of the mobile terminal, that is, the included angle between the eye coordinate connection line and the horizontal direction of the mobile terminal is 90°.
  • FIG. 18 is a schematic diagram of implementing interface switching based on face recognition in the embodiment of the present application.
  • (A) in FIG. 18 shows a mobile terminal
  • J1 is used to indicate the horizontal direction of the mobile terminal
  • J2 is used to indicate the left eye
  • J3 is used to indicate the right eye
  • J4 is used to indicate the angle between the eye coordinate line and the horizontal direction of the mobile terminal.
  • the line connecting the left eye coordinate point and the right eye coordinate point forms an included angle with the horizontal direction of the mobile terminal. If the included angle is 20°, it is less than the angle threshold of 30°; therefore, the condition for the switching mode is satisfied, and (B) in FIG. 18 is obtained.
  • FIG. 18(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • a method for triggering a switching mode based on a face recognition situation is provided.
  • the left eye coordinate point and the right eye coordinate point are determined based on the face image, and the eye coordinate connection line is further determined.
  • the relationship between the included angle (between the eye coordinate connection line and the horizontal direction of the mobile terminal) and the angle threshold is used to determine whether the switching mode is triggered, which clarifies the specific switching manner.
  • the user does not need to perform an active operation; whether to trigger the switching mode can be determined merely by obtaining the face image, thus simplifying user operation.
  • after the mobile terminal performs a reduction process on the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the following steps are further included:
  • the mobile terminal obtains the face image through the front camera
  • the mobile terminal determines the left eye coordinate point corresponding to the left eye and the right eye coordinate point corresponding to the right eye according to the face image;
  • the mobile terminal determines the eye coordinate connection line according to the left eye coordinate point and the right eye coordinate point;
  • when the included angle between the eye coordinate connection line and the horizontal direction of the mobile terminal is greater than the angle threshold, the mobile terminal determines to trigger the switchback mode, wherein the horizontal direction of the mobile terminal indicates the horizontal direction when the mobile terminal is in a vertical screen use state.
  • a method for determining a triggering switchback mode based on a face recognition situation is introduced.
  • after the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the mobile terminal can also capture a face image in real time through the front camera, determine the left eye coordinate point and the right eye coordinate point from the face image, and then generate the eye coordinate connection line according to the left eye coordinate point and the right eye coordinate point. If the included angle between the eye coordinate connection line and the horizontal direction of the mobile terminal is greater than the angle threshold, it is determined that the switchback mode is triggered.
  • the horizontal direction of the mobile terminal refers to the horizontal direction when the mobile terminal is in a vertical screen use state.
  • the angle threshold can be set according to actual needs, for example, it can be 30°. In practical applications, the angle threshold can also be set to other angles such as 45° or 20°, which are not limited here.
  • FIG. 19 is a schematic diagram of implementing interface switchback based on face recognition in the embodiment of the present application.
  • the (A) picture in FIG. 19 shows a mobile terminal, in which K1 is used to indicate the horizontal direction of the mobile terminal, K2 is used to indicate the left eye, K3 is used to indicate the right eye, and K4 is used to indicate the included angle between the eye coordinate connection line and the horizontal direction of the mobile terminal.
  • the line connecting the left eye coordinate point and the right eye coordinate point forms an included angle with the horizontal direction of the mobile terminal. If the included angle is 45°, it is greater than the angle threshold of 30°; therefore, the switchback mode is triggered, thereby obtaining the (B) picture in FIG. 19, which shows the first game interface displayed in full-screen mode.
  • a method for determining a triggering switchback mode based on a face recognition situation is provided.
  • the left eye coordinate point and the right eye coordinate point are determined based on the face image, and the eye coordinate connection line is further determined. Whether to trigger the switchback mode is judged according to the relationship between the eye coordinate connection line and the horizontal direction of the mobile terminal, without requiring any active operation by the user.
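The geometric decision above can be expressed as a minimal sketch, assuming the eye coordinate points are given in on-screen image coordinates; the helper names and the fold past 90° are illustrative, not part of the original method:

```python
import math

def eye_line_angle(left_eye, right_eye):
    """Absolute angle, in degrees, between the line connecting the two eye
    coordinate points and the horizontal direction of the mobile terminal."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    # Fold angles past 90° back so a tilt to either side is treated alike.
    return min(angle, 180.0 - angle)

def should_trigger_switchback(left_eye, right_eye, angle_threshold=30.0):
    """Trigger the switchback mode when the eye line tilts past the
    threshold (30° as in the example; 45° or 20° would also be valid)."""
    return eye_line_angle(left_eye, right_eye) > angle_threshold
```

With the 30° threshold, a 45° eye line (as in FIG. 19) triggers the switchback mode, while a nearly horizontal eye line does not.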
  • after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the following steps are further included:
  • the mobile terminal monitors the incoming call reminder
  • if an answering instruction for the incoming call reminder is detected, the mobile terminal determines to trigger the switching mode.
  • a method for determining a trigger switching mode based on an incoming call situation is introduced.
  • the mobile terminal can monitor the incoming call alert in real time, and when there is an incoming call alert, if the user chooses to answer the incoming call, the mobile terminal can detect the answering instruction for the incoming call alert, thereby determining the trigger switching mode.
  • the answering operation of the user selecting to answer the incoming call may be an operation of clicking an answering button.
  • FIG. 20 is a schematic diagram of implementing interface switching based on an incoming call reminder in an embodiment of the application.
  • the (A) picture in FIG. 20 shows the first game interface displayed in full-screen mode.
  • when the mobile terminal displays the incoming call reminder, the user can click the answer button; thus, the mobile terminal detects the answering instruction for the incoming call reminder and determines to trigger the switching mode, based on which the (B) picture in FIG. 20 is entered.
  • FIG. 20(B) illustrates a user main interface including a first sub-interface and a first touch interface.
  • a method for determining a triggering switching mode based on an incoming call is provided.
  • the interface switching is automatically realized, thereby simplifying the user operation and improving the feasibility of this solution.
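The flow above can be sketched as a small monitor; the class and method names are hypothetical and only illustrate that switching is triggered by the answering instruction, not by the reminder alone:

```python
class IncomingCallMonitor:
    """Watches incoming call reminders; answering one triggers switching."""

    def __init__(self):
        self.switching_triggered = False

    def on_call_reminder(self, answered):
        # Clicking the answer button is the answering instruction; merely
        # seeing the reminder (or declining it) does not switch the interface.
        if answered:
            self.switching_triggered = True
        return self.switching_triggered
```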
  • after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the following steps are further included:
  • the mobile terminal obtains the voice to be detected through the microphone
  • if the voice to be detected successfully matches the first preset voice, the mobile terminal determines to trigger the switching mode.
  • a method for determining a trigger switching mode based on speech is introduced.
  • after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the mobile terminal can also acquire the voice to be detected in real time through the microphone. If the voice to be detected successfully matches the first preset voice, it is determined to trigger the switching mode. Exemplarily, assuming that the first preset voice is "interface switching", if the user speaks "interface switching" into the microphone (that is, "interface switching" is the voice to be detected), the mobile terminal matches the voice to be detected against the first preset voice, and the switching mode can be determined to be triggered when the match is successful. It should be understood that the foregoing example is only used for understanding this solution, and the first preset voice may also be set according to actual selection conditions.
  • a method for determining a trigger switching mode based on voice is provided.
  • a user can perform interface switching through voice, thereby simplifying user operation and increasing the flexibility of solution application.
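The matching step can be sketched as follows; a normalized exact comparison of recognized text stands in for real speech recognition, and the function name is an assumption:

```python
def matches_preset_voice(detected_text, preset_text="interface switching"):
    """Return True when the recognized voice to be detected matches the
    preset voice; matching here is a case- and whitespace-insensitive
    comparison of the recognized text."""
    return detected_text.strip().lower() == preset_text.strip().lower()
```

The same helper can check the second preset voice (e.g. "interface switchback") to determine the switchback mode in the embodiment below.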
  • after the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual button on the first touch interface, the following steps are further included:
  • the mobile terminal obtains the voice to be detected through the microphone
  • if the voice to be detected successfully matches the second preset voice, the mobile terminal determines to trigger the switchback mode.
  • a method for triggering a switchback mode based on voice determination is introduced.
  • after the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the mobile terminal can also acquire the voice to be detected in real time through the microphone. If the voice to be detected successfully matches the second preset voice, it is determined that the switchback mode is triggered. Exemplarily, assuming that the second preset voice is "interface switchback", if the user speaks "interface switchback" into the microphone, the mobile terminal matches the voice to be detected against the second preset voice; if the match is successful, the switchback mode can be determined to be triggered. It should be understood that the foregoing example is only used for understanding this solution, and the second preset voice may also be set according to actual selection conditions.
  • a method for triggering a switchback mode based on voice determination is provided.
  • a user can perform interface switchback through voice, thereby simplifying user operations and increasing the flexibility of solution application.
  • the first sub-interface and the first touch interface both belong to embedded interfaces
  • the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, which may include the following steps:
  • the mobile terminal creates a first sub-interface on the first preset area of the user's main interface, wherein the first preset area is a fixed area in the user's main interface;
  • the mobile terminal creates a first touch interface on the second preset area of the user main interface, wherein the second preset area is another fixed area in the user main interface;
  • the mobile terminal reduces the first game interface in the picture-in-picture mode and displays it on the first sub-interface, and displays the first virtual key on the first touch interface in the picture-in-picture mode.
  • a method for implementing interface display based on an embedded interface is introduced.
  • At least one of the first sub-interface and the first touch interface may be an embedded interface. Since the embedded interface is a fixed interface, the mobile terminal can create a first sub-interface on the first preset area of the user main interface, and the first preset area is a fixed area on the user main interface.
  • the first touch interface can also be created on the second preset area of the user main interface, and the second preset area is another fixed area on the user main interface. Based on this, the mobile terminal uses the picture-in-picture mode to reduce the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface in the picture-in-picture mode.
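The fixed-area layout can be sketched as follows; the screen dimensions and the upper/lower half split are illustrative assumptions, since the embodiment only requires that both preset areas be fixed regions of the user main interface:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    """A fixed rectangular area on the user main interface."""
    x: int
    y: int
    width: int
    height: int

def create_embedded_interfaces(screen_width, screen_height):
    # First preset area (upper half): hosts the reduced first game interface.
    first_sub_interface = Rect(0, 0, screen_width, screen_height // 2)
    # Second preset area (lower half): hosts the first virtual keys.
    first_touch_interface = Rect(0, screen_height // 2,
                                 screen_width,
                                 screen_height - screen_height // 2)
    return first_sub_interface, first_touch_interface
```

Because both areas are fixed, the picture-in-picture content is simply rendered into these rectangles rather than into movable windows.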
  • FIG. 21 is a schematic diagram of determining the implementation of interface display based on an embedded interface in an embodiment of the application.
  • L1 is used to indicate the user main interface
  • L2 is used to indicate the first preset area
  • L3 is used to indicate the second preset area
  • L4 is used to indicate the first sub-interface
  • L5 is used to indicate the reduced first game interface
  • L6 is used to indicate the first touch interface
  • L7 is used to indicate the first virtual key.
  • the first preset area and the second preset area in the (A) picture in FIG. 21 are fixed areas on the user main interface. Therefore, in the (B) picture in FIG. 21, the first sub-interface is created on the first preset area L2, and the first touch interface is created on the second preset area L3. In a manner similar to the previous embodiment, the first game interface is reduced, the reduced first game interface is then displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface.
  • a method for realizing interface display based on an embedded interface is provided.
  • a sub-interface for displaying the game interface and a touch interface for controlling the game are created on fixed preset areas of the user main interface, which enables the user to operate other services on the mobile terminal while the game progress continues, without frequently switching interfaces, thereby simplifying user operations.
  • both the first sub-interface and the first touch interface belong to the floating window interface
  • the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, which may include the following steps:
  • the mobile terminal creates a first sub-interface on the upper layer of the user main interface
  • the mobile terminal creates a first touch interface on the upper layer of the user main interface
  • the mobile terminal reduces the first game interface in the floating window mode and displays it on the first sub-interface, and displays the first virtual key on the first touch interface in the floating window mode.
  • a method for controlling interface display based on a floating window interface is introduced.
  • At least one of the first sub-interface and the first touch interface may be a floating window interface. Since the floating window interface is not a fixed interface, the mobile terminal can create the first sub-interface on the upper layer of the user main interface, and create the first touch interface on the upper layer of the user main interface.
  • the first game interface is then reduced and displayed on the first sub-interface in the floating window mode, and the first virtual key can also be displayed on the first touch interface in the floating window mode.
  • the floating window interface may be a non-transparent interface or a translucent interface, which is not limited here.
  • FIG. 22 is a schematic diagram of determining the implementation of interface display based on a floating interface in an embodiment of the present application.
  • M1 is used to indicate the user main interface
  • M2 is used to indicate the first sub-interface
  • M3 is used to indicate the first touch interface
  • M4 is used to indicate the first game interface after reduction processing
  • M5 is used to indicate the first virtual key.
  • the first sub-interface and the first touch interface are both located on the upper layer of the user main interface. After the first game interface is reduced, it is displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface; based on this, the interface shown in the (B) picture in FIG. 22 is obtained.
  • a method for controlling interface display based on a floating window interface is provided.
  • a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the upper layer of the user main interface, so that while the game progress continues, the user can operate other services on the mobile terminal without frequently switching interfaces, which further simplifies user operations.
  • another optional embodiment provided by the embodiment of the present application further includes the following steps:
  • the mobile terminal controls the first sub-interface to move along the drag track corresponding to the drag operation;
  • the mobile terminal controls the first touch interface to move along the drag track corresponding to the drag operation;
  • the mobile terminal controls the first sub-interface to zoom in or out according to the zoom operation;
  • the mobile terminal controls the first touch interface to zoom in or out according to the zoom operation.
  • FIG. 23 is a schematic diagram of a drag operation on the first sub-interface in an embodiment of the present application.
  • N1 is used to indicate the first sub-interface
  • N2 is used to indicate the starting position
  • N3 is used to indicate the end position.
  • the (A) picture in FIG. 23 shows that the user drags from the start position to the end position, and the drag track runs from the start position to the end position, so that the mobile terminal controls the first sub-interface to move from the start position to the end position, obtaining the (B) picture shown in FIG. 23. It can be understood that the drag operation on the first touch interface is similar and will not be repeated.
  • the user may also perform a zoom operation on the first sub-interface, for example, zoom in or zoom out the first sub-interface.
  • the zoom operation includes a zoom-out operation and a zoom-in operation; the zoom-out operation is that at least two touch points shrink inward, and the zoom-in operation is that at least two touch points extend outward.
  • FIG. 24 is a schematic diagram of a zoom operation performed on the first sub-interface in an embodiment of the present application. As shown in the figure, O1 is used to indicate the first sub-interface, O2 is used to indicate the reduced first sub-interface, and O3 is used to indicate the enlarged first sub-interface.
  • the (A) picture in FIG. 24 shows that the fingers are retracted inward on the touch screen of the mobile terminal, that is, the zoom-out operation for the first sub-interface is triggered; thus, the (B) picture shown in FIG. 24 is obtained, in which the first sub-interface has been reduced.
  • the (C) picture in FIG. 24 shows that the fingers extend outward on the touch screen of the mobile terminal, that is, the zoom-in operation for the first sub-interface is triggered, after which the first sub-interface is enlarged. It can be understood that the zoom operation on the first touch interface is similar and will not be repeated.
  • a method for adjusting the position and size of a floating window is provided.
  • the positions and sizes of the first sub-interface and the first touch interface can also be adjusted according to user requirements, thereby enhancing the flexibility of the scheme.
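The zoom classification above (touch points shrinking inward versus extending outward) can be sketched as follows; the function names are illustrative and only the inward/outward distinction comes from the text:

```python
import math

def _distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_zoom(start_points, end_points):
    """Classify a two-finger gesture on the first sub-interface:
    touch points moving inward mean a zoom-out (reduce) operation,
    touch points moving outward mean a zoom-in (enlarge) operation."""
    before = _distance(*start_points)
    after = _distance(*end_points)
    if after < before:
        return "zoom_out"
    if after > before:
        return "zoom_in"
    return "none"
```

A real implementation would track the touch points continuously and apply a small dead zone, but the comparison of inter-point distances is the core of the gesture.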
  • the mobile terminal obtains the game type of the first game application
  • the mobile terminal determines the first touch interface from at least one creatable touch interface according to the game type of the first game application, wherein each creatable touch interface displays a corresponding virtual key.
  • a method for adjusting a touch interface based on a game type is introduced. Since different game types correspond to different creatable touch interfaces, after the first game interface is reduced and displayed on the first sub-interface, and before the first virtual keys are displayed on the first touch interface, The mobile terminal may acquire the game type of the first game application, and then determine the first touch interface from at least one touchable interface that can be created according to the game type of the first game application, so as to display virtual keys corresponding to the game type.
  • the game types may be sports competitive games, casual games, role-playing games (Role-Playing Game, RPG), and the like.
  • for example, the touch interface corresponding to sports competitive games may include virtual keys such as "jump" and "squat", and the touch interface corresponding to casual games may include virtual keys such as "left", "right" and "shooting".
  • the corresponding touch interface of the RPG may include directional virtual keys and the like.
  • the user can also customize parameters of the first touch interface, including but not limited to interface transparency, interface background color, interface pattern, interface text, and the shape and size of virtual keys, which are not limited here.
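The selection of a creatable touch interface by game type can be sketched as a lookup; the key sets follow the examples in the text, while the fallback to the RPG layout is an assumption:

```python
# Each creatable touch interface displays the virtual keys for one game type.
CREATABLE_TOUCH_INTERFACES = {
    "sports": ["jump", "squat"],
    "casual": ["left", "right", "shooting"],
    "rpg": ["up", "down", "left", "right", "confirm", "cancel"],
}

def determine_first_touch_interface(game_type):
    """Pick the touch interface matching the game type of the first game
    application; unknown types fall back to the RPG directional layout."""
    return CREATABLE_TOUCH_INTERFACES.get(game_type,
                                          CREATABLE_TOUCH_INTERFACES["rpg"])
```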
  • FIG. 25 is a schematic diagram of an embodiment of the first touch interface in the embodiment of the application, and (A) in FIG. 25 shows the main user interface corresponding to the RPG.
  • FIG. 25(B) shows the main user interface corresponding to sports games.
  • Figure 25 (C) shows the main user interface corresponding to casual games.
  • P1 is used to indicate the first touch interface
  • P11 is used to indicate the directional virtual button
  • P12 is used to indicate the virtual button of the "Confirm” function
  • P13 is used to indicate the virtual key of the "Cancel" function.
  • P21 is used to indicate the virtual key of the "jump" function
  • P22 is used to indicate the virtual key of the "squat" function
  • P31 is used to indicate the virtual key of the "move to the left" function
  • P32 is used to indicate the virtual key of the "move to the right" function
  • P33 is used to indicate the virtual key of the "shooting" function.
  • FIG. 26 is a schematic diagram of an embodiment of selecting parameters of the first touch interface in the embodiment of the present application, as shown in the figure.
  • Q1 is used to indicate the first touch interface
  • Q2 is used to indicate the interface color selection area
  • Q3 is used to indicate the transparency selection area. The user can adjust the interface color and adjust the transparency according to requirements.
  • a method for adjusting a touch interface based on a game type is provided.
  • different touch interfaces can be set for different game types, so as to facilitate the control of different types of games.
  • the first touch interface can also adjust parameters according to user requirements, thereby increasing the flexibility of the solution.
  • another optional embodiment provided by the embodiment of the present application further includes the following steps:
  • the second game interface corresponding to the second game application is displayed in a full-screen mode
  • the mobile terminal reduces the second game interface and displays it on the second sub-interface, and displays the second virtual key on the second touch interface, where the second virtual key is used to control the second game application, and the second sub-interface and the second touch interface are displayed on the user main interface.
  • a method for displaying multiple game interfaces at the same time is introduced.
  • the mobile terminal can also run the second game application, and display the second game interface corresponding to the second game application in a full-screen mode. If the switching mode is satisfied, a second sub-interface and a second touch interface can be created on the main user interface. The second game interface is reduced and displayed on the second sub-interface, and a second virtual key is displayed on the second touch interface, and the second virtual key is used to control the second game application.
  • the manner of triggering the switching mode is similar to that in the foregoing embodiment, and details are not described herein again.
  • FIG. 27 is a schematic diagram of another embodiment of the control interface display method in this embodiment of the present application.
  • R1 is used to indicate the first sub-interface
  • R2 is used to indicate the first touch interface
  • R3 is used to indicate the second sub-interface
  • R4 is used to indicate the second touch interface. On the user main interface, the reduced first game interface is displayed on the first sub-interface, the reduced second game interface is displayed on the second sub-interface, the first virtual key corresponding to the first game application is displayed on the first touch interface, and the second virtual key corresponding to the second game application is displayed on the second touch interface.
  • a method for displaying multiple game interfaces at the same time is provided.
  • the mobile terminal can display multiple game applications on the user main interface, which allows two game progresses to continue while also enabling operations on other services of the mobile terminal, thereby increasing the convenience of operations.
  • another optional embodiment provided by the embodiment of the present application further includes the following steps:
  • the video application interface corresponding to the video application is displayed on the main user interface
  • the mobile terminal reduces the video application interface and displays it on the third sub-interface, where the third sub-interface is displayed on the user main interface.
  • a method for simultaneously displaying a game interface and a video interface is introduced.
  • the mobile terminal can also run a video application, thereby displaying a video application interface corresponding to the video application on the user main interface.
  • a third sub-interface is created on the main user interface, and then the video application interface is reduced and displayed on the third sub-interface. It should be noted that the manner of triggering the switching mode and the manner of creating the third sub-interface are similar to those in the foregoing embodiments, and details are not described herein again.
  • FIG. 28 is a schematic diagram of another embodiment of the control interface display method in the embodiment of the present application.
  • T1 is used to indicate the first sub-interface
  • T2 is used to indicate the first touch interface
  • T3 is used to indicate the third sub-interface. On the user main interface, the reduced first game interface is displayed on the first sub-interface, the first virtual key is displayed on the first touch interface, and the reduced video application interface is displayed on the third sub-interface.
  • a method for displaying a game interface and a video interface at the same time is provided.
  • the mobile terminal can not only display the game application on the user main interface, but also simultaneously display the video application on the user main interface, while enabling the user to operate other services on the mobile terminal, which meets the diversified needs of the user and further simplifies user operations.
  • after the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the following steps are further included:
  • the application interface corresponding to the target application is displayed on the user main interface in full-screen mode, and the first sub-interface and the first touch interface are displayed on the user main interface, wherein the target application includes at least one of an instant messaging application, an entertainment application, and a tool application.
  • a method for displaying a game interface and an information interface at the same time is introduced.
  • after the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, the application interface of the target application may also be displayed on the user main interface in full-screen mode, while the first sub-interface and the first touch interface continue to be displayed on the user main interface.
  • the target application includes but is not limited to instant messaging applications, entertainment applications and tool applications.
  • instant messaging applications include but are not limited to chat, social networking, email, and community applications; entertainment applications include but are not limited to music, ringtone, player, live broadcast, entertainment, and horoscope applications; tool applications include but are not limited to life services, food, weather, calendar, utility, flashlight, note, office tool, network storage, and office software applications.
  • the application interface can be displayed in full-screen mode. For example, replying to messages, checking emails, ordering takeout, listening to songs, browsing Weibo, surfing the Internet, taking pictures, setting the system, etc. can all be displayed in full-screen mode.
  • FIG. 29 is a schematic diagram of another embodiment of the control interface display method in this embodiment of the present application.
  • U1 is used to indicate the first sub-interface, U2 is used to indicate the first touch interface, and U3 is used to indicate the tool application interface. On the user main interface, the reduced first game interface is displayed on the first sub-interface, the first virtual key is displayed on the first touch interface, and the tool application interface corresponding to the tool application is displayed in full-screen mode.
  • a method for displaying a game interface and an information interface at the same time is provided.
  • the mobile terminal displays a variety of applications on the user main interface, which allows the game progress to continue while other applications are operated, so that the user can operate a variety of application services on the mobile terminal, thereby meeting the diversified needs of users.
  • FIG. 30 is a schematic diagram of an embodiment of the interface display control device in the embodiment of the application.
  • the interface display control device 30 includes:
  • the display module 301 is configured to display the first game interface and the first virtual key on the user main interface in a full-screen mode when the first game application is running, and the first virtual key is used to control the first game application;
  • the display module 301 is further configured to reduce the first game interface and display it on the first sub-interface if the switching mode is triggered, and display the first virtual key on the first touch interface, wherein the first sub-interface is The interface and the first touch interface are created on the user main interface when the switching mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
  • the display module 301 is further configured to: after the first game interface is reduced and displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface, if the switchback mode is triggered, display the first game interface and the first virtual key on the user main interface in full-screen mode.
  • the interface display control device 30 further includes a determination module 302 ;
  • the determining module 302 is configured to determine a trigger switching mode if a preset switching operation is triggered, wherein the preset switching operation includes at least one of a click operation, a pressing operation, a sliding operation, a double-click operation and a multi-touch point operation.
  • the determining module 302 is specifically configured to determine a trigger switching mode if a click operation is triggered for the first virtual button in the first game interface;
  • if a pressing operation is triggered for the first game interface, the switching mode is determined to be triggered, wherein the pressing time of the pressing operation is greater than or equal to the first time threshold;
  • if a sliding operation is triggered for the first game interface, the switching mode is determined to be triggered, wherein the sliding trajectory of the sliding operation is generated based on the starting position of the touch object and the ending position of the touch object;
  • if a double-click operation is triggered for the first game interface, the switching mode is determined to be triggered, wherein the double-click time interval of the double-click operation is less than or equal to the second time threshold;
  • if a multi-touch point operation is triggered for the first game interface, the switching mode is determined to be triggered, wherein the multi-touch point operation is generated based on the inward approach of at least two touch points.
  • the determining module 302 is further configured to determine to trigger the switchback mode if a preset switchback operation is triggered for the first sub-interface, wherein the preset switchback operation includes at least one of a click operation, a pressing operation, a sliding operation, a double-click operation and a multi-touch point operation.
  • the determining module 302 is specifically configured to: if a click operation is triggered for the second virtual button in the first sub-interface, determine to trigger the switchback mode;
  • if the pressing operation is triggered for the first sub-interface, it is determined that the switchback mode is triggered, wherein the pressing time of the pressing operation is greater than or equal to the first time threshold;
  • if the sliding operation is triggered for the first sub-interface, the switchback mode is determined to be triggered, wherein the sliding trajectory of the sliding operation is generated based on the start position of the touch object and the end position of the touch object;
  • if the double-click operation is triggered for the first sub-interface, it is determined to trigger the switchback mode, wherein the double-click time interval of the double-click operation is less than or equal to the second time threshold;
  • if the multi-touch point operation is triggered for the first sub-interface, the switchback mode is determined to be triggered, wherein the multi-touch point operation is generated based on the outward extension of at least two touch points.
  • the interface display control apparatus 30 further includes an acquisition module 303 ;
  • the acquiring module 303 is used to acquire the device usage state through the gravity sensor after the display module 301 displays the first game interface and the first virtual button on the user main interface in full-screen mode, wherein the device usage state is used to describe the posture of the mobile terminal when it is held;
  • the determining module 302 is further configured to determine to trigger the switching mode if the device use state indicates that the mobile terminal is in a portrait screen use state.
  • the acquisition module 303 is also used to acquire the device usage state through the gravity sensor after the display module 301 reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, where the device usage state is used to describe the posture of the mobile terminal when it is held;
  • the determining module 302 is further configured to determine to trigger the switchback mode if the device use state indicates that the mobile terminal is in a landscape use state.
  • the acquisition module 303 is further configured to acquire the face image through the front camera after the display module 301 displays the first game interface and the first virtual key on the user main interface in a full-screen mode;
  • the determining module 302 is further configured to determine the left eye coordinate point corresponding to the left eye and the right eye coordinate point corresponding to the right eye according to the face image;
  • the determining module 302 is further configured to determine the coordinate connection line of the eyes according to the coordinate point of the left eye and the coordinate point of the right eye;
  • the determining module 302 is further configured to determine that the switch mode is triggered if the included angle between the eye coordinate line and the horizontal direction of the mobile terminal is less than or equal to an angle threshold, where the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in a portrait use state.
  • the acquiring module 303 is further configured to acquire a face image through the front camera after the display module reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface;
  • the determining module 302 is further configured to determine the left eye coordinate point corresponding to the left eye and the right eye coordinate point corresponding to the right eye according to the face image;
  • the determining module 302 is further configured to determine the eye coordinate connection line according to the left eye coordinate point and the right eye coordinate point;
  • the determining module 302 is further configured to determine that the switch-back mode is triggered if the included angle between the eye coordinate line and the horizontal direction of the mobile terminal is greater than the angle threshold, where the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in a portrait use state.
  • the interface display control device 30 further includes a monitoring module 304 ;
  • the monitoring module 304 is used to monitor the incoming call reminder after the display module 301 displays the first game interface and the first virtual key on the user main interface in a full-screen mode;
  • the determining module 302 is further configured to determine the trigger switching mode if the answering instruction for the incoming call reminder is triggered.
  • the obtaining module 303 is further configured to obtain the voice to be detected through the microphone after the display module displays the first game interface and the first virtual key on the user main interface in a full-screen mode;
  • the determining module 302 is further configured to determine a trigger switching mode if the to-be-detected voice matches the first preset voice successfully.
  • the acquiring module 303 is further configured to acquire the voice to be detected through the microphone after the display module reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface;
  • the determining module 302 is further configured to determine to trigger the switchback mode if the to-be-detected voice matches the second preset voice successfully.
  • the first sub-interface and the first touch interface both belong to the embedded interface type;
  • the display module 301 is specifically configured to create the first sub-interface on a first preset area of the user main interface, where the first preset area is a fixed area in the user main interface, and to create the first touch interface on a second preset area of the user main interface, where the second preset area is another fixed area in the user main interface;
  • the first game interface is reduced and displayed on the first sub-interface in picture-in-picture mode, and the first virtual key is displayed on the first touch interface in picture-in-picture mode.
  • the first sub-interface and the first touch interface both belong to the floating window interface type;
  • the display module 301 is specifically configured to create the first sub-interface on the upper layer of the user main interface, and to create the first touch interface on the upper layer of the user main interface;
  • the first game interface is reduced and displayed on the first sub-interface in floating window mode, and the first virtual key is displayed on the first touch interface in floating window mode.
  • the interface display control device 30 further includes a control module 305 ;
  • the control module 305 is configured to control the first sub-interface to move along the drag track corresponding to the drag operation if a drag operation is triggered for the first sub-interface;
  • the control module 305 is further configured to control the first touch interface to move along the drag track corresponding to the drag operation if the drag operation is triggered for the first touch interface;
  • the control module 305 is further configured to control the first sub-interface to zoom in or out according to the zoom operation if a zoom operation is triggered for the first sub-interface;
  • the control module 305 is further configured to control the first touch interface to zoom in or out according to the zoom operation if a zoom operation is triggered for the first touch interface.
  • the acquiring module 303 is further configured to acquire the game type of the first game application before the display module reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface;
  • the determining module 302 is further configured to determine the first touch interface from at least one creatable touch interface according to the game type of the first game application, where each creatable touch interface displays corresponding virtual keys.
  • the display module 301 is further configured to, if the switch mode is triggered, reduce the second game interface and display it on a second sub-interface, and display a second virtual key on a second touch interface, where the second virtual key is used to control the second game application, and both the second sub-interface and the second touch interface are displayed on the user main interface.
  • the display module 301 is further configured to display the video application interface corresponding to the video application on the user main interface when the video application is running;
  • the display module 301 is further configured to reduce the video application interface and display it on the third sub-interface if the switching mode is triggered, wherein the third sub-interface is displayed on the user main interface.
  • the display module 301 is further configured to, after reducing the first game interface and displaying it on the first sub-interface and displaying the first virtual key on the first touch interface, display the application interface corresponding to a target application in full-screen mode when the target application is running, while the first sub-interface and the first touch interface are displayed on the user main interface, where the target application includes at least one of an instant messaging application, an entertainment application, and a tool application.
  • an embodiment of this application also provides another interface display control device, as shown in FIG. 31. For convenience of description, only the parts related to the embodiment of this application are shown; for specific technical details not disclosed, refer to the method embodiments of this application.
  • the following description takes the mobile terminal being a smart phone as an example:
  • FIG. 31 is a block diagram showing a partial structure of a smart phone related to the mobile terminal provided by the embodiment of the present application.
  • the smartphone includes: a radio frequency (RF) circuit 410, a memory 420, an input unit 430, a display unit 440, a sensor 450, an audio circuit 460, a Wireless Fidelity (WiFi) module 470, a processor 480, a power supply 490, and other components.
  • the input unit 430 may include a touch panel 431 and other input devices 432
  • the display unit 440 may include a display panel 441
  • the audio circuit 460 may include a speaker 461 and a microphone 462 .
  • the smart phone structure shown in FIG. 31 does not constitute a limitation on the smart phone, which may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
  • the processor 480 is the control center of the smart phone; it connects all parts of the smart phone through various interfaces and lines, and performs the various functions of the smart phone and processes data by running or executing the software programs and/or modules stored in the memory 420 and calling the data stored in the memory 420, thereby monitoring the smart phone as a whole.
  • the processor 480 may include one or more processing units; preferably, the processor 480 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 480.
  • the smart phone may also include a camera, a Bluetooth module, etc., which will not be repeated here.
  • the processor 480 included in the terminal may perform the functions in any of the foregoing embodiments corresponding to FIG. 3 to FIG. 29 , and details are not described herein again.
  • Embodiments of the present application also provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium, and when it runs on a computer, causes the computer to execute the methods described in the foregoing embodiments.
  • the embodiments of the present application also provide a computer program product including a program, which, when run on a computer, causes the computer to execute the methods described in the foregoing embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method, apparatus, device, and storage medium for controlling interface display, for use in the field of interface display technology. The method includes: when a first game application is running, displaying a first game interface and a first virtual key on the user main interface in full-screen mode; if a switch mode is triggered, creating a first sub-interface and a first touch interface, the first touch interface being used to display virtual keys for controlling the first game application, reducing the first game interface and displaying it on the first sub-interface, and displaying the first virtual key on the first touch interface. The method allows the user to operate other services on the mobile terminal while the game continues to progress, without requiring the user to switch interfaces frequently, thereby simplifying user operations.

Description

Method, apparatus, device, and storage medium for controlling interface display
This application claims priority to Chinese patent application No. 202011062334.3, entitled "Method, apparatus, device, and storage medium for controlling interface display", filed with the China Patent Office on September 30, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
This application relates to the field of interface display technology, and in particular to controlling interface display.
Background
With the continuous development of Internet technology, application types have become increasingly diverse, and users' demands on the display capabilities of mobile terminals keep growing. Mobile terminals play an important role both in people's daily communication and in work and entertainment.
At present, games running on mobile terminals mainly include stand-alone games and online games; after downloading a game application, a user can play on the mobile terminal. Usually, the game interface is displayed on the mobile terminal in full-screen mode, and the user operates the game on the display interface of the mobile terminal.
However, a mobile terminal provides many services besides games. If the user needs to handle another service during a game, the game must be paused or switched to run in the background, and resuming the game requires launching the game application again. The whole process is cumbersome, involves frequent human-computer interaction, and consumes processing resources of the mobile terminal.
Summary
Embodiments of this application provide a method, apparatus, device, and storage medium for controlling interface display, so that the user can operate other services on the mobile terminal while the game continues to progress, without switching interfaces frequently, thereby simplifying user operations, reducing the number of human-computer interactions, and reducing the consumption of the mobile terminal's processing resources.
In view of this, one aspect of this application provides a method for controlling interface display, applied to a mobile terminal, the method including:
when a first game application is running, displaying a first game interface and a first virtual key on the user main interface in full-screen mode, the first virtual key being used to control the first game application;
if a switch mode is triggered, reducing the first game interface and displaying it on a first sub-interface, and displaying the first virtual key on a first touch interface, where the first sub-interface and the first touch interface are created on the user main interface when the switch mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
Another aspect of this application provides an interface display control apparatus deployed on a mobile terminal, the apparatus including:
a display module configured to, when a first game application is running, display a first game interface and a first virtual key on the user main interface in full-screen mode, the first virtual key being used to control the first game application;
the display module being further configured to, if a switch mode is triggered, reduce the first game interface and display it on a first sub-interface, and display the first virtual key on a first touch interface, where the first sub-interface and the first touch interface are created on the user main interface when the switch mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
Another aspect of this application provides a mobile terminal, which may include a memory and a processor, where the memory stores a program, and the processor executes the program in the memory to perform the methods of the above aspects according to instructions in the program.
Another aspect of this application provides a computer-readable storage medium storing instructions that, when run on a computer, cause the computer to perform the methods of the above aspects.
Another aspect of this application provides a computer program product or computer program including computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the storage medium and executes them, causing the computer device to perform the methods provided in the above aspects.
As can be seen from the above technical solutions, the embodiments of this application have the following advantages:
In the embodiments of this application, a method for controlling interface display is provided. When a first game application is running, the mobile terminal displays a first game interface and a first virtual key on the user main interface in full-screen mode. If the switch mode is satisfied, a first sub-interface and a first touch interface are created on the user main interface, the first touch interface being used to display virtual keys for controlling the first game application; the first game interface is reduced and displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface. In this manner, a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the user main interface, so that the user can operate other services on the mobile terminal while the game continues to progress, without switching interfaces frequently, thereby simplifying user operations, reducing the number of human-computer interactions, and reducing the consumption of the mobile terminal's processing resources.
Brief Description of the Drawings
FIG. 1 is a schematic diagram of an environment of the interface display control method in an embodiment of this application;
FIG. 2 is a schematic flowchart of the interface display control method in an embodiment of this application;
FIG. 3 is a schematic diagram of an embodiment of the interface display control method in an embodiment of this application;
FIG. 4 is a schematic diagram of displaying the first game interface in full-screen mode in an embodiment of this application;
FIG. 5 is a schematic diagram of creating the first sub-interface and the first touch interface in an embodiment of this application;
FIG. 6 is a schematic diagram of switching to the user main interface in an embodiment of this application;
FIG. 7 is a schematic diagram of another embodiment of the interface display control method in an embodiment of this application;
FIG. 8 is a schematic diagram of switching to the first game interface in an embodiment of this application;
FIG. 9 is a schematic diagram of triggering a click operation on the first game interface in an embodiment of this application;
FIG. 10 is a schematic diagram of triggering a pressing operation on the first game interface in an embodiment of this application;
FIG. 11 is a schematic diagram of triggering a sliding operation on the first game interface in an embodiment of this application;
FIG. 12 is a schematic diagram of triggering a double-click operation on the first game interface in an embodiment of this application;
FIG. 13 is a schematic diagram of triggering a multi-touch-point operation on the first game interface in an embodiment of this application;
FIG. 14 is a schematic diagram of triggering a multi-touch-point operation on the first sub-interface in an embodiment of this application;
FIG. 15 is a schematic diagram of an embodiment of switching the device usage state in an embodiment of this application;
FIG. 16 is a schematic diagram of another embodiment of switching the device usage state in an embodiment of this application;
FIG. 17 is a schematic diagram of the included angle between the eye coordinate line and the horizontal direction of the mobile terminal in an embodiment of this application;
FIG. 18 is a schematic diagram of interface switching based on face recognition in an embodiment of this application;
FIG. 19 is a schematic diagram of interface switch-back based on face recognition in an embodiment of this application;
FIG. 20 is a schematic diagram of interface switching based on an incoming call reminder in an embodiment of this application;
FIG. 21 is a schematic diagram of interface display based on an embedded interface in an embodiment of this application;
FIG. 22 is a schematic diagram of interface display based on a floating interface in an embodiment of this application;
FIG. 23 is a schematic diagram of a drag operation on the first sub-interface in an embodiment of this application;
FIG. 24 is a schematic diagram of a drag operation on the first sub-interface in an embodiment of this application;
FIG. 25 is a schematic diagram of an embodiment of the first touch interface in an embodiment of this application;
FIG. 26 is a schematic diagram of an embodiment of selecting parameters of the first touch interface in an embodiment of this application;
FIG. 27 is a schematic diagram of another embodiment of the interface display control method in an embodiment of this application;
FIG. 28 is a schematic diagram of another embodiment of the interface display control method in an embodiment of this application;
FIG. 29 is a schematic diagram of another embodiment of the interface display control method in an embodiment of this application;
FIG. 30 is a schematic diagram of an embodiment of the interface display control apparatus in an embodiment of this application;
FIG. 31 is a schematic structural diagram of the mobile terminal in an embodiment of this application.
Detailed Description
Embodiments of this application provide a method, apparatus, device, and storage medium for controlling interface display, which allow the user to operate other services on the mobile terminal while the game continues to progress, without frequent interface switching, thereby simplifying user operations, reducing the number of human-computer interactions, and reducing the consumption of the mobile terminal's processing resources.
The terms "first", "second", "third", "fourth", etc. (if any) in the specification, claims, and drawings of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data so used may be interchanged where appropriate, so that the embodiments described herein can be implemented in orders other than those illustrated or described here. In addition, the terms "include" and "correspond to" and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device that includes a series of steps or units is not necessarily limited to the clearly listed steps or units, but may include other steps or units not clearly listed or inherent to such process, method, product, or device.
With the continuous development of Internet technology, application types have become increasingly diverse, and game applications running on mobile terminals are among the most popular. For a better game experience, the game interface is usually displayed on the mobile terminal in full-screen mode, and the user operates the game on the display interface of the mobile terminal. However, given the variety of services a mobile terminal provides, if the user needs another service during a game, the game has to be paused or switched to run in the background, making the whole process cumbersome. This application therefore provides a method for controlling interface display that lets the user operate other services on the mobile terminal while the game continues to progress, without frequent interface switching, simplifying user operations.
Illustratively, taking a game application as an example: when the user plays a game on a mobile terminal, the game interface is displayed in full-screen mode. When the user needs another service during the game, a sub-interface and a touch interface can be created on the user main interface, the game interface is reduced and displayed on the sub-interface, and virtual keys are displayed on the touch interface, through which the user can control the game application. The user can thus operate other services while the game progresses, without frequent interface switching, which simplifies operation and improves the user experience, while reducing the number of human-computer interactions and the mobile terminal's processing-resource consumption.
For ease of understanding, refer to FIG. 1, a schematic diagram of an environment of the interface display control method in an embodiment of this application. As shown, the game interface display control system includes a mobile terminal on which an application such as a game application is deployed. The mobile terminal in this application may be, but is not limited to, a smart phone, a tablet computer, or a palmtop computer. Taking the game application being the first game application as an example, A1 indicates the first game interface in full-screen mode, A2 the first sub-interface, A3 the first game interface displayed on the first sub-interface, A4 the first touch interface, and A5 the first virtual key. When the mobile terminal runs the first game application, the first game interface and the first virtual key are displayed on the user main interface in full-screen mode. When the switch mode is triggered, the mobile terminal creates the first sub-interface and the first touch interface on the user main interface, whereby the first game interface is reduced and displayed on the first sub-interface, and the first virtual key, used to control the first game application, is displayed on the first touch interface. When the switch-back mode is triggered, the mobile terminal again displays the first game interface and the first virtual key on the user main interface in full-screen mode.
Taking game application interface display as an example: the game engine renders images to the mobile terminal's display interface through a SurfaceView. Because a SurfaceView has an independent drawing surface, i.e., it does not share a drawing surface with its host window, the SurfaceView's user interface (UI) can be drawn in a separate thread. Moreover, a SurfaceView does not occupy main-thread resources, so it can implement complex and efficient UIs without making the user's operations unresponsive. In this application, the SurfaceView can be implemented in picture-in-picture mode or floating window mode. Picture-in-picture is a presentation mode in which, while one picture is displayed on the mobile terminal, another picture is simultaneously shown in a small area of it. Floating window mode floats a movable window above the surface of the application; it should be understood that using floating window mode on the mobile terminal may additionally require system authorization.
Based on the above introduction, for ease of understanding, refer to FIG. 2, a schematic flowchart of the interface display control method in an embodiment of this application:
In step S1, the mobile terminal initiates an authorization request to the user; the requested permissions may include top-level window display permission, front camera shooting permission, gravity sensing permission, incoming call monitoring permission, and so on.
In step S2, the mobile terminal determines in real time whether the switch mode is currently triggered; if so, step S3 is performed. The switch mode includes, but is not limited to, the user clicking a virtual button in the game interface, the screen rotating, an incoming phone call, the orientation of the user's face matching the orientation of the smart phone screen, and so on.
In step S3, the mobile terminal creates a sub-interface and a touch area on its user main interface; the sub-interface and touch area may use floating window mode or picture-in-picture mode, which is not limited here.
In step S4, the mobile terminal renders the game interface into the sub-interface and switches the game to run in the background.
In step S5, the user may drag the sub-interface or operate on it, and controls the game progress through the touch area.
In step S6, the mobile terminal determines in real time whether the switch-back mode is triggered; if so, the game interface is restored to the foreground, the touch area disappears, and the game interface is displayed in full-screen mode.
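The flow of steps S1 to S6 amounts to a small two-state machine: full-screen display on one side, sub-interface plus touch area on the other. A minimal, platform-neutral Python sketch of this logic follows; the class and method names (`GameDisplay`, `on_switch`, `on_switch_back`) are illustrative assumptions, not names from the application.

```python
class GameDisplay:
    """Minimal sketch of the full-screen <-> sub-interface switching flow."""

    def __init__(self):
        # The game interface starts out shown in full-screen mode.
        self.mode = "fullscreen"

    def on_switch(self):
        """Switch mode triggered (S2): create the sub-interface and touch
        area (S3), render the game into the sub-interface, and move the
        game to the background (S4)."""
        if self.mode == "fullscreen":
            self.mode = "sub_interface"

    def on_switch_back(self):
        """Switch-back mode triggered (S6): restore the game interface to
        the foreground in full-screen mode; the touch area disappears."""
        if self.mode == "sub_interface":
            self.mode = "fullscreen"


d = GameDisplay()
d.on_switch()
assert d.mode == "sub_interface"
d.on_switch_back()
assert d.mode == "fullscreen"
```

The guard conditions make each trigger idempotent, mirroring the real-time checks in steps S2 and S6: a switch trigger while already in sub-interface mode changes nothing.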
Based on the above introduction, the process of switching the first game interface to the first sub-interface is described below. Refer to FIG. 3, a schematic diagram of an embodiment of the interface display control method in an embodiment of this application; an embodiment of controlling interface display includes:
101. When the mobile terminal runs the first game application, the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the first virtual key being used to control the first game application.
In this embodiment, when the mobile terminal runs the first game application, it displays the first game interface and the first virtual key on the user main interface in full-screen mode. Full-screen mode may mean displaying the first game interface in the entire displayable area of the screen, or in most of the displayable area; for example, the mobile terminal may display a status bar (information such as carrier, time, and battery level) in a small displayable strip at the top of the screen and the first game interface in the rest of the screen.
For ease of understanding, refer to FIG. 4, a schematic diagram of displaying the first game interface in full-screen mode: (A) shows the first game interface displayed in most of the displayable area of the screen, and (B) shows it displayed in the entire displayable area.
102. If the switch mode is triggered, reduce the first game interface and display it on the first sub-interface, and display the first virtual key on the first touch interface, where the first sub-interface and the first touch interface are created on the user main interface when the switch mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
In this embodiment, if the switch mode is triggered, the mobile terminal can create the first sub-interface and the first touch interface on the user main interface. The user main interface is the first user interface seen after starting the mobile terminal; it typically includes commonly used application icons, the time, battery information, carrier information, and so on.
For ease of understanding, refer to FIG. 5, a schematic diagram of creating the first sub-interface and the first touch interface: B1 indicates the user main interface, B2 the first sub-interface, and B3 the first touch interface. It should be understood that the example in FIG. 5 is only for understanding this solution: the first sub-interface and the first touch interface may also be circular, triangular, square, etc., and may also be located directly above, directly below, or at other positions of the user main interface. Their shapes and display positions in the user main interface can be adjusted according to actual conditions; for example, the first touch interface may occupy a display position in the user main interface independent of the first sub-interface, or the display position where the first sub-interface is located, which is not limited here.
On this basis, the mobile terminal can reduce the first game interface and display it on the first sub-interface; since the first virtual key is a virtual key that controls the first game application, it can be displayed on the first touch interface. It should be noted that the first virtual key displayed on the first touch interface and the first virtual key displayed in full-screen mode are both used to control the first game application; they may be the same or different in key count and key appearance, which is not limited here.
Therefore, while operating other services on the mobile terminal, the user can control the first game application through the first virtual key displayed on the first touch interface, thereby continuing the progress of the first game application.
For ease of understanding, refer to FIG. 6, a schematic diagram of switching to the user main interface: (A) shows the first game interface before reduction, (B) shows the first game interface after reduction, and (C) shows the user main interface; C1 indicates the first sub-interface, C2 the first touch interface, and C3 the first virtual key on the first touch interface. The reduced first game interface is displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface. It should be understood that the example in FIG. 6 is only for understanding this solution: the reduced size of the first game interface is determined by the size of the first sub-interface, and the key shape and key type of the first virtual key are both adjustable.
In this embodiment of this application, a method for controlling interface display is provided. In the above manner, a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the user main interface, so that the user can operate other services on the mobile terminal while the game continues to progress, without frequent interface switching, simplifying user operations, reducing the number of human-computer interactions, and reducing the consumption of the mobile terminal's processing resources.
Based on the above introduction, the process of switching back from the first sub-interface to the first game interface is described below. Refer to FIG. 7, a schematic diagram of another embodiment of the interface display control method; an embodiment of controlling interface display includes:
201. When the mobile terminal runs the first game application, the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the first virtual key being used to control the first game application.
202. If the switch mode is triggered, the mobile terminal reduces the first game interface and displays it on the first sub-interface, and displays the first virtual key on the first touch interface, where the first sub-interface and the first touch interface are created on the user main interface when the switch mode is triggered, and the first touch interface is used to display virtual keys for controlling the first game application.
In this embodiment, steps 201 to 202 are similar to steps 101 to 102 and are not repeated here.
203. If the switch-back mode is triggered, display the first game interface and the first virtual key on the user main interface in full-screen mode.
In this embodiment, when the switch-back mode is triggered, the mobile terminal switches to displaying the first game interface and the first virtual key of the first game application in full-screen mode, and no longer displays the first touch area separately.
For ease of understanding, refer to FIG. 8, a schematic diagram of switching to the first game interface: (A) shows the user main interface, and (B) shows the first game interface in full-screen mode; D1 indicates the first sub-interface, D2 the first touch interface, and D3 the first virtual key. While operating other applications, the user can still view the game interface of the first game application through the first sub-interface and control the first game application through the first virtual key displayed on the first touch interface. If the user wants to return to the game interface in full-screen mode, the user may hold the mobile terminal in landscape orientation, or click a second virtual button in the first sub-interface, etc., thereby switching to the view shown in (B) of FIG. 8.
In this embodiment of this application, another method for controlling interface display is provided. In the above manner, a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the user main interface, so that the user can operate other services on the mobile terminal while the game continues to progress, without frequent interface switching, simplifying user operations. Furthermore, when switching back from the sub-interface to full-screen mode, game progress is also preserved during the switch, so user operations are further simplified while meeting user needs, the number of human-computer interactions is reduced, and the consumption of the mobile terminal's processing resources is reduced.
In a possible implementation, based on the embodiment corresponding to FIG. 3, in another optional embodiment provided by this application, after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the method further includes the following step:
if a preset switch operation is triggered, the mobile terminal determines that the switch mode is triggered, where the preset switch operation includes at least one of a click operation, a pressing operation, a sliding operation, a double-click operation, and a multi-touch-point operation.
This embodiment describes a method of determining the trigger of the switch mode through a preset switch operation. After the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, if the user needs to operate another application (for example, a video application, an instant messaging application, a camera application, or a food ordering application), the user can perform a preset switch operation on the first game interface on the touch screen. Therefore, when the mobile terminal detects a preset switch operation by a touch object on the touch screen for the first game interface, it determines that the switch mode is triggered. It should be noted that the touch object may be the user's finger, a knuckle, or another touch-capable object, which is not specifically limited here.
When the switch mode is satisfied, the first sub-interface and the first touch interface can be created on the user main interface; the manner of creating them has been described in step 102 and is not repeated here. It can be understood that the preset switch operation includes, but is not limited to, a click operation, a pressing operation, a sliding operation, a double-click operation, and a multi-touch-point operation.
In this embodiment of this application, a method of determining the trigger of the switch mode through a preset switch operation is provided. In the above manner, the user performs interface switching through a preset switch operation, which improves the feasibility of implementing the solution; moreover, since the preset switch operation can be at least one of multiple operations, the user can operate according to need, increasing the flexibility of the solution.
In a possible implementation, based on the embodiment corresponding to FIG. 3, in another optional embodiment provided by this application, the step in which the mobile terminal determines that the switch mode is triggered if a preset switch operation is triggered may include the following steps:
if a click operation is triggered on a first virtual button in the first game interface, the mobile terminal determines that the switch mode is triggered;
or,
if a pressing operation is triggered on the first game interface, the mobile terminal determines that the switch mode is triggered, where the pressing duration of the pressing operation is greater than or equal to a first time threshold;
or,
if a sliding operation is triggered on the first game interface, the mobile terminal determines that the switch mode is triggered, where the sliding trajectory of the sliding operation is generated based on the start position and the end position of the touch object;
or,
if a double-click operation is triggered on the first game interface, the mobile terminal determines that the switch mode is triggered, where the double-click time interval of the double-click operation is less than or equal to a second time threshold;
or,
if a multi-touch-point operation is triggered on the first game interface, the mobile terminal determines that the switch mode is triggered, where the multi-touch-point operation is generated based on at least two touch points moving inward toward each other.
This embodiment describes several ways of triggering the preset switch operation, illustrated below with specific examples.
The first way is the click operation. If the user clicks the first virtual button in the first game interface, the click operation is triggered. For ease of understanding, refer to FIG. 9: (A) shows the first game interface, and E1 indicates the first virtual button; when the user clicks the first virtual button, the mobile terminal detects the click operation on the first virtual button on the touch screen and determines that the switch mode is triggered, thereby entering (B), which shows the user main interface including the first sub-interface and the first touch interface. It should be noted that the example in FIG. 9 is only for understanding this solution: in practice, the first virtual button may also be circular, elliptical, triangular, five-pointed-star shaped, etc., and may be located at any position in the first game interface; its specific shape and position can be determined flexibly according to actual conditions.
The second way is the pressing operation, triggered when the duration of the user's press on the first game interface is greater than or equal to the first time threshold. It should be noted that the first time threshold can be set according to actual needs, for example 2 seconds or 3 seconds, and is not limited here. For ease of understanding, refer to FIG. 10: (A) shows the first game interface; when the user presses the touch screen for a duration greater than or equal to the first time threshold, the mobile terminal determines that the switch mode is triggered and enters (B), the user main interface including the first sub-interface and the first touch interface.
The third way is the sliding operation, triggered when the user slides on the first game interface and the sliding trajectory is generated based on the start position and the end position of the touch object. In practice, the sliding operation may also be triggered by the mobile terminal activating the camera to capture the touch object (for example, a hand or a stylus) performing a sliding operation in the air, without contact with the touch screen. Taking a finger as the touch object, refer to FIG. 11: (A) shows the first game interface; F1 indicates the start position of the finger on the touch screen, F2 the end position, and F3 the sliding trajectory of the sliding operation. When the user's finger slides on the touch screen from the start position to the end position and forms the sliding trajectory, the mobile terminal determines that the switch mode is triggered and enters (B), the user main interface including the first sub-interface and the first touch interface.
The fourth way is the double-click operation, triggered when the time interval between two consecutive clicks by the user on the first game interface is less than or equal to the second time threshold. It should be noted that the second time threshold can be set according to actual needs, for example 0.5 second or 1 second, and is not limited here. Refer to FIG. 12: (A) shows the first game interface; when the user double-clicks on the touch screen with an interval less than or equal to the second time threshold, the mobile terminal can detect the switch mode and enters (B), the user main interface including the first sub-interface and the first touch interface.
The fifth way is the multi-touch-point operation, triggered when the user pinches at least two fingers inward toward each other on the first game interface. Refer to FIG. 13: (A) shows the first game interface; G1 indicates touch point A and G2 touch point B. When the user touches points A and B on the touch screen and moves the fingers inward toward each other (the two arrow directions shown in (A) of FIG. 13), the mobile terminal determines that the switch mode is triggered and enters (B), the user main interface including the first sub-interface and the first touch interface.
In this embodiment of this application, multiple ways of triggering the preset switch operation are provided. In the above manner, the preset switch operation includes several different operations, on the basis of which the user can perform different preset switch operations according to need, further improving the flexibility of this solution.
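The five preset operations above are all distinguished by simple timing and geometry thresholds (press duration, double-click interval, displacement, touch-point count). A hypothetical Python sketch of that classification logic follows; the event format, function name, and default threshold values are illustrative assumptions, not specifics from the application.

```python
def classify_gesture(events, press_threshold=2.0, double_click_interval=0.5):
    """Classify a touch event sequence into one of the preset operations.

    events: list of (kind, t, points) tuples, where kind is 'down' or 'up',
    t is a timestamp in seconds, and points is a list of (x, y) touch points.
    """
    downs = [e for e in events if e[0] == "down"]
    ups = [e for e in events if e[0] == "up"]
    # multi-touch-point operation: at least two simultaneous touch points
    if downs and len(downs[0][2]) >= 2:
        return "multi_touch"
    # double-click: two taps within the second time threshold
    if len(downs) == 2 and downs[1][1] - downs[0][1] <= double_click_interval:
        return "double_click"
    # pressing operation: hold time at or above the first time threshold
    if len(downs) == 1 and ups and ups[0][1] - downs[0][1] >= press_threshold:
        return "press"
    # sliding operation: start and end positions differ
    if len(downs) == 1 and ups and ups[0][2][0] != downs[0][2][0]:
        return "slide"
    return "click"


assert classify_gesture([("down", 0.0, [(0, 0)]), ("up", 2.5, [(0, 0)])]) == "press"
assert classify_gesture([("down", 0.0, [(0, 0)]), ("up", 0.1, [(5, 5)])]) == "slide"
```

A real implementation would also track movement direction to tell an inward pinch (switch trigger) from an outward spread (switch-back trigger); here only the touch-point count is checked for brevity.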
Based on the embodiment corresponding to FIG. 7, in another optional embodiment provided by this application, after the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, the method further includes the following step:
if a preset switch-back operation is triggered on the first sub-interface, the mobile terminal determines that the switch-back mode is triggered, where the preset switch-back operation includes at least one of a click operation, a pressing operation, a sliding operation, a double-click operation, and a multi-touch-point operation.
This embodiment describes a method of determining the trigger of the switch-back mode through a preset operation. After the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, if the user wants to return to playing the game in full-screen mode, the user can perform a preset switch-back operation on the first sub-interface on the touch screen. Therefore, when the mobile terminal detects a preset switch-back operation by a touch object on the touch screen for the first sub-interface, it determines that the preset switch-back operation is triggered. It should be noted that the touch object may be the user's finger, a knuckle, or another touch-capable object, which is not specifically limited here.
When the switch-back mode is triggered, the first game interface is displayed in full-screen mode; the manner of displaying the first game interface in full-screen mode has been described in step 203 and is not repeated here. It can be understood that the preset switch-back operation includes, but is not limited to, a click operation, a pressing operation, a sliding operation, a double-click operation, and a multi-touch-point operation.
In this embodiment of this application, a method of determining the trigger of the switch-back mode through a preset operation is provided. In the above manner, the user performs interface switch-back through a preset switch-back operation, which improves the feasibility of implementing the solution; moreover, since the preset switch-back operation can be at least one of multiple operations, the user can operate according to need, increasing the flexibility of the solution.
Based on the embodiment corresponding to FIG. 7, in another optional embodiment provided by this application, the step in which the mobile terminal determines that the switch-back mode is triggered if a preset switch-back operation is triggered on the first sub-interface may include the following steps:
if a click operation is triggered on a second virtual button in the first sub-interface, the mobile terminal determines that the switch-back mode is triggered;
or,
if a pressing operation is triggered on the first sub-interface, the mobile terminal determines that the switch-back mode is triggered, where the pressing duration of the pressing operation is greater than or equal to the first time threshold;
or,
if a sliding operation is triggered on the first sub-interface, the mobile terminal determines that the switch-back mode is triggered, where the sliding trajectory of the sliding operation is generated based on the start position and the end position of the touch object;
or,
if a double-click operation is triggered on the first sub-interface, the mobile terminal determines that the switch-back mode is triggered, where the double-click time interval of the double-click operation is less than or equal to the second time threshold;
or,
if a multi-touch-point operation is triggered on the first sub-interface, the mobile terminal determines that the switch-back mode is triggered, where the multi-touch-point operation is generated based on at least two touch points extending outward.
This embodiment describes several ways of triggering the preset switch-back operation, illustrated below with specific examples.
The first way is the click operation. The user can click the second virtual button in the first sub-interface on the touch screen, and the mobile terminal detects the click operation; this is similar to the example described for FIG. 9 and is not repeated here.
The second way is the pressing operation. The user can press the first sub-interface, and when the pressing duration is greater than or equal to the first time threshold, the mobile terminal detects the pressing operation. The first time threshold can be set according to actual needs, for example 2 seconds or 3 seconds, and is not limited here. This is similar to the example of FIG. 10 and is not repeated here.
The third way is the sliding operation. The user can slide on the first sub-interface, the sliding trajectory being generated based on the start position and the end position of the touch object, and the mobile terminal detects the sliding operation; this is similar to the example of FIG. 11 and is not repeated here.
The fourth way is the double-click operation. The user can double-click on the first sub-interface with a double-click time interval less than or equal to the second time threshold, whereupon the mobile terminal detects the double-click operation. The second time threshold can be set according to actual needs, for example 1 second or 2 seconds, and is not limited here. This is similar to the example of FIG. 12 and is not repeated here.
The fifth way is the multi-touch-point operation. The user can spread at least two fingers outward simultaneously on the first sub-interface, thereby triggering the multi-touch-point operation, which the mobile terminal detects. For ease of understanding, refer to FIG. 14: (A) shows the user main interface; H1 indicates the first sub-interface, H2 touch point A, and H3 touch point B. When the user touches points A and B on the touch screen and spreads the fingers outward (the two arrow directions shown in (A) of FIG. 14), the mobile terminal determines that the switch-back mode is triggered and enters (B), which shows the first game interface displayed in full-screen mode.
In this embodiment of this application, multiple ways of triggering the preset switch-back operation are provided. In the above manner, the preset switch-back operation includes several different operations, on the basis of which the user can perform different preset switch-back operations according to need, further improving the flexibility of this solution.
Based on the embodiment corresponding to FIG. 3, in another optional embodiment provided by this application, after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the method further includes the following steps:
the mobile terminal acquires a device usage state through a gravity sensor, where the device usage state describes the posture of the mobile terminal when it is held;
if the device usage state indicates that the mobile terminal is in a portrait use state, the mobile terminal determines that the switch mode is triggered.
This embodiment describes a method of determining the trigger of the switch mode based on the device usage state. After the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, it can acquire the device usage state through the gravity sensor. The device usage state describes the posture of the mobile terminal when it is held and mainly includes a portrait use state and a landscape use state. Since displaying the first game interface in full-screen mode is usually done in the landscape use state, the switch mode can be determined to be triggered when the mobile terminal is in the portrait use state.
For example, the gravity sensor can determine the state of the mobile terminal, i.e., obtain the device usage state, by measuring the components of the mobile terminal's gravity along two orthogonal directions. In practice, the device usage state may also be obtained through a gyroscope built into the mobile terminal. The rotation axis of a rapidly spinning body tends toward the vertical in response to an external force that tries to change its direction, and when the spinning body tilts laterally, gravity acts in the direction of increasing tilt while the axis moves toward the vertical, producing a precession motion. Therefore, when the gyroscope's rotor spins about a horizontal axis, the Earth's rotation subjects it to a vertical turning force, and the rotor precesses toward the meridian direction in the horizontal plane, from which the device usage state can be obtained.
For ease of understanding, refer to FIG. 15, a schematic diagram of an embodiment of switching the device usage state: (A) shows the first game interface displayed in full-screen mode in the landscape use state. When the user turns the mobile terminal from landscape to portrait, as shown in (B), which shows the first game interface displayed in full-screen mode in the portrait use state, the mobile terminal determines that the switch mode is triggered and enters (C), the user main interface including the first sub-interface and the first touch interface.
In this embodiment of this application, a method of determining the trigger of the switch mode based on the device usage state is provided. In the above manner, when the user adjusts the device usage state, the user may need to operate other applications or have other needs; determining the switch trigger from the device usage state therefore further clarifies the specific switch mode and improves the feasibility of this solution.
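The portrait-versus-landscape decision from the gravity sensor can be sketched by comparing the magnitudes of the two orthogonal gravity components. The following Python sketch is a platform-neutral illustration; the axis convention (x along the screen's short axis, y along its long axis) and the function names are assumptions.

```python
def device_usage_state(gx, gy):
    """Infer the held posture from gravity components along the screen axes.

    gx, gy: gravity components (m/s^2) along the screen's short and long
    axes. When gravity acts mainly along the long axis, the terminal is
    upright (portrait use state); when it acts mainly along the short
    axis, the terminal is held sideways (landscape use state).
    """
    return "portrait" if abs(gy) >= abs(gx) else "landscape"


def should_switch(state_now, displayed_fullscreen):
    # Full-screen game display is normally a landscape use state, so
    # turning the terminal to portrait triggers the switch mode.
    return displayed_fullscreen and state_now == "portrait"
```

For example, with the terminal upright, almost all of gravity falls on the long axis (`gy` near 9.8), so the state reads portrait and, if the game is currently full-screen, the switch mode is triggered.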
Based on the embodiment corresponding to FIG. 7, in another optional embodiment provided by this application, after the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, the method further includes the following steps:
the mobile terminal acquires the device usage state through the gravity sensor, where the device usage state describes the posture of the mobile terminal when it is held;
if the device usage state indicates that the mobile terminal is in a landscape use state, the mobile terminal determines that the switch-back mode is triggered.
This embodiment describes a method of determining the trigger of the switch-back mode based on the device usage state. After the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, it can acquire the device usage state through the gravity sensor. The device usage state describes the posture of the mobile terminal when it is held and mainly includes a portrait use state and a landscape use state. Since the first sub-interface and the first virtual key are usually displayed in the portrait use state, the switch-back mode can be determined to be triggered when the mobile terminal is in the landscape use state.
For ease of understanding, refer to FIG. 16, a schematic diagram of another embodiment of switching the device usage state: (A) shows the user main interface including the first sub-interface and the first touch interface. When the user turns the mobile terminal from portrait to landscape, as shown in (B), which shows the user main interface including the first sub-interface and the first touch interface in the landscape use state, the mobile terminal determines that the switch-back mode is triggered and enters (C), which shows the first game interface displayed in full-screen mode in the landscape use state.
In this embodiment of this application, a method of determining the trigger of the switch-back mode based on the device usage state is provided. In the above manner, when the user adjusts the device usage state to the landscape use state, the user usually wants the first game interface displayed in full-screen mode; the switch-back trigger can therefore be determined from the device usage state, which further clarifies the specific switch-back mode and improves the feasibility of this solution.
Based on the embodiment corresponding to FIG. 3, in another optional embodiment provided by this application, after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the method further includes the following steps:
the mobile terminal acquires a face image through the front camera;
the mobile terminal determines, from the face image, a left-eye coordinate point corresponding to the left eye and a right-eye coordinate point corresponding to the right eye;
the mobile terminal determines the eye coordinate line from the left-eye coordinate point and the right-eye coordinate point;
if the included angle between the eye coordinate line and the horizontal direction of the mobile terminal is less than or equal to an angle threshold, the mobile terminal determines that the switch mode is triggered, where the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in the portrait use state.
This embodiment describes a method of triggering the switch mode based on face recognition. After the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, it can also capture a face image in real time through the front camera, determine the left-eye and right-eye coordinate points from the face image, and generate the eye coordinate line from them. If the included angle between the eye coordinate line and the horizontal direction of the mobile terminal is less than or equal to the angle threshold, the switch mode is determined to be triggered. The horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in the portrait use state. It should be noted that the angle threshold can be set according to actual needs, for example 30°; in practice it may also be set to 45°, 20°, or another angle, which is not limited here.
For ease of understanding, refer to FIG. 17, a schematic diagram of the included angle between the eye coordinate line and the horizontal direction of the mobile terminal: (A) shows the mobile terminal in the portrait use state and (B) in the landscape use state; I1 indicates the horizontal direction of the mobile terminal, I2 the left eye, and I3 the right eye. As can be seen in (A), the line between the left and right eyes is parallel to the horizontal direction of the mobile terminal, i.e., the included angle between the eye coordinate line and the horizontal direction is 0°. As can be seen in (B), the left and right eyes are perpendicular to the horizontal direction of the mobile terminal, i.e., the included angle between the eye coordinate line and the horizontal direction is 90°.
Taking an angle threshold of 30° as an example, refer to FIG. 18, a schematic diagram of interface switching based on face recognition: (A) shows the mobile terminal; J1 indicates the horizontal direction of the mobile terminal, J2 the left eye, J3 the right eye, and J4 the included angle between the eye coordinate line and the horizontal direction. As can be seen in (A), the eye coordinate line between the left-eye and right-eye coordinate points forms an included angle with the horizontal direction of the mobile terminal; if this angle is 20°, it is less than the 30° angle threshold, so the switch mode is satisfied, giving (B), the user main interface including the first sub-interface and the first touch interface.
In this embodiment of this application, a method of triggering the switch mode based on face recognition is provided. In the above manner, the left-eye and right-eye coordinate points are determined from the face image, the eye coordinate line is further determined, and whether the switch mode is triggered is judged from the relationship between the angle of that line and the horizontal direction of the mobile terminal. This clarifies the specific switch mode and requires no active user operation: merely acquiring the face image suffices to determine whether the switch mode is triggered, simplifying user operations.
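The included angle between the line connecting the two eye coordinate points and the terminal's horizontal direction can be computed directly from the landmark coordinates. The Python sketch below uses the 30° threshold from the example above; landmark extraction itself is assumed to be available, and the function names and the image-x-axis convention for the terminal's horizontal direction are illustrative assumptions.

```python
import math

def eye_line_angle(left_eye, right_eye):
    """Angle in degrees between the eye coordinate line and the terminal's
    horizontal direction (taken as the x axis of the front-camera image,
    i.e., the horizontal of the portrait use state)."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    return abs(math.degrees(math.atan2(dy, dx))) % 180


def triggers_switch(left_eye, right_eye, angle_threshold=30.0):
    """Switch mode is triggered when the included angle is at or below
    the angle threshold (eyes roughly level with the portrait horizontal)."""
    angle = eye_line_angle(left_eye, right_eye)
    angle = min(angle, 180 - angle)  # fold into [0, 90] as an included angle
    return angle <= angle_threshold
```

With the eyes level in the image (for example, left eye at (100, 200) and right eye at (200, 200)) the angle is 0° and the switch mode is triggered; with the terminal held in landscape, the eye line runs along the image's vertical, the angle is 90°, and it is not.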
Based on the embodiment corresponding to FIG. 7, in another optional embodiment provided by this application, after the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, the method further includes the following steps:
the mobile terminal acquires a face image through the front camera;
the mobile terminal determines, from the face image, a left-eye coordinate point corresponding to the left eye and a right-eye coordinate point corresponding to the right eye;
the mobile terminal determines the eye coordinate line from the left-eye coordinate point and the right-eye coordinate point;
if the included angle between the eye coordinate line and the horizontal direction of the mobile terminal is greater than the angle threshold, the mobile terminal determines that the switch-back mode is triggered, where the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in the portrait use state.
This embodiment describes a method of determining the trigger of the switch-back mode based on face recognition. After the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, it can also capture a face image in real time through the front camera, determine the left-eye and right-eye coordinate points from the face image, and generate the eye coordinate line from them. If the included angle between the eye coordinate line and the horizontal direction of the mobile terminal is greater than the angle threshold, the switch-back mode is determined to be triggered. The horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in the portrait use state. It should be noted that the angle threshold can be set according to actual needs, for example 30°; in practice it may also be set to 45°, 20°, or another angle, which is not limited here.
For ease of understanding, taking an angle threshold of 30° as an example, refer to FIG. 19, a schematic diagram of interface switch-back based on face recognition: (A) shows the mobile terminal; K1 indicates the horizontal direction of the mobile terminal, K2 the left eye, K3 the right eye, and K4 the included angle between the eye coordinate line and the horizontal direction. As can be seen in (A), the eye coordinate line between the left-eye and right-eye coordinate points forms an included angle with the horizontal direction of the mobile terminal; if this angle is 45°, it is greater than the 30° angle threshold, so the switch-back mode is triggered, giving (B), which shows the first game interface displayed in full-screen mode.
In this embodiment of this application, a method of determining the trigger of the switch-back mode based on face recognition is provided. In the above manner, the left-eye and right-eye coordinate points are determined from the face image, the eye coordinate line is further determined, and whether the switch-back mode is triggered is judged from the relationship between the angle of that line and the horizontal direction of the mobile terminal. No active user operation is needed: merely acquiring the face image suffices to determine whether the switch-back mode is triggered, simplifying user operations.
Based on the embodiment corresponding to FIG. 3, in another optional embodiment provided by this application, after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the method further includes the following steps:
the mobile terminal monitors for an incoming call reminder;
if an answering instruction for the incoming call reminder is triggered, the mobile terminal determines that the switch mode is triggered.
This embodiment describes a method of determining the trigger of the switch mode based on an incoming call. The mobile terminal can monitor incoming call reminders in real time; when an incoming call reminder appears and the user chooses to answer the call, the mobile terminal can detect the answering instruction for the incoming call reminder, thereby determining that the switch mode is triggered. The answering operation by which the user chooses to answer the call may be clicking an answer button.
For ease of understanding, refer to FIG. 20, a schematic diagram of interface switching based on an incoming call reminder: (A) shows the first game interface displayed in full-screen mode. When the mobile terminal displays an incoming call reminder, if the user chooses to answer, the user can click the answer button, whereupon the mobile terminal detects the answering instruction for the incoming call reminder and determines that the switch mode is triggered, entering (B), the user main interface including the first sub-interface and the first touch interface.
In this embodiment of this application, a method of determining the trigger of the switch mode based on an incoming call is provided. In the above manner, if the user chooses to answer an incoming call, interface switching is performed automatically, which simplifies user operations and improves the feasibility of this solution.
Based on the embodiment corresponding to FIG. 3, in another optional embodiment provided by this application, after the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, the method further includes the following steps:
the mobile terminal acquires the voice to be detected through a microphone;
if the voice to be detected successfully matches a first preset voice, the mobile terminal determines that the switch mode is triggered.
This embodiment describes a method of determining the trigger of the switch mode based on voice. After the mobile terminal displays the first game interface and the first virtual key on the user main interface in full-screen mode, it can also acquire the voice to be detected in real time through the microphone; if the voice to be detected successfully matches the first preset voice, the switch mode is determined to be triggered. Illustratively, suppose the first preset voice is "interface switch": if the user says "interface switch" to the microphone (i.e., "interface switch" is the voice to be detected), the mobile terminal matches the voice to be detected with the first preset voice and, on a successful match, determines that the switch mode is triggered. It should be understood that this example is only for understanding this solution; the first preset voice can also be set according to actual selection.
In this embodiment of this application, a method of determining the trigger of the switch mode based on voice is provided. In the above manner, the user can perform interface switching by voice, which simplifies user operations and increases the flexibility of applying the solution.
Based on the embodiment corresponding to FIG. 7, in another optional embodiment provided by this application, after the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, the method further includes the following steps:
the mobile terminal acquires the voice to be detected through the microphone;
if the voice to be detected successfully matches a second preset voice, the mobile terminal determines that the switch-back mode is triggered.
This embodiment describes a method of determining the trigger of the switch-back mode based on voice. After the mobile terminal reduces the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, it can also acquire the voice to be detected in real time through the microphone; if the voice to be detected successfully matches the second preset voice, the switch-back mode is determined to be triggered. Illustratively, suppose the second preset voice is "interface switch back": if the user says "interface switch back" to the microphone (i.e., "interface switch back" is the voice to be detected), the mobile terminal matches the voice to be detected with the second preset voice and, on a successful match, determines that the switch-back mode is triggered. It should be understood that this example is only for understanding this solution; the second preset voice can also be set according to actual selection.
In this embodiment of this application, a method of determining the trigger of the switch-back mode based on voice is provided. In the above manner, the user can perform interface switch-back by voice, which simplifies user operations and increases the flexibility of applying the solution.
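Both voice triggers reduce to matching recognized text against a preset phrase for the current display state. The sketch below assumes speech recognition is already available and returns text; the phrase strings are the examples used above, and the dictionary form and function name are illustrative assumptions.

```python
PRESET_COMMANDS = {
    # (current display state, recognized phrase) -> mode it triggers
    ("fullscreen", "interface switch"): "switch",
    ("sub_interface", "interface switch back"): "switch_back",
}


def match_voice(display_state, recognized_text):
    """Return the triggered mode, or None when the voice to be detected
    does not match the preset voice for the current display state."""
    key = (display_state, recognized_text.strip().lower())
    return PRESET_COMMANDS.get(key)
```

Keying the lookup on the display state ensures the first preset voice only triggers the switch mode from full-screen display, and the second only triggers the switch-back mode from the sub-interface display.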
Based on the embodiments corresponding to FIG. 3 and FIG. 7, in another optional embodiment provided by this application, the first sub-interface and the first touch interface both belong to the embedded interface type;
the step in which the mobile terminal reduces the first game interface, displays it on the first sub-interface, and displays the first virtual key on the first touch interface may include the following steps:
the mobile terminal creates the first sub-interface on a first preset area of the user main interface, where the first preset area is a fixed area in the user main interface;
the mobile terminal creates the first touch interface on a second preset area of the user main interface, where the second preset area is another fixed area in the user main interface;
the mobile terminal reduces the first game interface and displays it on the first sub-interface in picture-in-picture mode, and displays the first virtual key on the first touch interface in picture-in-picture mode.
This embodiment describes a method of implementing interface display based on embedded interfaces. At least one of the first sub-interface and the first touch interface may be an embedded interface. Since an embedded interface is a fixed interface, the mobile terminal can create the first sub-interface on the first preset area, a fixed area of the user main interface, and can additionally create the first touch interface on the second preset area, another fixed area of the user main interface. On this basis, the mobile terminal displays the reduced first game interface on the first sub-interface in picture-in-picture mode and displays the first virtual key on the first touch interface in picture-in-picture mode.
For ease of understanding, taking a smart phone as the mobile terminal as an example, refer to FIG. 21, a schematic diagram of interface display based on an embedded interface: L1 indicates the user main interface, L2 the first preset area, L3 the second preset area, L4 the first sub-interface, L5 the reduced first game interface, L6 the first touch interface, and L7 the first virtual key. In (A) of FIG. 21, both the first and second preset areas are fixed areas of the user main interface; therefore, in (B) of FIG. 21, the first sub-interface is created on the first preset area and the first touch interface on the second preset area L3. In a manner similar to the foregoing embodiments, the first game interface is reduced, the reduced first game interface is overlaid on the first sub-interface for display, and the first virtual key is displayed on the first touch interface.
In this embodiment of this application, a method of implementing interface display based on embedded interfaces is provided. In the above manner, a sub-interface for displaying the game interface and a touch interface for controlling the game are created on fixed preset areas of the user main interface, so that the user can operate other services on the mobile terminal while the game continues to progress, without frequent interface switching, simplifying user operations.
Based on the embodiments corresponding to FIG. 3 and FIG. 7, in another optional embodiment provided by this application, the first sub-interface and the first touch interface both belong to the floating window interface type;
the step in which the mobile terminal reduces the first game interface, displays it on the first sub-interface, and displays the first virtual key on the first touch interface may include the following steps:
the mobile terminal creates the first sub-interface on the upper layer of the user main interface;
the mobile terminal creates the first touch interface on the upper layer of the user main interface;
the mobile terminal reduces the first game interface and displays it on the first sub-interface in floating window mode, and displays the first virtual key on the first touch interface in floating window mode.
This embodiment describes a method of controlling interface display based on floating window interfaces. At least one of the first sub-interface and the first touch interface may be a floating window interface. Since a floating window interface is not a fixed interface, the mobile terminal can create the first sub-interface on the upper layer of the user main interface and create the first touch interface on the upper layer of the user main interface. It then displays the reduced first game interface on the first sub-interface in floating window mode and can display the first virtual key on the first touch interface in floating window mode. The floating window interface may be opaque or semi-transparent, which is not limited here.
For ease of understanding, taking a smart phone as the mobile terminal as an example, refer to FIG. 22, a schematic diagram of interface display based on a floating interface: M1 indicates the user main interface, M2 the first sub-interface, M3 the first touch interface, M4 the reduced first game interface, and M5 the first virtual key. The first sub-interface and the first touch interface are both on the upper layer of the user main interface; the first game interface is reduced and then displayed on the first sub-interface, and the first virtual key is displayed on the first touch interface, giving the interface shown in (B) of FIG. 22.
In this embodiment of this application, a method of controlling interface display based on floating window interfaces is provided. In the above manner, a sub-interface for displaying the game interface and a touch interface for controlling the game are created on the upper layer of the user main interface, so that the user can operate other services on the mobile terminal while the game continues to progress, without frequent interface switching, further simplifying user operations.
Based on the embodiments corresponding to FIG. 3 and FIG. 7, in another optional embodiment provided by this application, the method further includes the following steps:
if a drag operation is triggered on the first sub-interface, the mobile terminal controls the first sub-interface to move along the drag trajectory corresponding to the drag operation;
if a drag operation is triggered on the first touch interface, the mobile terminal controls the first touch interface to move along the drag trajectory corresponding to the drag operation;
if a zoom operation is triggered on the first sub-interface, the mobile terminal controls the first sub-interface to enlarge or shrink according to the zoom operation;
if a zoom operation is triggered on the first touch interface, the mobile terminal controls the first touch interface to enlarge or shrink according to the zoom operation.
This embodiment describes a method of adjusting the position and size of floating windows. For a floating window interface, the user can also perform drag and zoom operations. Refer to FIG. 23, a schematic diagram of a drag operation on the first sub-interface: N1 indicates the first sub-interface, N2 the start position, and N3 the end position. (A) of FIG. 23 shows the user dragging from the start position to the end position, the drag trajectory running from the start position to the end position; accordingly, the mobile terminal can control the first sub-interface to move from the start position to the end position, giving (B) of FIG. 23. It can be understood that the drag operation on the first touch interface is analogous and not repeated here.
The user can also perform a zoom operation on the first sub-interface, for example enlarging or shrinking it. The zoom operation includes a shrink operation, in which at least two touch points contract inward, and an enlarge operation, in which at least two touch points extend outward. Refer to FIG. 24: O1 indicates the first sub-interface, O2 the shrunken first sub-interface, and O3 the enlarged first sub-interface. (A) of FIG. 24 shows fingers contracting inward on the mobile terminal's touch screen, i.e., triggering the shrink operation on the first sub-interface, giving (B) of FIG. 24, in which the first sub-interface has been shrunken. (C) of FIG. 24 shows fingers extending outward on the touch screen, i.e., triggering the enlarge operation on the first sub-interface, giving (D) of FIG. 24, in which the first sub-interface has been enlarged. It can be understood that the zoom operation on the first touch interface is analogous and not repeated here.
In this embodiment of this application, a method of adjusting the position and size of floating windows is provided. In the above manner, for floating window interfaces, the positions and sizes of the first sub-interface and the first touch interface can also be adjusted according to user needs, thereby improving the flexibility of this solution.
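The drag operation translates the floating window by the trajectory's displacement, and the zoom factor can be derived from the change in distance between the two touch points. A minimal Python sketch follows; the function names and the clamping bounds on the scale factor are illustrative assumptions.

```python
import math

def drag(window_pos, start, end):
    """Move the floating sub-interface by the drag trajectory's
    displacement, i.e., the vector from the start to the end position."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    return (window_pos[0] + dx, window_pos[1] + dy)


def pinch_scale(p1_start, p2_start, p1_end, p2_end, lo=0.5, hi=3.0):
    """Scale factor from two touch points: extending outward enlarges the
    interface (factor > 1), contracting inward shrinks it (factor < 1).
    The factor is clamped to [lo, hi] to keep the window usable."""
    d0 = math.dist(p1_start, p2_start)
    d1 = math.dist(p1_end, p2_end)
    return max(lo, min(hi, d1 / d0))
```

For example, two touch points that start 10 units apart and end 20 units apart yield a scale factor of 2.0, an enlarge operation; ending closer together yields a factor below 1, a shrink operation.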
Based on the embodiments corresponding to FIG. 3 and FIG. 7, in another optional embodiment provided by this application, before the mobile terminal creates the first sub-interface and the first touch interface on the user main interface, the method further includes the following steps:
the mobile terminal acquires the game type of the first game application;
the mobile terminal determines the first touch interface from at least one creatable touch interface according to the game type of the first game application, where each creatable touch interface displays corresponding virtual keys.
This embodiment describes a method of adjusting the touch interface based on game type. Since different game types correspond to different creatable touch interfaces, before the reduced first game interface is displayed on the first sub-interface and the first virtual key is displayed on the first touch interface, the mobile terminal can acquire the game type of the first game application and then determine the first touch interface from at least one creatable touch interface according to that game type, thereby displaying the virtual keys corresponding to the game type.
It should be noted that the game type may be a sports or competitive game, a casual game, a role-playing game (RPG), etc. For example, the touch interface corresponding to a sports or competitive game may include virtual keys such as "jump" and "crouch"; the touch interface corresponding to a casual game may include virtual keys such as "left", "right", and "shoot"; and the touch interface corresponding to an RPG may include directional virtual keys and the like. In practice, the user can also customize parameters of the first touch interface, including but not limited to interface transparency, interface background color, interface pattern, interface text, and virtual key shape and size, which are not limited here.
For ease of understanding, refer to FIG. 25, a schematic diagram of an embodiment of the first touch interface: (A) shows the user main interface corresponding to an RPG, (B) to a sports or competitive game, and (C) to a casual game. As shown, P1 indicates the first touch interface; in (A), P11 indicates directional virtual keys, P12 a virtual key for the "confirm" function, and P13 a virtual key for the "cancel" function; in (B), P21 indicates a virtual key for the "jump" function and P22 for the "crouch" function; in (C), P31 indicates a "move left" virtual key, P32 a "move right" virtual key, and P33 a virtual key for the "shoot" function.
Taking interface transparency and interface color as example parameters of the first touch interface, refer to FIG. 26, a schematic diagram of an embodiment of selecting parameters of the first touch interface: Q1 indicates the first touch interface, Q2 the interface color selection area, and Q3 the transparency selection area; the user can adjust the interface color and the transparency as desired.
In this embodiment of this application, a method of adjusting the touch interface based on game type is provided. In the above manner, different touch interfaces can be set for different game types, facilitating control of different kinds of games; in addition, the parameters of the first touch interface can be adjusted according to user needs, increasing the flexibility of the solution.
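Selecting the first touch interface from the creatable touch interfaces by game type amounts to a lookup from type to key layout. The sketch below uses the key sets described above; the dictionary form, the type names, and the directional fallback for unknown types are illustrative assumptions.

```python
CREATABLE_TOUCH_INTERFACES = {
    # game type -> virtual keys shown on the corresponding touch interface
    "sports": ["jump", "crouch"],
    "casual": ["left", "right", "shoot"],
    "rpg": ["up", "down", "left", "right", "confirm", "cancel"],
}


def select_touch_interface(game_type):
    """Determine the first touch interface for the game type; fall back to
    a plain directional layout for unknown types (the fallback is an
    assumption, not part of the application)."""
    return CREATABLE_TOUCH_INTERFACES.get(
        game_type, ["up", "down", "left", "right"]
    )
```

In a real implementation, the entries would also carry the user-customizable parameters mentioned above (transparency, background color, key shape and size) rather than key names alone.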
Based on the embodiments corresponding to FIG. 3 and FIG. 7, in another optional embodiment provided by this application, the method further includes the following steps:
when the mobile terminal runs a second game application, displaying the second game interface corresponding to the second game application in full-screen mode;
if the switch mode is triggered, the mobile terminal reduces the second game interface and displays it on a second sub-interface, and displays a second virtual key on a second touch interface, where the second virtual key is used to control the second game application, and both the second sub-interface and the second touch interface are displayed on the user main interface.
This embodiment describes a method of displaying multiple game interfaces at the same time. The mobile terminal may also run a second game application and display the second game interface corresponding to it in full-screen mode; when the switch mode is satisfied, a second sub-interface and a second touch interface can be created on the user main interface, the reduced second game interface is displayed on the second sub-interface, and the second virtual key, used to control the second game application, is displayed on the second touch interface. The ways of triggering the switch mode are similar to the foregoing embodiments and are not repeated here.
For ease of understanding, refer to FIG. 27, a schematic diagram of another embodiment of the interface display control method: R1 indicates the first sub-interface, R2 the first touch interface, R3 the second sub-interface, and R4 the second touch interface. On the user main interface, the reduced first game interface is displayed on the first sub-interface, the reduced second game interface on the second sub-interface, the first virtual key corresponding to the first game application on the first touch interface, and the second virtual key corresponding to the second game application on the second touch interface. It should be understood that the example in FIG. 27 is only for understanding this solution; the display positions and sizes of the second sub-interface and the second touch interface can be set flexibly according to user needs.
In this embodiment of this application, a method of displaying multiple game interfaces at the same time is provided. In the above manner, the mobile terminal can display multiple game applications on the user main interface, continuing the progress of two games while allowing operation of other services on the mobile terminal, thereby increasing the convenience of operation.
On the basis of the embodiments corresponding to FIG. 3 and FIG. 7 above, another optional embodiment provided by this application further includes the following steps:
when the mobile terminal runs a video application, displaying, on the user home screen, a video application interface corresponding to the video application;
if the switch mode is triggered, the mobile terminal shrinks the video application interface and displays it on a third sub-interface, where the third sub-interface is displayed on the user home screen.
In this embodiment, a method for displaying a game interface and a video interface at the same time is introduced. The mobile terminal may also run a video application, thereby displaying the corresponding video application interface on the user home screen. When the switch mode is triggered, a third sub-interface is created on the user home screen, and the video application interface is shrunk and displayed on it. It should be noted that the manner of triggering the switch mode and of creating the third sub-interface is similar to that of the foregoing embodiments and is not described again here.
For ease of understanding, referring to FIG. 28, FIG. 28 is a schematic diagram of another embodiment of the interface display control method in an embodiment of this application. As shown in the figure, T1 indicates the first sub-interface, T2 indicates the first touch interface, and T3 indicates the third sub-interface. On the user home screen, the shrunken first game interface is displayed on the first sub-interface, the first virtual key is displayed on the first touch interface, and the shrunken video application interface is displayed on the third sub-interface. It should be understood that the example in FIG. 28 is only for understanding this solution; the position and size of the third sub-interface may be flexibly set according to user needs.
In this embodiment of this application, a method for displaying a game interface and a video interface at the same time is provided. In the above manner, the mobile terminal can display not only a game application on the user home screen but also a video application at the same time, while still allowing the user to operate other services on the mobile terminal, thereby meeting diverse user needs and further simplifying user operations.
On the basis of the embodiments corresponding to FIG. 3 and FIG. 7 above, in another optional embodiment provided by this application, after the mobile terminal shrinks the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, the method further includes the following step:
when the mobile terminal runs a target application, displaying, on the user home screen in full-screen mode, an application interface corresponding to the target application, and displaying the first sub-interface and the first touch interface on the user home screen, where the target application includes at least one of an instant messaging application, an entertainment application, and a tool application.
In this embodiment, a method for displaying a game interface and an information interface at the same time is introduced. After the mobile terminal shrinks the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, if a target application is run, the application interface of the target application may also be displayed on the user home screen in full-screen mode while the first sub-interface and the first touch interface continue to be displayed. The target application includes but is not limited to instant messaging applications, entertainment applications, and tool applications.
Here, messaging applications include but are not limited to chat, social networking, email, community, and other media instant messaging applications; entertainment applications include but are not limited to music, ringtone, player, live-streaming, entertainment, and horoscope applications; and tool applications include but are not limited to local services, food, weather, calendar, utility, flashlight, note-taking, office tool, cloud storage, and office software applications. When the user operates the target application on the user home screen, its application interface can be displayed in full-screen mode. For example, replying to messages, checking email, ordering takeout, listening to music, browsing Weibo, surfing the web, taking photos, and setting *** can all be displayed in full-screen mode, and the user can perform these operations while keeping the game in progress.
For ease of understanding, taking a tool application as an example of the target application, referring to FIG. 29, FIG. 29 is a schematic diagram of another embodiment of the interface display control method in an embodiment of this application. As shown in the figure, U1 indicates the first sub-interface, U2 indicates the first touch interface, and U3 indicates the tool application interface. On the user home screen, the shrunken first game interface is displayed on the first sub-interface, the first virtual key is displayed on the first touch interface, and the tool application interface corresponding to the tool application is displayed in full-screen mode.
In this embodiment of this application, a method for displaying a game interface and an information interface at the same time is provided. In the above manner, the mobile terminal displays multiple applications on the user home screen, allowing the game to continue while other applications are operated, so that the user can operate multiple application services on the mobile terminal, thereby meeting diverse user needs.
The interface display control apparatus in this application is described in detail below. Referring to FIG. 30, FIG. 30 is a schematic diagram of an embodiment of an interface display control apparatus in an embodiment of this application. The interface display control apparatus 30 includes:
a display module 301, configured to: when a first game application runs, display, in full-screen mode, a first game interface and a first virtual key on a user home screen, where the first virtual key is used to control the first game application;
the display module 301 being further configured to: if a switch mode is triggered, shrink the first game interface and display it on a first sub-interface, and display the first virtual key on a first touch interface, where the first sub-interface and the first touch interface are created on the user home screen when the switch mode is triggered, and the first touch interface is used to display virtual keys that control the first game application.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the display module 301 is further configured to: after the first game interface is shrunk and displayed on the first sub-interface and the first virtual key is displayed on the first touch interface, if a switch-back mode is triggered, display, in full-screen mode, the first game interface and the first virtual key on the user home screen.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application, the interface display control apparatus 30 further includes a determining module 302;
the determining module 302 is configured to: if a preset switch operation is triggered, determine that the switch mode is triggered, where the preset switch operation includes at least one of a tap operation, a press operation, a slide operation, a double-tap operation, and a multi-touch-point operation.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the determining module 302 is specifically configured to: if a tap operation is triggered on the first virtual button in the first game interface, determine that the switch mode is triggered;
or,
if a press operation is triggered on the first game interface, determine that the switch mode is triggered, where a press duration of the press operation is greater than or equal to a first time threshold;
or,
if a slide operation is triggered on the first game interface, determine that the switch mode is triggered, where a slide trajectory of the slide operation is generated based on a start position and an end position of a touch object;
or,
if a double-tap operation is triggered on the first game interface, determine that the switch mode is triggered, where a tap interval of the double-tap operation is less than or equal to a second time threshold;
or,
if a multi-touch-point operation is triggered on the first game interface, determine that the switch mode is triggered, where the multi-touch-point operation is generated based on at least two touch points moving toward each other.
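The press and double-tap conditions above reduce to simple threshold comparisons. The following editorial sketch illustrates them; the concrete threshold values are assumptions for illustration, since the patent leaves the first and second time thresholds unspecified:

```python
FIRST_TIME_THRESHOLD_MS = 500   # assumed long-press threshold
SECOND_TIME_THRESHOLD_MS = 300  # assumed double-tap interval threshold

def press_triggers_switch(press_duration_ms):
    """A press triggers the switch mode when its duration is greater
    than or equal to the first time threshold."""
    return press_duration_ms >= FIRST_TIME_THRESHOLD_MS

def double_tap_triggers_switch(tap_interval_ms):
    """A double tap triggers the switch mode when the interval between
    the two taps is less than or equal to the second time threshold."""
    return tap_interval_ms <= SECOND_TIME_THRESHOLD_MS
```

The switch-back conditions in the next implementation reuse the same thresholds with the first sub-interface as the gesture target.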
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the determining module 302 is further configured to: if a preset switch-back operation is triggered on the first sub-interface, determine that the switch-back mode is triggered, where the preset switch-back operation includes at least one of a tap operation, a press operation, a slide operation, a double-tap operation, and a multi-touch-point operation.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the determining module 302 is specifically configured to: if a tap operation is triggered on the second virtual button in the first sub-interface, determine that the switch-back mode is triggered;
or,
if a press operation is triggered on the first sub-interface, determine that the switch-back mode is triggered, where a press duration of the press operation is greater than or equal to the first time threshold;
or,
if a slide operation is triggered on the first sub-interface, determine that the switch-back mode is triggered, where a slide trajectory of the slide operation is generated based on a start position and an end position of a touch object;
or,
if a double-tap operation is triggered on the first sub-interface, determine that the switch-back mode is triggered, where a tap interval of the double-tap operation is less than or equal to the second time threshold;
or,
if a multi-touch-point operation is triggered on the first sub-interface, determine that the switch-back mode is triggered, where the multi-touch-point operation is generated based on at least two touch points extending outward.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application, the interface display control apparatus 30 further includes an acquiring module 303;
the acquiring module 303 is configured to: after the display module 301 displays, in full-screen mode, the first game interface and the first virtual key on the user home screen, obtain a device usage state through a gravity sensor, where the device usage state is used to describe a posture in which the mobile terminal is held;
the determining module 302 is further configured to: if the device usage state indicates that the mobile terminal is in a portrait usage state, determine that the switch mode is triggered.
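As an editorial sketch of how the device usage state could be derived from a gravity-sensor reading, assuming the sensor reports gravity components along the screen's x (short) and y (long) axes (this coordinate convention is an assumption, not stated in the patent):

```python
def device_usage_state(gx, gy):
    """Classify the held posture from gravity components along the
    screen axes: gravity mostly along the long (y) axis means the
    terminal is held upright in portrait, otherwise landscape."""
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

def gravity_triggers_switch(gx, gy):
    # A portrait usage state triggers the switch mode; a landscape
    # state (handled in the next implementation) triggers switch-back.
    return device_usage_state(gx, gy) == "portrait"
```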
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the acquiring module 303 is further configured to: after the display module 301 shrinks the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, obtain a device usage state through a gravity sensor, where the device usage state is used to describe a posture in which the mobile terminal is held;
the determining module 302 is further configured to: if the device usage state indicates that the mobile terminal is in a landscape usage state, determine that the switch-back mode is triggered.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the acquiring module 303 is further configured to: after the display module 301 displays, in full-screen mode, the first game interface and the first virtual key on the user home screen, acquire a face image through a front camera;
the determining module 302 is further configured to determine, according to the face image, a left-eye coordinate point corresponding to the left eye and a right-eye coordinate point corresponding to the right eye;
the determining module 302 is further configured to determine, according to the left-eye coordinate point and the right-eye coordinate point, a line connecting the coordinates of both eyes;
the determining module 302 is further configured to: if an included angle between the eye-coordinate line and a horizontal direction of the mobile terminal is less than or equal to an angle threshold, determine that the switch mode is triggered, where the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in a portrait usage state.
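The eye-line test can be written out directly: compute the line through the two eye coordinate points and compare its angle with the portrait horizontal against the threshold. This editorial sketch assumes image coordinates in pixels and an illustrative 30-degree threshold, neither of which is fixed by the patent:

```python
import math

def eye_line_angle_deg(left_eye, right_eye):
    """Angle (in degrees, folded into [0, 90]) between the line
    connecting the two eye coordinate points and the horizontal
    direction of a portrait-held mobile terminal."""
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    angle = abs(math.degrees(math.atan2(dy, dx)))
    return min(angle, 180.0 - angle)

def face_triggers_switch(left_eye, right_eye, angle_threshold_deg=30.0):
    # An eye line roughly parallel to the portrait horizontal indicates
    # the terminal is held upright, so the switch mode is triggered;
    # an angle above the threshold triggers switch-back instead.
    return eye_line_angle_deg(left_eye, right_eye) <= angle_threshold_deg
```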
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the acquiring module 303 is further configured to: after the display module shrinks the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, acquire a face image through a front camera;
the determining module 302 is further configured to determine, according to the face image, a left-eye coordinate point corresponding to the left eye and a right-eye coordinate point corresponding to the right eye;
the determining module 302 is further configured to determine, according to the left-eye coordinate point and the right-eye coordinate point, a line connecting the coordinates of both eyes;
the determining module 302 is further configured to: if an included angle between the eye-coordinate line and a horizontal direction of the mobile terminal is greater than an angle threshold, determine that the switch-back mode is triggered, where the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in a portrait usage state.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application, the interface display control apparatus 30 further includes a listening module 304;
the listening module 304 is configured to: after the display module 301 displays, in full-screen mode, the first game interface and the first virtual key on the user home screen, listen for an incoming-call alert;
the determining module 302 is further configured to: if an answer instruction for the incoming-call alert is triggered, determine that the switch mode is triggered.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the acquiring module 303 is further configured to: after the display module displays, in full-screen mode, the first game interface and the first virtual key on the user home screen, obtain to-be-detected speech through a microphone;
the determining module 302 is further configured to: if the to-be-detected speech successfully matches first preset speech, determine that the switch mode is triggered.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the acquiring module 303 is further configured to: after the display module shrinks the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface, obtain to-be-detected speech through a microphone;
the determining module 302 is further configured to: if the to-be-detected speech successfully matches second preset speech, determine that the switch-back mode is triggered.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application, the first sub-interface and the first touch interface are both embedded interfaces;
the display module 301 is specifically configured to create the first sub-interface in a first preset region of the user home screen, where the first preset region is a fixed region of the user home screen;
create the first touch interface in a second preset region of the user home screen, where the second preset region is another fixed region of the user home screen;
and shrink the first game interface and display it on the first sub-interface in picture-in-picture mode, and display the first virtual key on the first touch interface in picture-in-picture mode.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application, the first sub-interface and the first touch interface are both floating-window interfaces;
the display module 301 is specifically configured to create the first sub-interface on an upper layer of the user home screen;
create the first touch interface on an upper layer of the user home screen;
and shrink the first game interface and display it on the first sub-interface in floating-window mode, and display the first virtual key on the first touch interface in floating-window mode.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application, the interface display control apparatus 30 further includes a control module 305;
the control module 305 is configured to: if a drag operation is triggered on the first sub-interface, control the first sub-interface to move along a drag trajectory corresponding to the drag operation;
the control module 305 is further configured to: if a drag operation is triggered on the first touch interface, control the first touch interface to move along a drag trajectory corresponding to the drag operation;
the control module 305 is further configured to: if a scaling operation is triggered on the first sub-interface, control the first sub-interface to be enlarged or shrunk according to the scaling operation;
the control module 305 is further configured to: if a scaling operation is triggered on the first touch interface, control the first touch interface to be enlarged or shrunk according to the scaling operation.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the acquiring module 303 is further configured to obtain the game type of the first game application before the display module shrinks the first game interface and displays it on the first sub-interface and displays the first virtual key on the first touch interface;
the determining module 302 is further configured to determine, according to the game type of the first game application, the first touch interface from at least one creatable touch interface, where each creatable touch interface displays its corresponding virtual keys.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the display module 301 is further configured to: if the switch mode is triggered, shrink the second game interface and display it on a second sub-interface, and display a second virtual key on a second touch interface, where the second virtual key is used to control the second game application, and both the second sub-interface and the second touch interface are displayed on the user home screen.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the display module 301 is further configured to: when a video application runs, display, on the user home screen, a video application interface corresponding to the video application;
the display module 301 is further configured to: if the switch mode is triggered, shrink the video application interface and display it on a third sub-interface, where the third sub-interface is displayed on the user home screen.
In a possible implementation, based on the embodiment corresponding to FIG. 30 above, in another embodiment of the interface display control apparatus 30 provided by this embodiment of this application,
the display module 301 is further configured to: after the first game interface is shrunk and displayed on the first sub-interface and the first virtual key is displayed on the first touch interface, when a target application runs, display, on the user home screen in full-screen mode, an application interface corresponding to the target application, and display the first sub-interface and the first touch interface on the user home screen, where the target application includes at least one of an instant messaging application, an entertainment application, and a tool application.
An embodiment of this application further provides another interface display control apparatus, as shown in FIG. 31. For ease of description, only the parts related to this embodiment of this application are shown; for specific technical details not disclosed, refer to the method part of the embodiments of this application. In this embodiment, the mobile terminal is described using a smartphone as an example:
FIG. 31 is a block diagram of a partial structure of a smartphone related to the mobile terminal provided by an embodiment of this application. Referring to FIG. 31, the smartphone includes components such as a radio frequency (Radio Frequency, RF) circuit 410, a memory 420, an input unit 430, a display unit 440, a sensor 450, an audio circuit 460, a wireless fidelity (Wireless Fidelity, WiFi) module 470, a processor 480, and a power supply 490. The input unit 430 may include a touch panel 431 and other input devices 432; the display unit 440 may include a display panel 441; and the audio circuit 460 may include a speaker 461 and a microphone 462. Those skilled in the art can understand that the smartphone structure shown in FIG. 31 does not constitute a limitation on the smartphone, which may include more or fewer components than shown, combine certain components, or use a different component arrangement.
The components of the smartphone are described in detail below with reference to FIG. 31:
The processor 480 is the control center of the smartphone. It connects the various parts of the entire smartphone through various interfaces and lines, and performs the various functions of the smartphone and processes data by running or executing the software programs and/or modules stored in the memory 420 and invoking the data stored in the memory 420, thereby monitoring the smartphone as a whole. Optionally, the processor 480 may include one or more processing units; preferably, the processor 480 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 480.
Although not shown, the smartphone may further include a camera, a Bluetooth module, and the like, which are not described again here.
In this embodiment of this application, the processor 480 included in the terminal can perform the functions in any of the embodiments corresponding to FIG. 3 to FIG. 29, which are not described again here.
An embodiment of this application further provides a computer-readable storage medium storing a computer program which, when run on a computer, causes the computer to perform the methods described in the foregoing embodiments.
An embodiment of this application further provides a computer program product including a program which, when run on a computer, causes the computer to perform the methods described in the foregoing embodiments.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not described again here.
The above embodiments are only intended to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some of the technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application.

Claims (16)

  1. A method for controlling interface display, the method being applied to a mobile terminal and comprising:
    when a first game application runs, displaying, in full-screen mode, a first game interface and a first virtual key on a user home screen, the first virtual key being used to control the first game application;
    if a switch mode is triggered, shrinking the first game interface and displaying it on a first sub-interface, and displaying the first virtual key on a first touch interface, wherein the first sub-interface and the first touch interface are created on the user home screen when the switch mode is triggered, and the first touch interface is used to display virtual keys that control the first game application.
  2. The method according to claim 1, wherein after the shrinking the first game interface and displaying it on a first sub-interface and displaying the first virtual key on a first touch interface, the method further comprises:
    if a switch-back mode is triggered, displaying, in full-screen mode, the first game interface and the first virtual key on the user home screen.
  3. The method according to claim 1, wherein after the displaying, in full-screen mode, a first game interface and a first virtual key on a user home screen, the method further comprises:
    if a preset switch operation is triggered, determining that the switch mode is triggered, wherein the preset switch operation comprises at least one of a tap operation, a press operation, a slide operation, a double-tap operation, and a multi-touch-point operation.
  4. The method according to claim 3, wherein the determining that the switch mode is triggered if a preset switch operation is triggered comprises:
    if a tap operation is triggered on a first virtual button in the first game interface, determining that the switch mode is triggered;
    or,
    if a press operation is triggered on the first game interface, determining that the switch mode is triggered, wherein a press duration of the press operation is greater than or equal to a first time threshold;
    or,
    if a slide operation is triggered on the first game interface, determining that the switch mode is triggered, wherein a slide trajectory of the slide operation is generated based on a start position and an end position of a touch object;
    or,
    if a double-tap operation is triggered on the first game interface, determining that the switch mode is triggered, wherein a tap interval of the double-tap operation is less than or equal to a second time threshold;
    or,
    if a multi-touch-point operation is triggered on the first game interface, determining that the switch mode is triggered, wherein the multi-touch-point operation is generated based on at least two touch points moving toward each other.
  5. The method according to claim 2, wherein after the shrinking the first game interface and displaying it on a first sub-interface and displaying the first virtual key on a first touch interface, the method further comprises:
    if a touch object on the touch screen triggers a preset switch-back operation on the first sub-interface, determining that the switch-back mode is triggered, wherein the preset switch-back operation comprises at least one of a tap operation, a press operation, a slide operation, a double-tap operation, and a multi-touch-point operation.
  6. The method according to claim 1, wherein after the displaying, in full-screen mode, a first game interface and a first virtual key on a user home screen, the method further comprises:
    obtaining a device usage state through a gravity sensor, wherein the device usage state is used to describe a posture in which the mobile terminal is held;
    if the device usage state indicates that the mobile terminal is in a portrait usage state, determining that the switch mode is triggered.
  7. The method according to claim 2, wherein after the shrinking the first game interface and displaying it on a first sub-interface and displaying the first virtual key on a first touch interface, the method further comprises:
    obtaining a device usage state through a gravity sensor, wherein the device usage state is used to describe a posture in which the mobile terminal is held;
    if the device usage state indicates that the mobile terminal is in a landscape usage state, determining that the switch-back mode is triggered.
  8. The method according to claim 1, wherein after the displaying, in full-screen mode, a first game interface and a first virtual key on a user home screen, the method further comprises:
    acquiring a face image through a front camera;
    determining, according to the face image, a left-eye coordinate point corresponding to the left eye and a right-eye coordinate point corresponding to the right eye;
    determining, according to the left-eye coordinate point and the right-eye coordinate point, a line connecting the coordinates of both eyes;
    if an included angle between the eye-coordinate line and a horizontal direction of the mobile terminal is less than or equal to an angle threshold, determining that the switch mode is triggered, wherein the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in a portrait usage state.
  9. The method according to claim 2, wherein after the shrinking the first game interface and displaying it on a first sub-interface and displaying the first virtual key on a first touch interface, the method further comprises:
    acquiring a face image through a front camera;
    determining, according to the face image, a left-eye coordinate point corresponding to the left eye and a right-eye coordinate point corresponding to the right eye;
    determining, according to the left-eye coordinate point and the right-eye coordinate point, a line connecting the coordinates of both eyes;
    if an included angle between the eye-coordinate line and a horizontal direction of the mobile terminal is greater than an angle threshold, determining that the switch-back mode is triggered, wherein the horizontal direction of the mobile terminal is the horizontal direction when the mobile terminal is in a portrait usage state.
  10. The method according to claim 1, wherein after the displaying, in full-screen mode, a first game interface and a first virtual key on a user home screen, the method further comprises:
    listening for an incoming-call alert;
    if an answer instruction for the incoming-call alert is triggered, determining that the switch mode is triggered.
  11. The method according to claim 1, wherein the first sub-interface and the first touch interface are both embedded interfaces;
    the shrinking the first game interface and displaying it on a first sub-interface and displaying the first virtual key on a first touch interface comprises:
    creating the first sub-interface in a first preset region of the user home screen, wherein the first preset region is a fixed region of the user home screen;
    creating the first touch interface in a second preset region of the user home screen, wherein the second preset region is another fixed region of the user home screen;
    shrinking the first game interface and displaying it on the first sub-interface in picture-in-picture mode, and displaying the first virtual key on the first touch interface in picture-in-picture mode.
  12. The method according to claim 1, wherein the first sub-interface and the first touch interface are both floating-window interfaces;
    the shrinking the first game interface and displaying it on a first sub-interface and displaying the first virtual key on a first touch interface comprises:
    creating the first sub-interface on an upper layer of the user home screen;
    creating the first touch interface on an upper layer of the user home screen;
    shrinking the first game interface and displaying it on the first sub-interface in floating-window mode, and displaying the first virtual key on the first touch interface in floating-window mode.
  13. An interface display control apparatus, deployed on a mobile terminal and comprising:
    a display module, configured to: when a first game application runs, display, in full-screen mode, a first game interface and a first virtual key on a user home screen, the first virtual key being used to control the first game application;
    the display module being further configured to: if a switch mode is triggered, shrink the first game interface and display it on a first sub-interface, and display the first virtual key on a first touch interface, wherein the first sub-interface and the first touch interface are created on the user home screen when the switch mode is triggered, and the first touch interface is used to display virtual keys that control the first game application.
  14. A mobile terminal, comprising a memory and a processor;
    wherein the memory is configured to store a program;
    and the processor is configured to execute the program in the memory and to perform, according to instructions in the program, the method according to any one of claims 1 to 12.
  15. A computer-readable storage medium comprising instructions which, when run on a computer, cause the computer to perform the method according to any one of claims 1 to 12.
  16. A computer program product which, when executed, is used to perform the method according to any one of claims 1 to 12.
PCT/CN2021/112672 2020-09-30 2021-08-16 Method, apparatus, and device for controlling interface display, and storage medium WO2022068434A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP21874096.7A EP4102346A4 (en) 2020-09-30 2021-08-16 METHOD, APPARATUS, DEVICE AND STORAGE MEDIUM FOR CONTROL INTERFACE DISPLAY
JP2022565833A JP2023523442A (ja) 2020-09-30 2021-08-16 Method, apparatus, device, and program for controlling interface display
US17/949,031 US20230017694A1 (en) 2020-09-30 2022-09-20 Method and apparatus for controlling interface display, device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011062334.3 2020-09-30
CN202011062334.3A CN112121415A (zh) 2020-09-30 Method, apparatus, and device for controlling interface display, and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/949,031 Continuation US20230017694A1 (en) 2020-09-30 2022-09-20 Method and apparatus for controlling interface display, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2022068434A1 true WO2022068434A1 (zh) 2022-04-07

Family

ID=73843555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/112672 WO2022068434A1 (zh) 2020-09-30 2021-08-16 一种控制界面显示的方法、装置、设备及存储介质

Country Status (5)

Country Link
US (1) US20230017694A1 (zh)
EP (1) EP4102346A4 (zh)
JP (1) JP2023523442A (zh)
CN (1) CN112121415A (zh)
WO (1) WO2022068434A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112121415A (zh) 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Method, apparatus, and device for controlling interface display, and storage medium
CN112894857B (zh) * 2021-03-02 2024-04-09 路邦科技授权有限公司 Key-control method for a hospital clinical auxiliary robot
CN115113785A (zh) * 2021-03-17 2022-09-27 深圳市万普拉斯科技有限公司 Application program operation method and apparatus, computer device, and storage medium
CN113648649B (zh) * 2021-08-23 2024-06-07 网易(杭州)网络有限公司 Game interface control method and apparatus, computer-readable medium, and terminal device
JP7164750B1 (ja) * 2022-07-07 2022-11-01 株式会社あかつき Information processing system, information processing apparatus, program, and information processing method
CN115047999B (zh) * 2022-07-27 2024-07-02 北京字跳网络技术有限公司 Interface switching method and apparatus, electronic device, storage medium, and program product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150182856A1 (en) * 2013-12-31 2015-07-02 Microsoft Corporation Touch screen game controller
CN106681584A (zh) * 2016-12-09 2017-05-17 深圳市金立通信设备有限公司 Method and terminal for optimizing application display
CN107831979A (zh) * 2017-10-20 2018-03-23 深圳市万普拉斯科技有限公司 Method and apparatus for handling incoming calls during a game, and user terminal
CN109117069A (zh) * 2018-06-27 2019-01-01 努比亚技术有限公司 Interface operation method, terminal, and computer-readable storage medium
CN109165076A (zh) * 2018-10-17 2019-01-08 Oppo广东移动通信有限公司 Display method and apparatus for a game application, terminal, and storage medium
CN110012156A (zh) * 2019-02-28 2019-07-12 努比亚技术有限公司 Information display method, mobile terminal, and non-transitory computer-readable storage medium
CN112121415A (zh) * 2020-09-30 2020-12-25 腾讯科技(深圳)有限公司 Method, apparatus, and device for controlling interface display, and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012168932A (ja) * 2011-02-10 2012-09-06 Sony Computer Entertainment Inc Input device, information processing device, and input value acquisition method
CN108491127B (zh) * 2018-03-12 2020-02-07 Oppo广东移动通信有限公司 Input method interface display method and apparatus, terminal, and storage medium


Also Published As

Publication number Publication date
EP4102346A4 (en) 2023-10-11
CN112121415A (zh) 2020-12-25
US20230017694A1 (en) 2023-01-19
EP4102346A1 (en) 2022-12-14
JP2023523442A (ja) 2023-06-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21874096

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021874096

Country of ref document: EP

Effective date: 20220908

ENP Entry into the national phase

Ref document number: 2022565833

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE