CN116540905A - Virtual reality equipment and focus operation method - Google Patents

Virtual reality equipment and focus operation method

Info

Publication number
CN116540905A
CN116540905A CN202210087209.0A
Authority
CN
China
Prior art keywords
user
webpage
focus cursor
target area
user focus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210087209.0A
Other languages
Chinese (zh)
Inventor
罗桂边
温佳乐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Electronic Technology Shenzhen Co ltd
Original Assignee
Hisense Electronic Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Electronic Technology Shenzhen Co ltd filed Critical Hisense Electronic Technology Shenzhen Co ltd
Priority to CN202210087209.0A priority Critical patent/CN116540905A/en
Publication of CN116540905A publication Critical patent/CN116540905A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0483Interaction with page-structured environments, e.g. book metaphor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a virtual reality device and a focus operation method. When the user focus cursor moves to a target area of a web page, the device determines the stay condition of the cursor on that area, i.e., how long the cursor dwells there. Whether the stay condition satisfies a preset condition is then judged; when it does, the target area is operated on using the target action corresponding to the user focus cursor, completing the user's interactive operation in the web page. The interactive operations include scrolling the web page, turning pages, and opening a menu page. The user operates the web page simply by moving the focus cursor, without frequently pressing buttons with a finger or hand to complete click operations, which reduces the operational complexity of using the virtual reality device and preserves the user experience.

Description

Virtual reality equipment and focus operation method
Technical Field
The application relates to the technical field of virtual reality, in particular to virtual reality equipment and a focus operation method.
Background
Virtual Reality (VR) technology is a display technology that uses a computer to simulate a virtual environment, giving the user a sense of immersion. A virtual reality device is a device that presents virtual pictures to a user using virtual reality technology. A virtual reality device typically displays web content through a VR browser; in the device's rendering scene, the web content displayed by the VR browser is usually presented in the form of a virtual user interface. When the virtual reality device displays the virtual user interface through the VR browser, the user can interact with that interface to complete actions such as clicking a control on the virtual user interface, scrolling a window, and zooming the interface in or out.
At present, interaction between the user and the virtual user interface is completed mainly by pressing buttons on a somatosensory handle. For example, after a user opens a web page with a VR browser, if the page content exceeds the screen, the user must scroll up and down to view it, which requires pressing and holding a button while dragging. If the page font is too small, the page must be enlarged, which requires clicking a zoom-in button. With multiple web pages open, returning to the previous or next page likewise requires clicking the corresponding button. In all of these scenarios, operating a web page in a VR browser forces the user to press buttons with a finger or hand frequently to complete click operations, which degrades the experience of using the virtual reality device.
Disclosure of Invention
The application provides a virtual reality device and a focus operation method, which address the problem that operating a web page in a VR browser requires the user to frequently press buttons with a finger or hand to complete click operations, thereby degrading the experience of using the virtual reality device.
In a first aspect, the present application provides a virtual reality device, comprising: a display configured to display a virtual user interface for displaying a web page; a gesture sensor configured to detect a user focus cursor; and a controller configured to: when the user focus cursor moves to a target area of the web page, determine the stay condition of the user focus cursor on the target area, where the target area represents the response position of the user focus cursor on the web page; judge whether the stay condition satisfies a preset condition; and, when it does, operate on the target area using the target action corresponding to the user focus cursor, so as to complete the user's interactive operation in the web page.
In the rendering scene of the virtual reality device, the user focus cursor always occupies a fixed position in the user's field of view; that is, the position of the cursor relative to the user is unchanged, while its position relative to the virtual user interface can change. Thus, after the user puts on the virtual reality device and rotates their head, the user focus cursor moves with that motion to any position on the virtual user interface. When the user needs to operate the virtual user interface, the user can let the focus cursor stay in a target area; when the stay condition satisfies the preset condition, the virtual reality device operates on the target area according to the target action corresponding to the user focus cursor, completing the user's interactive operation in the web page. This avoids the need to frequently press buttons on a remote control or somatosensory handle with a finger or hand, reduces the operational complexity of using the virtual reality device, and preserves the user experience.
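The stay-condition check described above can be sketched as a simple dwell detector: the target area under the cursor is sampled each frame, and the target action fires once the cursor has remained in the same area longer than a preset threshold. The class name, threshold, and area identifiers below are illustrative assumptions, not the patent's implementation.

```python
class DwellDetector:
    """Fires a target action when the user focus cursor stays in the
    same target area longer than a preset dwell threshold (seconds)."""

    def __init__(self, dwell_threshold_s=1.5):
        self.dwell_threshold_s = dwell_threshold_s
        self._current_area = None   # area currently under the cursor
        self._enter_time = None     # when the cursor entered it

    def update(self, area_id, now):
        """Feed the area currently under the cursor (None if the cursor
        is outside any target area) and the current time. Returns the
        area id once its stay condition is met, else None."""
        if area_id != self._current_area:
            # Cursor moved to a different area: restart the dwell timer.
            self._current_area = area_id
            self._enter_time = now if area_id is not None else None
            return None
        if area_id is not None and now - self._enter_time >= self.dwell_threshold_s:
            self._enter_time = float("inf")  # fire only once per entry
            return area_id
        return None
```

A caller would invoke `update` once per rendered frame with the hit-tested area and a monotonic clock value, and dispatch the returned area to its bound operation.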
In a second aspect, the present application further provides a focus operation method, comprising the following steps: when the user focus cursor moves to a target area of the web page, determining the stay condition of the user focus cursor on the target area, where the target area represents the response position of the user focus cursor on the web page; judging whether the stay condition satisfies a preset condition; and, when it does, operating on the target area using the target action corresponding to the user focus cursor, so as to complete the user's interactive operation in the web page.
According to the above technical scheme, the application provides a virtual reality device and a focus operation method. When the user focus cursor moves to a target area of the web page, the stay condition of the cursor on that area is determined. Whether the stay condition satisfies a preset condition is then judged; when it does, the target area is operated on using the target action corresponding to the user focus cursor, completing the user's interactive operation in the web page. The interactive operations include scrolling the web page, turning pages, and opening a menu page. The user operates the web page simply by moving the focus cursor, without frequently pressing buttons with a finger or hand to complete click operations, which reduces the operational complexity of using the virtual reality device and preserves the user experience.
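The binding between target areas and the interactive operations the method names (web-page scrolling, page turning, opening the menu page) can be sketched as a lookup table. The area identifiers and the dispatch function below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical mapping from web-page target areas to interactive
# operations; the area names are illustrative, not the patent's.
TARGET_ACTIONS = {
    "page_top": ("scroll", "up"),
    "page_bottom": ("scroll", "down"),
    "page_left": ("page_turn", "previous"),
    "page_right": ("page_turn", "next"),
    "menu_control": ("open_menu", None),
}

def target_action(area_id):
    """Return the (operation, direction) pair bound to a target area
    the cursor has dwelt on, or None if the area has no bound action."""
    return TARGET_ACTIONS.get(area_id)
```

Keeping the binding in data rather than code means new target areas (e.g. zoom controls) can be added without touching the dispatch logic.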
Drawings
In order to more clearly illustrate the technical solutions of the present application, the drawings that are needed in the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 illustrates a display system architecture diagram including a virtual reality device, according to some embodiments;
FIG. 2 illustrates a VR scene global interface schematic in accordance with some embodiments;
FIG. 3 illustrates a recommended content region schematic diagram of a global interface, according to some embodiments;
FIG. 4 illustrates an application shortcut entry area schematic for a global interface in accordance with some embodiments;
FIG. 5 illustrates a suspension diagram of a global interface, according to some embodiments;
FIG. 6 illustrates a schematic diagram of displaying a web page on a virtual user interface in accordance with some embodiments;
FIG. 7 illustrates a flow chart of a method of focus operation, in accordance with some embodiments;
FIG. 8 illustrates an interface diagram that shows a user focus cursor and menu reminder controls, in accordance with some embodiments;
FIG. 9 illustrates an interface diagram that shows a user focus cursor positioned on a menu reminder control, in accordance with some embodiments;
FIG. 10 illustrates a schematic diagram of a display menu interface, according to some embodiments;
FIG. 11 illustrates an interface diagram in which a user focus cursor is located in an upper directional control, in accordance with some embodiments;
FIG. 12 illustrates an interface schematic with a user focus cursor in a zoom-in control, in accordance with some embodiments;
FIG. 13 illustrates an interface diagram with a user focus cursor in a middle of page height position in accordance with some embodiments;
FIG. 14 illustrates an interface diagram where a user focus cursor is located in a page flip reminder control, in accordance with some embodiments;
FIG. 15 illustrates an interface schematic where a user focus cursor is located in a target area, in accordance with some embodiments;
FIG. 16 illustrates an interface schematic in which a user focus cursor is located in a page flip reminder control, in accordance with some embodiments;
FIG. 17 illustrates an interface diagram with a user focus cursor in a middle position of a page width in accordance with some embodiments;
FIG. 18 illustrates an interface schematic in which a user focus cursor is located in a scroll alert control in accordance with some embodiments;
FIG. 19 illustrates an interface diagram with a user focus cursor in a middle position of a page width in accordance with some embodiments;
FIG. 20 illustrates an interface schematic with a user focus cursor in a scroll alert control in accordance with some embodiments;
FIG. 21 illustrates an interface diagram of a user focus cursor being located at other target areas in accordance with some embodiments;
FIG. 22 illustrates an interface diagram of a user focus cursor being located at other target areas in accordance with some embodiments;
FIG. 23 illustrates an interface diagram with a user focus cursor in a lower right corner of a page, in accordance with some embodiments;
FIG. 24 illustrates an interface schematic with a user focus cursor in a zoom-in control, in accordance with some embodiments;
FIG. 25 illustrates an interface diagram for a page zoom operation for a web page in accordance with some embodiments;
FIG. 26 illustrates an interface diagram with a user focus cursor in a lower right corner of a page, in accordance with some embodiments;
FIG. 27 illustrates an interface schematic where a user focus cursor is located in a re-zoom-in control area, in accordance with some embodiments;
FIG. 28 illustrates an interface diagram with a user focus cursor in a lower left corner of a page, in accordance with some embodiments;
FIG. 29 illustrates an interface diagram in which a user focus cursor is located in a zoom-out control, in accordance with some embodiments;
FIG. 30 illustrates a flow chart showing a menu interface, according to some embodiments.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the exemplary embodiments of the present application more apparent, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is apparent that the described exemplary embodiments are only some embodiments of the present application, but not all embodiments.
All other embodiments obtained by one of ordinary skill in the art from the exemplary embodiments shown in the present application, without inventive effort, fall within the scope of protection of the present application. Furthermore, while the disclosure has been presented in terms of one or more exemplary embodiments, it should be understood that individual aspects of the disclosure may also be practiced on their own, rather than only as part of a complete embodiment.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims, and in the above-described figures are used to distinguish between similar objects and not necessarily to describe a particular sequential or chronological order. It is to be understood that objects so labeled may be interchanged where appropriate, so that, for example, the embodiments of the application can be implemented in orders other than those illustrated or described herein.
Furthermore, the terms "comprise" and "have," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The term "module" as used in this application refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the function associated with that element.
Reference throughout this specification to "multiple embodiments," "some embodiments," "one embodiment," or "an embodiment," etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in at least one other embodiment," or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, a particular feature, structure, or characteristic shown or described in connection with one embodiment may be combined, in whole or in part, with features, structures, or characteristics of one or more other embodiments without limitation. Such modifications and variations are intended to be included within the scope of the present application.
In this embodiment, the virtual reality device 500 generally refers to a display device that can be worn on the user's face to provide an immersive experience, including, but not limited to, VR glasses, Augmented Reality (AR) devices, VR gaming devices, mobile computing devices, and other wearable computers. In some embodiments of the present application, VR glasses are taken as the example when describing the technical solution, and it should be understood that the solution can equally be applied to other types of virtual reality devices. The virtual reality device 500 may operate independently or may be connected to another smart display device as an external device, where the display device may be a smart TV, a computer, a tablet computer, a server, etc.
The virtual reality device 500 may display a media asset screen after being worn on the face of the user, providing close range images for both eyes of the user to bring an immersive experience. To present the asset screen, the virtual reality device 500 may include a plurality of components for displaying the screen and face wear. Taking VR glasses as an example, the virtual reality device 500 may include components such as a housing, a position fixture, an optical system, a display assembly, a gesture detection circuit, an interface circuit, and the like. In practical applications, the optical system, the display assembly, the gesture detection circuit and the interface circuit may be disposed in the housing, so as to be used for presenting a specific display screen; the two sides of the shell are connected with position fixing pieces so as to be worn on the face of a user.
The gesture detection circuit contains gesture detection elements such as a gravity acceleration sensor and a gyroscope. When the user's head moves or rotates, the circuit detects the user's pose and transmits the detected pose data to a processing element such as the controller, which adjusts the specific picture content in the display assembly according to that data.
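As a minimal sketch of how pose data could drive the picture content, head yaw and pitch from the gyroscope can be projected onto a flat virtual user interface placed a fixed distance in front of the user, yielding the focus cursor position on that interface. The function, angle convention, and distance below are illustrative assumptions.

```python
import math

def cursor_on_interface(yaw_deg, pitch_deg, interface_distance=2.0):
    """Project head yaw/pitch (degrees) onto a flat virtual user
    interface at a fixed distance (meters) in front of the user,
    returning the (x, y) position of the focus cursor on it."""
    x = interface_distance * math.tan(math.radians(yaw_deg))
    y = interface_distance * math.tan(math.radians(pitch_deg))
    return x, y
```

Turning the head 45 degrees to the side with the interface 2 m away would thus move the cursor about 2 m across the interface plane.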
As shown in fig. 1, in some embodiments, the virtual reality device 500 may be connected to the display device 200, and a network-based display system is constructed between the virtual reality device 500, the display device 200, and the server 400, and data interaction may be performed in real time, for example, the display device 200 may obtain media data from the server 400 and play the media data, and transmit specific screen content to the virtual reality device 500 for display.
The display device 200 may be a liquid crystal display, an OLED display, or a projection display device. The particular display device type, size, resolution, etc. are not limited; those skilled in the art will appreciate that the performance and configuration of the display device 200 may be varied as desired. The display device 200 may provide a broadcast-receiving TV function and may additionally provide smart network TV functions with computing support, including, but not limited to, network TV, smart TV, Internet Protocol TV (IPTV), and the like.
The display device 200 and the virtual reality device 500 also exchange data with the server 400 via a variety of communication means, for example over a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display device 200. For example, the display device 200 may receive software program updates, access a remotely stored digital media library, or perform Electronic Program Guide (EPG) interactions by sending and receiving information. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers. Other web service content, such as video on demand and advertising services, is also provided through the server 400.
During data interaction, the user may operate the display device 200 through the mobile terminal 300 and the remote controller 100, which can communicate with the display device 200 either by a direct wireless connection or by an indirect connection. In some embodiments, the mobile terminal 300 and the remote controller 100 communicate with the display device 200 directly via Bluetooth, infrared, or the like; when transmitting a control instruction, they send the instruction data straight to the display device 200 over that link.
In other embodiments, the mobile terminal 300 and the remote controller 100 may also access the same wireless network with the display device 200 through a wireless router to establish indirect connection communication with the display device 200 through the wireless network. When transmitting the control command, the mobile terminal 300 and the remote controller 100 may transmit the control command data to the wireless router first, and then forward the control command data to the display device 200 through the wireless router.
In some embodiments, the user may also use the mobile terminal 300 and the remote controller 100 to directly interact with the virtual reality device 500, for example, the mobile terminal 300 and the remote controller 100 may be used as handles in a virtual reality scene to implement functions such as somatosensory interaction.
In some embodiments, the display components of the virtual reality device 500 include a display screen and the drive circuitry associated with it. To present a specific picture with a stereoscopic effect, the display assembly may include two display screens, corresponding to the user's left and right eyes respectively. When a 3D effect is presented, the picture content shown on the left and right screens differs slightly; for example, the left-camera and right-camera views captured when a 3D film source was shot may be displayed on the respective screens. Because the user's left and right eyes observe slightly different content, a picture with a strong stereoscopic impression is perceived when the device is worn.
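The left/right difference can be sketched as a horizontal disparity that grows as an object comes closer. The pinhole model and default interpupillary distance below are illustrative assumptions, not the patent's rendering pipeline.

```python
def stereo_positions(obj_x, depth_m, ipd_m=0.063):
    """Horizontal on-screen positions of a point for the left and
    right eyes under a simple pinhole model: each eye is offset by
    half the interpupillary distance, so nearer points (smaller
    depth) produce a larger disparity between the two screens."""
    shift = (ipd_m / 2.0) / depth_m
    return obj_x + shift, obj_x - shift
```

The visual system fuses the two offset images, and the depth-dependent disparity is what yields the stereoscopic impression described above.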
The optical system in the virtual reality device 500 is an optical module composed of a plurality of lenses. It is arranged between the user's eyes and the display screen; through the refraction of light by the lenses and the polarization effect of the polarizers on them, the optical path can be lengthened so that the content presented by the display assembly appears clearly in the user's field of view. To adapt to the vision of different users, the optical system also supports focusing: a focusing assembly adjusts the position of one or more lenses, changing the distance between them and therefore the optical path, which adjusts the clarity of the picture.
The interface circuit of the virtual reality device 500 may be used to transfer interaction data, and besides transferring gesture data and displaying content data, in practical application, the virtual reality device 500 may also be connected to other display devices or peripheral devices through the interface circuit, so as to implement more complex functions by performing data interaction with the connection device. For example, the virtual reality device 500 may be connected to a display device through an interface circuit, so that a displayed screen is output to the display device in real time for display. For another example, the virtual reality device 500 may also be connected to a handle via interface circuitry, which may be operated by a user in a hand, to perform related operations in the VR user interface.
Wherein the VR user interface can be presented as a plurality of different types of UI layouts depending on user operation. For example, the user interface may include a global interface, such as the global UI shown in fig. 2 after the AR/VR terminal is started, which may be displayed on a display screen of the AR/VR terminal or may be displayed on a display of the display device. The global UI may include a recommended content area 1, a business class extension area 2, an application shortcut entry area 3, and a hover area 4.
The recommended content area 1 is used to configure TAB columns of different classifications. Media assets, themes, and the like can be configured in these columns; the media assets may include 2D movies, educational courses, travel, 3D content, 360-degree panoramas, live broadcasts, 4K movies, program applications, games, and other services with media asset content. The columns can select different template styles and can support simultaneous recommended arrangement of media assets and themes, as shown in fig. 3.
In some embodiments, the content recommendation area 1 may also include a main interface and a sub-interface. As shown in fig. 3, the portion located in the center of the UI layout is a main interface, and the portions located at both sides of the main interface are sub-interfaces. The main interface and the auxiliary interface can be used for respectively displaying different recommended contents. For example, according to the recommended type of the sheet source, the service of the 3D sheet source may be displayed on the main interface; and the left side sub-interface displays the business of the 2D film source, and the right side sub-interface displays the business of the full-scene film source.
For the main interface and the sub-interfaces, different service contents can be displayed with different content layouts, and the user can switch between them through specific interactions. For example, the user can control the focus mark to move left and right: when the focus mark is at the rightmost side of the main interface and continues to move right, the right-side sub-interface is brought to the center of the UI layout. The main interface then switches to displaying the service of the full-view film source, the left-side sub-interface switches to displaying the service of the 3D film source, and the right-side sub-interface switches to displaying the service of the 2D film source.
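This edge-triggered rotation of services among the left, main, and right interfaces behaves like a three-slot carousel. The class below is a minimal sketch under that assumption; the slot and service names are illustrative.

```python
class InterfaceCarousel:
    """Three-slot carousel of film-source services: moving the focus
    mark past the right edge rotates the right sub-interface into the
    main (center) slot, with the other services filling the rest."""

    def __init__(self, left, main, right):
        self._slots = [left, main, right]

    def shift_right(self):
        # Right sub-interface becomes the new main interface.
        self._slots = self._slots[1:] + self._slots[:1]

    @property
    def layout(self):
        left, main, right = self._slots
        return {"left": left, "main": main, "right": right}
```

Starting from 2D on the left, 3D in the center, and the panorama source on the right, one rightward shift places the panorama service in the main interface, matching the example above.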
In addition, to make viewing easier, the main interface and the sub-interfaces can be displayed with different visual effects. For example, the transparency of the sub-interfaces can be increased to give them a blurred appearance, highlighting the main interface; alternatively, the sub-interfaces can be rendered in grayscale while the main interface keeps its color, again highlighting the main interface.
In some embodiments, a status bar may also be provided at the top of the recommended content area 1, with a plurality of display controls, including common options such as time, network connection status, and power. The content of the status bar may be user-defined; for example, weather or a user avatar may be added. Each item in the status bar can be selected by the user to perform the corresponding function. For example, when the user clicks the time option, the virtual reality device 500 may display a clock window in the current interface or jump to a calendar interface. When the user clicks the network connection status option, the virtual reality device 500 may display a WiFi list in the current interface or jump to the network setup interface.
The content displayed in the status bar may be presented in different forms according to the state of the specific item. For example, the time control can be displayed directly as text showing the current time, updating as the time changes; the power control may be displayed with different patterns according to the remaining battery level of the virtual reality device 500.
The status bar is used to enable the user to perform a common control operation, so as to implement quick setting of the virtual reality device 500. Since the setup procedure for the virtual reality device 500 includes a number of items, all of the commonly used setup options cannot generally be displayed in the status bar. To this end, in some embodiments, an expansion option may also be provided in the status bar. After the expansion options are selected, an expansion window may be presented in the current interface, and a plurality of setting options may be further provided in the expansion window for implementing other functions of the virtual reality device 500.
For example, in some embodiments, after the expansion option is selected, a "shortcut center" option may be set in the expansion window. After clicking the shortcut center option, the user may display a shortcut center window by the virtual reality device 500. The shortcut center window can comprise screen capturing, screen recording and screen throwing options for respectively waking up corresponding functions.
The service classification extension area 2 supports configuring extension classifications of different categories. If a new service type exists, an independent TAB may be configured for it to display the corresponding page content. The service classifications in the service classification extension area 2 may also be reordered, and services may be taken offline. In some embodiments, the service classification extension area 2 may include the following content: movie, education, travel, application, and my. In some embodiments, the service classification extension area 2 is configured to show the major service classification TABs and supports configuring more classifications, whose icons support configuration as shown in fig. 3.
The application shortcut entry area 3 may display specified pre-installed applications (a plurality of which may be specified) in front for operational recommendation, and supports configuring special icon styles to replace the default icons. In some embodiments, the application shortcut entry area 3 further includes a left movement control and a right movement control for moving the option target, i.e., for selecting different icons, as shown in fig. 4.
The hover area 4 may be configured above the left diagonal or above the right diagonal of the fixation area, may be configured as an alternate character, or may be configured with a jump link. For example, upon receipt of a confirmation operation, the hover element jumps to an application or displays a designated function page, as shown in fig. 5. In some embodiments, the hover element may also be configured without a jump link, purely for visual presentation.
In some embodiments, the global UI further includes a status bar at the top for displaying the time, network connection status, power status, and more shortcut entries. After the user selects an icon with the handle of the AR/VR terminal, i.e., the handheld controller, the icon displays a text prompt that expands to the left and right, and the selected icon is stretched and expanded to the left or right according to its position.
For example, after the search icon is selected, the search icon displays the text "search" together with the original icon, and after the icon or text is further clicked, the interface jumps to the search page. For another example, clicking the favorites icon jumps to the favorites TAB, clicking the history icon locates and displays the history page by default, clicking the search icon jumps to the global search page, and clicking the message icon jumps to the message page.
In some embodiments, the interaction may be performed through a peripheral device; for example, a handle of the AR/VR terminal may operate the user interface of the AR/VR terminal. The handle includes a back button; a home key, which can implement a reset function when long-pressed; volume up and down buttons; and a touch area, which can implement clicking, sliding, and press-and-hold drag functions of the focus.
In the virtual reality device 500, the display of content on the virtual user interface depends on the VR browser. When the virtual reality device 500 displays the virtual user interface through the VR browser, the user may interact with it to complete actions such as element clicking, window scrolling, and panoramic video control. The virtual user interface may be the VR user interface described above.
Currently, a user may perform interactive operations through the global UI interface and jump to a specific interface in some interaction modes. For example, FIG. 6 is a schematic diagram illustrating the display of a web page on a virtual user interface in accordance with some embodiments. Referring to fig. 6, to browse web content, the user may enter the browser by clicking any browser icon in the global UI interface, at which point the virtual reality device 500 may control a jump to the web page. After the web page is displayed, the user's interaction with the web page in the virtual user interface is mainly accomplished by frequently pressing buttons on a somatosensory handle with a finger/hand. For example, the user is first required to move a target element of the web page into the visual range 600 on the virtual user interface by head rotation; then the user casts a ray in the web page using the somatosensory handle or the like to aim at a target point; finally, the user holds the somatosensory handle steady at the current position on the web page and presses a button to complete the click operation.
To avoid the complexity of interactive operations between the user and the virtual user interface caused by frequently manipulating a somatosensory handle with a finger/hand, the present application provides a virtual reality device comprising: a display configured to display a virtual user interface for displaying a web page; and a gesture sensor configured to detect a user focus cursor. To facilitate user operation, the present application operates the web page through the movement and stay of the user focus cursor.
The following specifically describes a virtual reality device and a focus operation method provided in the present application.
Fig. 7 illustrates a flow chart of a focus operation method according to some embodiments. Referring to fig. 7, the controller of the virtual reality device provided in an embodiment of the present application, when performing the focus operation method, is configured to perform the following steps:
S1, when the user focus cursor moves to a target area of the web page, determining a stay condition of the user focus cursor on the target area, wherein the target area is used for representing a response position of the user focus cursor on the web page.
In some embodiments, the position of the user focus cursor in the web page needs to be obtained, and whether the position corresponding to the user focus cursor is in the target area of the web page is determined. It should be noted that the target area in the web page is not specifically limited; since the target area represents the response position of the user focus cursor on the web page, it can be set according to the actual situation. For example, the target area may be the center position of the web page, the peripheral edge positions of the web page, or the upper-left, upper-right, lower-left, and lower-right positions of the web page.
In some embodiments, after it is determined that the position of the user focus cursor is in the target area of the web page, the stay condition of the user focus cursor on the target area is determined. In a specific implementation, the stay condition is the stay duration of the user focus cursor on the target area: a first stay duration of the user focus cursor in the target area is detected, and when the first stay duration is greater than or equal to a preset stay duration, it is determined that the stay condition of the user focus cursor on the target area satisfies the preset condition.
For example, if the detected first stay duration of the user focus cursor on the target area is 2 seconds and the preset stay duration is 1.5 seconds, it is determined that the stay condition of the user focus cursor on the target area satisfies the preset condition, and subsequent operations are then performed.
In some embodiments, the user focus cursor may be determined by the gesture sensor: as the user's head moves, the user focus cursor moves in the direction of the head movement. It should be noted that the preset condition for the user focus cursor on the target area, such as the preset stay duration, may be set according to the actual habits of the user. If the preset stay duration is too short, the interface for the subsequent indicated operation will frequently pop up while the user is browsing, affecting the reading experience. If the preset stay duration is too long, when the user needs to operate through that interface while browsing, the time spent focusing on the target area becomes excessive, resulting in a poor operating experience. The present application is described by way of example only, and the stay duration and the preset duration in the present application may be set according to the actual situation.
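The dwell check in step S1 can be sketched as follows. This is a minimal illustration, not the patent's implementation; the names `DwellTracker` and `PRESET_DWELL_S` are assumptions, and the 1.5-second preset mirrors the example above.

```python
PRESET_DWELL_S = 1.5  # assumed preset stay duration; tunable per user habit

class DwellTracker:
    """Tracks how long the focus cursor has stayed inside a target area (sketch)."""

    def __init__(self, preset_dwell_s=PRESET_DWELL_S):
        self.preset_dwell_s = preset_dwell_s
        self.enter_time = None  # timestamp when the cursor entered the area

    def update(self, in_target_area, now):
        """Feed one sample; return True once the stay condition is met."""
        if not in_target_area:
            self.enter_time = None        # leaving the area resets the timer
            return False
        if self.enter_time is None:
            self.enter_time = now         # cursor just entered the area
        return (now - self.enter_time) >= self.preset_dwell_s

tracker = DwellTracker()
assert tracker.update(True, 0.0) is False   # just entered
assert tracker.update(True, 1.0) is False   # 1.0 s < 1.5 s
assert tracker.update(True, 2.0) is True    # 2.0 s >= 1.5 s, condition met
```

Resetting the timer whenever the cursor leaves the area means only a continuous stay counts, which matches the dwell-to-confirm behavior described here.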
S2, judging whether the stay condition satisfies a preset condition; and when the stay condition satisfies the preset condition, operating the target area with the target action corresponding to the user focus cursor, so as to complete the user's interactive operation in the web page.
In some embodiments, referring to fig. 30, when the stay condition satisfies the preset condition and the target area is operated with the target action corresponding to the user focus cursor to complete the user's interactive operation in the web page, the controller is further configured to:
S21, when the first stay duration is greater than or equal to the preset stay duration, displaying a menu reminder control in the web page, wherein the menu reminder control is close to the user focus cursor and is used for prompting the user to open a menu.
S22, detecting a first movement position of the user focus cursor.
S23, judging whether the first movement position is in the area corresponding to the menu reminder control.
S24, if the first movement position is in the area corresponding to the menu reminder control, detecting a second stay duration of the user focus cursor at the first movement position.
S25, when the second stay duration satisfies the preset condition, displaying a menu interface at a preset position of the web page, wherein the menu interface comprises at least one functional control, and the functional control is used for implementing page turning and/or sliding operations on the web page.
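Steps S21 through S25, together with the cancellation path described later, form a small decision flow. The sketch below is an illustrative condensation under assumed names (`menu_flow`, the 1.5-second preset); it is not code from the patent.

```python
def menu_flow(first_dwell, on_control, second_dwell, preset=1.5):
    """Return the resulting UI state of the menu-reminder interaction (sketch).

    first_dwell:  stay duration in the target area (S1/S21 trigger)
    on_control:   whether the cursor's first movement position is on the
                  menu reminder control (S23)
    second_dwell: stay duration at that movement position (S24; this is the
                  "third stay duration" when the cursor is off the control)
    """
    if first_dwell < preset:
        return "no_reminder"            # S1 condition not met, nothing shown
    # S21: the menu reminder control is shown near the focus cursor
    if on_control and second_dwell >= preset:
        return "menu_shown"             # S25: menu interface displayed
    if not on_control and second_dwell >= preset:
        return "reminder_cancelled"     # dwelling outside the control cancels it
    return "reminder_shown"             # still waiting for the user to decide

assert menu_flow(2.0, True, 1.6) == "menu_shown"
assert menu_flow(2.0, False, 2.0) == "reminder_cancelled"
assert menu_flow(1.0, True, 9.9) == "no_reminder"
```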
The process of displaying a menu interface is described below by way of example with reference to figs. 8-13.
FIG. 8 illustrates an interface diagram showing a user focus cursor and a menu reminder control, in accordance with some embodiments. Referring to fig. 8, for example, when the user focus cursor stays in the target area and the first stay duration is greater than or equal to the preset stay duration, a menu reminder control 800 is displayed in the web page. The menu reminder control 800 is close to the user focus cursor and is used for prompting the user to perform the menu opening operation. The control displays the text "Menu? Hold focus here to confirm" to prompt the user to operate by keeping the user focus cursor on the menu reminder control.
FIG. 9 illustrates an interface diagram showing a user focus cursor positioned on a menu reminder control, in accordance with some embodiments. Referring to fig. 9, for example, after the menu reminder control 800 is displayed, the first movement position of the user focus cursor is detected, and whether the first movement position is in the area corresponding to the menu reminder control is judged. As can be seen, the first movement position is in the area corresponding to the menu reminder control 800. A second stay duration of the user focus cursor at the first movement position is then detected, and when the second stay duration satisfies the preset condition, the menu interface is displayed at the preset position of the web page. The second stay duration and its preset condition are the same as the stay duration and preset stay duration of the user focus cursor described above, and are not repeated here. When it is detected that the second stay duration satisfies the preset condition, display of the menu interface is triggered.
In some embodiments, if the first movement position is outside the area corresponding to the menu reminder control, a third stay duration of the user focus cursor at the first movement position is detected; and when the third stay duration satisfies the preset condition, display of the menu reminder control is cancelled.
For example, when the first movement position corresponding to the user focus cursor is in an area outside the menu reminder control, the third stay duration at the first movement position continues to be detected, e.g., a stay of 2 seconds. When the third stay duration is greater than or equal to the preset condition, i.e., a preset stay duration of 2 seconds, it is assumed by default that the user does not intend to continue operating the menu interface, and display of the menu reminder control is cancelled. It should be noted that all preset conditions and stay conditions in the present application can be set according to the actual situation, and the preset conditions and stay conditions in each step may be the same or different.
FIG. 10 illustrates a schematic diagram of a displayed menu interface, according to some embodiments. Referring to fig. 10, the menu interface is displayed after the second stay duration corresponding to the user focus cursor satisfies the preset condition. The menu interface comprises up, down, left, and right direction controls, a zoom-in control, and a zoom-out control. When the user moves the user focus cursor into the areas corresponding to the up, down, left, and right direction controls, the web page is controlled to perform the corresponding functions. When the user focus cursor moves into the area corresponding to the zoom-in control, the web page is controlled to perform the zoom-in operation; and when the user focus cursor moves into the area corresponding to the zoom-out control, the web page is controlled to perform the zoom-out operation.
By way of example, FIG. 11 illustrates an interface diagram in which the user focus cursor is located on the up direction control, in accordance with some embodiments. Referring to fig. 11, when the user focus cursor moves into the area corresponding to the up direction control, the above steps of detecting the stay duration are repeated, which is not described again here. If the preset condition is satisfied, the web page is controlled to scroll upward. Similarly, when the user focus cursor moves into the area corresponding to the down direction control, the stay duration is detected and whether the preset condition is satisfied is judged; after the preset condition is satisfied, the web page is controlled to scroll downward. When the user focus cursor moves into the area corresponding to the left or right direction control, the stay duration is detected and whether the preset condition is satisfied is judged. After the preset condition is satisfied, whether the current web page has performed the zoom-in operation is judged; if so, the web page is controlled to scroll to the left or right. If the current web page has not performed the zoom-in operation, the web page is controlled to turn to the previous or next page.
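The dispatch described above — up/down always scroll, while left/right scroll only on a zoomed-in page and otherwise turn the page — can be condensed into a small mapping. The function name and action strings below are illustrative assumptions.

```python
def direction_action(control, page_zoomed):
    """Map a dwelled direction control to the web-page action (illustrative)."""
    if control == "up":
        return "scroll_up"
    if control == "down":
        return "scroll_down"
    if control in ("left", "right"):
        # Horizontal scrolling is only meaningful when the page is zoomed in;
        # otherwise the left/right controls are interpreted as page turning.
        if page_zoomed:
            return f"scroll_{control}"
        return "prev_page" if control == "left" else "next_page"
    raise ValueError(f"unknown control: {control}")

assert direction_action("up", False) == "scroll_up"
assert direction_action("left", True) == "scroll_left"
assert direction_action("right", False) == "next_page"
```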
FIG. 12 illustrates an interface schematic with the user focus cursor on the zoom-in control, according to some embodiments. Referring to fig. 12, when the user focus cursor moves into the area corresponding to the zoom-in control, the stay duration is detected and whether the preset condition is satisfied is judged. After the preset condition is satisfied, the web page is controlled to perform the zoom-in operation. Similarly, when the user focus cursor moves into the area corresponding to the zoom-out control, the stay duration is detected and whether the preset condition is satisfied is judged. After the preset condition is satisfied, the web page is controlled to perform the zoom-out operation.
In some embodiments, the plurality of controls included in the menu interface may be set according to the actual situation; meanwhile, the function corresponding to each control is illustrated only by way of example, and copy, paste, or screenshot operations on the web page displayed on the display may also be completed through user focus cursor operations.
In other embodiments, in addition to the menu interface operated by the user focus cursor, a corresponding prompt may be given when the user stays in other areas of the web page. Specifically, whether the target area is located in a second edge area of the web page is detected, wherein the second edge area is an edge area in the height direction of the web page. When the target area is located in the second edge area, a page-turn reminder control is displayed on the web page; the page-turn reminder control is close to the user focus cursor and is used for prompting the user to perform a page-turning operation.
In some embodiments, a fourth stay duration of the user focus cursor in the edge area is detected. If the fourth stay duration satisfies the preset condition, a scroll reminder control is displayed in the web page; the scroll reminder control is close to the user focus cursor and is used for prompting the user to scroll the web page. If the fourth stay duration does not satisfy the preset condition, the scroll reminder control is not displayed in the web page.
By way of example, FIG. 13 illustrates an interface diagram with the user focus cursor at a mid-height position of the page, in accordance with some embodiments. Referring to fig. 13, when the user focus cursor stays in the target area, the target area is located in the second edge area of the web page, which is the edge area of the web page in the height direction. The second edge area includes the left and right areas of the page. When the user focus cursor stays in the left area and the stay duration is greater than or equal to the preset stay duration, a page-turn reminder control is displayed in the web page; the page-turn reminder control is close to the user focus cursor and is used for prompting the user to perform a page-turning operation. The figure shows the text "Previous page? Hold focus here to confirm" to prompt the user to perform the page-turning operation by keeping the user focus cursor on the page-turn reminder control.
FIG. 14 illustrates an interface diagram where the user focus cursor is located on the page-turn reminder control, in accordance with some embodiments. Referring to fig. 14, when the user focus cursor moves into the area corresponding to the page-turn reminder control, the stay duration is detected and whether the preset condition is satisfied is judged. After the preset condition is satisfied, the web page is controlled to turn to the previous page.
FIG. 15 illustrates an interface schematic where the user focus cursor is located in the target area, in accordance with some embodiments. Referring to fig. 15, when the user focus cursor stays in the right area and the stay duration is greater than or equal to the preset stay duration, a page-turn reminder control is displayed in the web page; the page-turn reminder control is close to the user focus cursor and is used for prompting the user to perform a page-turning operation. The text "Next page? Hold focus here to confirm" prompts the user to perform the page-turning operation by keeping the user focus cursor on the page-turn reminder control.
FIG. 16 illustrates an interface diagram where the user focus cursor is located on the page-turn reminder control, in accordance with some embodiments. Referring to fig. 16, when the user focus cursor moves into the area corresponding to the page-turn reminder control, the stay duration is detected and whether the preset condition is satisfied is judged. After the preset condition is satisfied, the web page is controlled to turn to the next page.
It should be noted that, in a scenario where the user has opened a plurality of web pages and the current web page has not performed the zoom-in operation, when the user moves the user focus cursor to the target area, the page-turn reminder control is displayed in the web page. If the user moves the user focus cursor onto the page-turn reminder control, a return to the previous page is triggered. If the user does not move the user focus cursor onto the page-turn reminder control, the page-turn reminder control is controlled to disappear according to the preset condition. Similarly, in a scenario where the user has opened only one web page and/or the current web page has performed the zoom-in operation, when the user moves the user focus cursor to the target area, the scroll reminder control is displayed in the web page to facilitate subsequent operations. The present application can be designed differently for different scenarios to implement different functions.
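The scenario split above — page-turn reminder for multiple unzoomed pages, scroll reminder otherwise — amounts to a two-input decision. A minimal sketch, with an assumed function name:

```python
def edge_reminder(num_open_pages, page_zoomed):
    """Choose which reminder to show in the height-direction edge area (sketch)."""
    if num_open_pages > 1 and not page_zoomed:
        return "page_turn_reminder"   # multiple pages, unzoomed: offer page turn
    return "scroll_reminder"          # single page and/or zoomed: offer scrolling

assert edge_reminder(3, False) == "page_turn_reminder"
assert edge_reminder(1, False) == "scroll_reminder"
assert edge_reminder(2, True) == "scroll_reminder"
```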
In other embodiments, whether the target area is located in a first edge area of the web page is detected, wherein the first edge area is the middle area in the width direction of the web page. When the target area is located in the first edge area, the movement direction of the user focus cursor in the target area is determined, and the web page is controlled to scroll in the corresponding direction according to the movement direction.
By way of example, FIG. 17 illustrates an interface diagram with the user focus cursor positioned in the middle of the page width, in accordance with some embodiments. Referring to fig. 17, when the user focus cursor stays in the target area, the target area is located in the first edge area of the web page, which is the middle area in the width direction of the web page. The first edge area includes the top and bottom areas of the page. When the user focus cursor stays in the bottom area and its stay duration is greater than or equal to the preset stay duration, a scroll reminder control is displayed in the web page; the scroll reminder control is close to the user focus cursor and is used for prompting the user to perform a page scrolling operation. The text "Scroll down? Hold focus here to confirm" prompts the user to scroll by keeping the user focus cursor on the scroll reminder control.
FIG. 18 illustrates an interface diagram where the user focus cursor is located on the scroll reminder control, in accordance with some embodiments. Referring to fig. 18, when the user focus cursor moves into the area corresponding to the scroll reminder control, the stay duration is detected and whether the preset condition is satisfied is judged as described above. After the preset condition is satisfied, the web page is controlled to scroll downward according to the movement direction.
FIG. 19 illustrates an interface diagram with the user focus cursor in the middle position of the page width, in accordance with some embodiments. Referring to fig. 19, when the user focus cursor stays in the top area and the stay duration is greater than or equal to the preset stay duration, a scroll reminder control is displayed in the web page; the scroll reminder control is close to the user focus cursor and is used for prompting the user to perform a page scrolling operation. The text "Scroll up? Hold focus here to confirm" prompts the user to scroll by keeping the user focus cursor on the scroll reminder control.
FIG. 20 illustrates an interface diagram where the user focus cursor is located on the scroll reminder control, in accordance with some embodiments. Referring to fig. 20, when the user focus cursor moves into the area corresponding to the scroll reminder control, the stay duration is detected and whether the preset condition is satisfied is judged as described above. After the preset condition is satisfied, the web page is controlled to scroll upward according to the movement direction.
Similarly, referring to figs. 21 and 22, when the user focus cursor stays in another target area and the stay duration is greater than or equal to the preset stay duration, a scroll reminder control may also be displayed in the web page, close to the user focus cursor, e.g., with the text "Scroll left? Hold focus here to confirm" or "Scroll right? Hold focus here to confirm", to prompt the user to scroll by keeping the user focus cursor on the scroll reminder control.
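The regions used so far — width-direction middle edges (first edge area), height-direction edges (second edge area), and the corner zones introduced below — can be sketched as a hit test on normalized page coordinates. The function name, the 10% margin, and the region labels are all illustrative assumptions.

```python
def classify_region(x, y, margin=0.1):
    """Classify a normalized cursor position into the page regions (sketch).

    (0, 0) is the top-left of the page and (1, 1) the bottom-right;
    `margin` is the assumed width of each edge band.
    """
    left, right = x < margin, x > 1 - margin
    top, bottom = y < margin, y > 1 - margin
    if bottom and right:
        return "zoom_in_corner"     # lower-right corner: zoom-in prompt
    if bottom and left:
        return "zoom_out_corner"    # lower-left corner: zoom-out prompt
    if left or right:
        return "second_edge"        # height-direction edges: page turning
    if top or bottom:
        return "first_edge"         # width-direction middle edges: scrolling
    return "center"

assert classify_region(0.95, 0.95) == "zoom_in_corner"
assert classify_region(0.05, 0.95) == "zoom_out_corner"
assert classify_region(0.02, 0.50) == "second_edge"
assert classify_region(0.50, 0.03) == "first_edge"
assert classify_region(0.50, 0.50) == "center"
```

Checking corners before edges keeps the lower corners from being swallowed by the generic edge bands.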
Another embodiment of zooming in or out on a web page using the user focus cursor is described below in conjunction with figs. 23-29.
FIG. 23 illustrates an interface diagram with the user focus cursor in the lower-right corner of the page, in accordance with some embodiments. Referring to fig. 23, when the user focus cursor stays in the target area, the target area is located at the lower-right corner of the web page. When the user focus cursor stays at the lower-right corner of the page and the stay duration is greater than or equal to the preset stay duration, a zoom-in control is displayed in the web page; the zoom-in control is close to the user focus cursor and is used for prompting the user to perform the web page zoom-in operation. The figure shows the text "Zoom in? Hold focus here to confirm" to prompt the user to zoom in by keeping the user focus cursor on the zoom-in control.
FIG. 24 illustrates an interface schematic with the user focus cursor on the zoom-in control, according to some embodiments. Referring to fig. 24, when the user focus cursor moves into the area corresponding to the zoom-in control, the stay duration is detected and whether the preset condition is satisfied is judged as described above. After the preset condition is satisfied, the web page is controlled to perform the zoom-in operation. Fig. 25 is an interface schematic diagram of the web page after the zoom-in operation has been performed, in which the content of the web page is enlarged.
Further, based on the above page zoom-in operation, when the user wants to zoom in on the page again, the above operation of moving the user focus cursor to the target area may be repeated.
By way of example, FIG. 26 illustrates an interface diagram with the user focus cursor in the lower-right corner of the page, in accordance with some embodiments. Referring to fig. 26, when the user focus cursor again stays at the lower-right corner of the page and the stay duration is greater than or equal to the preset stay duration, a re-zoom-in control is displayed in the web page; the re-zoom-in control is close to the user focus cursor and is used for prompting the user to zoom in on the web page again. The text "Continue to zoom in? Hold focus here to confirm" prompts the user to zoom in on the page again by keeping the user focus cursor on the control.
FIG. 27 illustrates an interface schematic where the user focus cursor is located in the re-zoom-in control area, according to some embodiments. Referring to fig. 27, when the user focus cursor again moves into the area corresponding to the zoom-in control, the stay duration is detected and whether the preset condition is satisfied is judged as described above. After the preset condition is satisfied, the web page is controlled to zoom in again.
Similarly, FIG. 28 illustrates an interface diagram with the user focus cursor positioned at the lower-left corner of the page, in accordance with some embodiments. Referring to fig. 28, when the user focus cursor stays in the target area, the target area is located at the lower-left corner of the web page. When the user focus cursor stays at the lower-left corner of the page and the stay duration is greater than or equal to the preset stay duration, a zoom-out control is displayed in the web page; the zoom-out control is close to the user focus cursor and is used for prompting the user to perform the web page zoom-out operation. The text "Zoom out? Hold focus here to confirm" prompts the user to zoom out by keeping the user focus cursor on the zoom-out control.
FIG. 29 illustrates an interface diagram where the user focus cursor is located on the zoom-out control, in accordance with some embodiments. Referring to fig. 29, when the user focus cursor moves into the area corresponding to the zoom-out control, the stay duration is detected and whether the preset condition is satisfied is judged as described above. After the preset condition is satisfied, the web page is controlled to perform the zoom-out operation.
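The repeatable corner-dwell zoom described in figs. 23-29 can be modeled as stepping through a bounded set of zoom levels. The class name and the specific zoom steps below are assumptions for illustration; the patent does not specify zoom factors.

```python
class ZoomController:
    """Sketch of repeatable corner-dwell zoom in/out with clamped levels."""

    LEVELS = (0.5, 0.75, 1.0, 1.25, 1.5, 2.0)   # assumed zoom steps

    def __init__(self):
        self.idx = self.LEVELS.index(1.0)        # start at 100%

    def zoom_in(self):
        """Dwell in the lower-right corner: step up one zoom level."""
        self.idx = min(self.idx + 1, len(self.LEVELS) - 1)
        return self.LEVELS[self.idx]

    def zoom_out(self):
        """Dwell in the lower-left corner: step down one zoom level."""
        self.idx = max(self.idx - 1, 0)
        return self.LEVELS[self.idx]

z = ZoomController()
assert z.zoom_in() == 1.25      # first dwell in the lower-right corner
assert z.zoom_in() == 1.5       # "continue to zoom in" repeats the step
assert z.zoom_out() == 1.25     # lower-left corner shrinks the page again
```

Clamping at both ends means repeated dwells past the limits are harmless no-ops, which matches the "repeat the operation to zoom again" behavior above.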
In some embodiments, when the user moves the user focus cursor into a control area and the preset condition is satisfied so that the control is operated, display of the control is cancelled while the web page is controlled to perform the operation.
In some embodiments, when the user moves the user focus cursor into a control area and the preset condition is not satisfied, display of the control is cancelled.
In some embodiments, the stay conditions provided herein include, but are not limited to, the stay duration; other stay criteria may also be used, such as a change in the stay position or the stay state of the user focus cursor. The present application is not particularly limited in this regard, and the criteria can be set according to the actual situation.
The UI drawings provided in the present application are only schematic illustrations of the schemes and do not represent actual product designs; the stay conditions, preset conditions, target areas, functional controls, and display effects should be based on actual applications and designs.
In a specific implementation at the software layer, in some embodiments, focus event listening is added to various regions in the web page, such as the middle region, the edge regions, and the lower-left and lower-right regions. When the user moves the user focus cursor onto one of these regions and focuses there, the event listening function is triggered, and the display or disappearance of the controls is determined based on the position and state of the user focus cursor movement.
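The per-region listener idea in this paragraph can be sketched as a small dispatcher: each page region registers a callback, and focus events are routed to the matching one. All names here (`FocusDispatcher`, the region strings) are illustrative assumptions, not API names from the patent.

```python
class FocusDispatcher:
    """Sketch of per-region focus event listening for a web page."""

    def __init__(self):
        self._listeners = {}   # region name -> callback

    def add_focus_listener(self, region, callback):
        """Register a callback fired when the focus cursor enters `region`."""
        self._listeners[region] = callback

    def on_focus(self, region):
        """Dispatch a focus event; return the callback's result, or None."""
        cb = self._listeners.get(region)
        return cb(region) if cb else None

events = []
d = FocusDispatcher()
for r in ("middle", "edge", "lower_left", "lower_right"):
    d.add_focus_listener(r, lambda region: events.append(region) or region)

assert d.on_focus("lower_right") == "lower_right"   # listener fired
assert d.on_focus("unmapped") is None               # no listener, no action
assert events == ["lower_right"]
```

In an actual VR browser the callbacks would show or hide the reminder controls described above; here they just record the region for illustration.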
In a second aspect, the present application provides a focus operation method, which specifically includes the following steps: when the user focus cursor moves to a target area of the web page, determining a stay condition of the user focus cursor on the target area, wherein the target area is used for representing a response position of the user focus cursor on the web page; judging whether the stay condition satisfies a preset condition; and when the stay condition satisfies the preset condition, operating the target area with the target action corresponding to the user focus cursor, so as to complete the user's interactive operation in the web page.
In some embodiments, in determining the stay condition of the user focus cursor on the target area, the method further comprises: detecting a first stay duration of the user focus cursor in the target area when the user focus cursor moves to the target area of the web page; and when the first stay duration is greater than or equal to the preset stay duration, determining that the stay condition of the user focus cursor on the target area satisfies the preset condition.
In some embodiments, in operating the target area with the target action corresponding to the user focus cursor to complete the user's interactive operation in the web page, the method further comprises: when the first stay duration is greater than or equal to the preset stay duration, displaying a menu reminder control in the web page, wherein the menu reminder control is close to the user focus cursor and is used for prompting the user to open a menu; detecting a first movement position of the user focus cursor; judging whether the first movement position is in the area corresponding to the menu reminder control; if the first movement position is in the area corresponding to the menu reminder control, detecting a second stay duration of the user focus cursor at the first movement position; and when the second stay duration satisfies the preset condition, displaying a menu interface at a preset position of the web page, wherein the menu interface comprises at least one functional control, and the functional control is used for implementing page turning and/or sliding operations on the web page.
In some embodiments, the method further includes: if the first moving position is outside the area corresponding to the menu reminding control, detecting a third stay time length of the user focus cursor at the first moving position; and when the third stay time length meets the preset condition, canceling the display of the menu reminding control.
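The two paragraphs above describe a small state machine: after the first dwell, the reminder control appears near the cursor; a second dwell inside the control opens the menu, while a dwell outside it dismisses the control. The sketch below illustrates that flow under assumed state names and dwell thresholds, none of which come from this application.

```python
IDLE, REMINDER_SHOWN, MENU_OPEN = "idle", "reminder", "menu"

class MenuReminder:
    """State machine for the menu reminding control described above."""

    def __init__(self, dwell_s: float = 1.0):
        self.dwell_s = dwell_s          # assumed second/third stay threshold
        self.state = IDLE
        self._since = 0.0               # when the current dwell started
        self._last_inside = None        # was the cursor inside the reminder?

    def show_reminder(self, now_s: float):
        """First stay time length met: display the reminder near the cursor."""
        self.state = REMINDER_SHOWN
        self._since = now_s
        self._last_inside = None

    def on_cursor(self, inside_reminder: bool, now_s: float) -> str:
        """Second dwell inside opens the menu; dwell outside cancels the reminder."""
        if self.state != REMINDER_SHOWN:
            return self.state
        if inside_reminder != self._last_inside:
            self._last_inside = inside_reminder
            self._since = now_s         # region changed: restart the dwell timer
        elif now_s - self._since >= self.dwell_s:
            self.state = MENU_OPEN if inside_reminder else IDLE
        return self.state
```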
In some embodiments, the menu interface further includes a zoom-in control and a zoom-out control, and the method further includes: when the user focus cursor moves to the area corresponding to the zoom-in control, controlling the webpage to perform a zoom-in operation; and when the user focus cursor moves to the area corresponding to the zoom-out control, controlling the webpage to perform a zoom-out operation.
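The hover-zoom behavior above can be reduced to a scale update triggered by which control the cursor is over. The step size and clamping range below are assumptions for illustration; the application does not specify them.

```python
def apply_zoom(scale: float, control: str, step: float = 0.1) -> float:
    """Return the new page scale after the cursor dwells on a zoom control."""
    if control == "zoom_in":
        scale += step
    elif control == "zoom_out":
        scale -= step
    # clamp to an assumed sensible range so repeated dwells cannot
    # shrink or enlarge the page without bound
    return max(0.5, min(3.0, scale))
```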
In some embodiments, in operating the target area with the target action corresponding to the user focus cursor to complete the user's interactive operation in the webpage, the method further includes: detecting whether the target area is located in a first edge area of the webpage, wherein the first edge area is a middle area in the width direction of the webpage; when the target area is located in the first edge area, determining the moving direction of the user focus cursor in the target area; and controlling the webpage to scroll in the corresponding direction according to the moving direction.
In some embodiments, before determining the moving direction of the user focus cursor within the target area, the method further includes: detecting a fourth stay time length of the user focus cursor in the first edge area; and if the fourth stay time length meets the preset condition, displaying a scroll reminding control in the webpage, wherein the scroll reminding control is displayed close to the user focus cursor and is used for prompting the user to perform a page scrolling operation.
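The edge-scroll step above maps the cursor's movement direction inside the first edge area to a scroll direction. A minimal sketch, assuming a vertical coordinate whose value grows downward; coordinates and the scroll amount are illustrative assumptions.

```python
def scroll_direction(prev_y: float, cur_y: float, in_first_edge: bool) -> str:
    """Map cursor movement in the first edge area to a page scroll direction."""
    if not in_first_edge or cur_y == prev_y:
        return "none"                       # no scroll outside the edge area
    # y grows downward in this assumed coordinate system
    return "down" if cur_y > prev_y else "up"
```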
In some embodiments, in operating the target area with the target action corresponding to the user focus cursor to complete the user's interactive operation in the webpage, the method further includes: detecting whether the target area is located in a second edge area of the webpage, wherein the second edge area is an edge area in the height direction of the webpage; and when the target area is located in the second edge area, displaying a page turning prompt control on the webpage, wherein the page turning prompt control is displayed close to the user focus cursor and is used for prompting the user to perform a page turning operation.
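The page-turn prompt above depends only on whether the cursor's position falls in the second edge area. The sketch below assumes that area consists of strips along the page's height-direction edges and that the near half maps to the previous page; the strip width and the previous/next mapping are assumptions, not details from this application.

```python
def in_second_edge(y: float, page_height: float, edge_ratio: float = 0.1) -> bool:
    """True when y lies in an assumed edge strip of the page's height direction."""
    edge = page_height * edge_ratio
    return y <= edge or y >= page_height - edge

def page_turn_prompt(y: float, page_height: float) -> str:
    """Return which page turning prompt to show near the cursor, if any."""
    if not in_second_edge(y, page_height):
        return "none"
    return "previous_page" if y < page_height / 2 else "next_page"
```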
The focus operation method in the second aspect of the present application may be applied to the virtual reality device in the first aspect and is specifically implemented by the controller in the virtual reality device, so the beneficial effects of the focus operation method in the second aspect are the same as those of the virtual reality device in the first aspect and are not described again herein.
As can be seen from the above scheme, the present application provides a virtual reality device and a focus operation method. When a user opens a webpage with a VR browser, if the webpage content exceeds the screen, the font of the webpage content is small, or multiple webpages are open, the user does not need to frequently press buttons with a finger to complete click operations; simply moving the user focus cursor determines whether to perform further operations and the corresponding subsequent webpage operations. This improves the user's experience of using the virtual reality device.
The foregoing detailed description of the embodiments merely illustrates the general principles of the present application and should not be taken as limiting its scope in any way. Any other embodiment developed by those skilled in the art in accordance with the present application without inventive effort falls within the protection scope of the present application.

Claims (10)

1. A virtual reality device, comprising:
a display configured to display a virtual user interface for displaying a web page;
a gesture sensor configured to detect a user focus cursor;
a controller configured to:
determining a stay condition of the user focus cursor on a target area of the webpage when the user focus cursor moves to the target area, wherein the target area is used for representing a response position of the user focus cursor on the webpage;
judging whether the stay condition meets a preset condition or not; and when the stay condition meets a preset condition, operating the target area by utilizing a target action corresponding to the user focus cursor so as to finish the interactive operation of the user in the webpage.
2. The virtual reality device of claim 1, wherein the controller, when performing determining a dwell condition of the user focus cursor on the target area, is further configured to:
detecting a first stay time length of the user focus cursor staying in a target area of the webpage when the user focus cursor moves to the target area;
and when the first stay time length is greater than or equal to a preset stay time length, determining that the stay time length of the user focus cursor on the target area meets the preset condition.
3. The virtual reality device of claim 1, wherein the controller is further configured to, when performing the operation of the target area with the target action corresponding to the user focus cursor to complete the user interaction in the web page:
when the first stay time length is greater than or equal to a preset stay time length, displaying a menu reminding control in the webpage, wherein the menu reminding control is close to the user focus cursor and is used for prompting a user to open a menu;
detecting a first movement position of the user focus cursor;
judging whether the first moving position is in a region corresponding to the menu reminding control;
if the first moving position is in the area corresponding to the menu reminding control, detecting a second stay time of the user focus cursor at the first moving position;
and when the second stay time length meets the preset condition, displaying a menu interface at a preset position of the webpage, wherein the menu interface comprises at least one functional control, and the functional control is used for realizing page turning and/or sliding operation of the webpage.
4. The virtual reality device of claim 3, wherein the controller is further configured to:
if the first moving position is outside the area corresponding to the menu reminding control, detecting a third stay time of the user focus cursor at the first moving position;
and when the third stay time length meets the preset condition, canceling the display of the menu reminding control.
5. The virtual reality device of claim 4, wherein the menu interface further comprises a zoom-in control and a zoom-out control, the controller further configured to:
when the user focus cursor moves to the area corresponding to the zoom-in control, controlling the webpage to perform webpage zoom-in operation;
and when the user focus cursor moves to the area corresponding to the reduction control, controlling the webpage to carry out webpage reduction operation.
6. The virtual reality device of claim 1, wherein the controller is further configured to, when performing the operation of the target area with the target action corresponding to the user focus cursor to complete the user interaction in the web page:
detecting whether the target area is positioned in a first edge area of the webpage, wherein the first edge area is a middle area in the width direction of the webpage;
determining the moving direction of the user focus cursor in the target area when the target area is positioned in the first edge area;
and controlling the webpage to scroll in the corresponding direction according to the moving direction.
7. The virtual reality device of claim 6, wherein the controller, prior to performing determining the direction of movement of the user focus cursor within the target area, is further configured to:
detecting a fourth stay time of the user focus cursor in the first edge area;
and if the fourth stay time length meets the preset condition, displaying a scroll reminding control in the webpage, wherein the scroll reminding control is close to the focus cursor of the user, and the scroll reminding control is used for prompting the user to perform page scrolling operation.
8. The virtual reality device of claim 1, wherein the controller is further configured to, when performing the operation of the target area with the target action corresponding to the user focus cursor to complete the user interaction in the web page:
detecting whether the target area is positioned in a second edge area of the webpage, wherein the second edge area is an edge area in the height direction of the webpage;
when the target area is located in the second edge area, a page turning prompt control is displayed on the webpage, the page turning prompt control is close to the user focus cursor, and the page turning prompt control is used for prompting the user to perform page turning operation.
9. A focus operation method, characterized in that the method comprises the following steps:
when a user focus cursor moves to a target area of the webpage, determining a stay condition of the user focus cursor on the target area, wherein the target area is used for representing a response position of the user focus cursor on the webpage;
judging whether the stay condition meets a preset condition or not; and when the stay condition meets a preset condition, operating the target area by utilizing a target action corresponding to the user focus cursor so as to finish the interactive operation of the user in the webpage.
10. The method of claim 9, wherein the determining a dwell condition of the user focus cursor on the target area is performed, the method further comprising:
detecting a first stay time length of the user focus cursor staying in a target area of the webpage when the user focus cursor moves to the target area;
and when the first stay time length is greater than or equal to a preset stay time length, determining that the stay time length of the user focus cursor on the target area meets the preset condition.
CN202210087209.0A 2022-01-25 2022-01-25 Virtual reality equipment and focus operation method Pending CN116540905A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210087209.0A CN116540905A (en) 2022-01-25 2022-01-25 Virtual reality equipment and focus operation method


Publications (1)

Publication Number Publication Date
CN116540905A true CN116540905A (en) 2023-08-04

Family

ID=87444055




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination