EP3474118A1 - Controlling field of view - Google Patents

Controlling field of view

Info

Publication number
EP3474118A1
Authority
EP
European Patent Office
Prior art keywords
view
field
preset
display area
target application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP18201994.3A
Other languages
German (de)
French (fr)
Inventor
Xingsheng LIN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Publication of EP3474118A1 publication Critical patent/EP3474118A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory with means for controlling the display position
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/445Program loading or initiating
    • G06F9/44505Configuring for program initiating, e.g. using registry, configuration files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/14Display of multiple viewports
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37Details of the operation on graphic patterns
    • G09G5/373Details of the operation on graphic patterns for modifying the size of the graphic pattern
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0141Head-up displays characterised by optical features characterised by the informative content of the display
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04804Transparency, e.g. transparent or translucent windows
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0613The adjustment depending on the type of the information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/04Changes in size, position or resolution of an image
    • G09G2340/045Zooming at least part of an image, i.e. enlarging it or shrinking it
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2354/00Aspects of interface with display user
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/18Use of a frame buffer in a display terminal, inclusive of the display panel


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a method and an apparatus for controlling a field of view of an augmented reality (AR) device. The method includes receiving, by an AR device, an instruction for starting a target application; determining, by the AR device, a preset field of view corresponding to the target application in response to receiving the instruction for starting the target application; and adjusting, by the AR device, a current field of view based on the preset field of view.

Description

    TECHNICAL FIELD
  • The present disclosure relates to the field of computer technology, and more particularly to controlling field of view.
  • BACKGROUND
  • In an Augmented Reality (AR) device, a field of view (FOV) is fixed, that is, a display area of a virtual picture in a display screen is fixed.
  • SUMMARY
  • This Summary is provided to introduce a selection of aspects of the present disclosure in a simplified form that are further described below in the Detailed Description.
  • Aspects of the disclosure provide a method of controlling a field of view. The method includes receiving, by an augmented reality (AR) device, an instruction for starting a target application; determining, by the AR device, a preset field of view corresponding to the target application in response to receiving the instruction for starting the target application; and adjusting, by the AR device, a current field of view based on the preset field of view.
  • According to an embodiment, when determining the preset field of view corresponding to the target application, the method may further include determining, by the AR device, a preset display area of a preset virtual picture corresponding to the target application in a display screen; and determining, by the AR device, the preset field of view based on the preset display area.
  • According to another embodiment, when adjusting the current field of view based on the preset field of view, the method may further include adjusting, by the AR device, a current display area based on the preset display area corresponding to the preset field of view, wherein the current display area is a display area of a current virtual picture in the display screen; and determining, by the AR device, the current field of view based on the adjusted current display area.
  • According to yet another embodiment, the method may further include setting, by the AR device, a non-display area in the display screen as a transparent area, wherein the current virtual picture is not displayed in the non-display area.
  • According to yet another embodiment, when determining the preset field of view corresponding to the target application, the method may further include determining, by the AR device, a type of the target application; and querying, by the AR device, the preset field of view corresponding to the type of the target application from pre-stored correspondence data that indicates correspondence between application types and preset fields of view.
  • Aspects of the disclosure also provide an apparatus for controlling a field of view. The apparatus includes a processor and a memory configured to store instructions executable by the processor. The processor is configured to receive, by an augmented reality (AR) device, an instruction for starting a target application; determine a preset field of view corresponding to the target application in response to receiving the instruction for starting the target application; and adjust a current field of view based on the preset field of view.
  • Aspects of the disclosure also provide a non-transitory computer-readable storage medium having stored thereon instructions that, when executed by one or more processors of an AR device, cause the AR device to receive an instruction for starting a target application; determine a preset field of view corresponding to the target application in response to the instruction for starting the target application; and adjust a current field of view based on the preset field of view.
  • It is to be understood that both the foregoing general description and the following detailed description are illustrative and explanatory only and are not restrictive of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments consistent with the present disclosure, and together with the description, serve to explain the principles of the present disclosure.
    • FIG. 1A is a flowchart of a method of controlling field of view according to an exemplary embodiment of the present disclosure.
    • FIG. 1B is a schematic diagram showing a field of view according to an exemplary embodiment of the present disclosure.
    • FIG. 2 is a flowchart illustrating determining a preset field of view corresponding to a target application according to an exemplary embodiment of the present disclosure.
    • FIG. 3 is a flowchart illustrating adjusting a current field of view based on the preset field of view according to an exemplary embodiment of the present disclosure.
    • FIG. 4A is a flowchart illustrating adjusting a current field of view based on the preset field of view according to another exemplary embodiment of the present disclosure.
    • FIG. 4B is a schematic diagram showing a display area and a non-display area in a display screen according to another exemplary embodiment of the present disclosure.
    • FIG. 5 is a flowchart illustrating determining a preset field of view corresponding to a target application according to another exemplary embodiment of the present disclosure.
    • FIG. 6 is a block diagram of an apparatus for controlling a field of view according to an exemplary embodiment of the present disclosure.
    • FIG. 7 is a block diagram of an apparatus for controlling a field of view according to another exemplary embodiment of the present disclosure.
    • FIG. 8 is a block diagram of an apparatus for controlling a field of view according to still another exemplary embodiment of the present disclosure.
    • FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment of the present disclosure.
  • The specific embodiments of the present disclosure, which have been illustrated in the accompanying drawings described above, will be described in detail below. These accompanying drawings and descriptions are not intended to limit the scope of the present disclosure in any manner, but to explain the concepts of the present disclosure to those skilled in the art by reference to specific embodiments.
  • DETAILED DESCRIPTION
  • Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise represented. The implementations set forth in the following description of illustrative embodiments do not represent all implementations consistent with the disclosure. Instead, they are merely examples of apparatuses and methods consistent with the invention as defined in the appended claims.
  • FIG. 1A is a flowchart of a method of controlling a field of view according to an exemplary embodiment; the embodiment may be used for an AR device, including AR glasses, an AR helmet, etc. As shown in FIG. 1A, the method includes the following Steps S101-S102:
    S101: a preset field of view corresponding to a target application is determined in response to receiving an instruction for starting the target application.
  • In an embodiment, the instruction described above may include an instruction generated by a user via touching a predetermined location on the AR device or pressing a button on the AR device, or a voice command from the user to open the target application.
  • In an embodiment, the target application may be an application pre-installed on the mobile device, including but not limited to instant messaging applications such as WeChat, text reading applications, video playing applications, and various game applications. The mobile device is a device associated with the AR device, such as the user's mobile phone, PC, or another terminal with a display screen.
  • In this embodiment, the FOV indicates the angular range of the virtual scene viewable to human eyes through the AR device. In an AR device, the field of view generally refers to the horizontal field of view. In particular, FIG. 1B is a schematic diagram of fields of view according to an exemplary embodiment. As shown in FIG. 1B, ∠AOB is the horizontal field of view and ∠BOC is the vertical field of view. When the field of view is greater than 110°, people usually turn their heads instead of looking sideways to view pictures at the edge of the field of view, which leads to fatigue; therefore, the FOV is usually set to less than 110°.
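  • The patent gives no formula for this relationship, but under a simple pinhole-style viewing model the horizontal field of view ∠AOB follows from the width of the virtual picture and its apparent distance from the eye. A minimal illustrative sketch in Python (the function name and the numeric values are hypothetical, not taken from the patent):

      import math

      def horizontal_fov_degrees(picture_width_m, viewing_distance_m):
          # Angle AOB subtended by a virtual picture of the given width,
          # centred at the given apparent distance from the eye point O.
          return math.degrees(2 * math.atan(picture_width_m / (2 * viewing_distance_m)))

      # A 1.2 m wide virtual picture rendered 1 m away spans about 62 degrees,
      # comfortably below the 110-degree fatigue threshold noted above.
      print(round(horizontal_fov_degrees(1.2, 1.0)))  # 62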
  • In an aspect, the process of determining the preset field of view corresponding to the target application may refer to the following aspects shown in FIG. 2 and FIG. 5, and details are not described here.
  • S102: a current field of view is adjusted based on the preset field of view.
  • In an embodiment, the current field of view indicates an angular range in which the AR device currently displays the virtual scene.
  • In an embodiment, after determining the preset field of view corresponding to the target application to be opened, the current field of view of the AR device may be adjusted based on the preset field of view.
  • In an embodiment, for a text reading application, a video playing application, various game applications, and the like, a large preset display area may be set to enhance the visual effect and improve user experience; while for instant messaging applications such as WeChat, weather forecast applications, step counting applications, time displaying applications, navigation applications, and other auxiliary applications, a small preset display area may be set to avoid blocking the user's sight when walking or driving.
  • It can be seen from the above description that, in this aspect of the present disclosure, the preset field of view corresponding to the target application is determined in response to receiving the instruction for starting the target application, and the current field of view is adjusted based on the preset field of view. In this way, different fields of view may be set according to different target applications, thereby meeting the requirements of different applications for fields of view and improving the intelligence level of the terminal device.
  • FIG. 2 is a flowchart illustrating determining a preset field of view corresponding to a target application according to an exemplary embodiment. As shown in FIG. 2, determining the preset field of view corresponding to the target application in step S101 may include steps S201-S202:
    S201: a preset display area of a preset virtual picture corresponding to the target application in a display screen is determined.
  • In an embodiment, a corresponding display area may be preset for different applications, where setting of the display area may include the size of the display area.
  • In an embodiment, an identifier of the target application may be determined first, and then the preset display area corresponding to that identifier is queried from the pre-stored correspondence between application identifiers and preset display areas.
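  • To picture this lookup, the pre-stored correspondence could be a plain table keyed by application identifier. The sketch below is purely illustrative; the patent does not specify identifiers, area sizes, or a storage format:

      # Hypothetical correspondence between application identifiers and preset
      # display areas, given as (width, height) in pixels.
      PRESET_DISPLAY_AREAS = {
          "com.tencent.mm": (640, 360),       # instant messaging: small area
          "com.example.reader": (1280, 720),  # text reading: large area
      }

      def preset_display_area(app_id, default=(800, 450)):
          # S201: query the preset display area for the target application.
          return PRESET_DISPLAY_AREAS.get(app_id, default)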
  • S202: the preset field of view is determined based on the preset display area.
  • In an embodiment, the preset field of view may be determined according to the size of the preset display area.
  • In an embodiment, the preset field of view is proportional to the size of the preset display area.
  • In an embodiment, the scaling factor between the preset display area and the preset field of view may be determined according to the type or model of the AR device.
  • According to the above technical solution, by determining the preset display area of the preset virtual picture corresponding to the target application in the display screen, and determining the preset field of view based on the preset display area, the preset field of view may be determined reasonably and accurately, which in turn improves the accuracy of adjusting the current field of view based on the preset field of view.
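  • A sketch of this proportional mapping, assuming a per-model scaling factor expressed in degrees of field of view per pixel of display-area width (the factor values and the cap are assumptions, not taken from the patent):

      # Hypothetical device-specific scaling factors between display-area width
      # and field of view.
      SCALE_DEG_PER_PIXEL = {"glasses-v1": 0.07, "helmet-v2": 0.05}

      def area_to_fov(area, device_model, max_fov=110.0):
          # S202: the preset field of view is proportional to the size of the
          # preset display area, capped below the fatigue threshold.
          width, _height = area
          return min(width * SCALE_DEG_PER_PIXEL[device_model], max_fov)

      print(round(area_to_fov((640, 360), "glasses-v1")))  # 45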
  • FIG. 3 is a flowchart illustrating adjusting a current field of view based on the preset field of view according to an exemplary embodiment. As shown in FIG. 3, the current field of view is adjusted based on the preset field of view according to step S102, which may include steps S301-S302.
  • S301: a current display area is adjusted according to the preset display area corresponding to the preset field of view, where the current display area is a display area of a current virtual picture in the display screen;
    In an embodiment, the size of the display area of the current virtual picture in the display screen may be adjusted according to the size of the preset display area corresponding to the preset field of view. For example, the size of the current display area may be set to be the same as the size of the preset display area.
  • S302: the current field of view is determined based on the adjusted current display area.
  • In an embodiment, after determining the size of the adjusted current display area, the current field of view may be determined according to the size of the adjusted current display area.
  • In an embodiment, the current field of view is proportional to the size of the adjusted current display area.
  • In an embodiment, the scaling factor between the adjusted current display area and the current field of view may be determined based on the type or model of the AR device.
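  • Continuing the sketch above and reusing the hypothetical area_to_fov helper, steps S301-S302 might look like this:

      def adjust_current_field_of_view(preset_area, device_model):
          # S301: resize the current display area to match the preset area.
          current_area = preset_area
          # S302: derive the current field of view from the adjusted area.
          return current_area, area_to_fov(current_area, device_model)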
  • According to the foregoing technical solution, the display area of the current virtual picture in the display screen is adjusted according to the preset display area corresponding to the preset field of view, and the current field of view is determined based on the adjusted current display area, so that different fields of view may be set according to different target applications, thereby meeting the requirements of different applications for fields of view and improving the intelligence level of the terminal device.
  • FIG. 4A is a flowchart illustrating adjusting a current field of view based on the preset field of view according to another exemplary embodiment; FIG. 4B is a schematic diagram showing a display area and a non-display area in a display screen according to an exemplary embodiment. As shown in FIG. 4A, the current field of view is adjusted based on the preset field of view according to step S102, which may include steps S401-S403.
  • S401: a current display area is adjusted according to the preset display area corresponding to the preset field of view, where the current display area is a display area of a current virtual picture in the display screen;
    S402: the current field of view is determined based on the adjusted current display area.
  • S403: a non-display area in the display screen is set as a transparent area; where the non-display area is an area in the display screen where the current virtual picture is not displayed.
  • The steps S401-S402 are the same as the steps S301-S302 in the foregoing embodiment of the present disclosure. For related explanations, reference may be made to the foregoing embodiments, and details are not described here.
  • As shown in FIG. 4B, in an embodiment, the process of adjusting the current field of view ∠AOB based on the preset field of view may include setting the non-display area 402 (i.e., an area other than the current display area 401 in the display screen 400) as a transparent area, in addition to adjusting the current display area 401 according to the preset display area. For example, the user may observe the road condition or the like through the transparent non-display area 402 while walking or driving.
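  • One plausible way to realize step S403 on a see-through display is to compose each frame with an alpha channel and leave every pixel outside the current display area 401 fully transparent. A hedged sketch, in which the buffer layout and function names are assumptions:

      import numpy as np

      def compose_frame(screen_w, screen_h, picture_rgb, origin):
          # Start from a fully transparent RGBA frame: the non-display area 402
          # lets the real scene show through.
          frame = np.zeros((screen_h, screen_w, 4), dtype=np.uint8)
          h, w, _ = picture_rgb.shape
          x, y = origin
          frame[y:y + h, x:x + w, :3] = picture_rgb  # current display area 401
          frame[y:y + h, x:x + w, 3] = 255           # opaque only under the picture
          return frame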
  • According to the foregoing technical solution, the display area of the current virtual picture in the display screen is adjusted according to the preset display area corresponding to the preset field of view, the current field of view is determined based on the adjusted current display area, and the non-display area is set to be transparent. On the one hand, different fields of view may be set according to different target applications, thereby satisfying the requirements of different applications for fields of view; on the other hand, the safety of the user using the AR device in an application scenario such as walking or driving may be improved.
  • FIG. 5 is a flowchart illustrating determining a preset field of view corresponding to a target application according to another exemplary embodiment. As shown in FIG. 5, determining the preset field of view corresponding to the target application in the foregoing step S101 may include the following steps S501-S502:
  • S501: a type of the target application is determined;
    In an embodiment, in response to receiving the instruction for starting the target application, the type of the target application is determined. The types of the target applications include, but are not limited to, instant messaging, text reading, video playing, weather forecast, navigation, game, etc.
  • S502: a preset field of view corresponding to the type of the target application is queried from pre-stored correspondence data indicating the correspondence between application types and preset fields of view.
  • In an embodiment, the AR device pre-stores the correspondence data indicating the first correspondence between application types and preset fields of view.
  • In an embodiment, the first correspondence data pre-stored in the AR device is as shown in Table 1 below: Table 1 Correspondence between application types and preset fields of view
    Program type Instant messaging Text reading Navigation Weather forecast Video playing Game
    Preset field of view 45° 45° 45° 45° 90° 90°
  • In an aspect, after the AR device determines the type of the target application (for example, the text reading type), Table 1 may be queried to determine the preset field of view corresponding to that type (i.e., 45°).
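  • Encoded directly in code, the query of the first correspondence data is a plain table lookup. The values below are exactly those of Table 1; the function name and the default are illustrative:

      PRESET_FOV_BY_TYPE = {
          "instant messaging": 45, "text reading": 45, "navigation": 45,
          "weather forecast": 45, "video playing": 90, "game": 90,
      }

      def preset_fov_for(app_type, default=45):
          # S501-S502: determine the application type, then query the
          # pre-stored correspondence data for its preset field of view.
          return PRESET_FOV_BY_TYPE.get(app_type, default)

      print(preset_fov_for("text reading"))  # 45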
  • In an embodiment, the preset field of view corresponding to the current application scenario may be queried from pre-stored second correspondence data according to the current application scenario; where the second correspondence data indicates the correspondence between the current application scenarios and the preset fields of view.
  • For example, in an application scenario such as driving or walking that requires more attention to the external environment, the preset field of view may be set small (for example, 45°) to avoid blocking the sight of the user. In an application scenario of taking the subway, staying indoors, etc., without the need of paying much attention to the external environment, the field of view may be set as large as possible (for example, 90°).
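  • The patent does not state how the first and second correspondence data interact. One plausible policy, sketched below with hypothetical scenario values, is to never exceed the angle that the current scenario allows:

      PRESET_FOV_BY_SCENARIO = {"driving": 45, "walking": 45, "subway": 90, "indoors": 90}

      def effective_fov(app_type, scenario):
          # Combine the two lookups conservatively: take the smaller angle.
          return min(preset_fov_for(app_type), PRESET_FOV_BY_SCENARIO.get(scenario, 90))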
  • In an embodiment, the type of the application and the preset field of view corresponding to the current application scenario may be freely set by the user according to actual conditions.
  • According to the foregoing technical solution, the flexible setting of the field of view according to the application type and/or the application scenario may be realized by querying the appropriate preset field of view according to the pre-stored correspondence data, thereby improving the intelligence level of the terminal and enhancing user experience.
  • FIG. 6 is a block diagram of an apparatus for controlling a field of view according to an exemplary embodiment. As shown in FIG. 6, the apparatus includes a view field determining module 110 and a view field adjusting module 120.
  • The view field determining module 110 is configured to determine a preset field of view corresponding to a target application in response to receiving an instruction for starting the target application.
  • The view field adjusting module 120 is configured to adjust a current field of view based on the preset field of view.
  • It can be seen from the above description that, in response to receiving the instruction for starting the target application, the preset field of view corresponding to the target application is determined, and the current field of view is adjusted based on the preset field of view. In this way, different fields of view are set according to different target applications, thereby meeting the requirements of different applications for fields of view and improving the intelligence level of the terminal device.
  • FIG. 7 is a block diagram of an apparatus for controlling a field of view according to still another exemplary embodiment, where the view field determining module 210 and the view field adjusting module 220 are the same as the view field determining module 110 and the view field adjusting module 120 in the aspect shown in FIG. 6, and are not described repeatedly herein. As shown in FIG. 7, on the basis of the foregoing aspect, the view field determining module 210 may further include:
    • area determining unit 211 configured to determine a preset display area of the preset virtual picture corresponding to the target application in a display screen;
    • preset view field determining unit 212 configured to determine the preset field of view based on the preset display area.
  • In an embodiment, the view field adjusting module 220 may include:
    • area adjusting unit 221 configured to adjust a current display area according to the preset display area corresponding to the preset field of view, where the current display area is a display area of a current virtual picture in the display screen;
    • current view field determining unit 222 configured to determine the current field of view based on the adjusted current display area.
  • In an embodiment, the view field adjusting module 220 may further include an area setting unit 223 which is configured to set a non-display area in the display screen as a transparent area. The non-display area is an area in the display screen where the current virtual picture is not displayed.
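  • As a structural sketch only, the module/unit decomposition of FIG. 7 could be mirrored in code as follows; the class and method names are hypothetical, and the bodies reuse the illustrative helpers above:

      class ViewFieldAdjustingModule:
          # Sketch of module 220 with its units 221-223.
          def __init__(self, device_model):
              self.device_model = device_model

          def adjust(self, preset_area, screen_w, screen_h, picture_rgb, origin):
              # Units 221-222: adjust the display area and derive the field of view.
              area, fov = adjust_current_field_of_view(preset_area, self.device_model)
              # Unit 223: render the non-display area as transparent.
              frame = compose_frame(screen_w, screen_h, picture_rgb, origin)
              return area, fov, frame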
  • FIG. 8 is a block diagram of an apparatus for controlling a field of view according to still another exemplary embodiment; where the view field determining module 310 and the view field adjusting module 320 are the same as the view field determining module 110 and the view field adjusting module 120, and are not repeatedly described here. As shown in FIG. 8, on the basis of the embodiment shown in FIG. 6, the view field determining module 310 may further include:
    • type determining unit 311 configured to determine a type of the target application;
    • view field querying unit 312 configured to query, from pre-stored correspondence data, a preset field of view corresponding to the type of the target application, where the correspondence data indicates the correspondence between the application types and the preset fields of view.
  • According to the foregoing technical solution, the preset field of view corresponding to the type of the target application may be queried from the pre-stored correspondence data based on the type of the target application, so that the field of view may be set according to the application type, improving the intelligence level of the terminal and enhancing user experience.
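  • A minimal sketch of this query path, assuming the pre-stored correspondence data is a simple table from application types to preset fields of view (the type names and angles below are invented for illustration):

```python
# Hypothetical pre-stored correspondence data between application types
# and preset fields of view, in degrees.
CORRESPONDENCE_DATA = {
    "video": 60.0,
    "game": 90.0,
    "navigation": 75.0,
}

def determine_application_type(target_app, app_types):
    # Hypothetical type determining unit 311: resolve the target
    # application's type from a registry of installed applications.
    return app_types[target_app]

def query_preset_field_of_view(app_type, correspondence=CORRESPONDENCE_DATA):
    # Hypothetical view field querying unit 312: query the preset field
    # of view corresponding to the application type.
    return correspondence[app_type]
```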
  • FIG. 9 is a block diagram of an electronic device according to an exemplary embodiment. For example, device 900 may be a mobile phone, a computer, a digital broadcast terminal, a message transceiving device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, and the like.
  • Referring to FIG. 9, a device 900 may include one or more of the following components: a processing component 902, a memory 904, a power supply component 906, a multimedia component 908, an audio component 910, an input/output (I/O) interface 912, a sensor component 914, and a communication component 916.
  • The processing component 902 typically controls the overall operation of the device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 902 may include one or more processors 920 to execute instructions so as to perform all or part of the blocks of the above methods. Moreover, the processing component 902 may include one or more modules to facilitate interaction between the processing component 902 and other components. For example, the processing component 902 may include a multimedia module to facilitate interaction between the multimedia component 908 and the processing component 902.
  • Memory 904 is configured to store various types of data to support operation of device 900. Examples of such data include instructions for any application or method operating on device 900, contact data, phone book data, messages, pictures, videos, and the like. The memory 904 may be implemented by any type of volatile or non-volatile storage device, or a combination thereof, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, or optical disk.
  • Power supply component 906 provides power to various components of device 900. The power supply component 906 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 900.
  • The multimedia component 908 includes a screen that provides an output interface between the device 900 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide operation, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 908 includes a front camera and/or a rear camera. When the device 900 is in an operating mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each of the front and rear cameras may be a fixed optical lens system or may have focal length and optical zoom capability.
  • The audio component 910 is configured to output and/or input an audio signal. For example, the audio component 910 includes a microphone (MIC) that is configured to receive an external audio signal when the device 900 is in an operating mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may be further stored in memory 904 or transmitted via communication component 916. In some embodiments, the audio component 910 also includes a speaker for outputting an audio signal.
  • The I/O interface 912 provides an interface between the processing component 902 and a peripheral interface module, which may be a keyboard, a click wheel, a button, or the like. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
  • The sensor component 914 includes one or more sensors for evaluating the state of various aspects of the device 900. For example, the sensor component 914 may detect the on/off state of the device 900 and the relative positioning of components, such as the display and keypad of the device 900. The sensor component 914 may also detect a change in position of the device 900 or of one component of the device 900, the presence or absence of user contact with the device 900, the orientation or acceleration/deceleration of the device 900, and a temperature variation of the device 900. The sensor component 914 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor component 914 may also include an optical sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 914 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
  • The communication component 916 is configured to facilitate wired or wireless communication between the device 900 and other devices. The device 900 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 916 receives a broadcast signal or broadcast-associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 916 also includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wide Band (UWB) technology, Bluetooth (BT) technology, and other technologies.
  • In an exemplary embodiment, the device 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic components to perform the above-described method of controlling a field of view.
  • In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium comprising instructions, such as the memory 904 comprising instructions executable by the processor 920 of the device 900 to perform the above-described method of controlling a field of view. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, or an optical data storage device.
  • It is noted that the various modules, sub-modules, units, and components in the present disclosure can be implemented using any suitable technology. For example, a module may be implemented using circuitry, such as an integrated circuit (IC). As another example, a module may be implemented as a processing circuit executing software instructions.
  • Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art. It is intended that the specification and examples be considered as illustrative only, with the true scope of the invention being indicated by the following claims.
  • It will be appreciated that the present disclosure is not limited to the exact construction that has been described above and illustrated in the accompanying drawings, and that various modifications and changes can be made without departing from the scope thereof. It is intended that the scope of the invention only be limited by the appended claims.

Claims (15)

  1. A method of controlling a field of view, characterized in that, the method comprises:
    receiving, by an augmented reality (AR) device, an instruction for starting a target application;
    determining, by the AR device, a preset field of view corresponding to the target application in response to receiving the instruction for starting the target application; and
    adjusting, by the AR device, a current field of view based on the preset field of view.
  2. The method according to claim 1, characterized in that, determining the preset field of view corresponding to the target application comprises:
    determining, by the AR device, a preset display area of a preset virtual picture corresponding to the target application in a display screen; and
    determining, by the AR device, the preset field of view based on the preset display area.
  3. The method of claim 2, characterized in that, adjusting the current field of view based on the preset field of view comprises:
    adjusting, by the AR device, a current display area based on the preset display area corresponding to the preset field of view, wherein the current display area is a display area of a current virtual picture in the display screen; and
    determining, by the AR device, the current field of view based on the adjusted current display area.
  4. The method of claim 3, characterized in that, the method further comprises:
    setting, by the AR device, a non-display area in the display screen as a transparent area, wherein the current virtual picture is not displayed in the non-display area.
  5. The method of claim 1, characterized in that, determining the preset field of view corresponding to the target application comprises:
    determining, by the AR device, a type of the target application; and
    querying, by the AR device, the preset field of view corresponding to the type of the target application from pre-stored correspondence data that indicates correspondence between application types and preset fields of view.
  6. An apparatus for controlling a field of view, characterized in that, the apparatus comprises:
    a processor (920); and
    a memory (904) configured to store instructions executable by the processor,
    wherein the processor is configured to:
    receive, by an augmented reality (AR) device, an instruction for starting a target application;
    determine a preset field of view corresponding to the target application in response to receiving the instruction for starting the target application; and
    adjust a current field of view based on the preset field of view.
  7. The apparatus according to claim 6, characterized in that, when determining the preset field of view corresponding to the target application, the processor (920) is further configured to:
    determine a preset display area of a preset virtual picture corresponding to the target application in a display screen; and
    determine the preset field of view based on the preset display area.
  8. The apparatus according to claim 7, characterized in that, when adjusting the current field of view based on the preset field of view, the processor (920) is further configured to:
    adjust a current display area based on the preset display area corresponding to the preset field of view, wherein the current display area is a display area of a current virtual picture in the display screen; and
    determine the current field of view based on the adjusted current display area.
  9. The apparatus according to claim 8, characterized in that, the processor (920) is further configured to:
    set a non-display area in the display screen as a transparent area, wherein the current virtual picture is not displayed in the non-display area.
  10. The apparatus according to claim 6, characterized in that, when determining the preset field of view corresponding to the target application, the processor (920) is further configured to:
    determine a type of the target application; and
    query the preset field of view corresponding to the type of the target application from pre-stored correspondence data that indicates correspondence between application types and preset fields of view.
  11. A non-transitory computer-readable storage medium having stored thereon instructions, characterized in that, when executed by one or more processors of an augmented reality (AR) device, the instructions cause the AR device to:
    receive an instruction for starting a target application;
    determine a preset field of view corresponding to the target application in response to the instruction for starting the target application; and
    adjust a current field of view based on the preset field of view.
  12. The non-transitory computer-readable storage medium of claim 11, characterized in that, when determining the preset field of view corresponding to the target application, the instructions further cause the AR device to:
    determine a preset display area of a preset virtual picture corresponding to the target application in a display screen; and
    determine the preset field of view based on the preset display area.
  13. The non-transitory computer-readable storage medium of claim 12, characterized in that, when adjusting the current field of view based on the preset field of view, the instructions further cause the AR device to:
    adjust a current display area based on the preset display area corresponding to the preset field of view, wherein the current display area is a display area of a current virtual picture in the display screen; and
    determine the current field of view based on the adjusted current display area.
  14. The non-transitory computer-readable storage medium of claim 13, characterized in that, the instructions further cause the AR device to:
    set a non-display area in the display screen as a transparent area, wherein the current virtual picture is not displayed in the non-display area.
  15. The non-transitory computer-readable storage medium of claim 11, characterized in that, when determining the preset field of view corresponding to the target application, the instructions further cause the AR device to:
    determine a type of the target application; and
    query the preset field of view corresponding to the type of the target application from pre-stored correspondence data that indicates correspondence between application types and preset fields of view.
EP18201994.3A 2017-10-23 2018-10-23 Controlling field of view Ceased EP3474118A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710992242.7A CN107797662B (en) 2017-10-23 2017-10-23 Viewing angle control method and device and electronic equipment

Publications (1)

Publication Number Publication Date
EP3474118A1 true EP3474118A1 (en) 2019-04-24

Family

ID=61533601

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18201994.3A Ceased EP3474118A1 (en) 2017-10-23 2018-10-23 Controlling field of view

Country Status (3)

Country Link
US (1) US11024264B2 (en)
EP (1) EP3474118A1 (en)
CN (1) CN107797662B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108650461B (en) * 2018-05-11 2020-09-08 普宙飞行器科技(深圳)有限公司 Control method, device and equipment for variable field angle camera holder
CN111695459B (en) * 2020-05-28 2023-04-18 腾讯科技(深圳)有限公司 State information prompting method and related equipment
CN112581561A (en) * 2020-12-25 2021-03-30 深圳市元征科技股份有限公司 Method and device for creating pin view, electronic equipment and storage medium
CN114415368B (en) * 2021-12-15 2023-05-12 青岛歌尔声学科技有限公司 Regulation and control method and device of VR equipment, system and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL214894A (en) * 2011-08-30 2016-09-29 Rafael Advanced Defense Systems Ltd Combination of narrow- and wide-view images
WO2016017997A1 (en) * 2014-07-31 2016-02-04 Samsung Electronics Co., Ltd. Wearable glasses and method of providing content using the same
US9489044B2 (en) * 2014-11-07 2016-11-08 Eye Labs, LLC Visual stabilization system for head-mounted displays
CN105786340A (en) * 2014-12-23 2016-07-20 中兴通讯股份有限公司 Terminal screen display method and apparatus
US10049495B2 (en) * 2015-01-14 2018-08-14 Hashplay Inc. System and method for providing virtual reality content
CN105704478B (en) * 2015-08-31 2017-07-18 深圳超多维光电子有限公司 Stereo display method, device and electronic equipment for virtual and reality scene

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3023872A2 (en) * 2014-11-18 2016-05-25 Samsung Electronics Co., Ltd. Method of controlling screen and electronic device for processing same
US20170256096A1 (en) * 2016-03-07 2017-09-07 Google Inc. Intelligent object sizing and placement in a augmented / virtual reality environment

Also Published As

Publication number Publication date
US11024264B2 (en) 2021-06-01
US20190122641A1 (en) 2019-04-25
CN107797662A (en) 2018-03-13
CN107797662B (en) 2021-01-01

Similar Documents

Publication Publication Date Title
US11315336B2 (en) Method and device for editing virtual scene, and non-transitory computer-readable storage medium
EP3182716A1 (en) Method and device for video display
US9661390B2 (en) Method, server, and user terminal for sharing video information
US20170344192A1 (en) Method and device for playing live videos
EP3474118A1 (en) Controlling field of view
US20170154604A1 (en) Method and apparatus for adjusting luminance
EP3136699A1 (en) Method and device for connecting external equipment
EP3145164A1 (en) Method and device for displaying answer extension function
CN107888984B (en) Short video playing method and device
US20180144546A1 (en) Method, device and terminal for processing live shows
EP3599581A1 (en) Method and apparatus for displaying a commodity
EP3462367B1 (en) Method and apparatus for displaying application interface
US10705729B2 (en) Touch control method and apparatus for function key, and storage medium
CN108829475B (en) UI drawing method, device and storage medium
CN104503657A (en) Image processing method and device
CN107656616B (en) Input interface display method and device and electronic equipment
US9619016B2 (en) Method and device for displaying wallpaper image on screen
CN109389547B (en) Image display method and device
CN106604088B (en) Method, device and equipment for processing data in buffer area
CN111246012A (en) Application interface display method and device and storage medium
CN111538447A (en) Information display method, device, equipment and storage medium
CN111081143B (en) Display control method, display control device, electronic equipment and computer-readable storage medium
CN110955328B (en) Control method and device of electronic equipment and storage medium
CN108596719B (en) Image display method and device
US11064307B2 (en) Electronic device and method of outputting audio

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20191022

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210129

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED

18R Application refused

Effective date: 20221109