WO2022161119A1 - A Display Method and Electronic Device (一种显示方法及电子设备)

Info

Publication number: WO2022161119A1
Application number: PCT/CN2022/070132
Authority: WO (WIPO, PCT)
Prior art keywords: electronic device, display area, application, display, display screen
Other languages: English (en), French (fr)
Inventors: 高凌云, 史雪菲, 赵博
Applicant / original assignee: 华为技术有限公司 (Huawei Technologies Co., Ltd.)

Classifications

    • G06F 9/451: Execution arrangements for user interfaces (G Physics; G06 Computing, calculating or counting; G06F Electric digital data processing; G06F 9/00 Arrangements for program control; G06F 9/44 Arrangements for executing specific programs)
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance (G06F 3/00 Input/output arrangements; G06F 3/01 Interaction between user and computer; G06F 3/048 GUI-based interaction techniques)
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures (under G06F 3/0487, interaction techniques using specific features provided by the input device)

Definitions

  • the present application relates to the technical field of electronic devices, and in particular, to a display method and electronic device.
  • In the process of using electronic devices such as smart phones and tablet computers, users often need to frequently switch the applications (apps) displayed on the electronic devices. For example, users often need to switch to a social app to reply to messages while using a gaming app for entertainment.
  • Users can close or hide the application currently displayed on the electronic device through functions such as three-key navigation and gesture navigation, and then open the other application to be used so that the electronic device displays the user interface of that application. This operation is complicated and the response is slow.
  • The user can also make the electronic device display the user interfaces of multiple applications through the split-screen and floating-window functions, but in this way the user interface of each application is small, the display effect is poor, and the applications are inconvenient to use.
  • The embodiments of the present application disclose a display method and an electronic device, which can quickly and conveniently switch the displayed applications while maintaining a good display effect for each application.
  • an embodiment of the present application provides a display method, which is applied to an electronic device.
  • The electronic device includes a first display area and a second display area, and the method includes: receiving a first user operation; in response to the first user operation, the electronic device determines that the first display area is associated with a first application; when the electronic device is in a first physical state, displaying the user interface of the first application through the first display area; receiving a second user operation; in response to the second user operation, the electronic device displays the user interface of a second application through the first display area, where the first application is different from the second application; when the user interface of the second application is displayed through the first display area, in response to a third user operation, the physical state of the electronic device changes from the first physical state to a second physical state, and the electronic device displays the user interface of the second application through the second display area; and when the user interface of the second application is displayed through the second display area, in response to the physical state of the electronic device changing from the second physical state back to the first physical state, the electronic device displays the user interface of the first application through the first display area.
  • That is to say, the user can quickly switch the application displayed on the electronic device by changing the physical state of the electronic device, without having to exit or hide the currently displayed application and reopen the desired application each time, which greatly facilitates use.
  • Because the electronic device uses different display areas to display the user interface in different physical states, the existing layout of the application interface is not changed, and the display effect remains good. A minimal code sketch of this state-driven switching is given below.
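  • The following is a minimal, hypothetical sketch of this idea in Kotlin. The names DisplayRouter, DisplayArea, PhysicalState, and the show callback are placeholders invented for illustration; they are not taken from the patent or from any real API, and the behaviour is a simplified assumption.

```kotlin
// Hypothetical sketch of display-area/application association and
// physical-state-driven switching. All names are illustrative only.
enum class DisplayArea { FIRST, SECOND }
enum class PhysicalState { FIRST, SECOND }

class DisplayRouter(private val show: (DisplayArea, String) -> Unit) {
    // User-configured association: display area -> application id.
    private val association = mutableMapOf<DisplayArea, String>()
    // The application currently in the foreground, with the state it is shown in.
    private var current: Pair<PhysicalState, String>? = null

    // "First user operation": associate a display area with an application.
    fun associate(area: DisplayArea, appId: String) {
        association[area] = appId
    }

    // "Second user operation": another application is brought to the foreground
    // in the display area that is active for the given physical state.
    fun showForeground(state: PhysicalState, appId: String) {
        current = state to appId
        show(areaFor(state), appId)
    }

    // Physical state change (fold, unfold, or flip): prefer the application
    // associated with the newly active area; if the area has no association,
    // carry the current foreground application over to it.
    fun onPhysicalStateChanged(newState: PhysicalState) {
        val area = areaFor(newState)
        val appId = association[area] ?: current?.second ?: return
        current = newState to appId
        show(area, appId)
    }

    private fun areaFor(state: PhysicalState) =
        if (state == PhysicalState.FIRST) DisplayArea.FIRST else DisplayArea.SECOND
}
```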
  • the first display area includes at least part of the second display area; and/or the second display area includes at least part of the first display area.
  • the first display area and the second display area belong to the same display screen of the electronic device.
  • the first display area and the second display area have overlapping display areas, or the first display area is a partial display area of the second display area, or the second display area is a partial display area of the first display area.
  • the electronic device is a foldable electronic device; when the electronic device is in an unfolded state, the first display area and the second display area are on the same plane; when the electronic device is in a folded state , the light emitting surface of the first display area and the light emitting surface of the second display area are opposite to each other.
  • the electronic device includes a first display screen, the first display area is at least part of the display area of the first display screen, and the second display area is at least part of the display area of the first display screen.
  • the electronic device includes a first display screen and a second display screen, the first display area is a display area of the first display screen, and the second display area is a display area of the second display screen.
  • The first physical state and the second physical state are both folded states, and the third user operation is the user's operation of turning over the electronic device.
  • the electronic device is a foldable electronic device, the electronic device includes a first display screen, the second display area is a full-screen display area of the first display screen, and the first display area is the In a partial display area of the first display screen, the first physical state is a folded state, and the second physical state is an unfolded state.
  • the electronic device is a foldable electronic device, and when the electronic device is in an unfolded state, the light-emitting surface of the first display area and the light-emitting surface of the second display area are opposite to each other; the first physical The state is the unfolded state, the second physical state is the folded state, and the third user operation is the user's operation to convert the electronic device from the unfolded state to the folded state.
  • the electronic device is a foldable electronic device, and when the electronic device is in an unfolded state, the light-emitting surface of the first display area and the light-emitting surface of the second display area are opposite to each other; the first physical The state and the second physical state are both the unfolded state, and the third user operation is the user's operation of turning over the electronic device.
  • The first display area and the second display area may have various shapes, so the display method of the present application can be applied to foldable electronic devices of various shapes; it has a wide range of application scenarios and high usability.
  • the method further includes: when the user interface of the first application is displayed through the first display area, changing from the first physical state to the second physical state in response to the physical state of the electronic device physical state, the electronic device displays the user interface of the first application through the second display area; when the electronic device displays the user interface of the first application through the second display area, a fourth user operation is received; in response to the above In the fourth user operation, the electronic device displays a user interface of a third application through the second display area, and the third application is different from the first application.
  • the application displayed when the electronic device is in the first physical state can be continuously displayed through the second display area. Therefore, the user can continue to use the application displayed when the electronic device is in the first physical state, the user does not need to manually open the application again, and the data of the application will not be lost, which is more convenient for the user to use.
  • The method further includes: in response to the physical state of the electronic device changing from the first physical state to the second physical state, the electronic device displays the user interface of the second application through the second display area.
  • the application displayed when the electronic device was last in the second physical state can be displayed through the second display area. Therefore, the user can continue to use the application displayed when the electronic device was in the second physical state last time, and the user does not need to manually open the application again, and the data of the application will not be lost, which is more convenient for the user to use.
  • the above-mentioned method further includes: receiving a fifth user operation; in response to the above-mentioned fifth user operation, the above-mentioned electronic device determines that the above-mentioned second display area is associated with the fourth application; When displaying the user interface of the first application, in response to the physical state of the electronic device changing from the first physical state to the second physical state, the electronic device displays the user interface of the fourth application through the second display area, The above-mentioned first application is different from the above-mentioned fourth application.
  • The user can set different display areas to be associated with different applications. If the user frequently needs to use the first application and the fourth application, the user can switch the physical state of the electronic device to quickly switch the applications displayed on the electronic device without further manual operations, which greatly facilitates use. A short usage sketch of such a two-way association follows.
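  • As a usage sketch (continuing the hypothetical DisplayRouter above, with made-up package names), associating both areas means that changing the physical state alone is enough to switch between the two applications:

```kotlin
// Hypothetical usage: both display areas are pre-associated, so switching the
// physical state toggles the displayed application without manual operations.
fun demo() {
    val router = DisplayRouter { area, app -> println("show $app on $area") }
    router.associate(DisplayArea.FIRST, "com.example.game")   // "first application"
    router.associate(DisplayArea.SECOND, "com.example.chat")  // "fourth application"
    router.onPhysicalStateChanged(PhysicalState.FIRST)   // game shown on the first area
    router.onPhysicalStateChanged(PhysicalState.SECOND)  // chat shown on the second area
    router.onPhysicalStateChanged(PhysicalState.FIRST)   // game shown on the first area again
}
```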
  • When the first user operation is received, the electronic device is in the first physical state and displays the user interface of the first application through the first display area; the first user operation is a user operation acting on the display area in which the user interface of the first application is displayed.
  • the user can directly set the association between the first display area and the first application when the electronic device displays the user interface of the first application through the first display area, without having to go to a specific interface for setting, which is more convenient for the user to use.
  • An embodiment of the present application provides a graphical user interface (GUI) display method, which is applied to an electronic device, where the electronic device includes a first display area and a second display area.
  • An embodiment of the present application provides an electronic device, where the electronic device includes a first display area, a second display area, one or more memories, and one or more processors; the one or more memories are used to store a computer program, and the one or more processors are used to call the computer program; the computer program includes instructions, and when the instructions are executed by the one or more processors, the electronic device is caused to perform the display method provided by the first aspect or any implementation manner of the first aspect.
  • An embodiment of the present application provides a computer storage medium, including a computer program, where the computer program includes instructions, and when the instructions are run on a processor, the display method provided by the first aspect or any implementation manner of the first aspect is performed.
  • an embodiment of the present application provides a computer program product that, when the computer program product runs on an electronic device, causes the electronic device to execute the display method provided by the first aspect and any implementation manner of the first aspect .
  • an embodiment of the present application provides a chip, the chip includes at least one processor and an interface circuit, and optionally, the chip further includes a memory; the above-mentioned memory, the above-mentioned interface circuit and the above-mentioned at least one processor are interconnected through a line , a computer program is stored in the at least one memory; when the computer program is executed by the at least one processor, the display method provided by the first aspect or any one of the implementation manners of the first aspect is implemented.
  • The electronic device provided in the third aspect, the computer storage medium provided in the fourth aspect, the computer program product provided in the fifth aspect, and the chip provided in the sixth aspect are all used to execute the display method provided by the first aspect or any implementation manner of the first aspect. Therefore, for the beneficial effects that can be achieved, reference may be made to the beneficial effects of the display method provided in the first aspect, which will not be repeated here.
  • FIG. 1 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a software architecture of an electronic device provided by an embodiment of the present application.
  • FIGS. 3-5 are schematic diagrams of physical states of some electronic devices provided by embodiments of the present application.
  • FIGS. 6-17 are schematic diagrams of some human-computer interactions provided by embodiments of the present application.
  • FIGS. 23-26 are schematic flowcharts of hardware driver interaction inside some electronic devices provided by embodiments of the present application.
  • FIG. 27 is a schematic flowchart of a display method provided by an embodiment of the present application.
  • the present application provides a display method, which can be applied to an electronic device, and the electronic device can include a first display area and a second display area.
  • When the physical state of the electronic device changes, the display area used by the electronic device changes accordingly, and the displayed application changes with it.
  • For example, the first display area is associated with a first application, and the second display area is associated with a second application. Therefore, when the electronic device is in the first physical state, the electronic device can display the user interface of the first application through the first display area.
  • When the electronic device is in the second physical state, the electronic device may display the user interface of the second application through the second display area. Therefore, the user can quickly and conveniently switch the applications displayed on the electronic device, which is more convenient to use.
  • the electronic device is configured with a foldable display screen (which may be referred to as a folding screen).
  • the electronic device may be referred to as a foldable electronic device.
  • the folding of the folding screen can also be called the folding of the electronic device, and the physical state of the folding screen can also be called the physical state of the electronic device.
  • the foldable electronic device is simply referred to as an electronic device in the following embodiments for description.
  • the electronic device can run at least one application.
  • the electronic device may display the user interface of the running application, or may not display the user interface of the running application.
  • A user operation (for example, a click operation on a display screen) acting on the electronic device can be detected, and in response to the operation, the electronic device can perform a corresponding task based on the application. That is, the user can operate the application by operating the user interface of the application.
  • This kind of application can provide users with more functions and requires more resources.
  • the electronic device can run the application through any available resources, such as network resources such as bandwidth, and system resources such as a central processing unit (CPU).
  • the electronic device can display prompt information such as notification messages and pop-up messages of these applications.
  • For example, when the electronic device receives a message sent by another user through a social application, the message may be pushed to the user in the form of a notification message. That is to say, although the user cannot operate the user interface of the application, the user can still receive the application's prompt information and thereby learn about the application's progress through the prompt information.
  • Such applications can provide users with fewer functions and require fewer resources.
  • For example, when the electronic device displays the user interface of the social application, the user can input text or voice through the user interface of the social application, but when the user interface is not displayed, the user cannot input text or voice.
  • the electronic device can run the other applications mentioned above with some of the available resources.
  • The electronic devices involved in the embodiments of the present application may be, but are not limited to, mobile phones, tablet computers, personal digital assistants (PDAs), handheld computers, wearable electronic devices (such as smart watches and smart bracelets), augmented reality (AR) devices (such as AR glasses), virtual reality (VR) devices (such as VR glasses), smart home devices such as smart TVs, and devices such as desktop computers, laptops, notebooks, ultra-mobile personal computers (UMPCs), and netbooks.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, an angle sensor 180M, and the like.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 100 .
  • The electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or have a different arrangement of components.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
  • the power management module 141 may also be provided in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be provided in the same device.
  • the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include global positioning system (global positioning system, GPS), global navigation satellite system (global navigation satellite system, GLONASS), Beidou navigation satellite system (beidou navigation satellite system, BDS), quasi-zenith satellite system (quasi -zenith satellite system, QZSS) and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 100 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the display screen 194 may include at least one display area, and the present application takes the example that the display screen 194 includes a first display area and a second display area for description.
  • the first display area of the display screen 194 is used to display a graphical user interface (graphical user interface, GUI), and the GUI may also be referred to as a user interface.
  • the first display area may be in a screen-on state
  • the second display area may be in a screen-off state.
  • the second display area may be in a screen-on state
  • the first display area may be in a screen-off state.
  • the second display area when the electronic device 100 is in the first physical state and the first display area when the electronic device 100 is in the second physical state may also be in a locked screen state or a bright screen state, which is not limited in this application .
  • In this application, the case where the first display area is on and the second display area is off when the electronic device 100 is in the first physical state, and the first display area is off and the second display area is on when the electronic device 100 is in the second physical state, is used as an example for description (a brief sketch of this policy follows).
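  • A brief sketch of this running example, reusing the hypothetical PhysicalState type from the earlier sketch; the policy itself (exactly one display area lit per state) is only the example chosen here, not a general requirement:

```kotlin
// Which display area is lit for each physical state in the running example.
data class ScreenPolicy(val firstAreaOn: Boolean, val secondAreaOn: Boolean)

fun policyFor(state: PhysicalState): ScreenPolicy = when (state) {
    PhysicalState.FIRST  -> ScreenPolicy(firstAreaOn = true,  secondAreaOn = false)
    PhysicalState.SECOND -> ScreenPolicy(firstAreaOn = false, secondAreaOn = true)
}
```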
  • For example, the display screen 194 of the electronic device 100 is a folding screen, and the physical states of the electronic device 100 may include an unfolded state, a bent state, and a folded state.
  • the first display area may be in a flattened state.
  • For example, the first physical state is the unfolded state, the second physical state is the folded state, and the first display area and the second display area are the first display screen 200 and the second display screen 300 shown in FIGS. 3-5 below, respectively.
  • the electronic device 100 may implement a shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the electronic device 100 can realize face unlock, access application lock, etc. through the face information obtained by the photographing function.
  • the ISP is used to process the data fed back by the camera 193 .
  • When shooting, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • Camera 193 is used to capture still images or video.
  • the object is projected through the lens to generate an optical image onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the display screen 194 is a folding screen, and the display screen 194 includes a first display area and a second display area.
  • the electronic device 100 may include N cameras 193, and the N cameras 193 are respectively disposed on the first display area and the second display area. The electronic device 100 can determine whether the electronic device 100 is turned over through the detection signals of the N cameras 193 .
  • For example, when the electronic device 100 is in the unfolded state, the first display area may be in the flattened state and face the user. At this time, the camera 193 on the side of the first display area can obtain the user's face information, but the camera 193 on the side of the second display area cannot obtain the user's face information.
  • the electronic device 100 may display the user interface of the application A through the first display area.
  • the electronic device 100 may receive a user's flip operation, where the flip operation is used to change the relative positions of the first display area and the second display area, that is, the second display area is opposite to the user's face.
  • the electronic device 100 can determine that the electronic device 100 has flipped. At this time, the second display area is opposite to the user's face, and the electronic device 100 can display the user interface of application B through the second display area.
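  • A hypothetical sketch of the camera-based check described in the preceding paragraphs; how face information is actually obtained from each camera is out of scope here, and the function name is invented for illustration (DisplayArea is from the earlier sketch):

```kotlin
// Whichever display area's camera currently detects the user's face is
// treated as the area facing the user.
fun areaFacingUser(firstCameraSeesFace: Boolean, secondCameraSeesFace: Boolean): DisplayArea? =
    when {
        firstCameraSeesFace  -> DisplayArea.FIRST
        secondCameraSeesFace -> DisplayArea.SECOND
        else                 -> null  // neither camera currently sees the user
    }
```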
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
  • The external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playback, recording, etc.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • Speaker 170A also referred to as a "speaker" is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also referred to as "earpiece" is used to convert audio electrical signals into sound signals.
  • the voice can be answered by placing the receiver 170B close to the human ear.
  • the microphone 170C also called “microphone” or “microphone” is used to convert sound signals into electrical signals.
  • The user can make a sound with the mouth close to the microphone 170C, thereby inputting the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D may be the USB interface 130, or may be a 3.5mm open mobile terminal platform (OMTP) standard interface, a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may be comprised of at least two parallel plates of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 can obtain the corresponding touch operation intensity according to the detection signal of the pressure sensor 180A.
  • the electronic device 100 may also calculate the position of the touch area on the display screen 194 (referred to as the touch position for short) by the touch operation according to the detection signal of the pressure sensor 180A.
  • the electronic device 100 may also calculate the shape of the above-mentioned touch area according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation commands. For example, when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, the instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
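  • A minimal sketch of the threshold dispatch just described; the threshold value, the pressure scale, and the command names are assumptions for illustration, not values from the patent:

```kotlin
// Same touch position, different commands depending on touch intensity.
const val FIRST_PRESSURE_THRESHOLD = 0.6f  // assumed normalized pressure in 0..1

fun onMessageIconTouched(pressure: Float): String =
    if (pressure < FIRST_PRESSURE_THRESHOLD) "VIEW_SHORT_MESSAGE"
    else "CREATE_NEW_SHORT_MESSAGE"
```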
  • the gyroscope sensor 180B can detect the angular velocity of the electronic device 100 in various directions (generally three axes, ie x, y and z axes), and can be used to determine the movement attitude of the electronic device 100 .
  • the gyro sensor 180B may be provided on the circuit board of the electronic device 100 .
  • the electronic device 100 can determine whether the bending angle of the electronic device 100 changes or whether the electronic device 100 is turned over according to the detection signal of the gyro sensor 180B.
  • the gyro sensor 180B may also be used for image stabilization.
  • the gyroscope sensor 180B detects the shaking angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc. Optionally, the acceleration sensor 180E may be provided on the circuit board of the electronic device 100 . The electronic device 100 can determine whether the bending angle of the electronic device 100 changes or whether the electronic device 100 is turned over according to the detection signal of the acceleration sensor 180E.
  • the electronic device 100 may include multiple acceleration sensors 180E and/or multiple gyroscope sensors 180B.
  • the display screen 194 is a folding screen, and the display screen 194 includes a first display area and a second display area.
  • When the electronic device 100 is in the unfolded state, the first display area may be in a flattened state; when the electronic device 100 is in the bent state or the folded state, the first display area may be bent. When the first display area is bent, it can be divided into two display areas, and the planes where the two display areas are located intersect.
  • the plurality of acceleration sensors 180E and/or the plurality of gyroscope sensors 180B may be respectively disposed on the circuit boards on the sides of the two display areas. Therefore, the electronic device 100 can determine whether the bending angle of the electronic device 100 changes or whether the electronic device 100 is turned over according to the detection signals of the multiple acceleration sensors 180E and/or the multiple gyro sensors 180B.
  • For example, the first display area faces up and displays the user interface of application A, and the second display area faces down with its screen off.
  • the electronic device 100 may receive a user's flip operation, where the flip operation is used to change the relative positions of the first display area and the second display area, that is, the first display area faces downward and the second display area faces upward.
  • the electronic device 100 may determine that the electronic device 100 is flipped according to the detection signals of the acceleration sensor 180E and/or the gyro sensor 180B.
  • the electronic device 100 may display the user interface of the application B through the second display area, and at this time, the first display area faces down and the screen is off.
  • In another example, the first display area faces up and displays the user interface of application A, and the second display area faces down with its screen off.
  • the electronic device 100 may receive a user's folding operation for folding the electronic device 100 .
  • the electronic device 100 may determine, according to the detection signal of the acceleration sensor 180E and/or the gyro sensor 180B, that the bending angle of the electronic device 100 is reduced and is in a folded state.
  • the second display area is bent and can be divided into two display areas, and the planes where the two display areas are located intersect.
  • The electronic device 100 may display the user interface of application B through the upward-facing one of the above two display areas, and the downward-facing one of the above two display areas may keep its screen off.
  • the electronic device 100 may determine the upward-facing display area among the above-mentioned two display areas according to the detection signal of the acceleration sensor 180E and/or the gyro sensor 180B.
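  • A minimal sketch of a flip check of the kind described above, assuming the accelerometer's z axis points out of the first display area; the axis convention and the threshold are assumptions, and a real device would combine accelerometer and gyroscope readings with filtering (DisplayArea is from the earlier sketch):

```kotlin
import kotlin.math.abs
import kotlin.math.sign

// Which display area faces up, from a single accelerometer z reading (m/s^2).
fun faceUpArea(accelZ: Float): DisplayArea =
    if (accelZ >= 0f) DisplayArea.FIRST else DisplayArea.SECOND

// The device is considered flipped when the z axis changes sign and the new
// reading is well above the noise floor.
fun hasFlipped(previousZ: Float, currentZ: Float): Boolean =
    previousZ.sign != currentZ.sign && abs(currentZ) > 3f
```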
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, accessing application locks, taking pictures with fingerprints, answering incoming calls with fingerprints, and the like.
  • The touch sensor 180K is also called a "touch device".
  • the touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also referred to as a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • The touch sensor can transmit the detected touch operation to the application processor to determine the position and shape of the touch area on the display screen 194 on which the touch operation acts, so as to determine the type of touch event.
  • Electronic device 100 may provide visual output related to touch operations through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a different location than the display screen 194.
  • the angle sensor 180M can acquire the angle information and convert it into a usable electrical signal output.
  • the angle sensor 180M may be disposed in the display screen 194 for detecting the bending angle of the electronic device 100 .
  • the processor 110 may determine the physical state (eg, unfolded, bent, or folded) of the electronic device 100 according to the detection signal of the angle sensor 180M, and whether the physical state of the electronic device 100 changes.
  • the application does not limit the specific types of sensors used to detect physical states.
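  • A minimal sketch of an angle-to-state mapping of the kind described above; the threshold values are assumptions (the 170-degree bound echoes the roughly 180-degree unfolded example mentioned later), not values taken from the patent:

```kotlin
// Map a hinge angle in degrees to the three physical states named in the text.
enum class FoldState { UNFOLDED, BENT, FOLDED }

fun foldStateFor(angleDegrees: Float): FoldState = when {
    angleDegrees >= 170f -> FoldState.UNFOLDED  // roughly flat
    angleDegrees > 30f   -> FoldState.BENT
    else                 -> FoldState.FOLDED
}
```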
  • the keys 190 include a power-on key, a volume key, and the like. Keys 190 may be mechanical keys. It can also be a touch key.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • Different application scenarios (for example, time reminders, receiving information, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.
  • the SIM card interface 195 is used to connect a SIM card.
  • The SIM card can be brought into contact with or separated from the electronic device 100 by inserting it into or pulling it out of the SIM card interface 195.
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. Multiple cards can be of the same type or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 employs an eSIM, ie: an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • the pressure sensor 180A and/or the touch sensor 180K may be provided in the display screen 194 .
  • the pressure sensor 180A and/or the touch sensor 180K may detect the user's operation on the user interface by the user.
  • the electronic device 100 may perform a corresponding task based on the application. For example, when the user clicks on the avatar of a friend in the social application, the electronic device 100 may display the personal information published by the friend through the social application on the display screen 194 .
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the software system of the layered architecture may be an Android (Android) system or a Huawei mobile services (huawei mobile services, HMS) system.
  • the embodiments of the present application take an Android system with a layered architecture as an example to exemplarily describe the software structure of the electronic device 100 .
  • FIG. 2 is a block diagram of a software structure of an electronic device 100 according to an embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
  • the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include social applications, game applications, video applications, payment applications, camera, gallery, map, SMS, e-book and other applications.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
  • Content providers are used to store and retrieve data and make these data accessible to applications.
  • the above data can include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
  • a display interface can consist of one or more views.
  • For example, the display interface of an application program, including the display interface of the application's notification messages, may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide the communication function of the electronic device 100 .
  • For example, the management of call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files and so on.
  • the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
  • the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
  • Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
  • The core library consists of two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
  • a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display drivers, camera drivers, audio drivers, and sensor drivers.
  • the sensor driver can be used to drive multiple sensors in the control hardware, such as pressure sensor 180A, gyro sensor 180B, acceleration sensor 180E, touch sensor 180K, angle sensor 180M and other sensors shown in FIG. 1 .
  • In the following, the gyro sensor 180B is taken as an example of the sensor used to detect the physical state.
  • the first display area is associated with application A.
  • the electronic device 100 runs the application A and displays the user interface of the application A through the first display area.
  • the electronic device 100 can receive a user operation, and the gyro sensor 180B reports a detection signal to the kernel layer.
  • the kernel layer can process the detection signal into an input event (for example, the electronic device 100 is in the unfolded state, the first display area is facing down and the second display area is facing up, or the relative position of the first display area and the second display area is changed), and the input event is stored at the kernel layer.
  • the application framework layer obtains the input event from the kernel layer, and determines according to the input event that the physical state of the electronic device 100 has changed. Assuming that the second display area is associated with application B, for example, when the electronic device 100 is in the unfolded state and the second display area is facing up, the electronic device 100 runs application B and displays the user interface of application B through the second display area. Therefore, in response to the change in the physical state of the electronic device 100, the interface of the application framework layer can be called to start application B, the display driver is then started by calling the kernel layer, and the user interface of application B is displayed through the second display area.
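  • A simplified sketch of this flow is given below, assuming the fold angle is delivered through the standard hinge-angle sensor (Sensor.TYPE_HINGE_ANGLE, API level 30+) and that application B is started on the display backing the second display area via ActivityOptions#setLaunchDisplayId; the class name, threshold value and package name are illustrative assumptions, and a real implementation would run partly inside the framework layer rather than in an ordinary application.

```java
import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative sketch only: when the fold angle reported by the hinge-angle sensor
// crosses the unfold threshold, start application B on the display that backs the
// second display area. Threshold, class and package names are assumptions.
public class FoldStateWatcher implements SensorEventListener {
    private static final float UNFOLD_THRESHOLD_DEGREES = 160f; // example second angle threshold
    private final Context context;
    private final int secondDisplayId; // id of the display backing the second display area
    private boolean launched;          // avoid restarting application B on every sensor event

    public FoldStateWatcher(Context context, int secondDisplayId) {
        this.context = context;
        this.secondDisplayId = secondDisplayId;
        SensorManager sm = context.getSystemService(SensorManager.class);
        Sensor hinge = sm.getDefaultSensor(Sensor.TYPE_HINGE_ANGLE); // API 30+
        if (hinge != null) {
            sm.registerListener(this, hinge, SensorManager.SENSOR_DELAY_NORMAL);
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float angle = event.values[0]; // hinge angle in degrees
        if (angle > UNFOLD_THRESHOLD_DEGREES && !launched) {
            launched = true;
            Intent intent = context.getPackageManager()
                    .getLaunchIntentForPackage("com.example.appb"); // hypothetical package of application B
            if (intent == null) return;
            intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
            ActivityOptions options = ActivityOptions.makeBasic();
            options.setLaunchDisplayId(secondDisplayId); // show the UI on the second display area
            context.startActivity(intent, options.toBundle());
        } else if (angle < UNFOLD_THRESHOLD_DEGREES) {
            launched = false;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```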
  • FIG. 3 is a schematic diagram of an unfolded state of an electronic device provided by an embodiment of the present application.
  • (A) of FIG. 3 shows a schematic diagram of a viewing angle of the electronic device.
  • (B) of FIG. 3 shows a schematic diagram of still another viewing angle of the electronic device.
  • the electronic device 100 may include a first display screen 200 and a second display screen 300 .
  • the electronic device can be in an unfolded state, a folded state or a bent state.
  • the first display screen 200 is in the flattened state.
  • the bending angle α of the first display screen 200 is about 180 degrees.
  • the bending angle of the first display screen 200 may also be greater than or equal to 170 degrees and less than or equal to 180 degrees.
  • the present application does not limit the specific value of the bending angle of the first display screen 200 in the unfolded state.
  • the light-emitting surface of the first display screen 200 is opposite to the light-emitting surface of the second display screen 300 .
  • the first display screen 200 may include three display areas, ie, a first display area 201 , a second display area 202 , and a third display area 203 .
  • the third display area 203 can be bent and is located on the bent part of the electronic device. Two ends of the bent portion of the electronic device are respectively connected to the first display area 201 and the second display area 202 .
  • the bending angle of the first display screen 200 can also be understood as the angle α between the plane where the first display area 201 is located and the plane where the second display area 202 is located, that is, the angle α between the two ends of the bent portion shown in (A) of FIG. 3.
  • the first display area 201 and the second display area 202 are equivalent to being on the same plane.
  • the second display screen 300 may include three display areas, namely, a fourth display area 301 , a fifth display area 302 , and a sixth display area 303 .
  • the fifth display area 302 of the second display screen 300 can be bent and is located on the bent portion of the electronic device. Both ends of the bent portion of the electronic device are also connected to the fourth display area 301 and the sixth display area 303 respectively.
  • the bending angle of the second display screen 300 can also be understood as the angle between the plane where the fourth display area 301 is located and the plane where the sixth display area 303 is located.
  • the electronic device may include at least one camera. As shown in (B) of FIG. 3, the electronic device may include a camera 3031 (including the camera 3031A and the camera 3031B) disposed in the upper part of the sixth display area 303, and a camera 3032 (including the camera 3032A and the camera 3032B) disposed in the lower part of the sixth display area 303.
  • the electronic device may obtain the user's face information through the at least one camera, and perform face verification or determine whether the physical state of the electronic device changes according to the obtained face information.
  • the electronic device restarts the corresponding application and displays the user interface of the application.
  • the electronic device may also only include the camera 3031 disposed on the upper part of the sixth display area 303 .
  • the electronic device may only include the camera 3032 disposed at the lower part of the sixth display area 303.
  • the electronic device may also include a camera disposed in the fourth display area 301 and/or the fifth display area 302 .
  • the electronic device may also include a camera disposed on the first display screen 200, which is not limited in this embodiment of the present application.
  • the electronic device in the unfolded state shown in FIG. 3 can be partially folded (in this case, the angle between the two ends of the bent portion of the electronic device can change from α to β) to obtain the electronic device in the bent state, as shown in FIG. 4 below.
  • FIG. 4 is a schematic diagram of a bent state of an electronic device provided by an embodiment of the present application.
  • (A) of FIG. 4 shows a schematic diagram of a viewing angle of the electronic device.
  • (B) of FIG. 4 shows a schematic diagram of still another viewing angle of the electronic device.
  • the bending angle β of the first display screen 200 may be approximately 120 degrees.
  • the bending angle of the first display screen 200 can also be greater than 0 degrees and less than 180 degrees, such as but not limited to 60 degrees, 90 degrees, 100 degrees, 120 degrees, etc.
  • the specific value of the bending angle of the first display screen 200 is not limited.
  • the description of FIG. 4(B) is similar to that of FIG. 4(A), and will not be repeated.
  • the electronic device in the unfolded state shown in FIG. 3 can be folded (in this case, the angle between the two ends of the bent portion can change from α to γ) to obtain the electronic device in the folded state. Alternatively, the electronic device in the bent state shown in FIG. 4 can also be folded (in this case, the angle between the two ends of the bent portion can change from β to γ) to obtain the electronic device in the folded state, as shown in FIG. 5 below.
  • when the electronic device is in the folded state, the first display screen 200 is bent or folded. As shown in (A) of FIG. 5, the bending angle γ of the first display screen 200 may be approximately 0 degrees. Not limited to this, the bending angle of the first display screen 200 may also be greater than or equal to 0 degrees and less than or equal to 20 degrees. The present application does not limit the specific value of the bending angle of the first display screen 200 in the folded state.
  • in different physical states, the bending angle of the display screen 194 may be different; the specific value of the bending angle of the display screen 194 is not limited.
  • the first display area 201 and the second display area 202 are opposite to each other, and the fourth display area 301 and the sixth display area 303 are opposite to each other.
  • the third display area 203 and the fifth display area 302 are opposite to each other.
  • the first display screen 200 and the second display screen 300 may be different display areas on the same flexible folding screen (flexible screen for short).
  • the first display area 201, the second display area 202, the third display area 203, the fourth display area 301, the fifth display area 302 and the sixth display area 303 may be different areas on the flexible screen, Both are used to display the user interface.
  • the first display screen 200 may be a flexible screen of an electronic device.
  • the first display area 201, the second display area 202, and the third display area 203 are different areas on the flexible screen, all of which are used to display the user interface.
  • the second display screen 300 may be another flexible screen of the electronic device.
  • the fourth display area 301, the fifth display area 302, and the sixth display area 303 are different areas on the second display screen 300, all of which are used to display the user interface.
  • the first display screen 200 may be a display screen formed by splicing a rigid screen, a flexible screen, a chain or other connecting components.
  • the first display screen 200 may be formed by splicing two rigid screens and one flexible screen.
  • the first display area 201 and the second display area 202 may be areas on the above two rigid screens respectively, and the third display area 203 may be an area on the above one flexible screen, both of which are used for displaying user interfaces.
  • the first display screen 200 may be formed by splicing two rigid screens and a chain for connecting the two rigid screens.
  • the first display area 201 and the second display area 202 may be areas on the above-mentioned two rigid screens respectively, both of which are used to display the user interface.
  • the third display area 203 is the above-mentioned chain for connecting the two rigid screens.
  • the second display screen 300 may also be a display screen formed by splicing a rigid screen, a flexible screen, a chain and other connecting components, and the specific description is similar to that of the first display screen 200 , and details are not repeated here.
  • In the following, the following case is taken as an example for description: the electronic device is in the folded state when the bending angle is smaller than the first angle threshold, the electronic device is in the unfolded state when the bending angle is greater than the second angle threshold, and the electronic device is in the bent state when the bending angle is greater than or equal to the first angle threshold and less than or equal to the second angle threshold.
  • the first angle threshold is 30 degrees
  • the second angle threshold is 160 degrees.
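  • Using these example thresholds, the classification of the physical state from the bending angle can be sketched as follows; the enum and method names are illustrative assumptions rather than part of this application.

```java
// Illustrative sketch of classifying the physical state from the bending angle,
// using the example thresholds above (30 and 160 degrees are the example values).
public final class FoldStateClassifier {
    public enum PhysicalState { FOLDED, BENT, UNFOLDED }

    private static final float FIRST_ANGLE_THRESHOLD = 30f;   // degrees
    private static final float SECOND_ANGLE_THRESHOLD = 160f; // degrees

    public static PhysicalState classify(float bendingAngleDegrees) {
        if (bendingAngleDegrees < FIRST_ANGLE_THRESHOLD) {
            return PhysicalState.FOLDED;
        } else if (bendingAngleDegrees > SECOND_ANGLE_THRESHOLD) {
            return PhysicalState.UNFOLDED;
        } else {
            return PhysicalState.BENT; // between the two thresholds, inclusive
        }
    }
}
```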
  • As shown in FIGS. 3-5, when the electronic device is in the unfolded state, if the first display screen 200 faces upward, the user interface is displayed through the first display screen 200, and if the second display screen 300 faces upward, the user interface is displayed through the second display screen 300.
  • the electronic device is in a folded state, if the fourth display area 301 faces upwards, the user interface is displayed through the fourth display area 301 (and optionally, the fifth display area 302 ). If the sixth display area 303 faces upwards, the user interface is displayed through the sixth display area 303 (and optionally, the fifth display area 302 ).
  • An application scenario involved in the embodiments of the present application and schematic diagrams of human-computer interaction in the scenario are described below with reference to FIGS. 6-17.
  • In FIGS. 6-17, the electronic device 100 is described by taking the structure shown in FIGS. 3-5 as an example.
  • FIG. 6 exemplarily shows a schematic diagram of human-computer interaction.
  • when the electronic device 100 is in an unfolded state and the first display screen 200 is facing upward, the electronic device can display the user interface of the first application through the first display screen 200.
  • the electronic device may receive a user operation when displaying the setting interface or the desktop, and in response to the user operation, the electronic device sets the first display screen 200 to associate with the first application.
  • Alternatively, the electronic device may also receive a user operation (such as clicking a lock control) acting on the first display screen 200, and in response to the user operation, the electronic device sets the first display screen 200 to be associated with the first application, which is not limited in this application.
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the folded part.
  • the electronic device may display the user interface through the sixth display area 303 of the second display screen 300 .
  • the electronic device may also display the user interface through the fifth display area 302 and the sixth display area 303 together.
  • For example, when the electronic device changes from the unfolded state to the folded state for the first time, the sixth display area 303 can still display the user interface of the application displayed when the electronic device was in the unfolded state, that is, the user interface of the first application; at this time, the electronic device may be in the state shown in (B) of FIG. 6 below.
  • when the electronic device is in a folded state and the sixth display area 303 faces upward, the electronic device can continue to display the user interface of the first application displayed in the unfolded state through the sixth display area 303.
  • the electronic device may receive a user operation for opening the second application, such as returning to the desktop through a three-key navigation function, and clicking an icon of the second application on the desktop, where the first application and the second application are different.
  • the electronic device can cancel displaying the user interface of the first application, and display the user interface of the second application through the sixth display area 303.
  • the electronic device can be in the state shown in (C) of FIG. 6 .
  • when the electronic device is in a folded state, and the sixth display area 303 faces upward, the electronic device can display the user interface of the second application through the sixth display area 303.
  • the electronic device can detect the unfolding operation of the user, and the unfolding operation can increase the included angle between the two ends of the bending part.
  • the electronic device may display, through the first display screen 200, the user interface of the application associated with the first display screen 200, that is, the user interface of the first application; at this time, the electronic device may be in the state shown in (A) of FIG. 6.
  • a user operation may be received when the user interface of the first application is displayed through the first display screen 200 .
  • the user operation is used to open the third application, for example, enter the multitasking interface through the gesture navigation function, and click the thumbnail of the third application in the multitasking interface, wherein the first application and the third application are different.
  • the electronic device may display the user interface of the third application through the first display screen 200 .
  • the physical state of the electronic device can be transformed into a folded state, and the user interface of the third application is displayed through the second display screen 300 .
  • A specific example is shown in FIG. 7 below.
  • FIG. 7 exemplarily shows yet another schematic diagram of human-computer interaction.
  • The description of (A) of FIG. 7 is the same as that of (A) of FIG. 6, and will not be repeated.
  • the electronic device can receive a user operation for opening the third application.
  • the electronic device can cancel displaying the user interface of the first application, and display the user interface of the third application through the first display screen 200.
  • the electronic device can be in the state shown in (B) of FIG. 7 .
  • the electronic device is in the unfolded state, and the first display screen 200 is facing upward, and the electronic device displays the user interface of the third application through the first display screen 200 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may display the user interface through the sixth display area 303 of the second display screen 300 .
  • the electronic device may also display the user interface through the fifth display area 302 and the sixth display area 303 together.
  • For example, when the electronic device changes from the unfolded state to the folded state for the first time, the sixth display area 303 can still display the user interface of the application displayed when the electronic device was in the unfolded state, that is, the user interface of the third application; at this time, the electronic device may be in the state shown in (C) of FIG. 7.
  • when the electronic device is in a folded state and the sixth display area 303 faces upward, the electronic device can continue to display the user interface of the third application displayed in the unfolded state through the sixth display area 303.
  • the electronic device can detect the unfolding operation of the user, and the unfolding operation can increase the included angle between the two ends of the bending part.
  • the electronic device may display, through the first display screen 200, the user interface of the application associated with the first display screen 200, that is, the user interface of the first application; at this time, the electronic device may be in the state shown in (A) of FIG. 7.
  • In still other embodiments, the electronic device does not set the first application to be associated with the first display screen 200, for example, the first display screen 200 is not set to be associated with the first application in the state shown in (A) above, and the physical state of the electronic device then changes into the unfolded state.
  • In this case, in response to the included angle between the two ends of the bent portion being greater than the second angle threshold, the electronic device may display the user interface of the application that the electronic device displayed in the above-mentioned other physical state, or the desktop or lock screen interface, etc.
  • the state changes of the electronic device are as follows: FIG. 6(A), FIG. 6(B), FIG. 6(C), and FIG. 6(A).
  • the electronic device is in an unfolded state, and the first display screen 200 faces upward, and the user interface of the first application is displayed through the first display screen 200 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may display, through the sixth display area 303, the user interface of the application displayed in the folded state of the electronic device last time, that is, the user interface of the second application, At this time, the electronic device may be in the state shown in (C) of FIG. 6 .
  • the state changes of the electronic device are sequentially: (A) of FIG. 7 , (B) of FIG. 7 , (C) of FIG. 7 , (A) of FIG. 7 .
  • the electronic device is in an unfolded state, and the first display screen 200 faces upward, and the user interface of the first application is displayed through the first display screen 200 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may display, through the sixth display area 303, the user interface of the application displayed in the folded state of the electronic device last time, that is, the user interface of the third application, At this time, the electronic device may be in the state shown in (C) of FIG. 7 .
  • the electronic device may receive a user operation when displaying the setting interface or the desktop, and in response to the user operation, the electronic device sets the sixth display area 303 to associate with a fourth application, wherein the first application and the fourth application are different .
  • Alternatively, the electronic device may also receive a user operation (such as clicking a lock control) acting on the sixth display area 303 when displaying the user interface of the fourth application through the sixth display area 303, and in response to the user operation, the electronic device sets the sixth display area 303 to be associated with the fourth application, which is not limited in this application. In this case, when the electronic device is in a folded state and the sixth display area 303 faces upwards, the user interface of the fourth application can be displayed through the sixth display area 303 (and optionally, the fifth display area 302).
  • when the electronic device is in the unfolded state, the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part. In response to the angle between the two ends of the bending part being smaller than the first angle threshold, the electronic device may display the user interface of the application associated with the sixth display area 303, that is, the user interface of the fourth application, through the sixth display area 303.
  • The above case of displaying the user interface of the application that was displayed the last time the electronic device was in the folded state applies after the electronic device has set the association between the first display screen 200 and the first application.
  • the electronic device may also set the sixth display area 303 to be associated with the fourth application.
  • the electronic device may display, through the first display screen 200, the user interface of the application displayed the last time the electronic device was in the unfolded state. If there is no application that the electronic device displayed the last time it was in the unfolded state, that is, the electronic device changes from the folded state to the unfolded state for the first time after the electronic device sets the second display screen 300 to be associated with the fourth application, the electronic device can continue to display, through the sixth display area 303, the user interface of the application displayed when the electronic device was in the folded state.
  • the electronic device may display the user interface of the application associated with the sixth display area 303 , ie the user interface of the fourth application, through the sixth display area 303 .
  • FIGS. 6-7 are described by taking the sixth display area 303 facing upward as an example when the electronic device is in a folded state. If the fourth display area 301 faces upward when the electronic device is in the folded state, the electronic device can display the user interface through the fourth display area 301; not limited to this, the electronic device can also display the user interface through the fourth display area 301 and the fifth display area 302 together. For example, the electronic device may detect the upward-facing display area through the acceleration sensor 180E and/or the gyro sensor 180B.
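  • A minimal sketch of such detection is given below, assuming the upward-facing side is inferred from the sign of the accelerometer's z-axis reading; the axis convention, callback and class names are illustrative assumptions, and a real device may combine the acceleration sensor and the gyro sensor differently.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Illustrative sketch: use the accelerometer to decide which display area faces
// upward in the folded state. The sign convention assumes the sensor's z axis
// points out of the reference display area; the mapping on a real device may differ.
public class FaceUpDetector implements SensorEventListener {
    public interface Callback {
        void onFaceUpChanged(boolean referenceAreaFacesUp);
    }

    private final Callback callback;

    public FaceUpDetector(Context context, Callback callback) {
        this.callback = callback;
        SensorManager sm = context.getSystemService(SensorManager.class);
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float z = event.values[2]; // gravity component along the device z axis
        // A positive z reading (close to +9.8 m/s^2) means the reference face points up.
        callback.onFaceUpChanged(z > 0);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```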
  • the electronic device has set the first display screen 200 to be associated with a video application, and has also set the second display screen 300 to be associated with a social application.
  • the electronic device can display the user interface 110 of the video application through the first display screen 200 , as shown in (A) of FIG. 8 .
  • the electronic device can display the user interface 120 of the social application through the second display screen 300 , as shown in (B) of FIG. 8 .
  • the electronic device can detect a user's flip operation, which can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device may display, through the second display screen 300, the user interface of the application associated with the second display screen 300, that is, the user interface 120 of the social application; at this time, the electronic device may be in the state shown in (B) of FIG. 8.
  • the electronic device can detect a user's flip operation, which can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device can display the user interface of the application associated with the first display screen 200, that is, the user interface 110 of the video application, through the first display screen 200.
  • the electronic device may be in the state shown in (A) of FIG. 8 .
  • the electronic device has set the first display screen 200 to be associated with a video application, and has also set the second display screen 300 to be associated with a social application. For examples of setting methods, see Figures 6-7 above. Moreover, the electronic device determines that the application associated with the fourth display area 301 and the application associated with the sixth display area 303 are the same as the application associated with the second display screen 300 . That is, the applications associated with the fourth display area 301 and the sixth display area 303 are also social applications. Then, when the physical state of the electronic device is changed to the folded state, the electronic device can display the user interface of the social application. A specific example is shown in FIG. 9 below.
  • FIG. 9 exemplarily shows yet another schematic diagram of human-computer interaction.
  • the electronic device is in the unfolded state, and the first display screen 200 is facing upward, and the electronic device displays the user interface 110 of the video application through the first display screen 200 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device can display the user interface through the sixth display area 303 or the fourth display area 301, and at this time the electronic device can be in the state shown in (B) of FIG. 9 or (C) of FIG. 9 below.
  • the electronic device is in a folded state with the sixth display area 303 facing upward, and the electronic device displays, through the sixth display area 303, the user interface of the application associated with the sixth display area 303, which is also the user interface of the application associated with the second display screen 300, that is, the user interface 120 of the social application.
  • the electronic device may also display the user interface 120 of the social application through the sixth display area 303 and the fifth display area 302 together.
  • the electronic device is in a folded state with the fourth display area 301 facing upward, and the electronic device displays, through the fourth display area 301, the user interface of the application associated with the fourth display area 301, which is also the user interface of the application associated with the second display screen 300, that is, the user interface 120 of the social application.
  • the electronic device may also display the user interface 120 of the social application through the fourth display area 301 and the fifth display area 302 together.
  • the electronic device can detect the user's flip operation, which can change the relative positions of the fourth display area 301 and the sixth display area 303 .
  • the electronic device can display the user interface 120 of the social application through the fourth display area 301, and the electronic device can be in the state shown in (C) of FIG. 9 at this time .
  • the electronic device can also detect a user's flip operation, which can change the relative positions of the fourth display area 301 and the sixth display area 303 .
  • the electronic device can display the user interface 120 of the social application through the sixth display area 303, and at this time the electronic device can be in the state shown in (B) of FIG. 9 .
  • the electronic device has set the first display screen 200 to associate with a video application, and has also set the second display screen 300 to associate with a social application.
  • the electronic device has also set the fourth display area 301 to be associated with the gallery, and the sixth display area 303 to be associated with the camera.
  • For the setting method, please refer to FIGS. 6-7 above. That is to say, the application associated with the fourth display area 301, the application associated with the sixth display area 303 and the application associated with the second display screen 300 are different from each other. Then, when the physical state of the electronic device changes to the folded state, the electronic device can determine the displayed application according to the upward-facing display area. A specific example is shown in FIG. 10 below.
  • FIG. 10 exemplarily shows yet another schematic diagram of human-computer interaction.
  • (A) of FIG. 10 is the same as that of FIG. 9 (A), and will not be described again.
  • the electronic device displays, through the sixth display area 303, the user interface of the application associated with the sixth display area 303, that is, the user interface 130 of the camera. Optionally, the electronic device may also display the user interface 130 of the camera through the sixth display area 303 and the fifth display area 302 together.
  • the electronic device is in a folded state with the fourth display area 301 facing upward, and the electronic device displays, through the fourth display area 301, the user interface of the application associated with the fourth display area 301, that is, the user interface 140 of the gallery. Optionally, the electronic device may also display the user interface 140 of the gallery through the fourth display area 301 and the fifth display area 302 together.
  • the electronic device can detect a user's flip operation, which can change the relative positions of the fourth display area 301 and the sixth display area 303.
  • the electronic device can display the user interface 140 of the gallery through the fourth display area 301, and the electronic device can be in the state shown in (C) of FIG. 10 .
  • the electronic device can also detect the user's flip operation, which can change the relative positions of the fourth display area 301 and the sixth display area 303 .
  • the electronic device can display the camera user interface 130 through the sixth display area 303, and the electronic device can be in the state shown in (B) of FIG. 10 .
  • the application associated with the fourth display area 301 may be the same as the application associated with the second display screen 300, for example, a social application, while the application associated with the sixth display area 303 is different from the application associated with the second display screen 300, for example, a camera.
  • Alternatively, the application associated with the sixth display area 303 may be the same as the application associated with the second display screen 300, and the application associated with the fourth display area 301 may be different from the application associated with the second display screen 300.
  • the application associated with the fourth display area 301 is the same as the application associated with the sixth display area 303 , but is different from the application associated with the second display screen 300 .
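  • The different association cases above can be summarised by a simple lookup that falls back to the application associated with the whole second display screen when a display area has no association of its own; the following sketch is illustrative only, and the enum values and package names are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of resolving which application's user interface to show when
// the device is folded, based on display-area/application associations as described
// above. Identifiers and package names are assumptions for illustration only.
public final class AssociationResolver {
    public enum Area { SECOND_SCREEN, FOURTH_AREA, SIXTH_AREA }

    private final Map<Area, String> associations = new HashMap<>();

    public void associate(Area area, String packageName) {
        associations.put(area, packageName);
    }

    // Returns the package to display on the area that currently faces upward,
    // falling back to the application associated with the whole second screen.
    public String resolveForFaceUpArea(Area faceUpArea) {
        String app = associations.get(faceUpArea);
        if (app == null) {
            app = associations.get(Area.SECOND_SCREEN);
        }
        return app;
    }
}
```

  • For example, under these assumptions, associating a hypothetical gallery package with the fourth display area and a camera package with the sixth display area reproduces the behaviour of FIG. 10, while leaving only the second-screen association reproduces the behaviour of FIG. 9.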
  • the electronic device has set the first display screen 200 to associate with a video application.
  • the user interface of the application can be displayed in a split screen through the first display screen 200 , and a specific example is shown in FIGS. 11-12 below.
  • FIG. 11 exemplarily shows yet another schematic diagram of human-computer interaction.
  • the two applications displayed on the split screen when the electronic device is in a bent state are different from the applications associated with the first display screen 200 .
  • As shown in (A) of FIG. 11, the electronic device is in a bent state with the first display screen 200 facing upward; the electronic device displays the user interface 150 of the short message application through the first display area 201 and part of the third display area 203, and displays the user interface 160 of the electronic book through the second display area 202 and part of the third display area 203.
  • the electronic device can detect the unfolding operation of the user, and the unfolding operation can increase the included angle between the two ends of the bending part.
  • the electronic device may display the user interface of the application associated with the first display screen 200 through the first display screen 200 , That is, the user interface 110 of the video application.
  • the electronic device may be in the state shown in (B) of FIG. 11 .
  • Alternatively, in response to the included angle between the two ends of the bent portion being greater than the second angle threshold and the second display screen 300 facing upward, the electronic device may display, through the second display screen 300, the user interface of the application associated with the second display screen 300; at this time, the electronic device may be in the state shown in (B) of FIG. 8.
  • the electronic device can also detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may display the user interface of the application associated with the fourth display area 301 through the fourth display area 301, At this time, the electronic device may be in the state shown in (C) of FIG. 9 or (C) of FIG. 10 .
  • Alternatively, the electronic device may display, through the sixth display area 303, the user interface of the application associated with the sixth display area 303; at this time, the electronic device may be in the state shown in (B) of FIG. 9 or (B) of FIG. 10.
  • the electronic device is in the unfolded state, and the first display screen 200 is facing upward, and the electronic device displays the user interface 110 of the video application through the first display screen 200 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device can display the user interfaces of applications in a split screen through the first display screen 200, and at this time the electronic device may be in the state shown in (A) of FIG. 11 or (A) of FIG. 12 below.
  • FIG. 12 exemplarily shows yet another schematic diagram of human-computer interaction.
  • one of the two applications displayed on the split screen when the electronic device is in a bent state is an application associated with the first display screen 200 , that is, a video application.
  • As shown in (A) of FIG. 12, the electronic device is in a bent state with the first display screen 200 facing upward; the electronic device displays the user interface 150 of the short message application through the first display area 201 and part of the third display area 203, and displays the user interface 110 of the video application through the second display area 202 and part of the third display area 203. At this time, the electronic device detects the user's operation, and the description of the change of the physical state is similar to that of (A) of FIG. 11 above, and will not be repeated here.
  • (B) of FIG. 12 is similar to (B) of FIG. 11 and will not be described again.
  • In another implementation, in response to the physical state of the electronic device changing from the bent state to the unfolded state or the folded state, the electronic device can also display, through the second display screen 300, the user interface displayed when the electronic device was last in the unfolded state or the folded state.
  • For example, in response to the included angle between the two ends of the bent portion being greater than the second angle threshold and the second display screen 300 facing upward, the electronic device may display, through the second display screen 300, the user interface that the electronic device most recently displayed through the second display screen 300.
  • Alternatively, the electronic device may display, through the fourth display area 301, the user interface that the electronic device last displayed through the fourth display area 301.
  • Alternatively, the electronic device may display, through the sixth display area 303, the user interface that the electronic device last displayed through the sixth display area 303.
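  • A minimal sketch of remembering which user interface was most recently shown on each display area, so that it can be restored when the physical state changes back, is given below; the class and method names are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch: remember the most recent application shown on each display
// area so that the same user interface can be restored after a state change.
public final class LastShownTracker {
    private final Map<Integer, String> lastShownByDisplayId = new HashMap<>();

    // Record that the given package was just displayed on the given display.
    public void onShown(int displayId, String packageName) {
        lastShownByDisplayId.put(displayId, packageName);
    }

    // Returns the package last shown on the display, or null if nothing was shown yet.
    public String lastShownOn(int displayId) {
        return lastShownByDisplayId.get(displayId);
    }
}
```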
  • In another implementation, in response to the physical state of the electronic device changing from the unfolded state to the bent state, the electronic device may also continue to display, through the first display screen 200, the user interface of the application displayed when the electronic device was in the unfolded state. Alternatively, the electronic device may also use the first display screen 200 to display the user interface displayed the last time the electronic device was in a bent state. Alternatively, the electronic device may also display the user interface of the application associated with the first display screen 200 through the first display screen 200, which is not limited in this application.
  • Not limited to this, the physical state of the electronic device can also change from the folded state to the bent state, and the electronic device can display the user interfaces of applications in a split screen through the first display screen 200; at this time, the electronic device can be in the state shown in (A) of FIG. 11 or (A) of FIG. 12.
  • the electronic device may also continue to display the user interface of the application displayed when the electronic device is in the folded state through the first display screen 200 .
  • the electronic device may also use the first display screen 200 to display the user interface displayed when the electronic device was in a bending state last time.
  • the electronic device may also display the user interface of the application associated with the first display screen 200 through the first display screen 200 , which is not limited in this application.
  • when the electronic device is in the unfolded state, and the first display screen 200 faces upward and is in a landscape state, the electronic device can display the user interface 110 of the video application through the first display screen 200, as shown in (A) of FIG. 13.
  • when the electronic device is in the unfolded state, and the first display screen 200 faces upward and is in a portrait state, the electronic device can display the user interface 160 of the e-book through the first display screen 200, as shown in (B) of FIG. 13.
  • the electronic device can detect a user's rotation operation, which can change the physical state of the electronic device from a landscape state to a portrait state.
  • the electronic device can display the user interface 160 of the e-book through the first display screen 200 , and the electronic device can be in the state shown in (B) of FIG. 13 .
  • the electronic device can detect a user's rotation operation, which can change the physical state of the electronic device from a portrait state to a landscape state.
  • the electronic device may display the user interface 110 of the video application through the first display screen 200, and the electronic device may be in the state shown in (A) of FIG. 13 .
  • when the electronic device is in the unfolded state and the second display screen 300 is facing up and in a landscape state, the electronic device can display the user interface of application A through the second display screen 300.
  • when the electronic device is in the unfolded state and the second display screen 300 faces upward and is in a portrait state, the electronic device can display the user interface of application B through the second display screen 300.
  • the electronic device may detect whether the electronic device is in a landscape screen state or a portrait screen state through the acceleration sensor 180E and/or the gyro sensor 180B.
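  • As an illustrative sketch, the decision between the landscape and portrait interfaces can also be read from the orientation that the system reports in the current configuration rather than from raw sensor data; the package names below are hypothetical.

```java
import android.content.Context;
import android.content.res.Configuration;

// Illustrative sketch: choose the application to show from the orientation reported
// by the system configuration. The application choice mirrors the example above;
// the package names are assumptions.
public final class OrientationRouter {
    public static String appForCurrentOrientation(Context context) {
        int orientation = context.getResources().getConfiguration().orientation;
        if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
            return "com.example.video"; // hypothetical video application package
        } else {
            return "com.example.ebook"; // hypothetical e-book package
        }
    }
}
```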
  • the second display screen 300 may only include the fourth display area 301 , and optionally, may also include the fourth display area 301 and the fifth display area 302 .
  • the second display screen 300 may only include the sixth display area 303 , or optionally, may also include the sixth display area 303 and the fifth display area 302 .
  • the second display screen 300 is a rigid screen. When the physical state of the electronic device changes, the display area used to display the user interface may change, and the displayed application may also change. A specific example is shown in Figure 14 below.
  • FIG. 14 exemplarily shows yet another schematic diagram of human-computer interaction.
  • FIG. 14 takes as an example that the second display screen 300 only includes the sixth display area 303 .
  • As shown in (A) of FIG. 14, the electronic device is in the unfolded state, and the first display screen 200 is facing upward, and the electronic device displays the user interface 110 of the video application through the first display screen 200.
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may display the user interface of the application associated with the sixth display area 303 (ie, the second display screen 300 ) through the sixth display area 303 , namely The user interface 120 of the social application, at this time, the electronic device may be in the state shown in (B) of FIG. 14 .
  • the electronic device can also detect a user's flip operation, which can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device may display, through the sixth display area 303, the user interface of the application associated with the sixth display area 303, that is, the user interface 120 of the social application.
  • the electronic device may be in the state shown in (C) of FIG. 14 .
  • the electronic device is in a folded state, and the user interface 120 of the social application is displayed through the sixth display area 303 .
  • the electronic device can detect the unfolding operation of the user, and the unfolding operation can increase the included angle between the two ends of the bending part.
  • the electronic device may display the user interface of the application associated with the first display screen 200 through the first display screen 200 , That is, the user interface 110 of the video application, at this time, the electronic device may be in the state shown in (A) of FIG. 14 .
  • Alternatively, the electronic device may continue to display, through the sixth display area 303, the user interface of the application associated with the sixth display area 303, that is, the user interface 120 of the social application; at this time, the electronic device may be in the state shown in (C) of FIG. 14.
  • the electronic device is in the unfolded state, and the second display screen 300 is facing upward, and the electronic device displays the user interface 120 of the social application through the sixth display area 303 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may continue to display the user interface of the application associated with the sixth display area 303 through the sixth display area 303, that is, the user interface 120 of the social application, At this time, the electronic device may be in the state shown in (B) of FIG. 14 .
  • the electronic device can also detect a user's flip operation, which can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device can display the user interface of the application associated with the first display screen 200, that is, the user interface 110 of the video application, through the first display screen 200.
  • the electronic device may be in the state shown in (A) of FIG. 14 .
  • the first display screen 200 may only include the first display area 201 , and optionally, may also include the first display area 201 and the third display area 203 .
  • the first display screen 200 may only include the second display area 202 , and optionally, may also include the second display area 202 and the third display area 203 .
  • the first display screen 200 is a rigid screen. When the physical state of the electronic device changes, the display area used to display the user interface may change, and the displayed application may also change. A specific example is shown in Figure 15 below.
  • FIG. 15 exemplarily shows yet another schematic diagram of human-computer interaction.
  • FIG. 15 takes the example that the first display screen 200 only includes the first display area 201 for illustration.
  • the electronic device is in the unfolded state, and the first display screen 200 is facing upward, and the electronic device displays the user interface 110 of the video application through the first display area 201 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device may display, through the sixth display area 303, the user interface of the application associated with the sixth display area 303, that is, the user interface 120 of the social application. At this time, the electronic device may be in the state shown in (B) of FIG. 15 .
  • the electronic device can also detect a user's flip operation, which can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device may display, through the sixth display area 303, the user interface of the application associated with the sixth display area 303, that is, the user interface 120 of the social application.
  • the electronic device may be in the state shown in (C) of FIG. 15 .
  • (B) and (C) of FIG. 15 are similar to (B) and (C) of FIG. 14 and will not be described again.
  • the second display screen 300 may also include a fourth display area 301 and a sixth display area 303 , and optionally, a fifth display area 302 .
  • For details, refer to the illustration of the second display screen 300 in FIG. 3 to FIG. 5 above.
  • the electronic device can display the user interface of the application associated with the second display screen 300 through the second display screen 300 .
  • the schematic diagrams when the physical state of the electronic device changes are similar to those in FIGS. 8-12 and 15 , and will not be repeated here.
  • the electronic device may also only include the second display screen 300 , and the structure of the second display screen 300 may refer to the description of the second display screen 300 in FIGS. 3-5 above.
  • the electronic device may have set the second display screen 300 to be associated with the social application, and the electronic device has also set the fourth display area 301 to be associated with the gallery, and the sixth display area 303 to be associated with the camera.
  • For the setting method, see FIG. 6 to FIG. 7 above.
  • the display area used to display the user interface can also change, and the displayed application can also change.
  • a specific example is shown in Figure 16 below.
  • FIG. 16 exemplarily shows yet another schematic diagram of human-computer interaction.
  • the electronic device is in the unfolded state, and the second display screen 300 is facing upward, and the electronic device displays the user interface 120 of the social application through the second display screen 300 .
  • the electronic device can detect the user's folding operation, and the folding operation can reduce the angle between the two ends of the bending part.
  • the electronic device can display the user interface through the sixth display area 303 or the fourth display area 301, and at this time the electronic device can be (B) of FIG. 16 below or the state shown in (C) of FIG. 16 .
  • (B) and (C) of FIG. 16 are the same as (B) and (C) of FIG. 10 , and will not be repeated here.
  • the application associated with the fourth display area 301 may be the same as the application associated with the second display screen 300, such as a social application, while the application associated with the sixth display area 303 is different from the application associated with the second display screen 300, for example, a camera.
  • Alternatively, the application associated with the sixth display area 303 may be the same as the application associated with the second display screen 300, and the application associated with the fourth display area 301 may be different from the application associated with the second display screen 300.
  • the application associated with the fourth display area 301 is the same as the application associated with the sixth display area 303 , but is different from the application associated with the second display screen 300 .
  • the applications associated with the fourth display area 301, the applications associated with the sixth display area 303, and the applications associated with the second display screen 300 are all the same, for example, social applications.
  • the electronic device 100 is not foldable, that is, the display screen 194 of the electronic device 100 is a candy bar screen.
  • the display screen 194 includes a first display screen 200 and a second display screen 300 , and the light-emitting surface of the first display screen 200 and the light-emitting surface of the second display screen 300 are opposite to each other.
  • the first display screen 200 and the second display screen 300 are respectively a rigid screen.
  • the first display screen 200 and the second display screen 300 are an integrally formed flexible screen, and the first display screen 200 and the second display screen 300 are respectively different display areas on the flexible screen.
  • An example of the structure of the electronic device 100 is shown in FIG. 17 below. When the physical state of the electronic device changes, the display area used to display the user interface can change, and the displayed application can also change. A specific example is shown in FIG. 17 below.
  • FIG. 17 exemplarily shows yet another schematic diagram of human-computer interaction.
  • the first display screen 200 of the electronic device faces upward, and the electronic device displays the user interface 110 of the video application through the first display screen 200 .
  • the electronic device can detect the user's flip operation, and the flip operation can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device may display, through the second display screen 300, the user interface of the application associated with the second display screen 300, that is, the user interface 120 of the social application; at this time, the electronic device may be in the state shown in (B) of FIG. 17.
  • the second display screen 300 of the electronic device faces upward, and the electronic device displays the user interface 120 of the social application through the second display screen 300 .
  • the electronic device can detect the user's flip operation, and the flip operation can change the relative positions of the first display screen 200 and the second display screen 300 .
  • the electronic device can display the user interface of the application associated with the first display screen 200, that is, the user interface 110 of the video application, through the first display screen 200.
  • the electronic device may be in the state shown in (A) of FIG. 17 .
  • the electronic device is in an unfolded state, and when the first display screen 200 is facing the user, the electronic device can display the user interface through the first display screen 200 .
  • the camera 193 on the first display screen 200 can obtain the user's face information, but the camera 193 on the second display screen 300 cannot obtain the user's face information.
  • the electronic device can display the user interface through the second display screen 300 .
  • the camera 193 on the second display screen 300 can obtain the user's face information, but the camera 193 on the first display screen 200 cannot obtain the user's face information.
  • when the electronic device is in a folded state and the sixth display area 303 of the second display screen 300 faces the user, the electronic device can display the user interface through the sixth display area 303 (and optionally, the fifth display area 302). At this time, the camera 193 on the sixth display area 303 can obtain the user's face information, but the camera 193 on the fourth display area 301 cannot obtain the user's face information.
  • when the electronic device is in a folded state and the fourth display area 301 of the second display screen 300 faces the user, the electronic device can display the user interface through the fourth display area 301 (and optionally, the fifth display area 302). At this time, the camera 193 on the fourth display area 301 can obtain the user's face information, but the camera 193 on the sixth display area 303 cannot obtain the user's face information.
  • In other embodiments, both the first display screen 200 and the second display screen 300 may be bright, wherein the first display screen 200 displays the user interface of the application associated with the first display screen 200, and the second display screen 300 displays the user interface of the application associated with the second display screen 300.
  • Alternatively, the fourth display area 301 and the sixth display area 303 may both be bright, wherein the fourth display area 301 displays the user interface of the application associated with the fourth display area 301, and the sixth display area 303 displays the user interface of the application associated with the sixth display area 303; optionally, the fourth display area 301 or the sixth display area 303 displays the user interface together with the fifth display area 302.
  • For example, the first application can be closed (for example, the electronic device is turned off, or a user operation for closing the first application is received, etc.). When the physical state of the electronic device (for example, when the electronic device is turned on) changes from another physical state (for example, the folded state) to the unfolded state, the electronic device can automatically open the first application and display the user interface of the first application through the first display screen 200. That is to say, the user can quickly open an application by changing the physical state of the electronic device, which saves the user's operation steps for starting the application and is more convenient and quicker.
  • Other examples of association between display areas and applications are similar to the above examples, and will not be described again.
  • the electronic device can receive a user operation when displaying the setting interface or the desktop, and in response to the user operation, the electronic device cancels the association between the first display screen 200 and the first application.
  • the electronic device may also receive a user operation (such as clicking a lock control) acting on the first display screen 200, and in response to the user operation, the electronic device Cancel the association between the first display screen 200 and the first application.
  • the electronic device cancels the association between the first display screen 200 and the first application.
  • This application does not limit this.
  • the description of how the electronic device cancels the association between other display areas and applications is similar to the above process, and will not be repeated here.
  • the association relationship between the display area and the application may be preset by the system.
  • the second display screen 300 of the electronic device displays the user interface of the payment application by default.
  • the association between the display area and the application may also be determined by the electronic device in real time.
  • the electronic device determines that the application displayed on the first display screen 200 most recently is associated with the first display screen 200 .
  • the association relationship between the display area and the application may also be customized in response to a user operation. For specific examples, see Figures 18-22 below.
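Since the association between a display area and an application may come from a system preset, from what the electronic device determines in real time, or from a user setting, one plausible way to resolve it is to check those sources in a fixed priority order, as in the illustrative Kotlin sketch below. The priority order, the class and the field names are assumptions for illustration; this application does not prescribe them.

```kotlin
// Illustrative only: resolve the application for a display area by checking a
// user-set association first, then a system preset, then the application most
// recently displayed through that area.
data class AreaState(
    val userSetApp: String? = null,       // customized via interfaces like FIGs. 18-22
    val presetApp: String? = null,        // e.g. a payment application preset by the system
    val lastDisplayedApp: String? = null  // determined by the electronic device in real time
)

fun resolveAssociatedApp(area: AreaState): String? =
    area.userSetApp ?: area.presetApp ?: area.lastDisplayedApp

fun main() {
    val secondScreen = AreaState(presetApp = "payment", lastDisplayedApp = "video")
    println(resolveAssociatedApp(secondScreen)) // payment
}
```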
  • User interface 180 may include a settings interface 151, a guide example 152 and toggle options 153. Wherein:
  • the settings interface 151 may include a first title 1511 , a video application option 1512 , a social application option 1513 , and a game application option 1514 .
  • the first title 1511 of the setting interface 151 includes text information: "setting related applications", indicating that the setting interface 151 is used to set the relationship between the display area and the application.
  • the setting interface 151 can be used to set the association between any application and the display area, for example, the association between a video application, a social application, a game application and the display area can be set.
  • Video application options 1512 may include a first option 1512A and a second option 1512B.
  • the first option 1512A can be used by the user to set whether the video application is associated with the second display screen 300 , including text information: "display on the external screen in the folded state", where the external screen is the second display screen 300 .
  • the electronic device may detect a user operation (eg, a tap or swipe operation) acting on the first option 1512A, and in response to the operation, the electronic device may determine that the video application is associated with the second display screen 300, or cancel the association between the video application and the second display screen 300.
  • the first option 1512A of the user interface 180 represents that the video application and the second display screen 300 are not associated.
  • the second option 1512B can be used by the user to set whether the video application is associated with the first display screen 200 , including text information: "display on the inner screen in an expanded state", where the inner screen is the first display screen 200 .
  • the electronic device may detect a user operation (eg, a tap or swipe operation) acting on the second option 1512B, and in response to the operation, the electronic device may determine that the video application is associated with the first display screen 200, or cancel the association between the video application and the first display screen 200.
  • the second option 1512B of the user interface 180 represents that the video application is associated with the first display screen 200 .
  • the social application option 1513 may also include a third option 1513A and a fourth option 1513B.
  • the game application option 1514 may also include a fifth option 1514A and a sixth option.
  • the descriptions of the third option 1513A and the fifth option 1514A are similar to the descriptions of the first option 1512A, and the descriptions of the fourth option 1513B and the sixth option are similar to the descriptions of the second option 1512B, and will not be repeated.
  • the introductory example 152 may include a second title 1521 and a picture example 1522 .
  • the second title 1521 of the guide example 152 includes text information: "display on the inner screen in the expanded state", which is used to indicate that the scene shown in the picture example 1522 is: the electronic device is in the expanded state, and the user interface of the application is displayed through the first display screen 200.
  • in response to the physical state of the electronic device changing from other physical states to the expanded state, the electronic device may display the user interface of the video application through the first display screen 200, and at this time the electronic device may be in the state shown in the guide example 152.
  • Toggle options 153 may include a first example option 153A and a second example option 153B.
  • the first example option 153A of the switching option 153 is in the selected state, indicating that the guide example 152 is the first interface of the guide example displayed by the electronic device.
  • the electronic device may receive a swipe operation (eg, a swipe from right to left) by the user acting on the guide example 152, the toggle option 153, or a blank area of the user interface 180, and in response to the swipe operation, the electronic device may switch to display the interface of a second guide example, that is, the picture example 1542 of the guide example 154 shown in FIG. 19.
  • the second example option 153B is in the selected state, as shown in FIG. 19 .
  • the guide example 154 shown in FIG. 19 is different from the guide example 152 shown in FIG. 18, and the guide example 154 may include a third title 1541 and a picture example 1542.
  • the third title 1541 includes text information: "display on the external screen in the folded state", which is used to indicate that the scene shown in the picture example 1542 is: the electronic device is in the folded state, and the second display screen 300 displays an example of the user interface of the application.
  • in response to the physical state of the electronic device changing from other physical states to the folded state, the electronic device can display the user interface of the social application through the sixth display area 303 (optionally, and the fifth display area 302) of the second display screen 300, and the electronic device may be in the state shown in the guide example 154 at this time. Not limited to this, the electronic device may also display the user interface of the social application through the fourth display area 301 (optionally, and the fifth display area 302) of the second display screen 300.
  • the other contents of FIG. 19 are the same as those of FIG. 18 and will not be repeated.
  • one display area is usually only associated with one application. Therefore, in the scenario of setting the associated application shown in FIGS. 18-19, at most two options are enabled: one for setting the association between application A and the first display screen 200, and the other for setting the association between application B and the second display screen 300.
  • the applications that can be set include video applications, social applications, and game applications.
  • Only one of the first option 1512A, the third option 1513A and the fifth option 1514A can be turned on, that is, at most one application is associated with the second display screen 300; likewise, only one of the second option 1512B, the fourth option 1513B and the sixth option can be in an on state, that is, at most one application is associated with the first display screen 200.
  • the electronic device will cancel the association relationship between the original application and the first display screen 200, and set the other application to be associated with the first display screen 200.
  • the second option 1512B of the video application is in an on state, indicating that the video application has been associated with the first display screen 200 .
  • the electronic device detects a user operation (such as a click operation) acting on the fourth option 1513B, in response to the user operation, the electronic device will set the second option 1512B to an off state and set the fourth option 1513B to an on state .
  • the electronic device in response to the user operation, will cancel the association between the video application and the first display screen 200 , and set the association between the social application and the first display screen 200 .
  • the second display screen 300 is similar to the first display screen 200 and will not be repeated here.
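The exclusivity rule described above, under which each display screen is associated with at most one application and turning one option on implicitly turns the previous one off, can be pictured with the following assumed Kotlin sketch; the map-based bookkeeping and the identifiers are illustrative only and are not part of this application.

```kotlin
// Illustrative only: each display screen keeps at most one associated
// application. Turning an application's option on replaces any previously
// associated application; turning it off clears the association.
class AssociationTable {
    private val byScreen = mutableMapOf<String, String>() // screen id -> application id

    fun toggle(screen: String, app: String) {
        if (byScreen[screen] == app) {
            byScreen.remove(screen)    // the option is switched to the off state
        } else {
            byScreen[screen] = app     // any previous association is cancelled implicitly
        }
    }

    fun associatedApp(screen: String): String? = byScreen[screen]
}

fun main() {
    val table = AssociationTable()
    table.toggle("first_display_screen_200", "video")
    table.toggle("first_display_screen_200", "social") // video option off, social option on
    println(table.associatedApp("first_display_screen_200")) // social
}
```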
  • FIG. 20 exemplarily shows a schematic diagram of yet another user interface embodiment.
  • User interface 170 may include a status bar 171 and a list of application icons 172. Wherein:
  • the status bar 171 may include the name of the accessed mobile network, the WI-FI icon, the signal strength and the current remaining battery level.
  • the accessed mobile network is a fifth-generation mobile communication technology (5th generation mobile networks, 5G) network with a signal strength of 4 bars (that is, the best signal strength).
  • the list of application icons 172 may include, for example, a settings icon 1721, a calculator icon 1722, a music icon 1723, a gallery icon 1724, a dial icon 1725, a contacts icon 1726, an internet icon 1727, an SMS icon 1728, a camera icon 1729 and the like, and may also include icons of other applications, which are not limited in this embodiment of the present application.
  • the icon of any application can be used to respond to a user's operation, such as a touch operation, so that the electronic device starts the application corresponding to the icon.
  • the icon of any application can be used to respond to a user's operation, such as a long-press operation, so that the electronic device displays an application editing interface (eg, the editing interface 173 of the gallery).
  • the electronic device may detect a user operation (eg, a long press operation) of the user acting on the icon 1724 of the gallery, and in response to the user operation, the electronic device may display the editing interface 173 of the gallery.
  • the editing interface 173 of the gallery may include, for example, a remove option 173A, an association option 173B, an edit option 173C, a more option 173D, etc., and may also include other options, which are not limited in this embodiment of the present application.
  • the electronic device may detect a user operation (such as a click operation) acting on the association option 173B, and in response to the user operation, the electronic device may set the first display screen 200 to be associated with the gallery application, or cancel the association relationship between the first display screen 200 and the gallery application.
  • the user interface 170 shown in FIG. 20 is a user interface displayed through the first display screen 200 when the electronic device is in an unfolded state.
  • the association option 173B displayed by the electronic device is used for the user to set the association relationship between the gallery application and the first display screen 200 .
  • the electronic device receives a user operation (eg, a click operation) acting on the associated option 173B.
  • the electronic device may set the association between the second display screen 300 and the gallery application, or cancel the association between the second display screen 300 and the gallery application.
  • the electronic device may also determine the application associated with the fourth display area 301 and the application associated with the sixth display area 303 in the above manner. That is to say, the association option of the application is used by the user to set the association relationship between the display area currently used for display by the electronic device and the application.
  • User interface 210 may include thumbnails 181 for video applications, thumbnails 182 for social applications, thumbnails 183 for music, and a close option 184 .
  • the user interface 210 may be a multitasking interface entered through a gesture navigation function (eg, the user swipes up from the bottom edge of the screen and pauses), or a multitasking interface entered through a user operation acting on a multitasking option of the three-button navigation function or the floating navigation function, which is not limited in this embodiment of the present application.
  • the thumbnail 181 of the video application displays the application name 181A (ie, the video application) and the associated option 181B.
  • the electronic device may detect a user operation (eg, a click operation) acting on the association option 181B, and in response to the user operation, the electronic device may determine that the first display screen 200 is associated with the video application, or cancel the association relationship between the first display screen 200 and the video application.
  • the associated option 181B of the video application shown in the user interface 210 indicates that the video application has been associated with the first display screen 200 .
  • If the user clicks the association option 181B at this time, the electronic device will cancel the association relationship between the first display screen 200 and the video application, and the association option 181B will be displayed in the state shown by the association option 182A of the social application.
  • An associated option 182A is displayed on the thumbnail 182 of the social application.
  • the association option 182A of the social application shown in the user interface 210 indicates that the social application is not associated with the first display screen 200 . If the electronic device detects a user operation (eg, a click operation) acting on the association option 182A at this time, in response to the user operation, the electronic device may set the first display screen 200 to associate with a social application.
  • An application name 183A (ie, music) is displayed on the thumbnail 183 of the music.
  • the thumbnail 182 of the social application and the thumbnail 183 of the music display only part of the content, while the thumbnail 181 of the video application displays the complete content.
  • the electronic device may detect a left-right sliding operation acting on the user interface 210, and in response to the sliding operation, the electronic device may switch the positions of thumbnails of multiple applications. For example, when the user slides from right to left, the electronic device may place the thumbnail 181 of the video application in the position of the thumbnail 182 of the social application, and the thumbnail 183 of the music in the position of the thumbnail 181 of the video application.
  • If the applications run by the electronic device include only the social application, the video application and music, the electronic device may place the thumbnail 182 of the social application in the position of the thumbnail 183 of the music.
  • If the electronic device also runs other applications, such as a game application, the thumbnail of the game application may be placed in the position of the thumbnail 183 of the music, and the thumbnail 182 of the social application is not visible at this time.
  • the close option 184 can be used to close all applications running on the electronic device.
  • If the electronic device detects a user operation (such as a click operation) acting on the close option 184, the electronic device can close all applications running on the electronic device in response to the user operation, and display the desktop of the electronic device.
  • the electronic device may also close part of the running applications and display only the user interface of one application (for example, the user interface of the video application displaying the full thumbnail in the user interface 210).
  • The association option 182A of the social application and the association option 181B of the video application shown in FIG. 21 are similar to the association option 173B shown in FIG. 20: all of them are used by the user to set the association relationship between the application and the display area currently used by the electronic device for display. For example, when the electronic device is in a folded state and the user interface is displayed through the sixth display area 303, the electronic device receives a user operation (eg, a click operation) acting on the association option 181B of the video application. Then, in response to the user operation, the electronic device may determine that the sixth display area 303 is associated with the video application, or cancel the association between the sixth display area 303 and the video application.
  • FIG. 22 exemplarily shows a schematic diagram of yet another user interface embodiment.
  • the electronic device is in an unfolded state, and the user interface 110 of the video application is displayed through the first display screen 200 .
  • User interface 110 may include association options 110A.
  • the electronic device may detect a user operation (eg, a click operation) acting on the association option 110A, and in response to the user operation, the electronic device may determine that the first display screen 200 is associated with the video application, or cancel the association relationship between the first display screen 200 and the video application.
  • the association option 110A shown in the user interface 110 indicates that the video application has been associated with the first display screen 200 .
  • If the user clicks the association option 110A, the electronic device will cancel the association relationship between the first display screen 200 and the video application, and the association option 110A will be displayed in the state shown by the association option 182A of the social application in FIG. 21 above.
  • the user interface of the video application is displayed through the second display screen 300 .
  • the user interface of the video application may also include an association option.
  • the electronic device may also receive a user action (eg, a click action) acting on the associated option.
  • the electronic device may determine that the second display screen 300 is associated with the video application, or cancel the association relationship between the second display screen 300 and the video application.
  • the electronic device may also determine the application associated with the fourth display area 301 and the application associated with the sixth display area 303 in the above manner.
  • one display area is usually only associated with one application.
  • the applications that can be set include video applications, social networking applications, and music. Only one of the associated option 181B of the video application, the associated option 182A of the social application, and the associated option of the music can be turned on.
  • the electronic device will cancel the association between the original application and the first display screen 200, and set the other application to be associated with the first display screen 200.
  • the association option 181B of the video application is in an on state, which indicates that the video application has been associated with the first display screen 200 .
  • If the electronic device detects a user operation (such as a click operation) acting on the association option 182A of the social application, in response to the user operation, the electronic device will set the association option 181B of the video application to the off state and set the association option 182A of the social application to the on state.
  • That is to say, in response to the user operation, the electronic device will cancel the association between the video application and the first display screen 200, and set the association between the social application and the first display screen 200.
  • the second display screen 300 is similar to the first display screen 200 and will not be repeated here.
  • the manner of setting the association between the first display screen 200 and the application may be the same as the manner of setting the association between the second display screen 300 and the application.
  • the electronic device may receive a click operation acting on the first option 1512A and the fourth option 1513B.
  • the electronic device may determine that the first display screen 200 is associated with a social application, and determine that the second display screen 300 is associated with a video application.
  • the manner of setting the association between the first display screen 200 and the application may also be different from the manner of setting the association between the second display screen 300 and the application. For example, in the embodiment shown in FIG.
  • the electronic device sets the association between the first display screen 200 and the video application in response to a user operation.
  • the association relationship between the second display screen 300 and the application may be determined by the electronic device in real time, that is, the application associated with the second display screen 300 is the application displayed by the electronic device through the second display screen 300 most recently.
  • the embodiments of the present application do not limit the specific manner and specific time for setting the association between different display areas and applications.
  • FIG. 23 exemplarily shows a cooperative relationship of various components in the electronic device 100 in the scenario shown in FIG. 6 .
  • the electronic device 100 is in an expanded state, and the user interface of the first application is displayed on the first display screen 200 .
  • the angle sensor 180M detects the bending angle of the electronic device 100 and reports it to the processor 110.
  • the processor 110 determines that the bending angle of the first display screen 200 is smaller than the first angle threshold, and accordingly determines that the display area is the sixth display area 303 of the second display screen 300 .
  • the processor 110 determines that there is no application associated with the sixth display area 303 of the second display screen 300 .
  • the processor 110 instructs the sixth display area 303 of the second display screen 300 to continue to display the user interface of the application displayed by the electronic device 100 in the expanded state, that is, the user interface of the first application.
  • the sixth display area 303 of the second display screen 300 displays the user interface of the first application.
  • the pressure sensor 180A detects a touch operation on the sixth display area 303 of the second display screen 300 , and the touch operation is used to open the second application.
  • the pressure sensor 180A reports the event of the touch operation to the processor 110 .
  • the processor 110 determines that the displayed application is the second application according to the event of the touch operation.
  • the processor 110 instructs the sixth display area 303 of the second display screen 300 to display the user interface of the second application.
  • the sixth display area 303 of the second display screen 300 displays the user interface of the second application.
  • the order of the above 3 and 4 is not limited, and may be executed simultaneously.
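The first steps of this flow amount to comparing the reported bending angle against two thresholds to pick the active display area. A minimal Kotlin sketch follows, assuming hypothetical threshold values (the application only names a "first" and a "second" angle threshold, not their magnitudes).

```kotlin
// Illustrative only: below the first angle threshold the device is treated as
// folded and the sixth display area 303 is used; above the second threshold it
// is treated as unfolded and the first display screen 200 is used. The numeric
// threshold values here are assumptions, not values from this application.
const val FIRST_ANGLE_THRESHOLD = 60.0    // assumed, in degrees
const val SECOND_ANGLE_THRESHOLD = 150.0  // assumed, in degrees

enum class ActiveArea { SIXTH_DISPLAY_AREA_303, FIRST_DISPLAY_SCREEN_200, UNCHANGED }

fun decideActiveArea(bendingAngle: Double): ActiveArea = when {
    bendingAngle < FIRST_ANGLE_THRESHOLD  -> ActiveArea.SIXTH_DISPLAY_AREA_303
    bendingAngle > SECOND_ANGLE_THRESHOLD -> ActiveArea.FIRST_DISPLAY_SCREEN_200
    else -> ActiveArea.UNCHANGED          // intermediate (bent) angle: keep the current area
}

fun main() {
    println(decideActiveArea(30.0))   // SIXTH_DISPLAY_AREA_303
    println(decideActiveArea(170.0))  // FIRST_DISPLAY_SCREEN_200
}
```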
  • FIG. 24 exemplarily shows yet another cooperation relationship of various components in the electronic device 100 in the scenario shown in FIG. 6 .
  • the angle sensor 180M detects the bending angle of the electronic device 100 and reports it to the processor 110.
  • the processor 110 determines that the bending angle of the electronic device 100 is greater than the second angle threshold, and accordingly determines that the display area is the first display screen 200 .
  • the processor 110 determines that the first display screen 200 is associated with the first application.
  • the association relationship may be determined by the processor 110 by itself, or may be set in response to a user operation. For specific examples, please refer to FIG. 18-FIG. 22 above.
  • the processor 110 instructs the first display screen 200 to display the user interface of the application associated with the first display screen 200, that is, the user interface of the first application.
  • the first display screen 200 displays the user interface of the first application.
  • the angle sensor 180M detects the bending angle of the electronic device 100 and reports it to the processor 110.
  • the processor 110 determines that the bending angle of the electronic device 100 is smaller than the first angle threshold, and accordingly determines that the display area is the sixth display area 303 of the second display screen 300 .
  • the processor 110 determines that the last application displayed on the electronic device 100 in the folded state is the second application, that is, the last application displayed on the second display screen 300 is the second application.
  • the processor 110 instructs the sixth display area 303 of the second display screen 300 to display the user interface of the application displayed in the folded state of the electronic device 100 last time, that is, the user interface of the second application.
  • the sixth display area 303 of the second display screen 300 displays the user interface of the second application.
  • the sequence of the above 13 and 14 is not limited, and may be executed simultaneously.
  • the order of the above 18 and 19 is not limited, and may be executed simultaneously.
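Read together, the flows of FIG. 23 and FIG. 24 suggest a simple priority when the display area changes: show the application associated with the new area if there is one, otherwise the application last displayed through that area, otherwise simply continue with the application that was on screen before the change. The Kotlin sketch below expresses that reading; it is an interpretation with assumed parameter names rather than code from this application.

```kotlin
// Illustrative only: decide which application to display after the physical
// state (and hence the active display area) changes. The three inputs stand in
// for the processor's bookkeeping described in FIGs. 23-26.
fun appToDisplay(
    associatedApp: String?,     // application associated with the new display area, if any
    lastShownInArea: String?,   // application last displayed through that area, if any
    currentApp: String          // application displayed before the physical state changed
): String = associatedApp ?: lastShownInArea ?: currentApp

fun main() {
    // FIG. 23: no association and no history for the outer area -> keep the first application.
    println(appToDisplay(null, null, "first_app"))           // first_app
    // FIG. 24: the first display screen 200 is associated with the first application.
    println(appToDisplay("first_app", "second_app", "x"))    // first_app
    // Folding again after the second application was shown on the second display screen 300.
    println(appToDisplay(null, "second_app", "first_app"))   // second_app
}
```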
  • FIG. 25 exemplarily shows a cooperative relationship of various components in the electronic device 100 in the scenario shown in FIG. 7 .
  • the electronic device 100 is in an expanded state, and the user interface of the first application is displayed on the first display screen 200 .
  • the pressure sensor 180A detects a touch operation acting on the first display screen 200, and the touch operation is used to open the third application.
  • the pressure sensor 180A reports the event of the touch operation to the processor 110 .
  • the processor 110 determines that the displayed application is the third application according to the event of the touch operation.
  • the processor 110 instructs the first display screen 200 to display the user interface of the third application.
  • the first display screen 200 displays the user interface of the third application.
  • the angle sensor 180M detects the bending angle of the electronic device 100 and reports it to the processor 110.
  • the processor 110 determines that the bending angle of the electronic device 100 is smaller than the first angle threshold, and accordingly determines that the display area is the sixth display area 303 of the second display screen 300 .
  • the processor 110 determines that there is no application associated with the sixth display area 303 of the second display screen 300 .
  • the processor 110 instructs the sixth display area 303 of the second display screen 300 to continue to display the user interface of the application displayed by the electronic device 100 in the expanded state, that is, the user interface of the third application.
  • the sixth display area 303 of the second display screen 300 displays the user interface of the third application.
  • the order of the above 8 and 9 is not limited, and may be executed simultaneously.
  • FIG. 26 exemplarily shows yet another cooperation relationship of various components in the electronic device 100 in the scenario shown in FIG. 7 .
  • the angle sensor 180M detects the bending angle of the electronic device 100 and reports it to the processor 110.
  • the processor 110 determines that the bending angle of the electronic device 100 is greater than the second angle threshold, and accordingly determines that the display area is the first display screen 200.
  • the processor 110 determines that the first display screen 200 is associated with the first application.
  • the association relationship may be determined by the processor 110 by itself, or may be set in response to a user operation. For specific examples, please refer to FIG. 18-FIG. 22 above.
  • the processor 110 instructs the first display screen 200 to display the user interface of the application associated with the first display screen 200, that is, the user interface of the first application.
  • the first display screen 200 displays the user interface of the first application.
  • the angle sensor 180M detects the bending angle of the electronic device 100 and reports it to the processor 110.
  • the processor 110 determines that the bending angle of the electronic device 100 is smaller than the first angle threshold, and accordingly determines that the display area is the sixth display area 303 of the second display screen 300 .
  • the processor 110 determines that the last application displayed on the electronic device 100 in the folded state is the third application, that is, the last application displayed on the second display screen 300 is the third application.
  • the processor 110 instructs the sixth display area 303 of the second display screen 300 to display the user interface of the application displayed in the folded state of the electronic device 100 last time, that is, the user interface of the third application.
  • the sixth display area 303 of the second display screen 300 displays the user interface of the third application.
  • the sequence of the above 13 and 14 is not limited, and may be executed simultaneously.
  • the order of the above 18 and 19 is not limited, and may be executed simultaneously.
  • FIG. 27 is a schematic flowchart of a display method provided by an embodiment of the present application.
  • the method can be applied to the electronic device 100 shown in FIG. 1 .
  • the method can also be applied to the electronic device 100 shown in FIG. 2 .
  • the method includes but is not limited to the following steps:
  • the electronic device includes a first display area and a second display area.
  • the user interface is displayed in the first display area
  • the user interface is displayed in the second display area.
  • the first display area includes at least part of the second display area; and/or the second display area includes at least part of the first display area.
  • the first display area and the second display area belong to the same display screen of the electronic device, such as the second display screen 300 shown in Figures 3-5 above.
  • the first display area includes the fourth display area 301 and the fifth display area 302 shown in Figures 3-5 above.
  • the second display area includes the sixth display area 303 and the fifth display area 302 shown in Figures 3-5 above; an overlapping display area, namely the fifth display area 302, exists between the first display area and the second display area.
  • the electronic device is a foldable electronic device.
  • the first display area and the second display area are on the same plane.
  • the light emitting surface of the first display area and the light emitting surface of the second display area face away from each other.
  • the first display area and the second display area belong to the same display screen of the electronic device, for example, the second display screen 300 shown in FIGS. 3-5 above.
  • the first display area and the second display area are the fourth display area 301 and the sixth display area 303 shown in FIGS. 3-5 above, respectively.
  • the first physical state and the second physical state may both be folded states
  • the third user operation is an operation of the user turning over the electronic device.
  • the first display area faces upward and the second display area faces downward; in the second physical state, the second display area faces upward and the first display area faces downward.
  • the first display area faces the user in the first physical state, and the second display area faces the user in the second physical state.
  • the electronic device includes a first display screen, the first display area is at least part of the display area of the first display screen, and the second display area is at least part of the display area of the first display screen.
  • the first display screen may be a flexible folding screen, and the first display area and the second display area are display areas on the flexible folding screen.
  • the electronic device includes a first display screen and a second display screen, the first display area is the display area of the first display screen, and the second display area is the display area of the second display screen.
  • the first display screen may be a display screen formed by splicing two rigid screens with a flexible screen, a hinge or other connecting components; the first display area is a display area on one rigid screen, and the second display area is a display area on the other rigid screen.
  • the electronic device is a foldable electronic device, the electronic device includes a first display screen, the second display area is a full-screen display area of the first display screen, and the first display area is a partial display area of the first display screen,
  • the first physical state is the folded state
  • the second physical state is the unfolded state.
  • the first display screen is the second display screen 300 shown in Figures 3-5 above
  • the second display area includes the fourth display area 301, the fifth display area 302 and The sixth display area 303 .
  • the first display area is the fourth display area 301 or the sixth display area 303 , and optionally, the first display area may further include the fifth display area 302 .
  • the electronic device is a foldable electronic device, and when the electronic device is in an unfolded state, the light emitting surface of the first display area and the light emitting surface of the second display area face away from each other.
  • the first display area and the second display area are the first display screen 200 and the second display screen 300 shown in FIGS. 3-5 above.
  • the first display area is at least one display area on the first display screen 200 , for example, the first display area 201 .
  • the second display area is at least one display area on the second display screen 300 , for example, the fourth display area 301 .
  • the first physical state is an unfolded state
  • the second physical state is a folded state
  • the third user operation is an operation performed by the user to convert the electronic device from the unfolded state to the folded state.
  • both the first physical state and the second physical state are unfolded states
  • the third user operation is an operation of the user turning over the electronic device.
  • the first display area faces upward and the second display area faces downward
  • the second display area faces upward and the first display area faces downward.
  • the first display area faces the user in the first physical state
  • the second display area faces the user in the second physical state.
  • examples of the structure and physical state change process of the electronic device can be seen in Figures 3-15 above.
  • the electronic device is not foldable, and the light emitting surface of the first display area and the light emitting surface of the second display area face away from each other.
  • the third user operation is an operation in which the user turns over the electronic device.
  • the first display area faces upward and the second display area faces downward; in the second physical state, the second display area faces upward and the first display area faces downward.
  • the first display area faces the user in the first physical state, and the second display area faces the user in the second physical state.
  • an example of the structure and physical state change process of the electronic device can be seen in Figure 17 above.
  • the method may further include: when the user interface of the first application is displayed through the first display area, in response to the physical state of the electronic device changing from the first physical state to the second physical state, the electronic device displays the user interface of the first application through the second display area; when the electronic device displays the user interface of the first application through the second display area, a fourth user operation is received; in response to the fourth user operation, the electronic device displays the user interface of a third application through the second display area, and the third application is different from the first application.
  • the electronic device in response to the physical state of the electronic device changing from the first physical state to the second physical state, may display, through the second display area, a user interface of an application that was last displayed by the electronic device in the second physical state.
  • the method may further include: in response to the physical state of the electronic device changing from the first physical state to the second physical state, the electronic device displays the user interface of the second application through the second display area.
  • the electronic device can also customize the association between the second display area and the fourth application in response to a user operation.
  • the fourth application is different from the first application.
  • the method may further include: receiving a fifth user operation; in response to the fifth user operation, the electronic device determines that the second display area is associated with the fourth application; when displaying the user interface of the first application through the first display area , in response to the physical state of the electronic device being transformed from the first physical state to the second physical state, the electronic device displays the user interface of the fourth application through the second display area.
  • the electronic device may also preset the association between the first display area and the first application, and the present application does not limit the manner of determining the association between the display area and the application.
  • the electronic device may store the association relationship between the display area and the application through the data set.
  • the data set may be temporarily stored when the electronic device determines that there is an association relationship, thereby reducing unnecessary storage overhead.
  • the data set may include a set of data including an identifier of the first display area, an identifier of an application associated with the first display area, and optionally, an identifier of the first physical state.
  • the data set may also include two groups of data, wherein one group of data is the above-mentioned group of data, and the other group of data includes the identifier of the second display area and the identifier of the application associated with the second display area and, optionally, an identifier of the second physical state.
  • the electronic device can determine the association relationship between the display area and the application according to the data set, so as to display the associated application through the corresponding display area in different physical states.
  • the electronic device stores the association relationship between the first display area, the second display area and the application through a data set, where the data set includes two groups of data. After the electronic device determines that the first display area is associated with the first application, one group of data in the data set includes the identifier of the first display area and the identifier of the first application, while the other group of data may be empty. After the electronic device determines that the first display area is associated with the first application, when the physical state changes to the second physical state for the first time, the electronic device can continue to display, through the second display area, the user interface of the application displayed when the electronic device was in the first physical state.
  • the electronic device may determine that another set of data in the data set includes the identifier of the second display area and the identifier of the second application. Therefore, when the physical state of the electronic device is subsequently changed to the first physical state, the electronic device may determine to display the user interface of the first application through the first display area according to the data set. When the physical state of the electronic device is transformed into the second physical state, the electronic device may determine, according to the data set, to display the user interface of the second application through the second display area.
  • the association between the display area and the application may change, and the electronic device may update the application identifier stored in the data set.
  • the data set originally includes the identifier of the second display area and the identifier of application A.
  • the electronic device receives a user operation acting on the setting interface, and the user operation is used to set the association between the second display area and application B; then the electronic device can update the other group of data in the data set, and the updated group of data includes the identifier of the second display area and the identifier of application B.
  • the electronic device can update the other group of data in the data set, and the updated group of data includes the identifier of the second display area and the identifier of application C.
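One assumed way to picture the data set described above is as a small table of records, each holding a display-area identifier, an application identifier and, optionally, a physical-state identifier, with a record being overwritten whenever the associated application changes (as in the application A to application B example). The Kotlin sketch below is illustrative only; its class and field names are assumptions.

```kotlin
// Illustrative only: a possible shape for the data set that stores the
// association relationships, one record per display area, updated in place
// whenever the associated application changes.
data class AssociationRecord(
    val displayAreaId: String,
    var applicationId: String?,
    val physicalStateId: String? = null  // optional, per the description above
)

class AssociationDataSet {
    private val records = mutableListOf<AssociationRecord>()

    // Create or overwrite the record for a display area (e.g. replacing
    // application A with application B for the second display area).
    fun update(displayAreaId: String, applicationId: String?, physicalStateId: String? = null) {
        val existing = records.find { it.displayAreaId == displayAreaId }
        if (existing != null) existing.applicationId = applicationId
        else records.add(AssociationRecord(displayAreaId, applicationId, physicalStateId))
    }

    fun lookup(displayAreaId: String): String? =
        records.find { it.displayAreaId == displayAreaId }?.applicationId
}

fun main() {
    val dataSet = AssociationDataSet()
    dataSet.update("second_display_area", "application_A")
    dataSet.update("second_display_area", "application_B") // A is replaced by B
    println(dataSet.lookup("second_display_area"))          // application_B
}
```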
  • when the electronic device determines that an application is associated with a display area, if the electronic device runs the application, even if the application is not displayed, the electronic device can retain all or most of the processes of the application without reclaiming the network resources, system resources, etc. required to run the application. For example, the electronic device determines that the first display area is associated with the first application, and the electronic device runs the first application and the second application. When the electronic device is in the second physical state and displays the user interface of the second application through the second display area, the electronic device will not reclaim the resources required for running the first application.
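The resource-retention behaviour can be read as: a background application keeps its processes as long as it is associated with some display area. A minimal assumed Kotlin sketch of that rule (the parameter names are illustrative, not from this application):

```kotlin
// Illustrative only: reclaim the resources of a background application only if
// it is neither currently displayed nor associated with any display area.
fun mayReclaimResources(app: String, displayedApp: String, associatedApps: List<String>): Boolean =
    app != displayedApp && app !in associatedApps

fun main() {
    val associatedApps = listOf("first_app")  // first display area <-> first application
    println(mayReclaimResources("first_app", "second_app", associatedApps)) // false: keep its processes
    println(mayReclaimResources("music", "second_app", associatedApps))     // true: may be reclaimed
}
```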
  • the electronic device can display through the first display area: the user interface of the first application displayed through the first display area when the electronic device was in the first physical state last time.
  • In this way, situations such as user data loss caused by the electronic device closing any process of the first application are avoided, the first application can be opened faster, and the user experience is improved.
  • the electronic device can display the user interface of the first application through the first display area. That is to say, the user can quickly switch the displayed application by changing the physical state of the electronic device, without repeatedly exiting or hiding the currently displayed application and reopening the desired application, which greatly facilitates the use of the user.
  • the electronic device has different display areas for displaying the user interface in different physical states, the existing layout of the application interface is not changed, the display effect is better, and the use is more convenient.
  • The above-mentioned embodiments may be implemented in whole or in part by software, hardware, firmware or any combination thereof.
  • When implemented by software, they may be implemented in whole or in part in the form of a computer program product.
  • the computer program product described above includes one or more computer instructions.
  • When the computer program instructions described above are loaded and executed on a computer, the procedures or functions described above in accordance with the present application are produced in whole or in part.
  • the aforementioned computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the above-mentioned computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium; for example, the above-mentioned computer instructions may be transmitted from a website, computer, server or data center to another website, computer, server or data center by wired (eg, coaxial cable, optical fiber, digital subscriber line) or wireless (eg, infrared, radio, microwave) means.
  • the above-mentioned computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media.
  • the above-mentioned usable media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, digital versatile disc (DVD)), or semiconductor media (eg, solid state disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a display method and an electronic device. The method is applied to an electronic device comprising a first display area and a second display area, and comprises: receiving a first operation; in response to the first operation, the electronic device determining that the first display area is associated with a first application; when the electronic device is in a first physical state, displaying a user interface of the first application through the first display area; receiving a second operation; in response to the second operation, the electronic device displaying a user interface of a second application through the first display area; in response to the physical state of the electronic device changing from the first physical state to a second physical state, the electronic device displaying the user interface of the second application through the second display area; and in response to the physical state of the electronic device changing from the second physical state to the first physical state, the electronic device displaying the user interface of the first application through the first display area. The present application enables a user to quickly and conveniently switch the application displayed by the electronic device.

Description

一种显示方法及电子设备
本申请要求于2021年01月30日提交中国专利局、申请号为202110134456.7、申请名称为“一种显示方法及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子设备技术领域,尤其涉及一种显示方法及电子设备。
背景技术
用户使用智能手机、平板电脑等电子设备的过程中,经常需要频繁切换电子设备显示的应用程序(简称应用)。例如,用户使用游戏应用娱乐时,经常需要切换到社交应用回复消息。目前,用户可以通过三键导航、手势导航等功能关闭或隐藏电子设备当前显示的应用,然后打开需使用的其他应用以使电子设备显示其他应用的用户界面。这样的操作较为繁琐,响应速度较慢。或者,用户也可以通过分屏、悬浮窗功能使电子设备显示多个应用的用户界面,但这样每个应用的用户界面都较小,显示效果不好,用户使用起来也不方便。
发明内容
本申请实施例公开了一种显示方法及电子设备,能够快速便捷地切换显示的应用,并且每个应用的显示效果也较好。
第一方面,本申请实施例提供了一种显示方法,应用于电子设备,上述电子设备包括第一显示区域和第二显示区域,上述方法包括:接收第一用户操作;响应于上述第一用户操作,上述电子设备确定上述第一显示区域和第一应用关联;上述电子设备处于第一物理状态时,通过上述第一显示区域显示上述第一应用的用户界面;接收第二用户操作;响应于上述第二用户操作,上述电子设备通过上述第一显示区域显示第二应用的用户界面,上述第一应用和上述第二应用不同;当通过上述第一显示区域显示上述第二应用的用户界面时,响应于第三用户操作,上述电子设备的物理状态由上述第一物理状态变换为第二物理状态,上述电子设备通过上述第二显示区域显示上述第二应用的用户界面;当通过上述第二显示区域显示上述第二应用的用户界面时,响应于上述电子设备的物理状态由上述第二物理状态变换为上述第一物理状态,上述电子设备通过上述第一显示区域显示上述第一应用的用户界面。
本申请实施例中,电子设备确定第一显示区域和第一应用关联之后,当电子设备的物理状态变换为第一物理状态时,电子设备会通过第一显示区域显示第一应用的用户界面。也就是说,用户可以通过改变电子设备的物理状态来快速切换电子设备显示的应用,而无需多次退出或隐藏当前显示的应用并重新打开想要查看的应用,大大方便了用户的使用。并且,电子设备在不同物理状态下用于显示用户界面的显示区域不同,不会改变应用界面的已有布局,显示效果更好。
在一种可能的实现方式中,上述第一显示区域包括至少部分上述第二显示区域;和/或,上述第二显示区域包括至少部分上述第一显示区域。
示例性地,第一显示区域和第二显示区域属于电子设备的同一个显示屏。第一显示区域 和第二显示区域存在重叠的显示区域,或者,第一显示区域是第二显示区域的部分显示区域,或者,第二显示区域是第一显示区域的部分显示区域。
在一种可能的实现方式中,上述电子设备为可折叠电子设备;当上述电子设备处于展开状态时,上述第一显示区域和上述第二显示区域处于同一个平面;当上述电子设备处于折叠状态时,上述第一显示区域的出光面和上述第二显示区域的出光面相背。
在一些实施例中,上述电子设备包括第一显示屏,上述第一显示区域为上述第一显示屏的至少部分显示区域,上述第二显示区域为上述第一显示屏的至少部分显示区域。
在一些实施例中,上述电子设备包括第一显示屏和第二显示屏,上述第一显示区域为上述第一显示屏的显示区域,上述第二显示区域为上述第二显示屏的显示区域。
在一些实施例中,上述第一物理状态和上述第二物理状态均为折叠状态,上述第三用户操作为上述用户翻转上述电子设备的操作。
在一种可能的实现方式中,上述电子设备为可折叠电子设备,上述电子设备包括第一显示屏,上述第二显示区域为上述第一显示屏的全屏显示区域,上述第一显示区域为上述第一显示屏的部分显示区域,上述第一物理状态为折叠状态,上述第二物理状态为展开状态。
在一种可能的实现方式中,上述电子设备为可折叠电子设备,当上述电子设备处于展开状态时,上述第一显示区域的出光面和上述第二显示区域的出光面相背;上述第一物理状态为上述展开状态,上述第二物理状态为折叠状态,上述第三用户操作为上述用户将上述电子设备由上述展开状态转换为上述折叠状态的操作。
在一种可能的实现方式中,上述电子设备为可折叠电子设备,当上述电子设备处于展开状态时,上述第一显示区域的出光面和上述第二显示区域的出光面相背;上述第一物理状态和上述第二物理状态均为上述展开状态,上述第三用户操作为上述用户翻转上述电子设备的操作。
本申请实施例中,第一显示区域和第二显示区域的形态多种多样,本申请的显示方法可以应用于各种形态的可折叠电子设备,应用场景较为广泛,可用性较高。
在一种可能的实现方式中,上述方法还包括:当通过上述第一显示区域显示上述第一应用的用户界面时,响应于上述电子设备的物理状态由上述第一物理状态变换为上述第二物理状态,上述电子设备通过上述第二显示区域显示上述第一应用的用户界面;当上述电子设备通过上述第二显示区域显示上述第一应用的用户界面时,接收第四用户操作;响应于上述第四用户操作,上述电子设备通过上述第二显示区域显示第三应用的用户界面,上述第三应用和上述第一应用不同。
本申请实施例中,电子设备的物理状态由第一物理状态变换为第二物理状态时,可以通过第二显示区域继续显示电子设备处于第一物理状态时显示的应用。因此用户可以继续使用电子设备处于第一物理状态时显示的应用,用户无需重新手动打开该应用,该应用的数据也不会丢失,用户使用更方便。
在一种可能的实现方式中,当通过上述第二显示区域显示上述第二应用的用户界面时,响应于上述电子设备的物理状态由上述第二物理状态变换为上述第一物理状态,上述电子设备通过上述第一显示区域显示上述第一应用的用户界面之后,上述方法还包括:响应于上述电子设备的物理状态由上述第一物理状态变换为上述第二物理状态,上述电子设备通过上述第二显示区域显示上述第二应用的用户界面。
本申请实施例中,电子设备的物理状态由第一物理状态变换为第二物理状态时,可以通过第二显示区域显示电子设备最近一次处于第二物理状态时显示的应用。因此用户可以继续 使用电子设备最近一次处于第二物理状态时显示的应用,用户无需重新手动打开该应用,该应用的数据也不会丢失,用户使用更方便。
在一种可能的实现方式中,上述方法还包括:接收第五用户操作;响应于上述第五用户操作,上述电子设备确定上述第二显示区域和第四应用关联;当通过上述第一显示区域显示上述第一应用的用户界面时,响应于上述电子设备的物理状态由上述第一物理状态变换为上述第二物理状态,上述电子设备通过上述第二显示区域显示上述第四应用的用户界面,上述第一应用和上述第四应用不同。
本申请实施例中,用户可以设置不同显示区域和不同应用关联。若用户需要频繁使用第一应用和第四应用,用户可以直接转换电子设备的物理状态以此快速切换电子设备显示的应用,而无需多次手动操作,大大方便了用户的使用。
在一种可能的实现方式中,上述接收第一用户操作时,上述电子设备处于上述第一物理状态,并通过上述第一显示区域显示上述第一应用的用户界面;上述第一用户操作为作用于显示上述第一应用的用户界面的显示区域的用户操作。
本申请实施例中,用户可以直接在电子设备通过第一显示区域显示第一应用的用户界面时,设置第一显示区域和第一应用关联,而无需到特定界面进行设置,用户使用更加方便。
第二方面,本申请实施例提供了一种图形用户界面(graphical user interface,GUI)的显示方法,应用于电子设备,该电子设备包括第一显示区域和第二显示区域,该显示方法为第一方面、第一方面的任意一种实现方式提供的显示方法。
第三方面,本申请实施例提供了一种电子设备,该电子设备包括第一显示区域、第二显示区域、一个或多个存储器、一个或多个处理器;上述一个或多个存储器用于存储计算机程序,上述一个或多个处理器用于调用上述计算机程序,上述计算机程序包括指令,当上述指令被上述一个或多个处理器执行时,使得上述电子设备执行第一方面、第一方面的任意一种实现方式提供的显示方法。
第四方面,本申请实施例提供了一种计算机存储介质,包括计算机程序,该计算机程序包括指令,当该指令在处理器上运行时实现第一方面、第一方面的任意一种实现方式提供的显示方法。
第五方面,本申请实施例提供了一种计算机程序产品,当该计算机程序产品在电子设备上运行时,使得该电子设备执行第一方面、第一方面的任意一种实现方式提供的显示方法。
第六方面,本申请实施例提供了一种芯片,该芯片包括至少一个处理器和接口电路,可选地,该芯片还包括存储器;上述存储器、上述接口电路和上述至少一个处理器通过线路互联,上述至少一个存储器中存储有计算机程序;上述计算机程序被上述至少一个处理器执行时实现第一方面、第一方面的任意一种实现方式提供的显示方法。
可以理解地,上述第三方面提供的电子设备、第四方面提供的计算机存储介质、第五方面提供的计算机程序产品以及第六方面提供的芯片均用于执行第一方面、第一方面的任意一种实现方式提供的显示方法。因此,其所能达到的有益效果可参考第一方面所提供的显示方法中的有益效果,此处不再赘述。
附图说明
以下对本申请实施例用到的附图进行介绍。
图1是本申请实施例提供的一种电子设备的硬件结构示意图;
图2是本申请实施例提供的一种电子设备的软件架构示意图;
图3-图5是本申请实施例提供的一些电子设备的物理状态的示意图;
图6-图17是本申请实施例提供的一些人机交互的示意图;
图18-图22是本申请实施例提供的一些用户界面实施例的示意图;
图23-图26是本申请实施例提供的一些电子设备内部的硬件驱动交互的流程示意图;
图27是本申请实施例提供的一种显示方法的流程示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。本申请实施例的实施方式部分使用的术语仅用于对本申请的具体实施例进行解释,而非旨在限定本申请。
本申请提供了一种显示方法,可以应用于电子设备,该电子设备可以包括第一显示区域和第二显示区域。电子设备的物理状态改变,电子设备的显示区域随之变化,显示的应用也随之变化。例如,第一显示区域和第一应用关联,第二显示区域和第二应用关联。因此,电子设备的物理状态处于第一物理状态时,电子设备可以通过第一显示区域显示第一应用的用户界面。电子设备的物理状态处于第二物理状态时,电子设备可以通过第二显示区域显示第二应用的用户界面。从而用户可以快速便捷地切换电子设备显示的应用,使用更加方便。
在一些实施例中,电子设备配置有可折叠的显示屏(可以称为折叠屏)。该电子设备可以称为可折叠电子设备。折叠屏折叠也可以称为电子设备折叠,折叠屏的物理状态也可以称为电子设备的物理状态。为了描述简便,以下实施例将可折叠电子设备简称为电子设备进行说明。
本申请中,电子设备可以运行至少一个应用。电子设备可以显示运行的应用的用户界面,也可以不显示运行的应用的用户界面。电子设备显示一个应用的用户界面时,可以检测作用于电子设备的用户操作(例如作用于显示屏的点击操作),响应于该操作,电子设备可以基于该应用执行相应的任务。也就是说,用户可以通过操作该应用的用户界面来操作该应用。这种应用可以提供给用户的功能较多,所需的资源也较多。电子设备可以通过任意可用的资源来运行该应用,例如带宽等网络资源、中央处理器(central processing unit,CPU)等***资源。
而电子设备运行的其他应用虽然不会被电子设备显示,但电子设备可以显示这些应用的通知消息、弹框消息等提示信息。例如,社交应用在电子设备中运行但未被显示时,若电子设备接收到其他用户基于该社交应用发送的消息,则可以将该消息以通知消息的形式推送给用户。也就是说,用户虽然无法操作该应用的用户界面,但可以接收到该应用的提示信息,以此通过该提示信息获取该应用的相关进程。这种应用可以提供给用户的功能较少,所需的资源也较少。例如,电子设备显示社交应用的用户界面时,用户可以通过社交应用的用户界面输入文字或语音,而未显示用户界面时用户无法输入文字或语音。电子设备可以通过部分可用的资源来运行上述其他应用。
本申请实施例涉及的电子设备可以但不限于是手机、平板电脑、个人数字助理(Personal Digital Assistant,PDA)、手持计算机、可穿戴电子设备(例如智能手表、智能手环)、增强现实(augmented reality,AR)设备(例如AR眼镜)、虚拟现实(virtual reality,VR)设备(例如VR眼镜)等终端设备,智能电视等智能家居设备,或其他桌面型、膝上型、笔记本电脑、超级移动个人计算机(Ultra-mobile Personal Computer,UMPC)、上网本等设备。本申请实施例对 电子设备的具体类型不作限定。
接下来介绍本申请实施例中提供的示例性电子设备的结构。
请参见图1,图1示出了一种电子设备100的结构示意图。
如图1所示,电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块140,电源管理模块141,电池142,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,角度传感器180M等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了***的效率。
充电管理模块140用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。在一些有线充电的实施例中,充电管理模块140可以通过USB接口130接收有线充电器的充电输入。在一些无线充电的实施例中,充电管理模块140可以通过电子设备100的无线充电线圈接收无线充电输入。充电管理模块140为电池142充电的同时,还可以通过电源管理模块141为电子设备供电。
电源管理模块141用于连接电池142,充电管理模块140与处理器110。电源管理模块141接收电池142和/或充电管理模块140的输入,为处理器110,内部存储器121,显示屏194,摄像头193,和无线通信模块160等供电。电源管理模块141还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。在其他一些实施例中,电源管理模块141也可以设置于处理器110中。在另一些实施例中,电源管理模块141和充电管理模块140也可以设置于同一个器件中。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星***(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备100的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备100可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯***(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位***(global positioning system,GPS),全球导航卫星***(global navigation satellite system,GLONASS),北斗卫星导航***(beidou navigation satellite system,BDS),准天顶卫星***(quasi-zenith satellite system,QZSS)和/或星基增强***(satellite based augmentation systems,SBAS)。
电子设备100通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏194,N为大于1的正整数。
本申请中,显示屏194可以包括至少一个显示区域,本申请以显示屏194包括第一显示区域和第二显示区域为例进行说明。
电子设备100处于第一物理状态时,显示屏194的第一显示区域用于显示图形用户界面 (graphical user interface,GUI),GUI也可以称为用户界面。此时,第一显示区域为亮屏状态,第二显示区域可以为息屏状态。电子设备100处于第二物理状态时,显示屏194的第二显示区域用于显示用户界面。此时,第二显示区域为亮屏状态,第一显示区域可以为息屏状态。不限于此,电子设备100处于第一物理状态时的第二显示区域、电子设备100处于第二物理状态时的第一显示区域也可以为锁屏状态或亮屏状态,本申请对此不作限定。
为了方便描述,以下实施例以电子设备100处于第一物理状态时,第一显示区域亮屏,第二显示区域息屏,电子设备100处于第二物理状态时,第一显示区域息屏,第二显示区域亮屏为例进行说明。
在一些实施例中,电子设备100的显示屏194为折叠屏,电子设备100的物理状态可以包括展开状态、弯折状态和折叠状态。电子设备100处于展开状态时,第一显示区域的出光面和第二显示区域的出光面相背,此时,第一显示区域可以处于展平状态,具体示例可参见下图3。电子设备100处于弯折状态和折叠状态时,第一显示区域可以弯折,具体示例可参见下图4-图5。示例性地,第一物理状态为展开状态,第二物理状态为折叠状态,第一显示区域、第二显示区域分别为下图3-图5所示的第一显示屏200、第二显示屏300。
电子设备100可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。在一些实施例中,电子设备100可以通过拍摄功能获取的人脸信息实现人脸解锁、访问应用锁等。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备100可以包括1个或N个摄像头193,N为大于1的正整数。
在一些实施例中,显示屏194为折叠屏,显示屏194包括第一显示区域和第二显示区域。电子设备100可以包括N个摄像头193,这N个摄像头193分别设置于第一显示区域和第二显示区域上。电子设备100可以通过这N个摄像头193的检测信号确定电子设备100是否翻转。
示例性地,电子设备100处于展开状态时,第一显示区域可以处于展平状态,并和用户脸部相对。此时,第一显示区域上的摄像头193可以获取用户的人脸信息,第二显示区域上的摄像头193无法获取到用户的人脸信息。电子设备100可以通过第一显示区域显示应用A的用户界面。电子设备100可以接收用户的翻转操作,该翻转操作用于改变第一显示区域和第二显示区域的相对位置,即将第二显示区域和用户脸部相对。当第二显示区域上的摄像头193可以获取到用户的人脸信息,第一显示区域上的摄像头193无法获取到用户的人脸信息时,电子设备100可以确定电子设备100发生翻转。此时,第二显示区域和用户脸部相对,电子设备100可以通过第二显示区域显示应用B的用户界面。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备100的各种功能应用以及数据处理。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触控操作作用于显示屏194,电子设备100可以根据压力传感器180A的检测信号得到对应的触控操作强度。电子设备100也可以根据压力传感器180A的检测信号计算该触控操作作用于显示屏194上的触控区域的位置(简称为触控位置)。电子设备100还可以根据压力传感器180A的检测信号计算上述触控区域的形状。
在一些实施例中,作用于相同触控位置,但不同触控操作强度的触控操作,可以对应不同的操作指令。例如:当有触控操作强度小于第一压力阈值的触控操作作用于短消息应用图标时,执行查看短消息的指令。当有触控操作强度大于或等于第一压力阈值的触控操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以检测电子设备100在各个方向上(一般为三轴,即x,y和z轴) 角速度的大小,可以用于确定电子设备100的运动姿态。可选地,陀螺仪传感器180B可以设置于电子设备100的电路板上。电子设备100可以根据陀螺仪传感器180B的检测信号确定电子设备100的弯折角度是否变化,或者电子设备100是否翻转。在一些实施例中,陀螺仪传感器180B也可以用于拍摄防抖。示例性地,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
加速度传感器180E可以检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。可选地,加速度传感器180E可以设置于电子设备100的电路板上。电子设备100可以根据加速度传感器180E的检测信号确定电子设备100的弯折角度是否变化,或者电子设备100是否翻转。
在一些实施例中,电子设备100可以包括多个加速度传感器180E和/或多个陀螺仪传感器180B。显示屏194为折叠屏,显示屏194包括第一显示区域和第二显示区域。电子设备100处于展开状态时,第一显示区域可以处于展平状态,电子设备100处于弯折状态和折叠状态时,第一显示区域可以弯折。第一显示区域弯折时可以被分为两个显示区域,这两个显示区域所在的平面相交。上述多个加速度传感器180E和/或多个陀螺仪传感器180B可以分别设置于上述两个显示区域侧的电路板上。因此,电子设备100可以根据这多个加速度传感器180E和/或多个陀螺仪传感器180B的检测信号确定电子设备100的弯折角度是否变化,或者电子设备100是否翻转。
示例性地,电子设备100处于展开状态时,第一显示区域朝上并显示应用A的用户界面,第二显示区域朝下并息屏。电子设备100可以接收用户的翻转操作,该翻转操作用于改变第一显示区域和第二显示区域的相对位置,即将第一显示区域朝下,第二显示区域朝上。电子设备100可以根据加速度传感器180E和/或陀螺仪传感器180B的检测信号确定电子设备100发生翻转。响应于该翻转操作,电子设备100可以通过第二显示区域显示应用B的用户界面,此时第一显示区域朝下并息屏。
示例性地,电子设备100处于展开状态时,第一显示区域朝上并显示应用A的用户界面,第二显示区域朝下并息屏。电子设备100可以接收用户的折叠操作,该折叠操作用于将电子设备100折叠。电子设备100可以根据加速度传感器180E和/或陀螺仪传感器180B的检测信号确定电子设备100的弯折角度变小,并处于折叠状态。电子设备100处于折叠状态时,第二显示区域弯折并可以被分为两个显示区域,这两个显示区域所在的平面相交。响应于该折叠操作,电子设备100可以通过上述两个显示区域中朝上的显示区域显示应用B的用户界面,上述两个显示区域中朝下的显示区域可以息屏。其中,电子设备100可以根据加速度传感器180E和/或陀螺仪传感器180B的检测信号确定上述两个显示区域中朝上的显示区域。
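为便于理解上述判断方式,下面给出一段示意性的Java代码(仅为辅助说明的假设性示例,类名FaceUpDetector及判断余量等均为假设,并非对本申请实施方式的限定),用于说明如何根据某一显示区域一侧的加速度传感器的z轴分量粗略判断该显示区域是否朝上:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class FaceUpDetector implements SensorEventListener {
    private final SensorManager sensorManager;
    private final Sensor accelerometer;
    private boolean faceUp;

    public FaceUpDetector(SensorManager sensorManager) {
        this.sensorManager = sensorManager;
        // 假设该加速度传感器设置于待判断的显示区域一侧的电路板上
        this.accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
    }

    public void start() {
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // 设备大致水平放置时,朝上的一面其z轴分量接近+9.8 m/s^2,朝下的一面接近-9.8 m/s^2
        faceUp = event.values[2] > 1.0f; // 预留1.0 m/s^2的余量,避免临界抖动
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // 本示例不处理精度变化
    }

    public boolean isFaceUp() {
        return faceUp;
    }
}

在具体实现中,还可以结合陀螺仪传感器180B的检测信号或对加速度数据进行滤波,以提高翻转判断的稳定性。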
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
触摸传感器180K,也称“触控器件”。在一些实施例中,触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸操作作用于显示屏194上的触控区域的位置、该触控区域的形状,从而确定触摸事件类型。电子设备100可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏194所处的 位置不同。
角度传感器180M可以获取角度信息,并将其转换为可用的电信号输出。在一些实施例中,角度传感器180M可以设置于显示屏194中,用于检测电子设备100的弯折角度。处理器110可以根据角度传感器180M检测信号确定电子设备100的物理状态(例如展开状态、弯折状态或折叠状态),以及电子设备100的物理状态是否变化。
本申请对用于检测物理状态的传感器的具体类型不作限定。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
本申请中,压力传感器180A和/或触摸传感器180K可以设置于显示屏194中。显示屏194显示应用的用户界面时,压力传感器180A和/或触摸传感器180K可以检测用户作用于该用户界面的用户操作。响应于该用户操作,电子设备100可以基于该应用执行相应的任务。例如,用户点击社交应用中某个好友的头像,电子设备100可以在显示屏194上显示该好友通过该社交应用发布的个人信息。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。例如,分层架构的软件系统可以是安卓(Android)系统,也可以是华为移动服务(huawei mobile services,HMS)系统。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图2是本发明实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图2所示,应用程序包可以包括社交应用、游戏应用、视频应用、支付应用、相机、图库、地图、短信、电子书等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图2所示,应用程序框架层可以包括窗口管理器,内容提供器,视图***,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏,锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。上述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成。例如,应用程序的显示界面,包括应用程序的通知消息的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在***顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG4,H.264,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。其中,传感器驱动可以用于驱动控制硬件中的多个传感器,例如图1所示的压力传感器180A、陀螺仪传感器180B、加速度传感器180E、触摸传感器180K、角度传感器180M等传感器。
下面结合显示场景,示例性说明电子设备100软件以及硬件的工作流程。其中,以用于检测物理状态的传感器为陀螺仪传感器180B为例进行说明。
假设第一显示区域和应用A关联。示例性地,电子设备100处于展开状态,以及第一显示区域朝上时,电子设备100运行应用A并通过第一显示区域显示应用A的用户界面。电子设备100可以接收用户操作,陀螺仪传感器180B上报检测信号给内核层。内核层可以将该检测信号加工成输入事件(例如电子设备100处于展开状态,第一显示区域朝下和第二显示区域朝上或第一显示区域和第二显示区域的相对位置变化),输入事件被存储在内核层。应用程序框架层从内核层获取输入事件,根据该输入事件确定电子设备100的物理状态发生变化。假设第二显示区域和应用B关联,例如电子设备100处于展开状态,以及第二显示区域朝上时,电子设备100运行应用B并通过第二显示区域显示应用B的用户界面。因此,响应于电子设备100的物理状态的变化,电子设备100可以调用应用程序框架层的接口启动应用B,进而通过调用内核层启动显示驱动,通过第二显示区域显示应用B的用户界面。
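作为参考,下面给出一段示意性的Java代码(仅为帮助理解的假设性草图,类名AssociatedAppLauncher、参数targetDisplayId和associatedPackage均为假设),用于说明应用程序框架层在确定物理状态变化后,如何在目标显示区域对应的Display上启动关联应用:

import android.app.ActivityOptions;
import android.content.Context;
import android.content.Intent;

public class AssociatedAppLauncher {
    // 在指定的Display上启动关联应用;显示区域与Display标识的对应关系假设由系统侧维护
    public static void launchOnDisplay(Context context, int targetDisplayId, String associatedPackage) {
        Intent intent = context.getPackageManager().getLaunchIntentForPackage(associatedPackage);
        if (intent == null) {
            return; // 关联应用未安装,保持当前显示内容
        }
        intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
        ActivityOptions options = ActivityOptions.makeBasic();
        options.setLaunchDisplayId(targetDisplayId); // 指定目标显示区域对应的Display
        context.startActivity(intent, options.toBundle());
    }
}

其中setLaunchDisplayId用于指定Activity启动时所在的Display;实际实现中,显示驱动的启动以及各显示区域的亮屏、息屏控制可以由内核层配合完成。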
接下来示例性介绍配置有折叠屏的电子设备的物理状态。
请参见图3,图3是本申请实施例提供的一种电子设备的展开状态的示意图。其中,图3的(A)示出了电子设备一种视角的示意图。图3的(B)示出了电子设备又一种视角的示意图。
如图3的(A)所示,电子设备100可以包括第一显示屏200和第二显示屏300。电子设备可以处于展开状态,折叠状态或弯折状态。当电子设备处于展开状态,第一显示屏200处于展平状态,如图3的(A)所示,第一显示屏200的弯折角度α大约为180度,不限于此,第一显示屏200的弯折角度也可以大于或等于170度,并小于或等于180度,本申请对展开状态下第一显示屏200的弯折角度的具体取值不作限定。电子设备处于展开状态时,第一显示屏200的出光面和第二显示屏300的出光面相背。
第一显示屏200可以包括三个显示区域,即第一显示区域201、第二显示区域202、第三显示区域203。其中,第三显示区域203可弯折,位于电子设备的弯折部位上。电子设备的弯折部位两端分别连接第一显示区域201和第二显示区域202。第一显示屏200的弯折角度也可以理解为:第一显示区域201所在平面和第二显示区域202所在平面之间的夹角α,即图3的(A)所示的弯折部位两端之间的夹角α。如图3的(A)所示,当第一显示屏200处于展平状态时,第一显示区域201和第二显示区域202相当于处于同一个平面。
第二显示屏300可以包括三个显示区域,即第四显示区域301、第五显示区域302、第六显示区域303。其中,电子设备处于展开状态时,第一显示区域201和第四显示区域301相背,第二显示区域202和第六显示区域303相背,第三显示区域203和第五显示区域302相背。
如图3的(B)所示,第二显示屏300的第五显示区域302可弯折,位于电子设备的弯折部位上。电子设备的弯折部位两端也分别连接第四显示区域301和第六显示区域303。可以理解地,上述第一显示屏200的弯折角度α也为:360度-第二显示屏300的弯折角度α′,即α=360-α′。第二显示屏300的弯折角度也可以理解为:第四显示区域301所在平面和第六显示区域303所在平面之间的角度。
如图3的(B)所示,当第二显示屏300处于展平状态时,第四显示区域301和第六显示区域303相当于处于同一个平面。
在一些实施例中,电子设备可以包括至少一个摄像头。如图3的(B)所示,电子设备可以包括设置于第六显示区域303上部的摄像头3031(包括摄像头3031A和摄像头3031B),以及第六显示区域303下部的摄像头3032(包括摄像头3032A和摄像头3032B)。可选地,电子设备可以通过上述至少一个摄像头获取用户的人脸信息,并根据获取的人脸信息进行人脸验证或者判断电子设备的物理状态是否变化。可选地,在人脸验证通过的情况下电子设备再启动对应的应用,并显示该应用的用户界面。
不限于上述列举的摄像头的情况,在具体实现中,电子设备也可以仅包括设置于第六显示区域303上部的摄像头3031。或者,电子设备也可以仅包括设置于第六显示区域303下部 的摄像头3032。或者,电子设备也可以包括设置于第四显示区域301和/或第五显示区域302的摄像头。或者,电子设备也可以包括设置于第一显示屏200的摄像头,本申请实施例对此不作限定。
在一些实施例中,可以将图3所示的展开状态下的电子设备部分折叠(此时,电子设备的弯折部位两端之间的夹角可以从α变换为β),以得到弯折状态下的电子设备,具体如下图4所示。
请参见图4,图4是本申请实施例提供的一种电子设备的弯折状态的示意图。其中,图4的(A)示出了电子设备一种视角的示意图。图4的(B)示出了电子设备又一种视角的示意图。
如图4的(A)所示,电子设备处于弯折状态时,第一显示屏200弯折,第一显示区域201所在平面和第二显示区域202所在平面相交,如图4的(A)所示,第一显示屏200的弯折角度β可以大约为120度。不限于此,第一显示屏200的弯折角度也可以大于0度,并小于180度,例如但不限于为60度、90度、100度、120度等等,本申请对弯折状态下第一显示屏200的弯折角度的具体取值不作限定。图4的(B)的说明和图4的(A)类似,不再赘述。
在一些实施例中,可以将图3所示的展开状态下的电子设备折叠(此时,电子设备的弯折部位两端之间的夹角可以从α变换为γ),以得到折叠状态下的电子设备。或者也可以将图4所示的弯折状态下的电子设备折叠(此时,电子设备的弯折部位两端之间的夹角可以从β变换为γ),以得到折叠状态下的电子设备,具体如下图5所示。
如图5的(A)所示,电子设备处于折叠状态时,第一显示屏200弯折或折叠,如图5的(A)所示,第一显示屏200的弯折角度γ可以大约为0度。不限于此,第一显示屏200的弯折角度也可以大于或等于0度,并小于或等于20度,本申请对折叠状态下第一显示屏200的弯折角度的具体取值不作限定。
可以理解地,电子设备处于折叠状态、弯折状态、展开状态时,显示屏194的弯折角度不同即可,但显示屏194的弯折角度的具体取值不作限定。
如图5的(A)和图5的(B)所示,当电子设备处于折叠状态时,第一显示区域201和第二显示区域202相对,第四显示区域301和第六显示区域303相背,第三显示区域203和第五显示区域302相背。
在一些实施例中,第一显示屏200和第二显示屏300可以为同一个柔性折叠屏(简称柔性屏)上不同的显示区域。可选地,第一显示区域201、第二显示区域202、第三显示区域203、第四显示区域301、第五显示区域302和第六显示区域303可以是这一个柔性屏上的不同区域,均用于显示用户界面。
在一些实施例中,第一显示屏200可以为电子设备的一个柔性屏。可选地,第一显示区域201、第二显示区域202、第三显示区域203为这一个柔性屏上的不同区域,均用于显示用户界面。第二显示屏300可以为电子设备的另一个柔性屏。可选地,第四显示区域301、第五显示区域302、第六显示区域303为第二显示屏300上的不同区域,均用于显示用户界面。
在一些实施例中,第一显示屏200可以是刚性屏和柔性屏、链条等连接组件拼接而成的显示屏。例如,第一显示屏200可以是由两个刚性屏和一个柔性屏拼接而成的。第一显示区域201和第二显示区域202可以分别为上述两个刚性屏上的区域,第三显示区域203可以为上述一个柔性屏上的区域,均用于显示用户界面。或者,第一显示屏200可以是由两个刚性屏和用于连接这两个刚性屏的链条拼接而成的。第一显示区域201和第二显示区域202可以分别为上述两个刚性屏上的区域,均用于显示用户界面。第三显示区域203为上述用于连接这两个刚性屏的链条。第二显示屏300也可以是刚性屏和柔性屏、链条等连接组件拼接而成的显示屏,具体说明和第一显示屏200类似,不再赘述。
为了方便描述,本申请以电子设备的弯折角度小于第一角度阈值时处于折叠状态,电子设备的弯折角度大于第二角度阈值时处于展开状态,电子设备的弯折角度大于或等于第一角度阈值以及小于或等于第二角度阈值时处于弯折状态为例进行说明。示例性地,第一角度阈值为30度,第二角度阈值为160度。
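按照上述划分方式,物理状态的判断可以用如下示意性的Java代码表示(其中的阈值取上文示例值30度和160度,类名、枚举名均为假设,并非对具体取值和实现方式的限定):

public class FoldStateClassifier {
    public enum PhysicalState { FOLDED, BENT, UNFOLDED }

    private static final float FIRST_ANGLE_THRESHOLD = 30f;   // 第一角度阈值(示例值)
    private static final float SECOND_ANGLE_THRESHOLD = 160f; // 第二角度阈值(示例值)

    // 根据弯折部位两端之间的夹角(单位:度)划分物理状态
    public static PhysicalState classify(float bendAngleDegrees) {
        if (bendAngleDegrees < FIRST_ANGLE_THRESHOLD) {
            return PhysicalState.FOLDED;   // 折叠状态
        }
        if (bendAngleDegrees > SECOND_ANGLE_THRESHOLD) {
            return PhysicalState.UNFOLDED; // 展开状态
        }
        return PhysicalState.BENT;         // 弯折状态
    }
}

实际产品中,弯折角度可以由角度传感器180M检测得到,也可以由加速度传感器180E和/或陀螺仪传感器180B的检测信号计算得到。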
如图3-图5所示,电子设备处于展开状态时,若第一显示屏200朝上,则通过第一显示屏200显示用户界面,若第二显示屏300朝上,则通过第二显示屏300显示用户界面。电子设备处于折叠状态时,若第四显示区域301朝上,则通过第四显示区域301(可选地,以及第五显示区域302)显示用户界面。若第六显示区域303朝上,则通过第六显示区域303(可选地,以及第五显示区域302)显示用户界面。
下面介绍本申请实施例涉及的应用场景以及该场景下的人机交互示意图。图6-图17中,电子设备100的结构以图3-图5所示的结构为例进行描述。
请参见图6,图6示例性示出一种人机交互示意图。
如图6的(A)所示,电子设备100处于展开状态,以及第一显示屏200朝上,电子设备可以通过第一显示屏200显示第一应用的用户界面。其中,电子设备可以在显示设置界面或桌面时接收用户操作,响应于该用户操作,电子设备设置第一显示屏200和第一应用关联。不限于此,电子设备也可以在通过第一显示屏200显示第一应用的用户界面时,接收作用于第一显示屏200的用户操作(例如点击锁定控件),响应于该用户操作,电子设备设置第一显示屏200和第一应用关联,本申请对此不作限定。
如图6的(A)所示,电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第二显示屏300的第六显示区域303显示用户界面。不限于此,电子设备也可以通过第五显示区域302和第六显示区域303一起显示用户界面。假设不存在和第六显示区域303关联的应用,则电子设备设置第一显示屏200和第一应用关联后,第一次从展开状态变换为折叠状态,第六显示区域303可以仍然显示电子设备处于展开状态显示的应用的用户界面,即第一应用的用户界面,此时电子设备可以为下图6的(B)所示状态。
如图6的(B)所示,电子设备处于折叠状态,以及第六显示区域303朝上,电子设备可以通过第六显示区域303继续显示展开状态下显示的第一应用的用户界面。电子设备可以接收用户操作,该用户操作用于打开第二应用,例如通过三键导航功能返回桌面,并点击桌面上第二应用的图标,其中,第一应用和第二应用不同。响应于该用户操作,电子设备可以取消显示第一应用的用户界面,通过第六显示区域303显示第二应用的用户界面,此时电子设备可以为下图6的(C)所示状态。
如图6的(C)所示,电子设备处于折叠状态,以及第六显示区域303朝上,电子设备可以通过第六显示区域303显示第二应用的用户界面。电子设备可以检测用户的展开操作,该展开操作可以将弯折部位两端之间的夹角变大。响应于弯折部位两端之间的夹角大于第二角度阈值,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即第一应用的用户界面,此时电子设备可以为图6的(A)所示状态。
在一些实施例中,电子设备100设置第一显示屏200和第一应用关联之后,通过第一显示屏200显示第一应用的用户界面时可以接收用户操作。该用户操作用于打开第三应用,例如通过手势导航功能进入多任务界面,并点击多任务界面中第三应用的缩略图,其中,第一 应用和第三应用不同。响应于该用户操作,电子设备可以通过第一显示屏200显示第三应用的用户界面。此时电子设备的物理状态可以变换为折叠状态,并通过第二显示屏300显示第三应用的用户界面。具体示例如下图7所示。
请参见图7,图7示例性示出又一种人机交互示意图。
图7的(A)和图6的(A)一致,不再赘述。在图7的(A)所示状态下,电子设备设置第一显示屏200和第一应用关联之后,电子设备可以接收用于打开第三应用的用户操作。响应于该用户操作,电子设备可以取消显示第一应用的用户界面,通过第一显示屏200显示第三应用的用户界面,此时电子设备可以为下图7的(B)所示状态。
如图7的(B)所示,电子设备处于展开状态,以及第一显示屏200朝上,电子设备通过第一显示屏200显示第三应用的用户界面。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第二显示屏300的第六显示区域303显示用户界面。不限于此,电子设备也可以通过第五显示区域302和第六显示区域303一起显示用户界面。假设不存在和第六显示区域303关联的应用,则在电子设备设置第一显示屏200和第一应用关联后,第一次从展开状态变换为折叠状态,第六显示区域303可以仍然显示电子设备处于展开状态显示的应用的用户界面,即第三应用的用户界面,此时电子设备可以为图7的(C)所示状态。
如图7的(C)所示,电子设备处于折叠状态,以及第六显示区域303朝上,电子设备可以通过第六显示区域303继续显示展开状态下显示的第三应用的用户界面。电子设备可以检测用户的展开操作,该展开操作可以将弯折部位两端之间的夹角变大。响应于弯折部位两端之间的夹角大于第二角度阈值,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即第一应用的用户界面,此时电子设备可以为图7的(A)所示状态。
可以理解地,若电子设备未设置第一应用和第一显示屏200关联,例如未在图6的(A)所示状态下设置第一显示屏200和第一应用关联,则电子设备从其他物理状态变换为展开状态,例如响应于弯折部位两端的夹角大于第二角度阈值,电子设备可以显示电子设备处于上述其他物理状态显示的应用的用户界面,不限于此,也可以是电子设备的桌面或锁屏界面等。
在一些实施例中,电子设备的状态变化依次为:图6的(A)、图6的(B)、图6的(C)、图6的(A)。此时,电子设备处于展开状态,以及第一显示屏200朝上,通过第一显示屏200显示第一应用的用户界面。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303显示电子设备最近一次处于折叠状态显示的应用的用户界面,即第二应用的用户界面,此时电子设备可以为图6的(C)所示状态。
在一些实施例中,电子设备的状态变化依次为:图7的(A)、图7的(B)、图7的(C)、图7的(A)。此时,电子设备处于展开状态,以及第一显示屏200朝上,通过第一显示屏200显示第一应用的用户界面。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303显示电子设备最近一次处于折叠状态显示的应用的用户界面,即第三应用的用户界面,此时电子设备可以为图7的(C)所示状态。
在一些实施例中,电子设备可以在显示设置界面或桌面时接收用户操作,响应于该用户操作,电子设备设置第六显示区域303和第四应用关联,其中,第一应用和第四应用不同。不限于此,电子设备也可以在通过第六显示区域303显示第四应用的用户界面时,接收作用 于第六显示区域303的用户操作(例如点击锁定控件),响应于该用户操作,电子设备设置第六显示区域303和第四应用关联,本申请对此不作限定。在这种情况下,电子设备处于折叠状态,以及第六显示区域303朝上时,可以通过第六显示区域303(可选地,以及第五显示区域302)显示第四应用的用户界面。
示例性地,当电子设备处于展开状态时,电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,即第四应用的用户界面。
需要说明的是,上述电子设备最近一次处于折叠状态显示的应用的用户界面,是针对电子设备设置第一显示屏200和第一应用关联之后的情况。
不限于上述列举的情况,在具体实现中,也可以是电子设备设置第六显示区域303和第四应用关联。响应于电子设备的物理状态从折叠状态变换为展开状态,电子设备可以通过第一显示屏200显示电子设备最近一次处于展开状态显示的应用的用户界面。若上述电子设备最近一次处于展开状态显示的应用不存在,即电子设备设置第二显示屏300和第四应用关联之后,电子设备第一次从折叠状态变换为展开状态,则电子设备可以通过第六显示区域303继续显示电子设备处于折叠状态显示的应用的用户界面。响应于电子设备的物理状态从展开状态变换为折叠状态,电子设备可以通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,即第四应用的用户界面。
需要说明的是,图6-图7以电子设备处于折叠状态时第六显示区域303朝上为例进行说明。若电子设备处于折叠状态时第四显示区域301朝上,电子设备可以通过第四显示区域301显示用户界面,不限于此,电子设备也可以通过第四显示区域301和第五显示区域302一起显示用户界面。示例性地,电子设备可以通过加速度传感器180E和/或陀螺仪传感器180B检测朝上的显示区域。
在一些实施例中,电子设备已设置第一显示屏200和视频应用关联,也已设置第二显示屏300和社交应用关联,设置方式的示例可参见上图6-图7。电子设备处于展开状态,以及第一显示屏200朝上时,电子设备可以通过第一显示屏200显示视频应用的用户界面110,具体如图8的(A)所示。电子设备处于展开状态,以及第二显示屏300朝上时,电子设备可以通过第二显示屏300显示社交应用的用户界面120,具体如图8的(B)所示。
如图8的(A)所示,电子设备可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第一显示屏200朝下,第二显示屏300朝上,电子设备可以通过第二显示屏300显示和第二显示屏300关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图8的(B)所示状态。
如图8的(B)所示,电子设备可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第二显示屏300朝下,第一显示屏200朝上,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即视频应用的用户界面110,此时电子设备可以为图8的(A)所示状态。
在一些实施例中,电子设备已设置第一显示屏200和视频应用关联,也已设置第二显示屏300和社交应用关联,设置方式的示例可参见上图6-图7。并且,电子设备确定第四显示区域301关联的应用,第六显示区域303关联的应用,均和第二显示屏300关联的应用相同。 即第四显示区域301和第六显示区域303关联的应用也为社交应用。则电子设备的物理状态变换为折叠状态时,电子设备可以显示社交应用的用户界面,具体示例如下图9所示。
请参见图9,图9示例性示出又一种人机交互示意图。
如图9的(A)所示,电子设备处于展开状态,以及第一显示屏200朝上,电子设备通过第一显示屏200显示视频应用的用户界面110。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303或第四显示区域301显示用户界面,此时电子设备可以为下图9的(B)或图9的(C)所示状态。
如图9的(B)所示,电子设备处于折叠状态,以及第六显示区域303朝上,电子设备通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,也是和第二显示屏300关联的应用的用户界面,即社交应用的用户界面120。不限于此,电子设备也可以通过第六显示区域303和第五显示区域302一起显示社交应用的用户界面120。
如图9的(C)所示,电子设备处于折叠状态,以及第四显示区域301朝上,电子设备通过第四显示区域301显示和第四显示区域301关联的应用的用户界面,也是和第二显示屏300关联的应用的用户界面,即社交应用的用户界面120。不限于此,电子设备也可以通过第四显示区域301和第五显示区域302一起显示社交应用的用户界面120。
在图9的(B)所示状态下,电子设备可以检测用户的翻转操作,该翻转操作可以改变第四显示区域301和第六显示区域303的相对位置。响应于第六显示区域303朝下,第四显示区域301朝上,电子设备可以通过第四显示区域301显示社交应用的用户界面120,此时电子设备可以为图9的(C)所示状态。类似地,在图9的(C)所示状态下,电子设备也可以检测用户的翻转操作,该翻转操作可以改变第四显示区域301和第六显示区域303的相对位置。响应于第四显示区域301朝下,第六显示区域303朝上,电子设备可以通过第六显示区域303显示社交应用的用户界面120,此时电子设备可以为图9的(B)所示状态。
在一些实施例中,电子设备已设置第一显示屏200和视频应用关联,也已设置第二显示屏300和社交应用关联。并且,电子设备也已设置第四显示区域301和图库关联,第六显示区域303和相机关联,设置方式的示例可参见上图6-图7。也就是说,第四显示区域301关联的应用,第六显示区域303关联的应用以及第二显示屏300关联的应用各不相同。则电子设备的物理状态变换为折叠状态时,电子设备可以根据朝上的显示区域确定显示的应用,具体示例如下图10所示。
请参见图10,图10示例性示出又一种人机交互示意图。图10的(A)和图9的(A)一致,不再赘述。
如图10的(B)所示,电子设备处于折叠状态,以及第六显示区域303朝上,电子设备通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,即相机的用户界面130。不限于此,电子设备也可以通过第六显示区域303和第五显示区域302一起显示相机的用户界面130。
如图10的(C)所示,电子设备处于折叠状态,以及第四显示区域301朝上,电子设备通过第四显示区域301显示和第四显示区域301关联的应用的用户界面,即图库的用户界面140。不限于此,电子设备也可以通过第四显示区域301和第五显示区域302一起显示图库的用户界面140。
在图10的(B)所示状态下,电子设备可以检测用户的翻转操作,该翻转操作可以改变 第四显示区域301和第六显示区域303的相对位置。响应于第六显示区域303朝下,第四显示区域301朝上,电子设备可以通过第四显示区域301显示图库的用户界面140,此时电子设备可以为图10的(C)所示状态。类似地,在图10的(C)所示状态下,电子设备也可以检测用户的翻转操作,该翻转操作可以改变第四显示区域301和第六显示区域303的相对位置。响应于第四显示区域301朝下,第六显示区域303朝上,电子设备可以通过第六显示区域303显示相机的用户界面130,此时电子设备可以为图10的(B)所示状态。
不限于上述列举的情况,在具体实现中,也可以是第四显示区域301关联的应用和第二显示屏300关联的应用相同,例如为社交应用,第六显示区域303关联的应用和第二显示屏300关联的应用不同,例如为相机。或者,也可以是第六显示区域303关联的应用和第二显示屏300关联的应用相同,第四显示区域301关联的应用和第二显示屏300关联的应用不同。或者,第四显示区域301关联的应用和第六显示区域303关联的应用相同,但和第二显示屏300关联的应用不同。
在一些实施例中,电子设备已设置第一显示屏200和视频应用关联。电子设备处于弯折状态时,可以通过第一显示屏200分屏显示应用的用户界面,具体示例如下图11-图12所示。
请参见图11,图11示例性示出又一种人机交互示意图。其中,电子设备处于弯折状态时分屏显示的两个应用和第一显示屏200关联的应用不同。
如图11的(A)所示,电子设备处于弯折状态,以及第一显示屏200朝上,电子设备通过第一显示区域201和部分第三显示区域203显示短信的用户界面150,以及通过第二显示区域202和部分第三显示区域203显示电子书的用户界面160。电子设备可以检测用户的展开操作,该展开操作可以将弯折部位两端之间的夹角变大。响应于弯折部位两端之间的夹角大于第二角度阈值,以及第一显示屏200朝上,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即视频应用的用户界面110,此时电子设备可以为图11的(B)所示状态。在一些实施例中,响应于弯折部位两端之间的夹角大于第二角度阈值,以及第二显示屏300朝上,电子设备可以通过第二显示屏300显示和第二显示屏300关联的应用的用户界面,此时电子设备可以为图8的(B)所示状态。
在一些实施例中,电子设备也可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,以及第四显示区域301朝上,电子设备可以通过第四显示区域301显示和第四显示区域301关联的应用的用户界面,此时电子设备可以为图9的(C)或图10的(C)所示状态。或者,响应于弯折部位两端之间的夹角小于第一角度阈值,以及第六显示区域303朝上,电子设备可以通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,此时电子设备可以为图9的(B)或图10的(B)所示状态。
如图11的(B)所示,电子设备处于展开状态,以及第一显示屏200朝上,电子设备通过第一显示屏200显示视频应用的用户界面110。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角大于或等于第一角度阈值,以及小于或等于第二角度阈值,电子设备可以通过第一显示屏200分屏显示应用的用户界面,此时电子设备可以为图11的(A)或下图12的(A)所示状态。
请参见图12,图12示例性示出又一种人机交互示意图。其中,电子设备处于弯折状态时分屏显示的两个应用中的一个应用为第一显示屏200关联的应用,即视频应用。
如图12的(A)所示,电子设备处于弯折状态,以及第一显示屏200朝上,电子设备通 过第一显示区域201和部分第三显示区域203显示短信的用户界面150,以及通过第二显示区域202和部分第三显示区域203显示视频应用的用户界面110。此时电子设备检测用户操作,物理状态发生变化的说明和上图11的(A)类似,不再赘述。图12的(B)和图11的(B)类似,不再赘述。
不限于上述列举的情况,在具体实现中,若电子设备未设置第二显示屏300关联的应用,响应于电子设备的物理状态从弯折状态变换为展开状态或折叠状态,电子设备也可以通过第二显示屏300显示电子设备最近一次处于展开状态或折叠状态时显示的用户界面。示例性地,响应于弯折部位两端之间的夹角大于第二角度阈值,以及第二显示屏300朝上,电子设备可以通过第二显示屏300显示电子设备最近一次通过第二显示屏300显示的用户界面。响应于弯折部位两端之间的夹角小于第一角度阈值,以及第四显示区域301朝上,电子设备可以通过第四显示区域301显示电子设备最近一次通过第四显示区域301显示的用户界面。响应于弯折部位两端之间的夹角小于第一角度阈值,以及第六显示区域303朝上,电子设备可以通过第六显示区域303显示电子设备最近一次通过第六显示区域303显示的用户界面。
不限于上述列举的情况,在具体实现中,响应于电子设备的物理状态从展开状态变换为弯折状态,电子设备也可以通过第一显示屏200继续显示电子设备处于展开状态时显示的应用的用户界面。或者,电子设备也可以通过第一显示屏200显示电子设备最近一次处于弯折状态时显示的用户界面。或者,电子设备也可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,本申请对此不作限定。
在一些实施例中,电子设备的物理状态也可以从折叠状态变换为弯折状态,电子设备可以通过第一显示屏200分屏显示应用的用户界面,此时电子设备可以为图11的(A)或图12的(A)所示状态。不限于此,电子设备也可以通过第一显示屏200继续显示电子设备处于折叠状态时显示的应用的用户界面。或者,电子设备也可以通过第一显示屏200显示电子设备最近一次处于弯折状态时显示的用户界面。或者,电子设备也可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,本申请对此不作限定。
在一些实施例中,电子设备处于展开状态,以及第一显示屏200朝上并为横屏状态时,电子设备可以通过第一显示屏200显示视频应用的用户界面110,具体如图13的(A)所示。电子设备处于展开状态,以及第一显示屏200朝上并为竖屏状态时,电子设备可以通过第一显示屏200显示电子书的用户界面160,具体如图13的(B)所示。
如图13的(A)所示,电子设备可以检测用户的旋转操作,该旋转操作可以将电子设备的物理状态从横屏状态变换为竖屏状态。响应于电子设备的物理状态从横屏状态变换为竖屏状态,电子设备可以通过第一显示屏200显示电子书的用户界面160,此时电子设备可以为图13的(B)所示状态。
如图13的(B)所示,电子设备可以检测用户的旋转操作,该旋转操作可以将电子设备的物理状态从竖屏状态变换为横屏状态。响应于电子设备的物理状态从竖屏状态变换为横屏状态,电子设备可以通过第一显示屏200显示视频应用的用户界面110,此时电子设备可以为图13的(A)所示状态。
不限于上述列举的情况,在具体实现中,电子设备处于展开状态,以及第二显示屏300朝上并为横屏状态时,电子设备可以通过第二显示屏300显示应用A的用户界面。电子设备处于展开状态,以及第二显示屏300朝上并为竖屏状态时,电子设备可以通过第二显示屏300显示应用B的用户界面,具体说明和上图13类似,不再赘述。
示例性地,电子设备可以通过加速度传感器180E和/或陀螺仪传感器180B检测电子设备为横屏状态或竖屏状态。
在一种可能的实现方式中,第二显示屏300可以只包括第四显示区域301,可选地,也可以包括第四显示区域301和第五显示区域302。或者,第二显示屏300也可以只包括第六显示区域303,可选地,也可以包括第六显示区域303和第五显示区域302。例如,第二显示屏300为一个刚性屏。电子设备的物理状态发生变换时,用于显示用户界面的显示区域可以发生变化,显示的应用也可以发生变化,具体示例如下图14所示。
请参见图14,图14示例性示出又一种人机交互示意图。图14以第二显示屏300只包括第六显示区域303为例进行说明。
如图14的(A)所示,电子设备处于展开状态,以及第一显示屏200朝上,电子设备通过第一显示屏200显示视频应用的用户界面110。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303显示和第六显示区域303(即第二显示屏300)关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图14的(B)所示状态。电子设备也可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第一显示屏200朝下,第二显示屏300朝上,电子设备可以通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图14的(C)所示状态。
如图14的(B)所示,电子设备处于折叠状态,通过第六显示区域303显示社交应用的用户界面120。电子设备可以检测用户的展开操作,该展开操作可以将弯折部位两端之间的夹角变大。响应于弯折部位两端之间的夹角大于第二角度阈值,以及第一显示屏200朝上,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即视频应用的用户界面110,此时电子设备可以为图14的(A)所示状态。或者,响应于弯折部位两端之间的夹角大于第二角度阈值,以及第二显示屏300朝上,电子设备可以通过第六显示区域303继续显示和第六显示区域303关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图14的(C)所示状态。
如图14的(C)所示,电子设备处于展开状态,以及第二显示屏300朝上,电子设备通过第六显示区域303显示社交应用的用户界面120。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303继续显示和第六显示区域303关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图14的(B)所示状态。电子设备也可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第二显示屏300朝下,第一显示屏200朝上,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即视频应用的用户界面110,此时电子设备可以为图14的(A)所示状态。
在一种可能的实现方式中,第一显示屏200可以只包括第一显示区域201,可选地,也可以包括第一显示区域201和第三显示区域203。或者,第一显示屏200也可以只包括第二显示区域202,可选地,也可以包括第二显示区域202和第三显示区域203。例如,第一显示屏200为一个刚性屏。电子设备的物理状态发生变换时,用于显示用户界面的显示区域可以发生变化,显示的应用也可以发生变化,具体示例如下图15所示。
请参见图15,图15示例性示出又一种人机交互示意图。图15以第一显示屏200只包括第一显示区域201为例进行说明。
如图15的(A)所示,电子设备处于展开状态,以及第一显示屏200朝上,电子设备通过第一显示区域201显示视频应用的用户界面110。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图15的(B)所示状态。电子设备也可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第一显示屏200朝下,第二显示屏300朝上,电子设备可以通过第六显示区域303显示和第六显示区域303关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图15的(C)所示状态。图15的(B)、(C)和图14的(B)、(C)类似,不再赘述。
不限于图15示例的情况,在具体实现中,第二显示屏300也可以包括第四显示区域301和第六显示区域303,可选地,以及第五显示区域302,具体可参见上图3-图5中第二显示屏300的说明。则在上图15的(C)中,电子设备处于展开状态,以及第二显示屏300朝上,电子设备可以通过第二显示屏300显示和第二显示屏300关联的应用的用户界面。这种情况下电子设备的物理状态发生变化时的示意图和图8-图12、图15类似,不再赘述。
在一种可能的实现方式中,电子设备也可以只包括第二显示屏300,第二显示屏300的结构可参见上图3-图5中第二显示屏300的说明。电子设备可以已设置第二显示屏300和社交应用关联,并且,电子设备也已设置第四显示区域301和图库关联,第六显示区域303和相机关联,设置方式的示例可参见上图6-图7。电子设备的物理状态发生变换时,用于显示用户界面的显示区域也可以发生变化,显示的应用也可以发生变化,具体示例如下图16所示。
请参见图16,图16示例性示出又一种人机交互示意图。
如图16的(A)所示,电子设备处于展开状态,以及第二显示屏300朝上,电子设备通过第二显示屏300显示社交应用的用户界面120。电子设备可以检测用户的折叠操作,该折叠操作可以将弯折部位两端之间的夹角变小。响应于弯折部位两端之间的夹角小于第一角度阈值,电子设备可以通过第六显示区域303或第四显示区域301显示用户界面,此时电子设备可以为下图16的(B)或图16的(C)所示状态。图16的(B)、(C)和图10的(B)、(C)一致,不再赘述。
不限于上述列举的情况,在具体实现中,也可以是第四显示区域301关联的应用和第二显示屏300关联的应用相同,例如为社交应用,第六显示区域303关联的应用和第二显示屏300关联的应用不同,例如为相机。或者,也可以是第六显示区域303关联的应用和第二显示屏300关联的应用相同,第四显示区域301关联的应用和第二显示屏300关联的应用不同。或者,第四显示区域301关联的应用和第六显示区域303关联的应用相同,但和第二显示屏300关联的应用不同。或者,第四显示区域301关联的应用、第六显示区域303关联的应用和第二显示屏300关联的应用均相同,例如为社交应用。
在一种可能的实现方式中,电子设备100不可折叠,即电子设备100的显示屏194为直板屏。显示屏194包括第一显示屏200和第二显示屏300,第一显示屏200的出光面和第二显示屏300的出光面相背。可选地,第一显示屏200和第二显示屏300分别为一个刚性屏。 可选地,第一显示屏200和第二显示屏300为一个一体成型的柔性屏,第一显示屏200和第二显示屏300分别为这一个柔性屏上的不同显示区域。电子设备100的结构示例如下图17所示。电子设备的物理状态发生变换时,用于显示用户界面的显示区域可以发生变化,显示的应用也可以发生变化,具体示例如下图17所示。
请参见图17,图17示例性示出又一种人机交互示意图。
如图17的(A)所示,电子设备的第一显示屏200朝上,电子设备通过第一显示屏200显示视频应用的用户界面110。电子设备可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第一显示屏200朝下,第二显示屏300朝上,电子设备可以通过第二显示屏300显示和第二显示屏300关联的应用的用户界面,即社交应用的用户界面120,此时电子设备可以为图17的(B)所示状态。
如图17的(B)所示,电子设备的第二显示屏300朝上,电子设备通过第二显示屏300显示社交应用的用户界面120。电子设备可以检测用户的翻转操作,该翻转操作可以改变第一显示屏200和第二显示屏300的相对位置。响应于第二显示屏300朝下,第一显示屏200朝上,电子设备可以通过第一显示屏200显示和第一显示屏200关联的应用的用户界面,即视频应用的用户界面110,此时电子设备可以为图17的(A)所示状态。
不限于上述列举的情况,在具体实现中,电子设备处于展开状态,第一显示屏200正对用户时,电子设备可以通过第一显示屏200显示用户界面。此时,第一显示屏200上的摄像头193可以获取到用户的人脸信息,第二显示屏300上的摄像头193无法获取到用户的人脸信息。电子设备处于展开状态,第二显示屏300正对用户时,电子设备可以通过第二显示屏300显示用户界面。此时,第二显示屏300上的摄像头193可以获取到用户的人脸信息,第一显示屏200上的摄像头193无法获取到用户的人脸信息。
类似地,电子设备处于折叠状态,第二显示屏300的第六显示区域303正对用户时,电子设备可以通过第六显示区域303(可选地,以及第五显示区域302)显示用户界面。此时,第六显示区域303上的摄像头193可以获取到用户的人脸信息,第四显示区域301上的摄像头193无法获取到用户的人脸信息。电子设备处于折叠状态,第二显示屏300的第四显示区域301正对用户时,电子设备可以通过第四显示区域301(可选地,以及第五显示区域302)显示用户界面。此时,第四显示区域301上的摄像头193可以获取到用户的人脸信息,第六显示区域303上的摄像头193无法获取到用户的人脸信息。
不限于上述列举的情况,在具体实现中,电子设备处于展开状态时,也可以是第一显示屏200和第二显示屏300均亮屏,其中,第一显示屏200显示和第一显示屏200关联的应用的用户界面,第二显示屏300显示和第二显示屏300关联的应用的用户界面。类似地,电子设备处于折叠状态时,也可以是第四显示区域301和第六显示区域303(可选地,以及第五显示区域302)均亮屏,其中,第四显示区域301显示和第四显示区域301关联的应用的用户界面,第六显示区域303显示和第六显示区域303关联的应用的用户界面,可选地,第四显示区域301或第六显示区域303和第五显示区域302一起显示用户界面。
可以理解地,电子设备设置第一显示屏200和第一应用关联后,第一应用可以被关闭(例如电子设备关机,或者接收到用于关闭第一应用的用户操作等)。当电子设备(此时电子设备已开机)的物理状态从其他物理状态(例如折叠状态)变换为展开状态时,电子设备可以自动打开第一应用,并通过第一显示屏200显示第一应用的用户界面。也就是说,用户可以通过改变电子设备的物理状态来快速打开应用,节省用户启动应用的操作步骤,更加方便快捷。其他显示区域和应用关联的示例和上述示例类似,不再赘述。
不限于上述列举的情况,电子设备设置第一显示屏200和第一应用关联后,电子设备可以在显示设置界面或桌面时接收用户操作,响应于该用户操作,电子设备取消第一显示屏200和第一应用关联。不限于此,电子设备也可以在通过第一显示屏200显示第一应用的用户界面时,接收作用于第一显示屏200的用户操作(例如点击锁定控件),响应于该用户操作,电子设备取消第一显示屏200和第一应用关联。或者,当第一应用被关闭时(例如电子设备关机,或者接收到用于关闭第一应用的用户操作等),电子设备取消第一显示屏200和第一应用关联。本申请对此不作限定。电子设备取消其他显示区域和应用的关联的说明和上述过程类似,不再赘述。
本申请实施例中,显示区域和应用的关联关系可以是***预设的,例如,电子设备的第二显示屏300默认显示支付应用的用户界面。显示区域和应用的关联关系也可以是电子设备实时确定的,例如,电子设备确定最近一次通过第一显示屏200显示的应用和第一显示屏200关联。显示区域和应用的关联关系还可以是响应于用户操作自定义设置的,具体示例可参见下图18-图22。
以下实施例以电子设备配置的折叠屏为图3-图5所示的折叠屏为例进行说明,此时电子设备的物理状态发生变化的示例可参见上图6-图13所示实施例。
请参见图18,图18示例性示出一种用户界面实施例的示意图。用户界面180可以包括设置界面151、引导示例152和切换选项153。其中:
设置界面151可以包括第一标题1511、视频应用选项1512、社交应用选项1513、游戏应用选项1514。设置界面151的第一标题1511包括文字信息:“设置关联应用”,指示设置界面151用于设置显示区域和应用的关联关系。
设置界面151可以用于设置任意一个应用和显示区域的关联关系,例如可以设置视频应用、社交应用、游戏应用和显示区域的关联关系。视频应用选项1512可以包括第一选项1512A和第二选项1512B。第一选项1512A可以用于用户设置视频应用和第二显示屏300是否关联,包括文字信息:“折叠态下外屏显示”,其中,外屏即为第二显示屏300。电子设备可以检测到作用于第一选项1512A的用户操作(例如点击或滑动操作),响应于该操作,电子设备可以确定视频应用和第二显示屏300关联,或者取消视频应用和第二显示屏300的关联关系。用户界面180的第一选项1512A表征视频应用和第二显示屏300未关联。
第二选项1512B可以用于用户设置视频应用和第一显示屏200是否关联,包括文字信息:“展开态下内屏显示”,其中,内屏即为第一显示屏200。电子设备可以检测到作用于第二选项1512B的用户操作(例如点击或滑动操作),响应于该操作,电子设备可以确定视频应用和第一显示屏200关联,或者取消视频应用和第一显示屏200的关联关系。用户界面180的第二选项1512B表征视频应用和第一显示屏200关联。
类似地,社交应用选项1513也可以包括第三选项1513A和第四选项1513B,游戏应用选项1514也可以包括第五选项1514A和第六选项。第三选项1513A和第五选项1514A与第一选项1512A的说明类似,第四选项1513B和第六选项与第二选项1512B的说明类似,不再赘述。
引导示例152可以包括第二标题1521和图片示例1522。引导示例152的第二标题1521包括文字信息:“展开态下内屏显示”,用于指示图片示例1522所示场景为:电子设备处于展开状态,通过第一显示屏200显示应用的用户界面。在视频应用和第一显示屏200关联的情况下,响应于电子设备的物理状态从其他物理状态变换为展开状态,电子设备可以通过第一显示屏200显示视频应用的用户界面,此时电子设备可以为引导示例152所示状态。
切换选项153可以包括第一示例选项153A和第二示例选项153B。切换选项153的第一 示例选项153A为选中状态,表示引导示例152是电子设备显示的引导示例的第一界面。电子设备可以接收用户作用于引导示例152、切换选项153或用户界面180的空白区域的滑动操作(例如,从右往左滑动),响应于该滑动操作,电子设备可以切换显示引导示例的第二界面,即图19所示的引导示例154的图片示例1542。电子设备显示引导示例的第二界面时,第二示例选项153B为选中状态,具体如图19所示。
图19所示的引导示例154和图18所示的引导示例152不同,引导示例154可以包括第三标题1541和图片示例1542。第三标题1541包括文字信息:“折叠态下外屏显示”,用于指示图片示例1542所示场景为:电子设备处于折叠状态,通过第二显示屏300显示应用的用户界面的示例。在社交应用和第二显示屏300关联的情况下,响应于电子设备的物理状态从其他物理状态变换为折叠状态,电子设备可以通过第二显示屏300的第六显示区域303(可选地,以及第五显示区域302)显示社交应用的用户界面,此时电子设备可以为引导示例154所示状态。不限于此,电子设备也可以通过第二显示屏300的第四显示区域301(可选地,以及第五显示区域302)显示社交应用的用户界面。图19的其他内容和图18一致,不再赘述。
可以理解地,为了保证每个应用的显示效果和使用效果,一个显示区域通常只和一个应用关联。因此,图18-图19所示的设置关联应用的场景下,最多有两个选项是开启的,这两个选项分别为:用于设置应用A和第一显示屏200关联,用于设置应用B和第二显示屏300关联。图18-图19所示实施例中,可以设置的应用包括视频应用、社交应用和游戏应用。第一选项1512A、第三选项1513A、第五选项1514A中仅有一个可以为开启状态,即和第二显示屏300关联的应用最多有一个,第二选项1512B、第四选项1513B、第六选项中仅有一个可以为开启状态,即和第一显示屏200关联的应用最多有一个。
并且,在已有一个应用和第一显示屏200关联的情况下,若用户设置其他应用和第一显示屏200关联,则电子设备会取消原有应用和第一显示屏200的关联关系,并设置其他应用和第一显示屏200关联。例如,图18-图19所示实施例中,视频应用的第二选项1512B为开启状态,表征视频应用已和第一显示屏200关联。此时若电子设备检测到作用于第四选项1513B的用户操作(例如点击操作),响应于该用户操作,电子设备会将第二选项1512B设置为关闭状态,将第四选项1513B设置为开启状态。也就是说,响应于该用户操作,电子设备会取消视频应用和第一显示屏200的关联关系,并设置社交应用和第一显示屏200关联。第二显示屏300和第一显示屏200类似,不再赘述。
请参见图20,图20示例性示出又一种用户界面实施例的示意图。用户界面170可以包括状态栏171和应用图标列表172。其中:
状态栏171中可以包括接入的移动网络名称,WI-FI图标、信号强度和当前剩余电量。其中,接入的移动网络是信号格数为4格(即信号强度最好)的第五代移动通信技术(5th generation mobile networks,5G)网络。
应用图标列表172可以包括例如设置的图标1721、计算器的图标1722、音乐的图标1723、图库的图标1724、拨号的图标1725、联系人的图标1726、互联网的图标1727、短信的图标1728、相机的图标1729等,还可以包含其他应用的图标,本申请实施例对此不作限定。任一个应用的图标可用于响应用户的操作,例如触摸操作,使得电子设备启动图标对应的应用。任一个应用的图标可用于响应用户的操作,例如长按操作,使得电子设备显示应用的编辑界面(如图库的编辑界面173)。
示例性地,电子设备可以检测用户作用于图库的图标1724的用户操作(例如长按操作),响应于该用户操作,电子设备可以显示图库的编辑界面173。图库的编辑界面173可以包括例如移除选项173A、关联选项173B、编辑选项173C、更多选项173D等,还可以包含其他选项,本申请实施例对此不作限定。电子设备可以检测作用于关联选项173B的用户操作(例如点击操作),响应于该用户操作,电子设备可以设置第一显示屏200和图库应用关联,或者取消第一显示屏200和图库应用的关联关系。
可以理解地,图20所示的用户界面170为电子设备处于展开状态时通过第一显示屏200显示的用户界面。此时电子设备显示的关联选项173B用于用户设置图库应用和第一显示屏200的关联关系。但若电子设备处于折叠状态,通过第二显示屏300显示用户界面170时,电子设备接收作用于关联选项173B的用户操作(例如点击操作)。响应于该用户操作,电子设备可以设置第二显示屏300和图库应用关联,或者取消第二显示屏300和图库应用的关联关系。电子设备也可以按照上述方式确定第四显示区域301关联的应用和第六显示区域303关联的应用。也就是说,应用的关联选项用于用户设置电子设备当前用于显示的显示区域和应用的关联关系。
请参见图21,图21示例性示出又一种用户界面实施例的示意图。用户界面210可以包括视频应用的缩略图181、社交应用的缩略图182、音乐的缩略图183和关闭选项184。其中,用户界面210可以是用户通过手势导航功能(例如用户从屏幕底部边缘向上滑并停顿)进入的多任务界面,也可以是通过作用于三键导航功能或悬浮导航功能的多任务选项的用户操作(例如点击操作)进入的多任务界面,本申请实施例对此不作限定。
视频应用的缩略图181上显示有应用名称181A(即视频应用)、关联选项181B。电子设备可以检测作用于关联选项181B的用户操作(例如点击操作),响应于该用户操作,电子设备可以确定第一显示屏200和视频应用关联,或者取消第一显示屏200和视频应用的关联关系。用户界面210所示的视频应用的关联选项181B表征视频应用已和第一显示屏200关联。此时若用户点击关联选项181B,电子设备会取消第一显示屏200和视频应用的关联关系,关联选项181B会显示为社交应用的关联选项182A所示状态。
社交应用的缩略图182上显示有关联选项182A。用户界面210所示的社交应用的关联选项182A表征社交应用未和第一显示屏200关联。若此时电子设备检测到作用于关联选项182A的用户操作(例如点击操作),响应于该用户操作,电子设备可以设置第一显示屏200和社交应用关联。音乐的缩略图183上显示有应用名称183A(即音乐)。
社交应用的缩略图182和音乐的缩略图183仅显示部分内容,视频应用的缩略图181显示完整内容。电子设备可以检测作用于用户界面210的左右滑动操作,响应于该滑动操作,电子设备可以切换多个应用的缩略图的位置。例如用户从右向左滑动时,电子设备可以将视频应用的缩略图181置于社交应用的缩略图182的位置,音乐的缩略图183置于视频应用的缩略图181的位置。若电子设备运行的应用仅包括社交应用、视频应用和音乐,则电子设备可以将社交应用的缩略图182置于音乐的缩略图183的位置。若电子设备还运行有其他应用,例如游戏应用,则可以将游戏应用的缩略图置于音乐的缩略图183的位置,社交应用的缩略图182此时不可见。
关闭选项184可以用于关闭电子设备运行的所有应用,当电子设备检测到作用于关闭选项184的用户操作(例如点击操作)时,响应于该用户操作,电子设备可以关闭电子设备运行的所有应用,并显示电子设备的桌面。不限于此,响应于该用户操作,电子设备也可以关 闭运行的部分应用,仅显示一个应用的用户界面(例如用户界面210中显示完整缩略图的视频应用的用户界面)。
图21所示的社交应用的关联选项182A、视频应用的关联选项181B和图20所示的关联选项173B类似,均用于用户设置电子设备当前用于显示的显示区域和应用的关联关系。例如,电子设备处于折叠状态,通过第六显示区域303显示用户界面时,电子设备接收作用于视频应用的关联选项181B的用户操作(例如点击操作)。则响应于该用户操作,电子设备可以确定第六显示区域303和视频应用关联,或者取消第六显示区域303和视频应用的关联关系。
请参见图22,图22示例性示出又一种用户界面实施例的示意图。
如图22所示,电子设备处于展开状态,通过第一显示屏200显示视频应用的用户界面110。用户界面110可以包括关联选项110A。电子设备可以检测作用于关联选项110A的用户操作(例如点击操作),响应于该用户操作,电子设备可以确定第一显示屏200和视频应用关联,或者取消第一显示屏200和视频应用的关联关系。用户界面110所示的关联选项110A表征视频应用已和第一显示屏200关联。此时若用户点击关联选项110A,电子设备会取消第一显示屏200和视频应用的关联关系,关联选项110A会显示为上图21中社交应用的关联选项182A所示状态。
类似地,电子设备处于折叠状态,通过第二显示屏300显示视频应用的用户界面时。该视频应用的用户界面也可以包括关联选项。电子设备也可以接收作用于该关联选项的用户操作(例如点击操作)。响应于该用户操作,电子设备可以确定第二显示屏300和视频应用关联,或者取消第二显示屏300和视频应用的关联关系。电子设备也可以按照上述方式确定第四显示区域301关联的应用和第六显示区域303关联的应用。
可以理解地,为了保证每个应用的显示效果和使用效果,一个显示区域通常只和一个应用关联。示例性地,图21所示的设置关联应用的场景下,最多有一个关联选项是开启的,这个关联选项用于设置应用C和第一显示屏200关联。图21所示实施例中,可以设置的应用包括视频应用、社交应用和音乐。视频应用的关联选项181B、社交应用的关联选项182A、音乐的关联选项中仅有一个可以为开启状态。并且,在已有一个应用和第一显示屏200关联的情况下,若用户设置其他应用和第一显示屏200关联,则电子设备会取消原有应用和第一显示屏200的关联关系,并设置其他应用和第一显示屏200关联。例如,图21所示实施例中,视频应用的关联选项181B为开启状态,表征视频应用已和第一显示屏200关联。此时若电子设备检测到作用于社交应用的关联选项182A的用户操作(例如点击操作),响应于该用户操作,电子设备会将视频应用的关联选项181B设置为关闭状态,将社交应用的关联选项182A设置为开启状态。也就是说,响应于该用户操作,电子设备会取消视频应用和第一显示屏200的关联关系,并设置社交应用和第一显示屏200关联。第二显示屏300和第一显示屏200类似,不再赘述。
可以理解地,设置第一显示屏200和应用关联的方式,和设置第二显示屏300和应用关联的方式可以相同。例如图18-图19所示实施例中,电子设备可以接收作用于第一选项1512A和第四选项1513B的点击操作。响应于该点击操作,电子设备可以确定第一显示屏200和社交应用关联,并且确定第二显示屏300和视频应用关联。设置第一显示屏200和应用关联的方式,和设置第二显示屏300和应用关联的方式也可以不同。例如,在图21所示实施例中,电子设备响应于用户操作设置第一显示屏200和视频应用关联。而第二显示屏300和应用的关联关系可以是电子设备实时确定的,即和第二显示屏300关联的应用是:电子设备最近一 次通过第二显示屏300显示的应用。本申请实施例对设置不同显示区域和应用关联的具体方式和具体时刻不作限定。
接下来示例性介绍电子设备100中的各个部件在图6-图7所示场景下的协作关系,具体如下图23-图26所示。以下实施例以角度传感器180M检测显示屏194的弯折角度为例进行说明。
请参见图23,图23示例性示出电子设备100中的各个部件在图6所示场景下的一种协作关系。
电子设备100从图6的(A)所示状态变换为图6的(B)所示状态的过程中,协作关系具体如下所示:
1.电子设备100处于展开状态,在第一显示屏200显示第一应用的用户界面。
2.角度传感器180M检测电子设备100的弯折角度,并上报给处理器110。
3.处理器110确定第一显示屏200的弯折角度小于第一角度阈值,并以此确定显示区域为第二显示屏300的第六显示区域303。
4.处理器110确定不存在和第二显示屏300的第六显示区域303关联的应用。
5.处理器110指示第二显示屏300的第六显示区域303继续显示电子设备100处于展开状态显示的应用的用户界面,即第一应用的用户界面。
6.第二显示屏300的第六显示区域303显示第一应用的用户界面。
电子设备100从图6的(B)所示状态变换为图6的(C)所示状态的过程中,协作关系具体如下所示:
7.压力传感器180A检测到作用于第二显示屏300的第六显示区域303的触控操作,该触控操作用于打开第二应用。
8.压力传感器180A将上述触控操作的事件上报至处理器110。
9.处理器110根据上述触控操作的事件,确定显示的应用为第二应用。
10.处理器110指示第二显示屏300的第六显示区域303显示第二应用的用户界面。
11.第二显示屏300的第六显示区域303显示第二应用的用户界面。
上述3和4的顺序不作限定,也可以是同时执行的。
请参见图24,图24示例性示出电子设备100中的各个部件在图6所示场景下的又一种协作关系。
电子设备100从图6的(C)所示状态变换为图6的(A)所示状态的过程中,协作关系具体如下所示:
12.角度传感器180M检测电子设备100的弯折角度,并上报给处理器110。
13.处理器110确定电子设备100的弯折角度大于第二角度阈值,并以此确定显示区域为第一显示屏200。
14.处理器110确定第一显示屏200和第一应用关联。该关联关系可以是处理器110自行确定的,也可以是响应于用户操作设置的,具体示例可参见上图18-图22。
15.处理器110指示第一显示屏200显示和第一显示屏200关联的应用的用户界面,即第一应用的用户界面。
16.第一显示屏200显示第一应用的用户界面。
电子设备100从图6的(A)所示状态变换为图6的(C)所示状态的过程中,协作关系具体如下所示:
17.角度传感器180M检测电子设备100的弯折角度,并上报给处理器110。
18.处理器110确定电子设备100的弯折角度小于第一角度阈值,并以此确定显示区域为第二显示屏300的第六显示区域303。
19.处理器110确定电子设备100最近一次处于折叠状态显示的应用为第二应用,即第二显示屏300最近一次显示的应用为第二应用。
20.处理器110指示第二显示屏300的第六显示区域303显示电子设备100最近一次处于折叠状态显示的应用的用户界面,即第二应用的用户界面。
21.第二显示屏300的第六显示区域303显示第二应用的用户界面。
上述13和14的顺序不作限定,也可以是同时执行的。上述18和19的顺序不作限定,也可以是同时执行的。
接下来介绍电子设备100中的各个部件在图7所示场景下的一种协作关系。
请参见图25,图25示例性示出电子设备100中的各个部件在图7所示场景下的一种协作关系。
电子设备100从图7的(A)所示状态变换为图7的(B)所示状态的过程中,协作关系具体如下所示:
1.电子设备100处于展开状态,在第一显示屏200显示第一应用的用户界面。
2.压力传感器180A检测到作用于第一显示屏200的触控操作,该触控操作用于打开第三应用。
3.压力传感器180A将上述触控操作的事件上报至处理器110。
4.处理器110根据上述触控操作的事件,确定显示的应用为第三应用。
5.处理器110指示第一显示屏200显示第三应用的用户界面。
6.第一显示屏200显示第三应用的用户界面。
电子设备100从图7的(B)所示状态变换为图7的(C)所示状态的过程中,协作关系具体如下所示:
7.角度传感器180M检测电子设备100的弯折角度,并上报给处理器110。
8.处理器110确定电子设备100的弯折角度小于第一角度阈值,并以此确定显示区域为第二显示屏300的第六显示区域303。
9.处理器110确定不存在和第二显示屏300的第六显示区域303关联的应用。
10.处理器110指示第二显示屏300的第六显示区域303继续显示电子设备100处于展开状态显示的应用的用户界面,即第三应用的用户界面。
11.第二显示屏300的第六显示区域303显示第三应用的用户界面。
上述8和9的顺序不作限定,也可以是同时执行的。
请参见图26,图26示例性示出电子设备100中的各个部件在图7所示场景下的又一种协作关系。
电子设备100从图7的(C)所示状态变换为图7的(A)所示状态的过程中,协作关系具体如下所示:
12.角度传感器180M检测电子设备100的弯折角度,并上报给处理器110。
13.处理器110确定电子设备100的弯折角度大于第二角度阈值,并以此确定显示区域为第一显示屏200。
14.处理器110确定第一显示屏200和第一应用关联。该关联关系可以是处理器110自行确定的,也可以是响应于用户操作设置的,具体示例可参见上图18-图22。
15.处理器110指示第一显示屏200显示和第一显示屏200关联的应用的用户界面,即第一应用的用户界面。
16.第一显示屏200显示第一应用的用户界面。
电子设备100从图7的(A)所示状态变换为图7的(C)所示状态的过程中,协作关系具体如下所示:
17.角度传感器180M检测电子设备100的弯折角度,并上报给处理器110。
18.处理器110确定电子设备100的弯折角度小于第一角度阈值,并以此确定显示区域为第二显示屏300的第六显示区域303。
19.处理器110确定电子设备100最近一次处于折叠状态显示的应用为第三应用,即第二显示屏300最近一次显示的应用为第三应用。
20.处理器110指示第二显示屏300的第六显示区域303显示电子设备100最近一次处于折叠状态显示的应用的用户界面,即第三应用的用户界面。
21.第二显示屏300的第六显示区域303显示第三应用的用户界面。
上述13和14的顺序不作限定,也可以是同时执行的。上述18和19的顺序不作限定,也可以是同时执行的。
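为便于理解图23-图26中处理器110的决策过程,下面给出一段示意性的Java代码对其进行归纳(仅为假设性草图,类名DisplayDecision、参数名与阈值均为示例,未涵盖翻转、弯折、分屏等全部分支):

public class DisplayDecision {
    // 根据弯折角度决定要显示的应用:展开时优先显示与第一显示屏关联的应用,
    // 折叠时显示第二显示屏最近一次显示的应用;不存在对应应用时继续显示当前应用
    public static String decideApp(float bendAngle,
                                   String appAssociatedWithFirstScreen,
                                   String lastAppShownOnSecondScreen,
                                   String currentApp) {
        if (bendAngle > 160f) { // 大于第二角度阈值:展开状态
            return appAssociatedWithFirstScreen != null ? appAssociatedWithFirstScreen : currentApp;
        }
        if (bendAngle < 30f) {  // 小于第一角度阈值:折叠状态
            return lastAppShownOnSecondScreen != null ? lastAppShownOnSecondScreen : currentApp;
        }
        return currentApp;      // 其余情况(例如弯折状态):保持当前显示的应用
    }
}

实际的决策还需要结合朝上的显示区域、是否存在关联应用以及是否分屏显示等情况,具体可参见上文各实施例。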
基于上述图1-图26所示的一些实施例,下面介绍本申请提供的显示方法。
请参见图27,图27是本申请实施例提供的一种显示方法。该方法可以应用于图1所示的电子设备100。该方法也可以应用于图2所示的电子设备100。该方法包括但不限于如下步骤:
S101:接收第一用户操作。
S102:响应于第一用户操作,电子设备确定第一显示区域和第一应用关联。
具体地,电子设备响应于用户操作确定显示区域和应用关联的示例可参见上图18-图22所示实施例。
S103:电子设备处于第一物理状态时,通过第一显示区域显示第一应用的用户界面。
S104:接收第二用户操作。
S105:响应于第二用户操作,电子设备通过第一显示区域显示第二应用的用户界面。
S106:当通过第一显示区域显示第二应用的用户界面时,响应于第三用户操作,电子设备的物理状态由第一物理状态变换为第二物理状态,电子设备通过第二显示区域显示第二应用的用户界面。
S107:当通过第二显示区域显示第二应用的用户界面时,响应于电子设备的物理状态由第二物理状态变换为第一物理状态,电子设备通过第一显示区域显示第一应用的用户界面。
示例性地,图27所示的电子设备物理状态变化过程的示例可参见上图7。
具体地,电子设备包括第一显示区域和第二显示区域,电子设备处于第一物理状态时通过第一显示区域显示用户界面,电子设备处于第二物理状态时通过第二显示区域显示用户界面。
在一些实施例中,第一显示区域包括至少部分第二显示区域;和/或,第二显示区域包括至少部分第一显示区域。示例性地,第一显示区域和第二显示区域属于电子设备的同一个显 示屏,例如上图3-图5所示的第二显示屏300。第一显示区域包括上图3-图5所示的第四显示区域301和第五显示区域302,第二显示区域包括上图3-图5所示的第六显示区域303和第五显示区域302,第一显示区域和第二显示区域存在重叠的显示区域,即第五显示区域302。
在一些实施例中,电子设备为可折叠电子设备。当电子设备处于展开状态时,第一显示区域和第二显示区域处于同一个平面。当电子设备处于折叠状态时,第一显示区域的出光面和第二显示区域的出光面相背。示例性地,第一显示区域和第二显示区域属于电子设备的同一个显示屏,例如上图3-图5所示的第二显示屏300。第一显示区域、第二显示区域分别为上图3-图5所示的第四显示区域301、第六显示区域303。在上述情况下,第一物理状态和第二物理状态可以均为折叠状态,第三用户操作为用户翻转电子设备的操作。例如,第一物理状态下,第一显示区域朝上,第二显示区域朝下;第二物理状态下,第二显示区域朝上,第一显示区域朝下。或者,第一物理状态下第一显示区域正对用户,第二物理状态下第二显示区域正对用户。
可选地,电子设备包括第一显示屏,第一显示区域为第一显示屏的至少部分显示区域,第二显示区域为第一显示屏的至少部分显示区域。例如,第一显示屏可以为一个柔性折叠屏,第一显示区域和第二显示区域是这一个柔性折叠屏上的显示区域。
可选地,电子设备包括第一显示屏和第二显示屏,第一显示区域为第一显示屏的显示区域,第二显示区域为第二显示屏的显示区域。例如,第一显示屏可以是两个刚性屏和柔性屏、链条等连接组件拼接而成的显示屏,第一显示区域是一个刚性屏上的显示区域,第二显示区域是另一个刚性屏上的显示区域。
在一些实施例中,电子设备为可折叠电子设备,电子设备包括第一显示屏,第二显示区域为第一显示屏的全屏显示区域,第一显示区域为第一显示屏的部分显示区域,第一物理状态为折叠状态,第二物理状态为展开状态。示例性地,第一显示屏为上图3-图5所示的第二显示屏300,第二显示区域包括上图3-图5所示的第四显示区域301、第五显示区域302和第六显示区域303。第一显示区域为第四显示区域301或第六显示区域303,可选地,第一显示区域还可以包括第五显示区域302。
上述情况下电子设备的结构和物理状态变化过程的示例可参见上图16。
在一些实施例中,电子设备为可折叠电子设备,当电子设备处于展开状态时,第一显示区域的出光面和第二显示区域的出光面相背。示例性地,第一显示区域、第二显示区域为上图3-图5所示的第一显示屏200、第二显示屏300。或者,第一显示区域为第一显示屏200上的至少一个显示区域,例如第一显示区域201。或者,第二显示区域为第二显示屏300上的至少一个显示区域,例如第四显示区域301。可选地,第一物理状态为展开状态,第二物理状态为折叠状态,第三用户操作为用户将电子设备由展开状态转换为折叠状态的操作。可选地,第一物理状态和第二物理状态均为展开状态,第三用户操作为用户翻转电子设备的操作。例如,第一物理状态下,第一显示区域朝上,第二显示区域朝下;第二物理状态下,第二显示区域朝上,第一显示区域朝下。或者,第一物理状态下第一显示区域正对用户,第二物理状态下第二显示区域正对用户。在这种情况下,电子设备的结构和物理状态变化过程的示例可参见上图3-图15。
在一些实施例中,电子设备不可折叠,第一显示区域的出光面和第二显示区域的出光面相背。第三用户操作为用户翻转电子设备的操作。例如,第一物理状态下,第一显示区域朝上,第二显示区域朝下;第二物理状态下,第二显示区域朝上,第一显示区域朝下。或者,第一物理状态下第一显示区域正对用户,第二物理状态下第二显示区域正对用户。在这种情 况下,电子设备的结构和物理状态变化过程的示例可参见上图17。
在一些实施例中,该方法还可以包括:当通过第一显示区域显示第一应用的用户界面时,响应于电子设备的物理状态由第一物理状态变换为第二物理状态,电子设备通过第二显示区域显示第一应用的用户界面;当电子设备通过第二显示区域显示第一应用的用户界面时,接收第四用户操作;响应于第四用户操作,电子设备通过第二显示区域显示第三应用的用户界面,第三应用和第一应用不同。在这种情况下,电子设备物理状态变化过程的示例可参见上图6。
在一些实施例中,响应于电子设备的物理状态由第一物理状态变换为第二物理状态,电子设备可以通过第二显示区域显示电子设备最近一次处于第二物理状态显示的应用的用户界面。示例性地,S107之后,该方法还可以包括:响应于电子设备的物理状态由第一物理状态变换为第二物理状态,电子设备通过第二显示区域显示第二应用的用户界面。
在一些实施例中,电子设备也可以响应于用户操作自定义设置第二显示区域和第四应用关联,第四应用和第一应用不同,具体示例可参见上图18-图22所示实施例。示例性地,该方法还可以包括:接收第五用户操作;响应于第五用户操作,电子设备确定第二显示区域和第四应用关联;当通过第一显示区域显示第一应用的用户界面时,响应于电子设备的物理状态由第一物理状态变换为第二物理状态,电子设备通过第二显示区域显示第四应用的用户界面。
不限于上述列举的情况,在具体实现中,电子设备也可以预设第一显示区域和第一应用关联,本申请对确定显示区域和应用关联的方式不作限定。
本申请对于关联关系的实现方式不作限定。可选地,电子设备可以通过数据集来存储显示区域和应用的关联关系。可选地,数据集可以是电子设备确定存在关联关系时临时存储的,从而减少不必要的存储开销。示例性地,数据集可以包括一组数据,其中包括第一显示区域的标识、和第一显示区域关联的应用的标识,可选地,还可以包括第一物理状态的标识。或者,数据集也可以包括两组数据,其中一组数据为上述的一组数据,另一组数据包括第二显示区域的标识、和第二显示区域关联的应用的标识,可选地,还可以包括第二物理状态的标识。电子设备可以根据数据集确定显示区域和应用的关联关系,以此在不同的物理状态下,通过对应的显示区域显示关联的应用。
在一些实施例中,电子设备通过数据集存储第一显示区域、第二显示区域和应用的关联关系,该数据集包括两组数据。电子设备确定第一显示区域和第一应用关联后,数据集中的一组数据包括第一显示区域的标识和第一应用的标识,而另一组数据可以为空。电子设备确定第一显示区域和第一应用关联后,物理状态第一次变换为第二物理状态时,电子设备可以通过第二显示区域继续显示电子设备处于第一物理状态时显示的应用的用户界面。假设后续电子设备的物理状态从第二物理状态变换为第一物理状态之前,电子设备最近一次通过第二显示区域显示的应用为第二应用。则电子设备可以确定数据集中的另一组数据包括第二显示区域的标识和第二应用的标识。因此,后续电子设备的物理状态变换为第一物理状态时,电子设备可以根据数据集确定通过第一显示区域显示第一应用的用户界面。电子设备的物理状态变换为第二物理状态时,电子设备可以根据数据集确定通过第二显示区域显示第二应用的用户界面。
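作为参考,上述数据集可以用如下示意性的Java代码表示(类名AssociationStore、字段名及字符串标识均为假设,仅用于说明一种可能的存储和查询方式,并非对关联关系实现方式的限定);该示例同时体现了一个显示区域通常只和一个应用关联:

import java.util.HashMap;
import java.util.Map;

public class AssociationStore {
    // 一组数据:显示区域标识、关联应用标识,以及可选的物理状态标识
    public static class Entry {
        public final String displayAreaId;
        public final String appId;
        public final String physicalStateId; // 可选,可为null

        public Entry(String displayAreaId, String appId, String physicalStateId) {
            this.displayAreaId = displayAreaId;
            this.appId = appId;
            this.physicalStateId = physicalStateId;
        }
    }

    // 以显示区域标识为键,保证一个显示区域最多关联一个应用
    private final Map<String, Entry> entries = new HashMap<>();

    // 设置关联:若该显示区域已有关联应用,旧的关联关系被新的覆盖
    public void setAssociation(String displayAreaId, String appId, String physicalStateId) {
        entries.put(displayAreaId, new Entry(displayAreaId, appId, physicalStateId));
    }

    // 取消关联
    public void clearAssociation(String displayAreaId) {
        entries.remove(displayAreaId);
    }

    // 物理状态变化时,查询目标显示区域关联的应用标识;不存在关联时返回null
    public String getAssociatedApp(String displayAreaId) {
        Entry e = entries.get(displayAreaId);
        return e == null ? null : e.appId;
    }
}

例如,响应于用户在设置界面的操作,可以调用setAssociation("display_area_1", "app_1", "state_1")记录关联关系;电子设备的物理状态变换为第一物理状态时,再通过getAssociatedApp("display_area_1")查询应显示的应用,其中的字符串标识均为假设性示例。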
在一些实施例中,显示区域和应用的关联关系可以发生变化,则电子设备可以更新数据集中存储的应用标识。例如,假设数据集中原本包括第二显示区域的标识和应用A的标识。电子设备接收作用于设置界面的用户操作,该用户操作用于设置第二显示区域和应用B关联, 则电子设备可以更新数据集中的另一组数据,更新后的另一组数据包括第二显示区域的标识和应用B的标识。或者,电子设备的物理状态从第二物理状态变换为第一物理状态之前,电子设备最近一次通过第二显示区域显示的应用为应用C,则电子设备可以更新数据集中的另一组数据,更新后的另一组数据包括第二显示区域的标识和应用C的标识。
在一些实施例中,在电子设备确定某个应用和显示区域关联的情况下,若电子设备运行该应用,则即使该应用不被电子设备显示,电子设备也可以将该应用的所有或大部分进程保留,而不回收运行该应用所需的网络资源、***资源等。例如,电子设备确定第一显示区域和第一应用关联,以及电子设备运行第一应用和第二应用。当电子设备处于第二物理状态,并通过第二显示区域显示第二应用的用户界面时,电子设备不会回收运行第一应用所需的资源。因此,当电子设备的物理状态变换为第一物理状态时,电子设备可以通过第一显示区域显示:电子设备最近一次处于第一物理状态时通过第一显示区域显示的第一应用的用户界面。从而避免电子设备关闭第一应用的任意应用进程带来的用户数据丢失等情况,提升打开第一应用的速度,提升用户使用感。
在图27所示的方法中,电子设备确定第一显示区域和第一应用关联之后,只要电子设备的物理状态变换为第一物理状态,电子设备就可以通过第一显示区域显示第一应用的用户界面。也就是说,用户可以通过改变电子设备物理状态来快速切换显示的应用,无需多次退出或隐藏当前显示的应用并重新打开想要查看的应用,大大方便了用户的使用。并且,电子设备在不同物理状态下用于显示用户界面的显示区域不同,不会改变应用界面的已有布局,显示效果更好,使用也更加方便。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。上述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行上述计算机程序指令时,全部或部分地产生按照本申请上述的流程或功能。上述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。上述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,上述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线)或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。上述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。上述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,数字通用光盘(digital versatile disc,DVD))、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
总之,以上所述仅为本发明技术方案的实施例而已,并非用于限定本发明的保护范围。凡根据本发明的揭露,所作的任何修改、等同替换、改进等,均应包含在本发明的保护范围之内。
显然,本领域的技术人员可以对本发明进行各种改动和变型而不脱离本发明的精神和范围。这样,倘若本发明的这些修改和变型属于本发明权利要求及其等同技术的范围之内,则本发明也意图包含这些改动和变型在内。

Claims (15)

  1. 一种显示方法,其特征在于,应用于电子设备,所述电子设备包括第一显示区域和第二显示区域,所述方法包括:
    接收第一用户操作;
    响应于所述第一用户操作,所述电子设备确定所述第一显示区域和第一应用关联;
    所述电子设备处于第一物理状态时,通过所述第一显示区域显示所述第一应用的用户界面;
    接收第二用户操作;
    响应于所述第二用户操作,所述电子设备通过所述第一显示区域显示第二应用的用户界面,所述第一应用和所述第二应用不同;
    当通过所述第一显示区域显示所述第二应用的用户界面时,响应于第三用户操作,所述电子设备的物理状态由所述第一物理状态变换为第二物理状态,所述电子设备通过所述第二显示区域显示所述第二应用的用户界面;
    当通过所述第二显示区域显示所述第二应用的用户界面时,响应于所述电子设备的物理状态由所述第二物理状态变换为所述第一物理状态,所述电子设备通过所述第一显示区域显示所述第一应用的用户界面。
  2. 如权利要求1所述的方法,其特征在于,所述第一显示区域包括至少部分所述第二显示区域;和/或,所述第二显示区域包括至少部分所述第一显示区域。
  3. 如权利要求1所述的方法,其特征在于,所述电子设备为可折叠电子设备;当所述电子设备处于展开状态时,所述第一显示区域和所述第二显示区域处于同一个平面;当所述电子设备处于折叠状态时,所述第一显示区域的出光面和所述第二显示区域的出光面相背。
  4. 如权利要求3所述的方法,其特征在于,所述电子设备包括第一显示屏,所述第一显示区域为所述第一显示屏的至少部分显示区域,所述第二显示区域为所述第一显示屏的至少部分显示区域。
  5. 如权利要求3所述的方法,其特征在于,所述电子设备包括第一显示屏和第二显示屏,所述第一显示区域为所述第一显示屏的显示区域,所述第二显示区域为所述第二显示屏的显示区域。
  6. 如权利要求1、2或4所述的方法,其特征在于,所述电子设备为可折叠电子设备,所述电子设备包括第一显示屏,所述第二显示区域为所述第一显示屏的全屏显示区域,所述第一显示区域为所述第一显示屏的部分显示区域,所述第一物理状态为折叠状态,所述第二物理状态为展开状态。
  7. 如权利要求3所述的方法,其特征在于,所述第一物理状态和所述第二物理状态均为折叠状态,所述第三用户操作为所述用户翻转所述电子设备的操作。
  8. 如权利要求1所述的方法,其特征在于,所述电子设备为可折叠电子设备,当所述电子设备处于展开状态时,所述第一显示区域的出光面和所述第二显示区域的出光面相背;所述第一物理状态为所述展开状态,所述第二物理状态为折叠状态,所述第三用户操作为所述用户将所述电子设备由所述展开状态转换为所述折叠状态的操作。
  9. 如权利要求1所述的方法,其特征在于,所述电子设备为可折叠电子设备,当所述电子设备处于展开状态时,所述第一显示区域的出光面和所述第二显示区域的出光面相背;所述第一物理状态和所述第二物理状态均为所述展开状态,所述第三用户操作为所述用户翻转所述电子设备的操作。
  10. 如权利要求1-9任一项所述的方法,其特征在于,所述方法还包括:
    当通过所述第一显示区域显示所述第一应用的用户界面时,响应于所述电子设备的物理状态由所述第一物理状态变换为所述第二物理状态,所述电子设备通过所述第二显示区域显示所述第一应用的用户界面;
    当所述电子设备通过所述第二显示区域显示所述第一应用的用户界面时,接收第四用户操作;
    响应于所述第四用户操作,所述电子设备通过所述第二显示区域显示第三应用的用户界面,所述第三应用和所述第一应用不同。
  11. 如权利要求1-9任一项所述的方法,其特征在于,当通过所述第二显示区域显示所述第二应用的用户界面时,响应于所述电子设备的物理状态由所述第二物理状态变换为所述第一物理状态,所述电子设备通过所述第一显示区域显示所述第一应用的用户界面之后,所述方法还包括:
    响应于所述电子设备的物理状态由所述第一物理状态变换为所述第二物理状态,所述电子设备通过所述第二显示区域显示所述第二应用的用户界面。
  12. 如权利要求1-9任一项所述的方法,其特征在于,所述方法还包括:
    接收第五用户操作;
    响应于所述第五用户操作,所述电子设备确定所述第二显示区域和第四应用关联;
    当通过所述第一显示区域显示所述第一应用的用户界面时,响应于所述电子设备的物理状态由所述第一物理状态变换为所述第二物理状态,所述电子设备通过所述第二显示区域显示所述第四应用的用户界面,所述第一应用和所述第四应用不同。
  13. 如权利要求1-9任一项所述的方法,其特征在于,所述接收第一用户操作时,所述电子设备处于所述第一物理状态,并通过所述第一显示区域显示所述第一应用的用户界面;所述第一用户操作为作用于显示所述第一应用的用户界面的显示区域的用户操作。
  14. 一种电子设备,其特征在于,所述电子设备包括第一显示区域、第二显示区域、一个或多个存储器、一个或多个处理器;所述一个或多个存储器用于存储计算机程序,所述一个或多个处理器用于调用所述计算机程序,所述计算机程序包括指令,当所述指令被所述一个或多个处理器执行时,使得所述电子设备执行权利要求1至13任一项所述的方法。
  15. 一种计算机存储介质,其特征在于,包括计算机程序,所述计算机程序包括指令,当所述指令在处理器上运行时,实现如权利要求1至13任一项所述的方法。
PCT/CN2022/070132 2021-01-30 2022-01-04 一种显示方法及电子设备 WO2022161119A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110134456.7 2021-01-30
CN202110134456.7A CN114840280A (zh) 2021-01-30 2021-01-30 一种显示方法及电子设备

Publications (1)

Publication Number Publication Date
WO2022161119A1 true WO2022161119A1 (zh) 2022-08-04

Family

ID=82561336

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/070132 WO2022161119A1 (zh) 2021-01-30 2022-01-04 一种显示方法及电子设备

Country Status (2)

Country Link
CN (1) CN114840280A (zh)
WO (1) WO2022161119A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024093520A1 (zh) * 2022-11-04 2024-05-10 荣耀终端有限公司 控制亮屏的方法和电子设备

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024034810A1 (ko) * 2022-08-08 2024-02-15 삼성전자주식회사 다중 윈도우들을 위한 입력을 처리하는 전자 장치
WO2024043440A1 (ko) * 2022-08-24 2024-02-29 삼성전자주식회사 애플리케이션의 화면을 표시하는 디스플레이를 복수의 디스플레이들 간 전환하기 위한 방법 및 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109766053A (zh) * 2019-01-15 2019-05-17 Oppo广东移动通信有限公司 用户界面显示方法、装置、终端及存储介质
CN109889630A (zh) * 2019-01-11 2019-06-14 华为技术有限公司 显示方法及相关装置
CN110602273A (zh) * 2019-08-05 2019-12-20 华为技术有限公司 一种消息显示方法与电子设备
CN111124561A (zh) * 2019-11-08 2020-05-08 华为技术有限公司 应用于具有折叠屏的电子设备的显示方法及电子设备
WO2021221421A1 (ko) * 2020-04-27 2021-11-04 삼성전자 주식회사 디스플레이를 제어하는 방법 및 그 전자 장치

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111078091A (zh) * 2019-11-29 2020-04-28 华为技术有限公司 分屏显示的处理方法、装置及电子设备
CN111182137A (zh) * 2019-12-19 2020-05-19 华为技术有限公司 具有柔性屏幕的电子设备的显示方法和电子设备

Also Published As

Publication number Publication date
CN114840280A (zh) 2022-08-02

Similar Documents

Publication Publication Date Title
WO2021013158A1 (zh) 显示方法及相关装置
WO2021129326A1 (zh) 一种屏幕显示方法及电子设备
CN112714901B (zh) ***导航栏的显示控制方法、图形用户界面及电子设备
WO2021213164A1 (zh) 应用界面交互方法、电子设备和计算机可读存储介质
WO2021036571A1 (zh) 一种桌面的编辑方法及电子设备
WO2020253758A1 (zh) 一种用户界面布局方法及电子设备
WO2020221063A1 (zh) 切换父页面和子页面的方法、相关装置
CN112217923B (zh) 一种柔性屏幕的显示方法及终端
WO2021036770A1 (zh) 一种分屏处理方法及终端设备
WO2022161119A1 (zh) 一种显示方法及电子设备
WO2022022575A1 (zh) 显示控制方法、装置和存储介质
US11972106B2 (en) Split screen method and apparatus, and electronic device
CN116360725B (zh) 显示交互***、显示方法及设备
WO2022143180A1 (zh) 协同显示方法、终端设备及计算机可读存储介质
CN112068907A (zh) 一种界面显示方法和电子设备
CN113641271A (zh) 应用窗口的管理方法、终端设备及计算机可读存储介质
WO2022022674A1 (zh) 应用图标布局方法及相关装置
US20230236714A1 (en) Cross-Device Desktop Management Method, First Electronic Device, and Second Electronic Device
WO2022222688A1 (zh) 一种窗口控制方法及其设备
WO2022037408A1 (zh) 一种显示方法及电子设备
US12032410B2 (en) Display method for flexible display, and terminal
WO2022022381A1 (zh) 生成涂鸦图案的方法、装置、电子设备及存储介质
WO2024109573A1 (zh) 悬浮窗显示的方法和电子设备
CN117369914A (zh) 显示方法及电子设备
CN116339569A (zh) 分屏显示的方法、折叠屏设备和计算机可读存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22744997

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22744997

Country of ref document: EP

Kind code of ref document: A1