CN118276726A - Interaction method, device, electronic equipment and storage medium - Google Patents

Interaction method, device, electronic equipment and storage medium

Info

Publication number
CN118276726A
Authority
CN
China
Prior art keywords
target information
target
display screen
electronic device
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211721521.7A
Other languages
Chinese (zh)
Inventor
王剑锋
李轩恺
魏曦
汤志斌
许达兴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202211721521.7A priority Critical patent/CN118276726A/en
Priority to PCT/CN2023/121694 priority patent/WO2024139479A1/en
Publication of CN118276726A publication Critical patent/CN118276726A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interaction method, an interaction apparatus, an electronic device and a storage medium, and relates to the technical field of electronic devices. The method is applied to an electronic device that includes a display screen provided with an opening, and comprises the following steps: displaying a current interface; in response to a first operation on target information of the current interface, moving the target information to the area on the display screen corresponding to the opening, where it disappears, and saving the target information; and displaying the target information in response to a second operation on the electronic device. By combining the opening in the display screen with the saving or display of information, the application gives the opening in the display screen a new use and improves the user experience.

Description

Interaction method, device, electronic equipment and storage medium
Technical Field
The present application relates to the technical field of electronic devices, and in particular, to an interaction method, an interaction device, an electronic device, and a storage medium.
Background
With the development of science and technology, electronic devices are used ever more widely, provide more and more functions, and have become one of the necessities of daily life. Users pursue a full screen, but because of photosensitive devices such as the front camera, an opening area has to be reserved to support the functions of those devices. This occupies part of the display area, affects the display effect, and leads to a poor user experience.
Disclosure of Invention
In view of the above, the present application proposes an interaction method, an interaction device, an electronic device, and a storage medium, so as to solve the above problem.
In a first aspect, an embodiment of the present application provides an interaction method applied to an electronic device, where the electronic device includes a display screen having an opening. The method includes: displaying a current interface; in response to a first operation on target information of the current interface, moving the target information to the area on the display screen corresponding to the opening, making the target information disappear, and saving the target information; or displaying the target information in response to a second operation on the electronic device.
In a second aspect, an embodiment of the present application provides an interaction apparatus applied to an electronic device, where the electronic device includes a display screen having an opening. The apparatus includes: a current interface display module, configured to display a current interface; a first operation response module, configured to, in response to a first operation on target information of the current interface, move the target information to the area on the display screen corresponding to the opening, make the target information disappear, and save the target information; or a second operation response module, configured to display the target information in response to a second operation on the electronic device.
In a third aspect, an embodiment of the present application provides an electronic device comprising a display screen, a memory, and a processor, the display screen and the memory being coupled to the processor, the memory storing instructions that when executed by the processor perform the above method.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having program code stored therein, the program code being callable by a processor to perform the above method.
According to the interaction method, apparatus, electronic device, and storage medium provided by the embodiments of the present application, the current interface is displayed; in response to a first operation on target information of the current interface, the target information moves to the area on the display screen corresponding to the opening and then disappears, and the target information is saved; and the target information is displayed in response to a second operation on the electronic device. Through the cooperation of the opening in the display screen with the saving of information, the opening, which would otherwise only detract from a full-screen display, visually participates in saving the information. Information can thus be saved without sacrificing any area that would otherwise be used for display, a new use is given to the opening, and a different user experience is brought to the user.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that a person skilled in the art may obtain other drawings from these drawings without inventive effort.
Fig. 1 shows a first structural schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 2 shows a second structural schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 3 shows a third structural schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 4 shows a fourth structural schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 5 shows a fifth structural schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 6 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 7 shows a first interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 8 shows a second interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 9 shows a third interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 10 shows a fourth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 11 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 12 shows a flow chart of step S230 of the interaction method shown in Fig. 11;
Fig. 13 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 14 shows a fifth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 15 shows a sixth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 16 shows a flow chart of step S330 of the interaction method shown in Fig. 13;
Fig. 17 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 18 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 19 shows a seventh interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 20 shows an eighth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 21 shows a ninth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 22 shows a tenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 23 shows an eleventh interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 24 shows a twelfth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 25 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 26 shows a thirteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 27 shows a fourteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 28 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 29 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 30 shows a flow chart of an interaction method according to an embodiment of the present application;
Fig. 31 shows a fifteenth interface schematic diagram of an electronic device according to an embodiment of the present application;
Fig. 32 shows a block diagram of an interaction apparatus according to an embodiment of the present application;
Fig. 33 shows a block diagram of an electronic device for performing an interaction method according to an embodiment of the present application;
Fig. 34 shows a storage unit for storing or carrying program code for implementing an interaction method according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings.
In electronic devices such as mobile phones and tablet computers, the display screen is typically used to display text, pictures, icons, or video content. Typically, the electronic device includes a front panel, a rear cover, and a bezel, and the front panel includes an upper forehead area, a middle screen area, and a lower key area. In general, the forehead area is provided with an earphone sound outlet, a front camera, and other photosensitive devices, the middle screen area is provided with the display screen, and the lower key area is provided with one to three physical keys. With the development of technology, the lower key area has gradually been eliminated, and the physical keys originally arranged in it have been replaced by virtual keys on the display screen.
The earphone sound outlet and the photosensitive devices such as the front camera arranged in the forehead area are important for supporting the functions of the mobile phone and cannot easily be removed, so expanding the display area of the display screen to cover the forehead area is very difficult. After a series of studies, the inventors found that an opening can be formed in the display screen, and that by placing the photosensitive devices originally arranged in the forehead area into the opening, the original forehead area can be turned into a displayable area of the display screen, thereby increasing the area of the displayable area. It can be understood that the displayable area is an area that can be lit up for display.
Illustratively, the opening may be disposed on one or more edges of the display screen or in a non-edge area of the display screen, and the opening may be semicircular, rectangular, rounded-rectangular, circular, regular-polygonal, irregular, or the like. Illustratively, referring to fig. 1, the opening 140 may be a circular notch formed in a non-edge area of the display screen 130, where the hole formed by the circular notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity light sensor, a receiver, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen and corresponds to the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging by the front camera. Referring to fig. 2, the opening 140 may be a semicircular or V-shaped notch formed at an edge of the display screen 130, where the hole formed by the notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity sensor, a receiver, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen and corresponds to the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging by the front camera. Referring to fig. 3, the opening 140 may be a semicircular notch formed in a non-edge area of the display screen 130, where the hole formed by the semicircular notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity sensor, a receiver, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen and corresponds to the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging by the front camera.
The opening may also be formed in an edge area of the display screen rather than its middle area. For example, referring to fig. 4, the opening 140 may be a circular notch formed in a left edge area of the display screen 130, where the hole formed by the circular notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity sensor, an earpiece, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen and corresponds to the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging by the front camera. Referring to fig. 5, the opening 140 may be a circular notch formed in a right edge area of the display screen 130, where the hole formed by the circular notch is used to accommodate, as the photosensitive device, at least one front panel component among a projection component, a camera, a proximity sensor, a receiver, a distance sensor, an ambient light brightness sensor, a temperature sensor, and a pressure sensor. Alternatively, the photosensitive device may be disposed below the display screen at a position corresponding to the opening 140; for example, the front camera is disposed below the display screen and corresponds to the opening 140 in the thickness direction of the display screen, so that ambient light can enter the front camera through the opening to support photosensitive imaging by the front camera.
However, the inventors found that in electronic devices with the above structure, an opening area still has to be reserved for the photosensitive devices such as the front camera to support their functions, which occupies part of the display area, affects the display effect, and leads to a poor user experience. In addition, when information is saved on an electronic device, an area usually has to be set aside in the current interface to participate in saving the information, which also sacrifices display area and brings a poor experience to the user. In view of these problems, the inventors propose the interaction method, apparatus, electronic device, and storage medium of the present application, in which the opening in the display screen is combined with the saving or display of information, so that a new use can be given to the opening and the user experience is improved. The specific interaction method is described in detail in the following embodiments.
Referring to fig. 6, fig. 6 is a flow chart of an interaction method according to an embodiment of the present application. The interaction method combines the opening in the display screen with the saving or display of information, so that a new use can be given to the opening and the user experience is improved. In a specific embodiment, the interaction method is applied to the interaction apparatus 200 shown in fig. 32 and to the electronic device 100 (fig. 33) configured with the interaction apparatus 200. The specific flow of this embodiment is described below taking a smart phone as an example, and it will be understood that the electronic device to which this embodiment applies may include a mobile terminal, a tablet computer, a wearable electronic device, and the like, which is not limited herein. In this embodiment, the electronic device includes a display screen, and the display screen has an opening. The flow shown in fig. 6 is described in detail below, and the interaction method may specifically include the following steps:
step S110: and displaying the current interface.
In this embodiment, the electronic device may display the current interface. The current interface may be an interface displaying any content, or the current interface may be an interface displaying any operable (e.g., draggable) content, which is not limited herein.
In some implementations, the electronic device may display the current interface in response to an interface display instruction. Optionally, the electronic device may determine that an interface display instruction is received and display the current interface in response to any of the following: receiving first voice information; a first touch operation on a target physical key of the electronic device; a first touch operation on a target virtual key of the electronic device; a first target shaking operation (such as a lifting operation) acting on the electronic device; a first target sliding operation on the display screen of the electronic device; or the environment of the electronic device meeting a first preset environmental condition (for example, the current time reaching a preset time, the current location being a preset location, or the current temperature reaching a preset temperature). This is not limited herein.
In some embodiments, the current interface displayed by the electronic device may include a system desktop, a negative one-screen interface, a lock screen interface, a chat interface, a video play interface, a browser interface, an album interface, and the like, without limitation.
Step S120: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
In this embodiment, the target information may include text (such as a note, a mail, a chat message, or web page text), a picture, a video, a file, a screenshot, a screen recording, a web page being read, an article being browsed in WeChat, a uniform resource locator (URL), a contact (such as the avatar of a contact being chatted with), a card (such as an express delivery card or an online bus card), and the like, which is not limited herein.
In this embodiment, during display of the current interface, whether a first operation on the target information of the current interface is received may be monitored. If a first operation on the target information of the current interface is received, then in response to that first operation the target information may be moved to the area on the display screen corresponding to the opening and made to disappear, and the target information may be saved. If no first operation on the target information of the current interface is received, the current interface may continue to be displayed while monitoring whether such a first operation is received.
In some implementations, the first operation on the target information of the current interface may include: inputting first voice information carrying the target information, or inputting a first touch operation on the target information. Inputting a first touch operation on the target information may include: a first touch operation on hardware of the electronic device (for example, the rear case, the display screen, a power key, or a side edge), a first touch operation on a control displayed on the display screen of the electronic device, and the like, where the control displayed on the display screen may be used to trigger the first operation on the target information, such as a "save" control.
In one manner, the electronic device may, in response to the first operation on the target information of the current interface, set the target information to a selected state and generate a reduced-size version of the target information or a display icon corresponding to the target information; the reduced-size version or the display icon then moves toward the area on the display screen corresponding to the opening, disappears when it reaches that area, and the target information is saved. Optionally, the target information disappearing after moving to the area on the display screen corresponding to the opening may include: the reduced-size version of the target information or the display icon corresponding to the target information disappears while the target information on the current interface remains displayed at its original position, so that the integrity of the current interface is maintained; or the reduced-size version or the display icon disappears and the target information also disappears from its original position on the current interface, so that repeated saving of the target information can be avoided.
In some embodiments, the electronic device may preset and store a save area and a save path for the target information saved through the first operation on the target information of the current interface. The save area may be a storage area in the memory of the electronic device or a storage area on the hard disk of the electronic device, which is not limited herein. Therefore, in this embodiment, in response to the first operation on the target information of the current interface, the target information may be saved to the save area according to the save path.
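To illustrate how such a save area and save path could be realised in practice, a minimal Kotlin sketch is given below; the HoleVault class, the directory name and the JSON record layout are illustrative assumptions and are not part of the claimed method.

```kotlin
import android.content.Context
import org.json.JSONObject
import java.io.File

// Minimal sketch: persist target information that was "dropped into the opening".
// The class name, directory name, and file format are assumptions for illustration.
class HoleVault(private val context: Context) {

    // Preset save area: an app-private directory acting as the store behind the opening.
    private val saveDir: File =
        File(context.filesDir, "hole_vault").apply { mkdirs() }

    // Save one piece of target information (text, URL, file reference, ...) with its type.
    fun save(id: String, type: String, payload: String) {
        val record = JSONObject()
            .put("type", type)       // e.g. "text", "image", "url"
            .put("payload", payload) // the content itself or a reference to it
            .put("savedAt", System.currentTimeMillis())
        File(saveDir, "$id.json").writeText(record.toString())
    }

    // Read back a previously saved record, or null if nothing was saved under this id.
    fun load(id: String): JSONObject? =
        File(saveDir, "$id.json").takeIf { it.exists() }?.let { JSONObject(it.readText()) }

    // Remove a record, e.g. once it has been displayed again.
    fun delete(id: String) {
        File(saveDir, "$id.json").delete()
    }
}
```

In this sketch the save path is simply the file path under the app-private directory; any other preset storage area could stand behind the same interface.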
In some embodiments, while the target information moves to the area on the display screen corresponding to the opening and disappears, a preset animation effect may be displayed in the area on the display screen corresponding to the opening and in a preset surrounding area. Optionally, the preset animation effect may include particles, halos, and the like, which is not limited herein. When the preset animation effect includes particles and halos, the particles and halos may spread outward on the display screen with the area corresponding to the opening as the origin. The preset animation effect formed by the particles and halos may be presented on the display screen in a regular or irregular pattern; for example, it may be presented as a circle, a sector, a square, or the like, which is not limited herein.
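Such a radiating effect could, for example, be approximated on an Android device by animating an expanding halo centred on the screen region of the opening; the following sketch, with an assumed custom view and assumed radius and duration values, is only one possible rendering and not the effect claimed here.

```kotlin
import android.animation.ValueAnimator
import android.content.Context
import android.graphics.Canvas
import android.graphics.Paint
import android.view.View

// Illustrative sketch only: draws a halo that expands outward from the opening's
// on-screen position, roughly matching the "particles/halo radiating from the opening" idea.
class HoleHaloView(context: Context, private val holeX: Float, private val holeY: Float) : View(context) {

    private var radius = 0f
    private val paint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
        style = Paint.Style.STROKE
        strokeWidth = 6f
        alpha = 160
    }

    // Assumed maximum radius and duration; a real implementation would tune these.
    fun play(maxRadius: Float = 200f, durationMs: Long = 400) {
        ValueAnimator.ofFloat(0f, maxRadius).apply {
            duration = durationMs
            addUpdateListener { anim ->
                radius = anim.animatedValue as Float
                invalidate()                 // redraw with the new radius
            }
            start()
        }
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        canvas.drawCircle(holeX, holeY, radius, paint)  // halo centred on the opening
    }
}
```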
As one implementation, the first operation on the target information of the current interface may include a drag operation on the target information of the current interface. Optionally, when the electronic device displays the target information on the current interface and the user wants to save it, the user may drag the target information displayed on the current interface. The electronic device may respond to the drag operation by moving the target information on the current interface, and if a release operation on the target information is detected while the target information is moving, the target information may be moved to the area on the display screen corresponding to the opening and then disappear, and the target information is saved.
As another implementation, the first operation on the target information of the current interface may include a select-and-save operation on the target information of the current interface. Optionally, when the electronic device displays the target information on the current interface and the user wants to save it, the user may select the target information displayed on the current interface (for example, selecting text, a file, or a picture in the current interface). The electronic device may display a control (such as a menu item "save into the opening") in response to the selection operation, and if a confirmation operation on the control is detected, the target information may, in response to the confirmation operation, be moved to the area on the display screen corresponding to the opening and then disappear, and the target information is saved.
As yet another implementation, the first operation on the target information of the current interface may include a voice save operation on the target information of the current interface. Optionally, when the user wants to save the target information, the user instructs the electronic device to save the target information of the current interface by inputting a first voice (for example, "save the XX picture"). The electronic device may respond to the voice save operation by making the target information disappear when it moves to the area on the display screen corresponding to the opening, and saving the target information.
Referring to fig. 7 to fig. 9, fig. 7 shows a first interface schematic diagram of an electronic device according to an embodiment of the present application, fig. 8 shows a second interface schematic diagram of an electronic device according to an embodiment of the present application, and fig. 9 shows a third interface schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 7, the current interface of the electronic device displays target information A. If a first operation on the target information A is detected, the target information A may be moved toward the opening; when it is moved to the position shown in fig. 8, the target information A may disappear and be saved.
Step S130: the target information is displayed in response to a second operation for the electronic device.
In this embodiment, during display of the current interface, whether a second operation on the electronic device is received may be monitored. If a second operation on the electronic device is received, the target information may be displayed in response to that second operation. If no second operation on the electronic device is received, the current interface may continue to be displayed while monitoring whether a second operation is received.
In some implementations, the second operation on the electronic device may include: inputting second voice information carrying the target information, or inputting a second touch operation on the electronic device. Inputting a second touch operation on the electronic device may include: a second touch operation on hardware of the electronic device (for example, the rear case, the display screen, a power key, or a side edge), a second touch operation on a control displayed on the display screen of the electronic device, and the like, where the control displayed on the display screen may be used to trigger the second operation on the electronic device, such as a "display" control.
As one manner, the electronic device may display the saved target information in response to the second operation on the electronic device. The electronic device may preset and store a read path for displaying the target information saved in response to the second operation; therefore, in this embodiment, the saved target information may be displayed based on the read path in response to the second operation on the electronic device. Optionally, when the saved target information is displayed, the target information may be kept in the original save area or deleted. Optionally, displaying the target information may include: displaying the target information on the current interface in a preset display mode, or displaying the target information at a preset display position on the current interface, and the like, which is not limited herein.
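A possible counterpart to the save sketch above is given below for the display step: the saved record is read back over the read path and handed to the rendering code, and may optionally be deleted after it has been displayed. The helper names are assumptions; showOnCurrentInterface is only a placeholder for the actual UI code.

```kotlin
// Minimal sketch of the display step, assuming the HoleVault store from the earlier sketch.
// showOnCurrentInterface stands in for whatever code actually renders the information.
fun showOnCurrentInterface(type: String, payload: String) {
    println("display [$type]: $payload")   // placeholder for real rendering
}

fun onSecondOperation(vault: HoleVault, id: String, deleteAfterDisplay: Boolean = false) {
    val record = vault.load(id) ?: return  // nothing was saved under this id
    showOnCurrentInterface(record.getString("type"), record.getString("payload"))
    if (deleteAfterDisplay) {
        // Optional: remove the record once displayed, matching the
        // "keep or delete after display" alternative described above.
        vault.delete(id)
    }
}
```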
As one implementation, the second operation on the electronic device may include a touch operation on hardware of the electronic device. Optionally, when the electronic device has saved the target information and the user wants to display it on the current interface, the user may trigger a touch operation on the hardware of the electronic device by tapping the rear case of the electronic device, tapping a side edge of the electronic device, pressing the power key of the electronic device, or the like, and the electronic device may then display the target information in response to the touch operation on its hardware.
Taking pressing the power key as an example: in response to the number of presses of the power key reaching a count threshold (for example, two), it may be determined that a touch operation on the hardware of the electronic device has been triggered; or, in response to the duration of a press of the power key reaching a duration threshold, it may be determined that a touch operation on the hardware of the electronic device has been triggered.
As yet another implementation, the second operation on the electronic device may include a touch operation on the display screen of the electronic device. Optionally, when the electronic device has saved the target information and the user wants to display it on the current interface, the user may trigger a touch operation on the display screen by single-clicking a target position of the display screen, double-clicking the target position, long-pressing the target position, sliding at the target position, or the like, and the electronic device may then display the target information in response to the touch operation on its display screen.
The target position may be a position near the opening of the display screen, that is, a position whose distance from the opening is within a preset distance range. It may then be determined that a touch operation on the display screen is triggered in response to a click operation near the opening of the electronic device; in response to the number of clicks near the opening reaching a click count threshold (for example, two); in response to the duration of a press near the opening reaching a duration threshold; in response to a clockwise sliding operation near the opening; or in response to a counterclockwise sliding operation near the opening, and the like, which is not limited herein.
As yet another implementable manner, the second operation for the electronic device may include a voice display operation for the target information. Optionally, when the electronic device stores the target information and the user wants to display the target information on the current interface, the electronic device may instruct the electronic device to display the target information stored in the hole by inputting a second voice (for example, displaying the XX picture), and the electronic device may respond to the voice display operation for the target information to display the target information.
Referring to fig. 10, fig. 10 is a schematic diagram illustrating a fourth interface of an electronic device according to an embodiment of the application. As shown in fig. 10, when a display operation for the target information is detected, the target information a may be moved out of the opening and displayed on the current interface.
According to the interaction method provided by this embodiment of the present application, the current interface is displayed; in response to a first operation on target information of the current interface, the target information moves to the area on the display screen corresponding to the opening and then disappears, and the target information is saved; and the target information is displayed in response to a second operation on the electronic device. By combining the opening in the display screen with the saving or display of information, a new use can be given to the opening in the display screen and the user experience is improved.
Referring to fig. 11, fig. 11 is a flow chart of an interaction method according to an embodiment of the present application. The method is applied to an electronic device that includes a display screen having an opening. The flow shown in fig. 11 is described in detail below, and the interaction method may specifically include the following steps:
Step S210: and displaying the current interface.
The specific description of step S210 is referred to step S110, and will not be repeated here.
Step S220: and responding to the drag operation of the target information, and moving the target information to the area on the display screen corresponding to the opening.
In this embodiment, the first operation on the target information of the current interface may include a drag operation on the target information of the current interface. During display of the current interface, whether a drag operation on the target information is received may be monitored. If a drag operation on the target information is received, the target information may be moved toward the area on the display screen corresponding to the opening in response to the drag operation. If no drag operation on the target information is received, monitoring may continue for whether such a drag operation is received.
It will be appreciated that moving the target information toward the area on the display screen corresponding to the opening may include: moving the target information on the current interface along the movement track of the drag operation, that is, the movement track of the target information on the current interface is consistent with the movement track of the drag operation on the display screen; and moving the target information on the current interface at the movement speed of the drag operation, that is, the movement speed of the target information on the current interface is consistent with the movement speed of the drag operation on the display screen.
In some embodiments, the drag operation for the target information may include a single-finger drag operation on the target information, or a multi-finger drag operation on the target information, without limitation.
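One way the drag tracking described above could be implemented on Android is sketched below, assuming the (reduced-size) target information is rendered by its own view; the view follows ACTION_MOVE events so that its track and speed match the finger, and ACTION_UP marks the end of the drag handled in step S230. The function and callback names are illustrative assumptions.

```kotlin
import android.annotation.SuppressLint
import android.view.MotionEvent
import android.view.View

// Illustrative drag tracking: infoView is assumed to be the view showing the
// (reduced-size) target information; onDragEnded is a hypothetical callback that
// receives the final position when the finger is lifted (end of the drag).
@SuppressLint("ClickableViewAccessibility")
fun enableDragToHole(infoView: View, onDragEnded: (x: Float, y: Float) -> Unit) {
    var dx = 0f
    var dy = 0f
    infoView.setOnTouchListener { v, event ->
        when (event.actionMasked) {
            MotionEvent.ACTION_DOWN -> {          // remember the finger offset inside the view
                dx = v.x - event.rawX
                dy = v.y - event.rawY
                true
            }
            MotionEvent.ACTION_MOVE -> {          // view follows the finger's track and speed
                v.x = event.rawX + dx
                v.y = event.rawY + dy
                true
            }
            MotionEvent.ACTION_UP -> {            // finger leaves the screen: drag operation ends
                onDragEnded(v.x + v.width / 2f, v.y + v.height / 2f)
                true
            }
            else -> false
        }
    }
}
```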
Step S230: and responding to the end of the dragging operation, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
In this embodiment, while the target information is being moved toward the area on the display screen corresponding to the opening based on the drag operation, it may be determined whether the end of the drag operation is detected. If it is determined that the drag operation has ended, the target information may be moved to the area on the display screen corresponding to the opening and then disappear, and the target information may be saved.
In some embodiments, while the target information is being moved toward the area on the display screen corresponding to the opening based on the drag operation, it may be determined whether a release operation on the target information is detected. If a release operation on the target information is detected, it may be determined that the drag operation has ended; if no release operation on the target information is detected, it may be determined that the drag operation has not ended.
As one embodiment, while the target information is being moved toward the area on the display screen corresponding to the opening based on the drag operation, it may be detected whether the dragging finger acting on the target information has left the display screen. If the dragging finger acting on the target information leaves the display screen, it may be determined that a release operation on the target information is detected. As one way, whether the dragging finger has left the display screen may be detected by a touch sensor and/or a pressure sensor disposed below the display screen.
Referring to fig. 12, fig. 12 is a flowchart illustrating a step S230 of the interaction method shown in fig. 11 according to the present application. The following details the flow shown in fig. 12, and the method may specifically include the following steps:
Step S231: and determining a distance between a display position of the target information and a position of the opening in response to the end of the drag operation.
In some embodiments, in moving the target information to the region on the display screen corresponding to the opening based on the drag operation, it may be determined whether the drag operation is ended. If it is determined that the drag operation is ended, a distance between the display position of the target information and the position of the opening may be determined in response to the drag operation being ended.
As an embodiment, a target coordinate system may be established based on the electronic device. For example, the target coordinate system may be established with the center point of the electronic device as the coordinate origin and two mutually perpendicular sides of the electronic device as the horizontal and vertical axes, or with a corner point of the electronic device as the coordinate origin and two mutually perpendicular sides of the electronic device as the horizontal and vertical axes. On this basis, first coordinate information of the target information in the target coordinate system and second coordinate information of the opening in the target coordinate system can be acquired, and the distance between the display position of the target information and the position of the opening can be determined based on the first coordinate information and the second coordinate information.
In some embodiments, obtaining the location of the aperture may include: and acquiring the position on the display screen corresponding to the opening as the position of the opening. That is, coordinate information of the position on the display screen corresponding to the aperture in the target coordinate system may be acquired, and the position on the display screen corresponding to the aperture is determined as the position of the aperture based on the coordinate information.
Step S232: if the distance is within the first preset distance range, the target information is moved to the area on the display screen corresponding to the opening, then the target information disappears, and the target information is stored.
In some embodiments, the electronic device may preset and store a first preset distance range, which serves as the criterion for judging the distance between the position of the target information and the position of the opening. Therefore, in this embodiment, once the distance between the position of the target information and the position of the opening is obtained, the distance may be compared with the first preset distance range to determine whether it falls within that range.
If the distance is determined to be within the first preset distance range, the distance can be considered to trigger the save-into-opening effect, and the target information may be moved to the area on the display screen corresponding to the opening and then disappear, and the target information is saved.
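Steps S231 and S232 can be sketched as a simple Euclidean-distance comparison in screen coordinates, as shown below; the hole-centre coordinates, the radius used as the first preset distance range and the helper names are assumptions for illustration.

```kotlin
import kotlin.math.hypot

// Illustrative check for steps S231/S232: when the drag ends, compare the distance
// between the target information's position and the opening's on-screen position
// with a first preset distance range (modelled here as a simple radius threshold).
data class Point(val x: Float, val y: Float)

fun shouldSaveIntoHole(
    infoPosition: Point,          // display position of the target information at drag end
    holePosition: Point,          // position on the display screen corresponding to the opening
    firstPresetDistance: Float    // assumed radius of the "first preset distance range"
): Boolean {
    val distance = hypot(infoPosition.x - holePosition.x, infoPosition.y - holePosition.y)
    return distance <= firstPresetDistance
}

// Usage sketch, called from the drag-end callback of the earlier example:
// if (shouldSaveIntoHole(dropPoint, holeCenter, 48f)) { /* animate into the opening, then save */ }
```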
Step S240: the target information is displayed in response to a second operation for the electronic device.
The specific description of step S240 refers to step S130, and is not repeated here.
Compared with the interaction method shown in fig. 6, the interaction method provided by this embodiment of the present application further saves the target information through a drag-and-release operation, which improves the convenience and diversity of the interaction and the user experience.
Referring to fig. 13, fig. 13 is a flow chart of an interaction method according to an embodiment of the present application. The method is applied to an electronic device that includes a display screen having an opening. The flow shown in fig. 13 is described in detail below, and the interaction method may specifically include the following steps:
Step S310: and displaying the current interface.
The specific description of step S310 is referred to step S110, and will not be repeated here.
Step S320: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
The specific description of step S320 refers to step S120, and is not repeated here.
Step S330: the target information is displayed in response to a tapping operation on a rear case or side of the electronic device.
In this embodiment, the second operation on the electronic device includes a tapping operation on the rear case or a side edge of the electronic device. When the electronic device has saved the target information, it may be determined whether a tapping operation on the rear case or side edge of the electronic device is detected. If such a tapping operation is detected, the target information may be displayed in response to it. If no such tapping operation is detected, it may continue to be determined whether a tapping operation on the rear case or side edge of the electronic device is detected.
In some embodiments, the rear case and side edges of the electronic device may be provided with one or a combination of a touch sensor, a pressure sensor, a capacitance sensor, and a vibration sensor, which is not limited herein. On this basis, the electronic device can detect a tapping operation acting on its rear case or side edge through the sensors disposed there.
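Where the rear case carries no dedicated touch panel, one commonly used approximation is to detect the acceleration spike that a tap produces using the accelerometer; the following sketch is a hedged illustration of that idea rather than the detection scheme claimed here, and the threshold and debounce values are assumptions.

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

// Illustrative back-tap detector: reports a tap when the acceleration along the
// z axis (perpendicular to the screen) jumps by more than an assumed threshold.
class BackTapDetector(
    private val sensorManager: SensorManager,
    private val onTap: () -> Unit
) : SensorEventListener {

    private var lastZ = 0f
    private var lastTapTime = 0L

    fun start() {
        val accel = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) ?: return
        sensorManager.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME)
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val z = event.values[2]
        val now = System.currentTimeMillis()
        // A sharp change in z acceleration, debounced to at most one tap per 300 ms,
        // is treated as a tap on the rear case (threshold values are assumptions).
        if (abs(z - lastZ) > 3.0f && now - lastTapTime > 300) {
            lastTapTime = now
            onTap()
        }
        lastZ = z
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```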
Referring to fig. 14 and fig. 15, fig. 14 shows a fifth interface schematic diagram of an electronic device according to an embodiment of the present application, and fig. 15 shows a sixth interface schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 14, when no tapping operation on the rear case of the electronic device is detected, the current interface of the electronic device does not display the target information; if the rear case of the electronic device is tapped as shown in fig. 14, the target information A may be displayed on the current interface of the electronic device, as shown in fig. 15.
Referring to fig. 16, fig. 16 is a flowchart illustrating a step S330 of the interaction method shown in fig. 13 according to the present application. The following will describe the flow shown in fig. 16 in detail, and the method specifically may include the following steps:
Step S331: and responding to the knocking operation aiming at the rear shell or the side edge of the electronic equipment, and acquiring knocking parameters corresponding to the knocking operation, wherein the knocking parameters comprise one or a combination of a plurality of knocking positions, knocking times and knocking frequencies.
In some implementations, where the electronic device holds the target information, it may be determined whether a tapping operation is detected for a rear shell or side of the electronic device. If it is determined that the knocking operation for the rear shell or the side edge of the electronic device is detected, knocking parameters corresponding to the knocking operation can be obtained in response to the knocking operation for the rear shell or the side edge of the electronic device. Alternatively, the tapping parameters may include one or a combination of several of tapping position, number of taps, and tapping frequency.
As an implementation manner, the rear shell and the side of the electronic device are provided with sensors which can be used for detecting the knocking parameters, and when the knocking operation for the rear shell or the side of the electronic device is detected, the knocking parameters corresponding to the knocking operation can be detected and acquired through the sensors. Alternatively, the sensor may comprise one or a combination of several of a pressure sensor, a touch sensor, a capacitive sensor, a vibration sensor.
Step S332: and if the knocking parameters meet preset knocking parameters, displaying the target information.
In some embodiments, the electronic device may preset and store a preset tapping parameter, where the preset tapping parameter is used as a basis for determining a tapping parameter corresponding to the tapping operation. Therefore, in this embodiment, when the tapping parameter corresponding to the tapping operation is obtained, the tapping parameter may be compared with the preset tapping parameter to determine whether the tapping parameter meets the preset tapping parameter.
If the tapping parameter is determined to meet the preset tapping parameter, the tapping operation can be considered to trigger the display effect of the target information, and the target information can be displayed.
As an implementation, when the tapping parameter is the tapping position, the tapping position may be any position on the rear case or a side edge of the electronic device, or may be a designated position on the rear case or a side edge. Optionally, the designated position on the rear case may be aligned with the position of the opening.
As an implementation, when the tapping parameter is the number of taps, the number of taps may be 1, 2, 3, or the like, which is not limited herein.
As an implementation, when the tapping parameter is the tapping frequency, the tapping frequency may be, for example, 2 or 3 consecutive taps within a preset time period, which is not limited herein.
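Step S332 can be sketched as a comparison between the observed tapping parameters and the preset ones; the data class below and its matching rule (position, tap count, and a time window standing in for the tapping frequency) are illustrative assumptions.

```kotlin
// Illustrative model of the tapping parameters described above and of the check
// in step S332; the field names and matching rule are assumptions.
data class TapParams(
    val position: String?,      // e.g. "rear" or "side", or null if the position is not constrained
    val count: Int,             // number of taps
    val withinMillis: Long      // time window the taps occurred in (encodes the tapping frequency)
)

fun matchesPreset(observed: TapParams, preset: TapParams): Boolean {
    val positionOk = preset.position == null || preset.position == observed.position
    val countOk = observed.count == preset.count
    val frequencyOk = observed.withinMillis <= preset.withinMillis
    return positionOk && countOk && frequencyOk
}

// Usage sketch: display the saved target information only when the observed taps
// satisfy the preset tapping parameters (e.g. two taps on the rear case within 500 ms).
// if (matchesPreset(observed, TapParams("rear", 2, 500))) { /* display target information */ }
```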
Compared with the interaction method shown in fig. 6, the interaction method provided by this embodiment of the present application further displays the target information through a tapping operation on the rear case or a side edge, which improves the convenience of displaying information and the user experience.
Referring to fig. 17, fig. 17 is a flow chart of an interaction method according to an embodiment of the present application. The method is applied to an electronic device that includes a display screen having an opening. The flow shown in fig. 17 is described in detail below, and the interaction method may specifically include the following steps:
step S410: and displaying the current interface.
The specific description of step S410 is referred to step S110, and will not be repeated here.
Step S420: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
The specific description of step S420 refers to step S120, and is not repeated here.
Step S430: and responding to touch operation on a target position of the display screen, and displaying the target information, wherein the distance between the target position and the position of the opening is within a second preset distance range.
In this embodiment, the second operation for the electronic device may include a touch operation for a target position of a display screen of the electronic device. And under the condition that the electronic equipment stores the target information, determining whether touch operation aiming at the target position of the display screen is detected, wherein the distance between the target position and the position of the opening is within a second preset distance range. If it is determined that the touch operation for the target position of the display screen is detected, the target information may be displayed in response to the touch operation for the target position of the display screen. If it is determined that the touch operation for the target position of the display screen is not detected, whether the touch operation for the target position of the display screen is detected may be continuously determined.
In some embodiments, the second preset distance range may be a circular area with the position of the opening as a center point and the preset distance as a radius, or a polygonal area formed at a preset distance around the position of the opening as the center point, which is not limited herein.
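As a non-limiting illustrative sketch, the check that the target position falls within the second preset distance range around the opening may be expressed as follows; TouchPoint and the two helper functions are assumed names, and the square area stands in for one possible polygonal area:

// Minimal sketch: testing whether a touch position lies within the second preset
// distance range around the opening. Names and shapes are illustrative assumptions.
import kotlin.math.abs
import kotlin.math.hypot

data class TouchPoint(val x: Float, val y: Float)

// Circular area centered on the opening with the preset distance as radius.
fun withinCircularRange(touch: TouchPoint, opening: TouchPoint, presetDistancePx: Float): Boolean =
    hypot(touch.x - opening.x, touch.y - opening.y) <= presetDistancePx

// Axis-aligned square as one example of a polygonal area centered on the opening.
fun withinSquareRange(touch: TouchPoint, opening: TouchPoint, halfSidePx: Float): Boolean =
    abs(touch.x - opening.x) <= halfSidePx && abs(touch.y - opening.y) <= halfSidePx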
As an implementation manner, the touch operation for the target position of the display screen may include: the number of clicks at the target location reaches a preset number. Alternatively, the preset number of times may include 1 time, 2 times, 3 times, etc., which is not limited herein.
As yet another embodiment, the touch operation for the target position of the display screen may include: the pressing time length at the target position reaches a preset time length.
As yet another implementation manner, the touch operation for the target position of the display screen may include: the sliding track at the target position meets the preset sliding track. Alternatively, the preset sliding track may include a clockwise circular track, a counterclockwise circular track, etc., which is not limited herein.
Compared with the interaction method shown in fig. 6, the interaction method provided by the embodiment of the application also displays the target information through the touch operation on the target position of the display screen, so that the convenience of information display can be improved, and the use experience of a user can be improved.
Referring to fig. 18, fig. 18 is a flow chart illustrating an interaction method according to an embodiment of the application. The method is applied to the electronic device, and the electronic device includes a display screen, where the display screen has an opening. The flow shown in fig. 18 is described in detail below, and the interaction method may specifically include the following steps:
step S510: and displaying the current interface.
The specific description of step S510 refers to step S110, and is not repeated here.
Step S520: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
The specific description of step S520 refers to step S120, and is not repeated here.
Step S530: and responding to a second operation aiming at the electronic equipment, analyzing the target information and obtaining the content contained in the target information.
In this embodiment, when the second operation for the electronic device is received, the target information may be parsed in response to the second operation for the electronic device, to obtain the content included in the target information.
In some embodiments, if the target information includes text information, the text information included in the target information may be parsed, for example, text extraction is performed, to obtain text content included in the target information; if the target information includes image information, the image information included in the target information may be analyzed, for example, recognition, matting, text extraction, etc., to obtain image content or the like included in the target information, which is not limited herein.
In some embodiments, the electronic device may preset and store a content analysis model, and may analyze the target information through the content analysis model to obtain content included in the target information output by the content analysis model.
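As a non-limiting illustrative sketch, the parsing of the target information by content type may be organized as follows; TargetInfo, ParsedContent and recognizeImage are assumed names, and a real implementation might instead delegate to the content analysis model mentioned above:

// Minimal sketch: dispatching parsing by the type of the target information.
// The types and the recognizeImage helper are illustrative assumptions.
sealed class TargetInfo {
    data class Text(val text: String) : TargetInfo()
    data class Image(val bytes: ByteArray) : TargetInfo()
}

data class ParsedContent(val kind: String, val value: String)

// Placeholder for recognition / matting / text extraction on image information.
fun recognizeImage(bytes: ByteArray): String = "recognized-content"

fun parse(info: TargetInfo): ParsedContent = when (info) {
    is TargetInfo.Text -> ParsedContent(kind = "text", value = info.text.trim())
    is TargetInfo.Image -> ParsedContent(kind = "image", value = recognizeImage(info.bytes))
}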
Step S540: and determining a display icon corresponding to the target information based on the content contained in the target information.
In the present embodiment, in the case of obtaining the content included in the target information, the display icon corresponding to the target information may be determined based on the content included in the target information.
In some embodiments, the electronic device may preset and store a plurality of contents, a plurality of display icons, and correspondence between the plurality of contents and the plurality of display icons, where the correspondence between the plurality of contents and the plurality of display icons may include: one content corresponds to one display icon, one content corresponds to a plurality of display icons, and a plurality of contents corresponds to one display icon. Therefore, in the present embodiment, in the case of obtaining the content included in the target information, the display icon corresponding to the content included in the target information may be determined based on the correspondence relationship between the plurality of contents and the plurality of display icons.
As an embodiment, if the target information is found to include a name, an address, a telephone number, or the like, the target information may be displayed as a corresponding display icon; if the target information is found to include an item of goods, the target information may be displayed as a shopping icon; and if the target information is a video, a file, or the like, the target information may be displayed directly as an icon, which is not limited herein.
For example, if the content included in the target information is "name", the "phone" icon may be associated, if the content included in the target information is "address", the "map" icon may be associated, and if the content included in the target information is "phone", the "phone" icon may be associated, etc., without limitation.
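As a non-limiting illustrative sketch, the correspondence between content and display icons described above may be stored as a simple lookup table; DisplayIcon and the category strings are assumed names mirroring the examples given above:

// Minimal sketch: resolving a display icon from the parsed content category.
// Category names and icon identifiers are illustrative assumptions.
enum class DisplayIcon { PHONE, MAP, SHOPPING, FILE, GENERIC }

val iconByCategory: Map<String, DisplayIcon> = mapOf(
    "name" to DisplayIcon.PHONE,
    "phone" to DisplayIcon.PHONE,
    "address" to DisplayIcon.MAP,
    "goods" to DisplayIcon.SHOPPING,
    "video" to DisplayIcon.FILE,
    "file" to DisplayIcon.FILE
)

fun iconFor(category: String): DisplayIcon =
    iconByCategory[category.lowercase()] ?: DisplayIcon.GENERIC

The table illustrates the stored correspondence in which one content may map to one icon and several contents may share one icon.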
Step S550: and displaying a display icon corresponding to the target information.
In this embodiment, when the display icon corresponding to the target information is obtained, the display icon corresponding to the target information may be displayed on the current interface. Based on the above, unified management of the display content of the interface can be realized by displaying the display icons corresponding to the target information, and the attractiveness of the interface display is improved.
As an embodiment, if it is determined that the content included in the target information includes foreign text, the foreign text may be translated to obtain native text. Alternatively, when the target information is displayed, the foreign text and the native text may be displayed in association with each other, or only the native text may be displayed, or only the foreign text may be displayed with voice broadcasting in the native language added, which is not limited herein. For example, assuming that the target information is a picture and the picture includes foreign text, the foreign text in the picture may be translated into native text.
As still another embodiment, if the target information includes a URL, when a touch operation on the display icon corresponding to the target information is detected, a jump may be made to the reading position corresponding to the URL. Alternatively, the target information may include a URL recording a reading position, in which case the target information may be displayed on the current interface in the form of a hyperlink icon, and when a touch operation (such as a click operation) acting on the hyperlink icon is detected, the display returns to the reading position in the original reading application.
Referring to fig. 19 and fig. 20, fig. 19 shows a seventh interface schematic diagram of an electronic device according to an embodiment of the present application, and fig. 20 shows an eighth interface schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 19, the target information is a URL, and may be displayed on the current interface in the form of a hyperlink icon A1, and when a touch operation on the hyperlink icon A1 is detected, the target information may be returned to a reading position in the original reading application, as shown in fig. 20.
As still another embodiment, if the target information includes a contact header, when a touch operation on a display icon corresponding to the target information is detected, the chat page corresponding to the contact header may be skipped. Optionally, the target information may include a contact icon, and then the target information may be displayed on the current interface in the form of a contact icon (such as a name icon, a face icon, etc.), and when a touch operation (such as a clicking operation) acting on the contact icon is detected, the target information may be returned to the original chat page.
Referring to fig. 21 and 22, fig. 21 shows a ninth interface schematic diagram of an electronic device according to an embodiment of the present application, and fig. 22 shows a tenth interface schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 21, the target information is a contact head portrait, and may be displayed on the current interface in the form of a contact icon A2, and when a touch operation acting on the contact icon A2 is detected, the original chat page may be returned, as shown in fig. 22.
Step S560: and responding to a drag operation of dragging the display icon corresponding to the target information to the target application program, and displaying the target information in an application page of the target application program.
In this embodiment, in the process of displaying the display icon corresponding to the target information on the current interface, whether a drag operation for the display icon corresponding to the target information is received may be monitored. If the drag operation of dragging the display icon corresponding to the target information to the target application program is monitored, the target information can be displayed in an application page of the target application program in response to the drag operation of dragging the display icon corresponding to the target information to the target application program. That is, the target information is opened in the application page of the target application program.
In some embodiments, the target application is a chat application (e.g., weChat), and if the target information is text, the text can be sent to the other party as a chat message on an application page of the chat application; if the target information is a picture, the picture can be used as a chat message to be sent to the opposite party on an application page of the chat application program; if the target information is a card (e.g., a network bus card, an express card), the card may be sent as a chat message to the other party on an application page of the chat application, which is not limited herein.
Step S570: and determining the application program corresponding to the target information based on the content contained in the target information.
In the present embodiment, in the case of obtaining the content included in the target information, the application corresponding to the target information may be determined based on the content included in the target information.
In some embodiments, in a case of obtaining the content included in the target information, a content type of the content included in the target information may be obtained, and an application program corresponding to the target information may be determined based on the content type. As an implementation manner, the electronic device may preset and store a plurality of content types, a plurality of application programs, and correspondence between a plurality of content types and a plurality of application programs, where the correspondence between a plurality of content types and a plurality of application types may include: one content type corresponds to one application, a plurality of content types corresponds to one application, and one content type corresponds to a plurality of applications. Therefore, in the present embodiment, in the case of obtaining the content type of the content included in the target information, the application corresponding to the content type of the content included in the target information may be determined based on the correspondence between the plurality of content types and the plurality of applications.
For example, if the target information is a name of a person or a phone, it may be determined that an application corresponding to the target information is a contact of a mobile phone; if the target information is an address, it may be determined that the application corresponding to the target information is a map or the like, which is not limited herein.
Step S580: and associating the display icon corresponding to the target information with the application program corresponding to the target information.
In this embodiment, when the display icon corresponding to the target information and the application program corresponding to the target information are obtained, the display icon corresponding to the target information and the application program corresponding to the target information may be associated. For example, the display icon corresponding to the target information and the application program corresponding to the target information are associated by the same identifier.
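As a non-limiting illustrative sketch covering step S570 and step S580 together, the application program corresponding to the target information may be determined from the content type and associated with the display icon through a shared identifier; ContentBinding and the application names are assumed names used only for illustration:

// Minimal sketch: determining the application for a content type and associating it
// with the display icon through a shared identifier. Names are illustrative assumptions.
data class ContentBinding(val id: String, val icon: String, val application: String)

fun bind(contentType: String): ContentBinding? = when (contentType.lowercase()) {
    "name", "phone" -> ContentBinding(id = "contact:$contentType", icon = "phone", application = "contacts")
    "address" -> ContentBinding(id = "geo:$contentType", icon = "map", application = "map")
    else -> null  // no associated application for this content type
}

The shared id field plays the role of the "same identifier" mentioned above, so that a touch on the icon can locate the associated application.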
Step S590: and responding to the touch operation of the display icon corresponding to the target information, and jumping to an application page of the application program corresponding to the target information.
In this embodiment, while the display icon corresponding to the target information is displayed on the current interface, a touch operation acting on the display icon may be detected. When such a touch operation is detected, the electronic device may respond to it and jump to the application page of the application program corresponding to the target information, thereby implementing a quick jump to the application page.
In some embodiments, the touch operation on the display icon corresponding to the target information may include a click operation, a sliding operation, and the like, which is not limited herein.
For example, if the application program corresponding to the target information is the mobile phone contacts, the contact page of the mobile phone contacts can be entered directly when a touch operation on the display icon corresponding to the target information is detected; if the application program corresponding to the target information is a map, the navigation page of the map can be entered directly when a touch operation on the display icon corresponding to the target information is detected.
Referring to fig. 23 and 24, fig. 23 shows an eleventh interface schematic diagram of an electronic device according to an embodiment of the present application, and fig. 24 shows a twelfth interface schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 23, the target information is an address, and may be displayed on the current interface in the form of a map icon A3, and when a touch operation on the map icon A3 is detected, a navigation interface of a map may be entered, as shown in fig. 24.
Compared with the interaction method shown in fig. 6, this embodiment further displays the target information in the form of a corresponding display icon. During display, the display icon may be dragged onto an application program to open and display the target information; in addition, the display icon corresponding to the target information may be associated with the application program corresponding to the target information, so that the application page of the application program can be reached directly through a touch operation acting on the display icon. This improves the aesthetics of information display and the convenience and diversity of interaction, and improves the interaction experience of the user.
Referring to fig. 25, fig. 25 is a flow chart illustrating an interaction method according to an embodiment of the application. The method is applied to the electronic device, and the electronic device includes a display screen, where the display screen has an opening. The flow shown in fig. 25 is described in detail below; in this embodiment, there are a plurality of pieces of target information, and the interaction method may specifically include the following steps:
step S610: and displaying the current interface.
The specific description of step S610 refers to step S110, and is not repeated here.
Step S620: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
The specific description of step S620 is referred to step S120, and will not be repeated here.
Step S630: and responding to a second operation for the electronic equipment, and displaying a plurality of target information in the area on the display screen corresponding to the current interface around the opening.
Optionally, the number of pieces of target information stored via the opening is plural. The types of the plurality of target information may be the same, for example, the plurality of target information may all be text, or may all be pictures, and the like; or the types of the plurality of target information may be different, for example, the plurality of target information may include text, pictures, videos, etc., which is not limited herein.
In this embodiment, when the second operation for the electronic device is received, multiple pieces of target information may be displayed on the current interface around the area on the display screen corresponding to the opening in response to the second operation for the electronic device.
In some embodiments, displaying the plurality of target information on the current interface around the area on the display screen corresponding to the opening may include: and displaying the display icons corresponding to the target information respectively around the area on the display screen corresponding to the opening in the current interface.
In some embodiments, displaying the plurality of target information on the current interface around the area on the display screen corresponding to the opening may include: displaying a plurality of target information in the current interface in an arc-shaped mode around the area on the display screen corresponding to the opening; displaying a plurality of target information in a current interface in a ring-shaped mode around the area on the display screen corresponding to the opening; the plurality of target information is displayed in a rectangular manner around the area on the display screen corresponding to the opening in the current interface, and the like, and is not limited herein.
Referring to fig. 26, fig. 26 is a schematic diagram showing a thirteenth interface of an electronic device according to an embodiment of the present application. As shown in fig. 26, when the number of target information is plural, a plurality of target information A are displayed around the area on the display screen corresponding to the opening in the current interface.
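As a non-limiting illustrative sketch, the ring-shaped arrangement around the area on the display screen corresponding to the opening may be computed by distributing the pieces of target information at equal angles on a circle centered on that area; IconPosition, the radius and the starting angle are assumptions introduced for the example:

// Minimal sketch: computing display positions for several pieces of target information
// arranged in a ring around the opening. Names and values are illustrative assumptions.
import kotlin.math.cos
import kotlin.math.sin

data class IconPosition(val x: Float, val y: Float)

fun ringPositions(centerX: Float, centerY: Float, radiusPx: Float, itemCount: Int): List<IconPosition> {
    if (itemCount <= 0) return emptyList()
    val step = 2.0 * Math.PI / itemCount  // equal angular spacing
    return (0 until itemCount).map { i ->
        val angle = i * step
        IconPosition(
            x = centerX + (radiusPx * cos(angle)).toFloat(),
            y = centerY + (radiusPx * sin(angle)).toFloat()
        )
    }
}

The same routine can be restricted to a limited angular range to obtain the arc-shaped arrangement mentioned above.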
Step S640: and in response to the sliding operation on the plurality of target information, moving one or more of the plurality of target information on the current interface around the area on the display screen corresponding to the opening to be hidden or displayed.
In the process of displaying the plurality of target information in the area on the display screen corresponding to the surrounding opening, when the number of the target information is large, the plurality of target information cannot be displayed around the opening at the same time on the current interface, so that part of information in the plurality of target information can be displayed around the opening on the current interface, part of information in the plurality of target information can be hidden, and the display can be switched when needed. Based on this, in the present embodiment, in displaying a plurality of pieces of target information around an area on the display screen corresponding to the aperture, it can be determined whether a sliding operation for the plurality of pieces of target information is detected. If it is determined that the sliding operation for the plurality of target information is detected, one or more of the plurality of target information can be controlled to move around the area on the display screen corresponding to the opening on the current interface to be hidden or displayed in response to the sliding operation for the plurality of target information. If it is determined that the sliding operation is not detected for the plurality of pieces of target information, the display of the plurality of pieces of target information may be maintained, and whether the sliding operation is detected for the plurality of pieces of target information may be continuously determined.
It can be understood that, one or more of the plurality of target information is controlled to move around the area on the display screen corresponding to the opening on the current interface to be hidden or displayed, so that the target information originally displayed on the current interface is switched to be hidden, and the target information originally hidden is switched to be displayed on the current interface, so as to meet the viewing and displaying requirements of the plurality of target information.
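As a non-limiting illustrative sketch, the switching of which pieces of target information are visible around the opening after a sliding operation may be modelled as shifting a window over the stored items; the window size and the direction convention are assumptions introduced for the example:

// Minimal sketch: a sliding operation rotates which stored items are visible around the
// opening; the remaining items stay hidden until rotated in. Names are assumptions.
fun <T> visibleAfterSlide(all: List<T>, visibleCount: Int, startIndex: Int, slideSteps: Int): List<T> {
    if (all.isEmpty() || visibleCount <= 0) return emptyList()
    val count = minOf(visibleCount, all.size)
    // Normalize the new start index so negative slide steps (opposite direction) also work.
    val newStart = ((startIndex + slideSteps) % all.size + all.size) % all.size
    return (0 until count).map { i -> all[(newStart + i) % all.size] }
}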
Step S650: and in response to a second operation for the electronic device, displaying the plurality of target information in a list form on the current interface.
In this embodiment, when the second operation for the electronic device is received, a plurality of target information may be displayed in a list form on the current interface in response to the second operation for the electronic device.
In some embodiments, displaying the plurality of target information in the form of a list at the current interface may include: the method comprises the steps that a plurality of target information is centrally displayed in a popup window in a list form on a current interface; the method comprises the steps that a plurality of target information is displayed on the left side of a popup window in a list form on a current interface; the multiple target information is displayed on the right side of the popup window in the form of a list on the current interface, and the like, and the method is not limited herein.
In some embodiments, in the process that the plurality of target information is displayed in the form of a list on the current interface, if a sliding operation for the plurality of target information is detected, one or more of the plurality of target information may be controlled to move in the list to be hidden or displayed in response to the sliding operation for the plurality of target information.
Referring to fig. 27, fig. 27 is a schematic diagram showing a fourteenth interface of an electronic device according to an embodiment of the present application. As shown in fig. 27, when the number of target information is plural, a plurality of target information A are displayed in a list form in the current interface.
Compared with the interaction method shown in fig. 6, the interaction method provided by this embodiment of the application also displays the target information around the area on the display screen corresponding to the opening or in the form of a list, so that the diversity of information display can be improved, and the use experience of the user can be improved.
Referring to fig. 28, fig. 28 is a flow chart illustrating an interaction method according to an embodiment of the application. The method is applied to the electronic device, and the electronic device includes a display screen, where the display screen has an opening. The flow shown in fig. 28 is described in detail below, and the interaction method may specifically include the following steps:
step S710: and displaying the current interface.
Step S720: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
Step S730: the target information is displayed in response to a second operation for the electronic device.
The specific description of step S710 to step S730 refers to step S110 to step S130, and is not repeated here.
Step S840: and displaying the target control on the current interface.
In this embodiment, the target information and the target control may be displayed on the current interface. The target control can record, over the long term, target information that has been saved through the first operation and remains unprocessed within a preset duration; that is, the target control can serve as a resident control for unified management of the saved target information. Optionally, the target control may include a target icon.
Step S850: and responding to the touch operation acted on the target control, and displaying the target information which is stored and unprocessed within the preset time length.
In this embodiment, in the process of displaying the target control on the current interface, it may be determined whether a touch operation acting on the target control is detected. If it is determined that the touch operation acting on the target control is detected, the stored and unprocessed target information within the preset duration can be displayed in response to the touch operation acting on the target control, so that all the stored target information can be quickly checked.
In some implementations, the unprocessed target information may include: not displayed again after storage, or not deleted after storage.
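As a non-limiting illustrative sketch, the target information that is saved and still unprocessed within the preset duration may be selected as follows; SavedInfo and the one-week duration are assumptions introduced for the example:

// Minimal sketch: filtering saved target information down to the entries that are still
// unprocessed within the preset duration. Names and the duration are assumptions.
data class SavedInfo(
    val id: String,
    val savedAtMs: Long,
    val redisplayed: Boolean,   // not displayed again after storage
    val deleted: Boolean        // not deleted after storage
)

fun unprocessedWithin(
    all: List<SavedInfo>,
    nowMs: Long,
    presetDurationMs: Long = 7L * 24 * 60 * 60 * 1000  // assumed one-week window
): List<SavedInfo> =
    all.filter { info ->
        nowMs - info.savedAtMs <= presetDurationMs && !info.redisplayed && !info.deleted
    }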
Compared with the interaction method shown in fig. 6, the interaction method provided by the embodiment of the application can also provide the capability of quickly recording and managing the fragmentation information through the target control, thereby facilitating the follow-up check of the user and improving the use experience of the user.
Referring to fig. 29, fig. 29 is a schematic flow chart of an interaction method according to an embodiment of the application. The method is applied to the electronic device, and the electronic device includes a display screen, where the display screen has an opening. The flow shown in fig. 29 is described in detail below; in this embodiment, there are a plurality of pieces of target information, and the interaction method may specifically include the following steps:
step S810: and displaying the current interface.
The specific description of step S810 refers to step S110, and is not repeated here.
Step S820: and responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
The specific description of step S820 refers to step S120, and is not repeated here.
Step S830: and in response to a second operation for the electronic equipment, merging the target information belonging to the same type in a plurality of target information to obtain one or more pieces of merged information.
Optionally, the electronic device stores a plurality of target information. The types of the plurality of target information may be the same, for example, the plurality of target information may be text, may be pictures, and the like; or the types of the plurality of target information may be different, for example, the plurality of target information may include text, pictures, videos, etc., which is not limited herein.
In this embodiment, when the second operation for the electronic device is received, the target information belonging to the same type in the plurality of target information may be merged in response to the second operation for the electronic device, so as to obtain one or more pieces of merged information. For example, the pictures in the plurality of target information may be merged to obtain one piece of merged information, the text may be merged to obtain another piece of merged information, the videos may be merged to obtain yet another piece of merged information, and so on, thereby obtaining a plurality of pieces of merged information.
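As a non-limiting illustrative sketch, the merging of target information belonging to the same type may be expressed as a grouping operation; TargetItem and MergedInfo are assumed names introduced for the example:

// Minimal sketch: grouping stored target information by type so that entries of the
// same type form one piece of merged information. Names are illustrative assumptions.
data class TargetItem(val type: String, val payload: String)   // type: "text", "picture", "video", ...
data class MergedInfo(val type: String, val items: List<TargetItem>)

fun mergeByType(items: List<TargetItem>): List<MergedInfo> =
    items.groupBy { it.type }
         .map { (type, grouped) -> MergedInfo(type, grouped) }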
Step S840: and displaying the one or more pieces of merging information.
In this embodiment, in the case where one or more pieces of merged information are obtained, the one or more pieces of merged information may be displayed. When the number of pieces of merged information is plural, the plurality of pieces of merged information may be displayed on the current interface around the area on the display screen corresponding to the opening, or may be displayed on the current interface in the form of a list, which is not limited herein.
Compared with the interaction method shown in fig. 6, the interaction method provided by this embodiment of the application also merges the target information belonging to the same type for display, thereby improving the conciseness of information display.
Referring to fig. 30, fig. 30 is a flow chart illustrating an interaction method according to an embodiment of the application. The method is applied to the electronic device, and the electronic device includes a display screen, where the display screen has an opening. The flow shown in fig. 30 is described in detail below, and the interaction method may specifically include the following steps:
step S910: and displaying the current interface.
The specific description of step S910 refers to step S110, and is not repeated here.
Step S920: and if the target information is the target text information obtained by converting the target voice information into the text, acquiring the target voice information.
In this embodiment, when the electronic device stores the target information, it may detect whether the target information is target text information obtained by converting target voice information into text. If the target information is detected to be target text information obtained by converting target voice information into text, the target voice information is acquired.
In some embodiments, the electronic device may be provided with a voice assistant function, with which the user may record a to-do item, a shopping list, a note, or any other information the user wants to record. When the user inputs the target voice information through the voice assistant, the voice assistant can automatically perform a voice-to-text operation to obtain the target text information.
Step S930: and responding to a first operation aiming at the target information of the current interface, and carrying out association storage on the target text information and the target voice information.
In this embodiment, for the target text information, when the user wants to save the target text information, the saving operation may be performed with the target text information as the target information, and the electronic device may store the target text information in association with the target voice information in response. Based on this, the electronic device can subsequently play the target voice information when displaying the target text information.
Referring to fig. 31, fig. 31 shows a fifteenth interface schematic diagram of an electronic device according to an embodiment of the present application, and as shown in fig. 31, when target information is target text information obtained by converting target voice information into text, the target text information and a voice playing control may be displayed, where the voice playing control may be used to trigger playing of the target voice information.
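As a non-limiting illustrative sketch, the associated storage of the target text information and the target voice information may be modelled as follows, so that the voice can be replayed through the voice playing control when the text is displayed; VoiceNote, VoiceNoteStore and the in-memory map are assumptions introduced for the example:

// Minimal sketch: saving the target text information together with the source voice
// recording so that the voice can be replayed when the text is displayed later.
data class VoiceNote(val text: String, val audioPath: String)

class VoiceNoteStore {
    private val notes = mutableMapOf<String, VoiceNote>()

    fun saveAssociated(id: String, text: String, audioPath: String) {
        notes[id] = VoiceNote(text = text, audioPath = audioPath)
    }

    // Returns both the text and the audio path, so the caller can show the text and
    // offer a voice playing control for the original recording.
    fun load(id: String): VoiceNote? = notes[id]
}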
Step S940: the target information is displayed in response to a second operation for the electronic device.
The specific description of step S940 is referred to step S130, and is not repeated here.
Compared with the interaction method shown in fig. 6, the interaction method provided by the embodiment of the application further saves the target text information and the target voice information in an associated manner when the target text information is obtained by converting the target voice information into the text, so that the information saving rationality can be improved.
Referring to fig. 32, fig. 32 is a block diagram illustrating an interaction device according to an embodiment of the application. The interaction device 200 is applied to the electronic equipment, and the electronic equipment comprises a display screen, wherein an opening is arranged in a displayable area of the display screen. The following will describe the block diagram shown in fig. 32, and the interaction device 200 includes: a current interface display module 210, a first operation response module 220, and a second operation response module 230, wherein:
The current interface display module 210 is configured to display a current interface.
The first operation response module 220 is configured to respond to a first operation of the target information of the current interface, make the target information disappear after moving to an area on the display screen corresponding to the opening, and save the target information.
Further, the first operation response module 220 includes: a movement control sub-module and a first target information storage sub-module, wherein:
And the movement control sub-module is used for responding to the drag operation of the target information and moving the target information to the area on the display screen corresponding to the opening.
And the first target information storage sub-module is used for responding to the end of the dragging operation, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
Further, the first target information saving submodule includes: a distance determining unit and a target information holding unit, wherein:
And a distance determining unit configured to determine a distance between a display position of the target information and a position of the opening in response to the end of the drag operation.
And the target information storage unit is used for enabling the target information to disappear after moving to the area on the display screen corresponding to the opening if the distance is within a first preset distance range, and storing the target information.
Further, the first operation response module 220 includes: a target voice information acquisition sub-module and a second target information storage sub-module, wherein:
And the target voice information acquisition sub-module is used for acquiring the target voice information if the target information is target text information obtained by converting the target voice information into text.
And the second target information storage sub-module is used for carrying out association storage on the target text information and the target voice information.
A second operation response module 230, configured to display the target information in response to a second operation for the electronic device.
Further, the second operation response module 230 includes: a first target information display sub-module, wherein:
And the first target information display sub-module is used for responding to the tapping operation on the rear shell or the side edge of the electronic equipment and displaying the target information.
Further, the first target information display sub-module includes: a tapping parameter acquisition unit and a first target information display unit, wherein:
And the tapping parameter acquisition unit is used for responding to the tapping operation aiming at the rear shell or the side edge of the electronic equipment and acquiring a tapping parameter corresponding to the tapping operation, wherein the tapping parameter comprises one or a combination of more of a tapping position, a number of taps and a tapping frequency.
And the first target information display unit is used for displaying the target information if the tapping parameter meets a preset tapping parameter.
Further, the second operation response module 230 includes: a second target information display sub-module, wherein:
and the second target information display sub-module is used for responding to the touch operation of the target position of the display screen and displaying the target information, wherein the distance between the target position and the position of the opening is in a second preset distance range.
Further, the second target information display sub-module includes: click times determining unit, pressing duration determining unit and sliding track determining unit, wherein:
And the click times determining unit is used for determining that the number of clicks at the target position reaches the preset number.
And the pressing duration determining unit is used for determining that the pressing duration at the target position reaches a preset duration.
And the sliding track determining unit is used for determining that the sliding track at the target position meets the preset sliding track.
Further, the second operation response module 230 includes: the content obtaining sub-module, the display icon determining sub-module and the third target information displaying sub-module, wherein:
And the content obtaining sub-module is used for analyzing the target information to obtain the content contained in the target information.
And the display icon determining submodule is used for determining a display icon corresponding to the target information based on the content contained in the target information.
And the third target information display sub-module is used for displaying the display icon corresponding to the target information.
Further, the second operation response module 230 includes: an application program determining sub-module, an application program associating sub-module and an application page jumping sub-module, wherein:
and the application program determining submodule is used for determining the application program corresponding to the target information based on the content contained in the target information.
And the application program association sub-module is used for associating the display icon corresponding to the target information with the application program corresponding to the target information.
And the application page jumping sub-module is used for jumping to an application page of the application program corresponding to the target information in response to the touch operation of the display icon corresponding to the target information.
Further, the second operation response module 230 includes: a text translation sub-module, a reading position jumping sub-module and a chat interface jumping sub-module, wherein:
And the text translation sub-module is used for translating the foreign text to obtain the native text if the content contained in the target information comprises the foreign text.
And the reading position jumping sub-module is used for jumping to the reading position corresponding to the uniform resource locator when the touch operation on the display icon corresponding to the target information is detected, if the content contained in the target information comprises the uniform resource locator.
And the chat interface jumping sub-module is used for jumping to the chat page corresponding to the contact head portrait when the touch operation on the display icon corresponding to the target information is detected, if the content contained in the target information comprises the contact head portrait.
Further, the second operation response module 230 includes: a fourth target information display sub-module, wherein:
And the fourth target information display sub-module is used for responding to the drag operation of dragging the display icon corresponding to the target information to the target application program, and displaying the target information in the application page of the target application program.
Further, when the number of the target information is a plurality, the second operation response module 230 includes: a fifth target information display sub-module and a sixth target information display sub-module, wherein:
And the fifth target information display sub-module is used for displaying the plurality of target information on the current interface around the area on the display screen corresponding to the opening.
And the sixth target information display sub-module is used for displaying the plurality of target information in the current interface in a list form.
Further, the second operation response module 230 includes: a seventh target information display sub-module, wherein:
And the seventh target information display sub-module is used for responding to the sliding operation on the plurality of target information, and moving one or more of the plurality of target information on the current interface around the area on the display screen corresponding to the opening so as to be hidden or displayed.
Further, when the number of the target information is a plurality, the second operation response module 230 includes: a merging information obtaining sub-module and an eighth target information display sub-module, wherein:
And the merging information obtaining sub-module is used for merging the target information belonging to the same type in the plurality of target information to obtain one or more pieces of merged information.
And an eighth target information display sub-module for displaying the one or more pieces of merged information.
Further, the interaction device 200 further includes: the system comprises a target control display module and an information display module, wherein:
and the target control display module is used for displaying the target control on the current interface.
And the information display module is used for responding to the touch operation acted on the target control, and displaying the target information which is stored and is not processed within the preset duration.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the apparatus and modules described above may refer to the corresponding process in the foregoing method embodiment, which is not repeated herein.
In several embodiments provided by the present application, the coupling of the modules to each other may be electrical, mechanical, or other.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
Referring to fig. 33, a block diagram of an electronic device 100 according to an embodiment of the application is shown. The electronic device 100 may be a smart phone, a tablet computer, an electronic book, or the like capable of running an application program. The electronic device 100 of the present application may include one or more of the following components: processor 110, memory 120, display 130, and one or more application programs, wherein the one or more application programs may be stored in memory 120 and configured to be executed by the one or more processors 110, the one or more program(s) configured to perform the methods as described in the foregoing method embodiments.
Wherein the processor 110 may include one or more processing cores. The processor 110 uses various interfaces and lines to connect various portions of the overall electronic device 100, and performs various functions of the electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and invoking data stored in the memory 120. Optionally, the processor 110 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or a combination of several of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing the content to be displayed; and the modem is used to handle wireless communication. It will be appreciated that the modem may alternatively not be integrated into the processor 110 and may be implemented by a separate communication chip.
Memory 120 may include random access Memory (Random Access Memory, RAM) or Read-Only Memory (ROM). Memory 120 may be used to store instructions, programs, code, sets of codes, or sets of instructions. The memory 120 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing functions (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described below, and the like. The storage data area may also store data created by the electronic device 100 in use (e.g., phonebook, audiovisual data, chat log data), and the like.
The display 130 is used to display information input by a user, information provided to the user, and various graphical user interfaces of the electronic device 100. These graphical user interfaces may be formed by graphics, text, icons, numbers, video, and any combination thereof. In one example, the display 130 may be a Liquid Crystal Display (LCD) or an Organic Light-Emitting Diode (OLED) display, which is not limited herein.
Referring to fig. 34, fig. 34 shows a block diagram of a computer readable storage medium according to an embodiment of the application. The computer readable storage medium 300 has stored therein program code which can be invoked by a processor to perform the methods described in the foregoing method embodiments.
The computer readable storage medium 300 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Optionally, the computer readable storage medium 300 comprises a non-volatile computer readable medium (non-transitory computer-readable storage medium). The computer readable storage medium 300 has storage space for program code 310 that performs any of the method steps described above. The program code can be read from or written to one or more computer program products. Program code 310 may be compressed, for example, in a suitable form.
In summary, according to the interaction method, apparatus, electronic device, and storage medium provided by the embodiments of the application, the current interface is displayed; in response to a first operation on target information of the current interface, the target information disappears after moving to the area on the display screen corresponding to the opening and is stored; and the target information is displayed in response to a second operation on the electronic device. By combining the opening in the display screen with the storage or display of information, the application gives new uses to the opening in the display screen and improves the use experience of the user.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will appreciate that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features can be replaced by equivalents; and such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (20)

1. An interaction method, characterized by being applied to an electronic device, the electronic device comprising a display screen, the display screen having an opening, the method comprising:
Displaying a current interface;
Responding to a first operation aiming at the target information of the current interface, enabling the target information to disappear after moving to an area on a display screen corresponding to the opening, and storing the target information; or alternatively
The target information is displayed in response to a second operation for the electronic device.
2. The method according to claim 1, wherein the moving the target information to the area on the display screen corresponding to the opening to disappear and saving the target information in response to the first operation for the target information of the current interface includes:
Responding to the drag operation of the target information, and enabling the target information to move to the area on the display screen corresponding to the opening;
and responding to the end of the dragging operation, enabling the target information to disappear after moving to the area on the display screen corresponding to the opening, and storing the target information.
3. The method according to claim 2, wherein the moving the target information to the area on the display screen corresponding to the opening and then disappearing in response to the end of the drag operation, and storing the target information, comprises:
Determining a distance between a display position of the target information and a position of the opening in response to the end of the drag operation;
If the distance is within the first preset distance range, the target information is moved to the area on the display screen corresponding to the opening, then the target information disappears, and the target information is stored.
4. The method of claim 1, wherein the displaying the target information in response to a second operation for the electronic device comprises:
the target information is displayed in response to a tapping operation on a rear case or side of the electronic device.
5. The method of claim 4, wherein the displaying the target information in response to a tapping operation on a rear case or side of the electronic device comprises:
Responding to the tapping operation aiming at the rear case or side of the electronic equipment, and acquiring a tapping parameter corresponding to the tapping operation, wherein the tapping parameter comprises one or a combination of more of a tapping position, a number of taps and a tapping frequency;
and if the tapping parameter meets a preset tapping parameter, displaying the target information.
6. The method of claim 1, wherein the displaying the target information in response to a second operation for the electronic device comprises:
and responding to touch operation on a target position of the display screen, and displaying the target information, wherein the distance between the target position and the position of the opening is within a second preset distance range.
7. The method of claim 6, wherein the touch operation for the target location of the display screen comprises one or more of:
the clicking times at the target position reach the preset times;
The pressing time length at the target position reaches a preset time length; or alternatively
The sliding track at the target position meets a preset sliding track.
8. The method of claim 1, wherein the displaying the target information comprises:
Analyzing the target information to obtain the content contained in the target information;
Determining a display icon corresponding to the target information based on the content contained in the target information;
and displaying a display icon corresponding to the target information.
9. The method of claim 8, wherein the method further comprises:
Determining an application program corresponding to the target information based on the content contained in the target information;
Associating the display icon corresponding to the target information with the application program corresponding to the target information;
And responding to the touch operation of the display icon corresponding to the target information, and jumping to an application page of the application program corresponding to the target information.
10. The method of claim 8, wherein the method further comprises:
If the content contained in the target information comprises foreign characters, translating the foreign characters to obtain native characters;
If the content contained in the target information comprises a uniform resource locator, when touch operation on a display icon corresponding to the target information is detected, jumping to a reading position corresponding to the uniform resource locator; or alternatively
If the content contained in the target information comprises a contact head portrait, when touch operation on a display icon corresponding to the target information is detected, skipping to a chat page corresponding to the contact head portrait.
11. The method of claim 8, further comprising, after the displaying the display icon corresponding to the target information:
And responding to a drag operation of dragging the display icon corresponding to the target information to the target application program, and displaying the target information in an application page of the target application program.
12. The method of claim 1, wherein when the number of the target information is a plurality, the displaying the target information includes:
Displaying a plurality of target information on the current interface around the area on the display screen corresponding to the opening; or alternatively
And displaying the plurality of target information in the current interface in a list form.
13. The method of claim 12, further comprising, after the displaying the plurality of target information on the current interface around the area on the display screen corresponding to the opening:
and in response to the sliding operation on the plurality of target information, moving one or more of the plurality of target information on the current interface around the area on the display screen corresponding to the opening to be hidden or displayed.
14. The method according to claim 1, wherein the method further comprises:
displaying a target control on the current interface;
And responding to the touch operation acted on the target control, and displaying the target information which is stored and unprocessed within the preset time length.
15. The method of claim 1, wherein when the number of the target information is a plurality, the displaying the target information includes:
combining the target information belonging to the same type in the plurality of target information to obtain one or more pieces of combined information;
And displaying the one or more pieces of combined information.
16. The method of claim 1, wherein the saving the target information comprises:
if the target information is target text information obtained by converting the target voice information into a text, acquiring the target voice information;
and carrying out association storage on the target text information and the target voice information.
17. The method of any one of claims 1-16, wherein the electronic device further comprises a photosensitive device, the opening corresponding in position to the photosensitive device.
18. An interaction apparatus for use with an electronic device, the electronic device comprising a display screen provided with an opening, the apparatus comprising:
a current interface display module, configured to display a current interface;
a first operation response module, configured to, in response to a first operation on target information of the current interface, cause the target information to disappear after moving to an area on the display screen corresponding to the opening, and save the target information; or
a second operation response module, configured to display the target information in response to a second operation on the electronic device.
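A rough structural sketch of the module decomposition in claim 18, one interface per module; the method names and signatures are invented for illustration only.

```kotlin
// Displays the current interface of the electronic device.
interface CurrentInterfaceDisplayModule { fun showCurrentInterface() }

// Handles the first operation: move the information to the opening area, then save it.
interface FirstOperationResponseModule { fun onFirstOperation(targetInfo: String) }

// Handles the second operation: redisplay the previously saved information.
interface SecondOperationResponseModule { fun onSecondOperation(): List<String> }
```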
19. An electronic device, comprising a display screen, a memory, and a processor, the display screen and the memory being coupled to the processor, and the memory storing instructions which, when executed by the processor, perform the method of any one of claims 1-17.
20. A computer readable storage medium having stored therein program code which is callable by a processor to perform the method of any one of claims 1-17.
CN202211721521.7A 2022-12-30 2022-12-30 Interaction method, device, electronic equipment and storage medium Pending CN118276726A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211721521.7A CN118276726A (en) 2022-12-30 2022-12-30 Interaction method, device, electronic equipment and storage medium
PCT/CN2023/121694 WO2024139479A1 (en) 2022-12-30 2023-09-26 Interaction method and apparatus, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211721521.7A CN118276726A (en) 2022-12-30 2022-12-30 Interaction method, device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN118276726A true CN118276726A (en) 2024-07-02

Family

ID=91642712

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211721521.7A Pending CN118276726A (en) 2022-12-30 2022-12-30 Interaction method, device, electronic equipment and storage medium

Country Status (2)

Country Link
CN (1) CN118276726A (en)
WO (1) WO2024139479A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006293185A (en) * 2005-04-14 2006-10-26 Seiko Instruments Inc Electronic equipment with display switching function, and display switching program
CN109388309B (en) * 2018-09-27 2023-10-10 Oppo广东移动通信有限公司 Menu display method, device, terminal and storage medium
CN110297591A (en) * 2019-06-26 2019-10-01 维沃移动通信有限公司 A kind of operating method and terminal device
CN110442291A (en) * 2019-07-31 2019-11-12 维沃移动通信有限公司 A kind of control method and mobile terminal

Also Published As

Publication number Publication date
WO2024139479A1 (en) 2024-07-04

Similar Documents

Publication Publication Date Title
US11003331B2 (en) Screen capturing method and terminal, and screenshot reading method and terminal
CN103516892B (en) Mobile terminal and control method thereof
US9594496B2 (en) Method and apparatus for playing IM message
CN107491315B (en) Message prompting method, device and terminal
US10782856B2 (en) Method and device for displaying application function information, and terminal device
US8675024B2 (en) Mobile terminal and displaying method thereof
JP2021157837A (en) Notification processing method, electronic device, computer readable storage medium, and program
US9116594B2 (en) Mobile terminal and control method thereof
US20170090565A1 (en) User interfaces and associated processes for information resources
US20120289290A1 (en) Transferring objects between application windows displayed on mobile terminal
US10228835B2 (en) Method for displaying information, and terminal equipment
WO2022089330A1 (en) Method for taking screenshot, apparatus, electronic device, and readable storage medium
US20140325323A1 (en) Online video playing method and apparatus and computer readable medium
US20140330570A1 (en) Satisfying specified intent(s) based on multimodal request(s)
WO2019037359A1 (en) Split-screen display method, device and terminal
TW201351259A (en) User-resizable icons
KR101832394B1 (en) Terminal apparatus, server and contol method thereof
CN105094661A (en) Mobile terminal and method of controlling the same
CN108475182B (en) Data processing method and electronic terminal
WO2021104175A1 (en) Information processing method and apparatus
CN112711366A (en) Image generation method and device and electronic equipment
US20150082182A1 (en) Display apparatus and controlling method thereof
CN106445391A (en) Method and system for screen splitting based on multi-point touch
EP2442241A1 (en) Mobile terminal and displaying method thereof
TWI782137B (en) Method and device for generating and displaying data object information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination