CN112015262A - Data processing method, interface control method, device, equipment and storage medium - Google Patents


Info

Publication number
CN112015262A
CN112015262A (application CN201910453665.0A)
Authority
CN
China
Prior art keywords
interface
control
information
gesture
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910453665.0A
Other languages
Chinese (zh)
Inventor
许侃
姚维
张迪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Banma Zhixing Network Hongkong Co Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201910453665.0A priority Critical patent/CN112015262A/en
Publication of CN112015262A publication Critical patent/CN112015262A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application provide a data processing method, an interface control method, a data processing apparatus, an interface control apparatus, and a storage medium, so as to offer convenient operation. The method includes the following steps: detecting control information; in response to detected first control information, displaying a control interface, where the control interface displays at least one item of entry information; and in response to detected second control information, displaying a function interface corresponding to the selected entry information. A corresponding interface can thus be displayed conveniently based on control information; the control path is convenient, operation convenience is effectively improved, and efficiency is high.

Description

Data processing method, interface control method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a data processing method and apparatus, an interface control method and apparatus, an electronic device, and a storage medium.
Background
With the development of computer technology, computers have become widely used, making entertainment, positioning and navigation, daily-life services, and many other aspects of people's lives more convenient.
When using an electronic device, a user can access various applications and other functions on the device, but the user often has to first find the corresponding application and then launch it, so the operation path is inconvenient and inefficient.
Disclosure of Invention
An embodiment of the present application provides a data processing method to offer convenient operation.
Correspondingly, embodiments of the present application also provide a data processing apparatus, an interface control method and apparatus, an electronic device, and a storage medium to ensure the implementation and application of the method.
To solve the above problem, an embodiment of the present application discloses a data processing method, including: detecting control information; in response to detected first control information, displaying a control interface, where the control interface displays at least one item of entry information; and in response to detected second control information, displaying a function interface corresponding to the selected entry information.
An embodiment of the present application also discloses an interface control method, including: displaying a first interface; in response to detected first control information, displaying a control interface, where the control interface displays at least one item of entry information; and in response to detected second control information, displaying a function interface corresponding to the selected entry information.
An embodiment of the present application also discloses an interface control method, including: displaying a navigation interface; in response to a detected hover gesture, displaying a control interface on the navigation interface, where the control interface displays at least one of the following items of entry information: map entry information, contacts entry information, music entry information, radio entry information, and audiobook entry information; and in response to a detected upward gesture, displaying, on the navigation interface, a function interface corresponding to the selected entry information.
An embodiment of the present application also discloses an interface control method, including: displaying a navigation interface; in response to a detected hover gesture, displaying a control interface on the navigation interface, where the control interface displays at least one of the following items of entry information: map entry information, contacts entry information, music entry information, radio entry information, and audiobook entry information; and in response to a detected movement gesture, switching the selected entry information in the control interface according to the movement direction of the movement gesture.
An embodiment of the present application also discloses an interface control method, including: displaying a navigation interface; in response to a detected hover gesture, displaying a control interface on the navigation interface, where the control interface displays at least one of the following items of entry information: map entry information, contacts entry information, music entry information, radio entry information, and audiobook entry information; and in response to a gesture changing from hovering to opening, displaying a function interface of the shortcut function corresponding to the selected application plug-in.
An embodiment of the present application also discloses a data processing apparatus, including: a detection module, configured to detect control information; and a control module, configured to display a control interface in response to detected first control information, the control interface displaying at least one item of entry information, and to display, in response to detected second control information, a function interface corresponding to the selected entry information.
An embodiment of the present application also discloses an interface control apparatus, including: a display module, configured to display a first interface; and an interface control module, configured to display a control interface in response to detected first control information, the control interface displaying at least one item of entry information, and to display, in response to detected second control information, a function interface corresponding to the selected entry information.
An embodiment of the present application also discloses an interface control apparatus, including: a display module, configured to display a navigation interface; and an interface control module, configured to display a control interface on the navigation interface in response to a detected hover gesture, where the control interface displays at least one of the following items of entry information: map entry information, contacts entry information, music entry information, radio entry information, and audiobook entry information, and to display, in response to a detected upward gesture, a function interface corresponding to the selected entry information on the navigation interface.
An embodiment of the present application also discloses an interface control apparatus, including: a display module, configured to display a navigation interface; and an interface control module, configured to display a control interface on the navigation interface in response to a detected hover gesture, where the control interface displays at least one of the following items of entry information: map entry information, contacts entry information, music entry information, radio entry information, and audiobook entry information, and to switch, in response to a detected movement gesture, the selected entry information in the control interface according to the movement direction of the movement gesture.
An embodiment of the present application also discloses an interface control apparatus, including: a display module, configured to display a navigation interface; and an interface control module, configured to display a control interface on the navigation interface in response to a detected hover gesture, where the control interface displays at least one of the following items of entry information: map entry information, contacts entry information, music entry information, radio entry information, and audiobook entry information, and to display, in response to a gesture changing from hovering to opening, a function interface of the shortcut function corresponding to the selected application plug-in.
An embodiment of the present application also discloses an electronic device, including: a processor; and a memory having executable code stored thereon which, when executed, causes the processor to perform a data processing method as described in one or more of the embodiments of the present application.
One or more machine-readable media are also disclosed, having executable code stored thereon which, when executed, causes a processor to perform a data processing method as described in one or more of the embodiments of the present application.
An embodiment of the present application also discloses an electronic device, including: a processor; and a memory having executable code stored thereon which, when executed, causes the processor to perform an interface control method as described in one or more of the embodiments of the present application.
One or more machine-readable media are also disclosed, having executable code stored thereon which, when executed, causes a processor to perform an interface control method as described in one or more of the embodiments of the present application.
An embodiment of the present application also discloses an electronic device, including: a processor; and a memory having executable code stored thereon which, when executed, causes the processor to perform an interface control method as described in the embodiments of the present application.
One or more machine-readable media are also disclosed, having executable code stored thereon which, when executed, causes a processor to perform an interface control method as described in the embodiments of the present application.
An embodiment of the present application also discloses an electronic device, including: a processor; and a memory having executable code stored thereon which, when executed, causes the processor to perform an interface control method as described in the embodiments of the present application.
One or more machine-readable media are also disclosed, having executable code stored thereon which, when executed, causes a processor to perform an interface control method as described in the embodiments of the present application.
An embodiment of the present application also discloses an electronic device, including: a processor; and a memory having executable code stored thereon which, when executed, causes the processor to perform an interface control method as described in one or more of the embodiments of the present application.
One or more machine-readable media are also disclosed, having executable code stored thereon which, when executed, causes a processor to perform an interface control method as described in one or more of the embodiments of the present application.
Compared with the prior art, the embodiments of the present application have the following advantages:
In the embodiments of the present application, control information is detected; a control interface is displayed in response to detected first control information, the control interface displaying at least one item of entry information; and a function interface corresponding to the selected entry information is displayed in response to detected second control information. The corresponding interface can thus be displayed conveniently based on control information; the control path is convenient, operation convenience is effectively improved, and efficiency is high.
Drawings
FIG. 1A is a schematic diagram of an interface change of an in-vehicle device according to an embodiment of the present application;
FIG. 1B is a schematic diagram of example gestures according to an embodiment of the present application;
FIG. 2 is a flow chart of the steps of an embodiment of a data processing method of the present application;
FIG. 3 is a schematic structural diagram of a data processing apparatus in an in-vehicle device according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an example interface of an in-vehicle device according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an example interface of another in-vehicle device according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an example interface of yet another in-vehicle device according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a card panel applying the shortcut function of a card according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a plug-in panel applying the shortcut function of a plug-in according to another embodiment of the present application;
FIG. 9 is a schematic diagram of a plug-in panel applying the shortcut function of a plug-in according to yet another embodiment of the present application;
FIG. 10 is a diagram illustrating an example of application plug-in selection on a global panel in an embodiment of the present application;
FIG. 11 is a flow chart of the steps of an alternative embodiment of a data processing method of the present application;
FIG. 12 is a flow chart of the steps of another embodiment of a data processing method of the present application;
FIG. 13 is a flow chart of the steps of yet another embodiment of a data processing method of the present application;
FIG. 14 is a flow chart of the steps of an embodiment of an interface control method of the present application;
FIG. 15 is a block diagram of an embodiment of a data processing apparatus of the present application;
FIG. 16 is a block diagram of an alternative embodiment of a data processing apparatus of the present application;
FIG. 17 is a block diagram of an embodiment of an interface control apparatus of the present application;
FIG. 18 is a schematic structural diagram of a device according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
The embodiments of the present application can be applied to various electronic devices with a display screen, such as mobile terminals like mobile phones and tablet computers, smart home devices like televisions and speakers, and smart appliances like refrigerators and microwave ovens, so that the device can be controlled based on control information such as gestures when it is not convenient for the user to touch the device's buttons or touch screen, which facilitates use. The following embodiments take gestures applied to an in-vehicle device as an example of control information.
An in-vehicle device is a device for vehicle monitoring and management; it may be a terminal device installed in a vehicle, or a mobile terminal such as a mobile phone or tablet computer used by a user in the vehicle.
While driving, a user can navigate with the in-vehicle device and can also use it to play music, make or receive calls, and so on. When using the services provided by the in-vehicle device and switching between different services, the influence on driving should be reduced to improve driving safety. The embodiments of the present application therefore provide a scheme for waking up different services and switching between them based on gestures, where the services provided by the in-vehicle device may be provided by application programs installed on it. A gesture is a mode of human-machine interaction: a user can control or interact with a device using simple gestures, letting the computer understand human behavior.
While the user is driving and navigating with the in-vehicle device, the device can display the corresponding navigation route, navigation prompts, and so on, as in the first interface example shown in FIG. 1A. During operation, the in-vehicle device can detect gestures, so that a control can be quickly invoked and at least one application plug-in displayed on it according to the detected gesture, as in the second interface example shown in FIG. 1A. The user therefore does not need to touch the in-vehicle device directly, which reduces the influence on driving and improves driving safety. This can be realized by the following steps.
referring to fig. 2, a flow chart of steps of an embodiment of a data processing method of the present application is shown.
In step 202, a gesture is detected.
The in-vehicle device can detect gestures during operation so as to recognize the user's various gestures. In the embodiments of the present application, the in-vehicle device may be equipped with gesture-recognition hardware, such as image acquisition components like cameras, or infrared components. Taking an image acquisition component as an example, image data can be collected and recognized to realize gesture detection. In one example, two cameras can be arranged in the vehicle, either on the in-vehicle device or elsewhere in the vehicle, so that image data can be collected through the two cameras and then recognized to obtain the user's gesture; gesture information such as the direction of the gesture, the shape of the gesture, and the pattern of the gesture can be recognized.
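As an illustration of the recognition step above, the motion-direction component of the gesture information might be derived from tracked fingertip positions roughly as follows. This is a minimal sketch: the record fields, the 5-pixel hover threshold, and the image-coordinate convention (y grows downward) are assumptions for illustration, not details from the patent.

```python
from dataclasses import dataclass

# Hypothetical gesture-information record the hardware recognition layer
# might produce from camera frames: hand shape plus motion direction.
@dataclass
class GestureInfo:
    shape: str       # e.g. "index_up", "five_open", "fist"
    direction: str   # "none", "up", "down", "left", or "right"

def infer_direction(positions):
    """Classify motion direction from a sequence of (x, y) fingertip
    positions in image coordinates (y grows downward)."""
    if len(positions) < 2:
        return "none"
    dx = positions[-1][0] - positions[0][0]
    dy = positions[-1][1] - positions[0][1]
    if abs(dx) < 5 and abs(dy) < 5:
        return "none"          # barely moved: hovering in place
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy < 0 else "down"

# Example: a fingertip tracked moving upward across three frames.
track = [(100, 200), (101, 180), (102, 150)]
print(GestureInfo(shape="index_up", direction=infer_direction(track)))
```

In practice the shape field would come from a hand-pose model over the camera images; only the direction heuristic is shown here.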
In the embodiments of the present application, the in-vehicle device may be organized into a hardware recognition layer 10, a global application 20, a control layer 30, and applications 40; as shown in FIG. 3, the applications 40 include an application 401, an application 402, an application 403, and so on. The hardware recognition layer performs gesture recognition to obtain gesture information. The global application is the application displayed in the foreground, such as the desktop of the in-vehicle device's operating system, and may include a status bar, a notification bar, a navigation bar, and the like. The control layer responds to the recognized gesture information. An application is an application installed on the in-vehicle device, such as a navigation application, a player application, or a map application. The hardware recognition layer can perform recognition based on data collected by the hardware, such as image data, determine the user's gesture information, and send the recognized gesture information to the global application. The global application integrates control controls such as a gesture control and a voice recognition control; these controls encapsulate functions for recognizing and distributing control information (the gesture control, for example, encapsulates gesture recognition and distribution), so that the function service corresponding to a gesture can be recognized based on the gesture information recognized by the hardware recognition layer, and the gesture can be distributed to the control corresponding to that function service, such as controls for launching services like application plug-ins and application shortcut functions. The global application may thus distribute the gesture to the control layer based on the recognized gesture information.
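The layered dispatch just described can be sketched as follows. All class and gesture names here are illustrative stand-ins, not identifiers from the patent; the point is only the flow: hardware recognition layer → global application's gesture control → control layer → registered handler.

```python
# Minimal sketch of the four-layer dispatch described above.
class ControlLayer:
    """Responds to recognized gesture information by invoking the
    handler registered for that gesture."""
    def __init__(self):
        self.handlers = {}           # gesture name -> handler callable

    def register(self, gesture, handler):
        self.handlers[gesture] = handler

    def dispatch(self, gesture):
        handler = self.handlers.get(gesture)
        return handler(gesture) if handler else None

class GlobalApplication:
    """Foreground application embedding a gesture control that receives
    gesture information from the hardware recognition layer and
    forwards it to the control layer."""
    def __init__(self, control_layer):
        self.control_layer = control_layer

    def on_gesture_info(self, gesture):
        return self.control_layer.dispatch(gesture)

control_layer = ControlLayer()
control_layer.register("hover", lambda g: "show control interface")
control_layer.register("up", lambda g: "open function interface")

app = GlobalApplication(control_layer)
print(app.on_gesture_info("hover"))  # -> show control interface
```

An unregistered gesture simply falls through (`dispatch` returns `None`), which mirrors the idea that only gestures mapped to a function service reach the control layer's handlers.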
In the embodiments of the present application, the gestures distributed to the control layer for processing may include a first gesture, a second gesture, a third gesture, a fourth gesture, a fifth gesture, a sixth gesture, and so on, with each different gesture providing a different function service through the control layer. Gestures can be set as required or set by the user. They can be static gestures in a stationary state, such as an index finger held up, five fingers open, a fist, or a thumb up, and dynamic gestures in a moving state, such as up, down, left, and right movements with the index finger held up, five open fingers closing to a fist, a fist opening to five fingers, or a single finger drawing a circle; this is not limited in the embodiments of the present application. FIG. 1B shows examples such as a hover gesture with the index finger held up and an open gesture with five fingers open; different movement gestures can be formed by moving the single raised finger in different directions, such as up, down, left, and right, and corresponding gestures can be formed by changing from a single raised finger to five open fingers, or from five open fingers to a single raised finger.
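One way to map the shape transitions and movements above onto discrete gesture classes is sketched below. The class labels and the correspondence to the patent's ordinals (first through fifth gestures) are illustrative assumptions, since the patent leaves the exact assignment configurable.

```python
def classify(prev_shape, shape, direction):
    """Map a hand-shape transition plus motion direction onto a
    gesture class (labels are illustrative, not from the patent)."""
    if prev_shape == "index_up" and shape == "five_open":
        return "hover_to_open"       # e.g. the fourth gesture
    if prev_shape == "five_open" and shape == "index_up":
        return "open_to_hover"       # e.g. the fifth gesture
    if shape == "index_up" and direction == "none":
        return "hover"               # static: index finger held up
    if shape == "index_up" and direction == "up":
        return "move_up"             # dynamic: index up, moving upward
    if shape == "index_up" and direction in ("left", "right"):
        return "move_" + direction   # dynamic: lateral movement
    return "unknown"

print(classify("index_up", "five_open", "none"))  # -> hover_to_open
```

Transition checks come before the static checks so that a shape change is not misread as a continued hover on the frame where it occurs.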
In step 204, in response to the detected first gesture, a control interface is displayed, the control interface displaying at least one item of entry information.
The gestures received by the in-vehicle device are monitored; if the first gesture is detected, then in response to it the control layer invokes a control and displays the control's interface. This control interface may also be called a global panel, a control view, and so on, and displays at least one item of entry information, which may include entry information for application plug-ins of at least one application installed on the in-vehicle device. In the embodiments of the present application, the control provides the control interface so that applications can be used conveniently.
In an optional embodiment, displaying the control interface includes: invoking a control, where the control is associated with at least one application plug-in, system service, and the like; and displaying the control interface of the control, on which the entry information of at least one application plug-in and system service is displayed. An application plug-in is a plug-in corresponding to an application installed on the device, a plug-in being a program written against an application programming interface according to a certain specification; a system service is a service function provided by the device's operating system; the specific functions involved can be set as required. In the embodiments of the present application, the first gesture, such as a hover gesture formed by holding up the index finger, is used to quickly launch shortcut operations for applications and system services on the in-vehicle device, so the control for shortcut operations can be invoked. The control may be associated with multiple application plug-ins and/or system services for applications that provide a quick-launch function, so in response to the first gesture, at least one application plug-in and/or system service associated with the control can also be invoked; the control interface is then displayed with at least one item of entry information, such as icons, display cards, or display interfaces. In the second interface example of FIG. 1A, plug-in icons for a frequency modulation (FM) radio application, a music application, a call application, a map application, and so on are displayed.
In an optional embodiment of the present application, the application plug-ins and system services associated with a control can be configured, and an interface can be called to set this association, so that when a shortcut service function for an application or system service needs to be added, an interface of the control, such as a first interface, can be called, and the application's plug-in or the system service is then associated with the control; the associated entry information can then be displayed when the control interface is shown. For example, the in-vehicle device stores a control information table for the control, which records control information such as the control's identifier and control interface information, the identifiers of applications and plug-ins associated with the control, system services, and entry information such as the address of the plug-in information.
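A control information table like the one described could be modeled as a small registry, as in the following sketch. The field names and example identifiers are hypothetical; the patent only specifies that the table records control, plug-in, and entry information.

```python
# Hedged sketch of the control information table: a registry that
# associates application plug-ins (and system services) with a control
# so their entry information appears on the control interface.
class ControlRegistry:
    def __init__(self, control_id):
        self.control_id = control_id
        self.entries = []            # ordered entry-information records

    def associate(self, app_id, plugin_id, entry_icon):
        """Called via the control's configuration interface to add a
        shortcut service function for an application plug-in."""
        self.entries.append({
            "app_id": app_id,
            "plugin_id": plugin_id,
            "entry_icon": entry_icon,
        })

    def entry_information(self):
        """Entry information to render on the control interface."""
        return [e["entry_icon"] for e in self.entries]

registry = ControlRegistry("global_panel")
registry.associate("com.example.music", "music_plugin", "music_icon")
registry.associate("com.example.call", "call_plugin", "call_icon")
print(registry.entry_information())  # -> ['music_icon', 'call_icon']
```

Displaying the control interface then amounts to iterating `entry_information()` in order, which also fixes the left-to-right layout used for selection switching.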
In step 206, in response to the detected second gesture, a function interface corresponding to the selected entry information is displayed.
The in-vehicle device may continuously monitor gestures, so that when the second gesture is detected, the selected entry information can be determined in response, such as the plug-in of the music application selected in the second interface example of FIG. 1A, and the function interface corresponding to the selected entry information can then be displayed; the function interface of an application plug-in may be a plug-in panel or the like. Function interfaces include the display interface of a plug-in, the display interface of a system service, and so on, such as the interface of the call plug-in shown in FIG. 5.
In response to the detected second gesture, the selected entry information in the control interface can be determined, the function interface for that entry information invoked, and the display switched from the control interface to the function interface. For example, since the control interface is displayed based on a gesture with the index finger held up, an upward gesture formed by holding the index finger up and moving it upward can switch the display to the corresponding function interface. In the second interface example of FIG. 1A, the selected application plug-in is the music application's plug-in; in the interface example of FIG. 4, the selected application plug-in is the call application's plug-in, and in response to the second gesture the function interface of the call application's plug-in can be displayed, as shown in FIG. 5, where the user can dial a call by voice, gesture, or the like.
In an optional embodiment of the present application, a device such as the in-vehicle device monitors gestures and, in response to a detected third gesture, switches the selected entry information in the control interface. One item of entry information may be selected in the control interface, and the selection is switched with the third gesture. For example, if the control interface is launched by a gesture with the index finger held up and the entry information is displayed, the index finger can be kept raised, and the corresponding movement gesture is the third gesture; the selected entry information can be switched based on the movement direction, with the switching direction corresponding to the movement direction of the third gesture. Moving left makes the entry information to the left of the selected item the new selection; moving right makes the entry information to the right the new selection; and so on. To help the user identify the selection, the selected entry information can be displayed larger than the unselected entry information in the control interface. As with the music application's entry information selected in FIG. 2, the selection can be adjusted in response to the third gesture, moving left so that the call application's entry information in FIG. 4 becomes selected.
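The selection-switching behavior can be sketched as follows. The entry list and the clamping at the ends are assumptions for illustration; the patent does not specify what happens at the first or last entry (wrap-around would be an equally valid choice).

```python
def switch_selection(entries, selected, direction):
    """Move the selection left or right among the entry items shown on
    the control interface, clamping at the ends (assumed behavior)."""
    i = entries.index(selected)
    if direction == "left":
        i = max(0, i - 1)            # select the entry to the left
    elif direction == "right":
        i = min(len(entries) - 1, i + 1)  # select the entry to the right
    return entries[i]

# Hypothetical entry order on the control interface.
entries = ["map", "call", "music", "radio"]
print(switch_selection(entries, "music", "left"))  # -> call
```

A renderer would then draw the returned entry larger than the others, matching the selected-item enlargement described above.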
In another optional embodiment of the present application, an application or system service may further configure a quick-start service based on the functions it provides, so that a particular function of an application or system service can be invoked quickly from the control interface, offering a convenient service to the user. An interface of the control, such as a second interface, is called to set the shortcut function of the application, so that in response to a detected fourth gesture, the function interface of the shortcut function corresponding to the selected entry information is displayed. If the fourth gesture is a gesture changing from hovering to open palm, then after the gesture is detected, the entry information selected in the control interface is determined, the shortcut function corresponding to it is started, and the display is switched from the control interface to the function interface of that shortcut function, making the functions provided by the application convenient for the user. In the second interface example shown in fig. 1A, the plug-in of a music application is selected; in response to the fourth gesture, a shortcut function of that plug-in, such as music playing, may be started, and the function interface of the music playing function may then be displayed, as shown in fig. 6.
Correspondingly, in response to a detected fifth gesture, the display switches back from the function interface of the shortcut function to the control interface on which at least one application plug-in is displayed, allowing a quick return based on the fifth gesture. The fifth gesture may include a gesture changing from open palm to hovering, so that after the gesture is detected, the function interface of the shortcut function is closed and the control interface is displayed, e.g. switching from fig. 6 back to the second interface shown in fig. 1A.
In an optional embodiment of the present application, data can be synchronized to a corresponding application or system service, and the like, through the functional interface. That is, an instruction corresponding to a user operation is received through the functional interface, data synchronization can be performed with a corresponding application or system service based on the instruction, so that processing corresponding to the user operation is executed, and corresponding functional processing can be provided without opening an interface of the application or system service.
In one example, an instruction is received through the functional interface and synchronized to the application or system service corresponding to the functional interface; a processing result corresponding to the instruction is then received and output on the functional interface. That is, the functional interface receives an instruction corresponding to a user operation and forwards it to the corresponding application or system service; the application or system service executes the processing operation corresponding to the instruction, obtains the processing result, and sends it back to the functional interface, which outputs it, e.g. by displaying corresponding information or playing corresponding audio data.
In another example, an instruction is received through the functional interface; the processing corresponding to the instruction is executed and its result is output on the functional interface; and the processing result is synchronized to the application or system service corresponding to the functional interface. That is, the functional interface receives an instruction corresponding to a user operation and executes the processing operation itself, using data obtained from the corresponding application or system service so as to provide that service's function. The resulting processing result is then output on the functional interface and synchronized back to the application or system service.
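The two synchronization patterns above can be sketched in code. This is an illustrative, non-authoritative sketch — the class and method names (`MusicApp`, `on_instruction_forward`, `on_instruction_local`) are assumptions, not part of the patent:

```python
# Pattern A: the functional interface forwards the instruction to the
# application and renders the returned result.
# Pattern B: the interface executes the instruction locally, outputs the
# result, then synchronizes that result back to the application.

class MusicApp:
    """Stand-in for the application or system service behind the interface."""
    def __init__(self):
        self.now_playing = None

    def handle(self, instruction):
        if instruction == "play":
            self.now_playing = "song-1"
        return self.now_playing

class FunctionalInterface:
    def __init__(self, app):
        self.app = app
        self.displayed = None           # what the interface currently outputs

    def on_instruction_forward(self, instruction):
        # Pattern A: synchronize the instruction to the app, output its result.
        result = self.app.handle(instruction)
        self.displayed = result
        return result

    def on_instruction_local(self, instruction):
        # Pattern B: process locally, output, then sync the result to the app.
        result = "song-1" if instruction == "play" else None
        self.displayed = result
        self.app.now_playing = result   # synchronize the processing result
        return result

fi = FunctionalInterface(MusicApp())
print(fi.on_instruction_forward("play"))   # -> song-1
```

Either way, the user gets the function's result without the application's own interface ever being opened.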
In an optional embodiment of the present application, the application plug-in includes a plug-in host (host) and a plug-in package (plug-in). The plug-in host provides the functions of a container, while the plug-in package encapsulates a concrete service corresponding to a functional service provided by the application program. Based on this application plug-in framework, the concrete service encapsulated by the plug-in can be surfaced in the host, and after processing in the plug-in the result is synchronized to the application, without opening the application itself.
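A minimal sketch of that host/plug-in split, assuming hypothetical names (`PluginHost`, `Plugin`) that are not defined in the patent:

```python
# The host acts as a container; each plug-in packages one concrete service
# of an application, so the service can run without opening the application.

class Plugin:
    """Packages a concrete functional service of an application."""
    def __init__(self, name, service):
        self.name = name
        self.service = service          # callable implementing the service

    def run(self, *args):
        return self.service(*args)

class PluginHost:
    """Container that surfaces plug-in services and returns their results."""
    def __init__(self):
        self.plugins = {}

    def register(self, plugin):
        self.plugins[plugin.name] = plugin

    def invoke(self, name, *args):
        result = self.plugins[name].run(*args)
        # In the described framework, the result would be synchronized
        # back to the owning application at this point.
        return result

host = PluginHost()
host.register(Plugin("dial", lambda number: f"calling {number}"))
print(host.invoke("dial", "555-0100"))  # -> calling 555-0100
```

The container boundary is what lets the control interface expose one service of an application (dialing, playing music) without loading the rest of that application.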
In an optional embodiment of the present application, the control interface is closed in response to a detected sixth gesture. If a sixth gesture is detected, the control interface may be closed in response: for example, the control interface is started based on the hovering gesture, and moving the raised index finger downward forms a down gesture; according to this down gesture the control interface is closed, and the page previously displayed on the vehicle-mounted device is restored, e.g. the first display interface in fig. 1A.
In an example of starting and closing the control interface, as in the first interface shown in fig. 1A, the vehicle-mounted device displays the interface of a navigation application, providing a navigation function for the user. During navigation, gesture detection can be performed on information collected by hardware. When a raised-index-finger gesture is detected, the control interface is displayed, showing entry information such as names and icons of plug-in applications or system services; as in the second interface example shown in fig. 1A, entry information of an FM broadcast application, a music application, a call application, and a map application is displayed, with the music application's entry information selected by default. Detecting a downward gesture of the index finger closes the control interface, reverting to the first interface example shown in fig. 1A. In an alternative embodiment, a navigation interface is displayed; in response to a detected hover gesture, a control interface is displayed on the navigation interface, the control interface displaying at least one of the following entry information: the entry information of the map, of the address list, of the music, of the broadcast, and of the audio book; and the control interface is closed in response to a detected downward gesture of the finger.
An example of selecting between different entry information: as in the second interface example shown in fig. 1A, the control interface displays entry information such as names and icons of plug-in applications and system services. If a gesture of the raised index finger moving left or right is detected, the selected entry information in the control interface is switched; for example, on a rightward movement the selection switches from the entry information of the music application to that of the call application, as shown in fig. 4. When the selected entry information is at the edge of the control interface, such as the entry information of the map application, and the movement continues toward that edge, several behaviours are possible: the first entry information on the other side becomes the selected entry, such as the entry information of the FM broadcast plug-in; or the selection is determined by moving in the opposite direction, such as the entry information of the call application; or entry information not currently displayed is scrolled into view in the opposite direction and selected; and the like.
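The switching-with-wrap-around behaviour above can be sketched as follows. This is an illustrative sketch only; the class name and the wrap-around choice (one of the several edge behaviours the text lists) are assumptions:

```python
# Switching the selected entry in the control interface according to the
# movement direction of a gesture, wrapping to the opposite side when the
# selection moves past an edge.

class ControlInterface:
    def __init__(self, entries):
        self.entries = entries          # entry information, left to right
        self.selected = 0               # index of the selected entry

    def switch(self, direction):
        """direction: -1 for a leftward move, +1 for a rightward move."""
        # Modulo wrap: moving past an edge selects the first entry
        # on the opposite side.
        self.selected = (self.selected + direction) % len(self.entries)
        return self.entries[self.selected]

ui = ControlInterface(["FM broadcast", "Music", "Call", "Map"])
ui.selected = 1                         # music selected by default
print(ui.switch(+1))                    # move right -> Call
print(ui.switch(-1))                    # move left  -> Music
print(ui.switch(-1))                    # -> FM broadcast
print(ui.switch(-1))                    # wraps      -> Map
```

Python's `%` operator keeps negative indices in range, which is what makes the leftward wrap a one-liner here.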
In an alternative embodiment, a navigation interface is displayed; in response to the detected hover gesture, a control interface is displayed on the navigation interface, the control interface displaying at least one of the following entry information: the entry information of the map, of the address list, of the music, of the broadcast, and of the audio book; and in response to the detected movement gesture, the selected entry information in the control interface is switched according to the movement direction of the movement gesture.
After selecting the desired entry information, the user can keep the index finger raised and move it upward; according to this raised-and-moving-upward gesture, the function interface corresponding to the entry information is displayed, e.g. from the interface example of fig. 4 the function interface of the call application shown in fig. 5 can be entered. In an alternative embodiment, a navigation interface is displayed; in response to the detected hover gesture, a control interface is displayed on the navigation interface, the control interface displaying at least one of the following entry information: the entry information of the map, of the address list, of the music, of the broadcast, and of the audio book; and in response to the detected upward gesture, the function interface corresponding to the selected entry information is displayed on the navigation interface.
After selecting the desired entry information, if the user wants to start the shortcut function of the application or system service, the hand can change from a raised index finger to an open palm, and the function interface of the entry information's shortcut function is started based on this gesture changing from hovering to open. As in the second interface example in fig. 1A, the entry information of the music application is selected, and in response to the hover-to-open gesture, the function interface of the music playing shortcut function is entered. The shortcut function can then be controlled within the plug-in panel by shaking the open palm left and right (e.g. a seventh gesture), for example to switch between songs; based on this control, instructions can be processed through the function interface or transmitted to the corresponding application or system service for processing. When the user changes from open palm back to hovering, the function interface is closed in response and the control interface is displayed again, e.g. switching back to the second interface example in fig. 1A. In addition, after a movement time threshold is exceeded, the function interface of the quick operation can be collapsed and the display switched back to the control interface.
In an alternative embodiment, a navigation interface is displayed; in response to the detected hover gesture, a control interface is displayed on the navigation interface, the control interface displaying at least one of the following entry information: the entry information of the map, of the address list, of the music, of the broadcast, and of the audio book; and in response to the gesture changing from hovering to open, the function interface of the shortcut function corresponding to the selected application plug-in is displayed. In a further optional embodiment, in response to the detected gesture changing from open to hovering, the function interface of the shortcut function is closed and the control interface is displayed.
In the embodiment of the application, a shortcut function can be set for each application, making the application convenient for the user. For example, for a call application the shortcut function may relate to frequently-used contacts, as in the function interfaces of the call application shown in fig. 7, one with frequently-used contacts and one without; the function interfaces of the map application shown in fig. 8 include one with favorite addresses such as home and company, and one without favorite addresses; and the function interface of the music application shown in fig. 9 may include function interfaces of various music lists, such as a favorite music list or a frequently-listened music list, or a playing interface for music in a certain list.
In the embodiment of the present application, the gestures described above are examples; various static gestures, including a raised finger, five open fingers, and a fist, and various dynamic gestures, including various movements and changes, may be used in the embodiments of the present application.
In this embodiment, the entry information displayed in the control interface may also be arranged so that the selected entry information moves to the middle position of the interface along with the movement direction of the third gesture. As in the control-interface movement example shown in fig. 10, when the finger moves to the left, the list slides left and the entry information of the FM broadcast application moves to the middle; when the finger moves to the right, the list slides right and the entry information of the map application moves to the middle. In this embodiment of the present application, the arrangement order of the entry information in the control interface may be fixed, or may follow a rule such as time or frequency of use; this is not limited in the embodiments of the present application.
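The centred-selection arrangement above can be sketched as a sliding window over the entry list. The function name and the three-wide window are illustrative assumptions, not from the patent:

```python
# Keep the selected entry in the middle of the visible control interface
# where possible, clamping the window at the ends of the list.

def visible_window(entries, selected, width=3):
    """Return `width` entries with the selected one centred when possible."""
    half = width // 2
    # Clamp so the window never runs off either end of the list.
    start = max(0, min(selected - half, len(entries) - width))
    return entries[start:start + width]

entries = ["FM broadcast", "Music", "Call", "Map"]
print(visible_window(entries, selected=0))
# -> ['FM broadcast', 'Music', 'Call']   (edge: selection cannot be centred)
print(visible_window(entries, selected=2))
# -> ['Music', 'Call', 'Map']            ('Call' sits in the middle)
```

As the third gesture moves the selection index, recomputing the window makes the list appear to slide under a fixed centre slot, matching the fig. 10 behaviour.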
The embodiments of the application have been discussed taking the vehicle-mounted device as an example; in practice, the technical scheme can be applied to various electronic devices with display screens. For example, when a user views a recipe on the display screen of a refrigerator while cooking, information can be searched and queried through control information such as voice, and recipe pages can be switched through gestures, providing a convenient use experience for the user.
Referring to FIG. 11, a flowchart illustrating steps of an alternative embodiment of a data processing method of the present application is shown.
In step 1102, a gesture is detected.
Step 1104, in response to the detected first gesture, displaying a control interface, the control interface displaying at least one piece of entry information. A control and at least one application plug-in or system service associated with the control can be invoked; the control interface of the control is displayed, showing at least one piece of entry information.
Step 1106, responding to the detected third gesture, switching the selected entry information in the control interface, wherein the switching direction corresponds to the moving direction of the third gesture.
Step 1108, in response to the detected second gesture, displaying a function interface corresponding to the selected entry information. Wherein the selected portal information can be determined; and calling and displaying the function interface of the selected entry information.
Step 1110, in response to the detected fourth gesture, displaying a function interface of the shortcut function corresponding to the selected entry information.
Step 1112, receiving an instruction through the functional interface, and performing data synchronization with the functional interface. In one example, an instruction is received through the functional interface, and the instruction is synchronized to an application or system service corresponding to the functional interface; and receiving a processing result corresponding to the instruction, and outputting the processing result on the functional interface. In another example, instructions are received through the functional interface; executing the processing corresponding to the instruction and outputting the processing on the functional interface; and synchronizing the processing result to the application or system service corresponding to the functional interface.
Step 1114, in response to detecting the fifth gesture, switching back from the function interface to the control interface.
Step 1116, closing the control interface in response to the detected sixth gesture.
The embodiments of the present application do not restrict the order of the foregoing steps. For example, if after the control interface is displayed in step 1104 the default selected entry information is already the required one, step 1108 may be executed without switching the entry information; step 1110 may be executed directly after step 1104 to enter the function interface of the shortcut function; or steps 1106 and 1110 may be executed after step 1104. In another example, step 1116 is executed after step 1104. The execution sequence of the steps can thus be determined as required.
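The gesture-driven flow of fig. 11 can be viewed as a small state machine. This sketch is non-authoritative: the state names and transition table are assumptions chosen to mirror the step numbers above:

```python
# States: "home" (first interface), "control" (control interface),
# "function" (function interface), "shortcut" (shortcut function interface).
# Gestures are numbered as in the text.

TRANSITIONS = {
    ("home",     "first"):  "control",   # 1104: show control interface
    ("control",  "third"):  "control",   # 1106: switch selected entry
    ("control",  "second"): "function",  # 1108: open function interface
    ("control",  "fourth"): "shortcut",  # 1110: open shortcut function
    ("shortcut", "fifth"):  "control",   # 1114: back to control interface
    ("control",  "sixth"):  "home",      # 1116: close control interface
}

def step(state, gesture):
    # Unrecognized gesture/state pairs leave the display unchanged.
    return TRANSITIONS.get((state, gesture), state)

s = "home"
for g in ["first", "third", "fourth", "fifth", "sixth"]:
    s = step(s, g)
print(s)  # -> home
```

The table form also shows why the step order is flexible: from the "control" state, the second, third, fourth, and sixth gestures are all legal next moves.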
Therefore, when a user uses electronic equipment with a screen, such as vehicle-mounted equipment, the application can be started and used through gestures, and the operation can be performed without touching a touch screen of the vehicle-mounted equipment. Compared with a touch control use mode, the influence on driving can be reduced, and driving safety is improved.
On the basis of the above embodiments, the present embodiment further provides a data processing method, which can be operated conveniently.
Referring to FIG. 12, a flow chart of steps of another data processing method embodiment of the present application is shown.
Step 1202, detecting control information.
Step 1204, in response to the detected first control information, displaying a control interface, the control interface displaying at least one piece of entry information.
And step 1206, responding to the detected second control information, and displaying a function interface corresponding to the selected entry information.
In this embodiment of the application, the control information may be received without contacting the electronic device, for example by detecting gestures, body motions, or voice; the control information may therefore include contactless control information. In an alternative embodiment, the control information may include at least one of gestures, body movements, and voice data, so that the user can control the device without touching it. Through detection of the control information, a corresponding interface can be displayed: for example, in response to detected first control information such as a hover gesture, a control interface is displayed; the control interface can be an interface of a plug-in, component, or control and can display at least one piece of entry information, enabling global control over the entry information. For another example, in response to detected second control information such as an upward gesture, a function interface corresponding to the selected entry information is displayed; the function interface may be a main interface of the application or system service, or a sub-interface of a plug-in or component corresponding to the application or system service, and it interacts with the application through data synchronization to provide the corresponding function of the application or system service.
In the embodiment of the present application, the control interface and the function interface are similar to those in the above embodiment, and therefore, the steps are similar to those in the above embodiment, and specific reference may be made to the description of the above embodiment.
Thus, similar to the above-described embodiment, in response to a detected third gesture such as a movement gesture, the selected entry information is switched in the control interface, left or right according to the leftward or rightward movement. In response to a detected fourth gesture, such as a gesture changing from hovering to open, the function interface of the shortcut function corresponding to the selected entry information is displayed; in response to a detected fifth gesture, such as a gesture changing from open to hovering, the display switches back from the function interface of the shortcut function to the control interface; and the control interface may be closed in response to a detected sixth gesture, such as a down gesture. The electronic device can thus be operated conveniently and quickly without being touched, bringing convenience to users.
In the embodiment of the present application, names such as the control interface and the function interface may be replaced by other names; for example, the control interface may also be referred to as a global panel, and the function interface as a plug-in panel or function panel. Schemes that, through control information such as gestures and voice, start a plug-in, component, or control to invoke a corresponding display interface and quickly execute corresponding functions all fall within the protection scope of the embodiments of the present application.
On the basis of the above embodiments, the present embodiment further provides a data processing method, which can be operated conveniently.
Referring to FIG. 13, a flowchart illustrating steps of yet another data processing method embodiment of the present application is shown.
Step 1302, detecting speech.
Step 1304, in response to the detected first voice data, displaying a control interface, the control interface displaying at least one piece of entry information.
And step 1306, responding to the detected second voice data, and displaying a function interface corresponding to the selected entry information.
By detecting voice, a corresponding interface can be displayed based on the detected voice. For example, in response to the detected first voice data, a control interface is displayed; the control interface can be an interface of a plug-in, component, or control and can display at least one piece of entry information, enabling global control over the associated applications, system services, and the like. If the second voice data is detected, the function interface corresponding to the selected entry information is displayed; the function interface can be a main interface of the corresponding application or system service, or a sub-interface of a plug-in or component corresponding to the application or system service, and it provides the corresponding function of the application or system service through data synchronization and interaction with it.
In the embodiment of the present application, the control interface and the function interface are similar to those in the above embodiment, and therefore, the steps are similar to those in the above embodiment, and specific reference may be made to the description of the above embodiment.
Therefore, similar to the above embodiment, in response to the detected third voice data, the selected entry information is switched in the control interface, for example left or right according to the voice instruction. In response to the detected fourth voice data, the function interface of the shortcut function corresponding to the selected entry information is displayed; in response to the detected fifth voice data, the display switches back from that function interface to the control interface; and in response to the detected sixth voice data, the control interface may be closed. The electronic device can thus be operated conveniently and quickly without being touched, bringing convenience to users.
On the basis of the above embodiment, the present embodiment further provides an interface control method, which can be conveniently operated.
Referring to FIG. 14, a flowchart illustrating steps of an embodiment of an interface control method of the present application is shown.
At step 1402, a first interface is displayed. The first interface includes a navigation interface. In the example shown in fig. 1A, the device begins by displaying a navigation interface.
In step 1404, in response to the detected first control information, a control interface is displayed, the control interface displaying at least one piece of entry information. The entry information includes at least one of: the entry information of the map, of the address list, of the music, of the broadcast, and of the audio book. As in the second interface example shown in fig. 1A, the control interface is displayed on the navigation interface, and various entry information is displayed on the control interface.
And step 1406, responding to the detected second control information, and displaying a function interface corresponding to the selected entry information.
The function interface of the map includes function information of at least one of: address search, favorite addresses, and nearby exploration. The function interface of the address book includes function information of at least one of: frequent contacts, call records, the address list, and the dial pad. The function interface of the music includes function information of at least one of: music lists and the player.
In the embodiment of the present application, the control information is likewise received without contacting the electronic device, for example by detecting gestures, body motions, or voice; the control information may therefore include contactless control information. In an alternative embodiment, the control information may include gestures, body movements, voice data, and the like, so that the user can control the device without touching it. Through detection of the control information, a corresponding interface can be displayed. For example, a first interface, i.e. the interface currently used by the device, such as a positioning interface or a browser interface, is displayed, and control information is detected in real time under this first interface; in response to detected first control information such as a hover gesture, a control interface is displayed, which can be an interface of a plug-in, component, or control and can display at least one piece of entry information, enabling global control over the entry information. For another example, in response to detected second control information such as an upward gesture, a function interface corresponding to the selected entry information is displayed; the function interface can be a main interface of the application, or a sub-interface of a plug-in or component corresponding to the application or system service, and it interacts with the application and system service through data synchronization to provide their corresponding functions.
In the embodiment of the present application, the first interface is similar to the control interface and the function interface in the above embodiment, so that the steps are similar to those in the above embodiment, and specific reference may be made to the description of the above embodiment.
Thus, as in the above-described embodiment, in response to a detected third gesture such as a movement gesture, the selected entry information is switched in the control interface, left or right according to the leftward or rightward movement; in response to a detected fourth gesture, such as a gesture changing from hovering to open, the function interface of the shortcut function corresponding to the selected entry information is displayed; in response to a detected fifth gesture, such as a gesture changing from open to hovering, the display switches back from the function interface of the shortcut function to the control interface; and the control interface may be closed in response to a detected sixth gesture, such as a down gesture. The electronic device can thus be operated conveniently and quickly without being touched, bringing convenience to users.
Moreover, a user can quickly execute the shortcut operation of an application based on gestures; compared with a touch mode, the operation path is shorter and the efficiency higher. An auxiliary interface can also be provided in the embodiment of the application to prompt gestures, reducing the user's memory and thinking cost, with no need to remember instructions or describe them by voice.
The user can switch application plug-ins through gestures without precise operation, which is safer; the core operation of an application can be executed quickly without entering the application, shortening the operation path and improving efficiency; and through the control, an interface can be provided for third-party applications, facilitating their access and allowing the user to set up the various required applications and functions.
It should be noted that, for simplicity of description, the method embodiments are described as a series or combination of acts, but those skilled in the art will recognize that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the embodiment. Further, those skilled in the art will also appreciate that the embodiments described in the specification are preferred embodiments, and the acts involved are not necessarily required by the embodiments of the application.
On the basis of the above embodiments, the present embodiment further provides a data processing apparatus, which is applied to electronic devices such as terminal devices and vehicle-mounted devices.
Referring to fig. 15, a block diagram of a data processing apparatus according to an embodiment of the present application is shown, which may specifically include the following modules:
a detection module 1502 is configured to detect the control information.
The control module 1504 is configured to display a control interface in response to the detected first control information, the control interface displaying at least one entry information; and to display, in response to the detected second control information, a function interface corresponding to the selected entry information.
In summary, a device can be controlled based on control information: the device detects control information, displays a control interface in response to detected first control information, the control interface displaying at least one entry information, and displays a function interface corresponding to the selected entry information in response to detected second control information. The device can thus be controlled conveniently based on the control information and provide the user with a corresponding interface; when applied to a vehicle-mounted device, the influence on driving can be reduced and driving safety improved.
Referring to fig. 16, a block diagram of an alternative embodiment of a data processing apparatus according to the present application is shown, and specifically, the data processing apparatus may include the following modules:
a setting module 1508, configured to invoke an interface of the control interface and set entry information associated with the control interface.
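A possible shape for the setting module's registration interface is sketched below: a third-party application calls the control interface's API to register its own entry information. All names here are illustrative assumptions, not the patent's actual API.

```python
class ControlInterfaceRegistry:
    """Hypothetical registry behind the setting module (1508)."""

    def __init__(self):
        self._entries = []

    def register_entry(self, name, icon, shortcut_functions):
        """Interface exposed to applications for adding entry information."""
        self._entries.append({
            "name": name,
            "icon": icon,
            "shortcuts": shortcut_functions,
        })

    def entries(self):
        """Entry information to be displayed on the control interface."""
        return list(self._entries)


# A third-party application registering its entries (illustrative values):
registry = ControlInterfaceRegistry()
registry.register_entry("map", "map.png", ["address search", "favorite address"])
registry.register_entry("music", "music.png", ["music list", "player"])
```

Exposing registration this way is what lets third-party applications access the control without the user configuring each one by hand.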
A detection module 1502 is configured to detect the control information. Wherein the control information includes: contactless control information. The control information includes: gesture and/or voice data.
The control module 1504 is configured to display a control interface in response to the detected first control information, the control interface displaying at least one entry information; and to display, in response to the detected second control information, a function interface corresponding to the selected entry information. The functional interface corresponds to an application or system service, and comprises a functional interface of a function in the application or system service.
The synchronization module 1506 is configured to receive an instruction through the functional interface, the functional interface being kept synchronized with the application or system service corresponding to it.
In an optional embodiment, the synchronization module 1506 is configured to receive an instruction through the functional interface and synchronize the instruction to the application or system service corresponding to the functional interface; and to receive a processing result corresponding to the instruction and output the processing result on the functional interface.
In another optional embodiment, the synchronization module 1506 is configured to receive the instruction through the functional interface; to execute the processing corresponding to the instruction and output the processing result on the functional interface; and to synchronize the processing result to the application or system service corresponding to the functional interface.
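The two synchronization strategies above can be sketched as follows, with a stub object standing in for the real application or system service; the class and method names, and the play/pause example, are illustrative assumptions:

```python
class StubMusicApp:
    """Stand-in for the application or system service being synchronized."""

    def __init__(self):
        self.state = "paused"

    def execute(self, instruction):
        # the app processes the instruction and returns its result
        self.state = "playing" if instruction == "play" else "paused"
        return self.state


class FunctionInterface:
    def __init__(self, app):
        self.app = app
        self.displayed = None   # what the function interface currently shows

    def forward_instruction(self, instruction):
        """Strategy 1: forward the instruction to the app, then display
        the processing result received back from it."""
        result = self.app.execute(instruction)
        self.displayed = result
        return result

    def execute_locally(self, instruction):
        """Strategy 2: execute and display locally, then synchronize the
        processing result to the corresponding app."""
        result = "playing" if instruction == "play" else "paused"
        self.displayed = result
        self.app.state = result   # push the result back to the app
        return result
```

Either way, the function interface and the underlying application end up in the same state, which is the point of the synchronization module.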
The control module 1504 is further configured to switch the selected entry information in the control interface in response to the detected third control information, wherein a switching direction corresponds to a moving direction of the third control information.
The control module 1504 is further configured to display a function interface of the shortcut function corresponding to the selected entry information in response to the detected fourth control information.
The control module 1504 is further configured to switch back to the control interface from the function interface of the shortcut function in response to the detected fifth control information.
The control module 1504 is further configured to close the control interface in response to the detected sixth control information.
In an optional embodiment, the first control information comprises a hover gesture; the control module 1504 is configured to display a control interface corresponding to the control in response to the detected hovering gesture, where the control interface displays at least one entry information.
In an optional embodiment, the second control information comprises an upward gesture; the control module 1504 is configured to determine selected entry information in response to a detected upward gesture, and to display the function interface corresponding to the selected entry information.
In an optional embodiment, the third control information comprises a movement gesture, and a movement direction of the movement gesture comprises: left or right movement; the control module 1504 is configured to switch the selected entry information in the control interface according to a moving direction of the movement gesture in response to the detected movement gesture.
In an optional embodiment, the fourth control information comprises a gesture changing from hovering to open; the control module 1504 is configured to display, in response to the gesture changing from hovering to open, a function interface of the shortcut function corresponding to the selected application plug-in.
In an optional embodiment, the fifth control information comprises a gesture changing from open to hovering; the control module 1504 is configured to close the function interface of the shortcut function and display the control interface in response to the detected gesture changing from open to hovering.
In an optional embodiment, the sixth control information comprises a downward gesture; the control module 1504 is configured to close the control interface in response to a detected downward gesture.
In an optional embodiment, the entry information includes at least one of: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book. The function interface of the map includes function information of at least one of the following functions: address search, favorite address, and perimeter exploration. The function interface of the contact list includes function information of at least one of the following functions: common contacts, call records, address list, and dial pad. The function interface of the music includes function information of at least one of the following functions: music list and player.
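A possible data model for the entries and per-entry function lists enumerated above (the contents come from this embodiment; the dictionary structure and function name are illustrative assumptions):

```python
# Entry information -> function information shown on its function interface.
FUNCTION_INTERFACES = {
    "map": ["address search", "favorite address", "perimeter exploration"],
    "contact list": ["common contacts", "call records", "address list", "dial pad"],
    "music": ["music list", "player"],
}


def functions_for(entry):
    """Return the function information for an entry; entries without an
    enumerated function list (e.g. broadcast, audio book) yield []."""
    return FUNCTION_INTERFACES.get(entry, [])
```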
On the basis of the above embodiments, the embodiments of the present application further provide an interface control device, which is applied to electronic devices such as terminal devices and vehicle-mounted devices.
Referring to fig. 17, a block diagram of an embodiment of an interface control apparatus according to the present application is shown, and specifically, the interface control apparatus may include the following modules:
a display module 1702 is configured to display a first interface.
An interface control module 1704, configured to display a control interface in response to the detected first control information, where the control interface displays at least one entry information; and to display, in response to the detected second control information, a function interface corresponding to the selected entry information.
The first interface includes a navigation interface. The entry information includes at least one of: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book. The function interface of the map includes function information of at least one of the following functions: address search, favorite address, and perimeter exploration. The function interface of the contact list includes function information of at least one of the following functions: common contacts, call records, address list, and dial pad. The function interface of the music includes function information of at least one of the following functions: music list and player.
In an alternative embodiment, the display module 1702 is configured to display a navigation interface; the interface control module 1704 is configured to display a control interface on the navigation interface in response to the detected hover gesture, where the control interface displays at least one of the following entry information: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book; and to display, in response to the detected upward gesture, a function interface corresponding to the selected entry information on the navigation interface.
In another alternative embodiment, the display module 1702 is configured to display a navigation interface; the interface control module 1704 is configured to display a control interface on the navigation interface in response to the detected hover gesture, where the control interface displays at least one of the following entry information: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book; and to switch, in response to the detected movement gesture, the selected entry information in the control interface according to the moving direction of the movement gesture.
In another alternative embodiment, the display module 1702 is configured to display a navigation interface; the interface control module 1704 is configured to display a control interface on the navigation interface in response to the detected hover gesture, where the control interface displays at least one of the following entry information: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book; and to display, in response to the gesture changing from hovering to open, a function interface of the shortcut function corresponding to the selected application plug-in. The interface control module 1704 is further configured to close the function interface of the shortcut function and display the control interface in response to the detected gesture changing from open to hovering.
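The embodiments above all keep the navigation interface on screen while the control interface and function interface are overlaid on top of it. A minimal sketch of that layering, under the assumption that interfaces are stacked as named layers (the names are illustrative, not the embodiment's implementation):

```python
class Screen:
    """Layered display: the navigation interface stays at the bottom."""

    def __init__(self):
        self.layers = ["navigation"]

    def on_gesture(self, gesture):
        if gesture == "hover" and "control" not in self.layers:
            # overlay the control interface on the navigation interface
            self.layers.append("control")
        elif gesture == "up" and "control" in self.layers:
            # function interface displayed on top, navigation still below
            self.layers.append("function")
        elif gesture == "down":
            # close the overlays; navigation was never interrupted
            self.layers = ["navigation"]
```

Because the navigation layer never leaves the stack, the driver's route guidance is uninterrupted while the shortcut functions are used.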
Taking application to a vehicle-mounted device as an example, in the process of using the vehicle-mounted device the user can start and use applications through gestures, without touching the touch screen of the vehicle-mounted device. Compared with a touch-based mode of use, the influence on driving can be reduced and driving safety improved.
Moreover, a user can quickly perform a shortcut operation of an application based on gestures; compared with a touch-based mode, the operation path is shorter and the efficiency higher. In the embodiments of the present application, an auxiliary interface may be provided to prompt the available gestures, reducing the user's memory and thinking cost; there is no need to remember instructions or to describe them by voice.
The user can switch application plug-ins through gestures without precise operation, which is safer; the core operations of an application can be executed quickly without entering the application, shortening the operation path and improving efficiency; and through the control, an interface can be provided for third-party applications, facilitating their access and allowing the user to set up the various required applications and functions.
The present application further provides a non-transitory readable storage medium storing one or more modules (programs), which, when applied to a device, cause the device to execute the instructions of the method steps in the present application.
Embodiments of the present application provide one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an electronic device to perform the methods as described in one or more of the above embodiments. In the embodiment of the present application, the electronic device includes various types of devices such as a terminal device and a server (cluster).
Embodiments of the present disclosure may be implemented as an apparatus, which may include an electronic device such as a terminal device or a server (cluster), configured as desired using any suitable hardware, firmware, software, or any combination thereof. Fig. 18 schematically illustrates an example apparatus 1800 that may be used to implement various embodiments described herein.
For one embodiment, fig. 18 shows an exemplary apparatus 1800 having one or more processors 1802, a control module (chipset) 1804 coupled to at least one of the processor(s) 1802, memory 1806 coupled to the control module 1804, non-volatile memory (NVM)/storage 1808 coupled to the control module 1804, one or more input/output devices 1810 coupled to the control module 1804, and a network interface 1812 coupled to the control module 1804.
The processor 1802 may include one or more single-core or multi-core processors, and the processor 1802 may include any combination of general-purpose or special-purpose processors (e.g., a graphics processor, an application processor, a baseband processor, etc.). In some embodiments, the apparatus 1800 can be a terminal device, a server (cluster), or the like as described in this embodiment.
In some embodiments, the apparatus 1800 may include one or more computer-readable media (e.g., the memory 1806 or the NVM/storage 1808) having instructions 1814, and one or more processors 1802, which, in conjunction with the one or more computer-readable media, are configured to execute the instructions 1814 so as to implement modules that perform the actions described in this disclosure.
For one embodiment, the control module 1804 may include any suitable interface controller to provide any suitable interface to at least one of the processor(s) 1802 and/or to any suitable device or component in communication with the control module 1804.
The control module 1804 may include a memory controller module to provide an interface to the memory 1806. The memory controller module may be a hardware module, a software module, and/or a firmware module.
The memory 1806 may be used, for example, to load and store data and/or instructions 1814 for the apparatus 1800. For one embodiment, memory 1806 may comprise any suitable volatile memory, such as suitable DRAM. In some embodiments, memory 1806 may comprise a double data rate type four synchronous dynamic random access memory (DDR4 SDRAM).
For one embodiment, control module 1804 may include one or more input/output controllers to provide an interface to NVM/storage 1808 and input/output device(s) 1810.
For example, NVM/storage 1808 may be used to store data and/or instructions 1814. NVM/storage 1808 may include any suitable non-volatile memory (e.g., flash memory) and/or may include any suitable non-volatile storage device(s) (e.g., one or more hard disk drives (HDDs), one or more compact disc (CD) drives, and/or one or more digital versatile disc (DVD) drives).
The NVM/storage 1808 may include storage resources that are physically part of the device on which the apparatus 1800 is installed, or it may be accessible by the device and may not necessarily be part of the device. For example, NVM/storage 1808 may be accessed over a network via input/output device(s) 1810.
Input/output device(s) 1810 may provide an interface for the apparatus 1800 to communicate with any other suitable device, and may include communication components, audio components, sensor components, and so forth. The network interface 1812 may provide an interface for the apparatus 1800 to communicate over one or more networks; the apparatus 1800 may communicate wirelessly with one or more components of a wireless network according to any of one or more wireless network standards and/or protocols, for example accessing a wireless network based on a communication standard such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof.
For one embodiment, at least one of the processor(s) 1802 may be packaged together with logic for one or more controller(s) (e.g., memory controller module) of the control module 1804. For one embodiment, at least one of the processor(s) 1802 may be packaged together with logic for one or more controller(s) of the control module 1804 to form a System In Package (SiP). For one embodiment, at least one of the processor(s) 1802 may be integrated on the same die with logic for one or more controller(s) of the control module 1804. For one embodiment, at least one of the processor(s) 1802 may be integrated on the same die with logic of one or more controllers of the control module 1804 to form a system on chip (SoC).
In various embodiments, the apparatus 1800 may be, but is not limited to, a terminal device such as a server, a desktop computing device, or a mobile computing device (e.g., a laptop computing device, a handheld computing device, a tablet, a netbook, etc.). In various embodiments, the apparatus 1800 may have more or fewer components and/or different architectures. For example, in some embodiments, the apparatus 1800 may include one or more cameras, a keyboard, a liquid crystal display (LCD) screen (including a touch screen display), a non-volatile memory port, multiple antennas, a graphics chip, an application-specific integrated circuit (ASIC), and speakers.
The detection device may employ a main control chip as the processor or control module; sensor data, position information, and the like may be stored in the memory or NVM/storage device; the sensor group may serve as the input/output device; and the communication interface may include the network interface.
For the device embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all such alterations and modifications as fall within the true scope of the embodiments of the application.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The foregoing describes in detail a data processing method and apparatus, an interface control method and apparatus, an electronic device and a storage medium, which are provided by the present application, and specific examples are applied herein to explain the principles and embodiments of the present application, and the descriptions of the foregoing examples are only used to help understand the method and the core ideas of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (46)

1. A method of data processing, the method comprising:
detecting control information;
responding to the detected first control information, and displaying a control interface, wherein the control interface displays at least one entry information;
and responding to the detected second control information, and displaying a function interface corresponding to the selected entry information.
2. The method of claim 1, wherein the control information comprises: contactless control information.
3. The method of claim 1, wherein the control information comprises: gesture and/or voice data.
4. The method of claim 1, wherein the functional interface corresponds to an application or system service.
5. The method of claim 1, further comprising:
receiving an instruction through the functional interface, and synchronizing the instruction to an application or system service corresponding to the functional interface;
and receiving a processing result corresponding to the instruction, and outputting the processing result on the functional interface.
6. The method of claim 1, further comprising:
receiving an instruction through the functional interface;
executing the processing corresponding to the instruction and outputting the processing on the functional interface;
and synchronizing the processing result to the application or system service corresponding to the functional interface.
7. The method of claim 1, further comprising:
and switching the selected entry information in the control interface in response to the detected third control information, wherein the switching direction corresponds to the moving direction of the third control information.
8. The method of claim 1, further comprising:
and responding to the detected fourth control information, and displaying a function interface of the shortcut function corresponding to the selected entry information.
9. The method of claim 8, further comprising:
and responding to the detected fifth control information, and switching back to the control interface from the function interface of the shortcut function.
10. The method of claim 1, further comprising:
and closing the control interface in response to the detected sixth control information.
11. The method of claim 1, further comprising:
and calling an interface of the control interface, and setting entry information associated with the control of the control interface.
12. The method of claim 1, wherein the first control information comprises a hover gesture; the displaying a control interface in response to the detected first control information includes:
displaying, in response to the detected hover gesture, a control interface corresponding to the control, wherein at least one entry information is displayed on the control interface.
13. The method of claim 1, wherein the second control information comprises an upward gesture; the displaying, in response to the detected second control information, a function interface corresponding to the selected entry information includes:
determining selected entry information in response to the detected upward gesture;
and displaying the function interface corresponding to the selected entry information.
14. The method of claim 7, wherein the third control information comprises a movement gesture, and wherein a direction of movement of the movement gesture comprises: left or right movement;
the switching selected entry information in the control interface in response to the detected third control information includes:
and switching, in response to the detected movement gesture, the selected entry information in the control interface according to the moving direction of the movement gesture.
15. The method of claim 8, wherein the fourth control information comprises a gesture changing from hovering to open;
the displaying, in response to the detected fourth control information, a function interface of the shortcut function corresponding to the selected application plug-in includes:
displaying, in response to the gesture changing from hovering to open, a function interface of the shortcut function corresponding to the selected application plug-in.
16. The method of claim 9, wherein the fifth control information comprises a gesture changing from open to hovering;
the switching back, in response to the detected fifth control information, from the function interface of the shortcut function to the control interface includes:
closing, in response to the detected gesture changing from open to hovering, the function interface of the shortcut function and displaying the control interface.
17. The method of claim 10, wherein the sixth control information comprises a downward gesture; the closing the control interface in response to the detected sixth control information includes:
closing the control interface in response to the detected downward gesture.
18. The method of claim 1, wherein the entry information comprises at least one of: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book.
19. The method of claim 18, wherein the function interface of the map includes function information for at least one of: address search, favorite address, perimeter exploration.
20. The method of claim 18, wherein the function interface of the contact list comprises function information of at least one of the following functions: common contacts, call records, address list, and dial pad.
21. The method of claim 18, wherein the functional interface of the music comprises functional information of at least one of the following functions: music list, player.
22. An interface control method, characterized in that the method comprises:
displaying a first interface;
responding to the detected first control information, and displaying a control interface, wherein the control interface displays at least one entry information;
and responding to the detected second control information, and displaying a function interface corresponding to the selected entry information.
23. The method of claim 22, wherein the first interface comprises: a navigation interface.
24. The method of claim 22, wherein the entry information comprises at least one of: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book.
25. The method of claim 22, wherein the function interface of the map includes function information for at least one of: address search, favorite address, perimeter exploration.
26. The method of claim 22, wherein the function interface of the contact list comprises function information of at least one of the following functions: common contacts, call records, address list, and dial pad.
27. The method of claim 22, wherein the functional interface of the music comprises functional information of at least one of the following functions: music list, player.
28. An interface control method, characterized in that the method comprises:
displaying a navigation interface;
in response to the detected hover gesture, displaying a control interface on the navigation interface, the control interface displaying at least one of the following entry information: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book;
and in response to the detected upward gesture, displaying a function interface corresponding to the selected entry information on the navigation interface.
29. An interface control method, characterized in that the method comprises:
displaying a navigation interface;
in response to the detected hover gesture, displaying a control interface on the navigation interface, the control interface displaying at least one of the following entry information: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book;
and in response to the detected movement gesture, switching the selected entry information in the control interface according to the moving direction of the movement gesture.
30. An interface control method, characterized in that the method comprises:
displaying a navigation interface;
in response to the detected hover gesture, displaying a control interface on the navigation interface, the control interface displaying at least one of the following entry information: entry information of a map, entry information of a contact list, entry information of music, entry information of a broadcast, and entry information of an audio book;
and in response to the gesture changing from hovering to open, displaying a function interface of the shortcut function corresponding to the selected application plug-in.
31. The method of claim 30, further comprising:
and in response to the detected gesture changing from open to hovering, closing the function interface of the shortcut function and displaying the control interface.
32. A data processing apparatus, characterized in that the apparatus comprises:
a detection module configured to detect control information; and
a control module configured to: in response to detected first control information, display a control interface, the control interface displaying at least one piece of entry information; and in response to detected second control information, display a function interface corresponding to the selected entry information.
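The apparatus of claim 32 splits responsibility between a module that turns raw input into control information and a module that updates the display. A minimal sketch of that decomposition; the classification rule mapping events to "first" and "second" control information is purely an assumption for illustration:

```python
# Hypothetical module split for claim 32.

class DetectionModule:
    def detect(self, raw_event: str) -> str:
        # Classify a raw input event into control information
        # (illustrative rule: hover -> first, anything else -> second).
        return "first" if raw_event == "hover" else "second"

class ControlModule:
    def __init__(self):
        self.displayed = "none"

    def handle(self, control_info: str) -> None:
        if control_info == "first":
            self.displayed = "control interface"   # show the entry list
        elif control_info == "second":
            self.displayed = "function interface"  # open the selected entry

detector, controller = DetectionModule(), ControlModule()
controller.handle(detector.detect("hover"))
print(controller.displayed)  # control interface
```

The same split recurs in apparatus claims 33 to 36, where a display module takes the place of the initial-interface handling and the interface control module plays the role of the control module above.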
33. An interface control apparatus, the apparatus comprising:
a display module configured to display a first interface; and
an interface control module configured to: in response to detected first control information, display a control interface, the control interface displaying at least one piece of entry information; and in response to detected second control information, display a function interface corresponding to the selected entry information.
34. An interface control apparatus, the apparatus comprising:
a display module configured to display a navigation interface; and
an interface control module configured to: in response to a detected hover gesture, display a control interface on the navigation interface, the control interface displaying at least one of the following items of entry information: entry information for maps, entry information for contacts, entry information for music, entry information for radio, and entry information for audiobooks; and in response to a detected upward gesture, display, on the navigation interface, a function interface corresponding to the selected entry information.
35. An interface control apparatus, the apparatus comprising:
a display module configured to display a navigation interface; and
an interface control module configured to: in response to a detected hover gesture, display a control interface on the navigation interface, the control interface displaying at least one of the following items of entry information: entry information for maps, entry information for contacts, entry information for music, entry information for radio, and entry information for audiobooks; and in response to a detected movement gesture, switch the selected entry information in the control interface according to the movement direction of the movement gesture.
36. An interface control apparatus, the apparatus comprising:
a display module configured to display a navigation interface; and
an interface control module configured to: in response to a detected hover gesture, display a control interface on the navigation interface, the control interface displaying at least one of the following items of entry information: entry information for maps, entry information for contacts, entry information for music, entry information for radio, and entry information for audiobooks; and in response to the gesture changing from hovering to an open gesture, display a function interface of the shortcut function corresponding to the selected application plug-in.
37. An electronic device, comprising: a processor; and
memory having stored thereon executable code which, when executed, causes the processor to perform a data processing method as claimed in one or more of claims 1-21.
38. One or more machine readable media having executable code stored thereon that, when executed, causes a processor to perform a data processing method as recited in one or more of claims 1-21.
39. An electronic device, comprising: a processor; and
memory having stored thereon executable code which, when executed, causes the processor to perform the interface control method according to one or more of claims 22-27.
40. One or more machine-readable media having executable code stored thereon that, when executed, causes a processor to perform the interface control method of one or more of claims 22-27.
41. An electronic device, comprising: a processor; and
a memory having executable code stored thereon that, when executed, causes the processor to perform the interface control method of claim 28.
42. One or more machine-readable media having executable code stored thereon that, when executed, causes a processor to perform the interface control method of claim 28.
43. An electronic device, comprising: a processor; and
a memory having executable code stored thereon that, when executed, causes the processor to perform the interface control method of claim 29.
44. One or more machine-readable media having executable code stored thereon that, when executed, causes a processor to perform the interface control method of claim 29.
45. An electronic device, comprising: a processor; and
memory having stored thereon executable code which, when executed, causes the processor to perform the interface control method according to one or more of claims 30-31.
46. One or more machine-readable media having executable code stored thereon that, when executed, causes a processor to perform the interface control method of one or more of claims 30-31.
CN201910453665.0A 2019-05-28 2019-05-28 Data processing method, interface control method, device, equipment and storage medium Pending CN112015262A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910453665.0A CN112015262A (en) 2019-05-28 2019-05-28 Data processing method, interface control method, device, equipment and storage medium


Publications (1)

Publication Number Publication Date
CN112015262A true CN112015262A (en) 2020-12-01

Family

ID=73500626

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910453665.0A Pending CN112015262A (en) 2019-05-28 2019-05-28 Data processing method, interface control method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112015262A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110107272A1 (en) * 2009-11-04 2011-05-05 Alpine Electronics, Inc. Method and apparatus for controlling and displaying contents in a user interface
US20120050185A1 (en) * 2010-09-01 2012-03-01 Anton Davydov Device, Method, and Graphical User Interface for Selecting and Using Sets of Media Player Controls
US20130076615A1 (en) * 2010-11-18 2013-03-28 Mike Iao Interface method and apparatus for inputting information with air finger gesture
CN103180812A (en) * 2011-08-31 2013-06-26 观致汽车有限公司 Interactive system for vehicle
US20140267130A1 (en) * 2013-03-13 2014-09-18 Microsoft Corporation Hover gestures for touch-enabled devices
CN105955459A (en) * 2016-04-21 2016-09-21 深圳市绿地蓝海科技有限公司 Method for controlling vehicle electronic device, and device
CN106774842A (en) * 2016-11-24 2017-05-31 中国科学技术大学 Driving-situation assistant's gesture intersection control routine
WO2018068328A1 (en) * 2016-10-14 2018-04-19 华为技术有限公司 Interface display method and terminal
DE102017214012A1 (en) * 2017-08-10 2019-02-14 Volkswagen Aktiengesellschaft Method and device for operating a navigation system of a motor vehicle
CN109718549A (en) * 2019-02-21 2019-05-07 网易(杭州)网络有限公司 The method and device of Message Processing, electronic equipment, storage medium in game



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20201223

Address after: Room 603, 6/F, Roche Plaza, 788 Cheung Sha Wan Road, Kowloon, China

Applicant after: Zebra smart travel network (Hong Kong) Limited

Address before: Fourth Floor, P.O. Box 847, Capital Building, Grand Cayman, Cayman Islands

Applicant before: Alibaba Group Holding Ltd.
