WO2024001871A1 - Control and operation method and electronic device - Google Patents

Control and operation method and electronic device

Info

Publication number
WO2024001871A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
event
interface
touch event
external input
Prior art date
Application number
PCT/CN2023/101372
Other languages
English (en)
Chinese (zh)
Inventor
何书杰
姚仕贤
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2024001871A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • Embodiments of the present application relate to the field of electronic equipment, and in particular, to a control method and electronic equipment.
  • Some pages that are not developed based on the original operating system may not be able to respond to input commands from external input devices such as keyboard and mouse devices, making it difficult for users to control electronic devices through such external input devices, especially in screen projection scenarios, which is not conducive to improving the user experience.
  • Embodiments of the present application provide a control method and an electronic device, with the purpose of enabling an external input device of the electronic device to control the electronic device on a page that is not developed based on the original operating system.
  • In a first aspect, a control method is provided, including: an electronic device receives a reported event from an external input device; the electronic device converts the reported event into a touch event; and the electronic device performs a target operation according to the touch event.
  • the electronic device can also be controlled by the external input device on pages that are not developed based on the original operating system.
  • The electronic device converting the reported event into a touch event includes: obtaining a mapping relationship between the reported event and the touch event, and converting the reported event into the touch event according to the mapping relationship.
  • the mapping relationship can be a mapping table.
  • Before the electronic device performs the target operation according to the touch event, the method further includes: the electronic device determines a target interface, where the target interface is the interface corresponding to the time when the reported event occurs. The electronic device performing the target operation according to the touch event includes: the electronic device performs the target operation according to the touch event and the target interface.
  • the electronic device can learn the target interface corresponding to the operation of the external input device, thereby dispatching events to the target interface to achieve the target operation.
  • Converting the reported event into a touch event includes: when it is determined that the target interface is a first page, converting the reported event into a touch event, where the first page is a page that does not support direct control by external input devices.
  • the first page is a web page development page.
  • When the electronic device detects that it is connected to an external input device and to a display device, the electronic device projects its screen to the display device.
  • the external input device is a mouse.
  • the electronic device is a device equipped with an Android system.
  • In a second aspect, an electronic device is provided.
  • The electronic device is connected to an external input device. The electronic device includes: a receiving unit, configured to receive reported events from the external input device; and a processing unit, configured to convert the reported events into touch events. The processing unit is further configured to perform target operations according to the touch events.
  • The processing unit is specifically configured to obtain the mapping relationship between the reported event and the touch event, and convert the reported event into a touch event according to the mapping relationship.
  • The processing unit is further configured to determine the target interface, where the target interface is the interface corresponding to the time when the reported event occurs; the processing unit is specifically configured to perform the target operation according to the touch event and the target interface.
  • The processing unit is specifically configured to: when it is determined that the target interface is the first page, convert the reported event into a touch event, where the first page is a page that does not support direct control by external input devices.
  • the first page is a web development page.
  • When the processing unit detects that the electronic device is connected to an external input device and to a display device, the electronic device projects its screen onto the display device.
  • the external input device is a mouse.
  • the electronic device is a device equipped with an Android system.
  • In a third aspect, an electronic device is provided, including: one or more processors; a memory; and one or more computer programs, where the one or more computer programs are stored in the memory and include instructions. When the instructions are executed by the electronic device, the electronic device is caused to execute the control method in any of the possible implementations of the first aspect.
  • In a fourth aspect, a computer storage medium is provided, which includes computer instructions. When the computer instructions are run on an electronic device, the electronic device is caused to execute the control method in any of the possible implementations of the first aspect.
  • In a fifth aspect, a computer program product is provided. When the computer program product is run on an electronic device, it causes the electronic device to execute the control method in any of the possible implementations of the first aspect.
  • In a sixth aspect, a chip system is provided, which includes at least one processor. When program instructions are executed in the at least one processor, the functions of any of the possible methods of the first aspect are realized on the electronic device.
  • In a seventh aspect, a chip is provided, which includes a processor and a communication interface.
  • The communication interface is used to receive a signal and transmit the signal to the processor.
  • The processor processes the signal, so that the functions of any of the possible methods of the first aspect are performed on the electronic device.
  • Figure 1 is a schematic structural diagram of an electronic device.
  • Figure 2 is a software structure block diagram of an electronic device.
  • Figure 3 is an interface diagram of an electronic device connected to an external input device.
  • Figure 4 is another interface diagram of an electronic device connected to an external input device.
  • Figure 5 is a schematic block diagram of a control method provided by an embodiment of the present application.
  • FIG. 6 is an interface diagram of an electronic device connected to an external input device according to an embodiment of the present application.
  • FIG. 7 is another interface diagram of an electronic device connected to an external input device provided by an embodiment of the present application.
  • FIG. 8 is yet another interface diagram of an electronic device connected to an external input device provided by an embodiment of the present application.
  • FIG. 9 is a schematic structural block diagram of an electronic device provided by an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the quantity of the indicated technical features. Therefore, features defined as "first" and "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments, unless otherwise specified, "plurality" means two or more.
  • FIG. 1 shows a schematic structural diagram of an electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (SIM) card interface 195, etc.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • The processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • different processing units can be independent devices or integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • The memory in the processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated access and reduces the waiting time of the processor 110, thus improving the efficiency of the system.
  • processor 110 may include one or more interfaces.
  • Interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can separately couple the touch sensor 180K, charger, flash, camera 193, etc. through different I2C bus interfaces.
  • the processor 110 can be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100 .
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 can be coupled with the audio module 170 through the I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface to implement the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface to implement the function of answering calls through a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193, display screen 194, wireless communication module 160, audio module 170, sensor module 180, etc.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specifications. Specifically, it can be a Mini USB interface, a Micro USB interface, or a USB Type C interface, etc.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through them. This interface can also be used to connect other electronic devices, such as AR devices, etc.
  • the interface connection relationships between the modules illustrated in the embodiments of the present application are only schematic illustrations and do not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive wireless charging input through the wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142, it can also provide power to the electronic device through the power management module 141.
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • The external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, saving files such as music and videos on the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the processor 110 executes instructions stored in the internal memory 121 to execute various functional applications and data processing of the electronic device 100 .
  • The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system and at least one application program required for a function (such as a sound playback function or an image playback function).
  • the storage data area may store data created during use of the electronic device 100 (such as audio data, phone book, etc.).
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one disk storage device, flash memory device, universal flash storage (UFS), etc.
  • The electronic device 100 can implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • The touch sensor 180K is also called a "touch panel".
  • the touch sensor 180K can be disposed on the display screen 194.
  • The touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touchscreen".
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • the buttons 190 include a power button, a volume button, etc.
  • the button 190 may be a mechanical button or a touch button.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of this application takes the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 2 is a software structure block diagram of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the Android system is divided into four layers, from top to bottom: application layer, application framework layer, Android runtime and system libraries, and kernel layer.
  • the application layer can include a series of application packages.
  • the application layer can include cameras, settings, skin modules, user interface (UI), third-party applications, etc.
  • third-party applications can include gallery, calendar, calls, maps, navigation, WLAN, Bluetooth, music, video, short messages, etc.
  • the application framework layer provides an application programming interface (API) and programming framework for applications in the application layer.
  • API application programming interface
  • the application framework layer can include some predefined functions.
  • the application framework layer can include a window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can obtain the display size, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make this data accessible to applications. The data can include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, etc.
  • a view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the notification manager allows applications to display notification information in the status bar, which can be used to convey notification-type messages and can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, etc.
  • The notification manager can also present notifications that appear in the status bar at the top of the system in the form of charts or scroll bar text, such as notifications for applications running in the background, or notifications that appear on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a beep sounds, the electronic device vibrates, or the indicator light flashes.
  • Android runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • The core library contains two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in virtual machines.
  • The virtual machine executes the Java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, and garbage collection and other functions.
  • The system library can also include status monitoring service modules, etc., such as a physical status recognition module, which is used to analyze and recognize user gestures, and a sensor service module, which is used to monitor the sensor data uploaded by various sensors in the hardware layer and determine the physical state of the electronic device 100.
  • the kernel layer is the layer between hardware and software.
  • The kernel layer contains at least a display driver, a camera driver, an audio driver, a sensor driver, and an input device driver.
  • The hardware layer may include various types of sensors, such as the sensors introduced in Figure 1, including the acceleration sensor, gyroscope sensor, and touch sensor involved in the embodiments of this application.
  • The physical components involved in the electronic device 100 mainly include hardware components such as input devices, sensors, display subsystem (DSS) display chips, touch screens, and fingerprint recognition modules; kernel software layers such as the input device driver, screen management module, display driver, fingerprint driver, and accidental-touch prevention; application framework layer functions such as accidental-touch input prevention, screen control, the always on display (AOD) service, and power management; and application layer services such as specially adapted applications (camera), third-party applications, system hibernation, and AOD.
  • Web development pages may not be able to respond to input commands from external input devices such as keyboard and mouse devices; for example, they cannot recognize and respond to mouse click events. As a result, users cannot control electronic devices through external input devices, which is not conducive to improving the user experience.
  • The electronic device 100 can be controlled through its touch screen.
  • the electronic device 100 displays a red envelope interface 301
  • the red envelope interface 301 includes a control 10
  • The control 10 can be used to close the red envelope interface 301; when the tablet detects that the position corresponding to the control 10 is touched, it can exit the red envelope interface 301.
  • When the tablet is connected to an external input device, the tablet can also be controlled through the external input device, but this may be limited to applications developed based on the operating system ecosystem; for web development pages, it may not be possible to control the tablet through the external input device.
  • The pointer of the mouse 200 cannot move to the position of the exit control 10, or clicking the control 10 with the left mouse button 201 cannot exit the red envelope interface 301. Therefore, the mouse 200 cannot realize human-computer interaction, resulting in an extremely poor user experience.
  • a tablet computer displays a graphic and text page 401 , and the graphic and text page 401 can be turned over in response to detecting a finger sliding on the screen.
  • pages can generally be flipped by rolling the wheel 202 of the mouse 200.
  • Applications developed based on web pages cannot respond to mouse wheel events. Even if the electronic device 100 receives the event reported by the scroll wheel 202, the page stays on page 401 because it cannot respond.
  • the control method provided by the embodiment of the present application is mainly realized by the mutual cooperation between the external input device and one or more of the above-mentioned physical components as well as each layer of the software architecture layer of the electronic device 100 .
  • Figure 5 shows a control method 500 provided by an embodiment of the present application. Its purpose is to convert the event stream of an external input device into a touch-type event stream, so that an external input device of an electronic device, such as a keyboard and mouse device, can control the electronic device.
  • Figures 6 and 7 schematically illustrate scenarios implemented based on the control method 500.
  • the electronic device 100 is connected to the input device 200, and the electronic device 100 displays a red envelope interface 301.
  • the red envelope interface 301 includes a control 10, which is used to close the red envelope interface 301.
  • When the electronic device 100 detects that the control 10 is clicked, and the click is performed by the mouse 200, the electronic device 100 can convert the mouse click event into a touch-type click event, so that the electronic device 100 can respond to the touch-type click event and close the red envelope interface 301, thereby displaying the interface 302 shown in (b) of Figure 6.
  • In other words, the electronic device can convert the operation of clicking the control 10 with the mouse into the operation of tapping the control 10 by touch.
  • the electronic device 100 is connected to the external input device 200, and the electronic device 100 displays a graphic and text page 401.
  • the electronic device may switch pages to display page 402 .
  • When the electronic device 100 detects the scroll event stream of the mouse wheel 202, the electronic device 100 can convert the scroll event stream of the mouse wheel 202 into a touch-type sliding event stream, so that the electronic device 100 can respond to the touch-type sliding event stream and flip the graphic and text page 401, thereby displaying page 402 as shown in (b) of Figure 7.
  • The click event stream of the right mouse button can also be set to correspond to a touch-type event stream of the tablet computer, for example, the touch event stream for displaying the home page, to enrich the mouse control capabilities and improve the user experience.
  • the electronic device can cast its screen onto the display device.
  • Display devices may include but are not limited to: projection screens, smart screens, TVs, tablets, PC monitors, etc.
  • the electronic device 100 is connected to the external input device 200 and projects the screen to the display device 800 .
  • the electronic device 100 can be, for example, a mobile phone, and the external input device 200 can be, for example, a mouse.
  • the external input device 200 can be used to control the electronic device 100 .
  • the electronic device 100 displays the interface 301
  • the display device 800 displays the interface 801.
  • The interface 801 is a projection of the interface 301, so both interfaces display the control 10 and the mouse pointer, and the movement of the mouse pointer on the interface 801 is synchronized with the movement of the pointer on the interface 301.
  • The interface 801 can show the position of the mouse pointer to facilitate user manipulation, and the manipulation is synchronized with the electronic device 100. Therefore, even if the electronic device 100 does not display the interface 301, it can be manipulated through the external input device 200, thereby improving the user experience. In other words, the user can control the electronic device 100 without seeing the interface 301.
  • When the electronic device 100 detects that the control 10 is clicked, and the click is performed by the mouse 200, the electronic device 100 can convert the mouse click event into a touch-type click event, so that the electronic device 100 can respond to the touch-type click event, close the red envelope interface 301, and display the interface 302 shown in (b) of Figure 8. Since the displayed content comes from the electronic device 100, the display device 800 synchronously closes the interface 801, thereby displaying the interface 802 shown in (b) of Figure 8.
  • the electronic device is a device equipped with an Android system, such as an Android tablet or an Android phone.
  • the external input device is a mouse, but the embodiment of the present application is not limited thereto.
  • the external input device may also be a keyboard.
  • the steps for method 500 are as follows:
  • In step S510, the electronic device receives the event reported by the input device.
  • the input device may be a mouse, an integrated keyboard and mouse device, or a keyboard.
  • The reported events of the input device may include, but are not limited to: mouse clicks (including single clicks and double clicks), mouse long presses, mouse movements, and mouse wheel scrolling.
  • The mouse click can be a click of the left or right mouse button, or a click of the mouse wheel.
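  • For illustration only, the following Kotlin sketch shows one way such reported mouse events could be classified on an Android device using the platform's real MotionEvent API; the ReportedEvent type and its cases are assumptions of this description, not structures from the patent.

```kotlin
import android.view.InputDevice
import android.view.MotionEvent

// Hypothetical classification of events reported by a mouse.
sealed class ReportedEvent {
    object Click : ReportedEvent()                          // left/right/wheel button click
    data class Move(val x: Float, val y: Float) : ReportedEvent()
    data class WheelScroll(val delta: Float) : ReportedEvent()
}

fun classify(event: MotionEvent): ReportedEvent? {
    if (!event.isFromSource(InputDevice.SOURCE_MOUSE)) return null
    return when (event.actionMasked) {
        MotionEvent.ACTION_BUTTON_PRESS -> ReportedEvent.Click
        MotionEvent.ACTION_HOVER_MOVE -> ReportedEvent.Move(event.x, event.y)
        // Wheel scrolling arrives as ACTION_SCROLL with a vertical-scroll axis value.
        MotionEvent.ACTION_SCROLL ->
            ReportedEvent.WheelScroll(event.getAxisValue(MotionEvent.AXIS_VSCROLL))
        else -> null
    }
}
```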
  • the electronic device may receive the above reported event through a wired interface or Bluetooth.
  • the electronic device and the mouse are connected through Bluetooth, so that the mouse can report mouse events to the electronic device.
  • In step S520, the electronic device converts the reported event into a touch event.
  • The electronic device may determine the mapping relationship between the reported event and the touch event.
  • the electronic device performs event conversion according to the mapping relationship.
  • the mapping relationship may include: mouse clicks correspond to touch clicks, and mouse wheel scrolling corresponds to touch slides.
  • clicking the keyboard's wake key can correspond to multiple quick taps on the screen, and multiple quick taps on the screen are preset as one of the ways to wake up the device.
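  • As a minimal sketch (an assumption of this description, not the patent's code), such a mapping table could be expressed as a simple lookup; the enum names below are chosen purely for illustration.

```kotlin
// Illustrative mapping table between reported events and touch-type events.
enum class Reported { MOUSE_CLICK, MOUSE_LONG_PRESS, WHEEL_SCROLL, WAKE_KEY }
enum class TouchKind { TAP, LONG_PRESS, SLIDE, MULTI_TAP_WAKE }

val mappingTable: Map<Reported, TouchKind> = mapOf(
    Reported.MOUSE_CLICK to TouchKind.TAP,             // mouse click -> touch click
    Reported.MOUSE_LONG_PRESS to TouchKind.LONG_PRESS,
    Reported.WHEEL_SCROLL to TouchKind.SLIDE,          // wheel scroll -> touch slide
    Reported.WAKE_KEY to TouchKind.MULTI_TAP_WAKE,     // wake key -> multiple quick taps
)
```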
  • For example, the reported event of the mouse is the event stream of a mouse click.
  • The conversion process can be, for example: the detected mouse event stream action_down → action_button_press → action_move*N → action_button_release → action_up is converted into the touch-type event stream touch_move_on → touch_action_move → touch_action_up. It should be understood that the above conversion is only an illustration and does not constitute a limitation on the embodiments of the present application.
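  • The sketch below, assuming the conversion targets an Android View, shows how a mouse click could be replayed as a touch-type down/up pair using the real MotionEvent.obtain and dispatchTouchEvent APIs; it is one possible illustration, not the patent's implementation.

```kotlin
import android.os.SystemClock
import android.view.MotionEvent
import android.view.View

// Replays a mouse click at (x, y) as a touch-type tap on the given view.
fun injectTapAsTouch(view: View, x: Float, y: Float) {
    val downTime = SystemClock.uptimeMillis()
    val down = MotionEvent.obtain(downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0)
    val up = MotionEvent.obtain(downTime, downTime + 50, MotionEvent.ACTION_UP, x, y, 0)
    view.dispatchTouchEvent(down)   // touch-type "down" for the button press
    view.dispatchTouchEvent(up)     // touch-type "up" for the button release
    down.recycle()
    up.recycle()
}
```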
  • As another example, the reported event of the mouse is the scroll event stream of the wheel.
  • The conversion process can be, for example: wheel pressed → wheel rolled → wheel released, and this event stream is converted into the touch-type event stream: touch click → touch slide → touch release.
  • The reported wheel rolling event carries parameter information, such as the rolling acceleration of the wheel, and the conversion process can convert it into a corresponding sliding speed.
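  • A hedged sketch of this wheel-to-slide conversion follows; the distance scaling and the five intermediate move events that give the slide its speed are assumptions made for illustration.

```kotlin
import android.os.SystemClock
import android.view.MotionEvent
import android.view.View

// Converts a wheel scroll report into a touch-type slide on the given view.
fun injectScrollAsSlide(view: View, startX: Float, startY: Float, vScroll: Float) {
    val pixelsPerWheelStep = 120f                  // assumed wheel-delta-to-distance scale
    val distance = vScroll * pixelsPerWheelStep
    val downTime = SystemClock.uptimeMillis()
    var t = downTime
    var e = MotionEvent.obtain(downTime, t, MotionEvent.ACTION_DOWN, startX, startY, 0)
    view.dispatchTouchEvent(e); e.recycle()
    for (step in 1..5) {                           // intermediate moves define the slide speed
        t += 10
        e = MotionEvent.obtain(downTime, t, MotionEvent.ACTION_MOVE,
            startX, startY + distance * step / 5, 0)
        view.dispatchTouchEvent(e); e.recycle()
    }
    e = MotionEvent.obtain(downTime, t + 10, MotionEvent.ACTION_UP, startX, startY + distance, 0)
    view.dispatchTouchEvent(e); e.recycle()
}
```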
  • The event stream of a right mouse click can also be mapped to a touch-type long-press event stream.
  • In this way, when the user right-clicks a control with the mouse, a shortcut function menu for that control can appear.
  • Corresponding these two event streams can better adapt to the user's needs.
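  • One possible sketch of this right-click-to-long-press mapping is given below, holding the synthetic down event past the platform's long-press timeout; the extra 100 ms margin is an assumption.

```kotlin
import android.os.SystemClock
import android.view.MotionEvent
import android.view.View
import android.view.ViewConfiguration

// Replays a right-button click as a touch-type long press on the given view.
fun injectLongPressAsTouch(view: View, x: Float, y: Float) {
    val downTime = SystemClock.uptimeMillis()
    val down = MotionEvent.obtain(downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0)
    view.dispatchTouchEvent(down)
    down.recycle()
    val hold = ViewConfiguration.getLongPressTimeout() + 100L  // hold past the long-press timeout
    view.postDelayed({
        val up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(),
            MotionEvent.ACTION_UP, x, y, 0)
        view.dispatchTouchEvent(up)
        up.recycle()
    }, hold)
}
```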
  • The click event stream of the keyboard's Ctrl key can also be set to correspond to a touch-type event stream of the mobile phone, for example, the touch event stream for displaying the home page. When the mobile phone is connected to an external keyboard and detects that the user presses the Ctrl key, the event stream can be converted into a touch event stream that displays the home page, so that the phone can switch from the application page to the home page.
  • The click event of the keyboard's volume adjustment key can be converted into the click event stream of the mobile phone's virtual volume key. It should be understood that in this case, the electronic device does not need to determine the interface corresponding to the input operation.
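  • A sketch of such keyboard-side mappings, using Android's real KeyEvent key codes but with hypothetical action names, might look as follows.

```kotlin
import android.view.KeyEvent

// Hypothetical touch-type actions a keyboard report could be converted into.
sealed interface KeyTouchAction
object GoHome : KeyTouchAction                              // touch stream that shows the home page
data class VolumeKeyTap(val up: Boolean) : KeyTouchAction   // tap on the virtual volume key

fun mapKeyReport(event: KeyEvent): KeyTouchAction? = when (event.keyCode) {
    KeyEvent.KEYCODE_CTRL_LEFT, KeyEvent.KEYCODE_CTRL_RIGHT -> GoHome
    KeyEvent.KEYCODE_VOLUME_UP -> VolumeKeyTap(up = true)
    KeyEvent.KEYCODE_VOLUME_DOWN -> VolumeKeyTap(up = false)
    else -> null                                            // keys without a preset mapping pass through
}
```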
  • The conversion can also be performed only when needed. For example, when the target interface is determined to be the first page, the reported event is converted into a touch event of the electronic device.
  • The first page is a page that does not support direct control by external input devices.
  • A collection of web development pages can be preset, and it is only necessary to determine whether the target interface is in this collection. It should be understood that this collection may also be updated subsequently.
  • By performing the conversion only when necessary, event conversion on pages that can use native control methods, and the delay it would add, is avoided, further improving the user experience.
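  • A minimal sketch of this gating check is shown below, with hypothetical page identifiers standing in for the preset collection of web development pages.

```kotlin
// Preset, updatable collection of pages that need event conversion;
// the identifiers are hypothetical examples, not real pages.
val webDevelopedPages = mutableSetOf(
    "com.example.app/RedEnvelopePage",
    "com.example.app/ArticlePage",
)

// Convert only when the target interface is in the preset collection;
// native pages keep their native control path.
fun shouldConvert(targetInterface: String): Boolean = targetInterface in webDevelopedPages
```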
  • In step S530, the electronic device performs the target operation according to the touch event.
  • The target interface of the electronic device may also be determined, where the target interface is the interface corresponding to the time when the reported event occurs.
  • the target interface can be a page based on human-computer interaction. It should be understood that after the electronic device determines the target interface, the reported event can be dispatched to the target interface.
  • For example, the electronic device 100 responds to the converted touch-type click event stream and sends the event stream to the window management module to determine the target interface, so that the control 10 of the electronic device 100 is closed and the interface 302 is displayed.
  • As another example, the electronic device 100 responds to the converted touch-type sliding event stream and sends the event stream to the window management module to determine the target interface, so that the electronic device 100 can switch the display interface and display the interface 402.
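  • Tying the sketches above together (all names remain illustrative assumptions), the overall flow of method 500 could be approximated as: classify the reported event, gate on the target interface, convert, and dispatch to the target view.

```kotlin
import android.view.MotionEvent
import android.view.View

// End-to-end sketch reusing the helper sketches above: receive (S510),
// convert (S520), and perform the target operation on the target interface (S530).
fun handleReportedEvent(event: MotionEvent, targetView: View, targetInterface: String) {
    if (!shouldConvert(targetInterface)) return      // native pages handle the event directly
    when (val report = classify(event)) {
        is ReportedEvent.Click -> injectTapAsTouch(targetView, event.x, event.y)
        is ReportedEvent.WheelScroll -> injectScrollAsSlide(targetView, event.x, event.y, report.delta)
        is ReportedEvent.Move, null -> Unit          // pointer moves need no touch conversion here
    }
}
```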
  • When the electronic device detects that it is connected to an external input device and to a display device, it can project the screen of the electronic device to the display device.
  • An embodiment of the present application also provides an electronic device.
  • the electronic device is connected to an external input device, and the external input device is used to control the electronic device.
  • the electronic device includes: a receiving module and a processing module.
  • the receiving module is used to receive the event reported by the input device.
  • the processing module is used to convert reported events into touch events.
  • the electronic device can cast the screen to the display device.
  • An embodiment of the present application also provides a computer storage medium, which includes computer instructions.
  • When the computer instructions are run on an electronic device, the electronic device is caused to execute the above method 500.
  • An embodiment of the present application also provides a computer program product which, when run on an electronic device, causes the electronic device to execute the above method 500.
  • Embodiments of the present application also provide a chip system.
  • the chip system includes at least one processor.
  • When the program instructions are executed in the at least one processor, the functions of the above method 500 on the electronic device are realized.
  • the disclosed systems, devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of units or modules is only a logical function division.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or they may be distributed to multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • If the functions are implemented in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product.
  • The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of this application.
  • The aforementioned storage media include: a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application relate to a control and operation method and an electronic device. The method includes the following steps: an electronic device receives a reported event from an external input device; the electronic device converts the reported event into a touch event; and the electronic device performs a target operation according to the touch event. By means of the control and operation method and the electronic device described in the embodiments of the present application, an external input device of an electronic device can control and operate the electronic device on a page that is not developed on the basis of a native operating system.
PCT/CN2023/101372 2022-06-29 2023-06-20 Control and operation method and electronic device WO2024001871A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210746887.3 2022-06-29
CN202210746887.3A CN117348785A (zh) 2022-06-29 2022-06-29 A control and operation method and electronic device

Publications (1)

Publication Number Publication Date
WO2024001871A1 true WO2024001871A1 (fr) 2024-01-04

Family

ID=89354492

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/101372 WO2024001871A1 (fr) 2022-06-29 2023-06-20 Control and operation method and electronic device

Country Status (2)

Country Link
CN (1) CN117348785A (fr)
WO (1) WO2024001871A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111840990A (zh) * 2020-07-21 2020-10-30 联想(北京)有限公司 Input control method and apparatus, and electronic device
CN114281288A (zh) * 2021-12-10 2022-04-05 海宁奕斯伟集成电路设计有限公司 Screen projection processing method and apparatus, and electronic device


Also Published As

Publication number Publication date
CN117348785A (zh) 2024-01-05


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23830050

Country of ref document: EP

Kind code of ref document: A1