WO2023040505A1 - Cross-device drawing*** - Google Patents

Cross-device drawing***

Info

Publication number
WO2023040505A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
texture
target area
information
user operation
Prior art date
Application number
PCT/CN2022/110186
Other languages
English (en)
French (fr)
Inventor
肖冬
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Priority to CN202280002664.4A (CN116137915A)
Publication of WO2023040505A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/044 Digitisers characterised by capacitive transducing means
    • G06F 3/0441 Digitisers using active external devices, e.g. active pens, for receiving changes in electrical potential transmitted by the digitiser, e.g. tablet driving signals
    • G06F 3/0447 Position sensing using the local deformation of sensor cells
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/001 Texturing; Colouring; Generation of texture or colour

Definitions

  • the embodiments of the present application relate to smart device technologies, and in particular to a cross-device rendering system.
  • the embodiment of the present application provides a cross-device drawing system, which can enrich the styles of handwriting displayed on electronic devices.
  • an embodiment of the present application provides a cross-device drawing system, where the system includes a first device and a second device.
  • the first device is configured to: display a first graphical interface in response to a first user operation on the display screen of the first device; display a second graphical interface, different from the first graphical interface, in response to a second user operation on the display screen of the first device; and select a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, the first target area including a first color.
  • the second device is configured to display handwriting of the first color in response to a fourth user operation on the display screen of the second device.
  • the second device is further configured to: select a second target area on the graphical interface displayed by the second device in response to a fifth user operation on the display screen of the second device, the second target area including a second color.
  • the first device is further configured to: display the handwriting of the second color in response to a sixth user operation on the display screen of the first device.
  • the first device is further configured to: in response to receiving a first instruction from a stylus, detect the third user operation on the display screen of the first device, where the first instruction is sent when the stylus detects that the stylus performs a first preset action; or, in response to detecting that the stylus performs a second preset action, detect the third user operation on the display screen of the first device; or, in response to detecting that the first device performs a third preset action, detect the third user operation on the display screen of the first device.
  • the first target area further includes a first texture.
  • the second device is further configured to display handwriting combined by the first color and the first texture in response to the fourth user operation on the display screen of the second device.
  • the first device is further configured to: acquire texture information of the first texture after the first target area is selected on the second graphical interface.
  • the first device is specifically configured to: detect whether the first target area contains the same pattern. If the first target area contains the same pattern, a screenshot of the pattern is taken to obtain an image of the pattern; vector data of the pattern is obtained based on the image of the pattern; and the vector data of the pattern is used as the texture information of the first texture. If the first target area does not contain the same pattern, a screenshot of the first target area is taken to obtain an image of the first target area; vector data of the first target area is obtained based on the image of the first target area; and the vector data of the first target area is used as the texture information of the first texture.
  • the first device is specifically configured to: divide the first target area into a plurality of grids, each grid having a first preset size; obtain a first similarity of the patterns in every two grids; and, if there is a first similarity greater than or equal to a preset similarity, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determine that the first target area contains the same pattern.
  • the first device is specifically configured to: if there is no first similarity greater than or equal to the preset similarity, or the proportion is smaller than the preset proportion, increase the size of the grids and obtain a second similarity of the patterns in every two grids after the size is increased; if there is a second similarity greater than or equal to the preset similarity, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains the same pattern; and, if there is no second similarity greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continue to increase the size of the grids until the size of the grids reaches a second preset size.
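For illustration, the following is a minimal sketch of the texture-extraction logic described in the three paragraphs above. The similarity metric, the concrete grid sizes and thresholds, the grid-growth factor, and the `vectorize` step are placeholder assumptions, not details specified by the application.

```python
import numpy as np

def detect_same_pattern(area, similarity_fn, first_size=16, max_size=128,
                        preset_similarity=0.9, preset_proportion=0.8):
    """Check whether `area` (an H x W x C image array) contains the same
    repeating pattern, by comparing grid cells at growing grid sizes."""
    size = first_size  # first preset size (assumed value)
    while size <= max_size:  # max_size plays the role of the second preset size
        h, w = area.shape[:2]
        cells = [area[r:r + size, c:c + size]
                 for r in range(0, h - size + 1, size)
                 for c in range(0, w - size + 1, size)]
        if len(cells) < 2:
            break
        # Similarity of the patterns in every two grids.
        sims = [similarity_fn(cells[i], cells[j])
                for i in range(len(cells)) for j in range(i + 1, len(cells))]
        high = [s for s in sims if s >= preset_similarity]
        if high and len(high) / len(sims) >= preset_proportion:
            return True, cells[0]  # one grid cell represents the pattern
        size *= 2  # otherwise increase the grid size and try again
    return False, None

def get_texture_info(target_area, similarity_fn, vectorize):
    """Texture information of the first texture; `vectorize` is a
    hypothetical image-to-vector-data converter."""
    found, pattern_img = detect_same_pattern(target_area, similarity_fn)
    # Vectorize the repeated pattern if one was found, else the whole area.
    return vectorize(pattern_img if found else target_area)
```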
  • the first device is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third color.
  • the second device is further configured to: in response to the fourth user operation on the display screen of the second device, display handwriting in a color in which the first color and the third color are fused.
  • the first device is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third texture.
  • the second device is further configured to: display the handwriting of the combination of the first color and the third texture in response to the fourth user operation on the display screen of the second device.
  • the first target area further includes a first texture
  • the first device is further configured to: after the first target area is selected on the second graphical interface, display a color control and a texture control to be selected on the second graphical interface; and detect an eighth user operation of selecting the color control and/or the texture control.
  • the second device is configured to: display the handwriting of the first color and/or the first texture combination in response to the fourth user operation on the display screen of the second device.
  • the first device is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third color and a third texture; display a color control and a texture control to be selected on the second graphical interface; and detect a ninth user operation of selecting a color control and/or a texture control.
  • the second device is further configured to: display, in response to the fourth user operation on the display screen of the second device, handwriting of a combination of the first color, the first texture, the third color, and/or the third texture.
  • the embodiment of the present application provides a method for drawing handwriting.
  • the execution body of the method may be the first device or a chip in the first device.
  • the first device is used as an example for illustration below.
  • the method may include: displaying a first graphical interface in response to a first user operation on the display screen of the first device; displaying a second graphical interface, different from the first graphical interface, in response to a second user operation on the display screen of the first device; selecting a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, the first target area including a first color; and sending color information of the first color to a second device, where the color information of the first color is used to instruct the second device to display handwriting of the first color.
  • in this way, the user can use the stylus to obtain colors across devices and then draw handwriting with those colors, so that the handwriting is not restricted to the limited color palette on the second device, which enriches the styles of handwriting displayed on electronic devices and can improve user experience.
  • the method further includes: displaying handwriting of a second color in response to a sixth user operation on the display screen of the first device, where the second color is the color of a second target area selected on the graphical interface displayed by the second device.
  • before the responding to the third user operation on the display screen of the first device, the method further includes: in response to receiving a first instruction from the stylus, detecting the third user operation on the display screen of the first device, where the first instruction is sent when the stylus detects that the stylus performs a first preset action; or, in response to detecting that the stylus performs a second preset action, detecting the third user operation on the display screen of the first device; or, in response to detecting that the first device performs a third preset action, detecting the third user operation on the display screen of the first device.
  • the first target area further includes a first texture
  • the sending the color information of the first color to the second device includes: sending, to the second device, the color information of the first color and texture information of the first texture.
  • the method further includes: acquiring texture information of the first texture.
  • the acquiring the texture information of the first texture includes: detecting whether the first target area contains the same pattern; if so, taking a screenshot of the pattern to obtain an image of the pattern, obtaining vector data of the pattern based on the image of the pattern, and using the vector data of the pattern as the texture information of the first texture; if not, taking a screenshot of the first target area to obtain an image of the first target area, obtaining vector data of the first target area based on the image of the first target area, and using the vector data of the first target area as the texture information of the first texture.
  • the detecting whether the first target area contains the same pattern includes: dividing the first target area into a plurality of grids, each grid having a first preset size; obtaining a first similarity of the patterns in every two grids; and, if there is a first similarity greater than or equal to a preset similarity, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determining that the first target area contains the same pattern.
  • the method further includes: if there is no first similarity greater than or equal to the preset similarity, or the proportion is smaller than the preset proportion, increasing the size of the grids and obtaining a second similarity of the patterns in every two grids after the size is increased; if there is a second similarity greater than or equal to the preset similarity, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determining that the first target area contains the same pattern; and, if there is no second similarity greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continuing to increase the size of the grids until the size of the grids reaches a second preset size.
  • the third target area includes a third color; the method further includes fusing the first color and the third color to obtain fused color information; and the sending the color information of the first color to the second device includes: sending the fused color information to the second device, where the fused color information is used to instruct the second device to display handwriting of the color obtained by fusing the first color and the third color.
  • the third target area includes a third texture, and the method further includes fusing the first color and the third texture to obtain fusion information.
  • the sending the color information of the first color to the second device includes: sending the fusion information to the second device, where the fusion information is used to instruct the second device to display handwriting of the combination of the first color and the third texture.
  • the first target area further includes a first texture.
  • after the selecting the first target area on the second graphical interface, the method further includes: displaying a color control and a texture control to be selected on the second graphical interface; and detecting an eighth user operation of selecting the color control and/or the texture control.
  • after the detecting the eighth user operation of selecting the color control and/or the texture control, the method further includes: selecting a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third color and a third texture; displaying a color control and a texture control to be selected on the second graphical interface; detecting a ninth user operation of selecting a color control and/or a texture control; and fusing first information indicated by the eighth user operation with second information indicated by the ninth user operation to obtain fusion information, where the first information is the color information of the first color and/or the texture information of the first texture, and the second information is color information of the third color and/or texture information of the third texture.
  • the sending the color information of the first color to the second device includes: sending the fusion information to the second device, where the fusion information is used to instruct the second device to display handwriting of a combination of the first color, the first texture, the third color, and/or the third texture.
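As a rough illustration of the "fusion information" described above, the sketch below packages the selected first color and/or texture together with the third color and/or texture into one message; the payload layout and the averaging rule for two colors are assumptions for illustration, not a format defined by the application.

```python
def build_fusion_info(first_color_rgb, first_texture=None,
                      third_color_rgb=None, third_texture=None):
    """Assumed packaging of fusion information sent to the second device.
    Colors are fused by averaging their RGB components; textures (e.g.
    vector data from get_texture_info above) are simply carried along."""
    info = {"type": "fusion"}
    if third_color_rgb is not None:
        info["rgb"] = [round((a + b) / 2)
                       for a, b in zip(first_color_rgb, third_color_rgb)]
    else:
        info["rgb"] = list(first_color_rgb)
    textures = [t for t in (first_texture, third_texture) if t is not None]
    if textures:
        info["textures"] = textures
    return info
```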
  • the embodiment of the present application provides a handwriting drawing method applied to a second device, the method including: receiving information of a first color from a first device; and displaying handwriting of the first color in response to a fourth user operation on the display screen of the second device.
  • the method further includes: selecting a second target area on a graphical interface displayed by the second device in response to a fifth user operation on the display screen of the second device, the second target area including a second color; and sending color information of the second color to the first device, where the color information of the second color is used to instruct the first device to display handwriting of the second color.
  • the receiving the information of the first color from the first device includes: receiving the color information of the first color and texture information of a first texture from the first device; and, in response to the fourth user operation on the display screen of the second device, displaying handwriting of a combination of the first color and the first texture.
  • the receiving the information of the first color from the first device includes: receiving fused color information from the first device, where the fused color information is information obtained by fusing the first color of the first target area and the third color of the third target area; and, in response to the fourth user operation on the display screen of the second device, displaying handwriting of the color obtained by fusing the first color and the third color.
  • the receiving the information of the first color from the first device includes: receiving fusion information from the first device, where the fusion information is information obtained by fusing the first color and the first texture of the first target area on the first device with the third color and/or the third texture of the third target area; and, in response to the fourth user operation on the display screen of the second device, displaying handwriting of a combination of the first color, the first texture, the third color, and/or the third texture.
  • the embodiment of the present application provides a handwriting drawing method applied to a stylus, and the method includes: receiving texture information from a first device, and sending the texture information to a second device.
  • the method further includes: sending a first instruction to the first device in response to detecting that the stylus performs the first preset action, where the first instruction is used to instruct the first device to acquire texture information of the target area selected on the interface of the first device.
  • the method further includes: displaying the texture represented by the texture information.
  • the embodiment of the present application provides a handwriting drawing device, the handwriting drawing device is the first device or a chip in the first device, and the handwriting drawing device includes:
  • a display module, configured to display a first graphical interface in response to a first user operation on the display screen of the first device, and display a second graphical interface in response to a second user operation on the display screen of the first device, the second graphical interface being different from the first graphical interface.
  • a processing module configured to select a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, where the first target area includes a first color.
  • the transceiver module is configured to send the color information of the first color to the second device, and the color information of the first color is used to instruct the second device to display the handwriting of the first color.
  • the display module is further configured to display handwriting of a second color in response to a sixth user operation on the display screen of the first device, where the second color is the color of a second target area selected on the graphical interface displayed by the second device.
  • the processing module is further configured to: in response to receiving a first instruction from a stylus, detect the third user operation on the display screen of the first device, where the first instruction is sent when the stylus detects that the stylus performs a first preset action; or, in response to detecting that the stylus performs a second preset action, detect the third user operation on the display screen of the first device; or, in response to detecting that the first device performs a third preset action, detect the third user operation on the display screen of the first device.
  • the first target area further includes a first texture.
  • the transceiver module is specifically configured to send the color information of the first color and the texture information of the first texture to the second device.
  • the processing module is further configured to acquire texture information of the first texture.
  • the processing module is specifically configured to: detect whether the first target area contains the same pattern; if so, take a screenshot of the pattern to obtain an image of the pattern, acquire vector data of the pattern based on the image of the pattern, and use the vector data of the pattern as the texture information of the first texture; if not, take a screenshot of the first target area to obtain an image of the first target area, acquire vector data of the first target area based on the image of the first target area, and use the vector data of the first target area as the texture information of the first texture.
  • the processing module is specifically configured to: divide the first target area into a plurality of grids, each grid having a first preset size; obtain a first similarity of the patterns in every two grids; and, if there is a first similarity greater than or equal to a preset similarity, and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to a preset proportion, determine that the first target area contains the same pattern.
  • the processing module is specifically configured to: if there is no first similarity greater than or equal to the preset similarity, or the proportion is smaller than the preset proportion, increase the size of the grids and obtain a second similarity of the patterns in every two grids after the size is increased; if there is a second similarity greater than or equal to the preset similarity, and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains the same pattern; and, if there is no second similarity greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is smaller than the preset proportion, continue to increase the size of the grids until the size of the grids reaches a second preset size.
  • the processing module is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third color; and fuse the first color and the third color to obtain fused color information.
  • the transceiver module is specifically configured to send the fused color information to the second device, where the fused color information is used to instruct the second device to display handwriting of the color obtained by fusing the first color and the third color.
  • the processing module is further configured to: select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third texture; and fuse the first color and the third texture to obtain fusion information.
  • the transceiver module is specifically configured to send the fusion information to the second device, where the fusion information is used to instruct the second device to display handwriting of the combination of the first color and the third texture.
  • the first target area further includes a first texture.
  • the display module is further configured to display the color control and the texture control to be selected on the second graphical interface.
  • the processing module is further configured to detect an eighth user operation of selecting the color control and/or the texture control.
  • the processing module is further configured to select a third target area on the second graphical interface in response to a seventh user operation on the display screen of the first device, the third target area including a third color and a third texture.
  • the display module is further configured to display the color control and the texture control to be selected on the second graphical interface.
  • the processing module is further configured to: detect a ninth user operation of selecting a color control and/or a texture control; and fuse first information indicated by the eighth user operation with second information indicated by the ninth user operation to obtain fusion information, where the first information is the color information of the first color and/or the texture information of the first texture, and the second information is color information of the third color and/or texture information of the third texture.
  • the transceiver module is specifically configured to send the fusion information to the second device, where the fusion information is used to instruct the second device to display handwriting of a combination of the first color, the first texture, the third color, and/or the third texture.
  • the embodiment of the present application provides a handwriting drawing device, the handwriting drawing device is the second device or a chip in the second device, and the handwriting drawing device includes:
  • the transceiver module is used for receiving the information of the first color from the first device.
  • a display module configured to display the handwriting of the first color in response to a fourth user operation on the display screen of the second device.
  • the display module is further configured to select a second target area on the graphical interface displayed by the second device in response to a fifth user operation on the display screen of the second device, the second target area including a second color.
  • a transceiver module configured to send color information of the second color to the first device, and the color information of the second color is used to instruct the first device to display handwriting of the second color.
  • the transceiver module is specifically configured to receive the color information of the first color and the texture information of the first texture from the first device.
  • the display module is specifically configured to display the handwriting combined by the first color and the first texture in response to the fourth user operation on the display screen of the second device.
  • the transceiver module is specifically configured to receive fused color information from the first device, where the fused color information is information obtained by fusing the first color of the first target area on the first device and the third color of the third target area.
  • the display module is specifically configured to, in response to the fourth user operation on the display screen of the second device, display handwriting in a color obtained by fusing the first color and the third color.
  • the transceiver module is specifically configured to receive fusion information from the first device, where the fusion information is information obtained by fusing the first color of the first target area on the first device and the third texture of the third target area.
  • the display module is specifically configured to display the handwriting of the combination of the first color and the third texture in response to the fourth user operation on the display screen of the second device.
  • the transceiver module is specifically configured to receive fusion information from the first device, where the fusion information is information obtained by fusing the first color and the first texture of the first target area on the first device with the third color and/or the third texture of the third target area.
  • the display module is specifically configured to display, in response to the fourth user operation on the display screen of the second device, handwriting of a combination of the first color, the first texture, the third color, and/or the third texture.
  • the embodiment of the present application provides a handwriting drawing device, where the handwriting drawing device may be a stylus or a chip in the stylus, and the handwriting drawing device includes:
  • a transceiver module configured to receive texture information from the first device, and send texture information to the second device.
  • the handwriting drawing device further includes: a processing module, configured to detect an action of the stylus.
  • the transceiver module is further configured to send a first instruction to the first device in response to the processing module detecting that the stylus performs the first preset action, where the first instruction is used to instruct the first device to acquire texture information of the target area selected on the interface of the first device.
  • the handwriting drawing device further includes: a display module.
  • a display module configured to display the texture represented by the texture information.
  • the embodiment of the present application provides an electronic device, which may be the first device in the second aspect, the second device in the third aspect, or the stylus in the fourth aspect.
  • the electronic device may include: a processor and a memory.
  • the memory is used to store computer-executable program codes, and the program codes include instructions; when the processor executes the instructions, the instructions cause the electronic device to execute the methods in the second aspect, the third aspect, and the fourth aspect.
  • an electronic device may include a display.
  • the embodiment of the present application provides an electronic device, which may be the handwriting drawing device of the fourth aspect, or the handwriting drawing device of the fifth aspect, or the handwriting drawing device of the sixth aspect.
  • the electronic device may include a unit, module or circuit for performing the methods provided in the above second aspect, third aspect, and fourth aspect.
  • the embodiments of the present application provide a computer program product containing instructions, which, when run on a computer, cause the computer to execute the methods in the above-mentioned second aspect, third aspect, and fourth aspect.
  • the embodiment of the present application provides a computer-readable storage medium, where instructions are stored in the computer-readable storage medium, and when the instructions are run on a computer, the computer is caused to execute the methods in the above-mentioned second aspect, third aspect, and fourth aspect.
  • FIG. 1 is a schematic diagram of an interface of an existing electronic device
  • FIG. 2A is another schematic diagram of the cross-device rendering system provided by the embodiment of the present application.
  • FIG. 2B is a schematic diagram of interaction in the cross-device drawing system provided by the embodiment of the present application.
  • FIG. 2C is another schematic diagram of the cross-device rendering system provided by the embodiment of the present application.
  • FIG. 3 is a schematic diagram of interaction between an electronic device and a stylus provided in an embodiment of the present application
  • FIG. 4 is another schematic diagram of the interaction between the electronic device and the stylus provided by the embodiment of the present application.
  • FIG. 5 is a schematic diagram of changes in the variation of the capacitance sampling value at the corresponding position of the TP sensor provided by the embodiment of the present application;
  • FIG. 6 is another schematic diagram of interaction in the cross-device rendering system provided by the embodiment of the present application.
  • FIG. 7 is a schematic diagram of a scene provided by an embodiment of the present application.
  • Fig. 8 is a schematic diagram of an interface provided by the embodiment of the present application.
  • Fig. 9 is a schematic diagram of another interface provided by the embodiment of the present application.
  • FIG. 10 is another schematic diagram of interaction in the cross-device rendering system provided by the embodiment of the present application.
  • FIG. 11 is another schematic diagram of interaction in the cross-device rendering system provided by the embodiment of the present application.
  • FIG. 12 is another schematic diagram of interaction in the cross-device rendering system provided by the embodiment of the present application.
  • FIG. 13 is a schematic diagram of another scenario provided by the embodiment of the present application.
  • Fig. 14 is a schematic diagram of texture overlay provided by the embodiment of the present application.
  • Fig. 15 is another schematic diagram of texture overlay provided by the embodiment of the present application.
  • FIG. 16 is a schematic diagram of another scenario provided by the embodiment of the present application.
  • FIG. 17 is a schematic diagram of another scenario provided by the embodiment of the present application.
  • Fig. 18 is a schematic structural diagram of the first device provided in the embodiment of the present application.
  • FIG. 19 is another schematic structural diagram of the second device provided by the embodiment of the present application.
  • FIG. 1 is a schematic diagram of an interface of an existing electronic device.
  • a color picking area 10 can be displayed on the electronic device, and the color picking area includes optional color blocks 11. When the user uses a stylus to draw handwriting on the electronic device, the user can operate the stylus to select a color in the color blocks 11 to change the color of the handwriting drawn by the stylus on the electronic device.
  • however, the colors in the color picking area 10 are fixed and limited, leaving the user with few choices. It should be understood that different colors are represented by different gray scales in FIG. 1.
  • the embodiment of the present application provides a cross-device drawing system: the user can select a required color and/or texture on the first device, so that the color and/or texture is extracted to the second device and can be used directly on the second device, which can enrich the style of handwriting displayed on the electronic device and improve the user's drawing efficiency.
  • FIG. 2A is a schematic diagram of a cross-device rendering system provided by an embodiment of the present application.
  • the cross-device drawing system may include a first device and a second device. Wherein, there may be multiple second devices.
  • one second device is used here, and the first device being a mobile phone and the second device being a tablet computer (portable android device, PAD) is taken as an example for illustration.
  • the first device and the second device may be interconnected through a communication network.
  • the communication network can be, but is not limited to, a short-distance communication network such as a WI-FI hotspot network, a WI-FI point-to-point (peer-to-peer, P2P) network, a Bluetooth network, a zigbee network, or a near field communication (near field communication, NFC) network.
  • user operations on the display screen of the first device and the display screen of the second device may include but not limited to: operations on the display screen by a stylus, a user's finger, a mouse, and a keyboard.
  • multiple operations performed by the user on the display screen of the first device are represented by first, second, and so on.
  • the first user operation may include but not limited to: click, slide, long press, etc.
  • the second user operation may include but not limited to: click, slide, long press and so on.
  • the electronic device displays the first graphical interface in response to the first user operation on the display screen of the first device, and displays the second graphical interface in response to the second user operation on the display screen of the first device; this indicates that the electronic device can display different graphical interfaces, that is, the user can select the first target area on any graphical interface displayed by the electronic device.
  • for example, the first graphical interface may be the first page of a document; if the second user operation is a slide-up or page-turning operation, the second graphical interface may be the second page of the document.
  • alternatively, the first graphical interface may be the first page of a document; if the second user operation is an operation of switching to a photo album, the second graphical interface may be an interface displaying an image.
  • the third user operation may be an operation by which the user selects the first target area on the second graphical interface.
  • the third user operation may be that the user draws a circle on the display screen of the first device to delimit the first target area, and the electronic device may select the area in the circle as the first target area based on the user's operation of drawing a circle.
  • the first device may display the user's handwriting on the second graphical interface, so as to select the first target area on the second graphical interface.
  • the first device may acquire a first color of the first target area. Wherein, when the first target area has only one color, the first color is the color.
  • when the first target area has multiple colors, the first color is a fused color of the multiple colors, and the fused color may be, for example, the average of the RGB values of the multiple colors.
  • the first device may send color information of the first color to the second device.
  • the color information includes RGB values of the first color.
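As an illustration of this fusion, here is a minimal sketch that averages the RGB values of the pixels in the selected target area and packages the result as color information; the payload layout is an assumption for illustration, not a format defined by the application.

```python
import numpy as np

def first_color_info(target_area):
    """Compute the first color of the target area (H x W x 3 uint8 array).
    If the area has several colors, they are fused by averaging their
    RGB values, as described above."""
    rgb = target_area.reshape(-1, 3).mean(axis=0).round().astype(int)
    # Color information sent to the second device: here simply the RGB values.
    return {"type": "color", "rgb": rgb.tolist()}
```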
  • the fourth user operation may be an operation for the user to draw handwriting, for example, the user draws a line on the memo interface of the second device.
  • the second device may display the handwriting of the first color in response to a fourth user operation on the display screen of the second device.
  • the user can select a color on the first device, and draw handwriting in that color on the second device. It is not necessary to only use the color on the second device to draw handwriting, which can enrich the style of handwriting displayed on the electronic device.
  • in some embodiments, the user can also select a target area on the second device and use the color of that target area to draw handwriting on the first device; in other words, the cross-device drawing system provided by this embodiment of the present application may also include:
  • S205 may be performed after S204, or may be performed after referring to S201-S202 of the first device.
  • the first device and the second device in the embodiment of the present application can synchronize colors with each other, and one device can use the color of the target area on the other device to draw handwriting.
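The sketch below illustrates how such a color synchronization exchange might look over the short-distance network mentioned above; the plain TCP socket, port number, and JSON payload are assumed stand-ins for whatever transport and message format an implementation would actually use.

```python
import json
import socket

def send_color_info(peer_addr, color_info):
    """First device: push color information (see first_color_info above)
    to the peer device so it can draw handwriting in that color."""
    with socket.create_connection(peer_addr) as s:
        s.sendall(json.dumps(color_info).encode() + b"\n")

def receive_color_info(listen_port=54321):
    """Second device: wait for one color-info message and return it."""
    with socket.create_server(("", listen_port)) as srv:
        conn, _ = srv.accept()
        with conn:
            data = conn.makefile().readline()
    return json.loads(data)  # e.g. {"type": "color", "rgb": [200, 120, 40]}
```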
  • the following takes selecting a target area on the first device and drawing handwriting on the second device as an example for illustration, that is, the user selects the target area on the first device with a stylus and draws handwriting on the second device.
  • FIG. 2C is another schematic diagram of the cross-device rendering system provided by the embodiment of the present application.
  • the scene includes a first device, a second device, and a stylus.
  • the stylus and the first device may be interconnected through a communication network
  • the stylus and the second device may be interconnected through a communication network.
  • the communication network can be but not limited to: WI-FI hotspot network, WI-FI point-to-point (peer-to-peer, P2P) network, Bluetooth network, zigbee network or near field communication (near field communication, NFC) network and other short-distance Communications network.
  • the stylus can select a color on the first device, and the first device can synchronize the color to the second device through the stylus, as shown in a in FIG. 2C .
  • in another possible scenario, the stylus and the first device may be interconnected through a communication network, the stylus and the second device may be interconnected through a communication network, and the first device and the second device may also be interconnected through a communication network; the communication network may be as described above.
  • the stylus can select a texture on the first device, and the first device can directly synchronize the texture to the second device, as shown in b in FIG. 2C and in FIG. 2A above; for details, refer to the related description in FIG. 10.
  • Both the first device and the second device in the embodiment of the present application may be electronic devices including a touch screen, and the electronic devices may be called user equipment (UE), terminal, etc.
  • the electronic device may be a mobile phone, a tablet computer (portable android device, PAD), a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device, a vehicle-mounted device or wearable device, a virtual reality (virtual reality, VR) terminal device, an augmented reality (augmented reality, AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in smart home (smart home), etc.
  • the form of the electronic device is not specifically limited in the embodiment of the present application.
  • the structure of the touch screen of the second device and the first device may be the same.
  • the interaction between the stylus and the electronic device is used as an example to illustrate the interaction between the stylus and the first device and the second device respectively.
  • FIG. 3 is a schematic diagram of interaction between an electronic device and a stylus according to an embodiment of the present application.
  • the electronic device includes: a touch panel, a display panel, a graphics processing unit (graphics processing unit, GPU), an application processor (application processor, AP), and a second communication module.
  • the touch panel includes: a touch sensor (TP sensor) and a touch processing module.
  • the display panel includes: a display screen and a display IC chip (integrated circuit chip).
  • a touch panel may be understood as a touch screen of an electronic device, and may also be called a touch screen.
  • the display panel and the touch panel may be collectively referred to as a screen or a display screen.
  • the stylus includes: a micro processing unit (micro controller unit, MCU), a first communication module, a sending module (transport, TX) and a receiving module (receive, RX).
  • the first communication module and the second communication module can be a Bluetooth module, a wireless local area network module, a WI-FI module, etc., and are used to realize communication between an electronic device and a stylus.
  • the first communication module and the second communication module are not limited in this embodiment of the present application. It should be understood that the stylus and the electronic device can establish a wireless path through the first communication module and the second communication module, and the wireless path is used to transmit wireless signals.
  • the touch sensor is composed of an electrode array, and the electrode array includes a plurality of electrodes arranged in rows and columns.
  • the touch sensor is used to collect touch data, and the touch data may include: data of a stylus touching the touch screen, and data of a user touching the touch screen. Wherein, the user may touch the touch screen of the electronic device with fingers or knuckles, etc.
  • the following takes the touch data being data of a stylus touching the touch screen as an example for illustration.
  • the touch processing module is used to determine the position of the stylus on the touch screen based on the touch data collected by the touch sensor and send the position of the stylus on the touch screen to the application processor; for how the touch processing module determines the position of the stylus on the touch screen, refer to the related descriptions in FIG. 4 and FIG. 5.
  • the touch processing module may be a touch IC chip, wherein the touch IC chip may also be referred to as a touch control chip, which is represented by a touch control chip in FIG. 3 .
  • the display chip is used to control the display interface of the display screen, so that the user can see the interface of the electronic device.
  • the graphics processor is used to process and analyze the image to obtain the color and texture, and reference may be made to the relevant description in the embodiments.
  • the application processor is configured to perform corresponding operations based on the position of the stylus from the touch chip on the touch screen.
  • the MCU is respectively connected with the first communication module, the sending module and the receiving module.
  • the sending module may include: a first electrode and a driving circuit, the first electrode is connected to the driving circuit, and the driving circuit is connected to the MCU.
  • the receiving module includes a second electrode and a decoding circuit, the second electrode is connected to the decoding circuit, and the decoding circuit is connected to the MCU.
  • the MCU is used to generate a pulse width modulation (pulse width modulation, PWM) signal and send the PWM signal to the driving circuit.
  • the driving circuit can drive the first electrode to send a signal based on the PWM signal.
  • the first electrode may be referred to as a transmitting electrode (transport, TX), and the first electrode may be disposed at a position close to the tip of the stylus.
  • the second electrode is used to receive the signal from the TP sensor in the electronic device and send the signal to the decoding circuit.
  • the decoding circuit is used to decode the signal from the electronic device and send the decoded signal to the MCU.
  • the second electrode may be referred to as a receiving electrode (receive, RX). It should be understood that the signal sent by the stylus through the first electrode and the signal sent by the electronic device through the TP sensor are both square wave signals.
  • the stylus may further include: a charging module and a sensor module.
  • the charging module is used for charging the stylus.
  • the sensor module may include, but is not limited to: a pressure sensor, an acceleration sensor (accelerometer sensor, G-sensor), a gyroscope, etc., which will not be described in this embodiment of the present application.
  • the sensor module can be connected with MCU.
  • the structure of the stylus shown in FIG. 3 is an example.
  • two electrodes can be set in the stylus, one of which is TX, and the other electrode can be switched between TX and RX.
  • the number and principle of the electrodes in the stylus are not limited.
  • since the tip of the stylus is provided with an electrode and the touch sensor in the electronic device includes an electrode array, with an insulating substance (such as air or cover glass) between the tip of the stylus and the electrodes of the touch sensor, a capacitance can be formed between the tip of the stylus and the electrodes of the touch sensor. The tip of the stylus and the touch sensor in the electronic device can therefore establish a circuit connection through this capacitance, and the path between the tip of the stylus and the touch sensor in the electronic device can be called a circuit path.
  • the stylus and electronic device can interact with signals through circuit paths.
  • the first communication module in the stylus and the second communication module in the electronic device are both Bluetooth modules as an example, and a Bluetooth channel is established between the stylus and the electronic device.
  • the proximity of the tip of the stylus to the touch screen of the electronic device will cause a change in the capacitance sampling value of the TP sensor in the touch screen, and the closer the tip of the stylus is to the touch screen, the greater the variation in the capacitance sampling value of the TP sensor.
  • the variation of the capacitive sampling value at the corresponding position of the TP sensor is represented by a wave peak.
  • for example, the touch chip can use the position on the touch screen where the variation of the capacitance sampling value on the TP sensor is the largest as the position of the stylus on the touch screen; this is not described in detail in this embodiment of the present application, and related descriptions exist in the prior art. It should be understood that, in FIG. 5, black dots represent positions where the stylus touches the touch screen.
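As a rough illustration of that rule, the sketch below picks the electrode cell with the largest change in capacitance sampling value as the stylus position; the array shapes and units are assumptions for illustration only.

```python
import numpy as np

def stylus_position(baseline, sample):
    """Estimate the stylus position on the touch screen as the TP-sensor
    cell whose capacitance sampling value changed the most (the peak in
    FIG. 5). `baseline` and `sample` are 2D arrays of capacitance values."""
    delta = np.abs(sample.astype(float) - baseline.astype(float))
    row, col = np.unravel_index(np.argmax(delta), delta.shape)
    return row, col  # electrode-grid coordinates of the stylus tip
```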
  • both the first device and the second device are wirelessly connected to the stylus.
  • FIG. 6 is another schematic diagram of interaction in the cross-device rendering system provided by the embodiment of the present application. Referring to FIG. 6, the interaction process between the first device and the second device may include:
  • the first device receives a first instruction, where the first instruction is used to instruct the first device to obtain texture information of a target area selected by a stylus.
  • it should be noted that the texture information in the embodiment shown in FIG. 6 can be replaced with color information, where the color information is the color information of the first color of the target area (that is, the first target area) on the first device.
  • when the user needs to acquire a texture on the first device, the user may trigger the stylus to send the first instruction to the first device.
  • the stylus can send the first instruction to the first device through the above wireless path or circuit path.
  • the user can hold the stylus to perform a first preset action, so as to trigger the stylus to send the first instruction to the first device.
  • the stylus may detect an action of the stylus, and the stylus may send a first instruction to the first device in response to detecting that the user holds the stylus to perform a first preset action.
  • for example, a G-sensor, a gyroscope, and the like can be provided in the stylus, and the stylus can detect the action of the user holding the stylus based on the data collected by the G-sensor, the gyroscope, and the like, to determine whether the user holding the stylus performs the first preset action.
  • the first preset action may include but not limited to: shaking, drawing a circle in the air, turning the stylus upside down, and the like.
  • Turning the stylus upside down means that the tail of the stylus is closer to the ground than the tip of the stylus.
  • how the stylus detects its own action based on the data collected by the G-sensor, the gyroscope, and the like is not described in detail here.
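For illustration only, the following sketch shows one way such a "shake" preset action could be detected from accelerometer samples and used to trigger the first instruction; the threshold, window handling, and the send_first_instruction callback are assumptions, not details given by the application.

```python
import numpy as np

def is_shake(accel_samples, threshold=25.0, min_peaks=3):
    """Very rough shake detector: accel_samples is an (N, 3) array of
    accelerometer readings (m/s^2); a shake is assumed to be several
    large swings in acceleration magnitude within the window."""
    mag = np.linalg.norm(accel_samples, axis=1)
    peaks = np.sum(np.abs(np.diff(mag)) > threshold)
    return peaks >= min_peaks

def on_sensor_window(accel_samples, send_first_instruction):
    # If the user shakes the stylus (first preset action), send the first
    # instruction to the first device over the wireless or circuit path.
    if is_shake(accel_samples):
        send_first_instruction()
```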
  • a button can be set on the stylus, and the button can be a mechanical button or a touch button, which is not limited in this embodiment of the present application.
  • the stylus detects that the user operates the key, it can send the first instruction to the first device.
  • in another example, when the first device detects that the stylus performs the second preset action, the first device may determine that the first instruction is received.
  • the second preset action may be, but not limited to: double-clicking the touch screen of the first device with the stylus, or long-pressing the touch screen of the first device with the stylus.
  • the second preset action may also be: the stylus draws a preset track on the touch screen of the first device.
  • the preset track may be a preset text, a preset letter, or a preset shape.
  • the second preset action can be: the stylus draws a preset track in the preset area of the touch screen of the first device.
  • the preset area may be, but not limited to, the central area of the touch screen.
  • for example, if the first device detects that the stylus draws the preset letter m in the central area of the touch screen, it can be determined that a selection indication from the stylus is detected; if the stylus draws the letter m in another area (a non-preset area) of the touch screen, it can be determined that the stylus is writing a note, and m is displayed at the corresponding position in that other area of the touch screen.
  • the embodiment of the present application does not limit the manner in which the first device distinguishes the track when the stylus draws notes and the preset track drawn by the stylus, and the preset area is an example.
  • the interface of the first device may display a preset control, such as a "fetch texture" control, and the first device may determine to receive the first instruction in response to detecting that the stylus operates the preset control.
  • similarly, when detecting that the user performs the second preset action with a finger or a knuckle, the first device may also determine that the first instruction is received. That is to say, in this embodiment of the present application, the first instruction may also be input to the first device by the user using a finger or a knuckle.
  • the first device may detect whether the first device performs the third preset action, and the first device may determine to receive the first instruction in response to detecting that the first device performs the third preset action.
  • the third preset action may include but not limited to: shaking, drawing circles in the air, and the like.
  • S602: in response to receiving the first instruction, the first device detects a target area selected by the stylus on the interface of the first device.
  • the selection operation of the stylus on the interface of the first device may be referred to as a third user operation.
  • the "interface of the first device" in S602 may be referred to as a second graphical interface.
  • the first device may detect a target area selected by the stylus on the interface of the first device.
  • the first device can acquire the position of the stylus on the interface of the first device, so the first device can detect the target area selected by the stylus on the interface of the first device.
  • the interface of the first device may be any interface, such as an interface where the first device displays an image in the photo album of the first device, or an interface where the first device displays a webpage, which is not limited in this embodiment of the present application; in other words, in the embodiment of the present application, the stylus can select a target area on any interface of the first device.
  • the user may use a stylus to select the target area by clicking on the interface of the first device, or by drawing a preset shape.
  • the first device may use the position clicked by the stylus as the target area.
  • the first device may determine that the area within the preset shape is the target area.
  • S602 may be replaced by: the first device detects the target area selected by the user on the interface of the first device in response to receiving the first instruction.
  • the user can not only use a stylus to select a target area on the interface of the first device, but also use fingers, knuckles, etc. to select the target area on the interface of the first device.
  • the following embodiments are described by taking the first device detecting that the stylus selects the target area on the interface of the first device as an example.
  • S602 can be understood as: the touch chip in the first device can obtain the target area selected by the stylus on the interface of the first device based on the variation of the capacitance sampling value at the corresponding position of the TP sensor, and send the coordinates corresponding to the target area to the AP in the first device.
  • the first device acquires texture information of the target area.
  • the first device may determine the content in the target area based on the content displayed on the interface of the first device and the position of the target area on the interface of the first device.
  • the first device may analyze and obtain texture information of the target area based on content in the target area.
  • the AP extracts the content in the target area based on the coordinates corresponding to the target area and the content displayed on the interface of the first device.
  • the AP can send the content in the target area to the GPU, and the GPU can analyze and obtain the texture information of the target area in response to the content in the target area.
  • the AP may capture the content in the target area on the interface of the first device by taking a screenshot, and then send the screenshot containing the content in the target area to the GPU.
  • the following takes the first device as the executing entity to illustrate the process of the first device acquiring the texture information of the target area:
  • the same pattern repeated in the target area can represent the texture of the target area, so the first device can draw a grid over the target area and obtain the pattern in each grid cell to detect whether the target area contains the same pattern; if the target area contains multiple identical patterns, the first device obtains the pattern.
  • the first device may acquire the first similarity of patterns in every two grids, and detect whether there is a first similarity greater than or equal to a preset similarity. If there is a first similarity greater than or equal to the preset similarity, it indicates that the same pattern exists in the two grids of the target area.
  • the first device also needs to detect the proportion of first similarities that are greater than or equal to the preset similarity among all the first similarities; if the proportion is greater than or equal to the preset proportion, the first device determines that the target area contains the same pattern, and takes the identical pattern in the grid cells whose first similarity is greater than or equal to the preset similarity as the repeating pattern of the target area.
  • the grid drawn by the first device has a first preset size, and in an embodiment, the first preset size may be 1 pixel. If, after drawing the grid, the first device finds no first similarity greater than or equal to the preset similarity, or the proportion of first similarities greater than or equal to the preset similarity is less than the preset proportion, the first device can increase the size of the grid and continue to obtain the second similarity of the patterns in every two grids, so as to detect whether the target area contains the same pattern.
  • the first device may continue to increase the size of the grids and obtain the third similarity of the patterns in every two grids, so as to detect whether the target area contains the same pattern. This is repeated until the size of the grid reaches a second preset size; for example, the second preset size may be half of the area of the target area. The embodiment of the present application does not limit the manner in which the first device increases the grid size.
  • if the grid size reaches the second preset size and no repeating pattern is found, the first device may then determine that the target area does not contain the same pattern; a simplified sketch of this grid-based check is given below.
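The following is a simplified sketch of the grid-based check described in the preceding paragraphs. The similarity metric (normalized pixel agreement), the concrete preset similarity and proportion, and the doubling of the grid size are illustrative assumptions; the embodiment leaves these choices open, and the pairwise comparison is written for clarity rather than efficiency.

```python
# Minimal sketch of the grid-based repeating-pattern check.
from typing import Optional
import numpy as np

def cell_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity in [0, 1] between two equally sized grayscale cells."""
    diff = np.abs(a.astype(float) - b.astype(float)) / 255.0
    return 1.0 - float(diff.mean())

def find_repeating_cell(area: np.ndarray,
                        cell: int = 1,
                        max_cell: Optional[int] = None,
                        preset_similarity: float = 0.95,
                        preset_proportion: float = 0.5) -> Optional[np.ndarray]:
    """Grow the grid size until enough cell pairs look alike; return one
    repeating cell (a texture sample) or None if nothing repeats."""
    h, w = area.shape[:2]
    if max_cell is None:
        max_cell = min(h, w) // 2          # "second preset size": half of the target area
    while cell <= max_cell:
        cells = [area[r:r + cell, c:c + cell]
                 for r in range(0, h - cell + 1, cell)
                 for c in range(0, w - cell + 1, cell)]
        pair_sims = [(cell_similarity(cells[i], cells[j]), i)
                     for i in range(len(cells)) for j in range(i + 1, len(cells))]
        if pair_sims:
            good = [i for s, i in pair_sims if s >= preset_similarity]
            if good and len(good) / len(pair_sims) >= preset_proportion:
                return cells[good[0]]      # a cell belonging to a matching pair
        cell *= 2                          # one possible way of increasing the grid size
    return None                            # the target area contains no repeating pattern
```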
  • if the first device can acquire the repeating pattern of the target area, it can take a screenshot to obtain the image of the repeating pattern and obtain the scalar data of the image of the repeating pattern.
  • if the target area does not contain a repeating pattern, the first device may take a screenshot of the target area to obtain the image of the target area and thus the scalar data of the image of the target area.
  • in order to keep the texture of the target area clear when the stylus draws handwriting using the texture of the target area, the first device may convert the obtained scalar data into vector data.
  • the texture information includes the vector data; one possible raster-to-vector conversion is sketched below.
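The embodiment does not prescribe a particular raster-to-vector algorithm. As one hedged illustration, the screenshot of the repeating pattern could be thresholded and its contours traced into polygons, for example with OpenCV (an assumed tool, not part of the embodiment):

```python
# A sketch of one way to turn the screenshot ("scalar" raster data) of a
# repeating pattern into vector data: threshold the image and trace its
# contours into approximate polygons.
import cv2
import numpy as np

def raster_to_vectors(gray: np.ndarray) -> list:
    """Return a list of polygons (N x 2 point arrays) approximating the
    shapes in an 8-bit grayscale pattern image."""
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    polygons = []
    for contour in contours:
        epsilon = 0.01 * cv2.arcLength(contour, True)      # tolerance for simplification
        polygons.append(cv2.approxPolyDP(contour, epsilon, True).reshape(-1, 2))
    return polygons
```

A triangle pattern would, for instance, come back as a single three-point polygon, which scales cleanly when the handwriting is later rendered at different sizes.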
  • exemplarily, the user draws handwriting on the second device (such as a tablet computer); if the user wants to use a texture displayed on the first device (such as a mobile phone), the user shakes the stylus, and the stylus sends the first instruction to the first device, as shown in b in FIG. 7.
  • the interface of the first device is an interface displaying an image; the user uses the stylus to draw a circle on the interface, and the first device can use the area within the circle as the target area.
  • in this example, the repeating pattern in the target area is a triangle, so the texture information obtained by the first device includes: the vector data of the triangle.
  • the user may edit the texture of the target area in the first device, and the texture information obtained by the first device is the texture information after the user edits the texture of the target area.
  • the editing process may be adjusting the depth of the texture of the target area, scaling the texture of the target area, and the like.
  • the first device may display a texture editing interface, as shown in FIG. 8 .
  • a repeating pattern 81 in the target area, a depth editing control 82 , and a zoom control 83 may be displayed on the editing interface.
  • the depth editing control 82 and the zoom control 83 are represented by a progress bar.
  • the user can adjust the depth of the texture of the target area and zoom the texture of the target area by adjusting the progress bar.
  • adjusting the depth of the texture of the target area may be understood as: adjusting the contrast between the pattern of the target area and the background of the target area.
  • for example, the user can adjust the depth progress bar to reduce the contrast between the triangle and the background, and adjust the zoom progress bar to enlarge the triangle.
  • the first device may obtain the user's editing processing parameters for the texture of the target area, and correspondingly, the texture information obtained by the first device may include: editing processing parameters.
  • for example, the texture information may also include a depth parameter (80%) and a scaling parameter (120%); a sketch of applying such parameters is given below.
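As an illustration of how such editing parameters might be applied before the texture information is synchronized, the sketch below treats "depth" as a contrast factor and "zoom" as a uniform scale factor using Pillow; the mapping from progress-bar values to these factors is an assumption.

```python
# Sketch: apply the editing parameters from the editing interface to the
# repeating-pattern image. "Depth" is modeled as contrast, "zoom" as scale.
from PIL import Image, ImageEnhance

def apply_edit(pattern: Image.Image, depth: float = 0.8, zoom: float = 1.2) -> Image.Image:
    """depth < 1.0 lowers the contrast between pattern and background;
    zoom > 1.0 enlarges the pattern."""
    edited = ImageEnhance.Contrast(pattern).enhance(depth)
    new_size = (max(1, int(edited.width * zoom)), max(1, int(edited.height * zoom)))
    return edited.resize(new_size, Image.LANCZOS)

# The synchronized texture information could then carry the edited image's
# vector data together with the parameters, e.g. {"depth": 0.8, "zoom": 1.2}.
```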
  • the first device sends texture information to the stylus.
  • the stylus may include a storage module, and the storage module is used to store texture information from the first device.
  • the stylus may display the texture represented by the texture information on the display screen.
  • a stylus can display a triangle on the display screen.
  • the stylus sends texture information to the second device.
  • the stylus may send the texture information to the second device in response to receiving the texture information from the first device.
  • the tip of the stylus may be provided with a pressure sensor, and when the tip of the stylus touches the touch screen of the second device, the pressure sensor may collect pressure data.
  • the stylus may determine that the tip of the stylus touches the touch screen of the second device in response to detecting the pressure data collected by the pressure sensor, indicating that the user needs to use the stylus to operate on the second device; the stylus can then send the texture information to the second device when touching the touch screen of the second device.
  • the second device can store texture information.
  • the second device includes a texture information storage, and the second device may store the texture information in the texture information storage.
  • based on the position of the stylus on the touch screen, the second device displays the handwriting with the texture represented by the texture information.
  • the operation of drawing handwriting can be called a fourth user operation; the second device can detect the position of the stylus on the touch screen of the second device and then, based on that position, display the handwriting at the corresponding position with the texture represented by the texture information.
  • for example, if the texture represented by the texture information is a triangle, the second device can display the handwriting of the stylus in triangles based on the position of the stylus on the touch screen; refer to d in FIG. 7 and the sketch below.
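One simple way a renderer could realize "handwriting displayed with a texture" is to stamp the texture tile at each reported pen position; the sketch below is only illustrative and is not the embodiment's rendering pipeline. The placeholder tile stands in for the texture carried by the synchronized texture information.

```python
# Illustrative sketch: stamp the texture tile at each sampled stylus position
# so the stroke appears drawn "in triangles" (or whatever the tile contains).
from PIL import Image

def draw_textured_stroke(canvas: Image.Image, texture: Image.Image,
                         positions: list) -> Image.Image:
    """Paste the texture tile, centered, at every reported pen position."""
    tile = texture.convert("RGBA")
    for x, y in positions:
        canvas.paste(tile, (x - tile.width // 2, y - tile.height // 2), tile)
    return canvas

canvas = Image.new("RGBA", (400, 200), "white")
# Placeholder tile; a real tile would be rasterized from the texture's vector data.
triangle_tile = Image.new("RGBA", (12, 12), (0, 0, 0, 0))
stroke = [(20 + 5 * i, 100) for i in range(60)]   # positions reported by the touch screen
draw_textured_stroke(canvas, triangle_tile, stroke)
```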
  • the second device may display the texture represented by the texture information in the brush tool of the second device for the user to query and select.
  • a in FIG. 9 shows an interface of an application program for drawing handwriting displayed on the second device; a brush tool 91, a "back step" control 92, a "history record" control 93, an undo control 94, and the like are displayed on the interface.
  • when the second device receives the texture information, it can store a "triangle" texture in the brush tool 91; when the user operates the brush tool 91 with the stylus, the second device can display a plurality of selectable textures in the brush tool 91 (including a triangle texture 911, a square texture 912, and the like), as shown in b in FIG. 9.
  • in b in FIG. 9, the controls are represented in text; the second device may also represent the controls with pictures, symbols, etc., which is not limited in this embodiment of the present application.
  • the stylus is not shown in order to clearly show the interface changes of the second device.
  • the user can also select a texture, and edit the selected texture.
  • the second device may also display a depth edit control 82 , a zoom control 83 , a scale-according to size control 95 , an inversion control 96 , and the like.
  • when the user operates the scale-according-to-size control 95, the second device can adjust the size of the texture based on the thickness of the handwriting of the stylus; exemplarily, when the handwriting of the stylus becomes thicker, the second device can enlarge the texture.
  • the user operates the inversion control 96 to invert the foreground and background.
  • for example, if the texture is black and the background is white, then after inversion the texture is white and the background is black.
  • the user can edit the selected texture, and correspondingly, the second device can use the edited texture to display the handwriting of the stylus based on the user's editing parameters of the texture.
  • the editing processing of the texture is described as an example, and the editing processing manner of the texture is not limited in this embodiment of the present application.
  • the second device can store used textures; when the user operates the "history record" control 93 with the stylus, the second device can display the used textures, as shown in d in FIG. 9; the used textures can include squares, circles, etc. It should be understood that in the embodiment of the present application, for illustration purposes, a texture is represented by a shape.
  • the second device can query the used textures stored in the "history record" and use the last used texture to display the handwriting of the stylus; referring to c in FIG. 9, if the texture used last time by the second device is a square, the second device can display the handwriting in squares.
  • the undo control 94 is used to undo the currently drawn handwriting.
  • when the user operates the undo control 94, the second device can delete the handwriting of the stylus currently displayed in triangles, as shown in e in FIG. 9.
  • S606 may be replaced by: the second device displays the handwriting with texture represented by texture information based on the position of the user's finger or knuckle on the touch screen.
  • the user can use fingers or knuckles to draw handwriting on the touch screen of the second device.
  • the second device can detect the position of the user's fingers or knuckles on the touch screen and then, based on that position, display the handwriting with the texture represented by the texture information.
  • in the embodiment of the present application, when the user uses the stylus to draw handwriting on the second device, the user can use the stylus to select a texture on the first device, and the first device can send the texture selected by the stylus to the second device; when the stylus draws handwriting on the second device, the second device displays the handwriting with that texture.
  • in this way, the user can use the stylus to obtain a texture across devices and then use the texture to draw handwriting, so that the stylus is not limited to the limited colors on the second device when drawing handwriting; this enriches the styles of handwriting displayed on electronic devices and can improve user experience.
  • the first device can send texture information to the second device through the stylus.
  • if the first device is connected to the second device wirelessly, after the first device obtains the texture information, it can directly send the texture information to the second device.
  • in this way, the second device can also display the handwriting with the texture corresponding to the texture information; because the first device sends the texture information directly to the second device instead of transmitting it through the stylus, the transmission efficiency is higher.
  • both the first device and the second device are wirelessly connected to the stylus, and the first device and the second device are wirelessly connected.
  • S604-S605 can be replaced by S604A:
  • the first device sends texture information to the second device.
  • because the first device and the second device are connected wirelessly, after the first device obtains the texture information, it can directly send the texture information to the second device, which improves the transmission efficiency and still achieves the purpose of having the second device display the handwriting with the texture corresponding to the texture information.
  • the first device is also wirelessly connected to the third device, and/or the second device is wirelessly connected to the third device.
  • the first device may send the texture information to the third device (or send the texture information to the third device through the second device), so that the third device can also obtain and store the texture information.
  • exemplarily, when a user draws handwriting on a tablet computer or a notebook computer, the user can operate the stylus to select a target area on the mobile phone, and the mobile phone can send the texture information of the target area to the tablet computer and the notebook computer; the tablet computer and the notebook computer can then also use this texture to draw handwriting (refer to the description in S606). It should be noted that when the first device is not wirelessly connected to the third device, the second device may, in response to receiving the texture information from the first device, send the texture information to a third device wirelessly connected to the second device.
  • the synchronization of texture information can be established among the first device, the second device, and the third device, and the first device can send the texture information to the second device and the third device in response to obtaining the texture information.
  • the second device as well as the third device can update the stored texture information.
  • the first device, the second device, and the third device each include: a texture information storage.
  • the texture information stored in the texture information memory in different devices is consistent.
  • when any device acquires new texture information, it can synchronize the new texture information from its own texture information memory to the texture information memories of the other devices, so that the other devices can also use the texture information.
  • for example, the texture information obtained by the first device can be stored in the texture information memory of the first device and then synchronized from the texture information memory of the first device to the texture information memory of the second device and to the texture information memory of the third device.
  • this use of the texture information memory by the first device is one way of "synchronizing texture information from the first device to the second device and the third device"; a simplified sketch of such synchronization is given below.
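A minimal sketch of such a texture-information-memory synchronization is shown below: each device keeps a local store and pushes new entries to its connected peers. Transport, discovery, and conflict handling are omitted; the class and method names are hypothetical, not taken from the embodiment.

```python
# Sketch: a per-device "texture information memory" that pushes new entries to peers.
class TextureInfoStore:
    def __init__(self, device_name: str):
        self.device_name = device_name
        self.entries = {}        # texture_id -> attribute information (texture/color)
        self.peers = []          # connected TextureInfoStore instances on other devices

    def connect(self, peer: "TextureInfoStore") -> None:
        self.peers.append(peer)

    def put(self, texture_id: str, attribute_info: dict) -> None:
        """Store locally, then synchronize to every connected peer."""
        self.entries[texture_id] = attribute_info
        for peer in self.peers:
            peer.receive(texture_id, attribute_info)

    def receive(self, texture_id: str, attribute_info: dict) -> None:
        if self.entries.get(texture_id) != attribute_info:
            self.entries[texture_id] = attribute_info

# The first device acquires a texture and synchronizes it to the other devices.
phone, tablet, laptop = TextureInfoStore("phone"), TextureInfoStore("tablet"), TextureInfoStore("laptop")
phone.connect(tablet)
phone.connect(laptop)
phone.put("triangle", {"texture": "vector-data...", "depth": 0.8, "zoom": 1.2})
assert tablet.entries == laptop.entries == phone.entries
```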
  • for example, the user can use the stylus to select a target area on the mobile phone, and the mobile phone can send the texture information of the target area to the computer and the notebook computer; the user can then carry the notebook computer when going out and still use the texture stored on the notebook computer to draw handwriting, which improves the user experience.
  • the user can use the stylus to select not only the texture but also the color on the first device.
  • S603-S606 can be replaced by S603A-S606A:
  • the first device acquires texture information and color information of the target area.
  • the color information may include: RGB values, and the RGB values include the R value of red, the G value of green, and the B value of blue.
  • the color information and the texture information may be referred to as attribute information, that is, in the embodiment of the present application, the attribute information may include: texture information and/or color information.
  • that is, the attribute information may include: texture information, or texture information and color information.
  • the color of the texture is not considered in the above embodiment; for example, if the texture is a triangle, the above embodiment does not consider the color of the triangle.
  • in this embodiment, when the first device obtains the repeating pattern in the target area, it needs to consider the color of the repeating pattern; that is, not only must the shapes of the patterns be the same, but the colors of the patterns must also be the same.
  • the first device takes patterns with the same shape and the same color as the repeating pattern (i.e., the same pattern) of the target area.
  • the color information in the embodiment of the present application is: the color information of the repeating pattern.
  • the color information of the repeating pattern includes: the RGB value of each position in the repeating pattern.
  • if the target area does not contain a repeating pattern, the texture information of the target area is the vector data of the image of the target area (refer to the description in the above embodiment), and the color information of the target area includes: the RGB value of each position in the target area.
  • the acquisition of the color information by the first device can be understood as follows: the AP obtains the image of the content in the target area on the interface of the first device by taking a screenshot based on the coordinates corresponding to the target area, and then sends the image of the content in the target area to the GPU.
  • the GPU can extract the RGB values in the image of the content in the target area, that is, obtain the color information; a sketch of such extraction is given below.
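Illustratively, extracting the color information can be as simple as reading the RGB value of each pixel of the target-area (or repeating-pattern) image. The Pillow-based sketch below is an assumption made for clarity; the embodiment performs this step on the GPU.

```python
# Sketch: read the RGB value of every position in the screenshot of the
# target area (or of the repeating pattern).
from PIL import Image

def color_info(region: Image.Image):
    """Return a row-major grid of (R, G, B) tuples for the region."""
    rgb = region.convert("RGB")
    w, h = rgb.size
    px = rgb.load()
    return [[px[x, y] for x in range(w)] for y in range(h)]
```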
  • the first device sends texture information and color information to the stylus.
  • the stylus sends texture information and color information to the second device.
  • based on the position of the stylus on the touch screen, the second device displays the handwriting with the texture represented by the texture information and the color represented by the color information.
  • S604A-S605A may be replaced by: the first device sends the texture information and the color information to the second device.
  • in this way, the user can use the stylus to select not only a texture but also a color on the first device, so that the second device displays the handwriting of the stylus with the texture and color of the target area, which further enriches the styles of handwriting displayed on electronic devices.
  • in an embodiment, the user can select textures and/or colors in multiple areas of the interface of the first device; after the first device fuses the textures and/or colors of the multiple areas selected by the user, it synchronizes the resulting texture information and color information to the second device.
  • the interaction process between the first device and the second device may include:
  • the first device receives a first instruction, where the first instruction is used to instruct the first device to acquire texture information and/or color information of a target area selected by a touch pen.
  • that is, the first instruction is used to instruct the first device to acquire the texture information and/or color information of the target area selected by the stylus, rather than the texture information alone.
  • in response to receiving the first instruction, the first device detects the first target area and the third target area selected by the stylus on the interface of the first device.
  • the operation of selecting the first target area with the stylus on the interface of the first device may be referred to as the third user operation, and the operation of selecting the third target area with the stylus on the interface of the first device may be referred to as the seventh user operation.
  • the "interface of the first device" in S1202 may be referred to as a second graphical interface.
  • the stylus can select multiple target areas on the interface of the first device.
  • c in FIG. 7 can be replaced with a in FIG. 13.
  • the "OK" control 131 is operated.
  • the first device may determine that the user selection is complete. In this way, referring to a in FIG. 13 , the user selects the first target area 132 and the third target area 133 on the interface of the first device, wherein the user can select the "OK" control 131 to indicate that the user selection is completed.
  • the first device determines that the user has selected the first target area and the third target area in response to detecting that the user selects the "OK" control 131 .
  • the first device acquires first texture information of the first target area and third texture information of the third target area.
  • for the manner in which the first device acquires the first texture information of the first target area and the third texture information of the third target area, reference may be made to the description of the first device acquiring the texture information of the target area in S603.
  • the first device performs fusion processing on the first texture information and the third texture information to obtain fusion texture information.
  • the first device fusing the first texture information and the third texture information may be: the first device superimposes the first texture represented by the first texture information and the third texture represented by the third texture information to obtain a superimposed texture.
  • the fused texture information may include: image vector data of superimposed textures. Referring to a in FIG. 14 , if the first texture is a triangle and the third texture is a square, the superimposed texture is a square superimposed on a triangle.
  • the relative positions of the first texture and the third texture can be preset, for example, based on the order in which the user selects the target area, the third texture selected later is arranged in the preset relative position of the first texture selected earlier (such as right side, upper side, etc.).
  • for example, if the first texture is a triangle and the third texture is a square, the fused texture is: the triangle arranged on the right side of the square.
  • the first device may perform multiple kinds of fusion of the first texture and the third texture; the fusion is not limited to superimposing the first texture and the third texture, or arranging the first texture on the right side of the third texture, and the like.
  • in this case, the first device may display a variety of fused textures on the interface of the first device for the user to select; in response to detecting that the user selects a fused texture, the first device uses the texture information corresponding to the fused texture selected by the user as the fused texture information.
  • the user can customize the relative positions of the first texture and the third texture, so as to perform fusion processing on the first texture and the third texture.
  • the first device may display an editing interface, and the editing interface includes the first texture and the third texture.
  • for example, the interface shown in FIG. 8 can be replaced with the interface shown in FIG. 15. Referring to a in FIG. 15, the editing interface displays the first texture of the first target area and the third texture of the third target area.
  • the user can drag either texture (with a finger or the stylus) to change the relative positions of the first texture and the third texture; as shown in b in FIG. 15, the user can drag the square so that the square and the triangle are superimposed, and the fused texture is then the superposition of the square and the triangle. A sketch of these two fusion styles is given below.
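The two fusion styles mentioned above, superimposing one texture on the other and arranging them side by side, can be sketched as follows; Pillow compositing is an illustrative choice, not the embodiment's implementation.

```python
# Sketch: two possible ways of fusing the first texture and the third texture.
from PIL import Image

def fuse_superimpose(first: Image.Image, third: Image.Image) -> Image.Image:
    """Overlay the third texture on the first (e.g. square on triangle)."""
    base = first.convert("RGBA")
    overlay = third.convert("RGBA").resize(base.size)
    return Image.alpha_composite(base, overlay)

def fuse_side_by_side(first: Image.Image, third: Image.Image) -> Image.Image:
    """Arrange the first texture to the left of the third texture."""
    height = max(first.height, third.height)
    fused = Image.new("RGBA", (first.width + third.width, height), (0, 0, 0, 0))
    fused.paste(first.convert("RGBA"), (0, 0))
    fused.paste(third.convert("RGBA"), (first.width, 0))
    return fused
```

The fused image would then be vectorized in the same way as a single texture before it is synchronized as fused texture information.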
  • the first device synchronizes the fused texture information to the second device.
  • the first device may send the fused texture information to the stylus, and the stylus may send the fused texture information to the second device in response to receiving the fused texture information. Or, in an embodiment, the first device may directly send the fusion texture information to the second device.
  • based on the position of the stylus on the touch screen, the second device displays the handwriting with the texture represented by the fused texture information.
  • the second device can display the handwriting with the texture represented by the fused texture information (that is, the fused texture); b in FIG. 13 takes the fused texture being the superposition of a square and a triangle as an example.
  • in this embodiment of the present application, the user can use the stylus to select at least two target areas on the first device, and the first device can fuse the textures in the at least two target areas to obtain fused texture information and then synchronize the fused texture information to the second device; users can thus fuse the multiple textures they need, which can further enrich the styles of handwriting displayed on electronic devices.
  • the first device can fuse the first texture of the first target area with the third texture of the third target area.
  • the first device can also fuse the first texture of the first target area with the third color of the third target area.
  • for example, if the user likes the texture of one area in the interface displayed by the first device and the color of another area, the user may select the two areas, so that the first device fuses the texture and the color of the two areas.
  • S1203-S1206 in Figure 12 above can be replaced by S1203A-S1206A:
  • the first device acquires first texture information of the first target area and third color information of the third target area.
  • whether the first device acquires the texture information or the color information of a target area may be preset in the first device. Exemplarily, the information may be acquired in an alternating "texture, color" order: if the user selects the first target area first, the first device acquires the first texture information of the first target area; if the user then selects the third target area, the first device acquires the third color information of the third target area; and so on, so that for the target areas selected in sequence the first device obtains texture, color, texture, color....
  • in another example, the first device acquires the texture information of the first n target areas selected by the user and the color information of the last m target areas selected by the user; the manner of presetting whether to acquire the texture information or the color information is not limited in this embodiment of the present application.
  • exemplarily, n and m are both integers greater than or equal to 1.
  • the first device performs fusion processing on the first texture information and the third color information to obtain fusion information.
  • the manner in which the first device fuses the first texture information and the third color information may be as follows: the first device superimposes the color represented by the third color information on the texture represented by the first texture information to obtain a texture after the superimposed color.
  • the first device may obtain the fusion information based on the texture after color superimposition, where the fusion information includes the vector data of the texture after color superimposition; reference may be made to the relevant descriptions in the foregoing embodiments, and a sketch of superimposing a color onto a texture is given below.
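Superimposing a color onto a texture can be illustrated as recoloring the non-background pixels of the texture image with the selected RGB value; treating fully transparent pixels as background is an assumption made only for this sketch.

```python
# Sketch: superimpose the third color onto the first texture by recoloring
# its visible (non-transparent) pixels while keeping their alpha values.
from PIL import Image

def apply_color_to_texture(texture: Image.Image, rgb) -> Image.Image:
    tinted = texture.convert("RGBA")
    px = tinted.load()
    for y in range(tinted.height):
        for x in range(tinted.width):
            r, g, b, a = px[x, y]
            if a > 0:                           # keep transparency, replace the color
                px[x, y] = (rgb[0], rgb[1], rgb[2], a)
    return tinted
```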
  • the first device synchronizes the fusion information to the second device.
  • based on the position of the stylus on the touch screen, the second device displays the handwriting with the texture and color represented by the fusion information.
  • FIG. 13 can be replaced with FIG. 16.
  • the user uses the stylus to draw a circle on the interface of the first device; the first device can use the area within the circle as the first target area and acquire the first texture information of the first target area: a triangle.
  • if the user draws another circle on the interface of the first device with the stylus, the first device can use the area within that circle as the third target area and obtain the third color information of the third target area: gray (it should be understood that in the embodiments of the present application, colors are represented by gray scale).
  • the second device can then obtain the fusion information; referring to b in FIG. 16, when the user uses the stylus to draw handwriting on the second device, the second device can display the handwriting with the texture and color represented by the fusion information.
  • S1203A may also be replaced by: the first device acquires the first color information of the first target area and the third color information of the third target area.
  • S1204A may be replaced by: the first device fuses the first color information and the third color information to obtain the fused color information.
  • S1205A may be replaced by: the first device synchronizes the fused color information to the second device.
  • S1206A may be replaced by: the second device displays the handwriting in a fused color based on the position of the stylus on the touch screen.
  • in this embodiment of the present application, the color in the first target area may be called the first color, the texture in the first target area may be called the first texture, the information of the first color is called the first color information, and the information of the first texture is called the first texture information.
  • correspondingly, the color in the third target area may be called the third color, the texture in the third target area may be called the third texture, the information of the third color is called the third color information, and the information of the third texture is called the third texture information.
  • in an embodiment, when selecting a target area, the user can independently choose whether to acquire the texture information or the color information of the target area, which can improve user experience.
  • the following description will be made by taking the texture information of the first target area and the color information of the third target area selected by the user as an example.
  • S1202-S1206 above can be replaced by S1202B-S1204B:
  • in response to receiving the first instruction, the first device detects the first target area selected by the stylus on the interface of the first device.
  • in response to detecting the first target area, the first device displays a color control and a texture control to be selected.
  • the first device acquires first texture information of the first target area in response to detecting that the user operates the texture control.
  • the texture control can be understood as the first target control.
  • the selection operation of the color control and/or the texture control after the first target area is selected may be referred to as an eighth user operation.
  • the first device detects the third target area selected by the stylus on the interface of the first device.
  • in response to detecting the third target area, the first device displays a color control and a texture control to be selected.
  • the first device acquires the third color information of the third target area in response to detecting that the user operates the color control.
  • the color control can be understood as the second target control.
  • the selection operation of the color control and/or the texture control after the second target area is selected may be referred to as a ninth user operation.
  • S1205A-S1206A may also be executed.
  • FIG. 13 can be replaced with FIG. 17.
  • when the user uses the stylus to draw a circle on the interface of the first device, the first device can display a user-selectable color control 161 and texture control 162, and can use the area within the circle as the first target area.
  • the user can select the color control 161 or the texture control 162 to trigger the first device to obtain the color information or texture information of the first target area.
  • if the user selects the texture control 162, the first device may obtain the first texture information of the first target area in response to detecting that the user selects the texture control 162.
  • when the user uses the stylus to draw another circle in another area of the interface of the first device, the first device can display the user-selectable color control 161 and texture control 162 on the interface, and may use the area within that circle as the third target area.
  • the user selects the color control 161 , and the first device may acquire the color information of the third target area in response to detecting that the user selects the color control 161 .
  • in an embodiment, the user can select the color control 161 and the texture control 162 at the same time, thereby triggering the first device to acquire the first color information and the first texture information of the first target area; further, the first device may fuse the first color information and the first texture information of the first target area with the third color information (or the third texture information) of the third target area, where the texture fusion manner may refer to the related description in FIG. 12.
  • the manner of color fusion may be as follows: the first device adds the R value in the first color information and the R value in the third color information and takes the average to obtain the fused R value; similarly, the first device averages the G value in the first color information and the G value in the third color information to obtain the fused G value, and averages the B value in the first color information and the B value in the third color information to obtain the fused B value. The fused color information includes the fused R value, G value, and B value, as written out in the sketch below.
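The averaging rule just described can be written out directly:

```python
# Channel-wise average of the two selected colors, as described above.
def fuse_colors(first_rgb, third_rgb):
    """Fused color = per-channel average of the first color and the third color."""
    return tuple((a + b) // 2 for a, b in zip(first_rgb, third_rgb))

print(fuse_colors((255, 0, 0), (0, 0, 255)))  # (127, 0, 127)
```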
  • the second device can display the handwriting with the texture and color represented by the fusion texture information.
  • the above embodiments are described by taking the example in which the user selects a color and/or texture on the first device and synchronizes it to the second device for use; similarly, the user can also select a color and/or texture on the second device and synchronize it to the first device for use.
  • in this embodiment of the present application, the user can select multiple target areas on the first device; whether to acquire the color and/or texture of each target area can be preset in the first device, or the user can choose, when selecting a target area, whether to acquire the color information and/or texture information of that target area.
  • the first device then fuses the color information and/or texture information of the multiple target areas to obtain fusion information, so that the second device can display the handwriting of the stylus with the color and/or texture represented by the fusion information, which further enriches the styles of handwriting displayed on electronic devices.
  • FIG. 18 is a schematic structural diagram of a handwriting drawing device provided by an embodiment of the present application.
  • the handwriting drawing device involved in this embodiment may be the aforementioned first device, or may be a chip in the first device.
  • the handwriting drawing device may be used to perform the actions of the first device in the above method embodiments.
  • the handwriting drawing device 1800 may include: a display module 1801 , a processing module 1802 , and a transceiver module 1803 .
  • the display module 1801 is configured to display a first graphical interface in response to a first user operation on the display screen of the first device, and display a second graphical interface in response to a second user operation on the display screen of the first device.
  • the second graphical interface is different from the first graphical interface.
  • the processing module 1802 is configured to select a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, where the first target area includes the first color.
  • the transceiver module 1803 is configured to send the color information of the first color to the second device, and the color information of the first color is used to instruct the second device to display the handwriting of the first color.
  • the display module 1801 is further configured to display handwriting of a second color in response to a sixth user operation on the display screen of the first device, where the second color is the color of the second target area selected on the graphical interface displayed by the second device.
  • the processing module 1802 is further configured to: detect the third user operation on the display screen of the first device in response to receiving a first instruction from the stylus, where the first instruction is sent by the stylus in response to detecting that the stylus performs the first preset action; or, in response to detecting that the stylus performs the second preset action, detect the third user operation on the display screen of the first device; or, in response to detecting that the first device performs a third preset action, detect the third user operation on the display screen of the first device.
  • the first target area further includes a first texture.
  • the transceiver module 1803 is specifically configured to send the color information of the first color and the texture information of the first texture to the second device.
  • the processing module 1802 is further configured to acquire texture information of the first texture.
  • the processing module 1802 is specifically configured to: detect whether the first target area contains the same pattern; if so, take a screenshot of the pattern to obtain an image of the pattern, obtain the vector data of the pattern based on the image of the pattern, and take the vector data of the pattern as the texture information of the first texture; if not, take a screenshot of the first target area to obtain an image of the first target area, obtain the vector data of the first target area based on the image of the first target area, and use the vector data of the first target area as the texture information of the first texture.
  • the processing module 1802 is specifically configured to: divide the first target area into a plurality of grids, each grid having a first preset size; obtain the first similarity of the patterns in every two grids; and if there is a first similarity greater than or equal to the preset similarity and the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains the same pattern.
  • the processing module 1802 is specifically configured to: increase the size of the grids and obtain the second similarity of the patterns in every two grids after the size is increased; if there is a second similarity greater than or equal to the preset similarity and the proportion of second similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, determine that the first target area contains the same pattern; and if there is no second similarity greater than or equal to the preset similarity, or the proportion of second similarities greater than or equal to the preset similarity is less than the preset proportion, continue to increase the size of the grids until the size of the grids reaches the second preset size.
  • the processing module 1802 is further configured to: in response to a seventh user operation on the display screen of the first device, select a third target area on the second graphical interface, where the third target area includes a third color; and fuse the first color and the third color to obtain fused color information.
  • the transceiving module 1803 is specifically configured to send fusion color information to the second device, where the fusion color information is used to instruct the second device to display the handwriting of the fusion color of the first color and the third color.
  • the processing module 1802 is further configured to: in response to a seventh user operation on the display screen of the first device, select a third target area on the second graphical interface, where the third target area includes a third texture; and fuse the first color and the third texture to obtain fusion information.
  • the transceiving module 1803 is specifically configured to send fusion information to the second device, where the fusion information is used to instruct the second device to display the handwriting combined with the first color and the third texture.
  • the first target area further includes a first texture.
  • the display module 1801 is further configured to display the color control and texture control to be selected on the second graphical interface.
  • the processing module 1802 is further configured to detect an eighth user operation of selecting a color control and/or a texture control.
  • the processing module 1802 is further configured to, in response to a seventh user operation on the display screen of the first device, select a third target area on the second graphical interface, where the third target area includes a third color and a third texture.
  • the display module 1801 is further configured to display the color control and texture control to be selected on the second graphical interface.
  • the processing module 1802 is further configured to detect a ninth user operation of selecting a color control and/or a texture control, and fuse the first information indicated by the eighth user operation with the second information indicated by the ninth user operation to obtain fusion information , the first information is color information of the first color and/or texture information of the first texture, and the second information is color information of the third color and/or texture information of the third texture.
  • the transceiving module 1803 is specifically configured to send fusion information to the second device, where the fusion information is used to instruct the second device to display the handwriting of the combination of the first color, the first texture, the third color, and/or the third texture.
  • the handwriting drawing device provided in the embodiment of the present application can perform the actions of the first device in the method embodiment above, and its implementation principle and technical effect are similar, and will not be repeated here.
  • FIG. 19 is a schematic structural diagram of a handwriting drawing device provided by an embodiment of the present application.
  • the handwriting drawing device involved in this embodiment may be the aforementioned second device, or may be a chip in the second device.
  • the handwriting drawing device can be used to execute the actions of the second device in the above method embodiments.
  • the handwriting drawing device 1900 may include: a transceiver module 1901 and a display module 1902 .
  • the transceiver module 1901 is configured to receive the information of the first color from the first device.
  • the display module 1902 is configured to display handwriting of the first color in response to a fourth user operation on the display screen of the second device.
  • the display module 1902 is further configured to, in response to a fifth user operation on the display screen of the second device, select a second target area on the graphical interface displayed by the second device, where the second target area includes a second color.
  • the transceiver module 1901 is configured to send color information of a second color to the first device, and the color information of the second color is used to instruct the first device to display handwriting of the second color.
  • the transceiver module 1901 is specifically configured to receive the color information of the first color and the texture information of the first texture from the first device.
  • the display module 1902 is specifically configured to display the handwriting combined by the first color and the first texture in response to a fourth user operation on the display screen of the second device.
  • the transceiver module 1901 is specifically configured to receive fused color information from the first device, where the fused color information is information obtained by fusing the first color of the first target area and the third color of the third target area on the first device.
  • the display module 1902 is specifically configured to, in response to a fourth user operation on the display screen of the second device, display the handwriting of the fusion color of the first color and the third color.
  • the transceiver module 1901 is specifically configured to receive fusion information from the first device, where the fusion information is information obtained by fusing the first color of the first target area and the third texture of the third target area on the first device.
  • the display module 1902 is specifically configured to display the handwriting of the combination of the first color and the third texture in response to a fourth user operation on the display screen of the second device.
  • the transceiver module 1901 is specifically configured to receive fusion information from the first device, where the fusion information is information obtained by fusing the first color and the first texture of the first target area on the first device with the third color and/or the third texture of the third target area.
  • the display module 1902 is specifically configured to display the handwriting of the combination of the first color, the first texture, the third color, and/or the third texture in response to a fourth user operation on the display screen of the second device.
  • the handwriting drawing device provided in the embodiment of the present application can execute the actions of the second device in the above method embodiment, and its implementation principle and technical effect are similar, and will not be repeated here.
  • the embodiment of the present application further provides an electronic device, and the electronic device may be the first device, the second device, or the stylus in the foregoing embodiments.
  • the electronic device may include: a processor (such as a CPU), and a memory.
  • the memory may include a high-speed random-access memory (RAM), and may also include a non-volatile memory (NVM), such as at least one disk memory, in which various instructions can be stored to complete various processing functions and implement the method steps of the present application.
  • the electronic equipment involved in this application may further include: a power supply, a communication bus, and a communication port.
  • the above-mentioned communication port is used to implement connection and communication between the electronic device and other peripheral devices.
  • the memory is used to store computer-executable program codes, and the program codes include instructions; when the processor executes the instructions, the instructions cause the processor of the electronic device to perform the actions in the above-mentioned method embodiments. The implementation principles and technical effects are similar and will not be repeated here.
  • modules or components in the above embodiments may be one or more integrated circuits configured to implement the above methods, for example: one or more application-specific integrated circuits (ASIC), one or more digital signal processors (DSP), or one or more field programmable gate arrays (FPGA), etc.
  • the processing element can be a general-purpose processor, such as a central processing unit (central processing unit, CPU) or other processors that can call program codes such as control device.
  • these modules can be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • a computer program product includes one or more computer instructions.
  • computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, from one website, computer, server, or data center to another website, computer, server, or data center in a wired (such as coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (such as infrared, radio, microwave, etc.) manner.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server, a data center, etc. integrated with one or more available media.
  • available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., solid state disk (SSD)).
  • the sequence numbers of the above-mentioned processes do not imply the order of execution; the order of execution of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of this application.
  • a cross-device drawing system characterized in that the system includes a first device and a second device,
  • the first device is configured as:
  • the first target area includes a first color
  • the second device is configured as:
  • the second device is further configured to:
  • the first device is further configured to:
  • the first device is further configured to:
  • the third user operation is detected on the display screen of the first device.
  • the second device is further configured to:
  • texture information of the first texture is acquired.
  • the vector data of the first target area is used as the texture information of the first texture.
  • the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, then it is determined that the first target area contains the same pattern.
  • the third target area includes a third color
  • the second device is further configured to:
  • the second device is further configured to:
  • a color control and a texture control to be selected are displayed on the second graphical interface;
  • the second device is configured as:
  • the handwriting of the first color and/or the first texture combination is displayed.
  • the third target area includes a third color and a third texture
  • the second device is further configured to:
  • a handwriting drawing method applied to a cross-device drawing system characterized in that it is applied to a first device, and the method comprises:
  • the first target area includes a first color
  • the third user operation is detected on the display screen of the first device.
  • the sending the color information of the first color to the second device includes:
  • the vector data of the first target area is used as the texture information of the first texture.
  • the proportion of first similarities greater than or equal to the preset similarity is greater than or equal to the preset proportion, then it is determined that the first target area contains the same pattern.
  • the third target area includes a third color
  • the sending the color information of the first color to the second device includes:
  • the sending the color information of the first color to the second device includes:
  • the third target area includes a third color and a third texture
  • the first information being the color information of the first color and/or the information of the first texture Texture information
  • the second information is color information of the third color and/or texture information of the third texture
  • the sending the color information of the first color to the second device includes:
  • the fusion information is used to instruct the second device to display the handwriting of the combination of the first color, the first texture, the third color, and/or the third texture.
  • a handwriting drawing method applied to a cross-device drawing system characterized in that it is applied to a second device, and the method comprises:
  • the fused color information is fused information of the first color of the first target area and the third color of the third target area on the first device
  • fusion information is fusion information of the first color of the first target area and the third texture of the third target area on the first device
  • the fusion information being information obtained by fusing the first color and the first texture of the first target area on the first device with the third color and/or the third texture of the third target area.
  • a handwriting drawing device applied to a cross-device drawing system characterized in that,
  • a processing module configured to select a first target area on the second graphical interface in response to a third user operation on the display screen of the first device, where the first target area includes a first color
  • the transceiver module is configured to send the color information of the first color to the second device, and the color information of the first color is used to instruct the second device to display the handwriting of the first color.
  • a handwriting drawing device applied to a cross-device drawing system characterized in that,
  • a transceiver module configured to receive information of the first color from the first device
  • a display module configured to display the handwriting of the first color in response to a fourth user operation on the display screen of the second device.
  • An electronic device comprising: a processor and a memory
  • the memory stores computer-executable instructions
  • the processor executes the computer-implemented instructions stored in the memory, so that the processor performs the method according to any one of embodiments 13-30.
  • a computer-readable storage medium wherein a computer program or instruction is stored in the computer-readable storage medium, and when the computer program or instruction is executed, any one of the embodiments 13-30 can be realized. method described in the item.
  • a computer program product including computer programs or instructions, wherein when the computer program or instructions are executed by a processor, the method in any one of embodiments 13-30 is implemented.
  • a handwriting drawing method characterized in that it is applied to a first device, and the first device is wirelessly connected to at least one second device, and each of the first device and each second device includes: a texture information storage , the method includes:
  • the texture information storage of the first device is used to synchronize the attribute information of the target area to the texture of the at least one second device
  • the information storage, the attribute information of the target area in the texture information storage of the second device is used for the second device to display handwriting with the texture represented by the attribute information, or the texture and the texture represented by the attribute information Colors show handwriting.
  • the first instruction is sent by the stylus detecting that the stylus performs a first preset action, the first device and the stylus a wireless connection; or,
  • the target area is acquired based on the user's position on the touch screen of the first device.
  • the vector data of the target area is used as the texture information of the target area.
  • the target area includes a first target area and a second target area
  • the acquiring attribute information of the target area includes:
  • Fusion processing is performed on the first texture information and the second texture information to obtain fused texture information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a cross-device drawing system that includes a first device and a second device. When a user draws handwriting on the second device, the user can select a color on the first device; the first device sends the selected color to the second device, and when handwriting is drawn on the second device, the second device displays the handwriting in that color. In the embodiments of this application, a user can pick up a color across devices with a stylus and then draw handwriting in that color, so that drawing is not limited to the finite set of colors available on the second device, which enriches the styles in which electronic devices display handwriting and can improve user experience.

Description

跨设备绘制系统 技术领域
本申请实施例涉及智能设备技术，尤其涉及一种跨设备绘制系统。
背景技术
随着电子设备的发展,配置有触摸屏的电子设备与触控笔逐渐成为了绘制笔迹和操作的工具。用户可以使用触控笔在电子设备上绘制笔迹、操作电子设备的界面上显示的控件。当用户采用触控笔在电子设备上绘制笔迹时,用户可以采用触控笔选择电子设备上显示的“书写颜色”控件,改变绘制笔迹的颜色。
目前现有技术中电子设备显示笔迹的样式单一。
发明内容
本申请实施例提供一种跨设备绘制系统，可以丰富电子设备显示笔迹的样式。
第一方面，本申请实施例提供一种跨设备绘制系统，该系统包括第一设备和第二设备。
其中,所述第一设备配置为:响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面;响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色。
所述第二设备配置为:响应于在所述第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
在一种可能的实现方式中,所述第二设备还配置为:响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选中第二目标区域,所述第二目标区域中包括第二颜色。
相应的,所述第一设备还配置为:响应于在所述第一设备显示屏上的第六用户操作,显示所述第二颜色的笔迹。
在一种可能的实现方式中,所述第一设备还配置为:响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
在一种可能的实现方式中,所述第一目标区域还包括第一纹理。
所述第二设备还配置为:响应于在所述第二设备显示屏上的所述第四用户操作,显示由所述第一颜色和所述第一纹理组合的笔迹。
在一种可能的实现方式中,所述第一设备还配置为:在所述第二图形界面上选中所述第一目标区域之后,获取所述第一纹理的纹理信息。
在一种可能的实现方式中,所述第一设备具体配置为:检测所述第一目标区域是否包含相同的图案。其中,若所述第一目标区域包含相同的图案,则对所述图案进行截图,得到所述图案的图像;基于所述图案的图像,获取所述图案的矢量数据;将所述图案的矢量数据作 为所述第一纹理的纹理信息。其中,若所述第一目标区域未包含相同的图案,则对所述目标区域进行截图,得到所述第一目标区域的图像;基于所述目标区域的图像,获取所述第一目标区域的矢量数据;将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
在一种可能的实现方式中,所述第一设备具体配置为:将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;获取每两个网格中的图案的第一相似度;若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
在一种可能的实现方式中,所述第一设备具体配置为:若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
在一种可能的实现方式中,所述第一设备还配置为:响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色。
所述第二设备还配置为:响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中,所述第一设备还配置为:响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理。
相应的,所述第二设备还配置为:响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三纹理组合的笔迹。
在一种可能的实现方式中,所述第一目标区域还包括第一纹理,所述第一设备还配置为:在所述第二图形界面上选中第一目标区域之后,在所述第二图形界面上显示待选择的颜色控件和纹理控件;检测对所述颜色控件和/或纹理控件的选择的第八用户操作。
相应的,所述第二设备配置为:响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和/或所述第一纹理组合的笔迹。
在一种可能的实现方式中,所述第一设备还配置为:响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理;在所述第二图形界面上显示待选择的颜色控件和纹理控件;检测对颜色控件和/或纹理控件的选择的第九用户操作。
相应的,所述第二设备还配置为:响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
第二方面,本申请实施例提供一种笔迹绘制方法,该方法的执行主体可以为第一设备或第一设备中的芯片,下述以第一设备为例进行说明,该方法可以包括:响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面;响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色;向第二设备发送所述第一颜色的颜色信息,所述第一颜色的颜色信息 用于指示所述第二设备显示所述第一颜色的笔迹。
本申请实施例中,用户可以采用触控笔跨设备获取颜色,进而使用该颜色绘制笔迹,使得绘制笔迹时不局限于第二设备上有限的颜色,丰富了电子设备显示笔迹的样式,可以提高用户体验。
在一种可能的实现方式中,所述方法还包括:响应于在所述第一设备显示屏上的第六用户操作,显示第二颜色的笔迹,所述第二颜色为在所述第二设备的图形界面上选中的第二目标区域的颜色。
在一种可能的实现方式中,所述响应于在所述第一设备显示屏上的第三用户操作之前,还包括:响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
在一种可能的实现方式中,所述第一目标区域还包括第一纹理;所述向第二设备发送所述第一颜色的颜色信息,包括:向所述第二设备发送所述第一颜色的颜色信息和所述第一纹理的纹理信息。
在一种可能的实现方式中,所述在所述第二图形界面上选中第一目标区域之后,还包括:获取所述第一纹理的纹理信息。
在一种可能的实现方式中,所述获取所述第一纹理的纹理信息,包括:检测所述第一目标区域是否包含相同的图案;若是,则对所述图案进行截图,得到所述图案的图像;基于所述图案的图像,获取所述图案的矢量数据;将所述图案的矢量数据作为所述第一纹理的纹理信息;若否,则对所述第一目标区域进行截图,得到所述第一目标区域的图像;基于所述第一目标区域的图像,获取所述第一目标区域的矢量数据;将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
在一种可能的实现方式中,所述检测所述第一目标区域是否包含相同的图案,包括:将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;获取每两个网格中的图案的第一相似度;若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
在一种可能的实现方式中,所述方法还包括:若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
在一种可能的实现方式中,所述在所述第二图形界面上选中第一目标区域之后,还包括:响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色;融合所述第一颜色和所述第三颜色,得到融合颜色信息;所述向第二设备发送所述第一颜色的颜色信息,包括:向所述第二设备发送所述融合颜色信息,所述融合颜色信息用于指示所述第二设备显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中,所述在所述第二图形界面上选中第一目标区域之后,还包括:响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理;融合所述第一颜色和所述第三纹理,得到融合信息。
所述向第二设备发送所述第一颜色的颜色信息,包括:向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色和所述第三纹理组合的笔迹。
在一种可能的实现方式中,所述第一目标区域还包括第一纹理,所述在所述第二图形界面上选中第一目标区域之后,还包括:在所述第二图形界面上显示待选择的颜色控件和纹理控件;检测对所述颜色控件和/或纹理控件的选择的第八用户操作。
在一种可能的实现方式中,所述检测对所述颜色控件和/或纹理控件的选择的第八用户操作之后,还包括:响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理;在所述第二图形界面上显示待选择的颜色控件和纹理控件;检测对颜色控件和/或纹理控件的选择的第九用户操作;将所述第八用户操作指示的第一信息和所述第九用户操作指示的第二信息进行融合,得到融合信息,所述第一信息为第一颜色的颜色信息和/或第一纹理的纹理信息,所述第二信息为所述第三颜色的颜色信息和/或第三纹理的纹理信息。
所述向第二设备发送所述第一颜色的颜色信息,包括:
向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
第三方面,本申请实施例提供一种笔迹绘制方法,应用于第二设备,该方法包括:接收来自第一设备的第一颜色的信息;响应于在所述第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
在一种可能的实现方式中,所述方法还包括:响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选中第二目标区域,所述第二目标区域中包括第二颜色;向所述第一设备发送所述第二颜色的颜色信息,所述第二颜色的颜色信息用于指示所述第一设备显示所述第二颜色的笔迹。
在一种可能的实现方式中,所述接收来自第一设备的第一颜色的信息,包括:接收来自所述第一设备的所述第一颜色的颜色信息和第一纹理的纹理信息;响应于在所述第二设备显示屏上的所述第四用户操作,显示由所述第一颜色和所述第一纹理组合的笔迹。
在一种可能的实现方式中,所述接收来自第一设备的第一颜色的信息,包括:接收来自所述第一设备的融合颜色信息,所述融合颜色信息为所述第一设备上的第一目标区域的第一颜色和第三目标区域的第三颜色融合后的信息;响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中,所述接收来自第一设备的第一颜色的信息,包括:接收来自所述第一设备的融合信息,所述融合信息为所述第一设备上的第一目标区域的第一颜色和第三目标区域的第三纹理融合后的信息;响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三纹理组合的笔迹。
在一种可能的实现方式中,所述接收来自第一设备的第一颜色的信息,包括:接收来自所述第一设备的融合信息,所述融合信息为所述第一设备上的第一目标区域的第一颜色、第一纹理、第三目标区域的第三颜色,和/或第三纹理融合信息;响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
第四方面,本申请实施例提供一种笔迹绘制方法,应用于触控笔,该方法包括:接收来自第一设备的纹理信息,以及向第二设备发送纹理信息。
在一种可能的实现方式中,所述方法还包括:响应于检测到触控笔执行第一预设动作,向第一设备发送第一指令,所述第一指令用于指示获取用户在所述第一设备的界面上选择的目标区域的纹理信息。
在一种可能的实现方式中,所述接收来自第一设备的纹理信息之后,还包括:显示所述纹理信息表征的纹理。
第五方面,本申请实施例提供一种笔迹绘制装置,该笔迹绘制装置为第一设备或第一设备中的芯片,该笔迹绘制装置包括:
显示模块,用于响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面,以及响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面。
处理模块,用于响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色。
收发模块,用于向第二设备发送所述第一颜色的颜色信息,所述第一颜色的颜色信息用于指示所述第二设备显示所述第一颜色的笔迹。
在一种可能的实现方式中,显示模块,还用于响应于在所述第一设备显示屏上的第六用户操作,显示第二颜色的笔迹,所述第二颜色为在所述第二设备的图形界面上选中的第二目标区域的颜色。
在一种可能的实现方式中,处理模块,还用于响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
在一种可能的实现方式中,所述第一目标区域还包括第一纹理。
收发模块,具体用于向所述第二设备发送所述第一颜色的颜色信息和所述第一纹理的纹理信息。
在一种可能的实现方式中,处理模块,还用于获取所述第一纹理的纹理信息。
在一种可能的实现方式中,处理模块,具体用于检测所述第一目标区域是否包含相同的图案;若是,则对所述图案进行截图,得到所述图案的图像;基于所述图案的图像,获取所述图案的矢量数据;将所述图案的矢量数据作为所述第一纹理的纹理信息;若否,则对所述第一目标区域进行截图,得到所述第一目标区域的图像;基于所述第一目标区域的图像,获取所述第一目标区域的矢量数据;将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
在一种可能的实现方式中,处理模块,具体用于将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;获取每两个网格中的图案的第一相似度;若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
在一种可能的实现方式中,处理模块,具体用于若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;若存在大于或等于所述预设相似度的第二相似度,且大于或等 于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
在一种可能的实现方式中,处理模块,还用于响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色;融合所述第一颜色和所述第三颜色,得到融合颜色信息。
收发模块,具体用于向所述第二设备发送所述融合颜色信息,所述融合颜色信息用于指示所述第二设备显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中,处理模块,还用于响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理;融合所述第一颜色和所述第三纹理,得到融合信息。
收发模块,具体用于向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色和所述第三纹理组合的笔迹。
在一种可能的实现方式中,所述第一目标区域还包括第一纹理。显示模块,还用于在所述第二图形界面上显示待选择的颜色控件和纹理控件。处理模块,还用于检测对所述颜色控件和/或纹理控件的选择的第八用户操作。
在一种可能的实现方式中,处理模块,还用于响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理。
显示模块,还用于在所述第二图形界面上显示待选择的颜色控件和纹理控件。
处理模块,还用于检测对颜色控件和/或纹理控件的选择的第九用户操作,将所述第八用户操作指示的第一信息和所述第九用户操作指示的第二信息进行融合,得到融合信息,所述第一信息为第一颜色的颜色信息和/或第一纹理的纹理信息,所述第二信息为所述第三颜色的颜色信息和/或第三纹理的纹理信息。
收发模块,具体用于向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
第六方面,本申请实施例提供一种笔迹绘制装置,该笔迹绘制装置为第二设备或第二设备中的芯片,该笔迹绘制装置包括:
收发模块,用于接收来自第一设备的第一颜色的信息。
显示模块,用于响应于在所述第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
在一种可能的实现方式中,显示模块,还用于响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选中第二目标区域,所述第二目标区域中包括第二颜色。
收发模块,用于向所述第一设备发送所述第二颜色的颜色信息,所述第二颜色的颜色信息用于指示所述第一设备显示所述第二颜色的笔迹。
在一种可能的实现方式中,收发模块,具体用于接收来自所述第一设备的所述第一颜色的颜色信息和第一纹理的纹理信息。
显示模块,具体用于响应于在所述第二设备显示屏上的所述第四用户操作,显示由所述第一颜色和所述第一纹理组合的笔迹。
在一种可能的实现方式中,收发模块,具体用于接收来自所述第一设备的融合颜色信息,所述融合颜色信息为所述第一设备上的第一目标区域的第一颜色和第三目标区域的第三颜色融合后的信息。
显示模块,具体用于响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中,收发模块,具体用于接收来自所述第一设备的融合信息,所述融合信息为所述第一设备上的第一目标区域的第一颜色和第三目标区域的第三纹理融合后的信息。
显示模块,具体用于响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三纹理组合的笔迹。
在一种可能的实现方式中,收发模块,具体用于接收来自所述第一设备的融合信息,所述融合信息为所述第一设备上的第一目标区域的第一颜色、第一纹理、第三目标区域的第三颜色,和/或第三纹理融合信息。
显示模块,具体用于响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
第七方面,本申请实施例提供一种笔迹绘制装置,该笔迹绘制装置可以为的触控笔,或触控笔中的芯片,该笔迹绘制装置包括:
收发模块,用于接收来自第一设备的纹理信息,以及向第二设备发送纹理信息。
在一种可能的实现方式中,笔迹绘制装置还包括:处理模块,用于检测到触控笔的动作。
收发模块,还用于响应于处理模块检测到触控笔执行第一预设动作,向第一设备发送第一指令,所述第一指令用于指示获取用户在所述第一设备的界面上选择的目标区域的纹理信息。
在一种可能的实现方式中,笔迹绘制装置还包括:显示模块。显示模块,用于显示所述纹理信息表征的纹理。
第八方面,本申请实施例提供一种电子设备,该电子设备可以为如上第二方面的第一设备、第三方面的第二设备、第四方面的触控笔。该电子设备可以包括:处理器、存储器。存储器用于存储计算机可执行程序代码,程序代码包括指令;当处理器执行指令时,指令使所述电子设备执行如第二方面、第三方面、第四方面中的方法。
在一种实施例中,电子设备可以包括显示器。
第九方面,本申请实施例提供一种电子设备,该电子设备可以为第四方面的笔迹绘制装置或第五方面的笔迹绘制装置,或者第六方面的笔迹绘制装置。该电子设备可以包括用于执行以上第二方面、第三方面、第四方面中的所提供的方法的单元、模块或电路。
第十方面,本申请实施例提供一种包含指令的计算机程序产品,当其在计算机上运行时,使得计算机执行上述第二方面、第三方面、第四方面中的方法。
第十一方面,本申请实施例提供一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,当其在计算机上运行时,使得计算机执行上述第二方面、第三方面、第四方面中的方法。
上述第一方面,以及第三方面至第十一方面的各可能的实现方式,其有益效果可以参见上述第二方面所带来的有益效果,在此不加赘述。
附图说明
图1为已有的电子设备的一种界面示意图;
图2A为本申请实施例提供的跨设备绘制系统的一种示意图；
图2B为本申请实施例提供的跨设备绘制系统中的一种交互示意图；
图2C为本申请实施例提供的跨设备绘制系统的另一种示意图；
图3为本申请实施例提供的电子设备和触控笔交互的一种示意图;
图4为本申请实施例提供的电子设备和触控笔交互的另一种示意图;
图5为本申请实施例提供的TP sensor对应位置处的电容采样值的变化量发生变化的示意图;
图6为本申请实施例提供的跨设备绘制系统中的另一种交互示意图；
图7为本申请实施例提供的一种场景示意图;
图8为本申请实施例提供的一种界面示意图;
图9为本申请实施例提供的另一种界面示意图;
图10为本申请实施例提供的跨设备绘制系统中的另一种交互示意图；
图11为本申请实施例提供的跨设备绘制系统中的另一种交互示意图；
图12为本申请实施例提供的跨设备绘制系统中的另一种交互示意图；
图13为本申请实施例提供的另一种场景示意图;
图14为本申请实施例提供的一种纹理叠加示意图;
图15为本申请实施例提供的另一种纹理叠加示意图;
图16为本申请实施例提供的另一种场景示意图;
图17为本申请实施例提供的另一种场景示意图;
图18为本申请实施例提供的第一设备的一种结构示意图;
图19为本申请实施例提供的第二设备的另一种结构示意图。
具体实施方式
图1为已有的电子设备的一种界面示意图。参照图1,以电子设备为平板电脑为例,电子设备上可以显示取色区域10,取色区域中包括可选择的不同的颜色色块11,当用户使用触控笔在电子设备上绘制笔迹时,用户可以操作触控笔在颜色色块11中选择颜色,来改变触控笔在电子设备上绘制的笔迹的颜色。目前取色区域10的颜色固定单一,导致用户的选择少。应理解,图1中以不同灰度表征不同的颜色。另外,如对于服装设计师而言,在绘制衣服的草稿时,不仅要绘制衣服的颜色,还需要绘制衣服的图案,而目前的电子设备上没有供服装设计师选择的图案,服装设计师需要手动绘制图案,效率低。综上,目前现有技术中电子设备显示笔迹的样式单一。应理解,本申请实施例中的“界面”、“图形界面”均可以理解为电子设备的图形用户界面(graphical user interface,GUI)。
用户出行时往往拍摄很多好看的图片，图片上有好看的图案，或者用户在浏览网页时看到好看的图案，若用户能够及时将其他设备上的图案提取在电子设备中使用，那么用户可以不局限于电子设备中原本设置的颜色绘制笔迹，能够提高用户体验。因此，本申请实施例提供一种跨设备绘制系统，用户可以在第一设备上选择需要的颜色和/或纹理，以提取用户需要的颜色和/或纹理至第二设备上，如此，用户可以在第二设备上直接使用该颜色和/或纹理，能够丰富电子设备显示笔迹的样式，且提高用户的绘制效率。
图2A为本申请实施例提供的跨设备绘制系统的一种示意图。参照图2A，该跨设备绘制系统中可以包括第一设备和第二设备。其中，第二设备可以为多个。图2A以及下述实施例中以一个第二设备，且以第一设备为手机，第二设备为平板电脑（portable android device，PAD）为例进行说明。第一设备和第二设备可以通过通信网络进行互联。该通信网络可以但不限于为：WI-FI热点网络、WI-FI点对点（peer-to-peer，P2P）网络、蓝牙网络、zigbee网络或近场通信（near field communication，NFC）网络等近距离通信网络。
在图2A所示的跨设备绘制系统的基础上，下述结合图2B，对第一设备和第二设备的功能进行介绍：
S201,响应于在第一设备显示屏上的第一用户操作,显示第一图形界面。
本申请实施例中,在第一设备显示屏以及在第二设备显示屏的用户操作可以包括但不限于为:触控笔、用户的手指、鼠标、键盘等在显示屏上的操作。本申请实施例中以第一、第二等表征用户在第一设备显示屏上的多次操作。
第一用户操作可以包括但不限于为:点击、滑动、长按等。
S202,响应于在第一设备显示屏上的第二用户操作,显示第二图形界面,第二图形界面不同于第一图形界面。
第二用户操作可以包括但不限于为:点击、滑动、长按等。
如上S201和S202,电子设备响应于在第一设备显示屏上的第一用户操作,显示第一图形界面,以及电子设备响应于在第一设备显示屏上的第二用户操作,显示第二图形界面,是为了表明:电子设备可以显示不同的图形界面,即用户可以在电子设备显示的任一个图形界面上选择第一目标区域。
示例性的,第一用户操作为打开一文档的操作,则第一图形界面可以为文档页面的第一页,若第二用户操作为上滑或翻页,则第二图形界面可以为该文档的第二页。示例性的,第一用户操作为打开一文档的操作,则第一图形界面可以为文档页面的第一页,若第二用户操作为切换至相册的操作,则第二图形界面可以为显示图像的界面。
S203,响应于在第一设备显示屏上的第三用户操作,在第二图形界面上选中第一目标区域,第一目标区域中包括第一颜色。
第三用户操作可以为用户在第二图像界面上选择第一目标区域的操作。示例性的,第三用户操作可以为用户在第一设备显示屏上画圈,以圈定第一目标区域,电子设备可以基于用户画圈的操作,选中圈中的区域为第一目标区域。
在一种实施例中,第一设备可以在第二图形界面上显示用户的笔迹,以选中第二图形界面上的第一目标区域。第一设备可以获取第一目标区域的第一颜色。其中,当第一目标区域只有一个颜色时,第一颜色为该颜色。当第一目标区域有多个颜色时,第一颜色为该多个颜色融合后的颜色,融合后的颜色可以为该多个颜色的RGB的均值。
在一种实施例中,第一设备可以向第二设备发送第一颜色的颜色信息。颜色信息中包括第一颜色的RGB值。
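As a concrete illustration of the color extraction described above (a single color is used directly, otherwise the RGB values of the area are averaged per channel), the following Python sketch shows one possible implementation. It is illustrative only and not part of the application; the Pillow dependency and all names are assumptions.

```python
# Minimal sketch: derive the "first color" of a selected target area.
from PIL import Image

def first_color_of_area(screenshot: Image.Image, box: tuple) -> tuple:
    """box = (left, top, right, bottom) of the selected target area."""
    region = screenshot.crop(box).convert("RGB")
    pixels = list(region.getdata())
    if len(set(pixels)) == 1:                         # single-color area: use it directly
        return pixels[0]
    r = sum(p[0] for p in pixels) // len(pixels)      # otherwise average each channel
    g = sum(p[1] for p in pixels) // len(pixels)
    b = sum(p[2] for p in pixels) // len(pixels)
    return (r, g, b)
```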
S204,响应于在第二设备显示屏上的第四用户操作,显示第一颜色的笔迹。
第四用户操作可以为用户绘制笔迹的操作,如用户在第二设备的备忘录界面绘制线条。第二设备可以响应于在第二设备显示屏上的第四用户操作,显示第一颜色的笔迹。
本申请实施例中,用户可以在第一设备上选择颜色,在第二设备上以该颜色绘制笔迹,无需只使用第二设备上自带的颜色绘制笔迹,可以丰富电子设备显示笔迹的样式。
参照图2B中的说明，本申请实施例中，用户也可以在第二设备上选择目标区域，且在第一设备上采用第二设备上的目标区域的颜色绘制笔迹，换句话说，本申请实施例提供的跨设备绘制系统还可以包括：
S205,响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选中第二目标区域,所述第二目标区域中包括第二颜色。
应理解,第五用户操作可以参照上述第三用户操作的相关描述。应注意,S205可以在S204之后执行,或者可以参照第一设备的S201-S202之后执行。
S206,响应于在所述第一设备显示屏上的第六用户操作,显示所述第二颜色的笔迹。
第六用户操作可以参照上述第四用户操作的相关描述。应理解,图2B中未示出S205、S206。
也就是说,本申请实施例中的第一设备和第二设备可以相互同步颜色,一设备可以采用另一设备上的目标区域的颜色绘制笔迹。下述以在第一设备上选择目标区域,在第二设备上绘制笔迹为例进行说明,另外,下述实施例中以用户使用触控笔在第一设备上选择目标区域,以及在第二设备上绘制笔迹为例进行说明。
图2C为本申请实施例提供的跨设备绘制系统的另一种示意图。参照图2C，该场景中包括第一设备、第二设备，以及触控笔。
在一种实施例中,触控笔和第一设备之间可以通过通信网络进行互联,以及触控笔和第二设备之间可以通过通信网络进行互联。该通信网络可以但不限于为:WI-FI热点网络、WI-FI点对点(peer-to-peer,P2P)网络、蓝牙网络、zigbee网络或近场通信(near field communication,NFC)网络等近距离通信网络。在该种实施例中,触控笔可以在第一设备上选择颜色,第一设备可以通过触控笔向第二设备同步该颜色,如图2C中的a所示,具体可以参照图6中的相关描述。
在一种实施例中,触控笔和第一设备之间可以通过通信网络进行互联,触控笔和第二设备之间可以通过通信网络进行互联,以及第一设备和第二设备之间可以通过通信网络进行互联,通信网络可以如上述。在该实施例中,触控笔可以在第一设备上选择纹理,第一设备可以直接向第二设备同步该纹理,如图2C中的b所示,如上图2A所示,具体可以参照图10中的相关描述。
本申请实施例中的第一设备和第二设备均可以为包含有触摸屏的电子设备,电子设备可以称为用户设备(user equipment,UE)、终端(terminal)等,例如,电子设备可以为手机、平板电脑(portable android device,PAD)、个人数字处理(personal digital assistant,PDA)、具有无线通信功能的手持设备、计算设备、车载设备或可穿戴设备,虚拟现实(virtual reality,VR)终端设备、增强现实(augmented reality,AR)终端设备、工业控制(industrial control)中的无线终端、智慧家庭(smart home)中的无线终端等,本申请实施例中对电子设备的形态不做具体限定。
本申请实施例中,第二设备和第一设备的触摸屏的结构可以相同,下述以触控笔和电子设备交互为例,说明触控笔分别与第一设备、第二设备之间的交互原理:
图3为本申请实施例提供的电子设备和触控笔交互的一种示意图。参照图3,电子设备中包括:触控面板、显示面板、图形处理器(graphics processing unit,GPU)、应用处理器(application processor,AP),以及第二通信模块。触控面板中包括:触摸传感器(TP sensor)和触摸处理模块。显示面板中包括:显示屏和显示IC芯片(integrated circuit chip)。触控面板可以理解为电子设备的触摸屏,也可以称为触控屏。在一种实施例中,可以将显示面板和触控面板统称为屏幕或显示屏。
触控笔中包括:微处理单元(micro controller unit,MCU)、第一通信模块、发送模块(transport,TX)和接收模块(receive,RX)。
示例性的,第一通信模块和第二通信模块可以为蓝牙模块、无线局域网模块、WI-FI模块等,用于实现电子设备和触控笔之间的通信,本申请实施例对第一通信模块和第二通信模块不作限制。应理解,触控笔和电子设备,可以通过第一通信模块和第二通信模块建立无线通路,无线通路用于传输无线信号。
电子设备中:
触摸传感器由电极阵列组成,电极阵列中包括行列排布的多个电极。触摸传感器,用于采集触摸数据,触摸数据可以包括:触控笔触摸触摸屏的数据,以及用户触摸触摸屏的数据。其中,用户可以用手指或者指关节等触摸电子设备的触摸屏,下述实施例中以触摸数据包括触控笔触摸触摸屏的数据为例进行说明。
触摸处理模块,用于基于触摸传感器采集的触摸数据,确定触控笔在触摸屏上的位置,且向应用处理器发送触控笔在触摸屏上的位置,触摸处理模块确定触控笔在触摸屏上的位置可以参照图4和图5的相关描述。在一种实施例中,触摸处理模块可以为触摸IC芯片,其中,触摸IC芯片也可以称为触控芯片,图3中以触控芯片进行表示。
显示芯片,用于控制显示屏显示界面,使得用户可以看到电子设备的界面。
图形处理器,用于处理解析图像,得到颜色和纹理,可以参照实施例中的相关描述。
应用处理器,用于基于来自触控芯片的触控笔在触摸屏上的位置,执行相应的操作。
触控笔中:
MCU分别与第一通信模块、发送模块,以及接收模块连接。发送模块中可以包括:第一电极和驱动电路,第一电极与驱动电路连接,驱动电路与MCU连接。接收模块中包括第二电极和解码电路,第二电极与解码电路连接,解码电路与MCU连接。
MCU,用于生成脉冲宽度调制(pulse width modulation,PWM)信号,且向驱动电路发送PWM信号。驱动电路可以基于PWM信号,驱动第一电极发送信号。其中,第一电极可以称为发射电极(transport,TX),第一电极可以设置在靠近触控笔的笔尖的位置。
第二电极,用于接收来自电子设备中的TP sensor的信号,且向解码电路发送该信号。解码电路,用于解码来自电子设备的信号,并向MCU发送解码后的信号。其中,第二电极可以称为接收电极(receive,RX)。应理解,触控笔通过第一电极发送的信号和电子设备通过TP sensor发送的信号均为方波信号。
参照图3,在一种实施例中,触控笔中还可以包括:充电模块和传感器模块。其中,充电模块,用于为触控笔进行充电。传感器模块可以包括但不限于:压力传感器、加速度传感器(accelerometer sensor,G-sensor)、陀螺仪等,本申请实施例对此不作赘述。传感器模块可以与MCU连接。
应理解,图3所示的触控笔的结构为一种示例。在一种实施例中,触控笔中可以设置两个电极,其中一个电极为TX,另一个电极可以在TX和RX之间进行切换,可以参照现有技术中的相关描述,本申请实施例对触控笔中的电极的个数和原理不作限制。
如上介绍了电子设备和触控笔的结构,以及电子设备和触控笔中各模块的功能。参照图4,因为触控笔的笔尖设置有电极,电子设备中的触摸传感器中包括电极阵列。触控笔的笔尖和触摸传感器的电极之间,存绝缘物质(如空气、盖板玻璃),因此触控笔的笔尖和触摸传感器的电极之间可以形成电容,触控笔的笔尖与电子设备中的触摸传感器可以通过电容,建立电路连接,触控笔的笔尖与电子设备中的触摸传感器之间的通路可以称为电路通路。触控笔和电子设备可以通过电路通路交互信号。应理解,图4中以触控笔中的第一通信模块和电子设备中的第二通信模块均为蓝牙模块为例,触控笔和电子设备之间建立蓝牙通路。
触控笔的笔尖靠近电子设备的触摸屏,会引起触摸屏中的TP sensor的电容采样值的变化量变化,且触控笔的笔尖距离触摸屏越近,TP sensor的电容采样值的变化量越大。参照图5,图5中以波峰表征TP sensor对应位置处的电容采样值的变化量发生变化,触摸屏中的触控芯片可以基于TP sensor上的电容采样值的变化量,确定触控笔在电子设备的触摸屏上的位置,如触控芯片可以将TP sensor上的电容采样值的变化量最大处的位置,作为触控笔在触摸屏上的位置,本申请实施例对此不作赘述,可以参照现有技术中的相关描述。应理解,图5以黑色圆点表征触控笔接触触摸屏的位置。
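The position detection described above — taking the electrode location with the largest change in capacitance sample value as the pen-tip position — can be sketched as follows. This is an illustrative simplification (real touch ICs interpolate between electrodes); the data layout is an assumption.

```python
# Minimal sketch: pick the TP-sensor cell whose capacitance sample changed the most.
def pen_position(cap_delta):
    """cap_delta[row][col] = change of the capacitance sample at that electrode."""
    best_row, best_col, best_val = 0, 0, float("-inf")
    for r, row in enumerate(cap_delta):
        for c, val in enumerate(row):
            if val > best_val:                        # largest change = closest to the pen tip
                best_row, best_col, best_val = r, c, val
    return best_row, best_col                          # map to screen coordinates via sensor pitch
```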
在如上电子设备和触控笔的结构的基础上,下面结合具体的实施例对本申请实施例提供的跨设备绘制***进行说明。下面这几个实施例可以相互结合,对于相同或相似的概念或过程可能在某些实施例不再赘述。
在一种实施例中，第一设备和第二设备均与触控笔无线连接。图6为本申请实施例提供的跨设备绘制系统中的另一种交互示意图。参照图6，第一设备和第二设备的交互流程可以包括：
S601,第一设备接收第一指令,第一指令用于指示第一设备获取触控笔选择的目标区域的纹理信息。
在一种实施例中,基于如上图2B所示的实施例,图6所示的实施例中的纹理信息可以替换为颜色信息,颜色信息即为目标区域(即如上图2B中描述的第一设备上的第一目标区域)的第一颜色的颜色信息。
在一种实施例中,用户需要在第一设备上获取纹理时,用户可以触发触控笔向第一设备发送第一指令。其中,触控笔可以通过如上无线通路或电路通路向第一设备发送第一指令。
其一,用户可以握持触控笔执行第一预设动作,以触发触控笔向第一设备发送第一指令。该实施例中,触控笔可以检测触控笔的动作,触控笔响应于检测到用户握持触控笔执行第一预设动作,可以向第一设备发送第一指令。应理解,触控笔中可以设置G-sensor、陀螺仪等,触控笔可以基于G-sensor、陀螺仪等采集的数据,检测用户握持触控笔的动作,以确定用户是否握持触控笔执行第一预设动作。其中,第一预设动作可以包括但不限于为:摇一摇、在空中画圈、将触控笔颠倒等。将触控笔颠倒可以理解为:触控笔的尾部相较于触控笔的笔尖更靠近地面。本申请实施例中对触控笔基于G-sensor、陀螺仪等采集的数据,检测触控笔的动作不做赘述。
其二,在一种实施例中,触控笔上可以设置一按键,该按键可以为机械式的按键或者触摸式的按键,本申请实施例对此不作限制。当触控笔检测到用户操作该按键时,可以向第一设备发送第一指令。
在一种实施例中,第一设备若检测到触控笔执行第二预设动作,即接收到来自触控笔的第一指令。
其一,第二预设动作可以但不限于为:触控笔双击第一设备的触摸屏,或者触控笔长按第一设备的触摸屏。
第二预设动作还可以为:触控笔在第一设备的触摸屏上绘制预设轨迹。示例性的,预设轨迹可以为预设文字、预设字母或预设形状等。为了便于第一设备对触控笔绘制笔记时的轨迹,以及触控笔绘制的预设轨迹进行区分,第二预设动作可以为:触控笔在第一设备的触摸屏的预设区域绘制预设轨迹,预设区域可以但不限于为触摸屏的中心区域。示例性的,若第一设备检测到触控笔在触摸屏的中心区域绘制预设字母m,则可以确定检测到来自触控笔的 选择指示,若第一设备检测到触控笔在触摸屏的其他区域(非预设区域)绘制预设字母m,则可以确定触控笔在绘制笔记,在触摸屏的其他区域的对应位置上显示m。本申请实施例对第一设备区分触控笔绘制笔记时的轨迹,以及触控笔绘制的预设轨迹的方式不做限制,预设区域为一种示例。
其二,第一设备的界面上可以显示有预设控件,如“取纹理”控件,第一设备响应于检测到触控笔操作该预设控件,可以确定接收第一指令。
在一种实施例中,第一设备若检测到用户采用手指或者指关节等执行第二预设动作,或者,第一设备若检测到用户采用手指或者指关节操作预设控件,可以确定接收第一指令。也就是说,本申请实施例中,第一指令也可以由用户使用手指或者指关节等输入至第一设备的。
在一种实施例中,第一设备可以检测第一设备是否执行第三预设动作,第一设备响应于检测到第一设备执行第三预设动作,可以确定接收第一指令。其中,第三预设动作可以包括但不限于为:摇一摇、在空中画圈等。
S602,第一设备响应于接收第一指令,检测触控笔在第一设备的界面上选择的目标区域。
应理解,触控笔在第一设备的界面上的选择操作可以称为第三用户操作。S602中的“第一设备的界面”可以称为第二图形界面。
第一设备响应于接收第一指令，可以检测触控笔在第一设备的界面上选择的目标区域。如上图3-图5中的相关描述，第一设备可以获取触控笔在第一设备的界面上的位置，因此第一设备可以检测触控笔在第一设备的界面上选择的目标区域。应理解，第一设备的界面可以为任意的界面，如第一设备显示第一设备的相册中一图像的界面，或者第一设备显示网页的界面，本申请实施例对此不作限制，也就是说，本申请实施例中，触控笔可以在第一设备的任一界面上选择目标区域。
本申请实施例中,用户可以采用触控笔在第一设备的界面上以点击的方式,或者画出预设形状等方式选择目标区域。示例性的,当用户采用触控笔在第一设备的界面上以点击的方式选择目标区域时,第一设备可以将触控笔点击的位置作为目标区域。示例性的,当用户采用触控笔以画出预设形状的方式选择目标区域时,第一设备响应于检测到触控笔的轨迹为预设形状时,可以确定预设形状内的区域为目标区域。
应理解,S602可以替换为:第一设备响应于接收第一指令,检测用户在第一设备的界面上选择的目标区域。其中,用户不仅可以采用触控笔在第一设备的界面上选择目标区域,还可以采用手指、指关节等在第一设备的界面上选择目标区域。下述实施例以第一设备检测到触控笔在第一设备的界面上选择目标区域为例进行说明。
S602可以理解为:第一设备中的触控芯片可以基于TP sensor对应位置处的电容采样值的变化量,获取触控笔在第一设备的界面上选择的目标区域,且将目标区域对应的坐标发送至第一设备中的AP。
S603,第一设备获取目标区域的纹理信息。
第一设备可以基于第一设备的界面显示的内容,以及目标区域在第一设备的界面的位置,可以确定目标区域中的内容。第一设备可以基于目标区域中的内容,分析得到目标区域的纹理信息。
S603可以理解为:AP基于目标区域对应的坐标,以及第一设备的界面显示的内容,提取目标区域中的内容。AP可以将目标区域中的内容发送至GPU,GPU响应于目标区域中的内容, 分析得到目标区域的纹理信息。其中,AP可以采用截图的方式,在第一设备的界面上截取目标区域中的内容,进而向GPU发送包含有目标区域的内容的截图。下述以第一设备为获取目标区域的纹理信息的执行主体,说明第一设备获取目标区域的纹理信息的过程:
若目标区域中包含多个相同的图案,则目标区域中的该相同的图案可以表征目标区域的纹理,因此第一设备可以在目标区域绘制网格,获取每个网格中的图案,以检测目标区域是否是包含相同的图案,且若目标区域中包含多个相同的图案,则获取该图案。其中,第一设备可以获取每两个网格中的图案的第一相似度,检测是否存在大于或等于预设相似度的第一相似度。若存在大于或等于预设相似度的第一相似度,表明目标区域的两个网格中存在相同的图案,另外,第一设备还需要检测大于或等于预设相似度的第一相似度在所有的第一相似度中的占比,若该占比大于或等于预设占比,则确定目标区域包含相同的图案,且将大于或等于预设相似度对应的网格中的相同的图案作为目标区域的重复图案。
应理解的是,第一设备绘制的网格具有第一预设尺寸,在一种实施例中,第一预设尺寸可以为1个像素。若第一设备在绘制网格后,不存在大于或等于预设相似度的第一相似度,或者大于或等于预设相似度的占比小于预设占比,则第一设备可以继续增大网格的大小,继续获取每两个网格中的图案的第二相似度,以检测目标区域是否是包含相同的图案。若第一设备在增大网格后,还是不存在大于或等于预设相似度的第二相似度,或者大于或等于预设相似度的第二相似度的占比小于预设占比,则第一设备可以继续增大网格的大小,继续获取每两个网格中的图案的第三相似度,以检测目标区域是否是包含相同的图案。如此重复,直至网格的大小达到第二预设尺寸,如该第二预设尺寸为目标区域的面积的一半,其中,本申请实施例对第一设备增大网格的方式不做限制。应理解,若网格的大小达到第二预设尺寸,第一设备也并未获取大于或等于预设相似度的相似度,或者大于或等于预设相似度的占比小于预设占比,则第一设备可以确定目标区域未包含相同的图案。
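A minimal Python sketch of this grid-based repeated-pattern check is given below, assuming the target area is available as a grayscale NumPy array. The normalized-correlation similarity, the threshold values and the doubling grid-growth rule are example assumptions; the text itself does not prescribe them.

```python
# Illustrative, deliberately unoptimized sketch of the repeated-pattern check.
import numpy as np

def contains_repeated_pattern(area, cell=1, max_cell=0,
                              sim_threshold=0.9, ratio_threshold=0.6):
    max_cell = max_cell or min(area.shape) // 2        # "second preset size"
    while cell <= max_cell:
        rows, cols = area.shape[0] // cell, area.shape[1] // cell
        tiles = [area[r*cell:(r+1)*cell, c*cell:(c+1)*cell].astype(float).ravel()
                 for r in range(rows) for c in range(cols)]
        sims = []
        for i in range(len(tiles)):
            for j in range(i + 1, len(tiles)):         # similarity of every pair of grid cells
                denom = np.linalg.norm(tiles[i]) * np.linalg.norm(tiles[j]) or 1.0
                sims.append(float(np.dot(tiles[i], tiles[j]) / denom))
        hits = [s for s in sims if s >= sim_threshold]
        if sims and len(hits) / len(sims) >= ratio_threshold:
            return True, cell                          # repeated pattern found at this cell size
        cell *= 2                                      # otherwise enlarge the grid and retry
    return False, None
```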
如此,在一种实施例中,若第一设备能够获取目标区域的重复图案,则第一设备可以截图获取该重复图案的图像,得到该重复图案的图像的标量数据。在一种实施例中,若第一设备未能够获取目标区域的重复图案,则第一设备可以截图获取目标区域的图像,得到目标区域的图像的标量数据。本申请实施例中,为了使得触控笔在绘制笔迹使用到该目标区域的纹理时,纹理清晰,第一设备可以将得到的标量数据转换成矢量数据。在一种实施例中,纹理信息中包括矢量数据。
图7中以“触控笔响应于检测到用户握持触控笔执行摇一摇的预设动作,触控笔向第一设备发送第一指令”为例进行说明,参照图7中的a,用户在第二设备(如平板电脑)上绘制笔迹。若用户想要使用第一设备(手机)上的纹理,则用户摇一摇触控笔,触控笔可以向第一设备发送第一指令,参照图7中的b所示。参照图7中的c,第一设备的界面为显示一图像的界面,用户采用触控笔在该界面上画一个圆形,则第一设备可以将该圆形中的区域作为目标区域。示例性的,目标区域的重复图案为三角形,则基于S603中的描述,第一设备得到的纹理信息中包括:三角形的矢量数据。
在一种实施例中,用户可以在第一设备中对目标区域的纹理进行编辑处理,第一设备得到的纹理信息为用户对目标区域的纹理进行编辑处理后的纹理信息。示例性的,编辑处理可以为对目标区域的纹理的深度进行调整,对目标区域的纹理进行缩放等。
示例性的，图7中的c中，用户采用触控笔在该界面上画一个圆形后，第一设备可以显示纹理的编辑界面，如图8所示。编辑界面上可以显示目标区域中的重复图案81、深度编辑控件82、以及缩放控件83。图8中以进度条表征深度编辑控件82、以及缩放控件83，用户调整进度条，可以分别调整目标区域的纹理的深度，以及对目标区域的纹理进行缩放。应理解，调整目标区域的纹理的深度可以理解为：调整目标区域的图案与目标区域的背景的对比度。用户可以调整深度进度条，降低三角形与背景的对比度，以及用户调整缩放进度条，对三角形进行放大处理。
在该实施例中，第一设备可以获取用户对目标区域的纹理的编辑处理参数，相应的，第一设备得到的纹理信息中可以包括：编辑处理参数。参照图8，纹理信息中除了包括三角形的矢量数据，还可以包括深度参数（80%），以及缩放参数（120%）。
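The texture information produced at this point can be thought of as the vector data plus the user's edit parameters. The record below is only a hypothetical sketch of such a structure; the field names and the SVG-path representation are assumptions, not the application's format.

```python
# Illustrative sketch: texture information bundling the vectorized pattern
# with the user's edit parameters (depth ≈ contrast, scale ≈ zoom).
from dataclasses import dataclass

@dataclass
class TextureInfo:
    vector_data: str          # e.g. an SVG path for the repeated pattern
    depth: float = 1.0        # 0.8 -> 80% contrast against the background
    scale: float = 1.0        # 1.2 -> pattern enlarged to 120%

triangle_texture = TextureInfo(vector_data="M 0 10 L 5 0 L 10 10 Z",
                               depth=0.8, scale=1.2)
```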
S604,第一设备向触控笔发送纹理信息。
本申请实施例中,触控笔中可以包括存储模块,存储模块用于存储来自第一设备的纹理信息。
在一种实施例中,若触控笔上设置有显示屏,则触控笔接收到来自第一设备的纹理信息后,可以在显示屏上显示纹理信息表征的纹理。示例性的,参照图7中的d所示,如触控笔可以在显示屏上显示三角形。
S605,触控笔向第二设备发送纹理信息。
在一种实施例中,触控笔响应于接收到来自第一设备的纹理信息,可以向第二设备发送纹理信息。
在一种实施例中,触控笔的笔尖可以设置有压力传感器,当触控笔的笔尖接触第二设备的触摸屏时,压力传感器可以采集到压力数据。在该实施例中,触控笔响应于检测到压力传感器采集到压力数据,可以确定触控笔的笔尖接触第二设备的触摸屏,表征用户需要采用触控笔在第二设备上进行操作,进而触控笔可以在接触到第二设备的触摸屏时,向第二设备发送纹理信息。
如此,第二设备可以存储纹理信息。在一种实施例中,第二设备中包括纹理信息存储器,第二设备可以将纹理信息存储在纹理信息存储器中。
S606,第二设备基于触控笔在触摸屏上的位置,以纹理信息表征的纹理显示笔迹。
当触控笔在第二设备上绘制笔迹时,绘制笔迹的操作可以称为第四用户操作,第二设备可以检测触控笔在第二设备的触摸屏上的位置,进而采用来自触控笔的纹理信息表征的纹理在对应位置上显示笔迹。在图7中的c或者图8之后,若用户握持触控笔在第二设备上绘制笔迹,则第二设备可以基于触控笔在触摸屏上的位置,以三角形显示触控笔的笔迹,参照图7中的d所示。
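One straightforward way for the second device to display handwriting "with" the synchronized texture is to stamp the pattern at the pen positions sampled along the stroke. The Pillow-based sketch below is an assumption-laden illustration, not the application's rendering pipeline; the fixed stamp spacing is an example value.

```python
# Illustrative sketch: stamp the texture pattern along the sampled stroke points.
from PIL import Image

def draw_textured_stroke(canvas: Image.Image, stamp: Image.Image,
                         points, spacing: int = 1):
    # stamp should be an RGBA image so it can act as its own paste mask
    for i, (x, y) in enumerate(points):               # points reported by the touch panel
        if i % spacing:
            continue
        canvas.paste(stamp, (x - stamp.width // 2, y - stamp.height // 2), stamp)
```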
在一种实施例中,第二设备在接收到纹理信息后,可以在第二设备的笔刷工具中显示纹理信息表征的纹理,以供用户查询和选择。示例性的,图9中的a所示的为第二设备上显示的一绘制笔迹的应用程序的界面,该界面上显示有笔刷工具91、“上一步”控件92、“历史记录”控件93,以及撤销控件94等。如第二设备响应于接收纹理信息,可以在笔刷工具91中存储“三角形”纹理,用户采用触控笔操作笔刷工具91,第二设备可以显示笔刷工具91中的多个可选择的纹理(包括三角形纹理911,正方形纹理912等),如图9中的b所示。应理解,图9中的b以文字表征控件,第二设备还可以以图片、符号等表征控件,本申请实施例对此不作限制。图9中的a、b、d和e中为了清楚表示第二设备的界面变化,未示出触控笔。
在一种实施例中，用户还可以选择纹理，且对选择的纹理进行编辑处理。在图9中的b中，第二设备还可以显示深度编辑控件82、缩放控件83、根据尺寸调整比例的控件95、反相控件96等。深度编辑控件82、缩放控件83可以参照上述图7的相关描述，用户操作根据尺寸调整比例的控件95，可以使得第二设备基于触控笔的笔迹的粗细，调整纹理的大小。示例性的，当触控笔的笔迹变粗时，第二设备可以增大纹理。用户操作反相控件96，可以将前景和背景进行反相，示例性的，如纹理为黑色，背景为白色，则反相之后纹理为白色，背景为黑色。如此，用户可以对选择的纹理进行编辑处理，对应的，第二设备可以基于用户对纹理的编辑处理参数，采用编辑处理后的纹理显示触控笔的笔迹。应理解，如上对纹理的编辑处理为示例说明，本申请实施例中不限制对纹理的编辑处理方式。
第二设备可以存储已使用过的纹理,当用户采用触控笔操作“历史记录”控件93时,第二设备可以显示已使用过的纹理,如图9中的d所示,已使用过的纹理可以包括正方形、圆形等。应理解,本申请实施例中为了示意,以形状表征纹理。
当用户采用触控笔操作“上一步”控件92,第二设备可以查询“历史记录”中存储的已使用过的纹理,采用上一次使用的纹理显示触控笔的笔迹。参照图9中的c所示,如第二设备上一次使用的纹理为正方形,则第二设备可以以正方形显示笔迹。
撤销控件94用于撤销当前绘制的笔迹,示例性的,当用户采用触控笔操作撤销控件94时,第二设备可以删除以三角形显示的触控笔的笔迹,如图9中的e所示。
在一种实施例中,S606可以替换为:第二设备基于用户的手指或指关节在触摸屏上的位置,以纹理信息表征的纹理显示笔迹。
应理解,用户可以采用手指或指关节在第二设备的触摸屏上绘制笔迹,该种实施例中,第二设备可以检测用户的手指或指关节在触摸屏上的位置,进而基于用户的手指或指关节在触摸屏上的位置,以纹理信息表征的纹理显示笔迹。
本申请实施例中,用户在第二设备上采用触控笔绘制笔迹时,可以采用触控笔在第一设备上选择纹理,第一设备可以将触控笔选择的纹理通过触控笔发送至第二设备,在触控笔在第二设备上绘制笔迹时,第二设备以该纹理显示笔迹。本申请实施例中的纹理处理方式,用户可以采用触控笔跨设备获取纹理,进而使用该纹理绘制笔迹,使得触控笔绘制笔迹时不局限于第二设备上有限的颜色,丰富了电子设备绘制笔迹的样式,可以提高用户体验。
如上实施例中,第一设备可以通过触控笔向第二设备发送纹理信息,在一种实施例中,第一设备可以与第二设备无线连接,第一设备在获取纹理信息后,可以直接向第二设备发送纹理信息,当触控笔在第二设备上绘制笔迹时,第二设备也能够以纹理信息对应的纹理显示笔迹,因为本申请实施例中第一设备直接向第二设备发送纹理信息,未经过触控笔的传递,传输效率更高。
在本申请实施例中,第一设备和第二设备均与触控笔无线连接,且第一设备和第二设备无线连接。参照图10,上述S604-S605可以替换为S604A:
S604A,第一设备向第二设备发送纹理信息。
本申请实施例中,因为第一设备和第二设备无线连接,第一设备在获取纹理信息后,可以直接向第二设备发送纹理信息,能够提高传输效率,且也能够达到第二设备以纹理信息对应的纹理显示笔迹的目的。
在一种实施例中,第一设备还与第三设备无线连接,和/或,第二设备与第三设备无线连接,在该种实施例中,第一设备在获取纹理信息后,可以向第三设备(或者通过第二设备向第三设备)发送纹理信息,如此,第三设备中也可以得到并存储该纹理信息。
示例性的,用户在平板电脑上、笔记本电脑上绘制笔迹,用户可以操作触控笔在手机上选择目标区域,手机可以向平板电脑上以及笔记本电脑发送目标区域的纹理信息,如此,用户在平板电脑上和笔记本电脑上也可以采用该纹理绘制笔迹,可以参照S606中的描述。应注 意的是,当第一设备未与第二设备无线连接时,第二设备响应于接收到来自第一设备的纹理信息,可以向与第二设备无线连接的第三设备发送纹理信息。
也就是说,第一设备、第二设备,以及第三设备之间可以建立纹理信息的同步,第一设备响应于获取纹理信息后,可以向第二设备以及第三设备发送该纹理信息,第二设备以及第三设备可以更新存储的纹理信息。
应理解的是,在一种实施例中,第一设备、第二设备,以及第三设备中均包括:纹理信息存储器。不同设备中的纹理信息存储器存储的纹理信息一致,换句话说,当一设备获取了新的纹理信息,该设备可以通过该设备中的纹理信息存储器将获取的该新的纹理信息同步至其他设备中的纹理信息存储器中,以便于其他设备也可以使用该纹理信息。
示例性的,该实施例中,当第一设备获取纹理信息后,可以将该纹理信息存储至第一设备的纹理信息存储器中,第一设备的纹理信息存储器可以将第一设备获取的纹理信息同步至第二设备的纹理信息存储器,以及第三设备的纹理信息存储器。
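A toy model of this texture-information store and its synchronization to the stores of connected devices is sketched below. The in-memory peer list stands in for the wireless connections; the class and method names are assumptions made for the example.

```python
# Illustrative sketch: a texture store that pushes newly added attribute
# information to the stores of every connected device.
class TextureStore:
    def __init__(self, device_name: str):
        self.device_name = device_name
        self.entries = []          # attribute information held on this device
        self.peers = []            # stores on wirelessly connected devices

    def connect(self, peer: "TextureStore"):
        self.peers.append(peer)

    def add(self, attribute_info: dict):
        if attribute_info in self.entries:     # already synchronized, stop the loop
            return
        self.entries.append(attribute_info)
        for peer in self.peers:                # keep every connected store in sync
            peer.add(attribute_info)
```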
应注意,第一设备通过纹理信息存储器的方式,是“第一设备向第二设备和第三设备同步纹理信息”的一种方式。
本申请实施例中,如用户在家里采用携带不便的计算机绘制笔迹时,用户可以采用触控笔在手机上选择目标区域,手机可以向计算机上,以及方便携带的笔记本电脑发送目标区域的纹理信息,进而用户在外出时可以携带笔记本电脑,也能够使用笔记本电脑上存储的纹理绘制笔迹,提高用户体验。
在一种实施例中,用户使用触控笔不仅可以在第一设备上选择纹理,还可以在第一设备上选择颜色,在该实施例中,参照图11,S603-S606可以替换为S603A-S606A:
S603A,第一设备获取目标区域的纹理信息和颜色信息。
第一设备获取目标区域的纹理信息可以参照S603中的相关描述。颜色信息可以包括:RGB值,RGB值包括红色red的R值、绿色green的G值,以及蓝色blue的B值。
在一种实施例中,颜色信息和纹理信息可以称为属性信息,即本申请实施例中,属性信息可以包括:纹理信息和/或颜色信息。在一种实施例中,纹理信息可以包括:纹理信息,或者,纹理信息和颜色信息。
与上述实施例不同的是,上述实施例中未考虑纹理的颜色,如纹理为三角形,上述实施例并未考虑三角形的颜色。在本申请实施例中,第一设备在获取目标区域中的重复图案时,需要考虑重复图案的颜色,也就是说,重复图案中的图案的形状不仅相同,且图案的颜色也需要相同,第一设备将具有相同颜色的相同图案作为目标区域的重复图案(即相同的图案)。相应的,本申请实施例中的颜色信息为:重复图案的颜色信息。重复图案的颜色信息包括:重复图案中每个位置的RGB值。
在一种实施例中,若第一设备确定目标区域未包含重复图案,则目标区域的纹理信息为目标区域的图像的矢量数据,可以参照上述实施例中的描述,目标区域的颜色信息包括:目标区域中每个位置的RGB值。
S603A中第一设备获取颜色信息可以理解为:AP基于目标区域对应的坐标,采用截图的方式,在第一设备的界面上获取目标区域中的内容的图像,进而向GPU发送目标区域的内容的图像,GPU可以在目标区域中的内容的图像中提取RGB值,即得到颜色信息。
S604A,第一设备向触控笔发送纹理信息和颜色信息。
S605A,触控笔向第二设备发送纹理信息和颜色信息。
S606A,第二设备基于触控笔在触摸屏上的位置,以纹理信息表征的纹理以及颜色信息表征的颜色显示笔迹。
在一种实施例中,S604A-S605A可以替换为:第一设备向第二设备发送纹理信息和颜色信息。
本申请实施例中,用户使用触控笔不仅可以在第一设备上选择纹理,还可以选择颜色,使得第二设备以目标区域的纹理和颜色显示触控笔的笔迹,进一步丰富了电子设备显示笔迹的样式。
为了进一步丰富电子设备显示笔迹的样式，本申请实施例中提供的跨设备绘制系统，用户可以在第一设备的界面的多个区域选择纹理和/或颜色，第一设备将用户选择的多个区域的纹理和/或颜色进行融合后，向第二设备同步纹理信息和颜色信息。在该实施例中，参照图12，第一设备和第二设备的交互过程可以包括：
S1201,第一设备接收第一指令,第一指令用于指示第一设备获取触控笔选择的目标区域的纹理信息和/或颜色信息。
S1201可以参照S601中的描述。与S601不同的是,第一指令用于指示第一设备获取触控笔选择的目标区域的纹理信息和/或颜色信息,并不是单一的纹理信息。
S1202,第一设备响应于接收第一指令,检测触控笔在第一设备的界面上选择的第一目标区域和第三目标区域。
应理解,触控笔在第一设备的界面上选择第一目标区域的操作可以称为第三用户操作,触控笔在第一设备的界面上选择第三目标区域的操作可以称为第七用户操作。S1202中的“第一设备的界面”可以称为第二图形界面。
本申请实施例中,触控笔可以在第一设备的界面上选择多个目标区域。在一种实施例中,图7中的c可以替换为图13中的a,参照图13中的a,第一设备的界面上可以显示“确定”控件131,用户可以在选择目标区域后,操作“确定”控件131。第一设备响应于检测到用户选择目标区域,可以确定用户选择完成。如此,参照图13中的a,用户在第一设备的界面上选择第一目标区域132,以及第三目标区域133,其中,用户可以选择“确定”控件131,以表征用户选择完成。对应的,第一设备响应于检测到用户选择“确定”控件131,确定用户选择了第一目标区域和第三目标区域。
S1203,第一设备获取第一目标区域的第一纹理信息和第三目标区域的第三纹理信息。
第一设备获取第一目标区域的第一纹理信息和第三目标区域的第三纹理信息的方式,可以参照S603中第一设备获取目标区域的纹理信息的描述。
S1204,第一设备将第一纹理信息和第三纹理信息进行融合处理,得到融合纹理信息。
在一种实施例中,第一设备将第一纹理信息和第三纹理信息进行融合,可以为:第一设备将第一纹理信息表征的第一纹理和第三纹理信息表征的第三纹理进行叠加,得到叠加后的纹理。其中,融合纹理信息可以包括:叠加后的纹理的图像矢量数据。参照图14中的a所示,若第一纹理为三角形,第三纹理为正方形,则叠加后的纹理为在三角形上叠加正方形。
在一种实施例中,可以预先设置第一纹理和第三纹理的相对位置,如基于用户选择目标区域的顺序,将后选择的第三纹理排列在先选择的第一纹理的预设相对位置(如右侧、上侧等)。参照图14中的b所示,如第一纹理为三角形,第三纹理为正方形,则融合后的纹理为:三角形排列在正方形的右侧。
在一种实施例中,第一设备可以将第一纹理和第三纹理进行多种融合,示例性的,第一 设备将第一纹理和第三纹理进行融合可以不限于:第一设备将第一纹理和第三纹理叠加,或者第一设备将第一纹理排列在第三纹理的右侧等。
第一设备可以在第一设备的界面上显示多种融合后的纹理,以供用户选择,其中,用户可以在第一设备的界面上选择融合纹理,如此,第一设备响应于检测到用户选择的融合纹理,将用户选择的融合纹理对应的纹理信息作为融合纹理信息。
在一种实施例中,用户可以自定义第一纹理和第三纹理的相对位置,以对第一纹理和第三纹理进行融合处理。如第一设备得到第一纹理信息和第三纹理信息后,可以显示编辑界面,编辑界面上包括第一纹理和第三纹理。相应的,如图8所示的界面可以替换为图15所示的界面,参照图15中的a所示,编辑界面上显示有第一目标区域中的第一纹理,以及第三目标区域中的第三纹理,用户可以拖动任意一个纹理(用户可以使用手指或触控笔拖动),以改变第一纹理和第三纹理的相对位置,参照图15中的b所示,用户可以拖动正方形,使得正方形和三角形叠加,则融合后的纹理为:正方形和三角形的叠加。
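The two fusion styles discussed above — stacking the third texture on the first, or arranging the later-selected texture at a preset relative position such as the right side — could be implemented along the following lines. This Pillow-based sketch is illustrative only; the resizing, transparency handling and fixed offset are assumptions.

```python
# Illustrative sketch of the two texture-fusion styles.
from PIL import Image

def fuse_overlay(first: Image.Image, third: Image.Image) -> Image.Image:
    fused = first.convert("RGBA")
    overlay = third.convert("RGBA").resize(first.size)
    fused.alpha_composite(overlay)               # third texture stacked on the first
    return fused

def fuse_side_by_side(first: Image.Image, third: Image.Image) -> Image.Image:
    height = max(first.height, third.height)
    fused = Image.new("RGBA", (first.width + third.width, height), (0, 0, 0, 0))
    fused.paste(first, (0, 0))                   # first-selected texture on the left
    fused.paste(third, (first.width, 0))         # later-selected texture to its right
    return fused
```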
S1205,第一设备向第二设备同步融合纹理信息。
在一种实施例中,第一设备可以向触控笔发送融合纹理信息,触控笔响应于接收融合纹理信息,可以向第二设备发送融合纹理信息。或者,在一种实施例中,第一设备可以直接向第二设备发送融合纹理信息。
S1206,第二设备基于触控笔在触摸屏上的位置,以融合纹理信息表征的纹理显示笔迹。
参照图13中的b所示,当用户使用触控笔在第二设备上绘制笔迹时,第二设备可以以融合纹理信息表征的纹理(即融合后的纹理)显示笔迹,图13中的b中以融合后的纹理为正方形和三角形的叠加为例进行说明。
本申请实施例中,用户可以采用触控笔在第一设备上选择至少两个目标区域,第一设备可以将该至少两个目标区域中的纹理进行融合,得到融合纹理信息,进而向第二设备同步融合纹理信息,用户可以将需要的多个纹理进行融合处理,能够进一步丰富电子设备显示笔迹的样式。
如上图12所示的实施例说明的是第一设备可以将第一目标区域的第一纹理和第三目标区域的第三纹理进行融合,在一种实施例中,第一设备可以将第一目标区域的第一纹理和第三目标区域的第三颜色进行融合,在该种场景中,如用户喜欢将第一设备显示的界面中的一区域的纹理,喜欢另一区域的颜色,则用户可以选择该两个区域,进而使得第一设备将该两个区域中的纹理和颜色进行融合。
在该实施例中,如上图12中的S1203-S1206可以替换为S1203A-S1206A:
S1203A,第一设备获取第一目标区域的第一纹理信息和第三目标区域的第三颜色信息。
在一种实施例中,可以基于用户选择的目标区域的顺序,预先设置第一设备获取目标区域的纹理信息还是颜色信息。示例性的,可以依次按照“纹理”和“颜色”交叉的方式,确定获取目标区域的纹理信息还是颜色信息,如用户第一个选择的是第一目标区域,则第一设备获取第一目标区域的第一纹理信息,用户第二个选择的是第三目标区域,则第一设备获取第三目标区域的第三颜色信息,以此类推,用户可以依次获取用户选择的目标区域的纹理、颜色、纹理、颜色……。
或者,可以设置第一设备获取用户选择的前n个目标区域获取纹理信息,以及第一设备获取用户选择的后m个目标区域获取颜色信息等,本申请实施例对预先设置获取目标区域的 纹理信息还是颜色信息的方式不做限制。其中,示例性的n和m均为大于或等于1的整数。
S1204A,第一设备将第一纹理信息和第三颜色信息进行融合处理,得到融合信息。
第一设备将第一纹理信息和第三颜色信息进行融合处理的方式可以为:第一设备将第三颜色信息表征的颜色叠加在第一纹理信息表征的纹理上,得到叠加颜色后的纹理。第一设备可以基于叠加颜色后的纹理得到融合信息,其中,融合信息中包括叠加颜色后的纹理的矢量数据,可以参照上述实施例中的相关描述。
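Overlaying the third area's color onto the first texture can be approximated by re-tinting the pattern's foreground pixels with the extracted RGB value, as in the sketch below. Treating non-transparent pixels as foreground is an assumption made for this example.

```python
# Illustrative sketch: apply the extracted color to the texture's foreground pixels.
from PIL import Image

def tint_texture(texture: Image.Image, rgb: tuple) -> Image.Image:
    tinted = texture.convert("RGBA")
    pixels = tinted.load()
    for y in range(tinted.height):
        for x in range(tinted.width):
            r, g, b, a = pixels[x, y]
            if a > 0:                             # keep transparency, replace the color
                pixels[x, y] = (rgb[0], rgb[1], rgb[2], a)
    return tinted
```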
S1205A,第一设备向第二设备同步融合信息。
S1206A,第二设备基于触控笔在触摸屏上的位置,以融合信息表征的纹理和颜色显示笔迹。
示例性的，图13可以替换为图16，如图16中的a所示，用户采用触控笔在第一设备的界面上画一个圆形，则第一设备可以将该圆形中的区域作为第一目标区域，且获取第一目标区域的第一纹理信息：三角形。用户采用触控笔在第一设备的界面上又画一个圆形，则第一设备可以将该圆形中的区域作为第三目标区域，且获取第三目标区域的第三颜色信息：灰色（应理解，本申请实施例中以灰度表征颜色）。基于S1204A-S1205A中的描述，第二设备可以得到融合信息，参照图16中的b所示，当用户使用触控笔在第二设备上绘制笔迹时，第二设备可以以融合信息表征的纹理和颜色显示笔迹。
在一种实施例中,S1203A还可以替换为:第一设备获取第一目标区域的第一颜色信息和第三目标区域的第三颜色信息。
相应的,S1204A可以替换为:第一设备将第一颜色信息和第三颜色信息进行融合处理,得到融合后的颜色信息。S1205A可以替换为:第一设备向第二设备同步融合后的颜色信息。S1206A可以替换为:第二设备基于触控笔在触摸屏上的位置,以融合后的颜色显示笔迹。
其中,在一种实施例中,第一目标区域中的颜色可以称为第一颜色,第一目标区域中的纹理可以称为第一纹理,第一颜色的信息称为第一颜色信息,第一纹理的信息称为第一纹理信息。第三目标区域中的颜色可以称为第三颜色,第三目标区域中的纹理可以称为第三纹理,第三颜色的信息称为第三颜色信息,第三纹理的信息称为第三纹理信息。
在一种实施例中，用户在选择目标区域时，可以自主选择获取该目标区域的纹理信息或颜色信息，可以提高用户体验。下述以用户选择第一目标区域的纹理信息和第三目标区域的颜色信息为例进行说明。
在该种实施例中，如上S1202-S1206可以替换为S1202B-S1207B：
S1202B,第一设备响应于接收第一指令,检测触控笔在第一设备的界面上选择的第一目标区域。
S1203B,第一设备响应于检测到第一目标区域,显示待选择的颜色控件和纹理控件。
S1204B,第一设备响应于检测用户操作纹理控件,获取第一目标区域的第一纹理信息。
在该实施例中,纹理控件可以理解成第一目标控件。在一种实施例中,第一目标区域选择后对颜色控件和/或纹理控件的选择操作可以称为第八用户操作。
S1205B,第一设备检测触控笔在第一设备的界面上选择的第三目标区域。
S1206B,第一设备响应于检测到第三目标区域,显示待选择的颜色控件和纹理控件。
S1207B，第一设备响应于检测用户操作颜色控件，获取第三目标区域的第三颜色信息。
在该实施例中，颜色控件可以理解成第二目标控件。在一种实施例中，第三目标区域选择后对颜色控件和/或纹理控件的选择操作可以称为第九用户操作。
在S1207B之后,还可以执行S1205A-S1206A。
示例性的，在该实施例中，图13可以替换为图17，如图17中的a所示，用户采用触控笔在第一设备的界面上画一个圆形，第一设备可以在界面上显示用户可选择的颜色控件161和纹理控件162，且第一设备可以将该圆形中的区域作为第一目标区域。其中，用户可以选择颜色控件161或纹理控件162，以触发第一设备获取第一目标区域的颜色信息或纹理信息，示例性的，参照图17中的a，用户选择纹理控件162，则第一设备响应于检测到用户选择纹理控件162，可以获取第一目标区域的第一纹理信息。
同理的,参照图17中的b,当用户采用触控笔在第一设备的界面上的另一区域又画一个圆形,则第一设备可以在界面上显示用户可选择的颜色控件161和纹理控件162,且第一设备可以将该圆形中的区域作为第三目标区域。参照图17中的b,用户选择颜色控件161,则第一设备响应于检测到用户选择颜色控件161,可以获取第三目标区域的颜色信息。
应理解，在一种实施例中，参照图17中的b，用户可以同时选择颜色控件161和纹理控件162，进而触发第一设备获取第一目标区域的第一颜色信息和第一纹理信息，进而第一设备可以将第一目标区域的第一颜色信息和第一纹理信息，以及第三目标区域的第三颜色信息（或者第三纹理信息）进行融合，其中，纹理融合的方式可以参照图12中的相关描述。
颜色融合的方式可以为:第一设备将第一颜色信息中的R值和第三颜色信息中的R值加和后取平均值,得到融合后的R值,同理的,第一设备将第一颜色信息中的G值和第三颜色信息中的G值加和后取平均值,得到融合后的G值,以及第一设备将第一颜色信息中的B值和第三颜色信息中的B值加和后取平均值,得到融合后的B值,进而融合后的颜色信息包括融合后的R值、G值以及B值。
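The channel-wise averaging described here is simple enough to state in a few lines of Python; the helper below is an illustrative sketch of exactly that rule.

```python
# Minimal sketch: each fused channel is the mean of the two source channels.
def fuse_colors(first_rgb, third_rgb):
    return tuple((a + b) // 2 for a, b in zip(first_rgb, third_rgb))

# e.g. fuse_colors((255, 0, 0), (0, 0, 255)) -> (127, 0, 127)
```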
参照图17中的c所示,当用户使用触控笔在第二设备上绘制笔迹时,第二设备可以以融合纹理信息表征的纹理和颜色显示笔迹。
如上实施例中讲述了用户可以从第一设备中选择颜色和/或纹理,同步至第二设备中使用的示例,在一种实施例中,用户也可以从第二设备中选择颜色和/或纹理,同步至第一设备中使用。
本申请实施例中,用户可以在第一设备上选择多个目标区域,第一设备可以预先设置获取的每个目标区域的颜色和/或纹理,或者用户在选择目标区域时,可以选择获取该目标区域的颜色信息和/纹理信息,进而使得第一设备将多个目标区域的颜色信息和/或纹理信息进行融合,得到融合信息,进而使得第二设备可以基于融合信息表征的颜色和/或纹理显示触控笔的笔迹,进一步丰富了电子设备显示笔迹的样式。
图18为本申请实施例提供的一种笔迹绘制装置的结构示意图。本实施例所涉及的笔迹绘制装置可以为前述所说的第一设备,也可以为第一设备中的芯片。该笔迹绘制装置可以用于执行上述方法实施例中第一设备的动作。如图18所示,该笔迹绘制装置1800可以包括:显示模块1801、处理模块1802,以及收发模块1803。
显示模块1801,用于响应于在第一设备显示屏上的第一用户操作,显示第一图形界面,以及响应于在第一设备显示屏上的第二用户操作,显示第二图形界面,第二图形界面不同于第一图形界面。
处理模块1802,用于响应于在第一设备显示屏上的第三用户操作,在第二图形界面上选中第一目标区域,第一目标区域中包括第一颜色。
收发模块1803,用于向第二设备发送第一颜色的颜色信息,第一颜色的颜色信息用于指 示第二设备显示第一颜色的笔迹。
在一种可能的实现方式中,显示模块1801,还用于响应于在第一设备显示屏上的第六用户操作,显示第二颜色的笔迹,第二颜色为在第二设备的图形界面上选中的第二目标区域的颜色。
在一种可能的实现方式中,处理模块1802,还用于响应于接收来自触控笔的第一指令,在第一设备显示屏上检测第三用户操作,第一指令为触控笔检测到触控笔执行第一预设动作发送的;或者,响应于检测到触控笔执行第二预设动作,在第一设备显示屏上检测第三用户操作;或者,响应于检测到第一设备执行第三预设动作,在第一设备显示屏上检测第三用户操作。
在一种可能的实现方式中,第一目标区域还包括第一纹理。
收发模块1803,具体用于向第二设备发送第一颜色的颜色信息和第一纹理的纹理信息。
在一种可能的实现方式中,处理模块1802,还用于获取第一纹理的纹理信息。
在一种可能的实现方式中,处理模块1802,具体用于检测第一目标区域是否包含相同的图案;若是,则对图案进行截图,得到图案的图像;基于图案的图像,获取图案的矢量数据;将图案的矢量数据作为第一纹理的纹理信息;若否,则对第一目标区域进行截图,得到第一目标区域的图像;基于第一目标区域的图像,获取第一目标区域的矢量数据;将第一目标区域的矢量数据作为第一纹理的纹理信息。
在一种可能的实现方式中,处理模块1802,具体用于将第一目标区域划分为多个网格,每个网格具有第一预设尺寸;获取每两个网格中的图案的第一相似度;若存在大于或等于预设相似度的第一相似度,且大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定第一目标区域包含相同的图案。
在一种可能的实现方式中,处理模块1802,具体用于若不存在大于或等于预设相似度的第一相似度,或者占比小于预设占比,则增大网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;若存在大于或等于预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则确定第一目标区域包含相同的图案;若不存在大于或等于预设相似度的第二相似度,或者大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大网格的尺寸,直至网格的尺寸达到第二预设尺寸。
在一种可能的实现方式中,处理模块1802,还用于响应于在第一设备显示屏上的第七用户操作,在第二图形界面上选中第三目标区域,第三目标区域中包括第三颜色;融合第一颜色和第三颜色,得到融合颜色信息。
收发模块1803,具体用于向第二设备发送融合颜色信息,融合颜色信息用于指示第二设备显示第一颜色和第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中,处理模块1802,还用于响应于在第一设备显示屏上的第七用户操作,在第二图形界面上选中第三目标区域,第三目标区域中包括第三纹理;融合第一颜色和第三纹理,得到融合信息。
收发模块1803,具体用于向第二设备发送融合信息,融合信息用于指示第二设备显示第一颜色和第三纹理组合的笔迹。
在一种可能的实现方式中,第一目标区域还包括第一纹理。显示模块1801,还用于在第二图形界面上显示待选择的颜色控件和纹理控件。处理模块1802,还用于检测对颜色控件和/或纹理控件的选择的第八用户操作。
在一种可能的实现方式中,处理模块1802,还用于响应于在第一设备显示屏上的第七用 户操作,在第二图形界面上选中第三目标区域,第三目标区域中包括第三颜色和第三纹理。
显示模块1801,还用于在第二图形界面上显示待选择的颜色控件和纹理控件。
处理模块1802,还用于检测对颜色控件和/或纹理控件的选择的第九用户操作,将第八用户操作指示的第一信息和第九用户操作指示的第二信息进行融合,得到融合信息,第一信息为第一颜色的颜色信息和/或第一纹理的纹理信息,第二信息为第三颜色的颜色信息和/或第三纹理的纹理信息。
收发模块1803,具体用于向第二设备发送融合信息,融合信息用于指示第二设备显示第一颜色、第一纹理、第三颜色,和/或第三纹理组合的笔迹。
本申请实施例提供的笔迹绘制装置,可以执行上述方法实施例中第一设备的动作,其实现原理和技术效果类似,在此不再赘述。
图19为本申请实施例提供的一种笔迹绘制装置的结构示意图。本实施例所涉及的笔迹绘制装置可以为前述所说的第二设备,也可以为第二设备中的芯片。该笔迹绘制装置可以用于执行上述方法实施例中第二设备的动作。如图19所示,该笔迹绘制装置1900可以包括:收发模块1901、显示模块1902。
收发模块1901,用于接收来自第一设备的第一颜色的信息。
显示模块1902,用于响应于在第二设备显示屏上的第四用户操作,显示第一颜色的笔迹。
在一种可能的实现方式中,显示模块1902,还用于响应于在第二设备显示屏上的第五用户操作,在第二设备显示的图形界面上选中第二目标区域,第二目标区域中包括第二颜色。
收发模块1901,用于向第一设备发送第二颜色的颜色信息,第二颜色的颜色信息用于指示第一设备显示第二颜色的笔迹。
在一种可能的实现方式中，收发模块1901，具体用于接收来自第一设备的第一颜色的颜色信息和第一纹理的纹理信息。
显示模块1902,具体用于响应于在第二设备显示屏上的第四用户操作,显示由第一颜色和第一纹理组合的笔迹。
在一种可能的实现方式中，收发模块1901，具体用于接收来自第一设备的融合颜色信息，融合颜色信息为第一设备上的第一目标区域的第一颜色和第三目标区域的第三颜色融合后的信息。
显示模块1902,具体用于响应于在第二设备显示屏上的第四用户操作,显示第一颜色和第三颜色融合后的颜色的笔迹。
在一种可能的实现方式中，收发模块1901，具体用于接收来自第一设备的融合信息，融合信息为第一设备上的第一目标区域的第一颜色和第三目标区域的第三纹理融合后的信息。
显示模块1902,具体用于响应于在第二设备显示屏上的第四用户操作,显示第一颜色和第三纹理组合的笔迹。
在一种可能的实现方式中，收发模块1901，具体用于接收来自第一设备的融合信息，融合信息为第一设备上的第一目标区域的第一颜色、第一纹理、第三目标区域的第三颜色，和/或第三纹理融合信息。
显示模块1902,具体用于响应于在第二设备显示屏上的第四用户操作,显示第一颜色、第一纹理、第三颜色,和/或第三纹理组合的笔迹。
本申请实施例提供的笔迹绘制装置,可以执行上述方法实施例中第二设备的动作,其实现原理和技术效果类似,在此不再赘述。
在一种实施例中,本申请实施例还提供一种电子设备,该电子设备可以为上述实施例中的第一设备、第二设备、触控笔。该电子设备中可以包括:处理器(例如CPU)、存储器。存储器可能包含高速随机存取存储器(random-access memory,RAM),也可能还包括非易失性存储器(non-volatile memory,NVM),例如至少一个磁盘存储器,存储器中可以存储各种指令,以用于完成各种处理功能以及实现本申请的方法步骤。可选的,本申请涉及的电子设备还可以包括:电源、通信总线以及通信端口。上述通信端口用于实现电子设备与其他外设之间进行连接通信。在本申请实施例中,存储器用于存储计算机可执行程序代码,程序代码包括指令;当处理器执行指令时,指令使电子设备的处理器执行上述方法实施例中的动作,其实现原理和技术效果类似,在此不再赘述。
需要说明的是，上述实施例中的模块或部件可以是被配置成实施以上方法的一个或多个集成电路，例如：一个或多个专用集成电路（application specific integrated circuit，ASIC），或，一个或多个微处理器（digital signal processor，DSP），或，一个或者多个现场可编程门阵列（field programmable gate array，FPGA）等。再如，当以上某个模块通过处理元件调度程序代码的形式实现时，该处理元件可以是通用处理器，例如中央处理器（central processing unit，CPU）或其它可以调用程序代码的处理器如控制器。再如，这些模块可以集成在一起，以片上系统（system-on-a-chip，SOC）的形式实现。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件实现时,可以全部或部分地以计算机程序产品的形式实现。计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如固态硬盘Solid State Disk(SSD))等。
本文中的术语“多个”是指两个或两个以上。本文中术语“和/或”,仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。另外,本文中字符“/”,一般表示前后关联对象是一种“或”的关系;在公式中,字符“/”,表示前后关联对象是一种“相除”的关系。另外,需要理解的是,在本申请的描述中,“第一”、“第二”等词汇,仅用于区分描述的目的,而不能理解为指示或暗示相对重要性,也不能理解为指示或暗示顺序。
可以理解的是,在本申请的实施例中涉及的各种数字编号仅为描述方便进行的区分,并不用来限制本申请的实施例的范围。
可以理解的是,在本申请的实施例中,上述各过程的序号的大小并不意味着执行顺序的先后,各过程的执行顺序应以其功能和内在逻辑确定,而不应对本申请的实施例的实施过程构成任何限定。
实施例:
1.一种跨设备绘制系统，其特征在于，所述系统包括第一设备和第二设备，
所述第一设备配置为:
响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面;
响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;
响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色;
所述第二设备配置为:
响应于在所述第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
2.根据实施例1所述的系统，其特征在于，
所述第二设备还配置为:
响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选中第二目标区域,所述第二目标区域中包括第二颜色;
所述第一设备还配置为:
响应于在所述第一设备显示屏上的第六用户操作,显示所述第二颜色的笔迹。
3.根据实施例1或2所述的系统，其特征在于，
所述第一设备还配置为:
响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,
响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,
响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
4.根据实施例1或2所述的系统，其特征在于，所述第一目标区域还包括第一纹理；
所述第二设备还配置为:
响应于在所述第二设备显示屏上的所述第四用户操作,显示由所述第一颜色和所述第一纹理组合的笔迹。
5.根据实施例4所述的系统，其特征在于，所述第一设备还配置为：
在所述第二图形界面上选中所述第一目标区域之后,获取所述第一纹理的纹理信息。
6.根据实施例5所述的系统，其特征在于，所述第一设备具体配置为：
检测所述第一目标区域是否包含相同的图案;
若是,则对所述图案进行截图,得到所述图案的图像;
基于所述图案的图像,获取所述图案的矢量数据;
将所述图案的矢量数据作为所述第一纹理的纹理信息;
若否,则对所述第一目标区域进行截图,得到所述第一目标区域的图像;
基于所述第一目标区域的图像,获取所述第一目标区域的矢量数据;
将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
7.根据实施例6所述的系统，其特征在于，所述第一设备具体配置为：
将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;
获取每两个网格中的图案的第一相似度;
若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
8.根据实施例7所述的系统，其特征在于，所述第一设备具体配置为：
若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;
若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;
若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
9.根据实施例1-8中任一项所述的系统，其特征在于，所述第一设备还配置为：
响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色;
所述第二设备还配置为:
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
10.根据实施例1-8中任一项所述的系统，其特征在于，所述第一设备还配置为：
响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理;
所述第二设备还配置为:
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三纹理组合的笔迹。
11.根据实施例1-8中任一项所述的系统，其特征在于，所述第一目标区域还包括第一纹理，所述第一设备还配置为：
在所述第二图形界面上选中第一目标区域之后,在所述第二图形界面上显示待选择的颜色控件和纹理控件;
检测对所述颜色控件和/或纹理控件的选择的第八用户操作;
所述第二设备配置为:
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和/或所述第一纹理组合的笔迹。
12.根据实施例11所述的系统，其特征在于，所述第一设备还配置为：
响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理;
在所述第二图形界面上显示待选择的颜色控件和纹理控件;
检测对颜色控件和/或纹理控件的选择的第九用户操作;
所述第二设备还配置为:
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
13.一种应用于跨设备绘制系统的笔迹绘制方法，其特征在于，应用于第一设备，所述方法包括：
响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面;
响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;
响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色;
向第二设备发送所述第一颜色的颜色信息,所述第一颜色的颜色信息用于指示所述第二设备显示所述第一颜色的笔迹。
14.根据实施例13所述的方法,其特征在于,所述方法还包括:
响应于在所述第一设备显示屏上的第六用户操作,显示第二颜色的笔迹,所述第二颜色为在所述第二设备的图形界面上选中的第二目标区域的颜色。
15.根据实施例13或14所述的方法,其特征在于,所述响应于在所述第一设备显示屏上的第三用户操作之前,还包括:
响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,
响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,
响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
16.根据实施例13-15中任一项所述的方法,其特征在于,所述第一目标区域还包括第一纹理;
所述向第二设备发送所述第一颜色的颜色信息,包括:
向所述第二设备发送所述第一颜色的颜色信息和所述第一纹理的纹理信息。
17.根据实施例16所述的方法,其特征在于,所述在所述第二图形界面上选中第一目标区域之后,还包括:
获取所述第一纹理的纹理信息。
18.根据实施例17所述的方法,其特征在于,所述获取所述第一纹理的纹理信息,包括:
检测所述第一目标区域是否包含相同的图案;
若是,则对所述图案进行截图,得到所述图案的图像;
基于所述图案的图像,获取所述图案的矢量数据;
将所述图案的矢量数据作为所述第一纹理的纹理信息;
若否,则对所述第一目标区域进行截图,得到所述第一目标区域的图像;
基于所述第一目标区域的图像,获取所述第一目标区域的矢量数据;
将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
19.根据实施例18所述的方法,其特征在于,所述检测所述第一目标区域是否包含相同的图案,包括:
将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;
获取每两个网格中的图案的第一相似度;
若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
20.根据实施例19所述的方法,其特征在于,所述方法还包括:
若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;
若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;
若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
21.根据实施例13-20中任一项所述的方法,其特征在于,所述在所述第二图形界面上选中第一目标区域之后,还包括:
响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色;
融合所述第一颜色和所述第三颜色,得到融合颜色信息;
所述向第二设备发送所述第一颜色的颜色信息,包括:
向所述第二设备发送所述融合颜色信息,所述融合颜色信息用于指示所述第二设备显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
22.根据实施例13-20中任一项所述的方法,其特征在于,所述在所述第二图形界面上选中第一目标区域之后,还包括:
响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理;
融合所述第一颜色和所述第三纹理,得到融合信息;
所述向第二设备发送所述第一颜色的颜色信息,包括:
向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色和所述第三纹理组合的笔迹。
23.根据实施例13-20中任一项所述的方法,其特征在于,所述第一目标区域还包括第一纹理,所述在所述第二图形界面上选中第一目标区域之后,还包括:
在所述第二图形界面上显示待选择的颜色控件和纹理控件;
检测对所述颜色控件和/或纹理控件的选择的第八用户操作。
24.根据实施例23所述的方法,其特征在于,所述检测对所述颜色控件和/或纹理控件的选择的第八用户操作之后,还包括:
响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理;
在所述第二图形界面上显示待选择的颜色控件和纹理控件;
检测对颜色控件和/或纹理控件的选择的第九用户操作;
将所述第八用户操作指示的第一信息和所述第九用户操作指示的第二信息进行融合,得到融合信息,所述第一信息为第一颜色的颜色信息和/或第一纹理的纹理信息,所述第二信息为所述第三颜色的颜色信息和/或第三纹理的纹理信息;
所述向第二设备发送所述第一颜色的颜色信息,包括:
向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
25.一种应用于跨设备绘制系统的笔迹绘制方法，其特征在于，应用于第二设备，所述方法包括：
接收来自第一设备的第一颜色的信息;
响应于在所述第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
26.根据实施例25所述的方法,其特征在于,所述方法还包括:
响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选 中第二目标区域,所述第二目标区域中包括第二颜色;
向所述第一设备发送所述第二颜色的颜色信息,所述第二颜色的颜色信息用于指示所述第一设备显示所述第二颜色的笔迹。
27.根据实施例25或26所述的方法,其特征在于,所述接收来自第一设备的第一颜色的信息,包括:
接收来自所述第一设备的所述第一颜色的颜色信息和第一纹理的纹理信息;
响应于在所述第二设备显示屏上的所述第四用户操作,显示由所述第一颜色和所述第一纹理组合的笔迹。
28.根据实施例25或26所述的方法,其特征在于,所述接收来自第一设备的第一颜色的信息,包括:
接收来自所述第一设备的融合颜色信息,所述融合颜色信息为所述第一设备上的第一目标区域的第一颜色和第三目标区域的第三颜色融合后的信息;
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
29.根据实施例25或26所述的方法,其特征在于,所述接收来自第一设备的第一颜色的信息,包括:
接收来自所述第一设备的融合信息,所述融合信息为所述第一设备上的第一目标区域的第一颜色和第三目标区域的第三纹理融合后的信息;
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三纹理组合的笔迹。
30.根据实施例25或26所述的方法,其特征在于,所述接收来自第一设备的第一颜色的信息,包括:
接收来自所述第一设备的融合信息,所述融合信息为所述第一设备上的第一目标区域的第一颜色、第一纹理、第三目标区域的第三颜色,和/或第三纹理融合信息;
响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
31.一种应用于跨设备绘制系统的笔迹绘制装置，其特征在于，
显示模块,用于:
响应于在第一设备显示屏上的第一用户操作,显示第一图形界面;
响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;
处理模块,用于响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色;
收发模块,用于向第二设备发送所述第一颜色的颜色信息,所述第一颜色的颜色信息用于指示所述第二设备显示所述第一颜色的笔迹。
32.一种应用于跨设备绘制系统的笔迹绘制装置，其特征在于，
收发模块,用于接收来自第一设备的第一颜色的信息;
显示模块,用于响应于在第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
33.一种电子设备,其特征在于,包括:处理器和存储器;
所述存储器存储计算机执行指令;
所述处理器执行所述存储器存储的计算机执行指令,使得所述处理器执行如实施例13-30中任一项所述的方法。
34.一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有计算机程序或指令,当所述计算机程序或指令被运行时,实现如实施例13-30中任一项所述的方法。
35.一种计算机程序产品,包括计算机程序或指令,其特征在于,所述计算机程序或指令被处理器执行时,实现实施例13-30中任一项所述的方法。
36.一种笔迹绘制方法,其特征在于,应用于第一设备,所述第一设备和至少一个第二设备无线连接,所述第一设备和每个第二设备中均包括:纹理信息存储器,所述方法包括:
接收第一指令,所述第一指令用于指示获取用户在所述第一设备的界面上选择的目标区域的属性信息,所述属性信息包括:纹理信息,或者,纹理信息和颜色信息;
检测所述用户在所述第一设备的界面上选择的目标区域;
获取所述目标区域的属性信息;
将所述目标区域的属性信息存储至所述第一设备的纹理信息存储器,所述第一设备的纹理信息存储器用于将所述目标区域的属性信息同步至所述至少一个第二设备的纹理信息存储器,所述第二设备的纹理信息存储器中的所述目标区域的属性信息,用于所述第二设备以所述属性信息表征的纹理显示笔迹,或以所述属性信息表征的纹理和颜色显示笔迹。
37.根据实施例36所述的方法,其特征在于,所述接收所述第一指令,包括:
接收来自触控笔的所述第一指令,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的,所述第一设备和所述触控笔无线连接;或者,
响应于检测到所述触控笔执行第二预设动作,确定接收所述第一指令;或者,
响应于检测到所述第一设备执行第三预设动作,确定接收所述第一指令。
38.根据实施例36所述的方法,其特征在于,所述检测所述用户在所述第一设备的界面上选择的目标区域,包括:
检测所述用户在所述第一设备的触摸屏上的位置;
基于所述用户在所述第一设备的触摸屏上的位置,获取所述目标区域。
39.根据实施例38所述的方法,其特征在于,所述检测所述用户在所述第一设备的触摸屏上的位置,包括:
检测所述用户的手指或指关节在所述第一设备的触摸屏上的位置,或者检测所述用户使用的触控笔在所述第一设备的触摸屏上的位置,所述第一设备和所述触控笔无线连接。
40.根据实施例36中所述的方法,其特征在于,所述属性信息中包括纹理信息,获取所述目标区域的纹理信息,包括:
检测所述目标区域是否包含相同的图案;
若是,则对所述图案进行截图,得到所述图案的图像;
基于所述图案的图像,获取所述图案的矢量数据;
将所述图案的矢量数据作为所述目标区域的纹理信息;
若否,则对所述目标区域进行截图,得到所述目标区域的图像;
基于所述目标区域的图像,获取所述目标区域的矢量数据;
将所述目标区域的矢量数据作为所述目标区域的纹理信息。
41.根据实施例40所述的方法,其特征在于,所述检测所述目标区域是否包含相同的图案,包括:
在所述目标区域中绘制多个网格,所述网格具有第一预设尺寸;
获取每两个网格中的图案的第一相似度;
若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述目标区域包含相同的图案。
42.根据实施例41所述的方法,其特征在于,所述方法还包括:
若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;
若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述目标区域包含相同的图案;
若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
43.根据实施例36所述的方法,其特征在于,所述目标区域包括第一目标区域和第二目标区域,所述获取所述目标区域的属性信息,包括:
获取所述第一目标区域的第一纹理信息,以及获取所述第二目标区域的第二纹理信息;
将所述第一纹理信息和所述第二纹理信息进行融合处理,得到融合纹理信息。

Claims (24)

  1. 一种跨设备绘制系统，其特征在于，所述系统包括第一设备和第二设备，
    所述第一设备配置为:
    响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面;
    响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;
    响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色;
    所述第二设备配置为:
    响应于在所述第二设备显示屏上的第四用户操作,显示所述第一颜色的笔迹。
  2. 根据权利要求1所述的系统，其特征在于，
    所述第二设备还配置为:
    响应于在所述第二设备显示屏上的第五用户操作,在所述第二设备显示的图形界面上选中第二目标区域,所述第二目标区域中包括第二颜色;
    所述第一设备还配置为:
    响应于在所述第一设备显示屏上的第六用户操作,显示所述第二颜色的笔迹。
  3. 根据权利要求1或2所述的系统，其特征在于，
    所述第一设备还配置为:
    响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,
    响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,
    响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
  4. 根据权利要求1或2所述的系统，其特征在于，所述第一目标区域还包括第一纹理；
    所述第二设备还配置为:
    响应于在所述第二设备显示屏上的所述第四用户操作,显示由所述第一颜色和所述第一纹理组合的笔迹。
  5. 根据权利要求4所述的系统，其特征在于，所述第一设备还配置为：
    在所述第二图形界面上选中所述第一目标区域之后,获取所述第一纹理的纹理信息。
  6. 根据权利要求5所述的系统，其特征在于，所述第一设备具体配置为：
    检测所述第一目标区域是否包含相同的图案;
    若是,则对所述图案进行截图,得到所述图案的图像;
    基于所述图案的图像,获取所述图案的矢量数据;
    将所述图案的矢量数据作为所述第一纹理的纹理信息;
    若否,则对所述第一目标区域进行截图,得到所述第一目标区域的图像;
    基于所述第一目标区域的图像,获取所述第一目标区域的矢量数据;
    将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
  7. 根据权利要求6所述的系统，其特征在于，所述第一设备具体配置为：
    将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;
    获取每两个网格中的图案的第一相似度;
    若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
  8. 根据权利要求7所述的系统，其特征在于，所述第一设备具体配置为：
    若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;
    若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;
    若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
  9. 根据权利要求1-8中任一项所述的系统，其特征在于，所述第一设备还配置为：
    响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色;
    所述第二设备还配置为:
    响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
  10. 根据权利要求1-8中任一项所述的系统，其特征在于，所述第一设备还配置为：
    响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理;
    所述第二设备还配置为:
    响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和所述第三纹理组合的笔迹。
  11. 根据权利要求1-8中任一项所述的系统，其特征在于，所述第一目标区域还包括第一纹理，所述第一设备还配置为：
    在所述第二图形界面上选中第一目标区域之后,在所述第二图形界面上显示待选择的颜色控件和纹理控件;
    检测对所述颜色控件和/或纹理控件的选择的第八用户操作;
    所述第二设备配置为:
    响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色和/或所述第一纹理组合的笔迹。
  12. 根据权利要求11所述的系统，其特征在于，所述第一设备还配置为：
    响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理;
    在所述第二图形界面上显示待选择的颜色控件和纹理控件;
    检测对颜色控件和/或纹理控件的选择的第九用户操作;
    所述第二设备还配置为:
    响应于在所述第二设备显示屏上的所述第四用户操作,显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
  13. 一种应用于跨设备绘制系统的笔迹绘制方法，其特征在于，应用于第一设备，所述方法包括：
    响应于在所述第一设备显示屏上的第一用户操作,显示第一图形界面;
    响应于在所述第一设备显示屏上的第二用户操作,显示第二图形界面,所述第二图形界面不同于所述第一图形界面;
    响应于在所述第一设备显示屏上的第三用户操作,在所述第二图形界面上选中第一目标区域,所述第一目标区域中包括第一颜色;
    向第二设备发送所述第一颜色的颜色信息,所述第一颜色的颜色信息用于指示所述第二设备显示所述第一颜色的笔迹。
  14. 根据权利要求13所述的方法,其特征在于,所述方法还包括:
    响应于在所述第一设备显示屏上的第六用户操作,显示第二颜色的笔迹,所述第二颜色为在所述第二设备的图形界面上选中的第二目标区域的颜色。
  15. 根据权利要求13或14所述的方法,其特征在于,所述响应于在所述第一设备显示屏上的第三用户操作之前,还包括:
    响应于接收来自触控笔的第一指令,在所述第一设备显示屏上检测所述第三用户操作,所述第一指令为所述触控笔检测到所述触控笔执行第一预设动作发送的;或者,
    响应于检测到所述触控笔执行第二预设动作,在所述第一设备显示屏上检测所述第三用户操作;或者,
    响应于检测到所述第一设备执行第三预设动作,在所述第一设备显示屏上检测所述第三用户操作。
  16. 根据权利要求13-15中任一项所述的方法,其特征在于,所述第一目标区域还包括第一纹理;
    所述向第二设备发送所述第一颜色的颜色信息,包括:
    向所述第二设备发送所述第一颜色的颜色信息和所述第一纹理的纹理信息。
  17. 根据权利要求16所述的方法,其特征在于,所述在所述第二图形界面上选中第一目标区域之后,还包括:
    获取所述第一纹理的纹理信息。
  18. 根据权利要求17所述的方法,其特征在于,所述获取所述第一纹理的纹理信息,包括:
    检测所述第一目标区域是否包含相同的图案;
    若是,则对所述图案进行截图,得到所述图案的图像;
    基于所述图案的图像,获取所述图案的矢量数据;
    将所述图案的矢量数据作为所述第一纹理的纹理信息;
    若否,则对所述第一目标区域进行截图,得到所述第一目标区域的图像;
    基于所述第一目标区域的图像,获取所述第一目标区域的矢量数据;
    将所述第一目标区域的矢量数据作为所述第一纹理的纹理信息。
  19. 根据权利要求18所述的方法,其特征在于,所述检测所述第一目标区域是否包含相同的图案,包括:
    将所述第一目标区域划分为多个网格,每个网格具有第一预设尺寸;
    获取每两个网格中的图案的第一相似度;
    若存在大于或等于预设相似度的第一相似度,且所述大于或等于预设相似度的第一相似度的占比大于或等于预设占比,则确定所述第一目标区域包含相同的图案。
  20. 根据权利要求19所述的方法,其特征在于,所述方法还包括:
    若不存在大于或等于预设相似度的第一相似度,或者所述占比小于所述预设占比,则增大所述网格的尺寸,获取增大尺寸后每两个网格中的图案的第二相似度;
    若存在大于或等于所述预设相似度的第二相似度,且大于或等于预设相似度的第二相似度的占比大于或等于所述预设占比,则确定所述第一目标区域包含相同的图案;
    若不存在大于或等于所述预设相似度的第二相似度,或者所述大于或等于预设相似度的第二相似度的占比大于或等于预设占比,则继续增大所述网格的尺寸,直至所述网格的尺寸达到第二预设尺寸。
  21. 根据权利要求13-20中任一项所述的方法,其特征在于,所述在所述第二图形界面上选中第一目标区域之后,还包括:
    响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色;
    融合所述第一颜色和所述第三颜色,得到融合颜色信息;
    所述向第二设备发送所述第一颜色的颜色信息,包括:
    向所述第二设备发送所述融合颜色信息,所述融合颜色信息用于指示所述第二设备显示所述第一颜色和所述第三颜色融合后的颜色的笔迹。
  22. 根据权利要求13-20中任一项所述的方法,其特征在于,所述在所述第二图形界面上选中第一目标区域之后,还包括:
    响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三纹理;
    融合所述第一颜色和所述第三纹理,得到融合信息;
    所述向第二设备发送所述第一颜色的颜色信息,包括:
    向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色和所述第三纹理组合的笔迹。
  23. 根据权利要求13-20中任一项所述的方法,其特征在于,所述第一目标区域还包括第一纹理,所述在所述第二图形界面上选中第一目标区域之后,还包括:
    在所述第二图形界面上显示待选择的颜色控件和纹理控件;
    检测对所述颜色控件和/或纹理控件的选择的第八用户操作。
  24. 根据权利要求23所述的方法,其特征在于,所述检测对所述颜色控件和/或纹理控件的选择的第八用户操作之后,还包括:
    响应于在所述第一设备显示屏上的第七用户操作,在所述第二图形界面上选中第三目标区域,所述第三目标区域中包括第三颜色和第三纹理;
    在所述第二图形界面上显示待选择的颜色控件和纹理控件;
    检测对颜色控件和/或纹理控件的选择的第九用户操作;
    将所述第八用户操作指示的第一信息和所述第九用户操作指示的第二信息进行融合,得到融合信息,所述第一信息为第一颜色的颜色信息和/或第一纹理的纹理信息,所述第二信息为所述第三颜色的颜色信息和/或第三纹理的纹理信息;
    所述向第二设备发送所述第一颜色的颜色信息,包括:
    向所述第二设备发送所述融合信息,所述融合信息用于指示所述第二设备显示所述第一颜色、所述第一纹理、所述第三颜色,和/或所述第三纹理组合的笔迹。
PCT/CN2022/110186 2021-09-16 2022-08-04 跨设备绘制*** WO2023040505A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280002664.4A CN116137915A (zh) 2021-09-16 2022-08-04 跨设备绘制***

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN202111089486 2021-09-16
CN202111089486.7 2021-09-16
CN202111613488.1A CN114816135B (zh) 2021-09-16 2021-12-27 跨设备绘制***
CN202111613488.1 2021-12-27

Publications (1)

Publication Number Publication Date
WO2023040505A1 true WO2023040505A1 (zh) 2023-03-23

Family

ID=82527722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/110186 WO2023040505A1 (zh) 2021-09-16 2022-08-04 跨设备绘制***

Country Status (2)

Country Link
CN (2) CN114816135B (zh)
WO (1) WO2023040505A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114816135B (zh) * 2021-09-16 2023-11-03 华为技术有限公司 跨设备绘制***
CN117762606A (zh) * 2022-09-23 2024-03-26 华为技术有限公司 一种设备控制方法与电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150018237A (ko) * 2013-08-09 2015-02-23 삼성전자주식회사 전자 기기에서 사용자 맞춤 필기를 제공하는 방법 및 이를 수행하기 위한 전자 기기
US20150177975A1 (en) * 2013-12-19 2015-06-25 Samsung Electronics Co., Ltd. Electronic device and method for providing graphical user interface of the same
CN106775374A (zh) * 2016-11-17 2017-05-31 广州视源电子科技股份有限公司 基于触摸屏的颜色获取方法及装置
CN113362410A (zh) * 2021-05-31 2021-09-07 维沃移动通信(杭州)有限公司 绘图方法、装置、电子设备及介质
CN114816135A (zh) * 2021-09-16 2022-07-29 华为技术有限公司 跨设备绘制***

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102306099A (zh) * 2011-08-23 2012-01-04 上海网达软件有限公司 手持终端设备上跨平台的图形显示方法及图形显示***
CN104796455A (zh) * 2015-03-12 2015-07-22 安徽讯飞皆成软件技术有限公司 跨平台的多屏互动方法、装置及***
US10739988B2 (en) * 2016-11-04 2020-08-11 Microsoft Technology Licensing, Llc Personalized persistent collection of customized inking tools
CN107422974B (zh) * 2017-07-21 2020-01-07 广州视源电子科技股份有限公司 基于双***的笔迹书写显示方法和***、存储介质及设备
CN110083324A (zh) * 2019-04-30 2019-08-02 华为技术有限公司 图像绘制的方法、装置、电子设备及计算机存储介质
CN110187810B (zh) * 2019-05-27 2020-10-16 维沃移动通信有限公司 一种绘图方法及终端设备
CN114764298B (zh) * 2020-07-29 2023-03-03 华为技术有限公司 一种跨设备的对象拖拽方法及设备

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150018237A (ko) * 2013-08-09 2015-02-23 삼성전자주식회사 전자 기기에서 사용자 맞춤 필기를 제공하는 방법 및 이를 수행하기 위한 전자 기기
US20150177975A1 (en) * 2013-12-19 2015-06-25 Samsung Electronics Co., Ltd. Electronic device and method for providing graphical user interface of the same
CN106775374A (zh) * 2016-11-17 2017-05-31 广州视源电子科技股份有限公司 基于触摸屏的颜色获取方法及装置
CN113362410A (zh) * 2021-05-31 2021-09-07 维沃移动通信(杭州)有限公司 绘图方法、装置、电子设备及介质
CN114816135A (zh) * 2021-09-16 2022-07-29 华为技术有限公司 跨设备绘制***

Also Published As

Publication number Publication date
CN114816135B (zh) 2023-11-03
CN116137915A (zh) 2023-05-19
CN114816135A (zh) 2022-07-29

Similar Documents

Publication Publication Date Title
WO2023040505A1 (zh) 跨设备绘制***
CN105830422B (zh) 可折叠电子设备及其界面交互方法
US11158057B2 (en) Device, method, and graphical user interface for processing document
EP3183640B1 (en) Device and method of providing handwritten content in the same
US11947791B2 (en) Devices, methods, and systems for manipulating user interfaces
WO2013011862A1 (ja) 情報処理装置、操作画面表示方法、制御プログラムおよび記録媒体
TW201610815A (zh) 字元安排的強化解讀
TW201610914A (zh) 圖表資料的增強辨識
CN113325988B (zh) 多任务管理方法和终端设备
KR20140070040A (ko) 터치스크린 상에 표시되는 복수의 객체들을 관리하는 장치 및 방법
US10416783B2 (en) Causing specific location of an object provided to a device
US11526322B2 (en) Enhanced techniques for merging content from separate computing devices
US10649615B2 (en) Control interface for a three-dimensional graphical object
CN106716493A (zh) 对内容样式化的方法和对内容样式化的触摸屏设备
JP6591527B2 (ja) フォンパッド
EP3721327B1 (en) Dynamic interaction adaptation of a digital inking device
EP3084634A1 (en) Interaction with spreadsheet application function tokens
WO2020046452A1 (en) Computationally efficient human-computer interface for collaborative modification of content
US10565299B2 (en) Electronic apparatus and display control method
US20160132478A1 (en) Method of displaying memo and device therefor
CN105739821B (zh) 移动终端的操作处理方法及装置
CN114115691B (zh) 电子设备及其交互方法、介质
JP6271980B2 (ja) 情報処理装置、情報処理方法、及びコンピュータプログラム
US10496241B2 (en) Cloud-based inter-application interchange of style information
CN113031795B (zh) 控制方法、鼠标及上位机

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22868882

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE