WO2019080065A1 - Display method and device - Google Patents

Display method and device

Info

Publication number
WO2019080065A1
Authority
WO
WIPO (PCT)
Prior art keywords
display device
display
control device
distance
data
Prior art date
Application number
PCT/CN2017/107893
Other languages
English (en)
French (fr)
Inventor
陈浩
陈晓晓
王卿
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to CN201780089410.XA priority Critical patent/CN110537165B/zh
Priority to US16/612,936 priority patent/US11081086B2/en
Priority to PCT/CN2017/107893 priority patent/WO2019080065A1/zh
Priority to EP17929725.4A priority patent/EP3605314B1/en
Publication of WO2019080065A1 publication Critical patent/WO2019080065A1/zh
Priority to US17/391,714 priority patent/US20220020339A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 12/00 Data switching networks
    • H04L 12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L 12/2803 Home automation networks
    • H04L 12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • H04L 12/282 Controlling appliance services of a home automation network by calling their functionalities based on user interaction within the home
    • H04L 12/2821 Avoiding conflicts related to the use of home appliances
    • H04L 2012/2847 Home automation networks characterised by the type of home appliance used
    • H04L 2012/2849 Audio/video appliances
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G06F 3/013 Eye tracking input arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units controlling a plurality of local displays, e.g. CRT and flat panel display
    • G06F 3/1454 Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/12 Synchronisation between the display unit and other units, e.g. other display units, video-disc players
    • G09G 5/14 Display of multiple viewports
    • G09G 2330/00 Aspects of power supply; Aspects of display protection and defect management
    • G09G 2330/02 Details of power systems and of start or stop of display operation
    • G09G 2330/021 Power management, e.g. power saving
    • G09G 2354/00 Aspects of interface with display user
    • G09G 2356/00 Detection of the display position w.r.t. other display screens
    • G09G 2360/00 Aspects of the architecture of display systems
    • G09G 2360/04 Display device controller operating with a plurality of display units
    • G09G 2370/00 Aspects of data communication
    • G09G 2370/06 Consumer Electronics Control, i.e. control of another device by a display or vice versa
    • G09G 2370/08 Details of image data interface between the display device controller and the data line driver circuit
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04Q SELECTING
    • H04Q 2213/00 Indexing scheme relating to selecting arrangements in general and for multiplex systems
    • H04Q 2213/13175 Graphical user interface [GUI], WWW interface, visual indication

Definitions

  • the embodiments of the present invention relate to the field of communications technologies, and in particular, to a display method and apparatus.
  • Smart home, also known as home automation, is a residential platform that uses integrated wiring technology, network communication technology, security technology, automatic control technology, and audio/video technology to integrate home-related facilities and build an efficient management system for residential facilities and family schedules. It can improve home safety, convenience, comfort, and aesthetics, and achieve an environmentally friendly and energy-saving living environment.
  • multiple smart appliances in the same network can be automatically managed according to the location of the user. For example, when it is detected that the user enters the living room, the media is displayed on the television in the living room, and when it is detected that the user enters the bedroom, the media is switched to the television in the bedroom.
  • the embodiment of the present application provides a display method and device, which can implement seamless switching of display services between multiple devices, and improve collaboration efficiency between multiple devices.
  • An embodiment of the present application provides a display method, including: a control device receives a display request sent when a target display device needs to display a target service; in response to the display request, the control device determines a first display device and a second display device that support displaying the target service; the control device then requests the first display device to report a first distance between the first display device and the user, and requests the second display device to report a second distance between the second display device and the user. When the first distance is less than the second distance, the user is closer to the first display device, so the control device obtains first display data of the target service from the target display device and sends the first display data to the first display device, so that the first display device displays the target service according to the first display data. Subsequently, when the control device finds that the first distance reported by the first display device is greater than the second distance reported by the second display device, the user has moved closer to the second display device, and the control device can obtain second display data of the target service from the target display device and send the second display data to the first display device and the second display device respectively, so that the first display device and the second display device both display the target service according to the second display data.
  • In this way, when the control device switches the target service from the first display device to the second display device, the first display device and the second display device both display the target service for a period of time, so that the user can still see the target service playing in real time while moving away from the first display device and toward the second display device. This ensures a stable transition when the target service is switched between different display devices, gives the user a visually seamless handover of the target service across display devices, and improves the efficiency of collaboration between multiple devices.
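  • As a rough illustration only (none of the following names, classes, or values come from the patent), the distance-driven placement and dual-display switching described above might be sketched in Python as follows:

```python
# Hypothetical sketch of the control device's switching logic (all names invented).

class DisplayDevice:
    """Stand-in for a connected display device, e.g. a phone or a smart TV."""
    def __init__(self, name, distance_to_user):
        self.name = name
        self.distance_to_user = distance_to_user

    def report_distance(self):
        # In the patent this value would come from a distance sensor or camera.
        return self.distance_to_user

    def send_display_data(self, data):
        print(f"{self.name} now displays: {data}")


def place_or_switch(target_data, first, second):
    """Send the target service's display data to the device(s) the user should see."""
    d1, d2 = first.report_distance(), second.report_distance()
    if d1 < d2:
        # User is closer to the first display device: show the service there.
        first.send_display_data(target_data)
    elif d1 > d2:
        # User has moved closer to the second display device: during the switch,
        # both devices receive the data so playback is never interrupted.
        first.send_display_data(target_data)
        second.send_display_data(target_data)


tv = DisplayDevice("smart TV 202", distance_to_user=1.5)
phone = DisplayDevice("mobile phone 201", distance_to_user=4.0)
place_or_switch("video frame #42", first=phone, second=tv)  # sent to both devices
```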
  • The first display data includes at least one layer, among all layers of the target service, that the first display device supports displaying; and/or the second display data includes at least one layer, among all layers of the target service, that the second display device supports displaying.
  • The method further includes: when the second display device has displayed the target service for a preset time, it may be considered that the user has moved into the viewing range of the second display device, and therefore the control device may stop sending the second display data to the first display device, reducing the power consumption of the first display device.
  • The method further includes: when the second distance between the second display device and the user is less than a preset distance threshold, it may be determined that the user's current point of interest has transferred to the second display device; therefore, the control device may stop sending the second display data to the first display device, thereby reducing the power consumption of the first display device.
  • The method further includes: when the second distance between the second display device and the user is less than a preset distance threshold, and the duration for which the second distance stays below the distance threshold is greater than a preset time threshold, the control device determines that the user's current focus has transferred to the second display device; therefore, the control device can stop sending the second display data to the first display device.
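  • The variants above differ only in when the control device stops feeding the first display device. A minimal sketch of the duration-based variant, with invented values standing in for the patent's "preset" thresholds:

```python
# Hypothetical stop condition: the user has stayed close to the second display
# device for long enough, so the first display device no longer needs the data.
DISTANCE_THRESHOLD_M = 1.0   # invented preset distance threshold
TIME_THRESHOLD_S = 5.0       # invented preset time threshold

def should_stop_sending_to_first(second_distance_samples):
    """second_distance_samples: list of (timestamp_s, distance_m) reported by the
    second display device; returns True once the distance has stayed below
    DISTANCE_THRESHOLD_M for at least TIME_THRESHOLD_S."""
    below_since = None
    for ts, dist in second_distance_samples:
        if dist < DISTANCE_THRESHOLD_M:
            if below_since is None:
                below_since = ts
            if ts - below_since >= TIME_THRESHOLD_S:
                return True
        else:
            below_since = None
    return False

samples = [(0.0, 2.0), (1.0, 0.8), (3.0, 0.7), (7.0, 0.6)]
print(should_stop_sending_to_first(samples))  # True: below 1.0 m from t=1 s to t=7 s
```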
  • The method further includes: when the first distance reported by the first display device is equal to the second distance reported by the second display device, the control device instructs the first display device and the second display device to perform face detection. If a face detection result reported by the first display device is obtained, the user's current focus falls on the first display device; in this case, the control device does not need to switch the target service to the second display device, but may continue to send the second display data of the target service to the first display device, so that the first display device continues to display the target service.
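  • Reusing the DisplayDevice stand-in from the earlier sketch, the face-detection tie-break for equal distances might look roughly like this (again an assumption, not the patent's implementation):

```python
def pick_focus_device(first, second, devices_reporting_a_face):
    """Decide which display device currently holds the user's focus."""
    d1, d2 = first.report_distance(), second.report_distance()
    if d1 != d2:
        return first if d1 < d2 else second
    # Equal distances: fall back to face detection on both devices and keep the
    # service on whichever device actually sees the user's face.
    if first in devices_reporting_a_face:
        return first
    if second in devices_reporting_a_face:
        return second
    return first  # default: leave the service where it already is
```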
  • It can be seen that the control device can send the second display data of the target service to the first display device and the second display device respectively, so that the first display device and the second display device both display the target service. The target service can thus transition stably when switching between different display devices, the user visually experiences a seamless handover of the target service across display devices, and the efficiency of collaboration between multiple devices is improved.
  • An embodiment of the present application provides a display method, including: when a first display device displays a target service, it sends a display request for the target service to a control device; in response to the display request, the control device determines a second display device that supports displaying the target service; the control device requests the first display device to report a first distance between the first display device and the user, and requests the second display device to report a second distance between the second display device and the user; when the first distance is greater than the second distance, the user is located closer to the second display device, and the control device can obtain the current display data of the target service from the first display device and send the display data to the second display device, while instructing the first display device to continue displaying the target service.
  • the first display device and the second display device each have current display data of the target service, so that the first display device and the second display device can display the target service according to the display data.
  • An embodiment of the present application provides a display method, including: a target display device backs up its installed applications, stored files, and data in a control device and keeps them synchronized with the control device; when the target display device needs to display a target service, it sends a display request for the target service to the control device; in response to the display request, the control device determines that a first display device and a second display device can support displaying the target service; the control device requests the first display device to report a first distance between the first display device and the user, and requests the second display device to report a second distance between the second display device and the user. When the first distance is less than the second distance, the user is closer to the first display device, and the control device, which holds the backed-up data of the target display device, can send the current display data of the target service to the first display device. Subsequently, if the control device detects that the first distance reported by the first display device is greater than the second distance reported by the second display device, the user is now relatively close to the second display device, and the control device can send the current display data of the target service to the second display device while instructing the first display device to continue displaying the target service, so that the first display device and the second display device can both display the target service according to the current display data.
  • An embodiment of the present application provides a display method, including: when a first display device displays a target service, it determines that the candidate devices capable of displaying the target service further include a second display device; then, the first display device obtains a first distance between itself and the user, and instructs the second display device to report a second distance between the second display device and the user; when the first distance is greater than the second distance, the user is closer to the second display device, and the first display device can send the current display data of the target service to the second display device. At this time, the first display device and the second display device both have the current display data of the target service, so that the first display device and the second display device can both display the target service based on the display data.
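  • In this variant the decision is made on the first display device itself rather than on a central control device; under the same hypothetical API as the earlier sketch, the device-side handoff might look like this:

```python
# Hypothetical device-side handoff: the first display device compares distances
# itself and pushes its current display data to the peer when the user moves away.

def maybe_hand_off(self_device, peer_device, current_display_data):
    d_self = self_device.report_distance()
    d_peer = peer_device.report_distance()
    if d_self > d_peer:
        # User is closer to the peer device: share the current display data so
        # that both devices can render the target service during the handoff.
        peer_device.send_display_data(current_display_data)
        return True
    return False
```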
  • An embodiment of the present application provides a display system, including a control device, and a first display device, a second display device, and a target display device that are connected to the control device. The target display device is configured to send a display request to the control device when a target service needs to be displayed. The control device is configured to: in response to the display request, determine the first display device and the second display device that support displaying the target service; request the first display device to report a first distance between the first display device and the user, and request the second display device to report a second distance between the second display device and the user; and, when the first distance is less than the second distance, obtain first display data of the target service from the target display device and send the first display data to the first display device. The first display device is configured to display the target service according to the first display data. The control device is further configured to: when the first distance subsequently reported by the first display device is greater than the second distance reported by the second display device, obtain second display data of the target service from the target display device and send the second display data to the first display device and the second display device, so that the first display device and the second display device both display the target service according to the second display data.
  • An embodiment of the present application provides a control device, including a processor, and a memory and a transceiver connected to the processor. The memory stores program code, and the processor runs the program code to instruct the control device to perform the following steps: receiving a display request sent when a target display device needs to display a target service; in response to the display request, determining a first display device and a second display device that support displaying the target service; requesting the first display device to report a first distance between the first display device and the user, and requesting the second display device to report a second distance between the second display device and the user; when the first distance is less than the second distance, obtaining first display data of the target service from the target display device and sending the first display data to the first display device, so that the first display device displays the target service according to the first display data; and, when the first distance subsequently reported by the first display device is greater than the second distance reported by the second display device, obtaining second display data of the target service from the target display device and sending the second display data to the first display device and the second display device respectively.
  • The first display data includes at least one layer, among all layers of the target service, that the first display device supports displaying; and/or the second display data includes at least one layer, among all layers of the target service, that the second display device supports displaying.
  • After the second display data is sent to the first display device and the second display device, the program code further includes: stopping sending the second display data to the first display device when the second display device has displayed the target service for a preset time.
  • After the second display data is sent to the first display device and the second display device, the program code further includes: stopping sending the second display data to the first display device when the second distance is less than a preset distance threshold.
  • After the second display data is sent to the first display device and the second display device, the program code further includes: when the second distance is less than a preset distance threshold, determining the duration for which the distance between the user and the second display device remains below the distance threshold; and, if the duration is greater than a preset time threshold, stopping sending the second display data to the first display device.
  • The program code further includes: when the first distance subsequently reported by the first display device is equal to the second distance reported by the second display device, instructing the first display device and the second display device to perform face detection; if a face detection result reported by the first display device is obtained, obtaining the second display data of the target service from the target display device and sending the second display data to the first display device; if a face detection result reported by the second display device is obtained, obtaining the second display data of the target service from the target display device and sending the second display data to the first display device and the second display device respectively.
  • control device further includes a display connected to the processor, the display being configured to display the target service according to the first display data and/or the second display data sent by the target display device.
  • The embodiment of the present application provides a computer readable storage medium, where the computer readable storage medium stores an instruction; when the instruction is run on any one of the foregoing control devices, the control device is caused to perform any one of the foregoing display methods.
  • an embodiment of the present application provides a computer program product comprising instructions, when it is run on any one of the above control devices, causing the control device to perform any of the above display methods.
  • It should be noted that the names of the above control devices and of the various components in the control device do not limit the devices themselves; in actual implementations, these devices or components may appear under other names. As long as the functions of the devices or components are similar to those in the embodiments of the present application, they fall within the scope of the claims and their equivalents.
  • FIG. 1 is a schematic structural diagram 1 of a display system according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram 2 of a display system according to an embodiment of the present disclosure.
  • FIG. 3 is a schematic structural diagram 3 of a display system according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic diagram of dividing a layer according to an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram 4 of a display system according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram 5 of a display system according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic structural diagram 6 of a display system according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a display system according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 10 is a schematic flowchart 1 of a display method according to an embodiment of the present disclosure.
  • FIG. 11 is a schematic diagram 1 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 12 is a second schematic diagram of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 13 is a schematic diagram 3 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 14 is a second schematic flowchart of a display method according to an embodiment of the present disclosure.
  • FIG. 15 is a schematic diagram 4 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 16 is a schematic diagram 5 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 17 is a schematic diagram 6 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 18 is a schematic diagram 7 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 19 is a schematic diagram 8 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 20 is a schematic diagram 9 of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 21 is a schematic diagram of an application scenario of a display method according to an embodiment of the present disclosure.
  • FIG. 22 is a schematic structural diagram of a control device according to an embodiment of the present application.
  • The terms “first” and “second” are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with “first” or “second” may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, “multiple” means two or more unless otherwise stated.
  • a display method provided by an embodiment of the present application is applicable to the display system 100 shown in FIG. 1.
  • the above display system 100 includes a control device 200 and at least two display devices (e.g., the mobile phone 201, the smart TV 202, and the tablet 203 shown in FIG. 1) that are both communicable with the control device 200.
  • the control device 200 and the respective display devices may be connected through a wireless network (for example, Wi-Fi, Bluetooth, cellular mobile network, etc.) or a wired network (for example, an optical fiber, etc.), which is not limited in this embodiment.
  • The control device 200 stores device information that reflects the display capability of each display device. Taking the mobile phone 201 as an example, after establishing a connection with the control device 200, as shown in FIG. 2, the mobile phone 201 can transmit its own device information to the control device 200, for example, the screen resolution of the mobile phone 201, the rendering capability of its graphics processing unit (GPU), the frequency of its central processing unit (CPU), and so on.
  • the control device 200 stores the received device information of the mobile phone 201 in the memory of the control device 200 for filing.
  • each display device connected to the control device 200 can record its own device information in the control device 200. Subsequently, when a certain display device initiates a target service (for example, playing a video, running a game, etc.) that needs to be displayed, the corresponding display request and the corresponding display data of the target service may be transmitted to the control device 200. At this time, the control device 200 can determine a suitable display device as the target device for the current target service according to the device information of the respective display devices that have been recorded, and send the corresponding display data of the target service to the target device for display.
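  • As a loose illustration of this register-then-dispatch flow (the capability fields and selection policy below are invented for the sketch, not taken from the patent):

```python
# Hypothetical bookkeeping on the control device 200: each display device files
# its device information once; later display requests are routed to a target.

device_registry = {}  # device id -> recorded device information

def register_device(device_id, device_info):
    """Called when a display device connects and reports its device information."""
    device_registry[device_id] = device_info

def handle_display_request(request):
    """Pick a target device for the requested service and return its id."""
    candidates = [
        (dev_id, info) for dev_id, info in device_registry.items()
        if info["max_layer_size_mb"] >= request["layer_size_mb"]
    ]
    if not candidates:
        return None
    # Invented policy for the sketch: the device with the largest screen wins.
    target_id, _ = max(candidates, key=lambda item: item[1]["screen_resolution"])
    return target_id

register_device("phone-201", {"screen_resolution": (1080, 2340), "max_layer_size_mb": 20})
register_device("tv-202", {"screen_resolution": (3840, 2160), "max_layer_size_mb": 32})
print(handle_display_request({"layer_size_mb": 6}))  # tv-202
```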
  • the mobile phone 201 and the smart TV 202 are both connected to the control device 200.
  • The mobile phone 201 can analyze the attribute information of the to-be-displayed layers of the video call service that needs to be displayed, for example, the content (video, text, or picture) of each layer, the size of each layer, the privacy of each layer, and the like; the mobile phone 201 may then carry the analyzed attribute information of the to-be-displayed layers in the display request and send it to the control device 200.
  • the layer is a basic component of a display interface on the display device. After multiple layers are stacked in sequence, the final display effect of the display interface is formed.
  • Each layer can include one or more controls, and the rules for drawing each layer and the stacking order of multiple layers can be defined by the developer when developing the application.
  • Taking Android as an example, some basic layers are defined in Android, such as ImageView, AdapterView, and RelativeLayout; developers can use or modify these basic layers to draw custom layers.
  • The status bar 401 of WeChat in the chat interface and the input field 404 for entering text can be defined as layer 1, the chat background in the chat interface is defined as layer 2, the user's chat records in the chat interface are defined as layer 3, and layer 3 is located on top of layer 2.
  • The WeChat application can determine, according to the predefined layer rules, the three layers included in the chat interface, that is, layers 1 to 3 above, and analyze attributes such as each layer's content (video, text, or picture), size, and privacy. For example, layers 1 and 3 involve user privacy such as contact avatars and names, so the privacy of layer 1 and layer 3 is high, while layer 2 does not involve user privacy, so the privacy of layer 2 is low. Further, the attribute information of layers 1 to 3 obtained by the analysis is carried in the display request and sent to the control device 200.
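  • The per-layer attribute information in this example might, purely for illustration, be modelled as a small record with a content type, a size, and a privacy flag (the field names are assumptions, not the patent's wording):

```python
from dataclasses import dataclass

@dataclass
class LayerInfo:
    """Hypothetical per-layer attributes carried in the display request."""
    layer_id: int
    content_type: str   # "video", "text" or "picture"
    size_mb: float      # size of the layer data
    private: bool       # whether the layer involves user privacy

# The chat-interface example: layers 1 and 3 carry contact avatars and names
# (high privacy), layer 2 is only the chat background (low privacy).
chat_interface_layers = [
    LayerInfo(1, "text", 0.5, private=True),
    LayerInfo(2, "picture", 2.0, private=False),
    LayerInfo(3, "text", 1.0, private=True),
]
```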
  • The control device 200 can match the attribute information of the to-be-displayed layer sent by the mobile phone 201 against the device information of each display device. For example, if the attribute information indicates that the size of the to-be-displayed layer is 10 M and its privacy is weak, and the recorded device information shows that the smart TV 202 can support layers with a size greater than 8 M and weak privacy, the control device 200 can use the smart TV 202 as the target device for displaying the above video call service.
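  • The matching step itself then reduces to comparing the layer attributes with each recorded device's capabilities; in the sketch below, the smart TV's capability record (a generous size limit and low-privacy layers only) is an invented stand-in for the "supports layers larger than 8 M, weak privacy" example in the text:

```python
def device_supports_layer(layer, device_info):
    """Hypothetical capability check performed by the control device."""
    size_ok = layer["size_mb"] <= device_info["max_layer_size_mb"]
    privacy_ok = (not layer["private"]) or device_info["allows_private_layers"]
    return size_ok and privacy_ok

smart_tv_202 = {"max_layer_size_mb": 64, "allows_private_layers": False}
video_layer = {"size_mb": 10.0, "private": False}
print(device_supports_layer(video_layer, smart_tv_202))  # True: the TV becomes the target device
```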
  • the control device 200 can send the response information of the display request to the mobile phone 201, trigger the mobile phone 201 to generate the display data of the video call service (that is, the data of the layer to be displayed), and send it to the control device 200, as shown in FIG.
  • The control device 200 then transmits the display data to the smart TV 202, so that the smart TV 202 displays the image of the video call service received on the original mobile phone 201.
  • The control device 200 can also carry the identifier of the smart TV 202 in the response information, so that the mobile phone 201 can, according to the identifier of the smart TV 202, generate the display data of the video call service and send it to the smart TV 202 for display.
  • The control device 200 may also have image processing capabilities, such as image rendering capabilities. In that case, after receiving the display data of the video call service generated by the mobile phone 201, the control device 200 may re-render the display data according to device information such as the resolution of the smart TV 202 to obtain display data conforming to the display capability of the smart TV 202, and then transmit that display data to the smart TV 202 for display.
  • It can be seen that a plurality of the user's display devices can be interconnected through the control device 200, and the device information of each display device is recorded in the control device 200, so that the control device can intelligently select a suitable target device for the current target service according to the device information of each display device and project the corresponding layers of the target service onto the target device for display.
  • Any display device in the display system 100 can serve as the source device that provides screen source data when the target service is triggered, and the control device 200 in the display system 100 can intelligently determine the projection timing and the controlled device that displays the target service, so that the source device and the controlled device in a multi-screen display scenario can be flexibly set according to service requirements, thereby improving the cooperation efficiency between multiple devices.
  • Each display device in the above display system 100 can also back up all of its own data in the control device 200; for example, the mobile phone 201 can back up its installed applications and its stored files and data in the control device 200. In this way, when a target service needs to be displayed, the mobile phone 201 can directly hand the target service over to the control device 200. Since the control device 200 holds all of the data of the mobile phone 201, the control device 200 can, on behalf of the mobile phone 201, analyze the attribute information of the to-be-displayed layers of the target service, select a suitable target device (for example, the smart TV 202) for the target service according to the attribute information, and also generate the to-be-displayed layers on behalf of the mobile phone 201 and send them to the smart TV 202 for display. That is, the mobile phone 201 that initiates the target service only needs to report the target service to the control device 200 to obtain the smart projection function in which the target service is projected onto and displayed by other display devices, which can reduce the implementation complexity and power consumption of each display device in the display system 100.
  • When the control device 200 determines a suitable target device for the target service, it may also acquire the distance between the user and each display device in the display system 100 and determine the display device closest to the user as the target device for the target service.
  • For example, the user leaves the mobile phone 201 in the living room, enters the bedroom, and turns on the smart TV 202. When the mobile phone 201 receives a video call service, it sends the attribute information of the to-be-displayed layer of the video call service to the control device 200.
  • the control device 200 can determine, according to the attribute information of the layer to be displayed, a plurality of display devices that support the display of the layer to be displayed, for example, the tablet 203, the smart TV 202, and the mobile phone 201.
  • Furthermore, the control device 200 can, through sensors such as distance sensors or cameras provided on these three display devices, acquire the distance between the user and each of the three display devices, and then use the display device closest to the user (for example, the smart TV 202 located in the bedroom in FIG. 7) as the target device for displaying the to-be-displayed layer.
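  • The "closest display device wins" rule used in this scenario is simply a minimum over the reported distances; sensor access is abstracted away in this sketch:

```python
def pick_closest_device(distances):
    """distances: mapping from device name to its measured distance to the user (metres)."""
    return min(distances, key=distances.get)

# A FIG. 7 style situation: the phone is left in the living room, the user is in the bedroom.
print(pick_closest_device({"tablet-203": 6.0, "phone-201": 8.5, "tv-202": 1.2}))  # tv-202
```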
  • It should be noted that the specific form of the control device 200 in the display system 100 is not limited. For example, the control device 200 may be an independent physical device that is connected to each of the display devices; or, as shown in (b) of FIG. 8, the control device 200 can also be integrated, in the form of a functional module, into one or more of the display devices, that is, the control device 200 can itself be a display device with a display function in the display system 100. For example, in (b) of FIG. 8 the mobile phone 201 serves as the control device 200 of the display system 100 while also being a display device in the display system 100. Alternatively, as shown in (c) of FIG. 8, the control device 200 can also be one or more servers (or virtual machines) deployed in the cloud; in this case, each display device in the display system 100 can establish a connection relationship with the cloud-based control device 200 through the same user account. The embodiment of the present application does not impose any limitation on this.
  • In addition to projecting the target service of a display device onto other display devices for display, terminals having other output functions, for example a Bluetooth stereo with an audio output function, may also be connected to the control device 200 in the display system 100.
  • The display device (or the control device 200) in the above display system 100 may specifically be any terminal such as a mobile phone, a wearable device, an augmented reality (AR) or virtual reality (VR) device, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA).
  • The embodiment will be specifically described below by using the mobile phone 201 as a display device in the display system 100. It should be understood that the illustrated mobile phone 201 is only one example of the above terminal, and the mobile phone 201 may have more or fewer components than those shown in the figure, may combine two or more components, or may have a different component configuration.
  • The mobile phone 201 may specifically include components such as a processor 101, a radio frequency (RF) circuit 102, a memory 103, a touch screen 104, a Bluetooth device 105, one or more sensors 106, a Wi-Fi device 107, a positioning device 108, an audio circuit 109, a peripheral interface 110, and a power system 111. These components can communicate over one or more communication buses or signal lines (not shown in FIG. 9). It will be understood by those skilled in the art that the hardware structure shown in FIG. 9 does not constitute a limitation on the mobile phone, and the mobile phone 201 may include more or fewer components than those illustrated, combine some components, or use a different component arrangement.
  • The processor 101 is the control center of the mobile phone 201. It connects various parts of the mobile phone 201 by using various interfaces and lines, and performs the various functions of the mobile phone 201 and processes data by running or executing applications stored in the memory 103 and calling data stored in the memory 103.
  • processor 101 can include one or more processing units.
  • the processor 101 can integrate an application processor and a modem processor.
  • the application processor mainly processes an operating system, a user interface, an application, and the like; the modem processor mainly processes wireless communication.
  • the above modulation and demodulation processor and the application processor may also be independently set.
  • The processor 101 may include a GPU 115 and a CPU 116, or may be a combination of a GPU 115, a CPU 116, a digital signal processor (DSP), and a control chip (for example, a baseband chip) in a communication unit.
  • both the GPU 115 and the CPU 116 may be a single operation core, and may also include multiple operation cores.
  • the GPU 115 is a microprocessor that performs image computing operations on personal computers, workstations, game consoles, and some mobile devices (such as tablets, smart phones, etc.). It can convert the display information required by the mobile phone 201 and provide a line scan signal to the display 104-2 to control the correct display of the display 104-2.
  • For example, the CPU 116 can send a corresponding drawing command to the GPU 115. The drawing command may be “draw a rectangle with length a and width b at the coordinate position (x, y)”, and the GPU 115 can then quickly calculate all the pixels of the graphic according to the drawing command and draw the corresponding graphic at the specified position on the display 104-2.
  • the GPU 115 may be integrated in the processor 101 in the form of a function module, or may be disposed in the mobile phone 201 in an independent physical form (for example, a video card), which is not limited in this embodiment.
  • the radio frequency circuit 102 can be used to receive and transmit wireless signals during transmission or reception of information or calls.
  • Specifically, the radio frequency circuit 102 can receive downlink data from a base station and deliver it to the processor 101 for processing, and can also send uplink data to the base station.
  • radio frequency circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency circuit 102 can also communicate with other devices through wireless communication.
  • the wireless communication can use any communication standard or protocol, including but not limited to global mobile communication systems, general packet radio services, code division multiple access, wideband code division multiple access, long term evolution, email, short message service, and the like.
  • the memory 103 is used to store applications and data, and the processor 101 executes various functions and data processing of the mobile phone 201 by running applications and data stored in the memory 103.
  • the memory 103 mainly includes a storage program area and a storage data area, wherein the storage program area can store an operating system, an application required for at least one function (such as a sound playing function, an image playing function, etc.); the storage data area can be stored according to the use of the mobile phone. Data created at 201 (such as audio data, phone book, etc.).
  • The memory 103 may include a high-speed random access memory (RAM), and may also include a non-volatile memory such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The memory 103 can store various operating systems, for example, the iOS operating system developed by Apple, the Android operating system developed by Google Inc., and the like.
  • the above memory 103 may be independent and connected to the processor 101 via the above communication bus; the memory 103 may also be integrated with the processor 101.
  • the touch screen 104 may specifically include a touch panel 104-1 and a display 104-2.
  • The touch panel 104-1 can collect touch events performed by the user of the mobile phone 201 on or near it (such as operations performed by the user with a finger, a stylus, or any other suitable object on or near the touch panel 104-1), and send the collected touch information to another device (for example, the processor 101).
  • A touch event performed by the user near the touch panel 104-1 may be referred to as a hovering touch; a hovering touch may mean that the user does not need to directly touch the touchpad in order to select, move, or drag a target (for example, an icon), but only needs to be located near the terminal in order to perform the desired function.
  • the touch panel 104-1 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic waves.
  • The display (also referred to as a display screen) 104-2 can be used to display information entered by the user or information provided to the user, as well as various menus of the mobile phone 201.
  • the display 104-2 can be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the touchpad 104-1 can be overlaid on the display 104-2, and when the touchpad 104-1 detects a touch event on or near it, it is transmitted to the processor 101 to determine the type of touch event, and then the processor 101 may provide a corresponding visual output on display 104-2 depending on the type of touch event.
  • The touchpad 104-1 and the display 104-2 are implemented as two separate components to realize the input and output functions of the mobile phone 201; however, in some embodiments, the touchpad 104-1 and the display 104-2 may be integrated to realize the input and output functions of the mobile phone 201.
  • The touch screen 104 is formed by stacking multiple layers of materials. In the embodiment of the present application, only the touch panel (layer) and the display screen (layer) are shown, and the other layers are not described.
  • In addition, the touch panel 104-1 may be disposed on the front surface of the mobile phone 201 in the form of a full panel, and the display screen 104-2 may also be disposed on the front surface of the mobile phone 201 in the form of a full panel, so that a bezel-less structure can be achieved on the front of the mobile phone.
  • the mobile phone 201 can also include a Bluetooth device 105 for enabling data exchange between the handset 201 and other short-range terminals (eg, mobile phones, smart watches, etc.).
  • the Bluetooth device in the embodiment of the present application may be an integrated circuit or a Bluetooth chip or the like.
  • the handset 201 can also include at least one sensor 106, such as a fingerprint acquisition device 112, a light sensor, a motion sensor, and other sensors.
  • The fingerprint collection device 112 may be disposed on the back of the mobile phone 201 (for example, below the rear camera) or on the front of the mobile phone 201 (for example, below the touch screen 104). As another example, the fingerprint collection device 112 may be disposed in the touch screen 104 to implement the fingerprint recognition function, that is, the fingerprint collection device 112 can be integrated with the touch screen 104 to implement the fingerprint recognition function of the mobile phone 201.
  • the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display of the touch screen 104 according to the brightness of the ambient light, and the proximity sensor may turn off the power of the display when the mobile phone 201 moves to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in all directions (usually three axes). When it is stationary, it can detect the magnitude and direction of gravity. It can be used to identify the gesture of the mobile phone (such as horizontal and vertical screen switching, related Game, magnetometer attitude calibration), vibration recognition related functions (such as pedometer, tapping), etc.
  • The mobile phone 201 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described in detail here.
  • The sensors 106 of the mobile phone 201 further include a distance sensor 113, which can be used to sense the distance between the mobile phone and an object (or the user) so as to complete a certain function. Depending on its working principle, the distance sensor 113 may be, for example, an optical distance sensor, an infrared distance sensor, or an ultrasonic distance sensor, which is not limited in the embodiment of the present application.
  • The Wi-Fi device 107 is configured to provide the mobile phone 201 with network access complying with Wi-Fi related standard protocols. The mobile phone 201 can access a Wi-Fi access point through the Wi-Fi device 107, thereby helping the user send and receive emails, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access.
  • the Wi-Fi device 107 can also function as a Wi-Fi wireless access point, and can provide Wi-Fi network access to other terminals.
  • The positioning device 108 is configured to provide a geographic location for the mobile phone 201. It can be understood that the positioning device 108 may specifically be a receiver of a positioning system such as the global positioning system (GPS), the BeiDou satellite navigation system, or the Russian GLONASS system. After receiving the geographic location sent by the positioning system, the positioning device 108 sends the information to the processor 101 for processing, or sends it to the memory 103 for storage. In still other embodiments, the positioning device 108 may also be a receiver of an assisted global positioning system (AGPS); the AGPS system assists the positioning device 108 in completing ranging and positioning services by acting as an assistance server.
  • the secondary location server provides location assistance over a wireless communication network in communication with a location device 108 (i.e., a GPS receiver) of a terminal, such as handset 201.
  • In still other embodiments, the positioning device 108 can also use positioning technology based on Wi-Fi access points. Since every Wi-Fi access point has a globally unique media access control (MAC) address, the terminal can scan and collect the broadcast signals of surrounding Wi-Fi access points when Wi-Fi is turned on, and thereby obtain the MAC addresses broadcast by the Wi-Fi access points.
  • The terminal sends the data capable of identifying the Wi-Fi access points (such as their MAC addresses) to the location server through the wireless communication network; the location server retrieves the geographic location of each Wi-Fi access point, combines it with the strength of the Wi-Fi broadcast signals, calculates the geographic location of the terminal, and sends it to the positioning device 108 of the terminal.
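  • A loose sketch of the terminal-side step of this Wi-Fi based positioning (the payload format and field names are assumptions, not part of the patent):

```python
# Collect the MAC address and signal strength of nearby access points; the
# resulting request would be sent to the location server, which looks up the
# access points' known positions and estimates the terminal's location from
# the signal strengths.

def build_positioning_request(scan_results):
    """scan_results: list of (mac_address, rssi_dbm) tuples from a Wi-Fi scan."""
    return {
        "access_points": [{"mac": mac, "rssi_dbm": rssi} for mac, rssi in scan_results]
    }

request = build_positioning_request([("aa:bb:cc:dd:ee:01", -48), ("aa:bb:cc:dd:ee:02", -71)])
print(request)
```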
  • the audio circuit 109, the speaker 113, and the microphone 114 can provide an audio interface between the user and the handset 201.
  • On the one hand, the audio circuit 109 can convert received audio data into an electrical signal and transmit it to the speaker 113, which converts it into a sound signal for output; on the other hand, the microphone 114 converts a collected sound signal into an electrical signal, which the audio circuit 109 receives and converts into audio data, and then outputs the audio data to the RF circuit 102 to be sent to, for example, another mobile phone, or outputs the audio data to the memory 103 for further processing.
  • The peripheral interface 110 is used to provide various interfaces for external input/output devices (such as a keyboard, a mouse, an external display, an external memory, and a subscriber identity module card). For example, it is connected to a mouse through a universal serial bus (USB) interface, and is connected, through a metal contact on the card slot, with a subscriber identity module (SIM) card provided by a telecommunications operator. The peripheral interface 110 can be used to couple the aforementioned external input/output peripherals to the processor 101 and the memory 103.
  • The mobile phone 201 may further include a power supply device 111 (such as a battery and a power management chip) that supplies power to the various components. The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power consumption management are implemented through the power supply device 111.
  • the mobile phone 201 may further include a camera (front camera and/or rear camera), a flash, a micro projection device, a near field communication (NFC) device, and the like, and details are not described herein.
  • The networking method of the display system 100 is shown in FIG. 10 and includes:
  • the first display device sends a first connection request to the control device.
  • After receiving the first connection request, the control device establishes a connection relationship with the first display device.
  • Here, the case in which the first display device (for example, the above-mentioned mobile phone 201) actively establishes a connection with the control device is used as an example for description.
  • When the control device 200 accesses a certain network, for example, a local area network with the Wi-Fi name "1234", it may carry its own identifier (for example, the MAC address of the control device 200) in first indication information and broadcast it periodically; the first indication information is used to indicate that the sender is the control device 200. Then, when the mobile phone 201 also accesses the local area network with the Wi-Fi name "1234", it can receive the first indication information and thereby identify the current control device 200.
  • the processor of the mobile phone 201 can call its Wi-Fi device to send a first connection request to the control device 200 via the Wi-Fi network named "1234" according to the identifier of the control device 200.
  • a connection request for requesting to establish a connection relationship between the mobile phone 201 and the control device 200 may be carried in the first connection request.
  • control device 200 can store the identifier of the mobile phone 201 in the memory to establish a connection relationship with the mobile phone 201. Subsequently, both the control device 200 and the mobile phone 201 can find each other to communicate by identifying each other.
  • the display can be performed on the mobile phone 201.
  • the candidate control device list 1001 is displayed in the interface, and the user selects the control device in the local area network. For example, the user clicks “My Mobile Phone” in the candidate control device list 1001, that is, the mobile phone 201.
  • the mobile phone 201 can set itself as the control device 200 and carry its own identifier in the first indication information to periodically broadcast. Then, after receiving the first indication information, the other display devices in the local area network may carry their own identifiers in the first connection request and send them to the mobile phone 201 (ie, the control device), so that the mobile phone 201 stores the received identifiers, thereby Establish a connection relationship with each display device in the local area network.
  • 801b. The control device sends a second connection request to the first display device.
  • 802b. After receiving the second connection request, the first display device establishes a connection relationship with the control device.
  • In steps 801b-802b, the case in which the control device actively establishes a connection with the first display device (for example, the mobile phone 201) is used as an example for description.
  • Similar to steps 801a-802a, the control device 200 can carry its own identifier in the second connection request and send it to the mobile phone 201. Then, after receiving the second connection request, the mobile phone 201 can store the identifier of the control device 200 and send its own identifier to the control device 200, so that the control device 200 also stores the identifier of the mobile phone 201 in its own memory, thereby establishing a connection relationship with the mobile phone 201. Subsequently, the control device 200 and the mobile phone 201 can each find the other by its identifier in order to communicate.
  • It should be noted that the foregoing description uses the establishment of a connection relationship between the first display device and the control device as an example; other display devices may also establish connection relationships with the control device according to the foregoing method, so as to assemble the display system 100 shown in FIGS. 1-8.
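As a minimal illustration of how the discovery and connection handshake in steps 801a-802a could be organized, the sketch below models the control device's periodic first indication information and a display device's first connection request as simple JSON messages. The message field names and classes are assumptions made for this sketch only; the embodiment itself specifies just the broadcast of the control device's identifier and the reply carrying the display device's identifier.

```python
# Sketch of steps 801a-802a (illustrative only; field names and classes are assumed).
import json

class ControlDevice:
    def __init__(self, mac):
        self.mac = mac
        self.connected_displays = {}                   # identifier -> device info filed later (step 804)

    def first_indication(self):
        # First indication information: periodically broadcast, carries the control device's identifier.
        return json.dumps({"type": "FIRST_INDICATION", "control_id": self.mac})

    def on_first_connection_request(self, raw):
        req = json.loads(raw)
        if req.get("type") == "FIRST_CONNECTION_REQUEST":
            # Step 802a: store the display device's identifier to establish the connection relationship.
            self.connected_displays[req["device_id"]] = {}
            return json.dumps({"type": "CONNECT_ACK", "control_id": self.mac})

class DisplayDevice:
    def __init__(self, mac):
        self.mac = mac
        self.control_id = None

    def on_broadcast(self, raw):
        msg = json.loads(raw)
        if msg.get("type") == "FIRST_INDICATION":
            self.control_id = msg["control_id"]        # the display device learns which node is the control device
            # Step 801a: send the first connection request carrying its own identifier.
            return json.dumps({"type": "FIRST_CONNECTION_REQUEST", "device_id": self.mac})

control = ControlDevice("AA:BB:CC:00:00:01")
phone = DisplayDevice("AA:BB:CC:00:00:02")
request = phone.on_broadcast(control.first_indication())
print(control.on_first_connection_request(request))    # connection relationship established
```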
  • 803. The first display device sends the device information of the first display device to the control device.
  • 804. After receiving the device information, the control device saves it in the memory of the control device for the record.
  • Still taking the mobile phone 201 as the first display device as an example, in step 803, because the mobile phone 201 has already established a connection relationship with the control device, the mobile phone 201 can send its own device information to the control device according to the saved identifier of the control device 200. The device information includes, for example, parameters that reflect the display capability of the mobile phone 201, such as its screen resolution, the rendering capability of its GPU, and the frequency of its CPU; parameters that reflect the sound playback capability of the mobile phone 201, such as the audio formats it supports; and parameters such as whether the display of user privacy is supported. The embodiment of the present application does not impose any limitation on this.
  • The user privacy may specifically include information that the user does not wish to disclose, such as secure transaction information (for example, a stock trading page), information of a chat nature (for example, short messages and message notifications), the location information of the user, and contact numbers.
  • For example, a display device may determine whether it supports the display of user privacy based on parameters such as the type of the display device and/or the geographic location in which the display device is located. When the display device is a highly mobile device, such as a mobile phone or a wearable device, the user usually carries such a device along, that is, the privacy level of such a device is high, so it can be determined that such a device supports displaying user privacy. When the display device is a less mobile device, such as a Bluetooth speaker or a smart TV, the location of such a device is relatively fixed and usually cannot move with the user, that is, the privacy level of such a device is low, so it can be determined that such a device does not support displaying user privacy.
  • Further, in step 804, after receiving the device information sent by the mobile phone 201, the control device 200 can store the correspondence between the mobile phone 201 and its device information in the memory of the control device 200 for the record.
  • The control device 200 can record the device information received from each display device in this way. Then, as shown in Table 1, the device information of each display device is maintained in the control device 200. Subsequently, when a certain target service needs to be displayed, the control device 200 can determine, according to the recorded device information of each display device, a suitable display device for the target service as the target device to display the target service.
  • For example, after the mobile phone 201 receives an incoming call service, it can send to the control device 200 the attribute information of one or more to-be-displayed layers related to the incoming call service, such as the resolution required by the layer to be displayed, the CPU and GPU capabilities required by the layer to be displayed, and whether the layer to be displayed involves user privacy. The control device 200 matches the received attribute information of the layer to be displayed against the device information of each display device recorded in Table 1, to obtain one or more display devices that support displaying the layer to be displayed.
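This matching step can be pictured as a simple filter over the filed device information. The sketch below is illustrative only: the capability fields (maximum layer size, privacy support) and their values are assumptions standing in for whatever device information is actually recorded in Table 1.

```python
# Illustrative matching of layer attributes against filed device information (Table 1 stand-in).
DEVICE_TABLE = {
    "mobile phone 201": {"max_layer_mb": 20, "privacy_ok": True},
    "smart TV 202":     {"max_layer_mb": 50, "privacy_ok": False},
    "tablet 203":       {"max_layer_mb": 30, "privacy_ok": True},
}

def supporting_devices(layer_attrs, table=DEVICE_TABLE):
    """Return the display devices whose filed capabilities support the to-be-displayed layer."""
    matches = []
    for name, info in table.items():
        if layer_attrs["size_mb"] > info["max_layer_mb"]:
            continue                                    # layer too large for this device to display
        if layer_attrs["needs_privacy"] and not info["privacy_ok"]:
            continue                                    # device does not support displaying user privacy
        matches.append(name)
    return matches

incoming_call_layer = {"size_mb": 10, "needs_privacy": False}
print(supporting_devices(incoming_call_layer))          # all three devices qualify in this example
```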
  • For example, the control device 200 determines that the mobile phone 201, the smart TV 202, and the tablet computer 203 in Table 1 all support displaying the to-be-displayed layer of the incoming call service. Then, to help the user learn of the incoming call service in time, when the mobile phone 201, the smart TV 202, and the tablet computer 203 are all connected to the control device 200, the control device may send second indication information to the mobile phone 201, the smart TV 202, and the tablet computer 203, where the second indication information is used to instruct each display device to report the distance between it and the user.
  • Further, after receiving the second indication information, the mobile phone 201, the smart TV 202, and the tablet computer 203 can periodically detect the distance to the user through their own distance sensors (such as a camera or an infrared sensor), or obtain it in other existing ways, and report the detected distance to the control device 200.
  • In this way, the control device 200 can determine the display device closest to the user, for example the smart TV 202 in the bedroom, as the target device that displays the to-be-displayed layer of the incoming call service. Moreover, because the target device can be selected according to the real-time distance while the incoming call service is in progress, when the distances between the user and the multiple display devices change, the incoming call service can be switched freely among the multiple display devices. This improves the cooperation efficiency between the multiple devices and can greatly improve the user experience.
  • In this embodiment of the present application, a floor plan of the location where each display device is situated may also be pre-stored in the control device 200. FIG. 12 is a schematic diagram of the one-bedroom, one-living-room floor plan of the home where the user lives. Each display device in the display system 100 can report its own location information to the control device 200 by using a positioning device (for example, GPS), and the control device can determine the specific location of each display device in the user's home in combination with the floor plan shown in FIG. 12. As shown in FIG. 12, a television 1 is placed in the bedroom, a television 2 and a mobile phone 3 are placed in the living room, and a tablet computer 4 is placed in the kitchen.
  • Then, when the control device 200 determines, according to the distance between the user and each display device, the target device that displays the target service, it may combine the specific locations of the display devices shown in FIG. 12 and select, as the target device, the display device that is in the same room as the user and closest to the user (for example, the television 2 in the living room in FIG. 12). This avoids the problem that the target device determined for the user is not in the room where the user is located, so that the user cannot handle the target service in time.
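The selection rule just described (restrict to the user's room, then take the nearest device) can be written down in a few lines. The rooms, coordinates, and the user-position estimate in the sketch are assumptions for illustration; only the two-stage rule itself comes from the embodiment.

```python
# Illustrative target selection using the pre-stored floor plan (positions and rooms are assumed).
import math

DEVICE_LOCATIONS = {
    "television 1": {"room": "bedroom",     "pos": (1.0, 6.0)},
    "television 2": {"room": "living room", "pos": (4.0, 2.0)},
    "tablet 4":     {"room": "kitchen",     "pos": (8.0, 5.0)},
}

def pick_target(user_room, user_pos, candidates):
    """Choose, among the capable candidates, the device in the user's room that is closest to the user."""
    in_room = [d for d in candidates if DEVICE_LOCATIONS[d]["room"] == user_room]
    if not in_room:
        return None                                     # no capable device in the user's room
    return min(in_room, key=lambda d: math.dist(user_pos, DEVICE_LOCATIONS[d]["pos"]))

print(pick_target("living room", (3.0, 2.5), list(DEVICE_LOCATIONS)))   # -> "television 2"
```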
  • In addition, each display device in the home can also periodically report its distance to the user to the control device 200, for example every 30 seconds. Then, when the user moves within the home, for example, as shown in FIG. 13, from point A in the living room to point B at the bedroom door, the control device can obtain the distance between the user and each display device in real time. When the distance D1 between the user and the television 1 in the bedroom becomes smaller than the distance D2 between the user and the television 2 in the living room, the target service can be dynamically switched from the television 2 in the living room to the television 1 in the bedroom, which is now closest to the user.
  • However, when the control device 200 switches the target service from the television 2 to the television 1, the user may not yet have entered the bedroom, or may not have entered the optimal viewing area in the bedroom, causing the user to miss the relevant pictures of the target service.
  • In view of this, this embodiment of the present application provides a display method. Taking as an example the case in which the display system 100 includes the control device 200, and a first display device and a second display device connected to the control device 200, as shown in FIG. 14, the method includes the following steps:
  • 901. The control device acquires a first distance between the first display device and the user, and a second distance between the second display device and the user.
  • Specifically, a distance sensor may be disposed in each of the first display device and the second display device. The first display device can measure, through its distance sensor, the current first distance between itself and the user, and the second display device can measure, through its distance sensor, the current second distance between itself and the user. Then, the first display device and the second display device can respectively send the measured first distance and second distance to the control device.
  • Of course, if the user is not within the ranging range of the first display device (or the second display device) at this time, for example, the second display device is in the bedroom and does not detect the user, the distance between the second display device and the user can be considered to be infinite.
  • Alternatively, one or more cameras connected to the control device may be disposed in the display system 100, and the control device may capture images of the user through the cameras. Then, in combination with the pre-stored position of each display device, the first distance between the first display device and the user and the second distance between the second display device and the user can be determined.
  • Of course, because a wearable device (or a mobile phone) is generally carried by the user, the control device can also obtain the user's positioning result through the positioning device of the wearable device (or the mobile phone), and then determine, in combination with the pre-stored positions of the display devices, the first distance between the first display device and the user and the second distance between the second display device and the user.
  • Optionally, the control device can also obtain the first distance and the second distance by using other existing methods such as indoor positioning; the embodiment of the present application does not limit this.
  • In some embodiments of the present application, the control device may be triggered to acquire the first distance and the second distance when it receives a display request initiated by a target service. For example, as shown in FIG. 15, when the user opens a certain video in a video playing APP on the mobile phone, the mobile phone may send a display request for the video playing service to the control device. After receiving the display request, the control device may first determine, according to the device information of each display device recorded in Table 1, the display devices that support the video playing service, for example the first display device and the second display device.
  • Further, the control device may send a distance request to the first display device and the second display device, requesting the first display device to report the first distance between it and the user and requesting the second display device to report the second distance between it and the user. After receiving the distance request, the first display device and the second display device may be triggered to periodically detect and report their distances to the user, as sketched below.
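One straightforward realization of this periodic detection and reporting is a polling loop on each display device. The 30-second period reuses the example given earlier for the distance reports; the sensor read is a placeholder standing in for whatever distance sensor (camera, infrared sensor, etc.) the device has.

```python
# Illustrative periodic distance reporting by a display device (the sensor read is a placeholder).
import time

REPORT_PERIOD_S = 30          # example reporting period from the embodiment (every 30 seconds)

def read_distance_sensor():
    """Placeholder for the device's distance sensor (camera, infrared sensor, ...).
    Returns the distance to the user in metres; infinity may be reported when no user is detected."""
    return 2.7

def report_loop(send_to_control, stop_requested):
    while not stop_requested():
        send_to_control({"type": "DISTANCE_REPORT", "distance_m": read_distance_sensor()})
        time.sleep(REPORT_PERIOD_S)
```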
  • Of course, the mobile phone in FIG. 15 can also serve as the first display device or the second display device; the embodiment of the present application does not impose any limitation on this.
  • 902. When the first distance is smaller than the second distance, the control device instructs the first display device to run the target service.
  • 903. The first display device displays the target service in real time.
  • When the first distance D1 is smaller than the second distance D2, as shown in FIG. 16, the first display device is closer to the user. Still taking the above video playing service as an example, the control device can at this time send the to-be-displayed layers generated when the mobile phone runs the video playing service to the first display device in real time for display, so that the first display device can display the to-be-displayed layers of the video playing service in real time.
  • The to-be-displayed layers may include only some of the layers generated when the video playing service is run. For example, the control device may remove the layers that involve user privacy and send only the layers that do not involve privacy to the first display device as the to-be-displayed layers; or the control device may send the layers that involve user privacy to a third display device that supports displaying user privacy, and send the layers that do not involve privacy to the first display device as the to-be-displayed layers. Of course, the to-be-displayed layers may also include all the layers generated when the video playing service is run; the embodiment of the present application does not impose any limitation on this.
  • In addition, when the first distance D1 is smaller than the second distance D2, the control device may be triggered to instruct the first display device to run the target service if the first distance is smaller than a preset value, for example if the user is less than 3 meters (or another preset distance) away from the first display device, or if the time for which the user stays within that preset distance of the first display device is longer than a preset time. This prevents the first display device from being triggered to display the target service, and thereby increasing the power consumption of the first device, when the user merely passes quickly by the first display device.
  • Of course, before the control device sends the to-be-displayed layers generated when the mobile phone runs the video playing service to the first display device, the control device may perform secondary rendering on the to-be-displayed layers sent by the mobile phone, for example adjusting the size of the to-be-displayed layers to suit the resolution of the first display device; the embodiment of the present application does not impose any limitation on this.
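A small sketch of the decision in step 902 follows, combining the distance comparison with the proximity and dwell-time conditions described above. The 3-metre preset value is the example from the text; the dwell-time threshold and the way the two conditions are combined here (start only once the user has lingered within the preset distance) are assumptions of this sketch.

```python
# Illustrative trigger for step 902: instruct the first display device to run the target service
# only when it is the nearer device and the user is not merely passing by quickly.
import time

PRESET_DISTANCE_M = 3.0       # example preset value from the embodiment
PRESET_DWELL_S = 5.0          # assumed dwell-time threshold ("a preset time" in the text)

class StartTrigger:
    def __init__(self):
        self._near_since = None

    def should_start(self, d1, d2):
        if d1 >= d2 or d1 >= PRESET_DISTANCE_M:
            self._near_since = None                     # farther device, or user still too far away
            return False
        if self._near_since is None:
            self._near_since = time.monotonic()
        # Start only after the user has stayed within the preset distance for the dwell time,
        # so that someone walking quickly past the first display device does not trigger it.
        return (time.monotonic() - self._near_since) >= PRESET_DWELL_S
```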
  • 904. The control device continues to acquire the first distance between the first display device and the user, and the second distance between the second display device and the user.
  • While the first display device displays the real-time pictures of the target service, the first display device and the second display device may continue to detect and report their distances to the user, so that the control device continues to acquire the first distance D1 between the first display device and the user and the second distance D2 between the second display device and the user.
  • 905. When the first distance is greater than the second distance, the control device sends a first instruction to the first display device and sends a second instruction to the second display device.
  • The first instruction is used to instruct the first display device to continue to display the target service in real time, and the second instruction is used to instruct the second display device to display the target service in real time starting from the target picture currently displayed by the first display device.
  • In addition, before sending the second instruction to the second display device, the control device may first determine whether the second display device is currently in a connected state with the control device, that is, whether the second display device is online. When the second display device is online, the control device may be triggered to send the second instruction to the second display device.
  • When the second display device is offline, the control device may re-establish a connection relationship with the second display device and then send the second instruction to it; or the control device may re-select another display device that is closer to the user and is in a connected state with the control device, and send the second instruction to that display device. The embodiment of the present application does not impose any limitation on this.
  • 906. In response to the first instruction, the first display device continues to display the target service in real time within a preset time.
  • 907. In response to the second instruction, the second display device displays the target service in real time starting from the target layer currently displayed by the first display device.
  • When the first distance D1 is greater than the second distance D2, as shown in FIG. 17, the second display device is now closer to the user, and the user has a tendency to move toward the second display device. Still taking the above video playing service as an example, when video A has been played to 3 minutes and 45 seconds, the control device obtains that the first distance D1 reported by the first display device is greater than the second distance D2 reported by the second display device. At this time, the control device may send the to-be-displayed layers generated when the mobile phone runs the video playing service to the second display device, that is, send the second instruction to the second display device. After receiving the second instruction, the second display device may continue to display the to-be-displayed layers of the video playing service starting from the display picture of video A at 3 minutes and 45 seconds (that is, the above target layer).
  • It should be noted that the to-be-displayed layers when the second display device displays the target service may be the same as or different from the to-be-displayed layers when the first display device displays the target service. For example, when the second display device supports displaying user privacy and the first display device does not, the to-be-displayed layers sent by the control device to the second display device may include layers related to user privacy, for example layers containing contact numbers and short message content, while the to-be-displayed layers sent by the control device to the first display device do not include layers related to user privacy.
  • As for the first display device, the control device does not immediately stop sending it the to-be-displayed layers of the video playing service, but continues to send the first display device the to-be-displayed layers of video A after the 3-minute-45-second mark. That is, when the control device switches the video playing service from the first display device to the second display device, the first display device and the second display device simultaneously display the same pictures for a period of time.
  • This is because the process of the user moving from the first display device to the second display device is continuous. When it is detected that the first distance D1 is greater than the second distance D2, as shown in FIG. 17, the user may not yet have entered the room where the second display device is located, or may not have entered the viewing area of the second display device (for example, an area three meters in front of the second display device). If video A playing on the first display device were turned off at this point, the user would miss the corresponding playback segment; that is, the target service watched by the user could not be seamlessly continued between the first display device and the second display device.
  • Therefore, when it is detected that the first distance D1 is greater than the second distance D2, in addition to switching the video playing service from the first display device to the second display device, the control device continues to send the to-be-displayed layers of the video playing service to the first display device, so that the first display device continues to display the video playing service for a period of time (for example, 30 seconds). In this way, the user can still see the video playing service played in real time before leaving the room where the first display device is located. This ensures that the video playing service transitions stably when switching between different display devices and is seamlessly continued across the different display devices, improving the cooperation efficiency between the multiple devices.
  • Further, to make the first display device and the second display device play the above video playing service simultaneously as far as possible, the control device can send the to-be-displayed layers of the target service to the first display device and the second display device at the same time, so that both devices can display the to-be-displayed layers of the target service immediately upon receipt, improving the synchronization with which the first display device and the second display device play the video playing service.
  • Alternatively, a synchronization mechanism may be set up in advance among the display devices of the display system so that the system time of the first display device and that of the second display device are synchronized. The control device may then carry a display moment of the target service in the first instruction sent to the first display device and in the second instruction sent to the second display device, so that when the display moment arrives, the first display device and the second display device are triggered to play the target service at the same time, improving the synchronization with which the first display device and the second display device play the video playing service.
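The scheduled-start variant can be sketched as follows: the control device stamps both instructions with the same display moment, and each display device waits until that moment before rendering. The half-second lead time and the message layout are assumptions; the sketch presumes the device clocks have already been synchronized as described.

```python
# Illustrative scheduled simultaneous start (assumes the devices' system clocks are already synchronized).
import time

def build_instructions(frame_id, lead_time_s=0.5):
    display_at = time.time() + lead_time_s              # same display moment carried in both instructions
    first_instruction = {"to": "first display device", "frame": frame_id, "display_at": display_at}
    second_instruction = {"to": "second display device", "frame": frame_id, "display_at": display_at}
    return first_instruction, second_instruction

def show_when_due(instruction, render):
    delay = instruction["display_at"] - time.time()
    if delay > 0:
        time.sleep(delay)                               # wait for the display moment to arrive
    render(instruction["frame"])                        # both devices render the same picture together
```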
  • 908. When the control device acquires that the second distance between the second display device and the user is less than a distance threshold, the control device sends a close instruction to the first display device, so that the first display device stops displaying the target service.
  • Optionally, in step 908, after switching the video playing service from the first display device to the second display device, the control device may continue to acquire the second distance D2 between the user and the second display device. As shown in FIG. 18, when the second distance D2 between the second display device and the user is less than the distance threshold, for example less than 3 meters, the user's attention has shifted to the second display device. The control device may then stop sending the to-be-displayed layers of the video playing service to the first display device, so that the first display device stops displaying the video playing service (that is, the target service), reducing the power consumption of the first display device.
  • Further, the control device may also determine the duration for which the second distance between the second display device and the user remains less than the threshold. If the duration is greater than a time threshold, the user has stayed in front of the second display device for a certain period of time. At this time, the control device may be triggered to send the close instruction to the first display device, so that the first display device stops displaying the target service.
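Steps 905-908 taken together give the control device a small handover routine: fan the to-be-displayed layers out to both devices as soon as D1 exceeds D2, and send the close instruction to the first display device only after D2 has stayed below the distance threshold long enough. The sketch below follows that reading; the concrete threshold values and the callback interfaces are assumptions.

```python
# Illustrative control-device handover logic covering steps 905-908.
import time

DISTANCE_THRESHOLD_M = 3.0    # example distance threshold from the embodiment
CLOSE_DWELL_S = 10.0          # assumed time threshold before the close instruction is sent

class HandoverController:
    def __init__(self, send_layers, send_close):
        self.send_layers = send_layers                  # callable(device_name, layers)
        self.send_close = send_close                    # callable(device_name)
        self._d2_below_since = None
        self.first_active = True

    def on_distances(self, d1, d2, layers):
        if d1 > d2:
            # Steps 905/907: the second display device starts from the current target layer,
            # while the first display device keeps showing the same pictures for a while (step 906).
            self.send_layers("second display device", layers)
            if self.first_active:
                self.send_layers("first display device", layers)
        if d2 < DISTANCE_THRESHOLD_M:
            if self._d2_below_since is None:
                self._d2_below_since = time.monotonic()
            if self.first_active and time.monotonic() - self._d2_below_since >= CLOSE_DWELL_S:
                self.send_close("first display device")  # step 908: stop the first display device
                self.first_active = False
        else:
            self._d2_below_since = None
```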
  • In addition, if the first display device does not receive the to-be-displayed layers of the video playing service from the control device within a preset time, it may stop sending the first distance between the first display device and the user to the control device, to reduce the power consumption of the first display device.
  • Of course, after the control device stops sending the to-be-displayed layers of the video playing service to the first display device, the first display device may still continue to send the first distance between itself and the user to the control device periodically, so that as the user subsequently moves, the control device can determine in time, according to the first distance, whether to switch the video playing service back to the first display device. This continues until the control device sends the first display device an indication to stop reporting the first distance; the embodiment of the present application does not impose any limitation on this.
  • In addition, after step 904, if the first distance and the second distance acquired by the control device are equal, that is, the user is equally distant from the first display device and the second display device, then, still as shown in FIG. 14, the method further includes steps 909-911:
  • 909. The first display device and the second display device each perform face detection (or human eye detection).
  • When the first distance is equal to the second distance, the control device may determine the user's current point of attention according to the orientation of the user's face (or eyes), and thereby determine whether to switch the video playing service from the first display device to the second display device.
  • Specifically, a camera can be set on each of the first display device and the second display device, so that the first display device and the second display device can capture images of the user through the cameras and then recognize the user's image based on a face detection (or human eye detection) algorithm. When a face (or human eye) is recognized, a face detection (or human eye detection) result is obtained.
  • 910. If the first display device obtains a face detection (or human eye detection) result, the control device instructs the first display device to continue to display the target service.
  • If the first display device obtains a face detection (or human eye detection) result, the user's attention still falls on the first display device. The control device may then continue to send the first display device the to-be-displayed layers generated in real time by the video playing service, without switching the video playing service from the first display device to the second display device.
  • In this embodiment of the present application, the first display device and the second display device may also capture images of the user through their cameras periodically. Then, in step 910, when the first display device obtains a face detection (or human eye detection) result, the control device can further use a face (or human eye) recognition algorithm to determine whether the currently detected face (or human eye) is the same as the face (or human eye) detected in the previous face detection (or human eye detection). If they are the same, the user paying attention to the first display device has not changed, and the control device can instruct the first display device to continue to display the target service; otherwise, the control device can ignore the face detection (or human eye detection) result reported this time by the first display device.
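On the control-device side, the tie-break of steps 909-911 can be condensed into a small decision function. The report format and the face identifier used below are assumptions; the identifier comparison stands in for the face (or eye) recognition check described above.

```python
# Illustrative tie-break for equal distances (steps 909-911); report format and face IDs are assumed.
def resolve_attention(report_first, report_second, last_face_id_on_first):
    """Each report is None or a dict like {"face_id": "..."} from the device's face/eye detection.
    Returns the display device that should show the target service next."""
    if report_first and report_first.get("face_id") == last_face_id_on_first:
        return "first display device"        # step 910: the same viewer is still watching, no switch
    if report_second and report_second.get("face_id") == last_face_id_on_first:
        return "second display device"       # step 911: that viewer now faces the second display device
    return "first display device"            # otherwise this round of detection results is ignored
```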
  • 911. If the second display device obtains a face detection (or human eye detection) result, the control device performs the above steps 905-908.
  • If the second display device obtains a face detection (or human eye detection) result, the user's attention has shifted to the second display device. At this time, the control device can switch the video playing service from the first display device to the second display device; for the specific switching method, refer to the related descriptions of steps 905-908, which are not repeated here.
  • Similar to step 910, when the second display device obtains a face detection (or human eye detection) result, the control device may further use a face (or human eye) recognition algorithm to determine whether the currently detected face (or human eye) is the same as the face (or human eye) recognition result most recently reported by the first display device. If they are the same, the user who originally paid attention to the first display device has transferred attention to the second display device, and the control device can switch the video playing service from the first display device to the second display device through the foregoing steps 905-908; otherwise, the control device can ignore the face detection (or human eye detection) result reported this time by the second display device.
  • Of course, the user can also manually switch the target service from the first display device to the second display device.
  • For example, as shown in FIG. 19, the user may select the post-switch second display device (for example, a smart TV) in the setting interface of the first display device. After detecting this selection operation of the user, the first display device may carry the identifier of the second display device in a handover request and send it to the control device, and the control device can then switch the target service from the first display device to the second display device according to the handover method described in steps 905-908 above.
  • For another example, the user may also trigger the handover process of the target service by performing a corresponding gesture on the first display device. As shown in FIG. 20, the mobile phone (the first display device) is displaying a certain Word file 1101, and the mobile phone can determine the relative positional relationship between the mobile phone and the smart TV (the second display device) through its camera. If the mobile phone detects that the user performs a drag operation in the display interface of the current Word file 1101, the mobile phone can determine the direction of the drag operation according to the movement track of the drag operation on the touchscreen and the relative positional relationship between the mobile phone and the smart TV. When the drag operation points toward the smart TV, the mobile phone may carry the identifier of the smart TV in a handover request and send it to the control device, and the control device may then switch the Word file 1101 from the mobile phone to the smart TV for continued display according to the handover method described in steps 905-908 above.
  • Optionally, the mobile phone can also present the determined relative positional relationship between the mobile phone and the smart TV on its display screen in the form of text, pictures, or animation, to prompt the user that a directional drag operation can switch the target service between the mobile phone and the smart TV.
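Determining whether a drag "points toward" the smart TV essentially means comparing the drag vector on the touchscreen with the bearing of the TV relative to the phone. The sketch below is one possible way to do that comparison; the angle tolerance, the coordinate convention, and the bearing value are assumptions, since the embodiment only states that the direction of the drag and the relative positional relationship are combined.

```python
# Illustrative check of whether a drag gesture is directed at the smart TV (tolerance and bearings assumed).
import math

ANGLE_TOLERANCE_DEG = 30.0     # assumed tolerance for treating the drag as aimed at the TV

def drag_points_at_device(drag_start, drag_end, device_bearing_deg):
    """device_bearing_deg: direction of the smart TV relative to the phone (e.g. estimated via its camera)."""
    dx, dy = drag_end[0] - drag_start[0], drag_end[1] - drag_start[1]
    drag_bearing = math.degrees(math.atan2(dy, dx)) % 360
    diff = abs((drag_bearing - device_bearing_deg + 180) % 360 - 180)   # smallest angle between the two
    return diff <= ANGLE_TOLERANCE_DEG

# A mostly rightward drag, with the TV roughly to the phone's right:
print(drag_points_at_device((100, 800), (600, 780), device_bearing_deg=355))   # -> True
```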
  • Further, in this embodiment of the present application, the user may also choose to switch the display content in a certain area of the display interface of the first display device to the second display device for display.
  • For example, the current display interface of the first display device includes multiple display windows, such as the input method window 1102 and the short message application window displayed by the mobile phone in FIG. 21. Then, similarly to FIG. 20, the user may, for a given window, drag the selected window toward the second display device to be switched to. When the drag operation points toward the smart TV, the mobile phone may carry the identifier of the smart TV in a handover request and send it to the control device, and the control device may then switch the display content in the input method window 1102 from the mobile phone to the smart TV for continued display according to the switching method described in steps 905-908 above.
  • Alternatively, the first display device may divide its screen into different regions in advance, and the user may, for a given region, switch the display content in the selected region to the second display device for display; this embodiment does not impose any limitation on this.
  • It can be understood that, to implement the above functions, the above control device, display device, and the like include corresponding hardware structures and/or software modules for performing each function. A person skilled in the art should easily be aware that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, the embodiments of the present application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the embodiments of the present application.
  • In the embodiments of the present application, the terminal and the like may be divided into functional modules according to the foregoing method examples. For example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The above integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a logical function division; there may be other division manners in actual implementation.
  • When each functional module is divided corresponding to each function, FIG. 22 is a schematic diagram of a possible structure of the control device involved in the foregoing embodiments. The control device includes a receiving unit 2101, a determining unit 2102, and a sending unit 2103.
  • The receiving unit 2101 is configured to support the control device in performing processes 901 and 904 in FIG. 14; the determining unit 2102 is configured to support the control device in performing processes 902 and 909 in FIG. 14; and the sending unit 2103 is configured to support the control device in performing processes 905, 908, and 910 in FIG. 14. All the related content of the steps involved in the foregoing method embodiments can be cited in the functional descriptions of the corresponding functional modules, and details are not described herein again.
  • The determining unit 2102 is configured to control and manage the actions of the control device, and the receiving unit 2101 and the sending unit 2103 are configured to support the communication process between the control device and other devices. In addition, the control device may further include a storage unit, configured to save the program code and data of the control device; for example, the storage unit can be used to store the device information sent by each display device.
  • When the above control device is one of the display devices in the above display system, it may further include a display unit, configured to display information input by the user or information provided to the user, as well as the various menus of the control device.
  • For example, the determining unit 2102 may be a processor, the receiving unit 2101 and the sending unit 2103 may be a transceiver circuit such as an RF circuit or a Wi-Fi device, the storage unit may be a memory, and the display unit may be a display. In this case, the control device provided by this embodiment may be the mobile phone 201 shown in FIG. 9.
  • The computer program product includes one or more computer instructions. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus.
  • The computer instructions may be stored in a computer-readable storage medium or transferred from one computer-readable storage medium to another computer-readable storage medium; for example, the computer instructions may be transferred from a website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave).
  • The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media.
  • The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

一种显示方法及装置,涉及通信技术领域,可实现多设备之间显示业务的无缝切换,提高多设备之间的协作效率。该方法包括:响应于目标显示设备发送的显示请求,确定支持显示目标业务的第一显示设备和第二显示设备;请求第一显示设备和第二显示设备分别上报与用户之间的距离;当第一显示设备与用户之间的第一距离小于第二显示设备与用户之间的第二距离时,从目标显示设备中获取目标业务当前的第一显示数据,并向第一显示设备发送第一显示数据;当后续获取到第一显示设备上报的第一距离小于第二显示设备上报的第二距离时,从目标显示设备中获取目标业务当前的第二显示数据,并分别向第一显示设备和第二显示设备发送第二显示数据。

Description

一种显示方法及装置 技术领域
本申请实施例涉及通信技术领域,尤其涉及一种显示方法及装置。
背景技术
智能家居(smart home,也可称为home automation)是以住宅为平台,利用综合布线技术、网络通信技术、安全防范技术、自动控制技术、音视频技术将家居生活有关的设施集成,构建高效的住宅设施与家庭日程事务的管理***,可提升家居安全性、便利性、舒适性、艺术性,并实现环保节能的居住环境。
目前,在一些智能家居的应用场景中,可根据用户的位置自动化管理同一网络内的多个智能家电。例如,当检测到用户走进客厅时用客厅的电视显示媒体,当检测到用户走进卧室时,切换到用卧室的电视播放该媒体。
但是,在用户从客厅移动至卧室的这段时间内,无法实现媒体在两个电视之间的无缝切换,降低了多设备之间的协作效率。
发明内容
本申请的实施例提供一种显示方法及装置,可实现多设备之间显示业务的无缝切换,提高多设备之间的协作效率。
为达到上述目的,本申请的实施例采用如下技术方案:
第一方面,本申请的实施例提供一种显示方法,包括:控制设备接收目标显示设备需要显示目标业务时发送的显示请求;响应于该显示请求,控制设备确定支持显示该目标业务的第一显示设备和第二显示设备;进而,请求第一显示设备上报与用户之间的第一距离,并请求第二显示设备上报与用户之间的第二距离;当第一距离小于第二距离时,说明用户距离第一显示设备较近,控制设备可从目标显示设备中获取该目标业务当前的第一显示数据,并向第一显示设备发送第一显示数据,以使得第一显示设备根据第一显示数据显示该目标业务;后续当控制设备获取到第一显示设备上报的第一距离小于第二显示设备上报的第二距离时,说明用户此时距离第二显示设备较近,控制设备可从目标显示设备中获取该目标业务当前的第二显示数据,并分别向第一显示设备和第二显示设备发送第二显示数据,使得第一显示设备和第二显示设备均根据第二显示数据显示该目标业务。
也就是说,当控制设备将目标业务从第一显示设备切换至第二显示设备时,第一显示设备和第二显示设备会在一段时间内均显示该目标业务,这样,用户在离开第一显示设备接近第二显示设备的过程中还可以看到实时播放的目标业务,从而保证目标业务在不同显示设备上切换时能够稳定过渡,同时从用户视觉上实现了目标业务在不同显示设备上的无缝衔接,提高了多设备之间的协作效率。
在一种可能的设计方法中,第一显示数据中包括该目标业务的所有图层中支持第一显示设备显示的至少一个图层;和/或,第二显示数据中包括该目标业务的所有图层中支持第二显示设备显示的至少一个图层。
在一种可能的设计方法中,在控制设备分别向第一显示设备发和第二显示设备发送第二显示数据之后,还包括:当第二显示设备显示该目标业务达到预设时间时,可认为用户已经移动至第二显示设备的观看范围内,因此,控制设备可停止向第一显示设备发送第二显示数据,降低第一显示设备的功耗。
在一种可能的设计方法中,在控制设备分别向第一显示设备发和第二显示设备发送第二显示数据之后,还包括:当第二显示设备与用户之间的第二距离小于预设的距离阈值时,可确定用户当前的关注点已转移至第二显示设备,因此,控制设备可停止向第一显示设备发送第二显示数据,降低第一显示设备的功耗。
在一种可能的设计方法中,在控制设备分别向第一显示设备发和第二显示设备发送第二显示数据之后,还包括:当第二显示设备与用户之间的第二距离小于预设的距离阈值时,控制设备确定该用户距离第二显示设备小于该距离阈值的持续时间;若该持续时间大于预设的时间阈值时,更加可确定用户当前的关注点已转移至第二显示设备,因此,控制设备可停止向第一显示设备发送第二显示数据。
在一种可能的设计方法中,在控制设备向第一显示设备发送第一显示数据之后,还包括:当后续该控制设备获取到第一显示设备上报的第一距离等于第二显示设备上报的第二距离时,该控制设备指示第一显示设备和第二显示设备进行人脸检测;若获取到第一显示设备上报的人脸检测结果,则说明用户当前的关注点落在第一显示设备上,此时,控制设备无需将目标业务切换至第二显示设备上显示,而是可以继续向第一显示设备发送该目标业务当前的第二显示数据,使得第一显示设备继续显示该目标业务。
相应的,若获取到第二显示设备上报的人脸检测结果,则说明用户当前的关注点已从第一显示设备转移至第二显示设备,因此,控制设备可分别向第一显示设备和第二显示设备发送目标业务当前的第二显示数据,使得第一显示设备和第二显示设备均显示该目标业务,保证目标业务在不同显示设备上切换时能够稳定过渡,同时从用户视觉上实现了目标业务在不同显示设备上的无缝衔接,提高了多设备之间的协作效率。
第二方面,本申请的实施例提供一种显示方法,包括:当第一显示设备显示目标业务时,向控制设备发送目标业务的显示请求;响应于该显示请求,控制设备确定支持显示该目标业务的第二显示设备;控制设备请求第一显示设备上报与用户之间的第一距离,并请求第二显示设备上报与用户之间的第二距离;当第一距离大于第二距离时,说明用户此时距离第二显示设备较近,控制设备可在指示第一显示设备继续显示该目标业务的同时,从第一显示设备中获取目标业务当前的显示数据,并向第二显示设备发送该目标业务当前的显示数据,此时,第一显示设备和第二显示设备均具有该目标业务当前的显示数据,使得第一显示设备和第二显示设备均可根据该显示数据显示目标业务。
第三方面,本申请的实施例提供一种显示方法,包括:目标显示设备将安装的应用、存储的文件及数据均备份在控制设备中,且与控制设备保持同步;当目标显示设备需要显示目标业务时,向控制设备发送该目标业务的显示请求,响应于该显示请求,控制设备确定可支持显示该目标业务的第一显示设备和第二显示设备;进而,请 求第一显示设备上报与用户之间的第一距离,并请求第二显示设备上报与用户之间的第二距离;当第一距离小于第二距离时,说明用户此时距离第一显示设备较近,控制设备可指示第一显示设备显示该目标业务;当后续控制设备获取到第一显示设备上报的第一距离小于第二显示设备上报的第二距离时,说明用户此时距离第二显示设备较近,控制设备可将该目标业务当前的显示数据发送给第二显示设备,并指示第一显示设备继续显示该目标业务,使得第一显示设备和第二显示设备均可根据目标业务当前的显示数据显示该目标业务。
第四方面,本申请的实施例提供一种显示方法,包括:当第一显示设备显示目标业务时,确定出能够显示该目标业务的候选设备还包括第二显示设备;那么,第一显示设备可获取与用户之间的第一距离,并指示第二设备上报与用户之间的第二距离;当第一距离大于第二距离时,说明用户此时距离第二显示设备较近,则第一显示设备可将该目标业务当前的显示数据发送给第二显示设备,此时,第一显示设备和第二显示设备均具有该目标业务当前的显示数据,使得第一显示设备和第二显示设备均可根据该显示数据显示目标业务。
第五方面,本申请的实施例提供一种显示***,包括控制设备,以及与该控制设备通信的第一显示设备、第二显示设备和目标显示设备,其中,该目标显示设备,用于:当需要显示目标业务时,向该控制设备发送显示请求;该控制设备,用于:响应于该显示请求,该控制设备确定支持显示该目标业务的第一显示设备和第二显示设备;请求第一显示设备上报与用户之间的第一距离,并请求第二显示设备上报与用户之间的第二距离;当第一距离小于第二距离时,从该目标显示设备中获取该目标业务当前的第一显示数据,并向第一显示设备发送第一显示数据;第一显示设备,用于:根据第一显示数据显示该目标业务;该控制设备,还用于:当后续获取到第一显示设备上报的第一距离小于第二显示设备上报的第二距离时,从该目标显示设备中获取该目标业务当前的第二显示数据,并分别向第一显示设备和第二显示设备发送第二显示数据;第一显示设备,还用于:根据第二显示数据显示该目标业务;第二显示设备,用于:根据第二显示数据显示该目标业务。
第六方面,本申请的实施例提供一种控制设备,包括处理器,以及与该处理器均相连的存储器和收发器,该存储器存储了程序代码,该处理器运行该程序代码以指令该控制设备执行以下步骤:接收目标显示设备需要显示目标业务时发送的显示请求;响应于该显示请求,确定支持显示该目标业务的第一显示设备和第二显示设备;请求第一显示设备上报与用户之间的第一距离,并请求第二显示设备上报与用户之间的第二距离;当第一距离小于第二距离时,从该目标显示设备中获取该目标业务当前的第一显示数据,并向第一显示设备发送第一显示数据,以使得第一显示设备根据第一显示数据显示该目标业务;当后续获取到第一显示设备上报的第一距离小于第二显示设备上报的第二距离时,从该目标显示设备中获取该目标业务当前的第二显示数据,并分别向第一显示设备和第二显示设备发送第二显示数据,使得第一显示设备和第二显示设备均根据第二显示数据显示该目标业务。
在一种可能的设计方法中,第一显示数据中包括该目标业务的所有图层中支持第一显示设备显示的至少一个图层;和/或,第二显示数据中包括该目标业务的所有图层 中支持第二显示设备显示的至少一个图层。
在一种可能的设计方法中,在向第一显示设备发和第二显示设备发送第二显示数据之后,该程序代码还包括:当第二显示设备显示该目标业务达到预设时间时,停止向第一显示设备发送第二显示数据。
在一种可能的设计方法中,在向第一显示设备发和第二显示设备发送第二显示数据之后,该程序代码还包括:当第二距离小于预设的距离阈值时,停止向第一显示设备发送第二显示数据。
在一种可能的设计方法中,在向第一显示设备发和第二显示设备发送第二显示数据之后,该程序代码还包括:当第二距离小于预设的距离阈值时,确定该用户距离第二显示设备小于该距离阈值的持续时间;若该持续时间大于预设的时间阈值时,停止向第一显示设备发送第二显示数据。
在一种可能的设计方法中,在向第一显示设备发送第一显示数据之后,该程序代码还包括:当后续获取到第一显示设备上报的第一距离等于第二显示设备上报的第二距离时,指示第一显示设备和第二显示设备进行人脸检测;若获取到第一显示设备上报的人脸检测结果,则从该目标显示设备中获取该目标业务当前的第二显示数据,并向第一显示设备发送第二显示数据;若获取到第二显示设备上报的人脸检测结果,则从该目标显示设备中获取该目标业务当前的第二显示数据,并分别向第一显示设备和第二显示设备发送第二显示数据。
在一种可能的设计方法中,该控制设备还包括与该处理器相连的显示器,该显示器用于根据该目标显示设备发送的第一显示数据和/或第二显示数据显示该目标业务。
第七方面,本申请实施例提供一种计算机可读存储介质,该计算机可读存储介质中存储有指令,当该指令在上述任一项控制设备上运行时,使得控制设备执行上述任一项显示方法。
第八方面,本申请实施例提供一种包含指令的计算机程序产品,当其在上述任一项控制设备上运行时,使得控制设备执行上述任一项显示方法。
本申请的实施例中,上述控制设备以及控制设备中各个部件的名字对设备本身不构成限定,在实际实现中,这些设备或部件可以以其他名称出现。只要各个设备或部件的功能和本申请的实施例类似,即属于本申请权利要求及其等同技术的范围之内。
另外,第二方面至第八方面中任一种设计方式所带来的技术效果可参见上述第一方面中不同设计方法所带来的技术效果,此处不再赘述。
附图说明
图1为本申请实施例提供的一种显示***的架构示意图一;
图2为本申请实施例提供的一种显示***的结构示意图二;
图3为本申请实施例提供的一种显示***的结构示意图三;
图4为本申请实施例提供的一种图层的划分示意图;
图5为本申请实施例提供的一种显示***的结构示意图四;
图6为本申请实施例提供的一种显示***的结构示意图五;
图7为本申请实施例提供的一种显示***的结构示意图六;
图8为本申请实施例提供的一种显示***的结构示意图;
图9为本申请实施例提供的一种手机的结构示意图;
图10为本申请实施例提供的一种显示方法的流程示意图一;
图11为本申请实施例提供的一种显示方法的应用场景示意图一;
图12为本申请实施例提供的一种显示方法的应用场景示意图二;
图13为本申请实施例提供的一种显示方法的应用场景示意图三;
图14为本申请实施例提供的一种显示方法的流程示意图二;
图15为本申请实施例提供的一种显示方法的应用场景示意图四;
图16为本申请实施例提供的一种显示方法的应用场景示意图五;
图17为本申请实施例提供的一种显示方法的应用场景示意图六;
图18为本申请实施例提供的一种显示方法的应用场景示意图七;
图19为本申请实施例提供的一种显示方法的应用场景示意图八;
图20为本申请实施例提供的一种显示方法的应用场景示意图九;
图21为本申请实施例提供的一种显示方法的应用场景示意图十;
图22为本申请实施例提供的一种控制设备的结构示意图。
具体实施方式
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征。在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请实施例提供的一种显示方法,可应用于图1所示的显示***100中。
如图1所示,上述显示***100包括控制设备200以及与控制设备200均可通信的至少两个显示设备(例如图1中所示的手机201、智能电视202以及平板电脑203等)。
其中,控制设备200与各个显示设备之间可以通过无线网络(例如Wi-Fi、蓝牙以及蜂窝移动网络等)或者有线网络(例如光纤等)连接,本申请实施例对此不作任何限制。
在本申请的一些实施例中,控制设备200中存储有可反映各个显示设备显示等能力的设备信息。以手机201为例,手机201在与控制设备200建立连接后,如图2所示,可向控制设备200发送自身的设备信息,例如,手机201的屏幕分辨率、图形处理器(Graphics Processing Unit,GPU)的渲染能力、中央处理器(Central Processing Unit,CPU)的频率等。控制设备200将接收到的手机201的设备信息存储现在控制设备200的存储器中进行备案。
类似的,与控制设备200相连的各个显示设备均可在控制设备200中备案自身的设备信息。后续,当某一个显示设备发起需要进行显示的目标业务(例如播放视频、运行游戏等)时,可向控制设备200发送相应的显示请求和目标业务的对应的显示数据。此时,控制设备200可根据已备案的各个显示设备的设备信息,为当前的目标业务确定一个合适的显示设备作为目标设备,并将该目标业务的对应的显示数据发送给目标设备进行显示。
示例性的,如图3所示,手机201和智能电视202均与控制设备200相连,当手机201接收到一个视屏通话业务后,可分析该视屏通话业务需要显示的待显示图层的属性信息,例如,分析该图层的内容(视频、文字或图片)、大小以及该图层的隐私性等,进而,手机201可将分析得到的待显示图层的属性信息携带在显示请求中发送给控制设备200。
其中,图层(view或layer)是显示设备上一个显示界面的基本组成单位,多个图层按照顺序堆叠后便形成了显示界面的最终显示效果。每一个图层中可以包括一个或多个控件,每个图层的定义规则图标以及多个图层的顺序可由开发人员在开发应用时进行定义。以安卓***为例,安卓***中定义了一些基础图层,例如,ImageView、AdapterView以及RelativeLayout等,开发人员可使用或修改这些基础图层绘制自定义图层。
如图4所示,以微信应用的聊天界面为例,可将该聊天界面中微信的状态栏401以及用于输入的输入栏404定义为图层1,将聊天界面中的聊天背景定义为图层2,将聊天界面中的用户的聊天记录定义为图层3,且图层3位于图层2之上。
那么,当手机201需要显示图4中的聊天界面(即目标业务)时,微信应用可根据预先定义的图层规则确定出该聊天界面包括的3个图层,即上述图层1-图层3,并对每个图层的内容(视频、文字或图片)、尺寸以及该图层的隐私性等属性进行分析,例如,图层1和图层3中涉及联系人头像和名称等隐私,因而图层1和图层3的隐私性较高,而图层2中未涉及用户隐私,因而图层2的隐私性较低。进而,将分析得到的图层1-图层3的属性信息携带在显示请求中发送给控制设备200。
仍如图3所示,由于控制设备200中备案有各个显示设备的设备信息,因此,控制设备200可将手机201发送的待显示图层的属性信息与各个显示设备的设备信息进行匹配,例如,当上述属性信息中指示待显示图层的大小为10M且隐私性较弱,而已备案的智能电视202可支持大小大于8M且隐私性较弱的图层进行显示,那么,控制设备200可将智能电视202作为显示上述视屏通话业务的目标设备。
此时,控制设备200可向手机201发送上述显示请求的响应信息,触发手机201生成上述视屏通话业务的显示数据(即上述待显示图层的数据)并发送至控制设备200,如图3所示,由控制设备200将该显示数据发送给智能电视202,使得智能电视202显示原本手机201上接收到的视屏通话业务的图像。
当然,如果手机201与智能电视202之间建立有连接关系,控制设备200也可在上述响应信息中携带智能电视202的标识,这样,手机201可以根据智能电视202的标识将生成的视屏通话业务的显示数据发送给智能电视202进行显示。
又或者,控制设备200也可以具有图像处理能力,例如图像渲染能力,那么,控制设备200接收到手机201生成的视屏通话业务的显示数据后,可按照智能电视202的分辨率等设备信息对该显示数据进行二次渲染,得到符合智能电视202显示能力的显示数据,并将该显示数据发送至智能电视202进行显示。
可以看出,在本申请实施例提供的显示方法中,可将用户的多个显示设备与 控制设备200互联,并将各个显示设备的设备信息备案至控制设备200中,使得控制设备可以根据各个显示设备的设备信息,智能的为当前的目标业务选择合适的目标设备,将该目标业务对应的图层投射至目标设备上显示。
也就是说,显示***100中的***示设备在触发目标业务时均可作为提供屏幕源数据的源设备,且显示***100中的控制设备200可智能的确定投屏时机以及显示该目标业务的被控设备,使得多屏幕显示场景下的源设备与被控设备可以根据业务需求灵活设置,从而提高多设备之间的协作效率。
在本申请的另一些实施例中,如图5所示,上述显示***100中的每一个显示设备还可以在控制设备200内备份自身的全部数据,例如,手机201可将安装的应用、存储的文件及数据均备份在控制设备200中。
那么,如图6所示,当某个显示设备(例如上述手机201)发起新的目标业务时,手机201可直接将该目标业务传递至控制设备200,由于控制设备200中包含手机201的全部数据,因此,控制设备200可以代替手机201为其分析目标业务的待显示图层的属性信息,进而根据该属性信息为该目标业务选择合适的目标设备(例如智能电视202),还可以替手机201生成该待显示图层并发送至智能电视202进行显示。
这样一来,发起目标业务的手机201只需向控制设备200上报目标业务,即可实现将目标业务投射至其他显示设备上显示的智能投屏功能,可降低显示***100中各个显示设备的实现复杂度和耗电量。
另外,上述控制设备200为目标业务确定合适的目标设备时,还可获取此时用户与显示***100中各个显示设备之间的距离,将与用户距离最近的显示设备作为确定为显示该目标业务的目标设备。
示例性的,如图7所示,用户将手机201放在客厅后进入卧室打开智能电视202,当手机201接收到一个视屏通话业务后,将该视屏通话业务的待显示图层的属性信息发送给控制设备200。控制设备200根据待显示图层的属性信息可确定出多个支持显示该待显示图层的显示设备,例如,平板电脑203、智能电视202以及手机201,此时,控制设备200可通过这三个显示设备上设置的距离传感器或摄像头等传感器获取用户分别与这三个显示设备之间的距离,进而将距离最近的显示设备(例如图7中位于卧室的智能电视202)作为显示该待显示图层的目标设备。
需要说明的是,本申请实施例中并不限定显示***100中控制设备200的具体实现形态,例如,如图8中的(a)所示,控制设备200可以以独立的实体设备的形态与各个显示设备相连,又或者,如图8中的(b)所示,控制设备200还可以以功能模块的形式集成在一个或多个显示设备中,也就是说,控制设备200也可以是显示***100中具有显示功能的一个显示设备,例如图8中的(b)中可将手机201作为显示***100的控制设备200,同时,手机201也是显示***100内的一个显示设备;又或者,如图8中的(c)所示,控制设备200还可以为云端设置的一个或多个服务器(或虚拟机),此时,显示***100中的各个显示设备可以通过同一个用户账号与云端的控制设备200建立连接关系,本申请实施例对此不作任何限制。
另外,上述实施例中是以将显示设备的目标业务投射至其他显示设备上进行显示为例说明的,可以理解的是,与显示***100中控制设备200相连接的还可以为具有其他输出功能的终端,例如,具有音频输出功能的蓝牙音响等。这样,当控制设备200接收到任意终端发起的待播放音频业务时,可智能的为其选择合适的音频播放设备执行该待播放音频业务,本申请实施例对此不作任何限制。
在本申请的一些实施例中,上述显示***100中的显示设备(或控制设备200)具体可以为手机、可穿戴设备、增强现实(augmented reality,AR)\虚拟现实(virtual reality,VR)设备、平板电脑、笔记本电脑、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本、个人数字助理(personal digital assistant,PDA)等任意终端,当然,在以下实施例中,对该终端的具体形式不作任何限制。
如图9所示,下面以手机201作为显示***100中的一个显示设备举例对实施例进行具体说明。应该理解的是,图示手机201仅是上述终端的一个范例,并且手机201可以具有比图中所示出的更多的或者更少的部件,可以组合两个或更多的部件,或者可以具有不同的部件配置。
如图9所示,手机201具体可以包括:处理器101、射频(radio frequency,RF)电路102、存储器103、触摸屏104、蓝牙装置105、一个或多个传感器106、Wi-Fi装置107、定位装置108、音频电路109、外设接口110以及电源***111等部件。这些部件可通过一根或多根通信总线或信号线(图9中未示出)进行通信。本领域技术人员可以理解,图9中示出的硬件结构并不构成对手机的限定,手机201可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。
下面结合图9对手机201的各个部件进行具体的介绍:
处理器101是手机201的控制中心,利用各种接口和线路连接手机201的各个部分,通过运行或执行存储在存储器103内的应用程序,以及调用存储在存储器103内的数据,执行手机201的各种功能和处理数据。在一些实施例中,处理器101可包括一个或多个处理单元。可选的,处理器101可集成应用处理器和调制解调处理器。其中,应用处理器主要处理操作***、用户界面和应用程序等;调制解调处理器主要处理无线通信。可选的,上述调制解调处理器和应用处理器也可以是相互独立设置的。
在本申请实施例中,处理器101可以包括GPU 115和CPU116,也可以是GPU 115、CPU 116、数字信号处理(digital signal processing,DSP)以及通信单元中的控制芯片(例如基带芯片)的组合。在本申请实施方式中,GPU 115和CPU 116均可以是单运算核心,也可以包括多运算核心。
其中,GPU 115是一种专门在个人电脑、工作站、游戏机和一些移动设备(如平板电脑、智能手机等)上进行图像运算工作的微处理器。它可将手机201所需要的显示信息进行转换驱动,并向显示器104-2提供行扫描信号,控制显示器104-2的正确显示。
具体的,在显示过程中,GPU 115可将相应的绘图命令发送给GPU 115,例 如,该绘图命令可以为“在坐标位置(x,y)处画个长和宽分别为a,b的长方形”,那么,GPU 115根据该绘图指令便可以迅速计算出该图形的所有像素,并在显示器104-2上指定位置画出相应的图形。
需要说明的是,GPU 115可以以功能模块的形式集成在处理器101内,也可以以独立的实体形态(例如,显卡)设置在手机201内,本申请实施例对此不作任何限制。
射频电路102可用于在收发信息或通话过程中,无线信号的接收和发送。特别地,射频电路102可以将基站的下行数据接收后,给处理器101处理;另外,将涉及上行的数据发送给基站。通常,射频电路包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频电路102还可以通过无线通信和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯***、通用分组无线服务、码分多址、宽带码分多址、长期演进、电子邮件、短消息服务等。
存储器103用于存储应用程序以及数据,处理器101通过运行存储在存储器103的应用程序以及数据,执行手机201的各种功能以及数据处理。存储器103主要包括存储程序区以及存储数据区,其中,存储程序区可存储操作***、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等);存储数据区可以存储根据使用手机201时所创建的数据(比如音频数据、电话本等)。此外,存储器103可以包括高速随机存取存储器(ramdom access memory,RAM),还可以包括非易失存储器,例如磁盘存储器件、闪存器件或其他易失性固态存储器件等。存储器103可以存储各种操作***,例如,苹果公司所开发的
Figure PCTCN2017107893-appb-000001
操作***,谷歌公司所开发的
Figure PCTCN2017107893-appb-000002
操作***等。上述存储器103可以是独立的,通过上述通信总线与处理器101相连接;存储器103也可以和处理器101集成在一起。
触摸屏104具体可以包括触控板104-1和显示器104-2。
其中,触控板104-1可采集手机201的用户在其上或附近的触摸事件(比如用户使用手指、触控笔等任何适合的物体在触控板104-1上或在触控板104-1附近的操作),并将采集到的触摸信息发送给其他器件(例如处理器101)。其中,用户在触控板104-1附近的触摸事件可以称之为悬浮触控;悬浮触控可以是指,用户无需为了选择、移动或拖动目标(例如图标等)而直接接触触控板,而只需用户位于终端附近以便执行所想要的功能。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型来实现触控板104-1。
显示器(也称为显示屏)104-2可用于显示由用户输入的信息或提供给用户的信息以及手机201的各种菜单。可以采用液晶显示器、有机发光二极管等形式来配置显示器104-2。触控板104-1可以覆盖在显示器104-2之上,当触控板104-1检测到在其上或附近的触摸事件后,传送给处理器101以确定触摸事件的类型,随后处理器101可以根据触摸事件的类型在显示器104-2上提供相应的视觉输出。虽然在图9中,触控板104-1与显示屏104-2是作为两个独立的部件来实现手机201的输入和输出功能,但是在某些实施例中,可以将触控板104-1与显示屏104-2 集成而实现手机201的输入和输出功能。可以理解的是,触摸屏104是由多层的材料堆叠而成,本申请实施例中只展示出了触控板(层)和显示屏(层),其他层在本申请实施例中不予赘述。另外,触控板104-1可以以全面板的形式配置在手机201的正面,显示屏104-2也可以以全面板的形式配置在手机201的正面,这样在手机的正面就能够实现无边框的结构。
手机201还可以包括蓝牙装置105,用于实现手机201与其他短距离的终端(例如手机、智能手表等)之间的数据交换。本申请实施例中的蓝牙装置可以是集成电路或者蓝牙芯片等。
手机201还可以包括至少一种传感器106,比如,指纹采集器件112、光传感器、运动传感器以及其他传感器。具体地,指纹采集器件112可以设置在手机201的背面(例如后置摄像头的下方),或者在手机201的正面(例如触摸屏104的下方),又例如,还可以在触摸屏104中配置指纹采集器件112来实现指纹识别功能,即指纹采集器件112可以与触摸屏104集成在一起来实现手机201的指纹识别功能。光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节触摸屏104的显示器的亮度,接近传感器可在手机201移动到耳边时,关闭显示器的电源。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机201还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不予赘述。
在本申请实施例中,手机201的传感器106中还包括距离传感器113,可用于感应其与某物体(或用户)间的距离以完成预设的某种功能。根据其工作原理的不同可分为光学距离传感器、红外距离传感器、超声波距离传感器等,本申请实施例对此不做任何限制。
Wi-Fi装置107,用于为手机201提供遵循Wi-Fi相关标准协议的网络接入,手机201可以通过Wi-Fi装置107接入到Wi-Fi接入点,进而帮助用户收发电子邮件、浏览网页和访问流媒体等,它为用户提供了无线的宽带互联网访问。在其他一些实施例中,该Wi-Fi装置107也可以作为Wi-Fi无线接入点,可以为其他终端提供Wi-Fi网络接入。
定位装置108,用于为手机201提供地理位置。可以理解的是,该定位装置108具体可以是全球定位***(global positioning system,GPS)或北斗卫星导航***、俄罗斯GLONASS等定位***的接收器。定位装置108在接收到上述定位***发送的地理位置后,将该信息发送给处理器101进行处理,或者发送给存储器103进行保存。在另外的一些实施例中,该定位装置108还可以是辅助全球卫星定位***(assisted global positioning system,AGPS)的接收器,AGPS***通过作为辅助服务器来协助定位装置108完成测距和定位服务,在这种情况下,辅助定位服务器通过无线通信网络与终端例如手机201的定位装置108(即GPS接收器)通信而提供定位协助。在另外的一些实施例中,该定位装置108也可以是基于Wi-Fi接入点的定位技术。由于每一个Wi-Fi接入点都有一个全球唯一的媒体 介入控制(media access control,MAC)地址,终端在开启Wi-Fi的情况下即可扫描并收集周围的Wi-Fi接入点的广播信号,因此可以获取到Wi-Fi接入点广播出来的MAC地址;终端将这些能够标示Wi-Fi接入点的数据(例如MAC地址)通过无线通信网络发送给位置服务器,由位置服务器检索出每一个Wi-Fi接入点的地理位置,并结合Wi-Fi广播信号的强弱程度,计算出该终端的地理位置并发送到该终端的定位装置108中。
音频电路109、扬声器113、麦克风114可提供用户与手机201之间的音频接口。音频电路109可将接收到的音频数据转换后的电信号,传输到扬声器113,由扬声器113转换为声音信号输出;另一方面,麦克风114将收集的声音信号转换为电信号,由音频电路109接收后转换为音频数据,再将音频数据输出至RF电路102以发送给比如另一手机,或者将音频数据输出至存储器103以便进一步处理。
外设接口110,用于为外部的输入/输出设备(例如键盘、鼠标、外接显示器、外部存储器、用户识别模块卡等)提供各种接口。例如通过通用串行总线(universal serial bus,USB)接口与鼠标连接,通过用户识别模块卡卡槽上的金属触点与电信运营商提供的用户识别模块卡(subscriber identification module,SIM)卡进行连接。外设接口110可以被用来将上述外部的输入/输出***设备耦接到处理器101和存储器103。
手机201还可以包括给各个部件供电的电源装置111(比如电池和电源管理芯片),电池可以通过电源管理芯片与处理器101逻辑相连,从而通过电源装置111实现管理充电、放电、以及功耗管理等功能。
尽管图9未示出,手机201还可以包括摄像头(前置摄像头和/或后置摄像头)、闪光灯、微型投影装置、近场通信(near field communication,NFC)装置等,在此不予赘述。
结合图1-图8所示的显示***100以及图9所示的手机201,在本申请实施例中,显示***100的组网方法如图10所示,包括:
801a、第一显示设备向控制设备发送第一连接请求。
802a、控制设备接收到上述第一连接请求后,建立与第一显示设备之间的连接关系。
在步骤801a-802a中,以第一显示设备(例如上述手机201)主动与控制设备建立连接为例进行说明。
在本申请的一些实施例中,当控制设备200接入某一网络,例如,Wi-Fi名称为“1234”的局域网后,可将自身的标识(例如控制设备200的MAC地址)携带在第一指示信息中定期广播,该第一指示信息用于指示自身为控制设备200。那么,当手机201也接入Wi-Fi名称为“1234”的局域网后,可接收到该第一指示信息,从而确定当前的控制设备200。
进而,如步骤801a所述,手机201的处理器可根据控制设备200的标识,调用其Wi-Fi装置通过名称为“1234”的Wi-Fi网络向控制设备200发送第一连接请求,该第一连接请求用于请求建立手机201与控制设备200之间的连接关系,该 第一连接请求中可携带手机201的标识(例如手机201的MAC地址)。
那么,当控制设备200接收到手机201发送的第一连接请求后,如步骤802a所述,控制设备200可将手机201的标识存储在存储器中,建立与手机201之间的连接关系。后续,控制设备200和手机201均可通过彼此的标识查找到对方进行通信。
在本申请的另一些实施例中,当手机201和其他多个设备接入同一网络(例如,Wi-Fi名称为“1234”的局域网)后,如图11所示,可在手机201的显示界面中显示候选控制设备列表1001,由用户选择该局域网内的控制设备,例如,用户点击候选控制设备列表1001中“我的手机”,即手机201。
那么,手机201检测到用户的这一输入操作后,可将自身设置为控制设备200,并将自身的标识携带在第一指示信息中定期广播。那么,该局域网内的其他显示设备接收到第一指示信息后,可将自身的标识携带在第一连接请求中发送至手机201(即控制设备),以使得手机201存储接收到的标识,从而与局域网内的各个显示设备建立连接关系。
801b、控制设备向第一显示设备发送第二连接请求。
802b、第一显示设备接收到上述第二连接请求后,建立与控制设备之间的连接关系。
在步骤801b-802b中,以控制设备主动与第一显示设备(例如上述手机201)建立连接为例进行说明。
与上述步骤801a-802a类似的,控制设备200可将自身的标识携带在第二连接请求中发送至手机201,那么,手机接收到该第二连接请求后,可存储控制设备200的标识,并将手机201自身的标识发送至控制设备200,使得控制设备200将手机201的标识也存储在自身的存储器中,建立与手机201之间的连接关系。后续,控制设备200和手机201均可通过彼此的标识查找到对方进行通信。
需要说明的是,上述实施例中均以第一显示设备与控制设备建立连接关系举例说明,其他显示设备也可按照上述方法与控制设备建立连接关系,从而组建如图1-图8中所示的显示***100。
803、第一显示设备向控制设备发送第一显示设备的设备信息。
804、控制设备接收到上述设备信息后,保存至控制设备的存储器中进行备案。
仍以手机201作为第一显示设备举例,在步骤803中,由于手机201已经与控制设备建立了连接关系,因此,手机201可以根据以保存的控制设备200的标识,向控制设备发送自身的设备信息,例如,手机201的屏幕分辨率、GPU的渲染能力以及CPU的频率等反映手机201显示能力的参数,手机201支持的音频格式等反映手机201声音播放能力的参数,以及是否支持用户隐私的显示等参数,本申请实施例对此不作任何限制。
其中,用户隐私具体可以包括安全交易信息(例如股票交易页面)、具有聊天性质的信息(例如短信、消息通知等)、用户的位置信息以及联系人号码等用户不远公开的信息。
示例性的,显示设备可以根据显示设备的类型和/或显示设备所处的地理位置 等参数,判断是否支持显示用户隐私。例如,当显示设备为移动性较强的设备,例如,手机、可穿戴设备时,由于用户通常随身携带这类设备,即这类设备的私密度较高,因此可确定这类设备支持显示用户隐私;而当显示设备为移动性较弱的设备,例如,蓝牙音响、智能电视时,由于这类设备所处的位置相对固定,且通常不能随着用户的移动而移动,即这类设备的私密度较低,因此可确定这类设备不支持显示用户隐私。
进而,在步骤804中,控制设备200接收到手机201发送的上述设备信息后,可将手机201与其设备信息的对应关系存储在控制设备200的存储器中进行备案。
控制设备200对接收到的每一个显示设备的设备信息均可进行备案,那么,如表1所示,控制设备200中维护有各个显示设备的设备信息,后续,当需要显示某个目标业务时,控制设备200可根据已备案的各个显示设备的设备信息,为该目标业务确定一个合适的显示设备作为目标设备显示该目标业务。
表1
Figure PCTCN2017107893-appb-000003
示例性的,手机201接收到来电业务后,可将来电业务相关的一个或多个待显示图层的属性,例如待显示图层支持的分辨率,待显示图层支持的CPU和GPU的能力,待显示图层是否涉及用户隐私等属性信息发送给控制设备200。控制设备200将接收到的待显示图层的属性信息与表1中备案的各个显示设备的设备信息进行匹配,得到一个或多个支持显示该待显示图层的显示设备。
例如,控制设备200确定表1中的手机201、智能电视202以及平板电脑203均支持显示上述来电业务的待显示图层。那么,为了方便用户及时获知该来电业务,当手机201、智能电视202以及平板电脑203均与控制设备200保持连接时,控制设备可向手机201、智能电视202以及平板电脑203发送第二指示信息,该第二指示信息用于指示显示设备上报与用户之间的距离。
进而,手机201、智能电视202以及平板电脑203接收到上述第二指示信息后,可通过自身的距离传感器(例如摄像头、红外传感器)定期检测或其他现有方式来获得与用户之间的距离,并将检测得到的距离上报给控制设备200。这样,控制设备200可将距离用户最近的显示设备,例如卧室中的智能电视202确定为目标设备显示该来电业务的待显示图层,并且,由于在来电业务进行的过程中可根据实时的距离选择目标设备,在来电业务进行的过程中,当用户与多个显示设备之间的距离发生变化,该来电业务可在多个显示设备上自由切换,因此提高多设备之间的协作效率的同时可大大提高用户体验。
在本申请实施例中,还可以在控制设备200内预先存储各个显示设备所在位置的户型结构图,如图12所示,为用户所住的一室一厅的户型结构示意图,显示 ***100中的各个显示设备可以通过定位装置(例如GPS)将自身的位置信息上报给控制设备200,控制设备结合图12所示的户型结构示意图,可确定每一个显示设备在用户家中的具***置。如图12所示,卧室中放置有电视1,客厅中放置有电视1和手机3,厨房内放置有平板电脑4。
那么,当控制设备200根据用户与各个显示设备之间的距离确定显示目标业务的目标设备时,可结合图12所示的各个显示设备的具***置,将与用户处于同一房间内且距离最近的显示设备(例如图12中客厅内的电视1)作为目标设备,避免为用户确定的目标设备不在用户所处的房间内而导致用户不能及时处理该目标业务的问题。
另外,房间内的各个显示设备还可以定期向控制设备200上报用户与自身之间的距离,例如,每隔30秒上报与用户之间的距离,那么,当用户在房间内移动时,例如,如图13所示,用户从客厅中的A点移动至卧室门口B点,控制设备可以实时获取到用户与各个显示设备的距离,当用户与卧室中电视1的距离D1小于用户与客厅中电视2的距离D2时,可动态的将目标业务从客厅中的电视2切换至此时与用户距离最近的卧室中的电视1。
但是,控制设备200将目标业务从电视2切换至电视1时,用户可能还没有进入卧室,或者没有进入卧室中最佳的观看区域,从而导致用户错过目标业务的相关画面。
对此,本申实施例请提供一种显示方法,以上述显示***100中包括控制设备200,以及与控制设备200相连接的第一显示设备和第二显示设备为例,如图14所示,该方法包括:
901、控制设备获取第一显示设备与用户之间的第一距离,以及第二显示设备与用户之间的第二距离。
具体的,第一显示设备和第二显示设备中均可设置距离传感器,第一显示设备通过其距离传感器可测量出当前与用户之间的第一距离,第二显示设备通过其距离传感器可测量出当前与用户之间的第二距离。进而,第一显示设备和第二显示设备可分别将测量出的第一距离和第二距离发送给控制设备。
当然,如果用户此时不在第一显示设备(或第二显示设备)的测距范围内,例如,第二显示设备在卧室内并没有检测到用户,则可认为第二显示设备与用户之间的距离为无穷大。
又或者,还可以在显示***100中设置与控制设备相连的一个或多个摄像头,控制设备可通过摄像头捕捉到的用户影像,那么,结合预先存储的各个显示设备的位置,可确定出第一显示设备与用户之间的第一距离,以及第二显示设备与用户之间的第二距离。
当然,由于可出穿戴设备(或手机)一般是用户随身携带的,因此,控制设备还可通过穿戴设备(或手机)的定位装置获取用户的定位结果,进而结合预先存储的各个显示设备的位置,确定出第一显示设备与用户之间的第一距离,以及第二显示设备与用户之间的第二距离。
可选的,控制设备还可以使用室内定位等其他现有的方式获取第一距离和第 二距离,本申请实施例对此不做限定。
在本申请的一些实施例中,可以在控制设备接收到目标业务发起的显示请求时,触发控制设备获取上述第一距离和第二距离。例如,如图15所示,用户在手机上打开视频播放APP中的某一视频时,手机可向控制设备发送视频播放业务的显示请求。控制设备接收到该显示请求后,首先可根据表1中已备案的各个显示设备的设备信息,确定支持该视频播放业务的显示设备,例如上述第一显示设备和第二显示设备。进而,控制设备可向第一显示设备和第二显示设备发送距离请求,请求第一显示设备上报其与用户之间的第一距离,以及请求第二显示设备上报其与用户之间的第二距离。第一显示设备和第二显示设备接收到该距离请求后,可触发第一显示设备和第二显示设备周期性的检测并上报与用户之间的距离。
当然,图15中的手机也可以作为上述第一显示设备或第二显示设备,本申请实施例对此不做任何限制。
902、当第一距离小于第二距离时,控制设备指示第一显示设备运行目标业务。
903、第一显示设备实时显示该目标业务。
当第一距离D1小于第二距离D2时,如图16所示,说明第一显示设备距离用户较近,仍以上述视频播放业务为例,此时,控制设备可以将手机运行该视频播放业务时生成的待显示图层实时的发送给第一显示设备进行显示,使得第一显示设备能够实时显示该视频播放业务的待显示图层。
其中,上述待显示图层中可以包括运行该视频播放业务时的部分图层,例如,控制设备可将运行该视频播放业务时涉及用户隐私的图层去除,将未涉及隐私的图层作为上述待显示图层发送给第一显示设备;又或者,控制设备还可以将运行该视频播放业务时涉及用户隐私的图层发送给支持显示用户隐私的第三显示设备,将未涉及隐私的图层作为上述待显示图层发送给第一显示设备;当然,上述待显示图层也可以包括运行该视频播放业务时的所有图层,本申请实施例对此不作任何限制。
另外,当第一距离D1小于第二距离D2时,如果该第一距离小于预设值,例如,用户距离第一显示设备小于3米(或其他预设距离),或者,用户距离第一显示设备小于3米(或其他预设距离)的时长大于预设时间时,可触发控制设备指示第一显示设备运行目标业务,避免用户快速经过第一显示设备时触发第一显示设备显示该目标业务,增加第一设备的功耗的问题。
当然,在控制设备将手机运行该视频播放业务时生成的待显示图层发送给第一显示设备之前,控制设备还可以对手机发送来的待显示图层进行二次渲染,例如调整该待显示图层的尺寸以适应第一显示设备的分辨率,本申请实施例对此不作任何限制。
904、控制设备继续获取第一显示设备与用户之间的第一距离,以及第二显示设备与用户之间的第二距离。
在第一显示设备上显示上述目标业务的实时画面的同时,第一显示设备和第二显示设备可继续检测并上报与用户之间的距离,使得控制设备继续获取到第一显示设备与用户之间的第一距离D1,以及第二显示设备与用户之间的第二距离 D2。
905、当第一距离大于第二距离时,控制设备向第一显示设备发送第一指令,并向第二设备发送第二指令。
其中,第一指令用于指示第一显示设备继续实时显示上述目标业务,第二指令用于指示第二显示设备从当前第一显示设备显示的目标画面开始实时显示该目标业务。
另外,在向第二显示设备发送第二指令之前,控制设备可以先确定当前第二显示设备是否与控制设备处于连接状态,即第二显示设备是否在线。当第二显示设备在线时,可触发控制设备向第二显示设备发送第二指令。
而当第二显示设备不在线时,控制设备可以重新建立与第二显示设备之间的连接关系,进而向第二显示设备发送第二指令;又或者,当第二显示设备不在线时,控制设备还可以重新选择此时与用户距离较近,且与控制设备处于连接状态的其他显示设备,并向该显示设备发送上述第二指令,本申请实施例对此不做任何限制。
906、响应于上述第一指令,第一显示设备在预设时间内继续实时显示该目标业务。
907、响应于上述第二指令,第二显示设备从当前第一显示设备显示的目标图层开始实时显示该目标业务。
当第一距离D1大于第二距离D2时,如图17所示,说明此时第二显示设备距离用户较近,用户具有向第二显示设备移动的趋势。仍以上述视频播放业务为例,当视频A播放至3分45秒时,控制设备获取到第一显示设备上报的第一距离D1大于第二显示设备上报的第二距离D2,此时,控制设备可将手机运行该视频播放业务时生成的待显示图层发送给第二显示设备,即向第二显示设备发送第二指令。第二显示设备接收到该第二指令后,可从视频A的第3分45秒的显示画面(即上述目标图层)开始继续显示该视频播放业务的待显示图层。
需要说明的是,第二显示设备显示上述目标业务时的待显示图层,可以与第一显示设备显示上述目标业务时的待显示图层相同或不同。例如,当第二显示设备支持显示用户隐私而第一显示设备不支持显示隐私时,控制设备向第二显示设备发送的待显示图层中可以包括涉及用户隐私的图层,例如,包含联系人电话、短信内容的图层等,而控制设备向第一显示设备发送的待显示图层中不包括涉及用户隐私的图层。
For the first display device, the control device does not immediately stop sending the to-be-displayed layers of the video playback service; it continues to send the layers of video A after 3 minutes 45 seconds to the first display device. In other words, when the control device switches the video playback service from the first display device to the second display device, the two devices display the same images simultaneously for a period of time.
This is because moving from the first display device to the second display device is a continuous process for the user. When it is detected that the first distance D1 is greater than the second distance D2, as still shown in FIG. 17, the user may not yet have entered the room where the second display device is located, or may not yet have entered its viewing area (for example, the area within three meters in front of the second display device). If video A playing on the first display device were closed at this moment, the user would miss the corresponding segment; that is, the target service the user is watching could not be handed over seamlessly between the first display device and the second display device.
Therefore, when it is detected that the first distance D1 is greater than the second distance D2, in addition to switching the video playback service from the first display device to the second display device, the control device continues to send the to-be-displayed layers of the video playback service to the first display device, so that the first display device keeps displaying the service for a period of time (for example, 30 seconds). In this way, the user can still watch the video playback service in real time before leaving the room where the first display device is located, ensuring a stable transition when the video playback service is switched between display devices, achieving seamless handover of the service across devices, and improving the efficiency of multi-device cooperation.
Further, to make the first display device and the second display device play the video playback service as simultaneously as possible, the control device may send the to-be-displayed layers of the target service to both devices at the same time, so that each device can display the layers immediately upon receipt, improving the synchronization between the two devices.
Alternatively, a synchronization mechanism may be set up in advance among the display devices of the display system so that the system time of the first display device and the second display device is synchronized. The control device may then carry a display time of the target service in the first instruction sent to the first display device and in the second instruction sent to the second display device; when that time arrives, the two devices are triggered to play the target service simultaneously, improving the synchronization between them.
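One way to realize the timestamp-based variant, assuming the devices' clocks have already been synchronized by some mechanism, is sketched below; the message format and lead time are invented for illustration.

```python
import time

SYNC_LEAD_S = 0.5  # assumed lead time so both devices receive the instruction first

def make_instructions(target_layer_id):
    """Build first/second instructions that carry a common display time.

    Both devices are assumed to share a synchronized clock; each device waits
    until `display_at` before rendering, so playback starts on both screens at
    the same instant.
    """
    display_at = time.time() + SYNC_LEAD_S
    first = {"device": "first_display", "action": "continue", "display_at": display_at}
    second = {"device": "second_display", "action": "start_from",
              "layer": target_layer_id, "display_at": display_at}
    return first, second

def wait_and_display(instruction, render):
    """On the display device: sleep until the agreed time, then render."""
    delay = instruction["display_at"] - time.time()
    if delay > 0:
        time.sleep(delay)
    render(instruction)
```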
908. When the control device finds that the second distance between the second display device and the user is smaller than a distance threshold, the control device sends a close instruction to the first display device, so that the first display device stops displaying the target service.
Optionally, in step 908, after switching the video playback service from the first display device to the second display device, the control device may continue to obtain the second distance D2 between the user and the second display device. As shown in FIG. 18, when the second distance D2 is smaller than a distance threshold, for example smaller than 3 meters, the user's attention has shifted to the second display device. The control device may then stop sending the to-be-displayed layers of the video playback service to the first display device, so that the first display device stops displaying the video playback service (that is, the target service), reducing the power consumption of the first display device.
Further, the control device may also measure how long the second distance between the second display device and the user remains below the threshold. If this duration exceeds a time threshold, the user has stayed in front of the second display device for a certain time, and the control device can be triggered to send the close instruction to the first display device so that it stops displaying the target service.
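This shutdown condition mirrors the dwell logic used when starting playback. A small sketch, with placeholder threshold values and a hypothetical `send_close` helper, might look like this.

```python
def maybe_close_first_display(d2_history, send_close,
                              distance_threshold=3.0, time_threshold=5.0):
    """Close the first display once the user has stayed within
    `distance_threshold` of the second display for `time_threshold` seconds.

    `d2_history` is a list of (timestamp, distance) reports from the second
    display device, oldest first; `send_close()` is an assumed helper that
    sends the close instruction to the first display device.
    """
    run_start = None
    for t, d in d2_history:
        if d < distance_threshold:
            run_start = t if run_start is None else run_start
        else:
            run_start = None  # the user moved away; restart the timer
    if run_start is not None and d2_history[-1][0] - run_start >= time_threshold:
        send_close()
        return True
    return False

# Example usage with a stub:
reports = [(0, 4.0), (2, 2.5), (4, 2.0), (8, 1.8)]
print(maybe_close_first_display(reports, send_close=lambda: print("close -> first_display")))
```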
In addition, if the first display device does not receive the to-be-displayed layers of the video playback service from the control device within a preset period, it may stop sending the first distance between itself and the user to the control device, reducing its power consumption.
Of course, after the control device stops sending the to-be-displayed layers of the video playback service to the first display device, the first display device may still keep periodically reporting the first distance between itself and the user to the control device, so that as the user continues to move, the control device can use this first distance to decide in time whether to switch the video playback service back to the first display device, until the control device instructs the first display device to stop reporting the first distance, which is not limited in the embodiments of this application.
In addition, after step 904, if the first distance and the second distance obtained by the control device are equal, that is, the user is equally distant from the first display device and the second display device, then, as also shown in FIG. 14, the method further includes steps 909-911.
909. The first display device and the second display device each perform face detection (or eye detection).
When the first distance equals the second distance, the control device may determine the user's current point of attention from the orientation of the user's face (or eyes), and thereby decide whether to switch the video playback service from the first display device to the second display device.
Specifically, both the first display device and the second display device may be equipped with a camera, so that each of them can capture images of the user and run a face detection (or eye detection) algorithm on those images. When a face (or eyes) is recognized, a face detection (or eye detection) result is obtained.
910. If the first display device obtains a face detection (or eye detection) result, the control device instructs the first display device to continue displaying the target service.
If the first display device obtains a face detection (or eye detection) result, the user's attention is still on the first display device. The control device can then continue sending the to-be-displayed layers generated in real time by the video playback service to the first display device, without switching the service from the first display device to the second display device.
In the embodiments of this application, the first display device and the second display device may also capture images of the user through their cameras periodically. Then, in step 910, when the first display device obtains a face detection (or eye detection) result, the control device may further use a face (or eye) recognition algorithm to check whether the face (or eyes) detected this time is the same as the face (or eyes) detected in the previous round of face detection (or eye detection). If they are the same, the user paying attention to the first display device has not changed, and the control device may instruct the first display device to continue displaying the target service; otherwise, the control device may ignore the face detection (or eye detection) result reported by the first display device this time.
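The same-face check could, for example, be approximated by comparing face embeddings, as in the sketch below. The embedding extractor is assumed to exist (for instance, a model from an off-the-shelf face-recognition library), the threshold is a placeholder, and this is not presented as the recognition algorithm used by the system.

```python
import numpy as np

SAME_FACE_THRESHOLD = 0.6  # assumed cosine-similarity threshold

def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_same_viewer(current_embedding, previous_embedding):
    """Return True if the face detected now matches the previously detected face.

    Embeddings are assumed to come from some face-recognition model run on the
    camera frames of the display device; this is only an illustrative check.
    """
    if previous_embedding is None:
        return False
    return cosine_similarity(current_embedding, previous_embedding) >= SAME_FACE_THRESHOLD
```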
911. If the second display device obtains a face detection (or eye detection) result, the control device performs steps 905-908 described above.
If the second display device obtains a face detection (or eye detection) result, the user's attention has shifted to the second display device, and the control device may switch the video playback service from the first display device to the second display device. For the specific switching method, refer to the descriptions of steps 905-908, which are not repeated here.
Similarly to step 910, when the second display device obtains a face detection (or eye detection) result, the control device may further use a face (or eye) recognition algorithm to check whether the face (or eyes) detected this time is the same as the face (or eye) recognition result most recently reported by the first display device. If they are the same, the user who was paying attention to the first display device has shifted attention to the second display device, and the control device may switch the video playback service from the first display device to the second display device through steps 905-908; otherwise, the control device may ignore the face detection (or eye detection) result reported by the second display device this time.
Of course, the user may also switch the target service from the first display device to the second display device manually.
For example, as shown in FIG. 19, the user may select the second display device to switch to (for example, a smart TV) on the settings screen of the first display device. After detecting this selection, the first display device may carry the identifier of the second display device in a switch request sent to the control device, and the control device may then switch the target service from the first display device to the second display device according to the switching method described in steps 905-908.
For another example, the user may trigger the switching procedure of the target service by performing a corresponding gesture on the first display device. As shown in FIG. 20, the mobile phone (the first display device) is displaying a Word file 1101. The phone may determine the relative position between itself and the smart TV (the second display device) through its camera. If the phone detects a drag operation on the display interface of Word file 1101, it may determine the direction of the drag from its trajectory on the touchscreen together with the relative position between the phone and the smart TV. When the drag points toward the smart TV, the phone may carry the identifier of the smart TV in a switch request sent to the control device, and the control device may then switch the display of Word file 1101 from the phone to the smart TV according to the switching method described in steps 905-908.
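Deciding whether a drag points toward a particular device amounts to comparing the drag vector with the bearing of that device. The sketch below assumes the relative bearing has already been estimated (for example, from the camera) and uses an arbitrary angular tolerance; none of the names or values come from the original.

```python
import math

ANGLE_TOLERANCE_DEG = 30.0  # assumed tolerance for "pointing at" a device

def drag_points_at(start, end, device_bearing_deg):
    """Return True if the drag from `start` to `end` (screen coordinates)
    points toward a device whose relative bearing is `device_bearing_deg`.

    Bearings are measured counter-clockwise from the positive x-axis; how the
    bearing is obtained (e.g. from the camera) is outside this sketch.
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    drag_deg = math.degrees(math.atan2(dy, dx))
    diff = abs((drag_deg - device_bearing_deg + 180.0) % 360.0 - 180.0)
    return diff <= ANGLE_TOLERANCE_DEG

# Example: a drag to the upper right, with the smart TV at a bearing of 40 degrees.
print(drag_points_at((100, 100), (300, 260), device_bearing_deg=40.0))
```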
Optionally, the phone may also present the determined relative position between the phone and the smart TV on its screen in the form of text, pictures or animation, prompting the user that the target service can be switched between the phone and the smart TV by a directional drag.
Further, in the embodiments of this application, the user may also choose to switch the content displayed in a certain region of the first display device's interface to the second display device. For example, the current interface of the first display device contains multiple display windows, such as the input method window 1102 and the SMS application window shown on the phone in FIG. 21. Then, similarly to FIG. 20, the user may drag a selected window toward the second display device to switch to. When the drag points toward the smart TV, the phone may carry the identifier of the smart TV in a switch request sent to the control device, and the control device may then switch the content of the input method window 1102 from the phone to the smart TV according to the switching method described in steps 905-908. Alternatively, the first display device may divide its screen into different regions in advance, and the user may switch the content of a selected region to the second display device for display, which is not limited in the embodiments of this application.
It can be understood that, to implement the above functions, the control device, the display devices and the like include corresponding hardware structures and/or software modules for performing each function. A person skilled in the art should easily be aware that, in combination with the units and algorithm steps of the examples described in the embodiments disclosed herein, the embodiments of this application can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the embodiments of this application.
In the embodiments of this application, the terminal and the like may be divided into functional modules according to the foregoing method examples. For example, each functional module may correspond to one function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of this application is schematic and is merely a division by logical function; there may be other division manners in actual implementation.
When each functional module corresponds to one function, FIG. 22 shows a possible schematic structural diagram of the control device involved in the foregoing embodiments. The control device includes a receiving unit 2101, a determining unit 2102 and a sending unit 2103.
The receiving unit 2101 is configured to support the control device in performing processes 901 and 904 in FIG. 14; the determining unit 2102 is configured to support the control device in performing processes 902 and 909 in FIG. 14; the sending unit 2103 is configured to support the control device in performing processes 905, 908 and 910 in FIG. 14. All related content of the steps in the foregoing method embodiments may be cited in the function descriptions of the corresponding functional modules, and is not repeated here.
The determining unit 2102 is configured to control and manage the actions of the control device; the receiving unit 2101 and the sending unit 2103 are configured to support communication between the control device and other devices. In addition, the control device may further include a storage unit configured to store the program code and data of the control device, for example the device information sent by each display device.
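One possible way to organize these units in software, purely as an illustrative sketch (the class and method names are not from the original), is shown below.

```python
class ControlDevice:
    """Illustrative decomposition into receiving, determining and sending units."""

    def __init__(self, transport, registry):
        self.transport = transport   # assumed object with send()/receive()
        self.registry = registry     # storage unit: device info keyed by device id

    # Receiving unit: obtain distance reports (processes 901 and 904).
    def receive_distances(self):
        return self.transport.receive()            # e.g. {"first_display": 2.1, ...}

    # Determining unit: decide which device should display the target service.
    def choose_device(self, distances):
        return min(distances, key=distances.get)   # nearest device wins

    # Sending unit: deliver display data or instructions (905, 908, 910).
    def send_display_data(self, device, layers):
        self.transport.send(device, layers)
```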
When the control device itself serves as one of the display devices in the display system, it may further include a display unit configured to display information entered by the user or provided to the user, as well as the various menus of the control device.
For example, the determining unit 2102 may be a processor, the receiving unit 2101 and the sending unit 2103 may be transceiver components such as an RF circuit or a Wi-Fi apparatus, the storage unit may be a memory, and the display unit may be a display. In this case, the control device provided in this embodiment of this application may be the mobile phone 201 shown in FIG. 9.
All or some of the foregoing embodiments may be implemented by software, hardware, firmware or any combination thereof. When implemented by a software program, they may be implemented wholly or partly in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center by wired (for example, coaxial cable, optical fiber or digital subscriber line (DSL)) or wireless (for example, infrared, radio or microwave) means. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device such as a server or data center integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid state disk (SSD)), or the like.
The foregoing descriptions are merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (16)

  1. A display method, comprising:
    receiving, by a control device, a display request sent by a target display device when the target display device needs to display a target service;
    determining, by the control device in response to the display request, a first display device and a second display device that support displaying the target service;
    requesting, by the control device, the first display device to report a first distance between the first display device and a user, and requesting the second display device to report a second distance between the second display device and the user;
    when the first distance is smaller than the second distance, obtaining, by the control device, current first display data of the target service from the target display device, and sending the first display data to the first display device, so that the first display device displays the target service according to the first display data; and
    when the control device subsequently obtains a first distance reported by the first display device that is greater than a second distance reported by the second display device, obtaining, by the control device, current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device respectively, so that both the first display device and the second display device display the target service according to the second display data.
  2. The method according to claim 1, wherein
    the first display data comprises at least one layer, among all layers of the target service, that the first display device supports displaying; and/or
    the second display data comprises at least one layer, among all layers of the target service, that the second display device supports displaying.
  3. The method according to claim 1 or 2, further comprising, after the control device sends the second display data to the first display device and the second display device:
    when the second display device has displayed the target service for a preset time, stopping, by the control device, sending the second display data to the first display device.
  4. The method according to claim 1 or 2, further comprising, after the control device sends the second display data to the first display device and the second display device respectively:
    when the second distance is smaller than a preset distance threshold, stopping, by the control device, sending the second display data to the first display device.
  5. The method according to claim 1 or 2, further comprising, after the control device sends the second display data to the first display device and the second display device respectively:
    when the second distance is smaller than a preset distance threshold, determining, by the control device, a duration for which the distance between the user and the second display device remains smaller than the distance threshold; and
    if the duration is greater than a preset time threshold, stopping, by the control device, sending the second display data to the first display device.
  6. The method according to any one of claims 1 to 5, further comprising, after the control device sends the first display data to the first display device:
    when the control device subsequently obtains a first distance reported by the first display device that is equal to a second distance reported by the second display device, instructing, by the control device, the first display device and the second display device to perform face detection;
    if a face detection result reported by the first display device is obtained, obtaining, by the control device, current second display data of the target service from the target display device, and sending the second display data to the first display device; and
    if a face detection result reported by the second display device is obtained, obtaining, by the control device, current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device respectively.
  7. A display system, wherein the system comprises a control device, and a first display device, a second display device and a target display device that communicate with the control device, wherein
    the target display device is configured to send a display request to the control device when a target service needs to be displayed;
    the control device is configured to: in response to the display request, determine the first display device and the second display device that support displaying the target service; request the first display device to report a first distance to a user and request the second display device to report a second distance to the user; and when the first distance is smaller than the second distance, obtain current first display data of the target service from the target display device and send the first display data to the first display device;
    the first display device is configured to display the target service according to the first display data;
    the control device is further configured to: when subsequently obtaining a first distance reported by the first display device that is greater than a second distance reported by the second display device, obtain current second display data of the target service from the target display device and send the second display data to the first display device and the second display device respectively;
    the first display device is further configured to display the target service according to the second display data; and
    the second display device is configured to display the target service according to the second display data.
  8. A control device, wherein the control device comprises a processor, and a memory and a transceiver that are both connected to the processor, wherein the memory stores program code, and the processor runs the program code to instruct the control device to perform the following steps:
    receiving a display request sent by a target display device when the target display device needs to display a target service;
    determining, in response to the display request, a first display device and a second display device that support displaying the target service;
    requesting the first display device to report a first distance to a user, and requesting the second display device to report a second distance to the user;
    when the first distance is smaller than the second distance, obtaining current first display data of the target service from the target display device, and sending the first display data to the first display device, so that the first display device displays the target service according to the first display data; and
    when a first distance subsequently reported by the first display device is greater than a second distance subsequently reported by the second display device, obtaining current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device respectively, so that both the first display device and the second display device display the target service according to the second display data.
  9. The control device according to claim 8, wherein
    the first display data comprises at least one layer, among all layers of the target service, that the first display device supports displaying; and/or
    the second display data comprises at least one layer, among all layers of the target service, that the second display device supports displaying.
  10. The control device according to claim 8 or 9, wherein after the second display data is sent to the first display device and the second display device, the program code further comprises:
    when the second display device has displayed the target service for a preset time, stopping sending the second display data to the first display device.
  11. The control device according to claim 8 or 9, wherein after the second display data is sent to the first display device and the second display device, the program code further comprises:
    when the second distance is smaller than a preset distance threshold, stopping sending the second display data to the first display device.
  12. The control device according to claim 8 or 9, wherein after the second display data is sent to the first display device and the second display device, the program code further comprises:
    when the second distance is smaller than a preset distance threshold, determining a duration for which the distance between the user and the second display device remains smaller than the distance threshold; and
    if the duration is greater than a preset time threshold, stopping sending the second display data to the first display device.
  13. The control device according to any one of claims 8 to 12, wherein after the first display data is sent to the first display device, the program code further comprises:
    when a first distance subsequently reported by the first display device is equal to a second distance subsequently reported by the second display device, instructing the first display device and the second display device to perform face detection;
    if a face detection result reported by the first display device is obtained, obtaining current second display data of the target service from the target display device, and sending the second display data to the first display device; and
    if a face detection result reported by the second display device is obtained, obtaining current second display data of the target service from the target display device, and sending the second display data to the first display device and the second display device respectively.
  14. The control device according to any one of claims 8 to 13, wherein the control device further comprises a display connected to the processor, and the display is configured to display the target service according to the first display data and/or the second display data sent by the target display device.
  15. A computer-readable storage medium, wherein the computer-readable storage medium stores instructions, and when the instructions are run on a control device, the control device is enabled to perform the display method according to any one of claims 1 to 6.
  16. A computer program product containing instructions, wherein when the computer program product is run on a control device, the control device is enabled to perform the display method according to any one of claims 1 to 6.