WO2021088889A1 - Display system, display method and computing device - Google Patents

Display system, display method and computing device

Info

Publication number
WO2021088889A1
WO2021088889A1 PCT/CN2020/126566 CN2020126566W
Authority
WO
WIPO (PCT)
Prior art keywords
instruction
display
target object
associated information
projection device
Prior art date
Application number
PCT/CN2020/126566
Other languages
English (en)
French (fr)
Inventor
唐甜甜
陈许
肖纪臣
刘显荣
王学磊
刘晋
Original Assignee
青岛海信激光显示股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 青岛海信激光显示股份有限公司
Publication of WO2021088889A1 publication Critical patent/WO2021088889A1/zh

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/95 - Retrieval from the web
    • G06F16/953 - Querying, e.g. by the use of web search engines
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/1454 - Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 - Structure of client; Structure of client peripherals
    • H04N21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42203 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]; sound input device, e.g. microphone

Definitions

  • This application relates to smart display device technology, and in particular to a display system, a display method, and a computing device.
  • Laser projection devices are based on Internet technology, have an open operating system and chip, and provide an open application platform. They can support multiple functions such as audio and video, entertainment, and data, so as to meet the diverse needs of users and bring users a new experience.
  • the laser projection device can support voice interaction, and the user can control the laser projection device through voice.
  • the laser projection device includes a display.
  • the laser projection device intercepts the image currently displayed on the display and sends the image to the cloud server to identify the characters, objects, etc. in the image.
  • the cloud server sends the recognition result to the laser projection device, and the laser projection device displays the recognition result on the display.
  • the laser projection device obtains the related information of the recognition result from the cloud server, and displays the related information on the display.
  • When the laser projection device displays the recognition result and associated information, it obscures the main screen currently shown on the display. This affects the user's normal viewing of the main screen and results in a poor user experience.
  • This application provides a display system, a display method, and a computing device. The technical solutions adopted are as follows:
  • an embodiment of the present application provides a display system, including:
  • a first display, a second display, a first controller, and a second controller.
  • The first controller is configured to: receive an image search instruction input by a user; in response to the image search instruction, intercept the image currently displayed on the first display; obtain the recognition result of the image; and send a first instruction to the second controller, where the first instruction is used to instruct the second controller to display the recognition result.
  • the second controller is configured to display the recognition result on the second display in response to the first instruction.
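The two-controller flow described above (the first controller intercepts the on-screen image, obtains the recognition result, and hands it to the second controller for display on the second display) can be sketched as follows. This is an illustrative sketch only: the class names, the `capture_screen`/`recognize` callables, and the `handle_first_instruction` method are assumptions, not the patent's disclosed implementation.

```python
class FirstController:
    """Handles the image-search instruction for the first display.

    capture_screen and recognize are injected callables standing in for
    the screenshot operation and the cloud-server recognition request.
    """
    def __init__(self, capture_screen, recognize, second_controller):
        self.capture_screen = capture_screen
        self.recognize = recognize
        self.second_controller = second_controller

    def on_image_search(self):
        image = self.capture_screen()            # intercept current image
        result = self.recognize(image)           # obtain recognition result
        # "First instruction": tell the second controller to show the result
        self.second_controller.handle_first_instruction(result)


class SecondController:
    def __init__(self, second_display):
        self.second_display = second_display

    def handle_first_instruction(self, result):
        # Display on the second display, so the main picture on the
        # first display is not obscured.
        self.second_display.append(result)


display2 = []
sc = SecondController(display2)
fc = FirstController(lambda: "frame", lambda img: f"recognized({img})", sc)
fc.on_image_search()
print(display2)  # ['recognized(frame)']
```

The key design point illustrated is that the recognition result never touches the first display's rendering path.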
  • an embodiment of the present application provides a display method, including:
  • A first instruction is received, where the first instruction is sent by the first controller after receiving the image search instruction input by the user, intercepting the image currently displayed on the first display, and obtaining the recognition result of the image; the first instruction is used to instruct display of the recognition result.
  • the recognition result is displayed on the second display.
  • an embodiment of the present application provides a display system, including:
  • a multimedia controller, a laser projection device, and a micro-projection device.
  • The multimedia controller is configured to: receive an image search instruction; in response to the image search instruction, determine the image currently displayed by the laser projection device; obtain the recognition result of the image; and send a first instruction to the micro-projection device, where the first instruction is used to instruct the micro-projection device to display the recognition result.
  • the micro-projection device is configured to: receive a first instruction from the multimedia controller, and display the recognition result in response to the first instruction.
  • an embodiment of the present application provides a display method, including:
  • the image currently displayed by the laser projection device is determined, and the recognition result of the image is obtained.
  • an embodiment of the present application provides a display method, including:
  • the recognition result is displayed.
  • an embodiment of the present application provides a computing device, including:
  • the memory is used to store program instructions.
  • the processor is configured to call the program instructions stored in the memory, and execute the method described in the second aspect above according to the obtained program.
  • an embodiment of the present application provides a computing device, including:
  • the memory is used to store program instructions.
  • the processor is configured to call the program instructions stored in the memory, and execute the method described in the fourth aspect above according to the obtained program.
  • an embodiment of the present application provides a computing device, including:
  • the memory is used to store program instructions.
  • the processor is configured to call the program instructions stored in the memory, and execute the method described in the fifth aspect above according to the obtained program.
  • FIG. 1 is a schematic diagram of a display system provided by an embodiment of the application.
  • FIG. 2 is a schematic diagram of the interaction process between a multimedia controller, a laser projection device, and a micro-projection device provided by an embodiment of the application;
  • FIG. 3 is a schematic diagram of the interface between the front of the laser projection device and the front of the micro-projection device provided by an embodiment of the application;
  • FIG. 4 is a schematic diagram of the back of the laser projection device and the back of the micro-projection device provided by an embodiment of the application;
  • FIG. 5 is a schematic diagram of an application scenario of interaction between a display system, a control device, and a server provided by an embodiment of the application;
  • FIG. 6 is a configuration block diagram of a control device 400 provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of a system architecture of a display system provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of the hardware structure of a laser projection device provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of an application program layer of a laser projection device provided by an embodiment of the application.
  • FIGS. 10-14 exemplarily show schematic diagrams of interaction between a user interface and a user in a display system according to an exemplary embodiment;
  • FIG. 15 is a flowchart of a display method provided by an embodiment of the application.
  • FIG. 16 is a flowchart of another display method provided by an embodiment of the application.
  • The term "module" used in the various embodiments of this application may refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that can execute the function of the related component.
  • The term "remote control" used in the various embodiments of this application refers to a component of an electronic device (such as the multimedia controller, the laser projection TV, or the micro-projection device disclosed in this application) that can wirelessly control the electronic device within a short range.
  • This component can generally connect to electronic devices using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as wireless fidelity (WIFI), universal serial bus (USB), Bluetooth, and motion sensors.
  • In a handheld touch remote control, the user interface on a touch screen replaces most of the physical built-in hard keys of general remote control devices.
  • The term "gesture" used in the embodiments of the present application refers to a user behavior that expresses an expected idea, action, goal, and/or result through a change of hand shape or a hand movement.
  • the term "hardware system” used in the various embodiments of this application may refer to an integrated circuit (IC), printed circuit board (Printed circuit board, PCB) and other mechanical, optical, electrical, and magnetic devices with computing , Control, storage, input and output functions of the physical components.
  • the hardware system is usually also referred to as a motherboard or a chip.
  • the display system 10 provided by the embodiment of the present application includes: a multimedia controller 100, a laser projection device 200, and a micro-projection device 300.
  • the multimedia controller 100 is in communication connection with the laser projection device 200 and the micro-projection device 300 respectively.
  • The multimedia controller 100 is used to analyze and process multimedia information: it classifies the different types of multimedia information, splits them, and determines whether each type is displayed by the laser projection device 200 or the micro-projection device 300. After splitting the multimedia information, the multimedia controller 100 sends the multimedia information to be displayed by the laser projection device 200 to the laser projection device 200, and sends the multimedia information to be displayed by the micro-projection device 300 to the micro-projection device 300.
  • the laser projection apparatus 200 includes a third controller 201 and a third display 202.
  • the third controller 201 is configured to receive multimedia information sent by the multimedia controller 100, and drive and control the third display 202 to display the multimedia information.
  • the third display 202 is used for responding to the control and driving of the third controller 201 to display multimedia information.
  • the micro-projection device 300 includes a fourth controller 301 and a fourth display 302.
  • the fourth controller 301 is configured to receive multimedia information sent by the multimedia controller 100, and drive and control the fourth display 302 to display the multimedia information.
  • the fourth display 302 is used to respond to the control and driving of the fourth controller 301 to display multimedia information.
  • the third display 202 and the fourth display 302 can be used to display different display images.
  • the third display 202 may be used to display a picture of a TV program
  • the fourth display 302 may be used to display a picture of information such as notification messages and voice assistants.
  • the content displayed on the third display 202 and the content displayed on the fourth display 302 may be independent of each other and do not affect each other.
  • the fourth display 302 may display information such as time, weather, temperature, reminder messages, etc. that are not related to the TV program.
  • the fourth display 302 may display information such as the avatar and the chat duration of the user currently accessing the video chat.
  • part or all of the content displayed on the fourth display 302 can be adjusted to be displayed on the third display 202.
  • For example, the time, weather, temperature, reminder messages, and other information displayed on the fourth display 302 can be adjusted to be displayed on the third display 202, and the fourth display 302 can be used to display other information.
  • the third display 202 displays a multi-party interactive screen while displaying the traditional television program screen, and the multi-party interactive screen does not obstruct the traditional television program screen.
  • the present application does not limit the display mode of the traditional TV program screen and the multi-party interactive screen.
  • this application can set the position and size of the traditional TV program screen and the multi-party interactive screen according to the priority of the traditional TV program screen and the multi-party interactive screen.
  • For example, the area of the traditional TV program screen is larger than that of the multi-party interactive screen, and the multi-party interactive screen can be located on one side of the traditional TV program screen or set floating in a corner of the traditional TV program screen.
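As an illustrative aside (not part of the patent's disclosure), a priority-based layout like the one described above, with a large traditional TV program screen and a smaller multi-party interactive screen floating in a corner, can be computed as a simple rectangle function. The scale factor and margin below are assumed values:

```python
def pip_rect(screen_w, screen_h, scale=0.25, margin=16, corner="bottom-right"):
    """Compute the rectangle (x, y, w, h) of a floating picture-in-picture
    window. The higher-priority main picture fills the screen; the
    lower-priority picture is scaled down and anchored in a corner."""
    w, h = int(screen_w * scale), int(screen_h * scale)
    if corner == "bottom-right":
        x, y = screen_w - w - margin, screen_h - h - margin
    elif corner == "top-right":
        x, y = screen_w - w - margin, margin
    elif corner == "bottom-left":
        x, y = margin, screen_h - h - margin
    else:  # top-left
        x, y = margin, margin
    return x, y, w, h

# Example: a 1920x1080 main picture with a quarter-scale floating window
print(pip_rect(1920, 1080))  # (1424, 794, 480, 270)
```

Because the secondary window never covers the center of the main picture, the traditional TV program screen is not obstructed.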
  • the third display 202 and the fourth display 302 involved in this application are projection display devices.
  • The specific display device types, sizes, and resolutions of the third display 202 and the fourth display 302 are not limited; those skilled in the art will understand that the performance and configuration of the third display 202 and the fourth display 302 can be changed as required.
  • The third display 202 may be connected to or provided with a camera for presenting the picture captured by the camera on the display interface of the laser projection device, the micro-projection device, or another display device, so as to realize interactive chat between users. The picture captured by the camera can be displayed on the laser projection device in full screen, half screen, or in any selectable area.
  • the camera is connected to the rear shell of the laser projection device through a connecting plate, and is fixedly installed in the upper middle of the rear shell of the laser projection device.
  • Alternatively, it can be fixedly installed at any position on the rear shell of the laser projection device that ensures the image acquisition area is not blocked by the rear shell; for example, the image acquisition area can face the same direction as the display of the laser projection device.
  • the camera can be connected to the rear shell of the laser projection device through a connecting plate or other conceivable connectors.
  • The connector is equipped with a lifting motor. When the user wants to use the camera, or an application needs to use the camera, the camera rises above the laser projection device; when the camera is not needed, it can be retracted behind the rear shell, which both protects the camera from damage and protects the user's privacy.
  • The camera used in this application may have 16 megapixels to achieve ultra-high-definition display. In actual use, a camera with more or fewer than 16 megapixels may also be used.
  • the content displayed in different application scenarios of the laser projection device can be merged in a variety of different ways, so as to achieve functions that cannot be achieved by the traditional laser projection device.
  • the user may have a video chat with at least one other user while watching a video program.
  • the presentation of the video program can be used as the background screen, and the video chat window is displayed on the background screen.
  • At least one video chat is conducted across terminals.
  • the user can video chat with at least one other user while entering the education application for learning.
  • For example, students can realize remote interaction with teachers while learning content in educational applications. Vividly, this function can be called "learning and chatting".
  • In some embodiments, a video chat may be conducted with players entering a game. When a player enters a game application to participate in a game, remote interaction with other players can be realized. Vividly, this function can be called "watch and play".
  • the game scene and the video picture are integrated, and the portrait in the video picture is cut out and displayed on the game picture to improve the user experience.
  • somatosensory games such as ball games, boxing games, running games, dancing games, etc.
  • human body postures and movements are acquired through the camera, body detection and tracking, and detection of key points of human skeleton data, and then interact with the game Animations are integrated to realize games such as sports, dance and other scenes.
  • For example, the user can interact with at least one other user via video and voice in a karaoke application. Vividly, this function can be called "watch and sing".
  • multiple users can jointly complete the recording of a song.
  • For example, the user can turn on the camera locally to obtain pictures and videos. Vividly, this function can be called "look in the mirror".
  • FIG. 1 only takes the camera installed on the housing of the third display as an example for illustration.
  • the installation position of the camera can be determined according to actual requirements.
  • For example, it may be arranged on the housing of the third controller, the housing of the third display, the housing of the fourth controller, or the housing of the fourth display, or arranged independently, which is not limited in this application.
  • the interaction process between the multimedia controller 100, the laser projection device 200, and the micro-projection device 300 includes the following steps:
  • S201: The multimedia controller obtains multimedia information.
  • the multimedia information includes at least one of the following: video information, text information, system notification information, system push information, window information corresponding to the one-click image search function, streaming media information, and video call information.
  • S202: The multimedia controller classifies the multimedia information, and determines the first information and the second information.
  • the first information is information displayed by the laser projection device.
  • the second information is the information played by the micro-projection device.
  • the first information is video information.
  • The video information includes, for example, live TV programs, online video-on-demand programs, and online live-streaming programs.
  • the second information is information other than the video information in the multimedia information.
  • Information other than the video information includes, for example, weather, time, text news, system notification information, system push information, window information corresponding to the one-click image search function, streaming media information, and video call information.
  • S203: The multimedia controller sends the first information to the third controller.
  • the third controller receives the first information from the multimedia controller.
  • S204: The third controller controls and drives the third display to display the first information.
  • S205: The multimedia controller sends the second information to the fourth controller.
  • Correspondingly, the fourth controller receives the second information from the multimedia controller.
  • S206: The fourth controller controls and drives the fourth display to display the second information.
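The classification-and-dispatch flow described in the steps above can be sketched as follows. This is an illustrative sketch only: the routing rule (video to the laser projection device, everything else to the micro-projection device) comes from the description, while the type labels and the `display` method on the controllers are assumptions:

```python
# Assumed labels for "video information"; everything else is second information
VIDEO_TYPES = {"live_tv", "vod", "live_stream"}

def classify(multimedia_info):
    """Split multimedia information into first information (video, shown by
    the laser projection device) and second information (everything else,
    shown by the micro-projection device)."""
    first, second = [], []
    for item in multimedia_info:
        (first if item["type"] in VIDEO_TYPES else second).append(item)
    return first, second

def dispatch(multimedia_info, third_controller, fourth_controller):
    first, second = classify(multimedia_info)
    third_controller.display(first)    # third display: main video picture
    fourth_controller.display(second)  # fourth display: notifications, etc.

info = [
    {"type": "live_tv", "payload": "channel 5"},
    {"type": "notification", "payload": "new message"},
    {"type": "weather", "payload": "sunny, 22C"},
]
first, second = classify(info)
print(len(first), len(second))  # 1 2
```

The split keeps notifications and assistant windows off the main picture entirely, rather than overlaying them on it.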
  • the multimedia controller 100 and the third controller 201 may be provided in the main body of the laser projection device.
  • the main body of the laser projection device is usually placed directly in front of the third display 202.
  • the fourth display 302 may be provided on one side of the third display 202.
  • the fourth display 302 is arranged at any position above, below, on the left side, and on the right side of the third display 202.
  • the display system described in this application may include multiple micro-projection devices.
  • the display system includes two micro-projection devices, and the two micro-projection devices are respectively located on the left and right sides of the laser projection device.
  • This application only takes one micro-projection device as an example for description.
  • the specific implementation is similar to that of the display system including one micro-projection device, which is not repeated in this application.
  • In some embodiments, the main body of the micro-projection device may be arranged on the back of the display screen 202 of the laser projection device.
  • FIG. 5 is a schematic diagram of an application scenario in which the display system 10 interacts with the control device 400 and the server 500 in the embodiment of the present application.
  • The control device 400 may be a remote controller 400A, which can communicate with the multimedia controller 100 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods, and is used to control the multimedia controller 100 wirelessly or through other wired methods.
  • the user can control the multimedia controller 100 by inputting user instructions through buttons, voice input, control panel input, etc. on the remote controller 400A.
  • For example, the user can input corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power key on the remote controller 400A to control the functions of the multimedia controller 100.
  • The control device 400 can also be a smart device, such as a mobile terminal 400B, a tablet computer, a computer, or a notebook computer, which can communicate with the multimedia controller 100 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and control the multimedia controller 100 through an application program corresponding to the multimedia controller 100.
  • an application program running on a smart device is used to control the multimedia controller 100.
  • the application can provide users with various controls through an intuitive user interface (UI, User Interface) on the screen associated with the smart device.
  • both the mobile terminal 400B and the multimedia controller 100 can be installed with software applications, so that the connection and communication between the two can be realized through a network communication protocol, thereby achieving the purpose of one-to-one control operation and data communication.
  • the mobile terminal 400B can establish a control command protocol with the multimedia controller 100, synchronize the remote control keyboard to the mobile terminal 400B, and realize the function of controlling the multimedia controller 100 by controlling the user interface of the mobile terminal 400B;
  • the audio and video content displayed on the terminal 400B is transmitted to the multimedia controller 100 to realize the synchronous display function.
  • the server 500 may be a video server, an Electronic Program Guide (EPG, Electronic Program Guide) server, a cloud server, etc.
  • the multimedia controller 100 can perform data communication with the server 500 through a variety of communication methods.
  • the multimedia controller 100 may be allowed to perform a wired communication connection or a wireless communication connection with the server 500 through a local area network, a wireless local area network, or other networks.
  • the server 500 may provide various contents and interactions to the multimedia controller 100.
  • For example, the multimedia controller 100 can receive software program updates, interact with an EPG, or access a remotely stored digital media library by sending and receiving information.
  • the server 500 may be a group or multiple groups, and may be one or more types of servers.
  • the server 500 provides other network service content such as video-on-demand and advertising services.
  • Fig. 6 exemplarily shows a configuration block diagram of the control device 400 according to an exemplary embodiment.
  • the control device 400 includes a controller 410, a communicator 430, a user input/output interface 440, a memory 490, and a power supply 480.
  • The control device 400 is configured to control the multimedia controller 100: it can receive operation instructions input by the user, convert the operation instructions into instructions that the multimedia controller 100 can recognize and respond to, and serve as an intermediary between the user and the multimedia controller 100.
  • For example, the user operates the channel up/down keys on the control device 400, and the multimedia controller 100 responds to the channel up/down operations.
  • control device 400 may be a smart device.
  • control device 400 can install various applications for controlling the multimedia controller 100 according to user requirements.
  • the mobile terminal 400B or other intelligent electronic equipment can perform a similar function as the control device 400 after installing an application for controlling the multimedia controller 100.
  • the user can install various function keys or virtual buttons of the graphical user interface that can be provided on the mobile terminal 400B or other smart electronic devices by installing applications to realize the function of the physical keys of the control device 400.
  • the controller 410 includes a processor 412, a RAM 413 and a ROM 414, a communication interface, and a communication bus.
  • The controller 410 is used to control the running of the control device 400, the communication and cooperation between its internal components, and external and internal data processing functions.
  • the communicator 430 realizes the communication of control signals and data signals with the multimedia controller 100 under the control of the controller 410. For example, the received user input signal is sent to the multimedia controller 100.
  • the communicator 430 may include at least one of communication modules such as a WIFI module 431, a Bluetooth module 432, and a Near Field Communication (NFC) module 433.
  • In the user input/output interface 440, the input interface includes at least one of input interfaces such as a microphone 441, a touch panel 442, a sensor 443, a button 444, and a camera 445.
  • the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
  • The input interface converts a received analog signal into a digital signal, converts the digital signal into a corresponding instruction signal, and sends it to the multimedia controller 100.
  • the output interface includes an interface for sending the received user instruction to the multimedia controller 100.
  • it may be an infrared interface or a radio frequency interface.
  • For an infrared signal interface, a user input instruction needs to be converted into an infrared control signal according to an infrared control protocol, and sent to the multimedia controller 100 via an infrared sending module.
  • For a radio frequency signal interface, a user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the multimedia controller 100 by the radio frequency transmitting terminal.
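As an illustrative sketch of the infrared path, the widely used NEC IR protocol (assumed here for concreteness; the patent does not name a specific infrared control protocol) packs a user instruction into a 32-bit frame of address, inverted address, command, and inverted command, which the receiver can validate before acting:

```python
def nec_frame(address: int, command: int) -> int:
    """Build a 32-bit NEC infrared frame: address, ~address, command,
    ~command (each 8 bits). The inverted bytes let the receiver validate
    the frame before responding to the user instruction."""
    assert 0 <= address <= 0xFF and 0 <= command <= 0xFF
    return (address
            | ((address ^ 0xFF) << 8)
            | (command << 16)
            | ((command ^ 0xFF) << 24))

def nec_decode(frame: int):
    """Return (address, command) if the frame is valid, else None."""
    a, na = frame & 0xFF, (frame >> 8) & 0xFF
    c, nc = (frame >> 16) & 0xFF, (frame >> 24) & 0xFF
    if a ^ na == 0xFF and c ^ nc == 0xFF:
        return a, c
    return None

frame = nec_frame(0x20, 0x15)  # e.g. "volume up" on a hypothetical remote
print(hex(frame), nec_decode(frame))  # 0xea15df20 (32, 21)
```

The address/command values above are hypothetical; a real remote's key map is defined by its manufacturer.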
  • control device 400 includes at least one of a communicator 430 and an output interface.
  • In some embodiments, the control device 400 is equipped with a communicator 430, such as WIFI, Bluetooth, or NFC modules, which can encode the user input command through the WIFI protocol, the Bluetooth protocol, or the NFC protocol, and send it to the multimedia controller 100.
  • the memory 490 is used to store various operating programs, data, and applications for driving and controlling the control device 400 under the control of the controller 410.
  • the memory 490 can store various control signal instructions input by the user.
  • the power supply 480 is used to provide operating power support for the electrical components of the control device 400 under the control of the controller 410.
  • the power supply 480 can be powered by a battery and related control circuits.
  • FIG. 7 is only an exemplary description of the system architecture of the display system of the present application, and does not limit the application. In practical applications, the display system can include more or fewer hardware components or interfaces as required.
  • the system architecture diagram of the display system provided in this embodiment of the application includes:
  • the multimedia control unit 100 includes a first multimedia unit 701.
  • the laser projection device 200 includes: a first projection display control unit 702, a first control unit 703, an eye guard control unit 704, and a first imaging projection display unit 705.
  • the micro-projection device 300 includes: a second multimedia unit 706, a second control unit 707, a second projection display control unit 708, and a second imaging projection display unit 709.
  • Unit modules such as the power supply module 710, the light source driving unit 711, and the audio processing unit 712 may be integrated in the host of the laser projection device 200 or the host of the micro-projection device 300.
  • the first multimedia unit 701 is configured to receive external input signals and send corresponding input signals to the first projection display control unit 702, the first control unit 703, and/or the second multimedia unit 706, respectively.
  • the first multimedia unit 701 may specifically include at least one of the following: a WIFI module, a wireless input module, an Ethernet input module, a USB input module, a High-Definition Multimedia Interface (HDMI) input module, a touch key input module, a light sensing module, and a far-field voice module.
  • the first multimedia unit 701 can receive external input signals via at least one of the above modules.
  • the first multimedia unit 701 is also used for inter-integrated circuit (I2C) communication with the first control unit 703; for I2C, serial, USB, or wireless communication with the second multimedia unit 706; and to provide a VB1 video signal for the first projection display control unit 702.
  • the first projection display control unit 702 is configured to send a low-voltage differential signaling (LVDS) or High-Speed Serial Interface (HSSI) video signal and a control signal to the first imaging projection display unit 705, to drive and control the first imaging projection display unit 705 to display video images and to control the working state of the first imaging projection display unit 705.
  • the first projection display control unit 702 is further configured to provide a pulse width modulation (PWM) signal and a duty signal for the light source driving unit 711, and perform I2C communication with the first control unit 703.
  • the first control unit 703 is used to control the working state of the heat dissipation components of the laser projection device; monitor the ambient temperature and the laser temperature; control the speed of the despeckle wheel; and control the power-on and power-off of the light source driving unit 711.
  • the first control unit 703 is also used for I2C communication or serial communication with the eye guard control unit 704.
  • the eye guard control unit 704 is used to control the working mode of the eye guard of the laser projection device.
  • the first imaging projection display unit 705 includes the lens part of the laser projection device, or an optical system composed of the light valve, illumination system, and projection lens of the laser projection device.
  • the first imaging projection display unit 705 is configured to receive the LVDS or HSSI video signal, etc. from the first projection display control unit 702, and project the video image onto the third display.
  • the second multimedia unit 706 is configured to receive multimedia signals from the first multimedia unit 701, and send corresponding multimedia signals to the second control unit 707 and the second projection display control unit 708.
  • the second multimedia unit 706 is also used to perform I2C communication with the second control unit, and send LVDS video signals to the second projection display control unit 708.
  • the second control unit 707 is used to control the working state of the heat sink of the laser projection equipment; monitor the ambient temperature and the temperature of the laser; and perform I2C communication with the second projection display control unit 708.
  • the second projection display control unit 708 is configured to send LVDS video signals and control signals to the second imaging projection display unit 709.
  • the second imaging projection display unit 709 includes the lens part of the micro-projection device, or an optical system composed of the light valve, illumination system, and projection lens of the micro-projection device.
  • the second imaging projection display unit 709 is configured to receive the LVDS video signal from the second projection display control unit 708 and project the video image onto the fourth display.
  • the power module 710 is used to supply power to the display system.
  • the power module 710 may provide 12V power for the first control unit; provide 12V power for the first projection display control unit; provide 18V power for the audio processing unit 712 and the first multimedia unit.
  • the light source driving unit 711 is used to provide an energy source for the laser.
  • the lasers include: a blue laser 716, a green laser 717, and a red laser 718, which are used to provide lasers of three primary colors.
  • the three lasers are used to provide laser light sources for laser projection equipment.
  • FIG. 8 shows a schematic diagram of the hardware structure of a laser projection device provided by an embodiment of the present application.
  • the laser projection device includes: a television (television, TV) board 810, a display board 820, a light source 830, and a light source drive circuit 840.
  • the TV board 810 is mainly used to receive and decode external audio and video signals.
  • the TV board 810 is equipped with a system-on-chip (SoC), which can decode data in different data formats into a normalized format and transmit the data in the normalized format to the display board 820.
  • the display board 820 may be provided with a Field Programmable Gate Array (FPGA) 821, a control processing module 822, and a light modulation device 823.
  • the FPGA 821 is used for processing the input video image signal, such as performing motion estimation and motion compensation (MEMC) frame-rate multiplication processing or image correction, to implement image enhancement functions.
  • the control processing module 822 is connected to the FPGA 821 and is used to receive the processed video image signal data as the image data to be projected.
  • the control processing module 822 outputs the current PWM brightness adjustment signal and the enable control signal according to the image data to be projected, and realizes the timing and lighting control of the light source 830 through the light source driving circuit 840.
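As a rough illustration of the PWM-based light source control above, the brightness-to-duty mapping might look like the following sketch. The 8-bit brightness range, the linear mapping, and the gating threshold are all assumptions for illustration, not values from the patent.

```python
# Minimal sketch of deriving a PWM duty cycle from the brightness of the
# image data to be projected, as a control processing module might do
# before handing timing/lighting control to the light source driver.

def pwm_duty_from_brightness(pixels: list[int]) -> float:
    """Map the average 8-bit brightness of a frame to a duty cycle in [0, 1]."""
    avg = sum(pixels) / len(pixels)
    return avg / 255.0

def enable_signal(duty: float, threshold: float = 0.01) -> bool:
    """The enable control signal could simply gate the light source off
    for (near-)black frames."""
    return duty >= threshold

frame = [0, 128, 255, 128]
duty = pwm_duty_from_brightness(frame)   # ≈ 0.5
```

A real implementation would of course work per color channel and per backlight partition rather than on a whole-frame average.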
  • the light modulation device 823 can receive the video image signal output by the TV board 810, and analyze it to obtain the partition brightness signal and image components of the video image.
  • the light modulation device 823 may receive the image signal to be projected output by the FPGA 821, and the image signal to be projected may include the analyzed image brightness signal and image components.
  • the light source 830 includes a red light source, a blue light source, and a green light source.
  • the light sources of the three colors can emit light simultaneously or sequentially.
  • the light source 830 is driven to light up according to the timing of image display indicated by the control instruction of the control processing module 822.
  • the application program layer of the laser projection device includes various application programs that can be executed on the laser projection device 200.
  • the application layer 1912 of the laser projection device 200 may include, but is not limited to, one or more applications, such as a live TV application, a video-on-demand application, a media center application, an application center, a game application, and so on.
  • Live TV applications can provide live TV through different sources.
  • a live TV application can provide a TV signal using input from cable TV, wireless broadcasting, satellite services, or other types of live TV services.
  • the live TV application program can display the video of the live TV signal on the laser projection device 200.
  • Video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, video on demand provides video display from certain storage sources. For example, video on demand can come from the server side of cloud storage or from local hard disk storage containing stored video programs.
  • Media center applications can provide various multimedia content playback applications.
  • the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
  • Application center can provide storage of various applications.
  • the application can be a game, an application program, or another application that is related to a computer system or other device but can run on a display device.
  • the application center can obtain these applications from different sources, store them in a local storage, and then run on the laser projection device 200.
  • the user can input an image search command to instruct the laser projection device to perform an image search, where the image search command can be input by pressing the image search button on the remote control of the multimedia controller, or issued by the user speaking a specific voice command such as "please search for pictures".
  • the image search means that the multimedia controller captures the image currently displayed on the third display according to the image search instruction input by the user, and sends the image to the cloud server for recognition.
  • the cloud server recognizes the persons and objects in the image.
  • the multimedia controller displays the information recognized by the cloud server on the fourth display, and further, when the user selects a certain recognition result, the associated information of the recognition result is displayed on the fourth display.
  • The following describes the interfaces of the third display and the fourth display after the multimedia controller receives the image search command, as well as the interaction among the multimedia controller, the third controller, the third display, the fourth controller, the fourth display, and the cloud server.
  • FIGS. 10-14 exemplarily show schematic diagrams of interaction between the user interface and the user in the multimedia controller, the laser projection device, and the micro-projection device according to an exemplary embodiment.
  • the third display is currently playing a program of the TV channel "AATV".
  • the multimedia controller displays as illustrated in FIG. 11.
  • the fourth display displays the recognition results of the captured images, including the faces of two people, a TV channel, and three objects.
  • a moving icon may be displayed on the fourth display, such as the icon that moves to the left as shown in Fig. 11.
  • the fourth display can display the associated information of the recognition result selected by the user. Specific examples are shown in FIG. 12, FIG. 13, and FIG. 14. As shown in FIG. 12, when the user selects a face icon on the fourth display in FIG. 11, the associated information of the person corresponding to the face is displayed on the fourth display, for example, the person's name, occupation, date of birth, hometown, main works, etc.
  • the user can press a specific remote control button or issue a specific voice command to trigger the multimedia controller to display, on the fourth display, the same products of all the items in the recognition result and the purchase links of those products.
  • FIG. 15 is a flowchart of a display method provided by an embodiment of the application.
  • the display method can be applied to the display system shown in FIG. 1.
  • the display method includes:
  • the multimedia controller receives an image search instruction.
  • the above-mentioned image search command is sent by the control device to the multimedia controller.
  • when the user is interested in the persons or commodities appearing in the video while watching a video program played by the laser projection device, the user can send the image search command to the multimedia controller by using the image search function on the control device.
  • for example, the user presses the image search button on the remote control of the multimedia controller, and the remote control generates an image search command after detecting the pressing operation and sends it to the multimedia controller.
  • as another example, the user uses the voice control function of the remote control to input "image search" by voice; after the remote control detects the voice input, it generates an image search command and sends it to the multimedia controller.
  • the multimedia controller determines the image currently displayed by the laser projection device, and obtains the recognition result of the image.
  • the image currently displayed by the laser projection device is a screenshot of the video interface currently played by the laser projection device.
  • the result of image recognition includes people, TV channels, objects, buildings, animals, plants, etc. in the image.
  • the multimedia controller sends a first instruction to the micro-projection device.
  • the micro-projection device receives the first instruction from the multimedia controller.
  • the first instruction is used to instruct the micro-projection device to display the recognition result.
  • the micro-projection device displays the recognition result.
  • the micro-projection device may display each recognition result in the form of an icon.
  • the micro-projection device can sort various types of recognition results and display each recognition result in a preset order.
  • the micro-projection device directly displays the image, and the image is divided into regions according to each recognition result.
  • the user selects the corresponding recognition result by selecting the area corresponding to the recognition result.
  • the multimedia controller can display the recognition result of the image currently displayed by the laser projection device on the micro projection device.
  • the recognition result does not cover the picture being displayed by the laser projection device, so that the user can view the image search results while watching the picture displayed by the laser projection device without obstruction, thereby greatly improving the user experience.
  • the above S1502 can be specifically implemented by the following S1502a-S1502b.
  • the multimedia controller determines the image currently displayed by the laser projection device.
  • the image displayed by the laser projection device is sent to the laser projection device by the multimedia controller. Therefore, after the multimedia controller receives the image search instruction, it determines the image it sent to the laser projection device at the current moment, and uses this image as the image currently displayed by the laser projection device.
  • the laser projection device has a screen capture function. After the multimedia controller receives the image search command, it sends the image search command to the laser projection device. After the laser projection device receives the image search command, it captures the currently displayed image and sends the image to the multimedia controller.
  • the multimedia controller sends the image currently displayed by the laser projection device to the cloud server.
  • the cloud server receives the image from the multimedia controller.
  • the cloud server searches for the image, and determines the recognition result of the image.
  • after receiving the image, the cloud server performs face recognition and object recognition on the image, and determines the recognition result of the image.
  • the cloud server sends the image recognition result to the multimedia controller.
  • the multimedia controller receives the image recognition result from the cloud server.
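The round trip of S1502a-S1502b and the cloud exchange above can be sketched as message passing between the parties. This is a stub-based illustration under stated assumptions: the class and method names are invented, and the recognition output is hard-coded to stand in for a real recognition service.

```python
# Illustrative sketch of the image search round trip: the multimedia
# controller captures the current frame, the cloud server recognizes it,
# and the recognition result flows back to be sent in the first instruction.

class CloudServer:
    def recognize(self, image: bytes) -> list[str]:
        # A real server would run face/object recognition on the image bytes.
        return ["face_1", "face_2", "tv_channel", "item_1"]

class MultimediaController:
    def __init__(self, cloud: CloudServer):
        self.cloud = cloud

    def current_frame(self) -> bytes:
        # Stand-in for the screenshot of the laser projection device's output.
        return b"<jpeg bytes>"

    def handle_image_search(self) -> list[str]:
        image = self.current_frame()            # S1502a: determine current image
        results = self.cloud.recognize(image)   # S1502b: cloud search/recognition
        return results                          # then carried by the first instruction

controller = MultimediaController(CloudServer())
recognition = controller.handle_image_search()
```

The point of the structure is that the micro-projection device never talks to the cloud directly; the multimedia controller mediates every exchange, matching the message flow in FIG. 15.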
  • the micro-projection device determines the first operation.
  • the first operation is used to select the target object in the recognition result.
  • the first operation can be determined directly by the micro-projection device, or determined by the multimedia controller and forwarded to the micro-projection device.
  • the above-mentioned first operation may be input by the user pressing a button on the control device, or may be input by the user's voice.
  • the above-mentioned target object may be one of the above-mentioned recognition results, or may be multiple of the above-mentioned recognition results.
  • the micro-projection device sends a second instruction to the multimedia controller.
  • the multimedia controller receives the second instruction from the micro-projection device.
  • the second instruction is used to indicate the target object.
  • the multimedia controller sends a fourth instruction to the cloud server.
  • the cloud server receives the fourth instruction from the multimedia controller.
  • the fourth instruction is used to instruct the cloud server to search for the associated information of the target object.
  • the above-mentioned associated information can represent different meanings.
  • the target object in the recognition result is a person
  • the above-mentioned related information may be the person's name, occupation, date of birth, hometown, representative work, and so on.
  • the target object in the recognition result is a TV channel
  • the above-mentioned associated information may be the name of the TV channel, the program being played, the program preview, and so on.
  • the target object in the recognition result is an item
  • the above-mentioned related information may be the same item of the item and the purchase link.
  • the target object in the recognition result is a building
  • the above-mentioned related information may be the location, characteristics, etc. of the building.
  • the target object in the recognition result is a plant
  • the above-mentioned related information may be the name, variety, brief introduction, etc. of the plant.
  • the target object in the recognition result is an animal
  • the above-mentioned associated information may be the animal's name, living area, habits, and so on.
  • the cloud server determines the associated information of the target object, and generates a fifth instruction according to the associated information of the target object.
  • the fifth instruction is used to indicate the associated information of the target object.
  • the fourth instruction sent by the multimedia controller to the cloud server includes an identifier (for example, an icon) of each target object.
  • the cloud server determines the target object according to the identification of each target object, and searches for associated information of the target object.
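The identifier-keyed lookup above might, in spirit, be table-driven, mirroring the per-type fields listed earlier (person, TV channel, item, ...). Everything in this sketch — the store layout, the key format, and all data values — is invented for illustration.

```python
# Sketch of the cloud server resolving target object identifiers carried by
# the fourth instruction to their associated information.

ASSOCIATED_INFO = {
    "person:1":  {"name": "Jane Doe", "occupation": "actor"},
    "channel:1": {"name": "AATV", "now_playing": "evening news"},
    "item:1":    {"same_product": "mug", "purchase_link": "https://example.com/mug"},
}

def lookup_associated_info(target_ids: list[str]) -> dict[str, dict]:
    """Resolve each target identifier to its associated information,
    skipping identifiers the server does not know."""
    return {tid: ASSOCIATED_INFO[tid] for tid in target_ids if tid in ASSOCIATED_INFO}

info = lookup_associated_info(["channel:1", "unknown:0"])
```

The resolved dictionary is what the fifth instruction would carry back to the multimedia controller.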
  • the cloud server sends a fifth instruction to the multimedia controller.
  • the multimedia controller receives the fifth instruction from the cloud server.
  • the multimedia controller generates a third instruction according to the fifth instruction.
  • the multimedia controller sends a third instruction to the micro-projection device.
  • the micro-projection device receives the third instruction from the multimedia controller.
  • the micro-projection device displays the associated information of the target object.
  • first related information of the person is acquired, and the first related information is displayed on the second display, and the first related information includes introduction information of the person.
  • the second associated information of the channel is acquired, and the second associated information is displayed on the second display.
  • the second associated information includes program information currently broadcast on the channel and a program preview.
  • the third related information of the article is acquired, and the third related information is displayed on the second display.
  • the third related information includes the same product information of the article and purchase link information.
  • the micro-projection device displays the associated information of the target object
  • the user can further select the associated information of the target object to view.
  • the user can select one item at a time or multiple items at a time. If one item is selected, the second controller displays the same product information and the purchase link information of that item on the second display. If multiple items are selected, the second controller displays the same product information and purchase link information of the multiple items on the second display.
  • the user can instruct viewing of the same product information and purchase link information of an item by selecting the icon of the item and confirming it, as illustrated in FIG. 14; and can instruct viewing of the same product information and purchase link information of multiple items by pressing a specific key (for example, the left key) of the remote control of the multimedia controller.
  • after the micro-projection device displays the target object, the micro-projection device can further display the associated information of each target object according to the received instruction. As a result, the user can obtain more information about the target object while normally watching the screen of the laser projection device, further enhancing the user experience.
  • the information transmission between the multimedia controller, the laser projection device and the micro-projection device may be transmitted through a serial port.
  • an embodiment of the present application further provides a storage medium storing instructions that, when run on a computer, cause the computer to execute the method in the foregoing method embodiment.
  • an embodiment of the present application further provides a chip for executing instructions, and the chip is configured to execute the method in the foregoing method embodiment.
  • An embodiment of the present application also provides a program product. The program product includes a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and when the at least one processor executes the computer program, the method in the foregoing method embodiment can be implemented.
  • "At least one" refers to one or more, and "multiple" refers to two or more.
  • "And/or" describes the association relationship of the associated objects, indicating that three relationships are possible. For example, A and/or B can mean: A alone exists, A and B exist at the same time, or B alone exists, where A and B can be singular or plural.
  • the character "/" generally indicates that the associated objects before and after it are in an "or" relationship; in a formula, the character "/" indicates that the associated objects before and after it are in a "division" relationship.
  • "At least one of the following item(s)" or similar expressions refers to any combination of these items, including any combination of a single item or a plurality of items.
  • for example, at least one of a, b, or c can mean: a, b, c, a and b, a and c, b and c, or a, b, and c, where a, b, and c can be singular or plural.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

Embodiments of the present application provide a display system, a display method, and a computing device. The display system includes: a multimedia controller, a laser projection device, and a micro-projection device. The multimedia controller is configured to: receive an image search instruction; in response to the image search instruction, determine the image currently displayed by the laser projection device, obtain the recognition result of the image, and send a first instruction to the micro-projection device, where the first instruction is used to instruct the micro-projection device to display the recognition result. The micro-projection device is configured to: receive the first instruction from the multimedia controller and, in response to the first instruction, display the recognition result.

Description

Display system, display method, and computing device

This application claims priority to Chinese patent application No. 201911067968.5, filed on November 4, 2019 and entitled "Display device, display method, and computing device", the entire contents of which are incorporated herein by reference.

Technical Field

This application relates to smart display device technology, and in particular to a display system, a display method, and a computing device.

Background

With the continuous development of television technology and Internet technology, Internet-based laser projection devices have emerged. Based on Internet technology, a laser projection device has an open operating system and chip as well as an open application platform, and can support multiple functions such as audio and video, entertainment, and data, thereby meeting users' diversified needs and bringing users a brand-new experience. In terms of user interaction, the laser projection device can support voice interaction, and the user can control the laser projection device by voice.

In the prior art, a laser projection device includes one display. When the user presses the image search button on the remote control of the laser projection device, the laser projection device captures the image currently being displayed on the display and sends the image to a cloud server to identify the persons, objects, and the like in the image. The cloud server sends the recognition result to the laser projection device, and the laser projection device displays the recognition result on the display. After the user selects a recognition result, the laser projection device obtains the associated information of the recognition result from the cloud server and displays the associated information on the display. When displaying the recognition result and the associated information, the laser projection device blocks the main picture currently being displayed on the display. This affects the user's normal viewing of the main picture, resulting in a poor user experience.
Summary

The present application provides a projection system and a projection screen, adopting the following technical solutions:

In a first aspect, an embodiment of the present application provides a display system, including:

a first display, a second display, a first controller, and a second controller.

The first controller is configured to: receive an image search instruction input by a user; in response to the image search instruction, capture the image currently displayed on the first display, obtain the recognition result of the image, and send a first instruction to the second controller, where the first instruction is used to instruct the second controller to display the recognition result.

The second controller is configured to: in response to the first instruction, display the recognition result on the second display.

In a second aspect, an embodiment of the present application provides a display method, including:

receiving a first instruction, where the first instruction is sent by a first controller after the first controller receives an image search instruction input by a user, captures the image currently displayed on a first display, and obtains the recognition result of the image, and the first instruction is used to instruct display of the recognition result; and

in response to the first instruction, displaying the recognition result on a second display.

In a third aspect, an embodiment of the present application provides a display system, including:

a multimedia controller, a laser projection device, and a micro-projection device.

The multimedia controller is configured to: receive an image search instruction; in response to the image search instruction, determine the image currently displayed by the laser projection device, obtain the recognition result of the image, and send a first instruction to the micro-projection device, where the first instruction is used to instruct the micro-projection device to display the recognition result.

The micro-projection device is configured to: receive the first instruction from the multimedia controller and, in response to the first instruction, display the recognition result.

In a fourth aspect, an embodiment of the present application provides a display method, including:

receiving an image search instruction;

in response to the image search instruction, determining the image currently displayed by a laser projection device, and obtaining the recognition result of the image; and

sending a first instruction to a micro-projection device, where the first instruction is used to instruct the micro-projection device to display the recognition result.

In a fifth aspect, an embodiment of the present application provides a display method, including:

receiving a first instruction from the multimedia controller, where the first instruction is used to instruct the micro-projection device to display a recognition result, and the recognition result is the recognition result, determined by the multimedia controller according to an image search instruction, of the image currently displayed by the laser projection device; and

in response to the first instruction, displaying the recognition result.

In a sixth aspect, an embodiment of the present application provides a computing device, including:

a memory for storing program instructions; and

a processor for calling the program instructions stored in the memory and executing the method described in the second aspect above according to the obtained program.

In a seventh aspect, an embodiment of the present application provides a computing device, including:

a memory for storing program instructions; and

a processor for calling the program instructions stored in the memory and executing the method described in the fourth aspect above according to the obtained program.

In an eighth aspect, an embodiment of the present application provides a computing device, including:

a memory for storing program instructions; and

a processor for calling the program instructions stored in the memory and executing the method described in the fifth aspect above according to the obtained program.
Brief Description of the Drawings

In order to describe the technical solutions in the present application or the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present application, and those of ordinary skill in the art may obtain other drawings based on these drawings without creative effort.

FIG. 1 is a schematic diagram of a display system provided by an embodiment of the present application;

FIG. 2 shows the interaction flow among the multimedia controller, the laser projection device, and the micro-projection device provided by an embodiment of the present application;

FIG. 3 is a schematic interface diagram of the front of the laser projection device and the front of the micro-projection device provided by an embodiment of the present application;

FIG. 4 is a schematic diagram of the back of the laser projection device and the back of the micro-projection device provided by an embodiment of the present application;

FIG. 5 is a schematic diagram of an application scenario in which the display system interacts with a control device and a server provided by an embodiment of the present application;

FIG. 6 is a configuration block diagram of the control device 400 provided by an embodiment of the present application;

FIG. 7 is a schematic diagram of the system architecture of the display system provided by an embodiment of the present application;

FIG. 8 is a schematic diagram of the hardware structure of the laser projection device provided by an embodiment of the present application;

FIG. 9 is a schematic diagram of the application layer of the laser projection device provided by an embodiment of the present application;

FIGS. 10-14 exemplarily show schematic diagrams of interaction between the user interface and the user in the display system according to exemplary embodiments;

FIG. 15 is a flowchart of a display method provided by an embodiment of the present application;

FIG. 16 is a flowchart of another display method provided by an embodiment of the present application.
Detailed Description

The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. Based on the embodiments in the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.

The concepts involved in the present application are first described below with reference to the drawings. It should be noted that the following descriptions of the concepts are only intended to make the content of the present application easier to understand and do not limit the protection scope of the present application.

The term "module" used in the embodiments of the present application may refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware or/and software code capable of performing the function associated with the element.

The term "remote control" used in the embodiments of the present application refers to a component of an electronic device (such as the multimedia controller, laser projection television, or micro-projection device disclosed in this application) that can generally wirelessly control the electronic device within a short distance. The component may generally be connected to the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include functional modules such as wireless fidelity (WIFI), wireless, universal serial bus (USB), Bluetooth, and motion sensors. For example, a handheld touch remote control replaces most of the physical built-in hard keys of a general remote control device with a user interface on a touch screen.

The term "gesture" used in the embodiments of the present application refers to a user behavior in which the user expresses an expected idea, action, purpose, or result through a change of hand shape or a hand movement.

The term "hardware system" used in the embodiments of the present application may refer to a physical component with computing, control, storage, input, and output functions composed of mechanical, optical, electrical, and magnetic devices such as integrated circuits (IC) and printed circuit boards (PCB). In the embodiments of the present application, the hardware system is also usually referred to as a motherboard or chip.
As shown in FIG. 1, the display system 10 provided by the embodiment of the present application includes: a multimedia controller 100, a laser projection device 200, and a micro-projection device 300.

The multimedia controller 100 is communicatively connected to the laser projection device 200 and the micro-projection device 300, respectively.

The multimedia controller 100 is used for multimedia information analysis and processing, judging different multimedia information and performing distribution processing, and determining whether each type of multimedia information is displayed by the laser projection device 200 or by the micro-projection device 300. After distributing the multimedia information, the multimedia controller 100 sends the multimedia information to be displayed by the laser projection device 200 to the laser projection device 200, and sends the multimedia information to be displayed by the micro-projection device 300 to the micro-projection device 300.

The laser projection device 200 includes a third controller 201 and a third display 202. The third controller 201 is used to receive the multimedia information sent by the multimedia controller 100 and drive and control the third display 202 to display the multimedia information. The third display 202 is used to display the multimedia information in response to the control of the third controller 201.

The micro-projection device 300 includes a fourth controller 301 and a fourth display 302. The fourth controller 301 is used to receive the multimedia information sent by the multimedia controller 100 and drive and control the fourth display 302 to display the multimedia information. The fourth display 302 is used to display the multimedia information in response to the control of the fourth controller 301.

The third display 202 and the fourth display 302 may be used to display different pictures. For example, the third display 202 may be used to display the picture of a TV program, and the fourth display 302 may be used to display notification messages, voice assistant information, and the like.

Optionally, the content displayed on the third display 202 and the content displayed on the fourth display 302 may be independent of each other. For example, while the third display 202 plays a TV program, the fourth display 302 may display information unrelated to the TV program, such as the time, weather, temperature, and reminder messages.

Optionally, there may also be an association between the content displayed on the third display 202 and the content displayed on the fourth display 302. For example, while the third display 202 plays the main picture of a video chat, the fourth display 302 may display information such as the avatars of the users currently in the video chat and the chat duration.

Optionally, part or all of the content displayed on the fourth display 302 may be moved to the third display 202 for display. For example, the time, weather, temperature, reminder messages, and similar information may be moved to the third display 202 for display, while the fourth display 302 displays other information.

In addition, while displaying a traditional TV program picture, the third display 202 also displays a multi-party interaction picture, and the multi-party interaction picture does not block the traditional TV program picture. The present application does not limit the display manner of the traditional TV program picture and the multi-party interaction picture. For example, the position and size of the traditional TV program picture and the multi-party interaction picture may be set according to their priorities.

Taking the case where the priority of the traditional TV program picture is higher than that of the multi-party interaction picture as an example, the area of the traditional TV program picture is larger than that of the multi-party interaction picture, and the multi-party interaction picture may be located on one side of the traditional TV program picture or float at a corner of the traditional TV program picture.

The third display 202 and the fourth display 302 involved in the present application are projection display devices. The specific display device types, sizes, and resolutions of the third display 202 and the fourth display 302 are not limited. Those skilled in the art will understand that the third display 202 and the fourth display 302 may be changed in performance and configuration as required.
As shown in FIG. 1, a camera may be connected to or arranged on the third display 202, and is used to present the picture captured by the camera on the display interface of the laser projection device, the micro-projection device, or another display device, so as to realize interactive chat between users. Specifically, the picture captured by the camera may be displayed on the laser projection device in full screen, half screen, or in any selectable area.

As an optional connection manner, the camera is connected to the rear housing of the laser projection device through a connection board and is fixedly installed in the upper middle of the rear housing of the laser projection device. As an installable manner, it may be fixedly installed at any position of the rear housing of the laser projection device, as long as its image collection area is not blocked by the rear housing; for example, the image collection area faces the same direction as the display of the laser projection device.

As another optional connection manner, the camera is connected to the rear housing of the laser projection device in a liftable manner through a connection board or another conceivable connector, and a lifting motor is installed on the connector. When the user wants to use the camera or an application wants to use the camera, the camera rises above the laser projection device; when the camera is not needed, it can be retracted behind the rear housing, so as to protect the camera from damage and protect the user's privacy.

As an embodiment, the camera used in this application may have 16 megapixels to achieve ultra-high-definition display. In actual use, a camera with a resolution higher or lower than 16 megapixels may also be used.

After the camera is installed on the laser projection device, the content displayed in different application scenarios of the laser projection device can be merged in many different ways, thereby achieving functions that a traditional laser projection device cannot achieve.

Exemplarily, the user can have a video chat with at least one other user while watching a video program. The presentation of the video program can serve as the background picture, with the video chat window displayed on top of the background picture. Figuratively, this function can be called "chatting while watching".

Optionally, in the "chatting while watching" scenario, at least one channel of cross-terminal video chat is conducted while watching a live video or a network video.

In another example, the user can have a video chat with at least one other user while using an education application for learning. For example, a student can realize remote interaction with a teacher while learning content in the education application. Figuratively, this function can be called "chatting while learning".

In another example, the user has a video chat with the players in a game while playing a card game. For example, a player can realize remote interaction with other players when entering a game application to participate in the game. Figuratively, this function can be called "playing while watching".

Optionally, the game scene is merged with the video picture, and the portrait in the video picture is cut out and displayed in the game picture, improving the user experience.

Optionally, in motion-sensing games (such as ball games, boxing, running, dancing, etc.), human body postures and movements are obtained through the camera, body detection and tracking as well as detection of key point data of the human skeleton are performed, and the results are then merged with the animation in the game, realizing games in scenarios such as sports and dancing.

In another example, the user can interact with at least one other user by video and voice in a karaoke application. Figuratively, this function can be called "singing while watching". Optionally, when at least one user enters the application in a chat scenario, multiple users can jointly complete the recording of a song.

In another example, the user can locally turn on the camera to obtain pictures and videos. Figuratively, this function can be called "looking in the mirror".

In other examples, more functions may be added or the above functions may be reduced. The present application does not specifically limit the functions of the laser projection device.

It should be noted that FIG. 1 only takes the case where the camera is arranged on the housing of the third display as an example. In specific implementation, the installation position of the camera can be determined according to actual needs, for example, on the housing of the third controller, on the housing of the third display, on the housing of the fourth controller, on the housing of the fourth display, or arranged independently, which is not limited in the present application.
In a specific implementation, as shown in FIG. 2, the interaction flow among the multimedia controller 100, the laser projection device 200, and the micro-projection device 300 includes the following steps:

S201. The multimedia controller obtains multimedia information.

In a possible implementation, the multimedia information includes at least one of the following: video information, text information, system notification information, system push information, window information corresponding to the one-key image search function, streaming media information, and video call information.

S202. The multimedia controller classifies the multimedia information and determines first information and second information.

The first information is the information displayed by the laser projection device. The second information is the information played by the micro-projection device.

In an example, the first information is video information, for example, live TV programs, network video-on-demand programs, network live video programs, and the like.

The second information is the information in the multimedia information other than video information, for example, weather, time, text news, system notification information, system push information, window information corresponding to the one-key image search function, streaming media information, video call information, and the like.

S203. The multimedia controller sends the first information to the third controller. Correspondingly, the third controller receives the first information from the multimedia controller.

S204. The third controller controls and drives the third display to display the first information.

S205. The multimedia controller sends the second information to the fourth controller. Correspondingly, the fourth controller receives the second information from the multimedia controller.

S206. The fourth controller controls and drives the fourth display to display the second information.
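The classification-and-dispatch flow of S201-S206 can be sketched as follows. Only the routing rule (video to the laser projection device, everything else to the micro-projection device) comes from the text above; the category names, data shapes, and function are illustrative assumptions.

```python
# Sketch of S202: split incoming multimedia information into the first
# information (video, shown by the laser projection device) and the second
# information (everything else, shown by the micro-projection device).

VIDEO_TYPES = {"live_tv", "network_vod", "network_live"}

def classify(items: list[tuple[str, str]]) -> tuple[list[str], list[str]]:
    """items: (type, payload) pairs. Returns (first_info, second_info)."""
    first_info = [p for t, p in items if t in VIDEO_TYPES]       # → third display
    second_info = [p for t, p in items if t not in VIDEO_TYPES]  # → fourth display
    return first_info, second_info

items = [("live_tv", "AATV program"), ("weather", "sunny"), ("notice", "update")]
first, second = classify(items)   # first → laser projection device, second → micro-projection device
```

S203-S206 then amount to delivering `first` to the third controller and `second` to the fourth controller.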
With reference to FIG. 1, as shown in FIG. 3, the multimedia controller 100 and the third controller 201 may be arranged in the laser projection device host. The laser projection device host is usually placed directly in front of the third display 202.

The fourth display 302 may be arranged on one side of the third display 202. For example, the fourth display 302 is arranged above, below, to the left of, or to the right of the third display 202.

It should be noted that the display system described in this application may include multiple micro-projection devices. For example, the display system includes two micro-projection devices, which are located on the left and right sides of the laser projection device, respectively.

This application only takes one micro-projection device as an example. When the display system includes multiple micro-projection devices, the specific implementation is similar to that of a display system including one micro-projection device, which is not repeated in this application.

With reference to FIG. 1, as shown in FIG. 4, the micro-projection device host 301 may be arranged on the back of the display screen 202 of the laser projection device.
如图5所示,本申请实施例中的显示***10与控制装置400以及服务器500交互的应用场景的示意图。
The control apparatus 400 may be a remote controller 400A, which can communicate with the multimedia controller 100 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-range communication, and is used to control the multimedia controller 100 wirelessly or by other wired means. The user can control the multimedia controller 100 by inputting user instructions through keys on the remote controller 400A, voice input, control-panel input, and so on. For example, the user can input corresponding control instructions through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input key, menu key, power key, etc. on the remote controller 400A to control the functions of the multimedia controller 100.
The control apparatus 400 may also be a smart device, such as a mobile terminal 400B, a tablet computer, a computer, or a notebook computer, which can communicate with the multimedia controller 100 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and control the multimedia controller 100 through an application program corresponding to the multimedia controller 100. For example, the multimedia controller 100 is controlled using an application running on the smart device. The application can provide the user with various controls through an intuitive user interface (UI) on a screen associated with the smart device.
For example, both the mobile terminal 400B and the multimedia controller 100 may be installed with software applications, so that connection and communication between the two can be realized through a network communication protocol, achieving one-to-one control operation and data communication. For example, a control instruction protocol can be established between the mobile terminal 400B and the multimedia controller 100, the remote-control keyboard can be synchronized onto the mobile terminal 400B, and the multimedia controller 100 can be controlled by operating the user interface on the mobile terminal 400B; the audio and video content displayed on the mobile terminal 400B can also be transmitted to the multimedia controller 100 to realize a synchronous display function.
The server 500 may be a video server, an electronic program guide (EPG) server, a cloud server, or the like.
The multimedia controller 100 can perform data communication with the server 500 through multiple communication methods. In the embodiments of this application, the multimedia controller 100 may be allowed to establish a wired or wireless communication connection with the server 500 through a local area network, a wireless local area network, or other networks. The server 500 can provide various content and interactions to the multimedia controller 100.
For example, the multimedia controller 100 sends and receives information, interacts with the EPG, receives software program updates, or accesses a remotely stored digital media library. The server 500 may be one group or multiple groups of servers, and may be one or more types of servers. Other network service content such as video-on-demand and advertising services is provided through the server 500.
FIG. 6 exemplarily shows a configuration block diagram of the control apparatus 400 according to an exemplary embodiment. As shown in FIG. 6, the control apparatus 400 includes a controller 410, a communicator 430, a user input/output interface 440, a memory 490, and a power supply 480.
The control apparatus 400 is configured to control the multimedia controller 100, to receive the user's input operation instructions, and to convert the operation instructions into instructions that the multimedia controller 100 can recognize and respond to, acting as an intermediary between the user and the multimedia controller 100. For example, when the user operates the channel up/down keys on the control apparatus 400, the multimedia controller 100 responds to the channel up/down operation.
In some embodiments, the control apparatus 400 may be a smart device. For example, the control apparatus 400 can install various applications for controlling the multimedia controller 100 according to user needs.
In some embodiments, the mobile terminal 400B or another smart electronic device can perform a function similar to that of the control apparatus 400 after installing an application for operating the multimedia controller 100. For example, by installing the application, the user can use the various function keys or virtual buttons of the graphical user interface provided on the mobile terminal 400B or other smart electronic device to realize the functions of the physical keys of the control apparatus 400.
The controller 410 includes a processor 412, a RAM 413, a ROM 414, a communication interface, and a communication bus. The controller 410 is used to control the running and operation of the control apparatus 400, the communication and cooperation among its internal components, and external and internal data processing.
Under the control of the controller 410, the communicator 430 realizes the communication of control signals and data signals with the multimedia controller 100, for example by sending a received user input signal to the multimedia controller 100. The communicator 430 may include at least one of communication modules such as a WiFi module 431, a Bluetooth module 432, and a near-field communication (NFC) module 433.
In the user input/output interface 440, the input interface includes at least one of input interfaces such as a microphone 441, a touchpad 442, a sensor 443, keys 444, and a camera 445. For example, the user can input user instructions through actions such as voice, touch, gesture, and pressing; the input interface converts the received analog signal into a digital signal, converts the digital signal into a corresponding instruction signal, and sends it to the multimedia controller 100.
The output interface includes an interface that sends the received user instruction to the multimedia controller 100. In some embodiments, it may be an infrared interface or a radio-frequency interface. For example, in the case of an infrared signal interface, the user input instruction is converted into an infrared control signal according to the infrared control protocol and sent to the multimedia controller 100 via an infrared sending module. For another example, in the case of a radio-frequency signal interface, the user input instruction is converted into a digital signal, modulated according to the radio-frequency control signal modulation protocol, and then sent to the multimedia controller 100 by a radio-frequency sending terminal.
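The two output paths described above can be sketched as follows. This is purely an illustrative sketch: the framing bytes and the XOR "modulation" stand in for the real infrared protocol and radio-frequency modulation, which the document does not specify:

```python
# Hypothetical sketch of the output interface: a user command is either
# framed as an infrared control signal or digitized and "modulated" for RF.
# The b"\x00\xff" header and the XOR step are placeholders, not real protocols.
def encode_command(cmd: str, interface: str) -> bytes:
    payload = cmd.encode("utf-8")  # command as a digital signal
    if interface == "ir":
        # IR path: prepend an illustrative protocol header before sending.
        return b"\x00\xff" + payload
    if interface == "rf":
        # RF path: stand-in for modulation (a reversible byte transform).
        return bytes(b ^ 0x5A for b in payload)
    raise ValueError("unknown output interface: " + interface)
```

A receiver on the RF path would apply the inverse transform to recover the digital command before acting on it.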
In some embodiments, the control apparatus 400 includes at least one of the communicator 430 and the output interface. With the communicator 430, such as WiFi, Bluetooth, or NFC modules, configured in the control apparatus 400, the user input instruction can be encoded through the WiFi protocol, the Bluetooth protocol, or the NFC protocol and sent to the multimedia controller 100.
The memory 490 is used to store, under the control of the controller 410, the various running programs, data, and applications that drive and control the control apparatus 400. The memory 490 can store various control signal instructions input by the user.
The power supply 480 is used to provide operating power for the electrical components of the control apparatus 400 under the control of the controller 410. The power supply 480 may be implemented with a battery and related control circuitry.
The system architecture of the display system involved in this application is further described below with reference to FIG. 7. It should be noted that FIG. 7 is only an exemplary illustration of the system architecture of the display system of this application and does not limit this application. In practical applications, the display system may contain more or less hardware or interfaces as needed.
FIG. 7 is a system architecture diagram of the display system provided by an embodiment of this application, including:
a first multimedia unit 701, a first projection display control unit 702, a first control unit 703, an eye-protection board control unit 704, a first imaging projection display unit 705, a second multimedia unit 706, a second control unit 707, a second projection display control unit 708, a second imaging projection display unit 709, a power supply module 710, a light source drive unit 711, and an audio processing unit 712.
Corresponding to FIG. 1, the multimedia control unit 100 includes the first multimedia unit 701.
The laser projection device 200 includes: the first projection display control unit 702, the first control unit 703, the eye-protection board control unit 704, and the first imaging projection display unit 705.
The micro-projection device 300 includes: the second multimedia unit 706, the second control unit 707, the second projection display control unit 708, and the second imaging projection display unit 709.
Unit modules such as the power supply module 710, the light source drive unit 711, and the audio processing unit 712 may be integrated in the host of the laser projection device 200 or in the host of the micro-projection device 300.
The functions of the above unit modules are described in detail below:
The first multimedia unit 701 is used to receive external input signals and send corresponding input signals to the first projection display control unit 702, the first control unit 703, and/or the second multimedia unit 706. The first multimedia unit 701 may specifically include at least one of the following: a WiFi module, a wireless input module, an Ethernet input module, a USB input module, a high-definition multimedia interface (HDMI) input module, a touch-key input module, a light-sensing module, and a far-field voice module. The first multimedia unit 701 can receive external input signals through at least one of the above modules.
In addition, the first multimedia unit 701 is also used to carry out Inter-Integrated Circuit (I2C) communication with the first control unit 703; to carry out I2C communication, serial communication, USB communication, or wireless communication with the second multimedia unit 706; and to provide a VB1 video signal for the first projection display control unit 702.
The first projection display control unit 702 is used to send a low-voltage differential signaling (LVDS) or high-speed serial interface (HSSI) video signal and control signals to the first imaging projection display unit 705, so as to drive the first imaging projection display unit 705 to display video pictures and to control the working state of the first imaging projection display unit 705. The first projection display control unit 702 is also used to provide a pulse-width modulation (PWM) signal and a Duty signal for the light source drive unit 711, and to carry out I2C communication with the first control unit 703.
The first control unit 703 is used to control the working state of the heat-dissipation components of the laser projection device; monitor the ambient temperature and the laser temperature; control the rotation speed of the despeckle wheel; and control working states of the light source drive unit 711 such as power-on and power-off. The first control unit 703 is also used to carry out I2C communication or serial communication with the eye-protection board.
The eye-protection board control unit 704 is used to control the working mode of the eye-protection board of the laser projection device.
The first imaging projection display unit 705 includes the lens part of the laser projection device, or the optical system composed of the light valve, illumination, and projection lens of the laser projection device. The first imaging projection display unit 705 is used to receive the LVDS or HSSI video signal and the like from the first projection display control unit 702 and project the video signal onto the third display.
The second multimedia unit 706 is used to receive multimedia signals from the first multimedia unit 701 and send corresponding multimedia signals to the second control unit 707 and the second projection display control unit 708. The second multimedia unit 706 is also used to carry out I2C communication with the second control unit and to send an LVDS video signal to the second projection display control unit 708.
The second control unit 707 is used to control the working state of the heat-dissipation components of the micro-projection device; monitor the ambient temperature and the laser temperature; and carry out I2C communication with the second projection display control unit 708.
The second projection display control unit 708 is used to send an LVDS video signal and control signals to the second imaging projection display unit 709.
The second imaging projection display unit 709 includes the lens part of the micro-projection device, or the optical system composed of the light valve, illumination, and projection lens of the micro-projection device. The second imaging projection display unit 709 is used to receive the LVDS video signal from the second projection display control unit 708 and project the video signal onto the fourth display.
The power supply module 710 is used to supply power to the display system. For example, the power supply module 710 can provide a 12 V supply for the first control unit, a 12 V supply for the first projection display control unit, and an 18 V supply for the audio processing unit 712 and the first multimedia unit.
The light source drive unit 711 is used to provide the energy source for the lasers.
The lasers include a blue laser 716, a green laser 717, and a red laser 718, which provide laser light of the three primary colors. These three lasers provide the laser light source for the laser projection device.
FIG. 8 is a schematic diagram of the hardware structure of the laser projection device provided by an embodiment of this application. As shown in FIG. 8, the laser projection device includes: a television (TV) board 810, a display board 820, a light source 830, and a light source drive circuit 840.
The components involved in FIG. 8 are described in detail below:
The TV board 810 is mainly used to receive external audio and video signals and decode them. The TV board 810 is provided with a system-on-chip (SoC), which can decode data of different formats into a normalized format and transmit the normalized data to the display board 820, for example through a connector.
The display board 820 may be provided with a field-programmable gate array (FPGA) 821, a control processing module 822, and a light modulation device 823.
The FPGA 821 is used to process the input video image signal, for example performing motion estimation and motion compensation (MEMC) frame-rate multiplication or image correction, so as to implement image enhancement functions.
The control processing module 822 is connected to the algorithm processing module FPGA and is used to receive the processed video image signal data as the image data to be projected. The control processing module 822 outputs a current PWM brightness-adjustment signal and an enable control signal according to the image data to be projected, and realizes timing and lighting control of the light source 830 through the light source drive circuit 840.
The light modulation device 823 can receive the video image signal output by the TV board 810 and analyze it to obtain the partition brightness signal and image components of the video image. Alternatively, the light modulation device 823 can receive the to-be-projected image signal output by the FPGA 821; the to-be-projected image signal may include an image brightness signal and image components.
The light source 830 includes a red light source, a blue light source, and a green light source; the three colors of light sources can emit light simultaneously or sequentially. The light source 830 is driven and lit according to the image display timing indicated by the control instructions of the control processing module 822.
As shown in FIG. 9, the application layer of the laser projection device contains various applications executable on the laser projection device 200.
The application layer 1912 of the laser projection device 200 may contain, but is not limited to, one or more applications, such as: a live TV application, a video-on-demand application, a media center application, an application center, and game applications.
The live TV application can provide live TV from different signal sources. For example, the live TV application can provide a TV signal using input from cable TV, wireless broadcasting, satellite services, or other types of live TV services, and can display the video of the live TV signal on the laser projection device 200.
The video-on-demand application can provide videos from different storage sources. Unlike the live TV application, video-on-demand provides video display from certain storage sources. For example, the video-on-demand may come from a cloud-storage server side or from local hard disk storage containing stored video programs.
The media center application can provide various applications for playing multimedia content. For example, the media center, unlike live TV or video-on-demand, is a service through which the user can access various images or audio via the media center application.
The application center can provide and store various applications. An application may be a game, an application program, or some other application that is related to a computer system or other devices but can run in the display device. The application center can obtain these applications from different sources, store them in local storage, and then run them on the laser projection device 200.
In the embodiments of this application, the user can input an image search instruction to instruct the laser projection device to perform an image search. The image search instruction can be input by the user pressing an image search key on the remote controller of the multimedia controller, or by the user uttering a specific voice such as "please search the image." In the embodiments of this application, image search means that, according to the image search instruction input by the user, the multimedia controller captures the image currently being displayed by the third display and sends the image to the cloud server for recognition; the cloud server recognizes the persons, TV channel logos, items, and so on in the image; the multimedia controller displays the information recognized by the cloud server on the fourth display; and further, after the user selects a recognition result, the associated information of that recognition result is displayed on the fourth display. The following embodiments describe in detail the interfaces of the third display and the fourth display after the multimedia controller receives the image search instruction, as well as the interaction and processing among the multimedia controller, the third controller, the third display, the fourth controller, the fourth display, and the cloud server.
FIGS. 10-14 exemplarily show schematic diagrams of the user interfaces of the multimedia controller, the laser projection device, and the micro-projection device interacting with the user according to exemplary embodiments.
As shown in FIG. 10, the third display is currently playing a program of the TV channel "AATV." At this time, if the user inputs an image search instruction, for example by pressing the image search key on the remote controller of the multimedia controller or by uttering the voice "please search the image," the multimedia controller performs display as illustrated in FIG. 11. As shown in FIG. 11, the fourth display shows the recognition results of the captured image, including the faces of two persons, one TV channel, and three items. Optionally, if many items are recognized and they cannot all be shown on the fourth display at once, a subset of the items can be displayed first, together with a movement icon on the fourth display, such as the leftward movement icon shown in FIG. 11; after the user selects and confirms this icon, the fourth display continues to show the remaining recognition results in turn. After the user selects and confirms a recognition result, the fourth display can show the associated information of the selected recognition result, as illustrated in FIGS. 12, 13, and 14. As illustrated in FIG. 12, when the user selects a face icon on the fourth display in FIG. 11, the associated information of the person corresponding to that face is displayed on the fourth display, for example the person's name, occupation, date of birth, birthplace, and major works. As illustrated in FIG. 13, when the user selects the icon of a TV channel on the fourth display in FIG. 11, the fourth display shows the name of the TV channel, the name of the program currently being broadcast, the program guide for one or more future time slots, and so on. As illustrated in FIG. 14, when the user selects the icon of a bag on the fourth display in FIG. 11, the fourth display shows icons of products of the same style as the bag and, at the same time, shows the purchase link of the currently selected same-style product (i.e., the QR code in FIG. 14); the user can buy the product by scanning the code. Besides the way of displaying same-style product information illustrated in FIG. 14, in another way, at the stage illustrated in FIG. 11, i.e., the stage of displaying the recognition results, the user can press a specific remote-controller button or utter a specific voice to trigger the multimedia controller to display on the fourth display the same-style products of all items in the recognition results, together with the purchase link of each same-style product.
On the fourth display illustrated in FIGS. 12-14 above, when the associated information cannot be displayed in full, part of the associated information can be displayed first; after the user selects and confirms the movement icon on the fourth display, the remaining associated information continues to be displayed in turn.
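The paged display just described can be sketched as a small generator. This is an illustrative sketch only; `paginate` and the page size are hypothetical, and in the device the `has_more` flag would correspond to showing the movement icon:

```python
# Hypothetical sketch of paging associated information on the fourth display:
# show one page at a time, with a "move" icon whenever more pages remain.
def paginate(items, page_size):
    """Yield (page, has_more) pairs; has_more drives the movement icon."""
    for start in range(0, len(items), page_size):
        yield items[start:start + page_size], start + page_size < len(items)

fields = ["name", "occupation", "date of birth", "birthplace", "major works"]
for page, has_more in paginate(fields, 2):
    print(page, "-> show movement icon" if has_more else "-> last page")
```

After the user confirms the movement icon, the next `(page, has_more)` pair would be rendered, until `has_more` is false.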
FIG. 15 is a flowchart of the display method provided by an embodiment of this application. The display method can be applied to the display system shown in FIG. 1. As shown in FIG. 15, the display method includes:
S1501: The multimedia controller receives an image search instruction.
In one possible implementation, the image search instruction is sent to the multimedia controller by the control device.
Specifically, while watching a video program played by the laser projection device, when the user is interested in a person or product appearing in the video, the user can send an image search instruction to the multimedia controller by using the image search function on the control device.
In one example, the user presses the image search key on the remote controller of the multimedia controller; after detecting the pressing operation, the remote controller generates an image search instruction and sends it to the multimedia controller.
In another example, the user voice-inputs the content "image search" through the voice control function of the remote controller; after detecting the voice input, the remote controller generates an image search instruction and sends it to the multimedia controller.
S1502: In response to the image search instruction, the multimedia controller determines the image currently displayed by the laser projection device and obtains the recognition result of the image.
The image currently displayed by the laser projection device is a screenshot of the video interface currently being played by the laser projection device. The recognition result of the image includes the persons, TV channels, items, buildings, animals, plants, etc. in the image.
S1503: The multimedia controller sends a first instruction to the micro-projection device. Correspondingly, the micro-projection device receives the first instruction from the multimedia controller.
The first instruction is used to instruct the micro-projection device to display the recognition result.
S1504: In response to the first instruction, the micro-projection device displays the recognition result.
In one possible implementation, the micro-projection device can display each recognition result in the form of an icon. The micro-projection device can sort the various types of recognition results and display them in a preset order.
In another possible implementation, the micro-projection device directly displays the image, with the image divided into regions according to the recognition results. The user selects a recognition result by selecting the region corresponding to it.
Based on the above technical solution, when the user is watching a video program on the laser projection device and uses the image search function, the multimedia controller can display the recognition result of the image currently displayed by the laser projection device on the micro-projection device. In this way, the recognition result does not cover the picture being displayed by the laser projection device, so the user can view the image search results while watching the picture of the laser projection device without obstruction, which greatly improves the user experience.
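The steps S1501-S1504 above can be sketched end to end. All class and method names below are illustrative stand-ins for the devices described, not an actual firmware API, and the recognition service is faked:

```python
# Hypothetical sketch of S1501-S1504: on an image search instruction, the
# multimedia controller captures the projector's current frame, obtains
# recognition results, and instructs the micro-projection device to show them,
# leaving the laser projector's picture untouched.
class LaserProjector:
    def __init__(self, frame):
        self._frame = frame
    def current_frame(self):
        return self._frame  # screenshot of the video being played (S1502)

class MicroProjector:
    def __init__(self):
        self.shown = None
    def display(self, results):  # acts on the "first instruction" (S1504)
        self.shown = results

class MultimediaController:
    def __init__(self, projector, micro, recognize):
        self.projector, self.micro, self.recognize = projector, micro, recognize
    def on_image_search(self):                    # S1501: instruction received
        frame = self.projector.current_frame()    # S1502: determine the image
        results = self.recognize(frame)           # S1502: recognition result
        self.micro.display(results)               # S1503: first instruction

fake_cloud = lambda frame: ["face:actor", "channel:AATV", "item:bag"]
micro = MicroProjector()
MultimediaController(LaserProjector("frame-0"), micro, fake_cloud).on_image_search()
print(micro.shown)  # → ['face:actor', 'channel:AATV', 'item:bag']
```

Note that nothing in the flow writes back to the laser projector, which mirrors the key point of the solution: the search results appear only on the second display.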
In one possible implementation, with reference to FIG. 15 and as shown in FIG. 16, the above S1502 can be specifically implemented through the following S1502a-S1502d.
S1502a: In response to the image search instruction, the multimedia controller determines the image currently displayed by the laser projection device.
In one possible implementation, since the image displayed by the laser projection device is sent to the laser projection device by the multimedia controller, after receiving the image search instruction, the multimedia controller determines the image it is sending to the laser projection device at the current moment and takes this image as the image currently displayed by the laser projection device.
In another possible implementation, the laser projection device has a screenshot function. After receiving the image search instruction, the multimedia controller sends the image search instruction to the laser projection device; after receiving the instruction, the laser projection device takes a screenshot of the image it is currently displaying and sends the image to the multimedia controller.
S1502b: The multimedia controller sends the image currently displayed by the laser projection device to the cloud server. Correspondingly, the cloud server receives the image from the multimedia controller.
S1502c: The cloud server searches the image and determines the recognition result of the image.
Specifically, after receiving the image, the cloud server performs face recognition and item recognition on the image to determine the recognition result of the image.
S1502d: The cloud server sends the recognition result of the image to the multimedia controller. Correspondingly, the multimedia controller receives the image recognition result from the cloud server.
In another possible implementation, with reference to FIG. 15 and as shown in FIG. 16, after S1504 the method provided by the embodiments of this application further includes the following S1505-S1512, which are described below:
S1505: The micro-projection device determines a first operation.
The first operation is used to select a target object in the recognition result.
The first operation can be determined directly by the micro-projection device, or determined by the multimedia control device and forwarded to the micro-projection device.
The first operation can be input by the user pressing a key on the control device, or by the user's voice input.
It should be noted that the target object may be one result in the recognition result, or multiple results in the recognition result.
S1506: In response to the first operation, the micro-projection device sends a second instruction to the multimedia controller. Correspondingly, the multimedia controller receives the second instruction from the micro-projection device.
The second instruction is used to indicate the target object.
S1507: In response to the second instruction, the multimedia controller sends a fourth instruction to the cloud server. Correspondingly, the cloud server receives the fourth instruction from the multimedia controller.
The fourth instruction is used to instruct the cloud server to search for the associated information of the target object.
Depending on the category of the target object in the recognition result, the associated information can have different meanings.
In a first example, if the target object in the recognition result is a person, the associated information may be the person's name, occupation, date of birth, birthplace, representative works, and so on.
In a second example, if the target object in the recognition result is a TV channel, the associated information may be the name of the TV channel, the program being broadcast, the program guide, and so on.
In a third example, if the target object in the recognition result is an item, the associated information may be products of the same style as the item, purchase links, and so on.
In a fourth example, if the target object in the recognition result is a building, the associated information may be the location, features, and so on of the building.
In a fifth example, if the target object in the recognition result is a plant, the associated information may be the name, variety, brief introduction, and so on of the plant.
In a sixth example, if the target object in the recognition result is an animal, the associated information may be the name, habitat, habits, and so on of the animal.
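The six category-to-information mappings above amount to a dispatch table. The sketch below is illustrative only: the field names and the `associated_info` helper are assumptions standing in for whatever schema the cloud server actually uses:

```python
# Hypothetical sketch of the cloud server's S1508 lookup: the category of the
# target object decides which associated-information fields are returned.
ASSOCIATED_FIELDS = {
    "person":   ["name", "occupation", "birth_date", "birthplace", "works"],
    "channel":  ["name", "now_playing", "upcoming"],
    "item":     ["same_style_products", "purchase_link"],
    "building": ["location", "features"],
    "plant":    ["name", "variety", "summary"],
    "animal":   ["name", "habitat", "habits"],
}

def associated_info(target):
    """Return only the fields relevant to the target object's category."""
    fields = ASSOCIATED_FIELDS[target["category"]]
    return {f: target.get(f) for f in fields}

print(associated_info({"category": "channel", "name": "AATV", "now_playing": "news"}))
# → {'name': 'AATV', 'now_playing': 'news', 'upcoming': None}
```

Missing fields come back as `None`, which in the flow above would simply be omitted from the fifth instruction.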
S1508: The cloud server determines the associated information of the target object and generates a fifth instruction according to the associated information of the target object.
The fifth instruction is used to indicate the associated information of the target object.
In one possible implementation, the fourth instruction sent by the multimedia controller to the cloud server includes the identifier (for example, the icon) of each target object. The cloud server determines the target objects according to their identifiers and searches for the associated information of the target objects.
S1509: The cloud server sends the fifth instruction to the multimedia controller. Correspondingly, the multimedia controller receives the fifth instruction from the cloud server.
S1510: The multimedia controller generates a third instruction according to the fifth instruction.
S1511: The multimedia controller sends the third instruction to the micro-projection device. Correspondingly, the micro-projection device receives the third instruction from the multimedia controller.
S1512: In response to the third instruction, the micro-projection device displays the associated information of the target object.
Specifically, when the target object is a person, first associated information of the person is obtained and displayed on the second display; the first associated information includes introduction information of the person.
When the target object is a channel, second associated information of the channel is obtained and displayed on the second display; the second associated information includes information on the program currently being broadcast by the channel and the program guide.
When the target object is an item, third associated information of the item is obtained and displayed on the second display; the third associated information includes same-style product information and purchase link information of the item.
After the micro-projection device displays the associated information of the target object, the user can further select the associated information of the target object to view.
For example, the user can select one item at a time or multiple items at a time. If one item is selected, the second controller displays on the second display the same-style product information and purchase link information of that item. If multiple items are selected, the second controller displays on the second display the same-style product information and purchase link information of those items.
In one example, the user can view the same-style product information and purchase link information of one item by selecting the item's icon and confirming, as illustrated in FIG. 14, and can view the same-style product information and purchase link information of multiple items by pressing a specific key (for example, the left key) on the remote controller of the multimedia controller.
Based on the above technical solution, after the micro-projection device displays the target object, the micro-projection device can further display the associated information of each target object according to the received instruction. This allows the user to obtain more information about the target object while normally watching the picture of the laser projection device, further improving the user experience.
Optionally, in the above interaction flow, information transmission among the multimedia controller, the laser projection device, and the micro-projection device can be carried out via serial ports.
Optionally, an embodiment of this application further provides a storage medium storing instructions which, when run on a computer, cause the computer to execute the method in the above method embodiments.
Optionally, an embodiment of this application further provides a chip for running instructions, the chip being used to execute the method in the above method embodiments.
An embodiment of this application further provides a program product. The program product includes a computer program stored in a storage medium; at least one processor can read the computer program from the storage medium, and the at least one processor, when executing the computer program, can implement the method in the above method embodiments.
In the embodiments of this application, "at least one" means one or more, and "multiple" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships can exist; for example, A and/or B can mean: A alone, both A and B, or B alone, where A and B can be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects; in a formula, the character "/" indicates a "division" relationship between the associated objects. "At least one of the following" or similar expressions refer to any combination of these items, including any combination of single or plural items. For example, at least one of a, b, or c can mean: a, b, c, a-b, a-c, b-c, or a-b-c, where each of a, b, and c can be single or multiple.
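The "at least one of a, b, or c" enumeration above is exactly the set of non-empty combinations of the items, which a short snippet can verify (the helper name is illustrative only):

```python
# Enumerate every non-empty combination of items, matching the seven cases
# a, b, c, a-b, a-c, b-c, a-b-c listed for "at least one of a, b, or c".
from itertools import chain, combinations

def at_least_one_of(items):
    return list(chain.from_iterable(
        combinations(items, r) for r in range(1, len(items) + 1)))

combos = at_least_one_of(["a", "b", "c"])
print(len(combos))  # → 7
```

For n items this yields 2^n - 1 combinations, since only the empty selection is excluded.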
It can be understood that the various numerical designations involved in the embodiments of this application are only distinctions made for convenience of description and are not used to limit the scope of the embodiments of this application.
It can be understood that, in the embodiments of this application, the magnitude of the sequence numbers of the above processes does not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of this application.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments or make equivalent replacements of some or all of the technical features therein, and that these modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of this application.

Claims (19)

  1. A display system, comprising:
    a first display, a second display, a first controller, and a second controller;
    wherein the first controller is configured to: receive an image search instruction input by a user; in response to the image search instruction, capture an image currently displayed by the first display, obtain a recognition result of the image, and send a first instruction to the second controller, the first instruction being used to instruct the second controller to display the recognition result; and
    the second controller is configured to: in response to the first instruction, display the recognition result on the second display.
  2. The display system according to claim 1, wherein the second controller is configured to:
    receive a viewing instruction input by the user, the viewing instruction being used to instruct viewing of associated information of a target object in the recognition result; and
    in response to the viewing instruction, obtain the associated information of the target object and display the associated information of the target object on the second display.
  3. The display system according to claim 2, wherein the second controller is configured to:
    when the target object is a person, obtain first associated information of the person and display the first associated information on the second display, the first associated information comprising introduction information of the person.
  4. The display system according to claim 2, wherein the second controller is configured to:
    when the target object is a channel, obtain second associated information of the channel and display the second associated information on the second display, the second associated information comprising information on the program currently being broadcast by the channel and a program guide.
  5. The display system according to claim 2, wherein the second controller is configured to:
    when the target object is an item, obtain third associated information of the item and display the third associated information on the second display, the third associated information comprising same-style product information and purchase link information of the item.
  6. A display method, comprising:
    receiving a first instruction, the first instruction being sent by a first controller after the first controller receives an image search instruction input by a user, captures an image currently displayed by a first display, and obtains a recognition result of the image, the first instruction being used to instruct display of the recognition result; and
    in response to the first instruction, displaying the recognition result on a second display.
  7. The method according to claim 6, further comprising:
    receiving a viewing instruction input by the user, the viewing instruction being used to instruct viewing of associated information of a target object in the recognition result; and
    in response to the viewing instruction, obtaining the associated information of the target object and displaying the associated information of the target object on the second display.
  8. The method according to claim 7, wherein the obtaining the associated information of the target object and displaying the associated information of the target object on the second display comprises:
    when the target object is a person, obtaining first associated information of the person and displaying the first associated information on the second display, the first associated information comprising introduction information of the person.
  9. The method according to claim 7, wherein the obtaining the associated information of the target object and displaying the associated information of the target object on the second display comprises:
    when the target object is a channel, obtaining second associated information of the channel and displaying the second associated information on the second display, the second associated information comprising information on the program currently being broadcast by the channel and a program guide.
  10. The method according to claim 7, wherein the obtaining the associated information of the target object and displaying the associated information of the target object on the second display comprises:
    when the target object is an item, obtaining third associated information of the item and displaying the third associated information on the second display, the third associated information comprising same-style product information and purchase link information of the item.
  11. A display system, comprising: a multimedia controller, a laser projection device, and a micro-projection device;
    wherein the multimedia controller is configured to: receive an image search instruction; in response to the image search instruction, determine an image currently displayed by the laser projection device, obtain a recognition result of the image, and send a first instruction to the micro-projection device, the first instruction being used to instruct the micro-projection device to display the recognition result; and
    the micro-projection device is configured to: receive the first instruction from the multimedia controller and, in response to the first instruction, display the recognition result.
  12. The system according to claim 11, wherein the micro-projection device is configured to:
    determine a first operation, the first operation being used to select a target object in the recognition result;
    in response to the first operation, send a second instruction to the multimedia controller, the second instruction being used to indicate the target object;
    receive a third instruction from the multimedia controller, the third instruction being used to indicate associated information of the target object; and
    in response to the third instruction, display the associated information of the target object.
  13. The system according to claim 12, wherein the multimedia controller is configured to:
    receive the second instruction from the micro-projection device;
    in response to the second instruction, send a fourth instruction to a cloud server, the fourth instruction being used to instruct the cloud server to search for the associated information of the target object;
    receive a fifth instruction from the cloud server, the fifth instruction being used to indicate the associated information of the target object; and
    determine the third instruction according to the fifth instruction and send the third instruction to the micro-projection device.
  14. The system according to claim 12 or 13, wherein, when the target object is a person, the associated information of the target object comprises introduction information of the person;
    when the target object is a channel, the associated information of the target object comprises at least one of the following: information on the program currently being broadcast by the channel, and a program guide of the channel; and
    when the target object is an item, the associated information of the target object comprises at least one of the following: same-style product information of the item, and purchase link information of same-style products of the item.
  15. A display method, applied to the system according to any one of claims 11-14, the method comprising:
    receiving an image search instruction;
    in response to the image search instruction, determining an image currently displayed by the laser projection device and obtaining a recognition result of the image; and
    sending a first instruction to the micro-projection device, the first instruction being used to instruct the micro-projection device to display the recognition result.
  16. The method according to claim 15, further comprising:
    receiving a second instruction from the micro-projection device;
    in response to the second instruction, sending a fourth instruction to a cloud server, the fourth instruction being used to instruct the cloud server to search for associated information of a target object, the target object being the object selected in the recognition result of the image;
    receiving a fifth instruction from the cloud server, the fifth instruction being used to indicate the associated information of the target object; and
    determining a third instruction according to the fifth instruction and sending the third instruction to the micro-projection device.
  17. A display method, applied to the system according to any one of claims 11-14, the method comprising:
    receiving a first instruction from the multimedia controller, the first instruction being used to instruct the micro-projection device to display a recognition result, the recognition result being the recognition result, determined by the multimedia controller according to an image search instruction, of an image currently displayed by the laser projection device; and
    in response to the first instruction, displaying the recognition result.
  18. The display method according to claim 17, further comprising:
    determining a first operation, the first operation being used to select a target object in the recognition result;
    in response to the first operation, sending a second instruction to the multimedia controller, the second instruction being used to indicate the target object;
    receiving a third instruction from the multimedia controller, the third instruction being used to indicate associated information of the target object; and
    in response to the third instruction, displaying the associated information of the target object.
  19. A computing device, comprising:
    a memory for storing program instructions; and
    a processor for calling the program instructions stored in the memory and executing, according to the obtained program, the method according to any one of claims 6-10;
    or a processor for calling the program instructions stored in the memory and executing, according to the obtained program, the method according to claim 15 or 16;
    or a processor for calling the program instructions stored in the memory and executing, according to the obtained program, the method according to claim 17 or 18.
PCT/CN2020/126566 2019-11-04 2020-11-04 Display system, display method and computing device WO2021088889A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911067968.5 2019-11-04
CN201911067968 2019-11-04

Publications (1)

Publication Number Publication Date
WO2021088889A1 true WO2021088889A1 (zh) 2021-05-14

Family

ID=74344300

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126566 WO2021088889A1 (zh) 2019-11-04 2020-11-04 显示***、显示方法及计算设备

Country Status (2)

Country Link
CN (2) CN112784137A (zh)
WO (1) WO2021088889A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113902466A (zh) * 2021-09-09 2022-01-07 广景视睿科技(深圳)有限公司 Unmanned-store interaction method, unmanned store, and storage medium

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN113139492A (zh) * 2021-04-30 2021-07-20 百度在线网络技术(北京)有限公司 Item recognition method and apparatus, electronic device, and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
US20110282906A1 (en) * 2010-05-14 2011-11-17 Rovi Technologies Corporation Systems and methods for performing a search based on a media content snapshot image
US20120284740A1 (en) * 2011-05-02 2012-11-08 Samsung Electronics Co., Ltd. Method of surveying watching of image content, and broadcast receiving apparatus and server employing the same
CN104769957A (zh) * 2012-09-19 2015-07-08 谷歌公司 Identification and presentation of internet-accessible content associated with currently playing television programs
CN105763930A (zh) * 2014-12-17 2016-07-13 乐金电子(中国)研究开发中心有限公司 Method for pushing a QR code of a television program, smart television, and set-top box
CN107105340A (zh) * 2017-03-21 2017-08-29 百度在线网络技术(北京)有限公司 Method, apparatus, and system for displaying person information in video based on artificial intelligence
CN107480236A (zh) * 2017-08-08 2017-12-15 深圳创维数字技术有限公司 Information query method, apparatus, device, and medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3360101A1 (en) * 2015-10-09 2018-08-15 Google LLC Methods, systems and media for presenting media content that was advertised on a second screen device using a primary device
WO2017096509A1 (zh) * 2015-12-07 2017-06-15 华为技术有限公司 Display and processing methods and related apparatus
CN108600846A (zh) * 2018-03-15 2018-09-28 聚好看科技股份有限公司 Mobile terminal and method for facilitating search of virtual commodity information
CN109034115B (zh) * 2018-08-22 2021-10-22 Oppo广东移动通信有限公司 Video image recognition method, apparatus, terminal, and storage medium
CN109922363A (zh) * 2019-03-15 2019-06-21 青岛海信电器股份有限公司 Graphical user interface method for a screenshot of a display picture, and display device


Also Published As

Publication number Publication date
CN112784137A (zh) 2021-05-11
CN112269553A (zh) 2021-01-26
CN112269553B (zh) 2023-11-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20884480

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20884480

Country of ref document: EP

Kind code of ref document: A1