CN113010020A - Timing controller and display device - Google Patents

Timing controller and display device

Info

Publication number
CN113010020A
CN113010020A
Authority
CN
China
Prior art keywords
module
image
unit
output
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110568109.5A
Other languages
Chinese (zh)
Inventor
Not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Beijing Ivisual 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ivisual 3D Technology Co Ltd filed Critical Beijing Ivisual 3D Technology Co Ltd
Priority to CN202110568109.5A
Publication of CN113010020A
Priority to PCT/CN2022/092374 (WO2022247647A1)
Priority to TW111118502A (TWI802414B)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18 Eye characteristics, e.g. of the iris
    • G06V40/19 Sensors therefor
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Electrophonic Musical Instruments (AREA)
  • Train Traffic Observation, Control, And Security (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The present application relates to the technical field of 3D image data processing and discloses a timing controller comprising an MCU, a 3D module, an eyeball processing module, a receiving module, and an output module, wherein the MCU is connected to a first input end of the 3D module, the eyeball processing module is connected to a second input end of the 3D module, the receiving module is connected to a third input end of the 3D module, and the output module is connected to an output end of the 3D module. The MCU is configured to send control information to the 3D module; the eyeball processing module is configured to receive an image acquired by an image sensor and obtain eyeball coordinates from the image; the receiving module is configured to receive image data and transmit it to the 3D module; the 3D module is configured to generate pixel driving signals from the eyeball coordinates, the control information, and the image data; and the output module is configured to output the pixel driving signals to a display screen. The timing controller provided by the present application can improve the user experience. The present application also discloses a display device.

Description

Timing controller and display device
Technical Field
The present application relates to the field of 3D image data processing technology, for example, to a timing controller and a display device.
Background
At present, data processing in electronic devices is usually performed by a central processing unit (CPU). Because the volume of 3D and 2D data is large, CPU processing can suffer from problems such as latency and stuttering, which degrades the user experience.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description presented later.
The embodiments of the present disclosure provide a timing controller and a display device to address the above technical problem.
In some embodiments, a timing controller comprises: a microprocessor (MCU), a 3D module, an eyeball processing module, a receiving module, and an output module, wherein the MCU is connected to a first input end of the 3D module, the eyeball processing module is connected to a second input end of the 3D module, the receiving module is connected to a third input end of the 3D module, and the output module is connected to an output end of the 3D module, wherein,
an MCU configured to transmit control information to the 3D module;
the eyeball processing module is configured to receive the image acquired by the image sensor and obtain eyeball coordinates according to the image;
a receiving module configured to receive image data and transmit the image data to the 3D module;
a 3D module configured to generate pixel driving signals according to the eyeball coordinates, the control information, and the image data;
and the output module is configured to output the pixel driving signal to the display screen.
In some embodiments, the display device includes a timing controller as described above.
The timing controller and the display device provided by the embodiments of the present disclosure can achieve the following technical effect:
the user experience is improved to a certain extent.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
At least one embodiment is illustrated in the accompanying drawings; the drawings do not limit the embodiments, elements with the same reference numerals denote similar elements, and the drawings are not drawn to scale, wherein:
Fig. 1 shows a schematic structural diagram of a timing controller in an embodiment of the present disclosure;
Fig. 2 shows a schematic structural diagram of an eyeball processing module in an embodiment of the present disclosure;
Fig. 3 shows another schematic structural diagram of the timing controller in an embodiment of the present disclosure;
Fig. 4 shows another schematic structural diagram of the timing controller in an embodiment of the present disclosure;
Fig. 5 shows another schematic structural diagram of the timing controller in an embodiment of the present disclosure;
Fig. 6 shows another schematic structural diagram of the timing controller in an embodiment of the present disclosure;
Fig. 7 shows another schematic structural diagram of the timing controller in an embodiment of the present disclosure;
Fig. 8 shows a schematic structural diagram of a display device in an embodiment of the present disclosure.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, at least one embodiment may be practiced without these specific details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
Generally, a display screen (or display panel) has a timing controller (Tcon), also called a Tcon board or panel driving board, which receives input signals such as RGB data signals, clock signals, and control signals and converts them into signals capable of driving the display screen.
The disclosed embodiments provide a timing controller, which is explained below.
Fig. 1 shows a schematic structural diagram of a timing controller in an embodiment of the present disclosure.
As shown, the timing controller 700 may include: a microprocessor (MCU) 10, an eyeball processing module 20, a three-dimensional (3D) module 30, a receiving module 40, and an output module 50, wherein the MCU 10 is connected to a first input end of the 3D module 30, the eyeball processing module 20 is connected to a second input end of the 3D module 30, the receiving module 40 is connected to a third input end of the 3D module 30, and the output module 50 is connected to an output end of the 3D module 30, wherein,
an MCU 10 configured to transmit control information to the 3D module;
an eyeball processing module 20 configured to receive the image acquired by the image sensor and obtain eyeball coordinates from the image;
a receiving module 40 configured to receive image data and transmit the image data to the 3D module;
a 3D module 30 configured to generate pixel driving signals according to the eyeball coordinates, the control information, and the image data;
an output module 50 configured to output the pixel driving signal to the display screen.
The timing controller improves on existing timing controllers: hardware units for image data operations and eyeball coordinate operations are added within the timing controller, so the CPU does not need to participate in any image data operation. Because the image data operations and the eyeball coordinate operations are carried out by dedicated hardware such as the timing controller, efficiency is improved to a certain extent; using the timing controller provided by the embodiments of the present disclosure for 3D display can increase the processing rate, reduce latency, and improve the user's viewing experience.
In some embodiments, the 3D module 30 may include two input pins and one output pin: the output pin of the MCU 10 may be connected to one input pin of the 3D module 30, the output pin of the receiving module 40 may be connected to another input pin of the 3D module 30, and the input pin of the output module 50 is connected to the output pin of the 3D module 30.
In some embodiments, an input of the timing controller 700 may be connected to an application processor (AP), which provides the image data. Optionally, the image data may be a video picture captured by an image sensor, or may be a virtual video picture generated by the AP.
In some embodiments, the image data may include left eye image data and right eye image data.
In some embodiments, the eyeball coordinates may include an x coordinate in the horizontal direction and a y coordinate in the vertical direction of the screen coordinate system, and may further include a depth z from the eyeball to the screen. The eyeball coordinates may include coordinates of both the left eye and the right eye.
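As an illustration only, the eyeball coordinates described above might be carried in a structure like the following C sketch; the type and field names are hypothetical and are not specified by the present disclosure.

```c
#include <stdint.h>

/* Hypothetical carrier for the eyeball coordinates described above:
 * x/y in the screen coordinate system plus an optional depth z from
 * the eyeball to the screen, one entry per eye. Names are illustrative. */
typedef struct {
    int32_t x; /* horizontal coordinate in the screen coordinate system */
    int32_t y; /* vertical coordinate in the screen coordinate system */
    int32_t z; /* depth from the eyeball to the screen */
} eye_coord_t;

typedef struct {
    eye_coord_t left;  /* left-eye coordinates */
    eye_coord_t right; /* right-eye coordinates */
} eyeball_coords_t;
```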
In some embodiments, the output module 50 may transmit signals in a point-to-point (P2P, Peer-to-Peer) manner.
In some embodiments, the 3D module 30, the eyeball processing module 20, and the output module 50 may be connected to the MCU 10 via a bus. Optionally, the bus may be an AXI (Advanced eXtensible Interface) bus or the like.
Fig. 2 shows a schematic structural diagram of an eyeball processing module in the embodiment of the present disclosure.
As shown, in some embodiments, the eyeball processing module 20 may include an image receiving unit 201, an eyeball processing unit 202 and an image transmitting unit 203,
an image receiving unit 201 connected to the image sensor and configured to receive an image provided by the image sensor;
an eyeball processing unit 202 configured to acquire an image provided by the image sensor and calculate eyeball coordinates;
an image transmitting unit 203 configured to transmit the eyeball coordinates to the 3D module 30.
In some embodiments, the image sensor may be a sensing device such as a camera. The timing controller Tcon may be connected to the image sensor, receive an image captured by the image sensor through the image receiving unit 201, process the image through the eyeball processing unit 202, calculate a position of an eyeball (eyeball coordinate) in the image, and transmit the eyeball coordinate to the 3D module 30 through the image transmitting unit 203.
In some embodiments, the image sensor may acquire an image including a face (for example, a human face), and the eyeball processing unit 202 performs calculation processing on the image including the face to obtain the eyeball coordinates. The eyeball processing unit 202 may use an existing eyeball coordinate extraction algorithm to perform the calculation processing on the image.
In some embodiments, the timing controller may be connected to one image sensor that acquires an image including a face; alternatively, the timing controller may be connected to two or more image sensors whose shooting ranges may differ, and the eyeball processing unit 202 may perform calculation processing on the images provided by the different image sensors to obtain the eyeball coordinates.
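To make the receive-compute-transmit flow of the eyeball processing module 20 concrete, here is a minimal C sketch; the function names are hypothetical, the eyeball-coordinate extraction algorithm is left abstract (the disclosure allows any existing algorithm), and eyeball_coords_t is the illustrative type from the sketch above.

```c
#include <stdint.h>

/* Hypothetical frame handle as delivered by the image receiving unit 201. */
typedef struct {
    const uint8_t *pixels; /* raw image data from the image sensor */
    int width;
    int height;
} frame_t;

/* Placeholder for an existing eyeball-coordinate extraction algorithm;
 * returns 0 on success. The disclosure does not prescribe a specific one.
 * eyeball_coords_t is defined in the earlier coordinate sketch. */
extern int extract_eyeball_coords(const frame_t *f, eyeball_coords_t *out);

/* Placeholder for the image transmitting unit 203's path to the 3D module 30. */
extern void send_to_3d_module(const eyeball_coords_t *coords);

/* One pass of the module-20 pipeline: receive -> compute -> transmit. */
void eyeball_module_step(const frame_t *frame)
{
    eyeball_coords_t coords;
    if (extract_eyeball_coords(frame, &coords) == 0) {
        send_to_3d_module(&coords);
    }
}
```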
Fig. 3 shows another schematic structural diagram of the timing controller in the embodiment of the present disclosure.
As shown in the figure, in some embodiments, the MCU 10 includes a processing unit 100 and a first interface 101; the first interface 101 is connected to a memory 104, in which optical data is pre-stored;
a processing unit 100 further configured to send a read command to the first interface 101;
a first interface 101 configured to retrieve optical data from the memory 104 according to a read command and send it to the 3D module 30.
In some embodiments, the processing unit 100 may send a read command to the first interface 101 after each power-up.
In some embodiments, the optical data may include the correspondence between the pixels on the display screen (each pixel may include a plurality of sub-pixels, and each sub-pixel may in turn include a plurality of composite sub-pixels, etc.) and the grating. Optionally, the grating may comprise a lenticular grating or the like. The correspondence may include the number, horizontal position arrangement, and vertical position arrangement of the composite sub-pixels within each grating.
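Purely as an illustration, one way to lay out such per-grating correspondence records in C could be the following; the field set is an assumption read off the description above, not a format defined by the disclosure.

```c
#include <stdint.h>

/* Hypothetical optical-data record: for one grating element (e.g. one
 * lenticular lens), the number of composite sub-pixels it covers and
 * their position arrangement on the display screen. */
typedef struct {
    uint16_t grating_index;  /* index of the grating element */
    uint16_t subpixel_count; /* composite sub-pixels under this grating */
    uint16_t h_arrangement;  /* encoded horizontal position arrangement */
    uint16_t v_arrangement;  /* encoded vertical position arrangement */
} optical_entry_t;
```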
In some embodiments, the optical data may be transmitted from the application processor AP.
In some embodiments, the first interface 101 may be a Serial Peripheral Interface (SPI).
In some embodiments, the memory 104 may be a non-volatile Flash storage medium; optionally, when the first interface 101 is an SPI interface, the memory 104 may be an SPI Flash storage device.
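For illustration, reading the pre-stored optical data over SPI could look like the following C sketch. The 0x03 opcode is the standard SPI NOR "read data" command; spi_transfer() and the 24-bit addressing are assumptions about the first interface 101, not details given in the disclosure.

```c
#include <stdint.h>
#include <stddef.h>

#define SPI_FLASH_CMD_READ 0x03u /* standard SPI NOR "read data" opcode */

/* Hypothetical low-level SPI transaction on the first interface 101:
 * send tx_len command bytes, then clock in rx_len data bytes. */
extern void spi_transfer(const uint8_t *tx, size_t tx_len,
                         uint8_t *rx, size_t rx_len);

/* Read `len` bytes of optical data starting at `addr` in the SPI Flash;
 * the caller would then forward the buffer to the 3D module 30. */
void read_optical_data(uint32_t addr, uint8_t *buf, size_t len)
{
    const uint8_t cmd[4] = {
        SPI_FLASH_CMD_READ,
        (uint8_t)(addr >> 16), /* 24-bit address, big-endian */
        (uint8_t)(addr >> 8),
        (uint8_t)addr
    };
    spi_transfer(cmd, sizeof cmd, buf, len);
}
```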
Fig. 4 shows another schematic structural diagram of the timing controller in the embodiment of the present disclosure.
As shown, in some embodiments, MCU 10 further includes a second interface 102,
a second interface 102 configured to receive optical data;
a processing unit 100 further configured to send a write command to the first interface 101;
the first interface 101 is further configured to retrieve optical data according to the write command and write the optical data to the memory 104.
Fig. 5 shows another schematic structural diagram of the timing controller in the embodiment of the present disclosure.
As shown, in some embodiments, the timing controller 700 may further include an image transmission interface unit 60 connected to an input of the receiving module 40,
the image transmission interface unit 60 may be configured to receive the image data in the first protocol format, convert the image data into the image data in the second protocol format, and output the image data to the receiving module 40.
In some embodiments, the image transmission interface unit 60 may be a Mobile Industry Processor Interface (MIPI) unit; optionally, it may be a MIPI Display Serial Interface (DSI).
Fig. 6 shows another schematic structural diagram of the timing controller in the embodiment of the present disclosure.
As shown, in some embodiments, the MCU 10 may further include a third interface 103,
a third interface 103 configured to transmit initialization data to the image transmission interface unit 60;
an image transmission interface unit 60 configured to perform initialization according to the initialization data.
In some embodiments, the third interface 103 may be an IIC interface.
In some embodiments, the image transmission interface unit 60 may further include an initialization interface configured to receive initialization data.
In some embodiments, the initialization interface of the image transmission interface unit 60 may be an IIC interface configured to receive the initialization data. Optionally, the IIC interface of the image transmission interface unit 60 may be an IIC slave interface (IIC-s), the third interface 103 may be an IIC master interface (IIC-m), and the second interface 102 may be an IIC slave interface (IIC-s).
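A minimal sketch of this master-to-slave initialization in C follows; the slave address, register map, and iic_master_write() helper are hypothetical, standing in for whatever transfers the third interface 103 (IIC-m) performs toward the IIC-s side of the image transmission interface unit 60.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical IIC master write on the third interface 103 (IIC-m):
 * writes one register/value pair to the slave (IIC-s) at `slave_addr`.
 * Returns 0 on success. */
extern int iic_master_write(uint8_t slave_addr, uint8_t reg, uint8_t val);

typedef struct {
    uint8_t reg; /* register in the image transmission interface unit */
    uint8_t val; /* initialization value */
} init_pair_t;

/* Push an initialization table to the image transmission interface unit 60. */
int init_image_interface(uint8_t slave_addr,
                         const init_pair_t *table, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (iic_master_write(slave_addr, table[i].reg, table[i].val) != 0) {
            return -1; /* abort on a failed transfer */
        }
    }
    return 0;
}
```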
In some embodiments, receiving module 40 may be further configured to reorder pixels of the image data prior to sending the image data to 3D module 30.
In some embodiments, the MCU 10 may be configured to transmit mode control information to the 3D module 30, and the 3D module 30 may be configured to output the pixel driving signal in a corresponding mode according to the eyeball coordinates, the mode control information, and the image data.
In some embodiments, the mode control information may include at least one of: a 2D mode, a calibration mode, and a 3D mode;
the 3D module 30 may be configured to: in the 2D mode, perform pixel expansion on the received image data according to the number of composite sub-pixels of each pixel and then output it; in the calibration mode, output a pre-stored standard chart; and in the 3D mode, determine the left- and right-eye pixel arrangement according to the eyeball coordinates and the optical data and then output the received image data.
Fig. 7 shows another schematic structural diagram of the timing controller in the embodiment of the present disclosure.
As shown, in some embodiments, the 3D module 30 may include a register 301, a selection unit 302, a standard graph unit 303, and a 3D algorithm unit 304, the register 301 is connected to the selection unit 302 and the 3D algorithm unit 304 respectively,
a selection unit 302 configured to connect to the standard chart unit 303 or the 3D algorithm unit 304 according to the mode control information in the register 301;
a standard chart unit 303 configured to output a pre-stored standard chart in a calibration mode;
and a 3D algorithm unit 304 configured to, in the 3D mode, determine the left- and right-eye pixel arrangement according to the eyeball coordinates and the optical data and output it, and, in the 2D mode, perform pixel expansion on the received image data and output the expanded image.
In some embodiments, the processing unit 100 may, via the bus, set the 3D-mode value in the register 301 to 1, which indicates that the 3D module 30 needs to operate in the 3D mode; when the 3D module 30 determines that the 3D-mode value is 1, the 3D algorithm unit 304 performs the 3D operation on the image data. Optionally, the specific 3D operation may be implemented using an existing 3D algorithm.
In some embodiments, the processing unit 100 may, via the bus, set the calibration-mode value in the register 301 to 1 and the 3D-mode value to 0, which indicates that the 3D module 30 needs to operate in the calibration mode; when the standard chart unit 303 determines that the calibration-mode value is 1, it outputs the pre-stored standard chart.
In some embodiments, the processing unit 100 may, via the bus, set the calibration-mode value in the register 301 to 0 and the 3D-mode value to 0, which indicates that the 3D module 30 needs to operate in the 2D mode; when the 3D algorithm unit 304 determines that both the calibration-mode value and the 3D-mode value are 0, it copies each pixel of the image data and outputs the result to the output module 50.
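Taken together, the three register settings above form a small decision table. The following C sketch models it in software for clarity; the bit positions are hypothetical, since the disclosure states the values but not the register layout.

```c
#include <stdint.h>
#include <stdbool.h>

typedef enum { MODE_3D, MODE_CALIBRATION, MODE_2D } tcon_mode_t;

/* Resolve the mode encoding described above: 3D=1 selects the 3D mode;
 * otherwise calibration=1 selects the calibration mode; with both 0,
 * the 2D mode applies. Bit positions are assumptions. */
tcon_mode_t decode_mode_register(uint32_t reg)
{
    bool mode_3d  = (reg >> 0) & 1u; /* assumed bit 0: 3D-mode value */
    bool mode_cal = (reg >> 1) & 1u; /* assumed bit 1: calibration-mode value */

    if (mode_3d) {
        return MODE_3D;
    }
    return mode_cal ? MODE_CALIBRATION : MODE_2D;
}
```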
In some embodiments, in the 2D mode, the 3D algorithm unit 304 may copy each pixel of the image data N times and output the result to the output module 50. The number of copies is related to the number of composite sub-pixels included in each pixel.
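A minimal C sketch of this N-fold pixel replication follows; the RGB888 pixel layout is an assumption made only so the copy loop is concrete.

```c
#include <stdint.h>
#include <stddef.h>

/* 2D-mode pixel expansion as described above: each input pixel is
 * replicated n_copies times, matching the number of composite
 * sub-pixels per pixel. `in` holds pixel_count RGB888 pixels; `out`
 * must hold pixel_count * n_copies RGB888 pixels. */
void expand_pixels_2d(const uint8_t *in, uint8_t *out,
                      size_t pixel_count, unsigned n_copies)
{
    for (size_t p = 0; p < pixel_count; p++) {
        const uint8_t *src = in + p * 3; /* one RGB888 pixel */
        for (unsigned c = 0; c < n_copies; c++) {
            uint8_t *dst = out + (p * n_copies + c) * 3;
            dst[0] = src[0]; /* R */
            dst[1] = src[1]; /* G */
            dst[2] = src[2]; /* B */
        }
    }
}
```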
In some embodiments, the selection unit 302 may be implemented using a switching circuit. Optionally, the selection unit 302 may include three input pins A, B, and C, where input pin A corresponds to the 3D mode, input pin B to the calibration mode, and input pin C to the 2D mode; the high and low levels on the three pins determine which unit is connected to the output pin. For example: when pin A is high and pins B and C are low, the selection unit 302 connects to the 3D algorithm unit 304 (3D mode); when pin B is high and pins A and C are low, the selection unit 302 connects to the standard chart unit 303 (calibration mode); when pin C is high and pins A and B are low, the selection unit 302 connects to the 3D algorithm unit 304 (2D mode).
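The pin behavior just described is a one-hot selection. Modeled in software purely for illustration (the actual selection unit 302 is a switching circuit), it reduces to the following truth-table decode, reusing tcon_mode_t from the register sketch above.

```c
#include <stdbool.h>

typedef enum { PIN_DECODE_OK, PIN_DECODE_INVALID } pin_status_t;

/* Decode the one-hot pin pattern described above: exactly one of
 * A (3D mode), B (calibration mode), C (2D mode) may be high.
 * Per the text, MODE_2D also routes to the 3D algorithm unit 304. */
pin_status_t decode_selection_pins(bool a, bool b, bool c, tcon_mode_t *out)
{
    if (a && !b && !c) { *out = MODE_3D;          return PIN_DECODE_OK; }
    if (!a && b && !c) { *out = MODE_CALIBRATION; return PIN_DECODE_OK; }
    if (!a && !b && c) { *out = MODE_2D;          return PIN_DECODE_OK; }
    return PIN_DECODE_INVALID; /* not a valid one-hot pattern */
}
```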
In some embodiments, the processing of the 2D mode may also be implemented by a functional unit independent of the 3D algorithm unit 304.
In some embodiments, each functional module and unit in the embodiments of the present disclosure may be implemented by a hardware circuit or the like.
The embodiment of the present disclosure also provides a display device including the above-mentioned timing controller 700.
Fig. 8 shows a schematic structural diagram of a display device in an embodiment of the present disclosure.
As shown, in some embodiments, the display device may include a timing controller 700 and, optionally, a display screen 800; the timing controller 700 may output pixel driving signals to the display screen 800 to drive its display.
The timing controller and the display device provided by the embodiments of the present disclosure can be used in devices such as liquid crystal displays (LCD), light-emitting diode (LED) displays, and the like.
The timing controller and the display device provided by the embodiments of the present disclosure can be used for 2D, 3D, and other display applications.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims.

Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, as long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently, without changing the meaning of the description. The first and second elements are both elements, but may not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims.

As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method, or apparatus that comprises that element.

In this document, each embodiment may be described with emphasis on its differences from other embodiments, and the same and similar parts among the embodiments may be referred to one another. For the methods, products, etc. disclosed in the embodiments, if they correspond to the method sections disclosed herein, reference may be made to the description of the method sections where relevant.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit may be merely a division of a logical function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the drawings, the width, length, thickness, etc. of structures such as elements or layers may be exaggerated for clarity and descriptive purposes. When an element or layer is referred to as being "disposed on" (or "mounted on," "laid on," "attached to," "coated on," or the like) another element or layer, the element or layer may be directly "disposed on" or "over" the other element or layer, or intervening elements or layers may be present, or even partially embedded in the other element or layer.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises at least one executable instruction for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In the description corresponding to the flowcharts and block diagrams in the figures, operations or steps corresponding to different blocks may also occur in different orders than disclosed in the description, and sometimes there is no specific order between the different operations or steps. For example, two sequential operations or steps may in fact be executed substantially concurrently, or they may sometimes be executed in the reverse order, depending upon the functionality involved. Each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.

Claims (10)

1. A timing controller, comprising: a microprocessor MCU, a 3D module, an eyeball processing module, a receiving module and an output module, wherein the MCU is connected with the first input end of the 3D module, the eyeball processing module is connected with the second input end of the 3D module, the receiving module is connected with the third input end of the 3D module, the output module is connected with the output end of the 3D module, wherein,
the MCU configured to transmit control information to the 3D module;
the eyeball processing module is configured to receive an image acquired by an image sensor and obtain eyeball coordinates according to the image;
the receiving module configured to receive image data and transmit the image data to the 3D module;
the 3D module configured to generate pixel driving signals according to the eyeball coordinates, the control information, and the image data;
the output module is configured to output the pixel driving signal to a display screen.
2. The timing controller according to claim 1, wherein the eyeball processing module comprises an image receiving unit, an eyeball processing unit, and an image transmitting unit,
the image receiving unit is connected with the image sensor and is configured to receive the image provided by the image sensor;
the eyeball processing unit is configured to acquire the image provided by the image sensor and calculate eyeball coordinates;
the image transmitting unit is configured to transmit the eyeball coordinates to the 3D module.
3. The timing controller according to claim 1, wherein the MCU comprises a processing unit and a first interface, the first interface is connected to a memory, and the memory is pre-stored with optical data;
the processing unit further configured to send a read command to the first interface;
the first interface is configured to retrieve the optical data from the memory according to the read command and send it to the 3D module.
4. The timing controller of claim 3, wherein the MCU further comprises a second interface,
the second interface configured to receive optical data;
the processing unit further configured to send a write command to the first interface;
the first interface is further configured to retrieve the optical data according to the write command and write the optical data to the memory.
5. The timing controller of claim 3, further comprising an image transmission interface unit connected to an input terminal of the receiving module,
the image transmission interface unit is configured to receive image data in a first protocol format, convert the image data into image data in a second protocol format, and output the image data to the receiving module.
6. The timing controller of claim 5, wherein the MCU further comprises a third interface,
the third interface configured to transmit initialization data to the image transmission interface unit;
the image transmission interface unit is configured to initialize according to the initialization data.
7. The timing controller of claim 1, wherein the MCU is configured to transmit mode control information to the 3D module, and the 3D module is further configured to output the pixel driving signal in a corresponding mode according to the eyeball coordinates, the mode control information, and the image data.
8. The timing controller of claim 7, wherein the mode control information comprises at least one of: 2D mode, calibration mode, 3D mode;
the 3D module is configured to: in the 2D mode, perform pixel expansion on the received image data according to the number of composite sub-pixels of each pixel and then output it; in the calibration mode, output a pre-stored standard chart; and in the 3D mode, determine the left- and right-eye pixel arrangement according to the eyeball coordinates and the optical data and then output the received image data.
9. The timing controller according to claim 8, wherein the 3D module comprises a register, a selection unit, a standard chart unit, and a 3D algorithm unit, the register being connected to the selection unit and the 3D algorithm unit respectively, wherein
the selection unit is configured to connect to the standard chart unit or the 3D algorithm unit according to the mode control information in the register;
the standard chart unit is configured to output a pre-stored standard chart in the calibration mode; and
the 3D algorithm unit is configured to, in the 3D mode, determine the left- and right-eye pixel arrangement according to the eyeball coordinates and the optical data and output the image data, and, in the 2D mode, perform pixel expansion on the received image data and output it.
10. A display device comprising the timing controller according to any one of claims 1 to 9.
CN202110568109.5A 2021-05-25 2021-05-25 Timing controller and display device Pending CN113010020A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202110568109.5A CN113010020A (en) Timing controller and display device
PCT/CN2022/092374 WO2022247647A1 (en) 2021-05-25 2022-05-12 Timing controller and display device
TW111118502A TWI802414B (en) 2021-05-25 2022-05-18 Timing controller and display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110568109.5A CN113010020A (en) Timing controller and display device

Publications (1)

Publication Number Publication Date
CN113010020A true CN113010020A (en) 2021-06-22

Family

ID=76380803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110568109.5A Pending CN113010020A (en) Timing controller and display device

Country Status (3)

Country Link
CN (1) CN113010020A (en)
TW (1) TWI802414B (en)
WO (1) WO2022247647A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022012455A1 (en) * 2020-07-15 2022-01-20 北京芯海视界三维科技有限公司 Method and apparatus for implementing target object positioning, and display component
CN115278200A (en) * 2022-07-29 2022-11-01 北京芯海视界三维科技有限公司 Processing apparatus and display device
CN115278201A (en) * 2022-07-29 2022-11-01 北京芯海视界三维科技有限公司 Processing apparatus and display device
CN115278198A (en) * 2022-07-29 2022-11-01 北京芯海视界三维科技有限公司 Processing apparatus and display device
WO2022247647A1 (en) * 2021-05-25 2022-12-01 北京芯海视界三维科技有限公司 Timing controller and display device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120038690A1 (en) * 2010-08-11 2012-02-16 Jaeyong Lee Stereoscopic image display and method for driving the same
US20130021325A1 (en) * 2011-07-22 2013-01-24 Bo-Ram Kim Three-dimensional image display device and a driving method thereof
US20130286057A1 (en) * 2012-04-26 2013-10-31 Samsung Display Co., Ltd. Method of driving 3d shutter glasseses (spectacles), a shutter spectacles apparatus for performing the method and display apparatus having the shutter spectacles apparatus
KR20140003685A (en) * 2012-06-22 2014-01-10 삼성디스플레이 주식회사 3d image display device and driving method thereof
CN107071384A (en) * 2017-04-01 2017-08-18 上海讯陌通讯技术有限公司 The binocular rendering intent and system of virtual active disparity computation compensation
CN108415157A (en) * 2018-03-07 2018-08-17 京东方科技集团股份有限公司 A kind of virtual display device
CN110933333A (en) * 2019-12-06 2020-03-27 河海大学常州校区 Image acquisition, storage and display system based on FPGA
CN111508447A (en) * 2020-04-27 2020-08-07 上海济丽信息技术有限公司 Image time sequence control circuit of grating type naked eye 3D liquid crystal screen based on FPGA
CN211554502U (en) * 2019-12-05 2020-09-22 北京芯海视界三维科技有限公司 Field sequential display
CN111757088A (en) * 2019-03-29 2020-10-09 刁鸿浩 Naked eye stereoscopic display system with lossless resolution

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101686099B1 (en) * 2010-07-09 2016-12-13 엘지디스플레이 주식회사 Driving circuit for image display device and method for driving the same
KR101890622B1 (en) * 2011-11-22 2018-08-22 엘지전자 주식회사 An apparatus for processing a three-dimensional image and calibration method of the same
KR102470377B1 (en) * 2015-12-31 2022-11-23 엘지디스플레이 주식회사 Display device for personal immersion apparatus
CN105930821B (en) * 2016-05-10 2024-02-02 上海青研科技有限公司 Human eye identification and tracking method and human eye identification and tracking device device applied to naked eye 3D display
CN108076208B (en) * 2016-11-15 2021-01-01 中兴通讯股份有限公司 Display processing method and device and terminal
KR102537784B1 (en) * 2018-08-17 2023-05-30 삼성전자주식회사 Electronic device and control method thereof
CN208705623U (en) * 2018-09-27 2019-04-05 昆山龙腾光电有限公司 The changeable display device in width visual angle
CN113010020A (en) * 2021-05-25 2021-06-22 北京芯海视界三维科技有限公司 Time schedule controller and display device

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120038690A1 (en) * 2010-08-11 2012-02-16 Jaeyong Lee Stereoscopic image display and method for driving the same
US20130021325A1 (en) * 2011-07-22 2013-01-24 Bo-Ram Kim Three-dimensional image display device and a driving method thereof
US20130286057A1 (en) * 2012-04-26 2013-10-31 Samsung Display Co., Ltd. Method of driving 3d shutter glasseses (spectacles), a shutter spectacles apparatus for performing the method and display apparatus having the shutter spectacles apparatus
KR20140003685A (en) * 2012-06-22 2014-01-10 삼성디스플레이 주식회사 3d image display device and driving method thereof
CN107071384A (en) * 2017-04-01 2017-08-18 上海讯陌通讯技术有限公司 The binocular rendering intent and system of virtual active disparity computation compensation
CN108415157A (en) * 2018-03-07 2018-08-17 京东方科技集团股份有限公司 A kind of virtual display device
CN111757088A (en) * 2019-03-29 2020-10-09 刁鸿浩 Naked eye stereoscopic display system with lossless resolution
CN211554502U (en) * 2019-12-05 2020-09-22 北京芯海视界三维科技有限公司 Field sequential display
CN110933333A (en) * 2019-12-06 2020-03-27 河海大学常州校区 Image acquisition, storage and display system based on FPGA
CN111508447A (en) * 2020-04-27 2020-08-07 上海济丽信息技术有限公司 Image time sequence control circuit of grating type naked eye 3D liquid crystal screen based on FPGA

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022012455A1 (en) * 2020-07-15 2022-01-20 北京芯海视界三维科技有限公司 Method and apparatus for implementing target object positioning, and display component
WO2022247647A1 (en) * 2021-05-25 2022-12-01 北京芯海视界三维科技有限公司 Timing controller and display device
CN115278200A (en) * 2022-07-29 2022-11-01 北京芯海视界三维科技有限公司 Processing apparatus and display device
CN115278201A (en) * 2022-07-29 2022-11-01 北京芯海视界三维科技有限公司 Processing apparatus and display device
CN115278198A (en) * 2022-07-29 2022-11-01 北京芯海视界三维科技有限公司 Processing apparatus and display device

Also Published As

Publication number Publication date
TWI802414B (en) 2023-05-11
WO2022247647A1 (en) 2022-12-01
TW202246951A (en) 2022-12-01

Similar Documents

Publication Publication Date Title
CN113010020A (en) Time schedule controller and display device
KR101545682B1 (en) Video rendering across a high speed peripheral interconnect bus
EP1899913B1 (en) Dynamic load balancing in multiple video processing unit (vpu) systems
US8004531B2 (en) Multiple graphics processor systems and methods
US7663635B2 (en) Multiple video processor unit (VPU) memory mapping
CN113012621A (en) Time schedule controller and display device
EP1217602A2 (en) Updating image frames in a display device comprising a frame buffer
CN103262060A (en) Method and apparatus for 3d capture syncronization
CN113012636A (en) Time schedule controller and display device
WO2006129194A2 (en) Frame synchronization in multiple video processing unit (vpu) systems
CN112055845B (en) Image display method and video processing apparatus
US9563582B2 (en) Modular device, system, and method for reconfigurable data distribution
CN106358063B (en) Touch television and control method and control device thereof
US20120013523A1 (en) Sensor Driven Automatic Display Configuration System And Method
US12034908B2 (en) Stereoscopic-image playback device and method for generating stereoscopic images
CN109308862B (en) Signal control method, control device, processing device and signal control equipment
US20060022973A1 (en) Systems and methods for generating a composite video signal from a plurality of independent video signals
CN113315964A (en) Display method and device of 3D image and electronic equipment
CN116580100A (en) Video injection scheme-based camera calibration method and device
CN116708737A (en) Stereoscopic image playing device and stereoscopic image generating method thereof
CN115278198A (en) Processing apparatus and display device
CN101262576A (en) Method and related device for hiding data into video signals and transferring data to display device
CN117198232A (en) Color ink screen driving method, device and storage medium
CN114077410A (en) Image processing device and virtual reality equipment
CN115278201A (en) Processing apparatus and display device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination