CN108646353B - Optical fiber-waveguide automatic alignment coupler based on image processing

Info

Publication number: CN108646353B (grant); application number CN201810397010.1A
Authority: CN (China)
Prior art keywords: waveguide, image, fiber, filler block, output
Legal status: Active (granted)
Other versions: CN108646353A (application publication)
Original language: Chinese (zh)
Inventors: 宋凝芳, 李亮, 冯迪, 张春熹
Original and current assignee: Beijing University of Aeronautics and Astronautics
Application filed by Beijing University of Aeronautics and Astronautics; priority to CN201810397010.1A

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/24Coupling light guides
    • G02B6/36Mechanical coupling means
    • G02B6/38Mechanical coupling means having fibre to fibre mating means
    • G02B6/3807Dismountable connectors, i.e. comprising plugs
    • G02B6/3833Details of mounting fibres in ferrules; Assembly methods; Manufacture
    • G02B6/385Accessories for testing or observation of connectors

Abstract

The invention discloses an optical fiber-waveguide automatic alignment coupler based on image processing, belonging to the technical field of optical fiber communication and optical fiber sensing. The coupler comprises an image acquisition module, an image processing and control module, a motion execution module and a human-computer interaction module. The motion execution module comprises a pigtailed filler block, a Y waveguide element, a six-dimensional electric displacement table, a motor control board and two sets of manual displacement tables; the image processing and control module comprises a processor, a storage module, a communication interface and a display module. The image acquisition module acquires clearly focused side and end-face images. After these are processed by the image processing and control module, edge characteristic straight lines are extracted from the side image and the relative deviation is calculated; the motion execution module eliminates the deviation to realize initial light passing, the output-point position is determined, the maximum gray value of the output light spot is controlled, and automatic alignment coupling of the optical fiber and the Y waveguide is realized. The invention has a small volume, low cost, friendly human-computer interaction and high operation speed, and avoids the long operation time of manual alignment.

Description

Optical fiber-waveguide automatic alignment coupler based on image processing
Technical Field
The invention belongs to the technical field of optical fiber communication and optical fiber sensing, and relates to an optical fiber-waveguide automatic alignment coupler based on image processing.
Background
In the 21st century, the development and use of optoelectronic devices in fields such as communication and sensing have become strategic technologies that countries compete to develop. In optical fiber communication, integrated optoelectronic devices have become key components supporting the rapid development of next-generation optical fiber communication owing to their stable performance, high integration, fast processing speed and high reliability. Waveguide devices are important basic components of integrated optoelectronic devices, and the low-loss connection and packaging technology of optical fibers and waveguides is a key technology of integrated optoelectronics.
In optical fiber sensing, the fiber-optic gyroscope is an important application. It has become the mainstream gyroscope in inertial technology research and has high application value in military, aviation and many civil fields. Current high-precision fiber-optic gyroscopes generally adopt a direct-coupling scheme between a high-birefringence fiber ring and a Y waveguide device. The loss and noise introduced at the coupling point between the polarization maintaining fiber and the Y waveguide are the main factors restricting further improvement of the measurement precision of the fiber-optic gyroscope.
The coupling loss between an optical fiber and an optical waveguide device mainly comprises mode-field mismatch loss, Fresnel reflection loss, transmission loss and alignment-deviation loss.
Existing fiber-waveguide alignment coupling technology is mainly divided into active and passive alignment. Active alignment generally introduces a light source at the input fiber, detects the output at the waveguide output end or at the end of the output fiber with an optical power meter, and uses this feedback to guide the adjustment of each degree of freedom until the coupling power is optimal, achieving high-precision attitude adjustment. This method has high precision, but the initial light-passing stage is generally aligned manually and is therefore slow. Current fine-alignment algorithms driven by feedback include the hill-climbing method, polynomial fitting, the simplex method, the Hamilton algorithm, the centroid method and genetic algorithms; they can effectively search for the maximum coupling position, but are limited by the low feedback rate and high data volatility of the optical power meter, and each has its own advantages and disadvantages.
Passive alignment uses semiconductor processing to machine a U-shaped or V-shaped groove on the waveguide chip and directly couples and packages the fiber to the waveguide. It is fast and does not require precise, expensive instruments or alignment skill, but the positioning accuracy of the groove and the assembly accuracy of the device strongly affect the coupling efficiency; the chip manufacturing process is also complex and the coupling loss is high. Active alignment is therefore generally adopted in practice.
Digital image processing refers to processing digital images with a computer to improve pictorial information for human interpretation or machine understanding. A digital image is an image f(x, y) discretized in both spatial coordinates and intensity, which can be represented by a two-dimensional array of integers. Digital image processing offers a rich set of techniques: noise can be reduced by filtering, target objects can be extracted from the background by threshold segmentation and binarization, and target features and boundary regions can be identified by edge detection and feature extraction. It has high processing precision, good reproducibility, low cost and a wide range of applications, and is widely used in many technical fields.
An embedded system is an application-centric, computer-technology-based special-purpose computer system with tailorable software and hardware, suited to applications with strict requirements on function, reliability, cost, volume and power consumption. It consists mainly of a hardware system and software programs; the core of the hardware system is a microprocessor, memory, various interfaces and measurement-and-control circuits. ARM microprocessors are small, low-power, low-cost, high-performance and fast; they provide ample internal hardware resources, can load a real-time operating system, can run a user interface and application programs, and have the high-speed processing and computing capability needed for general digital image acquisition and processing, making them very suitable for image processing systems.
Disclosure of Invention
The invention provides an optical fiber-waveguide automatic alignment coupler based on image processing, aiming to adjust the pose of the optical fiber relative to the waveguide, search for the maximum coupling point, and realize fast and efficient automatic alignment coupling of the optical fiber and the waveguide.
The optical fiber-waveguide automatic alignment coupler comprises an image acquisition module, an image processing and control module, a motion execution module and a human-computer interaction module.
The image acquisition module comprises a white light source, a laser, an optical fiber adapter and two sets of microscope-CMOS cameras;
The laser directly couples red light into the pigtailed filler block through a bare-fiber adapter; the pigtailed filler block faces the Y waveguide element for coupling. The first set of microscope-CMOS cameras is vertically fixed above the fiber-waveguide coupling point, and the white light source is added at the side of the first set of microscope-CMOS cameras to coaxially illuminate the coupling point. The second set of microscope-CMOS cameras is horizontally fixed behind the output end of the Y waveguide element.
The motion execution module comprises the pigtailed filler block, the Y waveguide element, a six-dimensional electric displacement table, a motor control board, two sets of manual displacement tables and the like;
the pigtailed filler block is clamped by a filler-block clamp and fixed on the six-dimensional electric displacement table. The processor sends instructions to the motor control board so that the six-dimensional electric displacement table can adjust the spatial pose in six dimensions. The Y waveguide element, as the alignment counterpart, is fixed on the waveguide frame. The first set of microscope-CMOS cameras is fixed on the first manual displacement table and the second set on the second manual displacement table; the cameras are focused by manual adjustment.
The image processing and control module comprises a processor, a storage module, a communication interface and a display module.
The communication interface comprises an RS232 serial port and USB interfaces. The RS232 serial port connects the motor control board and the six-dimensional electric displacement table and is used to send instructions that control the six-dimensional electric displacement table; the processor is connected to the two sets of microscope-CMOS cameras through the USB interfaces, which transmit images to the processor. The display module is connected to the touch screen through an LCD interface.
The processor analyzes the pose relationship between the pigtailed filler block and the Y waveguide element and the coupled output light intensity in the images transmitted by the two sets of microscope-CMOS cameras, and sends instructions to the motor control board to search for the maximum coupling point.
The man-machine interaction module comprises a touch screen, a mouse and a keyboard.
Automatic coupling software is opened on the touch screen to realize the automatic alignment coupling of the optical fiber and the Y waveguide.
The automatic alignment coupling process of the optical fiber and the Y waveguide comprises the following specific steps:
Step one, respectively clamping and fixing the pigtailed filler block and the Y waveguide element to be aligned and coupled, and coarsely adjusting them to proper relative positions;
Step two, opening the automatic alignment software and running, and respectively acquiring and presenting images by two groups of microscope-CMOS cameras;
Step three, manually adjusting two sets of manual displacement tables respectively to ensure that side images and end surface images of the two sets of microscope-CMOS cameras are clearest;
The first set of microscope-CMOS cameras corresponds to the side image; the second set corresponds to the end-face image.
Step four, automatically extracting and marking the edge characteristic straight lines of the pigtailed filler block and the Y waveguide element in the clearly imaged side image;
the extracted edge characteristic straight lines are the transverse and longitudinal edges and the corner-point position of the pigtailed filler block, the longitudinal edge of the Y waveguide element, and the straight-line position of the channel of the Y waveguide element.
The specific extraction process is as follows: the first set of microscope-CMOS cameras collects an image and transmits it to the processor; the processor converts the color image into a grayscale image and applies filtering and noise reduction; the segmentation threshold of the grayscale image is then obtained with the Otsu method, the image is converted to a binary image, and the target object is segmented from the background; the regions with the sharpest gray-level change, i.e. the edge positions of the Y waveguide element and the pigtailed filler block, are found using a longitudinal and a transverse window-moving method respectively; an ROI region is set, straight-line segments inside it are extracted and fitted, and the result is labeled on the image.
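The chain just described (grayscale conversion, Gaussian filtering, Otsu thresholding, line fitting inside an ROI) can be sketched as follows. This is a minimal illustration assuming OpenCV and NumPy; the patent's longitudinal/transverse window-moving edge search is simplified here to a Canny edge map plus a probabilistic Hough fit, and the frame and ROI are hypothetical inputs.

```python
import cv2
import numpy as np

def extract_edge_lines(frame_bgr, roi):
    """Sketch of the side-image preprocessing: gray -> Gaussian filter ->
    Otsu threshold -> straight-line fit inside a region of interest (ROI)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)                 # suppress noise outliers
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # Otsu segmentation
    x, y, w, h = roi                                            # ROI around the suspected edge
    edges = cv2.Canny(binary[y:y + h, x:x + w], 50, 150)
    # Fit straight segments inside the ROI; each result is (x1, y1, x2, y2) in full-image coordinates
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=30, maxLineGap=5)
    offset = np.array([x, y, x, y])
    return [] if lines is None else [tuple(seg[0] + offset) for seg in lines]
```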
The channel location of the Y waveguide element is determined from the positions of its two electrode strips. The processing is similar to the edge-line extraction, except that the gray segmentation threshold is calculated from the gray values of the electrodes and a small neighboring area, so that the electrodes are segmented from the lithium niobate substrate; linear equations of the two electrode strips are then fitted and extracted, the straight line midway between them is taken as the position line of the Y waveguide input end, and it is marked in the image.
Step five, calculating, from the edge characteristic straight lines, the angle difference between the transverse lower edge of the pigtailed filler block and the waveguide channel and the transverse and longitudinal distance deviations between the corner-point position of the filler block and the waveguide channel, and moving the pigtailed filler block to a preset position;
first, using simple point-to-line distance and line-angle formulas, the following are obtained: the angle difference between the transverse lower edge of the pigtailed filler block and the waveguide channel, and the longitudinal and transverse deviations from the corner-point position of the filler block to the channel of the Y waveguide element.
The transverse lower edge of the pigtailed filler block corresponds to the position of the optical fiber core inside the filler block.
Then, the distances expressed in pixels on the image are converted into actual spatial distances according to the size of each pixel, and each deviation distance is divided by the minimum step distance of the six-dimensional electric displacement table to obtain the number of motor steps required to eliminate that deviation.
Finally, the processor sends instructions to the motor control board to control the six-dimensional electric displacement table to translate and rotate, so that the deviations are eliminated and the pigtailed filler block and the Y waveguide element reach the preset position.
During deviation elimination, the angular deviation is eliminated first, then the longitudinal distance deviation, and finally the transverse distance deviation. After each axis of the six-dimensional electric displacement table moves, the side coupling image is reprocessed, the characteristic straight lines of the pigtailed filler block and the Y waveguide element are refitted, the relative deviation is recalculated, and the six-dimensional electric displacement table is driven again to eliminate it.
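As a rough illustration of the deviation measurement and pixel-to-step conversion described above (the pixel size of 2.2 μm is taken from the detailed description; the minimum step distance and the example coordinates are placeholders, not the patent's calibration values):

```python
import math

PIXEL_SIZE_UM = 2.2      # physical size of one pixel, per the detailed description
MIN_STEP_UM = 0.1        # hypothetical minimum step of the six-dimensional stage

def line_angle(p1, p2):
    """Angle (radians) of the straight line through points p1 and p2."""
    return math.atan2(p2[1] - p1[1], p2[0] - p1[0])

def point_to_line_distance_px(pt, a, b, c):
    """Distance in pixels from pt = (x, y) to the line a*x + b*y + c = 0."""
    return abs(a * pt[0] + b * pt[1] + c) / math.hypot(a, b)

def deviation_to_steps(distance_px):
    """Convert a pixel deviation into motor steps of the displacement table."""
    return round(distance_px * PIXEL_SIZE_UM / MIN_STEP_UM)

# Illustrative numbers only: angular difference between the filler-block lower edge
# and the waveguide channel, and lateral steps from a corner point to the channel line.
angle_diff = line_angle((0, 120), (640, 118)) - line_angle((0, 200), (640, 196))
lateral_steps = deviation_to_steps(point_to_line_distance_px((320, 118), 0.0, 1.0, -200.0))
```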
Step six, realizing initial light passing after the pigtailed filler block reaches the preset position, with two output light spots at the end of the Y waveguide element appearing in the end-face image;
Step seven, the system automatically detects and marks the center positions of the two output light spots and judges whether the marking frames on the output image completely contain the output light spots; if they do, the output-end positions are fixed; otherwise the pigtailed filler block is finely adjusted until the correct position is reached.
The processor automatically calculates the gray center of gravity of each of the two output light spots.
The formula is as follows:
x_c = \sum_{i,j} i \, f(i,j) / \sum_{i,j} f(i,j), \qquad y_c = \sum_{i,j} j \, f(i,j) / \sum_{i,j} f(i,j)
where f(i, j) is the gray value of pixel (i, j) in the image.
The gray center of gravity obtained in this way is the center position of the output light spot; a neighborhood of preset size around this center is selected and marked out as the output-point region for statistical processing.
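A NumPy sketch of the gray center-of-gravity calculation above, assuming `spot` is a two-dimensional grayscale array cropped around one output light spot:

```python
import numpy as np

def gray_centroid(spot):
    """Gray center of gravity: intensity-weighted mean of the pixel coordinates."""
    spot = spot.astype(np.float64)
    total = spot.sum()
    i_idx, j_idx = np.indices(spot.shape)        # row (i) and column (j) coordinate grids
    ic = (i_idx * spot).sum() / total
    jc = (j_idx * spot).sum() / total
    return ic, jc                                 # sub-pixel center of the light spot
```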
Step eight, after the output-end positions are determined, the system automatically adjusts the exposure time of the end-face image, controls the maximum image value of the output light spots and calculates the output optical power;
the system automatically adjusts the exposure time of the end-face image according to the detected light intensity at the output end, controlling the maximum value of the output-spot image to lie between 50 and 150. The system then automatically detects the two remaining dim light spots in the image, extracts the red-channel value of each pixel in the neighborhood of each small spot, records it as p_{ij}, and sums these values to obtain P = \sum_{i,j} p_{ij}, which is taken as the output optical power.
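The dimming-and-power-reading idea can be sketched as below. This assumes an OpenCV capture device whose exposure is settable through `CAP_PROP_EXPOSURE` (units and direction are driver-dependent), and the spot center coordinates are hypothetical inputs; it illustrates the feedback loop, not the patent's driver code.

```python
import cv2
import numpy as np

def spot_power(frame_bgr, center, radius=15):
    """Sum the red-channel values p_ij in a square neighborhood of the spot center:
    P = sum_{i,j} p_ij, used as the output optical power."""
    ic, jc = center                                           # integer row/column of the spot center
    patch = frame_bgr[ic - radius:ic + radius, jc - radius:jc + radius, 2]  # red channel (BGR order)
    return int(np.sum(patch, dtype=np.int64))

def auto_dim(cap, lo=50, hi=150, exposure=-4):
    """Reduce the exposure until the brightest red pixel lies in [lo, hi]."""
    while True:
        cap.set(cv2.CAP_PROP_EXPOSURE, exposure)              # exact units are driver-specific
        ok, frame = cap.read()
        peak = int(frame[..., 2].max()) if ok else 0
        if lo <= peak <= hi or exposure < -13:                # arbitrary lower bound for the sketch
            return frame
        exposure -= 1                                          # shorter exposure -> dimmer image
```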
Step nine, the system searches for the maximum coupling point according to an axis-by-axis iterative-fitting automatic search algorithm and finds the position of the pigtailed filler block at which the output optical power is maximized.
The specific steps are as follows:
Step 901, determine the search axis and activate it, and set the moving distance L and step distance ΔL of the six-dimensional electric displacement table;
Step 902, control the six-dimensional electric displacement table to move a distance L in the negative direction of this axis, and record the position as x = -L;
Step 903, move forward by the step distance ΔL and record the current position as x = x + ΔL; the second set of microscope-CMOS cameras acquires the output image at the end of the Y waveguide element in real time and transmits it to the processor; the system automatically processes the image, sums the extracted red channel at the output points as the output optical power, and stores this feedback value in the sequence seq(x).
Step 904, check whether x is larger than L; if so, go to step 905 for curve fitting; otherwise return to step 903;
Step 905, after the scan of the interval (-L, L) is finished, find the maximum in the sequence seq(x); the 3 data points on either side of the maximum, together with the maximum itself, are extracted as the data set (x_i, y_i), and a quadratic function is fitted to these 7 data points using the least-squares method (see the code sketch after this step list).
The fitting function is set to
y = a_1 x^2 + a_2 x + a_3
where (a_1, a_2, a_3) are the coefficients of the quadratic function to be fitted. Substituting the data set (x_i, y_i) into the least-squares solution yields the corresponding coefficient set (a_1, a_2, a_3).
Step 906, after the quadratic curve is obtained, compute the position of its extreme point, use it in place of the maximum position of the measured data, and move the pigtailed filler block to this maximum position.
Step 907, check whether the step distance ΔL has reached the preset precision threshold; if so, scan and traverse the interval (-ΔL, ΔL) at a step distance of ΔL/5, find the position with the maximum output optical power, move the pigtailed filler block to that position, and go to step 909; otherwise go to step 908;
Step 908, when the step distance ΔL is larger than the preset precision threshold, halve the scan parameters L and ΔL, return to step 902, scan the halved interval (-L, L) with the halved step distance ΔL, select the data set for quadratic curve fitting again, and find the maximum position.
Step 909, when the traversal scan is finished, the search for the maximum coupling position in this axis is complete; switch to and activate the next axis of the six-dimensional electric displacement table and repeat the above steps until the maximum coupling position has been searched in every axis. The resulting position of the pigtailed filler block is the required maximum coupling position, and the automatic search ends.
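The quadratic fit of step 905 and the vertex replacement of step 906 can be illustrated with NumPy's least-squares polynomial fit, a sketch assuming `xs` and `ys` hold the 7 positions and power readings around the measured maximum:

```python
import numpy as np

def refine_maximum(xs, ys):
    """Fit y = a1*x^2 + a2*x + a3 to the points around the measured maximum and
    return the extremum position of the fitted parabola (steps 905-906)."""
    a1, a2, a3 = np.polyfit(xs, ys, deg=2)      # least-squares quadratic coefficients
    if a1 >= 0:                                  # parabola opens upward: no interior maximum
        return xs[int(np.argmax(ys))]            # fall back to the measured maximum
    return -a2 / (2.0 * a1)                      # vertex of the fitted parabola
```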
The invention has the advantages that:
(1) An optical fiber-waveguide automatic alignment coupler based on image processing selects an ARM as the processor and a touch screen as the human-computer interaction interface, and has the advantages of small volume, low cost, friendly human-computer interaction and high operation speed;
(2) An optical fiber-waveguide automatic alignment coupler based on image processing utilizes a digital image processing method to realize initial light passing of an optical fiber-waveguide, is fast and convenient, achieves sub-pixel alignment precision, and avoids long-time operation of manual alignment;
(3) An optical fiber-waveguide automatic alignment coupler based on image processing utilizes a CMOS camera to image and detect light intensity, replaces an optical power meter, achieves output light intensity feedback with high real-time performance, is controlled in a closed loop mode, and can achieve automatic alignment coupling of optical fiber-waveguide.
Drawings
FIG. 1 is a schematic structural diagram of an optical fiber-waveguide automatic alignment coupler based on image processing according to the present invention;
FIG. 2 is a schematic diagram of the alignment coupling point of the pigtail filler block and the Y-waveguide component in the fiber-waveguide automatic alignment coupler according to the present invention;
FIG. 3 is a side view of a coupling point acquired by a first set of microscope-CMOS cameras according to the present invention;
FIG. 3(a) is an image of the present invention when the camera is manually adjusted to focus clearly;
FIG. 3(b) is a diagram of the present invention for automatically extracting and labeling edges of a side image;
FIG. 3(c) is an image of the coupling point after the automatic coarse alignment has been performed to remove the offset according to the present invention;
FIG. 4 is an image of the output end of the Y waveguide taken by the second set of microscope-CMOS cameras of the present invention;
FIG. 4(a) is an image of the present invention when the camera is manually adjusted to focus clearly;
FIG. 4(b) is the output light spot image when the initial light-up is realized after the coarse alignment of the present invention;
FIG. 4(c) is an image of the output spot of the present invention after automatic dimming;
FIG. 5 is a flow chart of the automatic alignment coupling of the fiber-waveguide automatic alignment coupler according to the present invention;
FIG. 6 is a flow chart of a method of the present invention for automatically searching for a point of maximum coupling according to an auto-search algorithm with axis-wise iterative fitting;
FIG. 7 is an interface for operating the fiber-waveguide automatic alignment coupler software according to the present invention;
FIG. 8 is a comparison between experimental data and theoretical simulation data of the output optical power detected by a CMOS camera according to the present invention.
In the figures: 1 - processor; 2 - touch screen; 3 - mouse; 4 - pigtailed filler block; 5 - Y waveguide element; 6 - polarization maintaining fiber ring; 7 - optical fiber adapter; 8 - laser; 9 - six-dimensional electric displacement table; 10 - motor control board; 11 - white light source; 12 - first set of microscope-CMOS cameras; 13 - first manual displacement stage; 14 - second set of microscope-CMOS cameras; 15 - second manual displacement stage.
Detailed Description
The present invention will be described in further detail below with reference to the accompanying drawings.
The invention relates to an optical fiber-waveguide automatic alignment coupler based on image processing, which comprises an image acquisition module, an image processing and control module, a motion execution module and a human-computer interaction module, as shown in figure 1.
The image acquisition module comprises a white light source 11, a laser 8, an optical fiber adapter 7 and two sets of microscope-CMOS cameras.
The laser 8 directly couples the red light into the pigtailed filler block 4 through the bare fiber adapter 7; the pigtailed filler block 4 is coupled to face the Y waveguide element 5; the two sets of CMOS cameras are connected to the microscope objective, and the first set of microscope-CMOS camera 12 is vertically fixed above the pad-waveguide coupling point to perform microscopic imaging on the coupling point. A white light source 11 is added to the side of the first set of microscope-CMOS cameras 12 to illuminate the coupling points coaxially. The second set of microscope-CMOS cameras 14 is horizontally placed behind the Y waveguide element 5 to microscopically image the output end of the Y waveguide element 5. Both microscope-CMOS cameras transmit images to the processor 1 via the USB interface for analysis.
The white light source 11 is a point light source, is inserted into an adapter at the side of the microscope objective, and realizes coaxial light striking through a semi-transparent semi-reflective mirror surface therein to illuminate the coupling point of the pigtail filler block 4 and the Y waveguide element 5.
The laser 8 is a 650 nm red-light laser with an output pigtail. The coating at the input end of the polarization maintaining fiber ring 6 is stripped, the fiber is cleaned and inserted into the optical fiber adapter 7 with 2-3 mm protruding beyond the ceramic ferrule, the protruding fiber is cleaved flat with a gem knife, and the adapter is then directly coupled to the output pigtail of the laser 8 through a flange. The red light is coupled through the polarization maintaining fiber ring 6 into the Y waveguide element 5 at the coupling point, so that output light spots are visible at the output end of the Y waveguide element 5.
The motion execution module comprises the pigtailed filler block 4, a filler-block clamp, the Y waveguide element 5, a six-dimensional electric displacement table 9, a motor control board 10, two sets of three-dimensional manual displacement tables, a waveguide fixing frame, and so on.
In the fiber-optic gyroscope, the alignment accuracy required for direct coupling between the polarization maintaining fiber and the Y waveguide is high, so the alignment coupling between the polarization maintaining fiber ring and the input end of the Y waveguide is selected as the operation target in this embodiment. Because of its wide use in optical fiber sensing, the fiber selected for alignment here is a panda-type polarization maintaining fiber; a schematic of the pigtailed filler block 4 is shown on the left side of FIG. 2. The pigtailed filler block is used in place of a bare polarization maintaining fiber as the coupling alignment object, so that detection and alignment can be performed using the edges of the filler block.
The right side of FIG. 2 shows the widely used Y waveguide element 5 as the alignment counterpart in optical fiber sensing. The Y waveguide element 5 is an annealed proton-exchange waveguide on a lithium niobate substrate; its channel, obtained by annealed proton exchange, is a fine region about 6 micrometers wide and 3 micrometers deep running along the surface of the substrate. The refractive index of the channel region differs from that of the lithium niobate substrate by less than 0.02, so the channel is difficult to observe in a conventional way. As shown in FIG. 3(a), two faint thin lines are just visible in the middle area of the Y waveguide surface; these form the Y waveguide channel region. One of them, midway between the two clearly visible electrode regions, is the input channel of the Y waveguide; the other is an auxiliary observation channel. The Y waveguide component selected in the invention has electrode strips plated on both sides of the input and output channels during manufacture to indicate position and assist observation, so the position of the Y waveguide input channel can be determined from the positions of the two electrode strips. When the pigtailed filler block 4 and the Y waveguide element 5 are placed manually, the pigtailed filler block 4 is clamped by the filler-block clamp and fixed on the six-dimensional electric displacement table 9.
The fiber-waveguide coupling essentially has six degrees of freedom in three-dimensional space, as shown in FIG. 2: the lateral misalignments in the X and Y directions, the longitudinal separation Z, the pitch and yaw angles α and β produced by rotation about the X and Y axes, and the angle θ between the slow axis of the polarization maintaining fiber and the polarization axis of the optical waveguide. A change in any of the first five degrees of freedom affects the insertion loss, while a change in the angle θ mainly affects the polarization crosstalk, which is important in fiber sensing. Each degree of freedom must therefore be controlled during the process to ensure minimum insertion loss, minimum polarization crosstalk and good stability.
When the processor 1 sends instructions to the motor control board 10, the six displacement directions of the six-dimensional electric displacement table 9 are set according to the six degrees of freedom of the coupling point; the directions are mutually orthogonal, so that ideally movements along different axes do not interfere with one another. The six-dimensional electric displacement table 9 thus adjusts the spatial pose in six dimensions. Its three rotation axes intersect at one point, called the rotation center. When the pigtailed filler block 4 is fixed, the end point of its output pigtail is placed at this rotation center, so that rotation operations introduce no additional displacement deviation from a rotation arm. In a practically assembled instrument the assembly precision is limited and the six displacement axes cannot be strictly orthogonal, so a small amount of mutual disturbance remains in actual axis operation.
A first set of microscope-CMOS cameras 12 is fixed on a first manual displacement stage 13 and a second set of microscope-CMOS cameras 14 is fixed on a second manual displacement stage 15, and the focusing of the cameras is achieved through manual adjustment.
The image processing and control module comprises a processor 1, a storage module, a communication interface, a display module and a power supply module.
The processor 1 can be a PC or an ARM processor. A PC is used in the development stage; in practical use an ARM microprocessor can be chosen to reduce the volume, with a Cortex-A9 series ARM processor as the controller. Compared with conventional PC control, the system is smaller, cheaper, and more flexible and convenient to use; the image acquisition module, motion execution module and human-computer interaction module are all implemented in the embedded environment, giving a high degree of system integration.
The storage module uses DDR3 as RAM and eMMC as FLASH storage. The communication interface comprises an RS232 serial port and 3 USB interfaces: the RS232 serial port connects the motor control board 10 and the six-dimensional electric displacement table 9 and is used to send motion instructions, which the motor control board 10 converts into phase currents that drive the stepping motors; the 3 USB interfaces connect the first set of microscope-CMOS cameras 12, the second set of microscope-CMOS cameras 14 and the mouse 3, and the CMOS cameras use a dedicated driver under Linux to acquire images and transmit them to the ARM processor 1 for image processing. The display module is connected to the capacitive touch screen 2 through a 45-pin LCD interface. The power module uses a 5 V power adapter socket and is powered by a 5 V DC supply.
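For the RS232 link to the motor control board, the listing below is only a shape-of-the-code sketch using pySerial; the port name, baud rate and command bytes are hypothetical, since the patent does not specify the motor-board protocol.

```python
import serial  # pySerial

def send_motion_command(port="/dev/ttyS0", baud=9600, command=b"MOVE X +120\r\n"):
    """Open the RS232 link and send one (hypothetical) motion command to the
    motor control board that drives the six-dimensional displacement table."""
    with serial.Serial(port, baudrate=baud, timeout=1.0) as link:
        link.write(command)
        return link.readline()   # board acknowledgement, if any
```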
The processor 1 is respectively connected with the two groups of microscope-CMOS cameras through USB interfaces and is used for transmitting images to the processor 1; the processor 1 analyzes the pose relationship and the coupling output light intensity of the tail fiber filler block 4 and the Y waveguide element 5 in the images transmitted by the two groups of microscope-CMOS cameras, and sends an instruction to the motor control board 10 to search a maximum coupling point;
The man-machine interaction module comprises a touch screen 2, a mouse and a keyboard 3.
The human-machine interface is developed with Qt. The automatic alignment coupling software is opened on the touch screen; the operation interface is shown in FIG. 7, and the first set of microscope-CMOS cameras 12 and the second set 14 start to acquire images and display them on the software interface. Observing the coupling-point image and the output-end image of the Y waveguide element 5, the first manual displacement table 13 and the second manual displacement table 15 are adjusted to bring both sets of microscope-CMOS cameras to the in-focus position, so that the images are clearest, and the viewing direction is adjusted to select a suitable field of view.
The automatic aligning and coupling process of the optical fiber-Y waveguide mainly comprises two steps, namely a coarse aligning stage for processing a side coupling point image and a fine aligning stage for processing an end face output image. As shown in fig. 5, the specific steps are as follows:
Step one, respectively clamping and fixing the pigtailed filler block and the Y waveguide element to be aligned and coupled, and coarsely adjusting them to proper relative positions;
the pigtailed filler block to be aligned and coupled is clamped and fixed on the filler-block clamp, which is placed at the rotation-center position of the six-dimensional electric displacement table; the Y waveguide element to be aligned and coupled is fixed on the waveguide frame; the pigtailed filler block and the Y waveguide component are then coarsely adjusted to proper relative positions.
Step two, opening the automatic alignment software and running it, with the two sets of microscope-CMOS cameras acquiring and displaying images;
the automatic alignment software is opened and Run is clicked on the operation interface shown in FIG. 7; the two sets of microscope-CMOS cameras start acquiring images, which are transmitted and displayed in the two image frames of the operation interface.
Step three, manually adjusting the two manual displacement tables so that the side image and end-face image of the two sets of microscope-CMOS cameras are clearest;
the side image corresponding to the first set of microscope-CMOS cameras is shown in FIG. 3, and the end-face image corresponding to the second set is shown in FIG. 4.
Note that for the end-face image, the edge of the Y waveguide output end must be located correctly and both side edges in the image must be brought to the in-focus position at the same time.
Step four, automatically extracting and marking the edge characteristic straight lines of the pigtailed filler block and the Y waveguide element in the clearly imaged side image;
in the coarse-alignment stage, the image processing of the coupling-point side image mainly consists of detecting and extracting the characteristic straight lines of the pigtailed filler block and the Y waveguide element. There are two kinds of characteristic straight lines: the first is the edge straight lines of the pigtailed filler block and the Y waveguide element, and the second is the waveguide-channel position of the Y waveguide; specifically, the transverse and longitudinal edges and corner-point position of the pigtailed filler block, the longitudinal edge of the Y waveguide element, and the straight-line position of the waveguide channel found from the two auxiliary electrodes.
After the first set of microscope-CMOS cameras has been focused, the 'extract edge' button on the operation interface is clicked; the system automatically performs digital image processing operations such as graying, threshold segmentation, edge detection and straight-line fitting on the side image on the left of the interface, and extracts and labels the edge characteristic straight lines of the pigtailed filler block and the Y waveguide element in the side image.
As shown in FIG. 3(a), the edge points are distinct and differ strongly in gray level from the background, so conventional image processing and line-extraction methods are used to extract the edge lines. The specific process is as follows: the image collected by the first set of microscope-CMOS cameras is transmitted to the processor; the embedded program in the image processing module first converts the collected color image into a grayscale image and applies Gaussian filtering to remove noise outliers; the segmentation threshold of the grayscale image is then obtained with the Otsu method, the image is converted into a binary image, and the target object is clearly segmented from the background; the regions with the sharpest gray-level change, i.e. the edge positions of the pigtailed filler block and the Y waveguide element, are found with a longitudinal and a transverse window-moving method respectively; ROI regions are set, straight-line segments inside them are extracted and fitted, and the image is labeled, with the result shown in FIG. 3(b).
For the Y waveguide channel, because it is difficult to observe and its gray level differs little from the surrounding area, the position of the Y waveguide input channel is determined from the positions of the two electrode strips. The processing is similar to the edge-line extraction, except that the gray segmentation threshold is calculated from the gray values of the electrodes and a small neighboring area, which allows the electrodes to be segmented from the lithium niobate substrate. Linear equations of the two electrode strips are fitted and extracted, the straight line midway between them is taken as the position line of the Y waveguide input end, and it is marked in the image, as shown in FIG. 3(b). The position line of the Y waveguide input end is drawn straight across the entire image to show its angular deviation from the lower edge of the pigtailed filler block.
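The window-moving edge search and the electrode mid-line construction can be sketched as follows (NumPy only; the window size, the row-wise scan direction and the slope-intercept line representation are illustrative assumptions):

```python
import numpy as np

def find_edge_row(gray, window=5):
    """Longitudinal window-moving search: return the row index where the mean
    gray level changes most sharply between adjacent windows."""
    row_means = gray.mean(axis=1)                                   # average gray level of each row
    smoothed = np.convolve(row_means, np.ones(window) / window, mode="valid")
    return int(np.argmax(np.abs(np.diff(smoothed)))) + window // 2

def channel_midline(line_a, line_b):
    """Given the two fitted electrode lines y = k*x + b, take the line midway
    between them as the Y waveguide input-channel position line."""
    (k1, b1), (k2, b2) = line_a, line_b
    return (k1 + k2) / 2.0, (b1 + b2) / 2.0
```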
Step five, calculating, from the edge characteristic straight lines, the angle difference between the transverse lower edge of the pigtailed filler block and the waveguide channel and the transverse and longitudinal distance deviations between the corner-point position of the filler block and the waveguide channel, and moving the pigtailed filler block to a preset position;
after detecting all the characteristic straight lines, the processor analyzes each line segment to obtain the edge distance and angle deviation between the pigtailed filler block and the Y waveguide element, and the deviation between the optical fiber axis (which generally coincides with the edge of the pigtailed filler block) and the input line of the Y waveguide element. The number of motor steps required is calculated from each distance or angle deviation and sent to the motor control board through the RS232 interface, which controls the motors to align the pigtailed filler block with the Y waveguide element.
The specific procedure is as follows:
first, after obtaining the edge characteristic straight lines of the pigtailed filler block and the Y waveguide element, the following are calculated using simple point-to-line distance and line-angle formulas: the angle difference between the transverse lower edge of the pigtailed filler block and the waveguide channel, and the longitudinal and transverse deviations from the corner-point position of the filler block to the waveguide channel.
The transverse lower edge of the pigtailed filler block corresponds to the position of the optical fiber core inside the filler block.
Then, the distances expressed in pixels on the image are converted into actual spatial distances according to the size of each pixel (2.2 μm × 2.2 μm), and each deviation distance is divided by the minimum step distance of the six-dimensional electric displacement table to obtain the number of motor steps required to eliminate that deviation.
Finally, the 'automatic alignment' button on the operation interface is clicked; the processor sends instructions to the motor control board to control the six-dimensional electric displacement table to translate and rotate, eliminating the deviations while bringing the pigtailed filler block to the preset position.
During deviation elimination, the order chosen is: first eliminate the angular deviation, then the longitudinal distance deviation, and finally the transverse distance deviation.
In practice, because the accuracy of the six-dimensional electric displacement table is limited, crosstalk exists between the axes: when one axis moves, the other axes also displace and the deviations change. Therefore, during deviation elimination the side coupling image is reprocessed after each axis movement, the characteristic straight lines of the pigtailed filler block and the Y waveguide element are refitted, the relative deviation is recalculated, and the electric displacement table is driven again to eliminate it (a sketch of this loop follows this step). Finally the pigtailed filler block and the Y waveguide component reach the desired positions, as shown in FIG. 3(c), and the initial light-passing step of the fiber-waveguide alignment coupling is essentially complete.
During deviation elimination a certain distance must be reserved along the longitudinal axis of the Y waveguide channel, both to allow the pigtailed filler block to move during the subsequent automatic search and to leave space for applying UV-curable adhesive after the automatic alignment coupling is finished.
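A hedged sketch of the measure-move-remeasure loop referred to above; the four callables (`capture`, `fit_lines`, `deviations`, `move_axis`) stand for the camera, line-fitting, deviation-calculation and stage interfaces, which the patent describes functionally but whose software APIs are not given, and the tolerances are placeholder values.

```python
def coarse_align(capture, fit_lines, deviations, move_axis, max_rounds=10):
    """Coarse-alignment loop: re-image, re-fit and re-measure after every axis move,
    eliminating the angular, longitudinal and transverse deviations in that order."""
    order = ("angle", "longitudinal", "lateral")                  # fixed elimination order
    tol = {"angle": 0.001, "longitudinal": 1.0, "lateral": 1.0}   # assumed tolerances (rad, steps)
    for _ in range(max_rounds):
        dev = deviations(fit_lines(capture()))                    # {"angle": ..., "longitudinal": ..., "lateral": ...}
        if all(abs(dev[k]) <= tol[k] for k in order):
            return True                                           # filler block has reached the preset position
        for axis in order:                                        # crosstalk: move one axis, then re-measure
            if abs(dev[axis]) > tol[axis]:
                move_axis(axis, dev[axis])
                break
    return False
```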
Step six, realizing initial light passing after the pigtailed filler block reaches the preset position, with two output light spots at the end of the Y waveguide element appearing in the end-face image;
when the pigtailed filler block reaches the preset position, the deviation between the fiber and the waveguide is confined to a small range. Once the side coupling image shows that the fiber and the Y waveguide are basically aligned, two output light spots can be observed in the image of the Y waveguide output end collected by the CMOS camera, i.e. initial light passing through the Y waveguide is achieved. The image collected by the second set of microscope-CMOS cameras after initial light passing is shown in FIG. 4(b): two bright light spots, the two output end points at the end of the Y waveguide, appear at the edge of the end-face image of the Y waveguide output end, and the image is transmitted to the processor and displayed on the touch screen.
If no output light spot is detected at the output end after coarse alignment, the 'motor adjustment' button in the interactive software can be selected to open the direct six-axis motor control interface, and the motors can be finely adjusted in each direction until the output light spots are found.
Step seven, the system automatically detects and marks the center positions of the two output light spots and judges whether the marking frames on the output image completely contain the output light spots; if they do, the output-end positions are fixed; otherwise the pigtailed filler block is finely adjusted until the correct position is reached.
The processor automatically calculates the gray center of gravity of each of the two output light spots.
The formula is as follows:
x_c = \sum_{i,j} i \, f(i,j) / \sum_{i,j} f(i,j), \qquad y_c = \sum_{i,j} j \, f(i,j) / \sum_{i,j} f(i,j)
where f(i, j) is the gray value of pixel (i, j) in the image.
The gray center of gravity obtained in this way is the center position of the output light spot. The 'determine area' button on the operation interface is clicked, and a neighborhood of preset size around this center is selected and marked out as the output-point region for statistical processing.
Step eight, after the output-end positions are determined, the system automatically adjusts the exposure time of the end-face image, controls the maximum image value of the output light spots and calculates the output optical power;
after the output-end positions are determined, the 'automatic dimming' button on the operation interface is clicked. The system automatically adjusts the exposure time of the end-face image according to the detected light intensity at the output end, controlling the maximum value of the output-spot image to lie between 50 and 150. At this point, as shown in FIG. 4(c), only two dim light spots remain in the image; the system automatically detects them, extracts the red-channel value of each pixel in the neighborhood of each small spot, records it as p_{ij}, and sums these values to obtain P = \sum_{i,j} p_{ij}, which is taken as the output optical power.
In this embodiment the CMOS camera replaces the optical power meter as the detector of output optical power, which improves the feedback rate and the coupling efficiency. In principle the CMOS camera and the optical power meter work the same way: by the photoelectric effect, both accumulate the photocurrent generated on the sensor by the incident light over a certain time (the exposure time, in the case of the CMOS camera) to characterize the detected optical power. Compared with an optical power meter, however, the CMOS camera has a smaller detectable intensity range, and the gray values of the collected image can only lie between 0 and 255. The white area at the center of the output spots in FIG. 4(b) indicates that the CMOS camera is overexposed. To use the CMOS camera instead of an optical power meter, the detected light intensity must therefore be reduced: the exposure time is adjusted so that the maximum value of the output spots on the image lies within the detectable range, as shown in FIG. 4(c), preventing inaccurate detection due to overexposure and reducing the two bright output spots to two dimmer ones.
After the automatic dimming step, the system automatically detects the output light spots, extracts and sums their red-channel values, and feeds the sum back as the output optical power. If the red-channel values of the two spots are extracted and the pixels are scanned along one axis (for example horizontally or vertically), plotting the red-channel value against the scan coordinate gives a good Gaussian-like line shape. This demonstrates that the two red spots are the fundamental modes of the output ends, and that the change in output optical power can be characterized by the statistically summed energy of the fundamental mode.
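The Gaussian-like line shape mentioned above can be checked with a one-line red-channel profile (a sketch; `frame_bgr` and the spot row index are assumed inputs):

```python
import numpy as np

def red_profile(frame_bgr, spot_row, half_width=40):
    """Scan the red channel along one horizontal line through a spot; plotting this
    profile against the column index gives the Gaussian-like curve of the fundamental mode."""
    row = frame_bgr[spot_row, :, 2].astype(np.float64)     # red channel of a BGR frame
    center = int(np.argmax(row))
    return row[max(0, center - half_width):center + half_width]
```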
The red-channel values within a small neighborhood of the centers of the two output spots are extracted and summed to give the output optical power at that moment. Moving the pigtailed filler block along a single axis and recording the output light intensity at each point gives the trends shown in FIG. 8, the horizontal and vertical axial scanning curves across the fiber-waveguide coupling cross-section; the horizontal axis is the single-axis displacement and the vertical axis the relative coupling efficiency. The experimental data agree with the data obtained from theoretical simulation of fiber-waveguide coupling in both magnitude and trend, and show good unimodal behavior. It is therefore practical for the present invention to use a CMOS camera instead of an optical power meter to detect the output optical power.
Step nine, the system searches for the maximum coupling point according to an axis-by-axis iterative-fitting automatic search algorithm and finds the position of the pigtailed filler block at which the output optical power is maximized.
The 'search optimal point' button on the operation interface is clicked, and the system automatically searches for the maximum coupling point, finding the position of the pigtailed filler block that maximizes the output optical power according to the axis-by-axis iterative-fitting automatic search algorithm. When the maximum coupling point is found, an 'automatic coupling completed' prompt box appears on the operation interface; the complete fiber-waveguide automatic alignment coupling process is then finished, after which operations such as applying UV-curable adhesive at the coupling point are carried out.
The automatic search for the maximum coupling point relies on the unimodal behavior shown by the scanning curve of each axis: the position of minimum coupling loss is the position of maximum detected output optical power, and this is the position searched for.
Because crosstalk exists between the axis motors of the six-dimensional electric displacement table actually used, and because some vibration is present, the fed-back output optical power is unstable, so the maximum coupling point is searched for with an axis-by-axis iterative fitting method. Once the initial light-passing and automatic dimming steps are complete and correct optical-power feedback is available, the automatic search for the maximum coupling point begins. As shown in FIG. 6, the automatic search algorithm proceeds as follows (a consolidated code sketch follows these steps):
Step 901, determine the search axis and activate it, and set the moving distance L and step distance ΔL of the six-dimensional electric displacement table;
first the search axis is determined and its motor is energized and activated, and the search parameters L and ΔL are set.
Step 902, control the six-dimensional electric displacement table to move a distance L in the negative direction of this axis, and record the position as x = -L;
Step 903, move forward by the step distance ΔL and record the current position as x = x + ΔL; the second set of microscope-CMOS cameras acquires the output image at the end of the Y waveguide element in real time and transmits it to the processor; the system automatically processes the image, sums the extracted red channel at the output points as the output optical power, and stores this feedback value in the sequence seq(x).
Step 904, check whether x is larger than L; if so, go to step 905 for curve fitting; otherwise return to step 903;
Step 905, after the scan of the interval (-L, L) is finished, find the maximum in the sequence seq(x); the 3 data points on either side of the maximum, together with the maximum itself, are extracted as the data set (x_i, y_i), and a quadratic function is fitted to these 7 data points using the least-squares method.
The least-squares method (also called the method of least squares) is a mathematical optimization technique: it finds the best functional match to the data by minimizing the sum of squared errors.
The fitting function is set to
y = a_1 x^2 + a_2 x + a_3
where (a_1, a_2, a_3) are the coefficients of the quadratic function to be fitted.
By the least-squares principle, the data set (x_i, y_i) is substituted into the sum of squared residuals
S(a_1, a_2, a_3) = \sum_i [ y_i - (a_1 x_i^2 + a_2 x_i + a_3) ]^2 ,
the partial derivatives of S with respect to each a_k (k = 1, 2, 3) are taken and set to zero, and solving the resulting equations gives the required parameters (a_1, a_2, a_3) and hence the fitted quadratic function.
and 906, after the secondary curve is obtained, calculating to obtain the extreme point position of the curve, replacing the maximum value position of the actual measurement data as the maximum value position, and moving the pigtail backing block to the maximum value position.
Step 907: check whether the step size ΔL has reached the preset precision threshold; if so, traverse the interval (-ΔL, ΔL) with a step size of ΔL/5, find the position of maximum output optical power, move the pigtailed filler block to that position, and proceed to step 909; otherwise, go to step 908;
Step 908: if the step size ΔL is still larger than the preset precision threshold, halve both scanning parameters L and ΔL, return to step 902, rescan the halved interval (-L, L) with the halved step size, select the data set for quadratic curve fitting, and find the maximum position.
Step 909: when the traversal scan is finished, the search for the maximum coupling position along the current axis is complete; switch to the next axis of the six-dimensional electric displacement table, energize and activate it, and repeat the above steps until the maximum coupling position has been found for every axis. The position of the pigtailed filler block is then the required maximum coupling position and the automatic search ends.
After every axis has been searched automatically, an 'automatic coupling completed' prompt box pops up on the operation interface, indicating that the maximum coupling point has been found. The program can then be exited, and bonding and packaging steps such as applying UV-curable adhesive can be carried out; these are not described further here.
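As a concrete illustration of steps 901 to 909, the Python sketch below outlines one way the per-axis scan, quadratic least-squares fit, and step-halving refinement could be implemented. The stage and camera interfaces move_axis_to and read_output_power are hypothetical placeholders standing in for the motor-control-board commands and the red-channel power feedback described above, and the numerical defaults are illustrative only; this is a sketch under those assumptions, not the instrument's actual software.

import numpy as np

def search_axis_maximum(move_axis_to, read_output_power,
                        L=0.1, dL=0.01, precision=0.001):
    """Axis-by-axis iterative fitting search for a single axis (sketch).

    move_axis_to(x)      -- placeholder: drive this axis of the stage to position x
    read_output_power()  -- placeholder: red-channel sum of the output-spot image
    L, dL                -- scan half-range and step size (steps 901-902)
    precision            -- step-size threshold triggering the final fine traversal (step 907)
    """
    center = 0.0                                     # current best position on this axis
    while True:
        # Steps 902-904: scan the interval (center - L, center + L) with step dL
        xs = np.arange(center - L, center + L + dL / 2, dL)
        ys = np.empty(len(xs))
        for n, x in enumerate(xs):
            move_axis_to(x)
            ys[n] = read_output_power()

        # Step 905: take the maximum sample and up to 3 neighbours on each side,
        # then fit y = a1*x^2 + a2*x + a3 by least squares
        k = int(np.argmax(ys))
        lo, hi = max(0, k - 3), min(len(xs), k + 4)
        a1, a2, a3 = np.polyfit(xs[lo:hi], ys[lo:hi], 2)

        # Step 906: use the vertex of the fitted parabola instead of the raw maximum
        vertex = -a2 / (2 * a1) if a1 < 0 else xs[k]
        center = float(np.clip(vertex, xs[0], xs[-1]))
        move_axis_to(center)

        if dL <= precision:
            # Step 907: fine traversal of (center - dL, center + dL) at step dL/5
            fine = np.arange(center - dL, center + dL + dL / 10, dL / 5)
            powers = np.empty(len(fine))
            for n, x in enumerate(fine):
                move_axis_to(x)
                powers[n] = read_output_power()
            center = float(fine[int(np.argmax(powers))])
            move_axis_to(center)
            return center                            # step 909: this axis is done
        # Step 908: halve the scan range and step size, then rescan around the new center
        L, dL = L / 2, dL / 2

For the full search, the same routine would simply be run for each axis of the displacement table in turn, re-activating the corresponding motor before each call, as described in step 909.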

Claims (5)

1. An optical fiber-waveguide automatic alignment coupler based on image processing comprises an image acquisition module, an image processing and control module, a motion execution module and a human-computer interaction module;
The image acquisition module comprises a white light source, a laser, an optical fiber converter and two sets of microscope-CMOS cameras;
the laser directly couples red light into the pigtailed filler block through the bare-fiber adapter; the pigtailed filler block faces the Y waveguide element for coupling; the first group of microscope-CMOS cameras is fixed vertically above the fiber-waveguide coupling point, with the white light source mounted beside it to illuminate the coupling point coaxially; the second group of microscope-CMOS cameras is fixed horizontally behind the output end of the Y waveguide element;
the motion execution module comprises a filler block with a tail fiber, a Y waveguide element, a six-dimensional electric displacement table, a motor control board and two sets of manual displacement tables;
the pigtailed filler block is held by a filler-block clamp and fixed on the six-dimensional electric displacement table; the processor sends instructions to the motor control board so that the six-dimensional electric displacement table adjusts the spatial pose in all six degrees of freedom; the Y waveguide element, as the component to be aligned, is fixed on the waveguide frame; the first group of microscope-CMOS cameras is fixed on the first manual displacement table and the second group on the second manual displacement table, and the cameras are focused by manual adjustment;
the image processing and controlling module comprises a processor, a storage module, a communication interface and a display module;
the communication interface comprises an RS232 serial port and a USB interface, the RS232 serial port is connected with the motor control board and the six-dimensional electric displacement table and used for sending instructions to control the six-dimensional electric displacement table, and the processor is respectively connected with the two sets of microscope-CMOS cameras through the USB interface and used for transmitting images to the processor; the display module is connected with the touch screen through an LCD interface;
the processor analyzes the pose relationship between the pigtailed filler block and the Y waveguide element and the coupled output light intensity from the images transmitted by the two groups of microscope-CMOS cameras, and sends instructions to the motor control board to search for the maximum coupling point;
the human-computer interaction module comprises a touch screen, a mouse and a keyboard;
automatic coupling software is opened on the touch screen to realize automatic alignment coupling of the optical fiber-Y waveguide;
the method is characterized in that the automatic alignment coupling process of the optical fiber-Y waveguide comprises the following specific steps:
step one, clamping and fixing the pigtailed filler block and the Y waveguide element to be aligned and coupled, respectively, and coarsely adjusting them to suitable relative positions;
step two, opening and running the automatic alignment software, with the two groups of microscope-CMOS cameras each acquiring and displaying images;
step three, manually adjusting the two manual displacement tables so that the side image and the end-face image from the two groups of microscope-CMOS cameras are at their sharpest;
the first group of microscope-CMOS cameras corresponds to the side image, and the second group corresponds to the end-face image;
step four, automatically extracting and marking the edge feature lines of the pigtailed filler block and the Y waveguide element in the sharply imaged side image;
the extracted edge feature lines are, respectively, the transverse and longitudinal edges and corner-point position of the pigtailed filler block, the longitudinal edge of the Y waveguide element, and the line position of the Y waveguide element channel;
step five, calculating, from the edge feature lines, the angular deviation between the transverse lower edge of the pigtailed filler block and the waveguide channel, and the transverse and longitudinal distance deviations between the corner-point position of the filler block and the waveguide channel, and moving the pigtailed filler block to a preset position;
step six, achieving initial light passing once the pigtailed filler block has reached the preset position, so that the two output light spots at the output end of the Y waveguide element appear in the end-face image;
step seven, the system automatically detects and marks the center positions of the two output light spots and judges whether the marker frames on the output image fully contain the output light spots; if so, the output-end positions are fixed; otherwise, the pigtailed filler block is finely adjusted until the correct position is reached;
the processor automatically calculates the gray-level centroid of each of the two output light spots;
the formula is

i_c = Σ_{i,j} i·f(i,j) / Σ_{i,j} f(i,j),  j_c = Σ_{i,j} j·f(i,j) / Σ_{i,j} f(i,j),

where f(i, j) is the gray value of pixel (i, j) in the image;
the gray-level centroid so obtained is the center position of the output light spot; a neighborhood of preset size around this center is selected and marked as the output-point region used for the statistical processing;
step eight, after the output-end positions have been determined, the system automatically adjusts the exposure time of the end-face image, controls the image maximum of the output light spots, and calculates the output optical power;
the system adjusts the exposure time of the end-face image according to the detected light-intensity feedback at the output end, so that the maximum gray value of the output-spot image lies between 50 and 150; the system then automatically detects the two remaining dim small light spots in the image, extracts the red-channel value of each pixel in the neighborhood of each small spot, records it as p_ij, and sums these values to obtain P = Σ_{i,j} p_ij, which is taken as the output optical power (a brief illustrative sketch of this computation follows this claim);
step nine, the system searches for the maximum coupling point using the axis-by-axis iterative fitting automatic search algorithm and finds the position of the pigtailed filler block at which the output optical power is maximized.
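The gray-centroid and red-channel power computation described in steps seven and eight can be sketched as follows in NumPy. The RGB channel ordering and the neighborhood half-size are assumptions made only for illustration; the sketch is not the claimed implementation.

import numpy as np

def spot_centroid(gray):
    """Gray-level centroid of a spot region: pixel coordinates weighted by the gray values f(i, j)."""
    gray = np.asarray(gray, dtype=np.float64)
    i, j = np.indices(gray.shape)
    total = gray.sum()
    return (i * gray).sum() / total, (j * gray).sum() / total

def output_power(rgb, center, half=15):
    """Sum the red-channel values p_ij over a fixed-size neighborhood of the spot center: P = sum of p_ij."""
    ci, cj = int(round(center[0])), int(round(center[1]))
    red = rgb[ci - half:ci + half + 1, cj - half:cj + half + 1, 0]   # channel 0 = red, assuming RGB order
    return int(red.sum())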
2. The fiber-waveguide automatic alignment coupler based on image processing as claimed in claim 1, wherein in step four the extracted edge feature lines are, respectively, the transverse and longitudinal edges and corner-point position of the pigtailed filler block, the longitudinal edge of the Y waveguide element, and the line position of the Y waveguide element channel.
3. The fiber-waveguide automatic alignment coupler based on image processing as claimed in claim 1, wherein the extraction in step four is specifically as follows:
the first group of microscope-CMOS cameras collects images and transmits them to the processor; the processor converts the color image into a gray image and performs filtering and noise reduction; the segmentation threshold of the gray image is then found by the Otsu method, the image is converted into a binary image, and the target object is segmented from the background; a longitudinal window-moving method and a transverse window-moving method are used, respectively, to locate the regions of sharpest gray-level change, i.e., the edge positions of the Y waveguide element and the pigtailed filler block; an ROI (region of interest) is set, straight-line segments within the ROI are extracted and fitted, and the fitted lines are marked on the image;
the position of the Y waveguide element channel is determined from the positions of the two electrode strips; the processing is similar to the edge-line extraction, the difference being that the gray segmentation threshold is computed from the gray values of the electrodes and a small surrounding region so that the electrodes are segmented from the lithium niobate substrate; the line equations of the two electrode strips are extracted by fitting, the line midway between them is taken as the position line of the Y waveguide input end, and it is marked in the image.
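A minimal OpenCV-style sketch of the edge-line extraction described in this claim: Otsu thresholding of the filtered gray image, a moving-window search for the sharpest gray-level transition, and a straight-line fit inside the ROI. The window widths and ROI handling are simplified assumptions, not the patented implementation.

import cv2
import numpy as np

def extract_vertical_edge(gray_roi):
    """Locate the strongest vertical edge in an 8-bit gray ROI and fit a line to it."""
    gray = cv2.GaussianBlur(gray_roi, (5, 5), 0)                       # filtering / noise reduction
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)     # Otsu segmentation of target / background
    profile = binary.mean(axis=0)                                      # transverse moving-window profile (column means)
    edge_col = int(np.argmax(np.abs(np.diff(profile))))                # column of sharpest gray-level change
    pts = []
    for row in range(binary.shape[0]):                                 # one edge point per row near that column
        seg = binary[row, max(0, edge_col - 5):edge_col + 6].astype(np.int16)
        jumps = np.flatnonzero(np.abs(np.diff(seg)) > 0)
        if jumps.size:
            pts.append([max(0, edge_col - 5) + int(jumps[0]), row])
    vx, vy, x0, y0 = cv2.fitLine(np.array(pts, dtype=np.float32),
                                 cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    return vx, vy, x0, y0                                              # direction vector and a point on the edge line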
4. The fiber-waveguide automatic alignment coupler based on image processing as claimed in claim 1, wherein said step five is specifically:
first, using the point-to-line distance formula or the line-to-line distance formula, the following are computed: the angular deviation between the transverse lower edge of the pigtailed filler block and the waveguide channel, and the longitudinal and transverse deviations from the corner-point position of the filler block to the Y waveguide element channel;
The transverse lower edge of the pigtail filler block refers to the position of an optical fiber core in the pigtail filler block;
then, the distances expressed in pixels on the image are converted into actual spatial distances according to the physical size of each pixel, and each deviation distance is divided by the minimum step of the six-dimensional electric displacement table to obtain the number of motor steps required to eliminate that deviation;
finally, the processor sends instructions to the motor control board to make the six-dimensional electric displacement table perform the translation and rotation operations, so that the deviations are eliminated and the pigtailed filler block and the Y waveguide element reach the preset position;
during deviation elimination, the angular deviation is removed first, then the longitudinal distance deviation, and finally the transverse distance deviation; after each axis of the six-dimensional electric displacement table moves, the side coupling image is reprocessed, the feature lines of the pigtailed filler block and the Y waveguide element are fitted again, the relative deviation is recalculated, and the six-dimensional electric displacement table is driven to eliminate the remaining deviation.
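A small sketch of the pixel-to-motor-step conversion used in this claim. The pixel size and the minimum increments of the stage are illustrative placeholder values, not the instrument's actual calibration.

def deviations_to_motor_steps(angle_deg, dx_px, dy_px,
                              um_per_pixel=1.2, min_step_um=0.1, min_step_deg=0.01):
    """Convert image-space deviations into motor step counts for the six-dimensional stage.

    angle_deg    -- angular deviation between the filler-block lower edge and the waveguide channel
    dx_px, dy_px -- transverse / longitudinal corner-point deviations, in pixels
    """
    rot_steps = round(angle_deg / min_step_deg)              # rotation steps to remove the angular deviation
    dy_steps = round(dy_px * um_per_pixel / min_step_um)     # longitudinal translation steps
    dx_steps = round(dx_px * um_per_pixel / min_step_um)     # transverse translation steps
    # The claim removes the angular deviation first, then the longitudinal, then the transverse one.
    return rot_steps, dy_steps, dx_steps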
5. The fiber-waveguide automatic alignment coupler based on image processing as claimed in claim 1, wherein step nine is specifically as follows:
step 901, determining the search axis and activating it, and setting the travel distance L and the step size ΔL of the six-dimensional electric displacement table;
step 902, controlling the six-dimensional electric displacement table to move a distance L along the axis in the negative direction, and recording the position as x = -L;
step 903, moving forward by the step size ΔL and updating the position as x = x + ΔL; the second group of microscope-CMOS cameras acquires the output-end image of the Y waveguide element in real time and transmits it to the processor, which automatically processes the image, sums the red-channel values at the output point as the output optical power feedback, and stores the value in the sequence seq(x);
step 904, checking whether x exceeds L; if so, proceeding to step 905 for curve fitting; otherwise, returning to step 903;
step 905, after the interval (-L, L) has been scanned, finding the position of the maximum in the sequence seq(x), taking the data point at the maximum together with the three points on either side of it as the data set (x_i, y_i), and fitting a quadratic function to these seven points by the least-squares method;
set the fitting function to:
wherein (a)1,a2,a3) Coefficients of each level of the quadratic function to be fitted are obtained; to process the data set (x)i,yi) Substituting into least square method to solve formula to obtain coefficient group (a) of corresponding quadratic function1,a2,a3);
step 906, after the quadratic curve has been obtained, computing the position of its extreme point, using it in place of the maximum position found from the measured data, and moving the pigtailed filler block to this position;
step 907, checking whether the step size ΔL has reached the preset precision threshold; if so, traversing the interval (-ΔL, ΔL) with a step size of ΔL/5, finding the position of maximum output optical power, moving the pigtailed filler block to that position, and proceeding to step 909; otherwise, going to step 908;
step 908, if the step size ΔL is still larger than the preset precision threshold, halving both scanning parameters L and ΔL, returning to step 902, rescanning the halved interval (-L, L) with the halved step size, selecting the data set for quadratic curve fitting, and finding the maximum position;
step 909, when the traversal scan is finished, the search for the maximum coupling position along the current axis is complete; the next axis of the six-dimensional electric displacement table is switched to and activated, and the above steps are repeated until the maximum coupling position has been found for every axis; the position of the pigtailed filler block is then the required maximum coupling position and the automatic search ends.
CN201810397010.1A 2018-04-28 2018-04-28 Optical fiber-waveguide automatic alignment coupler based on image processing Active CN108646353B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810397010.1A CN108646353B (en) 2018-04-28 2018-04-28 Optical fiber-waveguide automatic alignment coupler based on image processing

Publications (2)

Publication Number Publication Date
CN108646353A CN108646353A (en) 2018-10-12
CN108646353B true CN108646353B (en) 2019-12-17

Family

ID=63748412

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810397010.1A Active CN108646353B (en) 2018-04-28 2018-04-28 Optical fiber-waveguide automatic alignment coupler based on image processing

Country Status (1)

Country Link
CN (1) CN108646353B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111220141B (en) * 2020-02-25 2021-06-25 北京航空航天大学 Shaft aligning method for direct coupling of polarization maintaining optical fiber ring terminal and integrated optical chip
CN111596425A (en) * 2020-06-23 2020-08-28 中国计量大学 Automatic light focusing system for collimator to focus light
CN112082541A (en) * 2020-07-28 2020-12-15 北京航天时代光电科技有限公司 Y waveguide and optical fiber polarization axis alignment system and method based on image recognition
CN112747898A (en) * 2020-12-25 2021-05-04 长飞光纤光缆股份有限公司 Automatic aligning and centering device and method for matching oil coated optical fiber end face
CN112764172B (en) * 2020-12-28 2022-03-29 武汉光迅科技股份有限公司 Multi-channel pre-alignment system and multi-channel pre-alignment method based on machine vision
CN112859256B (en) * 2021-01-07 2022-07-08 天津大学 Grating coupler positioning measurement method based on image recognition
JP7128926B1 (en) 2021-03-18 2022-08-31 Nttエレクトロニクス株式会社 Optical fiber connection system, optical fiber connection control device, and optical fiber connection method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5857047A (en) * 1996-03-20 1999-01-05 The Regents Of The University Of California Automated fiber pigtailing machine
EP1355125A2 (en) * 2002-04-09 2003-10-22 Ngk Insulators, Ltd. Method for determining core positions of optical element array and an apparatus for determining core positions thereof
CN102927979A (en) * 2012-11-19 2013-02-13 中国电子科技集团公司第四十四研究所 Fiber-optic gyroscope and method for detecting optical fiber coupling quality online in manufacturing process of fiber-optic gyroscope
CN104316003A (en) * 2014-10-31 2015-01-28 北京航空航天大学 Online detection device and method for polarization axis alignment in direct coupling process of polarization-preserving fiber ring and Y waveguide
CN104614803A (en) * 2015-01-27 2015-05-13 北京航空航天大学 ARM-based integrated polarization maintaining fiber axis positioning instrument
CN104713542A (en) * 2013-12-11 2015-06-17 中国航空工业第六一八研究所 Non-fusion making method of high precision optical fiber gyroscope
CN105759390A (en) * 2016-04-27 2016-07-13 北京航空航天大学 Automatic positioning and placing apparatus and method for fiber

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Yang Bo et al., "High-precision linear fitting of edges during fiber and waveguide alignment," Opto-Electronic Engineering, vol. 38, no. 11, pp. 16-22, Nov. 2011. *
Liu Zhenhua, Feng Di, Huang Huaibo, "High-accuracy automatic axis-positioning technique for panda polarization-maintaining fiber," Acta Photonica Sinica, vol. 44, no. 2, pp. 55-59, Feb. 2015. *

Similar Documents

Publication Publication Date Title
CN108646353B (en) Optical fiber-waveguide automatic alignment coupler based on image processing
CN106056587B (en) Full view line laser structured light three-dimensional imaging caliberating device and method
KR102397254B1 (en) Quantitative three-dimensional imaging of surgical scenes
CN107093195B (en) A kind of locating mark points method of laser ranging in conjunction with binocular camera
CN102360079B (en) Laser range finder and working method thereof
US20130058581A1 (en) Microscopic Vision Measurement Method Based On Adaptive Positioning Of Camera Coordinate Frame
CN102566023B (en) A kind of digital slide real time scanning automatic focusing system and method thereof
WO2021057422A1 (en) Microscope system, smart medical device, automatic focusing method and storage medium
CN105547147A (en) System and method for calibrating a vision system with respect to a touch probe
CN109612689B (en) Optical fiber end face detection method and system
CN105783880B (en) A kind of monocular laser assisted bay section docking calculation
CN106896343B (en) Servo follow-up machine vision device and dynamic tracking ranging method
CN104181685A (en) Automatic digital slide focusing device and method based on microscope
CN103837080B (en) Sub-micrometer precision coaxial confocal towards micro assemby is directed at detection method and device
CN102183221A (en) Measurement method for verticality of optical axis of microscope system
CN114820761B (en) XY direction included angle measurement and motion compensation method based on image microscopic scanning platform
CN104614803B (en) Integrated polarization-preserving fiber axis fixing instrument based on ARM
CN105004324A (en) Monocular vision sensor with triangulation ranging function
CN114578540B (en) Image technology-based adjustment method for perpendicularity between microscopic scanning objective table and objective lens
CN108805940B (en) Method for tracking and positioning zoom camera in zooming process
CN112509065A (en) Visual guidance method applied to deep sea mechanical arm operation
CN112596165A (en) Automatic coupling device for optical fiber waveguide array
CN102735220B (en) Long-focus large-field-of-view camera focal plane resetting method
CN102419157B (en) Micro-depth-dimension automatic image measuring system
CN2453411Y (en) Laser minimum light spot measuring device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant