CN116965394A - Laser weeding device - Google Patents

Laser weeding device

Info

Publication number
CN116965394A
CN116965394A (application number CN202311229392.4A)
Authority
CN
China
Prior art keywords
laser
image
image acquisition
acquisition unit
area
Prior art date
Legal status
Granted
Application number
CN202311229392.4A
Other languages
Chinese (zh)
Other versions
CN116965394B (en)
Inventor
王宪涛
王勇
王斌
王红平
周化文
顾莉栋
李振辉
刘镇忠
Current Assignee
Jilin Changhua Automotive Parts Co ltd
Ningbo Yibin Electronic Technology Corp
Original Assignee
Jilin Changhua Automotive Parts Co ltd
Priority date
Filing date
Publication date
Application filed by Jilin Changhua Automotive Parts Co ltd filed Critical Jilin Changhua Automotive Parts Co ltd
Priority to CN202311229392.4A priority Critical patent/CN116965394B/en
Publication of CN116965394A publication Critical patent/CN116965394A/en
Application granted granted Critical
Publication of CN116965394B publication Critical patent/CN116965394B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M21/00 Apparatus for the destruction of unwanted vegetation, e.g. weeds
    • A01M21/04 Apparatus for destruction by steam, chemicals, burning, or electricity
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Wood Science & Technology (AREA)
  • Zoology (AREA)
  • Environmental Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Soil Working Implements (AREA)

Abstract

The invention relates to a laser weeding device and belongs to the technical field of new agriculture. The device comprises a visual image system, a control system, a sensing system and a laser weeding assembly, the visual image system, the sensing system and the laser weeding assembly each being in communication connection with the control system. The visual image system comprises a first image acquisition unit and a second image acquisition unit; the sensing system comprises a speed measuring encoder; and the laser weeding assembly comprises a laser and a two-dimensional scanning galvanometer connected by an optical path. The control system controls the operation of the laser weeding assembly according to the data acquired by the visual image system and the sensing system, solves the field weeding problem with laser technology, and enables the device to shoot, recognize and laser-weed synchronously as it advances.

Description

Laser weeding device
Technical Field
The invention relates to a laser weeding device, and belongs to the technical field of new agriculture.
Background
Weed removal is an important step in improving crop yield, and black-soil protection together with pesticide reduction and efficiency improvement are new environmental protection requirements. At present, weed removal in China relies mainly on chemical herbicides and manual weeding. When chemicals are used, the agent permeates into the soil and damages it, reducing grain yield and seriously harming the ecological environment. Long-term herbicide use also increases the drug resistance of field weeds, so that ever higher concentrations are needed for control, and herbicide residues in crops can exceed international standards. Herbicides are harmful to humans, and careless spraying can cause poisoning. Because herbicides persist in the soil for a long time, their harmful components can be absorbed by crops and ultimately, indirectly, by the human body, causing great harm. Manual weeding, on the other hand, is labor-intensive, time-consuming, inefficient and relatively costly.
The invention provides a laser weeding device that solves the field weeding problem with laser technology. As a new agricultural production technique it replaces chemical pesticides, avoids the environmental pollution caused by herbicide use, protects black soil, reduces pesticide residues, safeguards grain safety and greatly improves production efficiency. In addition, the device can be towed by a multipurpose unmanned vehicle whose route is planned by satellite navigation, forming an efficient, intelligent and green unmanned weeding agricultural machine.
Disclosure of Invention
At least one aspect and advantage of the present invention will be set forth in part in the description that follows, or may be obvious from the description, or may be learned by practice of the presently disclosed subject matter.
To solve the problems in the prior art that laser weeding is not accurate enough and that weeding cannot be carried out while the device is advancing, the invention provides a laser weeding device.
According to a first aspect of the invention, a laser weeding device comprises a visual image system, a control system, a sensing system and a laser weeding assembly, wherein the visual image system is in communication connection with the control system, the sensing system is in communication connection with the control system, and the laser weeding assembly is in communication connection with the control system;
The visual image system comprises a first image acquisition unit and a second image acquisition unit;
the first image acquisition unit is used for acquiring first image information, and the first image information is used for determining a first weed distribution area;
the second image acquisition unit is used for acquiring second image information in the action range of the light spots, and the second image information is used for determining a second weed distribution area matched with the first weed distribution area;
the sensing system comprises a speed measuring encoder;
the laser weeding assembly comprises a laser and a two-dimensional scanning galvanometer, the laser is connected to the two-dimensional scanning galvanometer by an optical path, and the angle of the two-dimensional scanning galvanometer is adjusted based on the second weed distribution area.
According to one embodiment of the invention, a plurality of laser weeding assemblies are provided, and each laser weeding assembly is correspondingly provided with a second image acquisition unit.
According to one embodiment of the invention, one of the first image acquisition units is provided for each of the second image acquisition units.
According to one embodiment of the invention, each two or more second image acquisition units are correspondingly provided with one first image acquisition unit, and the shooting view angle and the shooting area of the first image acquisition unit are larger than those of the second image acquisition units.
According to one embodiment of the invention, the laser weeding assembly comprises a sealed box body, and the two-dimensional scanning galvanometer is arranged inside the sealed box body.
According to one embodiment of the invention, the second image acquisition unit is arranged on the lower surface of the sealing box body, and the shooting view field area of the second image acquisition unit is located between the shooting view field area of the first image acquisition unit and the scanning area of the two-dimensional scanning galvanometer.
According to one embodiment of the invention, the second image acquisition unit is arranged at the top of the inner space of the sealed box body, and a light transmission area is arranged at the bottom of the sealed box body, corresponding to the lens optical axis area of the second image acquisition unit.
According to one embodiment of the invention, the second image acquisition unit is arranged at the top of the inner space of the sealed box body, and a 45-degree dichroic mirror is arranged at the intersection of the lens optical axis and the working laser optical axis, so that the lens optical axis and the working laser optical axis are coincident.
According to one embodiment of the invention, a field lens is arranged on an emergent light path of the two-dimensional scanning galvanometer in the sealed box body.
According to one embodiment of the invention, the light outlet of the sealed box body is provided with a sealed transmission window glass, and the sealed transmission window glass is highly transmissive to the working laser.
According to one embodiment of the invention, a plurality of air holes are formed in the side edge of the bottom end of the sealing box body, the air holes are communicated with the purified gas pipeline, and the air hole outlets are aligned with the sealing transmission window glass.
According to one embodiment of the invention, the sensing system further comprises an inertial measurement unit for sensing velocity and pose information.
According to one embodiment of the invention, a first inertial measurement unit is provided corresponding to the first image acquisition unit and a second inertial measurement unit is provided corresponding to the second image acquisition unit.
According to one embodiment of the invention, a lighting device provides illumination for the first image acquisition unit and the second image acquisition unit.
According to one embodiment of the invention, the illumination device employs a standard surface light source.
According to one embodiment of the invention, a laser beam shaping mirror is further arranged in the optical path between the laser and the two-dimensional scanning galvanometer.
According to one embodiment of the invention, the device further comprises an energy source device, a cooling device and a purifying air pump.
According to one embodiment of the invention, the laser is a high power carbon dioxide laser, the power being at least 100W.
While traveling, the laser weeding device automatically identifies crop seedlings and weeds in the field and quickly ablates the weeds by focusing the laser onto their centers. The device is built from a combined arrangement of several laser weeding assemblies and is equipped with a speed measuring sensor and an inertial measurement unit that detect the traveling speed and pose of the device in real time. The first image acquisition unit identifies crop seedling features and weed coordinates, the second image acquisition unit confirms them precisely, and the control system drives the galvanometer according to the coordinate, speed and pose information so that the laser focal spot remains aligned with the weed center while the device moves. In this way the device completes weed identification, positioning and removal synchronously, and operates accurately and efficiently.
Drawings
FIG. 1 is a schematic diagram of the whole structure of an intelligent laser weeding device combined on a vehicle body;
FIG. 2 is a three-dimensional block diagram of a visual image system and a laser weeding assembly of the intelligent laser weeding device of the invention;
FIG. 3 is a bottom view of the structure shown in FIG. 2;
FIG. 4 is a cross-sectional three-dimensional view of the structure shown in FIG. 2;
FIG. 5 is a front view corresponding to the structure shown in FIG. 4;
FIG. 6 is a front view of the second image capturing unit of the present invention disposed within a sealed housing;
FIG. 7 is a diagram showing the relative positional relationship of the physical coordinate systems corresponding to the structures shown in FIG. 6;
FIG. 8 is a three-dimensional cross-sectional view of a lens axis of a second image capturing unit according to a third embodiment of the present invention;
FIG. 9 is a front view corresponding to the three-dimensional view of the structure shown in FIG. 8;
FIG. 10 is a diagram showing the relative positional relationship of the physical coordinate systems corresponding to the structure shown in FIG. 8.
1-laser, 2-supporting mechanism, 3-sealed box body, 4-first image acquisition unit, 5-first inertial measurement unit, 6-lighting device, 7-second image acquisition unit, 8-second inertial measurement unit, 9-45-degree dichroic mirror, 10-two-dimensional scanning galvanometer, 11-purifying gas pipeline, 12-field lens, 13-transmission window glass, 14-light transmission area, 15-air hole, 16-laser beam shaping mirror, 17-speed measuring encoder, 100-laser weeding assembly, 200-vehicle body.
Detailed Description
The disclosure will now be discussed with reference to several exemplary embodiments. It should be understood that these embodiments are discussed only to enable those of ordinary skill in the art to better understand and thus practice the present disclosure, and are not meant to imply any limitation on the scope of the present disclosure.
As used herein, the term "comprising" and variants thereof are to be interpreted as the open-ended term "including but not limited to". The term "based on" is to be interpreted as "based at least in part on". The terms "one embodiment" and "an embodiment" are to be interpreted as "at least one embodiment". The term "another embodiment" is to be interpreted as "at least one other embodiment". The terms "upper", "lower", "left", "right", "front", "rear", "top", "bottom", "inner", "outer", "vertical", "horizontal", "transverse", "longitudinal", etc. refer to an orientation or positional relationship based on that shown in the drawings. These terms are used only to better describe the present application and its embodiments and are not intended to limit the indicated devices, elements or components to the particular orientations, or to require that they be constructed and operated in those orientations. Some of these terms may also carry meanings other than orientation or positional relationship; for example, the term "upper" may in some cases indicate a form of attachment or connection. The specific meaning of these terms in the present application will be understood by those of ordinary skill in the art according to the specific circumstances. Furthermore, the terms "mounted", "configured", "provided" and "connected" are to be construed broadly. For example, a connection may be a fixed connection, a removable connection, or a unitary construction; it may be a mechanical connection or an electrical connection; it may be direct, indirect through intervening media, or internal communication between two devices, elements or components. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art according to the specific circumstances. Furthermore, the terms "first", "second" and the like are used primarily to distinguish between different devices, elements or components (whose particular species and configurations may be the same or different) and are not used to indicate or imply the relative importance or number of the devices, elements or components indicated. Unless otherwise indicated, "a plurality" means two or more.
According to one embodiment of the invention, a laser weeding device comprises a visual image system, a control system, a sensing system and a laser weeding assembly 100, wherein the visual image system is in communication connection with the control system, the sensing system is in communication connection with the control system, and the laser weeding assembly 100 is in communication connection with the control system;
the visual image system comprises a first image acquisition unit 4 and a second image acquisition unit 7;
the first image acquisition unit 4 is used for acquiring first image information in the travelling direction, wherein the first image information is used for determining a first weed distribution area;
the second image acquisition unit 7 is used for acquiring second image information in the action range of the light spot in the travelling direction, and the second image information is used for determining a second weed distribution area matched with the first weed distribution area;
the sensing system includes a tachometer encoder 17;
the laser weeding assembly 100 comprises a laser 1 and a two-dimensional scanning galvanometer 10, wherein the laser 1 is in optical path connection with the two-dimensional scanning galvanometer 10, and the angle of the two-dimensional scanning galvanometer 10 is adjusted based on a second weed distribution area.
In some embodiments, as shown in fig. 1, the intelligent laser weeding apparatus is incorporated on a vehicle body 200, and the vehicle body 200 can be towed by an intelligent tractor. The vehicle body 200 may be composed of a two-layered frame, an upper frame for housing an energy device, a cooling device, a purge air pump, a control system, etc., and a bottom frame housing a visual image system, a sensing system, and a laser weeding assembly 100. The tachometer encoder 17 in the sensing system may be provided on a guide wheel of the vehicle body 200.
The first image acquisition unit 4 and the second image acquisition unit 7 in the visual image system may be cameras conventional in the art or other products capable of image acquisition. The first image acquisition unit 4 may be a preprocessing camera, and the second image acquisition unit 7 may be a tracking camera.
The preprocessing camera captures images at a fixed frame rate and sends them to the control system. Based on crop information accumulated during pre-learning, the control system recognizes and distinguishes the crop seedlings in the images and stores their image features and position coordinates in real time; plants other than crop seedlings are treated as weeds, and the center coordinates of all weeds are extracted. Since the vehicle body 200 keeps traveling forward, the position coordinates of crops and weeds in the image must be updated in real time according to the speed and pose information fed back by the speed measuring encoder 17 and the inertial measurement unit.
The tracking camera rapidly captures images at a fixed frame rate and sends them to the control system, which compares and checks them against the preliminary image information provided by the preprocessing camera, quickly and accurately identifies crop seedlings and weeds, and marks their coordinates. Again, because the vehicle body 200 keeps traveling forward, the position coordinates of crops and weeds in the image must be updated in real time from the speed and pose information fed back by the speed measuring encoder 17 and the inertial measurement unit, as sketched below.
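A minimal sketch of this dead-reckoning update is given below, assuming a flat working plane. It is an illustration only, not the patent's control code; the names `WeedTrack` and `update_positions` and the example numbers are assumptions. The stored weed coordinates are shifted opposite to the vehicle's motion, using the travel increment from the speed measuring encoder and the heading from the inertial measurement unit.

```python
import math
from dataclasses import dataclass

@dataclass
class WeedTrack:
    x: float  # meters, along the travel direction in the camera's ground frame
    y: float  # meters, across the travel direction

def update_positions(weeds, encoder_distance_m, imu_heading_rad):
    """Dead-reckon stored weed coordinates after the vehicle advances.

    encoder_distance_m: travel increment reported by the speed measuring encoder
    imu_heading_rad:    current yaw reported by the inertial measurement unit
    The weeds appear to move backwards in the camera frame, so the displacement
    is subtracted from each stored coordinate.
    """
    dx = encoder_distance_m * math.cos(imu_heading_rad)
    dy = encoder_distance_m * math.sin(imu_heading_rad)
    return [WeedTrack(w.x - dx, w.y - dy) for w in weeds]

# Example: the vehicle advances 5 cm along a heading of 2 degrees.
weeds = [WeedTrack(0.40, 0.10), WeedTrack(0.75, -0.05)]
weeds = update_positions(weeds, 0.05, math.radians(2.0))
```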
The first image acquisition unit 4 and the second image acquisition unit 7 are fixedly provided at the front side, the bottom, or other positions of the vehicle body 200, such that they can capture images of crops and weeds in front of, to the side of, or below the vehicle body 200 along the direction of travel.
The first image acquisition unit 4, i.e. the preprocessing camera, may also not be mounted on the laser weeding device structure. Because of the large field of view captured by the preprocessing camera, two or more adjacent laser weeding assemblies 100 can share a single preprocessing camera and then partition the captured image for distribution to adjacent individual weeding units.
The sensing system of the present invention may include a tachometer encoder 17 for sensing the speed of travel of the laser weeding apparatus and feeding that speed back to the control system. The sensing system may also take other elements and forms to monitor the speed of travel of the laser weeding apparatus in real time.
The laser 1 employed in the laser weeding assembly 100 of the present invention can be fixedly provided on the bottom frame of the vehicle body 200 through the supporting mechanism 2, as shown in figs. 1 to 9. The supporting mechanism 2 comprises a horizontal base plate fixedly provided on the bottom frame of the vehicle body 200 and a vertically extending support plate perpendicular to it, so that the longitudinal cross-section of the base plate and the support plate is roughly L-shaped; the support plate is located at the end of the horizontal base plate in the vehicle traveling direction (direction V in fig. 1). The laser 1 is fixedly arranged on the horizontal base plate of the supporting mechanism 2, with the optical axis of the working laser extending horizontally and the light outlet facing opposite to the traveling direction. The support plate is provided with a through hole at the optical axis position for the working laser to pass through.
The two-dimensional scanning galvanometer 10 used in the laser weeding assembly 100 is driven by the galvanometer control program in the control system, which, according to the images captured by the tracking camera and the weed information processed in real time, controls the galvanometer so that the laser is aimed at the weed center and performs the weed clearing operation. Because the laser spot is small and clearing a weed takes a certain time, during which the vehicle keeps moving forward, position change information must be transmitted to the galvanometer control system so that the laser spot does not leave the weed center. This information, too, is transmitted in real time by the speed measuring encoder 17 and the inertial measurement unit.
Image capture and processing by the preprocessing camera, image capture and processing by the tracking camera, and control of the galvanometer to aim the laser at weeds and clear them are carried out in parallel.
In the embodiment of the invention, the action range of the light spot is the area that the spot can reach by adjusting the deflection of the two-dimensional scanning galvanometer in the X and/or Y directions.
First, a description will be given of a process of laser weeding according to the present invention.
The intelligent laser weeding device provided by the invention can be arranged on the vehicle body 200, and also can be arranged on an aircraft or other movable objects. The invention provides a method for identifying plants in farmland by means of multi-source image identification, and adjusts the angle of the two-dimensional scanning galvanometer 10 based on the identification result to adjust the output of laser so as to realize targeted removal of weeds.
The invention comprises at least a visual image system that includes an image acquisition unit. The image acquisition unit may comprise a CCD image acquisition module, an image buffer, a processor and a communication module, with the CCD image acquisition module, the communication module and the image buffer electrically connected to the processor. The CCD image acquisition module acquires images of the farmland in the traveling direction; the image buffer caches unsent pictures; and the processor receives the acquired images, processes them, and sends them to the control unit through the communication module, which may communicate over a cable such as an RJ45 interface or over a Bluetooth or Wi-Fi wireless module. Different caching strategies can be adopted for the images acquired by the CCD image acquisition module as required, for example no caching during acquisition and measurement, with images sent directly to the control system. A fixed sampling rate, such as 50 images per second, may be set for image acquisition.
The image acquisition unit may also be a video acquisition device, which typically includes a camera and a communication module: the camera acquires and encodes video and sends it to the control system for analysis, and the communication module may communicate over a cable such as an RJ45 interface or over a wireless module such as Bluetooth or Wi-Fi. Video should be acquired at a sufficiently high frame rate, such as 120 Hz, to provide input that matches the laser weeding heating duration.
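As a rough illustration of the caching strategies mentioned above, the sketch below shows a fixed-sampling-rate loop with two modes: direct forwarding (no cache) and a bounded buffer of unsent frames. The callables `grab_frame` and `send_to_control_system` are hypothetical placeholders for the CCD module and the communication module; the latter is assumed to return True on a successful send.

```python
import time
from collections import deque

def acquisition_loop(grab_frame, send_to_control_system,
                     rate_hz=50, use_buffer=False, buffer_size=100):
    """Capture images at a fixed rate and either forward them immediately
    or hold unsent frames in a bounded buffer (oldest frames dropped)."""
    buffer = deque(maxlen=buffer_size)
    period = 1.0 / rate_hz
    while True:
        t0 = time.monotonic()
        frame = grab_frame()                  # image plus timestamp from the CCD module
        if use_buffer:
            buffer.append(frame)              # cache the unsent frame
            while buffer and send_to_control_system(buffer[0]):
                buffer.popleft()              # drain the buffer while sends succeed
        else:
            send_to_control_system(frame)     # no caching: send directly
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
```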
The first image acquisition unit 4 is used for providing first image information acquired in the travelling process, analyzing the image information to obtain weed information contained in the first image information, and caching or summarizing the weed information to obtain a first weed distribution area in the travelling process.
When the first image information is provided through video, it is obtained by acquiring the frames in the video together with the times at which they were formed. It should be appreciated that each image corresponds to an acquisition time; when the images are analyzed, the actual position, after the vehicle has traveled for a period of time, of any location contained in the first image information can be obtained by combining the image content with the speed, vibration and offset information acquired while the vehicle travels, i.e. with the data that affect the vehicle's pose and positioning.
That is, at time t1 an image 1 is obtained, and the position of an object in image 1 is S(x, y, z) in the coordinate system of the first image acquisition unit. For any observer moving with the vehicle, the position of that object can be converted into a new coordinate point N(x', y', z') in the observer's coordinate system, provided the difference between the observer's position and the position of the image acquisition unit at the moment of acquisition is known; the observer's position is obtained by tracking the journey and superimposing the offsets of the vehicle in the travel, height and horizontal directions. The conversion may be performed with a Rodrigues matrix model, expressed as N = λ·R·S + I, where λ is a scale factor, R is a rotation matrix and I is a translation term (a numerical sketch is given below). Taking the ground as the X-Y plane, the center of the field of view of the first image acquisition unit 4 as the origin and its optical axis as the Z axis, a physical coordinate system is established, and calibration of the images captured by the first image acquisition unit establishes a mathematical correspondence between image coordinates and physical coordinates. The coordinate system of the second image acquisition unit can be determined in the same manner, and once both positions are known, the scale factor λ, the rotation matrix R and the translation term I are determined.
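The sketch below is a minimal numerical illustration of this conversion, assuming the rotation is given as an axis-angle vector. It implements the Rodrigues formula in plain NumPy and applies N = λ·R·S + T, writing the translation term as T for clarity; the function names and the example values are illustrative assumptions, not data from the patent.

```python
import numpy as np

def rodrigues(rvec):
    """Rotation matrix from an axis-angle vector (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta                          # unit rotation axis
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])        # skew-symmetric cross-product matrix
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def transform_point(S, rvec, T, lam=1.0):
    """Map a point S from the first camera's frame into the observer's frame,
    N = lam * R @ S + T."""
    R = rodrigues(np.asarray(rvec, dtype=float))
    return lam * R @ np.asarray(S, dtype=float) + np.asarray(T, dtype=float)

# Example: a weed center at S = (0.40, 0.10, 0.0) m, observer rotated 3 degrees
# about the Z axis and displaced 1.0 m along the travel direction.
N = transform_point([0.40, 0.10, 0.0],
                    [0.0, 0.0, np.radians(3.0)],
                    [-1.0, 0.0, 0.0])
```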
Further, if the offsets in the horizontal and vertical directions are not considered, i.e. the working plane is assumed to be highly flat, the offset can be determined by a velocimeter alone; when there are horizontal and vertical offsets, the pose change must be taken into account and the observer's pose determined by an inertial measurement unit such as a gyroscope.
On this basis, the second image acquisition unit 7 provided on the carrier acquires the second image information, and the distribution of the first image information within the second image information is determined.
Further, weed distribution or plant distribution in the first image information is determined by performing object recognition on the images in the first image information: the crop may be identified and the remaining plants taken as weeds, or the weeds themselves may be identified and their distribution obtained. The model recognizes images and objects on an accelerator card, CPU or GPU; since the thermal damage time of the laser, generally no more than 30 ms, exceeds the time the model needs to recognize objects, the second image acquisition unit 7 can perform matching among the recognized objects to further reduce the response time.
For example, by capturing images of target crop plants and weeds in a field for identification and marking, and extracting their characteristics by means of machine learning, a trained plant identification network is obtained that is deployed and used to identify weeds or crop plants in the field.
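The patent relies on a trained plant identification network; as a stand-in that needs no trained weights, the sketch below segments vegetation with the classical excess-green color index (ExG = 2G - R - B) and returns connected-component centroids as candidate weed centers. It only illustrates the "segment, then treat non-crop plants as weeds" idea: the threshold, the minimum area and the `is_crop` filter are placeholder assumptions, not the patent's recognition model.

```python
import numpy as np
import cv2

def weed_centers(bgr_image, exg_threshold=20, min_area=30, is_crop=lambda stats: False):
    """Segment vegetation with the excess-green index and return centroids of
    connected components that are not classified as crop plants."""
    b, g, r = [bgr_image[:, :, i].astype(np.int32) for i in range(3)]
    exg = 2 * g - r - b
    mask = (exg > exg_threshold).astype(np.uint8)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    centers = []
    for i in range(1, n):                        # label 0 is the background
        if stats[i, cv2.CC_STAT_AREA] < min_area:
            continue                             # drop small noise blobs
        if is_crop(stats[i]):                    # placeholder crop/weed decision
            continue
        centers.append(tuple(centroids[i]))      # (x, y) in pixel coordinates
    return centers
```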
The position information of the object in the image in the coordinate system of the image acquisition unit can be obtained through calibration in the image acquired by the image acquisition unit, so that the vehicle can continuously acquire images in the advancing process, and a series of weed distribution areas and space coordinates thereof can be acquired.
During travel of the vehicle, analysis of the first image information yields the weed distribution area covered by the travel track; meanwhile the second image acquisition unit 7 acquires second image information, and the second weed distribution area covered by the action range of the light spot is obtained by comparing the image features in the second image with those of the first weed distribution area.
The matching with the first weed distribution region is based on an object matching of the first weed distribution region in the second image information, which may further include a journey matching or an image matching.
In one embodiment of the present invention, a first weed distribution area corresponding to a first image is determined while processing the first image information; when the second image information is processed, acquiring first image information acquired by the first image acquisition unit 4 during the same journey according to the second image information, and acquiring a first weed distribution area according to the first image information; and performing feature matching on the second image information according to the first weed distribution area to determine a second weed distribution area.
In another embodiment of the present invention, a journey and a first weed distribution area corresponding to the journey are determined while processing the first image information; determining a first weed distribution area according to the travel of the second image information when the second image information is processed; and performing feature matching on the second image information according to the first weed distribution area to determine a second weed distribution area.
In another embodiment of the invention, when the first image information is processed, the weed distribution area in the first image is obtained and buffered for later search and matching. For example, with a step length of 0.1 m, the acquired image data are stored every 0.1 m and each frame stores only one step's worth of image. When the second image is processed, assuming the first image acquisition unit 4 and the second image acquisition unit 7 are 1.0 m apart, the weed distribution data of the preceding 10 steps can be extracted, converted into the coordinate system of the second image acquisition unit 7 and matched, yielding the second weed distribution area that matches the first weed distribution area (a buffering and matching sketch follows).
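A minimal sketch of this step-indexed buffering and matching follows, assuming the 0.1 m step and the 1.0 m camera spacing from the example above. The buffered entries are assumed to be small grayscale patches cropped around the detected weed areas, and template matching is used as one possible way to locate them in the second image; none of this is prescribed by the patent.

```python
from collections import deque
import cv2

STEP_M = 0.1                                    # storage step length (example above)
CAMERA_SPACING_M = 1.0                          # first-to-second camera spacing
LOOKBACK_STEPS = int(CAMERA_SPACING_M / STEP_M) # 10 buffered steps

history = deque(maxlen=LOOKBACK_STEPS)          # one weed patch per 0.1 m step

def store_first_image_step(weed_patch_gray):
    """Store one step's worth of first-camera weed data (called every 0.1 m)."""
    history.append(weed_patch_gray)

def match_in_second_image(second_gray):
    """Search the buffered first-camera patches in the second image and return
    the best match, i.e. the second weed distribution area."""
    best = None
    for patch in history:
        result = cv2.matchTemplate(second_gray, patch, cv2.TM_CCOEFF_NORMED)
        _, score, _, top_left = cv2.minMaxLoc(result)
        if best is None or score > best[0]:
            best = (score, top_left, patch.shape)
    return best                                  # (score, (x, y), patch size) or None
```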
Unlike the first image acquisition unit 4, the second image acquisition unit 7 is used to determine the action position of the light spot. In the present invention, the two-dimensional scanning galvanometer 10 and the laser head are connected by an optical path, and weeds are removed by thermal injury. The second image acquisition unit 7 should be positioned so that the acquired image area coincides with, or falls within, the galvanometer's spot action area; the spot then has a travel range coinciding with the image area, and its displacement can be tracked.
Once the weed region in the second image information has been determined in the manner described above, the action position of the spot is obtained by coordinate-system conversion, and the angle of the two-dimensional scanning galvanometer 10 is adjusted accordingly, so that the spot thermally injures the weed. The adjustment may be performed stepwise in units of 1 ms or 2 ms, as sketched below.
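The conversion from spot position to galvanometer angle, and the millisecond-stepped adjustment, can be sketched as follows. The geometry assumes a single working height and the usual rule that a galvo mirror deflects the beam by twice its mechanical angle; `get_target_xy` and `set_mirror_angles` are hypothetical callables standing in for the tracking pipeline and the galvanometer driver.

```python
import math
import time

WORKING_HEIGHT_M = 0.5       # assumed distance from the galvanometer to the ground

def spot_to_mirror_angles(x_m, y_m, h=WORKING_HEIGHT_M):
    """Convert a target spot position (x, y) on the ground, in the galvanometer
    coordinate system SV, into mechanical mirror angles in radians."""
    theta_x = 0.5 * math.atan2(x_m, h)           # optical deflection is 2x mechanical
    theta_y = 0.5 * math.atan2(y_m, h)
    return theta_x, theta_y

def track_weed(get_target_xy, set_mirror_angles, step_s=0.001, duration_s=0.03):
    """Re-aim the spot every 1 ms for the duration of the thermal dose so that
    the focal spot stays on the weed center while the vehicle keeps moving."""
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        x, y = get_target_xy()                   # weed center updated in real time
        set_mirror_angles(*spot_to_mirror_angles(x, y))
        time.sleep(step_s)
```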
The control system can be arranged on a vehicle, an unmanned aerial vehicle or other carriers or fixedly arranged on a building, such as a lamp post, and can communicate in a 5G/Wi-Fi mode.
According to one embodiment of the present invention, a plurality of the laser weeding assemblies 100 are provided, and each laser weeding assembly 100 is correspondingly provided with one or more second image acquisition units 7.
The second image acquisition unit 7 may be in one-to-one correspondence with the laser weeding assembly 100. In this case, the area range that each of the laser weeding assemblies 100 can scan should be equal to or larger than the range of the field area of the second image pickup unit 7 corresponding thereto.
In one embodiment of the present invention, one second image acquisition unit 7 is used to acquire the image information below the carrier, and the image area it can acquire is smaller than the whole coverage area reachable by the two-dimensional scanning galvanometer 10; for example, when the area reachable by the two-dimensional scanning galvanometer 10 is 6 square meters, the second image acquisition unit 7 can cover 1 square meter of image sampling;
in another embodiment of the present invention, three second image acquisition units 7 are used to acquire the image information below the carrier, and the image area each can acquire is smaller than the whole coverage area reachable by the two-dimensional scanning galvanometer 10; for example, when the area reachable by the two-dimensional scanning galvanometer 10 is 6 square meters, each second image acquisition unit 7 can cover 1 square meter of image sampling, and the overall actual coverage (with partial overlap) is 2 square meters, i.e. 33% coverage.
In another embodiment of the present invention, five second image acquisition units 7 are used to acquire the image information below the carrier, and the image area each can acquire is smaller than the whole coverage area reachable by the two-dimensional scanning galvanometer 10; for example, when the area reachable by the two-dimensional scanning galvanometer 10 is 3 square meters, each second image acquisition unit 7 can cover 1 square meter of image sampling, and the overall actual coverage (with partial overlap) is 3.6 square meters, i.e. 100% coverage.
According to an embodiment of the invention, one of said first image acquisition units 4 is provided for each second image acquisition unit 7.
The first image acquisition unit 4 and the laser weeding assembly 100 can be in one-to-one correspondence, and the first image acquisition unit 4 and the second image acquisition unit 7 should also meet that the weed area information provided by the former and the travelling track of the latter at least partially overlap. In this case, the area range that each laser weeding assembly 100 can scan should be equal to or greater than the range of the field area of the second image acquisition unit 7 corresponding to the area range, and the area of the whole image area acquired by the second image acquisition unit 7 and the area of the whole area of the first image acquisition unit 4 have at least overlapping areas so as to meet the requirement of weed removal in the spot action area corresponding to the second image acquisition unit 7.
In one embodiment of the present invention, the number of the first image acquisition units 4 is 3, and the first image acquisition units are used for acquiring images in front of, on the left side of and on the right side of the carrier, and in a static state, the acquisition area is 8 square meters, the number of the second image acquisition units 7 is 1, and the second image acquisition units are used for acquiring image information below the carrier, and the area of an image area which can be acquired is lower than the whole coverage area which can be achieved by the two-dimensional scanning galvanometer 10; for example, when the area that the two-dimensional scanning galvanometer 10 can reach is 6 square meters, the second image acquisition unit 7 can cover 1 square meter of image sampling, at least 40% of the images acquired by the first image acquisition unit 4 can be covered by the second image acquisition unit 7, and other images are used for analyzing other information of the farmland, such as obstacles.
In another embodiment of the present invention, the number of the first image capturing units 4 is 1, and the capturing area of the first image capturing units is 5 square meters in a static state, and the number of the second image capturing units 7 is 3, and the area of the image area that can be captured is lower than the whole coverage area that can be achieved by the two-dimensional scanning galvanometer 10; for example, when the area that can be achieved by the two-dimensional scanning galvanometer 10 is 6 square meters, each second image acquisition unit 7 can cover 1 square meter of image samples, and the whole actual coverage area (partial area overlapping) is 3 square meters, so that 50% of the coverage of the area is achieved, at least 50% of the images acquired by the first image acquisition unit 4 can be covered by the second image acquisition units 7, and other images are used for analyzing other information of farmlands, such as obstacle and ponding information.
In another embodiment of the present invention, the number of the first image capturing units 4 is 2, and the capturing area of the first image capturing units is 3.6 square meters in a static state, and the number of the second image capturing units 7 is 5, and the area of the image area that can be captured is lower than the whole coverage area that can be achieved by the two-dimensional scanning galvanometer 10; for example, when the area that can be achieved by the two-dimensional scanning galvanometer 10 is 3 square meters, each second image acquisition unit 7 can cover 1 square meter of image samples, and the whole actual coverage area (partial area overlapping) is 3.6 square meters, so that 100% coverage is achieved; at least 50% of the images acquired by the first image acquisition unit 4 may be covered by the second image acquisition unit 7.
According to one embodiment of the present invention, one first image capturing unit 4 is disposed corresponding to each two or more second image capturing units 7, and the first image capturing unit 4 captures a larger viewing angle and a larger area than the second image capturing unit 7.
In this way, the load of front-end image analysis can be reduced, thereby reducing the delay of data analysis; otherwise, if redundant images are not removed, a larger number of images must be processed, and without additional computing power delays in identifying objects may occur.
In one embodiment of the present invention, the number of the first image acquisition units 4 is 1, which are used for acquiring images in front of, on the left side of and on the right side of the carrier, the acquisition area is 8 square meters in a static state, the number of the second image acquisition units 7 is 3, which are used for acquiring image information below the carrier, and the area of an image area which can be acquired is lower than the whole coverage area which can be achieved by the two-dimensional scanning galvanometer 10; when the area that the two-dimensional scanning galvanometer 10 can reach is 6 square meters, the second image acquisition unit 7 can cover 5 square meters of image sampling, and at least 50% of the images acquired by the first image acquisition unit 4 can be covered by the second image acquisition unit 7, and other images are used for analyzing other information of farmlands, such as obstacles and ponding.
Referring to figs. 2-4, the laser weeding assembly 100 according to one embodiment of the present invention includes a sealed box body 3, and the two-dimensional scanning galvanometer 10 is disposed inside the sealed box body 3. The sealed box body 3 encloses optical devices such as the camera, the galvanometer and the field lens 12; it is connected to the supporting mechanism 2 of the laser 1 with the laser light path as reference, can be fixedly arranged on the bottom frame of the vehicle body 200, and the joint must be sealed and dust-proof.
Referring to figs. 4-5, according to an embodiment of the present invention, the second image acquisition unit 7 is disposed on the lower surface of the sealed box body 3, and its shooting field-of-view area is located between the shooting field-of-view area of the first image acquisition unit 4 and the scanning area of the two-dimensional scanning galvanometer 10. That is, the first image acquisition unit 4, the second image acquisition unit 7 and the two-dimensional scanning galvanometer 10 are arranged in this order along the traveling direction of the vehicle.
Referring to fig. 6, according to an embodiment of the present invention, the second image capturing unit 7 is disposed at the top of the inner space of the sealed housing 3, and correspondingly, a light-transmitting area 14 is disposed at the bottom of the sealed housing 3 corresponding to the lens optical axis area of the second image capturing unit 7, and the light-transmitting area 14 is highly transparent to visible light, and the second image capturing unit 7 can capture a scene outside the sealed housing 3 through the light-transmitting area 14.
Based on the embodiment shown in figs. 4-6, a physical coordinate system SP is established for the first image acquisition unit 4, with the ground as the X-Y plane, the center of its field of view as the origin and its optical axis as the Z axis; by calibration, a mathematical correspondence is established between the images captured by the first image acquisition unit 4 and the physical coordinates.
The second image acquisition unit 7 takes the ground as an X-Y plane, the center of a field of view shot by the second image acquisition unit 7 as an origin, and the optical axis of the second image acquisition unit 7 as a Z axis, and a physical coordinate system ST is established. By calibration, the image coordinate and the physical coordinate of the image shot by the second image acquisition unit 7 can establish a mathematical corresponding relation.
The two-dimensional scanning galvanometer 10 takes the ground as an X-Y plane, the corresponding working laser path at the initial position of the zero point of the galvanometer as a Z axis, and the center of a light spot on the plane is taken as an origin, so as to establish an initial coordinate system SV of the galvanometer. In the working process of the galvanometer, the laser light spots are directed to any position in the scanning area by controlling the positions of the X-direction lens and the Y-direction lens.
Referring to fig. 7, the physical coordinate systems SP, ST and SV have definite mathematical correspondences with one another, and after calibration the images and the galvanometer control positions also have definite mathematical correspondences. A fixed mathematical relationship therefore exists between any pixel coordinates of the image captured by the first image acquisition unit 4 and those of the image captured by the second image acquisition unit 7. By controlling the two-dimensional scanning galvanometer 10, the laser spot can be directed to the physical position corresponding to any pixel of the image captured by the second image acquisition unit 7 within the scannable range, as illustrated by the calibration sketch below.
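One simple way to realize the calibrated pixel-to-spot correspondence described above is an affine least-squares fit between pixel coordinates in the second camera's image (system ST) and the galvanometer commands that place the spot on those pixels (system SV). The sketch below assumes such calibration pairs have already been collected, for example by firing marker spots and reading their pixel positions; the data layout and numbers are illustrative.

```python
import numpy as np

def fit_pixel_to_galvo(pixels, galvo_cmds):
    """Least-squares affine map from image pixels (u, v) to galvo commands (gx, gy).

    pixels:     (N, 2) array of pixel coordinates of calibration spots
    galvo_cmds: (N, 2) array of the galvo commands that produced those spots
    Returns a 2 x 3 matrix A such that [gx, gy] = A @ [u, v, 1].
    """
    P = np.hstack([np.asarray(pixels, dtype=float), np.ones((len(pixels), 1))])
    A, *_ = np.linalg.lstsq(P, np.asarray(galvo_cmds, dtype=float), rcond=None)
    return A.T

def pixel_to_galvo(A, u, v):
    """Galvo command that aims the laser spot at pixel (u, v)."""
    return A @ np.array([u, v, 1.0])

# Example with synthetic calibration data (illustrative only).
px = np.array([[100, 100], [500, 100], [100, 400], [500, 400]], dtype=float)
gv = np.array([[-0.2, -0.2], [0.2, -0.2], [-0.2, 0.2], [0.2, 0.2]])
A = fit_pixel_to_galvo(px, gv)
command = pixel_to_galvo(A, 300, 250)
```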
Referring to figs. 8-9, according to an embodiment of the present invention, the second image acquisition unit 7 is disposed at the top of the inner space of the sealed box body 3, and a 45-degree dichroic mirror 9 is disposed at the intersection of the lens optical axis and the working laser optical axis. The 45-degree dichroic mirror 9 is highly transmissive to the working laser and highly reflective to the imaging visible light, and it is adjusted so that the lens optical axis coincides with the laser optical axis. With this mounting arrangement, the center of the field of view of the second image acquisition unit 7 always coincides with the spot center of the working laser during operation, and the field lens 12 is not installed. This advantageously improves the accuracy with which the laser is aligned to the weeds. The positional relationship of the physical coordinate systems in this case is shown in fig. 10.
According to an embodiment of the present invention, as shown in fig. 4, a field lens 12 is disposed on the outgoing light path of the two-dimensional scanning galvanometer 10 in the sealed case 3. The field lens 12 can determine whether to install according to the light spot size and the working distance after the laser 1 is shaped.
According to an embodiment of the present invention, as shown in fig. 4, a sealed transmission window glass 13 is disposed at the light outlet of the sealed case 3, and the sealed transmission window glass 13 is highly transparent to the working laser. In the case where the 45-degree dichroic mirror 9 is provided, high transmission of imaging visible light is also required.
According to one embodiment of the present invention, as shown in fig. 4, the bottom side of the sealed box body 3 is provided with a plurality of air holes 15, the air holes 15 communicate with the purifying gas pipeline 11, and the outlets of the air holes 15 are aimed at the sealed transmission window glass 13. The purifying gas pipeline 11 blows purified air through the air holes 15 to blow away dust and keep the transmission window glass 13 clean.
According to one embodiment of the invention, the sensing system further comprises an inertial measurement unit for sensing velocity and pose information.
According to one embodiment of the invention, as shown in fig. 2-6, a first inertial measurement unit 5 is provided corresponding to the first image acquisition unit 4 and a second inertial measurement unit 8 is provided corresponding to the second image acquisition unit 7. The first inertial measurement unit 5 is mounted laterally of the first image acquisition unit 4. The first inertial measurement unit 5 can solve the velocity and pose information and provide data for the image captured by the first image capturing unit 4. The second inertial measurement unit 8 is mounted laterally of the second image acquisition unit 7. The second inertial measurement unit 8 can solve the velocity and pose information and provide data for the image captured by the second image capturing unit 7.
According to one embodiment of the invention, the device further comprises a lighting device 6 for providing illumination for the first image acquisition unit 4 and the second image acquisition unit 7. The lighting device 6 may comprise a first light source, a second light source and a third light source arranged adjacent to the first image acquisition unit 4, the second image acquisition unit 7 and the transmission window glass 13, respectively, as shown in fig. 2. The lighting device 6 may be mounted on the laser weeding device, or it may instead be suspended below the lower frame of the vehicle body 200 by a separate lamp strip structure.
The light source can further be provided with an optical filter so that only light of a specific wavelength is transmitted, which specifically improves the recognition of plants.
According to one embodiment of the present invention, the illumination device 6 employs a standard surface light source.
According to one embodiment of the present invention, the optical paths of the laser 1 and the two-dimensional scanning galvanometer 10 are further provided with a laser beam shaping mirror 16. Preferably, a laser beam shaping mirror 16 is provided on the laser 1 support mechanism 2, as shown in fig. 4.
According to one embodiment of the invention, the device further comprises an energy source device, a cooling device and a purifying air pump.
According to one embodiment of the invention, the laser 1 is a high-power carbon dioxide laser with a power of at least 100 W.
The beneficial effects of the application are as follows: the laser weeding device can carry out image acquisition, weed identification, positioning and indexing, and precise removal synchronously while it advances. It weeds accurately and quickly and effectively improves agricultural productivity.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus and device described above may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative.
the above description is only illustrative of the preferred embodiments of the present application and of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the application referred to in the present application is not limited to the specific combinations of the technical features described above, but also covers other technical features formed by any combination of the technical features described above or their equivalents without departing from the inventive concept. Such as the above-mentioned features and the technical features disclosed in the present application (but not limited to) having similar functions are replaced with each other.
It should be understood that, the sequence numbers of the steps in the summary and the embodiments of the present invention do not necessarily mean the order of execution, and the execution order of the processes should be determined by the functions and the internal logic, and should not be construed as limiting the implementation process of the embodiments of the present invention. The foregoing description of implementations of the present disclosure has been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit the disclosure to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosure. The embodiments were chosen and described in order to explain the principles of the present disclosure and its practical application to enable one skilled in the art to utilize the present disclosure in various embodiments and with various modifications as are suited to the particular use contemplated.

Claims (15)

1. The laser weeding device is characterized by comprising a visual image system, a control system, a sensing system and a laser weeding assembly, wherein the visual image system is in communication connection with the control system, the sensing system is in communication connection with the control system, and the laser weeding assembly is in communication connection with the control system;
The visual image system comprises a first image acquisition unit and a second image acquisition unit;
the first image acquisition unit is used for acquiring first image information, and the first image information is used for determining a first weed distribution area;
the second image acquisition unit is used for acquiring second image information in the action range of the light spots, and the second image information is used for determining a second weed distribution area matched with the first weed distribution area;
the sensing system comprises a speed measuring encoder;
the laser weeding assembly comprises a laser and a two-dimensional scanning galvanometer, the laser is connected to the two-dimensional scanning galvanometer by an optical path, and the angle of the two-dimensional scanning galvanometer is adjusted based on the second weed distribution area.
2. The laser weeding device according to claim 1, wherein two or more laser weeding assemblies are provided, and each laser weeding assembly is provided with a second image acquisition unit.
3. A laser weeding device according to claim 2, wherein each second image-capturing unit is provided with one first image-capturing unit.
4. A laser weeding device according to claim 2, wherein each two or more second image acquisition units is provided with one first image acquisition unit, and the first image acquisition unit has a larger shooting angle and larger shooting area than the second image acquisition units.
5. A laser weeding apparatus according to claim 2, wherein the laser weeding assembly comprises a sealed housing, and the two-dimensional scanning galvanometer is disposed inside the sealed housing.
6. A laser weeding device according to claim 5, wherein the second image acquisition unit is disposed on the lower surface of the seal housing, and the imaging field of view of the second image acquisition unit is located between the imaging field of view of the first image acquisition unit and the scanning area of the two-dimensional scanning galvanometer.
7. The laser weeding device according to claim 5, wherein the second image acquisition unit is disposed at the top of the inner space of the sealed housing, and a light transmission area is disposed at the bottom of the sealed housing corresponding to the lens optical axis area of the second image acquisition unit.
8. The laser weeding device according to claim 5, wherein the second image acquisition unit is arranged at the top of the inner space of the sealed box body, and a 45-degree dichroic mirror is arranged at the intersection of the lens optical axis and the working laser optical axis, so that the lens optical axis and the working laser optical axis are coincident.
9. The laser weeding device according to claim 5, wherein a field lens is arranged on the emergent light path of the two-dimensional scanning galvanometer in the sealed box body.
10. The laser weeding device according to claim 5, wherein the light outlet of the sealed housing is provided with a sealed transmission window glass, and the sealed transmission window glass is highly transparent to the working laser.
11. The laser weeding device according to claim 10, wherein a plurality of air holes are provided on the bottom side of the sealed housing, the air holes communicate with a purified-gas pipeline, and the outlets of the air holes are aligned with the sealed transmission window glass.
12. The laser weeding device according to claim 1, wherein the sensing system further comprises an inertial measurement unit for sensing velocity and pose information.
13. The laser weeding device according to claim 12, wherein a first inertial measurement unit is provided in correspondence with the first image acquisition unit, and a second inertial measurement unit is provided in correspondence with the second image acquisition unit.
14. The laser weeding device according to claim 1, further comprising illumination means for providing illumination to the first image acquisition unit and the second image acquisition unit.
15. The laser weeding device according to claim 1, wherein a laser beam shaping mirror assembly is further provided between the laser and the two-dimensional scanning galvanometer.
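
For orientation only, the claims above can be read as a single sensing-and-actuation loop: the first image acquisition unit detects weeds over a wide field, the speed-measuring encoder tracks how far the device has since travelled, the second image acquisition unit confirms each target once it enters the laser spot range, and the two-dimensional scanning galvanometer is then steered onto it. The Python sketch below illustrates one possible arrangement of that loop under stated assumptions; all class, function and parameter names (WeedTarget, galvo_angles, control_loop, detect_weeds, match_weed, camera_to_spot_mm, the f-theta focal length, and the camera/encoder/galvanometer/laser interfaces) are hypothetical and are not part of the patent disclosure.

# Minimal, non-limiting control-loop sketch for the device of claims 1, 8, 9 and 12.
# Every name below is an assumption of this illustration, not disclosed in the patent.
import math
import time
from dataclasses import dataclass


@dataclass
class WeedTarget:
    x_mm: float  # lateral offset from the reference frame centre, in mm
    y_mm: float  # along-track offset (positive in the travel direction), in mm


def galvo_angles(target: WeedTarget, focal_length_mm: float = 160.0):
    """Map an offset in the work plane to two mirror angles (degrees).

    Assumes an f-theta style field lens on the galvanometer exit path
    (claim 9), so deflection angle is roughly proportional to offset.
    """
    theta_x = math.degrees(target.x_mm / focal_length_mm)
    theta_y = math.degrees(target.y_mm / focal_length_mm)
    return theta_x, theta_y


def control_loop(wide_cam, spot_cam, encoder, galvo, laser,
                 detect_weeds, match_weed,
                 camera_to_spot_mm: float, dwell_s: float = 0.05) -> None:
    """One weeding cycle while the carrier keeps moving forward.

    wide_cam / spot_cam - first / second image acquisition units; each must
                          provide a grab() method returning a frame.
    encoder             - speed-measuring encoder; distance_mm() returns
                          cumulative travel.
    galvo, laser        - expose move_to(theta_x, theta_y) and fire(seconds).
    detect_weeds        - callable: wide frame -> list[WeedTarget], offsets
                          relative to the wide camera's frame centre.
    match_weed          - callable: (spot frame, WeedTarget) -> WeedTarget or None,
                          offsets relative to the laser spot centre.
    camera_to_spot_mm   - along-track distance from the laser spot centre to
                          the wide camera's frame centre.
    """
    # 1. Wide-field detection: the first weed distribution area.
    frame = wide_cam.grab()
    odo_at_frame = encoder.distance_mm()
    candidates = detect_weeds(frame)

    # Treat the weeds in the order they will arrive over the spot.
    for weed in sorted(candidates, key=lambda w: w.y_mm):
        travel_needed = camera_to_spot_mm + weed.y_mm

        # 2. Wait until forward travel brings the weed into the spot range.
        while encoder.distance_mm() - odo_at_frame < travel_needed:
            time.sleep(0.001)

        # 3. Near-field confirmation: the second weed distribution area,
        #    matched against the wide-field detection.
        refined = match_weed(spot_cam.grab(), weed)
        if refined is None:
            continue  # lost, occluded, or already treated

        # 4. Steer the two-dimensional galvanometer onto the target and fire.
        theta_x, theta_y = galvo_angles(refined)
        galvo.move_to(theta_x, theta_y)
        laser.fire(dwell_s)

In this sketch the encoder reading is the only link between the two camera frames; an inertial measurement unit (claim 12) could refine the same travel estimate with attitude information, and the coaxial dichroic-mirror arrangement of claim 8 is what could allow the second camera's image offsets to be treated directly as offsets from the laser spot centre.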
CN202311229392.4A 2023-09-22 2023-09-22 Laser weeding device Active CN116965394B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311229392.4A CN116965394B (en) 2023-09-22 2023-09-22 Laser weeding device

Publications (2)

Publication Number Publication Date
CN116965394A (en) 2023-10-31
CN116965394B (en) 2023-12-12

Family

ID=88473425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311229392.4A Active CN116965394B (en) 2023-09-22 2023-09-22 Laser weeding device

Country Status (1)

Country Link
CN (1) CN116965394B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102314693A (en) * 2011-07-27 2012-01-11 中国科学院长春光学精密机械与物理研究所 Dual mode video target identification tracking system
US20150075068A1 (en) * 2013-09-13 2015-03-19 Palo Alto Research Center Incorporated Unwanted plant removal system
CN205357927U (en) * 2016-03-03 2016-07-06 国网安徽省电力公司检修公司 Remote control spouts medicine formula weeding robot
CN106561093A (en) * 2016-10-27 2017-04-19 中国农业大学 Laser weeding robot based on four-degree-of-freedom parallel mechanism
CN111897349A (en) * 2020-07-08 2020-11-06 南京工程学院 Underwater robot autonomous obstacle avoidance method based on binocular vision
WO2022038363A1 (en) * 2020-08-20 2022-02-24 Arwac Ltd Agricultural machine
DE202022103059U1 (en) * 2022-05-31 2022-06-14 Abhilashi College Of Education A robotic system for weed control and crop spraying
CN114946805A (en) * 2022-06-14 2022-08-30 清华大学 Laser fiber weeding and pest killing system
CN115393806A (en) * 2022-09-20 2022-11-25 青岛华兴海洋工程技术有限公司 Ship body posture monitoring system and method based on visual technology
CN115648163A (en) * 2022-10-21 2023-01-31 湘潭大学 Weeding robot with weed recognition function

Also Published As

Publication number Publication date
CN116965394B (en) 2023-12-12

Similar Documents

Publication Publication Date Title
US20200074176A1 (en) Method and arrangement for condition monitoring of an installation with operating means
US11849207B2 (en) Inspection system for use in monitoring plants in plant growth areas
US9488630B2 (en) Integrated remote aerial sensing system
JP5020444B2 (en) Crop growth measuring device, crop growth measuring method, crop growth measuring program, and computer-readable recording medium recording the crop growth measuring program
JP4980606B2 (en) Mobile automatic monitoring device
CN106527426A (en) Indoor multi-target track planning system and method
CN109084735B (en) A kind of tunnel monitoring abnormal state method based on unmanned plane device
WO2004021692A2 (en) Retinal array compound camera system
CN110244314A (en) One kind " low slow small " target acquisition identifying system and method
CN114241177A (en) Airport pavement apparent image detection system based on linear array scanning imaging
CN108897342B (en) Positioning and tracking method and system for fast-moving civil multi-rotor unmanned aerial vehicle
CN110487730A (en) Crop field phenotype high-throughout monitoring system and monitoring method
CN107390699B (en) Route planning system and route planning method of sugarcane planter
JPH07270518A (en) Distance measuring instrument
ES2863246T3 (en) Object image recognition and instant active response with improved app and utility
CN106846385B (en) Multi-sensing remote sensing image matching method, device and system based on unmanned aerial vehicle
CN108170139A (en) A kind of photoelectricity multitask system for unmanned boat and perform method
CN116965394B (en) Laser weeding device
CN108196538A (en) Three-dimensional point cloud model-based field agricultural robot autonomous navigation system and method
CN211293749U (en) A robot is walked by oneself in field for breeding district survey is produced
JP2006101816A (en) Method and apparatus for controlling steering
EP3373092B1 (en) Method for locating a fault of a system
CN115390582A (en) Point cloud-based multi-rotor unmanned aerial vehicle tracking and intercepting method and system
CN207264197U (en) A kind of route planning system of sugarcane planting machine
CN113055646A (en) Automatic intelligent insect pest situation forecasting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right
Effective date of registration: 20240419
Address after: 315300 Industrial Park on the East Side of Xingye North Road, Zhouxiang Town, Cixi City, Ningbo City, Zhejiang Province
Patentee after: NINGBO YIBIN ELECTRONIC TECHNOLOGY Corp.
Country or region after: China
Patentee after: Jilin Changhua Automotive Parts Co.,Ltd.
Address before: No. 5 Dongsheng Road, Gongzhuling Economic Development Zone, Changchun City, Jilin Province, 136105
Patentee before: Jilin Changhua Automotive Parts Co.,Ltd.
Country or region before: China