CN220141574U - Radar assembly and robot - Google Patents



Publication number
CN220141574U
CN220141574U (application CN202320989859.4U)
Authority
CN
China
Prior art keywords
light
obstacle
obstacle avoidance
receiving
receiving unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202320989859.4U
Other languages
Chinese (zh)
Inventor
袁磊
郝树林
陈悦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huanchuang Technology Co ltd
Original Assignee
Shenzhen Camsense Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Camsense Technologies Co Ltd filed Critical Shenzhen Camsense Technologies Co Ltd
Priority to CN202320989859.4U priority Critical patent/CN220141574U/en
Application granted granted Critical
Publication of CN220141574U publication Critical patent/CN220141574U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The utility model relates to the technical field of robots, and in particular to a radar assembly and a robot. The radar assembly comprises a transmitting unit, a receiving unit and a processing unit. The transmitting unit is used for transmitting mapping light and obstacle avoidance light; the receiving unit is used for receiving the light reflected by an obstacle from the mapping light and the obstacle avoidance light; and the processing unit is used for performing mapping and obstacle avoidance according to the reflected light received by the receiving unit. With the radar assembly and the robot, mapping and obstacle avoidance can both be completed by a single radar assembly, which reduces the cost of the radar assembly and avoids both the increased cost of using a camera and a camera's inability to map and avoid obstacles accurately at night.

Description

Radar assembly and robot
Technical Field
The utility model relates to the technical field of robots, in particular to a radar assembly and a robot.
Background
A floor sweeping robot, also called an automatic sweeper, intelligent vacuum cleaner or robot vacuum cleaner, is an intelligent household appliance that completes floor cleaning automatically by means of artificial intelligence, freeing people from tedious housework and giving them more leisure time.
The capabilities of mapping navigation and obstacle recognition are an important basis for the efficient operation of a sweeping robot. At present, the mapping navigation modes of sweeping robots mainly comprise LDS lidar navigation, vSLAM visual navigation and dTOF laser navigation, and the obstacle avoidance modes mainly comprise line-laser obstacle avoidance, infrared obstacle avoidance and 3D TOF obstacle avoidance. Mapping navigation and obstacle recognition are usually realized by different radars, which is costly. Some sweeping robots realize both through a single camera, but the camera increases cost and cannot map and avoid obstacles accurately at night.
Disclosure of Invention
The embodiments of the utility model aim to provide a radar assembly and a robot that at least solve the problems that a radar assembly is costly and that a camera cannot map and avoid obstacles accurately at night.
In order to solve the above technical problems, one technical scheme adopted by the embodiments of the utility model is as follows: a radar assembly is provided, comprising a transmitting unit, a receiving unit and a processing unit. The transmitting unit is used for transmitting mapping light and obstacle avoidance light; the receiving unit is used for receiving the light reflected by an obstacle from the mapping light and the obstacle avoidance light; the processing unit is used for performing mapping and obstacle avoidance according to the reflected light received by the receiving unit.
In some embodiments, the transmitting unit includes a light emitting element and a light splitting element, the light splitting element being configured to split the light emitted by the light emitting element into at least two sub-rays, the at least two sub-rays including the mapping light and the obstacle avoidance light.
In some embodiments, the transmitting unit includes a first light emitting element for emitting the mapping light and a second light emitting element for emitting the obstacle avoidance light.
In some embodiments, the first light emitting member is a point laser and the second light emitting member is a line laser.
In some embodiments, the obstacle avoidance ray is located below the mapping ray.
In some embodiments, the receiving unit comprises an area array sensor.
In some embodiments, the area array sensor includes a first sensing area for receiving the light reflected by the obstacle from the mapping light and a second sensing area for receiving the light reflected by the obstacle from the obstacle avoidance light.
In some embodiments, the receiving unit includes a first sensor for receiving the light reflected by the obstacle from the mapping light and a second sensor for receiving the light reflected by the obstacle from the obstacle avoidance light.
In some embodiments, the radar assembly further comprises a mounting portion, the transmitting unit and the receiving unit being both disposed at the mounting portion.
In some embodiments, the radar assembly further comprises a drive portion coupled to the mounting portion for driving rotational movement of the mounting portion about a vertical axis.
In order to solve the technical problems, another technical scheme adopted by the embodiment of the utility model is as follows: there is provided a robot comprising the radar assembly described above.
Different from the related art, the radar assembly and the robot provided by the embodiments of the utility model comprise a transmitting unit, a receiving unit and a processing unit. The transmitting unit can transmit mapping light and obstacle avoidance light, the receiving unit can receive the mapping light and the obstacle avoidance light reflected by an obstacle, and the processing unit can perform mapping and obstacle avoidance according to the reflected light received by the receiving unit. Mapping and obstacle avoidance can thus both be completed by a single radar assembly, which reduces the cost of the radar assembly and avoids both the increased cost of using a camera and a camera's inability to map and avoid obstacles accurately at night.
The foregoing description is only an overview of the technical scheme of the utility model. In order that the technical means of the utility model may be more clearly understood and implemented in accordance with the content of the specification, and to make the above and other objects, features and advantages of the utility model more readily apparent, specific embodiments are set forth below.
Drawings
One or more embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like reference numerals indicate similar elements; the figures are not drawn to scale unless expressly stated otherwise.
Fig. 1 is a schematic structural view of a robot according to an embodiment of the present utility model;
FIG. 2 is a cross-sectional view of a lidar of an embodiment of the present utility model;
FIG. 3 is a schematic diagram of a transmitting unit according to an embodiment of the present utility model;
fig. 4 is a schematic structural view of a transmitting unit according to another embodiment of the present utility model;
fig. 5 is a schematic structural diagram of a receiving unit according to an embodiment of the present utility model;
fig. 6 is a schematic diagram of a receiving unit according to another embodiment of the present utility model.
Reference numerals in the specific embodiments are as follows:
100. a robot; 1. a housing; 2. a radar assembly;
21. a transmitting unit; 211. a light emitting member; 212. a light splitting member; 213. a first light emitting member; 214. a second light emitting member;
22. a receiving unit; 221. a planar array sensor; 2211. a first sensing region; 2212. a second sensing region; 222. a first sensor; 223. a second sensor;
23. a processing unit; 24. a circuit board; 25. a mounting part; 251. a through hole; 26. a driving part.
Detailed Description
In order that the utility model may be readily understood, a more particular description thereof will be rendered by reference to specific embodiments that are illustrated in the appended drawings. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the utility model. It will be understood that when an element is referred to as being "fixed" to another element, it can be directly on the other element or one or more intervening elements may be present therebetween. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or one or more intervening elements may be present therebetween.
In the description of the present utility model, it should be noted that, orientation words such as "front, rear, upper, lower, left, right", "transverse, vertical, horizontal", and "top, bottom", etc., are generally based on the orientation or positional relationship shown in the drawings, and are merely for convenience of describing the present utility model and simplifying the description, and these orientation words do not indicate or imply that the apparatus or elements being referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the scope of protection of the present utility model; the orientation word "inner and outer" refers to inner and outer relative to the contour of the respective component itself.
In the description of the present utility model, it should be noted that, the terms "first," "second," and the like are used for defining the components, and are merely for convenience in distinguishing the corresponding components, and the terms have no special meaning unless otherwise stated, so they should not be construed as limiting the scope of the present utility model. In the description of the embodiments of the present utility model, the meaning of "plurality" is two or more unless otherwise specifically defined.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this utility model belongs. The terminology used in the description of the utility model herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the utility model. The term "and/or" as used in this specification includes any and all combinations of one or more of the associated listed items.
In addition, the technical features mentioned in the different embodiments of the utility model described below can be combined with one another as long as they do not conflict with one another.
The radar assembly provided by the embodiments of the utility model can be arranged in any suitable electronic equipment, such as a robot, industrial equipment, household equipment, an unmanned automobile and the like. In this embodiment, the robot may be configured with any suitable function to complete the corresponding operations, such as a cleaning robot, a sweeping robot, and the like.
As shown in fig. 1, the robot 100 includes a housing 1 and a radar assembly 2, the radar assembly 2 being disposed on top of the housing 1. The radar assembly 2 may also be disposed elsewhere on the housing 1, for example on a side wall of the housing 1, or inside the housing 1 when the housing 1 is light-transmitting, or at a notch formed in the housing 1; it suffices that the radar assembly 2 can detect the obstacles and the environment around the robot 100.
In some embodiments, the robot 100 is a sweeping robot and may include one or more of a driving assembly, a cleaning assembly, a communication assembly, an audio assembly, a light supplementing assembly, a controller and a power supply assembly. The driving assembly drives the robot 100 to move, the communication assembly allows the robot 100 to communicate with external devices, the audio assembly emits sound to inform the user of the state of the robot 100, the light supplementing assembly provides illumination, the power supply assembly provides power, and the controller controls the radar assembly 2, the driving assembly, the cleaning assembly, the communication assembly, the audio assembly and the light supplementing assembly.
As for the radar assembly 2 described above, as shown in fig. 2, the radar assembly 2 includes a transmitting unit 21, a receiving unit 22 and a processing unit 23, where the transmitting unit 21 is configured to transmit a mapping light and an obstacle avoidance light; the receiving unit 22 is used for receiving the light reflected by the obstacle from the mapping light and the obstacle avoidance light; the processing unit 23 is used for performing mapping and obstacle avoidance according to the reflected light received by the receiving unit 22. Mapping means that a point cloud image is generated from the received reflected mapping light so as to establish a spatial model of the environment around the robot 100; from the spatial model, the traveling path of the robot 100 can be determined, and when the robot 100 is a sweeping robot, its cleaning path can also be determined. Obstacle avoidance refers to determining the distance to an obstacle according to the received reflected obstacle avoidance light, and controlling the traveling direction and speed of the robot 100 according to that distance so as to prevent the robot 100 from colliding with the obstacle.
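The two roles described above can be sketched in code. The following Python fragment is an illustration only: the function names, the (angle, distance) measurement format and the distance thresholds are assumptions for the example, not part of the disclosure.

```python
import math

def build_point_cloud(measurements):
    """Turn (angle_rad, distance_m) returns of the mapping light into 2D
    Cartesian points, from which a spatial model of the surroundings can
    be assembled."""
    return [(d * math.cos(a), d * math.sin(a)) for a, d in measurements]

def avoidance_command(obstacle_distance_m, stop_m=0.05, slow_m=0.30):
    """Turn a distance measured with the obstacle avoidance light into a
    motion command: stop when very close, slow when near, else cruise."""
    if obstacle_distance_m <= stop_m:
        return "stop"
    if obstacle_distance_m <= slow_m:
        return "slow"
    return "cruise"

# two example returns: straight ahead at 1 m, 90 degrees left at 2 m
points = build_point_cloud([(0.0, 1.0), (math.pi / 2, 2.0)])
```

Mapping accumulates such points over many frames, while obstacle avoidance only needs the latest distance in the direction of travel.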
Because mapping acquires a spatial model of the wider surroundings while obstacle avoidance needs to probe obstacles near the robot 100, especially on the ground, the obstacle avoidance light can be located below the mapping light. This reduces the overlap between the area scanned by the obstacle avoidance light and the area scanned by the mapping light, i.e., it reduces the area each ray scans needlessly, and thereby improves the efficiency of mapping and obstacle avoidance. Optionally, the mapping light is emitted horizontally or above the horizontal, and the obstacle avoidance light is emitted below the horizontal.
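The geometry behind emitting the obstacle avoidance light below the horizontal can be illustrated with a short calculation; the mount height and tilt used here are assumed example values, not values from the disclosure.

```python
import math

def ground_hit_distance_m(mount_height_m: float, downward_tilt_rad: float) -> float:
    """Distance ahead at which a ray tilted below the horizontal meets the
    ground: h / tan(t). This is why a downward ray covers the near ground
    that a horizontal mapping ray never touches."""
    return mount_height_m / math.tan(downward_tilt_rad)

# e.g. a ray emitted 0.1 m above the floor, tilted down by atan(0.1)
# (about 5.7 degrees), reaches the floor roughly 1 m ahead of the robot
d = ground_hit_distance_m(0.1, math.atan(0.1))
```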
For the above-mentioned transmitting unit 21, the transmitting unit 21 is configured to emit at least two light rays, including the mapping light and the obstacle avoidance light, and may do so in various manners. For example, as shown in fig. 3, the transmitting unit 21 may include a light emitting element 211 and a light splitting element 212, the light emitting element 211 being configured to emit light and the light splitting element 212 being configured to split that light into at least two sub-rays including the mapping light and the obstacle avoidance light; the dotted line in the figure is the horizontal plane, and the mapping light and the obstacle avoidance light are oriented above and below the horizontal plane, respectively. Alternatively, as shown in fig. 4, the transmitting unit 21 may include at least a first light emitting element 213 for emitting the mapping light and a second light emitting element 214 for emitting the obstacle avoidance light; again, the dotted line in the figure is the horizontal plane, and the mapping light and the obstacle avoidance light face above and below it, respectively.
The light emitting element 211, the first light emitting element 213 and the second light emitting element 214 may be lasers; compared with a common light source, laser light has better monochromaticity and directivity and higher brightness. The first light emitting element 213 may be a point laser that emits point laser light for mapping, and the second light emitting element 214 may be a line laser that emits line laser light for obstacle avoidance. The light splitting element 212 may be a Powell lens, which can divide light into a plurality of rays with uniform optical density, good stability and good linearity; the line generated by a Powell lens is superior to that of a cylindrical lens, as it eliminates the central hot spot and fading edges of a Gaussian beam.
For the receiving unit 22, the receiving unit 22 is configured to receive at least the light reflected by the obstacle from the mapping light and the light reflected by the obstacle from the obstacle avoidance light. The receiving unit 22 may receive both reflected rays simultaneously through one sensor, or may receive them separately through a plurality of sensors. Alternatively, the receiving unit 22 may be a camera.
When both reflected rays are received by one sensor, as shown in fig. 5, the sensor may be an area array sensor 221, i.e., the receiving unit 22 includes the area array sensor 221, which may be a CMOS sensor or a CCD sensor. The area array sensor 221 has a large photosensitive surface and can receive the light of a whole region in a short time, thereby improving data acquisition efficiency.
In some embodiments, as shown in fig. 5, the area array sensor 221 includes a first sensing area 2211 and a second sensing area 2212, the first sensing area 2211 being used for receiving the light reflected by the obstacle from the mapping light and the second sensing area 2212 being used for receiving the light reflected by the obstacle from the obstacle avoidance light, so as to separate the light spots formed on the photosensitive surface by the mapping light and the obstacle avoidance light and reduce interference between them. In addition, the first sensing area 2211 and the second sensing area 2212 can work independently and output a first image and a second image respectively, the first image reflecting the light spot formed by the mapping light and the second image reflecting the light spot formed by the obstacle avoidance light, so that the analysis efficiency of the images can be improved.
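The independent handling of the two sensing areas can be sketched as slicing one frame into two images; the frame shape and split row below are assumptions for the example, not values from the disclosure.

```python
# Illustrative sketch: split one area-array frame (a list of pixel rows)
# into the two sensing areas described above.
FRAME_H, FRAME_W = 120, 160
SPLIT_ROW = 40  # assumed boundary between the two sensing areas

def split_frame(frame):
    """Return (first_image, second_image): the rows holding the mapping-light
    spot and the rows holding the obstacle-avoidance spot, so each image can
    be analyzed independently."""
    first_image = frame[:SPLIT_ROW]   # first sensing area (mapping)
    second_image = frame[SPLIT_ROW:]  # second sensing area (obstacle avoidance)
    return first_image, second_image

frame = [[0] * FRAME_W for _ in range(FRAME_H)]
first, second = split_frame(frame)
```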
The first sensing area 2211 and the second sensing area 2212 may be located in different areas of the same photosensitive surface or may be located on different photosensitive surfaces, which is not limited herein.
In some embodiments, the shooting frame rate of the first sensing area 2211 is greater than that of the second sensing area 2212. Since the first image output by the first sensing area 2211 is used for mapping, which requires high precision, the shooting frame rate of the first sensing area 2211 can be set higher to obtain more data and thereby improve mapping precision. The second image output by the second sensing area 2212 is used for obstacle avoidance, which only needs to detect the distance to the obstacle, so the shooting frame rate of the second sensing area 2212 can be appropriately reduced to lower the consumption of computing power and electric power and prolong the cruising time of the robot 100. Accordingly, the shooting frame rate of the first sensing area 2211 is greater than that of the second sensing area 2212. The shooting frame rate here refers to the number of images output per second by the first sensing area 2211 or the second sensing area 2212.
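The asymmetric frame rates described above can be sketched as a simple read-out schedule; the 30 Hz and 10 Hz rates are assumed example values, not values from the disclosure.

```python
# Read the mapping area every tick of the faster clock, and the
# obstacle-avoidance area only on every third tick.
MAP_FPS = 30    # assumed frame rate of the first (mapping) sensing area
AVOID_FPS = 10  # assumed frame rate of the second (avoidance) sensing area

def regions_to_read(tick: int):
    """Which sensing areas to read out on a given tick of the MAP_FPS clock."""
    regions = ["first"]                      # mapping area: every frame
    if tick % (MAP_FPS // AVOID_FPS) == 0:   # avoidance area: 1 frame in 3
        regions.append("second")
    return regions
```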
In some embodiments, the photosensitive devices of the first sensing area 2211 and the second sensing area 2212 are different; for example, the photosensitive device of the first sensing area 2211 is a CCD sensor and that of the second sensing area 2212 is a CMOS sensor. A CCD sensor has high resolution, low noise and a wide dynamic range, while a CMOS sensor has low power consumption and low cost.
In some embodiments, the first sensing area 2211 and the second sensing area 2212 differ in shape and size. For example, as shown in fig. 5, the height of the first sensing area 2211 is smaller than the height of the second sensing area 2212. Since the obstacles detected by the first sensing area 2211 are generally far away, a smaller pitch detection range suffices, and the height of the first sensing area 2211 determines its pitch detection range, so the height of the first sensing area 2211 can be reduced to lower cost. Moreover, when the robot 100 is a sweeping robot, it only travels on the ground and therefore only needs to detect obstacles within a certain height of the surrounding environment; that is, the pitch detection angle of the first sensing area 2211 need not be large, which likewise allows the height of the first sensing area 2211 to be reduced.
Conversely, since the second sensing area 2212 needs to detect obstacles on the ground at close range, the angle between the exit direction of the obstacle avoidance light and the horizontal plane needs to be set larger, and the height of the second sensing area 2212 determines its pitch detection range, so the height of the second sensing area 2212 needs to be larger to ensure that nearby obstacles on the ground can be detected. Thus, the height of the first sensing area 2211 is smaller than the height of the second sensing area 2212.
In some embodiments, the first sensing area 2211 operates for a longer time than the second sensing area 2212. Since the first sensing area 2211 needs to detect obstacles through 360 degrees while the second sensing area 2212 only needs to detect obstacles in the advancing direction of the robot 100, the second sensing area 2212 may be suspended when the receiving unit 22 is not facing the advancing direction of the robot 100, so as to reduce power consumption. The operating time of the first sensing area 2211 is therefore longer than that of the second sensing area 2212.
When the two reflected rays are received by a plurality of sensors, as shown in fig. 6, the receiving unit 22 may include a first sensor 222 for receiving the light reflected by the obstacle from the mapping light and a second sensor 223 for receiving the light reflected by the obstacle from the obstacle avoidance light; the first sensor 222 is provided with the first sensing area 2211 and the second sensor 223 with the second sensing area 2212. The shooting frame rate, photosensitive device, detectable pitch angle range and horizontal angle range of the first sensor 222 and the second sensor 223 may differ.
For the processing unit 23, the processing unit 23 is configured to process image data, for example the images output by the receiving unit 22: it can obtain the position of the obstacle from an image and construct a spatial model of the surrounding environment from the obstacle positions in a plurality of images, thereby realizing mapping; it can also obtain the distance to the obstacle from an image and realize obstacle avoidance according to that distance.
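One common way a processing unit can recover a distance from the pixel position of a laser spot is laser triangulation, d = f · b / offset. The patent does not specify the ranging method, so the following is a hedged sketch; the focal length, baseline and pixel values are assumed numbers.

```python
FOCAL_PX = 400.0   # lens focal length in pixels (assumed)
BASELINE_M = 0.02  # emitter-to-sensor baseline in metres (assumed)

def spot_distance_m(spot_offset_px: float) -> float:
    """Distance to the obstacle from the laser spot's lateral offset on
    the sensor: nearer obstacles shift the spot further from the optical
    axis, so distance falls as the offset grows."""
    if spot_offset_px <= 0:
        raise ValueError("spot offset must be positive")
    return FOCAL_PX * BASELINE_M / spot_offset_px

# a spot shifted 8 px corresponds to an obstacle 1 m away under these numbers
d = spot_distance_m(8.0)
```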
In some embodiments, as shown in fig. 2, the radar assembly 2 further includes a circuit board 24, the transmitting unit 21, the receiving unit 22 and the processing unit 23 being electrically connected to the circuit board 24. The circuit board 24 may be a PCB, and the number of circuit boards 24 may be one or more.
In some embodiments, as shown in fig. 1 and 2, the radar assembly 2 further includes a mounting portion 25, where the transmitting unit 21 and the receiving unit 22 are disposed on the mounting portion 25, and the mounting portion 25 is used for accommodating and protecting the transmitting unit 21 and the receiving unit 22. Alternatively, the processing unit 23 may be provided in the mounting portion 25 or may be provided in the housing 1.
The mounting portion 25 may be shell-shaped or seat-shaped. When the mounting portion 25 is shell-shaped, the transmitting unit 21 and the receiving unit 22 are both disposed inside the mounting portion 25, and light passes to the outside through a through hole 251 or a light-transmitting member, which may be glass or a light-transmitting resin; further, the entire mounting portion 25 may be made light-transmitting. When the mounting portion 25 is seat-shaped, the transmitting unit 21 and the receiving unit 22 are both disposed on the outside of the mounting portion 25, and a light-transmitting protective shell may be provided outside the mounting portion 25 to protect them.
In some embodiments, as shown in fig. 1, the radar assembly 2 further includes a driving portion 26 connected to the mounting portion 25 and configured to drive the mounting portion 25 to rotate about a vertical axis, so that the transmitting unit 21 and the receiving unit 22 both rotate about the vertical axis, increasing the horizontal angular range detected by the radar assembly 2; when the mounting portion 25 can rotate through 360 degrees, the horizontal angular detection range of the radar assembly 2 can likewise reach 360 degrees. In this embodiment, the mounting portion 25 may be cylindrical or disc-shaped to improve dynamic balance while rotating. Alternatively, a driving member may be provided in the housing 1 to drive the mounting portion 25 to rotate relative to the housing 1.
In some embodiments, the driving portion 26 includes a motor disposed in the housing 1 for driving the mounting portion 25 to rotate relative to the housing 1; the output shaft of the motor is connected to the mounting portion 25 either directly or indirectly through a transmission such as gears or a belt.
In some embodiments, when the radar assembly 2 includes the driving part 26, the receiving unit 22 may be a linear sensor, and the driving part 26 drives the linear sensor to rotate to realize 360-degree scanning.
The radar assembly 2 and the robot 100 of the embodiments of the utility model comprise the transmitting unit 21, the receiving unit 22 and the processing unit 23. The transmitting unit 21 can transmit the mapping light and the obstacle avoidance light, the receiving unit 22 can receive the mapping light and the obstacle avoidance light reflected by an obstacle, and the processing unit 23 can perform mapping and obstacle avoidance according to the reflected light received by the receiving unit 22, so that mapping and obstacle avoidance can both be completed by a single radar assembly 2, which reduces its cost and avoids both the increased cost of using a camera and a camera's inability to map and avoid obstacles accurately at night. The area array sensor 221 includes the first sensing area 2211 for receiving the light reflected by the obstacle from the mapping light and the second sensing area 2212 for receiving the light reflected by the obstacle from the obstacle avoidance light, so as to separate the light spots formed by the two rays on the photosensitive surface and reduce mutual interference. In addition, the first sensing area 2211 and the second sensing area 2212 can work independently and output a first image and a second image respectively, the first image reflecting the light spot formed by the mapping light and the second image reflecting the light spot formed by the obstacle avoidance light, thereby improving the analysis efficiency of the images.
It should be noted that the description and drawings of the utility model illustrate preferred embodiments, but the utility model may be embodied in many different forms and is not limited to the embodiments set forth herein; these embodiments are provided not as additional limitations but for a more thorough understanding of the disclosure. The above technical features may further be combined with one another to form various embodiments not listed above, all of which are considered within the scope described in the specification. Further, those of ordinary skill in the art may make improvements or variations in light of the above description, and all such improvements and variations shall fall within the protection scope of the appended claims.

Claims (8)

1. A radar assembly, comprising:
a transmitting unit for emitting a mapping light ray and an obstacle avoidance light ray, the obstacle avoidance light ray being located below the mapping light ray;
a receiving unit for receiving the light reflected by an obstacle from the mapping light ray and the obstacle avoidance light ray;
a processing unit for performing mapping and obstacle avoidance according to the reflected light received by the receiving unit;
the transmitting unit and the receiving unit are arranged on the mounting part; and
the driving part is connected with the mounting part and is used for driving the mounting part to rotate around the vertical axis.
2. The radar assembly of claim 1, wherein the transmitting unit includes a light emitting element and a light splitting element, the light splitting element being configured to split light emitted by the light emitting element into at least two sub-light rays, the at least two sub-light rays including the mapping light ray and the obstacle avoidance light ray.
3. The radar assembly of claim 1, wherein the transmitting unit includes a first light emitting member for emitting the mapping light and a second light emitting member for emitting the obstacle avoidance light.
4. A radar assembly according to claim 3, wherein the first light emitting member is a point laser and the second light emitting member is a line laser.
5. The radar assembly of claim 1, wherein the receiving unit comprises a planar array sensor.
6. The radar assembly of claim 5, wherein the area array sensor includes a first sensing area for receiving light reflected by the obstacle from the mapping light and a second sensing area for receiving light reflected by the obstacle from the obstacle avoidance light.
7. The radar assembly of claim 1, wherein the receiving unit includes a first sensor for receiving light reflected by the obstacle from the mapping light and a second sensor for receiving light reflected by the obstacle from the obstacle avoidance light.
8. A robot, comprising:
a radar assembly as claimed in any one of claims 1 to 7.
CN202320989859.4U 2023-04-19 2023-04-19 Radar assembly and robot Active CN220141574U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202320989859.4U CN220141574U (en) 2023-04-19 2023-04-19 Radar assembly and robot


Publications (1)

Publication Number Publication Date
CN220141574U true CN220141574U (en) 2023-12-08

Family

ID=89020818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202320989859.4U Active CN220141574U (en) 2023-04-19 2023-04-19 Radar assembly and robot

Country Status (1)

Country Link
CN (1) CN220141574U (en)

Similar Documents

Publication Publication Date Title
KR20190011497A (en) Hybrid LiDAR scanner
CN109752704A (en) A kind of prism and multi-line laser radar system
KR102474126B1 (en) Lidar optical apparatus and lidar apparatus having same
US11762067B2 (en) Systems and methods for modifying LIDAR field of view
CN211718520U (en) Multi-line laser radar
CN109581323A (en) A kind of micro electronmechanical laser radar system
JP6594282B2 (en) Laser radar equipment
KR20230028289A (en) Dual Shaft Axial Flux Motor for Optical Scanners
CN211402711U (en) Laser radar
CN220141574U (en) Radar assembly and robot
CN110118972A (en) Object detection device
CN215116780U (en) Light emitting device, laser radar system and electronic equipment
KR101458696B1 (en) A fully or partially electronic-scanned high speed 3-dimensional laser scanner system using laser diode arrays
CN115598899A (en) Linear light beam emitting module, depth camera and control method of depth camera
CN212965387U (en) Light emission module, TOF module and electronic equipment
KR102317474B1 (en) Lidar optical apparatus
CN212471510U (en) Mobile robot
CN210604981U (en) Multi-line laser range radar
CN214409284U (en) Laser radar and mobile robot
CN220491038U (en) Multi-point laser ranging device
KR20200107659A (en) Optical apparatus and lidar apparatus having same
CN219302660U (en) Scanning laser radar
CN220105293U (en) Obstacle avoidance sensor and mobile robot
KR102636500B1 (en) Lidar system with biased 360-degree field of view
JP7430886B2 (en) Distance measuring device, laser radar and mobile robot

Legal Events

Date Code Title Description
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 518000, Floor 1801, Block C, Minzhi Stock Commercial Center, North Station Community, Minzhi Street, Longhua District, Shenzhen City, Guangdong Province

Patentee after: Shenzhen Huanchuang Technology Co.,Ltd.

Address before: 518000, Floor 1801, Block C, Minzhi Stock Commercial Center, North Station Community, Minzhi Street, Longhua District, Shenzhen City, Guangdong Province

Patentee before: SHENZHEN CAMSENSE TECHNOLOGIES Co.,Ltd.
