WO2004081683A1 - Autonomous Mobile Robot (自律移動ロボット) - Google Patents
Autonomous Mobile Robot
- Publication number
- WO2004081683A1 (PCT/JP2004/003355)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- person
- image
- traveling
- autonomous mobile
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/10—Programme-controlled manipulators characterised by positioning means for manipulator elements
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Definitions
- the present invention relates to an autonomous mobile robot that can move to a destination while avoiding an obstacle.
- Conventionally, an autonomous mobile robot that detects obstacles using a distance sensor such as a laser radar or sonar and performs travel control has been known.
- recognition of detected objects requires dense distance measurement over the entire 3D space. Therefore, it is known to obtain spatial information from an image obtained by a camera and control traveling.
- A robot that travels using distance information and image information is known (see U.S. Pat. No. 5,525,882), as is a technique that measures the lateral position of an obstacle using a CCD camera and an ultrasonic sensor (see Japanese Patent Application Laid-Open No. 2000-2012).
- A robot that recognizes a person and moves by detecting the person's head and frontal face from images captured by two cameras is also known (see Japanese Patent Application Laid-Open No. 2002-56638).
- In this technique, the distance to the target object is calculated from the stereo image; since its error is large, distance information from a distance sensor is used as well.
- A robot that detects a person's face from an image captured by a stereo camera and calculates the distance to the person to perform autonomous action is also known (see Japanese Patent Application Laid-Open No. 2000-3262674).
Disclosure of the Invention
- However, the technique disclosed in the above-mentioned U.S. Pat. No. 5,525,882 treats any object surface as an obstacle to traveling and performs travel control without judging what the object is. The technique disclosed in Japanese Patent Application Laid-Open No. 2000-2012 only recognizes the lateral spread of an obstacle and likewise does not control traveling by judging what the object is. In addition, the technologies disclosed in Japanese Patent Application Laid-Open Nos. 2002-56638 and 2000-3262674 detect a person mainly from stereo image information alone, so the detection accuracy is low and erroneous detection of obstacles is likely.
- An object of the present invention is to solve the above-mentioned problems and to provide an autonomous mobile robot that detects a person reliably by combining object shape information, obtained from distance information measured by distance measuring means such as a laser radar or an ultrasonic sensor (sonar) capable of measuring the distance and direction to an object, with image information.
- To achieve this, the present invention provides an autonomous mobile robot that travels to a destination while avoiding obstacles, comprising: storage means that stores map information of the traveling area and various parameters for traveling; input instruction means for inputting a destination and commands to the storage means; route generation means for generating a travel route to the destination; environment information acquisition means for acquiring environment information on the travel route, including objects serving as obstacles; traveling means for traveling; self-position recognition means for recognizing the robot's own position based on the information obtained by the environment information acquisition means and the map information; and travel control means for controlling the traveling means so as to reach the destination while avoiding obstacles based on the recognized self-position. The environment information acquisition means comprises: an imaging device that captures an image of the environment on the travel route; image recognition processing means that performs arithmetic processing on the captured image data to extract a region having an attribute related to a part of the human body; a distance measuring device that detects an object existing in the environment on the travel route and measures the distance and direction of the object; distance information analysis means that calculates the object's shape by arithmetic processing of the measured distance data and recognizes from the shape that the object is a human candidate; and environment recognition means that recognizes the object as a person by comparing the position of the region having the attribute extracted by the image recognition processing means with the position of the object recognized as a human candidate by the distance information analysis means.
- With this configuration, an object shape is obtained from distance data with high measurement accuracy, a human candidate is recognized from that shape, and a region having an attribute of a part of the human body is extracted from image information, which provides wide-area environmental information. When the two pieces of position information match, it is determined that a person has been detected, so highly reliable person detection can be realized. Here, the position information also includes direction.
- The image recognition processing means may perform arithmetic processing on the image data only for the direction of the object obtained by the distance information analysis means, and the environment recognition means may perform recognition processing only on the portion corresponding to that direction. As a result, areas where no obstacle is present are not processed, so the amount of calculation for image recognition can be reduced, and highly reliable environment recognition can still be performed.
- Conversely, the distance measuring device may measure distance only in the direction of the region extracted by the image recognition processing means, and the environment recognition means may perform recognition on the portion corresponding to that direction. This eliminates unnecessary distance information acquisition processing. In particular, when there are many objects whose shape matches the one to be identified from distance information, for example many pillars or table legs in an environment where a person is to be detected, distance information needs to be acquired only for the candidate regions obtained by the image information acquisition means, so efficient distance information acquisition processing can be performed.
- The distance measuring device may use a laser radar that scans at a predetermined angular step in a plane parallel to the traveling road surface and measures the distance to an obstacle. As a result, measurement data with high position accuracy can be obtained and the shape of an obstacle can be recognized. In addition, since the shape of the obstacle on the measurement plane is known, it can be used as data for estimating what the obstacle is.
- The distance information analysis means may recognize the object as a human candidate when the width of the object obtained by arithmetic processing of the distance data is within a predetermined range.
- The distance information analysis means may recognize the detected object as a human candidate when the measured distance to the object does not match the stored distance to objects on the map. Since an object not on the map is judged to have a high possibility of being a human, simple human detection can be achieved, and a photograph of a person posted on a wall is no longer erroneously recognized as a person.
- The imaging device may be a camera installed on the autonomous mobile robot, with the image coordinates captured by the camera converted into the reference coordinate system of the autonomous mobile robot. By fixing the camera in a specific direction, the configuration is simple and a low-cost imaging device can be realized.
- The image recognition processing means may include a specific color region detection unit that detects a specific color region from the captured image data, and a shape feature processing unit that performs arithmetic processing on the shape of the detected specific color region; if the detected specific color region satisfies a predetermined shape feature condition, it is recognized as a human candidate. This makes it possible to detect a person with high reliability by using both a specific color region on the image, for example a flesh color region, and a shape feature of a person, both of which represent human characteristics well.
- The environment recognition means may determine the ratio between the area of the human skin color region detected in the image and the standard skin color area for the distance from the autonomous mobile robot to the person, and compare the ratio with a predetermined threshold value to determine whether or not the detected person is aware of the autonomous mobile robot. As a result, not only the presence or absence of a person but also whether the detected person has noticed the autonomous mobile robot can be recognized, so a safe and smooth moving operation can be realized.
- The traveling control means may stop the robot when an object recognized as a person is in the traveling direction of the autonomous mobile robot and within a predetermined distance; when the object recognized as a person is not in the traveling direction, the traveling means is controlled to avoid any obstacle present and to continue traveling if there is none. As a result, safety for humans can be ensured.
- The traveling control means may control the traveling means so that, when an object recognized as a person is in the traveling direction of the autonomous mobile robot, the robot travels keeping at least a predetermined distance away from that object. This allows operation while ensuring safety for the person; a detected object that is not a person need not be avoided at a long distance, so the avoidance distance can be shortened and efficient operation of the autonomous mobile robot becomes possible.
- The traveling control means may control the traveling means so as to move at a speed corresponding to the distance, or to the distance and angle, to the person. As a result, efficient movement can be achieved with motion that does not cause human anxiety; if the detected object is not a human, there is no need to reduce speed, so the autonomous mobile robot can be operated efficiently.
- The traveling control means may allow the robot to approach the detected person if it can be determined that the person is aware of the autonomous mobile robot, and otherwise control the traveling means so as not to approach the detected person closer than a predetermined distance. As a result, the autonomous mobile robot does not approach, and therefore does not threaten, a person who has not noticed it.
- The traveling control means may take a cautionary action toward a person when passing near the detected person. As a result, the affinity of the autonomous mobile robot for humans can be improved.
- The image information acquisition means may capture an image of the detected person and store it in the storage means, or transmit it to a remote station. This gives the autonomous mobile robot a monitoring function, providing security and crime prevention effects.
Brief Description of the Drawings
- FIG. 1 is a block diagram of an autonomous mobile robot according to an embodiment of the present invention.
- FIG. 2 is a perspective view showing the appearance of the above device.
- FIG. 3 is a block diagram of the image recognition processing means of the above device.
- FIG. 4 is a block diagram showing another example of the image recognition processing means of the above device.
- FIG. 5A is a plan view of the same device, FIG. 5B is a side view, and FIG. 5C is a diagram showing an image taken by its imaging device.
- FIG. 6 is a flow chart of human recognition processing by the image recognition processing means in the above device.
- FIG. 7A is a diagram of an input image subjected to human recognition processing in the same device, FIG. 7B shows the result of skin color region detection, FIG. 7C shows the skin color regions on the edge image, and FIG. 7D shows the result of human recognition.
- FIG. 8 is a conceptual diagram of a normalized color space used by the above device.
- FIG. 9 is a flowchart of a skin color region detection process included in the human recognition process flow of the above device.
- FIG. 10A is a diagram of the voting range used in the head shape detection process by voting included in the human recognition processing flow of the same device, and FIG. 10B is a conceptual diagram illustrating head detection by voting.
- FIG. 11 is a flowchart of a head shape detection process by voting included in the human recognition process flow in the above device.
- FIG. 12A is a plan view illustrating obstacle distance measurement by the laser radar in the same device, and FIG. 12B is a diagram showing the distance measurement result.
- FIG. 13A is a plan view for explaining distance measurement of a specific shape object by the laser radar in the same device, and FIG. 13B is a diagram showing the result of the distance measurement.
- FIG. 14A is a graph of the distance measurement result for a specific shape object by the laser radar in the same device, and FIG. 14B is a graph of the differences between those distance measurement values.
- FIG. 15 is a flowchart of a specific shape recognition process in the above device.
- FIG. 16 is a plan view illustrating a human detection stop area in the above device.
- FIG. 17A is a plan view showing the stopping operation when a person is detected in the same device, and FIG. 17B is a plan view showing the avoidance operation when an object other than a person is detected.
- FIG. 18A is a plan view showing the avoidance operation when an object other than a person is detected in the same device, and FIG. 18B is a plan view showing the avoidance operation when a person is detected.
- FIG. 19A is a plan view showing the avoidance operation for a person who has noticed the same device, and FIG. 19B is a plan view showing the stop operation for a person who has not noticed it.
- FIG. 20A and FIG. 20B are perspective views for explaining an alerting action toward a person in the same device.
- FIG. 21A is a perspective view illustrating the coordinate conversion of the distance measuring device in the same device, and FIG. 21B is a plan view thereof.
- FIG. 22 is an explanatory diagram showing the relationship between the reference coordinate system of the autonomous mobile robot and the normalized image coordinate system of the above device.
- FIG. 23 is a diagram showing an image subjected to image processing with reference to a laser radar measurement result in the above device.
- FIG. 24 is a flowchart of the image processing performed with reference to the laser radar measurement result in the same device.
- FIG. 25A is a plan view of the device performing laser radar measurement with reference to the image processing result, FIG. 25B is a side view, and FIG. 25C is a diagram showing an image of the measurement result.
- FIG. 26 is a flowchart of a laser radar measurement process performed by referring to an image processing result in the above device.
- FIG. 27A is a diagram showing an image in which the device determines that the person is aware of it, and FIG. 27B is a diagram showing an image in which the person has not noticed it.
- FIG. 28 is a graph showing the relationship between the distance to a person and the skin color area of the face used in the same device.
- FIG. 29 is a flowchart of human detection processing in the above device.
- FIG. 30 is a flowchart of the process for judging whether or not a person has noticed the device.
- FIG. 31A, FIG. 31B, and FIG. 31C are perspective views illustrating a combination of an imaging device and a distance measuring device in the above device.
- FIG. 1 shows a block configuration of the present apparatus.
- The autonomous mobile robot 1 comprises: storage means 2 for storing map data of the traveling area and various parameters for traveling; input instruction means 3 for inputting a destination and traveling commands to the storage means 2; route generation means 41 for generating a travel route to the destination; environment information acquisition means 6 for acquiring environment information on the travel route, including objects serving as obstacles; and traveling means 5 for traveling. The environment information acquisition means 6 includes an imaging device 71 that captures an image of the environment on the travel route and image recognition processing means 72 that performs arithmetic processing on the captured image data to extract an object having a predetermined attribute (these two constitute image information acquisition means 7), together with a distance measuring device 81 that detects an object existing in the environment on the travel route and measures the distance and direction of the object, and distance information analysis means 82 that performs arithmetic processing on the measured distance data (these two constitute distance information acquisition means 8). Environment recognition means 9 is provided to recognize environmental information based on the information on objects having the predetermined attribute and the distance information obtained by the distance information analysis means 82.
- the environment recognition means 9 outputs environment information on the travel route to the travel control means 4.
- The self-position recognition means 42 estimates the robot's position on the map during traveling using, for example, an encoder attached to the traveling means 5 or a built-in gyro, and corrects the self-position estimate using self-position information obtained by the environment information acquisition means 6.
- As self-position information obtained from the environment information acquisition means 6, for example, signs recognizable by the means 6 can be installed in the driving environment in advance and registered in the map data; recognition of a sign while driving can then be used as self-location information.
- The travel control means 4 searches the map for a registered sign near the sign position computed from the robot's own estimated position and the relative position sent from the environment information acquisition means. When a sign is detected, the self-position can be recognized from the relative position information with respect to that sign; if two or more signs can be detected, the self-position is uniquely determined.
- The environment information acquisition means 6 detects obstacles during traveling and sends their position information to the travel control means 4, which corrects the travel route so as to avoid the obstacles and outputs the control output to the traveling means 5. As described above, the autonomous mobile robot 1 travels to the designated destination along the travel route, recognizing its own position while avoiding obstacles.
- FIG. 2 shows the appearance of the autonomous mobile robot 1.
- The autonomous mobile robot 1 is provided with a laser radar (laser range finder) 83 as the distance measuring device 81 of the distance information acquisition means 8, and, as the imaging device 71 of the image information acquisition means 7, a camera 70 installed on a rotation mechanism 70a that can rotate horizontally with respect to the autonomous mobile robot main body 10. In this device the wheels 52 are, for example, driven by a battery to travel, and the traveling distance is measured by monitoring the rotation of the wheels 52 for use as self-position recognition information.
- The details of the present apparatus will now be described in order, starting from the image information acquisition means.
(Configuration of image information acquisition means)
- The autonomous mobile robot 1 of the present invention uses two types of obstacle information, the partial obstacle detection result obtained by distance measuring means such as a laser radar or sonar and the obstacle detection result obtained by image processing, to determine what the obstacles are.
- FIG. 3 and FIG. 4 show block configurations of the image information acquisition means 7, which obtains obstacle detection results by image processing.
- the image recognition processing means 72 digitizes the image signal input from the color camera (imaging device) 70 by the analog-to-digital converter 61 and stores it as a digital image in the memory 62.
- Software recording the processing algorithm is loaded from the program memory 63 into the CPU 64, which performs arithmetic processing on the image; using this software, the CPU 64 performs the human detection processing described below on the digital image.
- the image processing result is transmitted by the communication control unit 65 to the travel control unit 4 via the environment recognition unit 9.
- When an operator operates the autonomous mobile robot main body 10 locally, that is, at the main body itself, and operates the image information acquisition means 7 (camera 70 and image recognition processing means 72), a keyboard control unit 66, keyboard 67, display control unit 68, and display unit 69 may be provided in the image recognition processing means 72. The operation of the image information acquisition means 7 can also be controlled externally via the communication control unit 65.
- FIG. 5A and FIG. 5B show an imaging device 71 composed of one fixed camera, and FIG. 5C shows an image taken by the imaging device 71.
- The camera 70 is fixed to the autonomous mobile robot main body 10 and captures the scene ahead in the traveling direction 10a, as shown in image 11. The coordinate system of this camera 70 therefore moves together with the reference coordinate system, which is defined as fixed to the main body 10.
- a method of performing arithmetic processing on image data acquired by the imaging device 71 by the image recognition processing means 72 to detect a person or a specific object will be described.
- the skin color of the human head, especially the face is used as the attribute related to the part of the human body.
- FIG. 6 shows the overall processing flow (human recognition processing flow) for recognizing a person from an image.
- The surroundings of the autonomous mobile robot, for example the scene ahead in the traveling direction, are imaged by the color camera serving as the imaging device 71 (S101), the analog image signal is digitized by the A/D converter 61 (S102), and a digital image of each of the three color components R, G, and B (red, green, and blue) is stored in the memory 62 (S103).
- a skin color region is detected from the image (S105), and a head shape detection process is performed on the detected skin color region based on the edge image (S106).
- the detected position of the head is output to the environment recognition means 9 as a human recognition result.
- FIG. 7A shows the input image, FIG. 7B the skin color region detection result, FIG. 7C the skin color regions superimposed on the edge image, and FIG. 7D the recognition result.
- Since skin color regions 24 to 26 are detected (image 23) and only these detected regions are subjected to head shape detection on the edge image 27, efficient processing can be performed.
- FIG. 8 shows the normalized r-g space (plane). Human skin color forms a single region 20 on the r-g plane, so whether each pixel of the color image has a flesh color is determined from its normalized r and g values. The skin color region 20 is created in advance from the skin color data of many images.
- FIG. 9 shows the flow of the skin color region detection process. First, the skin color image area of the memory 62, which stores the skin color regions, is initialized (S201). For each pixel of the input image, the normalized r and g values are computed and tested against the skin color region 20; if the pixel is flesh colored, a flag is set at the corresponding position of the skin color image (S204). After this processing ends for all pixels of the input image (Y in S205), pixels with the flesh color flag are grouped on the skin color image, and a flesh color region composed of, for example, a rectangular region is created for each group. A labeling process distinguishes the individual regions, yielding flesh color regions 24 to 26 as shown in FIG. 7B (S206).
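- As a minimal sketch of the skin color detection flow above (S201 to S206): the rectangular bounds standing in for the skin color region 20, and the use of scipy for the labeling step, are illustrative assumptions (the patent builds the region from many sample images and does not specify an implementation).

```python
import numpy as np
from scipy import ndimage

# assumed rectangular approximation of skin color region 20 in the
# normalized r-g plane; the patent derives the region from sample images
R_MIN, R_MAX = 0.35, 0.55
G_MIN, G_MAX = 0.25, 0.35

def detect_skin_regions(rgb):
    """rgb: HxWx3 uint8 color image -> bounding boxes of flesh color regions."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2) + 1e-6            # R + G + B per pixel
    r = rgb[:, :, 0] / total                  # normalized r (FIG. 8)
    g = rgb[:, :, 1] / total                  # normalized g
    mask = (R_MIN <= r) & (r <= R_MAX) & (G_MIN <= g) & (g <= G_MAX)
    labels, _ = ndimage.label(mask)           # labeling process (S206)
    return [(s[1].start, s[0].start, s[1].stop, s[0].stop)
            for s in ndimage.find_objects(labels)]  # rectangular regions 24-26
```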
- FIG. 10A shows the voting frame that defines the voting range used in the voting process, and FIG. 10B shows how the voting frame is used.
- Head shape detection exploits the fact that the head is approximately circular. For example, suppose an edge pixel set 35 representing a face contour is obtained in an edge image 34 produced by edge extraction processing (for example, differentiation of the image luminance) on an image containing a person's face.
- a voting space in which each pixel point of the edge image 34 is a voting bin (voting box) is set.
- the luminance gradient direction is determined for each pixel of the edge pixel set 35, for example, the luminance gradient direction 36a for the pixel 36.
- The image of a human face and its background is shaded, and the luminance gradient at the edge pixels of the face contour points either from the center of the face outward or from outside toward the center. Therefore, the center point 32 of the rod-shaped voting frame 31, whose length is set to cover the size of the head to be detected, is aligned with an edge pixel, for example edge pixel 36, the voting frame 31 is oriented along the luminance gradient direction 36a of that pixel, and a vote is cast in every voting bin of the voting space that overlaps one of the voting bins 33 of the voting frame 31. When this is repeated for all edge pixels, the voting bin at the center of the face receives more votes than the others. In this way the voting results along the outer contour of the head accumulate at its center, so if the voting space contains a voting bin whose vote count exceeds a predetermined threshold, it is determined that a head exists centered on that bin.
- FIG. 11 shows the head shape detection processing flow.
- First, the luminance image corresponding to the skin color region 24 is differentiated to generate a luminance gradient image and an edge image (S301).
- a voting space which is a memory space for storing voting values, is initialized (S302).
- Next, a voting frame is arranged along the luminance gradient direction of each edge pixel, and votes are cast in the corresponding voting bins of the voting space (S303).
- After voting, the voting space is checked for a voting bin whose value is equal to or greater than a predetermined threshold. If such a bin exists (Y in S304), the center position of the skin color region is set as the head position and output to the environment recognition means 9 via the communication control unit 65 (S305).
- the above processing is performed for all skin color areas, and the head shape detection processing ends (S307).
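- The voting procedure (S301 to S307) might be sketched as follows; the head radius bounds, edge threshold, and vote threshold are illustrative assumptions, and Sobel filtering stands in for the luminance differentiation mentioned above.

```python
import numpy as np
from scipy import ndimage

def detect_head_center(gray, r_min=8, r_max=20, edge_th=50.0, vote_th=30):
    """Cast votes along the luminance gradient of each edge pixel (FIG. 10)."""
    gx = ndimage.sobel(gray.astype(float), axis=1)   # luminance gradient
    gy = ndimage.sobel(gray.astype(float), axis=0)
    mag = np.hypot(gx, gy)
    votes = np.zeros_like(mag)                       # voting space (S302)
    for y, x in zip(*np.nonzero(mag > edge_th)):     # edge pixels (set 35)
        ux, uy = gx[y, x] / mag[y, x], gy[y, x] / mag[y, x]
        for sign in (1, -1):        # gradient may point inward or outward
            for r in range(r_min, r_max + 1):        # voting bins 33
                vy, vx = int(y + sign * r * uy), int(x + sign * r * ux)
                if 0 <= vy < votes.shape[0] and 0 <= vx < votes.shape[1]:
                    votes[vy, vx] += 1               # vote (S303)
    cy, cx = np.unravel_index(np.argmax(votes), votes.shape)
    return (cx, cy) if votes[cy, cx] >= vote_th else None  # S304-S305
```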
- In FIG. 7D, the skin color region 24 is detected as a head shape, and the position of the star 29 is set as the head position.
- a method for specifying the direction in the real space of the position detected as the head position from the skin color area on the image will be described.
- The direction of the person in real space is specified by the image recognition processing means 72. The approximate direction of the person can be calculated from the size of the face on the image and its position on the image. The general direction specified in this way becomes a candidate for the direction in which a person exists.
- FIG. 12A shows distance measurement of an obstacle by the laser radar 83, and FIG. 12B shows the result of the distance measurement.
- The distance data obtained by the laser radar 83 consists of values longer than a specified distance L, returned when no reflected light is received or the reflecting object is farther than the specified distance, and of distances such as L1 and L2 to the obstacle O obtained from the reflected light.
- By analyzing a series of distance data measured at a fixed angular step of, for example, 0.5°, the position of a specific shape object can be recognized (see Japanese Patent Application Laid-Open No. 2002-202815).
- The specific shape object is, for example, a human: an object of the corresponding width can be recognized as a person's legs. It is thus determined that a person (legs) or a chair may exist at a position where a detected object has a width equal to or smaller than the width set in the storage unit 2, or within a preset range.
- The (distributed) position of the specific shape object is expressed by the number j of measurement points spanning its width and the set of distances and scan angles within that width, D(i), θ(i), 0 ≤ i < j. These distance and direction data are sent from the distance information analysis means 82 to the environment recognition means 9.
- The image recognition processing means 72 compares the candidate direction of, for example, a person's presence obtained from the image with the direction of possible human presence obtained from the distance information analysis means 82. If the two substantially match, that direction is determined to be the direction in which the person exists, and the direction and distance are sent to the travel control means 4.
- The data D(i) and θ(i) of the specific shape object (candidate position of the person) obtained by the laser radar 83, which is the distance measuring device 81, are expressed in the coordinate system of the laser radar 83 (sensor coordinate system). Therefore, in the distance information analysis means, as shown in FIG. 21 described later, the position of the specific shape object is coordinate-transformed into the coordinate system (XR, YR) of the autonomous mobile robot main body 10.
- FIG. 13A shows distance measurement by the laser radar, and FIG. 13B shows the result of the distance measurement.
- When the laser radar 83 is used as the distance measuring device 81, a predetermined angular range is scanned horizontally while emitting and receiving a laser beam at a predetermined angular step Δθ, yielding, as shown in FIG. 13B, a series of data indicating the distances D(i) between the laser source and the reflecting objects OB1 and OB2.
- FIG. 14A shows a graph of the distance data D(i) to the specific shape object.
- The measured distance data D(i) changes from far to near and back to far across the obstacle, so the distance difference ΔD(i) = D(i+1) − D(i) shows a negative and a positive peak at the obstacle boundaries (for example, at points i = m and i = n + 3 in FIG. 14B). The boundaries of the specific shape object can therefore be detected with predetermined thresholds DR and −DR.
- The width data of the specific shape to be detected is stored in advance; if there are a plurality of specific shapes to be considered, data for each of them are stored.
- Further, a threshold DN is set as an upper limit on the number of consecutive distance measurement points on the specific shape object. If the number C of consecutive measurement points on the object is less than or equal to the threshold DN (C ≤ DN), the measured object is determined to be a specific shape object.
- The value of DN may be a fixed value. Alternatively, since the size of a detected object that can be estimated from one distance datum is D(i) × sin(Δθ), the size of the specific shape can be estimated as D(i) × sin(Δθ) × C; assuming the size of the specific shape in advance, the value of DN may be changed for each distance. That is, if the maximum width of the specific shape object is Omax, the DN for D(m) is given by Omax / (D(m) × sin(Δθ)).
- FIG. 15 shows the specific shape recognition processing flow, explained using the measurement results shown in FIG. 13. For example, when the negative peak of the distance difference is found at i = m, D(m), D(m+1), and D(m+2) are determined to be distance data reflected from one specific shape object, and they are registered as the j-th of the detected specific shape objects.
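- Under the assumptions just stated (boundaries where the distance difference crosses ±DR, a run of C points accepted when C ≤ DN = Omax / (D(m) × sin Δθ)), the specific shape search might look like this; the threshold values are illustrative.

```python
import math

def find_leg_candidates(D, dtheta_deg=0.5, DR=0.15, o_max=0.3):
    """D: scan distances (m) at angular step dtheta.  Returns a list of
    (start index, mean distance) for runs narrow enough to be a specific
    shape object such as a person's leg (FIG. 15 flow, sketched)."""
    dtheta = math.radians(dtheta_deg)
    diffs = [D[i + 1] - D[i] for i in range(len(D) - 1)]
    found, i = [], 0
    while i < len(diffs):
        if diffs[i] < -DR:                       # near-side boundary peak
            j = i + 1
            while j < len(diffs) and diffs[j] <= DR:
                j += 1                           # scan until far-side peak
            run = D[i + 1:j + 1]                 # consecutive points C
            d_mean = sum(run) / len(run)
            DN = o_max / (d_mean * math.sin(dtheta))  # per-distance limit
            if len(run) <= DN:                   # C <= DN: specific shape
                found.append((i + 1, d_mean))
            i = j
        else:
            i += 1
    return found
```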
- Next, the case where an ultrasonic sensor capable of measuring distance is used as the distance measuring device 81 will be described.
- The ultrasonic sensor is inexpensive, and a plurality of them can be installed around the autonomous mobile robot main body 10.
- When the image recognition processing means 72 finds a human candidate in some direction, the distance information analysis means 82 examines the detection result (distance) of the ultrasonic sensor whose detection area covers that direction and determines whether the result corresponds to a detected wall. If it is not judged to be a wall, the candidate is regarded as a person. In other words, when the measured distance to the object does not match the stored distance to objects (walls) on the map, the distance information analysis means 82 recognizes the measured object as a human candidate. Since an object not on the map is judged likely to be a person, simple person detection can be realized; by combining the judgment result of the image recognition processing means 72 with the detection result of the ultrasonic sensor, a photograph of a person on a wall is prevented from being erroneously recognized as a person. The calculation of the distance to stored objects (walls) on the map is described later.
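- A minimal sketch of the map comparison described here, assuming the map can supply the expected distance to the wall along the sensor's detection direction; the tolerance value is an assumption.

```python
WALL_TOLERANCE = 0.2  # m; echoes within this of the map wall count as the wall

def is_person_candidate(measured_m, map_wall_m, tol=WALL_TOLERANCE):
    """True if the ultrasonic echo is clearly nearer than the wall the map
    predicts in this direction, i.e. the object is not on the map.  A
    photograph on the wall returns roughly map_wall_m and is rejected."""
    return measured_m < map_wall_m - tol

# e.g. a 1.8 m echo where the map wall is 3.5 m away -> person candidate
assert is_person_candidate(1.8, 3.5) and not is_person_candidate(3.45, 3.5)
```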
- The traveling control means 4 controls the traveling means 5 to stop when an object recognized as a person is in the traveling direction of the autonomous mobile robot main body 10 and within a predetermined distance, thereby ensuring safety for humans. When the object recognized as a person is not in the traveling direction, the traveling control means 4 controls the traveling means 5 to avoid any other obstacle present and to continue traveling if there is none.
- FIG. 16 shows the human detection stop area 43 based on the ultrasonic sensors 84.
- FIG. 17A shows the stop operation when a person is detected, and FIG. 17B the avoidance operation when an object other than a person is detected.
- The area within ±θ of the traveling direction 10a and within distance D1 from the autonomous mobile robot 1 is the human detection stop area 43; when a person is detected inside it, the autonomous mobile robot 1 stops as shown in FIG. 17A.
- With the ultrasonic sensors 84, such a human detection stop area 43 can be set at low cost.
- When the detected object is not a person, the autonomous mobile robot 1 leaves the predetermined travel route 44 and continues to the destination T along a route 45 that avoids the obstacle O, as shown in FIG. 17B.
- FIG. 18A shows the avoidance operation when an object other than a person is detected, and FIG. 18B the avoidance operation when a person is detected.
- The traveling control means controls the traveling means so that, when the object recognized as a person is in the traveling direction of the autonomous mobile robot main body, the robot travels keeping a predetermined distance away from that object.
- That is, the method of avoiding the detected object may be changed depending on whether it is recognized as a person or determined not to be one.
- When the laser radar 83 is used as the distance measuring device 81 and the route is set so that the obstacle O does not enter the range set as the detection area 46, two preset detection radii R1 and R2 are switched. When the detected object is determined not to be a person, the small detection radius R1 is used as shown in FIG. 18A; when the detected object is determined to be a human, the large detection radius R2 is used as shown in FIG. 18B.
- the traveling control means 4 may control the traveling means 5 so as to have a moving speed corresponding to the distance or the distance and the angle.
- the traveling control means 4 changes the moving speed according to the distance.
- With the speed v defined as the moving speed used in normal travel without obstacles, the travel control means 4 can set the speed V1 at the time of collision avoidance in the collision avoidance algorithm as follows:
- V1 = f(h) × v, where f(h) = 1 for h ≥ d, and 0 ≤ f(h) < 1 for h < d.
- Here the judgment value d is a preset distance, and the variable h is the shortest distance to a detected object recognized as a person. The function f(h) for h < d may be a constant.
- A function g(h, θ) may also be used that additionally considers the absolute value θ of the angle between the direction of the nearest detected object recognized as a person and the traveling direction of the autonomous mobile robot.
- In that case the speed V2 at the time of collision avoidance can be expressed as: V2 = g(h, θ) × v, where g(h, θ) = 1 for h ≥ d, and 0 ≤ g(h, θ) < 1 for h < d.
- In this way, efficient movement of the autonomous mobile robot can be realized with motion that does not make people feel uneasy; if the detected object is not a human, there is no need to reduce the speed, so the robot can be operated efficiently.
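- A sketch of the speed laws above; the linear ramp used for h < d is an illustrative choice, since the patent only requires 0 ≤ f(h) < 1 (and 0 ≤ g(h, θ) < 1) inside the judgment distance.

```python
import math

def avoidance_speed(v, h, d=2.0, theta=None):
    """V1 = f(h) * v, or V2 = g(h, theta) * v when the angle is considered.
    v: normal travel speed, h: shortest distance to the detected person,
    d: preset judgment distance, theta: |angle| between person and heading."""
    if h >= d:
        return v                      # f = g = 1 outside the judgment range
    factor = h / d                    # assumed linear ramp, 0 <= factor < 1
    if theta is not None:
        # assumed angular weighting: a person straight ahead (theta = 0)
        # slows the robot the most, one at 90 degrees barely at all
        factor *= 0.5 * (1.0 + min(abs(theta), math.pi / 2) / (math.pi / 2))
    return factor * v
```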
- FIG. 19A shows the avoidance operation for a person who has noticed the autonomous mobile robot, and FIG. 19B the stop operation for a person who has not noticed it.
- The traveling control means 4 allows the robot to approach when the detected person can be determined to be aware of the autonomous mobile robot, and otherwise does not approach the detected person more than a predetermined distance. When information indicating that the person is aware of the autonomous mobile robot 1 is sent from the environment information acquisition means 6 to the travel control means 4, the travel control means 4 controls the traveling means 5 as shown in FIG. 19A, continuing to the destination T while avoiding the person. Otherwise, as shown in FIG. 19B, stop control is performed so that the robot does not approach the person M closer than a predetermined distance D2. This control prevents the autonomous mobile robot from approaching and threatening a person who is unaware of it. Whether the detected person is aware of the autonomous mobile robot is determined by the method described later.
- FIG. 20A and FIG. 20B show the alerting behavior of the autonomous mobile robot toward a person. The robot takes action toward people as it moves near a detected person. For example, if a monitor screen 47 is installed on a rotating part above the autonomous mobile robot main body 10, the rotating part is rotated according to the direction angle of the object detected as a person, sent from the environment information acquisition means, turning the monitor screen 47 in that direction. If the monitor screen 47 carries an alerting display, such as a picture imitating a human face, surrounding people will recognize that the autonomous mobile robot 1 is traveling with awareness of them. A voice signal may also be emitted to alert people. Since the traveling control means 4 performs an alerting action toward a person when passing nearby, the robot's affinity for people can be improved.
- Further, an image of the detected person captured by the imaging means may be stored in the storage means, or transmitted to a remote station using communication means mounted on the autonomous mobile robot. The autonomous mobile robot thus has a monitoring function and can be applied to security and crime prevention.
- FIG.21A and FIG.21B indicate the arrangement of coordinates in the autonomous mobile robot 1.
- The reference coordinate system XR-YR, fixed to the autonomous mobile robot 1, is a rectangular coordinate system, as is the coordinate system XS-YS whose origin is at the scan center of the laser radar 83. The XR-YR and XS-YS planes lie in the same plane, the XR and XS axes point in the same direction, and the YR and YS axes are parallel, shifted by a distance dS. An object O3 at distance D3 and direction angle θ from the scan center of the laser radar 83 therefore has position (D3 cos θ, D3 sin θ) in the XS-YS coordinate system, which is converted into the XR-YR reference coordinate system by adding the offset dS.
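- Assuming the offset dS lies along the X axis of the reference coordinate system (the text states only that the axes are parallel and shifted by dS), the conversion of a radar measurement (D3, θ) is a one-line translation:

```python
import math

DS = 0.2  # m, assumed offset of the radar scan center along the XR axis

def radar_to_robot(d, theta, ds=DS):
    """Polar radar measurement (distance d, angle theta) -> (XR, YR)."""
    xs = d * math.cos(theta)          # position in the XS-YS sensor frame
    ys = d * math.sin(theta)
    return xs + ds, ys                # parallel axes: pure translation
```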
- FIG. 22 shows the relationship between the reference coordinate system of the autonomous mobile robot 1 and the normalized image coordinate system.
- When the coordinates of a point M1 in the reference coordinate system of the autonomous mobile robot 1 are (XR, YR, ZR) and the coordinates of the corresponding point m1 in the normalized image coordinate system are (u, v), the coordinates of point m1 are calculated from those of M1 using Eq. (8).
- the laser radar measurement result can be projected on an image using such a conversion.
- Here S is a constant (the projective scale factor), f is the focal length, ku and kv are the scale units of the u and v axes of the normalized image coordinates, u0 and v0 are the origin of the normalized image coordinates in the digital image coordinate system, θ is the angle between the u and v axes, r11 through r33 are the elements of the rotation matrix between the reference coordinate system of the autonomous mobile robot 1 and the camera coordinate system, and t1, t2, and t3 form the translation vector between the reference coordinate system and the camera coordinate system.
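- Equation (8) itself is not reproduced in this text, but the parameter list above matches the standard pinhole projection s·(u, v, 1)ᵀ = A·[R | t]·(XR, YR, ZR, 1)ᵀ; a sketch under that assumption, with illustrative intrinsic values:

```python
import numpy as np

def project_to_image(M, R, t, f=0.006, ku=150000.0, kv=150000.0,
                     u0=320.0, v0=240.0, theta=np.pi / 2):
    """Project M = (XR, YR, ZR) in the robot reference frame to normalized
    image coordinates (u, v), assuming Eq. (8) is the usual pinhole model."""
    A = np.array([[f * ku, -f * ku / np.tan(theta), u0],   # intrinsic matrix
                  [0.0,     f * kv / np.sin(theta), v0],   # theta: u-v angle
                  [0.0,     0.0,                    1.0]])
    Rt = np.hstack([R, np.reshape(np.asarray(t, float), (3, 1))])  # [r11.. | t1 t2 t3]
    m = A @ Rt @ np.append(np.asarray(M, float), 1.0)      # s * (u, v, 1)
    return m[0] / m[2], m[1] / m[2]                        # divide out s
```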
- Next, environment recognition in which the image information acquisition means 7 performs human recognition by referring to the measurement result of the distance information acquisition means 8 will be described.
- Here, a laser radar 83 is used as the distance measuring device 81 in the distance information acquisition means 8.
- When the distance information analysis means 82 detects an object having a width corresponding to the size of a person in the measurement data, the object is regarded as a human candidate.
- the image information acquisition means 7 performs person recognition in the image corresponding to the person candidate position.
- FIG. 23 shows an image on which recognition processing is performed with reference to the laser radar measurement result, and FIG. 24 shows the processing flow of human candidate detection with reference to the laser radar measurement result.
- The object distances are measured by the laser radar (S501), and the distance information analysis means 82 searches the distance measurement data for a human candidate (S502). If there is no human candidate (N in S503), the result that no person is present is output to the environment recognition means 9 (S504); if there is a human candidate (Y in S503), its position is converted from the laser radar coordinate system to the reference coordinate system of the autonomous mobile robot (S505). The position of the human candidate is then converted from the reference coordinate system to the normalized image coordinate system using Eq. (8) above, and its position on the normalized image coordinates is calculated (S506).
- A predetermined area around the human candidate position 22 projected on the image 21 is then set as the person detection area 90, according to the size of the person to be detected (S507).
- The environment recognition means 9 performs human detection processing within the person detection area 90 (S508); when a human candidate is detected there (Y in S509), it determines that the human candidate detected by the laser radar has been correctly confirmed as a person and outputs a signal to the travel control means 4 (S510).
- The above processing is performed for all human candidates (S511). By thus limiting the area in which image-based human detection is performed based on the laser radar measurement result, processing efficiency and the reliability of human detection can both be improved.
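- Putting the pieces together, the flow S501 to S511 amounts to the sketch below; find_leg_candidates, radar_to_robot, project_to_image, and detect_head_center are the hypothetical helpers sketched earlier, and the half-size of the person detection area 90 is an assumed parameter.

```python
import math

ROI_HALF = 60  # px, assumed half-size of person detection area 90

def detect_people(scan, gray_image, R, t):
    """Laser candidates first (S501-S502), then image verification (S508)."""
    people = []
    for idx, dist in find_leg_candidates(scan):            # S502
        theta = math.radians(0.5 * idx)                    # assumes scan starts at 0 deg
        xr, yr = radar_to_robot(dist, theta)               # S505
        u, v = project_to_image((xr, yr, 0.0), R, t)       # S506, Eq. (8)
        roi = gray_image[max(0, int(v) - ROI_HALF):int(v) + ROI_HALF,
                         max(0, int(u) - ROI_HALF):int(u) + ROI_HALF]  # area 90 (S507)
        if roi.size and detect_head_center(roi) is not None:  # S508-S509
            people.append((xr, yr))                        # confirmed person (S510)
    return people                                          # loop over all candidates (S511)
```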
- Next, environment recognition in which the distance information acquisition means 8 obtains distance information for a human candidate by referring to the processing result of the image information acquisition means 7 will be described.
- FIG. 25A and FIG. 25B show laser radar measurement based on the image processing result, FIG. 25C shows an image of the measurement result, and FIG. 26 shows the processing flow of human detection by laser radar measurement with reference to the image processing result.
- the omnidirectional camera 74 is used as the imaging device 71 in the image information acquisition means 7, and a laser radar 83 is used as the distance measurement device 81 in the distance information acquisition means 8.
- The axes of the camera coordinate system XC-YC-ZC of the omnidirectional camera 74 and of the laser radar coordinate system XL-YL-ZL coincide in direction; the omnidirectional camera 74 is installed with the origin of its coordinate system shifted upward along the ZL axis of the laser radar coordinate system.
- First, a human candidate is detected by the image recognition processing means 72 (S601). If no candidate exists (N in S602), a result indicating that no person is present is output to the environment recognition means 9 (S603); if a candidate exists (Y in S602), the angle φ between the direction of the detected candidate's head position and the XC axis of the camera coordinates is calculated (S604).
- The laser radar 83 measures the area around the direction at angle φ from the XL axis of the laser radar coordinates (S605), and the distance information analysis means 82 and environment recognition means 9 search it for an object having a width corresponding to the size of a person (S606); if such an object is detected (Y in S607), the result that a person has been detected is output to the travel control means 4 (S608).
- The above processing is performed for all human candidates obtained from the image (S609).
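- For this reverse flow, the key step is converting the detected head position into a radar scan direction; because the omnidirectional camera axis coincides with the radar ZL axis, the bearing is simply the polar angle of the head pixel around the image center. A sketch follows; the sector half-width and the pixel values in the usage example are assumptions, and the radar would then be commanded, through whatever driver interface is available, to scan that sector.

```python
import math

def head_bearing(head_px, center_px):
    """Bearing phi of a head detected in the omnidirectional image (S604);
    since the camera and radar axes coincide, phi can be used directly as
    the scan direction from the XL axis.  Arguments are (x, y) pixels."""
    return math.atan2(head_px[1] - center_px[1], head_px[0] - center_px[0])

# scan, e.g., +/- 5 degrees around the bearing (S605), then search the
# returned distances for a person-width object with find_leg_candidates (S606)
phi = head_bearing((400, 180), (320, 240))
sector = (phi - math.radians(5), phi + math.radians(5))
```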
- FIG. 27A shows an image in which the person is aware of the device, FIG. 27B an image in which the person is unaware, and FIG. 28 a graph of the relationship between the distance to the person and the skin color area of the face.
- Whether or not a person has noticed the autonomous mobile robot 1 is determined by the image recognition processing means 72 as follows. The image recognition processing means 72 includes a skin color region detection unit that detects the skin color region and a shape feature processing unit that performs arithmetic processing on the shape of the detected skin color region; when the detected skin color region satisfies a predetermined shape feature condition, the image recognition processing means 72 recognizes the skin color region as a human candidate.
- The degree of awareness is judged to be higher the more directly the person faces the autonomous mobile robot, and lower the more the person turns sideways.
- The orientation of the face with respect to the autonomous mobile robot is determined from the size of the skin color area within the head region. As shown in FIGS. 27A and 27B, the skin color area generally increases as the person faces the front. The size of a person's head can be assumed to be roughly constant regardless of individual differences, so, as shown in FIG. 28, the skin color area Si of a standard face viewed from the front can be expressed as a function of the distance Di from the autonomous mobile robot to the person. This relationship between the distance Di and the standard skin color area Si of the face can be obtained in advance by experiment.
- FIG.29 shows the flow of the human detection process.
- Here a laser radar 83 is used as the distance measuring device 81 of the distance information acquisition means 8, and a fixed camera 70 as the imaging device 71 of the image information acquisition means 7.
- Steps S701 to S709 in this processing flow are the same as steps S501 to S509 of the image processing flow based on the laser radar measurement result in FIG. 24, so their description is omitted.
- When a person is detected in step S709 (Y in S709), processing is performed to determine whether the detected person is aware of the autonomous mobile robot (S710).
- FIG. 30 shows the processing flow for determining whether or not the person has noticed the autonomous mobile robot.
- The image recognition processing means 72 measures the area smi of the flesh color region 24 in which the person was detected, for example by counting pixels (S801). The standard skin color area si corresponding to the measured distance to the person is then determined (S802), and the ratio smi/si between the measured flesh color area and the standard area is computed and compared with a predetermined threshold TH (S803).
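- A sketch of the awareness decision (S801 to S803); the inverse-square model for the standard frontal skin area si as a function of distance is an assumed stand-in for the experimentally obtained curve of FIG. 28, and the threshold value is illustrative.

```python
def standard_skin_area(dist_m, a=12000.0):
    """Assumed FIG. 28 model: the apparent frontal face area falls off with
    the square of the distance; a is a calibration constant (pixels * m^2)."""
    return a / (dist_m * dist_m)

def is_aware(skin_pixels, dist_m, th=0.6):
    """smi / si >= TH -> the person is judged to be facing the robot (S803)."""
    si = standard_skin_area(dist_m)   # standard area for this distance (S802)
    return skin_pixels / si >= th     # compare ratio with threshold TH
```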
- The autonomous mobile robot 1 of FIG. 31A carries one laser radar 83 and two cameras 70 on a rotation mechanism 70a; that of FIG. 31B carries two laser radars 83 and two cameras 70 on a rotation mechanism 70a; and that of FIG. 31C carries one laser radar 83, a set of ultrasonic sensors 84, and two cameras 70 on a rotation mechanism 70a.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Robotics (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/549,110 US7684894B2 (en) | 2003-03-14 | 2004-03-12 | Autonomously moving robot |
DE112004000438T DE112004000438T5 (de) | 2003-03-14 | 2004-03-12 | Autonom sich bewegender Roboter |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-070728 | 2003-03-14 | ||
JP2003070728A JP3879848B2 (ja) | 2003-03-14 | 2003-03-14 | 自律移動装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004081683A1 true WO2004081683A1 (ja) | 2004-09-23 |
Family
ID=32984667
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/003355 WO2004081683A1 (ja) | 2003-03-14 | 2004-03-12 | 自律移動ロボット |
Country Status (6)
Country | Link |
---|---|
US (1) | US7684894B2 (ja) |
JP (1) | JP3879848B2 (ja) |
KR (1) | KR100773184B1 (ja) |
DE (1) | DE112004000438T5 (ja) |
TW (1) | TWI242701B (ja) |
WO (1) | WO2004081683A1 (ja) |
Families Citing this family (108)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4595436B2 (ja) * | 2004-03-25 | 2010-12-08 | 日本電気株式会社 | ロボット、その制御方法及び制御用プログラム |
JP4564447B2 (ja) * | 2004-12-14 | 2010-10-20 | 本田技研工業株式会社 | 自律移動ロボット |
KR100954621B1 (ko) * | 2005-02-23 | 2010-04-27 | 파나소닉 전공 주식회사 | 자동운전차량 및 평면 장애물인식방법 |
JP4093261B2 (ja) * | 2005-03-15 | 2008-06-04 | 松下電工株式会社 | 自律移動装置 |
JP4882275B2 (ja) * | 2005-05-18 | 2012-02-22 | パナソニック電工株式会社 | 自律移動ロボット及びその移動状況記録システム |
US7783382B2 (en) * | 2006-02-24 | 2010-08-24 | Qisda Corporation | Controlling machine actions based on luminance of environmental light and distance from user |
JP4970861B2 (ja) * | 2006-07-07 | 2012-07-11 | 本田技研工業株式会社 | 車両周辺監視システム、車両、車両周辺監視プログラム、および車両周辺監視システムの構築システム |
KR100966875B1 (ko) * | 2006-09-26 | 2010-06-29 | 삼성전자주식회사 | 전방위 영상을 이용한 로봇의 위치 결정방법 |
JP4314266B2 (ja) * | 2006-11-22 | 2009-08-12 | キヤノン株式会社 | 画像制御装置及びその制御方法 |
JP4528295B2 (ja) * | 2006-12-18 | 2010-08-18 | 株式会社日立製作所 | 案内ロボット装置及び案内システム |
JP4871160B2 (ja) | 2007-02-16 | 2012-02-08 | 株式会社東芝 | ロボットおよびその制御方法 |
JP4953313B2 (ja) * | 2007-09-10 | 2012-06-13 | 公立大学法人首都大学東京 | 環境認識システム、自律型移動体および環境認識プログラム |
KR100884904B1 (ko) * | 2007-09-12 | 2009-02-19 | 아주대학교산학협력단 | 평행 투영 모델을 이용한 자기위치 인식 방법 |
JP5085251B2 (ja) * | 2007-09-25 | 2012-11-28 | パナソニック株式会社 | 自律移動装置 |
JP4978494B2 (ja) * | 2008-02-07 | 2012-07-18 | トヨタ自動車株式会社 | 自律移動体、及びその制御方法 |
KR101415297B1 (ko) | 2008-04-16 | 2014-07-07 | 삼성전자주식회사 | 로봇 지도 생성 방법 및 로봇 지도 이용 방법 및 로봇지도를 가지는 로봇 |
JP4670899B2 (ja) * | 2008-05-22 | 2011-04-13 | 村田機械株式会社 | 走行車 |
KR101556593B1 (ko) * | 2008-07-15 | 2015-10-02 | 삼성전자주식회사 | 영상 처리 방법 |
TWM348676U (en) * | 2008-07-22 | 2009-01-11 | Iner Aec Executive Yuan | Environmental survey robot |
TWI408397B (zh) * | 2008-08-15 | 2013-09-11 | Univ Nat Chiao Tung | Automatic navigation device with ultrasonic and computer vision detection and its navigation method |
WO2010021090A1 (ja) * | 2008-08-20 | 2010-02-25 | パナソニック株式会社 | 距離推定装置、距離推定方法、プログラム、集積回路およびカメラ |
EP3358434A1 (en) | 2008-10-01 | 2018-08-08 | Murata Machinery, Ltd. | Autonomous mobile device |
TWI396830B (zh) * | 2008-11-28 | 2013-05-21 | Univ Nat Taiwan | 巡邏裝置及其巡邏路徑規劃方法 |
KR101006211B1 (ko) * | 2008-12-01 | 2011-01-07 | 한국기술교육대학교 산학협력단 | 센서를 구비한 이동체의 센싱 범위 확장 방법 |
GB0919158D0 (en) * | 2009-11-02 | 2009-12-16 | Onslow Leigh M The Viscountess | Multi-function monitor |
CN102782600B (zh) | 2009-11-27 | 2015-06-24 | 丰田自动车株式会社 | 自动移动体及其控制方法 |
TWI394933B (zh) * | 2010-01-28 | 2013-05-01 | Univ Nat Kaohsiung Applied Sci | Image path planning guidance system |
JP5560978B2 (ja) * | 2010-07-13 | 2014-07-30 | 村田機械株式会社 | 自律移動体 |
TWI409605B (zh) * | 2010-07-14 | 2013-09-21 | Qisda Corp | 能自動定位移動的電子裝置及讓其移動件自動歸位的方法 |
JP5621483B2 (ja) * | 2010-10-05 | 2014-11-12 | トヨタ自動車株式会社 | ロボット及びその制御方法 |
US9323250B2 (en) | 2011-01-28 | 2016-04-26 | Intouch Technologies, Inc. | Time-dependent navigation of telepresence robots |
WO2012103525A2 (en) | 2011-01-28 | 2012-08-02 | Intouch Technologies, Inc. | Interfacing with a mobile telepresence robot |
KR101049155B1 (ko) * | 2011-02-01 | 2011-07-14 | 국방과학연구소 | 자율이동장치의 장애물 판단방법 및 이를 이용한 자율이동장치 |
US9098611B2 (en) | 2012-11-26 | 2015-08-04 | Intouch Technologies, Inc. | Enhanced video interaction for a user interface of a telepresence network |
US20140139616A1 (en) | 2012-01-27 | 2014-05-22 | Intouch Technologies, Inc. | Enhanced Diagnostics for a Telepresence Robot |
US9259842B2 (en) | 2011-06-10 | 2016-02-16 | Microsoft Technology Licensing, Llc | Interactive robot initialization |
US8761933B2 (en) * | 2011-08-02 | 2014-06-24 | Microsoft Corporation | Finding a called party |
US8972055B1 (en) | 2011-08-19 | 2015-03-03 | Google Inc. | Methods and systems for selecting a velocity profile for controlling a robotic device |
JP5442050B2 (ja) * | 2012-02-15 | 2014-03-12 | 本田技研工業株式会社 | 車両周辺監視システム |
KR101272604B1 (ko) * | 2012-04-25 | 2013-06-07 | 주식회사 유진로봇 | 탑승형 로봇 및 이를 포함하는 탑승형 로봇 운용 시스템 |
US9361021B2 (en) | 2012-05-22 | 2016-06-07 | Irobot Corporation | Graphical user interfaces including touchpad driving interfaces for telemedicine devices |
EP2852881A4 (en) | 2012-05-22 | 2016-03-23 | Intouch Technologies Inc | GRAPHIC USER INTERFACES CONTAINING TOUCH PAD TOUCH INTERFACES FOR TELEMEDICINE DEVICES |
JP5314788B2 (ja) * | 2012-06-11 | 2013-10-16 | パナソニック株式会社 | 自律移動装置 |
US9321173B2 (en) | 2012-06-22 | 2016-04-26 | Microsoft Technology Licensing, Llc | Tracking and following people with a mobile robotic device |
US9186793B1 (en) * | 2012-08-31 | 2015-11-17 | Brain Corporation | Apparatus and methods for controlling attention of a robot |
TWI459170B (zh) * | 2012-10-04 | 2014-11-01 | Ind Tech Res Inst | 行進控制裝置以及具有該行進控制裝置之自動引導載具 |
JP6123370B2 (ja) * | 2013-03-12 | 2017-05-10 | 株式会社豊田中央研究所 | 障害物までの距離を計測して移動する自律式移動装置 |
US9141852B1 (en) * | 2013-03-14 | 2015-09-22 | Toyota Jidosha Kabushiki Kaisha | Person detection and pose estimation system |
JP5672327B2 (ja) * | 2013-03-19 | 2015-02-18 | 株式会社安川電機 | ロボットシステム |
WO2014156733A1 (ja) * | 2013-03-26 | 2014-10-02 | 株式会社日立国際電気 | 人数計数装置および人数計数方法 |
JP2014197294A (ja) * | 2013-03-29 | 2014-10-16 | 株式会社日立産機システム | 位置同定装置、及びそれを備えた移動ロボット |
JP2015019689A (ja) * | 2013-07-16 | 2015-02-02 | アルプス電気株式会社 | 障害物検知装置とその検知方法 |
KR102085180B1 (ko) * | 2013-10-08 | 2020-03-05 | 삼성전자주식회사 | 몸 방향 추정 방법, 상기 방법을 기록한 컴퓨터 판독 가능 저장매체 및 몸 방향 추정 장치. |
US20150142251A1 (en) * | 2013-11-21 | 2015-05-21 | International Business Machines Corporation | Vehicle control based on colors representative of navigation information |
US9532031B1 (en) * | 2014-04-08 | 2016-12-27 | The United States Of America As Represented By The Secretary Of The Navy | Method for extrinsic camera calibration using a laser beam |
US10735902B1 (en) * | 2014-04-09 | 2020-08-04 | Accuware, Inc. | Method and computer program for taking action based on determined movement path of mobile devices |
JP6599603B2 (ja) * | 2014-04-18 | 2019-10-30 | 東芝ライフスタイル株式会社 | 自律走行体 |
JP6344473B2 (ja) * | 2014-07-16 | 2018-06-27 | 株式会社リコー | システム、機械、制御方法、プログラム |
CN104216411B (zh) * | 2014-09-27 | 2016-11-09 | 江阴润玛电子材料股份有限公司 | 一种电子电路中的巡线方法 |
US9434069B1 (en) | 2014-11-10 | 2016-09-06 | Google Inc. | Motion heat map |
KR101681187B1 (ko) * | 2015-01-21 | 2016-12-01 | 한국과학기술연구원 | 로봇 위치 측정 시스템 및 방법 |
US9880263B2 (en) * | 2015-04-06 | 2018-01-30 | Waymo Llc | Long range steerable LIDAR system |
JP2016064829A (ja) * | 2015-12-28 | 2016-04-28 | エイディシーテクノロジー株式会社 | 車両制御装置 |
JP6528280B2 (ja) * | 2015-12-28 | 2019-06-12 | 株式会社エクォス・リサーチ | 移動体 |
WO2017152390A1 (zh) * | 2016-03-09 | 2017-09-14 | 广州艾若博机器人科技有限公司 | 地图构建方法、纠正方法及装置 |
WO2017159680A1 (ja) * | 2016-03-17 | 2017-09-21 | 日本電気株式会社 | 捜索支援装置、捜索支援システム、捜索支援方法及びプログラム記録媒体 |
CN106092091B (zh) * | 2016-08-10 | 2019-07-02 | 京东方科技集团股份有限公司 | 电子机器设备 |
CN107356929B (zh) * | 2016-08-29 | 2020-07-28 | 北醒(北京)光子科技有限公司 | 一种快速扫描探测方法 |
EP4194887A1 (en) * | 2016-09-20 | 2023-06-14 | Innoviz Technologies Ltd. | Lidar systems and methods |
US11094228B2 (en) * | 2016-11-14 | 2021-08-17 | Sony Corporation | Information processing device, information processing method, and recording medium |
KR102681557B1 (ko) * | 2017-02-24 | 2024-07-03 | 엘지전자 주식회사 | 이동 로봇 및 그 제어방법 |
JP6938969B2 (ja) * | 2017-03-07 | 2021-09-22 | 富士フイルムビジネスイノベーション株式会社 | 環境測定システムおよびプログラム |
JP2018146440A (ja) * | 2017-03-07 | 2018-09-20 | 株式会社豊田自動織機 | 環境認識装置 |
JP6776960B2 (ja) * | 2017-03-14 | 2020-10-28 | トヨタ自動車株式会社 | 自律移動体 |
WO2018173595A1 (ja) * | 2017-03-22 | 2018-09-27 | 日本電産株式会社 | 移動装置 |
JP6830656B2 (ja) * | 2017-03-30 | 2021-02-17 | 株式会社エクォス・リサーチ | 対象物判定装置および対象物判定プログラム |
KR20180134230A (ko) * | 2017-06-08 | 2018-12-18 | 삼성전자주식회사 | 청소 로봇 및 그 제어 방법 |
KR102391914B1 (ko) * | 2017-06-30 | 2022-04-27 | 엘지전자 주식회사 | 이동 로봇의 동작 방법 |
US10523880B2 (en) * | 2017-09-28 | 2019-12-31 | Waymo Llc | Synchronized spinning LIDAR and rolling shutter camera system |
EP3732542A4 (en) * | 2017-12-29 | 2021-08-25 | PlusAI Corp | METHOD AND SYSTEM FOR STEREO-BASED VEHICLE POSITION ESTIMATION |
JP7060231B2 (ja) * | 2018-02-06 | 2022-04-26 | 株式会社Soken | 移動体制御装置 |
EP3761136B1 (en) * | 2018-02-28 | 2022-10-26 | Honda Motor Co., Ltd. | Control device, mobile body, and program |
JP6894595B2 (ja) * | 2018-03-28 | 2021-06-30 | 株式会社エクォス・リサーチ | 移動体 |
JP6788915B2 (ja) * | 2018-06-21 | 2020-11-25 | クモノスコーポレーション株式会社 | 3dレーザスキャナ、3dレーザスキャナシステム、建設作業機械及び建設工事方法 |
WO2020060267A1 (en) * | 2018-09-20 | 2020-03-26 | Samsung Electronics Co., Ltd. | Cleaning robot and method for performing task thereof |
JP6559864B1 (ja) * | 2018-10-04 | 2019-08-14 | 関西電力株式会社 | 走行制御装置及び走行制御方法、並びに自動走行車 |
TWI721324B (zh) * | 2018-10-10 | 2021-03-11 | 鴻海精密工業股份有限公司 | 電子裝置及立體物體的判斷方法 |
JP2020071182A (ja) * | 2018-11-01 | 2020-05-07 | パナソニックIpマネジメント株式会社 | 運転支援装置、車両および運転支援方法 |
JP7390312B2 (ja) * | 2019-01-10 | 2023-12-01 | 株式会社小糸製作所 | LiDARセンサユニット |
US11338438B2 (en) * | 2019-01-25 | 2022-05-24 | Bear Robotics, Inc. | Method, system and non-transitory computer-readable recording medium for determining a movement path of a robot |
EP3924868A4 (en) * | 2019-02-11 | 2022-11-30 | 643AI Ltd. | SYSTEMS AND PROCEDURES FOR MANAGEMENT OF MULTIPLE AUTONOMOUS VEHICLES |
US11597104B2 (en) * | 2019-07-31 | 2023-03-07 | X Development Llc | Mobile robot sensor configuration |
JP2021056764A (ja) * | 2019-09-30 | 2021-04-08 | 日本電産株式会社 | 移動体 |
JP7408334B2 (ja) * | 2019-10-03 | 2024-01-05 | Thk株式会社 | 画像処理装置及び移動体制御システム |
CN115038990A (zh) * | 2020-01-31 | 2022-09-09 | 日产自动车株式会社 | 物体识别方法及物体识别装置 |
WO2021166637A1 (ja) * | 2020-02-17 | 2021-08-26 | ソニーグループ株式会社 | 運搬ロボット、制御方法、プログラム、および制御装置 |
JP7404137B2 (ja) | 2020-04-01 | 2023-12-25 | 株式会社豊田中央研究所 | 顔画像処理装置及び顔画像処理プログラム |
WO2021252425A1 (en) * | 2020-06-08 | 2021-12-16 | Brain Corporation | Systems and methods for wire detection and avoidance of the same by robots |
US11767037B2 (en) | 2020-09-22 | 2023-09-26 | Argo AI, LLC | Enhanced obstacle detection |
CN112526984B (zh) * | 2020-09-30 | 2024-06-21 | 深圳银星智能集团股份有限公司 | 一种机器人避障方法、装置及机器人 |
KR20220064222A (ko) * | 2020-11-11 | 2022-05-18 | 삼성전자주식회사 | 전자 장치 및 그 제어 방법 |
JP7521442B2 (ja) | 2021-02-01 | 2024-07-24 | トヨタ自動車株式会社 | 自律移動システム、自律移動方法及び自律移動プログラム |
JP7404282B2 (ja) | 2021-02-10 | 2023-12-25 | 株式会社豊田中央研究所 | 顔モデルパラメータ推定装置、顔モデルパラメータ推定方法及び顔モデルパラメータ推定プログラム |
CN113110481B (zh) * | 2021-04-26 | 2024-02-06 | 上海智蕙林医疗科技有限公司 | 一种应急避让实现方法、***、机器人和存储介质 |
CN113610910B (zh) * | 2021-07-30 | 2024-04-09 | 合肥科大智能机器人技术有限公司 | 一种移动机器人避障方法 |
KR20230039421A (ko) * | 2021-09-14 | 2023-03-21 | 삼성전자주식회사 | 전자 장치 및 이의 제어 방법 |
KR102425271B1 (ko) | 2021-12-29 | 2022-07-27 | 주식회사 파이엇 | 장애물회피방법을 구비한 자율주행로봇 |
KR102654371B1 (ko) * | 2022-02-07 | 2024-04-02 | 백수빈 | 지상용 자율주행 무인정찰 장치 |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6371604A (ja) * | 1986-09-12 | 1988-04-01 | Hideo Mori | 領域分割したカラ−画像を使つて道路境界と障害物を検出する方式 |
JPS63124114A (ja) * | 1986-11-14 | 1988-05-27 | Hitachi Ltd | 移動体用環境認識装置 |
JPS63213005A (ja) * | 1987-03-02 | 1988-09-05 | Hitachi Ltd | 移動体誘導方法 |
JPH036710A (ja) * | 1989-06-05 | 1991-01-14 | Toshiba Corp | 追随移動ロボット制御装置 |
JPH0412218A (ja) * | 1990-05-01 | 1992-01-16 | Toyota Motor Corp | 車両用距離検出装置 |
JP3263699B2 (ja) | 1992-12-22 | 2002-03-04 | 三菱電機株式会社 | 走行環境監視装置 |
JP2501010B2 (ja) * | 1993-10-25 | 1996-05-29 | インターナショナル・ビジネス・マシーンズ・コーポレイション | 移動ロボットの誘導装置 |
JPH07213753A (ja) * | 1994-02-02 | 1995-08-15 | Hitachi Ltd | パーソナルロボット装置 |
JPH07325620A (ja) * | 1994-06-02 | 1995-12-12 | Hitachi Ltd | 知能ロボット装置及び知能ロボットシステム |
JPH09326096A (ja) * | 1996-06-06 | 1997-12-16 | Nissan Motor Co Ltd | 車両用人的障害物検知装置 |
EP0913751B1 (de) * | 1997-11-03 | 2003-09-03 | Volkswagen Aktiengesellschaft | Autonomes Fahrzeug und Verfahren zur Steuerung eines autonomen Fahrzeuges |
JP3660492B2 (ja) * | 1998-01-27 | 2005-06-15 | 株式会社東芝 | 物体検知装置 |
JP2000123298A (ja) | 1998-10-13 | 2000-04-28 | Japan Aviation Electronics Industry Ltd | 障害物位置検出装置およびその検出方法 |
JP2000326274A (ja) | 1999-05-24 | 2000-11-28 | Nec Corp | 自律行動ロボット |
JP3648604B2 (ja) | 2000-10-26 | 2005-05-18 | 松下電工株式会社 | 自律移動装置 |
JP3823760B2 (ja) | 2001-05-28 | 2006-09-20 | 日本電気株式会社 | ロボット装置 |
US6664918B2 (en) * | 2002-01-09 | 2003-12-16 | Mia-Com, Inc. | Method and apparatus for identifying complex objects based on range readings from multiple sensors |
JP4019736B2 (ja) * | 2002-02-26 | 2007-12-12 | トヨタ自動車株式会社 | 車両用障害物検出装置 |
JP4082190B2 (ja) * | 2002-11-26 | 2008-04-30 | 松下電工株式会社 | 人の存在位置検出装置とその検出方法及び同検出装置を用いた自律移動装置 |
- 2003
- 2003-03-14 JP JP2003070728A patent/JP3879848B2/ja not_active Expired - Fee Related
- 2004
- 2004-03-12 DE DE112004000438T patent/DE112004000438T5/de not_active Withdrawn
- 2004-03-12 TW TW093106652A patent/TWI242701B/zh not_active IP Right Cessation
- 2004-03-12 KR KR1020057017184A patent/KR100773184B1/ko not_active IP Right Cessation
- 2004-03-12 US US10/549,110 patent/US7684894B2/en not_active Expired - Fee Related
- 2004-03-12 WO PCT/JP2004/003355 patent/WO2004081683A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02259912A (ja) * | 1989-03-31 | 1990-10-22 | Glory Ltd | 移動体の自己定位方法 |
JPH11104984A (ja) * | 1997-10-06 | 1999-04-20 | Fujitsu Ltd | 実環境情報表示装置及び実環境情報表示処理を実行するプログラムを記録したコンピュータ読み取り可能な記録媒体 |
JPH11198075A (ja) * | 1998-01-08 | 1999-07-27 | Mitsubishi Electric Corp | 行動支援装置 |
JP2003280739A (ja) * | 2002-03-26 | 2003-10-02 | Matsushita Electric Works Ltd | 案内用自律移動ロボットとその制御方法 |
JP2004171165A (ja) * | 2002-11-19 | 2004-06-17 | Honda Motor Co Ltd | 移動装置 |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7602133B2 (en) | 2006-05-01 | 2009-10-13 | Samsung Electronics Co., Ltd. | Robot having an obstacle detection unit and method of controlling the same |
US20150006016A1 (en) * | 2008-01-28 | 2015-01-01 | Seegrid Corporation | Service robot and method of operating same |
US9603499B2 (en) * | 2008-01-28 | 2017-03-28 | Seegrid Corporation | Service robot and method of operating same |
US8340901B2 (en) | 2009-05-20 | 2012-12-25 | National Taiwan University Of Science And Technology | Mobile robot and path planning method thereof for manipulating target objects |
CN104718507A (zh) * | 2012-11-05 | 2015-06-17 | 松下知识产权经营株式会社 | 自主行走装置的行走信息生成装置、方法及程序、以及自主行走装置 |
CN107463175A (zh) * | 2017-08-04 | 2017-12-12 | 河南工程学院 | 自动避障小车、避障方法及*** |
US11995995B2 (en) | 2018-09-04 | 2024-05-28 | Sony Corporation | Information processing device, information processing method, program, and mobile device |
CN112602030A (zh) * | 2018-09-04 | 2021-04-02 | 索尼公司 | 信息处理设备、信息处理方法、程序和移动设备 |
WO2021233007A1 (zh) * | 2020-05-18 | 2021-11-25 | 科沃斯机器人股份有限公司 | 一种自移动机器人的控制方法、***及自移动机器人 |
CN112232272B (zh) * | 2020-11-02 | 2023-09-08 | 上海有个机器人有限公司 | 一种激光与视觉图像传感器融合的行人识别方法 |
CN112232272A (zh) * | 2020-11-02 | 2021-01-15 | 上海有个机器人有限公司 | 一种激光与视觉图像传感器融合的行人识别方法 |
CN113433560A (zh) * | 2021-06-25 | 2021-09-24 | 北京铁道工程机电技术研究所股份有限公司 | 一种机器人侧边巡检的定位方法、装置、电子设备及介质 |
CN113433560B (zh) * | 2021-06-25 | 2023-12-26 | 北京铁道工程机电技术研究所股份有限公司 | 一种机器人侧边巡检的定位方法、装置、电子设备及介质 |
TWI769924B (zh) * | 2021-09-15 | 2022-07-01 | 東元電機股份有限公司 | 人體跟隨系統 |
Also Published As
Publication number | Publication date |
---|---|
US20060184274A1 (en) | 2006-08-17 |
KR100773184B1 (ko) | 2007-11-02 |
TW200508826A (en) | 2005-03-01 |
KR20050108396A (ko) | 2005-11-16 |
JP2004280451A (ja) | 2004-10-07 |
DE112004000438T5 (de) | 2006-01-26 |
JP3879848B2 (ja) | 2007-02-14 |
TWI242701B (en) | 2005-11-01 |
US7684894B2 (en) | 2010-03-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004081683A1 (ja) | 自律移動ロボット | |
CN108406731B (zh) | 一种基于深度视觉的定位装置、方法及机器人 | |
KR102113955B1 (ko) | 선박 및 항만 모니터링 장치 및 방법 | |
JP6132659B2 (ja) | 周囲環境認識装置、それを用いた自律移動システムおよび周囲環境認識方法 | |
US8209074B2 (en) | Robot and method for controlling the same | |
US7277559B2 (en) | Mobile apparatus | |
KR101581197B1 (ko) | 로봇 및 그 제어방법 | |
Kim et al. | Moving obstacle avoidance of a mobile robot using a single camera | |
EP3226207B1 (en) | Automatic operation vehicle | |
US8842162B2 (en) | Method and system for improving surveillance of PTZ cameras | |
KR20100031277A (ko) | 전방 영상을 이용한 위치 인식 장치 및 방법 | |
JP6524529B2 (ja) | 建築限界判定装置 | |
JP6919882B2 (ja) | 人推定システムおよび推定プログラム | |
KR20210090574A (ko) | 선박 및 항만 모니터링 장치 및 방법 | |
JPH11257931A (ja) | 物体認識装置 | |
JP2000293693A (ja) | 障害物検出方法および装置 | |
CN114341930A (zh) | 图像处理装置、拍摄装置、机器人以及机器人*** | |
JP2004301607A (ja) | 移動物体検出装置、移動物体検出方法及び移動物体検出プログラム | |
Poomarin et al. | Automatic docking with obstacle avoidance of a differential wheel mobile robot | |
JP3237705B2 (ja) | 障害物検出装置および障害物検出装置を搭載した移動体 | |
KR101784584B1 (ko) | 레이저 회전을 이용하여 3차원 물체를 판별하는 장치 및 방법 | |
AU2020317303B2 (en) | Information processing device, data generation method, and program | |
JP4742695B2 (ja) | 視線認識装置、及び視線認識方法 | |
KR101997563B1 (ko) | 이동 기기의 위치 인식 방법 | |
US20230324199A1 (en) | Information processing apparatus, movable apparatus, information processing method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 1020057017184 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057017184 Country of ref document: KR |
|
RET | De translation (de og part 6b) |
Ref document number: 112004000438 Country of ref document: DE Date of ref document: 20060126 Kind code of ref document: P |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112004000438 Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006184274 Country of ref document: US Ref document number: 10549110 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase | ||
WWP | Wipo information: published in national office |
Ref document number: 10549110 Country of ref document: US |
|
REG | Reference to national code |
Ref country code: DE Ref legal event code: 8607 |