WO2023100415A1 - Information processing device, movable body, information processing method and program - Google Patents


Info

Publication number
WO2023100415A1
Authority
WO
WIPO (PCT)
Prior art keywords
information processing
image
view image
control unit
bird
Application number
PCT/JP2022/028507
Other languages
French (fr)
Japanese (ja)
Inventor
則政 岸
雅司 高田
秀雄 廣重
Original Assignee
株式会社光庭インフォ
Application filed by 株式会社光庭インフォ
Priority to JP2023564741A (national-phase publication JPWO2023100415A1)
Publication of WO2023100415A1

Links

Images

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 - Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle, with a predetermined field of view
    • B60R1/27 - Real-time viewing arrangements for viewing an area outside the vehicle, e.g. the exterior of the vehicle, with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37 - Details of the operation on graphic patterns
    • G09G5/377 - Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to an information processing device, a moving object, an information processing method, and a program.
  • Japanese Patent Laid-Open No. 2002-200001 discloses a technique aimed at providing a display that makes it easy for a passenger to understand the traveling direction of a moving object and to concentrate on monitoring the surroundings.
  • The present invention presents the surrounding information of a moving object in an easy-to-understand manner.
  • An information processing device according to the present invention has a control unit.
  • The control unit generates one image based on a bird's-eye view image regarding the traveling direction of the moving object and a top view image including the moving object and its surroundings.
  • The one image includes a first area displaying the bird's-eye view image related to the traveling direction and a second area displaying the top view image. The control unit performs control so that the one image is displayed on a display unit visible to the operator of the moving object.
  • FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 110.
  • FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110.
  • FIG. 4 is an activity diagram showing an example of information processing by the information processing device 110.
  • FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image.
  • FIG. 6 is a diagram (part 1) showing an example of a top view image.
  • FIG. 7 is a diagram (part 1) showing an example of a synthesized image.
  • FIG. 8 is a diagram (part 1) for explaining adjustment of the aspect ratio.
  • FIG. 9 is a diagram (part 2) for explaining adjustment of the aspect ratio.
  • FIG. 10 is a diagram (part 3) for explaining adjustment of the aspect ratio.
  • FIG. 11 is a diagram (part 2) showing an example of a bird's-eye view image.
  • FIG. 12 is a diagram (part 2) showing an example of a top view image.
  • FIG. 13 is a diagram (part 2) showing an example of a synthesized image.
  • FIG. 14 is a diagram showing an example of a displacement curve when per decreases as the vehicle moves away from an object.
  • FIG. 15 is a diagram showing an example of a displacement curve when per increases as the vehicle approaches an object.
  • FIG. 16 is a diagram showing an example of a top view image and a bird's-eye view image when the vehicle is in reverse.
  • FIG. 17 is a diagram showing an example of a synthesized image.
  • FIG. 18 is a diagram illustrating an example of image processing for a top view image and a bird's-eye view image.
  • FIG. 19 is a diagram showing an example in which projective transformation is performed so that a rectangle becomes an inverted trapezoid in a bird's-eye view image.
  • FIGS. 20A and 20B are diagrams illustrating an example of image processing for a top view image and a bird's-eye view image in Modification 3.
  • FIG. 21 is a diagram showing an example in which a distance sensor is provided in the automobile 100.
  • FIG. 22 is a diagram illustrating an example of a display mode of a bird's-eye view image.
  • The term "unit" may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing that can be concretely realized by these hardware resources.
  • Various kinds of information are handled in the present embodiment. This information is represented, for example, by physical signal values of voltage and current, by high and low signal levels as aggregates of binary bits composed of 0s and 1s, or by quantum superposition (so-called quantum bits), and communication and computation can be performed on a circuit in the broad sense.
  • A circuit in the broad sense is a circuit realized by appropriately combining at least circuits, circuitry, processors, memories, and the like. That is, it includes application-specific integrated circuits (ASICs) and programmable logic devices (for example, simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field-programmable gate arrays (FPGAs)).
  • FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000.
  • The information processing system 1000 includes an automobile 100 as its system configuration.
  • An automobile is an example of a moving object.
  • The automobile 100 includes an information processing device 110, a display 120, and multiple cameras 130.
  • The information processing device 110 is a device that executes the processing of this embodiment. In the present embodiment, the information processing device 110 is described as being included in the automobile 100, but it need not be included in the automobile 100 as long as it can communicate with the automobile 100 or the like.
  • The display 120 is a display device that displays a 3D synthesized image and the like, which will be described later, under the control of the information processing device 110.
  • The camera 130 is a camera that captures images of the surroundings of the automobile 100. More specifically, the camera 130 is an RGB camera or the like that captures images containing color information. A camera 130-1 is provided at the rear of the automobile 100, and a camera 130-2 is provided at the front of the automobile 100. Although not shown in FIG. 1, in addition to the cameras 130-1 and 130-2, cameras 130 are provided on the left and right sides of the automobile 100. These cameras are hereinafter simply referred to as the camera 130. In addition, although not shown in FIG. 1 for simplicity of explanation, the automobile 100 has a plurality of depth sensors. A depth sensor measures the shape of objects around the automobile 100 and the distance from the automobile 100 to those objects using laser light or the like. However, the depth sensor is not limited to laser light; it may measure distance using ultrasonic waves or using a camera with a depth sensor.
  • FIG. 2 is a diagram illustrating an example of the hardware configuration of the information processing device 110.
  • The information processing device 110 includes a control unit 201, a storage unit 202, and a communication unit 203 as its hardware configuration.
  • The control unit 201 is a CPU (Central Processing Unit) or the like and controls the entire information processing device 110.
  • The storage unit 202 is any one of an HDD (Hard Disk Drive), ROM (Read Only Memory), RAM (Random Access Memory), SSD (Solid State Drive), etc., or any combination thereof, and stores programs and data used when the control unit 201 executes processing based on a program.
  • The control unit 201 executes processing based on a program stored in the storage unit 202, thereby realizing the functional configuration of the information processing device 110 shown in FIG. 3, the processing of the activity diagram shown in FIG. 4, and the like.
  • In the present embodiment, data used when the control unit 201 executes processing based on a program is stored in the storage unit 202, but the data may instead be stored in another device with which the information processing device 110 can communicate.
  • The communication unit 203 is a NIC (Network Interface Card) or the like; it connects the information processing device 110 to a network and controls communication with other devices (e.g., the display 120 and the camera 130).
  • FIG. 3 is a diagram illustrating an example of the functional configuration of the information processing device 110.
  • The information processing device 110 includes a peripheral information receiving unit 301, an object recognition unit 302, a behavior information receiving unit 303, and a display control unit 304 as its functional configuration.
  • The peripheral information receiving unit 301 receives information on objects around the automobile 100, information on colors around the automobile 100, and the like from the camera 130 as peripheral information. Further, the peripheral information receiving unit 301 receives the shapes of objects around the automobile 100 and the distances from the automobile 100 to those objects as peripheral information from the plurality of depth sensors.
  • The object recognition unit 302 recognizes objects around the automobile 100 based on the peripheral information received by the peripheral information receiving unit 301.
  • Surrounding objects include, for example, buildings, roads, other vehicles, and parking lots.
  • The behavior information receiving unit 303 receives behavior information of the moving object. More specifically, the behavior information receiving unit 303 acquires information detected by a wheel speed sensor, a steering angle sensor, an inertial measurement unit (IMU), and the like included in the automobile 100 as the behavior information of the automobile 100. An inertial measurement unit is a device that detects three-dimensional inertial motion (translational motion and rotational motion along three orthogonal axes); rotational motion is detected in [deg/sec].
  • The behavior information receiving unit 303 may also receive information about the behavior of the automobile 100 from SLAM (Simultaneous Localization and Mapping), LiDAR, a stereo camera, a gyro sensor, or the like included in the automobile 100.
  • The behavior information includes, for example, position information and orientation information of the automobile 100.
  • The display control unit 304 converts a surrounding object into a 3D model based on information on the object recognized by the object recognition unit 302, information from SLAM, and the like.
  • The display control unit 304 also generates a 3D model of the automobile 100 based on information from the camera 130, information from SLAM, the behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like.
  • The display control unit 304 also generates a top view image including the automobile 100 and its surroundings based on the 3D model of the surrounding objects, the 3D model of the automobile 100, information from the camera 130, and the like.
  • The top view image is an image that has undergone perspective transformation so as to be viewed from a viewpoint above the road surface, and it includes the entire automobile 100 or at least half of the automobile 100.
  • For example, an image of the ground as seen from directly above is combined and morphed with an image of the automobile 100 and the like to produce the top view image.
  • The display control unit 304 generates one composite image based on the bird's-eye view image regarding the traveling direction of the automobile 100 and the top view image including the automobile 100 and its surroundings.
  • The bird's-eye view image regarding the traveling direction of the automobile 100 is, for example, a bird's-eye view image of the area in front of the automobile 100 when the automobile 100 is moving forward, and a bird's-eye view image of the area behind the automobile 100 when the automobile 100 is moving backward.
  • The forward bird's-eye view image is an image obtained by morphing the image captured by the camera 130-2 based on the peripheral information of the automobile 100 acquired by the peripheral information receiving unit 301.
  • The rear bird's-eye view image is obtained by morphing the image captured by the camera 130-1 based on the peripheral information of the automobile 100 acquired by the peripheral information receiving unit 301.
  • The synthesized image includes a first area displaying the bird's-eye view image related to the traveling direction and a second area displaying the top view image.
  • The composite image is shown in FIG. 7 and the like, which will be described later.
  • The display control unit 304 performs control to display the generated composite image on the display 120 of the automobile 100 or the like.
  • The display 120 of the automobile 100 is an example of a display visible to the operator of the automobile 100.
  • Part or all of the functional configuration of FIG. 3 may be implemented in the information processing device 110 or the automobile 100 as a hardware configuration. Also, part or all of the functional configuration of the information processing device 110 may be implemented in the automobile 100.
  • FIG. 4 is an activity diagram showing an example of information processing of the information processing device 110.
  • The peripheral information receiving unit 301 receives information on objects around the automobile 100, information on colors around the automobile 100, and the like from the camera 130 as peripheral information. Further, the peripheral information receiving unit 301 receives the shapes of objects around the automobile 100 and the distances from the automobile 100 to those objects as peripheral information from the plurality of depth sensors.
  • The object recognition unit 302 recognizes objects around the automobile 100 based on the peripheral information received by the peripheral information receiving unit 301.
  • The behavior information receiving unit 303 receives information about the behavior of the automobile 100.
  • The display control unit 304 converts the surrounding objects into a 3D model based on the information on the objects recognized by the object recognition unit 302, the information from SLAM, and the like.
  • The display control unit 304 also generates a 3D model of the automobile 100 based on information from the camera 130, information from SLAM, the behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like.
  • The display control unit 304 also generates a top view image including the automobile 100 and its surroundings based on the 3D model of the surrounding objects, the 3D model of the automobile 100, information from the camera 130, and the like.
  • The display control unit 304 also generates a bird's-eye view image regarding the traveling direction of the automobile 100 based on information from the camera 130 and the like.
  • In A405, the display control unit 304 generates one composite image based on the bird's-eye view image regarding the traveling direction of the automobile 100 and the top view image including the automobile 100 and its surroundings. In A406, the display control unit 304 performs control to display the generated composite image on the display 120 of the automobile 100 or the like.
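  • The compositing and display steps (A405, A406) can be sketched as follows. This is a minimal illustration, not the patent's implementation: `compose_view`, the `dif` split parameter, and the nearest-neighbour resizing are assumptions introduced here.

```python
import numpy as np

def compose_view(birdseye: np.ndarray, topview: np.ndarray,
                 screen_h: int, screen_w: int, dif: float) -> np.ndarray:
    """Stack the bird's-eye view image (first area, top of the screen)
    above the top view image (second area, bottom of the screen).

    dif is the fraction of the screen height given to the first area,
    e.g. 0.5 for a 1:1 split of Hd and Hf.
    """
    hd = int(round(screen_h * dif))   # height of the first area (Hd)
    hf = screen_h - hd                # height of the second area (Hf)

    def fit(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
        # Nearest-neighbour resize: a stand-in for the cropping and
        # scaling performed by the display control unit.
        ys = np.arange(out_h) * img.shape[0] // out_h
        xs = np.arange(out_w) * img.shape[1] // out_w
        return img[ys][:, xs]

    first = fit(birdseye, hd, screen_w)
    second = fit(topview, hf, screen_w)
    return np.vstack([first, second])  # one composite image
```

  • For a 480x640 screen and dif = 0.5, the result is a 480x640 image whose upper half comes from the bird's-eye view and whose lower half comes from the top view.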
  • FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image regarding the traveling direction of the automobile 100.
  • An image 501 is an example of a bird's-eye view image regarding the traveling direction of the automobile 100.
  • FIG. 5 shows a scene in which the automobile 100 is about to be parked in a parking space in reverse.
  • The display control unit 304 identifies the driving scene of the automobile 100 based on the distance to objects around the automobile 100 obtained from a depth sensor or the like.
  • The display control unit 304 changes the proportion of the first area and the proportion of the second area in the synthesized image based on the distance between the automobile 100 and an object existing in the traveling direction.
  • The first area and the second area will be described later with reference to FIG. 7 and the like.
  • An example in which the display control unit 304 changes the proportion of the first area and the proportion of the second area in the composite image in stages based on the distance between the automobile 100 and an object existing in the traveling direction will be described.
  • An example of an object existing in the traveling direction is an obstacle such as a wheel stop.
  • The display control unit 304 may recognize and extract an object present in the traveling direction by image processing from the bird's-eye view image related to the traveling direction, or may extract an object around the automobile 100 recognized by the object recognition unit 302.
  • The display control unit 304 may use not only two stages but also three stages (for example, 5 m or more; less than 5 m and 3 m or more; less than 3 m), four stages (for example, 5 m or more; less than 5 m and 4 m or more; less than 4 m and 3 m or more; less than 3 m), or more.
  • In the following, a threshold of 3 m is described; 3 m is merely an example.
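  • The staged switching described above can be sketched as a simple threshold function. The function name and return convention are assumptions; the 3 m threshold and the 1:1 and 1:2 splits follow the example in the text and Figs. 7 and 13.

```python
def area_ratio_staged(distance_m: float) -> tuple[int, int]:
    """Return the (Hd : Hf) split between the first area (bird's-eye
    view, height Hd) and the second area (top view, height Hf),
    chosen in stages from the distance to the object in the
    traveling direction."""
    if distance_m >= 3.0:
        return (1, 1)   # 3 m or more: equal split (cf. Fig. 7)
    return (1, 2)       # less than 3 m: shrink the bird's-eye area (cf. Fig. 13)
```

  • A three- or four-stage variant simply adds more thresholds (5 m, 4 m, ...) and intermediate ratios.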
  • Let the vertical and horizontal lengths of the screen presented on the display 120 by the display control unit 304 be H and V, let the vertical and horizontal lengths of the top view image that has undergone image processing such as cropping for generating the composite image be Hf and Vf, and let the vertical and horizontal lengths of the bird's-eye view image regarding the traveling direction of the automobile 100 that has undergone similar image processing be Hd and Vd.
  • When the distance L between the automobile 100 and the object behind it is 3 m or more, the display control unit 304 controls Hf and Hd to be one to one. More specifically, the display control unit 304 cuts out the portion of the frame 502 from the image 501 in FIG. 5 to obtain the image 503.
  • That is, since distant areas are important when the distance L is 3 m or more, the display control unit 304 presents the portion of the bird's-eye view image regarding the traveling direction that is far from the automobile 100 and deletes the portion near the automobile 100.
  • The display control unit 304 always keeps the aspect ratio of the cropped image constant.
  • The display control unit 304 also cuts out the portion of the frame 602 from the top view image 601 in FIG. 6 to obtain the image 604.
  • FIG. 6 is a diagram (part 1) showing an example of a top view image.
  • The display control unit 304 combines the image 503 and the image 604 to generate the composite image 701 shown in FIG. 7.
  • FIG. 7 is a diagram (part 1) showing an example of a synthesized image.
  • In FIG. 7, Hd and Hf are one to one.
  • 702 is the first area.
  • 703 is the second area. As shown in FIG. 7, the first area is placed at the top of the composite image 701, and the second area is placed at the bottom of the composite image 701.
  • The first area is an area in which the bird's-eye view image regarding the traveling direction of the automobile 100 is displayed.
  • The second area is an area in which the top view image including the automobile 100 and its surroundings is displayed. In the synthesized image, the first area and the second area match the flow of surrounding information with respect to the movement of the automobile. Therefore, the operator of the automobile 100 can intuitively understand the surrounding conditions and the like.
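  • The cropping of the frame 502 from the image 501 (and, analogously, of the frame 602 from the image 601) can be sketched as follows; applying the same scale factor to both axes keeps the aspect ratio of the crop constant, as stated above. The function and its `keep` parameter are illustrative assumptions.

```python
import numpy as np

def crop_far(birdseye: np.ndarray, keep: float = 0.5) -> np.ndarray:
    """Cut out the portion of the bird's-eye view image far from the
    car (the upper rows, cf. frame 502 in Fig. 5), deleting the near
    portion, while keeping the aspect ratio of the crop constant."""
    h, w = birdseye.shape[:2]
    ch = int(h * keep)              # rows kept: the far (upper) part
    cw = int(w * keep)              # same factor on the width keeps w/h
    x0 = (w - cw) // 2              # centre the crop horizontally
    return birdseye[:ch, x0:x0 + cw]
```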
  • FIG. 8 is a diagram (part 1) for explaining adjustment of the aspect ratio.
  • 704 represents the automobile 100.
  • The lines in the first area 702 and the second area 703 are added for explanation; each represents a line on the road surface.
  • The display control unit 304 performs image processing to compress the portion corresponding to the automobile 100 in the top view image so that the left and right sides of the automobile 100 can be seen, as shown in FIG. 9.
  • FIG. 9 is a diagram (part 2) for explaining adjustment of the aspect ratio.
  • The display control unit 304 performs projective transformation on the top view image so that a rectangle becomes a trapezoid, as shown in FIG. 10, and then synthesizes the top view image and the bird's-eye view image.
  • FIG. 10 is a diagram (part 3) for explaining adjustment of the aspect ratio. By performing image processing in this manner, the lines of the top view image and the bird's-eye view image can be connected as if they were straight lines.
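  • The rectangle-to-trapezoid projective transformation can be expressed as a 3x3 homography. The sketch below solves for the homography from the four corner correspondences with a direct linear transform; the image size and corner coordinates are illustrative assumptions.

```python
import numpy as np

def homography(src, dst):
    """Solve H (3x3, with H[2,2] = 1) such that H maps each src point
    to the corresponding dst point under projective division."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, pt):
    """Apply the homography to one point."""
    x, y, s = H @ np.array([pt[0], pt[1], 1.0])
    return x / s, y / s

# Corners of the rectangular top view image ...
w, h = 640, 360
rect = [(0, 0), (w, 0), (w, h), (0, h)]
# ... mapped to a trapezoid whose lower edge is narrower, so the upper
# edge stays aligned with the bird's-eye view image above it.
trap = [(0, 0), (w, 0), (w - 120, h), (120, h)]
H = homography(rect, trap)
```

  • Warping the whole image with this H (for example, with an inverse-mapping loop or a library warp routine) produces the trapezoidal top view used for compositing.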
  • When the distance L between the automobile 100 and the object behind it is less than 3 m, the display control unit 304 controls Hd and Hf to be 1:2. More specifically, the display control unit 304 cuts out the portion of the frame 802 from the image 801 in FIG. 11 to obtain the image 803. That is, since distant areas are not important in the bird's-eye view image regarding the traveling direction when the distance L is less than 3 m, the display control unit 304 cuts out the image near the automobile 100. FIG. 11 is a diagram (part 2) showing an example of a bird's-eye view image regarding the traveling direction of the automobile 100. The display control unit 304 then changes the scale of the aspect ratio of the cropped image 803 to obtain the image 804.
  • FIG. 12 is a diagram (part 2) showing an example of a top view image.
  • The display control unit 304 changes the scale of the horizontal ratio of the image 904 to obtain the image 905.
  • The display control unit 304 combines the image 804 and the image 905 to generate the composite image 1001 in FIG. 13.
  • FIG. 13 is a diagram (part 2) showing an example of a synthesized image. In FIG. 13, Hd and Hf are 1:2.
  • The display control unit 304 may also continuously change the proportion of the first area and the proportion of the second area in the synthesized image. More specifically, the display control unit 304 performs control such that the closer the automobile 100 is to the object, the smaller the proportion of the first area in the composite image becomes relative to the proportion of the second area.
  • The display control unit 304 changes the area of the image displayed in each area using the area change rate between the image displayed in the first area and the image displayed in the second area and the displacement curves shown in FIGS. 14 and 15, which will be described later.
  • The display control unit 304 obtains the display area division (dif) from the displacement curve using the obtained per.
  • Two displacement curves are provided, as shown in FIGS. 14 and 15: one used when the automobile 100 moves away from the object (curveDec) and one used when it approaches the object (curveInc). curveInc is always located on the right side of curveDec.
  • FIG. 14 is a diagram showing an example of the displacement curve when per decreases as the vehicle moves away from an object.
  • FIG. 15 is a diagram showing an example of the displacement curve when per increases as the vehicle approaches an object. In both FIGS. 14 and 15, the horizontal axis is per and the vertical axis is dif.
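  • The per-to-dif lookup with two curves can be sketched as follows. The smoothstep shape and the horizontal offset between curveInc and curveDec are assumptions; what matters is that curveInc lies to the right of curveDec, which gives the switching hysteresis.

```python
def dif_from_per(per: float, approaching: bool) -> float:
    """Displacement-curve lookup: map per (horizontal axis of Figs. 14
    and 15) to the display area division dif (vertical axis).

    curveDec is used while moving away from the object, curveInc while
    approaching it; curveInc is shifted to the right of curveDec.
    """
    shift = 0.1 if approaching else 0.0     # curveInc sits to the right
    t = min(max((per - shift) / (1.0 - shift), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)          # smoothstep from 0 to 1
```

  • Because the approach curve is shifted to the right, dif does not chatter when per oscillates around a switching point.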
  • The display control unit 304 changes the depression angle of the viewpoint of the top view image based on the distance between the automobile 100 and an object existing in the traveling direction. More specifically, the display control unit 304 performs control such that the closer the automobile 100 is to the object, the greater the depression angle of the viewpoint of the top view image. At this time, the display control unit 304 uses the displacement curves described above to perform this control.
  • The display control unit 304 may change the rate of change of the depression angle depending on whether the automobile 100 is moving closer to the object or moving away from it.
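  • The distance-dependent depression angle can be sketched as a clamped interpolation. All numeric ranges here (0.5 m to 5 m, 30 to 80 degrees) are illustrative assumptions; the text only fixes the direction of the relationship: closer means a larger depression angle.

```python
def depression_angle(distance_m: float,
                     near: float = 0.5, far: float = 5.0,
                     min_deg: float = 30.0, max_deg: float = 80.0) -> float:
    """Depression angle of the top view viewpoint: the closer the car
    is to the object, the larger the angle (more nearly straight down)."""
    if distance_m <= near:
        return max_deg
    if distance_m >= far:
        return min_deg
    t = (far - distance_m) / (far - near)   # 0 at far, 1 at near
    return min_deg + t * (max_deg - min_deg)
```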
  • The display control unit 304 may change the display ratio described above when the automobile 100 is moving backward, and may change the viewpoint when the automobile 100 is moving forward.
  • The information processing device 110 presents the composite image on one screen and changes the viewpoint and/or the presentation area of the perspective transformation according to the surrounding conditions, whereby the bird's-eye view image of the traveling direction and the top view image are represented continuously.
  • As a result, intuitive surrounding information can be presented to the operator.
  • The information processing device 110 considers the distance to objects such as obstacles around the automobile 100 and changes the viewpoint and/or the presentation area, thereby realizing a display method that makes obstacles more recognizable.
  • The operator can more intuitively understand the forward and backward movement of the vehicle when backing up and can drive the automobile 100 safely.
  • In the embodiment above, the automobile 100 has been described as an example of a moving object.
  • The moving object may instead be, for example, a so-called drone such as an unmanned aerial vehicle.
  • The information processing device 110 may be included in the drone or in the controller of the operator who operates the drone.
  • When the moving object is a drone, a display unit visible to the operator is provided in the controller.
  • Examples of the object include a mark indicating the departure and arrival location of the drone. According to this modification, the operator can likewise easily understand the moving direction of the moving object, and the peripheral information of the moving object can be presented in an easy-to-understand manner.
  • FIG. 16 is a diagram showing an example of a top view image and a bird's-eye view image when the vehicle is in reverse.
  • An arrow 1601 indicates the traveling direction of the automobile 100.
  • An image 1602 is a top view image.
  • An image 1603 is a bird's-eye view image.
  • FIG. 17 is a diagram showing an example of a synthesized image.
  • A synthesized image 1701 is an image created by synthesizing the top view image 1602 and the bird's-eye view image 1603 shown in FIG. 16.
  • FIG. 18 is a diagram illustrating an example of image processing for a top view image and a bird's-eye view image.
  • The display control unit 304 turns the top view image upside down so that the vehicle faces downward.
  • The display control unit 304 performs rectangular clipping on the vertically inverted top view image so that it has a predetermined area Sa. That is, the display control unit 304 does not draw the portion of the vertically inverted top view image outside the predetermined area Sa.
  • The display control unit 304 projectively transforms the rectangle-clipped top view image so that the rectangle becomes a trapezoid.
  • The display control unit 304 performs rectangular clipping so that the bird's-eye view image has a predetermined area Sb. That is, the display control unit 304 does not draw the portion of the bird's-eye view image outside the predetermined area Sb.
  • The display control unit 304 changes the ratio of Sa to Sb described above according to the distance between the automobile 100 and the obstacle.
  • The display control unit 304 performs projective transformation so as to correct the distortion of the rectangle-clipped bird's-eye view image.
  • The display control unit 304 synthesizes the projectively transformed top view image and the projectively transformed bird's-eye view image to generate one synthesized image. Note that while the automobile 100 is moving, the processing shown in FIG. 18 is repeatedly executed.
  • FIG. 19 is a diagram showing an example in which projective transformation is performed so that a rectangle becomes an inverted trapezoid in a bird's-eye view image.
  • Projective transformation is performed so that the rectangle of the bird's-eye view image becomes an inverted trapezoid, and one image is generated by synthesizing the projectively transformed top view image and the projectively transformed bird's-eye view image.
  • When the top view image and the bird's-eye view image are transformed as shown in FIG. 19, straight lines can be connected smoothly across the boundary between the two images.
  • FIG. 20 is a diagram illustrating an example of image processing for the top view image and the bird's-eye view image in Modification 3.
  • In S2001, the display control unit 304 flips the top view image vertically so that the vehicle faces downward.
  • Next, the display control unit 304 sets an ROI (Region Of Interest) for the top view image.
  • Specifically, the display control unit 304 sets the ROI so that the upper side of the top view image and the lower side of the bird's-eye view image match in three-dimensional position.
  • The display control unit 304 then performs projective transformation so that the rectangle of the top view image becomes a trapezoid.
  • The display control unit 304 also sets an ROI for the bird's-eye view image.
  • Specifically, the display control unit 304 sets the ROI so that the lower side of the bird's-eye view image and the upper side of the top view image match in three-dimensional position. Note that when the automobile 100 approaches an obstacle, the display control unit 304 sets the ROI so that the nearby area is enlarged in the vertical direction and the display area ahead is enlarged in the horizontal direction.
  • The display control unit 304 performs projective transformation so that the rectangle of the bird's-eye view image becomes an inverted trapezoid.
  • The display control unit 304 then combines the projectively transformed top view image and the projectively transformed bird's-eye view image.
  • In S2007, the display control unit 304 enlarges or reduces the combined image to fit the display 120, and clips the combined image. Note that the processing shown in FIG. 20 is repeatedly executed while the automobile 100 is moving.
  • FIG. 21 is a diagram showing an example in which distance sensors are provided in the automobile 100. FIG. 21 shows an example in which distance sensors are provided on the rear left and right sides of the automobile 100; however, the number of distance sensors is not limited to two. The distance sensors may also measure the distance to an object located to the side of or in front of the automobile 100.
  • As shown in FIG. 21, for example, when an object exists to the right rear of the automobile 100, the display control unit 304 of Modification 4 performs projective transformation on the left portion of the bird's-eye view image so that its rectangle becomes an inverted trapezoid. In this case, the right portion of the bird's-eye view image is not subjected to projective transformation that makes the rectangle an inverted trapezoid.
  • Conversely, when an object exists to the left rear of the automobile 100, the display control unit 304 performs projective transformation on the right portion of the bird's-eye view image so that its rectangle becomes an inverted trapezoid, and the left portion of the bird's-eye view image is not subjected to such projective transformation. That is, the display control unit 304 performs control so that the display form of the bird's-eye view image changes depending on whether the object is to the left or the right of the moving body.
  • FIG. 22 is a diagram illustrating an example of a display mode of a bird's-eye view image.
  • In FIG. 22, projective transformation is performed on the left side of the bird's-eye view image so that the rectangle becomes an inverted trapezoid, and no such projective transformation is performed on the right side of the bird's-eye view image.
  • (1) An information processing device having a control unit, wherein the control unit generates one image based on a bird's-eye view image related to a traveling direction of a moving body and a top view image including the moving body and the periphery of the moving body, the one image including a first area for displaying the bird's-eye view image related to the traveling direction and a second area for displaying the top view image, and the control unit performs control so that the one image is displayed on a display unit that is visible to an operator of the moving body.
  • (2) The information processing device, wherein the first area is arranged in an upper portion of the one image, and the second area is arranged in a lower portion of the one image.
  • (3) The information processing device, wherein the control unit changes a ratio of the first area and a ratio of the second area in the one image based on a distance between the moving body and an object existing in the traveling direction.
  • (4) The information processing device, wherein the control unit continuously changes, based on the distance, the ratio of the first area and the ratio of the second area in the one image.
  • (5) The information processing device, wherein the control unit changes, based on the distance, the ratio of the first area and the ratio of the second area in the one image step by step.
  • (6) The information processing device, wherein the control unit performs control so that the ratio of the first area in the one image becomes smaller than the ratio of the second area as the moving body approaches the object.
  • (7) The information processing device, wherein, when an object exists in the traveling direction, the control unit performs projective transformation on the top view image so that a rectangle becomes a trapezoid, and generates the one image by synthesizing the transformed top view image with the bird's-eye view image.
  • (8) The information processing device, wherein, when an object exists in the traveling direction, the control unit performs projective transformation on the top view image so that a rectangle becomes a trapezoid, performs projective transformation on the bird's-eye view image so that a rectangle becomes an inverted trapezoid, and generates the one image by synthesizing the projectively transformed top view image and the projectively transformed bird's-eye view image.
  • (9) The information processing device, wherein the control unit performs control so that a display form of the bird's-eye view image is changed depending on whether the object is to the left or the right of the moving body.
  • (10) The information processing device, wherein the control unit changes a depression angle related to a viewpoint of the bird's-eye view image based on the distance between the moving body and the object.
  • (11) The information processing device, wherein the control unit performs control so that the depression angle related to the viewpoint of the bird's-eye view image increases as the moving body approaches the object.
  • (12) The information processing device, wherein the control unit changes a rate of change of the depression angle related to the viewpoint of the bird's-eye view image between when the moving body approaches the object and when the moving body moves away from the object.
  • (13) The information processing device, wherein the moving body is an automobile, and the object is related to a parking position of the automobile.
  • (14) A moving body comprising the information processing device according to any one of (1) to (13) above.
  • (15) An information processing method executed by an information processing device, the method comprising: generating one image based on a bird's-eye view image related to a traveling direction of a moving body and a top view image including the moving body and the periphery of the moving body, the one image including a first area for displaying the bird's-eye view image related to the traveling direction and a second area for displaying the top view image; and controlling the one image to be displayed on a display unit of the moving body.
  • In the above, the back view has mainly been explained as an example, but the above-described effects can also be obtained by executing the same processing for the front view.
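The vertical flip, rectangular clipping, and rectangle-to-trapezoid projective transformation described for FIGS. 18 and 19 can be sketched with numpy alone. The corner coordinates below are made-up illustrations (the patent does not give numeric values), and a production system would use GPU warping or an image library rather than this direct linear solve.

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 projective transform H that maps four src corner
    points to four dst corner points (direct linear transform)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)  # null vector of A holds the 8 DOF of H
    return vt[-1].reshape(3, 3)

def apply_h(H, pt):
    """Apply H to a 2D point and de-homogenize."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Vertical flip and rectangular clipping of the top view image (FIG. 18):
top = np.arange(20.0).reshape(4, 5)
flipped = np.flipud(top)   # invert vertically so the vehicle faces downward
clipped = flipped[:3, :]   # keep only the predetermined area Sa

# Map the clipped rectangle's corners to a trapezoid with a narrowed top edge;
# an inverted trapezoid (FIG. 19) would instead narrow the bottom edge.
rect = [(0, 0), (100, 0), (100, 50), (0, 50)]
trap = [(20, 0), (80, 0), (100, 50), (0, 50)]
H = homography_from_points(rect, trap)
```

Applying `H` to every pixel of the clipped image yields the trapezoidal region that is then composited with the other view.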

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

According to an aspect of the present invention, an information processing device is provided. This information processing device comprises a control unit. The control unit generates a single image on the basis of both a bird's eye view image related to the traveling direction of a movable body and a top view image including both the movable body and the periphery of the movable body. The single image includes a first area for displaying the bird's eye view image related to the traveling direction and a second area for displaying the top view image. The control unit performs a control such that the single image is displayed on a display unit that can be seen by the operator of the movable body.

Description

Information processing device, mobile object, information processing method, and program

The present invention relates to an information processing device, a mobile object, an information processing method, and a program.
Display systems for mobile objects exist.
Patent Document 1 discloses a technique aimed at providing a display that makes it easy for an occupant to understand the traveling direction of a moving body and to concentrate on monitoring the surroundings.
Patent Document 1: JP 2021-094939 A
An object of the present invention is to present the peripheral information of a moving body in an easier-to-understand manner.
According to one aspect of the present invention, an information processing device is provided. This information processing device has a control unit. The control unit generates one image based on a bird's-eye view image related to the traveling direction of a moving body and a top view image including the moving body and the periphery of the moving body. The one image includes a first area displaying the bird's-eye view image related to the traveling direction and a second area displaying the top view image. The control unit performs control so that the one image is displayed on a display unit that is visible to the operator of the moving body.
FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000.
FIG. 2 is a diagram showing an example of the hardware configuration of an information processing device 110.
FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110.
FIG. 4 is an activity diagram showing an example of information processing by the information processing device 110.
FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image.
FIG. 6 is a diagram (part 1) showing an example of a top view image.
FIG. 7 is a diagram (part 1) showing an example of a synthesized image.
FIG. 8 is a diagram (part 1) explaining adjustment of the aspect ratio.
FIG. 9 is a diagram (part 2) explaining adjustment of the aspect ratio.
FIG. 10 is a diagram (part 3) explaining adjustment of the aspect ratio.
FIG. 11 is a diagram (part 2) showing an example of a bird's-eye view image.
FIG. 12 is a diagram (part 2) showing an example of a top view image.
FIG. 13 is a diagram (part 2) showing an example of a synthesized image.
FIG. 14 is a diagram showing an example of a displacement curve when per decreases as the vehicle moves away from an object.
FIG. 15 is a diagram showing an example of a displacement curve when per increases as the vehicle approaches an object.
FIG. 16 is a diagram showing an example of a top view image and a bird's-eye view image when the vehicle is in reverse.
FIG. 17 is a diagram showing an example of a synthesized image.
FIG. 18 is a diagram showing an example of image processing for a top view image and a bird's-eye view image.
FIG. 19 is a diagram showing an example in which projective transformation is performed so that a rectangle becomes an inverted trapezoid in a bird's-eye view image.
FIG. 20 is a diagram showing an example of image processing for a top view image and a bird's-eye view image in Modification 3.
FIG. 21 is a diagram showing an example in which distance sensors are provided in an automobile 100.
FIG. 22 is a diagram showing an example of a display mode of a bird's-eye view image.
Embodiments of the present invention will be described below with reference to the drawings. The various features shown in the embodiments below can be combined with each other.
In this specification, the term "unit" may include, for example, a combination of hardware resources implemented by circuits in a broad sense and software information processing that can be concretely realized by these hardware resources. In addition, various kinds of information are handled in the present embodiment; this information is represented by, for example, the physical values of signal values representing voltage and current, the high and low of signal values as binary bit aggregates composed of 0s and 1s, or quantum superposition (so-called quantum bits), and communication and computation can be executed on circuits in a broad sense.
A circuit in a broad sense is a circuit realized by at least an appropriate combination of circuits, circuitry, processors, memories, and the like; that is, it includes application-specific integrated circuits (ASICs) and programmable logic devices (for example, simple programmable logic devices (SPLDs), complex programmable logic devices (CPLDs), and field-programmable gate arrays (FPGAs)).
<Embodiment 1>
1. System Configuration
FIG. 1 is a diagram showing an example of the system configuration of an information processing system 1000. As shown in FIG. 1, the information processing system 1000 includes an automobile 100. An automobile is an example of a mobile object. The automobile 100 includes an information processing device 110, a display 120, and a plurality of cameras 130. The information processing device 110 is a device that executes the processing of this embodiment. In the present embodiment, the information processing device 110 is described as being included in the automobile 100, but it need not be included in the automobile 100 as long as it can communicate with the automobile 100 or the like. The display 120 is a display device that displays a 3D synthesized image and the like, described later, under the control of the information processing device 110. The cameras 130 capture images of the surroundings of the automobile 100. More specifically, each camera 130 is, for example, an RGB camera that records images with color information. A camera 130-1 is provided at the rear of the automobile 100, and a camera 130-2 is provided at the front of the automobile 100. Although omitted from FIG. 1, in addition to the cameras 130-1 and 130-2, cameras 130 are also provided on the left and right sides of the automobile 100. Hereinafter, these cameras are collectively referred to simply as the cameras 130. In addition, although not shown in FIG. 1 for simplicity, the automobile 100 has a plurality of depth sensors. The depth sensors measure the shapes of objects around the automobile 100 and the distances from the automobile 100 to the objects using laser light or the like. However, the depth sensors are not limited to laser light; they may measure distances using ultrasonic waves, or using a camera with a depth sensor.
2. Hardware Configuration
FIG. 2 is a diagram showing an example of the hardware configuration of the information processing device 110. The information processing device 110 includes, as its hardware configuration, a control unit 201, a storage unit 202, and a communication unit 203. The control unit 201 is a CPU (Central Processing Unit) or the like, and controls the entire information processing device 110. The storage unit 202 is any one of an HDD (Hard Disk Drive), a ROM (Read Only Memory), a RAM (Random Access Memory), an SSD (Solid State Drive), or the like, or any combination thereof, and stores programs and the data used when the control unit 201 executes processing based on the programs. When the control unit 201 executes processing based on a program stored in the storage unit 202, the functional configuration of the information processing device 110 shown in FIG. 3 and the processing of the activity diagram shown in FIG. 4, both described later, are realized. In this embodiment, the data used when the control unit 201 executes processing based on a program is described as being stored in the storage unit 202, but the data may be stored in the storage unit or the like of another device with which the information processing device 110 can communicate. The communication unit 203 is a NIC (Network Interface Card) or the like; it connects the information processing device 110 to a network and manages communication with other devices (for example, the display 120, the cameras 130, etc.).
3. Functional Configuration
FIG. 3 is a diagram showing an example of the functional configuration of the information processing device 110. The information processing device 110 includes, as its functional configuration, a peripheral information receiving unit 301, an object recognition unit 302, a behavior information receiving unit 303, and a display control unit 304.
(Peripheral information receiving unit 301)
The peripheral information receiving unit 301 receives, as peripheral information, information on objects around the automobile 100, information on colors around the automobile 100, and the like from the cameras 130. The peripheral information receiving unit 301 also receives, as peripheral information, the shapes of objects around the automobile 100, the distances from the automobile 100 to the objects, and the like from the plurality of depth sensors.
(Object recognition unit 302)
The object recognition unit 302 recognizes objects around the automobile 100 based on the peripheral information received by the peripheral information receiving unit 301. Surrounding objects include, for example, buildings, roads, other automobiles, and parking lots.
(Behavior information receiving unit 303)
The behavior information receiving unit 303 receives behavior information of the moving body. More specifically, the behavior information receiving unit 303 acquires, as behavior information of the automobile 100, information detected by a wheel speed sensor, a steering angle sensor, an inertial measurement unit (IMU), and the like included in the automobile 100. The inertial measurement unit detects three-dimensional inertial motion (translational and rotational motion along three orthogonal axes): translational motion is detected with an acceleration sensor [m/s²] and rotational motion with an angular velocity (gyro) sensor [deg/sec]. The behavior information receiving unit 303 may also receive information on the behavior of the automobile 100 from SLAM (Simultaneous Localization and Mapping) processing in the automobile 100, or from LiDAR, a stereo camera, a gyro sensor, or the like included in the automobile 100.
Here, the behavior information includes, for example, information on the position and the orientation of the automobile 100.
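As a rough illustration of how wheel-speed and gyro readings of this kind can yield position and orientation, the sketch below integrates a simple planar motion model. This is a generic dead-reckoning step under our own assumptions; the patent does not specify the estimator actually used.

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_dps, dt):
    """One integration step of a planar motion model driven by a wheel-speed
    reading (m/s) and a gyro yaw-rate reading (deg/s)."""
    heading_rad += math.radians(yaw_rate_dps) * dt  # rotational motion (gyro)
    x += speed_mps * math.cos(heading_rad) * dt     # translational motion
    y += speed_mps * math.sin(heading_rad) * dt
    return x, y, heading_rad
```

For example, one second at 1 m/s with zero yaw rate advances the pose by 1 m along the current heading.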
(Display control unit 304)
The display control unit 304 converts surrounding objects into 3D models based on the information on objects recognized by the object recognition unit 302, information from SLAM, and the like. The display control unit 304 also generates a 3D model of the automobile 100 based on information from the cameras 130, information from SLAM, the behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like. Furthermore, the display control unit 304 generates a top view image including the automobile 100 and its surroundings based on the 3D models of the surrounding objects, the 3D model of the automobile 100, information from the cameras 130, and the like. Here, the top view image is an image whose viewpoint has been perspective-transformed so that the road surface is seen from above, and it includes the entire automobile 100 or at least half of it. In the present embodiment, an image of the ground taken from directly above is used as the ground-surface image, and an image obtained by synthesizing and morphing it with an image of the automobile 100 and the like is used as the top view image. The display control unit 304 generates one synthesized image based on the bird's-eye view image related to the traveling direction of the automobile 100 and the top view image including the automobile 100 and its surroundings. Here, the bird's-eye view image related to the traveling direction of the automobile 100 is, for example, a bird's-eye view image of the area in front of the automobile 100 when the automobile 100 is moving forward, and a bird's-eye view image of the area behind the automobile 100 when the automobile 100 is moving backward. The forward bird's-eye view image is obtained by morphing an image captured by the camera 130-2 based on the peripheral information of the automobile 100 acquired by the peripheral information receiving unit 301. Similarly, the rear bird's-eye view image is obtained by morphing an image captured by the camera 130-1 based on the peripheral information of the automobile 100 acquired by the peripheral information receiving unit 301. The synthesized image includes a first area displaying the bird's-eye view image related to the traveling direction and a second area displaying the top view image. Examples of the synthesized image are shown in FIG. 7 and elsewhere, described later. The display control unit 304 performs control so that the generated synthesized image is displayed on the display 120 of the automobile 100 or the like. The display 120 of the automobile 100 is an example of a display unit visible to the operator of the automobile 100.
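As a rough illustration of how a first area and a second area can share one output image, the sketch below stacks a bird's-eye view array above a top view array with a given height ratio. The function names and the nearest-neighbour resampling are our own simplifications, not the patent's implementation.

```python
import numpy as np

def compose(birdseye, topview, per, out_h, out_w):
    """Stack the bird's-eye view (first area, upper part) above the top view
    (second area, lower part). `per` is the fraction of the output height
    given to the first area."""
    h1 = max(1, int(round(out_h * per)))
    h2 = out_h - h1

    def resize(img, h, w):
        # Nearest-neighbour resampling keeps the sketch dependency-free.
        rows = np.arange(h) * img.shape[0] // h
        cols = np.arange(w) * img.shape[1] // w
        return img[rows][:, cols]

    return np.vstack([resize(birdseye, h1, out_w), resize(topview, h2, out_w)])
```

Changing `per` per frame reproduces the later-described behaviour of shifting screen area between the two views as the distance to an obstacle changes.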
Note that part or all of the functional configuration shown in FIG. 3 may be implemented in the information processing device 110 or the automobile 100 as a hardware configuration. Also, part or all of the functional configuration of the information processing device 110 may be implemented in the automobile 100.
4. Information Processing
FIG. 4 is an activity diagram showing an example of the information processing of the information processing device 110.
In A401, the peripheral information receiving unit 301 receives, as peripheral information, information on objects around the automobile 100, information on colors around the automobile 100, and the like from the cameras 130. The peripheral information receiving unit 301 also receives, as peripheral information, the shapes of objects around the automobile 100, the distances from the automobile 100 to the objects, and the like from the plurality of depth sensors.
In A402, the object recognition unit 302 recognizes objects around the automobile 100 based on the peripheral information received by the peripheral information receiving unit 301.
In A403, the behavior information receiving unit 303 receives information on the behavior of the automobile 100.
In A404, the display control unit 304 converts surrounding objects into 3D models based on the information on the objects recognized by the object recognition unit 302, information from SLAM, and the like. The display control unit 304 also generates a 3D model of the automobile 100 based on information from the cameras 130, information from SLAM, the behavior information of the automobile 100 received by the behavior information receiving unit 303, and the like. Furthermore, the display control unit 304 generates a top view image including the automobile 100 and its surroundings based on the 3D models of the surrounding objects, the 3D model of the automobile 100, information from the cameras 130, and the like, and generates a bird's-eye view image related to the traveling direction of the automobile 100 based on information from the cameras 130 and the like.
In A405, the display control unit 304 generates one synthesized image based on the bird's-eye view image related to the traveling direction of the automobile 100 and the top view image including the automobile 100 and its surroundings.
In A406, the display control unit 304 performs control so that the generated synthesized image is displayed on the display 120 of the automobile 100 or the like.
FIG. 5 is a diagram (part 1) showing an example of a bird's-eye view image related to the traveling direction of the automobile 100. An image 501 is an example of such a bird's-eye view image. FIG. 5 shows a scene in which the automobile 100 is about to back into a parking space. The display control unit 304 identifies the driving scene of the automobile 100 based on the distances to objects around the automobile 100 obtained from the depth sensors and the like. For simplicity, the present embodiment is described using the example of the bird's-eye view image related to the camera 130-1 when the automobile 100 is moving backward (a back-view bird's-eye view image); however, the same applies to the bird's-eye view image related to the camera 130-2 when the automobile 100 is moving forward (a front-view bird's-eye view image).
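The A401–A406 cycle described above can be outlined as follows. The recognizer and the two render functions are deliberately trivial stand-ins (real inputs come from the cameras 130 and the depth sensors), so only the control flow reflects the activity diagram.

```python
import numpy as np

def recognize_objects(peripheral_info):
    """A402: keep only entries tagged as obstacles (stub recognizer)."""
    return [o for o in peripheral_info if o.get("kind") == "obstacle"]

def render_top_view(objects, behavior):
    """A404: top view image stub (blank canvas in this sketch)."""
    return np.zeros((60, 80), dtype=np.uint8)

def render_birds_eye(peripheral_info):
    """A404: bird's-eye view image stub (blank canvas in this sketch)."""
    return np.zeros((40, 80), dtype=np.uint8)

def activity_cycle(peripheral_info, behavior):
    """One pass of A401-A406: synthesize one image with the bird's-eye view
    (first area) above the top view (second area). A406 would then send the
    result to the display 120."""
    objects = recognize_objects(peripheral_info)   # A402
    top = render_top_view(objects, behavior)       # A404
    bird = render_birds_eye(peripheral_info)       # A404
    return np.vstack([bird, top])                  # A405
```

The cycle would be re-run repeatedly while the automobile is moving, as the later figures note.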
(Changing the display ratio - stepwise)
The display control unit 304 changes the proportion of the first area and the proportion of the second area in the composite image based on the distance between the automobile 100 and an object existing in the traveling direction. The first area and the second area will be described later with reference to FIG. 7 and the like.
First, an example is described in which the display control unit 304 changes the proportion of the first area and the proportion of the second area in the composite image stepwise based on the distance between the automobile 100 and an object existing in the traveling direction. An example of an object existing in the traveling direction is an obstacle such as a wheel stop. However, this does not limit the present embodiment; other examples of objects existing in the traveling direction include a wall of a parking lot, other cars parked in the parking lot, and a mark indicating a parking space in the parking lot. Here, the display control unit 304 may recognize and extract an object existing in the traveling direction by image processing from the bird's-eye view image regarding the traveling direction, or may recognize it from the objects around the automobile 100 recognized by the object recognition unit 302. A wheel stop, a mark indicating a parking space, and the like are examples of objects relating to the parking position of the automobile.
For simplicity of explanation, display control is described below using the example of two cases: the case where the distance L between the automobile 100 and an object behind the automobile 100 is equal to or greater than a predetermined distance, and the case where it is less than the predetermined distance. However, this does not limit the present embodiment; instead of two steps, the display control unit 304 may change the proportions of the displayed areas in a plurality of steps of two or more, for example three steps (5 m or more; less than 5 m but 3 m or more; less than 3 m) or four steps (5 m or more; less than 5 m but 4 m or more; less than 4 m but 3 m or more; less than 3 m). In the following description, 3 m is used as an example of the predetermined distance; note that 3 m is merely an example.
Let H and V denote the vertical and horizontal dimensions of the screen that the display control unit 304 presents on the display 120, let Hf and Vf denote the dimensions of the top view image after image processing such as cropping for generating the composite image, and let Hd and Vd denote the dimensions of the bird's-eye view image regarding the traveling direction of the automobile 100 after image processing such as cropping for generating the composite image.
When the distance L between the automobile 100 and the object behind the automobile 100 is 3 m or more, the display control unit 304 performs control such that Hf and Hd are 1:1.
More specifically, the display control unit 304 cuts out the portion of a frame 502 from the image 501 in FIG. 5 to obtain an image 503. That is, when the distance L between the automobile 100 and the object behind the automobile 100 is 3 m or more, the display control unit 304 deletes the portion of the bird's-eye view image near the automobile 100 in order to present the area far from the automobile 100 in the bird's-eye view image regarding the traveling direction. The display control unit 304 always keeps the vertical ratio of the cropped image constant.
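As an illustrative sketch, the two-step rule above can be written as follows. This is a minimal sketch, assuming Python; the function name, the reading of the 1:1 and 1:2 ratios as a vertical split of the screen between the two areas (following the descriptions of FIGS. 7 and 13), and the use of 3 m as the threshold are assumptions for illustration, not the definitive implementation.

```python
def region_heights(screen_height: int, distance_m: float) -> tuple:
    """Split the screen height between the bird's-eye view area (first
    area, upper part) and the top view area (second area, lower part).

    Sketch of the two-step rule: an equal 1:1 split when the object is
    3 m or more away, and a 1:2 split (bird's-eye : top view) when it
    is closer, so the bird's-eye area shrinks near the object.
    """
    THRESHOLD_M = 3.0  # the example predetermined distance from the text
    if distance_m >= THRESHOLD_M:
        bird_rows = screen_height // 2   # 1:1 split
    else:
        bird_rows = screen_height // 3   # 1:2 split
    return bird_rows, screen_height - bird_rows
```

A 600-row screen, for example, would be split 300/300 at 4 m and 200/400 at 2 m under this sketch.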
The display control unit 304 also cuts out the portion of a frame 602 from a top view image 601 in FIG. 6 to obtain an image 603, and performs image processing that compresses the vertical ratio of the image 603 to obtain an image 604. FIG. 6 is a diagram (Part 1) showing an example of a top view image. The display control unit 304 combines the image 503 and the image 604 to generate a composite image 701 shown in FIG. 7. FIG. 7 is a diagram (Part 1) showing an example of a composite image. In FIG. 7, Hd and Hf are 1:1. In FIG. 7, reference numeral 702 denotes the first area and reference numeral 703 denotes the second area. As shown in FIG. 7, the first area is arranged in the upper part of the composite image 701, and the second area is arranged in the lower part of the composite image 701. The first area is an area in which the bird's-eye view image regarding the traveling direction of the automobile 100 is displayed. The second area is an area in which the top view image including the automobile 100 and its surroundings is displayed. In the composite image, the flow of surrounding information relative to the movement of the automobile matches between the first area and the second area. Therefore, the operator of the automobile 100 can intuitively understand the surrounding conditions and the like.
The display control unit 304 also adjusts the aspect ratio of each area so that the displayed distances in the respective areas are substantially the same at the location where the first area and the second area of the composite image meet. The adjustment of the aspect ratio is described below with reference to FIGS. 8 to 10.
FIG. 8 is a diagram (Part 1) for explaining the adjustment of the aspect ratio. In FIG. 8, reference numeral 704 denotes the automobile 100. The lines in the first area 702 and the second area 703 are drawn for the sake of explanation; they represent, for example, the lines that would be displayed in each area if straight lines were drawn on the ground at predetermined intervals.
When no object exists in the traveling direction of the automobile 100, the display control unit 304 performs image processing that compresses the portion of the top view image corresponding to the automobile 100 so that the left and right sides of the automobile 100 are visible, as shown in FIG. 9, and combines the top view image and the bird's-eye view image. FIG. 9 is a diagram (Part 2) for explaining the adjustment of the aspect ratio.
On the other hand, when an object exists in the traveling direction of the automobile 100, the display control unit 304 performs image processing that projectively transforms the top view image so that a rectangle becomes a trapezoid, as shown in FIG. 10, and combines the top view image and the bird's-eye view image. FIG. 10 is a diagram (Part 3) for explaining the adjustment of the aspect ratio. By performing image processing in this way, the lines of the top view image and the bird's-eye view image can be made to appear nearly straight.
When the distance L between the automobile 100 and the object behind the automobile 100 is less than 3 m, the display control unit 304 performs control such that Hf and Hd are 1:2. More specifically, the display control unit 304 cuts out the portion of a frame 802 from an image 801 in FIG. 11 to obtain an image 803. That is, when the distance L between the automobile 100 and the object behind the automobile 100 is less than 3 m, distant areas are no longer important in the bird's-eye view image regarding the traveling direction of the automobile 100, so the display control unit 304 crops the image to the vicinity of the automobile 100. FIG. 11 is a diagram (Part 2) showing an example of a bird's-eye view image regarding the traveling direction of the automobile 100. The display control unit 304 changes the scale of the aspect ratio of the cropped image 803 to obtain an image 804.
The display control unit 304 also cuts out the portion of a frame 902 from a top view image 901 in FIG. 12 to obtain an image 903, and performs image processing that compresses the vertical ratio of the image 903 to obtain an image 904. FIG. 12 is a diagram (Part 2) showing an example of a top view image. The display control unit 304 changes the horizontal scale of the image 904 to obtain an image 905. The display control unit 304 combines the image 804 and the image 905 to generate a composite image 1001 in FIG. 13. FIG. 13 is a diagram (Part 2) showing an example of a composite image. In FIG. 13, Hd and Hf are 1:2.
(Changing the display ratio - continuous)
Next, an example is described in which the display control unit 304 continuously changes the proportion of the first area and the proportion of the second area in the composite image. More specifically, the display control unit 304 performs control such that the closer the automobile 100 comes to the object, the smaller the proportion of the first area in the composite image becomes relative to the proportion of the second area.
The display control unit 304 obtains the area change rate of the image displayed in the first area and the image displayed in the second area using the displacement curves shown in FIGS. 14 and 15 described later, and changes the area of the image displayed in each area accordingly.
The display control unit 304 obtains per, which indicates the distance ratio.
When reversing, the display control unit 304 measures the distance to the object behind and obtains per using the maximum distance (5 m). Note that 5 m is an example of the predetermined maximum distance and is not limiting.
per = 1 - distance to the object behind / maximum distance (5 m)
When moving forward, the display control unit 304 measures the distance to the object ahead and obtains per using the maximum distance (5 m).
per = 1 - distance to the object ahead / maximum distance (5 m)
If no object is closer than the maximum distance, the subsequent processing is not performed.
per (0 <= per <= 1)
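The computation of per above can be expressed as a short sketch (Python is used for illustration; the function name and the explicit clamping are assumptions — the text only states that 0 &lt;= per &lt;= 1 and that processing is skipped when no object is within the maximum distance):

```python
MAX_DISTANCE_M = 5.0  # the example maximum distance from the text

def distance_ratio(distance_m: float):
    """Return per = 1 - distance / maximum distance, or None when the
    object is not closer than the maximum distance (in which case the
    subsequent processing is not performed)."""
    if distance_m >= MAX_DISTANCE_M:
        return None
    per = 1.0 - distance_m / MAX_DISTANCE_M
    # Keep per within the stated range 0 <= per <= 1.
    return max(0.0, min(1.0, per))
```

For example, an object 2.5 m behind the vehicle yields per = 0.5, and an object at or beyond 5 m yields no value at all.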
The display control unit 304 uses the obtained per to obtain the display area split (dif) from a displacement curve.
Here, the automobile 100 may make turnabouts when parking. The automobile 100 may also move away from the object depending on the steering angle. So that the image (video) displayed on the display 120 does not change abruptly in such cases, the automobile 100 is configured to hold two displacement curves: one used when per decreases as the automobile moves away from the object (curveDec, FIG. 14), and one used when per increases as the automobile approaches the object (curveInc, FIG. 15). Furthermore, curveInc is always located to the right of curveDec. FIG. 14 is a diagram showing an example of the displacement curve used when per decreases as the automobile moves away from the object. FIG. 15 is a diagram showing an example of the displacement curve used when per increases as the automobile approaches the object. In both FIG. 14 and FIG. 15, the horizontal axis is per and the vertical axis is dif.
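The two-curve hysteresis described above can be sketched as follows. The piecewise-linear stand-ins for curveDec and curveInc and the 0.2 offset between them are assumptions for illustration; the actual curves are those of FIGS. 14 and 15.

```python
def curve_dec(per: float) -> float:
    """Hypothetical stand-in for the decreasing-per curve (FIG. 14)."""
    return min(1.0, max(0.0, per * 1.25))

def curve_inc(per: float) -> float:
    """Hypothetical stand-in for the increasing-per curve (FIG. 15).
    It lies to the right of curve_dec, i.e. a smaller dif for the
    same per."""
    return min(1.0, max(0.0, (per - 0.2) * 1.25))

def display_area_split(per: float, prev_per: float) -> float:
    """Select dif from the curve matching the direction of motion:
    curveInc while approaching the object (per increasing), curveDec
    while moving away (per decreasing). Using two separate curves
    keeps the displayed split from jumping during turnabouts."""
    if per >= prev_per:
        return curve_inc(per)
    return curve_dec(per)
```

Because curveInc sits to the right of curveDec, the split obtained while approaching at a given per is never larger than the split obtained while retreating through the same per, which is what suppresses visible jumps.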
(Changing the viewpoint)
The display control unit 304 also changes the depression angle of the viewpoint of the top view image based on the distance between the automobile 100 and an object existing in the traveling direction. More specifically, the display control unit 304 performs control such that the closer the automobile 100 comes to the object, the larger the depression angle of the viewpoint of the top view image becomes. In doing so, the display control unit 304 uses a displacement curve as described above so that the depression angle of the viewpoint of the top view image increases as the automobile 100 approaches the object. Note that the display control unit 304 may change the rate of change of the depression angle depending on whether the automobile 100 is approaching the object or moving away from the object.
Note that the display control unit 304 may change the display ratio as described above when the automobile 100 is moving backward, and may change the viewpoint when the automobile 100 is moving forward.
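As a sketch, the depression-angle control can be written as follows. The angle range (30 to 80 degrees) and the linear mapping are assumptions for illustration; in practice the displacement curves of FIGS. 14 and 15 would shape the mapping.

```python
def depression_angle(per: float, min_deg: float = 30.0,
                     max_deg: float = 80.0) -> float:
    """Map per (0 = object at the maximum distance, 1 = object
    reached) to a viewpoint depression angle: the closer the vehicle
    is to the object, the larger the angle. A linear curve stands in
    here for the displacement curve."""
    return min_deg + (max_deg - min_deg) * per
```

Passing a different curve (or separate approach/retreat curves) in place of the linear term would realize the varying rate of change mentioned above.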
According to the present embodiment, the operator can easily understand the traveling direction of the moving body, and the surrounding information of the moving body can be presented in an easier-to-understand manner. More specifically, the information processing device 110 presents the composite image on a single screen and changes the viewpoint of the perspective transformation and/or the presentation area in accordance with the surrounding conditions, thereby representing the bird's-eye view image regarding the traveling direction and the top view image continuously. As a result, intuitive surrounding information can be presented to the operator. In particular, the information processing device 110 takes into account the distance to objects such as obstacles around the automobile 100 and changes the viewpoint transformation and/or the presentation area, thereby realizing a display method that makes obstacles more noticeable. As a result, the operator can more intuitively understand the forward and backward movements of the vehicle when backing up and can drive the automobile 100 safely.
(Modification 1)
A modification of the first embodiment is described.
In the first embodiment, the automobile 100 was described as an example of a moving body. However, the moving body may be, for example, a so-called drone such as an unmanned aerial vehicle. When the moving body is a drone, the information processing device 110 may be included in the drone or in the controller of the operator who operates the drone. When the moving body is a drone, a display unit visible to the operator is provided on the controller. Also, when the moving body is a drone, examples of the object include a mark indicating the takeoff and landing location of the drone.
With this modification as well, the operator can easily understand the traveling direction of the moving body, and the surrounding information of the moving body can be presented in an easier-to-understand manner.
(Modification 2)
A modification of the first embodiment is described.
FIG. 16 is a diagram showing an example of a top view image and a bird's-eye view image when the vehicle is reversing. An arrow 1601 indicates the traveling direction of the automobile 100. An image 1602 is a top view image. An image 1603 is a bird's-eye view image.
FIG. 17 is a diagram showing an example of a composite image. A composite image 1701 is an image created by combining the top view image 1602 and the bird's-eye view image 1603 shown in FIG. 16.
FIG. 18 is a diagram showing an example of image processing applied to the top view image and the bird's-eye view image.
In S1801, the display control unit 304 flips the top view image vertically so that the vehicle faces downward.
In S1802, the display control unit 304 rectangularly clips the vertically flipped top view image to a prescribed area Sa. That is, the display control unit 304 does not draw the portion of the vertically flipped top view image outside the prescribed area Sa.
In S1803, the display control unit 304 projectively transforms the rectangularly clipped top view image so that the rectangle becomes a trapezoid.
Meanwhile, in S1804, the display control unit 304 rectangularly clips the bird's-eye view image to a prescribed area Sb. That is, the display control unit 304 does not draw the portion of the bird's-eye view image outside the prescribed area Sb.
Here, the display control unit 304 changes the above-described ratio of Sa to Sb according to the distance between the automobile 100 and the obstacle.
In S1805, the display control unit 304 projectively transforms the rectangularly clipped bird's-eye view image so as to correct its distortion.
In S1806, the display control unit 304 combines the projectively transformed top view image and the projectively transformed bird's-eye view image to generate one composite image.
Note that the processing shown in FIG. 18 is repeatedly executed while the automobile 100 is moving.
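The S1801 to S1806 flow above can be sketched minimally as follows. The projective transformations of S1803 and S1805 are omitted (in practice they could be done with, e.g., cv2.warpPerspective); images are represented simply as lists of rows, and the row counts standing in for the areas Sa and Sb are assumptions for illustration.

```python
def compose_views(top_view, bird_view, sa_rows, sb_rows):
    """Sketch of FIG. 18: flip, clip, and combine the two views.
    top_view / bird_view are images as lists of rows; the Sa:Sb
    split would be changed with the obstacle distance."""
    # S1801: flip the top view vertically so the vehicle faces down.
    flipped = top_view[::-1]
    # S1802: rectangular clipping of the top view to the area Sa.
    top_clip = flipped[:sa_rows]
    # S1804: rectangular clipping of the bird's-eye view to the area Sb.
    bird_clip = bird_view[:sb_rows]
    # (S1803 / S1805: the trapezoid and distortion-correcting
    # projective transforms would be applied here.)
    # S1806: combine into one composite, bird's-eye view on top.
    return bird_clip + top_clip
```

Calling this once per frame mirrors the note that the FIG. 18 processing is repeated while the vehicle is moving.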
(Modification 3)
A modification of the first embodiment is described.
FIG. 19 is a diagram showing an example in which a projective transformation is applied so that a rectangle in the bird's-eye view image becomes an inverted trapezoid.
When an object exists in the traveling direction of the automobile 100, the display control unit 304 of Modification 3 applies, as shown in FIG. 19, a projective transformation to the top view image so that a rectangle becomes a trapezoid and a projective transformation to the bird's-eye view image so that a rectangle becomes an inverted trapezoid, and generates one image by combining the projectively transformed top view image and the projectively transformed bird's-eye view image. By transforming the rectangle of the top view image into a trapezoid and the rectangle of the bird's-eye view image into an inverted trapezoid, straight lines and the like can be connected smoothly even when the top view image and the bird's-eye view image are combined, as shown in FIG. 19.
FIG. 20 is a diagram showing an example of image processing applied to the top view image and the bird's-eye view image in Modification 3.
In S2001, the display control unit 304 flips the top view image vertically so that the vehicle faces downward.
In S2002, the display control unit 304 sets an ROI (Region Of Interest). At this time, the display control unit 304 sets the ROI so that the upper side of the top view image and the lower side of the bird's-eye view image coincide in three-dimensional position.
In S2003, the display control unit 304 applies a projective transformation so that the rectangle of the top view image becomes a trapezoid.
Meanwhile, in S2004, the display control unit 304 sets an ROI (Region Of Interest). At this time, the display control unit 304 sets the ROI so that the lower side of the bird's-eye view image and the upper side of the top view image coincide in three-dimensional position. Note that when the automobile 100 approaches an obstacle, the display control unit 304 sets the ROI so that the nearby area is enlarged in the vertical direction and the display range on the near side is enlarged in the horizontal direction.
In S2005, the display control unit 304 applies a projective transformation so that the rectangle of the bird's-eye view image becomes an inverted trapezoid.
In S2006, the display control unit 304 joins the projectively transformed top view image and the projectively transformed bird's-eye view image.
In S2007, the display control unit 304 enlarges or reduces the joined image to fit the display 120. The display control unit 304 also clips the joined image.
Note that the processing shown in FIG. 20 is repeatedly executed while the automobile 100 is moving.
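The matched trapezoid and inverted-trapezoid target shapes can be illustrated as follows. The corner layout (top edge inset for the top view, bottom edge inset for the bird's-eye view) and the inset parameter are assumptions for illustration; the projective transformation itself (e.g., via a homography) is not shown.

```python
def trapezoid(w: int, h: int, inset: int):
    """Target corners (clockwise from top-left) for the top view:
    the top edge is inset on both sides, giving a trapezoid."""
    return [(inset, 0), (w - inset, 0), (w, h), (0, h)]

def inverted_trapezoid(w: int, h: int, inset: int):
    """Target corners for the bird's-eye view: the bottom edge is
    inset on both sides, giving an inverted trapezoid."""
    return [(0, 0), (w, 0), (w - inset, h), (inset, h)]

def edges_match(w: int, h: int, inset: int) -> bool:
    """The bird's-eye image sits above the top view, so its bottom
    edge must have the same width as the top view's top edge for
    straight lines to connect smoothly across the seam."""
    top = trapezoid(w, h, inset)
    bird = inverted_trapezoid(w, h, inset)
    top_edge_w = top[1][0] - top[0][0]
    bird_bottom_w = bird[2][0] - bird[3][0]
    return top_edge_w == bird_bottom_w
```

Using the same inset for both shapes is the simplest way to keep the shared edge widths equal, which is the condition for the smooth connection described above.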
(Modification 4)
A modification of the first embodiment is described.
FIG. 21 is a diagram showing an example in which the automobile 100 is provided with distance sensors. FIG. 21 shows an example in which distance sensors are provided at the rear left and rear right of the automobile 100. However, the number of distance sensors is not limited to two. The distance sensors may also measure the distance to objects existing beside or in front of the automobile 100. As shown in FIG. 21, when an object exists at the right rear of the automobile 100, for example, the display control unit 304 of Modification 4 applies, as shown in FIG. 22, a projective transformation to the left portion of the bird's-eye view image so that a rectangle becomes an inverted trapezoid. Note that in FIG. 22, no such projective transformation is applied to the right portion of the bird's-eye view image. Similarly, when an object exists at the left rear of the automobile 100, for example, the display control unit 304 applies a projective transformation to the right portion of the bird's-eye view image so that a rectangle becomes an inverted trapezoid. In this case as well, no such projective transformation is applied to the left portion of the bird's-eye view image. That is, the display control unit 304 performs control to vary the display form of the bird's-eye view image depending on whether the object is to the left or to the right of the moving body.
FIG. 22 is a diagram showing an example of a display mode of the bird's-eye view image. In the example of FIG. 22, a projective transformation that turns a rectangle into an inverted trapezoid is applied to the left portion of the bird's-eye view image, and no projective transformation is applied to the right portion.
&lt;Notes&gt;
The invention may be provided in each of the aspects described below.
(1)情報処理装置であって、制御部を有し、前記制御部は、移動体の進行方向に関する俯瞰図画像と、前記移動体と前記移動体の周辺とを含む上面図画像と、に基づいて、一つの画像を生成し、前記一つの画像には、前記進行方向に関する俯瞰図画像を表示する第1の領域と、前記上面図画像を表示する第2の領域と、が含まれ、前記一つの画像を前記移動体の操作者が目視可能な表示部に表示するよう制御する、情報処理装置。 (1) An information processing apparatus having a control unit, wherein the control unit generates a bird's-eye view image related to a moving direction of a moving body and a top view image including the moving body and the periphery of the moving body. based on, one image is generated, the one image includes a first area for displaying a bird's eye view image related to the traveling direction and a second area for displaying the top view image, An information processing device that controls to display the one image on a display unit that is visible to an operator of the mobile object.
(2)上記(1)に記載の情報処理装置において、前記第1の領域は、前記一つの画像の上部に配置され、前記第2の領域は、前記一つの画像の下部に配置される、情報処理装置。 (2) In the information processing device according to (1) above, the first area is arranged above the one image, and the second area is arranged below the one image. Information processing equipment.
(3)上記(1)又は(2)に記載の情報処理装置において、前記制御部は、前記移動体と、前記進行方向に存在するオブジェクトと、の距離に基づいて、前記一つの画像における、前記第1の領域の割合と、前記第2の領域の割合と、を変更する、情報処理装置。 (3) In the information processing apparatus according to (1) or (2) above, the control unit, based on the distance between the moving object and the object existing in the traveling direction, in the one image, An information processing device that changes a ratio of the first area and a ratio of the second area.
(4)上記(3)に記載の情報処理装置において、前記制御部は、前記距離に基づいて、前記一つの画像における、前記第1の領域の割合と、前記第2の領域の割合と、を連続的に変更する、情報処理装置。 (4) In the information processing apparatus according to (3) above, the control unit controls, based on the distance, the ratio of the first area and the ratio of the second area in the one image, An information processing device that continuously changes the
(5)上記(3)に記載の情報処理装置において、前記制御部は、前記距離に基づいて、前記一つの画像における、前記第1の領域の割合と、前記第2の領域の割合と、を段階的に変更する、情報処理装置。 (5) In the information processing apparatus according to (3) above, the control unit controls, based on the distance, the ratio of the first area and the ratio of the second area in the one image, Information processing device that changes step by step.
(6)上記(3)から(5)までの何れか1項に記載の情報処理装置において、前記制御部は、前記移動体が前記オブジェクトに近づくほど前記一つの画像における前記第1の領域の割合が前記第2の領域の割合に比べて小さくなるよう制御する、情報処理装置。 (6) In the information processing apparatus according to any one of (3) to (5) above, the control unit controls the area of the first area in the one image as the moving object approaches the object. An information processing device that performs control so that the ratio is smaller than the ratio of the second region.
(7)上記(3)から(6)までの何れか1項に記載の情報処理装置において、前記制御部は、前記進行方向にオブジェクトが存在する場合、前記上面図画像において長方形が台形となるよう射影変換を行い、前記俯瞰図画像と合成することで前記一つの画像を生成する、情報処理装置。 (7) In the information processing apparatus according to any one of (3) to (6) above, when an object exists in the direction of travel, the control unit causes the rectangle to become a trapezoid in the top view image. An information processing device that generates the one image by performing a projective transformation and synthesizing it with the bird's-eye view image.
(8)上記(7)に記載の情報処理装置において、前記制御部は、前記進行方向にオブジェクトが存在する場合、前記上面図画像において長方形が台形となるよう射影変換を行うとともに、前記俯瞰図画像において長方形が逆台形となるよう射影変換を行い、射影変換を行った上面図画像と射影変換を行った俯瞰図画像とを合成することで前記一つの画像を生成する、情報処理装置。 (8) In the information processing apparatus described in (7) above, when an object exists in the traveling direction, the control unit performs projective transformation so that a rectangle becomes a trapezoid in the top view image, and An information processing device that performs projective transformation so that a rectangle in an image becomes an inverted trapezoid, and generates the one image by synthesizing the projectively transformed top view image and the projectively transformed overhead view image.
(9)上記(8)に記載の情報処理装置において、前記制御部は、前記オブジェクトが前記移動体の左にあるか右にあるかに応じて、前記俯瞰図画像の表示形態を異ならせるよう制御する、情報処理装置。 (9) In the information processing device according to (8) above, the control unit may change the display form of the bird's-eye view image depending on whether the object is on the left or right of the moving object. Information processing device to control.
(10)上記(3)から(9)までの何れか1項に記載の情報処理装置において、前記制御部は、前記移動体と、前記オブジェクトと、の距離に基づいて、前記俯瞰図画像の視点に関する俯角を変更する、情報処理装置。 (10) In the information processing apparatus according to any one of (3) to (9) above, the control unit adjusts the overhead view image based on the distance between the moving object and the object. An information processing device that changes the depression angle related to the viewpoint.
(11)上記(10)に記載の情報処理装置において、前記制御部は、前記移動体が前記オブジェクトに近づくほど前記俯瞰図画像の視点に関する俯角が大きくなるよう制御する、情報処理装置。 (11) In the information processing apparatus according to (10) above, the control unit performs control such that the depression angle with respect to the viewpoint of the bird's-eye view image increases as the moving object approaches the object.
(12) In the information processing device according to (10) or (11) above, the control unit changes the rate of change of the depression angle of the viewpoint of the bird's-eye-view image between when the moving body approaches the object and when the moving body moves away from the object.
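One hypothetical way to realize (10)–(12) together is a scheduler that maps distance to a target depression angle (closer means steeper, per (11)) and steps the displayed angle toward that target at a rate that depends on whether the vehicle is approaching or receding (per (12)). Every constant below is an illustrative assumption, not a value from the disclosure.

```python
MIN_ANGLE_DEG = 20.0   # far away: shallow, forward-looking viewpoint
MAX_ANGLE_DEG = 80.0   # very close: nearly straight down
FAR_DIST_M    = 10.0   # distance at/beyond which the angle is minimal

APPROACH_GAIN = 0.5    # fraction of the gap closed per update when approaching
RECEDE_GAIN   = 0.2    # slower relaxation when moving away (per (12))

def target_angle(distance_m: float) -> float:
    """Map distance to a target depression angle: closer => larger (per (11))."""
    t = min(max(distance_m / FAR_DIST_M, 0.0), 1.0)
    return MAX_ANGLE_DEG - t * (MAX_ANGLE_DEG - MIN_ANGLE_DEG)

def update_angle(current_deg: float, prev_dist_m: float, dist_m: float) -> float:
    """Step the displayed angle toward the target, faster when approaching."""
    gain = APPROACH_GAIN if dist_m < prev_dist_m else RECEDE_GAIN
    return current_deg + gain * (target_angle(dist_m) - current_deg)
```

With these assumed constants, `target_angle(10.0)` is 20 degrees and `target_angle(0.0)` is 80 degrees, and a single update while approaching moves the angle further than the same update while receding.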
(13) In the information processing device according to any one of (3) to (12) above, the moving body is an automobile, and the object relates to a parking position of the automobile.
(14) A moving body comprising the information processing device according to any one of (1) to (13) above.
(15) The moving body according to (14) above, wherein the moving body is an automobile.
(16) An information processing method executed by an information processing device, the method comprising: generating one image based on a bird's-eye-view image relating to a traveling direction of a moving body and a top-view image including the moving body and surroundings of the moving body, the one image including a first area that displays the bird's-eye-view image relating to the traveling direction and a second area that displays the top-view image; and controlling the one image to be displayed on a display unit of the moving body.
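The core of method (16) is compositing two source views into one frame, with the bird's-eye view in the upper area and the top view in the lower area. A minimal, dependency-free sketch of that vertical composition (output size and nearest-neighbor row resampling are assumptions for illustration):

```python
import numpy as np

def compose(bird_view: np.ndarray, top_view: np.ndarray,
            first_ratio: float) -> np.ndarray:
    """Stack the bird's-eye-view image above the top-view image so that the
    first (upper) area occupies `first_ratio` of the output height. Rows are
    resampled with nearest-neighbor indexing to keep the sketch self-contained;
    a real system would use a proper image resampler."""
    h, w = 480, 640                      # assumed output frame size
    h1 = int(round(h * first_ratio))    # first area: bird's-eye view
    h2 = h - h1                         # second area: top view

    def resize_rows(img, new_h):
        idx = (np.arange(new_h) * img.shape[0] / new_h).astype(int)
        return img[idx]

    # Both inputs are assumed to already share the output width.
    return np.vstack([resize_rows(bird_view, h1), resize_rows(top_view, h2)])
```

Driving `first_ratio` from the measured distance to the object gives the area-ratio behavior recited in claims 3 to 6.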
(17) A program for causing a computer to function as the control unit of the information processing device according to any one of (1) to (13) above.
 Of course, the embodiments are not limited to the above.
 For example, the above program may be provided as a computer-readable non-transitory storage medium storing the program.
 The modifications described above may also be combined arbitrarily.
 Although the above embodiments have mainly been described using the back view as an example, the effects described above can also be obtained by performing similar processing on the front view.
 Finally, although various embodiments of the present invention have been described, they are presented as examples and are not intended to limit the scope of the invention. The novel embodiments can be implemented in various other forms, and various omissions, substitutions, and modifications can be made without departing from the gist of the invention. The embodiments and their modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and its equivalents.
100  : Automobile
110  : Information processing device
120  : Display
130  : Camera
201  : Control unit
202  : Storage unit
203  : Communication unit
301  : Peripheral information receiving unit
302  : Object recognition unit
303  : Behavior information receiving unit
304  : Display control unit
340  : Display control unit
501  : Image
502  : Frame
503  : Image
601  : Top-view image
602  : Frame
603  : Image
604  : Image
701  : Composite image
702  : First area
703  : Second area
801  : Image
802  : Frame
803  : Image
804  : Image
901  : Top-view image
902  : Frame
903  : Image
904  : Image
905  : Image
1000 : Information processing system
1001 : Composite image
1301 : Camera
1302 : Camera
1601 : Arrow
1701 : Composite image

Claims (17)

  1.  An information processing device comprising:
     a control unit,
     wherein the control unit:
     generates one image based on a bird's-eye-view image relating to a traveling direction of a moving body and a top-view image including the moving body and surroundings of the moving body,
     the one image including a first area that displays the bird's-eye-view image relating to the traveling direction and a second area that displays the top-view image, and
     controls the one image to be displayed on a display unit visible to an operator of the moving body.
  2.  The information processing device according to claim 1, wherein
     the first area is arranged in an upper portion of the one image, and
     the second area is arranged in a lower portion of the one image.
  3.  The information processing device according to claim 1 or 2, wherein
     the control unit changes a ratio of the first area and a ratio of the second area in the one image based on a distance between the moving body and an object existing in the traveling direction.
  4.  The information processing device according to claim 3, wherein
     the control unit continuously changes the ratio of the first area and the ratio of the second area in the one image based on the distance.
  5.  The information processing device according to claim 3, wherein
     the control unit changes the ratio of the first area and the ratio of the second area in the one image in a stepwise manner based on the distance.
  6.  The information processing device according to any one of claims 3 to 5, wherein
     the control unit performs control such that the ratio of the first area becomes smaller relative to the ratio of the second area in the one image as the moving body approaches the object.
  7.  The information processing device according to any one of claims 3 to 6, wherein
     when an object exists in the traveling direction, the control unit performs a projective transformation on the top-view image so that a rectangle becomes a trapezoid, and generates the one image by combining the transformed top-view image with the bird's-eye-view image.
  8.  The information processing device according to claim 7, wherein
     when an object exists in the traveling direction, the control unit performs a projective transformation on the top-view image so that a rectangle becomes a trapezoid and a projective transformation on the bird's-eye-view image so that a rectangle becomes an inverted trapezoid, and generates the one image by combining the transformed top-view image with the transformed bird's-eye-view image.
  9.  The information processing device according to claim 8, wherein
     the control unit performs control such that the display form of the bird's-eye-view image differs depending on whether the object is to the left or to the right of the moving body.
  10.  The information processing device according to any one of claims 3 to 9, wherein
     the control unit changes a depression angle of the viewpoint of the bird's-eye-view image based on the distance between the moving body and the object.
  11.  The information processing device according to claim 10, wherein
     the control unit performs control such that the depression angle of the viewpoint of the bird's-eye-view image increases as the moving body approaches the object.
  12.  The information processing device according to claim 10 or 11, wherein
     the control unit changes the rate of change of the depression angle of the viewpoint of the bird's-eye-view image between when the moving body approaches the object and when the moving body moves away from the object.
  13.  The information processing device according to any one of claims 3 to 12, wherein
     the moving body is an automobile, and
     the object relates to a parking position of the automobile.
  14.  A moving body comprising the information processing device according to any one of claims 1 to 13.
  15.  The moving body according to claim 14, wherein the moving body is an automobile.
  16.  An information processing method executed by an information processing device, the method comprising:
     generating one image based on a bird's-eye-view image relating to a traveling direction of a moving body and a top-view image including the moving body and surroundings of the moving body,
     the one image including a first area that displays the bird's-eye-view image relating to the traveling direction and a second area that displays the top-view image; and
     controlling the one image to be displayed on a display unit of the moving body.
  17.  A program for causing a computer to function as the control unit of the information processing device according to any one of claims 1 to 13.
PCT/JP2022/028507 2021-11-30 2022-07-22 Information processing device, movable body, information processing method and program WO2023100415A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023564741A JPWO2023100415A1 (en) 2021-11-30 2022-07-22

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-194729 2021-11-30
JP2021194729 2021-11-30

Publications (1)

Publication Number Publication Date
WO2023100415A1 true WO2023100415A1 (en) 2023-06-08

Family

ID=86611853

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/028507 WO2023100415A1 (en) 2021-11-30 2022-07-22 Information processing device, movable body, information processing method and program

Country Status (2)

Country Link
JP (1) JPWO2023100415A1 (en)
WO (1) WO2023100415A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000064175A1 (en) * 1999-04-16 2000-10-26 Matsushita Electric Industrial Co., Ltd. Image processing device and monitoring system
JP2003158736A (en) * 2000-07-19 2003-05-30 Matsushita Electric Ind Co Ltd Monitoring system
JP2010215027A (en) * 2009-03-13 2010-09-30 Fujitsu Ten Ltd Driving assistant device for vehicle
JP2011030078A (en) * 2009-07-28 2011-02-10 Toshiba Alpine Automotive Technology Corp Image display device for vehicle
JP2012147285A (en) * 2011-01-13 2012-08-02 Alpine Electronics Inc Back monitor apparatus
JP2014110604A (en) * 2012-12-04 2014-06-12 Denso Corp Vehicle periphery monitoring device
JP2017098932A (en) * 2015-11-17 2017-06-01 株式会社Jvcケンウッド Vehicle display device and vehicle display method
WO2021131481A1 (en) * 2019-12-24 2021-07-01 株式会社Jvcケンウッド Display device, display method, and display program


Also Published As

Publication number Publication date
JPWO2023100415A1 (en) 2023-06-08

Similar Documents

Publication Publication Date Title
JP7010221B2 (en) Image generator, image generation method, and program
US7554461B2 (en) Recording medium, parking support apparatus and parking support screen
KR102275310B1 (en) Mtehod of detecting obstacle around vehicle
JP7150274B2 (en) Autonomous vehicles with improved visual detection capabilities
EP2429877B1 (en) Camera system for use in vehicle parking
US9712791B2 (en) Around view provision apparatus and vehicle including the same
US8717196B2 (en) Display apparatus for vehicle
US10908604B2 (en) Remote operation of vehicles in close-quarter environments
US20170287168A1 (en) Around view provision apparatus and vehicle including the same
JP5209578B2 (en) Image display device for vehicle
US20170036678A1 (en) Autonomous vehicle control system
JP2017200182A (en) Topographic visualization for vehicle and vehicle driver
US20200290600A1 (en) Parking assistance device and parking assistance method
JP7091624B2 (en) Image processing equipment
CN112825127B (en) Method of generating a compact 2D bounding box for an autopilot marker
JP2008236403A (en) Driving support device for vehicle
KR20170118077A (en) Method and device for the distortion-free display of an area surrounding a vehicle
WO2020026825A1 (en) Information processing device, information processing method, program, and mobile body
US10540807B2 (en) Image processing device
JP2019526105A5 (en)
US20190100141A1 (en) Ascertainment of Vehicle Environment Data
KR102031635B1 (en) Collision warning device and method using heterogeneous cameras having overlapped capture area
CN108422932B (en) Driving assistance system, method and vehicle
CN112837209B (en) Novel method for generating distorted image for fish-eye lens
WO2023100415A1 (en) Information processing device, movable body, information processing method and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22900850

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023564741

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE