WO2018112898A1 - Projection method and device, and robot - Google Patents

Projection method and device, and robot

Info

Publication number
WO2018112898A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection
panoramic
projection area
partial
area
Prior art date
Application number
PCT/CN2016/111754
Other languages
English (en)
Chinese (zh)
Inventor
骆磊
Original Assignee
深圳前海达闼云端智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳前海达闼云端智能科技有限公司 filed Critical 深圳前海达闼云端智能科技有限公司
Priority to CN201680002677.6A priority Critical patent/CN106797455A/zh
Priority to PCT/CN2016/111754 priority patent/WO2018112898A1/fr
Publication of WO2018112898A1 publication Critical patent/WO2018112898A1/fr

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3188Scale or resolution adjustment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107Static hand or arm
    • G06V40/113Recognition of static hand signs
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/225Feedback of the input speech

Definitions

  • the present application relates to the field of projection imaging technology, and in particular, to a projection method, apparatus, and robot.
  • Existing robots have a projection function and can flexibly change the projection position, projecting a preset panoramic projection picture onto a corresponding area so that the user can complete the related work according to that panoramic projection picture.
  • the projection device of the robot projects the entire projection picture required for construction onto the construction wall, so that the construction personnel can carry out the construction on the construction wall according to the whole projection picture.
  • To project the entire projection picture in one pass, the prior-art robot relies on a high-specification, expensive projection device, or on several projection devices projecting simultaneously.
  • In many cases, however, the user does not need the entire projection picture to complete the related work; a partial projection picture within the whole picture is enough for the work in the current range. During construction, for example, the construction personnel only need the partial projection picture covering the current construction range to finish the work in that range, and when their position changes, they can acquire the partial projection picture for the next position range and complete the construction there.
  • The inventors of the present application have found that existing robots and projection apparatuses have not solved the technical problem of selecting a partial projection picture from a panoramic projection picture for partial projection.
  • the technical problem to be solved by the present application is to provide a projection method, a device and a robot.
  • The main purpose of the present application is to solve the technical problem that prior-art projection devices cannot select a partial projection picture from a panoramic projection picture for partial projection.
  • the embodiment of the present application provides the following technical solutions:
  • An embodiment of the present application provides a projection method, including: determining a position of a partial projection area from a panoramic projection area; adjusting a projection direction of the projection apparatus according to the position of the partial projection area; and causing the projection apparatus to project a partial projection picture corresponding to the partial projection area onto the partial projection area.
  • Determining the location of the partial projection area from the panoramic projection area comprises: determining a position of the user in the panoramic projection area; and determining the location of the partial projection area in the panoramic projection area according to the determined position of the user in the panoramic projection area.
  • the determining a location of the user in the panoramic projection area comprises: capturing a panoramic picture of a location where the panoramic projection area is located; determining a location of the user in the panoramic projection area from the panoramic picture.
  • The partial projection picture is obtained by cropping from a pre-stored panoramic projection picture, where the pre-stored panoramic projection picture covers the panoramic projection area when it is projected onto the panoramic projection area.
  • Adjusting the projection direction of the projection device according to the position of the partial projection area comprises: moving the projection device according to the position of the partial projection area until the projection optical axis of the moved projection device is perpendicular to the plane in which the partial projection area is located.
  • Adjusting the projection direction of the projection device according to the position of the partial projection area comprises: adjusting a projection angle of the projection device according to the position of the partial projection area until the projection optical axis of the adjusted projection device is focused on the partial projection area.
  • An embodiment of the present application provides a projection apparatus, the apparatus comprising: a determining module configured to determine a position of a partial projection area from a panoramic projection area; and an adjustment module configured to adjust a projection direction of the projection device according to the position of the partial projection area, and to cause the projection device to project a partial projection picture corresponding to the partial projection area onto the partial projection area.
  • The determining module includes: a first determining unit configured to determine a position of the user in the panoramic projection area; and a second determining unit configured to determine the position of the partial projection area in the panoramic projection area according to the determined position of the user in the panoramic projection area.
  • The first determining unit includes: a first determining subunit configured to capture a panoramic picture of the location where the panoramic projection area is located; and a second determining subunit configured to determine, from the panoramic picture, the position of the user in the panoramic projection area.
  • The partial projection picture is obtained by cropping from a pre-stored panoramic projection picture, where the pre-stored panoramic projection picture covers the panoramic projection area when it is projected onto the panoramic projection area.
  • The adjustment module includes: a first adjustment unit, configured to move the projection device according to the position of the partial projection area, until the projection optical axis of the moved projection device is perpendicular to the plane where the partial projection area is located.
  • The adjustment module includes: a second adjustment unit, configured to adjust the projection angle of the projection device according to the position of the partial projection area, until the projection optical axis of the adjusted projection device is focused on the partial projection area.
  • An embodiment of the present application provides a robot, including: at least one processor; a projection device; an adjustment device configured to adjust a projection direction of the projection device; and a memory communicatively coupled to the at least one processor; wherein the processor is coupled to the projection device, the adjustment device, and the memory, respectively; the memory stores an instruction program executable by the at least one processor, and the instruction program is executed by the at least one processor to enable the at least one processor to perform the method described above using the projection device and the adjustment device.
  • In the embodiments of the present application, the position of the partial projection area is determined from the panoramic projection area, and the projection direction of the projection apparatus is adjusted according to the position of the partial projection area, so that the projection apparatus projects the partial projection picture corresponding to the partial projection area onto the partial projection area. In this way, when panoramic projection cannot be satisfied, the partial projection picture within the panoramic projection picture can still be projected into the currently desired partial projection area, and the partial projection picture of the current partial projection area can be adjusted in real time every time the partial projection area changes.
  • FIG. 1 is a schematic structural diagram of a projection apparatus according to an embodiment of the present application.
  • FIG. 2 is a schematic structural diagram of a projection apparatus according to another embodiment of the present application.
  • FIG. 3 is a schematic structural diagram of hardware of a robot provided by an embodiment of the present application.
  • 3a to 3c are schematic views of a construction scenario provided by an embodiment of the present application.
  • FIG. 4 is a schematic flow chart of a projection method provided by an embodiment of the present application.
  • FIG. 5 is a schematic flow chart of step 40 in FIG. 4.
  • FIG. 6 is a schematic flow chart of step 401 in FIG. 5.
  • FIG. 1 is a schematic structural diagram of a projection apparatus according to an embodiment of the present application.
  • the projection device 10 includes a determination module 101 and an adjustment module 102 .
  • the determining module 101 is configured to determine a location of the partial projection area from the panoramic projection area;
  • the adjustment module 102 is configured to adjust a projection direction of the projection device according to the position of the partial projection area, and cause the projection device to project the partial projection picture corresponding to the partial projection area to the partial projection area.
  • The partial projection area is a sub-region of the panoramic projection area; during projection, the projection device 10 takes the panoramic projection area as the desired projection area according to the task requirements.
  • the determination module 101 may select a partial region from the panoramic projection region as a partial projection region, and project a corresponding partial projection image to the partial projection region.
  • For example, the panoramic projection area is rectangular with corner points A(0,0), B(0,100), C(150,0), and D(150,100); the partial projection area may also be rectangular, with corner points E(30,40), F(100,40), G(30,50), and H(100,50).
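  • As a purely illustrative sketch (not part of the claimed method), the two areas in this example can be represented as axis-aligned rectangles and the containment of the partial projection area checked as follows; the Rect helper and its field names are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    """Axis-aligned rectangle in the panoramic projection area's own units."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    @classmethod
    def from_corners(cls, corners):
        # Build the bounding rectangle of a set of corner points.
        xs = [x for x, _ in corners]
        ys = [y for _, y in corners]
        return cls(min(xs), min(ys), max(xs), max(ys))

    def contains(self, other: "Rect") -> bool:
        return (self.x_min <= other.x_min and self.y_min <= other.y_min
                and self.x_max >= other.x_max and self.y_max >= other.y_max)

# Panoramic projection area ABCD and partial projection area EFGH from the example.
panoramic = Rect.from_corners([(0, 0), (0, 100), (150, 0), (150, 100)])
partial = Rect.from_corners([(30, 40), (100, 40), (30, 50), (100, 50)])
assert panoramic.contains(partial)  # the partial area lies inside the panoramic area
```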
  • The partial projection area need not be rectangular; it may also be another shape, such as a circle or a triangle, or an irregular shape.
  • Each time the partial projection area is re-determined, its range may be set according to the current operation requirement rather than being limited in advance.
  • For example, the panoramic projection area is a rectangular area 1000 cm long and 500 cm wide. The current partial projection area may be a rectangular area 150 cm long and 100 cm wide within the panoramic projection area; when the position of the partial projection area changes, the next partial projection area may be a circular area with a radius of 120 cm within the panoramic projection area.
  • In this way, when panoramic projection cannot be satisfied, the current partial projection picture can still be projected into the currently desired partial projection area, and the partial projection picture of the current partial projection area can be adjusted in real time whenever the partial projection area changes, enabling dynamic partial projection.
  • FIG. 2 is a schematic structural diagram of a projection apparatus according to another embodiment of the present application.
  • the projection device 20 includes a determination module 201 and an adjustment module 202.
  • the determination module 201 and the adjustment module 202 are based on the same idea as the above embodiment, and will not be further described herein.
  • the projection device 20 is different from the above embodiment in that the determination module 201 includes a first determination unit 2011 and a second determination unit 2012.
  • the first determining unit 2011 is configured to determine a position of the user in the panoramic projection area; and the second determining unit 2012 is configured to determine the position of the partial projection area in the panoramic projection area according to the determined position of the user in the panoramic projection area. Since the partial projection area is determined according to the position of the user in the panoramic projection area, local projection is performed following the position of the user.
  • Taking image recognition as an example: a panoramic picture of the location of the panoramic projection area is first acquired, the position of the user in the panoramic picture is then determined, and from this the position of the user in the panoramic projection area is obtained. For this purpose, the first determining unit 2011 includes a first determining subunit 20111 and a second determining subunit 20112.
  • The first determining subunit 20111 is configured to capture a panoramic picture of the location where the panoramic projection area is located; the second determining subunit 20112 is configured to determine the position of the user in the panoramic projection area from the panoramic picture. The position of the user relative to the panoramic picture corresponds to the position of the user in the panoramic projection area.
  • In practice, the first determining subunit 20111 may capture the location of the panoramic projection area at a set acquisition frequency and output the captured panoramic picture. The user's moving speed may also be recognized, and the acquisition frequency adjusted according to that speed to ensure that the acquisition frequency matches the user's movement.
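  • A minimal sketch of such a speed-dependent acquisition rate is given below; the base rate, growth rate, and cap are assumed constants chosen only to illustrate that faster user movement warrants more frequent capture.

```python
def capture_frequency_hz(user_speed_m_per_s: float,
                         base_hz: float = 1.0,
                         hz_per_m_per_s: float = 2.0,
                         max_hz: float = 10.0) -> float:
    """Return a capture rate that grows with the user's walking speed.

    The constants are illustrative; the application only requires that the
    acquisition frequency match the user's moving speed.
    """
    return min(max_hz, base_hz + hz_per_m_per_s * max(0.0, user_speed_m_per_s))

# A user walking at 1.5 m/s would be sampled at 4 Hz with these constants.
print(capture_frequency_hz(1.5))  # 4.0
```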
  • the panoramic projection area corresponds to a panoramic projection screen, and when the panoramic projection screen is projected, the panoramic projection screen can cover the panoramic projection area.
  • the panoramic projection screen is pre-stored in the projection device.
  • The partial projection picture can be obtained by cropping from the panoramic projection picture according to the position and size of the partial projection area. For example, the second determining subunit 20112 determines that the corner points of the partial projection area EFGH are E(30,40), F(100,40), G(30,50), and H(100,50). According to the size and position of the partial projection area within the panoramic projection area, and the scale between the panoramic projection area and the panoramic projection picture, the partial projection area is mapped into the panoramic projection picture to determine the cropping range and position. For example, when the size of the panoramic projection picture is 285 mm x 210 mm and the corner points of the panoramic projection area are A(0,0), B(0,100), C(150,0), and D(150,100), the partial projection picture is cropped from the panoramic projection picture at the rectangle defined by the four corner coordinates K1(57,84), K2(190,84), K3(57,105), and K4(190,105), in millimetres. The size of the cropped partial projection picture is therefore (190 mm - 57 mm) x (105 mm - 84 mm) = 133 mm x 21 mm, and the cropped rectangle lies 57 mm from the first side of the panoramic projection picture and 84 mm from its third side.
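  • The mapping in the example above can be reproduced with a short sketch: the function below simply applies the per-axis scale between the panoramic projection area and the stored panoramic projection picture; its name and signature are illustrative assumptions, not the application's wording.

```python
def crop_rect_in_picture(panoramic_area_size, picture_size_mm, partial_corners):
    """Map a partial projection area into the pre-stored panoramic picture.

    panoramic_area_size: (width, height) of the panoramic projection area in its
                         own coordinate units (150 x 100 in the example above).
    picture_size_mm:     (width, height) of the stored panoramic picture
                         (285 mm x 210 mm in the example above).
    partial_corners:     corner coordinates of the partial projection area.
    Returns ((x0, y0), (x1, y1)) of the crop rectangle in picture millimetres.
    """
    area_w, area_h = panoramic_area_size
    pic_w, pic_h = picture_size_mm
    sx, sy = pic_w / area_w, pic_h / area_h   # 1.9 mm/unit and 2.1 mm/unit here
    xs = [x * sx for x, _ in partial_corners]
    ys = [y * sy for _, y in partial_corners]
    return (min(xs), min(ys)), (max(xs), max(ys))

(k_x0, k_y0), (k_x1, k_y1) = crop_rect_in_picture(
    (150, 100), (285, 210),
    [(30, 40), (100, 40), (30, 50), (100, 50)])
print((k_x0, k_y0), (k_x1, k_y1))   # (57.0, 84.0) (190.0, 105.0)
print(k_x1 - k_x0, k_y1 - k_y0)     # 133.0 21.0  -> a 133 mm x 21 mm crop
```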
  • In some cases, a plurality of persons may be present in front of the panoramic projection area.
  • In that case, one user may be matched according to person features, and the partial projection area is determined according to that user. Specifically, the first determining unit 2011 captures an image of the entire wall through the camera, extracts the features of each person from the image, and matches the extracted features against pre-stored features. If the matching succeeds, the successfully matched person is taken as the user, and that person's position in the image is the position of the user in the panoramic projection area.
  • The person features are not limited; for example, the person may make a thumbs-up gesture or a fist gesture. Matching on such features avoids the recognition confusion that can arise when several persons stand in front of the wall.
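  • A minimal sketch of the matching step, assuming the person features have already been extracted as numeric vectors (how they are extracted, e.g. from a thumbs-up gesture, is left open here); the function name, the distance threshold, and the data layout are assumptions made for illustration.

```python
import math

def pick_user(detections, stored_feature, max_distance=0.5):
    """Pick the person whose feature vector best matches the pre-stored one.

    detections: list of (position_in_area, feature_vector) pairs, one per person
                detected in the captured panoramic picture.
    stored_feature: pre-stored feature vector of the intended user.
    Returns the matched person's position, or None if nobody matches closely enough.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best = min(detections, key=lambda d: dist(d[1], stored_feature), default=None)
    if best is not None and dist(best[1], stored_feature) <= max_distance:
        return best[0]  # position of the user in the panoramic projection area
    return None
```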
  • Speech recognition: for example, the user stands at a certain position in front of the wall and gives an "I am here" voice indication; the voice is recognized, and the position of the user in the panoramic projection area is determined from the directivity of the voice.
  • Infrared detection: for example, the wall surface is scanned by an infrared detector; when the heat at a certain position on the wall differs from the heat at other positions, it is determined that a user is present at that position, and the location of the user in the panoramic projection area is thereby determined.
  • The position of the user in the panoramic projection area only defines the position of the partial projection area; the range of the partial projection area is not thereby limited and can be determined according to actual business requirements.
  • For example, if the user is 180 cm tall and 50 cm wide, the range of the partial projection area may, according to actual needs, cover the location of the user or the vicinity of that location, and is not limited to 180 cm x 50 cm.
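  • As an illustration only, a partial projection area can be placed around the user's position and clamped to the boundaries of the panoramic projection area as sketched below; the default width and height are assumptions, since the application leaves the range of the partial projection area to the actual requirement.

```python
def partial_area_around_user(user_x, user_y,
                             area_width=180.0, area_height=200.0,
                             panoramic_width=1000.0, panoramic_height=500.0):
    """Return (x_min, y_min, x_max, y_max) of a partial projection area centred
    on the user and kept inside the panoramic projection area (all in cm)."""
    half_w, half_h = area_width / 2.0, area_height / 2.0
    x_min = max(0.0, min(user_x - half_w, panoramic_width - area_width))
    y_min = max(0.0, min(user_y - half_h, panoramic_height - area_height))
    return (x_min, y_min, x_min + area_width, y_min + area_height)

# A user standing near the left edge still gets an area fully inside the wall.
print(partial_area_around_user(20.0, 250.0))  # (0.0, 150.0, 180.0, 350.0)
```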
  • Alternatively, the partial projection area may be a fixed range occupying a certain proportion of the panoramic projection area.
  • The projection apparatus may adjust its position according to the position of the partial projection area so as to output an undistorted partial projection picture.
  • The adjustment module 202 includes a first adjustment unit (not shown). The first adjustment unit is configured to move the projection device according to the position of the partial projection area until the projection optical axis of the moved projection device is perpendicular to the plane in which the partial projection area is located.
  • the adjustment module 202 includes a second adjustment unit (not shown). The second adjusting unit is configured to adjust the projection angle of the projection device according to the position of the partial projection area until the projection optical axis of the adjusted projection device is focused on the partial projection area.
  • In this embodiment, a panoramic picture of the location of the panoramic projection area can be captured, the position of the user in the panoramic projection area determined from that picture, and a partial projection picture related to the user's position obtained by cropping from the panoramic projection picture. The current partial projection picture is projected into the currently desired partial projection area, and each time the partial projection area changes, the partial projection picture of the current partial projection area can be adjusted in real time according to the position of the user, with the projection direction of the projection device adjusted according to the user's position to realize dynamic partial projection.
  • FIG. 3 is a schematic diagram of a hardware structure of a robot provided by an embodiment of the present application.
  • the robot 30 is configured to perform a projection method as described above, the robot 30 comprising: one or more processors 310, a projection device 320, an adjustment device 330, and a memory 340.
  • One processor 310 is taken as an example in FIG. 3. The processor 310, the projection device 320, the adjustment device 330, and the memory 340 may be connected by a bus or in another manner; connection through a bus is taken as an example in FIG. 3.
  • The projection device 320 is used to project onto the partial projection area.
  • the adjustment device 330 is used to adjust the projection direction of the projection device.
  • The processor 310 is configured to determine the position of the partial projection area from the panoramic projection area, adjust the projection direction of the projection device 320 through the adjustment device 330 according to the position of the partial projection area, and cause the projection device 320 to project the partial projection picture corresponding to the partial projection area onto the partial projection area.
  • adjustment device 330 herein may refer to a device for adjusting the position and/or posture of the robot to adjust projection device 320 on the robot.
  • For example, the robot is controlled to move along the projected wall so as to control the projection position.
  • It may also be a device dedicated to adjusting the projection device 320 rather than for adjusting the position and/or posture of the entire robot.
  • For example, the adjustment device 330 may include a rotating shaft and a drive that rotates the shaft, with the projection device 320 mounted on the rotating shaft so that it turns together with the shaft.
  • The processor 310 sends a driving command to the drive controller of the adjustment device 330, which rotates the shaft of the adjustment device 330 to bring the projection device 320 into a suitable position, so that the projection device 320 projects the partial projection picture corresponding to the partial projection area onto the partial projection area.
  • The partial projection area may be determined according to the position of the user in the panoramic projection area, so that the projection dynamically follows the user. The processor 310 is specifically configured to determine the position of the user in the panoramic projection area and, according to that position, to determine the position of the partial projection area in the panoramic projection area.
  • the robot further includes an image collection device 350.
  • The processor 310 determines the position of the user in the panoramic projection area by capturing, through the image collection device 350, a panoramic picture of the location where the panoramic projection area is located, and then determining the position of the user in the panoramic projection area from that panoramic picture.
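  • The application does not prescribe how the user is located in the captured picture; as one hedged illustration, the sketch below runs OpenCV's standard HOG pedestrian detector on the captured image and maps the detection centre into the coordinates of the panoramic projection area, assuming the picture covers exactly that area.

```python
import cv2

def user_position_in_area(panoramic_image, area_width, area_height):
    """Locate a person in the captured picture and return (x, y) in area units.

    panoramic_image: BGR or grayscale image (e.g. from cv2.imread or a camera).
    area_width, area_height: size of the panoramic projection area in its units.
    Returns None if no person is detected.
    """
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    rects, _weights = hog.detectMultiScale(panoramic_image, winStride=(8, 8))
    if len(rects) == 0:
        return None
    x, y, w, h = rects[0]  # take the first detection for simplicity
    img_h, img_w = panoramic_image.shape[:2]
    # Map the detection centre from pixel coordinates to area coordinates.
    return ((x + w / 2.0) / img_w * area_width,
            (y + h / 2.0) / img_h * area_height)
```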
  • Other ways of determining the location of the user in the panoramic projection area, such as speech recognition and infrared detection, may also be employed.
  • The panoramic projection area may correspond to a panoramic projection picture, that is, a picture that covers the panoramic projection area when it is projected onto that area.
  • the panoramic projection screen is pre-stored in the projection device. After the local projection region is determined, the partial projection image can be obtained by cropping from the panoramic projection image according to the position and size of the partial projection region.
  • the projection apparatus may adjust its position according to the position of the partial projection area to output an undistorted partial projection picture.
  • The adjustment device can also adjust the position of the robot. In that case, the processor 310 adjusts the projection direction of the projection device according to the position of the partial projection area as follows: the processor 310 moves the projection device through the adjustment device according to the position of the partial projection area until the projection optical axis of the moved projection device is perpendicular to the plane in which the partial projection area is located. Alternatively, the processor 310 adjusts the projection direction according to the position of the partial projection area by adjusting the projection angle of the projection device until the projection optical axis of the adjusted projection device is focused on the partial projection area.
  • The operations performed by the processor 310 may be stored in the memory 340 in the form of program instructions/modules. The memory 340 serves as a non-volatile computer-readable storage medium for storing non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the projection method in the embodiments of the present application (for example, the modules shown in FIG. 1, or the modules and units shown in FIG. 2). The processor 310 executes the non-volatile software programs, instructions, and modules stored in the memory 340 to perform the various functional applications and data processing of the projection apparatus, that is, to implement the projection method of the method embodiments below and the functions of the modules of the apparatus embodiments described above.
  • The memory 340 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. The memory 340 can optionally include memory located remotely from the processor 310 and connected to it via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • The program instructions/modules are stored in the memory 340 and, when executed by the one or more processors 310, perform the projection method of any of the method embodiments described below, for example the steps shown in FIG. 4 to FIG. 6, and can also realize the functions of the modules shown in FIG. 1 described above, or the functions of the modules and units shown in FIG. 2.
  • The computer software can be stored in a computer-readable storage medium and, when executed, can include the flow of an embodiment of the methods described above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
  • the robot needs to project the projection screen required for the job to the work area.
  • the robot can select two ways to project during the projection process: the first type: the robot uses the entire work area as the panoramic projection area, and projects the panoramic projection picture corresponding to the panoramic projection area to The panoramic projection area.
  • The second type: the robot selects, from the entire work area and according to the worker's working range, the local work area currently associated with the worker as the partial projection area, and projects the partial projection picture corresponding to the partial projection area onto it. If the current projection device cannot project the panoramic projection picture corresponding to the panoramic projection area onto the panoramic projection area, the second mode can be selected to complete the task. This embodiment mainly describes the second mode.
  • The construction wall 3a0, that is, the panoramic projection area, includes a bookshelf area 3a01, a background area 3a02, a television installation area 3a03, and a potting area 3a04, each located at its planned position within the panoramic projection area 3a0. The designer plans these areas in advance, so each area has a specified position and size.
  • The construction personnel need to mark out and line each area so that subsequent workers can complete the next operation. If the first method is used for projection, the limited size of the room means that the projection picture cannot cover the construction wall 3a0 in a single pass, and the construction personnel generally do not work on the entire construction wall 3a0 at the same time.
  • In addition, the construction personnel are generally familiar with the engineering design drawings required for the construction. Whereas the first method would require the robot to be equipped with a high-specification projection device, increasing the design cost of the robot, the second method can meet the construction requirements.
  • As shown in FIG. 3b, the constructor 3b0 stands on the ground to mark out and build the side elevation of the bookshelf 3b1. The robot can use the side elevation of the bookshelf 3b1 as the partial projection area and project the partial projection picture 3b2 relevant to the current constructor (that is, the projection picture related to the side elevation of the bookshelf 3b1) onto that area, so that the partial projection area and the partial projection picture 3b2 overlap.
  • As shown in FIG. 3c, the constructor 3b0 stands on the ladder 3c0 to mark the scribe-line region 3c1 at the right edge of the background area 3a02. The robot can use the right-edge scribe-line region 3c1 of the background area 3a02 as the partial projection area and project the partial projection picture 3c2 relevant to the current constructor (that is, the projection picture related to the right edge of the background area 3a02) onto it, so that the partial projection area and the partial projection picture 3c2 overlap.
  • In this way, the robot can project the current partial projection picture into the currently desired partial projection area and, every time the partial projection area changes, adjust the partial projection picture of the current partial projection area in real time according to the position of the user, while also adjusting the projection direction of the projection device according to the user's position, thereby realizing dynamic partial projection.
  • The robot-based description of the projection method given here is only one embodiment and does not limit the projection method or the apparatus included in the robot. Any solution that determines the position of a partial projection area from a panoramic projection area and adjusts the projection direction of the projection device according to that position, so that the projection device can project the partial projection picture of the current position when only that partial picture is needed, falls within the scope of protection of the present application.
  • the projection method includes:
  • Step 40: determine the position of the partial projection area from the panoramic projection area;
  • Step 42: adjust the projection direction of the projection device according to the position of the partial projection area, and cause the projection device to project the partial projection picture corresponding to the partial projection area onto the partial projection area.
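  • Read together, steps 40 and 42 amount to a repeated sense-and-adjust loop. The sketch below expresses that loop over hypothetical callables standing in for the robot's camera, the user-locating step, the area computation, the adjustment device, and the projector; none of these names come from the application.

```python
import time

def projection_loop(capture_frame, locate_user, compute_partial_area,
                    adjust_direction, project_partial, period_s=1.0):
    """Sketch of steps 40 and 42 as a loop over hypothetical callables.

    capture_frame()           -> panoramic picture of the projection area
    locate_user(frame)        -> user position in area coordinates, or None
    compute_partial_area(pos) -> partial projection area around that position
    adjust_direction(area)    -> step 42: re-aim the projection device
    project_partial(area)     -> crop and project the matching partial picture
    """
    while True:
        frame = capture_frame()
        pos = locate_user(frame)              # step 40: position of the user
        if pos is not None:
            area = compute_partial_area(pos)  # step 40: partial projection area
            adjust_direction(area)            # step 42: adjust projection direction
            project_partial(area)             # step 42: project the partial picture
        time.sleep(period_s)
```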
  • the partial projection area is a partial projection area in the panoramic projection area.
  • the robot will use the panoramic projection area as the desired projection area according to the task requirements.
  • the robot can select a partial region from the panoramic projection region as a partial projection region, and project a corresponding partial projection image to the partial projection region.
  • For example, the panoramic projection area is rectangular with corner points A(0,0), B(0,100), C(150,0), and D(150,100); the partial projection area may also be rectangular, with corner points E(30,40), F(100,40), G(30,50), and H(100,50).
  • The partial projection area need not be rectangular; it may also be another shape, such as a circle or a triangle, or an irregular shape.
  • Each time the partial projection area is re-determined, its range may be set according to the current operation requirement rather than being limited in advance.
  • For example, the panoramic projection area is a rectangular area 1000 cm long and 500 cm wide. The current partial projection area may be a rectangular area 150 cm long and 100 cm wide within the panoramic projection area; when the position of the partial projection area changes, the next partial projection area may be a circular area with a radius of 120 cm within the panoramic projection area.
  • In this way, when panoramic projection cannot be satisfied, the present application can still project the current partial projection picture into the currently desired partial projection area and, each time the partial projection area changes, adjust the partial projection picture of the current partial projection area in real time to realize dynamic partial projection.
  • step 40 includes:
  • Step 401: determine the location of the user in the panoramic projection area;
  • Step 402: determine the position of the partial projection area in the panoramic projection area according to the determined position of the user in the panoramic projection area.
  • Since the partial projection area is determined according to the position of the user in the panoramic projection area, partial projection is performed following the position of the user.
  • step 401 includes:
  • Step 4011: capture a panoramic picture of the location where the panoramic projection area is located;
  • Step 4012: determine the position of the user in the panoramic projection area from the panoramic picture.
  • In practice, the panoramic picture of the location of the panoramic projection area may be captured at a set acquisition frequency and output. The user's moving speed may also be recognized, and the acquisition frequency adjusted according to that speed to ensure that the acquisition frequency matches the user's movement.
  • the panoramic projection area corresponds to a panoramic projection screen, and when the panoramic projection screen is projected, the panoramic projection screen can cover the panoramic projection area.
  • the panoramic projection screen is pre-stored in the projection device.
  • the partial projection image can be obtained by cropping from the panoramic projection image according to the position and size of the local projection region.
  • For example, the robot determines that the corner points of the partial projection area EFGH are E(30,40), F(100,40), G(30,50), and H(100,50). According to the size and position of the partial projection area within the panoramic projection area, and the scale between the panoramic projection area and the panoramic projection picture, the partial projection area is mapped into the panoramic projection picture to determine the cropping range and position. For example, when the size of the panoramic projection picture is 285 mm x 210 mm and the corner points of the panoramic projection area are A(0,0), B(0,100), C(150,0), and D(150,100), the partial projection picture is cropped from the panoramic projection picture at the rectangle defined by the four corner coordinates K1(57,84), K2(190,84), K3(57,105), and K4(190,105), in millimetres. The size of the cropped partial projection picture is therefore (190 mm - 57 mm) x (105 mm - 84 mm) = 133 mm x 21 mm, and the cropped rectangle lies 57 mm from the first side of the panoramic projection picture and 84 mm from its third side.
  • When a plurality of persons are in front of the panoramic projection area, the robot may capture an image of the entire wall through the camera, extract the features of each person from the image, and match the extracted features against pre-stored features. If the matching succeeds, the successfully matched person is taken as the user, and that person's position in the image is the position of the user in the panoramic projection area.
  • The person features are not limited; for example, the person may make a thumbs-up gesture or a fist gesture. Matching on such features avoids the recognition confusion that can arise when several persons stand in front of the wall.
  • Speech recognition: for example, the user stands at a certain position in front of the wall and gives an "I am here" voice indication; the voice is recognized, and the position of the user in the panoramic projection area is determined from the directivity of the voice.
  • Infrared detection: for example, the wall surface is scanned by an infrared detector; when the heat at a certain position on the wall differs from the heat at other positions, it is determined that a user is present at that position, and the location of the user in the panoramic projection area is thereby determined.
  • The position of the user in the panoramic projection area only defines the position of the partial projection area; the range of the partial projection area is not thereby limited and can be determined according to actual business requirements. For example, if the user is 180 cm tall and 50 cm wide, the range of the partial projection area may, according to actual needs, cover the location of the user or the vicinity of that location, and is not limited to 180 cm x 50 cm.
  • Alternatively, the partial projection area may be a fixed range occupying a certain proportion of the panoramic projection area.
  • Further, the projection apparatus may adjust its position according to the position of the partial projection area so as to output an undistorted partial projection picture. For example, the robot moves according to the position of the partial projection area until the projection optical axis of the moved projection device is perpendicular to the plane in which the partial projection area is located. Since the projection optical axis of the projection device is then perpendicular to the plane of the partial projection area, the projected partial projection picture is free of distortion.
  • For example, when the partial projection area lies to the left of the projection device and the device detects that the angle between the current projection optical axis and the plane of the partial projection area is 30 degrees, the projection device can move along a horizontal line until its projection optical axis is perpendicular to that plane and then project the partial projection picture onto the partial projection area, enabling the user to complete the job more accurately.
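  • How far such a translation needs to go follows from simple trigonometry. The sketch below assumes the projector keeps the same perpendicular distance to the wall and the same target point on it; these are illustrative assumptions, not conditions stated in the application.

```python
import math

def lateral_shift_for_perpendicular(wall_distance, axis_to_plane_angle_deg):
    """Distance to move parallel to the wall so the optical axis becomes
    perpendicular to it while still hitting the same target point.

    wall_distance:           perpendicular distance from projector to wall.
    axis_to_plane_angle_deg: current angle between the optical axis and the
                             plane of the partial projection area.
    """
    axis_to_normal_deg = 90.0 - axis_to_plane_angle_deg
    return wall_distance * math.tan(math.radians(axis_to_normal_deg))

# With the projector 200 cm from the wall and a 30-degree axis-to-plane angle,
# the robot would translate roughly 346 cm along the wall.
print(round(lateral_shift_for_perpendicular(200, 30)))  # 346
```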
  • Alternatively, according to the position of the partial projection area, the robot may leave the projection device in place and only rotate its projection angle, changing the direction of the projection optical axis until it is focused on the partial projection area, so that the partial projection picture is projected onto the partial projection area. In this case the projected partial projection picture may be distorted (i.e. deviate from the desired picture); therefore, to output an undistorted partial projection picture, the projection device corrects the partial projection picture in advance with an image correction algorithm before projecting it, so that it can respond to partial projection areas at different positions with a certain horizontal tilt angle and vertical tilt angle while still outputting an undistorted partial projection picture, enabling the user to complete the job more accurately.
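  • The application refers only to "an image correction algorithm" without naming one. A homography pre-warp is one common way to achieve this, sketched below with OpenCV; the observed_corners input, and the idea of measuring those corners with the robot's camera, are assumptions made for illustration.

```python
import cv2
import numpy as np

def pre_correct(partial_picture, observed_corners):
    """Pre-warp the partial picture so it appears undistorted on the wall.

    observed_corners: four corners (top-left, top-right, bottom-right,
    bottom-left), in picture pixel coordinates, at which an uncorrected
    projection of the picture is observed on the wall. The picture is warped
    with the inverse of the estimated homography so that the projector's
    oblique throw then yields a rectangular, undistorted result.
    """
    h, w = partial_picture.shape[:2]
    target = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    observed = np.float32(observed_corners)
    H = cv2.getPerspectiveTransform(target, observed)   # models the oblique throw
    return cv2.warpPerspective(partial_picture, np.linalg.inv(H), (w, h))
```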
  • In this embodiment, a panoramic picture of the location of the panoramic projection area can be captured, the position of the user in the panoramic projection area determined from that picture, and a partial projection picture related to the user's position obtained by cropping from the panoramic projection picture. The current partial projection picture is projected into the currently desired partial projection area, and each time the partial projection area changes, the partial projection picture of the current partial projection area can be adjusted in real time according to the position of the user, with the projection direction of the projection device adjusted according to the user's position to realize dynamic partial projection.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Projection Apparatus (AREA)

Abstract

The present invention relates to a projection method and device, and a robot. The method comprises: determining the position of a partial projection area from a panoramic projection area (40); and adjusting a projection direction of a projection apparatus according to the position of the partial projection area, and causing the projection apparatus to project a partial projection picture, corresponding to the partial projection area, onto the partial projection area (42). By means of the method, the projection apparatus can project the current partial projection picture into the partial projection area where projection is currently required when panoramic projection cannot be satisfied, and each time the partial projection area changes, the partial projection picture of the current partial projection area can be adjusted in real time so as to complete the relevant operation.
PCT/CN2016/111754 2016-12-23 2016-12-23 Procédé et dispositif de projection, et robot WO2018112898A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201680002677.6A CN106797455A (zh) 2016-12-23 2016-12-23 一种投影方法、装置及机器人
PCT/CN2016/111754 WO2018112898A1 (fr) 2016-12-23 2016-12-23 Procédé et dispositif de projection, et robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/111754 WO2018112898A1 (fr) 2016-12-23 2016-12-23 Procédé et dispositif de projection, et robot

Publications (1)

Publication Number Publication Date
WO2018112898A1 true WO2018112898A1 (fr) 2018-06-28

Family

ID=58952194

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/111754 WO2018112898A1 (fr) 2016-12-23 2016-12-23 Procédé et dispositif de projection, et robot

Country Status (2)

Country Link
CN (1) CN106797455A (fr)
WO (1) WO2018112898A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107463902B (zh) * 2017-08-07 2020-11-24 上海碧虎网络科技有限公司 一种投影机自动识别投影异常的方法和***
KR20190024190A (ko) * 2017-08-31 2019-03-08 (주)휴맥스 음성 인식 영상 피드백 제공 시스템 및 방법
CN109996051B (zh) * 2017-12-31 2021-01-05 广景视睿科技(深圳)有限公司 一种投影区域自适应的动向投影方法、装置及***
CN108600716A (zh) * 2018-05-17 2018-09-28 京东方科技集团股份有限公司 投影设备和***、投影方法
CN110795053B (zh) * 2018-08-01 2023-07-18 昆山纬绩资通有限公司 计算机屏幕局部投影方法与***
CN112040207B (zh) * 2020-08-27 2021-12-10 广景视睿科技(深圳)有限公司 一种调整投影画面的方法、装置以及投影设备
CN112511814A (zh) * 2021-02-05 2021-03-16 深圳市橙子数字科技有限公司 投影仪的对焦方法、投影仪、计算机设备及存储介质
CN114782901B (zh) * 2022-06-21 2022-09-09 深圳市禾讯数字创意有限公司 基于视觉变动分析的沙盘投影方法、装置、设备及介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000352761A (ja) * 1999-06-10 2000-12-19 Sony Corp 映像投射装置及び方法並びに映像投射制御装置
CN104750443A (zh) * 2013-12-31 2015-07-01 联想(北京)有限公司 一种显示控制方法及电子设备
CN104750445A (zh) * 2015-03-09 2015-07-01 联想(北京)有限公司 一种信息处理方法及电子设备
CN105262968A (zh) * 2015-10-22 2016-01-20 神画科技(深圳)有限公司 自动调整投影画面位置的投影***及其投影方法
CN105573345A (zh) * 2014-10-14 2016-05-11 深圳市维森软件股份有限公司 一种基于全视野图的云台摄像机的控制方法和装置

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7134080B2 (en) * 2002-08-23 2006-11-07 International Business Machines Corporation Method and system for a user-following interface
CN104012106B (zh) * 2011-12-23 2017-11-24 诺基亚技术有限公司 使表示不同视点的视频对准
JP6075066B2 (ja) * 2012-12-28 2017-02-08 株式会社リコー 画像管理システム、画像管理方法、及びプログラム
US9241103B2 (en) * 2013-03-15 2016-01-19 Voke Inc. Apparatus and method for playback of multiple panoramic videos with control codes
CN103226282B (zh) * 2013-05-13 2016-09-07 合肥华恒电子科技有限责任公司 一种便携式虚拟现实投影装置
CN104244019B (zh) * 2014-09-18 2018-01-19 孙轩 一种全景视频影像室内分屏显示方法及显示***
CN104602129B (zh) * 2015-01-27 2018-03-06 三星电子(中国)研发中心 互动式多视角视频的播放方法及***
CN104735464A (zh) * 2015-03-31 2015-06-24 华为技术有限公司 一种全景视频交互传输方法、服务器和客户端
CN105117024A (zh) * 2015-09-25 2015-12-02 联想(北京)有限公司 一种控制方法、电子设备及电子装置
CN105898460A (zh) * 2015-12-10 2016-08-24 乐视网信息技术(北京)股份有限公司 调整智能电视的全景视频播放视角的方法和装置
CN105632384A (zh) * 2016-03-02 2016-06-01 青岛海信电器股份有限公司 一种投影显示***及方法
CN105828090A (zh) * 2016-03-22 2016-08-03 乐视网信息技术(北京)股份有限公司 全景直播方法及装置
CN105916060A (zh) * 2016-04-26 2016-08-31 乐视控股(北京)有限公司 数据传输的方法、装置及***
CN106125468B (zh) * 2016-06-29 2018-12-18 海信集团有限公司 一种多方向投影设备及方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000352761A (ja) * 1999-06-10 2000-12-19 Sony Corp 映像投射装置及び方法並びに映像投射制御装置
CN104750443A (zh) * 2013-12-31 2015-07-01 联想(北京)有限公司 一种显示控制方法及电子设备
CN105573345A (zh) * 2014-10-14 2016-05-11 深圳市维森软件股份有限公司 一种基于全视野图的云台摄像机的控制方法和装置
CN104750445A (zh) * 2015-03-09 2015-07-01 联想(北京)有限公司 一种信息处理方法及电子设备
CN105262968A (zh) * 2015-10-22 2016-01-20 神画科技(深圳)有限公司 自动调整投影画面位置的投影***及其投影方法

Also Published As

Publication number Publication date
CN106797455A (zh) 2017-05-31

Similar Documents

Publication Publication Date Title
WO2018112898A1 (fr) Procédé et dispositif de projection, et robot
CN112689135B (zh) 投影校正方法、装置、存储介质及电子设备
CN112804508B (zh) 投影仪校正方法、***、存储介质以及电子设备
CN112804507B (zh) 投影仪校正方法、***、存储介质以及电子设备
WO2019128109A1 (fr) Procédé de projection dynamique basé sur un suivi de visages, dispositif et équipement électronique
US11838697B2 (en) Ultra-short-throw picture and screen alignment method and apparatus, and storage medium
EP3122034B1 (fr) Procédé d'étalonnage de caméra
US8398246B2 (en) Real-time projection management
WO2021093231A1 (fr) Procédé et appareil d'alignement d'écran d'image à focalisation ultra-courte, dispositif de projection à focalisation ultra-courte, et support
US11210796B2 (en) Imaging method and imaging control apparatus
CN101697105A (zh) 一种摄像式触摸检测定位方法及摄像式触摸检测***
US10317777B2 (en) Automatic zooming method and apparatus
JP6172987B2 (ja) 方位角推定装置及び方位角推定プログラム
US9591229B2 (en) Image tracking control method, control device, and control equipment
CN109982029B (zh) 一种摄像机监控场景自动调节方法及装置
WO2015103835A1 (fr) Procédé et dispositif de commande d'azimut d'un dispositif de caméra doté de fonctions panoramique/d'inclinaison
WO2018006566A1 (fr) Procédé et système de réglage de vue
WO2017215246A1 (fr) Procédé et système de reconnaissance de dispositif d'entrée, et procédé et système de reconnaissance d'instruction d'entrée
KR20110094664A (ko) 전방향 피티지 카메라 제어 장치 및 그 방법
CN111627073B (zh) 一种基于人机交互的标定方法、标定装置和存储介质
CN111343360B (zh) 一种校正参数获得方法
CN108628487A (zh) 一种位置信息确定方法、投影设备和计算机存储介质
CN115174878B (zh) 投影画面校正方法、装置和存储介质
WO2018099128A1 (fr) Procédé et dispositif utilisés dans un appareil de projection
WO2023029123A1 (fr) Procédé et appareil de détection de coordonnées de sommet, et dispositif et support de stockage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16924880

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 15/10/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16924880

Country of ref document: EP

Kind code of ref document: A1