CN109788172A - Electronic equipment and mobile platform - Google Patents

Electronic equipment and mobile platform

Info

Publication number
CN109788172A
CN109788172A (application CN201910007852.6A)
Authority
CN
China
Prior art keywords
structured light
electronic equipment
video camera
light projector
initial depth
Prior art date
Legal status
Pending
Application number
CN201910007852.6A
Other languages
Chinese (zh)
Inventor
张学勇 (Zhang Xueyong)
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910007852.6A
Publication of CN109788172A
Legal status: Pending

Landscapes

  • Studio Devices (AREA)

Abstract

This application discloses an electronic device and a mobile platform. The electronic device includes a body and a plurality of structured light assemblies arranged at a plurality of different orientations of the body. Each structured light assembly includes two structured light projectors, each with a field of view of any value from 80 to 120 degrees, and one structured light camera with a field of view of any value from 180 to 200 degrees. The structured light projectors project laser patterns outward from the body, and the structured light camera collects the laser patterns, projected by its two corresponding structured light projectors, reflected by a target subject. The structured light projectors of the plurality of structured light assemblies project laser light simultaneously, and the structured light cameras of the plurality of structured light assemblies expose simultaneously, so as to obtain a panoramic depth image. In the electronic device and mobile platform of the embodiments of this application, the plurality of structured light projectors located at a plurality of different orientations of the body project laser light simultaneously, and the plurality of structured light cameras expose simultaneously, to obtain a panoramic depth image; relatively comprehensive depth information can thus be acquired in one pass.

Description

Electronic equipment and mobile platform
Technical field
This application relates to image acquisition technologies, and more specifically, to an electronic device and a mobile platform.
Background
In order to make the functions of electronic devices more diverse, a depth image acquisition apparatus may be provided on an electronic device to obtain a depth image of a target subject. However, existing depth image acquisition apparatuses can only obtain a depth image in a single direction or within a single angular range, so the depth information obtained is limited.
Summary of the invention
Embodiments of the present application provide an electronic device and a mobile platform.
The electronic device of the embodiments of the present application includes a body and a plurality of structured light assemblies arranged on the body. The plurality of structured light assemblies are located at a plurality of different orientations of the body. Each structured light assembly includes two structured light projectors and one structured light camera. The field of view of each structured light projector is any value from 80 to 120 degrees, and the field of view of each structured light camera is any value from 180 to 200 degrees. The structured light projectors are configured to project laser patterns outward from the body, and the structured light camera is configured to collect the laser patterns, projected by the two corresponding structured light projectors, reflected by a target subject. The structured light projectors of the plurality of structured light assemblies project laser light simultaneously, and the structured light cameras of the plurality of structured light assemblies expose simultaneously, so as to obtain a panoramic depth image.
The mobile platform of the embodiments of the present application includes a body and a plurality of structured light assemblies arranged on the body. The plurality of structured light assemblies are located at a plurality of different orientations of the body. Each structured light assembly includes two structured light projectors and one structured light camera. The field of view of each structured light projector is any value from 80 to 120 degrees, and the field of view of each structured light camera is any value from 180 to 200 degrees. The structured light projectors are configured to project laser patterns outward from the body, and the structured light camera is configured to collect the laser patterns, projected by the two corresponding structured light projectors, reflected by a target subject. The structured light projectors of the plurality of structured light assemblies project laser light simultaneously, and the structured light cameras of the plurality of structured light assemblies expose simultaneously, so as to obtain a panoramic depth image.
In the electronic device and mobile platform of the embodiments of the present application, the plurality of structured light projectors located at a plurality of different orientations of the body project laser light simultaneously, and the plurality of structured light cameras expose simultaneously, so as to obtain a panoramic depth image; relatively comprehensive depth information can thus be acquired in one pass.
Additional aspects and advantages of the embodiments of the present application will be set forth in part in the following description, will become apparent in part from the following description, or will be learned through practice of the embodiments of the present application.
Brief description of the drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a schematic structural diagram of an electronic device according to certain embodiments of the present application;
Fig. 2 is a schematic block diagram of an electronic device according to certain embodiments of the present application;
Fig. 3 is a schematic structural diagram of a structured light projector of a structured light assembly according to certain embodiments of the present application;
Fig. 4 is a schematic structural diagram of a light source of a structured light projector according to certain embodiments of the present application;
Fig. 5 is a schematic perspective view of a diffractive optical element of a structured light projector according to certain embodiments of the present application;
Fig. 6 is a cross-sectional view of a diffractive optical element of a structured light projector according to certain embodiments of the present application;
Fig. 7 is a schematic plan view of a diffractive optical element of a structured light projector according to certain embodiments of the present application;
Fig. 8 is a schematic diagram of an application scenario of an electronic device according to certain embodiments of the present application;
Fig. 9 is a schematic diagram of the coordinate systems used for initial depth image stitching according to certain embodiments of the present application;
Figs. 10 to 14 are schematic diagrams of application scenarios of an electronic device according to certain embodiments of the present application;
Figs. 15 to 18 are schematic structural diagrams of a mobile platform according to certain embodiments of the present application.
Detailed description of the embodiments
Embodiments of the present application are described further below with reference to the accompanying drawings. The same or similar reference numerals throughout the drawings denote the same or similar elements, or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the embodiments of the present application, and should not be construed as limiting the present application.
Referring to Fig. 1 and Fig. 2, the electronic device 100 of the embodiments of the present application includes a body 10, structured light assemblies 20, camera assemblies 30, microprocessors 40, and an application processor 50.
The body 10 has a plurality of different orientations. As shown in Fig. 1, the body 10 may have four different orientations, in clockwise order: a first orientation, a second orientation, a third orientation, and a fourth orientation, where the first orientation is opposite the third orientation and the second orientation is opposite the fourth orientation. The first orientation corresponds to the top of the body 10, the second orientation corresponds to the right side of the body 10, the third orientation corresponds to the bottom of the body 10, and the fourth orientation corresponds to the left side of the body 10.
The structured light assemblies 20 are arranged on the body 10. The number of structured light assemblies 20 may be plural, and the plurality of structured light assemblies 20 are located at a plurality of different orientations of the body 10. Specifically, the number of structured light assemblies 20 may be two, namely a structured light assembly 20a and a structured light assembly 20b, with the structured light assembly 20a arranged at the first orientation and the structured light assembly 20b arranged at the third orientation. Of course, the number of structured light assemblies 20 may also be four (or any other number greater than two), in which case the two additional structured light assemblies 20 may be arranged at the second orientation and the fourth orientation, respectively. The embodiments of the present application are described taking two structured light assemblies 20 as an example. It will be understood that two structured light assemblies 20 suffice to obtain a panoramic depth image (a panoramic depth image here means a depth image whose field of view is greater than or equal to 180 degrees; for example, the field of view of the panoramic depth image may be 180 degrees, 240 degrees, 360 degrees, 480 degrees, 720 degrees, etc.), which helps reduce the manufacturing cost of the electronic device 100 as well as its volume and power consumption. The electronic device 100 of this embodiment may be a portable electronic device, such as a mobile phone, a tablet computer, or a laptop, provided with a plurality of structured light assemblies 20; in that case, the body 10 may be the mobile phone body, the tablet computer body, the laptop body, etc. For an electronic device 100 with strict thickness requirements, taking a mobile phone as an example, the phone body is required to be thin, so structured light assemblies 20 usually cannot be mounted on its sides; using two structured light assemblies 20 to obtain the panoramic depth image solves this problem, as the two structured light assemblies 20 can then be mounted on the front and back of the phone body, respectively. In addition, obtaining the panoramic depth image with two structured light assemblies 20 also helps reduce the amount of computation required for the panoramic depth image.
Each structured light assembly 20 includes two structured light projectors 22 and one structured light camera 24. The structured light projectors 22 are configured to project laser patterns outward from the body 10, and the structured light camera 24 is configured to collect the laser patterns, projected by its two corresponding structured light projectors 22, reflected by a target subject. Specifically, the structured light assembly 20a includes a structured light projector 222a, a structured light projector 224a, and a structured light camera 24a; the structured light assembly 20b includes a structured light projector 222b, a structured light projector 224b, and a structured light camera 24b. The structured light projectors 222a and 224a project laser patterns toward the first orientation outside the body 10, and the structured light projectors 222b and 224b project laser patterns toward the third orientation outside the body 10. The structured light camera 24a collects the laser patterns, projected by the structured light projectors 222a and 224a, reflected by the target subject at the first orientation, and the structured light camera 24b collects the laser patterns, projected by the structured light projectors 222b and 224b, reflected by the target subject at the third orientation, so that all the different regions outside the body 10 can be covered. Compared with existing solutions, which must rotate through 360 degrees to obtain relatively comprehensive depth information, the electronic device 100 of this embodiment can obtain relatively comprehensive depth information in one pass without rotating; it is simple to operate and fast to respond.
The structured light projectors 22 of the plurality of structured light assemblies 20 project laser light simultaneously and, correspondingly, the structured light cameras 24 of the plurality of structured light assemblies 20 expose simultaneously, so as to obtain the panoramic depth image; the plurality of structured light assemblies 20 are, for example, the two structured light assemblies 20. Specifically, the structured light projectors 222a, 224a, 222b, and 224b project laser light simultaneously, and the structured light cameras 24a and 24b expose simultaneously. Because the plurality of structured light projectors 22 project laser light simultaneously and the plurality of structured light cameras 24 expose simultaneously, the plurality of initial depth images obtained from the laser patterns collected by the plurality of structured light cameras 24 have the same timeliness: they reflect the scene at each orientation outside the body 10 at the same moment, that is, a panoramic depth image of a single moment.
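By way of illustration of this simultaneous capture, the following is a minimal sketch, assuming hypothetical Projector and Camera driver classes with trigger() and capture() methods (the patent names no software interface): the two projector pairs fire and the two cameras expose on a shared barrier, so the two initial depth frames correspond to one instant.

```python
import threading
import time

# Hypothetical driver handles; the patent does not name a software API.
class Projector:
    def trigger(self):
        pass                            # start laser pattern projection

class Camera:
    def capture(self):
        return "laser_pattern"          # expose and return a frame

def capture_panorama(projectors, cameras):
    """Fire all projectors and expose all cameras on a shared barrier,
    so the initial depth frames correspond to the same moment."""
    barrier = threading.Barrier(len(cameras))
    frames = [None] * len(cameras)

    def worker(i):
        for p in projectors[i]:         # both projectors of assembly i
            p.trigger()
        barrier.wait()                  # align exposure start across assemblies
        frames[i] = cameras[i].capture()

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(len(cameras))]
    for t in threads: t.start()
    for t in threads: t.join()
    return frames, time.time()          # the frames share one timestamp
```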
The field of view of each structured light projector 22 is any value from 80 to 120 degrees, and the field of view of each structured light camera 24 is any value from 180 to 200 degrees.
In one embodiment, the field of view of each structured light projector 22 is any value from 80 to 90 degrees; for example, the fields of view of the structured light projectors 222a, 224a, 222b, and 224b are all 80 degrees, and the fields of view of the structured light cameras 24a and 24b are both 180 degrees. When the field of view of the structured light projector 22 is smaller, its manufacturing process is simpler and its manufacturing cost lower, and the uniformity of the projected laser light can be improved. When the field of view of the structured light camera 24 is smaller, lens distortion is smaller, the quality of the acquired initial depth images is better, the quality of the resulting panoramic depth image is accordingly better, and accurate depth information can be obtained.
In one embodiment, the sum of the fields of view of the structured light projectors 222a, 224a, 222b, and 224b equals 360 degrees, and the sum of the fields of view of the structured light cameras 24a and 24b equals 360 degrees. Specifically, the fields of view of the structured light projectors 222a, 224a, 222b, and 224b may each be 90 degrees, and the fields of view of the structured light cameras 24a and 24b may each be 180 degrees, with the fields of view of the four structured light projectors 22 not overlapping one another and the fields of view of the two structured light cameras 24 not overlapping each other, so as to obtain a panoramic depth image of 360 degrees or approximately 360 degrees. Alternatively, the fields of view of the structured light projectors 222a and 224a may be 80 degrees while the fields of view of the structured light projectors 222b and 224b are 100 degrees, with the fields of view of the structured light cameras 24a and 24b both 180 degrees, etc.; the four structured light projectors 22 complement one another in angle, and the two structured light cameras 24 complement each other in angle, to obtain a panoramic depth image of 360 degrees or approximately 360 degrees.
In one embodiment, the sum of the fields of view of the structured light projectors 222a, 224a, 222b, and 224b is greater than 360 degrees, the sum of the fields of view of the structured light cameras 24a and 24b is greater than 360 degrees, the fields of view of at least two of the four structured light projectors 22 overlap each other, and the fields of view of the two structured light cameras 24 overlap each other. Specifically, the fields of view of the structured light projectors 222a, 224a, 222b, and 224b may each be 100 degrees, with the fields of view of the four structured light projectors 22 pairwise overlapping; the fields of view of the structured light cameras 24a and 24b may each be 200 degrees, with the fields of view of the two structured light cameras 24 overlapping each other. When obtaining the panoramic depth image, the overlapping edge portions of the two initial depth images can first be identified, and the two initial depth images then stitched into a 360-degree panoramic depth image. Since the fields of view of the four structured light projectors 22 pairwise overlap and the fields of view of the two structured light cameras 24 overlap, it can be ensured that the acquired panoramic depth image covers the full 360 degrees of depth information outside the body 10.
Of course, the specific values of the fields of view of each structured light projector 22 and each structured light camera 24 are not limited to the examples above. Those skilled in the art may set the field of view of the structured light projector 22 to any value from 80 to 120 degrees and the field of view of the structured light camera 24 to any value from 180 to 200 degrees as needed, for example: the field of view of the structured light projector 22 may be 80, 82, 84, 86, 90, 92, 94, 96, 98, 104, or 120 degrees, or any value in between; the field of view of the structured light camera 24 may be 180, 181, 182, 187, 188, 193.2, 195, or 200 degrees, or any value in between. No limitation is imposed here.
Referring again to Fig. 1 and Fig. 2, under normal circumstances the laser patterns projected by the adjacent structured light projectors 22 of two adjacent structured light assemblies 20 easily interfere with each other, for example when the fields of view of the structured light projectors 22 of two adjacent structured light assemblies 20 overlap. Therefore, in order to improve the accuracy of the acquired depth information, the laser patterns projected by the adjacent structured light projectors 22 of two adjacent structured light assemblies 20 may be made different, so that the initial depth images can be distinguished and computed. Specifically, suppose the laser pattern projected by the structured light projector 222a of the first orientation is pattern1, the laser pattern projected by the structured light projector 224a of the first orientation is pattern2, the laser pattern projected by the structured light projector 222b of the third orientation is pattern3, and the laser pattern projected by the structured light projector 224b of the third orientation is pattern4. Then it is only necessary that pattern1 differ from pattern3 and that pattern2 differ from pattern4. Here pattern1 and pattern2 may be the same or different (since the structured light projectors 222a and 224a are located at the same orientation and belong to the same structured light assembly 20a, having pattern1 identical to pattern2 and mutually overlapping has little effect on the acquisition of depth information; hence pattern1 and pattern2 may be the same or different); likewise, pattern3 and pattern4 may be the same or different, pattern1 and pattern4 may be the same or different, and pattern2 and pattern3 may be the same or different. Preferably, the laser patterns projected by all the structured light projectors 22 are different, to further improve the accuracy of the acquired depth information. In other words, when pattern1, pattern2, pattern3, and pattern4 are all different, the laser patterns projected by the plurality of structured light projectors 22 do not interfere with one another, and the computation of the initial depth images is easiest.
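The distinctness requirement above can be stated compactly. The sketch below (a hypothetical helper, not part of the patent) checks a pattern assignment against the necessary constraint and the preferred all-distinct condition.

```python
def patterns_valid(pattern1, pattern2, pattern3, pattern4):
    """Necessary condition from the text: the adjacent assemblies'
    projectors must use different patterns (pattern1 != pattern3 and
    pattern2 != pattern4); patterns within one assembly may match."""
    return pattern1 != pattern3 and pattern2 != pattern4

def patterns_preferred(pattern1, pattern2, pattern3, pattern4):
    """Preferred condition: all four patterns mutually distinct,
    so no projector interferes with any other."""
    return len({pattern1, pattern2, pattern3, pattern4}) == 4
```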
Referring to Fig. 3, each structured light projector 22 includes a light source 222, a collimating element 224, and a diffractive optical element (DOE) 226. The collimating element 224 and the diffractive optical element 226 are arranged in sequence in the optical path of the light source 222. The light source 222 emits laser light (for example infrared laser light, in which case the structured light camera 24 is an infrared camera), the collimating element 224 collimates the laser light emitted by the light source 222, and the diffractive optical element 226 diffracts the laser light collimated by the collimating element 224 to form the laser pattern for projection.
Further, referring also to Fig. 4, the light source 222 includes a substrate 2222 and a plurality of light-emitting elements 2224 arranged on the substrate 2222. The substrate 2222 may be a semiconductor substrate, and the plurality of light-emitting elements 2224 may be mounted directly on the substrate 2222; alternatively, one or more grooves may first be formed in the semiconductor substrate 2222 using wafer-level optics technology, and the plurality of light-emitting elements 2224 then placed in the grooves. The light-emitting elements 2224 include point-source light-emitting devices, such as vertical-cavity surface-emitting lasers (VCSELs).
The collimating element 224 includes one or more lenses arranged coaxially in sequence in the light-emitting optical path of the light source 222. The lenses may be made of glass, which avoids the temperature-drift phenomenon that can arise when the ambient temperature changes; alternatively, the lenses may be made of plastic, which is lower in cost and convenient for mass production. The surface profile of each lens may be any one of aspheric, spherical, Fresnel, or binary optical.
Referring also to Fig. 5, the diffractive optical element 226 includes a diffraction body 2262 and a diffraction structure 2264 formed on the diffraction body 2262. The diffraction body 2262 includes opposed diffraction entrance and exit faces, and the diffraction structure 2264 may be formed on the diffraction entrance face, or on the diffraction exit face, or on both the diffraction entrance face and the diffraction exit face.
In order to make the laser patterns projected by two structured light projectors 22 different, the following implementations may be used:
In one approach, at least one of the arrangement, shape, or size of the plurality of light-emitting elements 2224 differs between different structured light projectors 22, so that the laser patterns projected by different structured light projectors 22 differ.
Specifically, referring to Fig. 4, at least one of the arrangement, shape, or size of the light-emitting elements 2224 differs between the structured light projectors 222a and 222b, and at least one of the arrangement, shape, or size of the light-emitting elements 2224 also differs between the structured light projectors 224a and 224b, so that the laser patterns projected by the adjacent structured light projectors 22 of two adjacent structured light assemblies 20 differ. Further, at least one of the arrangement, shape, or size of the light-emitting elements 2224 may differ among all four structured light projectors 222a, 224a, 222b, and 224b, so that the laser pattern projected by each structured light projector 22 differs. For example, referring to Fig. 4, part (a) shows the structure of the light source 222 of the structured light projector 222a, part (b) shows that of the structured light projector 224a, part (c) shows that of the structured light projector 222b, and part (d) shows that of the structured light projector 224b. The shapes of the light-emitting elements 2224 of the structured light projectors 222a and 224a differ; the sizes of the light-emitting elements 2224 of the structured light projectors 222a and 222b differ; the shapes and sizes of the light-emitting elements 2224 of the structured light projectors 222b and 224a both differ; and the arrangement, shape, and size of the light-emitting elements 2224 of the structured light projectors 222b and 224b all differ. The laser patterns projected by the different structured light projectors 22 are then different.
In another approach, the diffraction structures 2264 differ between different structured light projectors 22, so that the laser patterns projected by different structured light projectors 22 differ.
Specifically, referring to Fig. 5, the diffraction structures 2264 of the structured light projectors 222a and 222b differ, and the diffraction structures 2264 of the structured light projectors 224a and 224b also differ, so that the laser patterns projected by the adjacent structured light projectors 22 of two adjacent structured light assemblies 20 differ. Further, the diffraction structures 2264 of all four structured light projectors 222a, 224a, 222b, and 224b may differ, so that the laser pattern projected by each structured light projector 22 differs.
Referring also to Fig. 6 and Fig. 7, the differences between diffraction structures 2264 may include at least one of: the depth D of the steps formed by the diffraction structure 2264, the length L of the steps, the width W of the steps, or the number of steps. Of course, the diffraction structures 2264 may also differ in other respects; it is only necessary that the difference in diffraction structures 2264 make the laser patterns projected by the structured light projectors 22 differ.
It should be pointed out that, besides the two approaches above, those skilled in the art may use other means to make the laser patterns projected by the structured light projectors 22 differ, for example by adding masks with different light-transmitting regions between the light source 222 and the collimating element 224, etc.; no limitation is imposed here.
When the laser pattern projected by each structured light projector 22 differs, the reference images corresponding to the structured light assemblies 20 may be calibrated independently or jointly. With independent calibration, the calibration of the structured light assembly 20a and that of the structured light assembly 20b can be carried out separately, without the structured light assemblies 20a and 20b being mounted on the body 10 at the same time for calibration. In this case, the laser patterns projected by the structured light projectors 222a and 222b do not overlap, the laser patterns projected by the structured light projectors 224a and 224b do not overlap (since the structured light projectors 222a and 224a are located at the same orientation and belong to the same structured light assembly 20a, their projected laser patterns may overlap or not, and they may project laser patterns simultaneously; likewise, the laser patterns projected by the structured light projectors 222b and 224b may overlap or not, and they may also project laser patterns simultaneously), and the reference images obtained by the structured light cameras 24a and 24b do not affect each other. With joint calibration, the structured light assemblies 20a and 20b are both mounted on the body 10 and calibrated at the same time. In this case, the structured light projectors 222a, 224a, 222b, and 224b may project laser patterns simultaneously; the reference image obtained by the structured light camera 24a contains not only all of the laser patterns projected by the structured light projectors 222a and 224a but also a small portion of the laser patterns projected by the structured light projectors 222b and 224b, and the reference image obtained by the structured light camera 24b contains not only all of the laser patterns projected by the structured light projectors 222b and 224b but also a small portion of the laser patterns projected by the structured light projectors 222a and 224a. For the reference image obtained by the structured light camera 24a, since the laser pattern projected by the structured light projector 222a differs from that projected by the structured light projector 222b, and the laser pattern projected by the structured light projector 224a differs from that projected by the structured light projector 224b, the laser patterns projected by the structured light projectors 222a, 224a, 222b, and 224b can be distinguished in the reference image according to their differences; the portions of the laser patterns projected by the structured light projectors 222b and 224b are filtered out, and only the remaining laser patterns projected by the structured light projectors 222a and 224a are used as the final reference image. The reference image obtained by the structured light camera 24b can be processed correspondingly. For example, if the shapes of the laser patterns differ, and specifically if the speckle shapes differ, the speckles of the laser patterns projected by different structured light projectors 22 can be distinguished by speckle shape; if the laser patterns differ in size, the speckles of the laser patterns projected by different structured light projectors 22 can be distinguished by speckle size, etc. In actual use, since the four structured light projectors 22 project laser patterns simultaneously, after the two structured light cameras 24 collect the laser patterns, the microprocessor 40 (shown in Fig. 2) likewise needs to filter out the speckles of the laser patterns projected by the other structured light projectors 22, retain only the speckles of the laser patterns projected by the two corresponding structured light projectors 22, and compute the depth information based on these remaining speckles and the reference image.
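As an illustration of this filtering step, here is a minimal sketch assuming speckles have already been detected as blobs with a measurable size; the patent distinguishes speckles by shape or size but prescribes no algorithm, so the Speckle type and the tolerance are illustrative assumptions. Only speckles matching the two projectors of the camera's own assembly are kept before depth matching.

```python
from dataclasses import dataclass

@dataclass
class Speckle:
    x: float
    y: float
    size: float   # e.g. blob area in pixels (assumed detection output)

def filter_speckles(speckles, own_sizes, tol=0.15):
    """Keep only speckles whose size is within tol (relative) of a size
    used by the camera's own two projectors; speckles from the other
    assembly's projectors are discarded before depth computation."""
    kept = []
    for s in speckles:
        if any(abs(s.size - ref) / ref <= tol for ref in own_sizes):
            kept.append(s)
    return kept

# Usage: camera 24a keeps speckles matching projectors 222a/224a only.
speckles_24a = [Speckle(10, 12, 4.1), Speckle(40, 7, 9.0), Speckle(22, 30, 3.9)]
kept = filter_speckles(speckles_24a, own_sizes=[4.0])   # the 9.0-pixel blob is dropped
```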
When the laser patterns projected by the structured light projectors 222a and 222b are the same and the laser patterns projected by the structured light projectors 224a and 224b are the same, interference may arise between the structured light assemblies 20a and 20b. In this case, the reference image corresponding to the structured light assembly 20a and the reference image corresponding to the structured light assembly 20b must be calibrated jointly. It will be appreciated that, since the fields of view of the structured light projectors 222a and 222b may overlap and the fields of view of the structured light projectors 224a and 224b may overlap, the structured light camera 24a can collect not only all of the laser patterns projected by the structured light projectors 222a and 224a but also a small portion of the laser patterns projected by the structured light projectors 222b and 224b; likewise, the structured light camera 24b can collect not only all of the laser patterns projected by the structured light projectors 222b and 224b but also a small portion of the laser patterns projected by the structured light projectors 222a and 224a. For the structured light assembly 20a, the extra speckles in the laser patterns collected by the structured light camera 24a, which were projected by the structured light projectors 222b and 224b, can be used in the computation of depth information; accordingly, the reference image of the structured light assembly 20a should also contain the speckles projected by the structured light projectors 222b and 224b. For the structured light assembly 20b, the extra speckles in the laser patterns collected by the structured light camera 24b, which were projected by the structured light projectors 222a and 224a, can likewise be used in the computation of depth information; accordingly, the reference image of the structured light assembly 20b should also contain the speckles projected by the structured light projectors 222a and 224a. Therefore, when calibrating the reference images of the structured light assemblies 20a and 20b, the structured light projectors 222a, 224a, 222b, and 224b should all be mounted on the body 10 and project laser patterns simultaneously, so that the structured light camera 24a collects not only all of the laser patterns projected by the structured light projectors 222a and 224a but also the small portion of the laser patterns projected by the structured light projectors 222b and 224b, and the structured light camera 24b collects not only all of the laser patterns projected by the structured light projectors 222b and 224b but also the small portion of the laser patterns projected by the structured light projectors 222a and 224a. The speckles of the laser patterns used to compute depth information are thereby augmented, which helps increase the quantity and accuracy of the depth information.
Referring to Fig. 1 and Fig. 2, the camera assemblies 30 are arranged on the body 10. The number of camera assemblies 30 may be plural, with each camera assembly 30 corresponding to one structured light assembly 20. For example, when the number of structured light assemblies 20 is two, the number of camera assemblies 30 is also two, and the two camera assemblies 30 are arranged at the first orientation and the third orientation, respectively.
The plurality of camera assemblies 30 are connected to the application processor 50. Each camera assembly 30 collects a scene image of the target subject and outputs it to the application processor 50. In this embodiment, the two camera assemblies 30 respectively collect a scene image of the target subject at the first orientation and a scene image of the target subject at the third orientation, and output them to the application processor 50. It will be appreciated that the field of view of each camera assembly 30 is identical or approximately identical to that of the structured light camera 24 of the corresponding structured light assembly 20, so that each scene image can be better matched with the corresponding initial depth image.
The camera assembly 30 may be a visible light camera 32 or an infrared camera 34. When the camera assembly 30 is a visible light camera 32, the scene image is a visible light image; when the camera assembly 30 is an infrared camera 34, the scene image is an infrared light image.
Referring to Fig. 2, the microprocessor 40 may be a processing chip. The number of microprocessors 40 may be plural, with each microprocessor 40 corresponding to one structured light assembly 20. For example, in this embodiment the number of structured light assemblies 20 is two, and the number of microprocessors 40 is also two. Each microprocessor 40 is connected to both the structured light projectors 22 and the structured light camera 24 of the corresponding structured light assembly 20. Each microprocessor 40 can drive the corresponding structured light projectors 22 to project laser light via a drive circuit, and the control of the multiple microprocessors 40 makes the four structured light projectors 22 project laser light simultaneously. Each microprocessor 40 also provides the clock information for collecting the laser pattern to the corresponding structured light camera 24 so that the structured light camera 24 exposes, and the control of the two microprocessors 40 makes the two structured light cameras 24 expose simultaneously. Each microprocessor 40 further processes the laser pattern collected by the corresponding structured light camera 24 to obtain an initial depth image. For example, the two microprocessors 40 respectively process the laser pattern collected by the structured light camera 24a to obtain an initial depth image P1, and process the laser pattern collected by the structured light camera 24b to obtain an initial depth image P2 (as shown in the upper part of Fig. 8). Each microprocessor 40 may also perform tiling, distortion correction, self-calibration, and similar processing on the initial depth images to improve their quality.
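The patent does not spell out how a microprocessor 40 turns a collected laser pattern into an initial depth image. A common structured light formulation, shown below as a hedged sketch, recovers depth from the disparity of each speckle against the calibrated reference image via z = f·b/d; the focal length f, baseline b, and the prior speckle matching step are all illustrative assumptions.

```python
# Hedged sketch of a standard structured-light depth computation; the patent
# itself does not specify the algorithm. f_px and baseline_m are assumed.
def depth_from_disparity(matches, f_px=800.0, baseline_m=0.05):
    """matches: list of (x_ref, x_obs) horizontal speckle positions, where
    x_ref comes from the calibrated reference image and x_obs from the
    collected laser pattern. Returns one depth value (metres) per speckle."""
    depths = []
    for x_ref, x_obs in matches:
        d = x_ref - x_obs                          # disparity in pixels
        if d > 0:
            depths.append(f_px * baseline_m / d)   # z = f * b / d
        else:
            depths.append(float("inf"))            # at/beyond calibration plane
    return depths

# Example: a larger disparity means a closer target subject.
print(depth_from_disparity([(120.0, 100.0), (120.0, 115.0)]))   # -> [2.0, 8.0]
```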
It will be appreciated that the number of microprocessors 40 may also be one, in which case the single microprocessor 40 must process the laser patterns collected by the two structured light cameras 24 in sequence to obtain the initial depth images. Compared with one microprocessor 40, two microprocessors 40 process faster and with lower latency.
Both microprocessors 40 are connected to the application processor 50 to transmit the initial depth images to the application processor 50. In one embodiment, the microprocessor 40 may be connected to the application processor 50 via a Mobile Industry Processor Interface (MIPI); specifically, the microprocessor 40 is connected via MIPI to the Trusted Execution Environment (TEE) of the application processor 50, so that the data (initial depth images) in the microprocessor 40 are transmitted directly into the TEE, improving the security of the information inside the electronic device 100. The code and memory regions in the TEE are controlled by an access control unit and cannot be accessed by programs in the Rich Execution Environment (REE); both the TEE and the REE may be formed in the application processor 50.
The application processor 50 may serve as the system of the electronic device 100. The application processor 50 can reset the microprocessor 40, wake the microprocessor 40, debug the microprocessor 40, and so on. The application processor 50 may also be connected to multiple electronic components of the electronic device 100 and control them to run in predetermined modes; for example, the application processor 50 connects to the visible light camera 32 and the infrared camera 34 to control them to capture visible light images and infrared light images and to process those images; when the electronic device 100 includes a display screen, the application processor 50 can control the display screen to display predetermined content; the application processor 50 can also control the antenna of the electronic device 100 to send or receive predetermined data, etc.
Referring to Fig. 8, in one embodiment the application processor 50 is configured to synthesize the two initial depth images obtained by the two microprocessors 40 into one frame of panoramic depth image according to the fields of view of the structured light cameras 24.
Specifically, referring also to Fig. 1, a rectangular coordinate system XOY is established with the center of the body 10 as the origin O, the horizontal axis as the X axis, and the vertical axis as the Y axis. In the coordinate system XOY, the field of view of the structured light camera 24a spans from 190 degrees to 350 degrees (measured clockwise; the same below), the field of view of the structured light projector 222a spans from 190 degrees to 90 degrees, the field of view of the structured light projector 224a spans from 90 degrees to 350 degrees, the field of view of the structured light camera 24b spans from 10 degrees to 170 degrees, the field of view of the structured light projector 222b spans from 270 degrees to 170 degrees, and the field of view of the structured light projector 224b spans from 10 degrees to 270 degrees. The application processor 50 then stitches the initial depth image P1 and the initial depth image P2 into one frame of 360-degree panoramic depth image P12 according to the fields of view of the two structured light cameras 24, for subsequent use of the depth information.
In the initial depth image that each microprocessor 40 obtains by processing the laser pattern collected by the corresponding structured light camera 24, the depth information of each pixel is the distance between the target subject at the corresponding orientation and the structured light camera 24 at that orientation. That is, the depth information of each pixel in the initial depth image P1 is the distance between the target subject at the first orientation and the structured light camera 24a, and the depth information of each pixel in the initial depth image P2 is the distance between the target subject at the third orientation and the structured light camera 24b. In the process of stitching the multiple initial depth images of multiple orientations into one frame of 360-degree panoramic depth image, the depth information of each pixel in each initial depth image must first be converted into unified depth information, which represents the distance of each target subject at each orientation from a common reference position. After the depth information is converted into unified depth information, the application processor 50 can conveniently stitch the initial depth images according to the unified depth information.
Specifically, a reference coordinate system is selected; the reference coordinate system may be the image coordinate system of the structured light camera 24 of one of the orientations, or another coordinate system may be selected as the reference coordinate system. Taking Fig. 9 as an example, the coordinate system x_o-y_o-z_o is the reference coordinate system, the coordinate system x_a-y_a-z_a is the image coordinate system of the structured light camera 24a, and the coordinate system x_b-y_b-z_b is the image coordinate system of the structured light camera 24b. The application processor 50 converts the depth information of each pixel in the initial depth image P1 into unified depth information according to the rotation matrix and translation matrix between the coordinate system x_a-y_a-z_a and the reference coordinate system x_o-y_o-z_o, and converts the depth information of each pixel in the initial depth image P2 into unified depth information according to the rotation matrix and translation matrix between the coordinate system x_b-y_b-z_b and the reference coordinate system x_o-y_o-z_o.
After the depth information conversion is completed, the multiple initial depth images all lie in one unified reference coordinate system, and each pixel of each initial depth image has a corresponding coordinate (x_o, y_o, z_o); the stitching of the initial depth images can then be done by coordinate matching. For example, if a pixel P_a in the initial depth image P1 has coordinates (x_o1, y_o1, z_o1) and a pixel P_b in the initial depth image P2 also has coordinates (x_o1, y_o1, z_o1), then, since P_a and P_b have the same coordinate values in the current reference coordinate system, the pixels P_a and P_b are in fact the same point, and when the initial depth images P1 and P2 are stitched, the pixel P_a must coincide with the pixel P_b. In this way, the application processor 50 can stitch the multiple initial depth images through the matching relationship of coordinates and obtain the 360-degree panoramic depth image.
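A minimal sketch of this conversion and coordinate-matching step follows, using NumPy. The rotation matrices, translation vectors, and matching tolerance are illustrative assumptions; the patent specifies only that per-camera points are mapped into the reference frame (p_o = R·p + t) and merged where coordinates coincide.

```python
import numpy as np

def to_reference(points_cam, R, t):
    """Map Nx3 points from a camera's image coordinate system into the
    reference coordinate system x_o-y_o-z_o: p_o = R @ p + t."""
    return points_cam @ R.T + t

def stitch(points_a, points_b, tol=1e-3):
    """Merge two unified point sets; points of P1 and P2 whose reference
    coordinates agree within tol are treated as the same scene point."""
    merged = list(points_a)
    for p in points_b:
        if not any(np.linalg.norm(p - q) < tol for q in merged):
            merged.append(p)
    return np.array(merged)

# Illustrative extrinsics: camera 24b faces opposite camera 24a
# (a 180-degree rotation about the y axis) and is offset along x.
R_a, t_a = np.eye(3), np.zeros(3)
R_b = np.array([[-1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, -1.0]])
t_b = np.array([0.1, 0.0, 0.0])

p1 = to_reference(np.array([[0.0, 0.0, 2.0]]), R_a, t_a)
p2 = to_reference(np.array([[0.1, 0.0, -2.0]]), R_b, t_b)
panorama = stitch(p1, p2)   # the overlapping point is counted once
```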
It should be noted that stitching the initial depth images based on the matching relationship of coordinates requires the resolution of the initial depth images to exceed a preset resolution. It will be appreciated that if the resolution of the initial depth images is low, the accuracy of the coordinates (x_o, y_o, z_o) is correspondingly low; in that case, matching directly by coordinates may encounter the problem that the points P_a and P_b do not actually coincide but differ by an offset whose value exceeds the error bound. If the resolution of the images is high, the accuracy of the coordinates (x_o, y_o, z_o) is correspondingly high; in that case, when matching directly by coordinates, even if the points P_a and P_b do not coincide exactly but differ by an offset, the value of the offset remains below the error bound, i.e., within the permitted error range, and the stitching of the initial depth images is not unduly affected.
It will be appreciated that the subsequent embodiments may use the above approach to stitch or synthesize two or more initial depth images; this is not described again for each case.
The application processor 50 may also synthesize the two initial depth images with the two corresponding visible light images into a three-dimensional scene image for display to the user. For example, if the two visible light images are a visible light image V1 and a visible light image V2, the application processor 50 synthesizes the initial depth image P1 with the visible light image V1 and the initial depth image P2 with the visible light image V2, and then stitches the two synthesized images to obtain one frame of 360-degree three-dimensional scene image. Alternatively, the application processor 50 first stitches the initial depth images P1 and P2 into one frame of 360-degree panoramic depth image and stitches the visible light images V1 and V2 into one frame of 360-degree panoramic visible light image, and then synthesizes the panoramic depth image and the panoramic visible light image into a 360-degree three-dimensional scene image.
Referring to Fig. 10, in one embodiment the application processor 50 is configured to identify the target subject from the two initial depth images obtained by the two microprocessors 40 and the two scene images collected by the two camera assemblies 30.
Specifically, when the scene images are infrared light images, the two infrared light images may be an infrared light image I1 and an infrared light image I2. The application processor 50 identifies the target subject at the first orientation from the initial depth image P1 and the infrared light image I1, and identifies the target subject at the third orientation from the initial depth image P2 and the infrared light image I2. When the scene images are visible light images, the two visible light images are a visible light image V1 and a visible light image V2. The application processor 50 identifies the target subject at the first orientation from the initial depth image P1 and the visible light image V1, and identifies the target subject at the third orientation from the initial depth image P2 and the visible light image V2.
When identifying the target subject means performing face recognition, the application processor 50 achieves higher face recognition accuracy when using infrared light images as the scene images. The process by which the application processor 50 performs face recognition from the initial depth image and the infrared light image may be as follows:
First, face detection is performed on the infrared light image to determine a target face region. Since the infrared light image contains the detail information of the scene, face detection can be performed on the infrared light image after it is acquired, to detect whether the infrared light image contains a face. If the infrared light image contains a face, the target face region where the face is located is extracted from the infrared light image.
Then, liveness detection is performed on the target face region according to the initial depth image. Since each initial depth image corresponds to its infrared light image and contains the depth information of the corresponding infrared light image, the depth information corresponding to the target face region can be obtained from the initial depth image. Further, since a live face is three-dimensional while a face displayed in, for example, a picture or on a screen is planar, whether the target face region is three-dimensional or planar can be judged from the acquired depth information of the target face region, thereby performing liveness detection on the target face region.
If liveness detection succeeds, the target face attribute parameters corresponding to the target face region are obtained, and face matching is performed on the target face region in the infrared light image according to the target face attribute parameters, producing a face matching result. The target face attribute parameters are parameters that can characterize the attributes of the target face; the target face can be identified and matched according to them. The target face attribute parameters include, but are not limited to, face deflection angle, face luminance parameters, facial feature parameters, skin quality parameters, geometric feature parameters, etc. The electronic device 100 may store face attribute parameters for matching in advance. After the target face attribute parameters are obtained, they are compared with the pre-stored face attribute parameters; if the target face attribute parameters match the pre-stored face attribute parameters, face recognition passes.
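The three-stage flow just described (detection on the infrared image, depth-based liveness check, attribute matching) can be summarized as a hedged sketch; every function body below is a placeholder, since the patent names the stages but not their algorithms, and all thresholds and helper internals are illustrative assumptions.

```python
import numpy as np

def detect_face(ir_image):
    """Face detection on the infrared image; returns a bounding box
    (x, y, w, h) or None. Placeholder for any face detector."""
    return (40, 30, 64, 64)

def is_live(depth_image, box, flatness_mm=5.0):
    """Liveness check: a live face is three-dimensional, so the depth
    spread inside the face region should exceed a flatness threshold;
    a photo or screen yields a nearly planar region."""
    x, y, w, h = box
    region = depth_image[y:y + h, x:x + w]
    return float(region.max() - region.min()) > flatness_mm

def extract_attributes(ir_image, box):
    """Toy attribute vector standing in for deflection angle, luminance,
    geometric features, etc."""
    x, y, w, h = box
    face = ir_image[y:y + h, x:x + w]
    return np.array([face.mean(), face.std()])

def similarity(a, b):
    return 1.0 / (1.0 + float(np.linalg.norm(a - b)))

def match_face(ir_image, box, stored_params, threshold=0.8):
    """Compare target face attribute parameters against pre-stored ones."""
    return similarity(extract_attributes(ir_image, box), stored_params) >= threshold

def recognize(ir_image, depth_image, stored_params):
    box = detect_face(ir_image)
    if box is None or not is_live(depth_image, box):
        return False
    return match_face(ir_image, box, stored_params)
```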
It should be pointed out that the specific process by which the application processor 50 performs face recognition from the initial depth image and the infrared light image is not limited to the above; for example, the application processor 50 may also use the initial depth image to assist in detecting the facial contour, to improve face recognition precision, etc. The process by which the application processor 50 performs face recognition from the initial depth image and the visible light image is similar to the process using the initial depth image and the infrared light image, and is not described separately here.
Referring to Fig. 10 and Fig. 11, the application processor 50 is further configured so that, when identifying the target subject from the two initial depth images and the two scene images fails, it synthesizes the two initial depth images obtained by the two microprocessors 40 into one frame of merged depth image according to the fields of view of the structured light cameras 24, synthesizes the two scene images collected by the two camera assemblies 30 into one frame of merged scene image, and identifies the target subject from the merged depth image and the merged scene image.
Specifically, in the embodiment shown in Fig. 10 and Fig. 11, because the field of view of the structured light camera 24 of each structured light assembly 20 is limited, there may be situations in which half of a face lies in the initial depth image P1 and the other half lies in the initial depth image P2. The application processor 50 therefore synthesizes the initial depth images P1 and P2 into one frame of merged depth image P12, synthesizes the infrared light images I1 and I2 (or the visible light images V1 and V2) into a corresponding frame of merged scene image I12 (or V12), and then identifies the target subject again from the merged depth image P12 and the merged scene image I12 (or V12).
Referring to Fig. 12 and Fig. 13, in one embodiment the application processor 50 is configured to judge the change in distance between the target subject and the electronic device 100 from multiple initial depth images.
Specifically, each structured light camera 24 can collect laser patterns multiple times. For example, at a first moment the structured light cameras 24a and 24b collect laser patterns, and the two microprocessors 40 correspondingly obtain an initial depth image P11 and an initial depth image P21; at a second moment the structured light cameras 24a and 24b collect laser patterns, and the two microprocessors 40 correspondingly obtain an initial depth image P12 and an initial depth image P22. The application processor 50 then judges the change in distance between the target subject at the first orientation and the electronic device 100 from the initial depth images P11 and P12, and judges the change in distance between the target subject at the third orientation and the electronic device 100 from the initial depth images P21 and P22.
It will be appreciated that, since the initial depth images contain the depth information of the target subject, the application processor 50 can judge the change in distance between the target subject at the corresponding orientation and the electronic device 100 from the depth information at multiple consecutive moments.
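A hedged sketch of such a distance-change judgment follows; reducing each depth frame to its median distance and the change threshold are illustrative choices, as the patent fixes no statistic.

```python
import numpy as np

def distance_change(depth_t1, depth_t2, min_change_m=0.05):
    """Judge how the target subject's distance changed between two
    consecutive initial depth images of the same orientation.
    Using the median depth of each frame is an assumed choice."""
    d1 = float(np.median(depth_t1))
    d2 = float(np.median(depth_t2))
    if d2 < d1 - min_change_m:
        return "approaching"   # distance reduced -> raise the judgment frame rate
    if d2 > d1 + min_change_m:
        return "receding"
    return "steady"

# Example with initial depth images P11 and P12 of the first orientation.
p11 = np.full((4, 4), 2.0)   # target ~2.0 m away at the first moment
p12 = np.full((4, 4), 1.6)   # ~1.6 m at the second moment
print(distance_change(p11, p12))   # -> "approaching"
```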
Referring to Figure 14, the application processor 50 is further configured to, when judging the distance change from multiple initial depth images fails, synthesize the two initial depth images obtained by the two microprocessors 40 into one merged depth image according to the field angles of the structured light cameras 24. The application processor 50 performs this synthesis step continuously to obtain multiple consecutive merged depth images, and judges the distance change from the multiple merged depth images.
Specifically, in the embodiment shown in Figure 14, because the field angle of the structured light camera 24 of each structured light assembly 20 is limited, half of a face may fall in initial depth image P11 while the other half falls in initial depth image P21. The application processor 50 synthesizes the initial depth images P11 and P21 of the first moment into one merged depth image P121, correspondingly synthesizes the initial depth images P12 and P22 of the second moment into one merged depth image P122, and then judges the distance change again from the two merged depth images P121 and P122.
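Reusing the merge_depth and distance_change helpers from the sketches above, this fallback can be expressed in a few lines; the overlap value remains an assumption:

```python
def change_on_merged_frames(p11, p21, p12, p22, box, overlap=64):
    """Merge each moment's pair of initial depth images, then judge the
    distance change on the merged frames."""
    merged_t1 = merge_depth(p11, p21, overlap)  # first moment, like P121
    merged_t2 = merge_depth(p12, p22, overlap)  # second moment, like P122
    return distance_change(merged_t1, merged_t2, box)
```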
Referring to Figure 13, when it is judged from multiple initial depth images that the distance is decreasing, or when it is judged from multiple merged depth images that the distance is decreasing, the application processor 50 may increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by at least one microprocessor 40.
It will be appreciated that when the distance between the target subject and the electronic equipment 100 decreases, the electronic equipment 100 cannot judge in advance whether the decrease poses a risk. The application processor 50 therefore may increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by at least one microprocessor 40, so as to follow the distance change more closely. Specifically, when the distance corresponding to a certain orientation is judged to be decreasing, the application processor 50 may increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by the microprocessor 40 of that orientation.
For example, at a first moment the two microprocessors 40 obtain initial depth images P11 and P21, respectively; at a second moment they obtain initial depth images P12 and P22; at a third moment they obtain initial depth images P13 and P23; and at a fourth moment they obtain initial depth images P14 and P24.
Under normal circumstances, the application processor 50 selects initial depth images P11 and P14 to judge the change in distance between the target subject in the first orientation and the electronic equipment 100, and selects initial depth images P21 and P24 to judge the change in distance between the target subject in the third orientation and the electronic equipment 100. In each orientation, the application processor 50 selects initial depth images at a rate of one frame for every two frames skipped, i.e., one frame out of every three.
When it is judged from initial depth images P11 and P14 that the distance corresponding to the first orientation is decreasing, the application processor 50 may instead select initial depth images P11 and P13 to judge the change in distance between the target subject in the first orientation and the electronic equipment 100. The rate at which the application processor 50 selects initial depth images for the first orientation thus becomes one frame for every one frame skipped, i.e., one frame out of every two. The rate for the other orientation remains unchanged, i.e., the application processor 50 still selects initial depth images P21 and P24 to judge the distance change.
When it is judged from initial depth images P11 and P14 that the distance corresponding to the first orientation is decreasing, and it is simultaneously judged from initial depth images P21 and P24 that the distance corresponding to the third orientation is decreasing, the application processor 50 may select initial depth images P11 and P13 to judge the change in distance between the target subject in the first orientation and the electronic equipment 100, and select initial depth images P21 and P23 to judge the change in distance between the target subject in the third orientation and the electronic equipment 100. The rate at which the application processor 50 selects initial depth images for both the first orientation and the third orientation becomes one frame out of every two.
Of course, when the distance corresponding to any one orientation is judged to be decreasing, the application processor 50 may also increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by every microprocessor 40. That is, when it is judged from initial depth images P11 and P14 that the distance between the target subject in the first orientation and the electronic equipment 100 is decreasing, the application processor 50 may select initial depth images P11 and P13 to judge the change in distance between the target subject in the first orientation and the electronic equipment 100, and select initial depth images P21 and P23 to judge the change in distance between the target subject in the third orientation and the electronic equipment 100.
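The selection policy in this example reduces to choosing a per-orientation sampling stride. In the minimal Python sketch below, the strides (one frame out of every three normally, one out of every two once the distance is judged to be decreasing) follow the text, while the data structures are assumptions:

```python
def select_frames(stream: list, stride: int) -> list:
    """Pick one initial depth image out of every `stride` frames."""
    return stream[::stride]

class OrientationMonitor:
    """Tracks the selection stride for one orientation's depth stream."""
    def __init__(self) -> None:
        self.stride = 3            # normal rate: one frame out of every three

    def on_distance_decreasing(self) -> None:
        self.stride = 2            # closer watch: one frame out of every two

# Example mirroring the text: P11..P14 arrive for the first orientation.
first = OrientationMonitor()
frames = ["P11", "P12", "P13", "P14"]
assert select_frames(frames, first.stride) == ["P11", "P14"]
first.on_distance_decreasing()
assert select_frames(frames, first.stride) == ["P11", "P13"]
```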
When the distance decreases, the application processor 50 may also judge the distance change in combination with the visible light images or infrared images. Specifically, the application processor 50 first identifies the target subject from the visible light images or infrared images, and then judges the distance change from the initial depth images of multiple moments, so that the electronic equipment 100 can perform different operations for different target subjects at different distances. Alternatively, when the distance decreases, the microprocessor 40 may control the corresponding structured light projector 22 to project laser, and the corresponding structured light camera 24 to expose, at a higher frequency.
It should be noted that the electronic equipment 100 of this embodiment may also be used as an external terminal, fixedly or detachably mounted on the outside of a portable electronic device such as a mobile phone, tablet computer, or laptop computer, or fixedly mounted on a movable object such as a vehicle body (as shown in Figures 11 and 12), an unmanned aerial vehicle body, a robot body, or a ship body. In use, when the electronic equipment 100 synthesizes one panoramic depth image from multiple initial depth images as described above, the panoramic depth image can be used for three-dimensional modeling, simultaneous localization and mapping (SLAM), and augmented reality display. When the electronic equipment 100 identifies the target subject as described above, it can be applied to face recognition unlocking and payment on portable electronic devices, or to obstacle avoidance by robots, vehicles, unmanned aerial vehicles, ships, and the like. When the electronic equipment 100 judges the change in distance between the target subject and itself as described above, it can be applied to the autonomous operation and object tracking of robots, vehicles, unmanned aerial vehicles, ships, and the like.
Referring to Figures 2 and 15, an embodiment of the present application also provides a mobile platform 300. The mobile platform 300 includes a body 10 and multiple structured light assemblies 20 arranged on the body 10. The multiple structured light assemblies 20 are located at multiple different orientations of the body 10. Each structured light assembly 20 includes two structured light projectors 22 and one structured light camera 24. The field angle of each structured light projector 22 is any value from 80 degrees to 120 degrees, and the field angle of each structured light camera 24 is any value from 180 degrees to 200 degrees. The structured light projectors 22 project laser patterns outward from the body 10, and each structured light camera 24 captures the laser patterns projected by the corresponding two structured light projectors 22 and reflected by the target subject. The structured light projectors 22 of the multiple structured light assemblies 20 project laser simultaneously, and the structured light cameras 24 of the multiple structured light assemblies 20 expose simultaneously, to obtain a panoramic depth image.
Specifically, the body 10 may be a vehicle body, an unmanned aerial vehicle body, a robot body, or a ship body.
Referring to Figure 15, when the body 10 is a vehicle body, the number of structured light assemblies 20 is two, and the two structured light assemblies 20 are mounted on two sides of the vehicle body, for example the front and the rear, or the left side and the right side. The vehicle body can carry the two structured light assemblies 20 as it moves along a road, constructing 360-degree panoramic depth images along the route for use as a reference map, or obtaining initial depth images in two different directions to identify the target subject and judge the change in distance between the target subject and the mobile platform 300, so as to control the vehicle body to accelerate, decelerate, stop, or detour, thereby achieving unmanned obstacle avoidance. For example, while the vehicle is moving along the road, if it is recognized that the distance between the target subject and the vehicle is decreasing and the target subject is a pit in the road, the vehicle decelerates with a first acceleration; if it is recognized that the distance between the target subject and the vehicle is decreasing and the target subject is a person, the vehicle decelerates with a second acceleration, where the absolute value of the first acceleration is less than the absolute value of the second acceleration. Performing different operations for different target subjects when the distance decreases in this way makes the vehicle more intelligent.
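The two-level deceleration rule in this example can be sketched as follows. The numeric accelerations are illustrative assumptions, constrained only by the stated requirement that the absolute value of the first acceleration be smaller than that of the second:

```python
def braking_acceleration(subject_class: str) -> float:
    """Deceleration (m/s^2, negative) chosen by subject class."""
    first_acceleration = -1.5    # pit in the road: mild braking
    second_acceleration = -4.0   # person: hard braking, |first| < |second|
    if subject_class == "person":
        return second_acceleration
    if subject_class == "pit":
        return first_acceleration
    return 0.0                   # unknown subject: no braking command here

# if distance_is_decreasing: vehicle.set_acceleration(braking_acceleration(cls))
```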
Referring to Figure 16, when the body 10 is an unmanned aerial vehicle body, the number of structured light assemblies 20 is two, and the two structured light assemblies 20 are mounted on opposite sides of the unmanned aerial vehicle body, such as the front and rear sides or the left and right sides, or on opposite sides of a gimbal carried on the unmanned aerial vehicle body. The unmanned aerial vehicle body can carry the multiple structured light assemblies 20 in flight for aerial photography, inspection, and the like. The unmanned aerial vehicle can return the panoramic depth images it obtains to a ground control terminal, or perform SLAM directly. The multiple structured light assemblies 20 also enable the unmanned aerial vehicle to accelerate, decelerate, stop, avoid obstacles, and track objects.
Referring to Figure 17, when the body 10 is a robot body, for example that of a sweeping robot, the number of structured light assemblies 20 is two, and the two structured light assemblies 20 are mounted on opposite sides of the robot body. The robot body can carry the multiple structured light assemblies 20 as it moves around a home, obtaining initial depth images in multiple different orientations to identify the target subject and judge the change in distance between the target subject and the mobile platform 300, so as to control the movement of the robot body and enable the robot to remove garbage, avoid obstacles, and so on.
Referring to Figure 18, when the body 10 is a ship body, the number of structured light assemblies 20 is two, and the two structured light assemblies 20 are mounted on opposite sides of the ship body. The ship body can carry the structured light assemblies 20 as it moves, obtaining initial depth images in multiple different orientations, so as to identify the target subject accurately in adverse environments (for example, in haze), judge the change in distance between the target subject and the mobile platform 300, and improve safety at sea.
The mobile platform 300 of the present embodiment is a platform that can move independently, and the multiple structured light assemblies 20 are mounted on the body 10 of the mobile platform 300 to obtain panoramic depth images. By contrast, the electronic equipment 100 of the present embodiment generally cannot move by itself; it can be further mounted on a device that can move, similar to the mobile platform 300, thereby helping that device obtain panoramic depth images.
It should be pointed out that the above descriptions of the body 10, the structured light assembly 20, the camera assembly 30, the microprocessor 40, and the application processor 50 of the electronic equipment 100 apply equally to the mobile platform 300 of the present embodiment, and are not repeated here.
Although embodiments of the present application have been shown and described above, it should be understood that the above embodiments are exemplary and should not be construed as limiting the present application. Those skilled in the art may change, modify, replace, and vary the above embodiments within the scope of the present application, and the scope of the present application is defined by the claims and their equivalents.

Claims (13)

1. Electronic equipment, characterized in that the electronic equipment comprises:
a body; and
multiple structured light assemblies arranged on the body, the multiple structured light assemblies being located at multiple different orientations of the body, each structured light assembly comprising two structured light projectors and one structured light camera, the field angle of each structured light projector being any value from 80 degrees to 120 degrees, the field angle of each structured light camera being any value from 180 degrees to 200 degrees, the structured light projectors being configured to project laser patterns outward from the body, and the structured light camera being configured to capture the laser patterns projected by the corresponding two structured light projectors and reflected by a target subject;
wherein the structured light projectors of the multiple structured light assemblies project laser simultaneously, and the structured light cameras of the multiple structured light assemblies expose simultaneously, to obtain a panoramic depth image.
2. The electronic equipment according to claim 1, characterized in that there are two structured light assemblies, and the laser patterns projected by the adjacent structured light projectors of the two structured light assemblies are different.
3. The electronic equipment according to claim 2, characterized in that the laser pattern projected by each structured light projector is different.
4. The electronic equipment according to claim 2 or 3, characterized in that each structured light projector comprises multiple light-emitting elements; between different structured light projectors, at least one of the arrangement, shape, or size of the multiple light-emitting elements differs, so that the laser patterns projected by different structured light projectors are different.
5. The electronic equipment according to claim 2 or 3, characterized in that each structured light projector comprises a diffractive optical element, the diffractive optical element comprising a diffraction body and a diffraction structure formed on the diffraction body; between different structured light projectors, the diffraction structures are different, so that the laser patterns projected by different structured light projectors are different.
6. The electronic equipment according to claim 2, characterized in that the electronic equipment further comprises an application processor and two microprocessors, each microprocessor corresponding to one structured light assembly, the two microprocessors being connected to the application processor, each microprocessor being configured to process the laser patterns captured by the structured light camera of the corresponding structured light assembly to obtain an initial depth image and transmit it to the application processor; the application processor is configured to synthesize the two initial depth images obtained by the two microprocessors into one panoramic depth image according to the field angles of the structured light cameras.
7. The electronic equipment according to claim 2, characterized in that the electronic equipment further comprises an application processor and two microprocessors, each microprocessor corresponding to one structured light assembly, the two microprocessors being connected to the application processor, each microprocessor being configured to process the laser patterns captured by the structured light camera of the corresponding structured light assembly to obtain an initial depth image and transmit it to the application processor;
the electronic equipment further comprises two camera assemblies arranged on the body, each camera assembly corresponding to one structured light assembly, the two camera assemblies being connected to the application processor, each camera assembly being configured to capture a scene image of the target subject and output it to the application processor;
the application processor is configured to identify the target subject from the two initial depth images obtained by the two microprocessors and the two scene images captured by the two camera assemblies.
8. The electronic equipment according to claim 7, characterized in that the application processor is further configured to, when identifying the target subject from the two initial depth images and the two scene images fails, synthesize the two initial depth images obtained by the two microprocessors into one merged depth image according to the field angles of the structured light cameras, synthesize the two scene images captured by the two camera assemblies into one merged scene image, and identify the target subject from the merged depth image and the merged scene image.
9. The electronic equipment according to claim 2, characterized in that the electronic equipment further comprises an application processor and two microprocessors, each microprocessor corresponding to one structured light assembly, the two microprocessors being connected to the application processor, each microprocessor being configured to process the laser patterns captured multiple times by the structured light camera of the corresponding structured light assembly to obtain multiple initial depth images and transmit them to the application processor; the application processor is configured to judge the change in distance between the target subject and the electronic equipment from the multiple initial depth images.
10. The electronic equipment according to claim 9, characterized in that the application processor is further configured to, when judging the distance change from the multiple initial depth images fails, synthesize the two initial depth images obtained by the two microprocessors into one merged depth image according to the field angles of the structured light cameras; the application processor performs this synthesis step continuously to obtain multiple consecutive merged depth images, and judges the distance change from the multiple merged depth images.
11. The electronic equipment according to claim 9 or 10, characterized in that the application processor is further configured to, when judging that the distance change is a decrease in distance, increase the frame rate at which the initial depth images used to judge the distance change are selected from the multiple initial depth images transmitted by at least one of the microprocessors.
12. A mobile platform, characterized in that the mobile platform comprises:
a body; and
multiple structured light assemblies arranged on the body, the multiple structured light assemblies being located at multiple different orientations of the body, each structured light assembly comprising two structured light projectors and one structured light camera, the field angle of each structured light projector being any value from 80 degrees to 120 degrees, the field angle of each structured light camera being any value from 180 degrees to 200 degrees, the structured light projectors being configured to project laser patterns outward from the body, and the structured light camera being configured to capture the laser patterns projected by the corresponding two structured light projectors and reflected by a target subject;
wherein the structured light projectors of the multiple structured light assemblies project laser simultaneously, and the structured light cameras of the multiple structured light assemblies expose simultaneously, to obtain a panoramic depth image.
13. The mobile platform according to claim 12, characterized in that the body is a vehicle body, an unmanned aerial vehicle body, a robot body, or a ship body.
CN201910007852.6A 2019-01-04 2019-01-04 Electronic equipment and mobile platform Pending CN109788172A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910007852.6A CN109788172A (en) 2019-01-04 2019-01-04 Electronic equipment and mobile platform


Publications (1)

Publication Number Publication Date
CN109788172A (en) 2019-05-21

Family

ID=66499916




Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180139431A1 (en) * 2012-02-24 2018-05-17 Matterport, Inc. Capturing and aligning panoramic image and depth data
CN106443985A (en) * 2015-08-07 2017-02-22 高准精密工业股份有限公司 Method of scaling a structured light pattern and optical device using same
CN108471487A (en) * 2017-02-23 2018-08-31 钰立微电子股份有限公司 Generate the image device and associated picture device of panoramic range image
CN108965751A (en) * 2017-05-25 2018-12-07 钰立微电子股份有限公司 For generating the image device of 360 degree of depth maps
CN107393011A (en) * 2017-06-07 2017-11-24 武汉科技大学 A kind of quick three-dimensional virtual fitting system and method based on multi-structured light vision technique
CN107263480A (en) * 2017-07-21 2017-10-20 深圳市萨斯智能科技有限公司 A kind of robot manipulation's method and robot
CN107580208A (en) * 2017-08-24 2018-01-12 上海视智电子科技有限公司 A kind of cooperative operation system and method for more depth measuring devices
CN107742296A (en) * 2017-09-11 2018-02-27 广东欧珀移动通信有限公司 Dynamic image generation method and electronic installation
CN108493767A (en) * 2018-03-12 2018-09-04 广东欧珀移动通信有限公司 Laser generator, structured light projector, image obtain structure and electronic device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111654606A (en) * 2020-06-04 2020-09-11 小狗电器互联网科技(北京)股份有限公司 Structured light device
CN111654606B (en) * 2020-06-04 2024-04-09 北京小狗吸尘器集团股份有限公司 Structured light device
CN115052136A (en) * 2022-05-10 2022-09-13 合肥的卢深视科技有限公司 Structured light projection method, electronic device, and storage medium
CN115052136B (en) * 2022-05-10 2023-10-13 合肥的卢深视科技有限公司 Structured light projection method, electronic device and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20190521)