WO2021004248A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2021004248A1
WO2021004248A1 · PCT/CN2020/096820 · CN2020096820W
Authority
WO
WIPO (PCT)
Prior art keywords
input
sub
housing
electronic device
output
Prior art date
Application number
PCT/CN2020/096820
Other languages
English (en)
French (fr)
Inventor
赵斌
王锐添
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Priority to EP20836654.2A priority Critical patent/EP3993370A4/en
Publication of WO2021004248A1 publication Critical patent/WO2021004248A1/zh
Priority to US17/564,365 priority patent/US11741746B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/001Constructional or mechanical details

Definitions

  • This application relates to the field of mobile terminals, in particular to an electronic device.
  • the depth camera can obtain the depth information of the objects in the scene.
  • the electronic device can use the depth information obtained by the depth camera to implement three-dimensional face authentication, depth information assisted beauty and other applications, which greatly enriches the functions of the electronic device.
  • However, the depth camera is usually used as the front camera of a mobile phone, which limits the functions the electronic device can realize.
  • For example, a depth camera used as the front camera cannot fully obtain the depth information of a scene; if the user needs the electronic device for applications such as three-dimensional scene modeling, that function cannot be realized.
  • the embodiment of the present application provides an electronic device.
  • The electronic device of the embodiment of the present application includes a first housing, a second housing, and an input and output assembly.
  • The second housing is arranged on the side opposite to the display screen of the electronic device, the first housing is connected with the second housing to form a receiving space, and the second housing is provided with a light-passing hole.
  • The input and output assembly is installed on the first housing and is accommodated in the receiving space; the side of the first housing facing the second housing is provided with a limiting member, and the limiting member is used to fix the input and output assembly on the first housing.
  • The input and output assembly includes a plurality of input and output modules, the plurality of input and output modules at least include a laser transmitter, a laser receiver, and at least one image collector, and the plurality of input and output modules all correspond to the light-passing hole.
  • The electronic device of the embodiment of the present application mounts a depth camera formed by a laser transmitter and a laser receiver on the first housing, so that the depth camera can be used as a rear camera, which can completely acquire the depth information of a scene and enrich the functions of the electronic device.
  • the electronic device is provided with a depth camera and an image collector at the same time, and more applications can be realized by combining the depth information obtained by the depth camera and the color information obtained by the image collector, such as realizing three-dimensional scene modeling to further enrich the functions of the electronic device.
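For illustration only (not part of the application): one common way to combine a rear depth map with a registered color image for three-dimensional scene modeling is to back-project every depth pixel into a colored 3D point cloud. The intrinsics and image sizes below are assumed, hypothetical values.

```python
import numpy as np

def depth_rgb_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map (meters) into a colored 3D point cloud.

    Assumes the depth map is already registered to the RGB image, i.e. pixel
    (u, v) in both images observes the same scene point.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx                            # pinhole back-projection
    y = (v - cy) * z / fy
    valid = z > 0                                    # skip pixels without a depth reading
    points = np.stack([x[valid], y[valid], z[valid]], axis=-1)
    colors = rgb[valid]
    return points, colors                            # could feed TSDF fusion or meshing

# Hypothetical example with a 480x640 frame and assumed intrinsics.
depth = np.full((480, 640), 1.2, dtype=np.float32)
rgb = np.zeros((480, 640, 3), dtype=np.uint8)
pts, cols = depth_rgb_to_point_cloud(depth, rgb, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(pts.shape, cols.shape)  # (307200, 3) (307200, 3)
```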
  • Fig. 1 is a front view of an electronic device according to an embodiment of the present application.
  • Fig. 2 is a rear view of the electronic device of Fig. 1.
  • Fig. 3 is a partial exploded schematic diagram of the electronic device of Fig. 2.
  • FIG. 4 is a schematic diagram of a three-dimensional structure of an input and output component in the electronic device of FIG. 3.
  • Fig. 5 is an exploded schematic diagram of the input and output components of Fig. 4.
  • Fig. 6 is a rear view of the input/output assembly of Fig. 4;
  • Fig. 7 is a front view of the input/output assembly of Fig. 4.
  • FIG. 8 is a schematic diagram of a partial plane structure of the combination of the first housing and the input and output components of the electronic device of FIG. 1.
  • FIG. 9 is a schematic diagram of a partial plane structure of the combination of the first housing, the main board, and the input/output assembly of the electronic device of FIG. 1.
  • Fig. 10 is a schematic partial cross-sectional view of the electronic device of Fig. 9 along line X-X.
  • Fig. 11 is a schematic partial cross-sectional view of the electronic device of Fig. 9 along line XI-XI.
  • FIG. 12 is a rear view of the combination of the main board and the input and output components of the electronic device of FIG. 3.
  • FIG. 13 is a front view of the combination of the main board and the input and output components of the electronic device of FIG. 3.
  • Fig. 14 is a rear view of an electronic device according to another embodiment of the present application.
  • In this application, unless expressly specified and limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or that they are in indirect contact through an intermediary.
  • Moreover, the first feature being "over", "above" or "on top of" the second feature may mean that the first feature is directly above or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature.
  • The first feature being "under", "below" or "beneath" the second feature may mean that the first feature is directly below or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
  • the present application provides an electronic device 100.
  • the electronic device 100 includes a first housing 10, a second housing 20 and an input/output assembly 30.
  • The second housing 20 is disposed on the side opposite to the display screen 90 of the electronic device 100, the first housing 10 and the second housing 20 are connected to form a receiving space 1001 between the first housing 10 and the second housing 20, and the second housing 20 is provided with a light-passing hole 21.
  • the input/output assembly 30 is installed on the first housing 10 and is housed in the receiving space 1001.
  • the side of the first housing 10 facing the second housing 20 is provided with a limiting member 11, and the limiting member 11 is used to fix the input/output assembly 30 on the first housing 10.
  • the input and output assembly 30 includes a plurality of input and output modules 50.
  • The plurality of input and output modules 50 at least include a laser transmitter 51, a laser receiver 52, and at least one image collector 53, and the plurality of input and output modules 50 all correspond to the light-passing hole 21.
  • A depth camera formed by the laser transmitter 51 and the laser receiver 52 is mounted on the first housing 10, so that the depth camera can be used as a rear camera of the electronic device 100 and can completely acquire the depth information of a scene, enriching the functions of the electronic device 100.
  • Moreover, the electronic device 100 is provided with a depth camera and an image collector 53 at the same time; the depth information obtained by the depth camera and the color information obtained by the image collector 53 can be combined to realize more applications, such as three-dimensional scene modeling, further enriching the functions of the electronic device 100.
  • the electronic device 100 of the present application includes a first housing 10, a second housing 20, an input/output assembly 30, a main board 60 and a display screen 90.
  • the second housing 20 is disposed on the side opposite to the display screen 90.
  • the first housing 10 and the second housing 20 are connected to form a receiving space 1001, and the input/output assembly 30 and the main board 60 are received in the receiving space 1001.
  • When the user normally uses the electronic device 100, he or she faces the surface where the display screen 90 is located.
  • the first housing 10 includes a step 15 and a limiting member 11.
  • the first housing 10 has a receiving groove 13 on a side facing the second housing 20.
  • the number of steps 15 may be multiple, and the multiple steps 15 extend from the bottom wall 131 of the receiving groove 13.
  • the limiting member 11 may be a limiting protrusion 11 extending from the first housing 10.
  • the limiting protrusion 11 is arranged in the receiving groove 13.
  • The number of the limiting protrusions 11 may be multiple: at least one limiting protrusion 11 extends from the top surface of the step 15, and at least one limiting protrusion 11 has one part extending from the top surface of the step 15 and another part extending from the bottom wall 131 of the receiving groove 13.
  • the second housing 20 is provided with the light through hole 21, and the light through hole 21 corresponds to the input and output module 50 so that the input and output module 50 can emit light to the outside and/or receive light incident from the outside.
  • the second housing 20 includes a top 22, a bottom 23, and a central axis A perpendicular to the top 22 and the bottom 23.
  • the input/output assembly 30 is installed on the first housing 10 and is housed in the receiving space 1001.
  • the limiting member 11 on the first housing 10 can fix the input/output assembly 30 on the first housing 10.
  • the input and output assembly 30 includes a plurality of input and output modules 50 and a bracket 40.
  • the multiple input and output modules 50 at least include a laser transmitter 51, a laser receiver 52 and at least one image collector 53.
  • the multiple input/output modules 50 correspond to the light-passing holes 21, so that each input/output module 50 can emit light to the outside or receive light incident from the outside.
  • the number of laser transmitters 51 can be one or more, and the number of laser receivers 52 can also be one or more.
  • the number of laser transmitters 51 is the same as the number of laser receivers 52.
  • When there is one laser transmitter 51 and one laser receiver 52, the laser transmitter 51 and the laser receiver 52 can form a structured light depth camera: the light emitted by the laser transmitter 51 forms a speckle pattern, the laser receiver 52 collects the speckle pattern to obtain a speckle image, and the electronic device 100 can use the speckle image to calculate the depth of the scene.
  • Alternatively, the laser transmitter 51 and the laser receiver 52 can form a time-of-flight depth camera: the laser transmitter 51 emits uniform surface light outward, the laser receiver 52 receives the light reflected by objects in the scene, and the electronic device 100 calculates the depth information of the scene from the time difference between the moment the laser transmitter 51 emits the light and the moment the laser receiver 52 receives it (a simple illustration follows).
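For illustration only (not part of the application): with a direct time-of-flight measurement, depth follows from the round-trip delay as d = c·Δt/2. The minimal sketch below assumes a per-pixel round-trip time is already available, which simplifies how real time-of-flight sensors (often phase-based) actually work.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_time_s: float) -> float:
    """Depth from a direct time-of-flight measurement.

    The emitted light travels to the object and back, so the one-way
    distance is half of speed-of-light times the measured delay.
    """
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Hypothetical example: a 10 ns round trip corresponds to roughly 1.5 m.
print(tof_depth(10e-9))  # ~1.499 m
```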
  • When there are multiple laser transmitters 51 and multiple laser receivers 52, for example two of each, one laser transmitter 51 and one laser receiver 52 can form a structured light depth camera, while the other laser transmitter 51 and the other laser receiver 52 form a time-of-flight depth camera.
  • The electronic device 100 can then use different depth cameras to measure depth information in different scenes, or use both depth cameras to measure depth information at the same time (one assumed way of combining the two measurements is sketched below).
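The application does not specify how two depth cameras would be combined; purely as an assumed illustration, a per-pixel rule could keep the structured-light value at close range (where triangulation is typically more accurate) and the time-of-flight value farther away:

```python
import numpy as np

def fuse_depth(sl_depth: np.ndarray, tof_depth: np.ndarray, crossover_m: float = 1.5) -> np.ndarray:
    """Naive per-pixel fusion of structured-light and time-of-flight depth maps.

    Pixels with a value of 0 are treated as invalid. The crossover distance
    is an assumed tuning parameter, not a value from the application.
    """
    fused = np.where(sl_depth > 0, sl_depth, tof_depth)                    # fall back to whichever is valid
    both = (sl_depth > 0) & (tof_depth > 0)
    fused = np.where(both & (tof_depth >= crossover_m), tof_depth, fused)  # trust ToF at longer range
    return fused
```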
  • the number of image collectors 53 may be one or more.
  • the type of the image collector 53 may be any one of an RGB camera, an infrared camera, or a black and white camera.
  • When there are multiple image collectors 53, the types of the image collectors 53 can be the same or different.
  • For example, both image collectors 53 are RGB cameras, or one image collector 53 is an RGB camera and the other image collector 53 is a black-and-white camera.
  • Moreover, different image collectors 53 may have the same or different field of view angles.
  • For example, both image collectors 53 are wide-angle cameras, or one image collector 53 is a wide-angle camera and the other image collector 53 is a telephoto camera; giving different image collectors 53 different field of view angles allows the electronic device 100 to provide both digital zoom and optical zoom, so captured images look better (a simple selection sketch follows).
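As an illustrative sketch only (the selection rule and the 2x telephoto magnification are assumptions, not details from the application), firmware could combine optical and digital zoom by picking the image collector whose native field of view best matches the requested zoom factor:

```python
def pick_camera(requested_zoom: float, tele_optical_zoom: float = 2.0):
    """Return (camera, residual_digital_zoom) for a requested zoom factor."""
    if requested_zoom < tele_optical_zoom:
        return "wide", requested_zoom                     # wide camera + digital crop
    return "tele", requested_zoom / tele_optical_zoom     # telephoto + remaining digital crop

print(pick_camera(1.5))  # ('wide', 1.5)
print(pick_camera(4.0))  # ('tele', 2.0)
```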
  • In the embodiments of the present application, the number of image collectors 53 is two: one image collector 53 is the main camera 54 and the other image collector 53 is the auxiliary camera 55.
  • The types of the main camera 54 and the auxiliary camera 55 are not restricted here.
  • the center points of the multiple input/output modules 50 are located on the same straight line, and the straight line is parallel to the central axis A of the second housing 20.
  • In one example, when the laser transmitter 51 and the laser receiver 52 form a structured light depth camera, the laser transmitter 51, the main camera 54, and the laser receiver 52 are arranged in sequence, and the center points of the laser transmitter 51, the main camera 54, and the laser receiver 52 are located on the same straight line parallel to the central axis A (not shown).
  • Placing the main camera 54 between the laser transmitter 51 and the laser receiver 52 increases the distance between the laser transmitter 51 and the laser receiver 52, which improves the accuracy of the depth information measured by the structured light depth camera (see the back-of-the-envelope illustration below).
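As a back-of-the-envelope illustration with assumed numbers (not values from the application): triangulation depth follows z = f·b/d for focal length f in pixels, baseline b, and disparity d, so the depth error scales roughly as z²·Δd/(f·b), and a larger transmitter-receiver baseline b directly reduces it.

```python
def depth_error(z_m: float, f_px: float, baseline_m: float, disparity_err_px: float = 0.1) -> float:
    """Approximate depth error of a triangulation (structured-light) measurement."""
    return z_m ** 2 * disparity_err_px / (f_px * baseline_m)

# Hypothetical numbers: doubling the baseline halves the error at 1 m depth.
print(depth_error(1.0, f_px=600.0, baseline_m=0.025))  # ~0.0067 m
print(depth_error(1.0, f_px=600.0, baseline_m=0.050))  # ~0.0033 m
```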
  • In another example, when the laser transmitter 51 and the laser receiver 52 form a time-of-flight depth camera, the laser transmitter 51, the laser receiver 52, and the main camera 54 are arranged in sequence, and the center points of the laser transmitter 51, the laser receiver 52, and the main camera 54 are located on the same straight line parallel to the central axis A (not shown).
  • Placing the laser receiver 52 between the laser transmitter 51 and the main camera 54 keeps the laser transmitter 51 and the laser receiver 52 close to each other, which improves the accuracy of the depth information measured by the time-of-flight depth camera.
  • Arranging the laser receiver 52 next to the main camera 54 facilitates the registration between the image collected by the main camera 54 and the depth image.
  • In yet another example, the number of image collectors 53 is two, namely the main camera 54 and the sub-camera 55.
  • When the laser transmitter 51 and the laser receiver 52 form a structured light depth camera, the laser transmitter 51, the sub-camera 55, the main camera 54, and the laser receiver 52 are arranged in sequence, and the center points of the laser transmitter 51, the auxiliary camera 55, the main camera 54, and the laser receiver 52 are located on the same straight line parallel to the central axis A (shown in Fig. 3).
  • Placing the main camera 54 and the sub-camera 55 between the laser transmitter 51 and the laser receiver 52 further increases the distance between the laser transmitter 51 and the laser receiver 52 and improves the accuracy of the depth information measured by the structured light depth camera, while arranging the main camera 54 next to the laser receiver 52 facilitates the registration between the image collected by the main camera 54 and the depth image.
  • In still another example, when the laser transmitter 51 and the laser receiver 52 form a time-of-flight depth camera, the laser transmitter 51, the laser receiver 52, the main camera 54, and the auxiliary camera 55 are arranged in sequence, and the center points of the laser transmitter 51, the laser receiver 52, the main camera 54, and the auxiliary camera 55 are located on the same straight line parallel to the central axis A (not shown).
  • Arranging the laser transmitter 51 and the laser receiver 52 next to each other improves the accuracy of the depth information measured by the time-of-flight depth camera, and arranging the laser receiver 52 next to the main camera 54 facilitates the registration between the image collected by the main camera 54 and the depth image (an illustrative registration sketch follows).
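For illustration only: registering the depth image to the main camera typically means back-projecting each depth pixel through the depth camera's intrinsics, applying the rigid transform between the two modules, and re-projecting into the color camera. All intrinsics and extrinsics below are hypothetical placeholders, not values from this application; placing the two modules close together keeps that transform small, which is why the arrangement above helps.

```python
import numpy as np

def register_depth_to_color(depth, K_d, K_c, R, t, color_shape):
    """Re-map a depth image (laser receiver) onto the color (main camera) pixel grid.

    depth       : HxW depth map in meters, 0 marks invalid pixels
    K_d, K_c    : 3x3 intrinsics of the depth and color cameras (assumed known)
    R, t        : rotation (3x3) and translation (3,) from depth frame to color frame
    color_shape : (height, width) of the color image
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth.reshape(-1) > 0
    z = depth.reshape(-1)[valid]
    uv1 = np.stack([u.reshape(-1)[valid], v.reshape(-1)[valid], np.ones_like(z)])
    pts_d = np.linalg.inv(K_d) @ (uv1 * z)          # back-project to 3D, depth-camera frame
    pts_c = R @ pts_d + t.reshape(3, 1)             # rigid transform into color-camera frame
    proj = K_c @ pts_c
    uc = np.round(proj[0] / proj[2]).astype(int)
    vc = np.round(proj[1] / proj[2]).astype(int)
    out = np.zeros(color_shape, dtype=depth.dtype)
    ok = (uc >= 0) & (uc < color_shape[1]) & (vc >= 0) & (vc < color_shape[0])
    out[vc[ok], uc[ok]] = pts_c[2, ok]              # depth as seen from the color camera
    return out
```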
  • the bracket 40 includes a first surface 41, a second surface 42 and a third surface 43.
  • The first surface 41 and the second surface 42 are located on two opposite sides of the bracket 40, and the third surface 43 connects the first surface 41 and the second surface 42.
  • The bracket 40 is provided with a plurality of accommodating cavities 44 penetrating from the first surface 41 to the second surface 42.
  • The accommodating cavities 44 are used to install the plurality of input/output modules 50 and expose the input/output modules 50 from the first surface 41.
  • The number of accommodating cavities 44 corresponds to the number of input and output modules 50.
  • the inner wall 401 of the at least one receiving cavity 44 between the side where the first surface 41 is located and the side where the second surface 42 is located extends to the center of the receiving cavity 44 to form a first limiting wall 451.
  • the first limiting wall 451 partitions the receiving cavity 44 into a first sub-cavity 441 and a second sub-cavity 442.
  • the first subcavity 441 is located on the side of the first surface 41
  • the second subcavity 442 is located on the side of the second surface 42.
  • the at least one input/output module 50 includes a first module part 501 and a second module part 502.
  • the cross-sectional area of the second module part 502 is larger than the cross-sectional area of the first module part 501, the first module part 501 is partially contained in the first sub-cavity 441, and the second module part 502 is contained in the second sub-cavity Within 442.
  • When the input/output module 50 is installed in the receiving cavity 44, the connecting surface between the second module part 502 and the first module part 501 abuts against the first limiting wall 451.
  • In one example, the receiving cavities 44 that accommodate the laser receiver 52 and the sub-camera 55 are each formed with a first limiting wall 451; the first module part 501 of the laser receiver 52 and the first module part 501 of the sub-camera 55 are each partially accommodated in the corresponding first sub-cavity 441, and the second module part 502 of the laser receiver 52 and the second module part 502 of the sub-camera 55 are respectively accommodated in the corresponding second sub-cavity 442.
  • The first limiting wall 451 can limit the position of the input and output module 50 and fix the input and output module 50 on the bracket 40, thereby preventing the input and output module 50 (the laser receiver 52 and the sub-camera 55) from falling off the bracket 40.
  • the inner wall 401 of the at least one receiving cavity 44 located on the side of the first surface 41 extends toward the center of the receiving cavity 44 to form a second limiting wall 452.
  • the at least one input/output module 50 includes a first module part 501 and a second module part 502.
  • the cross-sectional area of the second module part 502 is larger than the cross-sectional area of the first module part 501.
  • The second module part 502 is received in the receiving cavity 44 and the first module part 501 is located outside the receiving cavity 44; when the input/output module 50 is installed in the receiving cavity 44, the connecting surface between the second module part 502 and the first module part 501 abuts against the second limiting wall 452.
  • In one example, the receiving cavity 44 for the main camera 54 is formed with a second limiting wall 452, the second module portion 502 of the main camera 54 is located in the receiving cavity 44, and the first module portion 501 is located outside the receiving cavity 44.
  • The second limiting wall 452 can limit the position of the input/output module 50 and fix the input/output module 50 on the bracket 40, thereby preventing the input/output module 50 (the main camera 54) from falling off the bracket 40.
  • the at least one receiving cavity 44 includes a first subcavity 441 and a second subcavity 442.
  • the first subcavity 441 is located on the side where the first surface 41 is located
  • the second subcavity 442 is located on the side where the second surface 42 is located.
  • the cross-sectional area of the second sub-cavity 442 is smaller than the cross-sectional area of the first sub-cavity 441.
  • In one example, the receiving cavity 44 for accommodating the laser transmitter 51 is formed with a first sub-cavity 441 and a second sub-cavity 442, and the cross-sectional area of the second sub-cavity 442 is smaller than the cross-sectional area of the first sub-cavity 441; because the light emitted by the laser transmitter 51 has a certain emission angle, making the cross-sectional area of the first sub-cavity 441 larger prevents the first surface 41 of the bracket 40 from blocking the light emitted by the laser transmitter 51 (a simple geometric check follows).
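Assumed, illustrative numbers only: since the beam diverges at some half-angle, the clear radius the cavity needs at the first surface grows with the depth of the cavity above the emitter, roughly r = r_emitter + h·tan(θ), which is why the first sub-cavity 441 is made wider than the second sub-cavity 442.

```python
import math

def min_clear_radius(emitter_radius_mm: float, cavity_depth_mm: float, half_angle_deg: float) -> float:
    """Smallest cavity radius that avoids clipping a beam diverging at the given half-angle."""
    return emitter_radius_mm + cavity_depth_mm * math.tan(math.radians(half_angle_deg))

# Hypothetical values: a 1 mm emitter aperture under 2 mm of bracket with a
# 30-degree half-angle needs roughly a 2.2 mm clear radius at the first surface.
print(round(min_clear_radius(1.0, 2.0, 30.0), 2))  # 2.15
```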
  • Please refer to FIGS. 4 and 6 together.
  • When the input/output module 50 is received in the receiving cavity 44, a tiny gap 31 is formed between the inner wall 401 of the receiving cavity 44 on the side where the second surface 42 is located and the input/output module 50.
  • Glue can be dispensed into the gap 31 to bond the inner wall 401 and the input/output module 50, further fixing the input/output module 50 on the bracket 40.
  • If the receiving cavity 44 is not formed with the first limiting wall 451 or the second limiting wall 452 but is formed with a first sub-cavity 441 and a second sub-cavity 442, a tiny gap 31 is formed between the inner wall 401 of the second sub-cavity 442 and the input/output module 50; glue can be dispensed into the gap 31 to bond the inner wall 401 and the input/output module 50, further fixing the input/output module 50 on the bracket 40.
  • the bracket 40 includes an axis B along the length direction, and the limiting protrusion 11 includes a plurality of sub-protrusions 111, and the plurality of sub-protrusions 111 are symmetrically arranged about the axis B.
  • the third surface 43 includes a first sub-surface 431 perpendicular to the axis B and two second sub-surfaces 432 connected to the first sub-surface 431, and the two second sub-surfaces 432 are symmetrical about the axis B.
  • the limiting protrusion 11 includes a first sub-protrusion 112 corresponding to the first sub-surface 431 and a plurality of second sub-protrusions 113 corresponding to the two second sub-surfaces 432 respectively.
  • the first sub-surface 431 is engaged with the corresponding first sub-protrusion 112, and the second sub-surface 432 is engaged with the corresponding second sub-protrusion 113.
  • Specifically, the two image collectors 53 are the main camera 54 and the auxiliary camera 55 respectively, the plurality of second sub-protrusions 113 correspond to the two sides of the main camera 54, and the first sub-protrusion 112 corresponds to one side of the laser receiver 52.
  • In one embodiment of the present application, the main camera 54 has a larger photosensitive pixel array than the auxiliary camera 55 and therefore a larger volume.
  • Arranging a plurality of second sub-protrusions 113 on both sides of the main camera 54 allows the input/output module 50 to be fixed on the second housing 20 more firmly.
  • Each second sub-surface 432 includes a first portion 4321, a second portion 4322, and a third portion 4323 that are sequentially connected.
  • the first part 4321 corresponds to the laser receiver 52
  • the second part 4322 and the third part 4323 correspond to the main camera 54.
  • The second part 4322 is closer to the side wall of the receiving groove 13 than the first part 4321, and the third part 4323 is closer to the side wall of the receiving groove 13 than the second part 4322.
  • The second part 4322 and the third part 4323 are engaged with the corresponding second sub-protrusions 113.
  • the main board 60 is disposed between the input/output assembly 30 and the first housing 10.
  • the multiple input and output modules 50 are respectively connected to the main board 60 through the connector 70.
  • the multiple connectors 70 are connected to the side of the main board 60 facing the second housing 20.
  • The main board 60 is provided with a first via hole 61 and a second via hole 63 corresponding to the input and output assembly 30.
  • The first via hole 61 and the second via hole 63 are separated by a reinforcing portion 65 of the main board 60; when the input and output assembly 30 is installed on the first housing 10, at least one input/output module 50 corresponds to the first via hole 61 and at least one input/output module 50 corresponds to the second via hole 63.
  • the first through hole 61 and the second through hole 63 are connected by a reinforcing part 65, and at least one input/output module 50 corresponds to the reinforcing part 65.
  • the input/output module 50 corresponding to the reinforced portion 65 may be carried on the reinforced portion 65; or, a gap 80 is formed between the input/output module 50 corresponding to the reinforced portion 65 and the reinforced portion 65.
  • As shown in Fig. 11, the sub-camera 55 corresponds to the reinforcing portion 65, and a gap 80 is formed between the sub-camera 55 and the reinforcing portion 65.
  • other input/output modules 50 may correspond to the reinforcing part 65.
  • the main board 60 is provided with two via holes corresponding to the input/output assembly 30, and the two via holes are connected by a reinforcing portion 65, which can increase the strength of the main board 60 and ensure the stability of the main board 60 in use.
  • The limiting protrusion 11 is located in the first via hole 61; in this way, the limiting protrusion 11 can be sufficiently tall and the contact surface between the limiting protrusion 11 and the third surface 43 is larger, so the input/output assembly 30 is more stable when installed on the first housing 10.
  • In summary, the electronic device 100 of the embodiment of the present application mounts the depth camera formed by the laser transmitter 51 and the laser receiver 52 on the first housing 10, so that the depth camera can be used as a rear camera and can completely acquire the depth information of a scene, enriching the functions of the electronic device 100.
  • Moreover, the electronic device 100 is provided with a depth camera and an image collector 53 at the same time.
  • The depth information obtained by the depth camera and the color information obtained by the image collector 53 can be combined to realize more applications, such as three-dimensional scene modeling, further enriching the functions of the electronic device 100.
  • multiple input and output modules 50 are provided on the input and output assembly 30 to improve the integration of the input and output assembly 30.
  • The multiple input and output modules 50 are installed on the same bracket 40, which helps constrain the relative positions of the input and output modules 50.
  • the center points of the multiple input and output modules 50 are located on the same straight line, and the straight line is perpendicular to the central axis A of the second housing 20.
  • The arrangement of the plurality of input and output modules 50 is similar to the arrangement when the straight line through the center points of the plurality of input and output modules 50 is parallel to the central axis A of the second housing 20, and will not be repeated here.
  • the input/output module 50 may further include elements such as floodlights and flashlights.
  • the floodlights, flashlights and other components can be placed on the same bracket 40 as the laser transmitter 51, the laser receiver 52, and the image collector 54 to further improve the integration of the input and output components 30.
  • the light-passing hole 21 also corresponds to newly-added elements such as floodlights, flashlights, etc., so that light emitted by elements such as floodlights and flashlights can be emitted through the light-passing hole 21.
  • Referring to Figs. 2 and 14, in some embodiments the electronic device 100 may further include a decorative ring 91 disposed on the second housing 20; the decorative ring 91 is installed in the light-passing hole 21, and a perforation 910 is provided in the decorative ring 91.
  • The shape and size of the perforation 910 correspond to the shape and size of the input/output module 50, so that external light can enter the input/output module 50 after passing through the perforation 910 and the light-passing hole 21, or the light emitted by the input/output module 50 can exit to the outside after passing through the light-passing hole 21 and the perforation 910.
  • first and second are only used for descriptive purposes, and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Therefore, the features defined with “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present application, “plurality” means at least two, such as two or three, unless otherwise specifically defined.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Vascular Medicine (AREA)
  • Studio Devices (AREA)

Abstract

An electronic device (100). The electronic device (100) includes a first housing (10), a second housing (20), and an input/output assembly (30). The second housing (20) is provided with a light-passing hole (21). The input/output assembly (30) is mounted on the first housing (10); the side of the first housing (10) facing the second housing (20) is provided with a limiting member (11), and the limiting member (11) is used to fix the input/output assembly (30) on the first housing (10). The input/output assembly (30) includes a plurality of input/output modules (50), and the plurality of input/output modules (50) all correspond to the light-passing hole (21).

Description

电子设备
优先权信息
本申请请求2019年07月09日向中国国家知识产权局提交的、专利申请号为201921071690.4的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本申请涉及移动终端领域,特别涉及一种电子设备。
背景技术
深度相机可以获取场景中物体的深度信息。深度相机应用在电子设备,如手机上时,电子设备可以利用深度相机获取的深度信息实现三维人脸的身份验证、深度信息辅助美颜等应用,极大地丰富了电子设备的功能。但目前深度相机通常作为手机的前置相机使用,这会使得电子设备能够实现的功能较少,比如,作为前置相机的深度相机无法完整地获取场景中的深度信息,用户若需要借助电子设备来实现三维场景建模等应用,则三维场景建模这一功能就无法实现。
发明内容
本申请实施方式提供了一种电子设备。
本申请实施方式的电子设备包括第一壳体、第二壳体及输入输出组件。所述第二壳体设置在与所述电子设备的显示屏相背的一面,所述第一壳体与所述第二壳体连接以形成收容空间,所述第二壳体开设有通光孔。所述输入输出组件安装在所述第一壳体上并收容在所述收容空间内,所述第一壳体朝向所述第二壳体的一面设置有限位件,所述限位件用于将所述输入输出组件固定在所述第一壳体上,所述输入输出组件包括多个输入输出模组,多个所述输入输出模组至少包括激光发射器、激光接收器及至少一个图像采集器,多个所述输入输出模组与所述通光孔均对应。
本申请实施方式的电子设备将由激光发射器及激光接收器形成的深度相机安装在第一壳体上,使得深度相机作为后置相机使用,可以完整地获取场景中的深度信息,丰富电子设备的功能。而且,电子设备同时设置有深度相机和图像采集器,可以结合深度相机获取的深度信息和图像采集器获取的色彩信息来实现更多应用,例如实现三维场景建模,进一步丰富电子设备的功能。
本申请实施方式的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图说明
本申请的上述和/或附加的方面和优点可以从结合下面附图对实施方式的描述中将变得明显和容易理解,其中:
图1是本申请一个实施例的电子设备的正视图。
图2是图1的电子设备的后视图。
图3是图2的电子设备的部分分解示意图。
图4是图3的电子设备中的输入输出组件的立体结构示意图。
图5是图4的输入输出组件的分解示意图。
图6是图4的输入输出组件的后视图。
图7是图4的输入输出组件的正视图。
图8是图1的电子设备的第一壳体及输入输出组件结合的部分平面结构示意图。
图9是图1的电子设备的第一壳体、主板及输入输出组件结合的部分平面结构示意图。
图10是图9的电子设备沿X-X线的部分截面示意图。
图11是图9的电子设备沿XI-XI线的部分截面示意图。
图12是图3的电子设备的主板及输入输出组件结合的后视图。
图13是图3的电子设备的主板及输入输出组件结合的正视图。
图14是本申请另一个实施例的电子设备的后视图。
具体实施方式
以下结合附图对本申请的实施方式作进一步说明。附图中相同或类似的标号自始至终表示相同或类似的元件或具有相同或类似功能的元件。
另外,下面结合附图描述的本申请的实施方式是示例性的,仅用于解释本申请的实施方式,而不能理解为对本申请的限制。
在本申请中,除非另有明确的规定和限定,第一特征在第二特征“上”或“下”可以是第一和第二特征直接接触,或第一和第二特征通过中间媒介间接接触。而且,第一特征在第二特征“之上”、“上方”和“上面”可是第一特征在第二特征正上方或斜上方,或仅仅表示第一特征水平高度高于第二特征。第一特征在第二特征“之下”、“下方”和“下面”可以是第一特征在第二特征正下方或斜下方,或仅仅表示第一特征水平高度小于第二特征。
请一并参阅图1至图3,本申请提供一种电子设备100。电子设备100包括第一壳体10、第二壳体20及输入输出组件30。第二壳体20设置在与电子设备100的显示屏90相背的一面,第一壳体10与第二壳体20连接以在所述第一壳体10与第二壳体20之间形成 收容空间1001,第二壳体20开设有通光孔21。输入输出组件30安装在第一壳体10上并收容在收容空间1001内。第一壳体10朝向第二壳体20的一面设置有限位件11,限位件11用于将输入输出组件30固定在第一壳体10上。输入输出组件30包括多个输入输出模组50,多个输入输出模组50至少包括激光发射器51、激光接收器52及至少一个图像采集器53,多个输入输出模组50均与通光孔21对应。
本申请实施方式的电子设备100将由激光发射器51及激光接收器52形成的深度相机安装在第一壳体10上,使得深度相机作为电子设备100的后置相机使用,可以完整地获取场景中的深度信息,丰富电子设备100的功能。而且,电子设备100同时设置有深度相机和图像采集器53,可以结合深度相机获取的深度信息和图像采集器53获取的色彩信息来实现更多应用,例如实现三维场景建模,进一步丰富电子设备100的功能。
请一并参阅图1至图3,本申请的电子设备100包括第一壳体10、第二壳体20、输入输出组件30、主板60及显示屏90。第二壳体20设置在与显示屏90相背的一面,第一壳体10与第二壳体20连接以形成收容空间1001,输入输出组件30及主板60收容在收容空间1001内。用户在正常使用电子设备100时,都是正视显示屏90所在的表面。
第一壳体10包括台阶15及限位件11。第一壳体10朝向第二壳体20的一面开设有收容槽13,台阶15的数量可为多个,多个台阶15自收容槽13的底壁131延伸。在一个例子中,限位件11可为自第一壳体10延伸的限位凸起11。限位凸起11设置在收容槽13内。限位凸起11的数量可为多个,至少一个限位凸起11自台阶15的顶面延伸,至少一个限位凸起11的一部分自台阶15的顶面延伸,另一部分自收容槽13的底壁131延伸。
第二壳体20开设有所述通光孔21,通光孔21与输入输出模组50对应,以使输入输出模组50可以向外界发射光线和/或接收从外界入射的光线。第二壳体20包括顶部22、底部23、以及垂直顶部22与底部23的中心轴A。
输入输出组件30安装在第一壳体10上并收容在收容空间1001内。第一壳体10上的限位件11可以将输入输出组件30固定在第一壳体10上。输入输出组件30包括多个输入输出模组50及支架40。
多个输入输出模组50至少包括激光发射器51、激光接收器52及至少一个图像采集器53。多个输入输出模组50与通光孔21均对应,以使每个输入输出模组50可以向外界发射光线或接收从外界入射的光线。
具体地,激光发射器51的数量可为一个或多个,激光接收器52的数量也可为一个或多个,激光发射器51的数量与激光接收器52的数量一致。激光发射器51和激光接收器52的数量均为一个时,激光发射器51和激光接收器52可以组成结构光深度相机,此时激光发射器51向外发射的光线可以形成散斑图案,激光接收器52采集散斑图案得到散斑图 像,电子设备100可以利用散斑图像进行场景的深度计算;或者,激光发射器51和激光接收器52也可以组成飞行时间深度相机,此时激光发射器51向外发射均匀的面光,激光接收器52接收被场景中的物体反射的光线,电子设备100可以根据激光发射器51发射光线的时间点与激光接收器52接收光线的时间点之间的时间差来计算场景的深度信息。当激光发射器51和激光接收器52的数量均为多个时,比如均为两个,则可以是其中一个激光发射器51与一个激光接收器52组成结构光深度相机,另一个激光发射器51与另一个激光接收器52组成飞行时间深度相机,电子设备100可以在不同的场景下采用不同的深度相机来测量深度信息,或者同时采用两组深度相机来测量深度信息等等。
图像采集器53的数量可为一个或多个。图像采集器53的类型可以是RGB摄像头、红外摄像头、或黑白摄像头中的任意一种。图像采集器53的个数为多个时,图像采集器53的类型可以相同或者不同,比如两个图像采集器53均为RGB摄像头,或一个图像采集器53为RGB摄像头、另一个图像采集器53为黑白摄像头等等。并且,不同的图像采集器53可以具有相同或者不同的视场角,比如两个图像采集器53均为广角摄像头,或者一个图像采集器53为广角摄像头,另一个图像采集器53为长焦摄像头等等,不同的图像采集器53具有不同的视场角可以使得电子设备100同时具备数字变焦和光学变焦两个功能,电子设备100拍摄图像时,图像的成像效果更好。在本申请的实施例中,图像采集器53的个数为两个,其中一个图像采集器53为主摄54,另一个图像采集器53为副摄55,主摄54和副摄55的类型在此不作限制。
多个输入输出模组50的中心点位于同一直线上,且该直线与第二壳体20的中心轴A平行。在一个例子中,图像采集器53的个数为一个,并为主摄51,激光发射器51和激光接收器52组成结构光深度相机时,激光发射器51、主摄54、激光接收器52依次排列,激光发射器51、主摄54、激光接收器52三者的中心点位于与中心轴A平行的同一直线上(图未示),此时,将主摄54放在激光发射器51和激光接收器52之间,可以增大激光发射器51和激光接收器52之间的距离,提升结构光深度相机测量的深度信息的准确性。在另一个例子中,图像采集器53的个数为一个,并为主摄51,激光发射器51和激光接收器52组成飞行时间深度相机时,激光发射器51、激光接收器52、主摄54依次排列,激光发射器51、激光接收器52、主摄54三者的中心点位于与中心轴A平行的同一直线上(图未示),此时,将激光接收器52放置在激光发射器51和主摄54之间,激光发射器51和激光接收器52的位置邻近,可以提升飞行时间深度相机测量的深度信息的准确性,将激光接收器52和主摄54设置在一起,可以便于主摄54采集的图像与深度图像之间的配准。在又一个例子中,图像采集器53的个数为两个,分别为主摄54和副摄55,激光发射器51和激光接收器52组成结构光深度相机时,激光发射器51、副摄55、主摄54、激光接收器52依 次排列,激光发射器51、副摄55、主摄54、激光接收器52四者的中心点位于与中心轴A平行的同一直线上(图3所示),此时,将主摄54和副摄55放在激光发射器51和激光接收器52之间,可以进一步增大激光发射器51和激光接收器52之间的距离,提升结构光深度相机测量的深度信息的准确性,将主摄54与激光接收器52设置在一起,可以便于主摄54采集的图像与深度图像之间的配准。在再一个例子中,图像采集器53的个数为两个,分别为主摄54和副摄55,激光发射器51和激光接收器52组成飞行时间深度相机时,激光发射器51、激光接收器52、主摄54、副摄55依次排列,激光发射器51、激光接收器52、主摄54、副摄55四者的中心点位于与中心轴A平行的同一直线上(图未示),此时,将激光发射器51和激光接收器52的设置在一起,可以提升飞行时间深度相机测量的深度信息的准确性,将激光接收器52和主摄54设置在一起,可以便于主摄54采集的图像与深度图像之间的配准。
请一并参阅图3至图5,支架40包括第一面41、第二面42及第三面43。第一面41与第二面42位于支架40的相背两测,第三面43连接第一面41与第二面42。输入输出组件30安装在第一壳体10上时,第一面41朝向第二壳体20,第二面42朝向第一壳体10,限位凸起11与第三面43接触。
支架40开设有多个从第一面41贯穿至第二面42的收容腔44,多个收容腔44用于安装多个输入输出模组50并使输入输出模组50从第一面41露出,收容腔44的个数与输入输出模组50的个数对应。
至少一个收容腔44的位于第一面41所在一侧及第二面42所在一侧之间的内壁401向收容腔44的中心延伸形成有第一限位壁451。第一限位壁451将收容腔44分隔成第一子腔441和第二子腔442。第一子腔441位于第一面41所在一侧,第二子腔442位于第二面42所在一侧。至少一个输入输出模组50包括第一模组部501和第二模组部502。第二模组部502的横截面积大于第一模组部501的横截面积,第一模组部501部分收容在第一子腔441内,第二模组部502收容在第二子腔442内。输入输出模组50安装在收容腔44内时,第二模组部502与第一模组部501的连接面与第一限位壁451相抵触。如图4和图5所示,在一个例子中,收容激光接收器52和副摄55的收容腔44均形成有第一限位壁451,激光接收器52的第一模组部501和副摄55的第一模组部501分别部分收容在对应的第一子腔441内,激光接收器52的第二模组部502和副摄55的第二模组部502分别收容在对应的第二子腔442内。第一限位壁451可以限制输入输出模组50的位置,起到将输入输出模组50固定在支架40上的作用,从而避免输入输出模组50(激光接收器52和副摄55)从支架40上脱落。
至少一个收容腔44的位于第一面41所在一侧的内壁401向收容腔44的中心延伸形成 有第二限位壁452。至少一个输入输出模组50包括第一模组部501和第二模组部502。第二模组部502的横截面积大于第一模组部501的横截面积。第二模组部502收容在收容腔44内,第一模组部501位于收容腔44外,输入输出模组50安装在收容腔44内时,第二模组部502与第一模组部501的连接面与第二限位壁452相抵触。如图4和图5所示,在一个例子中,收容主摄54的收容腔44形成有第二限位壁452,主摄54的第二模组部502位于收容腔44内,第一模组部501位于收容腔44外。第二限位壁452可以限制输入输出模组50的位置,起到将输入输出模组50固定在支架40上的作用,从而避免输入输出模组50(主摄54)从支架40上脱落。
至少一个收容腔44包括第一子腔441和第二子腔442,第一子腔441位于第一面41所在一侧,第二子腔442位于第二面42所在一侧。第二子腔442的横截面积小于第一子腔441的横截面积。至少一个输入输出模组50安装在收容腔44内时,第二子腔442与输入输出模组50的形状对应,第二子腔442可以起到一定的将输入输出模组50固定在支架40上的作用,从而避免输入输出模组50从支架40上脱落。如图4和图5所示,在一个例子中,收容激光发射器51的收容腔44形成有第一子腔441和第二子腔442,第二子腔442的横截面积小于第一子腔441的横截面积。激光发射器51收容在收容腔44内时,由于激光发射器51发射的光线具有一定的发射角度,那么将第一子腔441的横截面积设置成较大,可以避免支架40的第一面41对激光发射器51发射的光线的遮挡。
请一并参阅图4和图6,输入输出模组50收容在收容腔44内时,收容腔44的位于第二面42所在一侧的内壁401与输入输出模组50之间形成有微小的间隙31。可以在该间隙31上点胶,以胶粘内壁401与输入输出模组50,从而进一步将输入输出模组50固定在支架40上。
请一并参阅图4、图5和图7,输入输出模组50收容在收容腔44内时,若收容腔44形成有第一限位壁451或第二限位壁452,则第一限位壁451或第二限位壁452与输入输出模组50之间形成有微小的间隙31。可以在该间隙31上点胶,以胶粘内壁401与输入输出模组50,从而进一步将输入输出模组50固定在支架40上。若收容腔44未形成有第一限位壁451或第二限位壁452,但收容腔44形成有第一子腔441和第二子腔442,则第二子腔442的内壁401与输入输出模组50之间形成微小的间隙31。可以在该间隙31上点胶,以胶粘内壁401与输入输出模组50,从而进一步将输入输出模组50固定在支架40上。
请一并参阅图3、图5、图8、图9和图10,输入输出组件30安装在第一壳体10上时,支架40的第二面42部分与台阶15的顶面相抵触,支架40的第三面43部分与限位凸起11相抵触。支架40包括沿长度方向的轴线B,限位凸起11包括多个子凸起111,多个子凸起111关于轴线B对称设置。
第三面43包括与轴线B垂直的第一子面431及与第一子面431均连接的两个第二子面432,两个第二子面432关于轴线B对称。限位凸起11包括与第一子面431对应的第一子凸起112及与两个第二子面432分别对应的多个第二子凸起113。第一子面431与对应的第一子凸起112卡合,第二子面432与对应的第二子凸起113卡合。具体地,两个图像采集器53分别为主摄54及副摄55,多个第二子凸起113分别与主摄54的两侧对应,第一子凸起112与激光接收器52的一侧对应。在本申请的一个实施例中,主摄54相较于副摄55具有更大的感光像素阵列,主摄54的体积较大,将多个第二子凸起113设置在主摄54的两侧,可以更加稳固地将输入输出模组50固定在第二壳体20上。
每个第二子面432包括依次相接的第一部4321、第二部4322及第三部4323。第一部4321与激光接收器52对应,第二部4322及第三部4323与主摄54对应。第二部4322较第一部4321更靠近收容槽13的侧壁,第三部4323较第二部4322更靠近收容槽13的侧壁,第二部4322及第三部4323与对应的第二子凸起113卡合。
请一并参阅图3、及图11至图13,主板60设置在输入输出组件30与第一壳体10之间。多个输入输出模组50分别通过连接器70与主板60连接,具体地,多个连接器70连接在主板60的朝向第二壳体20的一面。主板60开设有与输入输出组件30对应的第一过孔61和第二过孔63,第一过孔61和第二过孔63由主板60的加强部65隔开,输入输出组件30安装在第一壳体10上时,至少一个输入输出模组50与第一过孔61对应,至少一个输入输出模组50与第二过孔63对应。第一过孔61与第二过孔63通过加强部65连接,至少一个输入输出模组50与加强部65对应。与加强部65对应的输入输出模组50可以承载在加强部65上;或者,与加强部65对应的输入输出模组50与加强部65之间形成有间隙80。如图11所示,副摄55与加强部65对应,副摄55与加强部65之间形成有间隙80。当然,在其他实施方式中,也可以是其他的输入输出模组50与加强部65对应。在主板60上对应输入输出组件30开设的两个过孔,两个过孔之间通过加强部65连接,可以增加主板60的强度,保证主板60使用的稳定性。
请参阅图10,限位凸起11位于第一过孔61内,如此,限位凸起11的高度足够高,限位凸起11与第三面43卡合时的接触面更大,输入输出组件30安装在第一壳体10上时更为稳固。
综上,本申请实施方式的电子设备100将由激光发射器51及激光接收器52形成的深度相机安装在第一壳体10上,使得深度相机作为后置相机使用,可以完整地获取场景中的深度信息,丰富电子设备100的功能。而且,电子设备100同时设置有深度相机和图像采集器53,可以结合深度相机获取的深度信息和图像采集器53获取的色彩信息来实现更多应用,例如实现三维场景建模,进一步丰富电子设备100的功能。
另外,输入输出组件30上设置多个输入输出模组50可以提升输入输出组件30的集成度,多个输入输出模组50安装在同一支架40上,有利于限制各个输入输出模组50之间的相对位置。
请参阅图14,在某些实施方式中,多个输入输出模组50的中心点位于同一直线上,且该直线与第二壳体20的中心轴A垂直。多个输入输出模组50的排列方式与多个输入输出模组50中心点所在直线与第二壳体20中心轴A平行时的排列方式类似,在此不再赘述。
在某些实施方式中,输入输出模组50还可以进一步包括泛光灯、闪光灯等元件。泛光灯、闪光灯等元件可以与激光发射器51、激光接收器52及图像采集器54放置在同一个支架40上,从而进一步提升输入输出组件30的集成度。通光孔21与新增的泛光灯、闪光灯等元件也均对应,以使得泛光灯、闪光灯等元件发出的光线可以经由通光孔21射出。
请一并参阅图2和图14,在某些实施方式中,电子设备100还可包括设置在第二壳体20上的装饰圈91,装饰圈91安装在通光孔21内,且装饰圈91上开设有穿孔910,穿孔910的形状大小与输入输出模组50的形状大小对应,以使得外界光线能够经过穿孔910及通光孔21后进入输入输出模组50中,或者,输入输出模组50发出的光线经过通光孔21及穿孔910后射入到外界。
在本说明书的描述中,参考术语“某些实施方式”、“一个实施方式”、“一些实施方式”、“示意性实施方式”、“示例”、“具体示例”、或“一些示例”的描述意指结合所述实施方式或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施方式或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施方式或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实施方式或示例中以合适的方式结合。
此外,术语“第一”、“第二”仅用于描述目的,而不能理解为指示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括至少一个所述特征。在本申请的描述中,“多个”的含义是至少两个,例如两个,三个,除非另有明确具体的限定。
尽管上面已经示出和描述了本申请的实施例,可以理解的是,上述实施例是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施例进行变化、修改、替换和变型,本申请的范围由权利要求及其等同物限定。

Claims (17)

  1. 一种电子设备,其特征在于,所述电子设备包括:
    第一壳体;
    第二壳体,所述第二壳体设置在与所述电子设备的显示屏相背的一面,所述第一壳体与所述第二壳体连接以形成收容空间,所述第二壳体开设有通光孔;和
    输入输出组件,所述输入输出组件安装在所述第一壳体上并收容在所述收容空间内,所述第一壳体朝向所述第二壳体的一面设置有限位件,所述限位件用于将所述输入输出组件固定在所述第一壳体上,所述输入输出组件包括多个输入输出模组,多个所述输入输出模组至少包括激光发射器、激光接收器及至少一个图像采集器,多个所述输入输出模组均与所述通光孔对应。
  2. 根据权利要求1所述的电子设备,其特征在于,所述限位件包括自所述第一壳体延伸的限位凸起。
  3. 根据权利要求2所述的电子设备,其特征在于,所述输入输出组件还包括支架,多个所述输入输出模组安装在所述支架上,所述支架包括第一面、第二面及第三面,所述第一面与所述第二面位于所述支架的相背两侧,所述第三面连接所述第一面与所述第二面,所述输入输出组件固定在所述第一壳体上时,所述限位件与所述第三面接触,所述第一面朝向所述第二壳体。
  4. 根据权利要求3所述的电子设备,其特征在于,所述支架包括沿长度方向的轴线,所述限位凸起包括多个子凸起,多个所述子凸起关于所述轴线对称设置。
  5. 根据权利要求4所述的电子设备,其特征在于,所述第三面包括与所述轴线垂直的第一子面及与所述第一子面均连接的两个第二子面,两个所述第二子面关于所述轴线对称;
    所述限位凸起包括与所述第一子面对应的第一子凸起及与两个所述第二子面分别对应的多个第二子凸起,所述第一子面与对应的所述第一子凸起卡合,所述第二子面与对应的所述第二子凸起卡合。
  6. 根据权利要求3所述的电子设备,其特征在于,所述支架开设有多个从所述第一面贯穿至所述第二面的收容腔,多个所述收容腔用于安装多个所述输入输出模组并使多个所述输入输出模组均从所述第一面露出。
  7. 根据权利要求6所述的电子设备,其特征在于,至少一个所述收容腔的位于所述第一面所在一侧及所述第二面所在一侧之间的内壁向所述收容腔的中心延伸形成有第一限位壁,所述第一限位壁将所述收容腔分隔成第一子腔和第二子腔,所述第一子腔位于所述第一面所在一侧,所述第二子腔位于所述第二面所在一侧;
    至少一个所述输入输出模组包括第一模组部和第二模组部,所述第二模组部的横截面积大于所述第一模组部的横截面积,所述第一模组部部分收容在所述第一子腔内,所述第二模组部收容在所述第二子腔内,所述输入输出模组安装在所述收容腔内时,所述第二模组部与所述第一模组部的连接面与所述第一限位壁相抵触。
  8. 根据权利要求6所述的电子设备,其特征在于,至少一个所述收容腔的位于所述第一面所在一侧的内壁向所述收容腔的中心延伸形成有第二限位壁;
    至少一个所述输入输出模组包括第一模组部和第二模组部,所述第二模组部的横截面积大于所述第一模组部的横截面积,所述第二模组部收容在所述收容腔内,所述第一模组部位于所述收容腔外,所述输入输出模组安装在所述收容腔内时,所述第二模组部与所述第一模组部的连接面与所述第二限位壁相抵触。
  9. 根据权利要求6所述的电子设备,其特征在于,至少一个所述收容腔包括第一子腔及第二子腔,所述第一子腔位于所述第一面所在一侧,所述第二子腔位于所述第二面所在一侧,所述第二子腔的横截面积小于所述第一子腔的横截面积;
    至少一个所述输入输出模组安装在所述收容腔内时,所述第二子腔与所述输入输出模组的形状对应。
  10. 根据权利要求6所述的电子设备,其特征在于,在所述输入输出模组安装在所述收容腔内时,所述收容腔的位于所述第二面所在一侧的内壁与所述输入输出模组之间形成有间隙。
  11. 根据权利要求3所述的电子设备,其特征在于,所述第一壳体朝向所述第二壳体的一面开设有收容槽,所述限位凸起设置在所述收容槽内,所述输入输出模组部分收容在所述收容槽内;
    所述第一壳体还包括自所述收容槽的底壁延伸的多个台阶,所述输入输出模组收容在所述收容槽内时,所述支架的所述第二面部分与所述台阶的顶面抵触。
  12. 根据权利要求1所述的电子设备,其特征在于,所述第二壳体包括顶部和底部,所述第二壳体还包括垂直所述顶部与所述底部的中心轴,多个所述输入输出模组的中心点位于同一直线上,且所述直线与所述中心轴平行;或
    所述第二壳体包括顶部和底部,所述第二壳体还包括垂直所述顶部与所述底部的中心轴,多个所述输入输出模组的中心点位于同一直线上,且所述直线与所述中心轴垂直。
  13. 根据权利要求12所述的电子设备,其特征在于,所述图像采集器的个数为一个,所述激光发射器、一个所述图像采集器及所述激光接收器依次排列;或
    所述图像采集器的个数为一个,所述激光发射器、所述激光接收器及一个所述图像采集器依次排列。
  14. 根据权利要求12所述的电子设备,其特征在于,所述图像采集器的个数为两个,所述激光发射器、两个所述图像采集器及所述激光接收器依次排列;或
    所述图像采集器的个数为两个,所述激光发射器、所述激光接收器及两个所述图像采集器依次排列。
  15. 根据权利要求1所述的电子设备,其特征在于,所述电子设备还包括主板,所述主板设置在所述输入输出组件与所述第一壳体之间,多个所述输入输出模组分别通过连接器与所述主板连接,多个所述连接器连接在所述主板的朝向所述第二壳体的一面。
  16. 根据权利要求15所述的电子设备,其特征在于,所述主板开设有与所述输入输出组件对应的第一过孔和第二过孔,所述第一过孔与所述第二过孔由所述主板的加强部隔开,所述输入输出组件安装在所述第一壳体上时,至少一个所述输入输出模组与第一过孔对应,至少一个所述输入输出模组与所述第二过孔对应;
    所述第一过孔与所述第二过孔通过所述加强部连接,至少一个所述输入输出模组与所述加强部对应。
  17. 根据权利要求16所述的电子设备,其特征在于,与所述加强部对应的所述输入输出模组承载在所述加强部上;或
    与所述加强部对应的所述输入输出模组与所述加强部之间形成有间隙。
PCT/CN2020/096820 2019-07-09 2020-06-18 电子设备 WO2021004248A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20836654.2A EP3993370A4 (en) 2019-07-09 2020-06-18 ELECTRONIC DEVICE
US17/564,365 US11741746B2 (en) 2019-07-09 2021-12-29 Electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201921071690.4 2019-07-09
CN201921071690.4U CN209823807U (zh) 2019-07-09 2019-07-09 电子设备

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/564,365 Continuation US11741746B2 (en) 2019-07-09 2021-12-29 Electronic device

Publications (1)

Publication Number Publication Date
WO2021004248A1 (zh)

Family

ID=68885583

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/096820 WO2021004248A1 (zh) 2019-07-09 2020-06-18 电子设备

Country Status (4)

Country Link
US (1) US11741746B2 (zh)
EP (1) EP3993370A4 (zh)
CN (1) CN209823807U (zh)
WO (1) WO2021004248A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN209823807U (zh) * 2019-07-09 2019-12-20 Oppo广东移动通信有限公司 电子设备
CN111107206A (zh) * 2020-01-23 2020-05-05 Oppo广东移动通信有限公司 电子设备
CN111694161A (zh) * 2020-06-05 2020-09-22 Oppo广东移动通信有限公司 光发射模组、深度相机及电子设备

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107343122A (zh) * 2017-08-02 2017-11-10 深圳奥比中光科技有限公司 3d成像装置
KR20180017375A (ko) * 2016-08-09 2018-02-21 엘지이노텍 주식회사 카메라 모듈
CN207218775U (zh) * 2017-09-28 2018-04-10 北京小米移动软件有限公司 电子设备
CN108418922A (zh) * 2018-04-10 2018-08-17 Oppo广东移动通信有限公司 支架、输入输出组件及终端
CN109040556A (zh) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 成像装置及电子设备
CN208572262U (zh) * 2018-01-31 2019-03-01 宁波舜宇光电信息有限公司 阵列摄像模组及其电子设备
CN109451228A (zh) * 2018-12-24 2019-03-08 华为技术有限公司 摄像组件及电子设备
CN208724035U (zh) * 2018-09-04 2019-04-09 南京华捷艾米软件科技有限公司 一种3d深度摄像头组件及带有该组件的装置
CN110049214A (zh) * 2019-03-25 2019-07-23 华为技术有限公司 摄像组件及电子设备
CN209823807U (zh) * 2019-07-09 2019-12-20 Oppo广东移动通信有限公司 电子设备
CN210075369U (zh) * 2019-05-16 2020-02-14 昆山丘钛微电子科技有限公司 光学投影模块及移动终端

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017206142A1 (en) * 2016-06-02 2017-12-07 Henkel (China) Investment Co., Ltd. Portable odor quantitative detector
CA3034146A1 (en) * 2016-07-19 2018-01-25 Shenzhen Royole Technologies Co., Ltd. Flexible device
CN206674128U (zh) 2017-09-08 2017-11-24 深圳奥比中光科技有限公司 结构稳定的3d成像装置
CN207691920U (zh) 2018-01-29 2018-08-03 信利光电股份有限公司 一种结构光摄像模组
CN108234850B (zh) 2018-01-31 2024-04-05 Oppo广东移动通信有限公司 一种电子装置、摄像头固定组件及其安装方法
CN108390970B (zh) * 2018-04-10 2020-11-06 Oppo广东移动通信有限公司 支架、输入输出组件和终端

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180017375A (ko) * 2016-08-09 2018-02-21 엘지이노텍 주식회사 카메라 모듈
CN107343122A (zh) * 2017-08-02 2017-11-10 深圳奥比中光科技有限公司 3d成像装置
CN207218775U (zh) * 2017-09-28 2018-04-10 北京小米移动软件有限公司 电子设备
CN208572262U (zh) * 2018-01-31 2019-03-01 宁波舜宇光电信息有限公司 阵列摄像模组及其电子设备
CN108418922A (zh) * 2018-04-10 2018-08-17 Oppo广东移动通信有限公司 支架、输入输出组件及终端
CN109040556A (zh) * 2018-08-22 2018-12-18 Oppo广东移动通信有限公司 成像装置及电子设备
CN208724035U (zh) * 2018-09-04 2019-04-09 南京华捷艾米软件科技有限公司 一种3d深度摄像头组件及带有该组件的装置
CN109451228A (zh) * 2018-12-24 2019-03-08 华为技术有限公司 摄像组件及电子设备
CN110049214A (zh) * 2019-03-25 2019-07-23 华为技术有限公司 摄像组件及电子设备
CN210075369U (zh) * 2019-05-16 2020-02-14 昆山丘钛微电子科技有限公司 光学投影模块及移动终端
CN209823807U (zh) * 2019-07-09 2019-12-20 Oppo广东移动通信有限公司 电子设备

Also Published As

Publication number Publication date
EP3993370A1 (en) 2022-05-04
CN209823807U (zh) 2019-12-20
US11741746B2 (en) 2023-08-29
EP3993370A4 (en) 2022-08-17
US20220122376A1 (en) 2022-04-21

Similar Documents

Publication Publication Date Title
WO2021004248A1 (zh) 电子设备
WO2020038068A1 (zh) 成像装置及电子设备
RU2730416C1 (ru) Электронное устройство с панелью отображения
US11516455B2 (en) Electronic device and method for controlling the same
WO2020134879A1 (zh) 摄像组件及电子设备
US20210250472A1 (en) Dual-vision camera, gimbal system and movable platform
WO2020038063A1 (zh) 电子装置和电子装置的控制方法
WO2020038059A1 (zh) 支架、输入输出组件和移动设备
CN108390970B (zh) 支架、输入输出组件和终端
WO2020052289A1 (zh) 深度获取模组及电子装置
TW201905525A (zh) 雙鏡頭模組及其電子裝置
WO2020052288A1 (zh) 深度采集模组及移动终端
WO2020038056A1 (zh) 飞行时间组件及电子设备
WO2020038052A1 (zh) 输入输出组件和移动设备
CN210327827U (zh) 光发射组件、摄像模组及电子设备
JP6688426B1 (ja) レンズモジュール
WO2021016813A1 (zh) 拍摄设备、云台装置及无人机
WO2020038057A1 (zh) 深度采集模组及电子设备
WO2021027580A1 (zh) 终端
CN212301894U (zh) 发射模组、摄像头及电子装置
WO2020034996A1 (zh) 用于移动电子设备的辅助摄像装置及其组装方法
WO2021016816A1 (zh) 拍摄设备、云台装置及无人机
CN213693886U (zh) 一种摄像头模组及设备
US20190253590A1 (en) Camera Module
CN211089730U (zh) 摄像模组、壳体及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20836654

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020836654

Country of ref document: EP

Effective date: 20220128