CN111953822B - Electronic device - Google Patents


Info

Publication number
CN111953822B
CN111953822B (application CN202010030418.2A)
Authority
CN
China
Prior art keywords
electronic device
picture
processor
display
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010030418.2A
Other languages
Chinese (zh)
Other versions
CN111953822A (en)
Inventor
吴易锡
徐仁邦
黄健庭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Asustek Computer Inc
Original Assignee
Asustek Computer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Asustek Computer Inc filed Critical Asustek Computer Inc
Publication of CN111953822A publication Critical patent/CN111953822A/en
Application granted granted Critical
Publication of CN111953822B publication Critical patent/CN111953822B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/58Means for changing the camera field of view without moving the camera body, e.g. nutating or panning of optics or image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • H04N23/631Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Telephone Function (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses an electronic device comprising a body, a camera module, at least one sensor, and a processor. The camera module is rotatably disposed on the body. The sensor is disposed on the electronic device and generates an environment sensing signal. The processor is electrically connected to the display screen, the camera module, and the at least one sensor. When the processor loads and executes an application program, the application program causes the electronic device to perform the following steps: generate a display frame; when the avatar display function of the application program is enabled, capture an avatar frame with the camera module and combine the avatar frame and the display frame into a live game frame; and, when the electronic device is in a live broadcast mode and the processor determines from the environment sensing signal that a dynamic environment change has occurred, transmit a notification signal to the application program to notify it to turn off the avatar display function. The electronic device can thus automatically stop outputting the captured frame when the usage environment changes.

Description

Electronic device
Technical Field
The present invention relates to an electronic device, and more particularly, to an environment sensing system and a display control method of an electronic device.
Background
When a user works or live-streams a game on a mobile phone, a drop or other environmental disturbance can flip the camera lens, so that the camera captures private scenes that are then leaked through the live stream.
Disclosure of Invention
The invention aims to provide an electronic device that can automatically stop outputting the captured frame when the usage environment changes.
The invention provides an electronic device comprising a body, a camera module, at least one sensor, and a processor. The camera module is rotatably disposed on the body. The at least one sensor is disposed on the electronic device for generating an environment sensing signal. The processor is electrically connected to the display screen, the camera module, and the at least one sensor. When the processor loads and executes the application program, the application program causes the electronic device to perform the following steps: generate a display frame; when the avatar display function of the application program is enabled, capture an avatar frame with the camera module and combine the avatar frame and the display frame into a live frame; and, when the electronic device is in a live broadcast mode and the processor determines from the environment sensing signal that a dynamic environment change has occurred, transmit a notification signal to the application program to notify it to turn off the avatar display function.
In summary, the electronic apparatus and the control method provided in the embodiments of the present invention can prevent private scenes from being captured and leaked when the camera lens is accidentally flipped while the user works or live-streams on a mobile phone. The sensor detects environmental changes around the electronic device to control the output streaming frames and related components, thereby protecting the privacy of the user.
Drawings
Fig. 1A is an external view of an electronic device according to some embodiments of the invention;
FIG. 1B is a block diagram of an electronic device according to some embodiments of the invention;
fig. 2A to 2C are schematic views illustrating a camera module of an electronic device being flipped to different flipping angles relative to a main body according to some embodiments;
FIG. 3 is a block diagram illustrating an internal architecture of an operating system 151 executed by a processor according to some embodiments of the invention; and
fig. 4A and 4B are schematic diagrams illustrating a live game frame under different situations.
Detailed Description
In the following description, numerous implementation details are set forth in order to provide a thorough understanding of the present invention. It should be understood, however, that these implementation details are not to be interpreted as limiting the invention. That is, in some embodiments of the invention, such implementation details are not necessary. In addition, for the sake of simplicity, some conventional structures and elements are shown in the drawings in a simple schematic manner.
Referring to fig. 1A, fig. 1A is an external view of an electronic device 100 according to some embodiments of the invention. In various applications, the electronic device 100 may be a mobile phone, a tablet computer, a personal computer, a notebook computer, etc. For example, the electronic device 100 may be a smart phone for applications such as communication and live broadcasting.
Referring to fig. 1A and 1B, the electronic device 100 includes a body 110, a camera module 130, a display screen 140, a processor 150, a motor 160, at least one sensor 170, and a communication circuit 180. The sensor 170 is disposed on the electronic device 100 and is configured to generate an environment sensing signal SS according to a detected state or change of the environment.
In the present disclosure, when the electronic device 100 is in the live mode, the processor 150 can determine whether an environment dynamic change occurs according to the environment sensing signal SS generated by the sensor 170, and selectively turn on or off a specific function, such as an avatar function in a game application. The detailed technical approaches for detecting the dynamic environment change and turning on/off the avatar function will be fully described in the following embodiments.
The processor 150 is electrically connected to the camera module 130, the display screen 140, the motor 160, and the orientation sensing elements 170a and 170b. The motor 160 is electrically connected to the camera module 130 for driving the camera module 130 to turn relative to the body 110.
In the embodiment of FIG. 1B, the sensor 170 on the electronic device 100 may include two orientation sensing elements 170a and 170B. The two sets of orientation sensing elements 170a and 170b are respectively disposed on the body 110 and the camera module 130. The orientation sensing assembly 170a disposed on the body 110 senses the orientation of the body 110 to generate orientation information. The orientation sensing component 170b disposed on the camera module 130 senses the orientation of the camera module 130 to generate another orientation information. The orientation information generated by the two orientation sensing elements 170a and 170b can be used as the environment sensing signal SS.
In some embodiments, the processor 150 is a Central Processing Unit (CPU), an Application-specific integrated circuit (ASIC), a multiprocessor, a decentralized processing system, or a suitable processing circuit. In one embodiment, the display screen 140 is a touch screen.
In some embodiments, the orientation sensing elements 170a and 170b in the sensor 170 each include at least one of a gyroscope or a gravity sensor. The gyroscope detects the current angular velocity of the body 110 and the camera module 130 as orientation information, and the gravity sensor detects the current gravity value of the body 110 and the camera module 130 as orientation information. The processor 150 can therefore determine the included angle between the camera module 130 and the body 110 from the angular velocity detected by the gyroscope, and can determine whether the electronic device 100 is lying flat or upright, as well as the angle between the camera module 130 and the body 110, from the gravity value detected by the gravity sensor. In some embodiments, the electronic device 100 may further include circuit elements such as a graphics card (not shown) or an audio/video processing circuit (not shown), which can be controlled by the processor 150 to provide processed image data to the display screen 140 for display.
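As a rough illustration (not part of the patent itself), the included angle can be estimated from the two gravity readings. The function names and the single-hinge-axis assumption below are illustrative only:

```python
import math

def tilt_angle_deg(gravity_xyz):
    """Angle in degrees between a module's face normal (its z axis) and
    gravity: 0 when the module lies flat, 90 when it stands upright."""
    gx, gy, gz = gravity_xyz
    horizontal = math.hypot(gx, gy)  # gravity component in the module's x-y plane
    return math.degrees(math.atan2(horizontal, gz))

def included_angle_deg(body_gravity, camera_gravity):
    """Included angle between body and camera module, estimated as the
    difference of their individual tilt readings (a simplification that
    assumes both parts rotate about the same hinge axis)."""
    return abs(tilt_angle_deg(camera_gravity) - tilt_angle_deg(body_gravity))
```

With readings from the two gravity sensors (one on the body, one on the camera module), a body lying flat and a camera module standing upright would yield an included angle of about 90 degrees.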
Referring to fig. 2A to 2C, fig. 2A to 2C are schematic diagrams illustrating the camera module 130 of the electronic device 100 flipped to different angles relative to the body in some embodiments. Since the camera module 130 can rotate relative to the body 110, when the camera module 130 rotates to different positions, the extending axis AX2 of the camera module 130 forms different angles relative to the extending axis AX1 of the body 110, as shown in fig. 2A and 2C.
In one embodiment, as shown in fig. 2A, when the electronic device 100 is in the front lens mode (e.g., the electronic device 100 is performing a live video function), the camera module 130 is flipped to the topmost position (a first position) shown in fig. 2A. The extending axis AX2 of the camera module 130 is then opposite to the extending axis AX1 of the display screen 140, the included angle θs between the axes AX1 and AX2 is 180 degrees, and the lens of the camera module 130 faces entirely toward the front of the body (i.e., the same direction as the display screen 140).
In one embodiment, as shown in fig. 2B, when the electronic device 100 is in the rear lens mode (e.g., the user operates the electronic device 100 to shoot or record the surrounding scene), the camera module 130 is flipped to the bottom and returned to the accommodating space 120, at the position (a second position) shown in fig. 2B. The included angle between the extending axis AX2 of the camera module 130 and the extending axis AX1 of the body 110 is then 0 degrees, and the lens of the camera module 130 faces entirely toward the rear of the body (i.e., opposite to the display screen 140).
Referring to fig. 2C, in this embodiment the electronic device 100 is in the front lens mode, but an external force has turned the camera module 130 back to a position where the included angle θs between the camera module 130 and the body 110 is approximately 130 degrees. The camera module 130 has thus deviated greatly from the standard position of the front lens mode (i.e., the first position, where the included angle is 180 degrees). When the electronic device 100, originally in the front lens mode, detects that the deflection angle between the camera module 130 and the body 110 has changed by 50 degrees due to the external force (the included angle θs changing from 180 degrees to 130 degrees), the avatar frame VM captured by the camera module 130 no longer matches the expected image data (for example, the image no longer contains the user). The processor 150 can then determine from the environment sensing signal SS (i.e., the orientation information generated by the orientation sensing elements 170a and 170b) that the camera module 130 has been flipped by an external force, and accordingly determine that a dynamic environment change has occurred.
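The 180-to-130-degree example above amounts to a simple threshold test on the measured included angle. The 30-degree tolerance below is an assumed value for illustration, not one given in the patent:

```python
FRONT_MODE_ANGLE_DEG = 180.0   # expected included angle in front lens mode (fig. 2A)
FLIP_TOLERANCE_DEG = 30.0      # assumed tolerance before a flip is reported

def environment_changed(current_angle_deg,
                        expected_angle_deg=FRONT_MODE_ANGLE_DEG,
                        tolerance_deg=FLIP_TOLERANCE_DEG):
    """Report a dynamic environment change when the measured included angle
    deviates from the expected pose by more than the tolerance."""
    return abs(current_angle_deg - expected_angle_deg) > tolerance_deg
```

With these values, the 130-degree pose of fig. 2C (a 50-degree deviation) is flagged, while small wobbles around 180 degrees are not.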
Fig. 3 is a block diagram illustrating an internal architecture of an operating system 151 executed by the processor 150 according to some embodiments of the invention. The operating system 151 includes an execution core module (kernel module) 1510, a hardware abstraction layer module (HAL module) 1511, and an application software execution module 1512.
In an embodiment, the operating system 151 is an Android (Android) system, and in this embodiment, the execution core module 1510 is an execution core layer of the Android system, the hardware abstraction layer module 1511 is a hardware abstraction layer of the Android system, and the application execution module 1512 is an application software layer of the Android system.
In another embodiment, the execution core module 1510, the hardware abstraction layer module 1511 and the application software execution module 1512 are implemented by the processor 150, a processing circuit or an application-specific integrated circuit (ASIC).
When a user starts an application program on the electronic device and live-streams it, the processor 150 loads and executes the application program in the application software execution module 1512 of the operating system 151. In one embodiment, the application is a game application and the live broadcast is a game live broadcast. The processor 150 executes the game content of the game application to generate a live frame accordingly (which may include an avatar frame in addition to the game frame containing the game content itself).
Referring to fig. 4A and 4B together, fig. 4A and 4B respectively illustrate live frames D1 and D2 under different situations. As shown in fig. 4A, when the processor 150 loads and executes the application program, the display frame AP is generated. In one embodiment, the display frame AP is a game frame that includes a virtual character of the game application, the game's operation interface, the game's story dialogue, and the like.
When the avatar display function of the application is activated, the processor 150 controls the camera module 130 to capture the avatar frame VM of the current user and combines the avatar frame VM and the display frame AP into the live frame D1. In one embodiment, the execution core module 1510 receives the avatar frame VM from the camera module 130, the hardware abstraction layer module 1511 receives it from the execution core module 1510, and the application software execution module 1512 receives it from the hardware abstraction layer module 1511. The application software execution module 1512 then combines the display frame AP generated by the application program with the avatar frame VM to form the live frame D1. In one embodiment, the live frame D1 is a live game frame.
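The combination of display frame and avatar frame can be pictured as layering one frame over the other. The dict-based frame model and the corner anchor below are illustrative assumptions, not the patent's implementation:

```python
def compose_live_frame(display_frame, avatar_frame=None):
    """Build a live frame from the application's display frame, optionally
    overlaying the avatar frame as a picture-in-picture layer (live frame D1).
    Without the avatar, the result carries the display frame alone (D2)."""
    live = {"layers": [{"content": display_frame}]}
    if avatar_frame is not None:
        live["layers"].append({"content": avatar_frame, "anchor": "top-left"})
    return live
```

Calling it with both frames models live frame D1; calling it with the display frame alone models live frame D2.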
At this time, the application software executing module 1512 of the processor 150 may display the live view D1 on the display screen 140 or transmit the live view D1 to an external server (not shown) through the communication circuit 180 coupled to the processor 150, so that a near-end viewer can see the live view D1 through the display screen 140 and a far-end viewer can see the live view D1 through the external server.
In the embodiment shown in fig. 4A, the display frame AP may be the execution frame of a game application being played by the user of the electronic device 100 (e.g., a live game host), and the avatar frame VM is the image of that user captured by the camera module 130 in self-portrait mode. A viewer watching the live frame D1 therefore sees both the game frame and the avatar of the user currently playing, and can follow the actions and expressions of the user (e.g., the live game host).
When the electronic device 100 is in the live mode, the processor 150 determines whether a dynamic environment change has occurred according to the environment sensing signal SS generated by the sensor 170. The execution core module 1510 receives the environment sensing signal SS from the sensor 170, the hardware abstraction layer module 1511 receives the environment sensing signal SS and the avatar frame VM from the execution core module 1510, and the application software execution module 1512 receives both from the hardware abstraction layer module 1511. The application software execution module 1512 then detects from the environment sensing signal SS whether a dynamic environment change has occurred.
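The kernel-to-HAL-to-application relay can be sketched as a chain of layers, each handing the received signal upward. The class and field names here are illustrative only:

```python
class Layer:
    """One stage of the execution core -> HAL -> application chain."""
    def __init__(self, name, upper=None):
        self.name = name
        self.upper = upper        # next layer toward the application
        self.received = []        # signals seen at this stage

    def deliver(self, signal):
        self.received.append(signal)
        if self.upper is not None:
            self.upper.deliver(signal)

# Wire the three stages and push one sensor reading in from the bottom.
app = Layer("application software execution module 1512")
hal = Layer("hardware abstraction layer module 1511", upper=app)
kernel = Layer("execution core module 1510", upper=hal)
kernel.deliver({"signal": "SS", "value": (0.0, 0.0, 9.81)})
```

Every reading delivered at the kernel stage reaches the application stage, which mirrors how the environment sensing signal SS and the avatar frame VM propagate upward in fig. 3.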
In one embodiment, when the electronic device 100 is in the live mode and the processor 150 determines from the environment sensing signal SS that a dynamic environment change has occurred, the processor 150 sends a notification signal to the application program to turn off the avatar display function. In detail, when the application software execution module 1512 in the operating system 151 executed by the processor 150 detects from the environment sensing signal SS that a dynamic environment change has occurred (for example, an external force pulls the camera module 130 and changes the included angle between the camera module 130 and the body 110), the module notifies the application program to turn off the avatar function. The application software execution module 1512 then generates the live frame D2 from the display frame AP alone and displays it on the display screen 140, as shown in fig. 4B, or transmits it through the communication circuit 180 to an external server (not shown), so that a near-end viewer sees the live frame D2 on the display screen 140 and a far-end viewer sees it through the external server.
The live frame D2 shown in fig. 4B includes only the display frame AP generated by the application program, and not the avatar frame VM captured by the camera module 130. That is, when a dynamic environment change occurs (for example, an external force pulls the camera module 130 away from its preset position), the live frame D2 no longer broadcasts the avatar of the current user (e.g., the game live host), preventing private scenes of the user from being transmitted by accident during the live broadcast.
In addition, when the application software execution module 1512 on the processor 150 determines from the environment sensing signal SS that a dynamic environment change has occurred, the processor 150 may notify the camera module 130 to stop capturing the avatar frame VM, and may control the camera module 130 to flip to the bottom so that it returns to the accommodating space 120, at the position (the second position) illustrated in fig. 2B, thereby preventing the camera module 130 from being damaged by the environmental change.
In another embodiment, the sensor 170 for sensing the dynamic change of the environment is not limited to the two orientation sensing elements 170a and 170b in the above embodiments.
In another embodiment, the at least one sensor 170 may comprise an optical sensor 170c, as shown in fig. 1B, and the environment sensing signal comprises an optical sensing reading detected by the optical sensor 170c. The optical sensor 170c is coupled to the processor 150 and transmits the optical sensing reading to it, and the application software execution module 1512 detects from the received reading whether a dynamic environment change has occurred. For example, while the electronic device 100 is running a game and live-streaming (as in fig. 4A, where with the avatar function on, the output live frame D1 includes both the avatar frame VM and the display frame AP), the optical sensing reading is normally stable within a certain range of values. A sudden large change in the reading may indicate that the electronic device 100 has slid or flipped to a different orientation, at which point a dynamic environment change is detected. The application software execution module 1512 on the processor 150 then notifies the game application to turn off the avatar display function, and the output live frame D2 no longer includes the avatar frame VM. In some embodiments, the processor 150 may also notify the camera module 130 to stop capturing the avatar frame VM and may control the camera module 130 to flip to the bottom.
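The "stable within a range, then a sudden large change" criterion can be modeled as a deviation-from-recent-mean check. The window size and the 50% jump ratio below are assumptions chosen for illustration:

```python
from collections import deque

class SuddenChangeDetector:
    """Flag a dynamic environment change when a reading jumps far outside
    the recent stable range of a sensor (e.g., the optical sensor 170c)."""
    def __init__(self, window=8, max_jump_ratio=0.5):
        self.history = deque(maxlen=window)
        self.max_jump_ratio = max_jump_ratio

    def update(self, reading):
        changed = False
        if len(self.history) == self.history.maxlen:
            mean = sum(self.history) / len(self.history)
            if mean != 0 and abs(reading - mean) / abs(mean) > self.max_jump_ratio:
                changed = True    # reading left the stable range abruptly
        self.history.append(reading)
        return changed
```

Eight steady readings around 100 followed by a reading of 10 would be flagged, while a drift to 101 would not.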
In another embodiment, the at least one sensor 170 may further include a proximity sensor 170d, as shown in fig. 1B. The environment sensing signal then comprises a proximity sensing reading detected by the proximity sensor 170d, representing the distance between the proximity sensor 170d and the nearest surface in front of it. The proximity sensor 170d is coupled to the processor 150 and transmits the reading to it. Proximity sensing readings typically change significantly when a dynamic environment change occurs (the electronic device 100 slips, hits an object, changes position in space, etc.). The application software execution module 1512 detects from the received reading whether a dynamic environment change has occurred and, when it does, notifies the game application to turn off the avatar display function. In some embodiments, the processor 150 may also notify the camera module 130 to stop capturing the avatar frame VM and may control the camera module 130 to flip to the bottom.
In another embodiment, the at least one sensor 170 can further comprise a hall sensor 170e, as shown in fig. 1B. The environment sensing signal then comprises a hall sensing reading detected by the hall sensor 170e, representing the distribution of the magnetic field around the hall sensor 170e. The hall sensor 170e is coupled to the processor 150 and transmits the reading to it. The application software execution module 1512 detects from the received reading whether a dynamic environment change has occurred and, when it does, notifies the game application to turn off the avatar display function. In some embodiments, the processor 150 may also notify the camera module 130 to stop capturing the avatar frame VM and may control the camera module 130 to flip to the bottom.
It should be noted that the at least one sensor 170 of the electronic device 100 of the present disclosure may include any combination of the orientation sensing elements 170a and 170b, the optical sensor 170c, the proximity sensor 170d, and the hall sensor 170e of the above embodiments. That is, the electronic device 100 may include a single sensor 170 for sensing dynamic environment changes on its own, or a combination of multiple sensors for evaluating them jointly. A multi-sensor combination may be configured to notify the game application to turn off the avatar display function only when two or more sensors detect the dynamic environment change, so as to avoid false detections caused by an over-sensitive individual sensor.
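The two-or-more-sensors rule described above is, in effect, a quorum vote over per-sensor change flags. The function below is an illustrative sketch of that idea, not the patent's implementation:

```python
def should_disable_avatar(sensor_flags, quorum=2):
    """Disable the avatar display function only when at least `quorum`
    sensors report a dynamic environment change, so that a single
    over-sensitive sensor cannot trigger a false positive."""
    votes = sum(1 for changed in sensor_flags.values() if changed)
    return votes >= quorum
```

For example, agreement between the optical and proximity sensors would disable the avatar, while a lone hall-sensor flag would not.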
In summary, the electronic apparatus and the control method provided in the embodiments of the present invention can prevent private scenes from being captured and leaked when the camera lens is flipped while the user works or live-streams on a mobile phone. Environmental changes around the electronic device are detected by the sensor and processed through the hardware abstraction layer in the processor to control the output streaming frames and related components, thereby protecting the privacy of the user.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (9)

1. An electronic device, comprising:
a body;
a camera module rotatably provided to the body;
a sensor disposed on the electronic device for generating an environment sensing signal; and
a processor electrically connected to the camera module and the sensor, the processor being configured to execute an operating system, the operating system including an application software execution module, wherein when the processor loads and executes an application program in the application software execution module, the application program causes the electronic device to perform the following steps:
generating a display frame;
when an avatar display function of the application program is enabled, controlling the camera module to capture an avatar frame, and combining the avatar frame and the display frame into a live frame; and
when the electronic device is in a live broadcast mode and the processor determines according to the environment sensing signal that a dynamic environment change has occurred, the processor transmits a notification signal to the application program to notify the application program to turn off the avatar display function,
wherein the sensor comprises a plurality of orientation sensing components, the environment sensing signal comprises a plurality of pieces of orientation information, and the application software execution module determines an included angle between the camera module and the body according to the received orientation information and determines, according to the included angle, whether the camera module has been flipped by an external force so as to produce the dynamic environment change.
2. The electronic device of claim 1, wherein the operating system further comprises:
an execution core module configured to receive the avatar frame from the camera module and the environment sensing signal from the sensor; and
a hardware abstraction layer module configured to receive the avatar frame and the environment sensing signal from the execution core module, wherein the application software execution module is configured to execute the application program and to receive the avatar frame and the environment sensing signal from the hardware abstraction layer module.
3. The electronic device according to claim 1, wherein when the avatar display function is activated, the application software execution module combines the avatar frame and the display frame into the live frame.
4. The electronic device according to claim 1, wherein when the application software execution module determines according to the environment sensing signal that the dynamic environment change occurs, the application software execution module notifies the application program to turn off the avatar display function and forms the live frame from the display frame alone.
5. The electronic device of claim 1, wherein the sensor comprises an optical sensor, the environment sensing signal comprises an optical sensing reading, and the application software execution module determines whether the dynamic environment change occurs according to the received optical sensing reading.
6. The electronic device of claim 1, wherein the sensor comprises a proximity sensor, the environment sensing signal comprises a proximity sensing reading, and the application software execution module determines whether the dynamic environment change occurs according to the received proximity sensing reading.
7. The electronic device of claim 1, wherein the sensor comprises a hall sensor, the environment sensing signal comprises a hall sensing reading, and the application execution module determines whether the dynamic environment change occurs according to the received hall sensing reading.
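Claims 5 through 7 vary only in which sensing reading is consulted. A minimal sketch of that dispatch, under the assumption that each sensor type is compared against its own threshold (the threshold values and the greater-than comparison are illustrative; the patent specifies neither):

```python
# Hypothetical sketch of claims 5-7: the application software execution
# module checks whichever sensing reading the sensor provides (optical,
# proximity, or Hall) against an assumed per-sensor threshold to decide
# whether a dynamic environment change has occurred.

THRESHOLDS = {
    "optical": 50.0,    # assumed lux-like threshold
    "proximity": 5.0,   # assumed distance-like threshold
    "hall": 100.0,      # assumed magnetic-field-like threshold
}

def dynamic_environment_change(sensor_type, reading):
    """Return True when the reading for the given sensor type crosses
    its (assumed) threshold, signaling a dynamic environment change."""
    return reading > THRESHOLDS[sensor_type]
```

Depending on the sensor, a real implementation might invert the comparison (e.g., a proximity reading dropping below a threshold when the module is covered); the claims leave that direction open.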
8. The electronic device of claim 1, further comprising a display screen electrically connected to the processor,
wherein, when the avatar display function is activated, the live-stream image formed by combining the avatar image and the display image is shown on the display screen,
and when the application program closes the avatar display function, the live-stream image produced from the display image alone is shown on the display screen.
9. The electronic device of claim 1, further comprising a communication circuit coupled to the processor,
wherein, when the avatar display function is activated, the live-stream image formed by combining the avatar image and the display image is transmitted to an external server through the communication circuit,
and when the application program closes the avatar display function, the live-stream image produced from the display image alone is transmitted to the external server through the communication circuit.
CN202010030418.2A 2019-05-15 2020-01-13 Electronic device Active CN111953822B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962848074P 2019-05-15 2019-05-15
US62/848,074 2019-05-15

Publications (2)

Publication Number Publication Date
CN111953822A CN111953822A (en) 2020-11-17
CN111953822B true CN111953822B (en) 2021-11-02

Family

ID=73335603

Family Applications (5)

Application Number Title Priority Date Filing Date
CN201910949732.8A Active CN111953866B (en) 2019-05-15 2019-10-08 Electronic device
CN201911188589.1A Active CN111953844B (en) 2019-05-15 2019-11-28 Electronic device
CN201911197152.4A Active CN111953867B (en) 2019-05-15 2019-11-29 Electronic device
CN202010017140.5A Active CN111953868B (en) 2019-05-15 2020-01-08 Electronic device
CN202010030418.2A Active CN111953822B (en) 2019-05-15 2020-01-13 Electronic device


Country Status (2)

Country Link
CN (5) CN111953866B (en)
TW (5) TWI708986B (en)

Family Cites Families (63)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3689946B2 (en) * 1995-10-05 2005-08-31 ソニー株式会社 Data processing apparatus and method
US20050014527A1 (en) * 2003-07-18 2005-01-20 Agere Systems Incorporated Retractable rotatable camera module for mobile communication device and method of operation thereof
CN2694316Y (en) * 2003-10-18 2005-04-20 鸿富锦精密工业(深圳)有限公司 Portable electronic apparatus with built-in digital camera
CN2692932Y (en) * 2004-02-09 2005-04-13 宏达国际电子股份有限公司 Hand-held electronic apparatus
CN1633136B (en) * 2004-12-29 2010-09-22 海信集团有限公司 Handset with automatic video switching
CN1979322A (en) * 2005-12-02 2007-06-13 上海乐金广电电子有限公司 Camera image regulation system using acceleration sensor and method therefor
KR100731592B1 * 2007-04-06 2007-06-22 주식회사 영국전자 Transceiver capable of image combination, self-checking, and voice communication
CN101448142A (en) * 2007-11-27 2009-06-03 鸿富锦精密工业(深圳)有限公司 Image tracking device and image tracking method thereof
JP5079549B2 (en) * 2008-03-03 2012-11-21 パナソニック株式会社 Imaging device and imaging device body
CN101634871A (en) * 2008-07-23 2010-01-27 华硕电脑股份有限公司 Portable electronic device
CN101686327A (en) * 2008-09-27 2010-03-31 佛山市顺德区汉达精密电子科技有限公司 Adjusting system of image presenting angle of camera shooting system
US8823743B2 (en) * 2009-10-02 2014-09-02 Sony Corporation Image processing device and method, and program
CN102082907B (en) * 2009-11-30 2013-05-08 比亚迪股份有限公司 Method and system for monitoring surrounding environment of user by utilizing portable terminal camera
JP5439272B2 (en) * 2010-04-26 2014-03-12 オリンパスイメージング株式会社 Vibrating device and imaging device using the same
CN201985933U (en) * 2010-12-31 2011-09-21 上海华勤通讯技术有限公司 Mobile phone with turning lens
CN102170493B (en) * 2011-02-12 2015-04-22 惠州Tcl移动通信有限公司 Mobile phone as well as method and device for controlling video calls of mobile phone
TWM417729U (en) * 2011-04-11 2011-12-01 Dxg Technology Corp Hand-held electronic device
TWI537670B (en) * 2011-12-13 2016-06-11 鴻海精密工業股份有限公司 Dual lens camera
TWM436853U (en) * 2012-04-24 2012-09-01 Abunno Technology Co Ltd Spherical lens holder
US20140320604A1 (en) * 2013-04-24 2014-10-30 Nvidia Corporation Reusing a standalone camera as part of a three-dimensional (3d) camera in a data processing device
CN103500007A (en) * 2013-09-27 2014-01-08 深圳市金立通信设备有限公司 Method for determining rotation angle of rotating member of terminal and terminal
US10021296B2 (en) * 2013-12-31 2018-07-10 Futurewei Technologies, Inc. Automatic rotatable camera for panorama taking in mobile terminals
US20150288880A1 (en) * 2014-04-07 2015-10-08 Wu-Hsiung Chen Image capturing method and image capturing apparatus
CN104023105A (en) * 2014-06-13 2014-09-03 广东欧珀移动通信有限公司 Detection device and detection method of angled rotation of camera of mobile terminal
CN110650241B (en) * 2014-06-16 2021-01-29 华为技术有限公司 Method for presenting panoramic photo in mobile terminal and mobile terminal
CN105320270B (en) * 2014-07-18 2018-12-28 宏达国际电子股份有限公司 For executing the method and its electronic device of face function
CN204425471U (en) * 2014-12-08 2015-06-24 路宽 Mobile terminal and Full-automatic turn-over type camera assembly
CN104469165B (en) * 2014-12-23 2017-11-07 广东欧珀移动通信有限公司 Mobile terminal and the method for detecting camera rotation status
KR102373170B1 (en) * 2015-01-07 2022-03-11 삼성전자주식회사 A mehtod for simultaneously displaying one or more items and an electronic device therefor
CN104601870B (en) * 2015-02-15 2018-04-13 广东欧珀移动通信有限公司 The image pickup method and mobile terminal of a kind of rotating camera
US10291847B2 (en) * 2015-02-26 2019-05-14 Htc Corporation Portable device and manipulation method
CN204536890U (en) * 2015-04-29 2015-08-05 天津市百分百信息技术有限公司 The home safety monitor system of privacy can be protected
CN104850803A (en) * 2015-05-25 2015-08-19 小米科技有限责任公司 Terminal control method and apparatus
CN106293029B (en) * 2015-05-30 2020-12-08 深圳富泰宏精密工业有限公司 Portable electronic device and camera module control method thereof
CN104966029A (en) * 2015-06-12 2015-10-07 联想(北京)有限公司 Information processing method and electronic equipment
US9906721B2 (en) * 2015-10-30 2018-02-27 Essential Products, Inc. Apparatus and method to record a 360 degree image
CN105357436B (en) * 2015-11-03 2018-07-03 广东欧珀移动通信有限公司 For the image cropping method and system in image taking
CN205596193U (en) * 2015-12-02 2016-09-21 南昌欧菲光电技术有限公司 Terminal with rotatable camera
KR102530469B1 (en) * 2016-01-08 2023-05-09 삼성전자주식회사 Electronic Apparatus and Controlling Method thereof
CN105657270A (en) * 2016-01-29 2016-06-08 宇龙计算机通信科技(深圳)有限公司 Photographed video processing method, device and equipment
CN105827847B (en) * 2016-04-08 2019-02-12 Oppo广东移动通信有限公司 A kind of guard method that mobile terminal falls, mobile terminal and computer readable storage medium
CN107517344A (en) * 2016-06-15 2017-12-26 珠海格力电器股份有限公司 The method of adjustment and device of camera device identification range
CN106484349A (en) * 2016-09-26 2017-03-08 腾讯科技(深圳)有限公司 The treating method and apparatus of live information
CN107302655B (en) * 2016-09-29 2019-11-01 维沃移动通信有限公司 It is a kind of to shoot the adjusting method and mobile terminal found a view
CN106804010B (en) * 2017-03-16 2020-02-14 北京安云世纪科技有限公司 Incoming call processing method and device based on virtual reality equipment and virtual reality equipment
CN107147927B (en) * 2017-04-14 2020-04-03 北京小米移动软件有限公司 Live broadcast method and device based on live broadcast wheat connection
CN107277315A (en) * 2017-05-29 2017-10-20 孙海燕 The live video camera of one kind building
CN107333055B (en) * 2017-06-12 2020-04-03 美的集团股份有限公司 Control method, control device, intelligent mirror and computer readable storage medium
TWM556446U (en) * 2017-09-26 2018-03-01 Jiuh Rong Electronics Co Ltd Electronic product with camera lens shielding device
CN107911579A (en) * 2017-11-07 2018-04-13 广东欧珀移动通信有限公司 Mobile terminal
CN207530941U (en) * 2017-11-13 2018-06-22 郑宇杰 Image tracing device
CN107819907B (en) * 2017-11-14 2019-12-27 维沃移动通信有限公司 Camera control method and mobile terminal
CN107671862B (en) * 2017-11-17 2020-01-21 珠海市一微半导体有限公司 Method for processing robot stuck
CN208128319U (en) * 2017-11-20 2018-11-20 张雷 A kind of concealed overturning regulating device of mobile phone camera
CN108363946B (en) * 2017-12-29 2022-05-03 成都通甲优博科技有限责任公司 Face tracking system and method based on unmanned aerial vehicle
CN108566510B (en) * 2018-01-31 2020-08-21 努比亚技术有限公司 Flexible screen control method, mobile terminal and readable storage medium
CN108683795A (en) * 2018-03-30 2018-10-19 中兴通讯股份有限公司 Mobile terminal and its control method and computer readable storage medium
CN208015863U (en) * 2018-04-26 2018-10-26 广州高瑞信息科技有限公司 A kind of IP Camera with privacy protection function
CN108965580B (en) * 2018-06-08 2021-01-01 Oppo广东移动通信有限公司 Anti-falling control method and device for sliding assembly and electronic device
CN108924287B (en) * 2018-06-08 2020-12-15 Oppo广东移动通信有限公司 Anti-falling control method and device for sliding assembly and electronic device
CN109167894B (en) * 2018-06-15 2019-12-31 Oppo广东移动通信有限公司 Camera control method and device, mobile terminal and storage medium
CN109525837B (en) * 2018-11-26 2020-07-21 维沃移动通信有限公司 Image generation method and mobile terminal
CN110602517B (en) * 2019-09-17 2021-05-11 腾讯科技(深圳)有限公司 Live broadcast method, device and system based on virtual environment

Also Published As

Publication number Publication date
CN111953844B (en) 2022-03-15
TWI724740B (en) 2021-04-11
CN111953868A (en) 2020-11-17
CN111953866B (en) 2021-11-26
CN111953844A (en) 2020-11-17
TWI714370B (en) 2020-12-21
TW202044817A (en) 2020-12-01
CN111953866A (en) 2020-11-17
TWI709807B (en) 2020-11-11
TW202043898A (en) 2020-12-01
TW202043900A (en) 2020-12-01
TW202043899A (en) 2020-12-01
CN111953867B (en) 2021-11-26
TWI708986B (en) 2020-11-01
TW202044029A (en) 2020-12-01
CN111953867A (en) 2020-11-17
CN111953868B (en) 2022-01-18
TWI726577B (en) 2021-05-01
CN111953822A (en) 2020-11-17

Similar Documents

Publication Publication Date Title
EP4093012A1 (en) Camera module and electronic device
CN110213608B (en) Method, device, equipment and readable storage medium for displaying virtual gift
CN109660855B (en) Sticker display method, device, terminal and storage medium
CN110368689B (en) Game interface display method, system, electronic equipment and storage medium
KR102439257B1 (en) A mobile device including a camera module
CN110225048B (en) Data transmission method and device, first terminal and storage medium
EP4057618A1 (en) Camera starting method and electronic device
WO2021104357A1 (en) Electronic apparatus, and image capturing method
WO2021082744A1 (en) Video viewing method and electronic apparatus
US10979700B2 (en) Display control apparatus and control method
US10924789B2 (en) Display control apparatus, control method for display control apparatus, and non-transitory computer readable medium
JP2018180051A (en) Electronic device and control method thereof
CN112130945A (en) Gift presenting method, device, equipment and storage medium
CN108317992A (en) A kind of object distance measurement method and terminal device
US11706378B2 (en) Electronic device and method of controlling electronic device
US11100903B2 (en) Electronic device and control method for controlling a display range on a display
CN111953822B (en) Electronic device
US11079898B2 (en) Electronic device for controlling display of VR image, control method of electronic device, and non-transitory computer readable medium
CN110493457B (en) Terminal device control method and terminal device
US11477385B2 (en) Electronic device with rotatable camera for protecting privacy
US10911593B2 (en) Electronic device having a rotatable camera module
CN110233966B (en) Image generation method and terminal
JP7187307B2 (en) Electronic device and its control method
JP2018180050A (en) Electronic device and control method thereof
CN110996115A (en) Live video playing method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant