CN111314520A - Terminal equipment - Google Patents

Terminal equipment

Info

Publication number
CN111314520A
CN111314520A
Authority
CN
China
Prior art keywords
pattern
collector
image
sub
interaction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010196267.8A
Other languages
Chinese (zh)
Other versions
CN111314520B (en)
Inventor
Li Liang (李亮)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010196267.8A
Publication of CN111314520A
Application granted
Publication of CN111314520B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0272 Details of the structure or mounting of specific components for a projector or beamer module assembly
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02 Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/0202 Constructional details or processes of manufacture of the input device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/0202 Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026 Details of the structure or mounting of specific components
    • H04M1/0264 Details of the structure or mounting of specific components for a camera module assembly
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72427 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality for supporting games or graphical animations
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/52 Details of telephonic subscriber devices including functional features of a camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/54 Details of telephonic subscriber devices including functional features of a projector or beamer module assembly

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a terminal device comprising a body. The body includes a processor, together with a first projection collector and a second projection collector arranged on opposite sides of the body. The first projection collector projects a first interaction pattern and captures a first image of the first interaction pattern; the second projection collector projects a second interaction pattern and captures a second image of the second interaction pattern. Based on the first image and the second image, the processor determines that a first operation body has performed a first interaction on the first interaction pattern, or that a second operation body has performed a second interaction on the second interaction pattern, and executes the control corresponding to the first interaction or the second interaction on the terminal device.

Description

Terminal equipment
Technical Field
The embodiments of the present application relate to, but are not limited to, electronic technology, and in particular to a terminal device.
Background
Terminal devices such as mobile phones have become necessities in daily life. A terminal device lets the user control it through the touch buttons shown on its display screen, but this single control mode cannot satisfy the demand for diversified control. How terminal devices such as mobile phones can provide other control modes, and thereby meet the user's need for diversified control, is therefore a problem awaiting a solution.
Disclosure of Invention
The embodiment of the application provides a terminal device comprising a body. The body includes a processor, together with a first projection collector and a second projection collector arranged on opposite sides of the body;
the first projection collector is configured to project a first interaction pattern and to capture a first image of the first interaction pattern;
the second projection collector is configured to project a second interaction pattern and to capture a second image of the second interaction pattern;
the processor is configured to determine, based on the first image and the second image, that a first interaction has been performed on the first interaction pattern by a first operation body, or that a second interaction has been performed on the second interaction pattern by a second operation body, and to execute the control corresponding to the first interaction or the second interaction on the terminal device.
In the embodiment of the application, the terminal device comprises a body; the body includes a processor, and a first projection collector and a second projection collector arranged on opposite sides of the body; the first projection collector projects a first interaction pattern and captures a first image of it; the second projection collector projects a second interaction pattern and captures a second image of it; based on the first image and the second image, the processor determines that the first operation body has performed the first interaction on the first interaction pattern, or that the second operation body has performed the second interaction on the second interaction pattern, and executes the corresponding control on the terminal device. In this way, the terminal device can project the first interaction pattern and the second interaction pattern through two different projection collectors, and the user can control the terminal device simply by interacting with the first interaction pattern and/or the second interaction pattern, which provides another control scheme for the terminal device.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without inventive effort.
fig. 1 is a schematic diagram of a hardware structure of a projection system provided in the related art;
fig. 2a is a schematic structural diagram of a terminal device in a first direction according to an embodiment of the present disclosure;
fig. 2b is a schematic structural diagram of a terminal device in a second direction according to an embodiment of the present application;
fig. 3a is a schematic structural diagram of another terminal device provided in an embodiment of the present application in a first direction;
fig. 3b is a schematic structural diagram of another terminal device provided in the embodiment of the present application in a second direction;
fig. 4a is a schematic structural diagram of a terminal device according to an embodiment of the present application, with the support body folded against the body;
fig. 4b is a schematic structural diagram of a terminal device according to an embodiment of the present application, with the support body propped open relative to the body.
It should be noted that the drawings in the embodiments of the present application only illustrate the schematic positions of the respective devices on the terminal device and do not represent their actual positions; the actual positions may change or shift according to actual conditions (for example, the structure of the terminal device), and the proportions of different parts in the drawings do not represent their actual proportions.
Detailed Description
The technical solution of the present application is further elaborated below with reference to the drawings and the embodiments.
The projection keyboard is a new projection-based input mode. It adopts virtual-keyboard technology: a projector built into the projection device projects the outline of a standard keyboard onto any surface, infrared technology then tracks the motion of the fingers, and the input information is finally acquired. The specific method of tracking finger motion with infrared is as follows. A linear infrared light source is arranged at the bottom of the projection device, and the emitted linear infrared signal detects whether a finger has pressed down; when a finger presses down, the infrared light is reflected at a certain angle, and a camera arranged above the linear infrared light source receives the reflected infrared signal. The camera may be fitted with an infrared filter so that only infrared light enters it, avoiding interference from visible light. The camera sends the captured image to a computer, and the computer determines which key of the virtual keyboard the finger pressed based on the finger's infrared reflection spot in the image.
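As a concrete illustration of this spot-detection step, the sketch below finds the centroid of the brightest region in a grayscale infrared frame by simple thresholding. It is a minimal stand-in for a real detection pipeline; the threshold value, frame size, and spot shape are all invented for the example.

```python
import numpy as np

def find_reflection_spot(frame: np.ndarray, threshold: int = 200):
    """Return the (row, col) centroid of the brightest reflection spot
    in a grayscale IR frame, or None if nothing exceeds the threshold
    (i.e. no fingertip is breaking the infrared plane)."""
    mask = frame >= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return (int(rows.mean()), int(cols.mean()))

# Synthetic 8-bit IR frame: dark background with one bright blob.
frame = np.zeros((120, 160), dtype=np.uint8)
frame[40:44, 70:74] = 250           # reflected IR spot from a fingertip
print(find_reflection_spot(frame))  # -> (41, 71)
```

In a real device the frame would come from the camera behind the infrared filter, and the centroid would then be mapped to a key of the projected keyboard.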
Fig. 1 is a schematic diagram of the hardware structure of a projection system provided in the related art. The projection system 1 includes a projection device 11 (e.g., a projector), a camera 12, and a computer 13. The computer 13 controls the projector of the projection device 11 to project a projection pattern (e.g., a keyboard pattern) onto a plane, and controls the line laser of the projection device 11 to project a light sheet parallel to that plane. The computer 13 controls the camera 12 to capture an image of the projection pattern on the plane; the camera 12 sends the captured image to the computer 13, which recognizes the laser spot in the image to determine the user's control operation on the pattern, such as a click or a slide, and implements the corresponding control based on that operation.
However, in the related art there is no solution that integrates the functions of the projection device, the camera, and the computer into a single device; in particular, there is no solution that integrates them into the increasingly ubiquitous mobile phone. On this basis, the embodiments of the present application propose the following terminal device.
In other embodiments of the present application, the terminal device may be a server, a tablet computer, a notebook computer, a palm computer, a personal digital assistant, a portable media player, a smart speaker, a navigation device, a display device, a wearable device such as a smart bracelet, a Virtual Reality (VR) device, an Augmented Reality (AR) device, a pedometer, a digital TV, a desktop computer, or a terminal device in a 5G network.
The technical solutions of the present application, and how they solve the above technical problems, are described in detail below through embodiments and with reference to the drawings. The following specific embodiments may be combined with one another, and the same or similar concepts or processes may not be repeated in some embodiments.
It should be noted that: in the present examples, "first", "second", etc. are used for distinguishing similar objects and are not necessarily used for describing a particular order or sequence.
The technical means described in the embodiments of the present application may be arbitrarily combined without conflict.
Fig. 2a is a schematic structural diagram, in a first direction, of a terminal device provided in an embodiment of the present application, and fig. 2b is a schematic structural diagram of the same terminal device in a second direction. As shown in figs. 2a and 2b, the terminal device 2 includes a body 21.
The body 21 includes a processor (not shown), together with a first projection collector 211 and a second projection collector 212 disposed on opposite sides of the body 21. The processor is disposed inside the body 21 and electrically connected to the first projection collector 211 and the second projection collector 212 respectively, so as to control them.
A projection collector may be provided at an edge or on a bezel of the display panel of the terminal device 2. In an embodiment, the body 21 may include a display screen 213, a front camera 214, a rear camera 215, and four frames surrounding the display screen 213. The four frames are not in the same plane as the display screen 213 and are, respectively, a left frame, a right frame, an upper frame, and a lower frame; a volume control key may be disposed on the right frame, and a charging interface and a speaker may be disposed on the lower frame.
Taking figs. 2a and 2b as an example, the first projection collector 211 and the second projection collector 212 are disposed at the left frame and the right frame respectively. In other embodiments of the present application, the first projection collector 211 and the second projection collector 212 may be disposed at the upper frame and the lower frame respectively, or at the left and right ends, or the top and bottom ends, of the display screen 213. It can be appreciated that, as the screen-to-body ratio of the terminal device 2 continues to increase, the first projection collector 211 and the second projection collector 212 may be disposed under the display screen 213 of the terminal device 2.
When the terminal device 2 has the projection acquisition function enabled, the first projection collector 211 projects a first interaction pattern and captures a first image of the first interaction pattern; the second projection collector 212 projects a second interaction pattern and captures a second image of the second interaction pattern. Based on the first image and the second image, the processor determines that the first operation body has performed the first interaction on the first interaction pattern, or that the second operation body has performed the second interaction on the second interaction pattern, and performs the control corresponding to that interaction on the terminal device 2.
In one embodiment, the first projection collector 211 and the second projection collector 212 may capture the first image and the second image at a first acquisition interval. This interval should be very short, so that the user perceives no input delay, or as little as possible; for example, the first acquisition interval may be between 0.01 second and 1 second, such as 0.01 second, 0.1 second, or 1 second. In another embodiment, to avoid the extra power consumption caused by the first projection collector 211 and the second projection collector 212 sampling too fast, the processor controls them to capture the first image and the second image at a larger second acquisition interval when no finger or finger-reflection spot is detected in the currently obtained first and second images, and at the smaller first acquisition interval once a finger or finger-reflection spot is detected. For example, the second acquisition interval may be between 1 second and 3 seconds, such as 1 second, 2 seconds, or 3 seconds.
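The two-rate capture policy described above can be sketched as a tiny helper. The concrete interval values below are assumptions chosen from within the ranges given in the text, not values specified by the patent:

```python
class AdaptiveSampler:
    """Two-rate capture policy: sample quickly while a finger or
    reflection spot is visible, and fall back to a slower idle rate
    to save power when nothing is detected."""

    def __init__(self, fast: float = 0.1, slow: float = 2.0):
        self.fast, self.slow = fast, slow
        self.interval = slow  # start in the power-saving state

    def update(self, spot_detected: bool) -> float:
        """Return the interval (seconds) to wait before the next frame."""
        self.interval = self.fast if spot_detected else self.slow
        return self.interval
```

A capture loop would call `update()` after analysing each frame and sleep for the returned interval before triggering the collectors again.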
The first operation body may be a finger of the user's left hand, and the second operation body a finger of the user's right hand. In other embodiments, the first operation body and the second operation body may be operating devices such as styluses. The first interaction performed on the first interaction pattern by the first operation body may be a click on a certain key position or target position in the first interaction pattern, or a slide over a certain target area of the first interaction pattern; likewise, the second interaction performed on the second interaction pattern by the second operation body may be a click on a certain key position or target position in the second interaction pattern, or a slide over a certain target area of the second interaction pattern.
For example, suppose the first interaction pattern is a first partial keyboard pattern comprising a plurality of first key identifications, the second interaction pattern is a second partial keyboard pattern comprising a plurality of second key identifications, and a complete keyboard pattern is formed by combining the two. If the processor determines, based on the first image and the second image, that the user has clicked the key identification of the letter S with the left hand, it controls the display screen 213 of the terminal device 2 to display the letter S. For another example, if the first interaction pattern or the second interaction pattern is a cursor-control pattern, the user can slide the operation body over the cursor-control pattern, and the processor may control the change of the cursor position on the terminal device 2 based on the first image or second image so obtained.
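The key-lookup step in this example can be sketched as follows. The half-keyboard layout, cell size, and pixel coordinates are hypothetical; the point is only to show how a laser-spot position in the captured image might map to a key identification:

```python
# Hypothetical left half of a split projected keyboard: a grid of key
# labels; each key occupies one fixed-size cell of the projected image.
LEFT_HALF = [["Q", "W", "E", "R", "T"],
             ["A", "S", "D", "F", "G"],
             ["Z", "X", "C", "V", "B"]]

def spot_to_key(spot, layout, cell_w: int = 40, cell_h: int = 40):
    """Map a (row_px, col_px) laser-spot position in the captured image
    to a key label, or None if the spot falls outside the grid."""
    r, c = spot[0] // cell_h, spot[1] // cell_w
    if 0 <= r < len(layout) and 0 <= c < len(layout[0]):
        return layout[r][c]
    return None

print(spot_to_key((50, 45), LEFT_HALF))  # row 1, col 1 -> "S"
```

A real implementation would first rectify the camera image against the projected pattern (perspective correction) before this lookup; that step is omitted here.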
In the embodiment of the present application, the terminal device can project the first interaction pattern and the second interaction pattern through two different projection collectors, and the user can control the terminal device simply by performing the corresponding interaction on the first interaction pattern and/or the second interaction pattern, which provides another control scheme for the terminal device.
Fig. 3a is a schematic structural diagram, in a first direction, of another terminal device provided in an embodiment of the present application, and fig. 3b is a schematic structural diagram of the same terminal device in a second direction. Referring to figs. 2a to 3b together: the terminal device 2 includes a body 21, and the body 21 includes a processor together with a first projection collector 211 and a second projection collector 212 disposed on opposite sides of the body 21.
The first projection collector 211 includes a first projector 2111, a first collector 2112, and a first laser emitter 2113; the second projection collector 212 includes a second projector 2121, a second collector 2122, and a second laser emitter 2123. The first projector 2111 projects the first interaction pattern, and the second projector 2121 projects the second interaction pattern; the first laser emitter 2113 emits first parallel laser light over the first interaction pattern, and the second laser emitter 2123 emits second parallel laser light over the second interaction pattern; the first collector 2112 captures the first image, and the second collector 2122 captures the second image. The processor is further configured to determine a first laser spot on the first interaction pattern or a second laser spot on the second interaction pattern based on the first image and the second image, and to determine the first interaction or the second interaction based on the first laser spot or the second laser spot.
While the user is using the terminal device 2, the body 21 may be stood up using a phone stand or prop. The first laser emitter 2113 and the second laser emitter 2123 may be disposed at the bottom of the body 21, at the end away from the first projector 2111 and the second projector 2121, i.e., the end of the terminal device 2 close to the bearing surface (e.g., a table) when it is stood up. The laser light emitted by the first laser emitter 2113 and the second laser emitter 2123 is parallel to the horizontal plane and skims just above the surfaces of the first interaction pattern and the second interaction pattern. In one embodiment, the first laser emitter 2113 and the second laser emitter 2123 may emit laser light parallel to the bearing surface based on the angle between the terminal device 2 and the bearing surface when stood up. The first and second parallel laser light may be infrared laser light, other invisible laser light, or visible laser light.
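The angle-based compensation mentioned here can be illustrated with a one-line geometric sketch. The geometry assumed below (the emitter's default axis along the body's surface normal, tilt measured from that normal) is an invention for the example and is not taken from the patent:

```python
def emitter_tilt_deg(stand_angle_deg: float) -> float:
    """If the body stands at `stand_angle_deg` to the bearing surface,
    the body's surface normal makes (90 - stand_angle_deg) degrees with
    the horizontal. An emitter firing along that normal must therefore
    tilt by the complement of the stand angle for its light sheet to
    run parallel to the bearing surface."""
    return 90.0 - stand_angle_deg

# Body perfectly vertical: no correction needed.
print(emitter_tilt_deg(90.0))  # -> 0.0
# Body leaning back to 75 degrees: tilt the emitter by 15 degrees.
print(emitter_tilt_deg(75.0))  # -> 15.0
```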
The first collector 2112 may include two cameras, and the second collector 2122 may include two cameras. Specifically: the first collector 2112 comprises a first sub-collector 2114 and a second sub-collector 2115, and the second collector 2122 comprises a third sub-collector 2124 and a fourth sub-collector 2125. The first sub-collector 2114 captures a first sub-image of the first interaction pattern, the second sub-collector 2115 captures a second sub-image of the first interaction pattern, and the processor determines the first laser spot based on the first sub-image and/or the second sub-image; the third sub-collector 2124 captures a third sub-image of the second interaction pattern, the fourth sub-collector 2125 captures a fourth sub-image of the second interaction pattern, and the processor determines the second laser spot based on the third sub-image and/or the fourth sub-image.
Determining a laser spot may mean determining the position of the spot in the image and, based on that position, determining which key or key identification the user's finger pressed. The first laser spot and the second laser spot may be formed by the user's finger reflecting the laser light. The processor may determine the finger's interaction with the first interaction pattern or the second interaction pattern from the position of the first laser spot in the first image or of the second laser spot in the second image. For example, if the user presses the keyboard identification S with a finger of the left hand, the processor can determine from the position of the laser spot that S was pressed, and accordingly display S on the terminal device 2.
In one embodiment, the processor may decide, based on the usage scenario of the terminal device 2, which collectors are used to capture images of the first interaction pattern and the second interaction pattern. The usage scenario may be at least one of the user's input habits, typing habits, the currently used application program, and the projected first and second interaction patterns. It should be understood that the processor may control the first sub-collector 2114 and the third sub-collector 2124 to operate simultaneously, or the first sub-collector 2114 and the fourth sub-collector 2125 to operate simultaneously. In another embodiment, the processor may control the first sub-collector 2114, the second sub-collector 2115, the third sub-collector 2124, and the fourth sub-collector 2125 to operate simultaneously, so that all four sub-collectors continuously capture images. The processor may then determine the first laser spot from the first sub-image and/or the second sub-image, and the second laser spot from the third sub-image and/or the fourth sub-image, selecting a sub-image based on definition and/or accuracy; alternatively, the processor may obtain a composite image from the first sub-image and the second sub-image and determine the first laser spot from it, and likewise obtain a composite image from the third sub-image and the fourth sub-image and determine the second laser spot from it.
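One way to realize the "select by definition" choice between the two sub-images is a crude gradient-based sharpness score, sketched below. The focus measure and the selection rule are assumptions for illustration; as the text notes, a real device might instead fuse the two frames (e.g., a per-pixel maximum to combine reflection spots):

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Crude focus measure: mean squared magnitude of the image
    gradient. Flat images score 0; high-contrast detail scores high."""
    gy, gx = np.gradient(img.astype(float))
    return float((gx ** 2 + gy ** 2).mean())

def pick_sub_image(sub_a: np.ndarray, sub_b: np.ndarray) -> np.ndarray:
    """Keep whichever sub-collector frame is sharper; spot detection
    then runs on the selected frame only."""
    return sub_a if sharpness(sub_a) >= sharpness(sub_b) else sub_b
```

For the compositing alternative, `np.maximum(sub_a, sub_b)` would be one plausible fusion, since a reflection spot visible in either frame survives the per-pixel maximum.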
In one embodiment, the first sub-collector 2114 and the second sub-collector 2115 are disposed on either side of the first projector 2111, and the third sub-collector 2124 and the fourth sub-collector 2125 on either side of the second projector 2121. It can be appreciated that, in other embodiments, the first sub-collector 2114 and the second sub-collector 2115 can be disposed on the same side as the first projector 2111, and the third sub-collector 2124 and the fourth sub-collector 2125 on the same side as the second projector 2121. On each side, the two sub-collectors, the projector, and the laser emitter may lie on the same axis, or the axis through the two sub-collectors may be perpendicular to the axis through the projector and the laser emitter.
It should be noted that, in other embodiments of the present application, the first collector 2112 and the second collector 2122 may each be a single camera, through which the first image and the second image are captured respectively. The processor is configured to determine the first laser spot on the first interaction pattern or the second laser spot on the second interaction pattern based on the first image and the second image. The first collector 2112 and the second collector 2122 may capture the first image and the second image at the first acquisition interval, or using both the first acquisition interval and the second acquisition interval.
It is worth noting that, in one implementation, any collector in the present application may be a camera with an infrared filter, so as to filter out natural light and make the processor's analysis result accurate. In another embodiment, the collector may lack an infrared filter; although it then cannot filter out natural light, it can be applied to other shooting scenes, giving the collector diversified uses. In yet another embodiment, the collector may lack an infrared filter while the processor processes the obtained first image and second image with an image algorithm, which both avoids the influence of natural light on the detection result and allows the collector to be applied to other shooting scenes.
It should be noted that, in other embodiments of the present application, the first projection collector 211 may include only the first projector 2111 and the first collector 2112, without the first laser emitter 2113, and the second projection collector 212 may include only the second projector 2121 and the second collector 2122, without the second laser emitter 2123. When the user interacts with the first interaction pattern and the second interaction pattern, the positions on the patterns operated by the user's finger can be determined from the first image and the second image captured by the first collector 2112 and the second collector 2122. For the description of the first projector 2111, the first collector 2112, the second projector 2121, and the second collector 2122, refer to the embodiments above; details are not repeated here.
In the embodiment of the present application, the processor determines a first laser spot on the first interaction pattern or a second laser spot on the second interaction pattern based on the first image and the second image, and determines the first interaction or the second interaction based on that laser spot. This provides a scheme for determining the first interaction and the second interaction that is both simple and accurate.
Fig. 4a is a schematic structural diagram of a terminal device provided in an embodiment of the present application with the support body folded against the body, and fig. 4b is a schematic structural diagram with the support body propped open relative to the body. Please refer to figs. 2a to 4b together.
The terminal device 2 further comprises a support body 22. The support body 22 comprises a support main body (not shown) and a first side wall 221 and a second side wall 222 oppositely arranged on two sides of the support main body, and the support body 22 is rotatably connected to the body 21. When the support body 22 is rotated to be attached to the body 21, the first side wall 221 covers the first projection collector 211 and the second side wall 222 covers the second projection collector 212; when the support body 22 is rotated to a target angle with respect to the body 21, the first projection collector 211 and the second projection collector 212 are exposed, and the support body 22 supports the body 21 so that it stands.
The support main body, the first side wall 221, and the second side wall 222 may be formed as an integral structure, and the material of the support body 22 may be rubber, leather, plastic, metal, or the like. In one arrangement, the support body 22 is rotatably connected to the body 21 with the rotating shaft located at the upper side of the body 21, so that the display screen 213 stands vertically when the support body 22 is propped out relative to the body 21. In other embodiments, the rotating shaft is located at the left or right side of the body 21, so that the display screen 213 can be placed horizontally when the support body 22 is propped out relative to the body 21. The rotatable connection between the support body 22 and the body 21 may be implemented by a rotating shaft, which may be a separate component or may be provided on the support body 22 or on the body 21.
Referring to Fig. 4a, when the support body 22 is attached to the body 21, the first projection collector 211 and the second projection collector 212 are hidden from the user of the terminal device 2. Referring to Fig. 4b, when the angle between the support body 22 and the body 21 is the target angle, the first projection collector 211 and the second projection collector 212 are exposed.
By providing the support body 22 on the terminal device 2, when the support body 22 is attached to the body 21 it protects at least the first projection collector 211 and the second projection collector 212, and, when the body 21 has a rear camera 215, it also protects the rear camera 215; the support body 22 can therefore at least replace the protective cases commonly sold for mobile phones. When the support body 22 is propped out relative to the body 21, it supports the phone so that it stands, which makes it convenient for the user to watch content on the display screen 213 of the terminal device 2 and facilitates the projection of the patterns and the acquisition of the images.
In one embodiment, the support body 22 comprises a power supply for supplying power to the electrical equipment of the body 21, or to the battery of the body 21, when the battery level of the body 21 is lower than a target threshold.
The power supply of the support body 22 may be a battery. The power supply may be enclosed or disposed within the support body 22 so that it is not visible to the user. In one embodiment, the body 21 has a charging interface through which the battery in the body 21 is charged, and the support body 22 has a charging interface through which the battery in the support body 22 is charged. In another embodiment, the terminal device 2 may have a charging interface only on the body 21; after the battery in the body 21 is charged through this interface, the battery in the support body 22 is charged from the body 21 by wireless or wired charging.
In one embodiment, the power supply of the support body 22 may be connected by wire to the power management chip in the body 21 so as to supply power to the electrical equipment or the battery of the body 21. In another embodiment, the power supply of the support body 22 may be a wireless power supply that charges the battery in the body 21 by wireless charging; with wireless charging, the connection reliability of the terminal device 2 is high and the terminal device 2 retains good dustproof and waterproof performance.
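The power-routing rule described here can be sketched as a simple selection function (the function name and the threshold value are illustrative assumptions, not taken from the patent):

```python
def select_power_source(body_level, support_present, threshold=0.20):
    """Choose which battery powers the device: fall back to the
    support body's battery only when the body battery level drops
    below the target threshold and the support body is attached."""
    if body_level < threshold and support_present:
        return "support_battery"
    return "body_battery"
```

A real power-management chip would also handle hysteresis around the threshold so the source does not flap, but that detail is outside the patent's description.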
It should be understood that, in the case that the body 21 includes the rear camera 215, the support main body may be provided with an opening at the position corresponding to the rear camera 215, so as to prevent the support body 22 from obstructing the rear camera 215 and thereby affecting the user's use of it. In addition, it is worth noting that, to facilitate use of the terminal device 2 in different usage scenarios, the support body 22 may be a detachable component.
In an embodiment, the body 21 is provided with a first limiting member and the support body 22 is provided with a second limiting member; the two limiting members cooperate to limit the rotation of the support body 22 relative to the body 21 to within the target angle. In this way, when using the projection and collection function, the user can open the body 21 and the support body 22 to the target angle, which is convenient to operate.
In the embodiment of the present application, the support body is rotatably connected to the body, so the support body can support the phone so that it stands, which is convenient for viewing; when the support body is attached to the body, it protects the first projection collector and the second projection collector on the body. In addition, since a power supply is arranged on the support body, the terminal device has power supplies on both the body and the support body, so the battery that powers the terminal device can be chosen according to its needs, improving the power storage capacity of the terminal device.
With continued reference to Figs. 2a to 4b, based on the foregoing embodiments, an embodiment of the present application provides a terminal device in which, when the projection and collection function of the terminal device 2 is disabled, the first collector 2112 is further configured to collect a first captured image, the second collector 2122 is further configured to collect a second captured image, and the processor is further configured to synthesize a target image based on the first captured image and the second captured image.
For example, in one embodiment, the terminal device 2 may obtain a portrait as the first captured image and a landscape as the second captured image, and composite the portrait into the landscape. In another embodiment, the terminal device 2 may combine or stitch the first captured image and the second captured image to obtain a panoramic image of the current scene. In yet another embodiment, the terminal device 2 may combine the first collector 2112 and the second collector 2122 with the front camera 214 and the rear camera 215 to obtain a new usage scenario; for example, the terminal device 2 may simultaneously turn on the front camera 214, the rear camera 215, the first collector 2112, and the second collector 2122, take pictures with the four cameras at the same time, and synthesize the four images into a panoramic image.
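As a minimal, hypothetical sketch of the combine-or-stitch case (a real implementation would align and blend the overlap between views; this simply places the two captures side by side):

```python
import numpy as np

def splice_side_by_side(left, right):
    """Join the first and second captured images horizontally,
    cropping both to the smaller common height."""
    h = min(left.shape[0], right.shape[0])
    return np.hstack((left[:h], right[:h]))
```

The function name and cropping policy are illustrative; production panorama stitching typically uses feature matching and homography estimation instead.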
In the embodiment of the present application, the user can use the terminal device to shoot a left image, a right image, or both simultaneously, which diversifies the usage scenarios and improves the entertainment value of the mobile phone.
With continued reference to Figs. 2a to 4b, based on the foregoing embodiments, this embodiment provides a terminal device in which the processor is further configured to obtain information about the application currently used by the terminal device 2 and/or information about the usage habits of the current user, and to determine the first interaction pattern and the second interaction pattern based on the information about the application and/or the information about the usage habits.
For example, when the user opens a video application, the first interaction pattern may be a key pattern for controlling video playback, and the second interaction pattern may be a keyboard pattern to make it convenient for the user to send bullet comments. For another example, when the user opens an office application, the first interaction pattern may be a first partial keyboard pattern and the second interaction pattern a second partial keyboard pattern, the two combining to form a full keyboard pattern, so that the user can type with both hands in a comfortable posture and work more efficiently. For yet another example, when the terminal device 2 determines that the user is accustomed to 26-key input, the combination of the projected first interaction pattern and second interaction pattern may include a 26-key pattern; when it determines that the user is accustomed to 9-grid input, the first interaction pattern may be a punctuation pattern and the second interaction pattern a 9-grid pattern.
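The examples above amount to a lookup from application type and input habit to a pair of patterns; a hypothetical sketch follows (all pattern and category names are illustrative, not taken from the patent):

```python
def choose_interaction_patterns(app_type=None, input_habit=None):
    """Pick (first_pattern, second_pattern) from the current app
    and/or the user's input habit, following the examples above."""
    if app_type == "video":
        return ("playback_keys", "keyboard")
    if app_type == "office":
        return ("keyboard_left_half", "keyboard_right_half")
    if input_habit == "26-key":
        return ("keyboard_left_half", "keyboard_right_half")
    if input_habit == "9-grid":
        return ("punctuation", "nine_grid")
    return ("keyboard", "cursor_pad")  # assumed default pairing
```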
In this way, the first interaction pattern and the second interaction pattern are determined according to the information about the currently used application and/or the usage habits of the current user, making the display of the interaction patterns more user-friendly.
With continued reference to Figs. 2a to 4b, based on the foregoing embodiments, an embodiment of the present application provides a terminal device in which the first interaction pattern and the second interaction pattern are the same and overlap on the projection plane.
When the terminal device 2 emits laser light, both the first collector 2112 and the second collector 2122 can capture the light spot produced when the laser is reflected by the user's finger, so the terminal device 2 can reach an accurate interaction judgment from the images collected by the two collectors. When the terminal device 2 does not emit laser light, since the first projector 2111 and the second projector 2121 project the same interaction pattern, a key hidden behind the user's finger in one collector's view can be seen by the other collector. For example, when the user presses the key S, the first collector 2112 may be unable to see the key X and the second collector 2122 may be unable to see the key Z, but by analyzing the two acquired images the processor can determine that the user pressed the key S. The terminal device 2 can therefore reach an accurate interaction judgment from the image collected by the first collector 2112 together with the image collected by the second collector 2122.
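The two-view disambiguation in the S/X/Z example can be sketched as intersecting the candidate keys each view places under the fingertip (a hypothetical simplification of the underlying image analysis):

```python
def resolve_pressed_key(candidates_view1, candidates_view2):
    """Each collector's image yields a set of candidate keys near the
    fingertip (occlusion makes a single view ambiguous). The key
    common to both views is taken as the pressed key."""
    common = set(candidates_view1) & set(candidates_view2)
    return common.pop() if len(common) == 1 else None
```

With the example from the text, view one might report {"S", "X"} and view two {"S", "Z"}; the intersection leaves only "S".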
In the embodiment of the present application, the terminal device can thus reach a more accurate interaction judgment.
With continued reference to Figs. 2a to 4b, based on the foregoing embodiments, the present application provides a terminal device in which the first interaction pattern and the second interaction pattern are different and are respectively projected on the two sides of the body 21.
In one implementation, the first interaction pattern is a first partial keyboard pattern and the second interaction pattern is a second partial keyboard pattern; the two combine to form a full keyboard pattern. The first partial keyboard pattern may be the portion of the keyboard from its left edge to the letter T, and the second partial keyboard pattern the portion from the letter Y to its right edge; the second partial keyboard pattern may or may not include numeric keys. Alternatively, when the user needs to input many numbers, the first partial keyboard pattern may be a 26-key or 9-grid keyboard pattern and the second partial keyboard pattern a numeric keypad pattern.
In another implementation, the first interactive pattern is a first portion of a gamepad pattern, the second interactive pattern is a second portion of the gamepad pattern, and the first portion of the gamepad pattern and the second portion of the gamepad pattern combine to form the gamepad pattern.
In yet another implementation, the first interaction pattern is a keyboard pattern and the second interaction pattern is a cursor control pattern. In this way, the user's manipulation of the projected pattern can be made closer to the manipulation of a physical keyboard and mouse.
In yet another implementation, the first interactive pattern is a first text input mode pattern, and the second interactive pattern is a second text input mode pattern. For example, the first character input mode pattern is a pinyin input mode pattern, and the second character input mode pattern is a handwriting input mode pattern.
It can be understood that the usage scenarios listed above can each be combined with the projection and collection function provided in the embodiments of the present application; the specific combinations are not described one by one here. The usage scenarios of the first interaction pattern and the second interaction pattern are not limited to those listed above, and other usage scenarios are not detailed in the present application.
In the embodiment of the present application, the terminal device offers a variety of choices of first and second interaction patterns, so that its projection and collection function can be applied to many different input and control scenarios, giving the terminal device diverse application modes.
To explain the functions that the terminal device of the embodiment of the present application can realize, please continue to refer to Figs. 2a to 4b in combination. Since the terminal device 2 provided in the embodiment of the present application includes the body 21 and the support body 22, it adds a support body 22 compared with a common mobile phone on the market, and the support body 22 is rotatably connected to the body 21. When the terminal device 2 is in the open state, that is, when the support body 22 is rotated to the target angle relative to the body 21, the first projection collector 211 and the second projection collector 212 are exposed, so that the user can control the terminal device 2 through a virtual keyboard or gamepad projected by the first projection collector 211 and the second projection collector 212. Since the first projection collector 211 and the second projection collector 212 are located on the left and right sides of the opened terminal device 2, the user can operate the keyboard or gamepad with both hands, which greatly improves typing and gaming scenarios. When the terminal device 2 is closed, that is, when the support body 22 is rotated into contact with the body 21, the support body 22 shields and thereby protects the first projection collector 211 and the second projection collector 212.
In addition, when the terminal device 2 is not using the projection keyboard function, the camera in the first projection collector 211 and the camera in the second projection collector 212 can serve as photographing cameras, so the terminal device 2 can be held flat to shoot pictures or video and can capture left and right images simultaneously, which greatly improves the versatility and the office and entertainment efficiency of the terminal device 2.
In addition, because the support body 22 has a battery, which may be replaceable or non-replaceable, the battery capacity of the terminal device 2 can be expanded. When the battery in the body 21 is low, the battery in the support body 22 can power the electrical equipment of the terminal device 2, compensating for the shortage of the battery in the body 21; the battery of the support body 22 can also be charged through the charging circuit of the terminal device 2.
The Processor may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a Central Processing Unit (CPU), a controller, a microcontroller, and a microprocessor. It is understood that the electronic device implementing the above-mentioned processor function may be other electronic devices, and the embodiments of the present application are not particularly limited.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments.
Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict.
The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk.
Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or portions thereof contributing to the related art may be embodied in the form of a software product stored in a storage medium, and including several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
In the embodiments of the present application, the descriptions of the same contents in different embodiments may be mutually referred to.
The above description is only for the embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A terminal device, comprising:
the body comprises a processor, a first projection collector and a second projection collector, wherein the first projection collector and the second projection collector are oppositely arranged on two sides of the body;
the first projection collector is used for projecting a first interactive pattern and collecting a first image aiming at the first interactive pattern;
the second projection collector is used for projecting a second interactive pattern and collecting a second image aiming at the second interactive pattern;
the processor is configured to determine, based on the first image and the second image, that a first interaction is performed on the first interaction pattern by a first operation body, or that a second interaction is performed on the second interaction pattern by a second operation body, and execute control corresponding to the first interaction or the second interaction on the terminal device.
2. The terminal device of claim 1, wherein the first projection collector comprises a first projector, a first collector, and a first laser emitter, and the second projection collector comprises a second projector, a second collector, and a second laser emitter;
the first projector is used for projecting the first interaction pattern, and the second projector is used for projecting the second interaction pattern;
the first laser emitter is used for emitting first parallel laser on the first interactive pattern, and the second laser emitter is used for emitting second parallel laser on the second interactive pattern;
the first collector is used for collecting the first image, and the second collector is used for collecting the second image;
the processor is further configured to determine a first laser spot on the first interaction pattern or a second laser spot on the second interaction pattern based on the first image and the second image, and determine the first interaction or the second interaction based on the first laser spot or the second laser spot.
3. The terminal device according to claim 1, further comprising:
the supporting body comprises a supporting main body, a first side wall and a second side wall, wherein the first side wall and the second side wall are oppositely arranged on two sides of the supporting main body;
the first side wall covers the first projection collector and the second side wall covers the second projection collector under the condition that the supporting body is rotated to be attached to the body;
when the support body rotates to a target angle between the support body and the body, the first projection collector and the second projection collector are exposed, and the support body supports the body to stand.
4. The terminal device according to claim 3, wherein the support body comprises a power supply for supplying power to the electrical equipment of the body or to the battery of the body in case the battery level of the body is below a target threshold.
5. The terminal device according to claim 3, wherein a first limiting member is disposed on the body, a second limiting member is disposed on the supporting body, and the first limiting member and the second limiting member are in limiting engagement to limit rotation of the supporting body and the body within the target angle.
6. The terminal device of claim 2, wherein the first collector comprises a first sub-collector and a second sub-collector, and the second collector comprises a third sub-collector and a fourth sub-collector;
the first sub-collector is configured to collect a first sub-image for the first interaction pattern, the second sub-collector is configured to collect a second sub-image for the first interaction pattern, and the processor is configured to determine the first laser spot based on the first sub-image and/or the second sub-image;
the third sub-collector is configured to collect a third sub-image for the second interaction pattern, the fourth sub-collector is configured to collect a fourth sub-image for the second interaction pattern, and the processor is configured to determine the second laser spot based on the third sub-image and/or the fourth sub-image.
7. The terminal device according to claim 6, wherein the first sub-collector and the second sub-collector are respectively disposed at two sides of the first projector; the third sub-collector and the fourth sub-collector are respectively arranged on two sides of the second projector.
8. The terminal device according to any one of claims 1 to 7, wherein in a case where the terminal device disables the projection capture function, the first collector is further configured to capture a first captured image, the second collector is further configured to capture a second captured image, and the processor is further configured to synthesize a target image based on the first captured image and the second captured image.
9. The terminal device according to any of claims 1 to 7,
the processor is further configured to obtain information of an application program currently used by the terminal device and/or information of a use habit of a current user, and determine the first interaction pattern and the second interaction pattern based on the information of the application program and/or the information of the use habit; the first interactive pattern and the second interactive pattern are the same and are overlapped on a projection plane, or the first interactive pattern and the second interactive pattern are different and are respectively projected on two sides of the body;
wherein, in a case where the first and second interactive patterns are different, the first and second interactive patterns are one of:
the first interactive pattern is a first partial keyboard pattern, and the second interactive pattern is a second partial keyboard pattern; the first partial keyboard pattern and the second partial keyboard pattern are combined to form a keyboard pattern;
the first interactive pattern is a first part of a gamepad pattern, the second interactive pattern is a second part of the gamepad pattern, and the first part of the gamepad pattern and the second part of the gamepad pattern are combined to form the gamepad pattern;
the first interactive pattern is a keyboard pattern, and the second interactive pattern is a cursor control pattern;
the first interactive pattern is a first character input mode pattern, and the second interactive pattern is a second character input mode pattern.
10. The terminal device according to any one of claims 1 to 7, wherein the terminal device is a mobile phone.
CN202010196267.8A 2020-03-19 2020-03-19 Terminal equipment Active CN111314520B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010196267.8A CN111314520B (en) 2020-03-19 2020-03-19 Terminal equipment

Publications (2)

Publication Number Publication Date
CN111314520A true CN111314520A (en) 2020-06-19
CN111314520B CN111314520B (en) 2021-09-21

Family

ID=71148127

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010196267.8A Active CN111314520B (en) 2020-03-19 2020-03-19 Terminal equipment

Country Status (1)

Country Link
CN (1) CN111314520B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103092460A (en) * 2012-04-23 2013-05-08 王道平 Method for inputting Chinese characters on equipment and equipment
CN103324272A (en) * 2012-03-19 2013-09-25 联想(北京)有限公司 Electronic equipment control method and device
CN104869187A (en) * 2015-04-28 2015-08-26 江苏卡罗卡国际动漫城有限公司 Newspaper reading mobile phone with support plate
CN108153428A (en) * 2018-01-26 2018-06-12 哨鸟(深圳)前海科技有限公司 Laser-projection keyboard and its implementation

Also Published As

Publication number Publication date
CN111314520B (en) 2021-09-21

Similar Documents

Publication Publication Date Title
CN109194879B (en) Photographing method, photographing device, storage medium and mobile terminal
CN110276840B (en) Multi-virtual-role control method, device, equipment and storage medium
WO2019153824A1 (en) Virtual object control method, device, computer apparatus, and storage medium
US8854433B1 (en) Method and system enabling natural user interface gestures with an electronic system
EP3742743A1 (en) Method and apparatus for displaying additional object, computer device, and storage medium
CN103119628B (en) Utilize three-dimensional user interface effect on the display of kinetic characteristic
US8872760B2 (en) Recording and reproducing apparatus
CN111726536A (en) Video generation method and device, storage medium and computer equipment
US7064742B2 (en) Input devices using infrared trackers
US9392165B2 (en) Array camera, mobile terminal, and methods for operating the same
CN112351185A (en) Photographing method and mobile terminal
CN108525298A (en) Image processing method, device, storage medium and electronic equipment
CN107333047B (en) Shooting method, mobile terminal and computer readable storage medium
CN108495032A (en) Image processing method, device, storage medium and electronic equipment
US20140132725A1 (en) Electronic device and method for determining depth of 3d object image in a 3d environment image
WO2021208251A1 (en) Face tracking method and face tracking device
CN110052030B (en) Image setting method and device of virtual character and storage medium
CN114741559A (en) Method, apparatus and storage medium for determining video cover
CN112437231B (en) Image shooting method and device, electronic equipment and storage medium
CN111314520B (en) Terminal equipment
CN112511743A (en) Video shooting method and device
CN114466140B (en) Image shooting method and device
CN108650463B (en) A kind of photographic method and mobile terminal
JP2022543510A (en) Imaging method, device, electronic equipment and storage medium
CN108683851B (en) A kind of photographic method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant