KR20160141582A - Mobile terminal and method for controlling the same - Google Patents
Mobile terminal and method for controlling the same
- Publication number
- KR20160141582A (Application: KR1020150077475A)
- Authority
- KR
- South Korea
- Prior art keywords
- image
- main body
- screen
- movement
- display unit
- Prior art date
Classifications
- H04M1/72522
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/0402—Recording geometries or arrangements
- G03H1/041—Optical element in the object space affecting the object beam, not otherwise provided for
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
Description
The present invention relates to a mobile terminal capable of controlling a floating hologram image and a control method thereof.
Terminals can be divided into mobile/portable terminals and stationary terminals depending on whether they are movable. Mobile terminals can be further divided into handheld terminals and vehicle-mounted terminals according to whether the user can carry them directly.
As the functions of such terminals diversify, they are implemented as multimedia devices with composite functions such as taking photos and videos, playing music and video files, gaming, and receiving broadcasts.
To support and enhance such terminal functions, improvements to the structural and/or software aspects of the terminal may be considered.
Furthermore, the terminal may be provided with a device capable of outputting the image displayed on the display unit as a floating hologram image. Here, a floating hologram image is an image generated by projecting the image output on the display unit onto a screen formed obliquely to the display unit; such an image appears to float in the air. Accordingly, there is growing demand for a new user interface that can easily control images output to such a floating hologram device.
It is an object of the present invention to provide a mobile terminal capable of controlling an image projected on a screen of a terminal according to movement of the terminal and a control method thereof.
An object of the present invention is to provide a mobile terminal capable of controlling a projected image without applying a touch input while an image is projected on a screen of a terminal, and a control method thereof.
According to an aspect of the present invention, there is provided a mobile terminal including: a main body; a sensing unit configured to sense movement of the main body; a display unit disposed on a front surface of the main body; a reflector configured to reflect an image output from at least part of the display unit; a screen onto which the reflected image is projected; and a controller configured to control the display unit such that, when movement of the main body is sensed, a first image projected on the screen changes to a second image according to the movement of the main body.
In one embodiment, the controller restricts control by touch input applied to the display unit while the image is projected on the screen.
In one embodiment, the first image is an image corresponding to a first region of a predetermined image, the second image is an image corresponding to a second region of the predetermined image different from the first region, and the position of the second region on the predetermined image varies according to the movement of the main body.
In one embodiment, when the movement of the main body is a movement in which the main body rotates by a predetermined angle about a virtual axis, the position of the second region on the predetermined image varies according to the predetermined angle.
In one embodiment, when the movement of the main body is a movement along one direction, the area of the second region varies according to the distance moved by the main body.
In one embodiment, the first image is an image corresponding to a first playback point of a moving image including a plurality of playback points, and the controller controls the display unit to project, instead of the first image, the second image corresponding to a second playback point when a first movement of the main body is sensed, and controls the display unit to project an enlarged or reduced version of the first image when a second movement of the main body is sensed.
In one embodiment, when the first movement is a movement in which the main body rotates by a predetermined angle about a virtual first axis, the controller varies the second playback point based on the predetermined angle; when the second movement is a movement in which the main body rotates by the predetermined angle about a second virtual axis different from the first axis, the ratio of enlargement or reduction of the first image varies based on the predetermined angle.
In one embodiment, the controller controls the display unit so that the second image is projected on the screen only when the movement of the main body satisfies a predetermined condition, and controls the display unit to maintain the first image when the movement does not satisfy the predetermined condition.
In one embodiment, the predetermined condition is that the main body rotates by a predetermined angle or more about a virtual axis.
In one embodiment, the controller controls the display unit to project the second image onto the screen only when movement of the main body is detected while a user input is applied, and controls the display unit to maintain the first image when movement of the main body is detected without the user input.
In one embodiment, while an image including the first image and the second image is projected on the screen, the controller controls the display unit such that the area of the first image decreases over time and the area of the second image increases based on the amount of that decrease; when a movement in which the main body rotates by a predetermined angle or more about a virtual axis is detected, the controller controls the display unit such that the area of the first image increases and the area of the second image decreases based on the amount of that increase.
In one embodiment, the mobile terminal further includes a notification output unit configured to output a notification signal, and the controller controls the notification output unit to output the notification signal when the area of the first image becomes smaller than a reference area set based on a user input.
According to another aspect of the present invention, there is provided a method of controlling a mobile terminal including a display unit, a reflector configured to reflect an image output from at least part of the display unit, and a screen onto which the reflected image is projected, the method including: outputting an image to at least a partial area of the display unit so that a first image is projected on the screen; and, when movement of the terminal is sensed, changing the image output to the display unit so that the first image projected on the screen changes to a second image according to the movement of the terminal.
In one embodiment, the image projected on the screen corresponds to a partial region of a predetermined image, and changing the image output to the display unit includes changing the output image such that the second image, corresponding to a second region, is projected instead of the first image, corresponding to a first region, wherein the position of the second region on the predetermined image varies according to the movement of the terminal.
According to the present invention, a user can control an image projected on a screen without applying a touch input while an image is projected on the screen.
Further, according to the present invention, a user can control an image projected on a screen through an intuitive user input.
FIG. 1A is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
FIG. 2 is a perspective view illustrating a mobile terminal according to an embodiment of the present invention.
FIG. 3A is a conceptual diagram illustrating the principle of projecting an image on a screen of a mobile terminal according to an embodiment of the present invention.
FIG. 3B is a conceptual diagram illustrating the image displayed on the display unit when there are a plurality of the screens described in FIG. 3A.
FIG. 4 is a flowchart illustrating a method of changing an image projected on a screen of a mobile terminal according to movement of the main body, according to an embodiment of the present invention.
FIGS. 5A and 5B are conceptual diagrams illustrating how an image projected on a screen of a mobile terminal changes according to movement of the main body, according to an embodiment of the present invention.
FIGS. 6A to 6C are conceptual diagrams illustrating how the image projected on the screen described in FIG. 3A changes according to movement of the main body.
FIGS. 7A and 7B are conceptual diagrams illustrating a moving image output to the screen changing according to movement of the main body.
FIGS. 8A and 8B are conceptual diagrams illustrating an embodiment in which the image projected on the screen is changed only when the movement of the main body satisfies a predetermined condition.
FIGS. 9A and 9B are conceptual diagrams illustrating an embodiment in which the image projected on the screen is changed only when the main body is moved while a user input is applied.
FIGS. 10A and 10B are conceptual diagrams illustrating an embodiment in which a part of the image projected on the screen is changed according to movement of the main body.
FIG. 11 is a conceptual diagram illustrating an embodiment of controlling the image projected on the screen without moving the main body.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings; like reference numerals designate identical or similar elements, and redundant descriptions are omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of drafting the specification and do not by themselves carry distinct meanings or roles. In describing the embodiments, detailed descriptions of related known technologies are omitted where they would obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to aid understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by them and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Mobile terminals described in this specification may include mobile phones, smart phones, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, slate PCs, tablet PCs, ultrabooks, and wearable devices such as smartwatches, smart glasses, and head mounted displays (HMDs).
However, it will be readily apparent to those skilled in the art that the configurations according to the embodiments described herein may also be applied to fixed terminals such as digital TVs, desktop computers, and digital signage, except where a configuration is applicable only to mobile terminals.
FIG. 1A is a block diagram for explaining a mobile terminal according to the present invention, and FIGS. 1B and 1C are conceptual diagrams illustrating an example of a mobile terminal according to the present invention in different directions.
At least some of these components may operate in cooperation with one another to implement the operation, control, or control method of a mobile terminal according to the various embodiments described below. The operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by running at least one application program stored in the memory.
Hereinafter, the components above will be described in more detail with reference to FIG. 1A.
The mobile communication module 112 transmits and receives wireless signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network constructed according to technical standards or communication schemes for mobile communication (e.g., Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Wideband CDMA (WCDMA), and the like).
The wireless signals may include voice call signals, video call signals, or various types of data according to text/multimedia message transmission and reception.
The wireless Internet module is a module for wireless Internet access and may be built into or external to the mobile terminal. Examples of wireless Internet technologies include wireless LAN (WLAN), Wireless Fidelity (Wi-Fi) Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and Long Term Evolution (LTE). The wireless Internet module transmits and receives data according to at least one such wireless Internet technology, including Internet technologies not listed above.
On the other hand, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen in proximity without contacting it is referred to as a "proximity touch," and the act of actually touching the touch screen with an object is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object corresponds vertically to the touch screen when proximity-touched.
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as resistive film, capacitive, infrared, and ultrasonic methods.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance generated at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch screen, the pressure at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) are sent to a touch controller. The touch controller processes the signal(s) and then transmits the corresponding data to the controller.
The touch sensor and the proximity sensor discussed above may be used independently or in combination to sense various types of touches, such as a short (tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize position information of a sensing object by using ultrasonic waves.
The stereoscopic display unit may employ a three-dimensional display scheme such as a stereoscopic scheme (glasses scheme), an autostereoscopic scheme (glasses-free scheme), or a projection scheme (holographic scheme).
In general, a 3D stereoscopic image consists of a left image (left-eye image) and a right image (right-eye image). Depending on how the left and right images are combined into a 3D stereoscopic image, the methods include: a top-down method of arranging the left and right images up and down in one frame; a left-to-right (side-by-side) method of arranging them left and right in one frame; a checkerboard method of arranging pieces of the left and right images in tile form; an interlaced method of alternately arranging the left and right images in columns or rows; and a time-sequential (frame-by-frame) method of alternately displaying the left and right images over time.
A 3D thumbnail image may be generated by creating a left-image thumbnail and a right-image thumbnail from the left and right images of the original image frame, respectively, and combining them into one image. In general, a thumbnail means a reduced image or a reduced still image. The left-image thumbnail and right-image thumbnail thus generated are displayed with a left-right distance difference corresponding to the parallax between the left and right images, thereby producing a stereoscopic sense of space.
The left and right images required for realizing a 3D stereoscopic image may be displayed on the stereoscopic display unit by a stereoscopic processing unit. The stereoscopic processing unit receives a 3D image (an image at a reference viewpoint and an image at an extended viewpoint) and sets the left and right images from it, or receives a 2D image and converts it into left and right images.
The identification module is a chip storing various kinds of information for authenticating the usage authority of the mobile terminal.
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Referring to FIGS. 1B and 1C, the disclosed mobile terminal has a bar-shaped terminal body. Here, the terminal body can be understood as a concept referring to the mobile terminal regarded as at least one assembly.
The components disposed on the front, side, and rear surfaces of the terminal body are not limited to this arrangement; they may be omitted or disposed on other surfaces as needed.
The touch sensor may be a film having a touch pattern, disposed between the window and a display on the rear surface of the window.
Meanwhile, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the user input unit. The rear input unit may be disposed to overlap the front display unit in the thickness direction of the terminal body. When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using it can be implemented.
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the broadcast receiving module 111 (see FIG. 1A) may be configured to be drawn out from the terminal body. Alternatively, the antenna may be formed as a film type and attached to an inner surface of the case.
The terminal body is provided with a power supply unit 190 (see FIG. 1A) for supplying power to the mobile terminal.
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings.
FIG. 3A is a conceptual diagram illustrating the principle of projecting an image on a screen of a mobile terminal according to an embodiment of the present invention, and FIG. 3B is a conceptual diagram illustrating the image displayed on the display unit when there are a plurality of the screens described in FIG. 3A.
Referring to FIG. 2, a mobile terminal according to an embodiment of the present invention includes a main body, a display unit, a reflector configured to reflect an image output from at least part of the display unit, and a screen onto which the reflected image is projected.
Referring to FIG. 3A, the image output to the display unit is reflected by the reflector and projected onto the screen. Therefore, while an image is being projected on the screen, control by touch input applied to the display unit may be restricted.
When there are a plurality of screens, the display unit may be divided into a plurality of regions respectively corresponding to the screens, and a separate image may be output to each region.
For example, as shown in FIG. 3B, when there are four screens, the display unit may be divided into four regions, with an image output to each.
On the other hand, when all the images output to the regions are the same, the user sees the same image in all four directions regardless of the direction from which the screen is viewed.
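The four-region arrangement above can be sketched in Python. The equal-strip layout, display resolution, and function name are illustrative assumptions; the patent states only that the display unit is divided into regions corresponding to the screens:

```python
def split_display_into_regions(width, height, num_screens):
    """Divide the display area into equal vertical strips, one strip
    per screen of the projection structure. Each region is returned
    as an (x, y, width, height) tuple in display coordinates."""
    strip = width // num_screens
    return [(i * strip, 0, strip, height) for i in range(num_screens)]

# Four screens -> four display regions, e.g. on a 1080x1920 display.
regions = split_display_into_regions(1080, 1920, 4)
```

Outputting the same content to all four regions reproduces the case where the user sees an identical image from every viewing direction.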
As described above, the mobile terminal according to the present invention provides the user with the image projected on the screen as an image that appears to float in the air.
Hereinafter, a method of changing the image projected on the screen of the mobile terminal according to movement of the main body will be described in detail.
FIG. 4 is a flowchart illustrating a method of changing an image projected on a screen of a mobile terminal according to movement of the main body, and FIGS. 5A and 5B are conceptual diagrams illustrating how the image projected on the screen changes in accordance with movement of the main body, according to an embodiment of the present invention.
First, an image is output to at least a partial area of the display unit so that a first image is projected on the screen.
Next, when movement of the main body of the terminal is detected, the image output to the display unit is changed so that the first image projected on the screen changes to a second image according to the movement of the main body.
As shown in FIG. 5A, when the movement of the main body is a rotation by a predetermined angle about a virtual axis, the image projected on the screen may be changed based on the rotation angle.
Meanwhile, there may be one or more virtual axes. When there are a plurality of virtual axes, the projected image may be changed differently depending on the axis about which the main body rotates.
As shown in FIG. 5B, when the main body moves along one direction, the image projected on the screen may be changed based on the distance moved.
Meanwhile, there may be one or more such directions, and the projected image may be changed differently depending on the direction in which the main body moves.
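The two kinds of movement described above (rotation about a virtual axis and translation along a direction) might be dispatched as follows. The motion representation, field names, and the pan/zoom mapping are assumptions made for illustration, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Motion:
    kind: str      # "rotate" or "translate" (assumed categories)
    axis: str      # rotation axis or translation direction
    amount: float  # degrees rotated or distance moved

def select_projection_change(motion):
    """Map a sensed body motion to a change in the projected image:
    rotation pans the projected region, translation enlarges or
    reduces it, and any other motion leaves the image unchanged."""
    if motion.kind == "rotate":
        return ("pan", motion.axis, motion.amount)
    if motion.kind == "translate":
        effect = "reduce" if motion.amount > 0 else "enlarge"
        return (effect, motion.axis, abs(motion.amount))
    return ("keep", None, 0.0)
```

Keying the effect on the axis or direction mirrors the statement that different axes and directions may change the image differently.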
As described above, a method of controlling a mobile terminal according to an exemplary embodiment of the present invention provides a method of changing an image output to a screen according to motion of a main body. Thus, the user can control the image projected on the screen without applying the touch input while the image is projected on the screen.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
FIGS. 6A to 6C are conceptual diagrams illustrating how the image projected on the screen described in FIG. 3A changes according to movement of the main body.
An image corresponding to a partial region of a predetermined image may be projected on the screen; that is, only the portion of the predetermined image included in that region is projected.
As shown in FIG. 6A, when the main body rotates to the right by a predetermined angle about the virtual axis while the first image corresponding to a first region is projected, a second image corresponding to a region to the right of the first region may be projected instead.
That is, the user can see the right portion of the predetermined image by rotating the main body to the right by a predetermined angle. Although not shown in the figure, when the main body rotates to the left by a predetermined angle, the left portion of the predetermined image may be output.
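The panning behavior above can be sketched as a clamped horizontal shift of the projected region. The pixels-per-degree gain is an assumed constant; the patent states only that the region's position varies with the rotation angle:

```python
def pan_region(region, image_width, angle_deg, px_per_degree=10):
    """Shift the projected (x, y, w, h) region horizontally in
    proportion to the rotation angle, clamping it to the bounds of
    the full image. Positive angles pan right, negative pan left."""
    x, y, w, h = region
    new_x = x + int(angle_deg * px_per_degree)
    new_x = max(0, min(image_width - w, new_x))
    return (new_x, y, w, h)
```

Clamping keeps the projected window inside the predetermined image even for large rotations.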
As shown in FIG. 6B, when the main body moves along one direction while the first image corresponding to a partial region is projected, a reduced image of the predetermined image may be projected on the screen.
Meanwhile, although not shown in the drawing, when the main body moves in the direction opposite to that shown in FIG. 6B, an enlarged image of a part of the predetermined image may be projected on the screen.
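The distance-based reduction and enlargement above might be modeled by growing or shrinking the projected region around its center: a wider region makes the projected content appear reduced, a narrower one makes it appear enlarged. The gain and clamping policy are assumptions:

```python
def zoom_region(region, image_size, distance, scale_per_unit=0.001):
    """Scale the projected (x, y, w, h) region around its center in
    proportion to the distance the body moved. Positive distance
    widens the region (projected image appears reduced); negative
    distance narrows it (projected image appears enlarged)."""
    x, y, w, h = region
    img_w, img_h = image_size
    factor = 1.0 + distance * scale_per_unit
    nw = max(1, min(img_w, int(w * factor)))
    nh = max(1, min(img_h, int(h * factor)))
    cx, cy = x + w // 2, y + h // 2
    nx = max(0, min(img_w - nw, cx - nw // 2))
    ny = max(0, min(img_h - nh, cy - nh // 2))
    return (nx, ny, nw, nh)
```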
On the other hand, the image projected on the screen may be an image of a region corresponding to a particular direction.
For example, as shown in FIG. 6C, when the main body rotates by a predetermined angle about a virtual axis while an image of the region corresponding to a first direction is projected on the screen, an image of the region corresponding to a second direction may be projected instead.
Meanwhile, a moving image including a plurality of playback points may be played on the screen.
FIGS. 7A and 7B are conceptual diagrams illustrating a moving image output to the screen changing according to movement of the main body.
The first image may be an image corresponding to a first playback point of a moving image including a plurality of playback points. In this case, the playback point may be changed based on the movement of the main body.
When the main body rotates by a predetermined angle about the first axis, the playback point of the moving image projected on the screen may be changed based on that angle.
For example, as shown in FIG. 7A, when the main body rotates 30 degrees about a specific axis while an image corresponding to a first playback point (0:00) is output to the screen, an image corresponding to a second playback point determined by the rotation angle may be output instead.
That is, when the main body rotates by a predetermined angle about the specific axis, the playback point of the moving image changes according to the angle.
When the main body rotates by a predetermined angle about the second axis, the image projected on the screen may be enlarged or reduced.
That is, the ratio of enlargement or reduction varies according to the angle by which the main body rotates about the second axis.
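The two-axis video control above (rotation about one axis scrubs the playback point, rotation about the other changes the enlargement ratio) can be sketched as follows. The one-second-per-degree and one-percent-per-degree gains are illustrative assumptions, not values from the patent:

```python
def apply_rotation_to_video(state, axis, angle_deg,
                            seconds_per_degree=1.0, zoom_per_degree=0.01):
    """Update a (playback_position_seconds, zoom_ratio) state from a
    rotation: the first virtual axis scrubs the playback position,
    the second changes the zoom ratio. Other axes leave both alone."""
    position, zoom = state
    if axis == "first":
        position = max(0.0, position + angle_deg * seconds_per_degree)
    elif axis == "second":
        zoom = max(0.1, zoom * (1.0 + angle_deg * zoom_per_degree))
    return (position, zoom)
```

Negative angles rewind or shrink, with floors so the position and zoom stay valid.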
As described above, the mobile terminal according to the present invention changes an image or a moving image projected on the screen according to the movement of the main body. Thus, the user can control the image projected on the screen without applying the touch input.
Hereinafter, an embodiment in which the image projected on the screen is changed only when the movement of the main body satisfies a predetermined condition will be described.
FIGS. 8A and 8B are conceptual diagrams illustrating an embodiment in which the image projected on the screen is changed only when the movement of the main body satisfies a predetermined condition.
The image projected on the screen may be changed only when the movement of the main body satisfies a predetermined condition; otherwise, the first image is maintained.
For example, as shown in FIG. 8A, the projected image may be changed only when the main body rotates by a predetermined angle or more about a virtual axis.
That is, by keeping the projected image unchanged unless the movement satisfies the predetermined condition, the mobile terminal according to the present invention can maintain the original image even when the user moves the main body unintentionally.
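The predetermined-condition check above reduces to a simple threshold on the rotation angle. The 15-degree value is an assumed threshold; the patent does not fix one:

```python
def should_update(angle_deg, threshold_deg=15.0):
    """Ignore small, likely unintentional rotations: the projected
    image changes only when the body rotates by at least the
    threshold angle in either direction."""
    return abs(angle_deg) >= threshold_deg
```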
Hereinafter, an embodiment in which the image projected on the screen is changed only when the main body is moved while the user input is applied will be described.
FIGS. 9A and 9B are conceptual diagrams illustrating an embodiment in which the image projected on the screen is changed only when the main body is moved while a user input is applied.
The image projected on the screen may be changed only when movement of the main body is detected while a user input is applied.
The user input may be an input to a user input unit provided on a side surface or a rear surface of the main body.
For example, as shown in FIG. 9B, when the main body moves along one direction while the user input unit is pressed, the image projected on the screen may be changed according to the movement.
That is, the mobile terminal according to the present invention maintains the image projected on the screen when the main body moves without the user input being applied, so that the projected image is not changed unintentionally.
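The gating behavior above might be sketched as follows, with integer indices standing in for projected images; the representation is purely illustrative:

```python
def next_image(current, motion_delta, button_pressed):
    """Apply the motion-driven change only while the side or rear
    user input is held down; otherwise keep the current image."""
    return current + motion_delta if button_pressed else current
```

Gating on a held button is what prevents accidental movement from altering the projection.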
Hereinafter, an embodiment in which a part of the image projected on the screen is changed according to the movement of the main body will be described.
10A and 10B are conceptual diagrams showing an embodiment in which a part of an image projected on a screen is changed according to the movement of the main body.
An image including the first image and the second image may be projected on the screen.
The controller may control the display unit such that the area of the first image decreases with time and the area of the second image increases based on the amount of decrease in the area of the first image.
On the other hand, in the case where the main body rotates about a virtual axis by a predetermined angle or more, the area of the first image may increase with time and the area of the second image may decrease correspondingly.
As shown in FIG. 10A, an hourglass image including the first image and the second image may be projected on the screen, with the area of the first image decreasing with time and the area of the second image increasing.
As shown in FIG. 10A, when the main body rotates 180 degrees with respect to the predetermined axis, the direction is reversed: the area of the first image increases with time and the area of the second image decreases.
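The hourglass behavior is an area transfer between the two images over time, with a large rotation reversing the direction of the transfer. A minimal sketch under assumed names and rates:

```python
# Illustrative sketch of the hourglass embodiment: the first image's
# area drains into the second image over time; a rotation of 180
# degrees or more flips the direction. All values are assumptions.
class Hourglass:
    def __init__(self, total_area=100.0, rate_per_s=10.0):
        self.first = total_area   # area of the first image
        self.second = 0.0         # area of the second image
        self.rate = rate_per_s    # area transferred per second
        self.direction = -1       # -1: first shrinks; +1: first grows

    def rotate(self, degrees):
        if abs(degrees) >= 180:
            self.direction *= -1  # reverse which image drains

    def tick(self, dt_s):
        delta = self.direction * self.rate * dt_s
        total = self.first + self.second
        self.first = min(total, max(0.0, self.first + delta))
        self.second = total - self.first  # conserve total area

h = Hourglass()
h.tick(3)                  # 3 s elapse: first shrinks to 70, second grows to 30
print(h.first, h.second)
h.rotate(180)              # invert the hourglass
h.tick(2)                  # 2 s elapse: first grows to 90, second shrinks to 10
print(h.first, h.second)
```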
Meanwhile, the mobile terminal according to the present invention may further include a notification output unit configured to output a notification signal.
As shown in FIG. 10B, the controller may control the notification output unit to output a notification signal when the area of the first image becomes smaller than a preset reference area.
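The notification condition is a one-line comparison against the preset reference area. A sketch, with the reference value assumed for illustration:

```python
# Illustrative sketch: fire a notification once the first image's area
# falls below a preset reference area. REFERENCE_AREA is an assumed value.
REFERENCE_AREA = 20.0

def maybe_notify(first_image_area):
    """Return a notification signal when the area is below the reference."""
    return "notify" if first_image_area < REFERENCE_AREA else None

print(maybe_notify(50.0))  # area still above the reference: no signal
print(maybe_notify(10.0))  # area below the reference: signal output
```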
As described above, the present invention changes the image projected on the screen according to the movement of the main body.
On the other hand, it is possible to control an image projected on the screen by using a user's air gesture.
11 is a conceptual diagram showing an embodiment for controlling an image projected on the screen by using a user's air gesture.
As shown in FIG. 11, when the user's air gesture is sensed while the image is projected on the screen, the projected image may be changed based on the sensed air gesture.
Thereby, the user can control the image projected on the screen without directly touching the terminal.
It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the control unit 180 of the terminal.
Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.
100: mobile terminal 110: wireless communication unit
120: A / V input / output unit 130: user input unit
140: sensing unit 150: output unit
160: memory 170: interface section
180: control unit 190: power supply unit
Claims (14)
A sensing unit configured to sense motion of the main body;
A display unit provided on a front surface of the main body;
A reflection plate configured to reflect an image output to at least a part of the display unit;
A screen for projecting the reflected image; And
And a controller for controlling the display unit such that a first image projected on the screen changes to a second image according to movement of the main body when motion of the main body is sensed.
Wherein the control unit limits the control of the touch input applied to the display unit while the image is projected on the screen.
Wherein the first image is an image corresponding to a first area of a predetermined image,
Wherein the second image is an image corresponding to a second region of the predetermined image that is different from the first region, and is variable according to the movement of the main body with respect to the first region.
If the movement of the main body is a movement of the main body to rotate at a predetermined angle with respect to the imaginary axis,
Wherein a position on the predetermined image of the second region is varied according to the predetermined angle.
If the movement of the body is a movement of the body along one direction,
Wherein an area of the second region is variable according to a moving distance of the main body.
Wherein the first image is an image corresponding to a first reproduction point of a moving image including a plurality of reproduction points,
Wherein the control unit:
Controls the display unit to project the second image corresponding to the second reproduction time point instead of the first image corresponding to the first reproduction time point based on the first movement of the main body,
And controls the display unit to project the second image in which the first image is enlarged or reduced based on the second movement of the main body different from the first movement.
Wherein when the first movement is a movement in which the main body rotates at a predetermined angle with respect to a virtual first axis, the second reproduction time is varied based on the predetermined angle,
If the second movement is a movement in which the main body rotates at the predetermined angle with respect to a virtual second axis different from the first axis, the ratio of enlargement or reduction of the first image is varied based on the predetermined angle.
Controls the display unit such that the second image is projected onto the screen only when the movement of the main body satisfies a predetermined condition,
Wherein the controller controls the display unit to maintain the first image when the movement of the main body does not satisfy a predetermined condition.
Wherein the main body rotates at a predetermined angle or more with respect to a virtual axis.
Controls the display unit such that the second image is projected onto the screen only when movement of the main body is detected in a state in which the user input is applied,
Wherein the controller controls the display unit to maintain the first image when motion of the main body is detected in a state in which the user input is not applied.
In a state in which an image including the first image and the second image is projected on the screen,
Wherein the control unit:
Controls the display unit such that an area of the first image decreases with time and an area of the second image increases based on an area decrease amount of the first image,
Wherein, when a movement of the main body rotating about a virtual axis by a predetermined angle or more is detected, the control unit controls the display unit such that the area of the first image increases with time and the area of the second image decreases based on an area increase amount of the first image.
And a notification output unit configured to output a notification signal,
Wherein,
Wherein the control unit controls the notification output unit to output a notification signal when the area of the first image becomes smaller than a preset reference area based on a user input.
Outputting an image to at least a part of the display unit so that a first image is projected on the screen; And
And changing an image output to the display unit such that the first image projected on the screen changes to a second image according to the movement of the terminal when motion of the terminal is detected.
Wherein the image projected on the screen is a video corresponding to a partial area of a predetermined image,
Wherein the step of changing the image output to the display unit comprises:
And changes the image output to the display unit such that the second image corresponding to the second area is projected instead of the first image corresponding to the first area based on the movement of the terminal,
Wherein the second area is varied according to movement of the terminal with respect to the first area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150077475A KR20160141582A (en) | 2015-06-01 | 2015-06-01 | Mobile terminal and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150077475A KR20160141582A (en) | 2015-06-01 | 2015-06-01 | Mobile terminal and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160141582A true KR20160141582A (en) | 2016-12-09 |
Family
ID=57574777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150077475A KR20160141582A (en) | 2015-06-01 | 2015-06-01 | Mobile terminal and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160141582A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115243020A (en) * | 2022-07-18 | 2022-10-25 | 海信视像科技股份有限公司 | Image following method for projection equipment during screen lifting and projection equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101642808B1 (en) | Mobile terminal and method for controlling the same | |
KR20150088084A (en) | Mobile terminal and control method for the mobile terminal | |
KR20150134949A (en) | Mobile terminal and control method for the mobile terminal | |
KR20160143135A (en) | Mobile terminal and method for controlling the same | |
KR20150144629A (en) | Mobile terminal and method of controlling the same | |
KR20180103866A (en) | Mobile terminal and control method thereof | |
KR20150084133A (en) | Mobile terminal and method for controlling the same | |
KR20170055867A (en) | Mobile terminal and method for controlling the same | |
KR101510704B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR101529933B1 (en) | Mobile terminal | |
KR20170073985A (en) | Mobile terminal and method for controlling the same | |
KR20150072940A (en) | Mobile terminal and control method for the mobile terminal | |
KR20180037370A (en) | Mobile terminal and method for controlling the same | |
KR20160149061A (en) | Mobile terminal and method for controlling the same | |
KR20170037431A (en) | Mobile terminal and control method for the mobile terminal | |
KR20170025270A (en) | Mobile terminal and method for controlling the same | |
KR20160141582A (en) | Mobile terminal and method for controlling the same | |
KR20150094243A (en) | Mobile terminal and method for controlling the same | |
KR20150134664A (en) | Mobile terminal and method for controlling the same | |
KR20170042164A (en) | Mobile terminal and method for controlling the same | |
KR20170014193A (en) | Mobile terminal and method for controlling the same | |
KR101698099B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR20170009048A (en) | Mobile terminal and method for controlling the same | |
KR20160073038A (en) | Mobile terminal and method for controlling the same | |
KR20160088033A (en) | Mobile terminal and method for controlling the same |