US10824314B2 - User terminal and control method of the same - Google Patents

User terminal and control method of the same

Info

Publication number
US10824314B2
Authority
US
United States
Prior art keywords
objects
input
user
display
user terminal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US16/069,597
Other languages
English (en)
Other versions
US20190026012A1 (en)
Inventor
Seung Hun Lee
Do Youn KANG
Sung Bae Park
Ju Yeon Lee
Jung Hwan Choi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KANG, DO YOUN, LEE, SEUNG HUN, CHOI, JUNG HWAN, LEE, JU YEON, PARK, SUNG BAE
Publication of US20190026012A1 publication Critical patent/US20190026012A1/en
Application granted granted Critical
Publication of US10824314B2 publication Critical patent/US10824314B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486Drag-and-drop
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72469User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/725Cordless telephones
    • H04M1/72522
    • H04M1/72583
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04808Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus

Definitions

  • the present invention relates to a user terminal and a control method of the same, in which a first object obstructing a view is easily changed into a second object by moving it from a center region to an edge region on a screen.
  • FIGS. 1A, 1B, 1C illustrate that an object is moved from a center region on a screen in a user terminal 1 of the related art.
  • to move an object A out of the center region, a user has to first touch the object (FIG. 1A) and select one of the buttons B displayed for size control.
  • for example, the user selects a minimization button for minimizing the object A (FIG. 1B), thereby getting the object A out of the screen center region (FIG. 1C).
  • An aspect of the present invention is to provide a user terminal and a control method of the same, in which an object is easily moved to an edge region and reduced in size, thereby keeping a center region on a screen free from obstruction by the object.
  • according to an aspect of the present invention, a user terminal includes: an image processor configured to process an image; a display configured to display the processed image; and a controller configured to control the image processor so that a first object included in the image is moved to an edge region on a screen of the display in response to a user's input for moving the first object to that edge region, and is changed into a second object smaller than the first object and displayed on the display.
  • the controller may determine a position for the second object based on at least one of an input direction, an input releasing position, and an input kind of the user.
  • the controller may determine at least one among a size, shape, color, and transparency of the second object based on the position of the second object, a running state of the first object, and the edge region.
  • the second object may comprise at least one among a shape, color and region of the first object, a region for release into the first object, and a mark for informing the release into the first object.
  • the controller may change the second object into the first object in response to the user's input including at least one of an input direction, an input releasing position, and an input kind with regard to the second object.
  • the controller may determine the position of the first object based on at least one among a direction, degree, and kind of the user's input and an input releasing position with regard to the second object.
  • the controller may control a plurality of second objects to be displayed as overlapped.
  • the controller may provide information about each overlapped second object in response to the user's input.
  • the controller may set at least one among a size, shape, color, and transparency of the second object.
  • the controller may change a plurality of first objects into a plurality of second objects in response to the user's input.
  • according to another aspect of the present invention, a method of controlling a user terminal includes: moving a first object included in an image to an edge region on a display screen in response to a user's input for moving the first object to the edge region on the display screen; changing the first object into a second object smaller than the first object; and displaying the second object.
  • the moving may comprise determining a position for the second object based on at least one of an input direction, an input releasing position, and an input kind of the user.
  • the determining of the position for the second object may comprise determining at least one among a size, shape, color, and transparency of the second object based on the position of the second object, a running state of the first object, and the edge region.
  • the second object comprises at least one among a shape, color and region of the first object, a region for release into the first object, and a mark for informing the release into the first object.
  • the method may additionally comprise changing the second object into the first object in response to the user's input including at least one of an input direction, an input releasing position, and an input kind with regard to the second object.
  • the changing into the first object may comprise determining the position of the first object based on at least one among a direction, degree, and kind of the user's input and an input releasing position with regard to the second object.
  • the method may further comprise displaying a plurality of second objects as overlapped.
  • the displaying as overlapped may comprise providing information about each overlapped second object in response to the user's input.
  • the method may further comprise setting at least one among a size, shape, color, and transparency of the second object.
  • the method may further comprise changing a plurality of first objects into a plurality of second objects in response to the user's input.
  • according to the present invention, it is easy to move an object to an edge region and reduce the size of the object, thereby keeping a center region of a screen free from obstruction by the object.
  • the object moved to the edge region and reduced in size is easily returned to the center region of the screen.
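  • To make the controller behavior above concrete, the following is a minimal Kotlin sketch; it is not from the patent, and the names (UserInput, SecondObject, dockSecondObject) and the size/transparency policy are illustrative assumptions about how a position could be derived from the input direction, releasing position, and input kind.

```kotlin
// Illustrative sketch only; types and the docking policy are assumptions.
enum class InputKind { DRAG_AND_DROP, TOUCH_SWING, MULTI_TOUCH }
enum class Edge { LEFT, RIGHT, TOP, BOTTOM }

data class UserInput(
    val direction: Edge,  // direction of the user's input
    val releaseX: Float,  // input releasing position (x)
    val releaseY: Float,  // input releasing position (y)
    val kind: InputKind   // kind of input
)

data class SecondObject(
    val edge: Edge,             // edge region the second object docks to
    val offsetAlongEdge: Float, // position along that edge
    val sizePx: Float,          // size of the reduced object
    val alpha: Float            // transparency
)

// Determine the second object's position and appearance from at least one of
// the input direction, the input releasing position, and the input kind.
fun dockSecondObject(input: UserInput, screenW: Float, screenH: Float): SecondObject {
    val along = when (input.direction) {
        Edge.LEFT, Edge.RIGHT -> input.releaseY.coerceIn(0f, screenH)
        Edge.TOP, Edge.BOTTOM -> input.releaseX.coerceIn(0f, screenW)
    }
    // Assumed policy: a touch swing (fling) docks a smaller, more transparent
    // object than a plain drag-and-drop.
    return if (input.kind == InputKind.TOUCH_SWING)
        SecondObject(input.direction, along, sizePx = 96f, alpha = 0.6f)
    else
        SecondObject(input.direction, along, sizePx = 128f, alpha = 0.9f)
}
```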
  • FIGS. 1A, 1B, 1C illustrate that an object is moved from a center region on a screen in a user terminal of the related art
  • FIGS. 2A, 2B, 2C, 2D illustrate that an object is moved to an edge region and reduced in size in a user terminal according to one embodiment of the present invention
  • FIG. 3 is a block diagram of a user terminal according to the present invention.
  • FIG. 4 is a block diagram of a user terminal according to one embodiment of the present invention.
  • FIG. 5 is a control flowchart showing a control method of a user terminal according to the present invention
  • FIG. 6 is a control flowchart showing a control method of a user terminal according to one embodiment of the present invention
  • FIGS. 7 and 8 are control flowcharts showing a control method of a user terminal according to another embodiment of the present invention.
  • FIGS. 9A, 9B, 9C illustrate that an object is moved to an edge region and reduced in size by a user's touch swing input in a user terminal according to one embodiment of the present invention
  • FIGS. 10A, 10B, 10C illustrate that an object moved and reduced in size is returned to a center region on a screen in a user terminal according to one embodiment of the present invention
  • FIG. 11 illustrates the size, shape, color, transparency, etc. of an object moved and reduced in size in a user terminal according to one embodiment of the present invention
  • FIG. 12 illustrates that objects are displayed as overlapped in a user terminal according to one embodiment of the present invention
  • FIG. 13 illustrates descriptions about each of overlapped objects in a user terminal according to one embodiment of the present invention
  • FIG. 14 illustrates that an object moved to an edge region and reduced in size is returned to a center region on a screen by a user's touch swing input in a user terminal according to one embodiment of the present invention
  • FIGS. 15A, 15B illustrate that a plurality of objects are moved to an edge region and reduced in size and returned in a user terminal according to one embodiment of the present invention.
  • FIGS. 2A, 2B, 2C, 2D illustrate that an object is moved to an edge region and reduced in size in a user terminal 10 according to one embodiment of the present invention.
  • an object A is displayed in a center region on a screen of the user terminal 10.
  • a user touches and moves the object A, which is displayed in the center region on the screen of the user terminal 10, in a rightward direction.
  • the object A may have a touch region for movement, and that touch region may have to be touched and moved in order to move the object.
  • alternatively, the touch region may not be present, and the object may be set to move in response to a touch on any region, as distinguished from a touch for execution.
  • at least a part of the object moved by a user's touch gets out of the screen region, or the object is moved to within at least one preset condition, such as a distance from the edge region set in accordance with the size of the object, etc.
  • when at least a part of the object moved by a user's touch gets out of the screen region, or the object is moved to within such a preset condition, the object A is changed into an object B smaller than the object A. Further, the position of the object B may be determined based on at least one among a direction of a user's input, an input releasing position, and the kind of the input, and the object B may be positioned at the determined position. In FIG. 2D, the object B is positioned as a semicircular shape in an edge region of the screen, but there are no limits to the shape of the object B.
  • the object B may be displayed corresponding to the characteristic shape, color, logo and the like of the object A, thereby indicating that it was changed from the object A. Further, the object B may display at least one of a selection region for releasing it back into the object A, and a mark informing of that release.
  • here, a user's input is achieved by touch and drag-and-drop methods, but is not limited thereto.
  • for example, a user's input may be achieved by a touch-swing method of touching and swinging (flinging) the object, a plurality of touch inputs, etc.
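  • As a rough illustration of the trigger just described, the change from the object A into the smaller object B can be gated on the object partly leaving the screen or coming within a preset distance of an edge. This is a hedged sketch, not the patent's implementation; the Rect type and the margin parameter are assumed names.

```kotlin
// Sketch of the collapse trigger; the Rect type and threshold are assumptions.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun shouldCollapse(obj: Rect, screenW: Float, screenH: Float, edgeMarginPx: Float): Boolean {
    // At least a part of the object has left the screen region...
    val partlyOffScreen =
        obj.left < 0f || obj.top < 0f || obj.right > screenW || obj.bottom > screenH
    // ...or the object is within a preset distance from an edge region.
    val withinEdgeMargin =
        obj.left < edgeMarginPx || obj.top < edgeMarginPx ||
        screenW - obj.right < edgeMarginPx || screenH - obj.bottom < edgeMarginPx
    return partlyOffScreen || withinEdgeMargin
}
```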
  • FIG. 3 is a block diagram of a user terminal 10 according to the present invention.
  • the user terminal 10 according to the present invention may include an image processor 110 , a display 120 , and a controller 100 .
  • the image processor 110 performs image processing processes without limitations to the kinds thereof, and the image processing processes may for example include de-multiplexing for dividing a predetermined signal into signals corresponding to characteristics, decoding corresponding to an image format of an image signal, de-interlacing for converting an image signal from an interlaced type into a progressive type, noise reduction for improving image quality, detail enhancement, frame refresh rate conversion, etc. Further, the image processing processes may further include a decoder (not shown) for decoding a source image corresponding to an image format of an encoded source image, and a frame buffer (not shown) for storing the decoded source image in units of frames.
  • the image processor 110 may be achieved by a system-on-chip where various functions are integrated, or by an image processing board (not shown) in which individual elements for independently performing respective processes are mounted on a printed circuit board, and may be internally provided in the display apparatus.
  • the image processor 110 performs various preset image processing processes with regard to a broadcast signal including an image signal received from a receiver (not shown), and a source image including an image signal received from an image source (not shown).
  • the image processor 110 outputs an image signal subjected to such a process to the display 120 , so that the processed source image can be displayed on the display 120 .
  • the display 120 may display an image based on an image signal output from an image processor 110 .
  • the display 120 may include additional elements in accordance with its types.
  • for example, the display 120 may include a liquid crystal display (LCD) panel (not shown), a backlight unit (not shown) for illuminating the LCD panel, and a panel driving board (not shown) for driving the LCD panel.
  • the display 120 displays an image based on an image signal processed by the image processor 110 .
  • the display 120 may be achieved by various display methods, e.g., an LCD, a plasma display panel (PDP), an organic light emitting diode (OLED), and the like.
  • the display 120 may include an LCD panel, a PDP panel, an OLED panel, etc.
  • the display 120 may display an image and perform a color calibration process.
  • the display 120 may include a display panel for displaying an image thereon, and a panel driver for processing an input image signal to be displayed as an image on the display panel.
  • An image signal received from an external input source through an interface is subjected to decoding, de-interlacing, scaling and the like image processing processes and then displayed on the display 120 .
  • the controller 100 may control general elements inside the user terminal 10 .
  • the controller 100 may control the image processor 110 so that a first object included in an image can be moved to an edge region on a screen in response to a user's input for moving the first object to the edge region on the screen of the display 120, and displayed on the display 120 as changed into a second object smaller than the first object.
  • FIG. 4 is a block diagram of a user terminal 10 according to one embodiment of the present invention.
  • the user terminal 10 may include the elements of FIG. 3 , and may additionally include a communicator 130 , a storage 140 , a user input 150 , an audio receiver 160 , an image capturer 170 , and a UI generator 180 .
  • the communicator 130 may receive a signal based on an external input and transmit the signal to the image processor 110 or the controller 100 .
  • the communicator 130 may connect with various external input cables to receive a signal from the external input by the cable, or may wirelessly receive a signal in accordance with preset wireless communication standards.
  • the communicator 130 may include a plurality of connectors (not shown) to which the cables are respectively connected.
  • the communicator 130 may receive a signal from a connected external input, for example, a broadcast signal, an image signal, a data signal and the like based on high definition multimedia interface (HDMI), universal serial bus (USB), and Component standards, or may receive communication data through a communication network.
  • the communicator 130 may include not only an element for receiving a signal/data from an external input, but also various additional elements such as a wireless communication module (not shown) for wireless communication or a tuner (not shown) for a broadcast signal in accordance with designs of the user terminal 10 .
  • the communicator 130 may not only receive a signal from the external device but also transmit the information/data/signal of the user terminal 10 to the external device. That is, the communicator 130 is not limited to only the elements for receiving the signal from the external device, and may be achieved as an interface for interactive communication.
  • the communicator 130 may receive a control signal for selecting a user interface (UI) from a plurality of control devices.
  • the communicator 130 may be materialized by a communication module for publicly known wireless near field communication, such as Bluetooth, Infrared, Ultra-Wideband (UWB), ZigBee, etc., or may be materialized by a publicly known communication port for wired communication.
  • the communicator 130 may be utilized for many purposes of transmitting/receiving data, a command for controlling the display, etc. as well as the control signal for selecting the UI.
  • the storage 140 may be materialized by a writable nonvolatile memory (i.e. read only memory (ROM)) in which data is retained even though the user terminal 10 is powered off and which can reflect a user's changes. That is, the storage 140 may be provided as one of a flash memory, an erasable programmable read only memory (EPROM) and an electrically erasable programmable read only memory (EEPROM).
  • the user input 150 may transmit various preset control commands or information to the controller 100 in response to a user's control and input.
  • the user input 150 may be materialized by a menu-key or input panel provided at an outer side of the user terminal 10 , or a remote controller or the like separated from the user terminal 10 .
  • the user input 150 and the display 120 may be integrated into a single body.
  • when the display 120 is provided as a touch screen, a user may touch an input menu (not shown) displayed on the display 120, thereby transmitting a preset command to the controller 100.
  • the user input 150 may receive a user's motion and voice.
  • the motion of a user may include a touch input.
  • the user input 150 may receive a user's motion and voice directly, or may receive information about a user's motion and voice from an external device.
  • the audio receiver 160 is materialized by a microphone to receive a user's voice command and transmit it to the controller 100 .
  • the image capturer 170 is materialized by a camera to capture a user's gesture and transmit it to the controller 100.
  • the UI generator 180 may generate a UI for operating an application program to be executed.
  • the UI includes a plurality of sub UIs provided in the form of an icon, a text, etc.
  • the application program may operate corresponding to the selected sub UI. That is, each sub UI may be generated in units of a plurality of functions or events for operating the application program running in the user terminal 10 .
  • the UI generator 180 refers to a software or hardware function for generating and controlling a UI displayed on the display 120 , and the function may be performed by the controller 100 to be described later.
  • the UI generator 180 may be configured by a separate chipset, or may be configured by a separate microprocessor.
  • the controller 100 may determine a position of the second object based on at least one of an input direction, an input releasing position, and an input kind of a user.
  • the controller 100 may determine at least one among the size, shape, color, and transparency of the second object based on the position of the second object, the running state of the first object, and the edge region.
  • the second object may include at least one of the shape, color and region of the first object, a region for release into the first object, and a mark for informing the release into the first object.
  • the controller 100 may change the second object into the first object in response to a user's input including at least one among the input direction, the input releasing position, and the input kind with regard to the second object.
  • the controller 100 may determine the position of the first object based on at least one among the user's input direction, degree, kind, and input releasing position with regard to the second object.
  • the controller 100 may display the plurality of second objects as overlapped.
  • the controller 100 may inform a user of the overlapped second objects through a UI in response to an input.
  • the controller 100 may set at least one among the size, shape, color, and transparency of the second object.
  • the controller 100 may change a plurality of first objects into a plurality of second objects in response to a user's input.
  • FIG. 5 is a control flowchart showing a control method of the user terminal 10 according to the present invention.
  • the first object included in the image is moved to the edge region on the screen of the display in response to a user's input for moving the first object to the edge region on the screen of the display (S11).
  • the first object is changed into the second object smaller than the first object (S12).
  • the changed second object is displayed (S13).
  • FIG. 6 is a control flowchart showing a control method of the user terminal 10 according to one embodiment of the present invention.
  • At least one among the size, shape, color, and transparency of the second object is set (S21).
  • the first object is moved to the edge region on the screen of the display 120 in response to a user's input for moving the first object to the edge region on the screen of the display 120 (S22).
  • the position of the second object is determined based on at least one among an input direction, an input releasing position, and an input kind of a user (S23).
  • At least one among the size, shape, color, and transparency of the second object is determined based on the position of the second object, the running state of the first object, and the edge region (S24).
  • the first object is changed into the second object smaller than the first object (S25).
  • the changed second object is displayed (S26).
  • the second object is changed into the first object in response to a user's input including at least one among the input direction, the input releasing position and the input kind with regard to the second object (S27).
  • the position of the first object is determined based on at least one among the direction, degree, kind and input releasing position of a user's input with regard to the second object (S28).
  • the order of the operations S27 and S28 may be reversed.
  • the changed first object is displayed (S29).
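  • Read as straight-line code, the S21 to S29 sequence could look like the sketch below; the Terminal interface and all of its method names are hypothetical stand-ins for the flowchart operations, not an API from the patent.

```kotlin
// Hypothetical reading of operations S21 to S29 as a single pass.
interface Terminal {
    fun presetSecondObjectStyle()                    // S21: size/shape/color/transparency
    fun awaitMoveToEdgeInput(): Any                  // S22: user's move-to-edge input
    fun decideSecondObjectPosition(input: Any): Any  // S23
    fun decideSecondObjectStyle(position: Any)       // S24
    fun collapseAndDisplaySecondObject()             // S25 and S26
    fun awaitRestoreInput(): Any                     // S27
    fun decideFirstObjectPosition(input: Any): Any   // S28 (S27/S28 may be reversed)
    fun displayRestoredFirstObject(position: Any)    // S29
}

fun runFlow(terminal: Terminal) {
    terminal.presetSecondObjectStyle()
    val moveInput = terminal.awaitMoveToEdgeInput()
    val dockPosition = terminal.decideSecondObjectPosition(moveInput)
    terminal.decideSecondObjectStyle(dockPosition)
    terminal.collapseAndDisplaySecondObject()
    val restoreInput = terminal.awaitRestoreInput()
    val restorePosition = terminal.decideFirstObjectPosition(restoreInput)
    terminal.displayRestoredFirstObject(restorePosition)
}
```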
  • FIGS. 7 and 8 are control flowcharts showing a control method of the user terminal 10 according to another embodiment of the present invention.
  • At least one among the size, shape, color, and transparency of the second object is set (S31).
  • the first object is moved to the edge region on the screen of the display 120 in response to a user's input for moving the first object to the edge region on the screen of the display 120 (S32).
  • the position of the second object is determined based on at least one among an input direction, an input releasing position, and an input kind of a user (S33).
  • At least one among the size, shape, color, and transparency of the second object is determined based on the position of the second object, the running state of the first object, and the edge region (S34).
  • the first object is changed into the second object smaller than the first object (S35).
  • the changed second object is displayed (S36).
  • the plurality of second objects moved in a similar direction are displayed as overlapped (S37).
  • the plurality of second objects are changed into the plurality of first objects in response to a user's input including at least one among the input direction, the input releasing position, and the input kind with respect to the plurality of second objects (S39).
  • the positions of the plurality of first objects corresponding to the plurality of second objects are determined based on at least one among the direction, degree, and kind of a user's input and the input releasing position with respect to the plurality of second objects (S40).
  • the order of the operations S39 and S40 may be reversed.
  • the plurality of changed first objects are displayed (S41).
  • FIGS. 9A, 9B, 9C illustrate that an object is moved to an edge region and reduced in size by a user's touch swing input in the user terminal 10 according to one embodiment of the present invention.
  • a user touches a point T1 on the first object displayed on the display 120, drags the first object to a point T2 as if throwing it rightward like a swing, and releases the touch at the point T2.
  • the first object continues to slide and move rightward in response to such a user's input even after the user's touch is released. Then, a region A1 corresponding to a center region on the screen becomes free from obstruction by the first object, and the first object is positioned in a region A2. A partial right side of the first object may get out of the region of the screen and disappear.
  • the controller 100 may change the first object into the second object B when the first object is positioned within a predetermined distance from the right side of the screen, when at least a part of the first object gets out of the screen, or when the whole of the first object gets out of the screen.
  • the second object B may be displayed as a semicircular shape, in which a mark for restoring the whole shape as a full screen may be displayed at an upper portion of the displayed second object B, and a mark for reducing it to the minimum shape as a minimum screen may be displayed at a lower portion.
  • This shape of the second object B is merely an example, and the second object B may have various different shapes.
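  • A minimal sketch of the slide-after-release behavior of FIGS. 9A, 9B, 9C follows; the friction factor, frame time, and collapse margin are assumptions for illustration, not values from the patent.

```kotlin
// After the swing is released at T2, keep sliding the object rightward with
// decaying velocity until it nears or crosses the right edge, where it would
// be collapsed into the second object B. Constants are illustrative.
fun slideUntilCollapse(
    startX: Float,          // object's left x when the touch is released
    releaseVelocity: Float, // rightward velocity (px/s) at release
    objWidth: Float,
    screenW: Float,
    collapseMarginPx: Float
): Float {
    val frameDt = 1f / 60f  // assumed 60 fps animation step
    var x = startX
    var v = releaseVelocity
    while (v > 1f) {
        x += v * frameDt
        v *= 0.95f // simple exponential friction per frame
        if (x + objWidth >= screenW - collapseMarginPx) break // collapse point reached
    }
    return x
}
```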
  • FIGS. 10A, 10B, 10C illustrate that an object moved and reduced in size is returned to a center region on a screen in the user terminal 10 according to one embodiment of the present invention.
  • a user may touch the second object B positioned in the edge region on the screen of the display 120 , and drag it leftward as shown therein.
  • alternatively, a user may touch a mark region, displayed on the second object B, for controlling the display shape and size, thereby changing the second object B into the first object A.
  • a user's input for returning the second object B to the first object A may be varied depending on settings.
  • the second object B may be touched, or may be dragged by a user to be changed in position.
  • the second object B may be moved to a center region on the screen and changed into the first object A in accordance with a degree of touch swing.
  • the second object B may be moved to and positioned in an opposite edge region in accordance with a user's touch swing. While the second object B is moving, the second object B may be changed into the first object A or may be just moved without the change.
  • FIG. 11 illustrates the size, shape, color, transparency, etc. of an object moved and reduced in size in the user terminal 10 according to one embodiment of the present invention.
  • the second object, moved from the center region to the edge region on the screen of the user terminal 10 and reduced in size, may be displayed with various sizes, shapes, colors, transparency, etc.
  • a second object B1 is shaped like a semicircle and may include marks for displaying a full screen/minimized screen.
  • the size of the second object B1 may be previously set in accordance with a user's settings, or may be varied.
  • the second object B1 is positioned in the edge region, that is, in close contact with the right side of the screen. Alternatively, the second object B1 may be spaced a little apart from the right side of the screen without the close contact.
  • the color of the second object B1 may be set corresponding to the color of the first object from before the change.
  • a second object B2 is shaped like a rectangle and may include marks for enlargement/reduction.
  • the second object B2 may be changed in size.
  • the second object B2 may be set as a rectangular object, or may be changed in shape according to its position.
  • a second object B3 is shaped like a semicircle, and may display a logo image of the first object from before the change. Further, the second object B3 may be highlighted to be distinguished from other objects. The second object B3 highlighted to be distinguished from other second objects may include, for example, a broadcast image currently being broadcast, unlike a stored image whose reproduction is controllable, or a messenger informing that a message has arrived.
  • a second object B4 may be a triangular object.
  • a second object B5 may be positioned in an edge region on the screen but maintain the shape of the first object, unlike the other second objects.
  • the second object B5 may be provided to display content, and set to have transparency so as to obstruct the view of other first objects less. For instance, when the first object is a schedule chart, brief content may be maintained to provide information to a user.
  • FIG. 12 illustrates that objects are displayed as overlapped in the user terminal 10 according to one embodiment of the present invention.
  • the second objects may pile up in the edge region on the screen.
  • a user may move the plurality of first objects in similar directions and change them into the second objects.
  • the controller 100 may position the moved second objects at opposite sides of the already positioned second object.
  • the second objects E, F, G, H, I and J may be displayed as overlapped in accordance with settings. For example, when a user moves one first object rightward and changes it into a second object, it is expected that the desired second object is present in the moving direction.
  • when a user continues to move first objects in a similar direction without overlap, the second objects may end up positioned along the right edge region of the screen in a direction different from the moving direction. In this case, a user may be somewhat confused, and therefore the second objects may be overlapped and grouped. Of course, the second objects may be overlapped to any extent as long as a center region of the screen is not obstructed.
  • FIG. 13 illustrates descriptions about each of overlapped objects in the user terminal 10 according to one embodiment of the present invention.
  • when the second objects are overlapped as described in FIG. 12, it may be difficult to know which application program corresponds to which of the overlapped second objects. Therefore, when a user touches a group of second objects, or when a user comes within a predetermined distance from the screen of the display 120 to make a touch, guidance may be provided as to which application program corresponds to each second object.
  • the guidance may be given through a UI with regard to every second object, or only with regard to the second object selected by a user.
  • a guide UI (a) may be displayed with regard to the overlapped second objects E, F, G, H, I and J.
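  • A sketch of this guide behavior, with illustrative names and labels: when a touch lands on (or near) the grouped second objects, a label can be produced for each docked object so the user can tell which application program it represents.

```kotlin
// Hypothetical guide for overlapped second objects; Docked and the label
// format are assumptions, not from the patent.
data class Docked(val id: Char, val appName: String)

fun guideLabels(group: List<Docked>, touchNearGroup: Boolean): List<String> =
    if (touchNearGroup) group.map { "${it.id}: ${it.appName}" } else emptyList()

fun main() {
    val group = listOf(Docked('E', "Messenger"), Docked('F', "Gallery"))
    println(guideLabels(group, touchNearGroup = true)) // [E: Messenger, F: Gallery]
}
```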
  • FIG. 14 illustrates that an object moved to an edge region and reduced in size is returned to a center region on a screen by a user's touch swing input in the user terminal 10 according to one embodiment of the present invention.
  • the second object is changed into the first object and then displayed on the screen.
  • the first object may be moved to and displayed in one of the regions A1 to A2 in accordance with the user's speed of dragging leftward over the distance D1. The higher the user's dragging speed, the farther leftward the first object moves; the lower the dragging speed, the less far leftward it moves.
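  • A sketch of this speed-dependent return; the normalizing constant and the assumed x-positions of regions A1 and A2 are illustrative, not from the patent.

```kotlin
// Map a leftward drag speed over distance D1 to a landing x-position between
// region A2 (slow drag, nearer the right) and region A1 (fast drag, farther left).
fun restoreX(dragSpeedPxPerSec: Float, screenW: Float): Float {
    val t = (dragSpeedPxPerSec / 3000f).coerceIn(0f, 1f) // 0 = slow, 1 = fast (assumed scale)
    val xA2 = 0.6f * screenW // assumed landing x for a slow drag
    val xA1 = 0.2f * screenW // assumed landing x for a fast drag
    return xA2 + (xA1 - xA2) * t
}
```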
  • FIGS. 15A, 15B illustrate that a plurality of objects are moved to an edge region and reduced in size and returned in a user terminal according to one embodiment of the present invention.
  • a user makes touches T1, T2 and T3 on a plurality of first objects A1, A2 and A3 using three fingers, and then releases the touches while dragging and swinging them in a direction D1.
  • the plurality of first objects A1, A2 and A3 are moved to the edge region in the direction D1 and changed into second objects.
  • that is, the plurality of first objects are moved at a time and changed into respective second objects.
  • a user makes touches T4, T5 and T6 on the plurality of second objects B, C and D using three fingers so as to release/change the second objects into the first objects, and then releases the touches while dragging and swinging them in a direction D2.
  • the second objects B, C and D are respectively changed into the plurality of first objects A1, A2 and A3 and displayed in the screen center region.
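  • A sketch of the grouped gesture dispatch; the angle tolerance and names are illustrative. Several simultaneous touches swung in roughly the same direction can be treated as one group action applied to every touched object.

```kotlin
// If all simultaneous touches agree in swing direction, return the ids of the
// touched objects so the same collapse (or restore) is applied to each.
import kotlin.math.abs

data class Touch(val objectId: String, val angleDeg: Float)

fun groupGestureTargets(touches: List<Touch>, toleranceDeg: Float = 30f): List<String>? {
    if (touches.size < 2) return null
    val mean = touches.map { it.angleDeg }.average().toFloat()
    val coherent = touches.all { abs(it.angleDeg - mean) <= toleranceDeg }
    return if (coherent) touches.map { it.objectId } else null
}
```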
  • as described above, a user can move the first object, which may obstruct the view of a center region on the screen of the display 120, to an edge region on the screen, thereby using the user terminal 10 intuitively and easily.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
KR10-2016-0008338 2016-01-22
KR1020160008338A KR20170088229A (ko) 2016-01-22 2016-01-22 User terminal and control method of the same
PCT/KR2016/004323 WO2017126744A1 (ko) 2016-01-22 2016-04-26 User terminal and control method of the same

Publications (2)

Publication Number Publication Date
US20190026012A1 US20190026012A1 (en) 2019-01-24
US10824314B2 true US10824314B2 (en) 2020-11-03

Family

ID=59362015

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/069,597 Active US10824314B2 (en) 2016-01-22 2016-04-26 User terminal and control method of the same

Country Status (3)

Country Link
US (1) US10824314B2 (ko)
KR (1) KR20170088229A (ko)
WO (1) WO2017126744A1 (ko)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112118333A (zh) * 2019-06-20 2020-12-22 Electronic device that changes an operating mode in response to user input, and corresponding method
CN113920224A (zh) * 2021-09-29 2022-01-11 北京达佳互联信息技术有限公司 Material display method and apparatus, electronic device and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090228820A1 (en) * 2008-03-07 2009-09-10 Samsung Electronics Co. Ltd. User interface method and apparatus for mobile terminal having touchscreen
WO2013051762A1 (ko) 2011-10-05 2013-04-11 한국과학기술원 베젤 영역을 이용한 사용자 단말 제어방법
US20130132885A1 (en) * 2011-11-17 2013-05-23 Lenovo (Singapore) Pte. Ltd. Systems and methods for using touch input to move objects to an external display and interact with objects on an external display
US20130275901A1 (en) * 2011-12-29 2013-10-17 France Telecom Drag and drop operation in a graphical user interface with size alteration of the dragged object
WO2013151322A1 (en) 2012-04-06 2013-10-10 Samsung Electronics Co., Ltd. Method and device for executing object on display
KR20150077774A (ko) 2013-12-30 2015-07-08 삼성전자주식회사 화면 전환 방법 및 그 장치

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
International Search Report for PCT/KR2016/004323, dated Oct. 18, 2016, 4 pages.
Written Opinion of the ISA for PCT/KR2016/004323, dated Oct. 18, 2016, 7 pages.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD918249S1 (en) * 2019-02-19 2021-05-04 Beijing Xiaomi Mobile Software Co., Ltd. Mobile phone with animated graphical user interface

Also Published As

Publication number Publication date
US20190026012A1 (en) 2019-01-24
WO2017126744A1 (ko) 2017-07-27
KR20170088229A (ko) 2017-08-01

Similar Documents

Publication Publication Date Title
US9467732B2 (en) Display apparatus and control method for displaying an operational state of a user input
KR101092722B1 User interface device for operating a multimedia system of a vehicle
CN105323623B Display apparatus, multi-display system including the display apparatus, and control method thereof
KR102169521B1 Input apparatus, display apparatus and control method thereof
US9342168B2 Input apparatus, display apparatus, control method thereof and display system
KR102441357B1 Display apparatus and control method thereof
KR20130080891A Display apparatus and control method thereof
US10824314B2 User terminal and control method of the same
KR20130094044A Apparatus and method for changing caption attributes in an image display apparatus
KR101943419B1 Input apparatus, display apparatus, control method thereof and display system
KR20160056658A Around view monitor system and control method thereof
US20140240263A1 Display apparatus, input apparatus, and control method thereof
JP2015039084A Image display apparatus set
KR20110103789A Remote controller and control method thereof, display apparatus and control method thereof, and display system and control method thereof
US20130176505A1 Input apparatus, display apparatus and methods for controlling a display through user manipulation
KR102250091B1 Display apparatus and display method
KR20160032883A Display apparatus and method of displaying an indicator thereof
KR20180071725A Display apparatus and method
KR20140113096A User terminal apparatus and control method thereof
US9582150B2 User terminal, electronic device, and control method thereof
KR20170054866A Display apparatus and control method thereof
US9940012B2 Display device, calibration device and control method thereof
EP3220254B1 Display apparatus and method of operating the same
US20160110206A1 Display apparatus and controlling method thereof
KR20130033182A Method of operating an image display apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEE, SEUNG HUN;KANG, DO YOUN;PARK, SUNG BAE;AND OTHERS;SIGNING DATES FROM 20180706 TO 20180711;REEL/FRAME:046567/0708

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY