WO2017086689A1 - Electronic device and control method therefor - Google Patents

Electronic device and control method therefor

Info

Publication number
WO2017086689A1
WO2017086689A1 (PCT/KR2016/013195)
Authority
WO
WIPO (PCT)
Prior art keywords
display
display area
touch
area
size
Prior art date
Application number
PCT/KR2016/013195
Other languages
English (en)
Korean (ko)
Inventor
조규현
정희석
Original Assignee
삼성전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020160013523A external-priority patent/KR102222338B1/ko
Application filed by 삼성전자 주식회사
Priority to US15/775,194 (granted as US10817022B2)
Publication of WO2017086689A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1615Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1624Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with sliding enclosures, e.g. sliding keyboard or display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1637Details related to the display arrangement, including those related to the mounting of the display in the housing
    • G06F1/1652Details related to the display arrangement, including those related to the mounting of the display in the housing the display being flexible, e.g. mimicking a sheet of paper, or rollable
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1675Miscellaneous details related to the relative movement between the different enclosures or enclosure parts
    • G06F1/1677Miscellaneous details related to the relative movement between the different enclosures or enclosure parts for detecting open or closed state or particular intermediate positions assumed by movable parts of the enclosure, e.g. detection of display lid position with respect to main body in a laptop, detection of opening of the cover of battery compartment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F9/00Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements
    • G09F9/30Indicating arrangements for variable information in which the information is built-up on a support by selection or combination of individual elements in which the desired character or characters are formed by combining individual elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Definitions

  • the present invention relates to an electronic device and a control method thereof, and more particularly, to an electronic device including a display that is wound around a roll and whose display-area size is changed by the rotation of the roll.
  • An object of the present disclosure is to provide an electronic device for providing new information to a display in which a display area is changed when a touch and rolling interaction or a touch and unrolling interaction is input, and a control method thereof.
  • an electronic device includes a housing including a roll; a touch display wound around the roll and configured to change the size of its display area according to rotation of the roll and to sense a user touch; a detector configured to detect the size of the display area; and a processor electrically connected to the touch display and the detector. When the size of the display area is increased by rotation of the roll while a user touch is sensed on an area of the touch display and an execution screen of an application is provided on the display area, the processor may control the touch display to provide new information on the display area.
  • a control method of an electronic device having a housing including a roll and a touch display that is wound around the roll, whose display-area size is changed according to rotation of the roll, and which displays a screen including at least one component on the display area, includes: providing an execution screen of an application on the display area; detecting an increase in the size of the display area caused by rotation of the roll while a user touch is detected in one area of the touch display; and providing new information along with the execution screen of the application on the display area.
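The method steps above can be sketched as a minimal control flow. The event parameters and the `state` dict below are hypothetical names for illustration; the patent describes device behavior, not a software API.

```python
# Hedged sketch: provide new information only when the display area grows
# while a user touch is held ("touch-and-unrolling"); a plain unroll with
# no touch just enlarges the application screen.

def on_unroll_event(touched: bool, size_increased: bool, state: dict) -> dict:
    """Decide what to show after a display-area change event."""
    if size_increased and touched:
        # Touch-and-unrolling interaction: keep the app screen, add new info.
        state["screens"] = ["app_execution", "new_info"]
    elif size_increased:
        # Unrolling without a touch: the app screen grows to fill the area.
        state["screens"] = ["app_execution"]
    return state

state = on_unroll_event(touched=True, size_increased=True,
                        state={"screens": ["app_execution"]})
```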
  • the user's usability may be improved by providing new information.
  • FIG. 1 is a view for explaining an embodiment in which new information is provided when the size of the display area of an electronic device changes after a user touch is detected in the display area, according to an embodiment of the present disclosure;
  • FIGS. 2A and 2B illustrate an electronic device having a rollable display according to an embodiment of the present disclosure;
  • FIGS. 3A and 3B are block diagrams illustrating a hardware configuration of an electronic device according to an embodiment of the present disclosure;
  • FIGS. 4A and 4B are diagrams for describing various embodiments of providing new information according to the location where a user touch is detected, according to an embodiment of the present disclosure;
  • FIG. 5 is a view for explaining an embodiment in which the display area is increased in a state where a user touch is not detected, according to an embodiment of the present disclosure;
  • FIGS. 6A and 6B are diagrams for describing various embodiments of the position where a user touch is detected when the display area is reduced, according to an embodiment of the present disclosure;
  • FIG. 7 is a view for explaining an embodiment in which the display area is reduced in a state where a user touch is not detected, according to an embodiment of the present disclosure;
  • FIGS. 8 and 9 are diagrams for describing an exemplary embodiment in which the screen displayed on the display area is changed according to a user interaction, according to an embodiment of the present disclosure; and
  • FIG. 10 is a flowchart illustrating a control method of an electronic device according to an embodiment of the present disclosure.
  • “A and/or B” may include A, may include B, or may include both A and B.
  • “first”, “second”, and the like in the present disclosure may modify various elements of the present disclosure, but do not limit the corresponding elements.
  • the above expressions do not limit the order and / or importance of the corresponding elements.
  • the above expressions may be used to distinguish one component from another.
  • both a first user device and a second user device are user devices and represent different user devices.
  • the first component may be referred to as the second component, and similarly, the second component may also be referred to as the first component.
  • the user interaction may include at least one of a touch interaction, a rolling interaction, an unrolling interaction, a touch-and-rolling interaction, a touch-and-unrolling interaction, a bending interaction, a voice interaction, a button interaction, a motion interaction, and a multimodal interaction, but is not limited thereto.
  • a “rolling interaction” is an interaction in which the flexible display is wound to reduce the display area of the flexible display, and may be an interaction in which the flexible display enters the housing of the electronic device by the rotation of the roll.
  • an “unrolling interaction” is an interaction in which the flexible display is unfolded to increase the display area of the flexible display, and may be an interaction in which the flexible display is drawn out of the housing of the electronic device by the rotation of the roll.
  • the “touch and rolling interaction” or “touch and unrolling interaction” may be an interaction in which the flexible display is drawn into or taken out of the housing of the electronic device by the rotation of the roll while the user touches the flexible display.
  • an “application” refers to a set of computer programs designed to perform a particular task.
  • applications may vary; examples include web applications, game applications, video playback applications, map applications, memo applications, calendar applications, phone book applications, broadcast applications, exercise support applications, payment applications, photo folder applications, medical device control applications, and the like.
  • “layout” may refer to the number of components constituting the screen, the positions of the components, and the display method of display items within the components. That is, a change in the layout of the screen may mean that the number of components constituting the screen is changed, the positions of the components are changed, or the display method of display items within the components is changed (for example, video content is changed to an icon).
  • the position of the component may be a relative position between the components.
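The notion of a layout change can be made concrete with a small sketch. The `Component` dataclass and the compact-layout rule below are assumptions for illustration, not the patent's actual data model.

```python
# Illustrative "layout change": the same components are re-described with a
# different position and display form (e.g. video content becomes an icon,
# and components are stacked vertically).

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    x: int
    y: int
    form: str  # e.g. "video", "icon", "text"

def to_compact_layout(components: list[Component]) -> list[Component]:
    """Collapse video content to icons and stack components at x = 0."""
    return [Component(c.name, 0, i * 40, "icon" if c.form == "video" else c.form)
            for i, c in enumerate(components)]
```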
  • the electronic device 100 may be a mobile terminal such as a smartphone, but this is only one embodiment; it may be implemented as various devices such as a tablet PC, a notebook PC, a desktop PC, a digital TV, and the like.
  • the electronic device 100 may display a screen including at least one component.
  • the screen may be various screens such as an application execution screen, a menu screen, a lock release screen, and the like.
  • the at least one component may include various components such as content, icons, text, widgets, graphic items (eg, progress bar, etc.).
  • the electronic device 100 may display an execution screen of an application.
  • the electronic device 100 includes a left housing 110-1, a right housing 110-2, and a rollable touch display 120.
  • the left housing 110-1 may include a roll 115 around which the display 120 may be wound and an opening 117 through which the display 120 may be drawn in/drawn out, as shown in FIG. 2A.
  • the right housing 110-2 may be combined with the display 120 to be integrally formed.
  • the display 120 may be drawn in / drawn out through the opening 117 according to the rotation of the roll.
  • when the display 120 is drawn into the housing through the opening 117, the display area of the display 120 may decrease, and when the display 120 is drawn out through the opening 117, the display area of the display 120 may increase.
  • the rotation of the roll may be performed by a user interaction in which the user pulls or pushes the right housing 110-2, but this is only an example; it may also be performed automatically by a driving unit for driving the roll.
  • the rotation of the roll described above is made through an interaction of pulling or pushing only the right housing 110-2, but this is only an example; the roll may be provided to rotate through an interaction of pulling or pushing at least one of the two housings 110-1 and 110-2.
  • the electronic device 100 may include a configuration for fixing the curled state of the display 120 and the unfolded state of the display 120.
  • the electronic device 100 may draw the display out of the housing of the electronic device 100 as the roll rotates according to the unrolling interaction.
  • the electronic device 100 may draw the display into the housing of the electronic device 100 as the roll rotates according to the rolling interaction.
  • the electronic device 100 may provide new information on the changed display area.
  • the new information may include various information such as a UI including at least one item for controlling the electronic device 100, a UI related to an executed application, notification information, history information, and the like.
  • the electronic device 100 may display an execution screen 50 of an application (eg, a video application).
  • an application eg, a video application
  • as illustrated in FIG. 1B, the electronic device 100 may display the UI 70 including an item for controlling the electronic device 100 together with the execution screen 50 of the existing application.
  • the UI 70 including the item for controlling the electronic device 100 may determine a position to be displayed according to the area where the user touch is detected. For example, when the size of the display area is increased by the rotation of the roll while the user touches the right area of the touch display 120, the electronic device 100 may determine the left area that is the opposite side to the area where the user touch is detected. New information can be displayed at.
  • the UI 70 including the item for controlling the electronic device 100 may increase in size to correspond to the increased size of the display area. That is, while the size of the execution screen 50 of the previously displayed application is maintained, the size of the UI 70 including items for controlling the electronic device 100 may be increased by the size of the increased display area.
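The placement and sizing rules above can be sketched in a few lines. The function names are illustrative assumptions; the patent describes the behavior, not this API.

```python
# Hedged sketch: the new-information UI goes to the side opposite the touch,
# and its width equals exactly the area gained by unrolling, while the
# existing app screen keeps its size.

def new_info_side(touch_x: float, display_width: float) -> str:
    """Place new information opposite the half of the screen being touched."""
    return "left" if touch_x >= display_width / 2 else "right"

def new_info_width(old_width: float, new_width: float) -> float:
    """The app screen is maintained; new information fills the gained width."""
    return max(0.0, new_width - old_width)
```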
  • the electronic device 100 may increase the size of the execution screen of the existing application in proportion to the increased display area.
  • the electronic device 100 may change and display the layout of the execution screen of the existing application. In this case, changing the layout may mean that the relative arrangement of the components included in the screen is changed, the number of components is changed, or the display form of the components is changed.
  • FIG. 3A is a block diagram schematically illustrating a configuration of an electronic device 100 according to an embodiment of the present disclosure. As shown in FIG. 3A, the electronic device 100 includes a touch display 120, a detector 130, and a processor 140.
  • the touch display 120 displays image data.
  • the touch display 120 may be implemented as a flexible display.
  • the touch display 120 may be drawn into or out of the housing 110 of the electronic device 100 according to the rotation of the roll provided in the housing 110 of the electronic device 100.
  • the display area of the touch display 120 may be changed according to the rotation of the roll. In detail, when the touch display 120 is drawn into the housing according to the rotation of the roll, the display area of the touch display 120 may be reduced. In addition, when the touch display 120 is drawn out to the housing as the roll rotates, the display area of the touch display 120 may increase.
  • the touch display 120 may display a screen including at least one component under the control of the processor 140.
  • the display 120 may display an execution screen of an application (eg, a web application, a camera application, a gallery application, a message application, etc.) including a plurality of contents and icons.
  • the display 120 may display a standby screen including a plurality of widgets and a plurality of icons.
  • the display 120 may display an unlock screen including a plurality of widgets and an unlock icon.
  • the touch display 120 may detect a user touch for controlling the electronic device 100.
  • the touch display 120 may include a display panel for outputting an image and a touch panel for sensing a user touch.
  • the sensing unit 130 may include various sensors, and may transmit information collected by each sensor to the processor 140.
  • the sensing unit 130 may include a sensor for detecting the withdrawal amount of the display 120 according to the rotation of the roll.
  • the display 120 may be wound into the left housing 110-1 or drawn out of the left housing 110-1.
  • the extraction direction of the display 120 may be a lower direction 200.
  • the sensing unit 130 may be an optical sensor and may detect the pattern 131 printed on the display 120.
  • the pattern 131 may be a QR code, a barcode, a black and white pattern, or a color pattern corresponding to the pixel line of the display 120. As the display 120 is extracted, the value of the pattern 131 printed on the display 120 may be changed.
  • the processor 140 may detect the withdrawal amount of the display 120 according to the changed value of the pattern 131.
  • the withdrawal amount of the display 120 may be determined in various ways.
  • the withdrawal amount of the display 120 may be determined through a motion sensor such as a gyro sensor, and the withdrawal amount of the display 120 may be determined through a rotation detection sensor that detects rotation of a roll.
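The pattern-based approach above can be sketched as follows. The encoding (one distinct pattern value per pixel line, a fixed line pitch) is an assumption for illustration; the patent does not specify the mapping.

```python
# Illustrative sketch: an optical sensor near the opening reads the printed
# pattern (e.g. a barcode index per pixel line); the last index seen maps
# directly to how far the display has been drawn out of the housing.

LINE_PITCH_MM = 0.1  # assumed physical spacing between pattern lines

def withdrawal_amount_mm(last_pattern_index: int) -> float:
    """Convert the pattern index read by the optical sensor into the
    drawn-out length of the display."""
    return last_pattern_index * LINE_PITCH_MM
```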
  • the processor 140 is electrically connected to the touch display 120 and the sensing unit 130, and controls the overall operation of the electronic device 100. Specifically, when the size of the display area is increased by rotation of the roll while a user touch is detected in one area of the touch display 120 and an execution screen of an application is provided in the display area, the processor 140 may control the touch display 120 to provide new information on the display area.
  • the processor 140 may control the touch display 120 to provide new information in the display area while maintaining the size of the execution screen of the application. That is, the processor 140 may control the touch display 120 to increase the size of the area providing new information according to the size of the increased display area while maintaining the size of the execution screen of the application.
  • the processor 140 may control the touch display 120 to provide new information at a position opposite to the area where the user touch is detected. For example, when the size of the display area is increased by rotation of the roll 115 while the user touches the right area of the touch display 120, the processor 140 may control the touch display 120 to provide new information in the left area of the display area. In addition, when the size of the display area is increased by rotation of the roll 115 while the user touches the left area of the touch display 120, the processor 140 may control the touch display 120 to provide new information in the right area of the display area.
  • when the size of the display area is increased while a user touch is not detected, the processor 140 may not display new information and may control the touch display 120 to increase the size of the execution screen of the application provided in the display area as the display area increases.
  • the processor 140 may provide a screen having a different layout to the user depending on whether the user touch is detected.
  • the processor 140 may control the touch display 120 to provide a different screen according to the position of the area where the user touch is detected.
  • for example, the processor 140 may control the touch display 120 to provide only the execution screen of the application in the display area.
  • alternatively, the processor 140 may control the touch display 120 to provide only the new information in the display area.
  • as the size of the display area decreases, the processor 140 may control the touch display 120 to reduce the sizes of the execution screen and the new information.
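The reduction behavior can be sketched as a proportional rescaling. The function name and the linear scaling rule are assumptions for illustration; the patent only states that both regions are reduced.

```python
# Hedged sketch: when the display area shrinks, scale both the execution
# screen and the new-information region by the same factor so they still
# fill the reduced width.

def shrink_regions(app_w: float, info_w: float, new_total: float) -> tuple[float, float]:
    """Return the new widths of (app screen, new-info region)."""
    scale = new_total / (app_w + info_w)
    return app_w * scale, info_w * scale
```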
  • the processor 140 may control the touch display 120 to provide other information in the area where the new information is displayed.
  • FIG. 3B is a detailed block diagram illustrating a configuration of an electronic device according to an embodiment of the present disclosure.
  • the electronic device 100 may include a sensing unit 130, a processor 140, a memory 150, a driving module 160, a user input module 170, a touch display 120, a communication module 180, and a bus 190. This is only one embodiment; depending on the implementation, a new component may be added or at least one component may be deleted.
  • the processor 140 may receive a command from the other components (eg, the memory 150, the driving module 160, the user input module 170, the touch display 120, and the communication module 180) via the bus 190, decode the received command, and execute an operation or data processing according to the decoded command.
  • the processor 140 controls overall operations of the electronic device 100.
  • the processor 140 may change at least one of the size and the layout of a component according to the change of the display area detected by the detector 130.
  • the processor 140 may be implemented by at least one of a graphics processing unit (GPU), a central processing unit (CPU), and an application processor (AP), and may also be implemented as one chip.
  • the memory 150 may store commands or data received from or generated by the processor 140 or the other components (eg, the driving module 160, the user input module 170, the touch display 120, the communication module 180, etc.).
  • the memory 150 may include, for example, programming modules such as a kernel 151, middleware 152, an application programming interface (API) 153, or an application 154.
  • Each of the aforementioned programming modules may be composed of software, firmware, hardware, or a combination of two or more thereof.
  • the memory 150 may be implemented with various memories.
  • the memory may be implemented as an internal memory (for example, a volatile memory such as DRAM or SRAM, or a nonvolatile memory such as OTPROM or PROM) or an external memory (for example, a flash drive, a CD, an SD card, etc.).
  • the drive module 160 is a module capable of driving the rotation of the roll 115 to roll or unfold the display.
  • the driving module 160 may rotate the roll 115 automatically according to a user command input through the user input module 170, or semi-automatically according to an external force (a push or pull) applied by the user to the housing.
  • the driving module 160 may roll or unfold the display through a driving method other than rotation of the roll.
  • the driving module 160 may be implemented as a circuit for controlling the motor and the motor, but this is only an example and may be implemented in another configuration.
  • the user input module 170 may receive a command or data from a user and transfer the command or data to the processor 140 or the memory 150 through the bus 190.
  • the user input module 170 may include various user input devices such as a (digital) pen sensor, a key, an ultrasonic input device, a keyboard, a mouse, a voice input device, a pointing device, a remote control signal receiver, and the like.
  • the touch display 120 may display an image, a video, or data to a user.
  • the touch display 120 may be implemented as a liquid-crystal display (LCD), an active-matrix organic light-emitting diode (AM-OLED) display, or the like, and may be implemented to be flexible, transparent, or wearable.
  • the touch display 120 may include a display panel and a touch panel.
  • the communication module 180 may connect communication between the other electronic device 192 and the electronic device 100, and communicate with the other electronic device 193 and the server 194 through the network 191.
  • the communication module 180 may support a predetermined short-range communication protocol (eg, wireless fidelity (WiFi), Bluetooth (BT), or near field communication (NFC)) or predetermined network communication (eg, the Internet, a local area network (LAN), a wide area network (WAN), a telecommunication network, a cellular network, a satellite network, plain old telephone service (POTS), etc.).
  • Each of the electronic devices 192 and 193 may be the same (eg, the same type of) device as the electronic device 100, or a different (eg, another type of) device.
  • the communication module 180 may receive data about a web application by establishing a communication connection with an external server 194 through the network 191.
  • the sensing unit 130 may include various sensors such as a gyro sensor, a gesture sensor, a grip sensor, and an acceleration sensor, and provides information collected by each sensor to the processor 140 in predetermined time units.
  • the detector 130 may include at least one of, for example, a gesture sensor, a gyro sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, an RGB (red, green, blue) sensor, a biometric sensor, a temperature/humidity sensor, an illumination sensor, or an ultraviolet (UV) sensor.
  • the detector 130 may measure a physical quantity or detect an operation state of the electronic device to convert the measured or detected information into an electrical signal.
  • the sensing unit 130 may further include a control circuit for controlling at least one or more sensors belonging thereto.
  • the sensing unit 130 may include various sensors for detecting the rotation of the roll.
  • the bus 190 may be a circuit connecting the aforementioned components to each other and transferring a communication signal (eg, a control message) between the aforementioned components.
  • the term “unit” or “module” as used in the present disclosure may refer to a unit including one or a combination of two or more of hardware, software, or firmware.
  • “unit” or “module” may be used interchangeably with terms such as, for example, logic, logical block, component, or circuit.
  • the “unit” or “module” may be a minimum unit or part of an integrally constructed part.
  • the “module” may be a minimum unit or part that performs one or more functions.
  • the “module” can be implemented mechanically or electronically.
  • A “module” in accordance with the present disclosure may be an application-specific integrated circuit (ASIC) chip, field-programmable gate arrays (FPGAs), or a programmable logic device that performs certain operations, known or to be developed in the future.
  • FIGS. 4A to 9 are diagrams for describing an exemplary embodiment in which new information is provided depending on whether a user touch is present when the size of the display area increases due to an unrolling interaction.
  • the processor 140 controls the touch display 120 to display an execution screen of a video application.
  • the processor 140 may detect, through the detector 130, a user touch touching the left area of the touch display 120, and may detect a touch and unrolling interaction in which the user extends the display area in the right direction.
  • the processor 140 may control the touch display 120 to display a UI 420 including execution icons capable of executing other applications in the right region of the touch display, as shown in FIG. 4A(b).
  • the processor 140 may display new information on the right area of the touch display 120.
  • In this case, the processor 140 may control the touch display 120 to maintain the size of the execution screen 410 of the existing video application and to gradually increase the size of the UI 420 including the execution icons as the display area increases.
  • The UI 420 including the execution icons may be gradually enlarged as the display area increases, but this is only an example; the UI 420 including the execution icons may instead be displayed all at once after the display area has been extended.
  • the processor 140 controls the touch display 120 to display the execution screen 410 of the video application.
  • the processor 140 may detect, through the detector 130, a user touch touching the right area of the touch display 120, and may detect a touch and unrolling interaction in which the user extends the display area in the left direction.
  • the processor 140 may control the touch display 120 to display a UI 420' including execution icons capable of executing other applications in the left region of the touch display, as shown in FIG. 4B(b).
  • the processor 140 may control the touch display 120 to display new information on the left area of the touch display 120.
  • Although FIGS. 4A and 4B illustrate that the same UI is displayed regardless of the area in which the new information is displayed, this is only an example, and different UIs may be displayed according to the area in which the new information is displayed.
  • For example, the processor 140 may control the touch display 120 to display a first UI (eg, a UI including execution icons capable of executing applications) in the left area, and a second UI (eg, a UI including icons for controlling the electronic device 100) in the right area.
  • Although FIGS. 4A and 4B illustrate that new information is displayed in the area opposite the point where the user touch is detected, this is only an example, and the new information may instead be displayed in the area corresponding to the point where the user touch is detected.
  • When the size of the display area is increased by the rotation of the roll 115 in a state in which no user touch is detected on the touch display 120, the processor 140 may control the touch display 120 to increase the size of the execution screen of the currently displayed application without displaying new information.
  • the processor 140 controls the touch display 120 to display the execution screen 510 of the video application.
  • the processor 140 may detect an unrolling interaction in which the size of the display area is increased in both directions in the state where the user touch is not detected on the touch display 120 through the sensing unit 130.
  • When the unrolling interaction is detected, the processor 140 may control the touch display 120 to increase the size of the execution screen 510' of the currently displayed application according to the increased display area size without displaying new information, as shown in FIG. 5B.
  • the processor 140 may control the touch display 120 to provide a screen having a different layout depending on whether the user touches the touch screen.
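The unroll-time behavior described above (enlarge the current screen when no touch is held; otherwise keep the application screen at its size and fill the newly exposed region with new information) can be sketched as follows. This is only an illustrative sketch, not the patent's implementation; the function name and the one-dimensional pixel layout model are assumptions.

```python
def layout_on_unroll(old_width, new_width, touch_x=None):
    """Decide the layout when the display area grows from old_width
    to new_width. touch_x is the x-coordinate of a held user touch,
    or None when no touch is detected.

    Returns (app_region, new_info_region), each an (x, width) tuple;
    new_info_region is None when no new information is shown.
    """
    if touch_x is None:
        # No touch: simply enlarge the current execution screen.
        return (0, new_width), None
    exposed = new_width - old_width
    if touch_x < old_width / 2:
        # Touch held on the left: app screen keeps its size,
        # new information fills the newly exposed right region.
        return (0, old_width), (old_width, exposed)
    # Touch held on the right: new information fills the left region.
    return (exposed, old_width), (0, exposed)
```

For instance, unrolling an 800-pixel-wide display to 1100 pixels while touching its left side would keep the 800-pixel application screen and dedicate the newly exposed 300 pixels on the right to the UI of execution icons.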
  • 6A to 7 are diagrams for describing an exemplary embodiment in which new information is provided depending on the presence or absence of a user touch when the size of the display area is reduced by the rolling interaction.
  • the processor 140 may control the touch display 120 to display an execution screen 610 of a video application and a UI 620 including execution icons for executing applications.
  • the processor 140 may detect, through the sensor 130, a user touch touching the right area on which the UI 620 including the execution icons is displayed, and may detect a touch and rolling interaction that reduces the display area.
  • When the touch and rolling interaction is detected, the processor 140 may control the touch display 120 to remove the execution screen 610 of the video application and display only the UI 620' including execution icons for executing other applications, as shown in FIG. 6A(b).
  • the UI 620 ′ including execution icons capable of executing another application may be changed according to the size of the reduced display area.
  • the processor 140 may control the touch display 120 to display an execution screen 610 of a video application and a UI 620 including execution icons for executing applications.
  • the processor 140 may detect a user touch on a left region on which the execution screen 610 of the video application is displayed through the sensing unit 130, and sense a touch and rolling interaction that reduces the display area.
  • When a touch and rolling interaction is detected, the processor 140 may control the touch display 120 to remove the UI 620 including execution icons capable of executing other applications and display only the execution screen 610' of the video application, as shown in FIG. 6B(b). In this case, the execution screen 610' of the video application may be changed according to the size of the reduced display area.
  • That is, the processor 140 may control the touch display 120 to remove the screen displayed on the area where the user touch is not detected and display only the screen displayed on the area where the user touch is detected.
  • When the size of the display area is reduced in a state in which no user touch is detected, the processor 140 may control the touch display 120 to reduce the size of the execution screen of the application and of the new information according to the reduced size of the display area.
  • the processor 140 may control the touch display 120 to display an execution screen 710 of a video application and a UI 720 including execution icons for executing applications.
  • the processor 140 may detect a rolling interaction in which the size of the display area is reduced in at least one direction in a state in which the user touch is not detected on the touch display 120 through the sensing unit 130.
  • When the rolling interaction is detected, the processor 140 may control the touch display 120 to maintain the positions of the execution screen 710' of the video application and the UI 720' including the execution icons and to reduce only their sizes, as shown in FIG. 7B.
  • the processor 140 may control the touch display 120 to provide a screen having a different layout depending on whether the user touches the touch screen.
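The rolling (shrink) behavior can be sketched in the same illustrative style: when the user holds a touch on one region, only that region survives; with no touch, every region keeps its position and is scaled down proportionally. The function name and width-only model are, again, assumptions rather than the patent's implementation.

```python
def layout_on_roll(regions, new_width, touched=None):
    """regions: dict mapping region name to its current width.
    touched: name of the region under the user's finger, or None.

    Returns the region widths after the display shrinks to new_width.
    """
    if touched is not None:
        # Keep only the screen the user is touching; the other
        # screen is removed and the survivor fills the display.
        return {touched: new_width}
    # No touch: keep every region, shrink all of them proportionally.
    scale = new_width / sum(regions.values())
    return {name: round(width * scale) for name, width in regions.items()}
```

Rolling a display showing a 600-pixel video screen and a 200-pixel icon UI down to 400 pixels would, with a touch on the video, leave only a 400-pixel video screen; with no touch, both regions would shrink by half.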
  • the processor 140 may control the touch display 120 to provide different information according to the area in which the new information is displayed.
  • a screen displayed on the touch display 120 of the electronic device 100 may conceptually include a main area 810 and a plurality of sub areas 820-1, 820-2, 830-1, and 830-2.
  • The main area 810 may be an area where various screens (for example, an execution screen of an application, an idle screen, a lock screen, etc.) are displayed, and the sub areas may be areas where new information generated by the touch and unrolling interaction is displayed, such as a UI including execution icons for executing applications, information related to the currently executed application, or a UI for controlling the electronic device 100.
  • each of the sub areas 820-1, 820-2, 830-1, and 830-2 may provide different information from each other.
  • the sub areas 820-1, 820-2, 830-1, and 830-2 may provide different types of information according to locations.
  • the sub areas 820-1 and 820-2 located on the left side can provide information related to the currently executed application.
  • For example, when the currently executed application is a video application, the first sub area 820-1 may provide information about the currently played video, and the second sub area 820-2 may provide information about the play list of the video application.
  • In addition, a sub area located in the left region may provide an execution screen of an application related to the currently executed application (for example, when the currently executed application is a text messaging application, an execution screen of the telephone application as the related application).
  • the sub-areas 830-1 and 830-2 located on the right side can provide information not related to the currently executed application.
  • For example, the third sub area 830-1 may provide a UI including execution icons for executing other applications, and the fourth sub area 830-2 may provide a UI for various settings of the electronic device (eg, a communication setting, a volume setting, a screen setting, etc.).
  • the sub-region located in the right region may provide a UI including notification information.
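The division of labor among the sub areas described above (left ones tied to the running application, right ones independent of it) can be captured by a small lookup. This is a hypothetical sketch; the keys mirror the reference numerals, and the values merely paraphrase the description.

```python
# Hypothetical registry of sub-area contents: (side, description).
SUB_AREA_CONTENT = {
    "820-1": ("left", "information about the currently played video"),
    "820-2": ("left", "play list of the running application"),
    "830-1": ("right", "execution icons for other applications"),
    "830-2": ("right", "device settings (communication, volume, screen)"),
}

def is_app_related(sub_area):
    """Left-side sub areas carry information related to the running
    application; right-side sub areas carry unrelated information."""
    side, _description = SUB_AREA_CONTENT[sub_area]
    return side == "left"
```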
  • the processor 140 may change the displayed sub area through the drag interaction while the main area 810 and the sub area are displayed.
  • the processor 140 may control the touch display 120 to display the main area 810 and the first sub area 820-1 together.
  • an execution screen of an application may be provided in the main area 810, and new information generated by touch and unrolling interaction may be provided in the sub area 820.
  • the processor 140 may detect a drag interaction of dragging the first sub-region 820-1 in the upward direction while touching the main region 810 through the touch display 120.
  • When the drag interaction is detected, the processor 140 may control the touch display 120 to remove the first sub area 820-1 and display the main area 810 and the second sub area 820-2 together, as shown in FIG. 7B.
  • the second sub-region 820-2 may be moved upward and displayed according to the drag interaction.
  • the processor 140 may detect, through the touch display 120, a drag interaction of touching a point in each of the main area 810 and the second sub area 820-2 and then dragging in the left direction.
  • When the drag interaction is detected, the processor 140 may control the touch display 120 to remove the second sub area 820-2, move the main area 810 to the left, and display the third sub area 830-1 in the right region, as shown in FIG. 7C.
  • the third sub-region 830-1 may be displayed by being moved in the left direction according to the drag interaction.
  • the processor 140 may detect, through the touch display 120, a drag interaction of dragging the third sub area 830-1 in the upward direction while touching the main area 810.
  • When the drag interaction is detected, the processor 140 may control the touch display 120 to remove the third sub area 830-1 and display the main area 810 and the fourth sub area 830-2 together, as shown in FIG. 7D.
  • the fourth sub-region 830-2 may be moved upward and displayed according to the drag interaction.
  • the processor 140 may provide various information to the user by changing the sub area through a touch interaction or a drag interaction.
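The gesture sequence walked through above (upward drags step to the next sub area, a leftward drag crosses from the left-side stack to the right-side stack) can be sketched as a tiny state machine. The ordering and any wrap-around behavior beyond the described sequence are assumptions, not part of the patent text.

```python
SUB_AREAS = ["820-1", "820-2", "830-1", "830-2"]

def next_sub_area(current, direction):
    """Return the sub area shown after a drag interaction."""
    i = SUB_AREAS.index(current)
    if direction == "up":
        # Upward drag while touching the main area: advance to the
        # next sub area (820-1 -> 820-2, 830-1 -> 830-2, ...).
        return SUB_AREAS[(i + 1) % len(SUB_AREAS)]
    if direction == "left" and current.startswith("820"):
        # Leftward drag: jump from the left-side stack to 830-1.
        return "830-1"
    return current  # unrecognized gesture: keep the current sub area
```

Replaying the described sequence (up, left, up from 820-1) visits 820-2, 830-1, and finally 830-2, matching the figure walkthrough.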
  • FIG. 10 is a flowchart illustrating a control method of the electronic device 100 according to an embodiment of the present disclosure.
  • the electronic device 100 provides an execution screen of an application in the display area (S1010).
  • The execution screen of the application is only one embodiment, and the technical idea of the present invention may also be applied to other screens (for example, a standby screen including a plurality of application execution icons, a lock screen, etc.).
  • the electronic device 100 detects an increase in the size of the display area by the rotation of the roll 115 in a state where a user touch is detected in one area of the touch display 120.
  • In response to the increase in the size of the display area, the electronic device 100 provides new information along with the execution screen of the application to the display area in operation S1030.
  • the user can receive various information while maintaining the execution screen of the application displayed in the main area, thereby enabling multitasking.
  • the electronic device 100 may improve usability of the user through the rollable display.
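The control method of FIG. 10 can be condensed to the sketch below. The class and method names are illustrative only; the patent specifies the behavior, not this structure.

```python
class RollableDevice:
    """Minimal model of the control method of FIG. 10."""

    def __init__(self):
        self.display_area = []

    def provide_app_screen(self):
        # S1010: provide the execution screen of an application
        # in the display area.
        self.display_area = ["app screen"]

    def on_display_area_grown(self, touch_held):
        # Growth of the display area by rotation of the roll is
        # detected; touch_held records whether a user touch was
        # being held on one area of the touch display.
        if touch_held:
            # S1030: provide new information along with the
            # execution screen of the application.
            self.display_area = ["app screen", "new information"]

device = RollableDevice()
device.provide_app_screen()
device.on_display_area_grown(touch_held=True)
# device.display_area now holds both the app screen and new information
```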
  • According to various embodiments, at least part of a device (eg, modules or the electronic device 100) or a method (eg, operations) may be implemented with instructions stored in a computer-readable storage medium in the form of, for example, a program module; when the instructions are executed by at least one computer, the at least one computer may perform a function corresponding to the instructions.
  • the computer-readable storage medium may be, for example, the memory 150.
  • Programs may be stored in, for example, magnetic media (such as hard disks, floppy disks, and magnetic tape), optical media (such as compact disc read only memory (CD-ROM) and digital versatile disc (DVD)), magneto-optical media (such as floptical disks), or hardware devices (such as read only memory (ROM), random access memory (RAM), or flash memory).
  • The storage medium is generally included as part of the configuration of the electronic device 100, but may instead be mounted through a port of the electronic device 100 or included in an external device (eg, a cloud, a server, or another electronic device) located outside the electronic device 100. In addition, the program may be stored in a plurality of storage media, and at least some of the plurality of storage media may be located in a device external to the electronic device 100.
  • Instructions may include high-level language code that can be executed by a computer using an interpreter, as well as machine code such as that produced by a compiler.
  • the aforementioned hardware device may be configured to operate as one or more software modules to perform the operations of the various embodiments, and vice versa.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an electronic device and a control method therefor. The electronic device comprises: a housing including a roll; a touch display which is wound on the roll, is capable of changing the size of a display area according to rotation of the roll, and detects a user touch; a sensing unit for sensing the size of the display area; and a processor electrically connected to the touch display and the sensing unit. If the size of the display area is increased by rotation of the roll while a user touch is detected on one area of the touch display when an execution screen of an application is provided in the display area, the processor may control the touch display to provide new information to the display area.
PCT/KR2016/013195 2015-11-18 2016-11-16 Dispositif électronique et son procédé de commande WO2017086689A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/775,194 US10817022B2 (en) 2015-11-18 2016-11-16 Electronic device and control method therefor

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201562256889P 2015-11-18 2015-11-18
US62/256,889 2015-11-18
KR10-2016-0013523 2016-02-03
KR1020160013523A KR102222338B1 (ko) 2015-11-18 2016-02-03 전자 장치 및 이의 제어 방법

Publications (1)

Publication Number Publication Date
WO2017086689A1 true WO2017086689A1 (fr) 2017-05-26

Family

ID=58719083

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/013195 WO2017086689A1 (fr) 2015-11-18 2016-11-16 Dispositif électronique et son procédé de commande

Country Status (1)

Country Link
WO (1) WO2017086689A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110730985A (zh) * 2017-06-08 2020-01-24 Lg电子株式会社 数字标牌及其操作方法
KR20200084480A (ko) * 2019-01-02 2020-07-13 삼성디스플레이 주식회사 표시 장치 및 그 구동 방법
CN111886569A (zh) * 2018-03-22 2020-11-03 三星电子株式会社 电子装置和执行该电子装置的功能的方法
CN115087942A (zh) * 2020-02-14 2022-09-20 华为技术有限公司 卷曲设备上的自适应显示长宽比调节器和手势

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008150600A1 (fr) * 2007-06-05 2008-12-11 Immersion Corporation Procédé et appareil pour surface tactile souple à effet haptique
US20090051830A1 (en) * 2006-01-06 2009-02-26 Yoshiteru Matsushita Mobile terminal unit, display method, display program, and recording medium
US20100167791A1 (en) * 2008-12-30 2010-07-01 Lg Electronics Inc. Portable terminal having flexible display and screen controlling method thereof
US20130201208A1 (en) * 2012-02-07 2013-08-08 Eunhyung Cho Icon display method for a pull-out display device
US20140118317A1 (en) * 2012-11-01 2014-05-01 Samsung Electronics Co., Ltd. Method of controlling output of screen of flexible display and portable terminal supporting the same

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090051830A1 (en) * 2006-01-06 2009-02-26 Yoshiteru Matsushita Mobile terminal unit, display method, display program, and recording medium
WO2008150600A1 (fr) * 2007-06-05 2008-12-11 Immersion Corporation Procédé et appareil pour surface tactile souple à effet haptique
US20100167791A1 (en) * 2008-12-30 2010-07-01 Lg Electronics Inc. Portable terminal having flexible display and screen controlling method thereof
US20130201208A1 (en) * 2012-02-07 2013-08-08 Eunhyung Cho Icon display method for a pull-out display device
US20140118317A1 (en) * 2012-11-01 2014-05-01 Samsung Electronics Co., Ltd. Method of controlling output of screen of flexible display and portable terminal supporting the same

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110730985A (zh) * 2017-06-08 2020-01-24 Lg电子株式会社 数字标牌及其操作方法
US11062683B2 (en) 2017-06-08 2021-07-13 Lg Electronics Inc. Digital signage and operating method thereof
CN110730985B (zh) * 2017-06-08 2021-11-26 Lg电子株式会社 数字标牌及其操作方法
CN111886569A (zh) * 2018-03-22 2020-11-03 三星电子株式会社 电子装置和执行该电子装置的功能的方法
KR20200084480A (ko) * 2019-01-02 2020-07-13 삼성디스플레이 주식회사 표시 장치 및 그 구동 방법
KR102572268B1 (ko) * 2019-01-02 2023-08-30 삼성디스플레이 주식회사 표시 장치 및 그 구동 방법
CN115087942A (zh) * 2020-02-14 2022-09-20 华为技术有限公司 卷曲设备上的自适应显示长宽比调节器和手势

Similar Documents

Publication Publication Date Title
KR102222338B1 (ko) 전자 장치 및 이의 제어 방법
US11348201B2 (en) Electronic device having rollable display and method of controlling the same
WO2014077530A1 (fr) Procédé d'agencement d'une liste dans un dispositif d'affichage souple et dispositif électronique associé
WO2012128548A2 (fr) Procédé et appareil de gestion d'éléments dans un presse-papiers d'un terminal portable
WO2013168885A1 (fr) Procédé de fourniture d'écran de verrouillage et dispositif de terminal pour le mettre en œuvre
WO2013055097A1 (fr) Procédé et appareil de fourniture d'une fonction de déverrouillage d'un dispositif tactile
WO2017086689A1 (fr) Dispositif électronique et son procédé de commande
WO2015178541A1 (fr) Dispositif d'affichage et son procédé de commande
WO2014084633A1 (fr) Procédé d'affichage d'applications et dispositif électronique associé
WO2014073825A1 (fr) Dispositif portable et son procédé de commande
WO2015037960A1 (fr) Dispositif et procédé de fourniture d'écran de verrouillage
WO2012036327A1 (fr) Procédé et dispositif d'affichage d'ordonnancement en communications mobiles
WO2017135797A2 (fr) Procédé et dispositif électronique pour gérer le fonctionnement d'applications
WO2016080559A1 (fr) Dispositif d'affichage pliable susceptible de fixer un écran au moyen du pliage d'un dispositif d'affichage et procédé pour commander le dispositif d'affichage pliable
WO2013133478A1 (fr) Dispositif portable et son procédé de commande
WO2014137019A1 (fr) Double appareil de déverrouillage d'un dispositif portable équipé d'un écran extensible et procédé de commande associé
WO2020085704A1 (fr) Dispositif électronique pliable et son procédé d'affichage multi-étage de contenus
WO2018004140A1 (fr) Dispositif électronique et son procédé de fonctionnement
WO2020213834A1 (fr) Dispositif électronique pour afficher des écrans d'exécution d'une pluralité d'applications et son procédé de fonctionnement
WO2021096110A1 (fr) Appareil d'affichage et procédé de commande associé
WO2022030890A1 (fr) Procédé de capture d'image à fenêtres multiples et dispositif électronique associé
WO2018056587A1 (fr) Appareil électronique et son procédé de commande
WO2015064984A1 (fr) Appareil électronique et système de communication comportant celui-ci
WO2013105759A1 (fr) Procédé et appareil pour gérer un contenu, et support d'enregistrement lisible par ordinateur sur lequel est enregistré un programme pour exécuter le procédé de gestion de contenu
WO2017030285A1 (fr) Procédé de fourniture d'ui et dispositif d'affichage utilisant celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16866641

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16866641

Country of ref document: EP

Kind code of ref document: A1