WO2017191911A1 - Electronic device and control method of electronic device - Google Patents

Electronic device and control method of electronic device

Info

Publication number
WO2017191911A1
Authority
WO
WIPO (PCT)
Prior art keywords
application
electronic device
external electronic
processor
display
Prior art date
Application number
PCT/KR2017/004027
Other languages
English (en)
Korean (ko)
Inventor
원미연
고나영
곽현지
김지혜
이승민
이원희
황우석
Original Assignee
삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020170011928A (published as KR20170124954A)
Application filed by 삼성전자 주식회사 (Samsung Electronics Co., Ltd.)
Priority to US16/098,982 (published as US20190196683A1)
Publication of WO2017191911A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/16: Sound input; Sound output

Definitions

  • the present invention relates to an electronic device including a display and a method of controlling the electronic device through a user interface displayed on the display.
  • Various embodiments of the present disclosure provide an electronic device and a method of controlling the electronic device that can control a plurality of functions provided by the first application through an object displayed on an execution screen of the second application.
  • an electronic device according to various embodiments may include a display and a processor configured to display an execution screen of a first application on the display, change the execution screen of the first application to an execution screen of a second application, and display a first object provided by the first application on the execution screen of the second application.
  • the first object may be an object for controlling a plurality of functions provided by the first application in relation to the second application.
  • a method of controlling an electronic device according to various embodiments may include displaying an execution screen of a first application on a display, changing the execution screen of the first application to an execution screen of a second application, and displaying a first object provided by the first application on the execution screen of the second application.
  • the first object may be an object for controlling a plurality of functions provided by the first application in relation to the second application.
  • in a computer-readable recording medium according to various embodiments, a program may be recorded for performing a method including displaying an execution screen of a first application on a display, changing the execution screen of the first application to an execution screen of a second application, and displaying a first object provided by the first application on the execution screen of the second application.
  • the first object may be an object for controlling a plurality of functions provided by the first application in relation to the second application.
  • according to various embodiments, a plurality of functions provided by the first application may be controlled through an object provided by the first application on the execution screen of the second application. Accordingly, the user can conveniently control the electronic device without changing the application execution screen.
  • FIG. 1 is a diagram illustrating a network system according to various embodiments of the present disclosure.
  • FIG. 2 is a block diagram illustrating a configuration of a first electronic device according to various embodiments of the present disclosure.
  • FIG. 3 is a diagram illustrating a method of executing a second application according to various embodiments of the present disclosure.
  • FIG. 4 is a diagram illustrating a method of changing a first object according to various embodiments of the present disclosure.
  • FIG. 5 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • FIG. 6 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • FIG. 7A is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • FIG. 7B is a diagram illustrating a method of outputting content through grouped external electronic devices according to various embodiments of the present disclosure.
  • FIG. 7C is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • FIG. 8 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • FIG. 9 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • FIG. 10 is a diagram illustrating a method of deleting a first object according to various embodiments of the present disclosure.
  • FIG. 11 is a diagram illustrating a function of selecting an external electronic device to output content according to various embodiments of the present disclosure.
  • FIG. 12 illustrates a function of providing a notification according to various embodiments of the present disclosure.
  • FIG. 13 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • FIG. 14 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • FIG. 15 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • FIG. 16 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • FIG. 17 is a flowchart illustrating a control method of a first electronic device according to various embodiments of the present disclosure.
  • FIG. 1 is a diagram illustrating a network system according to various embodiments of the present disclosure.
  • the network system 1000 may include the first electronic device 100, the second electronic device 200, and at least one third electronic device 300-1, 300-2, and 300-3.
  • the first electronic device 100, the second electronic device 200, and the at least one third electronic device 300-1, 300-2, and 300-3 may be connected to each other through a network.
  • the first electronic device 100, the second electronic device 200, and the third electronic device 300-1, 300-2, and 300-3 may be connected through a wired or wireless network.
  • the network may be, for example, a home network.
  • the first electronic device 100 may be a portable electronic device.
  • the first electronic device 100 may include a smartphone or a tablet PC.
  • the first electronic device 100 may transmit content (e.g., video, audio, etc.) to an external electronic device (e.g., the third electronic devices 300-1, 300-2, and 300-3) through the network.
  • the first electronic device 100 may control an external electronic device (for example, third electronic devices 300-1, 300-2, and 300-3) through a network.
  • the first electronic device 100 may provide a user interface to a user through an application and control the external electronic device according to a user input received through the user interface.
  • the second electronic device 200 may be a network device.
  • the second electronic device 200 may be a device, such as an access point (AP) or a router, that connects different networks or connects a plurality of electronic devices (e.g., the first electronic device 100 and the third electronic devices 300-1, 300-2, and 300-3) to each other through the network.
  • the third electronic devices 300-1, 300-2, and 300-3 may be content output devices.
  • the third electronic devices 300-1, 300-2, and 300-3 may be devices including audio or a display, such as a TV or a speaker.
  • the third electronic devices 300-1, 300-2, and 300-3 may output content received from the first electronic device 100 under the control of the first electronic device 100.
  • FIG. 2 is a block diagram illustrating a configuration of a first electronic device according to various embodiments of the present disclosure.
  • the first electronic device 100 may include a display 110, an input module 120, a communication module 130, a memory 140, and a processor 150.
  • the display 110 may display an execution screen of an application.
  • the display 110 may display a first user interface provided by the first application when the first application is executed.
  • the display 110 may display a second user interface provided by the second application when the second application is executed.
  • the input module 120 may receive a user input.
  • the input module 120 may include a touch sensor panel sensing a user's touch manipulation or a pen sensor panel sensing the user's pen manipulation.
  • the input module 120 may include a voice recognition sensor that recognizes a user's voice or a motion recognition sensor that detects a user's gesture.
  • the display 110 and the input module 120 may be implemented as a touch screen in which an input panel is disposed on the display panel to simultaneously perform display and touch manipulation sensing.
  • the communication module 130 may communicate with an external electronic device (eg, third electronic devices 300-1, 300-2, and 300-3). According to an embodiment of the present disclosure, the communication module 130 may transmit a control signal for controlling the external electronic device to the external electronic device. According to an embodiment of the present disclosure, the communication module 130 may include a cellular module, a wireless-fidelity (Wi-Fi) module, or a Bluetooth module.
  • the memory 140 may store an application and a user interface.
  • the memory 140 may store a first application that controls an external electronic device for outputting content provided by the second application and at least one second application that provides the content.
  • the first application may be an application for selecting at least one second application, content, or an external electronic device to output the content.
  • the second application may be an application that provides audio or video content such as music, radio, movie, drama, and the like.
  • the second application may be an application that receives audio or video content from an external server (eg, a content providing server) and plays the received content.
  • the second application may be an application that manages content stored in an internal storage device (eg, the memory 140) or an external storage device (eg, a cloud server).
  • the memory 140 may store a first user interface provided by the first application and a second user interface provided by the second application.
  • the processor 150 may control overall operations of the first electronic device.
  • the processor 150 may control the display 110, the input module 120, the communication module 130, and the memory 140, respectively, to display an execution screen (or a user interface) of an application according to various embodiments of the present disclosure and to provide various functions to the user through an object included in the execution screen of the application.
  • the first electronic device 100 may include at least one processor 150.
  • the processor 150 may be implemented as a system on chip (SoC) including a central processing unit (CPU), a graphic processing unit (GPU), or a memory.
  • FIG. 3 is a diagram illustrating a method of executing a second application according to various embodiments of the present disclosure.
  • the processor 150 may display an execution screen of the first application on the display 110. For example, when a user input for the first application icon is received, the processor 150 may execute the first application and display an execution screen of the first application on the display 110. The processor 150 may execute the second application when a user input for the second application icons 11 and 12 included in the execution screen of the first application is received.
  • the processor 150 may execute the second application.
  • the processor 150 may display an execution screen of the second application on the display 110.
  • the processor 150 may change the execution screen of the first application displayed on the display 110 to the execution screen of the second application.
  • the processor 150 may display the first object 15 provided by the first application on the execution screen of the second application.
  • the first object 15 may be, for example, a floating user interface (UI) for providing a plurality of functions provided by the first application.
  • the plurality of functions may include a first function of changing an application execution screen, a second function of controlling at least one external electronic device for outputting content provided by the second application, a third function of recognizing a voice, and a fourth function of changing an environment setting value of the electronic device.
  • the second function of controlling the at least one external electronic device may include at least one of a function of changing a name of at least one external electronic device, a function of grouping a plurality of external electronic devices, a function of releasing the grouping, and a function of changing the external electronic devices included in a group.
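  • As an illustration of how one floating object might multiplex these functions, the following Kotlin sketch models the four described functions as an enum and dispatches a tap on the object to the currently selected function. This is a minimal sketch under stated assumptions, not the patented implementation; all identifiers (FloatingObjectFunction, FloatingObjectController, and the printed actions) are hypothetical.

```kotlin
// Hypothetical sketch: the four functions the first application exposes
// through its floating object, modeled as plain Kotlin. Names are illustrative.
enum class FloatingObjectFunction {
    SWITCH_APP_SCREEN,       // first function: change the application execution screen
    CONTROL_EXTERNAL_DEVICE, // second function: control external output devices
    VOICE_RECOGNITION,       // third function: activate speech recognition
    CHANGE_SETTINGS          // fourth function: change environment setting values
}

class FloatingObjectController(
    var selectedFunction: FloatingObjectFunction = FloatingObjectFunction.SWITCH_APP_SCREEN
) {
    // A tap on the floating object triggers whichever function is currently selected.
    fun onTap() = when (selectedFunction) {
        FloatingObjectFunction.SWITCH_APP_SCREEN ->
            println("Show list of provider apps / return to first app")
        FloatingObjectFunction.CONTROL_EXTERNAL_DEVICE ->
            println("Show external device list and grouping menu")
        FloatingObjectFunction.VOICE_RECOGNITION ->
            println("Activate voice recognition")
        FloatingObjectFunction.CHANGE_SETTINGS ->
            println("Show alarm / audio output settings")
    }
}

fun main() {
    val controller = FloatingObjectController()
    controller.onTap()                                                  // first function
    controller.selectedFunction = FloatingObjectFunction.VOICE_RECOGNITION
    controller.onTap()                                                  // third function
}
```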
  • FIG. 4 is a diagram illustrating a method of changing a first object according to various embodiments of the present disclosure.
  • the first object provided by the first application may include a plurality of first objects corresponding to a plurality of functions provided by the first application.
  • the processor 150 may select one of a plurality of functions provided by the first application. For example, the processor 150 may select a designated function as a default or a function selected last by a user.
  • the processor 150 may display an object corresponding to a selected function among a plurality of first objects on the display 110. For example, referring to the image of FIG. 4, the processor 150 may display the first object 15 corresponding to the first function.
  • for example, when a long tap input for the first object 15 is received, the processor 150 may additionally display the plurality of first objects 16, 17, and 18 corresponding to the plurality of functions provided by the first application.
  • for example, the processor 150 may additionally display, on the display 110, the first object 16 corresponding to the second function, the first object 17 corresponding to the third function, and the first object 18 corresponding to the fourth function.
  • the processor 150 may receive a user input for selecting one of the plurality of first objects 15, 16, 17, and 18.
  • for example, the long tap input for the first object 15 may be moved onto the first object 16 corresponding to the second function and then released (drag and drop).
  • for another example, the processor 150 may receive a tap input for the first object 16 corresponding to the second function after the long tap input for the first object 15 corresponding to the first function is terminated.
  • the processor 150 may select one of the plurality of first objects 15, 16, 17, and 18.
  • the first object selected by the user may be displayed on the display 110, and the other first object not selected may be deleted from the display 110.
  • for example, when the first object 16 corresponding to the second function is selected by the user, the processor 150 may display the selected first object 16 and delete the unselected first objects 15, 17, and 18 from the display screen.
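  • The expand-and-select behaviour described for FIG. 4 can be pictured with the small Kotlin state sketch below: a long tap expands the single floating object into one object per function, and dropping on one of them keeps only that object on screen. The class and function names (FloatingObjectState, onDropOn, and so on) are hypothetical, not taken from the patent.

```kotlin
// Hypothetical sketch of the expand/select behaviour described for FIG. 4.
enum class ObjectFunction { SWITCH_APP_SCREEN, CONTROL_EXTERNAL_DEVICE, VOICE_RECOGNITION, CHANGE_SETTINGS }

class FloatingObjectState(
    private var selected: ObjectFunction = ObjectFunction.SWITCH_APP_SCREEN,
    private var expanded: Boolean = false
) {
    // Objects currently shown: only the selected one, or all of them while expanded.
    fun visibleObjects(): List<ObjectFunction> =
        if (expanded) ObjectFunction.values().toList() else listOf(selected)

    fun onLongTap() { expanded = true }

    // Drag-and-drop (or tap) onto one of the expanded objects selects it;
    // the unselected objects disappear from the screen again.
    fun onDropOn(target: ObjectFunction) {
        if (expanded) { selected = target; expanded = false }
    }
}

fun main() {
    val state = FloatingObjectState()
    println(state.visibleObjects())           // [SWITCH_APP_SCREEN]
    state.onLongTap()
    println(state.visibleObjects())           // all four objects
    state.onDropOn(ObjectFunction.CONTROL_EXTERNAL_DEVICE)
    println(state.visibleObjects())           // [CONTROL_EXTERNAL_DEVICE]
}
```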
  • FIG. 5 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • the processor 150 may provide a first function of changing an execution screen of an application through a first object provided by the first application. For example, referring to the image of FIG. 5, the processor 150 may display the first object 21 corresponding to the first function on the execution screen of the second application. Referring to the image, when a user input (for example, a tap input) for the first object 21 is received, the processor 150 may display the second object 22 provided by the first application on the display 110. The second object 22 may include a menu 23 for displaying an execution screen of the first application. Referring to the image, when a user input for the menu 23 is received, the processor 150 may change the execution screen of the second application displayed on the display 110 to the execution screen of the first application. When the execution screen of the first application is displayed, the processor 150 may delete the first object provided by the first application from the display screen.
  • FIG. 6 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • the processor 150 may provide a function of changing an application execution screen through the first object provided by the first application. For example, referring to the image <601> of FIG. 6, the processor 150 may display the first object 25 on the execution screen of the second application. Referring to the image, the processor 150 may display the second object 26 on the display 110 when a user input (for example, a tap input) for the first object 25 is received.
  • the second object 26 may include a plurality of icons corresponding to a plurality of second applications for providing content.
  • the plurality of second applications may be, for example, applications that provide content in association with the first application.
  • when a user input for one of the plurality of icons is received, the processor 150 may change the execution screen of the second application displayed on the display 110 to the execution screen of another second application corresponding to the selected icon.
  • the processor 150 may maintain the first object 25 provided by the first application without deleting it from the display screen even if the execution screen of the application is changed.
  • the user may conveniently change the application execution screen using the first object.
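  • The screen-switching behaviour of FIGS. 5 and 6 (returning to the first application removes the floating object, while switching between provider applications keeps it) could be modeled as in the Kotlin sketch below. ScreenManager and the app names are illustrative assumptions, not part of the disclosure.

```kotlin
// Hypothetical sketch of the screen-switching behaviour of FIGS. 5 and 6.
class ScreenManager(private val providerApps: List<String>) {
    var currentScreen: String = "FirstApp"
        private set
    var floatingObjectVisible: Boolean = false
        private set

    fun openProviderApp(app: String) {
        require(app in providerApps) { "unknown provider app: $app" }
        currentScreen = app
        floatingObjectVisible = true          // shown on every second-app screen
    }

    // Switching between provider apps keeps the floating object on screen.
    fun switchProviderApp(app: String) = openProviderApp(app)

    // Returning to the first application removes the floating object.
    fun returnToFirstApp() {
        currentScreen = "FirstApp"
        floatingObjectVisible = false
    }
}

fun main() {
    val screens = ScreenManager(listOf("MusicApp", "RadioApp"))
    screens.openProviderApp("MusicApp")
    screens.switchProviderApp("RadioApp")
    println("${screens.currentScreen}, floating object: ${screens.floatingObjectVisible}") // RadioApp, true
    screens.returnToFirstApp()
    println("${screens.currentScreen}, floating object: ${screens.floatingObjectVisible}") // FirstApp, false
}
```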
  • FIG. 7A is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • the processor 150 may provide a function of controlling the external electronic device through the first object provided by the first application.
  • the external electronic device controlled through the first object may be an electronic device registered in the first application.
  • the processor 150 may display the first object 31 on the execution screen of the second application.
  • the processor 150 may display the second object 32 provided by the first application on the display 110.
  • the second object 32 may include an external electronic device list 33 and a grouping menu 34 that provides a grouping function of the external electronic device.
  • when a user input for the grouping menu 34 is received, the processor 150 may display, in the second object 32, the selection menu 35 for selecting external electronic devices to be grouped.
  • the user may select an external electronic device to be grouped among a plurality of external electronic devices by using the selection menu 35. For example, a user may sequentially select a main device (or a master device) and a sub device (or a slave device) among a plurality of electronic devices to be grouped.
  • the processor 150 may display a list of the external electronic devices included in the second object 32 for each group. For example, the processor 150 may distinguish and display the external electronic device list 36 included in the first group and the external electronic device list 37 included in the second group.
  • FIG. 7B is a diagram illustrating a method of outputting content through grouped external electronic devices according to various embodiments of the present disclosure.
  • the execution screen of the second application may include a content reproduction menu 38.
  • when a user input for the content reproduction menu 38 is received, the processor 150 may display, on the display 110, a list 39 of external electronic devices capable of playing the content. If there is a grouped external electronic device, the processor 150 may display the external electronic device list for each group. According to an embodiment of the present disclosure, when a user input for selecting one of the external electronic device lists 39 is received, the processor 150 may transmit content to the selected external electronic device (or the selected group).
  • FIG. 7C is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • the processor 150 may provide a function of controlling the external electronic device through the first object provided by the first application.
  • the external electronic device controlled through the first object may be an electronic device registered in the first application.
  • the processor 150 may display the first object 41 provided by the first application on the execution screen of the second application.
  • the processor 150 may display the second object 42 on the display 110.
  • the second object 42 may include external electronic device lists 43 and 44. If the grouped external electronic devices exist, the external electronic device lists 43 and 44 may be displayed for each group.
  • in the second object 42, the ungrouping menu 45 for releasing the grouping, the group editing menu 46 providing a function of changing the external electronic devices included in a group, and the name changing menu 47 for changing the name of an external electronic device included in the group may be displayed.
  • the ungrouping menu 45 may provide a function of ungrouping the external electronic device.
  • the group edit menu 46 may provide a function of adding a new external electronic device to the group or excluding some of the external electronic devices included in the group.
  • the name change menu 47 may provide a function of changing a name of the external electronic device.
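  • Taken together, FIGS. 7A to 7C describe grouping a main (master) device with sub (slave) devices, ungrouping, editing group membership, and renaming a device. The following Kotlin sketch is one hypothetical way to model those operations; DeviceRegistry, DeviceGroup, and the sample speaker names are illustrative only.

```kotlin
// Hypothetical sketch of the device grouping operations of FIGS. 7A-7C.
data class Device(val id: String, var name: String)

data class DeviceGroup(val main: Device, val subs: MutableList<Device> = mutableListOf())

class DeviceRegistry {
    val ungrouped = mutableListOf<Device>()
    val groups = mutableListOf<DeviceGroup>()

    fun register(device: Device) { ungrouped += device }

    // Grouping: the user first picks the main device, then the sub devices.
    fun group(main: Device, subs: List<Device>): DeviceGroup {
        ungrouped.removeAll(subs + main)
        return DeviceGroup(main, subs.toMutableList()).also { groups += it }
    }

    // Ungrouping menu: dissolve the group and list its members individually again.
    fun ungroup(group: DeviceGroup) {
        groups -= group
        ungrouped += group.main
        ungrouped += group.subs
    }

    // Group edit menu: add a new device to the group or exclude an existing one.
    fun addToGroup(group: DeviceGroup, device: Device) { ungrouped -= device; group.subs += device }
    fun removeFromGroup(group: DeviceGroup, device: Device) { group.subs -= device; ungrouped += device }

    // Name change menu.
    fun rename(device: Device, newName: String) { device.name = newName }
}

fun main() {
    val registry = DeviceRegistry()
    val living = Device("1", "Living room speaker")
    val kitchen = Device("2", "Kitchen speaker")
    val bedroom = Device("3", "Bedroom speaker")
    listOf(living, kitchen, bedroom).forEach(registry::register)

    val group = registry.group(main = living, subs = listOf(kitchen))
    registry.rename(bedroom, "Bedroom TV speaker")
    registry.removeFromGroup(group, kitchen)
    registry.ungroup(group)
    println(registry.ungrouped.map { it.name })
}
```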
  • FIG. 8 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • the processor 150 may provide a speech recognition function through the first object provided by the first application. For example, referring to the image of FIG. 8, the processor 150 may display the first object 51 provided by the first application on an execution screen of the second application. Referring to the image, the processor 150 may activate a voice recognition function when a user input (for example, a tap input) for the first object 51 is received. When a user input (for example, a tap input) for the first object 51 is received, the processor 150 may display the second object 52 on the display 110.
  • the second object 52 may include information related to speech recognition.
  • FIG. 9 is a diagram illustrating a function provided through a first object according to various embodiments of the present disclosure.
  • the processor 150 may provide a function of changing an environment setting value of the electronic device through the first object provided by the first application. For example, referring to the image of FIG. 9, the processor 150 may display the first object 53 provided by the first application on an execution screen of the second application. Referring to the image, when a user input (for example, a tap input) for the first object 53 is received, the processor 150 may display the second object 54 on the display 110.
  • the second object 54 may include a menu for changing an environment setting value of the electronic device 100.
  • the second object 54 may include a menu for providing a function of setting an alarm function and setting an audio output value.
  • FIG. 10 is a diagram illustrating a method of deleting a first object according to various embodiments of the present disclosure.
  • the processor 150 may display the first object 55 provided by the first application on an execution screen of the second application.
  • when a specified user input (e.g., a long tap) for the first object 55 is received, the processor 150 may additionally display a third object 56 for deleting the first object 55. According to an embodiment of the present disclosure, the processor 150 may receive a user input for selecting the third object 56. For example, the long tap input for the first object 55 may be moved onto the third object 56 and then released (drag and drop). For another example, the processor 150 may receive a tap input for the third object 56 after the long tap input for the first object 55 is terminated.
  • when the third object 56 is selected, the processor 150 may delete the first object 55 from the display screen.
  • FIG. 11 is a diagram illustrating a function of selecting an external electronic device to output content according to various embodiments of the present disclosure.
  • the processor 150 may be connected with at least one external electronic device registered in the first application through a network while the second application is running. For example, a situation in which a user arrives home while listening to music through a speaker or earphone included in the electronic device 100 using a second application may occur. When the user arrives at home and the electronic device 100 is connected to the home network, the electronic device 100 may be connected to at least one speaker registered in the first application through the home network.
  • when the first electronic device 100 is connected to at least one external electronic device registered in the first application through a network while the second application is running, the processor 150 may display, on the display 110, the second object 57 for selecting at least one external electronic device to output content.
  • the second object 57 may include a list of external electronic devices connected via a network and a selection menu 58 for selecting an external electronic device to output contents. If there is a grouped external electronic device, the external electronic device list may be displayed for each group.
  • the processor 150 may transmit content provided by the second application to the selected external electronic device (or group).
  • FIG. 12 illustrates a function of providing a notification according to various embodiments of the present disclosure.
  • the processor 150 may be connected to an external electronic device that is not registered in the first application through a network while the second application is running. For example, when the user purchases a new speaker and connects to the home network, the electronic device 100 may be connected to the new speaker through the home network.
  • the processor 150 may display the second object 59 notifying that the unregistered external electronic device can be registered in the first application.
  • the processor 150 may change the execution screen of the second application to an execution screen of the first application for registering the external electronic device.
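  • The behaviour of FIGS. 11 and 12 can be summarised as an event handler: while a second application is running, a device joining the network is either offered as an output target (if already registered in the first application) or surfaced with a registration prompt. The Kotlin sketch below is a hypothetical illustration; ConnectionHandler, OverlayPrompt, and the device names are assumptions, not the disclosed implementation.

```kotlin
// Hypothetical sketch of the behaviour described for FIGS. 11 and 12.
data class NetworkDevice(val id: String, val registered: Boolean)

sealed class OverlayPrompt {
    data class SelectOutputDevice(val candidates: List<NetworkDevice>) : OverlayPrompt()
    data class OfferRegistration(val device: NetworkDevice) : OverlayPrompt()
}

class ConnectionHandler(private val secondAppRunning: () -> Boolean) {
    // Called when the electronic device connects to external devices over the network.
    fun onDevicesConnected(devices: List<NetworkDevice>): List<OverlayPrompt> {
        if (!secondAppRunning()) return emptyList()
        val prompts = mutableListOf<OverlayPrompt>()
        val registered = devices.filter { it.registered }
        if (registered.isNotEmpty()) prompts += OverlayPrompt.SelectOutputDevice(registered)
        devices.filterNot { it.registered }
            .forEach { prompts += OverlayPrompt.OfferRegistration(it) }
        return prompts
    }
}

fun main() {
    val handler = ConnectionHandler(secondAppRunning = { true })
    val prompts = handler.onDevicesConnected(
        listOf(NetworkDevice("speaker-1", registered = true),
               NetworkDevice("speaker-2", registered = false))
    )
    prompts.forEach(::println)
}
```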
  • FIG. 13 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • the processor 150 may display an execution screen of the first application on the display 110.
  • the execution screen of the first application may include a first area 61 and a second area 62.
  • the first area 61 may include, for example, a second application list.
  • the second area 62 may be, for example, a browsing area for selecting a function or content included in the second application.
  • the second area 62 may include a menu related to an application icon selected by the user among application icons included in the first area 61. For example, when the second icon 63 is selected by the user, the processor 150 may display a menu related to the second icon 63 in the second area 62.
  • when a menu included in the second area 62 is selected, the processor 150 may display a submenu related to the selected menu in the second area 62.
  • the processor 150 may display a submenu of the selected menu 64 in the second area 62.
  • the processor 150 may display a submenu of the selected submenu. For example, when a user input for the first submenu 65 is received, the processor 150 may display the submenu of the first submenu 65 in the second area 62. According to an embodiment of the present disclosure, the processor 150 may receive a user input for a second application list included in the first area while browsing through the second area. For example, the processor 150 may receive a user input for the third icon 66 of the application icons included in the second application list.
  • the processor 150 may display a menu related to the third icon 66 in the second area 62. For example, the processor 150 may change the browsing screen associated with the second icon 63 displayed in the second area 62 to the browsing screen associated with the third icon 66.
  • when the second icon 63 is selected again, the processor 150 may display a menu related to the second icon 63 in the second area 62 as shown in the image. Whenever an application icon included in the first area 61 is selected, the processor 150 may display a basic menu (or a top menu) of the selected application icon in the second area 62.
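  • The two-area browsing of FIG. 13 (a first area holding the second application list and a second area holding a menu/submenu stack that resets to the top-level menu whenever another application icon is selected) could be modeled as in the Kotlin sketch below. BrowsingScreen and the sample menu contents are hypothetical.

```kotlin
// Hypothetical sketch of the two-area browsing described for FIG. 13.
class BrowsingScreen(private val topMenus: Map<String, List<String>>) {
    var selectedApp: String? = null
        private set
    private val menuStack = ArrayDeque<List<String>>()

    // Selecting an app icon in the first area shows that app's basic (top) menu.
    fun selectApp(app: String) {
        selectedApp = app
        menuStack.clear()
        menuStack.addLast(topMenus[app] ?: emptyList())
    }

    // Selecting a menu entry in the second area pushes its submenu.
    fun openSubmenu(submenu: List<String>) = menuStack.addLast(submenu)

    fun currentMenu(): List<String> = menuStack.lastOrNull() ?: emptyList()
}

fun main() {
    val screen = BrowsingScreen(
        mapOf("MusicApp" to listOf("Charts", "Genres"), "RadioApp" to listOf("Stations"))
    )
    screen.selectApp("MusicApp")
    screen.openSubmenu(listOf("K-Pop", "Jazz"))
    println(screen.currentMenu())     // [K-Pop, Jazz]
    screen.selectApp("RadioApp")      // browsing area resets to RadioApp's top menu
    println(screen.currentMenu())     // [Stations]
}
```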
  • FIG. 14 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • the processor 150 may display the selected application icon (or the application icon being browsed) at the center of the first area. For example, referring to the image <1401> of FIG. 14, when the third icon 72 of the plurality of icons included in the first area 71 is selected, the third icon 72 may be displayed at the center 73 of the first area 71. Referring to the image, when the fourth icon 74 of the plurality of icons included in the first area 71 is selected, the positions of the plurality of icons may be changed so that the fourth icon 74 is displayed at the center 73 of the first area 71. The processor 150 may change only the positions where the icons are displayed without changing the arrangement order of the plurality of icons.
  • the processor 150 may arrange the icons included in the second application list in a specified order. According to an embodiment of the present disclosure, the processor 150 may arrange the icons according to the usage history of the applications. For example, the processor 150 may arrange the icons in the most recently used order. In another example, the processor 150 may sort the icons according to a designated sorting order (e.g., an order specified by a user). In another example, the processor 150 may arrange the icons according to a designated sorting order, but change only the position of the most recently used icon (e.g., place it first).
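  • The icon ordering policies described above (most recently used order, a designated order, or the designated order with only the most recently used icon promoted) can be expressed as small sorting functions. The Kotlin sketch below is an illustrative interpretation; AppIcon and its fields are assumed, not taken from the patent.

```kotlin
// Hypothetical sketch of the icon ordering policies described for FIG. 14.
data class AppIcon(val name: String, val designatedRank: Int, val lastUsedAt: Long)

fun sortByRecentUse(icons: List<AppIcon>): List<AppIcon> =
    icons.sortedByDescending { it.lastUsedAt }

fun sortByDesignatedOrder(icons: List<AppIcon>): List<AppIcon> =
    icons.sortedBy { it.designatedRank }

// Designated order, but the single most recently used icon is moved to the front.
fun sortDesignatedWithRecentFirst(icons: List<AppIcon>): List<AppIcon> {
    val mostRecent = icons.maxByOrNull { it.lastUsedAt } ?: return emptyList()
    return listOf(mostRecent) + sortByDesignatedOrder(icons - mostRecent)
}

fun main() {
    val icons = listOf(
        AppIcon("MusicApp", designatedRank = 1, lastUsedAt = 100),
        AppIcon("RadioApp", designatedRank = 2, lastUsedAt = 300),
        AppIcon("VideoApp", designatedRank = 3, lastUsedAt = 200)
    )
    println(sortByRecentUse(icons).map { it.name })               // [RadioApp, VideoApp, MusicApp]
    println(sortDesignatedWithRecentFirst(icons).map { it.name }) // [RadioApp, MusicApp, VideoApp]
}
```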
  • FIG. 15 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • the processor 150 may change the sort order of the second application list included in the first area 75 according to a user input. For example, referring to the images <1501> to <1503> of FIG. 15, after a specified user input (e.g., a long tap) for the sixth icon 76 of the plurality of icons is received, when the sixth icon 76 is dragged and dropped between the third icon 77 and the fourth icon 78, the position (or sorting order) of the sixth icon 76 may be changed to between the third icon 77 and the fourth icon 78.
  • FIG. 16 is a diagram illustrating an execution screen of a first application according to various embodiments of the present disclosure.
  • the processor 150 may provide a search function through an execution screen of the first application.
  • the processor 150 may provide a content search function by a plurality of second applications through an execution screen of the first application.
  • the processor 150 may provide search results by a plurality of second applications when a search word is input by a user. For example, when a search word such as a song, a singer, or an album is input, the processor 150 may provide search results of each of the plurality of second applications on one screen.
  • the processor 150 may determine an order of providing search results according to at least one of a sort order of the second application list, a usage history of the second application, and an execution state of the second application (e.g., whether an account is logged in). For example, referring to <1601> of FIG. 16, the processor 150 may arrange and display a plurality of second applications App1, App2, App3, and App4 in the first area 81 of the first application execution screen. Referring to the image, the processor 150 may display the results searched through the respective second applications in order, from top to bottom, according to the sort order of the second application list included in the first area. In another example, the processor 150 may display the search results of the most recently used (or currently used) second application at the top, and display the search results of the remaining second applications according to the sort order of the second application list included in the first area.
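  • One hypothetical way to implement this cross-application search ordering is sketched below in Kotlin: each second application contributes a result section, and the sections are ordered by the list's sort order, optionally promoting the most recently used application to the top. ProviderApp, SearchSection, and the sample data are illustrative; the logged-in state is included only as a field to show that it could serve as a further criterion.

```kotlin
// Hypothetical sketch of the cross-application search ordering of FIG. 16.
data class ProviderApp(val name: String, val sortIndex: Int, val lastUsedAt: Long, val loggedIn: Boolean)

data class SearchSection(val app: ProviderApp, val results: List<String>)

fun orderSections(sections: List<SearchSection>, promoteMostRecent: Boolean): List<SearchSection> {
    val bySortOrder = sections.sortedBy { it.app.sortIndex }
    if (!promoteMostRecent) return bySortOrder
    // Promote the most recently used application's results to the top,
    // keeping the remaining sections in the list's sort order.
    val recent = bySortOrder.maxByOrNull { it.app.lastUsedAt } ?: return bySortOrder
    return listOf(recent) + (bySortOrder - recent)
}

fun main() {
    val app1 = ProviderApp("App1", sortIndex = 0, lastUsedAt = 10, loggedIn = true)
    val app2 = ProviderApp("App2", sortIndex = 1, lastUsedAt = 50, loggedIn = false)
    val sections = listOf(
        SearchSection(app1, listOf("Song A")),
        SearchSection(app2, listOf("Song A (live)"))
    )
    println(orderSections(sections, promoteMostRecent = false).map { it.app.name }) // [App1, App2]
    println(orderSections(sections, promoteMostRecent = true).map { it.app.name })  // [App2, App1]
}
```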
  • FIG. 17 is a flowchart illustrating a control method of a first electronic device according to various embodiments of the present disclosure.
  • the flowchart illustrated in FIG. 17 may include operations processed by the first electronic device 100 described above. Therefore, although omitted below, contents described with respect to the first electronic device 100 with reference to FIGS. 1 through 16 may also be applied to the flowchart illustrated in FIG. 17.
  • the first electronic device 100 may display an execution screen of the first application on the display in operation 1710. For example, when the user input for the first application icon is received, the first electronic device 100 may execute the first application and display an execution screen of the first application on the display.
  • the first application may be, for example, an application that controls at least one external electronic device for outputting content provided by the second application.
  • the first electronic device 100 may change the execution screen of the first application to the execution screen of the second application.
  • the first electronic device 100 may execute the second application when a user input for the second application icon included in the execution screen of the first application is received.
  • when the second application is executed, the first electronic device 100 may change the execution screen of the first application to the execution screen of the second application.
  • the second application may be an application that provides audio or video content such as music, radio, movie, drama, and the like.
  • the first electronic device 100 may display the first object for controlling a plurality of functions provided by the first application on the execution screen of the second application.
  • the first object may be, for example, a floating user interface (UI) for providing a plurality of functions provided by the first application.
  • the plurality of functions may include a first function of changing an application execution screen, a second function of controlling at least one external electronic device for outputting content provided by the second application, a third function of recognizing a voice, and a fourth function of changing an environment setting value of the electronic device.
  • the second function of controlling the at least one external electronic device may include at least one of a function of changing a name of at least one external electronic device, a function of grouping a plurality of external electronic devices, a function of releasing the grouping, and a function of changing the external electronic devices included in a group.
  • the first electronic device 100 may select one of a plurality of functions provided by the first application. According to an embodiment of the present disclosure, the first electronic device 100 may display an object corresponding to a selected function among the plurality of first objects on the display 110. According to an embodiment of the present disclosure, when a user input for changing a selected function is received, the first electronic device 100 may display a first object corresponding to the changed function among the plurality of first objects.
  • the first electronic device 100 may determine whether a specified event related to the first object has occurred. For example, the first electronic device may check whether a user input for the first object has been received. For another example, the first electronic device may check whether it is connected, through a network, to at least one external electronic device registered or not registered in the first application while the second application is running.
  • the first electronic device 100 may display a second object related to the event on the display in operation 1750. For example, when a user input for the first object is received, the first electronic device 100 may display a second object corresponding to the first object (or the currently selected function). As another example, when the first electronic device 100 is connected to an external electronic device registered in the first application through a network while the second application is executed, the first electronic device 100 may display a second object for selecting an external electronic device to output the content.
  • when the first electronic device 100 is connected, through a network, to an external electronic device that is not registered in the first application while the second application is executed, the first electronic device 100 may display a second object indicating that the unregistered external electronic device can be registered in the first application.
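  • The overall flow of FIG. 17 can be sketched as straight-line pseudocode in Kotlin, shown below. Operation numbers 1710 and 1750 come from the text above; the event names and the FirstDeviceController class are hypothetical placeholders rather than the claimed method.

```kotlin
// Hypothetical sketch of the control flow of FIG. 17.
sealed class Event {
    object FirstObjectTapped : Event()
    data class RegisteredDeviceConnected(val deviceName: String) : Event()
    data class UnregisteredDeviceConnected(val deviceName: String) : Event()
}

class FirstDeviceController {
    fun execute(events: List<Event>) {
        println("1710: display execution screen of the first application")
        println("change to execution screen of the second application")
        println("display first object on the second application's screen")
        for (event in events) {
            // 1750: display a second object related to the detected event.
            val secondObject = when (event) {
                is Event.FirstObjectTapped ->
                    "second object for the currently selected function"
                is Event.RegisteredDeviceConnected ->
                    "second object to select ${event.deviceName} as output device"
                is Event.UnregisteredDeviceConnected ->
                    "second object offering to register ${event.deviceName}"
            }
            println("1750: display $secondObject")
        }
    }
}

fun main() {
    FirstDeviceController().execute(
        listOf(Event.FirstObjectTapped, Event.UnregisteredDeviceConnected("New speaker"))
    )
}
```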
  • At least a portion of an apparatus (eg, modules or functions thereof) or a method (eg, operations) according to various embodiments may be implemented by instructions stored in a computer-readable storage medium in the form of a program module.
  • when the instructions are executed by a processor, the processor may perform a function corresponding to the instructions.
  • Computer-readable recording media include hard disks, floppy disks, magnetic media (e.g., magnetic tape), optical recording media (e.g., CD-ROM, DVD), magneto-optical media (e.g., floptical disks), internal memory, and the like.
  • Instructions may include code generated by a compiler or code that may be executed by an interpreter.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to various embodiments of the present invention, an electronic device may comprise a display and a processor. The processor may display, on the display, an execution screen of a first application, change the execution screen of the first application to an execution screen of a second application, and display, on the execution screen of the second application, a first object provided by the first application.
PCT/KR2017/004027 2016-05-03 2017-04-13 Electronic device and control method of electronic device WO2017191911A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/098,982 US20190196683A1 (en) 2016-05-03 2017-04-13 Electronic device and control method of electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201662331058P 2016-05-03 2016-05-03
US62/331,058 2016-05-03
KR10-2017-0011928 2017-01-25
KR1020170011928A KR20170124954A (ko) 2016-05-03 2017-01-25 Electronic device and control method of electronic device

Publications (1)

Publication Number Publication Date
WO2017191911A1 (fr)

Family

ID=60203710

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/004027 WO2017191911A1 (fr) 2016-05-03 2017-04-13 Electronic device and control method of electronic device

Country Status (1)

Country Link
WO (1) WO2017191911A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110085189A (ko) * 2010-01-19 2011-07-27 박철 Method of operating a personal portable terminal having a touch panel
KR20130133460A (ko) * 2012-05-29 2013-12-09 엘지전자 주식회사 Mobile terminal and control method thereof
KR20140049881A (ko) * 2012-10-18 2014-04-28 엘지전자 주식회사 Mobile terminal and control method thereof
KR20150141084A (ko) * 2014-06-09 2015-12-17 엘지전자 주식회사 Mobile terminal and control method thereof
US20160021338A1 (en) * 2014-07-17 2016-01-21 Htc Corporation Method for performing a video talk enhancement function and an electric device having the same


Similar Documents

Publication Publication Date Title
WO2016048024A1 (fr) Appareil d'affichage et procédé d'affichage correspondant
WO2012039587A1 (fr) Procédé et appareil pour modifier l'écran d'accueil dans un dispositif tactile
WO2012154006A2 (fr) Procédé et appareil de partage de données entre différents dispositifs de réseau
AU2011339167B2 (en) Method and system for displaying screens on the touch screen of a mobile device
WO2013168885A1 (fr) Procédé de fourniture d'écran de verrouillage et dispositif de terminal pour le mettre en œuvre
WO2014014204A1 (fr) Dispositif électronique comprenant de multiples cartes sim et procédé associé
WO2015005605A1 (fr) Utilisation à distance d'applications à l'aide de données reçues
WO2014142471A1 (fr) Procédé et système de commande multi-entrées, et dispositif électronique les prenant en charge
WO2013054994A1 (fr) Dispositif formant terminal d'utilisateur et procédé de partage de contenu correspondant
WO2012108620A2 (fr) Procédé de commande d'un terminal basé sur une pluralité d'entrées, et terminal portable prenant en charge ce procédé
WO2016080559A1 (fr) Dispositif d'affichage pliable susceptible de fixer un écran au moyen du pliage d'un dispositif d'affichage et procédé pour commander le dispositif d'affichage pliable
WO2012026785A2 (fr) Système et procédé de fourniture d'interface d'entrée de liste de contacts
WO2014119975A1 (fr) Procédé et système de partage d'une partie d'une page web
WO2014030929A1 (fr) Appareil de fourniture d'une interface utilisateur pour partager des contenus médias dans un réseau à domicile et support d'enregistrement permettant d'enregistrer des programmes
WO2014175660A1 (fr) Procédé de commande d'écran et son dispositif électronique
WO2015009066A1 (fr) Procédé de fonctionnement d'un service de conversation basé sur une application de messagerie, interface utilisateur et dispositif électronique employant ce procédé et cette interface
WO2013100469A1 (fr) Système et procédé de fourniture d'une interface utilisateur selon des informations de localisation
WO2014073935A1 (fr) Procédé et système de partage d'un dispositif de sortie entre des dispositifs multimédias à des fins d'émission et de réception de données
WO2013125789A1 (fr) Appareil électronique, procédé de commande de celui-ci, et support de stockage lisible par ordinateur
WO2014175603A1 (fr) Procede et serveur pour fournir des services d'utilisation de contenus musicaux
WO2015167072A1 (fr) Dispositif numérique fournissant un rejet tactile et procédé de commande pour celui-ci
WO2012033337A2 (fr) Appareil et procédé multimédia de communication de contenu
WO2013125785A1 (fr) Procédé d'exécution de tâche, système et support d'enregistrement lisible par ordinateur
WO2013133545A1 (fr) Système de recherche et son procédé de fonctionnement
WO2018056587A1 (fr) Appareil électronique et son procédé de commande

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17792814

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17792814

Country of ref document: EP

Kind code of ref document: A1