WO2013175770A1 - Information processing apparatus, information processing method, and information processing program - Google Patents
Information processing apparatus, information processing method, and information processing program
- Publication number
- WO2013175770A1 (PCT/JP2013/003232)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- application
- display
- information processing
- touch panel
- control unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0486—Drag-and-drop
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1641—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being formed by a plurality of foldable display components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1647—Details related to the display arrangement, including those related to the mounting of the display in the housing including at least an additional display
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/16—Details of telephonic subscriber devices including more than one display unit
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/22—Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- the present invention relates to an information processing apparatus, an information processing method, and an information processing program which are applied to a terminal provided with a touch panel.
- Patent Document 1 discloses a terminal in which the touch panel has a large screen (hereinafter referred to as a "large screen terminal").
- In such a terminal, two housings each having a touch panel can be opened and closed via a hinge. The user can thereby use the two touch panels as one large-screen touch panel by setting them in the spread state.
- When a large screen terminal is held and operated with one hand, the finger performing the operation (e.g., a thumb) does not reach the entire screen. That is, even if the user selects and drags an object from application A displayed within the thumb's reach, the user cannot drop it on application B displayed outside that reach. Therefore, the user cannot move an object from application A to application B.
- When a large screen terminal is operated with both hands, the finger performing the operation reaches the entire screen, but if the dragging distance is long, erroneous operation is likely to occur. That is, while the user is dragging an object selected from application A, the finger may unintentionally leave the touch panel before reaching application B, so the drop fails. Therefore, the user cannot move an object from application A to application B.
- In a terminal with two touch panels, the panels are discontinuous due to the presence of a part of the housing between them, and erroneous operation likewise occurs. That is, while the user is dragging an object selected from application A, the finger leaves the touch panel when it crosses the part of the housing, so the object cannot be dropped on application B. Therefore, the user cannot move an object from application A to application B.
- An object of the present invention is to enable a user to easily move an object between applications simultaneously displayed on a touch panel.
- An information processing apparatus according to the present invention includes a touch panel that simultaneously displays a first application and a second application, a detection unit that detects that an object of the first application, while being dragged, has been present in a defined area on the touch panel for a predetermined time, and a control unit that moves the display of the second application to a droppable position, triggered by the detection of that presence.
- An information processing method according to the present invention is performed by a terminal including a touch panel that simultaneously displays a first application and a second application, and includes detecting that an object of the first application, while being dragged, has been present in a defined area on the touch panel for a predetermined time, and moving the second application to a droppable position, triggered by the detection of that presence.
- An information processing program according to the present invention is executed by a computer of a terminal provided with a touch panel that simultaneously displays a first application and a second application, and causes the computer to perform a process of detecting that an object of the first application, while being dragged, has been present in a defined area on the touch panel for a predetermined time, and a process of moving the second application to a droppable position, triggered by the detection of that presence.
- According to the present invention, the user can easily move an object between applications displayed simultaneously on the touch panel.
- FIG. 1 A diagram showing an outline of a characteristic operation of the information processing apparatus according to the first embodiment of the present invention
- FIG. 4 A flowchart showing an example of the operation of the information processing apparatus according to Embodiment 1 of the present invention
- A diagram showing an outline
- Embodiment 1 Hereinafter, Embodiment 1 of the present invention will be described in detail with reference to the drawings.
- FIG. 1 is a view showing an example of the appearance of an information processing apparatus 100 according to the present embodiment.
- The information processing apparatus 100 may be applied to, for example, a smartphone or the like.
- the information processing apparatus 100 includes a housing 1 and a housing 2.
- the housing 1 and the housing 2 are each flat.
- the housing 1 and the housing 2 are connected via the hinge 3.
- the housing 1 has a touch panel 10
- the housing 2 has a touch panel 20.
- the user can perform an open / close operation of opening and closing the housings 1 and 2 around the hinge 3 in the information processing apparatus 100.
- the user can fold or spread the touch panels 10 and 20 according to the application.
- FIG. 1 shows an appearance in which the touch panels 10 and 20 are in a spread state. In this spread state, the user can use the two touch panels 10 and 20 as one large screen touch panel.
- the information processing apparatus 100 can simultaneously display two applications.
- the touch panel 10 displays the application 6, and the touch panel 20 displays the application 7.
- the application 6 is, for example, a mail application for creating, sending and receiving a mail.
- the application 7 is, for example, a viewer application for browsing image data. Further, the application 7 displays a plurality of thumbnails of image data.
- the thumbnail is an object that can be moved from the application 7 to the application 6 by dragging and dropping.
- the term "move" as used herein also includes the meaning of copying of image data.
- FIG. 2A and FIG. 2B are diagrams showing an example of transition of screens when the information processing apparatus 100 is in operation.
- the touch panel 20 has a predetermined area (hereinafter referred to as a “prescribed area”) 5 as a part thereof.
- the defined area 5 is located near the boundary between the touch panel 10 and the touch panel 20, in other words, near the boundary between the display of the application 6 and the display of the application 7.
- the object 4 mentioned here is, for example, a thumbnail.
- the user drags the selected object 4 in the direction of the touch panel 10 where the application 6 is displayed.
- When the object 4 reaches the defined area 5, the user waits while continuing to drag the object 4.
- The defined area 5 may be displayed on the touch panel 20 so that the user can visually recognize it, or need not be displayed if the user can infer its location.
- the defined area 5 may be in the touch panel 10.
- Triggered by the object 4 being dragged remaining in the defined area 5 for a predetermined time, the application 6 displayed on the touch panel 10 slides (translates) a predetermined distance toward the touch panel 20 and stops.
- the application 6 is displayed across the touch panels 10 and 20.
- the application 6 is displayed below the object 4 being dragged and is also displayed above the application 7.
- the user drops the object 4.
- the user completes the operation of moving the object 4 from the application 7 to the application 6.
- Hereinafter, moving the display of one of the two simultaneously displayed applications to the touch panel on which the other is displayed is referred to as "slide display". Conversely, moving the slide-displayed application back to its original position is referred to as "slide return display".
- In the information processing apparatus 100, when the user moves the object 4 from the application 7 to the application 6, the user only has to drag the object 4 to the defined area 5 of the touch panel 20 on which the application 7 is displayed. That is, the user does not have to drag it all the way to the touch panel 10 on which the application 6 is displayed. Therefore, the information processing apparatus 100 can solve the above-described problem that the operating finger does not reach the display of the destination application. Further, the information processing apparatus 100 can solve the problems that erroneous operation is likely to occur due to the length of the dragging distance and due to the discontinuity between the touch panels. As a result, in the information processing apparatus 100 according to the present embodiment, the user can easily move an object between applications displayed simultaneously on the touch panel.
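The characteristic operation described above (drag into the defined area, dwell for a predetermined time, then trigger slide display) can be sketched in Python. This is a minimal illustration only; the area coordinates, the dwell time, the assumption that the timer resets when the drag leaves the area, and all class and function names are illustrative, not taken from the embodiment.

```python
import time

# Illustrative defined area 5 as (x_min, y_min, x_max, y_max), assumed values.
DEFINED_AREA = (0, 400, 600, 480)
DWELL_TIME = 1.0  # seconds the drag must remain in the area (assumed)

def in_defined_area(x, y, area=DEFINED_AREA):
    x_min, y_min, x_max, y_max = area
    return x_min <= x <= x_max and y_min <= y <= y_max

class SlideTrigger:
    """Tracks how long a dragged object has dwelled in the defined area."""
    def __init__(self, dwell=DWELL_TIME, clock=time.monotonic):
        self.dwell = dwell
        self.clock = clock       # injectable clock for testing
        self.entered_at = None   # time the drag entered the area
        self.slid = False        # True once slide display is triggered

    def on_drag(self, x, y):
        if in_defined_area(x, y):
            if self.entered_at is None:
                self.entered_at = self.clock()          # timer ON
            elif not self.slid and self.clock() - self.entered_at >= self.dwell:
                self.slid = True                        # slide display triggered
        else:
            self.entered_at = None                      # timer reset on leaving
        return self.slid
```

A drag that enters the area, leaves, and re-enters starts the dwell count over, which matches the flowchart's return to the drag-monitoring step when the object is outside the area.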
- FIG. 3 is a block diagram showing an example of the configuration of the information processing apparatus 100.
- the information processing apparatus 100 includes a touch panel 10, a touch panel 20, a touch panel coordinate management unit 50, an application control unit 60, and a display control unit 70.
- the touch panels 10 and 20 are the same as those shown in FIGS. 1 and 2, respectively.
- the touch panel 10 includes an input detection unit 11 and an image display unit 12.
- The input detection unit 11 detects a touch on the touch panel 10. Thereby, the input detection unit 11 detects, for example, an operation of selecting the object 4 displayed by the application 7 as shown in FIGS. 1 and 2. Then, the input detection unit 11 outputs an X coordinate and a Y coordinate indicating the position of the selected object (hereinafter referred to as "position information") to the touch panel control unit 51 of the touch panel coordinate management unit 50.
- The user touches the touch panel 10 using a finger or a predetermined device (hereinafter referred to as a "finger or the like").
- the image display unit 12 displays a screen or the like of the application based on the information input from the split display control unit 74.
- Examples of the image display unit 12 include an LCD (Liquid Crystal Display) and an organic EL (Electro Luminescence) display.
- The touch panel 20 has an input detection unit 21 and an image display unit 22, which have the same functions as the input detection unit 11 and the image display unit 12 of the touch panel 10, respectively. Since the structure of the touch panel 20 is the same as that of the touch panel 10, its description is omitted.
- the touch panel coordinate management unit 50 includes a touch panel control unit 51, a drag & drop determination unit 52, a defined area detection unit 53, and a timer 54.
- The touch panel control unit 51 receives position information from each of the input detection units 11 and 21. Then, the touch panel control unit 51 outputs the received position information to the drag & drop determination unit 52, the defined area detection unit 53, and the multi-application control unit 61.
- the drag and drop determination unit 52 determines, based on the position information input from the touch panel control unit 51, whether the drag or the drop has been performed. Then, the drag and drop determination unit 52 outputs determination result information indicating the determination result to the multi-application control unit 61.
- the determination result information is information indicating that a drag has been performed or a drop has been performed.
- When a drag is performed, the drag & drop determination unit 52 turns on the drag valid flag. Then, the drag & drop determination unit 52 outputs the drag valid flag that has been turned on to the multi-application control unit 61 and the defined area detection unit 53.
- When a drop is performed, the drag & drop determination unit 52 turns off the drag valid flag. Then, the drag & drop determination unit 52 outputs the drag valid flag that has been turned off to the multi-application control unit 61 and the defined area detection unit 53.
- The defined area detection unit 53 receives, from the drag & drop determination unit 52, the drag valid flag that has been turned on. Then, the defined area detection unit 53 detects whether the object being dragged is present in the defined area 5, based on the position information input from the touch panel control unit 51 and the defined area information held in advance.
- the defined area information is information in which the range of the defined area 5 is defined by a plurality of X coordinates and Y coordinates.
- When it detects that the object is present in the defined area 5, the defined area detection unit 53 controls the timer 54 to be ON.
- The defined area detection unit 53 determines that the object is present in the defined area 5 even if only a part of the object is within the defined area 5.
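The partial-overlap rule in this paragraph reduces to a rectangle intersection test. A hypothetical sketch (the coordinates and function names are illustrative assumptions, not from the embodiment):

```python
# Rectangles are (x_min, y_min, x_max, y_max).
def rects_overlap(a, b):
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    # Two axis-aligned rectangles overlap iff they overlap on both axes.
    return ax0 <= bx1 and bx0 <= ax1 and ay0 <= by1 and by0 <= ay1

DEFINED_AREA = (0, 400, 600, 480)  # assumed coordinates of defined area 5

def object_in_defined_area(obj_rect, area=DEFINED_AREA):
    """The object counts as present if any part of it lies in the area."""
    return rects_overlap(obj_rect, area)
```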
- When notified by the timer 54 that the counting of the defined time has ended, the defined area detection unit 53 controls the timer 54 to be OFF and sets the defined area detection flag to ON. Then, the defined area detection unit 53 outputs the defined area detection flag set to ON to the multi-application control unit 61.
- When the defined area detection unit 53 receives the drag valid flag turned off from the drag & drop determination unit 52, it changes the defined area detection flag from ON to OFF. Then, the defined area detection unit 53 outputs the defined area detection flag turned off to the multi-application control unit 61.
- When turned on by the defined area detection unit 53, the timer 54 starts counting the defined time. When the counting of the defined time ends, the timer 54 notifies the defined area detection unit 53 that the counting has ended. Thereafter, the timer 54 is turned off by the defined area detection unit 53, which resets the counted time.
- the application control unit 60 includes applications 6 and 7 and a multi-application control unit 61.
- the applications 6 and 7 are the same as those shown in FIGS. 1 and 2, respectively.
- the applications 6 and 7 may be applications called home applications, which allow the user to customize the home screen.
- the applications 6 and 7 input and output various information with the multi-application control unit 61. Various information will be described later.
- The multi-application control unit 61 notifies the application 6 or 7 of the position information input from the touch panel control unit 51, the determination result information input from the drag & drop determination unit 52, and the display position information input from the display control unit 70.
- the applications 6 and 7 perform predetermined processing based on the input information. Then, the applications 6 and 7 output processing result information indicating the processing result to the multi-application control unit 61.
- the multi-application control unit 61 outputs a display instruction to the multi-application display position management unit 71 of the display control unit 70 based on the input processing result information.
- the multi-application control unit 61 determines processing to be performed based on the drag valid flag input from the drag and drop determination unit 52 and the defined area detection flag input from the defined area detection unit 53.
- the multi-application control unit 61 determines to perform slide display of the application 6 or 7. Then, the multi-application control unit 61 outputs a display instruction to perform slide display of the application 6 or 7 (hereinafter referred to as “slide display instruction”) to the multi-application display position management unit 71.
- In slide display, as described with reference to FIG. 2, when an application is displayed on each of the two touch panels, the display of one application slides toward the touch panel on which the other application is displayed. At this time, the slid application is displayed below the object being dragged.
- When a drop is performed, the multi-application control unit 61 propagates the information of the dropped object to the slide-displayed application 6 or 7. For example, when the object 4 of the application 7 is dropped on the slide-displayed application 6, the multi-application control unit 61 acquires the information of the object 4 from the application 7 and outputs it to the application 6. Then, the multi-application control unit 61 outputs a display instruction to perform slide return display (hereinafter referred to as a "slide return display instruction") to the multi-application display position management unit 71.
- The slide return display is an operation of returning the display of the slid application to its original display position after slide display has been performed.
- the display control unit 70 includes a multi-application display position management unit 71, application display control units 72 and 73, and a split display control unit 74.
- the multi-application display position management unit 71 outputs the display instruction input from the multi-application control unit 61 to the application display control unit 72 or 73.
- the display instruction here is, for example, a slide display instruction or a slide return display instruction.
- the multi-application display position management unit 71 updates the display position information held by itself with the received display position information.
- the multi-application display position management unit 71 outputs the updated display position information to the multi-application control unit 61.
- the display position information is information indicating in which position of the touch panels 10 and 20 the applications 6 and 7 are displayed, respectively.
- the display position of each application is defined by X coordinates and Y coordinates.
- the display position information may be defined only by the Y coordinate.
- For each application, a position to be displayed by default (hereinafter referred to as the "default display position") is determined.
- The default display positions of the applications 6 and 7 differ from each other.
- The positions at which the applications 6 and 7 are illustrated are their respective default display positions.
- A position to be displayed as a result of slide display (hereinafter referred to as the "slide display position") is also determined.
- the slide display position of each of the applications 6 and 7 is the same.
- The position at which the application 6 is illustrated is the common slide display position of the applications 6 and 7.
- the slide display position includes, for example, all of the defined area 5.
- at least a part of the dragged object 4 overlaps the display of the slid application 6.
- the slide display position can be said to be a position at which the object being dragged can be dropped onto the slid application.
- the application display control unit 72 has display position information indicating a default display position of the application 7 and display position information indicating a slide display position of the application 7.
- the application display control unit 73 also has display position information indicating the default display position of the application 6 and display position information indicating the slide display position of the application 6.
- the multi-application display position management unit 71 generates full screen display information based on the updated display position information, and outputs the full screen display information to the split display control unit 74.
- the full screen display information is image data displayed on the touch panels 10 and 20.
- the application display control unit 72 changes the display position information of the application 7 possessed by the application display control unit 72 based on the display instruction input from the multi-application display position management unit 71.
- the application display control unit 73 changes the display position information of the application 6 owned by the application display control unit 73 based on the display instruction input from the multi-application display position management unit 71. For example, when the slide display instruction of the application 6 is received, the application display control unit 73 changes the display position information indicating the default display position to display position information indicating the slide display position. On the other hand, when receiving the slide return display instruction of the application 6, the application display control unit 73 changes the display position information indicating the slide display position to display position information indicating the default display position.
- The operation of the application display control unit 72 is the same as that of the application display control unit 73. When the change of the display position information is completed, the application display control units 72 and 73 output the changed display position information to the multi-application display position management unit 71.
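The behavior of the application display control units described above, switching the held display position information between the default and slide display positions according to the received instruction, might be sketched as follows. All names, instruction strings, and coordinate values are illustrative assumptions:

```python
# Assumed positions; in the embodiment these are predetermined per application.
DEFAULT_POS = {"x": 0, "y": 0}
SLIDE_POS = {"x": 0, "y": 300}

class AppDisplayControl:
    """Holds an application's display position information and updates it
    on a slide display or slide return display instruction."""
    def __init__(self, default=DEFAULT_POS, slide=SLIDE_POS):
        self.default = dict(default)
        self.slide = dict(slide)
        self.position = dict(default)  # current display position information

    def handle(self, instruction):
        if instruction == "slide_display":
            self.position = dict(self.slide)        # default -> slide position
        elif instruction == "slide_return_display":
            self.position = dict(self.default)      # slide -> default position
        # The changed position information is reported back to the manager.
        return self.position
```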
- The split display control unit 74 controls the image data indicated by the full screen display information input from the multi-application display position management unit 71 so that it is displayed separately on the touch panels 10 and 20. That is, the split display control unit 74 divides the image data indicated by the full screen display information and outputs the divided image data to the image display units 12 and 22.
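The division performed by the split display control unit 74 can be illustrated with a toy model in which the full-screen image data is a list of pixel rows split between the two panels. The row counts are assumed values, not from the embodiment:

```python
PANEL_HEIGHT = 4  # rows per touch panel, illustrative

def split_frame(frame, panel_height=PANEL_HEIGHT):
    """Divide a full-screen frame (list of rows) into the two per-panel
    images output to the image display units 12 and 22."""
    top = frame[:panel_height]                       # touch panel 10
    bottom = frame[panel_height:panel_height * 2]    # touch panel 20
    return top, bottom
```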
- The information processing apparatus 100 includes, for example, a central processing unit (CPU), a storage medium such as a read-only memory (ROM) storing a control program, and a working memory such as a random-access memory (RAM).
- the drag-and-drop determination unit 52 turns on the drag valid flag.
- the defined area detection unit 53 turns on the defined area detection flag.
- the multi-application control unit 61 determines to slide and display the application 6 displayed on the touch panel 10 in the direction of the touch panel 20 when the drag valid flag and the defined area detection flag are both ON.
- the multi-application display position management unit 71 controls the application display control unit 73 and the split display control unit 74 to execute slide display of the application 6.
- the display of the application 6 moves in parallel in the direction of the touch panel 20.
- the application 6 is displayed below the object 4 being dragged and above the application 7 across the touch panels 10 and 20.
- the application 6 is displayed to include the object 4 being dragged. In this state, when the user drops the object 4, the object 4 moves from the application 7 to the application 6.
- In the information processing apparatus 100, when moving the object 4 from the application 7 to the application 6, the user need only drag the object 4 to the defined area 5 of the touch panel 20 on which the application 7 is displayed. That is, the user does not have to drag it all the way to the touch panel 10 on which the application 6 is displayed. Therefore, the information processing apparatus 100 can solve the above-described problem that the operating finger does not reach the display of the destination application. Further, the information processing apparatus 100 can solve the problems that erroneous operation is likely to occur due to the length of the dragging distance and due to the discontinuity between the touch panels. As a result, in the information processing apparatus 100 according to the present embodiment, the user can easily move an object between applications displayed simultaneously on the touch panel.
- FIG. 4 is a flowchart showing an example of the operation of the information processing apparatus 100. In the following, the case where the user moves one object of the application 7 to the application 6 is described as an example.
- step S101 the touch panel 20 displays a plurality of objects of the application 7 as shown in FIG.
- An object is, for example, a thumbnail.
- the touch panel 10 displays a mail creation screen of the application 6.
- the input detection unit 21 outputs, to the touch panel control unit 51, positional information indicating the X coordinate and the Y coordinate of the selected object 4.
- the input detection unit 21 outputs the position information of the dragged object 4 to the touch panel control unit 51.
- step S102 when the touch panel control unit 51 acquires the position information of the object 4 being selected from the input detection unit 21, the touch panel control unit 51 outputs the position information to the drag & drop determination unit 52.
- step S103 the drag & drop determination unit 52 determines whether the object 4 is dragged based on the input position information. For example, when there is a change in the input position information, the drag & drop determination unit 52 determines that the object 4 is dragged. On the other hand, when there is no change in the input position information, the drag & drop determination unit 52 determines that the object 4 is not dragged.
- If the object 4 is not dragged (S103: NO), the flow proceeds to step S104.
- step S104 the drag & drop determination unit 52 keeps the drag valid flag OFF. Thereafter, the flow returns to step S102. The initial state of the drag valid flag is OFF.
- When the object 4 is dragged (S103: YES), the flow proceeds to step S105.
- step S105 the drag & drop determination unit 52 turns on the drag valid flag. Then, the drag and drop determination unit 52 outputs the position information of the object 4 and the drag valid flag (ON) to the defined area detection unit 53. In addition, the drag and drop determination unit 52 outputs a drag valid flag (ON) to the multi-application control unit 61.
- step S106 when the position information of the object 4 and the drag valid flag (ON) are input, the defined area detection unit 53 determines whether the object 4 exists in the defined area 5 based on the position information.
- When the object 4 is not present in the defined area 5 (S106: NO), the flow proceeds to step S107.
- step S107 the defined area detection unit 53 keeps the defined area detection flag OFF. Thereafter, the flow returns to step S102.
- the initial state of the defined area detection flag is OFF.
- if the object 4 is present in the defined area 5 (S106: YES), the flow proceeds to step S108.
- step S108 the defined area detection unit 53 turns on the timer 54 to start counting the specified time.
- step S109 the timer 54 counts time until the specified time has elapsed. When the specified time has elapsed (S109: YES), the timer 54 notifies the defined area detection unit 53 that the timing has ended.
- step S110 upon receiving the notification that the timing has ended, the defined area detection unit 53 turns off the timer 54 to end the timing and reset the count.
- step S111 the defined area detection unit 53 turns on the defined area detection flag. Then, the defined area detection unit 53 outputs the defined area detection flag (ON) to the multi-application control unit 61.
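The dwell detection of steps S106 through S111 can be sketched as follows. This is a minimal model in Python; the class and parameter names are hypothetical, since the patent specifies only the behavior, not an implementation:

```python
import time


class DefinedAreaDetector:
    """Sketch of steps S106-S111: detect that a dragged object has
    remained inside the defined area for the specified dwell time."""

    def __init__(self, area, dwell_seconds):
        self.area = area                  # (x0, y0, x1, y1) of defined area 5
        self.dwell_seconds = dwell_seconds
        self.enter_time = None            # None means the timer is OFF
        self.detection_flag = False       # defined area detection flag

    def update(self, x, y, now=None):
        """Feed one drag position sample; return the detection flag."""
        now = time.monotonic() if now is None else now
        x0, y0, x1, y1 = self.area
        inside = x0 <= x <= x1 and y0 <= y <= y1
        if not inside:                    # S106: NO -> S107, flag stays OFF
            self.enter_time = None
            self.detection_flag = False
        elif self.enter_time is None:     # S108: start counting
            self.enter_time = now
        elif now - self.enter_time >= self.dwell_seconds:  # S109: YES
            self.enter_time = None        # S110: stop timer, reset count
            self.detection_flag = True    # S111: flag ON
        return self.detection_flag
```

Passing `now` explicitly (as in a test harness) avoids depending on wall-clock timing; in a real input loop the default monotonic clock would be used.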
- step S112 the multi-application control unit 61 determines whether or not both the input drag valid flag and the defined area detection flag are ON. When both flags are ON (S112: YES), the flow proceeds to step S113.
- step S113 the multi-application control unit 61 outputs a slide display instruction of the application 6 to the multi-application display position management unit 71.
- step S114 the multi-application display position management unit 71 that has received the slide display instruction of the application 6 performs slide display of the application 6. That is, the multi-application display position management unit 71 outputs the slide display instruction of the application 6 to the application display control unit 73.
- the application display control unit 73 changes its own display position information from a predetermined default display position to a predetermined slide display position. When the change of the display position information is completed, the application display control unit 73 outputs the changed display position information to the multi-application display position management unit 71.
- step S115 the multi-application display position management unit 71 updates the display position information of the application 6 that it holds, which indicates the default display position, to the display position information indicating the slide display position input from the application display control unit 73. Then, the multi-application display position management unit 71 outputs the updated display position information to the multi-application control unit 61. The multi-application display position management unit 71 also generates full-screen display information based on the updated display position information and outputs it to the split display control unit 74. Next, the split display control unit 74 controls the touch panels 10 and 20 to display their respective portions of the image data indicated by the input full-screen display information. As a result, the information processing apparatus 100 transitions from the display shown in FIG. 2A to the display shown in FIG. 2B.
- step S116 the drag & drop determination unit 52 determines whether the object 4 is dropped based on the input position information. For example, when the input of position information continues, the drag and drop determination unit 52 determines that the object 4 is being dragged. On the other hand, when there is no input of position information, the drag and drop determination unit 52 determines that the object 4 has been dropped.
- if the object 4 is dropped (S116: YES), the flow proceeds to step S117.
- step S117 the drag & drop determination unit 52 turns off the drag valid flag that has been turned on. Then, the drag and drop determination unit 52 outputs the drag valid flag (OFF) to the defined area detection unit 53. In addition, the drag and drop determination unit 52 outputs the drag valid flag (OFF) to the multi-application control unit 61.
- step S118 when the defined area detection unit 53 receives the drag valid flag (OFF), the defined area detection unit 53 turns off the defined area detection flag that has been turned on. Then, the defined area detection unit 53 outputs the defined area detection flag (OFF) to the multi-application control unit 61.
- step S119 triggered by the input of both the drag valid flag (OFF) and the defined area detection flag (OFF), the multi-application control unit 61 acquires the information of the object 4 from the application 7 and passes it to the application 6.
- the information of the object 4 is, for example, image data such as a moving image or a still image.
- step S120 the multi-application control unit 61 outputs a slide-back display instruction for the application 6 to the multi-application display position management unit 71.
- step S121 the multi-application display position management unit 71 that has received the slide-back display instruction for the application 6 performs slide-back display of the application 6. That is, the multi-application display position management unit 71 outputs the slide-back display instruction for the application 6 to the application display control unit 73.
- the application display control unit 73 changes its own display position information from the slide display position back to the default display position. When the change of the display position information is completed, the application display control unit 73 outputs the changed display position information to the multi-application display position management unit 71.
- step S122 the multi-application display position management unit 71 updates the display position information of the application 6 that it holds, which indicates the slide display position, to the display position information indicating the default display position input from the application display control unit 73. Then, the multi-application display position management unit 71 outputs the updated display position information to the multi-application control unit 61. The multi-application display position management unit 71 also generates full-screen display information based on the updated display position information and outputs it to the split display control unit 74. Next, the split display control unit 74 controls the touch panels 10 and 20 to display their respective portions of the image data indicated by the input full-screen display information. As a result, the information processing apparatus 100 transitions from the display shown in FIG. 2B back to the display shown in FIG. 2A.
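The flag-driven slide and slide-back control of steps S112 through S122 can be modeled as below. This is a sketch with hypothetical names and position labels; the actual units exchange full display position information rather than simple strings:

```python
# Hypothetical position labels; the patent distinguishes only a default
# display position (FIG. 2A) and a slide display position (FIG. 2B).
DEFAULT_POSITION = "default"
SLIDE_POSITION = "slide"


class AppDisplayControl:
    """Sketch of the application display control unit 73."""

    def __init__(self):
        self.position = DEFAULT_POSITION


class MultiAppController:
    """Sketch of the multi-application control unit 61: slide the
    destination application in when both flags turn ON (S112-S115),
    slide it back when both turn OFF after a drop (S117-S122)."""

    def __init__(self, display):
        self.display = display
        self.flags = (False, False)  # (drag valid, defined area detection)

    def on_flags(self, drag_valid, area_detected):
        prev = self.flags
        self.flags = (drag_valid, area_detected)
        if drag_valid and area_detected:               # S112: both ON
            self.display.position = SLIDE_POSITION     # S113-S115
        elif prev == (True, True) and not (drag_valid or area_detected):
            self.display.position = DEFAULT_POSITION   # S120-S122
```

A drag into the defined area raises both flags and triggers the slide; the drop (both flags OFF) triggers the slide back, matching the FIG. 2A to 2B and back transitions.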
- the information processing apparatus 100 can thus solve the above-described problem that the operating finger cannot reach the display of the destination application. Further, the information processing apparatus 100 can solve the problem that erroneous operations are likely to occur due to the long drag distance and the discontinuity between the touch panels. As a result, in the information processing apparatus 100 according to the present embodiment, the user can easily move an object between the applications displayed simultaneously on the touch panels.
- Second Embodiment In the first embodiment described above, an example has been described in which one of the two applications is displayed in a sliding manner when the information processing apparatus 100 has activated two applications. In the present embodiment, an example of slide display performed when the information processing apparatus 100 activates three or more applications will be described.
- in this case, the display of those applications is as shown in FIG. 2A. That is, the applications 6 and 7 are displayed on the touch panels 10 and 20, respectively, while the other applications are not displayed.
- suppose the user moves the object 4 of the application 7 to an application that is not displayed, other than the application 6.
- the user drags the object 4 to the defined area 5 and waits for a predetermined time while dragging.
- the information processing apparatus 100 first causes the application 6 being displayed on the touch panel 10 to be slide-displayed in the same manner as the operation described in the first embodiment.
- next, the information processing apparatus 100 displays the application 8, which was hidden behind the display of the application 6, on the touch panel 10, and slide-displays it in the same manner as the application 6.
- at this time, the information processing apparatus 100 displays the slid application 8 on top of the previously slid application 6.
- when further hidden applications are activated, the information processing apparatus 100 repeats the same operation as described above for the application 8.
- as described above, when there are three or more activated applications, the information processing apparatus 100 according to the present embodiment automatically switches the display of the applications one after another and slide-displays them. Therefore, the user only needs to hold the object 4 in the defined area 5 until the desired application is slide-displayed. That is, when a plurality of applications are activated, the user can easily move an object between the applications without performing an operation to find the display of the desired application or an operation to display the desired application.
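The automatic cycling of slide displays in this embodiment can be modeled as a simple sequence. This is a hypothetical sketch; the application names and the dwell-count model are illustrative only:

```python
def slide_sequence(front_app, hidden_apps, dwell_count):
    """Return which applications have been slide-displayed after the
    dragged object has dwelt in the defined area `dwell_count` times.

    Each dwell period slides in one more application, stacked on top
    of the previously slid ones, until all applications are shown.
    """
    order = [front_app] + list(hidden_apps)
    return order[:min(dwell_count, len(order))]
```

For example, with the front application 6 and hidden applications 8 and 9, the first dwell slides in application 6, the second stacks application 8 on it, and so on until the user drops the object on the desired application.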
- FIG. 6 is a view showing an example of the appearance of the information processing apparatus 101 according to the present embodiment.
- the information processing apparatus 101 can be applied to, for example, a smartphone or a tablet.
- the information processing apparatus 101 has a rectangular flat housing 80.
- the housing 80 has a touch panel 90.
- the touch panel 90 displays the application 7 in the right half and the application 6 in the left half.
- Reference numeral 31 in the drawing indicates the boundary between the display of the application 6 and the display of the application 7.
- the defined area 5 is located near the boundary 31 in the touch panel 90.
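One way to place such a defined area is as a vertical strip centered on the boundary 31. This is a hypothetical sketch; the patent does not specify the area's exact size or shape:

```python
def defined_area_near_boundary(panel_width, panel_height, half_width):
    """Return (x0, y0, x1, y1) of a defined area centered on the
    boundary between the left-half and right-half displays.

    `half_width` is a hypothetical half-width of the strip in pixels.
    """
    boundary_x = panel_width // 2          # boundary 31 splits the panel
    return (boundary_x - half_width, 0, boundary_x + half_width, panel_height)
```

Such a rectangle could then be handed to the dwell-detection logic described for the first embodiment, since the single-panel operation is otherwise the same.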
- FIG. 7 is a block diagram showing an example of the configuration of the information processing apparatus 101.
- the information processing apparatus 101 includes a touch panel 90, a touch panel coordinate management unit 50, an application control unit 60, and a display control unit 70.
- the display control unit 70 does not include the split display control unit 74 shown in FIG. 3.
- the touch panel 90 is the same as that shown in FIG. 6 and includes an input detection unit 91 and an image display unit 92.
- since the touch panel coordinate management unit 50, the application control unit 60, and the display control unit 70 illustrated in FIG. 7 have been described with reference to FIG. 3, their description is omitted here.
- since the input detection unit 91 and the image display unit 92 have the same functions as the input detection units 11 and 21 and the image display units 12 and 22 shown in FIG. 3, respectively, their description is omitted here.
- the operation of the information processing apparatus 101 is the same as the operation of the information processing apparatus 100.
- the information processing apparatus 101 slides the application 6 displayed on the left half of the touch panel 90 in the direction of the right half of the touch panel 90.
- the application 6 is displayed above the application 7 and below the object 4 in the central portion of the touch panel 90, as in FIG. 2B.
- the information processing apparatus 101 can also apply the slide display operation of the above-described second embodiment when a plurality of applications are activated.
- thus, the same effects as those described in the first and second embodiments can be obtained even with a single touch panel.
- the present invention is not limited to the slide display described above; other display methods are also applicable. For example, the present invention may apply a display method in which the display of the application 6 temporarily disappears from the touch panel 10 and appears over the application 7 being displayed on the touch panel 20.
- an information processing apparatus according to the present invention is an information processing apparatus including a touch panel that simultaneously displays a first application and a second application, and includes: a detection unit that detects that an object of the first application has remained in a defined area of the touch panel for a predetermined time while being dragged; and a control unit that, triggered by the detection of the presence, moves the display of the second application to a position where the object can be dropped.
- the detection unit detects that the object has been dropped onto the second application after the display of the second application has moved to a position where the object can be dropped, and the control unit, triggered by the detection of the drop, moves the display of the second application back to its original position.
- when the control unit moves the display of the second application to a position where the object can be dropped, the control unit displays the second application below the display of the object and above the display of the first application.
- when a third application not displayed on the touch panel is activated in addition to the first application and the second application, the control unit, following the movement of the display of the second application, displays the third application on the touch panel and moves the display of the third application to a position where the object can be dropped.
- when a plurality of the third applications are activated, the control unit successively performs the display and movement of the third applications following the movement of the display of the second application.
- the defined area is located in the vicinity of a boundary between the display of the first application and the display of the second application.
- the touch panel includes a first touch panel that displays the first application and a second touch panel that displays the second application, and the first touch panel has the defined area.
- an information processing method according to the present invention is an information processing method performed by a terminal including a touch panel that simultaneously displays a first application and a second application, and includes: detecting that an object of the first application has remained in a defined area on the touch panel for a predetermined time while being dragged; and moving the second application to a position where the object can be dropped, triggered by the detection of the presence.
- an information processing program according to the present invention is an information processing program to be executed by a computer of a terminal including a touch panel that simultaneously displays a first application and a second application, and causes the computer to execute: a process of detecting that an object of the first application has remained in a defined area on the touch panel for a predetermined time while being dragged; and a process of moving the second application to a position where the object can be dropped, triggered by the detection of the presence.
- the present invention is useful as an information processing apparatus, an information processing method, and an information processing program applied to a terminal provided with a touch panel.
Description
Hereinafter, Embodiment 1 of the present invention will be described in detail with reference to the drawings.
FIG. 1 is a diagram showing an example of the appearance of the information processing apparatus 100 according to the present embodiment. The information processing apparatus 100 can be applied to, for example, a smartphone.
A characteristic operation of the information processing apparatus 100 will be described with reference to FIG. 2. In the following example, starting from the state of FIG. 1, the user selects the object 4 of the application 7 and drags and drops it onto the application 6. FIGS. 2A and 2B are diagrams showing an example of screen transitions during operation of the information processing apparatus 100.
FIG. 3 is a block diagram showing an example of the configuration of the information processing apparatus 100.
FIG. 4 is a flowchart showing an example of the operation of the information processing apparatus 100. In the following example, the user moves one object of the application 7 to the application 6.
In the first embodiment described above, an example was described in which, when the information processing apparatus 100 has activated two applications, one of them is slide-displayed. In the present embodiment, an example of the slide display performed when the information processing apparatus 100 has activated three or more applications will be described.
In the first embodiment described above, an example was described in which the information processing apparatus includes two touch panels. In the present embodiment, an example of the slide display performed when the information processing apparatus includes a single touch panel will be described.
FIG. 6 is a diagram showing an example of the appearance of the information processing apparatus 101 according to the present embodiment. The information processing apparatus 101 can be applied to, for example, a smartphone or a tablet.
FIG. 7 is a block diagram showing an example of the configuration of the information processing apparatus 101.
The operation of the information processing apparatus 101 is the same as that of the information processing apparatus 100. For example, in FIG. 6, when the user moves the object 4 of the application 7 to the application 6, the user drags the object 4 to the defined area 5 and waits for a predetermined time while keeping the drag. Thereby, the information processing apparatus 101 slides the application 6 displayed on the left half of the touch panel 90 toward the right half of the touch panel 90. As a result of this slide display, the application 6 is displayed in the central portion of the touch panel 90, above the application 7 and below the object 4, as in FIG. 2B.
3 Hinge
4 Object
5 Defined area
6, 7, 8 Applications
10, 20, 90 Touch panels
11, 21, 91 Input detection units
12, 22, 92 Image display units
31 Boundary
50 Touch panel coordinate management unit
51 Touch panel control unit
52 Drag & drop determination unit
53 Defined area detection unit
54 Timer
60 Application control unit
61 Multi-application control unit
70 Display control unit
71 Multi-application display position management unit
72, 73 Application display control units
74 Split display control unit
100, 101 Information processing apparatuses
Claims (9)
- An information processing apparatus comprising a touch panel that simultaneously displays a first application and a second application, the information processing apparatus comprising:
a detection unit that detects that an object of the first application has remained in a defined area of the touch panel for a predetermined time while being dragged; and
a control unit that, triggered by the detection of the presence, moves a display of the second application to a position where the object can be dropped.
- The information processing apparatus according to claim 1, wherein
the detection unit detects that the object has been dropped onto the second application after the display of the second application has moved to the position where the object can be dropped, and
the control unit, triggered by the detection of the drop, moves the display of the second application back to its original position.
- The information processing apparatus according to claim 1, wherein,
when the control unit moves the display of the second application to the position where the object can be dropped, the control unit displays the second application below a display of the object and above a display of the first application.
- The information processing apparatus according to claim 1, wherein,
when a third application not displayed on the touch panel is activated in addition to the first application and the second application,
the control unit, following the movement of the display of the second application, displays the third application on the touch panel and moves a display of the third application to a position where the object can be dropped.
- The information processing apparatus according to claim 4, wherein,
when a plurality of the third applications are activated,
the control unit successively performs the display and movement of the third applications following the movement of the display of the second application.
- The information processing apparatus according to claim 1, wherein
the defined area is located in the vicinity of a boundary between the display of the first application and the display of the second application.
- The information processing apparatus according to claim 1, wherein
the touch panel comprises a first touch panel that displays the first application and a second touch panel that displays the second application, and
the first touch panel has the defined area.
- An information processing method performed by a terminal including a touch panel that simultaneously displays a first application and a second application, the method comprising:
detecting that an object of the first application has remained in a defined area on the touch panel for a predetermined time while being dragged; and
moving the second application to a position where the object can be dropped, triggered by the detection of the presence.
- An information processing program to be executed by a computer of a terminal including a touch panel that simultaneously displays a first application and a second application, the program causing the computer to execute:
a process of detecting that an object of the first application has remained in a defined area on the touch panel for a predetermined time while being dragged; and
a process of moving the second application to a position where the object can be dropped, triggered by the detection of the presence.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013545565A JPWO2013175770A1 (ja) | 2012-05-25 | 2013-05-21 | 情報処理装置、情報処理方法、および情報処理プログラム |
CN201380001571.0A CN103597439B (zh) | 2012-05-25 | 2013-05-21 | 信息处理装置、信息处理方法和信息处理程序 |
US14/232,033 US9529518B2 (en) | 2012-05-25 | 2013-05-21 | Information processing device, information processing method, and information processing program |
US15/285,983 US10082947B2 (en) | 2012-05-25 | 2016-10-05 | Information processing device, information processing method, and information processing program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012119487 | 2012-05-25 | ||
JP2012-119487 | 2012-05-25 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/232,033 A-371-Of-International US9529518B2 (en) | 2012-05-25 | 2013-05-21 | Information processing device, information processing method, and information processing program |
US15/285,983 Continuation US10082947B2 (en) | 2012-05-25 | 2016-10-05 | Information processing device, information processing method, and information processing program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013175770A1 true WO2013175770A1 (ja) | 2013-11-28 |
Family
ID=49623480
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/003232 WO2013175770A1 (ja) | 2012-05-25 | 2013-05-21 | 情報処理装置、情報処理方法、および情報処理プログラム |
Country Status (4)
Country | Link |
---|---|
US (2) | US9529518B2 (ja) |
JP (1) | JPWO2013175770A1 (ja) |
CN (1) | CN103597439B (ja) |
WO (1) | WO2013175770A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015081739A1 (en) * | 2013-12-04 | 2015-06-11 | Huawei Technologies Co., Ltd. | Method of performing one or more actions on electronic device |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150193096A1 (en) * | 2014-01-07 | 2015-07-09 | Samsung Electronics Co., Ltd. | Electronic device and method for operating the electronic device |
CN105204794B (zh) * | 2014-06-16 | 2020-04-14 | 中兴通讯股份有限公司 | 视图显示处理方法、装置及投影设备 |
US10528224B2 (en) * | 2014-12-10 | 2020-01-07 | Rakuten, Inc. | Server, display control method, and display control program |
JP6868427B2 (ja) * | 2017-03-23 | 2021-05-12 | シャープ株式会社 | 入力機能付き表示装置 |
US11093197B2 (en) * | 2017-07-31 | 2021-08-17 | Stmicroelectronics, Inc. | System and method to increase display area utilizing a plurality of discrete displays |
IT201800002984A1 (it) | 2018-02-23 | 2019-08-23 | Cnh Ind Italia Spa | Pala migliorata avente una capacita' variabile |
KR102513752B1 (ko) * | 2018-04-11 | 2023-03-24 | 삼성전자 주식회사 | 전자 장치 및 전자 장치의 제어 방법 |
EP3971687B1 (en) * | 2019-08-19 | 2024-04-03 | Samsung Electronics Co., Ltd. | Electronic device and method for application selection and customization of layout on foldable display. |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011070525A (ja) * | 2009-09-28 | 2011-04-07 | Kyocera Corp | 携帯端末装置 |
Family Cites Families (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5327161A (en) * | 1989-08-09 | 1994-07-05 | Microtouch Systems, Inc. | System and method for emulating a mouse input device with a touchpad input device |
GB9201949D0 (en) * | 1992-01-30 | 1992-03-18 | Jenkin Michael | Large-scale,touch-sensitive video display |
JPH06274305A (ja) * | 1993-03-18 | 1994-09-30 | Hitachi Ltd | 画面表示装置及びその制御方法 |
US6088005A (en) * | 1996-01-11 | 2000-07-11 | Hewlett-Packard Company | Design and method for a large, virtual workspace |
JP3304290B2 (ja) * | 1997-06-26 | 2002-07-22 | シャープ株式会社 | ペン入力装置及びペン入力方法及びペン入力制御プログラムを記録したコンピュータ読み取り可能な記録媒体 |
US6331840B1 (en) | 1998-03-27 | 2001-12-18 | Kevin W. Nielson | Object-drag continuity between discontinuous touch screens of a single virtual desktop |
US6229502B1 (en) * | 1998-11-03 | 2001-05-08 | Cylark Development Llc | Electronic book |
US6545669B1 (en) * | 1999-03-26 | 2003-04-08 | Husam Kinawi | Object-drag continuity between discontinuous touch-screens |
EP1269295A2 (de) * | 2000-03-31 | 2003-01-02 | Glenn Rolus Borgward | Universelles digitales mobilgerät |
US6643124B1 (en) * | 2000-08-09 | 2003-11-04 | Peter J. Wilk | Multiple display portable computing devices |
US20030076364A1 (en) * | 2001-10-18 | 2003-04-24 | International Business Machines Corporation | Method of previewing a graphical image corresponding to an icon in a clipboard |
JP2006518507A (ja) * | 2003-02-19 | 2006-08-10 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | ポータブル装置間でのコンテンツアイテムのアドホック共有のためのシステムと、そのインタラクション方法 |
JP2005346583A (ja) * | 2004-06-04 | 2005-12-15 | Canon Inc | 画像表示装置、マルチディスプレイ・システム、座標情報出力方法及びその制御プログラム |
US20070064004A1 (en) * | 2005-09-21 | 2007-03-22 | Hewlett-Packard Development Company, L.P. | Moving a graphic element |
KR100755851B1 (ko) | 2005-10-14 | 2007-09-07 | 엘지전자 주식회사 | 멀티미디어 디스플레이 방법, 이를 위한 이동 단말기, 및이동 단말기용 크래들 |
US7844301B2 (en) * | 2005-10-14 | 2010-11-30 | Lg Electronics Inc. | Method for displaying multimedia contents and mobile communications terminal capable of implementing the same |
US7533349B2 (en) * | 2006-06-09 | 2009-05-12 | Microsoft Corporation | Dragging and dropping objects between local and remote modules |
US20070294357A1 (en) * | 2006-06-20 | 2007-12-20 | Lennox Bertrand Antoine | Geospatial community facilitator |
JP2008257442A (ja) * | 2007-04-04 | 2008-10-23 | Sharp Corp | 電子掲示装置 |
JP4712786B2 (ja) * | 2007-12-13 | 2011-06-29 | 京セラ株式会社 | 情報処理装置 |
EP2131271A1 (en) * | 2008-06-04 | 2009-12-09 | NEC Corporation | Method for enabling a mobile user equipment to drag and drop data objects between distributed applications |
US20090322689A1 (en) * | 2008-06-30 | 2009-12-31 | Wah Yiu Kwong | Touch input across touch-sensitive display devices |
US8863038B2 (en) * | 2008-09-08 | 2014-10-14 | Qualcomm Incorporated | Multi-panel electronic device |
US8947320B2 (en) * | 2008-09-08 | 2015-02-03 | Qualcomm Incorporated | Method for indicating location and direction of a graphical user interface element |
US8959446B2 (en) * | 2008-11-20 | 2015-02-17 | Canon Kabushiki Kaisha | Information processing apparatus and method of controlling the same |
US8330733B2 (en) * | 2009-01-21 | 2012-12-11 | Microsoft Corporation | Bi-modal multiscreen interactivity |
JP2010176332A (ja) * | 2009-01-28 | 2010-08-12 | Sony Corp | 情報処理装置、情報処理方法およびプログラム |
JP4697558B2 (ja) | 2009-03-09 | 2011-06-08 | ソニー株式会社 | 情報処理装置、情報処理方法及び情報処理プログラム |
JP5606686B2 (ja) * | 2009-04-14 | 2014-10-15 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP5229083B2 (ja) * | 2009-04-14 | 2013-07-03 | ソニー株式会社 | 情報処理装置、情報処理方法及びプログラム |
JP5177071B2 (ja) * | 2009-04-30 | 2013-04-03 | ソニー株式会社 | 送信装置および方法、受信装置および方法、並びに送受信システム |
US8246080B1 (en) * | 2009-07-10 | 2012-08-21 | Bridget Renee Bennett | Foldable compartmentalized clipboard |
US9092115B2 (en) * | 2009-09-23 | 2015-07-28 | Microsoft Technology Licensing, Llc | Computing system with visual clipboard |
US8473870B2 (en) * | 2010-02-25 | 2013-06-25 | Microsoft Corporation | Multi-screen hold and drag gesture |
US8751970B2 (en) * | 2010-02-25 | 2014-06-10 | Microsoft Corporation | Multi-screen synchronous slide gesture |
JP4865053B2 (ja) * | 2010-04-22 | 2012-02-01 | 株式会社東芝 | 情報処理装置およびドラッグ制御方法 |
US9495473B2 (en) * | 2010-07-19 | 2016-11-15 | Soasta, Inc. | Analytic dashboard with user interface for producing a single chart statistical correlation from source and target charts during a load test |
AU2011101160B4 (en) * | 2010-09-09 | 2013-07-18 | Opentv, Inc. | Methods and systems for drag and drop content sharing in a multi-device environment |
US9046992B2 (en) * | 2010-10-01 | 2015-06-02 | Z124 | Gesture controls for multi-screen user interface |
JP5984339B2 (ja) * | 2011-04-26 | 2016-09-06 | 京セラ株式会社 | 電子機器、画面制御方法および画面制御プログラム |
US20130097541A1 (en) * | 2011-10-13 | 2013-04-18 | Gface Gmbh | Smart drag and drop |
US20130167072A1 (en) * | 2011-12-22 | 2013-06-27 | Sap Portals Israel Ltd. | Smart and Flexible Layout Context Manager |
US9098183B2 (en) * | 2012-09-28 | 2015-08-04 | Qualcomm Incorporated | Drag and drop application launches of user interface objects |
-
2013
- 2013-05-21 US US14/232,033 patent/US9529518B2/en active Active
- 2013-05-21 JP JP2013545565A patent/JPWO2013175770A1/ja active Pending
- 2013-05-21 WO PCT/JP2013/003232 patent/WO2013175770A1/ja active Application Filing
- 2013-05-21 CN CN201380001571.0A patent/CN103597439B/zh not_active Expired - Fee Related
-
2016
- 2016-10-05 US US15/285,983 patent/US10082947B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011070525A (ja) * | 2009-09-28 | 2011-04-07 | Kyocera Corp | 携帯端末装置 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015081739A1 (en) * | 2013-12-04 | 2015-06-11 | Huawei Technologies Co., Ltd. | Method of performing one or more actions on electronic device |
Also Published As
Publication number | Publication date |
---|---|
US20170024101A1 (en) | 2017-01-26 |
CN103597439A (zh) | 2014-02-19 |
US9529518B2 (en) | 2016-12-27 |
JPWO2013175770A1 (ja) | 2016-01-12 |
US20140173470A1 (en) | 2014-06-19 |
US10082947B2 (en) | 2018-09-25 |
CN103597439B (zh) | 2018-09-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013175770A1 (ja) | 情報処理装置、情報処理方法、および情報処理プログラム | |
US11226724B2 (en) | Swiping functions for messaging applications | |
WO2022068773A1 (zh) | 桌面元素调整方法、装置和电子设备 | |
KR101229699B1 (ko) | 애플리케이션 간의 콘텐츠 이동 방법 및 이를 실행하는 장치 | |
JP7328182B2 (ja) | 画像処理装置、画像処理装置の制御方法及びプログラム | |
EP2698708A1 (en) | Method for providing user interface having multi-tasking function, mobile communication device, and computer readable recording medium for providing the same | |
US9377944B2 (en) | Information processing device, information processing method, and information processing program | |
KR20130093043A (ko) | 터치 및 스와이프 내비게이션을 위한 사용자 인터페이스 방법 및 모바일 디바이스 | |
US20120297329A1 (en) | Electronic apparatus, program, and control method | |
US20150185987A1 (en) | Method, apparatus and computer readable medium for zooming and operating screen frame | |
WO2014044133A1 (zh) | 应用界面及控制应用界面操作的方法和装置 | |
US20140225847A1 (en) | Touch panel apparatus and information processing method using same | |
JP2014071724A (ja) | 電子機器、制御方法及び制御プログラム | |
JP2015158713A (ja) | 表示制御装置、画像形成装置およびプログラム | |
US20160110051A1 (en) | System and method to control a touchscreen user interface | |
US9009627B2 (en) | Electronic apparatus, program, and control method for displaying access authority for data files | |
KR102095039B1 (ko) | 터치 인터페이스를 제공하는 장치에서 터치 입력을 수신하는 방법 및 장치 | |
TWI530864B (zh) | 可攜式電子裝置以及使用者介面操作方法 | |
US20230297209A1 (en) | Content sharing methods and apparatus, terminal, storage medium | |
JP6445777B2 (ja) | オブジェクトを管理する情報処理装置およびその制御方法 | |
JP6478260B2 (ja) | 電子機器、電子機器の制御方法及びプログラム | |
JP2012053662A (ja) | タッチパネル操作方式の携帯端末 | |
AU2019205000B2 (en) | Component display processing method and user equipment | |
JP2017091445A (ja) | 携帯電子機器、制御方法及び制御プログラム | |
JP2006039819A (ja) | 座標入力装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2013545565 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14232033 Country of ref document: US |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13793433 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13793433 Country of ref document: EP Kind code of ref document: A1 |