WO2014136372A1 - Information processing device, information processing method, and non-transitory computer-readable medium - Google Patents

Information processing device, information processing method, and non-transitory computer-readable medium

Info

Publication number
WO2014136372A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
information processing
displayed
touch panel
lcd
Prior art date
Application number
PCT/JP2014/000219
Other languages
English (en)
Japanese (ja)
Inventor
Ryoji Hasui (亮二 蓮井)
Original Assignee
NEC CASIO Mobile Communications, Ltd. (Necカシオモバイルコミュニケーションズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC CASIO Mobile Communications, Ltd.
Publication of WO2014136372A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1423 Digital output to display device; Cooperation and interconnection of the display device with other functional units, controlling a plurality of local displays, e.g. CRT and flat panel display
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2340/00 Aspects of display data processing
    • G09G 2340/14 Solving problems related to the presentation of information to be displayed

Definitions

  • the present invention relates to an information processing apparatus, an information processing method, and a non-transitory computer-readable medium.
  • Patent Document 1 discloses a technology related to dragging a display object that straddles a plurality of touch screen displays.
  • In that technology, when a display object approaches the edge of a touch screen display as it is dragged on that touch screen display, a target display, that is, the touch screen display to which the display object is to be moved, is determined, and the display object is moved toward the target display.
  • Then, dragging of the display object is resumed on the target display.
  • An object of the present invention is to provide a technique for allowing a user to execute a drag with confidence when dragging an object so as to straddle between adjacent touch screen displays.
  • According to one aspect of the present invention, there is provided an information processing apparatus comprising: a first touch screen display having a first display and a first touch panel provided to overlap the first display; a second touch screen display having a second display and a second touch panel provided to overlap the second display; object display means for displaying an object on a logical display formed by integrating the first display and the second display; and object moving means for moving the object displayed on the first display based on a drag operation performed on the first touch panel.
  • When the object displayed on the first display is moved into a specific area of the first display that is a region close to the second display, the object display means displays the object in a special manner.
  • According to another aspect of the present invention, there is provided an information processing method for an information processing apparatus comprising a first touch screen display having a first display and a first touch panel provided to overlap the first display, and a second touch screen display having a second display and a second touch panel provided to overlap the second display.
  • The method comprises: an object display step of displaying an object on a logical display obtained by integrating the first display and the second display; and an object moving step of moving the object displayed on the first display based on a drag operation performed on the first touch panel.
  • When the object displayed on the first display is moved into a specific area of the first display that is a region close to the second display, the object is displayed in a special manner.
  • According to still another aspect of the present invention, there is provided a non-transitory computer-readable medium storing an information processing program for an information processing apparatus comprising a first touch screen display having a first display and a first touch panel provided to overlap the first display, and a second touch screen display having a second display and a second touch panel provided to overlap the second display.
  • The information processing program causes a computer to execute: an object display step of displaying an object on a logical display formed by integrating the first display and the second display; and an object moving step of moving the object displayed on the first display based on a drag operation performed on the first touch panel. When the object displayed on the first display is moved into a specific area of the first display that is close to the second display, the object is displayed in a special manner.
  • According to the above configurations, when the object moves into the specific area, it is displayed in a special manner. Therefore, the user can confirm that the information processing apparatus has grasped the user's intention to drag the object so that it straddles the adjacent touch screen displays. As a result, when dragging an object so as to straddle adjacent touch screen displays, the user can execute the drag with peace of mind.
  • FIG. 1 is a functional block diagram of the information processing apparatus.
  • FIG. 2 is a perspective view of the information processing apparatus.
  • FIG. 3 is a front view illustrating a usage mode of the information processing apparatus.
  • FIG. 4 is a functional block diagram of the information processing apparatus.
  • FIG. 5 is a diagram illustrating a logical TSD (touch screen display).
  • FIG. 6 is a control flow of the information processing apparatus.
  • FIG. 7 is a control flow of the information processing apparatus.
  • FIG. 8 is a control flow of the information processing apparatus.
  • FIG. 9 is a diagram illustrating how an object moves by a drag operation.
  • FIG. 10 is a diagram illustrating a state in which an object is enlarged and displayed.
  • FIG. 11 is a diagram illustrating a state in which an object is enlarged and displayed.
  • In the first embodiment, the multi-display mobile terminal 1 (information processing apparatus) includes: a left TSD 4 (first touch screen display) having a left LCD 2 (first display) and a left touch panel 3 (first touch panel) provided to overlap the left LCD 2; a right TSD 7 (second touch screen display) having a right LCD 5 (second display) and a right touch panel 6 (second touch panel) provided to overlap the right LCD 5; an object display unit 8 (object display means) for displaying an object on a logical display formed by integrating the left LCD 2 and the right LCD 5; and an object moving unit 9 (object moving means) for moving the object displayed on the left LCD 2 based on a drag operation performed on the left touch panel 3.
  • When the object displayed on the left LCD 2 moves into a specific area of the left LCD 2 that is a region close to the right LCD 5, the object display unit 8 displays the object in a special manner.
  • According to the above configuration, when the object moves into the specific area, it is displayed in a special manner. Therefore, the user can confirm that the multi-display mobile terminal 1 has grasped the user's intention to drag the object so that it straddles the adjacent touch screen displays. As a result, when dragging an object so as to straddle adjacent touch screen displays, the user can execute the drag with peace of mind.
  • FIG. 2 shows a multi-display portable terminal 10 as an information processing apparatus.
  • The multi-display portable terminal 10 has a base 11 (first housing), a cover 12 (second housing), and a hinge 13 (casing connection means) that connects the cover 12 to the base 11 so that the cover 12 can rotate freely with respect to the base 11.
  • the base 11 includes a left TSD 14 (first touch screen display) and a control unit 15 (control means).
  • the left TSD 14 is configured in a rectangular shape having a short side 14a and a long side 14b.
  • the cover 12 includes a right TSD 16 (second touch screen display).
  • the right TSD 16 is configured in a rectangular shape having a short side 16a and a long side 16b.
  • the description will be made on the assumption that the multi-display portable terminal 10 is completely opened so that the angle formed by the base 11 and the cover 12 is 180 degrees.
  • the angle formed by the base 11 and the cover 12 may be other than 180 degrees.
  • the left TSD 14 includes a left LCD 17 (first display) and a left touch panel 18 (first touch panel).
  • the left LCD 17 displays a desired image based on the image signal output from the control unit 15.
  • the left touch panel 18 is provided so as to overlap the left LCD 17.
  • the left touch panel 18 outputs touch data corresponding to the touch position of the left touch panel 18 touched with a finger or a touch pen to the control unit 15.
  • the touch data is composed of an X coordinate and a Y coordinate with the upper left corner of the left touch panel 18 as the origin. When the left touch panel 18 is not touched, empty touch data is output to the control unit 15.
  • the right TSD 16 includes a right LCD 19 (second display) and a right touch panel 20 (second touch panel).
  • the right LCD 19 displays a desired image based on the image signal output from the control unit 15.
  • the right touch panel 20 is provided so as to overlap the right LCD 19.
  • the right touch panel 20 outputs touch data corresponding to the touch position of the right touch panel 20 touched with a finger or a touch pen to the control unit 15.
  • the touch data is composed of an X coordinate and a Y coordinate with the upper left corner of the right touch panel 20 as the origin. When the right touch panel 20 is not touched, empty touch data is output to the control unit 15.
  • the control unit 15 includes a CPU 30 (Central Processing Unit), a RAM 31 (Random Access Memory), and a ROM 32 (Read Only Memory).
  • the ROM 32 stores a game program (information processing program).
  • The game program is read and executed by the CPU 30, thereby causing hardware such as the CPU 30 to function as an object display unit 33 (object display means), an object moving unit 34 (object moving means), and a counter 35.
  • the control unit 15 integrates the left TSD 14 and the right TSD 16 to form a single logical TSD 36. Specifically, the control unit 15 integrates the left LCD 17 and the right LCD 19 to constitute a single logical LCD 37. Similarly, the control unit 15 integrates the left touch panel 18 and the right touch panel 20 to configure a single logical touch panel 38.
  • the control unit 15 converts the touch data output from the right touch panel 20 into touch data having the upper left corner of the left touch panel 18 as the origin by executing predetermined coordinate conversion in order to realize the logical touch panel 38. Specifically, assuming that the width of the left touch panel 18 is 600 pixels, the control unit 15 performs coordinate conversion that adds 600 to the X coordinate of touch data output from the right touch panel 20.
  • To realize the logical LCD 37, when the control unit 15 outputs an image signal corresponding to an image formed on the logical LCD 37 to the right LCD 19, it executes in advance the inverse of the above coordinate conversion on the image signal output to the right LCD 19.
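  • As an illustrative sketch of this coordinate conversion and its inverse, assuming a 600-pixel-wide left panel and simple (x, y) touch data, the following Python fragment shows the forward conversion for touch data from the right touch panel 20 and the inverse conversion applied before output to the right LCD 19; the constant and function names are assumptions.

```python
# Minimal sketch of the coordinate conversion described above, assuming the
# left panel is 600 pixels wide (per the embodiment) and that touch data is a
# simple (x, y) pair with the origin at each panel's upper-left corner.
# LEFT_PANEL_WIDTH, to_logical, and to_right_local are illustrative names.

LEFT_PANEL_WIDTH = 600  # width of the left touch panel / left LCD in pixels


def to_logical(panel: str, x: int, y: int) -> tuple[int, int]:
    """Convert panel-local touch data to logical touch panel coordinates."""
    if panel == "right":
        # Touches on the right panel are shifted right by the left panel width.
        return x + LEFT_PANEL_WIDTH, y
    return x, y  # left-panel coordinates are already logical coordinates


def to_right_local(logical_x: int, logical_y: int) -> tuple[int, int]:
    """Inverse conversion applied before drawing on the right LCD."""
    return logical_x - LEFT_PANEL_WIDTH, logical_y


# A touch at (50, 120) on the right panel maps to (650, 120) on the logical panel.
assert to_logical("right", 50, 120) == (650, 120)
assert to_right_local(650, 120) == (50, 120)
```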
  • The touch event information includes: touch start event information when there was no touch in the previous step and there is a touch in the current step; touch end event information when there was a touch in the previous step and there is no touch in the current step; touch movement event information when there is a touch in both the previous step and the current step; long touch event information when there is a touch in both the previous step and the current step and the touch data is the same; and the like.
  • Each touch event information includes the touch data of this step.
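  • A minimal sketch of how the four kinds of touch event information could be derived from the previous and current touch data, with empty touch data modeled as None; the enum and function names are assumptions.

```python
# Hedged sketch of the touch event classification described above. TouchEvent
# and classify_touch are illustrative names; the publication does not specify
# an implementation.
from enum import Enum, auto
from typing import Optional, Tuple

Touch = Optional[Tuple[int, int]]  # None represents "no touch" (empty touch data)


class TouchEvent(Enum):
    TOUCH_START = auto()
    TOUCH_END = auto()
    TOUCH_MOVE = auto()
    LONG_TOUCH = auto()
    NONE = auto()


def classify_touch(previous: Touch, current: Touch) -> TouchEvent:
    if previous is None and current is not None:
        return TouchEvent.TOUCH_START        # no touch -> touch
    if previous is not None and current is None:
        return TouchEvent.TOUCH_END          # touch -> no touch
    if previous is not None and current is not None:
        if previous == current:
            return TouchEvent.LONG_TOUCH     # touched in both steps, same touch data
        return TouchEvent.TOUCH_MOVE         # touched in both steps, touch data changed
    return TouchEvent.NONE                   # no touch in either step
```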
  • the object display unit 33 is a part for displaying the object 40 on the logical LCD 37.
  • the object moving unit 34 is a part that moves the object 40 displayed on the logical LCD 37 based on a drag operation performed on the logical touch panel 38.
  • When the object 40 displayed on the left LCD 17 moves into the left specific area 41, the object display unit 33 enlarges and displays the object 40. More specifically, the object display unit 33 enlarges and displays the object 40 so that the portion of the object 40 displayed on the right LCD 19 is enlarged and displayed.
  • the left specific area 41 will be described in detail.
  • the left specific area 41 is a partial area of the left LCD 17 and an area close to the right LCD 19.
  • the left specific area 41 is an area adjacent to the right LCD 19.
  • the left specific area 41 is a band-shaped area.
  • Specifically, the left specific area 41 in the present embodiment is a rectangular area in the logical LCD 37 whose upper left corner coordinates are (599 - w/2, 0) and whose lower right corner coordinates are (599, 799), where w is the width of the object 40.
  • the right specific area 42 is a partial area of the right LCD 19 and is an area close to the left LCD 17.
  • the right specific area 42 is an area adjacent to the left LCD 17.
  • the right specific area 42 is a band-shaped area.
  • the right specific area 42 is a rectangular area in the logical LCD 37 whose upper left corner coordinates are (600, 0) and whose lower right corner coordinates are (600 + w / 2,799).
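  • An illustrative sketch of the left and right specific areas built from the corner coordinates given above, together with a point-in-area test; the class and function names are assumptions, and w is the width of the object 40.

```python
# Illustrative sketch of the left and right specific areas on the logical LCD,
# using the coordinates given above (upper-left origin, 600-pixel-wide left
# LCD, 800-pixel height implied by the y range 0-799). SpecificArea and
# build_specific_areas are assumed names.
from dataclasses import dataclass


@dataclass
class SpecificArea:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def build_specific_areas(w: float) -> tuple[SpecificArea, SpecificArea]:
    # Left specific area: (599 - w/2, 0) to (599, 799) on the logical LCD.
    left_area = SpecificArea(599 - w / 2, 0, 599, 799)
    # Right specific area: (600, 0) to (600 + w/2, 799) on the logical LCD.
    right_area = SpecificArea(600, 0, 600 + w / 2, 799)
    return left_area, right_area


left_area, right_area = build_specific_areas(w=100)
assert left_area.contains(560, 400)      # inside the left specific area
assert not left_area.contains(500, 400)  # too far from the boundary line
```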
  • the state of the control unit 15 transitions between three states of “drag state”, “drag interruption state”, and “no drag state”.
  • “Drag state” means a state in which the object 40 is being dragged.
  • “Drag interrupted state” means a state in which dragging of the object 40 is interrupted.
  • the “no drag state” means a state where the object 40 is dropped.
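  • These three states can be pictured as a small state machine; the following sketch uses assumed names, and its transition table anticipates the control flow described below.

```python
# A minimal sketch of the three states between which the control unit 15
# transitions. The enum values and the transition table are illustrative only.
from enum import Enum, auto


class DragState(Enum):
    DRAGGING = auto()          # "drag state": the object 40 is being dragged
    DRAG_INTERRUPTED = auto()  # "drag interruption state": dragging is interrupted
    NO_DRAG = auto()           # "no-drag state": the object 40 is dropped


# Allowed transitions, matching the control flow described below:
TRANSITIONS = {
    DragState.NO_DRAG: {DragState.DRAGGING},                              # long touch on the object
    DragState.DRAGGING: {DragState.DRAG_INTERRUPTED, DragState.NO_DRAG},  # release inside / outside the left specific area
    DragState.DRAG_INTERRUPTED: {DragState.DRAGGING, DragState.NO_DRAG},  # touched again / timeout elapses
}
```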
  • the object display unit 33 displays the object 40 on the left LCD 17 as shown in (a) of FIG. 9 (S110).
  • the initial state of the control unit 15 is a no-drag state.
  • The control unit 15 determines whether the state of the control unit 15 is the drag state (S120). When it is determined in S120 that the state of the control unit 15 is not the drag state (S120: NO), the control unit 15 advances the process to S130. In S130, the control unit 15 determines whether the state of the control unit 15 is the drag interruption state (S130). When it is determined in S130 that the state of the control unit 15 is not the drag interruption state (S130: NO), the control unit 15 determines whether the object 40 has been long touched, based on the touch event information (S140). Here, the control unit 15 determines whether or not the object 40 has been long touched based on the presence or absence of the long touch event information. If it is determined in S140 that the object 40 has not been long touched (S140: NO), the control unit 15 returns the process to S120.
  • If it is determined in S140 that the object 40 has been long touched (S140: YES), the control unit 15 changes the state of the control unit 15 from the no-drag state to the drag state (S150) and returns the process to S120.
  • In the drag state, the control unit 15 determines whether the object 40 has been released, based on the touch event information (S160). Here, the control unit 15 determines whether the object 40 has been released based on the presence or absence of the touch end event information. When it is determined in S160 that the object 40 has not been released (S160: NO), the object moving unit 34 moves the object 40 based on the touch event information (S170). Here, the control unit 15 moves the object 40 based on the difference between the touch data included in the touch event information of the previous step, stored in the RAM 31 in advance, and the touch data included in the touch event information of the current step.
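  • A minimal sketch of the movement in S170, assuming the position of the object 40 is tracked by its reference point in logical-LCD coordinates; the class and function names are illustrative.

```python
# Hedged sketch of the object movement in S170: the object is displaced by the
# difference between the previous step's and the current step's touch data.
# Object40 and move_object are assumed names.
from dataclasses import dataclass


@dataclass
class Object40:
    x: float  # logical-LCD coordinates of the object's reference point (center)
    y: float


def move_object(obj: Object40, prev_touch: tuple[int, int], curr_touch: tuple[int, int]) -> None:
    dx = curr_touch[0] - prev_touch[0]
    dy = curr_touch[1] - prev_touch[1]
    obj.x += dx
    obj.y += dy


obj = Object40(x=300, y=400)
move_object(obj, prev_touch=(310, 410), curr_touch=(330, 405))
assert (obj.x, obj.y) == (320, 395)
```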
  • Next, the object display unit 33 determines whether the object 40 is in the left specific area 41 (S180). Specifically, the object display unit 33 determines whether the reference point (reference pixel) of the object 40 is in the left specific area 41 (S180). In the present embodiment, the reference point of the object 40 is the center point of the object 40. If it is determined in S180 that the object 40 is not within the left specific area 41 (S180: NO), the object display unit 33 displays the object 40 in the same normal size as before, as shown in FIG. 9B (S190), and the process returns to S120. On the other hand, if it is determined in S180 that the object 40 is in the left specific area 41 (S180: YES), as shown in FIG. 9C, the object display unit 33 enlarges and displays the object 40 (S200), and the process returns to S120.
  • the enlarged display of the object 40 by the object display unit 33 will be described in detail.
  • the outer frame 40a before the object 40 is enlarged is indicated by a broken line
  • the outer frame 40b after the object 40 is enlarged is indicated by a solid line.
  • The object display unit 33 enlarges the object 40 so that the outer frame expands uniformly away from the reference point of the object 40, from the outer frame 40a to the outer frame 40b, as indicated by the thick arrows.
  • the object display unit 33 enlarges the object 40 with a constant aspect ratio, as indicated by a thick arrow.
  • The object display unit 33 enlarges the object 40 so that the portion of the object 40 displayed on the right LCD 19 is enlarged and displayed. Specifically, the object display unit 33 enlarges the object 40 so that the portion of the object 40 on the right LCD 19 side of the boundary line 50 is enlarged and displayed.
  • a boundary line 50 is a boundary line between the left LCD 17 and the right LCD 19.
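  • An illustrative sketch of this uniform enlargement about the reference point with a constant aspect ratio; the scale factor and the names Frame and enlarge_about_center are assumptions.

```python
# An illustrative enlargement of the object 40 about its reference point
# (its center), keeping the aspect ratio constant. The publication only states
# that the object is enlarged uniformly away from the reference point; the
# scale factor below is an arbitrary example.
from dataclasses import dataclass


@dataclass
class Frame:
    left: float
    top: float
    right: float
    bottom: float


def enlarge_about_center(frame: Frame, scale: float) -> Frame:
    cx = (frame.left + frame.right) / 2   # reference point x (center)
    cy = (frame.top + frame.bottom) / 2   # reference point y (center)
    half_w = (frame.right - frame.left) / 2 * scale
    half_h = (frame.bottom - frame.top) / 2 * scale
    # The outer frame expands uniformly away from the reference point,
    # so the aspect ratio stays constant.
    return Frame(cx - half_w, cy - half_h, cx + half_w, cy + half_h)


before = Frame(540, 360, 640, 440)         # 100 x 80 frame centered at (590, 400)
after = enlarge_about_center(before, 1.5)  # 150 x 120 frame with the same center
assert (after.right - after.left, after.bottom - after.top) == (150.0, 120.0)
# after.right == 665, past the boundary line 50 between the two LCDs (x = 599/600),
# so the portion on the right LCD 19 side grows, as described above.
```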
  • If it is determined in S160 that the object 40 has been released (S160: YES), the control unit 15 determines whether the object 40 is in the left specific area 41 (S210).
  • When it is determined in S210 that the object 40 is in the left specific area 41 (S210: YES), the control unit 15 changes the state of the control unit 15 from the drag state to the drag interruption state (S220), and returns the process to S120.
  • When it is determined in S210 that the object 40 is not in the left specific area 41 (S210: NO), the control unit 15 changes the state of the control unit 15 from the drag state to the no-drag state, drops the object 40 (S230), and returns the process to S120.
  • That is, when the object 40 is released within the left specific area 41, it is considered that the drag has been temporarily interrupted rather than the object 40 having been dropped.
  • If it is determined in S130 that the state of the control unit 15 is the drag interruption state (S130: YES), the control unit 15 starts the counter 35 (S240). Next, the control unit 15 determines whether the object 40 has been touched, based on the touch event information (S250). When it is determined in S250 that the object 40 has been touched, the object display unit 33 displays the object 40 in the normal size (S255), the control unit 15 changes the state of the control unit 15 from the drag interruption state to the drag state (S150), and the process returns to S120. On the other hand, when it is determined that the object 40 has not been touched, the control unit 15 determines whether the counter 35 has counted a predetermined time (S260).
  • When it is determined in S260 that the counter 35 has not counted the predetermined time (S260: NO), the control unit 15 returns the process to S250.
  • When it is determined in S260 that the counter 35 has counted the predetermined time (S260: YES), the object display unit 33 displays the object 40 in the normal size (S265), the control unit 15 changes the state of the control unit 15 from the drag interruption state to the no-drag state, the object 40 is dropped (S270), and the process returns to S120.
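  • The drag-interruption handling in S240 to S270 amounts to a timeout loop, sketched below with an assumed timeout value and helper names, and with wall-clock time standing in for the counter 35.

```python
# Rough sketch of the drag-interruption handling (S240-S270): after the object
# is released inside the left specific area, a counter runs; touching the
# object again resumes the drag, and if the predetermined time elapses first
# the object is dropped. The timeout value and helper names are assumptions.
import time


def handle_drag_interruption(obj, is_object_touched, timeout_s: float = 2.0) -> str:
    """Return 'resume' if the object is touched again in time, else 'drop'."""
    start = time.monotonic()                      # S240: start the counter
    while time.monotonic() - start < timeout_s:   # S260: predetermined time not yet counted
        if is_object_touched(obj):                # S250: object touched again?
            return "resume"                       # S255 / S150: normal size, back to the drag state
        time.sleep(0.01)                          # poll the touch event information
    return "drop"                                 # S265 / S270: normal size, drop the object


# Example: a predicate that never reports a touch drops the object after the timeout.
result = handle_drag_interruption(obj=None, is_object_touched=lambda _: False, timeout_s=0.05)
assert result == "drop"
```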
  • The second embodiment has been described above. The second embodiment has the following features.
  • The multi-display portable terminal 10 includes: a left TSD 14 (first touch screen display) having a left LCD 17 (first display) and a left touch panel 18 (first touch panel) provided to overlap the left LCD 17; a right TSD 16 (second touch screen display) having a right LCD 19 (second display) and a right touch panel 20 (second touch panel) provided to overlap the right LCD 19; an object display unit 33 (object display means) for displaying the object 40 on the logical LCD 37 (logical display) formed by integrating the left LCD 17 and the right LCD 19; and an object moving unit 34 (object moving means) for moving the object 40 displayed on the left LCD 17 based on a drag operation performed on the left touch panel 18.
  • When the object 40 displayed on the left LCD 17 moves into the left specific area 41 (specific area), which is a region of the left LCD 17 close to the right LCD 19, the object display unit 33 displays the object 40 in a special manner.
  • According to the above configuration, when the object 40 moves into the left specific area 41, it is displayed in a special manner. Therefore, the user can confirm that the control unit 15 of the multi-display portable terminal 10 has grasped the user's intention to drag the object 40 so that it straddles the adjacent left TSD 14 and right TSD 16.
  • As a result, when dragging the object 40 so as to straddle the adjacent left TSD 14 and right TSD 16, the user can execute the drag with peace of mind.
  • the object display unit 33 enlarges and displays the object 40. Such a display mode is easily recognized by the user.
  • The object display unit 33 enlarges and displays the object 40 so that the portion of the object 40 displayed on the right LCD 19 is enlarged and displayed.
  • According to this configuration, there is an advantage that the object 40 is easy to touch when dragging of the object 40 is resumed on the right TSD 16 in S250.
  • An information processing method of the multi-display portable terminal 10 includes an object display step (S110 in FIG. 6) of displaying the object 40 on a logical display formed by integrating the left LCD 17 and the right LCD 19, and an object moving step (S170 in FIG. 6) of moving the object 40 displayed on the left LCD 17 based on a drag operation performed on the left touch panel 18.
  • When the object 40 displayed on the left LCD 17 moves into the left specific area 41, which is a region of the left LCD 17 close to the right LCD 19, the object 40 is displayed in a special manner. According to the above method, when dragging the object 40 so as to straddle the adjacent left TSD 14 and right TSD 16, the user can perform the drag with peace of mind.
  • In the present embodiment, the width of the left specific area 41 and of the right specific area 42 is half of the width w of the object 40. Accordingly, regardless of the display size of the object 40, when the reference point (center point) of the object 40 enters the left specific area 41, the object 40 comes to be displayed across the left LCD 17 and the right LCD 19. However, the widths of the left specific area 41 and the right specific area 42 may instead be set freely.
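  • The point that the object 40 straddles the two LCDs as soon as its center point enters the left specific area 41 can be checked with a few lines of arithmetic; the object width used below is an arbitrary example value.

```python
# Quick check of the claim above: if the specific area is w/2 wide, then as
# soon as the object's center point crosses into the left specific area its
# right edge reaches the boundary between the two LCDs (x = 599/600).
BOUNDARY_LEFT_X = 599   # rightmost column of the left LCD on the logical LCD

w = 120                                   # example object width (assumed value)
area_left_edge = BOUNDARY_LEFT_X - w / 2  # left edge of the left specific area

center_x = area_left_edge                 # center just entering the specific area
right_edge = center_x + w / 2             # right edge of the object
assert right_edge >= BOUNDARY_LEFT_X      # the object already reaches the boundary,
                                          # i.e. it straddles the left and right LCDs
```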
  • In the above embodiment, the object display unit 33 enlarges the object 40 with a constant aspect ratio, as indicated by the thick arrows.
  • However, instead of this, the object display unit 33 may enlarge and display the object 40 so that it is stretched in a specific direction, as indicated by the thick arrow.
  • That is, the drag direction of the drag operation is determined based on the difference between the touch data included in the touch event information of the previous step, stored in the RAM 31 in advance, and the touch data included in the touch event information of the current step.
  • The object 40 is then enlarged and displayed so that it is stretched in the same direction as the drag direction.
  • In the illustrated example, the drag direction of the drag operation is the right direction on the drawing sheet, and the object 40 is enlarged and displayed so as to be stretched particularly toward the right LCD 19. Further, when the drag direction of the drag operation is diagonally upward to the right, the object 40 can be enlarged and displayed so as to be stretched particularly diagonally upward to the right.
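  • A sketch of this directional variant, using only the sign of the touch-data difference to choose the stretch direction; the stretch amount and the function name are assumptions.

```python
# Hedged sketch of the directional enlargement variant: the object is stretched
# in the same direction as the drag, computed from the touch-data difference.
# stretch_toward_drag and the stretch amount are illustrative assumptions.
def stretch_toward_drag(frame, prev_touch, curr_touch, stretch=40.0):
    """Return a new (left, top, right, bottom) frame stretched along the drag direction."""
    dx = curr_touch[0] - prev_touch[0]
    dy = curr_touch[1] - prev_touch[1]
    left, top, right, bottom = frame
    if dx > 0:
        right += stretch     # dragging toward the right LCD: stretch to the right
    elif dx < 0:
        left -= stretch
    if dy > 0:
        bottom += stretch
    elif dy < 0:
        top -= stretch       # e.g. dragging diagonally up-right stretches up and right
    return (left, top, right, bottom)


# Drag to the right on the page: only the right edge moves toward the right LCD.
assert stretch_toward_drag((540, 360, 640, 440), (500, 400), (520, 400)) == (540, 360, 680, 440)
```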
  • Non-transitory computer readable media include various types of tangible storage media (tangible storage medium).
  • Examples of non-transitory computer-readable media include magnetic recording media (e.g., flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (e.g., magneto-optical disks), CD-ROM (Read Only Memory), CD-R, CD-R/W, and semiconductor memories (e.g., mask ROM, PROM (Programmable ROM), EPROM (Erasable PROM), flash ROM, and RAM (Random Access Memory)).
  • The program may also be supplied to the computer by various types of transitory computer-readable media. Examples of transitory computer-readable media include electrical signals, optical signals, and electromagnetic waves.
  • A transitory computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire or an optical fiber, or via a wireless communication path.
  • 10 Multi-display portable terminal
  • 11 Base
  • 12 Cover
  • 13 Hinge
  • 14 Left TSD
  • 14a Short side
  • 14b Long side
  • 15 Control unit
  • 16 Right TSD
  • 16a Short side
  • 16b Long side
  • 17 Left LCD
  • 18 Left touch panel
  • 19 Right LCD
  • 20 Right touch panel
  • 33 Object display unit
  • 34 Object moving unit
  • 35 Counter
  • 36 Logical TSD
  • 37 Logical LCD
  • 38 Logical touch panel
  • 40 Object
  • 40a Outer frame
  • 40b Outer frame
  • 41 Left specific area
  • 42 Right specific area
  • 50 Boundary line

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

According to this invention, a multi-display portable terminal (10) is provided with: a left touch screen display (14); a right touch screen display (16); an object display unit (33) that displays objects (40) on a logical LCD (37) obtained by combining a left LCD (17) and a right LCD (19); and an object moving unit (34) that moves the objects (40) displayed on the left LCD (17) based on drag operations performed on the left LCD (17). If an object (40) displayed on the left LCD (17) is moved into a left specific area (41), the left specific area (41) being the region of the left LCD (17) close to the right LCD (19), the object display unit (33) displays that object (40) in a special manner.
PCT/JP2014/000219 2013-03-08 2014-01-17 Information processing device, information processing method, and non-transitory computer-readable medium WO2014136372A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-046067 2013-03-08
JP2013046067 2013-03-08

Publications (1)

Publication Number Publication Date
WO2014136372A1 (fr) 2014-09-12

Family

ID=51490912

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/000219 WO2014136372A1 (fr) 2013-03-08 2014-01-17 Information processing device, information processing method, and non-transitory computer-readable medium

Country Status (1)

Country Link
WO (1) WO2014136372A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000242393A (ja) * 1999-02-23 2000-09-08 Canon Inc Information processing apparatus and control method therefor
JP2005149322A (ja) * 2003-11-18 2005-06-09 Canon Inc Display device, information processing device, display system, and control method therefor
JP2012185297A (ja) * 2011-03-04 2012-09-27 Sharp Corp Multi-display system, information processing terminal device, information processing method, and computer program
JP2012216223A (ja) * 2012-06-08 2012-11-08 Casio Comput Co Ltd Display device

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 14760142; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 14760142; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: JP)