GB2358336A - Selecting alphanumeric characters or menu options by movement of the display device - Google Patents


Info

Publication number
GB2358336A
GB2358336A (application GB0026519A)
Authority
GB
United Kingdom
Prior art keywords
display
electronic device
portable electronic
motion
movement
Prior art date
Legal status
Granted
Application number
GB0026519A
Other versions
GB2358336B (en)
GB0026519D0 (en)
Inventor
George W. Schaupp, Jr.
Richard J Vilmur
Current Assignee
Motorola Solutions Inc
Original Assignee
Motorola Inc
Priority date
Filing date
Publication date
Application filed by Motorola Inc
Publication of GB0026519D0
Publication of GB2358336A
Application granted
Publication of GB2358336B
Legal status: Expired - Fee Related


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/02Input arrangements using manually operated switches, e.g. using keyboards or dials
    • G06F3/023Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
    • G06F3/0233Character input methods
    • G06F3/0236Character input methods using selection techniques to select from displayed items
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482Interaction with lists of selectable items, e.g. menus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/56Arrangements for indicating or recording the called number at the calling subscriber's set


Abstract

A portable electronic device (100) facilitates entry of user data and commands by detecting motion of the device (100). A display (112) is used to depict a plurality of user interface options (300, 500, 502). The user physically moves the device (100) to highlight and enter a desired character of data (302) or a command (504). For example, a virtual keypad (300) may be displayed, and when the device (100) is moved forward, backward, left, or right relative to the plane of the display (112) the next letter or number in that direction (or the opposite direction) may be highlighted. Then, by moving the device (100) downward (i.e., perpendicular to the plane of the display (112)) the highlighted letter or number may be entered as if it were typed on a real, dedicated keypad.

Description

2358336 APPARATUS AND METHODS FOR SELECTING A USER INTERFACE OPTION ON A
PORTABLE ELECTRONIC DEVICE
Field of the Invention
The present invention relates in general to a method and apparatus for selecting a user interface option on a portable electronic device, and in particular to entering data and/or selecting menu options via a graphical user interface associated with a portable electronic device by detecting motion of the device contemporaneous with a predetermined graphical display.
Background of the Invention
Advances in integrated circuit technology have allowed portable electronic devices, such as cellular telephones, to substantially decrease in size. These reduced-size devices are often preferred by users because they are lighter and provide greater portability. However, certain components of a typical portable electronic device cannot be reduced much further in size without significantly impairing the device's usability. One such component is the keypad. The typical cellular telephone keypad includes dialing digits 0 - 9, as well as a few other function keys such as "clear", "send", "end", etc. The keypad adds cost, mechanical design complexity, size, and weight to the cellular telephone. The minimum size of the keys is dictated by the size of the human finger. In other words, if the keys are too small, many users will find it difficult to dial the telephone.
Proposed solutions include the use of technologies such as voice recognition and touch screens to eliminate the keypad; however, these technologies have significant limitations and drawbacks. Voice recognition solutions may require extensive training and often fail to perform properly, thereby requiring the user to repeat certain commands. This repetition can be time consuming and frustrating for the user. Further, voice recognition algorithms require significant computational resources, thereby adding cost to the electronic device and/or delaying the responsiveness of the recognition process. Still further, voice recognition solutions lack privacy. For example, if the user is dialing a phone number in a public area, the phone number and/or name of the person called can be heard by people nearby. Touch screen technology solves the problems of privacy, accuracy, and computational resources. However, touch screens are expensive and often reduce the visual clarity of the underlying display. Touch screens are also similarly size-limited if the user is to make selections with a finger; alternatively, the touch screen requires the use of a stylus, which may become lost.
Brief Description of the Drawings
These and other features and advantages of the present invention will be apparent to those of ordinary skill in the art in view of the detailed description of the preferred embodiment which is made with reference to the drawings, a brief description of which is provided below.
FIG. 1 is a block diagram illustrating a portable electronic device.
FIG. 2 is a flowchart of a program that can be implemented by the portable electronic device of FIG. 1 to select alphanumeric characters via a user interface.
FIG. 3 is a line drawing of an exemplary user interface that can be implemented by the portable electronic device of FIG. 1 to display alphanumeric options.
FIG. 4 is a flowchart of a program that can be implemented by the portable electronic device of FIG. 1 to select menu items via a user interface.
FIG. 5 is a line drawing of an exemplary user interface that can be implemented by the portable electronic device of FIG. 1 to display menu options.
FIG. 6 is a flowchart of a program that can be implemented by the portable electronic device of FIG. 1 to execute a software routine in response to the detection of predefined display motions.
Detailed Description of the Preferred Embodiments
A portable electronic device depicts a plurality of user interface options on its display and allows a user to select one or more of the options by physically moving the device in a particular direction. For example, a virtual keypad may be displayed, and when the device is moved forward, backward, left, or right relative to the plane of the display, the next letter in that direction (or the opposite direction) may be highlighted. Then, by moving the device downward (i.e., perpendicular to the plane of the display) the highlighted letter may be entered as if it were typed on a real, dedicated keypad.
A portable electronic device 100 is shown in FIG. 1. In the preferred embodiment, the portable electronic device 100 is a wireless communication device such as a handheld cellular telephone or pager. However, the portable electronic device 100 may be a general purpose computing device, such as a personal digital assistant, or an application specific device, such as an electronic book or an electronic map. The portable electronic device is coupled to a power source 101. Preferably, the power source is a battery.
However, persons of ordinary skill in the art will readily appreciate that other power sources, such as, for example, an AC transformer's output converted to DC or a solar panel, may be used as the power source 101.
A controller 102 in the portable electronic device 100 may include a data memory 104, such as a random-access memory, a program memory 106, which may be in the form of a read-only memory (ROM), and a microprocessor 108, all of which may be interconnected by an address/data bus 109. In one embodiment, the program memory 106 electronically stores a computer program that implements all or part of the method described below with respect to FIGS. 2, 4, and 6. Preferably, the program is executed by the microprocessor 108. The program memory 106 may be loaded from a fixed memory device such as a hard drive, or the program memory 106 may be preloaded with firmware as is well known to persons of ordinary skill in the art. Some of the steps described in the method below may be performed manually or without the use of the portable electronic device 100.
Preferably, a motion detector 110 (via an A/D converter 111) and a display 112 are electronically coupled to the controller 102 via a conventional input/output (I/O) circuit 114. The motion detector 110 is also mechanically coupled to the display 112 such that the motion of the motion detector 110 corresponds to the motion of the display 112.
The motion detector 110 is preferably made up of two accelerometers, each of which detects changes in motion (e.g., accelerations). A first accelerometer is positioned parallel to the plane of the display 112 to detect motion of the device 100 parallel to the plane of the display 112 (i.e., along the x-axis and y-axis). The first accelerometer outputs two voltages, one for the x-axis and one for the y-axis, centered approximately around a static DC voltage point. As the device 100 is accelerated (or moved) in a positive or negative direction along the x-axis, the y-axis, or a combination thereof, the output voltages increase or decrease from the static DC voltage point. For example, if movement of the device 100 causes the x-axis output voltage to increase from the static DC voltage point, there is movement in an X direction and the acceleration in the X direction is positive (+). If movement of the device 100 causes the x-axis output voltage to decrease from the static DC voltage point, there is movement in the X direction and the acceleration in the X direction is negative (-). If there is no movement of the device 100 in the X direction, the x-axis output voltage does not vary and there is no acceleration in the X direction. The same holds true for the y-axis output voltage. A second accelerometer is positioned perpendicular to the plane of the display 112 to detect motion of the device 100 perpendicular to the plane of the display 112 (i.e., along the z-axis). The second accelerometer operates in a manner similar to that of the first accelerometer, except only one output voltage is used to track movement of the device in a Z direction. The accelerometers may be Analog Devices two-axis MEMS (micromachined) accelerometers, part number ADXL202, commonly available from many sources, or other suitable commercially available accelerometers.
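The voltage-to-direction logic described above may be sketched as follows. The static DC voltage point, deadband width, and sample values are illustrative assumptions only; actual values depend on the accelerometer and supply voltage used.

```python
# Illustrative sketch (not from the patent): interpreting one axis of an
# accelerometer's output voltage relative to the static DC voltage point.
STATIC_DC = 1.65   # assumed rest-point voltage (e.g., mid-supply), in volts
DEADBAND = 0.05    # assumed noise tolerance around the static point, in volts

def axis_motion(output_voltage):
    """Return +1 for positive, -1 for negative, 0 for no detected acceleration."""
    delta = output_voltage - STATIC_DC
    if delta > DEADBAND:
        return +1   # voltage rose above the static point: positive acceleration
    if delta < -DEADBAND:
        return -1   # voltage fell below the static point: negative acceleration
    return 0        # within the deadband: no motion along this axis
```

The same routine would be applied independently to the x-axis and y-axis outputs of the first accelerometer and to the single output of the second accelerometer.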
The controller 102 receives motion detection signals from the motion detector 110. In response to certain motion detection signals, the controller 102 causes the display 112 to show various predefined graphical depictions.
By moving the electronic device 100 in a particular direction contemporaneous with a particular display being shown, a user of the portable electronic device 100 may enter data and commands into the controller 102.
The I/O circuit 114 may also include a transmitter and receiver (not depicted) for electronically coupling the controller 102 to an antenna 116.
The antenna 116 may be used to transmit and/or receive information associated with user interface options and other information used in the process described below.
In the illustrated embodiment of FIG. 1, the device 100 lacks a dedicated keypad, such as a cellular telephone keypad having alphanumeric keys 0-9 and function keys "send", "end", "clear", etc., and thereby avoids the added cost, mechanical design complexity, size, and weight associated with such a dedicated keypad.
A flowchart of a program 200 that can be implemented by the portable electronic device 100 to select alphanumeric characters is illustrated in FIG. 2. Preferably, the programmed steps are executed by the controller 102.
Generally, the program 200 generates a series of graphical depictions indicative of selected (but not yet entered) alphanumeric characters in response to the motion of the display 112. The program 200 also allows a selected character to be entered in response to a motion of the display 112.
Alphanumeric characters include English letters, numbers, Roman numerals, Chinese characters, Kanji, Kana and/or any other human recognizable language symbol.
When the program 200 is initiated, the controller 102 initializes a "current character" variable to a default value at step 202. For example, a character located near the center of a virtual keypad may be selected as the default character in order to minimize the average "distance" to the desired character. Alternatively, the default character may be the last character entered. Subsequently, at step 204, the controller 102 generates a graphical user interface on the display 112 showing a plurality of characters 300 (e.g., a virtual keypad) with the current character 302 visually identified (e.g., highlighted, see FIG. 3).
The program 200 then enters a motion detection loop (steps 206 - 214). In this example, the motion detection loop looks for motion (e.g., acceleration) along the plane of the display 112 in a direction consisting essentially of a vector to the left (step 206), to the right (step 208), forward (step 210), backward (step 212), and motion perpendicular to the plane of the display 112 in a direction consisting essentially of a downward vector (step 214). For purposes of clarity and discussion, the plane of the display 112 extends co-planarly with a plane defined by the FIG. 3 drawing sheet on which the display 112 of the device 100 is depicted. Motion from a static situation or a change in velocity implies acceleration. Motions are preferably generated by the user holding the device 100 in his hand and then moving his hand, or motions may be generated by the user tapping a side of the device 100. Typically, there is always some motion occurring. Certain motions are below the threshold of ordinary detection and, therefore, cannot be considered by the program 200. Other motions, such as involuntary hand vibrations and/or slight hand movements, may be detected but are preferably ignored to avoid changing the display without deliberate hand movement by the user. At the other end of the spectrum, excessive accelerations may be ignored. For example, dropping the portable electronic device 100 on a hard surface typically creates an acceleration that is preferably considered too large to come from a deliberate hand movement.
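The windowing of acceptable motion magnitudes described above can be sketched as follows; the threshold values are invented for illustration.

```python
# Illustrative sketch: only motions between a minimum threshold (to reject
# hand tremor) and a maximum threshold (to reject drops and impacts) are
# treated as deliberate user input. Values are assumptions, in units of g.
MIN_THRESHOLD = 0.3
MAX_THRESHOLD = 3.0

def is_deliberate(magnitude):
    """True if the detected acceleration magnitude counts as deliberate."""
    return MIN_THRESHOLD < magnitude < MAX_THRESHOLD
```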
Although this embodiment looks for motion in only five directions, persons of ordinary skill in the art will readily appreciate that detection of motion in many other directions is possible. For example, a motion including both a significant forward vector and a significant right vector (i.e., diagonal motion) may be detected. Once detected, diagonal motion may produce a change to the display 112 (e.g., diagonal movement of a cursor) or, depending on the current depiction on the display, the diagonal motion may be ignored.
Once a motion with a magnitude greater than a minimum threshold and less than a maximum threshold is detected in one of the predefined directions, a corresponding step (steps 216 - 224, 400) is taken to enter a new mode of operation, produce a new depiction on the display 112 and/or enter the current character 302. In this embodiment, if motion is detected to the left (step 206), the current character 302 (or button, or icon, etc.) is moved to the left if possible (step 216). If there are no characters depicted to the left of the current character 302, no change is made to the current character 302.
Similarly, if motion is detected to the right (step 208), the current character 302 is moved to the right if possible (step 218). If forward motion is detected (step 210), the current character 302 is moved up if possible (step 220). Or, in this embodiment, a menu button 303 may be selected. If backward motion is detected (step 212), the current character 302 is moved down if possible (step 222). Once the current character 302 is updated, the program regenerates the depiction of the graphical user interface on the display 112 to show the new current character 302 highlighted (step 204). In an alternate embodiment, the current character 302 is moved in the direction opposite to the detected direction of motion of the display 112. In this manner, the user is presented with the appearance of the cursor position remaining stationary while the device 100 moves.
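The "move if possible" behavior of steps 216 - 222 can be illustrated with a small grid sketch. The 3 x 3 layout and the direction names are assumptions for illustration; the actual virtual keypad 300 of FIG. 3 may differ.

```python
# Illustrative sketch: clamped cursor movement over a virtual keypad grid.
KEYPAD = [["1", "2", "3"],
          ["4", "5", "6"],
          ["7", "8", "9"]]

def move_cursor(row, col, direction):
    """Return the new (row, col); unchanged if no key lies in that direction."""
    moves = {"left": (0, -1), "right": (0, 1),
             "forward": (-1, 0), "backward": (1, 0)}
    dr, dc = moves[direction]
    new_row, new_col = row + dr, col + dc
    if 0 <= new_row < len(KEYPAD) and 0 <= new_col < len(KEYPAD[0]):
        return new_row, new_col
    return row, col   # edge of the keypad: current character is unchanged
```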
The number of character positions that the current character 302 moves may vary. In one embodiment the current character 302 is shifted one position for each valid motion detected regardless of the magnitude of the motion. In another embodiment, the number of positions the current character 302 moves may be proportionate to the magnitude of the detected motion. For example, the range of motions between the minimum threshold and the maximum threshold may be divided into three ranges (e.g., low, medium and high). A detected motion may then be categorized by comparing the magnitude of the motion with the thresholds of the predetermined ranges.
If a motion in the low range is detected, the current character 302 may move one position (e.g., from "1" to "2"). If a motion in the medium range is detected, the current character 302 may move two positions (e.g., from "1" to "3"). If a motion in the high range is detected, the current character 302 may move three positions (e.g., from "1" to "4"). Persons of ordinary skill in the art will readily appreciate that many other associations between detected motion and cursor movement are possible.
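The three-range mapping described above can be sketched as follows; the threshold values and the equal division of the range are illustrative assumptions.

```python
# Illustrative sketch: dividing the valid motion range into low, medium,
# and high sub-ranges that move the cursor one, two, or three positions.
MIN_T, MAX_T = 0.3, 3.0   # assumed minimum and maximum thresholds

def positions_to_move(magnitude):
    """Return how many character positions a detected motion should move."""
    if not (MIN_T < magnitude < MAX_T):
        return 0                      # outside the valid window: ignored
    span = (MAX_T - MIN_T) / 3.0
    if magnitude < MIN_T + span:
        return 1                      # low range
    if magnitude < MIN_T + 2 * span:
        return 2                      # medium range
    return 3                          # high range
```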
Preferably, when the user has highlighted the character he wishes to enter, he produces a downward motion that is perpendicular to the plane of the display 112. Such downward movement is preferred because it is a movement that a user associates with pressing a conventional key or button and, thus, will be partially intuitive to the user. The program 200 detects the downward motion (step 214) and determines if a character 302 or the menu button 303 is currently highlighted (step 215). If a character 302 is currently highlighted, the program 200 preferably enters the current character 302 (step 224). If the menu button 303 is highlighted, the program 200 preferably enters a menu mode at step 400 (see FIGS. 4 and 5). Of course, buttons other than a menu button may be used. Entering an alphanumeric character preferably includes storing the character in memory 104 and displaying the character in an alphanumeric character input area 304 on the display 112 of the portable electronic device 100. In some embodiments, entering a numeric character also includes dialing the entered digit into a telephone system.
Once a character is entered, the program 200 preferably repeats (step 202).
In an alternate preferred embodiment, the user uses a predefined gesture motion (e.g., double tap the portable electronic device 100) to "enter" the current character or select a virtual button. This embodiment has the advantage of requiring one less accelerometer if the predefined gesture does not use z-axis acceleration. Avoiding use of z-axis acceleration reduces cost, complexity and thickness of the portable electronic device 100.
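A double-tap detector of the kind described might be sketched as follows; the window length and the timestamp representation are assumptions.

```python
# Illustrative sketch: two taps within a predefined time window count as
# a double tap that "enters" the current character.
DOUBLE_TAP_WINDOW = 0.4   # assumed maximum seconds between taps

def is_double_tap(tap_times):
    """tap_times: timestamps (in seconds) of detected taps, oldest first."""
    if len(tap_times) < 2:
        return False
    return (tap_times[-1] - tap_times[-2]) <= DOUBLE_TAP_WINDOW
```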
A flowchart of a program 400 that can be implemented by the portable electronic device 100 to select menu items 504 is illustrated in FIG. 4.
Preferably, the programmed steps are executed by the controller 102.
Generally, the program 400 generates a series of graphical depictions indicative of selected (but not yet entered) menu items 504 in response to the motion of the display 112. The program 400 also allows a selected menu item 504 to be entered in response to a motion of the display 112. Menu items include text, icons, and/or any other type of graphical symbol or string of symbols. Preferably, selection of a menu item executes a software routine associated with that menu item. For example, a menu item may cause the device 100 to enter or exit the alpha-entry mode described above.
When the program 400 is initiated, the controller 102 initializes a "current menu item" variable to a default value at step 402. For example, the title of the first drop-down menu 502 in a menu bar 500 may be selected (see FIG. 5). Subsequently, at step 404, the controller 102 generates a graphical user interface on the display 112 showing the menu bar 500 with the drop-down menu 502 and a selected menu item 504. The program 400 then enters a motion detection loop (steps 406 - 414). In this example, the motion detection loop looks for motion along the plane of the display 112 in a direction consisting essentially of a vector to the left (step 406), to the right (step 408), forward (step 410), backward (step 412), and motion perpendicular to the plane of the display 112 (step 414).
As described above, the magnitude and/or duration of certain motions may be above or below a predefined threshold. Preferably these motions are ignored to avoid changing the depiction on the display 112 without a deliberate hand movement by the user. Again, this embodiment looks for motion in only five directions. However, persons of ordinary skill in the art will readily appreciate that detection of motion in many other directions is possible.
Once a motion greater than a minimum threshold and less than a maximum threshold is detected in one of the predefined directions, a corresponding step (steps 416 - 424) is taken to produce a new depiction on the display 112 and/or execute a software routine associated with the current menu selection 504. In this embodiment, if motion is detected to the left (step 406), the previous menu 502 is shown if possible (step 416). If there is no menu to the left of the current menu, no change is made. Similarly, if motion is detected to the right (step 408), the next menu 502 is shown if possible (step 418). If forward motion is detected (step 410), the current menu item 504 is moved up if possible (step 420). If backward motion is detected (step 412), the current menu item 504 is moved down if possible (step 422). Once the current menu item 504 is updated, the program 400 regenerates the graphical user interface on the display 112 to show the new menu item 504 highlighted (step 404). In an alternate embodiment, the current menu item 504 is moved in the direction opposite to the detected direction of motion of the display 112. The number of positions moved in the menu 502 may vary. In one embodiment, the current menu item 504 is shifted one position for each motion regardless of the magnitude of the motion. In another embodiment, the number of positions moved may be proportionate to the magnitude of the detected motion, as described in detail above.
Preferably, when the user has the desired menu item 504 highlighted, he produces a downward motion that is perpendicular to the plane of the display 112. The program 400 detects the downward motion (step 414) and executes a software routine associated with the current menu item 504 (step 424). Preferably, the program 400 then repeats (step 402). In an alternate embodiment, the user may use a predefined gesture instead of the downward motion to conserve an accelerometer as described above.
A flowchart of a program that can be implemented by the portable electronic device of FIG. 1 to execute a software routine in response to the detection of predefined display motions (i.e., gestures) is shown in FIG. 6.
Preferably, the programmed steps are executed by the controller 102.
When the program 600 is initiated, the controller 102 waits for a signal from the motion detector 110 that a motion associated with the display 112 has been detected (step 602). When a motion is detected, the controller 102 executes a correlation software routine (step 604). The correlation software routine compares the detected motion, and optionally a series of previously detected motions stored in data memory 104, to one or more motion patterns stored in program memory 106 (e.g., a predefined gesture such as shaking right and left for "no" or shaking forward and backward for "yes"). The controller 102 then determines if the correlation software found a match with a predefined motion (step 606). If no match is found, the controller 102 adds the current motion to a series of motions stored in data memory 104 for use in subsequent correlation steps (step 608). If a match is found at step 606, the controller 102 may clear the series of motions stored in data memory 104 so they are not reused (step 610).
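One possible form of the correlation routine of steps 604 - 610 is sketched below. The motion tokens and stored patterns are assumptions; the patent does not specify how motions are encoded.

```python
# Illustrative sketch: compare the accumulated motion series against
# predefined gesture patterns such as the "yes" and "no" gestures above.
PATTERNS = {
    ("right", "left", "right", "left"): "no",
    ("forward", "backward", "forward", "backward"): "yes",
}

def correlate(history, new_motion):
    """Append new_motion and look for a pattern match (steps 604 - 610)."""
    history = history + [new_motion]
    for pattern, meaning in PATTERNS.items():
        if tuple(history[-len(pattern):]) == pattern:
            return meaning, []        # match found: clear the series (step 610)
    return None, history              # no match: keep the series (step 608)
```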
Subsequently, the controller 102 executes a display generation routine associated with the matched predefined motion and/or toggles (i.e., arms or disarms) an input mode (step 612). Preferably, the new display generated depends on the current display as well as the predefined motion. For example, if the current display asks the user if he would like to dial the displayed number, and the motion is determined to be a "yes"-like forward-and-backward motion, then the new display may be an indication that the number is being dialed. By arming and disarming the input mode, the user may avoid undesired motions being translated to user inputs. For example, a predefined gesture motion such as a double tap (i.e., two taps within a predefined time period) on a display 112 may cause the device 100 to enter an alpha-entry mode.
A method and apparatus for entering user data and commands into a portable electronic device by detecting motion of the device has been provided. Although the foregoing description focuses on display depictions with discrete cursor positions such as keys on a virtual keyboard and/or menu items, persons of ordinary skill in the art will readily appreciate that the resolution of cursor movement may be as small as the smallest changeable display element. For example, a user may use the aforementioned to move an arrow freely over a depiction of a map or move a "windowed display" over a larger virtual display. For example, the user may view a map, larger than the physical display 112, by scrolling in the direction of a detected motion or by scrolling in the opposite direction of the detected motion. Systems implementing the selection methods described herein can enjoy a rich graphical user interface without the added cost, complexity, size, and weight of a typical user input device.
The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many modifications and variations are possible in light of the above teachings. It is intended that the scope of the invention be limited not by this detailed description, but rather by the claims appended hereto.

Claims (23)

  What is claimed is:
  1. A method for selecting a user interface option of entering alphanumeric data in a user interface of a portable electronic device, the portable electronic device having a display, a motion detector and a memory, the method comprising:
    generating a depiction on the display of the portable electronic device, the depiction having a first alphanumeric character; detecting a movement associated with the display of the portable electronic device; and storing the first alphanumeric character in the memory of the portable electronic device in response to the detection of the movement.
  2. The method as defined in claim 1, wherein:
    generating a depiction on the display of the portable electronic device comprises generating a first depiction of the first alphanumeric character and a second alphanumeric character on the display, and detecting a movement associated with the display of the portable electronic device comprises detecting a first movement associated with the display in a first direction.
  3. The method as defined in claim 2, wherein:
    generating a depiction on the display of the portable electronic device further comprises generating a second depiction on the display, the second depiction visually identifying the first alphanumeric character in response to the detection of the first movement in the first direction, detecting a movement associated with the display of the portable electronic device further comprises detecting a second movement associated with the display in a second direction, and storing the first alphanumeric character in the memory of the portable electronic device comprises storing the first alphanumeric character in the memory in response to the detection of the second movement in the second direction.
  4. The method as defined in claim 3, wherein:
    detecting a first movement associated with the display comprises comparing a magnitude associated with the first movement to a predefined threshold, and detecting a second movement associated with the display of the portable electronic device comprises comparing a magnitude associated with the second movement to a predefined threshold.
  5. The method as defined in claim 1, further comprising displaying the first alphanumeric character in an alphanumeric input area on the display of the portable electronic device in response to the detection of the movement.
  6. The method as defined in claim 1, further comprising dialing the first alphanumeric character in response to the detection of the movement.
  7. The method as defined in claim 1, wherein generating a depiction comprises generating the first alphanumeric character as a character selected from the group of characters consisting of a letter, a number, a Roman numeral, a Chinese character, a Kanji character, a Kana character, and a punctuation character.
  8. A method for selecting a user interface option of selecting a menu item in a user interface of a portable electronic device, the portable electronic device having a display and a motion detector, the method comprising:
    generating a depiction on the display of the portable electronic device, the depiction having a first menu item; detecting a movement associated with the display of the portable electronic device; and executing a software routine associated with the first menu item in response to the detection of the movement.
  9. The method as defined in claim 8, wherein:
    generating a depiction on the display of the portable electronic device comprises generating a first depiction of the first menu item and a second menu item on the display, and detecting a movement associated with the display of the portable electronic device comprises detecting a first movement associated with the display in a first direction.
  10. The method as defined in claim 9, wherein:
    generating a depiction on the display of the portable electronic device further comprises generating a second depiction on the display, the second depiction visually identifying the first menu item in response to the detection of the first movement in the first direction, detecting a movement associated with the display of the portable electronic device further comprises detecting a second movement associated with the display in a second direction, and executing a software routine associated with the first menu item comprises executing a software routine associated with the first menu item in response to the detection of the second movement in the second direction.
  11. The method as defined in claim 9, wherein detecting a first movement associated with the display comprises comparing a magnitude associated with the first movement to a predefined threshold.
  12. The method as defined in claim 8, wherein generating the depiction comprises generating the first menu item as an icon.
  13. A portable electronic device comprising:
    a display; a motion detector associated with the display, the motion detector being adapted to generate a motion detection signal in response to movement of the display; and a controller operatively coupled to the display and the motion detector, the controller causing the display to generate a user interface depiction, the user interface depiction being indicative of at least one user input option, the controller, in response to the motion detection signal, selecting the at least one user input option.
  14. The portable electronic device as defined in claim 13, wherein:
    the motion detector is adapted to generate a first motion detection signal in response to a first motion of the display and a second motion detection signal in response to a second motion of the display, and the controller is further adapted to cause the display to generate a first user interface depiction indicative of a plurality of user input options in response to receiving the first motion detection signal, and to cause the display to generate a second user interface depiction in response to receiving the second motion detection signal, the second user interface depiction being indicative of a selected user input option.
  15. The portable electronic device as defined in claim 13, wherein the portable electronic device lacks a dedicated keypad.
  16. The portable electronic device as defined in claim 13, wherein the controller is further adapted to dial a digit.
  17. The portable electronic device as defined in claim 13, wherein the portable electronic device comprises a wireless communication device.
  18. The portable electronic device as defined in claim 13, wherein the portable electronic device comprises one of a cellular telephone, a pager, and a personal digital assistant.
  19. A method for selecting a user interface option of invoking a software routine in a user interface of a portable electronic device, the portable electronic device having a display, a motion detector and a memory, the method comprising:
    detecting a motion associated with the display; executing a correlation software routine in response to the detection of the motion of the display; determining if the motion associated with the display correlates to a predefined motion; and executing the software routine in response to determining that the motion associated with the display correlates to the predefined motion.
  20. The method as defined in claim 19, wherein executing the software routine comprises executing a display generation routine.
  21. The method as defined in claim 20, wherein executing the display generation routine comprises generating a windowed display associated with a larger virtual display.
  22. The method as defined in claim 19, wherein executing the software routine comprises executing an input mode arming/disarming software routine.
  23. The method as defined in claim 19, further comprising: storing the motion associated with the display in the memory when the motion associated with the display does not correlate to a predefined motion; and clearing previous motions from the memory when the motion associated with the display correlates to a predefined motion.
GB0026519A 1999-11-03 2000-10-31 Apparatus and methods for selecting a user interface option on a portable electronic device Expired - Fee Related GB2358336B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US43298699A 1999-11-03 1999-11-03

Publications (3)

Publication Number Publication Date
GB0026519D0 GB0026519D0 (en) 2000-12-13
GB2358336A true GB2358336A (en) 2001-07-18
GB2358336B GB2358336B (en) 2002-09-25

Family

ID=23718391

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0026519A Expired - Fee Related GB2358336B (en) 1999-11-03 2000-10-31 Apparatus and methods for selecting a user interface option on a portable electronic device

Country Status (3)

Country Link
KR (1) KR20010051396A (en)
CN (1) CN1119051C (en)
GB (1) GB2358336B (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003081879A1 (en) * 2002-03-26 2003-10-02 Nokia Oyj User interface for a portable telecommunication device
DE10341580A1 (en) * 2003-09-09 2005-03-31 Siemens Ag Input device for a data processing system
GB2407664A (en) * 2003-01-31 2005-05-04 Roke Manor Research Method of secure network communication, data input via a graphical representation of a key pad and selective viewing of an area of a display
EP1703706A1 (en) * 2005-02-23 2006-09-20 Samsung Electronics Co., Ltd. Apparatus and method for controlling menu navigation in a terminal by using an inertial sensor in said terminal
US7567818B2 (en) 2004-03-16 2009-07-28 Motionip L.L.C. Mobile device with wide-angle optics and a radiation sensor
EP2262221A1 (en) * 2006-08-07 2010-12-15 Samsung Electronics Co., Ltd. Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
WO2014011785A1 (en) * 2012-07-13 2014-01-16 Symbol Technologies, Inc. Device and method for performing a functionality
EP2734916A1 (en) * 2011-07-21 2014-05-28 Sony Corporation Information processing apparatus, information processing method, and program
US8994644B2 (en) 2007-01-26 2015-03-31 Apple Inc. Viewing images with tilt control on a hand-held device
US9202095B2 (en) 2012-07-13 2015-12-01 Symbol Technologies, Llc Pistol grip adapter for mobile device
US9697393B2 (en) 2015-11-20 2017-07-04 Symbol Technologies, Llc Methods and systems for adjusting mobile-device operating parameters based on housing-support type
US9727095B2 (en) 2001-05-16 2017-08-08 Apple Inc. Method, device and program for browsing information on a display
US9817546B2 (en) 2009-05-19 2017-11-14 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list

Families Citing this family (21)

Publication number Priority date Publication date Assignee Title
CN100432912C (en) * 2004-05-07 2008-11-12 索尼株式会社 Mobile electronic apparatus, display method, program and graphical interface thereof
KR20100066036A (en) 2008-12-09 2010-06-17 삼성전자주식회사 Operation method and apparatus for portable device
US20130238992A1 (en) * 2012-03-08 2013-09-12 Motorola Mobility, Inc. Method and Device for Content Control Based on Data Link Context
US10521188B1 (en) 2012-12-31 2019-12-31 Apple Inc. Multi-user TV user interface
CN111782128B (en) 2014-06-24 2023-12-08 苹果公司 Column interface for navigating in a user interface
CN111078110B (en) * 2014-06-24 2023-10-24 苹果公司 Input device and user interface interactions
DK201670582A1 (en) 2016-06-12 2018-01-02 Apple Inc Identifying applications on which content is available
DK201670581A1 (en) 2016-06-12 2018-01-08 Apple Inc Device-level authorization for viewing content
US11966560B2 (en) 2016-10-26 2024-04-23 Apple Inc. User interfaces for browsing content from multiple content applications on an electronic device
DK201870354A1 (en) 2018-06-03 2019-12-20 Apple Inc. Setup procedures for an electronic device
CN113906419A (en) 2019-03-24 2022-01-07 苹果公司 User interface for media browsing application
WO2020198237A1 (en) 2019-03-24 2020-10-01 Apple Inc. User interfaces including selectable representations of content items
US11683565B2 (en) 2019-03-24 2023-06-20 Apple Inc. User interfaces for interacting with channels that provide content that plays in a media browsing application
CN113940088A (en) 2019-03-24 2022-01-14 苹果公司 User interface for viewing and accessing content on an electronic device
CN113906380A (en) 2019-05-31 2022-01-07 苹果公司 User interface for podcast browsing and playback applications
US11863837B2 (en) 2019-05-31 2024-01-02 Apple Inc. Notification of augmented reality content on an electronic device
CN110859640A (en) * 2019-11-13 2020-03-06 先临三维科技股份有限公司 Scanner, operation method, device and system thereof, storage medium and processor
US11843838B2 (en) 2020-03-24 2023-12-12 Apple Inc. User interfaces for accessing episodes of a content series
US11899895B2 (en) 2020-06-21 2024-02-13 Apple Inc. User interfaces for setting up an electronic device
US11720229B2 (en) 2020-12-07 2023-08-08 Apple Inc. User interfaces for browsing and presenting content
US11934640B2 (en) 2021-01-29 2024-03-19 Apple Inc. User interfaces for record labels

Citations (3)

Publication number Priority date Publication date Assignee Title
EP0825514A2 (en) * 1996-08-05 1998-02-25 Sony Corporation Information processing device and method for inputting information by operating the overall device with a hand
JPH1195910A (en) * 1997-09-17 1999-04-09 Citizen Watch Co Ltd Pointing device
WO1999022338A1 (en) * 1997-10-28 1999-05-06 British Telecommunications Public Limited Company Portable computers

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
JPH0764754A (en) * 1993-08-24 1995-03-10 Hitachi Ltd Compact information processor
JPH10240436A (en) * 1996-12-26 1998-09-11 Nikon Corp Information processor and recording medium

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
EP0825514A2 (en) * 1996-08-05 1998-02-25 Sony Corporation Information processing device and method for inputting information by operating the overall device with a hand
JPH1195910A (en) * 1997-09-17 1999-04-09 Citizen Watch Co Ltd Pointing device
WO1999022338A1 (en) * 1997-10-28 1999-05-06 British Telecommunications Public Limited Company Portable computers

Non-Patent Citations (1)

Title
WPI Abstract Accession No. 1999-292594 & JP 11 095 910 A *

Cited By (25)

Publication number Priority date Publication date Assignee Title
US9727095B2 (en) 2001-05-16 2017-08-08 Apple Inc. Method, device and program for browsing information on a display
US7593748B2 (en) 2002-03-26 2009-09-22 Panu Korhonen User interface for a portable telecommunication device
WO2003081879A1 (en) * 2002-03-26 2003-10-02 Nokia Oyj User interface for a portable telecommunication device
GB2407664A (en) * 2003-01-31 2005-05-04 Roke Manor Research Method of secure network communication, data input via a graphical representation of a key pad and selective viewing of an area of a display
US7933971B2 (en) 2003-01-31 2011-04-26 Roke Manor Research Limited Method for secure communication over a public data network via a terminal that is accessible to multiple users
DE10341580A1 (en) * 2003-09-09 2005-03-31 Siemens Ag Input device for a data processing system
US7567818B2 (en) 2004-03-16 2009-07-28 Motionip L.L.C. Mobile device with wide-angle optics and a radiation sensor
EP1703706A1 (en) * 2005-02-23 2006-09-20 Samsung Electronics Co., Ltd. Apparatus and method for controlling menu navigation in a terminal by using an inertial sensor in said terminal
EP2302880A1 (en) * 2005-02-23 2011-03-30 Samsung Electronics Co., Ltd. Apparatus and method for controlling menu navigation in a terminal by using an inertial sensor in said terminal
EP2262221A1 (en) * 2006-08-07 2010-12-15 Samsung Electronics Co., Ltd. Portable terminal and user interface control method thereof based on pattern recognition and analysis of image captured by camera
US8994644B2 (en) 2007-01-26 2015-03-31 Apple Inc. Viewing images with tilt control on a hand-held device
US11029816B2 (en) 2009-05-19 2021-06-08 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list
US9817546B2 (en) 2009-05-19 2017-11-14 Samsung Electronics Co., Ltd. Mobile device and method for executing particular function through touch event on communication related list
EP2734916A1 (en) * 2011-07-21 2014-05-28 Sony Corporation Information processing apparatus, information processing method, and program
EP2734916A4 (en) * 2011-07-21 2015-03-18 Sony Corp Information processing apparatus, information processing method, and program
US10915188B2 (en) 2011-07-21 2021-02-09 Sony Corporation Information processing apparatus, information processing method, and program
US9489070B2 (en) 2011-07-21 2016-11-08 Sony Corporation Information processing apparatus, information processing method, and program
US10416785B2 (en) 2011-07-21 2019-09-17 Sony Corporation Information processing apparatus, information processing method, and program
US9791896B2 (en) 2012-07-13 2017-10-17 Symbol Technologies, Llc Device and method for performing a functionality
WO2014011785A1 (en) * 2012-07-13 2014-01-16 Symbol Technologies, Inc. Device and method for performing a functionality
US9704009B2 (en) 2012-07-13 2017-07-11 Symbol Technologies, Llc Mobile computing device including an ergonomic handle and thumb accessible display while the handle is gripped
GB2518564B (en) * 2012-07-13 2020-12-09 Symbol Technologies Llc Device and method for performing a functionality
US9202095B2 (en) 2012-07-13 2015-12-01 Symbol Technologies, Llc Pistol grip adapter for mobile device
GB2518564A (en) * 2012-07-13 2015-03-25 Symbol Technologies Inc Device and method for performing a functionality
US9697393B2 (en) 2015-11-20 2017-07-04 Symbol Technologies, Llc Methods and systems for adjusting mobile-device operating parameters based on housing-support type

Also Published As

Publication number Publication date
KR20010051396A (en) 2001-06-25
GB2358336B (en) 2002-09-25
GB0026519D0 (en) 2000-12-13
CN1295419A (en) 2001-05-16
CN1119051C (en) 2003-08-20

Similar Documents

Publication Publication Date Title
GB2358336A (en) Selecting alphanumeric characters or menu options by movement of the display device
US6861946B2 (en) Motion-based input system for handheld devices
US7336263B2 (en) Method and apparatus for integrating a wide keyboard in a small device
US9639267B2 (en) Quick gesture input
US8094938B2 (en) Apparatus and method for handwriting recognition
US8707195B2 (en) Devices, methods, and graphical user interfaces for accessibility via a touch-sensitive surface
US20090249203A1 (en) User interface device, computer program, and its recording medium
US20050223342A1 (en) Method of navigating in application views, electronic device, graphical user interface and computer program product
US7580029B2 (en) Apparatus and method for handwriting recognition
US20060017711A1 (en) Form factor for portable device
US20030006967A1 (en) Method and device for implementing a function
EP2112581A1 (en) Method and apparatus for operating graphic menu bar and recording medium using the same
KR101135071B1 (en) Method, Touch Screen Terminal And Computer-Readable Recording Medium with Program for Presenting Contents List
EP2075681A2 (en) Method of displaying menu items and related touch screen device
EP2033064A1 (en) Mobile device with virtual keypad
JP2003529130A (en) Integrated keypad system
WO2004109441A2 (en) Improved user interface for character entry using a minimum number of selection keys
CN103324428A (en) Electronic apparatus and method for symbol input
US20070205991A1 (en) System and method for number dialing with touch sensitive keypad
KR20100089376A (en) Method, touch screen terminal and computer-readable recording medium with program for releasing of locking touch screen
WO2002088853A1 (en) Motion-based input system for handheld devices
JP5667632B2 (en) Electronic device and control method thereof
KR100859010B1 (en) Apparatus and method for handwriting recognition
KR100763042B1 (en) Method and device for operating a user-input area on an electronic display device
KR102266426B1 (en) Smartphone control method using breath

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20061031