CN112041804A - Device operation control - Google Patents

Device operation control

Info

Publication number
CN112041804A
CN112041804A (application CN201980029126.2A)
Authority
CN
China
Prior art keywords
motion
type
user input
display regions
information displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201980029126.2A
Other languages
Chinese (zh)
Inventor
Maria Francisca Jones (马里亚·弗朗西斯卡·琼斯)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Maria Francisca Jones
Original Assignee
Maria Francisca Jones
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Maria Francisca Jones
Publication of CN112041804A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/163 Wearable computers, e.g. on a belt
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00 Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16 Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163 Indexing scheme relating to constructional details of the computer
    • G06F2200/1637 Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of a handheld computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An apparatus comprising: a display for displaying information in a plurality of separate display areas, each display area being separately identifiable; a memory for storing motion type control data comprising an identification of a type of motion and a corresponding operation to be performed on information displayed in one or more identified display regions; motion detection means for detecting motion of the device; user input means for receiving user input during a learning phase, the user input identifying an operation to be performed on information displayed in the one or more identified display regions; and a processor programmed to, in the learning phase, when receiving a user input identifying an operation to be performed on information displayed in the one or more identified display regions, identify a type of motion from the motion detected by the motion detection means and store the received user input in the motion type control data in correspondence with data identifying the type of motion, and, after the learning phase, when the motion detection means detects motion, identify the type of motion using the motion type control data and perform the corresponding operation on the information displayed in the one or more identified display regions.

Description

Device operation control
Technical Field
The present invention relates to an apparatus and method for controlling an apparatus using a detected type of motion.
Background
Portable and mobile devices such as mobile phones, tablet computer devices, and laptop computers are widely used. The challenge for designers of portable devices is to design a device that is lightweight, battery efficient, and easy to use. One aspect of ease of use is that the user can easily enter control commands.
Disclosure of Invention
The present invention provides an apparatus comprising: a display for displaying information in a plurality of separate display areas, each display area being separately identifiable; a memory for storing motion type control data comprising an identification of a motion type and a corresponding operation to be performed on information displayed in one or more identified display regions; motion detection means for detecting motion of the device; user input means for receiving user input during a learning phase, the user input identifying an operation to be performed on information displayed in the one or more identified display regions; and a processor programmed to, in the learning phase, when receiving user input identifying an operation to be performed on information displayed in the one or more identified display regions, identify a type of motion from the motion detected by the motion detection means and store the received user input in the motion type control data in correspondence with data identifying the type of motion, and, after the learning phase, when the motion detection means detects motion, identify the type of motion using the motion type control data and perform the corresponding operation on information displayed in the one or more identified display regions.
The present invention also provides a method of controlling the display of information in a plurality of separate display areas on a device, the device having a motion detector device for detecting motion of the device, each display area being separately identifiable, the method comprising: during a learning phase: receiving user input identifying an operation to be performed on information displayed in one or more identified display regions; detecting motion of the device using the motion detector device; identifying a type of motion from the motion detected by the motion detector device; storing the received user input as motion type control data in correspondence with data identifying the type of motion; and repeating the receiving, detecting, identifying and storing steps; and, in an operational phase: detecting motion of the device using the motion detector device; and identifying a type of motion using the motion type control data and performing a corresponding operation on information displayed in one or more identified display regions.
The present invention also provides an apparatus comprising: a memory for storing control data including an identification of a motion type, a corresponding control operation, and corresponding gesture type data; motion detection means for detecting motion of the device; gesture detection means for detecting a gesture; user input means for receiving user input identifying a control operation during a learning phase; and a processor programmed to: in the learning phase, when a user input identifying a control operation is received, a motion type is identified from the motion detected by the motion detection means, the received user input is stored in the control data in correspondence with data identifying the motion type and corresponding gesture type data is determined, and after the learning phase, when the motion detection means detects motion, the control data is used to identify the motion type and perform the corresponding control operation, or when the gesture detection means detects a gesture, the control data is used to identify the gesture type and perform the corresponding control operation.
Drawings
FIG. 1 is a schematic diagram illustrating an example handheld device;
FIG. 2 is a schematic diagram showing an alternative display of an example apparatus;
FIG. 3 is a schematic diagram showing an alternative display of an example apparatus;
FIG. 4 is a schematic diagram showing electronic components of an example apparatus;
FIG. 5 is a flow diagram illustrating an example learning phase; and
FIG. 6 is a flow chart illustrating an example operational phase.
Detailed Description
In the following detailed description, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the inventive subject matter may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the embodiments, and it is to be understood that other embodiments may be utilized, and that structural, logical and electrical changes may be made without departing from the scope of the inventive subject matter. Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed.
The following description is, therefore, not to be taken in a limiting sense, and the scope of the present subject matter is defined by the appended claims and their equivalents.
In the following embodiments, like parts are denoted by like reference numerals.
In the following embodiments, the term data store or memory is intended to encompass any computer-readable storage medium and/or device (or collection of data storage media and/or devices). Examples of data storage include, but are not limited to, optical disks (e.g., CD-ROM, DVD-ROM, etc.), magnetic disks (e.g., hard disk, floppy disk, etc.), memory circuits (e.g., solid state drive, Random Access Memory (RAM), etc.), and so forth.
In one embodiment, the functions or algorithms described herein are implemented in hardware, software, or a combination of software and hardware. The software includes computer-executable instructions stored on a computer-readable carrier medium such as a memory or other type of storage device. Further, the described functions may correspond to modules, which may be software, hardware, firmware, or any combination thereof. Multiple functions are performed in one or more modules as desired, and the described embodiments are merely examples. The software is executed on a digital signal processor, ASIC, microprocessor, or other type of processor.
Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the exemplary process flow is applicable to software, firmware, and hardware implementations.
A general embodiment provides an apparatus comprising: a display for displaying information in a plurality of independent display areas, each display area being independently identifiable; a memory for storing motion type control data comprising an identification of a motion type and a corresponding operation to be performed on information displayed in one or more identified display regions; motion detection means for detecting motion of the device; user input means for receiving user input identifying an operation to be performed on information displayed in the one or more identified display regions during a learning phase; and a processor programmed to, in the learning phase, when receiving user input identifying an operation to be performed on information displayed in the one or more identified display regions, identify a type of motion from the motion detected by the motion detection means and store the received user input in the motion type control data in correspondence with data identifying the type of motion, and, after the learning phase, when the motion detection means detects motion, identify the type of motion using the motion type control data and perform the corresponding operation on information displayed in the one or more identified display regions.
A user of the device is able to train the device on a preferred type of motion to be used as input to cause an operation to be performed on information displayed in one or more of the identified display regions. In this way, the user can adjust the manner in which the user interacts with a particular identified display region of the device to select a preferred type of motion for a particular operation to be performed on information displayed in one or more of the identified display regions. The identification of the motion type may include recorded motion data for each motion type for matching with the detected motion data to identify the motion type from the motion data. In one embodiment, not all motion types need to be learned, and some motion types may be pre-stored.
The present invention is applicable to portable devices, but is not limited thereto. The device may be any device capable of being moved in a reciprocating, rotating or rocking motion. Such devices are typically portable and wireless, but may equally be movable while connected by wires. The device may be part of another apparatus and thus removable in use to allow a user to shake, rotate or reciprocate it to control the device. The device may comprise a mobile phone, a tablet computer, a wearable device such as a smart watch or a personal monitor device, a laptop computer, or a machine control device.
The operations performed on the information displayed in the one or more identified display regions may include any operation on the information in the display region, such as cutting, pasting, inserting, or deleting the information displayed in the one or more display regions of the display. The operation may be limited to operation on one display area, a subset of display areas, or all display areas.
The type of motion that the user may use to identify and perform a control operation may include, for example, any one of a linear motion, a reciprocating motion, a rotational motion, a swinging motion, a seesaw motion, a shaking motion, or a rocking motion. The displacement or amplitude of the "shake," the frequency or rate of the "shake," and/or the speed or force of the "shake" may also be a component of the type of motion of the device. The motion types may include a single motion type or a combination of motion types, e.g., translational reciprocating motion and rotational motion, and the motion may be in multiple directions. A motion threshold may be set whereby the type of motion is identified from the motion data only when the speed or magnitude of the motion is above the threshold. This avoids erroneous motion type recognition and hence erroneous execution of control operations. The motion may comprise a complex mixture or aggregation of motions or motion types (e.g., rotation with simultaneous shaking), or a series of motions (e.g., rotation after shaking, or rotation in one direction followed by rotation in another direction). The series of movements may thus be a series of the same movement type or of different movement types.
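By way of illustration only, a minimal sketch of such a threshold gate is given below, assuming raw three-axis accelerometer samples; the threshold value and all names are invented for this example and are not taken from the patent.

```python
import math

# Hypothetical magnitude threshold; the description only says a threshold
# "may be set" so that weak, incidental movement is ignored.
MOTION_THRESHOLD = 2.5  # arbitrary accelerometer units

def exceeds_threshold(samples):
    """Return True only if some (x, y, z) sample is strong enough to
    justify attempting motion-type identification."""
    return any(math.sqrt(x * x + y * y + z * z) > MOTION_THRESHOLD
               for (x, y, z) in samples)
```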
To reduce errors when the device identifies a type of motion, during the learning phase, after user input identifying an operation to be performed on information displayed in one or more identified display regions has been received, the device may require the user to perform the type of motion more than once, so that the motion is detected more than once and the motion pattern recorded each time; some form of average of the desired motion pattern can then be stored as the data identifying the type of motion in the motion type control data. This smoothing or averaging of the recorded motions helps the processor accommodate deviations in the motion data received during the operational phase, when the user performs a motion intended to trigger an operation on information displayed in the one or more identified display regions. It reduces the error in matching motion data from the motion detection means against the motion data stored in the motion type control data.
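A minimal sketch of this averaging step follows, assuming the N repetitions have already been resampled to equal length and aligned in time (a real device would need to handle that explicitly); the function and variable names are invented for illustration.

```python
import numpy as np

def average_recordings(recordings):
    """Average N repetitions of the same learned motion and keep the
    per-sample spread as a matching tolerance for the operational phase.

    `recordings` is assumed to be a list of N equal-length traces of
    (x, y, z) accelerometer samples.
    """
    stack = np.stack([np.asarray(r, dtype=float) for r in recordings])
    mean_pattern = stack.mean(axis=0)  # "some form of average" per the text
    deviation = stack.std(axis=0)      # expected deviation, stored alongside
    return mean_pattern, deviation
```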
In the learning phase, the user may input a sequence of user inputs representing a sequence of desired motion types to be used for performing a corresponding operation. Thus, in the learning phase, user input may be received, and subsequently motion data may be input in successive pairs to generate motion-type control data for a plurality of control operations to be performed on information displayed in one or more identified display regions.
In the learning phase, a user selection of one or more display regions may be received to identify the one or more display regions as part of the user input. The user selection of one or more display regions may be made using a pointing or touch input, or simply by selecting a display region from a list of display regions listed by the identifier.
In the learning phase, a menu may be displayed to the user to allow the user to generate user input simply by selecting a displayed menu item and one or more display regions to which an operation is to be applied.
In an alternative example, in the learning phase, the apparatus may have a set list of control operations, and the user input comprises a selection to work through the list so as to learn a type of motion for each listed operation to be performed on information displayed in the one or more identified display regions. Thus, in the learning phase, the device simply requires the user to perform the required type of movement of the device in sequence for each of the listed operations to be performed on the information displayed in the one or more identified display regions. The device may indicate to the user when each type of motion should be performed for each control operation. This may be achieved, for example, by a displayed message, an indicator light, or a sound output.
The motion type and control operation can be learned for individual users. For example, the user may enter their name or signature to use the display device, and the learned motion types and control operations are then stored for use by the user when the user enters their name or signature.
The user may use each region of the display simultaneously. In addition, the display screen areas may be positioned according to the user's needs and may be preset or set by the user. The user may use motion control to manipulate a display area, e.g., pointing to select and position it, or to reposition a portion of the display screen area or the data within a selected display area.
In one embodiment, the user can select and control a display area and can open and adjust it, for example by zooming in on it or by tapping on the selected display area, to open the display area to a size required or desired by the user. This may enable the user to fully open the display area, e.g., to cover the entire screen.
In one embodiment, the device comprises a flexible device having one or more displays and one or more display regions, wherein display data can be transferred between the one or more display regions, including across one or more faces of the device. Devices having displays on both sides may include electronic document readers capable of displaying and controlling the displayed data. For example, data may be displayed on front side A of the device and then transferred to side B, the back side of the device.
In one embodiment, the device comprises a flexible device, and data may be displayed on one or more folded or bent regions of the flexible device, and a user may use motions or gestures to transition data between display regions on the same display or another display (if more than one display is provided). The user input device may include any conventional device input device, such as a keyboard, a pointing device (e.g., a mouse, track pad, or trackball), a touch-sensitive display, and/or a pen. The pen allows the user to enter handwritten text and graphics.
The device may have a display, which may include one or more fixed or flexible displays. One or more of the displays may be detachable. The displays may be of different types, e.g. one flexible and one rigid.
One aspect provides a carrier medium, such as a non-transitory storage medium storing code for execution by a processor of a machine to implement the method, or a transitory medium carrying processor-executable code for execution by a processor of a machine to implement the method. Embodiments may be implemented in programmable digital logic implementing computer code. The code may be provided to programmable logic, such as a processor or microprocessor, on a carrier medium. One such embodiment of a carrier medium is a transitory medium, i.e., a signal such as an electrical, electromagnetic, acoustic, magnetic, or optical signal. Another form of carrier medium is a non-transitory storage medium storing the code, such as a solid state memory, a magnetic medium (hard drive) or an optical medium (compact disc (CD) or Digital Versatile Disc (DVD)).
In one example, the apparatus may be used as a display unit in any example of a display device disclosed in co-pending application GB1722249.8 filed on 29 December 2017, the contents of which are incorporated herein by reference in their entirety, or the apparatus may be used with the apparatus disclosed in co-pending British patent application GB1805278.7 entitled "Display Apparatus", filed on the same date as the present application, the contents of which are incorporated herein by reference in their entirety.
Specific embodiments will now be described with reference to the accompanying drawings.
Fig. 1 is a schematic diagram showing a hand-held device 1. The device 1 may comprise any handheld device, such as a mobile phone.
The direction of movement that the device 1 may undergo as a method of inputting commands for one or more independent display areas of the device is illustrated by the arrows in fig. 1. The motion may be a back and forth linear or reciprocating motion in any of three directions along three axes x, y and z in three dimensions as indicated by the cross arrows. Additionally, or alternatively, the motion may be non-linear, curved, or rotational, as shown by curved arrows a and B. The rotational movement may be a rotation about any of the x, y or z axes.
Fig. 2 is a schematic diagram showing the mobile device 2.
The device 2 comprises a display screen 3 capable of displaying information. In this example, the display 3 displays three types of information, namely typed text 11, handwritten text 41, and an image 31, in three different display areas. Although in this embodiment different information is shown displayed in different areas, the invention is also applicable to the display of the same information type or mixed information types in the same or different display areas. The information displayed may be, for example, a document or form having areas to be completed or filled in, for example address and name areas in standard form letters, or fields in a form. The information displayed in a display area may include any one of text, images, or video. The information may be displayed by any application executing on the device, such as a web browser, an email application, a word processor, a spreadsheet, an image editor, and so forth. Handwritten text 41 may be input using a pen device 50. The display screen 3 may be sensitive to the proximity of the pen device 50 to detect its coordinates and track its position, and thus produce the input handwritten text 41. The display 3 may be touch sensitive to allow information and commands to be input to the device 2, for example by displaying touch-sensitive options and/or a keyboard.
In the example of fig. 2, movement of the apparatus 2 may cause a control operation to be performed to modify any displayed information in any display area, such as by erasing all of the displayed information or selected portions of it (such as one or more of the typed text 11, handwritten text 41 or image 31), or by entering predefined information (e.g., predefined text or image data) into a display area.
Figure 3 shows an alternative display screen 20 of the device. In this example, there are three different display areas clearly defined by the boundary markers, namely, a typed-text display area 10 having a typed text 11 displayed, an image display area 30 displaying an image 31, and a handwritten-text display area 40 displaying a handwritten text 41 created using a pen 50.
Although fig. 2 and 3 illustrate motion control of displayed information in the form of text or images, motion control may include any displayed information. In a shopping application, the virtual shopping cart may be moved, filled or emptied, and items may be checked out. Each control operation may have an associated control motion for which the device is taught during the learning phase. Some control options may be pre-stored and need not be taught.
The motion of the exemplary device may be a back and forth linear or reciprocating motion in any of three directions along three axes x, y, and z in three dimensions as shown by the intersecting arrows in fig. 1. Also, or alternatively, the motion may be non-linear, curved, or rotational, as indicated by the curved arrows in fig. 1.
In the example of fig. 3, movement of the device may erase all or some of the information displayed in one or more of the display areas 10, 30 and 40. One or more of the displayed regions and the operations performed on the information in the displayed regions are defined as commands specific to the motion of the device. The apparatus stores a set of motion patterns with corresponding display area identifiers and information operations such that a particular motion type will cause a particular operation on information displayed in only one display area, only a subset of the display areas, or in all display areas.
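The patent does not give a concrete layout for this stored set, but one plausible shape for the motion type control data is sketched below; all field and value names are invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MotionTypeEntry:
    """One entry of the motion type control data: a stored motion pattern
    plus the display region identifiers and the operation it maps to."""
    pattern: List        # averaged samples recorded in the learning phase
    deviation: List      # per-sample tolerance used when matching
    region_ids: Tuple    # e.g. ("typed_text",), a subset, or all regions
    operation: str       # e.g. "erase", "insert_predefined_text"

# The device's stored set of motion patterns with corresponding
# display area identifiers and information operations.
motion_type_control_data: List[MotionTypeEntry] = []
```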
Although different types of information are shown and described in fig. 3 as being displayed in different display areas, different display areas may display the same type of information or any combination of types of information. Further, the display area may display mixed information types. The example of fig. 3 is for illustration only.
The examples of fig. 2 and 3 describe a control operation of controlling the display of information in a specific display area according to a recognized motion pattern or type.
The motion detection means of the device may detect any of a number of types of motion pattern in any dimension or direction. Examples of different motion patterns are: front-to-back rocking, left-to-right rocking, up-and-down rocking, left-to-right turning (left-to-right tilting), up-and-down turning (up-and-down tilting), and rotation. Each of these motion patterns represents lateral (rocking) or rotational motion along or about one of the three axes shown in fig. 1. Each of these motion types may be detected in two variants based on the starting motion: for example, front-to-back may be front-to-back or back-to-front, left-to-right may be left-to-right or right-to-left, rotation may be clockwise or counterclockwise, and so forth. Further, the force with which the motion is performed (e.g., the speed or frequency of the shaking or reciprocating motion) may be used to characterize different types or patterns of motion. Thus, the same spatial displacement of the device may correspond to different motion types depending on the speed, acceleration and frequency with which the displacement is performed. Further, complex motion types may include complex motion patterns comprising combinations of basic motion types, e.g., lateral and rotational motion, or a combination of reciprocating lateral and rotational motion.
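A simplified sketch of how such characteristics might be extracted from a recorded trace is shown below; dominant axis, starting direction, and peak magnitude stand in for direction, variant, and force, while frequency analysis of a shake is omitted. Everything here is an illustrative assumption, not the patent's method.

```python
import numpy as np

def motion_features(samples):
    """Coarse features of a recorded motion: the dominant axis (which of
    x, y, z moved most), the sign of the initial movement on that axis
    (e.g. front-to-back vs back-to-front), and the peak magnitude as a
    stand-in for the force of the motion."""
    a = np.asarray(samples, dtype=float)           # shape (T, 3)
    dominant_axis = int(np.abs(a).sum(axis=0).argmax())
    start_sign = float(np.sign(a[0, dominant_axis]))
    peak_magnitude = float(np.linalg.norm(a, axis=1).max())
    return dominant_axis, start_sign, peak_magnitude
```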
The memory of the device may store a set of operations to be implemented by the processor on the identified one or more display regions for each type of motion detected by the motion detection device and identified by the processor from the signal from the motion detection device. These operations may be predefined or downloaded from a data source, and they may be user defined or supplemented. User definition or supplementation may occur during a learning phase.
Fig. 4 is a schematic diagram illustrating electronic components of an example apparatus.
The processor 100 is connected to a static or non-volatile memory 102 that stores code (e.g., operating system code) used by the processor 100 and that stores data and information for display in a non-volatile manner. Volatile memory 103 is provided for storing application code executed by the processor 100, as well as data and information for display. The volatile memory may store code modules including code for identifying a motion pattern detected by the motion sensor 104 and code for controlling the display screen 105 to perform an operation on information displayed in a separate display area. The memory may also store data, pre-stored or learned during the learning phase, mapping motion types to operations on display areas and to the identifiers of those display areas.
A display screen 105 is connected to the processor 100 for displaying information. Display screen 105 may include a touch sensitive display to provide an input device. A pen 50 (shown in fig. 2) may be used with the display screen 105 to allow entry of handwritten text or drawings. A camera 106 may also optionally be connected to the processor 100 to capture images. The camera 106 may also serve as an input device for inputting information and commands using gesture recognition performed by the processor on images captured by the camera 106.
A network connector 101 is connected to the processor 100 for communication with a communications or computer network. The communication link may transfer data, information, and control between the device and other devices or computers over the network. The network connector 101 may connect to the communications or computer network using a wired or wireless connection. For example, the wired connection may be an Ethernet connection to a LAN, or a telephone connection, such that the network connector 101 acts as a modem or router. The wireless connection may be Wi-Fi, WiMAX, CDMA, GSM, GPRS, wireless local loop, or WAN. For short-range, low-power wireless communication, Bluetooth may be used.
A motion sensor 104 is connected to the processor 100 to detect motion of the device. The motion sensor 104 may include any conventional motion sensor, such as an accelerometer. The motion sensor 104 is capable of detecting motion of the device in any direction, i.e., it is a multi-axis motion sensor to detect motion along any one of three orthogonal axes in three dimensions.
In any of the above examples, the device may have display screens on both sides so that it can be flipped over to view information on both sides. The control operation may be applied to both sides or to only one side. In one example, this may be selectively set. One or more displays may also be removable from the mounting frame.
In one or more embodiments, the device may include a docking portion or mount to which the display screen is removably or detachably mounted so that the display screen may be used separately from the mount. The mount and display screen may communicate between each other using a wireless link, such that the mount may contain some components of the display device, such as a processor and memory, while the display screen may contain only those components necessary to operate the display screen, detect movement of the display screen, and communicate with the processor. In one embodiment, more than one display screen may be provided that is detachable from the mount, wherein each display screen operates independently to provide an erase operation through motion detection using a shared processor in the mount.
An apparatus may include one or more speakers for audio output and one or more microphones for audio input. In addition, the device may be connected to peripheral devices such as a keyboard and/or pointing device through a wired or wireless connection to allow information and commands to be input to the device.
A method of operating an example apparatus will now be described with reference to fig. 5 and 6.
Fig. 5 is a flow chart illustrating a learning phase of the apparatus.
In step S1, the apparatus enters a learning phase. This may be done automatically upon start-up of the device, or as a result of some other operation of the device, or may be a result of a user selection, i.e. a user input, such as using an input device of the device. In step S2, the display of the device displays a menu of defined operations to be performed on information displayed in the one or more identified display areas, which the user may select to perform using motion as user input. The displayed menu may simply be a list or a drop down menu, for example. The list may be organized into categories of operation types. The menu may also display a list of display regions to which operations may be applied, and the user may select one or more display regions. Alternatively, the display may display the regions in a selectable manner such that the user may select one or more regions to which the selected operation is to be applied using, for example, a pointing device or touch input to select one or more regions of the display.
In step S3, a user selection of an operation menu item is received, and in step S4 the device waits for the motion detector device, such as an accelerometer, to detect motion of the device. The motion is recorded over a short period (e.g., 1 to 2 seconds) suitable for capturing the motion pattern. In order to improve the accuracy of the matching process and allow for variation in the user's motion pattern, in step S5 the process loops back to ask the user to repeat the motion input a number (N) of times, for example any number from 3 to 10. More repetitions improve the accuracy of the matching process but reduce the usability of the device, since the user will lose patience if the motion pattern has to be repeated too many times.
At the end of the loop, the repeatedly recorded motion patterns are averaged in step S6 and, in step S7, the average is stored as part of the motion type information in the stored motion type control data. Data indicating the degree of deviation between the recorded motion patterns may also be stored as part of the motion type data in order to assist the matching process during the operational phase, i.e., to help assess whether a match lies within the expected deviation from the stored average motion pattern. An identifier identifying the corresponding control operation is also stored as part of the stored motion type control data.
The process then returns to step S2 to allow the user to select another operation from the menu. When the user has finished selecting operations, the user may choose to exit the learning phase.
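Pulling steps S2 to S7 together, a hedged sketch of the learning loop might look as follows, reusing average_recordings and MotionTypeEntry from the earlier sketches; the ui and sensor interfaces are invented for illustration and are not part of the patent.

```python
def learning_phase(ui, sensor, table, repeats=5):
    """Steps S2-S7: present the menu, record the motion N times,
    average the recordings, and store the result together with the
    selected operation and display regions."""
    while True:
        selection = ui.select_operation()          # S2/S3: menu selection
        if selection is None:                      # user exits learning phase
            break
        operation, region_ids = selection
        recordings = [sensor.record(seconds=2)     # S4/S5: repeat N times
                      for _ in range(repeats)]
        pattern, deviation = average_recordings(recordings)   # S6: average
        table.append(MotionTypeEntry(pattern, deviation,      # S7: store
                                     tuple(region_ids), operation))
```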
Fig. 6 is a flow chart illustrating the operational stages of the device after the learning stage.
In step S10, the device enters the operational phase. In step S11, motion of the device is detected, and in step S12 the motion pattern is compared to the stored patterns of motion types in the motion type control data to determine whether the motion matches a motion type. If no match is found (step S13), the process returns to step S11 to detect further motion. If a match with the motion data of a stored motion type is found in step S13, the corresponding operation on the information displayed in the one or more identified display areas is identified in the motion type control data in step S14, and the operation is performed on the information displayed in the one or more identified display areas in step S15. The process then returns to step S11 to detect the next motion of the device so that further operations can be performed on the information displayed in the one or more identified display areas.
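A corresponding sketch of the operational loop of fig. 6 (steps S11 to S15) is given below. The per-sample comparison against the stored deviation is only one plausible matching rule under the assumptions above; a real implementation might use dynamic time warping or a trained classifier instead.

```python
import numpy as np

def operational_step(sensor, table, execute):
    """Steps S11-S15: detect a motion, try to match it against each stored
    entry, and perform the corresponding operation if a match is found."""
    motion = np.asarray(sensor.record(seconds=2), dtype=float)     # S11
    for entry in table:                                            # S12
        pattern = np.asarray(entry.pattern, dtype=float)
        if motion.shape != pattern.shape:
            continue                       # resampling/alignment elided
        tolerance = 2.0 * np.asarray(entry.deviation, dtype=float) + 1e-6
        within = np.abs(motion - pattern) <= tolerance
        if within.mean() > 0.9:            # most samples within tolerance
            execute(entry.operation, entry.region_ids)             # S14/S15
            return entry
    return None                            # S13: no match, keep listening
```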
In fig. 5, the motion data of a plurality of recorded motions is averaged. However, the invention is not limited to averaging: only one motion recording may be used, or a number of recordings may be stored separately and each matched separately against the detected motion during the operational phase.
In fig. 5, the user is presented with a menu to select an operation to teach the device the type of motion used to control the operation. The user may also be able to define their own operations to be performed on information displayed in one or more identified display areas using the user input device.
In any of the above examples, the device may include other input devices, such as an audio sensor for audio input, a keyboard, a pointing device (e.g., a mouse), or a touch screen, to enable a user to input control and information in conjunction with a motion detection control input method. Further, a camera may be provided to allow image capture, enabling gesture input as a method of control or information input. The apparatus may include an infrared sensor, for example, a passive infrared sensor (PIR).
The information displayed in the display area may include video (moving images, television, 3D video, panoramic view, etc.).
The apparatus may include a remote control for controlling the apparatus. The remote control may itself respond to motion, so as to provide motion control of the device.
In embodiments, in addition to or instead of motion control input, a camera may be provided in the apparatus to capture video in a direction facing the user, so that the user may use gesture input control depending on the type of gesture recognized. Gesture control may include recognizing movement of a body part of the user (i.e., a hand, facial features, or entire body position). Pattern recognition techniques may be used to discern the types of gestures and recognize them. Example gesture types may include a left or right hand flick, or an up or down hand flick, or a hand swipe.
Where gesture control is provided in conjunction with motion control, gesture control may be used to control the erase operation, for example where the user is unable to pick up the device because their hands are wet or dirty. Further, the device may respond to a detected gesture type by recognizing the comparable or associated motion type that inputs the same control, and control the display to mimic the motion command equivalent to the gesture command. For example, if the motion command is a left-to-right rotation, then when the equivalent or associated gesture command is recognized, e.g., waving a hand from left to right, the display screen may rotate the display from left to right as confirmation to the user that the gesture command has been recognized as a motion command. The equivalent gesture type may be determined automatically by the processor from the corresponding motion type, and thus the motion type data may include corresponding gesture type data for the same operation. Using gestures whose general movement matches the motion commands means that the gestures will be intuitive to the user, e.g., rotation of the device may be mirrored by rotation of the user's hand.
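One very simple way to represent the pairing that the description says the processor may derive automatically is a lookup table, sketched below; every key, value, and name here is invented for illustration.

```python
# Hypothetical pairing of learned motion types with equivalent gesture
# types derived for the same control operation.
EQUIVALENT_GESTURE = {
    "rotate_left_to_right": "wave_hand_left_to_right",
    "shake_up_down": "flick_hand_up_down",
}

def gesture_to_motion(gesture_type):
    """Map a recognized gesture back to the motion type that inputs the
    same control, so the display can mimic the equivalent motion command
    as confirmation to the user."""
    inverse = {g: m for m, g in EQUIVALENT_GESTURE.items()}
    return inverse.get(gesture_type)
```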
This aspect of the invention may be applied to input any type of command to the device, including execution of an application, taking a picture using a camera of the device, or any other operation that changes the state of the device. This aspect may use the hardware of fig. 4 and the methods of the flowcharts of fig. 5 and 6 to learn and execute control operations. This aspect is not limited to the control of the display operation in the display area, but may be applied to input of any control operation to the apparatus.
The motion type and control operation can be learned for individual users. For example, the user may enter their name or signature to use the display device, and the learned motion types and actions are then stored for use by the user when the user enters their name or signature.
In one aspect, the present invention provides gesture control to supplement motion control. In this aspect, an apparatus may include a memory for storing control data including an identification of a motion type, a corresponding control operation, and corresponding gesture type data; motion detection means for detecting motion of the device; gesture detection means for detecting a gesture; a user input device for receiving a user input of the identified control operation during the learning phase; and a processor programmed to, in the learning phase, when receiving a user input identifying a control operation, identify a type of motion from the motion detected by the motion detection means, store the received user input in the control data in correspondence with data identifying the type of motion, and determine corresponding gesture type data, and after the learning phase, when the motion detection means detects motion, identify the type of motion using the control data and perform the corresponding control operation, or when the gesture detection means detects a gesture, identify the type of gesture using the control data and perform the corresponding control operation.
In this regard, the device may be mobile or fixed/attached. A camera may be provided in the apparatus to capture video in a direction facing the user so that the user may use the gesture as a control input depending on the type of gesture recognized. Gesture control may include identifying motion of a body part of the user, i.e. a hand, facial feature or overall body position, or identifying an object moved by the user, such as a wand, ruler or an item worn by the user. Pattern recognition techniques may be used to discern the types of gestures and recognize them. Example gesture types may include a left or right hand flick, or an up or down hand flick, or a hand swipe. The gestures match the motion learned by the device during the learning phase, e.g., the rotation of the device may be mimicked by the rotation of the user's hand.
It will be readily understood by those skilled in the art that various other changes in the details, materials, and arrangements of the parts and method stages which have been described and illustrated in order to explain the nature of this subject matter may be made without departing from the principles and scope of the subject matter as expressed in the subjoined claims.

Claims (20)

1. An apparatus, comprising:
a display for displaying information in a plurality of separate display areas, each display area being separately identifiable;
a memory for storing motion type control data comprising an identification of a motion type and a corresponding operation to be performed on information displayed in the one or more identified display regions;
motion detection means for detecting motion of the device;
user input means for receiving user input during a learning phase, the user input identifying an operation to be performed on information displayed in the one or more identified display regions; and
a processor programmed to, in the learning phase, when receiving user input identifying an operation to be performed on information displayed in one or more identified display regions, identify a type of motion from the motion detected by the motion detection means and store the received user input in the motion type control data in correspondence with data identifying the type of motion, and, after the learning phase, when the motion detection means detects motion, identify a type of motion using the motion type control data and perform the corresponding operation on information displayed in one or more identified display regions.
2. The apparatus of claim 1, wherein the processor is programmed to, in the learning phase, when receiving user input identifying an operation to be performed on information displayed in one or more identified display regions, identify the type of motion by averaging the detected motion received from the motion detection means a plurality of times, and store the received user input in the motion type control data in correspondence with data identifying the type of motion.
3. Apparatus according to claim 1 or 2, wherein the processor is programmed to, in the learning phase, receive a plurality of user inputs and identify the type of motion from the motion detected by the motion detection means for each user input, and store the received user inputs in the motion type control data in correspondence with data identifying the respective types of motion.
4. The apparatus of any of claims 1-3, wherein the processor is programmed to identify a plurality of motion types comprising a combination of motions in a plurality of directions.
5. The apparatus of claim 4, wherein the direction comprises at least one of a lateral and/or rotational direction.
6. The apparatus of any one of claims 1-5, wherein the processor is programmed to, in the learning phase, receive a user selection of one or more display regions to identify the one or more display regions as part of the user input.
7. Apparatus according to any one of claims 1 to 5, wherein the processor is programmed to, in the learning phase, control the display to display a menu of operations to be performed on information displayed in one or more identified display regions, and the user input device is adapted to receive a user selection of one of the operations to be performed on information displayed in one or more identified display regions in the menu, and a selection of one or more of the display regions to which the operation applies as the user input.
8. Apparatus according to any one of claims 1 to 5, wherein the processor is programmed to, in the learning phase, control the display to display a list of operations to be performed on information displayed in one or more identified display regions, the user input device being adapted to receive a user selection of the list as the user input identifying the operations to be performed on information displayed in one or more identified display regions, and to receive a selection of one or more of the display regions to which the operations apply as part of the user input, and the processor being programmed to identify a type of motion from the motion detected by the motion detection means for each user input and to store the user inputs in the motion type control data in correspondence with data identifying the respective types of motion.
9. The apparatus of any of the preceding claims, wherein the processor is programmed to identify a plurality of motion patterns in a plurality of directions and to perform the corresponding operations on information displayed in one or more identified display regions.
10. A method of controlling display information in a plurality of separate display areas on a device having a motion detector device for detecting motion of the device, each display area being separately identifiable, the method comprising:
during the learning phase:
receiving user input identifying an operation to be performed on information displayed in the one or more identified display regions;
detecting motion of the device using the motion detector device;
identifying a type of motion from the motion detected by the motion detector device;
storing the received user input as motion type control data in correspondence with data identifying the motion type;
and repeating said receiving, detecting, identifying and storing steps; and
in the operational phase:
detecting motion of the device using the motion detector device; and
a motion type is identified using the motion type control data, and a corresponding operation is performed on information displayed in the identified one or more display regions.
11. The method of claim 10, wherein, in the learning phase, when a user input identifying an operation to be performed on information displayed in one or more identified display regions is received, the type of motion is identified by averaging the detected motion received from the motion detector device a plurality of times, and the received user input is stored in the motion type control data in correspondence with data identifying the type of motion.
12. The method of claim 10 or 11, wherein the type of motion comprises a combination of motions in multiple directions.
13. The method of claim 12, wherein the direction comprises at least one of a lateral direction and/or a rotational direction.
14. A method according to any one of claims 10 to 13, comprising, in the learning phase, receiving a user selection of one or more display regions to identify the one or more display regions as part of the user input.
15. A method according to any one of claims 10 to 13, comprising, in the learning phase, displaying a menu of operations to be performed on information displayed in one or more identified display regions, and receiving the user input as a user selection of one of the operations to be performed on information displayed in one or more identified display regions in the menu, and as the user input a selection of one or more of the display regions to which the operation applies.
16. The method of any of claims 10 to 13, comprising, in the learning phase, displaying a list of operations to be performed on information displayed in one or more identified display regions, wherein a user selection of the list is received as the user input identifying the operations to be performed on information displayed in one or more identified display regions, together with a selection of one or more of the display regions to which the operations apply; a type of motion is identified from the motion detected by the motion detector device for each user input, and the user inputs are stored in the motion type control data in correspondence with data identifying the respective types of motion.
17. The method of any of claims 10 to 16, wherein a plurality of motion patterns in a plurality of directions are identified to perform the corresponding operations to be performed on information displayed in one or more identified display regions.
18. A carrier medium carrying processor implementable code for execution by a processor to carry out the method of any of claims 10 to 17.
19. An apparatus, comprising:
a memory for storing control data including an identification of a motion type, a corresponding control operation, and corresponding gesture type data;
motion detection means for detecting motion of the device;
gesture detection means for detecting a gesture;
user input means for receiving user input identifying a control operation during a learning phase; and
a processor programmed to:
in the learning phase, when a user input identifying a control operation is received, identifying a type of motion from the motion detected by the motion detection means, storing the received user input in the control data in correspondence with data identifying the type of motion, and determining corresponding gesture type data, an
After the learning phase, when the motion detection means detects a motion, the control data is used to identify a type of motion and perform the corresponding control operation, or when the gesture detection means detects a gesture, the control data is used to identify a type of gesture and perform the corresponding control operation.
20. The device of claim 19, wherein the gesture detection device comprises a camera.
CN201980029126.2A 2018-03-29 2019-03-29 Device operation control Pending CN112041804A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GB1805269.6A GB2572434A (en) 2018-03-29 2018-03-29 Device operation control
GB1805269.6 2018-03-29
PCT/GB2019/050933 WO2019186203A1 (en) 2018-03-29 2019-03-29 Device operation control

Publications (1)

Publication Number Publication Date
CN112041804A 2020-12-04

Family

Family ID: 62142364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980029126.2A Pending CN112041804A (en) 2018-03-29 2019-03-29 Device operation control

Country Status (8)

Country Link
EP (1) EP3776160A1 (en)
JP (2) JP2021519977A (en)
KR (1) KR20210002512A (en)
CN (1) CN112041804A (en)
GB (1) GB2572434A (en)
SG (1) SG11202009628UA (en)
WO (1) WO2019186203A1 (en)
ZA (1) ZA202100682B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080132286A1 (en) * 2006-12-01 2008-06-05 Samsung Electronics Co., Ltd. Operation mode-driving mechanism and method for mobile terminal
KR100912310B1 (en) * 2008-04-17 2009-08-14 엘지전자 주식회사 User interface controlling method by detecting user's gestures
US20100216517A1 (en) * 2009-02-24 2010-08-26 Samsung Electronics Co., Ltd. Method for recognizing motion based on motion sensor and mobile terminal using the same
CN102420942A (en) * 2011-11-28 2012-04-18 康佳集团股份有限公司 Photograph device and photograph control method based on same
CN102722239A (en) * 2012-05-17 2012-10-10 上海冠勇信息科技有限公司 Non-contact control method of mobile device
US20120306903A1 (en) * 2011-06-01 2012-12-06 Research In Motion Limited Portable electronic device including touch-sensitive display and method of controlling same
US20140184495A1 (en) * 2012-12-31 2014-07-03 Joseph Patrick Quin Portable Device Input by Configurable Patterns of Motion
US20150177826A1 (en) * 2013-12-19 2015-06-25 Sony Corporation Apparatus and control method based on motion

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6798429B2 (en) * 2001-03-29 2004-09-28 Intel Corporation Intuitive mobile device interface to virtual spaces
US20050212760A1 (en) * 2004-03-23 2005-09-29 Marvit David L Gesture based user interface supporting preexisting symbols
US8391786B2 (en) * 2007-01-25 2013-03-05 Stephen Hodges Motion triggered data transfer
JP6324203B2 (en) * 2014-05-14 2018-05-16 キヤノン株式会社 Information processing apparatus, control method therefor, program, and recording medium
KR102275653B1 (en) * 2014-10-21 2021-07-09 삼성전자주식회사 Wearable device and method for transmitting contents
CN104639966A (en) * 2015-01-29 2015-05-20 小米科技有限责任公司 Method and device for remote control
US9996164B2 (en) * 2016-09-22 2018-06-12 Qualcomm Incorporated Systems and methods for recording custom gesture commands

Also Published As

Publication number Publication date
ZA202100682B (en) 2023-10-25
WO2019186203A1 (en) 2019-10-03
EP3776160A1 (en) 2021-02-17
JP2021519977A (en) 2021-08-12
JP2024056764A (en) 2024-04-23
GB201805269D0 (en) 2018-05-16
GB2572434A (en) 2019-10-02
SG11202009628UA (en) 2020-10-29
KR20210002512A (en) 2021-01-08

Similar Documents

Publication Publication Date Title
US20220129060A1 (en) Three-dimensional object tracking to augment display area
AU2014219558B2 (en) Method for controlling display of multiple objects depending on input related to operation of mobile terminal, and mobile terminal therefor
JP6074170B2 (en) Short range motion tracking system and method
JP6159323B2 (en) Information processing method and information processing apparatus
CN109242765B (en) Face image processing method and device and storage medium
EP2790089A1 (en) Portable device and method for providing non-contact interface
EP2743846A2 (en) Information search method and device and computer readable recording medium thereof
US10949668B2 (en) Electronic apparatus and method for controlling thereof
CN109923496A (en) Computing system based on and by the input of the interaction of physical hinge that is connected to each other of two display equipment
JPWO2014208168A1 (en) Information processing apparatus, control method, program, and storage medium
CN107407945A (en) From the system and method for screen locking capture images
JP6379880B2 (en) System, method, and program enabling fine user interaction with projector-camera system or display-camera system
CN102918477A (en) Apparatus, method, computer program and user interface
CN204945943U (en) For providing the remote control equipment of remote control signal for external display device
CN112041804A (en) Device operation control
CN107885439A (en) A kind of note dividing method and mobile terminal
JP7379364B2 (en) display device
CN103345358A (en) Display device and information processing method thereof
JP2021197024A (en) Display unit, display method, and program
JP2015176483A (en) Image processing program, image processing method, and information processing device
Buda Rotation techniques for 3D object interaction on mobile devices
KR101898162B1 (en) Apparatus and method of providing additional function and feedback to other apparatus by using information of multiple sensor
JP7030527B2 (en) Electronic devices, information processing methods, programs and storage media
JP2018028923A (en) Image processing program, image processing method, and information processor
Bunscheit Imaginary Interfaces: Development of user interfaces for creating highly mobile screen-less devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 40042777
Country of ref document: HK