US20140149916A1 - Content manipulation using swipe gesture recognition technology - Google Patents

Content manipulation using swipe gesture recognition technology

Info

Publication number
US20140149916A1
Authority
US
United States
Prior art keywords
swipe gesture
content
touch
swipe
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/932,898
Inventor
Robert S. MANOFF
Todd Houck
Jesse D. SQUIRE
Caleb K. SHAY
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Swipethru LLC
Original Assignee
Somo Audience Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Somo Audience Corp filed Critical Somo Audience Corp
Priority to US13/932,898 priority Critical patent/US20140149916A1/en
Priority to CA2892999A priority patent/CA2892999A1/en
Priority to AU2013352207A priority patent/AU2013352207A1/en
Priority to PCT/US2013/072186 priority patent/WO2014085555A1/en
Priority to JP2015545425A priority patent/JP6309020B2/en
Priority to EP13858848.8A priority patent/EP2926227A4/en
Priority to CN201380071413.2A priority patent/CN104937525B/en
Priority to US14/175,522 priority patent/US20140245164A1/en
Publication of US20140149916A1 publication Critical patent/US20140149916A1/en
Priority to US14/310,663 priority patent/US9218120B2/en
Priority to US15/077,535 priority patent/US10089003B2/en
Priority to US16/116,459 priority patent/US10831363B2/en
Assigned to SoMo Audience Corp. reassignment SoMo Audience Corp. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SQUIRE, JESSE D., HOUCK, Todd, MANOFF, ROBERT S., SHAY, CALEB K.
Assigned to SWIPETHRU LLC reassignment SWIPETHRU LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SoMo Audience Corp.
Priority to US17/039,151 priority patent/US11461536B2/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/12Use of codes for handling textual entities
    • G06F40/14Tree-structured documents
    • G06F40/143Markup, e.g. Standard Generalized Markup Language [SGML] or Document Type Definition [DTD]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/274Converting codes to words; Guess-ahead of partial word inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements

Definitions

  • In the case of a swipe advertisement which is configured to provide the user experience as well, the server hosting the advertising content may return a block of HTML content which contains the swipe gesture recognition “user experience.”
  • Each of these user experience-providing swipe advertisements may be a previously-assembled, static block of HTML code, rather than something that reacts dynamically to an incoming request.
  • Content, e.g., a web page, may have items across which it would like to track swipe gestures. It will make a request to the swipe script to be notified anytime the user performs a swipe gesture on the particular area.
  • The swipe gesture recognition technology works in the background, making decisions on the touch patterns to determine whether the user intended to make a swipe gesture.
  • When a swipe gesture is detected, the system alerts any entity, e.g., an application or webpage HTML code, which has requested to be notified.
  • The entity requesting notification may be a third party which is simply using the detected behaviors itself.
  • Alternatively, the entity requesting notification may be a swipe advertisement package which provides behavior support and user interface (UI) support for third-party applications.
  • In certain embodiments, all entities may be notified when a swipe gesture is detected. If a particular entity is one which has registered with the system and requested a comprehensive package, then the system will take certain actions in response to a detected swipe gesture, such as, for example, closing or clicking-through an advertisement. The system may perform whatever action is necessary upon detection of a swipe gesture and then “tear down” the listeners, i.e., touch detection monitors, because they are no longer necessary. This is done to avoid draining the user device's resources or slowing down a session.
  • Alternatively, the system may take the appropriate actions and continue to listen for swipe gestures. This is useful when, for example, a swipe gesture is used to initiate an animation, such as an advertisement for a soft drink in which the user swipes across a soda bottle and the animation then shows the soft drink being poured into a glass and releasing bubbles. In such a case, the listeners may be left in place after the animation is initiated so that the user can initiate the animation repeatedly.
  • The disclosed invention provides a method for manipulation of content provided by a hosting server using swipe gesture recognition on a user device having a touch input display.
  • The method includes storing the content, combined with a swipe gesture recognition module to form a content package, on the hosting server, wherein the swipe gesture recognition module is associated with at least one displayable content element of the content.
  • The method further includes receiving, at the hosting server, a request for the content package from the user device and transmitting the content package from the hosting server to the user device for display by an application running on the user device (a server-side sketch follows this list).
  • The swipe gesture recognition module is configured to perform swipe gesture recognition when the at least one displayable content element is displayed on the user device.
  • The swipe gesture recognition includes receiving touch input data from the touch input display of the user device.
  • The swipe gesture recognition further includes accessing, using the swipe gesture recognition module, a swipe gesture determination module stored on the hosting server or a second server to analyze the touch input data to determine whether a swipe gesture has occurred on the at least one displayable content element.
  • The swipe gesture recognition further includes applying a defined action to the at least one displayable content element if it is determined that a swipe gesture has occurred on the at least one displayable content element.
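  • A minimal server-side sketch of this method follows, assuming a Node.js/Express hosting server; the route name, file names, and bundling approach are illustrative assumptions rather than details taken from this disclosure.

        // Hypothetical hosting server: stores the content combined with the swipe
        // gesture recognition module as a content package, then transmits the
        // package when the user device requests it.
        const express = require('express');
        const fs = require('fs');

        const content = fs.readFileSync('content.html', 'utf8');       // displayable content
        const swipeModule = fs.readFileSync('swipe-module.js', 'utf8'); // recognition module

        // Combine the content with the swipe gesture recognition module to form
        // the content package (here, by embedding the module as an inline script).
        const contentPackage = content.replace(
          '</body>',
          '<script>\n' + swipeModule + '\n</script>\n</body>'
        );

        const app = express();

        // Receive a request for the content package from the user device and
        // transmit it for display by an application (e.g., a browser).
        app.get('/content', (req, res) => {
          res.type('html').send(contentPackage);
        });

        app.listen(8080);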
  • FIG. 1 is a flow chart of a method performed on a user device and a hosting server, the user device implementing swipe gesture recognition technology to manipulate Internet-based content provided by the hosting server, in accordance with the disclosed invention.
  • FIG. 2 is a flow chart of a method for performing swipe gesture recognition and content manipulation on a user device.
  • FIG. 3 is a flow chart of a method for performing swipe gesture recognition and content manipulation on a user device for an advertisement.
  • FIGS. 4A and 4B are a flow chart of a method for performing swipe gesture recognition on a user device.
  • FIG. 1 depicts a method for manipulating displayed content, which is performed on a user device and a hosting server.
  • The user device implements swipe gesture recognition technology to manipulate the content provided by the hosting server and displayed on the user device.
  • The technology may include swipe gesture recognition software code in various forms, such as, for example, HTML-based scripts, compiled modules, plug-ins, applets, application program interfaces (APIs), etc.
  • The user device may be any type of user computing device, such as, for example, a mobile device (e.g., smartphone, tablet, etc.) or a personal computer or laptop with a touchscreen or trackpad-type element which allows a user to make swipe gestures.
  • The user device initiates a content request 105 to a hosting server, e.g., via the Internet.
  • The content request may be, for example, in the form of a uniform resource locator (URL) directed to a particular webpage.
  • Upon receiving the content request, the hosting server prepares the requested content 110 for transmission to the user device.
  • The preparation of the content includes conventional aspects, such as the composition of a webpage using hypertext markup language (HTML) and plug-ins, e.g., scripts or other executable elements.
  • The prepared content may also include advertising content, which may include content retrieved from other servers.
  • The preparation of the content also includes the embedding of a swipe technology configuration 115 which establishes how the content will react to swipe gestures performed on the user device, as explained in further detail below.
  • The content is then transmitted to the user device in the form of a content response 120.
  • The user device receives the content response sent by the hosting server and attaches the swipe technology 130 to an element of the content, such as, for example, an advertisement which overlays the content when it is displayed on the user device.
  • The user device displays the content, e.g., on a touch screen, and awaits detection of a user touching the screen 135.
  • Upon a detected touch, the user device begins to perform gesture input 140, i.e., touch data input and accumulation, which provides the basis for ascertaining whether the detected touch is in fact a swipe gesture by the user.
  • After the user stops touching the screen 145, the user device begins to interpret the detected gestures and apply the resulting actions 150. For example, if the touching of the screen is interpreted as a bona fide swipe gesture, then the overlaid advertising content may be removed from the display, e.g., the advertisement may be “swept” off of the content on which it is overlaid (a minimal sketch of such an action follows below).
  • Alternatively, the swipe gesture may initiate an animation or other executable aspect of the displayed content.
  • The user may then interact with the underlying, i.e., “non-swipe,” content 155. If, on the other hand, the touching of the screen is not interpreted as a bona fide swipe gesture, then the user device will again await a user touching the screen 135.
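  • A minimal sketch of one such applied action follows, assuming the overlaid advertisement is a DOM element with the id "ad-overlay" (an assumed identifier) and that CSS transitions are available:

        // Hypothetical action applied once a bona fide swipe gesture is recognized:
        // the overlaid advertisement is "swept" off the screen in an animated
        // fashion and then removed so the underlying content can be interacted with.
        function sweepAdOffScreen(direction) {
          var ad = document.getElementById('ad-overlay'); // assumed element id
          if (!ad) { return; }
          ad.style.transition = 'transform 0.3s ease-out';
          ad.style.transform = (direction === 'left') ? 'translateX(-100%)'
                                                      : 'translateX(100%)';
          // Remove the overlay from the document once the slide animation ends.
          ad.addEventListener('transitionend', function () {
            if (ad.parentNode) { ad.parentNode.removeChild(ad); }
          });
        }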
  • FIG. 2 depicts the method for performing swipe gesture recognition and content manipulation from the standpoint of the user device.
  • The user device initiates a content request 205, which is transmitted to the hosting server.
  • The user device then awaits a content response 210 from the hosting server.
  • The remainder of the steps performed by the user device to detect and recognize bona fide swipe gestures are as described above with respect to FIG. 1. These steps include: the user touching the screen 220, beginning of gesture input 225, the user ceasing to touch the screen 230, interpretation of the gestures and application of an action 235, and user interaction with non-swipe content 240.
  • FIG. 3 depicts the method for performing swipe gesture recognition and content manipulation on a user device in the particular case of an advertisement overlaid on other displayed content, e.g., a webpage.
  • The user device receives content from a hosting server to be displayed and attaches swipe gesture recognition to the content 305.
  • The user device then awaits detection of a user touching the device screen 310.
  • Upon a detected touch, the user device begins to perform gesture input 315, i.e., touch data input and accumulation.
  • The user device determines whether the touch was on the advertisement 320. In other words, it is determined whether the touch is within a defined start element, e.g., within an area defined by an advertisement “window” or other displayed graphical element. If so, then the user device continuously captures data regarding the screen touch 330, e.g., location and time data, until the user stops touching the screen. If, on the other hand, the touch is not determined to be on the advertisement 325, then the swipe gesture recognition is stopped 335 and the next user touch is awaited.
  • Once the touch ends, an analysis is performed to determine whether the detected gesture is a bona fide swipe gesture.
  • The analysis involves the application of certain criteria to the touch data, e.g., location and time data, collected during the swipe detection. If it is determined that the touch data, i.e., the touch “pattern,” meets the applied criteria 340, then a signal is output to various applications which are “listening” for a swipe 350. If, on the other hand, the gesture is not recognized as a swipe gesture 345, then the user device awaits the next touch 355.
  • Upon recognition of the swipe gesture, a defined action which is “registered” with the swipe technology attached to the advertisement is performed 360.
  • The registered action may be to close an overlaid advertisement window.
  • The registered action may also involve the initiation of an animation or other executable element.
  • The registered action may be implemented using various forms of software code, such as, for example, HTML-based scripts.
  • Alternatively, the detection of a swipe gesture may be signaled to listening applications, but any further action may be taken by the applications themselves, rather than by the swipe technology attached to the advertisement.
  • For example, an advertisement may be part of displayed content which includes an executable element, e.g., a module written in Adobe Flash, and this element may handle the operation of removing the overlaid advertisement when the swipe gesture is signaled.
  • In such a case, the attached swipe technology awaits the next touch 365 instead of performing an action.
  • Finally, a step may be performed in which the swipe technology is detached from the advertisement and stops its swipe detection functions 370, or, alternatively, the swipe technology may continue swipe detection and await the next touch 375.
  • FIGS. 4A and 4B depict a method for performing swipe gesture recognition on a user device.
  • Swipe gesture recognition begins with a user touching the screen of the user device 402. This touching can be detected and signaled to the swipe gesture recognition technology by the conventional touch detection electronics of the user device.
  • The detection of a touching of the screen may be referred to as a “touch start event” 404.
  • Upon a touch start event, a series of criteria are applied to the underlying touch data.
  • The touch data may include, for example, location and time information regarding touched locations of the screen.
  • The first criterion applied to the touch data may be a determination of whether multiple fingers were used to touch the screen 406. This is followed by a determination of whether multiple-finger touches are allowed for swipe detection 408, which is a parameter which may be set based on defined preferences, e.g., preferences set by the content provider. If multiple-finger touches are not permitted in the particular configuration in question, the process then detaches the touch handlers and timers and awaits the next touch start event.
  • Next, the process determines whether there is a defined “starting element” 410, such as, for example, an overlaid advertisement having a defined area (e.g., a displayed window). If so, then the process determines whether the touch began within the boundaries of the starting element 412. If the touch began on the starting element (or if a starting element is not configured), then the process initiates a “touch move” event listener 414 and a “touch end” event listener 416, which are routines which detect the movement of a finger (or fingers) touching the screen and the end of the touching. These actions may be signaled by the conventional touch screen detection system of the user device. Alternatively, the touch move event may be detected based on processing of underlying touch location and time data.
  • A timer is also initiated. If the elapsed time 418 exceeds a determined time limit 420 without a touch move event being detected, then the touch detection handlers and timers may be detached 422 and the process will await the next touch.
  • If a touch move event is detected 424 before the time limit is exceeded, then a number of criteria are applied to the touch move data, as described in the following paragraphs (a minimal sketch of this listener and timer setup follows below).
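  • A minimal sketch of this listener and timer setup follows, using the standard browser touch events; the handler names and the shape of the recorded touch data are illustrative assumptions:

        // Hypothetical touch start handling: attach "touch move" and "touch end"
        // listeners and start a timer that detaches everything if no movement
        // occurs within the allowed interval.
        var MOVE_TIME_LIMIT_MS = 250; // assumed limit between finger movements
        var moveTimer = null;
        var touchData = [];

        function onTouchStart(event) {
          touchData = [];
          document.addEventListener('touchmove', onTouchMove);
          document.addEventListener('touchend', onTouchEnd);
          moveTimer = setTimeout(detachHandlersAndTimers, MOVE_TIME_LIMIT_MS);
        }

        function onTouchMove(event) {
          clearTimeout(moveTimer);
          var t = event.touches[0];
          // Record location and time data for later analysis.
          touchData.push({ x: t.clientX, y: t.clientY, time: Date.now() });
          moveTimer = setTimeout(detachHandlersAndTimers, MOVE_TIME_LIMIT_MS);
        }

        function onTouchEnd(event) {
          detachHandlersAndTimers();
          // Analyze touchData against the swipe criteria (see FIG. 4B).
        }

        function detachHandlersAndTimers() {
          clearTimeout(moveTimer);
          document.removeEventListener('touchmove', onTouchMove);
          document.removeEventListener('touchend', onTouchEnd);
        }

        document.addEventListener('touchstart', onTouchStart);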
  • The process determines whether any defined elements were touched during the touch move event 426.
  • For example, a window defining an overlaid advertisement may be defined as a “target element.”
  • The process determines whether an element touched during the movement is a target element 428. If the touched element is not a target element, then the process may detach the touch handlers and timers, depending upon a parameter which defines whether the touch (i.e., the swipe gesture) is required to remain on a target element 430. If, on the other hand, the touched element is a target element, then there is no need for further processing in this branch of the process.
  • The process then computes a direction of the touch movement 432 based on the location data and, in particular, the last touch move data. The process then determines whether the computed direction is consistent with prior touch move data 434. In other words, the process determines whether the swipe gesture is continuing in a single direction.
  • The extent to which the movement is allowed to vary from a straight line or a particular direction may be established by setting parameters to desired values, e.g., based on preferences of the content provider or user. It should be noted that the direction criteria applied at this point in the process relate to a set of touch move data with respect to a preceding set of touch move data.
  • The process also determines the distance from the last touch move data 436 and can compute the speed of the movement based on the computed distance 438.
  • This allows various parameters to be applied, such as, for example, parameters which filter out swipe motions which are deemed to be too slow or too fast to be bona fide swipe gestures (a sketch of these per-move computations follows below).
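  • A minimal sketch of these per-move computations follows, assuming each touch move sample carries x/y screen coordinates and a timestamp (the data shape is an assumption):

        // Hypothetical computation of direction, distance, and speed between the
        // last two touch move samples.
        function analyzeMove(prev, curr) {
          var dx = curr.x - prev.x;
          var dy = curr.y - prev.y;
          // Direction: the dominant axis wins; note that screen y grows downward.
          var direction = Math.abs(dx) >= Math.abs(dy)
            ? (dx >= 0 ? 'right' : 'left')
            : (dy >= 0 ? 'down' : 'up');
          var distance = Math.sqrt(dx * dx + dy * dy);    // pixels
          var speed = distance / (curr.time - prev.time); // pixels per millisecond
          return { direction: direction, distance: distance, speed: speed };
        }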
  • Such parameters may be set in advance based on preferences of a content provider, user, or system designer, or any combination thereof.
  • If the criteria are met, the touch move data is recorded 440. More specifically, the touch move data which is recorded may be a screen location along which the swipe gesture is being made by the user.
  • The recording of the touch move data is subject to the “filtering” by the applied criteria. This recording process continues as the user performs the swipe gesture until a touch end event is received 442, which means that the user has removed the user's finger (or fingers) from the touch screen.
  • Upon the touch end event, the process may determine whether an “ending element” was configured 444, which means that a particular element has been defined as an element upon which the swipe gesture must end 446. For example, if the swipe gesture is being made to close an overlaid advertisement, then a parameter may be set which requires the entire swipe to be within the area (e.g., a displayed window) of the advertisement. If the touch end is not on the defined ending element, then the swipe gesture recognition process may terminate (i.e., the touch handlers and timers may be detached and the next touch will be awaited).
  • Otherwise, a series of final criteria are applied to the touch move data, i.e., the data relating to most or all of the swipe gesture.
  • These criteria may include a determination of whether the swipe gesture was performed within a defined time limit 450. There may also be a determination of whether there are too many touch locations outside of the defined target element 452. There may be a determination of whether the swipe gesture covers a defined minimum distance 454 (e.g., whether the swipe gesture was long enough relative to the displayed content).
  • The swipe gesture criteria may also include a determination of whether the swipe gesture was performed in a defined allowed direction 456, e.g., horizontally across the screen.
  • The criteria may also include a determination of whether the line of movement is close enough to a straight line 458 to qualify as a bona fide swipe.
  • If the swipe gesture data meet all of the applied criteria, then the swipe gesture is recognized and the “listener” applications are signaled that a swipe gesture has occurred 460 (a minimal sketch of this final evaluation follows below).
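  • A minimal sketch of this final evaluation follows; the gesture summary fields and option names are assumptions modeled on the parameters described above:

        // Hypothetical end-of-gesture check applying the criteria of FIG. 4B: time
        // limit, touches off target, minimum distance, allowed direction, and
        // straightness of the line of movement.
        function isBonaFideSwipe(gesture, opts) {
          if (gesture.endTime - gesture.startTime > opts.maximumGestureInterval) return false;
          if (gesture.touchesOffTarget > opts.maximumTouchesOffTarget) return false;
          if (gesture.distance < opts.minimumDistance) return false;
          if (opts.allowedDirections.indexOf(gesture.direction) === -1) return false;
          if (gesture.lineDeviation > opts.lineTolerance) return false; // straight-line check
          return true;
        }

        // If all criteria are met, signal the applications "listening" for a swipe.
        // listeners.forEach(function (cb) {
        //   cb({ targetElement: gesture.targetElement, direction: gesture.direction });
        // });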
  • The swipe gesture recognition technology for implementing the algorithms discussed above includes a set of “behaviors” which may be applied in a “stand-alone configuration” as well as bundled as part of a “swipe ad” or “swipe content” package.
  • The term “behaviors” refers to the manner in which the user interface detects and responds to swipe gestures and other inputs initiated by the user via the user interface input mechanism, e.g., a touch screen, as implemented in object-oriented software.
  • The behaviors are exposed, i.e., made available to a programmer, through a script (e.g., JavaScript) application programming interface (API).
  • The programmer may incorporate these behavioral elements into the code which is configured to present the content to the user.
  • The programmer will also include code, e.g., scripts, to handle the various types of detected behavior which can be received via the API.
  • The API is implemented via a set of JavaScript objects which are incorporated into the HTML code used to present content on the user device.
  • The API also includes tags which reference external JavaScript files, which may reside on the hosting server and/or on a controlled public content delivery network (CDN).
  • The external files implement, inter alia, the swipe gesture recognition algorithms discussed above in order to provide swipe gesture detection information via the JavaScript objects.
  • The swipe ad package may be a modular system which includes tags for accessing external script files that define behaviors and actions to be taken in response to detected behaviors.
  • Other functions may also be provided by the external files, such as, for example, functions relating to the manner in which the content is presented on the user device.
  • The bundled functions provided by the swipe ad or swipe content package allow the programmer creating the content to work primarily in HTML, while using JavaScript objects and external JavaScript references to present the content to the user with swipe gesture functionality.
  • Content, such as an advertisement, may be wrapped in HTML tags, e.g., an anchor tag around an image tag. This tagged content may then be accessed by the swipe ad package to be displayed with swipe gesture functionality.
  • The JavaScript objects discussed below support the swipe gesture recognition technology behaviors by providing named sets of well-known, i.e., enumerated, values in logical groupings. The members of these objects may be referenced for use as values when interacting with the swipe gesture behavior API functions.
  • The following object provides enumerated values which define the degree to which a detected gesture must have a linear trajectory in order to be deemed a proper swipe gesture:
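  • The object itself does not survive in this excerpt; the following is a sketch of what it might look like, where the member names Strict and Loose and all numeric tolerance values are assumptions (only the Even member is confirmed by the option descriptions below):

        // Hypothetical SoMo.Swipe.LineDetection object: each well-known value is a
        // tolerance factor used when verifying that a gesture was linear.
        var SoMo = SoMo || {};
        SoMo.Swipe = SoMo.Swipe || {};
        SoMo.Swipe.LineDetection = {
          Strict: 0.10, // assumed: very little deviation from a straight line allowed
          Even:   0.25, // assumed value; the Even member is referenced below
          Loose:  0.50  // assumed: generous deviation allowed
        };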
  • The following object provides well-known values which define a direction in which a detected gesture must be oriented in order to be deemed a proper swipe gesture:
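  • Continuing the sketch above, and assuming string member values (the four primary directions are confirmed elsewhere in this disclosure):

        // Hypothetical SoMo.Swipe.Direction object enumerating the four primary
        // directions in which a swipe gesture may be recognized.
        SoMo.Swipe.Direction = {
          Up:    'up',
          Down:  'down',
          Left:  'left',
          Right: 'right'
        };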
  • Swipe gesture detection functionality is exposed in the API in the form of two functions which work in concert to allow addition and removal of “observers” for the swipe gesture on a given page element, e.g., an overlaid or pop-up image.
  • When adding a swipe gesture observer, a set of options may be specified which allow control and fine-tuning of the detection algorithm.
  • targetElementId (optional) The identifier of the document object model (DOM) element to which swipe gesture detection is to be applied. If not specified, then the current document is assumed.
  • startElementId (optional) The identifier of the DOM element on which the swipe gesture must begin in order to be deemed a proper swipe gesture. If not specified, then no constraint is applied.
  • endElementId (optional) The identifier of the DOM element on which the swipe gesture must end in order to be deemed a proper swipe gesture. If not specified, then no constraint is applied.
  • ignoreSwipeElementIds (optional) An array of DOM element identifiers that should not participate in swipe detection. These elements will not allow the touch events to be processed for swipe detection. Elements are typically specified for this option when primarily intended to receive another gesture, such as an advertisement image being clicked. If not specified, then swipe detection will be applied using normal event processing.
  • allowMultipleFingers (optional) A Boolean value that indicates if a swipe gesture may be performed with multiple fingers. If not specified, then false is assumed.
  • maximumTouchesOffTarget (optional) The total number of finger touches that are allowed to be outside of the target element for the gesture to still be considered a swipe. If not specified, then no constraint is applied.
  • minimumDistanceVertical (optional) The minimum distance, in pixels, that is required for a valid vertical swipe gesture. If not specified, then a distance of 50 pixels is assumed.
  • minimumDistanceHorizontal (optional) The minimum distance, in pixels, that is required for a valid horizontal swipe gesture. If not specified, then a distance of 50 pixels is assumed.
  • maximumMoveInterval (optional) The maximum amount of time, in milliseconds, that may elapse between finger movements in order to record a valid swipe gesture. If not specified, then an interval of 250 milliseconds is assumed.
  • maximumGestureInterval (optional) The maximum amount of time, in milliseconds, that may elapse between the beginning and ending touches of a gesture in order to be considered a swipe. If not specified, then an interval of 1250 milliseconds is assumed.
  • lineDetection (optional) The strictness of line detection that should be applied. This may take the form of a well-known value from the SoMo.Swipe.LineDetection object or a floating point percentage value that represents the tolerance factor to use when verifying that a gesture was linear. If not specified, the well-known LineDetection.Even value is assumed.
  • allowedDirections (optional) An array of directions in which a swipe gesture may be performed in order to be considered valid.
  • The members of this array will take the form of a well-known value from the SoMo.Swipe.Direction object. If not specified, the four primary directions are assumed.
  • swipeCallbacks (optional) An array of functions to be invoked when a swipe gesture meeting the specified options is detected.
  • Each function is expected to have a signature of function(args), in which args will be an object containing the members targetElement and direction, whose values are the DOM element on which the swipe gesture was observed and the direction of the gesture, respectively.
  • The direction will be expressed as a well-known value from the SoMo.Swipe.Direction object (a usage sketch of the observer functions follows below).
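  • A usage sketch of the observer pair follows; the function names addObserver and removeObserver are assumptions, since the text confirms only that two such functions exist:

        // Hypothetical registration of a swipe gesture observer on an overlaid
        // advertisement element, using the options enumerated above.
        var observer = SoMo.Swipe.addObserver({
          targetElementId: 'ABCD',                      // the advertisement element
          allowMultipleFingers: false,
          minimumDistanceHorizontal: 50,                // pixels
          maximumGestureInterval: 1250,                 // milliseconds
          lineDetection: SoMo.Swipe.LineDetection.Even,
          allowedDirections: [SoMo.Swipe.Direction.Left, SoMo.Swipe.Direction.Right],
          swipeCallbacks: [function (args) {
            // args.targetElement: the DOM element on which the swipe was observed
            // args.direction: a well-known SoMo.Swipe.Direction value
            console.log('swipe detected:', args.direction);
          }]
        });

        // Hypothetical removal of the observer once it is no longer needed, e.g.,
        // to "tear down" the listeners after the advertisement has been closed.
        // SoMo.Swipe.removeObserver(observer);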
  • Following is a JavaScript interface definition, or interface contract, which defines the structure for using the JavaScript objects.
  • The definition includes enumerations of certain values to be used as arguments of the objects, e.g., for line detection and direction.
  • minimumDistanceVertical {int} [OPTIONAL] The minimum distance, in pixels, required for a vertical swipe gesture. If not specified, a reasonable default is applied.
  • minimumDistanceHorizontal {int} [OPTIONAL] The minimum distance, in pixels, required for a horizontal swipe gesture. If not specified, a reasonable default is applied.
  • maximumMoveInterval {int} [OPTIONAL] The maximum amount of time, in milliseconds, that may elapse between finger movements. If not specified, a reasonable default is applied.
  • maximumGestureInterval {int} [OPTIONAL] The maximum amount of time, in milliseconds, that performing the gesture is allowed to take.
  • Following is an example of the swipe behaviors reference as it would appear in an HTML page:
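  • A sketch of such a reference follows; the src value is a hypothetical URL, since the actual library location is not given in this excerpt:

        <!-- Hypothetical swipe behaviors reference: the src attribute identifies
             the external JavaScript library on the hosting server or CDN. -->
        <script type="text/javascript" src="//cdn.example.com/somo/swipe.behaviors.js"></script>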
  • Following is an example of a swipe content unit, e.g., an advertisement unit:
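  • A sketch of such a unit follows, combining tagged advertisement content with a script block that attaches the swipe behaviors; the element id, URLs, and the addObserver call carry the same assumptions as above:

        <!-- Hypothetical swipe content unit: an anchor tag around an image tag,
             plus a script block that attaches swipe gesture functionality. -->
        <div id="ABCD">
          <a href="http://www.example.com/advertiser"><img src="ad.png" alt="Advertisement"></a>
        </div>
        <script type="text/javascript">
          SoMo.Swipe.addObserver({
            targetElementId: 'ABCD',
            allowedDirections: [SoMo.Swipe.Direction.Left, SoMo.Swipe.Direction.Right],
            swipeCallbacks: [function (args) {
              // Close the overlaid advertisement when a swipe is recognized.
              document.getElementById('ABCD').style.display = 'none';
            }]
          });
        </script>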

Abstract

A method of manipulation of content provided by a hosting server using swipe gesture recognition on a user device having a touch input display. The method includes storing the content, combined with a swipe gesture recognition module to form a content package, on the hosting server. The swipe gesture recognition module is associated with at least one displayable content element of the content. The method further includes receiving, at the hosting server, a request for the content package from the user device and transmitting the content package from the hosting server to the user device for display by an application running on the user device. A defined action is applied to the at least one displayable content element if it is determined that a swipe gesture has occurred on the at least one displayable content element.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Patent Application No. 61/730,899, filed on Nov. 28, 2012, entitled “CONTENT MANIPULATION USING SWIPE GESTURE RECOGNITION TECHNOLOGY,” which is hereby incorporated by reference in its entirety.
  • FIELD OF THE INVENTION
  • The disclosed embodiments are directed to content manipulation using swipe gesture recognition technology. More specifically, the disclosed embodiments are directed to systems and methods for manipulation of Internet-based content provided by a hosting server using swipe gesture recognition technology on a user device.
  • BACKGROUND OF THE INVENTION
  • Touchscreen-based user computing devices, such as smartphones, tablets, e-readers, touch-enabled laptops, and touch-enabled desktop computers, are commonly used to request content from servers, via the Internet. Such content may include an advertisement, or other type of display window, which is overlaid on the displayed content until a particular portion of the advertisement is touched by the user. The portion which must be touched to close the advertisement may be a small “X” in a corner of the advertisement.
  • On relatively small screens, particularly smartphone screens, it is often difficult for the user to accurately touch the designated portion of the screen to close the advertisement. This may result in the user unintentionally touching an active portion of the advertisement instead of the close “button,” which may, in turn, result in an undesired activation of an animation or even an undesired navigation to a different webpage. This can be a very frustrating and annoying experience for the user.
  • Such undesired activations of an advertiser's webpage can increase the advertiser's costs, because the advertisement may be paid for based on a particular cost-per-click (CPC). Therefore, an advertiser purchasing advertisements on mobile devices on a CPC basis may find that they are getting a very high click-through rate (CTR) but a low return on investment (ROI) due to accidental click-throughs. This may annoy current and potential customers and may result in a negative brand perception, which is a significant concern for large brands.
  • SUMMARY OF THE INVENTION
  • The disclosed embodiments provide a system, a method, and processor instructions for implementing swipe gesture recognition technology to manipulate Internet-based content provided by a hosting server to a user device.
  • In the disclosed system and method, a user may have a computing device, such as, for example, a mobile device. The mobile device could be, e.g., a mobile phone, a tablet, or a household appliance which is running an embedded piece of mobile software, such as a mobile browser. The browser may make a request for a piece of content. This content is typically hosted on a web page and is accessed via the Internet. In such a case, a web page is actually making a request from the device to a server that hosts the content. The content may include an advertisement and the behaviors that apply to that advertisement. When the server receives a content request, it provides a “package” of the content, the advertisement, and certain related information, such as swipe gesture behaviors and technology packages that are needed to apply this related information to the requested content. Different types of advertisements may be bundled in different ways, and there may be different elements of the swipe gesture recognition technology as it gets assembled with a particular piece of content. Once those decisions are made, the server sends the bundle, which contains at least the swipe gesture recognition technology pieces and the advertisement.
  • The advertisement may come from a server other than the content server (i.e., the hosting server). The content server may not have access to the advertisement itself. Rather, the content server may just receive enough information about the advertisement to allow the content server to determine how the swipe gesture recognition technology will be applied to the advertisement on the user's device.
  • In certain embodiments, the information relating to the advertisement which is received by the content server may be an identifier of the particular content package to which it is looking to apply the swipe technology, e.g., a uniform resource locator (URL). The information may also include additional context data that would indicate how the advertisement is to be displayed on the user device, e.g., whether the advertisement is full screen or a window bar at the bottom of the page. The information received by the content server may include particular details of the type of presentation that is desired by the advertiser and how the advertisement is meant to react to detected swipe gestures. Alternatively, the advertiser may want only an indication that a swipe has occurred and may handle the response to the swipe in the software operating on the user device.
  • The information received by the content server relating to the advertisement provides a context in which the advertisement is to be displayed on the user device. The information describes the particular behaviors which are necessary from the swipe gesture recognition technology, whether the swipe technology needs to include presentation components, and, if so, which presentation components are necessary.
  • Once the package containing the content, advertising information, and swipe gesture recognition technology (e.g., in the form of a module of code) is assembled, it is delivered to the user device. The package may be, for example, in the form of client-type script, and this script makes decisions based on the information that is available at the user device level. Once the swipe gesture recognition technology is on the user device, it already knows where the advertising content is located and details regarding how the presentation is to be displayed on the user device. The display behavior is, thus, based on the package. Among other things, the swipe gesture recognition technology “crawls” through parts of the content page and finds the advertisement that it is meant to present/control. If the package needs to apply presentation components, it will do so at this point. The swipe technology will also determine where it needs to start listening for the user's gestures in order to interpret them and decide whether the user meant to perform a swipe gesture or whether the user meant to perform some other type of operation, such as a click, drag, or other gesture.
  • After the preparation described above has been done and all of the related decisions have been made, then the swipe gesture recognition technology is in a passive listening mode in which it waits for the user to touch the screen of the user device. Once the user touches the screen, if, for example, the touch is within the portions of the screen to which swipe gesture behavior is to be applied, then the system will attempt to interpret whether the touching should be deemed to be a swipe gesture.
  • Thus, when the user touches the screen, the swipe gesture recognition technology “wakes up” and starts trying to figure out what the user's intent is with respect to the touching action. The technology does this by applying behavioral analysis to the way the user's fingers move, e.g., how many fingers are in place, in which direction they are moving, for what length of time they have been moving, and various other criteria. After the user touches the screen, the swipe technology records the gestures and the interaction that the user has with the screen. Once the user removes his finger from the screen, the swipe technology stops collecting the touch data and the analysis it has been doing in the background and starts to make decisions based on the touch data, e.g., touch location and time data.
  • The swipe gesture recognition technology analyzes the touching actions which the user performs with his finger on the screen and determines whether these actions constitute a swipe gesture of the sort the system is configured to look for and react to. If the system determines that the touching actions do not amount to the swipe gesture it is seeking, then the system may go back into a passive “listening” mode in which it awaits further touching actions.
  • If the swipe gesture recognition technology detects the particular swipe gestures for which it is “listening,” then it will take different actions depending on the particular nature of the package installed on the user device. In a general case, if the system detects a bona fide swipe gesture, then it will signal this to any application or entity which happens to be listening to it. In certain embodiments, the swipe gesture recognition technology may also implement the user interface on the user device. In such a case, the system takes specific actions upon detection of the swipe gesture, such as, for example, making an overlaid displayable content element (e.g., an advertisement) disappear, which may be implemented by having the advertisement slide off the screen in an animated fashion. In embodiments in which the package contains the swipe gesture recognition technology without a display handling portion, the portion of the package which asked for the swipe gesture recognition to be applied (e.g., a portion coded by an entity responsible for the advertising content) is then responsible for taking the appropriate actions based on a received indication that a swipe has occurred.
• The disclosed embodiments provide systems and methods for listening to touch screen gestures, determining whether a detected gesture is appropriate, and then making sure that the system alerts the applications which need to know, so that, for example, an application can apply some type of action to a displayed advertisement. The application may, for example, close an overlaid advertisement, “swipe through” the advertisement to a different site identified by the advertisement, or cause the advertisement to initiate an animation or other executable element. For example, if the advertisement is for a soft drink, then the swipe gesture may cause an animation to execute which shows the beverage being poured into a glass. Thus, one aspect of the disclosed embodiments is the ability to process all of the touchscreen data and to interpret the user's intended interaction.
  • In certain embodiments, after detection of the user touching the screen, the system starts “listening” for a swipe gesture. The system may determine whether the touch was on a defined area, e.g., on an advertisement, and may also determine whether the system is responsible for reacting to the swipe gesture or merely reporting it. If the touch was not on the advertisement, the system may stop listening and wait for the next detected touch. If the touch was on the advertisement, then the system will capture touch data, e.g., location and time, until the user stops touching the screen. At that point, the system determines whether the pattern of the user's interaction should be recognized as a swipe gesture. If not, the system goes back into a dormant state and waits for the next touch to occur. If it is determined, for example, that the gesture was long enough, was in the proper direction, did not deviate from its axis by too much, and had the proper magnitude, then the gesture would be deemed to be a bona fide swipe gesture. Based upon this determination, certain applications may be alerted and/or certain actions may be taken. Depending on the type of package that was delivered to the user device from the hosting server, the action taken might be the hiding of the advertisement or might be only the alerting of applications and then waiting for the next touch event.
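  • The following condensed sketch shows how such criteria might be applied to recorded samples for a horizontal swipe; the option names (maxGestureMs, minDistancePx, axisTolerance) are assumptions for illustration, not values taken from the patent:
    function isBonaFideSwipe(samples, opts) {
      if (samples.length < 2) { return false; }
      var first = samples[0], last = samples[samples.length - 1];
      var dx = last.x - first.x, dy = last.y - first.y;
      if (last.t - first.t > opts.maxGestureMs) { return false; }   // took too long
      if (Math.abs(dx) < opts.minDistancePx) { return false; }      // not long enough
      if (Math.abs(dy) > Math.abs(dx) * opts.axisTolerance) {
        return false;                                               // drifted off axis
      }
      return true;
    }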
• In certain embodiments, swipe gesture recognition technology is attached to an advertisement, for example, and the resulting “package” is delivered to a user device. The package may include the script components of the content provider (i.e., the entity responsible for the advertising content) as part of its content. When the content provider's script is executed on the user's device, the application or other entity using it has already set up certain information to let the script know its context. There may be, for example, an element called “ABCD,” which is the advertisement to which the swipe gesture recognition technology is to be attached. Inside element ABCD there may be a sub-element which has arrows (or other symbols) to indicate to the user that a swipe gesture is needed and that identify the element as the specific area that the system should monitor for swipe gestures. The specific area may turn a different color or provide some other sort of indication if the user touches the area instead of swiping the area. Certain embodiments are configured to detect swipe gestures in four different directions. The system will make sure that the swipe gesture is going left, right, up, or down. A diagonal swipe, on the other hand, would not be recognized.
• The “package” in question for the core swipe gesture recognition technology may be an HTML file having a combination of, e.g., text, image, and video content with a script block (e.g., JavaScript) embedded therein to act as an application programming interface (API). The swipe gesture behaviors may be defined in an external JavaScript library which is referenced by the advertising content using a standard <script> tag (with an “src” attribute which identifies the external library). The behaviors are applied to any HTML content by a block of script in the content which instructs the swipe library as to which parts of the page the behaviors should attach themselves, which options should be used for swipe gesture detection, and what should be done if a swipe gesture is detected.
• In the case of a swipe advertisement which is configured to provide the user experience as well, the process works the same way. The server hosting the advertising content may return a block of HTML content which contains the swipe gesture recognition “user experience.” This HTML content is combined with the advertisement content, and it requests and configures the swipe gesture behaviors in a manner similar to that discussed above (i.e., similar to the case in which the swipe advertisement is not configured to provide the user experience itself). Each of these user experience-providing swipe advertisements may be a previously-assembled and static block of HTML code, rather than something that reacts dynamically to an incoming request.
• In certain embodiments, content, e.g., a web page, may have items across which it would like to track swipe gestures. It will make a request to the swipe script to be notified anytime the user performs a swipe gesture on the particular area. As the user is touching the screen, the swipe gesture recognition technology is in the background making decisions on the touch patterns to determine whether the user intended to make a swipe gesture. When a swipe gesture is detected, the system alerts any entity, e.g., an application or webpage HTML code, which has requested to be notified. In some embodiments, the entity requesting notification may be a third party which simply uses the detected behaviors itself. In some embodiments, the entity requesting notification is a swipe advertisement package which provides some behavior support and user interface (UI) support for third-party applications.
• In certain embodiments, all entities may be notified when a swipe gesture is detected. If a particular entity is one which has registered with the system and requested a comprehensive package, then the system will take certain actions in response to a detected swipe gesture, such as, for example, closing or clicking through an advertisement. The system may perform whatever action is necessary upon detection of a swipe gesture and then “tear down” the listeners, i.e., touch detection monitors, because they are no longer necessary. This is done to avoid draining the user device's resources or slowing down the session.
• In other embodiments, the system may take the appropriate actions and continue to listen for swipe gestures. For example, there may be advertisements in which the swipe gesture is used to initiate an animation, such as an advertisement for a soft drink in which the user swipes across a soda bottle and the animation then shows the soft drink being poured out into a glass and releasing bubbles. The listeners may be left in place after the animation is initiated so that the user can initiate the animation repeatedly.
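  • Illustratively, a swipe callback for such an advertisement might replay the animation and leave the observer registered; the function names here are hypothetical:
    function onSwipeDetected(args) {
      playPourAnimation(args.targetElement);  // hypothetical animation routine
      // No observer removal here, so the gesture can trigger the animation again.
    }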
  • In one aspect, the disclosed invention provides a method for manipulation of content provided by a hosting server using swipe gesture recognition on a user device having a touch input display. The method includes storing the content, combined with a swipe gesture recognition module to form a content package, on the hosting server, wherein the swipe gesture recognition module is associated with at least one displayable content element of the content. The method further includes receiving, at the hosting server, a request for the content package from the user device and transmitting the content package from the hosting server to the user device for display by an application running on the user device.
  • Further in regard to this aspect of the disclosed invention, the swipe gesture recognition module is configured to perform swipe gesture recognition when the at least one displayable content element is displayed on the user device. The swipe gesture recognition includes receiving touch input data from the touch input display of the user device. The swipe gesture recognition further includes accessing, using the swipe gesture recognition module, a swipe gesture determination module stored on the hosting server or a second server to analyze the touch input data to determine whether a swipe gesture has occurred on the at least one displayable content element. The swipe gesture recognition further includes applying a defined action to the at least one displayable content element if it is determined that a swipe gesture has occurred on the at least one displayable content element.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects and advantages of the disclosed subject matter will be apparent upon consideration of the following detailed description, taken in conjunction with accompanying drawings, in which like reference characters refer to like parts throughout, and in which:
• FIG. 1 is a flow chart of a method performed on a user device and a hosting server, the user device implementing swipe gesture recognition technology to manipulate Internet-based content provided by the hosting server, in accordance with the disclosed invention.
  • FIG. 2 is a flow chart of a method for performing swipe gesture recognition and content manipulation on a user device.
  • FIG. 3 is a flow chart of a method for performing swipe gesture recognition and content manipulation on a user device for an advertisement.
  • FIGS. 4A and 4B are a flow chart of a method for performing swipe gesture recognition on a user device.
  • DETAILED DESCRIPTION
• FIG. 1 depicts a method for manipulating displayed content, which is performed on a user device and a hosting server. The user device implements swipe gesture recognition technology to manipulate the content provided by the hosting server and displayed on the user device. The technology may include swipe gesture recognition software code in various forms, such as, for example, HTML-based scripts, compiled modules, plug-ins, applets, application programming interfaces (APIs), etc.
  • The user device may be any type of user computing device, such as, for example, a mobile device (e.g., smartphone, tablet, etc.) or a personal computer or laptop with a touchscreen or trackpad-type element which allows a user to make swipe gestures. The user device initiates a content request 105 to a hosting server, e.g., via the Internet. The content request may be, for example, in the form of a uniform resource locator (URL) directed to a particular webpage.
  • The hosting server, upon receiving the content request, prepares the requested content 110 for transmission to the user device. The preparation of the content includes conventional aspects, such as the composition of a webpage using hypertext markup language (HTML) and plug-ins, e.g., scripts or other executable elements. The prepared content may also include advertising content, which may include content retrieved from other servers. The preparation of the content also includes the embedding of a swipe technology configuration 115 which establishes how the content will react to swipe gestures performed on the user device, as explained in further detail below. The content is then transmitted to the user device in the form of a content response 120.
  • The user device receives the content response sent by the hosting server and attaches the swipe technology 130 to an element of the content, such as, for example, an advertisement which overlays the content when it is displayed on the user device. The user device displays the content, e.g., on a touch screen, and awaits detection of a user touching the screen 135. Once a touch is detected, the user device begins to perform gesture input 140, i.e., touch data input and accumulation, which provides the basis for ascertaining whether the detected touch is in fact a swipe gesture by the user.
  • After the user stops touching the screen 145, the user device begins to interpret the detected gestures and apply the resulting actions 150. For example, if the touching of the screen is interpreted as a bona fide swipe gesture, then the overlaid advertising content may be removed from the display, e.g., the advertisement may be “swept” off of the content on which it is overlaid. Various other types of action may result from the swipe gesture. For example, the swipe gesture may initiate an animation or other executable aspect of the displayed content. After the resulting action, the user may then interact with the underlying, i.e., “non-swipe,” content 155. If, on the other hand, the touching of the screen is not interpreted as a bona fide swipe gesture, then the user device will again await a user touching the screen 135.
• FIG. 2 depicts the method for performing swipe gesture recognition and content manipulation from the standpoint of the user device. As discussed above, the user device initiates a content request 205, which is transmitted to the hosting server. The user device then awaits a content response 210 from the hosting server. After the content is received, it is displayed with embedded swipe gesture recognition technology 215. The remainder of the steps performed by the user device to detect and recognize bona fide swipe gestures are as described above with respect to FIG. 1. These steps include: the user touching the screen 220, beginning gesture input 225, the user ceasing to touch the screen 230, interpretation of the gestures and application of an action 235, and user interaction with non-swipe content 240.
• FIG. 3 depicts the method for performing swipe gesture recognition and content manipulation on a user device in the particular case of an advertisement overlaid on other displayed content, e.g., a webpage. As above, the user device receives content from a hosting server to be displayed and attaches swipe gesture recognition to the content 305. The user device then awaits detection of a user touching the device screen 310. When a touch is detected, the user device begins to perform gesture input 315, i.e., touch data input and accumulation.
  • As part of the gesture input 315, the user device determines whether the touch was on the advertisement 320. In other words, it is determined whether the touch is within a defined start element, e.g., within an area defined by an advertisement “window” or other displayed graphical element. If so, then the user device continuously captures data regarding the screen touch 330, e.g., location and time data, until the user stops touching the screen. If, on the other hand, the touch is not determined to be on the advertisement 325, then the swipe gesture recognition is stopped 335 and the next user touch is awaited.
• After the swipe gesture is completed, i.e., after the user stops touching the device 330, an analysis is performed to determine whether the detected gesture is a bona fide swipe gesture. The analysis, as discussed in further detail below, involves the application of certain criteria to the touch data, e.g., location and time data, collected during the swipe detection. If it is determined that the touch data, i.e., the touch “pattern,” meets the applied criteria 340, then a signal is output to various applications which are “listening” for a swipe 350. If, on the other hand, the gesture is not recognized as a swipe gesture 345, then the user device awaits the next touch 355.
  • If a bona fide swipe gesture is detected, then a defined action which is “registered” with the swipe technology attached to the advertisement is performed 360. For example, the registered action may be to close an overlaid advertisement window. The registered action may also involve the initiation of an animation or other executable element. The registered action may be implemented using various forms of software code, such as, for example, HTML-based scripts.
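  • The following is a minimal sketch of such a registered action, closing an overlaid advertisement with a slide-off CSS transition; it is an illustration under assumed names, not code from the patent:
    function closeAdAction(ad) {
      ad.style.transition = 'transform 0.3s ease-out';
      ad.style.transform = 'translateX(100%)';   // slide the overlay off the right edge
      ad.addEventListener('transitionend', function () {
        ad.parentNode.removeChild(ad);           // remove the overlay once hidden
      });
    }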
  • In certain embodiments, the detection of a swipe gesture may be signaled to listening applications, but any further action may be taken by the applications themselves, rather than by the swipe technology attached to the advertisement. For example, an advertisement may be part of displayed content which includes an executable element, e.g., a module written in Adobe Flash, and this element may handle the operation of removing the overlaid advertisement when the swipe gesture is signaled. In such a case, after the listeners are signaled that a swipe has occurred 350, then the attached swipe technology awaits the next touch 365 instead of performing an action.
  • If an action is applied by the attached swipe technology 360 following the detection of a swipe 350, then a step may be performed in which the swipe technology is detached from the advertisement and stops its swipe detection functions 370, or, alternatively, the swipe technology may continue swipe detection and await the next touch 375.
  • FIGS. 4A and 4B depict a method for performing swipe gesture recognition on a user device. As discussed above, swipe gesture recognition begins with a user touching the screen of the user device 402. This touching can be detected and signaled to the swipe gesture recognition technology by the conventional touch detection electronics of the user device. The detection of a touching of the screen may be referred to as a “touch start event” 404. Upon detection of the touch start event, a series of criteria are applied to the underlying touch data. The touch data may include, for example, location and time information regarding touched locations of the screen.
• The first criterion applied to the touch data may be a determination of whether multiple fingers were used to touch the screen 406. This is followed by a determination of whether multiple-finger touches are allowed for swipe detection 408, which is a parameter that may be set based on defined preferences, e.g., preferences set by the content provider. If multiple-finger touches are not permitted in the particular configuration in question, the process then detaches the touch handlers and timers and awaits the next touch start event.
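  • A sketch of this gate, assuming standard DOM touch events and a hypothetical cleanup routine, might be:
    function gateMultipleFingers(e, allowMultipleFingers) {
      if (e.touches.length > 1 && !allowMultipleFingers) {
        detachHandlersAndTimers();  // hypothetical cleanup of listeners and timers
        return false;               // do not treat this touch as a candidate swipe
      }
      return true;
    }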
• If multiple-finger swipe gestures are allowed, then the process determines whether there is a defined “starting element” 410, such as, for example, an overlaid advertisement having a defined area (e.g., a displayed window). If so, then the process determines whether the touch began within the boundaries of the starting element 412. If the touch began on the starting element (or if a starting element is not configured), then the process attaches a “touch move” event listener 414 and a “touch end” event listener 416, which are routines that detect the movement of a finger (or fingers) touching the screen and the end of the touching. These actions may be signaled by the conventional touch screen detection system of the user device. Alternatively, the touch move event may be detected based on processing of underlying touch location and time data.
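  • A boundary test of this sort might be sketched as follows, using the standard getBoundingClientRect call; the function name is illustrative:
    function touchBeganOnElement(touch, el) {
      var r = el.getBoundingClientRect();
      return touch.clientX >= r.left && touch.clientX <= r.right &&
             touch.clientY >= r.top && touch.clientY <= r.bottom;
    }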
  • After a touch start event is detected and the touch move and touch end listeners are attached, then a timer is initiated. If the elapsed time 418 exceeds a determined time limit 420 without a touch move event being detected, then the touch detection handlers and timers may be detached 422 and the process will await the next touch.
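  • One plausible sketch of this watchdog, armed on the touch start event and re-armed on every touch move event, is the following; detachHandlersAndTimers is again a hypothetical cleanup routine:
    var moveTimer = null;
    function armMoveTimer(maximumMoveInterval) {
      clearTimeout(moveTimer);
      moveTimer = setTimeout(function () {
        detachHandlersAndTimers();  // no movement arrived in time; abandon the gesture
      }, maximumMoveInterval);
    }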
  • If, on the other hand, a touch move event is detected 424 before the time limit is exceeded, then a number of criteria are applied to the touch move data, as described in the following paragraphs.
  • The process determines whether any defined elements were touched during the touch move event 426. For example, a window defining an overlaid advertisement may be defined as a “target element.” The process then determines whether an element touched during the movement is a target element 428. If the touched element is not a target element, then the process may detach the touch handlers and timers depending upon a parameter which defines whether the touch (i.e., the swipe gesture) is required to remain on a target element 430. If, on the other hand, the touched element is a target element, then there is no need for further processing in this branch of the process.
  • The process computes a direction of the touch movement 432 based on the location data and, in particular, the last touch move data. The process then determines whether the computed direction is consistent with prior touch move data 434. In other words, the process determines whether the swipe gesture is continuing in a single direction. The extent to which the movement is allowed to vary from a straight line or a particular direction (e.g., horizontally across the screen) may be established by setting parameters to desired values, e.g., based on preferences of the content provider or user. It should be noted that the direction criteria applied at this point in the process relates to a set of touch move data with respect to a preceding set of touch move data. It is used to filter out movements with directional changes from one data set to the next, e.g., one touch location to the next, which are too significant to allow the processing of the particular movement to continue. A further set of criteria may be applied to the swipe gesture as a whole, as discussed below.
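  • By way of illustration, a per-sample direction check might be computed from consecutive samples as follows; this is a sketch, with direction names given as plain strings for clarity (the actual API uses the SoMo.Swipe.Direction values discussed later):
    function sampleDirection(prev, curr) {
      var dx = curr.x - prev.x, dy = curr.y - prev.y;
      if (Math.abs(dx) >= Math.abs(dy)) {
        return dx >= 0 ? 'Right' : 'Left';  // horizontal movement dominates
      }
      return dy >= 0 ? 'Down' : 'Up';       // vertical movement dominates
    }
    function directionIsConsistent(prev, curr, establishedDirection) {
      return sampleDirection(prev, curr) === establishedDirection;
    }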
  • The process also determines the distance from the last touch move data 436 and can compute the speed of the movement based on the computed distance 438. This allows various parameters to be applied, such as, for example, parameters which filter out swipe motions which are deemed to be too slow or too fast to be bona fide swipe gestures. Such parameters, as with all of the parameters discussed herein, may be set in advance based on preferences of a content provider, user, or system designer, or any combination thereof.
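  • The distance and speed computations might be sketched as follows, assuming the {x, y, t} samples described above:
    function sampleDistance(prev, curr) {
      var dx = curr.x - prev.x, dy = curr.y - prev.y;
      return Math.sqrt(dx * dx + dy * dy);                  // distance in pixels
    }
    function sampleSpeed(prev, curr) {
      var dt = curr.t - prev.t;                             // elapsed milliseconds
      return dt > 0 ? sampleDistance(prev, curr) / dt : 0;  // pixels per millisecond
    }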
  • After the application of the various criteria discussed above to the touch move data, the touch move data is recorded 440. More specifically, the touch move data which is recorded may be a screen location along which the swipe gesture is being made by the user. The recording of the touch move data, as discussed above, is subject to the “filtering” by the applied criteria. This recording process continues as the user performs the swipe gesture until a touch end event is received 442, which means that the user has removed the user's finger (or fingers) from the touch screen.
• After the touch end event is received, the process may determine whether an “ending element” was configured 444, which means that a particular element has been defined as an element upon which the swipe gesture must end 446. For example, if the swipe gesture is being made to close an overlaid advertisement, then a parameter may be set which requires the entire swipe to be within the area (e.g., a displayed window) of the advertisement. If the touch end is not on the defined ending element, then the swipe gesture recognition process may terminate (i.e., the touch handlers and timers may be detached and the next touch will be awaited).
  • After the ending element criteria are applied, the touch move data, i.e., the data relating to most or all of the swipe gesture, are then interpreted 448 to determine whether the swipe gesture should be deemed bona fide. These criteria may include a determination of whether the swipe gesture was performed within a defined time limit 450. There may also be a determination of whether there are too many touch locations outside of the defined target element 452. There may be a determination of whether the swipe gesture covers a defined minimum distance 454 (e.g., whether the swipe gesture was long enough relative to the displayed content).
• The swipe gesture criteria may also include a determination of whether the swipe gesture was performed in a defined allowed direction 456, e.g., horizontally across the screen. The criteria may also include a determination of whether the line of movement is close enough to a straight line 458 to qualify as a bona fide swipe.
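  • A simple linearity test of this kind, sketched here for a horizontal swipe with a floating point tolerance (an assumption consistent with the lineDetection option described below), might be:
    function isCloseToStraightLine(samples, tolerance) {
      var first = samples[0], last = samples[samples.length - 1];
      var travel = Math.abs(last.x - first.x);  // horizontal case assumed
      for (var i = 0; i < samples.length; i++) {
        if (Math.abs(samples[i].y - first.y) > travel * tolerance) {
          return false;                         // drifted too far off the axis
        }
      }
      return true;
    }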
• If the swipe gesture data meet all of the applied criteria, then the swipe gesture is recognized and the “listener” applications are signaled that a swipe gesture has occurred 460.
  • The swipe gesture recognition technology (or, “swipe technology”) for implementing the algorithms discussed above includes a set of “behaviors” which may be applied in a “stand-alone configuration” as well as bundled as part of a “swipe ad” or “swipe content” package. The term “behaviors” refers to the manner in which the user interface detects and responds to swipe gestures and other inputs initiated by the user via the user interface input mechanism, e.g., a touch screen, as implemented in object-oriented software.
• In the stand-alone swipe technology configuration, behaviors are exposed, i.e., made available to a programmer, through a script (e.g., JavaScript) application programming interface (API). This allows the behaviors to be configured to interact with any elements of a web page, such as, for example, overlaid or pop-up images to be displayed. The programmer may incorporate these behavioral elements into the code which is configured to present the content to the user. In such a case, the programmer will also include code, e.g., scripts, to handle the various types of detected behavior which can be received via the API.
  • The API is implemented via a set of JavaScript objects which are incorporated into the HTML code used to present content on the user device. The API also includes tags which reference external JavaScript files, which may reside on the hosting server and/or on a controlled public content delivery network (CDN). The external files implement, inter alia, the swipe gesture recognition algorithms discussed above in order to provide swipe gesture detection information via the JavaScript objects.
• In the bundled configuration, both the behavioral elements and the detected behavior handling are bundled as part of a swipe ad or swipe content package. In this configuration, the swipe ad package may be a modular system which includes tags for accessing external script files that define behaviors and actions to be taken in response to detected behaviors. Other functions may also be provided by the external files, such as, for example, functions relating to the manner in which the content is presented on the user device.
  • The bundled functions provided by the swipe ad or swipe content package allow the programmer creating the content to work primarily in HTML, while using JavaScript objects and external JavaScript references to present the content to the user with swipe gesture functionality. For example, content, such as an advertisement, may be included in an HTML file by using HTML tags (e.g., an anchor tag around an image tag). This tagged content may then be accessed by the swipe ad package to be displayed with swipe gesture functionality.
  • The JavaScript objects discussed below support the swipe gesture recognition technology behaviors by providing named sets of well-known, i.e., enumerated, values in logical groupings. The members of these objects may be referenced for use as values when interacting with the swipe gesture behavior API functions.
  • The following object provides enumerated values which define a degree to which a detected gesture must have a linear trajectory in order to be deemed a proper swipe gesture:
  • SoMo.Swipe.LineDetection
  • Strict
  • Even
  • Loose
  • The following object provides well-known values which define a direction in which a detected gesture must be oriented in order to be deemed a proper swipe gesture:
  • SoMo.Swipe.Direction
  • Up
  • Down
  • Left
  • Right
• Swipe gesture detection functionality is exposed in the API in the form of two functions which work in concert to allow addition and removal of “observers” for the swipe gesture on a given page element, e.g., an overlaid or pop-up image. When adding the swipe gesture observer, a set of options may be specified which allow control and fine-tuning of the detection algorithm.
  • The usage of the observer-adding function is as follows:
  • SoMo.Swipe.addSwipeObserver
  • targetElementId—(optional) The identifier of the document object model (DOM) element to which swipe gesture detection is to be applied. If not specified, then the current document is assumed.
  • startElementId—(optional) The identifier of the DOM element on which the swipe gesture must begin in order to be deemed a proper swipe gesture. If not specified, then no constraint is applied.
  • endElementId—(optional) The identifier of the DOM element on which the swipe gesture must end in order to be deemed a proper swipe gesture. If not specified, then no constraint is applied.
• ignoreSwipeElementIds—(optional) An array of DOM element identifiers that should not participate in swipe detection. These elements will not allow the touch events to be processed for swipe detection. Elements are typically specified for this option when they are primarily intended to receive another gesture, such as an advertisement image being clicked. If not specified, then swipe detection will be applied using normal event processing.
  • allowMultipleFingers—(optional) A Boolean value that indicates if a swipe gesture may be performed with multiple fingers. If not specified, then false is assumed.
  • maximumTouchesOffTarget—(optional) The total number of finger touches that are allowed to be outside of the target element for the gesture to still be considered a swipe. If not specified, then no constraint is applied.
• minimumDistanceVertical—(optional) The minimum distance, in pixels, that is required for a valid vertical swipe gesture. If not specified, then a distance of 50 pixels is assumed.
• minimumDistanceHorizontal—(optional) The minimum distance, in pixels, that is required for a valid horizontal swipe gesture. If not specified, then a distance of 50 pixels is assumed.
• maximumMoveInterval—(optional) The maximum amount of time, in milliseconds, that may elapse between finger movements in order to record a valid swipe gesture. If not specified, then an interval of 250 milliseconds is assumed.
  • maximumGestureInterval—(optional) The maximum amount of time, in milliseconds, that may elapse between the beginning and ending touches of a gesture in order to be considered a swipe. If not specified, then an interval of 1250 milliseconds is assumed.
• lineDetection—(optional) The strictness of line detection that should be applied. This may take the form of a well-known value from the SoMo.Swipe.LineDetection object or a floating point percentage value that represents the tolerance factor to use when verifying that a gesture was linear. If not specified, the well-known LineDetection.Even value is assumed.
  • allowedDirections—(optional) An array of directions in which a swipe gesture may be performed in order to be considered valid. The members of this array will take the form of a well-known value from the SoMo.Swipe.Direction object. If not specified, the four primary directions are assumed.
• swipeCallbacks—(optional) An array of functions to be invoked when a swipe gesture meeting the specified options is detected. Each function is expected to have a signature of function(args), in which args will be an object containing the members targetElement and direction, whose values are the DOM element on which the swipe gesture was observed and the direction of the gesture, respectively. The direction will be expressed as a well-known value from the SoMo.Swipe.Direction object.
  • Returns:
  • An object that represents the unique handle assigned to the swipe observer and the set of options being applied to gesture detection. This object must be passed to the removeSwipeObserver function in order to properly stop observing swipe gestures.
• The following are examples of the use of the SoMo.Swipe.addSwipeObserver function:
• var handle = SoMo.Swipe.addSwipeObserver(
    {
      targetElementId : 'myAdContainerId',
      ignoreSwipeElementIds : ['myAdImageId'],
      lineDetection : SoMo.Swipe.LineDetection.Loose,
      swipeCallbacks : [function(args) { alert('swipe'); }]
    });
  
  var handle = SoMo.Swipe.addSwipeObserver(
    {
      targetElementId : 'santaImageId',
      startElementId : 'santaHatId',
      endElementId : 'santaBellyId',
      ignoreSwipeElementIds : ['myAdImageId', 'someOtherId'],
      allowMultipleFingers : true,
      maximumTouchesOffTarget : 5,
      minimumDistanceVertical : 100,
      minimumDistanceHorizontal : 65,
      maximumMoveInterval : 200,
      maximumGestureInterval : 1100,
      lineDetection : SoMo.Swipe.LineDetection.Loose,
      allowedDirections : [SoMo.Swipe.Direction.Up],
      swipeCallbacks : [handleSwipe]
    });
  • The usage of the observer-removing function is as follows:
  • removeSwipeObserver
• Arguments: The handle object returned when addSwipeObserver was called. For example:
  • var handle = SoMo.Swipe.addSwipeObserver(options);
    SoMo.Swipe.removeSwipeObserver(handle);
• The following is a restatement of the object definitions given above in the form of a JavaScript interface definition, or interface contract, which defines the structure for using the JavaScript objects. The definition includes enumerations of certain values to be used as arguments of the objects, e.g., for line detection and direction.
• var SoMo = SoMo || { };
  
  // Swipe Behaviors
  
  /**
   * A static class providing the public interface for swipe ad functionality.
   * @class
   */
  SoMo.Swipe =
  {
    /**
     * Adds an observer to be alerted when a swipe gesture is detected.
     * @member Swipe
     *
     * @param options {object} A set of options for controlling swipe detection
     * behavior. The allowed options are:
     *   targetElementId           {string}    [OPTIONAL] Id of the DOM element that
     *     swipe detection is applied to. If not specified, the screen is used
     *   startElementId            {string}    [OPTIONAL] Id of the DOM element that
     *     the swipe gesture must begin on. If not specified, no constraint is applied
     *   endElementId              {string}    [OPTIONAL] Id of the DOM element that
     *     the swipe gesture must end on. If not specified, no constraint is applied
     *   ignoreSwipeElementIds     {array}     [OPTIONAL] A set of DOM element ids that
     *     should not participate in swipe detection. These elements will not allow the
     *     touch events to bubble. If not specified, swipe detection will be applied
     *     using normal event bubbling
     *   allowMultipleFingers      {boolean}   [OPTIONAL] True if gestures with multiple
     *     fingers are eligible to be considered swipe gestures. If not specified, false
     *     is assumed
     *   maximumTouchesOffTarget   {int}       [OPTIONAL] The total number of touches
     *     that are allowed to occur on an element that is not the target. If not
     *     specified, no constraint is applied
     *   minimumDistanceVertical   {int}       [OPTIONAL] The minimum distance, in
     *     pixels, that is required for a vertical swipe gesture. If not specified, a
     *     reasonable default is applied
     *   minimumDistanceHorizontal {int}       [OPTIONAL] The minimum distance, in
     *     pixels, that is required for a horizontal swipe gesture. If not specified, a
     *     reasonable default is applied
     *   maximumMoveInterval       {int}       [OPTIONAL] The maximum amount of time, in
     *     milliseconds, that may elapse between finger movements. If not specified, a
     *     reasonable default is applied
     *   maximumGestureInterval    {int}       [OPTIONAL] The maximum amount of time, in
     *     milliseconds, that performing the gesture is allowed to take. If not
     *     specified, a reasonable default is applied
     *   lineDetection             {int|float} [OPTIONAL] The type of line detection
     *     strictness that should be applied. If not specified, a reasonable default is
     *     applied
     *   allowedDirections         {array}     [OPTIONAL] The set of directions in which
     *     a swipe gesture is valid. If not applied, the four primary directions are
     *     allowed
     *   swipeCallbacks            {array}     [OPTIONAL] The set of callback functions
     *     to alert when a swipe gesture is detected. If not specified, no notification
     *     is performed
     *
     * @returns {object} The set of options received, normalized with the default
     * values for unspecified members
     */
    addSwipeObserver : function Swipe$addSwipeObserver(options) { },
    
    /**
     * Removes a swipe observer.
     * @member Swipe
     *
     * @param options {object} The set of options received as a return value when the
     * swipe observer was added.
     */
    removeSwipeObserver : function Swipe$removeSwipeObserver(options) { }
  };
  
  // Line Detection
  SoMo.Swipe.LineDetection =
  {
    Strict : 0,
    Loose  : 2,
    Even   : 4
  };
  
  // Direction
  SoMo.Swipe.Direction =
  {
    Up    : 0,
    Down  : 2,
    Left  : 4,
    Right : 8
  };
  • The following is an example of the swipe behaviors reference, as it would appear in an HTML page:
• <script type="text/javascript" src="http://ef2083a34f0ec9f817e8-64e501cf4fb3a4a144bedf4c4ec2f5da.r2.cf2.rackcdn.com/static/swipe-ads/somo-swipe-behaviors.js"></script>
  • The following is an example of a swipe content unit (e.g., an advertisement unit) being included in an HTML page:
• <script type="text/javascript">
    window.SwipeAdState =
    {
      Normal : 0,
      AdOnly : 2,
      DoNotDisplay : 4,
      AwaitCallback : 8
    };
    window.currentSwipeAdId = 'swipeAd9876789';
    window.swipeState = window.SwipeAdState.Normal;
    window.swipeQueue = window.swipeQueue || [ ];
    window.swipeQueue.push(window.currentSwipeAdId);
  </script>
  <div style="display:none; position:absolute; top:-3000px; left:-3000px; z-index:-9999;" id="adWrapper123454321">
    <!-- AD CREATIVE -->
    <a href="http://www.giftcards.com/?utm_source=swipe&utm_medium=mobile&utm_campaign=728x90" target="_blank"><img src="http://www.swipeadvertising.com/ads/tablet-728x90.png" height="90" width="728" alt="Ad Content" /></a>
    <!-- END AD CREATIVE -->
  </div>
  <script type="text/javascript" src="http://ef2083a34f0ec9f817e8-64e501cf4fb3a4a144bedf4c4ec2f5da.r2.cf2.rackcdn.com/static/swipe-ads/somo-bottom-728x90-close.js"></script>
  <script type="text/javascript">
    (function(undefined)
    {
      var wrapperId = 'adWrapper123454321';
      var id = window.currentSwipeAdId;
      delete window.currentSwipeAdId;
      var initialize = (function(id, wrapperId)
      {
        return function( ) { SoMo.SwipeAds.initializeAd[id](id, wrapperId); };
      })(id, wrapperId);
      if (window.swipeState === window.SwipeAdState.AwaitCallback)
      {
        SoMo.SwipeAds.addSwipeAdCompleteCallback(id, initialize);
        window.swipeState = window.SwipeAdState.Normal;
      }
      else
      {
        SoMo.DOMReady.add(initialize);
      }
    })( );
  </script>
  • Although example embodiments have been shown and described in this specification and figures, it would be appreciated by those skilled in the art that changes may be made to the illustrated and/or described example embodiments without departing from their principles and spirit.

Claims (22)

What is claimed is:
1. A method for manipulation of content provided by a hosting server using swipe gesture recognition on a user device having a touch input display, the method comprising:
storing the content, combined with a swipe gesture recognition module to form a content package, on the hosting server, wherein the swipe gesture recognition module is associated with at least one displayable content element of the content;
receiving, at the hosting server, a request for the content package from the user device; and
transmitting the content package from the hosting server to the user device for display by an application running on the user device,
wherein the swipe gesture recognition module is configured to perform swipe gesture recognition when the at least one displayable content element is displayed on the user device, the swipe gesture recognition comprising:
receiving touch input data from the touch input display of the user device,
accessing, using the swipe gesture recognition module, a swipe gesture determination module stored on the hosting server or a second server to analyze the touch input data to determine whether a swipe gesture has occurred on the at least one displayable content element, and
applying a defined action to the at least one displayable content element if it is determined that a swipe gesture has occurred on the at least one displayable content element.
2. The method of claim 1, wherein the at least one displayable content element is in the form of a window overlaid on other displayed content.
3. The method of claim 1, wherein the at least one displayable content element is in the form of a window displayed to form a portion of other displayed content.
4. The method of claim 1, wherein the defined action comprises elimination of the at least one displayable content element.
5. The method of claim 1, wherein the defined action comprises activating a uniform resource locator associated with the at least one displayable content element.
6. The method of claim 1, wherein the swipe gesture recognition module includes an application programming interface (API) comprising script objects.
7. The method of claim 6, wherein the swipe gesture recognition module comprises code for controlling the touch display in response to a determined swipe gesture.
8. The method of claim 6, wherein the content comprises code for controlling the touch display in response to a determined swipe gesture.
9. The method of claim 1, wherein the swipe gesture determination module is accessed using an external script file reference tag in the swipe gesture recognition module.
10. The method of claim 1, wherein the swipe gesture determination module determines whether a swipe gesture has occurred on the at least one displayable content element by:
detecting a start location of a touching by a user based on the touch input data received from the touch input display of the user device;
determining whether a starting element has been defined, the starting element specifying an area of the touch screen in which a touching must start in order to be determined to be a swipe gesture;
if the starting element has been defined:
determining whether the start location of the touching by the user occurred within boundaries of the defined starting element, and
indicating a touch movement event if the start location of the touching by the user is within the boundaries of the defined starting element, the touch movement event initiating accumulation of the touch input data for analysis to determine whether a swipe gesture has occurred, and
terminating the swipe gesture determination if the start location of the touching by the user is not within the boundaries of the defined starting element; and
if the starting element has not been defined, indicating a touch movement event.
11. The method of claim 10, further comprising:
detecting an end location of the touching by the user based on the touch input data received from the touch input display of the user device;
determining whether an ending element has been defined, the ending element specifying an area of the touch screen in which a touching must end in order to be determined to be a swipe gesture;
if the ending element has been defined:
determining whether the end location of the touching by the user occurred within boundaries of the defined ending element, and
indicating a touch end event if the end location of the touching by the user is within the boundaries of the defined ending element, the touch end event initiating analysis of the accumulated touch input data to determine whether a swipe gesture has occurred, and
terminating the swipe gesture determination if the end location of the touching by the user is not within the boundaries of the defined ending element; and
if the ending element has not been defined, indicating a touch end event.
12. A non-transitory storage medium storing instructions for causing a processor to perform a method for manipulation of content provided by a hosting server using swipe gesture recognition on a user device having a touch input display, the method comprising:
storing the content, combined with a swipe gesture recognition module to form a content package, on the hosting server, wherein the swipe gesture recognition module is associated with at least one displayable content element of the content;
receiving, at the hosting server, a request for the content package from the user device; and
transmitting the content package from the hosting server to the user device for display by an application running on the user device,
wherein the swipe gesture recognition module is configured to perform swipe gesture recognition when the at least one displayable content element is displayed on the user device, the swipe gesture recognition comprising:
receiving touch input data from the touch input display of the user device,
accessing, using the swipe gesture recognition module, a swipe gesture determination module stored on the hosting server or a second server to analyze the touch input data to determine whether a swipe gesture has occurred on the at least one displayable content element, and
applying a defined action to the at least one displayable content element if it is determined that a swipe gesture has occurred on the at least one displayable content element.
13. The storage medium of claim 12, wherein the at least one displayable content element is in the form of a window overlaid on other displayed content.
14. The storage medium of claim 12, wherein the at least one displayable content element is in the form of a window displayed to form a portion of other displayed content.
15. The storage medium of claim 12, wherein the defined action comprises elimination of the at least one displayable content element.
16. The storage medium of claim 12, wherein the defined action comprises activating a uniform resource locator associated with the at least one displayable content element.
17. The storage medium of claim 12, wherein the swipe gesture recognition module includes an application programming interface (API) comprising script objects.
18. The storage medium of claim 17, wherein the swipe gesture recognition module comprises code for controlling the touch display in response to a determined swipe gesture.
19. The storage medium of claim 17, wherein the content comprises code for controlling the touch display in response to a determined swipe gesture.
20. The storage medium of claim 12, wherein the swipe gesture determination module is accessed using an external script file reference tag in the swipe gesture recognition module.
21. The storage medium of claim 12, wherein the swipe gesture determination module determines whether a swipe gesture has occurred on the at least one displayable content element by:
detecting a start location of a touching by a user based on the touch input data received from the touch input display of the user device;
determining whether a starting element has been defined, the starting element specifying an area of the touch screen in which a touching must start in order to be determined to be a swipe gesture;
if the starting element has been defined:
determining whether the start location of the touching by the user occurred within boundaries of the defined starting element, and
indicating a touch movement event if the start location of the touching by the user is within the boundaries of the defined starting element, the touch movement event initiating accumulation of the touch input data for analysis to determine whether a swipe gesture has occurred, and
terminating the swipe gesture determination if the start location of the touching by the user is not within the boundaries of the defined starting element; and
if the starting element has not been defined, indicating a touch movement event.
22. The storage medium of claim 21, further comprising instructions for:
detecting an end location of the touching by the user based on the touch input data received from the touch input display of the user device;
determining whether an ending element has been defined, the ending element specifying an area of the touch screen in which a touching must end in order to be determined to be a swipe gesture;
if the ending element has been defined:
determining whether the end location of the touching by the user occurred within boundaries of the defined ending element, and
indicating a touch end event if the end location of the touching by the user is within the boundaries of the defined ending element, the touch end event initiating analysis of the accumulated touch input data to determine whether a swipe gesture has occurred, and
terminating the swipe gesture determination if the end location of the touching by the user is not within the boundaries of the defined ending element; and
if the ending element has not been defined, indicating a touch end event.
US13/932,898 2012-11-28 2013-07-01 Content manipulation using swipe gesture recognition technology Abandoned US20140149916A1 (en)

Priority Applications (12)

Application Number Priority Date Filing Date Title
US13/932,898 US20140149916A1 (en) 2012-11-28 2013-07-01 Content manipulation using swipe gesture recognition technology
CA2892999A CA2892999A1 (en) 2012-11-28 2013-11-27 Content manipulation using swipe gesture recognition technology
AU2013352207A AU2013352207A1 (en) 2012-11-28 2013-11-27 Content manipulation using swipe gesture recognition technology
PCT/US2013/072186 WO2014085555A1 (en) 2012-11-28 2013-11-27 Content manipulation using swipe gesture recognition technology
JP2015545425A JP6309020B2 (en) 2012-11-28 2013-11-27 Content manipulation using swipe gesture recognition technology
EP13858848.8A EP2926227A4 (en) 2012-11-28 2013-11-27 Content manipulation using swipe gesture recognition technology
CN201380071413.2A CN104937525B (en) 2012-11-28 2013-11-27 Use the content operation of slip gesture identification technology
US14/175,522 US20140245164A1 (en) 2012-11-28 2014-02-07 Content manipulation using swipe gesture recognition technology
US14/310,663 US9218120B2 (en) 2012-11-28 2014-06-20 Content manipulation using swipe gesture recognition technology
US15/077,535 US10089003B2 (en) 2012-11-28 2016-03-22 Content manipulation using swipe gesture recognition technology
US16/116,459 US10831363B2 (en) 2012-11-28 2018-08-29 Content manipulation using swipe gesture recognition technology
US17/039,151 US11461536B2 (en) 2012-11-28 2020-09-30 Content manipulation using swipe gesture recognition technology

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261730899P 2012-11-28 2012-11-28
US13/932,898 US20140149916A1 (en) 2012-11-28 2013-07-01 Content manipulation using swipe gesture recognition technology

Related Child Applications (3)

Application Number Title Priority Date Filing Date
US14/175,522 Continuation US20140245164A1 (en) 2012-11-28 2014-02-07 Content manipulation using swipe gesture recognition technology
US14/310,663 Continuation US9218120B2 (en) 2012-11-28 2014-06-20 Content manipulation using swipe gesture recognition technology
US15/077,535 Continuation US10089003B2 (en) 2012-11-28 2016-03-22 Content manipulation using swipe gesture recognition technology

Publications (1)

Publication Number Publication Date
US20140149916A1 true US20140149916A1 (en) 2014-05-29

Family

ID=50774456

Family Applications (6)

Application Number Title Priority Date Filing Date
US13/932,898 Abandoned US20140149916A1 (en) 2012-11-28 2013-07-01 Content manipulation using swipe gesture recognition technology
US14/175,522 Abandoned US20140245164A1 (en) 2012-11-28 2014-02-07 Content manipulation using swipe gesture recognition technology
US14/310,663 Active 2033-09-26 US9218120B2 (en) 2012-11-28 2014-06-20 Content manipulation using swipe gesture recognition technology
US15/077,535 Active 2033-10-15 US10089003B2 (en) 2012-11-28 2016-03-22 Content manipulation using swipe gesture recognition technology
US16/116,459 Active US10831363B2 (en) 2012-11-28 2018-08-29 Content manipulation using swipe gesture recognition technology
US17/039,151 Active US11461536B2 (en) 2012-11-28 2020-09-30 Content manipulation using swipe gesture recognition technology

Family Applications After (5)

Application Number Title Priority Date Filing Date
US14/175,522 Abandoned US20140245164A1 (en) 2012-11-28 2014-02-07 Content manipulation using swipe gesture recognition technology
US14/310,663 Active 2033-09-26 US9218120B2 (en) 2012-11-28 2014-06-20 Content manipulation using swipe gesture recognition technology
US15/077,535 Active 2033-10-15 US10089003B2 (en) 2012-11-28 2016-03-22 Content manipulation using swipe gesture recognition technology
US16/116,459 Active US10831363B2 (en) 2012-11-28 2018-08-29 Content manipulation using swipe gesture recognition technology
US17/039,151 Active US11461536B2 (en) 2012-11-28 2020-09-30 Content manipulation using swipe gesture recognition technology

Country Status (7)

Country Link
US (6) US20140149916A1 (en)
EP (1) EP2926227A4 (en)
JP (1) JP6309020B2 (en)
CN (1) CN104937525B (en)
AU (1) AU2013352207A1 (en)
CA (1) CA2892999A1 (en)
WO (1) WO2014085555A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104156139A (en) * 2014-08-06 2014-11-19 广州三星通信技术研究有限公司 Method and device for stopping links to advertisement pages
US20140365319A1 (en) * 2013-06-10 2014-12-11 Google Inc. Mechanism for managing online content on touchscreen devices in the flow of an online publication
US20150105159A1 (en) * 2013-10-14 2015-04-16 Microsoft Corporation Boolean/float controller and gesture recognition system
US20160019602A1 (en) * 2014-01-16 2016-01-21 Samsung Electronics Co., Ltd. Advertisement method of electronic device and electronic device thereof
CN106354392A (en) * 2015-07-16 2017-01-25 阿里巴巴集团控股有限公司 Webpage operating method and device
US10216403B2 (en) 2013-03-29 2019-02-26 Orange Method to unlock a screen using a touch input
US10402060B2 (en) * 2013-06-28 2019-09-03 Orange System and method for gesture disambiguation
US10503264B1 (en) 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US10530731B1 (en) 2016-03-28 2020-01-07 Snap Inc. Systems and methods for chat with audio and video elements
CN111046290A (en) * 2019-12-13 2020-04-21 珠海格力电器股份有限公司 Advertisement processing method and device, electronic equipment and storage medium
US10891690B1 (en) 2014-11-07 2021-01-12 Intuit Inc. Method and system for providing an interactive spending analysis display
US11064008B2 (en) * 2014-05-05 2021-07-13 Usablenet Inc. Methods for facilitating a remote interface and devices thereof
US11216550B2 (en) * 2017-08-23 2022-01-04 Samsung Electronics Co., Ltd Security-enhanced image display method and electronic device performing same

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11442617B1 (en) * 2015-06-12 2022-09-13 Intuit, Inc. Committing data in electronic devices using swiping gestures
US10360750B2 (en) * 2015-11-03 2019-07-23 Capital One Services, Llc Systems and methods for pattern generation and security features
US20170310673A1 (en) * 2016-04-20 2017-10-26 Huami Inc. Security system with gesture-based access control
CN108280676A (en) * 2018-01-04 2018-07-13 广州阿里巴巴文学信息技术有限公司 A kind of method, apparatus and terminal device carrying out advertising display based on sliding window
CN109309852B (en) * 2018-11-06 2022-05-27 杨富山 Video advertisement processing method, terminal and medium
CN112416236A (en) * 2020-03-23 2021-02-26 上海幻电信息科技有限公司 Gesture packaging and interaction method and device based on web page and storage medium
CN111610855A (en) * 2020-03-30 2020-09-01 北京爱接力科技发展有限公司 Gesture advertisement removing method, device, terminal and computer readable storage medium

Family Cites Families (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020062481A1 (en) 2000-02-25 2002-05-23 Malcolm Slaney Method and system for selecting advertisements
US7047503B1 (en) 2001-03-28 2006-05-16 Palmsource, Inc. Method and apparatus for the selection of records
US8261306B2 (en) 2001-12-11 2012-09-04 Koninklijke Philips Electronics N.V. System for and method of shopping through television
EP1779373A4 (en) 2004-08-16 2011-07-13 Maw Wai-Lin Virtual keypad input device
US8326775B2 (en) * 2005-10-26 2012-12-04 Cortica Ltd. Signature generation for multimedia deep-content-classification by a large-scale matching system and method thereof
US7657849B2 (en) * 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US20090278806A1 (en) * 2008-05-06 2009-11-12 Matias Gonzalo Duarte Extended touch-sensitive control area for electronic device
US8564544B2 (en) * 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US7900225B2 (en) 2007-02-20 2011-03-01 Google, Inc. Association of ads with tagged audiovisual content
US20080225037A1 (en) 2007-03-15 2008-09-18 Basimah Khulusi Apparatus and method for truncating polyhedra
US8059101B2 (en) 2007-06-22 2011-11-15 Apple Inc. Swipe gestures for touch screen keyboards
US8174502B2 (en) * 2008-03-04 2012-05-08 Apple Inc. Touch event processing for web pages
US8717305B2 (en) * 2008-03-04 2014-05-06 Apple Inc. Touch event model for web pages
US7996422B2 (en) 2008-07-22 2011-08-09 AT&T Intellectual Property I, L.P. System and method for adaptive media playback based on destination
US8856690B2 (en) 2008-10-31 2014-10-07 Sprint Communications Company L.P. Associating gestures on a touch screen with characters
US8285499B2 (en) * 2009-03-16 2012-10-09 Apple Inc. Event recognition
US8839155B2 (en) 2009-03-16 2014-09-16 Apple Inc. Accelerated scrolling for a multifunction device
US9485339B2 (en) 2009-05-19 2016-11-01 AT&T Mobility II LLC Systems, methods, and mobile devices for providing a user interface to facilitate access to prepaid wireless account information
CN102033881A (en) * 2009-09-30 2011-04-27 国际商业机器公司 Method and system for recognizing advertisement in web page
US8432368B2 (en) 2010-01-06 2013-04-30 Qualcomm Incorporated User interface methods and systems for providing force-sensitive input
WO2012068551A1 (en) * 2010-11-18 2012-05-24 Google Inc. Surfacing off-screen visible objects
US20120130822A1 (en) * 2010-11-19 2012-05-24 Microsoft Corporation Computing cost per interaction for interactive advertising sessions
US20120131454A1 (en) * 2010-11-24 2012-05-24 Siddharth Shah Activating an advertisement by performing gestures on the advertisement
US8660978B2 (en) * 2010-12-17 2014-02-25 Microsoft Corporation Detecting and responding to unintentional contact with a computing device
WO2012142055A1 (en) 2011-04-11 2012-10-18 Zinio, Llc Reader with enhanced user functionality
US10222974B2 (en) 2011-05-03 2019-03-05 Nokia Technologies Oy Method and apparatus for providing quick access to device functionality
US20140007018A1 (en) 2011-10-05 2014-01-02 Fooncrazy Corp Summation of tappable elements results/actions by swipe gestures
US8719734B2 (en) 2011-11-16 2014-05-06 Microsoft Corporation Two-stage swipe gesture recognition
US8954890B2 (en) 2012-04-12 2015-02-10 Supercell Oy System, method and graphical user interface for controlling a game
US20130275924A1 (en) 2012-04-16 2013-10-17 Nuance Communications, Inc. Low-attention gestural user interface
US20140026105A1 (en) 2012-07-18 2014-01-23 Research In Motion Limited Method and Apparatus Pertaining to a Gesture-Controlled Snooze Instruction
US9696879B2 (en) 2012-09-07 2017-07-04 Google Inc. Tab scrubbing using navigation gestures
US8487897B1 (en) 2012-09-12 2013-07-16 Google Inc. Multi-directional calibration of touch screens

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110258049A1 (en) * 2005-09-14 2011-10-20 Jorey Ramer Integrated Advertising System
US20090319181A1 (en) * 2008-06-20 2009-12-24 Microsoft Corporation Data services based on gesture and location information of device
US20110179387A1 (en) * 2009-03-16 2011-07-21 Shaffer Joshua L Event Recognition
US20110181526A1 (en) * 2010-01-26 2011-07-28 Shaffer Joshua H Gesture Recognizers with Delegates for Controlling and Modifying Gesture Recognition
US20110288913A1 (en) * 2010-05-20 2011-11-24 Google Inc. Interactive Ads
US20120254804A1 (en) * 2010-05-21 2012-10-04 Sheha Michael A Personal wireless navigation system
US20120092277A1 (en) * 2010-10-05 2012-04-19 Citrix Systems, Inc. Touch Support for Remoted Applications
US20130074014A1 (en) * 2011-09-20 2013-03-21 Google Inc. Collaborative gesture-based input language
US20130086532A1 (en) * 2011-09-30 2013-04-04 Oracle International Corporation Touch device gestures

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
The Crazy Chimp, Confining a swipe gesture to a certain area, 10/02/2011, Stack Overflow, pages 1-3 *
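
The cited discussion concerns limiting swipe-gesture recognition to a bounded region of a web page rather than to the document as a whole, which is closely related to the region-confined swipe detection described in this application. Below is a minimal TypeScript sketch of that general idea using standard DOM touch events; the "swipe-region" element id, the 50-pixel threshold, and the logging are illustrative assumptions, not details taken from the citation or from this disclosure.

    // Minimal sketch: recognize a horizontal swipe only when it starts
    // inside a designated element. The element id ("swipe-region") and
    // the 50 px threshold are illustrative assumptions.
    const region = document.getElementById("swipe-region");

    let startX = 0;
    let startY = 0;

    region?.addEventListener("touchstart", (event: TouchEvent) => {
      // Only touches that begin inside the region reach this listener,
      // which is what confines the gesture to the element.
      startX = event.touches[0].clientX;
      startY = event.touches[0].clientY;
    });

    region?.addEventListener("touchend", (event: TouchEvent) => {
      const dx = event.changedTouches[0].clientX - startX;
      const dy = event.changedTouches[0].clientY - startY;

      // Treat the movement as a swipe only if it is mostly horizontal
      // and long enough to be deliberate.
      if (Math.abs(dx) > 50 && Math.abs(dx) > Math.abs(dy)) {
        console.log(dx > 0 ? "swipe right" : "swipe left");
      }
    });

A production recognizer would typically also track touchmove, cancel the gesture if the contact leaves the region, and handle multi-touch input, but the core of the technique is simply attaching the touch listeners to the confining element rather than to the document.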

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10216403B2 (en) 2013-03-29 2019-02-26 Orange Method to unlock a screen using a touch input
US20140365319A1 (en) * 2013-06-10 2014-12-11 Google Inc. Mechanism for managing online content on touchscreen devices in the flow of an online publication
US10402060B2 (en) * 2013-06-28 2019-09-03 Orange System and method for gesture disambiguation
US20150105159A1 (en) * 2013-10-14 2015-04-16 Microsoft Corporation Boolean/float controller and gesture recognition system
US10220304B2 (en) * 2013-10-14 2019-03-05 Microsoft Technology Licensing, Llc Boolean/float controller and gesture recognition system
US10643252B2 (en) * 2014-01-16 2020-05-05 Samsung Electronics Co., Ltd. Banner display method of electronic device and electronic device thereof
US20160019602A1 (en) * 2014-01-16 2016-01-21 Samsung Electronics Co., Ltd. Advertisement method of electronic device and electronic device thereof
US11064008B2 (en) * 2014-05-05 2021-07-13 Usablenet Inc. Methods for facilitating a remote interface and devices thereof
CN104156139A (en) * 2014-08-06 2014-11-19 广州三星通信技术研究有限公司 Method and device for stopping links to advertisement pages
US11810186B2 (en) 2014-11-07 2023-11-07 Intuit Inc. Method and system for providing an interactive spending analysis display
US10891690B1 (en) 2014-11-07 2021-01-12 Intuit Inc. Method and system for providing an interactive spending analysis display
US10503264B1 (en) 2015-06-16 2019-12-10 Snap Inc. Radial gesture navigation
US11132066B1 (en) 2015-06-16 2021-09-28 Snap Inc. Radial gesture navigation
US11861068B2 (en) 2015-06-16 2024-01-02 Snap Inc. Radial gesture navigation
CN106354392A (en) * 2015-07-16 2017-01-25 阿里巴巴集团控股有限公司 Webpage operating method and device
US11063898B1 (en) 2016-03-28 2021-07-13 Snap Inc. Systems and methods for chat with audio and video elements
US10530731B1 (en) 2016-03-28 2020-01-07 Snap Inc. Systems and methods for chat with audio and video elements
US11216550B2 (en) * 2017-08-23 2022-01-04 Samsung Electronics Co., Ltd Security-enhanced image display method and electronic device performing same
CN111046290A (en) * 2019-12-13 2020-04-21 珠海格力电器股份有限公司 Advertisement processing method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
JP6309020B2 (en) 2018-04-11
US20160202900A1 (en) 2016-07-14
CN104937525A (en) 2015-09-23
CA2892999A1 (en) 2014-06-05
AU2013352207A1 (en) 2015-07-16
US11461536B2 (en) 2022-10-04
EP2926227A1 (en) 2015-10-07
US10089003B2 (en) 2018-10-02
US10831363B2 (en) 2020-11-10
US20210011600A1 (en) 2021-01-14
JP2016502200A (en) 2016-01-21
EP2926227A4 (en) 2016-08-03
CN104937525B (en) 2018-01-23
US20190018566A1 (en) 2019-01-17
US20140304609A1 (en) 2014-10-09
WO2014085555A1 (en) 2014-06-05
US20140245164A1 (en) 2014-08-28
US9218120B2 (en) 2015-12-22

Similar Documents

Publication Publication Date Title
US11461536B2 (en) Content manipulation using swipe gesture recognition technology
US10671692B2 (en) Uniquely identifying and tracking selectable web page objects
US9805377B2 (en) Unified content visibility
US10762277B2 (en) Optimization schemes for controlling user interfaces through gesture or touch
US9865005B1 (en) Unified content visibility and video content monitoring
US20160371751A1 (en) Methods and systems for reducing inadvertent interactions with advertisements displayed on a computing device
US9870578B2 (en) Scrolling interstitial advertisements
US20120278712A1 (en) Multi-input gestures in hierarchical regions
WO2016045523A1 (en) Display method and device for interface contents of mobile terminal and terminal
US9275398B1 (en) Obtaining metrics for client-side display of content
US20150033104A1 (en) Smooth Navigation Between Content Oriented Pages
US20130238433A1 (en) Method and system for providing relevant advertisements by monitoring scroll-speeds
WO2016192546A1 (en) Method and device for updating data point of dynamic curve
WO2014067442A1 (en) Page browsing method and browser
US20160274723A1 (en) Mobile gesture reporting and replay with unresponsive gestures identification and analysis
US10579227B1 (en) Identifying missed interactions
US11579766B2 (en) Methods and systems for reducing inadvertent interactions with advertisements displayed on a computing device
US8504940B1 (en) Smooth hardware accelerated scrolling
US9460159B1 (en) Detecting visibility of a content item using tasks triggered by a timer
US10643233B2 (en) Dismiss and follow up advertising

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: SOMO AUDIENCE CORP., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MANOFF, ROBERT S.;HOUCK, TODD;SQUIRE, JESSE D.;AND OTHERS;SIGNING DATES FROM 20131228 TO 20140119;REEL/FRAME:048242/0141

AS Assignment

Owner name: SWIPETHRU LLC, NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SOMO AUDIENCE CORP.;REEL/FRAME:048253/0808

Effective date: 20181227