KR20140060025A - Method of performing game, server performing the same and media storing the same - Google Patents

Method of performing game, server performing the same and media storing the same

Info

Publication number
KR20140060025A
Authority
KR
South Korea
Prior art keywords
game
mobile terminal
data
motion
sensing
Prior art date
Application number
KR1020120126610A
Other languages
Korean (ko)
Inventor
송지영
Original Assignee
(주)네오위즈게임즈
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)네오위즈게임즈 filed Critical (주)네오위즈게임즈
Priority to KR1020120126610A priority Critical patent/KR20140060025A/en
Priority to PCT/KR2013/008368 priority patent/WO2014042484A1/en
Publication of KR20140060025A publication Critical patent/KR20140060025A/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/24 Constructional details thereof, e.g. game controllers with detachable joystick handles
    • A63F13/245 Constructional details thereof, e.g. game controllers with detachable joystick handles specially adapted to a particular type of game, e.g. steering wheels
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/35 Details of game servers
    • A63F13/355 Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an encoded video stream for transmitting to a mobile phone or a thin client
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A game execution method is performed in a game execution server that can be connected to a mobile terminal. The game execution method includes receiving, from the mobile terminal, sensing data associated with an impact applied by a user, determining operation data corresponding to impact data associated with a game operation in the received sensing data, and performing a game operation by a game client through the determined operation data.

Description

TECHNICAL FIELD [0001] The present invention relates to a method of performing a game, a game execution server performing the same, and a recording medium storing the same.

The present invention relates to a game execution technique, and more particularly, to a game execution method that receives, from a mobile terminal, sensing data associated with an impact applied by a user and determines, from the received sensing data, operation data corresponding to impact data associated with a game operation, thereby allowing the user to operate the game through an impact applied directly or indirectly to the mobile terminal, and to a game execution server for performing the method.

With the appearance of the smart phone, the mobile terminal, which previously could be used only as a communication means, has come to be used as a portable computing device capable of performing various functions such as Internet access, music, movies, and SNS (Social Network Service).

In particular, techniques have been developed in which a user's motion is used as an input means of a mobile terminal by sensing the motion through a motion detection sensor (for example, a gyro sensor) included in the mobile terminal. In recent years, following the popularity of game devices based on the user's physical actions (e.g., the Nintendo Wii), mobile games using the motion sensors of mobile terminals have been released.

Various techniques related to a game execution method using a mobile terminal as a game operation apparatus are provided. The following prior patents provide techniques related thereto.

Korean Patent Laid-Open No. 10-2012-0070638 discloses a technique for game output through a smart phone and a screen output device, in which a smart phone equipped with a gyroscope sensor and a gravity sensor is moved freely to operate a game. The disclosed system includes a mobile game service server that registers and provides a plurality of motion games available to users who connect online; a screen output device on which a web address is entered according to a user's operation, which downloads and stores a receiver program from the service web site, interprets receiver signals received from the outside through the receiver program, displays its IP address on the screen, and advances the game output on the screen in response to game operation signals input from the outside; and a smart terminal that connects online to an application store, downloads and stores a receiver application, transmits a receiver signal to the screen output device, attempts an online connection by entering the IP address displayed on the screen of the screen output device, and, once the online connection with the screen output device is established, outputs game operation signals for selecting and playing a desired motion game among the motion games stored in the screen output device.

This prior art provides a technique by which a user can operate a game performed on a PC using the gyroscope sensor and gravity sensor of a mobile terminal. However, it only provides a technique for utilizing the mobile terminal as a means of operating a PC game.

Korean Patent Laid-Open Publication No. 10-2012-0017333 relates to a mobile terminal and a motion recognition method for accurately recognizing a user's motion while the terminal is stationary or moving. The method includes setting and storing an initial reference point; executing a predetermined application by recognizing a motion input by the user with respect to the initial reference point; and, when a specific motion involving a change of direction is input during execution of the application, calculating the displacement of that motion with respect to the initial reference point and correcting the initial reference point, so that the user's motion can be recognized accurately. In particular, by recognizing the vector displacement with respect to the initial reference value using the geomagnetic sensor (gyro sensor) and the acceleration sensor while moving, various functions of the mobile terminal can be performed.

This prior art provides a mobile terminal that can accurately recognize a user's motion. However, it does not provide a game execution technique based on recognizing the user's motion; it merely provides a technique for correcting an initial reference value by calculating the displacement of a specific motion made by the user.

That is, these prior arts provide techniques for utilizing the user's motion, or for accurately recognizing the user's motion, in the process of using a mobile terminal as a game operation device.

In the field of game technology, the most important technical goals are to keep existing users from leaving a game and to attract new users by providing a variety of game execution methods that arouse the user's interest. A variety of such techniques are currently provided through mobile terminals. However, no technique has yet been provided for operating a game through a direct or indirect impact on the mobile terminal while preventing erroneous operation caused by the accuracy limitations of the sensors in the mobile terminal.

Korean Patent Publication No. 10-2012-0070638
Korean Patent Publication No. 10-2012-0017333

The present invention provides a game execution method that receives, from a mobile terminal, sensing data associated with an impact applied by a user and determines, from the received sensing data, operation data that can be processed by a game client to operate the game, so that the user can operate the game through an impact applied directly or indirectly to the mobile terminal, and a game execution server for performing the game execution method.

The present invention also provides a game execution method that operates the game by determining, in the sensing data received from the mobile terminal, operation data corresponding to impact data associated with the game operation, thereby extracting the sensing values associated with the game operation while excluding sensing values irrelevant to the game operation and preventing malfunctions caused by the user, and a game execution server for performing the game execution method.

The present invention further provides a game execution method that receives a motion sensing value from each of at least two motion sensors in a mobile terminal and processes the sensed values based on the operating speed and sensing accuracy of each motion sensor to perform the game operation by the game client, thereby minimizing the data processing performed by the mobile terminal, reducing the delay that may occur when a game is executed, and operating the game by accurately recognizing the impact on the mobile terminal, and a game execution server for performing the method.

Among the embodiments, a game execution method is performed in a game execution server that can be connected to a mobile terminal. The game execution method includes receiving, from the mobile terminal, sensing data associated with an impact applied by a user, determining operation data corresponding to impact data associated with a game operation in the received sensing data, and performing a game operation by the game client through the determined operation data.

In one embodiment, receiving the sensing data associated with an impact by the user may further comprise receiving a sensing value for a direct or indirect impact sensed through at least one sensor in the mobile terminal.

In one embodiment, the step of receiving sensing data associated with an impact by the user may comprise receiving at least one of a motion sensing value sensed by a motion sensor in the mobile terminal and a sound sensing value sensed by a sound sensor in the mobile terminal.

In one embodiment, receiving the sensing data associated with an impact by the user may further comprise receiving the motion sensing value from each of the at least two motion sensors in the mobile terminal.

In one embodiment, receiving sensing data associated with an impact by the user may include receiving a first motion sensing value via a first motion sensor, receiving a second motion sensing value via at least one second motion sensor, and correcting the received first motion sensing value.

In one embodiment, the step of determining operation data corresponding to the impact data may further include extracting, from the sensing data, a sensing value corresponding to at least one of a vector pattern and a sound pattern included in the impact data.

In one embodiment, the vector pattern may be associated with a motion sensing value by the mobile terminal and may include at least one of a predetermined range of motion magnitude and motion direction. In one embodiment, the sound pattern may be associated with a sound sensing value by the mobile terminal and may include at least one of a predetermined range of sound magnitude and frequency.

In one embodiment, determining operation data corresponding to the impact data may comprise associating the extracted sensing value with a game operation vector associated with the game operation to generate a game operation command that can be processed by the game client.

In one embodiment, the step of determining the operation data corresponding to the impact data may further include correcting at least one of the vector pattern and the sound pattern based on the received sensing data.

In one embodiment, the method may further include receiving feedback on the game operation from the game client and performing a sensible action by the mobile terminal via the received feedback.

Among the embodiments, a game execution server may be connected to a mobile terminal. The game execution server comprises a sensing data receiving unit that receives, from the mobile terminal, sensing data associated with an impact applied by a user; an operation data determination unit that determines operation data corresponding to impact data associated with a game operation in the received sensing data; and a game operation unit that performs a game operation by the game client through the determined operation data.

In one embodiment, the sensing data receiver may receive a sensing value for a direct or indirect impact sensed through at least one sensor in the mobile terminal.

In one embodiment, the sensing data receiver may receive at least one of a motion sensing value sensed by a motion sensor in the mobile terminal and a sound sensing value sensed by a sound sensor.

In one embodiment, the sensing data receiver may receive the motion sensing value from each of the at least two motion sensors in the mobile terminal.

In one embodiment, the sensing data receiver may receive a first motion sensing value via a first motion sensor, receive a second motion sensing value via at least one second motion sensor, and correct the received first motion sensing value.

In one embodiment, the operation data determination unit may extract, from the sensing data, a sensing value corresponding to at least one of a vector pattern and a sound pattern included in the impact data.

In one embodiment, the vector pattern may be associated with a motion sensing value by the mobile terminal and may include at least one of a predetermined range of motion magnitude and motion direction. In one embodiment, the sound pattern may be associated with a sound sensing value by the mobile terminal and may include at least one of a predetermined range of sound magnitude and frequency.

In one embodiment, the operation data determination unit may generate the game operation command that can be processed by the game client by associating the extracted sensing value with a game operation vector associated with the game operation.

In one embodiment, the operation data determination unit may correct at least one of the vector pattern and the sound pattern based on the received sensing data.

In one embodiment, the game execution server may further include a sensible action execution unit that receives feedback on the game operation from the game client and performs a sensible action by the mobile terminal through the received feedback.

Among the embodiments, a recording medium stores a computer program for a game execution method performed in a game execution server that can be connected to a mobile terminal, the method including receiving, from the mobile terminal, sensing data associated with an impact applied by a user, determining operation data corresponding to impact data associated with the game operation in the received sensing data, and performing a game operation by the game client through the determined operation data.

The game execution method and related techniques according to an embodiment of the present invention receive, from a mobile terminal, sensing data associated with an impact applied by a user and determine, from the received sensing data, operation data that can be processed by the game client, so that the user can operate the game through an impact applied directly or indirectly to the mobile terminal.

The game execution method and related techniques according to an embodiment of the present invention operate the game by determining, in the sensing data received from the mobile terminal, operation data corresponding to impact data associated with the game operation, so that only the sensing values associated with the game operation are extracted and malfunctions caused by the user are prevented.

The game execution method and related techniques according to an embodiment of the present invention receive a motion sensing value from each of at least two motion sensors in a mobile terminal and process the sensed values based on the operating speed and sensing accuracy of each motion sensor, thereby minimizing the data processing performed by the mobile terminal, reducing the delay that may occur during game execution, and operating the game by accurately recognizing the impact on the mobile terminal.

FIG. 1 is a view for explaining a game execution system according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating the mobile terminal of FIG. 1.
FIG. 3 is a block diagram illustrating the game execution server shown in FIG. 1.
FIG. 4 is a flowchart illustrating a game execution process according to the present invention.
FIG. 5 is a flowchart illustrating a game execution process according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating an example of a game execution system according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating an example of a game screen for explaining a game execution process according to an embodiment of the present invention.

The description of the present invention is merely an example for structural or functional explanation, and the scope of the present invention should not be construed as being limited by the embodiments described in the text. That is, the embodiments may be modified in various ways and may take various forms, so the scope of the present invention should be understood to include equivalents capable of realizing its technical ideas. Also, the objects or effects presented in the present invention do not mean that a specific embodiment must include all of them or include only such effects, so the scope of the present invention should not be understood as being limited thereby.

Meanwhile, the meaning of the terms described in the present application should be understood as follows.

The terms "first "," second ", and the like are intended to distinguish one element from another, and the scope of the right should not be limited by these terms. For example, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

It is to be understood that when an element is referred to as being "connected" to another element, it may be directly connected to the other element, but other elements may exist in between. On the other hand, when an element is referred to as being "directly connected" to another element, it should be understood that no other elements exist in between. Other expressions describing the relationship between components, such as "between" and "directly between" or "neighboring to" and "directly adjacent to", should be interpreted in the same way.

Singular expressions should be understood to include plural expressions unless the context clearly indicates otherwise. The terms "include" or "have" are intended to specify the presence of a stated feature, number, step, operation, component, part, or combination thereof, and should be understood not to preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

In each step, identification codes (e.g., a, b, c, etc.) are used for convenience of explanation. The identification codes do not describe the order of the steps, and each step may occur in an order different from the stated order unless a specific order is clearly described in the context. That is, each step may occur in the stated order, may be performed substantially concurrently, or may be performed in the reverse order.

The present invention can be embodied as computer-readable code on a computer-readable recording medium, and the computer-readable recording medium includes all kinds of recording devices that store data readable by a computer system. Examples of the computer-readable recording medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet). In addition, the computer-readable recording medium may be distributed over network-connected computer systems so that the computer-readable code can be stored and executed in a distributed manner.

All terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs, unless otherwise defined. Commonly used predefined terms should be interpreted to be consistent with the meanings in the context of the related art and can not be interpreted as having ideal or overly formal meaning unless explicitly defined in the present application.

FIG. 1 is a view for explaining a game execution system according to an embodiment of the present invention.

Referring to FIG. 1, the game performing system 100 includes a mobile terminal 110, a game performing server 120, and a game client 130.

The mobile terminal 110 may correspond to a portable computing device capable of sensing impacts by a user.

More specifically, when a user applies an impact directly or indirectly to the mobile terminal 110, the mobile terminal 110 may sense a physical phenomenon that occurs due to the impact and generate a corresponding sensing value.

For example, an impact directly applied by the user to the mobile terminal 110 may correspond to an impact made through direct contact with the mobile terminal 110 using a finger, palm, fist, stick, or the like. In another example, an impact indirectly applied to the mobile terminal 110 may be made by placing the mobile terminal 110 on a floor, desk, or the like and having the user strike the floor or desk with a finger, palm, fist, or the like, so that the impact is transmitted to the mobile terminal 110 through an impact transmission medium (here, the floor or desk).

Here, the physical phenomenon that may occur due to the impact may correspond to at least one of the motion of the mobile terminal 110 and the sound generated due to the impact.

In one embodiment, the mobile terminal 110 may correspond to a smart phone or a tablet personal computer. Here, the mobile terminal 110 may include at least one of a motion sensor and a sound sensor.

For example, if the user directly impacts the mobile terminal 110, the mobile terminal 110 may sense the motion of the mobile terminal 110 caused by the impact through a motion sensor (e.g., a gyroscope sensor) and sense the sound of the impact through a sound sensor (e.g., a microphone).

In another example, when the user places the mobile terminal 110 on a table and applies an indirect impact (e.g., strikes the table with the palm) in the vicinity of the mobile terminal 110, the mobile terminal 110 may sense its motion with respect to the impact (for example, up-and-down vibration) through the motion sensor and sense the sound of the impact through the sound sensor.

In one embodiment, the motion sensor may include at least one of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor. More specifically, the acceleration sensor corresponds to a device capable of generating a sensing value as data by measuring static and dynamic accelerations with respect to the X, Y, and Z axes. Also, the angular velocity sensor may correspond to a device capable of generating a sensing value as data by measuring a rotational angular velocity with respect to X, Y, and Z axes, and may correspond to, for example, a gyroscope sensor. The geomagnetic sensor corresponds to a device capable of generating absolute azimuth information as data.

Here, the motion sensor may generate a motion sensing value for the motion of the mobile terminal 110 when a direct or indirect impact is applied to the mobile terminal 110.

In one embodiment, the sound sensor may correspond to a microphone. Here, the sound sensor may generate a sound sensing value for the sound that occurs near the mobile terminal 110.
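
The disclosure does not fix a concrete data layout for these sensing values. Purely for illustration, the sketch below (Python; all field names and units are assumptions, not part of the disclosure) shows one possible shape for a motion sample and a sound sample of the kind described above, as they might be packaged before transmission to the game execution server 120.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MotionSample:
    accel_xyz: Tuple[float, float, float]   # static/dynamic acceleration on the X, Y, Z axes
    gyro_rpy: Tuple[float, float, float]    # rotational motion expressed as roll, pitch, yaw
    heading_deg: float                      # absolute azimuth from the geomagnetic sensor
    timestamp_ms: int                       # capture time, used to order samples

@dataclass
class SoundSample:
    level_db: float                         # sound magnitude picked up by the microphone
    dominant_freq_hz: float                 # dominant frequency of the sensed sound
    timestamp_ms: int

# Example: one motion sample and one sound sample produced by an indirect impact.
motion = MotionSample((0.1, 0.0, 4.2), (0.5, 0.2, 0.0), 87.0, 1000)
sound = SoundSample(3.4, 120.0, 1001)
print(motion, sound)
```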

The mobile terminal 110 may be connected to the game execution server 120 via a network. In one embodiment, the mobile terminal 110 may be connected to the game execution server 120 via a wired or wireless network. In one embodiment, when the mobile terminal 110 is connected to the game execution server 120 over a wireless network, it may be connected to the game execution server 120 via one of a 3G network, Wi-Fi, Bluetooth, and IrDA.

The mobile terminal 110 is described in more detail with reference to FIG. 2.

The game execution server 120 corresponds to a computing device that can be connected to the mobile terminal 110 and each of the game clients through a network. In one embodiment, the game performing server 120 may be included in a personal computer or a mobile terminal 110.

In one embodiment, the game performing server 120 may be embodied in a PC or a smart phone.

For example, as shown in FIG. 6, when the game execution server 120 is implemented in a PC, the game execution server 120 is connected to the mobile terminal 110 via a wireless network and performs a game operation by the game client 130 (for example, a coin flipping game program) installed in the PC. That is, the user can directly or indirectly impact the mobile terminal 110 to operate the PC game.

In another example, when the game execution server 120 and the game client 130 are implemented within the mobile terminal 110, the game execution server 120 may be included as a component of a game application (for example, a coin flipping game program) installed in the mobile terminal 110. That is, the user may directly or indirectly impact the mobile terminal 110 to operate the game application installed in the mobile terminal 110.

The game execution server 120 will be described with reference to FIG. 3.

The game client 130 is a game providing means for driving a game in a game executing device, operating a game through a signal input by a user, and providing the operation result to a user through an output means. In one embodiment, game client 130 may correspond to game software.

Here, when the game execution device corresponds to a PC, the game client 130 (for example, a coin flipping game program) operates the game through the sensing data received from the mobile terminal 110 and outputs the operation result through a monitor, which is an output means.

In one embodiment, the game client 130 may load at least a portion of the game engine for game driving. For example, at least some of the predetermined operations required to drive the game may be performed in the game client 130.

The game client 130 may be connected to the game execution server 120 via a network. In one embodiment, the game client 130 may be implemented within the same computing device as the game execution server 120. For example, when the game execution server 120 is included in the PC, the game client 130 may be included in the storage device of the PC.

FIG. 2 is a block diagram illustrating the mobile terminal of FIG. 1.

Referring to FIG. 2, the mobile terminal 110 includes a sensor unit 210, a communication unit 220, a sensible action unit 230, and a control unit 240.

The sensor unit 210 senses an impact by the user. More specifically, the sensor unit 210 may be included in or attached to the mobile terminal 110 to generate a sensing value corresponding to an impact by the user. Here, the sensor unit 210 may sense at least one of the motion of the mobile terminal 110 and the sound generated due to an impact applied directly or indirectly to the mobile terminal 110.

In one embodiment, the sensor portion 210 may include at least one of a motion sensor 211 and a sound sensor 212.

First, the motion sensor 211 may generate a motion sensing value corresponding to an impact applied to the mobile terminal 110 directly or indirectly. For example, if the user directly impacts the mobile terminal 110, the motion sensor 211 may generate roll, pitch, and yaw data corresponding to the motion of the mobile terminal 110, or may generate vector data including a magnitude and a direction corresponding to the movement of the mobile terminal 110.

In one embodiment, the motion sensor 211 may include at least one of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor.

In one embodiment, the motion sensor 211 may include at least two motion sensors. Here, the motion sensor 211 may include at least two of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor.

In one embodiment, the motion sensor 211 may comprise a first motion sensor and at least one second motion sensor. Here, the first motion sensor and the at least one second motion sensor may correspond to sensors whose operating speed and sensing accuracy are designed in inverse relation to each other.

More specifically, the first motion sensor may correspond to a motion sensor designed to have a faster operating speed than the second motion sensor, and the second motion sensor may correspond to a motion sensor designed to have a higher sensing accuracy than the first motion sensor. Here, the operating speed refers to the time taken to recognize the motion of the mobile terminal 110 and generate a motion sensing value, and the sensing accuracy may refer to the error range within which the motion of the mobile terminal 110 can be recognized or the size of the noise contained in the motion sensing value. That is, the first motion sensor corresponds to the motion sensor having the highest operating speed among the motion sensors included in the sensor unit 210, and the second motion sensor corresponds to a motion sensor, among the motion sensors included in the sensor unit 210, having a higher sensing accuracy than the first motion sensor.

In one embodiment, the first motion sensor may correspond to a gyroscope sensor, and the second motion sensors may correspond to an acceleration sensor and a geomagnetic sensor. That is, the gyroscope sensor, which has the highest operating speed, corresponds to the first motion sensor, while the acceleration sensor and the geomagnetic sensor, which have lower operating speeds but higher sensing accuracy than the first motion sensor, correspond to the second motion sensors.

Next, the sound sensor 212 may generate a sound sensing value corresponding to an impact applied to the mobile terminal 110 either directly or indirectly. In one embodiment, the sound sensor 212 may correspond to a microphone included in the mobile terminal 110.

For example, if the user directly impacts the mobile terminal 110, the sound sensor 212 may generate a sound sensing value corresponding to the sound generated in the vicinity of the mobile terminal 110. If the user indirectly impacts the mobile terminal 110 by placing the mobile terminal 110 on a desk and then striking the desk with the palm of the hand, the sound sensor 212 may sense the sound generated by the collision between the user's palm and the desk. Here, the sound sensor 212 may generate a sound sensing value including at least one of the magnitude and the frequency of the sound caused by the impact.

The communication unit 220 may transmit the sensing value generated by the sensor unit 210 to the game execution server 120 and may receive feedback on the game operation from the game execution server 120. The related contents will be described in more detail below.

The sensible action means 230 corresponds to means for providing a sensible action to the user by performing the feedback on the game operation received from the game execution server 120. In one embodiment, the sensible action means 230 may include at least one of a display, a vibration device, a camera, and a speaker. That is, the sensible action means 230 is a device capable of performing feedback on a specific event occurring during game progress, and may correspond to at least one of the screen of the mobile terminal 110, a vibration device, a camera, and a speaker. The related contents will be described in more detail below.

The control unit 240 controls the operation of the sensor unit 210, the communication unit 220, and the sensible action unit 230 and the flow of data.

FIG. 3 is a block diagram illustrating the game execution server shown in FIG. 1.

Referring to FIG. 3, the game execution server 120 includes a sensing data receiving unit 310, an operation data determination unit 320, a game operation unit 330, a sensible action execution unit 340, and a control unit 350.

The sensing data receiving unit 310 receives the sensing data associated with the impact from the mobile terminal 110.

More specifically, the sensing data receiving unit 310 receives the sensing data including the sensing value generated by the sensing unit 210 of the mobile terminal 110 from the communication unit 220. Here, the sensing data may include a sensing value for an impact that is directly or indirectly applied to the mobile terminal 110 by a user.

In one embodiment, the sensing data receiver 310 may receive a sensing value for a direct or indirect impact sensed through at least one sensor in the mobile terminal 110. Here, the mobile terminal 110 may include at least one of a motion sensor 211 and a sound sensor 212.

In one embodiment, the sensing data receiver 310 receives at least one of a motion sensing value sensed by the motion sensor 211 in the mobile terminal 110 and a sound sensing value sensed by the sound sensor 212 .

A case where a coin flipping game proceeds according to the game execution process of the present invention will be described by way of example. When the user indirectly impacts the mobile terminal 110 by placing the mobile terminal 110 on a desk and then striking the desk with his or her palm, the motion sensor 211 of the mobile terminal 110 senses the vertical motion of the mobile terminal 110 caused by the impact and generates a motion sensing value. Here, the sound sensor 212 may generate a sound sensing value by sensing the sound caused by the collision between the desk and the user's palm. The sensing data receiving unit 310 may receive sensing data including at least one of the motion sensing value and the sound sensing value through the communication unit 220 of the mobile terminal 110.

In one embodiment, the sensing data receiver 310 may receive motion sensing values generated from each of the at least two motion sensors 211 in the mobile terminal 110. Here, the motion sensor 211 may include at least two of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor.

In one embodiment, when the mobile terminal 110 includes a first motion sensor and at least one second motion sensor whose operating speed and sensing accuracy are designed in inverse relation to each other, the sensing data receiving unit 310 may receive a first motion sensing value through the first motion sensor, receive a second motion sensing value through the at least one second motion sensor, and correct the received first motion sensing value.

Hereinafter, an example in which the first motion sensor included in the mobile terminal 110 corresponds to a gyroscope sensor and the second motion sensors correspond to an acceleration sensor and a geomagnetic sensor will be described. This description is not intended to limit the scope of the present invention.

For example, the sensing data receiving unit 310 receives the first motion sensing value generated by the gyroscope sensor which is the first motion sensor. Here, since the gyroscope sensor is designed to have a higher operation speed than the acceleration sensor and the geomagnetic sensor, the sensing data receiving unit 310 can first receive the first motion sensing value. The sensing data receiving unit 310 may receive roll, pitch, and yaw data included in the first motion sensing value.

Here, the sensing data receiving unit 310 may also receive the second motion sensing value generated by the acceleration sensor and the geomagnetic sensor, which correspond to the second motion sensors. Since the acceleration sensor is designed to have a lower operating speed than the gyroscope sensor but a higher accuracy with respect to roll and pitch, the roll and pitch data in the first motion sensing value can be corrected with the roll and pitch data included in the second motion sensing value generated by the acceleration sensor. In addition, since the geomagnetic sensor is designed to have a high accuracy with respect to yaw, the yaw data in the first motion sensing value can be corrected with the yaw data included in the second motion sensing value generated by the geomagnetic sensor.

That is, the sensing data receiving unit 310 collects motion sensing values from at least two motion sensors 211 so as to accurately recognize the impact on the mobile terminal 110, and corrects the motion sensing value received first on the basis of the values from other motion sensors having higher accuracy. By performing this process on the game execution server 120 rather than on the mobile terminal 110, the battery efficiency of the mobile terminal 110 can be increased, the delay that can occur during data processing while the game is executed can be minimized, and the processing speed can be improved.
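
The patent does not prescribe a particular correction algorithm. As a minimal sketch, assuming a simple weighted blend, the following Python fragment illustrates the idea of correcting the fast but noisier gyroscope values (roll, pitch, yaw) with the slower but more accurate accelerometer (roll, pitch) and geomagnetic (yaw) values on the server side; the blend weight is an assumption.

```python
def correct_orientation(gyro_rpy, accel_rp, mag_yaw, weight=0.8):
    """Blend a gyroscope orientation estimate with more accurate reference values.

    gyro_rpy : (roll, pitch, yaw) from the first (fast) motion sensor
    accel_rp : (roll, pitch) derived from the acceleration sensor
    mag_yaw  : yaw derived from the geomagnetic sensor
    weight   : how much of the gyroscope value to keep (0..1)
    """
    g_roll, g_pitch, g_yaw = gyro_rpy
    a_roll, a_pitch = accel_rp
    roll = weight * g_roll + (1.0 - weight) * a_roll
    pitch = weight * g_pitch + (1.0 - weight) * a_pitch
    yaw = weight * g_yaw + (1.0 - weight) * mag_yaw
    return roll, pitch, yaw

# Example: the gyroscope reports (10.0, 2.0, 90.0) degrees; the references disagree slightly.
print(correct_orientation((10.0, 2.0, 90.0), (8.0, 1.5), 85.0))
```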

The operation data determination unit 320 determines operation data corresponding to the impact data from the sensing data received from the mobile terminal 110. Here, the impact data may correspond to data including predetermined conditions for extracting the sensing value associated with the game operation from the sensing data. The operation data may correspond to data that the game client can recognize and use to manipulate a specific object in the game.

A case where the game execution server 120 executes a coin flip game will be described as an example. When the user places the mobile terminal 110 on a desk and strikes the desk with his or her palm, the operation data determination unit 320 may receive, from the mobile terminal 110, a motion sensing value corresponding to an exponentially decaying sine-like waveform in the vertical direction (z axis). That is, the operation data determination unit 320 may receive the motion sensing value corresponding to the up-and-down vibration motion of the mobile terminal 110.

Here, the operation data determination unit 320 can determine, as the operation data, the motion sensing value corresponding to preset impact data, in order to extract only the motion required to flip the coin in the coin flip game. That is, since only the initial upward vibration among the up-and-down vibration motions of the mobile terminal 110 is the motion associated with the operation of flipping the coin, the initial maximum motion sensing value in the upward direction is set in advance as the impact data, and the operation data determination unit 320 can determine, as the operation data, the sensing value in the received sensing data that corresponds to the impact data.

Here, the impact data can be set in various ways to finely manipulate (flip) the specific object (the coin) in the game. For example, depending on the surrounding environment of the mobile terminal 110, an applied impact may not be sensed properly because of the performance limits of the motion sensor 211. Therefore, the impact data may include, in addition to a motion sensing value in the vertically upward direction, a sound sensing value (e.g., a common frequency band produced by a collision between a desk and a palm).
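
As a minimal sketch, assuming the impact data is defined as "the first upward z-axis peak of magnitude 3 or more", the following Python fragment illustrates how only the sensing value associated with flipping the coin might be extracted while the decaying after-vibration is ignored; the threshold and sample format are assumptions.

```python
def extract_flip_value(z_samples, threshold=3.0):
    """Return the first upward z-axis peak above the threshold, or None if absent."""
    for prev, cur, nxt in zip(z_samples, z_samples[1:], z_samples[2:]):
        is_peak = cur > prev and cur > nxt     # local maximum of the waveform
        if is_peak and cur >= threshold:       # matches the preset impact data
            return cur
    return None

# Decaying up-and-down vibration after an indirect impact on the desk:
samples = [0.0, 4.2, 1.0, -2.5, 1.8, -1.0, 0.5, -0.2, 0.0]
print(extract_flip_value(samples))  # -> 4.2, the only value used as operation data
```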

As another example, a case where the game execution server 120 executes a slot machine game will be described. When the user places the mobile terminal 110 on the desk and shouts (for example, "wave") near the sound sensor (for example, a microphone), the operation data determination unit 320 may receive a sound sensing value corresponding to the shout.

Here, the operation data determination unit 320 can determine, as the operation data, a sound sensing value corresponding to the preset impact data in order to extract only the operation required to pass the tag in the slot machine game. That is, the sound sensing value corresponding to the sound of maximum magnitude is set in advance as the impact data, and the operation data determination unit 320 can determine the sensing value corresponding to the impact data in the received sensing data as the operation data.

Here, the impact data can be set in various ways to finely manipulate (pass) the specific object (the tag) in the game. For example, in order to accurately extract the sound made by the user for game operation, the impact data can be set as a certain range of sound magnitude and a frequency range corresponding to a plosive sound in a human voice.

That is, the operation data determination unit 320 may extract a sensing value corresponding to the impact data from the sensing data received from the mobile terminal 110, and determine the extracted sensing value as operation data for game operation.

In one embodiment, the impact data may include at least one of a vector pattern and a sound pattern. Here, the vector pattern includes a specific magnitude and direction for the motion sensing value, and the sound pattern may include a specific magnitude and frequency for the sound sensing value.

For example, the impact data can be implemented as data of [0, 0, 3], corresponding to a motion sensing value of magnitude 3 or more in the vertically upward direction (z axis) of three-dimensional space. In another example, the impact data may be implemented as [100 Hz, 3 dB] data, corresponding to the frequency of a plosive sound in a human voice and a sound magnitude of 3 dB.

That is, the vector pattern and the sound pattern correspond to predetermined pattern data for the motion of the mobile terminal 110 and the sound caused by the impact that, among the impacts applied to the mobile terminal 110, are associated with the game operation for manipulating a specific object in the game.

In one embodiment, the vector pattern may be defined as a vector pattern for Roll, Pitch, and Yaw data. That is, the vector pattern can be set not only as a vector for three-dimensional space but also as a specific range for angular rotation angles of x, y and z axes.

In one embodiment, the vector pattern may be associated with a motion sensing value and may include at least one of a predetermined range of motion magnitude and motion direction. For example, the vector pattern can be implemented as data of [0, 0, 3_5], corresponding to a motion sensing value of magnitude 3 or more and 5 or less in the vertically upward direction (z axis) of three-dimensional space (x, y, z).

In one embodiment, the sound pattern may be associated with a sound sensing value and may include at least one of a predetermined range of sound magnitude and frequency. For example, the sound pattern can be implemented as [100_150hz, 3_5db] data corresponding to the frequency range corresponding to the plosive sound of a person and the size of the sound corresponding to 3 to 5 db.

That is, the operation data determination unit 320 may extract, from the sensing data, sensing values corresponding to at least one of the vector pattern and the sound pattern included in the impact data, and determine them as the operation data for the game operation.
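
As a minimal sketch, assuming the vector pattern and sound pattern are represented as value ranges mirroring the examples above ([0, 0, 3_5] and 100 to 150 Hz at 3 to 5 dB), the following Python fragment illustrates how a received sensing value could be tested against the impact data; the data representation is an assumption.

```python
VECTOR_PATTERN = {"axis": "z", "magnitude": (3.0, 5.0)}      # upward motion of magnitude 3..5
SOUND_PATTERN = {"freq_hz": (100.0, 150.0), "level_db": (3.0, 5.0)}

def in_range(value, bounds):
    low, high = bounds
    return low <= value <= high

def matches_impact(motion_z=None, sound_freq=None, sound_db=None):
    """True if the sensing values fall inside at least one preset pattern."""
    motion_ok = motion_z is not None and in_range(motion_z, VECTOR_PATTERN["magnitude"])
    sound_ok = (sound_freq is not None and sound_db is not None
                and in_range(sound_freq, SOUND_PATTERN["freq_hz"])
                and in_range(sound_db, SOUND_PATTERN["level_db"]))
    return motion_ok or sound_ok

print(matches_impact(motion_z=4.0))                     # True: within the vector pattern
print(matches_impact(sound_freq=120.0, sound_db=2.0))   # False: sound magnitude too low
```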

In one embodiment, upon receiving sensing data from the mobile terminal 110, the operation data determination unit 320 may correct at least one of the predetermined vector pattern and sound pattern based on the sensing data. This process is performed in order to reset the impact data when the motion of the mobile terminal 110 and the sound caused by an impact on the mobile terminal 110 change according to the surrounding environment at the time the game is executed.

For example, if the user wishes to play a game with the mobile terminal 110 placed on a highly elastic rubber plate, the user may strike the rubber plate with the palm of his or her hand as a preparatory step before the game starts, thereby applying an indirect impact to the mobile terminal 110. If the operation data determination unit 320 cannot extract sensing values corresponding to the preset vector pattern and sound pattern from the sensing data received from the mobile terminal 110, it may correct the motion magnitude and motion direction included in the vector pattern and the specific range of sound magnitude and frequency included in the sound pattern.
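
As a minimal sketch, assuming a preparatory step in which the user strikes the new surface a few times, the following Python fragment illustrates one way the preset vector pattern could be rescaled around the observed peak values when the environment changes; the calibration rule is an assumption.

```python
def recalibrate_vector_pattern(pattern, calibration_peaks):
    """Re-centre the allowed magnitude range around the peaks observed during preparation."""
    observed = sum(calibration_peaks) / len(calibration_peaks)
    low, high = pattern["magnitude"]
    half_width = (high - low) / 2.0
    pattern["magnitude"] = (observed - half_width, observed + half_width)
    return pattern

pattern = {"axis": "z", "magnitude": (3.0, 5.0)}
# Peaks measured while the terminal rests on a rubber plate are much larger than usual:
print(recalibrate_vector_pattern(pattern, [7.5, 8.1, 7.9]))
```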

In one embodiment, the operation data determination unit 320 may extract a sensing value corresponding to the impact data from the sensing data and generate a game operation command that can be processed by the game client based on the extracted sensing value. Here, the game operation command may be associated with a game operation vector capable of operating a specific object in the game.

For example, when the impact data corresponds to [0, 0, 3] and the operation data determination unit 320 extracts a motion sensing value corresponding to [0, 0, 5] from the sensing data, the operation data determination unit 320 can generate a game operation command corresponding to [up, 5], which is associated with a game operation vector recognized by the game client and corresponds to a magnitude of 5 in the vertically upward direction in the game. Here, the game client can flip the coin by applying a force of magnitude 5 in the upward direction to the coin in the game through the game operation command.

In order to facilitate the understanding of the present invention, a vector pattern and a game operation vector corresponding to the z axis of three-dimensional space have been described as an example. However, a person of ordinary skill in the art will appreciate that the vector pattern and the game operation vector can be implemented in various ways. For example, when the game execution server 120 executes a boxing game, the vector pattern and the game operation vector may be set for the whole three-dimensional space, including the x, y, and z axes.
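
As a minimal sketch, assuming a trivial mapping, the following Python fragment illustrates how an extracted motion sensing value such as [0, 0, 5] might be turned into a game operation command such as [up, 5] that the game client can process; the command format is an assumption, not part of the disclosure.

```python
def to_game_command(motion_vector):
    """Map an extracted (x, y, z) sensing value to a (direction, magnitude) command."""
    x, y, z = motion_vector
    if z > 0:
        return ("up", z)           # vertical upward force applied to the object (the coin)
    if z < 0:
        return ("down", abs(z))
    # A fully 3D game (e.g. a boxing game) could also map the x and y components here.
    return ("none", 0)

command = to_game_command((0, 0, 5))
print(command)  # -> ('up', 5): the client flips the coin with a force of magnitude 5
```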

The game operation unit 330 performs a game operation by the game client 130 via the operation data determined by the operation data determination unit 320.

More specifically, the operation data determination unit 320 determines the operation data corresponding to the impact data associated with the game operation in the sensing data. Here, the operation data corresponds to a command that the game client 130 can recognize and that can operate a specific object in the game. The game operation unit 330 may transmit operation data to the game client 130 so that the game client 130 may operate a specific object.

As a result, if the user directly or indirectly impacts the mobile terminal 110, the game can be operated through the sensed value of the impact.

The sensible action execution unit 340 receives, as feedback from the game client 130, event data for a specific event generated during the game process and causes a sensible action to be performed through the sensible action means 230 included in the mobile terminal 110. That is, the sensible action execution unit 340 can utilize the mobile terminal 110 both as a means for operating a game and as a feedback device for the game operation.

For example, when the game execution server 120 executes a coin flip game and the game operation unit 330 transmits the operation data to the game client 130 so that the coin is flipped in the game, the sensible action execution unit 340 can receive event data corresponding to [coin flipping] from the game client 130 as feedback. Here, the sensible action execution unit 340 can generate data corresponding to {vibration, 3 seconds}, which causes the [vibration device] serving as the sensible action means 230 to vibrate, and transmit the data to the mobile terminal 110. That is, the sensible action execution unit 340 may cause the mobile terminal 110 to perform a specific operation for a specific event occurring during game progress, so that the user feels as if the coin were actually flipped.

In another example, when the user achieves the highest score in the game, the sensible action execution unit 340 may receive event data corresponding to [highest score] as feedback. Here, the sensible action execution unit 340 may output text corresponding to {celebration} through the display device of the mobile terminal 110 and reproduce the sound corresponding to {celebration music} through the speaker. In one embodiment, the sensible action execution unit 340 may transmit the user's score to the user's friends (e.g., followers) through an SNS application installed in the mobile terminal 110.
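
As a minimal sketch, assuming a simple lookup table, the following Python fragment illustrates how the sensible action execution unit 340 might translate feedback events from the game client into actions performed by the mobile terminal 110; the event and action names follow the examples above but are otherwise assumptions.

```python
FEEDBACK_ACTIONS = {
    "coin_flipped": [("vibrate", {"duration_s": 3})],
    "high_score": [("display_text", {"text": "Congratulations!"}),
                   ("play_sound", {"clip": "celebration_music"})],
}

def handle_feedback(event, send_to_terminal):
    """Forward the sensible actions registered for a game event to the mobile terminal."""
    for action, params in FEEDBACK_ACTIONS.get(event, []):
        send_to_terminal(action, params)

# Example: the game client reports that the coin was flipped in the game.
handle_feedback("coin_flipped", lambda action, params: print("send:", action, params))
```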

The control unit 350 controls the operation and data flow of the sensing data receiving unit 310, the operation data determination unit 320, the game operation unit 330, and the sensible action execution unit 340.

FIG. 4 is a flowchart illustrating a game execution process according to the present invention.

The sensing data receiving unit 310 receives the sensing data related to the impact from the user from the mobile terminal 110 (step S410).

More specifically, the sensing data receiving unit 310 receives the sensing data including the sensing value generated by the sensing unit 210 of the mobile terminal 110 from the communication unit 220. Here, the sensing data may include a sensing value for an impact that is directly or indirectly applied to the mobile terminal 110 by a user.

In one embodiment, the sensing data receiver 310 may receive a sensing value for a direct or indirect impact sensed through at least one sensor in the mobile terminal 110. Here, the mobile terminal 110 may include at least one of a motion sensor 211 and a sound sensor 212.

In one embodiment, the sensing data receiver 310 receives at least one of a motion sensing value sensed by the motion sensor 211 in the mobile terminal 110 and a sound sensing value sensed by the sound sensor 212 .

In one embodiment, the sensing data receiver 310 may receive motion sensing values generated from each of the at least two motion sensors 211 in the mobile terminal 110. Here, the motion sensor 211 may include at least two of an acceleration sensor, an angular velocity sensor, and a geomagnetic sensor.

In one embodiment, when the mobile terminal 110 includes a first motion sensor and at least one second motion sensor whose operating speed and sensing accuracy are designed in inverse relation to each other, the sensing data receiving unit 310 may receive a first motion sensing value through the first motion sensor, receive a second motion sensing value through the at least one second motion sensor, and correct the received first motion sensing value.

The operation data determination unit 320 determines operation data corresponding to the impact data from the sensing data received from the mobile terminal 110 (step S420). Here, the impact data may correspond to data including predetermined conditions for extracting the sensing value associated with the game operation from the sensing data. The operation data may correspond to data that the game client can recognize and use to manipulate a specific object in the game.

A case where the game execution server 120 executes a coin flip game will be described as an example. When the user places the mobile terminal 110 on a desk and strikes the desk with his or her palm, the operation data determination unit 320 may receive, from the mobile terminal 110, a motion sensing value corresponding to an exponentially decaying sine-like waveform in the vertical direction (z axis). That is, the operation data determination unit 320 may receive the motion sensing value corresponding to the up-and-down vibration motion of the mobile terminal 110.

Here, the operation data determination unit 320 can determine, as the operation data, the motion sensing value corresponding to preset impact data, in order to extract only the motion required to flip the coin in the coin flip game. That is, since only the initial upward vibration among the up-and-down vibration motions of the mobile terminal 110 is the motion associated with the operation of flipping the coin, the initial maximum motion sensing value in the upward direction is set in advance as the impact data, and the operation data determination unit 320 can determine, as the operation data, the sensing value in the received sensing data that corresponds to the impact data.

That is, the operation data determination unit 320 may extract a sensing value corresponding to the impact data from the sensing data received from the mobile terminal 110, and determine the extracted sensing value as operation data for game operation.

In one embodiment, the operation data determination unit 320 may extract a sensing value corresponding to the impact data from the sensing data and generate a game operation command that can be processed by the game client based on the extracted sensing value. Here, the game operation command may be associated with a game operation vector capable of operating a specific object in the game.

For example, when the impact data corresponds to [0, 0, 3] and the operation data determination unit 320 extracts a motion sensing value corresponding to [0, 0, 5] from the sensing data, the operation data determination unit 320 can generate a game operation command corresponding to [up, 5], which is associated with a game operation vector recognized by the game client and corresponds to a magnitude of 5 in the vertically upward direction in the game. Here, the game client can flip the coin by applying a force of magnitude 5 in the upward direction to the coin in the game through the game operation command.

The game operation unit 330 performs a game operation by the game client 130 via the operation data determined by the operation data determination unit 320 (step S430).

More specifically, the operation data determination unit 320 determines the operation data corresponding to the impact data associated with the game operation in the sensing data. Here, the operation data corresponds to a command that the game client 130 can recognize and that can operate a specific object in the game. The game operation unit 330 may transmit operation data to the game client 130 so that the game client 130 may operate a specific object.

As a result, if the user directly or indirectly impacts the mobile terminal 110, the game can be operated through the sensed value of the impact.

FIG. 5 is a flowchart illustrating a game execution process according to an embodiment of the present invention.

The sensing data receiving unit 310 may receive, from the mobile terminal 110, at least one of a motion sensing value and a sound sensing value associated with an impact applied by a user (step S510).

For example, when the game execution server 120 proceeds with a coin flip game, if the user places the mobile terminal 110 on a desk and applies an indirect impact to the mobile terminal 110 by striking the desk with his or her palm, the motion sensor 211 of the mobile terminal 110 may sense the vertical motion of the mobile terminal 110 caused by the impact and generate a motion sensing value. Here, the sound sensor 212 may generate a sound sensing value by sensing the sound caused by the collision between the desk and the user's palm. The sensing data receiving unit 310 may receive sensing data including at least one of the motion sensing value and the sound sensing value through the communication unit 220 of the mobile terminal 110.

The operation data determination unit 320 may extract a sensing value corresponding to at least one of a vector pattern and a sound pattern included in the impact data in the sensing data (step S520).

More specifically, the impact data may include at least one of a vector pattern and a sound pattern. Here, the vector pattern may include a specific magnitude and direction for the motion sensing value, and the sound pattern may include a specific magnitude and frequency for the sound sensing value.

For example, the impact data can be implemented as [0, 0, 3] data corresponding to a motion sensing value of magnitude 3 or more in the vertically upward direction (z axis) of three-dimensional space. In another example, the impact data may be implemented as [100 Hz, 3 dB] data corresponding to the frequency of a plosive sound made by a person and a sound magnitude of 3 dB.

In one embodiment, the vector pattern may be defined as a vector pattern for Roll, Pitch, and Yaw data. That is, the vector pattern can be set not only as a vector in three-dimensional space but also as a specific range of rotation angles about the x, y, and z axes.

In one embodiment, the vector pattern may be associated with the motion sensing value and may include at least one of a motion magnitude and a motion direction in a predetermined range. For example, the vector pattern can be implemented as [0, 0, 3_5] data corresponding to a motion sensing value of magnitude 3 or more and 5 or less in the vertically upward direction (z axis) of three-dimensional space (x, y, z).

In one embodiment, the sound pattern may be associated with the sound sensing value and may include at least one of a sound magnitude and a frequency in a predetermined range. For example, the sound pattern can be implemented as [100_150 Hz, 3_5 dB] data corresponding to a frequency range of a plosive sound made by a person and a sound magnitude of 3 to 5 dB.
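As a sketch of how such range patterns could be checked, the following Python assumes the ranges written as 3_5 and 100_150 Hz above are stored as (low, high) tuples; the function names are illustrative.

```python
# Sketch of range matching for the vector and sound patterns above; the 3_5 and
# 100_150 Hz notations are assumed to be stored as (low, high) tuples.

def matches_vector_pattern(motion, z_range=(3, 5)):
    """True if the upward (z axis) motion magnitude lies within the range."""
    low, high = z_range
    return low <= motion[2] <= high

def matches_sound_pattern(freq_hz, level_db,
                          freq_range=(100, 150), level_range=(3, 5)):
    """True if both the frequency and the sound magnitude lie within their ranges."""
    return (freq_range[0] <= freq_hz <= freq_range[1]
            and level_range[0] <= level_db <= level_range[1])

print(matches_vector_pattern([0, 0, 4]))   # True: 3 <= 4 <= 5
print(matches_sound_pattern(120, 4))       # True: 100-150 Hz and 3-5 dB
```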

The operation data determination unit 320 may extract a sensing value corresponding to at least one of the vector pattern and the sound pattern included in the impact data from the sensing data, and determine the extracted sensing value as the operation data for the game operation (step S530).

The operation data determination unit 320 may extract a sensing value corresponding to the impact data from the sensing data and generate, based on the extracted sensing value, a game operation command that can be processed by the game client (step S540). Here, the game operation command may be associated with a game operation vector capable of operating a specific object in the game.

For example, when the impact data corresponds to [0, 0, 3] and the operation data determination unit 320 extracts a motion sensing value corresponding to [0, 0, 5] from the sensing data, the operation data determination unit 320 can generate a game operation command corresponding to [up, 5], which is associated with a game operation vector recognized by the game client and corresponds to a magnitude of 5 in the vertically upward direction in the game. Here, the game client can flip the coin by applying, through the game operation command, a force of magnitude 5 in the upward direction to the coin in the game.

The game operation unit 330 causes the game client 130 to perform a game operation through the operation data determined by the operation data determination unit 320 (step S550).

More specifically, the operation data determination unit 320 determines, from the sensing data, the operation data corresponding to the impact data associated with the game operation. Here, the operation data corresponds to a command that the game client 130 can recognize and that can operate a specific object in the game. The game operation unit 330 may transmit the operation data to the game client 130 so that the game client 130 can operate the specific object.

As a result, if the user directly or indirectly impacts the mobile terminal 110, the game can be operated through the sensed value of the impact.

The sensible action execution unit 340 may receive specific event data generated during the game process from the game client 130 as feedback (step S560).

Here, the sensible action execution unit 340 may perform a sensible action through the sensible action means 230 included in the mobile terminal 110, using the received feedback (step S570). That is, the sensible action execution unit 340 can utilize the mobile terminal 110 both as a means for operating the game and as a feedback device for the game operation.

For example, when the game execution server 120 corresponds to a coin flip game, if the game operation unit 330 transmits the operation data to the game client 130 and the coin is flipped in the game, the sensible action execution unit 340 can receive event data corresponding to [coin flipping] from the game client 130 as feedback. Here, the sensible action execution unit 340 may generate data corresponding to {vibration, 3 seconds}, which can cause the [vibration apparatus] serving as the sensible action means 230 to vibrate, and transmit the data to the mobile terminal 110. That is, the sensible action execution unit 340 may cause the mobile terminal 110 to perform a specific operation for a specific event occurring during the game progress, so that the user feels as if the coin had actually been flipped.

In another example, when the user achieves the highest score in the game, the sensible action execution unit 340 may receive event data corresponding to [best point] as feedback. Here, the sensible action execution unit 340 may output text corresponding to {celebration} through the display device of the mobile terminal 110 and reproduce a sound corresponding to {celebration music} through the speaker. In one embodiment, the sensible action execution unit 340 may transmit the user's score to the user's friends (e.g., followers) through an SNS application installed in the mobile terminal 110.
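A minimal sketch of such feedback handling is shown below; the event names, the action table, and the send_to_terminal callback are illustrative assumptions rather than definitions from the patent.

```python
# Illustrative mapping from game events (received as feedback) to sensible
# actions on the mobile terminal; the event names and the action table are
# assumptions made for this sketch.

SENSIBLE_ACTIONS = {
    "coin flipped": {"device": "vibration", "duration_s": 3},
    "best point":   {"device": "display", "text": "celebration",
                     "sound": "celebration music"},
}

def on_game_event(event_name, send_to_terminal):
    """Forward a feedback event to the mobile terminal as a sensible action."""
    action = SENSIBLE_ACTIONS.get(event_name)
    if action is not None:
        send_to_terminal(action)   # e.g. vibrate the terminal for 3 seconds

# Example with a stand-in transport: prints the vibration action.
on_game_event("coin flipped", print)
```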

FIG. 6 is a diagram illustrating an example of a game execution system according to an embodiment of the present invention.

Referring to FIG. 6, the game performing system 100 may be implemented by including the game execution server 120 and the game client 130 in a personal computer (PC). Here, the mobile terminal 110 may be connected to the game execution server 120 through a wireless network to the PC, and the game client 130 may be connected to the game execution server 120 through a wired network inside the PC. That is, the user can execute a game installed on the PC and manipulate the game by applying a direct or indirect impact to the mobile terminal 110.

FIG. 7 is a diagram illustrating an example of a game screen for explaining a game execution process according to an embodiment of the present invention. FIG. 7 illustrates a screen on which a game can be performed according to an embodiment of the present invention, and is not intended to limit the scope of the present invention.

In FIG. 7A, the game performing server 120 is connected to the game client 130 corresponding to the coin flip game program, and can execute the coin flip game.

The sensing data receiving unit 310 may receive the sensing data corresponding to the impact directly or indirectly applied to the mobile terminal 110 by the user. Here, the sensing data may include a motion sensing value corresponding to the up and down vibration movement of the mobile terminal 110 and a sound sensing value corresponding to the sound due to the impact.

The operation data determination unit 320 can extract the sensing value associated with the operation of flipping the coin in the game and determine it as the operation data. Here, the operation data determination unit 320 may determine, as the operation data, the motion sensing value of the mobile terminal 110 having the maximum magnitude, or a sound sensing value corresponding to a predetermined frequency range.

The game operation unit 330 can generate a game operation command that the game client can recognize from the determined operation data, so that the coin is flipped in the game.

In FIG. 7B, the game performing server 120 is connected to the game client 130 corresponding to the slotting game program, and can execute the slotting game.

The sensing data receiving unit 310 may receive sensing data corresponding to an impact indirectly applied to the mobile terminal 110 by the user. That is, when the user shouts "wave" near the microphone of the mobile terminal 110, as if passing a real tag, the sound sensor 212 of the mobile terminal 110 can sense at least one of the magnitude and the frequency of the sound to generate a sound sensing value. Here, the sensing data receiving unit 310 may receive the sensing data including the sound sensing value from the mobile terminal 110.

The operation data determination unit 320 can extract the sensing value associated with the operation of passing a tag in the game and determine it as the operation data. Here, the operation data determination unit 320 may determine, as the operation data, a sound sensing value corresponding to a predetermined frequency range among the sound sensing values.

The game operation unit 330 can generate a game operation command that the game client can recognize from the determined operation data, so that the tag is passed in the game.

In FIG. 7C, the game performing server 120 may be connected to the game client 130 corresponding to the rhythm game program, and may execute the rhythm game.

In order to implement this embodiment, the game performing system 100 may include a plurality of mobile terminals 110. In FIG. 7C, it is assumed that the game execution server 120 is connected to five mobile terminals 110.

The sensing data receiving unit 310 may receive sensing data corresponding to an impact indirectly applied to each of the plurality of mobile terminals 110 by the user.

For example, when the user arranges five mobile terminals 110 in a row on a desk and applies an impact to the desk once, each of the five mobile terminals 110 can generate a motion sensing value corresponding to the impact. Here, the sensing data receiving unit 310 may receive a motion sensing value from each of the five mobile terminals 110.

The operation data determination unit 320 may determine the mobile terminal 110 whose motion sensing value has the maximum magnitude among the received sensing data. That is, by determining the specific mobile terminal 110 that has transmitted the largest motion sensing value among the five motion sensing values, the operation data determination unit 320 can determine which of the five mobile terminals 110 is closest to the point where the user applied the impact.

Here, the operation data determination unit 320 may assign identification numbers to the five mobile terminals 110. For example, the operation data determination unit 320 may assign the numbers 1 to 5 starting from the leftmost mobile terminal. Such an assignment may be made for each mobile terminal 110 through a game environment setting at a stage before the game is executed.

If the mobile terminal 110 that has transmitted the motion sensing value of the maximum magnitude corresponds to number 2, the operation data determination unit 320 can ignore the motion sensing values of the mobile terminals 110 numbered 1 and 3 to 5, extract only the motion sensing value transmitted by the second mobile terminal 110, and determine it as the operation data.

The game operation unit 330 can generate a game operation command that the game client can recognize from the determined operation data, so that it is processed as a strike on line 2 in the game.
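The line-selection logic of this rhythm-game example can be sketched as follows, assuming the identification numbers 1 to 5 have already been assigned as described above; the function name and the dictionary layout are illustrative.

```python
# Sketch of selecting the struck line: the terminal whose motion sensing value
# has the largest magnitude is treated as the one the user hit. Terminal numbers
# 1 to 5 are assumed to have been assigned during game environment setup.

def select_struck_line(motion_by_terminal):
    """motion_by_terminal maps a terminal number (1-5) to its motion magnitude."""
    return max(motion_by_terminal, key=motion_by_terminal.get)

samples = {1: 0.4, 2: 5.1, 3: 1.2, 4: 0.9, 5: 0.3}
print(select_struck_line(samples))   # 2 -> processed as a strike on line 2
```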

That is, when the user places a plurality of mobile terminals 110 in order and a rhythm bar falls on line 2 in the game, the user can manipulate the game by applying an indirect impact near the second mobile terminal 110.

It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the present invention as defined by the following claims.

110: mobile terminal 120: game execution server
130: game client 210: sensor unit
220: communication unit 230: sensible action means
240:
310: sensing data receiving unit 320: operation data determining unit
330: game operation unit 340: sensible action execution unit
350:

Claims (23)

A game execution method performed in a game execution server that can be connected to a mobile terminal, the method comprising:
receiving sensing data associated with a shock from a user from the mobile terminal;
determining operation data corresponding to impact data associated with a game operation from the received sensing data; and
performing a game operation by the game client through the determined operation data.
2. The method of claim 1, wherein receiving sensing data associated with an impact by the user comprises:
receiving a sensing value for a direct or indirect impact sensed through at least one sensor in the mobile terminal.
3. The method of claim 2, wherein receiving sensing data associated with an impact by the user comprises:
receiving at least one of a motion sensing value sensed by a motion sensor in the mobile terminal and a sound sensing value sensed by a sound sensor.
4. The method of claim 3, wherein receiving sensing data associated with an impact by the user comprises:
receiving the motion sensing value from each of at least two motion sensors in the mobile terminal.
5. The method of claim 4, wherein receiving sensing data associated with an impact by the user comprises:
receiving a first motion sensing value via a first motion sensor; and
receiving a second motion sensing value via at least one second motion sensor to correct the received first motion sensing value.
6. The method according to claim 1, wherein the step of determining operation data corresponding to the impact data comprises:
extracting a sensing value corresponding to at least one of a vector pattern and a sound pattern included in the impact data from the sensing data.
7. The method of claim 6,
wherein the vector pattern includes at least one of a motion magnitude and a motion direction in a predetermined range, which are associated with the motion sensing value by the mobile terminal.
8. The method of claim 6,
wherein the sound pattern includes at least one of a sound magnitude and a frequency in a predetermined range, which are associated with the sound sensing value by the mobile terminal.
9. The method according to claim 6, wherein the step of determining operation data corresponding to the impact data comprises:
associating the extracted sensing value with a game operation vector associated with the game operation to generate a game operation command that can be processed by the game client.
10. The method according to claim 6, wherein the step of determining operation data corresponding to the impact data comprises:
correcting at least one of the vector pattern and the sound pattern based on the received sensing data.
11. The method according to claim 1, further comprising:
receiving feedback on the game operation from the game client; and
performing a sensible action by the mobile terminal through the received feedback.
A game execution server capable of being connected to a mobile terminal, the game execution server comprising:
a sensing data receiver for receiving sensing data associated with an impact from a user from the mobile terminal;
an operation data determination unit that determines operation data corresponding to impact data associated with a game operation in the received sensing data; and
a game operation unit for performing a game operation by the game client through the determined operation data.
13. The apparatus of claim 12, wherein the sensing data receiver
receives a sensing value for a direct or indirect impact sensed through at least one sensor in the mobile terminal.
14. The apparatus of claim 13, wherein the sensing data receiver
receives at least one of a motion sensing value sensed by a motion sensor in the mobile terminal and a sound sensing value sensed by a sound sensor.
15. The apparatus of claim 14, wherein the sensing data receiver
receives the motion sensing value from each of at least two motion sensors in the mobile terminal.
16. The apparatus of claim 15, wherein the sensing data receiver
receives a first motion sensing value via a first motion sensor and receives a second motion sensing value via at least one second motion sensor to correct the received first motion sensing value.
17. The apparatus according to claim 12, wherein the operation data determination unit
extracts a sensing value corresponding to at least one of a vector pattern and a sound pattern included in the impact data from the sensing data.
18. The apparatus of claim 17,
wherein the vector pattern includes at least one of a motion magnitude and a motion direction in a predetermined range, which are associated with the motion sensing value by the mobile terminal.
19. The apparatus of claim 17,
wherein the sound pattern includes at least one of a sound magnitude and a frequency in a predetermined range, which are associated with the sound sensing value by the mobile terminal.
20. The apparatus of claim 17, wherein the operation data determination unit
associates the extracted sensing value with a game operation vector associated with the game operation to generate a game operation command that can be processed by the game client.
21. The apparatus of claim 17, wherein the operation data determination unit
corrects at least one of the vector pattern and the sound pattern based on the received sensing data.
22. The apparatus of claim 12,
further comprising a sensible action execution unit for receiving feedback on the game operation from the game client and performing a sensible action by the mobile terminal through the received feedback.
A computer-readable recording medium storing a computer program for a game execution method performed in a game execution server that can be connected to a mobile terminal, the method comprising:
receiving sensing data associated with a shock from a user from the mobile terminal;
determining operation data corresponding to impact data associated with a game operation in the received sensing data; and
performing a game operation by the game client through the determined operation data.
KR1020120126610A 2012-09-14 2012-11-09 Method of performing game, server performing the same and media storing the same KR20140060025A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020120126610A KR20140060025A (en) 2012-11-09 2012-11-09 Method of performing game, server performing the same and media storing the same
PCT/KR2013/008368 WO2014042484A1 (en) 2012-09-14 2013-09-16 Game providing method, terminal, server and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020120126610A KR20140060025A (en) 2012-11-09 2012-11-09 Method of performing game, server performing the same and media storing the same

Publications (1)

Publication Number Publication Date
KR20140060025A true KR20140060025A (en) 2014-05-19

Family

ID=50889602

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020120126610A KR20140060025A (en) 2012-09-14 2012-11-09 Method of performing game, server performing the same and media storing the same

Country Status (1)

Country Link
KR (1) KR20140060025A (en)

Similar Documents

Publication Publication Date Title
US8184092B2 (en) Simulation of writing on game consoles through the use of motion-sensing technology
US11794105B2 (en) Game processing system, game processing program, and game processing method
CN102446025B (en) Physical model based gesture recognition
CN103272382A (en) Method and device for using Bluetooth gamepad to simulate intelligent terminal touch screen to control game
CN107562201B (en) Directional interaction method and device, electronic equipment and storage medium
JP2013125373A5 (en)
JP6504058B2 (en) INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM
JP6064111B2 (en) User interface system, operation signal analysis method and program for batting operation
JP2017012619A (en) Computer program for advancing game by touch operation
US20230041294A1 (en) Augmented reality (ar) pen/hand tracking
KR20140060025A (en) Method of performing game, server performing the same and media storing the same
JP2015123109A (en) Program and server
CN114547581A (en) Method and apparatus for providing a captcha system
JP2018117867A (en) Program and game system
JP2020110603A (en) Game program, method, and information processing unit
JP2015231437A (en) Program and game system
Costa et al. An architecture for using smartphones as interfaces for computer games
JP2017012307A (en) Program and terminal
JP2015147064A (en) Program, game device, and server system
US20140152548A1 (en) Data processing apparatus and recording medium
US11745101B2 (en) Touch magnitude identification as input to game
JP6307651B1 (en) GAME PROGRAM, METHOD, AND INFORMATION PROCESSING DEVICE
US20240173618A1 (en) User-customized flat computer simulation controller
JP6668425B2 (en) Game program, method, and information processing device
CN115080936A (en) Security verification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal