KR101673805B1 - Method and user device for setting relationship between bolck and iot device - Google Patents

Method and user device for setting relationship between bolck and iot device

Info

Publication number
KR101673805B1
Authority
KR
South Korea
Prior art keywords
block
toy
motion
control
setting
Prior art date
Application number
KR1020150127651A
Other languages
Korean (ko)
Inventor
홍제훈
Original Assignee
(주)모션블루
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)모션블루 filed Critical (주)모션블루
Priority to KR1020150127651A priority Critical patent/KR101673805B1/en
Priority to US14/966,253 priority patent/US20170065878A1/en
Application granted granted Critical
Publication of KR101673805B1 publication Critical patent/KR101673805B1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63HTOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H30/00Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
    • A63H30/02Electrical arrangements
    • A63H30/04Electrical arrangements using wireless transmission
    • H04M1/72533
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2201/00Electronic components, circuits, software, systems or apparatus used in telephone systems
    • H04M2201/42Graphical user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M2250/00Details of telephonic subscriber devices
    • H04M2250/22Details of telephonic subscriber devices including a touch pad, a touch sensor or a touch detector

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Toys (AREA)

Abstract

A method and a user terminal for setting a control relationship between a block and an Internet of Things (IoT) operation device are provided. A user terminal for setting a control relationship between a block and a motion toy according to an embodiment of the present invention includes a recognition unit for recognizing at least one of the block and the motion toy disposed on the screen, and a control relationship setting unit that sets a control relationship allowing the recognized motion toy to be controlled according to the operation attribute of the recognized block, wherein the operation attribute includes at least one of an identifier of the motion toy for which the control relationship is set, data sensed by the block, and a control command for controlling the motion toy in response to the sensed data.

Description

METHOD AND USER DEVICE FOR SETTING RELATIONSHIP BETWEEN BOLCK AND IOT DEVICE

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a technique for setting a control relationship between a block and an Internet of Things (IoT) operation device controlled by the block.

A motion toy such as a robot can arouse the user's curiosity as its control means are diversified.

In the past, control means such as wired or wireless remote controllers have been used. Recently, with the development of speech recognition technology, techniques for controlling a motion toy using various sensors, such as controlling it by voice, have been developed.

However, since the sensors are integrated into the motion toy itself, sensor-based control is limited to those built-in sensors, and the sensing values used to control the motion toy are predetermined. This not only causes the user's curiosity to fade but also makes creative learning with the toy difficult.

SUMMARY OF THE INVENTION The present invention has been made to solve the above-mentioned problems of the prior art, and it is an object of the present invention to provide a method for intuitively setting a control relationship between a motion toy and blocks for controlling the motion toy.

It is another object of the present invention to provide a method for combining a plurality of blocks so that the motion toy can be controlled in various ways, and a method for setting a control relationship between the combined blocks and the motion toy.

In order to achieve the above object, a user terminal for setting a control relationship between a block and a motion toy according to an embodiment of the present invention includes a recognition unit for recognizing at least one of the block and the motion toy disposed on the screen, and a control relationship setting unit that sets a control relationship allowing the recognized motion toy to be controlled in accordance with the operation attribute of the recognized block, wherein the operation attribute includes an identifier of the motion toy for which the control relationship is set, data sensed by the block, and a control command for controlling the motion toy in response to the sensed data.

According to another aspect of the present invention, there is provided a method for setting a control relationship between a block and a motion toy, the method comprising the steps of: (a) recognizing at least one of the block and the motion toy disposed on the screen; and (b) setting a control relationship that allows the recognized motion toy to be controlled according to the operation attribute of the recognized block, wherein the operation attribute includes an identifier of the motion toy for which the control relationship is set, data sensed by the block, and a control command for controlling the motion toy in response to the sensed data.

According to an embodiment of the present invention, the control relationship between a motion toy and the blocks for controlling the motion toy can be set easily and in an enjoyable way.

In addition, since control relationships can be set between various blocks and the motion toy, the control means of the motion toy are diversified, so that the user's curiosity can be continually stimulated.

In addition, creative learning is possible by setting the control relation between the motion toy and various blocks, and setting the control relation for the combination of the blocks.

It should be understood that the effects of the present invention are not limited to the above effects and include all effects that can be deduced from the detailed description of the present invention or the configuration of the invention described in the claims.

BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a block diagram of a system for setting a control relationship between a block and an action toy according to an embodiment of the present invention.
2 is a block diagram illustrating a configuration of a user terminal according to an embodiment of the present invention.
3 is a flowchart illustrating a process of setting a control relationship between a block and an action toy according to an embodiment of the present invention.
4 is a diagram illustrating a method of setting a control relationship between a block and an action toy according to an embodiment of the present invention.
5 is a diagram illustrating a screen for setting operation attributes of a block according to an embodiment of the present invention.
FIG. 6 is a diagram illustrating a method of releasing a control relationship between a block and an operation toy according to an embodiment of the present invention.
7 is a diagram illustrating a method of setting a control relationship between a block and a block according to an embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, the present invention will be described with reference to the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein.

In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

Throughout the specification, when a part is referred to as being "connected" to another part, this includes not only the case of being "directly connected" but also the case of being "indirectly connected" with another element interposed therebetween.

Also, when a part is described as "comprising" an element, this means that it may further include other elements rather than excluding them, unless specifically stated otherwise.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram of a system for setting a control relationship between a block and an action toy according to an embodiment of the present invention.

A system 100 for setting a control relationship between a block and a motion toy may include a block 110, an Internet of Things (IoT) operation device 120, and a user terminal 130.
Hereinafter, the 'IoT operation device 120' is referred to as the 'motion toy 120'.

First, the block 110 may be placed on the screen of the user terminal 130, and a control relationship may be set with the motion toy 120 disposed on the screen of the user terminal 130.

The control relationship between the block 110 and the motion toy 120, or between the block 110 and another block 110, may be set by dragging, drag-and-drop, or a sequential touch (touching the placement area of the motion toy after touching the placement area of the block) between the placement areas displayed on the screen of the user terminal 130.

The control relation setting will be described in detail with reference to FIG. 5 to FIG.

In addition, there may be various kinds of blocks 110.

For example, there may be blocks including a sensor (a voice sensor, a motion sensor, a pressure sensor, a light sensor, etc.) or an output means (an LED, a speaker, a display window, etc.).

In addition, an operation attribute of the block 110 may be set through a user interface provided by the user terminal 130.

Here, the 'operation attribute' may include identification information of the motion toy 120 having a control relationship with the block 110, data (an attribute value) sensed by the block 110, a condition under which a control command is transmitted to the motion toy 120, and the control command for controlling the motion toy 120.

For example, when a control relationship is set between the block 110 and another block, the operation attribute may include identification information of the other block for which the control relationship is set, data (an attribute value) sensed by the block 110, and the condition under which a control command is transmitted.

In one embodiment, in the control relationship between a sensor block 110 including a voice recognition sensor and the motion toy 120, the operation attribute may be set as follows:

'Motion toy ID - ACT_ROBOT'

'Attribute (sensing) value - 3 seconds or more'

In this case, if the user's voice (or breath) is sensed by the sensor block 110 for 3 seconds or more, the motion toy 120 whose ID is ACT_ROBOT may rotate once.
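
As a concrete illustration only, this operation attribute could be represented in the terminal application roughly as follows; the patent prescribes no data format, so the TypeScript shape, the field names, and the ROTATE_ONCE command string are assumptions made for this sketch.

```typescript
// A minimal sketch of an operation attribute record, assuming a TypeScript/HTML5
// terminal application; all names and values here are illustrative, not from the patent.
interface OperationAttribute {
  toyId: string;              // identifier of the motion toy in the control relationship
  condition: {
    sensor: 'voice' | 'motion' | 'pressure' | 'light';
    minDurationMs: number;    // e.g. voice sustained for at least this long
  };
  controlCommand: string;     // command sent to the toy when the condition is met
}

// The embodiment above: a voice-sensing block tied to toy "ACT_ROBOT",
// which rotates once when the user's voice (or breath) is sustained for 3 seconds.
const voiceBlockAttribute: OperationAttribute = {
  toyId: 'ACT_ROBOT',
  condition: { sensor: 'voice', minDurationMs: 3000 },
  controlCommand: 'ROTATE_ONCE',
};

// Hypothetical check performed by the user terminal when sensing data arrives.
function shouldFireCommand(attr: OperationAttribute, sensedDurationMs: number): boolean {
  return sensedDurationMs >= attr.condition.minDurationMs;
}

console.log(shouldFireCommand(voiceBlockAttribute, 3200)); // true -> send ROTATE_ONCE
```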

The block 110 may also include a wireless communication module so that it can transmit the sensed information (value) to the user terminal 130 and communicate with the user terminal 130 wirelessly.

A specific pattern may be formed on the portion of the block 110 that contacts the screen of the user terminal 130 so that the user terminal 130 can identify the block 110, and an electrostatic method can be used for the pattern.

Meanwhile, the operation toy 120 may be disposed on the screen of the user terminal 130, and a control relationship with the block 110 disposed on the screen of the user terminal 130 may be established.

In addition, the action toy 120 may operate by receiving a control command corresponding to the attribute (sensing) value set in the operation attribute of the block 110 for which the control relationship is set, from the user terminal 130.

The motion toy 120 may also include a wireless communication module for wirelessly communicating with the user terminal 130, and a specific pattern may be formed on the bottom portion of the motion toy 120 that contacts the screen so that the user terminal 130 can recognize (identify) the motion toy 120.

Here, the 'pattern' for recognizing (identifying) the motion toy 120 may use an electrostatic method in the same manner as the block 110.

For reference, a control command for controlling the motion toy 120 may control the motion of the motion toy 120 itself, or may cause a specific function of the motion toy 120 to be executed within a game running on the user terminal 130.

For example, suppose a missile-firing game is running on the screen of the user terminal 130 and the motion toy 120 disposed on the screen serves as a missile launcher. In the game, a missile may be fired from the position of the motion toy 120 according to the sensing data of a sensor block 110 that senses the user's shooting gesture.

Meanwhile, the user terminal 130 may recognize the block 110 or the motion toy 120 disposed on the screen. At this time, the pattern formed on the lower portion of the block 110 or the motion toy 120 may be used to identify the block 110 or the motion toy 120.

In addition, the user terminal 130 may display a 'placement area' on the screen around the recognized block 110 or the action toy 120.

The 'placement area' is an indication that the user terminal 130 has recognized the block 110 or the motion toy 120, and may have a shape such as a circle or a rectangle according to the bottom shape of the block 110 or the motion toy 120.

Of course, as shown in FIG. 1, the placement area may have a fixed shape irrespective of the bottom shape of the block 110 or the motion toy 120.

When setting the control relationship between the block 110 and the motion toy 120, or between the block 110 and another block 110, the user may touch the placement area of the block 110 or the motion toy 120 for a specific time or longer and then, following the guidance of the user terminal 130, drag it to the placement area of the desired block 110 or motion toy 120 to set the control relationship.

For reference, the control relationship can be set not only by dragging, but also by drag-and-drop or a sequential touch (touching the placement area of the motion toy after touching the placement area of the block).
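
Purely as a sketch of how such drag, drag-and-drop, or sequential-touch gestures between placement areas might be resolved into a control relationship in a touch-screen application (the patent does not specify an implementation), the following TypeScript fragment illustrates one possible approach; the hit-testing, data structures, and two-touch sequence logic are assumptions.

```typescript
// Illustrative sketch: resolving a drag or sequential touch between two placement
// areas into a control relationship. All structures here are hypothetical.
interface PlacementArea { ownerId: string; x: number; y: number; radius: number; }

function hitTest(areas: PlacementArea[], px: number, py: number): PlacementArea | undefined {
  return areas.find(a => Math.hypot(px - a.x, py - a.y) <= a.radius);
}

interface ControlRelation { sourceId: string; targetId: string; }
const relations: ControlRelation[] = [];

// Drag (or drag-and-drop): start point in one placement area, end point in another.
function onDragEnd(areas: PlacementArea[],
                   start: { x: number; y: number },
                   end: { x: number; y: number }): void {
  const from = hitTest(areas, start.x, start.y);
  const to = hitTest(areas, end.x, end.y);
  if (from && to && from.ownerId !== to.ownerId) {
    relations.push({ sourceId: from.ownerId, targetId: to.ownerId });
  }
}

// Sequential touch: touching the block's area and then the toy's area has the same effect.
let pendingSource: PlacementArea | undefined;
function onTap(areas: PlacementArea[], x: number, y: number): void {
  const area = hitTest(areas, x, y);
  if (!area) { pendingSource = undefined; return; }
  if (!pendingSource) { pendingSource = area; return; }
  if (pendingSource.ownerId !== area.ownerId) {
    relations.push({ sourceId: pendingSource.ownerId, targetId: area.ownerId });
  }
  pendingSource = undefined;
}
```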

Once the control relationship is established, the user terminal 130 may store control relationship configuration information.

In addition, the user terminal 130 may display, around the recognized block 110, a user interface for setting the operation attribute of that block 110, and may store the operation attribute input through the user interface together with the control relationship setting information.

The user terminal 130 may also be connected wirelessly with the block 110 and the motion toy 120.

When sensed data is received from the block 110, the user terminal 130 may transmit a control command corresponding to the received data to the motion toy 120 based on the operation attribute of the corresponding block 110.

The user terminal 130 may connect to a service server (not shown) to update information about the patterns of the block 110 and the motion toy 120, information about the block 110, and information about games that use the motion toy 120.

For reference, the user terminal 130 includes a processor and a memory coupled to the processor, and the memory may store program instructions that cause the processor to perform the operations of the user terminal 130 described in the embodiments of the present invention.
In addition, an application that executes the above-described operations through the program instructions may be produced using HTML5 (Hyper Text Markup Language 5).

The user terminal 130 may be a smart phone or a tablet computer that includes a touch screen on which the block 110 and the motion toy 120 can be placed.

2 is a block diagram illustrating a configuration of a user terminal according to an embodiment of the present invention.

The user terminal 130 according to an embodiment of the present invention includes a recognition unit 131, a placement area display unit 132, a control relation setting unit 133, a storage unit 134, an operation attribute user interface providing unit 135, a control command providing unit 136, and a communication unit 137.

The recognition unit 131 can recognize the blocks 110 and the motion toys 120 disposed on the screen.

The recognition unit 131 can identify the block 110 or the motion toy 120 by recognizing the pattern formed on the lower part of the block 110 or the motion toy 120 disposed on the screen, and here the 'pattern' may be of an electrostatic type.

The recognition unit 131 may extract the position of the recognized block 110 or motion toy 120 on the screen.

To this end, the recognition unit 131 may recognize a plurality of nodes constituting the pattern formed at the lower portion of the block 110 or the motion toy 120, and may determine the position (coordinates) of the center node among the plurality of nodes as the position at which the block 110 or the motion toy 120 is disposed.
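
One plausible reading of this center-node step, expressed as a small TypeScript sketch; the node geometry, the centroid-based selection, and the example coordinates below are assumptions, since the patent does not detail how the center node is chosen.

```typescript
// Hypothetical sketch: a block's electrostatic pattern appears to the touch screen
// as several contact nodes; the node closest to their centroid is taken as the
// block's placement position. Geometry and node count are illustrative only.
interface Point { x: number; y: number; }

function placementPosition(nodes: Point[]): Point {
  if (nodes.length === 0) throw new Error('no pattern nodes detected');
  // Centroid of all detected nodes.
  const cx = nodes.reduce((s, n) => s + n.x, 0) / nodes.length;
  const cy = nodes.reduce((s, n) => s + n.y, 0) / nodes.length;
  // Pick the physical node nearest the centroid as the "center node".
  return nodes.reduce((best, n) =>
    Math.hypot(n.x - cx, n.y - cy) < Math.hypot(best.x - cx, best.y - cy) ? n : best);
}

// Example: five contact points of a (hypothetical) block pattern.
const nodes: Point[] = [
  { x: 100, y: 100 }, { x: 140, y: 100 }, { x: 120, y: 120 },
  { x: 100, y: 140 }, { x: 140, y: 140 },
];
console.log(placementPosition(nodes)); // -> { x: 120, y: 120 }
```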

Meanwhile, the placement area display unit 132 can display the placement area around the block 110 or the motion toy 120 recognized by the recognition unit 131.

At this time, the placement area display unit 132 can calculate the size of the placement area using the position of the block 110 or the motion toy 120 extracted by the recognition unit 131 and the bottom area of the recognized block 110 or motion toy 120.

That is, since the arrangement position (coordinates) of the block 110 or the motion toy 120 is extracted by the recognition unit 131, and the shape and size of the bottom of the block 110 or the motion toy 120 contacting the screen of the user terminal 130 are predetermined, the placement area display unit 132 can calculate the size of the placement area displayed around the recognized block 110 or motion toy 120.

For reference, the placement area display unit 132 may display the shape of the placement area according to the bottom area type of the block 110 or the motion toy 120 when displaying the placement area.

For example, if the bottom surface of the block 110 or the motion toy 120 is circular, the placement area can be displayed in a circular shape, and if it is rectangular, the placement area can be displayed in a rectangular shape.
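
As an illustration of this calculation only (the patent gives no formulas), the sketch below assumes a lookup table of predetermined bottom shapes keyed by the identified pattern and a small margin factor; all names and dimensions are hypothetical.

```typescript
// Sketch: deriving the placement area from the recognized identity and position.
// The shape table and margin factor are assumptions for illustration.
type BottomShape =
  | { kind: 'circle'; radius: number }
  | { kind: 'rect'; width: number; height: number };

const bottomShapes: Record<string, BottomShape> = {
  VOICE_BLOCK: { kind: 'circle', radius: 30 },
  ACT_ROBOT: { kind: 'rect', width: 80, height: 60 },
};

interface PlacementArea { ownerId: string; x: number; y: number; shape: BottomShape; }

function makePlacementArea(ownerId: string, x: number, y: number, margin = 1.2): PlacementArea {
  const base = bottomShapes[ownerId];
  if (!base) throw new Error(`unknown pattern: ${ownerId}`);
  // Scale the predetermined bottom shape slightly so the area surrounds the object.
  const shape: BottomShape = base.kind === 'circle'
    ? { kind: 'circle', radius: base.radius * margin }
    : { kind: 'rect', width: base.width * margin, height: base.height * margin };
  return { ownerId, x, y, shape };
}

console.log(makePlacementArea('ACT_ROBOT', 200, 300)); // rect 96 x 72 around (200, 300)
```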

The control relation setting unit 133 can set the control relationship between the block 110 and the motion toy 120, or between the block 110 and another block 110.

For reference, before the 'control relationship' is set, a menu or a specific button/icon for setting the control relationship may be selected, and the control relation setting unit 133 may then indicate, among the blocks 110 and motion toys 120 disposed on the screen, the blocks 110 and motion toys 120 for which a control relationship can be set.

The control relationship between the block 110 and the motion toy 120, or between the block 110 and another block 110, can be set by dragging, drag-and-drop, or sequential touch between the respective placement areas displayed on the screen.

Here, the motion toy 120 may be in a control relationship with one or more blocks 110 (1:1 or 1:N), and the block 110 may also be in a control relationship with one or more other blocks (1:1 or 1:N).
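
One way such 1:1 or 1:N relationships could be kept in the storage unit, sketched in TypeScript; the map-based layout and the identifiers used are assumptions rather than a structure defined by the patent.

```typescript
// Sketch: a block may control several toys and/or blocks, so the setting
// information is kept as a one-to-many map. Structure is illustrative only.
const controlRelations = new Map<string, Set<string>>(); // source block id -> target ids

function addRelation(sourceBlockId: string, targetId: string): void {
  const targets = controlRelations.get(sourceBlockId) ?? new Set<string>();
  targets.add(targetId);
  controlRelations.set(sourceBlockId, targets);
}

function removeRelation(sourceBlockId: string, targetId: string): void {
  controlRelations.get(sourceBlockId)?.delete(targetId);
}

// A voice block controlling one toy and one LED block (1:N).
addRelation('VOICE_BLOCK', 'ACT_ROBOT');
addRelation('VOICE_BLOCK', 'LED_BLOCK');
console.log(controlRelations.get('VOICE_BLOCK')); // Set { 'ACT_ROBOT', 'LED_BLOCK' }
```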

Meanwhile, the storage unit 134 may store the control relationship setting information between the block 110 and the motion toy 120, or between the block 110 and another block 110.

In addition, the storage unit 134 may store the operation attribute together with the control relationship setting information of the block 110 when the operation attribute of the block 110 is set through the operation attribute user interface described later.

In addition, the storage unit 134 may store information on the games executed in the user terminal 130, the blocks 110 available in each game, and the motion toy 120.

In addition, the storage unit 134 may store information on the background of each game and information on the arrangement positions of the blocks 110 and the motion toys 120 in that background.

When the user terminal 130 accesses the service server (not shown), the information stored in the storage unit 134 may be compared with the latest information of the service server (not shown) to determine whether the information is updated.

Of course, when the information of the service server (not shown) is updated with the latest information, the user terminal 130 may receive a notification message from the service server (not shown) and update the information stored in the storage unit 134 accordingly.

On the other hand, the operation attribute user interface providing unit 135 may display on the screen a user interface for inputting the operation attributes of the block 110 for which the control relation with the operation toy 120 is established.

When the block 110 and the motion toy 120 are disposed on the screen and sensing data is received from the block 110, the control command providing unit 136 can transmit the control command defined in the operation attribute of the corresponding block 110 to the motion toy 120 having a control relationship with that block 110.

Meanwhile, the communication unit 137 can wirelessly communicate with the block 110 and the motion toy 120 using various wireless communication methods, and wirelessly communicate with a service server (not shown).

3 is a flowchart illustrating a process of setting a control relationship between a block and an action toy according to an embodiment of the present invention.

The process of FIG. 3 may be performed by the user terminal 130 described above. Hereinafter, the process of FIG. 3 will be described with the user terminal 130 as the performing entity.

The user terminal 130 recognizes the blocks 110 and the motion toys 120 disposed on the screen (S301).

After step S301, the user terminal 130 displays the placement area around the recognized block 110 and the motion toy 120, respectively (S302).

After step S302, the user terminal 130 sets the control relationship between the block 110 and the motion toy 120 by dragging, drag-and-drop, or sequential touching between the placement areas of the block 110 and the motion toy 120 (S303).

After step S303, the user terminal 130 displays a user interface for setting an operation attribute of the block around the block on the screen (S304).

After S304, the user terminal 130 stores the operation attribute of the block input through the user interface (S305).

When sensing data is received from the block 110 after step S305, the user terminal 130 extracts the control command corresponding to the received sensing data from the operation attribute of the corresponding block 110 and transmits the extracted control command to the motion toy 120 (S306).
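
Reading steps S305 and S306 as event handling in the terminal application, a minimal TypeScript sketch might look like the following; the attribute store, the flattened attribute shape, and the sendToToy stand-in for the wireless transmission are assumptions.

```typescript
// Sketch of S306: on sensing data from a block, look up the stored operation
// attribute and forward the defined control command to the related toy.
// The attribute store and the wireless send function are hypothetical.
interface OperationAttribute { toyId: string; minDurationMs: number; controlCommand: string; }

const attributesByBlock = new Map<string, OperationAttribute>([
  ['VOICE_BLOCK', { toyId: 'ACT_ROBOT', minDurationMs: 3000, controlCommand: 'ROTATE_ONCE' }],
]);

function sendToToy(toyId: string, command: string): void {
  // Placeholder for the terminal's wireless communication unit (137).
  console.log(`-> ${toyId}: ${command}`);
}

function onSensingData(blockId: string, sensedDurationMs: number): void {
  const attr = attributesByBlock.get(blockId);
  if (!attr) return;                                  // no control relationship set for this block
  if (sensedDurationMs < attr.minDurationMs) return;  // condition in the attribute not met
  sendToToy(attr.toyId, attr.controlCommand);         // S306
}

onSensingData('VOICE_BLOCK', 3200); // -> ACT_ROBOT: ROTATE_ONCE
```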

4 is a diagram illustrating a method of setting a control relationship between a block and an action toy according to an embodiment of the present invention.

For setting the control relationship between the block 110 and the motion toy 120, the user can check information about the block 110 and the motion toy 120 and request the control relationship setting by selecting a predetermined menu or button.

When the user inputs a control relationship setting request, the user terminal 130 can display the blocks 110 and motion toys 120 for which a control relationship can be set.

At this time, the user terminal 130 can display the placement areas 410 of the blocks 110 and motion toys 120 for which a control relationship can be set by blinking them in a specific color, and can also output a specific sound.

As shown in FIG. 4, the user may drag the placement area of the block 110 to the placement area of the motion toy 120, or conversely drag the placement area of the motion toy 120 to the placement area of the block 110, to set the control relationship between the block 110 and the motion toy 120.

Here, the control relation may be set not only by dragging, but also by drag-and-drop or sequential touch.

As shown in FIG. 4, the block 110 and the motion toy 120 for which the control relationship is set may be connected by a 'control relation setting line 420' indicating that the control relationship has been established.

5 is a diagram illustrating a screen for setting operation attributes of a block according to an embodiment of the present invention.

FIG. 5 shows a state in which the block 110 and the motion toy 120 are disposed on the screen of the user terminal 130, the user terminal 130 has recognized them, and the control relationship setting between the block 110 and the motion toy 120 has been completed.

As shown in FIG. 5, each placement area 510 is displayed around the recognized block 110 and the motion toy 120.

After the control relationship between the block 110 and the motion toy 120 is established, if the user's touch is maintained in the placement area 510 of the block 110 for more than a certain time, a user interface 520 for setting the operation attributes can be displayed on the screen as shown in FIG. 5, and setting values for the operation attributes can be input from the user.

FIG. 6 is a diagram illustrating a method of releasing a control relationship between a block and an operation toy according to an embodiment of the present invention.

FIG. 6 shows a state in which a block 110 and a motion toy 120 are disposed on the screen of the user terminal 130, the user terminal 130 has recognized the block 110 and the motion toy 120, and a placement area 610 is displayed around each of them.

If the user's touch is maintained in the placement area 610 of the block 110 or the motion toy 120 for more than a certain time, the user terminal 130 can display on the screen, as shown in FIG. 6, a user interface 620 for checking information about the corresponding block 110 or motion toy 120 and its control relationship setting information.

When the control relationship setting information button 621 is selected by the user, the user terminal 130 can display the information block 630 of the block 110 or the motion toy 120 for which the current control relationship is set.

Thereafter, when the control relationship setting release button 631 is selected by the user, the user terminal 130 can cancel the control relationship setting between the block 110 and the motion toy 120.

That is, when the control relationship setting is released in either the block 110 or the motion toy 120, the control relationship setting on both sides can be canceled.

When the control relationship between the block 110 and the motion toy 120 is canceled, the motion attribute of the block 110 may also be initialized.

7 is a diagram illustrating a method of setting a control relationship between a block and a block according to an embodiment of the present invention.

FIG. 7 shows a state in which a first block 111 and a second block 112, which are a plurality of blocks, are disposed on the screen of the user terminal 130, the user terminal 130 has recognized the blocks 111 and 112, and a placement area 710 is displayed around each recognized block 111 and 112.

In the embodiment of FIG. 7, the first block 111 may be a voice sensing block and the second block 112 may be an LED block.

The user terminal 130 may display information 720 about the voice sensing block 111 on the screen if the user's touch is maintained in the placement area 710 of the voice sensing block 111 for more than a specific time.

At this time, if there is a block or an operation toy having a control relation with the voice sensing block 111, the user terminal 130 may further display the control relationship setting information.

Thereafter, in order to set the control relationship between the blocks 111 and 112, the control relationship can be established by dragging the placement area of the LED block 112 to the placement area of the voice sensing block 111.

Of course, it is also possible to set the control relationship by dragging the placement area of the voice sensing block 111 to the placement area of the LED block 112.

Here, the control relation may be set not only by dragging, but also by drag-and-drop or sequential touch.

As shown in FIG. 7, the voice sensing block 111 and the LED block 112 for which the control relationship is established can be connected by a 'control relation setting line 730' indicating that the control relationship has been set.

The user terminal 130 may display on the screen a user interface (not shown) for setting an operation attribute of the LED block 112 if the user's touch is maintained in the placement area 710 of the LED block 112 for a predetermined time or longer.

At this time, since the block for which the control relationship is set is the voice sensing block 111, the user interface (not shown) may include items for setting the brightness of the LED block 112 stepwise according to the level of the sensed voice.
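
As a sketch of the kind of stepwise mapping such an item might define, the fragment below maps a sensed voice level to an LED brightness step; the thresholds and brightness values are invented for illustration.

```typescript
// Sketch: mapping a sensed voice level to a stepwise LED brightness, as the
// operation-attribute UI for the LED block might allow. Thresholds are illustrative.
const brightnessSteps: Array<{ minLevel: number; brightness: number }> = [
  { minLevel: 80, brightness: 100 }, // loud voice -> full brightness
  { minLevel: 50, brightness: 60 },
  { minLevel: 20, brightness: 30 },
];

function ledBrightnessFor(voiceLevel: number): number {
  const step = brightnessSteps.find(s => voiceLevel >= s.minLevel);
  return step ? step.brightness : 0;      // below all thresholds: LED off
}

console.log(ledBrightnessFor(65)); // 60
console.log(ledBrightnessFor(10)); // 0
```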

For reference, releasing the control relationship between blocks is performed in the same manner as releasing the control relationship between the block 110 and the motion toy 120 shown in FIG. 6, and thus a detailed description thereof is omitted.

The foregoing description of the present invention is for illustrative purposes only, and those of ordinary skill in the art will understand that various changes and modifications may be made without departing from the spirit or essential characteristics of the present invention.

It is therefore to be understood that the above-described embodiments are illustrative in all aspects and not restrictive.

For example, each component described as a single entity may be distributed and implemented, and components described as being distributed may also be implemented in a combined form.

The scope of the present invention is defined by the appended claims, and all changes or modifications derived from the meaning and scope of the claims and their equivalents should be construed as being included within the scope of the present invention.

100: System for setting the control relationship between a block and a motion toy
110: Block
120: Motion toy
130: User terminal
131: Recognition unit
132: Placement area display unit
133: Control relation setting unit
134: Storage unit
135: Operation attribute user interface providing unit
136: Control command providing unit
137: Communication unit

Claims (15)

1. A user terminal for setting a control relationship between a block and a motion toy disposed on a screen, the user terminal comprising:
a recognition unit for recognizing at least one of the block and the motion toy using a pattern formed on a lower portion of the block or the motion toy when the pattern touches the screen as the block or the motion toy is disposed on the screen;
a placement area display unit for displaying, upon recognition of the block or the motion toy, a placement area indicating the area where the recognized block or motion toy is disposed, in the vicinity of the recognized block or motion toy;
a control relation setting unit for setting the control relationship between the recognized block and the motion toy by dragging, drag-and-drop, or sequential touching between the placement areas of the block and the motion toy, and for displaying a control relation setting line between the placement areas;
an operation attribute user interface providing unit for displaying a user interface for inputting an operation attribute including at least one of an identifier of the motion toy for which the control relationship is set, data to be sensed by the block, and a control command for controlling the motion toy in response to the sensed data; and
a control command providing unit for transmitting, when the sensed data is received from the block disposed on the screen, the control command defined in the operation attribute to the motion toy based on the operation attribute of the block.

2. to 4. (deleted)

5. The user terminal according to claim 1, wherein the placement area display unit blinks, or displays in a specific color, the placement areas of the block and the motion toy for which the control relationship can be set.

6. (deleted)

7. The user terminal according to claim 1, wherein the operation attribute user interface providing unit displays a user interface for setting the operation attribute of the block or releasing the set control relationship, and
wherein the user interface sets one or more conditions on the number and size of data sensed by the block, and sets the control command so as to control the motion toy when the set condition is satisfied.

8. (deleted)

9. A method for setting a control relationship between a block and a motion toy disposed on a screen of a user terminal, the method comprising:
recognizing at least one of the block and the motion toy using a pattern formed on the bottom of the block or the motion toy when the pattern touches the screen as the block or the motion toy is disposed on the screen;
displaying, upon recognition of the block or the motion toy, a placement area indicating the area where the block or the motion toy is disposed, around the recognized block or motion toy;
setting the control relationship between the recognized block and the motion toy by dragging, drag-and-drop, or sequential touching between the placement areas of the block and the motion toy, and displaying a control relation setting line between the placement areas;
displaying a user interface for inputting an operation attribute including at least one of an identifier of the motion toy for which the control relationship is set, data to be sensed by the block, and a control command for controlling the motion toy in response to the sensed data; and
transmitting, when the sensed data is received from the block disposed on the screen, the control command defined in the operation attribute to the motion toy based on the operation attribute of the block.

10. to 13. (deleted)

14. The method according to claim 9, wherein the block and the motion toy operate on a background of a game displayed on the screen of the user terminal.

15. A computer program stored in a medium, comprising a series of instructions for causing a computer to perform the method according to any one of claims 9 to 14.
KR1020150127651A 2015-09-09 2015-09-09 Method and user device for setting relationship between bolck and iot device KR101673805B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150127651A KR101673805B1 (en) 2015-09-09 2015-09-09 Method and user device for setting relationship between bolck and iot device
US14/966,253 US20170065878A1 (en) 2015-09-09 2015-12-11 Block, method and user terminal for providing a game by setting control relation between the block and a toy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150127651A KR101673805B1 (en) 2015-09-09 2015-09-09 Method and user device for setting relationship between bolck and iot device

Publications (1)

Publication Number Publication Date
KR101673805B1 true KR101673805B1 (en) 2016-11-08

Family

ID=57527946

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150127651A KR101673805B1 (en) 2015-09-09 2015-09-09 Method and user device for setting relationship between bolck and iot device

Country Status (1)

Country Link
KR (1) KR101673805B1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101946464B1 (en) 2017-01-26 2019-02-11 (주)로보케어 Apparatus for providing sensing interactive contents and method thereof

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100975017B1 (en) * 2009-10-16 2010-08-09 한밭대학교 산학협력단 Color study apparatus using electronic cube

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100975017B1 (en) * 2009-10-16 2010-08-09 한밭대학교 산학협력단 Color study apparatus using electronic cube

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101946464B1 (en) 2017-01-26 2019-02-11 (주)로보케어 Apparatus for providing sensing interactive contents and method thereof

Similar Documents

Publication Publication Date Title
US9983687B1 (en) Gesture-controlled augmented reality experience using a mobile communications device
US20170185276A1 (en) Method for electronic device to control object and electronic device
US10814220B2 (en) Method for controlling display of electronic device using multiple controllers and device for the same
CN108465238A (en) Information processing method, electronic equipment in game and storage medium
KR20180006956A (en) Information processing method and terminal, and computer storage medium
CN106878390B (en) Electronic pet interaction control method and device and wearable equipment
BR112014029915B1 (en) METHOD FOR GENERATING AN INTERFACE THAT IS ENHANCED FOR A PRESENT CONTEXT, COMPUTER STORAGE MEDIA AND COMPUTING DEVICE
CN106068640A (en) Optionally notify to wearable computing device redirection
CN104520787A (en) Headset computer (HSC) as auxiliary display with ASR and HT input
CN107066173B (en) Method of controlling operation thereof and device
CN108211349A (en) Information processing method, electronic equipment and storage medium in game
KR102021851B1 (en) Method for processing interaction between object and user of virtual reality environment
KR102655584B1 (en) Display apparatus and controlling method thereof
JP2017535001A (en) Push user interface
US12028476B2 (en) Conversation creating method and terminal device
CN107251570A (en) Sensing data availability from remote equipment
US20220152505A1 (en) Virtual object control method and apparatus, storage medium, and electronic device
CN104932782A (en) Information processing method and apparatus and smart glasses
KR101987859B1 (en) A program, a game system, an electronic device, a server, and a game control method for improving operability of user input
KR101673805B1 (en) Method and user device for setting relationship between bolck and iot device
US20150084848A1 (en) Interaction between generic interaction devices and an interactive display
KR101742444B1 (en) Method and system for coupling among devices using adaptive pattern recognition
US20220011861A1 (en) Inputs to virtual reality devices from touch surface devices
KR102308927B1 (en) Method for outputting screen and display device for executing the same
US20170065878A1 (en) Block, method and user terminal for providing a game by setting control relation between the block and a toy

Legal Events

Date Code Title Description
GRNT Written decision to grant