CN109799733A - Control method and control system of a robot - Google Patents

Control method and control system of a robot

Info

Publication number
CN109799733A
CN109799733A (application number CN201711144855.1A)
Authority
CN
China
Prior art keywords
mutual
instruction
behavior
action behavior
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711144855.1A
Other languages
Chinese (zh)
Other versions
CN109799733B (en)
Inventor
胡杨刚
黄维涵
李广超
何智谦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Yeyou Information Technology Co ltd
Original Assignee
Guangzhou Smart Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Smart Technology Co Ltd
Priority to CN201711144855.1A
Publication of CN109799733A
Application granted
Publication of CN109799733B
Legal status: Active
Anticipated expiration


Landscapes

  • Manipulator (AREA)
  • Toys (AREA)

Abstract

The invention discloses a control method and control system of a robot. The control method comprises: receiving a behavior category determination instruction sent by a mobile terminal for controlling an interaction behavior of the robot; obtaining data information of the corresponding category from a database according to the behavior category determination instruction; receiving an interaction behavior control signal sent by the mobile terminal, wherein the interaction behavior control signal is generated by shaking the mobile terminal; selecting, according to the interaction behavior control signal, corresponding data information from the data information of the corresponding category to generate an interaction behavior control instruction; and sending the interaction behavior control instruction to the robot, so that the robot executes a corresponding action according to the interaction behavior control instruction. The invention has the advantage that interaction with an intelligent robot can be realized by shaking a mobile phone, which avoids choice difficulty for the user.

Description

Control method and control system of a robot
Technical field
The present invention relates to the technical field of intelligent robots, and in particular to a control method and control system of a robot.
Background technique
Intelligent robots are popular with consumers because of their well-developed "brains": they offer functions such as audio and video playback and can be interacted with in simple ways such as voice or touch. However, existing intelligent robots require resources to be downloaded through an interface, and when a user faces a large number of music/story resources, the user has difficulty choosing.
Summary of the invention
The present invention aims to solve at least one of the above technical problems.
To this end, a first object of the present invention is to propose a control method of a robot, which enables interaction with an intelligent robot by shaking a mobile phone and avoids choice difficulty for the user.
To achieve the above object, an embodiment of the invention discloses a control method of a robot, comprising the following steps: receiving a behavior category determination instruction sent by a mobile terminal for controlling an interaction behavior of the robot; obtaining data information of the corresponding category from a database according to the behavior category determination instruction; receiving an interaction behavior control signal sent by the mobile terminal, wherein the interaction behavior control signal is generated by shaking the mobile terminal; selecting, according to the interaction behavior control signal, corresponding data information from the data information of the corresponding category to generate an interaction behavior control instruction; and sending the interaction behavior control instruction to the robot, so that the robot executes a corresponding action according to the interaction behavior control instruction.
Further, the behavior category determination instruction includes at least one of an audio playback instruction, a video playback instruction and a limb action instruction, wherein the limb action instruction is used to control the robot to perform a corresponding limb action.
Further, selecting corresponding data information from the data information of the corresponding category according to the interaction behavior control signal to generate the interaction behavior control instruction further comprises: obtaining interaction behavior auxiliary selection information input by the user, wherein the interaction behavior auxiliary selection information includes at least one of the user's age, the time of the interaction behavior, and the user's preference type; selecting corresponding data information from the data information of the corresponding category according to the interaction behavior control signal and the interaction behavior auxiliary selection information; and generating the interaction behavior control instruction according to the selected data information.
Further, when the behavior category determination instruction includes an audio playback instruction or a video playback instruction, the interaction behavior auxiliary selection information further includes creator information.
Further, when the behavior category determination instruction includes an audio playback instruction, the interaction behavior auxiliary selection information further includes tempo information.
With the control method of the robot according to the embodiment of the present invention, the behavior category of the robot is selected first, and then the robot is made to execute a corresponding operation within the chosen behavior category by shaking the mobile phone, which avoids choice difficulty for the user.
To this end, a second object of the present invention is to propose a control system of a robot, which enables interaction with an intelligent robot by shaking a mobile phone and avoids choice difficulty for the user.
To achieve the above object, an embodiment of the invention discloses a control system of a robot, comprising: a communication module for receiving a behavior category determination instruction and an interaction behavior control signal sent by a mobile terminal, so that the robot executes a corresponding action according to an interaction behavior control instruction, wherein the behavior category determination instruction is used to determine the interaction behavior category of the robot, and the interaction behavior control signal is generated by shaking the mobile terminal; a storage module for storing data information; and a control module for obtaining data information of the corresponding category from the storage module according to the behavior category determination instruction, the control module also being configured to select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal to generate the interaction behavior control instruction.
Further, the behavior category determination instruction includes at least one of an audio playback instruction, a video playback instruction and a limb action instruction, wherein the limb action instruction is used to control the robot to perform a corresponding limb action.
Further, the control module is further configured to obtain interaction behavior auxiliary selection information input by the user, select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal and the interaction behavior auxiliary selection information, and then generate the interaction behavior control instruction according to the selected data information; wherein the interaction behavior auxiliary selection information includes at least one of the user's age, the time of the interaction behavior, and the user's preference type.
Further, when the behavior category determination instruction includes an audio playback instruction or a video playback instruction, the interaction behavior auxiliary selection information further includes creator information.
Further, when the behavior category determination instruction includes an audio playback instruction, the interaction behavior auxiliary selection information further includes tempo information.
With the control system of the robot according to the embodiment of the present invention, the behavior category of the robot is selected first, and then the robot is made to execute a corresponding operation within the chosen behavior category by shaking the mobile phone, which avoids choice difficulty for the user.
Additional aspects and advantages of the invention will be set forth in part in the following description, will in part become apparent from the description, or may be learned by practice of the invention.
Detailed description of the invention
The above and/or additional aspects and advantages of the invention will become apparent and readily understood from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flowchart of a control method of a robot according to an embodiment of the invention;
Fig. 2 is a structural block diagram of a control system of a robot according to an embodiment of the invention.
Specific embodiment
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary, are intended only to explain the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it should be understood that terms such as "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on those shown in the drawings, are used only for convenience of describing the present invention and simplifying the description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the present invention. In addition, the terms "first" and "second" are used for descriptive purposes only and should not be understood as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise expressly specified and limited, the terms "mounted", "connected" and "coupled" are to be understood broadly; for example, a connection may be a fixed connection, a detachable connection or an integral connection; it may be a mechanical connection or an electrical connection; and it may be a direct connection, an indirect connection through an intermediate medium, or an internal communication between two elements. For those of ordinary skill in the art, the specific meanings of the above terms in the present invention can be understood according to the specific circumstances.
These and other aspects of the embodiments of the present invention will become clear with reference to the following description and drawings. In the description and drawings, some specific implementations of the embodiments of the present invention are disclosed in detail to indicate some of the ways in which the principles of the embodiments of the invention may be carried out, but it should be understood that the scope of the embodiments of the invention is not limited thereby. On the contrary, the embodiments of the invention include all changes, modifications and equivalents falling within the spirit and scope of the appended claims.
The present invention is described below with reference to the accompanying drawings.
Fig. 1 is the flow chart of the control method of the robot of one embodiment of the invention.As shown in Figure 1, the present invention is implemented The control method of the robot of example, comprising the following steps:
S1: receive a behavior category determination instruction sent by the mobile terminal for controlling an interaction behavior of the robot.
Specifically, the mobile terminal sends the behavior category determination instruction to the communication terminal of the robot. In one embodiment of the invention, the behavior category determination instruction includes at least one of an audio playback instruction, a video playback instruction and a limb action instruction, wherein the limb action instruction is used to control the robot to perform a corresponding limb action, for example walking or raising a hand.
S2: obtain data information of the corresponding category from the database according to the behavior category determination instruction.
Specifically, after the robot receives the behavior category determination instruction, data information of the corresponding category is obtained from the database. For example, after an audio playback instruction is received, audio information is obtained from the audio library.
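Purely as an illustration of steps S1 and S2, the following minimal sketch (Python) shows one way the received category instruction could be mapped to candidate data records. The patent does not specify a data model, so names such as BehaviorCategory and DATABASE, and the example records, are assumptions rather than part of the disclosure.

```python
from enum import Enum

class BehaviorCategory(Enum):
    """Hypothetical categories mirroring the embodiment: audio, video, limb action."""
    AUDIO_PLAYBACK = "audio"
    VIDEO_PLAYBACK = "video"
    LIMB_ACTION = "limb_action"

# Hypothetical in-memory stand-in for the database of the embodiment.
DATABASE = {
    BehaviorCategory.AUDIO_PLAYBACK: [
        {"title": "Lullaby No. 1", "tempo_bpm": 60, "min_age": 0},
        {"title": "Folk Dance", "tempo_bpm": 120, "min_age": 3},
    ],
    BehaviorCategory.VIDEO_PLAYBACK: [
        {"title": "Animal Story", "min_age": 2},
    ],
    BehaviorCategory.LIMB_ACTION: [
        {"action": "walk"},
        {"action": "raise_hand"},
    ],
}

def fetch_category_data(instruction: dict) -> list:
    """Step S2: look up the data records for the category named in the received instruction."""
    category = BehaviorCategory(instruction["category"])
    return DATABASE[category]

# Example: a behavior category determination instruction sent by the mobile terminal.
candidates = fetch_category_data({"category": "audio"})
```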
S3: receive an interaction behavior control signal sent by the mobile terminal, wherein the interaction behavior control signal is generated by shaking the mobile terminal.
Specifically, the mobile terminal is configured to generate the interaction behavior control signal when it is shaken, and this signal can be used to control the robot to execute an action.
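The patent does not describe how the shake itself is detected. A common approach, sketched below only as an assumption, is to threshold the magnitude of the accelerometer reading on the mobile terminal and emit an interaction behavior control signal when the threshold is exceeded; the threshold value and all names here are illustrative.

```python
import math
import time
from typing import Optional

SHAKE_THRESHOLD_G = 2.5   # acceleration magnitude (in g) treated as a shake; illustrative value
DEBOUNCE_SECONDS = 1.0    # ignore repeated triggers within this window

_last_trigger = 0.0

def detect_shake(ax: float, ay: float, az: float) -> Optional[dict]:
    """Return an interaction behavior control signal when a shake is detected, else None."""
    global _last_trigger
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    now = time.monotonic()
    if magnitude > SHAKE_THRESHOLD_G and now - _last_trigger > DEBOUNCE_SECONDS:
        _last_trigger = now
        return {"type": "interaction_behavior_control", "magnitude": magnitude, "timestamp": now}
    return None
```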
S4: select, according to the interaction behavior control signal, corresponding data information from the data information of the corresponding category to generate an interaction behavior control instruction.
In an example of the invention, the robot selects a piece of music from the music-category data information according to the interaction behavior control signal and generates an interaction behavior control instruction for playing that piece of music.
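One simple way to realize this example, offered only as a sketch and not as the patented algorithm, is to treat the shake-generated control signal as a trigger for a pseudo-random pick from the candidate records of the chosen category; the field names are hypothetical.

```python
import random

def select_on_shake(control_signal: dict, candidates: list) -> dict:
    """On receipt of a shake-generated control signal, pick one record from the
    category's data and wrap it in an interaction behavior control instruction."""
    chosen = random.choice(candidates)
    return {
        "type": "interaction_behavior_control_instruction",
        "action": "play_audio",
        "payload": chosen,
        "trigger": control_signal,
    }

# Example: a shake signal plus two candidate pieces of music.
instruction = select_on_shake(
    {"type": "interaction_behavior_control", "magnitude": 3.1},
    [{"title": "Lullaby No. 1"}, {"title": "Folk Dance"}],
)
```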
In one embodiment of the invention, step S4 further comprises:
S401: obtain interaction behavior auxiliary selection information input by the user, wherein the interaction behavior auxiliary selection information includes at least one of the user's age, the time of the interaction behavior, and the user's preference type. For example, when music playback is selected in step S2, if the robot is playing music for a child of one to two years old and the time of the interaction behavior is set to 9:00 to 9:30 in the evening, light music is played.
Further, when the behavior category determination instruction includes an audio playback instruction or a video playback instruction, the interaction behavior auxiliary selection information further includes creator information, for example to select works by authors suited to the child's age group.
Further, when the behavior category determination instruction includes an audio playback instruction, the interaction behavior auxiliary selection information further includes tempo information; a soothing tempo helps children fall asleep.
S402: select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal and the interaction behavior auxiliary selection information, for example select a lullaby from the audio library.
S403: generate the interaction behavior control instruction according to the selected data information.
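Steps S401 to S403 can be read as a filtering stage ahead of the pick. The sketch below is one possible interpretation under assumed record fields (min_age, tempo_bpm, creator) and an assumed auxiliary-information dictionary; none of these names come from the patent.

```python
import random

def filter_candidates(candidates: list, aux: dict) -> list:
    """Step S402: narrow the category data using the auxiliary selection information
    (user age, time of the interaction behavior, preference type, creator, tempo)."""
    kept = []
    for record in candidates:
        if "user_age" in aux and record.get("min_age", 0) > aux["user_age"]:
            continue
        if "max_tempo_bpm" in aux and record.get("tempo_bpm", 0) > aux["max_tempo_bpm"]:
            continue
        if "creator" in aux and record.get("creator") not in (None, aux["creator"]):
            continue
        kept.append(record)
    return kept or candidates  # fall back to the full list if everything was filtered out

def generate_instruction(candidates: list, aux: dict) -> dict:
    """Steps S402 and S403: filter, then build the interaction behavior control instruction."""
    chosen = random.choice(filter_candidates(candidates, aux))
    return {"type": "interaction_behavior_control_instruction", "payload": chosen}

# Example: a one- to two-year-old child at bedtime, preferring slow music.
instruction = generate_instruction(
    [{"title": "Lullaby No. 1", "tempo_bpm": 60, "min_age": 0},
     {"title": "Folk Dance", "tempo_bpm": 120, "min_age": 3}],
    {"user_age": 1, "max_tempo_bpm": 80},
)
```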
S5: send the interaction behavior control instruction to the robot so that the robot executes a corresponding action according to the interaction behavior control instruction, for example send an instruction to play the lullaby to the robot so that the robot plays the lullaby.
With the control method of the robot according to the embodiment of the present invention, the behavior category of the robot is selected first, and then the robot is made to execute a corresponding operation within the chosen behavior category by shaking the mobile phone, which avoids choice difficulty for the user.
Fig. 2 is a structural block diagram of a control system of a robot according to an embodiment of the invention. As shown in Fig. 2, the control system of the robot of the embodiment of the invention comprises: a communication module 210, a storage module 220 and a control module 230.
The communication module 210 is configured to receive the behavior category determination instruction and the interaction behavior control signal sent by the mobile terminal, so that the robot executes a corresponding action according to the interaction behavior control instruction. The behavior category determination instruction is used to determine the interaction behavior category of the robot, and the interaction behavior control signal is generated by shaking the mobile terminal. The storage module 220 is configured to store data information. The control module 230 is configured to obtain data information of the corresponding category from the storage module according to the behavior category determination instruction, and is further configured to select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal to generate the interaction behavior control instruction.
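Purely as an illustration of how the three modules of Fig. 2 might cooperate, here is a minimal sketch; the class names, the in-memory storage and the placeholder transport methods are assumptions rather than the disclosed implementation.

```python
class CommunicationModule:
    """Receives the behavior category determination instruction and the shake-generated
    interaction behavior control signal from the mobile terminal, and forwards the
    resulting instruction to the robot."""
    def receive(self) -> dict:
        raise NotImplementedError("the transport (e.g. Wi-Fi or Bluetooth) is outside this sketch")

    def send_to_robot(self, instruction: dict) -> None:
        raise NotImplementedError("the transport to the robot is outside this sketch")


class StorageModule:
    """Stores the data information, keyed by behavior category."""
    def __init__(self, data: dict):
        self._data = data

    def query(self, category: str) -> list:
        return self._data.get(category, [])


class ControlModule:
    """Loads the category data when a category instruction arrives and, on a shake
    control signal, selects a record and generates the control instruction."""
    def __init__(self, storage: StorageModule):
        self._storage = storage
        self._candidates: list = []

    def on_category_instruction(self, instruction: dict) -> None:
        self._candidates = self._storage.query(instruction["category"])

    def on_control_signal(self, signal: dict) -> dict:
        chosen = self._candidates[0] if self._candidates else {}
        return {"type": "interaction_behavior_control_instruction", "payload": chosen}
```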
With the control system of the robot according to the embodiment of the present invention, the behavior category of the robot is selected first, and then the robot is made to execute a corresponding operation within the chosen behavior category by shaking the mobile phone, which avoids choice difficulty for the user.
In one embodiment of the invention, the behavior category determination instruction includes at least one of an audio playback instruction, a video playback instruction and a limb action instruction, wherein the limb action instruction is used to control the robot to perform a corresponding limb action.
In one embodiment of the invention, the control module 230 is further configured to obtain interaction behavior auxiliary selection information input by the user, select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal and the interaction behavior auxiliary selection information, and then generate the interaction behavior control instruction according to the selected data information. The interaction behavior auxiliary selection information includes at least one of the user's age, the time of the interaction behavior, and the user's preference type.
In one embodiment of the invention, when the behavior category determination instruction includes an audio playback instruction or a video playback instruction, the interaction behavior auxiliary selection information further includes creator information.
In one embodiment of the invention, when the behavior category determination instruction includes an audio playback instruction, the interaction behavior auxiliary selection information further includes tempo information.
It should be noted that the specific implementation of the control system of the robot of the embodiment of the present invention is similar to that of the control method of the robot of the embodiment of the present invention; refer to the description of the method part for details, which is not repeated here in order to reduce redundancy.
In addition, other components and effects of the control method and control system of the robot of the embodiment of the present invention are known to those skilled in the art and are not repeated here in order to reduce redundancy.
In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "an example", "a specific example" or "some examples" mean that a specific feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described, those skilled in the art will understand that various changes, modifications, substitutions and variations can be made to these embodiments without departing from the principle and spirit of the present invention, and that the scope of the invention is defined by the claims and their equivalents.

Claims (10)

1. A control method of a robot, characterized by comprising the following steps:
receiving a behavior category determination instruction sent by a mobile terminal for controlling an interaction behavior of the robot;
obtaining data information of the corresponding category from a database according to the behavior category determination instruction;
receiving an interaction behavior control signal sent by the mobile terminal, wherein the interaction behavior control signal is generated by shaking the mobile terminal;
selecting, according to the interaction behavior control signal, corresponding data information from the data information of the corresponding category to generate an interaction behavior control instruction; and
sending the interaction behavior control instruction to the robot, so that the robot executes a corresponding action according to the interaction behavior control instruction.
2. The control method of the robot according to claim 1, characterized in that the behavior category determination instruction includes at least one of an audio playback instruction, a video playback instruction and a limb action instruction, wherein the limb action instruction is used to control the robot to perform a corresponding limb action.
3. The control method of the robot according to claim 1, characterized in that selecting corresponding data information from the data information of the corresponding category according to the interaction behavior control signal to generate the interaction behavior control instruction further comprises:
obtaining interaction behavior auxiliary selection information input by a user, wherein the interaction behavior auxiliary selection information includes at least one of the user's age, a time of the interaction behavior, and the user's preference type;
selecting corresponding data information from the data information of the corresponding category according to the interaction behavior control signal and the interaction behavior auxiliary selection information; and
generating the interaction behavior control instruction according to the selected data information.
4. The control method of the robot according to claim 3, characterized in that when the behavior category determination instruction includes an audio playback instruction or a video playback instruction, the interaction behavior auxiliary selection information further includes creator information.
5. The control method of the robot according to claim 3, characterized in that when the behavior category determination instruction includes an audio playback instruction, the interaction behavior auxiliary selection information further includes tempo information.
6. A control system of a robot, characterized by comprising:
a communication module for receiving a behavior category determination instruction and an interaction behavior control signal sent by a mobile terminal, so that the robot executes a corresponding action according to an interaction behavior control instruction, wherein the behavior category determination instruction is used to determine an interaction behavior category of the robot, and the interaction behavior control signal is generated by shaking the mobile terminal;
a storage module for storing data information; and
a control module for obtaining data information of the corresponding category from the storage module according to the behavior category determination instruction, the control module being further configured to select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal to generate the interaction behavior control instruction.
7. The control system of the robot according to claim 6, characterized in that the behavior category determination instruction includes at least one of an audio playback instruction, a video playback instruction and a limb action instruction, wherein the limb action instruction is used to control the robot to perform a corresponding limb action.
8. The control system of the robot according to claim 7, characterized in that the control module is further configured to obtain interaction behavior auxiliary selection information input by a user, select corresponding data information from the data information of the corresponding category according to the interaction behavior control signal and the interaction behavior auxiliary selection information, and then generate the interaction behavior control instruction according to the selected data information;
wherein the interaction behavior auxiliary selection information includes at least one of the user's age, a time of the interaction behavior, and the user's preference type.
9. The control system of the robot according to claim 8, characterized in that when the behavior category determination instruction includes an audio playback instruction or a video playback instruction, the interaction behavior auxiliary selection information further includes creator information.
10. The control system of the robot according to claim 8, characterized in that when the behavior category determination instruction includes an audio playback instruction, the interaction behavior auxiliary selection information further includes tempo information.
CN201711144855.1A 2017-11-17 2017-11-17 Control method and control system of robot Active CN109799733B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711144855.1A CN109799733B (en) 2017-11-17 2017-11-17 Control method and control system of robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711144855.1A CN109799733B (en) 2017-11-17 2017-11-17 Control method and control system of robot

Publications (2)

Publication Number Publication Date
CN109799733A true CN109799733A (en) 2019-05-24
CN109799733B CN109799733B (en) 2021-05-11

Family

ID=66555965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711144855.1A Active CN109799733B (en) 2017-11-17 2017-11-17 Control method and control system of robot

Country Status (1)

Country Link
CN (1) CN109799733B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103607616A (en) * 2013-11-08 2014-02-26 四川长虹电器股份有限公司 Television system and control method for video recommendation based on shaking method
CN203779501U (en) * 2013-11-20 2014-08-20 中山市大谷电子科技有限公司 Multifunctional service robot
CN105100931A (en) * 2015-08-29 2015-11-25 天脉聚源(北京)科技有限公司 Interactive signal processing method and interactive signal processing device
CN105303490A (en) * 2015-11-30 2016-02-03 广东小天才科技有限公司 Method and device for sending knowledge information
CN105511260A (en) * 2015-10-16 2016-04-20 深圳市天博智科技有限公司 Preschool education accompany robot, and interaction method and system therefor
CN106625670A (en) * 2016-12-26 2017-05-10 迈赫机器人自动化股份有限公司 Control system and method of multifunctional man-machine interaction humanoid teaching robot
CN106657188A (en) * 2015-11-02 2017-05-10 天脉聚源(北京)科技有限公司 WeChat shake terminal content pushing method and system
CN106792041A (en) * 2016-12-09 2017-05-31 北京小米移动软件有限公司 Content share method and device
CN106897369A (en) * 2017-01-17 2017-06-27 成都视达科信息技术有限公司 A kind of content-data recommends method and system


Also Published As

Publication number Publication date
CN109799733B (en) 2021-05-11

Similar Documents

Publication Publication Date Title
CN111026932B (en) Man-machine dialogue interaction method and device, electronic equipment and storage medium
CN101879375B (en) Mobile-phone game handle based on Bluetooth transmission and realization method thereof
US10512845B2 (en) Real-time manipulation of gameplay through synchronous signal consumption in non-interactive media
CN113018848B (en) Game picture display method, related device, equipment and storage medium
US20140358986A1 (en) Cloud Database-Based Interactive Control System, Method and Accessory Devices
CN106217384B (en) A kind of method and apparatus that control service robot is danced
CN104484378A (en) Method, device and terminal for pushing notification messages
KR102641797B1 (en) Method, device, terminal and storage medium for previewing in-game actions in a non-game environment
CN106095384B (en) A kind of effect adjusting method and user terminal
US20080014833A1 (en) Toy and Game Performance Parameters Updated by Real World Events
DE102016125126A1 (en) Electrical systems and related methods for providing intelligent electronic mobile device features to a user of a wearable device
WO2018108174A1 (en) Interface interactive assembly control method and apparatus, and wearable device
CN102413023B (en) Interactive delight system and method
CN103517108A (en) Smart television playing device, mobile terminal, method for achieving somatosensory function and system
CN102158543A (en) Internet controlled toy system
CN110292776A (en) A kind of plot generation method and device calculate equipment and storage medium
CN103312871A (en) Method using mobile phone as game handle
JP6741746B2 (en) Program, game server, information processing terminal, method and game system
CN109799733A (en) The control method and control system of robot
JP2016209093A (en) Game system, management device, game device, and program
CN104064205B (en) Intelligence promotees to sleep or wake up up music box ringing method and intelligence promotees sleep or wake up music box up
CN107025814A (en) A kind of music education plateform system for supporting many interactive application modes
CN107135260A (en) Intelligent television UI sound effect systems and implementation method
US11642276B2 (en) System, apparatus, and method for controlling devices based on accumulation of input
CN103197834B (en) A kind of mobile terminal and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240523

Address after: Room 101, No. 1168 Xingang East Road, Haizhu District, Guangzhou City, Guangdong Province, 510220, self designed units 2004 and 2005 (office only)

Patentee after: Guangzhou Yeyou Information Technology Co.,Ltd.

Country or region after: China

Address before: Room 209, No. 115 Jiufo Jianshe Road, Zhongxin Guangzhou Knowledge City, Huangpu District, Guangzhou City, Guangdong Province, 510700

Patentee before: GUANGZHOU AORUI INTELLIGENT TECHNOLOGY CO.,LTD.

Country or region before: China