CN107688431A - Man-machine interaction method based on radar positioning - Google Patents

Man-machine interaction method based on radar positioning

Info

Publication number
CN107688431A
CN107688431A (application CN201710729771.8A)
Authority
CN
China
Prior art keywords
radar
coordinate
points
point
triangle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710729771.8A
Other languages
Chinese (zh)
Other versions
CN107688431B (en)
Inventor
张佳佳
陈焰立
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Mdt Infotech Ltd Meow
Original Assignee
Shanghai Mdt Infotech Ltd Meow
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Mdt Infotech Ltd Meow filed Critical Shanghai Mdt Infotech Ltd Meow
Priority to CN201710729771.8A priority Critical patent/CN107688431B/en
Publication of CN107688431A publication Critical patent/CN107688431A/en
Application granted granted Critical
Publication of CN107688431B publication Critical patent/CN107688431B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Processing Or Creating Images (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The present invention relates to a man-machine interaction method based on radar positioning, comprising the following steps: 1) a real-time data acquisition and coordinate conversion process; 2) a radar coordinate and projection coordinate calibration process, in which several marker points are determined in advance and the radar coordinate and screen display coordinate corresponding to each marker point are recorded; 3) a valid radar point screening process; 4) a process of mapping radar point coordinates to display-area coordinates; 5) a radar point cluster analysis process, in which radar points with a high degree of aggregation are regarded as one valid region, and the mapped coordinates of the radar points within that region are then averaged to obtain one valid trigger coordinate. Compared with the prior art, the present invention can bring a brand-new interactive experience to offline large-scene entertainment games.

Description

Man-machine interaction method based on radar positioning
Technical field
The present invention relates to a man-machine interaction method, and more particularly to a man-machine interaction method based on radar positioning.
Background art
With the development of computer science and technology, the modes of man-machine interaction have become increasingly diverse, evolving from the mouse and keyboard to motion sensing, speech recognition and other forms of natural interaction. Radar-based positioning provides an interaction mode similar to a mouse: radar can accurately locate the position of a touch point and generate mouse messages, which can bring a brand-new interactive experience to offline large-scene entertainment games. How to implement such a method, however, is a problem that currently needs to be solved.
Summary of the invention
It is an object of the present invention to overcome the above-mentioned drawbacks of the prior art and to provide a man-machine interaction method based on radar positioning.
The object of the present invention can be achieved through the following technical solution:
A man-machine interaction method based on radar positioning, comprising the following steps:
1) a real-time data acquisition and coordinate conversion process;
2) a radar coordinate and projection coordinate calibration process, in which several marker points are determined in advance and the radar coordinate and the screen display coordinate corresponding to each marker point are recorded;
3) a valid radar point screening process;
4) a process of mapping radar point coordinates to display-area coordinates;
5) a radar point cluster analysis process, in which radar points with a high degree of aggregation are regarded as one valid region, and the mapped coordinates of the radar points within that region are then averaged to obtain one valid trigger coordinate.
The real-time data acquisition and coordinate conversion process is specifically as follows:
The raw position data returned by the radar device is acquired in real time; the returned radar point coordinates are expressed in polar coordinates (r, θ) and are converted into Cartesian coordinates (x, y).
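For illustration only, a minimal sketch of this polar-to-Cartesian conversion; the angle unit (degrees here) is an assumption, since the patent does not specify it:

```python
import math

def polar_to_cartesian(r, theta_deg):
    """Convert a radar return (range r, bearing theta in degrees)
    into Cartesian coordinates (x, y) in the radar's local frame."""
    theta = math.radians(theta_deg)
    return r * math.cos(theta), r * math.sin(theta)
```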
The raw position data is obtained via a network or serial communication.
Multiple radar devices are supported and processed simultaneously; the processing of each radar is carried out in its own thread.
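A minimal sketch of this one-thread-per-radar arrangement, reusing the polar_to_cartesian helper above; the radar device object and its is_open()/read_frame()/handle_point() methods are hypothetical and do not correspond to any specific radar SDK:

```python
import threading

def radar_worker(radar_device):
    # Each radar device is handled in its own thread: read raw polar
    # returns and convert them to Cartesian coordinates as they arrive.
    while radar_device.is_open():                       # hypothetical API
        for r, theta_deg in radar_device.read_frame():  # hypothetical API
            x, y = polar_to_cartesian(r, theta_deg)
            radar_device.handle_point(x, y)             # hypothetical API

def start_radar_threads(radar_devices):
    threads = [threading.Thread(target=radar_worker, args=(dev,), daemon=True)
               for dev in radar_devices]
    for t in threads:
        t.start()
    return threads
```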
The radar coordinate and projection coordinate calibration process is specifically as follows:
The whole projected picture is divided into H*V sub display pictures, with one radar corresponding to one sub display area; each sub display picture has M*N marker points, which are evenly distributed over the sub display picture;
The whole calibration process is carried out radar by radar: an obstacle is placed in turn on each marker point of the sub display picture corresponding to the radar, and the calibration program records the radar coordinate of that marker point, until the marker points corresponding to all radars have been calibrated.
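As an illustrative sketch only, the M*N marker points of one sub display picture could be laid out as the vertices of an (M-1) x (N-1) cell grid, which is consistent with the (M-1)*(N-1)*2 triangles used in the screening step below; the exact layout and the pairing data structure are assumptions, not stated in the patent:

```python
def marker_screen_coords(width, height, M, N):
    """Evenly distribute M*N marker points over a sub display picture of the
    given pixel size, placed as the vertices of an (M-1) x (N-1) cell grid."""
    return [(width * i / (M - 1), height * j / (N - 1))
            for j in range(N) for i in range(M)]

# During calibration an obstacle is placed on each marker in turn and the radar
# coordinate measured there is recorded, giving (screen Sp, radar Rp) pairs:
# calibration = list(zip(marker_screen_coords(w, h, M, N), measured_radar_points))
```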
The valid radar point screening process is specifically as follows:
The sub projected picture corresponding to a single radar is triangulated according to the number of marker points, giving (M-1)*(N-1)*2 triangles. Once the radar coordinates and the projected picture coordinates have been calibrated, the radar coordinate Rp and the screen coordinate Sp corresponding to each triangle vertex are determined. The radar points are traversed and each is tested against every triangle; a point that lies inside some triangle is a valid radar point, otherwise it is an invalid radar point.
The detailed procedure for judging whether a point P lies inside triangle ABC is: first judge whether P and C lie on the same side of line AB, then judge whether P and B lie on the same side of line AC, then judge whether P and A lie on the same side of line BC; only when all three conditions hold simultaneously is P concluded to lie inside triangle ABC.
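A minimal sketch of this same-side test; treating a point on an edge as inside is an assumption:

```python
def cross(o, a, b):
    """2-D cross product of vectors OA and OB (twice the signed area of OAB)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def same_side(p, q, a, b):
    """True if p and q lie on the same side of the line through a and b."""
    return cross(a, b, p) * cross(a, b, q) >= 0

def point_in_triangle(p, a, b, c):
    """P is inside triangle ABC only if P and C are on the same side of AB,
    P and B are on the same side of AC, and P and A are on the same side of BC."""
    return same_side(p, c, a, b) and same_side(p, b, a, c) and same_side(p, a, b, c)
```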
The process of mapping radar point coordinates to display-area coordinates is as follows:
Any point P inside triangle ABC can be expressed as P = K1*A + K2*B + K3*C with K1 + K2 + K3 = 1, where K1 = Square_BCP/Square_ABC, K2 = Square_ACP/Square_ABC and K3 = Square_ABP/Square_ABC, and Square_BCP, Square_ACP, Square_ABP and Square_ABC denote the areas of triangles BCP, ACP, ABP and ABC respectively;
From the above relation, the coordinate point to which an arbitrary point P in triangle T1 maps in triangle T2 is P' = K1*A' + K2*B' + K3*C', so the screen coordinate corresponding to each valid radar point can be solved.
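For illustration, a sketch of this area-ratio mapping from the radar-space triangle ABC to the screen-space triangle A'B'C'; the function and variable names are illustrative only:

```python
def tri_area(a, b, c):
    """Unsigned area of the triangle with vertices a, b, c."""
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])) / 2.0

def map_to_screen(p, radar_tri, screen_tri):
    """Map point p from the radar triangle (A, B, C) to the screen triangle
    (A', B', C') using the weights K1, K2, K3 defined by area ratios."""
    a, b, c = radar_tri
    a2, b2, c2 = screen_tri
    s_abc = tri_area(a, b, c)
    k1 = tri_area(b, c, p) / s_abc   # K1 = Square_BCP / Square_ABC
    k2 = tri_area(a, c, p) / s_abc   # K2 = Square_ACP / Square_ABC
    k3 = tri_area(a, b, p) / s_abc   # K3 = Square_ABP / Square_ABC
    return (k1 * a2[0] + k2 * b2[0] + k3 * c2[0],
            k1 * a2[1] + k2 * b2[1] + k3 * c2[1])
```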
The radar point cluster analysis process is specifically as follows:
51) traverse all valid radar points and count, for each point, the number of neighbouring points within radius r; points whose neighbour count is greater than n are added to a dynamic array Array, and points with fewer neighbours are discarded;
52) sort all valid radar points in Array in descending order of neighbour count, then traverse the sorted array:
521) insert the first valid radar point into a dynamic array Array1;
522) compare each subsequent valid radar point from Array with the valid radar points already present in Array1; if Array1 already contains a point whose coordinate distance from the point to be inserted is less than r1, discard the point to be inserted, otherwise insert it into Array1;
53) the valid radar points in Array1 are the final touch-point cluster members; the neighbouring points of each such point are then averaged to give the final coordinate of that touch point.
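An illustrative sketch of steps 51) to 53), operating on the mapped screen coordinates of the valid radar points; whether the representative point itself is counted in the final average, and the strictness of the comparisons, are assumptions:

```python
def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def cluster_touch_points(points, r, n, r1):
    """Turn dense valid radar points into one trigger coordinate per touch."""
    # Step 51: count neighbours within radius r, keep points with more than n.
    kept = []
    for p in points:
        neighbours = [q for q in points if q is not p and dist(p, q) <= r]
        if len(neighbours) > n:
            kept.append((p, neighbours))
    # Step 52: sort by neighbour count, descending.
    kept.sort(key=lambda item: len(item[1]), reverse=True)
    # Steps 521/522: keep one representative per cluster, at least r1 apart.
    reps = []
    for p, neighbours in kept:
        if all(dist(p, rep) >= r1 for rep, _ in reps):
            reps.append((p, neighbours))
    # Step 53: average each representative's neighbourhood into a trigger point.
    touches = []
    for p, neighbours in reps:
        cluster = [p] + neighbours
        touches.append((sum(x for x, _ in cluster) / len(cluster),
                        sum(y for _, y in cluster) / len(cluster)))
    return touches
```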
Compared with the prior art, the present invention can accurately locate the position of a touch point using radar and can generate mouse messages, bringing a brand-new interactive experience to offline large-scene entertainment games.
Brief description of the drawings
Fig. 1 is a flow chart of the present invention.
Embodiment
The technical solution in the embodiments of the present invention is described clearly and completely below in conjunction with the accompanying drawing. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the scope of protection of the present invention.
As shown in Fig. 1, the specific implementation process of the present invention is as follows:
101. Real-time data acquisition and coordinate conversion. The raw position data returned by the radar device is acquired in real time via a network or serial communication. The returned radar point coordinates are expressed in polar coordinates (r, θ) and need to be converted into Cartesian coordinates (x, y). Multiple radar devices are supported and processed simultaneously (the same applies below); the processing of each radar is carried out in its own thread, so as to ensure the real-time performance of the whole system.
102. Radar coordinate and projection coordinate calibration. Several marker points are determined in advance, and the radar coordinate and screen display coordinate corresponding to each marker point are recorded. Since the marker points are set in advance, their screen display coordinates are known, and what needs to be measured is the corresponding radar coordinates. First, the whole projected picture is divided into H*V sub display pictures, with one radar corresponding to one sub display area; each sub display picture has M*N marker points evenly distributed over it. The calibration is carried out radar by radar: an obstacle is placed in turn on each marker point of the sub display picture corresponding to the radar, and the calibration program records the radar coordinate of that marker point, until the marker points corresponding to all radars have been calibrated.
103. Valid radar point screening. The coordinate points returned by a radar are valid only within its corresponding sub projection area, and invalid points need to be filtered out so as to reduce the computational complexity. The sub projected picture corresponding to a single radar is triangulated according to the number of marker points, giving (M-1)*(N-1)*2 triangles. Once the radar coordinates and the projected picture coordinates have been calibrated, the radar coordinate Rp and the screen coordinate Sp corresponding to each triangle vertex are determined. The radar points are traversed and each is tested against every triangle; a point that lies inside some triangle is a valid radar point, otherwise it is an invalid radar point. To judge whether a point P lies inside triangle ABC, first judge whether P and C lie on the same side of line AB, then whether P and B lie on the same side of line AC, and then whether P and A lie on the same side of line BC; only when all three conditions hold simultaneously can P be concluded to lie inside triangle ABC.
104. Mapping radar point coordinates to display-area coordinates. Step 103 determines which triangle a radar point lies in, and the radar coordinates and screen coordinates corresponding to that triangle's three vertices are known from calibration. The problem therefore reduces to: given the coordinates of a point in one triangle, solve for its coordinates in another triangle. Any point P inside triangle ABC can be expressed as P = K1*A + K2*B + K3*C with K1 + K2 + K3 = 1, where K1 = Square_BCP/Square_ABC, K2 = Square_ACP/Square_ABC and K3 = Square_ABP/Square_ABC. From this relation, the coordinate point to which an arbitrary point P in triangle T1 maps in triangle T2 is P' = K1*A' + K2*B' + K3*C'. By this method, the screen coordinate corresponding to each valid radar point can be solved.
105. Radar point cluster analysis. Because the radar points are relatively dense, a moving object entering the sensing area may trigger multiple radar points, and nearby moving objects will also produce multiple radar points, so effectively distinguishing different moving objects is crucial. Analysing the moving objects according to the distribution of the radar points is exactly the cluster analysis: radar points with a high degree of aggregation are regarded as one valid region, and the mapped coordinates of the radar points within that region are then averaged to obtain one valid trigger coordinate. The cluster analysis proceeds as follows: first, all valid radar points are traversed, and for each point the number of neighbouring points within radius r is counted; points with more than n neighbours are added to a dynamic array Array, and points with fewer neighbours are discarded. All valid radar points in Array are then sorted in descending order of neighbour count, and the sorted array is traversed: the first valid radar point is inserted into a dynamic array Array1, and each subsequent point from Array is compared with the points already in Array1; if Array1 already contains a point whose coordinate distance from it is less than r1, the point is discarded, otherwise it is inserted into Array1. After the traversal, the valid radar points in Array1 are the final touch-point cluster members, and the neighbouring points of each such point are averaged to give the final coordinate of that touch point.
The foregoing is only a specific embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any person familiar with the art can readily conceive of various equivalent modifications or substitutions within the technical scope disclosed by the present invention, and such modifications or substitutions shall all fall within the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be defined by the scope of protection of the claims.

Claims (9)

1. A man-machine interaction method based on radar positioning, characterized in that it comprises the following steps:
1) a real-time data acquisition and coordinate conversion process;
2) a radar coordinate and projection coordinate calibration process, in which several marker points are determined in advance and the radar coordinate and the screen display coordinate corresponding to each marker point are recorded;
3) a valid radar point screening process;
4) a process of mapping radar point coordinates to display-area coordinates;
5) a radar point cluster analysis process, in which radar points with a high degree of aggregation are regarded as one valid region, and the mapped coordinates of the radar points within that region are then averaged to obtain one valid trigger coordinate.
2. The method according to claim 1, characterized in that the real-time data acquisition and coordinate conversion process is specifically as follows:
The raw position data returned by the radar device is acquired in real time; the returned radar point coordinates are expressed in polar coordinates (r, θ), and the polar coordinates are converted into Cartesian coordinates (x, y).
3. The method according to claim 2, characterized in that the raw position data is obtained via a network or serial communication.
4. The method according to claim 2, characterized in that multiple radar devices are supported and processed simultaneously, and the processing of each radar is carried out in its own thread.
5. The method according to claim 1, characterized in that the radar coordinate and projection coordinate calibration process is specifically as follows:
The whole projected picture is divided into H*V sub display pictures, with one radar corresponding to one sub display area; each sub display picture has M*N marker points, which are evenly distributed over the sub display picture;
The whole calibration process is carried out radar by radar: an obstacle is placed in turn on each marker point of the sub display picture corresponding to the radar, and the calibration program records the radar coordinate of that marker point, until the marker points corresponding to all radars have been calibrated.
6. The method according to claim 1, characterized in that the valid radar point screening process is specifically as follows:
The sub projected picture corresponding to a single radar is triangulated according to the number of marker points, giving (M-1)*(N-1)*2 triangles; once the radar coordinates and the projected picture coordinates have been calibrated, the radar coordinate Rp and the screen coordinate Sp corresponding to each triangle vertex are determined; the radar points are traversed and each is tested against every triangle, a point that lies inside some triangle being a valid radar point and any other point an invalid radar point.
7. The method according to claim 6, characterized in that the specific procedure for judging whether a point P lies inside triangle ABC is: first judge whether P and C lie on the same side of line AB, then judge whether P and B lie on the same side of line AC, then judge whether P and A lie on the same side of line BC; only when all three conditions hold simultaneously is P concluded to lie inside triangle ABC.
8. The method according to claim 7, characterized in that the process of mapping radar point coordinates to display-area coordinates is specifically as follows:
Any point P inside triangle ABC is expressed as P = K1*A + K2*B + K3*C with K1 + K2 + K3 = 1, where K1 = Square_BCP/Square_ABC, K2 = Square_ACP/Square_ABC and K3 = Square_ABP/Square_ABC, and Square_BCP, Square_ACP, Square_ABP and Square_ABC denote the areas of triangles BCP, ACP, ABP and ABC respectively;
From the above relation, the coordinate point to which an arbitrary point P in triangle T1 maps in triangle T2 is found to be P' = K1*A' + K2*B' + K3*C', so the screen coordinate corresponding to each valid radar point can be solved.
9. The method according to claim 7, characterized in that the radar point cluster analysis process is specifically as follows:
51) traverse all valid radar points and count, for each point, the number of neighbouring points within radius r; points whose neighbour count is greater than n are added to a dynamic array Array, and points with fewer neighbours are discarded;
52) sort all valid radar points in Array in descending order of neighbour count, then traverse the sorted array:
521) insert the first valid radar point into a dynamic array Array1;
522) compare each subsequent valid radar point from Array with the valid radar points already present in Array1; if Array1 already contains a point whose coordinate distance from the point to be inserted is less than r1, discard the point to be inserted, otherwise insert it into Array1;
53) the valid radar points in Array1 are the final touch-point cluster members; the neighbouring points of each such point are then averaged to give the final coordinate of that touch point.
CN201710729771.8A 2017-08-23 2017-08-23 Man-machine interaction method based on radar positioning Active CN107688431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710729771.8A CN107688431B (en) 2017-08-23 2017-08-23 Man-machine interaction method based on radar positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710729771.8A CN107688431B (en) 2017-08-23 2017-08-23 Man-machine interaction method based on radar positioning

Publications (2)

Publication Number Publication Date
CN107688431A true CN107688431A (en) 2018-02-13
CN107688431B CN107688431B (en) 2021-04-30

Family

ID=61153625

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710729771.8A Active CN107688431B (en) 2017-08-23 2017-08-23 Man-machine interaction method based on radar positioning

Country Status (1)

Country Link
CN (1) CN107688431B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6400313B1 (en) * 2000-01-12 2002-06-04 Honeywell International Inc. Projection of multi-sensor ray based data histories onto planar grids
JP2007257438A (en) * 2006-03-24 2007-10-04 Casio Comput Co Ltd Pointing device, external information processor, instruction position specifying device and instruction position specifying method
CN104423721A (en) * 2013-09-02 2015-03-18 苗注雨 Frameless multipoint touch man-machine interaction method and system based on radar eye
CN106537173A (en) * 2014-08-07 2017-03-22 谷歌公司 Radar-based gesture recognition
CN104345996A (en) * 2014-10-28 2015-02-11 武汉麦塔威科技有限公司 Radar eye identification interactive device
CN105302296A (en) * 2015-09-09 2016-02-03 浙江工业大学 Man-machine interaction ground system based on laser radar

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110687508A (en) * 2019-10-12 2020-01-14 内蒙古工业大学 Method for correcting monitoring data of micro-varying radar
CN110953991A (en) * 2019-12-19 2020-04-03 陕西长岭电子科技有限责任公司 Display method for monitoring hanging swing of helicopter
CN112774181A (en) * 2021-01-11 2021-05-11 浙江星汉云图人工智能科技有限公司 Radar data processing method, processing system and computer storage medium
CN112774181B (en) * 2021-01-11 2023-11-10 北京星汉云图文化科技有限公司 Radar data processing method, radar data processing system and computer storage medium

Also Published As

Publication number Publication date
CN107688431B (en) 2021-04-30

Similar Documents

Publication Publication Date Title
CN103226387B (en) Video fingertip localization method based on Kinect
CN107247834A (en) A kind of three dimensional environmental model reconstructing method, equipment and system based on image recognition
CN103606192A (en) Wind field visual display method based on three-dimensional virtual globe
CN107688431A (en) Man-machine interaction method based on radar fix
CN101610425B (en) Method for evaluating stereo image quality and device
CN107004271A (en) Display methods, device, electronic equipment, computer program product and non-transient computer readable storage medium storing program for executing
CN104134389B (en) Interactive sand table demonstration system and method
CN105096311A (en) Technology for restoring depth image and combining virtual and real scenes based on GPU (Graphic Processing Unit)
CN111870952B (en) Altitude map generation method, device, equipment and storage medium
CN109448078A (en) A kind of image editing system, method and apparatus
CN110633843B (en) Park inspection method, device, equipment and storage medium
CN107704799A (en) A kind of human motion recognition method and equipment, computer-readable recording medium
CN107481260A (en) A kind of region crowd is detained detection method, device and storage medium
CN106611441A (en) Processing method and device for three-dimensional map
CN112435337A (en) Landscape visual field analysis method and system
CN103780873A (en) Method for realizing 3D scene display by transformer station video monitoring system
CN110378336A (en) Semantic class mask method, device and the storage medium of target object in training sample
CN103778659B (en) Processing method and apparatus of stereoscopic thermodynamic chart
CN109308394A (en) A kind of pre-buried map generalization method, apparatus of three-dimensional water power, equipment and storage medium
CN114520978B (en) Method and system for automatically arranging base stations in network planning simulation
CN103309980B (en) Performance data processing method, Apparatus and system
CN109389119A (en) Point of interest area determination method, device, equipment and medium
CN110300413A (en) A kind of communication site's simulant design method and system based on density algorithm
CN109325991A (en) A kind of pre-buried map generalization method, apparatus of three-dimensional water power, equipment and storage medium
CN105157681B (en) Indoor orientation method, device and video camera and server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant