CN109782962A - A kind of projection interactive method, device, system and terminal device - Google Patents
- Publication number: CN109782962A (application CN201811514475.7A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classification: User Interface Of Digital Computer (AREA)
Abstract
The application is applicable to the technical field of data processing and provides a projection interaction method, device, system and terminal device. The method includes: obtaining a projected image collected from a projection surface by an image acquisition device, and processing the projected image to obtain the movement track of a light spot, where the light spot is produced by light emitted from a control device striking the projection surface; obtaining an operation track according to the light-spot track and the coordinate transformation relation between the projected image and a projection operation interface, and generating an operation instruction according to the operation track; and operating the projection operation interface according to the operation instruction. The application can solve the problems that existing interactive projection systems have a high projection cost and are difficult to carry.
Description
Technical field
The application belongs to the technical field of data processing, and in particular relates to a projection interaction method, device, system and terminal device.
Background art
With the development of science and technology, interactive projection technology has gradually entered people's lives. Most current interactive projection technology uses rear projection: the projection screen is connected to a computer, and recognition devices such as pressure sensors or grating structures are provided in the projection screen. When a stylus or a finger touches the projection screen, the screen detects the position of the stylus or finger through the recognition device, and the input of the stylus or finger then replaces the mouse to operate the projection operation interface.
However, when interactive projection is implemented with recognition devices such as pressure sensors or grating structures, these devices must be installed in the projection screen, which raises the projection cost. In addition, when the projector is moved, the special projection screen must be moved with it, otherwise interactive projection cannot be achieved, so the resulting interactive projection system is difficult to carry.
In summary, existing interactive projection systems have a high projection cost and are difficult to carry.
Summary of the invention
In view of this, the embodiments of the present application provide a projection interaction method, device, system and terminal device, to solve the problems that existing interactive projection systems have a high projection cost and are difficult to carry.
A first aspect of the embodiments of the present application provides a projection interaction method, comprising:
obtaining a projected image collected from a projection surface by an image acquisition device, and processing the projected image to obtain the movement track of a light spot, where the light spot is produced by light emitted from a control device striking the projection surface;
obtaining an operation track according to the light-spot track and the coordinate transformation relation between the projected image and a projection operation interface, and generating an operation instruction according to the operation track; and
operating the projection operation interface according to the operation instruction.
A second aspect of the embodiments of the present application provides a projection interaction device, comprising:
a spot track module, configured to obtain a projected image collected from a projection surface by an image acquisition device, and to process the projected image to obtain the movement track of a light spot, where the light spot is produced by light emitted from a control device striking the projection surface;
an operation instruction module, configured to obtain an operation track according to the light-spot track and the coordinate transformation relation between the projected image and a projection operation interface, and to generate an operation instruction according to the operation track; and
an instruction control module, configured to operate the projection operation interface according to the operation instruction.
A third aspect of the embodiments of the present application provides a projection interaction system, comprising: a projector, a control device, an image acquisition device, and any one of the projection interaction devices described above;
the projector and the image acquisition device are each communicatively connected with the projection interaction device; and
the control device is configured to emit light onto the projection surface.
A fourth aspect of the embodiments of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and runnable on the processor, where the processor implements the steps of the above method when executing the computer program.
A fifth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program, where the computer program implements the steps of the above method when executed by a processor.
Compared with the prior art, the embodiments of the present application have the following beneficial effects:
In the projection interaction method of the application, the projected image collected by the image acquisition device is processed to obtain the movement track of the light spot; the light-spot track is transformed into an operation track according to the coordinate transformation relation between the projected image and the projection operation interface; an operation instruction is generated according to the operation track; and the projection operation interface is operated according to the operation instruction. No special projection screen is required during projection interaction, which reduces the projection cost; nor does a special projection screen need to be moved along when the system is moved, which improves convenience. This solves the problems that existing interactive projection systems have a high projection cost and are difficult to carry.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the accompanying drawings needed in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flowchart of a projection interaction method provided by an embodiment of the present application;
Fig. 2 is a schematic diagram of a projection interaction device provided by an embodiment of the present application;
Fig. 3 is a schematic structural diagram of a projection interaction system provided by an embodiment of the present application;
Fig. 4 is a schematic diagram of a terminal device provided by an embodiment of the present application;
Fig. 5 is a schematic diagram of dividing the calibration region in a nine-grid layout, provided by an embodiment of the present application.
Specific embodiment
In the following description, specific details such as particular system structures and techniques are set forth for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of the present application. However, it will be clear to those skilled in the art that the application may also be implemented in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, devices, circuits and methods are omitted, so that the description of the application is not obscured by unnecessary detail.
In order to illustrate the technical solutions described herein, specific embodiments are described below.
It should be understood that, when used in this specification and the appended claims, the term "comprising" indicates the presence of the described features, wholes, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, wholes, steps, operations, elements, components and/or sets thereof.
It should also be understood that the terms used in this specification are merely for the purpose of describing specific embodiments and are not intended to limit the application. As used in this specification and the appended claims, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" used in this specification and the appended claims refers to any combination and all possible combinations of one or more of the associated listed items, and includes these combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "once", "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if [the described condition or event] is detected" may be interpreted, depending on the context, to mean "once it is determined", "in response to determining", "once [the described condition or event] is detected" or "in response to detecting [the described condition or event]".
In specific implementations, the mobile terminal described in the embodiments of the present application includes, but is not limited to, portable devices such as mobile phones, laptop computers or tablet computers having a touch-sensitive surface (for example, a touch-screen display and/or a touchpad). It should also be understood that, in some embodiments, the device is not a portable communication device but a desktop computer having a touch-sensitive surface (for example, a touch-screen display and/or a touchpad).
In the following discussion, a mobile terminal including a display and a touch-sensitive surface is described. It should be understood, however, that the mobile terminal may include one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick.
The mobile terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word-processing application, a website-creation application, a disc-burning application, a spreadsheet application, a game application, a telephone application, a video-conference application, an e-mail application, an instant-messaging application, an exercise-support application, a photo-management application, a digital-camera application, a web-browsing application, a digital-music-player application and/or a video-player application.
The various applications that can be executed on the mobile terminal may use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface, and the corresponding information displayed on the terminal, can be adjusted and/or changed between applications and/or within a corresponding application. In this way, the common physical architecture of the terminal (for example, the touch-sensitive surface) can support various applications with user interfaces that are intuitive and transparent to the user.
In addition, in the description of the present application, the terms "first", "second" and the like are used only to distinguish the description and should not be understood as indicating or implying relative importance.
Embodiment one:
A projection interaction method provided by embodiment one of the present application is described below with reference to Fig. 1. The projection interaction method in embodiment one of the application includes:
Step S101: obtain a projected image collected from a projection surface by an image acquisition device, and process the projected image to obtain the movement track of a light spot, where the light spot is produced by light emitted from a control device striking the projection surface.
In the projection interaction method of this embodiment, the projector projects the image of the projection operation interface onto the projection surface. When the user performs projection interaction, the projected image on the projection surface can be operated through the control device.
The control device can emit light; when the user operates the control device, the emitted light strikes the projection surface and forms a light spot.
The image acquisition device can collect the projected image displayed on the projection surface. After the projected image collected by the image acquisition device on the projection surface is obtained, the movement track of the light spot can be obtained by processing the projected image.
The control device may be a light-emitting stylus. The shape of the stylus can be any shape convenient for the user to grip. The light-emitting tip of the stylus has a light-generating device; when the tip of the stylus contacts the projection surface, the light-generating device is triggered, emitting light that strikes the projection surface and forms a light spot.
Specifically, the light-emitting stylus may be an infrared stylus: the light emitted by the stylus is infrared light, and the infrared light strikes the projection surface and produces an infrared light spot.
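The spot-extraction step described above can be sketched as follows (a minimal illustration, assuming each captured frame is available as a 2-D grayscale array of brightness values; the threshold value and function names are illustrative assumptions, not part of the application):

```python
def spot_centroid(frame, threshold=200):
    """Return the (x, y) centroid of pixels brighter than `threshold`,
    or None when no spot is visible in the frame."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs += x
                ys += y
                n += 1
    if n == 0:
        return None
    return (xs / n, ys / n)


def spot_track(frames, threshold=200):
    """Collect the spot centroids of consecutive frames into a track,
    skipping frames in which no spot is detected."""
    track = []
    for frame in frames:
        c = spot_centroid(frame, threshold)
        if c is not None:
            track.append(c)
    return track
```

An infrared-pass filter in front of the camera (as in the system embodiment below) would make such a bright-pixel threshold far more reliable, since the spot is then the dominant bright region in the frame.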
Step S102: obtain an operation track according to the light-spot track and the coordinate transformation relation between the projected image and the projection operation interface, and generate an operation instruction according to the operation track.
Since there is a coordinate transformation relation between the projected image and the projection operation interface, the light-spot track in the projected image can be mapped onto the projection operation interface according to this relation to obtain an operation track, and an operation instruction is generated according to the operation track. For example, if the light-spot track is a line segment from bottom to top, the operation track mapped onto the projection operation interface is also a line segment from bottom to top, and an instruction to slide the page upward can be generated from this operation track. If the light-spot track stays in the same position, the operation track also stays in the same position, and an instruction for a click operation is generated; the coordinates of the operation track, obtained through the coordinate transformation relation between the projected image and the projection operation interface, determine the click position of the click-operation instruction.
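The mapping and instruction-generation step can be sketched as follows (using the simplified transform X_L = A·X_T + B·Y_T + C, Y_L = D·X_T + E·Y_T + F from the calibration description; the gesture thresholds and function names are illustrative assumptions, not part of the application):

```python
def map_track(track, params):
    """Map a spot track from camera-image coordinates to the projection
    operation interface with the simplified affine transform
    x' = A*x + B*y + C, y' = D*x + E*y + F."""
    a, b, c, d, e, f = params
    return [(a * x + b * y + c, d * x + e * y + f) for x, y in track]


def classify(track, move_eps=10.0):
    """Tiny gesture classifier: a track that stays within `move_eps`
    of its start is a click at the end point; otherwise the vertical
    swipe direction is reported (y grows downward, screen convention)."""
    x0, y0 = track[0]
    x1, y1 = track[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < move_eps and abs(dy) < move_eps:
        return ("click", (x1, y1))
    return ("swipe_up", None) if dy < 0 else ("swipe_down", None)
```

A real implementation would also debounce the spot appearing and disappearing, but the two functions above capture the map-then-classify structure of step S102.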
Step S103: operate the projection operation interface according to the operation instruction.
The projection operation interface can be operated according to the operation instruction. For example, an instruction to slide the page upward can make the page content of the projection operation interface slide upward, and a click-operation instruction can execute a click at the specified click position on the projection operation interface.
Further, the coordinate transformation relation between the projected image and the projection operation interface is obtained by the following method:
A1: project a first image of the projection operation interface onto the projection surface, where three or more calibration points are set in the calibration region of the first image.
When the coordinate transformation relation between the projected image and the projection operation interface needs to be calibrated, the first image can be projected onto the projection surface. The first image is an image used for calibrating this coordinate transformation relation, and three or more calibration points are set in its calibration region.
A2: obtain a second image collected from the projection surface by the image acquisition device, and process the second image to obtain the coordinates of each calibration point on the second image, where the second image is the projected image of the first image.
The second image is the projected image of the first image. The second image collected on the projection surface by the image acquisition device is obtained, and the second image is processed to obtain the coordinates of each calibration point on the second image.
A3: solve the two-dimensional geometric transformation equation of the first image and the second image according to the coordinates of the calibration points in the first image and the coordinates of the calibration points on the second image, to obtain the coordinate transformation relation.
Two-dimensional geometric transformation includes rotation, translation and scaling. By solving the two-dimensional geometric transformation equation of the first image and the second image, the coordinate transformation relation between the projected image and the projection operation interface can be obtained; through this relation, the content in the projected image can be mapped onto the projection operation interface.
The two-dimensional geometric transformation equation of the first image and the second image can be expressed in the following form:
X_L = X_T·(S_X·cosθ) + Y_T·(−S_Y·sinθ) + (T_X·cosθ − T_Y·sinθ)
Y_L = X_T·(S_X·sinθ) + Y_T·(S_Y·cosθ) + (T_X·sinθ + T_Y·cosθ)
where (X_L, Y_L) is the coordinate in the first image to which the coordinate (X_T, Y_T) on the second image is mapped, S_X and S_Y are the scaling parameters, T_X and T_Y are the translation parameters, and θ is the rotation parameter.
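For reference, the two component equations above correspond to a rotation applied after scaling and translation; written in matrix form with the same symbols (a consistency check, not part of the original text):

```latex
\begin{pmatrix} X_L \\ Y_L \end{pmatrix}
= R(\theta)\left[ S \begin{pmatrix} X_T \\ Y_T \end{pmatrix}
  + \begin{pmatrix} T_X \\ T_Y \end{pmatrix} \right],
\qquad
R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix},
\qquad
S = \begin{pmatrix} S_X & 0 \\ 0 & S_Y \end{pmatrix}
```

Expanding the right-hand side reproduces, term by term, the two equations given above.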
The above transformation equation contains the five parameters S_X, S_Y, T_X, T_Y and θ, so five or more equations need to be established to solve them. The coordinates of each calibration point in the first image, together with its coordinates on the second image, establish two equations. Therefore, three or more calibration points are needed to solve the two-dimensional geometric transformation equation and obtain the coordinate transformation relation between the projected image and the projection operation interface.
Since the above equation involves the calculation of trigonometric functions, the computation is relatively complex. The above equation can therefore be simplified by a change of terms and transformed into the following form:
X_L = A·X_T + B·Y_T + C
Y_L = D·X_T + E·Y_T + F
where A, B, C, D, E and F are the parameters to be solved. Although the number of parameters to be solved increases from five to six, trigonometric functions no longer need to be calculated, which is more suitable for computation on a processor; and the number of calibration points needed to solve the six parameters is the same as the number needed to solve the five parameters.
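The solution of the six parameters from three calibration-point pairs can be sketched as follows (a minimal plain-Python illustration using Cramer's rule on the two 3×3 linear systems; the function names are illustrative assumptions, not part of the application):

```python
def _solve3(pts, targets):
    """Solve [x y 1] . (p, q, r) = t for three (x, y) points and their
    targets t, using Cramer's rule on the 3x3 system."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    t1, t2, t3 = targets

    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    base = [[x1, y1, 1.0], [x2, y2, 1.0], [x3, y3, 1.0]]
    d = det(base)  # non-zero iff the three points are not collinear
    solution = []
    for col in range(3):
        m = [row[:] for row in base]
        for i, t in enumerate((t1, t2, t3)):
            m[i][col] = t
        solution.append(det(m) / d)
    return solution


def calibrate(camera_pts, interface_pts):
    """Return (A, B, C, D, E, F) such that X = A*x + B*y + C and
    Y = D*x + E*y + F map the three camera-image calibration points
    onto the corresponding interface points."""
    a, b, c = _solve3(camera_pts, [p[0] for p in interface_pts])
    d, e, f = _solve3(camera_pts, [p[1] for p in interface_pts])
    return (a, b, c, d, e, f)
```

Note that the three calibration points must not be collinear, otherwise the 3×3 system is singular; placing them at corners of the calibration region, as in the nine-grid scheme below, avoids this.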
Further, before projecting the first image onto the projection surface, the method further includes:
B1: divide the first image into a preset number of calibration regions, and calculate the coordinate transformation relation of each calibration region separately.
In practical applications, since the camera of the image acquisition device and the projection surface exist in three-dimensional space, the imaging of the camera has radial distortion due to the characteristics of the optical lens; and because the image sensor is not perfectly parallel to the lens after assembly, the imaging also has tangential distortion.
If radial distortion and tangential distortion need to be eliminated, the intrinsic and extrinsic parameters of the camera must be obtained through complicated matrix operations, which takes a long time; and when the coordinates in the projected image are subsequently used, each coordinate must be corrected according to the intrinsic and extrinsic parameters, which seriously reduces the processing speed and makes the projection interaction less smooth.
Therefore, if the influence of radial and tangential distortion needs to be reduced, the first image can be divided into multiple calibration regions, each containing three or more calibration points, and the coordinate transformation relation can be solved separately for each region. High-precision calibration is thus achieved without solving the intrinsic and extrinsic parameters of the camera.
As the number of calibration regions increases, the precision of the calibration also increases, but the calculation becomes more time-consuming. The specific value of the preset number can therefore be determined according to the actual situation. For example, the preset number can be set to 9: as shown in Fig. 5, the calibration region is divided in the form of a nine-square grid, and the first image is divided into 9 calibration regions displayed in the form of a black-and-white chessboard. Each calibration region can use 3 corner points of the region as calibration points to calculate the coordinate transformation relation of that region. Subsequently, when the light-spot track is mapped to an operation track, the coordinate transformation relation of the calibration region where the light-spot track is located is selected for the calculation.
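The per-region selection described above can be sketched as follows (assuming one (A, B, C, D, E, F) parameter tuple has already been solved for each grid cell; the 3×3 layout and function names are illustrative assumptions, not part of the application):

```python
def region_index(x, y, width, height, rows=3, cols=3):
    """Return the index (0 .. rows*cols - 1) of the grid cell that
    contains camera-image point (x, y); points on the right/bottom
    edge are clamped into the last row/column."""
    col = min(int(x * cols / width), cols - 1)
    row = min(int(y * rows / height), rows - 1)
    return row * cols + col


def map_point(x, y, width, height, region_params):
    """Map one camera point using the transform of its own grid cell.
    `region_params` holds one (A, B, C, D, E, F) tuple per cell,
    listed row by row."""
    a, b, c, d, e, f = region_params[region_index(x, y, width, height)]
    return (a * x + b * y + c, d * x + e * y + f)
```

Each spot coordinate is thus corrected only with a local affine transform, which is how the scheme approximates distortion compensation without the camera's intrinsic and extrinsic parameters.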
In the projection interaction method provided by embodiment one, the projected image collected by the image acquisition device is processed to obtain the movement track of the light spot; the light-spot track is transformed into an operation track according to the coordinate transformation relation between the projected image and the projection operation interface; an operation instruction is generated according to the operation track; and the projection operation interface is operated according to the operation instruction. During projection interaction, the projector can project onto any relatively flat surface, just like a projector without interactive functions, and no special projection screen is needed, which reduces the projection cost; nor does a special projection screen need to be moved along when the system is moved, which improves convenience. This solves the problems that existing interactive projection systems have a high projection cost and are difficult to carry.
The coordinate transformation relation between the projected image and the projection operation interface can be obtained by solving the two-dimensional geometric transformation equation with the coordinates of three or more calibration points in the first image and their coordinates in the second image.
In order to further improve the accuracy of the calibration, the first image can be divided into multiple calibration regions, and the coordinate transformation relation between the projected image and the projection operation interface can be calculated separately for each region, reducing the influence of the radial and tangential distortion of the camera.
It should be understood that the size of the serial numbers of the steps in the above embodiment does not imply an order of execution; the execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Embodiment two:
Embodiment two of the present application provides a projection interaction device. For ease of description, only the parts relevant to the application are shown. As shown in Fig. 2, the projection interaction device includes:
a spot track module 201, configured to obtain a projected image collected from a projection surface by an image acquisition device, and to process the projected image to obtain the movement track of a light spot, where the light spot is produced by light emitted from a control device striking the projection surface;
an operation instruction module 202, configured to obtain an operation track according to the light-spot track and the coordinate transformation relation between the projected image and the projection operation interface, and to generate an operation instruction according to the operation track; and
an instruction control module 203, configured to operate the projection operation interface according to the operation instruction.
Further, the device also includes:
a calibration projection module, configured to project a first image of the projection operation interface onto the projection surface, where three or more calibration points are set in the calibration region of the first image;
a coordinate obtaining module, configured to obtain a second image collected from the projection surface by the image acquisition device, and to process the second image to obtain the coordinates of each calibration point on the second image, where the second image is the projected image of the first image; and
a coordinate transformation module, configured to solve the two-dimensional geometric transformation equation of the first image and the second image according to the coordinates of the calibration points in the first image and their coordinates on the second image, to obtain the coordinate transformation relation.
Further, the device also includes:
a region division module, configured to divide the first image into a preset number of calibration regions and to calculate the coordinate transformation relation of each calibration region separately.
It should be noted that, since the information exchange and execution processes between the above devices/units are based on the same concept as the method embodiments of the present application, their specific functions and the technical effects they bring can be found in the method embodiment section, and are not repeated here.
Embodiment three:
Embodiment three of the present application provides a projection interaction system. For ease of description, only the parts relevant to the application are shown. As shown in Fig. 3, the projection interaction system includes: a projector 1, a control device 2, an image acquisition device 3, and any one of the projection interaction devices 4 described above;
the projector 1 and the image acquisition device 3 are each communicatively connected with the projection interaction device 4; and
the control device 2 is configured to emit light onto the projection surface.
The field of view of the camera in the image acquisition device 3 should be greater than the projection angle of the projector 1, so that the camera can capture the complete projected image. The camera can be a high-definition camera, its acquisition frame rate can be greater than 24 frames per second, and its exposure can be adjusted according to actual needs.
Further, the control device 2 is specifically an infrared stylus.
Further, the outer side of the camera lens of the image acquisition device 3 is covered with an optical filter 5.
When the control device 2 is an infrared stylus, the outer side of the camera lens of the image acquisition device 3 can be covered with an optical filter 5. The optical filter 5 can use a band-pass material to filter out interference light in the environment. For example, the infrared light emitted by the infrared stylus can have a wavelength of 940 nm, and the optical filter 5 can use a band-pass material of 940 ± 20 nm, filtering out interference light of other wavelengths and improving the anti-interference ability of the projection interaction system.
Embodiment four:
Fig. 4 is a schematic diagram of the terminal device provided by embodiment four of the present application. As shown in Fig. 4, the terminal device 4 of this embodiment includes: a processor 40, a memory 41, and a computer program 42 stored in the memory 41 and runnable on the processor 40. When executing the computer program 42, the processor 40 implements the steps in the above projection interaction method embodiment, such as steps S101 to S103 shown in Fig. 1. Alternatively, when executing the computer program 42, the processor 40 implements the functions of the modules/units in the above device embodiments, such as the functions of modules 201 to 203 shown in Fig. 2.
Illustratively, the computer program 42 can be divided into one or more modules/units, which are stored in the memory 41 and executed by the processor 40 to complete the application. The one or more modules/units can be a series of computer program instruction segments capable of completing specific functions, and the instruction segments are used to describe the execution process of the computer program 42 in the terminal device 4. For example, the computer program 42 can be divided into a spot track module, an operation instruction module and an instruction control module, with the specific functions of each module as follows:
a spot track module, configured to obtain a projected image collected from a projection surface by an image acquisition device, and to process the projected image to obtain the movement track of a light spot, where the light spot is produced by light emitted from a control device striking the projection surface;
an operation instruction module, configured to obtain an operation track according to the light-spot track and the coordinate transformation relation between the projected image and the projection operation interface, and to generate an operation instruction according to the operation track; and
an instruction control module, configured to operate the projection operation interface according to the operation instruction.
The terminal device 4 can be a computing device such as a desktop computer, a notebook, a palmtop computer or a cloud server. The terminal device may include, but is not limited to, the processor 40 and the memory 41. Those skilled in the art will understand that Fig. 4 is only an example of the terminal device 4 and does not constitute a limitation on it; the terminal device may include more or fewer components than shown, combine certain components, or use different components. For example, the terminal device may also include input and output devices, network access devices, buses, and the like.
The processor 40 can be a central processing unit (Central Processing Unit, CPU), or another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor can be a microprocessor, or the processor can be any conventional processor.
The memory 41 can be an internal storage unit of the terminal device 4, such as a hard disk or memory of the terminal device 4. The memory 41 can also be an external storage device of the terminal device 4, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card or a flash card (Flash Card) equipped on the terminal device 4. Further, the memory 41 can include both an internal storage unit and an external storage device of the terminal device 4. The memory 41 is used to store the computer program and other programs and data required by the terminal device. The memory 41 can also be used to temporarily store data that has been output or will be output.
It is apparent to those skilled in the art that, for convenience and brevity of description, the division into the above functional units and modules is given only as an example. In practical applications, the above functions may be assigned to different functional units and modules as required; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for convenience of distinguishing them from each other and are not intended to limit the protection scope of this application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which will not be repeated here.
In the above embodiments, the description of each embodiment has its own emphasis. For parts that are not described or recorded in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the specific application and the design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of this application.
In the embodiments provided in this application, it should be understood that the disclosed device/terminal device and method may be implemented in other ways. For example, the device/terminal device embodiments described above are only schematic: the division of modules or units is only a division by logical function, and in actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments of this application may also be completed by instructing the relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the steps of each of the above method embodiments can be realized. The computer program includes computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electric carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in the jurisdiction; for example, in certain jurisdictions, according to legislation and patent practice, the computer-readable medium does not include electric carrier signals and telecommunication signals.
The above embodiments are intended only to illustrate the technical solutions of this application, not to limit them. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application, and should all be included within the protection scope of this application.
Claims (10)
1. A projection interaction method, comprising:
obtaining a projected image collected by an image acquisition device on a projection surface, and processing the projected image to obtain a variation track of a light spot, wherein the light spot is a spot produced on the projection surface by light emitted by a control device;
obtaining an operation track according to the spot variation track and a coordinate transformation relationship between the projected image and a projection operation interface, and generating an operation instruction according to the operation track;
operating the projection operation interface according to the operation instruction.
2. The projection interaction method according to claim 1, wherein the coordinate transformation relationship between the projected image and the projection operation interface is obtained by the following method:
projecting a first image containing the projection operation interface onto the projection surface, wherein three or more calibration points are set in a calibration region of the first image;
obtaining a second image collected by the image acquisition device on the projection surface, and processing the second image to obtain the coordinates of each calibration point on the second image, the second image being the projected image of the first image;
solving a two-dimensional geometric transformation equation between the first image and the second image according to the coordinates of the calibration points in the first image and the coordinates of the calibration points on the second image, to obtain the coordinate transformation relationship.
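The claim leaves the form of the two-dimensional geometric transformation open. As one hedged sketch (an assumption, not the application's stated method): three non-collinear calibration-point pairs are exactly enough to determine an affine map u = a·x + b·y + c, v = d·x + e·y + f, which the following self-contained Python solves by Cramer's rule; `solve3` and `affine_from_points` are illustrative names:

```python
# Sketch of the calibration step: solve a 2-D affine transform from three
# calibration-point correspondences (src in the first image, dst in the second).

def solve3(m, rhs):
    """Solve a 3x3 linear system m @ x = rhs by Cramer's rule."""
    def det(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det(m)
    # Replace one column at a time with the right-hand side.
    return [det([[rhs[i] if k == j else m[i][k] for k in range(3)]
                 for i in range(3)]) / d
            for j in range(3)]

def affine_from_points(src, dst):
    """Return a callable (x, y) -> (u, v) mapping the three src points onto dst."""
    m = [[x, y, 1.0] for (x, y) in src]          # same matrix for both axes
    a, b, c = solve3(m, [u for (u, _) in dst])   # u = a*x + b*y + c
    d, e, f = solve3(m, [v for (_, v) in dst])   # v = d*x + e*y + f
    return lambda p: (a * p[0] + b * p[1] + c, d * p[0] + e * p[1] + f)
```

With four or more calibration points, or when the camera views the projection surface at an angle, a least-squares fit or a full eight-parameter projective (homography) solve would typically replace this exact three-point solve.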
3. The projection interaction method according to claim 2, wherein before projecting the first image onto the projection surface, the method further comprises:
dividing the first image into a preset number of calibration regions, and calculating the coordinate transformation relationship of each calibration region separately.
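One possible reading of this per-region calibration (a sketch under assumptions — the uniform grid layout and the helper names are not from the application) is that each calibration region keeps its own transform, and a point is mapped by the transform of the region containing it:

```python
# Hedged sketch of per-region calibration: the image is split into a uniform
# cols x rows grid, and each grid cell has its own calibrated transform.

def region_index(point, width, height, cols, rows):
    """Return the (col, row) of the calibration region containing a point."""
    x, y = point
    col = min(int(x * cols / width), cols - 1)    # clamp the right edge
    row = min(int(y * rows / height), rows - 1)   # clamp the bottom edge
    return col, row

def map_point(point, transforms, width, height, cols, rows):
    """Apply the transform calibrated for the point's own region."""
    return transforms[region_index(point, width, height, cols, rows)](point)
```

With a 2×2 grid over a 100×100 camera image, for example, a point at (75, 25) falls in region (1, 0) and is mapped by that region's transform; per-region transforms can compensate for distortion that a single global transform would average away.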
4. A projection interaction device, comprising:
a spot track module, configured to obtain a projected image collected by an image acquisition device on a projection surface, and process the projected image to obtain a variation track of a light spot, wherein the light spot is a spot produced on the projection surface by light emitted by a control device;
an operation instruction module, configured to obtain an operation track according to the spot variation track and a coordinate transformation relationship between the projected image and a projection operation interface, and generate an operation instruction according to the operation track;
an instruction control module, configured to operate the projection operation interface according to the operation instruction.
5. The projection interaction device according to claim 4, further comprising:
a calibration projection module, configured to project a first image containing the projection operation interface onto the projection surface, wherein three or more calibration points are set in a calibration region of the first image;
a coordinate obtaining module, configured to obtain a second image collected by the image acquisition device on the projection surface, and process the second image to obtain the coordinates of each calibration point on the second image, the second image being the projected image of the first image;
a coordinate transformation module, configured to solve a two-dimensional geometric transformation equation between the first image and the second image according to the coordinates of the calibration points in the first image and the coordinates of the calibration points on the second image, to obtain the coordinate transformation relationship.
6. The projection interaction device according to claim 5, further comprising:
a region division module, configured to divide the first image into a preset number of calibration regions and calculate the coordinate transformation relationship of each calibration region separately.
7. A projection interaction system, comprising: a projector, a control device, an image acquisition device, and the projection interaction device according to any one of claims 4 to 6;
wherein the projector and the image acquisition device are each communicatively connected to the projection interaction device;
and the control device is configured to emit light to a projection surface.
8. The projection interaction system according to claim 7, wherein the control device is specifically an infrared stylus.
9. A terminal device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the method according to any one of claims 1 to 3.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 3.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811514475.7A CN109782962A (en) | 2018-12-11 | 2018-12-11 | A kind of projection interactive method, device, system and terminal device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109782962A true CN109782962A (en) | 2019-05-21 |
Family
ID=66496166
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811514475.7A Pending CN109782962A (en) | 2018-12-11 | 2018-12-11 | A kind of projection interactive method, device, system and terminal device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109782962A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2619232A1 (en) * | 1975-04-30 | 1976-11-11 | Thomson Brandt | OPTICAL PROJECTION DEVICE AND OPTICAL READER EQUIPPED WITH IT |
CN101840062A (en) * | 2009-08-21 | 2010-09-22 | 深圳先进技术研究院 | Interactive projection system and interactive method |
CN101907954A (en) * | 2010-07-02 | 2010-12-08 | 中国科学院深圳先进技术研究院 | Interactive projection system and interactive projection method |
CN103838437A (en) * | 2014-03-14 | 2014-06-04 | 重庆大学 | Touch positioning control method based on projection image |
CN104090689A (en) * | 2014-06-27 | 2014-10-08 | 深圳市中兴移动通信有限公司 | Mobile terminal and interactive projection method and system thereof |
CN104765233A (en) * | 2015-03-16 | 2015-07-08 | 浙江工业大学 | Screen visible-light-track tracking projection system |
CN105183163A (en) * | 2015-09-08 | 2015-12-23 | 闫维新 | Screen or projection non-contact type interaction device based on motion capture |
CN107682595A (en) * | 2017-08-14 | 2018-02-09 | 中国科学院深圳先进技术研究院 | A kind of alternative projection method, system and computer-readable recording medium |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110769229A (en) * | 2019-07-10 | 2020-02-07 | 成都极米科技股份有限公司 | Method, device and system for detecting color brightness of projection picture |
CN110837322A (en) * | 2019-09-29 | 2020-02-25 | 深圳市火乐科技发展有限公司 | Projection touch control method, projection equipment, projection curtain and storage medium |
CN111179148A (en) * | 2019-12-30 | 2020-05-19 | 深圳优地科技有限公司 | Data display method and device |
CN111179148B (en) * | 2019-12-30 | 2023-09-08 | 深圳优地科技有限公司 | Data display method and device |
CN111552393A (en) * | 2020-05-09 | 2020-08-18 | 扬州哈工科创机器人研究院有限公司 | Multimedia conference screen control method and system |
CN112015286B (en) * | 2020-07-31 | 2023-06-09 | 青岛海尔科技有限公司 | Method, device and projection system for interactive projection |
CN112015286A (en) * | 2020-07-31 | 2020-12-01 | 青岛海尔科技有限公司 | Method and device for interactive projection and projection system |
CN114598850B (en) * | 2020-11-19 | 2023-09-29 | 成都极米科技股份有限公司 | Projection control identification method, device and control equipment |
CN114598850A (en) * | 2020-11-19 | 2022-06-07 | 成都极米科技股份有限公司 | Projection control identification method and device and control equipment |
CN112702586A (en) * | 2020-12-21 | 2021-04-23 | 成都极米科技股份有限公司 | Projector virtual touch tracking method, device and system based on visible light |
CN112632208A (en) * | 2020-12-25 | 2021-04-09 | 际络科技(上海)有限公司 | Traffic flow trajectory deformation method and device |
CN112822468B (en) * | 2020-12-31 | 2023-02-17 | 成都极米科技股份有限公司 | Projection control method and device, projection equipment and laser controller |
CN112822468A (en) * | 2020-12-31 | 2021-05-18 | 成都极米科技股份有限公司 | Projection control method and device, projection equipment and laser controller |
CN115174878A (en) * | 2022-07-18 | 2022-10-11 | 峰米(重庆)创新科技有限公司 | Projection picture correction method, apparatus and storage medium |
CN115174878B (en) * | 2022-07-18 | 2024-03-15 | 峰米(重庆)创新科技有限公司 | Projection picture correction method, apparatus and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109782962A (en) | A kind of projection interactive method, device, system and terminal device | |
US11842438B2 (en) | Method and terminal device for determining occluded area of virtual object | |
CN101231450B (en) | Multipoint and object touch panel arrangement as well as multipoint touch orientation method | |
JP5950130B2 (en) | Camera-type multi-touch interaction device, system and method | |
CN101727245B (en) | Multi-touch positioning method and multi-touch screen | |
CN104838337B (en) | It is inputted for the no touch of user interface | |
CN102622108B (en) | A kind of interactive projection system and its implementation | |
CN108304075A (en) | A kind of method and apparatus carrying out human-computer interaction in augmented reality equipment | |
KR20100072207A (en) | Detecting finger orientation on a touch-sensitive device | |
CN201191355Y (en) | Multi-point object touch screen apparatus | |
CN102722254B (en) | Method and system for location interaction | |
Roman et al. | A scalable distributed paradigm for multi-user interaction with tiled rear projection display walls | |
Fiorentino et al. | Design review of CAD assemblies using bimanual natural interface | |
CN108596955A (en) | A kind of image detecting method, image detection device and mobile terminal | |
CN107682595B (en) | interactive projection method, system and computer readable storage medium | |
CN202159302U (en) | Augment reality system with user interaction and input functions | |
CN108769545A (en) | A kind of image processing method, image processing apparatus and mobile terminal | |
CN109345558A (en) | Image processing method, device, medium and electronic equipment | |
Chavarría et al. | Interactive optical 3D-touch user interface using a holographic light-field display and color information | |
CN107678540A (en) | Virtual touch screen man-machine interaction method, system and device based on depth transducer | |
CN106131533A (en) | A kind of method for displaying image and terminal | |
CN203606780U (en) | Multi-touch and gesture recognition fusion system | |
CN113457117B (en) | Virtual unit selection method and device in game, storage medium and electronic equipment | |
CN109816723A (en) | Method for controlling projection, device, projection interactive system and storage medium | |
Cheng et al. | Fingertip-based interactive projector–camera system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20190521 |
|
RJ01 | Rejection of invention patent application after publication |