CN106445121A - Virtual reality device and terminal interaction method and apparatus - Google Patents

Virtual reality device and terminal interaction method and apparatus

Info

Publication number
CN106445121A
CN106445121A CN201610810917.7A
Authority
CN
China
Prior art keywords
information
mentioned
terminal
virtual reality
special
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610810917.7A
Other languages
Chinese (zh)
Inventor
贾靖杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center, Samsung Electronics Co Ltd filed Critical Samsung Electronics China R&D Center
Priority to CN201610810917.7A priority Critical patent/CN106445121A/en
Publication of CN106445121A publication Critical patent/CN106445121A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a virtual reality device and terminal interaction method and apparatus. In a specific embodiment, the method comprises the steps of: receiving information sent by a terminal; and generating three-dimensional image information with a special effect according to the information and first special effect information, wherein the first special effect information is preset special effect information or special effect information selected from a preset special effect information list according to the information. By adding different special effects to different information, the method improves the user experience.

Description

Interaction method and apparatus for a virtual reality device and a terminal
Technical field
The present application relates to the field of intelligent terminals, in particular to the technical field of wearable devices, and more particularly to an interaction method and apparatus for a virtual reality device and a terminal.
Background
A virtual reality device is an electronic device that provides a user with sensory experiences such as vision, hearing, touch and force feedback. Virtual reality devices have been applied in fields such as medical treatment, education, fitness, military affairs and information entertainment. At present, a user mainly interacts with a virtual reality device through a handle, and the virtual reality device receives command information sent by the user through the handle. An existing virtual reality device may be connected with a mobile phone, but in that case the mobile phone usually serves as a part of the virtual reality device and mainly provides functions such as screen display, image acquisition and voice reception.
However, existing virtual reality devices cannot interact with an intelligent terminal to receive or send information and generate a three-dimensional image with a special effect.
Summary of the invention
The purpose of the present application is to provide an interaction method and apparatus for a virtual reality device and a terminal.
In a first aspect, the present application provides an interaction method for a virtual reality device and a terminal, the method comprising: receiving information sent by a terminal; and generating three-dimensional image information with a special effect according to the information and first special effect information, wherein the first special effect information is preset special effect information or special effect information selected from a preset special effect information list according to the information.
In some embodiments, the interaction method further comprises: receiving motion trajectory information of the terminal sent by the terminal; and generating three-dimensional animation information of a virtual object with a special effect according to the motion trajectory information, preset information of the virtual object corresponding to the terminal, and preset second special effect information.
In some embodiments, the interaction method further comprises: acquiring, through a sensor arranged on the virtual reality device, image information and limb motion trajectory information of a user wearing the virtual reality device; generating action feature information of the user according to the image information and the limb motion trajectory information; generating an instruction according to the action feature information; and sending the instruction to the terminal.
In some embodiments, the interaction method further comprises: receiving an instruction sent by the terminal; and executing the instruction.
In some embodiments, the interaction method further comprises: establishing a connection with the terminal in advance based on the Wi-Fi Direct standard.
In a second aspect, the present application provides an interaction apparatus for a virtual reality device and a terminal, the apparatus comprising: a first receiving unit configured to receive information sent by a terminal; and a first generating unit configured to generate three-dimensional image information with a special effect according to the information and first special effect information, wherein the first special effect information is preset special effect information or special effect information selected from a preset special effect information list according to the information.
In some embodiments, the apparatus further comprises: a second receiving unit configured to receive motion trajectory information of the terminal sent by the terminal; and a second generating unit configured to generate three-dimensional animation information of a virtual object with a special effect according to the motion trajectory information, preset information of the virtual object corresponding to the terminal, and preset second special effect information.
In some embodiments, the apparatus further comprises: an acquisition unit configured to acquire, through a sensor arranged on the virtual reality device, image information and limb motion trajectory information of a user wearing the virtual reality device; an action feature generating unit configured to generate action feature information of the user according to the image information and the limb motion trajectory information; an instruction generating unit configured to generate an instruction according to the action feature information; and a sending unit configured to send the instruction to the terminal.
In some embodiments, the apparatus further comprises: an instruction receiving unit configured to receive an instruction sent by the terminal; and an instruction executing unit configured to execute the instruction.
In some embodiments, the apparatus further comprises: a connection establishing unit configured to establish a connection with the terminal in advance based on the Wi-Fi Direct standard.
According to the interaction method and apparatus for a virtual reality device and a terminal provided by the present application, information sent by the terminal is received, and three-dimensional image information with a special effect is then generated according to the received information and first special effect information, where the first special effect information may be preset special effect information or special effect information selected from a preset special effect information list according to the information. Three-dimensional image information with different special effects is thereby generated for different information, which improves the user experience.
Brief description of the drawings
Other features, objects and advantages of the present application will become more apparent from the following detailed description of non-limiting embodiments, read with reference to the accompanying drawings, in which:
Fig. 1 is an exemplary system architecture diagram to which the present application may be applied;
Fig. 2 is a flowchart of an embodiment of the interaction method for a virtual reality device and a terminal according to the present application;
Fig. 3 is a schematic diagram of an application scenario of the interaction method for a virtual reality device and a terminal according to the present application;
Fig. 4 is a flowchart of another embodiment of the interaction method for a virtual reality device and a terminal according to the present application;
Fig. 5 is a flowchart of yet another embodiment of the interaction method for a virtual reality device and a terminal according to the present application;
Fig. 6 is a structural schematic diagram of an embodiment of the interaction apparatus for a virtual reality device and a terminal according to the present application;
Fig. 7 is a structural schematic diagram of another embodiment of the interaction apparatus for a virtual reality device and a terminal according to the present application;
Fig. 8 is a structural schematic diagram of yet another embodiment of the interaction apparatus for a virtual reality device and a terminal according to the present application;
Fig. 9 is a structural schematic diagram of a computer system suitable for implementing the virtual reality device of the embodiments of the present application.
Detailed description of the embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only used to explain the related invention, not to limit the invention. It should also be noted that, for ease of description, only the parts related to the invention are shown in the accompanying drawings.
It should be noted that, without conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the interaction method for a virtual reality device and a terminal or the interaction apparatus for a virtual reality device and a terminal of the present application may be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102 and 103, a network 104 and a virtual reality device 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102 and 103 and the virtual reality device 105. The network 104 may include various connection types, such as wired or wireless communication links or fiber optic cables.
A user may use the terminal devices 101, 102 and 103 to interact with the virtual reality device 105 through the network 104 to receive or send messages and the like. The user may also use the virtual reality device 105 to interact with the terminal devices 101, 102 and 103 through the network 104 to receive or send messages and the like. Various types of client applications, such as instant messaging tools, social platform software and video playback software, may be installed on the terminal devices 101, 102 and 103.
The terminal devices 101, 102 and 103 may be various electronic devices having functions such as collecting motion trajectories, collecting images and executing received instructions, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers and the like.
The virtual reality device 105 may be a virtual reality device providing various services, for example a virtual reality device that generates image information with a special effect according to the information sent by the terminal devices 101, 102 and 103. The virtual reality device may process the received information and generate three-dimensional image information with a special effect. The virtual reality device may include, but is not limited to, a head mounted display (HMD), smart glasses and the like.
It should be noted that the interaction method for a virtual reality device and a terminal provided by the embodiments of the present application is generally executed by the virtual reality device 105; accordingly, the interaction apparatus for a virtual reality device and a terminal is generally arranged in the virtual reality device 105.
It should be understood that the numbers of terminal devices, networks and virtual reality devices in Fig. 1 are merely illustrative. There may be any number of terminal devices, networks and virtual reality devices according to implementation needs.
With continued reference to Fig. 2, a flow 200 of an embodiment of the interaction method for a virtual reality device and a terminal according to the present application is shown. The interaction method for a virtual reality device and a terminal comprises the following steps:
Step 201: receiving information sent by a terminal.
In the present embodiment, the virtual reality device on which the interaction method for a virtual reality device and a terminal runs (for example, the virtual reality device 105 shown in Fig. 1) may receive, through a wired connection or a wireless connection, the information sent by a terminal device (for example, the terminal devices 101, 102 and 103 shown in Fig. 1). The wireless connection may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a Zigbee connection, a UWB (ultra wideband) connection, and other wireless connections that are currently known or developed in the future. The information may be of various types, including but not limited to video information, command information, image information, audio information, and motion trajectory information of the terminal device.
Step 202: generating three-dimensional image information with a special effect according to the information and first special effect information.
In the present embodiment, the virtual reality device generates the three-dimensional image information with a special effect according to the information received in step 201 and the first special effect information. The first special effect information may include, but is not limited to, at least one of the following: sound information, light information, background picture information and animation information. As an example, a head mounted display may first receive first special effect information set by the user in advance, the first special effect information being special effect information characterizing dense fog. The head mounted display then receives a frame of a 3D (three-dimensional) documentary about a tropical rain forest sent by a smart phone. The head mounted display may then model irregular scenery with a particle system to simulate the dense fog and generate an image characterizing the dense fog. Finally, the head mounted display composites, by means of an image compositing technique, the frame of the documentary with the image characterizing the dense fog and displays the result.
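As an illustration of the compositing described above, the following minimal Python sketch renders a particle-based fog mask and alpha-blends it onto a received frame. The sketch is not part of the original disclosure; the particle count, kernel width and fog strength are assumed values.

import numpy as np

def render_fog_mask(height, width, num_particles=300, radius=12, rng=None):
    # Render a grayscale fog mask from randomly placed Gaussian particles.
    if rng is None:
        rng = np.random.default_rng(0)
    ys = rng.integers(0, height, num_particles)
    xs = rng.integers(0, width, num_particles)
    yy, xx = np.mgrid[0:height, 0:width]
    mask = np.zeros((height, width), dtype=np.float64)
    for y, x in zip(ys, xs):
        mask += np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2.0 * radius ** 2))
    return np.clip(mask / mask.max(), 0.0, 1.0)

def composite_fog(frame, fog_strength=0.6):
    # Alpha-blend a white fog layer onto an RGB frame of shape (H, W, 3), dtype uint8.
    height, width = frame.shape[:2]
    alpha = render_fog_mask(height, width)[..., None] * fog_strength
    fog_layer = np.full(frame.shape, 255.0)
    blended = frame.astype(np.float64) * (1.0 - alpha) + fog_layer * alpha
    return blended.astype(np.uint8)

# Usage: composited_frame = composite_fog(received_documentary_frame)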
In some optional implementations of the present embodiment, the virtual reality device may interact with the user by voice and save the first special effect information set by the user. The user may also set the first special effect information on a first special effect information setting page of the terminal; the first special effect information is then sent to the virtual reality device, and the virtual reality device finally receives and saves the first special effect information.
In some optional implementations of the present embodiment, the virtual reality device may also determine, according to the received information, the virtual scene to which the information belongs, and then select, from a preset special effect information list, the special effect information matching the virtual scene as the first special effect information. The special effect information list may include, but is not limited to: special effect information characterizing nostalgia, special effect information characterizing romance, and special effect information characterizing horror. As an example, if the virtual reality device receives audio information and consecutive frames, and determines that the sound in the audio information is a low-frequency sound and that the number of pixels representing black in three consecutive frames exceeds a preset number, it generates a door-knocking sound characterizing horror.
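A minimal Python sketch of this scene matching is given below; the scene labels, feature thresholds and effect entries are illustrative assumptions standing in for the device's actual audio and image analysis, and are not taken from the disclosure.

PRESET_EFFECTS = {
    "nostalgia": {"filter": "sepia", "sound": "vinyl_crackle"},
    "romance": {"filter": "soft_glow", "sound": "strings"},
    "horror": {"filter": "desaturate", "sound": "door_knocking"},
}

def classify_scene(audio_freq_hz, black_pixel_ratio,
                   low_freq_threshold=150.0, dark_threshold=0.6):
    # Map coarse audio/image features of the received information to a scene label.
    if audio_freq_hz < low_freq_threshold and black_pixel_ratio > dark_threshold:
        return "horror"
    if audio_freq_hz < low_freq_threshold:
        return "nostalgia"
    return "romance"

def select_first_effect(audio_freq_hz, black_pixel_ratio):
    # Choose the first special effect information matching the detected scene.
    return PRESET_EFFECTS[classify_scene(audio_freq_hz, black_pixel_ratio)]

# e.g. select_first_effect(90.0, 0.8) returns the "horror" entry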
In some optional implementations of the present embodiment, the virtual reality device may also receive an instruction sent by the terminal and execute the instruction. The instruction may include, but is not limited to: an instruction indicating playing or pausing, an instruction indicating enlarging a virtual object placed by the virtual reality device, and an instruction indicating rotating a virtual object placed by the virtual reality device. As an example, the terminal is the user's mobile phone; when the phone detects, through a sensor arranged on it, that the phone has turned 45 degrees to the right and advanced 20 centimeters, the phone sends to the virtual reality device an instruction to move a virtual object placed by the virtual reality device to the left and an instruction to enlarge the virtual object. The virtual reality device then receives and executes the instructions, moving the placed virtual object to the left by a preset distance and enlarging it according to a preset scale.
In some optional implementations of the present embodiment, the virtual reality device may establish a connection with the terminal in advance based on the Wi-Fi Direct standard, where the Wi-Fi Direct standard refers to a communication standard that allows devices in a wireless network to connect to each other without a wireless router. The virtual reality device and the terminal may exchange messages using RTSP (Real Time Streaming Protocol) to establish the connection. The process of establishing the connection comprises the following steps:
First, the terminal sends to the virtual reality device a request for the RTSP methods supported by the virtual reality device, where the RTSP methods refer to the methods implementing the real time streaming protocol. The virtual reality device then receives the request and retrieves the RTSP methods supported by the device. The virtual reality device then sends the retrieved supported RTSP methods to the terminal. Finally, the terminal receives and saves the RTSP methods sent by the virtual reality device.
Second, the virtual reality device sends to the terminal a request for the RTSP methods supported by the terminal. The terminal then receives the request and retrieves the RTSP methods supported by the terminal. The terminal then sends the retrieved supported RTSP methods to the virtual reality device. Finally, the virtual reality device receives and saves the RTSP methods sent by the terminal.
Third, the terminal sends the parameter information of the RTSP methods supported by the terminal to the virtual reality device, and the virtual reality device receives and saves the parameter information. The virtual reality device then sends the parameter information of the RTSP methods supported by the device to the terminal, and the terminal receives and saves the parameter information sent by the virtual reality device.
Fourth, the terminal determines the RTSP methods and the corresponding parameter information that can be used in the communication process, according to the RTSP methods supported by the terminal and their parameter information as well as the saved RTSP methods supported by the virtual reality device and their parameter information. The terminal then sends the determined RTSP methods and their parameter information to the virtual reality device.
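The fourth step amounts to intersecting the two capability sets. The Python sketch below illustrates such a negotiation; the method names and parameter sets are illustrative examples, not a definitive representation of the RTSP exchange described above.

def negotiate_capabilities(terminal_caps, device_caps):
    # Each argument maps an RTSP method name to the set of parameter names
    # that the corresponding side supports for that method.
    common = {}
    for method, terminal_params in terminal_caps.items():
        device_params = device_caps.get(method)
        if device_params is None:
            continue  # method not supported on both sides
        common[method] = terminal_params & device_params
    return common

terminal_caps = {"OPTIONS": set(), "SETUP": {"transport"}, "PLAY": {"range", "scale"}}
device_caps = {"OPTIONS": set(), "SETUP": {"transport"}, "PLAY": {"range"}}
print(negotiate_capabilities(terminal_caps, device_caps))
# {'OPTIONS': set(), 'SETUP': {'transport'}, 'PLAY': {'range'}}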
In some optional implementations of the present embodiment, the virtual reality device is preset with an open application programming interface (API). A developer may call the open API to control the sensors on the virtual reality device and thereby develop further functions.
With continued reference to Fig. 3, Fig. 3 is a schematic diagram of an application scenario of the interaction method for a virtual reality device and a terminal according to the present embodiment. In the application scenario of Fig. 3, a user 301 first interacts with a smart phone 302 by voice and sets first special effect information, the first special effect information being special effect information characterizing a happy-birthday greeting. Then, the user 301 holds the smart phone 302 and collects image information around, above and below the user 301. Afterwards, the user 301 sends, by means of the phone 302, the collected image information and the set birthday special effect information to smart glasses 304 through a wireless network 303. The smart glasses 304 receive the image information and the special effect information characterizing the happy-birthday greeting, generate a virtual 3D image 306 of the user 301 according to the received image information, and, according to the received birthday special effect information, generate a 3D virtual cake 307 and 3D virtual flowers 308 and play music 309. Finally, a user 305 can see the virtual image 306 of the user 301, the virtual cake 307 and the virtual flowers 308 through the smart glasses 304 and hear the music 309.
The method provided by the above embodiment of the present application adds special effects to the received information, so that three-dimensional image information with different special effects is generated according to different virtual scenes, which increases realism and interest and improves the user experience.
With further reference to Fig. 4, a flow 400 of another embodiment of the interaction method for a virtual reality device and a terminal is shown. The flow 400 of the interaction method comprises the following steps:
Step 401: receiving information sent by a terminal.
In the present embodiment, the virtual reality device on which the interaction method for a virtual reality device and a terminal runs (for example, the virtual reality device 105 shown in Fig. 1) may receive the information sent by the terminal through a wireless connection or a wired connection.
Step 402: generating three-dimensional image information with a special effect according to the information and first special effect information.
In the present embodiment, the manner in which the virtual reality device generates the three-dimensional image information with a special effect according to the information and the first special effect information may refer to step 202 and is not repeated here.
Step 403: receiving motion trajectory information of the terminal sent by the terminal.
In the present embodiment, the virtual reality device may receive, through a wireless connection or a wired connection, the motion trajectory information of the terminal sent by the terminal. The motion trajectory information characterizes the motion trajectory of the terminal in three-dimensional space. The terminal may collect its motion trajectory information in real time through sensors such as a three-axis accelerometer and a three-axis gyroscope arranged on the terminal.
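The disclosure does not specify how the terminal converts the raw sensor samples into a trajectory; one common approach is numerical integration of the acceleration samples, sketched below in Python under that assumption (gravity compensation and drift correction are ignored for brevity).

import numpy as np

def accel_to_trajectory(accel_samples, dt):
    # Double-integrate acceleration samples of shape (N, 3), in m/s^2 with
    # gravity already removed, into positions of shape (N, 3).
    accel = np.asarray(accel_samples, dtype=np.float64)
    velocity = np.cumsum(accel * dt, axis=0)     # first integration
    position = np.cumsum(velocity * dt, axis=0)  # second integration
    return position

# e.g. 100 samples at 100 Hz of a phone accelerating along its x axis:
samples = np.tile([0.5, 0.0, 0.0], (100, 1))
trajectory = accel_to_trajectory(samples, dt=0.01)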
Step 404: generating three-dimensional animation information of the virtual object with a special effect according to the motion trajectory information, the preset information of the virtual object corresponding to the terminal, and the preset second special effect information.
In the present embodiment, the user may select a virtual object from a preset virtual object list as the virtual object corresponding to the terminal, and the information of the virtual object is obtained, where the information may include a unique identifier of the virtual object, shape information and the like. As an example, the virtual object list may include at least one of the following: a racket, a tennis racket, a knife and a sword. The user may also select the second special effect information for the virtual object corresponding to the terminal, where the second special effect information may include, but is not limited to, at least one of the following: sound information, image information and animation information. The virtual reality device generates the three-dimensional animation information of the virtual object with the special effect according to the motion trajectory information, the information of the virtual object corresponding to the terminal, and the second special effect information. As an example, the virtual reality device is a head mounted display and the terminal is the user's mobile phone. The head mounted display receives image information of a deep-forest game sent by the phone; the currently generated three-dimensional picture shows three wolves guarding a valley crossing, with a dense fog special effect. The user then selects a sword from the virtual object list as the virtual object corresponding to the phone. The head mounted display then analyzes the received motion trajectory information of the phone and determines that the phone has moved from top to bottom; it therefore generates a motion trajectory of the sword moving from top to bottom in the form of an arc, loads white light on both sides of the arc, and generates a sound characterizing an object being cleaved. Finally, after a preset period of time, the arc, the white light and the sound disappear.
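A minimal Python sketch of how the dominant direction of the received trajectory might be mapped to an animation descriptor for the bound virtual object follows; the descriptor fields, object entry and effect entry are illustrative assumptions.

import numpy as np

def dominant_direction(trajectory):
    # Classify the overall displacement of an (N, 3) trajectory.
    delta = np.asarray(trajectory[-1], dtype=np.float64) - np.asarray(trajectory[0], dtype=np.float64)
    axis = int(np.argmax(np.abs(delta)))
    names = [("left", "right"), ("down", "up"), ("backward", "forward")]
    return names[axis][int(delta[axis] > 0)]

def build_animation(trajectory, virtual_object, second_effect):
    # Assemble an animation descriptor for the virtual object bound to the terminal.
    return {
        "object_id": virtual_object["id"],
        "motion": dominant_direction(trajectory),  # e.g. "down" for a downward slash
        "path_shape": "arc",
        "effects": second_effect,                  # e.g. white glow plus cleaving sound
        "duration_s": 1.5,
    }

sword = {"id": "sword-01", "shape": "blade"}
effect = {"light": "white_glow", "sound": "cleave"}
animation = build_animation([(0.0, 1.0, 0.0), (0.0, 0.2, 0.0)], sword, effect)
# animation["motion"] == "down"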
As can be seen from Fig. 4, compared with the embodiment corresponding to Fig. 2, the flow 400 of the interaction method for a virtual reality device and a terminal in the present embodiment highlights generating a three-dimensional animation of the virtual object with a special effect according to the motion trajectory information of the terminal, so that the user interacts through the terminal with the virtual environment of the virtual reality, which enhances interactivity and immersion.
With further reference to Fig. 5, a flow 500 of yet another embodiment of the interaction method for a virtual reality device and a terminal is shown. The flow 500 of the interaction method comprises the following steps:
Step 501: receiving information sent by a terminal.
In the present embodiment, the virtual reality device on which the interaction method for a virtual reality device and a terminal runs (for example, the virtual reality device 105 shown in Fig. 1) may receive the information sent by the terminal through a wireless connection or a wired connection.
Step 502: generating three-dimensional image information with a special effect according to the information and first special effect information.
In the present embodiment, the manner in which the virtual reality device generates the three-dimensional image information with a special effect according to the information and the first special effect information may refer to step 202 and is not repeated here.
Step 503: acquiring, through a sensor arranged on the virtual reality device, image information and limb motion trajectory information of the user wearing the virtual reality device.
In the present embodiment, the virtual reality device may collect, in real time, the image information of the user wearing the virtual reality device through sensors such as a camera and a thermal infrared imager arranged on the virtual reality device. The virtual reality device may also collect, in real time, the limb motion trajectory information of the user through sensors such as a three-axis accelerometer and a three-axis gyroscope arranged on the virtual reality device, where the limb motion trajectory information may include, but is not limited to, at least one of the following: head motion trajectory information and arm motion trajectory information.
Step 504: generating action feature information of the user according to the image information and the limb motion trajectory information.
In the present embodiment, the virtual reality device may use a convolutional neural network (CNN) algorithm to recognize the action feature information of the user from the image information and the limb motion trajectory information. The action feature information may include, but is not limited to, at least one of the following: action feature information characterizing the head turning by a certain angle, action feature information characterizing the body's center of gravity leaning forward, action feature information characterizing the body moving forward, and action feature information characterizing the body moving backward.
In the present embodiment, the convolutional neural network may include an input layer, a convolutional layer, a pooling layer, a fully connected layer and an output layer, each of which has been trained by the virtual reality device in advance. The convolutional neural network may recognize and extract the action feature information of the user according to the following steps. First, the input layer performs denoising and normalization on the image information and the limb motion trajectory information to obtain a normalized numerical matrix. Then, the convolutional layer cuts the normalized numerical matrix into 8-by-8 sub-matrices and performs a convolution of each sub-matrix with a convolution kernel to obtain a convolution matrix. The pooling layer then performs aggregate statistics on the convolution matrix to obtain local feature information. The fully connected layer then processes the local feature information to generate global feature information. Finally, the output layer classifies the global feature information and determines the action feature information of the user.
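A minimal PyTorch sketch of a network with this input-convolution-pooling-fully-connected-output layout is given below; the channel sizes, input resolution and class labels are assumed for illustration and are not specified in the disclosure.

import torch
import torch.nn as nn

ACTION_LABELS = ["head_turn", "lean_forward", "move_forward", "move_backward"]

class ActionFeatureNet(nn.Module):
    def __init__(self, num_classes=len(ACTION_LABELS)):
        super().__init__()
        self.conv = nn.Conv2d(1, 16, kernel_size=8, stride=8)  # 8-by-8 sub-matrices
        self.pool = nn.AdaptiveAvgPool2d(4)                    # aggregate statistics
        self.fc = nn.Linear(16 * 4 * 4, num_classes)           # global features to classes

    def forward(self, x):
        # x: normalized numerical matrix, shape (batch, 1, 64, 64)
        x = torch.relu(self.conv(x))
        x = self.pool(x)
        x = torch.flatten(x, 1)
        return self.fc(x)

net = ActionFeatureNet()
normalized = torch.rand(1, 1, 64, 64)  # stand-in for denoised, normalized sensor data
scores = net(normalized)
action = ACTION_LABELS[int(scores.argmax(dim=1))]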
Step 505: generating an instruction according to the action feature information.
In the present embodiment, the virtual reality device searches a preset action feature information list for action feature information matching the action feature information obtained in step 504. The action feature information list may include, but is not limited to: action feature information characterizing playing the next item, action feature information characterizing playing the previous item, and action feature information characterizing pausing playback. In response to the presence, in the action feature information list, of action feature information matching the action feature information obtained in step 504, an instruction is generated according to the matching action feature information. As an example, the virtual reality device plays a movie according to the frame information of the movie received in step 501. If the action feature information of the user generated in step 504 characterizes the right index finger being held horizontally with the remaining fingers of the right hand bent, which matches the action feature information characterizing playing the previous item in the preset action feature information list, the virtual reality device generates an instruction to play the previous movie.
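A minimal Python sketch of this matching step is shown below; the action feature names and instruction names are illustrative assumptions, not taken from the disclosure.

ACTION_TO_INSTRUCTION = {
    "index_finger_horizontal": "PLAY_PREVIOUS",
    "palm_push_forward": "PLAY_NEXT",
    "fist_closed": "PAUSE",
}

def generate_instruction(action_feature):
    # Return the instruction matching the recognized action feature, or None.
    return ACTION_TO_INSTRUCTION.get(action_feature)

assert generate_instruction("index_finger_horizontal") == "PLAY_PREVIOUS"
assert generate_instruction("unknown_gesture") is None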
Step 506: sending the instruction to the terminal.
In the present embodiment, the virtual reality device sends the instruction to the terminal through a wireless connection or a wired connection. The terminal receives the instruction and then executes it.
As can be seen from Fig. 5, compared with the embodiment corresponding to Fig. 2, the flow 500 of the interaction method for a virtual reality device and a terminal in the present embodiment highlights generating command information according to the image information and the limb motion trajectory information of the user and sending the instruction to the terminal, so that the user controls the terminal through limb actions and the control operation is more user-friendly.
With further reference to Fig. 6, as an implementation of the methods shown in the above figures, the present application provides an embodiment of an interaction apparatus for a virtual reality device and a terminal. The apparatus embodiment corresponds to the method embodiment shown in Fig. 2, and the apparatus may be applied to various electronic devices.
As shown in Fig. 6, the interaction apparatus 600 for a virtual reality device and a terminal described in the present embodiment includes a first receiving unit 601 and a first generating unit 602. The first receiving unit 601 is configured to receive information sent by a terminal; the first generating unit 602 is configured to generate three-dimensional image information with a special effect according to the information and first special effect information, wherein the first special effect information is preset special effect information or special effect information selected from a preset special effect information list according to the information.
In the present embodiment, the first receiving unit 601 of the interaction apparatus 600 for a virtual reality device and a terminal may receive the information sent by the terminal device through a wired connection or a wireless connection. The terminal may include, but is not limited to, a smart phone, a tablet computer, a laptop portable computer and a desktop computer. The information may be of various types, including but not limited to video information, command information, image information, audio information, and motion trajectory information of the terminal device.
In the present embodiment, the first generating unit 602 generates the three-dimensional image information with a special effect according to the information received by the first receiving unit 601 and the first special effect information. The first special effect information may include, but is not limited to, at least one of the following: sound information, light information, background picture information and animation information.
In some optional implementations of the present embodiment, the interaction apparatus 600 may interact with the user by voice and save the first special effect information set by the user. The user may also set the first special effect information on a first special effect information setting page of the terminal; the first special effect information is then sent to the virtual reality device, and the interaction apparatus 600 finally receives and saves the first special effect information.
In some optional implementations of the present embodiment, the interaction apparatus 600 may also determine, according to the received information, the virtual scene to which the information belongs, and then select, from a preset special effect information list, the special effect information matching the virtual scene as the first special effect information. The special effect information list may include, but is not limited to: special effect information characterizing nostalgia, special effect information characterizing romance, and special effect information characterizing horror.
In some optional implementations of the present embodiment, the interaction apparatus 600 may further include an instruction receiving unit (not shown in Fig. 6) configured to receive an instruction sent by the terminal, and an instruction executing unit (not shown in Fig. 6) configured to execute the instruction. The instruction receiving unit may monitor an instruction sending request of the terminal and, in response to detecting the request, receive the instruction sent by the terminal. The instruction may indicate playing or pausing, enlarging a virtual object placed by the virtual reality device, or rotating a virtual object placed by the virtual reality device. The instruction executing unit executes the received instruction.
In some optional implementations of the present embodiment, the interaction apparatus 600 may further include a connection establishing unit (not shown in Fig. 6) configured to establish a connection with the terminal in advance based on the Wi-Fi Direct standard. The virtual reality device may exchange messages with the terminal using RTSP to establish the connection.
In the present embodiment, the first generating unit 602 may generate three-dimensional image information with different special effects according to the different information received, which increases realism and interest and improves the user experience.
With further reference to Fig. 7, as an implementation of the methods shown in the above figures, the present application provides another embodiment of an interaction apparatus for a virtual reality device and a terminal. The apparatus embodiment corresponds to the method embodiment shown in Fig. 4, and the apparatus may be applied to various electronic devices.
As shown in Fig. 7, the interaction apparatus 700 for a virtual reality device and a terminal described in the present embodiment includes a first receiving unit 701, a first generating unit 702, a second receiving unit 703 and a second generating unit 704. The first receiving unit 701 is configured to receive information sent by a terminal; the first generating unit 702 is configured to generate three-dimensional image information with a special effect according to the information and first special effect information; the second receiving unit 703 is configured to receive motion trajectory information of the terminal sent by the terminal; and the second generating unit 704 is configured to generate three-dimensional animation information of the virtual object with a special effect according to the motion trajectory information, preset information of the virtual object corresponding to the terminal, and preset second special effect information.
In the present embodiment, the first receiving unit 701 of the interaction apparatus 700 for a virtual reality device and a terminal has the same function as the first receiving unit 601 of the interaction apparatus 600 for a virtual reality device and a terminal, which is not repeated here.
In the present embodiment, the first generating unit 702 of the interaction apparatus 700 for a virtual reality device and a terminal has the same function as the first generating unit 602 of the interaction apparatus 600 for a virtual reality device and a terminal, which is not repeated here.
In the present embodiment, the second receiving unit 703 of the interaction apparatus 700 for a virtual reality device and a terminal may receive, through a wireless connection or a wired connection, the motion trajectory information of the terminal sent by the terminal. The motion trajectory information characterizes the motion trajectory of the terminal in three-dimensional space. The terminal may collect its motion trajectory information in real time through sensors such as a three-axis accelerometer and a three-axis gyroscope arranged on the terminal.
In the present embodiment, the second generating unit 704 may allow the user to select a virtual object from a preset virtual object list as the virtual object corresponding to the terminal, and obtain the information of the virtual object, where the information may include a unique identifier of the virtual object, shape information and the like. The second generating unit 704 may also allow the user to select the second special effect information for the virtual object corresponding to the terminal, where the second special effect information may include, but is not limited to, at least one of the following: sound information, image information, background information and animation information. The virtual reality device generates the three-dimensional animation information of the virtual object with the special effect according to the motion trajectory information, the information of the virtual object corresponding to the terminal, and the second special effect information.
In the present embodiment, the second generating unit 704 generates the three-dimensional animation of the virtual object with a special effect according to the motion trajectory information of the terminal, so that the user interacts through the terminal with the virtual environment of the virtual reality, which enhances interactivity and immersion.
With further reference to Fig. 8, as an implementation of the methods shown in the above figures, the present application provides yet another embodiment of an interaction apparatus for a virtual reality device and a terminal. The apparatus embodiment corresponds to the method embodiment shown in Fig. 5, and the apparatus may be applied to various electronic devices.
As shown in Fig. 8, the interaction apparatus 800 for a virtual reality device and a terminal described in the present embodiment includes a first receiving unit 801, a first generating unit 802, an acquisition unit 803, an action feature generating unit 804, an instruction generating unit 805 and a sending unit 806. The first receiving unit 801 is configured to receive information sent by a terminal; the first generating unit 802 is configured to generate three-dimensional image information with a special effect according to the information and first special effect information; the acquisition unit 803 is configured to acquire, through a sensor arranged on the virtual reality device, image information and limb motion trajectory information of the user wearing the virtual reality device; the action feature generating unit 804 is configured to generate action feature information of the user according to the image information and the limb motion trajectory information; the instruction generating unit 805 is configured to generate an instruction according to the action feature information; and the sending unit 806 is configured to send the instruction to the terminal.
In the present embodiment, the first receiving unit 801 of the interaction apparatus 800 for a virtual reality device and a terminal has the same function as the first receiving unit 601 of the interaction apparatus 600 for a virtual reality device and a terminal, which is not repeated here.
In the present embodiment, the first generating unit 802 of the interaction apparatus 800 for a virtual reality device and a terminal has the same function as the first generating unit 602 of the interaction apparatus 600 for a virtual reality device and a terminal, which is not repeated here.
In the present embodiment, the acquisition unit 803 of the interaction apparatus 800 for a virtual reality device and a terminal may collect, in real time, the image information of the user wearing the virtual reality device through sensors such as a camera and a thermal infrared imager arranged on the virtual reality device. The acquisition unit 803 may also collect, in real time, the limb motion trajectory information of the user through sensors such as a three-axis accelerometer and a three-axis gyroscope arranged on the virtual reality device, where the limb motion trajectory information may include, but is not limited to, at least one of the following: head motion trajectory information and arm motion trajectory information.
In the present embodiment, the action feature generating unit 804 may use a convolutional neural network or a deep belief network (DBN) to recognize the action feature information of the user from the image information and the limb motion trajectory information. As an example, the action feature generating unit 804 may use a deep belief network to recognize the action feature information of the user, where the deep belief network has been trained with a large number of labeled samples. The action feature generating unit 804 may select, according to the joint distribution of the user's image information, the user's limb motion trajectory information and the labels, the label with the maximum probability value, where the action feature information corresponding to that label is the action feature information of the user.
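A minimal Python sketch of this maximum-probability label selection follows, with the label probabilities assumed to come from the trained deep belief network (which is not implemented here).

import numpy as np

ACTION_LABELS = ["head_turn", "lean_forward", "move_forward", "move_backward"]

def select_action_label(label_probabilities, labels=ACTION_LABELS):
    # Pick the label with the maximum probability under the joint distribution
    # of image information, limb motion trajectory information and labels.
    return labels[int(np.argmax(np.asarray(label_probabilities)))]

print(select_action_label([0.1, 0.6, 0.2, 0.1]))  # lean_forward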
In the present embodiment, the instruction generating unit 805 may search a preset action feature information list for action feature information matching the action feature information generated by the action feature generating unit 804. In response to the presence, in the action feature information list, of action feature information matching the action feature information obtained by the action feature generating unit 804, an instruction is generated according to the matching action feature information.
In the present embodiment, after the instruction generating unit 805 generates the instruction, the sending unit 806 sends the instruction to the terminal through a wireless connection or a wired connection. The terminal receives the instruction and then executes it.
In the present embodiment, the action feature generating unit 804 generates the action feature information of the user according to the image information and the limb motion trajectory information of the user; the instruction generating unit 805 then generates command information according to the action feature information of the user, and the instruction is sent to the terminal, so that the user controls the terminal through limb actions and the control operation is more user-friendly.
Referring now to Fig. 9, a structural schematic diagram of a computer system 900 suitable for implementing the virtual reality device of the embodiments of the present application is shown. The device corresponds to the virtual reality device 105 shown in Fig. 1 and may be any of various types of wearable devices.
As shown in Fig. 9, the device 900 includes a central processing unit (CPU) 901, a memory 902, an input unit 903, an output unit 904, a communication unit 905 and a bus 906, where the CPU 901, the memory 902, the input unit 903, the output unit 904 and the communication unit 905 are connected with each other through the bus 906. The method according to the present application may be implemented as a computer program and stored in the memory 902. The CPU 901 of the device 900 implements the functions defined in the method of the present application by calling the computer program stored in the memory 902.
In some implementations, the input unit 903 may be a voice receiving unit, the output unit 904 may be a device such as a display screen or a loudspeaker, and the communication unit 905 may be a device that communicates with other devices. Thus, the CPU 901 may call the computer program to generate the three-dimensional image information with a special effect, and control the output unit 904 to output the three-dimensional image information with the special effect.
The flowcharts and block diagrams in the accompanying drawings illustrate the architecture, functions and operations that may be implemented by the systems, methods and computer program products according to the various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment or a part of code, and the module, program segment or part of code contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the accompanying drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should also be noted that each block in a block diagram and/or flowchart, and a combination of blocks in a block diagram and/or flowchart, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software or by hardware. The described units may also be arranged in a processor; for example, a processor may be described as including a first receiving unit and a first generating unit. The names of these units do not, under certain circumstances, constitute a limitation of the units themselves; for example, the first receiving unit may also be described as "a unit for receiving information sent by a terminal".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus described in the above embodiments, or may exist separately without being assembled into a terminal. The non-volatile computer storage medium stores one or more programs which, when executed by a device, cause the device to: receive information sent by a terminal; and generate three-dimensional image information with a special effect according to the information and first special effect information, wherein the first special effect information is preset special effect information or special effect information selected from a preset special effect information list according to the information.
The above description is only a preferred embodiment of the present application and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the present application.

Claims (10)

1. a kind of virtual reality device and the exchange method of terminal are it is characterised in that methods described includes:
The information that receiving terminal sends;
Generated according to described information and the first special-effect information and have specific 3-dimensional image information, wherein, described first special efficacy letter Breath is the special-effect information pre-setting or the special-effect information chosen from preset special-effect information list according to described information.
2. exchange method according to claim 1 is it is characterised in that methods described also includes:
Receive the motion track information of the described terminal that described terminal sends;
According to described motion track information, the information of the corresponding dummy object of default described terminal, default second special efficacy letter Breath, generates the three-dimensional animation information having specific described dummy object.
3. The interaction method according to claim 1, characterized in that the method further comprises:
collecting, by a sensor arranged on the virtual reality device, image information and limb motion track information of a user wearing the virtual reality device;
generating motion characteristic information of the user according to the image information and the limb motion track information;
generating an instruction according to the motion characteristic information;
sending the instruction to the terminal.
4. The interaction method according to claim 1, characterized in that the method further comprises:
receiving an instruction sent by the terminal;
executing the instruction.
5. The interaction method according to claim 1, characterized in that the method further comprises:
pre-establishing a connection with the terminal based on the Wi-Fi Direct standard.
6. An interaction apparatus between a virtual reality device and a terminal, characterized in that the apparatus comprises:
a first receiving unit configured to receive information sent by a terminal;
a first generating unit configured to generate three-dimensional image information with a special effect according to the information and first special effect information, wherein the first special effect information is preset special effect information or special effect information selected from a preset special effect information list according to the information.
7. The interaction apparatus according to claim 6, characterized in that the apparatus further comprises:
a second receiving unit configured to receive motion track information of the terminal sent by the terminal;
a second generating unit configured to generate three-dimensional animation information of a virtual object with a special effect according to the motion track information, preset information of the virtual object corresponding to the terminal, and preset second special effect information.
8. The interaction apparatus according to claim 6, characterized in that the apparatus further comprises:
a collecting unit configured to collect, by a sensor arranged on the virtual reality device, image information and limb motion track information of a user wearing the virtual reality device;
a motion characteristic generating unit configured to generate motion characteristic information of the user according to the image information and the limb motion track information;
an instruction generating unit configured to generate an instruction according to the motion characteristic information;
a sending unit configured to send the instruction to the terminal.
9. The interaction apparatus according to claim 6, characterized in that the apparatus further comprises:
an instruction receiving unit configured to receive an instruction sent by the terminal;
an instruction executing unit configured to execute the instruction.
10. The interaction apparatus according to claim 6, characterized in that the apparatus further comprises:
a connection establishing unit configured to pre-establish a connection with the terminal based on the Wi-Fi Direct standard.
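As a non-limiting illustration of the device-side steps recited in claims 3, 5, 8 and 10 above, the following sketch strings the claimed pipeline together: a pre-established connection with the terminal (the Wi-Fi Direct link is stubbed out), collection of the wearer's image information and limb motion track information from a sensor, derivation of motion characteristic information, generation of an instruction, and sending of the instruction to the terminal. Every class name, threshold and message format here is an assumption introduced for illustration rather than the claimed implementation.

# Non-limiting sketch of the device-side pipeline of claims 3, 5, 8 and 10;
# the sensor interface, the swipe threshold and the instruction format are
# assumptions, and the Wi-Fi Direct connection is stubbed out.

import json


class P2PConnection:
    """Stand-in for a connection pre-established with the terminal (e.g. over Wi-Fi Direct)."""

    def send(self, payload: str) -> None:
        print("-> terminal:", payload)


class FakeSensor:
    """Toy sensor so the sketch runs end to end."""

    def capture_image(self):
        return "frame_0001"

    def capture_limb_track(self):
        # A short limb motion track as (x, y) samples.
        return [(0.0, 0.0), (0.3, 0.1), (0.9, 0.2)]


def collect(sensor):
    """Collect image information and limb motion track information of the wearer."""
    return sensor.capture_image(), sensor.capture_limb_track()


def motion_characteristics(image_info, limb_track):
    """Reduce the raw data to motion characteristic information (here: net displacement)."""
    dx = limb_track[-1][0] - limb_track[0][0]
    dy = limb_track[-1][1] - limb_track[0][1]
    return {"dx": dx, "dy": dy, "frame": image_info}


def to_instruction(characteristics):
    """Map motion characteristic information to an instruction for the terminal."""
    if characteristics["dx"] > 0.5:      # assumed threshold for a rightward swipe
        return {"action": "next_item"}
    if characteristics["dx"] < -0.5:
        return {"action": "previous_item"}
    return {"action": "noop"}


if __name__ == "__main__":
    connection = P2PConnection()              # pre-established connection (claims 5 and 10)
    image, track = collect(FakeSensor())      # collecting unit (claims 3 and 8)
    instruction = to_instruction(motion_characteristics(image, track))
    connection.send(json.dumps(instruction))  # sending unit

The motion-track-to-animation path of claims 2 and 7 would follow the same shape, taking the terminal's motion track information, the preset information of the corresponding virtual object and the preset second special effect information as input, and producing three-dimensional animation information of the virtual object as output.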
CN201610810917.7A 2016-09-08 2016-09-08 Virtual reality device and terminal interaction method and apparatus Pending CN106445121A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610810917.7A CN106445121A (en) 2016-09-08 2016-09-08 Virtual reality device and terminal interaction method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610810917.7A CN106445121A (en) 2016-09-08 2016-09-08 Virtual reality device and terminal interaction method and apparatus

Publications (1)

Publication Number Publication Date
CN106445121A true CN106445121A (en) 2017-02-22

Family

ID=58164235

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610810917.7A Pending CN106445121A (en) 2016-09-08 2016-09-08 Virtual reality device and terminal interaction method and apparatus

Country Status (1)

Country Link
CN (1) CN106445121A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103425272A (en) * 2013-09-02 2013-12-04 福州大学 Method for controlling movement of computer three-dimensional dummy object by smartphone
EP2889718A1 (en) * 2013-12-30 2015-07-01 Samsung Electronics Co., Ltd A natural input based virtual ui system for electronic devices
CN104915979A (en) * 2014-03-10 2015-09-16 苏州天魂网络科技有限公司 System capable of realizing immersive virtual reality across mobile platforms
CN205210819U (en) * 2015-11-06 2016-05-04 深圳信息职业技术学院 Virtual reality human-computer interaction terminal
CN105652442A (en) * 2015-12-31 2016-06-08 北京小鸟看看科技有限公司 Head-mounted display equipment and interaction method for head-mounted display equipment and intelligent terminal

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
喻晓和 (Yu Xiaohe), ed.: 《虚拟现实技术基础教程》 (Fundamentals of Virtual Reality Technology), Beijing: Tsinghua University Press, 30 June 2015 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107318029A (en) * 2017-06-14 2017-11-03 北京易威科技有限公司 Method for switching a corresponding virtual reality scene according to video content in a VR environment
CN108989784A (en) * 2018-07-09 2018-12-11 歌尔科技有限公司 Image display method, apparatus, device and storage medium of a virtual reality device
CN110691279A (en) * 2019-08-13 2020-01-14 北京达佳互联信息技术有限公司 Virtual live broadcast method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
JP6700463B2 (en) Filtering and parental control methods for limiting visual effects on head mounted displays
US10089793B2 (en) Systems and methods for providing real-time composite video from multiple source devices featuring augmented reality elements
US20180373413A1 (en) Information processing method and apparatus, and program for executing the information processing method on computer
US9626103B2 (en) Systems and methods for identifying media portions of interest
CN105358227B Sharing a 3D gaming process
CN106095235B Control method and device based on virtual reality
KR101907136B1 (en) System and method for avatar service through cable and wireless web
KR101951761B1 (en) System and method for providing avatar in service provided in mobile environment
CN112198959A (en) Virtual reality interaction method, device and system
CN107147941A Barrage display method and device for video playback, and computer-readable recording medium
CN107029429A System and method for implementing time-shifted tutoring of a cloud gaming system
CN107683449A Controlling personal space content presented via a head mounted display
CN109069934A Spectator view tracking of a VR user in a virtual reality (VR) environment
JP6750046B2 (en) Information processing apparatus and information processing method
CN110050290A (en) Virtual reality experience is shared
US20190155484A1 (en) Method and apparatus for controlling wallpaper, electronic device and storage medium
KR101792715B1 (en) Exercise management system simulating real environment
US20180169517A1 (en) Reactive animation for virtual reality
CN106575163A (en) Feedback provision method, system, and analysis device
CN108200269A (en) Display screen control management method, terminal and computer readable storage medium
CN110393008A Album generating apparatus, album generating system and album generating method
CN106445121A (en) Virtual reality device and terminal interaction method and apparatus
CN108416832A Media information display method and device, and storage medium
US20160179206A1 (en) Wearable interactive display system
CN112121406A (en) Object control method and device, storage medium and electronic device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20170222