CN113282141A - Wearable portable computer and teaching platform based on mixed virtual reality - Google Patents

Wearable portable computer and teaching platform based on mixed virtual reality

Info

Publication number
CN113282141A
Authority
CN
China
Prior art keywords
computer
assembly
sensor
content
backpack
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110602374.0A
Other languages
Chinese (zh)
Inventor
吴慧欣
陈继坤
彭锋
杨梦凡
彭馨予
宋文辉
刘孟轩
宋宗珀
安丽鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Zhongmeng Electronic Technology Co ltd
North China University of Water Resources and Electric Power
Original Assignee
Henan Zhongmeng Electronic Technology Co ltd
North China University of Water Resources and Electric Power
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Zhongmeng Electronic Technology Co ltd and North China University of Water Resources and Electric Power
Priority to CN202110602374.0A
Publication of CN113282141A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/163 - Wearable computers, e.g. on a belt
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 - Head tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/006 - Mixed reality
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 - Electrically-operated educational appliances
    • G09B5/06 - Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G09B5/065 - Combinations of audio and video presentations, e.g. videotapes, videodiscs, television systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a wearable portable computer based on mixed virtual reality. The wearable portable computer comprises a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both communicatively connected to the backpack computer. The backpack computer is a computer carried in the form of a backpack, and a battery, a host computer and cooling equipment are arranged inside the backpack; the battery supplies power to the host computer and the cooling equipment, and the host computer is provided with an equipment interconnection interface. The head-mounted display is connected to the equipment interconnection interface and provides mixed-reality visuals to the user and surrounding observers. The leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface. The invention offers better compatibility with and expandability for peripheral devices and can be applied to teaching scenarios.

Description

Wearable portable computer and teaching platform based on mixed virtual reality
Technical Field
The invention relates to the technical field of wearable equipment, in particular to a wearable portable computer and a teaching platform based on mixed virtual reality.
Background
With the development of screen display technology, more and more display technologies are available; the recently emerged semi-transparent LED display technology and portable projection technology have enriched the ways display devices can be realised and laid a foundation for related applications. However, common AR and gaming devices today, such as handheld game consoles, mobile phones, tablet computers and wearable devices, still interact with users through conventional portable displays, which has clear limitations: a head-mounted display occupies the user's field of view and interferes with observing the real world; wearable computers cannot switch usage scenarios quickly; portable head-mounted displays are not compatible with connections to AR devices, mobile phones and game consoles; portable head-mounted displays have poor functional expandability; existing partially transparent panels cannot adjust their transparency and perform poorly in certain scenarios; and there is no wearable device or mixed-reality application designed for scenes such as classroom teaching.
Disclosure of Invention
To address the poor compatibility and poor expandability of existing wearable devices when interacting with users, the invention provides a wearable portable computer and a teaching platform based on mixed virtual reality.
In one aspect, the invention provides a wearable portable computer based on mixed virtual reality, comprising a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both communicatively connected to the backpack computer;
the backpack computer is a computer carried in the form of a backpack, and a battery, a host computer and cooling equipment are arranged inside the backpack; the battery supplies power to the host computer and the cooling equipment; the host computer is provided with an equipment interconnection interface;
the head-mounted display is connected to the equipment interconnection interface and is used to provide mixed-reality visuals to the user and surrounding observers;
the leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface, and the wireless sensing handle is used for interacting with the backpack computer; the leg sensor and the foot sensor acquire the motion postures and motion states of the corresponding body parts so that the host computer can be controlled according to those postures and states.
Furthermore, a power interface is arranged inside the backpack; the power interface is used to connect an external power supply, which charges the battery.
Furthermore, the head-mounted display comprises a helmet together with a visual interaction unit, a sound interaction unit, a sensor unit and a power supply unit, all attached to a USB bus;
the visual interaction unit comprises an adjustable transparent screen assembly and a projection display assembly, both connected to the equipment interconnection interface through a cable or a back plate;
the sound interaction unit comprises an audio input module, an audio output module, a microphone array, an earphone interface and a wireless audio interface; the microphone array comprises a speech microphone and an ambient microphone; the audio input module processes the sound captured by the microphone array and transmits the processed data to the backpack computer through the equipment interconnection interface; the audio output module decodes the audio signal received with the video stream and transmits it to the earphone interface and/or the wireless audio interface;
the sensor unit comprises a GPS module, an attitude sensor, a motion sensor, an environment sensor, a wearing sensor, a biosensor and a camera;
the power supply unit comprises a power supply interface used for connecting an external power supply.
Further, the adjustable transparent screen assembly includes a display assembly, a display assembly holder and an optical assembly; the display assembly and the optical assembly are connected to the helmet by the display assembly holder;
the optical assembly adjusts the optical path from the display assembly to the observer;
the adjustable transparent screen assembly has a stowed state and a use state; in the stowed state the assembly is automatically switched off, and in the use state the observer either views the screen content through the display assembly or observes the real world through the transparent screen.
Further, the projection display assembly comprises a projection assembly and a stabilizer assembly; the projection assembly is connected to the helmet through the stabilizer assembly; the projection assembly comprises a laser ranging sensor and an anti-shake lens; the stabilizer assembly includes a roll stabilizer and a pitch stabilizer.
In another aspect, the invention provides a teaching platform for the wearable portable computer based on mixed virtual reality, comprising the wearable portable computer, a cloud platform and a local platform;
the cloud platform comprises a user management unit, an application/content mall unit and a cloud service unit; the user management unit manages user accounts, devices, subscriptions and purchases; the application/content mall unit lets content producers publish content and lets content consumers purchase, download and subscribe to related content; the cloud service unit provides friend-communication functions, content-related communities, an online platform between devices, and SDK support for applications and content;
the local platform comprises a multi-user management unit, an application and content management unit, a storage management unit and a device management unit; the multi-user management unit gives each user access to his or her own cloud services, provides personalised settings and manages the digital licences each user holds; the application and content management unit manages the content and resources on the local platform and supports adding, deleting and running the related resources; the storage management unit manages each storage medium of the local platform, reads the content in a storage medium together with the authorization bundled with the card, and manages the user data on the medium; the device management unit configures the hardware drivers and hardware-related functions of the device so that the device operates as the user expects.
The beneficial effects of the invention are as follows:
the adjustable transparent screen assembly and the projection display assembly in the head-mounted display provide visual interaction for both the user and observers; the adjustable transparent screen assembly uses electrochromic technology, so that the transmittance of part of the transparent screen can be adjusted and switching between AR and VR is realised; a USB bus provides a unified interconnection and expansion scheme for the components and peripheral devices; the GPS module, camera, motion sensor, environment sensor and other sensors give the user a better AR and gaming experience; in addition, the invention can be applied to teaching scenarios such as classrooms.
Drawings
Fig. 1 is a block diagram of a wearable portable computer based on mixed virtual reality according to an embodiment of the present invention;
Fig. 2 is a block diagram of a head-mounted display according to an embodiment of the present invention;
Fig. 3 is a block diagram of a sound interaction unit according to an embodiment of the present invention;
Fig. 4 is a block diagram of a sensor unit according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of the PD chain power supply according to an embodiment of the present invention;
Fig. 6 is a schematic structural diagram of an adjustable transparent screen assembly according to an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a projection display assembly according to an embodiment of the present invention;
Fig. 8 is a functional structure diagram of a cloud platform according to an embodiment of the present invention;
Fig. 9 is a functional structure diagram of a local platform according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in Fig. 1, an embodiment of the present invention provides a wearable portable computer based on mixed virtual reality, comprising a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both communicatively connected to the backpack computer.
The backpack computer is a computer carried in the form of a backpack; a battery, a host computer and cooling equipment are arranged inside the backpack. The battery supplies power to the host computer and the cooling equipment, and the host computer is provided with an equipment interconnection interface.
As one implementation, a power interface may further be provided inside the backpack for connecting an external power supply, which charges the battery. When the battery is fully charged or reaches a preset level, charging stops and the external power supply powers the system; when the external power supply is removed, the system switches back to battery power. The cooling equipment lowers the temperature of the battery, the computer and its related modules to keep the device operating normally. The equipment interconnection interface supplies power to and communicates with peripheral devices; the backpack computer can identify a device according to the USB protocol and use the functions the peripheral provides. The backpack computer also comprises a wireless communication module for connecting to the Internet, other equipment and human-computer interaction devices, and the user controls the computer's functions through the human-computer interaction devices or the peripherals.
The head-mounted display is connected to the equipment interconnection interface and is used to provide mixed-reality visuals to the user and surrounding observers.
The leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface; the wireless sensing handle is used for interacting with the backpack computer, while the leg sensor and the foot sensor acquire the motion postures and motion states of the corresponding body parts so that the host computer can be controlled according to those postures and states.
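As a rough illustration of how leg and foot sensor readings could be mapped to control events, the following Python sketch classifies a coarse lower-body motion state and translates it into an input event; the event names, thresholds and the simple step/kick heuristics are the editor's assumptions and are not taken from the patent.

```python
# Illustrative sketch only: mapping leg/foot sensor readings to control events.

from dataclasses import dataclass

@dataclass
class LimbSample:
    pitch_deg: float      # limb pitch angle from the attitude sensor
    accel_up_ms2: float   # vertical acceleration from the motion sensor

def classify_motion(leg: LimbSample, foot: LimbSample) -> str:
    """Return a coarse motion state for the lower body (hypothetical heuristics)."""
    if foot.accel_up_ms2 > 12.0 and leg.pitch_deg > 30.0:
        return "kick"
    if abs(foot.accel_up_ms2 - 9.8) > 1.5:
        return "step"
    return "idle"

def to_control_event(state: str) -> str | None:
    """Translate a motion state into an input event for the running application."""
    return {"step": "MOVE_FORWARD", "kick": "ACTION_PRIMARY"}.get(state)

event = to_control_event(classify_motion(LimbSample(35.0, 9.8), LimbSample(10.0, 13.2)))
print(event)  # ACTION_PRIMARY
```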
Through the head-mounted display, the wearable portable computer based on mixed virtual reality can thus provide mixed-reality visuals to the user and surrounding observers.
On the basis of the above embodiment, in one implementation the head-mounted display includes a helmet together with a visual interaction unit, a sound interaction unit, a sensor unit and a power supply unit, all attached to a USB bus.
Specifically, as shown in Fig. 2, the components of the different units may adopt the Universal Serial Bus and hang off a shared USB bus.
The visual interaction unit comprises an adjustable transparent screen assembly and a projection display assembly, both connected to the equipment interconnection interface through a cable or a back plate.
In particular, the adjustable transparent screen assembly is used for the user's visual interaction and lets the user observe the real environment as much as possible when nothing is being displayed; the projection display assembly, on the premise of ensuring safety, projects the display content onto the ground for the user and surrounding observers to watch.
As shown in Fig. 3, the sound interaction unit includes an audio input module, an audio output module, a microphone array, an earphone interface and a wireless audio interface. The microphone array comprises a speech microphone and an ambient microphone. The audio input module processes the sound captured by the microphone array and transmits the processed data to the backpack computer through the equipment interconnection interface; the audio output module decodes the audio signal received with the video stream and passes it to the earphone interface and/or the wireless audio interface.
Specifically, the sound interaction unit handles voice and auditory interaction: it can let the user hear multimedia sound while still hearing the environment, or isolate and mute outside sound to increase immersion, and its microphones collect environmental sound as well as the user's speech and commands. The audio input module processes sound as required, applying active noise reduction, audio amplification, background-sound removal, speech enhancement, channel mixing, in-ear monitoring and noise measurement, and then encodes it into a format the equipment can recognise. The earphone interface is used to connect earphones, and the wireless audio interface connects wireless audio equipment; the wireless audio interface can transmit sound to a playback device via Bluetooth, frequency modulation, amplitude modulation and the like.
The sensor unit comprises a GPS module, an attitude sensor, a motion sensor, an environment sensor, a wearing sensor, a biosensor and a camera.
Specifically, the design of the sensor unit is shown in Fig. 4. The GPS module obtains the user's geographical position via the global positioning system so that location-based applications can obtain more accurate information. The motion and attitude sensors detect the user's movement and posture and provide the corresponding sensor data to applications. The environment sensor enhances the user's perception by detecting the state of the surrounding environment and supports related applications. The wearing sensor detects whether the device is being worn and controls the operating mode of the equipment. The biosensor detects the user's physiological condition and supports related applications. The number of cameras can be configured as required; for example, the system may include a main-view camera, an auxiliary camera and an environment camera, where the main-view camera is used for recording and AR applications, the auxiliary camera assists imaging or cooperates with other cameras for specific applications, and the environment camera enhances the user's perception by detecting the surrounding environment and supports related applications.
The power supply unit comprises a power supply interface used for connecting an external power supply.
Specifically, in one implementation, power is supplied using a PD chain scheme, as shown in Fig. 5. The external power supply powers the mobile power bank via PD, the mobile power bank powers the core main device, and the core main device in turn powers the peripheral devices via PD. Each device contains a power management chip that converts the PD supply into the power specifications required by its parts and then powers those parts. The user can adjust the chain as required, changing the connections and the primary/secondary relationships of the parts in the power supply chain.
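The following toy Python model sketches the PD chain idea, with each link's power-management stage converting the negotiated PD bus voltage to the rails its own parts need; the bus voltage, converter efficiency, device names and rail figures are all assumed for illustration and do not come from the patent.

```python
# Illustrative sketch only: a toy PD power-chain load estimate.

PD_BUS_VOLTAGE = 20.0           # volts negotiated on the PD link (assumed)
CONVERTER_EFFICIENCY = 0.90     # assumed DC-DC efficiency per power-management chip

def bus_current_for_rail(rail_voltage: float, rail_current_a: float) -> float:
    """Current drawn from the PD bus to feed one internal rail."""
    return (rail_voltage * rail_current_a) / (PD_BUS_VOLTAGE * CONVERTER_EFFICIENCY)

# Power chain, in order; each entry lists the internal rails (voltage, current) it feeds.
chain = [
    ("core_device", [(12.0, 3.0), (5.0, 2.0)]),
    ("head_mounted_display", [(5.0, 1.5), (3.3, 0.8)]),
    ("wireless_handle", [(3.3, 0.3)]),
]

total_bus_current = sum(
    bus_current_for_rail(v, i) for _, rails in chain for v, i in rails
)
print(f"PD bus load: {total_bus_current:.2f} A at {PD_BUS_VOLTAGE:.0f} V")
```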
Structurally, on the basis of the above embodiments and as shown in Fig. 6, the adjustable transparent screen assembly in the embodiment of the present invention includes a display assembly 11 (for example, an adjustable transparent screen using electrochromic technology), a display assembly holder 12 and an optical assembly 13. The display assembly 11 and the optical assembly 13 are connected to the helmet 14 through the display assembly holder 12 (for example, via a connector to a rotating shaft of the helmet 14), and the optical assembly 13 adjusts the optical path from the display assembly to the observer. The adjustable transparent screen assembly has a stowed state and a use state: in the stowed state the assembly is automatically switched off, and in the use state the observer either views the screen content through the display assembly or observes the real world through the transparent screen.
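A minimal Python state sketch of the stowed/use behaviour and the electrochromic AR/VR switch described above follows; the specific transmittance values (0.7 for a see-through AR-like mode, 0.05 for a near-opaque VR-like mode) are assumptions chosen only to illustrate the idea.

```python
# Illustrative sketch only: states of the adjustable transparent screen assembly.

class AdjustableTransparentScreen:
    def __init__(self):
        self.state = "stowed"          # "stowed" or "in_use"
        self.transmittance = 0.0       # 0.0 = opaque, 1.0 = fully transparent
        self.display_on = False

    def stow(self):
        # In the stowed state the assembly is automatically switched off.
        self.state, self.display_on, self.transmittance = "stowed", False, 0.0

    def deploy(self, mode: str):
        self.state, self.display_on = "in_use", True
        # Electrochromic layer: high transmittance for AR, near-opaque for VR.
        self.transmittance = 0.7 if mode == "AR" else 0.05

screen = AdjustableTransparentScreen()
screen.deploy("AR")
print(screen.state, screen.transmittance)   # in_use 0.7
screen.deploy("VR")
print(screen.transmittance)                 # 0.05
screen.stow()
```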
Structurally, on the basis of the above embodiments and as shown in Fig. 7, the projection display assembly in the embodiment of the present invention includes a projection assembly 21 and a stabilizer assembly 22. The projection assembly 21 is connected to the helmet 14 through the stabilizer assembly 22; the projection assembly 21 comprises a laser ranging sensor 211 and an anti-shake lens 212, and the stabilizer assembly 22 includes a roll stabilizer 221 and a pitch stabilizer 222.
Specifically, the projection assembly is connected to the helmet through the stabilizer assembly, and the connection method can be chosen as required. The stabilizer keeps the projected image stable as the equipment requires, and the combination and mounting of the stabilizers can likewise be chosen as required. The laser ranging sensor is used for automatic focusing of the projected image, and the anti-shake lens focuses automatically and keeps the picture stable within a certain range to prevent shaking.
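To show how a laser-ranging distance reading could drive autofocus, the following Python sketch applies the standard thin-lens relation 1/f = 1/d_object + 1/d_image and converts the result into a focus-motor offset; the focal length, motor resolution and the distance-to-steps mapping are the editor's assumptions.

```python
# Illustrative sketch only: distance-driven autofocus using the thin-lens relation.

FOCAL_LENGTH_MM = 20.0        # assumed projection lens focal length
STEPS_PER_MM = 200            # assumed focus-motor resolution

def image_distance_mm(object_distance_mm: float) -> float:
    """Lens-to-imager distance that focuses a plane at the measured distance."""
    return 1.0 / (1.0 / FOCAL_LENGTH_MM - 1.0 / object_distance_mm)

def focus_motor_steps(measured_distance_mm: float) -> int:
    di = image_distance_mm(measured_distance_mm)
    # Offset from the infinity-focus position (d_image == focal length).
    return round((di - FOCAL_LENGTH_MM) * STEPS_PER_MM)

# Laser ranging sensor reports the ground is 1.8 m away:
print(focus_motor_steps(1800.0))   # ~45 steps from infinity focus
```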
The embodiment of the invention further provides a teaching platform for the wearable portable computer based on mixed virtual reality, comprising the wearable portable computer of the above embodiments, a cloud platform and a local platform.
As shown in Fig. 8, the cloud platform includes a user management unit, an application/content mall unit and a cloud service unit. The user management unit manages user accounts, devices, subscriptions and purchases; the application/content mall unit lets content producers publish content and lets content consumers purchase, download and subscribe to related content; the cloud service unit provides friend-communication functions, content-related communities, an online platform between devices, and SDK support for applications and content.
As shown in Fig. 9, the local platform includes a multi-user management unit, an application and content management unit, a storage management unit and a device management unit. The multi-user management unit gives each user access to his or her own cloud services, provides personalised settings and manages the digital licences each user holds; the application and content management unit manages the content and resources on the local platform and supports adding, deleting and running the related resources; the storage management unit manages each storage medium of the local platform, reads the content in a storage medium together with the authorization bundled with the card, and manages the user data on the medium; the device management unit configures the hardware drivers and hardware-related functions of the device so that the device operates as the user expects.
Specifically, the mixed virtual reality is realised by the retractable adjustable transparent screen assembly: the user sees the mixed-reality content through the panel, while surrounding observers see the user's main field of view through the portable projection. By carrying the wearable portable computer, the user can take mixed reality anywhere and use the computer as needed in playgrounds, classrooms and other places. The wearable portable computer can also transmit its picture to a remote screen, for example by wireless screen casting, to display the user's field of view remotely, and it can obtain mixed-reality applications either from the application store or in the form of an application card.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (6)

1. A wearable portable computer based on mixed virtual reality, characterised by comprising a backpack computer, a head-mounted display, a leg sensor, a foot sensor and a wireless sensing handle; the head-mounted display and the wireless sensing handle are both communicatively connected to the backpack computer;
the backpack computer is a computer carried in the form of a backpack, and a battery, a host computer and cooling equipment are arranged inside the backpack; the battery supplies power to the host computer and the cooling equipment; the host computer is provided with an equipment interconnection interface;
the head-mounted display is connected to the equipment interconnection interface and is used to provide mixed-reality visuals to the user and surrounding observers;
the leg sensor, the foot sensor and the wireless sensing handle are connected to the equipment interconnection interface, and the wireless sensing handle is used for interacting with the backpack computer; the leg sensor and the foot sensor acquire the motion postures and motion states of the corresponding body parts so that the host computer can be controlled according to those postures and states.
2. The wearable portable computer of claim 1, wherein a power interface is further arranged inside the backpack, the power interface being used to connect an external power supply through which the battery is charged.
3. The wearable portable computer of claim 1, wherein the head-mounted display comprises a helmet together with a visual interaction unit, a sound interaction unit, a sensor unit and a power supply unit, all attached to a USB bus;
the visual interaction unit comprises an adjustable transparent screen assembly and a projection display assembly, both connected to the equipment interconnection interface through a cable or a back plate;
the sound interaction unit comprises an audio input module, an audio output module, a microphone array, an earphone interface and a wireless audio interface; the microphone array comprises a speech microphone and an ambient microphone; the audio input module processes the sound captured by the microphone array and transmits the processed data to the backpack computer through the equipment interconnection interface; the audio output module decodes the audio signal received with the video stream and transmits it to the earphone interface and/or the wireless audio interface;
the sensor unit comprises a GPS module, an attitude sensor, a motion sensor, an environment sensor, a wearing sensor, a biosensor and a camera;
the power supply unit comprises a power supply interface used for connecting an external power supply.
4. The wearable portable computer of claim 3, wherein the adjustable transparent screen assembly comprises a display assembly, a display assembly holder and an optical assembly; the display assembly and the optical assembly are connected to the helmet by the display assembly holder;
the optical assembly adjusts the optical path from the display assembly to the observer;
the adjustable transparent screen assembly has a stowed state and a use state; in the stowed state the assembly is automatically switched off, and in the use state the observer either views the screen content through the display assembly or observes the real world through the transparent screen.
5. The wearable portable computer of claim 3, wherein the projection display assembly comprises a projection assembly and a stabilizer assembly; the projection assembly is connected to the helmet through the stabilizer assembly; the projection assembly comprises a laser ranging sensor and an anti-shake lens; and the stabilizer assembly includes a roll stabilizer and a pitch stabilizer.
6. A teaching platform of a wearable portable computer based on mixed virtual reality, characterised by comprising the wearable portable computer of any one of claims 1 to 5, a cloud platform and a local platform;
the cloud platform comprises a user management unit, an application/content mall unit and a cloud service unit; the user management unit manages user accounts, devices, subscriptions and purchases; the application/content mall unit lets content producers publish content and lets content consumers purchase, download and subscribe to related content; the cloud service unit provides friend-communication functions, content-related communities, an online platform between devices, and SDK support for applications and content;
the local platform comprises a multi-user management unit, an application and content management unit, a storage management unit and a device management unit; the multi-user management unit gives each user access to his or her own cloud services, provides personalised settings and manages the digital licences each user holds; the application and content management unit manages the content and resources on the local platform and supports adding, deleting and running the related resources; the storage management unit manages each storage medium of the local platform, reads the content in a storage medium together with the authorization bundled with the card, and manages the user data on the medium; the device management unit configures the hardware drivers and hardware-related functions of the device so that the device operates as the user expects.
CN202110602374.0A, priority date 2021-05-31, filing date 2021-05-31: Wearable portable computer and teaching platform based on mixed virtual reality, published as CN113282141A (pending)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202110602374.0A | 2021-05-31 | 2021-05-31 | Wearable portable computer and teaching platform based on mixed virtual reality (published as CN113282141A)

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202110602374.0A | 2021-05-31 | 2021-05-31 | Wearable portable computer and teaching platform based on mixed virtual reality (published as CN113282141A)

Publications (1)

Publication Number | Publication Date
CN113282141A | 2021-08-20

Family

ID=77282858

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110602374.0A Pending CN113282141A (en) 2021-05-31 2021-05-31 Wearable portable computer and teaching platform based on mix virtual reality

Country Status (1)

Country Link
CN (1) CN113282141A (en)

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102906623A (en) * 2010-02-28 2013-01-30 奥斯特豪特集团有限公司 Local advertising content on an interactive head-mounted eyepiece
CN104932679A (en) * 2014-03-21 2015-09-23 三星电子株式会社 Wearable device and method of operating the same
CN106662747A (en) * 2014-08-21 2017-05-10 微软技术许可有限责任公司 Head-mounted display with electrochromic dimming module for augmented and virtual reality perception
CN204695231U (en) * 2015-06-18 2015-10-07 陈会兵 Portable helmet immersion systems
US20180011682A1 (en) * 2016-07-06 2018-01-11 Bragi GmbH Variable computing engine for interactive media based upon user biometrics
CN107783639A (en) * 2016-08-24 2018-03-09 南京乐朋电子科技有限公司 Virtual reality leisure learning system
CN108288242A (en) * 2018-01-31 2018-07-17 上海维拓网络科技有限公司 Teaching controlling platform and control method based on virtual reality engine technology
CN111352239A (en) * 2018-12-22 2020-06-30 杭州融梦智能科技有限公司 Augmented reality display device and interaction method applying same
CN109932054A (en) * 2019-04-24 2019-06-25 北京耘科科技有限公司 Wearable Acoustic detection identifying system
CN211786373U (en) * 2020-03-30 2020-10-27 哈雷医用(广州)智能技术有限公司 Portable AR wears display device
CN112051895A (en) * 2020-09-03 2020-12-08 西安戴森电子技术有限公司 Wearable solid state computer

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114360322A (en) * 2021-12-08 2022-04-15 江西中船航海仪器有限公司 Portable navigation sextant simulator
CN114360322B (en) * 2021-12-08 2023-02-17 江西中船航海仪器有限公司 Portable navigation sextant simulator

Similar Documents

Publication Publication Date Title
EP3862845B1 (en) Method for controlling display screen according to eyeball focus and head-mounted electronic equipment
CN110830811B (en) Live broadcast interaction method, device, system, terminal and storage medium
US10176783B2 (en) Interactive wearable and portable smart devices
CN109600678B (en) Information display method, device and system, server, terminal and storage medium
RU2670784C9 (en) Orientation and visualization of virtual object
US9618747B2 (en) Head mounted display for viewing and creating a media file including omnidirectional image data and corresponding audio data
US9341866B2 (en) Spectacles having a built-in computer
CN109920065B (en) Information display method, device, equipment and storage medium
US8400519B2 (en) Mobile terminal and method of controlling the operation of the mobile terminal
CN109729372B (en) Live broadcast room switching method, device, terminal, server and storage medium
KR20140052294A (en) Method for providing user with virtual image in head-mounted display device, machine-readable storage medium and head-mounted display device
CN112965683A (en) Volume adjusting method and device, electronic equipment and medium
CN111031170A (en) Method, apparatus, electronic device and medium for selecting communication mode
CN113395566B (en) Video playing method and device, electronic equipment and computer readable storage medium
US20220404631A1 (en) Display method, electronic device, and system
CN106020459B (en) Intelligent glasses, and control method and control system of intelligent glasses
US20100283711A1 (en) An integrated computation and communication system, a framed interface therefor and a method of operating thereof
CN113282141A (en) Wearable portable computer and teaching platform based on mix virtual reality
CN112770177A (en) Multimedia file generation method, multimedia file release method and device
CN213876195U (en) Glasses frame and intelligent navigation glasses
CN106020458B (en) Intelligent glasses, and control method and control system of intelligent glasses
KR20170046947A (en) Mobile terminal and method for controlling the same
CN111343449A (en) Augmented reality-based display method and intelligent wearable device
KR20190070899A (en) Method for providing user with virtual image in head-mounted display device, machine-readable storage medium and head-mounted display device
Butow et al. Google glass for dummies

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination