CN212256620U - Intelligent laboratory - Google Patents

Intelligent laboratory

Info

Publication number
CN212256620U
CN212256620U (application CN202020176930.3U)
Authority
CN
China
Prior art keywords
naked eye
imaging area
imaging
smell
experiment table
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202020176930.3U
Other languages
Chinese (zh)
Inventor
孔令阔 (Kong Lingkuo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Pupil Beijing Technology Co ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN202020176930.3U
Application granted
Publication of CN212256620U

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an intelligent laboratory and relates to experimental equipment. The intelligent laboratory comprises an experiment table, an imaging area, a display, a naked-eye 3D imaging plate and a somatosensory camera. The application has a simple structure and low cost, can reduce the laboratory construction investment of schools and educational institutions by more than 50 percent, greatly increases the utilization of funds in the education sector and meets the urgent needs of education providers; at the same time, it can genuinely bring the science laboratory into the home and take science education to ordinary households. The application performs gesture recognition through the somatosensory camera and the host, enabling two-handed simulated operation, which enhances both the practicality of teaching and the somatosensory experience.

Description

Intelligent laboratory
Technical Field
The application relates to an experimental facility, in particular to an intelligent virtual simulation laboratory.
Background
At present, China faces problems such as an extreme shortage of science teachers, a serious shortage of science experiment equipment, uneven science education content and crude experimental facilities. Building a science laboratory often costs millions, which leaves education authorities unable to afford it and hinders the popularization and development of science education. At the same time, people also wish to bring the science laboratory into their homes. In addition, with the progress of science and technology, expectations for the experience of teaching experiments keep rising.
Therefore, how to genuinely bring the science laboratory into the home and take science education to ordinary households, while reducing the laboratory investment of schools and educational institutions and improving the experience of teaching experiments, is a problem that urgently needs to be solved.
SUMMARY OF THE UTILITY MODEL
It is an object of the present application to overcome, or at least partially solve or mitigate, the above problems.
The application provides an intelligent laboratory for performing virtual experiment simulation operations, comprising:
an experiment table, located at the bottom and used for supporting the other parts of the intelligent laboratory, the experiment table being provided with an embedded host for controlling naked-eye 3D images in the experiment table, and with an exposed operation panel that serves as an operation interface, is connected with the host and is used for selecting the image content to be displayed;
an imaging area, which is a square space formed by four rods and used for displaying the projection, the four rods corresponding to the four vertices of a square outline in the top surface of the experiment table and being perpendicular to and fixedly connected with the top surface;
a display, arranged at the imaging area and used for projecting the image to be displayed by the host onto the naked-eye 3D imaging plate;
the naked-eye 3D imaging plate, obliquely inserted into the imaging area and used for presenting a naked-eye 3D picture;
a somatosensory camera, arranged at the intelligent laboratory and connected with the host, used for collecting the gesture actions of an experimenter and transmitting them to the host, which performs gesture recognition so as to control the naked-eye 3D images in the experiment table.
Optionally, the intelligent laboratory further comprises a top cover, which covers and is fixedly connected with the top ends of the four rods;
the top surface of the front end of the experiment table is provided with an inclined surface extending downwards from the top surface, and the operation panel is located at the inclined surface;
there is one naked-eye 3D imaging plate, one end of which is fixed at the bottom of the front end of the imaging area and the other end of which is fixed at the top of the imaging area near the rear end, the angle between the naked-eye 3D imaging plate and the top surface of the experiment table being 45 degrees so as to form an upright Z-shaped structure with the imaging area;
the display is arranged at the bottom surface of the top cover with the screen facing downwards;
the somatosensory camera is arranged at the top cover.
Optionally, a transparent film is disposed at the front end of the imaging area, and non-transparent films are disposed at the rear end, the left end and the right end of the imaging area, so as to enclose the imaging area.
Optionally, the intelligent laboratory further comprises a top cover, which covers and is fixedly connected with the top ends of the four rods;
the top surface of the front end of the experiment table is provided with an inclined surface extending downwards from the top surface, and the operation panel is located at the inclined surface;
there are three naked-eye 3D imaging plates, corresponding to the front end, the left end and the right end of the imaging area; each naked-eye 3D imaging plate is a regular triangle, the vertex of each plate is fixed at the top of the imaging area near the rear end, and the base of each plate is fixed at the bottom of the corresponding end of the imaging area, so as to form a regular pyramid structure with the imaging area;
the display is arranged at the bottom surface of the top cover with the screen facing downwards;
the somatosensory camera is arranged at the top cover.
Optionally, transparent films are disposed at the front end, the left end and the right end of the imaging area, and a non-transparent film is disposed at the rear end of the imaging area, so as to enclose the imaging area.
Optionally, the top surfaces of the four ends of the experiment table are provided with inclined surfaces extending downwards from the top surfaces; there are four operation panels, each located at the corresponding inclined surface;
there are four naked-eye 3D imaging plates, corresponding to the four ends of the imaging area; each naked-eye 3D imaging plate is a regular triangle, the vertex of each plate is fixed at the center of the top surface of the experiment table, and the base of each plate is fixed at the top of the corresponding end of the imaging area, so as to form an inverted pyramid structure with the imaging area;
the display is arranged at the top surface of the experiment table with the screen facing upwards;
there are four somatosensory cameras, arranged on the four inclined surfaces of the experiment table.
Optionally, transparent films are disposed at the four ends of the imaging area so as to enclose the imaging area.
Optionally, the intelligent laboratory further comprises a smell module connected with the host; the smell module contains a plurality of smell units, each smell unit corresponds to one smell and consists of solid smell particles, and the host calls the smell units for different smells to release the smell of the corresponding smell particles, so as to provide an olfactory experience.
Optionally, the intelligent laboratory further comprises a power line interface disposed at the experiment table.
The intelligent laboratory of the application comprises an experiment table, an imaging area, a display, a naked-eye 3D imaging plate and a somatosensory camera; it has a simple structure and low manufacturing cost, can reduce the laboratory construction investment of schools and educational institutions by more than 50 percent, greatly increases the utilization of funds in the education sector and meets the urgent needs of education providers; at the same time, it can genuinely bring the science laboratory into the home and take science education to ordinary households. The application can therefore address severe problems in the existing science education system, such as the extreme shortage of science teachers, the serious shortage of science experiment equipment, uneven science education content and crude experimental facilities. The application performs gesture recognition through the somatosensory camera and the host, enabling two-handed simulated operation, which enhances both the practicality of teaching and the somatosensory experience.
The above and other objects, advantages and features of the present application will become more apparent to those skilled in the art from the following detailed description of specific embodiments thereof, taken in conjunction with the accompanying drawings.
Drawings
Some specific embodiments of the present application will be described in detail hereinafter by way of illustration and not limitation with reference to the accompanying drawings. The same reference numbers in the drawings identify the same or similar elements or components. Those skilled in the art will appreciate that the drawings are not necessarily drawn to scale. In the drawings:
FIG. 1 is a schematic block diagram of an intelligent laboratory according to one embodiment of the present application;
FIG. 2 is a schematic front view of the intelligent laboratory of FIG. 1 with a scent module;
FIG. 3 is a schematic side view of the intelligent laboratory of FIG. 1 with a power cord interface;
FIG. 4 is a schematic rear view of FIG. 1;
FIG. 5 is a schematic block diagram of an intelligent laboratory according to another embodiment of the present application;
FIG. 6 is a schematic front view of the intelligent laboratory of FIG. 5 with a scent module;
FIG. 7 is a schematic side view of the intelligent laboratory of FIG. 5 with a power cord interface;
FIG. 8 is a schematic rear view of FIG. 5;
FIG. 9 is a schematic front view of an intelligent laboratory according to another embodiment of the present application;
fig. 10 is a schematic top view of fig. 9.
The symbols in the drawings represent the following meanings:
100 intelligent laboratory;
1 experiment table, 2 imaging area, 3 top cover, 4 naked-eye 3D imaging plate, 5 display, 6 somatosensory camera, 7 smell module, 8 power line interface;
11 operation panel.
Detailed Description
FIG. 1 is a schematic block diagram of an intelligent laboratory according to one embodiment of the present application. Fig. 2 is a schematic front view of the intelligent laboratory of fig. 1 provided with a scent module. Fig. 3 is a schematic side view of the intelligent laboratory of fig. 1 with a power cord interface. Fig. 4 is a schematic rear view of fig. 1. Fig. 5 is a schematic block diagram of an intelligent laboratory according to another embodiment of the present application. Fig. 6 is a schematic front view of the intelligent laboratory of fig. 5 provided with a scent module. Fig. 7 is a schematic side view of the intelligent laboratory of fig. 5 with a power cord interface. Fig. 8 is a schematic rear view of fig. 5. Fig. 9 is a schematic front view of an intelligent laboratory according to another embodiment of the present application. Fig. 10 is a schematic top view of fig. 9.
The orientations front, rear, top and bottom in this application are defined with reference to the drawings as seen by the reader.
As shown in fig. 1, and also referring to figs. 2-10, the present embodiment provides an intelligent laboratory 100 for performing virtual experiment simulation operations, which may generally include: an experiment table 1, an imaging area 2, a display 5, a naked-eye 3D imaging plate 4 and a somatosensory camera 6. The experiment table 1 is located at the bottom and supports the other parts of the intelligent laboratory 100, that is, all parts other than the experiment table 1 itself. The experiment table 1 is provided with an embedded host (not shown in the figures) for controlling naked-eye 3D images in the experiment table 1, and with an exposed operation panel 11 that serves as an operation interface, is connected with the host and is used for selecting the image content to be displayed. The imaging area 2 is a square space formed by four rods and is used for displaying the projection; the four rods correspond to the four vertices of a square outline in the top surface of the experiment table 1 and are perpendicular to and fixedly connected with the top surface. The display 5 is arranged at the imaging area 2 and projects the image to be displayed by the host onto the naked-eye 3D imaging plate 4. The naked-eye 3D imaging plate 4 is obliquely inserted into the imaging area 2 and presents a naked-eye 3D picture. The somatosensory camera 6 is arranged at the intelligent laboratory 100, is connected with the host, and collects the gesture actions of an experimenter and transmits them to the host, which performs gesture recognition so as to control the naked-eye 3D images in the experiment table 1.
In a specific implementation, the naked-eye 3D imaging plate 4 may be a white coated film with a transmittance of about 80% and a thickness of 2-3 mm.
Gesture recognition is prior art and comprises three steps: gesture segmentation, gesture analysis and gesture recognition. Gesture segmentation: the gesture motion is captured by an image acquisition device, such as the somatosensory camera 6 used in this application, to obtain a planar model of the gesture. Gesture analysis: because of the distinctive appearance of the hand, the hand shape is distinguished from other objects; the host uses a gesture recognition algorithm combining geometric moments and edge detection, setting weights for the two features and computing the distance between images so as to recognize letter gestures. Gesture recognition: the motion of a gesture is treated as a sequence of static gesture images, and the sequence to be recognized is compared with known gesture template sequences.
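A minimal sketch of the weighted combination of geometric moments and edge detection described above, assuming OpenCV; the Otsu thresholding, Canny parameters, feature weights and template-matching details are illustrative assumptions, not values taken from this application:

```python
# Sketch only: an assumed implementation of the moment + edge comparison,
# not the exact algorithm running on the host.
import cv2
import numpy as np

def gesture_features(gray):
    """Return (Hu moments, edge map) for a grayscale hand image."""
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    hu = cv2.HuMoments(cv2.moments(mask)).flatten()
    edges = cv2.Canny(gray, 50, 150)
    return hu, edges

def gesture_distance(img_a, img_b, w_moment=0.6, w_edge=0.4):
    """Weighted distance between two gesture images (weights are assumptions)."""
    hu_a, edges_a = gesture_features(img_a)
    hu_b, edges_b = gesture_features(img_b)
    edges_b = cv2.resize(edges_b, (edges_a.shape[1], edges_a.shape[0]))
    d_moment = float(np.linalg.norm(hu_a - hu_b))
    d_edge = float(np.mean(edges_a != edges_b))  # fraction of differing edge pixels
    return w_moment * d_moment + w_edge * d_edge

def recognize_sequence(frames, templates):
    """Match a sequence of static gesture frames against known template sequences.

    `templates` maps a gesture name to its (non-empty) list of template frames.
    """
    def seq_distance(seq, tmpl):
        n = min(len(seq), len(tmpl))
        return sum(gesture_distance(seq[i], tmpl[i]) for i in range(n)) / n
    return min(templates, key=lambda name: seq_distance(frames, templates[name]))
```

Hu moments are used here only as a common rotation- and scale-invariant form of geometric moments; any comparable moment set would fit the description equally well.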
Operating principle of the intelligent laboratory 100: referring to fig. 1, the host of the intelligent laboratory 100 is embedded in the experiment table 1 and is powered on by supplying power to the experiment table 1. The display 5 above the experiment table 1 projects the image to be displayed by the host onto the naked-eye 3D imaging plate 4, the naked-eye 3D imaging plate 4 presents a naked-eye 3D picture to the user, and the experimenter controls the naked-eye 3D images in the experiment table 1 through gesture changes to carry out the related experiments and learning, achieving a good application effect.
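The workflow just described can be summarized as a host-side loop. The following is a hypothetical sketch; the camera, recognizer, scene and renderer interfaces are placeholders for hardware and software that the application does not specify:

```python
# Hypothetical control loop illustrating the described workflow; the
# camera/recognizer/scene/renderer objects are placeholders, not APIs
# defined by this application.
import time

class IntelligentLabHost:
    def __init__(self, camera, recognizer, scene, renderer):
        self.camera = camera          # somatosensory camera 6 feed
        self.recognizer = recognizer  # gesture recognition (e.g. the sketch above)
        self.scene = scene            # current virtual-experiment state
        self.renderer = renderer      # display 5 projecting onto imaging plate 4

    def step(self):
        frame = self.camera.read()        # capture the experimenter's gesture
        gesture = self.recognizer(frame)  # classify it on the host
        if gesture is not None:
            self.scene.apply(gesture)     # update the experiment accordingly
        self.renderer.draw(self.scene)    # project the naked-eye 3D picture

    def run(self, fps=30):
        while True:                       # power on, then loop continuously
            self.step()
            time.sleep(1.0 / fps)
```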
Accordingly, the application has a simple structure and low cost, can reduce the laboratory construction investment of schools and educational institutions by more than 50 percent, greatly increases the utilization of funds in the education sector and meets the urgent needs of education providers; at the same time, it can genuinely bring the science laboratory into the home and take science education to ordinary households. The application can therefore address severe problems in the existing science education system, such as the extreme shortage of science teachers, the serious shortage of science experiment equipment, uneven science education content and crude experimental facilities.
In addition, experiments in a traditional laboratory carry safety hazards, and improper operation can physically injure the experimenter; for example, blowing out an alcohol lamp can cause an explosion. With this application, when the alcohol lamp is blown out, a realistic explosion scene can be simulated, so that the experimenter has a realistic visual experience; because the scene is simulated, no real explosion occurs and the experimenter suffers no physical injury. Therefore, while realistically presenting the experimental scene, the safety of the experimenter is guaranteed, relieving the worries of teachers and parents.
In addition, compared with control devices such as handheld controllers, which are easily lost and require frequent battery replacement, the application performs gesture recognition through the somatosensory camera 6 and the host arranged in the intelligent laboratory 100, so that two-handed simulated operation can be realized, enhancing both the practicality of teaching and the somatosensory experience; and because the somatosensory camera 6 and the host are both fixed in the intelligent laboratory 100, they cannot be lost and are centrally powered, which solves the problem of frequent battery replacement.
In this embodiment, as shown in figs. 1 to 4, the intelligent laboratory 100 further includes a top cover 3 covering and fixedly connected with the tops of the four rods. The top surface of the front end of the experiment table 1 is provided with an inclined surface extending downwards from the top surface, and the operation panel 11 is located at the inclined surface. There is one naked-eye 3D imaging plate 4: one end of the plate is fixed at the bottom of the front end of the imaging area 2, the other end is fixed at the top of the imaging area 2 near the rear end, and the angle between the naked-eye 3D imaging plate 4 and the top surface of the experiment table 1 is 45 degrees, so as to form an upright Z-shaped structure with the imaging area 2. In this example, a certain imaging space is also left behind the naked-eye 3D imaging plate 4, so that the imaging effect is better. The display 5 is arranged at the bottom surface of the top cover 3 with the screen facing downwards. The somatosensory camera 6 is arranged at the top cover 3.
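For reference, the role of the 45-degree mounting angle can be expressed with the law of reflection. This is an interpretation assuming the plate redirects the downward-projected picture toward a viewer in front of the experiment table 1; the application itself only specifies the angle:

```latex
% Assumed geometric reading of the 45-degree arrangement (not stated in the text).
\theta_{\text{incidence}} = \theta_{\text{reflection}} = 45^{\circ}
\quad\Longrightarrow\quad
\text{deviation} = 180^{\circ} - 2 \times 45^{\circ} = 90^{\circ}
% A ray projected straight down from the display 5 is turned through 90 degrees,
% so the picture appears inside the imaging area 2 to a viewer looking in
% horizontally through the transparent front film.
```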
More specifically, a transparent film is disposed at the front end of the imaging area 2, and non-transparent films are disposed at the rear end, the left end and the right end of the imaging area 2, so as to enclose the imaging area 2.
In another embodiment, as shown in figs. 5 to 8, the intelligent laboratory 100 further includes a top cover 3 covering and fixedly connected with the tops of the four rods. The top surface of the front end of the experiment table 1 is provided with an inclined surface extending downwards from the top surface, and the operation panel 11 is located at the inclined surface. There are three naked-eye 3D imaging plates 4, corresponding to the front end, the left end and the right end of the imaging area 2. Each naked-eye 3D imaging plate 4 is a regular triangle; the vertex of each plate is fixed at the top of the imaging area 2 near the rear end, and the base of each plate is fixed at the bottom of the corresponding end of the imaging area 2, so as to form a regular pyramid structure with the imaging area 2. The display 5 is arranged at the bottom surface of the top cover 3 with the screen facing downwards. The somatosensory camera 6 is arranged at the top cover 3.
More specifically, transparent films are disposed at the front end, the left end and the right end of the imaging area 2, and a non-transparent film is disposed at the rear end of the imaging area 2, so as to enclose the imaging area 2.
In a further embodiment, as shown in figs. 9 to 10, the top surfaces of the four ends of the experiment table 1 are provided with inclined surfaces extending downwards from the top surfaces. There are four operation panels 11, each located at the corresponding inclined surface. There are four naked-eye 3D imaging plates 4, corresponding to the four ends of the imaging area 2. Each naked-eye 3D imaging plate 4 is a regular triangle; the vertex of each plate is fixed at the center of the top surface of the experiment table 1, and the base of each plate is fixed at the top of the corresponding end of the imaging area 2, so as to form an inverted pyramid structure with the imaging area 2. The display 5 is arranged at the top surface of the experiment table 1 with the screen facing upwards. There are four somatosensory cameras 6, arranged on the four inclined surfaces of the experiment table 1.
More specifically, transparent films are disposed at the four ends of the imaging area 2, so as to enclose the imaging area 2.
More specifically, the transparent film in the above embodiments may be tempered glass, and the non-transparent film may be an aluminum plate.
Furthermore, the intelligent laboratory 100 shown in figs. 1, 5 and 9 further comprises a smell module 7 connected with the host. The smell module 7 contains a plurality of groups of smell units; each smell unit corresponds to one smell and consists of solid smell particles. The host calls the smell units for different smells to release the smell of the corresponding smell particles, providing a high-quality olfactory experience.
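As a hedged illustration of how the host might select among the smell units, consider the sketch below; the cartridge names, channel numbers and trigger callable are hypothetical, since the application does not describe the release mechanism:

```python
# Illustrative only: an assumed mapping from experiment smells to scent
# cartridges; the hardware trigger is a placeholder callable.
class ScentModule:
    def __init__(self, cartridges):
        # cartridges: mapping of smell name -> hardware channel (assumed layout)
        self.cartridges = dict(cartridges)

    def release(self, smell, trigger):
        """Release the solid scent particles for `smell` via trigger(channel)."""
        channel = self.cartridges.get(smell)
        if channel is None:
            raise KeyError(f"no cartridge loaded for smell: {smell!r}")
        trigger(channel)

# Example usage with a stand-in trigger:
# module = ScentModule({"ethanol": 0, "burnt sulfur": 1})
# module.release("ethanol", trigger=lambda ch: print(f"open channel {ch}"))
```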
In specific implementation, the odor particles are food-grade odor particles, so that the odor is harmless to human bodies.
Further, as shown in fig. 3 and 7, the intelligent laboratory 100 further includes a power line interface 8 disposed at the laboratory table 1. For example, it may be arranged at the side of the laboratory table 1.
It is to be noted that, unless otherwise specified, technical or scientific terms used herein shall have the ordinary meaning as understood by those skilled in the art to which this application belongs.
In the description of the present application, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the present application and to simplify the description, and are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be considered limiting of the present application.
Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. In the description of the present application, "a plurality" means two or more unless specifically defined otherwise.
In this application, unless expressly stated or limited otherwise, the terms "mounted," "connected," "secured," and the like are to be construed broadly and can include, for example, fixed connections, removable connections, or integral parts; can be mechanically or electrically connected; either directly or indirectly through intervening media, either internally or in any other relationship. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
In this application, unless expressly stated or limited otherwise, a first feature being "on" or "under" a second feature may mean that the first and second features are in direct contact, or in indirect contact through an intermediate medium. Moreover, a first feature being "on," "over" or "above" a second feature may mean that the first feature is directly or obliquely above the second feature, or simply that the first feature is at a higher level than the second feature. A first feature being "under," "below" or "beneath" a second feature may mean that the first feature is directly or obliquely below the second feature, or simply that the first feature is at a lower level than the second feature.
The above description is only for the preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present application should be covered within the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. An intelligent laboratory for performing virtual experiment simulation operations, comprising:
an experiment table, located at the bottom and used for supporting the other parts of the intelligent laboratory, the experiment table being provided with an embedded host for controlling naked-eye 3D images in the experiment table, and with an exposed operation panel that serves as an operation interface, is connected with the host and is used for selecting the image content to be displayed;
an imaging area, which is a square space formed by four rods and used for displaying the projection, the four rods corresponding to the four vertices of a square outline in the top surface of the experiment table and being perpendicular to and fixedly connected with the top surface;
a display, arranged at the imaging area and used for projecting the image to be displayed by the host onto a naked-eye 3D imaging plate;
the naked-eye 3D imaging plate, obliquely inserted into the imaging area and used for presenting a naked-eye 3D picture; and
a somatosensory camera, arranged at the intelligent laboratory and connected with the host, used for collecting the gesture actions of an experimenter and transmitting them to the host, which performs gesture recognition so as to control the naked-eye 3D images in the experiment table.
2. The intelligent laboratory according to claim 1, further comprising a top cover covering and fixedly connected with the top ends of the four rods;
wherein the top surface of the front end of the experiment table is provided with an inclined surface extending downwards from the top surface, and the operation panel is located at the inclined surface;
there is one naked-eye 3D imaging plate, one end of which is fixed at the bottom of the front end of the imaging area and the other end of which is fixed at the top of the imaging area near the rear end, the angle between the naked-eye 3D imaging plate and the top surface of the experiment table being 45 degrees so as to form an upright Z-shaped structure with the imaging area;
the display is arranged at the bottom surface of the top cover with the screen facing downwards;
and the somatosensory camera is arranged at the top cover.
3. The intelligent laboratory according to claim 2, wherein a transparent film is disposed at the front end of the imaging area, and non-transparent films are disposed at the rear end, the left end and the right end of the imaging area, so as to enclose the imaging area.
4. The intelligent laboratory according to claim 1, further comprising a top cover covering and fixedly connected with the top ends of the four rods;
wherein the top surface of the front end of the experiment table is provided with an inclined surface extending downwards from the top surface, and the operation panel is located at the inclined surface;
there are three naked-eye 3D imaging plates, corresponding to the front end, the left end and the right end of the imaging area; each naked-eye 3D imaging plate is a regular triangle, the vertex of each plate is fixed at the top of the imaging area near the rear end, and the base of each plate is fixed at the bottom of the corresponding end of the imaging area, so as to form a regular pyramid structure with the imaging area;
the display is arranged at the bottom surface of the top cover with the screen facing downwards;
and the somatosensory camera is arranged at the top cover.
5. The intelligent laboratory according to claim 4, wherein transparent films are disposed at the front end, the left end and the right end of the imaging area, and a non-transparent film is disposed at the rear end of the imaging area, so as to enclose the imaging area.
6. The intelligent laboratory according to claim 1, wherein the top surfaces of the four ends of the experiment table are provided with inclined surfaces extending downwards from the top surfaces, there are four operation panels, and each operation panel is located at the corresponding inclined surface;
there are four naked-eye 3D imaging plates, corresponding to the four ends of the imaging area; each naked-eye 3D imaging plate is a regular triangle, the vertex of each plate is fixed at the center of the top surface of the experiment table, and the base of each plate is fixed at the top of the corresponding end of the imaging area, so as to form an inverted pyramid structure with the imaging area;
the display is arranged at the top surface of the experiment table with the screen facing upwards;
and there are four somatosensory cameras, arranged on the four inclined surfaces of the experiment table.
7. The intelligent laboratory according to claim 6, wherein transparent films are disposed at the four ends of the imaging area, so as to enclose the imaging area.
8. The intelligent laboratory according to claim 1, further comprising a smell module connected with the host, wherein the smell module contains a plurality of smell units, each smell unit corresponds to one smell and consists of solid smell particles, and the host calls the smell units for different smells to release the smell of the corresponding smell particles, so as to provide an olfactory experience.
9. The intelligent laboratory of any one of claims 1 to 8, further comprising a power line interface disposed at the laboratory bench.
CN202020176930.3U 2020-02-17 2020-02-17 Intelligent laboratory Active CN212256620U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020176930.3U CN212256620U (en) 2020-02-17 2020-02-17 Intelligent laboratory

Publications (1)

Publication Number Publication Date
CN212256620U true CN212256620U (en) 2020-12-29

Family

ID=73989472

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020176930.3U Active CN212256620U (en) 2020-02-17 2020-02-17 Intelligent laboratory

Country Status (1)

Country Link
CN (1) CN212256620U (en)

Similar Documents

Publication Publication Date Title
CN206379004U (en) A kind of immersion tutoring system based on VR technologies
US11257392B2 (en) Apparatus, engine, system and method of providing simulation of and training for the operation of heavy equipment
CN207883270U (en) A kind of 3D holographic interaction display systems for museum
CN205406026U (en) Unreal image formation of image show stand based on control technique is felt to kinect body
CN204028557U (en) The holographic phantom imaging system of a kind of 360 degree
CN212256620U (en) Intelligent laboratory
KR20150113294A (en) System, method and computer readable recording medium for managing an education using a smart pen based on a dot code
CN206733888U (en) One kind rotation hand propelled secondary modern school mathematics is given lessons with board of education instrument
CN207233229U (en) Visual communication professional teaching apparatus
CN112164147A (en) Interaction device for AR augmented reality teaching demonstration and smart cloud platform used by same
CN206179266U (en) Interactive teaching appearance
CN207037630U (en) Suitable for the Intelligent touch electronic whiteboard of classroom instruction
CN207041235U (en) A kind of Multifunctional teaching desk based on Computer Multimedia Technology
CN211718689U (en) Holographic projection imaging interaction system
CN209772175U (en) SLAB super laboratory and SLAB experiment host computer subassembly
CN206075584U (en) A kind of new teaching touch one-piece machine
CN107831894A (en) It is a kind of suitable for mobile terminal every empty-handed gesture writing on the blackboard method
CN107085485A (en) Suitable for the Intelligent touch electronic whiteboard of classroom instruction
CN208433026U (en) A kind of wisdom education system
CN111276014A (en) Intelligent experiment learning machine
CN206757696U (en) Human resource management service scenario evaluator
CN107665621B (en) Three-view device for high school mathematics
CN112181142A (en) AR teaching demonstration interaction device and smart cloud platform used by same
CN204557819U (en) A kind of mechanical drawing view projections stage apparatus
CN206442447U (en) A kind of Full-automatic electric-controlled VR macroshot platform containers

Legal Events

Date Code Title Description
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220317

Address after: Room 0106-647, floor 1, No. 26, Shangdi Information Road, Haidian District, Beijing 100085 (cluster-registered domicile valid until April 5, 2023)

Patentee after: Magic pupil (Beijing) Technology Co.,Ltd.

Address before: 102600 1702, unit 2, building 4, new lissilai mansion, Daxing District, Beijing

Patentee before: Kong Lingkuo