CN116305364A - Glasses model generation method and electronic equipment - Google Patents


Info

Publication number
CN116305364A
Authority
CN
China
Prior art keywords
model
eyeglass
glasses
head
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310345001.9A
Other languages
Chinese (zh)
Inventor
彭庆菲
江建好
唐显蒙
方友森
简寒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhuhai Saina Shibo Technology Co ltd
Original Assignee
Zhuhai Saina Shibo Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhuhai Saina Shibo Technology Co ltd filed Critical Zhuhai Saina Shibo Technology Co ltd
Priority to CN202310345001.9A priority Critical patent/CN116305364A/en
Publication of CN116305364A publication Critical patent/CN116305364A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/10 Geometric CAD
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2113/00 Details relating to the application field
    • G06F 2113/10 Additive manufacturing, e.g. 3D printing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00 Commerce
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06Q 30/0601 Electronic shopping [e-shopping]
    • G06Q 30/0621 Item configuration or customization
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/04 Indexing scheme for image data processing or generation, in general involving 3D image data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/20 Indexing scheme for editing of 3D models
    • G06T 2219/2021 Shape modification

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Mathematical Analysis (AREA)
  • Pure & Applied Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Optimization (AREA)
  • Computational Mathematics (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the application provides a glasses model generation method and an electronic device. The glasses model generation method comprises the following steps: acquiring head characteristic data of a user, and generating a head virtual three-dimensional model based on the head characteristic data; determining a first eyeglass model; adjusting the first eyeglass model based on the head virtual three-dimensional model to generate a second eyeglass model, the frame, nose pads and temples of the second eyeglass model being respectively adapted to the head virtual three-dimensional model; determining a lens pretilt angle of the second eyeglass model; and when the lens pretilt angle is not within a preset angle threshold interval, adjusting the second eyeglass model to generate a third eyeglass model, the lens pretilt angle of the third eyeglass model being within the preset angle threshold interval. The eyeglass model generation method can standardize the customization flow and customization criteria of 3D-printed glasses, improve the wearing comfort of 3D-printed glasses, and improve the user experience of customizing 3D-printed glasses.

Description

Glasses model generation method and electronic equipment
Technical Field
The application relates to the technical field of glasses manufacturing, in particular to a glasses model generation method and electronic equipment.
Background
3D printing is a rapid prototyping technology that generates corresponding physical objects by printing layer by layer based on a digital model file. 3D printing is widely used in various fields, for example, customizing glasses through 3D printing.
The process of customizing glasses by 3D printing generally includes: establishing a virtual three-dimensional model of the user's head according to the characteristic data of the user's head; the user selecting an initial three-dimensional digital model of glasses in a style that matches his or her personal preference; and adjusting the initial three-dimensional digital model selected by the user to adapt it to the virtual three-dimensional model of the user's head, thereby obtaining the customized glasses.
Typically, the adjustment of the initial three-dimensional digital model selected by the user is performed manually by a designer. Because different designers apply inconsistent standards when manually adjusting the initial three-dimensional digital model to fit the virtual three-dimensional model of the user's head, the adaptation effect cannot be measured accurately. This makes the wearing comfort of the finally manufactured custom glasses uncertain. After personally wearing the customized glasses, the user often still needs later adjustment of at least one of the end pieces (pile heads), nose pads and temples to improve wearing comfort, which affects the user experience of 3D-printed customized glasses.
Disclosure of Invention
To address the problem of how to improve the user experience of 3D-printed customized glasses, the present application provides a glasses model generation method and an electronic device, and also provides a computer-readable storage medium.
The embodiment of the application adopts the following technical scheme:
In a first aspect, the present application provides a method for generating a glasses model, where the method is applied to an electronic device, and the method includes:
acquiring head characteristic data of a user, and generating a head virtual three-dimensional model based on the head characteristic data;
determining a first eyeglass model;
based on the head virtual three-dimensional model, the first glasses model is adjusted to generate a second glasses model, and a glasses frame, a nose pad and glasses legs of the second glasses model are respectively matched with the head virtual three-dimensional model;
determining a lens pretilt angle of the second eyeglass model;
and when the lens pretilt angle is not in the preset angle threshold value interval, adjusting the second eyeglass model to generate a third eyeglass model, wherein the lens pretilt angle of the third eyeglass model is in the preset angle threshold value interval.
According to this eyeglass model generation method, a second eyeglass model whose frame, nose pads and temples are adapted to the head virtual three-dimensional model is generated first, and the lens pretilt angle is then adjusted on the basis of the second eyeglass model to obtain a third eyeglass model for 3D printing. The eyeglass model generation method can standardize the customization flow and customization criteria of 3D-printed glasses, improve the wearing comfort of 3D-printed glasses, and improve the user experience of customizing 3D-printed glasses.
In an implementation manner of the first aspect, the adjusting the first glasses model based on the head virtual three-dimensional model, and generating a second glasses model, includes:
determining feature points in the virtual three-dimensional model of the head, and shapes and positions of eyes, nose, and ears;
determining a characteristic point distance related to the eyeglass parameter based on the characteristic points;
based on the feature point spacing, and the shape and position of the eyes, nose, and ears, parameters of the first eyeglass model are adjusted to generate the second eyeglass model.
In an implementation manner of the first aspect, the determining a lens pretilt angle of the second eyeglass model includes:
measuring a first distance value, wherein the first distance value is the distance from the hinge center point on the same side of the second eyeglass model to the bending point of the eyeglass leg;
measuring a second distance value, wherein the second distance value is the distance between the center point of the nose pad on the same side of the second glasses model and the horizontal center line of the lens;
measuring a third distance value, wherein the third distance value is the distance between the pile head center point on the same side of the second eyeglass model and the lens horizontal center line;
a lens pretilt angle of the second eyeglass model is determined based on the first distance value, the second distance value, and the third distance value.
In an implementation manner of the first aspect, the adjusting the second eyeglass model when the lens pretilt angle is not within a preset angle threshold interval includes:
adjusting at least one parameter of the position and angle of the pile head of the second eyeglass model; and/or adjusting at least one parameter of the position and angle of the nose pad of the second eyeglass model.
In one implementation manner of the first aspect, the frame, the nose pad and the temple of the third eyeglass model are respectively adapted to the virtual three-dimensional model of the head.
In a second aspect, the present application provides a method of generating eyeglasses, the method comprising:
acquiring a third eyeglass model based on the method of the first aspect;
performing slice layering processing on the third eyeglass model to obtain at least one slice layer image data;
performing data processing based on the slice layer image data to obtain layer printing data;
and performing three-dimensional printing based on the layer printing data to obtain layers of the glasses, and performing layer-by-layer printing and superposition to obtain the glasses.
In a third aspect, the present application provides an electronic device, including:
the model building module is used for acquiring head characteristic data of a user and generating a head virtual three-dimensional model based on the head characteristic data;
a model selection module for determining a first eyeglass model;
the model adapting module is used for adjusting the first glasses model based on the head virtual three-dimensional model to generate a second glasses model, and a glasses frame, a nose pad and glasses legs of the second glasses model are respectively adapted to the head virtual three-dimensional model;
a model adjustment module for: determining a lens pretilt angle of the second eyeglass model; and when the lens pretilt angle is not in the preset angle threshold value interval, adjusting the second eyeglass model to generate a third eyeglass model, wherein the lens pretilt angle of the third eyeglass model is in the preset angle threshold value interval.
In a fourth aspect, the present application provides an eyewear generation system, the system comprising:
the model building module is used for acquiring head characteristic data of a user and generating a head virtual three-dimensional model based on the head characteristic data;
a model selection module for determining a first eyeglass model;
the model adapting module is used for adjusting the first glasses model based on the head virtual three-dimensional model to generate a second glasses model, and a glasses frame, a nose pad and glasses legs of the second glasses model are respectively adapted to the head virtual three-dimensional model;
a model adjustment module for: determining a lens pretilt angle of the second eyeglass model; when the lens pretilt angle is not in the preset angle threshold value interval, the second eyeglass model is adjusted to generate a third eyeglass model, and the lens pretilt angle of the third eyeglass model is in the preset angle threshold value interval;
the model slicing module is used for performing slicing layering processing on the third eyeglass model to obtain at least one slice layer image data;
the data processing module is used for carrying out data processing based on the slice layer image data to obtain layer printing data;
and the model printing module is used for performing three-dimensional printing based on the layer printing data to obtain layers of the glasses, and performing layer-by-layer printing and superposition to obtain the glasses.
In a fifth aspect, the present application provides an electronic device comprising a memory for storing computer program instructions and a processor for executing the computer program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps according to the first or second aspect.
In a sixth aspect, the present application provides a computer readable storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the method according to the first or second aspect.
Drawings
FIG. 1 is a schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a virtual three-dimensional model of a head generated according to a method of an embodiment of the present application;
FIG. 3 is a flow chart illustrating a method of generating a model of eyeglasses according to an embodiment of the present application;
FIG. 4 is a flow chart illustrating the adjustment of a model of eyeglasses according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a second eyeglass model generated in accordance with a method of an embodiment of the present application;
FIG. 6 is a schematic diagram of eyeglasses according to an embodiment of the present application;
FIG. 7 is a flow chart illustrating the adjustment of a model of eyeglasses according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a eyeglass model according to an embodiment of the present application;
FIG. 9 is a side view of the eyeglass model of FIG. 8;
FIG. 10 is a front view of the eyeglass model of FIG. 8;
FIG. 11 is a partial flow chart illustrating a method of generating eyeglasses according to an embodiment of the present application;
fig. 12 is a schematic diagram illustrating a structure of an eyeglass generating system according to an embodiment of the present application.
Detailed Description
For the purposes, technical solutions and advantages of the present application, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
The terminology used in the description section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
To address the problem of how to improve the user experience of 3D-printed customized glasses, the present application provides a glasses model generation method in which the adjustment of the eyeglass model for wearing comfort is optimized during model generation, so that after the customized glasses are finally produced, the user can be assured of wearing comfort without tedious fitting adjustments, thereby improving the user experience.
The glasses model generation method is applied to electronic equipment.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 1, the electronic device 100 includes a model building module 110, a glasses model library 120, a model selecting module 130, a model adapting module 141, and a model adjusting module 142.
The model building module 110 is used to generate a virtual three-dimensional model of the user's head.
In an embodiment, the electronic device 100 further comprises a data acquisition module for acquiring head characteristic data of the user, for example a three-dimensional scanning device.
In another embodiment, the electronic device 100 does not include a data acquisition module for acquiring head characteristic data of the user. The head characteristic data of the user is input to the model building module 110 by an external device other than the electronic device 100.
FIG. 2 is a schematic diagram of a virtual three-dimensional model of a head generated according to a method according to an embodiment of the present application.
The model building module 110 performs three-dimensional scanning on the head of the user through the three-dimensional scanning device to obtain head scanning image data, and performs three-dimensional model reconstruction based on the head scanning image data by using the three-dimensional modeling software to obtain the head virtual three-dimensional model shown in fig. 2. In one embodiment, the three-dimensional modeling software is integrated into the scanner.
In other embodiments, a mobile phone with a high-resolution camera may be used to photograph the user's head from multiple angles so as to acquire multiple pieces of head characteristic image data for the three-dimensional modeling software to build the head virtual three-dimensional model; user head data for three-dimensional reconstruction may also be acquired in other ways, which is not limited in this application. The three-dimensional modeling software used in this embodiment may be commercially available software, such as Maya or C4D, as long as it can perform three-dimensional reconstruction based on the user's head data to obtain a head virtual three-dimensional model that satisfies the requirements.
The eyeglass model library 120 is used to store eyeglass models created in advance. The eyeglass model library 120 stores an eyeglass database including various eyeglass styles, for example, eyeglass models of different styles and different colors. In the embodiment shown in fig. 1, the eyeglass model library 120 is a local database of the electronic device 100. In another embodiment, the eyeglass model library may also be stored in the cloud.
The model selection module 130 is used to determine a first eyeglass model.
The first eyeglass model is an eyeglass model that is not adapted to the virtual three-dimensional model of the user's head. In an embodiment, corresponding eyeglass models are generated for different eyeglass styles. The first eyeglass model is an eyeglass model of an eyeglass style selected by a user.
Specifically, in one embodiment, the user selects a preferred eyeglass model from the eyeglass model library 120 according to his or her own preference. The model selection module 130 determines the eyeglass model selected by the user as the first eyeglass model according to the user's input operation.
In another embodiment, the external device (e.g., a cloud server or a local terminal device) directly inputs the eyeglass model to the model selection module 130, and the model selection module 130 takes the received eyeglass model as the first eyeglass model.
The model adapting module 141 is configured to adjust the first glasses model based on the head virtual three-dimensional model, and generate a second glasses model.
The model adjustment module 142 is used to determine the lens pretilt angle of the second eyeglass model and, when the lens pretilt angle is not within the preset angle threshold interval, to adjust the second eyeglass model to generate a third eyeglass model for 3D printing whose lens pretilt angle is within the preset angle threshold interval (for the concept of the lens pretilt angle and the adjustment process of the second eyeglass model, refer to the description below).
Fig. 3 is a flowchart illustrating a method for generating a model of glasses according to an embodiment of the present application.
The electronic device 100 shown in fig. 1 performs the following steps shown in fig. 3 to generate a third eyeglass model for 3D printing.
S310, the model building module 110 acquires the head characteristic data of the user, and generates a head virtual three-dimensional model based on the head characteristic data.
S320, the model selection module 130 determines a first eyeglass model.
S330, the model adapting module 141 adjusts the first glasses model to generate a second glasses model based on the head virtual three-dimensional model, and the glasses frame, the nose pad and the glasses legs of the second glasses model are respectively adapted to the head virtual three-dimensional model.
Fig. 4 is a flowchart illustrating the adjustment of a model of glasses according to an embodiment of the present application.
Specifically, in one implementation of S330, the following procedure is performed as shown in fig. 4.
S410, determining characteristic points in the virtual three-dimensional model of the head, and shapes and positions of eyes, nose and ears.
Specifically, based on the head virtual three-dimensional model, feature points in the head virtual three-dimensional model are determined in three-dimensional modeling software (such as Blender).
The feature points include at least one of: a face contour point, a cheek point, a brow ridge point, a brow head point, a brow tail point, an inner eye corner point, an outer eye corner point, a pupil point, a temple point, a cheekbone point, a temporal bone point, a nose root point, a nose tip point, left and right nose side points, left and right mouth corner points, a chin bottom point, an ear root point and an auricle point. (The feature points listed above are only examples; in other embodiments, more feature points may be extracted.)
S420, determining the characteristic point spacing related to the glasses parameters based on the characteristic points of the head virtual three-dimensional model.
Such as interpupillary distance, eye height, eye distance, temporal bone distance, cheekbone distance, and relative distance of pupil point to auricular root point.
Specifically, the pupil distance, the eye height and the eye distance are determined based on the pupil points; the temporal bone distance is determined based on the temporal bone points; the cheekbone (zygomatic) distance is determined based on the cheekbone points; and the relative distance from the pupil point to the ear root point is measured to obtain the corresponding distance value.
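As an illustration of how such spacings can be derived once the feature points are available, the following minimal Python sketch computes several of the distances listed above from 3D landmark coordinates; the landmark names and the example coordinates (in mm) are illustrative assumptions, not data from this application.

```python
import math

def dist(p, q):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Hypothetical landmarks read off the head virtual three-dimensional model (mm).
landmarks = {
    "pupil_left": (-31.0, 0.0, 0.0),       "pupil_right": (31.0, 0.0, 0.0),
    "temporal_left": (-72.0, 5.0, -20.0),  "temporal_right": (72.0, 5.0, -20.0),
    "zygoma_left": (-65.0, -15.0, -10.0),  "zygoma_right": (65.0, -15.0, -10.0),
    "ear_root_left": (-75.0, -2.0, -95.0),
}

spacings = {
    "pupil_distance": dist(landmarks["pupil_left"], landmarks["pupil_right"]),
    "temporal_bone_distance": dist(landmarks["temporal_left"], landmarks["temporal_right"]),
    "cheekbone_distance": dist(landmarks["zygoma_left"], landmarks["zygoma_right"]),
    # relative distance from the pupil point to the ear root point (same side)
    "pupil_to_ear_root": dist(landmarks["pupil_left"], landmarks["ear_root_left"]),
}
print(spacings)
```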
And S430, adjusting parameters of the first glasses model based on the characteristic point spacing and the shapes and positions of eyes, nose and ears to generate a second glasses model.
In one implementation of S430, including:
adjusting the first eyeglass model based on the shape and position of the eye and at least one of pupil distance, eye height, temporal bone distance and cheekbone distance, generating frame parameters of a second eyeglass model adapted to the virtual three-dimensional model of the head;
adjusting the first glasses model based on the shape and position of the nose, the positions of the nose root point and the left and right nose side points, and generating nose pad parameters of a second glasses model adapted to the head virtual three-dimensional model;
adjusting the first eyeglass model based on the shape and position of the ear, the eye distance and the relative distance from the pupil point to the auricular root point, and generating the temple parameters of the second eyeglass model adapted to the head virtual three-dimensional model;
and generating, based on the adapted frame parameters, nose pad parameters and temple parameters, a second eyeglass model whose frame, nose pads and temples are respectively adapted to the head virtual three-dimensional model.
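As a sketch of how S430 might map these spacings onto eyeglass parameters, the following Python fragment adjusts a few frame and temple parameters; the parameter names, clearance values and simple linear rules are illustrative assumptions rather than the adaptation rules actually used in this embodiment.

```python
def adapt_first_model(first_model: dict, spacings: dict) -> dict:
    """Return second-model parameters adapted to the measured spacings (sketch only)."""
    second_model = dict(first_model)
    # frame: align the lens geometric centres with the pupils
    second_model["lens_center_distance_mm"] = spacings["pupil_distance"]
    # frame: overall width follows the temporal-bone distance plus a small clearance
    second_model["frame_width_mm"] = spacings["temporal_bone_distance"] + 4.0
    # temples: straight length from the end piece to the ear root, plus a bend allowance
    second_model["temple_length_mm"] = spacings["pupil_to_ear_root"] + 25.0
    return second_model

first_model = {"lens_center_distance_mm": 62.0,
               "frame_width_mm": 140.0,
               "temple_length_mm": 140.0}
spacings = {"pupil_distance": 62.0,
            "temporal_bone_distance": 138.0,
            "pupil_to_ear_root": 92.0}
print(adapt_first_model(first_model, spacings))
```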
FIG. 5 is a schematic diagram of a second eyeglass model generated in accordance with a method of an embodiment of the present application.
In one embodiment, after S430, a second eyeglass model is generated that is adapted to the virtual three-dimensional model of the head, as shown in fig. 5.
After S330, S340 and S350 are performed.
S340, the model adjustment module 142 determines the lens pretilt angle of the second eyeglass model.
S350, when the lens pretilt angle is not within the preset angle threshold interval, the model adjustment module 142 adjusts the second glasses model, and generates a third glasses model for 3D printing, where the lens pretilt angle of the third glasses model is within the preset angle threshold interval.
Specifically, when the glasses are in the wearing state (i.e., the temples are horizontal), the angle between the plane in which the lens lies and the vertical direction (the angle between the plane line of the lens and the vertical line) is called the lens pretilt angle.
Fig. 6 is a schematic diagram of glasses according to an embodiment of the present application.
As shown in fig. 6, the lens pretilt angle α is the angle between the plane line L1 of the eyeglass lens and the vertical line L. Specifically, with the glasses resting flat on a plane, the plane line L1 of the lens is the straight line passing through the upper vertex and the lower vertex of the lens, and the vertical line L is the normal perpendicular to that plane.
The lens pretilt angle affects the user's comfort when wearing the glasses, and wearing glasses with an unsuitable lens pretilt angle for a long time aggravates the user's vision problems.
Therefore, in order to improve the wearing comfort of the glasses, the lens pretilt angle of the second eyeglass model is adjusted in S340 and S350. In this method, the pretilt angle is adjusted on the basis of a second eyeglass model whose frame, nose pads and temples are already adapted to the head virtual three-dimensional model; there are therefore few variables to adjust and few conditions to satisfy simultaneously, and an eyeglass model with high wearing comfort can be obtained simply, quickly and accurately.
Specifically, fig. 7 is a flowchart illustrating adjustment of a model of glasses according to an embodiment of the present application.
In one embodiment, the model adjustment module 142 performs the following steps as shown in FIG. 7 to implement S340, S350.
S700, determining the lens pretilt angle of the second eyeglass model.
In an embodiment, the second eyeglass model does not include lenses; in S700, the plane in which the lenses lie (the plane line L1 shown in fig. 6) is determined from the frame of the second eyeglass model, so as to further determine the lens pretilt angle (the angle α between the plane line L1 and the vertical line L shown in fig. 6).
Specifically, in S700, one skilled in the art may determine the lens pretilt angle of the second eyeglass model in a number of different ways.
For example, in one implementation of S700, a side view of the second eyeglass model is acquired (see fig. 6), and image recognition is performed on the side view to obtain the lens pretilt angle, for example the angle α shown in fig. 6.
In another implementation of S700, the lens pretilt angle is obtained in the three-dimensional modeling software by annotating the angle between the plane line and the vertical line of the lens using an angle annotation function.
For another example, in another implementation of S700, the lens pretilt angle of the second eyeglass model is calculated from model parameters of the second eyeglass model.
Specifically, fig. 8 is a schematic diagram of a glasses model according to an embodiment of the present application.
In one embodiment, the second eyeglass model is shown in fig. 8: 1 is the bridge of the glasses; 2 is a nose pad of the glasses (right side); 3 is the frame of the glasses; 4 is a lens of the glasses (left side); 5 is a pile head (end piece) of the glasses (left side); 6 is a temple of the glasses (left side).
Since the glasses are designed to be left-right symmetric, reference numerals 2, 4, 5 and 6 mark only one side of the glasses.
Fig. 9 is a side view of the eyeglass model of fig. 8.
As shown in fig. 9, 51 is the center point of the hinge connecting the frame to the temple, and 61 is the bending point of the temple.
Based on the positions of 51 and 61, the distance D1 (first distance value D1) from the hinge center point 51 to the temple bending point 61 is measured.
Fig. 10 is a front view of the eyeglass model of fig. 8.
As shown in fig. 10, 21 is the center point of the nose pad (left nose pad shown in fig. 8); 52 is the center point of the pile head (left pile head shown in fig. 8); reference numeral 41 denotes a horizontal center line of the lens (left lens shown in fig. 8).
Based on the positions of 21 and 41, the distance D2 (second distance value) from the nose pad center point 21 to the lens horizontal center line 41 is measured.
Based on the positions of 52 and 41, the distance D3 (third distance value) of the toe center point 52 from the lens horizontal center line 41 is measured.
The lens pretilt angle of the second eyeglass model (eyeglass model shown in fig. 8) is determined based on the distance values D1, D2 and D3. Specifically, in one embodiment, the lens pretilt angle α is calculated based on the following formula.
α = arcsin((D2 + D3) / D1)  (Formula 1)
In the above implementation of S700, calculating the lens pretilt angle from the distance values between the different components using the pretilt angle calculation formula (Formula 1) allows the lens pretilt angle α to be obtained efficiently and accurately.
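The following short Python check evaluates the pretilt relation α = arcsin((D2 + D3) / D1) for the distance values reported in Table 1 below; note that this closed form is reconstructed here from those reported values, and the function name is an illustrative assumption.

```python
import math

def lens_pretilt_angle(d1_mm: float, d2_mm: float, d3_mm: float) -> float:
    """Lens pretilt angle in degrees from the distances defined for fig. 9 and fig. 10."""
    return math.degrees(math.asin((d2_mm + d3_mm) / d1_mm))

# Distance values taken from Table 1 below (D3, D2, D1 in mm).
for d3, d2, d1 in [(20, 2.5, 105), (16, 2.5, 100), (14, 2.5, 96)]:
    print(f"D1={d1}, D2={d2}, D3={d3}  ->  alpha = {lens_pretilt_angle(d1, d2, d3):.2f} deg")
# Prints approximately 12.37, 10.66 and 9.90 degrees.
```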
S710, judging whether the lens front inclination angle of the second glasses model is within a preset angle threshold value interval.
Specifically, in one embodiment, the preset angle threshold interval may be determined by one skilled in the art based on a history of the user's comfort of wearing the glasses.
In another embodiment, the preset angle threshold interval may be set by the user.
Specifically, in one embodiment, the preset angle threshold interval is 0° to 10°.
When the lens pretilt angle of the second eyeglass model is within the preset angle threshold interval, S721 is performed.
S721, the current second eyeglass model is used as the third eyeglass model for 3D printing.
When the lens pretilt angle of the second eyeglass model is not within the preset angle threshold interval (e.g., less than 0° or greater than 10°), S722 is performed.
S722, the second eyeglass model is adjusted so that the front inclination angle of the lens is within a preset angle threshold interval.
In S722, the second eyeglass model is adjusted on the basis of ensuring the adaptation of the second eyeglass model to the virtual three-dimensional model of the head. Specifically, the frame parameters, nose pad parameters, and temple parameters of the adjusted second eyeglass model are still within the tolerance range adapted to the user's head characteristic data, i.e., the frame, nose pad, and temple of the adjusted second eyeglass model are adapted to the virtual three-dimensional model of the head.
In one implementation of S722, the second eyeglass model is adjusted to change the lens pretilt angle based on ensuring that the second eyeglass model is adapted to the virtual three-dimensional model of the head. And repeating the steps S700 and S710 for the adjusted second eyeglass model until the lens pretilt angle of the adjusted second eyeglass model is within the preset angle threshold value interval.
For example, at least one parameter of the position and angle of the pile head 5 of the second eyeglass model shown in fig. 8 is adjusted; and/or at least one parameter of the position and angle of the nose pad 2 is adjusted.
For the adjusted second eyeglass model, the first distance value D1, the second distance value D2 and the third distance value D3 are measured, the lens pretilt angle is calculated based on Formula 1, and it is judged whether the lens pretilt angle of the adjusted second eyeglass model is within the preset angle threshold interval; this process is repeated until the lens pretilt angle of the adjusted second eyeglass model lies within the range of 0° to 10° inclusive.
Table 1 shows an illustration of the results of adjusting the second eyeglass model to obtain a lens pretilt angle within a preset angle threshold interval in one embodiment.
TABLE 1
                          D3 (mm)    D2 (mm)    D1 (mm)    α (°)
Second eyeglass model     20         2.5        105        12.37
First adjustment          16         2.5        100        10.66
Second adjustment         14         2.5        96         9.9
As shown in Table 1, for the second eyeglass model whose frame, nose pads and temples are respectively adapted to the head virtual three-dimensional model, the first distance value D1 is 105 mm, the second distance value D2 is 2.5 mm and the third distance value D3 is 20 mm; the lens pretilt angle α calculated from these three distance values is 12.37°, which is not within the preset interval of 0° to 10°.
In the first adjustment, the position parameters of the pile head 5 are adjusted so that the third distance value D3 from the pile head center point 52 to the lens horizontal center line 41 becomes 16 mm; the second distance value D2 is unchanged and the first distance value D1 becomes 100 mm, while the frame parameters, nose pad parameters and temple parameters of the adjusted eyeglass model remain within the tolerance range adapted to the user's head characteristic data.
The lens pretilt angle α of the second eyeglass model after the first adjustment is calculated to be 10.66°, which is still not within the preset interval of 0° to 10°.
A second adjustment is then performed on the second eyeglass model after the first adjustment: the position parameters of the pile head 5 are adjusted so that the third distance value D3 from the pile head center point 52 to the lens horizontal center line 41 becomes 14 mm; the second distance value D2 is unchanged and the first distance value D1 becomes 96 mm.
The lens pretilt angle α of the second eyeglass model after the second adjustment is calculated to be 9.9°, which is within the preset interval of 0° to 10°.
The second eyeglass model after the second adjustment is the third eyeglass model for 3D printing.
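To make the loop in S700-S722 concrete, the sketch below repeatedly moves the pile head (shrinking D3, with D1 changing as a side effect) and re-evaluates the pretilt angle until it falls inside the preset interval. The step sizes and the coupling between D1 and D3 are illustrative assumptions; the real adjustment must also keep the frame, nose pad and temple parameters within the adaptation tolerance.

```python
import math

def alpha_deg(d1, d2, d3):
    return math.degrees(math.asin((d2 + d3) / d1))

def adjust_until_acceptable(d1, d2, d3, low=0.0, high=10.0,
                            d3_step=2.0, d1_per_d3=2.0, max_iters=20):
    """Iteratively adjust the pile-head position until the pretilt angle is acceptable."""
    for _ in range(max_iters):
        a = alpha_deg(d1, d2, d3)
        if low <= a <= high:
            return d1, d2, d3, a          # parameters of the third eyeglass model
        d3 -= d3_step                     # move the pile head centre towards the lens centre line
        d1 -= d3_step * d1_per_d3         # hinge-to-bend distance shortens accordingly
    raise RuntimeError("pretilt angle could not be brought into the preset interval")

print(adjust_until_acceptable(105, 2.5, 20))
```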
According to this eyeglass model generation method, a second eyeglass model whose frame, nose pads and temples are adapted to the head virtual three-dimensional model is generated first, and the lens pretilt angle is then adjusted on the basis of the second eyeglass model to obtain a third eyeglass model for 3D printing. The eyeglass model generation method can standardize the customization flow and customization criteria of 3D-printed glasses, improve the wearing comfort of 3D-printed glasses, and improve the user experience of customizing 3D-printed glasses.
Furthermore, based on the eyeglass model generating method in the embodiment of the present application, an embodiment of the present application further provides an eyeglass generating method.
Fig. 11 is a partial flowchart illustrating a method of generating eyeglasses according to an embodiment of the present application.
After S350, the following steps shown in fig. 11 are performed to generate glasses based on the third glasses model for 3D printing generated in S350.
S1100, performing slice layering processing on a third eyeglass model for 3D printing to obtain at least one slice layer image data;
S1110, performing data processing based on the slice layer image data to obtain layer printing data;
in this embodiment, specifically, slice analysis processing may be performed on the third eyeglass model using the slicing software (sainner_3dp) of Sainner three-dimensional science and technology company to obtain at least one piece of slice layer image data, and the slice layer image data is then converted into layer printing data through halftone processing;
S1120, performing three-dimensional printing based on the layer printing data to obtain the layers of the glasses, and printing and superimposing layer by layer to obtain the glasses.
In some embodiments, three-dimensional printing techniques that may be used include, but are not limited to: stereolithography (SLA), digital light processing (DLP), three-dimensional printing (3DP), multi jet fusion (MJF), and various other types of 3D printing or additive manufacturing techniques known in the art, which are not limited herein.
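For readers who want to prototype the slicing step (S1100) outside dedicated slicing software, the fragment below sketches planar cross-sectioning using the open-source trimesh library; the file name "third_model.stl", the layer height and the use of trimesh itself are assumptions, and converting each cross-section into slice layer image data and layer printing data (halftoning) is left to the slicing software described above.

```python
import numpy as np
import trimesh  # assumed available: pip install trimesh

# Load the third eyeglass model exported from the modeling software (hypothetical file name).
mesh = trimesh.load("third_model.stl")

layer_height = 0.05  # mm, illustrative value
z_min, z_max = mesh.bounds[0][2], mesh.bounds[1][2]
heights = np.arange(0.0, z_max - z_min, layer_height)

# One planar cross-section per layer; layers that miss the mesh come back as None.
sections = mesh.section_multiplane(plane_origin=mesh.bounds[0],
                                   plane_normal=[0, 0, 1],
                                   heights=heights)
layer_outlines = [s.polygons_full if s is not None else [] for s in sections]
print(f"{len(layer_outlines)} slice layers generated")
```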
Furthermore, based on the glasses generating method of the embodiment of the application, an embodiment of the application further provides a glasses generating system.
Fig. 12 is a schematic diagram illustrating a structure of an eyeglass generating system according to an embodiment of the present application.
As shown in fig. 12, the eyeglass generation system 1200 includes:
front-end device 1210, front-end device 1210 specifically includes:
a data acquisition module 1211 configured to acquire head characteristic data of a user;
a model building module 1212 configured to build a head virtual three-dimensional model based on the head characteristic data;
the model selection module 1213 is configured to determine a first eyeglass model of the eyeglass.
In one embodiment, the front-end device 1210, which includes the data acquisition module 1211, the model building module 1212 and the model selection module 1213, may be a scanner (e.g., model TS-268) of the three-dimensional science and technology company. In another embodiment, the front-end device 1210 may also be a combination of a camera-equipped device and a computer device.
The middle-end apparatus 1220, the middle-end apparatus 1220 specifically includes:
a model adaptation module 1221 configured to adjust the first eyeglass model, generating a second eyeglass model adapted to the virtual three-dimensional model of the head;
a model adjustment module 1222, the model adjustment module 1222 including a data calculation unit 1223 and a model adjustment unit 1224; the data calculation unit 1223 is configured to calculate a lens pretilt angle of the second eyeglass model; the model adjustment unit 1224 is configured to adjust the second eyeglass model based on the lens pre-tilt angle, generating a third eyeglass model for 3D printing.
In an embodiment, the middle-end device 1220, which includes the model adaptation module 1221 and the model adjustment module 1222, may be a computer device with three-dimensional modeling software (such as Blender).
The back-end device 1230, the back-end device 1230 specifically includes:
a model slicing module 1231 configured to perform slicing layering processing on the glasses model to obtain at least one slice image data;
a data processing module 1232 configured to perform data processing based on slice layer image data to obtain layer print control data;
the model printing module 1233 is configured to perform three-dimensional printing based on the layer printing control data to obtain the layers of the glasses, print layer by layer, and superimpose to obtain the glasses.
In an embodiment, in particular, the back-end device 1230 including the model slicing module 1231, the data processing module 1232, and the model printing module 1233 may be a combination of a computer device with model slice analysis software and a three-dimensional printer.
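The data flow between the three devices can be summarised with the minimal sketch below. Every function body is a placeholder (the real modules are a 3D scanner, three-dimensional modeling software and a 3D printer); only the hand-offs between front-end, middle-end and back-end mirror the description above, and all names are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class EyeglassModel:
    frame: dict = field(default_factory=dict)
    nose_pads: dict = field(default_factory=dict)
    temples: dict = field(default_factory=dict)
    pretilt_deg: float = 12.0

# Front-end device 1210
def acquire_head_data():            return {"scan": []}       # data acquisition module 1211
def build_head_model(head_data):    return {"landmarks": {}}  # model building module 1212
def select_first_model(style):      return EyeglassModel()    # model selection module 1213

# Middle-end device 1220
def adapt_model(model, head_model): return model              # model adaptation module 1221
def adjust_pretilt(model, low=0.0, high=10.0):                # model adjustment module 1222
    model.pretilt_deg = min(max(model.pretilt_deg, low), high)
    return model

# Back-end device 1230
def slice_model(model):     return ["slice layer image data"]   # model slicing module 1231
def to_print_data(layers):  return ["layer printing data"]      # data processing module 1232
def print_layers(data):     return "printed glasses"            # model printing module 1233

head_model = build_head_model(acquire_head_data())
third_model = adjust_pretilt(adapt_model(select_first_model("style-A"), head_model))
glasses = print_layers(to_print_data(slice_model(third_model)))
print(glasses)
```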
In the description of the embodiments of the present application, for convenience of description, the functions are described as being divided into various modules, where the division of each module is merely a division of a logic function, and the functions of each module may be implemented in one or more pieces of software and/or hardware when the embodiments of the present application are implemented.
In particular, the apparatus according to the embodiments of the present application may be fully or partially integrated into one physical entity or may be physically separated when actually implemented. And these modules may all be implemented in software in the form of calls by the processing element; or can be realized in hardware; it is also possible that part of the modules are implemented in the form of software called by the processing element and part of the modules are implemented in the form of hardware. For example, the detection module may be a separately established processing element or may be implemented integrated in a certain chip of the electronic device. The implementation of the other modules is similar. In addition, all or part of the modules can be integrated together or can be independently implemented. In implementation, each step of the above method or each module above may be implemented by an integrated logic circuit of hardware in a processor element or an instruction in a software form.
For example, the modules above may be one or more integrated circuits configured to implement the methods above, such as: one or more application specific integrated circuits (Application Specific Integrated Circuit, ASIC), or one or more digital signal processors (Digital Signal Processor, DSP), or one or more field programmable gate arrays (Field Programmable Gate Array, FPGA), etc. For another example, the modules may be integrated together and implemented in the form of a System-On-a-Chip (SOC).
An embodiment of the present application also proposes an electronic device comprising a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform a method as described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the one or more computer programs are stored in the memory, where the one or more computer programs include instructions, which when executed by the apparatus, cause the apparatus to perform the method steps described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the processor of the electronic device may be a device on chip SOC, where the processor may include a central processing unit (Central Processing Unit, CPU) and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may be a PWM control chip.
In particular, in an embodiment of the present application, the processor may include, for example, a CPU, DSP, microcontroller, or digital signal processor, and may further include a GPU, an embedded Neural network processor (Neural-network Process Units, NPU), and an image signal processor (Image Signal Processing, ISP), where the processor may further include a necessary hardware accelerator or logic processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the program of the present application, and so on. Further, the processor may have a function of operating one or more software programs, which may be stored in a storage medium.
In particular, in an embodiment of the present application, the memory of the electronic device may be a read-only memory (ROM), other type of static storage device capable of storing static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any computer readable medium capable of carrying or storing desired program code in the form of instructions or data structures and capable of being accessed by a computer.
In particular, in an embodiment of the present application, the processor and the memory may be combined into a processing device, more commonly separate components, and the processor is configured to execute the program code stored in the memory to implement the method described in the embodiment of the present application. In particular, the memory may also be integrated into the processor or may be separate from the processor.
Further, the devices, apparatuses, modules illustrated in the embodiments of the present application may be implemented by a computer chip or entity, or by a product having a certain function.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
In several embodiments provided herein, any of the functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
Specifically, in an embodiment of the present application, there is further provided a computer readable storage medium, where a computer program is stored, when the computer program is executed on a computer, to cause the computer to perform the method provided in the embodiment of the present application.
An embodiment of the present application also provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method provided by the embodiments of the present application.
The description of embodiments herein is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the embodiments of the present application, the term "at least one" refers to one or more, and the term "a plurality" refers to two or more. "and/or", describes an association relation of association objects, and indicates that there may be three kinds of relations, for example, a and/or B, and may indicate that a alone exists, a and B together, and B alone exists. Wherein A, B may be singular or plural. The character "/" generally indicates that the context-dependent object is an "or" relationship. "at least one of the following" and the like means any combination of these items, including any combination of single or plural items. For example, at least one of a, b and c may represent: a, b, c, a and b, a and c, b and c or a and b and c, wherein a, b and c can be single or multiple.
In the present embodiments, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
All embodiments in the application are described in a progressive manner, and identical and similar parts of all embodiments are mutually referred, so that each embodiment mainly describes differences from other embodiments. In particular, for the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments in part.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as a combination of electronic hardware, computer software, and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus, the apparatus and the units described above may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The foregoing is merely a specific embodiment of the present application, and any person skilled in the art may easily think of changes or substitutions within the technical scope of the present application, and should be covered in the scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for generating a model of glasses, the method being applied to an electronic device, the method comprising:
acquiring head characteristic data of a user, and generating a head virtual three-dimensional model based on the head characteristic data;
determining a first eyeglass model;
based on the head virtual three-dimensional model, the first glasses model is adjusted to generate a second glasses model, and a glasses frame, a nose pad and glasses legs of the second glasses model are respectively matched with the head virtual three-dimensional model;
determining a lens pretilt angle of the second eyeglass model;
and when the lens pretilt angle is not in the preset angle threshold value interval, adjusting the second eyeglass model to generate a third eyeglass model, wherein the lens pretilt angle of the third eyeglass model is in the preset angle threshold value interval.
2. The method of claim 1, wherein the adjusting the first eyeglass model based on the virtual three-dimensional model of the head to generate a second eyeglass model comprises:
determining characteristic points in the head virtual three-dimensional model, and shapes and positions of eyes, nose and ears;
determining a characteristic point distance related to the eyeglass parameter based on the characteristic points;
parameters of the first eyeglass model are adjusted based on the feature point spacing, and shapes and positions of eyes, nose, and ears to generate the second eyeglass model.
3. The method of claim 1, wherein the determining the lens pretilt angle of the second eyeglass model comprises:
measuring a first distance value, wherein the first distance value is the distance from a hinge center point to a temple bend point on the same side of the second eyeglass model;
measuring a second distance value, wherein the second distance value is the distance from a nose pad center point on the same side of the second eyeglass model to a lens horizontal center line;
measuring a third distance value, wherein the third distance value is the distance from an end piece center point on the same side of the second eyeglass model to the lens horizontal center line;
and determining the lens pretilt angle of the second eyeglass model based on the first distance value, the second distance value, and the third distance value.
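Claim 3 names the three measured distances but does not state the trigonometric relation between them and the pretilt angle; the sketch below therefore uses an assumed relation purely for illustration.

import math

def lens_pretilt_deg(hinge_to_bend, nose_pad_to_centerline, end_piece_to_centerline):
    # Assumed relation (not specified by the claim): the pretilt angle is taken as the
    # angle whose tangent is the offset between the end-piece and nose-pad reference
    # heights divided by the hinge-to-bend length of the temple on the same side.
    vertical_offset = end_piece_to_centerline - nose_pad_to_centerline
    return math.degrees(math.atan2(vertical_offset, hinge_to_bend))

# First, second, and third distance values in millimetres (sample numbers only).
print(lens_pretilt_deg(90.0, 6.0, 21.0))   # roughly 9.5 degrees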
4. The method of claim 1, wherein the adjusting the second eyeglass model when the lens pretilt angle is not within the preset angle threshold interval comprises:
adjusting at least one of the position and the angle of an end piece of the second eyeglass model; and/or adjusting at least one of the position and the angle of a nose pad of the second eyeglass model.
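A minimal sketch of such an adjustment loop, assuming a fixed step size and a linear effect of each adjustment on the pretilt angle (both are assumptions, chosen only to make the loop concrete):

PRETILT_MIN, PRETILT_MAX = 8.0, 12.0     # assumed preset angle threshold interval, degrees
STEP = 0.5                               # assumed adjustment step per iteration, degrees

def adjust_into_interval(end_piece_angle, nose_pad_angle, pretilt):
    while not PRETILT_MIN <= pretilt <= PRETILT_MAX:
        if pretilt < PRETILT_MIN:        # too flat: rotate the end pieces to increase the tilt
            end_piece_angle += STEP
            pretilt += STEP
        else:                            # too steep: rotate the nose pads to decrease the tilt
            nose_pad_angle -= STEP
            pretilt -= STEP
    return end_piece_angle, nose_pad_angle, pretilt

print(adjust_into_interval(end_piece_angle=0.0, nose_pad_angle=0.0, pretilt=4.0))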
5. The method according to any one of claims 1-4, wherein the frame, nose pads, and temples of the third eyeglass model are respectively adapted to the head virtual three-dimensional model.
6. A method of eyewear creation, the method comprising:
obtaining a third eyeglass model based on the method of any one of claims 1-5;
performing slicing and layering processing on the third eyeglass model to obtain at least one piece of slice layer image data;
performing data processing based on the slice layer image data to obtain layer printing data;
and performing three-dimensional printing based on the layer printing data to obtain layers of the glasses, the layers being printed and stacked layer by layer to obtain the glasses.
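To make the slicing step concrete, the sketch below cuts a toy triangle mesh into horizontal layers with NumPy; the layer height, the mesh, and the idea of returning the triangles crossed by each slicing plane (from which contours and layer image data would be derived) are all assumptions for illustration.

import numpy as np

LAYER_HEIGHT = 0.1   # assumed printing layer height, mm

def slice_mesh(vertices, faces, layer_height=LAYER_HEIGHT):
    # For each z-layer, return the triangles crossed by the horizontal slicing plane;
    # extracting contours and rasterising them into layer image data would follow.
    z_min, z_max = vertices[:, 2].min(), vertices[:, 2].max()
    tri_z = vertices[faces][:, :, 2]               # z-coordinates of each face's vertices
    layers = []
    z = z_min + layer_height / 2
    while z < z_max:
        crossed = (tri_z.min(axis=1) <= z) & (tri_z.max(axis=1) >= z)
        layers.append((z, faces[crossed]))
        z += layer_height
    return layers

# Toy tetrahedron standing in for the third eyeglass model.
verts = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
tris = np.array([[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]])
print(len(slice_mesh(verts, tris)), "layers")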
7. An electronic device, the electronic device comprising:
a model building module for acquiring head feature data of a user and generating a head virtual three-dimensional model based on the head feature data;
a model selection module for determining a first eyeglass model;
a model adapting module for adjusting the first eyeglass model based on the head virtual three-dimensional model to generate a second eyeglass model, wherein a frame, nose pads, and temples of the second eyeglass model are respectively adapted to the head virtual three-dimensional model;
and a model adjustment module for: determining a lens pretilt angle of the second eyeglass model; and when the lens pretilt angle is not within a preset angle threshold interval, adjusting the second eyeglass model to generate a third eyeglass model, wherein the lens pretilt angle of the third eyeglass model is within the preset angle threshold interval.
8. An eyewear creation system, the system comprising:
a model building module for acquiring head feature data of a user and generating a head virtual three-dimensional model based on the head feature data;
a model selection module for determining a first eyeglass model;
a model adapting module for adjusting the first eyeglass model based on the head virtual three-dimensional model to generate a second eyeglass model, wherein a frame, nose pads, and temples of the second eyeglass model are respectively adapted to the head virtual three-dimensional model;
a model adjustment module for: determining a lens pretilt angle of the second eyeglass model; and when the lens pretilt angle is not within a preset angle threshold interval, adjusting the second eyeglass model to generate a third eyeglass model, wherein the lens pretilt angle of the third eyeglass model is within the preset angle threshold interval;
a model slicing module for performing slicing and layering processing on the third eyeglass model to obtain at least one piece of slice layer image data;
a data processing module for performing data processing based on the slice layer image data to obtain layer printing data;
and a model printing module for performing three-dimensional printing based on the layer printing data to obtain layers of the glasses, the layers being printed and stacked layer by layer to obtain the glasses.
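For orientation only, a toy end-to-end composition of the modules listed in this claim, with every module replaced by a trivial hypothetical stand-in:

def build_head_model(head_feature_data):  return {"head": head_feature_data}          # model building
def select_first_model(_head):            return {"pretilt_deg": 4.0}                 # model selection
def adapt_to_head(model, head):           return {**model, "fitted_to": head["head"]} # model adapting
def adjust_pretilt(model):                return {**model, "pretilt_deg": 10.0}       # model adjustment
def slice_model(_model):                  return ["layer-0", "layer-1", "layer-2"]    # model slicing
def to_print_data(layers):                return [layer.upper() for layer in layers]  # data processing
def print_layers(data):                   return f"printed {len(data)} layers"        # model printing

def create_glasses(head_feature_data):
    head = build_head_model(head_feature_data)
    third = adjust_pretilt(adapt_to_head(select_first_model(head), head))
    return print_layers(to_print_data(slice_model(third)))

print(create_glasses({"pupil_distance_mm": 62}))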
9. An electronic device comprising a memory for storing computer program instructions and a processor for executing the computer program instructions, wherein the computer program instructions, when executed by the processor, cause the electronic device to perform the method of any one of claims 1-5 or claim 6.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method of any one of claims 1-5 or claim 6.
CN202310345001.9A 2023-03-31 2023-03-31 Glasses model generation method and electronic equipment Pending CN116305364A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310345001.9A CN116305364A (en) 2023-03-31 2023-03-31 Glasses model generation method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310345001.9A CN116305364A (en) 2023-03-31 2023-03-31 Glasses model generation method and electronic equipment

Publications (1)

Publication Number Publication Date
CN116305364A true CN116305364A (en) 2023-06-23

Family

ID=86785072

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310345001.9A Pending CN116305364A (en) 2023-03-31 2023-03-31 Glasses model generation method and electronic equipment

Country Status (1)

Country Link
CN (1) CN116305364A (en)

Similar Documents

Publication Publication Date Title
CN105842875B (en) A kind of spectacle frame design method based on face three-dimensional measurement
US11307437B2 (en) Method of designing and placing a lens within a spectacles frame
US9568748B2 (en) Methods of designing and fabricating custom-fit eyeglasses using a 3D printer
US20230019466A1 (en) Systems and methods for determining the scale of human anatomy from images
CN109460635B (en) Method and system for generating a frame
CN105874378B (en) Method for determining a geometric definition of a custom optical device
CN111033363A (en) Method, apparatus and computer program for virtually fitting spectacle frames
CN111033364A (en) Method, apparatus and computer program for virtually fitting spectacle frames
EP3289406A1 (en) Wearable devices such as eyewear customized to individual wearer parameters
NL2014891B1 (en) Method for manufacturing a spectacle frame adapted to a spectacle wearer.
US20220373820A1 (en) Method and apparatus for design and fabrication of customized eyewear
CN105008987A (en) Spectacle lens design system, supply system, design method, and production method
EP2972568A1 (en) Method and apparatus for design and fabrication of customized eyewear
CN116305364A (en) Glasses model generation method and electronic equipment
JP2005202292A (en) Generation system of design data, generation method of design data, recording medium and program
US10768452B2 (en) Method for ordering an optical equipment
US20220004020A1 (en) Method for producing at least one nose pad of view detection glasses
WO2019138515A1 (en) Image forming device, eyeglass lens selection system, image forming method, and program
KR102685196B1 (en) Method and apparatus for recommanding customized eyeglasses based on artificial intelligence
WO2024118525A1 (en) Population based eyewear fitting
US20230113713A1 (en) Goggle customization system and methods
CN117390718A (en) Glasses model generation method, device and equipment
CN116449582A (en) Glasses customization method, glasses customization system and customized glasses
EP4078278A1 (en) Method and system for determining a fitted position of an ophthalmic lens with respect to a wearer referential and method for determining a lens design of an ophthalmic lens
KR20220100060A (en) Method and system for providing spectacle frames

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination