EP3443409A1 - Method for determining optical parameters of a subject and computer program product for carrying out the method
- Publication number
- EP3443409A1 (application EP17713579.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- image
- optical parameters
- subject
- user
- image selection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G02—OPTICS
- G02C—SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
- G02C13/00—Assembling; Repairing; Cleaning
- G02C13/003—Measuring during assembly or fitting of spectacles
- G02C13/005—Measuring geometric parameters required to locate ophtalmic lenses in spectacles frames
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/11—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
Definitions
- the present invention relates to a method for determining optical parameters of a subject and to a computer program product for carrying out the method.
- a plurality of optical parameters must be determined: on the one hand, to obtain the information required for grinding the lenses and inserting them into the frame; on the other hand, to optimize the lens itself and adapt it optimally to the subsequent wearing position or position of use.
- centering data, such as the lens length, the lens height, the distance between the lenses of a pair of spectacles and the decentration of a centering point, are required for centering.
- These and other optical parameters that can be used, or are necessary, in particular to describe the position of use are contained in the relevant standards, such as DIN EN ISO 13666, DIN 58208, DIN EN ISO 8624 and DIN 5340, from which they can be taken.
- the determination of the individual parameters and the centering data can in principle be done manually with a variety of measuring devices, whereby the value read off is in each case immediately available.
- each parameter to be determined manually requires a separate test procedure, which in turn can have a great dependence on the person performing the procedure as well as, in principle, significant errors, for example due to parallax.
- video centering systems can be used to determine individual parameters and centering data, which determine individual or even all parameters by taking an image of a person with spectacle frame and with the aid of possibly further aids.
- a video centering system is described e.g. in EP 1 844 363 B1.
- the video centering system described therein is based on a stereo camera system and a mirror-image fixation, which allows excellent results with a single measurement acquisition.
- the result is determined by an evaluation.
- the evaluation is a sequence of manual selection steps in which predetermined features are selected in the image.
- the operator or user of the video centering system can be supported by image processing. Once all predetermined selection steps have been completed, the corresponding parameters are determined and displayed.
- the accuracy is primarily defined by the measurement method.
- the method used by Rodenstock on the basis of a stereo camera system and mirror-image fixation provides excellent results with a single measurement acquisition. Other methods suffer from inherent errors due to model assumptions or the loss of information through projection onto 2D. The duration of a measurement is influenced by various factors.
- a first independent aspect for achieving the object relates to a method for determining optical parameters of a subject by means of a video centering system, comprising the steps:
- the evaluation of the generated image data comprises a computer-aided automatic image processing of the image data and the execution of a user-definable number of a set of predetermined manual image selection steps, wherein:
- the number of iteration steps performed in the iterative determination of the optical parameters depends on the number of manual image selection steps performed by the user.
- a subject is, in particular, a spectacle wearer or a patient for whom a pair of spectacles is to be manufactured or adapted.
- a user is understood to mean a person who operates the video centering system.
- a user can e.g. be an optician or ophthalmologist. However, it is also possible for the subject to operate the video centering system himself, in which case the subject and the user are the same person.
- the generation of image data takes place in particular by recording digital images, preferably with the aid of a stereo camera system.
- the image data or the digital images are preferably generated from at least two different recording directions.
- the images can be recorded in succession with only a single camera, or simultaneously with several, in particular two, cameras located at different positions.
- two different recording directions mean that different image data are generated from overlapping partial regions of the head, preferably from one and the same partial region of the head; in particular, image data of identical partial regions of the subject's head are generated under different perspective views. As a result, the imaged portion of the head is the same, but the image data differ.
- Different recording directions can also be achieved, for example, by the image data being generated by at least two image recording devices or cameras, wherein effective optical axes of the at least two image recording devices are not parallel.
- the image data or the digital images are generated with the aid of a stereoscopic image recording device.
- the video centering system is preferably based on a stereoscopic image acquisition.
- the image data includes digital stereoscopic images.
- the captured images correspond to the generated image data.
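As a non-authoritative illustration of the spatial information that stereoscopic image data can yield, the following sketch triangulates a point from an idealized, rectified pinhole stereo pair. The focal length, baseline and pixel coordinates are invented for illustration; the patent does not specify the camera geometry in this form.

```python
# Toy triangulation for an idealized rectified stereo pair.
# All numbers are illustrative, not taken from the patent.

def triangulate(x_left: float, x_right: float, y: float,
                focal_px: float, baseline_mm: float):
    """Recover a 3D point (in mm) from pixel coordinates in two views."""
    disparity = x_left - x_right              # shift between the two images
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    z = focal_px * baseline_mm / disparity    # depth
    x = x_left * z / focal_px                 # lateral position
    y3d = y * z / focal_px                    # vertical position
    return x, y3d, z

# Example: a pupil seen at x=120 px (left) and x=100 px (right), y=10 px,
# with an 800 px focal length and a 100 mm stereo baseline.
x, y, z = triangulate(120.0, 100.0, 10.0, focal_px=800.0, baseline_mm=100.0)
print(x, y, z)  # 600.0 50.0 4000.0
```

A real system would first calibrate and rectify both cameras; the sketch only shows why two non-identical recording directions are enough to recover depth.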
- the pupils of the subject and a spectacle frame edge or a spectacle lens edge are imaged, wherein in the generated image data, the pupils of the subject are bounded by the spectacle frame edge or the spectacle lens edge.
- spectacles in the sense of the present description do not necessarily have to have spectacle lenses, but may merely be a spectacle frame. Alternatively, the subject may wear a spectacle frame with associated spectacle lenses.
- Automatic or machine image processing of the image data comprises, in particular, known pattern recognition algorithms with which certain elements or features of an image can be recognized and / or marked by software or computer assistance.
- image processing includes algorithms that can be used to search for specific shapes, points and / or environments of points.
- Computer-assisted automatic image processing preferably includes an automatic detection of the pupil centers of the subject and an automatic detection of one or more frame edge points and/or lens edge points, such as an upper, lower, nasal or temporal frame edge point and/or spectacle lens edge point, of the spectacles or spectacle frame of the subject.
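The patent leaves the pattern recognition algorithms open. As a toy sketch of one naive approach, a pupil candidate can be located as the centroid of the darkest pixels in a grayscale image; real video centering systems use far more robust detectors, and the image and threshold below are invented for illustration.

```python
# Naive illustration of automatic pupil detection: centroid of dark pixels.
# Not the patent's method - only a sketch of the principle of detecting
# a predetermined feature (here: a dark pupil) by software.

def dark_centroid(image, threshold=50):
    """Return the (row, col) centroid of pixels darker than `threshold`,
    or None if no pixel is dark enough."""
    rows, cols, n = 0.0, 0.0, 0
    for r, line in enumerate(image):
        for c, value in enumerate(line):
            if value < threshold:
                rows += r
                cols += c
                n += 1
    if n == 0:
        return None
    return rows / n, cols / n

# Synthetic 5x5 "eye" image: bright background (200) with a dark pupil (10).
img = [[200] * 5 for _ in range(5)]
img[2][2] = img[2][3] = 10
print(dark_centroid(img))  # (2.0, 2.5)
```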
- evaluation data are preferably generated which comprise 3D data or spatial information in three-dimensional space of predetermined points of the system of the subject's head and the spectacles arranged thereon in the position of use.
- the optical parameters are determined in particular on the basis of the generated evaluation data or the location information.
- the evaluation data generated by means of the evaluation of the image data comprise location information for at least one of the following points:
- the horizontal plane of the subject cuts both pupils of the subject and runs parallel to a predetermined zero line of sight of the subject;
- dimensioning in box size is understood to mean the measuring system as described in the relevant standards, for example in DIN EN ISO 8624 and/or DIN EN ISO 13666 and/or DIN 58208 and/or DIN 5340. Further, with regard to the box size and the other conventional terms and parameters used, reference is made to the book "The Optics of the Eye and the Visual Aids" by Dr. Roland Enders, 1995 Optische Fachveröffentlichung GmbH, Heidelberg, as well as to the book "Optics and Technology of the Spectacles" by Heinz Diepes and Ralf Blendowske, 2002 Verlag Optische Fachveröffentlichung GmbH, Heidelberg. The standards as well as the named books represent an integral part of the disclosure of the present application for the definitions of terms.
- the limits in the case of dimensioning in box size are those points of one eye or of both eyes which lie furthest to the outside or inside and/or top or bottom. These delimiting points are conventionally determined by tangents to the spectacle frame or to the respective areas of the spectacle frame assigned to the respective eyes (compare DIN 58208, Figure 3).
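Under these box-size definitions, the lens length SL and lens height SH follow directly from the extreme frame edge points (the tangents to the frame edge). A minimal sketch, with invented point coordinates in mm:

```python
# Box dimensions per the box-size measuring system: SL and SH are the
# horizontal and vertical extents of the rectangle circumscribing the
# frame edge. Edge points below are illustrative.

def box_dimensions(edge_points):
    """edge_points: list of (x, y) frame edge points of one lens."""
    xs = [p[0] for p in edge_points]
    ys = [p[1] for p in edge_points]
    sl = max(xs) - min(xs)   # distance between the vertical tangents
    sh = max(ys) - min(ys)   # distance between the horizontal tangents
    return sl, sh

edge = [(0.0, 10.0), (52.0, 12.0), (25.0, 0.0), (27.0, 30.0)]
sl, sh = box_dimensions(edge)
print(sl, sh)  # 52.0 30.0
```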
- the zero-sighting direction in the sense of this invention is a line of sight straight ahead with parallel fixing lines. In other words, it is a viewing direction defined by a position of the eye relative to the subject's head, the eyes looking at an object that is at eye level and located at an infinitely distant point. Consequently, the zero-sighting direction in the sense of this invention is determined only by the position of the eyes relative to the head of the subject. If the subject's head is in a normal upright posture, the zero-sighting direction substantially corresponds to the horizontal direction in the frame of reference of the earth.
- the zero-sighting direction may, however, be tilted relative to the horizontal direction in the frame of reference of the earth, for example if the subject inclines his head forward or to the side without a compensating movement of the eyes.
- the zero-sighting directions of the two eyes span a plane which, in the reference system of the earth, is essentially parallel to the horizontal plane.
- the plane spanned by the two null directions of the two eyes may also be inclined to the horizontal plane in the frame of reference of the earth, for example, if the subject tilts the head forward or to the side.
- the subject's horizontal plane corresponds to a first plane and the subject's vertical plane corresponds to a second plane perpendicular to the first plane.
- the horizontal plane in the reference frame of the subject may be parallel to a horizontal plane in the frame of reference of the earth and extend through the center of only one pupil. This is the case in particular if the two eyes of the subject are arranged at different heights (in the frame of reference of the earth).
- preferably, a three-dimensional representation of at least a subregion of the system comprising the head of the subject and the spectacles arranged thereon in the position of use is determined.
- from this three-dimensional representation, local relations in three-dimensional space of the evaluation data relative to one another can be determined in a simple manner, and the optical parameters of the subject can be determined therefrom.
- the evaluation of the image data further comprises executing a user-definable number of a set of a plurality of predetermined manual image-selection steps.
- a predetermined image selection step is understood to be a user interaction step in which the user is shown a recorded image and selects one or more predetermined elements, such as the pupil centers of the subject or frame edge or spectacle lens edge points of the spectacles.
- for this purpose, the user can also move displayed selection crosses, in particular using arrow keys on a keyboard and/or with a computer mouse.
- the user can also select the selection elements directly by clicking with the computer mouse.
- the image displayed to the user preferably already has a preselection of the element to be selected or of the elements to be selected on the basis of a preceding automatic image processing. This automatic preselection can be checked by the user and reworked if necessary.
- the image selection steps are predetermined with regard to the elements to be selected.
- the image selection steps may also be predetermined with regard to their sequence to be carried out, so that the individual image selection steps can be performed by the user only in a predetermined or predetermined order.
- the user can specify the number of image selection steps performed by him.
- the user-definable number x of the plurality N of predetermined manual image selection steps may be an integer between zero and the total number N of predetermined image selection steps, i.e. 0 ≤ x ≤ N, where x and N are integers.
- the user may, for example, decide to execute none, some, or all of the available predetermined manual image selection steps.
- in some embodiments, the user must execute a predetermined number of image selection steps, for example one or two image selection steps, and only then can decide whether and when to cancel the iterative determination.
- the user-definable number x of the plurality N of predetermined manual image selection steps may also be an integer between one and the total number N of predetermined image selection steps, i.e. 1 ≤ x ≤ N, or an integer between two and the total number N of predetermined image selection steps, i.e. 2 ≤ x ≤ N.
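The admissible range for x can be expressed, for a hypothetical implementation, as a one-line validation; the total N and the lower bound are configuration choices corresponding to the embodiments above, not values fixed by the patent.

```python
def validate_selection_count(x: int, total: int, minimum: int = 0) -> bool:
    """Check that the user-definable number x of manual image selection
    steps lies in the admissible range minimum <= x <= total (= N)."""
    return minimum <= x <= total

# With N = 9 predetermined steps and an embodiment requiring at least one:
print(validate_selection_count(0, 9, minimum=1))  # False
print(validate_selection_count(5, 9, minimum=1))  # True
```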
- the optical parameters are determined iteratively by evaluating the generated image data, wherein the number of iteration steps depends on the number of manual image selection steps performed by the user. In other words, the determination of the optical parameters takes place stepwise, i.e. the determination approaches the exact solution in repeated iteration or determination steps.
- the particular optical parameters can be updated.
- the updating may include optimizing, refining, enhancing, and / or replacing at least one of the optical parameters to be determined.
- initially estimated or initially assumed values of the optical parameters may be updated by means of the performed evaluation steps, i.e. with the help of one or more automatic image processing operations and/or one or more executed image selection steps, in particular on the basis of the evaluation data generated by the evaluation.
- it is also possible for the number of iteration or determination steps for the determination of the optical parameters to be only one, i.e. for a total of only one iteration or determination step to take place, so that no repeated or updated determination occurs.
- the determination of the optical parameters is based on automatic image processing only.
- if the number of manual image selection steps performed by the user is one, the number of iteration steps is preferably two. If the number of manual image selection steps performed by the user is two, the number of iteration steps, depending on the embodiment, may be equal to one or three. If the number of manual image selection steps performed by the user is greater than three, then the number of iteration steps is preferably greater than or equal to three and in particular greater than or equal to five.
- the method according to the invention thus advantageously allows the user a high degree of flexibility with regard to accuracy and speed, by letting him set the number of image selection steps he performs.
- the inventive method allows a faster determination of the optical parameters while taking the accuracy into account. If the user aims, for example, for the most accurate possible determination of the optical parameters, he can execute all predetermined image selection steps. However, if the user focuses on determining the optical parameters as quickly as possible, or if it is sufficient that only some of the optical parameters are determined with a certain accuracy, the user can speed up the process by performing none or only a portion of the predetermined manual image selection steps. By setting the number of image selection steps performed, the user can advantageously influence the accuracy and speed of the method according to the invention, i.e. of the determination of the optical parameters.
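The interplay of default values, evaluation steps and the user-definable step count can be sketched as an update loop. The parameter names, default values and update functions below are invented for illustration; the patent leaves the concrete update algorithms open.

```python
# Hedged sketch: iterative refinement of optical parameters. Each executed
# step refines the current estimates, so the number of iterations grows with
# the number of manual image selection steps the user performs.

def determine_parameters(defaults, update_steps, steps_to_run):
    """defaults: dict of initial optical parameters (e.g. statistical means).
    update_steps: ordered list of functions mapping params -> params.
    steps_to_run: user-definable number of steps actually executed."""
    params = dict(defaults)
    for step in update_steps[:steps_to_run]:
        params = step(params)   # each iteration refines the estimates
    return params

defaults = {"PD": 64.0, "SL": 50.0}        # assumed standard values, in mm
steps = [
    lambda p: {**p, "PD": 63.2},           # e.g. after pupil selection
    lambda p: {**p, "SL": 52.4},           # e.g. after frame edge selection
]
print(determine_parameters(defaults, steps, steps_to_run=1))
# after one step, PD is updated while SL still carries its default value
```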
- the accuracy of the method according to the invention or the specific optical parameters depends on the number of image selection steps carried out by the user.
- the accuracy of at least one of the optical parameters is increased.
- the user is thus able, with the method according to the invention, to select an optimum ratio of speed and accuracy for the respective application. This is advantageous, for example, if an application does not require the optical parameters to be determined with the highest possible accuracy, or if the user only wants to determine those optical parameters that are already output with sufficient accuracy after the automatic image processing and/or after a few manual image selection steps.
- the method for determining optical parameters of a subject by means of a video centering system may comprise in particular the following steps:
- evaluating the generated image data comprises computer-assisted automatic image processing of the image data and executing a user-definable number of a plurality of predetermined manual image-selection steps, and wherein:
- the accuracy and / or quality of the particular optical parameters depends on the number of manual image selection steps performed by the user.
- after each performed image selection step, a renewed determination or an updating, refinement and/or optimization of the optical parameters preferably takes place, in particular on the basis of the evaluation result or evaluation data available after the most recently executed image selection step.
- the method further comprises providing the determined optical parameters.
- the term "providing" includes, for example, displaying, outputting and/or making available the determined optical parameters, in particular with the aid of an interface, e.g. for consulting programs, ordering software or storage in a database; the data can thus be made available in the form of a display and/or via an interface.
- the optical parameters determined by the method according to the invention include both individual parameters and centering data.
- Individual parameters in the sense of this invention are, for example: the monocular pupillary distance (PD), i.e. the distance between the center of the frame and the center of the pupil when the eye is in the primary position; the corneal vertex distance (HSA), i.e. the distance between the back surface of the lens and the apex of the cornea, measured perpendicular to the frame plane; the pretilt (VN), i.e. the angle in the vertical plane between the optical axis of the lens and the fixation line of the eye in primary position; the frame lens angle (FSW), i.e. the angle between the frame plane and the right or left lens plane; the individual head posture, i.e. the head tilt and head rotation.
- PD: monocular pupillary distance
- HSA: corneal vertex distance
- VN: pretilt, i.e. the angle in the vertical plane between the optical axis of the lens and the fixation line of the eye in primary position
- FSW: frame lens angle, i.e. the angle between the frame plane and the right or left lens plane
- individual head posture, i.e. the head tilt and head rotation
- Centering data in the sense of this invention are, for example: the lens length (SL), i.e. the distance between the vertical sides of the rectangle circumscribing the frame edge; the lens height (SH), i.e. the distance between the horizontal sides of the rectangle circumscribing the frame edge; the distance between the lenses (AzG); the horizontal decentration of the centering point (u), i.e. the horizontal distance of the centering point from the geometric center of the lens; the vertical decentration of the centering point (v), i.e. the vertical distance of the centering point from the geometric center of the lens.
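Given the box rectangle circumscribing the frame edge, the decentration values u and v follow directly from these definitions. The coordinates below are invented for illustration (units: mm).

```python
# Decentration of the centering point relative to the geometric lens center,
# following the definitions above: u horizontal, v vertical distance.

def decentration(centering_point, box_min, box_max):
    """box_min/box_max: opposite corners of the rectangle circumscribing
    the frame edge; centering_point: (x, y) of the centering point."""
    center_x = (box_min[0] + box_max[0]) / 2   # geometric center of the lens
    center_y = (box_min[1] + box_max[1]) / 2
    u = centering_point[0] - center_x
    v = centering_point[1] - center_y
    return u, v

u, v = decentration(centering_point=(28.0, 19.0),
                    box_min=(0.0, 0.0), box_max=(52.0, 30.0))
print(u, v)  # 2.0 4.0
```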
- the method further comprises providing initial optical parameters, wherein determining the subject's optical parameters is based on the provided initial optical parameters and additional evaluation data generated by the evaluation.
- the additional evaluation data preferably comprises 3D data or spatial information in the three-dimensional space of predetermined points of the system of the head of the subject and the spectacles arranged thereon in the position of use.
- the initial optical parameters provided before the beginning of the evaluation preferably comprise estimated or assumed values, in particular standard values, for at least some or all of the optical parameters to be determined.
- the initial optical parameters or the standard values can result from average values of previous measurements or as empirical values.
- the initial optical parameters or the default values are preferably updated, improved, or replaced by the evaluation of the method according to the invention, in particular with the aid of the evaluation data generated by the evaluation.
- the default values are based e.g. on the values of non-individualized products or on statistical averages obtained in advance. Especially for HSA, VN and FSW, reasonable assumptions are available. For PD, SL and SH, the determination of the statistical means can, e.g., take the following criteria into account in order to better approximate the final value:
- auxiliary parameters derived from original data of a previous measurement such as the ratio SL / SH, or the ratio of the pupil distance to the grinding height, can be provided or used.
- by means of such auxiliary parameters it is advantageously possible to provide further initial optical parameters based on already existing default values.
- these auxiliary parameters can also be used in the evaluation of the image data.
- for example, with the auxiliary parameter or ratio SL/SH, the SH can be approximated if the SL has already been determined.
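This use of an auxiliary parameter amounts to a single division: once SL has been determined, a stored SL/SH ratio from earlier measurements yields an improved initial estimate of SH. The ratio value below is invented for illustration.

```python
# Estimate the lens height SH from a measured lens length SL via an
# auxiliary SL/SH ratio taken from previous measurements (illustrative).

def estimate_sh(sl_measured: float, sl_sh_ratio: float) -> float:
    """Approximate SH = SL / (SL/SH)."""
    return sl_measured / sl_sh_ratio

sh = estimate_sh(sl_measured=52.0, sl_sh_ratio=1.6)
print(round(sh, 2))  # 32.5
```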
- the initial optical parameters or the default values can be taken into account, for example, as follows: due to fads or fashion ideals, certain frames are popular during a certain fashion period, which is e.g. about one year. Thus, the mean values of the parameters AzG, SL or SH can also change after this fashion period. Since such a trend usually occurs only within a limited geographic area, a change in the parameters may only be observed locally (continent, country, city). A mean value for these parameters is therefore preferably calculated taking into account all recordings taken and evaluated in the local environment that are not older than e.g. one year.
- for the pupillary distance, e.g. gender, age and geographic information about the location where the measurement or the method according to the invention, i.e. the determination of the optical parameters, is performed can be taken into account or used.
- predetermined filters can thus be applied to measurements already carried out, ie to earlier measurement results.
- a meaningful determination and application of appropriate filters on already performed measurements is a separate task of the statistical data analysis.
- the filters can be dynamic, ie require a recalculation for a single measurement, or rely on a static model.
- Static in this context means that a filter with the same data at different places and times comes to the same result.
- Dynamic means that the filter takes place and / or time into account.
- a dynamic filter as described above can, for example, consider only data that lie within one year retroactively from the time of measurement, while older data are ignored.
- a recalculation is necessary if the data basis changes, eg if new data is added.
- recalculation is necessary for each measurement to account for the dynamically changing data basis.
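A dynamic filter of the kind described, restricting the mean of a parameter to prior measurements from the same local environment that are not older than one year, can be sketched as follows. The record layout and all values are invented for illustration.

```python
# Hedged sketch of a dynamic filter: mean of a parameter (e.g. SL) over
# measurements filtered by location and by a one-year retroactive window.

from datetime import date, timedelta

def dynamic_mean(records, key, location, today, max_age_days=365):
    """records: list of dicts with 'date', 'location' and parameter values.
    Returns the mean of records[key] within the window, or None."""
    cutoff = today - timedelta(days=max_age_days)
    values = [r[key] for r in records
              if r["location"] == location and r["date"] >= cutoff]
    return sum(values) / len(values) if values else None

records = [
    {"date": date(2023, 6, 1), "location": "DE", "SL": 50.0},
    {"date": date(2023, 8, 1), "location": "DE", "SL": 54.0},
    {"date": date(2021, 1, 1), "location": "DE", "SL": 60.0},  # too old
    {"date": date(2023, 7, 1), "location": "FR", "SL": 48.0},  # elsewhere
]
print(dynamic_mean(records, "SL", "DE", today=date(2024, 1, 1)))  # 52.0
```

Because the time window moves with `today`, the filter must be re-evaluated for each measurement, which is exactly the recalculation requirement stated above.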
- a digital description of the frame shape can also be used to provide the initial optical parameters.
- the generation of image data preferably takes place by recording a first image of at least partial regions of the system comprising the head of the subject and a pair of spectacles arranged thereon in a first recording direction, and recording a second image of at least partial regions of the system in a second recording direction, wherein the first and the second recording direction differ from each other.
- the evaluation of the image data begins with a determination of positions of the pupils of the subject in the first image.
- a first evaluation step may include determining the position of a first pupil of the subject in the first image and determining the position of a second pupil of the subject in the first image.
- a second evaluation step which follows immediately after the first evaluation step, a determination of the position of the first pupil of the subject in the second image and a determination of the position of the second pupil of the subject in the second image can take place.
- Both the first evaluation step and the second evaluation step thus each comprise two evaluation sub-steps, namely the determination of the position of the first pupil and the determination of the position of the second pupil. Determining the positions of the pupils in the first and the second image can each take place on the basis of automatic image processing and/or at least one image selection step executed by the user.
- an optimal evaluation order is achieved with regard to the iterative or updated determination of the optical parameters.
- the accuracy of the iteratively determined or provided parameters can be increased rapidly and effectively, since 3D coordinates are already obtained at an early point in time.
- by determining the pupil positions in both images, e.g. a total pupillary distance can already be determined, which can be provided in updated form.
- an initial default value of the total pupil distance can be updated or replaced.
- an additionally detected head rotation could, e.g., be taken into account as a correction value.
- an updated monocular pupillary distance can also be determined or provided by halving the total pupillary distance.
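The chain of updates described above can be sketched as follows: from the 3D pupil positions an overall pupillary distance is obtained, and halving gives a first monocular PD estimate. The pupil coordinates are invented, and the cosine head-rotation correction is an assumption for illustration, not a formula taken from the patent.

```python
# Hedged sketch: total PD from 3D pupil centers, then a first monocular
# PD estimate by halving, optionally de-rotated for a detected head turn.

import math

def total_pd(pupil_left, pupil_right):
    """Euclidean distance between the two 3D pupil centers (mm)."""
    return math.dist(pupil_left, pupil_right)

def mono_pd_estimate(pd_total, head_rotation_deg=0.0):
    """First estimate: half the total PD. The cosine correction for head
    rotation is an illustrative assumption."""
    return (pd_total / 2.0) / math.cos(math.radians(head_rotation_deg))

pd = total_pd((-31.0, 0.0, 400.0), (31.0, 0.0, 400.0))
print(pd, mono_pd_estimate(pd))  # 62.0 31.0
```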
- model-based HSA values can be updated.
- a user-executable first image selection step comprises manually selecting positions of the pupil centers of the subject in the first image.
- a second image selection step performed or executed by the user immediately after the first image selection step, comprises manually selecting positions of the pupils or pupil centers of the subject in the second image.
- the first image selection step comprises a first image selection substep, namely, manually selecting the position of the first pupil in the first image, and a second image selection substep, namely, manually selecting the position of the second pupil in the first image.
- the second image selection step also comprises a first image selection substep, namely, manually selecting the position of the first pupil or pupil center in the second image, and a second image selection substep, namely, manually selecting the position of the second pupil or pupil center in the second image.
- a third image selection step, performed by the user immediately after the second image selection step, comprises manually selecting preferably temporal, or more preferably nasal, frame edge points of the spectacles worn by the subject in the first image.
- a fourth image selection step, performed by the user immediately after the third image selection step, comprises manually selecting preferably temporal, or more preferably nasal, frame edge points of the spectacles worn by the subject in the second image.
- preferably, the temporal frame edge points are selected as first selection elements, or more preferably the nasal frame edge points are selected as second selection elements.
- a further improved estimation of the mono-PD can be made and provided.
- Such an iterative determination can be continued accordingly for the remaining optical parameters.
- one or more image selection steps executable or executed by the user immediately after the fourth image selection step comprise manually selecting further, in particular temporal, nasal, lower and/or upper, frame edge points of the spectacles worn by the subject in the first and/or second image.
- further image selection steps to be performed or executed by the user in the given order comprise: manually selecting a frame horizontal of the spectacles worn by the subject in the first image,
- a manual selection of at least one temporal-upper box dimension of the spectacles worn by the subject in the first image. Preferably, two nasal-lower box dimensions are selected, namely one for the left and one for the right spectacle lens.
- likewise, two temporal-upper box dimensions, namely one for the left and one for the right spectacle lens, are preferably selected.
- an associated automatic image processing is carried out immediately before each executable manual image selection step.
- an automatic preselection of the image elements to be selected in the respective image selection step can take place by means of automatic image processing associated with an image selection step. The user then need only check the automatic preselection and correct if necessary. This facilitates user interaction.
- the iterative determination of the optical parameters also takes place on the basis of additional statistical data, in particular on the basis of the abovementioned auxiliary parameters, which describe a dependency between two or more of the optical parameters.
- the statistical data may describe a dependency between SL and SH. This is helpful, for example, if the SL has already been largely determined after the temporal and nasal frame edge points have been selected, while for SH only the original default values or mean values are available. In this way, the accuracy of further optical parameters can be quickly increased without an additional manual image selection step being necessary. In particular, the quality of the automatic preselection can be improved, whereby the corrections to be made by the user are further reduced.
- preferably, the determined or updated optical parameters of the subject are provided, i.e. output or displayed, together with an indication of the expected accuracy of the determined optical parameters.
- an additional value to the parameter may inform about its quality, for example an indication between 0% and 100%.
- the indication can also be provided without an explicit label, e.g. by being described in the manual, so that the user or operator knows from reading it that an evaluation reaches maximum accuracy only after full execution of all steps, and that in case of premature termination the values are indefinite within certain limits.
- the determined optical parameters are displayed after each executed iteration step, in particular after each executed automatic image processing and / or after each executed image selection step, optionally with an indication of the accuracy or quality.
- the determined optical parameters may be displayed on a computer display. Based on the currently displayed results for the optical parameters, the user can decide whether further manual image selection steps are necessary to further increase the accuracy of the parameters, or whether the accuracy is already sufficient so that he can terminate the determination process prematurely.
- Another independent aspect for achieving the object relates to a computer program product comprising program parts which, when loaded in a computer, are designed to carry out a method according to the invention.
- Figure 1 shows a schematic flow diagram for determining optical parameters according to a conventional approach
- Figure 2 is a schematic flow diagram for determining optical parameters according to a preferred embodiment of the present invention
- FIG. 3 shows a schematic diagram of the evaluation order in the determination of optical parameters according to a preferred embodiment of the present invention
- FIG. 4 shows exemplary values of optical parameters which were determined according to different evaluation steps of the method according to the invention.
- the positional indications chosen in the present specification each relate to the figure immediately described and illustrated and are to be transferred mutatis mutandis to the new situation in the event of a change of position.
- the following description of the method according to the invention relates, for example, to a video centering system which is suitable for recording stereoscopic images with the aid of a first camera and a second camera, i.e. a video centering system with a stereo camera system. It is understood, however, that the method according to the invention can also be used with other video centering systems, e.g. systems with only one camera, with which all optical parameters can be determined. Where this description refers to a "frame edge", this term is to be replaced by the term "lens edge" in the case of rimless frames.
- "K1" denotes a first camera view and "K2" denotes a second camera view.
- "K1" also denotes a first camera image, i.e. a first image taken with the first camera, and
- "K2" also denotes a second camera image, i.e. a second image taken with the second camera. The upper boxes each refer to an image selection step and are therefore each marked with the numbered reference symbol "BS".
- the abbreviation "pupils K1" of the image selection step BS1 or BS10 means that in this step the user can select the pupils of the subject shown in the first camera view K1.
- the middle boxes each refer to an automatic image processing and are therefore designated by the numbered reference symbol "BV".
- the designation "image processing frame edge++" used in FIG. 2 means that one or more predetermined points or elements of the frame edge or lens edge, e.g. upper, lower, nasal or temporal frame edge points or lens edge points, or also a frame horizontal or box dimensions, are automatically detected.
- the lower boxes each refer to the specific optical parameters.
- FIG. 1 shows a schematic flow chart for determining optical parameters according to a conventional procedure.
- in a measurement recording 1, image data are generated by recording a first image with a first camera K1 and a second image with a second camera K2.
- the first and the second camera are located at different locations, so that images of the subject's head are taken from different perspectives, eg with the first camera K1 from the front and with the second camera K2 from below.
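For a stereo camera system of this kind, a 3D position can be recovered from the two views by triangulation. The following is a minimal sketch assuming an idealized parallel-camera geometry with known focal length and baseline; the system described here (front and below views) would use a general calibrated setup instead, and all numbers are illustrative.

```python
def triangulate_depth(focal_px, baseline_mm, disparity_px):
    """Depth of a point seen by two parallel cameras: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# e.g. a pupil whose image positions differ by 40 px between the two views
depth_mm = triangulate_depth(focal_px=800, baseline_mm=120, disparity_px=40)
```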
- the evaluation of the generated image data or the recorded images takes place.
- in a computer-aided automatic image processing BV1, the pupils of the subject are detected in both the first and the second image using appropriate image or pattern recognition algorithms.
- a user interaction or an image selection step BS1 to be executed by the user takes place, in which the user selects the pupils of the subject in the first image taken with the first camera K1.
- in a further computer-aided automatic image processing BV2, the frame edge or the lens edge of the spectacles worn by the subject is detected in both the first and the second image using corresponding image or pattern recognition algorithms.
- a further user interaction or a further image selection step BS2 to be performed by the user takes place, in which the user selects predetermined frame edge points or lens edge points of the spectacles worn by the subject in the first image taken with the first camera K1.
- in the image selection step BS3, the user thus selects the pupils of the subject in the second camera image K2, while in the subsequent image selection step BS4 the user selects predetermined frame edge points or lens edge points of the spectacles worn by the subject in the second camera image K2.
- from the evaluation data generated thereby, the individual parameters PD, HSA, VN, FSW, head rotation and head tilt can be determined.
- FIG. 2 shows a schematic flow diagram for determining optical parameters according to a preferred embodiment of the present invention.
- the starting point of the method is a measurement recording 1, in which image data are generated by recording a first image B1, for example with a first camera and a second image B2, eg with a second camera.
- a multiplicity of user interaction steps or image selection steps BS10 to BS22 are also predetermined for the determination of the optical parameters.
- all the image selection steps shown in FIG. 2 are optional, i.e. the user of the video centering system can decide for himself whether and how many of the given, possible image selection steps he performs. In other words, he can set the number of image selection steps performed. With the number of image selection steps carried out, the user can influence in particular the number of executed iteration steps and/or the accuracy or quality of the determined optical parameters and thus also the speed of the parameter determination.
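The optional, user-controlled iteration described above can be sketched as a loop that alternates automatic image processing with manual image selection and allows early termination. The function names and the structure of the parameter set are illustrative assumptions:

```python
def determine_parameters(processing_steps, selection_steps, accurate_enough):
    """Alternate automatic image processing (BV...) with optional manual
    image selection steps (BS...); after each pair all parameters are
    re-estimated, and the user may stop once the accuracy is sufficient."""
    params = {}
    for process, select in zip(processing_steps, selection_steps):
        params = process(params)       # automatic image processing
        if accurate_enough(params):    # user judges the displayed accuracy
            break                      # early termination
        params = select(params)        # manual image selection by the user
    return params
```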
- the individual specific optical parameters are each provided with numbers in square brackets. These numbers refer to the accuracy or quality of the particular optical parameters.
- the accuracy or quality is therefore lowest for certain optical parameters marked with [1] and highest for certain optical parameters marked with [4]. Furthermore, the accuracy or quality of the particular optical parameters marked with [2] is less than the accuracy or quality of the particular optical parameters marked with [3].
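The bracketed accuracy numbers can be read as an ordered scale. The stage names below are an illustrative mapping based on the description's wording ("zeroth estimate" up to the final value), not terminology fixed by the text:

```python
# [1] = lowest accuracy ... [4] = highest accuracy, as in FIG. 2.
ACCURACY_STAGES = {
    1: "zeroth estimate (lowest accuracy)",
    2: "first estimate",
    3: "second estimate",
    4: "final value (highest accuracy)",
}

def more_accurate(stage_a, stage_b):
    """True if stage_a denotes strictly higher accuracy than stage_b."""
    return stage_a > stage_b
```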
- a computer-aided automatic image processing of the generated image data or the recorded images is performed.
- the pupils of the subject are automatically detected both in the first and in the second recorded image.
- in a second image processing step BV12, the frame edge, or predetermined points or elements of the frame edge, of the spectacles worn by the subject is detected in both the first and the second recorded image using image or pattern recognition algorithms.
- all optical parameters can be determined and output fully automatically. This determination, which is based on only one determination or iteration step, thus represents a zeroth estimate for all optical parameters and corresponds to the lowest accuracy.
- the determination may in particular be model-free, i.e. without any assumptions.
- the starting point is formed by initial optical parameters, i.e. estimated values or default values for the optical parameters to be determined, for example:
- VN 7 °
- HSA 13 mm
- FSW 5 °
- PD 64 mm.
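The default values listed above could be represented as follows; the dictionary layout and the accuracy tag are assumptions for illustration, and the commented expansions of the abbreviations are conventional readings, while the values themselves are taken from the description:

```python
# Initial default values per parameter (value, unit), as given in the text.
DEFAULTS = {
    "VN": (7.0, "deg"),   # pre-tilt
    "HSA": (13.0, "mm"),  # corneal vertex distance
    "FSW": (5.0, "deg"),  # frame disc angle
    "PD": (64.0, "mm"),   # pupillary distance
}

def initial_parameters():
    """Start every parameter at its default, tagged with the lowest
    accuracy stage (the zeroth estimate)."""
    return {name: {"value": value, "unit": unit, "accuracy": 1}
            for name, (value, unit) in DEFAULTS.items()}
```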
- these model assumptions or default values can be improved by automated image processing.
- after pupil detection, a suggestion can already be made as to where the pupils are located. On this basis, the total PD can already be provided in optimized form. This also applies to the further selection elements of the frame edge and the parameters derived therefrom.
- the initial optical parameters can therefore be adapted or improved by the automatic image processing.
- the initial optical parameters HSA and PD are adjusted on the basis of the detected pupils.
- the user may perform an optional image selection step BS10 by selecting the subject's pupils in the first camera image K1.
- thereafter, the further image processing steps BV14 with regard to the pupils and BV16 with regard to the frame edge can be carried out, so that finally, on the basis of the evaluation data generated thereby, all output or displayed optical parameters can be updated in a second determination step or iteration step.
- the user may at this point abort the determination process, if the accuracy of the particular optical parameters is already sufficient, or else execute the next predefined or possible image selection step BS12.
- the image processing step BV18 is subsequently performed on the frame edge.
- all specific optical parameters can be updated.
- the total pupil distance can already be output with increased accuracy, i.e. in the first estimate, while e.g. the head tilt and head rotation are still based on a lower accuracy, namely the zeroth estimate.
- the user can abort the determination process, provided the accuracy of the optical parameters determined hitherto is already sufficient, or else execute the next predefined or possible image selection step BS14.
- if the user executes the image selection step BS14 by selecting one or more predetermined frame edge points, e.g. nasal frame edge points, of the subject's spectacles in the first camera image K1, then a further image processing step BV20 is performed with respect to the frame edge. With the evaluation data generated thereby, in turn, all determined optical parameters can be updated. In addition, after the previously executed steps, e.g. the mono pupil distance and the value for AzG can be output or displayed with increased accuracy, namely in the first estimate. At this point, the user can abort the determination process if the accuracy of the optical parameters determined up to that point is already sufficient, or else execute the next predefined or possible image selection step BS16.
- if the user executes the image selection step BS16 by now also selecting, in the second camera image K2, the frame edge point selected in the previous image selection step BS14, then a further image processing step BV22 is performed with respect to the frame edge.
- the user may at this point abort the determination process, provided that the accuracy of the optical parameters determined hitherto is already sufficient, or else execute the next predetermined or possible image selection step.
- further image selection steps and image processing steps can be performed at this point, which proceed analogously to BS14, BV20, BS16 and BV22, but relate to further predetermined frame edge points, in particular to temporal, lower and / or upper frame edge points.
- These further steps are preferably carried out alternately between camera view 1 and camera view 2 for each further preset frame edge point.
- the nasal, upper and lower frame edge points are selected, so that with the evaluation data generated thereby, in turn, all specific optical parameters can be updated.
- the head tilt and the head rotation are displayed with increased accuracy, namely in the first estimate, while e.g. the lens disc angle, the disc height SH, the insertion height y and the value for v are still based on a lower accuracy, namely the zeroth estimate.
- if the user executes the further predetermined image selection step BS18 by selecting the frame horizontal of the subject's spectacles in the first camera image K1, then another image processing step BV24 is performed with respect to the frame edge.
- all specific optical parameters can be updated.
- according to the steps performed so far, the overall pupil distance can be output or displayed with further increased accuracy, namely in the second estimate, and the head inclination already with the highest accuracy, i.e. as the final value.
- at this point, the user can abort the determination process if the accuracy of the previously determined optical parameters is already sufficient, or perform the next predetermined or possible image selection step BS20.
- if the user executes the further predetermined image selection step BS20 by selecting the lower nasal box dimension of the subject's spectacles in the first camera image K1, then another image processing step BV26 is performed with respect to the frame edge.
- all specific optical parameters can be updated.
- after the steps performed so far, for example, the mono pupil distance, the HSA, the pre-tilt and the frame disc angle can be output or displayed with further increased accuracy, namely in the second estimate, and the insertion height y and the value for AzG already with the highest accuracy, i.e. as final values.
- the user can abort the determination process if the accuracy of the optical parameters determined up to that point is already sufficient, or else carry out the next predefined or possible image selection step BS22. If the user executes the further predetermined image selection step BS22 by selecting the upper temporal box dimension of the subject's spectacles in the first camera image K1, then all the optical parameters can be output or displayed with the highest accuracy, i.e. as final values.
- the image processing steps BV12 and BV16 and the associated determination steps or iteration steps are optional, i.e. it can be specified in one possible embodiment of the method according to the invention that the user must execute at least the image selection steps BS10 and BS12.
- the determination step following the image processing step BV18 is then not the third but the first determination step or iteration step of the determination method.
- it is advantageous to modify the evaluation sequence in comparison to the conventional method so that 3D data can already be obtained at an early point in time.
- the pupil positions in both images are first determined. From this an overall PD can already be determined, which can be provided updated.
- an additionally determined head rotation, for example, could still be taken into account as a correction value.
- An updated mono-PD can also be provided by halving the overall PD.
- an HSA value can already be updated.
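The first mono-PD estimate described above (halving the total PD, with a determined head rotation as a possible correction) can be sketched as follows; the cosine correction is an illustrative assumption, not a formula given in the text:

```python
import math

def mono_pd_from_total(total_pd_mm, head_rotation_deg=0.0):
    """First estimate of the monocular PD: half the total PD, optionally
    corrected for a determined head rotation (assumed cosine model)."""
    corrected = total_pd_mm * math.cos(math.radians(head_rotation_deg))
    return corrected / 2.0
```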
- the temporal or, preferably, the nasal frame edge points are selected. From this, a further estimate of the mono-PD can be made and provided. This determination can be continued accordingly for the other values. In this case, the values not yet finally determined can be provided again in updated form with each further known measuring point.
- statistical data describing a dependency between the parameters, for example between SL and SH, can once again be included in this updated provision if the SL has already been largely determined after determination of the temporal and nasal frame edge points, but for SH only the original mean values are available.
- the provision of the specific optical parameters is preferably carried out with an indication of the expected accuracy or quality of the information.
- FIG. 3 shows a schematic diagram of the evaluation order in the determination of optical parameters according to a preferred embodiment of the present invention.
- a total of seven different image selection steps A to G, with the points or outline elements to be selected by the user, can be performed by the user after a first automatic image preprocessing. The third column indicates in which camera view or in which images the respective image selection steps are executed.
- the image selection steps A to D are respectively performed alternately in the first camera image and the second camera image.
- the image selection steps E to G are executed only in the first camera image.
- in the right-hand column of FIG. 3, it is indicated for some of the optical parameters in which accuracy they can be determined after the respectively executed image selection steps.
- the accuracy of the optical parameters is also indicated here by a digit from 1 to 4 behind each optical parameter, where 1 means the lowest and 4 the highest accuracy.
- all optical parameters can be determined or provided with the lowest accuracy, i.e. in the zeroth estimate.
- the accuracy of at least some of the optical parameters can be successively increased until at the end, i.e. after the executed image selection step G, all parameters are determined with the highest accuracy.
- in the image selection step A, the right pupil 11 and the left pupil 12 of the subject are selected in both camera images.
- in the image selection step B, the right nasal frame edge point 13 and the left nasal frame edge point 14 of the subject's spectacles are respectively selected in both camera images.
- in the image selection step C, the right upper frame edge point 15, the left upper frame edge point 16, the left lower frame edge point 17, and the right lower frame edge point 18 of the subject's spectacles are selected in both camera images.
- in the image selection step D, the right temporal frame edge point 19 and the left temporal frame edge point 20 of the subject's spectacles are selected in both camera images.
- in the image selection step E, the frame horizontal 21 is selected in the first camera image.
- in the image selection step F, the nasal lower box dimension 22 of the right spectacle lens and the nasal lower box dimension 23 of the left spectacle lens are respectively selected in the first camera image.
- in the image selection step G, the temporal upper box dimension 24 of the right spectacle lens and the temporal upper box dimension 25 of the left spectacle lens are selected in the first camera image.
- all not yet finally determined optical parameters can be redetermined or updated using the additionally obtained information. It is also possible that the image selection steps B, C and D are performed in an order other than that shown in FIG. 3.
- Table 1 below shows a further exemplary sequence of image selection steps.
- the number of the step in the first column, the selection element to be selected in the second column, the dependencies of already selected elements in the third column, and the corresponding 3D point to be calculated are listed in the fourth column.
- the numbers 1 and 2 refer respectively to the camera view, i.e. to whether the selection step is executed in the first or the second image.
- the term "left” and “right” refers to the left and right pupils and the left and right lenses, respectively.
- a selection and evaluation of the pupils takes place first. After step 4, the total pupil distance can thus already be determined.
- the next step is the selection and evaluation of nasal border margins. This advantageously allows a fast approximate determination of the monocular pupillary distance.
- the frame disc angle can be approximately determined at the level of the PD.
- the pre-tilt can be determined on the left, while the left disk length, the left disk height and the monocular pupil distance can be estimated with increased accuracy.
- the right pre-tilt, the disc angle and the monocular pupil distance can be approximately determined.
- the disc length on the right and the disc height on the left can be determined.
- an alternative exemplary sequence of image selection steps is shown in Table 2 below. The representation and the designations correspond to those of Table 1. Table 2:
- the total pupil distance can thus already be determined.
- as in Table 1, the next step is the selection and evaluation of the nasal frame edge points. This advantageously allows a fast approximate determination of the monocular pupillary distance.
- an evaluation of the temporal frame edge points is carried out in order to use an approximately determined FSW for the further estimation of the monocular pupillary distance. After the eighteenth step, in particular, the pre-tilt on the left can be determined.
- the user selection may proceed as shown in the following Table 3.
- the designations of the selection elements in Table 3 correspond to those of Tables 1 and 2.
- in each case, a computer-based suggestion of one or more selection elements is displayed on a screen simultaneously. In other words, the simultaneous display of specific selection elements on the screen takes place in each case in an associated display step.
- a computer-based suggestion for the left and right pupils in the first camera view is displayed.
- a computer-based proposal for the left and right pupils in the second camera view is displayed.
- a computer-based proposal of the selection elements of the spectacle frame listed in Table 3 is displayed in the first camera view.
- a computer-based proposal of the selection elements of the spectacle frame listed in Table 3 is displayed in the second camera view.
- a computer-based proposal of the upper frame horizontal is shown in the first camera view.
- a computer-based proposal of the box dimensions given in Table 3 is shown in the first camera view.
- a computer-based suggestion of the left and right frame form is displayed in the first camera view.
- the user can then, preferably in any order or alternatively in a predetermined order, manually select the respective selection elements.
- the proposed selection elements can serve as a guide. Once all selection elements of a display step have been selected by the user, the method continues with the next display step. Table 3:
- FIG. 4 shows exemplary values of optical parameters which were determined according to different evaluation steps of the method according to the invention.
- initial optical parameters are shown as a starting point without evaluation. Since these are only estimates, these values may be e.g.
- the values for PD, SL, SH and AzG can be updated or improved in accuracy; the updated values can be displayed, for example, in the color "orange".
- the values for PD, HSA and AzG can be updated or improved.
- the values for HSA can therefore be updated. As the parameters PD and AzG have already reached the highest accuracy, the corresponding values can be represented, for example, in the color "green".
- the values for HSA, FSW, SL, SH, u and v can be updated or improved. Since the highest accuracy for the parameters PD, FSW, SL and AzG has already been reached at this time, the corresponding values can be displayed accordingly. The determination of the optical parameters can be continued in this way until finally all the parameters are determined with the highest accuracy and are thus output in the color "green".
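The colour coding sketched in the description (e.g. "orange" for updated values, "green" for final values) could be implemented as a simple mapping from the accuracy stage; the colour used for values still at their initial defaults is an assumption:

```python
def display_colour(accuracy_stage, final_stage=4):
    """Map an accuracy stage to a display colour: green for final values,
    orange for updated-but-not-final values, grey (assumed) otherwise."""
    if accuracy_stage >= final_stage:
        return "green"
    if accuracy_stage > 1:
        return "orange"
    return "grey"
```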
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016004430.6A DE102016004430A1 (de) | 2016-04-12 | 2016-04-12 | Verfahren zum Bestimmen von optischen Parametern eines Probanden und Computerprogrammprodukt zum Durchführen des Verfahrens |
PCT/EP2017/000338 WO2017178092A1 (de) | 2016-04-12 | 2017-03-15 | Verfahren zum bestimmen von optischen parametern eines probanden und computerprogrammprodukt zum durchführen des verfahrens |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3443409A1 true EP3443409A1 (de) | 2019-02-20 |
Family
ID=58413048
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17713579.5A Pending EP3443409A1 (de) | 2016-04-12 | 2017-03-15 | Verfahren zum bestimmen von optischen parametern eines probanden und computerprogrammprodukt zum durchführen des verfahrens |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP3443409A1 (de) |
DE (1) | DE102016004430A1 (de) |
IL (1) | IL262300A (de) |
WO (1) | WO2017178092A1 (de) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3702831A1 (de) * | 2019-03-01 | 2020-09-02 | Carl Zeiss Vision International GmbH | Datensatz zur verwendung in einem verfahren zur herstellung eines brillenglases |
EP3944004A1 (de) | 2020-07-23 | 2022-01-26 | Carl Zeiss Vision International GmbH | Computerimplementiertes verfahren zur erzeugung von daten zur herstellung mindestens eines brillenglases und verfahren zur herstellung einer brille |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102005003699B4 (de) | 2005-01-26 | 2018-07-05 | Rodenstock Gmbh | Vorrichtung und Verfahren zum Bestimmen von optischen Parametern eines Benutzers; Computerprogrammprodukt |
DE102009004381B4 (de) * | 2008-01-10 | 2011-09-01 | Rodenstock Gmbh | Vorrichtung, Verfahren und Computerprogrammprodukt |
DE102008012268B4 (de) * | 2008-03-03 | 2017-09-21 | Rodenstock Gmbh | Vorrichtung, Verwendung, Verfahren und Computerprogrammprodukt zum dreidimensionalen Darstellen von Darstellungsbilddaten |
DE102011115239B4 (de) * | 2011-09-28 | 2016-02-11 | Rodenstock Gmbh | Bestimmung der Scheibenform unter Berücksichtigung von Tracerdaten |
DE102013010684B4 (de) * | 2013-06-26 | 2021-11-18 | Rodenstock Gmbh | Ein Verfahren und eine Messvorrichtung zur verbesserten Erfassung von Fassungsparametern einer Brillenfassung sowie Stützscheibe |
- 2016
  - 2016-04-12 DE DE102016004430.6A patent/DE102016004430A1/de active Pending
- 2017
  - 2017-03-15 EP EP17713579.5A patent/EP3443409A1/de active Pending
  - 2017-03-15 WO PCT/EP2017/000338 patent/WO2017178092A1/de active Application Filing
- 2018
  - 2018-10-11 IL IL262300A patent/IL262300A/en unknown
Also Published As
Publication number | Publication date |
---|---|
IL262300A (en) | 2018-11-29 |
DE102016004430A1 (de) | 2017-10-12 |
WO2017178092A1 (de) | 2017-10-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2282232B1 (de) | Verwendung einer Vorrichtung zum Bestimmen von optischen Parametern eines Benutzers | |
EP2115524B1 (de) | Verfahren zur berechnung eines brillenglases mit variabler lage der bezugspunkte | |
EP2115526B1 (de) | Bezugspunkte für orthostellung | |
EP2115525B1 (de) | Flexibler gleitsichtglasoptimierer | |
DE102011115239B4 (de) | Bestimmung der Scheibenform unter Berücksichtigung von Tracerdaten | |
DE102018121742B4 (de) | Verfahren und System zur virtuellen anatomischen Anpassung einer Brille | |
EP3183616B1 (de) | Ermittlung von benutzerdaten unter berücksichtigung von bilddaten einer ausgewählten brillenfassung | |
DE102011054833A1 (de) | Verfahren zum Messen der binokularen Sehleistung, Programm zum Messen der binokularen Sehleistung, Verfahren zum Entwerfen von Brillengläsern und Verfahren zur Herstellung von Brillengläsern | |
EP3702832B1 (de) | Verfahren, vorrichtungen und computerprogramm zum bestimmen eines nah-durchblickpunktes | |
DE102006033491A1 (de) | Vorrichtung und Verfahren zum Bestimmen einer Trageposition einer Brille, Computerprogrammvorrichtung | |
EP3730998A1 (de) | Bestimmung mindestens eines optischen parameters eines brillenglases | |
EP3422087B1 (de) | Verfahren zur korrektur von zentrierparametern und/oder einer achslage sowie entsprechendes computerprogramm und verfahren | |
DE102010049168A1 (de) | Verordnungs- und individualisierungsabhängige Modifikation des temporalen peripheren Sollastigmatismus und Anpassung der Objektabstandsfunktion an veränderte Objektabstände für die Nähe und/oder die Ferne | |
DE10140656A1 (de) | Verfahren zum Entwerfen und Optimieren eines individuellen Brillenglases | |
DE60206342T2 (de) | Verfahren zur Beurteilung der binokularen Eigenschaften von Brillengläsern, Vorrichtung zur Anzeige dieser Eigenschaften und zugehöriger Apparat | |
WO2018138149A1 (de) | Computerimplementiertes verfahren zur bestimmung einer repräsentation eines brillenfassungsrands oder einer repräsentation der ränder der gläser einer brille | |
WO2017178092A1 (de) | Verfahren zum bestimmen von optischen parametern eines probanden und computerprogrammprodukt zum durchführen des verfahrens | |
WO2012022380A1 (de) | Verfahren und einrichtung zur bestimmung des augenabstandes von personen | |
WO2020178167A1 (de) | Datensatz zur verwendung in einem verfahren zur herstellung eines brillenglases | |
EP4185920B1 (de) | Computerimplementiertes verfahren zur erzeugung von daten zur herstellung mindestens eines brillenglases und verfahren zur herstellung einer brille | |
DE102005063668B3 (de) | Vorrichtung und Verfahren zum Bestimmen von optischen Parametern eines Benutzers; Computerprogrammprodukt |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Status: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Status: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the european phase | Original code: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Status: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20181108 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| RIN1 | Information on inventor provided before grant (corrected) | Inventors: SCHMID, LEONHARD; TIEMANN, MARKUS; SEITZ, PETER; SCHUBART, JOHANNES |
| RIN1 | Information on inventor provided before grant (corrected) | Inventors: SEITZ, PETER; SCHUBART, JOHANNES; SCHMID, LEONHARD; TIEMANN, MARKUS |
| STAA | Information on the status of an ep patent application or granted ep patent | Status: REQUEST FOR EXAMINATION WAS MADE |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Status: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20220315 |