CN108537628B - Method and system for creating customized products - Google Patents


Info

Publication number
CN108537628B
CN108537628B
Authority
CN
China
Prior art keywords
user
model
individual
computer system
parametric model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810244436.3A
Other languages
Chinese (zh)
Other versions
CN108537628A (en)
Inventor
Timothy A. Fonte
Eric J. Varady
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bespoke Inc
Original Assignee
Bespoke Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bespoke Inc filed Critical Bespoke Inc
Publication of CN108537628A publication Critical patent/CN108537628A/en
Application granted granted Critical
Publication of CN108537628B publication Critical patent/CN108537628B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0621Item configuration or customization
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016Operational features thereof
    • A61B3/0041Operational features thereof characterised by display arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/111Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring interpupillary distance
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B29WORKING OF PLASTICS; WORKING OF SUBSTANCES IN A PLASTIC STATE IN GENERAL
    • B29DPRODUCING PARTICULAR ARTICLES FROM PLASTICS OR FROM SUBSTANCES IN A PLASTIC STATE
    • B29D12/00Producing frames
    • B29D12/02Spectacle frames
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y10/00Processes of additive manufacturing
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y50/00Data acquisition or data processing for additive manufacturing
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C13/00Assembling; Repairing; Cleaning
    • G02C13/001Assembling; Repairing
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C13/00Assembling; Repairing; Cleaning
    • G02C13/003Measuring during assembly or fitting of spectacles
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C13/00Assembling; Repairing; Cleaning
    • G02C13/003Measuring during assembly or fitting of spectacles
    • G02C13/005Measuring geometric parameters required to locate ophtalmic lenses in spectacles frames
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00Optical parts
    • G02C7/02Lenses; Lens systems ; Methods of designing lenses
    • G02C7/024Methods of designing ophthalmic lenses
    • GPHYSICS
    • G02OPTICS
    • G02CSPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00Optical parts
    • G02C7/02Lenses; Lens systems ; Methods of designing lenses
    • G02C7/024Methods of designing ophthalmic lenses
    • G02C7/027Methods of designing ophthalmic lenses considering wearer's parameters
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/22Indexing; Data structures therefor; Storage structures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/20Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16BBIOINFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR GENETIC OR PROTEIN-RELATED DATA PROCESSING IN COMPUTATIONAL MOLECULAR BIOLOGY
    • G16B5/00ICT specially adapted for modelling or simulations in systems biology, e.g. gene-regulatory networks, protein interaction networks or metabolic networks
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/403Arrangements for multi-party communication, e.g. for conferences
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B33ADDITIVE MANUFACTURING TECHNOLOGY
    • B33YADDITIVE MANUFACTURING, i.e. MANUFACTURING OF THREE-DIMENSIONAL [3-D] OBJECTS BY ADDITIVE DEPOSITION, ADDITIVE AGGLOMERATION OR ADDITIVE LAYERING, e.g. BY 3-D PRINTING, STEREOLITHOGRAPHY OR SELECTIVE LASER SINTERING
    • B33Y80/00Products made by additive manufacturing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20Indexing scheme for editing of 3D models
    • G06T2219/2012Colour editing, changing, or manipulating; Use of colour codes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161Detection; Localisation; Normalisation
    • G06V40/166Detection; Localisation; Normalisation using acquisition arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • G06V40/169Holistic features and representations, i.e. based on the facial image taken as a whole

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Ophthalmology & Optometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Optics & Photonics (AREA)
  • Public Health (AREA)
  • General Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computer Hardware Design (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Architecture (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Computer Graphics (AREA)

Abstract

The present invention relates to a method and system for creating a customized product. Disclosed are systems and methods for creating a fully customized product from scratch, without relying entirely on off-the-shelf or pre-specified components. A system for creating a customized product includes an image capture device for capturing image data and/or measurement data of a user. A computer, communicatively coupled with the image capture device, is configured to construct an anatomical model of the user from the captured image data and/or measurement data. The computer provides a configurable product model and enables previewing and automatic or guided user customization of that model. A display, communicatively coupled with the computer, shows the customized product model superimposed on the user's anatomical model or image data.

Description

Method and system for creating customized products
The present application is a divisional application of the invention patent application filed on August 22, 2014, No. 201480056330.0 (PCT/US2014/052366), entitled "Method and system for creating customized products".
Technical Field
The present invention relates to creating, manufacturing, and delivering one-off customized products from scratch, on demand. More particularly, the present invention creates, manufactures, and delivers customized personal products that best suit the needs and preferences of individual users, by generating product specifications from user-specific preference profiles (automated and/or user-directed) and building a unique, made-from-scratch product around each profile.
Background
While there are many types of personal products that one may want to customize or manufacture as a unique product tailored to a particular user, an important one among them is eyewear. Although the present invention will be described in connection with creating, producing, and delivering customized eyewear, it should be appreciated that the invention applies to a wide variety of products that relate to a user's anatomy or physical characteristics and to the user's preferences for a particular product. Describing the invention in terms of eyewear therefore carries over directly to creating, producing, and delivering many other products customized to a user's characteristics and desires. The invention is thus described below in terms of eyewear, but it should be understood that the invention is not so limited.
Although many people must purchase eyeglasses, doing so presents a number of challenges to consumers. For traditional in-store purchases, the options a consumer faces in any one store are limited, so visits to multiple stores are often required. The user must then work through a series of awkward trade-offs among fit, style, color, shape, price, and so on. Eyeglasses are typically mass produced, with a particular style available in only one or two common colors and sizes. A user's face is unique enough to serve as a primary form of identification, yet the user must choose among products manufactured for an average face that is not his own. It is difficult for users to find a pair of glasses ideally suited to their unique tastes, facial anatomy, and needs. It is also difficult for users to evaluate the glasses they try on, because without their optical prescription in the trial frames they cannot see themselves clearly.
In recent years, new companies have opened up the online eyeglass market in an attempt to solve some of these problems. However, none of the commercially available eyeglass selection systems attempts to provide a completely original, designed-from-scratch product customized to the user's anatomy and to the user's likes and dislikes. There is therefore a need to provide users with a fully customizable product that does not rely solely on existing mass-produced or stockpiled components of previous designs. The basic form, size, shape, or other attributes of the key components must be customizable to provide the user with a truly unique product. Once image data for the user is available, it is then desirable to analyze it, make key measurements of the user's face, determine the user's preferences, and manufacture a customized pair of glasses on demand.
It is of course desirable that this process be as automatic as possible, and that it give the user the most perfect, unique pair of glasses he or she has seen. If this can be achieved relatively quickly, the user is provided with a unique pair of spectacles, manufactured on demand, in short order.
More particularly, the online market is developing rapidly, but many problems remain for consumers. It is difficult for consumers to try on glasses while shopping online. Online websites, while offering more choice than stores, often confront the shopper with a myriad of pages of glasses from which to select. The quality of the glasses is often unknown, and consumers are even more concerned about whether their new glasses will fit correctly and comfortably, because they cannot handle or see the glasses before purchasing.
There is clearly a need for a shopping experience that results in a unique, tailored product whose materials and design are of high quality and whose price is reasonable and affordable for a one-of-a-kind item manufactured from scratch, and for an easier, more customized way to create and purchase a product perfect for an individual, in this case a pair of glasses.
The concept of virtually trying on clothing (including glasses) has been discussed in the prior art for many years. The patents listed below all relate to preview systems, but none provides a product designed from scratch; each relies on pre-manufactured parts of a particular item.
For example, US 4,539,585 to Spackova describes a computer system for viewing a garment worn on a person in an image. US 4,730,260 to Mori and US 4,845,641 to Ninomiya et al. describe computer systems for virtually overlaying eyeglasses on a person in an image. US 5,280,570 to Jordan describes an in-store system in which a user virtually tries on glasses to preview how their eyes will look behind the lenses. US 5,592,248 to Norton describes various methods of overlaying a virtual eyeglass image on an image of a person's face to preview the look. US 5,983,201 to Faye describes a system in which a user connects to an online store from a personal computer, a subset of glasses is selected based on user preferences and size, and the user can virtually try on and purchase a frame. US 6,095,650 to Gao describes another system for capturing an image and displaying glasses superimposed on the user's image, including scaling the image and detecting the pupils to determine the center of the frame. US 6,142,628 to Saigo describes another fitting system that includes lens selection and lens shape display in addition to the frame. US 7,016,824 to Waupotitsh describes an eyeglass preview system that overlays a glasses model on a 3D face model supplied by the user. US 6,692,127 to Abitbol describes an eyewear fitting system that requires a wide-angle camera to obtain a 3D model. US 6,535,223 to Foley describes a system for determining interpupillary distance from an image of a person's face containing an object of known proportions, superimposing preview glasses, and allowing an order to be placed.
All of the aforementioned prior art explores various ways to preview eyewear superimposed on a person's image, but none of these techniques is a system that creates, assembles, and delivers a truly unique product on demand from scratch. These techniques also do not allow previews of newly customized eyewear that was never mass produced, and they do not use user-specific information to improve the glasses for the user. In short, the prior art does not customize, adapt, modify, or create new products such as eyeglasses with a system that provides unique products from scratch on demand. Moreover, all of the above techniques are limited to previewing glasses superimposed on an image of the person.
On the other hand, US 5,576,778 to Fujie describes a system for designing eyewear based on a person's facial dimensions. Fujie, however, is limited to a design controlled through anchor points on Bezier curves extracted from the facial image data. Specifying these anchor points, or controlling them individually, is technical and rather difficult, because anchor-point manipulation is not a natural way for a user to express shape. Moreover, Fujie is limited to transmitting Bezier-curve-based polar coordinates directly to a machine tool. This is too complicated for the user, and such low-level control alone is not a suitable interface.
US 6,944,327 to Soatto describes a system for customizing eyewear based on a preview image of a user's face. However, Soatto does not take automatically generated user preferences into account, does not describe the desired end-to-end process, and does not describe a complete system that can actually manufacture eyeglasses. The Soatto method is also limited to a specific camera and to frontal face images only, and determines size by generating a two-dimensional template of the face. The preview is limited to the frontal image, with none of the critical temple sizing information needed to ensure a good preview and a comfortable fit. Adjustments are made only via control points while the overall size remains fixed, which is of limited use, since different users obviously require different sizes. Finally, the described method of building a 3D model of the face requires two or more cameras, and multi-lens cameras are generally not available to most users.
US 6,533,418 to Izumitani describes a system for ordering glasses based on preview images superimposed on the face of a user. However, this patent only discusses changing the lens shape, frame type, frame parts, and color. It does not cover changing the shape of the frame itself, only swapping certain parts or changing the style from rimless to rimmed, which is very limiting when the user wants fully customized glasses. Nor does the patent describe an automatic algorithm for determining frame size, or for assisting in selecting the best frame based on the user's face. Instead, it is a manual system resembling a custom-order catalog with many interchangeable parts, which may be too numerous or complex for the eyeglass consumer. In addition, the described preview shows only front and side portraits of the user wearing glasses, without an interactive view, 3D view, or video, and without automatic measurement of the face; the user must assist or enter information to obtain the appropriate measurements. Finally, while the patent mentions manufacturing eyeglasses, it does not clearly describe how custom eyeglasses would actually be made.
US 7,845,797 to Warden describes a method of manufacturing customized eyewear using front and side images in a system equipped with multiple cameras and illumination sources. The method requires capturing images of eyewear both on and off the user's face before determining the optimal lens position. This is rather limiting, because it requires the user to already possess the frames he desires, and it assumes the user only wants to improve the orientation of the lenses within those existing frames. In short, this is not an end-to-end system that creates, designs, assembles, and delivers a customized product from scratch on demand.
To meet the needs of the average consumer, there is a compelling need for an easy-to-use method and system that provides a trusted and enjoyable shopping experience. The system must work with the computer hardware and image capture devices available to the average consumer, so the minimum hardware is limited to a single-lens digital camera, standalone or embedded in a computer system, without depth or distance measurement capability. Embodiments of the invention describe systems that use single-camera hardware as well as systems that benefit from multiple cameras or depth-camera technology, as such technologies become more widely available to consumers or where the computer system is installed in a retail or office location.
The prior art describes techniques designed primarily for aesthetic preview of eyeglasses on a user. More quantitative analysis is needed to enable a better experience: custom fit, custom styling, automated adjustments and recommendations, and the overall ability to design eyewear that conforms to each user's unique anatomy and taste.
Interpupillary distance is often the only measurement used to ensure proper fitting of eyeglasses, and this measurement alone is not sufficient to ensure a proper physical fit for customized glasses. More information is particularly needed for advanced optics, such as progressive, digitally compensated, or free-form lenses. Regardless of the type and number of facial measurements needed to make custom glasses, the user should not be required to make them manually. Most target users' technical ability extends only to following simple prompts in a web browser. The consumer needs an experience that is easier than picking among parts or custom-drawing every detail, as described in the prior art, especially when working only from 2D images. The method and system must enable easy customization, including automatic sizing and styling if the user desires automatic recommendations. The average user should be able to obtain whatever eyewear design they desire, well fitted to their face, by seeing a preview in a "what you see is what you get" display and being able to make changes and see the effect and fit on their own face.
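The role of a known-scale reference object in turning 2D image measurements into physical dimensions, as in the Foley patent cited above, can be sketched as follows. This is an illustrative outline only, assuming the pupils and a reference object (here a standard credit card, 85.6 mm wide) have already been located in the image; the function names and values are not from the patent.

```python
# Hypothetical sketch: estimating interpupillary distance (PD) from a single
# 2D image, given pixel coordinates of both pupils and the pixel width of a
# reference object of known physical size held in roughly the same plane.

def pixel_distance(p1, p2):
    """Euclidean distance between two (x, y) pixel coordinates."""
    return ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5

def estimate_pd_mm(left_pupil_px, right_pupil_px,
                   ref_width_px, ref_width_mm=85.6):
    """Convert the pupil-to-pupil pixel distance into millimetres using the
    millimetres-per-pixel scale implied by the reference object."""
    mm_per_pixel = ref_width_mm / ref_width_px
    return pixel_distance(left_pupil_px, right_pupil_px) * mm_per_pixel

# Example: pupils 310 px apart, card spanning 428 px in the image.
pd = estimate_pd_mm((420, 310), (730, 310), ref_width_px=428)
print(round(pd, 1))  # -> 62.0
```

A real system would also correct for perspective and for the reference object not lying exactly in the facial plane; this sketch only shows the scaling principle.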
Finally, the method and system must result in a manufacturable product that can be produced and sold to a customer at a reasonable cost and with acceptable lead times. A better preview system is useless if the previewed product cannot ultimately be manufactured at a cost and within a time frame satisfactory to the user ordering it.
Accordingly, there is a continuing need for a method and system that allows a greater degree of personalization of lenses and frames, more accurate modeling and previewing, more automated or assisted eyeglass selection and customization, more detailed measurements, and a way to produce customized eyeglasses efficiently and economically to fulfill a user's order.
Disclosure of Invention
The present invention has a number of important parts. The first is the recognition that what is desired is a custom product designed from scratch, not manufactured entirely from off-the-shelf, previously designed, mass-produced, or stocked components. As mentioned above, many systems involve picking from a number of pre-made parts and assembling them into a custom object. But if every part comes from mass production, the user never feels that he or she has received a unique, one-off product truly centered on the user's particular profile. Products assembled from mass-produced parts also cannot be customized to the degree needed to fit the user's unique anatomy and preferences. At least some portion of the customized product must be created entirely from scratch in order to fit the user, such as by manufacturing some part of the product in a unique, non-mass-produced shape or size. The ability to automatically design and modify the basic shape and form of a customized product, with or without user guidance, is an important advantage over systems that simply let users browse and assemble mass-produced parts.
The second part is how to identify an individual's anatomical features, what to measure and how to measure it, and how to use these anatomical measurements when creating objects designed from scratch.
The third part is the ability to determine the user's profile, buying habits, and likes and dislikes, built up over a period of time, and to use all of this information to offer the user suggestions for a unique product.
Fourth, given all of the above, it is important to be able to dynamically manufacture a unique product, modeled on the user's anatomical features and preferences, and deliver it to the user within an acceptable timeline. The output is a unique product that the user may have imagined, or may never have imagined, produced by a predictable process flow that results in on-demand manufacture.
At a high level, therefore, the system of the present invention is an end-to-end system that enables a user to obtain a fully customized product from scratch, without being limited to off-the-shelf, previously designed, mass-produced, or stocked components. The product is custom made and best suited to the anatomy and personal preferences of the user. The system integrates every step from collecting data about the user to delivering the final product. This goes far beyond the prior art, as it permits design and manufacture from scratch without full reliance on stored, pre-designed, or pre-manufactured parts. Instead, the product is designed from scratch, automatically, using some or all of the following: the user's likes and dislikes, unique anatomical attributes, and unique requirements, so that the finished product comes as close as possible to the user's desires in design, shape, fit, size, color, weight, finish, function, and artistic impression. In addition, the system can be viewed as an expert system: it appears to provide the user with an expert who supplies the most appropriate style and fit, offering suggestions at each decision point and reflecting the artificial intelligence of an expert.
The system is not only unique in itself; it also encompasses various techniques for developing anatomical models, directly deriving certain anatomical features, imaging, ranging and size characterization, scaling, product presentation, user interaction, and custom manufacturing. These techniques add to the already unique features of the system of the present invention.
One feature of the present invention is the ability to capture the characteristics of an individual, and more particularly of his or her face. It has been found that a self-portrait, taken for example with a smartphone or an electronic camera, can provide the image information necessary to derive the required anatomical model. Although a "selfie" taken with a camera phone is not three-dimensional, a 3D model of a person's face can be generated from various features of the image it forms. A popular cell phone is thus a convenient input device for capturing a person's anatomical features, and the present invention finds that a self-portrait taken by a single camera contains sufficient information to permit anatomical modeling.
Although the invention will be described in connection with eyeglasses, it is within the scope of the invention to design, manufacture, and deliver personalized products of any nature from scratch, including for example jewelry, clothing, helmets, headsets, and other personal items. The description focuses on customized products made from scratch, but the described methods also apply to highly unique customized products that are not necessarily 100% made from scratch. Many products would benefit from an extremely wide variety of designs (hundreds, thousands, or millions), which would be too difficult to configure, store, or manufacture using conventional methods and which are therefore well suited to the methods described herein. Custom-made products requiring a high degree of configurability are within the scope of the present invention.
The comprehensiveness of the on-demand end-to-end system of the present invention relies on the following:
obtaining and analyzing image data and anatomical information
In the present invention, new methods are presented that provide improved or alternative ways to capture images of a user and determine anatomical information and models of the user. These include more detailed anatomical data, aesthetic analysis, and other metrics that are used to inform both the eyeglass frame and advanced optical design. To date, no one has attempted to use anatomical information, aesthetic information, and other metrics extracted from image data to inform such detailed designs.
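As a rough illustration of deriving quantified anatomical measurements from image data, the sketch below converts 2D landmark distances to millimeters using a reference object of known size visible in the same image. The landmark names, pixel coordinates, and reference values are hypothetical; a production system would work from a full 3-D anatomical model rather than a single frontal image.

```python
import math

def pixel_distance(p, q):
    """Euclidean distance between two (x, y) pixel coordinates."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def measure_features(landmarks, ref_pixels, ref_mm):
    """Convert pixel-space landmark distances to millimeters using a
    reference object of known physical size visible in the same image."""
    mm_per_pixel = ref_mm / ref_pixels
    return {
        "pupillary_distance_mm": pixel_distance(
            landmarks["left_pupil"], landmarks["right_pupil"]) * mm_per_pixel,
        "face_width_mm": pixel_distance(
            landmarks["left_temple"], landmarks["right_temple"]) * mm_per_pixel,
    }

# Hypothetical landmark pixel coordinates from a single frontal image.
landmarks = {
    "left_pupil": (420, 310), "right_pupil": (620, 310),
    "left_temple": (300, 330), "right_temple": (740, 330),
}
# Assume an 85.6 mm card edge spans 268 pixels in the same image plane.
measurements = measure_features(landmarks, ref_pixels=268.0, ref_mm=85.6)
```

With these assumed values, the pupillary distance works out to roughly 64 mm, a plausible adult measurement, which is why a single calibrated selfie can carry enough information for fitting.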
Obtaining other user information
Other user information and preferences not automatically obtained from the image data may be used to provide further information to customize the product. This information is used in novel prediction and learning algorithms that enable product design to be modified to suit a particular user.
Configurable product model
The present invention describes a configurable product model that enables customization far more personalized than interchanging stock components to make a custom assembly. The configurable model allows the entire shape, contour, 3D surface, measurements, colors, finishes, and more to be fully customized for an individual user.
Product customization
An algorithm is used that automatically customizes the shape and style of the glasses to the user based on the user's anatomy and personal preferences derived from the analyzed image data. Predictive algorithms are also used to predict user tastes and designs to aid in the design and manufacture of customized products. This helps to present the user, in advance, with the designs most likely to suit them.
Previewing advanced customized products to a user
The method of the present invention provides high-fidelity rendering of advanced customized products. These are not standard previews of previously existing products. A preview of an advanced customized product, such as glasses, occurs before the product is produced or even exists, because it is specifically and uniquely tailored to the user. These previews involve more advanced techniques than previews of existing products, since the product does not yet exist and there is no prior photograph, documentation, or test of the product to draw on. Everything must be dynamically generated or configured to enable a high-quality preview of an advanced customized product that has not yet been built. The system of the present invention does not merely render existing products (e.g., glasses or parts of glasses); it provides a completely new custom design from scratch.
User interaction with product previews
Various improved methods allow users to interact with customized product previews, modify customized designs in real-time, get feedback from others, and allow other friends/designers/opticians to design customized products for them as well.
Manufacturing customized products
Unlike the prior art, which describes very basic customization methods such as interchanging parts or customizing some components of the eyewear in limited ways, the system of the present invention manufactures fully customized products, such as high-end eyewear, from scratch. Advanced custom glasses include frames and lenses custom built for a user in a specific shape, size, and color. The system of the invention uses advanced technology to deliver finished, completely custom-designed eyeglasses with the same high-quality materials generally found in premium eyewear.
Shopping system
Finally, the present invention comprises a shopping system that enables users to step through the steps necessary to obtain customized products, enter their data and preferences, and select and purchase products.
Definition of
The following definitions are provided for illustrative purposes to help define the scope of the words used herein. These definitions do not limit the scope of the invention, and those skilled in the art will recognize additional definitions that may be applied to each category. As used herein, image data includes 2D images, digital images, video, image series, stereo images, 3D images, images captured with a standard light-sensitive camera, images captured with a camera with multiple lenses, images captured with a depth camera, and images captured with a laser, infrared, or other sensor module. Computer systems include tablets, phones, desktops, notebooks, kiosks, servers, wearable computers, network computers, distributed or parallel computers, or virtual computers. Imaging devices include single-lens cameras, multi-lens cameras, depth cameras, laser cameras, infrared cameras, or digital cameras. Input devices include touch screens, gesture sensors, keyboards, mice, depth cameras, audio speech recognition, and wearable devices. Displays include panels, LCDs, projectors, 3D displays, heads-up displays, flexible displays, televisions, holographic displays, wearable displays, or other display technologies. A preview, in the form of an image, video, or interactive rendering, includes an image of the user overlaid with an image of the product model, an image of the user overlaid with a rendering of the product model, or an anatomical model of the user together with an image of the product model. Anatomical models, details, and dimensions include lengths of features (e.g., finger length), distances between features (e.g., distance between ears), angles of features, surface areas, volumes of features, 2D contours of features (e.g., the outline of a wrist), 3D models of features (e.g., the surface of a nose or ear), 3D coordinates, 3D mesh or surface representations, shape estimates or models, curvature measurements, or estimates of skin or hair color.
The model or 3D model includes a point cloud, parametric model, texture mapping model, surface or volumetric mesh, or other collection of points, lines, and geometric elements representing an object. Manufacturing instructions include step-by-step manufacturing instructions, assembly instructions, customized specifications, CAM files, g-codes, automated software instructions, coordinates for controlling a machine, templates, images, drawings, material specifications, inspection dimensions, or requirements. A manufacturing system includes a computer system configured to deliver manufacturing instructions to a user and/or machine, a networked computer system including a machine configured to comply with the manufacturing instructions, a series of computer systems and machines through which the instructions continue. Spectacles include spectacle frames, sunglass frames, frames and lenses together, prescription spectacles, non-prescription (flat lens) spectacles, sports spectacles or electronic or wearable technical spectacles.
Customized product
The following are embodiments of custom fitting and design, previewing, modification according to user preference, and then manufacturing the customized product for the first time, based on the user's anatomy derived from image data:
According to one embodiment, a method for creating a customized product is disclosed. The method includes acquiring image data of a user using at least one computer system; determining, using at least one computer system, anatomical details and/or dimensions of the user; configuring (e.g., customizing shape, size, dimension, color, finish, etc.) a new product model for the user using at least one computer system and the user's anatomical data; applying, using at least one computer system, the configurable product model to the image data or an anatomical model of the user; previewing, using at least one computer system, an image of the user with the configurable product model; optionally adjusting and updating, using at least one computer system and/or user input, the previewed configurable product model attributes (e.g., custom shape, size, dimension, color, finish, etc.); preparing, using at least one computer system, instructions for manufacturing the customized product based on the previewed model; and manufacturing the new customized product using the at least one computer system and a manufacturing system.
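The sequence of method steps above can be sketched as a single pipeline. Everything below is a placeholder: each stage stands in for real image analysis, parametric CAD configuration, rendering, and CAM export, and the field names and values are illustrative, not taken from the patent.

```python
def create_custom_product(image_data, user_prefs, base_model):
    """Sketch of the claimed method as one sequence of stages; every stage
    here is a stub for real image analysis, parametric CAD updates,
    rendering, and manufacturing-instruction preparation."""
    # Steps 1-2: acquire image data and determine anatomy (stubbed values).
    anatomy = {"face_width_mm": 140.0, "pd_mm": 63.0}
    # Step 3: configure a new product model to the user's anatomy.
    model = dict(base_model, width_mm=anatomy["face_width_mm"])
    # Steps 4-5: apply the model to the image data and build a preview.
    preview = {"model": model, "overlay_on": image_data}
    # Step 6: optional user adjustment of configurable attributes.
    model["color"] = user_prefs.get("color", model["color"])
    # Step 7: prepare manufacturing instructions from the approved model.
    instructions = [f"cut frame front, width {model['width_mm']:.1f} mm",
                    f"finish color: {model['color']}"]
    return {"model": model, "preview": preview, "instructions": instructions}

result = create_custom_product("selfie.jpg", {"color": "tortoise"},
                               {"color": "black", "width_mm": 138.0})
```

Note how the base model's stock width (138 mm) is overridden by the anatomy-derived width, and the stock color by the user's preference; the claimed method is exactly this ordering of anatomy-driven configuration followed by preference-driven adjustment.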
According to one embodiment, a system for creating a customized product is disclosed. The system comprises: an image acquisition device configured to obtain image data of a user; an input device configured to receive instructions from a user; a display configured to display image data to the user; a manufacturing system configured to produce the customized product; a digital storage device storing instructions to create and preview the customized product; and a processor configured to execute the instructions to perform a method comprising: capturing image data of the user using at least one computer system; determining, using at least one computer system, anatomical details and/or dimensions of the user; configuring (e.g., customizing shape, size, dimension, color, finish, etc.) a new product model for the user using at least one computer system and the user's anatomical data; applying, using at least one computer system, the configurable product model to the image data or an anatomical model of the user; previewing, using at least one computer system, an image of the user with the configurable product model; optionally adjusting and updating, using at least one computer system and/or user input, the previewed configurable product model attributes (e.g., custom shape, size, dimension, color, finish, etc.); preparing, using at least one computer system, instructions for manufacturing the customized product based on the previewed model; and manufacturing the new customized product using the at least one computer system and the manufacturing system.
A system for creating a customized product is disclosed. A system comprising: an image acquisition device configured to obtain image data of a user; an input device configured to receive an instruction from a user; a display configured to display image data to a user; a manufacturing system configured to produce a customized product; a digital storage device to store instructions to create and preview a customized product; and a processor configured to execute instructions to perform the method.
The method comprises: collecting image data of a user; determining anatomical details and/or dimensions of the user; configuring the product to account for these details by providing a corresponding new product model; applying the configurable product model to the user's image data or anatomical model; previewing an image of the user with the configurable product model; optionally adjusting and updating the preview; preparing instructions for manufacturing the customized product based on the previewed model; and manufacturing the new customized product. The above-described methods may be implemented using a suitably programmed computer or may take the form of non-transitory computer-readable media.
More particularly, systems and methods for creating customized eyewear are disclosed that include at least one computer system configured to receive image data of a user. The computer system is also configured to receive other data from the user including, but not limited to, demographic data, prescriptions, preferences, and the like. The systems and methods may include determining quantified anatomical information about a user from data provided by the user. The systems and methods may include customizing the attributes of the eyewear model, including size, shape, color, finishing, and style, to meet the user's anatomical structure and style requirements. The system also includes physically manufacturing the custom eyewear such that it matches the preview representation.
According to one embodiment, a system and method for creating and visualizing customized eyewear is disclosed, comprising at least one computer system configured with a display. The computer system is also provided with at least one image capturing device for capturing image data and/or measurement data of the user. The computer system is also configured to receive other data from the user, including demographic data, prescriptions, and preferences. The systems and methods may include determining quantified anatomical information about a user from data provided by the user. The system and method may include visualizing an eyewear model superimposed on image data of a user in a suitable position on the user's face. The system and method may also include customizing attributes of the eyewear model and providing an updated preview of the customized eyewear superimposed on the image data of the user. The system and method includes physically manufacturing the custom eyewear such that it matches the preview representation.
According to another embodiment, a system and method for automatically customizing eyewear is disclosed. The computer system is further configured to analyze the user's image data, quantitative anatomical information, and other provided data to determine the best attributes of the eyewear model so that it best matches the user's anatomical structure and style preferences.
According to another embodiment, a system and method for interacting with a customized eyewear model is disclosed. The computer system is also configured with an interface application. The systems and methods may include obtaining input or commands from a user through a computer system. The system and method may further include controlling the visualization including angle, zoom, and rotation of the eyewear preview. The system and method may also include controlling a position and orientation of an eyewear model of the image data of the user. The systems and methods may also include enabling a user to directly customize properties of the eyewear model and provide an updated preview.
According to another embodiment, a system and method for automatically defining an optical lens design is disclosed. The system and method include analyzing the user's quantified anatomical information, prescription information, and custom eyewear model to calculate parameters needed to inform the optical design, including interpupillary distance, vertex distance, face wrap, and the eyewear and frame contours. The system and method are also configured to provide these parameters to a manufacturing system for designing and manufacturing a customized lens.
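To show why vertex distance matters to the lens design, the sketch below applies the standard ophthalmic vertex-compensation formula Fc = F / (1 − d·F), where d is the distance (in meters) the lens moves toward the eye. The formula is general optics knowledge, not specific to this patent, and the example numbers are illustrative.

```python
def compensate_vertex(power_d, old_vertex_mm, new_vertex_mm):
    """Adjust a lens power (diopters) when the custom frame places the lens
    at a different vertex distance than the one used during refraction.
    Standard vertex compensation: Fc = F / (1 - d*F), with d the distance
    the lens moves toward the eye, in meters."""
    d = (old_vertex_mm - new_vertex_mm) / 1000.0
    return power_d / (1.0 - d * power_d)

# A -8.00 D prescription refracted at a 12 mm vertex distance, worn at the
# cornea (0 mm), needs noticeably less minus power:
fc = compensate_vertex(-8.0, 12.0, 0.0)
```

For strong prescriptions even a 2 mm change in how the custom frame sits on the nose produces a clinically meaningful power shift, which is why the anatomical fit feeds directly into the optical design.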
According to another embodiment, a system and method for purchasing a network interface for customized eyewear is disclosed. The computer system is also configured with a data transfer device. The systems and methods include providing an interface for a user to select an eyewear design, interact with the eyewear design, preview and customize the eyewear design, customize the eyewear, and communicate all information needed to build and ship the customized eyewear to the user.
According to another embodiment, a system and method for controlling the manufacture of customized eyewear is disclosed. The computer system is also configured to communicate data and information to at least one manufacturing system. The system and method include communicating a customized eyewear model or parameter, user information, and an order to a manufacturing system. The system and method also includes converting the eyewear model or parameters into manufacturing data used to control the manufacturing equipment. The system and method also includes providing instructions for the machinery, robots, and operators to build, inspect, and ship the customized eyewear.
According to another embodiment, a system and method for a parametric eyewear model is disclosed. The systems and methods include a representation of the eyewear that contains dimensional information about the shape and size of the eyewear design. The system and method also includes parameters defining certain key features of the lens model, including but not limited to length, width, height, thickness, and radius. The system and method also includes updating the eyewear model when at least one parameter changes, automatically altering the eyewear to satisfy constraints for all parameters.
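The update-on-change behaviour described, where altering one parameter re-derives the rest of the model so that all constraints stay satisfied, can be sketched with a toy frame model. The two driving parameters and the single constraint below are illustrative; a real parametric eyewear model would carry many more dimensions and inter-parameter constraints.

```python
class ParametricFrame:
    """Toy parametric eyewear model: dependent dimensions are re-derived
    whenever a driving parameter changes, keeping the constraint satisfied."""

    def __init__(self, lens_width=50.0, bridge=18.0):
        self.lens_width = lens_width
        self.bridge = bridge
        self._update()

    def _update(self):
        # Constraint: overall frame width = 2 * lens width + bridge width.
        self.total_width = 2 * self.lens_width + self.bridge

    def set(self, **params):
        for name, value in params.items():
            setattr(self, name, value)
        self._update()

frame = ParametricFrame()
frame.set(bridge=20.0)  # widening the bridge automatically updates total width
```

The same pattern scales up: each `set` call is a user or algorithmic customization, and `_update` stands in for the constraint solver that keeps the whole eyewear model consistent.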
According to another embodiment, a system and method for learning from user interactions and preferences is disclosed, involving a machine learning or prediction engine. The systems and methods include tracking actions taken by a user in selecting, customizing, and previewing glasses. The system and method also include machine learning analysis of the tracked actions, in addition to the user-provided image data, quantified anatomical information, and other provided information, to determine the user's preferences for customized eyewear attributes. The system and method also include making recommendations to the user based on the learning analysis.
According to another embodiment, a system and method for learning from a body of user data is disclosed. The system and method include building a database of image data, quantified anatomical information, preferences, and other information relating customized eyewear to user information. The systems and methods include training machine learning classifiers to predict a user's preferences based on their data. The system and method also include applying the analysis to a new user to best provide a custom eyewear design that will fit that user's anatomy and preferences.
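A minimal stand-in for such a trained predictor is a nearest-neighbour lookup: find the stored user whose quantified anatomy is closest to the new user's and return the style that user chose. The feature names, database records, and the choice of nearest-neighbour (rather than the patent's unspecified classifier) are all illustrative assumptions.

```python
import math

def recommend_style(new_user, database):
    """Hypothetical nearest-neighbour predictor: return the style chosen by
    the stored user whose quantified anatomy is closest to the new user's."""
    def dist(a, b):
        return math.dist([a["face_width"], a["pd"]],
                         [b["face_width"], b["pd"]])
    nearest = min(database, key=lambda rec: dist(rec["anatomy"], new_user))
    return nearest["chosen_style"]

# Illustrative database relating anatomy to previously chosen designs.
db = [
    {"anatomy": {"face_width": 132.0, "pd": 60.0}, "chosen_style": "round"},
    {"anatomy": {"face_width": 145.0, "pd": 66.0}, "chosen_style": "aviator"},
]
style = recommend_style({"face_width": 143.0, "pd": 65.0}, db)
```

A production system would normalize features and train a proper classifier over many more attributes, but the shape of the mapping, anatomy plus preferences in, a recommended custom design out, is the same.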
According to another embodiment, a system and method for guiding a user through a customization process is disclosed. The systems and methods include steps required to provide a sequence of instructions or questions to guide the user through customization of the eyewear to their preferences and anatomy.
According to another embodiment, a system and method for predicting poor fit is disclosed. The systems and methods include analyzing a fit between quantified anatomical information of a user and a custom eyewear design. The systems and methods include the use of simulations, physical modeling, and analysis to predict when a sub-optimal fit between the eyewear and the user has been designed. The system and method also includes informing the user of the suboptimal design or automatically correcting it.
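A greatly simplified version of such a fit check compares a few frame dimensions against the quantified anatomy and flags mismatches. The two rules and thresholds below are illustrative assumptions, not figures from the patent; the real system would use simulation and physical modeling.

```python
def check_fit(anatomy, frame):
    """Flag simple mismatches between quantified anatomy (mm) and a frame
    design (mm). Rules and thresholds here are illustrative only."""
    issues = []
    if frame["bridge_mm"] < anatomy["nose_width_mm"]:
        issues.append("bridge too narrow for nose")
    if abs(frame["total_width_mm"] - anatomy["face_width_mm"]) > 6.0:
        issues.append("frame width deviates from face width")
    return issues

issues = check_fit({"nose_width_mm": 19.0, "face_width_mm": 140.0},
                   {"bridge_mm": 17.0, "total_width_mm": 139.0})
```

The returned issue list corresponds to the embodiment's two outcomes: inform the user of the suboptimal design, or feed the flags back into the configurable model so it can be corrected automatically.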
According to another embodiment, a system and method for previewing vision through a customized eyewear model is disclosed. The systems and methods include rendering a preview of the user's vision through the custom eyewear model, accounting for the shape, size, and optical properties of the lenses. The systems and methods include rendering a real-time or static scene that simulates the user's vision, including but not limited to distortion, focus areas, color, and other optical effects.
According to another embodiment, a system and method for replicating another pair of eyeglasses is disclosed. The systems and methods include receiving image data of a person wearing glasses, including a user. The system and method also includes detecting the glasses and analyzing the shape, color, and size. The system and method also include optimizing the custom eyewear design to match the analysis of shape, size, and color. The system and method also includes previewing customized eyewear on the user's image data and allowing further customization.
According to another embodiment, a system and method for sharing customized eyewear previews and the ability to customize eyewear is disclosed. The systems and methods include sending permission from at least one computer system to at least one other computer system to preview and customize eyewear on a user's image data. The system and method also include allowing third parties to interact with, customize, and update eyewear models on the user's image data. The system and method also includes a third party to provide feedback and update the design to the user.
According to another embodiment, a system and method for matching eyeglass color to another object is disclosed. The system and method include obtaining image data or information (including but not limited to manufacturer, part number, etc.) about an object having a desired color. The system and method also include calibrating the color of the image data against a reference image. The system and method also include extracting the color attributes of the desired object and applying the colors to the custom eyewear model.
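A crude version of this calibrate-then-extract step can be sketched with per-channel gain correction: a reference object of known color (here, an assumed white card) in the same photo yields channel gains, which are then applied to the sampled object color. Real color calibration would use a multi-patch chart and a proper color space, so this is only a sketch.

```python
def calibrate_and_match(observed_rgb, ref_observed, ref_true):
    """Estimate per-channel gains from a reference object of known color in
    the same image, then correct the target object's observed RGB color."""
    gains = [t / o for t, o in zip(ref_true, ref_observed)]
    return tuple(round(c * g) for c, g in zip(observed_rgb, gains))

# A white card (true color 255, 255, 255) appears as (240, 250, 230) in the
# photo; correct an object color sampled under the same lighting:
frame_rgb = calibrate_and_match((120, 80, 60), (240, 250, 230), (255, 255, 255))
```

The corrected color, rather than the raw sampled pixels, is what would be applied to the custom eyewear model so the manufactured finish matches the desired object under neutral lighting.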
Drawings
These and other features of the present invention will be better understood when considered in conjunction with the following description, taken in conjunction with the accompanying drawings, in which:
FIG. 1A is a block diagram of a system for creating an advanced customized product from scratch without the exclusive use of off-the-shelf components;
FIG. 1B is a block diagram of a customized eyewear shopping system;
FIG. 2 is a block diagram of the image capture portion of the system of the present invention illustrating the interplay between the image capture device, user input, and other information coupled to the computer system driving the manufacturing process;
FIG. 3 is a diagrammatic illustration of eyewear and portions of eyewear that may be customized using the system of the present invention;
FIG. 4 is a graphical illustration of a user's face and anatomical features;
FIG. 5 is an illustration of a computer system used to capture image data;
FIG. 6 is a graphical illustration of dimensions measured between a face and eyeglasses, permitting analysis of face and eyeglasses parameters;
FIG. 7 is a graphical illustration of additional dimensions of the face and eyeglasses;
FIG. 8 is a diagrammatic representation of a parameterized, quantified anatomical model;
FIG. 9 is a graphical illustration of an example of a parameterized eyewear model before and after being adjusted to customize fit widths without affecting other critical dimensions;
FIG. 10 is a graphical illustration of two eyeglass designs with optimal eye center positions;
FIG. 11 is an illustration of an example computer system interface for previewing, correcting, and customizing eyewear;
FIG. 12 is an illustration showing an example of customized adjustment of the width of the glasses using a computer system interface, which enables the placement of a product on a person's face to be determined and the representation of an individual to be improved;
FIG. 13 is a diagrammatic illustration showing an example of an eyewear design being edited;
FIG. 14 is a graphical illustration of an example of automatic eyewear model adjustment to optimize parameters;
FIG. 15 is a graphical illustration of an example of a customized 3D eyeglass model converted to a planar mode for manufacturing;
FIG. 16 is an illustration of an example of a custom 3D eyeglass model and manufacturing section;
FIG. 17 is a diagrammatic illustration of a computer with an imaging device to acquire an image of a user with a reference;
FIG. 18 is a diagrammatic illustration of a computer system used to register an anatomical model with an original user image;
FIG. 19 is a diagrammatic illustration of using a computer system to reconstruct a model of a user's face and a model of a reference target based on image data;
FIG. 20 is a graphical illustration of scaling an anatomical model according to a user's face using a two mirror reflective system;
FIG. 21 is a diagrammatic illustration of constructing and scaling an anatomical model of a user's face from a collection of previously acquired images by fitting a 3-D face model across a feature set and camera positions;
FIG. 22 is an illustration of scaling a user's face using existing glasses already owned by the user;
FIG. 23 is an illustration of a system for measuring the dimensions of a reference object by displaying a reference frame and calculating the pixel size and true size of the reference frame;
FIG. 24 is an illustration of a system for customizing an eyeglass design optimized to fit asymmetrical facial features;
FIG. 25 is an illustration of a system to achieve a simulated camera view;
FIG. 26 is a block diagram of an in-store customized glasses shopping method;
FIG. 27 is a block diagram of an in-store customized glasses shopping system;
FIG. 28 is an illustration of a system for customizing an eyeglass nosepiece to fit different user anatomies;
FIG. 29 is a diagrammatic illustration of configuring a custom product model, illustrating a small portion of the degree of shape and size customization;
FIG. 30 is a diagrammatic illustration of a custom eyewear model prior to aligning the eyewear model with an anatomical model;
FIG. 31 is a diagrammatic illustration of a customized eyewear model after aligning the eyewear model with an anatomical model;
FIG. 32 is a block diagram of a manufacturing sequence for an advanced customized product; and
FIG. 33 is an illustration of creating a customized helmet.
Detailed Description
Referring to FIG. 1A, a system is provided in which a computer system 14 creates a customized product from scratch based on input to the computer system, including input based on a user's image. "From scratch" refers to the fact that advanced customized products are provided that are manufactured without specifically using off-the-shelf, previously designed, previously produced, or stocked components. This does not mean that secondary components, such as fasteners, hinges, and the like, cannot be used as parts of customized products. However, the main components of the product are designed from scratch, giving the product a uniqueness different from what could be achieved by assembling the product from pre-manufactured components.
It is important to understand where the computer system that generates these customized products obtains its information. The computer system obtains imaging data of the user; determines anatomical data and measurements from that image data; and gathers further optional user preferences and information, such as the user's likes and dislikes, as determined by analyzing the user's computer history. The computer system also accepts input from the user, who may specify certain preferences or directly control some aspect of product customization.
The system does not operate in a vacuum; in other words, the computer system does not generate customized products from nothing. For the computer to begin its design process, a configurable product model is installed on the computer system; the product model specifies, at least in broad outline, the necessary structure and specifications for a customizable product.
In view of the above, and as illustrated at 10, computer system 14 obtains and analyzes image data and determines anatomical measurements and details of the user. As described above, image capture can be achieved in a number of different ways, most notably by using self-portrait images generated from a handheld electronic device (such as a smartphone or electronic camera). This is a convenient image capture method for a general user who can use a popular cell phone as a starting point for defining his or her own anatomical features.
As illustrated at 12, the computer system obtains optional user preferences and information, which may be collected from a variety of sources. At least one configurable product model 13 is provided to the computer system 14 to direct the computer system. After analyzing all of its inputs, computer system 14 automatically outputs a new customized product model. The output of the computer system 14 is provided to the preview system 15, where the computer system creates a preview of the customized product and the user. The computer system then prepares product models and information for manufacturing the selected advanced, fully customized product, as illustrated at 17.
Note that at 16, optional user interaction is provided to update, inform, or control the preview and the customized product. After the computer system has created the preview of the customized product, the user may interact to update, inform, or control the preview and the customized product. When these additional control instructions are input to computer system 14, the system can optionally regenerate the custom product, either by applying the user's changes directly or by using the input to inform a new custom product model.
More particularly, the system operates as follows. The computer system obtains image data at 10 by a variety of means, such as a camera or imaging device connected to the computer system, a user transferring the image data to the computer system, or transfer from another computer system. Anatomical measurements and details may be derived in the form of dimensions, models, shape analyses, and the like, as will be described in more detail below.
As illustrated at 12, computer system 14 obtains other optional user information and preferences. This information, such as demographic information, medical or prescription information, answers to questions, style selections, keywords, etc., may be used as further input to the computer system for automated analysis and product customization by the user.
As illustrated at 13, the computer system obtains configurable product models that are added by the manufacturer or designer. These configurable product models are representations of custom products, and they may be modified to alter attributes including shape, size, color, finish, and so forth. Configurable models may have thousands, millions, or an unlimited number of variations, but they are also created with the ability to constrain or limit configurability to a range of choices set by the manufacturer (e.g., only a certain range of material thicknesses can be used, or certain dimensions must not be changed during configuration). The configurable model may contain mass-produced or pre-designed subcomponents, such as fasteners, but the assembly of the primary custom components with these subcomponents results in a highly customized, from-scratch product.
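The manufacturer-imposed limits described here amount to clamping requested parameter values into allowed ranges before they reach the product model. The parameter names and ranges in this sketch are illustrative.

```python
def configure(model, requested, limits):
    """Apply requested parameter values to a configurable product model,
    constrained to the ranges the manufacturer allows (illustrative)."""
    out = dict(model)
    for name, value in requested.items():
        lo, hi = limits.get(name, (value, value))  # unlisted params pass through
        out[name] = min(max(value, lo), hi)
    return out

base = {"temple_length_mm": 145.0, "thickness_mm": 4.0}
limits = {"thickness_mm": (3.0, 6.0)}  # manufacturer allows only 3-6 mm stock
custom = configure(base, {"thickness_mm": 8.0}, limits)  # request is clamped
```

This keeps the model "configurable within a field of choice": the user (or an algorithm) may request anything, but the emitted model always stays manufacturable.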
As illustrated at 14, the computer system uses input consisting of configurable product models, user image data, user anatomy data, and optionally user preferences to generate a new customized product model. The computer system may use a variety of techniques, including equations, analysis, shape models, machine learning, clustering, look-up tables, and the like, to generate the final customized product model. The computer system may also generate a range of custom models for the user to select from. These custom models are advanced, non-stock, and fully customized for the individual user.
As illustrated at 15, the computer system creates a preview of the customized product model. The preview may consist of an image of the customized product, a rendering of the customized product model on the user's anatomical model, a rendering of the customized product model on the user's image data, a physical rapid prototype of the customized product model, and so forth. The preview may be presented to the user on a display of the computer system.
As illustrated at 16, the computer system accepts user input to update, inform, or control the customized product model. The user, or others permitted by the user, may change the preview, select configurable options (such as color or size) to customize the product model, or answer questions to improve the product model, or the user may directly alter the configurable model (e.g., change shape or style) according to their preferences.
As illustrated at 17, the computer system prepares the customized product, as approved by the user, for manufacture. Preparation may include converting the customized product model and user preferences into a set of specifications, instructions, data structures, computer numerical control instructions, 2D or 3D model files, and so forth, that the manufacturing system can interpret. The preparation may also include custom computer control instructions for directing a machine or person through each step of the manufacturing process.
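One concrete form the computer numerical control instructions might take is a G-code toolpath tracing the frame-front outline. The sketch below is an assumption about what such a conversion step could emit, using only generic G-code words; the coordinates and feed rate are illustrative.

```python
def frame_outline_gcode(points, feed=600):
    """Emit a minimal G-code toolpath tracing a frame-front outline.
    Uses only generic words: G21 (mm), G90 (absolute), G0/G1 moves."""
    lines = ["G21 ; millimeters", "G90 ; absolute positioning"]
    x0, y0 = points[0]
    lines.append(f"G0 X{x0:.2f} Y{y0:.2f}")          # rapid to start point
    for x, y in points[1:]:
        lines.append(f"G1 X{x:.2f} Y{y:.2f} F{feed}")  # cutting moves
    return "\n".join(lines)

# Illustrative rectangular outline standing in for a frame-front contour.
program = frame_outline_gcode([(0, 0), (50, 0), (50, 20), (0, 20), (0, 0)])
```

In the described system this text, or an equivalent CAM file, is what actually travels from the computer system at 17 to the manufacturing system at 18.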
As illustrated at 18, the computer system provides instructions to the manufacturing system that produces the from-scratch customized product. Various specific methods for producing from-scratch customized products will be described.
FIG. 2 shows a block diagram of the computer system 220 used by user 200, which generally describes the aforementioned computer and manufacturing systems. In an exemplary embodiment, at least one computer system 220, including but not limited to a tablet, phone, desktop, laptop, kiosk, or wearable computer, is configured with a display 230 for presenting image data to a user. Display 230 may be an LCD screen, a flexible screen, a projection device, a 3D display, a heads-up display, or other display technology. The computer system 220 has input devices for controlling the computer system, including but not limited to a touch screen, keyboard, mouse, track pad, or gesture sensor. The computer system 220 is also configured with an image capture device 210, including but not limited to a single-lens camera, a video camera, a multi-lens camera, an IR camera, a laser scanner, an interferometer, and the like. The image capture device will be referred to as a "camera" hereinafter. The computer system 220 is also configured to connect to a network or other systems in order to communicate and transfer data 240. Computer system 220 is configured to connect to other computer systems 250, including but not limited to servers, remote computers, and the like. The other computer systems 250 are coupled to or control the manufacturing system 260. Computer system 220 is also configured to provide an interface to user 200 for viewing, customizing, purchasing, and ordering customized products.
In addition to a custom product system for creating custom products based on user image data, anatomy, and preferences, the present invention also describes a shopping system that gives a user access to the custom product system: a means to shop, order, browse, interact, provide payment, and the like. One embodiment of a custom eyewear shopping system built around the custom product system is described below:
Customized glasses shopping system
Referring to fig. 1B, a system for ordering customized eyeglasses created from scratch is detailed. As illustrated at 101, a user uses a computer system to view glasses and select at least one style to try. This first step is optional; the user may view a plurality of glasses on the computer display and choose to preview any of them. The user may select a style to try and preview at the beginning of their shopping experience, before purchasing, or at any time they choose. As illustrated at 102, the computer system instructs the user how to acquire image data and reference information. The computer system camera captures image data consisting of one or more images, videos, or real-time previews of the user, and the computer system presents the image data on its display. As illustrated at 103, the computer system analyzes the image data and constructs an anatomical model that is aligned with the image data. Thereafter, as illustrated at 104, the computer system prompts the user for prescription data, personal data, and other information, which may optionally be entered at a later step. Next, as illustrated at 105, the computer system analyzes the input information: measurements, the anatomical model, user preferences, and image data. As illustrated at 106, the computer system automatically adjusts the size and fit of the glasses for the user. Additionally, the computer system may automatically recommend shape, style, and color selections to the user, as illustrated at 107. As illustrated at 108, the computer system creates at least one new customized eyewear model having at least one component designed from scratch and automatically places the eyewear model on the user's image data. The computer system renders a preview of the custom eyewear model (which may include lenses), as illustrated at 109.
As described above, the rendering may include a combination of user image data and a user anatomical model and a custom eyewear model.
As illustrated at 110, a user may interact with the computer system to adjust at least one of the size, shape, position, style, color, finish, pattern, and the like of the eyewear. The results are illustrated at 111, where the computer system makes recommendations if, based on the user's interactions, the glasses may fit poorly or the configuration is otherwise unsuitable.
Thereafter, as illustrated at 112, the computer system stores the data and calculates the price and estimated delivery, as well as any other relevant information that the customer needs to decide whether to place an order. The user may select alternative eyewear, as illustrated at 113, or the user may select custom eyewear to order, as illustrated at 114.
If the user selects substitute eyewear, as illustrated at 113, the computer system automatically generates a new custom eyewear model, as illustrated at 108, and the process begins again.
Once the user selects the glasses to order (as illustrated at 114), the computer system analyzes the user information and the model and prepares manufacturing instructions, and the computer system prepares custom manufacturing files for the manufacturing device, as illustrated at 115. The computer system then manages the manufacturing equipment and personnel to build the custom eyewear, as illustrated at 116. Finally, the eyewear is shipped to the user, as illustrated at 117. This completes the customized eyewear product that is created and manufactured for the user from scratch.
The following sections describe in more detail the key steps involved in creating a from-scratch customized product for a user:
Obtaining and analyzing image data and anatomical information
The following section describes detailed systems and methods for obtaining and analyzing image data and anatomical information, which are illustrated at step 10 in FIG. 1A and at 102, 103, and 105 in FIG. 1B.
Prior to describing detailed methods for obtaining and analyzing image data and anatomical information, the terminology of facial anatomy and eyewear is described for reference. Fig. 3 shows spectacles 301, wherein the various parts of the spectacles are indicated. The front frame 302 holds the lenses 303 in place. A nosepiece 304 is centered on the front frame 302, and nose pads 305 extend from the front frame 302 to hold the eyeglasses 301 on the nose of the wearer. The hinge 306 connects the front frame 302 to the temple 307, which rests on top of the wearer's ear at feature 308. Fig. 3 shows only one eyewear design, and it should be appreciated that these basic components may apply to other eyewear designs, or that some eyewear designs may have different components.
Fig. 4 shows a user's face 401, eyes 402, pupils 403 at the centers of the eyes 402, and eyebrows 404. The ear 405 also has a marked location, the top 406 of the ear, where the temple of the glasses will rest. The nose 407 serves the key function of supporting the eyeglasses. Cheekbones 408, mouth 409, forehead 410, jaw/chin 411, nostrils 412, and hair 413 are other important features when detecting and analyzing quantitative anatomical models.
Capturing image data
Fig. 5 shows a user 501 capturing image data of their face 503 using a computer device 502. Instructions are provided directing the user to place their face in certain positions while the computer system captures and analyzes image data of their face. The computer system may capture an image of the person's face using a smartphone or handheld electronic camera. As mentioned above, there is sufficient information in single-camera views of a person to permit 3D modeling, and more particularly the generation of anatomical models.
The computer system may require certain objects to be present in the image to provide a scale reference. It is important to ensure that the glasses are designed to a suitable size relative to the user's face, so a scale must be applied to the image data or to the resulting anatomical model and measurements to ensure that the size is designed accurately. Reference objects may include, but are not limited to: coins, rulers, paper, credit cards, computer discs, electrical or computer connectors, stamps, calibration targets on a computer device, or the computer device itself. The object, when placed near the user's face, provides a reference size that the system uses to scale the image data. If other imaging techniques, such as a depth camera, are available, or if inherently sized shape-model techniques are used, then the reference object may not be needed, because the scale of the image data can be determined by the imaging device or the shape model.
In an exemplary embodiment, once the user has followed the instructions and is in front of the imaging device of the computer system, data acquisition and analysis begin. A first reference image is captured in which the user holds the reference object in the same field of view as their face. The computer system analyzes the captured image data to detect the reference object and measure its size, e.g., in pixels. The computer system also analyzes the image data to detect one or more of a number of features, including but not limited to pupils, eyes, nose, mouth, ears, face, eyebrows, hair, and the like. In one exemplary embodiment, the pupils of the user are detected and a landmark is placed on the center of each pupil. In another embodiment, the user may optionally be prompted to confirm or edit the location of each pupil marker to ensure accuracy. The distance in pixel units between the pupils or other features is then converted from pixels to distance units, such as millimeters or inches, using the previously analyzed reference-object data. In another embodiment, the user may have previously acquired data about the size of their face, such as an interpupillary distance obtained from an optometrist or optical examination, and the user may input this data into the computer system instead of using a reference object to obtain the scale. Alternatively, the reference image may be acquired later in the process, or at the same time as the other image data.
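The pixel-to-distance conversion described above can be sketched as follows, assuming a coin of known diameter has already been detected and measured in pixels (the detection step itself is omitted, and all values and function names are illustrative):

```python
# Sketch of the scaling step: a reference object of known physical size
# (here a US quarter) converts pixel measurements to millimeters.
# The coin detection itself (e.g. via a trained detector) is assumed done.

COIN_DIAMETER_MM = 24.26  # US quarter

def mm_per_pixel(detected_coin_diameter_px, coin_diameter_mm=COIN_DIAMETER_MM):
    return coin_diameter_mm / detected_coin_diameter_px

def scaled_pupillary_distance(pupil_left_px, pupil_right_px, scale_mm_per_px):
    # Euclidean distance between pupil landmarks, converted to millimeters.
    dx = pupil_right_px[0] - pupil_left_px[0]
    dy = pupil_right_px[1] - pupil_left_px[1]
    return (dx * dx + dy * dy) ** 0.5 * scale_mm_per_px

scale = mm_per_pixel(121.3)               # coin measured as 121.3 px wide
pd = scaled_pupillary_distance((400, 310), (710, 312), scale)
print(round(pd, 1))  # 62.0 (mm)
```

The same scale factor can then be applied to any other pixel measurement taken from the reference image.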
The purpose of scaling the data with the reference object is to ensure that the measurement values can be derived from the final quantitative anatomical model of the user. There are several key measurements that can be used to best determine how to virtually place and fit the glasses on the image of the user's face.
Fig. 6 shows a graphical illustration of the relationship between the glasses 601 and the user's face 602. The location of contact between the lens and the face is important because the location controls the fit of the lens. Shown is the contact location between the glasses 601 and the nose 603 of the user. The location of contact between the eyewear 601 and the user's ears 604, and the height and length between the top of the eyewear 605 and the top of the ears 606 are also shown.
Fig. 7 illustrates various detailed eyeglass measurements. Fig. 7 shows eyeglasses 701, where the distance between the pupils 702 is the binocular pupillary distance (Pd) 703a, and the distance between the center of the nose and each pupil 702 is the monocular pupillary distance 703b. Further, if the highest quality optics are desired, or if specialized optics such as progressive lenses are desired, additional measurements regarding the eye and optics are useful, such as the vertex distance 709 (distance from eye to lens), pantoscopic tilt 710 (angle of the lens relative to the front of the face), face or frame wrap 704 (curvature of the frame around the face), lens height 713 (vertical position of the pupil in the lens), or optical center. The prior art is limited in that the rich information available from a fully quantified anatomical model of a user's face is not generated and used to fully customize the eyeglass frame and optics and achieve an optimal eyeglass shopping interface and experience.
For example, fig. 7 also shows the distance 707 between the nose pads of the glasses. In this regard, fig. 7 shows a model of the nose 711, which is used to derive quantitative measurements, including but not limited to the length 712 of the nose and its width 713 at various locations. Because no two users' noses are the same size, it is a great advantage to be able to accurately measure the size and shape of the nose and then customize the glasses to fit that anatomy perfectly. Optimal comfort of the nose pads resting on the user's nose is achieved if the two contact surfaces are correctly aligned and mated so that no high-pressure points exist, and if the spectacles are naturally supported in the correct position by the nose. Each user may have unique preferences as to where on the nose they prefer to wear their glasses to maximize comfort, aesthetics, or utility. Moreover, the configuration and shape of the nose vary greatly between ethnicities; for example, Asian users often have smaller noses and flatter nose bridges than Caucasian users, and often prefer glasses specifically designed for their population. However, there are significant advantages if the glasses are designed not for one population but for the individual user and their unique anatomy. It will be appreciated that a quantified anatomy of the nose enables custom glasses to rest precisely on the nose, achieving maximum comfort, aesthetics, and utility out of the box, without the subsequent adjustment typically performed by an optical professional. Indeed, many eyeglass designs, particularly plastic frames, do not allow features such as nose pads to be correctly adjusted later.
Fig. 7 also shows additional measurements of the temple length 705 and the distance 706 between the temples required to achieve engagement with the user's face. Moreover, the forehead, cheekbones, nose length, and head width may limit where the eyewear fits on the user's face. Other dimensions of the face, such as head shape, curvature, and the length, shape, and angle of the nose, are also useful in helping to suggest the best style and shape of eyewear for a particular user. The position of the pupils relative to the glasses is an important factor in ensuring good optical quality.
In one exemplary embodiment, the computer system directs the user to position and move their head while the camera captures a series of images or video. The rotation may be from side to side, up and down, or a combination. The computer system may guide the user to move their head to a precise location or simply request that the user approximate the movement shown on the display. In another embodiment, the user has a handheld computer system and rotates the camera around their head, rather than rotating their head. In another embodiment, rather than capturing an image or video with the computer system, the user uploads previously captured images or video to the system, or captures images or video with another imaging device and uploads them to the computer system.
The captured video may consist of a series of images of the user's face at various angles that make up a set of image data. The computer system may perform analysis on the image immediately after the image is captured to provide feedback to the user in the event of a problem or in the event of insufficient image quality, pose, or amount of data being acquired.
In one exemplary embodiment, the computer system analyzes the image data to ensure that the user's face is kept approximately centered within the frame. The computer system may run face detection algorithms on the image data to detect the boundaries of the face within each image. If the computer system detects that the face is out of range, detects interference or occlusion in front of the user's face, or detects excessive blurring or other unacceptable acquisition artifacts, a warning and instructions on how to reacquire a new set of image data are provided to the user. In addition, the computer system may crop or eliminate portions of the image data before performing more intensive calculations on the remaining data set, to reduce computation and/or transmission time. For example, the computer system may crop any image portions that fall outside the bounds of the detected face. In addition to detecting the face, the computer system may also estimate the pose (degree of rotation) of the face. The pose is estimated using face detector or classifier algorithms that are trained to determine pose. Using the pose estimate for each image, the computer system determines whether a sufficient range of poses has been captured; if not, the computer system may direct the user to re-acquire. The computer system may also filter out unwanted images, for example repeated poses or a small number of unacceptable images below a quality threshold. Rather than rejecting the entire image set, the computer system may reject some number of unacceptable images and process only those images that exceed a quality threshold based on the aforementioned metrics.
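The quality filtering and pose-coverage check described above might look like the following sketch; the frame records, quality scores, and thresholds are assumptions for illustration, not values from the patent.

```python
# Illustrative filtering pass: keep only frames above a quality threshold
# and check that the retained poses span a sufficient yaw range.

def filter_frames(frames, min_quality=0.5, required_yaw_span_deg=60.0):
    kept = [f for f in frames if f["quality"] >= min_quality]
    yaws = [f["yaw_deg"] for f in kept]
    span_ok = bool(yaws) and (max(yaws) - min(yaws)) >= required_yaw_span_deg
    return kept, span_ok  # if span_ok is False, prompt the user to re-acquire

frames = [
    {"yaw_deg": -40, "quality": 0.9},
    {"yaw_deg": -10, "quality": 0.2},   # too blurry: rejected
    {"yaw_deg": 0,   "quality": 0.8},
    {"yaw_deg": 35,  "quality": 0.7},
]
kept, ok = filter_frames(frames)
print(len(kept), ok)  # 3 True
```

A production system would derive the quality score from blur, occlusion, and detection-confidence metrics rather than from a single scalar.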
The computer system, automatically or with user input, identifies the exact image capture device and then uses an understanding of the device's optics to correct for optical distortion, or utilizes knowledge of the lens's depth of field to better analyze the data set. Depending on the image capture device, the computer system may also correct for distortions or imperfections, such as the barrel distortion observed with wide-angle lenses. These corrections enable the acquired image data to best represent the user.
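As one concrete illustration of such a correction, the sketch below inverts a one-coefficient radial (barrel) distortion model for a single normalized image point by fixed-point iteration. The coefficient value is illustrative; a real system would use calibration data for the identified device.

```python
# Minimal sketch of correcting barrel distortion for a single image point
# using a one-coefficient radial model: x_d = x_u * (1 + k1 * r^2).
# k1 is assumed known from calibration of the identified camera.

def undistort_point(xd, yd, k1, iterations=10):
    """Invert the radial model by fixed-point iteration (normalized coords)."""
    xu, yu = xd, yd
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + k1 * r2
        xu, yu = xd / factor, yd / factor
    return xu, yu

# Round-trip check: distort a point, then recover it.
k1 = -0.2  # barrel distortion
xu, yu = 0.3, 0.4
r2 = xu * xu + yu * yu
xd, yd = xu * (1 + k1 * r2), yu * (1 + k1 * r2)
rx, ry = undistort_point(xd, yd, k1)
print(round(rx, 4), round(ry, 4))  # 0.3 0.4
```

Full camera models also include tangential terms and higher-order radial coefficients, but the inversion strategy is the same.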
Quantifying anatomical models
Referring back to 10 of fig. 1A and 103 of fig. 1B, the method describes constructing a quantized anatomical model of at least a portion of a face and a head of a user. Once the complete image data set is acquired, the computer system analyzes the image data to construct a quantitative anatomical model of the user's face. The model is constructed using various techniques, and in one exemplary embodiment, the quantitative anatomical model is represented as a surface mesh composed of certain elements, including but not limited to polygons, curvilinear elements, and the like.
FIG. 8 shows an example of a mesh 804. The resolution of the mesh may be modified based on curvature, location, and features on the face, among other factors. For example, the resolution in detailed locations around the eyes and nose is higher than in areas with less detail (such as the top of the head). In one exemplary embodiment, the face mesh models only the front and sides of the face, while in other embodiments the mesh models the entire head or any smaller region containing the necessary parts of the face (such as only the eyes and nose). Alternative representations include point clouds, distance maps, image volumes, or vectors.
In an exemplary embodiment, a generic quantized anatomical model is deformed to fit the user's face. The model is parameterized and represented as a mesh, and individual mesh points are influenced by adjusting the parameters. FIG. 8 shows an example of a model 801 with mesh elements 804. In this example, one parameter affects the length 803 of the mouth feature 802. If the parameter affecting length 803 is adjusted, the appropriate mouth elements adjust their coordinates to match the specified parameter. Other models, such as shape models, may have general parameters, such as principal components, that do not correspond to a particular feature but allow the generic anatomical model to be adapted to a number of different face sizes and shapes.
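The parameter-to-mesh relationship described above can be sketched with a toy 2D model in which one parameter scales only the vertices tagged as the mouth feature; the vertex coordinates and indices below are invented for illustration.

```python
# Toy sketch of a parameterized mesh: one parameter scales the mouth-width
# feature, moving only the vertices tagged as mouth vertices.
import numpy as np

vertices = np.array([
    [0.0, 0.0], [10.0, 0.0],      # mouth corners (feature 802)
    [5.0, 20.0], [5.0, -20.0],    # other facial vertices, untouched
])
mouth_idx = [0, 1]

def set_mouth_length(verts, idx, scale):
    """Scale tagged vertices about the feature centroid (parameter 803)."""
    out = verts.copy()
    center = out[idx].mean(axis=0)
    out[idx] = center + (out[idx] - center) * scale
    return out

widened = set_mouth_length(vertices, mouth_idx, 1.2)
print(widened[1, 0] - widened[0, 0])  # mouth length: 10 -> 12.0
```

A statistical shape model replaces the hand-tagged index set with learned basis vectors, so one principal-component coefficient can move many vertices at once.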
The computer system analyzes the image data to iteratively perform a sequence of feature detection, pose estimation, alignment, and model parameter adjustment. Face detection and pose estimation algorithms are used to determine the overall position and direction of the face's orientation, which aids in model positioning and alignment. A machine learning approach may be used to train a classifier to detect faces and determine head pose in images that are processed to compute various features, including but not limited to Haar features or local binary features. The training data set consists of images of faces in various poses, with facial position and pose orientation annotated in the images, and may also contain specific facial features. The output consists of the position of the face in the image and a vector for the direction of head orientation.
Once the face and pose are determined for the first image frame, an iterative process begins in which more detailed facial features relevant to eyewear placement and overall facial geometry are located, including but not limited to the position of the eyes, the position and shape of the nose, the position of the ears, the position of the tops of the ears, the position of the corners of the mouth, the position of the chin, the edges of the face, and so forth. The image is analyzed again using machine learning to detect facial features and edges. Once these features are located, the generic quantized anatomical model parameters are aligned and adjusted to find the best fit to the features, minimizing the error between the detected feature locations and the mesh. Additional optimization of the generic quantitative anatomical model may be performed using texture information in the image to enhance local refinement of the model.
In an exemplary embodiment, the generic quantitative anatomical model has parameters that affect features including, but not limited to, eye position, eye size, face width, malar structure, ear position, ear size, forehead position, nose width and length and curvature, female/male shape, age, and the like. The convergence of the optimization is quantified using an estimate of the error between the detected features and the model. Minor variations between adjacent images in the data set are also used to improve pose estimation and alignment of the model with the image data. The process iterates to subsequent image frames.
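A highly simplified version of one alignment step, solving only for the scale and translation that minimize the error between model landmarks and detected image features (a full system would also solve for rotation, pose, and shape parameters), might look like this; the point values are synthetic.

```python
# Hedged sketch of one fitting iteration: least-squares scale and
# translation aligning model landmarks to detected features, with the
# residual error used as the convergence metric described above.
import numpy as np

def fit_scale_translation(model_pts, detected_pts):
    mm, dm = model_pts.mean(0), detected_pts.mean(0)
    mc, dc = model_pts - mm, detected_pts - dm
    scale = (mc * dc).sum() / (mc * mc).sum()   # least-squares scale
    t = dm - scale * mm
    aligned = scale * model_pts + t
    rmse = np.sqrt(((aligned - detected_pts) ** 2).sum(1).mean())
    return scale, t, rmse

model = np.array([[0.0, 0.0], [2.0, 0.0], [1.0, 1.5]])
detected = 3.0 * model + np.array([10.0, 20.0])   # synthetic detections
s, t, err = fit_scale_translation(model, detected)
print(round(s, 3), round(err, 6))  # 3.0 0.0
```

In the iterative scheme described in the text, the residual error would be recomputed after each parameter update and across frames until it falls below a convergence threshold.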
In one exemplary embodiment, features detected from adjacent image frames are used to initialize a subsequent frame or a previous frame to enhance feature detection. The process continues through the required number of images and may loop through the images multiple times to converge on the optimal parameters to minimize the error between the distorted generic model and the image data. Regularization and smoothing processes may be used to minimize noise and variations in the anatomical model fit between feature points, poses, and frames. As described above, the final quantified anatomical model will be scaled based on reference data, such as user input, or scale to a reference object. Alternatively, if the anatomical model is derived as a shape model of real world dimensions, the correlation between the shape and size of the face can be used to directly provide the scale of the model.
Because the model is refined through a series of images, the orientation and geometric relationship between the model and the image data is known. Bundle adjustment may be performed on the feature points and facial model across the images, which provides exact camera positions to align the anatomical model with the image data. This information can be used to orient the model and align it with the image data for subsequent rendering.
Those skilled in the art will recognize that there are many ways to construct and represent quantitative information from a set of image data. In another embodiment, the existing generic anatomical model need not be used to generate the quantitative anatomical model. Quantitative anatomical models are constructed directly using methods such as Structure From Motion (SFM) photogrammetry. In this technique, a series of images around the user's face is required. A 3D representation is constructed using the relative distances between the features detected in each image and the features in the different images. A method that combines a generic shape model with subsequent local SFM refinement can be utilized to enhance local details of features such as nose shape.
In another embodiment, the quantitative anatomical model consists of only a point cloud of detected key features. For example, the center of the eye, the corners of the eye, the tip of the nose, the top of the ear, and other important landmarks are detected and tracked through multiple images. These simple points oriented in space in the dataset provide all the information necessary to obtain the quantitative information needed for subsequent analysis. The information may be obtained using the methods described previously, or using other methods such as active appearance models or active shape models.
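Such a landmark-only model can be sketched as a dictionary of named 3D points from which fit measurements are read directly; the coordinates below are illustrative millimeter values, not real anatomical data.

```python
# Sketch of a landmark-only quantitative model: a handful of named 3D
# points, from which eyewear-fit measurements are computed directly.
landmarks = {
    "left_pupil":    (-31.0, 0.0, 0.0),
    "right_pupil":   ( 31.0, 0.0, 0.0),
    "nose_tip":      (  0.0, -20.0, 18.0),
    "left_ear_top":  (-72.0, 5.0, -85.0),
    "right_ear_top": ( 72.0, 5.0, -85.0),
}

def distance(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

pd = distance(landmarks["left_pupil"], landmarks["right_pupil"])
temple_span = distance(landmarks["left_ear_top"], landmarks["right_ear_top"])
print(pd, temple_span)  # 62.0 144.0
```

Even this sparse representation supports the key eyewear measurements (Pd, temple span, nose position) without a full surface mesh.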
Image data can be acquired using devices such as depth cameras or laser sensors, and existing techniques describe how these devices, through their ability to measure distances, can directly generate 3D models, essentially acting as a 3D scanner. In addition, depth may be estimated from out-of-focus regions or from the disparity between adjacent images.
Alternatively, the quantified anatomical model and dimensions may be derived from a pre-existing model of the user's face that the user possesses. The model may be acquired from a 3D scanning system or an imaging device. If the user already has an anatomical model of their face, they may digitally transfer the anatomical model to a computer system via a non-transitory computer-readable medium, a network connection, or other means.
During the capture of user image data for customizing a product, such as glasses, the scale and size of the user are important factors in ensuring that the resulting product is of the proper size and that the user receives a product that matches the previewed version. The following embodiments describe various systems and methods for acquiring, scaling, and reconstructing an anatomical model from image data:
embodiments for scaling an anatomical model of a user's face with reference objects present in multiple images
Referring now to fig. 17, for this embodiment, a) a computer system 1701 is configured with a camera or imaging device 1702 for capturing image data of a user 1703; b) a reference target 1704 of known size (e.g., coin, credit card, phone, tablet, screen, paper, ruler, etc.) is positioned such that the reference target 1704 is visible in at least some images of the user; c) the reference target has at least one predetermined dimension 1705 (e.g., the diameter of a coin); d) the computer system reconstructing an anatomical model of the user's face based on the image data; e) the computer system detecting a reference target in at least some of the images, including detecting at least one predetermined dimension; f) the computer system registers the anatomical model with the original user image such that the model coordinates and camera position align the facial model with the pose, position, and scale of the image of the user's face 1703; g) the computer system setting a scaling factor for the size of the anatomical model using the ratio of the detected target size to the known size of the reference target in each image; and h) the computer system may additionally average or weight the measured sizes of the plurality of predetermined sizes of the reference target in each frame in order to reduce the error of any single size measurement.
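Steps (g) and (h) of this embodiment, combining per-frame scale estimates with a weighted average so that no single poor detection dominates, might be sketched as follows; the detection values and confidence weights are invented for illustration.

```python
# Sketch of combining per-frame scale factors (known reference-target size
# divided by the detected size) with a confidence-weighted average.

KNOWN_DIAMETER_MM = 24.26  # e.g. a coin used as reference target 1704

def combined_scale(detections):
    """detections: list of (measured_px, detection_confidence)."""
    total_w = sum(w for _, w in detections)
    return sum((KNOWN_DIAMETER_MM / px) * w for px, w in detections) / total_w

# Three frames; the last has a poor detection, down-weighted by confidence.
frames = [(120.0, 0.9), (124.0, 0.8), (98.0, 0.1)]
scale_mm_per_px = combined_scale(frames)
print(round(scale_mm_per_px, 4))  # ~0.2018 mm per pixel
```

Down-weighting rather than discarding outright keeps some information from marginal frames while limiting the error any single measurement can introduce.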
Embodiments for scaling an anatomical model of a user's face with reference objects present in only one image
In this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device; b) capturing individual images of a user using a computer system configured with a camera or imaging device, a reference target of known size being present in the images; c) the reference target has at least one predetermined dimension (e.g., the diameter of a coin); d) the computer system reconstructing an anatomical model of the user's face based on the image data; e) the computer system registering the anatomical model with an image of the user containing the reference target such that the model coordinates and camera position align the facial model with the pose, position and scale of the image of the user's face; and f) the computer system setting a scaling factor for the size of the face model using the ratio of the detected target size to the known size of the reference target in the image.
Embodiments of scaling image data for constructing an anatomical model of a user's face:
in this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device; b) a reference target of known dimensions (e.g., coin, credit card, telephone, tablet, screen, paper, ruler, etc.) is positioned such that the reference target is visible in at least some images of the user; c) the reference target has at least one predetermined dimension (e.g., the diameter of a coin); d) the computer system detecting a reference target in the at least one image, including detecting at least one predetermined dimension; e) the computer system uses the ratio of the detected size of the object to the predetermined size to set a scale factor for the image data (e.g., applying a physical size to the size of a pixel); and f) the computer system reconstructs an anatomical model of the user's face based on the image data, the model inheriting the scale applied to the underlying images.
Embodiment for scaling an anatomical model of a user's face with a reference object contained in the model
An advantage of this embodiment is that the orientation and position of the reference object relative to the user's face is not important, since the reference object will be reconstructed with the model.
Referring to fig. 19, in this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device; b) a reference target of known dimensions (e.g., coin, credit card, telephone, tablet, screen, paper, ruler, etc.) is positioned such that the reference target is visible in at least some images of the user; c) the reference target has at least one predetermined dimension (e.g., the diameter of a coin); d) as shown in fig. 19, the computer system reconstructs a model (or models) of the user's face 1901 and a reference target 1902 based on the image data, wherein the face and target may or may not be in contact with each other, such that the two models are located in space relative to each other; e) the computer system detecting a reference target in the model, including detecting at least one predetermined dimension; f) the computer system setting a scale factor of the overall model using a ratio of a size of a reference target detected in the model to a predetermined size of the target; and g) optionally, the computer system removes the reference object from the model after scaling, leaving only the final scaled face model.
Embodiment for scaling an anatomical model of a user's face with a user-input interpupillary distance (Pd)
In this embodiment, the user typically has an optometrist measure their Pd, which provides a reference dimension for scaling the head. This is performed as follows: a) capturing image data of a user using a computer system configured with a camera or imaging device; b) the computer system reconstructing an anatomical model of the user's face based on the image data; c) the computer system detecting eye features (pupil, iris, etc.) of the user in the face model and measuring distances between the eye features; d) before, after, or during the image acquisition and reconstruction process, the user provides their Pd measurement; and e) the computer system uses the user's Pd measurement to set a scale factor for the model dimensions, resizing the model so that the eye distance measured in the model matches the user's actual Pd.
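Steps (c) through (e) can be sketched as follows, rescaling every model vertex so that the measured eye distance matches the user-supplied Pd; the vertex data is illustrative.

```python
# Sketch: measure the unscaled eye distance in the model, then rescale
# every vertex so that distance matches the optometrist-measured Pd.
import numpy as np

def scale_model_to_pd(vertices, left_pupil_idx, right_pupil_idx, user_pd_mm):
    model_pd = np.linalg.norm(vertices[right_pupil_idx] - vertices[left_pupil_idx])
    return vertices * (user_pd_mm / model_pd)

# Unscaled model: pupils 2.0 model units apart.
verts = np.array([[-1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, -0.6, 0.5]])
scaled = scale_model_to_pd(verts, 0, 1, user_pd_mm=64.0)
new_pd = np.linalg.norm(scaled[1] - scaled[0])
print(new_pd)  # 64.0
```

Because the whole model is scaled uniformly, every other measurement (nose width, temple span) inherits the same millimeter scale.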
An embodiment for scaling an anatomical model of a user's face using dimensions detected and measured in an image and then applied to the facial model
In this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device; b) a reference target of known dimensions (e.g., coin, credit card, telephone, tablet, screen, paper, ruler, etc.) is positioned such that the reference target is visible in at least some images of the user; c) the reference target is determined to have at least one predetermined dimension (e.g., the diameter of a coin); d) the computer system detecting a reference target in the at least one image, including detecting at least one predetermined dimension; e) the computer system detecting facial features (pupil, iris, eye angle, mouth angle, nose, etc.) in the at least one image and measuring unscaled distances between the facial features; f) the computer system reconstructing an anatomical model of the user's face based on the image data; g) the computer system uses the ratio of the size of the reference target detected in the image to the predetermined size of the target to set the scale factor (Pd, distance between corners of the eyes, mouth width, etc.) of the detected facial features; h) the computer system detecting facial features in the facial model, measuring distances between the facial features, and scaling the facial model using the scaled facial feature measurements; and i) optionally, the computer system detects facial features directly in the facial model aligned with the image data without first detecting facial features in the image data.
An embodiment for scaling an anatomical model of a user's face by determining a depth using an existing reference target
In this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device; b) a reference target of known dimensions (e.g., coin, credit card, telephone, tablet, screen, paper, ruler, etc.) is positioned such that the reference target is visible in at least some images of the user; c) the reference target has at least one predetermined dimension (e.g., the diameter of a coin); d) the computer system detecting a reference target in at least some of the images, including detecting at least one predetermined dimension; e) as shown in FIG. 17, the computer system 1701 uses a detected dimension 1705 of a reference target 1704, a known size, and intrinsic camera parameters to determine a distance 1706 from the camera to the target; f) the computer system reconstructing a model of the user's face based on the image; g) the computer system determining a scale of a face model of the user using the distance from the reference target and the user's face and the intrinsic camera parameters; and h) optionally, the computer system averages the sizes of the reference targets measured from the plurality of frames to reduce error of any single image measurement prior to scaling the face model.
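Step (e) above is the standard pinhole-camera relation: the distance to a target of known physical size follows from its size in pixels and the camera's focal length expressed in pixels. An illustrative sketch (the focal length, coin size, and pixel measurement are assumptions):

```python
# Sketch of step (e) under the pinhole camera model: Z = f * W / w, where
# f is the focal length in pixels, W the known physical size, and w the
# measured size in pixels. All numbers are illustrative.

def distance_to_target(focal_length_px, known_size_mm, measured_size_px):
    """Camera-to-target distance from a known-size reference in the image."""
    return focal_length_px * known_size_mm / measured_size_px

# A 24.26 mm coin imaged at 100 px with a 1000 px focal length -> ~242.6 mm away.
z_mm = distance_to_target(focal_length_px=1000.0, known_size_mm=24.26,
                          measured_size_px=100.0)
```

As step (h) notes, averaging this estimate over many frames reduces the error of any single measurement.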
An embodiment for scaling an anatomical model of a user's face, using a computer system, with a depth detected in an image
In this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device with depth sensing capabilities; b) the user positions the computer system to obtain their own images, while the computer system also measures the distance from the computer to the user (rangefinder, auto focus distance, depth sensor, etc.); c) the computer system uses the distance measured from the computer to the user and the intrinsic camera parameters to determine the scale of the image; and d) the computer system reconstructing a model of the user's face based on the image data; where the model is inherently scaled based on the size in the image.
An embodiment for scaling an anatomical model of a user's face, using a computer system, with a depth detected at each pixel
In this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device with depth sensing capabilities; b) the user positioning the computer system to obtain their own image while the computer system also measures the distance from the computer to each pixel in the image data; c) the computer system scaling each pixel of the image data using the distance measured from the computer to the user at each pixel and using parameters inherent to the camera; and d) the computer system reconstructing a model of the user's face based on the image data, applying the scale of each pixel to the model, such that the model is scaled when complete.
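Steps (b) and (c) above correspond to back-projecting each pixel through the camera intrinsics with its measured depth, which yields metrically scaled 3D points directly. A hedged sketch; the intrinsic values (fx, fy, cx, cy) are illustrative assumptions:

```python
# Illustrative sketch of per-pixel depth scaling: convert a pixel (u, v) with
# a measured depth into a 3D point in camera coordinates using the intrinsics.

def backproject(u, v, depth_mm, fx, fy, cx, cy):
    """Back-project one pixel; the result is already in millimetres."""
    x = (u - cx) * depth_mm / fx
    y = (v - cy) * depth_mm / fy
    return (x, y, depth_mm)

point = backproject(u=420, v=240, depth_mm=500.0,
                    fx=600.0, fy=600.0, cx=320.0, cy=240.0)
# -> (~83.33, 0.0, 500.0): metric coordinates, so the reconstructed model
#    needs no separate scaling step, as noted in step (d)
```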
An embodiment for scaling an anatomical model of a user's face, using a computer system, with depth detected only at close range
In this embodiment, a) image data of a user is acquired using a computer system configured with a camera or imaging device with depth sensing capabilities; b) capturing close-up image data of a user using a computer system configured with a camera with depth sensing capabilities, e.g., image data including at least the user's eyes or other facial features; c) during the capturing of the close-up image, the user positions the computer system to obtain an image of at least some of the facial features while the computer system also measures the distance from the computer to the user; d) the computer system detecting facial features (iris, pupil, etc.) in the close-up image and measuring distances between the features; e) the computer system determining a proportion of pixels in the image data using the distance measured from the computer to the user and the inherent camera properties; f) the computer system determining a reference distance between the facial features based on the image scale and the measured distance between the features; g) the computer system reconstructing a model of the user's face based on image data of the user's entire face; and h) the computer system detects facial features in the face model, measures distances between the facial features, and scales the face model using the reference feature measurements.
An embodiment for scaling an anatomical model of a user's face using a computer system and a mirror reflection.
Referring to fig. 20, in this embodiment, a) image data of a user 2004 is acquired using a computer system 2001 configured with an imaging device 2003 and a display 2008 on the same side as the imaging device; b) user 2004 captures an image in front of mirror 2007 with display 2008 and imaging device 2003 facing toward mirror 2007 so that they simultaneously capture image data of the user and the device displays a preview of the image data captured by the imaging device through the mirror reflection as well; c) the computer system detecting at least one dimension of the computer system in the image (screen size, feature size on the computer, reference image on the computer, etc.); d) the computer system determines a known reference size of the detected dimensions by providing its make/model, screen size, size of the reference image, etc.; e) the computer system detecting at least one dimension (distance between eye features, head size, model dimensions, etc.) in each of a user's simultaneous sets of image data (user and user on the device's display); f) computer system 2001 uses the reference dimensions and inherent camera properties of the computer system to determine distance 2009 between the device and the mirror; g) the computer system setting a scale factor for the detected user size using the distance between the device and the mirror, the detected user size on the display of the device, the detected user size in the mirror, and the attributes of the imaging device; h) the computer system reconstructing a model of the user's face based on the image data; i) the computer system detecting a user size on the reconstructed model and scaling the model based on the scale factor; and j) optionally, the user may place or hold the reference object against the mirror to determine the distance from the computer system to the mirror.
An embodiment for scaling an anatomical model of a user's face using the front and rear cameras of a computing device
Referring again to fig. 20, in this embodiment, a) image data of a user is acquired using a computer system 2001 configured with imaging devices on a front side 2002 and a back side 2003 of the computer system; b) user 2004 captures image data in front of mirror 2007, thereby causing it to capture the user's image data with one camera (direction 2005) and the user's reflected image with the opposite camera (direction 2006) simultaneously; c) the computer system detecting at least one dimension of the computer system (screen size, feature size on the computer, reference image on the computer, etc.) in the image data; d) the computer system determines a known reference size of the detected dimensions by providing its make/model, screen size, size of the reference image, etc.; e) the computer system reconstructing an anatomical model of the user's face based on the image data, wherein the computer system optionally uses the pair of image data together as stereo data to enhance the 3D reconstruction; f) the computer system aligning the anatomical model over the two sets of image data; g) the computer system determining a scale factor for the model using the reference dimensions, the aligned anatomical model, and the camera-intrinsic parameters; and, h) optionally, the user places or holds the reference object against the mirror to determine the distance from the computer system to the mirror.
An embodiment for scaling an anatomical model of a user's face using a computer system and a mirror:
in this embodiment, a) image data of a user positioned in front of a mirror is acquired using a computer system configured with a camera or imaging device, wherein the camera is positioned next to the user's face; b) a reference target of known dimensions (e.g., coin, credit card, telephone, tablet, screen, paper, ruler, etc.) is positioned such that the reference target is on the mirror and visible in at least some images of the user; c) the reference target has at least one predetermined dimension (e.g., the diameter of a coin); d) the computer system detecting a reference target in the at least one image, including detecting at least one predetermined dimension; e) the computer system reconstructing an anatomical model of the user's face based on the image data; f) the computer system uses parameters inherent to the camera, the detected reference dimensions of the reference object, and the known dimensions of the reference object to determine the distance from the camera to the mirror. Because the mirror is the midpoint between the user and the user's reflection seen by the camera, the distance from the camera to the user is twice the distance from the camera to the mirror; g) the computer system sets the scale of the image data using the distance from the camera to the user and parameters inherent to the camera; and h) the computer system reconstructing an anatomical model of the user's face based on the image data.
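The key geometric fact in step (f) is that the mirror lies at the midpoint of the optical path, so the camera-to-user distance is twice the camera-to-mirror distance computed from the on-mirror reference target. A minimal sketch with illustrative values (focal length and credit-card width are assumptions):

```python
# Sketch of steps (f)-(g): camera-to-mirror distance from a known-size target
# on the mirror (pinhole model), then doubled to get the camera-to-user
# distance. All numbers are illustrative.

def camera_to_user_distance(focal_px, target_size_mm, target_size_px):
    camera_to_mirror = focal_px * target_size_mm / target_size_px  # pinhole model
    return 2.0 * camera_to_mirror  # mirror is the midpoint of the optical path

# Credit-card width (85.6 mm) on the mirror imaged at 200 px, f = 1000 px:
d = camera_to_user_distance(focal_px=1000.0, target_size_mm=85.6,
                            target_size_px=200.0)
# mirror at 428 mm, so the user's reflection is 856 mm from the camera
```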
Embodiments for constructing and scaling an anatomical model of a user's face from a collection of previously acquired images
An advantage of this embodiment is the use of a collection of previously captured images (e.g., a collection of existing photos, an album, a social network or online image album, etc.) that may be at the user's disposal. Referring to fig. 21, in this embodiment, a) the computer system receives a collection of images (e.g., 2101, 2102, 2103) of a user 2105; b) facial recognition data of the images may be tagged in advance to determine which face in each photograph is the user; c) if the images have not been previously tagged, the computer system performs facial recognition, prompts the user to identify which face is the user in at least one image, or uses the most frequently detected face to distinguish the user from others in the photographs; d) the computer system detects facial features (e.g., points of the eyes, nose, mouth, ears, jaw, etc.) in each image of the user and fits the facial model 2104 to the image data; e) optionally, the computer system determines the expression in each image (e.g., 2101 vs. 2103) and adjusts the face model to a neutral expression; f) the computer system determines the pose of the user's face in each image; and g) the computer system reconstructs a single model 2104 of the user's face by fitting the face model across the set of features and camera positions (2105, 2106, 2107). The face model is then scaled by one of these methods: h) the computer system requests additional data from the user per one of the aforementioned methods (Pd input, an image with a reference target, etc.); i) the computer system detects known objects in the images to determine a reference size (e.g., recognizing paper, logos, phones, etc.); j) the computer system requests additional image data capturing the user and a reference object, using any other method described herein; or k) the face model is inherently scaled by using a shape model that relates overall size to shape.
Embodiments for scaling a user's face using existing glasses already owned by the user
Many people who purchase eyeglasses already own a pair and, whether or not that pair fits well, it can be used to help scale the size of the user's face. Alternatively, the manufacturer may send a pair of sample glasses for use in this process.
Referring to fig. 22, in this embodiment, a) image data 2201 of a user 2202 is acquired using a computer system configured with a camera or imaging device; b) separate image data 2203 of the user wearing their own glasses 2204 is captured using the computer system; c) the computer system requests that the user provide reference dimensional information about the pair of glasses, such as the width 2205 or length of the frame, the size of the lenses, etc. (e.g., by scaling a photograph of the glasses taken next to a reference target, by aligning the glasses 2207 against a 1:1-scale reference 2208 shown on the computer system display 2206 (as explained in later embodiments), by entering a measurement or the model name of the glasses, or by using a ruler or an interactive ruler displayed on a screen with which the user can measure their glasses); d) the computer system reconstructs a model of the user's face based on the image data; e) the computer system detects a size of the glasses in the image data (e.g., the overall width or height of the frame, the width of a lens, etc.); f) the computer system associates features or models of the user's face (e.g., eyes 2209 and mouth corners 2210) between the glasses-on and glasses-off image data; g) the computer system determines a scale factor for the face model based on the detected size of the glasses, the reference size of the glasses, and the associated features between the glasses-on and glasses-off image data; and h) the computer system registers the face model with the original user image, such that the model coordinates and camera position align the face model with the pose, position, and scale of the image of the user's face.
An embodiment for scaling a user's face using sonar
For any embodiment that requires calculating the distance from the computer system to the user, or from the computer system to a mirror, a sonar method can be used.
The following embodiment describes determining distance using sound: a) image data of a user is captured using a computer system configured with a camera or imaging device; b) the computer system, also configured with a microphone and a speaker, emits a sound (e.g., a series of frequencies, repeated sounds, etc.) and records the same sound with the microphone; c) the sound is emitted from the speaker on the device, from the user's body, or from a headset or other device held at a distance; d) the computer system calculates the distance between itself and an object, such as the distance from the computer system to a mirror, or the distance from a headset at the user's ear to the computer system, by analyzing the time elapsed from when the sound is emitted to when it is detected by the computer system's microphone; e) the computer system can use multiple sounds, filtering, or other analysis methods to reduce noise, reflections, and artifacts, and to optimize the accuracy of the distance detection; and f) the computer system uses the distance to scale the anatomical model of the user from the image data, as described in other embodiments.
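The time-of-flight calculation in step (d) is straightforward: the echo traverses the path twice, so the one-way distance is half the round-trip travel. An illustrative sketch (the speed-of-sound constant and timing value are assumptions):

```python
# Illustrative sketch of step (d): one-way distance from a round-trip echo
# time, using the approximate speed of sound in air at 20 °C.

SPEED_OF_SOUND_M_S = 343.0  # assumed; varies with temperature and humidity

def distance_from_echo(elapsed_s):
    """Sound travels out and back, so halve the round-trip path."""
    return SPEED_OF_SOUND_M_S * elapsed_s / 2.0

d_m = distance_from_echo(0.004)  # a 4 ms round trip -> 0.686 m to the mirror
```

In practice the averaging and filtering of step (e) would be applied to many such measurements before the distance is trusted for scaling.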
An embodiment of determining Pd from a face model that has been reconstructed and scaled:
in this embodiment, a) the computer system obtains a scaled face model of the user from the image data (using any of the methods previously described), b) the computer system detects features of the eyes (iris, pupil, etc.) from the face model, and c) the computer system measures the distance between the eye features on the face model to calculate Pd.
An embodiment that provides users with a way to measure the size of a reference object of their choice.
For any embodiment requiring a reference object of known size, in some cases the user needs to use an object whose dimensions neither they nor the computer system know, e.g., business cards, pencils, glasses they already own, etc.
This particular embodiment describes a system for measuring rectangular objects (or objects that can fit within a rectangle) whose dimensions are unknown, but the method can be extended to any shape. Referring to fig. 23: a) displaying a reference frame 2303 on a display using a computer system 2301 configured with a display 2302 and an input device, b) the computer system obtaining information about the computer system's display, such as resolution, pixel size, overall display size. The computer system obtains this information from itself, software on the computer system, from a web browser, from a user providing information about the display or computer system model, c) the computer system calculates the pixel size of the display (e.g., by dividing the length and width of the screen by the number of pixels). d) The computer system then calculates the true size of the reference frame 2303 on the display, e) the computer system instructs the user to place their reference object 2306 against the screen and adjust the reference frame 2303 to match the size of the object 2307 as specified by 2305 using an input device (touch screen, mouse, touch pad, gesture, etc.), f) the computer obtains the size of the reference object by calculating the adjusted size of the reference frame, and g) optionally, the computer system is configured with an imaging device 2308 to capture image data of the reference object so that it obtains information about the appearance of the object for recognition in future images. If the computer system is configured with a depth imaging device, the computer system uses the depth and scale information to enhance the measurement of the reference object.
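Steps (b) through (f) above hinge on the display's pixel pitch: once the physical size per pixel is known, the adjusted reference frame's pixel width converts directly to the object's physical width. A hedged sketch with illustrative panel dimensions:

```python
# Sketch of steps (b)-(f): derive the physical size of an on-screen reference
# frame from the display's resolution and physical width, then read off the
# object's size after the user matches the frame to it. Values are assumptions.

def object_size_mm(frame_px, screen_width_mm, screen_width_px):
    pixel_pitch = screen_width_mm / screen_width_px  # mm per pixel (step c)
    return frame_px * pixel_pitch                    # true frame size (steps d-f)

# A 13.3-inch panel ~294 mm wide at 1920 px; the user stretches the frame to
# 560 px to match the object held against the screen.
size = object_size_mm(frame_px=560, screen_width_mm=294.0, screen_width_px=1920)
# -> ~85.75 mm, the measured width of the reference object
```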
For any embodiment involving the use of a reference object, the object need not be perpendicular to the imaging device to obtain the correct dimensions. Using previous knowledge about the reference object, the angle of the object relative to the camera is determined. The true reference dimension of the object is determined using the angle on the image plane and the measured distance.
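The angle correction described above follows from simple foreshortening geometry: a planar object tilted by angle θ relative to the image plane appears shortened by cos(θ), so dividing the measured dimension by cos(θ) recovers the true dimension. A minimal sketch with an illustrative tilt and measurement:

```python
# Hedged sketch of the foreshortening correction: recover the true dimension
# of a reference object tilted away from the image plane. Values are assumed.
import math

def true_size(measured_size, tilt_deg):
    """Undo foreshortening: apparent = true * cos(theta)."""
    return measured_size / math.cos(math.radians(tilt_deg))

# A card measured at 74.1 mm while tilted 30 degrees -> ~85.6 mm true width.
w = true_size(measured_size=74.1, tilt_deg=30.0)
```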
Optional user preferences and information
FIGS. 1A and 1B depict capturing the user's prescription data and other information to inform the analysis in step 104. This step can be performed at a later time, but if the data capture is computationally expensive, it is advantageous to capture the data while the computer system analyzes the image data. The computer system requests information from the user, who inputs it via an input device connected to the computer system. The computer system may also receive information by obtaining image data of the physical information itself (e.g., a photograph of the prescription). The computer system may use optical character recognition to decode the image and extract the user's prescription data. The computer system may also receive user information through voice recognition, electronically delivered data, or other means. The use of the information input by the user is described later, in the discussion of modeling the lenses and creating the customized eyewear model.
Configurable product model
In fig. 1A and 1B, steps 106 and 107 depict a configurable product or a configurable eyewear model. In an exemplary embodiment, the configurable model is three-dimensional, configured with parametric features and dimensions, and represented as a 3D surface mesh. The 3D model of the glasses is created by a variety of methods, such as 3D capture via scanning or photogrammetry, or by 3D Computer Aided Drafting (CAD) or 3D modeling. It should be noted that a variety of other methods may be used or a variety of other representations of the model may be configured, such as a 2D model, a shape model, a feature-based model, and so forth.
In an exemplary embodiment, a 3D parametric model is created by the eyewear manufacturer, comprising the frame, or the frame and lenses. The 3D parametric model is created as a surface mesh or solid model composed of elements or features including, but not limited to, polygons, curve elements, and the like. The parametric model enables one or more dimensions of the eyewear to be altered, which updates the appropriate model and mesh elements while maintaining consistent relationships between the other features.
Fig. 9 shows an example of an eyeglass model 901 that is adapted into an eyeglass model 902 by modifying a parameter for the eyeglass width 903 around the lens. The advantage of the parameterized eyewear model is that the width 907 of the bridge and nose pads is preserved, the height 908 is preserved, and the overall aesthetic appearance remains consistent between eyeglass models 901 and 902. Parameterization enables wide variation in a single aspect of frame 901 without affecting other important design elements. One advantage of a parameterized eyewear model is that changes are propagated from one feature to the rest of the model while all other features remain constrained. These changes are represented as simple numbers, which allows for very efficient data transfer and storage. These parameters can produce myriad variations in the size and shape of the product, allowing the fit of the custom model to match the user's anatomy and preferences with great accuracy, if desired. The shape of the lens in this example can vary widely, or in an infinite number of variations, embodying the core rationale of a product customized from scratch. By changing and customizing the base shape of the primary part (in this case the front frame), the design is inherently customized for the individual user in a unique way that could never be achieved with pre-manufactured or stocked parts.
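The parametric behaviour described for fig. 9 can be illustrated in miniature: changing one parameter regenerates the dependent dimensions while constrained features (bridge width, symmetry) are held fixed. The parameter names and values below are assumptions for illustration, not the patent's data model:

```python
# Illustrative sketch of a parametric update: widening the lenses recomputes
# the total frame width while the constrained bridge width stays fixed and
# symmetry keeps both lenses equal.

def update_frame(params, lens_width_mm):
    p = dict(params, lens_width=lens_width_mm)  # copy, then change one parameter
    # total width follows from the constrained bridge plus two symmetric lenses
    p["total_width"] = 2 * p["lens_width"] + p["bridge_width"]
    return p

frame = {"lens_width": 50.0, "bridge_width": 18.0, "total_width": 118.0}
wider = update_frame(frame, lens_width_mm=54.0)  # total_width becomes 126.0
```

A real configurable model would propagate such constraints across a full mesh, but the principle is the same: one number changes, and dependent geometry follows while protected features stay put.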
Fig. 13 illustrates an exemplary base eyewear design 1301 that embodies further shape customization. The base design is the basic pattern or shape that the model of the glasses has, and this base design can be modified by configuration and parameters. The computer system adjusts the curvature between points 1305 and 1307. Alternatively, the user directs the computer system input device, selects a point on the glasses at 1305, and moves along the dotted line in the direction of arrow 1306 toward point 1307. The glasses 1302 will then be modified in the edited region 1308. To maintain symmetry while reducing the number of steps necessary to customize the eyewear, changes on one side of the eyewear are applied to the other side of the eyewear as well, as shown in the updated eyewear 1303. This symmetry effect is one example of a constraint that can be introduced as a feature of the configurable model.
The configurable eyewear model has constraints that prevent certain critical sections/regions from being modified to designs that are no longer optimal to manufacture. For example, the minimum thickness of each section is limited to ensure structural strength, the minimum thickness around the lens is limited to ensure that the lens can be assembled into the eyewear without breaking, and the possible hinge locations are limited to ensure that the hinge can fit and seat at the proper angle. If a particular storage component hinge must be used, the attachment point of the hinge must be consistent regardless of changes in the basic form and shape of the customized eyewear. In addition, certain features are relevant due to the symmetrical or stacked effect; for example, if a computer or user adjusts the width or thickness of one portion of the edge, the entire edge on both sides is adjusted to ensure a symmetrical and attractive appearance. The overall position of the features (such as the position of the hinge and nose pads, etc.) is still constrained. All of these constraints and relationships will be pre-programmed by the eyewear designer and will be integrated in the configurable model.
FIG. 29 illustrates an example of customization implemented with a configurable product model; in particular, the ability to combine various parameters to refine and customize product models. Eyewear model 2900 is shown configured into 16 variations in the figure. The columns 2902 illustrate exemplary configurations of the eyeglass lens width 2903 and height 2904. The rows 2901 illustrate combinations of various parameters: the nose bridge width 2905, the distance between the temples where they contact the ears 2906, the height from the front frame to the ears 2907, and other minor changes. Key features such as the material thickness 2908 and the hinge size and location 2909 remain unchanged. The parameter configuration enables great configurability of the eyewear design while remaining manufacturable. A manufacturer can use one hinge and one material thickness for all of these designs and more, yet still allow for great customization of the base shape and size. Models 2900 and 2910 are clearly distinct and would normally require different mass-produced products. It is simply impractical to provide customers with this level of variation through traditional mass production, which would require the design and storage of thousands, millions, or more components. The configurable model, together with the rest of the methods and systems described herein, allows one base model to be configured into all of the configurations illustrated in FIG. 29, so that one product can be tailored to an individual customer and then produced. It should be noted that these 16 variations represent a very small subset of the total potential design variations; thousands, millions, or countless variations may be achieved by interpolating between the examples shown (as will be explained in more detail below) and by varying other parameters not shown in the figure.
For example, suppose there are 10 parameters that can be changed in the configurable model; each parameter has 20 increments (this could also be infinite), such as distances of 2 mm, 4 mm, 6 mm, etc.; and the model is offered in 20 colors and 3 finishes. This one model then has 20^10 x 20 x 3, or approximately 6 x 10^14 (six hundred trillion), configuration combinations. It should also be noted that these kinds of configurations cannot be achieved by swapping and assembling ready-made, off-the-shelf components. The basic shape and size of the part change each time a parameter is changed, requiring the configurable model and its parts to be manufactured from scratch. This level of customization can only be achieved with the from-scratch customization approach described herein.
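The combination count above can be checked directly under the stated assumptions (10 independent parameters with 20 increments each, 20 colors, 3 finishes):

```python
# Verifying the configuration-count arithmetic from the passage. The inputs
# are the example's stated assumptions, not real product data.

parameters, increments = 10, 20
colors, finishes = 20, 3

combinations = increments ** parameters * colors * finishes
# 20**10 = 1.024e13; times 60 -> about 6.1e14 distinct configurations
```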
In addition to geometry, the eyewear model may also have parameters for surface finish, color, texture, and other cosmetic attributes. The 3D glasses model can be mapped with an image texture to represent the surface, or rendered with texture, lighting, and surface properties (such as reflection, transmission, subsurface scattering, or surface roughness) to represent the true photographic appearance of the glasses. The configurable nature of the model allows for the representation of a large number of materials, paints, colors, and surface finishes. The glasses and lenses are rendered as photorealistically as possible using various rendering techniques known to those skilled in the art, such as ray tracing, in order to accurately represent and reproduce on a display the exact appearance of the frame and lenses as they will be manufactured. Other optical interaction effects, such as shadows and reflections, may be displayed on the glasses and on the 3D model of the user's face. The 3D glasses model has a hinge point at the temples to allow the temples to bend with respect to the frame front and fit the user's facial model. In another embodiment, the 3D eyeglass model also allows for a suitable elastic modulus (degree of stretch) in the bulk material properties of the frame, which may depend on the selected frame material.
Product customization
Once the anatomical model is constructed, it is used to inform the placement and customization of the configurable product model. In an exemplary embodiment, the computer system automatically adjusts the glasses to the user's face based on at least one of: the anatomical model, the user's preference input, and the user's image data. With the quantified dimensions of both the anatomical model and the configurable eyewear model known to the computer system, various sizing adjustments are made automatically to ensure an optimal fit, or to reach a solution that closely approximates an optimal fit. Three different approaches are described: customizing the configurable eyewear model before aligning/placing it relative to the anatomical model and rendering a preview for the user; customizing the configurable eyewear model after aligning/placing it relative to the anatomical model but before rendering a preview for the user; and customizing the configurable eyewear model after aligning/placing and rendering the preview, thereby enabling the user to provide additional input after seeing the underlying pre-configured eyewear model on their face.
Customization prior to placement on an anatomical model
In one embodiment, the eyewear model is automatically customized before being placed on the anatomical model; thus, before fitting or rendering the eyewear model directly to the user's image, a completely new and customized design is created:
refer to fig. 30. In this embodiment, a) the computer system obtains a scaled face model 3001 (using any of the methods previously described) that identifies key facial features 3005, including but not limited to the sizes, points, lines, and surfaces of the eyes, nose, ears, forehead, etc.; b) the computer system obtains a configurable 3D product model 3002 that identifies key features 3006, including but not limited to the sizes, points, lines, and surfaces of the temples, nose pads, lenses, bridge, etc.; c) the computer system performs an optimization over the configurable product model parameters to reduce the error between the face and various features of the model based on predefined fit metrics, such as an optimal ratio of eyeglass width to face width, optimal centering of the eyes within the lenses, and so forth. For example, the length of the temple arm is adjusted until the error between the temple arm and the top of the ear is minimized. Alternatively, the computer system optimizes fit and style based on other techniques, such as machine learning or analytical equations. d) The computer system updates the configurable product model 3003 with the new parameters. e) The computer system performs an optimization to obtain a rigid body transformation (as illustrated at 3004) to align the product model 3003 with the face 3001. Errors between key features of the product and the face are minimized, and some features are weighted more heavily than others. f) The computer system transforms the coordinates of the product model to align it with the anatomical model, placing the new eyewear design in alignment with the user's anatomy.
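A single-parameter instance of the optimization in step (c), such as the temple-arm example, can be sketched as a minimization over candidate values. The candidate lengths, the measured ear distance, and the squared-error metric below are illustrative assumptions, not the patent's actual optimizer:

```python
# Hedged sketch of step (c) for one parameter: pick the temple length that
# minimizes the squared error against the measured front-to-ear distance.

def best_temple_length(candidates_mm, ear_distance_mm):
    """Choose the candidate with the smallest squared fit error."""
    return min(candidates_mm, key=lambda length: (length - ear_distance_mm) ** 2)

# Discrete manufacturable increments; the user's ear sits 143 mm from the front.
length = best_temple_length([135, 140, 145, 150], ear_distance_mm=143.0)  # -> 145
```

A full implementation would optimize many parameters jointly, with weighted error terms across all the fit metrics, but each term reduces to a comparison like this one.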
Customization after placement on an anatomical model
In another embodiment, the base eyewear is positioned relative to the anatomical model, and then automatic adjustment is performed as follows, creating a completely new customized product before rendering the user's preview. Refer to fig. 31.
a) The computer system obtains a scaled face model 3101 (using any of the aforementioned methods) that identifies key facial features 3107, including but not limited to the sizes, points, lines, and faces of the eyes, nose, ears, forehead, etc.; b) the computer system obtains a configurable product model 3102 that identifies key features 3108, including but not limited to the dimensions, points, lines, and faces of the temples, nose pads, lenses, nose bridge, etc.; c) the computer system performs an optimization to obtain a rigid body transformation that aligns the product model with the face, as illustrated at 3103. The error between key features of the product and the face is minimized, with some features weighted more heavily than others. d) The computer system transforms the coordinates of the product model to align it with the anatomical model. As illustrated at 3104, the computer system analyzes the interaction, size, and error between the product model and the anatomical model. In the example illustration, the eyewear model at 3103 is too large for the user's face, is placed too low because of the nose size, and is too wide for the face shape. e) The computer system then automatically adapts the product model, as illustrated at 3105, to further minimize the error between the facial features and the product features based on predefined fit metrics, such as the optimal ratio of eyewear width to face width, the optimal centering of the eyes within the lenses, and so on. The resulting custom model 3106 is better designed for the user.
Custom fitting
The computer analysis quantifies a set of measurements between the anatomical model and the eyewear model, including but not limited to: the width of the glasses relative to the width of the face; the distance between the nose pads relative to the nose width; the angle, shape, or size of the nose pads relative to the angle, shape, or size of the nose; the length of the temples relative to the ear positions; the height of the glasses relative to the height of the face; the height of each ear relative to the eyes or another reference point; the distance between the lens centers and the eye centers; the distance from the inner surface of the lens to the apex of the pupil; the outward angle of the temples relative to the frame; the outward angle of the lenses relative to the plane formed by the front of the face; and the eyewear wrap angle relative to the corresponding curvature of the face.
The computer system uses these measurements to optimize the configurable eyewear model for the user's face. The automatic adjustment is informed by default metrics, such as the optimal value of the ratio of eyewear width to face width. Ideally, each metric is a dimensionless ratio that scales correctly across all user faces, although some measurements, such as vertex distance, may be absolute dimensions. A range of optimal values may also be used. Each metric is optimized individually, or jointly when metrics interact, such as the interplay between the frame width of the eyeglasses and the temple angle.
For example, fig. 14 shows a user's quantified anatomical model 1401 and a configurable eyewear model 1402 in a view 1411 prior to automatic optimization. One set of metrics is the ratio of the eyewear width 1403 to the face width 1404, the temple angle 1407, and the length 1406 of the entire temple relative to the distance 1405 to the top of the ear. As just one example, the optimal values for these metrics are 0.95, 87 degrees, and 1, none of which the pre-optimized eyewear model 1402 satisfies. The computer system attempts to minimize the error between all three metrics and their optimal values. An optimization method such as least squares, steepest descent, or another method known to those skilled in the art is used to obtain a new set of eyewear parameters that best fits the user's face. After the parameters are updated, the automatically adjusted 3D glasses model shown at 1412 is displayed, providing a better first visualization or approximation for every eyewear model, since the width 1408, temple length 1409, and temple angle 1410 are better suited to the user. Automatically sizing the eyewear to the user's best-fit, or near best-fit, size provides a better shopping experience because the time and number of steps needed to reach the final eyewear design are reduced. Users may also be pleasantly surprised to see themselves wearing an eyewear design they had never envisioned, or had not previously realized would suit their style. Ensuring that every design and style fits well is a huge first step toward a good shopping experience.
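Because the three example metrics above are each linear in one frame parameter, the least-squares fit reduces to a small linear solve. A minimal NumPy sketch, with hypothetical face measurements in millimetres (the target values 0.95, 87 degrees, and 1 are taken from the example above):

```python
import numpy as np

# Hypothetical measurements from the quantified anatomical model (mm).
face_width = 140.0
ear_distance = 100.0  # distance from frame front to the top of the ear

# Target metrics: width ratio 0.95, temple angle 87 deg, length ratio 1.0.
b = np.array([0.95, 87.0, 1.0])

# Each metric is linear in the parameters x = (frame width, temple angle,
# temple length), so minimizing the metric error is a linear least squares.
A = np.array([
    [1.0 / face_width, 0.0, 0.0],    # frame width / face width
    [0.0, 1.0, 0.0],                 # temple angle (degrees)
    [0.0, 0.0, 1.0 / ear_distance],  # temple length / ear distance
])
x, *_ = np.linalg.lstsq(A, b, rcond=None)
frame_width, temple_angle, temple_length = x
```

With more metrics than parameters, or coupled metrics such as frame width and temple angle, the same `lstsq` call balances the conflicting targets instead of hitting each exactly, which is where the iterative optimizers named above come in.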
As another example, fig. 28 illustrates a cross-section of a nose 2801 and an eyewear model 2802 prior to customization. The nose pad 2803 does not match the contour of the nose and intersects its surface. The same nose 2804 is illustrated with an eyewear model 2805 that is custom-configured for the user. Nose pad 2806 now matches the contour and angle of the nose and rests exactly on its surface. This illustrates a superior capability of full customization, since the prior art cannot fully customize the nose pad contour to precisely match and fit the user's nose.
In some cases, when the eyewear model is highly configurable or the optimal values are completely within the solution space of the parameterized design, no optimization is needed and a direct solution to the exact specified metrics can be obtained. For example, if the temple length needs to be 103.4mm and the front width of the eyeglasses needs to be 142.1mm, the model can be adjusted to these values accurately.
The optimal value may vary based on other factors input by the user or determined from the image data, such as gender, age, facial shape, eyewear style, eyewear usage, or what is currently popular. For example, women may generally prefer glasses that are slightly smaller than men with respect to their facial size. Users who select eyeglasses for leisure use may prefer an increased frame wrap and tighter temple fit to reduce wind into their eyes, widen their corrected field of view, and/or provide greater protection from impact or sun. A user selecting plastic eyewear may prefer larger eyewear than a user selecting metal eyewear. These user-defined preferences may be used to alter the optimal parameters during the customization process.
Customization and aesthetic prediction
In an exemplary embodiment, the computer system recommends custom fit and style based on the user's image data and possibly additional information provided by the user. In addition to a custom fit of the base design selected by the user, the computer system may suggest eyewear styles to create custom products specific to the user. Information about the user obtained from the user's image data and anatomical model for providing customized recommendations includes, but is not limited to:
overall face size, such as the area of the front of the face or the volume of the head in the model; face width; face height; ear position (each ear can have a different height); pupillary distance; eye size, such as area, length, or height; the spacing between the eyes; asymmetry of the nose, eyes, or mouth; eye color; hair color; the amount and shape of hair; skin color; ethnicity; age; location or regional style trends; gender; evaluation of zygomatic bone shape and location; forehead angle; cheek angle; bags under the eyes; eyebrow size and shape; face shape (e.g., round, square, oval, etc.); vertical position of the eyes relative to the center of the face; hair style (e.g., up, down, long, bald, straight, curly); facial hair; the strength or softness of the features.
Some of the features, all of the features, or additional features are determined from the image data. Some features may be measured directly on the quantified anatomical model; for example, the curvature of the nose and the position of the ears. In one exemplary embodiment, the features are classified using a machine learning algorithm. A training database of image data from multiple faces is collected and all features are recorded. The computer system performs multiple analyses on each image, such as intensity maps, gradient filters, Haar filters, Hessian filters, Sobel filters, Hough transforms, segmentation, and Canny filters, in order to measure or detect features such as mouth angle, facial edges, nose size, wrinkles, and so forth. For example, to estimate wrinkle features that help estimate age, the computer system analyzes portions of an image segmented by the anatomical model. Within the region covered by the model, a Sobel filter is applied to detect edges and their intensities. The face region is subdivided into multiple sub-regions, the Sobel filter is applied in each, and the number and strength of edges are quantified within each sub-region. The sum over all regions of the face provides a feature for detecting wrinkles. A person without wrinkles will have edge responses only at key facial features such as the eyes and mouth, and will score lower than a person with wrinkles, whose edge count is higher because of the wrinkles. The features in the training set are classified using machine learning methods, including but not limited to support vector machines, boosting, bagging, random forests, and the like. The computer system then uses the machine learning classifier to associate image data with the desired features.
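A minimal sketch of the Sobel-based wrinkle feature described above, assuming a grayscale face region already segmented by the anatomical model; the 4x4 grid and helper names are illustrative choices:

```python
import numpy as np

SOBEL_X = np.array([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]])
SOBEL_Y = SOBEL_X.T

def _conv3(img, k):
    """Valid-mode correlation of img with a 3x3 kernel, in plain NumPy."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out

def wrinkle_score(face_region, grid=4):
    """Mean per-region edge strength over a grid of sub-regions; smooth
    skin scores near zero, wrinkled skin scores higher."""
    gx = _conv3(face_region, SOBEL_X)
    gy = _conv3(face_region, SOBEL_Y)
    mag = np.hypot(gx, gy)                      # edge intensity
    h, w = mag.shape
    score = 0.0
    for rows in np.array_split(np.arange(h), grid):
        for cols in np.array_split(np.arange(w), grid):
            score += mag[np.ix_(rows, cols)].mean()
    return score / grid ** 2
```

In the classification pipeline described above, this scalar would be one entry in the feature vector handed to the trained classifier.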
Other aesthetic characteristics may also be quantified. Using the previously mentioned techniques to detect skin or hair features allows those areas of the image data to be isolated. Image analysis of the color then allows skin color and hair color to be characterized. Clustering this color data, grouping similar colors together in the image, enables the type of skin tone or hair color to be determined. Alternatively, a machine learning approach may be applied to the color space of the image data in order to train a classifier to determine aesthetic characteristics.
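The color-clustering step can be sketched with a tiny k-means over the pixels of the isolated skin or hair region. This NumPy sketch seeds the cluster centers deterministically across the brightness range, an illustrative choice rather than the patented method:

```python
import numpy as np

def kmeans_colors(pixels, k=2, iters=20):
    """Cluster N x 3 RGB rows (values in 0..1) into k color groups,
    e.g. to characterize dominant skin tone and hair color."""
    order = np.argsort(pixels.sum(axis=1))               # sort by brightness
    seeds = order[np.linspace(0, len(pixels) - 1, k).astype(int)]
    centers = pixels[seeds].astype(float).copy()
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recompute centers.
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return centers, labels
```

The returned centers are the dominant colors; matching them against a palette of named tones would give the skin-tone or hair-color category described above.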
In an exemplary embodiment, the user is also required to provide some information to enhance or supplement the data analyzed from his image data. The user provides information including, but not limited to: age; sex; a location; occupation; style preferences, such as "fashion" or "traditional"; the type of suit they want to wear when wearing glasses (formal, casual, etc.); a color preference; their favorite clothing; priority levels of different eyewear styles or shapes; and words describing themselves or their taste.
Each feature may also be given a corresponding weight that indicates to the algorithm the importance of that feature. Alternatively, the user may link a social networking website, a personal profile, promotional database information about the user, or another such private information source to the computer system. This enables the computer system to import a variety of information about the user beyond what could be obtained by directly querying the user, such as a list of the user's favorite music, celebrities, places they have visited, their favorite restaurants, or a linguistic analysis of the words and descriptors they use publicly. For example, analysis of a user's posts on a blog or social networking site may reveal that they mention "red" far more often than other colors, or that they most often wear dark formal attire in images; this information may be used to inform the computer system of the user's color or style preferences.
In one exemplary embodiment, the computer system has a training database of preferences associated with various features. These preferences include, but are not limited to: eyewear style, material, shape, color, finish, size (including overall size and custom local adjustments such as width, thickness, etc.), and the position of the eyewear on the face.
Preferences are determined by actual users, designers, test users, or by other means. A preference may be set as a single value, multiple values, a range, a ranking, or a score. In addition, users may have negative preferences, i.e., features that do not appeal to them. For example, a user may like round and oval frame shapes equally well, but dislike rectangular frame shapes. Preferences may also be set automatically based on the user's use of the computer system. In one exemplary embodiment, a user shops for glasses, and when he takes certain actions, such as rating glasses during a shopping session, adding glasses to the shopping cart, modifying glasses, or answering questions about glasses, the computer system records these actions and associates them with preferences. For example, if a user repeatedly tries on, likes, and switches to glasses with a blue color, then blue is associated as a preference of that user.
In another embodiment, these preferences may be determined by a professional designer or test user. A designer or test user will gradually pass through a specific set of questions or activities that require the designer or user to rate or evaluate various eyewear designs and features. They may also be required to modify or customize the glasses according to their preferences. Based on the detailed test results of these users, a database of their preferences may be built.
The database then comprises relationships among a number of variables: the user's image data, quantified anatomical model, and provided personal information; analytical data about the user and their image data; and the preferences they have set. The computer system applies machine learning or predictive analysis to build response (preference) predictions from a new user's inputs: their new image data and anatomical model, personal information, and shopping behavior on the computer system. This approach has the advantage of providing a highly customized and convenient eyeglass purchasing experience. For example, image data analysis of a user plus a few basic answers to questions might yield the following detailed profile: a woman over 30, with dark medium-length hair, a square face, a very small nose, somewhat blue eyes, and a medium skin tone; fashionable taste; a white-collar profession; an active lifestyle; daily eyewear wear; and residence in an urban area. Each of these features may be associated with various eyewear preferences, and when classified by machine learning methods, the combined information can be used to recommend a set of eyewear that truly matches the user's preferences, even if she never stated, or did not know in advance, her preferences for eyewear design. Combined with the method of automatically determining eyewear size, in the eyewear purchasing implementations described herein the user has a highly personalized experience from the start of her shopping session, and obtains more desirable customized eyewear faster and more easily than with other existing shopping implementations.
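As an illustrative sketch of this preference prediction, a nearest-neighbour vote over a toy preference database; all feature values, names, and preference labels below are hypothetical, and a real system would use richer models such as the SVMs or random forests mentioned earlier:

```python
import numpy as np

# Toy training database: one row per prior user, with normalized features
# (age, face width, style score) and that user's recorded frame preference.
train_X = np.array([
    [0.3, 0.8, 0.9],
    [0.7, 0.5, 0.2],
    [0.4, 0.9, 0.8],
    [0.8, 0.4, 0.1],
])
train_pref = np.array(["tortoise", "black", "tortoise", "black"])

def predict_preference(user_features, k=3):
    """Majority vote among the k nearest prior users in feature space."""
    d = np.linalg.norm(train_X - user_features, axis=1)
    nearest = train_pref[np.argsort(d)[:k]]
    values, counts = np.unique(nearest, return_counts=True)
    return values[counts.argmax()]
```

A new user whose feature vector lands near the first and third rows would be recommended the "tortoise" option before ever stating a preference.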
In another embodiment, the product model is customized for asymmetry. For example, fig. 24 shows a user 2401 with the common problems of a crooked nose 2402 and one ear lower than the other at 2403. These anatomical asymmetries of the face exist in many people and affect the fit or appearance of eyeglasses on the face, often requiring manual adjustment by an optometrist, which may or may not fix the problem. On the user 2401, the eyeglasses 2404 rest at an angle 2405 and are biased to one side because of the asymmetric facial features. In any of the previous customization implementations, the product model may be adapted differently for the left and right sides of the face. This is achieved by optimizing different measurements, points, faces, or other geometry for the left and right sides of the product. The resulting eyeglasses can have differently sized features on the left and right sides, e.g., temples of different lengths, or nose pads offset to one side. Additional constraints are added to the optimization to achieve level and well-aligned placement of the eyewear even when the user has asymmetric features. After asymmetric customization, the user 2401 obtains eyeglasses 2406 that sit level and centered on the face.
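The per-side adaptation can be illustrated with a little landmark geometry: given the temple hinge points and ear-top points of the quantified anatomical model, each temple gets its own length, and the residual roll angle tells the optimizer how much correction is needed for the frame to sit level. The coordinates in the usage below are hypothetical, in millimetres:

```python
import numpy as np

def asymmetric_fit(left_hinge, right_hinge, left_ear, right_ear):
    """Per-side temple lengths, plus the roll angle (degrees) the frame
    would take if both temples were equal, i.e. the asymmetry to correct."""
    left_len = np.linalg.norm(left_ear - left_hinge)
    right_len = np.linalg.norm(right_ear - right_hinge)
    dy = right_ear[1] - left_ear[1]                       # ear height diff
    dx = np.linalg.norm((right_ear - left_ear)[[0, 2]])   # horizontal span
    roll_deg = np.degrees(np.arctan2(dy, dx))
    return left_len, right_len, roll_deg
```

With one ear 4 mm higher than the other, the sketch reports unequal temple lengths and a roll of roughly 2 degrees, which the constrained optimization described above would drive to zero.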
It is desirable to design custom glasses with the user's face considered across various expressions. For example, the cheek structure may change when a person smiles, or the shape of the forehead may change when a person frowns, which can interfere with the eyewear design, causing the glasses to shift or feel uncomfortable during normal use. The following embodiment describes a method for custom-optimizing an eyeglass design to fit a variety of expressions:
in this embodiment, a) a computer system configured with an imaging device captures image data and constructs a model of the user's face at neutral expression (using any of the methods previously described), b) the computer system captures additional image data of the user with at least one additional expression and constructs at least one additional face model (or obtains parameters necessary to adjust a single model according to various expressions), and c) the computer system performs placement, design optimization, user adjustment and preview using one additional constraint in the previously described methods: eyewear design, placement, and preview is performed on multiple facial models representing a user in multiple expressions. An optimal design is generated that satisfies the constraints of all facial models or all expressions, resulting in customized eyewear that best fits the user over the user's range of facial expressions and movements.
Customization and optics
As described above in step 104 of fig. 1B, the computer system prompts the user to enter his optical lens prescription information, which is necessary to order prescription eyeglasses. The prescription information is also used to render the lenses at the size, shape, and thickness the user will receive, providing a more complete and realistic preview of the entire eyewear. Because different prescriptions require different optics (thinner or thicker lenses, more or less curvature), the user's specific prescription affects the visual appearance of the final product. If the user does not enter data, an estimated average prescription lens or a plano lens (without optical correction) is used for rendering, which at least provides a view of a lens in the eyeglass frame. Alternatively, the user is asked general questions about their vision, such as nearsightedness or farsightedness, astigmatism, vision rating, favorite lens type, and the like. The computer system can use the answers to these general questions to estimate the most likely lens size and thickness for the user. The user views the customized lens rendering to determine whether a certain frame style and size is acceptable, and/or whether the lens index selected is appropriate given the prescription strength. For example, after seeing a -9.0 prescription rendered with a standard lens of index 1.49 (the resulting lens would be thick), the user may want a different custom eyewear design to hide the thick lens edge, or may want a higher index such as 1.67 or 1.71 to reduce the lens thickness. The computer system can also automatically suggest a lens index based on the frame design and prescription to provide optimal vision and aesthetics.
For example, a very strong prescription user may want a plastic frame because the rim of the plastic frame is thicker, enabling a much thicker lens edge to be more aesthetically masked, and the computer system may suggest this.
In an exemplary embodiment, the user may select lens options including, but not limited to, lens tint (clear, various sunglass tints, photochromic lenses with previews of the estimated indoor and outdoor tint, polarized lenses, etc.), prescription style (plano, single vision, digitally compensated, bifocal, progressive, etc.), lens material index (1.5, 1.67, etc.), lens coatings, lens edge treatment (thinning of the lens edge), or brand. Any visible change is realistically rendered on the 3D glasses model, including any distortion or optical effect, resulting from the particular lens type and prescription, that an observer would see when viewing the user wearing the glasses.
In an exemplary embodiment, more advanced measurements are derived from the quantified anatomical and eyewear models to enable digitally compensated (i.e., free-form), progressive, or other advanced optical lens designs. Manufacturing a digitally compensated and/or progressive lens requires a variety of measurements, including but not limited to interpupillary distance, vertex distance, forward tilt (rake) angle, frame wrap, and lens height relative to the pupil. Traditionally, eye care professionals (opticians, optometrists, etc.) have performed these measurements in person using specialized equipment or cameras. Even when performed with professional tools, some of these measurements are often rough estimates, such as the distance from the surface of the eye to the back of the lens. Measurements made on the anatomical and eyewear models on a computer system are much easier and more accurate because there are no physical obstacles or limitations. Automatically obtaining the measurements while the user selects glasses on the computer system is a great advantage, removing the cost and time of a visit to an eye care professional.
In another embodiment, the product model is configured to optimize the optical parameters used to manufacture the lenses. In addition to using the details and dimensions of the anatomical model to inform the lens design, the eyeglass frame can be optimized to enhance the optical design. For example, the nominal vertex distance (the distance from the eye to the inner surface of the lens) is about 12-14 mm. Vertex distances vary widely for off-the-shelf eyeglasses, but a configurable frame can be adjusted to achieve the best measurement. Other parameters include, but are not limited to: frame wrap, eye position relative to the center of the lens, forward tilt angle, etc. In this embodiment, a) the computer system obtains a scaled facial model (using any of the aforementioned methods) that identifies key facial features including, but not limited to, points, lines, and faces of the eyes, nose, ears, forehead, etc.; b) the computer system obtains a configurable 3D product model that identifies key features including, but not limited to, points, lines, and faces of the temples, nose pads, lenses, bridge, etc.; c) the computer system analyzes the dimensions of interest, including but not limited to vertex distance, forward tilt angle, Pd, and frame wrap; d) the computer system optimizes the product model parameters, changing the shape of the glasses and the way they rest on the user's face, until the dimensions are within their desired ranges (e.g., vertex distance 12-14 mm); e) the computer system updates the configurable product model with the new parameters; f) the computer system performs an optimization to obtain a rigid body transformation that aligns the product model with the face, minimizing the error between key features of the product and the face, with some features weighted more heavily than others; and g) the computer system transforms the coordinates of the product model to align it with the anatomical model.
As noted above, FIG. 7 illustrates some of the various measurements that are required. The interpupillary distance (Pd) is measured as a binocular 703a or monocular 703b measurement. Monocular measurements are generally preferred for the best possible realization of the user's prescription glasses, but they are more difficult to measure accurately and generally require specialized equipment and in-person physical measurement. Most Pd measurements performed on a single 2D frontal image of the user rely on the binocular measurement, because the system easily detects eye position, while the exact center of the nose is harder to detect accurately due to lighting constraints, the user not facing the camera precisely, and so on. However, by using the eye and nose features of the user's quantified anatomical model, the monocular Pd is obtained more reliably. The additional information provided by the quantified anatomical model allows the center of the nose to be determined automatically, even when no individual 2D image used to construct the model would suffice on its own (e.g., the user is not perfectly facing the camera in any 2D image). Whereas the binocular Pd is the straight-line distance between the centers of the eyes, the monocular Pd for each eye is the distance from the center of that eye to the center of the nose bridge. Vertex distance 709 is often difficult even for a trained eye care professional to measure accurately in person, and here too the quantified anatomical model provides an advantage. The vertex distance is the distance from the center of the eye to the inner surface of the lens; it is hard to measure in person because, once the frame is worn on the user's face, it is difficult to reach between the frame and the eyes.
When measuring in person, the measurement must be repeated for every eyewear design the user tries on, which is inconvenient and time-consuming; measurement values are therefore often estimated. However, this difficult dimension can be calculated with great accuracy by various methods applied to the quantified anatomical model of the user wearing the eyewear model, such as ray tracing from the center of the eye surface to the inner surface of the lens on the eyewear model. The perpendicularity of the ray with respect to the plane of the face is ensured by constructing a plane on the front of the face using various features in the model, or by using the plane of the lens. The forward tilt angle 710 is the vertical angle of the lens away from a perfectly vertical orientation. Again, this dimension is measured using the quantified anatomical model in combination with the eyewear model: a plane is defined for the vertical orientation of the user's face and another through the lens, and the angle between the planes about the horizontal axis gives the forward tilt angle. The frame wrap 704 is the horizontal angle of the lenses in the frame relative to the user's face, calculated in a manner similar to the forward tilt angle by using the angle about the vertical axis. The fit height 713 is calculated in a similar manner to the vertex distance. Once the computer system locates the lens position directly centered on the pupil (the optical center of the lens, which is also needed to calculate the vertex distance), the vertical distance to the bottom of the inner surface of the lens opening in the frame is calculated to determine the fit height. An advantage of all these measurements is that they are performed using the 3D eyewear positioned, and previewed by the user, relative to the user's quantified anatomical model.
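These optical measurements reduce to simple vector geometry once the eye centers, nose bridge, lens points, and plane normals are available from the aligned models. A hedged NumPy sketch; the landmark coordinates in the usage are hypothetical, and the face-normal projection stands in for the ray-tracing step described above:

```python
import numpy as np

def monocular_pds(left_eye, right_eye, nose_bridge):
    """Monocular Pd for each eye: eye center to nose-bridge center."""
    return (np.linalg.norm(left_eye - nose_bridge),
            np.linalg.norm(right_eye - nose_bridge))

def vertex_distance(eye_center, lens_inner_point, face_normal):
    """Eye center to inner lens surface along the face-front normal."""
    n = face_normal / np.linalg.norm(face_normal)
    return abs(np.dot(lens_inner_point - eye_center, n))

def forward_tilt(lens_normal, face_normal):
    """Forward tilt (degrees): angle between the lens plane and the
    vertical face plane, via the angle between their normals."""
    a = lens_normal / np.linalg.norm(lens_normal)
    b = face_normal / np.linalg.norm(face_normal)
    return np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
```

Frame wrap follows the same pattern as `forward_tilt` with the angle taken about the vertical axis instead.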
In an exemplary embodiment, once the computer system obtains all the information necessary to manufacture the user's lenses (all frame dimensions, interpupillary distance, additional facial measurements if the lenses are digitally compensated, prescription information, lens material index, and lens treatment selections), the system can realistically render the user's lenses in the selected eyewear positioned on the user's image data. The algorithms for reconstructing a 3D version of a lens from this information are well established, being already necessary for digitally determining the surfaces and edges of modern lenses. In one embodiment, the computer system uses advanced rendering techniques, such as rasterization or ray tracing, not only to display the lens as a 3D object but also to render how light bends as it passes through the lens. Using these rendering techniques, the system can render the lenses in a frame positioned on the user's face, allowing the user to see exactly how they appear through a third party's eyes. When the glasses with lenses are placed on the user's face, an accurately distorted view of the face seen through the lenses can be displayed. Moreover, the actual effect of an anti-reflective coating is presented to the user, as is the distorted appearance caused by lens features such as line-free progressives, bifocals (a dedicated magnification area), and the like. With an accurate rendering, the user is better able to make an informed decision about the selected frame and lens type, with a clear view of the trade-offs between the various options. When a user purchases lenses in a retail environment, a salesperson may urge them to upgrade the lens material index, which can reduce the lens thickness by 20%.
But this information is incomplete: typically no one tells the user how thick the lenses will actually be in the frame he has chosen, and it is hard to imagine how many millimeters a 20% reduction actually amounts to; without seeing the aesthetic effect of the lenses worn by a real person, such a comparison cannot be made in the abstract. This imperfect information often results in the user paying for a lens upgrade that, with better information, he might not have chosen: a 20% reduction may sound like a lot but may in practice be only 0.7 mm, which may not be worthwhile at the price. In this embodiment, the user is presented with realistic photo renderings not only of the selected lens but of all the lens configurations within the various frame configurations, so the user can make a more informed decision. Moreover, the final manufactured lens looks like the rendering, so there are no surprises.
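The 20%-versus-millimeters point can be made concrete with the standard thin-lens sag approximation for the edge thickness of a minus lens, edge = center + h^2 * |P| / (2000 * (n - 1)), with h the lens semi-diameter in mm and P the power in diopters. This back-of-the-envelope sketch uses the -9.0 D example from the text; the semi-diameter and center thickness are illustrative assumptions:

```python
def edge_thickness_mm(power_d, index, semi_diameter_mm, center_mm=1.5):
    """Approximate minus-lens edge thickness via the thin-lens sag
    formula: sag ~ h^2 * |P| / (2000 * (n - 1)). Illustrative only."""
    sag = semi_diameter_mm ** 2 * abs(power_d) / (2000.0 * (index - 1.0))
    return center_mm + sag

t_std = edge_thickness_mm(-9.0, 1.49, 25.0)    # standard 1.49 index
t_high = edge_thickness_mm(-9.0, 1.67, 25.0)   # high 1.67 index
sag_reduction = 1.0 - (t_high - 1.5) / (t_std - 1.5)
```

Under these assumptions a 50 mm-wide lens edge drops from roughly 7.2 mm to roughly 5.7 mm, a sag reduction of about 27%; a rendered preview makes exactly this kind of trade-off visible to the user.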
In another embodiment, any lens configuration is displayed in cross-sectional view, thereby enabling visualization of the thickness of the lens at any location and comparison with other lens configurations (width, material index, digital compensation, etc.).
Customization from pre-existing eyewear
In another embodiment, the user captures image data of himself wearing his own physical glasses. The image data is captured by the computer system, or the user provides the image data to the computer system. The computer system analyzes the image data using methods similar to those previously described, but additionally performs image processing to detect and determine the shape, color, and position of the eyewear. The computer system then adjusts the configurable eyewear model to match the eyewear worn by the user, similar to how the quantified anatomical model is adapted to the user's face. A shape model or other algorithm may be used to adapt and fit the eyewear model to the image data or to features detected in it. This enables users to replicate, or replicate and then modify, glasses they already own. For example, a user may have a pair of glasses they like except for the frame color and nose pad width; they can use the system to create a model of their eyeglasses and then use the aforementioned methods and systems to adjust the frame color and nose pad width. The user may also use this system to indicate where on the nose they want to wear existing glasses (for aesthetic, practical, or comfort reasons), and the system will then place all new eyewear designs at this location on the user's nose. In another embodiment, the user uploads any photograph of any person wearing glasses, and the computer system detects and analyzes the shape and color of the glasses and then presents the user with a new 3D glasses model that best matches the photograph. For example, a user may have seen a photograph of a friend or celebrity wearing a style of glasses; they can upload the photograph to obtain a similar design, which may then be further customized to their taste and anatomy.
In another embodiment, the eyewear designer or manufacturer provides a sample eyewear frame that the user may wear during a portion of the image data collection process. Similar to the methods described above, the computer system detects and analyzes the eyewear. This embodiment has the advantage that the sample frame is of a size and shape known to the designer. The presence of this known eyewear on the user's face in the image data provides both a reference scale for the data (since the size of the detected eyewear is known) and a very strong detection feature enabling more robust anatomical model reconstruction. By tracking a known object in each frame, and knowing that this object has a consistent relationship to other features of the user's face, the computer system can detect the user's features more robustly. In addition, the user can physically touch and inspect the quality and workmanship of the sample eyeglass frame.
Alignment
Referring to 108 of FIG. 1B, the eyewear model is aligned with the anatomical model. In an exemplary embodiment, the configurable eyewear model and the quantitative anatomical model are aligned by optimizing over reference geometry. The alignment may occur before customization, to inform the customization process about the geometric interaction between the user's anatomy and the eyewear model, or after customization and before rendering, to ensure that the eyewear model is properly placed on the user's face. Ideally, when the eyeglasses are worn, the nose pads should be tangent to and resting on the surface of the nose, with the temples on top of the ears and against the sides of the head. For a given design, the top of the glasses should sit at a set distance from the user's forehead, and the eyes should be centered as closely as possible on the design's ideal eye positions. Because there is no single default placement and every face is different, the method of customizing the glasses must account for each individual user's anatomy.
Fig. 10 shows two example eyewear designs: a small round frame 1001 and a large aviator frame 1002. The optimal eye position for design 1001 is shown as 1003, centered within the lens opening of the eyewear; the optimal position for design 1002 is shown as 1004, decentered toward the top of the lens opening. The ideal initial placement of the glasses positions the user's eyes as close as possible to these locations (e.g., directly behind them).
The optimal placement is obtained by minimizing the distances between: the center of the glasses and the midline of the nose; the underside of each temple and the intersection of the top of each modeled ear with the side of the head (so that the temples rest on top of the ears); the nose pads of the glasses and the surface of the nose; the center point of each eye and the design's optimal eye position; and a predetermined offset distance between the forehead and/or cheekbones and the front frame of the particular eyeglass design. Alternative combinations of positions and measurements may be used to optimize placement.
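As a minimal illustration of this kind of optimization, the sketch below fits only a rigid translation between hypothetical eyewear reference points and their target locations on the face model, using the closed-form weighted least-squares solution. All coordinates and weights are invented for the example; a full implementation would also optimize rotation and the flexible-fit terms described in the surrounding text.

```python
import numpy as np

# Hypothetical reference points on the eyewear model (frame coordinates, mm)
# and their target locations on the quantified anatomical model.
frame_points = np.array([
    [0.0,  0.0, 0.0],    # frame center (should meet the nose midline)
    [-8.0, -6.0, 5.0],   # left nose pad
    [ 8.0, -6.0, 5.0],   # right nose pad
])
face_targets = np.array([
    [0.5,  2.0, 1.0],    # nose midline point
    [-7.5, -4.0, 6.0],   # left nose surface contact
    [ 8.5, -4.0, 6.0],   # right nose surface contact
])
weights = np.array([1.0, 2.0, 2.0])  # contact points weighted higher

def best_translation(src, dst, w):
    """Closed-form translation minimizing the weighted sum of squared
    distances between corresponding points (rotation assumed pre-aligned)."""
    w = w[:, None]
    return ((dst - src) * w).sum(axis=0) / w.sum()

t = best_translation(frame_points, face_targets, weights)
placed = frame_points + t
residual = np.sqrt(((placed - face_targets) ** 2).sum(axis=1)).mean()
```

In this toy example the three offsets happen to agree, so the residual is zero; with real scan data the residual would quantify how well the frame can sit on the face.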
In an exemplary embodiment, the temples of the eyeglasses flex at the hinges to ensure engagement with the user's face, contacting the sides of the face at the locations above the ears where temple contact is made. For example, if the width of the user's head at the ears is narrower than the width of the eyeglasses, the temples bend inward to remain in contact with the sides of the face, so that the fit appears realistic and the user can judge whether the eyeglasses are acceptable. The computer system represents the eyewear as a dynamic or flexible assembly of sections that allows angular rotation of the temples about the hinges. In another embodiment, the temples themselves are allowed to deform elastically, bending inward or outward to sit flush against the sides of the head at the tops of the ears. In this embodiment, the computer system may represent the temples as deformable units that can safely deflect elastically by a predetermined amount.
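The hinge rotation needed for the temple to meet the head can be estimated with simple trigonometry. The sketch below assumes the temple pivots at the hinge and contacts the head at the temple tip; the dimensions are illustrative, not values from the disclosure.

```python
import math

def temple_flex_angle(frame_width_mm, head_width_mm, temple_length_mm):
    """Inward (+) or outward (-) rotation, in degrees, needed at each hinge
    so the temple tip meets the side of the head above the ear."""
    half_gap = (frame_width_mm - head_width_mm) / 2.0
    return math.degrees(math.atan2(half_gap, temple_length_mm))

# Head narrower than the frame: temples must rotate inward.
angle = temple_flex_angle(frame_width_mm=140.0, head_width_mm=132.0,
                          temple_length_mm=120.0)
# A negative result (head wider than frame) would mean outward splay instead.
```

A deformable-temple variant would additionally cap `angle` at the predetermined safe elastic deflection.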
In another embodiment, the relationship between the quantified anatomical model features and the eyewear model is set by machine learning techniques and/or algorithms built from a database of training models in which the position between the anatomical model and the eyewear model has already been set to the optimal condition. Given new anatomical parameters and eyewear geometry, the system can assign an orientation and alignment between the quantified anatomical model and the eyewear model using a classifier trained on this data. This approach can capture subtle user preferences about the manner in which glasses are placed.
Customized product previews
Once the quantitative anatomical model is established, scaled, and aligned from the image data and/or anatomical model, the representation of the glasses can be fitted to the user's face. Referring back to 15 of FIG. 1A and 109 of FIG. 1B, the eyewear model is rendered on the user's image data to create a customized preview. In one exemplary embodiment, the user is presented with their image data, with the custom glasses properly positioned and superimposed on their face. In an exemplary embodiment, the quantified anatomical model is not displayed to the user but is used to align and measure the data. The data is displayed as an interactive image that the user can adjust, rotate, and zoom by interacting with a computer system equipped with an input device such as a touch screen, mouse, gesture recognition, or any other human interface technology. This enables the user to see how the custom glasses look on their face in various orientations.
In another embodiment, at least one still image is shown, such as a front view and a side view, or multiple views at a set number of degrees around a vertical axis centered on the user's face. In yet another embodiment, an augmented reality approach is used: a live video feed of the user's face is displayed using a computer system configured with a camera, and the quantitative anatomical model tracks the user's face in real time, allowing the 3D glasses model to be superimposed on the user's face in real time as the user moves in front of the computer system. This creates the illusion of looking into a mirror while trying on glasses, as in a retail store. In yet another embodiment, the user's image data is not displayed; instead, a model of the user's face and head is presented with the 3D glasses model superimposed and correctly positioned. Alternatively, the glasses are represented as a series of images pre-rendered from various angles rather than as an actual 3D model; this approach makes it easy to deliver high-quality pre-rendered images via a networked system.
In another embodiment, the image data analysis is performed remotely on another computer system, such as a server or cloud computer, to take advantage of faster, more specialized, or more powerful computing than the user's computer system may possess. A remote server may have thousands of networked CPU and GPU cores and larger, faster data storage, making it far more computationally capable and/or efficient than the user's local computer system. The user's computer system communicates the image data to the remote computer system, which, after analyzing the image data, transmits the solution or additional data, such as rendered images, back to the user's computer system over a network or other data transmission method. In another embodiment, the user's computer system performs an initial calculation before sending data to the remote system, or a final calculation after receiving data back from the remote system; this is advantageous because the initial or final calculation can reduce the amount of data transmitted to or from the remote system, or reduce the computational burden on the remote system.
The computer system analyzes the illumination intensity, quality, source, and temperature of the user's image data. Once the quantitative anatomical model is constructed and aligned with the image data, the computer system analyzes each individual image for at least one of:
The color temperature within the region of the anatomical model, with reference to a normal white balance.
The positions of bright and dark areas corresponding to highlights and shadows, which inform the illumination source analysis. The illumination source or sources are detected by iteratively adjusting, or directly calculating, light sources on the anatomical model and minimizing the error between the calculated and the measured highlights and shadows.
The overall brightness and contrast within the region of the anatomical model, which indicate the intensity and quality of the light source.
Using this lighting information, the computer system applies matching light sources to the rendering of the 3D glasses model so that it best matches the image data, providing a near-seamless merging of the glasses model with the user's image data.
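One way to realize the highlight/shadow-based light-source estimation is a Lambertian least-squares fit: given surface normals from the aligned anatomical model and the observed brightness at those points, the light vector is a linear least-squares solution. The sketch below uses synthetic normals and a known light to demonstrate the recovery; it is an illustrative simplification of the iterative approach described above.

```python
import numpy as np

# Hypothetical samples from the aligned anatomical model: surface normals
# (unit vectors) and the observed brightness at those points in one image.
rng = np.random.default_rng(0)
true_light = np.array([0.3, 0.8, 0.52])          # assumed direction * intensity
normals = rng.normal(size=(200, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
shading = normals @ true_light
lit = shading > 0                                 # keep only front-lit samples
observed = shading[lit]

# Lambertian model: observed ~= n . L, so L is a linear least-squares solution.
L, *_ = np.linalg.lstsq(normals[lit], observed, rcond=None)
direction = L / np.linalg.norm(L)                 # estimated light direction
```

With real images, `observed` would come from pixel intensities inside the face region, and robust weighting would be needed against shadows and specular highlights.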
To achieve a realistic, better-looking preview, it is advantageous to set a good white balance on the user's image data so that the user appears to have a natural skin tone under natural lighting. Automatic white balancing, as implemented in many imaging devices or image post-processing software, may be used. In addition, white balance information may be localized using the detected face region. There is a further advantage to having a specific object in the image for achieving accurate white balance: different illumination sources typically introduce yellow, green, or blue hues, which should be removed by adjustment. In this embodiment, a) image data of the user is acquired using a computer system configured with a camera or imaging device; b) the computer system instructs the user to use a white or off-white balance target, such as paper, a newspaper, a phone, a cell phone case, or another electronic device (alternatively, the white balance target may be an object of known color, such as a banknote, an electronic device, or a logo); c) the white balance target of known size is positioned so that it is visible in at least some images of the user; d) the computer system detects the white balance target in at least one image of the user; e) the computer system adjusts the white balance (e.g., RGB values or color temperature and hue) of the image data until the target becomes neutral white or gray; and f) the computer system applies these white balance settings to all image data of the user.
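Steps e) and f) above can be sketched as a per-channel gain adjustment computed from the detected target region; the toy image and mask below are invented for illustration.

```python
import numpy as np

def white_balance_from_target(image, target_mask):
    """Scale each RGB channel so the pixels under `target_mask` (a detected
    white/grey reference object) average to a neutral grey, then apply the
    same per-channel gains to the whole image."""
    img = image.astype(np.float64)
    patch = img[target_mask]                 # (N, 3) pixels on the target
    means = patch.mean(axis=0)               # per-channel average of target
    gains = means.mean() / means             # push the target toward neutral
    balanced = np.clip(img * gains, 0, 255)
    return balanced.astype(np.uint8)

# Toy image with a bluish cast; the top-left patch is the known white target.
image = np.full((4, 4, 3), (180, 190, 220), dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[:2, :2] = True
out = white_balance_from_target(image, mask)
```

A production system would balance in a linear color space before gamma, but the gain logic is the same.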
The following embodiments describe systems and methods for creating a preview of customized eyewear on a user's image or anatomical data. A quantified anatomical model of the user's face is established, scaled, and aligned from the image data such that the model coordinates and camera position align the face model with the pose, position, and scale of the image of the user's face. The configurable 3D glasses model is aligned with the quantified anatomical model, and an image of the configurable eyewear is rendered on the user's image data or model. The glasses are rendered using a variety of techniques familiar to those skilled in the art, including but not limited to rasterization, scanline, and ray-traced rendering.
Embodiments for rendering images of glasses on user image data
In this embodiment, a) the computer system sets the camera position such that the anatomical model and the configurable 3D glasses model are aligned with the pose and position of the user's image data; b) the computer system displays (or retains) all surfaces of the configurable 3D glasses model that are located between the camera and the anatomical model; c) the computer system hides (or deletes) all surfaces of the configurable 3D glasses model that are located behind the anatomical model (i.e., where the anatomical model lies between the camera and the configurable 3D glasses model); d) the computer system renders only the displayed (or retained) surfaces of the configurable 3D glasses model, rendering neither the hidden (or removed) glasses surfaces nor the anatomical model; and e) the computer system merges the rendered glasses image onto the image of the user.
Embodiment for rendering images of glasses on user image data using depth calculations
In this embodiment, a) the computer system sets the camera position so that the anatomical model and the configurable 3D glasses model are aligned with the pose and position of the user's image data; b) the computer system calculates the depth (or distance) from the camera to all surfaces or vertices of the glasses model and the anatomical model at any given point on the image, and may store the depth values; c) the computer system renders only the nearest surface at any given point or pixel on the image; d) the computer system applies transparency to the anatomical model so that the anatomical model is not itself rendered but is still used for depth calculations; and e) the computer system renders the glasses over a background composed of the user's original image.
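The depth-comparison step can be illustrated with per-pixel depth buffers: the face model is never drawn, but its depths occlude the glasses. The image sizes, depths, and colors below are illustrative.

```python
import numpy as np

H, W = 4, 6
background = np.zeros((H, W, 3), dtype=np.uint8)           # user's original image
eyewear_rgb = np.full((H, W, 3), (200, 60, 60), np.uint8)  # rendered glasses

# Per-pixel distance from the camera to the nearest surface of each model;
# np.inf means the model does not cover that pixel.
eyewear_depth = np.full((H, W), np.inf)
face_depth = np.full((H, W), np.inf)
eyewear_depth[:, :4] = 10.0        # glasses cover the left of the frame
face_depth[:, 2:] = 8.0            # face is nearer on the right

# The face model is "transparent": it is never drawn, but it occludes.
visible = eyewear_depth < face_depth
composite = background.copy()
composite[visible] = eyewear_rgb[visible]
```

Here the glasses appear only where they are strictly closer to the camera than the face surface, exactly as in steps b) through e).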
An embodiment for rendering a glasses image on user image data with ray tracing:
in this embodiment, a) the computer system sets the camera position such that the anatomical model and the configurable 3D eyewear model are aligned with the pose and position of the user's image data; b) the computer system sets the surface of the anatomical model to be invisible in the final rendering, yet opaque and non-reflective to rays; c) the computer system traces rays between the camera and the scene; d) the computer system renders only the configurable 3D eyewear model, because the anatomical model is invisible; e) the configurable 3D eyewear model is displayed with some portions hidden behind the opaque but invisible anatomical model; and f) the computer system merges the rendered image onto the user's image. The anatomical model may also serve as a surface onto which rays cast shadows.
An embodiment of rendering a glasses image on user image data with a mask:
in this embodiment, a) the computer system sets the camera position such that the anatomical model and the configurable 3D glasses model are aligned with the pose and position of the user's image data; b) the computer system renders the configurable 3D glasses model and the anatomical model into a binary mask image (e.g., 1 for pixels where the configurable 3D glasses model is in front of the anatomical model and 0 for pixels where the anatomical model is in front of the configurable 3D glasses model); c) the computer system renders the configurable 3D glasses model; d) the binary mask is applied to the rendered image, hiding the anatomical model and any portion of the configurable 3D glasses model that lies behind it; and e) the computer system merges the masked rendered glasses image onto the image of the user.
Embodiments for rendering glasses images on user image data with masks during rendering
In this embodiment, a) the computer system sets the camera position such that the anatomical model and the configurable 3D glasses model are aligned with the pose and position of the user's image data; b) the computer system renders the configurable 3D glasses model and the anatomical model into a binary mask image (e.g., 1 for pixels where the configurable 3D glasses model is in front of the anatomical model and 0 for pixels where the anatomical model is in front of the configurable 3D glasses model); c) the computer system renders the configurable 3D glasses model with the mask suppressing rendering in the masked-out region (neither the anatomical model nor anything behind it is generated or visible during rendering); and d) the computer system merges the masked rendered glasses image onto the image of the user.
Embodiments for rendering glasses with texture mapped face models
In this embodiment, a) the computer system obtains a scaled face model of the user from the image data (using any of the methods previously described); b) the computer system uses the captured images to create a texture map of the user and applies the texture map to the face model; c) the computer system positions the configurable 3D glasses model in alignment with the user's face model (using any of the methods previously described); d) the computer system renders the texture-mapped face model together with the configurable glasses model to create preview image data for the user; e) optionally, the texture-mapped face model and glasses model renderings are superimposed on the original image of the user; and f) optionally, the computer system allows the user to provide input to control or adjust the pose and position of the face and glasses models, with the image data re-rendered after each user adjustment.
Preview using user photos
It is desirable for the user to see a preview of the customized eyewear on any photograph they select. The image may be a favorite photograph, a professional photograph, or another image different from those used to construct the anatomical model. This embodiment describes a method of aligning the anatomical model with a new image and then rendering glasses on the new image. In this embodiment, a) the computer system obtains a new image of the user (not necessarily one used to obtain anatomical data), which may be uploaded, linked over a network connection, or sent via email, SMS, or another communication system; b) the computer system obtains a scaled face model of the user from the image data (using any of the methods previously described); c) the computer system detects the face, estimates the pose, and detects facial features in the new image; d) the computer system performs a rigid-body transformation of the face model and camera to align the face model features with the facial features detected in the new image; e) the computer system positions the configurable 3D glasses model in alignment with the user's face model (using any of the methods previously described); and f) the computer system renders the glasses on the user's new image (using any of the methods previously described).
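Step d), the rigid-body alignment of face-model features to the features detected in the new photograph, can be sketched with the standard Kabsch/Procrustes solution. The feature coordinates below are invented, and a real system would estimate a camera pose against 2D detections rather than this pure 3D-to-3D fit.

```python
import numpy as np

def rigid_align(model_pts, image_pts):
    """Kabsch/Procrustes: rotation R and translation t minimizing
    ||R @ model + t - image||^2 over corresponding 3D feature points."""
    mu_m, mu_i = model_pts.mean(0), image_pts.mean(0)
    C = (model_pts - mu_m).T @ (image_pts - mu_i)   # cross-covariance
    U, _, Vt = np.linalg.svd(C)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_i - R @ mu_m
    return R, t

# Hypothetical facial feature points on the stored face model, and the same
# features as detected (here: rotated and shifted) in the new photograph.
model = np.array([[0, 0, 0], [30, 0, 0], [15, 20, 5], [15, -25, 8]], float)
theta = np.radians(10)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
detected = model @ R_true.T + np.array([2.0, -3.0, 1.0])

R, t = rigid_align(model, detected)
aligned = model @ R.T + t
```

Because the synthetic detections are an exact rigid motion of the model, the fit recovers the rotation and translation exactly; noisy detections would yield a least-squares compromise.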
Simulating camera view angle
It is also desirable to simulate camera or visual properties (focal length, distortion, field of view, distance from the subject) different from those of the camera used to capture the image data. The user may want to simulate the perspective of the human eye, or of a camera lens that produces a more flattering image. Compared with the human eye or a camera at a greater distance, the wide-angle lens of a computer camera taking pictures at close range tends to emphasize and enlarge objects closer to the lens (the nose or the glasses) while shrinking the appearance of objects farther from the lens (the ears and the sides of the head).
Referring to fig. 25: a) the computer system obtains a scaled face model of the user 2501 from the image data (using any of the methods previously described); b) the computer system positions the configurable 3D glasses model 2502 in alignment with the user's face model (using any of the methods previously described); c) the computer system sets the camera position 2503 such that the anatomical model and the configurable 3D glasses model 2504 are aligned with the pose and position of the user's image data; d) the computer system alters the intrinsic camera parameters and the distance from the model 2505 to simulate different perspectives and camera attributes, while still maintaining the same placement of the glasses in alignment with the user's image data 2506; e) the computer system renders the glasses on the user's image (using any of the methods previously described); and f) optionally, the computer system uses anatomical information together with the original and simulated camera properties and positions to warp the original user image, so that the underlying image data better represents the different camera perspective.
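The pinhole-camera relationship behind step d) can be sketched numerically: magnification is proportional to focal length divided by distance, so simulating a longer lens at constant subject size means moving the virtual camera back, which in turn reduces the relative enlargement of near features such as the nose. The focal lengths and distances below are illustrative assumptions.

```python
def simulated_camera_distance(f_current_mm, z_current_mm, f_target_mm):
    """To keep the subject the same size in the frame while simulating a
    different focal length, move the virtual camera so that f/z stays
    constant (pinhole magnification m = f / z)."""
    return z_current_mm * (f_target_mm / f_current_mm)

def relative_feature_size(z_camera_mm, z_offset_mm):
    """Apparent size of a feature sitting z_offset in front of the face
    plane, relative to the face plane itself; this is why close wide-angle
    shots enlarge the nose."""
    return z_camera_mm / (z_camera_mm - z_offset_mm)

# A webcam selfie: 30 mm-equivalent lens at 350 mm; simulate an 85 mm portrait lens.
z_new = simulated_camera_distance(30.0, 350.0, 85.0)
nose_close = relative_feature_size(350.0, 30.0)  # nose 30 mm in front of face plane
nose_far = relative_feature_size(z_new, 30.0)    # same feature, simulated camera
```

In this example the nose appears roughly 9% oversized in the close webcam shot but only about 3% oversized from the simulated portrait distance.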
Physical previews
It may be advantageous to provide a physical preview of the customized product rather than a digital one. The following embodiments describe two methods of providing users with a physical preview of their glasses:
in this embodiment, a) the computer system obtains a scaled face model of the user from the image data (using any of the methods described previously), b) the computer system customizes the configurable 3D glasses model to fit the user's face model (using any of the methods described previously), and c) the computer system converts the 3D glasses model into a digital file for rapid manufacturing. Techniques include, but are not limited to:
i. The glasses model is 3D printed directly in plastic, paper, or metal. The model may be converted into a hollow shell to save cost and weight.
ii. The 3D model is converted into a flat pattern, and flat sheets (paper, cardboard, plastic, metal, etc.) are cut with a CNC laser, water jet, vinyl cutter, milling machine, etc. Optionally, the flat sheets are folded or bent.
iii. The 3D model is converted into multiple parts, such as a front frame and temples, produced using the aforementioned methods. The parts are assembled using fasteners, adhesives, or other methods.
d) The computer system receives input from the user, including but not limited to: name and address, optional payment information, other contact information, and shipping preferences; and e) the computer system generates instructions to build, package, and ship a rapid prototype of the customized eyewear model to the user.
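For technique i. above (direct 3D printing), converting the eyewear model into a digital manufacturing file can be as simple as serializing its triangle mesh. The sketch below writes ASCII STL, a format widely accepted by rapid-prototyping services, with a single invented facet as input; leaving the facet normal as 0 0 0 and letting the slicer recompute it is a common convention, though the STL format nominally calls for unit outward normals.

```python
import io

def write_ascii_stl(triangles, name="eyewear_prototype"):
    """Serialize a triangle list (each a tuple of three (x, y, z) vertices)
    to ASCII STL text for rapid manufacturing."""
    buf = io.StringIO()
    buf.write(f"solid {name}\n")
    for v0, v1, v2 in triangles:
        buf.write("  facet normal 0 0 0\n    outer loop\n")
        for x, y, z in (v0, v1, v2):
            buf.write(f"      vertex {x:.6f} {y:.6f} {z:.6f}\n")
        buf.write("    endloop\n  endfacet\n")
    buf.write(f"endsolid {name}\n")
    return buf.getvalue()

# One facet of a hypothetical frame-front mesh, in millimetres.
stl_text = write_ascii_stl([((0, 0, 0), (10, 0, 0), (0, 5, 2))])
```

A real export would emit the full watertight frame mesh (typically in binary STL for size) before sending it to the print service.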
In this embodiment, a) the computer system obtains a scaled face model of the user from the image data (using any of the methods described previously), b) the computer system customizes the configurable 3D glasses model to fit the user's face model (using any of the methods described previously), c) the computer system converts the 3D glasses model into a digital file for rapid manufacturing. Techniques include, but are not limited to:
i. The glasses model is 3D printed directly in plastic, paper, or metal. The model may be converted into a hollow shell to save cost and weight.
ii. The 3D model is converted into a flat pattern, and flat sheets (paper, cardboard, plastic, metal, etc.) are cut with a CNC laser, water jet, vinyl cutter, milling machine, etc. Optionally, the flat sheets are folded or bent.
iii. The 3D model is converted into multiple parts, such as a front frame and temples, produced using the aforementioned methods. The parts are assembled using fasteners, adhesives, or other methods.
d) The computer system generates the files and provides a way for the user to obtain them, including but not limited to email, a download link from a web server, an attachment to a digital message, etc.; and e) the computer system generates instructions for the user to construct a rapid prototype from the files, such as instructions for using a printer or 3D printer, assembly instructions, or instructions for sending the files to a service bureau for printing or construction.
An embodiment of rendering a 1:1 physical-size image of the glasses:
A user may want to see the true size of their glasses in addition to the preview rendering of the glasses on their image or model, for example to compare against existing glasses that they own.
In this embodiment, a) the computer system obtains a scaled face model of the user from the image data (using any of the methods previously described); b) the computer system customizes the configurable 3D glasses model to fit the user's face model (using any of the methods previously described); c) the computer system obtains information about its display, such as resolution, pixel size, and overall display size, either from itself, from a web browser, or from the user providing information about the display or computer system model; d) the computer system calculates the pixel size of the display (e.g., by dividing the length and width of the screen by the number of pixels); e) using the pixel size of the display and the dimensions of the eyewear model, the computer system renders the eyewear model at 1:1 scale in various orientations, such as front, side, and top views; f) the computer system displays the 1:1 images to the user; and g) optionally, the computer system renders a real-time interactive graphic of the eyewear model at true 1:1 physical size, which the user can rotate and translate through the input device.
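Steps d) and e) reduce to simple pixel arithmetic; the display dimensions and frame width below are illustrative assumptions.

```python
def mm_to_pixels(length_mm, display_width_mm, display_width_px):
    """Convert a physical length to the pixel count that renders it at true
    1:1 scale on a display of known size and resolution."""
    pixel_size_mm = display_width_mm / display_width_px   # step d) of the text
    return round(length_mm / pixel_size_mm)

# Hypothetical display: 344 mm wide at 1920 px across (a common 15.6" panel).
frame_width_px = mm_to_pixels(139.0, 344.0, 1920)  # a 139 mm wide frame front
```

Drawing the front view of the frame 776 pixels wide on this particular display would make it appear at its real physical width, so the user can hold existing glasses up to the screen for comparison.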
Physics-based previews
A common problem with eyeglass fit is that the nose or temples are not sized correctly, causing the eyeglasses to slip down the user's nose. A physics-based preview method can simulate whether the glasses will stay in place on the nose. The following is an embodiment of a physics-based preview:
in this embodiment, a) the computer system displays a preview of the customized eyewear model on the user's image data and facial model (using any of the methods previously described); b) the computer system accepts user input (touch screen, slider bar, mouse control, gestures, etc.) to move the front frame of the eyewear model vertically up and down relative to the user's face and/or closer to or farther from the user's face; c) the computer system imposes constraints to ensure that the eyewear does not interfere with the model, such as the nose pads intersecting the surface of the facial model or the temples intersecting the tops of the ears of the facial model; d) the computer system applies the following physical attributes to the eyewear model and facial model:
i. The mass of the eyewear model, estimated from its volume and material properties;
ii. The coefficient of friction of the eyewear material;
iii. The coefficient of friction of skin, estimated from the general properties of human skin;
e) The computer system solves a system of mechanical equations representing the balance between the force of the eyewear's weight and the opposing frictional forces where the eyewear nose pads contact the nose surface of the facial model and the temples contact the ears of the facial model; and f) the mechanical equations are solved iteratively until a steady state is reached in which the eyewear rests in a position of balanced supporting forces.
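A drastically simplified static version of the force balance in steps e) and f) treats the nose as an inclined plane: the portion of the eyewear's weight not carried by the ears stays put as long as the downhill component does not exceed the maximum static friction. The masses, angles, and coefficients below are illustrative; a real solver would iterate over the full contact geometry as the text describes.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def stays_in_place(mass_kg, nose_angle_deg, mu_pad_skin, ear_support_fraction):
    """Static check: the eyewear weight not carried by the ears must be held
    on the nose 'incline' by friction. For a block on an incline of angle
    theta (from horizontal), this reduces to tan(theta) <= mu, so the load
    terms cancel; they are kept explicit to mirror the full force balance."""
    nose_load = mass_kg * G * (1.0 - ear_support_fraction)
    theta = math.radians(nose_angle_deg)
    sliding_force = nose_load * math.sin(theta)          # downhill component
    max_friction = mu_pad_skin * nose_load * math.cos(theta)
    return sliding_force <= max_friction

# A 28 g acetate frame, pads on a 35-degree nose slope, ears carrying 60%.
ok = stays_in_place(0.028, 35.0, mu_pad_skin=0.9, ear_support_fraction=0.6)
```

A steeper nose slope or a lower pad-skin friction coefficient flips the result, which is exactly the slipping failure mode the preview is meant to expose.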
Lens view rendering
In another embodiment, the computer system simulates the vision of a user wearing progressive eyewear. The user is presented with a view as if looking through their configured lens, seeing the world as they would see it through that lens. This technique is best applied to custom configurations of digitally compensated (free-form) progressive lenses without wrap. A photograph (a pre-selected or user-uploaded image, or a live image stream from the computer system's imaging device) can be displayed on the screen with the lens placed in front of the image. Information is superimposed on the lens, identifying for the user the various corrective zones of the lens (distortion zones, the progressive corridor, the area of maximum magnification, transition zones, etc.). The system can treat the photograph as if it were placed at a set distance behind the lens and, using ray-tracing rendering techniques known to those skilled in the art, can distort the photograph as light passes from the photograph through the lens to the viewer. In this preview, changes to the lens design or to the shape and size of the glasses can be updated in real time. The user can thereby better understand where distortion will occur in the lens (the peripheral areas of a progressive lens) and how much distortion to expect given various digital lens designs. In another embodiment, the computer system uses its imaging sensor to provide a live preview of what is seen through the system display, distorting this view in real time according to the selected lens design. This augmented reality live preview allows the user to experience life as seen through their customized lenses, given the chosen lens parameters and customized frame parameters.
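The prismatic displacement that drives this kind of distortion preview can be approximated with Prentice's rule (prism dioptres equal decentration in centimetres times lens power in dioptres). The linear power profile along the corridor below is an invented illustration, not a real progressive lens design.

```python
import numpy as np

def prentice_displacement_cm(decentration_cm, lens_power_d):
    """Prentice's rule: prismatic effect (in prism dioptres) at a point
    c centimetres from the optical centre of a lens of power F is P = c * F;
    one prism dioptre displaces the image by 1 cm at 1 m."""
    return decentration_cm * lens_power_d

# Sample the apparent image shift (cm at 1 m viewing distance) along a
# hypothetical progressive corridor whose power grows from +1.0 D at the
# fitting cross to +3.0 D at the bottom of a 2 cm corridor.
ys_cm = np.linspace(0.0, 2.0, 5)              # distance below fitting cross
powers = 1.0 + (3.0 - 1.0) * ys_cm / 2.0      # assumed linear power profile
shifts = prentice_displacement_cm(ys_cm, powers)
```

A rendering engine would turn such a displacement field into a per-pixel warp of the photograph behind the lens; real free-form designs vary power in two dimensions, not just along the corridor.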
User interaction and control
Referring to 16 of FIG. 1A and 110, 113, and 114 of FIG. 1B, the computer system provides a way for the user to interact with it to shop, select, edit, modify, preview, control previews, visualize, purchase, and perform other related activities for the customized product.
Fig. 11 shows an example computer system interface 1101 that may be displayed on a display of a computer system, with an eyewear preview 1106 on a user 1102. The computer system interface includes controls 1103 for ordering, viewing, configuring, sending previews, sharing, obtaining help, and other functions. The eyewear style or base design can be selected with control 1108, and the color/finish can be selected with control 1105. Instructions are provided to the user via display 1104. Those skilled in the art will recognize that various other designs may serve the same needs described for viewing, customizing, and ordering eyewear. For example, multiple views of the glasses may be used, with 2, 4, or 9 windows simultaneously displaying different designs or different viewing perspectives of the user. In one embodiment, the computer system displays multiple instances of the user, each wearing a different configuration of customized eyewear, where each pair of glasses shown has one or more options changed. For example, the display shows nine instances of the user's face, each wearing the same custom eyewear design but in a different color, style, or lens material. In another example, multiple instances of the user are shown, each with glasses of the same style and color but with sizes automatically varied slightly relative to the face, such as slightly larger or smaller, or with the glasses placed slightly higher or lower on the face (using the same sizing algorithm or competing algorithms). In another example, the display shows multiple instances of the user wearing the same or different custom glasses viewed from different angles (front, isometric, side, top). When one instance is manipulated, all instances are updated at the same time; for example, when the user changes the view of one instance, the same view change is applied to all instances.
In one exemplary embodiment, the computer system allows the user to adjust the position of the eyewear model on their face. The user selects the glasses with the input device and adjusts their position by moving, dragging, or making other control actions. For example, the user grasps the temple of the glasses and slides it up or down to better fit over the ear, or grasps the glasses at the nose bridge to place or adjust how and where the glasses rest on the nose. In addition, the user can correct any errors in the automatic placement of the glasses.
In another embodiment, the eyewear model is adapted and reconfigured in real time or near real time as the user adjusts its position. Often, simply moving the glasses to a new location for a preview would result in glasses that no longer fit in that location, because the nose might be too narrow or the temples too long, or certain parts might not fit given the new position. With configurable eyewear, as the user moves the glasses, the model can adapt so that the glasses change shape to fit the user's face in the new position. If the user moves the glasses slightly farther from their face, the nose pads and temples lengthen slightly (among other changes), rather than remaining too short and letting the glasses fall off the user's face as unadjusted glasses would.
For example, in fig. 11, the angle at which the glasses 1106 are placed on the user 1102 in the preview interface 1101 is incorrect. The user adjusts the position by selecting the glasses 1106 with the input device and moving them in the direction 1107 shown. The preview is then updated, as shown in view 1109, showing the glasses 1110 positioned on the user's face according to the user's specification. Alternatively, the user can manually identify the specific points at which the ears, eyes, nose, and other features are located so that the computer system can align the eyeglasses more accurately. The left and right ears of a person are often at different heights, which often results in the glasses resting askew or tilted. The ability to adjust the angle, and to ensure a custom eyeglass design that accounts for the different heights of the left and right ears, gives the user a great advantage in obtaining a proper and comfortable fit. With the configurable eyewear model, a proper fit is not only shown in the preview; the eyewear can actually be configured and manufactured so that the product the user receives fits in reality just as it does in the preview, a clear advantage over the prior art.
After the eyewear model is automatically placed on the user's anatomical model, it is desirable to allow the user to adjust the placement according to their preferences during the preview. For example, a user may prefer to wear their glasses higher or lower relative to their eyes or nose, or farther from or closer to their face. These adjustments may help inform a custom eyewear design that places the eyewear according to the user's preferences. One of the great advantages of fully customized eyewear is that the base design can be adapted to fit the user's placement preferences. Typically, a user may preview or wear the glasses at different locations on their face (closer to or farther from the eyes, or higher or lower on the nose), but if the glasses are not the correct size and shape, they will be uncomfortable, will not stay in place, or cannot be worn in the desired location. The following embodiments describe systems and methods that enable custom placement of custom eyeglasses:
Embodiment of adjusting the vertical position of an eyewear model on a user's face by setting the vertical position and adapting the eyewear model placement:
In this embodiment, a) the computer system displays a preview of the customized eyewear model on the user's image data, b) the computer system accepts user input (touch screen, slider bar, mouse control, gestures, etc.) to move the front frame of the eyewear model vertically up and down relative to the user's face, and c) the computer system solves the constraint system to adjust the eyewear model appropriately on the user's face:
i. The vertical height of the front frame must be at the vertical position specified by the user
ii. The temple of the glasses must contact the vertex where each ear intersects the head in the user's facial model; the temples are adjusted to different heights according to the symmetry or asymmetry of the user's face
iii. The nose pad region of the glasses must contact but not intersect the nose of the user's facial model
iv. Optionally, as previously mentioned, the constraints may involve other points, lines, surfaces, or features.
d) If the constraints can be met by adjusting the eyewear position to achieve the user-specified vertical position of the eyewear model, the system will display an updated preview with the new eyewear model position, and e) optionally, if the constraints cannot be met, the system informs the user that the position is not possible, or that their eyewear may not fit properly (e.g., may slide down the nose). Alternatively, if the calculations are done in real time, the user will only be able to adjust the glasses within the set vertical distance range.
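As a purely illustrative sketch (all names, units, and limits below are assumptions, not the patent's implementation), the constraint solve in steps (c) through (e) can be viewed as a feasibility check that re-derives temple and nose-pad parameters for the requested height:

```python
def solve_vertical_position(face, requested_y, limits):
    """Check feasibility of a user-requested vertical frame position and
    derive the dependent eyewear parameters from the face model.

    face: dict of landmark measurements (hypothetical schema).
    Returns (params, warnings); params is None if the position is
    outside the resolvable range.
    """
    # i. the front frame sits at the user-specified height
    frame_y = requested_y
    lo, hi = limits["frame_y"]
    if not (lo <= frame_y <= hi):
        return None, ["requested position outside resolvable range"]
    # ii. each temple must reach the vertex of its ear-head intersection;
    #     asymmetric ears give each temple its own length and drop
    temples = {}
    for side in ("left", "right"):
        ear_x, ear_y = face[f"{side}_ear_vertex"]
        temples[side] = {"length": ear_x, "drop": frame_y - ear_y}
    # iii. nose pads must contact, not intersect, the nose: set the pad
    #      spacing to the nose width at the frame height
    pad_width = face["nose_width_at"](frame_y)
    warnings = []
    if pad_width < limits["min_nose_width"]:
        warnings.append("eyewear may slide down the nose")
    return {"frame_y": frame_y, "nose_pad_width": pad_width,
            "temples": temples}, warnings
```

If the position is infeasible the caller would keep the previous preview and notify the user; warnings correspond to step (e)'s "may not fit properly" message.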
Embodiment of adjusting the position of an eyewear model on a user's face by setting a position and adapting the eyewear model to achieve the desired position:
In this embodiment, a) the computer system displays a preview of the customized eyewear model on the user's image data, b) the computer system accepts user input (touch screen, slider bar, mouse control, gestures, etc.) to move the front frame of the eyewear model vertically up and down relative to the user's face, and/or closer to or farther from the user's face, and c) the computer system solves the constraint system to properly adjust the eyewear model on the user's face:
i. The vertical height of the front frame and its proximity to the face must be at the user-specified location
ii. The temple of the glasses must contact the vertex where each ear intersects the head in the user's facial model; the temples are adjusted to different heights according to the symmetry or asymmetry of the user's face
iii. The nose pad region of the glasses must contact but not intersect the nose of the user's facial model
iv. Optionally, as previously mentioned, the constraints may involve other points, lines, surfaces, or features.
d) If the adjustment creates a gap or interference between the nose area of the eyewear model and the user's face model, the computer system adapts the nose area of the eyewear model (adjusting the thickness, position of the nose pads, width, etc.) to make contact with the user's nose; e) if the adjustment creates a gap or interference between the temple and the ear on the user's face model, the computer system adjusts the temple (adjusting the length, angle, etc.); f) if the adjustment creates a gap or interference outside the resolvable domain of the custom eyewear model constraints, or if a significant portion of the eyewear causes interference (e.g., the entire frame moves into the face), the computer system does not allow the adjustment to the unacceptable position; and g) the system displays an updated preview with the new eyewear model position.
Embodiment for adjusting the position of an eyewear model on a user's face by pre-computing a series of options:
In this embodiment, a) the computer system calculates the best fit of the eyewear model on the user's image data, b) the computer system makes a number of adjustments to the position of the eyewear, moving it up and down over the nose or farther from/closer to the face in set increments (e.g., +4mm, +2mm, -4mm) from the best position, c) the computer system pre-renders the user's image and eyewear model in all adjusted configurations, d) the computer system displays a preview of the customized eyewear model on the user's image data, e) the computer system accepts user input (touch screen, slider bar, mouse control, gestures, etc.) to move the front frame of the eyewear model vertically up and down relative to the user's face in the increments used to pre-compute the adjusted configurations, and f) the computer system displays the adjusted-configuration rendering that matches the user's selection.
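A minimal sketch of this pre-rendering scheme, with the rendering step stubbed out and all names hypothetical: user input is snapped to the nearest pre-computed increment so that only pre-rendered previews are ever displayed.

```python
def prerender_offsets(best_fit_y, increments=(-4, -2, 0, 2, 4)):
    """Pre-render the preview at fixed vertical offsets (mm) from the
    best-fit position; actual rendering is stubbed out as a string."""
    return {off: f"render@{best_fit_y + off:+.1f}mm" for off in increments}

def preview_for(user_offset, renders):
    """Snap arbitrary user input to the nearest pre-computed increment
    and return that pre-rendered preview."""
    nearest = min(renders, key=lambda off: abs(off - user_offset))
    return renders[nearest]
```

Because every displayable configuration is rendered ahead of time, the interactive step reduces to a dictionary lookup, which is what makes this embodiment fast.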
An embodiment of adjusting the vertical position of the eyewear model on the user's face with surface constraints:
In this embodiment, a) the computer system calculates the best fit of the eyewear model on the user's image data, and b) the computer system sets constraints that limit the possible movement between the eyewear and the facial model:
i. The eyewear model moves only in certain directions (e.g., farther from/closer to the face, or vertically up and down)
ii. The eyewear model rotates only about the axis formed by the line passing through the contact points between each ear and temple
iii. The eyewear model must maintain contact between the temples and the vertex of each ear-head intersection on the face model
iv. The eyewear model's nose pads must contact the nose surface on the face model, or remain within certain tolerances of it
v. Optionally, as previously mentioned, the constraints may involve other points, lines, surfaces, or features.
c) The computer system displays a preview of the customized eyewear model on the user's image data, d) the computer system accepts user input (touch screen, slider bar, mouse control, gestures, etc.) to move the eyewear model, solving the constraint system for each user input, e) the eyewear model moves only within the predefined constraints, and f) the computer system displays the positional adjustment of the eyewear model as the user moves it.
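The constrained movement in constraint (i) above amounts to projecting each raw drag onto the allowed axis and clamping total travel; a minimal sketch under those assumptions (names illustrative):

```python
def constrain_drag(drag, allowed_dir, travel_limit, current_offset):
    """Project a raw 2-D drag vector onto the single allowed movement
    direction (e.g. vertically up/down) and clamp total travel.

    drag, allowed_dir: (x, y) tuples; allowed_dir must be unit length.
    Returns the permitted scalar movement along allowed_dir.
    """
    # project the drag onto the allowed axis (dot product); the
    # perpendicular component of the drag is simply discarded
    step = drag[0] * allowed_dir[0] + drag[1] * allowed_dir[1]
    # clamp so the accumulated offset stays within the travel limit
    lo, hi = -travel_limit, travel_limit
    new_offset = max(lo, min(hi, current_offset + step))
    return new_offset - current_offset
```

The same projection idea applies to constraint (ii): a rotation request is reduced to a single angle about the ear-to-ear axis before the contact constraints are re-checked.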
Embodiment for adjusting the vertical position of an eyewear model on a user's face using an image of the user's current glasses:
The user may already have a preferred position in which a pair of glasses rests on their face. This embodiment describes how new custom eyeglasses can be designed to achieve the same positioning even if the style, shape, and design of the eyeglasses are different:
In this embodiment, a) a computer system configured with an imaging device acquires image data and constructs a model of the user's face (using any of the methods described previously), b) the user uses the computer system to capture image data of himself wearing glasses positioned according to his preferences, c) the computer system extracts the anatomical locations where the glasses contact the user's face (e.g., where the nose pads rest against the user's nose), and/or reference positions of the eyewear relative to facial features (e.g., the distance of the top of the eyewear above the eyes, or the distance down the length of the nose at which the eyeglass bridge rests), d) the computer system uses the anatomical and/or reference locations to optimize the fit and design of the new custom eyeglasses, and e) the computer system solves the constraint system to properly adjust the eyeglass model on the user's face:
i. The vertical height, angle, and proximity to the face of the front frame must be at the position closest to the extracted data
ii. The temple of the glasses must contact the vertex where each ear intersects the head in the user's facial model; the temples are adjusted to different heights according to the symmetry or asymmetry of the user's face
iii. The nose pad region of the glasses must contact but not intersect the nose of the user's facial model
iv. Optionally, as previously mentioned, the constraints may involve other points, lines, surfaces, or features.
f) The computer system displays a preview of the customized eyewear model on the user's image data.
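A hedged sketch of the extraction in step (c): reference placement offsets are computed from detected landmarks. The landmark names and the coordinate convention (y increases upward) are assumptions for illustration, not the patent's actual feature set.

```python
def extract_placement_refs(landmarks):
    """From detected landmarks of the user wearing their current glasses,
    compute reference offsets describing the preferred placement:
      - height of the frame top above the pupil line
      - how far down from the nose root the bridge rests
    landmarks: dict of (x, y) model coordinates (y increases upward).
    """
    pupil_y = (landmarks["left_pupil"][1] + landmarks["right_pupil"][1]) / 2
    return {
        "frame_top_above_pupils": landmarks["frame_top"][1] - pupil_y,
        "bridge_drop_from_nose_root": landmarks["nose_root"][1]
                                      - landmarks["bridge_rest"][1],
    }
```

These offsets can then be fed into the constraint solve of step (e) as target values, so a new design with a different shape still reproduces the user's preferred resting position.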
User interaction and control of configurable models
One great advantage of custom eyewear systems is that users can directly modify and update products according to their preferences. In one exemplary embodiment, the computer system provides a user with controls to edit or adjust the shape of the glasses from the base design, which is used as a modification template. The computer system may have automatically customized the base design for the user, or the base design may be the original base design prior to any customization.
FIG. 12 illustrates an example computer interface 1201 for adjusting eyewear 1203 previewed on a user 1202. The base design is constructed from a variety of styles or materials, including but not limited to full-frame, half-frame, frameless, plastic, or metal. Controls include, but are not limited to: control points on the glasses that can be dragged or adjusted, sliders linked to certain features, direct drawing on the frame, and touch, gesture, mouse, or other interaction for stretching or pushing/pulling features of the frame. In one embodiment, the controls allow the user to change certain limited features including, but not limited to, nose pad width, temple length and height, and the width and height of the front of the eyeglasses. For example, if the face of the user 1202 in fig. 12 is narrow, he adjusts the glasses 1203 so that the overall size of the glasses is narrowed. The user selects the glasses 1203 with the input device of the computer system and moves the edge of the glasses inwards towards his face, as indicated by the arrow in fig. 12. The resulting modified glasses 1206 are shown in the updated preview 1205. The ability of the user to make such easy and customized adjustments to the eyewear prior to purchase represents a significant change from the prior art approach to purchasing eyewear products. The feedback may be nearly instantaneous, and the user can see the rendered preview updated on the display of the computer system.
In one embodiment, constraints limit customization to a range predefined in the configurable model. The parametric design and constraints of the model can be used to limit feature adjustments so as to maintain the base design of each pair of eyeglasses while allowing the user to achieve a customized fit and sizing in a simple process. While some use cases may benefit from giving the user 100% control over the design, there are significant advantages to limiting adjustments so that the user can easily obtain a product that is aesthetically pleasing and still manufacturable. For example, without any constraints, the user might inadvertently create an undesirable design that intersects itself or is very asymmetric or uneven, neither fitting nor looking good. In addition to the inherent constraints, controls such as control points, arrows, etc. may be highlighted only over adjustable regions, or highlighted when the user moves their input device over a region, or instructions may explain which portion(s) of the glasses the user is able to change.
In another embodiment, the user is less limited in being able to make adjustments while still maintaining the overall eyeglass design. For example, the computer system enables a user to grab and adjust any portion of the eyewear, giving controls to adjust the length, height, width, and thickness of any portion of the eyewear, as well as the curvature of various components such as the edges and temples. Fig. 13 shows an exemplary base eyewear design 1301. The user directs the computer system input device, selects a point on the glasses at 1305, and moves along the dotted line in the direction of arrow 1306 toward point 1307. The glasses 1302 will then be modified in the edited region 1308. To reduce the number of steps necessary to customize the eyewear while maintaining symmetry, changes on one side of the eyewear are applied to the other side of the eyewear as well, as shown in the updated eyewear 1303.
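The symmetric edit described above (a change on one side of the eyewear applied to the other side as well) can be sketched as follows; the control-point representation and the mirror map are illustrative assumptions:

```python
def apply_symmetric_edit(points, index, delta, mirror_of):
    """Move one control point by delta=(dx, dy) and apply the mirrored
    edit (x displacement negated) to its counterpart on the other side
    of the frame, preserving symmetry in a single user action.

    points: list of [x, y] control points; mirror_of maps an index to
    the index of its mirror across the vertical centerline.
    """
    pts = [p[:] for p in points]        # copy, don't mutate the input
    dx, dy = delta
    pts[index][0] += dx
    pts[index][1] += dy
    j = mirror_of[index]
    if j != index:                      # centerline points mirror onto themselves
        pts[j][0] -= dx                 # x displacement is mirrored
        pts[j][1] += dy                 # y displacement is shared
    return pts
```

Dragging a left-edge point inward thus narrows both sides at once, halving the number of steps needed to customize the frame while keeping it symmetric.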
User adjustment without direct editing
In another embodiment, the computer system may ask the user questions to help guide his adjustments, or to perform multiple adjustments in sequence. For example, the computer system may ask "Are the glasses now too wide or too narrow on your face?" or "Are the glasses now too thick or too thin?" or "Do you want a larger or smaller style?" The user can select an option or answer a prompt through the interface and then observe the responsive adjustment of the glasses. When combined with the machine learning techniques described herein, this can be a powerful way to provide personalized and customized recommendations while allowing slight adaptation based on the user's real-time feedback.
In another embodiment, the computer system alerts the user to certain critical adjustment areas, including but not limited to the nose pads and temples. The nose and the tops of the two ears are the three critical contact points that must fit well, and each ear may be at a different height. The computer system may ask the user to inspect these specific areas and adjust them as needed. For example, the user may adjust the temple length until the temple fits well over the ear, or independently adjust each temple's angle to correspond to the user's different ear heights, so that the front frame of the eyeglasses rests ideally and aesthetically on his nose.
In another embodiment, the user may adjust, modify, reposition, or select a new eyewear design in real-time on a preview of their image data. As previously described, a real-time preview is provided and the user is given controls to modify the eyewear design in real-time.
Improper fit
Referring back to step 111 of FIG. 1B, the computer system detects when there is a potentially improper or uncomfortable fit, or when a design has been created that cannot be manufactured as customized. These undesirable configurations may result from the user's interaction with and customization of their model, and the user may not know how their changes affect the model. For example, if the temples must flex too far to accommodate the user's face, they can be uncomfortable because pressure is applied to the sides of the user's head. The pressure on the user's head is calculated based on the hinge design properties, the degree to which the hinge and/or temple are deformed, and the distance from the hinge to the location where the temple contacts the user's head. In another example, the nose pads are too tight on the nose, or too loose such that the eyeglasses may slip off. Such interference may be beyond the user's ability to detect, but analysis of the anatomical model and the configurable eyewear model enables detection of interfering surfaces. The pressure on the nose pads is calculated based on the geometry of the face and glasses and the material properties of the glasses. If the pressure is determined to be too high, a warning or an automatic adjustment to the design is provided. In addition, the lens may be positioned at a non-optimal angle, making the user's visual experience poor or visual acuity less than optimal.
The computer system analyzes the following criteria (among others) between the 3D glasses model and the quantified anatomical model to ensure proper fit on the user: the interference or gap between the nose pads and the nose, the interference or gap between the tops of the ears and the temples, the temple angle (inward or outward) required to fit the ears, the lens angle, the position of the eyewear on the nose, and the position of the eyes relative to the lenses (e.g., is each eye centered within its lens?).
The computer system combines dimensional information with material properties, force and deformation calculations, and computational simulations of stress/strain. There may be a specification for each analyzed metric, and the user is alerted if one does not meet its criterion. Alternatively, the computer system automatically suggests an alternative or a set of alternatives.
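As an illustrative approximation only (the patent does not specify the mechanical model), the temple-pressure check described above can be sketched by treating the hinge as a torsional spring; the stiffness value and comfort threshold below are invented for the example.

```python
def temple_pressure(hinge_stiffness, splay_angle_rad, hinge_to_contact_mm,
                    contact_area_mm2):
    """Estimate contact pressure of one temple on the side of the head.

    Model: the hinge acts as a torsional spring, so the moment needed to
    splay the temple is resisted by a contact force at the head:
        moment   = k * angle        (N*mm)
        force    = moment / arm     (N)
        pressure = force / area     (N/mm^2 = MPa)
    """
    moment = hinge_stiffness * splay_angle_rad
    force = moment / hinge_to_contact_mm
    return force / contact_area_mm2

def fit_warnings(pressure_mpa, max_ok=0.01):
    """Flag pressures above a (hypothetical) comfort threshold."""
    if pressure_mpa > max_ok:
        return ["temple pressure too high: consider widening the frame"]
    return []
```

A real system would replace this with the force/deformation calculations and stress/strain simulations mentioned above, but the structure is the same: compute a metric per contact point, compare it to a specification, and either warn or auto-adjust.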
Custom finishing
In one exemplary embodiment, the computer system provides the user with controls to change the color, finish, texture, or material of the eyewear. User control of these options may occur without automatic recommendation by the computer system, or the user may be given control after the computer system has made the initial custom design. The computer system displays a preview of one or more colors applied to the glasses. The user selects different colors for various portions of the glasses. The selection of colors may be limited to a set of colors/finishes determined by the manufacturer, or there may be hundreds, thousands, or more colors/finishes. The user also selects material finish options to be previewed. Examples of selectable and renderable finishes include polished, brushed, satin, clear-coated, varnished, matte, embossed, hammered, textured, and the like. User changes and edits to the glasses may occur in an editing interface whose updates are applied to the preview view, or the changes and edits may be applied and previewed in real-time.
In another embodiment, the user takes a picture of an object, such as clothing, nail polish, a photograph, and the like. The user provides the photograph as a digital image or takes it using the computer system. The user selects a point or area of the photograph for the computer system to match the color or pattern. The computer system analyzes the photograph and derives a custom color or pattern from the image. The computer system may require the use of a calibration standard to achieve high accuracy in color matching and reproduction. The calibration standard is a printed card bearing various calibration colors and shades, which the user must include in the image. The manufacturer may provide the card to the user, or the user may print it. Alternatively, the computer display may be held next to an object of the desired color; the color calibration pattern is shown on the display and captured together with the object in a mirror or with a second image-capture device. Alternatively, the user is prompted to include a known object in the photograph. The known object is one that has been calibrated and stored in the computer system. Examples include ubiquitous logos known to be professionally printed with a high degree of color accuracy and consistency, such as those on food boxes or magazines, soda cans, currency, or credit cards. Alternatively, the computer system may have a database of known colors from other manufacturers, such as cosmetics, paint samples, automobiles, or textiles, from which the user can select the color of her favorite shirt, car, or nail polish, and the manufacturer would then have the color information necessary to accurately reproduce and match the desired color.
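A minimal sketch of the calibration-card color correction: a per-channel gain and offset is fitted by least squares from the card's known patch values, then applied to the sampled pixel. A production system would likely fit a full color matrix, and all names here are hypothetical.

```python
def fit_channel_correction(measured, reference):
    """Fit per-channel gain and offset (closed-form least squares) that
    map camera-measured RGB values of calibration patches to their known
    reference values. measured/reference: lists of (r, g, b) tuples."""
    corrections = []
    for c in range(3):
        xs = [m[c] for m in measured]
        ys = [r[c] for r in reference]
        n = len(xs)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        var = sum((x - mean_x) ** 2 for x in xs)
        cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        gain = cov / var
        corrections.append((gain, mean_y - gain * mean_x))
    return corrections

def correct_color(rgb, corrections):
    """Apply the fitted correction to a pixel sampled from the photo."""
    return tuple(g * v + o for v, (g, o) in zip(rgb, corrections))
```

The same fit can be computed from a known object (logo, credit card) in place of the printed card, since either supplies measured/reference pairs.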
In another embodiment, the eyewear is customized with a pattern, image, or text from the user; these will be referred to collectively herein as a pattern. The pattern is printed, engraved, etched, painted, or otherwise applied to any surface of the eyewear. The pattern may be generated from a library of options available on the computer system, provided from the user's own image (similar to the custom-color matching described previously), or input directly by the user. For example, a user may want to print his name inside the temple, etch a line design on the side of the temple, or print a leaf-texture pattern on the eyeglasses. The pattern is rendered and previewed to the user on the 3D glasses model, and then accurately reproduced on the manufactured glasses.
In another embodiment, the eyewear is customized with accessories including, but not limited to, logos, charms, jewelry, and the like. For example, the base design may have the option of placing an accessory on each temple near the hinge. There is a default accessory, and the user can choose to change, reposition, or remove it. The user may select from various options including different shapes, colors, materials, and so forth. The computer system renders the accessory on the 3D glasses model for the user to preview.
Preference recording
In an exemplary embodiment, once the user has selected the glasses and adjusted their size, color, and other features, these preferences are recorded and stored to a non-transitory computer-readable medium. The computer system also stores the user's models, image data, and other information. When a user selects an alternative eyewear design (such as a different material or style), the eyewear is adjusted according to the user's past interactions and preferences, making the experience of browsing eyewear more customized while reducing repetitive tasks. For example, once desired fit preferences are established, any design or style can be updated to fit the user according to those preferences. If a user prefers glasses that are slightly narrower than their face width and prefers to wear the glasses farther from their eyes, all styles can be adjusted according to these preferences. In another embodiment, the preferences of a particular user are refined as the user uses the computer system. As described above in the method of building a training database of preferences, the computer system records and tracks the user's preferences as they purchase and preview glasses. This information is used to improve the user's preference profile and is added to information he has entered or that was previously analyzed from image data he provided. The stored user preferences may also be used to build a larger database for future predictions and customizations for new users, as mentioned above.
When the user and/or the computer system adjusts the glasses, the computer system records the magnitude and direction of the change (when relevant). The configurable eyewear model is updated by adjusting the appropriate model parameters by the amount needed to match the changes requested by the user. Any constraints written into the model are checked, and if limits are exceeded, the computer system provides a warning to the user. Alternatively, the change is applied up to the limit, and any excess beyond the limit is ignored or not allowed (with or without alerting the user that the limit has been exceeded). For example, if the user changes the eyeglass width from 140mm to 190mm but the maximum design width is limited to 170mm, the eyeglasses will only adjust to a maximum of 170mm, and the user is notified that this limit has been reached. As previously described, the computer system renders and displays the updated model, enabling the user to preview the new 3D glasses model on his image data. In another embodiment, the changed area of the glasses is highlighted or identified to the user for a period of time or until the user accepts the change. The user is provided with the ability to undo (or redo) any changes he requested.
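The limit-checking behavior described above, including the 140mm request for 190mm being clamped at 170mm, can be sketched as follows; the parameter names and limit values are illustrative only.

```python
def request_change(model, param, new_value, limits):
    """Apply a user-requested parameter change, clamping it to the design
    limits written into the configurable model, and report whether a
    limit was hit so the UI can notify the user.

    Returns (updated_model, limit_reached)."""
    lo, hi = limits[param]
    applied = max(lo, min(hi, new_value))        # clamp into [lo, hi]
    updated = dict(model, **{param: applied})    # non-destructive update
    return updated, applied != new_value
```

Keeping the update non-destructive (returning a new dict rather than mutating) also makes the undo/redo behavior mentioned above straightforward to implement as a history of model states.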
Efficiency of configuration
When a user or computer system requests changes to a configurable model to fit different users, it may be desirable to have multiple custom designs pre-configured in order to achieve efficiency. For example, hundreds, thousands, or millions of design configurations may be pre-configured and stored on the computer system or a network-accessible computer system. If these pre-staged configurations cover the most commonly accessed design configurations, they can be quickly accessed and displayed to the user. Alternatively, a shape-matching algorithm, a lookup table, or another technique is used to find the model closest to the user's preferences. Subsequent minor adjustments are then made from the pre-staged configuration to fine-tune the configurable model to the exact user preferences.
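A hedged sketch of the lookup step: here the pre-staged configuration nearest the user's preferences is found by Euclidean distance over shared numeric parameters, a simple stand-in for the shape-matching algorithm or lookup table mentioned above.

```python
def nearest_preconfigured(target, catalog):
    """Find the pre-staged configuration closest to the target parameter
    vector (Euclidean distance over the target's parameters), so only a
    small fine-tuning delta remains to be computed.

    catalog: {config_id: {param: value}}; target: {param: value}.
    Returns (config_id, distance)."""
    def dist(cfg):
        return sum((cfg[k] - v) ** 2 for k, v in target.items()) ** 0.5
    best = min(catalog, key=lambda cid: dist(catalog[cid]))
    return best, dist(catalog[best])
```

With millions of stored configurations a real system would use a spatial index rather than a linear scan, but the interface is the same: retrieve the nearest pre-staged model, then fine-tune.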
Preparation for manufacture
As shown at 17 of fig. 1A and at 115 and 116 of fig. 1B, the computer system stores data representing the user's preferences and designs, and then calculates price and shipping estimates. After the user determines the final custom glasses he wants to order, the computer system may generate a final representation rendered in a more photo-realistic manner with higher quality and resolution (if the original preview image was of lower quality for efficiency). The computer system provides the user with a price, an expected shipping date, and other information prior to completing the order for the user's customized eyewear. The representation may consist of the various parameters and settings selected by the user or of a final 3D model of the glasses. The computer system communicates the eyewear representation and the preferences, size, configuration data, and other information to another computer system accessible by the manufacturer via a network connection or other means of information transfer. In addition to the eyewear representation, that computer system may also receive the user's personal information, payment details, shipping address, image data, and any other information needed to complete the order.
To provide an estimated shipping date and price, the computer system actively tracks a number of parameters, including but not limited to: stock status of all required raw materials, current production capacity, unfinished work, future plans, planned orders, and material delivery dates or production capacities, etc. The computer system performs scheduling and shipping estimations to provide the user with an expected delivery date or to provide the manufacturer with the actions required to achieve a guaranteed delivery date for the user.
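One simple way to picture the shipping estimate (a toy model, not the patent's scheduler): divide the backlog ahead of the order by daily production capacity and add a safety buffer.

```python
def estimate_ship_date(backlog_units, daily_capacity, units_in_order,
                       order_day=0, buffer_days=1):
    """Estimate the day (as an offset from order_day=0) an order ships,
    given the current unfinished backlog and production capacity in
    units per day, plus a safety buffer."""
    if daily_capacity <= 0:
        raise ValueError("no production capacity scheduled")
    units_ahead = backlog_units + units_in_order
    production_days = -(-units_ahead // daily_capacity)  # ceiling division
    return order_day + production_days + buffer_days
```

The real system described above would also fold in raw-material stock dates, planned orders, and machine schedules, but each of those is another term pushing the earliest feasible production day later.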
Manufacturing customized products
Step 114 of FIG. 1B shows the user deciding to purchase the glasses. Step 18 of FIG. 1A and steps 116 and 117 of FIG. 1B depict the analysis and preparation of information and files for the manufacture of eyeglasses and lenses. Once the final eyeglass representation, preferences, dimensions, and configuration data are available in the manufacturer's computer system, the data is analyzed to automatically create a manufacturing work order and a collection of CAD, CAM, CNC, or other manufacturing and modeling files. A serial identifier linked to the user's order is created to track the glasses as they move through the production process. The computer system associates the serial number with the raw material, specification, and quality control sheets. The computer system also prepares manufacturing files according to the manufacturing methods required for a particular eyewear model, including but not limited to: model files for rapid prototyping or additive manufacturing methods; model files converted to tool-path CNC code for machining (e.g., g-code), grooving, milling, or other subtractive manufacturing methods; model files converted to flat patterns for lithography; model files converted to flat patterns and tool paths or robot control codes for laser cutting, laser marking/etching, water-jet cutting, stamping (and production of stamping tools), or other 2-D cutting methods; model files converted to the inverse geometry needed for injection molding, casting, or other tooling production to form molds; and conversions to robot control instructions for part handling, polishing, assembly, drilling, cutting, etc.
The computer system also prepares manufacturing files according to: prescription information, lens materials, and user information for lens manufacturing, translated into lens surfacing, lens laser-marking, and lens edging instructions; user-entered parameters for updating existing manufacturing files for any of the above methods; colors and patterns to be painted, anodized, deposited, plated, stamped, printed, etched, engraved, or otherwise used to change the visual appearance of the eyewear; and, in general, quantitative information automatically converted from the user's order specification into files or instructions for the manufacturing equipment.
Fig. 15 shows an example of a 3D eyewear design 1501 that is automatically converted into flat patterns of a front 1502, left temple 1503, and right temple 1504 in preparation for laser cutting or machining of a metal or plastic sheet. These parts, along with parts for other orders, are automatically nested to optimize manufacturing metrics such as minimizing material usage or processing time. The flat pattern also contains geometric information about the bending locations 1505 that the manufacturing equipment uses to bend or form the pre-cut part. The pattern is stored as a digital file or other medium as needed to provide the dimensions and instructions to the manufacturing equipment. Subsequent operations may include bending, folding, or other forming operations performed on automated equipment. The manufacturing system may use the serial identifier to determine at each step what operation to perform on the part, or to obtain the part's specification. A bending pattern or other computer-readable instructions are provided to the equipment.
Fig. 16 shows an example of a customized 3D parametric eyewear model 1601 and the resulting manufactured part 1602 produced. Parts such as these are formed using any of the foregoing manufacturing techniques or other methods known to those skilled in the art.
With respect to manufacturing, step 117 of FIG. 1B depicts the computer system controlling the manufacturing equipment and personnel. The computer system may schedule sequences for multiple manufacturing devices with or without the assistance of a person. As an illustrative example, a computer system may provide a set of instructions to perform the following sequence to manufacture a metal eyeglass frame:
the robot pulls the necessary material and feeds it to the laser cutter or CNC machine. In parallel, instructions are sent to the lens manufacturing equipment to surface, polish, mark, coat, and edge the lenses. Commands and tool paths direct the laser cutter to cut out the frame shape and mark it with a logo or other decorative indicia. The robot transfers the laser-cut parts to the bending and molding machines, which form the parts into the desired final shape. The robot then transfers the parts to the polisher, and the polishing machine finishes the parts. Further instructions cover painting, coating, anodizing, printing, or dyeing the parts; directing the robot to sort the finished parts and match frames with lenses; directing the operator to assemble the frames, lenses, nose pads, and ear pads and perform final inspection; and directing the robot to pack the finished product and apply the shipping label.
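The sequence above can be sketched as an ordered routing of a part through stations. Station names loosely follow the text; the dispatch mechanics and names are assumptions for illustration.

```python
# Minimal sketch of routing one part through the manufacturing sequence.
# Station names are illustrative abbreviations of the steps in the text.

ROUTE = ["laser_cut", "bend_form", "polish", "finish",
         "assemble", "inspect", "pack_ship"]

def next_station(current):
    """Return the station that follows `current`, or None at the end."""
    i = ROUTE.index(current)
    return ROUTE[i + 1] if i + 1 < len(ROUTE) else None

station = next_station("polish")  # "finish"
```

A scheduling computer system would call something like `next_station` for each part as each operation completes, as the batching discussion below the figure describes.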
The foregoing instructions form a sequence for one customized product. To manufacture multiple customized products successfully, the computer system controlling the manufacturing process creates a sequence of commands for each stage of the process for every customized part produced. FIG. 32 is a block diagram showing a process flow for one-off customized products. Starting from 3201, an order for customized glasses 1 at 3202, an order for customized glasses 2 at 3203, and an order for customized glasses 3 at 3204 are received over time. After the orders are received, each pair of glasses is assigned a serial number at 3205. The computer system divides the parts into lots 3206 and 3207 for laser cutting based on machine availability, open orders, delivery schedules, and other data. The computer system provides the laser cutter with instructions to cut the parts in each lot; as the customized products move from the laser cutter to the next step, the laser cutter receives instructions for the next lot. After laser cutting, the computer system provides a sequence of instructions for each part, one by one, to a bender 3208. As each part is completed on the bender, the computer system provides instructions to the molding press 3209.
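A minimal sketch of the serialization and batching steps just described: each order receives a serial number, and serialized parts are grouped into fixed-size lots for the laser cutter before being released one at a time to the downstream bender. The serial format and lot size are assumptions.

```python
# Hypothetical sketch: assigning serial numbers to orders and batching the
# serialized parts into laser-cutting lots. Names and lot size are assumed.
from itertools import count

_serial = count(1)

def assign_serial(order_id):
    """Give each incoming order a unique serial number."""
    return f"SN-{next(_serial):05d}"

def batch_for_laser(serials, lot_size):
    """Group serial numbers into lots for the laser cutter."""
    return [serials[i:i + lot_size] for i in range(0, len(serials), lot_size)]

serials = [assign_serial(o) for o in ("glasses-1", "glasses-2", "glasses-3")]
lots = batch_for_laser(serials, lot_size=2)
# lots -> [['SN-00001', 'SN-00002'], ['SN-00003']]
```

In practice the lot boundaries would be driven by machine availability and due dates rather than a fixed size.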
In one embodiment, the computer system generates instructions for quality control or inspection. The computer system creates a template of inspected dimensions or pass/fail criteria for use by the inspector. Because each part is unique and one-off, it is important to create unique inspection standards. The computer system may also provide instructions to an automated inspection system built from the dimensions, attributes, or criteria of each individual product. Additionally, the computer system may provide data or models of the user's anatomy to the manufacturing equipment to produce inspection or assembly fixtures. For example, a 3D-printed model of the user's ears and nose may be generated to ensure that the final product fits the user properly.
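A hedged sketch of such auto-generated inspection criteria: because every part is unique, each part gets its own nominal dimensions and tolerance bands derived from its parametric model. The dimension names and tolerance value below are illustrative, not from the patent.

```python
# Sketch: build a per-part inspection template from nominal model dimensions,
# then check measured values against it. All names/values are hypothetical.

def make_inspection_template(model_dims, tol_mm=0.2):
    """Map each dimension name to an acceptable (low, high) band."""
    return {name: (nominal - tol_mm, nominal + tol_mm)
            for name, nominal in model_dims.items()}

def inspect(measured, template):
    """Return pass/fail per dimension."""
    return {name: lo <= measured[name] <= hi
            for name, (lo, hi) in template.items()}

template = make_inspection_template({"lens_width": 52.0, "bridge": 18.5})
result = inspect({"lens_width": 52.1, "bridge": 19.0}, template)
# result -> {'lens_width': True, 'bridge': False}
```

An automated vision system would supply the `measured` values; the same template could also drive a human inspector's checklist.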
In any of the foregoing steps, subcontractors or multiple manufacturing sites may be used, and in one embodiment the computer system automatically processes the order information and/or prepares the manufacturing instructions/specifications. Finally, in step 118 of FIG. 1B, the customized eyewear is shipped to the user.
Alternate delivery system
The following embodiments describe alternative or additional systems and methods to supplement or augment the foregoing description.
In-store system:
The described methods and systems for creating customized products and eyewear may be used in retail stores, optometrists' offices, or other physical locations. The system and method may be controlled, in part or in whole, by a customer, optician, optometrist, salesperson, or other professional, who assists the user in selecting and purchasing the best frames and lenses in an office or retail location, or via remote assistance over a computer network. FIG. 26 illustrates an exemplary method for purchasing customized eyewear using an in-store system. FIG. 27 illustrates an exemplary computer system. The customer 2700 uses the in-store computer system 2730 with optional assistance from an in-store or remote professional 2710. The computer system 2730 is configured with an image capture device 2720 and a display 2740. The computer system optionally calibrates the imaging device to measure color, so that custom colors match objects the user supplies for customizing the eyewear material. The in-store computer system is configured with a data transfer connection 2750 to the manufacturer's systems 2780 and 2790, and optionally to the customer's computer system 2770 and the professional's store computer system 2760; the user's information, prescriptions, and the like may be transferred over this connection.
If the process is initiated at a professional's store or office, the user's personal computer system can access the user's image data and eyewear models after the session with the professional, so the user can revisit this information later. For example, the user may continue shopping at home after the initial model and customization settings are completed in the store. The computer system may also be configured to interface with optometry equipment to measure prescription information and automatically incorporate the measurements, eliminating the need for manual entry of prescription data. Another advantage of in-store systems is the ability to build more controlled, higher-quality image capture and display setups. Kiosks or computer systems designed specifically for capturing image data and displaying custom previews can use more advanced dedicated hardware, such as multi-camera systems or depth-sensing cameras with calibration functionality.
Fig. 27 illustrates an exemplary method. In this embodiment, at 2701, a computer system configured with a camera or imaging device acquires image data of the user. The computer system may optionally be further configured with a reference target, a plurality of calibrated imaging devices, a depth device, a wearable reference target (such as glasses), or a calibrated distance and positioning device to ensure that the scale of the user can be measured. At 2702, a store or office professional can assist the customer in using the computer system and capturing image data. At 2703, the computer system reconstructs an anatomical model of the user's face from the image data. At 2704 and 2705, the computer system optionally has an input device that enables a store professional, doctor, or other person to enter additional anatomical data, such as physical measurements, prescription information, and the like. The computer system automatically configures or adjusts the user's custom eyewear model for size and fit 2707 and style 2708. At 2709, the computer system creates a custom product and registers the anatomical model with the original user image, so that the model coordinates and camera position align the facial model with the pose, position, and scale of the image of the user's face. At 2710, the computer system aligns the eyewear model with the user model and the image, and renders a preview of the eyewear model on the user's image. At 2711, the computer system optionally has, or connects to, a rapid prototyping system (3D printer, CNC cutter, etc.) to create a physical prototype or preview for the user. At 2712 and 2713, the computer system has an input device that enables the user or a store professional to adjust, update, or configure the custom eyewear model, and to select and try on various eyewear models.
At 2714, the computer system, and optionally a professional, can make recommendations if the eyewear does not suit the customer well. At 2715, the computer system calculates price and manufacturing-time data. At 2717, the user or a store professional selects and tries on various eyewear models. At 2716, the customer may choose to order the custom eyewear. At 2718, the computer system communicates the final eyewear model and user information to the manufacturer's computer system via a network connection or other form of electronic communication, enabling the manufacturer to produce the custom eyewear. At 2719 and 2720, the manufacturer's computer system and manufacturing system preprocess the eyewear model and information and produce the custom eyewear. At 2721, the customized eyewear is completed and shipped to the customer, or made ready for pickup at a store location.
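The price and lead-time calculation at 2715 could be sketched as below. The pricing model (a base price plus per-feature surcharges) and all numbers are assumptions for illustration; the patent does not specify a pricing scheme.

```python
# Hypothetical sketch of the price/manufacturing-time estimate at 2715.
# Base price, surcharges, and lead times are made-up values.

def quote(order):
    """Return (price, lead_days) for a custom eyewear order."""
    base = 95.00
    surcharges = {"premium_material": 40.0, "engraving": 10.0,
                  "progressive_lens": 60.0}
    price = base + sum(v for k, v in surcharges.items() if order.get(k))
    lead_days = 5 + (3 if order.get("progressive_lens") else 0)
    return price, lead_days

price, days = quote({"engraving": True, "progressive_lens": True})
# price -> 165.0, days -> 8
```

The same quote function could feed both the in-store display and the order confirmation sent to the manufacturer.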
Shared data and design access:
In another embodiment, the user grants another party (such as a friend, family member, eye care professional, or fashion consultant) access to his or her image data and anatomical model. The user's computer system transfers the image data, and optionally other information (such as preferences, eyewear models, and settings), to another computer system over a network or other data transfer technique. The transfer is accomplished with a hyperlink, authenticated login, or other mechanism sent directly to the other person through any of a variety of communication channels, such as email, digital messaging, social networking, or cloud storage. The other party then adjusts, customizes, and previews glasses on the original user's face model or image data, saves the configurations and eyewear designs, and sends images, designs, views, customizations, suggestions, notifications, etc. back to the original user. The original user then uses his or her computer system to preview the glasses the other party designed and fitted. A great advantage of this embodiment is that it allows users to crowd-source the design of their glasses, potentially increasing the diversity and quality of the designs they receive for preview: they benefit from both computer-driven algorithms and human-driven design.
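The hyperlink-based sharing mechanism described above could be sketched as below. The token scheme, storage, and helper names are assumptions, not the patent's mechanism; a production system would add expiry, revocation, and per-resource permissions.

```python
# Sketch: grant another party access to a user's image data and eyewear
# designs via an unguessable link token. All names and the URL are hypothetical.
import secrets

_shares = {}  # in-memory stand-in for a server-side share registry

def create_share_link(user_id, resources):
    """Register a share grant and return a link embedding its token."""
    token = secrets.token_urlsafe(16)
    _shares[token] = {"user": user_id, "resources": set(resources)}
    return f"https://example.invalid/share/{token}"

def resolve_share(token):
    """Look up the grant for a token, or None if unknown."""
    return _shares.get(token)

link = create_share_link("user-42", ["face_model", "eyewear_design_7"])
token = link.rsplit("/", 1)[-1]
grant = resolve_share(token)
```

The recipient's computer system would present this token when fetching the original user's model, enabling the adjust-preview-send-back loop the text describes.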
In an exemplary embodiment, the user sends multiple images or interactive models of himself or herself with eyewear previews. The image data or models are sent from the user's computer system to another computer system via a computer network or other information transfer system, through any of a variety of forms of communication, such as email, digital messaging, social networking, or cloud storage. The receiving computer system then allows an authorized person or persons to provide responses, ratings, messages, and other forms of feedback to the original user.
In another embodiment, the system is used by eyewear designers or fashion brands to create their own eyewear lines. Launching a new eyewear line ordinarily requires significant start-up costs: parts must be ordered in bulk from conventional manufacturers, high-fidelity prototypes are expensive, and many style, size, and color combinations must be ordered and kept in inventory before launch. A designer may instead use the system described herein to create a set of designs with different colors, shapes, sizes, and other features. A database of user image data, anatomical models, and preferences provides a unique means of testing and previewing glasses across a large population sample. Design samples may be offered, and when a user views a design and wants to order it, the on-demand manufacturing and delivery methods can be used, so the designer or fashion brand never needs to hold inventory.
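Testing a design across a stored population of anatomical models, as described above, could be sketched with a toy fit metric. The fit rule used here (frame width within a tolerance of head width) and all measurements are simplifying assumptions for illustration.

```python
# Hypothetical sketch: estimate what fraction of a stored user population a
# candidate frame width would fit. The rule and numbers are illustrative.

def fit_rate(design_width_mm, head_widths_mm, tol=6.0):
    """Fraction of heads within `tol` mm of the design's width."""
    fits = [abs(design_width_mm - w) <= tol for w in head_widths_mm]
    return sum(fits) / len(fits)

heads = [128, 132, 136, 140, 145, 150]  # sample head widths (mm), made up
rate = fit_rate(design_width_mm=138, head_widths_mm=heads)
# rate -> 0.5
```

A designer could sweep candidate widths against the anatomical database this way before committing a size run, which is exactly the inventory risk the passage says the system avoids.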
In another embodiment, a system without an image analysis component can be used: an eye care professional takes physical measurements and enters the user's anatomical data into a computer system that generates a custom design from a configurable eyewear model. The professional or user can then provide preferences and refinements and have the glasses manufactured as described above.
Additional product
In another embodiment, all of the methods and techniques described herein are applied to customize, render, display, and manufacture a custom eyeglass case. The user can choose from a variety of materials, colors, designs, shapes, and features and see an accurate rendering of the case on his or her display. Moreover, the case can be automatically sized to fit the custom-designed eyeglasses so that there is no excess free space that would let the glasses shift around: the case can be automatically designed for a custom fit, minimizing its size while increasing its ability to protect the glasses during transport. The color, style, material, and manufacturing method of the case can also be matched to those used for the custom eyewear. Customized text, such as the user's name, can be engraved or marked on or in the case. The same manufacturing techniques described herein for eyewear can also be used to manufacture custom cases.
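The automatic case sizing described above could be sketched as taking the bounding box of the folded-frame geometry and adding a clearance and wall thickness. The clearance, wall, and vertex values are assumptions for illustration.

```python
# Sketch: size a custom case from the folded glasses' bounding box plus a
# fixed clearance and wall thickness. All dimensions here are made up.

def case_dimensions(frame_points, clearance=2.0, wall=3.0):
    """frame_points: (x, y, z) vertices of the folded glasses model (mm).

    Returns the case's outer (x, y, z) dimensions.
    """
    xs, ys, zs = zip(*frame_points)
    inner = [max(v) - min(v) + 2 * clearance for v in (xs, ys, zs)]
    return tuple(d + 2 * wall for d in inner)

# Bounding vertices of a folded frame (mm), e.g. from the parametric model.
pts = [(0, 0, 0), (145, 0, 0), (145, 55, 0), (0, 55, 35)]
dims = case_dimensions(pts)
# dims -> (155.0, 65.0, 45.0)
```

Minimizing these outer dimensions subject to the clearance constraint is the "size of the box is minimized" behavior the passage describes.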
One skilled in the art will recognize that the systems and methods described herein may also be used to customize, render, display, and manufacture other customized products. Because the described techniques build on custom image data, anatomical models, and product models, a wide variety of other products can be designed in a similar manner, such as: custom jewelry (e.g., bracelets, necklaces, earrings, rings, nose studs, tongue rings/studs, etc.), custom watches (watch faces, watchbands, etc.), custom cufflinks, custom bow ties and neckties, custom tie clips, custom hats, custom bras, inserts (pads), and other undergarments, custom swimwear, custom clothing (jackets, pants, shirts, skirts, etc.), custom baby bottle nipples and pacifiers (based on scanning and reproducing the mother's anatomy), custom prosthetics, custom helmets (motorcycle, bicycle, ski, snowboard, horse racing, F1, etc.), custom earplugs (active or passive hearing protection), custom audio earphone tips (on or in the ear), custom Bluetooth headset tips (on or in the ear), custom safety goggles or masks, and custom head-mounted displays.
As another example embodiment of a product, the following system and method describe a customized helmet product. Refer to fig. 33.
According to one embodiment, a method for creating a customized helmet is disclosed. The method includes capturing image data of a user using at least one computer system (3301 and 3302 show two users with different head shapes); determining, using at least one computer system, anatomical details and/or dimensions of the user; configuring (e.g., customizing shape, size, dimensions, color, finish, etc.), using at least one computer system and the user's anatomical data, a new customized helmet model for the user (a configurable helmet model 3303 is shown with a protective element 3304 and straps 3305); applying, using at least one computer system, the configurable helmet model to the image data or an anatomical model of the user; previewing, using at least one computer system, an image of the user with the configurable helmet model (a customized helmet model 3306 is shown on the user's head, adapted to the user's unique head shape); optionally adjusting and updating, using at least one computer system and/or user input, the preview of the configurable helmet model attributes (e.g., custom shape, size, dimensions, color, finish, etc.); preparing, using at least one computer system, instructions for manufacturing the custom helmet based on the previewed model; and manufacturing the new customized helmet using the at least one computer system and a manufacturing system.
According to one embodiment, a system for creating a customized helmet is disclosed. The system comprises: an image acquisition device configured to obtain image data of a user; an input device configured to receive instructions from the user; a display configured to display image data to the user; a manufacturing system configured to produce the customized helmet; a digital storage device storing instructions for creating and previewing the custom helmet; and a processor configured to execute the instructions to perform a method comprising: capturing image data of the user using at least one computer system; determining, using at least one computer system, anatomical details and/or dimensions of the user; configuring (e.g., customizing shape, size, dimensions, color, finish, etc.), using at least one computer system and the user's anatomical data, a new helmet model for the user; applying, using at least one computer system, the configurable helmet model to the image data or an anatomical model of the user; previewing, using at least one computer system, an image of the user with the configurable helmet model; optionally adjusting and updating, using at least one computer system and/or user input, the preview of the configurable helmet model attributes (e.g., custom shape, size, dimensions, color, finish, etc.); preparing, using at least one computer system, instructions for manufacturing the custom helmet based on the previewed model; and manufacturing the new customized helmet using the at least one computer system and the manufacturing system.

Claims (25)

1. A computer-implemented method for using a computer system to generate instructions for manufacturing an adjusted individual-specific eyewear product, the method comprising:
capturing one or more images of an anatomy of an individual;
quantifying at least a portion of an anatomical feature of the individual's anatomy;
generating, for the individual, a configurable parametric model of the individual-specific eyewear product by shaping a surface or contour of an adjustable frame portion of the configurable parametric model to match a quantified portion of the anatomical feature of the individual's anatomy, subject to one or more constraints of a fixed frame portion of the eyewear product, the configurable parametric model including the adjustable frame portion and the fixed frame portion;
generating a display of at least an adjustable frame portion of a configurable parametric model of the individual-specific eyewear product shaped to match the quantified portion of the anatomical feature of the individual's anatomy;
receiving a user input request to modify one or more adjustable geometric parameters of the configurable parametric model;
generating, for the individual, an updated parametric model of the individual-specific eyewear product based on the received request to modify the one or more adjustable geometric parameters of the configurable parametric model; and
generating an electronic file including instructions for manufacturing the adjustable frame portion of a physical embodiment of the adjusted individual-specific eyewear product from the updated parametric model.
2. The method of claim 1, further comprising:
converting the updated parametric model, instructions or electronic file into one or more of computer numerical control code, flat pattern, rapid prototyping instructions, instructions for additive manufacturing methods, or robotic control instructions.
3. The method of claim 1, further comprising:
determining one or more manufacturing metrics associated with manufacturing the physical embodiment of the individual-specific eyewear product, wherein the one or more manufacturing metrics include a selection of one or more raw materials, production capacity, future plans, product specifications, or quality checks.
4. The method of claim 1, further comprising:
generating, for quality control in manufacturing the physical embodiment of the individual-specific eyewear product, one or more templates, one or more criteria, or one or more three-dimensional models that include at least one of one or more geometric features of the individual's anatomy.
5. The method of claim 1, wherein the instructions for manufacturing the individual-specific eyewear product from the updated parametric model comprise one or more specifications, data structures, computer numerical control instructions, machine-readable instructions, technician instructions, or a model file interpretable by a manufacturing system.
6. The method of claim 1, further comprising:
determining a method for manufacturing the physical embodiment of the individual-specific eyewear product from the updated parametric model; and
modifying generation of the electronic file including instructions for manufacturing the physical embodiment of the individual-specific eyewear product based on the determined manufacturing method.
7. The method of claim 1, further comprising:
generating the instructions for manufacturing the physical embodiment of the individual-specific eyewear product, including instructions for automated or manual quality-control inspection.
8. The method of claim 1, further comprising:
identifying a subdivision of the updated parametric model as part of a flat pattern; and
generating the instructions for manufacturing the individual-specific eyewear product based on instructions for manufacturing the one or more flat patterns of the identified subdivision of the updated parametric model.
9. The method of claim 1, further comprising:
determining manipulations performed on the subdivision of the updated parametric model;
determining geometric information associated with the manipulations performed on the subdivision of the updated parametric model; and
generating the instructions for manufacturing the individual-specific eyewear product based on the geometric information associated with the manipulations performed on the subdivision of the updated parametric model.
10. The method of claim 9, wherein the manipulation comprises bending, folding, or forming performed by an automated device with the geometric information associated with the manipulation performed on the subdivision of the updated parametric model.
11. The method of claim 1, wherein the updated parametric model comprises prescription information, lens information, user information associated with the individual, lens laser marking, lens edge machining instructions, parameters entered by the individual or by a different user, or design attribute information.
12. The method of claim 11, wherein the design attribute information comprises one or more surfaces, finishes, markings, coatings, edging, markings, decorative markings, colors, patterns, or materials.
13. The method of claim 1, further comprising:
determining one or more stages for manufacturing the individual-specific eyewear product from the updated parametric model; and
generating a command sequence for at least one of one or more stages for manufacturing the individual-specific eyewear product in accordance with the updated parametric model, wherein the electronic file includes the command sequence.
14. The method of claim 1, further comprising:
determining one or more manufacturing metrics for manufacturing the individual-specific eyewear product from the updated parametric model; and
generating the electronic file including the instructions for manufacturing the individual-specific eyewear product based on optimization of the one or more manufacturing metrics.
15. The method of claim 1, further comprising:
determining a prediction associated with the manufacture of the individual-specific eyewear product from the updated parametric model, wherein the prediction is based on a time to complete the manufacture of the individual-specific eyewear product or an expected delivery date to complete the manufacture of the individual-specific eyewear product.
16. The method of claim 1, further comprising:
determining a time to complete the manufacture of the individual-specific eyewear product from the updated parametric model or determining an expected delivery date to complete the manufacture of the individual-specific eyewear product from the updated parametric model.
17. The method of claim 1, further comprising:
determining a price for manufacturing the individual-specific eyewear product from the updated parametric model.
18. The method of claim 1, wherein the individual-specific product comprises an eyeglass frame.
19. A computer system for generating instructions for manufacturing an individual-specific eyewear product, the system comprising:
a data storage device storing instructions for generating instructions for manufacturing an adjusted individual-specific eyewear product; and
a processor configured to execute the instructions to perform a method comprising:
capturing one or more images of an anatomy of an individual;
quantifying at least a portion of an anatomical feature of the individual's anatomy;
generating, for the individual, a configurable parametric model of the individual-specific eyewear product by shaping a surface or contour of an adjustable frame portion of the configurable parametric model to match a quantified portion of the anatomical feature of the individual's anatomy, subject to one or more constraints of a fixed frame portion of the eyewear product, the configurable parametric model including the adjustable frame portion and the fixed frame portion;
generating a display of at least an adjustable frame portion of a configurable parametric model of the individual-specific eyewear product shaped to match the quantified portion of the anatomical feature of the individual's anatomy;
receiving a user input request to modify one or more adjustable geometric parameters of the configurable parametric model;
generating, for the individual, an updated parametric model of the individual-specific eyewear product based on the received request to modify the one or more adjustable geometric parameters of the configurable parametric model; and
generating an electronic file including instructions for manufacturing the adjustable frame portion of a physical embodiment of the adjusted individual-specific eyewear product from the updated parametric model.
20. A non-transitory computer-readable medium for use on a computer system containing computer-executable programming instructions for generating instructions for manufacturing an adjusted individual-specific eyewear product according to a method comprising:
capturing one or more images of an anatomy of an individual;
quantifying at least a portion of an anatomical feature of the individual's anatomy;
generating, for an individual, a configurable parametric model of an individual-specific eyewear product by shaping a surface or contour of an adjustable frame portion of the configurable parametric model to match a quantified portion of the anatomical feature of the individual's anatomy, subject to one or more constraints of a fixed frame portion of the eyewear product, the configurable parametric model including the adjustable frame portion and the fixed frame portion;
generating a display of at least an adjustable frame portion of a configurable parametric model of the individual-specific eyewear product shaped to match the quantified portion of the anatomical feature of the individual's anatomy;
receiving a user input request to modify one or more adjustable geometric parameters of the configurable parametric model;
generating, for the individual, an updated parametric model of the individual-specific eyewear product based on the received request to modify the one or more adjustable geometric parameters of the configurable parametric model; and
generating an electronic file including instructions for manufacturing the adjustable frame portion of a physical embodiment of the individual-specific eyewear product from the updated parametric model.
21. A computer-implemented method for using a computer system to generate instructions for generating a model to adjust an individual-specific eyewear product, the method comprising:
receiving an anatomical model of a face of an individual;
identifying and quantifying one or more geometric features of the anatomical model of the individual's face;
determining one or more optical measurements for lens construction based on the one or more identified and quantified geometric features of the anatomical model; and determining a configurable parametric model for an individual, individual-specific eyewear product by generating one or more surfaces of a lens portion of the configurable parametric model based on the one or more determined optical measurements for lens construction;
generating a display of at least an adjustable frame portion of a configurable parametric model of the individual-specific eyewear product shaped to match the quantified portion of the anatomical feature of the individual's anatomy.
22. The method of claim 21, wherein the one or more optical measurements include a pupil distance, an apex distance, a forward tilt, a frame wrap, and/or a lens height of the lens portion of the configurable parametric model relative to a pupil of the anatomical model of the individual's face.
23. The method of claim 21, wherein the step of identifying and quantifying one or more geometric features of the anatomical model of the individual's face further comprises:
determining a position of eyes of the individual, a center of a nose of the individual, and/or a vertical position of a face of the individual based on the anatomical model of the face of the individual.
24. The method of claim 21, further comprising:
determining the configurable parametric model by generating one or more geometric parameters of the configurable parametric model based on the identified and quantified one or more geometric features of the anatomical model of the individual's face.
25. The method of claim 21, further comprising:
generating an electronic file comprising instructions for manufacturing a physical embodiment of the individual-specific eyewear product according to the determined configurable parametric model, wherein the physical embodiment of the individual-specific eyewear product comprises a progressive lens.
CN201810244436.3A 2013-08-22 2014-08-22 Method and system for creating customized products Active CN108537628B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201361869051P 2013-08-22 2013-08-22
US61/869,051 2013-08-22
US201462002738P 2014-05-23 2014-05-23
US62/002,738 2014-05-23
CN201480056330.0A CN105637512B (en) 2013-08-22 2014-08-22 For creating the method and system of customed product

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201480056330.0A Division CN105637512B (en) 2013-08-22 2014-08-22 For creating the method and system of customed product

Publications (2)

Publication Number Publication Date
CN108537628A CN108537628A (en) 2018-09-14
CN108537628B true CN108537628B (en) 2022-02-01

Family

ID=52480074

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810244436.3A Active CN108537628B (en) 2013-08-22 2014-08-22 Method and system for creating customized products
CN201480056330.0A Active CN105637512B (en) 2013-08-22 2014-08-22 For creating the method and system of customed product

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201480056330.0A Active CN105637512B (en) 2013-08-22 2014-08-22 For creating the method and system of customed product

Country Status (8)

Country Link
US (16) US9470911B2 (en)
EP (1) EP3036701A4 (en)
JP (2) JP6099232B2 (en)
KR (2) KR101821284B1 (en)
CN (2) CN108537628B (en)
AU (2) AU2014308590B2 (en)
CA (1) CA2921938C (en)
WO (1) WO2015027196A1 (en)

Families Citing this family (432)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8977377B2 (en) * 2010-02-25 2015-03-10 Jostens, Inc. Method for digital manufacturing of jewelry items
US9147222B2 (en) * 2010-06-23 2015-09-29 Digimarc Corporation Detecting encoded signals under adverse lighting conditions using adaptive signal detection
GB201102794D0 (en) * 2011-02-17 2011-03-30 Metail Ltd Online retail system
US9208265B2 (en) 2011-12-02 2015-12-08 Jostens, Inc. System and method for jewelry design
US10929904B1 (en) 2012-10-23 2021-02-23 Protolabs, Inc. Automated fabrication price quoting and fabrication ordering for computer-modeled structures
EP2916164B1 (en) * 2012-11-05 2021-06-02 Nikon Corporation Method for designing spectacle lens, and system for designing spectacle lens
US9582615B2 (en) 2013-01-16 2017-02-28 Jostens, Inc. Modeling using thin plate spline technology
US10159296B2 (en) 2013-01-18 2018-12-25 Riddell, Inc. System and method for custom forming a protective helmet for a customer's head
JP6456366B2 * 2013-06-07 2019-01-23 エシロール エンテルナショナル Method for determining an optical instrument
US9836846B2 (en) * 2013-06-19 2017-12-05 Commonwealth Scientific And Industrial Research Organisation System and method of estimating 3D facial geometry
CN103489107B (en) * 2013-08-16 2015-11-25 北京京东尚科信息技术有限公司 Method and apparatus for making a virtual fitting model image
US9470911B2 (en) 2013-08-22 2016-10-18 Bespoke, Inc. Method and system to create products
US9606701B1 (en) 2013-10-14 2017-03-28 Benko, LLC Automated recommended joining data with presented methods for joining in computer-modeled structures
US10373183B1 (en) 2013-10-16 2019-08-06 Alekhine, Llc Automatic firm fabrication price quoting and fabrication ordering for computer-modeled joining features and related structures
CN105580051B (en) * 2013-10-30 2019-05-14 英特尔公司 Image capture feedback
USD789228S1 (en) 2013-11-25 2017-06-13 Jostens, Inc. Bezel for a ring
FR3013620B1 (en) * 2013-11-26 2015-12-25 Essilor Int METHOD FOR BEVELING AN OPHTHALMIC LENS
WO2015081343A1 (en) * 2013-12-01 2015-06-04 Wildtrack Classification system for similar objects from digital images
US10169795B2 (en) * 2013-12-10 2019-01-01 Google Technology Holdings LLC Sizing wearable items by device scanning
WO2015092623A1 (en) * 2013-12-20 2015-06-25 Koninklijke Philips N.V. 3-d patient interface device adjustment system and method
CN105874376B (en) * 2014-01-03 2019-02-01 依视路国际公司 Method for determining an optical device including at least one optical lens and a spectacle frame
US10682826B2 (en) * 2014-01-06 2020-06-16 Madico, Inc. Platform for validating materials and cutting protective covers
US11537765B1 (en) 2014-02-20 2022-12-27 Benko, LLC Placement and pricing of part marks in computer-modeled structures
JP6269816B2 (en) * 2014-03-27 2018-01-31 日本電気株式会社 POS terminal, information processing apparatus, white balance adjustment method and program
US11410224B1 (en) * 2014-03-28 2022-08-09 Desprez, Llc Methods and software for requesting a pricing in an electronic marketplace using a user-modifiable spectrum interface
US20150277155A1 (en) * 2014-03-31 2015-10-01 New Eye London Ltd. Customized eyewear
US10552882B1 (en) 2014-05-20 2020-02-04 Desprez, Llc Methods and software for enabling custom pricing in an electronic commerce system
US9760935B2 (en) * 2014-05-20 2017-09-12 Modiface Inc. Method, system and computer program product for generating recommendations for products and treatments
US9951394B2 (en) * 2014-05-28 2018-04-24 National Beef Packing Company, Llc Hide routing systems and methods
US10713394B1 (en) 2014-06-12 2020-07-14 Benko, LLC Filtering components compatible with a computer-modeled structure
US10121178B2 (en) * 2014-06-13 2018-11-06 Ebay Inc. Three-dimensional eyeglasses modeling from two-dimensional images
US10852838B2 (en) * 2014-06-14 2020-12-01 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
US11392396B1 (en) 2014-06-24 2022-07-19 Desprez, Llc Systems and methods for automated help
US10025805B1 (en) 2014-06-24 2018-07-17 Benko, LLC Systems and methods for automated help
US20160027097A1 (en) * 2014-07-23 2016-01-28 Zenni Optical Inc. Visual Search Interface for Open Filters for Eyeglass Selection
US10460342B1 (en) 2014-08-12 2019-10-29 Benko, LLC Methods and software for providing targeted advertising to a product program
US9086582B1 (en) * 2014-08-20 2015-07-21 David Kind, Inc. System and method of providing custom-fitted and styled eyewear based on user-provided images and preferences
JPWO2016035181A1 (en) * 2014-09-03 2017-06-22 株式会社ニコン Imaging apparatus, information processing apparatus, and imaging system
US20160070822A1 (en) * 2014-09-09 2016-03-10 Primesmith Oy Method, Apparatus and Computer Program Code for Design and Visualization of a Physical Object
US10095217B2 (en) 2014-09-15 2018-10-09 Desprez, Llc Natural language user interface for computer-aided design systems
US10162337B2 (en) 2014-09-15 2018-12-25 Desprez, Llc Natural language user interface for computer-aided design systems
US11599086B2 (en) 2014-09-15 2023-03-07 Desprez, Llc Natural language user interface for computer-aided design systems
US9613020B1 (en) 2014-09-15 2017-04-04 Benko, LLC Natural language user interface for computer-aided design systems
US10528032B2 (en) * 2014-10-08 2020-01-07 Aetrex Worldwide, Inc. Systems and methods for generating a patterned orthotic device
US20160239976A1 (en) * 2014-10-22 2016-08-18 Pointivo, Inc. Photogrammetric methods and devices related thereto
US11023934B1 (en) 2014-10-30 2021-06-01 Desprez, Llc Business variable optimization for manufacture or supply of designed products
US11276095B1 (en) 2014-10-30 2022-03-15 Desprez, Llc Methods and software for a pricing-method-agnostic ecommerce marketplace for manufacturing services
US10073439B1 (en) 2014-10-31 2018-09-11 Desprez, Llc Methods, systems, and software for processing expedited production or supply of designed products
US10235009B1 (en) 2014-10-31 2019-03-19 Desprez, Llc Product variable optimization for manufacture or supply of designed products
US10836110B2 (en) 2014-10-31 2020-11-17 Desprez, Llc Method and system for ordering expedited production or supply of designed products
US11415961B1 (en) 2014-10-31 2022-08-16 Desprez, Llc Automated correlation of modeled product and preferred manufacturers
US9629698B2 (en) 2014-11-04 2017-04-25 James R. Glidewell Dental Ceramics, Inc. Method and apparatus for generation of 3D models with applications in dental restoration design
GB201420090D0 (en) * 2014-11-12 2014-12-24 Knyttan Ltd Image to item mapping
CN104618705B (en) * 2014-11-28 2017-04-05 深圳市魔眼科技有限公司 Distance-adaptive holographic display method and device based on eye tracking
JP6594129B2 (en) * 2014-11-28 2019-10-23 キヤノン株式会社 Information processing apparatus, information processing method, and program
KR102329814B1 (en) * 2014-12-01 2021-11-22 삼성전자주식회사 Pupilometer for 3d display
KR101665396B1 (en) * 2015-01-08 2016-10-13 이주성 Method for mapping a file
EP3262588A1 (en) * 2015-02-26 2018-01-03 Finity Technology Limited Computer implemented platform for the creation of a virtual product
GB2536060B (en) * 2015-03-06 2019-10-16 Specsavers Optical Group Ltd Virtual trying-on experience
US10194799B2 (en) * 2015-03-09 2019-02-05 Sanovas Intellectual Property, Llc Robotic ophthalmology
CN107250719B (en) 2015-03-10 2020-07-14 豪雅镜片泰国有限公司 Spectacle wearing parameter measurement system, measurement program, measurement method thereof, and method for manufacturing spectacle lens
US10803501B1 (en) 2015-03-17 2020-10-13 Desprez, Llc Systems, methods, and software for generating, customizing, and automatedly e-mailing a request for quotation for fabricating a computer-modeled structure from within a CAD program
US11004126B1 (en) 2016-03-17 2021-05-11 Desprez, Llc Systems, methods, and software for generating, customizing, and automatedly e-mailing a request for quotation for fabricating a computer-modeled structure from within a CAD program
RU2596062C1 (en) 2015-03-20 2016-08-27 Автономная Некоммерческая Образовательная Организация Высшего Профессионального Образования "Сколковский Институт Науки И Технологий" Method for correction of eye image using machine learning and method of machine learning
US10832197B1 (en) 2015-03-30 2020-11-10 EMC IP Holding Company LLC Creating and utilizing bill of work information to provide a dynamic routing plan for manufacturing a product
WO2016164859A1 (en) * 2015-04-10 2016-10-13 Bespoke, Inc. Systems and methods for creating eyewear with multi-focal lenses
US9885887B2 (en) * 2015-04-22 2018-02-06 Kurt Matthew Gardner Method of determining eyeglass frame measurements from an image by executing computer-executable instructions stored on a non-transitory computer-readable medium
US20170168323A1 (en) * 2015-04-22 2017-06-15 Kurt Matthew Gardner Method of Determining Eyeglass Fitting Measurements from an Image by Executing Computer-Executable Instructions Stored on a Non-Transitory Computer-Readable Medium
US11086148B2 (en) * 2015-04-30 2021-08-10 Oakley, Inc. Wearable devices such as eyewear customized to individual wearer parameters
CN106462726A (en) * 2015-05-06 2017-02-22 埃西勒国际通用光学公司 Frame recognition system and method
CN104814715A (en) * 2015-05-13 2015-08-05 张仕郎 Intelligent cloud optometry robot system
EP3098734A1 (en) * 2015-05-28 2016-11-30 Dassault Systèmes Querying a database with likeness criterion
NL2014891B1 (en) * 2015-05-29 2017-01-31 Maydo B V Method for manufacturing a spectacle frame adapted to a spectacle wearer.
US9532709B2 (en) * 2015-06-05 2017-01-03 Jand, Inc. System and method for determining distances from an object
US10012832B2 (en) 2015-06-22 2018-07-03 Microsoft Technology Licensing, Llc Automatic lens design using off-the-shelf components
NL1041388B1 (en) * 2015-07-02 2017-01-30 Mat Nv Glasses, assembly of such glasses and oxygen delivery means, also method for constructing such an assembly and aid for use therewith.
KR102146398B1 (en) * 2015-07-14 2020-08-20 삼성전자주식회사 Three dimensional content producing apparatus and three dimensional content producing method thereof
US10489812B2 (en) * 2015-07-15 2019-11-26 International Business Machines Corporation Acquiring and publishing supplemental information on a network
DE102015213832B4 (en) * 2015-07-22 2023-07-13 Adidas Ag Method and device for generating an artificial image
US10635901B2 (en) * 2015-08-11 2020-04-28 Sony Interactive Entertainment Inc. Head-mounted display
US9770165B2 (en) * 2015-08-13 2017-09-26 Jand, Inc. Systems and methods for displaying objects on a screen at a desired visual angle
US10007352B2 (en) 2015-08-21 2018-06-26 Microsoft Technology Licensing, Llc Holographic display system with undo functionality
US20180247426A1 (en) * 2015-08-28 2018-08-30 Lawrence Gluck System for accurate remote measurement
US10620778B2 (en) * 2015-08-31 2020-04-14 Rockwell Automation Technologies, Inc. Augmentable and spatially manipulable 3D modeling
US10459430B2 (en) * 2015-09-11 2019-10-29 Xerox Corporation Method and system for variable data printing in a 3D print system
WO2017042612A1 (en) 2015-09-12 2017-03-16 Shamir Optical Industry Ltd. Automatic eyewear measurement and specification
US10089522B2 (en) * 2015-09-29 2018-10-02 BinaryVR, Inc. Head-mounted display with facial expression detecting capability
US10359763B2 (en) 2015-10-19 2019-07-23 International Business Machines Corporation Automated prototype creation based on analytics and 3D printing
AU2016349939A1 (en) * 2015-11-03 2018-05-17 The Stainless Steel Monument Company Pty Ltd A design system and method
JP6572099B2 (en) * 2015-11-06 2019-09-04 キヤノン株式会社 Imaging apparatus, control method therefor, and program
JP6967510B2 * 2015-11-10 2021-11-17 Koninklijke Philips N.V. Determining information about the patient's face
US10220172B2 (en) * 2015-11-25 2019-03-05 Resmed Limited Methods and systems for providing interface components for respiratory therapy
FR3044430B1 (en) * 2015-11-26 2018-03-23 Ak Optique METHOD FOR MANUFACTURING A GOGGLE FRAME WITH MEASURED NASAL SUPPORTS
FR3044429B1 (en) * 2015-11-26 2018-01-05 Ak Optique METHOD FOR MANUFACTURING A CUSTOM GOGGLE FRAME
CN108292320A (en) * 2015-12-08 2018-07-17 索尼公司 Information processing unit, information processing method and program
US10706580B2 (en) * 2015-12-09 2020-07-07 Hajime Kasahara Position-information specifying method, position-information specifying device, and position-information specifying program
CN105469072B (en) * 2015-12-14 2019-05-31 依视路国际公司 Method and system for assessing the match between an eyeglass wearer and the worn spectacles
US10311182B2 (en) * 2015-12-16 2019-06-04 Dassault Systemes Topological change in a constrained asymmetrical subdivision mesh
EP3182306A1 (en) * 2015-12-16 2017-06-21 Dassault Systèmes Topological change in a constrained asymmetrical subdivision mesh
CN106887024B (en) * 2015-12-16 2019-09-17 腾讯科技(深圳)有限公司 Photo processing method and processing system
EP3182378B1 (en) * 2015-12-16 2021-08-11 Dassault Systèmes Modification of a constrained asymmetrical subdivision mesh
WO2017102003A1 (en) * 2015-12-17 2017-06-22 Essilor International (Compagnie Generale D'optique) Distributed optical job and manufacturing computation systems and methods
KR101867564B1 (en) * 2015-12-29 2018-07-17 엘리셀 주식회사 Method and apparatus for manufacturing a personalized mask pack
US10614288B2 (en) * 2015-12-31 2020-04-07 Cerner Innovation, Inc. Methods and systems for detecting stroke symptoms
CN105894533A (en) * 2015-12-31 2016-08-24 乐视移动智能信息技术(北京)有限公司 Method and system for motion-sensing control based on a smart device, and smart device
US10321123B2 (en) 2016-01-05 2019-06-11 Reald Spark, Llc Gaze correction of multi-view images
US10564628B2 (en) * 2016-01-06 2020-02-18 Wiivv Wearables Inc. Generating of 3D-printed custom wearables
WO2017137948A1 (en) * 2016-02-10 2017-08-17 Vats Nitin Producing realistic body movement using body images
US10775266B2 (en) * 2016-02-12 2020-09-15 Shamir Optical Industry Ltd Methods and systems for testing of eyeglasses
JP6762003B2 (en) 2016-02-29 2020-09-30 国立大学法人神戸大学 Object surface correction method and workpiece processing method
ES2604806B1 (en) * 2016-03-03 2018-07-25 Horizons Optical S.L.U. PROCEDURE FOR ORDERING CORRESPONDING MANUFACTURING GLASSES AND MANUFACTURING PROCEDURES AND SUPPLY AND DEVICE
EP3424009A1 (en) * 2016-03-04 2019-01-09 Essilor International Method of ordering an ophthalmic lens and corresponding system
US9460557B1 (en) * 2016-03-07 2016-10-04 Bao Tran Systems and methods for footwear fitting
EP3428877A4 (en) 2016-03-09 2019-10-30 Nikon Corporation Detection device, information processing device, detection method, detection program, and detection system
BE1023456B1 (en) * 2016-03-09 2017-03-27 Fit Things Nv Cutting device and method
WO2017163643A1 (en) * 2016-03-22 2017-09-28 富士フイルム株式会社 Fit adjustment value-outputting eyeglasses system, fit adjustment value-outputting eyeglasses, fit information-detecting eyeglasses, program to be used in fit adjustment value-outputting eyeglasses system, and adjustment function-equipped eyeglasses
EP3432776B1 (en) * 2016-03-23 2023-03-29 The Procter & Gamble Company Imaging method for determining stray fibers
US11423449B1 (en) 2016-03-23 2022-08-23 Desprez, Llc Electronic pricing machine configured to generate prices based on supplier willingness and a user interface therefor
GB2549071B (en) * 2016-03-23 2020-11-11 Sony Interactive Entertainment Inc 3D printing system
US10556309B1 (en) 2016-03-24 2020-02-11 Proto Labs Inc. Methods of subtractively manufacturing a plurality of discrete objects from a single workpiece using a removable fixating material
US10176275B1 (en) 2016-03-28 2019-01-08 Luvlyu, Inc. Breast shape visualization and modeling tool
DE102016106121A1 (en) 2016-04-04 2017-10-05 Carl Zeiss Ag Method and device for determining parameters for spectacle fitting
CN105708467B (en) * 2016-04-06 2017-12-29 广州小亮点科技有限公司 Method for measuring actual human-body distances and customizing a spectacle frame
US10579203B2 (en) * 2016-04-13 2020-03-03 Intel Corporation Wellness mirror
US10401824B2 (en) 2016-04-14 2019-09-03 The Rapid Manufacturing Group LLC Methods and software for reducing machining equipment usage when machining multiple objects from a single workpiece
WO2017181151A1 (en) * 2016-04-14 2017-10-19 Cornell University Methods for incremental 3d printing and 3d printing arbitrary wireframe meshes
JP6734690B2 (en) * 2016-04-15 2020-08-05 株式会社Shoei Method for adjusting helmet comfort and information processing device for adjustment
EP3455671A1 (en) 2016-05-10 2019-03-20 Materialise N.V. Method of designing and placing a lens within a spectacles frame
US20170337516A1 (en) * 2016-05-18 2017-11-23 Kuiu, Inc. Distributed product development and funding system
US10672055B2 (en) * 2016-05-23 2020-06-02 Oath Inc. Method and system for presenting personalized products based on digital signage for electronic commerce
CA3024874A1 (en) * 2016-06-01 2017-12-07 Vidi Pty Ltd An optical measuring and scanning system and methods of use
CN105842875B (en) * 2016-06-07 2018-07-24 杭州美戴科技有限公司 Spectacle frame design method based on three-dimensional facial measurement
EP3258308A1 (en) * 2016-06-13 2017-12-20 ESSILOR INTERNATIONAL (Compagnie Générale d'Optique) Frame for a head mounted device
US10402068B1 (en) 2016-06-16 2019-09-03 Amazon Technologies, Inc. Film strip interface for interactive content
US10417356B1 (en) 2016-06-16 2019-09-17 Amazon Technologies, Inc. Physics modeling for interactive content
US10018854B2 (en) * 2016-06-22 2018-07-10 Indizen Optical Technologies of America, LLC Custom ophthalmic lens design derived from multiple data sources
CN106200918B (en) * 2016-06-28 2019-10-01 Oppo广东移动通信有限公司 AR-based information display method, device, and mobile terminal
EP3264286B1 (en) 2016-06-28 2020-11-18 Dassault Systèmes Querying a database with morphology criterion
FR3053509B1 (en) * 2016-06-30 2019-08-16 Fittingbox METHOD FOR OCCULATING AN OBJECT IN AN IMAGE OR A VIDEO AND ASSOCIATED AUGMENTED REALITY METHOD
WO2018002533A1 (en) * 2016-06-30 2018-01-04 Fittingbox Method for concealing an object in an image or a video and associated augmented reality method
CN106205633B (en) * 2016-07-06 2019-10-18 李彦芝 Imitation and performance practice scoring system
US10780338B1 (en) 2016-07-20 2020-09-22 Riddell, Inc. System and methods for designing and manufacturing bespoke protective sports equipment
US10216011B2 (en) * 2016-07-25 2019-02-26 iCoat Company, LLC Eyewear measuring systems, methods and devices
CN106054405A (en) * 2016-07-27 2016-10-26 深圳市金立通信设备有限公司 Lens adjusting method and terminal
CN109478294A (en) * 2016-07-28 2019-03-15 依视路国际公司 Glasses matching tool
US10534809B2 (en) * 2016-08-10 2020-01-14 Zeekit Online Shopping Ltd. Method, system, and device of virtual dressing utilizing image processing, machine learning, and computer vision
US10290136B2 (en) 2016-08-10 2019-05-14 Zeekit Online Shopping Ltd Processing user selectable product images and facilitating visualization-assisted coordinated product transactions
US20180068449A1 (en) * 2016-09-07 2018-03-08 Valve Corporation Sensor fusion systems and methods for eye-tracking applications
WO2018051160A1 (en) * 2016-09-16 2018-03-22 Essilor International Systems and methods to provide customized product information
WO2018053703A1 (en) * 2016-09-21 2018-03-29 Intel Corporation Estimating accurate face shape and texture from an image
WO2018054718A1 (en) * 2016-09-22 2018-03-29 Essilor International Method for determining an electronic spectacle frame element
CN107865473B (en) * 2016-09-26 2019-10-25 华硕电脑股份有限公司 Human-body-feature distance measuring device and distance measuring method
US9990780B2 (en) * 2016-10-03 2018-06-05 Ditto Technologies, Inc. Using computed facial feature points to position a product model relative to a model of a face
US10328686B2 (en) 2016-10-10 2019-06-25 Microsoft Technology Licensing, Llc Determining print-time for generation of 3D objects based on 3D printing parameters
JP6744191B2 (en) * 2016-10-12 2020-08-19 株式会社日立製作所 Parts ordering method selecting device, parts ordering method selecting method and program
CN106444676B (en) * 2016-10-21 2019-03-01 上海安宸信息科技有限公司 Intelligent control system based on wide-area network coverage
US11361003B2 * 2016-10-26 2022-06-14 salesforce.com, inc. Data clustering and visualization with determined group number
KR101851303B1 (en) 2016-10-27 2018-04-23 주식회사 맥스트 Apparatus and method for reconstructing 3d space
US10445938B1 (en) * 2016-11-09 2019-10-15 Snap Inc. Modifying multiple objects within a video stream
WO2018087386A1 (en) * 2016-11-14 2018-05-17 Themagic5 User-customised goggles
EP3321817A1 (en) 2016-11-14 2018-05-16 Dassault Systèmes Querying a database based on a parametric view function
US11559378B2 (en) 2016-11-17 2023-01-24 James R. Glidewell Dental Ceramics, Inc. Scanning dental impressions
US10284731B2 (en) 2016-11-29 2019-05-07 Echostar Technologies L.L.C. Apparatus, systems and methods for generating 3D model data from a media content event
US11226831B2 (en) 2016-12-05 2022-01-18 Facebook, Inc. Customizing content based on predicted user preferences
US10715520B2 (en) * 2016-12-08 2020-07-14 Mastercard International Incorporated Systems and methods for decentralized biometric enrollment
US11609547B2 (en) * 2016-12-19 2023-03-21 Autodesk, Inc. Gestural control of an industrial robot
EP3339943A1 (en) * 2016-12-21 2018-06-27 Fielmann Ventures GmbH Method and system for obtaining optometric parameters for fitting eyeglasses
CN106682424A (en) * 2016-12-28 2017-05-17 上海联影医疗科技有限公司 Medical image adjusting method and medical image adjusting system
US10545481B2 (en) 2016-12-28 2020-01-28 Proto Labs Inc Methods and software for providing graphical representations of a plurality of objects in a central through opening
US10675857B2 (en) * 2016-12-30 2020-06-09 Konica Minolta Business Solutions U.S.A., Inc. Patterns for 3D printing
WO2018129094A1 (en) * 2017-01-06 2018-07-12 Intuitive Surgical Operations, Inc. System and method for registration and coordinated manipulation of augmented reality image components
EP3568835A1 (en) * 2017-01-12 2019-11-20 Atos Spain Sociedad Anonima A system for manufacturing personalized products by means of additive manufacturing doing an image-based recognition using electronic devices with a single camera
EP3355101B1 (en) * 2017-01-27 2019-05-15 Carl Zeiss Vision International GmbH Computer-implemented method for determining a representation of a spectacle frame rim or a representation of the edges of the lenses of a pair of spectacles
EP3355103A1 (en) * 2017-01-27 2018-08-01 Carl Zeiss AG Computer-implemented method for determining centring parameters
EP3355214A1 (en) * 2017-01-27 2018-08-01 Carl Zeiss Vision International GmbH Method, computing device and computer program for spectacle frame design
CN106951678A (en) * 2017-02-20 2017-07-14 集合智造(北京)餐饮管理有限公司 Method and system for information collection and processing in flexible food manufacturing
GB2559978A (en) * 2017-02-22 2018-08-29 Fuel 3D Tech Limited Systems and methods for obtaining eyewear information
JP6824381B2 (en) * 2017-02-27 2021-02-03 Vivita株式会社 CAD equipment and programs
JP2018144380A (en) * 2017-03-07 2018-09-20 富士ゼロックス株式会社 Fabrication management system and fabrication management control apparatus
US10860748B2 (en) * 2017-03-08 2020-12-08 General Electric Company Systems and method for adjusting properties of objects depicted in computer-aid design applications
KR102037573B1 (en) * 2017-03-09 2019-10-29 주식회사 파트너스앤코 Diagnosis data processing apparatus based on interview data and camera, and system thereof
JP6558388B2 (en) * 2017-03-14 2019-08-14 オムロン株式会社 Image processing device
JP6961972B2 (en) * 2017-03-24 2021-11-05 富士フイルムビジネスイノベーション株式会社 Three-dimensional shape molding equipment, information processing equipment and programs
DE102017107346A1 (en) * 2017-04-05 2018-10-11 Carl Zeiss Ag Device for power supply of and / or communication with an eye implant by means of illumination radiation
US10307908B2 (en) * 2017-04-07 2019-06-04 X Development Llc Methods and systems for establishing and maintaining a pre-build relationship
WO2018191784A1 (en) * 2017-04-19 2018-10-25 SPEQS Limited Eyeglasses ordering system and digital interface therefor
WO2018199890A1 (en) * 2017-04-24 2018-11-01 Truthvision, Inc. A system and method for measuring pupillary distance
US10664903B1 (en) * 2017-04-27 2020-05-26 Amazon Technologies, Inc. Assessing clothing style and fit using 3D models of customers
US10777018B2 (en) * 2017-05-17 2020-09-15 Bespoke, Inc. Systems and methods for determining the scale of human anatomy from images
US10234848B2 (en) * 2017-05-24 2019-03-19 Relativity Space, Inc. Real-time adaptive control of additive manufacturing processes using machine learning
US10621779B1 (en) * 2017-05-25 2020-04-14 Fastvdo Llc Artificial intelligence based generation and analysis of 3D models
EP3410178A1 (en) 2017-06-01 2018-12-05 Carl Zeiss Vision International GmbH Method, device and computer program for virtual adapting of a spectacle frame
EP3413122B1 (en) * 2017-06-08 2020-03-04 Carl Zeiss Vision International GmbH Method, device and computer program for determining a close-up viewpoint
US10810647B2 (en) * 2017-06-09 2020-10-20 International Business Machines Corporation Hybrid virtual and physical jewelry shopping experience
JP2019537758A (en) * 2017-06-12 Midea Group Co., Ltd. Control method, controller, smart mirror, and computer-readable storage medium
US20180357819A1 (en) * 2017-06-13 2018-12-13 Fotonation Limited Method for generating a set of annotated images
WO2018236285A1 (en) * 2017-06-22 2018-12-27 Neitas Pte. Ltd. Information processing device
EP3422087B1 (en) * 2017-06-28 2019-12-25 Carl Zeiss Vision International GmbH Method for correcting centring parameters and/or an axis position and corresponding computer programme and method
EP3422708A1 (en) * 2017-06-29 2019-01-02 Koninklijke Philips N.V. Apparatus and method for generating an image
EP3422278A1 (en) * 2017-06-29 2019-01-02 MTG Co., Ltd. Commercial product size determination device and commercial product size determination method
TW201907334A (en) * 2017-07-03 2019-02-16 華碩電腦股份有限公司 Electronic apparatus, image processing method and non-transitory computer-readable recording medium
WO2019010134A1 (en) 2017-07-03 2019-01-10 Hunsmann Margrit Sieglinde Color engine
JP6514278B2 (en) * 2017-07-04 2019-05-15 ファナック株式会社 Laser processing robot system
CN107424047A (en) * 2017-07-06 2017-12-01 广州视源电子科技股份有限公司 Method and system for customizing a product
ES2753645T3 (en) 2017-07-06 2020-04-13 Zeiss Carl Vision Int Gmbh Procedure, device and computer program for the virtual adaptation of a spectacle frame
EP3425446B1 (en) 2017-07-06 2019-10-30 Carl Zeiss Vision International GmbH Method, device and computer program for virtual adapting of a spectacle frame
CN109429060B (en) * 2017-07-07 2020-07-28 京东方科技集团股份有限公司 Pupil distance measuring method, wearable eye equipment and storage medium
US10386647B1 (en) * 2017-07-11 2019-08-20 Facebook, Inc. Magnetic interpupillary distance adjustment
EP3651995A4 (en) * 2017-07-14 2021-04-21 MRK Fine Arts, LLC Image selection and sizing for jewelry
FR3069687B1 (en) * 2017-07-25 2021-08-06 Fittingbox PROCESS FOR DETERMINING AT LEAST ONE PARAMETER ASSOCIATED WITH AN OPHTHALMIC DEVICE
WO2019025665A1 (en) * 2017-08-03 2019-02-07 Oyyo Method for determining dimensional characteristics of an item of eyewear, and corresponding item of eyewear and production method
CN108305317B (en) * 2017-08-04 2020-03-17 腾讯科技(深圳)有限公司 Image processing method, device and storage medium
US10762605B2 (en) * 2017-08-04 2020-09-01 Outward, Inc. Machine learning based image processing techniques
US10740985B2 (en) 2017-08-08 2020-08-11 Reald Spark, Llc Adjusting a digital representation of a head region
US20190070787A1 (en) * 2017-08-10 2019-03-07 William Marsh Rice University Machine learning enabled model for predicting the spreading process in powder-bed three-dimensional printing
US20190057180A1 (en) * 2017-08-18 2019-02-21 International Business Machines Corporation System and method for design optimization using augmented reality
CN107633398A (en) 2017-08-24 2018-01-26 阿里巴巴集团控股有限公司 Method for displaying image and device and electronic equipment
KR101976086B1 (en) * 2017-08-31 2019-08-28 김지환 Method for Providing Augmented Hat Service
WO2019045735A1 (en) 2017-08-31 2019-03-07 General Electric Company Distribution of customized engineering models for additive manufacturing
US11244505B2 (en) * 2017-08-31 2022-02-08 Sony Group Corporation Methods of constructing a printable 3D model, and related devices and computer program products
US20190072934A1 (en) * 2017-09-01 2019-03-07 Debbie Eunice Stevens-Wright Parametric portraiture design and customization system
US10507337B2 (en) * 2017-09-13 2019-12-17 Elekta, Inc. Radiotherapy treatment plan optimization workflow
US10881827B2 (en) * 2017-09-29 2021-01-05 Koninklijke Philips N.V. Providing a mask for a patient based on a temporal model generated from a plurality of facial scans
US11844414B2 (en) * 2017-10-20 2023-12-19 L'oreal Method for manufacturing a personalized applicator for the application of a cosmetic composition
KR102421539B1 (en) * 2017-10-20 2022-07-14 로레알 Method of making custom applicators for application of cosmetic compositions
US10613710B2 (en) * 2017-10-22 2020-04-07 SWATCHBOOK, Inc. Product simulation and control system for user navigation and interaction
US10585420B2 (en) 2017-11-06 2020-03-10 Abemis LLC Method and system to generate three-dimensional meta-structure model of a workpiece
KR20190061770A (en) 2017-11-28 2019-06-05 양희재 Service servers and methods for making and selling customized jewelry
US11157985B2 (en) * 2017-11-29 2021-10-26 Ditto Technologies, Inc. Recommendation system, method and computer program product based on a user's physical features
CN108122190B (en) * 2017-12-06 2021-06-01 中国航空工业集团公司西安航空计算技术研究所 Method for assembling vertex shading task attribute data in a GPU unified shader array
WO2019110012A1 (en) * 2017-12-08 2019-06-13 Shanghaitech University Face detection and recognition method using light field camera system
US10413172B2 (en) * 2017-12-11 2019-09-17 1-800 Contacts, Inc. Digital visual acuity eye examination for remote physician assessment
US10712811B2 (en) 2017-12-12 2020-07-14 Facebook, Inc. Providing a digital model of a corresponding product in a camera feed
EP3499447A1 (en) * 2017-12-12 2019-06-19 Facebook, Inc. Providing a digital model of corresponding product in a camera feed
US11281824B2 (en) 2017-12-13 2022-03-22 Dassault Systemes Simulia Corp. Authoring loading and boundary conditions for simulation scenarios
CN108230385B (en) * 2017-12-20 2022-01-14 湖南大学 Single-camera motion method and device for counting ultra-high stacks of ultra-thin cigarette labels
EP3502934A1 (en) * 2017-12-21 2019-06-26 Quicket GmbH Methods and systems for generating a representation of a seated person using facial measurements
US10620454B2 (en) 2017-12-22 2020-04-14 Optikam Tech, Inc. System and method of obtaining fit and fabrication measurements for eyeglasses using simultaneous localization and mapping of camera images
US11579472B2 (en) 2017-12-22 2023-02-14 Optikam Tech, Inc. System and method of obtaining fit and fabrication measurements for eyeglasses using depth map scanning
CN108171577A (en) * 2017-12-27 2018-06-15 大连鉴影光学科技有限公司 Intelligent eyewear matching system for a C2M platform
US10643446B2 (en) 2017-12-28 2020-05-05 Cerner Innovation, Inc. Utilizing artificial intelligence to detect objects or patient safety events in a patient room
JP6899097B2 (en) * 2017-12-29 2021-07-07 京セラドキュメントソリューションズ株式会社 Image processing device, image processing method and image processing program
US10433630B2 (en) * 2018-01-05 2019-10-08 L'oreal Cosmetic applicator system including trackable cosmetic device, and client device to assist users in makeup application
US11370185B2 (en) 2018-01-11 2022-06-28 E-Vision Smart Optics, Inc. Three-dimensional (3D) printing of electro-active lenses
US20190244274A1 (en) * 2018-02-06 2019-08-08 Perfect Corp. Systems and methods for recommending products based on facial analysis
US11115492B2 (en) * 2018-02-13 2021-09-07 Ebay, Inc. Methods and system for determining parameters for a product based on analysis of input media
US10574881B2 (en) * 2018-02-15 2020-02-25 Adobe Inc. Smart guide to capture digital images that align with a target image model
JP6560383B1 (en) * 2018-02-16 2019-08-14 ヤフー株式会社 Information display program, information display method, information display device, and distribution device
US11017575B2 (en) 2018-02-26 2021-05-25 Reald Spark, Llc Method and system for generating data to provide an animated visual representation
EP3537239A1 (en) 2018-03-06 2019-09-11 Siemens Aktiengesellschaft Method for operating a machine tool by adapting a precompiled data model
WO2019185856A1 (en) * 2018-03-29 2019-10-03 Asimos Oy Method for manufacturing one-piece corrective eyewear
KR102039171B1 (en) * 2018-03-30 2019-10-31 경일대학교산학협력단 Mirrored apparatus for performing virtual fitting using artificial neural network, method thereof and computer recordable medium storing program to perform the method
CN111971609B (en) * 2018-04-06 2023-03-14 依视路国际公司 Method for customizing a head mounted device adapted for generating a virtual image
US10789725B2 (en) * 2018-04-22 2020-09-29 Cnoga Medical Ltd. BMI, body and other object measurements from camera view display
KR102091643B1 (en) * 2018-04-23 2020-03-20 (주)이스트소프트 Apparatus for processing image using artificial neural network, method thereof and computer recordable medium storing program to perform the method
US10831042B2 (en) 2018-05-03 2020-11-10 Optikam Tech, Inc. System and method for obtaining and utilizing measurements to enable customized eyewear to be purchased online
CN112585667A (en) * 2018-05-16 2021-03-30 康耐克斯数字有限责任公司 Intelligent platform counter display system and method
US11257297B1 (en) * 2018-06-15 2022-02-22 Baru Inc. System, method, and computer program product for manufacturing a customized product
JP7326707B2 (en) * 2018-06-21 2023-08-16 カシオ計算機株式会社 Robot, robot control method and program
US10762702B1 (en) * 2018-06-22 2020-09-01 A9.Com, Inc. Rendering three-dimensional models on mobile devices
US11676157B2 (en) 2018-07-13 2023-06-13 Shiseido Company, Limited System and method for adjusting custom topical agents
EP3598209A1 (en) * 2018-07-18 2020-01-22 Essilor International Method for determining an ophthalmic equipment and associated system
EP3598210A1 (en) * 2018-07-18 2020-01-22 Essilor International Method for determining an ophthalmic component
EP3598208A1 (en) * 2018-07-18 2020-01-22 Essilor International Method for determining a component of an ophthalmic equipment and associated system
CN112654914B (en) 2018-07-26 2023-04-11 欧科蕾公司 Lenses for eyeglasses and other head-mounted supports with improved optics
WO2020037279A1 (en) 2018-08-16 2020-02-20 Riddell, Inc. System and method for designing and manufacturing a protective helmet
CN109191247A (en) * 2018-08-23 2019-01-11 郑植 A jewelry customization system and method
KR102149395B1 (en) * 2018-08-31 2020-08-31 주식회사 더메이크 System for providing eyewear wearing and recommendation services using a true depth camera and method of the same
US10481321B1 (en) * 2018-09-06 2019-11-19 Facebook Technologies, Llc Canted augmented reality display for improved ergonomics
FR3085766B1 (en) * 2018-09-06 2020-10-16 Sidel Participations COMPUTER ASSISTANCE PROCESS IN THE MANAGEMENT OF A PRODUCTION LINE
US11209650B1 (en) 2018-09-06 2021-12-28 Facebook Technologies, Llc Waveguide based display with multiple coupling elements for artificial reality
EP3621019A1 (en) 2018-09-07 2020-03-11 Giffits GmbH Computer-implemented method for generating supply and/or order data for an individual object
US10909225B2 (en) * 2018-09-17 2021-02-02 Motorola Mobility Llc Electronic devices and corresponding methods for precluding entry of authentication codes in multi-person environments
JP7331088B2 (en) * 2018-09-24 2023-08-22 マジック リープ, インコーポレイテッド Method and system for 3D model sharing
TWI676873B (en) * 2018-10-01 2019-11-11 財團法人工業技術研究院 Tools monitoring system and monitoring method thereof
US10943278B2 (en) * 2018-10-10 2021-03-09 Capital One Services, Llc Systems and methods for SMS e-commerce assistant
WO2020079510A1 (en) * 2018-10-16 2020-04-23 Abhay Swamy System and method for production and customization of material
CN111090263B (en) * 2018-10-23 2023-11-28 杨宇 Customized intelligent production line control system and control method
CN109483066A (en) * 2018-10-24 2019-03-19 宁波欧尼克科技有限公司 A processing system for unstructured components
DE102018008669A1 (en) * 2018-11-05 2020-05-07 Shape Engineering GmbH Process for the manufacture of spectacle lenses
US10685457B2 (en) * 2018-11-15 2020-06-16 Vision Service Plan Systems and methods for visualizing eyewear on a user
US11167198B2 (en) 2018-11-21 2021-11-09 Riddell, Inc. Football helmet with components additively manufactured to manage impact forces
US20200159040A1 (en) * 2018-11-21 2020-05-21 Kiritz Productions LLC, VR Headset Stabilization Design and Nose Insert Series Method and apparatus for enhancing vr experiences
CN109558665B (en) * 2018-11-22 2023-01-10 杭州美戴科技有限公司 Automatic design method of personalized flexible nose pad
JP6852141B2 (en) * 2018-11-29 2021-03-31 キヤノン株式会社 Information processing device, imaging device, control method of information processing device, and program
US11922649B2 (en) * 2018-11-30 2024-03-05 Arithmer Inc. Measurement data calculation apparatus, product manufacturing apparatus, information processing apparatus, silhouette image generating apparatus, and terminal apparatus
EP3895115A1 (en) * 2018-12-10 2021-10-20 Essilor International System, device, and method for determining optical article feasibility
US11107241B2 (en) * 2018-12-11 2021-08-31 Seiko Epson Corporation Methods and systems for training an object detection algorithm using synthetic images
DE102018009811A1 (en) 2018-12-13 2020-06-18 YOU MAWO GmbH Method for generating manufacturing data for manufacturing glasses for a person
CN113196139B (en) * 2018-12-20 2023-08-11 美国斯耐普公司 Flexible eye-wear device with dual cameras for generating stereoscopic images
US10825260B2 (en) * 2019-01-04 2020-11-03 Jand, Inc. Virtual try-on systems and methods for spectacles
US11227075B2 (en) * 2019-01-25 2022-01-18 SWATCHBOOK, Inc. Product design, configuration and decision system using machine learning
CN109733050B (en) * 2019-02-19 2020-11-06 大连声鹭科技有限公司 Self-help record-recording and self-help seal-making method
FR3092987A1 (en) * 2019-02-22 2020-08-28 Deepsmile Technology PROCESS FOR VISUALIZING THE IMPACT OF A DENTAL TREATMENT PLAN
US11234893B2 (en) * 2019-02-27 2022-02-01 Steven A. Shubin, Sr. Method and system of creating a replica of an anatomical structure
KR102646684B1 (en) * 2019-02-28 2024-03-13 삼성전자 주식회사 Electronic device and method for generating contents
ES2886876T3 (en) * 2019-03-08 2021-12-21 Exocad Gmbh Computer-implemented modeling of an individualized dental prosthetic piece for the patient
KR102579621B1 (en) * 2019-03-14 2023-09-18 미쯔이가가꾸가부시끼가이샤 Lens ordering system, lens ordering method, program and data structure
WO2020208421A1 (en) * 2019-04-09 2020-10-15 Shiseido Company, Limited System and method for creation of topical agents with improved image capture
EP3726474A1 (en) * 2019-04-19 2020-10-21 Koninklijke Philips N.V. Methods and systems for handling virtual 3d object surface interaction
CN110113528B (en) 2019-04-26 2021-05-07 维沃移动通信有限公司 Parameter obtaining method and terminal equipment
CN111861608B (en) * 2019-04-29 2024-04-30 杭州优工品科技有限公司 Product customization method and device based on three-dimensional online visualization and storage medium
US11030801B2 (en) * 2019-05-17 2021-06-08 Standard Cyborg, Inc. Three-dimensional modeling toolkit
CN110348944A (en) * 2019-06-10 2019-10-18 脉脉眼镜科技(深圳)有限责任公司 Remote optometry and internet eyeglasses marketing system and method with physical trial display
KR102294822B1 (en) * 2019-06-10 2021-08-26 김용만 Eyeglasses manufacturing body data measuring device using stereo vision
US11540906B2 (en) 2019-06-25 2023-01-03 James R. Glidewell Dental Ceramics, Inc. Processing digital dental impression
US11534271B2 (en) 2019-06-25 2022-12-27 James R. Glidewell Dental Ceramics, Inc. Processing CT scan of dental impression
US11622843B2 (en) 2019-06-25 2023-04-11 James R. Glidewell Dental Ceramics, Inc. Processing digital dental impression
US11238611B2 (en) * 2019-07-09 2022-02-01 Electric Avenue Software, Inc. System and method for eyewear sizing
US20220252905A1 (en) * 2019-07-18 2022-08-11 Essilor International System and method for determining at least one feature of at least one lens mounted in a spectacle frame
EP4003824A1 (en) 2019-07-22 2022-06-01 Specialized Bicycle Components, Inc. Bicycle saddle
WO2021022158A1 (en) * 2019-08-01 2021-02-04 The Regents Of The University Of Colorado, A Body Corporate Eyeglass frames for treatment of dry eye syndrome and corresponding 3d printing systems and methods
US11488239B2 (en) * 2019-08-26 2022-11-01 Warby Parker Inc. Virtual fitting systems and methods for spectacles
US11684819B2 (en) 2019-08-29 2023-06-27 Wahoo Fitness, LLC Indoor bicycle training device
KR20210026172A (en) * 2019-08-29 2021-03-10 엘지전자 주식회사 Multimedia device and method for controlling the same
KR20220050990A (en) * 2019-08-29 2022-04-25 와후 피트니스 엘.엘.씨. Indoor Bicycle Control Method and System
WO2021044937A1 (en) * 2019-09-06 2021-03-11 ソニー株式会社 Information processing device, information processing method, and information processing program
US10991067B2 (en) 2019-09-19 2021-04-27 Zeekit Online Shopping Ltd. Virtual presentations without transformation-induced distortion of shape-sensitive areas
EP4022386A1 (en) 2019-09-24 2022-07-06 Bespoke, Inc. d/b/a Topology Eyewear Systems and methods for adjusting stock eyewear frames using a 3d scan of facial features
KR102293038B1 (en) * 2019-09-26 2021-08-26 주식회사 더메이크 System and method for recommending eyewear based on sales data by face type and size
EP3798944A1 (en) * 2019-09-30 2021-03-31 Hoya Lens Thailand Ltd. Learning model generation method, computer program, eyeglass lens selection support method, and eyeglass lens selection support system
TWI731442B (en) * 2019-10-18 2021-06-21 宏碁股份有限公司 Electronic apparatus and object information recognition method by using touch data thereof
CN110866970B (en) * 2019-10-21 2023-04-25 西南民族大学 System and method for realizing reconstruction of lens matching through facial key point recognition
WO2021086319A1 (en) * 2019-10-29 2021-05-06 Hewlett-Packard Development Company, L.P. Generation of model data for three dimensional printers
US20210173230A1 (en) * 2019-10-30 2021-06-10 LuxMea Studio, LLC Bespoke eyewear and system and method for manufacturing bespoke eyewear
US11176357B2 (en) * 2019-10-30 2021-11-16 Tascent, Inc. Fast face image capture system
KR20210054315A (en) * 2019-11-05 2021-05-13 손금아 Customized eyeglass production system and method
KR20210059060A (en) * 2019-11-13 2021-05-25 삼성디스플레이 주식회사 Detecting device
US10965931B1 (en) * 2019-12-06 2021-03-30 Snap Inc. Sensor misalignment compensation
KR20220114550A (en) * 2019-12-17 2022-08-17 에씰로 앙터나시오날 Apparatus, method, and computer readable storage medium for contextualized equipment recommendation
CN114830015A (en) * 2019-12-19 2022-07-29 依视路国际公司 Method for determining a value of at least one geometric parameter of a subject wearing an eye-wear
US11748929B2 (en) * 2019-12-19 2023-09-05 Essilor International Apparatus, method, and computer-readable storage medium for expanding an image database for evaluation of eyewear compatibility
CN111179033A (en) * 2019-12-27 2020-05-19 珠海随变科技有限公司 Background data processing method for glasses customization, glasses customization method, device and equipment
US11622207B2 (en) 2019-12-31 2023-04-04 Starkey Laboratories, Inc. Generating a hearing assistance device shell
CN111694167A (en) * 2020-01-03 2020-09-22 周爱霞 Spectacle frame processing system and method based on lens size selection
CN111274673B (en) * 2020-01-07 2021-02-23 上海索辰信息科技股份有限公司 Optical product model optimization method and system based on particle swarm optimization
CN113128310A (en) * 2020-01-12 2021-07-16 邓广博 Target searching platform and method based on multi-parameter acquisition
US11614638B1 (en) * 2020-01-15 2023-03-28 Meta Platforms Technologies, Llc Prescription optical element for selected head mounted device
EP3851903A1 (en) * 2020-01-15 2021-07-21 Essilor International A method and system for providing an eyeglasses frame
EP3851902A1 (en) * 2020-01-15 2021-07-21 Essilor International Method for generating modified eyeglass frame manufacturing data
US11768378B1 (en) 2020-01-15 2023-09-26 Meta Platforms Technologies, Llc Prescription optical element based on optical-mechanical profile
US11642018B1 (en) 2020-01-15 2023-05-09 Meta Platforms Technologies, Llc Volumetric depth imaging for lens fit
US11755790B2 (en) 2020-01-29 2023-09-12 America's Collectibles Network, Inc. System and method of bridging 2D and 3D assets for product visualization and manufacturing
US11783475B2 (en) 2020-02-07 2023-10-10 Meta Platforms Technologies, Llc In ear device customization using machine learning
US11181758B2 (en) 2020-02-07 2021-11-23 Facebook Technologies, Llc Eyewear frame customization using machine learning
CN115088004A (en) * 2020-02-12 2022-09-20 依视路国际公司 Optical device for predicting the presence of a given person
US11826632B2 (en) 2020-02-19 2023-11-28 Auburn University Methods for manufacturing individualized protective gear from body scan and resulting products
DE102020104536A1 (en) 2020-02-20 2021-08-26 Carl Zeiss Vision International Gmbh Methods and devices for three-dimensional eye reconstruction
WO2021168336A1 (en) * 2020-02-21 2021-08-26 Ditto Technologies, Inc. Fitting of glasses frames including live fitting
CN111419537B (en) * 2020-02-26 2022-04-19 东莞理工学院 Preparation method of personalized 3D printing medical isolation eyeshade
EP3876026A1 (en) 2020-03-06 2021-09-08 Carl Zeiss Vision International GmbH Method and devices for determining inclination angle
CN111507202B (en) * 2020-03-27 2023-04-18 北京万里红科技有限公司 Image processing method, device and storage medium
GB2593702A (en) * 2020-03-30 2021-10-06 Fuel 3D Tech Limited Method and system for eyewear fitting
EP3889968A1 (en) 2020-04-01 2021-10-06 L'oreal Method for self-measuring facial or corporal dimensions, notably for the manufacturing of personalized applicators
US20230153479A1 (en) * 2020-04-02 2023-05-18 Ocularex, Inc. Real time augmented reality selection of user fitted eyeglass frames for additive manufacture
US11507056B1 (en) 2020-04-06 2022-11-22 Lockheed Martin Corporation System and method for three-dimensional (3D) computer-aided manufacturing (CAM) of an ensemble of pilot equipment and garments
CA3179919A1 (en) 2020-04-15 2021-10-21 Warby Parker Inc. Virtual try-on systems for spectacles using reference frames
US11531655B2 (en) * 2020-04-15 2022-12-20 Google Llc Automatically improving data quality
US20210322112A1 (en) * 2020-04-21 2021-10-21 Mazor Robotics Ltd. System and method for aligning an imaging device
WO2021221672A1 (en) * 2020-04-30 2021-11-04 Hewlett-Packard Development Company, L.P. Customized parametric models to manufacture devices
WO2021226045A1 (en) * 2020-05-04 2021-11-11 Meazure Me Custom HD, LLC Methods and systems for automated selection and ordering of hair products
US11869152B2 (en) * 2020-05-13 2024-01-09 Bodygram, Inc. Generation of product mesh and product dimensions from user image data using deep learning networks
FR3111451B1 (en) 2020-06-12 2022-11-11 Acep Trylive Device and method for acquiring images of a pair of spectacles
US11423567B2 (en) * 2020-06-17 2022-08-23 Fotonation Limited Method and system to determine the location and/or orientation of a head
US11533443B2 (en) 2020-06-29 2022-12-20 Innovega, Inc. Display eyewear with adjustable camera direction
US11475242B2 (en) 2020-07-27 2022-10-18 Seiko Epson Corporation Domain adaptation losses
US11748942B2 (en) 2020-08-13 2023-09-05 Siemens Mobility Pty Ltd System and method for automatically generating trajectories for laser applications
US11500658B2 (en) * 2020-08-24 2022-11-15 Visionworks of America, Inc. Systems and methods for using a transaction data structure for configuring and providing graphical user interfaces
WO2022044036A1 (en) * 2020-08-25 2022-03-03 Lenskart Solutions Pvt. Ltd., System and method to determine true facial feature measurements of a face in a 2d image
US11544846B2 (en) 2020-08-27 2023-01-03 James R. Glidewell Dental Ceramics, Inc. Out-of-view CT scan detection
CN111988579B (en) * 2020-08-31 2022-05-31 杭州海康威视系统技术有限公司 Data auditing method and system and electronic equipment
US20220067954A1 (en) * 2020-08-31 2022-03-03 Koninklijke Philips N.V. Dynamic measurement optimization based on image quality
CN112330793A (en) * 2020-09-30 2021-02-05 安克创新科技股份有限公司 Obtaining method of ear mold three-dimensional model, earphone customizing method and computing device
US11669880B2 (en) 2020-10-08 2023-06-06 Kiksar Technologies Private Limited System and method for customization of an eyewear
US11590555B2 (en) 2020-10-26 2023-02-28 Joseph Oshman Methods of creating bike rack hooks
WO2022093467A1 (en) * 2020-10-27 2022-05-05 Marc Lemchen Methods for direct printing of orthodontic and dental appliances onto the teeth of a patient
CN112606402A (en) * 2020-11-03 2021-04-06 泰州芯源半导体科技有限公司 Product manufacturing platform applying multi-parameter analysis
CN112308073B (en) * 2020-11-06 2023-08-25 中冶赛迪信息技术(重庆)有限公司 Method, system, equipment and medium for identifying loading and unloading and transferring states of scrap steel train
US20220163822A1 (en) * 2020-11-24 2022-05-26 Christopher Chieco System and method for virtual fitting of eyeglasses
CN112788192B (en) * 2020-12-21 2023-05-09 柏丽德珠宝(广州)有限公司 Jewelry customization-based image processing method and device
WO2022140796A1 (en) * 2020-12-23 2022-06-30 BLNG Corporation Systems and methods for generating jewelry designs and models using machine learning
US11543802B2 (en) * 2020-12-23 2023-01-03 Etsy, Inc. Multi-source item creation system
US20210335504A1 (en) * 2020-12-29 2021-10-28 Abeer Ayoub System and method for comprehensive eye care
FR3118821B1 (en) * 2021-01-13 2024-03-01 Fittingbox Method for detecting and tracking in a video stream the face of an individual wearing a pair of glasses
CN117292416A (en) * 2021-01-25 2023-12-26 天津怡和嘉业医疗科技有限公司 Face size determining method and device
WO2022161730A1 (en) 2021-01-28 2022-08-04 Essilor International Method and apparatus for determining a fit of a visual equipment
TWI789702B (en) * 2021-02-09 2023-01-11 晶碩光學股份有限公司 Lens construction method
EP4043946A1 (en) * 2021-02-12 2022-08-17 Essilor International A device and method for automatically evaluating visual equipment
EP4047519A1 (en) * 2021-02-22 2022-08-24 Carl Zeiss Vision International GmbH Devices and methods for processing eyeglass prescriptions
CN112801038B (en) * 2021-03-02 2022-07-22 重庆邮电大学 Multi-view face in-vivo detection method and system
US20220300728A1 (en) * 2021-03-22 2022-09-22 Snap Inc. True size eyewear experience in real time
US11972592B2 (en) 2021-04-06 2024-04-30 Innovega, Inc. Automated eyewear frame design through image capture
US11762219B2 (en) 2021-04-06 2023-09-19 Innovega, Inc. Automated contact lens design through image capture of an eye wearing a reference contact lens
US11546686B2 (en) 2021-04-06 2023-01-03 Dan Clark Audio, Inc. Headphone ear pad system
US11604368B2 (en) 2021-04-06 2023-03-14 Innovega, Inc. Contact lens and eyewear frame design using physical landmarks placed on the eye
CN112965983B (en) * 2021-04-12 2022-05-17 邹可权 Establishment method and use method of personalized cup database
DE102021109140A1 (en) * 2021-04-13 2022-10-13 Bundesdruckerei Gmbh Method and arrangement for the optical detection of a person's head
US20220343534A1 (en) * 2021-04-23 2022-10-27 Google Llc Image based detection of display fit and ophthalmic fit measurements
US11704931B2 (en) * 2021-04-29 2023-07-18 Google Llc Predicting display fit and ophthalmic fit measurements using a simulator
USD990180S1 (en) 2021-04-30 2023-06-27 Specialized Bicycle Components, Inc. Bicycle saddle
KR20220150084A (en) 2021-05-03 주식회사 시리우스시스템 Real-time support system for production processes
US11625094B2 (en) 2021-05-04 2023-04-11 Google Llc Eye tracker design for a wearable device
EP4320476A1 (en) * 2021-05-10 2024-02-14 Apple Inc. Fit detection system for head-mountable devices
WO2022254409A1 (en) * 2021-06-04 2022-12-08 ResMed Pty Ltd System and method for providing customized headwear based on facial images
US20220390771A1 (en) * 2021-06-07 2022-12-08 Blink Technologies Inc. System and method for fitting eye wear
US20220398781A1 (en) * 2021-06-10 2022-12-15 EyesMatch Ltd. System and method for digital measurements of subjects
CN113515873B (en) * 2021-07-07 2022-04-19 中国科学院重庆绿色智能技术研究院 Metal additive manufacturing molten pool shape prediction method based on dimensional analysis
JP2023009953A (en) * 2021-07-08 2023-01-20 日本電気株式会社 Analysis apparatus, communication system, analysis method, and program
US11971246B2 (en) * 2021-07-15 2024-04-30 Google Llc Image-based fitting of a wearable computing device
DE102021118580A1 (en) 2021-07-19 2023-01-19 Nodety GmbH Process for controlling production devices
CN113706470B (en) * 2021-07-29 2023-12-15 天津中科智能识别产业技术研究院有限公司 Iris image segmentation method and device, electronic equipment and storage medium
US11915200B2 (en) * 2021-07-29 2024-02-27 Zazzle Inc. Collaborative video chat screen sharing using a digital product collaboration platform
US20230046950A1 (en) * 2021-08-12 2023-02-16 Google Llc Image based detection of fit for a head mounted wearable computing device
US20230103129A1 (en) * 2021-09-27 2023-03-30 ResMed Pty Ltd Machine learning to determine facial measurements via captured images
JP7380664B2 (en) 2021-10-19 2023-11-15 カシオ計算機株式会社 Display control method, program and display control device
CA3238445A1 (en) * 2021-11-17 2023-05-25 Sergey Nikolskiy Systems and methods for automated 3d teeth positions learned from 3d teeth geometries
JP7095849B1 (en) * 2021-11-26 2022-07-05 アイジャパン株式会社 Eyewear virtual fitting system, eyewear selection system, eyewear fitting system and eyewear classification system
CN114445554B (en) * 2021-12-24 2023-02-10 广东时谛智能科技有限公司 Special customization method and device for shoe body, electronic equipment and storage medium
CN114528610A (en) * 2022-01-21 2022-05-24 深圳市佐吖服装有限公司 Novel parameterized brim design method based on garment CAD software
US20230252655A1 (en) * 2022-02-09 2023-08-10 Google Llc Validation of modeling and simulation of wearable device
CN114488566B (en) * 2022-02-15 2024-04-02 首都医科大学附属北京同仁医院 Face morphology three-dimensional data-based spectacle frame personalized design method
EP4249994A1 (en) * 2022-03-23 2023-09-27 Essilor International Improved detection of an outline of a spectacle frame
WO2023187730A1 (en) * 2022-03-31 2023-10-05 Soul Machines Limited Conversational digital character blending and generation
EP4254251A1 (en) * 2022-04-01 2023-10-04 BAE SYSTEMS plc Vehicle design tool
WO2023187311A1 (en) * 2022-04-01 2023-10-05 Bae Systems Plc Vehicle design tool
WO2023215397A1 (en) * 2022-05-03 2023-11-09 Ditto Technologies, Inc. Systems and methods for scaling using estimated facial features
EP4300172A1 (en) 2022-07-01 2024-01-03 Fielmann AG Method for determining lens fitting parameters
KR102499864B1 (en) * 2022-07-12 2023-02-16 (주)인터비젼 Customizing progressive lens design device and method
US11538228B1 (en) * 2022-08-26 2022-12-27 Illuscio, Inc. Systems and methods for augmented reality viewing based on directly mapped point cloud overlays
US11893847B1 (en) 2022-09-23 2024-02-06 Amazon Technologies, Inc. Delivering items to evaluation rooms while maintaining customer privacy
US11733853B1 (en) * 2022-09-28 2023-08-22 Zazzle Inc. Parametric modelling and grading
CN117252866B (en) * 2023-11-14 2024-02-13 山东迪格重工机械有限公司 Numerical control stamping forming self-adjustment operation detection method based on image recognition

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1999059106A1 (en) * 1998-05-13 1999-11-18 Acuscape International, Inc. Method and apparatus for generating 3d models from medical images
CN1264474A (en) * 1997-05-16 2000-08-23 保谷株式会社 System for making spectacles to order
WO2000060513A1 (en) * 1999-04-01 2000-10-12 Max Hats, Ltd. Integrated product and process for mass customization of goods one item at a time
US6144388A (en) * 1998-03-06 2000-11-07 Bornstein; Raanan Process for displaying articles of clothing on an image of a person
US6241355B1 (en) * 1996-03-29 2001-06-05 Brian A. Barsky Computer aided contact lens design and fabrication using spline surfaces
US6692127B2 (en) * 2000-05-18 2004-02-17 Visionix Ltd. Spectacles fitting system and fitting methods useful therein
CN102073759A (en) * 2010-12-29 2011-05-25 温州大学 Facial form characteristic parameter-based eyeglass configuration control method
CN102439603A (en) * 2008-01-28 2012-05-02 耐特维塔有限公司 Simple techniques for three-dimensional modeling
CN103096819A (en) * 2010-06-29 2013-05-08 乔治·弗雷 Patient matching surgical guide and method for using the same

Family Cites Families (119)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4539585A (en) 1981-07-10 1985-09-03 Spackova Daniela S Previewer
JPS6180222A (en) 1984-09-28 1986-04-23 Asahi Glass Co Ltd Method and apparatus for adjusting spectacles
EP0260710A3 (en) 1986-09-19 1989-12-06 Hoya Corporation Method of forming a synthetic image in simulation system for attachment of spectacles
KR910000591B1 (en) 1986-10-30 1991-01-26 가부시기가이샤 도시바 Glasses frame picture process record method and it's system
JPH0535827A (en) 1991-05-10 1993-02-12 Miki:Kk Spectacles selection and designing system
JP3072398B2 (en) * 1991-09-30 2000-07-31 青山眼鏡株式会社 Eyeglass frame manufacturing system
DE4224922A1 (en) 1992-07-28 1994-02-03 Carlo Kuester Automatic consultation system as aid for customer in selecting product eg spectacles - has video camera to obtain image of face for use in selecting spectacles from available models or for customised versions
US5280570A (en) 1992-09-11 1994-01-18 Jordan Arthur J Spectacle imaging and lens simulating system and method
JP2751145B2 (en) 1993-12-15 1998-05-18 株式会社三城 Eyeglass shape design design system
ES2111478B1 (en) 1995-10-02 1998-11-01 Tovar Romero Jaime DESIGN SYSTEM AND SELECTION OF GLASSES THROUGH DIGITAL IMAGE PROCESSING.
US5592248A (en) 1995-11-16 1997-01-07 Norton; Ross A. Computerized method for fitting eyeglasses
US5724258A (en) * 1996-05-09 1998-03-03 Johnson & Johnson Vision Products, Inc. Neural network analysis for multifocal contact lens design
US5983201A (en) 1997-03-28 1999-11-09 Fay; Pierre N. System and method enabling shopping from home for fitted eyeglass frames
JP4026782B2 (en) * 1997-05-16 2007-12-26 日本電信電話株式会社 Eyeglass manufacturing support system
US6142628A (en) 1998-02-03 2000-11-07 Saigo; Tsuyoshi Eyeglasses try-on simulation system
US6563499B1 (en) 1998-07-20 2003-05-13 Geometrix, Inc. Method and apparatus for generating a 3D region from a surrounding imagery
US6095650A (en) 1998-09-22 2000-08-01 Virtual Visual Devices, Llc Interactive eyewear selection system
US7062454B1 (en) 1999-05-06 2006-06-13 Jarbridge, Inc. Previewing system and method
US8046270B2 (en) * 2000-05-19 2011-10-25 Eastman Kodak Company System and method for providing image products and/or services
WO2001032074A1 (en) 1999-11-04 2001-05-10 Stefano Soatto System for selecting and designing eyeglass frames
EP1136869A1 (en) 2000-03-17 2001-09-26 Kabushiki Kaisha TOPCON Eyeglass frame selecting system
WO2001079918A1 (en) 2000-04-19 2001-10-25 Aoyama Gankyo Kabushikikaisha Production method for spectacles-frame
US6535223B1 (en) 2000-05-31 2003-03-18 Schmidt Laboratories, Inc. Method and system for determining pupillary distance and element height
WO2001098730A2 (en) 2000-06-16 2001-12-27 Eyeweb, Inc. Sizing objects from an image
US6791584B1 (en) 2000-09-05 2004-09-14 Yiling Xie Method of scaling face image with spectacle frame image through computer
US6664956B1 (en) 2000-10-12 2003-12-16 Momentum Bilgisayar, Yazilim, Danismanlik, Ticaret A. S. Method for generating a personalized 3-D face model
US6792401B1 (en) 2000-10-31 2004-09-14 Diamond Visionics Company Internet-based modeling kiosk and method for fitting and selling prescription eyeglasses
CA2429472A1 (en) 2000-11-24 2002-05-30 Vision Optic Co., Ltd. Eyeglasses order/sale system over network and its method
KR100417923B1 (en) * 2000-11-29 2004-02-11 이성균 Spectacles order system by using data communication network and the operation method thereof
US7016824B2 (en) 2001-02-06 2006-03-21 Geometrix, Inc. Interactive try-on platform for eyeglasses
DE10106562B4 (en) 2001-02-13 2008-07-03 Rodenstock Gmbh Method for demonstrating the influence of a particular spectacle frame and the optical glasses used in this spectacle frame
JP2002279295A (en) * 2001-03-21 2002-09-27 Minolta Co Ltd Method of providing processing data
US7103211B1 (en) 2001-09-04 2006-09-05 Geometrix, Inc. Method and apparatus for generating 3D face models from one camera
US20030065255A1 (en) * 2001-10-01 2003-04-03 Daniela Giacchetti Simulation of an aesthetic feature on a facial image
US6682195B2 (en) 2001-10-25 2004-01-27 Ophthonix, Inc. Custom eyeglass manufacturing method
US7434931B2 (en) * 2001-10-25 2008-10-14 Ophthonix Custom eyeglass manufacturing method
CN100462048C (en) 2002-01-04 2009-02-18 株式会社威炯眼镜 Spectacle and contact lens selecting system and method thereof
EP1495447A1 (en) * 2002-03-26 2005-01-12 KIM, So-Woon System and method for 3-dimension simulation of glasses
DE10216824B4 (en) 2002-04-16 2006-03-02 Thomas Doro Method and apparatus for constructing custom eyeglasses
US20040004633A1 (en) 2002-07-03 2004-01-08 Perry James N. Web-based system and method for ordering and fitting prescription lens eyewear
FR2842977A1 (en) 2002-07-24 2004-01-30 Total Immersion METHOD AND SYSTEM FOR ENABLING A USER TO MIX REAL-TIME SYNTHESIS IMAGES WITH VIDEO IMAGES
DE10323811A1 (en) * 2003-05-23 2005-01-13 Bwg Bergwerk- Und Walzwerk-Maschinenbau Gmbh Method for the continuous drawing of metallic strips and drawstring line
JP4369694B2 (en) 2003-07-18 2009-11-25 東海旅客鉄道株式会社 Protecting concrete structures
US6994327B2 (en) 2003-08-13 2006-02-07 Certainteed Corporation Cap and base assembly for a fence post
US7512255B2 (en) * 2003-08-22 2009-03-31 Board Of Regents, University Of Houston Multi-modal face recognition
JP4186766B2 (en) 2003-09-12 2008-11-26 セイコーエプソン株式会社 Spectacle lens manufacturing system and spectacle lens manufacturing method
WO2005029158A2 (en) 2003-09-12 2005-03-31 Neal Michael R Method of interactive system for previewing and selecting eyewear
FR2860887B1 (en) 2003-10-13 2006-02-03 Interactif Visuel Systeme Ivs Measurement of face configuration and of eyeglass frames placed on this face with improved efficiency
AU2005215056B2 (en) * 2004-02-20 2011-06-09 Essilor International (Compagnie Generale D'optique) System and method for analyzing wavefront aberrations
US7154529B2 (en) 2004-03-12 2006-12-26 Hoke Donald G System and method for enabling a person to view images of the person wearing an accessory before purchasing the accessory
US7804997B2 (en) 2004-06-10 2010-09-28 Technest Holdings, Inc. Method and system for a three dimensional facial recognition system
JP2006113425A (en) 2004-10-18 2006-04-27 Yasumori Kanazawa Supply system of eyeglasses
FR2885231A1 (en) 2005-04-29 2006-11-03 Lorraine Sole Soc D Optique Sa Spectacle frame selecting method, involves selecting spectacle frame adapted to consumer morphology by interrogating frame`s dimensional characteristic indexing database and incorporating image of frame adapted to consumer to consumer image
US7607776B1 (en) 2005-10-24 2009-10-27 James Waller Lambuth Lewis Digital eye bank for virtual clinic trials
US7487116B2 (en) 2005-12-01 2009-02-03 International Business Machines Corporation Consumer representation rendering with selected merchandise
JP4232166B2 (en) * 2006-01-31 2009-03-04 株式会社アイメトリクス・ジャパン Glasses wearing simulation method and apparatus
US20070244722A1 (en) 2006-04-12 2007-10-18 Gary Nathanael Wortz Method for determining refractive state and providing corrective eyewear
EP1862110A1 (en) 2006-05-29 2007-12-05 Essilor International (Compagnie Generale D'optique) Method for optimizing eyeglass lenses
WO2008014482A2 (en) * 2006-07-27 2008-01-31 Personics Holdings Inc. Method and device of customizing headphones
US9164299B2 (en) 2006-10-26 2015-10-20 Carl Zeiss Vision Australia Holdings Limited Ophthalmic lens dispensing method and system
WO2008089998A1 (en) * 2007-01-25 2008-07-31 Rodenstock Gmbh Method for determining remote and near reference points
US20080198328A1 (en) 2007-02-16 2008-08-21 Seriani Joseph S System and method self enabling customers to obtain refraction specifications for, and purchase of, previous or new fitted eyeglasses
US7665843B2 (en) 2007-02-21 2010-02-23 Yiling Xie Method and the associate mechanism for stored-image database-driven spectacle frame fitting services over public network
US20110071804A1 (en) 2007-02-21 2011-03-24 Yiling Xie Method And The Associated Mechanism For 3-D Simulation Stored-Image Database-Driven Spectacle Frame Fitting Services Over Public Network
WO2008130907A1 (en) * 2007-04-17 2008-10-30 Mikos, Ltd. System and method for using three dimensional infrared imaging to identify individuals
EP2031434B1 (en) 2007-12-28 2022-10-19 Essilor International An asynchronous method for obtaining spectacle features to order
EP2037314B1 (en) 2007-12-28 2021-12-01 Essilor International A method and computer means for choosing spectacle lenses adapted to a frame
US20090319337A1 (en) 2008-06-09 2009-12-24 Yiling Xie Optical product network via Internet
EP2161611A1 (en) * 2008-09-04 2010-03-10 Essilor International (Compagnie Générale D'Optique) Method for optimizing the settings of an ophtalmic system
DE102009004380B4 (en) 2009-01-12 2012-04-05 Rodenstock Gmbh A method of making an individual spectacle frame, computer program product, use and spectacle frame making apparatus
FR2945365B1 (en) 2009-05-11 2011-06-24 Acep France METHOD AND SYSTEM FOR ONLINE SELECTION OF A VIRTUAL GOGGLE FRAME
US20110022545A1 (en) * 2009-07-24 2011-01-27 A Truly Electric Car Company Re-inventing carmaking
GB0913311D0 (en) 2009-07-30 2009-09-02 Nova Resources S A R L Eyeglass frames
GB0920129D0 (en) 2009-11-17 2009-12-30 Nova Resources S A R L Correct scaling for superimposed images
FR2955409B1 (en) 2010-01-18 2015-07-03 Fittingbox METHOD FOR INTEGRATING A VIRTUAL OBJECT IN REAL TIME VIDEO OR PHOTOGRAPHS
JP5648299B2 (en) * 2010-03-16 2015-01-07 株式会社ニコン Eyeglass sales system, lens company terminal, frame company terminal, eyeglass sales method, and eyeglass sales program
FR2957511B1 (en) 2010-03-19 2021-09-24 Fittingbox METHOD AND DEVICE FOR MEASURING INTER-PUPILLARY DISTANCE
US9959453B2 (en) * 2010-03-28 2018-05-01 AR (ES) Technologies Ltd. Methods and systems for three-dimensional rendering of a virtual augmented replica of a product image merged with a model image of a human-body feature
US8459792B2 (en) 2010-04-26 2013-06-11 Hal E. Wilson Method and systems for measuring interpupillary distance
US9519396B2 (en) * 2010-09-28 2016-12-13 Apple Inc. Systems, methods, and computer-readable media for placing an asset on a three-dimensional model
US20130088490A1 (en) 2011-04-04 2013-04-11 Aaron Rasmussen Method for eyewear fitting, recommendation, and customization using collision detection
US8622545B2 (en) * 2011-07-26 2014-01-07 Shaw Vision Inc. Spectacle lenses and method of making same
KR20130032117A (en) * 2011-09-22 2013-04-01 주식회사 유비케어 Apparatus and method for providing information for taking medicine
FR2980681A3 (en) 2011-09-29 2013-04-05 Fittingbox METHOD OF DETERMINING OCULAR AND OPTICAL MEASUREMENTS OF INTEREST FOR THE MANUFACTURE AND MOUNTING OF GLASSES OF CORRECTIVE GLASSES USING A CAMERA IN A NON-CONTROLLED CONTEXT
US9236024B2 (en) 2011-12-06 2016-01-12 Glasses.Com Inc. Systems and methods for obtaining a pupillary distance measurement using a mobile computing device
US20130187916A1 (en) 2012-01-25 2013-07-25 Raytheon Company System and method for compression and simplification of video, pictorial, or graphical data using polygon reduction for real time applications
US8708494B1 (en) 2012-01-30 2014-04-29 Ditto Technologies, Inc. Displaying glasses with recorded images
FR2987920B1 (en) 2012-03-08 2018-03-02 Essilor International METHOD FOR DETERMINING A GEOMETRIC-MORPHOLOGICAL, POSTURE OR BEHAVIORAL CHARACTERISTIC OF A BEARER OF A PAIR OF EYEWEAR
FR2988494B1 (en) 2012-03-20 2015-04-03 Essilor Int METHOD FOR MANUFACTURING A PERSONALIZED EYE PROTECTION OPHTHALMIC LENS FOR A BEARER
EP2834059A1 (en) 2012-04-03 2015-02-11 LUXeXcel Holding B.V. Device and method for producing custom-made spectacles
US9483853B2 (en) 2012-05-23 2016-11-01 Glasses.Com Inc. Systems and methods to display rendered images
US20130314413A1 (en) 2012-05-23 2013-11-28 1-800 Contacts, Inc. Systems and methods for scaling a three-dimensional model
US20130335416A1 (en) * 2012-05-23 2013-12-19 1-800 Contacts, Inc. Systems and methods for generating a 3-d model of a virtual try-on product
US9286715B2 (en) 2012-05-23 2016-03-15 Glasses.Com Inc. Systems and methods for adjusting a virtual try-on
US20130314401A1 (en) 2012-05-23 2013-11-28 1-800 Contacts, Inc. Systems and methods for generating a 3-d model of a user for a virtual try-on product
NL2010009C2 (en) 2012-12-19 2014-06-23 Sfered Intelligence B V Method and device for determining user preferred dimensions of a spectacle frame.
US20140240470A1 (en) * 2013-02-28 2014-08-28 Codi Comercio Design Industrial, Lda Method, system and device for improving optical measurement of ophthalmic spectacles
US20140253707A1 (en) * 2013-03-11 2014-09-11 Dasa V. Gangadhar Automated acquisition of eyeglasses
US9429773B2 (en) 2013-03-12 2016-08-30 Adi Ben-Shahar Method and apparatus for design and fabrication of customized eyewear
US9470911B2 (en) 2013-08-22 2016-10-18 Bespoke, Inc. Method and system to create products
US20150127132A1 (en) 2013-11-01 2015-05-07 West Coast Vision Labs Inc. Method and system for generating custom-fit eye wear geometry for printing and fabrication
EP2887131A1 (en) 2013-12-20 2015-06-24 Jakob Schmied Method for producing spectacles that are tailored to a person and spectacles
FR3016051B1 (en) 2014-01-02 2017-06-16 Essilor Int METHOD FOR DETERMINING AT LEAST ONE GEOMETRIC PARAMETER OF A PERSONALIZED FRAME OF EYEWEAR AND METHOD OF DETERMINING THE ASSOCIATED CUSTOM FRAME
US9810927B1 (en) 2014-03-19 2017-11-07 3-D Frame Solutions LLC Process and system for customizing eyeglass frames
US20150277155A1 (en) 2014-03-31 2015-10-01 New Eye London Ltd. Customized eyewear
JP6449911B2 (en) 2014-04-30 2019-01-09 マテリアライズ・ナムローゼ・フエンノートシャップMaterialise Nv Object customization system and method in additive machining
EP2946914A1 (en) 2014-05-21 2015-11-25 Mount Bros LLC Method of manufacturing eyeglass frames, apparatus for carrying out the method and frames obtained with such method
FR3024911B1 (en) 2014-08-14 2017-10-13 Oyyo METHOD FOR TRYING AND MANUFACTURING GLASSES
WO2016115369A1 (en) 2015-01-14 2016-07-21 Northwestern University Compositions, systems and methods for patient specific ophthalmic device
US9341867B1 (en) 2015-01-16 2016-05-17 James Chang Ho Kim Methods of designing and fabricating custom-fit eyeglasses using a 3D printer
US10194799B2 (en) 2015-03-09 2019-02-05 Sanovas Intellectual Property, Llc Robotic ophthalmology
US11086148B2 (en) 2015-04-30 2021-08-10 Oakley, Inc. Wearable devices such as eyewear customized to individual wearer parameters
NL2014891B1 (en) 2015-05-29 2017-01-31 Maydo B V Method for manufacturing a spectacle frame adapted to a spectacle wearer.
FR3038077A1 (en) 2015-06-26 2016-12-30 Frederic Clodion DEVICE FOR 3D VIEWING AND FACE MEASUREMENT FOR THE PRODUCTION OF SUNGLASSES OR SUNGLASSES, TO MEASURE
FR3044429B1 (en) 2015-11-26 2018-01-05 Ak Optique METHOD FOR MANUFACTURING A CUSTOM GOGGLE FRAME
DE102016106121A1 (en) 2016-04-04 2017-10-05 Carl Zeiss Ag Method and device for determining parameters for spectacle fitting
BR102016009093A2 (en) 2016-04-22 2017-10-31 Sequoia Capital Ltda. EQUIPMENT FOR ACQUISITION OF 3D IMAGE DATA OF A FACE AND AUTOMATIC METHOD FOR PERSONALIZED MODELING AND MANUFACTURE OF GLASS FRAMES
CN107498846A (en) 2017-09-20 2017-12-22 厦门云镜视界设计有限公司 A kind of manufacturing glasses method based on 3D models
CN107589562A (en) 2017-09-20 2018-01-16 厦门云镜视界设计有限公司 A kind of making spectacles to order method

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6241355B1 (en) * 1996-03-29 2001-06-05 Brian A. Barsky Computer aided contact lens design and fabrication using spline surfaces
CN1264474A (en) * 1997-05-16 2000-08-23 保谷株式会社 System for making spectacles to order
US6144388A (en) * 1998-03-06 2000-11-07 Bornstein; Raanan Process for displaying articles of clothing on an image of a person
WO1999059106A1 (en) * 1998-05-13 1999-11-18 Acuscape International, Inc. Method and apparatus for generating 3d models from medical images
WO2000060513A1 (en) * 1999-04-01 2000-10-12 Max Hats, Ltd. Integrated product and process for mass customization of goods one item at a time
US6692127B2 (en) * 2000-05-18 2004-02-17 Visionix Ltd. Spectacles fitting system and fitting methods useful therein
CN102439603A (en) * 2008-01-28 2012-05-02 耐特维塔有限公司 Simple techniques for three-dimensional modeling
CN103096819A (en) * 2010-06-29 2013-05-08 乔治·弗雷 Patient matching surgical guide and method for using the same
CN102073759A (en) * 2010-12-29 2011-05-25 温州大学 Facial form characteristic parameter-based eyeglass configuration control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Comparative Study on the Effect of Wearing Spectacles on Accommodation in Children with Mixed Astigmatism"; Li Wentao et al.; Chinese Journal of Optometry, Ophthalmology and Visual Science; 20130525 (Issue 5); pp. 295-298 *

Also Published As

Publication number Publication date
KR20180009391A (en) 2018-01-26
US20200285081A1 (en) 2020-09-10
JP6542744B2 (en) 2019-07-10
CA2921938A1 (en) 2015-02-26
CA2921938C (en) 2016-12-20
US20160062151A1 (en) 2016-03-03
KR101821284B1 (en) 2018-01-23
CN105637512A (en) 2016-06-01
US11428960B2 (en) 2022-08-30
AU2014308590A1 (en) 2016-03-03
US20150154678A1 (en) 2015-06-04
US20180299704A1 (en) 2018-10-18
US20150212343A1 (en) 2015-07-30
JP6099232B2 (en) 2017-03-22
AU2016208357B2 (en) 2018-04-12
US10451900B2 (en) 2019-10-22
US20220357600A1 (en) 2022-11-10
JP2017041281A (en) 2017-02-23
US20170269385A1 (en) 2017-09-21
AU2016208357A1 (en) 2016-08-11
US20160062152A1 (en) 2016-03-03
WO2015027196A1 (en) 2015-02-26
KR102207026B1 (en) 2021-01-22
US10698236B2 (en) 2020-06-30
CN105637512B (en) 2018-04-20
JP2016537716A (en) 2016-12-01
US11914226B2 (en) 2024-02-27
US20150154322A1 (en) 2015-06-04
US20150055086A1 (en) 2015-02-26
US10222635B2 (en) 2019-03-05
US9529213B2 (en) 2016-12-27
US10031351B2 (en) 2018-07-24
US10459256B2 (en) 2019-10-29
US9470911B2 (en) 2016-10-18
EP3036701A1 (en) 2016-06-29
CN108537628A (en) 2018-09-14
US11428958B2 (en) 2022-08-30
US9703123B2 (en) 2017-07-11
AU2014308590B2 (en) 2016-04-28
KR20160070744A (en) 2016-06-20
US20220350174A1 (en) 2022-11-03
US10031350B2 (en) 2018-07-24
EP3036701A4 (en) 2017-01-18
US20190146246A1 (en) 2019-05-16
US20170068121A1 (en) 2017-03-09
US9304332B2 (en) 2016-04-05
US20150154679A1 (en) 2015-06-04
US20240142803A1 (en) 2024-05-02
US20150055085A1 (en) 2015-02-26
US11867979B2 (en) 2024-01-09

Similar Documents

Publication Publication Date Title
US11867979B2 (en) Method and system to create custom, user-specific eyewear

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code
Ref country code: HK
Ref legal event code: DE
Ref document number: 1254991
Country of ref document: HK

GR01 Patent grant