WO2001097682A1 - Method and apparatus for automatically transmitting lens information - Google Patents

Method and apparatus for automatically transmitting lens information (Procede et appareil d'emission d'informations automatiques concernant des lentilles)

Info

Publication number
WO2001097682A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
information
data
visual acuity
user
Prior art date
Application number
PCT/JP2001/005203
Other languages
English (en)
Japanese (ja)
Inventor
Takehiko Yoshida
Original Assignee
Vision Optic Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vision Optic Co., Ltd. filed Critical Vision Optic Co., Ltd.
Priority to AU2001264320A priority Critical patent/AU2001264320A1/en
Publication of WO2001097682A1 publication Critical patent/WO2001097682A1/fr

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 13/00 Assembling; Repairing; Cleaning
    • G02C 13/003 Measuring during assembly or fitting of spectacles

Definitions

  • The present invention relates to an unmanned lens information transmitting device and an unmanned lens information transmitting method that allow anyone to undergo an optometric examination at a relatively convenient place close to work or home and to purchase a lens as needed.
  • Conventionally, naked eye visual acuity or corrected visual acuity is measured by visiting an ophthalmologist for an examination or by using a visual acuity measuring device provided at an eyeglass store.
  • In recent years, virtual shopping malls have been formed on networks such as the Internet, and glasses can be ordered online at spectacle shops set up in such virtual malls; however, no system is available for the unmanned sale of lenses.
  • Accordingly, the main object of the present invention is to provide an unmanned lens information transmitting device and method that allow information about lenses to be transmitted, and lenses to be purchased as needed, at a convenient location close to home and workplace, while retaining an element of face-to-face sales. Disclosure of the Invention
  • An invention according to claim 1 of the present invention is an unmanned lens information transmitting device for transmitting information about a lens from a certain space into which a person can enter, comprising: visual acuity measuring means for measuring visual acuity in a state where the lens is removed; communication means for processing data measured by the visual acuity measuring means and transmitting the data to a manned lens processing center; output means for giving instructions about the lens by display and/or voice; and input means for inputting instructions about the lens.
  • An invention according to claim 2 of the present invention is the unmanned lens information transmitting device according to claim 1, further comprising used-lens power input means for inputting the contact lens data, describing the power of a lens already in use, that appears on its package.
  • An invention according to claim 3 of the present invention is the unmanned lens information transmitting device according to claim 2, wherein the communication means processes the data input by the used-lens power input means and sends the data to the manned lens processing center.
  • An invention according to claim 4 of the present invention is the unmanned lens information transmitting device according to any one of the preceding claims, wherein the visual acuity measuring means includes a visual acuity measuring device that automatically measures naked eye visual acuity and calculates the lens power necessary for correcting it.
  • An invention according to claim 5 of the present invention is the unmanned lens information transmitting device according to any one of claims 1 to 4, wherein the communication means includes any of a CPU, a data compression device, an image processing device, data storage, and a WWW browser.
  • An invention according to claim 6 of the present invention is the unmanned lens information transmitting device according to any one of claims 1 to 3, wherein the output means and the input means include a touch panel.
  • An invention according to claim 7 of the present invention is the unmanned lens information transmitting device according to any one of claims 1 to 6, wherein the input means includes a camera.
  • the invention according to claim 8 of the present invention is the unmanned lens information transmitting device according to any one of claims 1 to 7, wherein the input means includes any one of a keyboard and a voice input means.
  • An invention according to claim 9 of the present invention is the unmanned lens information transmitting device according to any one of claims 1 to 8, wherein the certain space into which a person can enter is a booth provided with an entrance on one side, the entrance is provided with detecting means for detecting entry and exit of a person, and at least one of the used-lens power measuring means, the visual acuity measuring means, the communication means, the output means, and the input means is turned on by a signal from the detecting means upon entry and turned off upon leaving.
  • An invention according to claim 10 of the present invention is the unmanned lens information transmitting device according to any one of claims 1 to 9, further comprising fee input means for inputting the fee required for using the used-lens power measuring means and/or the visual acuity measuring means.
  • An invention according to claim 11 of the present invention is an unmanned lens information transmitting method for transmitting information about a lens from a certain space into which a person can enter, comprising: a visual acuity measuring step of measuring visual acuity in a state where the lens is removed; a communication step of processing the data measured in the visual acuity measuring step and transmitting the data to a manned lens processing center; an output step of giving instructions about the lens by display and/or voice; and an input step of inputting instructions about the lens.
  • An invention according to claim 12 of the present invention further comprises a used-lens power input step of inputting the contact lens data, describing the power of a lens already in use, that appears on its package.
  • An invention according to claim 13 of the present invention is characterized in that the communication step processes the data input in the used-lens power input step and sends the data to the manned lens processing center.
  • An invention according to claim 14 of the present invention is the unmanned lens information transmitting method according to any one of claims 11 to 13, further comprising a step of allowing a user to select at least one of visual acuity measurement and purchase of a lens.
  • An invention according to claim 15 of the present invention is the unmanned lens information transmitting method according to any one of claims 11 to 14, further comprising a step of displaying at least one lens selected by the user for confirmation by the user and/or displaying a price to prompt the user to purchase.
  • FIG. 1 is an illustrative view of a lens information transmitting device using an unmanned counter booth according to an embodiment of the present invention.
  • FIG. 2 is an illustrative view showing a structure inside a lens information transmitting device using an unmanned counter booth.
  • FIG. 3 is an illustrative view showing a structure of a visual acuity measuring device.
  • FIG. 4 is a diagram showing an example of the system configuration of a glasses order sales system via a network according to an embodiment of the present invention.
  • FIG. 5 is a diagram showing an outline (part 1) of the processing flow of the glasses order sales system via the network.
  • FIG. 6 is a diagram showing an outline (part 2) of the processing flow of the glasses order sales system via the network.
  • FIG. 7 is a diagram showing an outline (step 2) of the processing flow of the glasses order sales system via the network when the purchaser is already a customer.
  • FIG. 8 is a diagram showing an outline (step 3) of the processing flow of the glasses order sales system via the network when the purchaser is not a customer but has a prescription.
  • FIG. 9 is a diagram showing an outline (step 4) of the processing flow of the glasses order sales system via the network when the purchaser is not a customer and has no prescription.
  • FIG. 10 is a diagram showing an outline (step 4') of the processing flow of the glasses order sales system via the network when the purchaser is not a customer and has no prescription.
  • FIG. 11 is a diagram showing an outline (step 5) of the processing flow of the glasses order sales system via the network when ready-made reading glasses are selected.
  • FIG. 12 is a diagram showing a lens selection reference database.
  • FIG. 13 is a diagram showing a lens database.
  • FIG. 14 is a diagram showing an example of a system configuration of a remote visual acuity measuring system.
  • FIG. 15 is a diagram showing an example of a database structure related to user information managed by storage means in a lens processing center.
  • FIG. 16 is a diagram showing an example of a database structure relating to reference information for visual acuity measurement managed by the storage means in the lens processing center.
  • FIG. 17 is a diagram showing an example of a database structure relating to visual acuity measurement information managed by the storage means in the lens processing center.
  • FIG. 18 is a diagram showing an example of a database structure relating to a visual acuity table managed by the storage means in the lens processing center.
  • FIG. 19 is a diagram showing an example of a database structure for myopia information managed by the storage means in the lens processing center.
  • FIG. 20 is a diagram showing an example of a database structure related to hyperopia information, which is managed by the storage means in the lens processing center.
  • FIG. 21 is a diagram showing an example of a database structure relating to astigmatism information, which is managed by the storage means in the lens processing center.
  • FIG. 22 is a diagram showing the naked eye visual acuity measurement screen presented to the user.
  • FIG. 23 is a diagram showing a screen that displays a visual acuity chart to the user.
  • FIG. 24 is a diagram showing the results of visual acuity measurement.
  • FIG. 25 is a diagram showing a configuration example of a virtual experience system for wearing glasses.
  • FIG. 26 is a diagram showing an example of a database structure relating to user information managed by the storage means in the lens processing center.
  • FIG. 27 is a diagram showing an example of data input from the frame selection information input means in the lens processing center.
  • FIG. 28 is a diagram showing an example of a database structure relating to a frame function structure of each frame managed by the storage means in the lens processing center.
  • FIG. 29 is a diagram showing an example of a database structure relating to a frame decoration structure of each frame managed by the storage means in the lens processing center.
  • FIG. 30 is an illustrative view showing a measuring method on a side view of the face image.
  • FIG. 31 is an illustrative view showing a measuring method on a front view of the face image.
  • FIG. 32 is an illustrative view showing a method of adjusting a frame. BEST MODE FOR CARRYING OUT THE INVENTION
  • FIG. 1 is an illustrative view of a lens information transmitting device using an unmanned counter booth according to an embodiment of the present invention
  • FIG. 2 is an illustrative view showing an internal structure thereof.
  • The counter booth 1 of the unmanned lens information transmitting device serves as a terminal used when a contact lens purchaser places an order for sale via the network, and includes a user terminal 101 implemented by, for example, a personal computer, a power measuring device 102, and lens data input means 113 as input means for inputting data of a contact lens already in use.
  • The counter booth 1 further includes a visual acuity measuring device 103 for automatically measuring visual acuity in a state where the subject, who is the user, has removed his or her glasses; communication means 104 for processing the data input by the used-lens data input means 113 and the data measured by the visual acuity measuring device 103 and sending them to a manned lens processing center located elsewhere; output means 105 for giving instructions on lens power measurement by display and/or voice; and input means 106 for inputting instructions about the lens, for example when answering questions during measurement.
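The component arrangement just described can be illustrated with a small sketch. Purely as an assumption-laden illustration (the class, attribute, and URL names below are invented and do not appear in the patent), the user terminal 101 can be modelled as an object that ties the used-lens data entry and the measurement results together before they are handed to the communication means 104:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class UsedLensData:
    """Contact lens data copied from the package of a lens already in use."""
    sphere_power: float                 # e.g. -3.25 dioptres
    manufacturer: Optional[str] = None
    product_name: Optional[str] = None

@dataclass
class BoothTerminal:
    """User terminal 101: coordinates the devices installed in counter booth 1."""
    booth_id: str
    center_url: str                     # endpoint of the manned lens processing center (assumed)
    used_lens_data: Optional[UsedLensData] = None
    measurement: dict = field(default_factory=dict)

    def input_used_lens(self, data: UsedLensData) -> None:
        # Corresponds to the lens data input means 113 (touch panel entry).
        self.used_lens_data = data

    def record_measurement(self, results: dict) -> None:
        # Corresponds to data produced by the visual acuity measuring device 103.
        self.measurement.update(results)

booth = BoothTerminal(booth_id="counter-booth-1", center_url="https://lens-center.example")
booth.input_used_lens(UsedLensData(sphere_power=-2.50, manufacturer="ExampleCo"))
```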
  • Counter booth 1 also keeps a catalog of contact lenses and the like, but since the types that can be stocked are limited, only the minimum required items are kept on hand.
  • Instead, catalog information on lenses and the like transmitted from the lens processing center 2 is displayed on the output means 105 of the user terminal 101 so that the user can view it and make a selection.
  • The user terminal 101 is composed of a personal computer or the like and serves as the input/output device that acts as the interface with the user, that is, the purchaser. Specifically, the user terminal 101 is connected to the touch panel 107 and the digital camera 108 as the input means 106, and to the PC monitor 109, which is a liquid crystal or CRT display, the printer 112, and a speaker as the output means 105.
  • Although a touch panel 107 is prepared as the information input device for text data and the like, various other input devices such as a pointing device (a mouse, a trackball, or a joystick), a keyboard, or switches can also be used.
  • a digital camera 108 is prepared as an image information input means, but any device such as a TV camera, a video camera, or a digital still camera that can digitize and input image information may be used. If necessary, a device for providing an auxiliary light such as a flash may be provided. Further, as a storage device capable of storing images or other information, a hard disk and a CD-ROM drive are prepared, so that image information or programs stored in the CD can be used. Of course, it is also possible to configure a device for a storage medium such as DVD, MO, or a memory unit.
  • The user terminal 101 can be connected to the Internet as a computer network so that image information and other information such as software can be transmitted and received via the network.
  • The user terminal 101 has a WWW browser as an interface for communicating with the server of the lens processing center 2.
  • The WWW browser is realized as a program stored in the memory of the user terminal 101.
  • The used contact lens data input means 113 accepts the contact lens data written on the package through input from the touch panel 107, but an input method using separate voice input means, a scanner, or the like may also be adopted.
  • The visual acuity measuring device 103 is a device that automatically and subjectively measures the refractive power of both eyes; it automatically measures naked eye visual acuity and is configured to calculate the lens power required to correct it.
  • The visual acuity measuring device 103 is provided with a goggle-shaped face pad 103a so that the position of the eye to be examined is substantially fixed. A joystick 103b and a response switch 103c are also provided, and the subject can perform predetermined inputs by operating the joystick 103b and the response switch 103c or by touching the touch panel 107.
  • Distance and near measurements are performed as follows.
  • The user terminal 101 is configured to guide the subject with separate instructions while the subjective examination of the left and right eyes is performed.
  • the general specifications of the eyesight measuring device 103 are as follows.
  • This visual acuity measuring device 103 performs subjective optometry, which reflects objective measurement, binocular vision, and the intention of the user as the subject, and has an independent measuring optical system for each eye.
  • The pair of measuring optical system devices 103d, 103d includes a pair of optical heads 103e, 103e.
  • The image of the subject's eye entering each optical head is detected through a prism 103f, 103f, converted into an electric signal by the central processing unit 103j, which includes image processing means 103i, 103i, and sent to the user terminal 101.
  • The angle of each prism 103f can be changed with a solenoid 103g.
  • Each measuring optical system device 103d is configured to perform alignment by auto-alignment mechanisms 103h, 103h, and the test target is presented inside the device.
  • The visual acuity measuring device 103 includes a central processing unit 103j such as a CPU, and comprises visual acuity measurement information input means, image processing means 103i, visual acuity data creating means, and storage means.
  • The visual acuity measurement information input means is a means for registering data, including the criteria for measuring visual acuity, input from the joystick 103b or the like in connection with the visual acuity measurement.
  • The visual acuity measurement information input means registers and manages each item of data in the corresponding file of the visual acuity database used for measuring visual acuity.
  • The image processing means 103i is a means for converting the optical information output from the measuring optical system device 103d into an electric signal.
  • The visual acuity data creating means is a means that searches the databases managed by the storage means described below for visual acuity measurement data, such as the powers for myopia, hyperopia, and astigmatism, based on the input visual acuity measurement criteria, and creates a visual acuity measurement result including the data extracted by the search.
  • The visual acuity measuring device 103 is provided with storage means for recording and managing the following visual acuity measurement data file, visual acuity chart data file, myopia information data file, hyperopia information data file, and astigmatism information data file as a visual acuity database.
  • The visual acuity measurement data file stores data such as naked eye visual acuity, corrected visual acuity, interpupillary distance, distance correction power, near correction power, measurement date, and power determiner.
  • The visual acuity chart data file stores data related to the subjective test charts.
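As a rough illustration only, since the patent does not specify a concrete schema, the visual acuity measurement data file could be modelled as a relational table such as the following; the table and column names are assumptions derived from the fields listed above:

```python
import sqlite3

# Minimal sketch of the visual acuity measurement data file as a relational table.
# Table and column names are assumptions based on the fields listed in the description.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE visual_acuity_measurement (
        user_code            TEXT,
        naked_eye_acuity     REAL,   -- e.g. 0.3
        corrected_acuity     REAL,   -- e.g. 1.2
        interpupillary_mm    REAL,   -- P.D.
        distance_power_d     REAL,   -- distance correction power in dioptres
        near_power_d         REAL,   -- near correction power in dioptres
        measured_on          TEXT,   -- measurement date
        power_determiner     TEXT    -- who or what determined the power
    )
""")
conn.commit()
```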
  • Presbyopia check chart (cross-line chart: cross cylinder method)
  • In the myopia information data file, the degree of myopia, the relationship between myopia and visual acuity, the type of myopia (by power), and the treatment are registered and managed.
  • Myopia is an eye in which parallel rays entering the eye form an image at a point in front of the retina when the eye applies no accommodation (the far point lies at a finite distance in front of the eye).
  • Table 1 shows the relationship between myopia and visual acuity.
  • the types of myopia are as follows.
  • a moderate concave lens is worn as a treatment for myopia.
  • Hyperopia is an eye in which parallel rays entering the eye form an image at a point behind the retina when the eye applies no accommodation (the far point lies at a finite distance behind the eye).
  • The type of hyperopia is represented by its power, for example, as follows.
  • Astigmatism means that parallel rays entering the eye do not converge to a single point when the eye applies no accommodation.
  • Compound astigmatism (corrected by wearing a combination of a cylindrical lens and a spherical lens)
  • A correction lens based on the corrected visual acuity is set in the lens portion of the viewing window of the face pad 103a, and a red-green test or the like is performed. The corrective power required for making the lens is determined by asking the subject how the targets appear and making fine adjustments, taking into account the power of the lenses currently worn and the subject's wishes for the contact lenses or eyeglasses to be made.
  • The power measuring device 102 is a device that automatically measures the power of the glasses that the subject, who is the user, has removed and placed at the designated location.
  • A mounting table is formed on the power measuring device 102; when the glasses are placed on the mounting table, the lens is positioned at the optical center of the lens meter so that the lens meter can measure the power of the spectacle lens.
  • When a user enters counter booth 1, a sensor 110, such as a remote sensor provided at the entrance, detects the entry.
  • In response, the above-mentioned user terminal 101, power measuring device 102, lens data input means 113, visual acuity measuring device 103, communication means 104, output means 105, and input means 106 are all energized and turned on.
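A minimal sketch of this entry-driven power control, assuming a simple polling loop and hypothetical device handles (the function names and device list below are illustrative, not the patent's implementation):

```python
import time

# Hypothetical device handles energized on entry and switched off on exit.
DEVICES = [
    "user_terminal_101", "power_measuring_device_102", "lens_data_input_113",
    "visual_acuity_device_103", "communication_104", "output_105", "input_106",
]

def sensor_detects_presence() -> bool:
    """Stand-in for the entry/exit sensor 110; a real booth would poll hardware."""
    return False

def set_power(device: str, on: bool) -> None:
    print(f"{device}: {'ON' if on else 'OFF'}")

def booth_power_loop(poll_seconds: float = 0.5) -> None:
    occupied = False
    while True:
        present = sensor_detects_presence()
        if present and not occupied:       # user entered: energize every device
            for device in DEVICES:
                set_power(device, True)
            occupied = True
        elif not present and occupied:     # user left: switch everything off
            for device in DEVICES:
                set_power(device, False)
            occupied = False
        time.sleep(poll_seconds)
```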
  • The output means 105 then outputs a voice message, for example a word of welcome such as "Hello!", and instructs the user to "touch the touch panel in front of you."
  • When the user does so, the user terminal 101, power measuring device 102, lens data input means 113, visual acuity measuring device 103, communication means 104, and output means 105 become active.
  • The user terminal 101 receives an "initial screen" from the lens processing center 2, and its control means causes the output means 105 to display it on the touch panel 107.
  • The services provided in counter booth 1 are "visual acuity measurement", "lens selection", and "purchase of contact lenses".
  • When the user inserts the displayed fee into the fee entry slot 111 provided in the booth, the user terminal 101 proceeds to the visual acuity measurement service: various questions are asked on the touch panel 107 and the PC monitor 109 and answered by the user, so that the interview necessary for the visual acuity test is carried out, and the naked eye visual acuity is then measured.
  • If the user is a contact lens wearer, he or she uses the touch panel 107 to enter the contact lens data written on the package of the contact lenses currently in use.
  • The control means of the user terminal 101 causes the contact lens data entered on the touch panel 107 to be processed by the lens data input means 113 and sent to the lens processing center 2 by the communication means 104.
  • For the naked eye visual acuity measurement, the output means 105 instructs the user, via the PC monitor 109, by voice, and on the touch panel 107, to sit in front of the visual acuity measuring device 103.
  • the result can be printed out by the provided printer 112.
  • The data measured by the power measuring device 102, the data input by the lens data input means 113 for entering the data of the contact lenses already in use, and the data determined by the visual acuity measuring device 103, which automatically measures corrected visual acuity with the subject's glasses removed, are transmitted by the communication means 104 via a wide area computer network (the Internet) to the manned lens processing center 2 located elsewhere; in this way various information about the measured lens is transmitted to the center.
  • The user who has completed the "visual acuity measurement" step then moves on to "lens selection".
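Purely as an illustration, and assuming an HTTP/JSON transport that the patent does not specify (the URL and field names below are invented), the communication means 104 could package the three kinds of data and send them to the lens processing center like this:

```python
import json
import urllib.request

def send_measurement_to_center(measurement: dict,
                               center_url: str = "https://lens-center.example/api/measurements") -> int:
    """Send booth measurement data to the manned lens processing center.

    The URL is a placeholder; the patent only states that data are sent over a
    wide area computer network (the Internet) by the communication means 104.
    """
    body = json.dumps(measurement).encode("utf-8")
    request = urllib.request.Request(
        center_url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:   # requires connectivity
        return response.status

# Example payload combining the three data sources described above (values invented)
example_payload = {
    "booth_id": "counter-booth-1",
    "used_lens_power_d": -2.50,                 # from the power measuring device 102
    "package_data": {"sphere_power": -2.50},    # from the lens data input means 113
    "corrected_acuity": 1.2,                    # from the visual acuity measuring device 103
}
```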
  • Approximately three types of lenses reflecting the power measured in the eyesight measurement step are displayed on the screen of the PC monitor 109, and the user is asked to select the lens they want to purchase based on conditions such as price and thickness and other lens characteristics.
  • An image of the lens reflecting the power measured in the visual acuity measurement step is displayed on the screen of the PC monitor 109 so that lens thicknesses can be compared and the user can check the lens.
  • Communication with the lens processing center 2 is then cut off, and the user terminal 101, power measuring device 102, lens data input means 113, visual acuity measuring device 103, touch panel 107, digital camera 108, and PC monitor 109 enter screen save mode.
  • the lens processing center 2 includes electronic shop information processing means 21, display information generating means 22, lens order sales processing means 23, settlement processing means 24, and WWW server / CGI 25.
  • the lens processing center 2 specifically includes information processing devices such as a personal computer, a workstation, and a server.
  • The databases handled by the electronic shop information processing means 21 are stored in a storage device, such as a magnetic disk device or an optical disk device, of these information processing devices.
  • Each of the processing means described above, that is, the WWW server/CGI 25, the display information generating means 22, the lens order sales processing means 23, and the settlement processing means 24, is actually stored and executed in the form of a program in a memory included in the information processing device.
  • the electronic shop information processing means 21 defines the product data such as contact lenses handled by the lens processing center 2 using the product definition unit via the input / output device.
  • the product data defined here is stored in the product database as product data information.
  • The product data information includes text data such as the name of the product shelf on which products such as contact lenses are displayed, product numbers, product names, prices, product descriptions, and product management information, as well as product image data.
  • The lens processing center 2 also has an input/output device as an interface with the creator of the electronic catalog. The input/output device accepts, from the catalog creator, input of text data such as the product shelf name, product item, and price, and of product information such as image data representing the shape of the product.
  • As the order information for products purchased by a buyer, it outputs information including product information such as the product number and quantity, destination information for the product, and payment information such as the name of the external payment processing institution, the payment date, and the amount.
  • an information processing device such as a personal computer having a keyboard, a mouse, a CRT display and the like as an input / output device can be used.
  • the product definition section can be realized by a program stored and executed in the memory of such an information processing device.
  • the electronic shop information processing means 21 is provided with electronic shop opening information means including a shop database, a product database, and a basket database.
  • the shop database stores information for opening an electronic shop and information defining a shop layout for displaying product information.
  • the product database stores the defined product data information.
  • The basket database stores information on products that the user terminal 101 of counter booth 1 has ordered for purchase.
  • the electronic shop information processing means 21 realizes a function of storing the transferred product data information in a product database.
  • the display information generating means 22 generates display information such as an electronic catalog in response to a request from the user terminal 101 of the counter booth 1.
  • The display information generating means 22 includes parameter analysis means 221, file search means 222, and display data generating means 223.
  • The parameter analysis means 221 analyzes the visual acuity measurement data, frame selection information, and the like received from the user terminal 101 of the counter booth 1 via the WWW server/CGI 25 and extracts the parameters they contain.
  • The file search means 222 searches each database registered and stored by the electronic shop information processing means 21 based on the parameters extracted by the parameter analysis means 221.
  • The display data generating means 223 generates display data that can be displayed as a WWW page based on the data retrieved by the file search means 222. In other words, the display data generating means 223 functions as a so-called WWW page generator.
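A compact sketch of this three-stage pipeline, with in-memory dictionaries standing in for the shop and product databases; all names and sample records are illustrative assumptions rather than the patent's implementation:

```python
from urllib.parse import parse_qs

# Stand-in for the product database kept by the electronic shop
# information processing means 21 (contents are invented).
PRODUCT_DB = {"CL-001": {"name": "Daily contact lens", "price": 2500}}

def analyze_parameters(query_string: str) -> dict:
    """Parameter analysis means 221: pull parameters out of the request."""
    params = parse_qs(query_string)
    return {key: values[0] for key, values in params.items()}

def search_files(params: dict) -> dict:
    """File search means 222: look the requested product up in the database."""
    return PRODUCT_DB.get(params.get("product_id", ""), {})

def generate_display_data(product: dict) -> str:
    """Display data generating means 223: build a WWW page from the result."""
    if not product:
        return "<html><body>No such product.</body></html>"
    return f"<html><body>{product['name']}: {product['price']} yen</body></html>"

def handle_request(query_string: str) -> str:
    """End-to-end flow triggered by the WWW server/CGI 25."""
    return generate_display_data(search_files(analyze_parameters(query_string)))

print(handle_request("product_id=CL-001&customer_id=U123"))
```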
  • When the product to be purchased (a contact lens or the like) is determined at the user terminal 101 of the counter booth 1, the lens order sales processing means 23 receives the customer ID and the product ID of the product to be purchased from the display information generating means 22, retrieves the detailed information of that product from the product database based on this information, and stores the product information in the customer basket database for the target customer within the basket database. Thereafter, it obtains the list of products the target customer intends to purchase from the basket database and passes it to the display information generating means 22.
  • When the purchase of the product is confirmed at the user terminal 101 of the counter booth 1, the settlement processing means 24 receives the customer ID from the display information generating means 22 and takes out the product data information for that buyer from the basket database. It then requests the external payment processing institution 3 to perform settlement processing based on the extracted product data information.
  • On receiving notification from the external payment processing institution 3 that settlement has been completed, the settlement processing means 24 notifies the lens order sales processing means 23 and the electronic shop information processing means 21 that order processing has been completed, creates statement data to inform the user terminal 101 of the counter booth 1 of the completed purchase, and passes the data to the display information generating means 22.
  • The WWW (World Wide Web) server/CGI (Common Gateway Interface) 25 functions as the interface with the user terminal 101 of the counter booth 1: it receives display request information from the user terminal 101 and transfers display data to the user terminal 101 of the counter booth 1.
  • the external settlement processing institution 3 performs settlement processing of the ordered lens price on behalf of the lens processing center 2 based on the request sent from the settlement processing means 24 of the lens processing center 2.
  • The WWW server/CGI 25 receives the glasses order page information sent from the user terminal 101 of the counter booth 1, and activates the display information generating means 22.
  • The display information generating means 22 receives the lens order page information from the WWW server/CGI 25 and analyzes it with the parameter analysis means 221.
  • As the result of the analysis, the parameter analysis means 221 outputs information such as the shop ID identifying the electronic shop to be displayed, the type of background screen and catalog template of the electronic catalog, the product IDs of the products to be displayed, and the customer ID identifying the buyer.
  • The file search means 222 searches the shop database, the product database, and the basket database, and acquires the data necessary to create the display screen of the page that the user terminal 101 of the counter booth 1 has requested to display.
  • The display data generating means 223 first determines the type of request from the user terminal 101 of the counter booth 1. If the request is other than "determination of a product to be purchased" or "purchase of a product", the display data generating means 223 generates display data using the search result obtained by the file search means 222.
  • If, as a result of this determination, the type of request from the user terminal 101 of the counter booth 1 is "determination of the product to be purchased", the display data generating means 223 activates the lens order sales processing means 23.
  • The lens order sales processing means 23 receives, from the display data generating means 223, the customer ID and the product ID of the product the customer has designated for purchase. Using this product ID as key information, it obtains the detailed product data information for that product from the product database. The product data information thus obtained is then stored in the customer basket database, within the basket database, that is identified by the customer ID received from the display data generating means 223. If no corresponding customer basket database exists at this time, a customer basket database corresponding to the customer ID is created and the product information is stored in it. Further, all the product data information the customer has selected so far is taken out of the customer basket database and passed to the display data generating means 223.
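The basket handling just described can be summarised in a short sketch; the dictionaries and sample entries below are invented stand-ins for the product database and the basket database:

```python
# Invented stand-ins for the product database and the basket database.
PRODUCT_DB = {"CL-001": {"name": "Daily contact lens", "price": 2500}}
BASKET_DB = {}   # customer ID -> list of product data records (the customer basket databases)

def add_to_basket(customer_id: str, product_id: str) -> list:
    """Store the chosen product for the customer and return everything selected so far,
    mirroring the role of the lens order sales processing means 23."""
    product = PRODUCT_DB[product_id]                  # detailed product data information
    basket = BASKET_DB.setdefault(customer_id, [])    # create the customer basket if missing
    basket.append({"product_id": product_id, **product})
    return basket                                     # handed back for display and confirmation

print(add_to_basket("U123", "CL-001"))
```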
  • The display data generating means 223 creates display information listing the products the customer is planning to purchase from the product data information received from the lens order sales processing means 23 and transfers it to the user terminal 101 of the counter booth 1. Based on the information displayed at this time, the customer can confirm the products to be purchased and cancel some or all of them.
  • the display data generating means 223 activates the settlement processing means 24 before generating the display data.
  • When activated, the settlement processing means 24 receives the customer ID from the display data generating means 223. Using the received customer ID as a key, it searches the basket database for the product data information of the purchased products stored in the customer basket database identified by that customer ID, and requests the external payment processing institution 3 to perform settlement based on the product data information obtained by the search. In response to this request, the external payment processing institution 3 executes the settlement on behalf of the lens processing center 2 and notifies the lens processing center 2 when it is completed. Since the settlement processing performed by the external payment processing institution 3 is conventional, its detailed description is omitted here.
  • Upon receiving notification from the external payment processing institution 3 that settlement has been completed, the settlement processing means 24 transfers to the lens processing center 2 the order information, which includes information on the ordered products such as the product numbers and order quantities, destination information indicating where the products are to be delivered, and payment information such as the name of the external payment processing institution 3 that handled the settlement, the payment date, and the amount. At the lens processing center 2, the order information received via the WWW server/CGI 25 is displayed on the input/output device.
  • For confirmation, counter booth 1 prints out the entered data on the printer 112 in two sheets (one is a receipt, and the other is a slip for receiving the merchandise).
  • Payment for the merchandise is made only by credit card or at the store where the unmanned counter booth 1 is installed.
  • the settlement processing means 24 prepares the statement data notifying that the settlement processing has been completed, and passes it to the display data generation means 223.
  • The display data generating means 223 generates a display screen notifying the completion of the settlement processing using the received statement data, and transfers the display screen to the user terminal 101 of the counter booth 1.
  • Orders for lenses and frames are placed based on the data sent to the server of the lens processing center 2, and processing is carried out at the logistics center as soon as the goods arrive.
  • The processed goods are delivered to the location specified above and handed over in exchange for the card brought by the user.
  • The processing flow of the glasses order sales system via the network will now be described with reference to FIGS. 5 to 11.
  • the user authentication screen is a screen that prompts for input of user authentication information.
  • the user terminal 101 of the counter booth 1 receives and displays the user authentication screen, inputs the user authentication information, and transmits it to the lens processing center 2.
  • User authentication information is information such as passwords and user IDs.
  • At the lens processing center 2, the user authentication information is received, and based on it the database management means of the lens order sales processing means 23 searches the purchaser information database for authentication.
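A minimal sketch of such an authentication lookup, assuming a purchaser-information table keyed by user ID (the hashing scheme and field names are assumptions and are not specified in the patent):

```python
import hashlib

# Stand-in for the purchaser information database (contents invented).
PURCHASER_DB = {
    "U123": {"password_sha256": hashlib.sha256(b"secret").hexdigest(), "name": "Taro"},
}

def authenticate(user_id: str, password: str) -> bool:
    """Check the received user authentication information against the database."""
    record = PURCHASER_DB.get(user_id)
    if record is None:
        return False
    return record["password_sha256"] == hashlib.sha256(password.encode()).hexdigest()

print(authenticate("U123", "secret"))   # True for the invented example record
```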
  • the lens processing center 2 transmits a basic attribute input screen for inputting the basic attributes of the purchaser to the user terminal 101 of the counter booth 1.
  • The purchaser inputs basic attributes such as address, name, date of birth, and telephone number, the condition of his or her eyes (for example, difficulty seeing at close range), and requests regarding the glasses.
  • The input data are transmitted from the user terminal 101 of the counter booth 1 to the lens processing center 2, where the customer database and the like are searched based on them.
  • If, as a result of the search, the purchaser is determined to be an existing customer, the process proceeds to step 2 shown in FIG. 7, and the visual acuity measurement data managed in the lens processing center 2 are extracted.
  • An inquiry screen asking whether it is acceptable to make the glasses based on the previous data is sent to the user terminal 101 of the counter booth 1.
  • If the same frame and the same lens as before are acceptable, the purchaser checks the button on the inquiry screen of the touch panel 107, and the response is sent from the user terminal 101 of the counter booth 1 to the lens processing center 2.
  • At the lens processing center 2, the stock of the selected frame is confirmed by the electronic shop information processing means 21.
  • The process then proceeds to the frame selection step and the visual acuity measurement step and/or lens selection step described later.
  • The purchaser selects check buttons on the touch panel 107 for the "frame selection step", the "visual acuity measurement step", and the "lens selection step", and the purchaser's intention is transmitted from the user terminal 101 of the counter booth 1 to the lens processing center 2.
  • When the customer database or the like is searched using the basic attributes input by the purchaser and it is determined that the purchaser is not an existing customer, a prescription confirmation screen asking whether the purchaser has a prescription is transmitted from the lens processing center 2 to the user terminal 101 of the counter booth 1.
  • On the prescription confirmation screen sent to the user terminal 101 of the counter booth 1, the purchaser selects the "YES" check button on the touch panel 107 if he or she has a doctor's prescription, and the "NO" check button if not.
  • If the purchaser has a doctor's prescription, that is, if "YES" is selected, the process proceeds to step 3 shown in FIG. 8, and an inquiry screen asking whether to read the prescription with a scanner or to enter it as text data is sent from the lens processing center 2 to the user terminal 101 of the counter booth 1.
  • The purchaser either inputs the data of the doctor's prescription or reads the prescription with a scanner (not shown), and selects the send check button on the touch panel 107; the entered data or the image data read by the scanner are then transmitted to the lens processing center 2.
  • the lens processing center 2 sequentially transmits the data to the user terminal 101 of the counter booth 1 in order to shift to the frame selection step and / or the lens selection step.
  • The lens processing center 2 sends an inquiry screen asking whether the purchaser is 45 years of age or older.
  • If so, presbyopia is presumed, and an inquiry screen asking whether or not to order presbyopia glasses is transmitted to the user terminal 101 of the counter booth 1. If the purchaser wishes to order such glasses, he or she selects the "YES" check button on the touch panel 107, the process proceeds to step 4' shown in FIG. 10, and the lens processing center 2 then sequentially transmits data to the user terminal 101 of the counter booth 1 in order to proceed to the frame selection step and the lens selection step.
  • In step 4', data are transmitted to the user terminal 101 of the counter booth 1 in order to shift to the frame selection step and the lens selection step. In this case, since presbyopia is presumed from the purchaser's age, an additional step is included for choosing between reading glasses and bifocals.
  • Alternatively, the power that can be estimated from the purchaser's age is determined, and the procedure shifts to a ready-made reading glasses order system (step 5 in FIG. 11) for providing reading glasses easily.
  • The process then proceeds to step 4 shown in FIG. 9 in order to process the lens.
  • the process proceeds to a frame selecting step and a lens selecting step or a visual acuity measuring step.
  • the lens selection step will be described below.
  • When the purchaser decides that the most recent visual acuity data, as input by the lens data input means 113 and the power measuring device 102, may be used as they are, he or she checks the corresponding button on the touch panel 107.
  • Likewise, when the purchaser decides that a lens can be made based on the doctor's prescription data, he or she selects the "lens selection by prescription" check button on the touch panel 107.
  • The lens selection means 26 then selects a lens based on the respective data.
  • The visual acuity measuring device 103 instructs the user to proceed to the remote visual acuity measurement step.
  • Questions related to contact lens selection are displayed on the touch panel 107 screen, and the lens is selected.
  • After the type of lens is selected, about three types of contact lenses are introduced, and the user selects one by manufacturer, type, price, color, wearing period of disposable lenses (1 DAY / 1 WEEK), and so on. Once the lens to be purchased is determined, its image and price are displayed again on the PC monitor 109 screen and the user is asked to confirm.
  • FIG. 14 is a diagram showing an example of the system configuration of the remote visual acuity measurement system.
  • This remote visual acuity measurement system is composed of the user terminal 101 of the counter booth 1, the power measuring device 102, the visual acuity measuring device 103, the lens data input means 113, and the hardware of a lens processing center 1002. These are physically connected by a network.
  • the network connecting the user terminal 101 of the counter booth 1 and the lens processing center 1002 is the Internet.
  • This remote visual acuity measurement system comprises the lens processing center 1002, which extracts visual acuity measurement data, such as the powers for myopia, hyperopia, and astigmatism, based on the visual acuity measurement reference data input from the user terminal 101 of the counter booth 1 (the power measuring device 102, the visual acuity measuring device 103, and the lens data input means 113), and which includes output means for outputting a visual acuity measurement result containing the extracted data.
  • The lens processing center 1002 includes a visual acuity measurement server comprising user information registration means 1003, visual acuity measurement information input means 1004, storage means 1005, image processing means 1006, voice processing means 1007, visual acuity data creating means 1008, control means 1009 for controlling the user information registration means 1003, the visual acuity measurement information input means 1004, the storage means 1005, the image processing means 1006, the voice processing means 1007, and the visual acuity data creating means 1008, and a WWW server 1010.
  • Each of these means and the WWW server 1010 is stored and executed in the form of a program in a memory of the information processing device.
  • The databases managed by the storage means 1005 are stored in a storage device such as a magnetic disk device or an optical disk device.
  • The lens processing center 1002 is connected to the user terminal 101 of the counter booth 1 via the wide area computer network (the Internet).
  • The storage means 1005 manages the information collected from the user terminal 101 of the counter booth 1 by the user information registration means 1003, the visual acuity measurement information input means 1004, the image processing means 1006, and the voice processing means 1007, together with the information created by the lens processing center 1002, as a user information database, a visual acuity measurement reference database, a visual acuity measurement database, a visual acuity chart database, a myopia information database, a hyperopia information database, and an astigmatism information database, and has a function of browsing the information stored in the storage device in these databases.
  • The control means 1009 also has extraction means for extracting data based on a specific condition and transmission means for transmitting certain information to the user terminal 101 of the counter booth 1.
  • The user information registration means 1003 is a means for collecting data relating to the user, that is, the person requesting visual acuity measurement, for example basic attributes such as address, name, date of birth, and telephone number, requests regarding glasses, and data for identifying the user such as a user identifier (ID), user password, and user code, and for registering and managing them in the user information database.
  • Data necessary for identifying and contacting users, such as fax numbers, e-mail addresses, and URLs, and data about the user's computer environment are also registered as user data.
  • The visual acuity measurement information input means 1004 is a means for registering the data relating to visual acuity measurement sent from the user terminal 101 of the counter booth 1, and registers and manages each item of data in the visual acuity measurement database.
  • The image processing means 1006 is a means for transmitting images relating to visual acuity to the user terminal 101 of the counter booth 1 for display, and for receiving and processing images from the user terminal 101 of the counter booth 1.
  • The voice processing means 1007 is a means that sends voice messages to the user in accordance with the screens sent to the user terminal 101 of the counter booth 1, and that interprets the voice transmitted from the user terminal 101 of the counter booth 1 and registers and manages it as data.
  • The visual acuity data creating means 1008 is a means that searches for visual acuity measurement data, such as the powers for myopia, hyperopia, and astigmatism, based on the data input from the user terminal 101 of the counter booth 1, and creates visual acuity measurement results including the data extracted by the search.
  • The WWW server 1010 has WWW server means for constructing a homepage that the user terminal 101 uses as an interface for accessing the control means 1009 of the lens processing center 1002.
  • The WWW server 1010 has user authentication means that uses a password and an identifier (ID) to authenticate whether a user making a registration or browsing request for the databases managed by the storage means 1005 is an authorized user.
  • this eyesight measurement system is realized on a network such as the Internet (wide area computer network) using a homepage or the like.
  • The lens processing center 1002 sets up a homepage on the Internet through the WWW server 1010.
  • The user accesses the homepage of the lens processing center 1002 as an interface from the user terminal 101 of the counter booth 1, which is connected to the wide area computer network, using the WWW browser or the like, accesses the user information registration means 1003, and requests visual acuity measurement.
  • The user is authenticated as a properly registered user by the user authentication means of the WWW server 1010 using the user authentication information, namely the user password and/or the user identifier (ID).
  • The user information registration means 1003 of the lens processing center 1002 registers, via the wide area computer network, the information that the user has requested to register.
  • The user also registers a password and/or a user member identifier (ID) or the like, and the user information registration means 1003 writes the information sent from the user via the wide area computer network into the user information database and manages it there.
  • FIGS. 15 to 21 show examples of the structure of each database managed by the storage means 1005 in the lens processing center 1002.
  • The user information database stores user data including information for identifying the user, such as a user code, a user identifier (ID), and a user password, and user attributes including basic attributes such as address, name, date of birth, and telephone number.
  • These pieces of user information are the data entered on the user information registration screen transmitted to the user terminal 101 of the counter booth 1 by the user information registration means 1003.
  • The user identifier (ID) and password may be determined at the service center based on user information obtained offline, or may be automatically assigned at the time of the user's first access.
  • The visual acuity measurement reference database stores data such as the purpose of use, age, previous power, binocular vision at the previous power, left-right balance at the previous power, years of use of the previous glasses, type of contact lens (when used together), desired corrected visual acuity, and the presence or absence of eye-related diseases.
  • The visual acuity measurement database stores data such as naked eye visual acuity, corrected visual acuity, interpupillary distance, distance correction power, near correction power, measurement date, and power determiner.
  • The visual acuity chart database stores data indicating the relationship between visual acuity and the Landolt ring.
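The patent does not state the numerical relation, but in standard decimal notation a visual acuity of V corresponds to a Landolt ring whose gap subtends 1/V minutes of arc at the viewing distance (and whose outer diameter is five times the gap). As an illustrative sketch under that standard assumption, the chart database could derive on-screen sizes as follows:

```python
import math

def landolt_gap_mm(acuity: float, distance_m: float) -> float:
    """Gap width of a Landolt ring for a given decimal acuity and viewing distance.

    Standard relation (not stated in the patent): an acuity of V corresponds to a
    gap subtending 1/V arcminutes; the overall ring diameter is 5 times the gap.
    """
    gap_arcmin = 1.0 / acuity
    gap_rad = math.radians(gap_arcmin / 60.0)
    return 2 * distance_m * math.tan(gap_rad / 2) * 1000.0   # metres -> millimetres

# Example: the 1.0 chart viewed from 0.5 m (a plausible booth screen distance)
print(round(landolt_gap_mm(1.0, 0.5), 3))   # about 0.145 mm gap; ring diameter about 0.73 mm
```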
  • In the myopia information database, the degree of myopia, the relationship between myopia and visual acuity, the type of myopia (by power), and the treatment are registered and managed.
  • Myopia is an eye in which parallel rays entering the eye form an image at a point in front of the retina when the eye applies no accommodation (the far point lies at a finite distance in front of the eye).
  • Table 1 shows the relationship between myopia and visual acuity.
  • the types of myopia are as follows.
  • a moderate concave lens is worn as a treatment for myopia.
  • Hyperopia is an eye in which parallel rays entering the eye form an image at a point behind the retina when the eye applies no accommodation (the far point lies at a finite distance behind the eye).
  • The type of hyperopia is represented by its power, for example, as follows.
  • Mild hyperopia (up to +4D), moderate hyperopia (+4D to +7D), and strong hyperopia (over +7D). An appropriate convex lens is worn to correct hyperopia.
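The three classes listed above map directly onto a small classification helper; this sketch only restates the thresholds given in the text, and the function name and boundary handling are assumptions:

```python
def classify_hyperopia(power_d: float) -> str:
    """Classify hyperopia by its corrective power in dioptres.

    Thresholds follow the text: mild up to +4D, moderate +4D to +7D,
    strong above +7D (boundary handling is an assumption).
    """
    if power_d <= 0:
        return "not hyperopic"
    if power_d <= 4.0:
        return "mild hyperopia"
    if power_d <= 7.0:
        return "moderate hyperopia"
    return "strong hyperopia"

print(classify_hyperopia(5.25))   # -> "moderate hyperopia"
```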
  • In the astigmatism information database, the degree of astigmatism, the type of astigmatism, and the treatment method are registered and managed.
  • Astigmatism means that parallel rays entering the eye do not converge to a single point when the eye applies no accommodation.
  • Compound astigmatism (corrected by wearing a combination of a cylindrical lens and a spherical lens)
  • The user covers one eye with a hand and looks at the naked eye visual acuity measurement screen (FIG. 22) with the other eye.
  • The naked eye visual acuity measurement screen (FIG. 22) shows the point to be gazed at with that eye.
  • The visual acuity measuring device 103 determines the distance to the naked eye visual acuity measurement screen (FIG. 22) and displays a visual acuity chart whose size corresponds to a visual acuity of 1.0.
  • The visual acuity measuring device 103 asks the user, by a message on the touch panel 107 screen or by voice, "Can you see where the ring is open?" If the user can see it, he or she touches the "YES" button on the touch panel 107 (or answers by voice). The visual acuity measuring device 103 then sends an inquiry screen to the touch panel 107 asking "Where is the opening?", and the user indicates the direction in which the Landolt ring is open by selecting one of eight directions, "up, down, left, right, upper left, lower left, upper right, lower right", with the buttons on the touch panel 107 (or answers by voice). If the direction is correct, a 1.2 visual acuity chart is displayed and the same procedure is repeated.
  • The visual acuity measuring device 103 determines, as the naked eye visual acuity, the visual acuity level answered correctly immediately before the level at which two consecutive mistakes were made.
  • the visual acuity measuring device 103 inputs naked eye visual acuity data.
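The stepwise procedure described above (start at 1.0, advance while the direction of the Landolt ring opening is answered correctly, and take the last correctly answered level before two consecutive mistakes as the naked eye visual acuity) can be sketched as follows; the acuity ladder beyond 1.2 and the simulated answer function are assumptions added for illustration:

```python
import random
from typing import Optional

LEVELS = [1.0, 1.2, 1.5, 2.0]        # acuity ladder; only 1.0 -> 1.2 is stated in the text
DIRECTIONS = ["up", "down", "left", "right",
              "upper left", "lower left", "upper right", "lower right"]

def ask_direction(shown: str) -> str:
    """Stand-in for the touch panel / voice answer; simulates a user who is
    usually, but not always, correct."""
    return shown if random.random() < 0.9 else random.choice(DIRECTIONS)

def measure_naked_eye_acuity() -> Optional[float]:
    last_correct = None
    consecutive_misses = 0
    level_index = 0
    while level_index < len(LEVELS) and consecutive_misses < 2:
        shown = random.choice(DIRECTIONS)      # direction of the Landolt ring opening
        if ask_direction(shown) == shown:
            last_correct = LEVELS[level_index] # remember the last correctly read level
            consecutive_misses = 0
            level_index += 1                   # show the next, smaller chart
        else:
            consecutive_misses += 1            # two misses in a row end the test
    return last_correct                        # None if even the 1.0 chart was missed

print(measure_naked_eye_acuity())
```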
  • Based on the input naked eye visual acuity data, the lens processing center 1002 transmits to the user terminal 101 corrected visual acuity measurement data converted, power by power, so that the corrected visual acuity with a lens would be 1.2, and these are displayed on the visual acuity measuring device 103. That is, a "Landolt ring that should just be visible" is displayed on the corrected visual acuity measurement screen of the visual acuity measuring device 103. With one eye covered, the user views the Landolt ring displayed on the screen of the visual acuity measuring device 103, and the power at which a visibility of about 1.2 is obtained is measured.
  • The visual acuity measuring device 103 determines that power as the corrected visual acuity power.
  • The visual acuity measuring device 103 also displays a visual acuity chart showing Landolt rings whose openings face different directions, for example directly upward, 90 degrees to the left, and 90 degrees to the right, and the presence of astigmatism and the axis of astigmatism can be judged from how the Landolt rings appear on the corrected visual acuity measurement screen.
  • the frequency determination points during astigmatism measurement are as follows.
  • the visual acuity measuring device 103 displays the corrected visual acuity measurement screen to which the spherical power (S PH) is added, and performs the same test again.
  • the visual acuity measuring device 103 does not have astigmatism by pressing the check button ⁇ not connected '' on the corrected visual acuity measuring screen of the touch panel 107. to decide (2) Furthermore, the visual acuity measuring device 103 displays a radial index image, and the visual acuity measuring device 103 determines the positions of the darkest and lightest lines on the corrected visual acuity measurement screen. By instructing the position with the joystick 103b, the visual acuity measuring device 103 specifies the axis of astigmatism (AXIS).
  • AXIS: axis of astigmatism
  • The presbyopic (near addition) power can be determined from the user's age.
  • The procedure for determining the near power is as follows.
  • On the questionnaire screen of the touch panel 107, sent from the lens processing center 1002 to the user terminal 101 of the counter booth 1, the user enters occupation, use of glasses, hobbies, sports, presence of illness, and so on.
  • the power under the condition (1) is determined in advance based on the user database of the lens processing center 1002.
  • The presence of hyperopia and the type (power) of hyperopia are extracted from the hyperopia information database and displayed.
  • The astigmatism information database is referenced on the basis of the degree of astigmatism, the relation between the appearance of the Landolt rings and the power, and the axis of astigmatism.
  • the visual acuity measurement result of the visual acuity measuring device 103 is transmitted to and displayed on the touch panel 107, for example, as shown in FIG. 24.
  • DIST represents the distance power.
  • READ represents the near power.
  • SPH represents the spherical power.
  • CYL represents the astigmatic power.
  • AXIS represents the astigmatic axis.
  • P.D. represents the distance from the center of the right eye to the center of the left eye, that is, the interpupillary distance.
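  • For illustration, the measurement result built from the abbreviations above could be held in a record such as the following Python sketch; the field names, types, and example values are our own, not defined by the patent.

```python
from dataclasses import dataclass


@dataclass
class AcuityMeasurementResult:
    """Measurement result using the labels listed above (illustrative only)."""
    dist: float    # DIST - distance power, in diopters
    read: float    # READ - near power, in diopters
    sph: float     # SPH  - spherical power, in diopters
    cyl: float     # CYL  - astigmatic power, in diopters
    axis: int      # AXIS - astigmatic axis, in degrees
    pd_mm: float   # P.D. - interpupillary distance, in millimetres


result = AcuityMeasurementResult(dist=-2.50, read=-1.50, sph=-2.50,
                                 cyl=-0.75, axis=90, pd_mm=62.0)
print(result)
```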
  • any person can remotely measure naked eye visual acuity or corrected visual acuity through the Internet.
  • The lens processing center 1002 may be integrated with the lens processing center 2 so that the processing is performed centrally by means having the same functions.
  • the processing center 1002 and the lens processing center 2 may be configured to perform processing in a distributed manner by a plurality of computers, servers, and the like.
  • Lenses are registered in various databases, and the lens selection means 26 of the lens processing center 2, on the basis of the most recent visual acuity data, the doctor's prescription, or the data measured by the remote visual acuity measurement system, sends to the user terminal 101 of the counter booth 1 a lens selection screen displaying lenses that match the customer's wishes entered and transmitted from the purchaser client 1 and/or lenses recommended for the customer by the lens processing center 2. If the user is already a customer, previously purchased lenses are also displayed on the lens selection screen.
  • The choices for the lens include manufacturer name, model, application, lens function (lens thickness, lens weight, durability, UV cut), color, price, power, and so on. The customer looks at the options, selects the desired lens, enters the desired lens purchase on the lens selection screen, and sends it to the lens processing center 2.
  • Glasses order and sales processing is then carried out by the lens selection means 26, the glasses order/sales processing means 23, and the settlement processing means 24 in that order (an illustrative order payload is sketched below).
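  • The patent does not define a message format for the order, only which items the customer chooses. As a purely illustrative sketch, the selection could be assembled into a payload like the following before being sent to the lens processing center 2; every field name, value, and the idea of a JSON body are assumptions.

```python
import json

# Hypothetical order payload assembled from the lens selection screen.
lens_order = {
    "customer_id": "A-0001",                      # hypothetical identifier
    "manufacturer": "ExampleOptics",              # hypothetical maker
    "model": "EO-150",                            # hypothetical model name
    "application": "distance",
    "functions": {"thickness": "thin", "weight": "light",
                  "durability": "high", "uv_cut": True},
    "color": "clear",
    "price_jpy": 12000,
    "power": {"sph": -2.50, "cyl": -0.75, "axis": 90},
}

# Body that would be sent to the lens processing center for order processing.
print(json.dumps(lens_order, indent=2))
```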
  • When data on the functional and decorative aspects of the frame already exist in the lens processing center 2, for example because the user is an existing customer, a frame can be selected using the registered data, Flash content, images, designs, and the like. The selection of a frame in the case where the functional data and decorative data of the frame are held in the lens processing center 2 is therefore described below.
  • The frames are registered as a database in the lens processing center 2, and a frame selection screen displaying representative frames is sent by the frame selection means 27 to the user terminal 101 of the counter booth 1.
  • The customer responds on the frame selection screen to questionnaire inquiries about fashion, materials, design, budget, and so on; based on the data indicating the customer's intention, the frame selection means 27 of the lens processing center 2 selects the frame determined to be optimal, and the lens processing center 2 again sends a frame selection screen to the user terminal 101 of the counter booth 1.
  • the previously purchased frame is also displayed on the frame selection screen.
  • the options for the frame include fashion, material, design, price, etc.
  • The customer looks at the options, selects the desired frame, enters the purchase of the desired frame on the frame selection screen, and sends it to the lens processing center 2.
  • The frame selection means 27 then instructs the user to proceed to the next step, the virtual experience of wearing glasses.
  • FIG. 25 is a diagram showing a configuration example of a virtual experience system for wearing glasses.
  • This eyeglass wearing virtual experience system is a system that allows various eyeglass frames to be worn on the image of the user's face.
  • This system consists of the user terminal 101 of the counter booth 1 and the lens processing center 2002.
  • The following description assumes that the network connecting the user terminal 101 of the counter booth 1 and the lens processing center 2002 is the Internet.
  • The lens processing center 2002 has user information registration means 2003, frame selection information input means 2004, storage means 2005, frame information registration means 2060, frame image registration means 2061, frame selection means 2008, image processing means 2007, and output means 2009, together with control means for controlling these means, and further includes a server including a WWW server.
  • As the lens processing center 2002, an information processing device such as a personal computer, a workstation, or a server may be used.
  • The lens processing center 2002 is connected to the user terminal 101 of the counter booth 1 via a wide-area computer network (the Internet).
  • Internet: wide-area computer network
  • The server includes a WWW server that constructs a home page used as an interface for access from the user terminal 101 of the counter booth 1, the control means 2010, and the like.
  • The WWW server has user authentication means that checks, by a password and/or an identifier (ID), whether a user who makes a registration or browsing request from the user terminal 101 of the counter booth 1 to the database managed by the database management means 2005 is an authorized user (a minimal check is sketched below).
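  • A minimal sketch of such a password/ID check follows. The in-memory user table and the SHA-256 hashing are our assumptions; the patent only states that the WWW server authenticates users by password and/or identifier.

```python
import hashlib

# Illustrative in-memory user table; a real center would use its user database.
USER_DB = {"user-0001": hashlib.sha256(b"secret-password").hexdigest()}


def authenticate(user_id: str, password: str) -> bool:
    """Return True only if the ID is registered and the password matches."""
    stored = USER_DB.get(user_id)
    supplied = hashlib.sha256(password.encode()).hexdigest()
    return stored is not None and stored == supplied


print(authenticate("user-0001", "secret-password"))  # True
print(authenticate("user-0001", "wrong-password"))   # False
```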
  • Input means 2006, such as a keyboard, is used to input data on each of the frames that can be provided by the lens processing center 2002.
  • The text data related to the frame functional structure data and the frame decorative structure data are registered and managed.
  • The frame image registration means 2061 of the input means 2006 of the lens processing center 2002 inputs images of the frames that the lens processing center 2002 can provide, and the input frame images are registered and managed in the lens processing center 2002.
  • The user information registration means 2003 of the lens processing center 2002 registers and manages user information such as the face image transmitted from the user terminal 101 of the counter booth 1.
  • The storage means 2005 stores the face image of the user input by the user information registration means 2003 and the frame images input by the frame image registration means 2061 of the input means 2006.
  • The frame selection means 2008 of the lens processing center 2002 is used to select a frame desired by the user in accordance with the data managed by the storage means 2005, that is, the functional structure data, the decorative structure data, and the face image data.
  • A suitable frame is selected from the frame functional structures, frame decorative structures, and frame images of the frames stored in the storage means 2005 by the frame information registration means 2060, and the frame selection means is configured to be able to generate or select a frame image displaying several different types of spectacle frames.
  • The image processing means 2007 of the lens processing center 2002 is configured to synthesize the eyeglass frame image selected by the frame selection means 2008 with the face image data managed by the storage means 2005 and to output the resulting eyeglass-wearing image.
  • The computer at the core of the lens processing center 2002 comprises the following.
  • User information registration means 2003, which first receives from the keyboard a command to operate the eyeglass-wearing virtual experience system and which can accept data such as the user's personal information and display parameters of the visual-field image, as well as selection instructions.
  • Frame image registration means 2061, which can accept the input of digitized image data from a computer.
  • Image processing means 2007, which can perform image processing in accordance with the input data and generate, select, and output an appropriate virtual eyeglass-wearing image.
  • Storage means, which stores and manages the software for the eyeglass-wearing virtual experience system, image information, and samples of visual-field images that can be selected and displayed.
  • The visual-field image generated or selected by the image processing means 2007 is output from the output means 2009 to the PC monitor 109 of the user terminal 101 of the counter booth 1 and displayed.
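  • The patent does not specify how the frame image and the face image are synthesized. As an illustrative stand-in, the following Python sketch uses the Pillow library to paste a transparent frame image onto a face photograph centred between the pupils; the scaling rule and the assumption that the pupil coordinates are already known are ours.

```python
from PIL import Image  # Pillow


def synthesize_wearing_image(face_path: str, frame_path: str,
                             left_pupil: tuple[int, int],
                             right_pupil: tuple[int, int],
                             frame_to_pd_ratio: float = 2.0) -> Image.Image:
    """Paste a transparent (RGBA) frame image onto a face photo, centred
    between the pupils. The scaling rule (frame width = ratio x pupillary
    distance in pixels) is an assumption made for this sketch.
    """
    face = Image.open(face_path).convert("RGBA")
    frame = Image.open(frame_path).convert("RGBA")

    pd_px = right_pupil[0] - left_pupil[0]                  # pupil distance in pixels
    target_w = int(pd_px * frame_to_pd_ratio)
    target_h = int(frame.height * target_w / frame.width)   # keep aspect ratio
    frame = frame.resize((target_w, target_h))

    centre_x = (left_pupil[0] + right_pupil[0]) // 2
    centre_y = (left_pupil[1] + right_pupil[1]) // 2
    top_left = (centre_x - target_w // 2, centre_y - target_h // 2)
    face.alpha_composite(frame, dest=top_left)              # overlay with transparency
    return face
```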
  • the lens processing center 2 sets up a homepage on the Internet using a WWW server.
  • Using, for example, the WWW browser of the user terminal 101 of the counter booth 1 connected to the wide-area computer network, the user can access the frame selection information with the home page of the lens processing center 2002 as the interface.
  • The user accesses the frame selection information input means 2004 and requests registration of frame selection reference data.
  • The lens processing center 2002 uses the user authentication means of the WWW server to verify, on the basis of user authentication information such as the user's password and/or user identifier (ID), that the user has been properly registered.
  • The frame selection information input means 2004 of the lens processing center 2002 writes the selection reference data whose registration the user requested via the wide-area computer network into the storage means 2005 and manages it.
  • the lens processing center 2002 transmits an input screen of the user basic attributes to the user terminal 101 of the counter booth 1.
  • The user enters, on the basic attribute input screen, the user's basic attributes such as name, address, date of birth, telephone number, and eye condition (such as difficulty seeing at near distance), together with requests regarding the glasses.
  • Frame selection criteria, for example fashion, budget, function, and fit to the face, are also entered on the user basic attribute input screen sent from the lens processing center 2002.
  • The lens processing center 2002 stores the basic attributes of the user, the frame selection criteria, and so on in the databases whose structures are shown in FIGS. 26 to 29, managed by the storage means 2005.
  • The face image input by the digital camera 108, which serves as the image input device of the user terminal 101 of the counter booth 1, is also sent to the lens processing center 2002.
  • When the image is taken with the image input device, a ruler or the like is positioned below the face, and the face image is captured together with the ruler so that real dimensions can later be derived from the image (a conversion sketch follows below).
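  • Capturing a ruler of known length together with the face gives the scale needed to turn pixel measurements into millimetres. A minimal sketch, assuming a 100 mm reference length on the ruler (the reference length and example numbers are illustrative, not from the patent):

```python
def mm_per_pixel(ruler_length_px: float, ruler_length_mm: float = 100.0) -> float:
    """Scale factor derived from the ruler captured together with the face."""
    return ruler_length_mm / ruler_length_px


def pixels_to_mm(distance_px: float, scale: float) -> float:
    """Convert a distance measured on the face image to millimetres."""
    return distance_px * scale


scale = mm_per_pixel(ruler_length_px=400.0)   # the ruler spans 400 px in the photo
print(pixels_to_mm(248.0, scale))             # e.g. a 248 px pupil distance -> 62.0 mm
```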
  • A frame selection reference is created in the lens processing center 2002.
  • The functional structure data of the frame are obtained, and the functional structure data and the decorative structure data are created, stored, and managed by the storage means 2005.
  • The functional structure data of the frame are determined from, for example, the distance between the left and right pupils, the width from the center between the left and right pupils to each ear, the opening angle of the temples, the distance from the ear to the corneal apex, the position where the temple bends, the distance between the corneal apex and the nose, the opening angle of the nose pads (nose clings) determined from these values, and the budget.
  • The decorative structure data of the frame are created mainly from the text data (selection criteria such as fashionability and fit to the face) transmitted from the user terminal 101 of the counter booth 1 and received by the frame selection information input means 2004 of the lens processing center 2002, and are stored and managed by the storage means 2005.
  • The frame shapes include Wellington, Lloyd, Oval, Square, Tonneau, Boston, Butterfly, and Drop (Auto).
  • The materials and constructions include rimless (two-point, three-point), metal Nylor, cell (plastic) Nylor, metal, cell (plastic), browline, combination, and others. Brands include various brands.
  • The frame images input by the input means 2006 and the frame image registration means 2061 of the lens processing center 2002 are registered and managed.
  • The size is the actual size (44 mm to 62 mm), and the features include shape-memory alloy, super-light weight, super-elasticity, combined use as sunglasses, portability, and so on.
  • The opening angle of the nose pads (nose clings) is determined on the basis of the distance from the ear to the corneal apex, the position where the temple bends, and the distance between the corneal apex and the nose.
  • The frame decorative structure data of each frame include the shape: Wellington, Lloyd, Oval, Square, Tonneau, Boston, Butterfly, and Drop (Auto).
  • The materials and constructions include rimless (two-point, three-point), metal Nylor, cell Nylor, metal, cell, browline, combination, and others. Brands include various brands. Colors include various colors.
  • The face image transmitted from the user terminal 101 of the counter booth 1 is registered and managed by the user information registration means 2003 of the lens processing center 2002.
  • The frame images input by the frame image registration means 2061 of the lens processing center 2002 are registered and managed.
  • The face image of the user input by the user information registration means 2003 and the frame images input by the frame image registration means 2061 of the input means 2006 are stored by the storage means 2005.
  • The frame selection means 2008 of the lens processing center 2002 selects the frame desired by the user in accordance with the data managed by the storage means 2005, that is, the functional structure data, the decorative structure data, and the face image data: a suitable frame is selected from the frame functional structures, frame decorative structures, and frame images of the frames stored in the storage means 2005 by the frame information registration means 2060, and a frame image displaying several different types of eyeglass frames can be generated or selected.
  • The image processing means 2007 of the lens processing center 2002 is configured to synthesize the eyeglass frame image selected by the frame selection means 2008 with the face image data managed by the storage means 2005 and to output the resulting eyeglass-wearing image.
  • The eyeglass-wearing image synthesized by the image processing means 2007 can be output by the WWW server to the user terminal 101 of the counter booth 1 via the Internet.
  • the user authentication screen is a screen that prompts for input of user authentication information.
  • The user terminal 101 of the counter booth 1 receives and displays the user authentication screen; the user inputs the user authentication information, which is transmitted to the lens processing center 2002.
  • The user authentication information is information such as a password and a user ID.
  • The lens processing center 2002 receives the user authentication information and, on that basis, the storage means 2005 and the user information registration means 2003 search the user information database to authenticate the user.
  • a screen for inputting basic attributes is further transmitted from the lens processing center 2002 to the user terminal 101 of the counter booth 1.
  • The user enters basic attributes such as name, address, date of birth, and telephone number.
  • The lens processing center 2002 receives the user's basic attribute information and, on that basis, creates an entry in the user information database via the storage means 2005 and the user information registration means 2003, and registers the password, user ID, and so on.
  • The lens processing center 2002 transmits a frame selection criterion input screen to the user terminal 101 of the counter booth 1 so that the user can select a frame.
  • the frame selection criterion input screen is used to input the criterion (fashion, budget, function, fit to face, etc.) for the user to select a frame.
  • the user inputs frame selection criteria such as fashion, budget, function, and fit to the face on the frame selection reference input screen of the user terminal 101 of the counter booth 1.
  • A screen prompting transmission of the user's face image is transmitted from the lens processing center 2002 to the user terminal 101 of the counter booth 1.
  • the user captures the front and side (both left and right) face images into the user terminal 101 of the counter booth 1 using an image input device, for example, a digital camera or a scanner.
  • Image input device: for example, a digital camera or a scanner
  • The front and side face images of the user are transmitted from the user terminal 101 of the counter booth 1 to the lens processing center 2002 via the Internet.
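  • The patent states only that the images are sent over the Internet; the transport is not specified. As an illustrative sketch, the upload could be done as a multipart HTTP POST; the URL, form-field names, and user identifier below are hypothetical.

```python
import requests

# Hypothetical multipart upload of the captured face images.
files = {
    "front": open("face_front.jpg", "rb"),
    "left": open("face_left.jpg", "rb"),
    "right": open("face_right.jpg", "rb"),
}
response = requests.post(
    "https://lens-center.example.com/api/face-images",   # hypothetical endpoint
    files=files,
    data={"user_id": "user-0001"},                        # hypothetical identifier
    timeout=30,
)
print(response.status_code)
```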
  • The frame selection text data and the image data (the image of the user's face) sent from the user terminal 101 of the counter booth 1 are received by the frame selection information input means 2004 and are registered and managed by the storage means 2005.
  • The distance (L2) between the corneal apex of the user's eye and the nose is measured, and the average of the left and right values is registered and managed by the storage means 2005.
  • L2 is typically 12 mm.
  • The opening angle of the nose clings (nose pads) is determined and registered by the frame selection information input means 2004 on the basis of the measured values.
  • The width to the ear (L3) is measured separately for the left and right sides, with the center between the pupils of the left and right eyes as the base point, and is registered and managed by the storage means 2005.
  • the opening angle of the temple is determined separately for the left and right sides based on the measured values by the frame selection information input means 2004 and registered.
  • the interpupillary distance (PD) is first determined.
  • Since the pupil itself cannot always be detected in the captured image, the interpupillary distance (PD) is obtained by calculation, for example from the distance between the left edge of the left eye and the left edge of the right eye.
  • Likewise, the distance between the pupil and the ear (L4) is obtained by calculation: for the left eye, the distance (La) from the ear to the right edge of the eye and the distance (Lb) to the left edge are used to calculate the pupil-to-ear distance (L4), and the same is done for the right eye.
  • The opening angle θ of the right and left temples of the eyeglass frame is adjusted by correcting the value obtained from the following equation and bending the temples accordingly.
  • L5 represents the frame front size of the spectacle frame (cf. FIG. 32). If a bifocal lens is specified, the angle of the nose clings is corrected on the basis of the addition power, and is determined and registered so as to add a further 5 degrees to the lens surface inclination angle. As described above, in the lens processing center 2002, the central processing unit and the frame selection information input means 2004 calculate and create the functional structure data, the decorative structure data, and the face image data, which are stored together with the other data by the storage means 2005. (An illustrative stand-in for the temple-angle calculation is sketched below.)
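  • The equation referred to above is not reproduced in this extract, so it cannot be restated here. Purely as an illustrative stand-in, the following sketch assumes a simple geometric model in which each temple must spread from half the frame front width (L5/2) out to the half head width at the ear (L3) over the front-to-ear distance; this model and the example numbers are our assumptions, not the patent's formula.

```python
import math


def temple_opening_angle_deg(half_head_width_mm: float, frame_front_mm: float,
                             front_to_ear_mm: float) -> float:
    """Assumed geometric model (not the patent's equation): the temple spreads
    outward from half the frame front width (L5 / 2) to the half head width at
    the ear (L3) over the front-to-ear distance, so
        theta = atan((L3 - L5 / 2) / front_to_ear_distance).
    """
    spread = half_head_width_mm - frame_front_mm / 2.0
    return math.degrees(math.atan2(spread, front_to_ear_mm))


# Example numbers (illustrative only): L3 = 78 mm, L5 = 140 mm, 95 mm to the ear.
print(round(temple_opening_angle_deg(78.0, 140.0, 95.0), 1))   # about 4.8 degrees
```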
  • The frame functional structure, frame decorative structure, and frame image of each frame are input and stored in advance by the frame information registration means 2060 and the frame image registration means 2061.
  • From the frame functional structures, frame decorative structures, and frame images of the frames registered in the storage means 2005, a frame that conforms to the functional structure data, decorative structure data, and face image data is selected on the basis of the frame selection criteria transmitted from the user terminal 101 of the counter booth 1.
  • The image processing means 2007 of the lens processing center 2002 aligns the frame image of the selected frame with the user's face image and generates an eyeglass-wearing image by synthesizing the two.
  • The eyeglass-wearing image generated by the image processing means 2007 is sent by the output means 2009 and the WWW server to the user terminal 101 of the counter booth 1 via the Internet.
  • The user who views the image transmitted to the user terminal 101 of the counter booth 1 can confirm on the glasses-wearing screen whether a frame matching his or her wishes has been selected and how his or her face will look with that frame on. If the transmitted image shows a frame different from the desired one, or if the user wants to see the face with another frame, the user enters that request on the glasses-wearing screen transmitted from the lens processing center 2002 and sends it back to the lens processing center 2002.
  • In this way, the user can try various eyeglass frames on his or her photograph data through a network such as the Internet at a counter booth near work or home, change to any eyeglass frame of his or her choice, and select the frame best suited to his or her taste.
  • With this system and method, the user can keep wearing his or her existing glasses or contact lenses and, with his or her own corrected eyesight, select the optimal eyeglass frame while the selected frame is shown placed on the face.
  • The lens processing centers 1002 and 2002 may be integrated with the lens processing center 2 in a single computer or server, or the processing may be performed in a distributed manner by a plurality of computers, servers, and the like.

Industrial applicability

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Optics & Photonics (AREA)
  • Eye Examination Apparatus (AREA)
  • Eyeglasses (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to an unmanned lens information transmitting method and apparatus which communicate information about a lens, while retaining a face-to-face sales element, at a convenient location near the user's workplace or home, and which allow the user to purchase a lens as needed. The unmanned lens information transmitting apparatus, designed to transmit information about a lens from a given space into which a person can enter, comprises a visual acuity measuring device (103) which measures the user's corrected visual acuity with the lens removed; a communication device (104) which processes the data collected by the visual acuity measuring device (103) and transmits the processed data to a manned lens processing center; an output device (105) which displays and/or indicates by voice an instruction relating to the lens; and an input device (106) for entering that instruction.
PCT/JP2001/005203 2000-06-23 2001-06-19 Procede et appareil d'emission d'informations automatiques concernant des lentilles WO2001097682A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001264320A AU2001264320A1 (en) 2000-06-23 2001-06-19 Unmanned lens information transmitting method and apparatus

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2000-189519 2000-06-23
JP2000189519 2000-06-23
JP2000201299A JP2002078681A (ja) 2000-06-23 2000-07-03 無人レンズ情報発信方法およびその装置
JP2000-201299 2000-07-03

Publications (1)

Publication Number Publication Date
WO2001097682A1 true WO2001097682A1 (fr) 2001-12-27

Family

ID=26594554

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2001/005203 WO2001097682A1 (fr) 2000-06-23 2001-06-19 Procede et appareil d'emission d'informations automatiques concernant des lentilles

Country Status (3)

Country Link
JP (1) JP2002078681A (fr)
AU (1) AU2001264320A1 (fr)
WO (1) WO2001097682A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6848822B2 (en) 2002-05-31 2005-02-01 3M Innovative Properties Company Light guide within recessed housing

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4515605B2 (ja) * 2000-06-29 2010-08-04 株式会社トプコン 眼屈折力測定装置
JP4188118B2 (ja) * 2003-03-20 2008-11-26 株式会社トプコン 検眼装置
US8182091B2 (en) 2004-07-28 2012-05-22 Solohealth, Inc. Automated vision screening apparatus and method
US7614747B2 (en) * 2004-07-28 2009-11-10 Solohealth, Inc. Automated vision screening apparatus and method
US8313828B2 (en) 2008-08-20 2012-11-20 Johnson & Johnson Vision Care, Inc. Ophthalmic lens precursor and lens
US8318055B2 (en) 2007-08-21 2012-11-27 Johnson & Johnson Vision Care, Inc. Methods for formation of an ophthalmic lens precursor and lens
US8317505B2 (en) 2007-08-21 2012-11-27 Johnson & Johnson Vision Care, Inc. Apparatus for formation of an ophthalmic lens precursor and lens
US9417464B2 (en) 2008-08-20 2016-08-16 Johnson & Johnson Vision Care, Inc. Method and apparatus of forming a translating multifocal contact lens having a lower-lid contact surface
CN102272803A (zh) 2008-12-31 2011-12-07 庄臣及庄臣视力保护公司 用于分配眼科镜片的设备和方法
US8240849B2 (en) 2009-03-31 2012-08-14 Johnson & Johnson Vision Care, Inc. Free form lens with refractive index variations
US8807076B2 (en) 2010-03-12 2014-08-19 Johnson & Johnson Vision Care, Inc. Apparatus for vapor phase processing ophthalmic devices
JP6413062B2 (ja) * 2014-07-18 2018-10-31 東海光学株式会社 近視矯正を必要としない人のためのサングラス用のレンズの設計方法
US9645412B2 (en) 2014-11-05 2017-05-09 Johnson & Johnson Vision Care Inc. Customized lens device and method
JP6563786B2 (ja) 2015-11-10 2019-08-21 株式会社トプコン 眼科検査システム
US10359643B2 (en) 2015-12-18 2019-07-23 Johnson & Johnson Vision Care, Inc. Methods for incorporating lens features and lenses having such features
JP6660750B2 (ja) 2016-02-01 2020-03-11 株式会社トプコン 眼科検査システム
JP7433614B1 (ja) 2022-10-28 2024-02-20 株式会社Linc’well サーバ装置、方法及びプログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03253968A (ja) * 1990-03-03 1991-11-13 Fujitsu Ltd 医療事務システムの入力即時チェック方法
JPH08123544A (ja) * 1994-10-21 1996-05-17 Matsushita Electric Ind Co Ltd 制御装置
JPH10334326A (ja) * 1997-06-04 1998-12-18 Syst Consulting Service Kk 情報付き自動販売機
JPH11167589A (ja) * 1997-09-30 1999-06-22 Seed Co Ltd 眼鏡自動選定装置、眼鏡自動選定装置を用いた眼鏡販売システム、及び、眼鏡の自動選定のためのプログラムを記録した記録媒体
JPH11338905A (ja) * 1998-05-27 1999-12-10 Hoya Corp 眼鏡装用シミュレーションにおける合成画像作成方法

Also Published As

Publication number Publication date
AU2001264320A1 (en) 2002-01-02
JP2002078681A (ja) 2002-03-19

Similar Documents

Publication Publication Date Title
WO2002042969A1 (fr) Systeme de commande/vente de lunettes sur reseau et procede correspondant
JP5648299B2 (ja) 眼鏡販売システム、レンズ企業端末、フレーム企業端末、眼鏡販売方法、および眼鏡販売プログラム
US11457807B2 (en) System and method for enabling customers to obtain refraction specifications and purchase eyeglasses or contact lenses
CN100462048C (zh) 眼镜选定***及其方法
US5983201A (en) System and method enabling shopping from home for fitted eyeglass frames
US20130141694A1 (en) Systems and methods for enabling customers to obtain refraction specifications for and purchase of eyeglasses or contact lenses
WO2001097682A1 (fr) Procede et appareil d'emission d'informations automatiques concernant des lentilles
US20040004633A1 (en) Web-based system and method for ordering and fitting prescription lens eyewear
US20130231941A1 (en) System and method for automated optical dispensing
WO2001097683A1 (fr) Dispositif telecommande de traitement d'informations relatives a des lunettes et procede associe
US20010042028A1 (en) Method and system for eyeglass ordering on a network
JP2001350982A (ja) ネットワークによるメガネオーダー販売システムおよびその方法
JP3572581B2 (ja) 眼鏡販売システム
JP2002078679A (ja) 無人メガネ情報発信装置およびその方法
KR101909660B1 (ko) 안경알 자판기 서비스 제공 방법
US20050029021A1 (en) Cold-flame propulsion system
US20020171806A1 (en) Optical measurement device
TWI223764B (en) System for determining level of magnification of eyeglasses and contact lenses and method thereof
JP2001183611A (ja) 眼鏡販売管理システム
JP2006350684A (ja) メガネの販売システム

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase