WO2024029359A1 - Ophthalmological Device - Google Patents

Ophthalmological device

Info

Publication number
WO2024029359A1
WO2024029359A1 (application PCT/JP2023/026594)
Authority
WO
WIPO (PCT)
Prior art keywords
input
gesture input
eye
examiner
control unit
Prior art date
Application number
PCT/JP2023/026594
Other languages
English (en)
Japanese (ja)
Inventor
健 大宮
Original Assignee
株式会社トプコン (Topcon Corporation)
Priority date
Filing date
Publication date
Application filed by 株式会社トプコン (Topcon Corporation)
Publication of WO2024029359A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/028Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters

Definitions

  • the present disclosure relates to an ophthalmological device.
  • An ophthalmological apparatus is known that includes a touch panel having a detection surface capable of detecting touch operations and that acquires information about a subject's eye in response to operations on the touch panel (see, for example, Patent Document 1).
  • The ophthalmologic apparatus described in Patent Document 1 includes a strip-shaped first operation area extending along the right edge of the touch panel's detection surface from its upper end to its lower end, and a strip-shaped second operation area extending from the upper end to the lower end along the left edge of the detection surface; the value of an optometry parameter can be changed in accordance with a drag operation on the first operation area or the second operation area.
  • The ophthalmological apparatus described in Patent Document 1 thus offers improved operability: the examiner can operate the touch panel while checking the subject's facial expression and the condition of the subject's eye, without visually checking the touch panel screen. Even so, it is desirable to develop a technique that can acquire information about the eye to be examined more efficiently, with simpler operations.
  • The present disclosure has been made in view of the above circumstances and aims to provide an ophthalmologic apparatus that can acquire information about the eye to be examined more efficiently with simpler operations.
  • To this end, an ophthalmologic apparatus of the present disclosure includes an information acquisition unit used to acquire information about the eye to be examined; a display unit having a touch panel display screen on which an operation screen for operating the information acquisition unit is displayed; and a control unit that controls the information acquisition unit based on input information including touch operations on the display screen.
  • The control unit has a normal input mode, in which the information acquisition unit is controlled based on input information entered on the operation screen, and a gesture input mode, in which the information acquisition unit is controlled based on input information entered by gesture input on a gesture input area provided on the display screen.
  • FIG. 1 is a perspective view showing the overall configuration of an ophthalmologic apparatus according to a first embodiment.
  • FIG. 2 is a block diagram showing the configuration of a control system of the ophthalmologic apparatus according to the first embodiment.
  • FIG. 3A is a diagram showing a detailed configuration of the right eye measurement optical system of the ophthalmologic apparatus according to the first embodiment.
  • FIG. 3B is a diagram schematically showing a cross-sectional view of the field lens of FIG. 3A.
  • FIG. 3C is a diagram schematically showing a cross-sectional view of the conical prism of FIG. 3A.
  • FIG. 4 is a diagram showing an example of an operation screen displayed on the display unit of the ophthalmologic apparatus according to the first embodiment.
  • FIG. 7 is an explanatory diagram for explaining an example and a modified example of gesture input to the gesture input area of the display unit.
  • FIG. 3 is an explanatory diagram for explaining an example of gesture input to a gesture input area of a display unit.
  • FIG. 3 is an explanatory diagram for explaining an example of gesture input to a gesture input area of a display unit.
  • FIG. 7 is an explanatory diagram for explaining a modified example of gesture input to the gesture input area of the display unit.
  • FIG. 7 is an explanatory diagram for explaining a modified example of gesture input to the gesture input area of the display unit.
  • FIG. 7 is an explanatory diagram for explaining a modified example of gesture input to the gesture input area of the display unit.
  • FIG. 7 is an explanatory diagram for explaining a modified example of gesture input to the gesture input area of the display unit.
  • A flowchart showing an example of the operation of the ophthalmologic apparatus according to the first embodiment.
  • FIG. 7 is an explanatory diagram for explaining gesture input by the examiner controller used in the ophthalmologic apparatus according to the second embodiment.
  • A perspective view showing the overall configuration of an ophthalmologic apparatus according to the third embodiment.
  • FIG. 7 is an explanatory diagram for explaining an example of a subjective test operation screen displayed on the display unit of the ophthalmologic apparatus according to the third embodiment.
  • The ophthalmologic apparatus 100 of the first embodiment is a binocular open type ophthalmologic apparatus that can measure the characteristics of the subject's eyes E simultaneously with both eyes open. Note that the ophthalmological apparatus 100 of this embodiment can also examine one eye at a time by shielding the other eye or turning off its fixation target. Further, the ophthalmological apparatus is not limited to a binocular open type; the present disclosure can also be applied to an ophthalmological apparatus that measures the characteristics of each eye separately.
  • the ophthalmological apparatus 100 of the first embodiment is an apparatus that performs arbitrary subjective tests, and can also perform objective tests.
  • the subjective test the ophthalmological apparatus 100 presents an optotype or the like to the subject at a predetermined presentation position, and obtains a test result based on the subject's response to the optotype or the like.
  • the subjective tests include subjective refraction measurements such as distance vision tests, intermediate vision tests, near vision tests, contrast tests, night vision tests, glare tests, pinhole tests, stereoscopic vision tests, and visual field tests.
  • In the objective test, the ophthalmological apparatus 100 irradiates the eye E with light and measures information (eye characteristics) regarding the eye E based on the detection result of the returned light.
  • The objective test includes measurement for acquiring characteristics of the eye E to be examined and photographing for acquiring an image of the eye E to be examined. Examples of objective tests include objective refraction measurement (REF measurement), corneal shape measurement (keratometry), intraocular pressure measurement, fundus photography, and tomographic imaging (OCT imaging) and measurement using optical coherence tomography (hereinafter "OCT").
  • As shown in FIG. 1 or 2, the ophthalmologic apparatus 100 of this embodiment mainly includes a main body 10, an examiner controller 27, and a subject controller 28.
  • The main body 10 includes a base 11, an optometry table 12, a support 13, an arm 14, a drive mechanism (drive section) 15, a pair of measurement heads 16, a forehead rest 17, and a control section 26.
  • The ophthalmological apparatus 100 acquires information about the eye E of the subject facing the optometry table 12 with the subject's forehead placed on the forehead rest 17 provided between the two measurement heads 16.
  • The X, Y, and Z axes are defined as follows: viewed from the subject, the left-right direction is the X direction, the up-down direction (vertical direction) is the Y direction, and the direction perpendicular to both the X direction and the Y direction (the depth direction of the measurement head 16) is the Z direction.
  • The optometry table 12 is a desk on which the examiner controller 27, the subject controller 28, and items used for optometry are placed, and is supported by the base 11.
  • the optometry table 12 may be supported by the base 11 so that its position (height position) in the Y direction can be adjusted.
  • the support column 13 is supported by the base 11 so as to extend in the Y direction at the rear end of the optometry table 12, and is provided with an arm 14 at its tip.
  • the arm 14 suspends both measurement heads 16 on the optometry table 12 via a drive mechanism 15, and extends from the support 13 toward the front in the Z direction.
  • the arm 14 is movable in the Y direction relative to the support column 13. Note that the arm 14 may be movable in the X direction and the Z direction with respect to the support column 13.
  • a pair of measurement heads 16 are supported by being suspended by a pair of drive mechanisms 15.
  • The measurement heads 16 are provided as a pair corresponding individually to the subject's left and right eyes E; in the following, when described individually, they are referred to as the left eye measurement head 16L and the right eye measurement head 16R.
  • the left eye measurement head 16L acquires information about the left eye E of the subject
  • the right eye measurement head 16R acquires information about the right eye E of the subject.
  • the left eye measurement head 16L and the right eye measurement head 16R are configured to be plane symmetrical with respect to a vertical plane located between them in the X direction.
  • Each measurement head 16 is provided with a mirror 18 (18L, 18R) that is a deflection member, and information of the corresponding eye E is acquired through the mirror 18 by a measurement optical system 21, which will be described later.
  • Each measurement head 16 is provided with a measurement optical system 21 (individually, the right eye measurement optical system 21R and the left eye measurement optical system 21L) that acquires eye information of the eye E to be examined.
  • The detailed configuration of the measurement optical system 21 will be described later.
  • Both measurement heads 16 are movably suspended by a drive mechanism 15 provided on a base suspended from the tip of the arm 14.
  • the drive mechanism 15 includes a left eye drive mechanism 15L that corresponds to the left eye measurement head 16L, and a right eye drive mechanism 15R that corresponds to the right eye measurement head 16R.
  • The left eye drive mechanism 15L includes a left eye vertical drive section 22L, a left eye horizontal drive section 23L, a left eye X-direction rotation drive section (left eye horizontal rotation drive section) 24L, and a left eye Y-direction rotation drive section (left eye vertical rotation drive section) 25L.
  • The right eye drive mechanism 15R includes a right eye vertical drive section 22R, a right eye horizontal drive section 23R, a right eye X-direction rotation drive section (right eye horizontal rotation drive section) 24R, and a right eye Y-direction rotation drive section (right eye vertical rotation drive section) 25R.
  • The configuration of the drive units corresponding to the left eye measurement head 16L and that of the drive units corresponding to the right eye measurement head 16R are plane symmetrical with respect to a vertical plane located between the two in the X direction. In the following, unless described individually, they may simply be referred to as the vertical drive section 22, the horizontal drive section 23, the X-direction rotation drive section 24, and the Y-direction rotation drive section 25. The same applies to other components provided symmetrically in the left-right direction.
  • the drive mechanism 15 has a configuration in which a vertical drive section 22, a horizontal drive section 23, an X-direction rotation drive section 24, and a Y-direction rotation drive section 25 are arranged in this order from the upper side.
  • The vertical drive section 22 is fixed to the base section at the tip of the arm 14 and, based on a control signal from the control section 26, moves the horizontal drive section 23, the X-direction rotation drive section 24, and the Y-direction rotation drive section 25 in the Y direction (vertical direction) relative to the arm 14.
  • The horizontal drive section 23 is fixed to the vertical drive section 22 and, based on a control signal from the control section 26, moves the X-direction rotation drive section 24 and the Y-direction rotation drive section 25 in the X direction and the Z direction (horizontal directions) relative to the vertical drive section 22.
  • The vertical drive section 22 and the horizontal drive section 23 each include an actuator that generates a driving force, such as a pulse motor, and a transmission mechanism that transmits the driving force, such as a combination of gears or a rack and pinion.
  • The horizontal drive unit 23 can be constructed simply by providing, for example, separate combinations of actuators and transmission mechanisms for the X direction and the Z direction, which also makes horizontal movement easy to control.
  • The X-direction rotation drive section 24 is connected to the horizontal drive section 23. Based on a control signal from the control unit 26, the X-direction rotation drive unit 24 rotates the corresponding measurement head 16 and Y-direction rotation drive unit 25, relative to the horizontal drive unit 23, in the X direction (horizontal direction) about a pair of left and right vertical eyeball rotation axes that extend in the vertical direction (Y direction) through the eyeball rotation point O of the corresponding eye E (see FIG. 3A).
  • the Y direction rotation drive section 25 is connected to the X direction rotation drive section 24.
  • a measuring head 16 is suspended from this Y-direction rotation driving section 25 .
  • Based on a control signal from the control unit 26, the Y-direction rotation drive unit 25 rotates the corresponding measurement head 16, relative to the X-direction rotation drive unit 24, in the Y direction (vertical, up-and-down direction) about a pair of left and right horizontal eyeball rotation axes extending in the horizontal direction (X direction).
  • The X-direction rotation drive unit 24 and the Y-direction rotation drive unit 25 may each have a configuration in which, for example, a transmission mechanism that receives a driving force from an actuator moves along an arcuate guide groove. By aligning the center of curvature of each guide groove with the pair of horizontal eyeball rotation axes or the pair of vertical eyeball rotation axes, the X-direction rotation drive unit 24 and the Y-direction rotation drive unit 25 can rotate the measurement head 16 about the pair of horizontal eyeball rotation axes and the pair of vertical eyeball rotation axes.
  • The X-direction rotation drive unit 24 may also be configured to support the Y-direction rotation drive unit 25 and the measurement head 16 rotatably around a rotation axis provided therein, and to rotate the measurement head 16 while changing the supporting position in cooperation with the horizontal drive unit 23.
  • Similarly, the Y-direction rotation drive section 25 may be configured to support the measurement head 16 rotatably around a rotation axis provided therein, and to rotate the measurement head 16 while changing its supporting position in cooperation with the vertical drive section 22.
  • In this way, the drive mechanism 15 can move each measurement head 16, individually or in conjunction with the other, in the X, Y, and Z directions, and can rotate each measurement head 16 in the X direction and the Y direction about the vertical and horizontal eyeball rotation axes of the eye E to be examined.
  • Each of the drive units 22 to 25 of the drive mechanism 15 is driven to move and rotate each measurement head 16 in response to a control signal from the control unit 26.
  • The examiner, who is the operator, can also manually drive each of the drive units 22 to 25 to move and rotate each measurement head 16.
  • By rotating the left eye measurement head 16L and the right eye measurement head 16R in the X direction (horizontal direction), the left eye X-direction rotation drive section 24L and the right eye X-direction rotation drive section 24R can cause the eyes E to be examined to diverge (divergence movement) or converge (convergence movement).
  • By rotating the left eye measurement head 16L and the right eye measurement head 16R in the Y direction (vertical direction), the left eye Y-direction rotation drive unit 25L and the right eye Y-direction rotation drive unit 25R can direct the line of sight of the eyes E to be examined downward or return it to its original position.
  • In this way, the subject's divergence and convergence movements can be tested, and various characteristics of both eyes E can be measured by performing binocular tests at various test distances, from a distance test at the far point distance to a near vision test at the near point distance.
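The convergence described above follows simple geometry: for a fixation point straight ahead, each measurement head rotates inward by the angle subtended by half the pupillary distance at the test distance. A minimal sketch of that relationship (the function name and the symmetric-geometry assumption are illustrative; the patent gives no formula):

```python
import math

def convergence_angle_deg(pupillary_distance_mm: float, test_distance_mm: float) -> float:
    # Inward rotation of each measurement head so that both optical axes
    # meet at a fixation point straight ahead at the given test distance.
    half_pd = pupillary_distance_mm / 2.0
    return math.degrees(math.atan2(half_pd, test_distance_mm))

# Far test (5 m) versus near test (400 mm) for a 64 mm pupillary distance:
far_deg = convergence_angle_deg(64.0, 5000.0)   # small angle, nearly parallel axes
near_deg = convergence_angle_deg(64.0, 400.0)   # noticeably converged axes
```

The near-point angle is roughly an order of magnitude larger than the far-point angle, which is why the X-direction rotation drive sections are needed for near vision tests.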
  • the examiner controller 27 is a device used by the examiner, who is the operator, to operate the ophthalmological apparatus 100.
  • the examiner controller 27 is an information processing device including a computer having a CPU, a storage device, and the like.
  • The examiner controller 27 of the first embodiment is a tablet terminal. Note that the examiner controller 27 is not limited to a tablet terminal; it may be a smartphone or other portable information terminal, a notebook personal computer, a desktop personal computer, or a dedicated controller.
  • the examiner controller 27 is configured to be portable. The examiner may operate the examiner controller 27 while it is placed on the optometry table 12, or may hold it in his/her hand.
  • the examiner controller 27 includes a display section (display panel) 30 consisting of a touch panel display.
  • the display unit 30 includes a display screen 30a on which images and the like are displayed, and a touch panel type input unit 30b arranged to overlap with the display screen 30a.
  • the display unit 30 itself is an input unit, and the display screen 30a of the display unit 30 functions as an input unit 30b that accepts input operations including touch operations from the examiner.
  • the input unit 30b also functions as a detection surface that detects a touch operation by an examiner's finger, stylus, or the like.
  • the examiner controller 27 is capable of short-range communication with the control unit 26 using communication means such as short-range wireless.
  • Based on display control signals sent from the control unit 26, the examiner controller 27 displays predetermined screens and images on the display screen 30a, such as the operator's operation screen (for example, the subjective test operation screen 40 in FIG. 4) and the anterior eye segment image E' obtained by the image sensor 159 of the measurement optical system 21, which will be described later.
  • the examiner controller 27 also accepts an operation input such as a touch operation by the examiner on the display screen 30a (input section 30b), and sends input information (control signal) corresponding to this operation input to the control section 26.
  • Next, touch operations performed on the display screen 30a (input section 30b) will be described.
  • Here, a touch operation is described as an operation performed by touching the display screen 30a (input section 30b), but it may also be an operation performed at a position near the display screen 30a (within a predetermined distance from it) without actually touching it.
  • the touch operation includes, for example, a pinch-in operation, a pinch-out operation, a flick operation, a tap operation, a drag operation, a swipe operation, etc. on the display screen 30a (input unit 30b).
  • the pinch-in operation is an operation in which the examiner brings two fingers close to each other while touching the display screen 30a.
  • the pinch-out operation is an operation in which the examiner moves two fingers apart while touching the display screen 30a.
  • The flick operation is an operation in which the examiner quickly sweeps (flicks) a finger across the display screen 30a.
  • the tap operation is an operation in which the examiner taps the display screen 30a.
  • The drag operation is an operation in which the examiner, while touching the display screen 30a, moves the touch point to another position before releasing.
  • The swipe operation is an operation in which the examiner slides a finger along the display screen 30a.
  • The detection unit of the input unit 30b detects a touch operation using a detection method such as a capacitance method, a resistive film method, or a surface acoustic wave method.
  • The above-mentioned touch operation may be a touch operation performed with one finger (hereinafter sometimes referred to as a "single touch operation") or a touch operation performed with multiple fingers, from two to ten (hereinafter sometimes referred to as a "multi-touch operation").
  • the display screen 30a (input section 30b) has a gesture input area 30c.
  • This gesture input area 30c may be at least a part of the display screen 30a (input section 30b).
  • The gesture input area 30c may be a half area of the display screen 30a, such as the upper half, lower half, right half, or left half, or an area wider than half; more preferably, it covers the entire display screen 30a.
  • Here, "the entire surface" of the display screen 30a need not be strictly the entire surface; substantially the entire surface suffices.
  • Alternatively, an icon area for displaying predetermined icons to be operated visually may be provided near an edge (at least one of the left edge, right edge, top edge, and bottom edge) of the display screen 30a, and the gesture input area 30c may be the area other than this icon area. The examiner can then visually operate the icons in the icon area while performing gesture input by blind touch in the gesture input area 30c.
  • the display screen 30a may be divided vertically or horizontally to provide two gesture input areas 30c. Thereby, the examiner can input gestures related to the left eye test in one gesture input area 30c, and perform gesture inputs related to the right eye test in the other gesture input area 30c.
  • the ophthalmological apparatus 100 has a "normal input mode” and a "gesture input mode” as input modes for the display screen 30a (input unit 30b).
  • The "normal input mode" is a mode in which the control unit 26 controls each part of the ophthalmological apparatus 100 based on input information from touch operations on the operation screen displayed on the display screen 30a of the display unit 30 (for example, the subjective test operation screen 40 in FIG. 4).
  • the "gesture input mode” is a mode in which the control unit 26 controls each part of the ophthalmological apparatus 100 based on input information by gesture input to the gesture input area 30c.
  • In the gesture input mode, the display screen 30a (input section 30b) functions as the gesture input area 30c. The procedure for switching between the input modes and the operation of the ophthalmological apparatus 100 under control of the control unit 26 in each input mode will be described later.
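The two input modes can be pictured as a small dispatcher that routes each touch either to the widget under the finger (normal input mode) or to a gesture interpreter covering the gesture input area (gesture input mode). All names below are hypothetical; the patent describes the behavior, not an implementation:

```python
from dataclasses import dataclass, field

@dataclass
class InputDispatcher:
    """Sketch of the normal/gesture input modes (illustrative names only)."""
    mode: str = "normal"                      # "normal" or "gesture"
    log: list = field(default_factory=list)   # actions issued so far

    def switch_mode(self, mode: str) -> None:
        if mode not in ("normal", "gesture"):
            raise ValueError(f"unknown input mode: {mode}")
        self.mode = mode

    def handle_touch(self, event: dict) -> str:
        # Normal mode: the touch is routed to the on-screen control it hits.
        # Gesture mode: the whole gesture input area interprets the motion.
        if self.mode == "normal":
            action = f"widget:{event.get('target', 'none')}"
        else:
            action = f"gesture:{event.get('kind', 'unknown')}"
        self.log.append(action)
        return action

d = InputDispatcher()
a1 = d.handle_touch({"target": "start_button"})   # normal mode routing
d.switch_mode("gesture")
a2 = d.handle_touch({"kind": "swipe_left"})       # gesture mode routing
```

The point of the split is that, in gesture mode, the examiner need not look at the screen to hit a specific widget; any recognized gesture anywhere in the area suffices.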
  • The examiner controller 27 displays an operation screen for the examiner on the display screen 30a. Furthermore, the optotype presented to the eye E by the optotype chart 143 may be displayed toward the examiner on the display screen 30a.
  • The examiner controller 27 also accepts input operations by the examiner, such as touch operations on icons displayed on the operation screen, pop-up screens, and the like, and sends input information (control signals) corresponding to the input operations to the control unit 26.
  • the examiner controller 27 accepts an input operation by gesture input to the gesture input area 30c, and sends input information (control signal) corresponding to the input operation to the control unit 26.
  • "Gesture input” in this embodiment means the following operations (1) and (2) on the gesture input area 30c, but is not limited to this, and can also be used as either one of (1) or (2). good.
  • (1) A "directional touch operation" is an operation in which the examiner moves a finger while touching the display screen 30a; examples include the pinch-in operation, pinch-out operation, flick operation, drag operation, and swipe operation.
  • This "directional touch operation” may be a single touch operation using one finger or a multi-touch operation using multiple fingers.
  • a “touch operation without directionality” is an operation in which the examiner briefly touches the display screen 30a without moving the finger, and includes, for example, a tap operation performed as if tapping the display screen 30a. Therefore, (2) "multi-touch operation without directionality” includes a tap operation performed using multiple fingers. Note that touch operations other than (1) and (2), that is, “single touch operations without directionality" are recognized by the control unit 26 as normal input operations in the normal input mode.
  • the subject controller 28 is a device used by the subject to respond when acquiring various eye information about the subject's eye E.
  • the subject controller 28 includes, for example, a keyboard, a mouse, a joystick, a touch pad, a touch panel, and the like.
  • The subject controller 28 is connected to the control unit 26 via a wired or wireless communication path and sends input information (control signals) corresponding to operations performed on the subject controller 28 to the control unit 26.
  • The left eye measurement optical system 21L and the right eye measurement optical system 21R may each be used, alone or in combination, as, for example: a visual acuity test device that performs a visual acuity test while switching the optotype to be presented; a phoropter that measures the appropriate corrective refractive power of the eye E while switching and arranging corrective lenses; a refractometer or a wavefront sensor that measures refractive power; a fundus camera that takes images of the fundus; a tomography device that takes tomographic images of the retina; a specular microscope that takes images of the corneal endothelium; a keratometer that measures the corneal shape; and a tonometer that measures intraocular pressure.
  • FIG. 3A is a diagram showing the detailed configuration of the right eye measurement optical system 21R in the ophthalmologic apparatus 100 of this embodiment.
  • In FIG. 3A, the mirror 18R is omitted.
  • the configuration of the left eye measurement optical system 21L is the same as the right eye measurement optical system 21R, so a description thereof will be omitted, and only the right eye measurement optical system 21R will be described below.
  • The ophthalmological apparatus 100 includes, as the optical system (measurement optical system 21) for testing the subject's eye E, a Z alignment system 110, an XY alignment system 120, a keratometry system 130, an optotype projection system 140, an anterior segment observation system 150, a reflex measurement projection system 160, and a reflex measurement light receiving system 170.
  • the anterior segment observation system 150 takes a video of the anterior segment of the eye E to be examined.
  • the imaging surface of the imaging element 159 is arranged at the pupil conjugate position Q.
  • The anterior segment illumination light source 151 irradiates the anterior segment of the eye E to be examined with illumination light (for example, infrared light).
  • the light reflected by the anterior segment of the eye E to be examined passes through the objective lens 152, the dichroic mirror 153, the half mirror 154, the relay lenses 155 and 156, and the dichroic mirror 157.
  • the light transmitted through the dichroic mirror 157 is imaged by an imaging lens 158 on the imaging surface of an imaging element 159 (area sensor).
  • the image sensor 159 captures images and outputs signals at a predetermined rate.
  • the output (video signal) of the image sensor 159 is input to the control section 26 .
  • the control section 26 causes the anterior eye segment image E' based on this video signal to be displayed on the display screen 30a of the display section 30.
  • the anterior segment image E' is, for example, an infrared moving image.
  • the Z alignment system 110 projects light (infrared light) onto the eye E to perform alignment of the anterior segment observation system 150 in the optical axis direction (anterior-posterior direction, Z direction).
  • the light output from the Z alignment light source 111 is projected onto the cornea of the eye E to be examined, is reflected by the cornea, and is imaged on the sensor surface of the line sensor 113 by the imaging lens 112.
  • When the position of the corneal vertex changes in the optical axis direction of the anterior segment observation system 150, the light projection position on the sensor surface of the line sensor 113 changes accordingly.
  • The control unit 26 determines the position of the corneal apex of the eye E based on the light projection position on the sensor surface of the line sensor 113, controls the drive mechanism 15 that moves the measurement optical system 21 based on this position, and executes Z alignment.
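One plausible reading of this Z alignment loop: the spot's displacement on the line sensor maps to an axial offset of the corneal vertex, and the head is commanded to move by the opposite amount. The linear pixel-to-millimeter mapping and all names here are assumptions for illustration, not taken from the patent:

```python
def estimate_z_offset_mm(spot_index: int, reference_index: int,
                         mm_per_pixel: float) -> float:
    # Convert the corneal-reflex spot position on the line sensor into an
    # axial (Z) offset of the corneal vertex. A linear model is assumed;
    # the true mapping depends on the projection geometry.
    return (spot_index - reference_index) * mm_per_pixel

def z_alignment_step(spot_index: int, reference_index: int,
                     mm_per_pixel: float, tolerance_mm: float = 0.05):
    # One control step: either report alignment or command a corrective
    # Z move of the measurement head (opposite to the measured offset).
    offset = estimate_z_offset_mm(spot_index, reference_index, mm_per_pixel)
    if abs(offset) <= tolerance_mm:
        return ("aligned", 0.0)
    return ("move_z", -offset)
```

Repeating this step until the spot returns to the reference pixel is the essence of the Z alignment the control unit 26 performs.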
  • the XY alignment system 120 irradiates the subject's eye E with light (infrared light) for alignment in the directions orthogonal to the optical axis of the anterior segment observation system 150 (the left-right direction (X direction) and the up-down direction (Y direction)).
  • the XY alignment system 120 includes an XY alignment light source 121 provided in an optical path branched from the anterior eye segment observation system 150 by a half mirror 154. The light output from the XY alignment light source 121 is reflected by the half mirror 154 and projected onto the eye E through the anterior segment observation system 150. Light reflected by the cornea of the eye E to be examined is guided to the imaging device 159 through the anterior segment observation system 150.
  • the control unit 26 causes the anterior segment image E' including the bright spot image and the alignment mark to be displayed on the display screen 30a of the display unit 30.
  • a user such as the examiner moves the measurement optical system 21 so as to guide the bright spot image into the alignment mark.
  • the control unit 26 controls the drive mechanism 15 that moves the measurement optical system 21 so that the displacement of the bright spot image with respect to the alignment mark is canceled.
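The XY alignment described above is, in effect, a feedback loop that drives the measurement head so as to cancel the displacement of the bright spot image relative to the alignment mark. The following is an illustrative sketch of such a step, not the apparatus's actual control code; the function name and the proportional gain are hypothetical.

```python
def xy_alignment_step(bright_spot_xy, alignment_mark_xy, gain=0.5):
    """Return the (dx, dy) command that moves the measurement head so as to
    reduce the bright spot image's displacement from the alignment mark."""
    # Displacement of the mark relative to the bright spot (the error signal).
    dx = alignment_mark_xy[0] - bright_spot_xy[0]
    dy = alignment_mark_xy[1] - bright_spot_xy[1]
    # Proportional control: command a fraction of the error each video frame,
    # so repeated frames converge the bright spot onto the mark.
    return gain * dx, gain * dy

# Example: bright spot detected at (12, -8) pixels, mark centered at the origin.
cmd = xy_alignment_step((12.0, -8.0), (0.0, 0.0))
print(cmd)  # → (-6.0, 4.0)
```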
  • the keratometry system 130 projects a ring-shaped light beam (infrared light) onto the cornea for measuring the shape of the cornea of the eye E to be examined.
  • the keratoplate 131 is placed between the objective lens 152 and the eye E to be examined.
  • a kerato ring light source (not shown) is provided on the back side of the kerato plate 131 (on the objective lens 152 side). By illuminating the keratoplate 131 with light from the keratoring light source, a ring-shaped light beam is projected onto the cornea of the eye E to be examined.
  • the light reflected from the cornea of the eye E to be examined (kerato ring image) is detected by the image sensor 159 together with the anterior segment image E'.
  • the control unit 26 calculates a corneal shape parameter representing the shape of the cornea by performing a known calculation based on this keratoring image.
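The "known calculation" mentioned above can be illustrated with a minimal sketch: the radius of the detected kerato ring is converted to a corneal radius of curvature and then to keratometric diopters. The linear pixel-to-millimeter calibration below is a hypothetical stand-in for the real calibration derived from the keratoplate geometry; the constant 337.5 is the conventional keratometric index conversion.

```python
def corneal_power_from_ring(ring_radius_px, mm_per_px=0.01):
    """Estimate the corneal radius of curvature (mm) and keratometric power
    (diopters) from the measured kerato ring radius in sensor pixels."""
    r_mm = ring_radius_px * mm_per_px  # hypothetical linear calibration
    k_diopters = 337.5 / r_mm          # standard keratometric conversion
    return r_mm, k_diopters

r, k = corneal_power_from_ring(780)    # e.g. a 780-px ring → 7.8 mm cornea
print(round(r, 2), round(k, 2))        # → 7.8 43.27
```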
  • the visual target projection system 140 presents various visual targets such as a fixation target and a visual target for subjective testing to the eye E to be examined.
  • the light (visible light) output from the light source 141 is made into a parallel light beam by the collimating lens 142, and is irradiated onto the optotype chart 143.
  • the optotype chart 143 includes, for example, a transmissive liquid crystal panel, and displays a pattern representing the optotype.
  • the light transmitted through the optotype chart 143 passes through relay lenses 144 and 145, is reflected by a reflecting mirror 146, passes through a dichroic mirror 168, and is reflected by a dichroic mirror 153.
  • the light reflected by the dichroic mirror 153 passes through the objective lens 152 and is projected onto the fundus Ef.
  • the light source 141, collimating lens 142, and optotype chart 143 are movable together in the optical axis direction.
  • the control unit 26 controls the optotype presentation by moving the light source 141, collimating lens 142, and optotype chart 143 in the optical axis direction based on the results of the objective measurement.
  • the control unit 26 causes the optotype chart 143 to display the optotype selected by the examiner or the control unit 26 . Thereby, the optotype is presented to the subject.
  • the subject responds to the visual target.
  • upon receiving the input of the response content, the control unit 26 performs further control and calculates the subjective test value. For example, in visual acuity measurement, the control unit 26 selects and presents the next optotype based on the response to the Landolt ring or the like, and determines the visual acuity value by repeating this process.
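The visual-acuity loop described above can be sketched as follows. This is a toy illustration of repeatedly choosing the next optotype from the subject's answers; the decimal acuity steps and the "last level answered correctly" rule are illustrative assumptions, not the apparatus's actual protocol.

```python
# Decimal visual acuity levels, coarsest to finest (illustrative values).
ACUITY_STEPS = [0.1, 0.2, 0.3, 0.5, 0.7, 1.0, 1.5, 2.0]

def measure_acuity(answers_correct):
    """Walk up the chart while the subject answers correctly; the last level
    answered correctly is taken as the visual acuity value."""
    level = 0
    for correct in answers_correct:
        if not correct:
            break          # first wrong answer ends the run
        level += 1
        if level == len(ACUITY_STEPS):
            break          # top of the chart reached
    return ACUITY_STEPS[level - 1] if level > 0 else None

print(measure_acuity([True, True, True, False]))  # → 0.3
```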
  • the reflex measurement projection system 160 and the reflex measurement light receiving system 170 are used for objective refraction measurement (reflex measurement).
  • the reflex measurement projection system 160 projects a ring-shaped light beam (infrared light) for objective measurement onto the fundus Ef.
  • the reflex measurement light receiving system 170 receives the ring-shaped light beam returning from the eye E to be examined.
  • the reflex measurement light source 161 may be an SLD (Superluminescent Diode) light source, which is a high-intensity light source whose emission diameter is equal to or less than a predetermined size.
  • the reflex measurement light source 161 is movable in the optical axis direction and is arranged at a fundus conjugate position P.
  • the ring diaphragm 165 (specifically, the transparent part) is arranged at the pupil conjugate position Q.
  • the focusing lens 174 is movable in the optical axis direction.
  • the focusing lens 174 may be a known variable focus lens whose focal position can be changed under the control of the control unit 26.
  • the imaging surface of the imaging element 159 is arranged at the fundus conjugate position P.
  • the light output from the reflex measurement light source 161 passes through the relay lens 162 and enters the conical surface of the conical prism 163.
  • the light incident on the conical surface is deflected and exits from the bottom surface of the conical prism 163.
  • the light emitted from the bottom surface of the conical prism 163 passes through a field lens 164 and then through a ring-shaped light-transmitting portion of a ring diaphragm 165 .
  • the light (ring-shaped light beam) that has passed through the transparent portion of the ring diaphragm 165 is reflected by the reflective surface of the apertured prism 166, passes through the rotary prism 167, and is reflected by the dichroic mirror 168.
  • the light reflected by the dichroic mirror 168 is reflected by the dichroic mirror 153, passes through the objective lens 152, and is projected onto the eye E to be examined.
  • the rotary prism 167 is used to average the light intensity distribution of the ring-shaped light beam over blood vessels and diseased areas of the fundus Ef and to reduce speckle noise caused by the light source.
  • it is desirable that the conical prism 163 be placed as close as possible to the pupil conjugate position Q.
  • FIG. 3B schematically shows a cross-sectional view of the field lens 164.
  • a ring diaphragm 165 may be attached to the lens surface of the field lens 164 on the eye E side.
  • a light-shielding film is deposited on the lens surface of the field lens 164 so that a ring-shaped light-transmitting portion is formed.
  • the reflex measurement projection system 160 may have a configuration in which the field lens 164 is omitted.
  • FIG. 3C schematically shows a cross-sectional view of the conical prism 163.
  • a ring diaphragm 165 may be attached to the bottom surface 163b of the conical prism 163; the light that has passed through the relay lens 162 enters at the conical surface 163a.
  • a light-shielding film is deposited on the bottom surface 163b of the conical prism 163 so that a ring-shaped light-transmitting part is formed.
  • the ring diaphragm may be provided on the conical surface 163a side of the conical prism 163.
  • the ring diaphragm 165 may be a diaphragm in which a transparent portion having a shape corresponding to a predetermined measurement pattern is formed. This diaphragm may have a transparent portion formed at a position eccentric to the optical axis of the reflex measurement projection system 160. Moreover, two or more light-transmitting parts may be formed in the aperture.
  • the return light of the ring-shaped light beam projected onto the fundus Ef passes through the objective lens 152 and is reflected by the dichroic mirror 153 and the dichroic mirror 168.
  • the return light reflected by the dichroic mirror 168 passes through the rotary prism 167, passes through the hole of the apertured prism 166, passes through the relay lens 171, is reflected by the reflection mirror 172, and then passes through the relay lens 173 and the focusing lens 174.
  • the light that has passed through the focusing lens 174 is reflected by a reflecting mirror 175, then reflected by a dichroic mirror 157, and then imaged by an imaging lens 158 on an imaging surface of an imaging element 159.
  • the control unit 26 calculates the refractive power value of the eye E to be examined by performing a known calculation based on the output from the image sensor 159.
  • the refractive power value includes spherical power, astigmatic power, and astigmatic axis angle.
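The "known calculation" that converts the detected ring image into spherical power, astigmatic power, and astigmatic axis can be illustrated with a deliberately simplified sketch: the fitted ellipse's semi-axes are assumed to deviate from the emmetropic ring radius in proportion to the refractive power along each principal meridian. The calibration constants, the linearity assumption, and the minus-cylinder sign convention are all illustrative; a real instrument's computation is considerably more involved.

```python
def refraction_from_ring(a_px, b_px, theta_deg, r0_px=100.0, d_per_px=0.1):
    """Toy reconstruction of (sphere, cylinder, axis) from a fitted ellipse.
    a_px, b_px: major/minor semi-axes of the ring image (pixels);
    theta_deg: orientation of the major axis; r0_px: emmetropic ring radius."""
    p_major = (a_px - r0_px) * d_per_px  # power along the major-axis meridian
    p_minor = (b_px - r0_px) * d_per_px  # power along the minor-axis meridian
    sphere = p_major
    cylinder = p_minor - p_major         # minus-cylinder convention (illustrative)
    axis = theta_deg % 180               # axis angles are reported modulo 180°
    return sphere, cylinder, axis

print(refraction_from_ring(95.0, 80.0, 10.0))  # → (-0.5, -1.5, 10.0)
```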
  • a diaphragm is arranged between the apertured prism 166 and the relay lens 171 to limit the diameter of the light beam on the pupil.
  • the transparent portion of this diaphragm is arranged at the pupil conjugate position Q.
  • the control unit 26 moves the reflex measurement light source 161 and the focusing lens 174 in the optical axis direction so that the fundus Ef, the reflex measurement light source 161, and the imaging surface of the image sensor 159 become optically conjugate. Further, the control unit 26 moves the optotype unit in the optical axis direction in conjunction with the movement of the reflex measurement light source 161 and the focusing lens 174.
  • the optotype unit including the light source 141, collimating lens 142, and optotype chart 143, the reflex measurement light source 161, and the focusing lens 174 may be movable in the respective optical axis directions in conjunction with each other.
  • the optotype displayed on the optotype chart 143 in the subjective test etc. is not particularly limited as long as it is used for optometry, and suitable examples include Landolt ring, Snellen optotype, E chart, etc.
  • various other optotypes can also be used, such as characters including hiragana and katakana, pictures of animals and fingers, specific figures for binocular visual function tests such as crosses, and visual targets such as landscape paintings and photographs.
  • the visual target may be a still image or a moving image.
  • the optotype chart 143 includes a liquid crystal panel, so it is possible to display an optotype with a desired shape, form, and contrast at a predetermined examination distance, making it possible to perform a multifaceted and detailed eye examination.
  • since the ophthalmologic apparatus 100 includes two optotype charts 143 corresponding to the left and right eyes E, optotypes that provide parallax can be presented in a manner corresponding to a predetermined examination distance (the position where the optotype is presented). This makes it possible to perform stereoscopic examinations easily and precisely with a natural visual axis orientation.
  • control unit 26 of the ophthalmologic apparatus 100 of this embodiment is provided on the base 11 of the main body 10 and centrally controls each unit of the ophthalmological apparatus 100.
  • the control section 26 controls the above-mentioned left eye measurement optical system 21L, the right eye measurement optical system 21R, the left eye vertical drive section 22L of the left eye drive mechanism 15L, and so on.
  • the control unit 26 includes an internal memory 26a, and is capable of short-range communication with the examiner controller 27 and the examinee controller 28 via communication means such as short-range wireless. Further, the control unit 26 responds to operations on the examiner controller 27 and the examinee controller 28 as appropriate by deploying a program stored in the connected storage unit 29 or built-in internal memory 26a on, for example, RAM. The operation of the ophthalmological apparatus 100 is controlled in an integrated manner.
  • the internal memory 26a is composed of a RAM, etc.
  • the storage section 29 is composed of a ROM, an EEPROM, etc.
  • the control unit 26 controls each unit of the ophthalmological apparatus 100 based on operation inputs made to the display screen 30a of the display unit 30. As described above, the control unit 26 has a "normal input mode" and a "gesture input mode."
  • FIG. 4 is a diagram showing an example of the operation screen 40 for the subjective test.
  • this operation screen 40 for the subjective test includes a correction value setting area 41 in which correction values such as the spherical power (S), astigmatic power (C), astigmatic axis (A), and addition power (ADD) of the eye E to be examined are set; an examination distance setting area 42 in which the examination distance is set; an optotype icon 43 with which an optotype is selected; an optotype display area 44 in which the selected optotype is displayed; an anterior segment image display area (optometry window) 45 in which the anterior segment image E' captured by the image sensor 159 is displayed; various operation buttons 46; and so on.
  • in the correction value setting area 41, for example, a value measured by the objective test is set as an initial value.
  • when the examiner performs a single touch operation on a desired correction value in the correction value setting area 41, a list of correction values is displayed as a pop-up screen.
  • the examiner can change the correction value by selecting a desired correction value from the pop-up screen and performing a tap operation.
  • the control unit 26 displays the changed correction value in the correction value setting area 41 and controls the measurement optical system 21 based on the changed correction value.
  • the operation screen 40 for subjective testing also has a display area for PD values (pupillary distance), VD values (corneal vertex distance), etc., and a screen for red/green testing.
  • when the control unit 26 recognizes that the input information input from the examiner controller 27 is either a "directional touch operation" or a "non-directional multi-touch operation," it switches the input mode to the "gesture input mode." Then, the control unit 26 controls each unit of the ophthalmological apparatus 100 based on the information input by gesture input to the gesture input area 30c provided on the display screen 30a.
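The mode decision described above can be sketched as a small classifier, under the assumption that each operation is summarized by its touch count and total travel distance. The 10-pixel directionality threshold and the function name are illustrative, not values from the apparatus.

```python
def classify_input(num_touches, travel_px, threshold_px=10.0):
    """Return 'gesture' for (1) a directional touch operation or (2) a
    non-directional multi-touch operation, and 'normal' otherwise."""
    directional = travel_px >= threshold_px  # the touch traced a direction
    if directional or num_touches >= 2:
        return "gesture"   # control unit switches to gesture input mode
    return "normal"        # plain single tap: normal input mode

print(classify_input(1, 42.0))  # directional single-touch swipe → gesture
print(classify_input(2, 0.0))   # non-directional multi-touch    → gesture
print(classify_input(1, 2.0))   # non-directional single touch   → normal
```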
  • the left diagram of FIG. 5 is an explanatory diagram for explaining an example of a gesture input for switching the eye E to be examined between the right eye and the left eye.
  • the examiner performs a gesture input (for example, a semicircular swipe operation) in the gesture input area 30c of the display screen 30a.
  • the examiner can input operations such as switching the eye E to be examined and changing the setting values set in the measurement head 16, for example.
  • Such gesture input can be suitably used, for example, when testing the left eye and right eye sequentially.
  • the examiner can perform gesture input using the entire gesture input area 30c, improving operability.
  • the right diagram of FIG. 5 is an explanatory diagram for explaining a modification in which the gesture input area 30c is divided into left and right halves.
  • the examiner can change the setting value of the right eye measurement head 16R by inputting a gesture to the left area, and can change the setting value of the left eye measurement head 16L by inputting a gesture to the right area.
  • the reason for this configuration is that, with the examiner's left and right as the reference, the anterior segment image E' of the left eye photographed by the imaging element 159 of the left eye measurement head 16L is displayed, for example, in the anterior segment image display area 45 on the right side of the display screen, and the anterior segment image E' of the right eye is correspondingly displayed in the anterior segment image display area 45 on the left side.
  • the examiner may perform gesture input for the left eye in the left region of the gesture input region 30c, and perform gesture input for the right eye in the right region.
  • as described above, dividing the gesture input area 30c into left and right sides corresponding to the left eye and the right eye is suitable for controlling an ophthalmologic apparatus 100 that, like the present embodiment, includes a pair of measurement heads 16L and 16R and can examine one eye or both eyes simultaneously. Therefore, the examiner can more appropriately perform operation input for testing with one eye or with both eyes.
  • the left diagram and the center diagram of FIG. 6 are explanatory diagrams for explaining an example of gesture input for instructing to open and close the subject's eye E.
  • by performing a circular swipe operation with two fingers (multi-touch) on the gesture input area 30c, the examiner can input an instruction to occlude one eye E to be examined and test with one eye. This operation input, together with the corresponding input for canceling the occlusion, can be suitably used when performing a monocular test and a binocular test in succession, and the examiner can efficiently switch the eye E to be tested.
  • the right diagram of FIG. 6 is an explanatory diagram for explaining an example of gesture input for instructing initial settings.
  • the examiner can input an initial setting (reset) instruction by performing a Z-shape swipe operation on the gesture input area 30c.
  • the initial setting (reset) process includes, for example, returning a correction value changed on the subjective test operation screen 40 to the correction value at the time of the objective test, returning it to the correction value of the glasses or the like currently in use, returning it to the previously determined correction value, returning it to the default value at the time of shipment, and returning the measurement head 16 that has been moved or rotated in the XYZ directions by the drive mechanism 15 to its initial position; however, the process is not limited to these.
  • FIG. 7 is an explanatory diagram for explaining an example of gesture input for changing the spherical power (S), astigmatic power (C), and astigmatic axis (A).
  • the examiner can input an instruction to raise (Up) or lower (Down) the spherical power by performing a single-touch swipe operation upward or downward on the gesture input area 30c.
  • the examiner can input an instruction to raise or lower the astigmatic power by performing a single-touch swipe operation to the right or left on the gesture input area 30c.
  • the examiner can input an instruction to raise or lower the angle of the astigmatic axis by performing a multi-touch swipe operation with two fingers upward or downward on the gesture input area 30c.
  • the examiner can easily change the spherical power, astigmatic power, and astigmatic axis of the eye E to be examined by swiping in a predetermined direction, and can test the eye E more efficiently.
  • each time the control unit 26 recognizes a swipe operation input for a correction value, it may increase or decrease that correction value by one unit (for example, 0.25 or 0.50 in the case of spherical power). Alternatively, the amount of increase or decrease may be changed depending on the length of the swipe operation (the length of the touch trace on the display screen 30a) (for example, in the case of spherical power, 0.25 for a short swipe and 0.50 or 1.00 for a long swipe). The same applies to the following modified examples.
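The two increment policies described above can be sketched as follows for the spherical power. The 100-pixel length threshold separating a "short" from a "long" swipe is an assumed value for illustration only.

```python
def spherical_step(swipe_len_px, direction, long_swipe_px=100.0):
    """Return the signed change in spherical power (diopters) for one swipe.
    direction is +1 for an upward swipe (raise) or -1 for a downward swipe."""
    # Short swipe: one unit (0.25 D); long swipe: a larger unit (0.50 D).
    step = 0.25 if swipe_len_px < long_swipe_px else 0.50
    return direction * step

s = -1.75                         # current spherical power (D)
s += spherical_step(60.0, +1)     # short upward swipe:  +0.25
s += spherical_step(150.0, -1)    # long downward swipe: -0.50
print(s)  # → -2.0
```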
  • FIG. 8 is an explanatory diagram for explaining a modification of gesture input for changing the spherical power, astigmatic power, and astigmatic axis.
  • the gesture input area 30c is divided into left and right halves.
  • the examiner performs operation inputs for changing the spherical power, astigmatic power, and astigmatic axis of one eye E (for example, the right eye) in the left area, and performs the corresponding operation inputs for the other eye E (for example, the left eye) in the right area.
  • FIGS. 9 to 11 are explanatory diagrams for explaining modified examples of gesture input for changing the spherical power, astigmatic power, and astigmatic axis.
  • Gesture input that is different from the example in FIG. 7 will be described below.
  • the operation input for changing the spherical power and the astigmatic power is the same as in the example shown in FIG. 7; the gesture input for instructing a change of the astigmatic axis is shown in the right diagram of the figure.
  • the operation input for changing the spherical power is the same as in the example shown in FIG.
  • the examiner can input an instruction to change the astigmatism axis by performing a multi-touch swipe operation with two fingers in the left or right direction.
  • by performing a swipe operation, the examiner can input an instruction to change the spherical power.
  • by performing a multi-touch swipe operation with two fingers drawing a semicircle upward or downward on the gesture input area 30c, the examiner can input an instruction to change the astigmatic power.
  • the examiner can input an instruction to change the astigmatic axis by performing a multi-touch swipe operation with three fingers in a semicircular manner upward or downward.
  • the examiner may perform a large gesture input using the entire gesture input area 30c (or each area when it is divided into left and right areas), or may perform a small gesture input using only a portion of the gesture input area 30c.
  • since gesture input is a directional touch operation or a non-directional multi-touch operation, no matter in which region of the gesture input area 30c it is performed, the control unit 26 can properly recognize whether it is a gesture input or a normal input and operate each part accordingly. Further, by enabling operation input through gesture input as shown in FIGS. 5 to 11, the examiner can operate the ophthalmologic apparatus 100 by so-called blind touch, without visually checking the display screen 30a.
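Once a gesture is recognized, mapping it to an apparatus action amounts to a dispatch table. The sketch below summarizes the gesture-to-action pairings described above; the gesture names and action strings are illustrative labels, not identifiers from the apparatus.

```python
# (gesture shape, number of fingers) → action described in the text.
GESTURE_ACTIONS = {
    ("semicircle-swipe", 1): "switch eye (right <-> left)",
    ("circle-swipe", 2): "occlude one eye (monocular test)",
    ("z-shape-swipe", 1): "initial settings (reset)",
    ("vertical-swipe", 1): "raise/lower spherical power",
    ("horizontal-swipe", 1): "raise/lower astigmatic power",
    ("vertical-swipe", 2): "raise/lower astigmatic axis",
}

def dispatch(gesture, num_fingers):
    """Look up the action for a recognized gesture; unknown inputs fall through."""
    return GESTURE_ACTIONS.get((gesture, num_fingers), "unrecognized")

print(dispatch("z-shape-swipe", 1))  # → initial settings (reset)
```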
  • the examiner has the examinee sit on a chair or the like, face the ophthalmologic apparatus 100, and place the examinee's forehead on the forehead rest part 17. Then, at the timing when a sensor or the like detects that the subject has placed his or her forehead on the forehead rest part 17, or at the timing when the examiner gives a photographing instruction from the operation screen, the operation shown in the flowchart of FIG. 12 is started.
  • in step S1, the control unit 26 controls the anterior segment observation system 150 provided in the left and right measurement optical systems 21 to start photographing the anterior segments of the left and right eyes E to be examined.
  • the control unit 26 controls the display unit 30 of the examiner controller 27 so that the left and right anterior segment images (frontal images) E', based on the image signals output from the image sensor 159 of the anterior segment observation system 150, are displayed on the display screen 30a.
  • the examiner performs an operation input to start alignment from the input section 30b of the examiner controller 27.
  • in step S2, the control unit 26, having received the input information (control signal) corresponding to this operation input, controls the visual target projection system 140 to display a fixation target (for example, a point light source) at the center position of the optotype chart 143 and present it to the eye E to be examined. In this state, the examiner instructs the subject to fixate the fixation target.
  • the Z alignment system 110 aligns the measurement head 16 in the Z direction by the operation described above under the control of the control unit 26.
  • an XY alignment system 120 aligns the measurement head 16 in the X and Y directions.
  • the control unit 26 causes the left and right measurement optical systems 21 to perform the objective test, based on the examiner's operation input of an objective test instruction from the input unit 30b (or automatically).
  • examples of the objective test include corneal shape (kerato) measurement using the keratometry system 130, eye refractive power (reflex) measurement using the reflex measurement projection system 160 and the reflex measurement light receiving system 170, and the like.
  • the control unit 26 controls the display unit 30 to display the subjective test operation screen 40 shown in FIG. 4 on the display screen 30a.
  • the display unit 30 displays the correction value measured in the objective test in the correction value setting area 41 of the subjective test operation screen 40, and also displays the test distance, PD value, left and right anterior segment images E', etc. indicate.
  • the examiner can tap the test distance setting area 42 on the subjective test operation screen 40 to change the test distance, or tap the optotype icon 43 to select the optotype to be presented to the eye E to be examined.
  • by tapping one of the anterior segment images E' displayed in the anterior segment image display area 45 of the subjective test operation screen 40, or by performing a gesture input as shown in the left and center diagrams of FIG. 6, the examiner can occlude one eye E to be examined.
  • the control unit 26 controls the visual target projection system 140 to display the optotype on the optotype chart 143 and present it to the eye E to be examined, and displays the same optotype in the optotype display area 44 of the subjective test operation screen 40 on the display screen 30a.
  • the control unit 26 may drive the left and right X-direction rotation drive units 24 according to the examination distance to move and rotate the left and right measurement heads 16 in the X direction.
  • the examiner performs a subjective test by having the subject answer how the optotype looks.
  • the examiner touches the input unit 30b to appropriately change correction values such as the spherical power, astigmatic power, and the angle of the astigmatic axis.
  • the examiner can change each correction value by single-touching the corresponding area of the correction value setting area 41 on the subjective test operation screen 40, or by using gesture input as shown in FIGS. 7 to 11.
  • in step S7, the control unit 26 determines whether there has been a touch operation on the display screen 30a (input unit 30b) to change a correction value. If there has been such an operation (YES), the program proceeds to step S8; if not (NO), the program proceeds to END and the process ends.
  • in step S8, the control unit 26 determines whether the input to the display screen 30a (input unit 30b) is a gesture input.
  • when the control unit 26 recognizes that the operation input is the above-mentioned (1) directional touch operation or (2) non-directional multi-touch operation, it determines that the operation input is a gesture input to the gesture input area 30c. When it recognizes that the input is other than (1) and (2), that is, a non-directional single touch operation, it determines that the input is a normal input to the subjective test operation screen 40.
  • when it is determined in step S8 that the input is a gesture input (YES), the program proceeds to step S9, and the control unit 26 switches the input mode to the gesture input mode. On the other hand, when it is determined in step S8 that it is a normal input (NO), the program proceeds to step S10, and the control unit 26 switches the input mode to the normal input mode. After that, the program proceeds to step S11.
  • in step S11, the control unit 26 analyzes the control signal (input information) from the display unit 30 according to the input mode and obtains the instruction content.
  • the control unit 26 changes the correction value (set value) according to the analysis result, and controls the display unit 30 to display the changed correction value on the display screen 30a.
  • this display is normally performed by changing the correction value displayed in the correction value setting area 41 of the subjective test operation screen 40; in the gesture input mode, however, it may be performed by displaying the changed correction value in an area that does not interfere with gesture input on the display screen 30a (for example, one corner of the display screen 30a).
  • the items to be changed may be not only correction values such as spherical power, astigmatic power, and astigmatic axis, but also visual targets to be presented, examination distance, and the like.
  • the control unit 26 controls the measurement optical system 21 based on the changed correction value (set value). Thereby, the correction value of the subject's eye E by the measurement optical system 21 is changed, and the subject can perform a subjective test using the changed correction value.
  • depending on whether the subject's answer is correct or not, the examiner determines the correction value (prescription), or changes the correction value again by touch operation or gesture input on the subjective test operation screen 40 and repeats the test. Furthermore, the examiner can have the subject perform a subjective test with a different visual target or at a different testing distance.
  • when the control unit 26 receives a control signal for an operation input to change the correction value again, it repeats the processing of steps S7 to S12; when it receives a control signal for an operation input to change the optotype or the examination distance, it repeats steps S6 to S12.
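The repetition structure of the flowchart can be summarized schematically: correction value changes re-enter the flow at step S7, while optotype or distance changes re-enter at step S6. The sketch below uses illustrative event names, not identifiers from the apparatus.

```python
def next_step(event):
    """Map an examiner's operation input to the flowchart step it re-enters."""
    if event == "change_correction":
        return "S7"   # repeat steps S7 to S12
    if event in ("change_optotype", "change_distance"):
        return "S6"   # repeat steps S6 to S12
    return "END"      # prescription determined, no further changes

print(next_step("change_optotype"))  # → S6
```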
  • when the prescription has been determined and there is no further touch operation to change the correction value or the like (the determination in step S7 is NO), the program proceeds to END, and the operation of the ophthalmologic apparatus 100 for acquiring (examining) information about the eye E ends.
  • the ophthalmologic apparatus 100 includes the measurement optical system 21 (information acquisition section), the display section 30 having the touch panel display screen 30a, and the control section 26.
  • the control unit 26 has a normal input mode in which the measurement optical system 21 is controlled based on input information to the operation screen, and a gesture input mode in which the measurement optical system 21 is controlled based on input information by gesture input to the gesture input area 30c.
  • the examiner can operate the ophthalmological apparatus 100 by inputting operations on the normal operation screen and inputting gestures.
  • the examiner can appropriately and quickly input instructions to the ophthalmological apparatus 100 by inputting gestures, and can input operations by blind touching without visually checking the display screen 30a.
  • the examiner does not have to perform the test while alternately viewing the subject and the display screen 30a, and can perform the test while observing the subject's condition, for example, checking whether the eye E is fixating on the optotype and whether the subject is in an appropriate posture.
  • since the examiner controller 27 is placed on the optometry table 51 and can be operated by blind touch, the examiner can perform the test while assisting the subject, for example, adjusting the position of the subject's head to the correct position or opening the eyelids. Therefore, according to the ophthalmologic apparatus 100 of the first embodiment, the examiner can acquire information about the eye E more efficiently with a simpler operation.
  • the ophthalmologic apparatus 100 according to the second embodiment will be described below with reference to FIG. 13.
  • The ophthalmologic apparatus 100 according to the second embodiment has the same basic configuration as the ophthalmologic apparatus 100 according to the first embodiment shown in FIG. 1, except that a smartphone is used as the examiner controller 27A instead of a tablet terminal. Therefore, the same configurations as in the first embodiment are given the same reference numerals as in the first embodiment, and detailed description thereof is omitted. The same applies to the subsequent embodiments.
  • The ophthalmological apparatus of the second embodiment mainly includes the main body 10 shown in FIG. 1, the control unit 26 provided in the main body 10, the examiner controller 27A shown in FIG. , and the controller 28.
  • the examiner controller 27A of the second embodiment includes a display section (display panel) 30 made of a touch panel display, similar to the examiner controller 27 of the first embodiment.
  • the display unit 30 includes a display screen 30a and a touch panel type input unit 30b, and the entire surface of the input unit 30b serves as a gesture input area 30c.
  • the ophthalmologic apparatus 100 of the second embodiment has a "normal input mode” and a "gesture input mode” similarly to the first embodiment.
  • the examiner can perform normal input by a single touch operation with no directionality on the display screen 30a (input section 30b) of the examiner controller 27A. Furthermore, the examiner can input gestures by performing a directional touch operation and/or a non-directional multi-touch operation on the gesture input area 30c.
  • the examiner can operate the ophthalmologic apparatus 100 by inputting gestures similar to those of the first embodiment described using FIGS. 5 to 11.
  • In the ophthalmologic apparatus 100 of the second embodiment, specific gesture inputs are possible when performing a red/green test, an addition power test using a near crosshair, a cross cylinder test, and the like. These tests are repeated multiple times while the correction value is changed, with the examiner holding the examiner controller 27A, so the examinee and the examiner may become fatigued and the examination efficiency and accuracy may decrease. To prevent this, it is desirable for the examiner to be able to operate quickly by blind touch without visually checking the operation screen, thereby further shortening the examination time.
  • In the ophthalmologic apparatus 100 of the second embodiment, while the operation screen for a predetermined examination is displayed, the control unit 26 can recognize that three fingers, for example the middle finger, the ring finger, and the little finger, are placed on the end of the display screen 30a (input unit 30b) of the examiner controller 27A (a multi-long-press operation). When this operation input is recognized, the control unit 26 determines that gesture input has been selected and switches the input mode to the "gesture input mode."
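The multi-long-press recognition described above can be sketched as follows. The hold-time and edge-width thresholds are invented for illustration, since the disclosure specifies neither; only the structure (three fingers, resting long enough, at a screen edge) comes from the text.

```python
def is_multi_long_press(touches, now, hold_ms=800, edge_px=60, screen_w=1080):
    """Return True when three fingers rest on a screen edge long enough.

    `touches` is a list of dicts with an x position ("x") and a touch
    start time in milliseconds ("t0"). The thresholds are illustrative
    assumptions, not values from the disclosure.
    """
    if len(touches) != 3:
        return False
    # All three fingers must have been held down for at least hold_ms.
    held = all(now - t["t0"] >= hold_ms for t in touches)
    # All three fingers must be near the left or right edge, matching
    # the description of placing fingers on the end of the screen.
    on_edge = all(t["x"] <= edge_px or t["x"] >= screen_w - edge_px
                  for t in touches)
    return held and on_edge
```

A real recognizer would also track finger movement (to distinguish a long press from a drag), which is omitted here for brevity.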
  • the red/green test operation screen 40A shown in FIG. 13 includes an optotype display area 44 where optotypes are displayed, an anterior segment image display area 45, various operation buttons 46, and the like.
  • On the operation screen 40A, the examiner can change the spherical power by -0.25D (Down) by touching the "red" button among the operation buttons 46, and can change the spherical power by +0.25D (Up) by touching the "green" button. Furthermore, by touching the "same" button, the examiner can proceed to the next examination without changing the spherical power.
  • the examiner can switch the input mode to the "gesture input mode" by placing three fingers on the display screen 30a.
  • The finger detection position may be at the right end, the left end, or both ends of the display screen 30a, depending on the examiner's dominant hand.
  • the display screen 30a (input unit 30b) functions as a gesture input area 30c.
  • When the examinee's answer is "red," the examiner performs the corresponding gesture input, and the control unit 26 recognizes this operation and changes the spherical power by -0.25D.
  • When the examinee's answer is "green," the examiner performs the corresponding gesture input, and the control unit 26 recognizes this operation and changes the spherical power by +0.25D.
  • When the examinee's answer is "same," the examiner performs the corresponding gesture input, and the control unit 26 ends the red/green test without changing the spherical power.
  • The addition power test is a test in which a near crosshair is presented to the eye E to be examined, and the examinee is asked to compare the vertical lines and the horizontal lines and answer which appears clearer.
  • The examiner operates the operation buttons on the operation screen according to the line that the examinee answers is clearly visible.
  • The control unit 26, which receives the control signal based on this operation input, adjusts the addition power. The examiner repeats this operation, and the control unit 26 adjusts the addition power until the examinee sees the vertical lines and the horizontal lines with equal clarity, or sees the horizontal lines slightly more clearly.
  • On the other hand, in the "gesture input mode," if the examinee's answer is "I can clearly see the horizontal lines," the examiner performs a swipe operation in the horizontal direction (left-right direction) on the gesture input area 30c. Further, if the examinee's answer is "I can clearly see the vertical lines," the examiner performs a swipe operation in the vertical direction (up-down direction) on the gesture input area 30c.
  • the control unit 26 that receives the control signals based on these gesture inputs can be configured to adjust the addition power according to the direction of the swipe operation.
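One way to realize "adjust the addition power according to the direction of the swipe operation" is to compare the horizontal and vertical components of the swipe. The function below is an illustrative sketch; which direction increases the addition power, and the 0.25D step, are assumptions, not values stated in this description.

```python
def addition_power_step(swipe_dx, swipe_dy, step=0.25):
    """Map a swipe on the gesture input area to an add-power change.

    A mostly horizontal swipe (examinee sees the horizontal lines
    clearly) is assumed to increase the addition power, and a mostly
    vertical swipe to decrease it. Both the sign convention and the
    step size are illustrative assumptions.
    """
    if abs(swipe_dx) >= abs(swipe_dy):
        return +step   # horizontal swipe dominates
    return -step       # vertical swipe dominates
```

Classifying by the dominant component keeps the gesture usable by blind touch, since the swipe need not be perfectly axis-aligned.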
  • the cross cylinder test is a test for adjusting the astigmatism axis using an optotype consisting of a point group or multiple lines.
  • the examiner touch-operates the operation buttons according to how the subject sees the optotype.
  • The control unit 26, which has received the control signal based on this operation input, controls the measurement optical system 21 to change the astigmatic axis counterclockwise or to change the astigmatic power by +0.25D.
  • Alternatively, the control unit 26 controls the measurement optical system 21 to change the astigmatic axis clockwise or to change the astigmatic power by -0.25D.
  • On the other hand, in the "gesture input mode," the examiner places two fingers on the gesture input area 30c (the line joining the two fingers representing the angle of the astigmatic axis) and performs a swipe operation counterclockwise or clockwise.
  • When the control unit 26 receives the control signals based on these gesture inputs and recognizes a counterclockwise swipe operation, it can be configured to control the measurement optical system 21 to change the astigmatic axis counterclockwise or to change the astigmatic power by +0.25D. Further, when the control unit 26 recognizes a clockwise swipe operation, it may be configured to control the measurement optical system 21 to change the astigmatic axis clockwise or to change the astigmatic power by -0.25D.
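The clockwise/counterclockwise classification of the two-finger swipe can be derived from the change in the angle of the line joining the two fingers, which is the quantity the text says represents the astigmatic axis. The sketch below assumes screen coordinates with the y axis pointing down; the function name and interface are invented here.

```python
import math

def rotation_direction(p0, p1, q0, q1):
    """Classify a two-finger swipe as clockwise or counterclockwise.

    p0 -> p1 and q0 -> q1 are the start and end points of the two
    fingers in screen coordinates (y axis pointing down). The sign of
    the change in the angle of the finger-to-finger line gives the
    rotation direction.
    """
    a0 = math.atan2(q0[1] - p0[1], q0[0] - p0[0])  # initial line angle
    a1 = math.atan2(q1[1] - p1[1], q1[0] - p1[0])  # final line angle
    # Wrap the angle difference into (-pi, pi] to handle crossings.
    d = (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
    # With y pointing down, a positive angle change is clockwise on screen.
    return "clockwise" if d > 0 else "counterclockwise"
```

The same angle difference could also drive the amount of axis change, not just its direction, if a proportional mapping were desired.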
  • As described above, in the ophthalmologic apparatus 100 of the second embodiment, the examiner can perform not only a normal visual acuity test but also a red/green test, an addition power test, a cross cylinder test, and the like by blind touch, more quickly and efficiently. Thereby, the examination time is shortened, fatigue of the examiner and the examinee is suppressed, and the examination can be performed more appropriately.
  • The above-described gesture input of the ophthalmologic apparatus 100 of the second embodiment can also be applied to the ophthalmologic apparatus 100 of the first embodiment. Since the ophthalmologic apparatus 100 of the first embodiment uses a tablet terminal as the examiner controller 27, the way the examiner holds it differs somewhat. For this reason, the ophthalmologic apparatus 100 of the first embodiment may be configured to switch to the "gesture input mode" when it recognizes a three-finger long-press operation on the display screen 30a, or when it recognizes another predetermined long-press operation.
  • The ophthalmologic apparatuses 100 of the first and second embodiments can be configured to recognize, for example, a pinch-in operation and a pinch-out operation on the gesture input area 30c as operation inputs for increasing or decreasing a predetermined correction value or the like. Further, the ophthalmologic apparatuses 100 of the first and second embodiments may be configured, for example, to recognize a pinch-in operation and a pinch-out operation in the horizontal direction as operation inputs for decreasing or increasing the spherical power, and to recognize a pinch-in operation and a pinch-out operation in the vertical direction as operation inputs for decreasing or increasing the astigmatic power; however, the configuration is not limited to this.
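The pinch mapping suggested above (horizontal pinches for the spherical power, vertical pinches for the astigmatic power; pinch-out to increase, pinch-in to decrease) can be sketched as a small lookup. The 0.25D step mirrors the increments used elsewhere in this description, while the function shape and names are illustrative assumptions.

```python
def pinch_to_adjustment(axis, kind, step=0.25):
    """Map a pinch gesture to a (target, delta) correction change.

    axis: "horizontal" or "vertical"; kind: "out" (fingers apart) or
    "in" (fingers together). The mapping follows the example in the
    text; the step size is an assumption.
    """
    target = "sphere" if axis == "horizontal" else "cylinder"
    delta = +step if kind == "out" else -step
    return target, delta
```

A table-driven mapping like this keeps the gesture-to-parameter assignment easy to reconfigure, which suits the text's note that the configuration is not limited to this example.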
  • In addition, the ophthalmologic apparatuses 100 of the first and second embodiments may accept a touch operation that draws a number, symbol, character, or the like on the gesture input area 30c, analyze the trajectory of the touch operation, and change a correction value, an optotype, an examination distance, a PD value, a VD value, or the like according to the obtained value.
  • The ophthalmologic apparatus 100B of the third embodiment includes a main body 50 and an examiner controller 27, as shown in FIG.
  • The main body 50 includes an optometry table 51, a refractor head 52 that is an optometry optical system, and an optotype presentation device 53.
  • The optometry table 51 is a desk on which the refractor head 52 is mounted, on which various devices such as the optotype presentation device 53 and the examiner controller 27 are placed, and on which the examinee can rest his or her hands and arms, elbows on the table, in order to maintain an appropriate posture during the examination. Further, the optometry table 51 is movable in the vertical direction (Y direction) manually or by an appropriate drive mechanism, and the heights of the refractor head 52 and the optotype presentation device 53 can be adjusted according to the height of the eye E to be examined from the floor.
  • the refractor head 52 is a device used to select a lens suitable for the eye E to be examined.
  • This refractor head 52 has a plurality of testing lenses (corrective lenses), and selectively arranges the testing lenses (corrective lenses) in front of the eye E to be examined.
  • the examiner can perform a refraction test and other tests while changing the test lens as appropriate.
  • the refractor head 52 includes a support mechanism 54 and a pair of optometry units 55L and 55R.
  • The pair of optometry units 55L and 55R include an optometry optical system for the left eye and an optometry optical system for the right eye (not shown), which are provided on the left and right sides so as to correspond to the left and right eyes E to be examined.
  • the pair of optometry units 55L and 55R are supported by a support mechanism 54 so as to be movable in the vertical direction (Y direction). Further, the pair of optometry units 55L and 55R can be rotated in the circumferential direction (around the Y axis) by the support mechanism 54, and are inserted and removed between the eye E and the optotype presentation device 53 by this rotation.
  • the pair of optometry units 55L and 55R can be slid in the left-right direction (X direction) by a known slide mechanism (not shown), and can be moved relatively toward and away from each other.
  • Optometry windows 56L and 56R are provided in the front and rear surfaces of the pair of optometry units 55L and 55R, respectively.
  • Inside the optometry units 55L and 55R, optical members such as a plurality of test lenses for the left and right eyes, polarizing filters, and shielding plates are arranged.
  • These optical members are selectively arranged at positions facing the optometry windows 56L and 56R by a known drive mechanism (not shown).
  • the optotype presentation device 53 is a device that presents an optotype to the eye E to be examined.
  • the optotype presentation device 53 includes a rectangular parallelepiped-shaped housing 53a and an optotype projection system 53b built in the housing 53a.
  • a window (opening) 53c is provided on the front surface (on the subject's side) of the housing 53a for the subject to visually view the optotype image.
  • the optotype projection system 53b is an optical system that generates an optotype image for testing the visual function of the eye E to be examined.
  • The optotype projection system 53b includes, for example, an optotype display unit including a display for displaying an optotype, a convex lens system that generates a virtual image of the optotype from the light flux emitted by the optotype, and an optical path bending mirror (not shown) that reflects the optical path and forms the virtual image of the optotype at an image point position in front of the eye E to be examined. The examinee can visually recognize this optotype image through the window 53c of the housing 53a.
  • the examiner controller 27 is configured from a tablet terminal and includes a display section 30 consisting of a touch panel display.
  • the display section 30 includes a display screen 30a and a touch panel type input section 30b, and the display screen 30a (input section 30b) has a gesture input area 30c.
  • In the third embodiment, the CPU of the examiner controller 27, which is a tablet terminal, is used as the control unit 26; however, the control unit 26 may be provided separately in the ophthalmologic apparatus 100B.
  • the control unit 26 controls each unit of the ophthalmological apparatus 100B based on operation inputs made on the display screen 30a (input unit 30b).
  • FIG. 15 is a diagram showing an example of a subjective test operation screen 40B displayed on the display unit 30 of the ophthalmological apparatus 100B of the third embodiment.
  • This operation screen 40B for subjective testing includes a correction value setting area 41 in which correction values such as the spherical power (S), astigmatic power (C), astigmatic axis (A), and addition power (ADD) of the eye E to be examined are set, a gesture input mode switching icon 47, and the like.
  • This gesture input mode switching icon 47 is an icon for switching the input mode from the "normal input mode" to the "gesture input mode." When the examiner touches this gesture input mode switching icon 47, the control unit 26, which receives the control signal based on this operation input, switches the input mode to the "gesture input mode," and the display screen 30a (input unit 30b) functions as the gesture input area 30c. The examiner can input various instructions to the ophthalmologic apparatus 100B by performing gesture inputs as shown in FIGS. 5 to 11 in the gesture input area 30c.
  • The ophthalmologic apparatus 100B of the third embodiment is configured not to accept gesture input in the "normal input mode" and not to accept normal input in the "gesture input mode," thereby suppressing false detections and erroneous operations and improving examination accuracy.
  • In the "gesture input mode," the control unit 26 may control the display unit 30 to keep an operation screen such as the subjective test operation screen 40B displayed on the display screen 30a, or may erase the operation screen. Furthermore, if the control unit 26 is configured to control the display unit 30 to display the trajectory of the gesture input on the display screen 30a while erasing the operation screen, the examiner can check whether the movement of his or her gesture input is appropriate.
  • Since the examiner inputs the instruction to switch the input mode via the gesture input mode switching icon 47 rather than by a gesture, the control unit 26 can recognize the instruction more clearly and switch the input mode more reliably. Therefore, the ophthalmologic apparatus 100B can further improve the detection accuracy of gesture input. The examiner can then appropriately perform gesture input using the gesture input area 30c provided over the entire surface of the display screen 30a.
  • Moreover, the examiner can also use a non-directional single-touch operation for operation input in the "gesture input mode." Furthermore, the examiner can also use a directional touch operation or a non-directional multi-touch operation for operation input in the "normal input mode," which expands the variety of touch operations.
  • The ophthalmologic apparatus 100B of the third embodiment is configured not to accept gesture input in the "normal input mode" and not to accept normal input in the "gesture input mode."
  • However, even in the "gesture input mode," the configuration may be such that normal input can be performed by a predetermined operation.
  • In this case, in the "gesture input mode," the control unit 26 controls the display unit 30 to display the operation screen on the display screen 30a with reduced brightness (dimmed), and basically accepts gesture input. Then, for example, when a long-press touch operation is performed on the correction value setting area 41, the control unit 26 may be configured to treat it as a normal input and display a correction value selection screen as a pop-up.
  • the ophthalmological apparatuses 100 to 100B of the first to third embodiments described above may have a "gesture input practice mode" as an input mode.
  • This "gesture input practice mode” is an input mode for the examiner to practice gesture input or for the ophthalmological apparatus 100 itself to learn gesture input. Switching to the "gesture input practice mode” may be performed using a "practice mode switching icon” or a switch provided on the operation screen, or may be performed by inputting a predetermined gesture.
  • When the examiner practices gesture input, the examiner performs a predetermined gesture input in the gesture input area 30c. The control unit 26 then analyzes this gesture input and controls the display unit 30 to display on the display screen 30a the name of the item to be changed (for example, spherical power, astigmatic power, astigmatic axis, examination distance, etc.), the numerical value after the change (preferably a numerical value corresponding to the amount of operation at the time of the gesture input), and furthermore the trajectory of the operation. Therefore, the examiner can appropriately know which item the gesture instructs to change, know the amount of change in the numerical value relative to the amount of operation, and know whether or not his or her gesture input is appropriate.
  • the examiner performs gesture input in the gesture input area 30c according to the item to be changed (correction value, optotype, examination distance, etc.).
  • the control unit 26 analyzes this gesture input and acquires the type of operation (pinch-in operation, pinch-out operation, flick operation, drag operation, swipe operation, etc.), operation direction, operation amount, etc. as gesture information.
  • the control unit 26 ranks the acquired gesture information, displays it on the display screen 30a in association with the item to be changed, and stores it in the storage unit 29 (or the storage device of the examiner controller 27).
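The ranking of acquired gesture information described above can be sketched as a frequency count of the gesture samples recorded for one change item; the most frequent combination would then be treated as the examiner's habitual gesture for that item. The storage format and the ranking rule below are illustrative assumptions.

```python
from collections import Counter

def rank_gesture_info(samples):
    """Rank recorded gesture samples for one change item (a sketch).

    `samples` is a list of (operation_type, direction) tuples captured
    in the gesture input practice mode. Gestures are ranked by how
    often the examiner performed them; both the tuple format and the
    frequency-based ranking are assumptions for illustration.
    """
    counts = Counter(samples)
    return [gesture for gesture, _ in counts.most_common()]
```

The resulting ranking could be stored in the storage unit 29 and displayed in association with the item to be changed, as the text describes.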
  • the ophthalmologic apparatus 100 may be equipped with AI (artificial intelligence), and may learn the state of the examiner's gesture input using the AI.
  • Thereby, the ophthalmologic apparatuses 100 to 100B can grasp and learn gesture inputs that vary from examiner to examiner, each examiner's gesture input habits, and the like, can detect gesture input more accurately and appropriately, and can acquire information about the eye E to be examined more efficiently.
  • the examiner controllers 27 and 27A of the ophthalmological apparatuses 100 to 100B of the first to third embodiments may include a notification unit such as a vibrator, a light source, and a sound output unit. These may be provided in a tablet terminal, a smartphone terminal, etc., or may be provided separately.
  • When the control unit 26 recognizes a gesture input, it is configured to notify the examiner that the gesture input has been accepted by vibrating the vibrator, changing the light intensity (light emission state) of the display screen 30a, or outputting a voice or a buzzer sound from the sound output unit.
  • control unit 26 may be configured to issue a notification using the notification unit when the gesture input is not appropriate (error) or when the upper or lower limit of numerical value change has been reached and no further changes can be made.
  • control unit 26 may be configured to change the vibration frequency, light emission state, voice, or buzzer sound to notify the examiner depending on the situation. Thereby, the examiner can clearly understand whether or not the gesture input was performed appropriately, and can perform the gesture input more appropriately.
  • Further, the ophthalmologic apparatuses 100 to 100B of the first to third embodiments may be configured to output (record) a log of the input information input in the gesture input mode to the storage unit 29 serving as an output unit, or to print it out with a printer.
  • Thereby, the examiner or the like can check whether the examination was conducted appropriately by visually checking the display unit 30 on which the input information stored in the storage unit 29 is displayed or by checking the printed matter, and if a problem has occurred, can appropriately identify its cause. Furthermore, by referring to these records, the examiner can perform the next examination more appropriately.

Abstract

The present invention relates to an ophthalmological device that can acquire information about an eye to be examined with a simpler operation and higher efficiency. The ophthalmological device (100) comprises: a measurement head (16) having a measurement optical system (21), which is an information acquisition unit used to acquire information about an eye to be examined; a display unit (30) having a touch-panel display screen (30a) on which an operation image is displayed; and a control unit (26) that controls the measurement head (16) based on input information including a touch operation on the display screen (30a). The control unit (26) has: a normal input mode for controlling the measurement optical system (21) based on input information for the operation image; and a gesture input mode for controlling the measurement optical system (21) based on input information for a gesture input area (30c) by gesture input.
PCT/JP2023/026594 2022-08-05 2023-07-20 Ophthalmological device WO2024029359A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022125914A JP2024022372A (ja) 2022-08-05 2022-08-05 Ophthalmologic apparatus
JP2022-125914 2022-08-05

Publications (1)

Publication Number Publication Date
WO2024029359A1 true WO2024029359A1 (fr) 2024-02-08

Family

ID=89848894

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/026594 WO2024029359A1 (fr) 2022-08-05 2023-07-20 Ophthalmological device

Country Status (2)

Country Link
JP (1) JP2024022372A (fr)
WO (1) WO2024029359A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012090976A (ja) * 2010-09-30 2012-05-17 Nidek Co Ltd Ophthalmologic apparatus
JP2015223518A (ja) * 2014-05-29 2015-12-14 株式会社トプコン Ophthalmologic apparatus
JP2017176545A (ja) * 2016-03-30 2017-10-05 株式会社ニデック Ophthalmologic apparatus and ophthalmologic apparatus control program
WO2018105512A1 (fr) * 2016-12-06 2018-06-14 株式会社ニデック Observation system and observation control program


Also Published As

Publication number Publication date
JP2024022372A (ja) 2024-02-16

Similar Documents

Publication Publication Date Title
JP7007436B2 (ja) Ophthalmologic apparatus
JP7320662B2 (ja) Ophthalmologic apparatus
JP2019216817A (ja) Ophthalmologic apparatus
JP2020127588A (ja) Ophthalmologic apparatus
WO2015102092A1 (fr) Ophthalmological device
JP2023138863A (ja) Ophthalmologic apparatus
WO2024029359A1 (fr) Ophthalmological device
JP7249097B2 (ja) Ophthalmologic apparatus and optometry system
EP4342360A1 (fr) Ophthalmic apparatus
US20220369921A1 Ophthalmologic apparatus and measurement method using the same
WO2024070829A1 (fr) Ophthalmological device
JP7279498B2 (ja) Subjective optometry apparatus
JP7248770B2 (ja) Ophthalmologic apparatus
JP2024047535A (ja) Ophthalmologic apparatus
JP7216562B2 (ja) Ophthalmologic apparatus
JP7166080B2 (ja) Ophthalmologic apparatus
JP7068061B2 (ja) Ophthalmologic apparatus
CN117752294A (zh) Ophthalmologic apparatus
JP2023142781A (ja) Ophthalmologic system
JP2024044015A (ja) Ophthalmologic apparatus
JP2022038942A (ja) Optometry apparatus and control program for optometry apparatus
JP2020069201A (ja) Optometry system
JP2020138002A (ja) Ophthalmologic apparatus and operating method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23849914

Country of ref document: EP

Kind code of ref document: A1