TW201124892A - Display with an optical sensor - Google Patents

Display with an optical sensor

Info

Publication number
TW201124892A
TW201124892A TW099133841A TW99133841A
Authority
TW
Taiwan
Prior art keywords
optical sensor
dimensional optical
designated area
function
display
Prior art date
Application number
TW099133841A
Other languages
Chinese (zh)
Inventor
John P Mccarthy
Bradley Neal Suggs
Robert Campbell
Original Assignee
Hewlett Packard Development Co
Priority date
Filing date
Publication date
Priority claimed from PCT/US2009/051587 (WO2011011008A1)
Priority claimed from PCT/US2009/051599 (WO2011011009A1)
Application filed by Hewlett Packard Development Co
Publication of TW201124892A


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416 Control or interface arrangements specially adapted for digitisers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/041 Indexing scheme relating to G06F 3/041 - G06F 3/045
    • G06F 2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A display system can include a panel 110. The panel 110 can include a perimeter 117 and can display images on a front side. A bezel 170 can extend from the perimeter of the panel. The display system can include a designated area 140 on the bezel. A three dimensional optical sensor 115 can generate information to determine if an object is in contact with the designated area on the bezel.

Description

TECHNICAL FIELD

The present invention relates to a display with an optical sensor.

BACKGROUND OF THE INVENTION

A resistive touch screen panel is composed of two thin, metallic, electrically conductive layers separated by a narrow gap. When an object, such as a finger, presses down on a point on the panel's outer surface, the two metallic layers become connected at that point, and the panel then behaves as a pair of voltage dividers with connected outputs. This causes a change in the electrical current that is registered as a touch event and sent to the controller for processing. A capacitive touch screen panel is a sensor that is a capacitor in which the plates include overlapping areas between the horizontal and vertical axes in a grid pattern. The human body also conducts electricity, and a touch on the surface of the sensor affects the electric field and creates a measurable change in the capacitance of the device.

SUMMARY OF THE INVENTION

According to an embodiment of the present invention, a display system is provided that comprises: a panel including a perimeter to display images on a front side; a bezel extending from the perimeter of the panel; a designated area on the bezel; and a three-dimensional optical sensor to generate information for determining whether an object is in contact with the designated area on the bezel.

BRIEF DESCRIPTION OF THE DRAWINGS

Some embodiments of the invention are described with respect to the following figures:

Fig. 1 is a display according to an exemplary embodiment of the invention;
Fig. 2 is a portion of the display according to an exemplary embodiment of the invention;
Fig. 3 is a three-dimensional optical sensor according to an exemplary embodiment of the invention;
Fig. 4 is a display according to an exemplary embodiment of the invention;
Fig. 5 is a display according to an exemplary embodiment of the invention;
Fig. 6 is a block diagram according to an exemplary embodiment of the invention; and
Fig. 7 is a flow chart of an exemplary embodiment of the method according to the invention.

DETAILED DESCRIPTION

A computing system can have buttons that are programmed to perform a function. For example, a computing system can have buttons to open a web browser, open electronic mail, or adjust the volume. The buttons can be mechanical buttons, capacitive buttons, or resistive buttons.

A mechanical button can include, for example, a dome; if the dome is deformed by actuation of the button, a contact of the dome forms a connection with another contact layer and with a function of the computing system. A mechanical button can have a useful life after which its functionality degrades; for example, if the button is used often, the contacts that generate the signal can become inoperative. Mechanical buttons are also sensitive to contamination from foreign substances such as dust or liquids. A capacitive button can react to an object, such as a user's finger, that changes the capacitance in the vicinity of the button area.

In one embodiment, a three-dimensional optical sensor is mounted behind a transparent layer of a display. The three-dimensional optical sensor can be mounted outside the perimeter of the panel of the display system. Mounting the three-dimensional optical sensor outside the perimeter of the panel increases the amount of light transferred from the panel to the user, because no layer that is part of the display system is attached to the transparent layer; such an attachment would reduce the light transfer or degrade the clarity of the images generated by the panel. Mounting the three-dimensional optical sensor behind the transparent layer of the display protects the optical sensor from contamination by foreign substances such as dust. If the view of the optical sensor is obstructed by foreign substances on the transparent layer, the transparent layer can be cleaned without having to separately clean the optical sensor.

The three-dimensional optical sensor can have a field of view that extends beyond the perimeter of the display panel. Adjacent to the perimeter of the panel can be a bezel. The bezel can be part of the transparent layer or can be a separate piece. The bezel can include designated areas. The designated areas on the bezel can be viewed by the three-dimensional optical sensor to determine whether an object is in contact with a designated area on the bezel. Determining the location of an object with a three-dimensional optical sensor allows multiple designated areas on the bezel to be used with a single three-dimensional optical sensor.

In one embodiment, a display system can include a panel. The panel can include a perimeter and can display images on a front side. A bezel can extend from the perimeter of the panel. An area on the bezel can be designated to perform a function if it is contacted. A three-dimensional optical sensor can determine whether an object is in contact with the designated area.

Referring to the figures, Fig. 1 is a display system 100 according to an exemplary embodiment of the invention. The display system 100 includes a panel 110 and a transparent layer 105 in front of the surface 116 of the panel 110 that displays images. The front of the panel 110 is the surface 116 that displays an image, and the back of the panel 110 is opposite the front. A three-dimensional optical sensor 115 can be on the same side of the transparent layer as the panel 110 to protect the three-dimensional optical sensor 115 from contamination. In an alternative embodiment, the three-dimensional optical sensor 115 can be in front of the transparent layer 105. The transparent layer 105 can be glass, plastic, or another transparent material. The panel 110 can be, for example, a liquid crystal display (LCD) panel, a plasma display, a cathode ray tube (CRT), an OLED, or a projection display such as digital light processing (DLP). In one embodiment, mounting the three-dimensional optical sensor in an area of the surface 116 of the panel 110 of the display system 100 outside the perimeter 117 keeps the clarity of the transparent layer from being reduced by the three-dimensional optical sensor.

The three-dimensional optical sensor 115 can determine the depth, from the three-dimensional optical sensor, of an object 120 located in the field of view 135 of the three-dimensional optical sensor 115. In one embodiment, the depth of the object 120 can be used to determine whether the object is in contact with a designated area 140. In one embodiment, the depth of the object can be used to determine whether the object is within a programmed distance 130 of the display but not contacting the designated area 140. For example, the object 120 can be a user's hand and finger approaching the designated area 140.

The display system 100 can include a designated area 140 on the bezel 170 of the display system 100. The designated area 140 can be an area on the bezel 170 that, when contacted by an object, performs a function of the computing system. In one embodiment, the designated area 140 can be printed on the bezel 170 to identify to a user where the designated area is located on the bezel. The function of the designated area can also be indicated on the bezel 170 to identify which function occurs if the designated area 140 on the bezel 170 is contacted. In one embodiment, the indication can be provided by a light source that projects a pattern onto the bezel from the front or the back of the bezel. The projected pattern can change if the function of the designated area changes. The pattern can be, for example, text describing the function, a symbol describing the function of the designated area 140, or some other indication of the function. Light from the light source 125 can be reflected from the object 120 and captured by the three-dimensional optical sensor 115 if the object 120 is in the field of view 135 of the three-dimensional optical sensor 115.

In one embodiment, a gap 114 can exist between the transparent layer 105 and the panel 110. The gap 114 allows the three-dimensional optical sensor 115 to have a view of the transparent layer 105 from between the transparent layer 105 and the panel 110. The gap 114 can also extend past the perimeter of the panel into the area behind the bezel. In one embodiment, the bezel is opaque to visible light but transmits the light captured by the three-dimensional optical sensor. If the bezel transmits the light captured by the three-dimensional optical sensor, the light can travel in the gap from the bezel to the three-dimensional optical sensor. The gap can be, for example, from 0.1 centimeter to 0.5 centimeter, but the gap can be other amounts. The field of view of the three-dimensional optical sensor 115 includes the perimeter 117 on the transparent layer 105. If the bezel does not transmit the light captured by the three-dimensional optical sensor, the three-dimensional optical sensor can be positioned so that its field of view includes the surface of the bezel where the designated areas are located. In one embodiment, a prism can be used to bend the light so that the bezel is included in the field of view of the three-dimensional optical sensor.

In one embodiment, the optical sensor can be configured after it is attached to the panel. For example, after the optical sensor is attached to the display, a computer that displays information on the panel can be trained by displaying objects on the panel. The user can then contact the display where the panel displays the objects, and the computer can calibrate the optical sensor so that later contacts with the display are interpreted by the computer as contacts with the display.

Fig. 2 is a portion of the display 200 according to an exemplary embodiment of the invention. The portion of the display 200 can include multiple designated areas 240-242. The designated areas 240-242 can be located on the bezel 270 outside the perimeter 217 of the display panel 216. The designated areas can operate as buttons 240a-c. The buttons can perform display functions such as activating an adjustment menu or changing the input of the display; in another example, a button can open an application such as a web browser or an electronic mail program, or can change computer settings such as the network settings. The designated area 241 can be a scroll wheel. For example, the scroll wheel can be used to move a cursor up or down in a document, to move up and down in a list, or to perform another function. The designated area 242 can be a scroll bar. For example, the scroll bar can be used to change the audio volume of the computing system, to zoom in on an object, or to perform another function.

Fig. 3 is a three-dimensional optical sensor 315 according to an exemplary embodiment of the invention. The three-dimensional optical sensor 315 can receive light from a light source 325 that is reflected from an object 320. The light source 325 can be, for example, an infrared light or a laser light source that emits light that is invisible to the user. The light source 325 can be in any position relative to the three-dimensional optical sensor 315 that allows the light to reflect off the object 320 and be captured by the three-dimensional optical sensor 315. In one embodiment, the infrared light can reflect from an object 320, which can be a user's hand, and be captured by the three-dimensional optical sensor 315. Objects in a three-dimensional image are mapped to different planes, giving a Z-order (order in distance) for each object. The Z-order can enable a computer program to distinguish foreground objects from background objects and can enable a computer program to determine the distance of an object from the display.

Two-dimensional sensors that use a triangulation-based method such as stereo can involve intensive image processing to approximate the depth of objects. The two-dimensional image processing uses data from a sensor and processes the data to generate data that is normally not available from a two-dimensional sensor. Intensive image processing may not be needed for a three-dimensional sensor, because the data from the three-dimensional sensor includes depth data. For example, the image processing for a time-of-flight three-dimensional optical sensor can involve a simple table lookup to map the sensor readings to the distance of an object from the display. The time-of-flight sensor determines the depth of an object from the sensor from the time that it takes for light to travel from a known source, reflect from the object, and return to the three-dimensional optical sensor. The depth of an object in the image can be determined from the three-dimensional optical sensor without using a second three-dimensional optical sensor to determine the distance of the object in the image.

In an alternative embodiment, the light source can emit structured light, which is the projection of a light pattern, such as a plane, a grid, or a more complex shape, at a known angle onto an object. The way the light pattern deforms when striking surfaces allows vision systems to calculate the depth and surface information of the objects in the scene. Integral imaging is a technique that provides a full parallax stereoscopic view. To record the information of an object, an array of microlenses can be used together with the optical sensor, and the recorded image that contains the elemental images from each microlens can be electronically transferred and then reconstructed in image processing. In some embodiments, the integral imaging lenses can have different focal lengths, and the depth of the object is determined based on whether the object is in focus, for a focus sensor, or out of focus, for a defocus sensor. Embodiments of the invention are not limited to the types of three-dimensional optical sensors described above; any type of three-dimensional sensor can be used.

Fig. 4 is a display according to an exemplary embodiment of the invention. A computing system can perform different functions depending on which designated areas are contacted by two contacts. A display system 400 that can sense more than one object 420 is able to perform additional tasks.

In one embodiment, there is a first three-dimensional optical sensor 415 and a second three-dimensional optical sensor 417. The first three-dimensional optical sensor 415 can have a field of view that includes a portion 455 containing a view of a designated area 440. In an embodiment that includes a gap between the transparent layer 405 and the panel, part of the field of view can extend beyond the transparent layer 405. Within the field of view 455, an image of an object 422 can be captured. Because the first object 422 is between the first three-dimensional optical sensor 415 and a second object 420, the second object 420 cannot be seen by the first three-dimensional optical sensor 415: along the portion 455 of the field of view, the view is obstructed by the first object 422. The second three-dimensional optical sensor 417 can capture, within a portion 460 of its field of view, an image that includes the depth of both the first object 422 and the second object 420. The first three-dimensional optical sensor 415 can determine the distance of the first object 422, for example a user's finger. The first three-dimensional optical sensor 415 may not be able to capture the second object 420, for example a finger on the user's other hand, if the view of the second object 420 from the first three-dimensional optical sensor 415 is obstructed by the first object 422. The first three-dimensional optical sensor 415 and the second three-dimensional optical sensor 417 can be located in corners of the display system 400, or the optical sensors can be located anywhere in or on the display, such as the top, the bottom, or the sides. For example, the first object 422 can be in contact with an object displayed on the surface 416 of the panel while the second object 420 is in contact with the designated area 440; if the first object 422 obstructs the view of the second object 420 from the first three-dimensional optical sensor 415, the second three-dimensional optical sensor 417 can be used to detect the second object 420 contacting the designated area 440.

In one embodiment, the second three-dimensional optical sensor 417 can also be used if the field of view of the first three-dimensional optical sensor 415 is too narrow to detect objects contacting a designated area 441 located on the same side of the display panel as the first three-dimensional optical sensor 415. For example, the first three-dimensional optical sensor 415 can see a designated area located across the panel, but may not have a field of view that includes the designated area 441 located on the same side of the panel as the first three-dimensional optical sensor 415; the designated area 441 can instead be seen by the second three-dimensional optical sensor 417.

Because the depth from the three-dimensional optical sensor is known, the optical sensor can be used to determine the size of an object. If the depth from the optical sensor were not known, an image of an object 422 could appear the same as a larger object 420 that is farther away from the optical sensor. The size of the object can be used by the computing system to determine the type of object, such as a hand, a finger, a pen, or another object, that is contacting the designated area.

Fig. 5 is a display according to an exemplary embodiment of the invention. The optical sensor has a viewable area that extends beyond the perimeter 517 of the display panel 510. Movement of objects beyond the perimeter 517 can activate functions of a computer system. In one embodiment, a designated area 540 can act as a button and can be located outside the perimeter 517 of the display panel 510. The designated area 540 can be a symbol or text printed on the bezel 570 that surrounds the display panel 510. The designated area has no moving parts and is not electrically connected to the computer system 580. The optical sensor 515 can detect when an object, such as a user's finger, contacts a designated area 540. In one embodiment, the display system can be enclosed in a housing that also encloses a computing system 580, or the computing system can be in a housing separate from the housing of the display system.

Fig. 6 is a block diagram according to an exemplary embodiment of the invention. The optical sensor module 600 includes the light source 625 and the optical sensor 615. The optical sensor module 600 can capture data that includes the height, width, and depth of an object in an image. The optical sensor module 600 can connect to a communication port 670 to transmit the captured data to a computing device. The communication port 670 can be a communication port 670 on a computing device; for example, the communication port 670 can be a universal serial bus (USB) port or an IEEE 1394 port. In one embodiment, the communication port 670 can be part of an input/output controller 675 of the computing device. The input/output controller 675 can be connected to a computer-readable medium 685. The input/output controller 675 of the computing device can also be connected to a controller 680.

The controller 680 can receive the data captured by the three-dimensional optical sensor module 600 through the communication port 670 of the input/output controller 675. The controller 680 can determine, from the data captured by the three-dimensional optical sensor module 600, the distance of an object from the optical sensor module 600. The controller 680 can determine the distance of the object from the optical sensor 615 and can determine, from the data provided by the optical sensor 615, whether the object is in contact with a bezel of the display system. The controller 680 can be programmed to link a designated area on the bezel to a function of the display or of the computing system. In one embodiment, the controller 680 is a processor or an application-specific integrated circuit (ASIC).

Fig. 7 is a flow chart of an exemplary embodiment of the method according to the invention. The method begins by detecting a designated area on a display bezel with a three-dimensional optical sensor (block 710). The computing device can determine, from information provided by the three-dimensional optical sensor, whether an object is less than a contact distance from the area (block 720). The computing device can determine, from the depth information of an object contacting the designated area, whether the distance of the object from the display system is substantially zero centimeters. In one embodiment, substantially zero means that the resolution of the three-dimensional optical sensor may not be able to determine contact with the display, and an object that is less than a contact distance from the display system can have depth information from the three-dimensional optical sensor that the computing device determines to be a distance of zero and a contact with the display system. A contact distance can be, for example, 0.2 centimeters from the designated area, but it can be another distance. If the object is in contact with the designated area, the calculated distance between the object and the designated area is zero.

If the object is within the programmed contact distance of the designated area, the computing device can perform a function (block 730). The function performed by the computing system can be, for example, controlling the volume, controlling the brightness of the display, or controlling multimedia functions such as play, stop, pause, fast forward, and rewind, or it can be another function of the computing system. In one embodiment, the function continues as long as the object remains within the contact distance. For example, if the function is to increase the volume, the volume continues to increase until the object is no longer within the contact distance. In another embodiment, a second function can be performed if the designated area is contacted for more than a programmed period of time. For example, if the designated area is for lowering the volume, contacting the area for more than the programmed period of time can mute the sound.

The techniques described above can be embodied in a computer-readable medium for configuring a computing system to execute the method. The computer-readable media can include, for example and without limitation, any number of the following: magnetic storage media including disk and tape storage media; optical storage media such as compact disk media (e.g., CD-ROM, CD-R, etc.) and digital video disk storage media; holographic memory; nonvolatile memory storage media including semiconductor-based memory units such as flash memory, EEPROM, EPROM, ROM; ferromagnetic digital memories; volatile storage media including registers, buffers or caches, main memory, RAM, etc.; and the Internet, just to name a few. Other new and various types of computer-readable media can be used to store and/or transmit the software modules discussed herein. Computing systems can be found in many forms, including but not limited to mainframes, minicomputers, servers, workstations, personal computers, word processors, personal digital assistants, various wireless devices, and embedded systems, just to name a few.

In the foregoing description, numerous details are set forth to provide an understanding of the present invention. However, it will be understood by those skilled in the art that the present invention can be practiced without these details. While the invention has been disclosed with respect to a limited number of embodiments, those skilled in the art will appreciate that numerous modifications and variations can be made. It is intended that the appended claims cover such modifications and variations as fall within the true spirit and scope of the invention.

DESCRIPTION OF THE REFERENCE NUMERALS

100, 200, 400: display system; 105, 405: transparent layer; 110, 216, 510: panel; 114: gap; 115, 315, 515, 615: three-dimensional optical sensor; 116: surface; 117, 217, 517: perimeter; 120, 320: object; 125, 325, 625: light source; 130: programmed distance; 135: field of view; 140, 240, 241, 242, 440, 441, 540: designated area; 170, 270, 570: bezel; 240a-c: buttons; 415: first three-dimensional optical sensor; 417: second three-dimensional optical sensor; 420: second object; 422: first object; 455, 460: portion of a field of view; 580: computer system; 600: optical sensor module; 670: communication port; 675: input/output controller; 680: controller; 685: computer-readable medium; 710, 720, 730: blocks.
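The detailed description notes that a time-of-flight sensor derives depth from the round-trip travel time of light and that the image processing can reduce to a simple lookup that maps a raw sensor reading to a distance from the display. The Python sketch below illustrates that idea only; the raw-reading scale, table size, and function names are assumptions for illustration and are not taken from the patent.

```python
# Illustrative sketch: mapping a time-of-flight reading to object depth.
# Assumes (hypothetically) that the sensor reports round-trip time in
# nanoseconds and that raw readings are 12-bit values.

SPEED_OF_LIGHT_CM_PER_NS = 29.9792458  # centimeters per nanosecond

def depth_from_round_trip(round_trip_ns: float) -> float:
    """Depth in centimeters from the round-trip travel time of the light."""
    return round_trip_ns * SPEED_OF_LIGHT_CM_PER_NS / 2.0  # out and back

# A pre-computed table can replace the arithmetic, as the description
# suggests: raw reading (index) -> calibrated depth in centimeters.
CALIBRATION_TABLE = [depth_from_round_trip(t * 0.01) for t in range(4096)]

def depth_from_raw_reading(raw: int) -> float:
    """Map a 12-bit raw sensor reading to a depth via table lookup."""
    index = max(0, min(raw, len(CALIBRATION_TABLE) - 1))
    return CALIBRATION_TABLE[index]
```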
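The method of Fig. 7 treats an object as touching a designated area when its reported depth is within a small contact distance of the bezel surface, with 0.2 centimeters given as an example and separations below the sensor's resolution treated as "substantially zero". A minimal sketch of that test follows, assuming the depth of the bezel surface at each designated area is known from calibration; the names and data layout are hypothetical.

```python
# Illustrative sketch of the contact test from blocks 720-730 of Fig. 7.
# bezel_depth_cm: calibrated distance from the sensor to the bezel surface
# at the designated area; object_depth_cm: depth reported for the object.

CONTACT_DISTANCE_CM = 0.2  # example value from the description

def is_contact(object_depth_cm: float, bezel_depth_cm: float,
               contact_distance_cm: float = CONTACT_DISTANCE_CM) -> bool:
    """True when the object is within the contact distance of the bezel.

    Readings at or slightly beyond the bezel surface (negative separation,
    e.g. from sensor noise) also count as contact.
    """
    separation = bezel_depth_cm - object_depth_cm
    return separation <= contact_distance_cm
```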
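Fig. 2 and the discussion of Fig. 7 describe designated areas that behave as buttons, a scroll wheel, or a scroll bar, with a function that repeats while contact is held and a second function (such as mute) after a programmed period of time. The controller-side dispatch could look roughly like the following sketch; the area layout, hold threshold, and callbacks are assumptions for illustration, not part of the patent.

```python
import time

# Illustrative sketch of linking designated areas on the bezel to functions,
# in the spirit of the controller 680. Coordinates, thresholds and callbacks
# below are hypothetical.

HOLD_SECONDS_FOR_SECOND_FUNCTION = 1.5  # the "programmed period of time"

class DesignatedArea:
    def __init__(self, name, bounds, on_contact, on_long_hold=None):
        self.name = name
        self.bounds = bounds              # (x0, y0, x1, y1) on the bezel
        self.on_contact = on_contact      # repeated while contact persists
        self.on_long_hold = on_long_hold  # optional second function
        self._contact_started = None

    def contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1

    def update(self, contact_point):
        """Call once per sensor frame with the contact point or None."""
        if contact_point and self.contains(*contact_point):
            now = time.monotonic()
            if self._contact_started is None:
                self._contact_started = now
            held = now - self._contact_started
            if self.on_long_hold and held >= HOLD_SECONDS_FOR_SECOND_FUNCTION:
                self.on_long_hold()       # e.g. mute instead of volume down
            else:
                self.on_contact()         # e.g. keep lowering the volume
        else:
            self._contact_started = None  # contact ended; stop repeating

# Example wiring (hypothetical callbacks):
areas = [
    DesignatedArea("volume_down", (0, 0, 30, 30),
                   on_contact=lambda: print("volume -1"),
                   on_long_hold=lambda: print("mute")),
    DesignatedArea("brightness_up", (40, 0, 70, 30),
                   on_contact=lambda: print("brightness +1")),
]
```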
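The description of Fig. 4 points out that, because depth is known, the apparent size of an object in the image can be converted to a physical size and used to tell a finger from a hand or a pen. A short sketch of that conversion using the pinhole-camera relation is shown below; the focal length, pixel units, and size thresholds are assumed values, not taken from the patent.

```python
# Illustrative sketch: physical width of an object from its width in pixels
# and its depth, using the pinhole-camera relation. Values are hypothetical.

FOCAL_LENGTH_PIXELS = 600.0  # assumed sensor focal length in pixel units

def physical_width_cm(width_pixels: float, depth_cm: float) -> float:
    """Approximate real-world width of an imaged object."""
    return width_pixels * depth_cm / FOCAL_LENGTH_PIXELS

def classify_contact(width_pixels: float, depth_cm: float) -> str:
    """Rough object-type guess of the kind the description mentions."""
    width_cm = physical_width_cm(width_pixels, depth_cm)
    if width_cm < 0.8:
        return "pen"
    if width_cm < 2.5:
        return "finger"
    return "hand"
```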
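Fig. 4 also describes falling back to a second three-dimensional optical sensor when the first sensor's view of a designated area is blocked by another object or lies outside its field of view. A minimal selection rule is sketched below; the sensor objects and their methods are an assumed, hypothetical interface used only to show the idea.

```python
# Illustrative sketch: prefer the first sensor's measurement, fall back to
# the second sensor when the first cannot see the designated area.
# The sensor interface (depth_in_region, is_occluded) is hypothetical.

def depth_at_area(primary_sensor, secondary_sensor, area_bounds):
    """Return (depth_cm, which_sensor) for a designated area, or None."""
    reading = primary_sensor.depth_in_region(area_bounds)
    if reading is not None and not primary_sensor.is_occluded(area_bounds):
        return reading, "primary"
    reading = secondary_sensor.depth_in_region(area_bounds)
    if reading is not None:
        return reading, "secondary"
    return None
```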

Claims (10)

1. A display system, comprising:
a panel including a perimeter to display images on a front side;
a bezel extending from the perimeter of the panel;
a designated area on the bezel; and
a three-dimensional optical sensor to generate information to determine whether an object is in contact with the designated area on the bezel.

2. The system of claim 1, further comprising a second three-dimensional optical sensor to determine whether an object is in contact with a second designated area located in a field of view of the second three-dimensional optical sensor.

3. The system of claim 1, further comprising an indication of a function to be performed if contact is made between the object and the designated area.

4. The system of claim 3, wherein the indication of the function is a pattern generated from a light source on the bezel.

5. The system of claim 4, wherein the designated area can perform multiple functions, and wherein the indication of the function can change.

6. The system of claim 1, wherein the function of the designated area is selected from one of a scroll wheel, a slider bar, and a button.

7. The system of claim 1, wherein the three-dimensional optical sensor is selected from the group consisting of a time-of-flight sensor, a structured light sensor, a focus sensor, and a defocus sensor.

8. A method, comprising:
detecting a designated area on a display bezel with a three-dimensional optical sensor;
determining, from data provided by the three-dimensional optical sensor, whether an object is less than a contact distance from the designated area; and
performing a function if the object is within a programmed contact distance of the designated area.

9. The method of claim 8, wherein the function is selected from one of controlling the volume, controlling the brightness of the display, and controlling multimedia functions.

10. The method of claim 8, further comprising associating an indication on the bezel with the function.
TW099133841A 2009-07-23 2010-10-05 Display with an optical sensor TW201124892A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/US2009/051587 WO2011011008A1 (en) 2009-07-23 2009-07-23 Display with an optical sensor
PCT/US2009/051599 WO2011011009A1 (en) 2009-07-23 2009-07-23 Display with an optical sensor
PCT/US2009/060283 WO2011011024A1 (en) 2009-07-23 2009-10-09 Display with an optical sensor

Publications (1)

Publication Number Publication Date
TW201124892A true TW201124892A (en) 2011-07-16

Family

ID=43499322

Family Applications (1)

Application Number Title Priority Date Filing Date
TW099133841A TW201124892A (en) 2009-07-23 2010-10-05 Display with an optical sensor

Country Status (2)

Country Link
TW (1) TW201124892A (en)
WO (1) WO2011011024A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102496169A (en) * 2011-11-30 2012-06-13 威盛电子股份有限公司 Method and device for drawing overlapped object
DE102014113654A1 (en) * 2014-09-22 2016-03-24 Assa Abloy Sicherheitstechnik Gmbh Panic push rod with emergency button and sliding display

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4529316A (en) * 1982-10-18 1985-07-16 Robotic Vision Systems, Inc. Arrangement of eliminating erroneous data in three-dimensional optical sensors
US6614422B1 (en) * 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
JP4429047B2 (en) * 2004-03-11 2010-03-10 キヤノン株式会社 Coordinate input device, control method therefor, and program
JP5201825B2 (en) * 2006-12-13 2013-06-05 富士フイルム株式会社 Distance image acquisition apparatus and method

Also Published As

Publication number Publication date
WO2011011024A1 (en) 2011-01-27

Similar Documents

Publication Publication Date Title
US9176628B2 (en) Display with an optical sensor
TWI484386B (en) Display with an optical sensor
CN103052928B (en) The system and method that many display inputs realize can be made
US20110267264A1 (en) Display system with multiple optical sensors
US20120200495A1 (en) Autostereoscopic Rendering and Display Apparatus
US8664582B2 (en) Display with an optical sensor
TWI461975B (en) Electronic device and method for correcting touch position
CN102741782A (en) Methods and systems for position detection
US20120120029A1 (en) Display to determine gestures
US20120319945A1 (en) System and method for reporting data in a computer vision system
KR20130108604A (en) Apparatus and method for user input for controlling displayed information
US10379680B2 (en) Displaying an object indicator
CN105492990A (en) Touch input association
TW201124892A (en) Display with an optical sensor
TW201234233A (en) Sensing system
US9274547B2 (en) Display with an optical sensor
TWI419017B (en) Input system having a sheet-like light shield
TWI476664B (en) Multi-touch optical input device and method thereof
TW201039215A (en) Touch system