CN102411474A - Mobile terminal and method of controlling operation of the same - Google Patents

Mobile terminal and method of controlling operation of the same

Info

Publication number
CN102411474A
CN102411474A · CN2011102900383A · CN201110290038A
Authority
CN
China
Prior art keywords
image
3D image
mobile terminal
stereoscopic information
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011102900383A
Other languages
Chinese (zh)
Other versions
CN102411474B (en)
Inventor
李珍术
金东玉
金泰润
辛承珉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
LG Electronics Inc
Original Assignee
LG Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc filed Critical LG Electronics Inc
Publication of CN102411474A publication Critical patent/CN102411474A/en
Application granted granted Critical
Publication of CN102411474B publication Critical patent/CN102411474B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B1/00Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40Circuits
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • G06T2207/10021Stereoscopic video; Stereoscopic image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20021Dividing image into blocks, subimages or windows
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Telephone Function (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a mobile terminal and a method for controlling the operation of the mobile terminal. The method comprises the following steps: dividing at least one of a first image and a second image into a plurality of blocks, the first and second images being capable of generating a three-dimensional (3D) image using binocular disparity; searching the first and second images for at least one pair of matching blocks; calculating depth information for each of the at least one pair of matching blocks based on a difference in position between the blocks of each pair; and calculating stereoscopic information of the 3D image based on the calculated depth information.

Description

Mobile terminal and method of controlling the operation of the same
Cross-Reference to Related Applications
This application claims the benefit of priority from Korean Patent Application No. 10-2010-0092610, filed with the Korean Intellectual Property Office on September 20, 2010, the disclosure of which is incorporated herein by reference.
Technical Field
The present invention relates to a mobile terminal and a method of controlling the operation of the same, and more particularly, to a mobile terminal capable of providing stereoscopic information about a three-dimensional (3D) image, and a method of controlling the operation of the mobile terminal.
Background
Mobile terminals are portable devices that can provide users with various services, such as voice calling, video calling, information input/output, and data storage.
As the types of services provided by mobile terminals diversify, an increasing number of mobile terminals are equipped with complex functions, such as capturing photos or motion pictures, playing music or video files, playing games, receiving broadcast programs, and providing wireless internet services, and have thus evolved into multimedia players.
Various attempts have been made to realize such complex functions as hardware devices or software programs. For example, various user interface (UI) environments have been developed that allow users to easily search for and select desired functions.
Meanwhile, various techniques have been developed for creating three-dimensional (3D) images by combining a plurality of two-dimensional (2D) images captured by cameras and processing the result of the combination. By applying these techniques to mobile terminals, it is possible to create and display various 3D images using a mobile terminal.
The stereoscopic quality of a 3D image is based on the disparity between the left-eye and right-eye images of the 3D image, and varies with the difference in position of objects between the left-eye and right-eye images. However, since there is not yet a method for properly measuring the stereoscopic quality of a 3D image, its evaluation often relies largely on the subjective opinions of viewers.
Therefore, a method is needed for quantitatively analyzing and measuring the stereoscopic quality of a 3D image, and for effectively using the result of the analysis and measurement to control the various operations performed by a mobile terminal.
Summary of the Invention
The present invention provides a mobile terminal and a method of controlling the operation of the same, in which stereoscopic information about a three-dimensional (3D) image can be used to effectively control various operations performed by the mobile terminal.
According to an aspect of the present invention, there is provided a method of controlling the operation of a mobile terminal, the method including dividing at least one of first and second images into a plurality of blocks, the first and second images being capable of realizing a 3D image using binocular disparity; searching for a plurality of pairs of matching blocks in the first and second images and calculating depth information for each pair of matching blocks based on the difference in position between the blocks of each pair; and calculating stereoscopic information of the 3D image based on the depth information of the pairs of matching blocks.
According to another aspect of the present invention, there is provided a mobile terminal including a display module configured to display thereon a 3D image based on first and second images that use binocular disparity; and a controller configured to divide at least one of the first and second images into a plurality of blocks, search for a plurality of pairs of matching blocks in the first and second images, calculate depth information for each pair of matching blocks based on the difference in position between the blocks of each pair, calculate stereoscopic information of the 3D image based on the depth information of the pairs of matching blocks, and display the stereoscopic information on the display module.
According to another aspect of the present invention, there is provided a method of controlling the operation of a mobile terminal, the method including displaying a 3D image on a display module based on first and second images that use binocular disparity; and displaying a stereoscopic information gauge on the display module, the stereoscopic information gauge indicating stereoscopic information of the 3D image, the stereoscopic information being calculated based on depth information of objects included in the first and second images.
According to another aspect of the present invention, there is provided a mobile terminal including a display module configured to display thereon a 3D image based on first and second images that use binocular disparity; and a controller configured to display a stereoscopic information gauge on the display module, the stereoscopic information gauge indicating stereoscopic information of the 3D image, the stereoscopic information being calculated based on depth information of objects included in the first and second images.
Brief Description of the Drawings
The above and other features and advantages of the present invention will become more apparent from the following detailed description of preferred embodiments thereof with reference to the attached drawings, in which:
FIG. 1 is a block diagram of a mobile terminal according to an exemplary embodiment of the present invention;
FIG. 2 is a front perspective view of the mobile terminal shown in FIG. 1;
FIG. 3 is a rear perspective view of the mobile terminal shown in FIG. 2;
FIGS. 4 to 7 are diagrams illustrating examples of how to calculate stereoscopic information for use in the mobile terminal shown in FIG. 1;
FIG. 8 is a flowchart of a method of controlling the operation of a mobile terminal according to an exemplary embodiment of the present invention;
FIG. 9 is a flowchart of a method of controlling the operation of a mobile terminal according to another exemplary embodiment of the present invention;
FIGS. 10 to 14 are diagrams illustrating various examples of the use of stereoscopic information; and
FIG. 15 is a diagram illustrating an example of a broadcast signal carrying stereoscopic information.
Detailed Description
The present invention will hereinafter be described in detail with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown.
As used herein, the term "mobile terminal" may indicate a mobile phone, a smartphone, a laptop computer, a digital broadcast receiver, a personal digital assistant (PDA), a portable multimedia player (PMP), a camera, a navigation device, a tablet computer, or an e-book reader. In this disclosure, the terms "module" and "unit" may be used interchangeably.
FIG. 1 illustrates a block diagram of a mobile terminal 100 according to an embodiment of the present invention. Referring to FIG. 1, the mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. Two or more of these components may be combined into a single unit, or some of them may be divided into two or more smaller units.
The wireless communication unit 110 may include a broadcast reception module 111, a mobile communication module 113, a wireless internet module 115, a short-range communication module 117, and a global positioning system (GPS) module 119.
The broadcast reception module 111 may receive a broadcast signal and/or broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may be a satellite channel or a terrestrial channel. The broadcast management server may be a server that generates broadcast signals and/or broadcast-related information and transmits them, or a server that receives and then transmits previously generated broadcast signals and/or broadcast-related information.
The broadcast-related information may include broadcast channel information, broadcast program information, and/or broadcast service provider information. The broadcast signal may be a TV broadcast signal, a radio broadcast signal, a data broadcast signal, a combination of a data broadcast signal and a TV broadcast signal, or a combination of a data broadcast signal and a radio broadcast signal. The broadcast-related information may be provided to the mobile terminal 100 through a mobile communication network, in which case it may be received by the mobile communication module 113 rather than by the broadcast reception module 111. The broadcast-related information may take various forms, for example an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H).
The broadcast reception module 111 may receive broadcast signals using various broadcast systems, such as terrestrial digital multimedia broadcasting (DMB-T), satellite digital multimedia broadcasting (DMB-S), media forward link only (MediaFLO), DVB-H, and integrated services digital broadcast-terrestrial (ISDB-T). In addition, the broadcast reception module 111 may be configured to be suitable for nearly all types of broadcast systems other than those set forth herein. The broadcast signal and/or broadcast-related information received by the broadcast reception module 111 may be stored in the memory 160.
The mobile communication module 113 may transmit wireless signals to, or receive wireless signals from, at least one of a base station, an external terminal, and a server through a mobile communication network. The wireless signals may include various types of data according to whether the mobile terminal 100 transmits or receives voice call signals, video call signals, or text/multimedia messages.
The wireless internet module 115 may be a module for wirelessly accessing the internet, and may be embedded in the mobile terminal 100 or installed in an external device. The wireless internet module 115 may use various wireless internet technologies, such as wireless local area network (WLAN), wireless broadband (WiBro), worldwide interoperability for microwave access (WiMAX), and high-speed downlink packet access (HSDPA).
The short-range communication module 117 may be a module for short-range communication, and may use various short-range communication technologies, such as Bluetooth, radio frequency identification (RFID), Infrared Data Association (IrDA), ultra wideband (UWB), and ZigBee.
The GPS module 119 may receive position information from a plurality of GPS satellites.
The A/V input unit 120 may be used to receive audio or video signals, and may include a camera 121 and a microphone 123. The camera 121 may process various image frames, such as still images or moving images, captured by an image sensor in a video call mode or an image capture mode. The image frames processed by the camera 121 may be displayed by a display module 151.
The image frames processed by the camera 121 may be stored in the memory 160 or may be transmitted to an external device through the wireless communication unit 110. The mobile terminal 100 may include two or more cameras 121.
The microphone 123 may receive external sound signals in a call mode, a recording mode, or a voice recognition mode, and may convert the sound signals into electrical audio data. In the call mode, the mobile communication module 113 may convert the electrical audio data into data that can readily be transmitted to a mobile communication base station, and may then output the converted data. The microphone 123 may use various noise removal algorithms to remove noise that may be generated while the external sound signals are being received.
The user input unit 130 may generate key input data based on user input for controlling the operation of the mobile terminal 100. The user input unit 130 may be implemented as a keypad, a dome switch, or a static-pressure or capacitive touch pad that can receive a command or information when pushed or touched by the user. Alternatively, the user input unit 130 may be implemented as a wheel, a jog dial or wheel, or a joystick capable of receiving a command or information when rotated. Still alternatively, the user input unit 130 may be implemented as a finger mouse. In particular, if the user input unit 130 is implemented as a touch pad and forms a mutual layer structure with the display module 151, the user input unit 130 and the display module 151 may collectively be referred to as a touch screen.
The sensing unit 140 determines the current state of the mobile terminal 100, such as whether the mobile terminal 100 is opened or closed, the position of the mobile terminal 100, and whether the mobile terminal 100 is in contact with the user, and generates sensing signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is a slider-type phone, the sensing unit 140 may determine whether the mobile terminal 100 is opened or closed. In addition, the sensing unit 140 may determine whether the mobile terminal 100 is powered by the power supply unit 190 and whether the interface unit 170 is connected to an external device.
The sensing unit 140 may include a proximity sensor 141, a pressure sensor 143, and a motion sensor 145. The proximity sensor 141 may detect whether there is an object nearby and approaching the mobile terminal 100 without any mechanical contact with the object. More specifically, the proximity sensor 141 may detect a nearby and approaching object by detecting a change in an alternating magnetic field or the rate of change of static capacitance. The sensing unit 140 may include two or more proximity sensors 141.
The pressure sensor 143 may determine whether pressure is being applied to the mobile terminal 100, or may measure the level of any pressure applied to the mobile terminal 100. The pressure sensor 143 may be installed in a part of the mobile terminal 100 where pressure detection is needed. For example, the pressure sensor 143 may be installed in the display module 151. In this case, a typical touch input can be distinguished from a pressure touch input, which is generated with a higher pressure level than a typical touch input, based on the data provided by the pressure sensor 143. In addition, when a pressure touch input is received through the display module 151, the level of pressure applied to the display module 151 upon detection of the pressure touch input can be determined based on the data provided by the pressure sensor 143.
The motion sensor 145 may determine the position and motion of the mobile terminal 100 using an acceleration sensor or a gyro sensor.
An acceleration sensor is a device that converts a change in acceleration into an electrical signal. With recent developments in micro-electromechanical system (MEMS) technology, acceleration sensors have been widely used in a variety of products for a variety of purposes, from detecting large motions, such as car collisions for automobile airbag systems, to detecting minute motions, such as hand movements for game input devices. In general, one or more acceleration sensors representing two or more axis directions are incorporated into a single package. There are some cases in which the detection of only one axis direction, for example the Z axis, is needed. Thus, when an X- or Y-axis acceleration sensor, rather than a Z-axis acceleration sensor, is required, the X- or Y-axis acceleration sensor may be mounted on an additional substrate, and the additional substrate may be mounted on a main substrate.
A gyro sensor is a sensor for measuring angular velocity, and may determine the relative direction of rotation of the mobile terminal 100 with respect to a reference direction.
The output unit 150 may output audio signals, video signals, and alarm signals. The output unit 150 may include the display module 151, an audio output module 153, an alarm module 155, and a haptic module 157.
The display module 151 may display various information processed by the mobile terminal 100. For example, if the mobile terminal 100 is in a call mode, the display module 151 may display a user interface (UI) or a graphical user interface (GUI) for making or receiving a call. If the mobile terminal 100 is in a video call mode or an image capture mode, the display module 151 may display a UI or GUI for capturing or receiving images.
If the display module 151 forms a mutual layer structure together with the user input unit 130 and is thus implemented as a touch screen, the display module 151 may serve as both an output device and an input device. If the display module 151 is implemented as a touch screen, it may also include a touch screen panel and a touch screen panel controller. The touch screen panel is a transparent panel attached to the exterior of the mobile terminal 100 and may be connected to an internal bus of the mobile terminal 100. The touch screen panel continuously monitors whether it is being touched by the user. Once a touch input to the touch screen panel is received, the touch screen panel transmits a number of signals corresponding to the touch input to the touch screen panel controller. The touch screen panel controller processes the signals transmitted by the touch screen panel and transmits the processed signals to the controller 180. Then, the controller 180 determines, based on the processed signals transmitted by the touch screen panel controller, whether a touch input has been generated and which part of the touch screen panel has been touched.
The display module 151 may include electronic paper (e-paper). E-paper is a reflective display technology, and can provide as high a resolution as ordinary ink on paper, wide viewing angles, and excellent visual properties. E-paper can be implemented on various types of substrates, such as plastic, metal, or paper substrates, and can display and maintain an image thereon even after power is cut off. In addition, since e-paper does not require a backlight assembly, it can reduce the power consumption of the mobile terminal 100. The display module 151 may be implemented as e-paper using electrostatically charged hemispherical twist balls, electrophoretic deposition, or microcapsules.
The display module 151 may include at least one of a liquid crystal display (LCD), a thin film transistor (TFT)-LCD, an organic light-emitting diode (OLED), a flexible display, and a three-dimensional (3D) display. The mobile terminal 100 may include two or more display modules 151. For example, the mobile terminal 100 may include an external display module (not shown) and an internal display module (not shown).
The audio output module 153 may output audio data received by the wireless communication unit 110 in a call reception mode, a call mode, a recording mode, a voice recognition mode, or a broadcast reception mode, or may output audio data present in the memory 160. In addition, the audio output module 153 may output various sound signals associated with the functions of the mobile terminal 100, such as receiving a call or a message. The audio output module 153 may include a speaker and a buzzer.
The alarm module 155 may output an alarm signal indicating the occurrence of an event in the mobile terminal 100. Examples of the event include receiving a call signal, receiving a message, and receiving a key signal. Examples of the alarm signal output by the alarm module 155 include an audio signal, a video signal, and a vibration signal. More specifically, the alarm module 155 may output an alarm signal upon receiving a call signal or a message. In addition, the alarm module 155 may receive a key signal and may output an alarm signal as feedback to the key signal. Thus, the user can easily recognize the occurrence of an event based on the alarm signal output by the alarm module 155. An alarm signal for notifying the user of the occurrence of an event may be output not only by the alarm module 155 but also by the display module 151 or the audio output module 153.
The haptic module 157 may provide various haptic effects (such as vibration) that can be perceived by the user. If the haptic module 157 generates vibration as a haptic effect, the intensity and pattern of the vibration generated by the haptic module 157 may be altered in various ways. The haptic module 157 may synthesize different vibration effects and output the result of the synthesis. Alternatively, the haptic module 157 may sequentially output different vibration effects.
In addition to vibration, the haptic module 157 may provide various other haptic effects, such as a haptic effect obtained using a pin array that moves perpendicularly to a contact skin surface, a haptic effect obtained by injecting or sucking in air through an injection hole or a suction hole, a haptic effect obtained by giving a stimulus to the surface of the skin, a haptic effect obtained through contact with an electrode, a haptic effect obtained using an electrostatic force, and a haptic effect of a hot or cold sensation obtained using a device capable of absorbing or generating heat. The haptic module 157 may be configured so that the user can recognize a haptic effect using the kinesthetic sense of the fingers or arms. The mobile terminal 100 may include two or more haptic modules 157.
The memory 160 may store various programs necessary for the operation of the controller 180. In addition, the memory 160 may temporarily store various data such as a phonebook, messages, still images, or moving images.
The memory 160 may include at least one of a flash memory type storage medium, a hard disk type storage medium, a multimedia card micro type storage medium, a card type memory (for example, a secure digital (SD) or extreme digital (XD) memory), a random access memory (RAM), and a read-only memory (ROM). The mobile terminal 100 may operate a web storage that performs the functions of the memory 160 on the internet.
The interface unit 170 may interface with an external device that can be connected to the mobile terminal 100. The interface unit 170 may be a wired/wireless headset, an external battery charger, a wired/wireless data port, a card socket for, for example, a memory card, a subscriber identification module (SIM) card or a user identity module (UIM) card, an audio input/output (I/O) terminal, a video I/O terminal, or an earphone. The interface unit 170 may receive data from an external device or may be powered by an external device. The interface unit 170 may transmit data provided by an external device to other components in the mobile terminal 100, or may transmit data provided by other components in the mobile terminal 100 to an external device.
When the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path for supplying power from the external cradle to the mobile terminal 100, or as a path for transmitting various signals from the external cradle to the mobile terminal 100.
The controller 180 may control the overall operation of the mobile terminal 100. For example, the controller 180 may perform various control operations related to making/receiving a voice call, transmitting/receiving data, or making/receiving a video call. The controller 180 may include a multimedia player module 181 that plays multimedia data. The multimedia player module 181 may be implemented as a hardware device and may be installed in the controller 180. Alternatively, the multimedia player module 181 may be implemented as a software program.
The power supply unit 190 may be supplied with power by an external power source or an internal power source, and may supply power to the other components in the mobile terminal 100.
The mobile terminal 100 may include a wired/wireless communication system or a satellite communication system, and may thus be able to operate in a communication system that transmits data in units of frames or packets.
The exterior structure of the mobile terminal 100 will hereinafter be described in detail with reference to FIGS. 2 and 3. The present invention can be applied to nearly all types of mobile terminals, such as folder-type, bar-type, rotary-type, and slider-type mobile terminals. However, for convenience, it is assumed that the mobile terminal 100 is a bar-type mobile terminal equipped with a full touch screen.
FIG. 2 illustrates a front perspective view of the mobile terminal 100, and FIG. 3 illustrates a rear perspective view of the mobile terminal 100. Referring to FIG. 2, the exterior of the mobile terminal 100 may be formed by a front case 100-1 and a rear case 100-2. Various electronic devices may be installed in the space formed by the front case 100-1 and the rear case 100-2. The front case 100-1 and the rear case 100-2 may be formed of a synthetic resin through injection molding. Alternatively, the front case 100-1 and the rear case 100-2 may be formed of a metal such as stainless steel (STS) or titanium (Ti).
The display module 151, a first audio output module 153a, a camera 121a, and first through third user input modules 130a through 130c may be disposed on the main body of the mobile terminal 100, particularly on the front case 100-1. Fourth and fifth user input modules 130d and 130e and the microphone 123 may be disposed on one side of the rear case 100-2.
If a touch pad is configured to overlap the display module 151 and thus form a mutual layer structure, the display module 151 may serve as a touch screen. Thus, the user can enter various information to the mobile terminal 100 simply by touching the display module 151.
The first audio output module 153a may be implemented as a receiver or a speaker. The camera 121a may be configured to be suitable for capturing a still or moving image of the user. The microphone 123 may be configured to properly receive the user's voice or other sounds.
The first through fifth user input modules 130a through 130e and sixth and seventh user input modules 130f and 130g may be collectively referred to as the user input unit 130, and any means can be employed as the first through seventh user input modules 130a through 130g as long as it can operate in a tactile manner. For example, the user input unit 130 may be implemented as a dome switch or a touch pad that can receive a command or information according to a pressing or touch operation by the user, or may be implemented as a wheel or jog type for rotating a key, or as a joystick. In terms of function, the first through third user input modules 130a through 130c may operate as function keys for entering commands such as start, end, or scroll, the fourth user input module 130d may operate as a function key for selecting an operating mode of the mobile terminal 100, and the fifth user input module 130e may operate as a hot key for activating a special function within the mobile terminal 100.
Referring to FIG. 3, two cameras 121b and 121c may be additionally provided at the rear of the rear case 100-2, and the sixth and seventh user input modules 130f and 130g and the interface unit 170 may be disposed on one side of the rear case 100-2.
The cameras 121b and 121c may have an image capture direction that is substantially opposite to that of the camera 121a, and may have a different resolution from that of the camera 121a. The cameras 121b and 121c may be used simultaneously to create a three-dimensional (3D) image in a 3D image capture mode, or may be used independently to create a two-dimensional (2D) image. One of the cameras 121b and 121c may be configured to be movable, so that the distance between the two cameras 121b and 121c can be adjusted by moving one of them closer to or farther from the other.
A flash 125 and a mirror may be disposed between the cameras 121b and 121c. When an image of an object is captured with the cameras 121b and 121c, the flash 125 may illuminate the object. The mirror may allow the user to see himself or herself when he or she wants to capture his or her own image.
Another audio output module (not shown) may be additionally provided on the rear case 100-2. The audio output module on the rear case 100-2 may realize a stereo function together with the audio output module 153 on the front case 100-1. The audio output module on the rear case 100-2 may also be used in a speakerphone mode.
The interface unit 170 may serve as a passage allowing the mobile terminal 100 to exchange data with an external device either by wire or wirelessly.
In addition to an antenna used for call communication, a broadcast signal reception antenna may be disposed on one side of the front case 100-1 or the rear case 100-2. The broadcast signal reception antenna may be installed so as to be able to extend from the front case 100-1 or the rear case 100-2.
The power supply unit 190 may be mounted on the rear case 100-2 and may supply power to the mobile terminal 100. The power supply unit 190 may be, for example, a rechargeable battery that can be detachably combined to the rear case 100-2 in order to be charged.
Fig. 4 to 7 is the figure that are used for explaining example how to calculate the steric information of using at portable terminal 100.Portable terminal 100 can use two cameras in the back of its main body, and promptly camera 121b and 121c create 3D rendering.For ease, hereinafter camera 121b and 121c are called the first camera 121b and the second camera 121c respectively.
With reference to figure 4 (a) and 4 (b),, can obtain first image 205 and second image 207 through utilizing the first camera 121b and the second camera 121c reference object 200.
First image 205 and second image 207 can correspond respectively to left eye and eye image, are used for using at the establishment 3D rendering.With reference to figure 4 (c), consider the parallax between first image 205 and second image 207, controller 180 can be through creating 3D rendering 210 with first image 205 and 207 combinations of second image.
3D rendering is the illusion that is used for creating at image the degree of depth, and therefore the technology of lively truly feels is provided for spectators.Two eyes general 65mm that is separated from each other.Therefore; When in two eyes each presented the different 2D image with real world; These 2D images can be projected on the retina of two eyes; And brain utilizes binocular parallax from the 2D retinal images, to extract the degree of depth, and binocular parallax results from the horizontal range of two eyes, and is one of most important factor that when design 3D display device, should be considered.
3D rendering 210 can be displayed on the display module 151, or can be printed.Portable terminal 100 also can use the essentially identical method that is used to create 3D rendering 210 to create the 3D video.
There are various methods of displaying a 3D image, such as the stereoscopic display method, which displays a 3D image with the use of glasses; the autostereoscopic display method, which displays a 3D image without the use of glasses and is also called glasses-free 3D; and the projection method, which uses holography. The stereoscopic display method is generally used in home TV sets, and the autostereoscopic display method is generally used in mobile terminals.
Examples of the autostereoscopic display method include, but are not limited to, the lenticular display method, the parallax barrier method, and the parallax illumination method. The lenticular display method involves placing a sheet of hemispherical lenticular lenses in front of a device that displays left-eye and right-eye images. The parallax barrier display method involves projecting left-eye and right-eye images through a parallax barrier. The parallax illumination method involves placing an illumination plate behind an LCD so as to make alternate columns of pixels visible to the left eye and the right eye.
The above methods of creating or displaying 3D images can be applied to mobile terminals and other devices.
The depth of an object in a 3D image may vary with the difference in position of the object between the left-eye and right-eye images. Various examples of how to calculate the depth of an object in a 3D image will hereinafter be described in detail.
FIG. 5 illustrates an example of how to calculate the depth of an object in a 3D image. Referring to FIG. 5, the z coordinate z_p of a point P, which is the position of an object, can be calculated using the triangle formed by the right-eye position R, the point P, and a point P2, and the triangle formed by the left-eye position L, the point P, and a point P1, where P2 is the projection of the point P onto a right-eye image plane IP2 and P1 is the projection of the point P onto a left-eye image plane IP1, as shown in equation (1):
z_p = f - 2df / (2d - (x″ - x′))    ... (1)
where x″ denotes the x coordinate of the point P2, x′ denotes the x coordinate of the point P1, 2d denotes the distance between the left eye and the right eye, and f denotes the distance between the eyes and the virtual screen.
Accordingly, the depth between the eyes (or the cameras) and the object, that is, the depth of the object, can be calculated using equation (2):
depth = f - z_p = 2df / (2d - (x″ - x′))    ... (2)
Referring to equation (2), the depth of an object can be calculated using the difference between the x coordinates of the object in the left-eye and right-eye images of a 3D image.
Since the point P1 is on the left side of the point P2, the result of subtracting the x coordinate of P1 from the x coordinate of P2 may have a positive value, which is referred to as positive parallax. In this case, the object may appear to be located behind the virtual screen.
On the other hand, when the result of subtracting the x coordinate of P1 from the x coordinate of P2 has a negative value, that is, when negative parallax occurs, the object may appear to be located in front of the virtual screen. When the points P1 and P2 coincide with each other, that is, when zero parallax occurs, the object may appear to be located on the virtual screen.
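As a rough illustration of equations (1) and (2), the short Python sketch below computes the depth of a point from the screen-plane x coordinates of its projections in the left-eye and right-eye images. The eye separation of 65 mm follows the figure mentioned earlier in this description; the virtual-screen distance and the sample coordinates are assumed values chosen only for the example, not values taken from the patent.

# Illustrative sketch of depth-from-disparity per equations (1) and (2).
# eye_separation follows the ~65 mm figure cited above; screen_distance
# and the sample coordinates are assumed example values.

def depth_from_disparity(x_right, x_left, eye_separation=65.0, screen_distance=500.0):
    """Return (z_p, depth) for a point whose projections have x coordinates
    x_left and x_right in shared virtual-screen coordinates (millimetres)."""
    d = eye_separation / 2.0      # half the distance between the two eyes
    f = screen_distance           # distance between the eyes and the virtual screen
    disparity = x_right - x_left  # positive, zero, or negative parallax
    z_p = f - (2.0 * d * f) / (2.0 * d - disparity)   # equation (1)
    depth = f - z_p                                   # equation (2)
    return z_p, depth

# Positive parallax: the object appears behind the virtual screen (depth > f).
print(depth_from_disparity(x_right=12.0, x_left=2.0))
# Zero parallax: the object appears on the virtual screen (depth == f).
print(depth_from_disparity(x_right=5.0, x_left=5.0))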
In the manner described above, the depth of an object in a 3D image, and the relationship between the depth of the object and the degree of eye fatigue, can be calculated. Medical research shows that viewers may begin to feel fatigued when the convergence angle formed while viewing an object exceeds 1.5 degrees. That is, the convergence angle becomes higher when focusing on a nearby object than when focusing on a distant object, and the greater the convergence angle, the more fatigued the viewer becomes.
FIGS. 6(a) and 6(b) illustrate another example of how to calculate the depth of an object in a 3D image. Referring to FIG. 6(a), at least one of a left-eye image 310 and a right-eye image 320 may be divided into a plurality of blocks. Thereafter, the right-eye image 320 may be searched for the block that matches a first block 311 of the left-eye image 310, that is, a second block 321.
To search the right-eye image 320 for the block that best matches the first block 311, a block matching algorithm using an evaluation function, such as a mean square error (MSE) function, a mean absolute error (MAE) function, or a mean absolute difference (MAD) function, may be used. When each of the left-eye and right-eye images 310 and 320 is divided into a plurality of M×N blocks, the MSE and MAE functions may be defined by equations (3) and (4), respectively:
MSE(i, j) = (1/MN) Σ_{m=0}^{M} Σ_{n=0}^{N} [L_k(m, n) - R_k(m+i, n+j)]²    ... (3); and
MAE(i, j) = (1/MN) Σ_{m=0}^{M} Σ_{n=0}^{N} |L_k(m, n) - R_k(m+i, n+j)|    ... (4)
where L_k denotes the k-th block of the left-eye image 310 and R_k denotes the k-th block of the right-eye image 320. The evaluation function may select, from the right-eye image 320, the block having the smallest MAD or MSE as the best matching block for the first block 311.
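The following sketch, offered only as an illustration and not as the patent's implementation, expresses the MSE and MAE evaluation functions of equations (3) and (4) in Python with NumPy and finds, for one block of the left-eye image, the best-matching block of the right-eye image. The search is restricted to horizontal offsets for the reason given in the next paragraph; the block size and search range are assumed example values, and the images are assumed to be 2D grayscale arrays.

import numpy as np

def mse(block_l, block_r):
    # Equation (3): mean square error between two equally sized blocks.
    return np.mean((block_l.astype(float) - block_r.astype(float)) ** 2)

def mae(block_l, block_r):
    # Equation (4): mean absolute error between two equally sized blocks.
    return np.mean(np.abs(block_l.astype(float) - block_r.astype(float)))

def best_match_offset(left, right, y, x, block=16, search=32, cost=mae):
    """Return the horizontal offset i (and its cost) that best matches the
    left-image block at (y, x) with the right-image block at (y, x + i)."""
    block_l = left[y:y + block, x:x + block]
    best_i, best_cost = 0, float("inf")
    for i in range(-search, search + 1):
        if 0 <= x + i and x + i + block <= right.shape[1]:
            c = cost(block_l, right[y:y + block, x + i:x + i + block])
            if c < best_cost:
                best_i, best_cost = i, c
    return best_i, best_cost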
In the left-eye and right-eye images of a 3D image, an object generally has the same y coordinate but different x coordinates, so an evaluation function that varies only the x coordinate may be used.
Once the second block 321 is found, depth information may be calculated using the difference between the x coordinate d1 of the first block 311 and the x coordinate d2 of the second block 321.
Similarly, referring to FIG. 6(b), the right-eye image 320 may be searched for the block that matches a third block 313 of the left-eye image 310, that is, a fourth block 323. Then, depth information may be calculated using the difference between the x coordinate d3 of the third block 313 and the x coordinate d4 of the fourth block 323.
The above matching operation may be performed over the entire left-eye or right-eye image 310 or 320, thereby calculating depth information in units of the blocks of the left-eye or right-eye image 310 or 320.
Once the depth information of each block of the left-eye or right-eye image 310 or 320 has been calculated, the stereoscopic information of the 3D image composed of the left-eye and right-eye images 310 and 320 can be calculated. The stereoscopic information may be calculated in units of frames, for example as the average or standard deviation of the depth information of the blocks of the left-eye and right-eye images 310 and 320. The stereoscopic information may also be calculated based on whether there are smooth depth changes between neighboring objects of the 3D image.
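Under the assumption that per-block disparities are found with a helper like the best_match_offset sketch above and converted to depth with the depth_from_disparity sketch shown earlier (both hypothetical names introduced in those sketches), frame-level stereoscopic information computed as an average and standard deviation could look roughly as follows. This is only one of the aggregation rules the description allows.

import numpy as np

def frame_stereo_info(left, right, block=16, search=32):
    """Illustrative per-frame stereoscopic information: mean and standard
    deviation of block-wise depth values derived from horizontal disparity.
    Assumes 2D image arrays at least one block in size."""
    h, w = left.shape[:2]
    depths = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            i, _ = best_match_offset(left, right, y, x, block, search)
            _, depth = depth_from_disparity(x + i, x)
            depths.append(depth)
    depths = np.asarray(depths)
    return {"mean_depth": float(depths.mean()),
            "std_depth": float(depths.std())}

For a 3D video, the same routine could be applied frame by frame, or once per playback section, to obtain the per-frame or per-section stereoscopic information described below.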
The stereoscopic information may be provided as numerical data, or as a graph or a 3D image, as shown in FIG. 7.
FIG. 8 is a flowchart of a method of controlling the operation of a mobile terminal according to an exemplary embodiment of the present invention, and particularly, of how to calculate the stereoscopic information of a 3D image. Referring to FIG. 8, if an evaluation mode for calculating the stereoscopic information of a 3D image is selected, for example in response to a user command (S400), the controller 180 may receive the left-eye and right-eye images of an input 3D image from the memory 160 (S405).
Thereafter, the controller 180 may divide each of the left-eye and right-eye images into a plurality of blocks, and may search for a pair of matching blocks in the left-eye and right-eye images (S410). Thereafter, the controller 180 may calculate the difference in position between the pair of matching blocks (S415), and may calculate the depth information of the pair of matching blocks based on the result of the calculation performed in operation S415 (S420).
If the calculation of depth information is complete for all pairs of matching blocks in the left-eye and right-eye images (S425), the controller 180 may calculate the stereoscopic information of the input 3D image based on the depth information of all the pairs of matching blocks in the left-eye and right-eye images (S430).
As described above, the stereoscopic information may be calculated according to a predefined set of rules, for example as the average or standard deviation of the depth information of the pairs of matching blocks.
Thereafter, the controller 180 may output the stereoscopic information (S435). More specifically, the controller 180 may output the stereoscopic information as numerical data, a graph, or an image. The controller 180 may store the stereoscopic information in association with the input 3D image for later use.
The exemplary embodiment of FIG. 8 has been described using the calculation of the stereoscopic information of a still image as an example. However, the present invention can also be applied to the calculation of the stereoscopic information of a 3D video. More specifically, the same method used to calculate the stereoscopic information of a still image may be performed for each frame of a 3D video, thereby calculating the stereoscopic information of the 3D video. In this case, the stereoscopic information of the 3D video may be calculated in units of frames or playback sections of the 3D video.
FIG. 9 is a flowchart of a method of controlling the operation of a mobile terminal according to another exemplary embodiment of the present invention, and particularly, of how to utilize the stereoscopic information of a 3D image in a multimedia mode. Referring to FIG. 9, if a multimedia mode, such as a mode for viewing a photo album or playing video files, is selected, for example in response to a user command (S500), the controller 180 may display a list of files on the display module 151 (S505). In this case, if the displayed list includes a 2D image, the controller 180 may also display the title or a thumbnail image of the 2D image. If the displayed list includes a 3D image, the controller 180 may display not only the title or a thumbnail image of the 3D image but also the stereoscopic information of the 3D image. The stereoscopic information of the 3D image may be displayed using numerical data or a graph.
If one of the files in the displayed list is selected (S510), the controller 180 may play the selected file. If the selected file is a 3D image having stereoscopic information, the controller 180 may display a stereoscopic information gauge on the display module 151 (S515).
If a command for controlling the playback of the selected file, such as "stop", "fast forward", or "rewind", is received (S520), the controller 180 may control the playback of the selected file according to the received user command (S525). If the selected file is a 3D video, the playback of the 3D video may also be controlled in response to a user input detected from the stereoscopic information gauge.
Operations S515 through S525 may be repeatedly performed until the multimedia mode is terminated (S530).
The stereoscopic information may be used in various operating modes of the mobile terminal 100 other than the multimedia mode. For example, during the capture of a 3D image, the stereoscopic information may be displayed on a camera preview screen, thereby allowing the user to choose a suitable composition based on the stereoscopic information. In addition, the 3D playback of content may be controlled automatically based on the stereoscopic information of the 3D content.
FIGS. 10 to 15 are diagrams illustrating various examples of how stereoscopic information may be used.
Referring to FIG. 10, in the multimedia mode, thumbnail images of playable files may be displayed on an "album" screen 600. The thumbnail images of 3D images may be marked with a number of stars to indicate the presence of stereoscopic information. More specifically, the number of stars marked on each thumbnail image corresponds to the degree of stereoscopic quality of the corresponding file. For example, the thumbnail image of a 2D image may have one star, whereas the thumbnail image of a 3D image may have more than one star. In addition, the thumbnail image of a 3D image with a high degree of stereoscopic quality may be marked with more stars than the thumbnail image of a 3D image with a low degree of stereoscopic quality. Thus, the user can selectively view highly stereoscopic images based on their star ratings.
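As an illustration of this star-rating idea, a stereoscopic-quality degree could be bucketed into a star count roughly as in the sketch below. The 0-to-100 scale and the cut-off values are invented for the example; the patent does not specify them.

def stars_for_stereo_degree(stereo_degree, is_3d=True):
    """Map a stereoscopic-quality degree (assumed 0-100 scale) to a star
    count for a thumbnail; the thresholds are purely illustrative."""
    if not is_3d:
        return 1          # a 2D image carries a single star
    if stereo_degree < 25:
        return 2
    if stereo_degree < 50:
        return 3
    if stereo_degree < 75:
        return 4
    return 5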
Referring to FIG. 11, the degree of stereoscopic quality of an entire item of 3D content, or of the current frame of the 3D content, may be displayed as a star rating on one side of a 3D content playback screen 610.
Referring to FIGS. 12(a) and 12(b), when a 3D content playback screen 620 is displayed, if the user chooses to display the stereoscopic information of the 3D content, a display screen 630 obtained by converting the 3D content playback screen 620 to a gray mode may be displayed, and a bar-shaped stereoscopic information gauge 633 may be displayed on one side of the display screen 630. The bar-shaped stereoscopic information gauge 633 may be divided into a plurality of sections filled with different colors.
Then, the user may select a desired stereoscopic level from the bar-shaped stereoscopic information gauge 633, and may thereby adjust the stereoscopic level of the 3D content. Thereafter, the controller 180 may convert the 3D content playback screen 620 back to the original color mode, and may display three-dimensionally only the objects selected from the 3D content while displaying the other, unselected objects two-dimensionally.
Referring to FIG. 13, the depth information of each of a plurality of blocks of a 3D image may be displayed, as an image 643, on one side of a display screen 640 on which the 3D image is displayed.
Referring to FIG. 14, a stereoscopic information gauge 653 may be displayed on one side of a display screen 650 displaying a 3D image, thereby providing the stereoscopic information of the 3D image in real time. The stereoscopic information gauge 653 may be displayed upon the user's request, or only when the degree of stereoscopic quality of the 3D image exceeds a reference level. The stereoscopic information gauge 653 may be used to adjust the degree of stereoscopic quality of the 3D image. For example, when the user feels too fatigued by the degree of stereoscopic quality of the 3D image, the user can reduce it simply by dragging on the stereoscopic information gauge 653.
In short, the stereoscopic information gauge 653 can be used not only to provide the stereoscopic information of a 3D image but also to adjust the degree of stereoscopic quality of the 3D image.
FIG. 15 is a diagram illustrating an example of how stereoscopic information is inserted into a broadcast signal and the broadcast signal is transmitted. Referring to FIG. 15, an MPEG transport stream (TS) packet includes a header and a payload. The header has a fixed length of 4 bytes and includes a sync byte, a packet identifier (PID), scrambling control data, and an adaptation field.
MPEG-4 video frames are classified, according to how they are encoded, into intra-coded frames (I-frames), predictive-coded frames (P-frames), and bidirectionally-coded frames (B-frames). An I-frame is an independent frame and can therefore be encoded as a single image independently of any preceding or following frames. A P-frame is encoded with reference to its preceding I- or P-frame; that is, a P-frame can be encoded as the difference from its preceding frame. A B-frame is encoded with reference to its preceding and following P-frames. In a group of pictures (GOP), which is a group of consecutive pictures within an encoded video stream, an I-frame, a number of P-frames, and a number of B-frames are arranged in a repeating pattern, for example IBBPBBPBBPBB, which is referred to as the GOP pattern.
The stereoscopic information of 3D content may be inserted into MPEG TS packets, and the MPEG TS packets may then be transmitted. For example, referring to FIG. 15, the stereoscopic information may be calculated in advance in units of I-frames and may then be recorded in a header extension 700 or in the user data of the payload.
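Purely as an illustration of carrying per-I-frame stereoscopic information alongside the video data, the sketch below packs a small binary record that a broadcaster could place in a header extension or user-data field. The field layout, sizes, and scaling are assumptions made for this example; they are not part of the MPEG specifications or of the patent.

import struct

def pack_stereo_info(frame_index, mean_depth, depth_std):
    """Pack a hypothetical stereoscopic-information record for one I-frame:
    a 4-byte frame index followed by two 2-byte fixed-point values
    (depth mean and deviation scaled by 100, assumed non-negative)."""
    return struct.pack(">IHH",
                       frame_index,
                       min(int(mean_depth * 100), 0xFFFF),
                       min(int(depth_std * 100), 0xFFFF))

record = pack_stereo_info(frame_index=30, mean_depth=5.12, depth_std=1.07)
# The resulting 8-byte record could then be written into the header
# extension 700 or the payload user data of the corresponding TS packets.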
Unsatisfactory 3D images usually result from excessive stereoscopic effect, errors in the image acquisition process, and errors and flicker in the image display process. Examples of errors in the image acquisition process include, but are not limited to, image misalignment, optical distortion, and errors in camera settings. Errors in the image display process include, but are not limited to, misalignment of the left-eye and right-eye images, which may cause persistent headaches. Flicker is a phenomenon caused by displaying dozens of images per second, and it too can cause headaches or nausea.
In consideration of this, stereoscopic information that includes not only the depth information of a 3D image but also other information necessary for improving user satisfaction may be provided together with the 3D image.
Be not limited thereto the exemplary embodiment of place elaboration according to portable terminal of the present invention with according to the method for this portable terminal of control of the present invention.Therefore, the variant of the exemplary embodiment of this place elaboration can fall within the scope of the present invention with combination.
The present invention may be implemented as code that can be read by a processor included in a mobile terminal and written onto a computer-readable recording medium. The computer-readable recording medium may be any type of recording device in which data is stored in a computer-readable manner. Examples of the computer-readable recording medium include ROM, RAM, CD-ROM, magnetic tape, floppy disks, optical data storage, and carrier waves (e.g., data transmission through the Internet). The computer-readable recording medium may be distributed over a plurality of computer systems connected to a network so that computer-readable code is written thereto and executed therefrom in a decentralized manner. Functional programs, code, and code segments needed to realize the present invention can be easily construed by one of ordinary skill in the art.
As described above, according to the present invention, the stereoscopic information of a 3D image can be calculated based on the difference in position of an object between the left-eye and right-eye images of the 3D image. The stereoscopic information can then be used for various purposes such as capturing, evaluating, and playing 3D content.
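As a rough illustration of this calculation, the sketch below divides a left-eye image into blocks, finds the best horizontally shifted match in the right-eye image by sum of absolute differences, and averages the resulting disparities into a single stereoscopy measure. Block size, search range, shift direction, and the use of the mean as the aggregate are assumptions for illustration, not values specified by the patent:

```python
import numpy as np

# Hedged sketch of disparity-based stereoscopic information, assuming
# grayscale left/right images of equal size as 2D numpy arrays.

def block_disparities(left: np.ndarray, right: np.ndarray,
                      block: int = 16, max_disp: int = 32) -> np.ndarray:
    """Return a 2D array of per-block horizontal disparities (in pixels)."""
    h, w = left.shape
    rows, cols = h // block, w // block
    disp = np.zeros((rows, cols))
    for by in range(rows):
        for bx in range(cols):
            y, x = by * block, bx * block
            ref = left[y:y + block, x:x + block].astype(np.int32)
            best_sad, best_d = None, 0
            for d in range(0, min(max_disp, x) + 1):   # search leftward shifts only
                cand = right[y:y + block, x - d:x - d + block].astype(np.int32)
                sad = int(np.abs(ref - cand).sum())    # sum of absolute differences
                if best_sad is None or sad < best_sad:
                    best_sad, best_d = sad, d
            disp[by, bx] = best_d
    return disp

def stereoscopic_information(left: np.ndarray, right: np.ndarray) -> float:
    """Aggregate per-block disparities into a single stereoscopy measure."""
    return float(block_disparities(left, right).mean())
```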
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.

Claims (15)

1. A method of controlling the operation of a mobile terminal, the method comprising:
dividing each of at least a first image and a second image into a plurality of blocks, the first image and the second image being usable to produce a three-dimensional (3D) image by means of binocular parallax;
searching for at least one pair of matching blocks in the at least first image and second image;
calculating depth information for each pair of the at least one pair of matching blocks based on a difference in position between the blocks of each pair; and
calculating stereoscopic information of the 3D image based on the calculated depth information.
2. The method according to claim 1, wherein searching the at least first image and second image comprises searching the at least first image and second image using a block matching algorithm.
3. The method according to claim 1, wherein calculating the stereoscopic information comprises calculating the stereoscopic information in units of frames or playback sections.
4. The method according to claim 1, further comprising displaying the stereoscopic information as at least one of numeric data, a graph, and an image.
5. The method according to claim 1, further comprising displaying, on a display screen on which the 3D image is displayed, a stereoscopic information gauge representing the stereoscopic information.
6. The method according to claim 5, further comprising adjusting the degree of stereoscopy of the displayed 3D image in response to a user input detected from the stereoscopic information gauge.
7. The method according to claim 5, wherein the stereoscopic information gauge is displayed in response to a specific user command or when the stereoscopic information satisfies a predetermined set of conditions.
8. The method according to claim 5, further comprising:
displaying three-dimensionally, in the displayed 3D image, one or more objects selected from the displayed 3D image; and
displaying two-dimensionally, in the displayed 3D image, the non-selected objects.
9. The method according to claim 1, further comprising displaying an indicator of the stereoscopic information on a display screen on which the 3D image is displayed.
10. The method according to claim 1, further comprising inserting the 3D image and the stereoscopic information into a signal and transmitting the signal to another device.
11. A mobile terminal comprising:
a display module configured to display a three-dimensional (3D) image based on at least a first image and a second image using binocular parallax; and
a controller configured to divide each of the at least first image and second image into a plurality of blocks, search for at least one pair of matching blocks in the at least first image and second image, calculate depth information for each pair of the at least one pair of matching blocks based on a difference in position between the blocks of each pair, calculate stereoscopic information of the 3D image based on the calculated depth information, and display the stereoscopic information on the display module.
12. The mobile terminal according to claim 11, wherein the controller is further configured to display the stereoscopic information on the display module as at least one of numeric data, a graph, and an image.
13. The mobile terminal according to claim 11, wherein the controller is further configured to search the at least first image and second image using a block matching algorithm.
14. The mobile terminal according to claim 11, further comprising a memory configured to store the at least first image and second image and the stereoscopic information.
15. The mobile terminal according to claim 11, wherein the controller is further configured to display, on the display module, a stereoscopic information gauge representing the stereoscopic information of the 3D image.
CN201110290038.3A 2010-09-20 2011-09-20 Mobile terminal and method of controlling operation of the same Expired - Fee Related CN102411474B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2010-0092610 2010-09-20
KR1020100092610A KR101640404B1 (en) 2010-09-20 2010-09-20 Mobile terminal and operation control method thereof

Publications (2)

Publication Number Publication Date
CN102411474A true CN102411474A (en) 2012-04-11
CN102411474B CN102411474B (en) 2014-10-22

Family

ID=44772653

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110290038.3A Expired - Fee Related CN102411474B (en) 2010-09-20 2011-09-20 Mobile terminal and method of controlling operation of the same

Country Status (6)

Country Link
US (1) US9456205B2 (en)
EP (1) EP2432231A3 (en)
JP (1) JP5379200B2 (en)
KR (1) KR101640404B1 (en)
CN (1) CN102411474B (en)
TW (1) TWI508525B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106814967A (en) * 2017-01-25 2017-06-09 努比亚技术有限公司 The apparatus and method of retrieving image in a kind of picture library
CN109348114A (en) * 2018-11-26 2019-02-15 Oppo广东移动通信有限公司 Imaging device and electronic equipment
CN110784940A (en) * 2019-11-14 2020-02-11 新乡学院 Method and electronic device for network connection

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012253713A (en) * 2011-06-07 2012-12-20 Sony Corp Image processing device, method for controlling image processing device, and program for causing computer to execute the method
CN104718495B (en) * 2012-10-11 2018-02-27 田原博史 Video observing system
NO339902B1 (en) * 2012-11-02 2017-02-13 Rolls Royce Marine As SYSTEM TO CONTROL OR LOAD A CABLE OR LIKE A DRUM
KR102090269B1 (en) * 2012-12-14 2020-03-17 삼성전자주식회사 Method for searching information, device, and computer readable recording medium thereof
TWI556664B (en) * 2014-11-24 2016-11-01 To achieve a wireless interaction between an electronic device and an initiator
US11550387B2 (en) * 2015-03-21 2023-01-10 Mine One Gmbh Stereo correspondence search
US10853625B2 (en) 2015-03-21 2020-12-01 Mine One Gmbh Facial signature methods, systems and software
WO2016154123A2 (en) * 2015-03-21 2016-09-29 Mine One Gmbh Virtual 3d methods, systems and software

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734743A (en) * 1994-07-12 1998-03-31 Canon Kabushiki Kaisha Image processing method and apparatus for block-based corresponding point extraction
US6674431B1 (en) * 1999-05-28 2004-01-06 Minolta Co., Ltd. Method for processing data expressing three-dimensional shape
CN1864415A (en) * 2003-08-26 2006-11-15 夏普株式会社 3-dimensional video reproduction device and 3-dimensional video reproduction method
CN101231754A (en) * 2008-02-03 2008-07-30 四川虹微技术有限公司 Multi-visual angle video image depth detecting method and depth estimating method
CN101277454A (en) * 2008-04-28 2008-10-01 清华大学 Method for generating real time tridimensional video based on binocular camera
CN101459857A (en) * 2007-12-10 2009-06-17 深圳华为通信技术有限公司 Communication terminal and information system
US20090324059A1 (en) * 2006-09-04 2009-12-31 Koninklijke Philips Electronics N.V. Method for determining a depth map from images, device for determining a depth map
JP2010177921A (en) * 2009-01-28 2010-08-12 Fujifilm Corp Stereoscopic imaging apparatus and stereoscopic imaging method

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0744701B2 (en) * 1986-12-27 1995-05-15 日本放送協会 Three-dimensional superimpose device
JPH0568268A (en) * 1991-03-04 1993-03-19 Sharp Corp Device and method for generating stereoscopic visual image
JP3465988B2 (en) * 1994-04-27 2003-11-10 松下電器産業株式会社 Method and apparatus for estimating movement and depth
JP3826236B2 (en) * 1995-05-08 2006-09-27 松下電器産業株式会社 Intermediate image generation method, intermediate image generation device, parallax estimation method, and image transmission display device
JPH1074267A (en) * 1996-07-03 1998-03-17 Canon Inc Display control device and its method
US6043838A (en) * 1997-11-07 2000-03-28 General Instrument Corporation View offset estimation for stereoscopic video coding
US6208348B1 (en) * 1998-05-27 2001-03-27 In-Three, Inc. System and method for dimensionalization processing of images in consideration of a pedetermined image projection format
US6573855B1 (en) * 1998-08-31 2003-06-03 Osaka Gas Co., Ltd. Three-dimensional questing method, three-dimensional voxel data displaying method, and device therefor
JP2001061164A (en) * 1999-08-19 2001-03-06 Toshiba Corp Transmission method of stereoscopic video signal
WO2001091448A2 (en) * 2000-05-19 2001-11-29 Koninklijke Philips Electronics N.V. Method system and apparatus_for motion estimation using block matching
JP3554257B2 (en) * 2000-07-31 2004-08-18 キヤノン株式会社 Display control device and method
JP4251952B2 (en) * 2003-10-01 2009-04-08 シャープ株式会社 Stereoscopic image display apparatus and stereoscopic image display method
JP3943098B2 (en) * 2004-05-20 2007-07-11 株式会社アイ・オー・データ機器 Stereo camera
KR20060063265A (en) 2004-12-07 2006-06-12 삼성전자주식회사 Method and apparatus for processing image
JP4755565B2 (en) 2006-10-17 2011-08-24 シャープ株式会社 Stereoscopic image processing device
WO2009131703A2 (en) * 2008-04-25 2009-10-29 Thomson Licensing Coding of depth signal
US20110292044A1 (en) * 2009-02-13 2011-12-01 Kim Woo-Shik Depth map coding using video information
KR20100092610A (en) 2009-02-13 2010-08-23 주식회사 디티즌 A slide connection jack rotated
JP5274359B2 (en) * 2009-04-27 2013-08-28 三菱電機株式会社 3D video and audio recording method, 3D video and audio playback method, 3D video and audio recording device, 3D video and audio playback device, 3D video and audio recording medium
US20130286017A1 (en) * 2010-05-07 2013-10-31 David MARIMÓN SANJUAN Method for generating depth maps for converting moving 2d images to 3d

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5734743A (en) * 1994-07-12 1998-03-31 Canon Kabushiki Kaisha Image processing method and apparatus for block-based corresponding point extraction
US6674431B1 (en) * 1999-05-28 2004-01-06 Minolta Co., Ltd. Method for processing data expressing three-dimensional shape
CN1864415A (en) * 2003-08-26 2006-11-15 夏普株式会社 3-dimensional video reproduction device and 3-dimensional video reproduction method
US20090324059A1 (en) * 2006-09-04 2009-12-31 Koninklijke Philips Electronics N.V. Method for determining a depth map from images, device for determining a depth map
CN101459857A (en) * 2007-12-10 2009-06-17 深圳华为通信技术有限公司 Communication terminal and information system
CN101231754A (en) * 2008-02-03 2008-07-30 四川虹微技术有限公司 Multi-visual angle video image depth detecting method and depth estimating method
CN101277454A (en) * 2008-04-28 2008-10-01 清华大学 Method for generating real time tridimensional video based on binocular camera
JP2010177921A (en) * 2009-01-28 2010-08-12 Fujifilm Corp Stereoscopic imaging apparatus and stereoscopic imaging method


Also Published As

Publication number Publication date
EP2432231A3 (en) 2014-12-10
CN102411474B (en) 2014-10-22
KR20120056928A (en) 2012-06-05
US20120069005A1 (en) 2012-03-22
TW201218749A (en) 2012-05-01
JP5379200B2 (en) 2013-12-25
JP2012065327A (en) 2012-03-29
TWI508525B (en) 2015-11-11
EP2432231A2 (en) 2012-03-21
US9456205B2 (en) 2016-09-27
KR101640404B1 (en) 2016-07-18

Similar Documents

Publication Publication Date Title
CN102411474B (en) Mobile terminal and method of controlling operation of the same
CN102445989B (en) Mobile terminal and method of controlling operation thereof thereof
CN102479052A (en) Mobile terminal and operation control method thereof
CN102566747A (en) Mobile terminal and method for controlling operation of mobile terminal
EP2432238A2 (en) Mobile terminal and method for controlling operation of the mobile terminal
KR101872865B1 (en) Electronic Device And Method Of Controlling The Same
CN110249291A (en) System and method for the augmented reality content delivery in pre-capture environment
CN106060520B (en) A kind of display mode switching method and its device, intelligent terminal
CN102377875A (en) Mobile terminal and image display method thereof
KR20120081649A (en) Mobile terminal and operation control method thereof
CN103863713B (en) Packing box
CN110119260B (en) Screen display method and terminal
CN103959340A (en) Graphics rendering technique for autostereoscopic three dimensional display
CN103279942A (en) Control method for realizing virtual 3D (3-dimension) display on 2D (2-dimension) screen on basis of environment sensor
KR20150024199A (en) Head mount display device and control method thereof
CN114281234A (en) Image display method, device and storage medium
CN105739684B (en) Electronic system and its operating method with gesture alignment mechanism
CN118251643A (en) Electronic device and method for anchoring augmented reality objects
CN110136570B (en) Screen display method and terminal
CN105096794A (en) Display device, control method of display device and display system
KR20120040766A (en) Mobile terminal and operation control method thereof
KR101678447B1 (en) Mobile Terminal And Method Of Displaying Image
CN203780961U (en) Packing box
KR20130057302A (en) Mobile terminal and operation method thereof
CN203780962U (en) Packing device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141022

Termination date: 20180920

CF01 Termination of patent right due to non-payment of annual fee