WO2021155575A1 - Electric device, method of controlling electric device, and computer readable storage medium - Google Patents


Info

Publication number
WO2021155575A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal processor
image signal
image
page
point cloud
Prior art date
Application number
PCT/CN2020/074508
Other languages
French (fr)
Inventor
Chiaki Aoyama
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp., Ltd. filed Critical Guangdong Oppo Mobile Telecommunications Corp., Ltd.
Priority to PCT/CN2020/074508 priority Critical patent/WO2021155575A1/en
Priority to CN202080093632.0A priority patent/CN114982214A/en
Publication of WO2021155575A1 publication Critical patent/WO2021155575A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387Composing, repositioning or otherwise geometrically modifying originals
    • H04N1/3877Image rotation
    • H04N1/3878Skew detection or correction

Definitions

  • the present invention relates to an electric device, a method of controlling the electric device, and a computer readable storage medium.
  • the present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electric device and a method of controlling the electric device.
  • an electric device may include:
  • a camera module that takes a photograph of a subject to acquire a master camera image
  • a range sensor module that acquires range depth information of the subject by using a light
  • an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information
  • the image signal processor controls the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface,
  • the image signal processor estimates a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information
  • the image signal processor obtains an image of a surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
  • the range sensor module emits a pulsed light to the subject and detects a reflected light of the pulsed light reflected from the subject, thereby acquiring time of flight (ToF) depth information as the range depth information.
  • the image signal processor obtains a master camera image including the curved page surface, by taking a photograph of the curved page surface that has been opened and curved with the camera module, and
  • the image signal processor acquires the ToF depth information of the curved page surface by the range sensor module.
  • the image signal processor estimates the curve corresponding to the position of the curved page surface, in a plane perpendicular to the crease direction of the position of the opened crease of the page, based on the master camera image and the ToF depth information.
  • the image signal processor obtains a master camera image including a curved page surface of another page in the state where the book is opened and ToF depth information of the curved page surface of the other page,
  • the image signal processor estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information,
  • the image signal processor obtains an image of a surface of the other page that has been projection transformed into a plane, by projection transforming the curved page surface of the other page in the master camera image to be a plane, based on the estimated other curve, and
  • the image signal processor synthesizes the acquired plane images of the two page surfaces, and acquires a plane image of the two page surfaces in the state where the book is opened.
  • the image signal processor sets a crease position designation frame designated by the user on the opened page in the image taken by the camera module in the state where the book is opened, and
  • the image signal processor acquires the master camera image including the curved page surface of the page in which the crease position designation frame is set, by taking an image of the curved page surface of the page that has been opened and curved in the state of opening the book with the camera module.
  • a display module that displays predefined information
  • a main processor that controls the display module and the input module
  • the image signal processor displays the crease position designation frame on the display module together with the master camera image taken by the camera module, and
  • the image signal processor sets the crease position designation frame at a position designated by the user on the curved page surface of the page of the master camera image, in response to an operation input related to an instruction of the crease position designation frame by the user to the input module.
  • the image signal processor obtains reference point cloud data for the curved page surface, based on the master camera image and the ToF depth information
  • the image signal processor calculates a normal vector by applying a principal component analysis to the reference point cloud data
  • the image signal processor performs projection transform of the reference point cloud data into first point cloud data of the master camera image taken from the depth direction, based on the calculated normal vector.
  • the image signal processor scans the first point cloud data along a plurality of lines, in a direction perpendicular to the longitudinal direction of the crease position designation frame,
  • the image signal processor calculates a slope of a valley of the first point cloud data, by applying the least squares method to the scanned first point cloud data,
  • the image signal processor estimates the crease position based on the calculated slope of the valley
  • the image signal processor obtains second point cloud data by a first rotation of the first point cloud data so that the estimated crease position is parallel to a preset reference direction.
  • the image signal processor scans the second point cloud data along a plurality of lines in a direction perpendicular to the reference direction
  • the image signal processor calculates a slope of the ridge in the reference direction of the scanned second point cloud data by applying a least square method to the scanned second point cloud data
  • the image signal processor acquires the third point cloud data by a second rotation of the second point cloud data, so that the calculated slope of the ridge is parallel to a preset reference plane.
  • the image signal processor scans the third point cloud data along a plurality of lines in the reference direction
  • the image signal processor calculates an average value of the third point cloud data in the vicinity of the plurality of lines in the depth direction
  • the image signal processor approximates the calculated average value as a curve in a direction perpendicular to the reference direction and a depth direction by a fourth-order or higher order polynomial.
  • the image signal processor divides the third point cloud data into a plurality of rectangular areas for each section obtained by dividing the curve
  • the image signal processor obtains the image of the surface of the page that has been projective transformed into a plane by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on a relation between coordinates of the plurality of rectangular regions and coordinates of the projection space of the reference point cloud data.
  • a detection resolution of the range sensor module is lower than a detection resolution of the camera module.
  • a method for controlling an electric device including: a camera module that takes a photograph of a subject to acquire a master camera image; a range sensor module that acquires range depth information of the subject by using a light; and an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information,
  • the method including:
  • the range depth information is time of flight (ToF) depth information.
  • a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device, and the method comprises:
  • the range depth information is time of flight (ToF) depth information.
  • FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention
  • FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1;
  • FIG. 3 is a diagram illustrating an example of an overall flow for the electric device 100 shown in FIG. 1 and FIG. 2 to acquire an image obtained by transforming the surface of a page of the book 101 in an opened state into a plane;
  • FIG. 4 is a diagram illustrating a specific example of step S1 shown in FIG. 3 for acquiring the master camera image and ToF depth information;
  • FIG. 5 is a diagram illustrating an example of display on the display module 45 of the electric device 100 when the crease position is designated;
  • FIG. 6 is a diagram illustrating a specific example of step S2 shown in FIG. 3 for executing the top view process
  • FIG. 7A is a diagram illustrating an example of the reference point cloud data P of the image of the surface of the curved page imaged from the tilted direction with the book opened;
  • FIG. 7B is a diagram illustrating an example of the first point cloud data P1 obtained by projective transformation as photographed from the depth direction (from the front) ;
  • FIG. 8A is a diagram illustrating an example of scanning along a plurality of lines L1 with respect to the first point cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the crease position designation frame;
  • FIG. 8B is a diagram illustrating an example of the second point cloud data P2 obtained by the first rotation R of the first point cloud data so that the estimated crease position D2 is parallel to a preset reference direction (the z axis direction) ;
  • FIG. 9A is a diagram illustrating an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (the z axis direction) ;
  • FIG. 9B is a diagram illustrating an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2;
  • FIG. 9C is a diagram illustrating an example of the third point cloud data P3 obtained by rotating the second point cloud data P2 by the second rotation Q so that the calculated inclination of the ridge N is parallel to a preset reference plane (z-y plane) ;
  • FIG. 10A is a diagram illustrating an example of scanning along a plurality of lines L3 with respect to the third point cloud data P3 in the reference direction (the z axis direction) ;
  • FIG. 10B is a diagram illustrating an example of an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3;
  • FIG. 10C is a diagram illustrating an example of a curve E obtained by approximating the calculated average value A with a fourth-order or higher order polynomial in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction) ;
  • FIG. 11 is a diagram illustrating a specific example of step S25 for executing the dividing process shown in FIG. 6;
  • FIG. 12A is a diagram illustrating an example of dividing a curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range;
  • FIG. 12B is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range, is continuous with FIG. 12A;
  • FIG. 12C is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range, is continuous with FIG. 12B;
  • FIG. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in FIG. 3;
  • FIG. 14A is a diagram illustrating an example in which the third point cloud data P3 is divided into a plurality of rectangular areas for each section obtained by dividing the curve E;
  • FIG. 14B is a diagram illustrating an example of coordinates obtained by inversely transforming the third point cloud data P3 into the projection space of the point cloud data P1;
  • FIG. 15A is a diagram illustrating an example of a plurality of rectangular areas J, one for each section obtained by dividing the curve E, into which the third point cloud data P3 is divided;
  • FIG. 15B is a diagram illustrating an example of a plurality of rectangular areas G developed in the two-dimensional space (u, v) from a plurality of rectangular areas J in the three-dimensional space (x, y, z) ;
  • FIG. 16 is a diagram illustrating an example of a relationship between coordinates when the point cloud data is expanded on a plane and coordinates on a captured image
  • FIG. 17 is a diagram illustrating an example of an image obtained by projective transformation for each corresponding rectangular area so that the surface of the curved page in the master camera image becomes a plane;
  • FIG. 18 is a diagram illustrating an example of the images of the surfaces of the two pages 200 and 201 that have been projected and transformed into a plane.
  • FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1.
  • the electric device 100 includes a camera module 10, a range sensor module 20, and an image signal processor 30 that controls the camera module 10 and the range sensor module 20, and processes camera image data acquired from the camera module 10.
  • reference numeral 101 in FIG. 1 depicts the subject, which is a book.
  • the camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
  • the camera module 10 includes, for example, a Gyro sensor 10d that detects the angular velocity and the acceleration of the camera module 10, a focus &OIS actuator 10f that actuates the master lens 10a, and a focus &OIS driver 10e that drives the focus &OIS actuator 10f, as shown in FIG. 2.
  • the camera module 10 acquires a master camera image of the subjects 101, for example (FIG. 1) .
  • the range sensor module 20 includes, for example, a ToF lens 20a, a range sensor 20b that detects the reflection light inputted via the ToF lens 20a, a range sensor driver 20c that drives the range sensor 20b, and a projector 20d that outputs the pulse lights.
  • the range sensor module 20 acquires range depth information of the subject 101 by using a light. Especially, the range sensor module 20 acquires the time of flight (ToF) depth information (ToF depth value) as the range depth information by emitting pulsed light toward the subjects 101, and detecting the reflection light from the subjects 101, for example.
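The ToF principle described above can be sketched numerically: the depth value corresponds to the pulse's round-trip travel time multiplied by the speed of light and halved. A minimal illustration (the function name `tof_depth` is ours, not the patent's; the actual sensor computes depth internally):

```python
# Converting a measured pulse round-trip time to a ToF depth value.
# The 1/2 factor accounts for the out-and-back path of the pulsed light.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_depth(round_trip_seconds: float) -> float:
    """Depth in metres from the pulse's round-trip travel time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after 2 ns corresponds to a subject about 0.3 m away.
depth = tof_depth(2e-9)
```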
  • the resolution of the detection by the range sensor module 20 is lower than the resolution of the detection by the camera module 10.
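Because the range sensor's resolution is lower than the camera's, a pipeline consuming both streams must reconcile the two grids. A minimal sketch, assuming nearest-neighbour upsampling of the depth map to the camera resolution (the patent does not specify how, or whether, the depth map is resampled):

```python
import numpy as np

def upsample_depth(depth, cam_shape):
    """Nearest-neighbour upsampling of a low-resolution ToF depth map
    to the camera image resolution. A real pipeline might instead
    interpolate, or work directly on the sparse point cloud."""
    h, w = depth.shape
    H, W = cam_shape
    rows = np.arange(H) * h // H   # source row for each camera row
    cols = np.arange(W) * w // W   # source column for each camera column
    return depth[np.ix_(rows, cols)]

tof = np.array([[1.0, 2.0],
                [3.0, 4.0]])
full = upsample_depth(tof, (4, 4))  # 2x2 depth map spread over a 4x4 grid
```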
  • the image signal processor 30 controls, for example, the camera module 10 and the range sensor module 20 to acquire a camera image, which is the master camera image, based on the master camera image obtained by means of the camera module 10 and the ToF depth information obtained by means of the range sensor module 20.
  • Furthermore, as shown in FIG. 2, for example, the electric device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
  • the GNSS module 40 measures the current position of the electric device 100, for example.
  • the wireless communication module 41 performs wireless communications with the Internet, for example.
  • the CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
  • the speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
  • the microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
  • the display module 45 displays predefined information.
  • the input module 46 receives a user's input (a user's operations) .
  • An IMU 47 detects, for example, the angular velocity and the acceleration of the electric device 100.
  • the main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
  • the memory 49 stores a program and data required for the image signal processor 30 to control the camera module 10 and the range sensor module 20, acquired image data, and a program and data required for the main processor 48 to control the electric device 100.
  • the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by the main processor 48, the computer program implements a method for controlling the electric device 100.
  • the method comprises: controlling, by means of the image signal processor 30, the camera module 10 and the range sensor module 20 to acquire the master camera image including a curved page surface in a state where a book is opened, and range depth information of the curved page surface; estimating, by means of the image signal processor 30, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and obtaining, by means of the image signal processor 30, an image of the surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
  • the electric device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be other types of electric devices (for instance, a tablet computer and a PDA) including camera modules.
  • FIG. 3 is a diagram illustrating an example of an overall flow for the electric device 100 shown in FIG. 1 and FIG. 2 to acquire an image obtained by transforming the curved page surface of a page of the book 101 in an opened state into a plane.
  • the image signal processor 30 controls the camera module 10 and the range sensor module 20 to take a master camera image including a curved page surface in a state where a book is opened, and the ToF depth information of the curved page surface (a step S1) .
  • the image signal processor 30 executes the top view processing (a step S2) .
  • the top view processing estimates a crease position of the page based on the master camera image and the ToF depth information, and estimates a curve corresponding to the position of the curved page surface of the page perpendicular to the crease position.
  • the image signal processor 30 executes image correction processing (a step S3) .
  • This image correction processing projects and transforms the curved page surface of the page in the master camera image so as to be a plane based on the estimated curve.
  • the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed to a plane.
  • the image signal processor 30 determines whether or not additional photographing such as another curved page in the state where the book is opened is necessary (a step S4) .
  • the image signal processor 30 returns to the step S1, and takes the master camera image including the curved page surface of the other page in the state where the book is opened and the ToF depth information of the curved page surface of the page.
  • the image signal processor 30 estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information.
  • the image signal processor 30 performs projective transformation on the curved page surface of the other page curved in the master camera image so as to be a plane based on the estimated other curve. Thereby, the image signal processor 30 acquires an image of the transformed surface of the page onto the plane.
  • when the image signal processor 30 determines in the step S4 that no additional photographing is necessary, the image signal processor 30 synthesizes the acquired plane images of the two page surfaces. Thereby, the image signal processor 30 acquires a plane image of the two page surfaces in the state where the book is opened (a step S5) .
  • the present invention has the following preconditions (a) to (d) .
  • (a) The book is opened up and down (in the direction of the crease) with almost the same way of bending.
  • (b) The binding margin (the crease position) of the book is positioned substantially above or below one of the center, left end, and right end of the screen.
  • (c) The resolution of the detection by the range sensor module 20 is lower than the resolution of the camera module 10.
  • (d) The distance to the surface of the book can be detected by the range sensor module 20.
  • FIG. 4 is a diagram illustrating a specific example of step S1 shown in FIG. 3 for acquiring the master camera image and ToF depth information.
  • FIG. 5 is a diagram illustrating an example of display on the display module 45 of the electric device 100 when the crease position is designated.
  • the image signal processor 30 sets the crease position designation frame 45a designated by the user on the opened page in the image taken by the camera module 10 in the state where the book is opened (a step S11) .
  • the image signal processor 30 displays the crease position designation frame 45a on the display module 45 together with the master camera image taken by the camera module 10.
  • the image signal processor 30 sets the crease position designation frame 45a at a position designated by the user on the curved page surface of the page of the master camera image, in response to an operation input related to an instruction of the crease position designation frame 45a by the user to the input module 46.
  • the crease position designation frame 45a is first displayed at the center or the previous operation position in the display module 45.
  • the user touches the position to be designated.
  • nothing may be displayed at first, and then, when the user touches the display module 45, the crease position designation frame 45a is displayed at the touched position in the display module 45.
  • the crease position designation frame 45a rotates up and down or left and right in the display module 45.
  • the frame direction designation button B1 is included in the input module 46.
  • the crease position designation frame 45a has a mark (an arrow in the example) indicating the vertical direction.
  • when the page display is upside down, it is rotated by the user's operation of the frame direction designation button B1 so as to be in the opposite direction.
  • when the user touches and drags the end of the crease position designation frame 45a, it can be rotated about its center position.
  • the crease position designation frame 45a is set in the direction connecting the touched points in the display module 45.
  • the image signal processor 30 takes a master camera image including the curved page surface of the page on which the crease position designation frame 45a is set, by taking a photograph of the curved page surface of the page that has been opened and curved with the camera module 10 in a state where the book is opened (a step S12) .
  • the camera module 10 takes a photograph of the curved page surface of the page that is opened and curved while the book displayed on the display module 45 is opened.
  • the image signal processor 30 acquires a master camera image including the curved page surface, by capturing the curved page surface of the page that is opened and curved with the camera module 10, in a state where the book 101 is opened.
  • the image signal processor 30 acquires ToF depth information of the curved page surface of the page by irradiating the curved page surface of the page with pulse light from the range sensor module 20.
  • FIG. 6 is a diagram illustrating a specific example of step S2 shown in FIG. 3 for executing the top view process.
  • the image signal processor 30 performs the projective conversion of the point cloud data into data photographed from the front (a step S21) .
  • the image signal processor 30 executes estimation of the crease position of the point cloud data subjected to the projective transformation, and a first rotation of the point cloud data (a step S22) .
  • the image signal processor 30 performs estimation of the ridge inclination and projective transformation of the rotated point cloud data (a step S23) .
  • the image signal processor 30 estimates the curved page surface by estimating a curve corresponding to the position of the curved page surface of the page (a step S24) .
  • the image signal processor 30 estimates the curve corresponding to the position of the curved page surface, in the plane (XY plane) perpendicular to the crease direction (the z axis direction) of the opened crease position of the page, based on the master camera image and the ToF depth information (the steps S22 to S24) .
  • FIG. 7A is a diagram illustrating an example of the reference point cloud data P of the image of the curved page surface imaged from the tilted direction with the book opened.
  • FIG. 7B is a diagram illustrating an example of the first point cloud data P1 obtained by projective transformation as photographed from the depth direction (from the front) .
  • the image signal processor 30 acquires the reference point cloud data of the curved page surface of the page based on the master camera image and the ToF depth information.
  • the image signal processor 30 calculates a normal vector by applying the principal component analysis to the reference point cloud data.
  • the image signal processor 30 performs the projective transform T of the reference point cloud data P into the first point cloud data P1 of the master camera image taken from the depth direction (the front) , based on the calculated normal vector.
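The principal-component step above can be sketched as follows, assuming the reference point cloud is an (N, 3) NumPy array: the eigenvector of the covariance matrix with the smallest eigenvalue approximates the page's normal vector, and rotating the cloud so that this normal becomes the depth axis yields a front view. The function name and the choice of the x axis as depth are our illustration, not the patent's notation.

```python
import numpy as np

def front_view_transform(points):
    """Estimate the dominant surface normal by PCA and rotate the
    point cloud so the normal aligns with the depth axis (axis 0)."""
    centered = points - points.mean(axis=0)
    # np.linalg.eigh returns eigenvalues in ascending order, so
    # column 0 is the direction of least variance: the normal.
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    normal = eigvecs[:, 0]
    # Orthonormal basis with the normal as the new depth axis.
    basis = eigvecs[:, [0, 2, 1]].T
    return centered @ basis.T, normal

# Points sampled on a tilted plane (x = 0.5 * y) should have zero
# extent along the new depth axis after the transform.
rng = np.random.default_rng(0)
yz = rng.uniform(-1, 1, size=(100, 2))
plane = np.c_[0.5 * yz[:, 0], yz]
transformed, n = front_view_transform(plane)
```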
  • FIG. 8A is a diagram illustrating an example of scanning along a plurality of lines L1 with respect to the first point cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the crease position designation frame.
  • FIG. 8B is a diagram illustrating an example of the second point cloud data P2 obtained by the first rotation R of the first point cloud data so that the estimated crease position D2 is parallel to a preset reference direction (the z axis direction) .
  • the image signal processor 30 scans the first point cloud data P1 along a plurality of lines L1 in a direction (short direction) perpendicular to the longitudinal direction D1 of the crease position designation frame 45a.
  • the image signal processor 30 calculates the slope of the valley of the first point cloud data P1 by applying the least square method to the scanned first point cloud data P1.
  • the image signal processor 30 estimates the crease position (the valley position) M based on the calculated slope of the valley.
  • the image signal processor 30 acquires the second point cloud data P2, by performing the first rotation R of the first point cloud data P1 so that the estimated crease position M is parallel to the preset reference direction (the z axis direction) .
  • when the image signal processor 30 scans the first point cloud data P1 along the plurality of lines L1, the data extracted in the predetermined range from the first point cloud data P1 is used as the data for the least squares method.
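The first rotation R described above can be sketched as follows, assuming the valley (crease) positions found on each scan line have already been extracted as (z, y) pairs; fitting a line to them by least squares gives the crease slope, and a rotation about the depth axis makes the crease parallel to the z axis. The helper names are hypothetical.

```python
import numpy as np

def rotate_crease_to_z(points, valley_yz):
    """First rotation R (a sketch): fit y = a*z + b to the per-line
    valley points by least squares, then rotate the cloud about the
    depth (x) axis so the crease direction has no y component."""
    z, y = valley_yz[:, 0], valley_yz[:, 1]
    slope, _ = np.polyfit(z, y, 1)      # least-squares slope of the valley
    theta = np.arctan(slope)            # angle that removes the tilt
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[1, 0, 0],            # rotation about the x (depth) axis
                  [0, c, -s],
                  [0, s,  c]])
    return points @ R.T, R

# Valley points tilted at 45 degrees become parallel to the z axis.
valleys = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])  # (z, y) pairs
pts = np.c_[np.zeros(3), valleys[:, 1], valleys[:, 0]]    # (x, y, z)
rotated, R = rotate_crease_to_z(pts, valleys)
```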
  • FIG. 9A is a diagram illustrating an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (the z axis direction) .
  • FIG. 9B is a diagram illustrating an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2.
  • FIG. 9C is a diagram illustrating an example of the third point cloud data P3 obtained by rotating the second point cloud data P2 by the second rotation Q so that the calculated inclination of the ridge N is parallel to a preset reference plane (the z-y plane) .
  • the image signal processor 30 scans the second point cloud data P2 along a plurality of lines L2 in a direction perpendicular to the reference direction (the z axis direction) .
  • the image signal processor 30 calculates the slope of the ridge N in the reference direction of the scanned second point cloud data P2, by applying the least squares method to the scanned second point cloud data P2.
  • the image signal processor 30 acquires the third point cloud data P3 by the second rotation Q of the second point cloud data P2, so that the calculated slope of the ridge N is parallel to a preset reference plane (the z-y plane) .
  • the coordinate transformation, from the original reference point cloud data P to the third point cloud data P3, is "QRT" .
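The composite "QRT" above is applied right-to-left (T first, then R, then Q). As a sketch, treating all three as pure rotation matrices (the patent describes T as a projective transform, so this is a simplification), the composite is a single matrix product and the mapping back to the reference cloud is its transpose, since rotations are orthogonal:

```python
import numpy as np

def compose(Q, R, T):
    """Composite transform "QRT": T is applied first, Q last."""
    return Q @ R @ T

def to_third_cloud(points, QRT):
    return points @ QRT.T

def back_to_reference(points, QRT):
    # For orthogonal (rotation) matrices the inverse is the transpose.
    return points @ QRT

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

# Round trip: transforming and inverting recovers the original cloud.
Q, R, T = rot_x(0.3), rot_x(-0.7), rot_x(1.1)
rng = np.random.default_rng(1)
P = rng.normal(size=(10, 3))
P3 = to_third_cloud(P, compose(Q, R, T))
P_back = back_to_reference(P3, compose(Q, R, T))
```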
  • FIG. 10A is a diagram illustrating an example of scanning along a plurality of lines L3 with respect to the third point cloud data P3 in the reference direction (the z axis direction) .
  • FIG. 10B is a diagram illustrating an example of an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3.
  • FIG. 10C is a diagram illustrating an example of a curve E obtained by approximating the calculated average value A with a fourth-order or higher order polynomial in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction) .
  • the image signal processor 30 scans the third point cloud data P3 along a plurality of lines L3 in the reference direction (z) direction.
  • the image signal processor 30 calculates an average value A of the third point cloud data P3 in the vicinity of the depth direction (the x axis direction) of the plurality of lines L3.
  • FIG. 11 is a diagram illustrating a specific example of step S25 for executing the dividing process shown in FIG. 6.
  • FIG. 12A is a diagram illustrating an example of dividing a curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range.
  • FIG. 12B is a diagram illustrating an example, continuous with FIG. 12A, of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range.
  • FIG. 12C is a diagram illustrating an example, continuous with FIG. 12B, of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range.
  • the image signal processor 30 calculates an error between points in the division process (astep S251) .
  • the image signal processor 30 determines whether or not the error between points is within an allowable range (a step S252) .
  • the image signal processor 30 divides the points exceeding the range (a step S253) .
  • the image signal processor 30 ends the division process when the error between the points falls within an allowable range.
  • FIG. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in FIG. 3.
  • FIG. 14A is a diagram illustrating an example in which the third point cloud data P3 is divided into a plurality of rectangular areas for each section obtained by dividing the curve E.
  • FIG. 14B is a diagram illustrating an example of coordinates obtained by inversely transforming the third point cloud data P3 into the projection space of the point cloud data P1.
  • FIG. 15A is a diagram illustrating an example of a plurality of rectangular areas J into which the third point cloud data P3 is divided for each section obtained by dividing the curve E.
  • FIG. 15B is a diagram illustrating an example of a plurality of rectangular areas G developed in the two-dimensional space (u, v) from a plurality of rectangular areas J in the three-dimensional space (x, y, z) .
  • FIG. 16 is a diagram illustrating an example of a relationship between coordinates when the point cloud data is expanded on a plane and coordinates on a taken image.
  • the image signal processor 30 sets an area for executing the process (a step S31) .
  • the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas for each section obtained by dividing the curve E.
  • the coordinates of the third point cloud data P3 are transformed into the projection space coordinates of the original reference point cloud data P (FIG. 14B) by the inverse transformation T⁻¹R⁻¹Q⁻¹ shown in the expression (2) .
  • the image signal processor 30 estimates a transformation matrix for expanding the three-dimensional space plane (FIG. 15A) into the two-dimensional space plane (FIG. 15B) .
  • the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas J for each section obtained by dividing the curve.
  • the image signal processor 30 transforms a plurality of rectangular regions J obtained by dividing the third point cloud data P3 in the three-dimensional space (x, y, z) into the two-dimensional space (u, v) , to obtain a plurality of rectangular areas G expanded in the two-dimensional space (u, v) .
  • the width of the rectangular region G shown in FIG. 15B is expressed by the expression (3) .
  • the length of the rectangular region G shown in FIG. 15B is expressed by the expression (4) .
  • the image signal processor 30 calculates a transformation matrix for each region from the relationship between the coordinates when expanded on a plane and the coordinates on the taken image.
  • the offset values Ou, Ov and the enlargement/reduction ratio k, which make the result match the camera image after distortion correction, are obtained by calibration; an offset is added to the coordinates normalized by the distance, and the result is enlarged/reduced.
  • Expression (8) is obtained from the expression (6) and the expression (7) .
  • a transformation matrix is calculated by solving an equation from a combination of known points.
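Solving an equation from a combination of known points can be sketched with the standard eight-unknown formulation of a projective transformation; the helper name is assumed, and four non-degenerate point pairs stand in for the rectangle corners:

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve the 8 unknowns of a projective transformation from 4 known
    point pairs (src -> dst), fixing the bottom-right entry to 1."""
    rows, rhs = [], []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); rhs.append(u)
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y]); rhs.append(v)
    h = np.linalg.solve(np.array(rows, float), np.array(rhs, float))
    return np.append(h, 1.0).reshape(3, 3)

# unit square mapped to a square of side 2 (a pure scaling, for illustration)
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(0, 0), (2, 0), (2, 2), (0, 2)]
H = homography_from_points(src, dst)
```

In practice one matrix is estimated per rectangular area, from the correspondence between the plane-developed coordinates and the taken-image coordinates.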
  • the image signal processor 30 performs projective transformation so that the curved page surface of the page in the master camera image becomes a plane, based on the coordinates of the plurality of rectangular areas and the coordinates of the projection space of the reference point cloud data P (step S33 in FIG. 13) .
  • FIG. 17 is a diagram illustrating an example of an image obtained by projective transformation for each corresponding rectangular area so that the curved page surface of the page in the master camera image becomes a plane.
  • the image signal processor performs projective transformation for each corresponding rectangular region so that the curved page surface of the page in the master camera image becomes a plane.
  • the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed into a plane by this projective transformation.
  • the image signal processor 30 determines whether or not the entire area of the third point cloud data P3 has been processed (step S34 in FIG. 13) .
  • FIG. 18 is a diagram illustrating an example of the images of the surfaces of the two pages 200 and 201 that have been projected and transformed into a plane.
  • the image signal processor 30 synthesizes two images of the surface of the two pages of the acquired plane, and acquires the image of the surface of the two pages of the plane in the state where the book is opened.
  • the camera image of the surface of the page extended in a plane can be acquired from the camera image of the surface of the curved page of the opened book.
  • according to the present invention, it is possible to obtain a flattened image by simply photographing an opened book.
  • This technology can be provided at low cost by being sold as a smartphone application.
  • the present invention does not require a large-scale device.
  • the terms “first” and “second” are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features.
  • a feature defined as “first” and “second” may comprise one or more of this feature.
  • “a plurality of” means “two or more than two” , unless otherwise specified.
  • the terms “mounted” , “connected” , “coupled” and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
  • a structure in which a first feature is "on" or “below” a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween.
  • a first feature "on” , “above” or “on top of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “on” , “above” or “on top of” the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature “below” , “under” or “on bottom of” a second feature may include an embodiment in which the first feature is orthogonally or obliquely “below” , "under” or “on bottom of” the second feature, or just means that the first feature is at a height lower than that of the second feature.
  • Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
  • the logic and/or step described in other manners herein or shown in the flow chart may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment.
  • the computer readable medium may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment.
  • examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) .
  • the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
  • each part of the present disclosure may be realized by the hardware, software, firmware or their combination.
  • a plurality of steps or methods may be realized by the software or firmware stored in the memory and executed by the appropriate instructions execution system.
  • the steps or methods may be realized by one or a combination of the following techniques known in the art: a discrete logic circuit having a logic gate circuit for realizing a logic function of a data signal, an application-specific integrated circuit having an appropriate combination logic gate circuit, a programmable gate array (PGA) , a field programmable gate array (FPGA) , etc.
  • each function cell of the embodiments of the present disclosure may be integrated in a processing module, or these cells may be separate physical existence, or two or more cells are integrated in a processing module.
  • the integrated module may be realized in a form of hardware or in a form of software function modules. When the integrated module is realized in a form of software function module and is sold or used as a standalone product, the integrated module may be stored in a computer readable storage medium.
  • the storage medium mentioned above may be read-only memories, magnetic disks, CD, etc.


Abstract

An electric device according to the embodiments of the present disclosure includes a camera module that takes a photograph of a subject to acquire a master camera image; a range sensor module that emits a pulsed light to the subject and detects a reflected light of the pulsed light reflected from the subject, thereby acquiring time of flight (ToF) depth information; and an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the ToF depth information.

Description

ELECTRIC DEVICE, METHOD OF CONTROLLING ELECTRIC DEVICE, AND COMPUTER READABLE STORAGE MEDIUM FIELD
The present invention relates to an electric device, a method of controlling the electric device, and a computer readable storage medium.
BACKGROUND
Recently, when capturing a book as an image, a device that captures the image by pressing the book against flat glass and scanning it from below with a line sensor has been popular.
There are devices that capture books from above while they are open, but they are more expensive and have a lower penetration rate than the former type.
Users who read a large number of books often use a type that disassembles books into sheets and reads them continuously.
On the other hand, there is also a conventional technique for reading paper placed on a flat surface using a smartphone camera. This conventional technology is realized by a free application and is widely used. However, in this prior art, when the paper is bent, the surface of the paper appears bent in the image read from the paper.
Therefore, it is required to acquire an image of the surface of the page that is extended in a plane from an image of the surface of the curved page that has been spread out with a book opened, using a camera of an electric device such as a smartphone.
SUMMARY
The present disclosure aims to solve at least one of the technical problems mentioned above. Accordingly, the present disclosure needs to provide an electric device and a method of controlling electric device.
In accordance with the present disclosure, an electric device may include:
a camera module that takes a photograph of a subject to acquire a master camera image;
a range sensor module that acquires range depth information of the subject by using a light; and
an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information,
wherein
the image signal processor controls the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface,
the image signal processor estimates a curve corresponding to the position of the curved page surface based on the master camera image  and the range depth information, and
the image signal processor obtains an image of a surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
In some embodiments, wherein the range sensor module emits a pulsed light to the subject and detects a reflected light of the pulsed light reflected from the subject, thereby acquiring time of flight (ToF) depth information as the range depth information.
In some embodiments, wherein the image signal processor obtains a master camera image including the curved page surface, by taking a photograph of the curved page surface that has been opened and curved with the camera module, and
wherein the image signal processor acquires the ToF depth information of the curved page surface by the range sensor module.
In some embodiments, wherein the image signal processor estimates the curve corresponding to the position of the curved page surface, in a plane perpendicular to the crease direction of the position of the opened crease of the page, based on the master camera image and the ToF depth information.
In some embodiments, wherein the image signal processor obtains a master camera image including a curved page surface of another page in the state where the book is opened and ToF depth information of the curved page surface of the other page,
wherein the image signal processor estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information,
wherein the image signal processor obtains an image of a surface of the other page that has been projection transformed into a plane, by projection transforming the curved page surface of the other page in the master camera image to be a plane, based on the estimated other curve, and
wherein the image signal processor synthesizes the two acquired images of the page surfaces transformed into a plane, and acquires an image of the surfaces of the two pages as a plane in the state where the book is opened.
In some embodiments, wherein the image signal processor sets a crease position designation frame designated by the user on the opened page in the image taken by the camera module in the state where the book is opened, and
wherein the image signal processor acquires the master camera image including the curved page surface of the page in which the crease position designation frame is set, by taking an image of the curved page surface of the page that has been opened and curved in the state of opening the book with the camera module.
In some embodiments, further comprising:
a display module that displays predefined information;
an input module which receives user's operations; and
a main processor that controls the display module and the input  module,
wherein the image signal processor displays the crease position designation frame on the display module together with the master camera image taken by the camera module, and
wherein the image signal processor sets the crease position designation frame at a position designated by the user on the curved page surface of the page of the master camera image, in response to an operation input related to an instruction of the crease position designation frame by the user to the input module.
In some embodiments, wherein the image signal processor obtains reference point cloud data for the curved page surface, based on the master camera image and the ToF depth information,
wherein the image signal processor calculates a normal vector by applying a principal component analysis to the reference point cloud data, and
wherein the image signal processor performs projection transform of the reference point cloud data into first point cloud data of the master camera image taken from the depth direction, based on the calculated normal vector.
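The principal component analysis mentioned here can be sketched as an eigen-decomposition of the point cloud covariance: the eigenvector with the smallest eigenvalue is the normal of the best-fit plane. The helper name is an assumption, and the sample points are synthetic:

```python
import numpy as np

def normal_by_pca(points):
    """Best-fit plane normal of a 3D point cloud via principal component
    analysis (smallest-eigenvalue eigenvector of the covariance)."""
    pts = np.asarray(points, float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # smallest -> plane normal

# noiseless points on the plane z = 0; the normal is +/- (0, 0, 1)
grid = [(x, y, 0.0) for x in range(4) for y in range(4)]
n = normal_by_pca(grid)
```

Projecting the reference point cloud along such a normal is one way to obtain the front-view (depth-direction) first point cloud data described above.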
In some embodiments, wherein the image signal processor scans the first point cloud data along a plurality of lines, in a direction perpendicular to the longitudinal direction of the crease position designation frame,
wherein the image signal processor calculates a slope of a valley of the first point cloud data, by applying the least squares method to the scanned first point cloud data,
wherein the image signal processor estimates the crease position based on the calculated slope of the valley, and
wherein the image signal processor obtains second point cloud data by first rotating the first point cloud data so that the estimated crease position is parallel to a preset reference direction.
In some embodiments, wherein the image signal processor scans the second point cloud data along a plurality of lines in a direction perpendicular to the reference direction,
wherein the image signal processor calculates a slope of the ridge in the reference direction of the scanned second point cloud data by applying a least square method to the scanned second point cloud data,
wherein the image signal processor acquires the third point cloud data by a second rotation of the second point cloud data, so that the calculated slope of the ridge is parallel to a preset reference plane.
In some embodiments, wherein the image signal processor scans the third point cloud data along a plurality of lines in the reference direction,
wherein the image signal processor calculates an average value of the third point cloud data in the vicinity of the plurality of lines in the depth direction, and
wherein the image signal processor approximates the calculated average value as a curve in a direction perpendicular to the reference direction and a depth direction by a fourth-order or higher order  polynomial.
In some embodiments, wherein the image signal processor divides the third point cloud data into a plurality of rectangular areas for each section obtained by dividing the curve, and
wherein the image signal processor obtains the image of the surface of the page that has been projective transformed into a plane by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on a relation between coordinates of the plurality of rectangular regions and coordinates of the projection space of the reference point cloud data.
In some embodiments, wherein a detection resolution of the range sensor module is lower than a detection resolution of the camera module.
In accordance with the present disclosure, a method for controlling an electric device including: a camera module that takes a photograph of a subject to acquire a master camera image; a range sensor module that acquires range depth information of the subject by using a light; and an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information,
the method including:
controlling, by means of the image signal processor, the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface;
estimating, by means of the image signal processor, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and
obtaining, by means of the image signal processor, an image of a surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
In some embodiments, wherein the range depth information is time of flight (ToF) depth information.
In accordance with the present disclosure, a computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device, and the method comprises:
controlling, by means of the image signal processor, the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface;
estimating, by means of the image signal processor, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and
obtaining, by means of the image signal processor, an image of a surface of the page that has been projection transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image to be a plane, based on the estimated curve.
In some embodiments, wherein the range depth information is time of flight (ToF) depth information.
BRIEF DESCRIPTION OF THE DRAWINGS
These and/or other aspects and advantages of embodiments of the present disclosure will become apparent and more readily appreciated from the following descriptions made with reference to the drawings, in which:
FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1;
FIG. 3 is a diagram illustrating an example of an overall flow for the electric device 100 shown in FIG. 1 and FIG. 2 to acquire an image obtained by transforming the surface of a page of the book 101 in an opened state into a plane;
FIG. 4 is a diagram illustrating a specific example of step S1 shown in FIG. 3 for acquiring the master camera image and ToF depth information;
FIG. 5 is a diagram illustrating an example of display on the display module 45 of the electric device 100 when the crease position is designated;
FIG. 6 is a diagram illustrating a specific example of step S2 shown in FIG. 3 for executing the top view process;
FIG. 7A is a diagram illustrating an example of the reference point cloud data P of the image of the surface of the curved page imaged from the tilted direction with the book opened;
FIG. 7B is a diagram illustrating an example of the first point cloud data P1 obtained by projective transformation as photographed from the depth direction (from the front) ;
FIG. 8A is a diagram illustrating an example of scanning along a plurality of lines L1 with respect to the first point cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the crease position designation frame;
FIG. 8B is a diagram illustrating an example of the second point cloud data P2 obtained by the first rotation R of the first point cloud data so that the estimated crease position D2 is parallel to a preset reference direction (the z axis direction) ;
FIG. 9A is a diagram illustrating an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (the z axis direction) ;
FIG. 9B is a diagram illustrating an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2;
FIG. 9C is a diagram illustrating an example of the third point cloud data P3 obtained by rotating the second point cloud data P2 by the second rotation Q so that the calculated inclination of the ridge N is  parallel to a preset reference plane (z-y plane) ;
FIG. 10A is a diagram illustrating an example of scanning along a plurality of lines L3 with respect to the third point cloud data P3 in the reference direction (the z axis direction) ;
FIG. 10B is a diagram illustrating an example of an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3;
FIG. 10C is a diagram illustrating an example of a curve E obtained by approximating the calculated average value A with a fourth-order or higher order polynomial in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction) ;
FIG. 11 is a diagram illustrating a specific example of step S25 for executing the dividing process shown in FIG. 6;
FIG. 12A is a diagram illustrating an example of dividing a curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range;
FIG. 12B is a diagram illustrating an example, continuous with FIG. 12A, of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range;
FIG. 12C is a diagram illustrating an example, continuous with FIG. 12B, of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range;
FIG. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in FIG. 3;
FIG. 14A is a diagram illustrating an example in which the third point cloud data P3 is divided into a plurality of rectangular areas for each section obtained by dividing the curve E;
FIG. 14B is a diagram illustrating an example of coordinates obtained by inversely transforming the third point cloud data P3 into the projection space of the point cloud data P1;
FIG. 15A is a diagram illustrating an example of a plurality of rectangular areas J into which the third point cloud data P3 is divided for each section obtained by dividing the curve E;
FIG. 15B is a diagram illustrating an example of a plurality of rectangular areas G developed in the two-dimensional space (u, v) from a plurality of rectangular areas J in the three-dimensional space (x, y, z) ;
FIG. 16 is a diagram illustrating an example of a relationship between coordinates when the point cloud data is expanded on a plane and coordinates on a captured image;
FIG. 17 is a diagram illustrating an example of an image obtained by projective transformation for each corresponding rectangular area so that the surface of the curved page in the master camera image becomes a plane; and
FIG. 18 is a diagram illustrating an example of the images of the surfaces of the two pages 200 and 201 that have been projected and transformed into a plane.
DETAILED DESCRIPTION
Embodiments of the present disclosure will be described in detail and examples of the embodiments will be illustrated in the accompanying drawings. The same or similar elements and the elements having same or similar functions are denoted by like reference numerals throughout the descriptions. The embodiments described herein with reference to the drawings are explanatory, which aim to illustrate the present disclosure, but shall not be construed to limit the present disclosure.
FIG. 1 is a diagram illustrating an example of an arrangement of an electric device 100 and a subject 101 according to an embodiment of the present invention. FIG. 2 is a diagram illustrating an example of the configuration of the electric device 100 shown in FIG. 1.
As shown in FIG. 1 and FIG. 2, for example, the electric device 100 includes a camera module 10, a range sensor module 20, and an image signal processor 30 that controls the camera module 10 and the range sensor module 20, and processes camera image data acquired from the camera module 10. For example, reference numeral 101 in FIG. 1 depicts a subject, which is a book.
The camera module 10 includes, for example, a master lens 10a that is capable of focusing on a subject, a master image sensor 10b that detects an image inputted via the master lens 10a, and a master image sensor driver 10c that drives the master image sensor 10b, as shown in FIG. 2.
Furthermore, the camera module 10 includes, for example, a Gyro sensor 10d that detects the angular velocity and the acceleration of the camera module 10, a focus & OIS actuator 10f that actuates the master lens 10a, and a focus & OIS driver 10e that drives the focus & OIS actuator 10f, as shown in FIG. 2.
The camera module 10 acquires a master camera image of the subject 101, for example (FIG. 1) .
As shown in FIG. 2, the range sensor module 20 includes, for example, a ToF lens 20a, a range sensor 20b that detects the reflection light inputted via the ToF lens 20a, a range sensor driver 20c that drives the range sensor 20b, and a projector 20d that outputs the pulse lights.
The range sensor module 20 acquires range depth information of the subject 101 by using a light. In particular, the range sensor module 20 acquires the time of flight (ToF) depth information (ToF depth value) as the range depth information by emitting pulsed light toward the subject 101 and detecting the reflection light from the subject 101, for example.
The resolution of the detection by the range sensor module 20 is lower than the resolution of the detection by the camera module 10.
The image signal processor 30 controls, for example, the camera module 10 and the range sensor module 20, and acquires a camera image based on the master camera image obtained by means of the camera module 10 and the ToF depth information obtained by means of the range sensor module 20.
Furthermore, as shown in FIG. 2, for example, the electric device 100 includes a global navigation satellite system (GNSS) module 40, a wireless communication module 41, a CODEC 42, a speaker 43, a microphone 44, a display module 45, an input module 46, an inertial measurement unit (IMU) 47, a main processor 48, and a memory 49.
The GNSS module 40 measures the current position of the electric device 100, for example.
The wireless communication module 41 performs wireless communications with the Internet, for example.
The CODEC 42 bidirectionally performs encoding and decoding, using a predetermined encoding/decoding method, as shown in FIG. 2 for example.
The speaker 43 outputs a sound in accordance with sound data decoded by the CODEC 42, for example.
The microphone 44 outputs sound data to the CODEC 42 based on inputted sound, for example.
The display module 45 displays predefined information.
The input module 46 receives a user's input (a user's operation) .
The IMU 47 detects, for example, the angular velocity and the acceleration of the electric device 100.
The main processor 48 controls the global navigation satellite system (GNSS) module 40, the wireless communication module 41, the CODEC 42, the speaker 43, the microphone 44, the display module 45, the input module 46, and the IMU 47.
The memory 49 stores a program and data required for the image signal processor 30 to control the camera module 10 and the range sensor module 20, acquired image data, and a program and data required for the main processor 48 to control the electric device 100.
For example, the memory 49 includes a computer readable storage medium having a computer program stored thereon, wherein, when the computer program is executed by the main processor 48, the computer program implements a method of controlling the electric device 100. For example, the method comprises: controlling, by means of the image signal processor 30, the camera module 10 and the range sensor module 20 to acquire the master camera image including a curved page surface in a state where a book is opened, and range depth information of the curved page surface; estimating, by means of the image signal processor 30, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and obtaining, by means of the image signal processor 30, an image of the page surface projectively transformed into a plane, by performing projective transformation of the curved page surface in the master camera image into a plane, based on the estimated curve.
The electric device 100 having the above-described configuration is a mobile phone such as a smartphone in this embodiment, but may be another type of electric device (for instance, a tablet computer or a PDA) including a camera module.
Next, an example of a method of controlling the electric device 100 having the above-described configuration and functions will now be described. In particular, an example of a flow in which the electric device 100 acquires a camera image of the curved page surface extended into a plane, based on an image of the curved page surface in a state where the book is opened, by using a camera, will be described below.
FIG. 3 is a diagram illustrating an example of an overall flow for the electric device 100 shown in FIG. 1 and FIG. 2 to acquire an image obtained by transforming the curved page surface of a page of the book 101 in an opened state into a plane.
First, as shown in FIG. 3, the image signal processor 30 controls the camera module 10 and the range sensor module 20 to take a master camera image including a curved page surface in a state where a book is opened, and the ToF depth information of the curved page surface (a step S1) .
Next, as shown in FIG. 3, the image signal processor 30 executes the top view processing (a step S2) . The top view processing estimates a crease position of the page based on the master camera image and the ToF depth information, and estimates a curve corresponding to the position of the curved page surface of the page perpendicular to the crease position.
Next, as shown in FIG. 3, the image signal processor 30 executes image correction processing (a step S3) . This image correction processing projects and transforms the curved page surface of the page in the master camera image so as to be a plane based on the estimated curve. Thereby, the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed to a plane.
Next, as shown in FIG. 3, the image signal processor 30 determines whether or not additional photographing such as another curved page in the state where the book is opened is necessary (a step S4) .
If the additional photographing is needed, the image signal processor 30 returns to the step S1, and takes the master camera image including the curved page surface of the other page in the state where the book is opened and the ToF depth information of the curved page surface of the page.
In the step S2, the image signal processor 30 estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information.
Then, in the step S3, the image signal processor 30 performs projective transformation on the curved page surface of the other page in the master camera image so as to be a plane, based on the estimated other curve. Thereby, the image signal processor 30 acquires an image of the page surface transformed onto the plane.
If the image signal processor 30 determines in the step S4 that no additional photographing is necessary, the image signal processor 30 combines the two acquired planar images of the page surfaces. Thereby, the image signal processor 30 acquires an image of the surfaces of the two pages in a plane, in the state where the book is opened (a step S5) .
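The control flow of the steps S1 to S5 can be sketched as follows. The helper functions here are illustrative stand-ins (not part of the disclosure) that only mimic how data passes between the steps:

```python
def acquire(page):
    """Step S1: return (master camera image, ToF depth) for one opened page."""
    return page["image"], page["depth"]

def top_view_processing(image, depth):
    """Step S2: estimate the page curve from the image and depth (stubbed)."""
    return {"sections": len(depth)}

def correct_image(image, curve):
    """Step S3: projectively transform the curved page to a plane (stubbed)."""
    return {"flattened": image, "sections": curve["sections"]}

def scan_book(pages):
    """Steps S1-S4 loop over the pages that still need photographing;
    step S5 combines the flattened page images into one spread."""
    flattened = []
    for page in pages:                                  # step S4: more pages?
        image, depth = acquire(page)                    # step S1
        curve = top_view_processing(image, depth)       # step S2
        flattened.append(correct_image(image, curve))   # step S3
    return {"spread": flattened}                        # step S5

pages = [{"image": "left-page", "depth": [1.0, 1.2, 1.1]},
         {"image": "right-page", "depth": [1.0, 1.3]}]
result = scan_book(pages)
```

The per-step processing is detailed in the sections that follow.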
The present invention has the following preconditions (a) to (d) . (a) : The book is opened up and down (in the direction of the crease) with almost the same way of bending. (b) : The binding margin (the crease position) of the book is positioned substantially above or below one of the center, the left end, and the right end of the screen. (c) : The resolution of the detection by the range sensor module 20 is lower than the resolution of the camera module 10. (d) : The distance to the surface of the book can be detected by the range sensor module 20.
Here, an example of a flow for acquiring the master camera image and ToF depth information shown in FIG. 3 will be described below.
FIG. 4 is a diagram illustrating a specific example of step S1 shown in FIG. 3 for acquiring the master camera image and ToF depth information. FIG. 5 is a diagram illustrating an example of display on the display module 45 of the electric device 100 when the crease position is designated.
As shown in FIG. 4, the image signal processor 30 sets the crease position designation frame 45a designated by the user on the opened page in the image taken by the camera module 10 in the state where the book is opened (a step S11) .
The image signal processor 30 displays the crease position designation frame 45a on the display module 45 together with the master camera image taken by the camera module 10.
The image signal processor 30 sets the crease position designation frame 45a at a position designated by the user on the curved page surface of the page of the master camera image, in response to an operation input related to an instruction of the crease position designation frame 45a by the user to the input module 46.
For example, as shown in FIG. 5, the crease position designation frame 45a is first displayed at the center or the previous operation position in the display module 45. When the user wants to designate another position, the user touches the position to be designated. Or, in the display module 45, nothing may be displayed at first, and then, when the user touches the display module 45, the crease position designation frame 45a is displayed at the touched position in the display module 45.
Further, when the user operates the frame direction designation button B1, the crease position designation frame 45a rotates up and down or left and right in the display module 45. The frame direction designation button B1 is included in the input module 46.
Further, for example, as shown in FIG. 5, the crease position designation frame 45a has a mark (an arrow in the example) indicating the vertical direction.
For example, in the display module 45, when the page display is upside down, the crease position designation frame 45a is rotated into the opposite direction by the user's operation of the frame direction designation button B1.
For example, when the user touches and drags the end of the crease position designation frame 45a, the frame can be rotated about its center position.
For example, when the user touches the display module 45 with two fingers, the crease position designation frame 45a is set in the direction connecting the touched points in the display module 45.
Then, as shown in FIG. 4, the image signal processor 30 takes a master camera image including the curved page surface of the page on which the crease position designation frame 45a is set, by taking a photograph of the curved page surface of the page that has been opened and curved with the camera module 10, in a state where the book is opened (a step S12) .
For example, as shown in FIG. 5, when the user operates the shutter button of the input module 46, the camera module 10 takes a photograph of the curved page surface of the page that is opened and curved, while the book displayed on the display module 45 is opened.
The image signal processor 30 acquires a master camera image including the curved page surface, by capturing the curved page surface of the page that is opened and curved with the camera module 10, in a state where the book 101 is opened.
The image signal processor 30 acquires ToF depth information of the curved page surface of the page by irradiating the curved page surface of the page with pulse light from the range sensor module 20.
Next, an example of a flow for executing the top view process shown in FIG. 3 will be described below.
FIG. 6 is a diagram illustrating a specific example of step S2 shown in FIG. 3 for executing the top view process.
First, as shown in FIG. 6, the image signal processor 30 performs the projective transformation of the point cloud data into data as if photographed from the front (a step S21) .
Then, the image signal processor 30 estimates the crease position of the projectively transformed point cloud data, and rotates the point cloud data accordingly (a step S22) .
Then, as shown in FIG. 6, the image signal processor 30 performs estimation of the ridge inclination and projective transformation of the rotated point cloud data (a step S23) .
Then, the image signal processor 30 estimates the curved page surface by estimating a curve corresponding to the position of the curved page surface of the page (a step S24) .
In this way, as shown in FIG. 6, in the top view process, the image signal processor 30 estimates the curve corresponding to the position of the curved page surface, in the plane (XY plane) perpendicular to the crease direction (the z axis direction) of the opened crease position of the page, based on the master camera image and the ToF depth information (the steps S22 to S24) .
FIG. 7A is a diagram illustrating an example of the reference point cloud data P of the image of the curved page surface imaged from a tilted direction with the book opened. FIG. 7B is a diagram illustrating an example of the first point cloud data P1 obtained by projective transformation, as if photographed from the depth direction (from the front) .
As shown in FIG. 7A, the image signal processor 30 acquires the reference point cloud data of the curved page surface of the page based on the master camera image and the ToF depth information.
Then, the image signal processor 30 calculates a normal vector by applying the principal component analysis to the reference point cloud data.
Then, as shown in FIG. 7B, the image signal processor 30 performs the projective transform T of the reference point cloud data P into the first point cloud data P1 of the master camera image taken from the depth direction (the front) , based on the calculated normal vector.
In this way, the image signal processor 30 applies principal component analysis to the reference point cloud data P photographed from an oblique angle to obtain a normal vector, and performs the projective transform T into data as photographed from the front (the y-z plane) .
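The normal-vector step can be sketched with a small principal component analysis: the eigenvector of the point covariance with the smallest eigenvalue approximates the dominant plane normal of the cloud. A minimal numpy sketch; the sample points are invented for illustration:

```python
import numpy as np

def plane_normal(points):
    """Principal component analysis of an (N, 3) point cloud: the
    covariance eigenvector with the smallest eigenvalue approximates
    the normal vector of the dominant plane."""
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    return eigvecs[:, 0]                     # least-variance direction

# Synthetic points close to the plane z = 0 (invented for illustration),
# so the estimated normal should be close to (0, 0, +/-1).
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, 200),
                       rng.uniform(-1, 1, 200),
                       rng.normal(0, 0.01, 200)])
normal = plane_normal(pts)
```

The projective transform T then maps the cloud so that this normal points along the viewing (depth) direction, which yields the face-on data P1.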
FIG. 8A is a diagram illustrating an example of scanning along a plurality of lines L1 with respect to the first point cloud data P1 in a direction perpendicular to the longitudinal direction D1 of the crease position designation frame. FIG. 8B is a diagram illustrating an example of the second point cloud data P2 obtained by the first rotation R of the first point cloud data so that the estimated crease position D2 is parallel to a preset reference direction (the z axis direction) .
As shown in FIG. 8A, the image signal processor 30 scans the first point cloud data P1 along a plurality of lines L1 in a direction (short direction) perpendicular to the longitudinal direction D1 of the crease position designation frame 45a.
Then, the image signal processor 30 calculates the slope of the valley of the first point cloud data P1 by applying the least square method to the scanned first point cloud data P1.
As shown in FIG. 8A, the image signal processor 30 estimates the crease position (the valley position) M based on the calculated slope of the valley.
Then, as shown in FIG. 8B, the image signal processor 30 acquires the second point cloud data P2, by performing the first rotation R of the first point cloud data P1 so that the estimated crease position M is parallel to the preset reference direction (the z axis direction) .
In addition, when the image signal processor 30 scans the first point cloud data P1 along the plurality of lines L1, the image signal processor 30 extracts data within a predetermined range from the first point cloud data P1 as the data for the least square method.
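The valley estimation of FIG. 8A can be sketched for a single scan line: fit the scanned depth profile by least squares and take the vertex as the crease position. The parabola model and sample values below are illustrative assumptions; the actual processing combines such fits over the many lines L1:

```python
import numpy as np

def valley_position(ys, depths):
    """Least-squares fit of depth = a*y^2 + b*y + c along one scan line;
    the valley (crease) lies at the vertex y = -b / (2*a)."""
    a, b, _c = np.polyfit(ys, depths, 2)
    return -b / (2.0 * a)

ys = np.linspace(-1.0, 1.0, 21)
depths = (ys - 0.2) ** 2 + 0.05   # synthetic profile folded at y = 0.2
crease = valley_position(ys, depths)
```

Repeating this over the lines L1 yields the crease position M; the first rotation R then aligns M with the preset reference direction (the z axis).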
FIG. 9A is a diagram illustrating an example of scanning along a plurality of lines L2 with respect to the second point cloud data P2 in a direction perpendicular to the preset reference direction (the z axis direction) . FIG. 9B is a diagram illustrating an example of the slope of the ridge N in the longitudinal direction of the scanned second point cloud data P2. FIG. 9C is a diagram illustrating an example of the third point cloud data P3 obtained by rotating the second point cloud data P2 by the second rotation Q so that the calculated inclination of the ridge N is parallel to a preset reference plane (the z-y plane) .
For example, as shown in FIG. 9A, the image signal processor 30 scans the second point cloud data P2 along a plurality of lines L2 in a direction perpendicular to the reference direction (the z axis direction) .
Then, for example, as shown in FIG. 9B, the image signal processor 30 calculates the slope of ridges N in the reference direction of the scanned second point cloud data, by applying a least square method to the scanned second point cloud data.
Then, for example, as shown in FIG. 9C, the image signal processor 30 acquires the third point cloud data P3 by performing the second rotation Q of the second point cloud data P2, so that the calculated slope of the ridge N is parallel to a preset reference plane (the z-y plane) .
The coordinate transformation, from the original reference point cloud data P to the third point cloud data P3, is "QRT" .
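Since the transforms are applied in the order T, then R, then Q, the combined mapping "QRT" composes by matrix product, and the mapping back to the reference frame factors in reverse order as T⁻¹R⁻¹Q⁻¹. A numpy sketch with invented example rotations standing in for T, R, and Q:

```python
import numpy as np

def rot_x(t):
    """Rotation about the x axis by angle t (radians)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(t):
    """Rotation about the z axis by angle t (radians)."""
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Arbitrary stand-ins for the face-on transform T and the rotations R, Q.
T, R, Q = rot_x(0.3), rot_z(0.5), rot_x(-0.2)
forward = Q @ R @ T                              # "QRT": T first, Q last
inverse = np.linalg.inv(T) @ np.linalg.inv(R) @ np.linalg.inv(Q)

p = np.array([0.4, -1.2, 2.0])
restored = inverse @ (forward @ p)               # back to the original frame
```

The inverse product is exactly the transformation used later in the expression (2) to map the third point cloud data back into the projection space of the reference point cloud data.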
FIG. 10A is a diagram illustrating an example of scanning along a plurality of lines L3 with respect to the third point cloud data P3 in the reference direction (the z axis direction) . FIG. 10B is a diagram illustrating an example of an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3. FIG. 10C is a diagram illustrating an example of a curve E obtained by approximating the calculated average value A with a fourth-order or higher order polynomial in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction) .
For example, as shown in FIG. 10A, the image signal processor 30 scans the third point cloud data P3 along a plurality of lines L3 in the reference direction (the z axis direction) .
Then, for example, as shown in FIG. 10B, the image signal processor 30 calculates an average value A of the third point cloud data P3 in the depth direction (the x axis direction) in the vicinity of the plurality of lines L3.
Then, for example, as shown in FIG. 10C, the image signal processor 30 calculates the curve E approximated by a fourth-order or higher polynomial expression (1) in the direction perpendicular to the reference direction (the y axis direction) and the depth direction (the x axis direction) , based on the average value A.

x = a₄y⁴ + a₃y³ + a₂y² + a₁y + b    (1)
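The approximation of the expression (1) is an ordinary least-squares polynomial fit of the averaged depth x against y. A numpy sketch; the coefficient values are invented for illustration:

```python
import numpy as np

ys = np.linspace(-1.0, 1.0, 50)
true_coeffs = [0.3, -0.1, 0.5, 0.02, 1.0]   # a4, a3, a2, a1, b (invented)
x_avg = np.polyval(true_coeffs, ys)         # stands in for the averages A

coeffs = np.polyfit(ys, x_avg, 4)           # fit x = a4*y^4 + ... + b
curve_E = np.polyval(coeffs, ys)            # the fitted curve E
```

Because the synthetic averages are themselves quartic, the fit recovers the coefficients essentially exactly; with real ToF averages the polynomial smooths the measurement noise.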
Next, an example of a flow for executing the dividing process shown in FIG. 6 will be described below.
FIG. 11 is a diagram illustrating a specific example of step S25 shown in FIG. 6 for executing the dividing process. FIG. 12A is a diagram illustrating an example of dividing the curve E by points so that an error between the points of the curve E approximated by a polynomial falls within an allowable range. FIG. 12B is a diagram illustrating an example of the division, continuing from FIG. 12A. FIG. 12C is a diagram illustrating an example of the division, continuing from FIG. 12B.
For example, as shown in FIG. 11 and FIG. 12A, the image signal processor 30 calculates an error between points in the division process (a step S251) .
Next, as shown in FIG. 11, the image signal processor 30 determines whether or not the error between points is within an allowable range (a step S252) .
Then, as shown in FIG. 11, FIG. 12B and FIG. 12C, when the error between the points exceeds the allowable range, the image signal processor 30 further divides the section in which the error exceeds the allowable range (a step S253) .
On the other hand, the image signal processor 30 ends the division process when the error between the points falls within an allowable range. 
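The division process of the steps S251 to S253 can be sketched as a recursive midpoint test: approximate the curve between two division points by a straight chord, and split the section whenever the chord deviates from the curve by more than the allowance. The tolerance value and the example curve are invented for illustration:

```python
def subdivide(f, y0, y1, tol, depth=0):
    """Divide [y0, y1] so that the straight chord between consecutive
    division points stays within tol of the curve x = f(y), checked at
    the midpoint; `depth` only guards against runaway recursion."""
    ym = 0.5 * (y0 + y1)
    chord_mid = 0.5 * (f(y0) + f(y1))                 # step S251: error between points
    if abs(chord_mid - f(ym)) <= tol or depth > 20:   # step S252: within allowance?
        return [y0, y1]
    return (subdivide(f, y0, ym, tol, depth + 1)[:-1] # step S253: split the section
            + subdivide(f, ym, y1, tol, depth + 1))

curve = lambda y: 0.3 * y**4 - 0.1 * y**3 + 0.5 * y**2  # an example curve E
points = subdivide(curve, -1.0, 1.0, tol=0.01)
```

Strongly bent parts of the page produce dense division points, while nearly flat parts keep long sections, which is what makes the later piecewise projective transformation accurate with few rectangles.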
Next, an example of a flow for executing the image correction processing shown in FIG. 3 will be described below.
FIG. 13 is a diagram showing a specific example of step S3 for executing the image correction processing shown in FIG. 3. FIG. 14A is a diagram illustrating an example in which the third point cloud data P3 is divided into a plurality of rectangular areas for each section obtained by dividing the curve E. FIG. 14B is a diagram illustrating an example of coordinates obtained by inversely transforming the third point cloud data P3 into the projection space of the reference point cloud data P. FIG. 15A is a diagram illustrating an example of a plurality of rectangular areas J, one for each section obtained by dividing the curve E, into which the third point cloud data P3 is divided. FIG. 15B is a diagram illustrating an example of a plurality of rectangular areas G developed in the two-dimensional space (u, v) from the plurality of rectangular areas J in the three-dimensional space (x, y, z) . FIG. 16 is a diagram illustrating an example of a relationship between coordinates when the point cloud data is expanded on a plane and coordinates on a taken image.
First, as shown in FIG. 13, in the image correction process, the image signal processor 30 sets an area for executing the process (a step S31) .
For example, as shown in FIG. 14A, the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas for each section obtained by dividing the curve E.
For example, the coordinates of the third point cloud data P3 are transformed into the projection space coordinates of the original reference point cloud data P (FIG. 14B) by the inverse transformation T⁻¹R⁻¹Q⁻¹ shown in the expression (2) .

P = T⁻¹R⁻¹Q⁻¹P₃    (2)
Then, as shown in step S32 in FIG. 13, the image signal processor 30 estimates a transformation matrix for expanding the three-dimensional space plane (FIG. 15A) into the two-dimensional space plane (FIG. 15B) .
For example, as shown in FIG. 15A, the image signal processor 30 divides the third point cloud data P3 into a plurality of rectangular areas J for each section obtained by dividing the curve.
Then, as shown in FIG. 15B, the image signal processor 30 transforms a plurality of rectangular regions J obtained by dividing the  third point cloud data P3 in the three-dimensional space (x, y, z) into the two-dimensional space (u, v) , to obtain a plurality of rectangular areas G expanded in the two-dimensional space (u, v) .
For example, the width of the rectangular region G shown in FIG. 15B is expressed by the expression (3) . Further, the length of the rectangular region G shown in FIG. 15B is expressed by the expression (4) .
√( (x₁-x₀) ² + (y₁-y₀) ²) = |u₁-u₀|    (3)
|z₁-z₀| = |v₁-v₀|    (4)
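Together, the expressions (3) and (4) say that each rectangle keeps its length along the crease (z) direction, while its width in u equals the segment length measured along the curve; the u coordinate of each division point is therefore a cumulative arc length. A numpy sketch over an invented page-profile curve:

```python
import numpy as np

def unfold_u(xs, ys):
    """u coordinate of each curve sample: cumulative segment lengths
    sqrt((x1 - x0)^2 + (y1 - y0)^2), i.e. the expression-(3) widths."""
    seg = np.hypot(np.diff(xs), np.diff(ys))
    return np.concatenate([[0.0], np.cumsum(seg)])

ys = np.linspace(-1.0, 1.0, 2001)
xs = 0.5 * ys**2          # invented page-profile curve x = 0.5*y^2
u = unfold_u(xs, ys)
```

The unfolded width u[-1] exceeds the flat span of 2 because the page bulges out of the plane; that difference is exactly the content that would be compressed if the curved page were treated as flat.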
Then, the image signal processor 30 calculates a transformation matrix for each region from the relationship between the coordinates when expanded on a plane and the coordinates on the taken image.
As shown in the expression (5) , the offset values Ou and Ov and the enlargement/reduction ratio k, which match the camera image after distortion correction, are obtained by calibration; an offset is added to the coordinates normalized by the distance, which are then enlarged/reduced.
u′ = k· (y/x) + Ou, v′ = k· (z/x) + Ov    (5)
Two expressions can be created with one point of correspondence. There are nine unknowns (the expression (7) ) .
(w·u′, w·v′, w) ᵀ = H· (u, v, 1) ᵀ    (6)

H = [a₁ a₂ a₃; b₁ b₂ b₃; c₁ c₂ c]    (7)
Expression (8) is obtained from the expression (6) and the expression (7) .
u′ = (a₁u + a₂v + a₃) / (c₁u + c₂v + c) , v′ = (b₁u + b₂v + b₃) / (c₁u + c₂v + c)    (8)
Then, in the expression (8) , if "c" is assumed to be a fixed value (for example, "1" ) , the number of unknowns becomes eight (the expression (9) ) .
u′· (c₁u + c₂v + 1) = a₁u + a₂v + a₃, v′· (c₁u + c₂v + 1) = b₁u + b₂v + b₃    (9)
row3=-u′row1-v′row2   (10)
Then, in the expression (9) , when the relationship of the expression (10) is established, the expression (11) is obtained.
[u v 1 0 0 0 -u·u′ -v·u′; 0 0 0 u v 1 -u·v′ -v·v′] (a₁ a₂ a₃ b₁ b₂ b₃ c₁ c₂) ᵀ = (u′ v′) ᵀ    (11)
Therefore, if the correspondence between the four points in the coordinates when developed on the plane and the coordinates on the photographed image is known, the unknown can be obtained.
That is, as shown in the expression (12) , a transformation matrix is calculated by solving an equation from a combination of known points.
[u₁ v₁ 1 0 0 0 -u₁u₁′ -v₁u₁′; 0 0 0 u₁ v₁ 1 -u₁v₁′ -v₁v₁′; … ; u₄ v₄ 1 0 0 0 -u₄u₄′ -v₄u₄′; 0 0 0 u₄ v₄ 1 -u₄v₄′ -v₄v₄′] (a₁ a₂ a₃ b₁ b₂ b₃ c₁ c₂) ᵀ = (u₁′ v₁′ … u₄′ v₄′) ᵀ    (12)
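Solving the expression (12) amounts to the standard estimation of a projective transformation from four point correspondences. A numpy sketch; the sample correspondences are invented, and the unknowns are ordered a₁, a₂, a₃, b₁, b₂, b₃, c₁, c₂ with the last matrix element c fixed to 1:

```python
import numpy as np

def homography_from_4_points(src, dst):
    """Build and solve the 8x8 system: each correspondence
    (u, v) -> (u', v') contributes two rows, and fixing the last
    matrix element to 1 leaves eight unknowns."""
    A, rhs = [], []
    for (u, v), (up, vp) in zip(src, dst):
        A.append([u, v, 1, 0, 0, 0, -u * up, -v * up]); rhs.append(up)
        A.append([0, 0, 0, u, v, 1, -u * vp, -v * vp]); rhs.append(vp)
    h = np.linalg.solve(np.array(A, float), np.array(rhs, float))
    return np.append(h, 1.0).reshape(3, 3)

def project(H, pt):
    """Apply the projective transform H to a 2-D point."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Invented correspondences: rectangle corners in the unfolded (u, v)
# plane and their positions on the photographed (curved) page.
src = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
dst = [(0.1, 0.2), (1.2, 0.1), (1.0, 1.3), (-0.1, 1.1)]
H = homography_from_4_points(src, dst)
```

One such matrix is computed per rectangular area G, and its inverse mapping is then used to resample the photographed pixels onto the flattened page.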
Next, the image signal processor 30 performs projective transformation so that the curved page surface of the page in the master camera image becomes a plane, based on the coordinates of the plurality of rectangular areas and the coordinates of the projection space of the reference point cloud data P (step S33 in FIG. 13) .
FIG. 17 is a diagram illustrating an example of an image obtained by projective transformation for each corresponding rectangular area so that the curved page surface of the curved page in the master camera image becomes a plane.
For example, as shown in FIG. 17, the image signal processor 30 performs projective transformation for each corresponding rectangular region so that the curved page surface of the curved page in the master camera image becomes a plane.
Then, the image signal processor 30 acquires an image of the surface of the page that has been projectively transformed into a plane by this projective transformation.
Then, the image signal processor 30 determines whether or not the entire area of the third point cloud data P3 has been processed (step S34 in FIG. 13) .
Then, when the image signal processor 30 has not processed the entire area of the third point cloud data P3, the process returns to step S31 in FIG. 13.
On the other hand, when the image signal processor 30 has processed the entire area of the third point cloud data P3 as shown in FIG. 15B, the image correction processing ends.
FIG. 18 is a diagram illustrating an example of the images of the surfaces of the two pages 200 and 201 that have been projectively transformed into a plane.
As shown in FIG. 18, the image signal processor 30 synthesizes the two acquired planar images of the page surfaces, and acquires an image of the surfaces of the two pages in a plane, in the state where the book is opened.
Thereby, using the camera of the electric device 100 such as a smartphone, a camera image of the page surface extended into a plane can be acquired from a camera image of the curved page surface of the opened book.
As described above, according to the present invention, it is possible to obtain a flattened page image by simply photographing an opened book. This technology can be provided at low cost by being sold as a smartphone application. The present invention does not require a large-scale device.
In the description of embodiments of the present disclosure, it is to be understood that terms such as "central" , "longitudinal" , "transverse" , "length" , "width" , "thickness" , "upper" , "lower" , "front" , "rear" , "back" , "left" , "right" , "vertical" , "horizontal" , "top" , "bottom" , "inner" , "outer" , "clockwise" and "counterclockwise" should be construed to refer to the orientation or the position as described or as shown in the drawings in discussion. These relative terms are only used to simplify the description of the present disclosure, and do not indicate or imply that the device or element referred to must have a particular orientation, or must be constructed or operated in a particular orientation. Thus, these terms cannot be construed to limit the present disclosure.
In addition, terms such as "first" and "second" are used herein for purposes of description and are not intended to indicate or imply relative importance or significance or to imply the number of indicated technical features. Thus, a feature defined as "first" and "second" may comprise one or more of this feature. In the description of the present  disclosure, "a plurality of" means "two or more than two" , unless otherwise specified.
In the description of embodiments of the present disclosure, unless specified or limited otherwise, the terms "mounted" , "connected" , "coupled" and the like are used broadly, and may be, for example, fixed connections, detachable connections, or integral connections; may also be mechanical or electrical connections; may also be direct connections or indirect connections via intervening structures; may also be inner communications of two elements which can be understood by those skilled in the art according to specific situations.
In the embodiments of the present disclosure, unless specified or limited otherwise, a structure in which a first feature is "on" or "below" a second feature may include an embodiment in which the first feature is in direct contact with the second feature, and may also include an embodiment in which the first feature and the second feature are not in direct contact with each other, but are in contact via an additional feature formed therebetween. Furthermore, a first feature "on" , "above" or "on top of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "on" , "above" or "on top of" the second feature, or just means that the first feature is at a height higher than that of the second feature; while a first feature "below" , "under" or "on bottom of" a second feature may include an embodiment in which the first feature is orthogonally or obliquely "below" , "under" or "on bottom of" the second feature, or just means that the first feature is at a height lower than that of the second feature.
Various embodiments and examples are provided in the above description to implement different structures of the present disclosure. In order to simplify the present disclosure, certain elements and settings are described in the above. However, these elements and settings are only by way of example and are not intended to limit the present disclosure. In addition, reference numbers and/or reference letters may be repeated in different examples in the present disclosure. This repetition is for the purpose of simplification and clarity and does not refer to relations between different embodiments and/or settings. Furthermore, examples of different processes and materials are provided in the present disclosure. However, it would be appreciated by those skilled in the art that other processes and/or materials may also be applied.
Reference throughout this specification to "an embodiment" , "some embodiments" , "an exemplary embodiment" , "an example" , "a specific example" or "some examples" means that a particular feature, structure, material, or characteristics described in connection with the embodiment or example is included in at least one embodiment or example of the present disclosure. Thus, the appearances of the above phrases throughout this specification are not necessarily referring to the same embodiment or example of the present disclosure. Furthermore, the particular features, structures, materials, or characteristics may be combined in any suitable manner in one or more embodiments or examples.
Any process or method described in a flow chart or described herein in other ways may be understood to include one or more modules, segments or portions of codes of executable instructions for achieving specific logical functions or steps in the process, and the scope of a preferred embodiment of the present disclosure includes other implementations, in which it should be understood by those skilled in the art that functions may be implemented in a sequence other than the sequences shown or discussed, including in a substantially identical sequence or in an opposite sequence.
The logic and/or step described in other manners herein or shown in the flow chart, for example, a particular sequence table of executable instructions for realizing the logical function, may be specifically achieved in any computer readable medium to be used by the instructions execution system, device or equipment (such as a system based on computers, a system comprising processors or other systems capable of obtaining instructions from the instructions execution system, device and equipment executing the instructions) , or to be used in combination with the instructions execution system, device and equipment. As to the specification, "the computer readable medium" may be any device adaptive for including, storing, communicating, propagating or transferring programs to be used by or in combination with the instruction execution system, device or equipment. More specific examples of the computer readable medium comprise but are not limited to: an electronic connection (an electronic device) with one or more wires, a portable computer enclosure (a magnetic device) , a random access memory (RAM) , a read only memory (ROM) , an erasable programmable read-only memory (EPROM or a flash memory) , an optical fiber device and a portable compact disk read-only memory (CDROM) . In addition, the computer readable medium may even be a paper or other appropriate medium capable of printing programs thereon, this is because, for example, the paper or other appropriate medium may be optically scanned and then edited, decrypted or processed with other appropriate methods when necessary to obtain the programs in an electric manner, and then the programs may be stored in the computer memories.
It should be understood that each part of the present disclosure may be realized by hardware, software, firmware or a combination thereof. In the above embodiments, a plurality of steps or methods may be realized by software or firmware that is stored in a memory and executed by an appropriate instruction execution system. For example, if realized by hardware, as in another embodiment, the steps or methods may be realized by any one or a combination of the following techniques known in the art: a discrete logic circuit having logic gate circuits for realizing logic functions on data signals, an application-specific integrated circuit having appropriately combined logic gate circuits, a programmable gate array (PGA), a field programmable gate array (FPGA), etc.
Those skilled in the art shall understand that all or part of the steps in the above exemplifying methods of the present disclosure may be achieved by a program instructing the related hardware. The program may be stored in a computer readable storage medium, and when the program is run on a computer, one or a combination of the steps of the method embodiments of the present disclosure is performed.
In addition, each functional unit of the embodiments of the present disclosure may be integrated in one processing module, or the units may exist physically separately, or two or more units may be integrated in one processing module. The integrated module may be realized in the form of hardware or in the form of a software functional module. When the integrated module is realized in the form of a software functional module and is sold or used as a standalone product, it may be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, a CD, etc.
Although embodiments of the present disclosure have been shown and described, it would be appreciated by those skilled in the art that the embodiments are explanatory and cannot be construed to limit the present disclosure, and changes, modifications, alternatives and variations can be made in the embodiments without departing from the scope of the present disclosure.

Claims (17)

  1. An electric device comprising:
    a camera module that takes a photograph of a subject to acquire a master camera image;
    a range sensor module that acquires range depth information of the subject by using a light; and
    an image signal processor that controls the camera module and the range sensor module to acquire a camera image, based on the master camera image and the range depth information, wherein
    the image signal processor controls the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface,
    the image signal processor estimates a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information, and
    the image signal processor obtains an image of the surface of the page that has been projectively transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image into a plane, based on the estimated curve.
  2. The electric device according to claim 1,
    wherein the range sensor module emits a pulsed light to the subject and detects a reflected light of the pulsed light reflected from the subject, thereby acquiring time of flight (ToF) depth information as the range depth information.
  3. The electric device according to claim 2, wherein the image signal processor obtains a master camera image including the curved page surface, by taking, with the camera module, a photograph of the curved page surface of the page that has been opened and curved, and
    wherein the image signal processor acquires the ToF depth information of the curved page surface by the range sensor module.
  4. The electric device according to claim 2, wherein the image signal processor estimates the curve corresponding to the position of the curved page surface, in a plane perpendicular to the crease direction at the position of the opened crease of the page, based on the master camera image and the ToF depth information.
  5. The electric device according to claim 2, wherein the image signal processor obtains a master camera image including a curved page surface of another page in the state where the book is opened, and ToF depth information of the curved page surface of the other page,
    wherein the image signal processor estimates another curve corresponding to the position of the curved page surface of the other page based on the master camera image and the ToF depth information,
    wherein the image signal processor obtains an image of the surface of the other page that has been projectively transformed into a plane, by projective transformation of the curved page surface of the other page in the master camera image into a plane, based on the estimated other curve, and
    wherein the image signal processor synthesizes the two acquired planar images of the page surfaces, and acquires a planar image of the surfaces of the two pages in the state where the book is opened.
  6. The electric device according to claim 3, wherein the image signal processor sets a crease position designation frame designated by the user on the opened page in the image taken by the camera module in the state where the book is opened, and
    wherein the image signal processor acquires the master camera image including the curved page surface in which the crease position designation frame is set, by taking, with the camera module, an image of the curved page surface of the page that has been opened and curved in the state where the book is opened.
  7. The electric device according to claim 6, further comprising:
    a display module that displays predefined information;
    an input module which receives user’s operations; and
    a main processor that controls the display module and the input module,
    wherein the image signal processor displays the crease position designation frame on the display module together with the master camera image taken by the camera module, and
    wherein the image signal processor sets the crease position designation frame at a position designated by the user on the curved page surface of the master camera image, in response to an operation input related to an instruction of the crease position designation frame by the user to the input module.
  8. The electric device according to claim 7, wherein the image signal processor obtains reference point cloud data for the curved page surface, based on the master camera image and the ToF depth information,
    wherein the image signal processor calculates a normal vector by applying a principal component analysis to the reference point cloud data, and
    wherein the image signal processor performs projection transform of the reference point cloud data into first point cloud data of the master camera image taken from the depth direction, based on the calculated normal vector.
  9. The electric device according to claim 8, wherein the image signal processor scans the first point cloud data along a plurality of lines, in a direction perpendicular to the longitudinal direction of the crease position designation frame,
    wherein the image signal processor calculates a slope of a valley of the first point cloud data, by applying the least squares method to the scanned first point cloud data,
    wherein the image signal processor estimates the crease position based on the calculated slope of the valley, and
    wherein the image signal processor obtains second point cloud data by a first rotation of the first point cloud data so that the estimated crease position is parallel to a preset reference direction.
  10. The electric device according to claim 9, wherein the image signal processor scans the second point cloud data along a plurality of lines in a direction perpendicular to the reference direction,
    wherein the image signal processor calculates a slope of a ridge in the reference direction of the scanned second point cloud data by applying the least squares method to the scanned second point cloud data, and
    wherein the image signal processor acquires third point cloud data by a second rotation of the second point cloud data, so that the calculated slope of the ridge is parallel to a preset reference plane.
  11. The electric device according to claim 10, wherein the image signal processor scans the third point cloud data along a plurality of lines in the reference direction,
    wherein the image signal processor calculates an average value of the third point cloud data in the vicinity of the plurality of lines in the depth direction, and
    wherein the image signal processor approximates the calculated average values as a curve in the direction perpendicular to the reference direction and the depth direction with a fourth-order or higher-order polynomial.
  12. The electric device according to claim 11, wherein the image signal processor divides the third point cloud data into a plurality of rectangular areas for each section obtained by dividing the curve, and
    wherein the image signal processor obtains the image of the surface of the page that has been projectively transformed into a plane by projective transformation of the curved page surface in the master camera image into a plane, based on a relation between coordinates of the plurality of rectangular areas and coordinates of the projection space of the reference point cloud data.
  13. The electric device according to claim 1, wherein a detection resolution of the range sensor module is lower than a detection resolution of the camera module.
  14. A method for controlling an electric device including: a camera module that takes a photograph of a subject to acquire a master camera image; a range sensor module that acquires range depth information of the subject by using a light; and an image signal processor that controls the camera module and the range sensor module to acquire a camera image,  based on the master camera image and the range depth information,
    the method comprising:
    controlling, by means of the image signal processor, the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface;
    estimating, by means of the image signal processor, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and
    obtaining, by means of the image signal processor, an image of the surface of the page that has been projectively transformed into a plane, by projective transformation of the curved page surface of the page in the master camera image into a plane, based on the estimated curve.
  15. The method according to claim 14,
    wherein the range sensor module emits a pulsed light to the subject and detects a reflected light of the pulsed light reflected from the subject, thereby acquiring time of flight (ToF) depth information as the range depth information.
  16. A computer readable storage medium having a computer program stored thereon, wherein when the computer program is executed by a processor, the computer program implements a method for controlling an electric device, and the method comprises:
    controlling, by means of the image signal processor, the camera module and the range sensor module to acquire the master camera image including a curved page surface of a page in a state where a book is opened, and range depth information of the curved page surface;
    estimating, by means of the image signal processor, a curve corresponding to the position of the curved page surface based on the master camera image and the range depth information; and
    obtaining, by means of the image signal processor, an image of the surface of the page that has been projectively transformed into a plane, by projective transformation of the curved page surface in the master camera image into a plane, based on the estimated curve.
  17. The computer readable storage medium according to claim 16,
    wherein the range depth information is time of flight (ToF) depth information.
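Claims 8 to 12 recite a geometric dewarping pipeline: a page normal estimated by principal component analysis of the reference point cloud, least-squares slope fitting with two aligning rotations, a fourth-order (or higher) polynomial fit of the page cross-section, and a piecewise projective transformation to a plane. The following is a minimal sketch of the core geometric steps, assuming NumPy; the function names (`estimate_normal`, `fit_page_curve`, `flatten_positions`) and the arc-length flattening step are illustrative assumptions, not the claimed implementation.

```python
import numpy as np

def estimate_normal(points):
    """Estimate the dominant surface normal of an (N, 3) point cloud by
    principal component analysis (cf. claim 8): the eigenvector of the
    covariance matrix with the smallest eigenvalue is taken as the normal."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered.T)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]                    # direction of least variance

def fit_page_curve(x, z, degree=4):
    """Approximate the page cross-section depth z(x) with a fourth-order
    (or higher) polynomial (cf. claim 11)."""
    return np.polynomial.Polynomial.fit(x, z, degree)

def flatten_positions(x, curve):
    """Map each x position to its arc length along the fitted curve, which
    gives the flattened (dewarped) coordinate of that point on the page.
    This arc-length step is an assumption about how the piecewise mapping
    of claim 12 could be parameterized."""
    xs = np.sort(np.asarray(x, dtype=float))
    dzdx = curve.deriv()(xs)
    ds = np.sqrt(1.0 + dzdx ** 2)           # local stretch factor
    seg = 0.5 * (ds[1:] + ds[:-1]) * np.diff(xs)  # trapezoidal integration
    return np.concatenate([[0.0], np.cumsum(seg)])
```

In use, the flattened coordinates would seed the per-rectangle projective transformations of claim 12: a curved page always has a slightly longer arc length than its chord, so the dewarped image is slightly wider than the raw photograph of the same page.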
PCT/CN2020/074508 2020-02-07 2020-02-07 Electric device, method of controlling electric device, and computer readable storage medium WO2021155575A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/074508 WO2021155575A1 (en) 2020-02-07 2020-02-07 Electric device, method of controlling electric device, and computer readable storage medium
CN202080093632.0A CN114982214A (en) 2020-02-07 2020-02-07 Electronic device, method of controlling electronic device, and computer-readable storage medium


Publications (1)

Publication Number Publication Date
WO2021155575A1 true WO2021155575A1 (en) 2021-08-12

Family

ID=77199693


Country Status (2)

Country Link
CN (1) CN114982214A (en)
WO (1) WO2021155575A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5497236A (en) * 1993-06-23 1996-03-05 Ricoh Company Ltd. Method and apparatus for distortion correction of scanned images
CN102833460A (en) * 2011-06-15 2012-12-19 富士通株式会社 Image processing method, image processing device and scanner
US20140292802A1 (en) * 2013-03-26 2014-10-02 Sharp Laboratories Of America, Inc. Methods and Systems for Correcting a Document Image
CN104835119A (en) * 2015-04-23 2015-08-12 天津大学 Method for positioning base line of bending book cover
CN105872291A (en) * 2016-05-31 2016-08-17 大连成者科技有限公司 Intelligent internet high-definition scanner with laser correcting function
CN105979117A (en) * 2016-04-28 2016-09-28 大连成者科技有限公司 Laser line-based curved page image flattening method
CN110519480A (en) * 2019-09-21 2019-11-29 深圳市本牛科技有限责任公司 A kind of surface flattening method based on laser calibration
CN110533769A (en) * 2019-08-20 2019-12-03 福建捷宇电脑科技有限公司 A kind of leveling method opening book image and terminal

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
KR102124014B1 (en) * 2013-10-29 2020-06-17 삼성전자주식회사 Image photographing apparatus for making bokeh image and method thereof
CN109726614A (en) * 2017-10-27 2019-05-07 北京小米移动软件有限公司 3D stereoscopic imaging method and device, readable storage medium storing program for executing, electronic equipment
CN109089047B (en) * 2018-09-29 2021-01-12 Oppo广东移动通信有限公司 Method and device for controlling focusing, storage medium and electronic equipment


Also Published As

Publication number Publication date
CN114982214A (en) 2022-08-30


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20917423; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20917423; Country of ref document: EP; Kind code of ref document: A1)