How to apply texture on irregular shapes in 2d image?

Problem Description

[image: the current result, with the tile pattern applied to the quad layer but not perspective-transformed]

I'm trying to apply a texture to a CALayer via a UIColor pattern image. The texture is applied, but it is not correctly perspective-transformed. It looks like there is an issue with my drawing logic, i.e. I need to take a texture image and map it onto the irregular shape. I did some research and learned that this should be achievable with OpenGL or Metal by mapping the texture image onto the irregular shape in the 2D image.

Looking for some guidance: how can I correctly perspective-transform the tile pattern?

        let image = UIImage(named: "roofTiles_square")?.flattened
        
        if let realImage = image {
            let color = UIColor(patternImage: realImage)
            controller.quadView.quadLayer.fillColor = color.cgColor
        }

Any help would be much appreciated.

Thanks

Tags: ios, swift, calayer

Solution


I am writing up a detailed solution, but I want to make sure I am solving the correct problem. The plan is to create a transformation that warps a version of the "roof tile" pattern (or any such pattern) so that, when it is mapped onto the quadrilateral in the image on the right, it is correctly perspectively warped; i.e., the quad ABCD is mapped to the quad A'B'C'D'?

[image: the rectangular source pattern with corners A, B, C, D and the photo with the target roof quad A', B', C', D']

The first step is to compute a homography that maps the quad ABCD to the quad A'B'C'D'. OpenCV provides methods for this (a short sketch using them appears at the end of this answer), but let's do the math ourselves. We are searching for a 3x3 matrix H that relates the points A, B, C, D to the points A', B', C', D'. We'll actually solve for the mapping in the other direction, from the primed points back to the unprimed ones, since that is the form the inverse warp at the end expects:

$$ H \begin{pmatrix} x_i' \\ y_i' \\ 1 \end{pmatrix} = w_i \begin{pmatrix} x_i \\ y_i \\ 1 \end{pmatrix}, \qquad i = 1, \dots, 4 $$

where (x_i, y_i) runs over the corners A, B, C, D and (x_i', y_i') over A', B', C', D'.

Using 3D homogeneous vectors (x, y, w) allows us to work in 3-D, and the division by w provides the necessary perspective foreshortening (to make a long story short). It turns out that any scale multiple of H works, meaning H only has 8 degrees of freedom (instead of the full 3*3 = 9). What this means is that we want HA' to be a scale multiple of A, so their cross product must be zero:

$$ (H A') \times A = \mathbf{0}, \qquad A' = \begin{pmatrix} x' \\ y' \\ 1 \end{pmatrix}, \quad A = \begin{pmatrix} x \\ y \\ 1 \end{pmatrix} $$

If we perform the cross product, writing $h_1^T, h_2^T, h_3^T$ for the rows of H, we can rewrite this last equation as

$$ \begin{aligned} h_2^T A' - y\, h_3^T A' &= 0 \\ x\, h_3^T A' - h_1^T A' &= 0 \\ y\, h_1^T A' - x\, h_2^T A' &= 0 \end{aligned} $$

The last equation above is actually a linear combination of the first two (multiply the first equation by x, multiply the second equation by y, add them, and you arrive at the third equation up to sign). Since the third equation is linearly dependent we throw it out and only use the first two. After negating the second equation, swapping the two, and converting them to matrix form we get

$$ \begin{pmatrix} A'^T & 0^T & -x\, A'^T \\ 0^T & A'^T & -y\, A'^T \end{pmatrix} \underbrace{\begin{pmatrix} h_1 \\ h_2 \\ h_3 \end{pmatrix}}_{h \,\in\, \mathbb{R}^9} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} $$

Thus one point correspondence A' - A yields two equations. If we have n point correspondences we get 2n equations:

$$ \underbrace{\begin{pmatrix} A_1'^T & 0^T & -x_1\, A_1'^T \\ 0^T & A_1'^T & -y_1\, A_1'^T \\ \vdots & \vdots & \vdots \\ A_n'^T & 0^T & -x_n\, A_n'^T \\ 0^T & A_n'^T & -y_n\, A_n'^T \end{pmatrix}}_{A \,\in\, \mathbb{R}^{2n \times 9}} \; h = 0 $$

with $A_i' = (x_i', y_i', 1)^T$ and $(x_i, y_i)$ the corresponding source point.

We need n >= 4 to have at least 8 equations and arrive at a proper solution; i.e., we need at least 4 points, no three of which are collinear. Thus we have a homogeneous system of equations, which we solve using the singular value decomposition:

$$ A = U\,\Sigma\,V^T, \qquad \min_{\|h\| = 1} \|A h\| $$

Obviously the trivial solution h = 0 works, but it is not very useful. Setting h to the last column of V gives the least-squares solution of our system among all h of unit length.
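
Why the last column of V? A quick sketch of the standard argument (this step is left implicit above): since U and V are orthogonal and the singular values are sorted in decreasing order,

$$ \|A h\|^2 = \|U\,\Sigma\,V^T h\|^2 = \|\Sigma\, V^T h\|^2 = \sum_i \sigma_i^2 \,(V^T h)_i^2, $$

and with $\|V^T h\| = \|h\| = 1$ this is minimized by concentrating all of the unit mass on the component with the smallest (possibly zero) singular value, i.e. $V^T h = e_9$, so h is the last column of V. For exactly four correspondences A is 8x9, the residual is zero, and the solution is exact; for n > 4 it is the least-squares fit.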

Let's compute H for your specific example. Let's assume the source image to transform is WxH = 500x300, thus A = (0,0), B = (W,0), C = (0,H), and D = (W,H). The destination image is 484x217 and I located the corners of the roof at A' = (70.7, 41.3), B' = (278.8, 76.3), C' = (136.4, 121.2), and D' = (345.1, 153.2). I'll use Eigen to do the computation, so I load my source and destination points into matrices:

#include <Eigen/Dense>
...
constexpr double W = 500;
constexpr double H = 300;
constexpr size_t N = 4;

Eigen::Matrix<double,2,N> SRC;
SRC <<
    0, W, 0, W,
    0, 0, H, H;
Eigen::Matrix<double,2,N> DST;
DST <<
    70.7, 278.8, 136.4, 345.1,
    41.3,  76.3, 121.2, 153.2;

I construct the 8x9 matrix A as described above:

Eigen::Matrix<double,2*N,9> A;
A.setZero();
for (size_t i = 0; i < N; i++) {
    const double x_ = DST(0,i), y_ = DST(1,i);
    const double x  = SRC(0,i), y  = SRC(1,i);
    // Rows 2i and 2i+1 are the two equations contributed by the
    // correspondence (x_, y_) -> (x, y), exactly as derived above.
    A(2*i,0) = A(2*i+1,3) = x_;
    A(2*i,1) = A(2*i+1,4) = y_;
    A(2*i,2) = A(2*i+1,5) = 1;
    A(2*i,6) = -x*x_;
    A(2*i,7) = -x*y_;
    A(2*i,8) = -x;
    A(2*i+1,6) = -y*x_;
    A(2*i+1,7) = -y*y_;
    A(2*i+1,8) = -y;
}

I then compute the SVD, extract the solution from the last column of V, and store the result in a 3x3 matrix:

// Eigen sorts the singular values in decreasing order, so the last
// column of V corresponds to the smallest singular value.
Eigen::JacobiSVD<Eigen::Matrix<double,2*N,9>> svd(A, Eigen::ComputeFullV);
Eigen::Matrix<double,9,1> h = svd.matrixV().col(8);
Eigen::Matrix3d Homography;
Homography <<
    h(0), h(1), h(2),
    h(3), h(4), h(5),
    h(6), h(7), h(8);

yielding the desired 3x3 matrix H:

  -0.016329     0.013427      0.599927
   0.004571    -0.0271779     0.799277
   1.78122e-06 -2.83812e-06  -0.00613631
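
As a quick sanity check (my addition, not required for the pipeline), we can push each hand-picked destination corner through H and divide by w; it should land on the corresponding corner of the 500x300 source rectangle:

#include <iostream>
...
// Homography maps destination points back to source points, up to the
// homogeneous scale factor w that we divide out.
for (size_t i = 0; i < N; i++) {
    Eigen::Vector3d p = Homography * Eigen::Vector3d(DST(0,i), DST(1,i), 1.0);
    p /= p.z();   // perspective division by w
    std::cout << "corner " << i << " -> (" << p.x() << ", " << p.y() << "),"
              << " expected (" << SRC(0,i) << ", " << SRC(1,i) << ")\n";
}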

We can take a look at a sample warped image using OpenCV. I load my source texture and the homography H and use the OpenCV warpPerspective function:

#include <opencv2/opencv.hpp>
#include <opencv2/imgproc.hpp>

int main() {
    cv::Mat sourceImage = cv::imread("texture.png", cv::IMREAD_COLOR);
    cv::Matx33d H(-0.016329, 0.013427, 0.599927,
                  0.004571, -0.0271779, 0.799277,
                  1.78122e-06, -2.83812e-06, -0.00613631);
    cv::Mat destImage;
    // H maps destination pixels back to source pixels, which is exactly
    // what warpPerspective expects when WARP_INVERSE_MAP is set.
    cv::warpPerspective(sourceImage, destImage, H, cv::Size(487,217),
                        cv::INTER_LINEAR | cv::WARP_INVERSE_MAP);
    cv::imwrite("warped.png", destImage);
    return 0;
}

The result looks plausible:

[images: the source roof-tile texture and the perspective-warped result]
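
Finally, as mentioned at the top, OpenCV can compute the homography for you. A minimal sketch, assuming the same four correspondences (variable names are mine; cv::getPerspectiveTransform wants exactly four points, while cv::findHomography accepts four or more and offers robust fitting):

#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    // Same correspondences as above, destination -> source, so the result
    // can again be used with cv::WARP_INVERSE_MAP in warpPerspective.
    std::vector<cv::Point2f> roofCorners    = {{70.7f, 41.3f}, {278.8f, 76.3f},
                                               {136.4f, 121.2f}, {345.1f, 153.2f}};
    std::vector<cv::Point2f> textureCorners = {{0, 0}, {500, 0}, {0, 300}, {500, 300}};
    cv::Mat H = cv::getPerspectiveTransform(roofCorners, textureCorners);
    std::cout << "H =\n" << H << std::endl;
    return 0;
}

Since a homography is only defined up to scale, this H should agree with the hand-computed matrix above once both are normalized (e.g. divided by their bottom-right entry).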

