I have an image of an LTCC (Low Temperature Co-fired Ceramic) circuit carrier board with vias, which can be detected as circles in the image. I have already detected the four reference circles and stored their pixel coordinates as (x1, y1), (x2, y2), (x3, y3), (x4, y4).
I want to correct the orientation of the image using these four large reference circles, located at the four corners of the image. In the corrected image, the detected circles' center coordinates should be perfectly aligned: if the four coordinates (x1, y1), (x2, y2), (x3, y3), (x4, y4) are arranged clockwise starting from the top left, then in the output image we should have x1 == x4, x2 == x3, y1 == y2, and y3 == y4.
How can I compute a value for α in R = Matrix([[cos(α), -sin(α)], [sin(α), cos(α)]])?
I have tried the following code, but it gives an unwanted result:
# -*- coding: utf-8 -*-
"""
Created on Wed Jan 24 22:57:52 2024
@author: Erfan
"""
import cv2
import numpy as np
from sympy import symbols, Eq, solve, cos, sin, Matrix
# Read TIFF image
tif_image = cv2.imread('./8th Aug Scan files/Au-8700dpi-8bit-gr-0.tif', cv2.IMREAD_GRAYSCALE)
# Rescale the image
new_size = (1017, 1000)
tif_image_scaled = cv2.resize(tif_image, new_size, interpolation=cv2.INTER_AREA)
# Calculate scaling factor
original_size = tif_image.shape[:2]
scaling_factor = (new_size[0] / original_size[1], new_size[1] / original_size[0])  # shape is (height, width), new_size is (width, height)
# Print resolution infos of working images and scaling factor
print('\n\nOriginal image shape:', tif_image.shape)
print('\nScaled image shape:', tif_image_scaled.shape)
print('\nScaling factor (width, height):', scaling_factor)
# Define symbolic variables
alpha = symbols('alpha')
x1, y1, x2, y2, x3, y3, x4, y4 = symbols('x1 y1 x2 y2 x3 y3 x4 y4')
sx, sy = scaling_factor
# Target (primed) coordinates: centers measured in the scaled image, mapped back to full resolution
x1_prime, y1_prime, x2_prime, y2_prime, x3_prime, y3_prime, x4_prime, y4_prime = int(116/sx), int(106/sy), int(882/sx), int(98/sy), int(886/sx), int(860/sy), int(120/sx), int(868/sy)
# Define the rotation matrix in terms of alpha
R = Matrix([[cos(alpha), -sin(alpha)], [sin(alpha), cos(alpha)]])
# Define the coordinates of the four reference points before and after rotation
P1 = Matrix([x1, y1])
P2 = Matrix([x2, y2])
P3 = Matrix([x3, y3])
P4 = Matrix([x4, y4])
P1_prime = Matrix([x1_prime, y1_prime])
P2_prime = Matrix([x2_prime, y2_prime])
P3_prime = Matrix([x3_prime, y3_prime])
P4_prime = Matrix([x4_prime, y4_prime])
# Apply the rotation (note: this overwrites the numeric P*_prime matrices defined above)
P1_prime = R * P1
P2_prime = R * P2
P3_prime = R * P3
P4_prime = R * P4
# Define the equations for each coordinate
##eq1 = Eq(x1*cos(alpha) - y1*sin(alpha), x1_prime)
##eq2 = Eq(x1*sin(alpha) + y1*cos(alpha), y1_prime)
##eq3 = Eq(x2*cos(alpha) - y2*sin(alpha), x2_prime)
##eq4 = Eq(x2*sin(alpha) + y2*cos(alpha), y2_prime)
##eq5 = Eq(x3*cos(alpha) - y3*sin(alpha), x3_prime)
##eq6 = Eq(x3*sin(alpha) + y3*cos(alpha), y3_prime)
##eq7 = Eq(x4*cos(alpha) - y4*sin(alpha), x4_prime)
##eq8 = Eq(x4*sin(alpha) + y4*cos(alpha), y4_prime)
eq1 = Eq(x1_prime*cos(alpha) - y1_prime*sin(alpha), x1_prime)
eq2 = Eq(x1_prime*sin(alpha) + y1_prime*cos(alpha), y1_prime)
eq3 = Eq(x2*cos(alpha) - y1_prime*sin(alpha), x2_prime)
eq4 = Eq(x2*sin(alpha) + y1_prime*cos(alpha), y2_prime) ## implemented x1=x4, x2=x3, y1=y2, y3=y4 HERE and x1=x1_prime, y1=y1_prime
eq5 = Eq(x2*cos(alpha) - y3*sin(alpha), x3_prime)
eq6 = Eq(x2*sin(alpha) + y3*cos(alpha), y3_prime)
eq7 = Eq(x1_prime*cos(alpha) - y3*sin(alpha), x4_prime)
eq8 = Eq(x1_prime*sin(alpha) + y3*cos(alpha), y4_prime)
# Substitute given values directly into all equations
eq1 = eq1.subs({x1: x1_prime, y1: y1_prime})
eq2 = eq2.subs({x1: x1_prime, y1: y1_prime})
####
# Solve the system of equations for alpha
solution = solve([eq1, eq2, eq3, eq4, eq5, eq6, eq7, eq8], alpha)
alpha = solution
# Print the solution
print("\n\nThe value of alpha is:", alpha)
print("\nRotation matrix: ", R)
print("\n\nx1_prime: ", x1_prime, "\ny1_prime: ", y1_prime)
This code outputs:
Original image shape: (27394, 27394)
Scaled image shape: (1000, 1017)
Scaling factor (width, height): (0.03712491786522596, 0.036504344016938015)
The value of alpha is: [(2*atan((x2 - sqrt(x2**2 + y3**2 - 554979364))/(y3 + 23558)),), (2*atan((x2 + sqrt(x2**2 + y3**2 - 554979364))/(y3 + 23558)),)]
Rotation matrix: Matrix([[cos(alpha), -sin(alpha)], [sin(alpha), cos(alpha)]])
x1_prime: 3124
y1_prime: 2903
As you can see, I am mainly using the OpenCV library for the image processing tasks and SymPy to solve the system of equations with multiple variables.
Why am I not getting a single value for alpha in my output? I want to use the value of alpha to build the 2D rotation matrix that corrects the orientation of my image. How can I find alpha?
I am a beginner in the image processing field. I am aware that this approach may have problems; if you can identify any issues that need to be taken care of to ensure the end results are correct, I would appreciate it.
(First) follow-up question: after rotating the image (if it is possible like this), I want to crop the new image in such a manner that the four cornermost pixels of the final output image are exactly the center coordinates of the four detected reference circles in the original image. Do you think there could be other, better approaches for my case?