I am currently developing a Java project that stitches images taken by a GoPro mounted on a drone. Along with the images comes GPS/attitude data for every camera shot; the relevant fields are Latitude, Longitude, Altitude, Roll, Pitch and Yaw:
CAM, 467270000, 1895, 58.3705557, 31.0359788, 131.30, 109.88, 0.73, -10.70, 172.79
CAM, 467273600, 1895, 58.3702211, 31.0354466, 126.43, 105.01, -1.02, -4.61, 218.37
CAM, 467277400, 1895, 58.3699252, 31.0348309, 127.61, 106.19, -6.50, -2.75, 233.71
CAM, 467281400, 1895, 58.3695767, 31.0342934, 120.87, 99.45, -0.52, 7.95, 229.61
CAM, 467284800, 1895, 58.3692659, 31.0337095, 131.80, 110.38, -8.02, 6.47, 225.28
CAM, 467288800, 1895, 58.3689556, 31.0331251, 132.72, 111.30, -5.50, -7.32, 223.22
CAM, 467292800, 1895, 58.3685826, 31.0326798, 132.01, 110.59, -16.65, 0.24, 215.20
CAM, 467297400, 1895, 58.3682075, 31.0330935, 127.13, 105.71, -37.07, 9.32, 143.64
CAM, 467300600, 1895, 58.3683265, 31.0339103, 132.22, 110.80, -24.37, 3.29, 102.24
CAM, 467303800, 1895, 58.3686659, 31.034442, 131.46, 110.04, -9.07, -1.69, 76.16
This is the input data for my algorithms. First I create ArrayLists of latitude, longitude and altitude, and from them I derive:

- distanceList: the distance between the centres of two consecutive images;
- azimuthList: each image's azimuth;
- coordinateAngleList: the angle of the next image relative to the previous one;
- pixelSizeList: the size of one pixel in meters.
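For reference, here is roughly how the latitude, longitude and altitude lists get filled from the CAM lines. The column indices (3 = latitude, 4 = longitude, 5 = the first altitude-like column) and the file name are assumptions based on the sample above, not a confirmed log format:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class CamLogParser {
    public static void main(String[] args) throws IOException {
        List<Double> latitudeList = new ArrayList<>();
        List<Double> longitudeList = new ArrayList<>();
        List<Double> altitudeList = new ArrayList<>();

        // Column indices are an assumption based on the sample CAM lines above.
        for (String line : Files.readAllLines(Paths.get("cam_log.txt"))) {
            if (!line.startsWith("CAM")) continue;
            String[] parts = line.split(",");
            latitudeList.add(Double.parseDouble(parts[3].trim()));
            longitudeList.add(Double.parseDouble(parts[4].trim()));
            altitudeList.add(Double.parseDouble(parts[5].trim()));
        }
        System.out.println("Parsed " + latitudeList.size() + " camera positions");
    }
}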
Here are the algorithms I use to calculate the listed values:
distanceList and azimuthList:

private static double calculateDistanceOrAzimuthAngle(Double llat1, Double llong1, Double llat2, Double llong2, String mode) {
    double pi = Math.PI;
    //===== Earth radius in meters
    int rad = 6372795;
    //===== Coordinates in radians
    double phi1 = Math.toRadians(llat1);
    double phi2 = Math.toRadians(llat2);
    double lam1 = Math.toRadians(llong1);
    double lam2 = Math.toRadians(llong2);
    double lambda = lam2 - lam1;
    //===== Cos and Sin of latitudes and delta of longitudes
    double c11 = Math.cos(phi1);
    double c12 = Math.cos(phi2);
    double s11 = Math.sin(phi1);
    double s12 = Math.sin(phi2);
    double cdelta = Math.cos(lambda);
    double sdelta = Math.sin(lambda);
    switch (mode) {
        case "distance": {
            //===== Calculating lengths of bigger circle
            double x = s11 * s12 + c11 * c12 * cdelta;
            double y = Math.sqrt(Math.pow(c12 * sdelta, 2) + Math.pow(c11 * s12 - s11 * c12 * cdelta, 2));
            double ad = Math.atan2(y, x);
            return ad * rad;
        }
        case "azimuth": {
            double x = (c11 * s12) - (s11 * c12 * cdelta);
            double y = sdelta * c12;
            double z = Math.toDegrees(Math.atan(-y / x));
            if (x < 0) z = z + 180;
            double z2 = (z + 180) % 360 - 180;
            z2 = Math.toRadians(-z2);
            double anglerad2 = z2 - ((2 * pi) * Math.floor((z2 / (2 * pi))));
            double anglerad = (anglerad2 * 180.) / pi;
            anglerad = 360 - (anglerad - 90);
            if (anglerad > 360) anglerad -= 360;
            return anglerad;
        }
        default:
            return 0;
    }
}
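This is roughly how distanceList and azimuthList get filled from consecutive coordinates (simplified from my actual loop):

List<Double> distanceList = new ArrayList<>();
List<Double> azimuthList = new ArrayList<>();
for (int i = 0; i < latitudeList.size() - 1; i++) {
    // Distance and azimuth between image i and image i + 1
    distanceList.add(calculateDistanceOrAzimuthAngle(
            latitudeList.get(i), longitudeList.get(i),
            latitudeList.get(i + 1), longitudeList.get(i + 1), "distance"));
    azimuthList.add(calculateDistanceOrAzimuthAngle(
            latitudeList.get(i), longitudeList.get(i),
            latitudeList.get(i + 1), longitudeList.get(i + 1), "azimuth"));
}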
coordinateAngleList:

private static double angleFromCoordinate(Double slat1, Double slong1, Double slat2, Double slong2) {
    double long1 = slong1;
    double long2 = slong2;
    double lat1 = Math.toRadians(slat1);
    double lat2 = Math.toRadians(slat2);
    double dLon = Math.toRadians(long2 - long1);
    double y = Math.sin(dLon) * Math.cos(lat2);
    double x = Math.cos(lat1) * Math.sin(lat2) - Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
    return (Math.toDegrees(Math.atan2(y, x)) + 360) % 360;
}
pixelSizeList:

private static double calculatePixelSize(double altitude, double FOV, double imgWidth, double imgHeight) {
    double halfFOV = FOV / 2;
    double angleA = 180 - 90 - halfFOV;
    double hypotenuse = altitude / Math.sin(angleA);
    double imageDiagonalInMeters = Math.sqrt(Math.pow(hypotenuse, 2) - Math.pow(altitude, 2));
    double imageDiagonalInPixels = Math.sqrt(Math.pow(imgWidth, 2) + Math.pow(imgHeight, 2));
    return imageDiagonalInMeters / imageDiagonalInPixels;
}
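The pixel sizes are produced by calling this method once per image. In the sketch below the FOV (in degrees, to match the 180 - 90 - halfFOV step) and the image resolution are placeholder values, not my confirmed camera settings:

double fovDegrees = 94.4;    // placeholder diagonal FOV in degrees
double imgWidth = 4000;      // placeholder image width in pixels
double imgHeight = 3000;     // placeholder image height in pixels

List<Double> pixelSizeList = new ArrayList<>();
for (double altitude : altitudeList) {
    pixelSizeList.add(calculatePixelSize(altitude, fovDegrees, imgWidth, imgHeight));
}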
Here are the first 10 values of each list:
Distance: [48.46373238266117, 48.71355037065455, 49.85503769989944, 48.52930211808859, 48.510404299271855, 48.94825023128212, 48.19766435644759, 49.451021971786204, 48.85744749197166, 50.080448602974066]
Azimuth: [230.16731156516713, 222.50205323598703, 231.0321217131937, 225.42496727180497, 225.3540760894774, 237.94887148656915, 300.046347389977, 15.525254536580519, 50.593577501672826, 56.856166424208084]
Coordinate angle: [39.832235289576545, 47.497422524511194, 38.96742063247876, 44.57453556828523, 44.645426326236986, 32.050749366729804, 329.9540048495128, 254.47544091635925, 219.40687520849113, 213.14423324262788]
Pixel size: [0.060905719060422164, 0.0586466874395215, 0.05919405033739887, 0.05606758768342138, 0.061137652491726126, 0.06156441000532543, 0.061235064532873786, 0.05897139424334705, 0.06133247657402146, 0.06097993775843944]
After that, each image is simply rotated. I've also tried rotating the images manually in Photoshop using the angles and other values computed above, and almost the same thing happens. Here's the rotation code:
private void rotate(int numToRotate) {
    //TODO correct rotation without cutting image
    try {
        for (int i = 0; i < numToRotate - 1; i++) {
            BufferedImage image = ImageIO.read(new File("/Users/Nick/Downloads/geotagged-1/input" + i + "_1.png"));
            // The required drawing location
            int drawLocationX = 500;
            int drawLocationY = 500;
            // Rotation information
            //double rotationRequired = azimuthList.get(i);
            double rotationRequired = coordinateAngleList.get(i);
            double locationX = image.getWidth() / 2;
            double locationY = image.getHeight() / 2;
            //AffineTransform tx = AffineTransform.getRotateInstance(Math.toRadians(rotationRequired), locationX, locationY);
            //AffineTransformOp op = new AffineTransformOp(tx, AffineTransformOp.TYPE_BILINEAR);
            AffineTransform transform = new AffineTransform();
            transform.rotate(Math.toRadians(rotationRequired), locationX, locationY);
            AffineTransformOp op = new AffineTransformOp(transform, AffineTransformOp.TYPE_BILINEAR);
            // Drawing the rotated image at the required drawing location
            BufferedImage opImage = op.filter(image, null);
            BufferedImage im = new BufferedImage(opImage.getWidth(), opImage.getHeight(), BufferedImage.TYPE_INT_ARGB);
            im.getGraphics().drawImage(opImage, drawLocationX, drawLocationY, null);
            ImageIO.write(im, "png", new File("/Users/Nick/Dropbox/ЭЛСИ/ДЗЗ/AgroScan/rotated/" + i + ".png"));
        }
    } catch (IOException e) {
        e.printStackTrace();
    }
}
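Regarding the TODO about the image being cut: as far as I understand, the clipping could be avoided by sizing the destination image to the rotated bounding box before drawing. This is only a sketch (it uses java.awt.Graphics2D), not what the code above currently does:

// Sketch: rotate around the centre into a destination large enough to hold
// the rotated bounds, so the corners are not clipped.
private static BufferedImage rotateWithoutClipping(BufferedImage src, double degrees) {
    double radians = Math.toRadians(degrees);
    double sin = Math.abs(Math.sin(radians));
    double cos = Math.abs(Math.cos(radians));
    int w = src.getWidth();
    int h = src.getHeight();
    int newW = (int) Math.ceil(w * cos + h * sin);
    int newH = (int) Math.ceil(w * sin + h * cos);

    BufferedImage dst = new BufferedImage(newW, newH, BufferedImage.TYPE_INT_ARGB);
    Graphics2D g = dst.createGraphics();
    // Shift so the rotated content is centred on the new canvas, then rotate
    // around the original image centre.
    g.translate((newW - w) / 2.0, (newH - h) / 2.0);
    g.rotate(radians, w / 2.0, h / 2.0);
    g.drawImage(src, 0, 0, null);
    g.dispose();
    return dst;
}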
Either way (whether I rotate by the coordinate angle or the azimuth), the resulting images don't end up in anything close to a straight line. Stitching is based on creating one big BufferedImage and placing the images according to the metric distance between their centres, converted to pixels with the pixelSize calculated for each image.
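For context, the placement step looks roughly like this (simplified, not my exact stitching code; the canvas size, start position and file paths are placeholders, and coordinateAngleList is shown driving the direction just for illustration):

// Simplified placement sketch: metric distance between image centres is
// converted to pixels using pixelSizeList and walked along the computed angle.
private void placeImages(int numToPlace) throws IOException {
    BufferedImage canvas = new BufferedImage(20000, 20000, BufferedImage.TYPE_INT_ARGB);
    Graphics2D g = canvas.createGraphics();

    double centerX = 10000;   // placeholder start position on the canvas
    double centerY = 10000;

    for (int i = 0; i < numToPlace; i++) {
        BufferedImage img = ImageIO.read(new File("rotated/" + i + ".png"));
        g.drawImage(img,
                (int) (centerX - img.getWidth() / 2.0),
                (int) (centerY - img.getHeight() / 2.0), null);

        if (i < distanceList.size()) {
            // Metric distance between centres -> pixels, along the computed
            // angle (0 deg = north, clockwise); the image y axis points down.
            double distancePx = distanceList.get(i) / pixelSizeList.get(i);
            double angleRad = Math.toRadians(coordinateAngleList.get(i));
            centerX += distancePx * Math.sin(angleRad);
            centerY -= distancePx * Math.cos(angleRad);
        }
    }
    g.dispose();
    ImageIO.write(canvas, "png", new File("stitched.png"));
}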
What am I doing wrong when rotating the images? I've tried rotating by the azimuth, but that only makes it worse. Is there a mistake in any of the algorithms I'm using?