I am attempting to write code where a user can take a picture with the camera, have it display inside a specified UIImageView, and upload it to a server for later use. The device is an iPad. However, when the user takes the picture, it comes out rotated 90 degrees, and it is also scaled wrong: the aspect ratio differs from that of the original photo. This is the code I am using to scale and size: http://pastebin.com/HxNkb7Be
When uploading the file to my server, I get the image's data like so:
NSData *imageData = UIImageJPEGRepresentation(scaleAndRotateImage(image), 0.90f);
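For context, the upload itself is a standard multipart/form-data POST. The endpoint URL and the field/file names below are placeholders, not my actual server details:

```objc
// Build a multipart/form-data POST request around imageData.
// The URL, field name, and filename are placeholders.
NSString *boundary = @"----PhotoUploadBoundary";
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
    [NSURL URLWithString:@"https://example.com/upload"]];
[request setHTTPMethod:@"POST"];
[request setValue:[NSString stringWithFormat:@"multipart/form-data; boundary=%@", boundary]
    forHTTPHeaderField:@"Content-Type"];

NSMutableData *body = [NSMutableData data];
[body appendData:[[NSString stringWithFormat:@"--%@\r\n", boundary]
    dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:[@"Content-Disposition: form-data; name=\"photo\"; filename=\"photo.jpg\"\r\n"
    dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:[@"Content-Type: image/jpeg\r\n\r\n"
    dataUsingEncoding:NSUTF8StringEncoding]];
[body appendData:imageData];
[body appendData:[[NSString stringWithFormat:@"\r\n--%@--\r\n", boundary]
    dataUsingEncoding:NSUTF8StringEncoding]];
[request setHTTPBody:body];
```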
This is how I get the UIImage from the camera:
// Get the asset representation
ALAssetRepresentation *rep = [asset defaultRepresentation];
// Get the right orientation
UIImageOrientation orientation = UIImageOrientationUp;
NSNumber *orientationValue = [[rep metadata] objectForKey:@"Orientation"];
if (orientationValue != nil) {
    orientation = (UIImageOrientation)[orientationValue intValue];
}
// Get the CG image reference and convert to UIImage
UIImage *image = [UIImage imageWithCGImage:[rep fullResolutionImage] scale:rep.scale orientation:orientation];
It may be because of the helper method `[rep fullResolutionImage]` you use: unlike `fullScreenImage`, it returns the raw CGImage with no orientation or scale adjustment applied. The following is a good method which you can use to achieve what you want (it will solve the scaling problem as well).
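A minimal sketch of such a helper (not the exact code from the original answer; the 1024-point cap is an arbitrary example value). It redraws the image upright by relying on `drawInRect:`, which honors the image's `imageOrientation`:

```objc
// Redraws the UIImage so the pixel data is upright and capped at a maximum
// dimension. Assumes the UIImage was created with the correct orientation
// (e.g. via [UIImage imageWithCGImage:scale:orientation:]); drawInRect:
// applies imageOrientation, so the output needs no further rotation.
static UIImage *normalizedImage(UIImage *image) {
    CGFloat maxDimension = 1024.0f; // example cap, adjust to taste
    CGSize size = image.size;       // size already accounts for orientation
    CGFloat ratio = MIN(1.0f, maxDimension / MAX(size.width, size.height));
    CGSize newSize = CGSizeMake(size.width * ratio, size.height * ratio);

    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0f);
    [image drawInRect:CGRectMake(0.0f, 0.0f, newSize.width, newSize.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
```

Two further notes: `[rep fullScreenImage]` already returns an orientation-corrected CGImage sized for the device screen, which is often enough for display. Also, the EXIF `Orientation` metadata values (1–8) do not match the `UIImageOrientation` enum (0–7), so casting the metadata number directly is fragile; `rep.orientation` is an `ALAssetOrientation`, whose values do match `UIImageOrientation`.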